
Sonar Signal Processing

ECE 5525 Speech Processing

Final Project

December 7, 2010

Cheryl Skibski

Ocean Engineering

Florida Institute of Technology

Table of Contents

1.0 Introduction
2.0 Background
2.1 SONAR
2.2 Active and Passive Sonar
2.3 Correlation, Autocorrelation, and Matched Filtering
3.0 Solution and Implementation
3.1 Data Acquisition Toolbox (DAT)
3.2 Signal Processing
4.0 Results
5.0 Conclusion and Future Work
6.0 References

List of Figures

Figure 1: Active and Passive Sonar
Figure 2: Acoustic Transducer Examples
Figure 3: Cross-Correlation (Smith)
Figure 4: Microphone and Speaker
Figure 5: Transmitted and Received Signals
Figure 6: Original Chirp
Figure 7: Received Signal plotted against time (ms)
Figure 8: Cross-Correlation of Received Signal at 5 inches from Wall
Figure 9: Cross-Correlation of Received Signals plotted amplitude versus time (ms)
Figure 10: Hydrophone


1.0 Introduction

The objective of this project is to illustrate the basic principle of Sound Navigation and Ranging (SONAR). This paper details the basic background of sonar, problems that occur when using sound underwater, and types of underwater sound devices. The MATLAB programming for this project uses the Data Acquisition Toolbox and the Signal Processing Toolbox: the Data Acquisition Toolbox is used to output the chirp and record the data from a microphone, and the Signal Processing Toolbox is used to correlate the received signals with the transmitted chirp.

2.0 Background

2.1 SONAR

Sound Navigation and Ranging (SONAR) is a method used in underwater applications to detect underwater objects using acoustics. Underwater sound propagation is essential for object detection and navigation in areas where light is unavailable. Typical sonar systems operate at frequencies between 20 Hz and 20 kHz; higher frequencies are used for shorter-range detection and lower frequencies for longer ranges.

When designing and implementing a sonar system, multiple parameters depend on the medium. These include source level, transmission loss, noise level, target strength, receiving directivity index, reverberation level, and detection threshold (Urick). Each parameter is expressed in decibels relative to a common reference, so the terms can be added together to indicate the performance of the system. The speed of sound is another parameter that changes with the medium and is affected by salinity, temperature, and pressure.
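For reference, these terms combine in the sonar equation given by Urick; in the noise-limited active case,

SL - 2TL + TS = NL - DI + DT

where SL is the source level, TL the one-way transmission loss, TS the target strength, NL the noise level, DI the receiving directivity index, and DT the detection threshold. In a reverberation-limited environment the term NL - DI is replaced by the reverberation level RL.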

In a basic sonar system, the signal is the desired portion of the acoustic field at the receiver, while the background, consisting of noise and reverberation, is the undesired portion (Urick). The main objective when designing and implementing a sonar system is to increase the signal-to-noise ratio. In this implementation of sonar, the medium is air, the equipment is a speaker and microphone combination, and the target is a wall.

2.2 Active and Passive Sonar

The two types of sonar are active and passive. A passive sonar system uses a hydrophone as a receiver: the hydrophone listens for sounds and transforms acoustic energy into electrical energy to be processed.

In an active sonar system, a sound ping is purposely generated by a system component called a transducer, which operates as both a transmitter and a receiver. When operating as a transmitter, it converts electrical energy into acoustic energy that propagates through the medium; the returning sound is then captured in receive mode, where the transducer acts like a hydrophone and converts the acoustic energy back into electrical energy. An active sonar system uses more power than a passive sonar system.


Figure 1: Active and Passive Sonar

Figure 2: Acoustic Transducer Examples

Beamforming is a technique used in sonar in which the signals from an array of elements are delayed and summed so that sound arriving from a chosen direction adds constructively into one signal. This amplifies the signal and provides the bearing of where the sound originated. This implementation of sonar uses a single microphone and a single speaker; future work on this project includes increasing the number of microphones and individually recording the sounds.
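A minimal delay-and-sum sketch of this idea in MATLAB (not part of this single-microphone project; the array geometry, sample rate, and look direction below are assumed values):

% Delay-and-sum beamforming sketch for a hypothetical 4-element line array.
fs    = 44100;             % sample rate (Hz)
c     = 343;               % speed of sound in air (m/s)
d     = 0.05;              % spacing between adjacent microphones (m)
M     = 4;                 % number of microphones
N     = 1024;              % samples per channel
theta = 30*pi/180;         % look direction (radians from broadside)
x     = randn(N, M);       % placeholder for the M recorded channels

y = zeros(N, 1);
for m = 1:M
    tau = (m-1)*d*sin(theta)/c;          % delay (s) that aligns channel m
    k   = round(tau*fs);                 % delay in whole samples
    y   = y + [zeros(k,1); x(1:N-k, m)]; % shift channel m and add it
end
y = y / M;                               % signals from direction theta add coherently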


2.3 Correlation, Autocorrelation, and Matched Filtering

Matched filtering is a type of filtering that uses correlation to detect a known waveform (Smith). The output of the filter measures how well the filter matches each section of the input signal. The output does not look like the input signal; instead, it indicates the similarity between the two signals, which is especially useful when a noisy background is present. The matched filter suppresses noise and therefore maximizes the signal-to-noise ratio.

Cross-correlation, used in this implementation, measures how similar two waveforms are by recognizing a time delay between them. It uses two signals to produce a third signal, which is helpful when detecting a known waveform in a noisy signal. During cross-correlation, the target signal is detected; the amplitude of the output is a measure of how much the received signal resembles the target signal at that location (Smith). Autocorrelation is the cross-correlation of a signal with itself; its peak occurs at zero lag.

The functions used in MATLAB are shown below; MATLAB also provides options for normalizing the cross-correlation.

c = xcorr(x,y)    % cross-correlation between two signals
c = xcorr(x)      % autocorrelation
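A short, self-contained sketch of this idea (the sample rate, chirp band, delay, and noise level are assumed values used only for illustration, not those of this project):

% Sketch: recover the delay of a noisy, delayed copy of a chirp with xcorr.
fs    = 44100;                         % sample rate (Hz)
t     = 0:1/fs:0.01;                   % 10 ms chirp
pulse = chirp(t, 2000, t(end), 8000);  % 2-8 kHz linear chirp
delay = 150;                           % true delay in samples
rx    = [zeros(1, delay) pulse] + 0.2*randn(1, delay + length(pulse));

[c, lags] = xcorr(rx, pulse);          % cross-correlate received signal with chirp
[~, i]    = max(abs(c));               % strongest match
est_delay = lags(i)                    % close to the true delay of 150 samples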

The figure below describes the mathematical method of cross-correlation along with the matched-filter relation, in which the input signal x[n] is convolved with the time-reversed target signal t[-n]:

x[n] * t[-n] = y[n]


Figure 3: Cross-Correlation (Smith)

3.0 Solution and Implementation

The basic implementation of active sonar involves outputting a windowed chirp signal of varying frequency, letting the chirp travel through the air until it hits an object and scatters back, and receiving a shifted and scaled version of the original signal plus random noise from outside sources (Smith). In this implementation, a chirp signal is output from the computer through a speaker and the return is captured by the microphone. Knowing that sound travels in air at approximately 343 m/s, the time between the transmitted and received signals can be converted to the distance of the detected object from the transducer. The picture below shows the microphone and speaker used.


Figure 4: Microphone and Speaker

3.1 Data Acquisition Toolbox (DAT)

The DAT in MATLAB acquires data from continuous analog electrical signals and converts them to a sequence of numbers through digital sampling. The DAT User's Manual describes the commands for the data acquisition. This conversion allows for signal processing of the sound.

Create analog input and output device objects and open channels for data input and output using the Windows sound card.

ai = analoginput('winsound', 0);
addchannel(ai, 1);
ao = analogoutput('winsound', 0);
addchannel(ao, 1);

Set the sample rate, duration, number of samples to acquire, and trigger condition. The sample rate acquires 44,100 samples in one second. The trigger condition is set so that acquisition begins as soon as the start command is issued. The value of N defines the number of samples per trigger.

duration = 4.5;                        % in milliseconds
SampleRate = 44100;
set([ai ao], 'SampleRate', SampleRate);
set(ai, 'SamplesPerTrigger', N);
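The transmitted pulse x queued below must be generated before the acquisition starts. A minimal sketch using the Signal Processing Toolbox (the 2-8 kHz band, 5 ms length, and Hamming window are assumptions; the exact chirp parameters are not listed in this report):

% Sketch: build the windowed chirp used as the output pulse x (parameters assumed).
T = 0.005;                          % pulse length in seconds
t = 0:1/SampleRate:T;               % time vector at the acquisition sample rate
x = chirp(t, 2000, T, 8000);        % linear chirp sweeping 2-8 kHz
x = x(:) .* hamming(length(t));     % Hamming window to taper the pulse edges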

The following commands start the data acquisition and the chirp output simultaneously. The channel is cleared after each data acquisition and restarted for the acquisition of new data.

putdata(ao, x);
start([ai ao]);
clear ai


Each data set is taken at 5 inches, 10 inches, 15 inches, and 20 inches away from the microphone-speaker system. The data is stored in memory using:

s5 = getdata(ai);

The data is now ready to be processed.

3.2 Signal Processing

The chirp and the pulse received by the microphone differ in two ways: a time delay and the addition of noise to the signal.

Figure 5: Transmitted and Received Signals

The first step in processing the chirp is to perform autocorrelation. The chirp is correlated with itself in order to view what a perfect echo looks like after correlation.

c = xcorr(x);

Next, each received signal is correlated with the original chirp in order to suppress noise and locate the echo in the received signal. The higher the amplitude of the cross-correlation result, the more closely the received signal resembles the target signal at that location (Smith).

s5 = getdata(ai);
plot(t, s5); axis tight; grid on;
xs5 = xcorr(s5, pulse);
plot(t2, xs5(1:length(t2))); hold on;

Ideally, the cross-correlation output shows a sharp spike wherever a chirp echo is detected.


The following block shows the conversion of the time delay to meters and inches for a distance measurement. The time is the difference in seconds between the two sharp spikes of the cross-correlation.

C = 343;                    % speed of sound in air (m/s)
Dmeter = (C * time) / 2;    % calculates distance in meters
Dinch = Dmeter / 0.0254;    % converts meters to inches
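The delay can also be estimated directly from the cross-correlation output. A rough sketch, under the assumption that the two largest peaks are the transmitted pulse and its echo (pulse, s5, and SampleRate follow the earlier snippets; the blanking width is an assumed value):

% Sketch: estimate the echo delay from the two largest cross-correlation peaks.
[xs5, lags] = xcorr(s5, pulse);               % correlate received signal with the chirp
[~, i1] = max(abs(xs5));                      % first (strongest) spike
w = 200;                                      % samples to blank around it (assumed width)
xs5(max(1, i1-w):min(length(xs5), i1+w)) = 0;
[~, i2] = max(abs(xs5));                      % second spike: the reflected echo

time   = abs(lags(i2) - lags(i1)) / SampleRate;  % delay in seconds
Dmeter = (343 * time) / 2;                    % one-way distance in meters
Dinch  = Dmeter / 0.0254;                     % and in inches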

4.0 Results

The results below present the original chirp and its autocorrelation, the recorded sounds at several distances from the target, and the cross-correlation of those recordings with the original chirp, which locates the echo and gives the time delay.

Figure 6: Original Chirp


Figure 7: Received Signal plotted against time (ms)

The figure above displays a test with increments of 5 inches; data was acquired at 5, 10, 15, and 20 inches. It is clearly seen that the traces for 10 and 20 inches are shifted slightly. Removing this shift would make all the traces look similar apart from the time delay of the echo. The time axis is in milliseconds because the duration for the Data Acquisition Toolbox was entered in milliseconds.

Figure 8: Cross-Correlation of Received Signal at 5 inches from Wall


The cross-correlation above shows the echo for the 5 inch distance from the target.

Time delay calculations:

T1 = 2.75 ms
T2 = 3.57 ms
Time = 3.57 - 2.75 = 0.82 ms
Dmeter = (0.82 x 10^-3 * 343) / 2 = 0.14 m
Dinch = 5.5 inches

The key to the cross-correlation signals:

Magenta: true echo

Blue: 5 inches

Green: 10 inches

Red: 15 inches

Figure 9: Cross-Correlation of Received Signals plotted amplitude versus time (ms)

The graph above displays the echoes at 5, 10, and 15 inches from the target. The echoes are consistent at distances close to the target.


5.0 Conclusion and Future Work

This implementation of sonar was successful mostly for distances closer to the sound source. The degradation at longer ranges can be due to scattering of the reflected signal, other noise in the room, and errors from the microphone and speaker hardware. Distances of approximately 5.5 inches and 9.5 inches were recorded. A speaker with more power, outputting a louder signal, would benefit this implementation.

Future work includes:

Get more familiar with the Data Acquisition Toolbox in MATLAB

Add a microphone array with sonar beamforming, and account for more sonar parameters.

Use MATLAB to create a plot like an underwater sonar image, using colors to represent the intensities of the reflected signal.

Implement the system underwater using a hydrophone similar to the one shown below.

Figure 10: Hydrophone

6.0 References

Matejowsky, Eddie. "Eddie's Lounge Room Sonar Project." 12 Jan 2008. Web. 7 Dec 2010. <http://eddiem.com/projects/chirp/chirp.htm>.

Quatieri, Thomas. "Discrete-Time Speech Signal Processing: Principles and Practice." Prentice Hall, 2002.

Smith, Stephen. "The Scientist and Engineer's Guide to Digital Signal Processing." California Technical Publishing. <http://www.dspguide.com/ch1.htm>.

Tohyama, Mikio, and Tsunehiko Koike. "Fundamentals of Acoustic Signal Processing." Academic Press, 1998.

Urick, Robert. "Principles of Underwater Sound," 3rd ed. McGraw-Hill, Inc., 1983.
