DSPRelated.com
Forums

tracking sound source

Started by Sylvia July 15, 2007
On Tue, 24 Jul 2007 05:28:33 -0700, Rune Allnor <allnor@tele.ntnu.no>
wrote:

>On 16 Jul, 19:18, Eric Jacobsen <eric.jacob...@ieee.org> wrote:
>> On Sun, 15 Jul 2007 14:52:38 -0700, Rune Allnor <all...@tele.ntnu.no> wrote:
>> >On 15 Jul, 22:28, "Philip Martel" <pomar...@comcast.net> wrote:
>> >> "Eric Jacobsen" <eric.jacob...@ieee.org> wrote in message news:cfmk93t3lkb3de6tsjj24ots2usdnjnp9o@4ax.com...
>> >> > On Sun, 15 Jul 2007 11:19:31 -0400, "Philip Martel" <pomar...@comcast.net> wrote:
>> >> >>"Sylvia" <sylvia.za...@gmail.com> wrote in message news:uLOdnfVcWq6GhQfbnZ2dnUVZ_u6rnZ2d@giganews.com...
>> >> >>> Does anyone know good material on tracking a single sound source using only two microphones on a dummy head? I have seen Kalman filter tracking for constant-velocity targets etc. in the case of radar applications, but I don't know how to use these in the case of sound sources. Thanks
>> >> >>
>> >> >>Unless you know the sound amplitude of the source, you won't be able to get range information. With only two microphones, you can use beamforming or interferometry techniques to determine the position of the source as somewhere on a cone. In the 2-dimensional case this reduces to 2 lines that cross the line between the sensors at the same point. If, as is usually the case, the source is far from the two microphones compared to the separation of the microphones, you will have localized the source to two lines that cross the line formed by the microphones at a known angle. Usually, you assume that the source is on one side of the sensor.
>> >> >>
>> >> >>With these assumptions, you have a series of angles to the sensor. Google "alpha beta tracker" or "alpha beta gamma tracker" for ways of predicting the source's future position.
>> >> >>
>> >> >> Best wishes,
>> >> >> --Phil Martel
>> >> >
>> >> > Believe it or not, you can get range with a single microphone *with some qualifying assumptions*. Basically, if the target is travelling in a straight line, the Doppler characteristic can be used to determine range once the target approaches close to (but even a little before) the point where it is closest to the microphone.
>> >> >
>> >> > Eric Jacobsen
>> >> > Minister of Algorithms
>> >> > Abineau Communications
>> >> > http://www.ericjacobsen.org
>> >>
>> >> Well, given a fixed-frequency sound source (a helicopter, for example) I suppose you're right, though I'd have to think about it for a while to convince myself that the shape of the Doppler curve before CPA and the bearing rate would be enough to determine a unique range.
>> >
>> >You can't fix the range that way, only get a time for the CPA. You'll need at least two mics to get a bearing to the CPA.
>> >
>> >In order to fix a range with only one mic, you will need *knowledge* of the type of helicopter. If you *know* the make and model of the helicopter, you also *know* certain key characteristics in the sound signature, and can use those to estimate the speed and range based on the Doppler characteristics. Provided, of course, that the pilot plays your game and flies at constant speed in a straight line.
>> >
>> >Once you no longer *know* the characteristics, but have to *estimate* them, with all the uncertainty that follows, all bets are off where range and speed are concerned -- again, with only one mic involved. If you have an array where you can track bearings, things become somewhat easier.
>> >
>> >Rune
>>
>> Well, I demonstrated range detection using a single microphone for my thesis, and it required no previous characterization of the signal. It does require that the target is moving straight and level and isn't making rapid variations in its acoustic signature (slower variations are actually okay). It also requires that the acoustic signature has some discernible features that provide a reasonably well-behaved cross-correlation of the spectrum.
>
>How did you do that? You need to observe the source while passing the CPA and estimate the acoustic signature with no Doppler? OK, I'll agree that would work, but it would hardly be robust. As you may be aware, I have this very awkward preoccupation with applications and robustness; I can't see how your method would work if you do *not* observe the source at CPA and do *not* know the source characteristics.
>
>Rune
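The flyby geometry being discussed can be made concrete. For a source moving in a straight line at constant speed v past a microphone, with CPA distance d and t = 0 at the closest point of approach, the range rate is r'(t) = v^2 t / sqrt(d^2 + v^2 t^2), which produces the S-shaped Doppler frequency curve. A minimal sketch (the function name and parameter values are illustrative, not from the thread; propagation delay is neglected):

```python
import numpy as np

C_SOUND = 343.0  # nominal speed of sound in air, m/s

def doppler_curve(t, f0, v, d, c=C_SOUND):
    """Observed frequency of an f0 tone from a source moving in a
    straight line at speed v (m/s) with CPA distance d (m), where
    t = 0 at the closest point of approach.  Propagation delay
    (retarded time) is neglected to keep the geometry clear."""
    r_dot = v ** 2 * t / np.sqrt(d ** 2 + (v * t) ** 2)  # range rate
    return f0 * c / (c + r_dot)

# A 100 Hz rotor line, 50 m/s flyby, CPA at 200 m: the observed
# frequency is high and nearly flat inbound (t < 0), sweeps down
# fastest near the CPA, and settles low outbound -- the S-shaped
# Doppler curve whose derivative carries the range information.
t = np.linspace(-10.0, 10.0, 5)
f = doppler_curve(t, f0=100.0, v=50.0, d=200.0)
```

Because both v and d enter the shape of this curve (how sharply it sweeps through f0), observing enough of it constrains both, which is what makes single-mic ranging possible under the straight-and-level assumption.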
Rune, I was travelling for quite a while and just got back, but I wanted to catch up on this. BTW, I'm not clear on what "CPA" stands for, but I'm guessing it's the closest point of approach or something like that, i.e., the point where the target is closest to the sensor?

Anyway, the trick is just to observe that one doesn't really need the full Doppler curve to start the processing; the derivative will do. And to get the derivative you don't really need to know the acoustic signature of the source, just the change in frequency at reasonable intervals. To get the change, you just cross-correlate the spectrum at intervals and see how much it has shifted. If you do this frequently enough, the dilation will be small and the correlation high enough to give a good, discernible peak from which the shift can be computed well enough to do the job.

Once the peak in the derivative is detected (which happens slightly before the closest distance to the sensor), one has enough information to estimate the entire Doppler curve as well as the time of the closest point. From there one knows the actual, unshifted acoustic signature by computing the zero-Doppler spectrum (since you now know when that occurs), but you don't really need it to do the estimation this way. The cross-correlation process has a lot of processing gain and does a pretty good job of rejecting uncorrelated noise or interference.

"Robustness" is in the eye of the beholder and subject to the requirements of a particular application. I was just showing that it was possible, but it did work reasonably well. If you need very high accuracy you need some other method, but if you just want to get a good approximation so that you can focus some other sensor, this works quite well.

When I first proposed my thesis (which I did independently; it was not funded research), I proposed estimating both velocity and range. Halfway into it I was beginning to think that I was in over my head and that range could not be estimated from the information I was computing. I asked my thesis professor if I could drop the range estimation and he refused: "Must do range!" (in a Korean accent). So I figured I'd spend the weekend doing a proof that showed that I *couldn't* get range from a single sensor, but to my surprise, by the end of the weekend I wound up with a pretty simple algorithm for estimating range...and it worked pretty well.

Eric Jacobsen
Minister of Algorithms
Abineau Communications
http://www.ericjacobsen.org
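The shift-tracking step Eric describes can be sketched as follows. This is a hypothetical reconstruction, not his thesis code (the function name `spectral_shift` and the test signal are made up): successive magnitude spectra are cross-correlated, and the lag of the correlation peak gives the frequency change per interval, i.e. samples of the Doppler-curve derivative.

```python
import numpy as np

def spectral_shift(spec_a, spec_b):
    """Estimate how many bins spec_b is shifted relative to spec_a
    by locating the peak of their cross-correlation."""
    a = spec_a - spec_a.mean()
    b = spec_b - spec_b.mean()
    xc = np.correlate(b, a, mode="full")
    return int(np.argmax(xc)) - (len(a) - 1)

# Simulate a tone whose frequency falls 6 Hz between snapshots,
# as it would while a source sweeps through its Doppler curve.
fs, n = 8192, 4096                   # sample rate, FFT size
t = np.arange(n) / fs
freqs = [1000.0, 994.0, 988.0]       # exact-bin tones (bin = 2 Hz)
spectra = [np.abs(np.fft.rfft(np.sin(2 * np.pi * f * t))) for f in freqs]

bin_hz = fs / n                      # 2 Hz per FFT bin
shifts = [spectral_shift(spectra[i], spectra[i + 1]) * bin_hz
          for i in range(len(spectra) - 1)]
# shifts -> [-6.0, -6.0] Hz: the per-interval frequency change,
# computed without ever knowing the source's true signature.
```

As the thread notes, the shift between closely spaced snapshots is small, so the dilation of the spectrum is negligible and a plain cross-correlation peak suffices; a real signature with several harmonics would sharpen the peak further and add processing gain against uncorrelated noise.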