DSPRelated.com
Forums

Real Life Beamforming with substandard arrays. Need help forming some sort of solution.

Started by Phil Winder June 25, 2008
Hi,
I'm doing a very simple beamforming/phased array project for a bit of
fun.  What I have is 4 40kHz ultrasonic transducers being sampled at
200kHz.
I'm having problems formulating a simple delay-and-sum beamformer.
When I simply delay by so many samples and sum, what I get
are answers which are reasonable but wrong, so I can't quite figure out
what step I'm missing.
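For concreteness, a minimal sketch of the integer-sample delay-and-sum described above (NumPy; the array and variable names are just illustrative, not my actual code):

    import numpy as np

    # ch: (n_samples, 4) array of the four captured channels at 200 kHz
    # delays: one non-negative integer sample delay per channel
    def delay_and_sum(ch, delays):
        n, m = ch.shape
        out = np.zeros(n)
        for i in range(m):
            d = int(delays[i])
            out[d:] += ch[:n - d, i]   # shift channel i later in time by d samples
        return out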

The data received are 4 Gaussian-shaped sinusoids from a reflection
that is about head on (broadside).  If I add them up without delay I
get a nice big Gaussian sine wave.  If I delay the data slightly then
the Gaussian reduces, but when I delay it enough so that the 4 signals
are about back in phase again (actually 360 degrees out) I get a big
Gaussian again.  This makes total sense.  So what am I doing wrong?

I thought about retrieving the envelope of the Gaussian and then doing the
delay and sum, although I haven't tried it yet.  That should work, but I
don't think it is optimal in any sense.

Thanks for any responses.

Phil
On 25 Jun, 12:55, Phil Winder <philipwin...@googlemail.com> wrote:
[...]
I can think of a few main reasons for the problems:

1) Sampling synchronisation. Sampling the different channels with the
required accuracy is a big deal. Are you sure your system is
sufficiently precise?

2) Sensor directivity. Ideally, array sensors ought to be
omnidirectional. Unless a sensor has been designed to be used in an
array, it is likely that the directivity is far from omnidirectional.

3) Reflections. Arrays that are used in multipath environments behave
weirdly. One diagnostic is that some perform well only at/near
broadside.

And then there are implementations of the delay operator. Unless you
sort out how to implement fractional delay, you might get weird
problems.

One possible way of testing the hardware would be to use a sinusoidal
steady-state signal. Record the signal for some time with the source
at different angles. Then compute the (f,k) spectrum of the signals
and see that the spike in the spectrum moves along the k axis.

Rune
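For reference, a minimal sketch of the (f,k) computation described above, assuming the four synchronously sampled channels sit in a NumPy array of shape (n_samples, n_sensors) from a uniform line array with spacing d (all names are illustrative):

    import numpy as np

    # x: (n_samples, n_sensors) synchronously sampled data
    # fs: temporal sample rate in Hz, d: sensor spacing in metres
    def fk_spectrum(x, fs, d):
        n_t, n_s = x.shape
        # 2-D FFT: time axis -> frequency f, sensor axis -> wavenumber k
        FK = np.fft.fftshift(np.fft.fft2(x))
        f = np.fft.fftshift(np.fft.fftfreq(n_t, d=1.0 / fs))   # Hz
        k = np.fft.fftshift(np.fft.fftfreq(n_s, d=d))          # cycles/metre
        return f, k, np.abs(FK)

With only 4 sensors the k axis is coarse, but the spike should still move along k as the source angle changes.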

Phil Winder wrote:


> I'm doing a very simple beamforming/phased array project for a bit of
> fun.  What I have is 4 40kHz ultrasonic transducers being sampled at
> 200kHz.
That looks like pretty coarse time resolution.
> I'm having problems formulating a simple delay-and-sum beamformer.
> When I simply delay by so many samples and sum, what I get
> are answers which are reasonable but wrong, so I can't quite figure out
> what step I'm missing.
>
> The data received are 4 Gaussian-shaped sinusoids from a reflection
> that is about head on (broadside).
How does the envelope of the pulse compare to the beamforming delays?
> If I add them up without delay I
> get a nice big Gaussian sine wave.  If I delay the data slightly then
> the Gaussian reduces, but when I delay it enough so that the 4 signals
> are about back in phase again (actually 360 degrees out) I get a big
> Gaussian again.  This makes total sense.  So what am I doing wrong?
>
> I thought about retrieving the envelope of the Gaussian and then doing the
> delay and sum, although I haven't tried it yet.  That should work, but I
> don't think it is optimal in any sense.
Try increasing the sample rate and use CW sine.

Vladimir Vassilevsky
DSP and Mixed Signal Design Consultant
http://www.abvolt.com
On Jun 25, 3:14 pm, Vladimir Vassilevsky <antispam_bo...@hotmail.com>
wrote:
[...]
Hi, thanks for the replies.

@Rune:
Synchronisation is not a big problem here, since all the targets and
the array are static.  Results taken are always the same.
I haven't seen any multipath in my setup; the walls are quite far away.
The delay operator is simply having the data pointer at incremented
intervals, e.g. A[0], B[2], C[4], D[6].  What do you mean by
fractional delay?  Do you mean I would have to interpolate the data so
that I could do A[0], B[0.5], C[1], D[1.5]?
And that is all I am really doing with this test, although I am
creating the signal on the same board so I do rely on reflections.
Thanks.

@Vladimir:
Yeah I know, but unfortunately that's as fast as my ADC will go on the
microcontroller.  Like before, I could interpolate?  The envelope is
very long compared to the delays, hence the 360 degree in-phase
problem.  What do you mean by "CW sine"?

Thanks,
Phil
On Jun 25, 3:55 am, Phil Winder <philipwin...@googlemail.com> wrote:

> Hi,
> I'm doing a very simple beamforming/phased array project for a bit of
> fun.  What I have is 4 40kHz ultrasonic transducers being sampled at
> 200kHz.
You haven't said what the array shape is. Is it a line equally spaced? If so, how far apart are the elements?
[...]
> The data received are 4 Gaussian-shaped sinusoids from a reflection
> that is about head on (broadside).  If I add them up without delay I
> get a nice big Gaussian sine wave.
That's good.
> If I delay the data slightly then the Gaussian reduces
That's good.
> but when I delay it enough so that the 4 signals
> are about back in phase again (actually 360 degrees out) I get a big
> Gaussian again.  This makes total sense.
True
> So what am I doing wrong?
You have failed to recognize that you have rediscovered spatial aliasing.
[...]
Most of the linear array beamforming literature deals with arrays
where the element separation is less than or equal to half a
wavelength at the highest frequency of interest.  This is the same as
being at or below the Nyquist frequency for the spatial sampling.
Where does your array fit in?

Dale B. Dalrymple
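For reference, the spatial-Nyquist numbers at 40 kHz in air (assuming a sound speed of roughly 343 m/s; a back-of-the-envelope sketch only):

    c = 343.0               # m/s, speed of sound in air at room temperature
    f = 40e3                # Hz
    wavelength = c / f      # ~8.6 mm at 40 kHz
    print(wavelength / 2)   # ~4.3 mm: spacing at or below this avoids spatial aliasing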
On 25 Jun, 19:20, Phil Winder <philipwin...@googlemail.com> wrote:
[...]
> Synchronisation is not a big problem here, since all the targets and
> the array are static.  Results taken are always the same.
> I haven't seen any multipath in my setup; the walls are quite far away.
Famous last words...

You need to consider the duration of the pulse (T) and the speed of
sound in the medium (c).  The spatial scale is determined by the
length (in metres) of the pulse, computed as L = c*T.  So if the pulse
lasts for 0.1 s and the medium is air with sound speed 340 m/s, the
length of the pulse is 34 metres.  If any reflecting surface (walls,
ceilings, floors, ground) in your setup is closer than that, you are
de facto dealing with multipath propagation.
> The delay operator is simply having the data pointer at incremented
> intervals, e.g. A[0], B[2], C[4], D[6].  What do you mean by
> fractional delay?  Do you mean I would have to interpolate the data so
> that I could do A[0], B[0.5], C[1], D[1.5]?
Yes.  The time delay td between consecutive elements in a uniform
linear array is given by the direction of arrival as something like
(writing off the top of my mind, might not be 100% correct)

    td = D*cos(theta)/c

where D is the distance between sensors, c is the speed of sound, and
theta is the direction of arrival relative to the array axis.  So if
you constrain yourself to integer delays you will only be able to
steer the beam in a few directions.  If you can implement arbitrary
delays you can steer the beams in any direction.

Rune
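A sketch of what arbitrary (fractional) delays could look like with plain linear interpolation and the td formula above; the function and variable names are made up for illustration, and the sign conventions depend on your geometry:

    import numpy as np

    def steer_delay_and_sum(x, fs, d, c, theta):
        """Fractional-delay delay-and-sum via linear interpolation (sketch only).
        x: (n_samples, n_sensors) data, fs: sample rate (Hz),
        d: element spacing (m), c: sound speed (m/s),
        theta: look direction measured from the array axis (rad)."""
        n_t, n_s = x.shape
        t = np.arange(n_t) / fs
        out = np.zeros(n_t)
        for m in range(n_s):
            tau = m * d * np.cos(theta) / c            # td = D*cos(theta)/c per element
            out += np.interp(t - tau, t, x[:, m], left=0.0, right=0.0)
        return out / n_s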
"Phil Winder" <philipwinder@googlemail.com> wrote in message 
news:ea3868f4-81b2-4721-99a3-1ce0c5dab835@m73g2000hsh.googlegroups.com...
[...]
Phil,

Vladimir suggested higher temporal resolution.  You should get into
the numbers if you want to think this through:

- You are currently sampling 5 times per cycle.  That amounts to
samples that are 360/5 = 72 degrees apart.  That's a lot of phase
shift.  Thus the comments you get regarding fractional delays.  The
simple solution is to be able to beamform without fractional delays.
What is the minimum phase shift you need in the implementation?

You didn't really say what beamforming implementation other than delay
and sum you're using.  You didn't say what "reasonable but wrong"
means to you.  By my reckoning, if everything is perfectly aligned in
time according to your description, the first steered beam you could
form would be with phase shifts of 0, 72, 144, 216 degrees - assuming
an integral number of samples to form a delay using 0, 1, 2, 3 sample
shifts.

You didn't say what the spacing between elements is.  The sum of the
elements has a beam pattern determined by the spacing.  As you delay
the element outputs this has an effect for everything that's not
broadside.  This is a big deal, so you can't overlook it.  The beams
that are formable depend on the element spacing.  You need to relate
the electrical phase angles (delays) to the mechanical phase angles
(the beam steering direction) somehow.

Consider this: let's put the element spacing at 360 degrees of phase -
the same as 5 samples.  That means they are 0.0281 feet apart for
40kHz.  OK - so now even if you don't delay any of the elements there
will be beam pattern peaks at broadside and at endfire.  And, when you
delay each element in increments of 5 samples they will all be in
phase again.  90 mechanical degrees is the maximum steering angle for
a line array because the beam is "conical", and 75 degrees and 105
degrees are on the same conical beam that's formed... well, conical in
a perfect sense, which this isn't.  There will be big sidelobes.

And, if you have this spacing then you can steer the beam in
increments of 1 sample delay (theoretically).  I think this works out
to beams that are spaced 22.5 degrees apart.

Now, if the target is always broadside then how are you testing the
beamformer?  At best you will get an attenuation as you steer the beam
away from broadside.  There will be sidelobes that will have less
attenuation.  All this in a perfect world.

But, you don't get the Gaussian shape?  There are two key reasons for
this that I can think of:

1) Beamformers by their nature create constructive and destructive
interference of the received waveforms.  If you have strong
destructive interference in the middle of the waveform for a
particular beam steering direction then that will definitely change
the envelope.  This will definitely happen if you steer a null toward
the target.  The middle of the Gaussian (if it's long enough) will be
zero and all you will see are the leading and trailing edges... just
like a FIR filter transient response.

2) If there is any multipath then that acts a bit like a set of
additional delays laid on top of each element/delay and you can get
what's described in #1.  Lloyd's Mirror is a simple example of this
where peaks and nulls are formed as a function of frequency.

Fred
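To see the endfire peaks described above numerically, here is a short sketch of the array factor for 4 elements at one-wavelength spacing with no steering delays (an illustration of the hypothetical spacing, not anyone's actual implementation):

    import numpy as np

    # Array factor of a 4-element uniform line array, spacing = 1 wavelength
    # ("360 degrees of phase"), no steering delays.
    N = 4
    d_over_lambda = 1.0
    theta = np.linspace(0.0, np.pi, 721)                  # angle from the array axis
    psi = 2.0 * np.pi * d_over_lambda * np.cos(theta)     # inter-element phase
    af = np.abs(np.exp(1j * np.outer(np.arange(N), psi)).sum(axis=0)) / N
    # af equals 1 (full response) at broadside (90 deg) AND at 0/180 deg (endfire):
    # those endfire peaks are the grating lobes that come with one-wavelength spacing.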
Phil Winder wrote:
[...]
You haven't told us about the geometries, distances, medium (water?)
etc.

As Vladimir mentioned, a 1/5th cycle delay granularity is rather
coarse.  Us medical ultrasound guys usually shoot for 1/20th or
better.  So you may have to increase your sample rate.

Find out what the ultrasound propagation speed in your medium is.
Then calculate the delay between target and each transducer.  Multiply
times two (often forgotten in the heat of the game ...) and see how
much delay you need for each.  That should always jibe.

Of course, 40kHz easily gets into frames, surfaces and so on, and then
you end up having one big transducer instead of four.  They need to be
somewhat isolated and have good backing material.  Fixing all four
onto a hard surface with some epoxy won't likely cut it.  Mounting and
backing is a whole science unto itself; we often have special
materials engineers just for that.

--
Regards, Joerg

http://www.analogconsultants.com/

"gmail" domain blocked because of excessive spam.
Use another domain or send PM.
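A quick back-of-the-envelope version of that delay check, with purely illustrative numbers (a 30 cm broadside target and one-wavelength element spacing are assumptions, not measurements):

    import numpy as np

    c = 343.0                                    # m/s, sound speed in air
    fs = 200e3                                   # ADC sample rate, Hz
    spacing = c / 40e3                           # ~8.6 mm element spacing
    xs = (np.arange(4) - 1.5) * spacing          # element x positions, centred on 0
    target_x, target_y = 0.0, 0.30               # target at broadside, 30 cm out
    dist = np.hypot(xs - target_x, target_y)     # element-to-target distance, m
    round_trip = 2.0 * dist / c                  # out and back ("multiply times two")
    print(round_trip * fs)                       # expected delay per element, in samples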
On Jun 26, 2:03 am, "Fred Marshall" <fmarshallx@remove_the_x.acm.org>
wrote:
> "Phil Winder" <philipwin...@googlemail.com> wrote in message > > news:ea3868f4-81b2-4721-99a3-1ce0c5dab835@m73g2000hsh.googlegroups.com... > > > > > Hi, > > I'm doing a very simple beamforming/phased array project for a bit of > > fun. What I have is 4 40kHz ultrasonic transducers being sampled at > > 200kHz. > > I'm having problems formulating a simple delay and sum beamformer. > > When I do quite simply delay by so many samples and sum, what I get > > are answers which are reasonable but wrong. So I cant quite figure out > > what step I'm missing. > > > The data received are 4 Gaussian shaped sinusoids from a reflection > > that is about head on (broadside). If I add them up without delay I > > get a nice big gaussian sine wave. If I delay the data slightly then > > the gaussian reduces but when I delay it enough so that the 4 signals > > are about back in phase again (actually 360 degrees out) I get a big > > gaussian again. This makes total sense. So what am I doing wrong? > > > I thought about retrieving the envelope of the gaussian then doing the > > delay and sum, although I havent tried yet. That should work, but I > > dont think it is optimal in any sense. > > Phil, > > Vladmir suggested higher temporal resolution. You should get into the > numbers if you want to think this through: > > - you are currently sampling 5 times per cycle. That amounts to samples > that are 360/5 = 72 degrees. That's a lot of phase shift. Thus the > comments you get regarding fractional delays. The simple solution is to be > able to beamform without fractional delays. What is the minimum phase shift > you need in the implementation? > > You didn't really say what beamforming implementation other than delay and > sum you're using. You didn't say what "reasonable but wrong" means to you. > By my reckoning, if everything is perfectly aligned in time according to > your description, the first steered beam you could form would be with phase > shifts of 0, 72, 144, 216 degrees - assuming an integral number of samples > to form a delay using 0, 1, 2, 3 sample shifts. > > You didn't say what the spacing between elements is. The sum of the > elements has a beam pattern determined by the spacing. As you delay the > element outputs this has an affect for everything that's not broadside. > This is a big deal so you can't overlook it. The beams that are formable > depend on the element spacing. You need to relate the electrical phase > angles (delays) to the mechanical phase angles (the beam steering direction) > somehow. > > Consider this: let's put the element spacing at 360 degrees of phase - the > same as 5 samples. That means they are 0.0281 feet apart for 40kHz. > OK - so now even if you don't delay any of the elements there will be beam > pattern peaks at broadside and at endfire. And, when you delay each element > in increments of 5 samples they will all be in phase again. 90 mechanical > degrees is the maximum steering angle for a line array because the beam is > "conical" and 75 degrees and 105 degrees are on the same conical beam that's > formed. ..... well, conical in a perfect sense which this isn't. There will > be big sidelobes. > > And, if you have this spacing then you can steer the beam in increments of 1 > sample delay (theoretically). I think this works out to beams that are > spaced 22.5 degrees apart. > > Now, if the target is always broadside then how are you testing the > beamformer? At best you will get an attentuation as you steer the beam away > from broadside. 
There will be sidelobes that will have less attentuation. > All this in a perfect world. > > But, you don't get the gaussian shape? There are two key reasons for this > that I can think of: > > 1) Beamformers by their nature create constructive and destructive > interference of the received waveforms. If you have strong destructive > interference in the middle of the waveform for a particular beam steering > direction then that will definitely change the envelope. This will > definitely happen if you steer a null toward the target. The middle of the > gaussian (if it's long enough) will be zero and all you will see are the > leading and trailing edges. ... just like a FIR filter transient response. > > 2) If there is any multipath then that acts a bit like a set of additional > delays laid on top of each element/delay and you can get what's described in > #1. Lloyd's Mirror is a simple example of this where peaks and nulls are > formed as a function of frequency. > > Fred
Wow, thanks for all the time guys.  Well thought out posts there.

Firstly I will answer some general questions that some of you put:

Spacing: Yes, I am using 1 wavelength separation, or 360 degrees.
This is purely a mechanical constraint.  I figured the wavelength to
be 8mm.  The transducers are 15mm wide, so I had to mount them in a
-_-_ format so that the element spacing would be constrained.  I
believed that this would introduce a vertical directivity, but not
affect the horizontal.  This is acceptable.

Medium: Air.

Receive pulse length: About 6 ms = 2m.  Plus it's not quite Gaussian.
It's more like the transmission signals you see: curved up, then decay
down.

@Dale: I think that most people agree that there is not enough
resolution here, so I will try and implement something a bit more
fine-scale by interpolating.  I don't really want to use an external
ADC, to keep it simple.  I will try and get it working this way, and
if it doesn't then I might give it a try.

@Rune: Ah yeah, never thought of it that way.  Well, you could argue
that the spreading in the pulse is in some part due to multipath.
That will only affect the temporal resolution, so I'm not too fussed
about that yet.  I want to get some sort of directivity first.  And
yes, I will try and interpolate and see if that helps.

@Joerg: Yes, it is very coarse.  It is a constraint of my hardware and
it's at its maximum already.  I could get an external ADC, but this
would complicate things further.  As for distance, I'm trying to go as
far as possible.  At the moment it's about 2m, but I'm hoping to get
more when I swap to a better microcontroller.  My test object is about
30cm away.  Yes, that's a good idea, just to have a look at the times;
I'll try that.  And transducer frame coupling: bit too advanced yet.
In any case I am only steering the receive at present, not the
transmit.  But yes, that could be a problem.  In the future I will
test for that.  Thanks.

@Fred: Long post!  Yes, I agree that the conversion is slow.  10
degrees of accuracy would be great, but for now 45 degrees would make
me go yaaaay.  But I disagree that increasing the conversion rate
would be simplest.  It means extra hardware and extra coding.  If I
did it in software then it's not going to be strictly true, but it's
only coding.  A lot simpler.  Beamforming is delay and sum.  That's
it.  The transmit is currently firing a broadside pattern (assuming
that it doesn't become one big coupled transducer, as Joerg warns).
By reasonable I meant that the signals go from being in phase, to out
of phase, then in phase again, although the envelope has clearly
moved, causing a large signal again.  But I think interpolating and
making sure that I don't go over 180 degrees should fix that.

I did some simulations on the beam width, and it's wide.  Very wide,
about 30 degrees, but it is directional, and that is the aim at the
moment.  Beta 1, as it were.  Plus, because of the 1 wavelength
spacing there were some nasty sidelobes around endfire.  But because
of the directionality of the transducers (vaguely directional) I
thought that these would compensate slightly for the bad sidelobes.
But obviously I wouldn't be able to get anything on or near to
endfire.  I am not scanning the transmit, only the receive, and I'm
doing that in software.  So yes, I agree that the sidelobes would have
something nasty to say, but I want to see more of a distinction.  I
think interpolation is the way to go.

And finally, yes I do get the Gaussian shape.  It's due to the
transmission medium and the fact that I am putting a square wave in to
drive the transmitters.  The pulse would get "smoothed" out because of
the transducers' mechanical resistance, and then due to multipath and
scattering the pulse comes back as described.

Thanks to all of you that have replied.  Any more thoughts, then feel
free to answer back, but for the mean time I am going to give the
interpolation a go.

Best Regards,
Phil
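One possible software-only route for that interpolation, sketched with SciPy's polyphase resampler (the variable names are assumptions):

    import numpy as np
    from scipy.signal import resample_poly

    # Upsample the captured 200 kHz channels by 8x (to 1.6 MHz) before the
    # integer delay-and-sum, so one-sample shifts are 9 degrees at 40 kHz
    # instead of 72.  x is assumed to be an (n_samples, 4) array of raw captures.
    def upsample_channels(x, factor=8):
        return resample_poly(x, up=factor, down=1, axis=0)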
Phil Winder wrote:

[...]

> @Rune: Ah yeah, never thought of it that way.  Well, you could argue
> that the spreading in the pulse is in some part due to multipath.
> That will only affect the temporal resolution, so I'm not too fussed
> about that yet.  I want to get some sort of directivity first.  And
> yes, I will try and interpolate and see if that helps.
Interpolation after the acquisition won't yield that much.  Since it
seems you are bent on using a small uC, you could try this:

a. Fire -> Acquire at 200ksps.
b. Fire one MCLK cycle delayed -> Acquire at 200ksps.
c. Fire two MCLK cycles delayed -> Acquire at 200ksps.
d. ...

Stop after you are one MCLK cycle away from a full 360 degree turn of
your 200kHz ADC clock.  Now stitch together all the information
gathered.

If your uC begins to choke under all that data you could loosen the
spec a bit, maybe step several MCLK cycles each time, but you need to
get into the vicinity of at least 1 MSPS.

This is pretty much how digital scopes work in equivalent-time
sampling mode, where their ADC isn't fast enough for the signal.  It
requires that the signal doesn't move, or in your case that the
targets don't move too much.

[...]

--
Regards, Joerg

http://www.analogconsultants.com/

"gmail" domain blocked because of excessive spam.
Use another domain or send PM.
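A minimal sketch of the stitching step, assuming for simplicity that the ADC start (rather than the fire) is stepped one fine tick later per capture; if the fire is the one being stepped, the interleave order simply reverses:

    import numpy as np

    # captures[k] is the 200 ksps record from the k-th firing of a static scene,
    # with the ADC start stepped k fine ticks later each time.  Interleaving the
    # K records gives an effective sample rate of K * 200 ksps.
    def stitch_equivalent_time(captures):
        K = len(captures)
        n = min(len(c) for c in captures)
        out = np.empty(K * n)
        for k, c in enumerate(captures):
            out[k::K] = c[:n]
        return out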