# Acoustic Beamforming Question

Started September 28, 2009
```Hello Greg,

Greg Berchin wrote:
> On Mon, 28 Sep 2009 13:41:53 -0700 (PDT), HardySpicer
> <gyansorova@gmail.com> wrote:
>
>> I always assumed that sensors for a beamformer had to be spaced at
>> least lambda/2 (half a wavelength) apart else we get spatial aliasing.
>
> That's backwards.  They have to be spaced at MOST half a wavelength
> apart to avoid spatial aliasing.
>
>> Is this true for acoustic arrays too?
>
> It is true for all arrays.  Just remember that the shading
> coefficients and the beam pattern form a Fourier Transform pair (with
> an extra sin[theta] term, where theta is the angle from array-normal).
> If you think of a line array in "endfire" (theta equals 90 degrees)
> and a sinusoidal signal, then each element is sampling the sine wave
> as it passes by.  If the elements are spaced farther than a half
> wavelength apart, it is like sampling a time domain waveform at less
> than twice the frequency of the sinusoid.  At any theta less than 90
> degrees, the elements are effectively closer together and spatial
> aliasing is less likely.

I think if you want an array to work well in endfire mode you need
a maximum of 1/4 wavelength separation (not 1/2 wavelength). For a
given total (1D) aperture length the gain in endfire mode is
potentially 3 dB greater than in broadside mode, but to get this
benefit you need twice as many microphones/hydrophones in the array.
At half-wavelength spacing you can't tell which end of the array an
incident endfire wave is coming from (at least at the maximum
operating frequency).

Regards
--
```
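Greg's sampling analogy is easy to check numerically. The sketch below (Python/NumPy, not from the thread; the 8-element count, broadside steering, and angle grid are arbitrary illustrative choices) evaluates the uniform line-array factor |sin(Nψ/2)/sin(ψ/2)| and counts the near-unity lobes. At λ/2 spacing there is a single main lobe; at λ spacing, grating lobes (the spatial aliases) appear along the array axis.

```python
import numpy as np

def array_factor(n_elem, spacing_wl, steer=0.0, n_pts=3601):
    """Normalised |AF| of a uniform line array.

    spacing_wl : element spacing in wavelengths (d / lambda)
    steer      : inter-element phase shift in radians (0 = broadside)
    """
    theta = np.linspace(0.0, np.pi, n_pts)            # angle from the array axis
    psi = 2.0 * np.pi * spacing_wl * np.cos(theta) + steer
    num = np.sin(n_elem * psi / 2.0)
    den = np.sin(psi / 2.0)
    # AF -> n_elem wherever psi is a multiple of 2*pi (limit of 0/0)
    safe = np.where(np.abs(den) < 1e-12, 1.0, den)
    af = np.where(np.abs(den) < 1e-12, float(n_elem), num / safe)
    return theta, np.abs(af) / n_elem

def count_lobes(af, thresh=0.9):
    """Number of distinct angular regions where the response is near unity."""
    mask = af > thresh
    return int(mask[0]) + int(np.sum(np.diff(mask.astype(int)) == 1))

_, af_half = array_factor(8, 0.5)   # lambda/2 spacing
_, af_full = array_factor(8, 1.0)   # lambda spacing
print(count_lobes(af_half), count_lobes(af_full))   # -> 1 3
```

The two extra lobes at λ spacing sit at θ = 0 and θ = π (along the array axis), exactly where an undersampled broadside array folds its main beam.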
```On Sep 28, 3:41 pm, HardySpicer <gyansor...@gmail.com> wrote:
> I always assumed that sensors for a beamformer had to be spaced at
> least lambda/2 (half a wavelength) apart else we get spatial aliasing.
> Is this true for acoustic arrays too? I have seen papers on small
> arrays (circular, say) for mobiles and such like. Let us assume that
> the mics are 1cm apart, giving a wavelength of 0.02m and a (minimum)
> frequency of 16.5kHz. Not much use, is it?
>
> Hardy

Here is something I found some time ago to simulate an array and plot
what the theoretical pattern would look like.  Don't remember where I
got it.  You can play around with the number of elements, the spacing,
and the phasing.

function [r,theta] = antplot(N,D,S)
%
%  function [r,theta] = antplot(N,D,S)
%
%  Plots the polar coordinate antenna pattern of an isotropic array
%
%  N = Number of elements in the array
%  D = the distance between elements in terms of wavelengths
%  S = Phasing in the array
%    For a broadside array, S = 0 (0 degrees)
%    for an end-fire array, S = -1.57 (-90 degrees)
%    for an arbitrary array angle, S = angle in radians
%
d = 2*pi*D;                        % element spacing in radians
g = 67/N;                          % plot scaling factor
theta = .01:.01:2*pi;
a = cos(theta);
p = d*a + S;                       % inter-element phase at angle theta
r = g*abs(sin(N*p/2)./sin(p/2));   % uniform-array factor magnitude
theta = theta(:);
r = r(:);
polar(theta,r);

Maurice
```
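The quarter-wave endfire claim from earlier in the thread can be checked with the same array-factor formula as antplot (a Python sketch, not from the thread; note that the MATLAB comment's S = -1.57 rad is exactly -2π·D for quarter-wave spacing D = 0.25): at λ/2 spacing an endfire-phased array has a full-height rear lobe, while at λ/4 the rear response is nulled.

```python
import numpy as np

def endfire_af(n_elem, spacing_wl, theta):
    """Normalised |AF| of a line array phased for endfire toward theta = 0.

    Uses the same p = 2*pi*D*cos(theta) + S formula as antplot, with the
    endfire phasing S = -2*pi*D substituted in.
    """
    psi = 2 * np.pi * spacing_wl * (np.cos(theta) - 1.0)
    num = np.sin(n_elem * psi / 2)
    den = np.sin(psi / 2)
    safe = np.where(np.abs(den) < 1e-12, 1.0, den)
    af = np.where(np.abs(den) < 1e-12, float(n_elem), num / safe)
    return np.abs(af) / n_elem

theta = np.array([0.0, np.pi])                  # front (steered) and rear
front_q, back_q = endfire_af(8, 0.25, theta)    # quarter-wave spacing
front_h, back_h = endfire_af(8, 0.50, theta)    # half-wave spacing
# quarter-wave: rear nulled; half-wave: full-height rear lobe
# (the front/back ambiguity described above)
```

So at λ/2 spacing the endfire array responds identically to waves arriving from either end, which is the ambiguity the earlier post describes.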
```On 29 Sep., 11:17, Rune Allnor <all...@tele.ntnu.no> wrote:
> On 29 Sep, 10:52, Andor <andor.bari...@gmail.com> wrote:
>
> > On 28 Sep., 23:00, Rune Allnor <all...@tele.ntnu.no> wrote:
>
> > > On 28 Sep, 22:41, HardySpicer <gyansor...@gmail.com> wrote:
>
> > > > I always assumed that sensors for a beamformer had to be spaced at
> > > > least lambda/2 (half a wavelength) apart else we get spatial aliasing.
>
> > > If the spacing is smaller than L/2, then there are few if
> > > any problems with spatial aliasing, but the array might
> > > be very expensive. If the spacing is larger than L/2, you
> > > can still get useful results, particularly with broad-band
> > > signals: The lower frequencies are not aliased, so you can
> > > use information from the low-frequency bands to resolve
> > > aliasing at higher frequencies.
>
> > Rune, that's interesting. If I read you correctly, you are suggesting
> > to use correlation (if it exists) between low- and high-band to
> > resolve spatial aliasing. Let us assume that low- and high-band of a
> > signal from a desired direction are somehow correlated. How would you
> > proceed to use this correlation to split the high-band into desired
> > signal and undesired noise (i.e., input from aliased beams)?
>
> You can't separate out noise coming from an aliased beam by
> this method, so it doesn't work quite as well as a directional
> noise filter as one might have wanted it to. But you can use
> the broad-band property of the signal to resolve directional
> ambiguities at high frequencies, so you can do directional
> processing of broad-band transient signals.
>
> This kind of technique is used all over seismics, where one
> deploys sensors bundled up in cables. Once upon a time there
> were practical limitations to both the density of sensors along
> the cable and to how many cables one could deploy [*], so
> the sampling theorem was almost never satisfied in the spatial
> dimensions.

So the point is that a transient (being broadband) should be recorded
in both low- and high-band (assuming some simple two-band split of the
signal coming from the array). If only a transient in the high-band is
recorded, it can't come from the main beam because there is no
corresponding transient in the low band, right?

But this is indeed the first step towards spatial filtering using high/
low band correlation.

I am thinking of a more general situation though, applied to
microphone arrays for speech processing. There was this guy over at
Uni Aachen (Peter Jax) who claimed he could "reconstruct" the high-
frequency content from heavily lowpassed speech signals (never heard a
demo, though). This means that, at least to some extent, for speech
signals the high- and low-bands are correlated (kind of obvious). The
question is, how to use this correlation for spatial filtering? An easy
first idea would be to use Jax's high-band reconstruction on only the
low-band of the beamformer output signal, use this as the "desired"
input to an adaptive Wiener filter on the high-band signal, then sum
the low-band and the Wiener filter output. I was wondering if research
along this line had already been conducted (perhaps in other fields than
speech processing).

Regards,
Andor
```
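Andor's two-band coincidence test (a transient that shows up only in the high band cannot have come from the unaliased main beam) can be sketched as a toy experiment. This is purely an illustration under assumed parameters (fs = 8 kHz, an FFT brick-wall split at 1 kHz, a unit impulse as the broadband transient), not anything from the thread:

```python
import numpy as np

fs = 8000
n = 1024
t = np.arange(n) / fs

click = np.zeros(n)
click[256] = 1.0                  # broadband transient: flat-spectrum impulse
# narrowband transient: 3 kHz tone burst, no low-frequency energy
burst = np.sin(2 * np.pi * 3000 * t) * (np.abs(t - 0.08) < 0.005)

def band_energies(x, split_bin):
    """Energy below and above a brick-wall split in the rFFT spectrum."""
    X = np.fft.rfft(x)
    return np.sum(np.abs(X[:split_bin])**2), np.sum(np.abs(X[split_bin:])**2)

split = 128                       # 1 kHz for fs = 8 kHz and N = 1024
for name, x in (("click", click), ("3 kHz burst", burst)):
    lo, hi = band_energies(x, split)
    # a genuinely broadband transient deposits energy in the low band too
    print(name, "passes low-band check:", lo > 0.1 * hi)
```

The click passes the check and is consistent with the main beam; the high-band-only burst fails it and, per Rune's point, could equally well have arrived through any of the aliased beams.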
```On 30 Sep, 09:08, Andor <andor.bari...@gmail.com> wrote:
> On 29 Sep., 11:17, Rune Allnor <all...@tele.ntnu.no> wrote:
>
> > On 29 Sep, 10:52, Andor <andor.bari...@gmail.com> wrote:
>
> > > On 28 Sep., 23:00, Rune Allnor <all...@tele.ntnu.no> wrote:
>
> > > > On 28 Sep, 22:41, HardySpicer <gyansor...@gmail.com> wrote:
>
> > > > > I always assumed that sensors for a beamformer had to be spaced at
> > > > > least lambda/2 (half a wavelength) apart else we get spatial aliasing.
>
> > > > If the spacing is smaller than L/2, then there are few if
> > > > any problems with spatial aliasing, but the array might
> > > > be very expensive. If the spacing is larger than L/2, you
> > > > can still get useful results, particularly with broad-band
> > > > signals: The lower frequencies are not aliased, so you can
> > > > use information from the low-frequency bands to resolve
> > > > aliasing at higher frequencies.
>
> > > Rune, that's interesting. If I read you correctly, you are suggesting
> > > to use correlation (if it exists) between low- and high-band to
> > > resolve spatial aliasing. Let us assume that low- and high-band of a
> > > signal from a desired direction are somehow correlated. How would you
> > > proceed to use this correlation to split the high-band into desired
> > > signal and undesired noise (i.e., input from aliased beams)?
>
> > You can't separate out noise coming from an aliased beam by
> > this method, so it doesn't work quite as well as a directional
> > noise filter as one might have wanted it to. But you can use
> > the broad-band property of the signal to resolve directional
> > ambiguities at high frequencies, so you can do directional
> > processing of broad-band transient signals.
>
> > This kind of technique is used all over seismics, where one
> > deploys sensors bundled up in cables. Once upon a time there
> > were practical limitations to both the density of sensors along
> > the cable and to how many cables one could deploy [*], so
> > the sampling theorem was almost never satisfied in the spatial
> > dimensions.
>
> So the point is that a transient (being broadband) should be recorded
> in both low- and high-band (assuming some simple two-band split of the
> signal coming from the array). If only a transient in the high-band is
> recorded, it can't come from the main beam because there is no
> corresponding transient in the low band, right?

That's one of the possibilities. The other is that it could be a
narrower-band transient, arriving through any of the aliased beams,
that never contained any low-frequency energy.

> But this is indeed the first step towards spatial filtering using high/
> low band correlation.

It's a first step, but far from perfect.

> I am thinking of a more general situation though, applied to
> microphone arrays for speech processing. There was this guy over at
> Uni Aachen (Peter Jax) who claimed he could "reconstruct" the high-
> frequency content from heavily lowpassed speech signals (never heard a
> demo, though). This means that, at least to some extent, for speech
> signals the high- and low-bands are correlated (kind of obvious). The
> question is, how to use this correlation for spatial filtering? An easy
> first idea would be to use Jax's high-band reconstruction on only the
> low-band of the beamformer output signal, use this as the "desired"
> input to an adaptive Wiener filter on the high-band signal, then sum
> the low-band and the Wiener filter output. I was wondering if research
> along this line had already been conducted (perhaps in other fields than
> speech processing).

I don't know. For the application you describe, keep in mind
that a heavily aliased receiver pattern can, at least in principle,
be fine-tuned such that only a few beams point in directions
where human speakers can possibly be located.

If the array is mounted high on a wall, then one can disregard
strong-energy aliases from the ceiling. If it is hanging near a
corner, one can disregard aliases coming from the adjacent wall.
And so on.

As always, consider as many details of the total problem
statement as possible.

Rune
```