On 24 June, 15:59, Rune Allnor <all...@tele.ntnu.no> wrote:
> I have no idea. I have always worked with baseband data.
> Down-mixing to baseband before beamforming would represent
> a significant computational load, as each sensor would have
> to be treated separately.
That's what massive parallelism, made possible by modern-day beefy
FPGAs, is there for.
> So it seems you might be located between a rock and a hard
> place.
I am.
> On the positive side, the down-mixing problem is *only*
> one of economy: It can be solved by adding mixing steps
> at each sensor. In contrast, the phase sensitivity issue
> is one of physics. Not easy (maybe even impossible) to
> beat.
Indeed.
Many thanks,
-M
Reply by Rune Allnor ● June 24, 2009
On 24 Jun, 16:40, Manny <mlou...@hotmail.com> wrote:
> On Jun 24, 12:58 pm, Rune Allnor <all...@tele.ntnu.no> wrote:
> > On 24 Jun, 13:24, Rune Allnor <all...@tele.ntnu.no> wrote:
>
> > > The half-wavelength constraint originates in the academic
> > > case of a monochromatic narrow-band sinusoid. I have done
> > > array processing where the sensor separation was about lambda/10.
>
> > Bah! It's 21C today; far too hot for my mind to work.
> > What I meant to say was that the processing I did
> > worked with sensor separations of 3-5 lambda.
>
> > Rune
>
> To put things in less ambiguous language, I meant to say that I can
> steer the envelope of my CDMA signal, i.e. the individual chips at
> baseband, as opposed to the raw bandpass signal. This way even the
> stringent lambda/2 requirement can be met. Does this make sense?
I have no idea. I have always worked with baseband data.
Down-mixing to baseband before beamforming would represent
a significant computational load, as each sensor would have
to be treated separately.
On the other hand, doing the beamforming at higher frequencies
may or may not introduce problems with sensitivity to phase.
You already know about the spatial alias issue.
So it seems you might be located between a rock and a hard
place.
On the positive side, the down-mixing problem is *only*
one of economy: It can be solved by adding mixing steps
at each sensor. In contrast, the phase sensitivity issue
is one of physics. Not easy (maybe even impossible) to
beat.
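To make the per-sensor cost concrete, here is a minimal sketch of the complex down-mix each channel would need before baseband beamforming; the carrier and sample rate below are purely illustrative, not from the system under discussion:

```python
import numpy as np

def downmix(x, fs, fc):
    """Shift one sensor's real bandpass samples to complex baseband.
    A low-pass filter rejecting the image at 2*fc would follow."""
    n = np.arange(len(x))
    return x * np.exp(-2j * np.pi * fc * n / fs)

# Every sensor channel must be mixed separately -- this is the
# per-channel load being discussed. Values below are illustrative.
fs, fc = 1_000_000, 200_000
phases = (0.0, 0.3)  # per-sensor carrier phases
channels = [np.cos(2 * np.pi * fc * np.arange(1024) / fs + p) for p in phases]
baseband = [downmix(ch, fs, fc) for ch in channels]
```

After low-pass filtering, each sensor's phase survives as the angle of its complex baseband samples, which is what a baseband beamformer then operates on.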
Rune
Reply by Manny ● June 24, 2009
On Jun 24, 12:58 pm, Rune Allnor <all...@tele.ntnu.no> wrote:
> On 24 Jun, 13:24, Rune Allnor <all...@tele.ntnu.no> wrote:
>
> > The half-wavelength constraint originates in the academic
> > case of a monochromatic narrow-band sinusoid. I have done
> > array processing where the sensor separation was about lambda/10.
>
> Bah! It's 21C today; far too hot for my mind to work.
> What I meant to say was that the processing I did
> worked with sensor separations of 3-5 lambda.
>
> Rune
To put things in less ambiguous language, I meant to say that I can
steer the envelope of my CDMA signal, i.e. the individual chips at
baseband, as opposed to the raw bandpass signal. This way even the
stringent lambda/2 requirement can be met. Does this make sense? I also know
that carefully spaced elements with greater than lambda separation can
work well if one knows what one's doing. In our application,
compactness is also a highly desirable thing.
Many thanks for your thoughts Rune,
-M
Reply by Rune Allnor ● June 24, 2009
On 24 Jun, 13:24, Rune Allnor <all...@tele.ntnu.no> wrote:
> The half-wavelength constraint originates in the academic
> case of a monochromatic narrow-band sinusoid. I have done
> array processing where the sensor separation was about lambda/10.
Bah! It's 21C today; far too hot for my mind to work.
What I meant to say was that the processing I did
worked with sensor separations of 3-5 lambda.
Rune
Reply by Rune Allnor ● June 24, 2009
On 24 Jun, 11:08, Manny <mlou...@hotmail.com> wrote:
> --------------------
> DISCLAIMER:
> Content may contain rubbish intellect/utter stupidity which may cause
> offence or arouse contempt. Reader discretion is advised.
> --------------------
> Hello,
>
> I've started thinking about the application of beamforming to the
> direction finding problem in the context of an acoustic system I'm
> building. It is a DSSS ultrasonic system to which I would like to add
> angular resolution capability, albeit a coarse one (no fancy comms
> diversity or DOA subspace techniques or anything of the sort).
>
> I know from basic array processing theory that in the general case the
> separation in a broadband ULA should at most be half the shortest
> wavelength present in the signal. However, since we here know a priori
> the structure of the CDMA signal, my question is: can this be relaxed?
The half-wavelength constraint originates in the academic
case of a monochromatic narrow-band sinusoid. I have done
array processing where the sensor separation was about lambda/10.
This worked, since the signal itself was broad-band (DC to 10/lambda),
and thus I could unwrap the spatial aliasing across the whole
bandwidth.
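A toy illustration of why bandwidth helps, with assumed numbers (sound in air, 5 cm spacing, two sensors): at any single frequency the inter-sensor phase is only known modulo 2*pi, so a wide spacing aliases; but for a broadband signal the inter-sensor *time delay* can be recovered directly, e.g. from the cross-correlation peak, with no 2*pi ambiguity:

```python
import numpy as np

rng = np.random.default_rng(0)
c, fs = 343.0, 192_000            # sound speed in air (m/s), sample rate (Hz)
d = 0.05                          # 5 cm spacing: many wavelengths at ultrasound
theta = np.deg2rad(25.0)          # true arrival angle
true_delay = d * np.sin(theta) / c

# Broadband test signal; the second sensor sees a delayed copy
# (fractional delay applied in the frequency domain).
N = 4096
s = rng.standard_normal(N)
f = np.fft.rfftfreq(N, 1 / fs)
x0 = s
x1 = np.fft.irfft(np.fft.rfft(s) * np.exp(-2j * np.pi * f * true_delay), N)

# At one frequency the phase 2*pi*f*delay is only known mod 2*pi (aliased),
# but the broadband cross-correlation peak gives the delay directly.
xc = np.correlate(x1, x0, mode="full")
est_delay = (np.argmax(xc) - (N - 1)) / fs
est_theta = np.degrees(np.arcsin(est_delay * c / d))
```

The estimate is quantized to the nearest sample here; whether the same trick works for a given system depends, as noted above, on the actual signal bandwidth.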
Whether similar ideas work in your case, depends totally
on the nature of your signals.
As for ultrasound applications, keep in mind that you have
two directivities to keep track of:
- The directivity of each individual transducer element
- The directivity of the array of transducer elements.
Do note that the omnidirectional approximation of element
directivity is only valid if the wavelength is much larger
than the physical dimensions of the sensor elements. This
might not be the case with your system. If the wavelength
is similar to, or smaller than, the physical size of
individual sensor elements, including the element directivity
in the analysis might go a long way toward resolving spatial aliases.
Rune
Reply by Manny ● June 24, 2009
--------------------
DISCLAIMER:
Content may contain rubbish intellect/utter stupidity which may cause
offence or arouse contempt. Reader discretion is advised.
--------------------
Hello,
I've started thinking about the application of beamforming to the
direction finding problem in the context of an acoustic system I'm
building. It is a DSSS ultrasonic system to which I would like to add
angular resolution capability, albeit a coarse one (no fancy comms
diversity or DOA subspace techniques or anything of the sort).
I know from basic array processing theory that in the general case the
separation in a broadband ULA should at most be half the shortest
wavelength present in the signal. However, since we here know a priori
the structure of the CDMA signal, my question is: can this be relaxed?
To put things numerically: instead of keying the separation to the 70 kHz
fastest frequency, can I make it half a chip at 20 kHz? In other words, I'm
thinking that spatial aliasing here applies to the chips (within a code)
regardless of the higher harmonics present in the code edges.
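For concreteness, assuming propagation in air at roughly 343 m/s, the two spacing criteria work out as follows (the relaxed figure of course only applies if the chip-level argument holds):

```python
# Back-of-envelope ULA spacing, assuming sound in air at c ~ 343 m/s.
c = 343.0

half_lam_70k = c / 70e3 / 2   # lambda/2 at the 70 kHz fastest frequency
half_lam_20k = c / 20e3 / 2   # lambda/2 at the 20 kHz chip rate

print(f"lambda/2 @ 70 kHz: {half_lam_70k * 1e3:.2f} mm")  # about 2.45 mm
print(f"lambda/2 @ 20 kHz: {half_lam_20k * 1e3:.2f} mm")  # about 8.6 mm
```

So the relaxation, if valid, buys a 3.5x larger pitch, which also eases the element-width problem with the piezo films.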
The main problem I have is the transducer itself: the piezo films
we're using lose a lot of sensitivity with decreasing width, and as a
result even the array gain won't make up for the loss.
The beamformer I have in mind is a basic low-resolution time-domain
realization, and detection is performed afterwards with a multi-user
Rake receiver, all to be done in real time on embedded hardware.
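A basic time-domain realization of this kind is delay-and-sum with integer-sample steering delays; a minimal sketch, with the array geometry and rates as placeholders rather than the actual system parameters:

```python
import numpy as np

def delay_and_sum(channels, fs, d, c, theta):
    """Steer a uniform linear array toward angle theta (radians).

    channels : (num_sensors, num_samples) array, sensor 0 first
    fs : sample rate (Hz); d : element pitch (m); c : wave speed (m/s)
    """
    M, N = channels.shape
    out = np.zeros(N)
    for m in range(M):
        # A plane wave from theta reaches sensor m later by m*d*sin(theta)/c;
        # advance that channel by the same amount (nearest sample) to align.
        k = int(round(m * d * np.sin(theta) / c * fs))
        out += np.roll(channels[m], -k)
    return out / M
```

Fractional-sample steering (interpolation or upsampling) becomes necessary once the pitch-to-sample-rate ratio makes integer delays too coarse; the output would then feed the Rake receiver.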
Would be immensely grateful for any comments on this.
-M