
Upsampling and Interpolation for TDOA Correlation

Started by DonkeyKong April 26, 2006
Hi y'all,

I'm working on developing a TDOA system with a desired time resolution of
1ns (~1ft).  However, the hardware I'm using restricts me to 1-2MSps for
each signal being correlated.  Typical correlation signals are in the
430MHz band and occupy a bandwidth in the ballpark of 128kHz.  The problem
I've encountered is that although I can simulate and successfully
correlate delayed signals when I construct them at 1GSps resolution in
Matlab, when I construct 1MSps signals and attempt to upsample and
interpolate before correlation, the signals fail to correlate to the
correct time difference.  E.g. signals simulated at 1GSps and offset by
5ns will have a correlation peak which corresponds to 5 samples, but
signals simulated at 1MSps and offset by 5ns, when upsampled and
interpolated to 1GSps, correlate to a peak, but that peak doesn't
correspond to 5 samples (5ns) of separation.  Does the peak get moved by
the process of upsampling/interpolation, and if so is it possible to
convert it back into a meaningful time delay?

This seems like a simple enough reconstruction problem since my sampling
hardware isn't coherent with the data and so I should be able to correlate
on the TDOA of the data (shouldn't I?), but once I interpolate this just
isn't happening.

I guess the questions I have are: 1) Is it actually working and do I just
need to do some more math to dig the time delay out of the upsampled
correlation?  2) If not, is there a way to perform interpolation that will
preserve these time differences for TDOA purposes?

Thanks for your help,

-DK
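Roughly, the processing chain in question looks like this (sketched in Python/NumPy/SciPy rather than Matlab; the 8x upsampling factor, the 3/8-sample delay, and the filtered-noise test signal are scaled-down demo values, not the real system). With identical upsampling on both channels, the correlation peak lands where the delay says it should:

```python
import numpy as np
from scipy.signal import firwin, lfilter, resample_poly, correlate

rng = np.random.default_rng(0)
fs = 1_000_000          # 1 MSps capture rate
up = 8                  # upsampling factor, scaled down from 1024x for the demo
n = 4096

# Band-limited "signal of interest" (~128 kHz wide): lowpass-filtered noise.
x = lfilter(firwin(129, 128e3 / (fs / 2)), 1.0, rng.standard_normal(n))

# Apply a known sub-sample delay of 3/8 sample (375 ns at 1 MSps) in the
# frequency domain -- exact for a band-limited signal.
d = 3 / 8
f = np.fft.rfftfreq(n)  # cycles per sample
y = np.fft.irfft(np.fft.rfft(x) * np.exp(-2j * np.pi * f * d), n)

# Upsample BOTH channels identically, then cross-correlate.
xu = resample_poly(x, up, 1)
yu = resample_poly(y, up, 1)
c = correlate(yu, xu, mode="full")
lag = np.argmax(c) - (len(xu) - 1)
print(lag)              # 3/8 sample at 1 MSps = 3 samples at the 8x rate
```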


DonkeyKong wrote:
You aim at an "effective" sampling rate of 1 GHz, but are restricted to a practical sampling rate of 1 MHz? To get past that, you try to interpolate the data you actually get. I don't see any reason why this should work. "Interpolation" only means that you impose your own "prejudice" on the data; you don't get anything out of it that was not in the original data.
> This seems like a simple enough reconstruction problem since my sampling > hardware isn't coherent with the data and so I should be able to correlate > on the TDOA of the data (shouldn't I?), but once I interpolate this just > isn't happening.
These are two different questions: estimating fractional-sample time delays from the correlation, versus interpolating the data to find the fractional-sample time delay. There is one trick you could try, based on analyzing the phase of the cross spectrum between your two signals. I did get a crude version to work for fractional-sample time delays, but you are aiming for 1/1000th of a sample, so...

Rune
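The cross-spectrum trick mentioned here can be sketched as follows (Python/NumPy; the 0.37-sample delay and band edges are made-up demo values). For two signals offset by d samples, the cross-spectrum phase is a straight line of slope -2*pi*d over the occupied band, so a least-squares line fit recovers d with sub-sample resolution:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4096

# Band-limit the test signal so the in-band phase slope is well defined.
x = rng.standard_normal(n)
X = np.fft.rfft(x)
X[int(0.2 * len(X)):] = 0.0
x = np.fft.irfft(X, n)

# Impose a known fractional delay of 0.37 samples in the frequency domain.
d_true = 0.37
f = np.fft.rfftfreq(n)                  # cycles per sample
y = np.fft.irfft(np.fft.rfft(x) * np.exp(-2j * np.pi * f * d_true), n)

# Cross spectrum: phase(f) = -2*pi*f*d, so fit a line over the occupied band.
S = np.fft.rfft(x).conj() * np.fft.rfft(y)
band = slice(1, int(0.18 * len(S)))     # stay safely inside the occupied band
phase = np.unwrap(np.angle(S[band]))
d_est = -np.polyfit(f[band], phase, 1)[0] / (2 * np.pi)
print(d_est)                            # recovers 0.37 (noiseless case)
```

With noise the fit degrades gracefully, since the line is estimated from many bins rather than a single correlation peak.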
Perhaps I didn't explain myself especially well.  What I'm wondering is
according to the sampling theorem, shouldn't I be able to perfectly
reconstruct a signal sampled above its Nyquist rate (as the 128kHz
bandlimited data is)?  If I can reconstruct that, and the data arrived a few
nanoseconds later at one receiver than at another, shouldn't I--since I
can reconstruct the signal perfectly--be able to interpolate the data so
as to realize the couple-of-nanoseconds delay between the two arrivals? 
If so then how?  If not, then what's the flaw in my reasoning?

Thanks

DK
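The reconstruction reasoning is sound in principle; a Whittaker-Shannon (sinc) interpolation sketch in Python/NumPy (the 100 kHz tone and 5 ns query offset are arbitrary demo values) shows that samples taken above the Nyquist rate do pin down the waveform between samples, up to truncation error from the finite record:

```python
import numpy as np

# Whittaker-Shannon reconstruction: samples of a band-limited signal taken
# above its Nyquist rate determine its value at ANY instant, including
# between samples.
fs = 1_000_000                      # 1 MSps
n = 200
t = np.arange(n) / fs
f0 = 100e3                          # 100 kHz tone, well under Nyquist
x = np.cos(2 * np.pi * f0 * t)      # the samples we actually have

def sinc_interp(samples, fs, t_query):
    """Evaluate the reconstructed signal at arbitrary times t_query."""
    k = np.arange(len(samples))
    return np.array([np.sum(samples * np.sinc(fs * tq - k))
                     for tq in np.atleast_1d(t_query)])

# Query 5 ns off the sample grid -- far finer than the 1 us sample spacing.
tq = 100 / fs + 5e-9
est = sinc_interp(x, fs, tq)[0]
true = np.cos(2 * np.pi * f0 * tq)
print(abs(est - true))              # small; limited only by sinc truncation
```

The practical catch is that any finite-length, causal interpolation filter only approximates the ideal sinc, and adds its own group delay, which is where the thread goes next.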

"DonkeyKong" <doogiekoch@yahoo.com> wrote in message 
news:SdadnXmgma5SUdLZnZ2dneKdnZydnZ2d@giganews.com...
Actually you can interpolate to a 1000th of a sample - I have done this in a working product. The difficulty and limit stems from noise, which will make your correlation peak not so sharp. In fact you can end up with an almost flat top with some little bumps on it, and you won't know which one is the real peak. Also, I correlated the data 1st and then interpolated to find the peak's location.

Clay
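The correlate-first-then-interpolate approach described here can be sketched like this (Python/NumPy, noiseless, with a made-up 2.3-sample delay): correlate at the original rate, then fit a parabola through the three lags around the maximum and take its vertex as the fractional-sample peak location:

```python
import numpy as np
from scipy.signal import correlate, firwin, lfilter

rng = np.random.default_rng(2)
n = 8192
# Band-limited test signal: lowpass-filtered noise.
x = lfilter(firwin(129, 0.25), 1.0, rng.standard_normal(n))

# Known delay of 2.3 samples, applied exactly in the frequency domain.
d_true = 2.3
f = np.fft.rfftfreq(n)
y = np.fft.irfft(np.fft.rfft(x) * np.exp(-2j * np.pi * f * d_true), n)

# Correlate first, at the ORIGINAL rate...
c = correlate(y, x, mode="full")
k = np.argmax(c)

# ...then interpolate near the peak: the vertex of a parabola through the
# three lags around the maximum gives the sub-sample offset.
delta = 0.5 * (c[k - 1] - c[k + 1]) / (c[k - 1] - 2 * c[k] + c[k + 1])
d_est = (k - (n - 1)) + delta
print(d_est)    # close to 2.3 (the parabolic fit carries a small bias)
```

This avoids upsampling the raw data at all; only three correlation values near the peak are needed.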
I'm prepared to deal with the noise issue. The problem I'm having right now is that when I correlate simulations of noiseless signals, the peak I find from the correlation doesn't seem to correspond to any meaningful time delay. Is there some offset or additional delay that arises from the upsampling/interpolation? Identical filters are being used on both signals, but the peaks I'm finding--which are still pretty pointy--just aren't close to where I expect them to be.

--DK
"DonkeyKong" <doogiekoch@yahoo.com> wrote in message 
news:l-6dnW2DDfflRtLZnZ2dnUVZ_vadnZ2d@giganews.com...
I guess the 1st question I have is how are you doing the upsampling? If you stuff zeroes in-between your samples and then lowpass filter, you will likely see the delay of the filter added in.

Clay
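The filter-delay point can be demonstrated directly (Python/SciPy sketch; the 4x factor, tap counts, and test signal are arbitrary demo values, not the poster's chain): zero-stuff, lowpass with a linear-phase FIR, and the output is the interpolated input delayed by exactly (ntaps-1)/2 output samples:

```python
import numpy as np
from scipy.signal import firwin, lfilter

rng = np.random.default_rng(3)
L = 4                              # upsampling factor for the demo
ntaps = 129                        # linear-phase FIR -> exactly 64 samples delay

# Band-limited input at the original rate.
x = lfilter(firwin(65, 0.2), 1.0, rng.standard_normal(1024))

up = np.zeros(len(x) * L)
up[::L] = x                        # zero-stuffing
h = L * firwin(ntaps, 1.0 / L)     # anti-imaging lowpass (gain L restores level)
y = lfilter(h, 1.0, up)

# The interpolated stream is the input delayed by (ntaps-1)/2 output samples:
# y[k*L + 64] should closely track x[k] (up to filter ripple).
gd = (ntaps - 1) // 2
k = np.arange(100, 200)            # skip the filter start-up transient
err = np.max(np.abs(y[k * L + gd] - x[k]))
print(err)
```

If the same chain is applied to both channels, this common delay cancels in the cross-correlation; it only biases the answer when the two channels are processed differently or when the delay is compared against an unprocessed reference.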
I'm upsampling by cascading 2x upsamples with 128th-order equiripple FIRs to lowpass/interpolate until I achieve 1024x upsampling. If this is done identically on both channels, shouldn't the correlation still just have the delay due to the arrival difference? I can see where flattening out would come into play, but when I simulate this in Matlab I'm not getting correlations at all. Is it possible that things will work better if I stop trying to simulate signals and correlate them, and I just record some actual data and correlate it?

Thanks for all your help.

DK
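For reference, the cumulative delay of the cascade described (ten 2x stages, each with a 129-tap FIR, assuming the 128th-order equiripple filters are symmetric and hence linear-phase) works out as below. If both channels really see identical chains, this common delay cancels in the cross-correlation, so a residual offset suggests the chains are not actually identical, or that start-up transients or trimmed samples differ between them:

```python
# Each 129-tap linear-phase FIR delays its own stream by (129-1)/2 = 64
# samples AT ITS OWN RATE. Referred to the final 1024x rate, stage i's
# delay is 64 * 2**(10 - i) samples, and the stage delays simply add.
stages = 10                        # 2**10 = 1024x total upsampling
tap_delay = 64                     # (129 - 1) / 2 samples per stage

total = sum(tap_delay * 2 ** (stages - i) for i in range(1, stages + 1))
print(total)    # 65472 samples at 1 GSps, i.e. ~65.5 us (~64 original samples)
```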