Estimating correct sampling rate for 802.11-like signal (for timing adjustment)

Started by robkjn99 · 5 years ago · 7 replies · latest reply 5 years ago · 176 views

Hi,

I am sorry if my question looks trivial to you, but I have several books and papers scattered across my desk and I still cannot find a good answer to it (possibly I just don't know how to search for it...).

I have created a protocol with a preamble inspired by 802.11 (that is, with 10 Short Training symbols used for the initial synchronization), and from it I am able both to detect the frame start and to get a coarse CFO correction. However, I'd like to compensate for (possibly large) mismatches in the TX/RX sampling rates. I have the luxury of an RX sampling rate that can go up to 2X the TX one, and in the past I have successfully used a polyphase filter to compensate this kind of mismatch (without the 2X sampling rate, but with a stream protocol, so I just had a PID controller looking at correlations, filtering them over time, and slowly adjusting a resampler).

My current protocol operates on bursts, and I am scratching my head trying to figure out how to get the most appropriate target sampling rate to which my samples have to be resampled. I have just these 10x16 = 160 samples of a known pattern to work with (and most implementations assume that you lose half of them to the AGC settling), which seems to me a very short interval from which to get a precise estimate of the sampling-rate discrepancy. Since I am designing the protocol, I can modify it if there's a very strong reason to (but, since they get away with it in 802.11, this shouldn't be necessary...).

Can anyone point me towards a good algorithm for doing this, please? For the moment I am operating on a PC (I am using SDRs) and on recorded files, so I have no issues with the computational cost of the operation (though, in the future, I'll have to move to an FPGA implementation).

I have considered doing a binary search over resampled versions of the short training sequence (with a resampling factor in [0.4, 0.6], for instance), matching each against the received signal to pick the "best" factor given the data (a rough sketch of what I mean follows the list below). However:

  • I strongly suspect that this approach is quite noise-sensitive
  • I am not sure I can avoid local minima
  • In any case, any FPGA engineer would scream in horror if I were to propose it...
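For reference, here is a minimal sketch of that brute-force search (Python/NumPy, since for now I'm just post-processing recorded files). Here `rx` would be the received burst at the RX rate, `sts` the known short training sequence at the TX rate, and the [0.4, 0.6] grid is the range mentioned above; all names and the interpolation method are just illustrative.

    import numpy as np

    def resample_linear(x, factor):
        """Resample complex x by `factor` using linear interpolation (illustration only)."""
        x = np.asarray(x)
        n_out = int(len(x) * factor)
        t_out = np.arange(n_out) / factor
        t_in = np.arange(len(x))
        return np.interp(t_out, t_in, x.real) + 1j * np.interp(t_out, t_in, x.imag)

    def best_resampling_factor(rx, sts, factors):
        """Return the candidate factor whose stretched STS correlates best with rx."""
        best_f, best_peak = None, -np.inf
        for f in factors:
            ref = resample_linear(sts, 1.0 / f)   # bring the known pattern to the RX rate
            ref /= np.linalg.norm(ref)
            peak = np.max(np.abs(np.correlate(rx, ref, mode="valid")))
            if peak > best_peak:
                best_f, best_peak = f, peak
        return best_f

    # factors = np.linspace(0.4, 0.6, 101)
    # f_hat = best_resampling_factor(rx, sts, factors)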

Thanks a lot!

Rob

Reply by Slartibartfast · October 1, 2019

I think what you really want for timing synchronization is estimation and correction of phase offset, not frequency. As mentioned, usually with a burst protocol the clock frequency offsets are not enough to worry about over the duration of a burst, but the initial phase offset must be corrected so that the symbols are sampled to provide minimum intersymbol interference and maximum signal.

The design of the preamble for most protocols, including 802.11, facilitates this offset estimation.   There are methods that can trade off computational complexity for sampling rate, but an easy approach is to oversample by about 4x and pick the closest offset sample (via correlators or some comparable method against the preamble).  As you suggest, if I understood correctly, you can also use a lower sample rate, e.g., 2x oversampled, then estimate the fractional correction between samples and jump to that sample point with a phase adjustment in a polyphase filter.   There are a number of techniques that have been used successfully in deployments, but somehow you need to estimate and remove the initial sampling phase offset.
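If it helps, here is a minimal sketch of the "oversample and pick the nearest offset sample" idea in Python/NumPy. It assumes `rx4` is the received preamble at 4x the symbol rate and `ref4` is the known preamble at the same 4x rate; the names are illustrative, not from any particular implementation.

    import numpy as np

    def pick_timing_offset(rx4, ref4):
        """Matched-filter against the known preamble; the peak marks the best sampling instant."""
        ref = ref4 / np.linalg.norm(ref4)
        corr = np.abs(np.correlate(rx4, ref, mode="valid"))
        return int(np.argmax(corr))   # then decimate from here, e.g. symbols = rx4[peak::4]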

Estimating and correcting frequency offset is a separate issue, but can sometimes be done jointly with timing offset if the preamble and algorithms are designed and used wisely.
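As one example of doing them jointly, here is a rough delay-and-correlate sketch (in the spirit of Schmidl & Cox) over the repeated short training symbols. It assumes `rx` is complex baseband at sample rate `fs` with an STS period of 16 samples, and is only meant to illustrate the idea, not to be a finished synchronizer.

    import numpy as np

    def sts_coarse_sync(rx, fs, period=16):
        """Coarse frame start from the repetition metric, coarse CFO from its phase."""
        prod = rx[:-period] * np.conj(rx[period:])            # compare each sample to one period later
        c = np.convolve(prod, np.ones(period), mode="valid")  # sliding correlation sum
        p = np.convolve(np.abs(rx[period:]) ** 2, np.ones(period), mode="valid")
        metric = np.abs(c) ** 2 / np.maximum(p, 1e-12) ** 2
        start = int(np.argmax(metric))                        # coarse start (a plateau in practice)
        cfo_hz = -np.angle(c[start]) * fs / (2 * np.pi * period)
        return start, cfo_hz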

Reply by robkjn99 · October 1, 2019

I had a quick glance back at my code and I definitely have a bug hidden somewhere in the synchronizer... I'll reimplement it with 2X oversampling and try to find where the bug is! (My guess is that I am not aligning properly with the FFT window... also because I do see weird effects here and there... :S)

Thanks for your suggestions! :)

Reply by dgshaw6 · October 1, 2019

A question related to your estimation process.

What is the worst case difference between the TX and the RX clock in PPM?

Remember that you have an uncertainty problem between accurate frequency estimation and the amount of time-domain data you can supply. So obtaining an accurate timing offset estimate from a very short piece of data is difficult, if not impossible.

If your PPM value is fairly small, say < 100, and you are using bursty traffic, then you don't need to compensate for the timing frequency offset, but perhaps only for the initial phase. This is because the timing phase change over a typical packet burst will be small enough that your receiver will survive it.

Do you have the capacity to store the entire burst for processing?

If so, then you can choose a sampling phase that is correct for the middle of the burst, and then your worst case offset is only half the total phase drift that would occur for the entire burst.

Just some dumb thoughts.

David

Reply by robkjn99 · October 1, 2019

Hello! Thanks for your answer!

I am in the 20 PPM range, so well below your threshold, but I still have a lot of noise in the constellation even in trivial situations (a 30 cm cable). This is despite having implemented all the corrections suggested in the paper "Frequency Offset Estimation and Correction in the 802.11a WLAN" (but not yet the 2X oversampling). In simulation everything works perfectly: I can add as much noise and distortion as I want with GNU Radio's channel model and still get a perfect constellation. That's why I assumed the issue was linked to the sampling rate and started looking for a way to compensate for the mismatch...

Reply by dgshaw6 · October 1, 2019

20 ppm is even better, then, and I reinforce my previous statement. If you think about it, at 20 ppm it takes 50,000 samples to accumulate a one-sample shift in timing. I presume that your signal is OFDM, if you are modulating similarly to 802.11. If so, then your most serious problem is most likely the start-of-FFT sync derived from the short training, and not the timing offset.
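Just to make the arithmetic concrete (the 20 ppm is yours; the burst length is a number I'm making up for illustration):

    ppm = 20e-6
    samples_per_one_sample_slip = 1 / ppm   # 50,000 samples, as stated above
    burst_len = 4000                        # hypothetical burst length in samples
    drift = burst_len * ppm                 # ~0.08 samples of slip across the whole burst
    print(samples_per_one_sample_slip, drift)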

Reply by robkjn99 · October 1, 2019

Yes, it's an OFDM signal!

OK, then I'll have to debug my synchronizer a bit more... :S

Thanks a lot for your help!! :)

Reply by robkjn99 · October 1, 2019

It turned out to be a mistake in the positioning of the OFDM window (the time synchronization on the short training symbols was wrong due to a bad threshold)...

Thanks for all the valuable suggestions!

Rob