Hello all,
I have captured OFDM I-Q data at IF. I have down-converted it to baseband and then simply performed autocorrelation. I was trying to find a clear peak that would show me the OFDM useful symbol duration. Although I see the peak where it should be due to the cyclic prefix, I also find other peaks and harmonic-like peaks. I was wondering:
1) What could be the reason for such peaks?
2) How can I suppress the other peaks to get my desired peak?
3) How can I identify the harmonics and reject them?
KB
Hi,
can you tell us how you do the autocorrelation of the received OFDM,
and with what?
You take time samples and use a Python or MATLAB built-in function to perform the autocorrelation.
It's autocorrelation, so the signal was correlated with itself.
If I xcorr any vector with a copy of itself I will get a very nice peak.
I wouldn't do that as I don't see any benefit.
If I xcorr a known tx signal with its Rx received copy (with noise...) then it makes some sense.
For OFDM, if you want to check symbol/CP boundaries, you can xcorr your signal with a delayed/advanced version of itself and keep delaying/advancing until the CP region coincides with the end of its OFDM symbol; then you will get nice, regular peaks.
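A minimal sketch of that delay-and-correlate idea in Python/numpy (the useful length n_useful and CP length n_cp here are placeholders; if they are unknown you would sweep candidate values):

import numpy as np

def cp_timing_metric(x, n_useful, n_cp):
    """For each candidate start index d, correlate the n_cp samples at d with
    the samples one useful-symbol length later; if d is the start of a cyclic
    prefix, the two windows are copies of each other and the metric is near 1."""
    n = len(x) - n_useful - n_cp
    metric = np.zeros(n)
    for d in range(n):
        a = x[d:d + n_cp]
        b = x[d + n_useful:d + n_useful + n_cp]
        metric[d] = np.abs(np.vdot(b, a)) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
    return metric

# e.g. metric = cp_timing_metric(x, 2048, 512)  -> regular peaks roughly every
# n_useful + n_cp samples when the guesses match the actual symbol structure.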
The noisy received signal (rx) is completely unknown. The whole objective is to autocorrelate in order to "estimate" the OFDM useful symbol time (Tu). It is producing a peak at t = Tu. However, I also see repeated, harmonic-like peaks.
Output of the autocorr (rough sketch): a large peak at lag 0, a peak at lag Tu, a separate "false peak" further out, and then groups of smaller, regularly spaced "harmonic"-like peaks.
So, there are a few issues:
1. Sometimes the true peak has a higher value than the false peak and I can simply get Tu. But in other cases the false peak has a higher value. Assuming this is due to some anomaly in the signal, how can I suppress the false peaks to get my desired peak? Use over-sampling, more correlation lags, or some other way? FYI, I don't see a significant false peak all the time, only sometimes.
2. What are the reasons for those harmonic-like peaks, and how can I discard them during blind estimation? (A rough sketch of one approach follows this list.)
3. Although the signal is most likely some IEEE standard (Wi-Fi, WiMAX, DVB-T, etc.), it could be anything.
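A minimal sketch of that kind of blind estimate, plus one way to average out data-dependent false peaks (Python/numpy; max_lag, min_lag and seg_len are all made-up parameters, and min_lag is just a guard to skip the main lobe around lag 0):

import numpy as np

def estimate_tu(x, max_lag, min_lag):
    """Blind estimate of the useful symbol length Tu (in samples): the lag of
    the strongest autocorrelation peak outside the main lobe at lag 0."""
    mag = np.abs([np.vdot(x[:-lag], x[lag:]) for lag in range(1, max_lag)])
    mag[:min_lag] = 0.0           # ignore lags near zero
    return np.argmax(mag) + 1     # +1 because lags start at 1

def averaged_lag_profile(x, max_lag, seg_len):
    """Average the lag-magnitude profile over independent segments (seg_len
    must exceed max_lag). Data-dependent false peaks tend to average out,
    while the cyclic-prefix peak at lag Tu is present in every segment."""
    segs = [x[i:i + seg_len] for i in range(0, len(x) - seg_len + 1, seg_len)]
    profiles = [np.abs([np.vdot(s[:-lag], s[lag:]) for lag in range(1, max_lag)])
                for s in segs]
    return np.mean(profiles, axis=0)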
I can imagine that you may have two cases depending on lag.
1) The CP aligns with the symbol end, leading to distinct but low peaks at regular intervals.
2) [CP & symbol] aligns with [CP & symbol], leading to regular peaks, including the maximum, depending on the lag value.
You had better target the maximum peak of case 2 above. As for the false peaks, I can't tell, but they could be due to some repeated data patterns.
[CP & symbol] aligns with [CP & symbol] at lag 0, where we can see a huge peak. The other peak is where the CP aligns with the symbol end; that's where we are going to get Tu. Anyway, thanks.
Are you sure these built-in autocorrelation functions do what you want? I suspect they don't, unless you are being pretty careful about how you feed them data.
Or perhaps I'm misunderstanding what you are trying to achieve. I don't completely follow your original post.
Well, a built-in correlation function applied to the signal and itself performs autocorrelation, right? Like xcorr(x,x).
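Roughly like this in numpy (equivalent of xcorr(x,x)), keeping only the non-negative lags and the magnitude:

import numpy as np

def autocorr_positive_lags(x):
    """Full autocorrelation, then keep lags 0 ... N-1 and normalise by lag 0."""
    r_full = np.correlate(x, x, mode="full")   # lags -(N-1) ... +(N-1)
    r = r_full[len(x) - 1:]                    # lag 0 sits at index N-1
    return np.abs(r) / np.abs(r[0])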
Is it possible that the sequence you are autocorrelating contains a repeating pattern (i.e. something that repeats several times)?
This is pretty common, as it can lead to cheap implementations in hardware. For example, WiFi (802.11g/n/ac) uses this trick. The autocorrelation function then tends to ramp up and down within a triangular envelope.
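As a toy illustration of that effect (nothing specific to your signal): a sequence built from many repeats of a short block produces autocorrelation peaks at every multiple of the block length, inside a triangular envelope.

import numpy as np

rng = np.random.default_rng(0)
block = rng.standard_normal(16) + 1j * rng.standard_normal(16)   # 16-sample block
x = np.tile(block, 10)                                           # repeated 10 times

r = np.correlate(x, x, mode="full")        # lags -(N-1) ... +(N-1)
lags = np.arange(-len(x) + 1, len(x))
big = lags[np.abs(r) > 0.5 * np.abs(r).max()]
print(big)   # only multiples of 16 show up; their heights taper off linearly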
Could you upload a sketch of what you are seeing? Can you provide more information about your OFDM signal?
Here is the output of the autocorr again (rough sketch): a large peak at lag 0, a peak at lag Tu, a "false peak" further out, and then groups of smaller, regularly spaced "harmonic"-like peaks.
This is a repeating pattern. Why is that? I don't know exactly which one, but it is certainly an IEEE standard that uses OFDM.
If I understand correctly, you expect the "true" autocorrelation peak to correspond to the offset for which all the cyclic prefixes in the capture are aligned with the ends of the OFDM symbols (which the cyclic prefixes are repetitions of).
I guess then that the assumption is that all the other parts of data will be ~uncorrelated, so any other offset will produce ~zero in the autocorrelation function. This would give one nice clean peak where you expect, but I don't think this assumption is guaranteed to be valid. If, for example, the data is the same OFDM symbol repeating indefinitely, then we'll see a gigantic peak at an offset of 1 OFDM symbol period (and "harmonics" at 2, 3, 4, ...). Of course, that's an extreme example, but it illustrates a point.
If that's not the issue, then another question worth asking is whether the OFDM signal is bursty or continuous. And over how much data are you computing your correlation? If the signal is bursty and you're correlating over more than one burst, then (depending on the type of signal) there may be no timing synchronization between different bursts, which would create a big mess.
Also, are you sure you are only using a pure stream of OFDM symbols? (No non-OFDM pilot sequences, or similar, appended). And is your signal environment fairly "clean"? I guess severe channel effects could do bad things to your autocorrelation.
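To make the repeated-symbol point concrete, here is a toy simulation (all sizes made up, random QPSK subcarriers): a stream of independent CP-OFDM symbols gives the expected cyclic-prefix peak at lag N, while repeating the same symbol makes a much larger peak appear at lag N + Ncp (and at its multiples).

import numpy as np

rng = np.random.default_rng(1)
N, Ncp, num_sym = 256, 32, 20            # made-up sizes

def ofdm_symbol():
    # random QPSK on all subcarriers, IFFT, then prepend the cyclic prefix
    qpsk = ((rng.integers(0, 2, N) * 2 - 1) + 1j * (rng.integers(0, 2, N) * 2 - 1)) / np.sqrt(2)
    sym = np.fft.ifft(qpsk)
    return np.concatenate([sym[-Ncp:], sym])

x1 = np.concatenate([ofdm_symbol() for _ in range(num_sym)])   # independent symbols
x2 = np.tile(ofdm_symbol(), num_sym)                           # one symbol repeated

def lag_profile(x, max_lag):
    return np.array([np.abs(np.vdot(x[:-lag], x[lag:])) for lag in range(1, max_lag)])

print(np.argmax(lag_profile(x1, 2 * (N + Ncp))) + 1)   # ~N: cyclic-prefix peak only
print(np.argmax(lag_profile(x2, 2 * (N + Ncp))) + 1)   # ~N + Ncp: whole-symbol repetition dominates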
Thanks for your feedback. This is hugely helpful.
You have mentioned the OFDM symbol repeating, and that this leads to cheap implementations in hardware (e.g., WiFi (802.11g/n/ac) uses this trick). Could you please tell me more about the rationale behind this repeating structure? Or could you point me to some references about this design? Thanks.
Two interesting observations:
1. In the autocorr plot, the interval from 0 to Tu gives the duration of the useful data in OFDM (this is consistent with the literature I have found).
2. In the autocorr plot, the interval from 0 to the 1st harmonic is four times the OFDM symbol duration (Tu + Tcp). Now it seems like the repetition you mention does not ALWAYS start immediately after one OFDM symbol; it waits a few symbol durations more. Is that how you would interpret it? Or is there another interpretation?
The sequences don't repeat indefinitely in WiFi, or there would be no way to transfer any information. WiFi is a bursty signal, so some useful "preamble" is prepended to each burst to help the receiver to synchronize. This preamble is defined in the WiFi standard, so it doesn't carry any user information - it's just there to simplify the receiver design.
One part of this preamble is designed to have favourable autocorrelation properties (i.e. if you correlate the received signal with a delayed version of itself, it produces a strong peak at the correct alignment and a weak response at small offsets from that), which makes it relatively easy to detect in noise. Autocorrelation is preferred to cross-correlation (with a clean copy of the signal, which is known in advance from the definition in the WiFi standard) because it is significantly cheaper to implement in hardware. It is made cheaper still by repetitions within repetitions, which allow the sharing of hardware multipliers.
Anyway, at the start of every packet, there are 10 repetitions of the same 16 samples. Maybe your signal contains 77 repetitions of the same 4 samples. The point is that the content of the signal will affect the autocorrelation. And even if the payload is random, there can still be highly structured components within the signal.
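If it helps, here is a rough sketch (not the exact metric any particular standard mandates) of the delay-and-correlate detector that exploits such a repetition, written with running sums, which is what makes this kind of detector cheap in hardware (one complex multiply and a few additions per sample):

import numpy as np

def sliding_delay_correlate(x, period, window):
    """Correlate x with itself delayed by `period`, summed over a sliding
    `window` and normalised by the power of the earlier branch. The metric
    sits near 1 wherever the signal repeats every `period` samples."""
    prod = x[period:] * np.conj(x[:-period])        # x[n] * conj(x[n - period])
    power = np.abs(x[:-period]) ** 2
    c = np.concatenate([[0.0], np.cumsum(prod)])    # running sums -> moving sums
    p = np.concatenate([[0.0], np.cumsum(power)])
    corr = c[window:] - c[:-window]
    pwr = p[window:] - p[:-window]
    return np.abs(corr) / (pwr + 1e-12)

# toy usage: a burst whose first 160 samples are 10 repeats of a 16-sample block
rng = np.random.default_rng(2)
blk = rng.standard_normal(16) + 1j * rng.standard_normal(16)
noise = rng.standard_normal(2000) + 1j * rng.standard_normal(2000)
m = sliding_delay_correlate(np.concatenate([np.tile(blk, 10), noise]), period=16, window=64)
print(np.argmax(m))   # lands inside the repeated-preamble region (first ~160 samples)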
Is this true for other standards as well, like DVB-T and WiMAX? The signal I was dealing with is not Wi-Fi but some other standard, like DVB-T or WiMAX.