Sounds promising! Do you have any references, like journals, technical reports, or theses, for this algorithm?

Well, even though I will start by working with a few common IEEE standards, I eventually want to implement practical algorithms for generic pilot location detection/estimation...

I have captured OFDM I-Q data at IF. I down-converted it to baseband and then simply performed autocorrelation. Based on autocorrelation-based techniques...
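To make the discussion concrete, here is a minimal sketch of that step in NumPy. The parameters (64-point FFT, 16-sample cyclic prefix, 10 symbols) are illustrative only, not taken from any particular standard: the point is just that the CP, being a copy of the symbol tail, produces a coherent autocorrelation peak at a lag equal to the useful symbol time Tu.

```python
import numpy as np

# Illustrative parameters (not from any standard): 64-point FFT,
# 16-sample cyclic prefix, 10 OFDM symbols.
N_FFT, N_CP, N_SYM = 64, 16, 10

rng = np.random.default_rng(0)
blocks = []
for _ in range(N_SYM):
    # Random QPSK subcarriers -> IFFT -> prepend the cyclic prefix
    qpsk = (rng.choice([1, -1], N_FFT) + 1j * rng.choice([1, -1], N_FFT)) / np.sqrt(2)
    body = np.fft.ifft(qpsk)
    blocks.append(np.concatenate([body[-N_CP:], body]))  # CP = copy of tail
tx = np.concatenate(blocks)

# Autocorrelate. Every symbol's CP contributes coherently at lag = N_FFT,
# i.e. at the useful symbol time Tu, while other lags stay noise-like.
ac = np.correlate(tx, tx, mode='full')  # NumPy conjugates the second input
lags = np.arange(-tx.size + 1, tx.size)
pos = lags > 0
tu_est = lags[pos][np.argmax(np.abs(ac[pos]))]
print(tu_est)  # -> 64, the strongest positive-lag peak sits at Tu = N_FFT samples
```

This is a clean, noise-free sketch; on a real capture the same peak structure appears, just with a noisier floor.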

Is this true for other standards as well, like DVB-T and WiMAX? The signal I was dealing with is not Wi-Fi, but another standard like DVB-T or WiMAX.

Thanks for your feedback. This is hugely helpful. You mentioned the OFDM symbol repeating indefinitely, and that this leads to cheap implementations in hardware...

[cp & symbol] aligns with [cp & symbol] at zero lag, where we can see a huge peak. The other peak is where the CP aligns with the symbol end. That's where we are going...

Here is the output of the autocorrelation: [ASCII plot omitted: a large peak at zero lag, with smaller side peaks where the CP aligns with the symbol end.]

Well, a built-in correlation function applied to the same signal twice can perform autocorrelation, right? Like xcorr(x,x).
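Right; and in Python the closest NumPy equivalent of MATLAB's xcorr(x,x) is np.correlate with mode='full'. A small sanity check (the toy vector x is made up for illustration): NumPy conjugates the second argument, matching the complex autocorrelation definition, and the lag-0 value comes out as the signal energy.

```python
import numpy as np

# Toy complex signal, chosen only for illustration
x = np.array([1 + 1j, 2 - 1j, 0.5 + 0j])

# NumPy analogue of MATLAB's xcorr(x, x): full-length autocorrelation.
# np.correlate conjugates its second argument, so this computes
# sum_n x[n + k] * conj(x[n]) for every lag k.
ac = np.correlate(x, x, mode='full')
lags = np.arange(-x.size + 1, x.size)

# The lag-0 value of an autocorrelation equals the signal energy sum(|x|^2)
print(np.isclose(ac[lags == 0][0], np.sum(np.abs(x) ** 2)))  # -> True
```

One caveat: unlike xcorr, np.correlate applies no normalization, so if you use one of xcorr's scale options ('biased', 'unbiased', 'coeff') you have to divide by the appropriate factor yourself.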

The noisy received signal (rx) is completely unknown. The whole objective is to autocorrelate it to "estimate" the OFDM useful symbol time (Tu). It is producing...

You take the time samples and use a Python or MATLAB built-in function to perform autocorrelation. It's autocorrelation, so the signal is correlated with itself.
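Since the whole point is that rx is noisy and unknown, here is a hedged sketch of that estimation step with noise added. The parameters (N_FFT=64, N_CP=16, 20 symbols, 10 dB SNR) are again made up for illustration; the takeaway is that the lag of the strongest positive-lag peak still lands on Tu at a moderate SNR.

```python
import numpy as np

# Illustrative parameters, not from any standard
N_FFT, N_CP, N_SYM, SNR_DB = 64, 16, 20, 10
rng = np.random.default_rng(1)

# Build a CP-OFDM burst from random QPSK subcarriers
blocks = []
for _ in range(N_SYM):
    qpsk = (rng.choice([1, -1], N_FFT) + 1j * rng.choice([1, -1], N_FFT)) / np.sqrt(2)
    body = np.fft.ifft(qpsk)
    blocks.append(np.concatenate([body[-N_CP:], body]))
tx = np.concatenate(blocks)

# Add complex white Gaussian noise at the chosen SNR
sig_pow = np.mean(np.abs(tx) ** 2)
noise_pow = sig_pow / 10 ** (SNR_DB / 10)
noise = np.sqrt(noise_pow / 2) * (
    rng.standard_normal(tx.size) + 1j * rng.standard_normal(tx.size)
)
rx = tx + noise  # this is all the "receiver" gets to see

# Autocorrelate the noisy capture and read Tu off the peak lag
ac = np.correlate(rx, rx, mode='full')
lags = np.arange(-rx.size + 1, rx.size)
pos = lags > 0
tu_est = lags[pos][np.argmax(np.abs(ac[pos]))]
print(tu_est)  # -> 64: the peak lag recovers Tu (= N_FFT samples here)
```

In sample units the estimate is just the peak lag; dividing by the sample rate converts it to seconds.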
