How do we know the receiver delay in an OFDM transceiver system?

Started by amitjonak · 4 years ago · 4 replies · latest reply 4 years ago · 128 views

Hello everyone!

I was simulating a simple OFDM transceiver system without a channel (the transmitter and receiver connected directly). If I use the "Find Delay" block in Simulink between the transmitted and received bits, the reported delay keeps changing, which makes it difficult to pick a fixed receiver delay for the BER calculation. A simple BPSK transceiver without OFDM, however, shows a constant delay. Can anyone please tell me what is actually happening, and what should be considered when calculating the BER of an OFDM system?

Reply by Mannai_Murali, April 27, 2020

I have used only MATLAB, not Simulink. Try hard-coding a delay of zero if the block allows it. If the block estimates the delay by searching for a particular sync bit pattern in the bit stream, and that pattern also occurs at random within the data, it may report a variable delay.

Alternatively, send a known bit pattern in the first frame and use that as the sync pattern.
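As a rough sketch of the idea above (not the poster's actual Simulink model; the preamble length, delay, and bit mapping are all made up for illustration), the delay to a known sync pattern can be recovered by cross-correlation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a known sync preamble is prepended to the payload,
# and the receive chain adds an unknown integer delay.
sync = rng.integers(0, 2, 64)                # known preamble bits
payload = rng.integers(0, 2, 1000)
tx = np.concatenate([sync, payload])
true_delay = 37
rx = np.concatenate([np.zeros(true_delay, dtype=int), tx])

# Map bits to +/-1 so the correlation peaks sharply at the alignment point.
corr = np.correlate(2 * rx - 1, 2 * sync - 1, mode="valid")
est_delay = int(np.argmax(corr))
print(est_delay)  # recovers the inserted delay of 37
```

With the delay pinned down this way once per frame, the transmitted and received bit streams can be aligned before counting errors for the BER.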

Reply by thirugvas, April 27, 2020

What is your channel model? Are you considering a complex Gaussian channel or a Rayleigh fading channel?

Reply by amitjonak, April 27, 2020

Actually, I am not using any channel, but I am using the built-in Digital Up-Converter (DUC) and Digital Down-Converter (DDC) to modulate and demodulate my OFDM signal. Basically, I am trying to implement a wideband frequency-hopped OFDM system using the DUC and DDC. Nowadays we have RFSoCs that house RF-sampling ADCs and DACs, which can make the entire transceiver fully digital. So I am trying to replicate such a system at the simulation level, achieving frequency hopping and translation without a mixer, which is an analog component. While simulating just now, I noticed the constellation diagram was rotating, which meant there was a phase error. This was probably because the oscillators in the DUC and DDC were not in sync. So I calculated the phase error and compensated for it by rotating the signal back by that amount. Now I get zero BER with a perfect constellation diagram. My next step will be to test this on the Xilinx ZCU1275 RFSoC board. Thank you very much for the response! Appreciate it.
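The compensation step described above can be sketched as follows. This is a hedged toy example, not the poster's model: QPSK symbols, a fixed 0.3 rad phase offset between the DUC and DDC oscillators, and a data-aided estimate against known symbols are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: QPSK symbols pass through a DUC/DDC pair whose
# oscillators have a fixed phase offset, rotating the constellation.
bits = rng.integers(0, 2, (256, 2))
symbols = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)
phase_offset = 0.3                       # radians, unknown to the receiver
rx = symbols * np.exp(1j * phase_offset)

# Data-aided estimate: average the residual rotation against the known
# transmitted symbols (in practice a pilot/training sequence).
est = np.angle(np.mean(rx * np.conj(symbols)))
corrected = rx * np.exp(-1j * est)       # de-rotate the constellation
print(est)  # close to 0.3
```

In a real receiver the reference would be a pilot or training sequence rather than the full transmitted data, and the estimate would be refreshed periodically if the offset drifts.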

Reply by rrlagic, April 27, 2020

I have no experience with Simulink, but I do with practical OFDM systems. As you already guessed, the channel introduces a delay, which results in constellation rotation. But there are more troubles in the real world. A few to name are Tx/Rx sample-clock discrepancy, LO frequency discrepancy, and oscillators that are out of phase sync. Because of that, equalization, sampling-time-offset estimation, frequency-offset estimation, and sometimes phase tracking are routine jobs in an OFDM receiver. This is done with the help of a priori known reference signals: pilot tones as in WiFi, reference signals as in the LTE downlink, or even whole reference symbols as in the LTE uplink. Once you have recovered all those discrepancies, the received constellation stays firm for some short period of time, or better said, its rotation becomes small enough not to affect reception of a symbol train of a certain length. In the process of equalization we effectively cancel the channel effect, but we rarely calculate the channel delay directly. If the delay is within the permissible range, the symbols are recovered properly; if it is too long, ISI breaks reception.

Hope this gives you some idea.