IFFT in OFDM Simulation
I have run into the same problem while simulating an OFDM system at baseband. When I
finished a single-user BPSK-modulated simulation in a flat Rayleigh fading
channel, the BER performance was very strange and did not match the ideal
performance of BPSK in a flat Rayleigh fading channel. When I checked
the Matlab program, I found that if I use the model
y = ifft(fading*Sig + AWGN), the results are strange. If I change the model to
y = ifft(fading*Sig) + AWGN, the results are OK (though I did not compare them
with the ideal performance; they merely look OK). So I investigated the IFFT of AWGN, and I
found that the mean and the variance changed greatly, even though I am not sure
whether the pdf of the noise after the IFFT is still Gaussian. I
also examined another OFDM program, for OFDM channel estimation in the frequency
domain, which I wrote according to an IEEE paper. I found that the results match the
paper when I use the model Y = HX + N after the FFT, with N being AWGN.
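For what it is worth, the change in the noise statistics seems to be exactly what the scaling convention predicts: Matlab's ifft divides by N, so the IFFT of AWGN has its variance reduced by a factor of N, while the samples stay Gaussian (the IFFT is a linear transform of jointly Gaussian samples). Here is a minimal NumPy sketch of that check (numpy.fft.ifft uses the same 1/N convention as Matlab's ifft; the FFT size and the Eb/N0 value below are arbitrary assumptions, not taken from my program), which also compares the per-subcarrier model Y = HX + N against the closed-form BPSK flat-Rayleigh BER 0.5*(1 - sqrt(g/(1+g))):

```python
import numpy as np

rng = np.random.default_rng(0)

# --- 1) IFFT of AWGN: variance shrinks by 1/N (Matlab/NumPy ifft convention) ---
N = 1024                                    # FFT size (assumed)
trials = 2000
awgn = (rng.standard_normal((trials, N)) +
        1j * rng.standard_normal((trials, N))) / np.sqrt(2)  # unit-variance complex AWGN
awgn_t = np.fft.ifft(awgn, axis=1)          # same 1/N scaling as Matlab's ifft
print(np.var(awgn), np.var(awgn_t))         # ~1.0 vs ~1/N; samples remain Gaussian

# --- 2) Frequency-domain model Y = H*X + N vs closed-form BPSK Rayleigh BER ---
nbits = 200_000
g = 10.0                                    # average Eb/N0 = 10 dB (assumed)
x = 2.0 * rng.integers(0, 2, nbits) - 1     # BPSK symbols +-1
h = (rng.standard_normal(nbits) +
     1j * rng.standard_normal(nbits)) / np.sqrt(2)       # flat Rayleigh, E|h|^2 = 1
n = (rng.standard_normal(nbits) +
     1j * rng.standard_normal(nbits)) / np.sqrt(2 * g)   # AWGN with variance 1/g
y = h * x + n                               # per-subcarrier model Y = HX + N
xhat = np.sign((np.conj(h) * y).real)       # coherent detection
ber = np.mean(xhat != x)
ber_ref = 0.5 * (1 - np.sqrt(g / (1 + g)))  # ideal BPSK BER in flat Rayleigh fading
print(ber, ber_ref)                         # the two should agree closely
```

If the AWGN is added before the IFFT instead of after it, the 1/N factor silently reduces the noise power, which would explain the strange BER curves.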
Any help will be appreciated.
Thanks in advance,
Li Pingan