
help required regarding LMMSE channel estimation in OFDM

Started by usmangul66 October 8, 2008
I have simulated the MSE of the channel estimate for several techniques in OFDM, but for LMMSE I am not getting the expected results. The MSE for LMMSE comes out better than transform-domain LS and time-domain LS at low SNR (below about 5 dB), but it degrades at higher SNRs and approaches frequency-domain LS at high SNRs such as 40 dB.
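To be clear about the metric: by MSE I mean the average squared error of the frequency-domain estimate, computed along these lines at each SNR point and averaged over many channel and noise realizations (a sketch, not my exact loop; h_est stands for whichever estimate is being evaluated):

mse_point = mean(abs(H(:) - h_est(:)).^2);   % squared error against the true frequency response H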
    My simulation parameters are:
     symbol_length = 64;
     channel length = 5; (the channel is constant during the transmission)
     The channel is a Rayleigh fading channel generated by the MATLAB command:
h = (randn(1,M) + sqrt(-1)*randn(1,M))/sqrt(2*M);   % M is the channel length = 5
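For reference, the frequency response H and the LS estimate ch_LS used below are obtained roughly like this (a sketch under my assumptions; Y denotes the received training symbol and does not appear in the code further down):

H = fft(h, symbol_length).';   % channel frequency response (column vector)
noise = sqrt(noise_var/2)*(randn(symbol_length,1) + sqrt(-1)*randn(symbol_length,1));
Y = diag(mod_points(:,1))*H + noise;   % received training symbol
ch_LS = Y ./ mod_points(:,1);   % frequency-domain LS estimate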
     I calculate the LMMSE channel estimate as follows:
[X,Rhh] = corrmtx(H, frame_size-1);   % Rhh: autocorrelation matrix estimated from H (the first output X is not used)
temp = noise_var*(diag(mod_points(:,1))*diag(mod_points(:,1)'))^-1;   % sigma^2 * (X*X^H)^-1
hlmmse = Rhh*((Rhh + temp)^-1)*ch_LS;   % Rhh*(Rhh + sigma^2*(X*X^H)^-1)^-1 * H_LS

H is the frequency-response vector of the channel.
Rhh is the autocorrelation matrix calculated by the corrmtx function.
ch_LS is the frequency-domain LS estimate.
mod_points(:,1) is the training symbol.
hlmmse is the calculated LMMSE estimate.
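In case it matters for the question below: the alternative I understand from the theory is to build the channel correlation matrix from the channel statistics rather than from a single realization via corrmtx. A sketch under my channel model (M uncorrelated taps, each of average power 1/M), not what my code currently does:

F = exp(-sqrt(-1)*2*pi*(0:symbol_length-1).'*(0:M-1)/symbol_length);   % first M columns of the DFT matrix
Rhh_stat = F*diag(ones(M,1)/M)*F';   % E[H*H'] for uncorrelated taps of power 1/M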

     Can you please identify the problem with this simulation, or are these results correct? Is the procedure for calculating the autocorrelation matrix correct?