LS channel estimation for LTE

Started by Sri2424 January 7, 2008
Hi Pals,
     I am developing a least-squares (LS) channel estimator in MATLAB. I will
briefly describe what I am doing; please let me know if anything is wrong.
     At the receiver, the frequency-domain channel estimate is obtained by a
simple division: the received pilot divided by the transmitted pilot. Once the
frequency-domain estimates at the pilot positions are found, we perform
frequency-domain interpolation using the LS equation: the estimates are
converted to the time domain, windowed to the CP length, and converted back to
the frequency domain, which gives the channel coefficients on all subcarriers.
I am using the SCM channel model and MATLAB's built-in AWGN function for the
channel simulation.
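The per-pilot division step can be sketched as follows. This is a toy NumPy illustration, not the poster's MATLAB code; the pilot values, channel, and noise level are all assumptions made up for the example.

```python
import numpy as np

# Toy setup (assumed values, not from the post): constant QPSK pilots x,
# a known frequency response H at the pilot positions, and additive noise.
rng = np.random.default_rng(1)
n_pilots = 8
x = (1 + 1j) / np.sqrt(2) * np.ones(n_pilots)  # transmitted pilots, |x| = 1
H = rng.standard_normal(n_pilots) + 1j * rng.standard_normal(n_pilots)
noise = 0.01 * (rng.standard_normal(n_pilots)
                + 1j * rng.standard_normal(n_pilots))

r = H * x + noise   # received pilots: r = x*h + n, per subcarrier
H_FDE = r / x       # raw per-pilot estimate by simple division

print(np.max(np.abs(H_FDE - H)))  # small; the error is set by the noise
```

Note that each raw estimate carries the full noise on its pilot, which is why the subsequent LS projection onto CP-length impulse responses should reduce the error.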
     If I plot the performance curve (SNR vs. NMSE), I am not getting good
performance: the raw FDE estimate is performing better than LS, but after
interpolation it should be the other way around. (LS should perform better
than FDE.) I will summarize the equations I am using:

r = x*h + n
H_FDE = r ./ x;
hLS = inv(F' * F) * F' * H_FDE;
HLS = Fsc * hLS;

     F is the twiddle-factor (DFT) matrix with rows at the reference-signal
(pilot) positions and columns spanning the CP length.
     F' is the conjugate transpose (Hermitian) of F, as in MATLAB.
     Fsc is the twiddle matrix at all subcarriers across the CP length.
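The LS interpolation step above can be sketched end to end. This is a minimal NumPy version under assumed parameters (64 subcarriers, CP length 16, pilots on every 4th subcarrier; none of these numbers come from the post), run noiselessly so the reconstruction should be exact.

```python
import numpy as np

# Assumed system parameters (not from the post).
N_SC, CP_LEN = 64, 16
pilot_idx = np.arange(0, N_SC, 4)  # pilot subcarrier positions

rng = np.random.default_rng(0)
# Random time-domain channel with at most CP_LEN taps.
h = (rng.standard_normal(CP_LEN)
     + 1j * rng.standard_normal(CP_LEN)) / np.sqrt(2 * CP_LEN)

# Twiddle matrices: Fsc covers all subcarriers, F only the pilot rows,
# with columns spanning the first CP_LEN taps.
n = np.arange(N_SC)[:, None]    # subcarrier index
l = np.arange(CP_LEN)[None, :]  # tap index
Fsc = np.exp(-2j * np.pi * n * l / N_SC)  # N_SC x CP_LEN
F = Fsc[pilot_idx, :]                     # pilot rows only

H_true = Fsc @ h            # channel on all subcarriers
H_FDE = H_true[pilot_idx]   # noiseless per-pilot estimates r/x

# LS solve: hLS = inv(F' F) F' H_FDE, then HLS = Fsc hLS.
# lstsq solves the same normal equations in a numerically safer way.
hLS, *_ = np.linalg.lstsq(F, H_FDE, rcond=None)
HLS = Fsc @ hLS

print(np.max(np.abs(HLS - H_true)))  # ~0 in the noiseless case
```

If this noiseless reconstruction is not exact in your setup, the usual suspects are F' being a plain (non-conjugate) transpose, the pilot indices not matching the rows of F, or the SCM delay spread exceeding the CP window.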

Please help me if I am doing anything wrong; I am not able to find where the
issue is. Your suggestions are always welcome.
Thanks in advance,