
OFDM Autocovariance Channel Matrix for LMMSE channel estimation

Started by tonialar July 14, 2008
Hello,

I am testing different OFDM channel estimation (CE) approaches through
simulation in Simulink, and I am having a problem that might be trivial but
is taking too much of my time without my finding a solution.

My problem arises when I want to compare several methods against ideal LMMSE CE.
I know the channel from measurements, so I have the attenuation
coefficient of each subcarrier, i.e., the column vector H.

Since Hest_LMMSE = RHH * (RHH + (beta/SNR)*I)^-1 * Hest_LS, I need to compute
RHH, and since I have the true vector H, I compute:

RHH = H * H_hermitian (resulting in an NxN matrix). I then apply the actual SNR
value to obtain the filtering matrix. The problem is that I get an estimate
with exactly the right shape, but with a scaling offset, i.e., Hest_LMMSE = K*H,
where K is a constant that is different for each channel...
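
In case it helps, here is a minimal NumPy sketch of what my Simulink model
effectively computes (the variable names, the random test channel, and
beta = 17/9 for 16-QAM are my own choices for the example):

import numpy as np

N = 64
rng = np.random.default_rng(0)

# True per-subcarrier channel (stand-in for my measured H), as an N x 1 column
H = (rng.standard_normal((N, 1)) + 1j * rng.standard_normal((N, 1))) / np.sqrt(2)

snr = 100.0                    # linear SNR (20 dB)
beta = 17.0 / 9.0              # constellation factor for 16-QAM

# LS estimate: true channel plus per-subcarrier noise
noise = (rng.standard_normal((N, 1)) + 1j * rng.standard_normal((N, 1))) / np.sqrt(2 * snr)
H_ls = H + noise

# "Autocovariance" built from my single known realization
R_hh = H @ H.conj().T          # N x N outer product

# LMMSE filtering matrix and estimate
W = R_hh @ np.linalg.inv(R_hh + (beta / snr) * np.eye(N))
H_lmmse = W @ H_ls

# The symptom I describe above: H_lmmse = K * H with one constant per channel
K = (H_lmmse / H).ravel()
print(np.round(K[:4], 4))      # the same value on every subcarrier

Running this reproduces the offset: K comes out the same on every subcarrier
but changes whenever I change H.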

I was wondering whether I am computing the RHH matrix wrongly, but since

RHH = E{H * H_hermitian} and I have perfect channel knowledge,
RHH = H * H_hermitian. Isn't it?
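
Just to spell out how I am reading that expectation (a small sketch; the
helper is my own naming):

import numpy as np

def sample_rhh(realizations):
    # Sample estimate of RHH = E{H * H_hermitian} from a list of N x 1 vectors
    return sum(h @ h.conj().T for h in realizations) / len(realizations)

# With only my single measured realization the average collapses to H * H^H:
# sample_rhh([H]) is exactly H @ H.conj().T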

The fact is that this matrix does not have 1s on its diagonal, but rather the
squared magnitude (abs^2) of each attenuation coefficient. Am I missing some
kind of normalization?
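
(For what it's worth, that diagonal falls straight out of the outer product;
toy values are mine:)

import numpy as np

H = np.array([[0.5 + 0.5j], [1.0 - 0.2j], [0.3 + 0.9j]])   # toy 3 x 1 channel

# diag(H * H^H)_k = H_k * conj(H_k) = |H_k|^2, so the diagonal holds the
# squared magnitude of each attenuation coefficient rather than 1s
print(np.diag(H @ H.conj().T).real)   # [0.5  1.04 0.9 ]
print(np.abs(H.ravel()) ** 2)         # identical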

Thanks