Sampling frequency correction by Interpolation/Decimation technique?

Started by nqh May 3, 2007
Dear all,
In an OFDM system, we define T as the sampling period at the output of the IFFT
(at the transmitter), T' as the sampling period at the receiver, and SFO = (T'-T)/T.

At the receiver, after CP removal, the (m,n)-th sample of the time-domain
received signal is r'(m,n), which includes the SFO.
How can we use an interpolation/decimation technique to correct r'(m,n)
(to get the desired signal r(m,n) without SFO)?
Can anyone help me propose the algorithm? (Assume perfect knowledge of the SFO.)
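Not the definitive answer, but here is a minimal sketch of the idea in Python/NumPy, assuming the SFO is perfectly known. The receiver's sample k sits at physical time k*T' = k*(1+SFO)*T, so the desired sample at time n*T lies at fractional receiver index n/(1+SFO); a fractional-delay interpolator (linear interpolation here for simplicity; a polyphase or Farrow resampler would be better in practice) evaluates the received signal at those indices. The function name `correct_sfo` is illustrative, not from any standard library:

```python
import numpy as np

def correct_sfo(r_prime, sfo):
    """Resample one received block r'(n), taken at period T' = (1+sfo)*T,
    back onto the transmitter grid n*T by linear interpolation.

    The desired sample at time n*T corresponds to fractional receiver
    index n / (1 + sfo). np.interp handles only real data, so the
    complex baseband signal is interpolated as I and Q separately.
    """
    r_prime = np.asarray(r_prime)
    n = np.arange(len(r_prime))
    t = n / (1.0 + sfo)          # desired times n*T in units of T'
    re = np.interp(t, n, r_prime.real)
    im = np.interp(t, n, r_prime.imag)
    return re + 1j * im
```

For small SFO the fractional offset n*SFO drifts slowly across the symbol, so low-order interpolation is often adequate within one OFDM symbol; the accumulated drift across many symbols is what eventually forces a sample skip/duplication (decimation/insertion) or a higher-order resampler.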

Any help is greatly appreciated.

