Reply by Snowball August 27, 2005
Steve Pope wrote:
> Snowball <sdris@softlab.ntua.gr> wrote:
>
> >Hi,
> >
> > I am not sure how to model the noise though. Is it correct to add
> > AWGN noise on each frequency carrier separately (i.e. in the
> > simulation, I would add an AWGN Simulink block just prior to
> > each of my 256 QAM demodulators)?
>
> If you're not modeling the channel or the time domain generally
> it's correct to add noise at the demodulator inputs, scaled
> by the inverse channel response. This gives you a quickie
> idea of how much the channel response is impairing the operating point.
> You can simulate OFDM performance entirely in the frequency
> domain this way.
>
> However since in your case you are modeling the time domain, it
> is more appropriate to add the noise in the time domain -- directly
> after the channel model (which BTW should have complex
> coefficients if it's at baseband), and before the FFT.
>
> Good luck.
>
> Steve
Hi Steve,

Thanks for your reply. You are right about the channel -- I am actually using a real-coefficient filter (i.e. not baseband), since the output from my IFFT at the OFDM modulator is real. I prepend my 256 complex QAM samples with their complex conjugate, so the IFFT at the OFDM modulator outputs real samples. I think this is correct; it seems to attenuate each carrier by the expected amount.

I must be doing something wrong, though, because when I add the noise after the channel model I get very different results from when I add it just prior to each QAM demodulator. Perhaps I am scaling it incorrectly in the Simulink block... This is why I was hoping the two methods were equivalent.

Anyway, I'll have to call it a night; I will try again tomorrow...

Thanks.
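P.S. For the record, the conjugate-symmetric arrangement I have in mind is roughly the following -- plain MATLAB with placeholder data rather than my actual Simulink model, and with 255 data carriers plus DC and Nyquist forced to zero just for the illustration:

N = 512;
d = randn(255,1) + 1j*randn(255,1);   % placeholder QAM subsymbols
X = [0; d; 0; conj(flipud(d))];       % Hermitian-symmetric IFFT input, length 512
x = ifft(X);                          % output is real up to roundoff
max(abs(imag(x)))                     % on the order of eps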
Reply by Steve Pope August 27, 2005
Snowball <sdris@softlab.ntua.gr> wrote:

>Hi,
> I am not sure how to model the noise though. Is it correct to add
> AWGN noise on each frequency carrier separately (i.e. in the
> simulation, I would add an AWGN Simulink block just prior to
> each of my 256 QAM demodulators)?
If you're not modeling the channel or the time domain generally it's correct to add noise at the demodulator inputs, scaled by the inverse channel response. This gives you a quickie idea of how much the channel response is impairing the operating point. You can simulate OFDM performance entirely in the frequency domain this way.

However since in your case you are modeling the time domain, it is more appropriate to add the noise in the time domain -- directly after the channel model (which BTW should have complex coefficients if it's at baseband), and before the FFT.

Good luck.

Steve
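P.S. In plain MATLAB terms (not your Simulink model -- h, x, Ng and sigma below are just placeholders for your channel impulse response, one transmitted time-domain symbol including its Ng guard samples, and the noise standard deviation), the time-domain version is roughly:

y = filter(h, 1, x);            % channel model, time domain
y = y + sigma*randn(size(y));   % AWGN goes here, after the channel, before the FFT
Y = fft(y(Ng+1:end));           % strip the guard, FFT; each bin feeds one QAM demodulator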
Reply by Snowball August 27, 2005
Hi,

I am simulating a baseband OFDM system in MATLAB Simulink. It looks similar
to the adsl_sim demo, though I am not doing any CRC generation, scrambling or
interleaving. I use a 256-carrier DMT modulator, and transmit my data over a
channel which is basically a lowpass FIR filter with real coefficients.

I am not sure how to model the noise though. Is it correct to add AWGN noise
on each frequency carrier separately (i.e. in the simulation, I would add an
AWGN Simulink block just prior to each of my 256 QAM demodulators)? According
to Cioffi in "A Multicarrier Primer", the SNR at each carrier is scaled by
the squared magnitude of the channel response at that frequency, which is what
the above would achieve (Cioffi treats the analysis of the OFDM system as the
superposition of multiple QAM systems).
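Concretely, what I have in mind per carrier is something like this (plain MATLAB rather than my Simulink model; Hk is the channel gain at carrier k, Xk the transmitted QAM subsymbol, Es the average QAM symbol energy and sigma2 the noise variance -- all just placeholders):

Rk   = Hk*Xk + sqrt(sigma2/2)*(randn + 1j*randn);  % carrier k at the QAM demodulator input
SNRk = abs(Hk)^2 * Es / sigma2;                    % per-carrier SNR, scaled by |Hk|^2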

On another issue, I am using a guard interval to avoid ISI. Suppose I add
50 samples to the result of my 512-point IFFT. Am I right in understanding
that the resulting decrease in bit rate would be 50/512 (for the same occupied
bandwidth)?
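Just so my bookkeeping is clear, here is how I picture the guard insertion (plain MATLAB, assuming a cyclic-prefix style guard; x is one 512-sample IFFT output):

Ng  = 50;
x_g = [x(end-Ng+1:end); x];   % prepend the last Ng samples as a cyclic prefix
% 562 samples are now transmitted per symbol, of which 512 come from the IFFT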

Thanks,
Stefan