Hi, I'm simulating an OFDM system with multipath and AWGN channels. One thing I'm observing is that when I simulate the system at baseband, the awgn command adds roughly the correct amount of noise, but when I modulate the baseband signal onto a carrier and then add the noise, it adds very little noise compared to the baseband case. I have implemented the channel as follows.

===============================================
function CHN_OUT = channel_multi_td(PARAMS, mod_output)
% mod_output is the OFDM transmitter output
mod_out_abs = abs(mod_output);
mod_out_ang = angle(mod_output);
chn_out_mean = mean(mod_out_abs);

% power amplifier model (amplitude soft limiter)
mod_out_amp_abs = mod_out_abs ./ sqrt(1 + (mod_out_abs/(chn_out_mean*10^(PARAMS.sat_level/10))).^2);
mod_out_amp_ang = mod_out_ang;
if PARAMS.bypass_pa == 0
    mod_output_pa = mod_out_amp_abs .* exp(1j*mod_out_amp_ang);
else
    mod_output_pa = mod_output;
end

% multipath channel
if PARAMS.bypass_multipath == 0
    CHN_OUT_MULTI = filter(PARAMS.channel, mod_output_pa);
else
    CHN_OUT_MULTI = mod_output_pa;
end

% AWGN channel
if PARAMS.bypass_awgn == 0
    CHN_OUT_AWGN = awgn(CHN_OUT_MULTI, PARAMS.snr, 'measured');
else
    CHN_OUT_AWGN = CHN_OUT_MULTI;
end

% carrier frequency offset
freq_offset = 2*pi*PARAMS.freq_off*PARAMS.subcarrier_spacing/PARAMS.symbol_rate;
CHN_OUT = CHN_OUT_AWGN .* exp(1j*freq_offset*(0:length(CHN_OUT_AWGN)-1));
=========================================================

Does anyone have any clue why so little noise is added when the signal is modulated onto a carrier?

Best regards,
Ubaid
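[For reference: the amplitude nonlinearity in the function above has the form out = r / sqrt(1 + (r/a_sat)^2), which resembles a Rapp soft-limiter with smoothness p = 1. A minimal NumPy sketch of that curve (variable names here are illustrative, not from the poster's code) shows it is nearly transparent for small amplitudes and saturates at the clipping level for large ones:]

```python
import numpy as np

def pa_am_am(r, a_sat):
    """Rapp-style AM/AM curve with smoothness p = 1, as in the posted PA model:
    out = r / sqrt(1 + (r / a_sat)^2).  a_sat plays the role of
    chn_out_mean * 10^(sat_level/10) in the MATLAB code above."""
    return r / np.sqrt(1.0 + (r / a_sat) ** 2)

a_sat = 1.0
small = pa_am_am(0.01, a_sat)    # nearly linear for small inputs (~0.01)
large = pa_am_am(100.0, a_sat)   # compresses toward a_sat for large inputs (~1.0)
print(small, large)
```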
AWGN Noise with OFDM Signal
Started by Ubaid Abdullah ●July 25, 2008
Reply by ●July 25, 2008
On Jul 25, 3:06 am, "Ubaid Abdullah" <ubaid_abdul...@yahoo.com> wrote:

> Hi, I'm simulating an OFDM system with Multipath and AWGN channels. One
> thing that i'm observing is that whenever I simulate the system on baseband
> the AWGN command adds almost the correct noise but whenever I modulate the
> baseband signal to a carrier and then add the noise, it adds very little
> noise compared to previous case.
[snip]

The MATLAB awgn function works on a sample-by-sample basis. So you may have to normalize according to the oversampling factor, and also according to the number of information bits per sample or symbol.

Is this for a homework assignment?
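[The effect described in the reply can be demonstrated numerically. The NumPy sketch below (an illustration written for this thread, not the poster's MATLAB code) adds noise at a fixed per-sample SNR to a multicarrier signal that is oversampled by a factor L, then measures the SNR seen on the occupied subcarriers after an ideal brick-wall receive filter. The in-band SNR comes out about 10*log10(L) dB better than the requested per-sample SNR, which is exactly the "too little noise" symptom: after upconversion the signal occupies a smaller fraction of the sampled bandwidth, so most of the sample-by-sample noise falls out of band.]

```python
import numpy as np

def add_awgn(x, snr_db, rng):
    """Add complex white Gaussian noise at a given per-sample SNR,
    measuring signal power first (like MATLAB's awgn(x, snr, 'measured'))."""
    p_sig = np.mean(np.abs(x) ** 2)
    p_noise = p_sig / 10 ** (snr_db / 10)
    n = np.sqrt(p_noise / 2) * (rng.standard_normal(len(x))
                                + 1j * rng.standard_normal(len(x)))
    return x + n

rng = np.random.default_rng(0)
N, L = 1024, 4                        # occupied subcarriers, oversampling factor
snr_db = 10.0

# QPSK on N subcarriers, zero-padded to N*L bins -> oversampled time signal
syms = (rng.choice([-1.0, 1.0], N) + 1j * rng.choice([-1.0, 1.0], N)) / np.sqrt(2)
spec = np.zeros(N * L, complex)
spec[:N // 2] = syms[:N // 2]         # positive frequencies
spec[-(N // 2):] = syms[N // 2:]      # negative frequencies
x = np.fft.ifft(spec) * np.sqrt(N * L)

y = add_awgn(x, snr_db, rng)          # per-sample SNR = 10 dB

# receiver: FFT back and keep only the N occupied bins (ideal brick-wall filter)
bins = np.fft.fft(y) / np.sqrt(N * L)
rx = np.concatenate([bins[:N // 2], bins[-(N // 2):]])

err = rx - syms
inband_snr_db = 10 * np.log10(np.mean(np.abs(syms) ** 2) / np.mean(np.abs(err) ** 2))
print(round(inband_snr_db, 1))        # approximately snr_db + 10*log10(L) ~ 16 dB
```

With this picture, the fix for the passband simulation is to request `snr - 10*log10(L)` from the per-sample noise adder (or, equivalently, to measure SNR only inside the signal band).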