Why is the BER for a baseband system in an AWGN channel higher than for a passband system under the same AWGN channel conditions?
Hello everyone!
I successfully designed a wideband frequency-hopped OFDM+BPSK system in which I use multiple DUCs and DDCs for up- and down-conversion to and from different passband frequencies in the GHz range. However, when comparing the BER vs. SNR curves under AWGN channel conditions, I get something weird: the passband system achieves a BER of 10^-5 at 10 dB, whereas the same system operating at baseband gives me a BER of around 10^-4 at 10 dB. Why is this happening? Is it because I didn't use a fading channel for the passband system? In that case, how do I model the channel?
I have attached some pictures of my system along with the BER vs. SNR curves for passband and baseband.
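For reference, here is a minimal sketch of the ideal coherent-BPSK curve in AWGN that both simulations should approach if the x-axis is Eb/N0 (this is my own illustration, not the OP's code; the function name and the use of scipy are my choices):

```python
import numpy as np
from scipy.special import erfc

def bpsk_ber_theory(ebn0_db):
    """Pb = Q(sqrt(2*Eb/N0)) for coherent BPSK in AWGN."""
    ebn0 = 10.0 ** (np.asarray(ebn0_db, dtype=float) / 10.0)
    return 0.5 * erfc(np.sqrt(ebn0))  # Q(x) = 0.5*erfc(x/sqrt(2))

for db in (8, 9, 10):
    print(f"Eb/N0 = {db} dB -> theoretical BER = {bpsk_ber_theory(db):.2e}")
```

At 10 dB this evaluates to roughly 4e-6, so if the plotted axis really is Eb/N0, neither the 10^-4 nor the 10^-5 point matches ideal BPSK, which itself hints at a calibration difference between the two chains.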
Do you mean that you're using a fading channel model in one case and not the other? That would definitely explain the difference. Otherwise the two curves should be the same, and if they're not, some difference in the channels or the implementations is causing it.
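One implementation difference that often produces exactly this asymmetry (a guess on my part; the DUC/DDC details aren't visible in the post) is noise calibration at the oversampled passband rate: if the noise variance is set from a per-sample SNR before the DDC, the decimation filters strip the out-of-band noise and the passband curve comes out optimistically low. Here is a sketch of a calibration that avoids this, where `add_awgn` and `samples_per_bit` are my own hypothetical names:

```python
import numpy as np

rng = np.random.default_rng(0)

def add_awgn(signal, ebn0_db, samples_per_bit, complex_noise=True):
    """Add AWGN at a given Eb/N0, accounting for oversampling."""
    # Energy per bit is measured from the waveform itself, so the DUC's
    # interpolation factor is folded in automatically.
    eb = np.mean(np.abs(signal) ** 2) * samples_per_bit
    n0 = eb / 10.0 ** (ebn0_db / 10.0)
    sigma = np.sqrt(n0 / 2.0)          # noise std dev per real dimension
    if complex_noise:                  # complex baseband: noise on I and Q
        noise = sigma * (rng.standard_normal(signal.shape)
                         + 1j * rng.standard_normal(signal.shape))
    else:                              # real-valued passband noise
        noise = sigma * rng.standard_normal(signal.shape)
    return signal + noise
```

With this convention the same Eb/N0 produces the same post-matched-filter decision SNR whether the noise is added at baseband (1 sample/bit, complex) or at the oversampled real passband rate.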
In my view, if the only impairment is AWGN, then whether in the RF or the BB domain, the expected BER at a given Eb/N0 or SNR should be the same.
The fact that there is a delta in BER between the RF and BB domains at a given Eb/N0 or SNR implies that additional impairments have been introduced into the system.
Also, since it is the BB domain that shows the higher BER at a given Eb/N0 or SNR, it points to an extra impairment, such as fading, in that path as the culprit.
Best regards,
Shahram Shafie
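On the "how do I model the channel?" part of the question: if each hop's bandwidth is narrow enough to be treated as flat, a common starting point is block Rayleigh fading with one complex Gaussian tap per dwell. A minimal sketch (the per-hop block-fading assumption and the names are mine; a frequency-selective channel would need a tapped-delay-line model instead):

```python
import numpy as np

rng = np.random.default_rng(1)

def rayleigh_block_fade(tx_blocks):
    """Apply one flat-fading tap per hop (block fading)."""
    faded = []
    for block in tx_blocks:
        # CN(0, 1) tap: Rayleigh-distributed magnitude, uniform phase
        h = (rng.standard_normal() + 1j * rng.standard_normal()) / np.sqrt(2)
        faded.append(h * block)  # the receiver must estimate and equalize h
    return faded
```

Note that with fading plus AWGN the passband curve should sit above the AWGN-only baseband curve, not below it, which is why the reported results look backwards.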
I'm confused by your results too.
A BER of 10^-4 at baseband (presumably without fading) and 10^-5 at passband (with fading) is the opposite of what we would expect.
Did I misread your post?