Scrambling in OFDM standards?

Started by Oli Filth September 18, 2006
Dear all,

In most OFDM standards (802.11a, 802.16, DAB, DVB), there is a data 
scrambling stage, which involves XORing the data bit stream with a 
PRBS.  My question is, what is the purpose of this?
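
To make this concrete, here's a minimal sketch (in Python) of the sort
of scrambler in question, modelled on the 802.11a LFSR with generator
polynomial x^7 + x^4 + 1; the seed value is arbitrary and just for
illustration:

def scramble(bits, seed=0b1011101):
    # XOR a bit sequence with an 802.11a-style PRBS (x^7 + x^4 + 1).
    # 'seed' is the initial 7-bit shift-register state (any non-zero value).
    state = seed
    out = []
    for b in bits:
        fb = ((state >> 6) ^ (state >> 3)) & 1  # taps at x^7 and x^4
        state = ((state << 1) | fb) & 0x7F      # shift, keep 7 bits
        out.append(b ^ fb)                      # rate 1: one bit out per bit in
    return out

# Descrambling is the same operation with the same seed:
data = [1, 0, 1, 1, 0, 0, 0, 1]
assert scramble(scramble(data)) == data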

In the DAB spec (EN 300 401), it says:

"The purpose is to avoid the transmission of signal patterns which might 
result in an unwanted regularity in the transmitted signal."

This would be fair enough, except for two things:

* The scrambling occurs *before* coding and interleaving - surely these 
steps alone would be enough to muck up any "regularity" in the bit stream?

* The scrambler has a code rate of 1.  Therefore, assuming all input bit 
sequences are equiprobable, then all output bit sequences are possible 
and equiprobable.  Therefore, "regularity" should be just as likely at 
the scrambler output.
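
To put the second point another way: the scrambler is just a fixed XOR
mask, hence a bijection on bit sequences, so a uniform input
distribution maps to a uniform output distribution. A brute-force check
over all 8-bit inputs (reusing the scramble() sketch above):

from itertools import product

# Every 8-bit input maps to a distinct 8-bit output, so if all inputs
# are equiprobable, all outputs are equiprobable too.
outputs = {tuple(scramble(list(bits))) for bits in product([0, 1], repeat=8)}
assert len(outputs) == 2 ** 8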


Can anyone shed any light on this?


-- 
Oli
Oli Filth wrote:
> Dear all,
>
> In most OFDM standards (802.11a, 802.16, DAB, DVB), there is a data
> scrambling stage, which involves XORing the data bit stream with a
> PRBS.  My question is, what is the purpose of this?
>
> In the DAB spec (EN 300 401), it says:
>
> "The purpose is to avoid the transmission of signal patterns which might
> result in an unwanted regularity in the transmitted signal."
>
> This would be fair enough, except for two things:
>
> * The scrambling occurs *before* coding and interleaving - surely these
> steps alone would be enough to muck up any "regularity" in the bit stream?
If you start with a good pseudo-random stream, then code it and interleave it, it'll still have lots of ones and zeros, and not too many long strings of one or the other.
> * The scrambler has a code rate of 1.  Therefore, assuming all input bit
> sequences are equiprobable, then all output bit sequences are possible
> and equiprobable.  Therefore, "regularity" should be just as likely at
> the scrambler output.
The crux is that the 'all input bit sequences are equiprobable'
assumption is almost certainly not true.  Any time you're transmitting
real-world signals like voice or video in uncompressed or lossy
compressed form, you're going to be prone to sending silence or white
screens -- these will almost certainly take the form of long strings of
zeros or ones, or other regular forms that would mess up synchronization
and generally produce big spikes in your spectrum.  Scrambling the data
goes a long way toward eliminating this, although there is still the
chance that you'll have pathological data that will scramble to all
zeros.
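
Both halves of this are easy to demonstrate with the scramble() sketch
from the first post: an all-zeros frame scrambles to the PRBS itself
(roughly balanced, no long runs), while the pathological input -- the
PRBS itself -- scrambles to all zeros:

zeros = [0] * 64
prbs = scramble(zeros)              # all-zeros input comes out as the PRBS
print(sum(prbs), "ones out of 64")  # roughly half ones

assert scramble(prbs) == zeros      # pathological input: all zeros out
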
> Can anyone shed any light on this?
I hope this helps.

-- 
Tim Wescott
Wescott Design Services
http://www.wescottdesign.com
Tim Wescott said the following on 18/09/2006 19:51:
> Oli Filth wrote:
>> * The scrambling occurs *before* coding and interleaving - surely
>> these steps alone would be enough to muck up any "regularity" in the
>> bit stream?
>
> If you start with a good pseudo-random stream, then code it and
> interleave it, it'll still have lots of ones and zeros, and not too many
> long strings of one or the other.
My point was more the opposite - couldn't we do away with the scrambler, as the coder/interleaver would provide sufficient "randomisation" by themselves?
>> * The scrambler has a code rate of 1.  Therefore, assuming all input
>> bit sequences are equiprobable, then all output bit sequences are
>> possible and equiprobable.  Therefore, "regularity" should be just as
>> likely at the scrambler output.
>
> The crux is that the 'all input bit sequences are equiprobable'
> assumption is almost certainly not true.  Any time you're transmitting
> real-world signals like voice or video in uncompressed or lossy
> compressed form, you're going to be prone to sending silence or white
> screens -- these will almost certainly take the form of long strings of
> zeros or ones, or
Would this still be true when MPEG compression is used, particularly as Huffman encoding is used as a final stage? I know very little about compression algorithms.
> other regular forms that would mess up synchronization and generally
> produce big spikes in your spectrum.
>
> Scrambling the data goes a long way toward eliminating this, although
> there is still the chance that you'll have pathological data that will
> scramble to all zeros.
In the case of i.i.d. data, as one might expect on 802.11 or 802.16,
isn't the "all sequences equiprobable" assumption justified - so
pathological sequences that scramble to all zeros are just as likely as
sequences that were all zeros to start with?

-- 
Oli
Oli Filth wrote:

> Tim Wescott said the following on 18/09/2006 19:51:
>
>> Oli Filth wrote:
>>
>>> * The scrambling occurs *before* coding and interleaving - surely
>>> these steps alone would be enough to muck up any "regularity" in the
>>> bit stream?
>>
>> If you start with a good pseudo-random stream, then code it and
>> interleave it, it'll still have lots of ones and zeros, and not too
>> many long strings of one or the other.
>
> My point was more the opposite - couldn't we do away with the scrambler,
> as the coder/interleaver would provide sufficient "randomisation" by
> themselves?
It depends on the coder. Many block check codes (and, IIDHMHUMA* convolutional codes as well) will emit all zeros for an all-zero input. You could modify the code to have lots of ones in the parity check of a zero input, however.
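
For instance, the rate-1/2, constraint-length-7 convolutional code used
in 802.11a (generators 133 and 171 octal) is linear, so the all-zero
message encodes to all zeros. A minimal sketch, with the tap and
bit-ordering conventions simplified:

def conv_encode(bits, g0=0o133, g1=0o171):
    # Rate-1/2, K=7 convolutional encoder (802.11a-style generators).
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & 0x7F           # 7-bit window, current bit in
        out.append(bin(state & g0).count("1") & 1)  # parity over the g0 taps
        out.append(bin(state & g1).count("1") & 1)  # parity over the g1 taps
    return out

# Linearity: the all-zero message encodes to all zeros...
assert conv_encode([0] * 24) == [0] * 48
# ...so the coder alone can't break up a long run of zeros.
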
>>> * The scrambler has a code rate of 1.  Therefore, assuming all input
>>> bit sequences are equiprobable, then all output bit sequences are
>>> possible and equiprobable.  Therefore, "regularity" should be just as
>>> likely at the scrambler output.
>>
>> The crux is that the 'all input bit sequences are equiprobable'
>> assumption is almost certainly not true.  Any time you're transmitting
>> real-world signals like voice or video in uncompressed or lossy
>> compressed form, you're going to be prone to sending silence or white
>> screens -- these will almost certainly take the form of long strings
>> of zeros or ones, or
>
> Would this still be true when MPEG compression is used, particularly as
> Huffman encoding is used as a final stage? I know very little about
> compression algorithms.
I don't know a lot of detail, but I _do_ know that the 'ideal' compression algorithm pretty much has to take data with lots of discernible pattern to it and reduce it down to something that's apparently random -- because there is an "if and only if" relationship between discernible patterns and redundant information.
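
A crude way to see the pattern/redundancy relationship, using zlib as a
stand-in for a good compressor: heavily patterned input is almost
entirely redundant, while pattern-free input is essentially
incompressible:

import os
import zlib

patterned = b"\x00" * 10_000           # maximal pattern, maximal redundancy
random_ish = os.urandom(10_000)        # no discernible pattern

print(len(zlib.compress(patterned)))   # a few dozen bytes
print(len(zlib.compress(random_ish)))  # ~10,000 bytes: nothing to squeeze out
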
>> other regular forms that would mess up synchronization and generally
>> produce big spikes in your spectrum.
>>
>> Scrambling the data goes a long way toward eliminating this, although
>> there is still the chance that you'll have pathological data that
>> will scramble to all zeros.
>
> In the case of i.i.d. data, as one might expect on 802.11 or 802.16,
> isn't the "all sequences equiprobable" assumption justified - so
> pathological sequences that scramble to all zeros are just as likely as
> sequences that were all zeros to start with?
If I make my standard, and give it to you to play with, I have to either
constrain you from using easy pathological sequences (all zeros) or I
have to scramble the data myself.  I think I'd choose paranoia.

* If I Don't Have My Head Up My Assumptions.

-- 
Tim Wescott
Wescott Design Services
http://www.wescottdesign.com

Oli Filth wrote:

> Dear all,
>
> In most OFDM standards (802.11a, 802.16, DAB, DVB), there is a data
> scrambling stage, which involves XORing the data bit stream with a
> PRBS.  My question is, what is the purpose of this?
>
> In the DAB spec (EN 300 401), it says:
>
> "The purpose is to avoid the transmission of signal patterns which might
> result in an unwanted regularity in the transmitted signal."
>
> This would be fair enough, except for two things:
>
> * The scrambling occurs *before* coding and interleaving - surely these
> steps alone would be enough to muck up any "regularity" in the bit stream?
No, it is not sufficient.
> * The scrambler has a code rate of 1.  Therefore, assuming all input bit
> sequences are equiprobable, then all output bit sequences are possible
> and equiprobable.  Therefore, "regularity" should be just as likely at
> the scrambler output.
Scrambling is supposed to break up the long runs of repeated data. This
is required to maintain symbol synchronization.

Vladimir Vassilevsky
DSP and Mixed Signal Design Consultant
http://www.abvolt.com
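
The run-breaking effect is easy to see with the scramble() sketch from
the first post: a frame that would starve symbol-timing recovery of
transitions comes out with plenty of them:

from itertools import groupby

def longest_run(bits):
    # Length of the longest run of identical bits.
    return max(len(list(g)) for _, g in groupby(bits))

frame = [0] * 100 + [1] * 100         # worst case for timing recovery
print(longest_run(frame))             # 100
print(longest_run(scramble(frame)))   # only short runs remain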

Oli Filth wrote:


> My point was more the opposite - couldn't we do away with the scrambler,
> as the coder/interleaver would provide sufficient "randomisation" by
> themselves?
All-zero sequences for 'empty' data are very likely. Therefore the codes
can't guarantee sufficient randomness.
>>> * The scrambler has a code rate of 1.  Therefore, assuming all input
>>> bit sequences are equiprobable, then all output bit sequences are
>>> possible and equiprobable.
>>
>> The crux is that the 'all input bit sequences are equiprobable'
>> assumption is almost certainly not true.
Yes, Tim Wescott's point is correct.
> Would this still be true when MPEG compression is used, particularly as
> Huffman encoding is used as a final stage? I know very little about
> compression algorithms.
You don't need to scramble the compressed data.
> In the case of i.i.d. data, as one might expect on 802.11 or 802.16,
> isn't the "all sequences equiprobable" assumption justified
No, this is not justified. Sequences of repeated zeroes are very likely.

VLV
Vladimir Vassilevsky said the following on 19/09/2006 00:31:
> Oli Filth wrote:
>
>>>> * The scrambler has a code rate of 1.  Therefore, assuming all input
>>>> bit sequences are equiprobable, then all output bit sequences are
>>>> possible and equiprobable.
>>>
>>> The crux is that the 'all input bit sequences are equiprobable'
>>> assumption is almost certainly not true.
>
> Yes, Tim Wescott's point is correct.
>
>> Would this still be true when MPEG compression is used, particularly
>> as Huffman encoding is used as a final stage? I know very little
>> about compression algorithms.
>
> You don't need to scramble the compressed data.
So why does DAB, for instance, which uses MPEG-2 compression, include a scrambler?
>> In the case of i.i.d. data, as one might expect on 802.11 or 802.16,
>> isn't the "all sequences equiprobable" assumption justified
>
> No, this is not justified. Sequences of repeated zeroes are very
> likely.
Does this mean that i.i.d. data is not a suitable model for arbitrary
network data?

-- 
Oli

Oli Filth wrote:


>> You don't need to scramble the compressed data.
>
> So why does DAB, for instance, which uses MPEG-2 compression, include a
> scrambler?
Because it has to transmit other information besides the continuous
stream of compressed data: the headers, the zero padding, the empty and
half-empty blocks, and so on.
>>> In the case of i.i.d. data, as one might expect on 802.11 or 802.16,
>>> isn't the "all sequences equiprobable" assumption justified
>>
>> No, this is not justified. Sequences of repeated zeroes are very
>> likely.
>
> Does this mean that i.i.d. data is not a suitable model for arbitrary
> network data?
Real network data is not i.i.d.

VLV
Vladimir Vassilevsky said the following on 19/09/2006 01:04:
> Oli Filth wrote:
>
>>> You don't need to scramble the compressed data.
>>
>> So why does DAB, for instance, which uses MPEG-2 compression, include
>> a scrambler?
>
> Because it has to transmit other information besides the continuous
> stream of compressed data: the headers, the zero padding, the empty and
> half-empty blocks, and so on.
Ah, that makes sense.
>>>> In the case of i.i.d. data, as one might expect on 802.11 or 802.16,
>>>> isn't the "all sequences equiprobable" assumption justified
>>>
>>> No, this is not justified. Sequences of repeated zeroes are very
>>> likely.
>>
>> Does this mean that i.i.d. data is not a suitable model for arbitrary
>> network data?
>
> Real network data is not i.i.d.
If this is the case, then scrambling makes much more sense. Thank you
for your input!

-- 
Oli
Vladimir Vassilevsky wrote:
> Oli Filth wrote:
>
>> My point was more the opposite - couldn't we do away with the
>> scrambler, as the coder/interleaver would provide sufficient
>> "randomisation" by themselves?
>
> All-zero sequences for 'empty' data are very likely. Therefore the
> codes can't guarantee sufficient randomness.
>
>>>> * The scrambler has a code rate of 1.  Therefore, assuming all input
>>>> bit sequences are equiprobable, then all output bit sequences are
>>>> possible and equiprobable.
>>>
>>> The crux is that the 'all input bit sequences are equiprobable'
>>> assumption is almost certainly not true.
>
> Yes, Tim Wescott's point is correct.
>
>> Would this still be true when MPEG compression is used, particularly
>> as Huffman encoding is used as a final stage? I know very little
>> about compression algorithms.
>
> You don't need to scramble the compressed data.
Although that is clearly true for typical video pictures, is it
guaranteed to be true for extremely simple, noise-free images?
>> In the case of i.i.d. data, as one might expect on 802.11 or 802.16,
>> isn't the "all sequences equiprobable" assumption justified
>
> No, this is not justified. Sequences of repeated zeroes are very
> likely.
>
> VLV
Steve