DSPRelated.com
Forums

FSK encoding: alternatives to Manchester and NRZ

Started by howy February 3, 2007
Vladimir Vassilevsky <antispam_bogus@hotmail.com> writes:

> Randy Yates wrote:
>
>>>>> I am transmitting FSK using a MICRF505 transceiver chip. The FSK
>>>>> modulator in this chip requires a bit encoding scheme to reduce the DC
>>>>> content of the bit stream to a manageable level.
>>>
>>>> It sounds like you have a binary source with an entropy of less than 1
>>>> bit. If you know the probabilistic model of the source, you can
>>>> simultaneously remove the DC and _REDUCE_ the bitrate by
>>>> source-encoding, e.g., by using a Huffman code.
>>>
>>> Any kind of probabilistic compression does not guarantee against the
>>> DC bias as well as all zero or all one sequences at the output. It can
>>> only reduce the probability of that.
>>
>> Correct, and if the probability is, say, 0.000001?
>
> This would be too impractical to achieve.
A binary source with maximum entropy has a uniform pdf. Thus the probability of a string of 20 zeros (or ones) is about 0.000001. Is a string of 20 zeros "too much" DC?
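For scale, that 0.000001 figure is just 2^-20; a throwaway check in Python, assuming independent, equiprobable bits:

    # Probability of 20 consecutive zeros from a maximum-entropy binary source
    p = 0.5 ** 20
    print(p)   # 9.5367431640625e-07, i.e. roughly 0.000001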
> What does matter is the probability that the DC is going to exceed X
> over the window size of N bits. The probabilistic coder produces a
> random stream of bits, and the distribution of the DC can be assumed
> to be Gaussian with the dispersion of ~ sqrt(N). Now run the numbers.
I have no idea what you just said. It looks like English, but it
doesn't pass my interpreter.

--
% Randy Yates                  % "Though you ride on the wheels of tomorrow,
%% Fuquay-Varina, NC           % you still wander the fields of your
%%% 919-577-9882               % sorrow."
%%%% <yates@ieee.org>          % '21st Century Man', *Time*, ELO
http://home.earthlink.net/~yatescr
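To unpack the quoted paragraph: if the bits are mapped to +1/-1, the DC accumulated over a window of N random bits is a sum of N independent +/-1 values, so it is approximately Gaussian with standard deviation sqrt(N). A minimal Monte Carlo sketch of that claim (the window size and trial count below are arbitrary choices):

    import math
    import random

    N = 1000        # window size in bits (arbitrary)
    TRIALS = 5000   # number of windows to simulate (arbitrary)

    # DC accumulated over each window: a sum of N random +/-1 symbols
    samples = [sum(random.choice((-1, +1)) for _ in range(N)) for _ in range(TRIALS)]

    mean = sum(samples) / TRIALS
    std = math.sqrt(sum((s - mean) ** 2 for s in samples) / TRIALS)
    print("measured std of DC:", round(std, 1), "  sqrt(N):", round(math.sqrt(N), 1))

    # Probability that the imbalance exceeds X = 3*sqrt(N); Gaussian prediction ~0.0027
    X = 3 * math.sqrt(N)
    print("P(|DC| > 3*sqrt(N)) ~", sum(abs(s) > X for s in samples) / TRIALS)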

Randy Yates wrote:

>>>> Any kind of probabilistic compression does not guarantee against the
>>>> DC bias as well as all zero or all one sequences at the output. It can
>>>> only reduce the probability of that.
>>>
>>> Correct, and if the probability is, say, 0.000001?
>>
>> This would be too impractical to achieve.
>
> A binary source with maximum entropy has a uniform pdf. Thus the
> probability of a string of 20 zeros (or ones) is about 0.000001.
>
> Is a string of 20 zeros "too much" DC?
How about the strings of 19 zeroes? Or 18 zeroes?
>> What does matter is the probability that the DC is going to exceed X
>> over the window size of N bits. The probabilistic coder produces a
>> random stream of bits, and the distribution of the DC can be assumed
>> to be Gaussian with the dispersion of ~ sqrt(N). Now run the numbers.
>
> I have no idea what you just said. It looks like English, but it
> doesn't pass my interpreter.
I said that your suggestion is freaking nonsense and you'd better think
before saying anything that you don't have any clue about. Is that clear?

VLV
Vladimir Vassilevsky <antispam_bogus@hotmail.com> writes:

> Randy Yates wrote:
>
>>>>> Any kind of probabilistic compression does not guarantee against the
>>>>> DC bias as well as all zero or all one sequences at the output. It can
>>>>> only reduce the probability of that.
>>>>
>>>> Correct, and if the probability is, say, 0.000001?
>>>
>>> This would be too impractical to achieve.
>>
>> A binary source with maximum entropy has a uniform pdf. Thus the
>> probability of a string of 20 zeros (or ones) is about 0.000001. Is
>> a string of 20 zeros "too much" DC?
>
> How about the strings of 19 zeroes? Or 18 zeroes?
If 20 zeros is not too much, then neither should 19 or 18 be too much.
>>> What does matter is the probability that the DC is going to exceed X
>>> over the window size of N bits. The probabilistic coder produces a
>>> random stream of bits, and the distribution of the DC can be assumed
>>> to be Gaussian with the dispersion of ~ sqrt(N). Now run the numbers.
>>
>> I have no idea what you just said. It looks like English, but it
>> doesn't pass my interpreter.
>
> I said that your suggestion is freaking nonsense and you'd better
> think before saying anything that you don't have any clue about. Is
> that clear?
Do you show your asshole in public as much as you do here?

--
% Randy Yates                  % "With time with what you've learned,
%% Fuquay-Varina, NC           % they'll kiss the ground you walk
%%% 919-577-9882               % upon."
%%%% <yates@ieee.org>          % '21st Century Man', *Time*, ELO
http://home.earthlink.net/~yatescr

Randy Yates wrote:


>> I said that your suggestion is freaking nonsense and you'd better
>> think before saying anything that you don't have any clue about. Is
>> that clear?
>
> Do you show your asshole in public as much as you do here?
But what I said is true, isn't it?

VLV
Vladimir Vassilevsky <antispam_bogus@hotmail.com> writes:

> Randy Yates wrote:
>
>>> I said that your suggestion is freaking nonsense and you'd better
>>> think before saying anything that you don't have any clue about. Is
>>> that clear?
>>
>> Do you show your asshole in public as much as you do here?
>
> But what I said is true, isn't it?
What is nonsense about it?

If you look at the datasheet for the part howy mentioned, you'll see that
their main concern is that the PLL bandwidth isn't so high that it starts
to track your data. If you set the bandwidth low enough, then a string of
20 zeros shouldn't bother it. If a string of greater than 20 zeros happens
once a month, then so fricking what?

What I'm assuming is that an occasional bit error is acceptable. If it
isn't, then this won't work. The fact that howy mentioned multipath implies
that either the occasional bit error isn't a problem, or there's going to
be some sort of FEC.

Tell me where you think this reasoning is flawed.

--
% Randy Yates                  % "I met someone who looks alot like you,
%% Fuquay-Varina, NC           % she does the things you do,
%%% 919-577-9882               % but she is an IBM."
%%%% <yates@ieee.org>          % 'Yours Truly, 2095', *Time*, ELO
http://home.earthlink.net/~yatescr
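To actually run the numbers on "once a month": for independent, equiprobable bits the expected number of bits until the first run of k consecutive zeros is 2^(k+1) - 2, so the average interval between such runs depends only on k and the bitrate. A rough sketch in Python (the bitrate below is a placeholder, not a figure from the thread):

    def mean_bits_to_run(k, p_zero=0.5):
        """Expected bits until the first run of k consecutive zeros
        (standard result for runs of independent Bernoulli trials)."""
        q = p_zero
        return (1.0 - q ** k) / ((1.0 - q) * q ** k)

    BITRATE = 9600  # bit/s -- placeholder value, not from the thread
    for k in (18, 19, 20, 25, 30):
        bits = mean_bits_to_run(k)
        print(f"run of {k:2d} zeros: on average every ~{bits:.3g} bits "
              f"(~{bits / BITRATE:.3g} s at {BITRATE} bit/s)")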

Randy Yates wrote:


> If you look at the datasheet for the part howy mentioned,
First, this part does not look like a good choice for a high-performance radio link. I'd say there is no point in bothering with multipath, FEC and such.
> you'll see
> that their main concern is that the PLL bandwidth isn't so high that
> it starts to track your data.
The PLL on the transmit side acts as a 2nd-order highpass filter on the modulating signal. That's why the DC is going to be lost.
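To illustrate that effect, the sketch below models the PLL's modulation response as a 2nd-order Butterworth highpass (the filter type, cutoff values and bitrate are illustrative assumptions; numpy and scipy are assumed available) and feeds it an NRZ stream that ends in a long run of identical bits:

    import numpy as np
    from scipy.signal import butter, sosfilt

    bitrate = 9600                  # bit/s, illustrative value
    samples_per_bit = 16
    fs = bitrate * samples_per_bit  # simulation sample rate

    # NRZ stream: 200 random bits followed by 50 identical '1' bits (level +1)
    rng = np.random.default_rng(0)
    bits = np.concatenate([rng.integers(0, 2, 200), np.ones(50, dtype=int)])
    nrz = np.repeat(2 * bits - 1, samples_per_bit).astype(float)

    # Model of the PLL's effect on the modulating signal: 2nd-order highpass
    for cutoff in (bitrate / 10, bitrate / 1000):
        sos = butter(2, cutoff / (fs / 2), btype="highpass", output="sos")
        y = sosfilt(sos, nrz)
        # Level at the end of the 50-bit run (ideally it would still be +1.0)
        print(f"cutoff = bitrate/{bitrate / cutoff:.0f}: "
              f"level after the run = {y[-1]:+.3f}")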
> If you set the bandwidth low enough,
> then a string of 20 zeros shouldn't bother it. If a string of greater
> than 20 zeros happens once a month, then so fricking what?
Not so fast, Randy. For a random bitstream decoded by a comparator, the
highpassing is equivalent to a loss of SNR. If you want to keep it under
3 dB, the cutoff frequency of the PLL should be ~1000 times lower than the
bitrate. Therefore you may want a specially designed signal with the
spectrum limited from the low side, rather than just a random signal.

BTW, this topic is closely related to data storage systems and the RLL
coding that they use.

Vladimir Vassilevsky
DSP and Mixed Signal Design Consultant
http://www.abvolt.com
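On the "spectrum limited from the low side" point: the simplest example of a DC-free, run-limited line code is the Manchester encoding mentioned in the subject line, at the cost of doubling the symbol rate. A minimal sketch of an encoder and decoder, assuming the IEEE 802.3 convention:

    def manchester_encode(bits):
        """IEEE 802.3 convention: 0 -> (1, 0), 1 -> (0, 1).  Every data bit
        yields one '1' chip and one '0' chip, so the output is DC-balanced
        and never contains more than two identical chips in a row."""
        chips = []
        for b in bits:
            chips.extend((0, 1) if b else (1, 0))
        return chips

    def manchester_decode(chips):
        """Inverse of manchester_encode; assumes chip alignment is known."""
        return [1 if chips[i] < chips[i + 1] else 0 for i in range(0, len(chips), 2)]

    data = [1, 1, 1, 1, 0, 0, 0, 0, 1, 0]
    chips = manchester_encode(data)
    print("encoded:", chips)
    print("ones minus zeros:", 2 * sum(chips) - len(chips))   # always 0
    assert manchester_decode(chips) == data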