DSPRelated.com
Forums

Coding for adjusting entropy

Started by ylboy July 31, 2003
Dear all,

Suppose that, in a DSP system, the entropy of the input X affects the
performance of the system.
If X and Y are the input and output of a coding scheme respectively, what kind
of coding scheme can set the entropy of Y to whatever value I wish?

Thanks!!


On Fri, 1 Aug 2003, ylboy wrote:

> Dear all,
>
> Suppose that, in a DSP system, entropy of input X will affect the
> performance of system.
> If X and Y are the input and output of coding scheme respectively, what kind
> of coding scheme can make the entropy of Y as wish.
>
> Thanks!!
you can just simply ignore X completely, and make Y a random variable or
process with whatever entropy or entropy rate you wish. i think i'm
missing something in your question though! could you explain in more
detail what you want?

julius

--
The most rigorous proofs will be shown by vigorous handwaving.
http://www.mit.edu/~kusuma
opinion of author is not necessarily of the institute
Julius Kusuma <kusuma@mit.edu> wrote in message news:<Pine.GSO.4.33L.0308011044110.20071-100000@cathedral-seven.mit.edu>...
> On Fri, 1 Aug 2003, ylboy wrote:
>
> > Suppose that, in a DSP system, entropy of input X will affect the
> > performance of system.
> > If X and Y are the input and output of coding scheme respectively, what kind
> > of coding scheme can make the entropy of Y as wish.
>
> you can just simply ignore X completely, and make Y a random variable or
> process with whatever entropy or entropy rate you wish. i think i'm
> missing something in your question though! could you explain in more
> detail what you want?
Yes, I think I didn't describe my question clearly enough.

Suppose a binary message X is the input of a transmission system, and the
entropy of X, H(X), affects the error rate of transmission. When the entropy
of X is 0.5, we can get the lowest error rate. I would like to use a coding
method to encode X into Y, whose entropy is nearly 0.5, and let Y be the
input of the transmission system to decrease the error rate. The receiver
certainly knows the coding method and can thus decode the received signal.
My question is what kind of coding method I can use. By the way, the lengths
of X and Y should be the same.

Thank you
> julius
"ylboy" <ylboy.tw@yahoo.com.tw> wrote in message
news:72339006.0308040212.6b643634@posting.google.com...
> Suppose a binary message X is the input of a transmission system, and
> the entropy of X, H(X), affects the error rate of transmission. When
> the entropy of X is 0.5, we can get the lowest error rate. I would
> like to use a coding method to encode X into Y, whose entropy is
> nearly 0.5, and let Y be the input of the transmission system to
> decrease the error rate. The receiver certainly knows the coding
> method and thus decodes the received signal. My question is what kind
> of coding method I can use. By the way, the lengths of X and Y should
> be the same.
If you mean what you say, then: presuming H(X) < 0.5, XOR every Nth bit of X
with pseudo-random bits to increase its apparent entropy.

But I think you're asking the wrong question. The best encoding scheme for
any given channel depends on more than its capacity. It seems you want to
know about channel coding, and you should probably look here:

http://www.cs.ucl.ac.uk/staff/S.Bhatti/D51-notes/node31.html

and here:

http://www.stanford.edu/~vjsriniv/project/channel_capacity_7.htm

And probably in some actual books about information theory.
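Matt's suggestion can be sketched in a few lines. This is a minimal
illustration, not a production scrambler: it XORs every bit (N = 1) of the
message with pseudo-random bits from a seeded PRNG, and the `scramble`
function name and seed value are hypothetical choices. Because XOR is its own
inverse, the receiver recovers X by running the same operation with the same
seed, so the lengths of X and Y stay equal.

```python
import random

def scramble(bits, seed=12345):
    """XOR each bit with a pseudo-random bit from a PRNG seeded with `seed`.

    Running the same function again with the same seed undoes the
    scrambling, since b ^ k ^ k == b for any keystream bit k."""
    rng = random.Random(seed)
    return [b ^ rng.randint(0, 1) for b in bits]

# A heavily biased message: 90% zeros, so H(X) is well below 1 bit/symbol.
message = [0] * 90 + [1] * 10
random.shuffle(message)

sent = scramble(message)        # each output bit is 1 with probability 1/2
received = scramble(sent)       # descrambling = scrambling again
assert received == message
```

The statistics of `sent` no longer depend on the bias of `message`: each
transmitted bit is equally likely to be 0 or 1 regardless of the input.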
"Matt Timmermans" <mt0000@sympatico.nospam-remove.ca> wrote in message news:<matXa.1337$pq5.214986@news20.bellglobal.com>...

> If you mean what you say, then: presuming H(X) < 0.5, XOR every Nth
> bit of X with pseudo-random bits to increase its apparent entropy.
>
> But I think you're asking the wrong question. The best encoding scheme
> for any given channel depends on more than its capacity. It seems you
> want to know about channel coding [...] And probably in some actual
> books about information theory.
Thank you for your reply.

Would you please explain clearly why XORing every Nth bit of X with a random
bit will increase its entropy?

I did indeed ask an unclear question. Let me try again. I designed a signal
processing system whose input is a binary message. The system processes the
input bit by bit. After simulation, I found the performance is best when the
input message X is distributed evenly, i.e. half of X is '0' and half of X
is '1' (H(X) = 1). Therefore, I think I need a pre-processing stage when
H(X) < 1. In my idea, this pre-processing stage is something like an
encoding scheme. And my question is: what encoding scheme is proper?
ylboy wrote:
> ...
>
> Thank you for your reply.
> Would you please explain clearly about why XOR every Nth bit of X with
> random bit will increase its entropy.
Entropy is a measure of randomness. Scrambling the bits promotes randomness.
Mat was trying to point out that intentionally adding errors to improve the
statistics doesn't achieve what you want, even though your words implied that
it would.

Jerry
--
Engineering is the art of making what you want from things you can get.
On 4 Aug 2003, ylboy wrote:

> Thank you for your reply.
> Would you please explain clearly about why XOR every Nth bit of X with
> random bit will increase its entropy.
assume that your original signal is memoryless and identically distributed.
let Pr(X_t = 1) = p, where X_t is binary. then what is the entropy of X,
H(X) = ?

now consider Y_t = X_t + Z_t (addition mod 2, i.e. XOR). let Z_t also be
memoryless, identically distributed, and binary, with Pr(Z_t = 1) = q.

what is Pr(Y_t = 1)? is Y_t memoryless? is it identically distributed?
what is the entropy H(Y) = ?
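Julius's questions can be worked through numerically. In this sketch the
values of p and q are arbitrary examples; `h` is the standard binary entropy
function, and the key step is that for independent bits,
Pr(Y_t = 1) = p(1 - q) + (1 - p)q.

```python
from math import log2

def h(p):
    """Binary entropy H(p) in bits; H(0) = H(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

p = 0.1   # Pr(X_t = 1): a biased, low-entropy source (example value)
q = 0.5   # Pr(Z_t = 1): a fair-coin keystream (example value)

# Y_t = X_t XOR Z_t is 1 exactly when the two bits differ:
r = p * (1 - q) + (1 - p) * q   # Pr(Y_t = 1)

print(h(p))   # H(X): well below 1 bit
print(h(r))   # H(Y): with q = 0.5, r = 0.5 and H(Y) = 1 bit exactly
```

Y_t inherits memorylessness and identical distribution from X_t and Z_t
(they are independent i.i.d. processes), which is the point of the exercise:
scrambling with a fair keystream forces H(Y) to 1 bit per symbol no matter
how biased X is.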
> I indeed ask an unclear question. Let me try it again.
> I designed a signal processing system of which input is a binary
> message. The system processes the input bit-by-bit. After simulation,
> I found the performance is best when the input message X is
> distributed evenly, i.e. half of X is '0' and half of X is '1'
> (H(x)=1). Therefore, I think I need a pre-process system when H(x)<1.
> In my idea, I think this pre-process system is something like encoding
> scheme. And, my question is what encoding scheme is proper?
--
The most rigorous proofs will be shown by vigorous handwaving.
http://www.mit.edu/~kusuma
opinion of author is not necessarily of the institute
Jerry Avins wrote:
> > ... Mat was trying ...
Sorry: Matt.

Jerry
--
Engineering is the art of making what you want from things you can get.
On 4 Aug 2003 03:12:06 -0700, ylboy.tw@yahoo.com.tw (ylboy) wrote:

> Julius Kusuma <kusuma@mit.edu> wrote:
> > could you explain in more detail what you want?
>
> Yes. I think I didn't describe my question clearly enough.
>
> Suppose a binary message X is the input of a transmission system, and
> the entropy of X, H(X), affects the error rate of transmission. When
> the entropy of X is 0.5, we can get the lowest error rate. I would
> like to use a coding method to encode X into Y, whose entropy is
> nearly 0.5, and let Y be the input of the transmission system to
> decrease the error rate. The receiver certainly knows the coding
> method and thus decodes the received signal. My question is what kind
> of coding method I can use. By the way, the lengths of X and Y should
> be the same.
Your terminology is not quite right. What you're asking about is a scrambling
system, not a coding system. If the lengths of X and Y are the same and you
want high entropy on the output, this is an excellent description of many
generally available scrambling systems. Matt alluded to a common
implementation in his reply without naming it.

Eric Jacobsen
Minister of Algorithms, Intel Corp.
My opinions may not be Intel's opinions.
http://www.ericjacobsen.org
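One common family of "generally available scrambling systems" is the additive
LFSR scrambler. The sketch below is illustrative only: the 7-bit polynomial
x^7 + x^4 + 1 and the all-ones initial state are example choices (that
polynomial appears in some real standards, but check the spec for your
system), and the function name is hypothetical. Like the earlier XOR scheme,
it is its own inverse when transmitter and receiver start from the same
state.

```python
def lfsr_scramble(bits, state=0b1111111):
    """Additive scrambler: XOR the data with the keystream of a 7-bit
    LFSR with feedback polynomial x^7 + x^4 + 1.

    Descrambling is the identical operation with the same initial state."""
    out = []
    for b in bits:
        # Feedback taps correspond to the x^7 and x^4 terms.
        fb = ((state >> 6) ^ (state >> 3)) & 1
        state = ((state << 1) | fb) & 0x7F   # shift, keep 7 bits
        out.append(b ^ fb)
    return out

msg = [0, 0, 0, 0, 1, 0, 0, 0, 0, 0]      # biased input
tx = lfsr_scramble(msg)                   # whitened bit stream
assert lfsr_scramble(tx) == msg           # receiver recovers the message
```

The output length equals the input length, matching the constraint in the
original question, and the keystream repeats only every 127 bits.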
Thank you all for your kind help. Now I know how to solve my problem.

Suppose Y = X XOR Z, with Z independent of X and H(Z) = 1; then H(Y) = 1.
Because

Pr(Y=0) = Pr(X=0)Pr(Z=0) + Pr(X=1)Pr(Z=1)
        = Pr(X=0)*0.5 + Pr(X=1)*0.5
        = 0.5  --> H(Y) = 1

By the way, if H(Z) is not 1, how do we express H(Y) in terms of H(X) and H(Z)?
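A numerical sketch of the general case, following the derivation above.
Strictly speaking, H(Y) is not a function of H(X) and H(Z) alone (an entropy
value does not tell you which way a bit is biased); it is determined by the
underlying probabilities p = Pr(X=1) and q = Pr(Z=1) through
Pr(Y=1) = p(1-q) + (1-p)q. The function names here are illustrative.

```python
from math import log2

def h(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def h_y(p, q):
    """Entropy of Y = X XOR Z for independent Bernoulli(p), Bernoulli(q)."""
    r = p * (1 - q) + (1 - p) * q   # Pr(Y = 1)
    return h(r)

print(h_y(0.1, 0.5))   # 1.0 -- q = 0.5 forces H(Y) = 1, as derived above
print(h_y(0.1, 0.2))   # general case: depends on p and q themselves
```

Two sanity checks: q = 0.5 gives H(Y) = 1 regardless of p (the derivation in
the previous message), and q = 0 gives Y = X, so H(Y) = H(X).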