### Example: Random Bit String

Consider a random sequence of 1s and 0s, *i.e.*, the probability of a 0 or
a 1 is always $1/2$. The corresponding probability density function
is

$$
p(x) = \frac{1}{2}, \qquad x \in \{0, 1\} \tag{D.31}
$$

and the entropy is

$$
H(p) = -\sum_{x=0}^{1} p(x)\log_2 p(x)
     = -\left[\frac{1}{2}\log_2\frac{1}{2} + \frac{1}{2}\log_2\frac{1}{2}\right]
     = 1 \text{ bit} \tag{D.32}
$$

Thus, 1 bit is required for each bit of the sequence. In other words, the sequence cannot be compressed. There is no redundancy.
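This can be checked numerically; the sketch below uses a helper `binary_entropy` of our own (not part of the text) that evaluates the sum in Eq.(D.32):

```python
import math

def binary_entropy(p):
    """Entropy in bits of a binary source emitting a 1 with probability p."""
    h = 0.0
    for q in (p, 1.0 - p):
        # Skip zero-probability terms: 0 * log2(0) is taken as 0 (its limit).
        if q > 0.0:
            h -= q * math.log2(q)
    return h

# A fair bit stream needs exactly 1 bit per symbol -- no redundancy to remove.
print(binary_entropy(0.5))  # 1.0
```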

If instead the probability of a 0 is $1/4$ and that of a 1 is $3/4$, we get

$$
H(p) = -\left[\frac{1}{4}\log_2\frac{1}{4} + \frac{3}{4}\log_2\frac{3}{4}\right]
     = \frac{1}{2} + \frac{3}{4}\log_2\frac{4}{3}
     \approx 0.811 \text{ bits},
$$

and the sequence can be compressed by about 19%.

In the degenerate case for which the probability of a 0 is 0 and that of a 1 is 1, we get

$$
H(p) = -\left[0\cdot\log_2 0 + 1\cdot\log_2 1\right] = 0,
$$

where $0\cdot\log_2 0$ is taken to be $0$ (its limiting value).
Thus, the entropy is 0 when the sequence is perfectly predictable.
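The biased and degenerate cases can be verified the same way; as above, `binary_entropy` is a helper of our own, not from the text:

```python
import math

def binary_entropy(p):
    """Entropy in bits of a binary source emitting a 1 with probability p."""
    h = 0.0
    for q in (p, 1.0 - p):
        # Zero-probability terms are skipped: 0 * log2(0) -> 0 in the limit.
        if q > 0.0:
            h -= q * math.log2(q)
    return h

print(binary_entropy(0.75))  # ~0.811 bits/symbol: about 19% compressible
print(binary_entropy(1.0))   # 0.0: a constant sequence carries no information
```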
