Entropy of a Probability Distribution

The entropy of a probability density function (PDF) $p(x)$ is defined as [48]

$\displaystyle h(p) \isdef \int_x p(x) \cdot \lg\left[\frac{1}{p(x)}\right] dx$ (D.29)

where $\lg$ denotes the logarithm base 2. The entropy of $p(x)$ can be interpreted as the average number of bits needed to specify values of the random variable $x$ drawn according to $p(x)$:

$\displaystyle h(p) = {\cal E}_p\left\{\lg \left[\frac{1}{p(x)}\right]\right\}$ (D.30)
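As a sanity check, the definition (D.29) can be approximated numerically. The sketch below (assuming NumPy; the helper name `entropy_bits` is illustrative, not from the text) integrates $p(x)\,\lg[1/p(x)]$ on a grid for a uniform density on $[0,2]$, whose exact entropy is $\lg 2 = 1$ bit:

```python
import numpy as np

def entropy_bits(pdf, a, b, n=100_001):
    """Approximate h(p) = integral of p(x) * lg(1/p(x)) dx over [a, b]."""
    x = np.linspace(a, b, n)
    p = pdf(x)
    # p * lg(1/p) -> 0 as p -> 0, so zero-density points contribute nothing.
    integrand = np.where(p > 0, -p * np.log2(np.where(p > 0, p, 1.0)), 0.0)
    # Trapezoidal rule over the grid.
    return np.sum(0.5 * (integrand[:-1] + integrand[1:]) * np.diff(x))

# Uniform density on [0, 2]: p(x) = 1/2, so h(p) = lg(2) = 1 bit.
h_uniform = entropy_bits(lambda x: np.full_like(x, 0.5), 0.0, 2.0)
```

For the uniform density the integrand is constant, so the trapezoidal rule recovers the 1-bit answer essentially exactly.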

The term $\lg[1/p(x)]$ can be viewed as the number of bits that should be assigned to the value $x$. (The most common values of $x$ should be assigned the fewest bits, while rare values can be assigned many bits.)
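The expectation form (D.30) also suggests a Monte Carlo estimate: draw samples according to $p$ and average $\lg[1/p(x)]$ over them. A minimal sketch (assuming NumPy; the Gaussian example and sample count are choices made here, not from the text) for a unit-variance Gaussian, whose exact entropy is $\frac{1}{2}\lg(2\pi e) \approx 2.05$ bits:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 1.0
x = rng.normal(0.0, sigma, 1_000_000)        # samples drawn according to p(x)
p = np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
h_mc = np.mean(np.log2(1.0 / p))             # estimate of E_p{ lg[1/p(x)] }
h_exact = 0.5 * np.log2(2 * np.pi * np.e * sigma**2)
```

Note how each sample contributes $\lg[1/p(x)]$ bits: samples near the mode, where $p(x)$ is large, contribute few bits, while samples in the tails contribute many.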

