Entropy of a Probability Distribution
The entropy of a probability density function (PDF) $p(x)$ is defined as [48]
$$
h(p) \triangleq \int_{-\infty}^{\infty} p(x) \lg\frac{1}{p(x)}\, dx
\tag{D.29}
$$
where $\lg$ denotes the logarithm base 2. The entropy of $p$ can be interpreted as the average number of bits needed to specify random variables $x$ drawn at random according to $p(x)$:
$$
h(p) = \mathcal{E}_p\!\left\{\lg\frac{1}{p(x)}\right\}.
\tag{D.30}
$$
The term $\lg[1/p(x)]$ can be viewed as the number of bits which should be assigned to the value $x$. (The most common values of $x$ should be assigned the fewest bits, while rare values can be assigned many bits.)
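As an illustrative sketch (not part of the original text), the following Python snippet computes the entropy in bits of a discrete distribution by summing $p_i \lg(1/p_i)$ over its outcomes; the function name `entropy_bits` and the sample distributions are assumptions chosen for illustration.

```python
import numpy as np

def entropy_bits(p):
    """Entropy in bits of a discrete probability distribution p.

    Terms with p_i = 0 contribute nothing, since x * lg(1/x) -> 0 as x -> 0.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # drop zero-probability outcomes
    return float(np.sum(p * np.log2(1.0 / p)))

# A uniform distribution over 4 outcomes needs lg(4) = 2 bits per draw:
print(entropy_bits([0.25, 0.25, 0.25, 0.25]))   # 2.0

# A biased coin takes fewer bits on average to describe than a fair coin:
print(entropy_bits([0.9, 0.1]))                 # about 0.469 bits
print(entropy_bits([0.5, 0.5]))                 # 1.0 bit
```

Note how the uniform case maximizes the entropy for a given number of outcomes, consistent with the remark that common values get short codes and rare values get long ones.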