Uniform Distribution

Among probability distributions $ p(x)$ which are nonzero over a finite range of values $ x\in[a,b]$ , the maximum-entropy distribution is the uniform distribution. To show this, we must maximize the entropy,

$\displaystyle H(p) \isdef -\int_a^b p(x)\, \ln p(x)\, dx$ (D.33)

(The natural logarithm is used here; changing the base of the logarithm only scales $ H(p)$ by a positive constant, so the maximizing $ p(x)$ is the same for any base.)

with respect to $ p(x)$ , subject to the constraints

$\displaystyle p(x) \;\geq\; 0, \qquad \int_a^b p(x)\,dx \;=\; 1.$
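As a numerical sanity check (a sketch; the interval $ [a,b]=[2,5]$ is an arbitrary illustrative choice), the natural-log entropy of the uniform density on $ [a,b]$ evaluates to $ \ln(b-a)$:

```python
import numpy as np

# Numerically approximate the (natural-log) differential entropy
#   H(p) = -integral_a^b p(x) ln p(x) dx
# for the uniform density p(x) = 1/(b-a); analytically it equals ln(b-a).
# The interval [2, 5] is an arbitrary illustrative choice.
a, b = 2.0, 5.0
x = np.linspace(a, b, 100001)
dx = x[1] - x[0]
p = np.full_like(x, 1.0 / (b - a))

H = -np.sum(p * np.log(p)) * dx   # Riemann-sum approximation
print(H, np.log(b - a))           # the two values agree to ~1e-5
```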

Using the method of Lagrange multipliers for optimization in the presence of constraints [86], we may form the objective function

$\displaystyle J(p) \isdef -\int_a^b p(x) \, \ln p(x) \,dx + \lambda_0\left(\int_a^b p(x)\,dx - 1\right)$ (D.34)

and differentiate with respect to $ p(x)$ (and renormalize by dropping the $ dx$ factor multiplying all terms) to obtain

$\displaystyle \frac{\partial}{\partial p(x)\,dx} J(p) = - \ln p(x) - 1 + \lambda_0.$ (D.35)

Setting this to zero and solving for $ p(x)$ gives

$\displaystyle p(x) = e^{\lambda_0-1}.$ (D.36)

(Setting the partial derivative with respect to $ \lambda_0$ to zero merely restates the constraint.)

Choosing $ \lambda_0$ to satisfy the unit-area constraint, i.e., $ \int_a^b e^{\lambda_0-1}\,dx = (b-a)\,e^{\lambda_0-1} = 1$, gives $ \lambda_0
= 1-\ln(b-a)$ , yielding

$\displaystyle p(x) = \left\{\begin{array}{ll} \frac{1}{b-a}, & a\leq x \leq b \\ [5pt] 0, & \hbox{otherwise}. \\ \end{array} \right.$ (D.37)
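A small numerical experiment (a sketch; the interval and the perturbation shape are arbitrary choices) illustrates that tilting the density away from $ 1/(b-a)$, while keeping it nonnegative and normalized, lowers the entropy:

```python
import numpy as np

# Compare the entropy of the uniform density on [a, b] against a
# normalized, nonnegative perturbation of it; the uniform density
# should have the larger entropy. The interval and the perturbation
# are arbitrary illustrative choices.
a, b = 0.0, 2.0
x = np.linspace(a, b, 100001)
dx = x[1] - x[0]

def entropy(p):
    """Riemann-sum approximation of -integral p ln p dx (p > 0 assumed)."""
    return -np.sum(p * np.log(p)) * dx

uniform = np.full_like(x, 1.0 / (b - a))
# Tilt by a strictly positive factor, then renormalize to unit area:
tilted = uniform * (1.0 + 0.3 * np.sin(np.pi * (x - a) / (b - a)))
tilted /= np.sum(tilted) * dx

print(entropy(uniform) > entropy(tilted))  # True
```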

That this solution is a maximum rather than a minimum or inflection point can be verified by checking that the second partial derivative is negative for all $ x$ :

$\displaystyle \frac{\partial^2}{\partial p(x)^2\,dx} J(p) = - \frac{1}{p(x)}.$ (D.38)

Since the solution automatically satisfies $ p(x)>0$ , the second partial derivative is negative everywhere, and the solution is a maximum.
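Equivalently, the entropy is a concave functional of $ p$ , so any stationary point satisfying the constraints is a global maximum. The defining inequality $ H(\alpha p + (1-\alpha)q) \geq \alpha H(p) + (1-\alpha)H(q)$ can be checked numerically; in this sketch the two densities on $ [0,1]$ are arbitrary illustrative choices:

```python
import numpy as np

# Check concavity of the entropy functional on a grid:
#   H(alpha*p + (1-alpha)*q) >= alpha*H(p) + (1-alpha)*H(q).
# The two densities on [0, 1] are arbitrary illustrative choices.
x = np.linspace(0.0, 1.0, 100001)
dx = x[1] - x[0]

def entropy(p):
    """Riemann-sum approximation of -integral p ln p dx (p > 0 assumed)."""
    return -np.sum(p * np.log(p)) * dx

p = np.full_like(x, 1.0)      # uniform density on [0, 1]
q = 2.0 * x + 1e-9            # (approximately) triangular density, unit area

for alpha in (0.25, 0.5, 0.75):
    mix = alpha * p + (1.0 - alpha) * q
    assert entropy(mix) >= alpha * entropy(p) + (1.0 - alpha) * entropy(q)
print("concavity holds for the sampled mixtures")
```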
