Exponential Distribution

Among probability distributions $ p(x)$ that are nonzero over a semi-infinite range of values $ x\in[0,\infty)$ and have a finite mean $ \mu$ , the exponential distribution has maximum entropy.

To the previous case, we add the new constraint

$\displaystyle \int_{-\infty}^\infty x\,p(x)\,dx = \mu < \infty$ (D.39)

resulting in the objective function

\begin{eqnarray*}
J(p) &\isdef & -\int_0^\infty p(x) \, \ln p(x)\,dx
+ \lambda_0\left(\int_0^\infty p(x)\,dx - 1\right)\\
& & + \lambda_1\left(\int_0^\infty x\,p(x)\,dx - \mu\right).
\end{eqnarray*}

Now the partials with respect to $ p(x)$ are

\begin{eqnarray*}
\frac{\partial}{\partial p(x)} J(p) &=& - \ln p(x) - 1 + \lambda_0 + \lambda_1 x\\
\frac{\partial^2}{\partial p(x)^2} J(p) &=& - \frac{1}{p(x)}
\end{eqnarray*}

Setting the first partial derivative to zero shows that $ p(x)$ is of the form $ p(x) = e^{(\lambda_0-1)+\lambda_1 x}$ (a maximum, since the second partial derivative is negative). The unit-area and finite-mean constraints then give $ \exp(\lambda_0-1) = 1/\mu$ and $ \lambda_1=-1/\mu$ , yielding

$\displaystyle p(x) = \left\{\begin{array}{ll} \frac{1}{\mu} e^{-x/\mu}, & x\geq 0 \\ [5pt] 0, & \hbox{otherwise}. \\ \end{array} \right.$ (D.40)
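For completeness, one way to verify the stated multiplier values (a sketch, assuming $ \lambda_1<0$ so that both integrals converge) is to substitute the form $ p(x) = e^{(\lambda_0-1)+\lambda_1 x}$ into the two constraints:

\begin{eqnarray*}
\int_0^\infty e^{(\lambda_0-1)+\lambda_1 x}\,dx &=& -\frac{e^{\lambda_0-1}}{\lambda_1} \;=\; 1\\
\int_0^\infty x\,e^{(\lambda_0-1)+\lambda_1 x}\,dx &=& \frac{e^{\lambda_0-1}}{\lambda_1^2} \;=\; \mu.
\end{eqnarray*}

Dividing the second equation by the first gives $ -1/\lambda_1 = \mu$ , so $ \lambda_1 = -1/\mu$ , and the first equation then gives $ e^{\lambda_0-1} = -\lambda_1 = 1/\mu$ , as claimed.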
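As an informal numerical check (not part of the derivation; it assumes SciPy is available, and the comparison densities below are arbitrary choices), the following sketch compares the differential entropies of several nonnegative densities sharing the same mean $ \mu$ . The exponential entry should be the largest and should equal the closed-form value $ 1+\ln\mu$ .

import numpy as np
from scipy import stats

mu = 2.0  # common mean for all candidate densities (arbitrary choice)

# Exponential with mean mu; its differential entropy is 1 + ln(mu).
exp_entropy = stats.expon(scale=mu).entropy()

# Other nonnegative densities scaled so that each has mean mu.
candidates = {
    "exponential":      exp_entropy,
    "gamma (shape 2)":  stats.gamma(a=2.0, scale=mu / 2.0).entropy(),
    "half-normal":      stats.halfnorm(scale=mu * np.sqrt(np.pi / 2)).entropy(),
    "uniform [0, 2mu]": stats.uniform(loc=0.0, scale=2.0 * mu).entropy(),
}

print(f"closed form 1 + ln(mu) = {1.0 + np.log(mu):.6f}")
for name, h in candidates.items():
    print(f"{name:18s} differential entropy = {float(h):.6f}")

# The exponential should have the largest entropy among densities on
# [0, infinity) with the given mean, as derived above.
assert all(exp_entropy >= float(h) for h in candidates.values())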

