
Gaussian Distribution

The Gaussian distribution has maximum entropy relative to all probability distributions covering the entire real line $ x\in(-\infty,\infty)$ but having a finite mean $ \mu$ and finite variance $ \sigma^2$ .
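To illustrate this claim numerically (an added sketch, not part of the original text), the following Python snippet compares the differential entropy of a Gaussian against a Laplace and a uniform distribution, all scaled to the same variance $\sigma^2 = 1$; for continuous distributions, scipy.stats' .entropy() returns the differential entropy:

```python
import numpy as np
from scipy.stats import norm, laplace, uniform

sigma = 1.0  # common standard deviation for all three distributions

# Scale each distribution so its variance equals sigma**2:
#   Laplace(scale=b) has variance 2*b**2   ->  b = sigma/sqrt(2)
#   Uniform of width w has variance w**2/12 ->  w = 2*sqrt(3)*sigma
dists = {
    "gaussian": norm(loc=0.0, scale=sigma),
    "laplace":  laplace(loc=0.0, scale=sigma / np.sqrt(2)),
    "uniform":  uniform(loc=-sigma * np.sqrt(3), scale=2 * sigma * np.sqrt(3)),
}

for name, d in dists.items():
    # .entropy() returns differential entropy for continuous distributions
    print(f"{name:8s} variance={d.var():.3f}  entropy={d.entropy():.4f}")

# Expected: gaussian ~ 1.4189 (= 0.5*ln(2*pi*e)), laplace ~ 1.3466, uniform ~ 1.2425
```

The Gaussian value, $\frac{1}{2}\ln(2\pi e\sigma^2)$, exceeds that of every other fixed-variance distribution tried.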

Proceeding as before, we obtain the objective function

\begin{eqnarray*}
J(p) &\isdef & -\int_{-\infty}^\infty p(x) \, \ln p(x)\,dx
+ \lambda_0\left(\int_{-\infty}^\infty p(x)\,dx - 1\right)\\
&+& \lambda_1\left(\int_{-\infty}^\infty x\,p(x)\,dx - \mu\right)
+ \lambda_2\left(\int_{-\infty}^\infty x^2\,p(x)\,dx - \sigma^2\right)
\end{eqnarray*}
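The constrained maximization can also be checked numerically (an illustrative sketch; the grid, optimizer, and tolerances below are my own choices, not from the text): discretize $p(x)$ on a grid, maximize the discretized entropy subject to the three constraints, and compare the result with the Gaussian density.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

mu, sigma = 0.0, 1.0
x = np.linspace(-5.0, 5.0, 51)       # quadrature grid (illustrative choice)
dx = x[1] - x[0]

# Discretized objective: minimize the negative of the entropy integral
def neg_entropy(p):
    return np.sum(p * np.log(np.maximum(p, 1e-12))) * dx

def neg_entropy_grad(p):
    return (np.log(np.maximum(p, 1e-12)) + 1.0) * dx

constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) * dx - 1.0},              # unit area
    {"type": "eq", "fun": lambda p: np.sum(x * p) * dx - mu},           # mean
    {"type": "eq", "fun": lambda p: np.sum(x**2 * p) * dx - sigma**2},  # 2nd moment
]

p0 = np.full_like(x, 0.1)            # flat initial guess
res = minimize(neg_entropy, p0, jac=neg_entropy_grad, method="SLSQP",
               bounds=[(1e-12, None)] * len(x), constraints=constraints,
               options={"maxiter": 500})

# The maximizer should track the zero-mean Gaussian up to discretization error
print(np.max(np.abs(res.x - norm.pdf(x, mu, sigma))))
```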

The partial derivatives of $J(p)$ with respect to $p(x)$ are

\begin{eqnarray*}
\frac{\partial}{\partial p(x)\,dx} J(p) &=& - \ln p(x) - 1 + \lambda_0 + \lambda_1 x + \lambda_2 x^2\\
\frac{\partial^2}{\partial p(x)^2\,dx} J(p) &=& - \frac{1}{p(x)}
\end{eqnarray*}
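Since the entropy integrand acts pointwise on $p(x)$, these functional derivatives amount to ordinary derivatives of the integrand with respect to the value of $p$ at a fixed $x$ (the constant terms in the constraints do not depend on $p$ and drop out). A minimal symbolic check with sympy (an added sketch):

```python
import sympy as sp

x, l0, l1, l2 = sp.symbols("x lambda_0 lambda_1 lambda_2", real=True)
p = sp.symbols("p", positive=True)   # stands for the value p(x) at a fixed x

# Integrand of J(p): entropy term plus the p-dependent constraint terms
integrand = -p * sp.log(p) + l0 * p + l1 * x * p + l2 * x**2 * p

dJ = sp.diff(integrand, p)           # first derivative
print(dJ)                            # -log(p) - 1 + lambda_0 + lambda_1*x + lambda_2*x**2
print(sp.diff(integrand, p, 2))      # second derivative: -1/p
print(sp.solve(sp.Eq(dJ, 0), p))     # [exp(lambda_0 + lambda_1*x + lambda_2*x**2 - 1)]
```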

Since the second derivative is everywhere negative, setting the first derivative to zero yields the entropy maximum. Solving for $p(x)$ and determining the multipliers from the three constraints (written here for the zero-mean case $\mu = 0$, in which the second moment equals the variance) leads to

$\displaystyle p(x) = e^{(\lambda_0-1)+\lambda_1 x + \lambda_2 x^2} = \frac{1}{\sqrt{2\pi\sigma^2}} e^{-\frac{x^2}{2\sigma^2}}.$ (D.41)
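For completeness, the multipliers can be determined by completing the square (a short added derivation, not in the original): normalizability requires $\lambda_2 < 0$, and

$\displaystyle \lambda_1 x + \lambda_2 x^2 = \lambda_2\left(x + \frac{\lambda_1}{2\lambda_2}\right)^2 - \frac{\lambda_1^2}{4\lambda_2},$

so that $p(x)$ is a Gaussian with mean $-\lambda_1/(2\lambda_2)$ and variance $-1/(2\lambda_2)$. The mean constraint (with $\mu=0$) then forces $\lambda_1 = 0$, the second-moment constraint gives $\lambda_2 = -1/(2\sigma^2)$, and unit normalization gives $e^{\lambda_0-1} = 1/\sqrt{2\pi\sigma^2}$, reproducing Eq. (D.41).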

For more on entropy and maximum-entropy distributions, see [48].

