Gaussian Moments

Gaussian Mean

The mean of a distribution $ f(t)$ is defined as its first-order moment:

$\displaystyle \mu \isdef \int_{-\infty}^\infty t f(t)dt$ (D.42)

To show that the mean of the Gaussian distribution is $ \mu$ , we may write, letting $ g\isdef 1/\sqrt{2\pi\sigma^2}$ ,

\begin{eqnarray*}
\int_{-\infty}^\infty t f(t)\, dt &\isdef &
g \int_{-\infty}^\infty t\, e^{-\frac{(t-\mu)^2}{2\sigma^2}}\, dt\\
&=& g \int_{-\infty}^\infty (t+\mu)\, e^{-\frac{t^2}{2\sigma^2}}\, dt\\
&=& g \int_{-\infty}^\infty t\, e^{-\frac{t^2}{2\sigma^2}}\, dt + \mu\\
&=& \left. g\,(-\sigma^2)\, e^{-\frac{t^2}{2\sigma^2}} \right\vert_{-\infty}^{\infty} + \mu\\
&=& \mu,
\end{eqnarray*}

since the exponential vanishes at $\pm\infty$. (The second line substitutes $t\to t+\mu$, and the third uses the normalization $g\int_{-\infty}^\infty e^{-t^2/(2\sigma^2)}\,dt = 1$, i.e., the pdf integrates to one.)
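This first-moment calculation is easy to check numerically. Below is a minimal sketch in Python, assuming arbitrary example parameters $\mu=1.5$, $\sigma=2$ and a simple midpoint-rule integrator (all names here are illustrative, not from the text):

```python
import math

MU, SIGMA = 1.5, 2.0  # arbitrary example parameters (assumed for illustration)

def gauss_pdf(t):
    """Gaussian pdf f(t) with mean MU and standard deviation SIGMA."""
    return math.exp(-(t - MU) ** 2 / (2 * SIGMA ** 2)) / math.sqrt(2 * math.pi * SIGMA ** 2)

def integrate(f, a, b, n=50_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

# First-order moment (D.42); tails beyond 12 sigma are negligible.
mean = integrate(lambda t: t * gauss_pdf(t), MU - 12 * SIGMA, MU + 12 * SIGMA)
```

For a smooth, rapidly decaying integrand like this, the midpoint rule is extremely accurate, so `mean` should agree with `MU` to many digits.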

Gaussian Variance

The variance of a distribution $ f(t)$ is defined as its second central moment:

$\displaystyle \sigma^2 \isdef \int_{-\infty}^\infty (t-\mu)^2 f(t)dt$ (D.43)

where $ \mu$ is the mean of $ f(t)$ .

To show that the variance of the Gaussian distribution is $ \sigma^2$ , we write, letting $ g\isdef 1/\sqrt{2\pi\sigma^2}$ ,

\begin{eqnarray*}
\int_{-\infty}^\infty (t-\mu)^2 f(t)\, dt &\isdef &
g \int_{-\infty}^\infty (t-\mu)^2 e^{-\frac{(t-\mu)^2}{2\sigma^2}}\, dt\\
&=& g \int_{-\infty}^\infty \nu^2 e^{-\frac{\nu^2}{2\sigma^2}}\, d\nu\\
&=& g \int_{-\infty}^\infty \underbrace{\nu}_{u} \cdot \underbrace{\nu\, e^{-\frac{\nu^2}{2\sigma^2}}\, d\nu}_{dv}\\
&=& \left. g\, \nu\, (-\sigma^2) e^{-\frac{\nu^2}{2\sigma^2}} \right\vert_{-\infty}^{\infty}
+ \sigma^2\, g \int_{-\infty}^\infty e^{-\frac{\nu^2}{2\sigma^2}}\, d\nu\\
&=& 0 + \sigma^2 \cdot 1 \eqsp \sigma^2,
\end{eqnarray*}

where we used integration by parts, the fact that $ \nu f(\nu)\to 0$ as $ \left\vert\nu\right\vert\to\infty$ , and the normalization $g\int_{-\infty}^\infty e^{-\nu^2/(2\sigma^2)}\,d\nu = 1$.
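The second central moment can be checked the same way as the mean. A minimal Python sketch, again assuming example parameters $\mu=1.5$, $\sigma=2$ (so the variance should come out to $4$):

```python
import math

MU, SIGMA = 1.5, 2.0  # arbitrary example parameters (assumed for illustration)

def gauss_pdf(t):
    """Gaussian pdf f(t) with mean MU and standard deviation SIGMA."""
    return math.exp(-(t - MU) ** 2 / (2 * SIGMA ** 2)) / math.sqrt(2 * math.pi * SIGMA ** 2)

def integrate(f, a, b, n=50_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

# Second central moment (D.43): should recover SIGMA**2.
variance = integrate(lambda t: (t - MU) ** 2 * gauss_pdf(t),
                     MU - 12 * SIGMA, MU + 12 * SIGMA)
```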

Higher Order Moments Revisited

Theorem: The $ n$ th central moment of the Gaussian pdf $ p(x)$ with mean $ \mu$ and variance $ \sigma^2$ is given by

$\displaystyle m_n \isdef {\cal E}_p\{(x-\mu)^n\} = \left\{\begin{array}{ll} (n-1)!!\cdot\sigma^n, & \hbox{$n$ even} \\ [5pt] 0, & \hbox{$n$ odd} \end{array} \right. \protect$ (D.44)

where $ (n-1)!!$ denotes the product of all odd integers up to and including $ n-1$ (see ``double-factorial notation''). Thus, for example, $ m_2=\sigma^2$ , $ m_4=3\,\sigma^4$ , $ m_6=15\,\sigma^6$ , and $ m_8=105\,\sigma^8$ .
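The examples above ($m_2=\sigma^2$, $m_4=3\sigma^4$, $m_6=15\sigma^6$, $m_8=105\sigma^8$) can be verified by direct numerical integration. A minimal sketch, assuming a zero-mean Gaussian with example value $\sigma=2$:

```python
import math

SIGMA = 2.0  # arbitrary example value; zero mean (assumed for illustration)

def pdf(t):
    """Zero-mean Gaussian pdf with standard deviation SIGMA."""
    return math.exp(-t * t / (2 * SIGMA ** 2)) / math.sqrt(2 * math.pi * SIGMA ** 2)

def integrate(f, a, b, n=50_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

def double_factorial(m):
    """Product of m, m-2, m-4, ... down to 1 (or 2)."""
    out = 1
    while m > 1:
        out *= m
        m -= 2
    return out

L = 15 * SIGMA  # wide enough that truncated tails are negligible
moments = {n: integrate(lambda t, n=n: t ** n * pdf(t), -L, L) for n in (2, 4, 6, 8)}
predicted = {n: double_factorial(n - 1) * SIGMA ** n for n in (2, 4, 6, 8)}
```

With $\sigma=2$, `predicted` is $\{2: 4,\; 4: 48,\; 6: 960,\; 8: 26880\}$, and the numerical moments should match to high relative accuracy.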

Proof: The formula can be derived by successively differentiating the moment-generating function $ M(\alpha) = {\cal E}_p\{\exp(\alpha x)\}
= \exp(\mu \alpha + \sigma^2 \alpha^2 / 2)$ with respect to $ \alpha $ and evaluating at $ \alpha=0$, or by differentiating the Gaussian integral

$\displaystyle \int_{-\infty}^\infty e^{-\alpha x^2} dx = \sqrt{\frac{\pi}{\alpha}}$ (D.45)

successively with respect to $ \alpha $ [203, pp. 147-148]:

\begin{eqnarray*}
\int_{-\infty}^\infty (-x^2)\, e^{-\alpha x^2} dx &=& \sqrt{\pi}\left(-\frac{1}{2}\right)\alpha^{-3/2}\\
\int_{-\infty}^\infty (-x^2)^2\, e^{-\alpha x^2} dx &=& \sqrt{\pi}\left(-\frac{1}{2}\right)\left(-\frac{3}{2}\right)\alpha^{-5/2}\\
\vdots & & \vdots\\
\int_{-\infty}^\infty x^{2k}\, e^{-\alpha x^2} dx &=& \sqrt{\pi}\,\frac{(2k-1)!!}{2^k}\,\alpha^{-(2k+1)/2}
\end{eqnarray*}

for $ k=1,2,3,\ldots\,$ . Setting $ \alpha = 1/(2\sigma^2)$ and $ n=2k$ , and dividing both sides by $ \sigma\sqrt{2\pi}$ yields

$\displaystyle {\cal E}_p\{x^n\} \isdefs \frac{1}{\sigma\sqrt{2\pi}}\int_{-\infty}^\infty x^n e^{-\frac{x^2}{2\sigma^2}} dx \eqsp \zbox {\sigma^n \cdot (n-1)!!}$ (D.46)

for $ n=2,4,6,\ldots\,$ . Since the change of variable $ x
= \tilde{x}-\mu$ has no effect on the result, (D.44) also holds for $ \mu\ne0$ .
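The closed-form integral in the last line of the derivation can be spot-checked numerically for a few values of $k$. A minimal Python sketch, assuming an arbitrary example value $\alpha=0.7$:

```python
import math

ALPHA = 0.7  # arbitrary positive example value (assumed for illustration)

def integrate(f, a, b, n=50_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

def double_factorial(m):
    """Product of m, m-2, m-4, ... down to 1 (or 2)."""
    out = 1
    while m > 1:
        out *= m
        m -= 2
    return out

L = 25.0  # exp(-ALPHA * L**2) is utterly negligible at this range
results = {}
for k in (1, 2, 3):
    numeric = integrate(lambda x, k=k: x ** (2 * k) * math.exp(-ALPHA * x * x), -L, L)
    closed = (math.sqrt(math.pi) * double_factorial(2 * k - 1) / 2 ** k
              * ALPHA ** (-(2 * k + 1) / 2))
    results[k] = (numeric, closed)
```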

Moment Theorem

Theorem: For a random variable $ x$ ,

$\displaystyle {\cal E}\{x^n\} = \left.\frac{1}{j^n}\frac{d^n}{d\omega^n}\Phi(\omega)\right\vert _{\omega=0}$ (D.47)

where $ \Phi(\omega)$ is the characteristic function of the PDF $ p(x)$ of $ x$ :

$\displaystyle \Phi(\omega) \isdef {\cal E}_p\{ e^{j\omega x} \} = \int_{-\infty}^\infty p(x)e^{j\omega x}dx$ (D.48)

(Note that $ \Phi(\omega)$ is the complex conjugate of the Fourier transform of $ p(x)$ .)

Proof: [201, p. 157] Let $ m_i$ denote the $ i$ th moment of $ x$ , i.e.,

$\displaystyle m_i \isdef {\cal E}_p\{x^i\} \isdef \int_{-\infty}^\infty x^i p(x)dx$ (D.49)


\begin{eqnarray*}
\Phi(\omega) &=& \int_{-\infty}^\infty p(x)e^{j\omega x} dx \\
&=& \int_{-\infty}^\infty p(x) \left(1 + j\omega x + \cdots + \frac{(j\omega x)^n}{n!}+\cdots\right)dx\\
&=& 1 + j\omega m_1 + \frac{(j\omega)^2}{2} m_2 + \cdots + \frac{(j\omega)^n}{n!}m_n+\cdots
\end{eqnarray*}

where the term-by-term integration is valid when all moments $ m_i$ are finite. Differentiating $n$ times with respect to $\omega$ and setting $\omega=0$ isolates $j^n m_n$, which proves (D.47).
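The moment theorem can be illustrated numerically: build $\Phi(\omega)$ by integrating against the pdf, then estimate a derivative at $\omega=0$ by finite differences. A minimal sketch for the second moment, assuming a zero-mean, unit-variance Gaussian (so the imaginary part of $\Phi$ vanishes) and an assumed finite-difference step of $0.05$:

```python
import math

def pdf(x):
    """Zero-mean, unit-variance Gaussian pdf (example assumption)."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def integrate(f, a, b, n=50_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

def phi(w):
    """Characteristic function; imaginary part is zero for a zero-mean Gaussian."""
    return integrate(lambda x: pdf(x) * math.cos(w * x), -12.0, 12.0)

# (D.47) with n = 2: E{x^2} = (1/j^2) Phi''(0) = -Phi''(0),
# where Phi''(0) is estimated by a central finite difference.
h = 0.05
m2 = -(phi(h) - 2 * phi(0.0) + phi(-h)) / h ** 2
```

The finite-difference step trades truncation error against roundoff; with `h = 0.05` the estimate of $m_2$ agrees with $\sigma^2 = 1$ to roughly three decimal places.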

Gaussian Characteristic Function

Since the Gaussian PDF is

$\displaystyle p(t) \isdef \frac{1}{\sqrt{2\pi\sigma^2}} e^{-\frac{(t-\mu)^2}{2\sigma^2}}$ (D.50)

and since the Fourier transform of $ p(t)$ is

$\displaystyle P(\omega) = e^{-j\mu \omega} e^{-\frac{1}{2}\sigma^2\omega^2}$ (D.51)

it follows that the Gaussian characteristic function is

$\displaystyle \Phi(\omega) = \overline{P(\omega)} = e^{j\mu \omega} e^{-\frac{1}{2}\sigma^2\omega^2}.$ (D.52)
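The closed form (D.52) can be compared against a direct numerical evaluation of $\Phi(\omega) = \int p(t)e^{j\omega t}dt$. A minimal sketch, assuming example parameters $\mu=1.5$, $\sigma=2$ and a few arbitrary test frequencies:

```python
import cmath
import math

MU, SIGMA = 1.5, 2.0  # arbitrary example parameters (assumed for illustration)

def pdf(t):
    """Gaussian pdf with mean MU and standard deviation SIGMA."""
    return math.exp(-(t - MU) ** 2 / (2 * SIGMA ** 2)) / math.sqrt(2 * math.pi * SIGMA ** 2)

def integrate(f, a, b, n=50_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

def phi_numeric(w):
    """Characteristic function by direct integration (D.48)."""
    lo, hi = MU - 12 * SIGMA, MU + 12 * SIGMA
    re = integrate(lambda t: pdf(t) * math.cos(w * t), lo, hi)
    im = integrate(lambda t: pdf(t) * math.sin(w * t), lo, hi)
    return complex(re, im)

def phi_closed(w):
    """Closed form (D.52): exp(j*mu*w - sigma^2 w^2 / 2)."""
    return cmath.exp(1j * MU * w - SIGMA ** 2 * w ** 2 / 2)

err = max(abs(phi_numeric(w) - phi_closed(w)) for w in (0.0, 0.3, 0.7, 1.2))
```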

Gaussian Central Moments

The characteristic function of a zero-mean Gaussian is

$\displaystyle \Phi(\omega) = e^{-\frac{1}{2}\sigma^2\omega^2}$ (D.53)

Since a zero-mean Gaussian $ p(t)$ is an even function of $ t$ (i.e., $ p(-t)=p(t)$ ), all odd-order moments are zero. By the moment theorem, using $ 1/j^n = (-1)^{n/2}$ for even $ n$ , the even-order moments are

$\displaystyle m_n = \left.(-1)^{\frac{n}{2}}\frac{d^n}{d\omega^n}\Phi(\omega)\right\vert _{\omega=0}$ (D.54)

In particular,

\begin{eqnarray*}
\Phi^\prime(\omega) &=& -\sigma^2\omega\,\Phi(\omega)\\ [5pt]
\Phi^{\prime\prime}(\omega) &=& -\sigma^2\omega\,\Phi^\prime(\omega) - \sigma^2\,\Phi(\omega)
\end{eqnarray*}

Since $ \Phi(0)=1$ and $ \Phi^\prime(0)=0$ , we have $ \Phi^{\prime\prime}(0)=-\sigma^2$ , so that $ m_1=0$ and $ m_2=(-1)\Phi^{\prime\prime}(0)=\sigma^2$ , as expected.
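Both derivative identities above can be verified numerically against the zero-mean characteristic function (D.53). A minimal sketch, assuming an example value $\sigma=2$ and finite-difference approximations of the derivatives:

```python
import math

SIGMA = 2.0  # arbitrary example value (assumed for illustration)

def phi(w):
    """Zero-mean Gaussian characteristic function (D.53)."""
    return math.exp(-SIGMA ** 2 * w ** 2 / 2)

# Check Phi'(w) = -sigma^2 * w * Phi(w) at an arbitrary point via central difference.
w0, h = 0.3, 1e-5
deriv = (phi(w0 + h) - phi(w0 - h)) / (2 * h)
analytic = -SIGMA ** 2 * w0 * phi(w0)

# Check Phi''(0) = -sigma^2, so m2 = (-1) * Phi''(0) = sigma^2.
h2 = 1e-4
second = (phi(h2) - 2 * phi(0.0) + phi(-h2)) / h2 ** 2
```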
