Gaussian Moments
Gaussian Mean
The mean of a distribution $f(t)$ is defined as its first-order moment:

$$
\mu \triangleq \int_{-\infty}^\infty t\,f(t)\,dt \qquad \text{(D.42)}
$$
To show that the mean of the Gaussian distribution is $\mu$, we may write,
letting $g \triangleq 1/(\sigma\sqrt{2\pi})$,

$$
\begin{aligned}
\int_{-\infty}^\infty t\,f(t)\,dt
&\triangleq g \int_{-\infty}^\infty t\, e^{-\frac{(t-\mu)^2}{2\sigma^2}}\,dt\\
&= g \int_{-\infty}^\infty (t+\mu)\, e^{-\frac{t^2}{2\sigma^2}}\,dt\\
&= g \int_{-\infty}^\infty t\, e^{-\frac{t^2}{2\sigma^2}}\,dt + \mu\\
&= \left. g\,(-\sigma^2)\, e^{-\frac{t^2}{2\sigma^2}} \right\vert_{-\infty}^{\infty} + \mu\\
&= \mu,
\end{aligned}
$$

where the second line substitutes $t\to t+\mu$, the third line uses the fact
that $g\,e^{-t^2/(2\sigma^2)}$ integrates to one, and the last line follows
since $e^{-t^2/(2\sigma^2)} \to 0$ as $t\to\pm\infty$.
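As a quick numerical sanity check, the first moment can be approximated by a midpoint Riemann sum over a wide interval about the mean. This is an illustrative sketch (the function names `gaussian_pdf` and `first_moment` are not from the text):

```python
import math

def gaussian_pdf(t, mu, sigma):
    """Gaussian PDF with mean mu and standard deviation sigma."""
    g = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    return g * math.exp(-((t - mu) ** 2) / (2.0 * sigma ** 2))

def first_moment(mu, sigma, half_width=10.0, n=200_000):
    """Midpoint Riemann sum for the first moment over
    [mu - half_width*sigma, mu + half_width*sigma]; the truncated
    tails contribute a negligible amount."""
    a = mu - half_width * sigma
    h = 2.0 * half_width * sigma / n
    total = 0.0
    for i in range(n):
        t = a + (i + 0.5) * h
        total += t * gaussian_pdf(t, mu, sigma)
    return total * h

print(first_moment(mu=3.0, sigma=2.0))  # close to mu = 3.0
```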
Gaussian Variance
The variance of a distribution $f(t)$ is defined as its second central moment:

$$
\sigma^2 \triangleq \int_{-\infty}^\infty (t-\mu)^2 f(t)\,dt \qquad \text{(D.43)}
$$

where $\mu$ is the mean of $f(t)$.
To show that the variance of the Gaussian distribution is $\sigma^2$, we write,
letting $g \triangleq 1/(\sigma\sqrt{2\pi})$ and $\nu \triangleq t-\mu$,

$$
\begin{aligned}
\int_{-\infty}^\infty (t-\mu)^2 f(t)\,dt
&\triangleq g \int_{-\infty}^\infty (t-\mu)^2 e^{-\frac{(t-\mu)^2}{2\sigma^2}}\,dt\\
&= g \int_{-\infty}^\infty \nu^2 e^{-\frac{\nu^2}{2\sigma^2}}\,d\nu\\
&= g \int_{-\infty}^\infty \underbrace{\nu}_{u} \cdot \underbrace{\nu\, e^{-\frac{\nu^2}{2\sigma^2}}\,d\nu}_{dv}\\
&= \left. g\,\nu\,(-\sigma^2)\, e^{-\frac{\nu^2}{2\sigma^2}} \right\vert_{-\infty}^{\infty}
 - g \int_{-\infty}^\infty (-\sigma^2)\, e^{-\frac{\nu^2}{2\sigma^2}}\,d\nu\\
&= \sigma^2,
\end{aligned}
$$

where we used integration by parts and the fact that
$\nu\, e^{-\nu^2/(2\sigma^2)} \to 0$ as $\nu\to\pm\infty$.
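The same midpoint-sum approach verifies the second central moment numerically. Again a sketch with illustrative helper names, not code from the text:

```python
import math

def gaussian_pdf(t, mu, sigma):
    """Gaussian PDF with mean mu and standard deviation sigma."""
    g = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    return g * math.exp(-((t - mu) ** 2) / (2.0 * sigma ** 2))

def second_central_moment(mu, sigma, half_width=10.0, n=200_000):
    """Midpoint Riemann sum for the second central moment (variance)."""
    a = mu - half_width * sigma
    h = 2.0 * half_width * sigma / n
    total = 0.0
    for i in range(n):
        t = a + (i + 0.5) * h
        total += (t - mu) ** 2 * gaussian_pdf(t, mu, sigma)
    return total * h

print(second_central_moment(mu=1.0, sigma=2.0))  # close to sigma^2 = 4.0
```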
Higher Order Moments Revisited
Theorem:
The $n$th central moment of the Gaussian pdf $p(x)$ with mean $\mu$ and
variance $\sigma^2$ is given by

$$
m_n \triangleq \int_{-\infty}^\infty (x-\mu)^n p(x)\,dx =
\begin{cases}
(n-1)!!\,\sigma^n, & n \text{ even}\\[2pt]
0, & n \text{ odd}
\end{cases}
\qquad \text{(D.44)}
$$

where $(n-1)!!$ denotes the double factorial of $n-1$, i.e., the product of
all odd integers up to and including $n-1$. Thus, for example,
$m_2=\sigma^2$, $m_4=3\,\sigma^4$, $m_6=15\,\sigma^6$, and $m_8=105\,\sigma^8$.
Proof:
The formula can be derived by successively differentiating the
moment-generating function with respect to $\alpha$ and evaluating at
$\alpha=0$ (footnote D.4), or by differentiating the Gaussian integral

$$
\int_{-\infty}^\infty e^{-\alpha x^2}\,dx = \sqrt{\frac{\pi}{\alpha}} \qquad \text{(D.45)}
$$
successively with respect to $\alpha$:

$$
\begin{aligned}
\int_{-\infty}^\infty (-x^2)\, e^{-\alpha x^2}\,dx &= \sqrt{\pi}\,(-1/2)\,\alpha^{-3/2}\\
\int_{-\infty}^\infty (-x^2)(-x^2)\, e^{-\alpha x^2}\,dx &= \sqrt{\pi}\,(-1/2)(-3/2)\,\alpha^{-5/2}\\
\vdots\qquad & \qquad\qquad \vdots\\
\int_{-\infty}^\infty x^{2k}\, e^{-\alpha x^2}\,dx &= \sqrt{\pi}\,\left[(2k-1)!!\right]\,2^{-k}\,\alpha^{-(2k+1)/2}
\end{aligned}
$$

for $k=1,2,3,\ldots\,$.
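The closed form on the last line can be checked numerically. The sketch below (function names are illustrative) approximates the left-hand integral by a midpoint sum and compares it with $\sqrt{\pi}\,(2k-1)!!\,2^{-k}\,\alpha^{-(2k+1)/2}$:

```python
import math

def double_factorial(n):
    """n!! = n*(n-2)*(n-4)*...; by convention (-1)!! = 1 and 1!! = 1."""
    result = 1
    while n > 1:
        result *= n
        n -= 2
    return result

def lhs(k, alpha, half_width=12.0, n=200_000):
    """Midpoint Riemann sum for the integral of x^{2k} e^{-alpha x^2};
    the integration width scales as 1/sqrt(alpha)."""
    b = half_width / math.sqrt(alpha)
    h = 2.0 * b / n
    total = 0.0
    for i in range(n):
        x = -b + (i + 0.5) * h
        total += x ** (2 * k) * math.exp(-alpha * x * x)
    return total * h

def rhs(k, alpha):
    """Closed form: sqrt(pi) * (2k-1)!! * 2^{-k} * alpha^{-(2k+1)/2}."""
    return (math.sqrt(math.pi) * double_factorial(2 * k - 1)
            * 2.0 ** (-k) * alpha ** (-(2 * k + 1) / 2))

for k in (1, 2, 3):
    print(k, lhs(k, alpha=0.7), rhs(k, alpha=0.7))  # pairs agree closely
```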
Setting $\alpha = 1/(2\sigma^2)$ and $n=2k$, and dividing both sides by
$\sigma\sqrt{2\pi}$ yields

$$
m_n = \int_{-\infty}^\infty x^n \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{x^2}{2\sigma^2}}\,dx
    = (n-1)!!\,\sigma^n \qquad \text{(D.46)}
$$

for $n=2,4,6,\ldots\,$. The odd-order central moments are zero because the
integrand is then odd, and the change of variable $x = \tilde{x}-\mu$ extends
the result to the general case $\mu\ne0$.
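The central moments of the Gaussian can also be checked directly against the table $m_2=\sigma^2$, $m_4=3\sigma^4$, $m_6=15\sigma^6$, $m_8=105\sigma^8$. A sketch with illustrative names:

```python
import math

def gaussian_pdf(t, mu, sigma):
    """Gaussian PDF with mean mu and standard deviation sigma."""
    g = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    return g * math.exp(-((t - mu) ** 2) / (2.0 * sigma ** 2))

def central_moment(order, mu, sigma, half_width=12.0, n=200_000):
    """Midpoint Riemann sum for the order-th central moment."""
    a = mu - half_width * sigma
    h = 2.0 * half_width * sigma / n
    total = 0.0
    for i in range(n):
        t = a + (i + 0.5) * h
        total += (t - mu) ** order * gaussian_pdf(t, mu, sigma)
    return total * h

# Compare with (n-1)!! * sigma^n for n = 2, 4, 6, 8:
sigma = 1.5
for order, dfact in ((2, 1), (4, 3), (6, 15), (8, 105)):
    print(order, central_moment(order, 0.7, sigma), dfact * sigma ** order)
```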
Moment Theorem
Theorem:
For a random variable $x$,

$$
m_n = \frac{1}{j^n} \left. \frac{d^n}{d\omega^n}\,\Phi(\omega) \right\vert_{\omega=0} \qquad \text{(D.47)}
$$

where $\Phi(\omega)$ is the characteristic function of the PDF $p(x)$ of $x$:

$$
\Phi(\omega) \triangleq E\{e^{j\omega x}\} = \int_{-\infty}^\infty p(x)\, e^{j\omega x}\,dx \qquad \text{(D.48)}
$$

(Note that $\Phi(\omega)$ is the Fourier transform of $p(x)$ evaluated at
$-\omega$.)
Proof: [201, p. 157]
Let $m_n$ denote the $n$th moment of $x$, i.e.,

$$
m_n \triangleq E\{x^n\} = \int_{-\infty}^\infty x^n p(x)\,dx. \qquad \text{(D.49)}
$$
Then

$$
\begin{aligned}
\Phi(\omega) &= \int_{-\infty}^\infty p(x)\, e^{j\omega x}\,dx \\
&= \int_{-\infty}^\infty p(x) \left(1 + j\omega x + \cdots + \frac{(j\omega x)^n}{n!}+\cdots\right)dx\\
&= 1 + j\omega\, m_1 + \frac{(j\omega)^2}{2}\, m_2 + \cdots + \frac{(j\omega)^n}{n!}\, m_n+\cdots,
\end{aligned}
$$

where the term-by-term integration is valid when all moments $m_n$ are finite.
Differentiating $n$ times with respect to $\omega$ and setting $\omega=0$
isolates the term $j^n m_n$, which yields Eq. (D.47).
Gaussian Characteristic Function
Since the Gaussian PDF is

$$
p(t) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(t-\mu)^2}{2\sigma^2}}, \qquad \text{(D.50)}
$$

and since the Fourier transform of $p(t)$ is

$$
P(\omega) = \int_{-\infty}^\infty p(t)\, e^{-j\omega t}\,dt
          = e^{-j\mu\omega}\, e^{-\frac{\sigma^2\omega^2}{2}}, \qquad \text{(D.51)}
$$

it follows that the Gaussian characteristic function is

$$
\Phi(\omega) = P(-\omega) = e^{j\mu\omega}\, e^{-\frac{\sigma^2\omega^2}{2}}. \qquad \text{(D.52)}
$$
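The closed form for $\Phi(\omega)$ can be verified by numerically integrating $p(t)\,e^{j\omega t}$ and comparing against $e^{j\mu\omega}e^{-\sigma^2\omega^2/2}$. A sketch with illustrative names:

```python
import cmath
import math

def gaussian_pdf(t, mu, sigma):
    """Gaussian PDF with mean mu and standard deviation sigma."""
    g = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    return g * math.exp(-((t - mu) ** 2) / (2.0 * sigma ** 2))

def char_fn_numeric(w, mu, sigma, half_width=10.0, n=200_000):
    """Midpoint Riemann sum for Phi(w) = integral of p(t) e^{j w t} dt."""
    a = mu - half_width * sigma
    h = 2.0 * half_width * sigma / n
    total = 0.0 + 0.0j
    for i in range(n):
        t = a + (i + 0.5) * h
        total += gaussian_pdf(t, mu, sigma) * cmath.exp(1j * w * t)
    return total * h

def char_fn_closed(w, mu, sigma):
    """Closed form (D.52): Phi(w) = e^{j mu w} e^{-sigma^2 w^2 / 2}."""
    return cmath.exp(1j * mu * w) * math.exp(-0.5 * sigma ** 2 * w ** 2)

print(char_fn_numeric(0.8, mu=2.0, sigma=1.3))  # agrees with the line below
print(char_fn_closed(0.8, mu=2.0, sigma=1.3))
```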
Gaussian Central Moments
The characteristic function of a zero-mean Gaussian is

$$
\Phi(\omega) = e^{-\frac{\sigma^2\omega^2}{2}}. \qquad \text{(D.53)}
$$

Since a zero-mean Gaussian $p(t)$ is an even function of $t$, i.e.,
$p(-t)=p(t)$, all odd-order moments $m_i$ are zero. By the moment theorem, the
remaining (even-order) moments are given by

$$
m_n = \frac{1}{j^n}\,\Phi^{(n)}(0). \qquad \text{(D.54)}
$$
In particular,

$$
\begin{aligned}
\Phi^\prime(\omega) &= -\sigma^2\omega\,\Phi(\omega)\\[5pt]
\Phi^{\prime\prime}(\omega) &= -\sigma^2\omega\,\Phi^\prime(\omega)
-\sigma^2\,\Phi(\omega).
\end{aligned}
$$

Since $\Phi(0)=1$ and $\Phi^\prime(0)=0$, we see that
$m_1 = \Phi^\prime(0)/j = 0$ and
$m_2 = \Phi^{\prime\prime}(0)/j^2 = -(-\sigma^2) = \sigma^2$, as expected.
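These derivative values can be confirmed with simple finite differences of Eq. (D.53). A sketch (helper names are illustrative):

```python
import math

def phi(w, sigma):
    """Zero-mean Gaussian characteristic function (D.53)."""
    return math.exp(-0.5 * sigma ** 2 * w ** 2)

def first_derivative(f, w, h=1e-4):
    """Central finite-difference first derivative."""
    return (f(w + h) - f(w - h)) / (2.0 * h)

def second_derivative(f, w, h=1e-4):
    """Central finite-difference second derivative."""
    return (f(w + h) - 2.0 * f(w) + f(w - h)) / (h * h)

sigma = 2.0
m1 = first_derivative(lambda w: phi(w, sigma), 0.0)    # Phi'(0)/j with Phi'(0) = 0
m2 = -second_derivative(lambda w: phi(w, sigma), 0.0)  # Phi''(0)/j^2 = -Phi''(0)
print(m1, m2)  # m1 close to 0, m2 close to sigma^2 = 4.0
```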