Estimator Variance
As mentioned in §6.12, the pwelch function in Matlab and Octave offers ``confidence intervals'' for an estimated power spectral density (PSD). A confidence interval encloses the true value with a specified probability, called the confidence level. For example, a 95% confidence interval encloses the true value with probability 0.95.
This section gives a first discussion of ``estimator variance,'' particularly the variance of sample means and sample variances for stationary stochastic processes.
Sample-Mean Variance
The simplest case to study first is the sample mean:
$$
\hat{\mu}_{x,M}(n) \;\triangleq\; \frac{1}{M}\sum_{m=0}^{M-1} x(n-m)
\qquad\text{(C.29)}
$$

Here we have defined the sample mean at time $n$ as the average of the $M$ successive samples up to time $n$--a ``running average''. The true mean $\mu_x$ is assumed to be the average over any infinite number of samples, such as
$$
\mu_x \;=\; \lim_{M\to\infty}\frac{1}{M}\sum_{m=0}^{M-1} x(n-m)
\qquad\text{(C.30)}
$$

or

$$
\mu_x \;=\; \lim_{M\to\infty}\frac{1}{2M+1}\sum_{m=-M}^{M} x(n-m).
\qquad\text{(C.31)}
$$

Now assume $\mu_x = 0$, and let $\sigma_x^2$ denote the variance of the process $x(n)$, i.e.,

$$
\hbox{Var}\{x(n)\} \;\triangleq\; E\{[x(n)-\mu_x]^2\} \;=\; E\{x^2(n)\} \;=\; \sigma_x^2.
\qquad\text{(C.32)}
$$
Then the variance of our sample-mean estimator $\hat{\mu}_{x,M}(n)$ can be calculated as follows:

$$
\begin{aligned}
\hbox{Var}\{\hat{\mu}_{x,M}(n)\}
&= E\{\hat{\mu}_{x,M}^2(n)\}
= E\left\{\frac{1}{M}\sum_{m_1=0}^{M-1} x(n-m_1)\,\frac{1}{M}\sum_{m_2=0}^{M-1} x(n-m_2)\right\}\\
&= \frac{1}{M^2}\sum_{m_1=0}^{M-1}\sum_{m_2=0}^{M-1} E\{x(n-m_1)\,x(n-m_2)\}
= \frac{1}{M^2}\sum_{m_1=0}^{M-1}\sum_{m_2=0}^{M-1} r_x(m_1-m_2),
\end{aligned}
$$

where we used the fact that the expectation operator is linear, and $r_x(l)$ denotes the autocorrelation of $x(n)$. If $x(n)$ is white noise, then $r_x(l) = \sigma_x^2\,\delta(l)$, and we obtain

$$
\hbox{Var}\{\hat{\mu}_{x,M}(n)\} \;=\; \frac{1}{M^2}\sum_{m=0}^{M-1}\sigma_x^2 \;=\; \frac{\sigma_x^2}{M}.
$$

We have derived that the variance of the $M$-sample running average of a white-noise sequence $x(n)$ is given by $\sigma_x^2/M$, where $\sigma_x^2$ denotes the variance of $x(n)$. The variance is thus inversely proportional to the number of samples used to form the estimate. This is how averaging reduces variance in general: when averaging $M$ independent (or merely uncorrelated) random variables, the variance of the average equals the variance of each individual random variable divided by $M$.
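The $\sigma_x^2/M$ result is easy to verify by simulation. The following sketch (NumPy; the parameter values are illustrative choices, not from the text) draws many independent $M$-sample running averages of white noise and compares their empirical variance to the prediction:

```python
import numpy as np

# Empirical check: the variance of an M-sample average of white noise
# should be close to sigma^2 / M.
rng = np.random.default_rng(0)
sigma = 2.0        # white-noise standard deviation (illustrative)
M = 64             # samples per running average
trials = 200_000   # number of independent sample means

x = rng.normal(0.0, sigma, size=(trials, M))
sample_means = x.mean(axis=1)     # one M-sample mean per trial, as in (C.29)
measured = sample_means.var()     # empirical variance of the estimator
predicted = sigma**2 / M          # theoretical value derived above

print(measured, predicted)        # the two values should agree closely
```

With 200,000 trials, the measured value typically lands within a few percent of $\sigma_x^2/M$.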
Sample-Variance Variance
Consider now the sample variance estimator
$$
\hat{\sigma}_{x,M}^2(n) \;\triangleq\; \frac{1}{M}\sum_{m=0}^{M-1} x^2(n-m) \;=\; \hat{r}_{x,M}(0)
\qquad\text{(C.33)}
$$

where the mean is assumed to be $\mu_x = 0$, and $\hat{r}_{x,M}(0)$ denotes the unbiased sample autocorrelation of $x$ at lag zero, based on the $M$ samples leading up to and including time $n$. Since $\hat{r}_{x,M}(0)$ is unbiased, $E\{\hat{\sigma}_{x,M}^2(n)\} = \sigma_x^2$. The variance of this estimator is then given by

$$
\hbox{Var}\{\hat{\sigma}_{x,M}^2(n)\}
\;=\; E\{[\hat{\sigma}_{x,M}^2(n)-\sigma_x^2]^2\}
\;=\; E\{\hat{\sigma}_{x,M}^4(n)\} - \sigma_x^4,
$$

where

$$
E\{\hat{\sigma}_{x,M}^4(n)\}
\;=\; \frac{1}{M^2}\sum_{m_1=0}^{M-1}\sum_{m_2=0}^{M-1} E\{x^2(n-m_1)\,x^2(n-m_2)\}.
$$
The autocorrelation of $x^2(n)$ need not be simply related to that of $x(n)$. However, when $x(n)$ is assumed to be Gaussian white noise, simple relations do exist. For example, when $m_1 \neq m_2$,

$$
E\{x^2(n-m_1)\,x^2(n-m_2)\} \;=\; E\{x^2(n-m_1)\}\,E\{x^2(n-m_2)\} \;=\; \sigma_x^4
\qquad\text{(C.34)}
$$

by the independence of $x(n-m_1)$ and $x(n-m_2)$, and when $m_1 = m_2$, the fourth moment is given by $E\{x^4(n)\} = 3\sigma_x^4$. More generally, we can simply label the $k$th moment of $x(n)$ as $\mu_k \triangleq E\{x^k(n)\}$, where $k=1$ corresponds to the mean, $k=2$ corresponds to the variance (when the mean is zero), etc.
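The Gaussian fourth-moment relation $E\{x^4(n)\} = 3\sigma_x^4$ is itself easy to check numerically. A minimal NumPy sketch (parameter values are illustrative):

```python
import numpy as np

# Monte Carlo check of the Gaussian fourth moment: E{x^4} = 3 * sigma^4
# for a zero-mean Gaussian random variable with variance sigma^2.
rng = np.random.default_rng(1)
sigma = 1.5
x = rng.normal(0.0, sigma, size=2_000_000)

measured = np.mean(x**4)
predicted = 3.0 * sigma**4

print(measured, predicted)  # should agree to within a fraction of a percent
```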
When $x(n)$ is assumed to be Gaussian white noise, we have

$$
E\{x^2(n-m_1)\,x^2(n-m_2)\} \;=\;
\begin{cases}
\sigma_x^4, & m_1 \neq m_2 \\
3\sigma_x^4, & m_1 = m_2
\end{cases}
\qquad\text{(C.35)}
$$
so that the variance of our estimator for the variance of Gaussian white noise is

$$
\hbox{Var}\{\hat{\sigma}_{x,M}^2(n)\}
\;=\; \frac{M^2-M}{M^2}\,\sigma_x^4 + \frac{3M}{M^2}\,\sigma_x^4 - \sigma_x^4
\;=\; \frac{2}{M}\,\sigma_x^4.
\qquad\text{(C.36)}
$$
Again we see that the variance of the estimator declines as $1/M$.
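Equation (C.36) can also be verified by simulation. The following sketch (NumPy; parameter choices are illustrative) forms many independent $M$-sample variance estimates of Gaussian white noise, using the known zero mean as in (C.33), and compares their empirical variance to $2\sigma_x^4/M$:

```python
import numpy as np

# Empirical check: the variance of the M-sample variance estimate should be
# close to 2 * sigma^4 / M for zero-mean Gaussian white noise.
rng = np.random.default_rng(2)
sigma = 1.0
M = 32
trials = 200_000

x = rng.normal(0.0, sigma, size=(trials, M))
sample_vars = np.mean(x**2, axis=1)   # estimator (C.33), mean known to be zero
measured = sample_vars.var()
predicted = 2.0 * sigma**4 / M

print(measured, predicted)            # the two values should agree closely
```

Doubling $M$ in this experiment roughly halves the measured variance, as the $1/M$ law predicts.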
The same basic analysis as above can be used to estimate the variance of the sample autocorrelation estimates for each lag, and/or the variance of the power spectral density estimate at each frequency.
As mentioned above, to obtain a grounding in statistical signal processing, see references such as [201,121,95].