Sample Autocorrelation

The sample autocorrelation of a sequence $ v(n)$ , $ n=0,1,2,\ldots,N-1$ , may be defined by

$\displaystyle \hat{r}_{v,N}(l) \isdef \frac{1}{N-\vert l\vert} \sum_{n=0}^{N-1}\overline{v(n)}v(n+l), \quad l=0,\pm1,\pm2, \ldots, \pm (N-1), \protect$ (7.6)

where $ v(n)$ is defined as zero outside of the range $ n\in[0,N-1]$ . (Note that this differs from the usual definition of indexing modulo $ N$ for the DFT.) In more explicit detail, (7.6) can be written out as

$\displaystyle \hat{r}_{v,N}(l) = \left\{\begin{array}{ll} \frac{1}{N-l}\sum_{n=0}^{N-1-l}\overline{v(n)}v(n+l), & l=0,1,2,\ldots,N-1 \\ [5pt] \frac{1}{N+l}\sum_{n=-l}^{N-1}\overline{v(n)}v(n+l), & l=-1,-2,\ldots,-N+1 \\ \end{array} \right. \protect$ (7.7)

and zero for $ \left\vert l\right\vert\geq N$ .
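The two-sided form above translates directly into code. The following is an illustrative pure-Python sketch (the function name sample_autocorr is ours, not from the text) of the unbiased estimator in (7.6), treating $ v(n)$ as zero outside $ [0,N-1]$ :

```python
def sample_autocorr(v, l):
    """Unbiased sample autocorrelation r_hat_{v,N}(l) at integer lag l,
    per Eq. (7.6): average of conj(v(n)) * v(n+l) over available data."""
    N = len(v)
    if abs(l) >= N:
        return 0.0  # v(n) is zero outside [0, N-1]
    # Keep only terms where both n and n+l lie in [0, N-1].
    total = sum(v[n].conjugate() * v[n + l]
                for n in range(max(0, -l), min(N, N - l)))
    return total / (N - abs(l))  # unbiased normalization 1/(N-|l|)
```

For v = [1, 1, 1, 1], this returns 1 at every lag $ \vert l\vert<4$ , matching the Octave xcorr output shown below.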

In matlab, the sample autocorrelation of a vector x can be computed using the xcorr function.


octave:1> xcorr([1 1 1 1], 'unbiased')
ans =
  1   1   1   1   1   1   1
The xcorr function also performs cross-correlation when given a second signal argument, and offers additional features with additional arguments. Say help xcorr for details.
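By analogy with (7.6), the cross-correlation case replaces one copy of $ v$ with a second signal. A minimal pure-Python sketch follows (the function name is ours; note that lag-sign and ordering conventions vary between implementations, so this is not guaranteed to match xcorr's output ordering):

```python
def sample_crosscorr(x, y, l):
    """Unbiased sample cross-correlation at integer lag l, defined by
    analogy with Eq. (7.6): average of conj(x(n)) * y(n+l)."""
    N = len(x)  # assumes len(x) == len(y) for simplicity
    if abs(l) >= N:
        return 0.0
    total = sum(x[n].conjugate() * y[n + l]
                for n in range(max(0, -l), min(N, N - l)))
    return total / (N - abs(l))
```

For example, with x = [1, 0, 0] and y = [0, 1, 0] (y is x delayed by one sample), the cross-correlation peaks at lag $ l=1$ .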

Note that $ \hat{r}_{v,N}(l)$ is the average of the lagged product $ \overline{v(n)}v(n+l)$ over all available data. For white noise, this average approaches zero for $ l\neq0$ as the number of terms in the average increases. That is, we expect

$\displaystyle \hat{r}_{v,N}(l) \approx \left\{\begin{array}{ll} \hat{\sigma}_{v,N}^2, & l=0 \\ [5pt] 0, & l\neq 0 \\ \end{array} \right. \isdef \hat{\sigma}_{v,N}^2 \delta(l)$ (7.8)


where

$\displaystyle \hat{\sigma}_{v,N}^2 \isdef \frac{1}{N}\sum_{n=0}^{N-1} \left\vert v(n)\right\vert^2$ (7.9)

is defined as the sample variance of $ v$ .

The plot in the upper left corner of Fig.7.1 shows the sample autocorrelation obtained for 32 samples of pseudorandom numbers (synthetic random numbers). (For reasons to be discussed below, the sample autocorrelation has been multiplied by a Bartlett (triangular) window.) Proceeding down the left column, the results of averaging many such sample autocorrelations can be seen. It is clear that the average sample autocorrelation function is approaching an impulse, as desired by definition for white noise. (The right column shows the Fourier transform of each sample autocorrelation function, which is a smoothed estimate of the power spectral density, as discussed in §7.6 below.)
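This averaging experiment is easy to reproduce. The following is an illustrative pure-Python sketch under assumptions of our own choosing (unit-variance Gaussian pseudorandom numbers from the standard random module, 32 samples per sequence, 200 averaged realizations; the figure's Bartlett windowing is omitted here):

```python
import random

def sample_autocorr(v, l):
    """Unbiased sample autocorrelation per Eq. (7.6)."""
    N = len(v)
    if abs(l) >= N:
        return 0.0
    total = sum(v[n] * v[n + l]
                for n in range(max(0, -l), min(N, N - l)))
    return total / (N - abs(l))

random.seed(0)                 # fixed seed for reproducibility
N, trials = 32, 200
avg = [0.0] * N                # averaged r_hat(l) for l = 0, ..., N-1
for _ in range(trials):
    v = [random.gauss(0.0, 1.0) for _ in range(N)]
    for l in range(N):
        avg[l] += sample_autocorr(v, l) / trials
# avg[0] estimates the variance (here 1.0), while the off-lag
# values shrink toward zero as more realizations are averaged.
```

The averaged estimate visibly tends toward the impulse $ \hat{\sigma}_{v,N}^2\,\delta(l)$ of (7.8).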

Figure 7.1: Averaged sample autocorrelations and their Fourier transforms.

For stationary stochastic processes $ v(n)$ , the sample autocorrelation function $ \hat{r}_{v,N}(l)$ approaches the true autocorrelation function $ r_v(l)$ in the limit as the number of observed samples $ N$ goes to infinity, i.e.,

$\displaystyle \lim_{N\to\infty} \hat{r}_{v,N}(l) = r_v(l).$ (7.10)

The true autocorrelation function of a random process is defined in Appendix C. For our purposes here, however, the above limit can be taken as the definition of the true autocorrelation function for the noise sequence $ v(n)$ .

At lag $ l=0$ , the autocorrelation function of a zero-mean random process $ v(n)$ reduces to the variance:

$\displaystyle r_v(0) \isdef \lim_{N\to\infty}\frac{1}{N}\sum_{m=0}^{N-1} \left\vert v(m)\right\vert^2 = \sigma_v^2.$ (7.11)

The variance can also be called the average power or mean square. The square root $ \sigma_v$ of the variance is called the standard deviation or root mean square (RMS).
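As a quick numerical illustration of these terms (the sequence below is invented for the example, not taken from the text):

```python
import math

v = [1.0, -1.0, 2.0, -2.0]   # hypothetical zero-mean sequence
N = len(v)

# Mean square (= sample variance for zero-mean data, = average power):
mean_square = sum(x * x for x in v) / N   # (1 + 1 + 4 + 4) / 4 = 2.5

# Standard deviation / root mean square (RMS):
rms = math.sqrt(mean_square)
```

Here the average power is 2.5 and the RMS value is $ \sqrt{2.5}\approx 1.58$ .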
