# sample autocorrelation matrix eigenvalues negative

Started by doublehelics, November 29, 2011
```Hey all, I am computing the sample autocorrelation for some continuous-valued
data (using this:
https://ccrma.stanford.edu/~jos/sasp/Sample_Autocorrelation.html), but the
generated correlation matrix has negative eigenvalues. Is there a way to
fix the autocorrelation matrix to have non-negative eigenvalues? I always
thought that the sample correlation matrix should be positive
semidefinite. Thanks.
```
```On 29 Nov, 10:08, "doublehelics"
<ozgun.harmanci@n_o_s_p_a_m.gmail.com> wrote:
> Hey all, I am computing the sample autocorrelation for some continuous-valued
> data (using this:
> https://ccrma.stanford.edu/~jos/sasp/Sample_Autocorrelation.html), but the
> generated correlation matrix has negative eigenvalues. Is there a way to
> fix the autocorrelation matrix to have non-negative eigenvalues? I always
> thought that the sample correlation matrix should be positive
> semidefinite. Thanks.

'Positive semidefinite' means that some eigenvalues
may be exactly 0. If so, there is a possibility that
computational errors push some eigenvalues to the
negative side, but with very small magnitudes.

If you can reliably detect these cases, treat the
eigenvalue as if it were 0.

If the negative eigenvalues have large magnitude
compared to the numerical precision of the results,
then something else is going on. Contact whoever
set up the example you quote.

Rune
```
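Rune's fix (zeroing eigenvalues that are only negative by roundoff) can be sketched in NumPy. The tolerance below is an assumption to tune against your data's scale, not a universal constant:

```python
import numpy as np

def project_to_psd(R, rel_tol=1e-10):
    """Zero out eigenvalues that are negative by roundoff only.

    Raises if a negative eigenvalue is too large to blame on
    numerical error -- per the post above, something else is
    going on in that case.
    """
    R = 0.5 * (R + R.T)               # symmetrize away roundoff asymmetry
    w, V = np.linalg.eigh(R)
    if w.min() < -rel_tol * max(abs(w.max()), 1.0):
        raise ValueError("negative eigenvalue too large to be roundoff")
    w = np.clip(w, 0.0, None)         # treat tiny negatives as exactly 0
    return (V * w) @ V.T

# A rank-deficient PSD matrix (eigenvalues 2.5 and 0) plus
# roundoff-scale noise that pushes one eigenvalue slightly negative:
A = np.array([[2.0, 1.0], [1.0, 0.5]])
noise = 1e-14 * np.array([[0.0, 1.0], [1.0, 0.0]])
R_fixed = project_to_psd(A + noise)
```

Reconstructing from the clipped eigendecomposition changes the matrix by at most the magnitude of the clipped eigenvalues, so the repaired matrix stays numerically indistinguishable from the original.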
```On 2011-11-29 06:03:41 -0400, Rune Allnor said:

> On 29 Nov, 10:08, "doublehelics"
> <ozgun.harmanci@n_o_s_p_a_m.gmail.com> wrote:
>> Hey all, I am computing the sample autocorrelation for some continuous-valued
>> data (using this:
>> https://ccrma.stanford.edu/~jos/sasp/Sample_Autocorrelation.html),
>> but the
>> generated correlation matrix has negative eigenvalues. Is there a way to
>> fix the autocorrelation matrix to have non-negative eigenvalues? I always
>> thought that the sample correlation matrix should be positive
>> semidefinite. Thanks.
>
> 'Positive semidefinite' means that some eigenvalues
> may be exactly 0. If so, there is a possibility that
> computational errors push some eigenvalues to the
> negative side, but with very small magnitudes.
>
> If you can reliably detect these cases, treat the
> eigenvalue as if it were 0.
>
> If the negative eigenvalues have large magnitude
> compared to the numerical precision of the results,
> then something else is going on. Contact whoever
> set up the example you quote.
>
> Rune

The problem with autocorrelations is that there are two common estimators
in use. One is unbiased and the other is positive definite. The unbiased
estimator is not positive definite and the positive definite estimator
is not unbiased. Both are the same in the limit of large N, so both are
consistent estimators.

The unbiased estimator has the N-t divisor, while the positive definite
estimator has the N divisor. If you compute the autocorrelation by
FFT-magnitude-inverse FFT, you will get the positive definite estimator.
The unbiased estimator is more common if you are summing lagged products,
where the divisor of N-t is more natural.

The OP will have to specify which estimator is in use. Negative eigenvalues
can be expected from the unbiased estimator but are a numerical problem
for the positive definite estimator. Nonzero means can cause numerical
problems for variance estimators due to roundoff during cancellation.

```
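The two estimators described above, and the FFT route's agreement with the divisor-N (positive definite) one, can be illustrated with a short NumPy sketch (the function names are mine):

```python
import numpy as np

def autocorr_unbiased(x):
    """Unbiased estimator: divisor N - t.  Not guaranteed PSD."""
    N = len(x)
    return np.array([np.dot(x[:N - t], x[t:]) / (N - t) for t in range(N)])

def autocorr_biased(x):
    """Positive definite ('biased') estimator: divisor N."""
    N = len(x)
    return np.array([np.dot(x[:N - t], x[t:]) / N for t in range(N)])

def autocorr_fft(x):
    """FFT -> squared magnitude -> inverse FFT.  Zero-padding to at
    least 2N-1 points makes the correlation linear rather than
    circular; the result matches the divisor-N estimator."""
    N = len(x)
    X = np.fft.rfft(x, 2 * N)
    return np.fft.irfft(np.abs(X) ** 2, 2 * N)[:N] / N
```

At lag 0 the two divisors coincide, so both estimators agree there; they diverge increasingly at large lags, where N - t is much smaller than N.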
```On Nov 29, 4:08 am, "doublehelics"
<ozgun.harmanci@n_o_s_p_a_m.gmail.com> wrote:
> Hey all, I am computing the sample autocorrelation for some continuous-valued
> data (using this:
> https://ccrma.stanford.edu/~jos/sasp/Sample_Autocorrelation.html), but the
> generated correlation matrix has negative eigenvalues. Is there a way to
> fix the autocorrelation matrix to have non-negative eigenvalues? I always
> thought that the sample correlation matrix should be positive
> semidefinite. Thanks.

There are numerous fixes.

I would first look at the eigendecomposition of your sample correlation
estimate.  If you see a negative eigenvalue, something is wrong.

The easiest solution is to diagonally load the matrix.  That pushes
the eigenvalues equally in the positive direction.  You typically have
to play with the data to figure out an appropriate loading scalar.

For problems assumed stationary, some people average over the diagonals
and force the matrix to be Toeplitz.  This sort of works, sometimes.

Again for problems assumed stationary, average the flipped-up-down
(and flipped-left-right) sample covariance with the sample covariance.

I would recommend using a rank-1 update approach.  Typically, a
covariance matrix inverse is the desired entity downstream, so
building a factorization from the data is usually best.  It is also
more numerically stable.  There are rank-one update algorithms for
Cholesky, QR, eigenvectors, and I think even for the SVD.

There are a lot of papers on covariance estimation.  These are just a
few tricks.
```
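Two of the fixes above, diagonal loading and averaging over the diagonals to force Toeplitz structure, are easy to sketch in NumPy. The default loading level here is a made-up placeholder, standing in for the data-dependent tuning the post describes:

```python
import numpy as np

def diagonal_load(R, load=None):
    """Add load * I, shifting every eigenvalue up by `load`.
    The default level (a small fraction of the mean diagonal)
    is a placeholder, not a recommendation."""
    n = R.shape[0]
    if load is None:
        load = 1e-3 * np.trace(R) / n
    return R + load * np.eye(n)

def toeplitz_average(R):
    """Replace each diagonal of a symmetric R by its mean,
    forcing Toeplitz structure.  Only appropriate under a
    stationarity assumption."""
    n = R.shape[0]
    T = np.zeros_like(R)
    for k in range(n):
        m = np.diagonal(R, k).mean()
        idx = np.arange(n - k)
        T[idx, idx + k] = m
        T[idx + k, idx] = m
    return T
```

Loading shifts the whole spectrum rigidly, so it repairs small negative eigenvalues at the cost of biasing every eigenvalue upward; the Toeplitz projection instead redistributes energy along each lag.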