
Decorrelation vs Whitening

Started by HardySpicer July 7, 2009
To me, whitening is taking a coloured noise signal and reproducing its
white noise input (as Wiener has it).

However, I have seen other definitions. For example, suppose we have a
mixture of signals via a matrix A

X=A.S

where S is a vector of signals. Does decorrelating then result in a
diagonal matrix? Is this what some people call whitening? Also, can I
decorrelate by taking the correlation matrix of X, Rxx, and multiplying
X by

Rxx^(-0.5)? ie the inverse square root of Rxx.

Hardy
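As a numerical aside on that last question: multiplying X by the symmetric inverse square root of Rxx does decorrelate (in fact whiten) it. A minimal NumPy sketch, with an invented mixing matrix and synthetic sources:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical mixture X = A.S: independent unit-variance sources S
# mixed by an invented matrix A
S = rng.standard_normal((2, 10000))
A = np.array([[1.0, 0.6],
              [0.3, 1.0]])
X = A @ S

# Sample correlation matrix of the mixture
Rxx = X @ X.T / X.shape[1]

# Symmetric inverse square root Rxx^(-1/2) via eigendecomposition
d, E = np.linalg.eigh(Rxx)
Rxx_inv_sqrt = E @ np.diag(d ** -0.5) @ E.T

# Multiplying X by Rxx^(-1/2) whitens it: sample covariance -> identity
Z = Rxx_inv_sqrt @ X
Rzz = Z @ Z.T / Z.shape[1]
print(np.round(Rzz, 6))
```

Since the same sample Rxx is used both to build the transform and to measure the result, the whitened covariance comes out as the identity up to numerical precision.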
HardySpicer  <gyansorova@gmail.com> wrote:

>To me, whitening is taking a coloured noise signal and reproducing its
>white noise input (as Wiener has it).
>[snip]
>Does decorrelating then result in a diagonal matrix? Is this what some
>people call whitening?
Not that I've heard. Whitening usually means removing some of the coloration of a signal, say by linear prediction, but leaving the spectral fine structure, which presumably has some information you want (such as pitch or magnitude).

Of course, the more time goes by, the more likely it seems I am to run into some (to me) unusual use of terminology that turns out to be widespread in some obscure corner of the industry.

Steve
On 8 Jul, 00:50, HardySpicer <gyansor...@gmail.com> wrote:
> To me, whitening is taking a coloured noise signal and reproducing its
> white noise input (as Wiener has it).
> [snip]
> Also, can I decorrelate by taking the correlation matrix of X, Rxx, and
> multiplying X by Rxx^(-0.5)? ie the inverse square root of Rxx.
For the estimators of Rxx that are (conjugate) symmetric and full rank, there exists a Cholesky decomposition C of Rxx such that

  Rxx = C^{H}C

where superscript H means 'conjugate transposed'. Rxx can then be diagonalized as

  D = C^{-H} Rxx C^{-1}

Maybe some people might say that C = Rxx^{1/2}.

This is the general idea behind whitening filters in the time domain. In that case Rxx is the noise (temporal) covariance matrix.

While one might have opinions about using the term 'whitening' in other contexts, the same formal trick can be used in any context for any covariance matrix that is (conjugate) symmetric and of full rank.

Rune
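This factorization can be checked in a few lines of NumPy (a real-valued sketch, so the conjugate transpose H reduces to an ordinary transpose; the matrix Rxx below is invented):

```python
import numpy as np

rng = np.random.default_rng(1)

# An invented symmetric positive-definite covariance matrix Rxx
M = rng.standard_normal((3, 3))
Rxx = M @ M.T + 3.0 * np.eye(3)

# numpy's cholesky returns lower-triangular L with Rxx = L L^T,
# so C = L^T gives Rxx = C^H C in the post's notation (real case: H -> T)
C = np.linalg.cholesky(Rxx).T

# D = C^{-H} Rxx C^{-1} diagonalizes Rxx (here all the way to the identity)
Cinv = np.linalg.inv(C)
D = Cinv.T @ Rxx @ Cinv
print(np.round(D, 6))
```

With C the exact Cholesky factor, D = C^{-T} C^T C C^{-1} collapses to the identity, which is the strongest form of "diagonal".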
On Jul 7, 11:10 pm, Rune Allnor <all...@tele.ntnu.no> wrote:
> For the estimators of Rxx that are (conjugate) symmetric and full rank,
> there exists a Cholesky decomposition C of Rxx such that
>
>   Rxx = C^{H}C
>
> [snip]
> Rune
So can you decouple in the time domain

X(k) = A.S(k)

by multiplying by what you call C (Rxx^(-0.5))? ie will X' become diagonally dominant, where X' = C.X? This is with reference to ICA, where you are trying to separate a mixture of signals.

Hardy
>To me, whitening is taking a coloured noise signal and reproducing its
>white noise input (as Wiener has it).
>[snip]
>X=A.S

This is more of an approximation: decorrelating a signal with a unitary matrix is used to simplify computation, like using a Fourier matrix to decorrelate a signal. If you apply this, your autocorrelation matrix is close to diagonal, but it is not whitening, because the diagonal elements are not all equal, as they would be for a white signal.

Moctar
On 8 Jul, 08:56, HardySpicer <gyansor...@gmail.com> wrote:
> > For the estimators of Rxx that are (conjugate) symmetric and full
> > rank, there exists a Cholesky decomposition C of Rxx such that
> > [snip]
>
> So can you decouple in the time domain X(k)=A.S(k) by multiplying by
> what you call C (Rxx^(-0.5))? ie will X' become diagonally dominant,
> where X' = C.X? This with ref to ICA, where you are trying to separate
> a mixture of signals.
Well, if Rxx = X'X then Rxx is diagonalized as

  D = C' Rxx C = C' X'X C

and so the product Y = XC might be of some interest. Whether this is of any use in the context X = SA is a completely different question.

Rune
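A quick NumPy check of this, with rows of X as observations and C taken to be the symmetric inverse square root Rxx^(-1/2) asked about above (so C' = C); the data matrix below is synthetic:

```python
import numpy as np

rng = np.random.default_rng(2)

# Rows of X are observations, so Rxx = X'X is the (unnormalized) covariance
X = rng.standard_normal((500, 3)) @ np.array([[1.0, 0.5, 0.0],
                                              [0.0, 1.0, 0.4],
                                              [0.0, 0.0, 1.0]])
Rxx = X.T @ X

# C = Rxx^(-1/2), the symmetric inverse square root, via eigendecomposition
d, E = np.linalg.eigh(Rxx)
C = E @ np.diag(d ** -0.5) @ E.T

# Y = XC then satisfies Y'Y = C' Rxx C = identity
Y = X @ C
print(np.round(Y.T @ Y, 6))
```

So the columns of Y = XC are exactly decorrelated (and unit-norm); whether those columns recover the sources S is the separate ICA question.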
On Jul 7, 5:50 pm, HardySpicer <gyansor...@gmail.com> wrote:
> To me, whitening is taking a coloured noise signal and reproducing its
> white noise input (as Wiener has it).
> [snip]
> Also, can I decorrelate by taking the correlation matrix of X Rxx and
> multiplying X by Rxx^(-0.5)?? ie the inverse square root of Rxx.
When you used terms like mixture of signals and signal vectors, I suspected you were referring to ICA. Your second post confirmed that.

In ICA, the observed vector is transformed to get a new vector that is "white". This means that any correlations in the observed data are removed: the components of the new vector are uncorrelated. If that is the case, then the covariance matrix of the new vector is diagonal. Additionally, if the new vector can be made so that the components have unit variance as well as being uncorrelated, then the covariance matrix will be the identity matrix. One way of doing this is to use eigenvalue decomposition. Then the new vector is

  x_new = E D^(-1/2) E^H x_old

where E is a matrix of eigenvectors and D is the diagonal matrix of eigenvalues.

If you are looking at Comon's paper, I suggest you also look at Hyvarinen's paper "Independent Component Analysis: Algorithms and Applications" in Neural Networks. I believe it was vol. 13, 2000. Hyvarinen was with Helsinki Univ. If you can't find the paper, he may have a copy he can send you.

Maurice Givens
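That whitening step might look like this in NumPy (a real-valued sketch, so E^H is just E^T; the mixing matrix and sources are synthetic):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic ICA-style observations: invented mixing matrix A, sources S
S = rng.standard_normal((3, 5000))
A = np.array([[1.0, 0.5, 0.2],
              [0.0, 1.0, 0.3],
              [0.4, 0.0, 1.0]])
x_old = A @ S

# Covariance of the observed vector and its eigendecomposition cov = E D E^T
cov = np.cov(x_old)
d, E = np.linalg.eigh(cov)

# x_new = E D^(-1/2) E^H x_old  (real data, so E^H reduces to E^T)
x_new = E @ np.diag(d ** -0.5) @ E.T @ x_old

# Components are now uncorrelated with unit variance: covariance ~ identity
print(np.round(np.cov(x_new), 6))
```

This symmetric form E D^(-1/2) E^T is sometimes called ZCA whitening; dropping the leading E gives the plain PCA-whitening variant, which whitens equally well but also rotates the data.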
On Jul 7, 6:50 pm, HardySpicer <gyansor...@gmail.com> wrote:
> To me, whitening is taking a coloured noise signal and reproducing its
> white noise input (as Wiener has it).
> [snip]
> Does decorrelating then result in a diagonal matrix? Is this what some
> people call whitening?
Decorrelation (e.g., via PCA) creates a diagonal covariance matrix. Whitening creates a unit covariance matrix.

Hope this helps.

Greg
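The distinction can be shown in a few lines of NumPy (synthetic data with an invented mixing matrix; decorrelation here is the PCA rotation, and whitening adds the D^(-1/2) rescaling):

```python
import numpy as np

rng = np.random.default_rng(4)

# Correlated synthetic data
X = np.array([[2.0, 0.0],
              [1.0, 1.0]]) @ rng.standard_normal((2, 10000))
cov = np.cov(X)
d, E = np.linalg.eigh(cov)   # cov = E D E^T

# Decorrelation (PCA rotation): covariance becomes diagonal, but the
# diagonal entries (the eigenvalues) are unequal
X_dec = E.T @ X
print(np.round(np.cov(X_dec), 3))

# Whitening: additionally scale by D^(-1/2); covariance becomes the identity
X_wht = np.diag(d ** -0.5) @ E.T @ X
print(np.round(np.cov(X_wht), 3))
```

The first printed matrix is diagonal with distinct variances on the diagonal; the second is the identity, which is exactly Greg's diagonal-vs-unit distinction.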