DSPRelated.com
Forums

SVD vs. PCA

Started by RichD October 12, 2015
I've been studying principal component analysis, and don't get 
it, though I know it's popular.  However, it reminds me of singular 
value decomposition, though I haven't used that in a while.

Can anyone clue me as to the intuition for PCA, when it's 
applicable, as opposed to cases where one would  use SVD?

Or recommend a tutorial paper, or text book, which covers 
these questions?

--
Rich
RichD <r_delaney2001@yahoo.com> wrote:

> I've been studying principal component analysis, and don't get
> it, though I know it's popular.  However, it reminds me of singular
> value decomposition, though I haven't used that in a while.

> Can anyone clue me as to the intuition for PCA, when it's
> applicable, as opposed to cases where one would use SVD?
As well as I remember it, they are pretty much related. Looking at:

https://en.wikipedia.org/wiki/Principal_component_analysis

it seems to me that PCA uses a small number of the larger eigenvectors, and that SVD is, at least in the mechanical engineering case, the way you find them. There might be some optimizations for the case where you only want a small number of components/eigenvectors/modes.
> Or recommend a tutorial paper, or text book, which covers
> these questions?
You can start with the Wikipedia pages for the two. If I remember right, Numerical Recipes has a good description of them, but it has been some years since I read it.

For those who like to think mechanically: an arbitrarily shaped object (that is, one with no symmetries) has three perpendicular axes that it will rotate nicely around. These appear under the coordinate system that diagonalizes the moment of inertia tensor. Rotations about those axes have an angular momentum vector parallel to the rotation axis.

-- glen
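To make the connection above concrete, here is a minimal NumPy sketch (the data here is made up for illustration) showing the usual way PCA is computed via SVD: center the data, take the SVD, and keep only the first few right singular vectors, which are the "small number of the larger eigenvectors" of the covariance matrix.

```python
import numpy as np

# Hypothetical data: 200 observations of 5 correlated variables.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))

# Center the data, then take the SVD of the centered matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Rows of Vt are the principal directions (eigenvectors of the
# covariance matrix), already sorted by decreasing singular value.
# Keeping only the first k rows keeps the dominant components.
k = 2
components = Vt[:k]            # k x 5 principal directions
scores = Xc @ components.T     # data projected onto those directions

# The singular values relate to the covariance eigenvalues by
# lambda_i = s_i**2 / (n - 1).
eigvals = s**2 / (Xc.shape[0] - 1)
```

Truncating the SVD to k components is also where the optimizations mentioned above come in: routines for partial/truncated SVD avoid computing the full decomposition when only a few modes are wanted.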
On 13.10.15 03.08, RichD wrote:
> I've been studying principal component analysis, and don't get
> it, though I know it's popular.  However, it reminds me of singular
> value decomposition, though I haven't used that in a while.
PCA tries to build a given vector of values (V) from a weighted sum (a, b, c) of other vectors called 'components' (Ci):

  V = a * C1 + b * C2 + c * C3 ...

This is a set of linear equations, one for each vector element:

  V[0] = a * C1[0] + b * C2[0] + c * C3[0] ...
  V[1] = a * C1[1] + b * C2[1] + c * C3[1] ...
  ...

Since there are more vector elements than base vectors, the system of equations is overdetermined. So in fact PCA is a least squares fit that chooses the weight factors to minimize the deviation of the result from the target vector.

The only thing that distinguishes this from any other least squares fit is that the calculation is much easier. It does not require an iterative algorithm like Levenberg-Marquardt. All you need is to calculate the pseudoinverse of the component matrix and multiply it by your data. This gives you the exact set of weight factors a, b, c that minimize the error.

E.g. take the complex impedance graph of a probe. If you fit this frequency-dependent graph with the base functions 1, i*f and 1/(i*f), you directly get the ESR, ESL and ESC of the impedance. So you have basically created a quite reasonable LCR meter, e.g. to characterize capacitors.
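The impedance-fitting example above can be sketched in a few lines of NumPy. The component values and frequency range below are made-up assumptions for illustration; the basis here uses the angular frequency w = 2*pi*f, which just rescales the i*f and 1/(i*f) base functions mentioned above.

```python
import numpy as np

# Assumed "true" series-RLC model: Z(w) = ESR + j*w*ESL + 1/(j*w*ESC)
ESR, ESL, ESC = 0.05, 10e-9, 100e-6
f = np.linspace(1e3, 1e6, 500)          # hypothetical measurement grid
w = 2 * np.pi * f
Z = ESR + 1j * w * ESL + 1 / (1j * w * ESC)

# Component matrix: columns are the base functions 1, j*w, 1/(j*w).
C = np.column_stack([np.ones_like(f), 1j * w, 1 / (1j * w)])

# One pseudoinverse multiply gives the least-squares weights directly,
# with no iterative solver (e.g. Levenberg-Marquardt) required.
a, b, c = np.linalg.pinv(C) @ Z
# a ~ ESR, b ~ ESL, c ~ 1/ESC
```

With noisy measured data in place of the synthetic Z, the same single multiply still returns the weights that minimize the squared error.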
> Can anyone clue me as to the intuition for PCA, when it's
> applicable, as opposed to cases where one would use SVD?
No idea about SVD. But PCA is always a solution if your model is linear, simply because it is fast and reliable - well, as long as the model fits your needs.

Marcel
glen herrmannsfeldt <gah@ugcs.caltech.edu> writes:

> RichD <r_delaney2001@yahoo.com> wrote:
>
>> I've been studying principal component analysis, and don't get
>> it, though I know it's popular.  However, it reminds me of singular
>> value decomposition, though I haven't used that in a while.
>
>> Can anyone clue me as to the intuition for PCA, when it's
>> applicable, as opposed to cases where one would use SVD?
>
> As well as I remember it, they are pretty much related.
>
> Looking at:
>
> https://en.wikipedia.org/wiki/Principal_component_analysis
I saw that too. It looks like they're saying PCA is a family of algorithms that perform this decomposition, of which SVD is one.

--
Randy Yates
Digital Signal Labs
http://www.digitalsignallabs.com
On 2015-10-13 03:08, RichD wrote:
> I've been studying principal component analysis, and don't get
> it, though I know it's popular.  However, it reminds me of singular
> value decomposition, though I haven't used that in a while.
>
> Can anyone clue me as to the intuition for PCA, when it's
> applicable, as opposed to cases where one would use SVD?
>
> Or recommend a tutorial paper, or text book, which covers
> these questions?
PCA returns the list of eigenvectors of the *covariance* matrix of your (measured) data, sorted by decreasing eigenvalue. You can use SVD in order to find the eigenvectors and eigenvalues (of the covariance matrix). So, SVD is the tool used to compute PCA.

Just to be complete: under the assumption that you have "good" data (i.e. "compact", i.e. multivariate Gaussian), the eigenvectors and eigenvalues of the covariance matrix represent the directions (in space) along which the data is distributed. So, to put it really simply, it is a first-order (i.e. linear) approximation of the data.

bye,
--
piergiorgio
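The two routes described above can be checked against each other in a short NumPy sketch (the data is synthetic, for illustration only): diagonalizing the covariance matrix and taking the SVD of the centered data give the same sorted eigenvalues, and the same directions up to sign.

```python
import numpy as np

# Hypothetical data set: n observations of d variables.
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 4)) @ rng.normal(size=(4, 4))
Xc = X - X.mean(axis=0)
n = Xc.shape[0]

# Route 1: eigendecomposition of the covariance matrix.
cov = Xc.T @ Xc / (n - 1)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]          # sort by decreasing eigenvalue
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Route 2: SVD of the centered data, never forming the covariance.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
svd_eigvals = s**2 / (n - 1)               # same eigenvalues
# Rows of Vt match the columns of eigvecs up to sign.
```

In practice the SVD route is usually preferred numerically, since it avoids squaring the condition number by never forming the covariance matrix explicitly.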
On 10/12/15 8:08 PM, RichD wrote:
> Or recommend a tutorial paper, or text book, which covers
> these questions?
http://arxiv.org/pdf/1404.1100.pdf

Regards,
Chris