DSPRelated.com
Forums

Non-orthogonal decompositions

Started by Ryan M November 20, 2008
I was just thinking about finding optimal decompositions for signal
mixtures where the basis functions are not necessarily orthogonal, but
would make some kind of physical sense. e.g. I'm sure cardiac signals
are periodic and could be decomposed into contributions from various
valves etc. Accelerometer measurements on a runner could be decomposed
into various physical components (hip, knee, ankle etc.). All of these
are periodic signals, yet the Fourier spectrum decomposes into more
basis functions (sinusoids) than are useful.

I suppose there are two problems: given a set of signal measurements,
determine the most useful (I suppose smallest number) of basis
functions (what happens with samples that would be considered
outliers...?)

And then, given a particular sample, find the coefficients of the non-orthogonal basis functions.
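For the second problem, a least-squares fit is the standard tool; orthogonality of the basis is convenient but not required, only linear independence. A minimal sketch (NumPy, with a made-up two-column basis matrix B and a synthetic sample y):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)

# Two made-up, non-orthogonal basis signals (phase-shifted sines).
B = np.column_stack([np.sin(2 * np.pi * t),
                     np.sin(2 * np.pi * t + 0.3)])

# Synthetic sample with known coefficients plus a little noise.
true_coeffs = np.array([1.5, -0.7])
y = B @ true_coeffs + 0.01 * rng.standard_normal(len(t))

# Least squares does not require orthogonality, only linear
# independence of the basis columns.
coeffs, *_ = np.linalg.lstsq(B, y, rcond=None)
print(coeffs)  # close to [1.5, -0.7]
```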

I don't know if this makes any sense. Any suggestions for theory that
could be useful?

Ryan
On 20 Nov, 10:52, Ryan M <ryan.mitch...@gmail.com> wrote:
> I was just thinking about finding optimal decompositions for signal
> mixtures where the basis functions are not necessarily orthogonal, but
> would make some kind of physical sense.
...
> I don't know if this makes any sense. Any suggestions for theory that
> could be useful?
It doesn't make sense. One of the famous theorems of Real Analysis (I *think* it is one by Weierstrass, but I'm not sure) states that any function (or data sequence) can be represented by any complete set of linearly independent basis vectors. So if you were able to come up with a sequence of basis vectors like you suggest, you would be able to fit any data sequence to them. Motor engine noise would easily be described in terms of contributions from the cardiac system. Or in terms of leg joints.

The point is that while you can fit data to physical 'signatures', there is no way an analysis would come back with a result like "sorry, what you are trying to do doesn't make sense - you are fitting motor noise to a heart-sound basis."

So stick with the Fourier basis. It takes you exactly as far as it makes sense to go. And no further.

Rune
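The point above is easy to check numerically: any n linearly independent vectors form a complete basis for length-n sequences, so every signal gets an exact fit regardless of what the basis is supposed to "mean". An illustrative sketch in NumPy, using a random matrix as a stand-in basis:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 8

# A random n x n matrix is (almost surely) a complete set of n linearly
# independent "basis" vectors, whatever physical meaning we assign them.
B = rng.standard_normal((n, n))

# Any signal whatsoever then has an exact representation in that basis.
y = rng.standard_normal(n)
coeffs = np.linalg.solve(B, y)

print(np.allclose(B @ coeffs, y))  # True: the fit is always perfect
```

The perfect fit says nothing about whether the basis is physically meaningful, which is exactly the danger being described.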
On Nov 20, 12:27 pm, Rune Allnor <all...@tele.ntnu.no> wrote:
> > It doesn't make sense.
> >
> > One of the famous theorems of Real Analysis (I *think* it
> > is one by Weierstrass, but I'm not sure) states that any
> > function (or data sequence) can be represented by any
> > complete set of linearly independent basis vectors.
> >
> > So stick with the Fourier basis. It takes you exactly as far
> > as it makes sense to go. And no further.
Okay, I need some more convincing. Say I had some prototypical signals for e.g. crank rotation, valve opening, ring slap etc. (derived theoretically for the sake of argument). I don't see why I couldn't say, given two mixtures, that, e.g.

Engine 1 = 0.5 * crank + 0.2 * valve + 0.7 * ring
Engine 2 = 0.8 * crank + 0.5 * valve + 0.3 * ring
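To make the mixtures concrete: a sketch with synthetic stand-ins for the crank, valve, and ring prototypes (the waveforms are invented for illustration). Because these mixtures are noise-free and the prototypes are linearly independent, a least-squares solve recovers the stated weights exactly:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
# Invented prototype signals standing in for crank, valve, ring.
crank = np.sin(2 * np.pi * 3 * np.arange(n) / n)
valve = np.sign(np.sin(2 * np.pi * 5 * np.arange(n) / n))
ring = rng.standard_normal(n)  # broadband "slap" stand-in

P = np.column_stack([crank, valve, ring])
engine1 = 0.5 * crank + 0.2 * valve + 0.7 * ring
engine2 = 0.8 * crank + 0.5 * valve + 0.3 * ring

# Noise-free mixtures in the span of independent prototypes:
# least squares recovers the mixing weights exactly.
w1, *_ = np.linalg.lstsq(P, engine1, rcond=None)
w2, *_ = np.linalg.lstsq(P, engine2, rcond=None)
print(np.round(w1, 3), np.round(w2, 3))
```

The catch raised in the thread still applies: the same solve would also "succeed" on data that has nothing to do with engines.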
On 20 Nov, 11:49, Ryan M <ryan.mitch...@gmail.com> wrote:
> On Nov 20, 12:27 pm, Rune Allnor <all...@tele.ntnu.no> wrote:
> > It doesn't make sense.
> >
> > One of the famous theorems of Real Analysis (I *think* it
> > is one by Weierstrass, but I'm not sure) states that any
> > function (or data sequence) can be represented by any
> > complete set of linearly independent basis vectors.
> >
> > So stick with the Fourier basis. It takes you exactly as far
> > as it makes sense to go. And no further.
>
> Okay, I need some more convincing. Say I had some prototypical signals
> for e.g. crank rotation, valve opening, ring slap etc. (derived
> theoretically for the sake of argument).
>
> I don't see why I couldn't say, given two mixtures, that, e.g.
> Engine 1 = 0.5 * crank + 0.2 * valve + 0.7 * ring
> Engine 2 = 0.8 * crank + 0.5 * valve + 0.3 * ring
You could, but it would be extremely stupid to do so. The reason is that you would be able to represent any one of these signals in terms of some or all of the others + noise.

So the risk is that you might start interpreting the representation above as if it makes sense. Which would be a very bad thing, if you start acting as if it does.

Rune
On Nov 20, 2:13 pm, Rune Allnor <all...@tele.ntnu.no> wrote:
> > You could, but it would be extremely stupid to do so.
> >
> > The reason is that you would be able to represent any
> > one of these signals in terms of some or all of the
> > others + noise.
> >
> > So the risk is that you might start interpreting the
> > representation above as if it makes sense. Which would
> > be a very bad thing, if you start acting as if it does.
Okay, but what if the noise is known to be below some low threshold and the "basis" signals are at least semi-orthogonal (which I'm sure would occur for some problem domains, particularly if the number of "basis" functions is small compared to the sample length). Surely one can assign at least some probability to a given combination producing a certain mixture? This sounds too useful to be impossible.

I could be mistaken - but isn't this similar to the procedure following mass spectrometry? I mean, the spectrometer gives concentrations of certain elements or molecule fragments, but there's still some probabilistic detective work to determine which complex molecules/compounds were originally present?
On 20 Nov, 13:29, Ryan M <ryan.mitch...@gmail.com> wrote:
> On Nov 20, 2:13 pm, Rune Allnor <all...@tele.ntnu.no> wrote:
> > You could, but it would be extremely stupid to do so.
> >
> > The reason is that you would be able to represent any
> > one of these signals in terms of some or all of the
> > others + noise.
> >
> > So the risk is that you might start interpreting the
> > representation above as if it makes sense. Which would
> > be a very bad thing, if you start acting as if it does.
>
> Okay, but what if the noise is known to be below some low threshold
It isn't.
> and the "basis" signals are at least semi-orthogonal (which I'm sure
> would occur for some problem domains, particularly if the number of
> "basis" functions is small compared to the sample length).
It doesn't change the main problem: that any basis function fits any data.
> Surely one can assign at least some probability to a given combination
> producing a certain mixture? This sounds too useful to be impossible.
Lots of people have wasted entire careers on that proposition. If you are so determined to become one more, I will not prevent you.
> I could be mistaken -
You are.
> but isn't this similar to the procedure
> following mass spectrometry?
'Similar' is not 'the same.'
> I mean, the spectrometer gives
> concentrations of certain elements or molecule fragments, but there's
> still some probabilistic detective work to determine which complex
> molecules/compounds were originally present?
I don't know how mass spectrometers work. However, the mass of a molecule is a discrete entity. Either it is 1e-15 kg or it isn't. If you find a molecule fragment with a given mass, there are only so many combinations of atoms that can end up in that area.

Note the fundamentals here:

- There is a finite, small number of possible atoms
- There is a finite number of atom constellations
- Some such constellations are far more likely than others. H_2O, for instance, is a common substance whereas HeO is an impossible combination. This simple fact resolves a possible ambiguity.

As far as I know, one tests for known substances. So if one comes up with some new substance not presently in the library, the instrument will not tell you what it is.

Note also that the mass spectrometer works on a *molecular* level. One doesn't just shovel a lump of goo into the machine and test for its constituents.

Rune
These look interesting:

http://en.wikipedia.org/wiki/Matching_pursuit
http://davis.wpi.edu/~matt/courses/nland/node4.html
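The matching pursuit idea from the first link can be sketched in a few lines of NumPy. The dictionary here is a toy random one, not a physically meaningful set of prototypes:

```python
import numpy as np

def matching_pursuit(y, D, n_iter=50):
    """Greedy matching pursuit: repeatedly pick the dictionary atom
    (column of D, assumed unit-norm) most correlated with the current
    residual, record its coefficient, and subtract its projection."""
    residual = y.astype(float).copy()
    coeffs = np.zeros(D.shape[1])
    for _ in range(n_iter):
        corr = D.T @ residual
        k = int(np.argmax(np.abs(corr)))
        coeffs[k] += corr[k]
        residual = residual - corr[k] * D[:, k]
    return coeffs, residual

# Toy dictionary: 16 unit-norm random atoms in R^64, non-orthogonal.
rng = np.random.default_rng(2)
D = rng.standard_normal((64, 16))
D /= np.linalg.norm(D, axis=0)

# A signal built from two atoms; MP should concentrate energy on them.
y = 3.0 * D[:, 4] - 2.0 * D[:, 9]
coeffs, residual = matching_pursuit(y, D)
print(np.linalg.norm(residual))  # small compared to ||y||
```

Because the atoms are not orthogonal, each greedy step partially undoes earlier ones, which is why MP iterates rather than projecting once.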
On Nov 20, 3:52 am, Ryan M <ryan.mitch...@gmail.com> wrote:
> I was just thinking about finding optimal decompositions for signal
> mixtures where the basis functions are not necessarily orthogonal, but
> would make some kind of physical sense. e.g. I'm sure cardiac signals
> are periodic and could be decomposed into contributions from various
> valves etc. Accelerometer measurements on a runner could be decomposed
> into various physical components (hip, knee, ankle etc.). All of these
> are periodic signals, yet the Fourier spectrum decomposes into more
> basis functions (sinusoids) than are useful.
>
> I suppose there are two problems: given a set of signal measurements,
> determine the most useful (I suppose smallest number) of basis
> functions (what happens with samples that would be considered
> outliers...?)
>
> And then, given a particular sample, find the coefficients of the
> non-orthogonal basis functions.
>
> I don't know if this makes any sense. Any suggestions for theory that
> could be useful?
>
> Ryan
Maybe what you are looking for is what is called "sparse approximation". That said, you seem to care about very model-dependent and context-dependent things that are almost parametric, and sparse approximation may not do well for those. Use with caution.

Julius
On 20 Nov, 14:06, Ryan M <ryan.mitch...@gmail.com> wrote:
> These look interesting:
>
> http://en.wikipedia.org/wiki/Matching_pursuit
> http://davis.wpi.edu/~matt/courses/nland/node4.html
As I said, lots of people have wasted lots of time on these things. If you actually try these ideas with data, you will find that any data set can be fitted to any basis. The only thing you gain from this stuff, that you don't get with, say, the Fourier transform, is confusion.

Rune
On Nov 20, 4:52 am, Ryan M <ryan.mitch...@gmail.com> wrote:
> I was just thinking about finding optimal decompositions for signal
> mixtures where the basis functions are not necessarily orthogonal, but
> would make some kind of physical sense. e.g. I'm sure cardiac signals
> are periodic and could be decomposed into contributions from various
> valves etc. Accelerometer measurements on a runner could be decomposed
> into various physical components (hip, knee, ankle etc.). All of these
> are periodic signals, yet the Fourier spectrum decomposes into more
> basis functions (sinusoids) than are useful.
>
> I suppose there are two problems: given a set of signal measurements,
> determine the most useful (I suppose smallest number) of basis
> functions (what happens with samples that would be considered
> outliers...?)
>
> And then, given a particular sample, find the coefficients of the
> non-orthogonal basis functions.
>
> I don't know if this makes any sense. Any suggestions for theory that
> could be useful?
>
> Ryan
Ryan,

Rune and Julius have given some good warnings about potential problems. It sounds as though you want to take a complicated signal and be able to write it as a sum of a small number of well-known signals. Rune is right that once you have a complete basis for the signal, you can convert it to any other complete basis you want. And a DFT is a complete basis.

You can set up a matrix problem to find the amounts of each mechanical basis function, but noise and other errors can easily cause the solution to say you have a negative amount of one of the bases. What does that mean? The mechanical process is absorbing that sound instead of making that sound? This is where you can easily get into trouble. This is the type of problem where you likely have a constraint on the coefficients (the amount of each basis function) to be nonnegative real numbers.

To convince you of the difficulty, try getting a signal, or making one in the computer, with a little distortion and noise but only using two non-orthogonal basis functions. Then try to decompose the complicated, corrupted signal into the two components. Your coefficient matrix will be two by two. If your bases were orthogonal, your matrix would be diagonal, and finding the amount of one basis function would be independent of finding the amount of the other one. But for non-orthogonal bases you have to worry about the interaction terms (the off-diagonal parts in the matrix), plus you need to add in your nonnegative constraints for the amounts of each basis. Try it - it is tricky even for this simple case!

Clay
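The suggested two-basis experiment can be sketched with NumPy and SciPy's nonnegative least-squares solver; the signals and noise level here are made up for illustration:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(3)
t = np.linspace(0.0, 1.0, 400)

# Two deliberately non-orthogonal basis functions (phase-shifted
# sines, correlation roughly cos(0.4) ~ 0.92).
b1 = np.sin(2 * np.pi * 4 * t)
b2 = np.sin(2 * np.pi * 4 * t + 0.4)
B = np.column_stack([b1, b2])

# Noisy mixture whose true amounts are 1.0 and 0.0 (both nonnegative).
y = 1.0 * b1 + 0.0 * b2 + 0.2 * rng.standard_normal(len(t))

# Plain least squares: the off-diagonal terms of B.T @ B couple the
# two estimates, and noise can push the second coefficient negative
# (run-dependent).
unconstrained, *_ = np.linalg.lstsq(B, y, rcond=None)

# Nonnegative least squares enforces the physical constraint.
constrained, rnorm = nnls(B, y)
print("unconstrained:", unconstrained)
print("constrained:  ", constrained)
```

The off-diagonal coupling is exactly the interaction term described above: the closer the two bases are to parallel, the more the noise gets amplified into the coefficient estimates.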