DSPRelated.com
Forums

analyzing large array separately for HF and LF, then re-combining subset of freq bins

Started by all4dsp September 22, 2010
Hi Everyone,

I've got a very large (real) array, on the order of 200 Mpts, which I
suspect is too large to obtain an accurate FFT or IFFT. One thought to
get around this is to look at the low- and high-frequency sides of the
spectrum separately.

I could decimate the array, say 100 times for a more manageable 2Mpt FFT,
to observe the low-frequency side of the spectrum. For the high-frequency
side, I could segment the original array into 100 non-overlapping 2Mpt
regions, and FFT one of the 2Mpt segments. 
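A small numpy sketch of this two-path analysis (everything here is made up for illustration: an assumed 8 GS/s sample rate, a 2^21-point stand-in for the real record, two artificial "spurs", and an ideal FFT-masking lowpass standing in for a real anti-alias filter):

```python
import numpy as np

# Hypothetical stand-in for the ~200 Mpt record: 2**21 points at an
# assumed 8 GS/s sample rate, carrying a 1 MHz and a 2 GHz "spur".
N, fs, R = 2**21, 8e9, 100          # R = decimation factor
t = np.arange(N) / fs
x = np.sin(2*np.pi*1e6*t) + 0.01*np.sin(2*np.pi*2e9*t)

# LF side: anti-alias lowpass (an ideal one via FFT masking, standing in
# for a real filter), keep every R-th sample, then a small FFT.
X = np.fft.rfft(x)
X[N // (2*R) + 1:] = 0              # zero everything above fs/(2*R)
x_lf = np.fft.irfft(X, N)[::R]      # decimated record, ~N/R points
X_lf = np.fft.rfft(x_lf)            # fine resolution, band 0 .. fs/(2*R)

# HF side: FFT one raw N//R-point segment: full band, coarse resolution.
X_hf = np.fft.rfft(x[:N // R])
```

The LF spectrum resolves the 1 MHz spur with resolution fs/N; the HF spectrum sees the 2 GHz spur with resolution fs*R/N.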

In both (LF and HF) spectra, I'm only interested in certain frequency
bins (not known ahead of time, but determined from the resulting
spectrum). My goal is to obtain a time-domain waveform whose frequency
content is derived from only these certain frequency bins, but this
waveform must be steady-state and must be sufficiently long to cover a
full period of the lowest spur detected (spur range is 1 kHz to 4 GHz).

Not being that familiar with DSP, I'm not sure if this is an easy or
difficult problem. Can anyone recommend an approach that might be used? If
it helps, some background for this application is in a previous post
http://www.dsprelated.com/showmessage/130292/1.php

I thought that since both LF- and HF-sides are derived from the same
original array, the phases are related to each other, so perhaps a sum of
sinewaves based on the certain frequency bins (for each of LF- and
HF-sides) could be created (using sum of sinewaves instead of IFFT, since
only a few bins out of many are in play), then simply add these resulting
waveforms together to combine LF- and HF- contributions into one
time-domain waveform. But, I'm wondering whether circular convolution
prevents any steady-state output, and if so, what could be done to avoid
this, etc.
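The sum-of-sinewaves idea described above is, for a single record, exactly equivalent to zeroing all other bins and inverse-transforming; and because each retained bin is an integer number of cycles per record, the reconstruction is periodic in the record length, i.e. steady-state. A sketch (bin numbers, amplitudes, and noise level are hypothetical):

```python
import numpy as np

N = 4096
n = np.arange(N)
# Hypothetical record: two "spurs" plus a little noise
rng = np.random.default_rng(0)
x = (0.5*np.cos(2*np.pi*37*n/N + 0.3)
     + 0.2*np.cos(2*np.pi*405*n/N - 1.1)
     + 0.01*rng.standard_normal(N))

X = np.fft.rfft(x)
bins = [37, 405]          # the "certain frequency bins" found in the spectrum

# Sum of sinusoids: each retained bin k (0 < k < N/2) contributes
#   (2/N) * |X[k]| * cos(2*pi*k*n/N + angle(X[k]))
y = np.zeros(N)
for k in bins:
    y += (2.0/N)*np.abs(X[k])*np.cos(2*np.pi*k*n/N + np.angle(X[k]))

# Identical (to floating-point precision) to zeroing the other bins
# and inverse-transforming:
Xz = np.zeros_like(X)
Xz[bins] = X[bins]
y_ifft = np.fft.irfft(Xz, N)
```

Since each term is an explicit cosine, the sum can also be evaluated over a grid longer than the original record; whether the combined LF + HF waveform is periodic over that longer grid depends on the two analyses sharing a common frequency grid.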

Thanks in advance for any suggestions
all4dsp <all4dsp@n_o_s_p_a_m.comcast.net> wrote:
 
> I've got a very large (real) array, on the order of 200Mpts, that makes it
> unlikely to obtain an accurate FFT or IFFT. One thought to get around this
> is to look at the low- and high-frequency sides of the spectrum
> separately.

(big snip)

Some time ago I was wondering about an FFT of an entire CD, or, more
likely, just one track on a CD.

I don't believe that it is so hard to do even if it is too big to fit
into memory. Note that 200Mpts will fit on many current machines.
Otherwise, it should be possible in the later passes to do it like an
external sort, reading from and writing to disk.

-- glen
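The external-sort analogy corresponds to the standard two-pass (Cooley-Tukey) factorization that out-of-core FFTs use: split N = n1*n2, FFT the "columns", multiply by twiddle factors, FFT the "rows", with each pass streaming the data once. A small in-memory numpy sketch of that decomposition:

```python
import numpy as np

def fft_two_pass(x, n1, n2):
    """FFT of length n1*n2 via two passes of short FFTs -- the same
    decomposition an out-of-core FFT uses, with each pass reading from
    and writing to disk instead of memory."""
    a = x.reshape(n1, n2)                     # a[m1, m2] = x[n2*m1 + m2]
    a = np.fft.fft(a, axis=0)                 # pass 1: n2 FFTs of length n1
    k1 = np.arange(n1)[:, None]
    m2 = np.arange(n2)[None, :]
    a = a * np.exp(-2j*np.pi*k1*m2/(n1*n2))   # twiddle factors between passes
    a = np.fft.fft(a, axis=1)                 # pass 2: n1 FFTs of length n2
    return a.T.reshape(-1)                    # X[k1 + n1*k2] = a[k1, k2]

x = np.random.default_rng(1).standard_normal(1024)
X = fft_two_pass(x, 32, 32)                   # matches np.fft.fft(x)
```

A 200 Mpt transform then never needs a 200 Mpt FFT in one piece, only on the order of sqrt(N)-point FFTs plus two passes over the data.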
>all4dsp <all4dsp@n_o_s_p_a_m.comcast.net> wrote:
>
>> I've got a very large (real) array, on the order of 200Mpts, that makes it
>> unlikely to obtain an accurate FFT or IFFT. One thought to get around this
>> is to look at the low- and high-frequency sides of the spectrum
>> separately.
>
>(big snip)
>
>Some time ago I was wondering about an FFT of an entire CD,
>or, more likely, just one track on a CD.
>
>I don't believe that it is so hard to do even if it is too
>big to fit into memory. Note that 200Mpts will fit on many
>current machines. Otherwise, it should be possible in the
>later passes to do it, like an external sort, reading from
>and writing to, disk.
>
>-- glen
Thanks Glen. Yes, I can do a 268Mpt FFT on my machine in a couple of
minutes, even larger if I'm willing to wait longer, but I didn't know
whether the result was reliable, and didn't want to risk trusting the
results without knowing any better. Kudos to anyone who can shed light
on whether such results remain accurate (or not).
On Sep 22, 4:56 pm, "all4dsp" <all4dsp@n_o_s_p_a_m.comcast.net> wrote:
> Thanks Glen. Yes, I can do an 268Mpt FFT on my machine in a couple minutes,
> even larger if I wait more time, but I didn't know if the result was
> reliable, and thus didn't want to risk the results not knowing any better.
> Kudos to anyone if they can shed light on whether such results remain
> accurate (or not).
well, if your computer didn't blow up, if the call to FFTW (or wherever
you got your FFT) returned 0 or NoErr or something like that, and if
64-bit double-precision words were used throughout (4 GB of memory,
hmmm, that might be a problem; how wide is your pointer variable?), why
wouldn't you believe the results?

r b-j
all4dsp wrote:
> Hi Everyone,
>
> I've got a very large (real) array, on the order of 200Mpts, that makes it
> unlikely to obtain an accurate FFT or IFFT. [*SNIP*]

Why decimate?

You have data, why not analyze?

*HINT* You didn't flag any "real-time" constraint.

WHY "unlikely to obtain an accurate FFT or IFFT"?
>On Sep 22, 4:56 pm, "all4dsp" <all4dsp@n_o_s_p_a_m.comcast.net> wrote:
>>
>> Thanks Glen. Yes, I can do an 268Mpt FFT on my machine in a couple minutes,
>> even larger if I wait more time, but I didn't know if the result was
>> reliable, and thus didn't want to risk the results not knowing any better.
>> Kudos to anyone if they can shed light on whether such results remain
>> accurate (or not).
>
>well, if your computer didn't blow up, if the call to FFTW (or
>wherever you got your FFT) returned 0 or NoErr or something like that,
>if 64-bit double precision words were used throughout (4 GB of memory,
>hmmm that might be a problem; how wide is your pointer variable?), why
>wouldn't you believe the results?
>
>r b-j
Well, because, I don't know what I don't know. However, Rick Lyons made
the following comment on a previous post of mine, which gives me pause.

------
And another thing, as Veronica said in the movie "The Fly",
I'd "Be afraid. Be very afraid" of numerical errors in
performing such large-sized FFTs. In those FFTs you're
expecting the software to accurately estimate the sine and
cosine of angles in the range of 360/2^30 degrees!!!
Are sine and cosine approximation algorithms (in the FFT
software) accurate enough for such super-small angles???
>all4dsp wrote:
>> Hi Everyone,
>>
>> I've got a very large (real) array, on the order of 200Mpts, that makes it
>> unlikely to obtain an accurate FFT or IFFT. [*SNIP*]
>
>Why decimate?
>
>You have data, why not analyze?
>
>*HINT* You didn't flag any "real-time" constraint.
>
>WHY "unlikely to obtain an accurate FFT or IFFT"?
Hi Richard, decimating is intended to cut the array size down to avoid
200+ megapoint FFTs and IFFTs (since I'm not sure whether they'd remain
accurate; see my reply above to r b-j). Getting this to run fast (i.e. a
real-time constraint) is secondary to having accurate results.

I sent a question to the FFTW folks asking what their experience is with
large FFTs (how large before you can no longer trust the results), but
no response yet. Maybe someone in this forum has experience with this.

Thanks
>Well, because, I don't know what I don't know. However, Rick Lyons made
>the following comment on a previous post of mine, which gives me pause.
>
>------
>And another thing, as Veronica said in the movie "The Fly",
>I'd "Be afraid. Be very afraid" of numerical errors in
>performing such large-sized FFTs. In those FFTs you're
>expecting the software to accurately estimate the sine and
>cosine of angles in the range of 360/2^30 degrees!!!
>Are sine and cosine approximation algorithms (in the FFT
>software) accurate enough for such super-small angles???
Running MATLAB on a 64-bit machine, the cosine function runs out of
significant digits to represent an answer at 2*pi/(134e6), i.e. near
2^27 = 134M:
>> cos(2*pi/(2^25))
ans = 0.999999999999982
>> cos(2*pi/(2^26))
ans = 0.999999999999996
>> cos(2*pi/(2^27))
ans = 0.999999999999999
>> cos(2*pi/(2^28))
ans = 1

Could this impact FFT or IFFT results near 134M points? Or, is there a
better way to answer this question?
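What runs out here is not the cosine routine but the double-precision format near 1.0: doubles near 1 are spaced about 1.1e-16 apart, so for theta around 2*pi/2^28 the quantity 1 - cos(theta) (about 2.7e-16) retains barely one significant digit, and at 2*pi/2^30 it is lost entirely. The small quantity itself is still perfectly computable via the identity 1 - cos(theta) = 2*sin(theta/2)^2, which is one reason a carefully written FFT need not lose precision in its twiddle factors. A quick check (Python's math library here, standing in for MATLAB):

```python
import math

theta = 2*math.pi/2**30        # the angle scale Rick Lyons mentioned

# True cos(theta) is within half an ulp of 1.0, so a correctly rounded
# cosine returns exactly 1.0 -- every digit of 1 - cos(theta) is gone.
c = math.cos(theta)

# Computing the small quantity directly keeps full relative precision:
#   1 - cos(theta) = 2*sin(theta/2)**2  (about theta**2/2, ~1.7e-17)
one_minus_cos = 2*math.sin(theta/2)**2
```

So the display of "ans = 1" is a representation limit of cos itself, not evidence that an FFT at that size must be inaccurate; it depends on how the library generates its twiddles.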
On Sep 22, 11:44 pm, "all4dsp" <all4dsp@n_o_s_p_a_m.comcast.net> wrote:
> >Well, because, I don't know what I don't know. However, Rick Lyons made
> >the following comment on a previous post of mine, which gives me pause.
>
> >------
> >And another thing, as Veronica said in the movie "The Fly",
> >I'd "Be afraid. Be very afraid" of numerical errors in
> >performing such large-sized FFTs. In those FFTs you're
> >expecting the software to accurately estimate the sine and
> >cosine of angles in the range of 360/2^30 degrees!!!
> >Are sine and cosine approximation algorithms (in the FFT
> >software) accurate enough for such super-small angles???
>
> Running Matlab on a 64bit machine, the cosine function runs out of
> significant digits to represent an answer at 2*pi/(134e6), e.g. 2^27=134M.
>
> >> cos(2*pi/(2^25))
> ans = 0.999999999999982
> >> cos(2*pi/(2^26))
> ans = 0.999999999999996
> >> cos(2*pi/(2^27))
> ans = 0.999999999999999
> >> cos(2*pi/(2^28))
> ans = 1
>
> Could this impact FFT or IFFT results near 134M points? Or, is there a
> better way to answer this question?
I don't see why you'd have a problem as long as your word size is
adequate. Radio astronomers have routinely worked with megapoint
elements. I remember a lecture by a Harvard astronomer who was doing
4 Mpoint FFTs, and that was some 15 years ago.

Other people are looking at even bigger FFTs (e.g. 2^30 below):

http://images.apple.com/acg/pdf/FFTdistIntel20070816.pdf

I also found this (only a mere 64 M points):

http://www.iidsp.com/forum/viewtopic.php?p=3785&sid=c9be2230be8190439cd21ac3ab2bf3b1

I also think the topic of FFT accuracy when computing very large arrays
has come up before. I tried searching for the appropriate thread(s), but
gave up after a few minutes. I'll leave that up to you.

Kevin McGee
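One practical way to answer the accuracy question for a specific FFT library and size is to test invariants that hold exactly in exact arithmetic, such as round-trip reconstruction and Parseval's theorem. A numpy sketch (sizes kept modest here; the same checks can be run at the actual 200 Mpt size):

```python
import numpy as np

# Self-tests: these don't prove correctness, but a well-behaved
# double-precision FFT shows errors near machine epsilon that grow
# only slowly with N.
rng = np.random.default_rng(0)
for p in (10, 14, 18, 22):               # up to 4 Mpt here
    n = 2**p
    x = rng.standard_normal(n)
    X = np.fft.rfft(x)
    # Round-trip: irfft(rfft(x)) should reproduce x almost exactly
    rt = np.max(np.abs(np.fft.irfft(X, n) - x))
    # Parseval: sum|x|^2 == (1/n)*sum|X|^2 (real FFT: interior bins twice)
    px = np.sum(x**2)
    pX = (X[0].real**2 + 2*np.sum(np.abs(X[1:-1])**2) + X[-1].real**2)/n
    print(f"N=2^{p}: roundtrip={rt:.2e}, parseval_rel={abs(px-pX)/px:.2e}")
```

In double precision the observed errors typically sit near machine epsilon and grow only on the order of sqrt(log N) in RMS for a well-implemented FFT, which is consistent with the large-FFT results in the links above.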
all4dsp wrote:
>> all4dsp wrote:
>>> Hi Everyone,
>>>
>>> I've got a very large (real) array, on the order of 200Mpts, that makes it
>>> unlikely to obtain an accurate FFT or IFFT. [*SNIP*]
>>
>> Why decimate?
>>
>> You have data, why not analyze?
>>
>> *HINT* You didn't flag any "real-time" constraint.
>>
>> WHY "unlikely to obtain an accurate FFT or IFFT"?
>
> Hi Richard, decimate is intended to cut down the array size to avoid 200+
> megapoint FFT and IFFTs (since I'm not sure if they'd remain accurate; see
> my reply above to r b-j). Getting this to run fast (i.e. real-time
> constraint) is secondary to having accurate results.
>
> I sent in a question to the FFTW folks asking what their experience is
> with large sized FFTs (how large before can no longer trust results), but
> no response yet. But maybe someone in this forum has some experience with
> this.
>
> Thanks
If all else fails, *THINK*: when doth ignoring data give *MORE*... ?

P.S. Yepp, a leading question :)