DSPRelated.com
Forums

FFT's of very long data (bigger than available RAM)

Started by Unknown July 4, 2005
I wish to analyse very long data (e.g. 2 GBytes) as a single FFT
block. Of course, the available RAM is nowhere near that big.
How can I analyse it? I searched the net and didn't find any
library that can do this (all of them require enough RAM to store the
full data).
I mean, how can I split the data so that I can analyse small
blocks, store the results to the hard drive, and afterwards combine the
results?

tazilanigram@yahoo.com wrote:

> I wish to analyse a very long data (eg. 2 GBytes) as a single FFT
> block. Of course, the amount of RAM is not such big.
> How can I analyse that? I searched over net and I didnt found any
> library that can do this (all require RAM to store the full data).
> I mean, how can I split the data in such way that I can analyse small
> blocks, store the results to hard drive and after this to combine the
> results?
I can't remember the name of the algorithm, but there is at least one FFT
method which makes sequential passes through the data. This kind of thing
was essential back in the Olden Days when "core" memory was very small and
data often resided on slow and/or sequential media, like tape.

Googling for "out-of-core FFT" seems to turn up some likely candidates.

Paul
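The sequential-pass method Paul is remembering is usually the "four-step" decomposition (Gentleman-Sande 1966, later Bailey): view the length-n record as an n1 x n2 matrix, DFT the columns, multiply by twiddle factors, DFT the rows, then read the result out transposed. Each step streams over the data in long runs, which is what makes it workable when the rows live on disk. A small in-memory sketch, with naive DFTs for clarity and function names of my own:

```python
import cmath
import math

def dft(x):
    """Naive reference DFT, O(n^2); fine for the small demo sizes here."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * t * k / n) for t in range(n))
            for k in range(n)]

def four_step_fft(x, n1, n2):
    """Four-step FFT of length n = n1*n2.

    Every step streams through the data in sequential runs, which is why
    the same decomposition works when the rows live on disk instead of RAM.
    """
    n = n1 * n2
    assert len(x) == n
    # View x as an n1 x n2 matrix, row-major: a[r][c] = x[n2*r + c].
    a = [x[r * n2:(r + 1) * n2] for r in range(n1)]
    # Step 1: DFT each column (length n1).
    cols = [dft([a[r][c] for r in range(n1)]) for c in range(n2)]
    # Step 2: multiply by the twiddle factors W_n^(r*c).
    b = [[cols[c][r] * cmath.exp(-2j * math.pi * r * c / n) for c in range(n2)]
         for r in range(n1)]
    # Step 3: DFT each row (length n2).
    b = [dft(row) for row in b]
    # Step 4: read the result out "transposed": X[r + n1*c] = b[r][c].
    out = [0j] * n
    for r in range(n1):
        for c in range(n2):
            out[r + n1 * c] = b[r][c]
    return out
```

In the out-of-core setting, steps 1-2 and step 3 each become one pass over a disk file, plus the transposition in step 4 that the survey papers discuss at length.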
You can break your long sequence into smaller parts. What's the lowest
frequency you want to estimate? The full-length FFT buys you a resolution
of 1/length; do you really need it?

Yuri

tazilanigram@yahoo.com wrote:
> I wish to analyse a very long data (eg. 2 GBytes) as a single FFT
> block. Of course, the amount of RAM is not such big.
> How can I analyse that? I searched over net and I didnt found any
> library that can do this (all require RAM to store the full data).
> I mean, how can I split the data in such way that I can analyse small
> blocks, store the results to hard drive and after this to combine the
> results?
I recently ran into the same problem. I didn't find much useful literature.

What I did was divide the input band 0-Fs/2 into segments, mix each one
down to DC, and apply a decimating LPF. Then I used a small complex FFT to
get the spectrum of the segment. I kept the middle 75% of the bins and
discarded the rest. Then I stepped the mixer frequency up and repeated.

By tiling all the segment FFTs together, a good approximation to a large
FFT is achieved.

John
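A rough sketch of one tile of John's scheme, with an intentionally crude boxcar average standing in for his decimating LPF (the function name and parameters here are illustrative, not from the post):

```python
import cmath
import math

def segment_spectrum(x, f0, dec, m):
    """One tile: mix the band around f0 (in cycles/sample) down to DC,
    low-pass and decimate by `dec`, then take a small length-m DFT.

    A crude boxcar average stands in for a proper anti-alias FIR here;
    a real implementation would use a designed decimating filter.
    """
    # Mix down: multiply by exp(-j*2*pi*f0*n) to shift f0 to DC.
    mixed = [xi * cmath.exp(-2j * math.pi * f0 * n) for n, xi in enumerate(x)]
    # Boxcar low-pass + decimate: average each run of `dec` samples.
    low = [sum(mixed[i:i + dec]) / dec
           for i in range(0, len(mixed) - dec + 1, dec)][:m]
    # Small complex DFT of the decimated segment.
    return [sum(low[t] * cmath.exp(-2j * math.pi * t * k / m) for t in range(m))
            for k in range(m)]
```

A pure tone at f0 lands entirely in bin 0 after mixing. The full scheme in the post would keep only the middle 75% of each tile's bins (where the LPF passband is flat) and step f0 across the band.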
Try looking up periodograms.

-Phil
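Following Phil's pointer: an averaged periodogram (Bartlett/Welch style) only ever needs one segment in memory at a time, which is exactly why it suits huge records when you only want the power spectrum rather than a single giant FFT. A pure-Python sketch, function names my own:

```python
import cmath
import math

def dft(x):
    """Naive reference DFT, O(n^2); fine for small segment sizes."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * t * k / n) for t in range(n))
            for k in range(n)]

def averaged_periodogram(samples, seg_len):
    """Bartlett-style averaged periodogram: chop the record into
    non-overlapping segments, window each one, and average |DFT|^2.
    Only one segment ever needs to be in memory at a time."""
    # Hann window to reduce leakage between bins.
    w = [0.5 - 0.5 * math.cos(2 * math.pi * i / seg_len)
         for i in range(seg_len)]
    acc = [0.0] * seg_len
    nseg = 0
    for start in range(0, len(samples) - seg_len + 1, seg_len):
        seg = samples[start:start + seg_len]
        spec = dft([s * wi for s, wi in zip(seg, w)])
        for k in range(seg_len):
            acc[k] += abs(spec[k]) ** 2
        nseg += 1
    return [a / nseg for a in acc]
```

The trade-off Yuri mentioned applies: averaging short segments caps the frequency resolution at roughly 1/seg_len instead of 1/record_length.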

What if I did something like this (for 1-d data):
I have x containing the samples: x[0],x[1],x[2],x[3],x[4]....x[N];
and I would break it into 2 arrays, one for the even samples and one for
the odd samples:
 1) x[0],x[2],x[4]....
 2) x[1],x[3],x[5]....
And repeat recursively until the pieces are small enough to FFT each.
I know that the above procedure is part of the FFT itself, but I don't
know how to combine the results from each block into the bigger blocks.

Ex: say I got 2 frequency responses from the above method: Xc and Xs
are the cosine/sine components of the even samples, Yc and Ys are the
cosine/sine components of the odd samples. Now I want to combine Xc,Xs
and Yc,Ys into a bigger frequency response Fc,Fs.
How can I compute Fc[] and Fs[] if I know Xc,Xs,Yc,Ys?
I guess if I could do this, I could use recursion to do the FFT for
very big 1-d data.
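The combining step being asked about is the standard decimation-in-time butterfly. Writing each spectrum in complex form (F = Fc - j*Fs, and likewise for X and Y), the even-half FFT E and odd-half FFT O, both of length m, merge into the length-2m FFT as F[k] = E[k] + W^k*O[k] and F[k+m] = E[k] - W^k*O[k], with W = exp(-2*pi*j/(2m)). A minimal sketch (naive DFTs just to build the sub-spectra):

```python
import cmath
import math

def dft(x):
    """Naive reference DFT, O(n^2), used here only to build sub-spectra."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * t * k / n) for t in range(n))
            for k in range(n)]

def combine(even_fft, odd_fft):
    """Decimation-in-time butterfly: merge the length-m FFTs of the
    even-indexed and odd-indexed samples into the length-2m FFT of the
    interleaved sequence."""
    m = len(even_fft)
    n = 2 * m
    out = [0j] * n
    for k in range(m):
        t = cmath.exp(-2j * math.pi * k / n) * odd_fft[k]  # W^k * O[k]
        out[k] = even_fft[k] + t        # F[k]   = E[k] + W^k O[k]
        out[k + m] = even_fft[k] - t    # F[k+m] = E[k] - W^k O[k]
    return out
```

Applied recursively this is exactly the Cooley-Tukey FFT, so for the out-of-core case the sub-spectra of each block can be written to disk and merged pairwise in later passes.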

tazilanigram@yahoo.com wrote:

> What if I would do things like this (for 1d data):
> I have the x containing the samples: x[0],x[1],x[2],x[3],x[4]....x[N];
> and I would break in 2 arrays - one for odd samples and other for even
> samples:
>  1) x[0],x[2],x[4]....
>  2) x[1],x[3],x[5]....
> And repeat recursively until I get small enough data and FFT each.
> I know that above procedure is a part of FFT but I don't know how to
> combine the results from each block to the bigger blocks.
>
> Ex: if I got 2 frequencies responses from the above method: Xc and Xs
> are cosine/sine components of even samples, Yc and Ys are cosine/sine
> components of odd samples. Now, I want to combine Xc,Xs and Yc,Ys to a
> bigger frequency response Fc,Fs.
> How can I compote Fc[] and Fs[] if I know Xc,Xs,Yc,Ys?
> I guess if I could do this, I could use recursivity to do the FFT for
> very big 1d data.
You seem to want to reinvent the wheel. Techniques for performing large
out-of-core FFTs were developed many years ago. The following paper
contains references to a number of suitable algorithms:
<http://www.cs.tau.ac.il/~stoledo/Pubs/oocsurvey.pdf>, e.g. Gentleman
and Sande (1966).

Paul
tazilanigram@yahoo.com wrote:

> I wish to analyse a very long data (eg. 2 GBytes) as a single FFT
> block. Of course, the amount of RAM is not such big.
> How can I analyse that? I searched over net and I didnt found any
> library that can do this (all require RAM to store the full data).
> I mean, how can I split the data in such way that I can analyse small
> blocks, store the results to hard drive and after this to combine the
> results?
You can do FFTs of partial data and combine the bits into a longer FFT.
Top-down instead of bottom-up (or whichever way it should go).

While trying to understand the FFT some years ago I found articles on this
in some old parallel computing / distributed computing journals. The same
problem arises for distributed, parallel or multi-processor computing. I
seem to remember it's a little more expensive, but the result is of course
equivalent to a standard in-memory FFT.

Google for FFT plus "Paul Swartztrauber", "parallel computing"; uhm...
"large" and "reordered" seem pertinent too. I see there are new papers
out, but I haven't read them and couldn't estimate their usefulness.
Citeseer may have something, but was inaccessible just now. Try the first
two articles in this search:

http://www.google.com/search?q=Paul+Swartztrauber+fft+%22parallel+computing%22&btnG=Search

homsan
Paul Russell wrote:
> You seem to want to reinvent the wheel. Techniques for performaing large
> out-of-core FFT's were developed many years ago. The following paper
> contains references to a number of suitable algorithms:
> <http://www.cs.tau.ac.il/~stoledo/Pubs/oocsurvey.pdf>, e.g. Gentleman
> and Sande (1966).
> Paul
" ... the algorithm requires only two passes on the data and an out-of-core matrix transposition." Of course! It's so simple. Just write it out to the matrix-transposing disk drive. (Sorry for the sarcasm. I look forward to reading some of the links.) -- Mark Borgerding
Mark wrote:
> " ... the algorithm requires only two passes on the data and an > out-of-core matrix transposition." > > Of course! It's so simple. Just write it out to the matrix-transposing > disk drive.
It's odd that anybody would bother to physically transpose a matrix in memory - after all, it's the same matrix, just different indexing ...
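That is true in RAM, but the point of an *out-of-core* transpose is that a column of a row-major file is scattered across the whole disk, so the data really must be physically rewritten; a blocked copy keeps each pass over long sequential runs. An illustrative in-memory sketch of the tiling idea (names my own):

```python
def blocked_transpose(src, rows, cols, blk):
    """Physically transpose a rows x cols row-major array in blk x blk tiles.

    In RAM you could indeed just swap indices; out of core, tiling matters
    because each tile's reads and writes then touch only `blk` rows of the
    source and destination files at a time, keeping I/O sequential.
    """
    dst = [0] * (rows * cols)
    for r0 in range(0, rows, blk):
        for c0 in range(0, cols, blk):
            # One tile fits in memory; copy it into its transposed position.
            for r in range(r0, min(r0 + blk, rows)):
                for c in range(c0, min(c0 + blk, cols)):
                    dst[c * rows + r] = src[r * cols + c]
    return dst
```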