
Large FFT vs Many FFTs

Started by Edison April 5, 2007
Hi DSP Gurus,

I've been asked to research a DSP system to take an FFT to detect harmonic
distortion.

The base frequency will be 110kHz and I need to detect up to the 5th
harmonic. Therefore I will need a sample rate of at least 1.2MHz.

The requirements are to take 1 second's worth of data and FFT it, giving a
1.2 million point FFT. The requirements state that the resulting very
narrow bin width will reduce the effect of noise. Is this a sensible thing
to do?

I have a feeling it would be better to take a number of smaller FFTs (say
1k point), calculate the magnitudes and average them. Thus reducing the
effect of random noise and giving a more flexible system that requires a
lot less memory. Is this the right way or would the method above be more
beneficial?

Thanks in advance for your help.



_____________________________________
Do you know a company who employs DSP engineers?  
Is it already listed at http://dsprelated.com/employers.php ?
Edison wrote:
> [snip]
Edison,

It seems it would be easier to use a bank of five band-pass filters. This requires much less memory than the FFT approach.

Because of the very large number of samples to be averaged, to find the average signal levels you could use a fairly crude approach of averaging the absolute values of the filtered signal samples, for each of the five filters. As long as the sampling rate is not locked to the fundamental this should be OK.

Also, as the fundamental will have the same scaling error as the harmonics, you could probably safely leave out applying a scaling factor to convert 'average value' to RMS.

If you really wanted the maximum possible precision you would convert the signal to analytic, use a bank of complex filters, calculate the magnitude of each of the five complex signals at each sample period, and compute the five average magnitudes over a time of one second.

Regards,
John
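A minimal sketch of this filter-bank approach in Python/SciPy might look like the following. The 4th-order Butterworth filters, the 2 kHz bandwidth, and the test-signal levels are illustrative assumptions, not values from the thread:

```python
import numpy as np
from scipy import signal

FS = 1.2e6          # sample rate (Hz), from the original post
F0 = 110e3          # fundamental (Hz)
BANDWIDTH = 2e3     # per-filter bandwidth -- an assumption, tune to taste

def harmonic_levels(x, fs=FS, f0=F0, n_harmonics=5, bw=BANDWIDTH):
    """Average absolute level in a narrow band around each harmonic."""
    levels = []
    for k in range(1, n_harmonics + 1):
        fc = k * f0
        # 4th-order Butterworth band-pass centered on the k-th harmonic
        sos = signal.butter(4, [fc - bw / 2, fc + bw / 2],
                            btype="bandpass", fs=fs, output="sos")
        y = signal.sosfilt(sos, x)
        levels.append(np.mean(np.abs(y)))
    return levels

# Example: fundamental plus a -40 dB 3rd harmonic, buried in a little noise
t = np.arange(int(0.05 * FS)) / FS
x = np.sin(2 * np.pi * F0 * t) + 0.01 * np.sin(2 * np.pi * 3 * F0 * t)
x += 0.001 * np.random.default_rng(0).standard_normal(t.size)
levels = harmonic_levels(x)
```

Per John's note, `levels` are average-absolute values, not RMS; since the fundamental carries the same scaling, harmonic ratios come out unchanged.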
John wrote:
> [snip]

Thanks John,

Just to complicate things, there is a requirement for a second fundamental at 90kHz with its associated 5 harmonics. Is the solution you suggest still feasible, or am I back to a big FFT?

Ed
"Edison" <bell561@btinternet.com> wrote in
news:tqudndhT08OKTonbnZ2dnUVZ_vmqnZ2d@giganews.com: 

> [snip]
You could use a Welch periodogram -- 50% overlapping epochs of data. It's fairly close to just averaging many FFTs, but you get twice as many epochs. You'll find a description in Bendat and Piersol.

--
Scott
Reverse name to reply
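For anyone wanting to try this, SciPy's `welch` implements exactly this scheme (Hann-windowed segments with 50% overlap, by default). The segment length and test-signal parameters below are illustrative assumptions:

```python
import numpy as np
from scipy import signal

FS = 1.2e6   # sample rate from the thread
rng = np.random.default_rng(1)
t = np.arange(int(0.2 * FS)) / FS
# 110 kHz tone in fairly strong white noise
x = np.sin(2 * np.pi * 110e3 * t) + 0.5 * rng.standard_normal(t.size)

# Welch's method: 1024-point Hann-windowed segments, 50% overlap (defaults)
f, pxx = signal.welch(x, fs=FS, nperseg=1024)

peak_freq = f[np.argmax(pxx)]   # should land on the bin nearest 110 kHz
```

Note the trade Rick describes below: 1024-point segments give ~1.17 kHz bins, far coarser than the 1 Hz bins of a 1.2M-point FFT, but the averaging over segments steadies the estimate.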
On Apr 5, 9:42 am, "Edison" <bell...@btinternet.com> wrote:
> [snip]
The method that John suggested would be good for a case where you just need to observe a few known harmonics. However, as you asked in your first message, your idea of averaging a number of small FFTs is a reasonable solution. In fact, Welch's method of spectral estimation is very similar to that: it involves splitting a long signal into a number of smaller chunks, perhaps with overlap. Each chunk is windowed and FFTed, and the spectrum estimate is taken as the average of all of the FFT outputs.

Parametric methods may be closer to optimal in some sense, but Welch's method is simple and it works relatively well if you need an estimate of the entire spectrum. In your case, it sounds like you don't, so the bandpass filter bank solution would probably work well; but if the number of frequencies that you want to observe increases, you might look at a different solution.

Jason
On Thu, 05 Apr 2007 05:44:39 -0500, "Edison" <bell561@btinternet.com>
wrote:

> [snip]
Hi Edison,

You ask a very sensible question. Yes, performing very large FFTs will "pull" your desired harmonic spectral components up above the background spectral noise, but the variance of repeated measurements of some spectral component will be large. (I.e., the measured magnitudes of some spectral component, that you obtain by repeating your measurements, will fluctuate by a large amount.)

So your thought of averaging multiple FFT magnitudes is certainly a good idea. If the variance of your repeated measurements of some spectral component is SIGMASQUARED, then the variance of your repeated measurements of some "average of K FFTs" spectral component will be SIGMASQUARED/K. So averaging multiple FFT results will yield more accurate measurements (estimations) of the magnitude of some spectral component. Of course, smaller-sized FFTs will NOT have the "fine-grained" frequency sample spacing that is provided by large-sized FFTs.

The topic you're exploring has been described in a zillion technical papers and there's much information available on the Internet. [Edison, to read those papers, it'll help if you have a Ph.D. in mathematics. :-) That's because a thorough analysis of this topic forces one to dive deep in the cold and murky waters of statistics.] You might search the Web for "Bartlett's Method" and "Welch's Method" as a start.

Ya' know what I'd do if I were you? I'd try both schemes (single large-sized FFTs, and averaging multiple smaller-sized FFTs), and do my best to see which method yields the "best" results based on your signals and your noise.

Good Luck,
[-Rick-]
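The SIGMASQUARED/K claim is easy to check numerically: averaging K magnitude spectra should cut the measurement variance by roughly a factor of K. A small simulation (the FFT size, K, tone bin, and trial count below are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(42)
N, K, TRIALS = 256, 16, 400
BIN = 32   # tone placed exactly on bin 32

def tone_plus_noise(n):
    t = np.arange(n)
    return np.cos(2 * np.pi * BIN * t / N) + rng.standard_normal(n)

single, averaged = [], []
for _ in range(TRIALS):
    # one FFT magnitude measurement at the tone bin
    single.append(np.abs(np.fft.rfft(tone_plus_noise(N))[BIN]))
    # average of K independent FFT magnitudes at the same bin
    mags = [np.abs(np.fft.rfft(tone_plus_noise(N))[BIN]) for _ in range(K)]
    averaged.append(np.mean(mags))

# Variance of the single-FFT estimate vs. the K-average estimate:
ratio = np.var(single) / np.var(averaged)   # roughly K, i.e. about 16
```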

Edison wrote:

> [snip]
If your goal is measuring the harmonic distortion, you don't need the FFT. All you need is a bandstop filter and a comb filter.

Vladimir Vassilevsky
DSP and Mixed Signal Design Consultant
http://www.abvolt.com
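One common way to realize the bandstop half of this idea is a notch at the fundamental: whatever survives the notch is distortion plus noise. (So this sketch measures THD+N rather than pure THD, and it omits the comb-filter stage Vladimir mentions. The notch Q and the edge trim, which discards filter transients, are arbitrary assumptions.)

```python
import numpy as np
from scipy import signal

FS = 1.2e6   # sample rate from the thread
F0 = 110e3   # fundamental

def thd_n(x, fs=FS, f0=F0, q=30.0, trim=2000):
    """THD+N: notch out the fundamental, compare residual RMS to total RMS."""
    b, a = signal.iirnotch(f0, q, fs=fs)
    residual = signal.filtfilt(b, a, x)[trim:-trim]  # drop edge transients
    ref = x[trim:-trim]
    return np.sqrt(np.mean(residual**2) / np.mean(ref**2))

t = np.arange(int(0.02 * FS)) / FS
clean = np.sin(2 * np.pi * F0 * t)
distorted = clean + 0.01 * np.sin(2 * np.pi * 2 * F0 * t)  # 1% 2nd harmonic
```

Here `thd_n(distorted)` comes out near the injected 1%, and `thd_n(clean)` near zero.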
On Thu, 05 Apr 2007 05:44:39 -0500, "Edison" <bell561@btinternet.com>
wrote:

> [snip]
Edison,

I don't know the details of your application but something just occurred to me. I once worked on a "real-time" spectrum analysis system where a signal's spectrum was output for display on a CRT screen. To reduce the fluctuations of the displayed spectral magnitudes, each real-time sequence of FFT bin magnitudes was passed through an "exponential averager". That kind of averaging required much less memory and fewer computations than standard averaging. (Just something for you to think about.)

See Ya',
[-Rick-]
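An exponential averager is a one-line recurrence per bin, avg = alpha*new + (1 - alpha)*avg, needing only a single frame of state instead of a buffer of past frames. A small sketch (the alpha, FFT size, and test signal are illustrative assumptions):

```python
import numpy as np

def exponential_average(frames, alpha=0.1):
    """Exponentially average successive FFT magnitude frames.
    avg[n] = alpha * frame[n] + (1 - alpha) * avg[n-1]
    Needs only one frame of memory, unlike block averaging."""
    avg = np.array(frames[0], dtype=float)
    for frame in frames[1:]:
        avg = alpha * np.asarray(frame, dtype=float) + (1 - alpha) * avg
    return avg

# 200 noisy magnitude frames of a tone sitting on bin 16
rng = np.random.default_rng(7)
N = 128
frames = [np.abs(np.fft.rfft(np.cos(2 * np.pi * 16 * np.arange(N) / N)
                             + rng.standard_normal(N)))
          for _ in range(200)]

smoothed = exponential_average(frames)   # tone bin stands out steadily
```

Smaller alpha smooths harder but responds more slowly to spectrum changes, which is exactly the display-steadying trade Rick describes.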
On Apr 5, 7:01 am, R.Lyons@_BOGUS_ieee.org (Rick Lyons) wrote:
> Yes, performing very large FFTs will "pull" your
> desired harmonic spectral components up above the
> background spectral noise but the variance of repeated
> measurements of some spectral component will be large.
> [snip]
If the spectral component is fixed in frequency, the variance of the power spectrum of the tone will be determined only by SNR re: 1 bin. It is the power spectrum of noise contributions that has a large variance.

A large error source for a fixed-frequency tone is the scalloping loss due to the tone not being bin centered. For a rectangular window (unwindowed) the peak error is about 3.9 dB and will vary among the harmonics unless they are all bin centered. A good window in this case may be one of the flattop (in a frequency-domain sense) windows. Modern flattop windows can have 100 dB sidelobe rejection and scalloping loss of less than 0.01 dB. Be careful of older flattop windows that may have only 40 dB or so rejection.
> So your thought of averaging multiple FFT magnitudes is
> certainly a good idea.
> ...
> [-Rick-]
Dale B. Dalrymple http://dbdimages.com
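Dale's scalloping numbers are easy to reproduce: put a tone exactly halfway between bins (the worst case) and compare the gain-corrected FFT peak for a rectangular window against SciPy's `flattop` window. The FFT size here is an arbitrary choice:

```python
import numpy as np
from scipy import signal

N = 1024
t = np.arange(N)

def peak_db(freq_bins, window):
    """Peak FFT magnitude (dB) of a unit tone at `freq_bins` bins,
    corrected for the window's coherent gain."""
    w = window(N)
    x = np.cos(2 * np.pi * freq_bins * t / N)
    mag = np.abs(np.fft.rfft(x * w)) / (w.sum() / 2)
    return 20 * np.log10(mag.max())

# Scalloping loss: bin-centered tone vs. tone exactly between two bins
rect_loss = peak_db(100.0, np.ones) - peak_db(100.5, np.ones)
flat_loss = (peak_db(100.0, signal.windows.flattop)
             - peak_db(100.5, signal.windows.flattop))
```

`rect_loss` lands near the 3.9 dB figure Dale quotes, while `flat_loss` is a few hundredths of a dB at most.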
Edison wrote:
> [snip]
Edison,

The second fundamental component complicates things because, depending on the degree of the non-linearity causing the distortion, you may have present all possible sums and differences of harmonics of the two frequencies. The filter bank is starting to look a bit messy now.

There are techniques for combining many small FFTs into one big one, but unless you particularly need a resolution of 1.0Hz this would be an unnecessary complication, so I agree with the comments of others in this thread that an FFT averaging technique is the best one to use for your problem.

Regards,
John
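John's point can be made concrete by enumerating the intermodulation products |m·F1 + n·F2|. The cutoff |m| + |n| <= 5 below is an assumption about the order of the non-linearity; even so, the two fundamentals generate dozens of distinct product frequencies, which is why the filter bank gets messy:

```python
F1, F2 = 110e3, 90e3   # the two fundamentals from the thread
ORDER = 5              # assumed maximum order of the non-linearity

# All |m*F1 + n*F2| with 0 < |m| + |n| <= ORDER:
# harmonics of each tone plus every sum/difference intermod product
products = sorted({abs(m * F1 + n * F2)
                   for m in range(-ORDER, ORDER + 1)
                   for n in range(-ORDER, ORDER + 1)
                   if 0 < abs(m) + abs(n) <= ORDER})
```

With these two fundamentals the products are all multiples of 10 kHz (their GCD), including low-frequency differences such as 110 - 90 = 20 kHz and in-band terms such as 2·110 - 90 = 130 kHz, right among the harmonics being measured.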