DSPRelated.com
Forums

Obtaining 1/3 octave levels from a PSD

Started by Roland February 5, 2008
Hello,

I am looking for the easiest way to calculate 1/3 octave levels from an
acquired signal sampled at a high rate (usually 350 kHz). As I understand
it, the usual method is to create a bank of Butterworth filters and
calculate the RMS level in each band, decimating occasionally to
increase resolution at the lower frequencies.

My question is: can the 1/3 octave levels be accurately calculated from
the PSD of the signal, simply by summing the PSD levels across each
passband of the equivalent Butterworth filter? Presumably the accuracy
will be better at the higher frequencies?

Or will this not work?
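The PSD-summing approach described above can be sketched as follows. This is a minimal illustration, not a calibrated implementation: the Welch parameters, the white-noise test signal, and the base-2 band-edge convention are all assumptions, and the band edges are ideal rectangles rather than the Butterworth skirts a real analyser would have.

```python
import numpy as np
from scipy.signal import welch

fs = 350_000                      # sample rate from the post
rng = np.random.default_rng(0)
x = rng.standard_normal(fs)       # 1 s of white noise as a stand-in signal

# Welch PSD in power/Hz; a long segment gives fine resolution for low bands
f, psd = welch(x, fs=fs, nperseg=8192)
df = f[1] - f[0]

# Nominal 1/3-octave centre frequencies (base-2 spacing around 1 kHz)
centres = 1000.0 * 2.0 ** (np.arange(-10, 21) / 3.0)

levels = []
for fc in centres:
    lo, hi = fc * 2 ** (-1 / 6), fc * 2 ** (1 / 6)   # ideal band edges
    mask = (f >= lo) & (f < hi)
    # band power = integral of the PSD over the band (sum of bins * bin width)
    levels.append(np.sum(psd[mask]) * df)
levels = np.asarray(levels)
```

Converting each band power to an RMS level or dB value is then a one-liner, e.g. `10 * np.log10(levels / p_ref)` for a chosen reference power.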
On Feb 5, 8:34 am, Roland <rol...@gmail.com> wrote:
> [quoted text snipped]
Instrument manufacturers have long done this. Care is necessary at the
low-frequency end if the filter band size approaches the FFT bin width.

http://www.physics.rutgers.edu/ugrad/326/SR760m_chap2.pdf

Dale B. Dalrymple
http://dbdimages.com
http://stores.lulu.com/dbd
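The low-frequency caveat can be quantified: a 1/3-octave band is about 23% of its centre frequency wide, while the FFT bin width is fixed, so the lowest bands may be covered by only a bin or two. A rough check (the segment length and the five-bin threshold are arbitrary illustrative choices):

```python
import numpy as np

fs = 350_000          # sample rate from the original post
nperseg = 8192        # assumed FFT segment length
df = fs / nperseg     # fixed bin width, about 42.7 Hz

# a 1/3-octave band spans fc*2**(-1/6) .. fc*2**(1/6),
# i.e. a bandwidth of roughly 23% of the centre frequency
bw_factor = 2 ** (1 / 6) - 2 ** (-1 / 6)

centres = 1000.0 * 2.0 ** (np.arange(-10, 21) / 3.0)
bins_per_band = centres * bw_factor / df

# bands resolved by fewer than ~5 bins deserve suspicion
too_coarse = centres[bins_per_band < 5]
```

With these numbers every band below roughly 1 kHz is under-resolved, which is exactly where the decimate-and-refilter scheme (or simply a longer FFT) earns its keep.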
On 5 Feb, 17:34, Roland <rol...@gmail.com> wrote:
> hello
>
> i am looking for the easiest way to calculate 1/3 octave levels from an
> acquired signal sampled at a high rate (usually 350kHz). as i understand
> it, the usual method is to create a bunch of butterworth filters and
> calculate the RMS level in each band, decimating occasionally to
> increase resolution at lower frequencies.
I am not sure one decimates to increase resolution -- to save computations might be a better reason -- but otherwise this seems correct.
> my question is: can the 1/3 octave levels be accurately calculated from
> the PSD of a signal, simply by summing the levels in each passband for
> the equivalent butterworth filter? presumably the accuracy will be
> better at high frequencies?
>
> or will this not work?
First, your approach will work -- you might get problems if you want to
track dynamics in the data, but that's a separate issue.

The main objection to your approach is not accuracy or anything like
that, but that you use a different approach than what is usual for this
standard problem. Because you do things *differently* than the accepted
way (note that there is no reason to think your way is either better or
worse when leaving dynamics out), your measurements and results will not
be directly comparable to other people's measurements and results.

If you intend to use your measurements to evaluate some data and to
compare them to other people's results, you may be better off using the
accepted methods. If you intend to do some in-house tests on stationary
data, go ahead and use the PSD.

Rune
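For comparison, the conventional filter-bank route referred to above looks roughly like the sketch below. The 3rd-order Butterworth sections and the white-noise input are assumptions, and the octave-by-octave decimation stages are omitted for brevity:

```python
import numpy as np
from scipy.signal import butter, sosfilt

def third_octave_rms(x, fs, centres):
    """RMS level per 1/3-octave band via Butterworth bandpass filters."""
    rms = []
    for fc in centres:
        lo, hi = fc * 2 ** (-1 / 6), fc * 2 ** (1 / 6)
        # one 3rd-order bandpass per band, designed at the input rate
        sos = butter(3, [lo, hi], btype='bandpass', fs=fs, output='sos')
        y = sosfilt(sos, x)
        rms.append(np.sqrt(np.mean(y ** 2)))
    return np.asarray(rms)

fs = 350_000
rng = np.random.default_rng(1)
x = rng.standard_normal(fs)                           # 1 s of white noise
centres = 1000.0 * 2.0 ** (np.arange(0, 12) / 3.0)    # 1 kHz .. ~12.7 kHz
levels = third_octave_rms(x, fs, centres)
```

In practice one would decimate every octave so that the low-frequency filters keep well-conditioned coefficients; designing a narrow band at a tiny fraction of a 350 kHz rate is numerically fragile without it.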
Rune Allnor wrote:
> [quoted text snipped]
Much obliged.
On Feb 5, 12:25 pm, Rune Allnor <all...@tele.ntnu.no> wrote:
> ...
> If you intend to use your measurements to evaluate some
> data and to compare them to other people's results, you
> may be better off using the accepted methods. If you
> intend to do some in-house tests on stationary data, go
> ahead and use the PSD.
>
> Rune
Whenever you make comparisons you need to use consistent methods.
There's no telling what people might have accepted, or why. PSD vs
filter bank is not the only consideration. The post-detection averaging
applied by some applications may make PSD and filter-bank results
indistinguishable, while different averaging may produce different
results from the same type of implementation. The hard part may be
knowing what was actually done in earlier measurements. Some
instrumentation vendors, like Bruel & Kjaer, have sold both kinds of
implementations.

Any time someone says they used the accepted methods, smile, nod and
check the documentation. As others have noted before me: the nice thing
about standards is that there are so many to choose from.

Dale B. Dalrymple
http://dbdimages.com
http://stores.lulu.com/dbd
On 6 Feb, 17:32, dbd <d...@ieee.org> wrote:
> Any time someone says they used the accepted methods,
> smile, nod and check the documentation.
That's one way of phrasing it. As far as I am concerned, it's worth
considering what the analysis will be used for when deciding how to
process the data.

Rune
On Feb 6, 10:15 am, Rune Allnor <all...@tele.ntnu.no> wrote:
> That's one way of phrasing it. As far as I am concerned,
> it's worth considering what the analysis will be used for
> when deciding how to process the data.
That's exactly the problem. You can't trust anyone who uses the words
'optimized' or 'accepted' without making sure you agree with them about
the 'for what'.

Dale B. Dalrymple
On 6 Feb, 23:46, dbd <d...@ieee.org> wrote:
> That's exactly the problem. You can't trust anyone who uses the words
> 'optimized' or 'accepted' without making sure you agree with them
> about the 'for what'.
The thing is that lots of people use at least the term 'optimized' as a
matter of course -- it helps sell an article to a journal. As for
'accepted', I can't remember having seen it in this context too often.
Even so, it is worth the effort to check what everybody else does, in
order to determine whether there is an 'accepted' (or maybe 'standard',
'usual' or 'traditional') way of doing things, before one has a go
oneself.

Doing things the 'standard' way first has several advantages:

- One reinforces basic knowledge and procedures; if one can't do the
  trivial stuff, one lacks the skills to do the hard stuff
- One establishes a baseline for evaluating other efforts; it is hard
  to justify expensive, elaborate, hard-to-use tools unless one has
  demonstrated that the simple stuff doesn't do the job
- One can justify why one later chooses different approaches
- There may exist canned tools which simplify the most common basic
  tasks and operations

All in all, one avoids a lot of trouble by trying to stick to the basic
toolkit.

Rune
On Feb 7, 1:15 am, Rune Allnor <all...@tele.ntnu.no> wrote:
> [quoted text snipped]
Rune,

We agree on the value of standards, optimization and accepted practices.
The issue I have is with the people who fail to communicate because they
use 'standard', 'optimized' or 'accepted' without stating the 'which' or
'for what'.

Sometimes those people know the choice they have made and why. Sometimes
they don't. Sometimes those people have made the same choice the reader
would. Sometimes they have not. You can't tell whether they are
ignorant, arrogant, thoughtless and/or just poor communicators (choose
at least one) until you can ask the 'which' or 'for what', or check the
documentation.

Dale B. Dalrymple