
filter effect

Started by rama raju April 8, 2009
Hi,

I am working on accelerometer-based vibration analysis. For spectrum
analysis I am using a DSP processor.

Now I need to scale my output (the FFT magnitude) according to my input
signal.

I am using a low-pass filter, decimation, and windowing to process the
time-domain data. Because of these operations the output magnitude will not
be flat across frequency. Is there any standard method for correcting for
the effects of the filter, decimation, and windowing?
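
Roughly, the chain looks like this (a simplified Python/NumPy sketch with
made-up sample rate, cutoff, and decimation factor; the real code runs on
the DSP processor):

import numpy as np
from scipy.signal import butter, filtfilt, decimate, get_window

fs = 10000.0                        # accelerometer sample rate (made up)
t = np.arange(0, 1.0, 1.0 / fs)
x = np.sin(2 * np.pi * 120.0 * t)   # stand-in for the vibration signal

# 1) Low-pass (anti-alias) filter, then decimate by 4
b, a = butter(4, 1000.0 / (fs / 2))              # 1 kHz cutoff (made up)
x_dec = decimate(filtfilt(b, a, x), 4, zero_phase=True)
fs_dec = fs / 4

# 2) Window and FFT
N = len(x_dec)
w = get_window("hann", N)
X = np.fft.rfft(x_dec * w)

# 3) Single-sided amplitude spectrum, corrected for the window's coherent
#    gain so a unit-amplitude sine reads about 1.0 in its bin
amp = 2.0 * np.abs(X) / np.sum(w)
freqs = np.fft.rfftfreq(N, d=1.0 / fs_dec)

The division by sum(w) is the standard coherent-gain correction for the
window; what I am not sure about is whether there is a similar standard
correction for the filter and the decimation.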

Can anyone throw some light on this?

Thanks in advance.

Regards,

SVS

On Wed, 8 Apr 2009, rama raju wrote:

> I am using a low-pass filter, decimation, and windowing to process the
> time-domain data. Because of these operations the output magnitude will
> not be flat across frequency. Is there any standard method for correcting
> for the effects of the filter, decimation, and windowing?
>
> Can anyone throw some light on this?

That's the whole point of the filter - to reduce unwanted signal regions.
Its frequency response is deliberately not flat.

To get a measure of what you are really doing, send a known frequency
through your filter system and see what amplitude comes out. As you sweep
across frequency, you will get different amplitudes. Plotting the whole
range of amplitude vs. frequency gives you the overall transfer curve, and
then you can use that as your definition of the filter system. If you
take the inverse of the transfer curve and multiply it by your total
output, you can estimate your actual input values. It's only an estimate,
but it will be reasonably close to what you want.
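
In code terms the idea looks roughly like this (a minimal Python/NumPy
sketch; process_chain() is a hypothetical stand-in for your whole
low-pass -> decimate -> window -> FFT -> magnitude pipeline, returning bin
frequencies and amplitudes):

import numpy as np

def measure_transfer_curve(process_chain, fs, test_freqs, duration=1.0):
    """Send unit-amplitude tones through the chain, record what comes out."""
    t = np.arange(0, duration, 1.0 / fs)
    gains = []
    for f in test_freqs:
        spec_freqs, amp = process_chain(np.sin(2 * np.pi * f * t))
        gains.append(amp[np.argmin(np.abs(spec_freqs - f))])  # gain at the tone
    return np.asarray(gains)

def correct_spectrum(spec_freqs, amp, test_freqs, gains):
    """Divide a measured spectrum by the interpolated transfer curve."""
    g = np.interp(spec_freqs, test_freqs, gains)
    g = np.maximum(g, 1e-6)          # don't blow up deep in the stopband
    return amp / g                   # estimate of the true input amplitudes

Use enough test frequencies to cover the band you care about; the
interpolation fills in between them, and the corrected spectrum is only as
good as your measured transfer curve.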

Patience, persistence, truth,
Dr. mike