Reply by Sampo Niskanen February 13, 2009

I have an application where I need to compute derivatives of measurement 
results in near-real time.  However, noise and quantization error of the 
measurements should affect the result as little as possible.  What would 
be the best approach to this?  I've tried searching on the net but 
haven't found much info.

I guess it is basically a question of combining a [-1 1]/dt filter with 
some suitable low-pass filter.  I want to avoid ripple in the outcome, 
and when sampling at 100Hz the signal should not be delayed by more than 
a few (2-5) samples.  The sampling rate is much higher than the 
oscillation rate of the signal (0-5Hz) and the phase is not significant 
in the application.  The noise will probably be relatively small 
compared to the signal amplitude.
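One way to sketch that combination: convolve the [-1 1]/dt difference kernel with a short moving average so the whole thing becomes a single FIR smoothed differentiator. The 4-tap average here is an illustrative choice, not a recommendation; with a 5-tap combined kernel the delay is about (5-1)/2 = 2 samples at 100 Hz, inside the stated 2-5 sample budget.

```python
import numpy as np

fs = 100.0          # sampling rate from the post (Hz)
dt = 1.0 / fs

# First-difference differentiator kernel, [1, -1]/dt
diff_kernel = np.array([1.0, -1.0]) / dt

# 4-tap moving average as the low-pass stage (length trades
# noise rejection against delay; 4 is just an example)
M = 4
smooth_kernel = np.ones(M) / M

# Combined smoothed-differentiator FIR kernel (5 taps);
# delay ~ (len(kernel) - 1) / 2 = 2 samples at 100 Hz
kernel = np.convolve(diff_kernel, smooth_kernel)

# Example: differentiate a noisy 2 Hz sine (true derivative
# is 2*pi*2*cos(2*pi*2*t))
t = np.arange(0.0, 1.0, dt)
x = np.sin(2 * np.pi * 2 * t) + 0.01 * np.random.randn(t.size)
dxdt = np.convolve(x, kernel, mode="valid")
```

A quick sanity check on the kernel: a differentiator should have zero DC gain (kernel sums to zero) and should return exactly the slope of a noiseless ramp.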

Would an FIR or an IIR filter be better?  I've been considering a 
Butterworth filter and the simple one-pole IIR filter 
y[i] = a*x[i] + (1-a)*y[i-1].  Any suggestions are welcome.
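For the one-pole IIR option, a minimal sketch is to take the raw first difference and feed it through exactly the y[i] = a*x[i] + (1-a)*y[i-1] smoother from above. The value a = 0.3 and the helper name `smoothed_derivative` are my own illustrative choices; the low-frequency group delay of a one-pole smoother is roughly (1-a)/a samples, so a = 0.3 gives about 2.3 samples of delay at 100 Hz, plus half a sample from the difference.

```python
def smoothed_derivative(x, dt, a=0.3):
    """First difference followed by the one-pole IIR smoother
    y[i] = a*x[i] + (1-a)*y[i-1].

    Low-frequency group delay of the smoother is roughly
    (1 - a)/a samples (~2.3 samples for a = 0.3).
    Returns one fewer sample than the input.
    """
    y = 0.0            # smoother state; starting at 0 gives a short
    prev = x[0]        # startup transient of a few time constants
    out = []
    for xi in x[1:]:
        d = (xi - prev) / dt       # raw [-1, 1]/dt difference
        y = a * d + (1 - a) * y    # one-pole low-pass
        out.append(y)
        prev = xi
    return out
```

On a constant-slope ramp the output converges geometrically (factor 1-a per sample) to the true slope, which is an easy way to check both the scaling and the settling time for a candidate a.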

I have basic knowledge of FIR and IIR filters, but not much experience 
on their properties or applying them.

Thanks for any help.
