Minimum lag high pass filter for very low frequencies

Started April 11, 2013
I was hoping I could get some help on designing a high pass filter for very, very low frequency signals (relative to the sampling rate).

The numbers I'm working with: the sampling rate is 20 Hz, and I want to filter off everything below 1/30 Hz (or 1/15 Hz if that's the best I can do). My application is very sensitive to lag and phase shift. I have to double-integrate the filtered signal, so the better I can pull out that low-frequency information, the more accurate my results are.

I have been using a very simple low pass:

mean(n) = (mean(n-1) * (filter_coef - 1) + sample(n)) / filter_coef; // exponential moving average; filter_coef is around 150

and subtracting that mean from the original signal to form the high-pass output. This is great because it's simple to design and easy to implement (I'm short an FPU and there's very little free memory in my embedded application), but the filter has poor phase characteristics (needless to say) and, hence, large error.

The biggest problem is that my signal of interest spans 1/20 Hz to 2 Hz, so it sits right up against the cutoff.

What kind of improvements can I make?

Thank you for your help in advance!