DSPRelated.com
Forums

Removing signals from data using FFT Filtering

Started by Phil W May 3, 2012
Hi DSP Team,

I am attempting to use excel’s FFT analysis to remove the seasonal and
annual signals (noise) in long monthly sea level records as part of current
sea level rise studies. I have successfully used the FFT function in excel
to identify high frequency (repetitive) signals of interest. I was
wondering however, if there is a way of separating or isolating these
particular signals, removing them and then using the “inverse” function
in Excel FFT analysis to recompile the data record, but, now excluding the
noisy signals. Is anyone able to step me through the process of isolating
a subject signal and using the “inverse” FFT to recompile the record in
Excel (bearing in mind I am not an advanced user of Excel or digital
processing techniques)? Many thanks.

Phil W 



On Thu, 03 May 2012 14:51:17 -0500, Phil W wrote:

> Hi DSP Team,
>
> I am attempting to use excel’s FFT analysis to remove the seasonal and
> annual signals (noise) in long monthly sea level records as part of
> current sea level rise studies. [...] Is anyone able to step me through
> the process of isolating a subject signal and using the “inverse” FFT to
> recompile the record in Excel (bearing in mind I am not an advanced user
> of excel or digital processing techniques). Many thanks.
First, there are a lot of subtleties, and there is no way that anyone can
educate you on this with one reply, or even 100. If you really need to know
this, you need a book -- perhaps a book on data analysis that talks about
the FFT, rather than necessarily a book on digital signal processing.

Second, the easiest way to do this is to sidestep the use of the FFT
entirely. Instead, take the average sea level for each year. Any variation
that has a period of one year will automatically drop out.

Third, I ain't no Excel user, much less an Excel FFT user. But I'll tell
you how you might accomplish this task in general, then you can see if you
can figure out how to do it for your specific case:

1: Detrend your data. Do a least-squares fit to what you have, and
subtract that out.

2: Pad your data. Add a bunch of zeros onto the end of it (given an input
x_0, x_1, ... x_{n-1}, the FFT operates on the data as if x_{n-1} and x_0
are sitting right next to each other -- this means that the inevitable
discontinuity between x_{n-1} and x_0 will cause artifacts -- you don't
want this).

3: Don't window your data. Folks will disagree with me on this, but if I'm
right about what you have, then in this case -- unlike many cases where
you're doing this sort of thing with the FFT -- you don't want to window
your data.

4: Take the FFT -- yay! Taking the inverse FFT at this point and verifying
that its output is essentially what you put into the FFT is a Very Good
Idea.

5: Multiply your high-frequency data by 0, your low-frequency data by 1,
and your mid-frequency data by something that makes a smooth transition.
Graphing the absolute value of your FFT'd data is a good idea here, to
give you an idea of what you're filtering out. If your padded data set is
N points long, the "annual" frequency should occur at an FFT bin of around
N/12 on your FFT output -- look for the spike. You want to multiply this
by zero.

I would try for a filter shape that incorporates a raised cosine to
transition from low frequency to high. I'd probably play around with it,
but you basically want something like:

  if f < F0, multiply by 1
  if f > F1, multiply by 0
  if F0 <= f <= F1, multiply by 1/2 + (1/2)*cos(pi * (f - F0)/(F1 - F0))

(that gives a gain of 1 at F0 and 0 at F1, but verifying it is up to you)

6: Take the inverse FFT of the result. You should have something that
tracks the long-time motion of your data, but filters out the jaggies.
You should _not_ have too much of a transition from the end of your data
to the zeros, or at the very end of your padded data (which the FFT will
treat as being contiguous with the beginning) -- if you do, then I was
wrong about step 3.

7: Trim and re-trend your data: ignore the places where you padded with
zeros in step 2, and add back in what you subtracted in step 1.

Note that the filtered data is going to have artifacts at the ends.
Moreover, the lower the frequency of the filtering, the longer these
artifacts will be. If you don't like that, then the FFT is not the right
technique to use to analyze your data.

-- 
My liberal friends think I'm a conservative kook. My conservative friends
think I'm a liberal kook. Why am I not happy that they have found common
ground?

Tim Wescott, Communications, Control, Circuits & Software
http://www.wescottdesign.com
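In case it helps to see the recipe end to end, here is a minimal sketch of those seven steps in Python/NumPy rather than Excel (the operations map onto Excel's FFT tools, but the function name and the cutoff frequencies f0/f1 are placeholders of mine, to be tuned by eye after plotting the FFT magnitude):

```python
import numpy as np

def fft_lowpass(x, f0, f1, pad_factor=2):
    """Smooth a monthly series x with an FFT lowpass filter.

    f0, f1 are in cycles/month: gain is 1 below f0, 0 above f1,
    raised-cosine in between. Returns a series the same length as x.
    """
    n = len(x)
    t = np.arange(n)
    # 1: detrend with a least-squares line
    slope, intercept = np.polyfit(t, x, 1)
    trend = slope * t + intercept
    resid = x - trend
    # 2: zero-pad to soften the wrap-around discontinuity
    npad = pad_factor * n
    padded = np.concatenate([resid, np.zeros(npad - n)])
    # 3: no window.  4: FFT (rfft, since the data is real)
    spec = np.fft.rfft(padded)
    f = np.fft.rfftfreq(npad, d=1.0)          # cycles per month
    # 5: raised-cosine lowpass mask
    gain = np.ones_like(f)
    gain[f >= f1] = 0.0
    mid = (f > f0) & (f < f1)
    gain[mid] = 0.5 + 0.5 * np.cos(np.pi * (f[mid] - f0) / (f1 - f0))
    # 6: inverse FFT.  7: trim the padding and restore the trend
    smooth = np.fft.irfft(spec * gain, n=npad)[:n]
    return smooth + trend
```

For example, with monthly data, passing everything slower than about 2 years while killing the annual line at 1/12 cycles/month would be `fft_lowpass(sea_level, f0=1/36, f1=1/18)`. As noted above, expect artifacts near the ends.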
On 5/4/12 12:58 AM, Tim Wescott wrote:
> On Thu, 03 May 2012 14:51:17 -0500, Phil W wrote:
>
>> I am attempting to use excel’s FFT analysis to remove the seasonal and
>> annual signals (noise) in long monthly sea level records as part of
>> current sea level rise studies.
...
> First, there are a lot of subtleties, and there is no way that anyone
> can educate you on this with one reply, or even 100.
Amen.
> Second, the easiest way to do this is to sidestep the use of the FFT
> entirely.
Amen. i wouldn't mess with the FFT at all. not at first anyway. you don't need the speed and you should do the computations explicitly. then you'll know what you are doing with the data.
> 1: Detrend your data. Do a least-squares fit to what you have, and
> subtract that out.
here is how i might approach that: since you are trying to remove the
cyclical annual trend, we expect a cyclical component with period equal to
one year. given some parameter (like sea level), i would line the data up
in a two-dimensional array (or "matrix", i guess) with the year identifying
the row and the calendar date identifying the column. i dunno what i would
do with Feb 29, maybe skip it (there is a method that uses interpolation
that would allow for a year length of 365.25 days and Feb 29 would remain a
valid data point; come back and talk to us if that is what you want to do).

for each date (column), compute the mean value over all of the years. do
this for each date and then subtract that mean from the value for those
dates only. then look at the trend over time with the cyclical mean
subtracted out of the data. if you have a linear or some other polynomial
model for the trend, least-squares sounds as good as any.

BTW, i've been worrying about processing data like this (and detrending
the yearly cyclical part of it), but instead of sea level (i would think
that, worldwide, it would be sorta detrended, since winter in the northern
hemisphere is summer in the south), i have been worrying about the level
of Lake Champlain, which barged into my home uninvited for 6 weeks a year
ago. since the lake is clearly not in the southern hemisphere or on the
equator, it has a one-sided annual cyclical property. May is much different
than November (except this year).
>> Many thanks.
FWIW.

--

r b-j                  rbj@audioimagination.com

"Imagination is more important than knowledge."
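The year-by-date averaging robert describes above can be sketched in a few lines. This is Python/NumPy for concreteness (monthly data, so the "columns" are the 12 calendar months; the function name and the start_month convention are mine, not anything standard):

```python
import numpy as np

def remove_annual_cycle(x, start_month=0):
    """Subtract the mean annual cycle from a monthly series x.

    Groups samples by calendar month (the 'columns' of the
    year-by-month matrix), computes each month's mean over all
    years, and subtracts it from that month's samples only.
    start_month is the calendar month (0 = January) of x[0].
    """
    x = np.asarray(x, dtype=float)
    months = (start_month + np.arange(len(x))) % 12
    anomaly = x.copy()
    for m in range(12):
        sel = months == m
        anomaly[sel] -= x[sel].mean()   # per-month mean over all years
    return anomaly
```

What remains after the subtraction is the anomaly series, which you would then fit with a line (or other polynomial) to look at the long-term trend.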
"robert bristow-johnson" <rbj@audioimagination.com> wrote in message 
news:jo0tu1$6g2$1@dont-email.me...
> here is how i might approach that: since you are trying to remove the
> cyclical annual trend, we expect a cyclical component with period equal
> to one year. given some parameter (like sea level), i would line the data
> up in a two-dimensional array (or "matrix", i guess) with the year
> identifying the row and the calendar date identifying the column. [...]
>
> for each date (column), compute the mean value over all of the years. do
> this for each date and then subtract that mean from the value for those
> dates only. then look at the trend over time with the cyclical mean
> subtracted out of the data. if you have a linear or some other polynomial
> model for the trend, least-squares sounds as good as any.
Tidal effects are known to be a very strong signal at diurnal, daily, and monthly periods. Couldn't he usefully filter those frequencies before analysis?
On 5/4/12 12:21 PM, MikeWhy wrote:
> Tidal effects are known to be a very strong signal at diurnal, daily,
> and monthly periods. Couldn't he usefully filter those frequencies
> before analysis?
sure. actually the filter that i described is a form of a comb filter. you
can have a comb filter tuned to frequencies of 1/(365.2427 days) and
1/(29.53059 days) for the Earth's solar orbit and the Moon's orbit around
the Earth. perhaps 1/1 for the Earth's spin, but for tidal effects, the
(diurnal) driving frequency would be something like [1 - 1/29.53059] or
[1 + 1/29.53059] cycles per day (which is it, Clay, can you tell us?). i
think it's the latter, but a lunar day is 24 hours and 50.4 minutes and
this comes out as 24 hours and 48.7 minutes, so i don't know what i'm
doing wrong.

but it should be possible to filter the uniformly-sampled data with
precision-tuned comb filters to take out a periodic component of a known
period. to do a precision-tuned comb filter, you need a precision delay
element, which is something we discuss periodically here at comp.dsp .

--

r b-j                  rbj@audioimagination.com

"Imagination is more important than knowledge."
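For the integer-period case (e.g. a 12-sample annual cycle in monthly data), one simple comb of nulls is a plain moving average over one period: it zeroes the fundamental and all its harmonics while passing DC and slow trends. A sketch in Python/NumPy, under that integer-period assumption (the fractional-delay machinery needed for non-integer periods like 29.53059 days is deliberately left out):

```python
import numpy as np

def comb_smooth(x, period=12):
    """Moving-average comb: nulls every harmonic of 1/period while
    passing DC and slow trends. For monthly data, period=12 places
    nulls at the annual frequency and all its harmonics.
    """
    x = np.asarray(x, dtype=float)
    kernel = np.ones(period) / period
    # 'valid' avoids inventing data beyond the ends; the output is
    # len(x) - period + 1 samples, delayed by (period - 1)/2 samples.
    return np.convolve(x, kernel, mode="valid")
```

Note the output is shorter than the input by one period minus one sample and is delayed by half a period, which is the same end-effect trade-off discussed elsewhere in this thread.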
On Fri, 04 May 2012 11:21:29 -0500, MikeWhy wrote:

> Tidal effects are known to be a very strong signal at diurnal, daily,
> and monthly periods. Couldn't he usefully filter those frequencies
> before analysis?
Well, yes. But if you're doing things off-line in batches, the FFT is often
a quick and easy way to implement lots of filtering fairly fast. So it's a
valid approach to do the job with the FFT.

In fact, if you use almost any shift-invariant filter you're going to have
artifacts at the ends of the data, just like I described with the FFT --
the FFT is just a (potentially) faster way to get the same results. Only by
going to a data analysis method that takes into account the fact that you
have no knowledge of the data past the endpoints will you minimize the end
effects -- and if the OP's data is long enough, and he's looking for
trends, he may well be able to ignore the ends of the data on the output.

-- 
My liberal friends think I'm a conservative kook. My conservative friends
think I'm a liberal kook. Why am I not happy that they have found common
ground?

Tim Wescott, Communications, Control, Circuits & Software
http://www.wescottdesign.com
On Friday, May 4, 2012 12:58:06 PM UTC-4, robert bristow-johnson wrote:
> sure. actually the filter that i described is a form of a comb filter.
> you can have a comb filter tuned to frequencies of 1/(365.2427 days) and
> 1/(29.53059 days) for the Earth's solar orbit and the Moon's orbit around
> the Earth. [...] but it should be possible to filter the
> uniformly-sampled data with precision-tuned comb filters to take out a
> periodic component of a known period.
Actually trying to "filter" out lunar influences is fraught with issues.
First of all, the perigee/apogee variations fluctuate strongly during the
course of the year. This is due to the Earth's elliptical orbit and due to
the Moon's elliptical orbit. Instead I would just use actual lunar data
(easily calculated using Chapront & Chapront's lunar theory) and then fit
that (magnitude and delay) to the ocean data.

Even though you will find published values (averages) for the delay
between successive lunar events listed out to 8 decimal places, one should
not forget that the time delays between successive lunar events stutter by
+/- a day for things like phase etc. So I'm saying you don't want to
filter out an FM and AM type of signal; instead, fit the real signal from
lunar theory to the data and subtract it out.

If you want more info on how to calculate lunar parameters, I can point
you to several good sources. I have coded up quite a bit of it in C++.

Clay
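A rough sketch of that fit-and-subtract idea, in Python rather than C++, with a brute-force integer-lag scan standing in for a proper delay fit. Here `ref` would be the series computed from lunar theory, sampled on the same grid as the data; in this sketch it is just a placeholder argument:

```python
import numpy as np

def fit_and_subtract(data, ref, max_lag=24):
    """Fit data ~ a * ref[lag : lag+len(data)] + b over integer lags
    0..max_lag, keep the lag with the smallest residual, and subtract
    the fitted signal. ref must extend at least max_lag samples past
    len(data)."""
    data = np.asarray(data, dtype=float)
    ref = np.asarray(ref, dtype=float)
    n = len(data)
    best_lag, best_coef, best_rss = 0, None, np.inf
    for lag in range(max_lag + 1):
        seg = ref[lag:lag + n]
        A = np.column_stack([seg, np.ones(n)])  # columns: magnitude, offset
        coef, *_ = np.linalg.lstsq(A, data, rcond=None)
        rss = np.sum((data - A @ coef) ** 2)    # residual sum of squares
        if rss < best_rss:
            best_lag, best_coef, best_rss = lag, coef, rss
    seg = ref[best_lag:best_lag + n]
    return data - (best_coef[0] * seg + best_coef[1]), best_lag
```

This is deliberately naive: a real fit would use a fractional delay and possibly a time-varying magnitude, since, as noted, the lunar signal is effectively both AM and FM.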
On Friday, May 4, 2012 2:02:09 PM UTC-4, cl...@claysturner.com wrote:
> Actually trying to "filter" out lunar influences is fraught with issues.
> [...] Instead I would just use actual lunar data (easily calculated using
> Chapront & Chapront's lunar theory) and then fit that (magnitude and
> delay) to the ocean data. [...]
I ran the calculation for 2012. Here is a link to an Excel file with a
chart showing the Earth-Moon distance for 2012. The data also contains the
phase info. Note how the center-to-center distance variation is more than
a simple sine wave, which is why a simple frequency filter will do poorly.
The perigee near day 125 is tomorrow's "supermoon", where for this year
the Moon is at a "deep" perigee at nearly the same time as it is having a
full phase.

http://www.claysturner.com/MoonDist2012.xls

Clay

p.s. Interesting factoid: the Moon's orbit is always curved towards the
Sun!
On 5/4/12 4:02 PM, clay@claysturner.com wrote:
> Note how the center-to-center distance variation is more than a simple
> sine wave, which is why a simple frequency filter will do poorly.
well, i was only assuming periodic, not sinusoidal. so the filter i was suggesting was a comb filter, not a mere notch filter. i dunno.
> p.s. Interesting factoid: the Moon's orbit is always curved towards the
> Sun!
as if the Sun has any influence.

thanks, Clay. i'll have to look at this later tonight. i'm taking my kid
to the "Bully" movie now. L8r,

--

r b-j                  rbj@audioimagination.com

"Imagination is more important than knowledge."
On 5/4/2012 2:02 PM, clay@claysturner.com wrote:

   ...

> Actually trying to "filter" out lunar influences is fraught with issues.
> [...] Instead I would just use actual lunar data (easily calculated using
> Chapront & Chapront's lunar theory) and then fit that (magnitude and
> delay) to the ocean data. [...]
Will a generalized theory suffice for local conditions? The times of high
and low tides even at the same longitude differ in a way that depends on
nearby islands, inlets and bars.

Jerry
-- 
Engineering is the art of making what you want from things you can get.