Hi, reading about recursive adaptive filters on Wikipedia: http://en.wikipedia.org/wiki/Recursive_least_squares_filter (in the first paragraph), it says that this is used for "deterministic" signals, as opposed to the "LMS" algorithm, which would be more suitable for stochastic signals. Could someone explain what is meant by that? I have a repeating signal in my data set. The signal is broadband and about 2 seconds long, but very much correlated from one occurrence to the next. The cross-correlation value decreases as time evolves, i.e. the correlation coefficient is rather low between one occurrence and the 10th, but always very high between one and the next. I was thinking of using an adaptive filter, using one occurrence as the reference input and the next as the input to be filtered, and so on, but I got sort of stuck on the Wikipedia article. Thanks, Kamran
adaptive filters
Started by ●December 4, 2012
Reply by ●December 4, 2012
On Dec 4, 9:38 am, Kamran Iranpour <kamran.iranp...@gmail.com> wrote:
> [...]

I would say the wiki article is poorly written. You can use RLS on a stochastic signal as long as it is stationary in the statistical sense, and it will usually converge faster than LMS. If you consider applying RLS and LMS to a system identification problem, then the two cases to compare are a non-time-varying system and a time-varying system.

Consider a system whose transfer function suddenly changes from H1 to H2, and you're using an adaptive filter to try to identify the system as it is changing.

In the RLS method you have to update a matrix; if the system you're trying to identify changes all of a sudden, the problem is that you'll have to wait several samples until you get the correct correlation matrix. RLS is doing a least-squares fit. So if the fit is being done over data samples from both an old system and a new system (i.e. a time-varying transfer function), it doesn't tend to do so well, until you get to the point where the fit is being done entirely over data from the new transfer function.
This is similar to the transient response seen when you first start up a digital filter. The LMS isn't a least-squares fit - it uses an instantaneous estimate of the gradient to try to find a minimum. So for a non-time-varying system, RLS tends to converge more quickly than LMS, since it uses more data/information to form the solution. If the system changes, it takes more time for the RLS algorithm to track the change. LMS tends to track the change in the system better because it isn't using the previous signal history. Hope that helps. Cheers, Dave
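The convergence-versus-tracking trade-off described above can be seen numerically. Below is a minimal sketch (not from the thread; the plant coefficients, step size, and forgetting factor are all made-up illustration values) that identifies an FIR system that abruptly switches halfway through, with textbook LMS and RLS:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two FIR "plants"; the true system switches from h1 to h2 halfway through.
h1 = np.array([0.5, -0.3, 0.2])
h2 = np.array([-0.2, 0.4, 0.1])
N, M = 2000, 3                       # samples, filter taps
x = rng.standard_normal(N)           # white input
d = np.zeros(N)
for n in range(N):
    h = h1 if n < N // 2 else h2
    past = x[max(0, n - M + 1):n + 1][::-1]   # newest sample first
    d[n] = np.dot(h[:len(past)], past)

def lms(x, d, M, mu=0.05):
    w = np.zeros(M)
    err = np.zeros(len(x))
    for n in range(M, len(x)):
        u = x[n:n - M:-1]            # most recent M samples, newest first
        e = d[n] - w @ u
        w = w + mu * e * u           # instantaneous-gradient update
        err[n] = e
    return err

def rls(x, d, M, lam=1.0, delta=100.0):
    w = np.zeros(M)
    P = delta * np.eye(M)            # inverse correlation matrix estimate
    err = np.zeros(len(x))
    for n in range(M, len(x)):
        u = x[n:n - M:-1]
        k = P @ u / (lam + u @ P @ u)
        e = d[n] - w @ u
        w = w + k * e
        P = (P - np.outer(k, u @ P)) / lam
        err[n] = e
    return err

e_lms = lms(x, d, M)
e_rls = rls(x, d, M)
# With lam = 1 (infinite memory) RLS converges quickly at first but is slow
# to forget h1 after the switch; a forgetting factor lam < 1 restores tracking
# at the cost of a noisier steady state.
```

Plotting `np.abs(e_lms)` against `np.abs(e_rls)` shows both effects Dave describes: RLS reaching a small error sooner, then recovering more slowly after the jump at `N // 2`.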
Reply by ●December 4, 2012
Kamran Iranpour <kamran.iranpour@gmail.com> writes:
> [...]

Deterministic signals are signals that are determined precisely. They don't have any uncertainty. You could say they don't contain noise. Stochastic signals (AKA random signals) are not precisely determined and contain noise. -- Randy Yates Digital Signal Labs http://www.digitalsignallabs.com
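A tiny sketch of that distinction (the signal parameters here are arbitrary, just for illustration): a formula-generated signal can be reproduced exactly, while a noisy realization cannot.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(1000) / 1000.0

# Deterministic: every sample is fixed exactly by the formula -- no uncertainty.
deterministic = np.sin(2 * np.pi * 5 * t)

# Stochastic: the same sinusoid plus random noise; individual samples
# can't be predicted exactly, only described statistically.
stochastic = deterministic + 0.3 * rng.standard_normal(t.size)

# Re-evaluating the formula reproduces the deterministic signal bit-for-bit.
again = np.sin(2 * np.pi * 5 * t)
```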
Reply by ●December 5, 2012
On Wednesday, December 5, 2012 4:50:17 AM UTC+13, Dave wrote:
> The LMS isn't a least squares fit - it uses an instantaneous estimate of the gradient to try and find a minimum.

Actually, LMS is minimising the mean-square error by gradient descent, so it is a least-squares fit as well as RLS. The difference is that RLS can be applied to pole-zero models, whereas LMS is (normally) applied to all-zero models. Hardy
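The gradient-descent view Hardy mentions is easy to see in code. A minimal sketch (the true weights and step size are made-up values): one LMS step is gradient descent on the instantaneous squared error J_n(w) = (d[n] - w·u)², whose gradient with respect to w is -2 e u; averaged over the input statistics this is the mean-square error, hence "least mean squares".

```python
import numpy as np

rng = np.random.default_rng(1)

M, mu = 4, 0.1
w = np.zeros(M)
w_true = np.array([0.4, -0.2, 0.1, 0.05])   # hypothetical all-zero system

for _ in range(500):
    u = rng.standard_normal(M)   # regressor (input tap vector)
    d = w_true @ u               # noiseless desired response
    e = d - w @ u                # a-priori error
    w = w + mu * e * u           # stochastic-gradient (LMS) update

# w drifts toward w_true; each step only descends the *instantaneous* error,
# but on average the steps point down the mean-square-error surface.
```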
Reply by ●December 5, 2012
On Dec 4, 4:50 pm, Dave <dspg...@netscape.net> wrote:
> [...]

Thank you all for the answers. Kamran
Reply by ●December 5, 2012
<gyansorova@gmail.com> wrote:
> The LMS isn't a least squares fit -

Idiot. LMS = least mean squares.

> it uses an instantaneous estimate of the gradient to try and find a minimum.

Using the instantaneous gradient = the SG (stochastic gradient) method.

> actually LMS is minimising the mean-square error by gradient descent so is a least-squares fit as well as the RLS.

If the statistics of the signal are Gaussian, then the SG method converges to the LMS solution.

> Difference is that RLS can be applied to pole-zero models whereas LMS is all zero models (normally).

The SG method can be used for pole-zero models as well. The computation of the gradient is somewhat more complicated than in the all-zero case, as it requires derivatives of the denominator. Vladimir Vassilevsky DSP and Mixed Signal Consultant www.abvolt.com
Reply by ●December 5, 2012
On Dec 5, 10:59 am, "Vladimir Vassilevsky" <nos...@nowhere.com> wrote:
> > The LMS isn't a least squares fit -
> Idiot.
> LMS = least mean squares.

Vlad - I'm glad to see you at least know what LMS stands for. Sorry, I must have missed your valuable and insightful response to the original post. LMS does not solve a least-squares problem in any given iteration - it approximates a solution, but the error only gets minimized over several iterations, hence the term "mean" in the name. Generally it will converge, assuming you choose an appropriate step size. Refer to "Fundamentals of Adaptive Filtering" by Sayed, pg 216: "The LMS algorithm was derived in Sec. 5.2.1 as an approximate iterative solution to the linear least-mean-squares estimation problem, in the sense that it was obtained by replacing the actual gradient vector in the steepest-descent implementation by an instantaneous approximation for it." RLS solves a least-squares problem at each iteration. I would suggest you go read a good textbook. David
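Dave's point that RLS solves a least-squares problem at every step can be checked directly. A minimal sketch (the regressors, true weights, and noise level are made-up illustration values): run the standard RLS recursion with forgetting factor 1 and a large initialization constant, then compare against the batch least-squares solution over the same data.

```python
import numpy as np

rng = np.random.default_rng(2)
M, N = 3, 200
U = rng.standard_normal((N, M))     # one regressor row per time step
w_true = np.array([0.7, -0.4, 0.25])
d = U @ w_true + 0.01 * rng.standard_normal(N)

# RLS with forgetting factor 1 and a large initialization delta, so the
# implicit regularization from P's initial value is negligible.
delta = 1e6
w = np.zeros(M)
P = delta * np.eye(M)               # running estimate of (U^T U)^{-1}
for n in range(N):
    u = U[n]
    k = P @ u / (1.0 + u @ P @ u)   # gain vector
    w = w + k * (d[n] - w @ u)
    P = P - np.outer(k, u @ P)

# Batch least-squares over all N samples for comparison.
w_batch, *_ = np.linalg.lstsq(U, d, rcond=None)
# The recursive and batch solutions agree closely: at each step RLS holds
# the exact least-squares fit over all data seen so far. LMS, by contrast,
# only approaches the least-squares solution over many iterations.
```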
Reply by ●December 5, 2012
"Dave" <dspguy2@netscape.net> wrote:
> Vlad - I'm glad to see you at least know what LMS stands for. Sorry I must have missed your valuable and insightful response to the original post.
Reply by ●December 5, 2012
On Dec 5, 2:34 pm, "Vladimir Vassilevsky" <nos...@nowhere.com> wrote:
> [...]

Vlad - Grow up! If you can't make a reasonable contribution, then go back and crawl under whatever rock you came from. Dave