DSPRelated.com

RLS Algorithm (Memoryless)

Started by copernicus 7 years ago · 3 replies · latest reply 7 years ago · 375 views

Hi all,

I have been studying adaptive filters lately, and now that I am at the RLS (Recursive Least Squares) algorithm I have come across the forgetting factor (lambda) used in its weighting function. The term ‘memoryless’ itself confuses me. The confusion is:

- How is memoryless RLS different from the standard RLS? Does it involve some specific value of lambda (as in, it is said that if lambda = 1 the system has infinite memory)?

I would be glad if anyone could help.


Reply by dkgupta · December 14, 2017

Lambda is called the forgetting factor because by choosing an appropriate value of lambda you can decide how much you want to depend on past samples for the estimation.

As you have said, if we choose lambda = 1 the system will require infinite memory to process the signal correctly, because lambda = 1 means the system never forgets anything: it forms the estimate from the present and all past values, giving each of them equal weight.

You can understand it from a simple weighted-sum equation:

y = x(0) + lambda*x(1) + (lambda^2)*x(2) + (lambda^3)*x(3) + ...

where x(0) is the present sample and x(k) is the sample k steps in the past.

In the above equation, if you substitute lambda = 1, y is the sum of all the samples up to that point, and as the number of samples grows the system's memory requirement grows with it.

Now if you substitute a value less than 1 for lambda, then because older samples are multiplied by increasing powers of lambda they receive less weight than the more recent ones, i.e. the present sample gets the most weight.

If you substitute lambda = 0, the system forgets everything from the past. In this case the value of y depends only on the present sample. Since you don't need any of the previous samples, they can be discarded, so no memory is needed to store them; hence a memoryless RLS.
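Just as a toy illustration (the numbers and the little function below are made up by me, nothing standard), here is how lambda controls the memory in that sum:

import numpy as np

def exp_weighted_sum(x, lam):
    # x[-1] is the present sample; the sample k steps in the past
    # is multiplied by lam**k, exactly as in the equation above
    n = len(x)
    weights = lam ** np.arange(n)        # [1, lam, lam^2, ...]
    return np.sum(weights * x[::-1])

x = np.array([1.0, 2.0, 3.0, 4.0])       # 4.0 is the most recent sample

print(exp_weighted_sum(x, 1.0))   # 10.0  -> every sample counts equally (infinite memory)
print(exp_weighted_sum(x, 0.5))   # 6.125 -> older samples fade away
print(exp_weighted_sum(x, 0.0))   # 4.0   -> only the present sample counts (memoryless)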

In general terms, a system is called memoryless if its output depends only on the present value of the input and not on past values (for example, y(n) = x(n)^2 is memoryless, while y(n) = x(n) + x(n-1) is not).

If I have understood your question correctly, I hope this helps.

Reply by copernicus · December 14, 2017

First of all, thank you very much for taking your valuable time to answer the question.

As a matter of fact, that is exactly what I thought, and it looks intuitively appropriate, but regarding your explanation I have two concerns:

  • If we put lambda = 0, then the gain vector k(n) in the recursive equations won't make any sense (see the sketch of the standard recursions below),
  • and moreover, I just read a paper (Memoryless Polynomial RLS Adaptive filter for Trajectory Target Tracking by Cai et al.) where the authors have devised an algorithm (which is nothing but standard RLS, *literally*) and called it memoryless.

i.e. even after using lambda in the recursion they are calling it memoryless! This is what confuses me.
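For reference, here is a minimal sketch of one common textbook form of the exponentially weighted RLS recursions (the variable names are my own and nothing here is taken from that paper), which shows where lambda enters:

import numpy as np

def rls_step(w, P, u, d, lam):
    # One exponentially weighted RLS update.
    # w   : current weight vector, shape (M,)
    # P   : current inverse correlation matrix, shape (M, M)
    # u   : new input (regressor) vector, shape (M,)
    # d   : new desired sample (scalar)
    # lam : forgetting factor, 0 < lam <= 1
    Pu_over_lam = (P @ u) / lam
    k = Pu_over_lam / (1.0 + u @ Pu_over_lam)   # gain vector -- undefined as written if lam = 0
    e = d - w @ u                               # a priori error
    w_new = w + k * e                           # weight update
    P_new = (P - np.outer(k, u) @ P) / lam      # inverse correlation update, again divided by lam
    return w_new, P_new

# typical initialisation: P(0) = delta^-1 * I with a small delta
M = 4
w = np.zeros(M)
P = np.eye(M) / 1e-3
# then, for every new pair (u, d): w, P = rls_step(w, P, u, d, lam)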

Reply by copernicus · December 14, 2017

About the first point:

We can actually do away with it by multiplying and dividing the gain vector by lambda. (Just noticed)
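For concreteness, this is the rearrangement I mean (standard textbook notation, with P(n-1) the inverse correlation matrix and u(n) the current input vector):

k(n) = (lambda^-1 * P(n-1) * u(n)) / (1 + lambda^-1 * u^T(n) * P(n-1) * u(n))
     = (P(n-1) * u(n)) / (lambda + u^T(n) * P(n-1) * u(n))

so the gain vector itself stays well defined even as lambda goes to 0.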