DSPRelated.com
Forums

LMS data self correlation problem

Started by andrewstanfordjason · 5 years ago · 4 replies · latest reply 5 years ago · 107 views

I have implemented an adaptive filter using an LMS-based approach. The purpose of the filter is to remove from a signal Y the component due to a signal X, where Y is assumed to be X convolved with an unknown filter H,
i.e. Error = Y - X·H_hat. The normal NLMS filter update is used.
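For concreteness, here is a minimal sketch of that setup (my own illustration, not the poster's code). The names X, Y, H_hat follow the post; the step size mu, the 8-tap filter length, and the eps regulariser are assumed values:

```python
import numpy as np

def nlms_step(h_hat, x_buf, y, mu=0.5, eps=1e-8):
    """One NLMS update: e = y - x.h_hat, then h_hat += mu*e*x/(||x||^2 + eps)."""
    e = y - np.dot(x_buf, h_hat)
    h_hat = h_hat + mu * e * x_buf / (np.dot(x_buf, x_buf) + eps)
    return h_hat, e

# Identify a known 8-tap filter H from broadband (white) noise input X.
rng = np.random.default_rng(0)
H = rng.standard_normal(8)          # "true" unknown filter
h_hat = np.zeros(8)                 # estimate H_hat
x = rng.standard_normal(20000)      # broadband X
for n in range(8, len(x)):
    x_buf = x[n-8:n][::-1]          # most recent sample first
    y = np.dot(x_buf, H)            # Y = X convolved with H (noiseless)
    h_hat, e = nlms_step(h_hat, x_buf, y)
```

With broadband input the estimate h_hat converges to H, matching the behaviour described below for the noise case.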

My implementation works great until X becomes a highly periodic signal (the periodicity is within the length of H). At this point the filter update goes a bit bananas. If anyone knows a good way of controlling such problems I'd be very grateful, thanks.


Reply by djmaguire, March 2, 2019
In the absence of an explanation of the meaning of the term "bananas" in the context of adaptive filtering, I'd say add leakage.
Reply by andrewstanfordjason, March 2, 2019

Apologies for the poor description; I've now looked further into what is happening. My observation is that if I use uncorrelated broadband noise as the input X, the filter converges very well. If, however, X is switched to a pure 1 kHz sine, the filter H_hat goes unstable.

The update to H is now producing many "false" correlations due to the periodicity of X.

Thank you for your suggestion of adding leakage. I am trying to add it adaptively, when I detect that this condition has occurred. The condition is easy to spot, as the gain of the filter gets very high.
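A sketch of that adaptive idea, under my own assumptions: monitor the filter's gain (here the tap energy ||h_hat||², one possible proxy) and apply a leakage factor gamma only while it exceeds a threshold. The values of gain_thresh and gamma are made up for illustration, not from the thread:

```python
import numpy as np

def nlms_step_adaptive_leak(h_hat, x_buf, y, mu=0.2, eps=1e-8,
                            gain_thresh=10.0, gamma=1e-3):
    """NLMS update that leaks only when the tap energy looks suspect."""
    e = y - np.dot(x_buf, h_hat)
    # leak < 1 shrinks the taps toward zero, but only above the threshold
    leak = 1.0 - mu * gamma if np.dot(h_hat, h_hat) > gain_thresh else 1.0
    h_hat = leak * h_hat + mu * e * x_buf / (np.dot(x_buf, x_buf) + eps)
    return h_hat, e

# With zero input the gradient term vanishes, so only the leak acts:
h_big, _ = nlms_step_adaptive_leak(np.full(8, 2.0), np.zeros(8), 0.0)
h_small, _ = nlms_step_adaptive_leak(np.full(8, 0.1), np.zeros(8), 0.0)
```

Here h_big (tap energy 32, above the threshold) is pulled toward zero, while h_small (tap energy 0.08) is left untouched.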

Reply by djmaguire, March 2, 2019
Based on your description (even the original one), it seemed as though either adding leakage or a dither signal would help. Leakage is better. Also, you shouldn't need to switch leakage on and off: the leakage term will be very small compared to the learning rate, and you can simply increase the learning rate slightly to offset any impact on convergence.
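A sketch of always-on leakage in this spirit: gamma is kept tiny relative to mu, so the bias on convergence is small while tap drift stays bounded even for the periodic input. All parameter values are assumptions for illustration:

```python
import numpy as np

def leaky_nlms_step(h_hat, x_buf, y, mu=0.5, gamma=1e-4, eps=1e-8):
    """Leaky NLMS: the (1 - mu*gamma) factor nudges every tap toward zero."""
    e = y - np.dot(x_buf, h_hat)
    h_hat = (1.0 - mu * gamma) * h_hat \
        + mu * e * x_buf / (np.dot(x_buf, x_buf) + eps)
    return h_hat, e

# Drive it with the problem case from the thread: a pure sinusoidal X
# whose period (4 samples here) is within the 8-tap filter length.
rng = np.random.default_rng(1)
H = rng.standard_normal(8)
h_hat = np.zeros(8)
x = np.sin(2 * np.pi * 0.25 * np.arange(20000))   # stand-in for the tone
for n in range(8, len(x)):
    x_buf = x[n-8:n][::-1]
    y = np.dot(x_buf, H)
    h_hat, e = leaky_nlms_step(h_hat, x_buf, y)
```

Even though the sine only excites a small subspace of the tap vector, the leakage keeps the unexcited directions from wandering, so the taps stay bounded.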



Reply by andrewstanfordjason, March 2, 2019

Thank you