Simplification of an LMS implementation
Hello, I have implemented an adaptive blocking matrix (ABM) as part of a robust generalised sidelobe canceller. This involves using LMS to cancel the output of a delay-and-sum beamformer from each of the channels that formed the beam. In theory this leaves signals containing just the noise, i.e.
dsb = 0.0
for i in range(channel_count):
    x[i] = apply_delay(x[i], delay[i])  # now all the signals are aligned to the signal of interest
    dsb += x[i]
for i in range(channel_count):
    noise[i] = lms_filter(apply_delay(x[i], N), dsb)  # cancel the beam output from each channel
Here N is half the LMS filter length (this is to keep the filter causal).
The simplification I would like to perform is to reduce the LMS filter to an adaptive delay/advance and scale. The reduction would take the adaptive filter (which is essentially a many-coefficient unconstrained FIR) and reduce it to two parameters: theta (the delay or advance time) and s (the scale). It's worth noting that I am doing this in the frequency domain on frames.
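To make the reduction concrete, each bin of the reduced filter would be W(omega) = s * exp(-j * omega * theta), so one option I am considering is replacing the per-tap LMS update with a per-frame gradient update on just s and theta. Something like the following sketch (NumPy, theta in samples; the function name, argument order and step sizes are only placeholders, not a tested design):

import numpy as np

def adapt_delay_scale(dsb_frame, chan_frame, theta, s, mu_theta, mu_s):
    # One frame of a gradient update for the reduced "filter"
    # W(w) = s * exp(-1j * w * theta), applied to the DSB output and
    # cancelled from the (delayed) channel. theta is in samples.
    # Illustrative sketch only; names and step sizes are placeholders.
    N = len(dsb_frame)
    X = np.fft.rfft(dsb_frame)                # filter input: beam output
    D = np.fft.rfft(chan_frame)               # desired: the channel itself
    w = 2.0 * np.pi * np.fft.rfftfreq(N)      # bin frequencies, rad/sample

    W = s * np.exp(-1j * w * theta)           # two-parameter response
    E = D - W * X                             # per-bin error = noise estimate

    # gradients of sum(|E|^2) with respect to s and theta
    grad_s = -2.0 * np.sum(np.real(np.conj(E) * np.exp(-1j * w * theta) * X))
    grad_theta = 2.0 * np.sum(np.real(np.conj(E) * 1j * w * W * X))

    s -= mu_s * grad_s
    theta -= mu_theta * grad_theta
    return theta, s, np.fft.irfft(E, n=N)     # time-domain noise frame

The theta gradient scales with omega, so mu_theta would need to be much smaller than mu_s (or the gradients normalised per frame), which is the same trade-off as in normalised LMS.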
If anyone has seen this kind of thing before, or knows how to do it, I would be interested in hearing about it. Thanks
Yes, you can base your improved delay-and-scale filtering on an interpolating delay line. I have an overview here that I use in my teaching:
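The gist is a fractional delay: read the delayed channel by interpolating between the two neighbouring samples, then apply the scale. A first-order (linear) version looks roughly like this (the function name is a placeholder; a higher-order Lagrange or windowed-sinc interpolator gives a flatter response near Nyquist):

import numpy as np

def frac_delay_scale(x, theta, s):
    # Delay x by theta samples (theta >= 0, possibly fractional) and scale
    # by s, using a linearly interpolating delay line. Illustrative only.
    n = int(np.floor(theta))                       # integer part of the delay
    frac = theta - n                               # fractional part in [0, 1)
    xp = np.concatenate((np.zeros(n + 1), np.asarray(x, dtype=float)))
    a = xp[1:1 + len(x)]                           # x delayed by n samples
    b = xp[:len(x)]                                # x delayed by n + 1 samples
    return s * ((1.0 - frac) * a + frac * b)       # interpolate, then scale

For example, frac_delay_scale(x, 3.25, 0.8) delays by 3.25 samples and scales by 0.8.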
Thank you very much
Nice!