Using the dlms routine of DSPLIB

Started by Marlo Flores March 18, 2002
Hello all,
I am implementing an adaptive LMS filter on a C5402 with
CCS 2.1. I wrote my own code, and simulations indicate
that it works fine. However, it cannot run in real time
once the filter size goes beyond 200 taps, so I tried the
adaptive delayed LMS filter routine dlms() from DSPLIB.
I first tried to simulate its real-time behavior, but it
does not work correctly. The size of the delay buffer
and of the adaptive filter is 32. Following the SPRU518
documentation, I aligned these arrays to a 32-bit
boundary in the linker file:

INT_DM_SCRATCH_PAD_DRAM: origin = 060h, length = 20h
INT_DM_1: origin = 0080h, length = 1000h
INT_DM_2: origin = 1080h, length = 380h
EXT_DM_RAM: origin = 1400h, length = 0ec00h

.dbuffer: {} > INT_DM_1 PAGE 1, align(32)
.coefs : {} > EXT_DM_RAM PAGE 1, align(32)

Then in the header file I declare:

#define M 32 //size of filter and delay buffer
#define MU 164

DATA x, e;

#pragma DATA_SECTION (y,".dbuffer")
DATA y[M];
DATA *dp = y;

#pragma DATA_SECTION (c,".coefs")
DATA c[M],r;

The main routine does:
x = filter_response;
e = desired_response;

I call dlms() on a sample-by-sample basis, so the sizes of
x, r, and e are all 1. Does anyone know what is wrong with this?