DSPRelated.com
Forums

Problem when implementing fir filter for NTSC video stream using DM642

Started by david olave June 20, 2005

Hi,
I am implementing a FIR filter on the DM642 for an NTSC video stream, using the DSP_fir_gen function to run it.
For the test, I am using a sin x/x signal as the input, and the output is displayed on an oscilloscope so I can check the filter's frequency response. The display shows roughly the expected amplitude-vs-frequency curve of a low-pass filter, with its cutoff, passband ripple, stopband, and attenuation levels. A low-pass response is what I am looking for.

The problem is that I am getting a peak close to 3.75 MHz, which is the Nyquist frequency of the chroma sampling. If I move the cutoff frequency below 3.75 MHz (so that 3.75 MHz falls in the attenuation region), the peak remains at about -15 dB, even though the filter is designed for less than -40 dB in that region. If I don't filter the chroma buffers (cr1 and cb1), the peak is reduced, but it doesn't disappear.
Even though I filter only the luma component, I still get a peak (smaller, but still there).
Is this normal? Any suggestions or help to get rid of this peak would be really appreciated.

Davidastro
