DSPRelated.com
Forums

Real-time wavelet denoising

Started by Poj December 27, 2007
Does anyone have experience developing real-time wavelet denoising software
without using any DSP chip? I have a problem with delay caused by buffering
data and want to reduce the delay as much as possible. The data I am dealing
with is 1-D with a sampling rate lower than 1 kHz. I have developed the
application in C++. The program reads data into a buffer sample by sample
until the buffer has enough data to perform the DWT, eliminate noise, and
run the IDWT. So the time spent waiting for data becomes delay: the output
of my wavelet stage lags real time by about 40 seconds due to the low
sampling rate and the data buffering. There is no problem with denoising
quality or processing speed. I just want something closer to real time.
Any suggestions or ideas are highly appreciated.

Thank you and Merry Christmas
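For a sense of where a ~40 s lag comes from: a block-based DWT cannot run until
the buffer is full, so the latency is simply the block length divided by the
sample rate. A quick sketch (the decomposition level, sampling rate, and block
size below are assumptions for illustration; the post only says fs < 1 kHz and
roughly 40 s of lag):

```python
# Latency of block-based wavelet denoising: the DWT cannot run until a
# full block is buffered, so the output trails the input by the block span.

def block_latency_seconds(block_len: int, sample_rate_hz: float) -> float:
    """Time spent waiting for the buffer to fill before the DWT can run."""
    return block_len / sample_rate_hz

# A level-L dyadic DWT needs a block of at least 2**L samples; a practical
# block is usually several times that so each band has enough coefficients.
level = 7                  # assumed decomposition depth
fs = 100.0                 # assumed sampling rate in Hz (post says < 1 kHz)
block = 2 ** level * 32    # assumed block of 4096 samples

print(block_latency_seconds(block, fs))   # 40.96 -- seconds of lag
```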


"Poj" <netsiri_c@hotmail.com> wrote in message
news:4q-dnZp-37uyOe7anZ2dnUVZ_qGknZ2d@giganews.com...
> Does anyone have experience developing real-time wavelet denoising
software
> without using any DSP chip? I have a problem with delay time caused by > buffering data and want to reduce the delay time as much as possible. The > data I am dealing with is 1-D with sampling rate lower than 1kHz. I have > developed the application in C++. The program reads data into a buffer > sample by sample until the buffer has enough data to perform DWT, > eliminate noise, and IDWT. So, the time to wait for data becomes delay > time. In other words, output of my wavelet is behind real time about 40 > seconds or so due to low sampling rate and data buffering. There is no > problem with denoising quality. or processing speed. I just want a more > realistic real-time. Any suggestions or ideas are highly appreciated.
Throw away the popular wavelet nonsense and implement a reasonable denoising
algorithm, like an adaptive filter or some kind of nonlinear smoothing. Which
algorithm is best depends on what the signal is and what the noise is.
Depending on how much exactly the "high appreciation" is, I might be able to
help you.

Vladimir Vassilevsky
DSP and Mixed Signal Consultant
www.abvolt.com
Thanks for your comment. 

For your information, I have tried an LPF and smoothing filters such as
Savitzky-Golay in Matlab. Neither produced a satisfactory output, because the
frequencies of the intrinsic signal and the noise are very low (<20 Hz) and
close to each other. The noise I am talking about is DC baseline swing, and
in this frequency region it is difficult to use an HPF to block it. Also, the
LPF cutoff is not sharp enough to separate the signal and noise in this
region. I am very happy with the filtering result from my wavelet denoising.
I just want to know what techniques are available to reduce the buffering
delay.
  

> >"Poj" <netsiri_c@hotmail.com> wrote in message >news:4q-dnZp-37uyOe7anZ2dnUVZ_qGknZ2d@giganews.com... >> Does anyone have experience developing real-time wavelet denoising >software >> without using any DSP chip? I have a problem with delay time caused by >> buffering data and want to reduce the delay time as much as possible.
The
>> data I am dealing with is 1-D with sampling rate lower than 1kHz. I
have
>> developed the application in C++. The program reads data into a buffer >> sample by sample until the buffer has enough data to perform DWT, >> eliminate noise, and IDWT. So, the time to wait for data becomes delay >> time. In other words, output of my wavelet is behind real time about
40
>> seconds or so due to low sampling rate and data buffering. There is no >> problem with denoising quality. or processing speed. I just want a
more
>> realistic real-time. Any suggestions or ideas are highly appreciated. > >Throw away the popular wavelet nonsense and implement a reasonable
denoising
>algorithm, like an adaptive filter or some kind of nonlinear smoothing. >What would be the best algorithm depends on what is the signal and what
is
>the noise. Depending on how much exactly is the "high appreciation", I
might
>be able to help you. > >Vladimir Vassilevsky >DSP and Mixed Signal Consultant >www.abvolt.com > > > >
If you are using a linear filter to resolve the frequencies, then the
incurred delay can't be less than about 1/(transition band). It doesn't
matter whether the filter is implemented in a direct form, or by FFT, or as
wavelet shamanism.
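That bound can be checked numerically for a linear-phase FIR using fred
harris's rule of thumb for filter length, N ~ atten_dB / (22 * Bt/fs). The
attenuation, rate, and transition width below are illustrative assumptions,
not numbers from this thread:

```python
# Rough check of the "delay ~ 1/(transition band)" bound for a linear-phase
# FIR.  Filter length is estimated with the harris rule of thumb; the group
# delay of a linear-phase FIR of N taps is (N - 1)/2 samples.

def fir_delay_seconds(transition_hz: float, fs_hz: float,
                      atten_db: float = 60.0) -> float:
    n_taps = atten_db / (22.0 * transition_hz / fs_hz)  # estimated length
    return (n_taps - 1) / 2.0 / fs_hz                   # group delay, seconds

# A 1 Hz transition band at fs = 100 Hz costs over a second of delay,
# regardless of how the filter is implemented:
print(fir_delay_seconds(1.0, 100.0))   # on the order of 1/(1 Hz)
```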

It is sad to hear that you couldn't get a basic HPF/LPF right, so you
had to resort to tom-toms and tambourines.

I suggest you take a different look at the problem without being
carried away by cool buzzwords such as "Savitzky-Golay", "Wavelet",
"Matlab".


Vladimir Vassilevsky
DSP and Mixed Signal Design Consultant
http://www.abvolt.com




On Dec 27, 11:49 am, Vladimir Vassilevsky <antispam_bo...@hotmail.com>
wrote:
> If you are using a linear filter to resolve the frequencies, then the
> incurred delay can't be less than about 1/transition band.
Baseline wander removal is difficult in a real-time system for all the
reasons you cite. I'm guessing you are driving some kind of display with the
result and can't tolerate much lag between stimulus and response. My hat's
off to you if you have found a satisfactory approach. There is a sliding DWT
that you might look at.

John
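The sliding idea in its simplest possible form: a one-level Haar DWT can be
run pair-by-pair as samples arrive, so the latency is one sample rather than
a whole block. The sketch below (entirely illustrative; the multi-level
scheme the thread discusses is more involved) soft-thresholds the detail
coefficient, which is the usual denoising step:

```python
# "Sliding" one-level Haar denoiser: each incoming pair of samples is
# transformed, its detail coefficient soft-thresholded, and the pair
# reconstructed immediately -- one sample of latency instead of a block.
import math

def haar_pair_denoise(x0: float, x1: float, threshold: float) -> tuple:
    s = math.sqrt(2.0)
    a = (x0 + x1) / s                  # approximation (low-pass) coefficient
    d = (x0 - x1) / s                  # detail (high-pass) coefficient
    d = math.copysign(max(abs(d) - threshold, 0.0), d)  # soft threshold
    return (a + d) / s, (a - d) / s    # inverse Haar transform of the pair

# A flat pair passes through essentially unchanged; a small difference
# between the samples (treated as noise) is shrunk toward zero:
print(haar_pair_denoise(1.0, 1.0, 0.1))
```

Extending this to level 7 means cascading the low-pass outputs through six
more such stages, each running at half the rate of the one before it.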

>> If you are using a linear filter to resolve the frequencies, then the
>> incurred delay can't be less than about 1/transition band.

> Baseline wander removal is difficult in a realtime system for all the
> reasons you cite.
The only way to trick nature is to use parametric models of the signal and
the drift. The parameters should be estimated by maximum likelihood (ML).
However, the results are going to be wild if the models do not match.

VLV
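One very simple concrete instance of the parametric approach: model the
baseline over a short window as a straight line, fit it by least squares
(which is the ML estimate under Gaussian noise), and subtract the fit. This
is purely a sketch; a real drift model would be chosen to match the physics
of the sensor, exactly as the caveat above warns:

```python
# Least-squares linear detrend of a short window -- the ML baseline estimate
# if the drift really is linear and the noise really is Gaussian.

def detrend_window(x: list) -> list:
    """Remove the least-squares linear trend from a window of samples."""
    n = len(x)
    mean_t = (n - 1) / 2.0
    mean_x = sum(x) / n
    num = sum((t - mean_t) * (xi - mean_x) for t, xi in enumerate(x))
    den = sum((t - mean_t) ** 2 for t in range(n))
    slope = num / den if den else 0.0
    return [xi - (mean_x + slope * (t - mean_t)) for t, xi in enumerate(x)]

# A pure ramp (drift with no signal on top) is removed entirely:
print(detrend_window([0.0, 1.0, 2.0, 3.0]))   # ~[0.0, 0.0, 0.0, 0.0]
```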
On Dec 27, 12:59 pm, Vladimir Vassilevsky <antispam_bo...@hotmail.com>
wrote:
> The only way to trick the nature is using parametric models of the
> signal and the drift.
The drift is difficult to model. In the case of a Galvanic Skin Response
(GSR), the baseline can sometimes hold flat for many seconds, then all of a
sudden plunge downward.

John
Hi John,

Thanks for the message.

Yes, you are right. The lag between input and output is too much. Do you
have any articles, links, or other information related to the sliding DWT?

Actually I have sliding wavelet kernels in my program for both the DWT and
IDWT. However, since my wavelet gives the best result at Level 7, I have to
freeze the real-time data in the buffer in order to decompose it to Level 7,
extract the signal I want, and reconstruct it back to Level 1. I am still
looking for a better way to implement this in a real-time environment.

I combined wavelet denoising with a bidirectional HPF (wavelet denoising
first, then forward HPF, and finally backward HPF) to extract the signal,
remove the baseline wander and Gaussian white noise, and compensate for the
phase shift. This combination gives better results than anything else I have
tested so far. The bidirectional HPF uses pure sliding kernels and does not
cause a significant lag problem at all.

-poj
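The forward/backward high-pass step can be sketched as follows: running the
same filter forwards and then backwards over a buffer cancels its phase
shift (the same trick as Matlab's filtfilt). The one-pole filter, cutoff,
and buffer length here are invented for illustration; Poj's actual kernels
are not shown in the thread:

```python
# Zero-phase high-pass by filtering forwards, then backwards, over a buffer.
import math

def hpf(x: list, fc_hz: float, fs_hz: float) -> list:
    """Simple one-pole high-pass filter."""
    a = 1.0 / (1.0 + 2.0 * math.pi * fc_hz / fs_hz)
    y, y_prev, x_prev = [], 0.0, x[0]
    for xn in x:
        y_prev = a * (y_prev + xn - x_prev)   # differentiate, then leak
        x_prev = xn
        y.append(y_prev)
    return y

def zero_phase_hpf(x: list, fc_hz: float, fs_hz: float) -> list:
    fwd = hpf(x, fc_hz, fs_hz)
    return hpf(fwd[::-1], fc_hz, fs_hz)[::-1]  # backward pass undoes the lag

# A constant buffer (pure baseline) is rejected completely:
out = zero_phase_hpf([5.0] * 32, fc_hz=0.5, fs_hz=100.0)
print(max(abs(v) for v in out))   # ~0: DC removed, no phase shift
```

Note that the backward pass still needs the whole buffer before it can run,
so this step trades a small block delay for zero phase distortion.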

Vladimir Vassilevsky wrote:

> The only way to trick the nature is using parametric models of the
> signal and the drift. The parameters should be estimated as ML. However
> the results are going to be wild if the models do not match.
How about fixing the baseline? A muffler on the noise source is usually more
effective than noise-canceling earmuffs.

Jerry
--
Engineering is the art of making what you want from things you can get.
John wrote:

> The drift is difficult to model. In the case of a Galvanic Skin
> Response (GSR), sometimes the baseline can be holding flat for many
> seconds, then all of a sudden it plunges downward.
That isn't baseline; it's an as-yet unaccounted-for phenomenon that you're
measuring. There's information in that thar noise. Can you find a way to use
it?

Jerry
--
Engineering is the art of making what you want from things you can get.