
DWT anti-causality

Started by Unknown August 8, 2006
I wrote:
...
> Time-series prediction using linear methods does require
That should of course read "does *not* require". Sorry.
Andor wrote:
> Denoir wrote:
>
>> Hi,
>>
>> I have been playing around with the discrete wavelet transform (dwt) as
>> a preprocessor for an adaptive system (for system identification and
>> time-series prediction). The problem is however that the DWT is
>> anti-causal and suffers from boundary value problems (due to the signal
>> extension).
>>
>> I've tried the first thing that I could think of - a dwt with an
>> expanding window where I only keep the last sample from each transform.
>> The results were utter rubbish (due to the boundary distortions I
>> suspect).
>>
>> Does anybody know an approach that would work - to enforce causality so
>> that the dwt doesn't use future samples?
>
> Perhaps your problem is how to transform a linear-phase filter into a
> minimum-phase filter?
>
> I can't really tell what your problem is, and why you feel that the DWT
> is a solution. Time-series prediction using linear methods does require
> any bandsplitting. You can directly compute the linear predictor for
> any subband that you wish in the frequency domain.
However, prediction in the transform domain has some benefits. Consider the
lowpass band (i.e., the approximation of the original signal). An N-tap
predictor applied to the lowpass band spans a much longer time interval than
an N-tap predictor applied to the original signal. Thus, prediction in the
approximation band can capture much longer-term correlations in the signal
than plain linear prediction could.

Note also that even though one has to use many predictors in the subband
approach (i.e., one for every band), computationally both are equally cheap
(neglecting the cost of the subband decomposition).

--
Jani Huhtanen
Tampere University of Technology, Pori
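To make the time-span point concrete, here is a minimal sketch (my own, not
from the thread; it assumes NumPy and PyWavelets, and the wavelet, level and
toy signal are illustrative choices). The same 8-tap least-squares predictor
is fit once to the raw signal and once to the level-4 approximation band,
where it effectively looks back on the order of 8 x 2^4 = 128 original
samples.

import numpy as np
import pywt  # PyWavelets -- assumed available; any DWT implementation would do

def lsq_predictor(x, n_taps):
    # Fit an n_taps one-step-ahead linear predictor by least squares.
    rows = [x[i:i + n_taps] for i in range(len(x) - n_taps)]
    A = np.array(rows)                    # each row: n_taps past samples
    b = x[n_taps:]                        # target: the sample that follows
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coeffs

rng = np.random.default_rng(0)
x = np.cumsum(rng.standard_normal(4096))  # toy non-stationary signal

N, J = 8, 4
p_full = lsq_predictor(x, N)              # spans N = 8 original samples
cA = pywt.wavedec(x, "db4", level=J)[0]   # level-J approximation band
p_sub = lsq_predictor(cA, N)              # spans roughly N * 2**J = 128 samples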
Jani Huhtanen wrote:

> Andor wrote:
>> Denoir wrote:
>>> (original question snipped)
>>
>> Perhaps your problem is how to transform a linear-phase filter into a
>> minimum-phase filter?
>>
>> I can't really tell what your problem is, and why you feel that the DWT
>> is a solution. Time-series prediction using linear methods does require
>> any bandsplitting. You can directly compute the linear predictor for
>> any subband that you wish in the frequency domain.
>
> However, prediction in the transform domain has some benefits. Consider
> the lowpass band (i.e., the approximation of the original signal). An
> N-tap predictor applied to the lowpass band spans a much longer time
> interval than an N-tap predictor applied to the original signal. Thus,
> prediction in the approximation band can capture much longer-term
> correlations in the signal than plain linear prediction could.
>
> Note also that even though one has to use many predictors in the subband
> approach (i.e., one for every band), computationally both are equally
> cheap (neglecting the cost of the subband decomposition).
Oops, I missed your last sentence, Andor. What do you mean by that? Although
one could theoretically do that, why bother doing it in full band?

--
Jani Huhtanen
Tampere University of Technology, Pori
Jani Huhtanen wrote:

> Jani Huhtanen wrote:
>
>> Andor wrote:
>>> (snip)
>>>
>>> I can't really tell what your problem is, and why you feel that the DWT
>>> is a solution. Time-series prediction using linear methods does require
>>> any bandsplitting. You can directly compute the linear predictor for
>>> any subband that you wish in the frequency domain.
>>
>> (snip)
>
> Oops, I missed your last sentence, Andor. What do you mean by that?
> Although one could theoretically do that, why bother doing it in full
> band?
Since the OP was unhappy with the performance of his DWT, I was suggesting
ways to achieve similar results using other techniques. Another possibility
for giving predictors non-uniform frequency resolution is to use frequency
warping.

As I said, not knowing the OP's real intentions, we can only guess.

Regards,
Andor
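As a concrete illustration of the frequency-warping idea, here is a minimal
sketch of a warped linear predictor (my own code; the allpass coefficient,
the function names, and the least-squares fit are assumptions, not anything
Andor specified). The unit delays of an ordinary FIR predictor are replaced
by first-order allpass sections, which gives finer resolution at low
frequencies for a positive warping coefficient.

import numpy as np

def warped_taps(x, n_taps, lam=0.5):
    # Pass x through a chain of first-order allpass sections
    # D(z) = (-lam + z^-1) / (1 - lam*z^-1); row k is x after k sections.
    # lam = 0 reduces to ordinary unit delays.
    taps = np.zeros((n_taps + 1, len(x)))
    taps[0] = np.asarray(x, dtype=float)
    for k in range(1, n_taps + 1):
        xin, yout = taps[k - 1], taps[k]
        for n in range(len(x)):
            x1 = xin[n - 1] if n else 0.0
            y1 = yout[n - 1] if n else 0.0
            yout[n] = -lam * xin[n] + x1 + lam * y1
    return taps

def warped_predictor(x, n_taps, lam=0.5):
    # Least-squares one-step predictor: estimate x[n+1] from the outputs
    # of allpass stages 1..n_taps evaluated at time n.
    taps = warped_taps(x, n_taps, lam)
    A = taps[1:, :-1].T                   # warped "delays" up to time n
    b = np.asarray(x, dtype=float)[1:]    # target: the next sample
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coeffs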
Andor wrote:

> Jani Huhtanen wrote:
>> (snip)
>>
>> Oops, I missed your last sentence, Andor. What do you mean by that?
>> Although one could theoretically do that, why bother doing it in full
>> band?
>
> Since the OP was unhappy with the performance of his DWT, I was suggesting
> ways to achieve similar results using other techniques. Another possibility
> for giving predictors non-uniform frequency resolution is to use frequency
> warping.
>
> As I said, not knowing the OP's real intentions, we can only guess.
Ah, ok then :)

--
Jani Huhtanen
Tampere University of Technology, Pori
Andor wrote:
> Since the OP was unhappy with the performance of his DWT, I was suggesting
> ways to achieve similar results using other techniques. Another possibility
> for giving predictors non-uniform frequency resolution is to use frequency
> warping.
>
> As I said, not knowing the OP's real intentions, we can only guess.
The intention is to use it as a pre-processing stage for a dynamic neural
network. The reason the transform is so suitable is that it can extract both
local and global multiresolution information from non-stationary processes.
The decomposition isn't something the neural net can do on its own, but it
makes great use of the signal components. In addition, the wavelet filters
themselves can be adapted to fit the problem.

It's really a great transform for the purpose, except for that pesky
causality bit.
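For what it's worth, a minimal sketch of that kind of pre-processing
pipeline might look like the following (entirely illustrative: the poster
names no library, wavelet, window length, or network, so PyWavelets and the
64-sample window are assumptions). Each feature vector is built only from
samples up to the current time, so the dataset itself is causal; the
boundary handling inside each window is exactly the open issue of this
thread.

import numpy as np
import pywt  # PyWavelets -- assumed; the thread does not name a library

def dwt_features(window, wavelet="db4", level=3):
    # Multiresolution feature vector for one history window:
    # concatenate the approximation and detail coefficients.
    coeffs = pywt.wavedec(window, wavelet, level=level)
    return np.concatenate(coeffs)

def training_set(x, win=64, wavelet="db4", level=3):
    # Features from each past window of length `win`; the target is the
    # next sample, so this is a one-step-ahead prediction dataset that a
    # downstream learner (e.g. a neural net) could be trained on.
    X, y = [], []
    for n in range(win, len(x)):
        X.append(dwt_features(x[n - win:n], wavelet, level))
        y.append(x[n])
    return np.array(X), np.array(y)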
Jerry Avins wrote:
> BTW, a sliding FFT, which is what you just described, needn't be
> inefficient. Google it.
Well, not quite. First of all, I was referring to an expanding window that
uses all the historic data, not just a frame of it. In addition, the FFT is
in several ways inferior to the DWT for this task: while a sliding window
does localize things a bit, you sacrifice the global picture completely,
which makes it unsuitable for non-stationary processes with local
non-periodic properties.

Apart from the expanding window, I can think of another method:
fake-extending the signal. The basic idea is that I pad the signal
(symmetric extension or something like that) by the length of the maximum
needed delay for the original signal. I then do the transform and pick my
sample from its original point in time, ignoring the extended samples. I
haven't tried it yet, but I suspect I'll end up with the same results as
with the expanding window...
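Here is a rough sketch of that fake-extension idea, just to pin down what is
meant (my own code and naming; PyWavelets is assumed, and the mapping from
the last real sample to a coefficient index in each decimated band is only
approximate):

import numpy as np
import pywt  # PyWavelets -- assumed; any DWT implementation would do

def fake_extended_coeffs(history, wavelet="db2", level=3, pad=None):
    # Pad the known history on the right with a mirror (symmetric) extension
    # roughly as long as the maximum filter delay, run an ordinary multilevel
    # DWT, and keep only the coefficient in each band that lines up with the
    # last *real* sample, ignoring everything computed from the padding.
    if pad is None:
        pad = pywt.Wavelet(wavelet).dec_len * 2 ** level  # crude "max delay"
    pad = min(pad, len(history))
    extended = np.concatenate([history, history[-pad:][::-1]])
    coeffs = pywt.wavedec(extended, wavelet, level=level)
    feats = []
    for band in coeffs:
        # approximate position of "now" in this decimated band
        idx = int(len(band) * len(history) / len(extended)) - 1
        feats.append(band[max(idx, 0)])
    return np.array(feats)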
lucas.denoir@gmail.com wrote:

> ... It's really a great transform for the purpose, except for that
> pesky causality bit.
That reminds me of a principal's quip: "Running a school would be a great
job if it weren't for the kids."

A global view of a long interval implies low frequency. Long delays go with
that territory. You don't have non-causality; you simply haven't
incorporated enough delay.

Jerry
--
Engineering is the art of making what you want from things you can get.
Jerry Avins wrote:

> lucas.denoir@gmail.com wrote:
>
>> ... It's really a great transform for the purpose, except for that
>> pesky causality bit.
>
> That reminds me of a principal's quip: "Running a school would be a great
> job if it weren't for the kids."
>
> A global view of a long interval implies low frequency.
I have learned that narrow bands imply long intervals. Could you elaborate
on why a global view of a long interval implies low frequency? Perhaps I'm
not entirely clear on what is meant by "global view" here. :\

--
Jani Huhtanen
Tampere University of Technology, Pori
Jani Huhtanen wrote:
> Jerry Avins wrote:
>
>> lucas.denoir@gmail.com wrote:
>>
>>> ... It's really a great transform for the purpose, except for that
>>> pesky causality bit.
>>
>> That reminds me of a principal's quip: "Running a school would be a great
>> job if it weren't for the kids."
>>
>> A global view of a long interval implies low frequency.
>
> I have learned that narrow bands imply long intervals. Could you elaborate
> on why a global view of a long interval implies low frequency? Perhaps I'm
> not entirely clear on what is meant by "global view" here. :\
Let's suppose that the sample rate is 8 kHz, and that periods of a minute
carry significance that would be lost by considering only shorter intervals.
That's nearly half a million samples that need to be treated jointly.

Jerry
--
Engineering is the art of making what you want from things you can get.