Reply by Steve Pope May 27, 2010
This problem makes sense only when the delay between the signals
is comfortably larger than the inverse frequency
of any signal in the passband, but not too large.

For example, if the passband is 100 Hz to 5 kHz, and the
delay is 50 milliseconds, you *might* be able to construct
an allpass filter that corrects for phase down to about 20 
degrees, and that itself only has a delay of 20 milliseconds,
although it's going to be one hairy filter.  It may not
even be possible.
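
One way to read that 20-degree figure: it's 20/360 of a period of residual
timing error, which gets brutal at the top of the band.  Quick arithmetic
(a sketch, nothing more; Python just for the numbers):

```python
# 20 degrees of residual phase = 20/360 of a period of timing error
for f_hz in (100.0, 1000.0, 5000.0):
    tol_us = (20.0 / 360.0) / f_hz * 1e6
    print(f"{f_hz:6.0f} Hz: +/- {tol_us:6.1f} microseconds")
# +/- 555.6 us at 100 Hz, +/- 55.6 us at 1 kHz, +/- 11.1 us at 5 kHz
```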

If it is possible, it might subjectively sound like the effect 
you are probably going for -- a sort of neutral doubling, say for 
doubling vocals.

I think you will have better results with an analysis method
where the allpass only tries to correct for phase at
a smaller number of frequency points where most of the energy is,
than with a blind method.
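
A minimal sketch of that idea (assuming a plain FIR rather than a true
allpass; phase_align_fir is just an illustrative name, and the tap count
and frequency list are made up): solve a small least-squares problem so
the filter's response at each chosen frequency equals exp(+j*2*pi*f*D),
cancelling the delay's phase rotation there, mod 2*pi.  Magnitude is only
pinned at those points; a real design would add magnitude constraints on
a denser frequency grid.

```python
import numpy as np

def phase_align_fir(freqs_hz, delay_s, fs_hz, num_taps):
    """Causal FIR whose response at each frequency in freqs_hz equals
    exp(+2j*pi*f*delay_s), i.e. it undoes the phase rotation that a fixed
    delay of delay_s seconds causes at exactly those frequencies."""
    f = np.asarray(freqs_hz, dtype=float)
    n = np.arange(num_taps)
    # E[i, k] = response of tap k at frequency f_i
    E = np.exp(-2j * np.pi * np.outer(f, n) / fs_hz)
    # Want H(f_i) * exp(-2j*pi*f_i*delay_s) = 1 at each constraint point
    g = np.exp(2j * np.pi * f * delay_s)
    # Stack real/imag parts so the least-squares solve stays real-valued
    A = np.vstack([E.real, E.imag])
    b = np.concatenate([g.real, g.imag])
    h, *_ = np.linalg.lstsq(A, b, rcond=None)
    return h

# e.g. three dominant partials found by an FFT of the material (made up)
h = phase_align_fir([233.0, 441.0, 1209.0], delay_s=0.050,
                    fs_hz=48000.0, num_taps=256)
```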

Steve
Reply by Fred Marshall May 27, 2010
toneburst wrote:
> Hi,
>
> Here is an audio signal processing problem that I have not been able to
> find an answer to:
>
> I would like to analyze the effects of adding a delayed version of a signal
> to itself, while having the delayed signal in-phase with the non-delayed
> signal. Essentially, I'd like to "undo" phase rotation caused by (fixed)
> time delay, using some sort of phase equalization filter. This would only
> be for audio signals, and the filter can ONLY operate on the delayed
> signal. Other constraints:
>
> - MUST be causal (to be implemented in a real-time system)
> - Filter magnitude should be as flat as possible
> - The wider the frequency region in which this filter flattens the phase
> response, the better
> - The type (IIR/FIR) and order of filter are generally not a concern
> - Group delay is NOT a concern
>
> Hope this makes sense. Any advice would be helpful, except the bad kind
> =P
>
> Thanks in advance!
> Rick
3:48 a.m. huh?  What were you smokin' dude?  Just "undelay" it and you've
got it.... :-)

I don't know how to do it but can kick it around...

It seems you'd want an added delay that's frequency dependent - now that's
dispersive, no?  The lower the frequency, the greater the necessary delay
in order to line things up.  And, depending on the original delay, that
added delay could be as much as nearly equal to the original.

Like this:  Let's say the original delay is 200 msec - the period of a
5 Hz sinusoid.  Not really too long...  So, at 5 Hz, the alignment is no
problem.  At 6 Hz the period is 167 msec, so you've delayed it 1.2 periods
and now must delay it another 0.8 periods to align phase - around 133 msec.
At 11 Hz, the period is 90.9 msec, so you've delayed it 2.2 periods and now
must delay it another 0.8 periods - around 72 msec.  And at 12 Hz, a delay
of 50 msec.  And so forth.....

If you accept this model of the situation then I think the expression for
the added delay (the "filter" specification) would be built from:

D = the original delay
K = an integer that's "large enough" to keep nonlinear things like INT()
out of the expression
p = 1/f

You need to delay at f by the remaining fraction of a period after the
initial delay.  D/p is the number of periods at f in the original delay D -
either a fraction, or a fraction plus an integer.  K - D/p is then a
positive number that contains the fraction of a period we need to add at f
(one minus the fractional part of D/p), plus some whole number of periods -
and whole periods are of no consequence.

You want (using Fourier transform pairs for perhaps insight):

f(t) <> F(w)

f(t - p(K - D/p)) = f(t - pK + D) <> F(w)e^(-jw(pK - D))
                                   = F(w) * e^(-jwpK) * e^(jwD)

But with w = 2*pi*f and p = 1/f, the factor e^(-jwpK) = e^(-j*2*pi*K) = 1
for integer K.  All that survives is e^(jwD) - the original delay simply
undone, a pure (non-causal) advance - so that can't be right.

oh well
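Those per-frequency added delays are at least easy to tabulate.  A quick
sketch (Python just for the arithmetic; extra_delay is only an
illustrative name):

```python
def extra_delay(f_hz, delay_s):
    """Remaining fraction of a period at f_hz after an initial delay."""
    period = 1.0 / f_hz
    frac = (delay_s / period) % 1.0   # fractional periods already elapsed
    return 0.0 if frac == 0.0 else (1.0 - frac) * period

for f in (5.0, 6.0, 11.0, 12.0):
    print(f"{f:4.0f} Hz: add {extra_delay(f, 0.200) * 1000:5.1f} msec")
# 5 Hz: 0.0, 6 Hz: 133.3, 11 Hz: 72.7, 12 Hz: 50.0 - the numbers above
```

Fred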
Reply by Clay May 27, 2010
On May 27, 6:48 am, "toneburst" <mr.woffer@n_o_s_p_a_m.gmail.com>
wrote:
> Hi,
>
> Here is an audio signal processing problem that I have not been able to
> find an answer to:
>
> I would like to analyze the effects of adding a delayed version of a signal
> to itself, while having the delayed signal in-phase with the non-delayed
> signal. Essentially, I'd like to "undo" phase rotation caused by (fixed)
> time delay, using some sort of phase equalization filter. This would only
> be for audio signals, and the filter can ONLY operate on the delayed
> signal. Other constraints:
>
> - MUST be causal (to be implemented in a real-time system)
> - Filter magnitude should be as flat as possible
> - The wider the frequency region in which this filter flattens the phase
> response, the better
> - The type (IIR/FIR) and order of filter are generally not a concern
> - Group delay is NOT a concern
>
> Hope this makes sense. Any advice would be helpful, except the bad kind
> =P
>
> Thanks in advance!
> Rick
You have contradictory requirements!  Something needs to give before a
practical solution is possible.

Clay
Reply by toneburst May 27, 2010
Hi,

Here is an audio signal processing problem that I have not been able to
find an answer to:

I would like to analyze the effects of adding a delayed version of a signal
to itself, while having the delayed signal in-phase with the non-delayed
signal.  Essentially, I'd like to "undo" phase rotation caused by (fixed)
time delay, using some sort of phase equalization filter.  This would only
be for audio signals, and the filter can ONLY operate on the delayed
signal.  Other constraints:

- MUST be causal (to be implemented in a real-time system)
- Filter magnitude should be as flat as possible
- The wider the frequency region in which this filter flattens the phase
response, the better
- The type (IIR/FIR) and order of filter are generally not a concern
- Group delay is NOT a concern
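
To make the problem concrete, here is what plain delay-and-add does with
no phase correction - the comb filtering I'd like to avoid (just a sketch;
the sample rate and delay value are arbitrary):

```python
import numpy as np

# Plain delay-and-add: y[n] = x[n] + x[n - d].  Its magnitude response is
# |1 + exp(-2j*pi*f*d/fs)| = 2*|cos(pi*f*d/fs)|: a comb with nulls at odd
# multiples of fs/(2*d) - here, at odd multiples of 250 Hz.
fs = 48000.0
d = 96                                   # a 2 ms delay, in samples
f = np.array([0.0, 125.0, 250.0, 375.0, 500.0])
gain = np.abs(1.0 + np.exp(-2j * np.pi * f * d / fs))
for fi, g in zip(f, gain):
    print(f"{fi:6.1f} Hz: gain {g:.3f}")
# 0.0 Hz: 2.000 ... 250.0 Hz: 0.000 ... 500.0 Hz: 2.000
```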

Hope this makes sense.  Any advice would be helpful, except the bad kind
=P

Thanks in advance!
Rick