
How to phase align two waveforms?

Started by Brian April 15, 2004
I am recording electric guitar with several microphones. I want to ensure
that each mic's audio is phase aligned with each other. So I was thinking I
could set one of the audio files as a reference file and then phase align
everything to that reference signal. Anyone have any idea how on earth I
would actually perform the phase alignment??

thanks,

brian


Hi Brian!

Calculate the cross-correlation function and find the maximum closest to
zero lag. That lag is the time shift between your signals.
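
A minimal sketch of that idea in Python/NumPy (the array names, the use of
SciPy, and the assumption of two mono arrays at the same sample rate are
mine, not anything specific to your files):

import numpy as np
from scipy.signal import correlate, correlation_lags

def estimate_lag(ref, mic):
    """Lag (in samples) of `mic` relative to `ref`."""
    xc = correlate(mic, ref, mode="full")
    lags = correlation_lags(len(mic), len(ref), mode="full")
    return lags[np.argmax(xc)]          # in practice, search only lags near zero

def align_to_ref(ref, mic):
    """Shift `mic` by the estimated lag so it lines up with `ref`."""
    return np.roll(mic, -estimate_lag(ref, mic))    # crude integer-sample shift

The shift here is a whole number of samples; sub-sample alignment would need
interpolation.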

Best regards,

Andre

Brian wrote:

> I am recording electric guitar with several microphones. I want to ensure
> that each mic's audio is phase aligned with each other. So I was thinking I
> could set one of the audio files as a reference file and then phase align
> everything to that reference signal. Anyone have any idea how on earth I
> would actually perform the phase alignment??
>
> thanks,
>
> brian
-- Please change no_spam to a.lodwig when replying via email!
On Thu, 15 Apr 2004 09:10:32 GMT, "Brian" <brian.huether@NOdlrSPAM.de>
wrote:

>I am recording electric guitar with several microphones. I want to ensure
>that each mic's audio is phase aligned with each other. So I was thinking I
>could set one of the audio files as a reference file and then phase align
>everything to that reference signal. Anyone have any idea how on earth I
>would actually perform the phase alignment??
If the difference between the two microphones was one of purely a time
delay, then you might try the cross correlation technique already proposed.
But it is more likely that the two microphones will contain a combination of
reflections from walls, etc. Different frequencies may be delayed different
amounts. My guess is the time series from the two microphones will not line
up no matter what you do.

But you will have a better chance lining up low frequencies than high
frequencies. Try running each microphone series through a low-pass filter
and then do the cross correlation on that to determine the time difference
to apply to the non-filtered samples.

-Robert Scott
Ypsilanti, Michigan
(Reply through this forum, not by direct e-mail to me, as automatic reply address is fake.)
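
A sketch of that low-pass-then-correlate idea in Python/SciPy (the 500 Hz
cutoff and the signal names are arbitrary assumptions; zero-phase filtering
is used so the filter itself adds no delay of its own):

import numpy as np
from scipy.signal import butter, filtfilt, correlate, correlation_lags

def estimate_lag_lowpassed(ref, mic, fs, cutoff_hz=500.0):
    b, a = butter(4, cutoff_hz / (fs / 2))   # 4th-order low-pass
    ref_lp = filtfilt(b, a, ref)             # filtfilt = zero-phase, no added delay
    mic_lp = filtfilt(b, a, mic)
    xc = correlate(mic_lp, ref_lp, mode="full")
    lags = correlation_lags(len(mic_lp), len(ref_lp), mode="full")
    return lags[np.argmax(xc)]

# The lag found on the filtered signals is then applied to the unfiltered one:
# aligned = np.roll(mic, -estimate_lag_lowpassed(ref, mic, fs))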
"Brian" <brian.huether@NOdlrSPAM.de> writes:

> I am recording electric guitar with several microphones. I want to ensure
> that each mic's audio is phase aligned with each other. So I was thinking I
> could set one of the audio files as a reference file and then phase align
> everything to that reference signal. Anyone have any idea how on earth I
> would actually perform the phase alignment??
Ensure the microphones are all on the same sphere and you'll ensure
alignment at record-time.

--
Randy Yates
Sony Ericsson Mobile Communications
Research Triangle Park, NC, USA
randy.yates@sonyericsson.com, 919-472-1124
"Randy Yates" <randy.yates@sonyericsson.com> wrote in message
news:xxp65c1pcrq.fsf@usrts005.corpusers.net...
> "Brian" <brian.huether@NOdlrSPAM.de> writes: > > > I am recording electric guitar with several microphones. I want to
ensure
> > that each mic's audio is phase aligned with each other. So I was
thinking I
> > could set one of the audio files as a reference file and then phase
align
> > everything to that reference signal. Anyone have any idea how on earth I > > would actually perform the phase alignment?? > > Ensure the microphones are all on the same sphere and you'll > ensure alignment at record-time.
Brian,

You've received some good answers already. In case you didn't catch the
significance:

- Phase is a measure at a single frequency, and you have many frequencies.
Maybe the phase change with frequency is perfectly linear or the same - but
likely not.

- Time delay is a measure that can apply over large frequency ranges.
Perhaps that's what you meant instead of "phase".

Phase is the integral of the time delay. So, for a constant time delay, the
phase varies linearly with frequency. The higher the frequency, the higher
the phase shift for the same time delay. Makes sense, right? More cycles per
unit time....

So, to align two waveforms, you want the time delays to be equal. Finite
impulse response (FIR) filters with symmetry in their coefficients / impulse
response have linear phase and flat delay. Rooms might approximate such a
filter, but probably not completely because of reverberation. If the
reverberation is controlled then it can be a good approximation - subject to
the microphone and amplifier frequency responses.

Randy's suggestion that the microphones be on a sphere (where the source of
sound is at the center) is a good one for a low-reverberation situation and
probably the best you can do anyway. In other words, all microphones are
equidistant from the source of sound. Small differences in distance will
make a big difference in phase at the higher frequencies because the
wavelengths are short. That's why all these things work best at low
frequencies.

Fred
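
To put numbers on the point that a fixed delay produces a phase shift that
grows linearly with frequency, a tiny example (the 100 microsecond delay is
arbitrary, roughly 3.4 cm of extra path at the speed of sound):

tau = 100e-6                        # fixed delay of 100 microseconds
for f in (100.0, 1000.0, 10000.0):  # a few audio frequencies in Hz
    phase_deg = 360.0 * f * tau     # phase shift in degrees at frequency f
    print(f"{f:8.0f} Hz -> {phase_deg:6.1f} degrees")
# 100 Hz -> 3.6 deg, 1 kHz -> 36 deg, 10 kHz -> 360 deg (a full cycle)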
Why don't you simply record an impulse through your setup to capture
the (frequency-dependent) lag of each channel as an impulse
response?

You could then use an FIR filter with the IR of the opposite channel
to compensate for the (channel relative) distortion - that should be
easy and should work if the lags involved aren't too large and your
mics are reasonably good... or am I missing something?
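
If I read that right, the compensation could be as simple as filtering each
channel with the other channel's measured impulse response, so both end up
with the same combined response. A sketch under that reading (h1, h2 are the
measured impulse responses, x1, x2 the recordings - all hypothetical names):

import numpy as np
from scipy.signal import fftconvolve

def cross_compensate(x1, x2, h1, h2):
    """Filter each channel with the *other* channel's impulse response,
    so both carry the same combined response h1 * h2."""
    y1 = fftconvolve(x1, h2, mode="full")
    y2 = fftconvolve(x2, h1, mode="full")
    return y1, y2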

--smb


no-one@dont-mail-me.com (Robert Scott) wrote in message news:<
> If the difference between the two microphones was one of purely a time
> delay, then you might try the cross correlation technique already
> proposed. But it is more likely that the two microphones will contain
> a combination of reflections from walls, etc. Different frequencies
> may be delayed different amounts. My guess is the time series from
> the two microphones will not line up no matter what you do. But you
> will have a better chance lining up low frequencies than high
> frequencies. Try running each microphone series through a low-pass
> filter and then do the cross correlation on that to determine the time
> difference to apply to the non-filtered samples.
>
> -Robert Scott
> Ypsilanti, Michigan
> (Reply through this forum, not by direct e-mail to me, as automatic reply address is fake.)
On Thu, 15 Apr 2004 09:10:32 GMT, "Brian" <brian.huether@NOdlrSPAM.de>
wrote:

>I am recording electric guitar with several microphones. I want to ensure
>that each mic's audio is phase aligned with each other. So I was thinking I
>could set one of the audio files as a reference file and then phase align
>everything to that reference signal. Anyone have any idea how on earth I
>would actually perform the phase alignment??
>
>thanks,
>
>brian
I'll add a tiny bit to an already good dialogue. First, I'll ask a question:
What do you expect to gain by doing such an alignment? How much difference
do you expect in the delays between the microphones?

As has already been pointed out, each microphone is likely already receiving
not only the direct path, but the reflected paths as well. The upshot is
that each input is already a sum of the direct and delayed signals. Summing
the signals from the different microphones without any attempt at phase
alignment likely won't sound any worse than any single microphone, and there
will be the corresponding gain in SNR regardless.

Taking pains to align the signals will provide some slight flattening of the
spectrum, but I'm not sure that anyone without the most golden of ears would
be able to tell the difference (disclaimer: I'm not an audio guy, and I'll
certainly defer to the audio deities among us if I'm wrong here). If the
propagation delay spreads are less than 1/(desired bandwidth), i.e., 1/20 kHz
or about 50 us, then I think little benefit could be expected from coherent
combining. This is just me applying comm-theory stuff to audio, so I may be
off my rocker here, but I think it is a reasonable first analysis.

Eric Jacobsen
Minister of Algorithms, Intel Corp.
My opinions may not be Intel's opinions.
http://www.ericjacobsen.org
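
For scale, that 50 us figure corresponds to a surprisingly small path-length
difference (a quick check, taking the speed of sound as roughly 343 m/s):

c = 343.0              # speed of sound in m/s (approximate)
bw = 20000.0           # desired audio bandwidth in Hz
tau = 1.0 / bw         # 50 microseconds
print(tau * 1e6, "us corresponds to", tau * c * 100, "cm of path difference")
# -> 50 us corresponds to about 1.7 cm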
Brian wrote:

> I am recording electric guitar with several microphones. I want to ensure
> that each mic's audio is phase aligned with each other. So I was thinking I
> could set one of the audio files as a reference file and then phase align
> everything to that reference signal. Anyone have any idea how on earth I
> would actually perform the phase alignment??
Others have answered the question about phase, but....

The whole idea of an electric guitar is that the string vibrations are
magnetically coupled (variable reluctance induces a current in a coil as the
string vibrates). Did you mean to put microphones near the speaker connected
to the amplified signal from the guitar?

There is an old joke: "A plane crashes right on the border between the US
and Canada. On which side do they bury the survivors?"

-- glen