
Phase Lag vs. Sampling Frequency

Started by Unknown December 12, 2007
Gents,

I know that the amount of system phase lag, introduced by a DSP alone,
decreases with an increase in its sampling frequency.

Assuming the ADC read + DAC write both happen within a sample period
and assuming the only work the DSP does is setting its output=input,
does anyone know a formula that describes the relationship between
phase lag and sampling frequency? Is that relationship linear?

I set up a small and simple experiment whereby I sample at various
frequencies and immediately output the sampled value to a DAC; I use
no analog filtering before or after the ADC/DAC. I use a frequency
response analyzer to observe the DSP's magnitude and phase response as
I sweep the input frequency from 0.1Hz to 100Hz, but based on my
experiment I see no clear relationship between the two.

When I sample at 1kHz I see a certain amount of phase lag at 100Hz.
When I sample at 3.5kHz I see almost 20deg LESS phase lag at 100Hz
than I did at Fs=1kHz - great. However, when I sample at 6kHz the
phase lag only improves by 2deg over the phase lag observed at 3.5kHz.
Shouldn't the lag decrease in proportion to the increase in sampling
frequency?

Thanks,
Mark
On 12 Dec, 06:58, gtsunf...@hotmail.com wrote:
> Gents,
>
> I know that the amount of system phase lag, introduced by a DSP alone,
> decreases with an increase in its sampling frequency.
How do you know that? At what frequency do you measure this phase lag?
> Assuming the ADC read + DAC write both happen within a sample period
> and assuming the only work the DSP does is setting its output=input,
> does anyone know a formula that describes the relationship between
> phase lag and sampling frequency? Is that relationship linear?
I don't know. Are you sure you don't mean that the DSP introduces a
constant time delay? In that case one may say the DSP contributes a
linear phase shift to the signal, assuming that one can come up with a
useful reference signal.
> I set up a small and simple experiment whereby I sample at various
> frequencies and immediately output the sampled value to a DAC; I use
> no analog filtering before or after the ADC/DAC. I use a frequency
> response analyzer to observe the DSP's magnitude and phase response as
> I sweep the input frequency from 0.1Hz to 100Hz, but based on my
> experiment I see no clear relationship between the two.
>
> When I sample at 1kHz I see a certain amount of phase lag at 100Hz.
> When I sample at 3.5kHz I see almost 20deg LESS phase lag at 100Hz
> than I did at Fs=1kHz - great. However, when I sample at 6kHz the
> phase lag only improves by 2deg over the phase lag observed at 3.5kHz.
> Shouldn't the lag decrease in proportion to the increase in sampling
> frequency?
Again, phase lag or delay needs a reference to be well-defined. I can't
see any reference signal in the set-up you describe.

Rune
On Tue, 11 Dec 2007 21:58:24 -0800, gtsunfire wrote:

> Gents,
>
> I know that the amount of system phase lag, introduced by a DSP alone,
> decreases with an increase in its sampling frequency.
>
> [...]
>
> When I sample at 1kHz I see a certain amount of phase lag at 100Hz. When
> I sample at 3.5kHz I see almost 20deg LESS phase lag at 100Hz than I did
> at Fs=1kHz - great. However, when I sample at 6kHz the phase lag only
> improves by 2deg over the phase lag observed at 3.5kHz. Shouldn't the
> lag decrease in proportion to the increase in sampling frequency?
>
> Thanks,
> Mark
The phase lag you're referring to comes from delay.

In an ideal world the process of sampling the signal doesn't introduce
any delay. The DAC, however, acts like a zero-order hold with a hold
time of one sampling interval. Such a zero-order hold has a delay of
1/2 a sampling interval.

So in an ideal world you should see a phase lag consistent with a 1/2
sample delay, for input frequencies from DC to light.

The phase delay associated with a pure delay in time is equal to 360
degrees times the frequency times the delay, so at a frequency of 100Hz
I get phase lags of 18 degrees, 5 degrees, and 3 degrees for 1kHz,
3.5kHz, and 6kHz, respectively. This is consistent with the second two
of your measurements; I can't explain why you saw a 20 degree difference
between 1kHz and 3.5kHz where I only get 13 -- did you do your
measurement correctly?

Note that the phase lag at a particular signal frequency is proportional
to the reciprocal of the sampling rate -- it is _not_ linear in the
sampling rate.

--
Tim Wescott
Control systems and communications consulting
http://www.wescottdesign.com

Need to learn how to apply control theory in your embedded system?
"Applied Control Theory for Embedded Systems" by Tim Wescott
Elsevier/Newnes, http://www.wescottdesign.com/actfes/actfes.html
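
For concreteness, a minimal Python sketch of that half-sample-delay
arithmetic, assuming the only delay in the chain is the zero-order
hold's T/2 (any ADC conversion time or processing latency would add
to it):

    # Expected phase lag of a half-sample (zero-order-hold) delay at a
    # fixed signal frequency, for the three sampling rates discussed.
    def zoh_phase_lag_deg(f_signal_hz, fs_hz):
        """Phase lag in degrees: 360 * f * (T/2), with T = 1/fs."""
        delay_s = 0.5 / fs_hz               # half of one sample period
        return 360.0 * f_signal_hz * delay_s

    for fs in (1000.0, 3500.0, 6000.0):
        print(fs, "Hz ->", round(zoh_phase_lag_deg(100.0, fs), 2), "deg")
    # prints 18.0, 5.14 and 3.0 -- the figures quoted above

Measured lag much larger than these figures would point to extra delay
somewhere else in the chain rather than to the hold itself.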
On Dec 12, 7:50 pm, Tim Wescott <t...@seemywebsite.com> wrote:
> On Tue, 11 Dec 2007 21:58:24 -0800, gtsunfire wrote:
>
> [...]
>
> The phase lag you're referring to comes from delay.
>
> In an ideal world the process of sampling the signal doesn't introduce
> any delay. The DAC, however, acts like a zero-order hold with a hold
> time of one sampling interval. Such a zero-order hold has a delay of
> 1/2 a sampling interval.
>
> So in an ideal world you should see a phase lag consistent with a 1/2
> sample delay, for input frequencies from DC to light.
>
> The phase delay associated with a pure delay in time is equal to 360
> degrees times the frequency times the delay, so at a frequency of 100Hz
> I get phase lags of 18 degrees, 5 degrees, and 3 degrees for 1kHz,
> 3.5kHz, and 6kHz, respectively.
>
> [...]
Something like T * sinc(w*T/2) * exp(-j*w*T/2), where sinc(x) =
sin(x)/x, which is the frequency response of a zero-order hold of
duration T secs. The delay introduces phase lag, and this is a problem
when you have feedback, of course. The higher the sample rate, the
smaller the phase lag.

At a frequency of w = ws/2, i.e. half the sampling frequency (Nyquist),
the phase is -(ws/2)*(T/2) = -ws*T/4 = -2*pi/4 = -pi/2, or -90 degrees,
since ws*T = 2*pi. To get, say, only 1 degree of phase shift at that
same frequency you would need to sample about 100 times faster.

Hardy
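
The same relationship is easy to check numerically. A short Python/NumPy
sketch, with the 1 kHz sampling rate picked only as an example value,
that evaluates the zero-order-hold response above and compares its phase
with the pure half-sample-delay model:

    import numpy as np

    fs = 1000.0    # example sampling rate in Hz
    T = 1.0 / fs   # hold time of the zero-order hold, in seconds

    # Frequencies from just above DC up to the Nyquist frequency fs/2.
    f = np.linspace(1.0, fs / 2.0, 500)
    w = 2.0 * np.pi * f

    # Zero-order-hold frequency response:
    #   H(jw) = (1 - exp(-j*w*T)) / (j*w) = T * sinc(w*T/2) * exp(-j*w*T/2)
    H = (1.0 - np.exp(-1j * w * T)) / (1j * w)
    phase_deg = np.degrees(np.angle(H))

    print("phase at 100 Hz :", round(float(np.interp(100.0, f, phase_deg)), 1), "deg")
    print("phase at Nyquist:", round(float(phase_deg[-1]), 1), "deg (expect -90)")
    print("pure-delay model:", round(float(np.degrees(-w[-1] * T / 2.0)), 1), "deg")

The computed phase follows the -w*T/2 line exactly, which is why the lag
at a fixed signal frequency scales with 1/Fs rather than linearly with Fs.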