> Hello Forum,
>
> With reference to F. M. Gardner's article "Interpolation in Digital
> Modems---Part I: Fundamentals", I have a query regarding the term JITTER.
>
> Under the heading "Interpolation Jitter" the author states: "Although
> the kth interpolation is computed FOR a time kTi = (mk + uk)Ts, the
> interpolant is actually delivered coincident with a clock tick no earlier
> than (mk+1)Ts. Therefore the output exhibits a timing jitter with
> peak-to-peak fluctuations of Ts..."
>
> Those of you familiar with Gardner's interpolation-based timing recovery
> will quickly recall that the input signal is sampled with period Ts while
> the interpolator output has period Ti. Other units, such as the TED and
> NCO, work within the feedback loop to adjust the timing.
>
> 1) Can anybody clarify at what rate (Fs = 1/Ts or Fi = 1/Ti) the
> interpolator is actually running? I mean, if there were a hardware
> interpolator, which of Fs or Fi would be its master clock?
I think it is easiest to think of the Gardner structure as a fractional
decimator with output rate set by the period of an NCO. The NCO is
updated on every input sample. When it wraps, an output sample is
produced. Let's say we have a 16-bit unsigned, downcounting NCO and
wish to decimate by 3. Then on each input sample we subtract 65536/3
from the NCO. So a typical series of NCO values would look like:
0
43691 --> interpolate an output sample
21845
0
43691 --> interpolate an output sample
21845
0
43691 --> interpolate an output sample
So you see that the interpolator executes at the *output* rate on
average.
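The NCO bookkeeping above can be sketched in a few lines of Python. The 16-bit width and the ratio of 3 come from the example; the variable names and the integer-truncated step are this sketch's own assumptions (a real loop would let the feedback path trim the step):

```python
NCO_BITS = 16
MOD = 1 << NCO_BITS        # 65536, the NCO modulus
R = 3                      # desired decimation ratio (example value)
step = MOD // R            # 21845, subtracted on every input sample

nco = 0
output_indices = []        # input-sample indices at which interpolants appear
for n in range(12):        # simulate 12 input samples
    prev = nco
    nco = (nco - step) % MOD   # down-count; Python's % supplies the wrap
    if nco > prev:             # underflow => the NCO wrapped this input sample
        output_indices.append(n)   # interpolate one output sample here;
                                   # nco/MOD plays the role of the fraction uk

print(output_indices)      # one output every R inputs on average: [0, 3, 6, 9]
```

Note that the interpolator arithmetic is *clocked* at the input rate (the loop body runs once per input sample) but only *executes* an interpolation when the NCO wraps, i.e. at the output rate on average.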
>
> 2) How do we say that there is "jitter" in the output clock if it is
> "fixed" (Fs or Fi)? "The interpolant is actually delivered coincident with
> a clock tick no earlier than (mk+1)Ts" suggests to me that the master
> clock of the interpolator is Fs = 1/Ts, which is obviously jitter-free?
This is just a manifestation of the fact that you can only deliver an
interpolated output on input clock edges. On average, the number of output
samples per second is correct, but they are "bursty" in that they are not
produced at a constant rate. If you stuck all of the output samples in
a FIFO and then clocked them out using an output clock that ultimately
controls the decimation rate (NCO period) via a feedback loop, that
output clock would jitter with respect to the input clock by up to Ts
peak to peak.
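Gardner's peak-to-peak figure of Ts can be seen numerically with a small sketch. The ratio Ti = 3.2 Ts below is an arbitrary example: each interpolant computed for time k*Ti is delivered at the next input tick (mk+1)*Ts, so it lands late by (1 - uk)*Ts, and that lateness wanders over a range of one input period:

```python
Ts = 1.0                   # input sample period (arbitrary units)
Ti = 3.2 * Ts              # desired output period (example ratio)

jitters = []
for k in range(1, 21):
    ideal = k * Ti                        # k*Ti = (mk + uk)*Ts
    m_k = int(ideal // Ts)                # integer part mk
    delivered = (m_k + 1) * Ts            # next input clock tick, (mk+1)*Ts
    jitters.append(delivered - ideal)     # lateness = (1 - uk)*Ts

# every interpolant is delivered within (0, Ts] of its ideal instant,
# so the delivery-time error fluctuates over a range of Ts peak to peak
print(min(jitters), max(jitters))
```

With Ti/Ts = 3.2 the lateness cycles through roughly 0.8, 0.6, 0.4, 0.2, 1.0 times Ts, which is exactly the "bursty but correct on average" behavior described above.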
John
Reply by rider ● October 3, 2005
Hello Forum,
With reference to F. M. Gardner's article "Interpolation in Digital
Modems---Part I: Fundamentals", I have a query regarding the term JITTER.
Under the heading "Interpolation Jitter" the author states: "Although
the kth interpolation is computed FOR a time kTi = (mk + uk)Ts, the
interpolant is actually delivered coincident with a clock tick no earlier
than (mk+1)Ts. Therefore the output exhibits a timing jitter with
peak-to-peak fluctuations of Ts..."
Those of you familiar with Gardner's interpolation-based timing recovery
will quickly recall that the input signal is sampled with period Ts while
the interpolator output has period Ti. Other units, such as the TED and
NCO, work within the feedback loop to adjust the timing.
1) Can anybody clarify at what rate (Fs = 1/Ts or Fi = 1/Ti) the
interpolator is actually running? I mean, if there were a hardware
interpolator, which of Fs or Fi would be its master clock?
2) How do we say that there is "jitter" in the output clock if it is
"fixed" (Fs or Fi)? "The interpolant is actually delivered coincident with
a clock tick no earlier than (mk+1)Ts" suggests to me that the master
clock of the interpolator is Fs = 1/Ts, which is obviously jitter-free?
Rider
This message was sent using the Comp.DSP web interface on
www.DSPRelated.com