
Demodulator Timing Recovery Architecture Question

Started by Randy Yates April 8, 2006
Which would be a better architecture: using a hardware VCO to drive
the ADC as part of the PLL for timing recovery, or using a
fixed-crystal oscillator for the ADC and resampling in the digital
domain?  Assume the latter is done on a processor and not in hardware.
-- 
%  Randy Yates                  % "Watching all the days go by...    
%% Fuquay-Varina, NC            %  Who are you and who am I?"
%%% 919-577-9882                % 'Mission (A World Record)', 
%%%% <yates@ieee.org>           % *A New World Record*, ELO
http://home.earthlink.net/~yatescr
On Sat, 08 Apr 2006 23:18:46 GMT, in comp.dsp you wrote:

>Which would be a better architecture: using a hardware VCO to drive
>the ADC as part of the PLL for timing recovery, or using a
>fixed-crystal oscillator for the ADC and resampling in the digital
>domain? Assume the latter is done on a processor and not in hardware.
Better in what sense? First of all, have you verified that resampling
(by interpolation, I assume) is feasible on a processor? The digital
solution (which I prefer) gives you a very clean local clock domain,
since there is no recovered clock, but it makes it impossible to
transmit with your recovered clock if that's necessary. It is also much
easier to simulate. If you actually can do all the processing, you
don't need the recovered clock, and you can meet your BER (or other
system performance and budget) requirements, go for it.
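For what it's worth, resampling by interpolation on a processor can be quite cheap per output sample. A minimal sketch, using 4-point cubic Lagrange interpolation (the fixed ratio here stands in for what a timing loop would actually supply, and all names are illustrative):

```python
def lagrange4(xm1, x0, x1, x2, mu):
    """4-point (cubic) Lagrange interpolation at fractional offset
    mu in [0, 1) between x0 and x1."""
    return (xm1 * (-mu * (mu - 1.0) * (mu - 2.0) / 6.0)
            + x0 * ((mu + 1.0) * (mu - 1.0) * (mu - 2.0) / 2.0)
            + x1 * (-(mu + 1.0) * mu * (mu - 2.0) / 2.0)
            + x2 * ((mu + 1.0) * mu * (mu - 1.0) / 6.0))

def resample(x, ratio):
    """Resample x at a new rate; ratio = input step per output sample
    (fixed here, but a timing-recovery loop would update it)."""
    out, n, mu = [], 1, 0.0
    while n + 2 < len(x):
        out.append(lagrange4(x[n - 1], x[n], x[n + 1], x[n + 2], mu))
        mu += ratio
        n += int(mu)          # advance the whole-sample part
        mu -= int(mu)         # keep only the fractional part
    return out
```

On a linear ramp this reproduces the input exactly; a real demodulator would more likely use a polyphase FIR or Farrow structure for better spectral behavior, but the per-sample cost is of the same order.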
mk<kal*@dspia.*comdelete> writes:

> On Sat, 08 Apr 2006 23:18:46 GMT, in comp.dsp you wrote:
>
>>Which would be a better architecture: using a hardware VCO to drive
>>the ADC as part of the PLL for timing recovery, or using a
>>fixed-crystal oscillator for the ADC and resampling in the digital
>>domain? Assume the latter is done on a processor and not in hardware.
>
> Better in what sense ?
Yeah, good question. I'm not sure myself. I guess there are at least
two senses I am interested in:

1. Performance
2. Simplicity/cost
> First of all have you verified the possibility
> of resampling (by interpolation I assume) with a processor ?
That's what I said, wasn't it? That is option 2.
> Using the digital solution (which I prefer) gives you a very clean
> local clock domain as there is no recovered clock but it makes it
> impossible to transmit with your recovered clock if it's necessary.
> Also simulating it is much easier. If you actually can do all the
> processing and you don't need the recovered clock and you can meet
> your BER (or other system performance and budget) requirements go
> for it.
OK, thanks for your input, mk.
I don't think the first option is practicable. I would be concerned about
phase noise on the ADC as the clock is bounced around. I would also worry
about control stability since, presumably, you are trying to recover a baud
clock from the ADC data, which necessarily will have modulation imparted on
it by the VCO changes. The feedback loop seems potentially unstable.

Also, generally, the ADC is run at a high rate to digitize an IF, so you
can't really change that clock much and still meet Nyquist. It might work at
one baud rate, but you certainly could not cover a large range of baud rates.

-Clark

"Randy Yates" <yates@ieee.org> wrote in message
news:m3hd53odro.fsf@ieee.org...
> Which would be a better architecture: using a hardware VCO to drive
> the ADC as part of the PLL for timing recovery, or using a
> fixed-crystal oscillator for the ADC and resampling in the digital
> domain? Assume the latter is done on a processor and not in hardware.
On Sun, 09 Apr 2006 13:50:26 GMT, "Anonymous" <someone@microsoft.com>
wrote:

>I don't think the first option is practicable.
It has been used in many successful demodulators. Prior to cheap DSP, it
was the only practical way to make a demodulator using coherent detection.
>I would be concerned about
>phase noise on the ADC as the clock is bounced around.
Controlling the phase of the VCXO doesn't imply excessive "bouncing".
>I would also worry
>about control stability since, presumably, you are trying to recover a baud
>line from the ADC data which necessarily will have modulation imparted on it
>from the vco changes.
The feedback loop locks the frequency to the incoming symbol rate. The VCO will change if the incoming rate changes, otherwise there will be no modulation except for noise.
>The feedback loop seems potentially unstable.
*Any* feedback loop is potentially unstable. The feedback loop is either controlling the analog clock phase or it's controlling the phase of an NCO in a digital downconverter (or its equivalent). I'm not sure why you think an analog one won't work when a digital one would be ok, since that's what your post seems to imply.
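For reference, the NCO in such a digital downconverter is just a phase accumulator whose phase (and hence frequency) the loop steers. A minimal sketch, assuming a 32-bit accumulator (the word size and names are illustrative):

```python
ACC_BITS = 32
MASK = (1 << ACC_BITS) - 1

class NCO:
    """Phase-accumulator NCO: each clock, the frequency control word
    is added to the accumulator; the output frequency is
    fcw / 2**ACC_BITS times the clock rate."""
    def __init__(self, fcw):
        self.fcw = fcw      # frequency control word (loop adjusts this)
        self.acc = 0

    def step(self):
        self.acc = (self.acc + self.fcw) & MASK
        return self.acc / float(1 << ACC_BITS)   # phase in cycles, [0, 1)
```

Whether the loop steers this accumulator or an analog VCXO's control voltage, the control problem is the same second-order feedback loop.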
>Also, generally, the ADC is run at a high rate to digitize an IF, so you
>can't really change that clock much and still meet nyquist. It might work at
>one baud rate, but you certainly could not do a large range of baud rates.
The OP's use of the term 'VCXO' implied frequency changes of at most 100ppm or so.
>-Clark
Allan
On Sat, 08 Apr 2006 23:18:46 GMT, Randy Yates <yates@ieee.org> wrote:

>Which would be a better architecture: using a hardware VCO to drive
>the ADC as part of the PLL for timing recovery, or using a
>fixed-crystal oscillator for the ADC and resampling in the digital
>domain? Assume the latter is done on a processor and not in hardware.
Whilst both could be made to work, you should stick to the all-digital
approach with a fixed sample rate. DSP is getting cheaper all the time,
and I can't see the VCXO approach being cost-effective today.

Using a VCXO made sense 10-15 years ago for high-speed modems
(e.g. > a few Mbit/s). Further back, most designs would have used
a VCO regardless of bit rate. Please bear this in mind when reading
older academic papers.

Regards,
Allan
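To make the all-digital, fixed-sample-rate approach concrete, here is a toy sketch of the usual structure: a timing error detector (Gardner's, at 2 samples/symbol) driving a proportional-integral loop filter whose output is the fractional resampling phase. The gain values and names are illustrative, not from anything in this thread:

```python
def gardner_ted(prev_strobe, midpoint, strobe):
    """Gardner timing error detector at 2 samples/symbol: correlates
    the mid-symbol sample with the symbol-to-symbol transition.
    Zero when the midpoint sample falls exactly on the transition."""
    return (strobe - prev_strobe) * midpoint

class TimingLoop:
    """Proportional-integral loop filter accumulating the timing error
    into a fractional resampling phase mu in [0, 1)."""
    def __init__(self, kp=0.05, ki=0.002):      # illustrative gains
        self.kp, self.ki = kp, ki
        self.integrator = 0.0
        self.mu = 0.0

    def update(self, error):
        self.integrator += self.ki * error       # tracks rate offset
        self.mu += self.kp * error + self.integrator
        self.mu -= int(self.mu)                  # wrap into [0, 1); a real
        # loop would also advance/retard the strobe index on wrap
        return self.mu
```

With perfectly timed samples of an alternating rectangular pulse train, e.g. +1, 0, -1, the Gardner error is zero; the integrator term is what lets the loop track a small frequency offset between the fixed crystal and the incoming symbol rate.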
On Sun, 09 Apr 2006 13:50:26 GMT, "Anonymous" <someone@microsoft.com>
wrote:

>I don't think the first option is practicable. I would be concerned about
>phase noise on the ADC as the clock is bounced around. I would also worry
>about control stability since, presumably, you are trying to recover a baud
>line from the ADC data which necessarily will have modulation imparted on it
>from the vco changes. The feedback loop seems potentially unstable.
This is such a humorous comment. VCO-based clock recovery has been in
use for decades, and even now there are problems which can't be solved
digitally for various reasons. In Gigabit Ethernet over copper, you are
supposed to transmit with your recovered clock at quite low jitter, so
a narrow-bandwidth clock-recovery PLL is nearly the only option; for
SONET the jitter rejection requirements are so high that I dare anyone
to meet them without a dual-loop PLL, and don't even think about
10 Gigabit Ethernet over copper.

The PLL in the clock recovery loop filters the high-frequency noise and
follows the low-frequency change nicely, so there are no jumps, etc. Of
course these two requirements are contradictory, so you need to select
the PLL bandwidth carefully and do all the loop stability checks, but
it is no more difficult than any other feedback control system.
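To put numbers on "select the PLL bandwidth carefully and do all the loop stability checks": for the common second-order loop (phase detector gain kd, VCO gain ko, PI loop filter F(s) = kp + ki/s), the textbook design quantities are easy to evaluate. A sketch with made-up example gains:

```python
import math

def pll_design(kd, ko, kp, ki):
    """Second-order PLL with PI loop filter F(s) = kp + ki/s.
    Closed-loop characteristic polynomial: s**2 + kd*ko*kp*s + kd*ko*ki."""
    wn = math.sqrt(kd * ko * ki)               # natural frequency (rad/s)
    zeta = kd * ko * kp / (2.0 * wn)           # damping factor
    # single-sided loop noise bandwidth, classic second-order formula
    bn = (wn / 2.0) * (zeta + 1.0 / (4.0 * zeta))
    return wn, zeta, bn
```

For example, pll_design(1.0, 1.0, 2.0, 1.0) gives wn = 1 rad/s and zeta = 1 (well damped). For clock recovery you would push the loop bandwidth far below the symbol rate to reject high-frequency jitter while still tracking low-frequency wander, which is exactly the trade-off described above.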
"mk" <kal*@dspia.*comdelete> wrote in message
news:giii32h0mpvl21cevuk1mckuqjeium69sp@4ax.com...
> On Sun, 09 Apr 2006 13:50:26 GMT, "Anonymous" <someone@microsoft.com>
> wrote:
>
> >I don't think the first option is practicable. I would be concerned about
> >phase noise on the ADC as the clock is bounced around. I would also worry
> >about control stability since, presumably, you are trying to recover a baud
> >line from the ADC data which necessarily will have modulation imparted on it
> >from the vco changes. The feedback loop seems potentially unstable.
>
> This is such a humorous comment. VCO based clock recovery has been in
> use for decades and even now there are problems which can't be solved
> digitally for various reasons (in Gigabit Ethernet for copper, you are
> supposed to transmit with your recovered clock with quite a bit low
> jitter so a narrow bandwitdh clock recovery pll is nearly the only
> option, also for SONET the jitter rejection requirements are so high I
> dare any one without a dual loop pll and don't even think about 10
> gigabit ethernet over copper).
> The pll in the clock recovery loop filters the high frequency noise
> and follows the low frequency change nicely so there are no jumps etc.
> Of course these two requirements are contradictory so you need to
> select pll bandwidth carefully and do all the loop stability checks
> but it is no more difficult than any other feedback control system.
You're talking about a wired link at one clock rate, correct? The
original post, I believe, was about sampling an RF signal to do digital
demodulation. I assume he wanted a range of baud rates. My digital
demodulator tracks over 4 orders of magnitude of baud rate, for example.

Regarding gigabit ethernet: I wasn't aware there was an ADC involved? If
so, how many bits? If it's one or two then, I agree, phase noise
probably doesn't matter so much.

-Clark
On Sun, 09 Apr 2006 18:19:44 GMT, "Anonymous" <someone@microsoft.com>
wrote:

>Regarding gigabit ethernet: I wasn't aware there was an ADC involved? If so,
>how many bits? If it's one or two then, I agree. phase noise probably
>doesn't matter so much.
Depending on how much complexity you add to the AFE, you need at least
6 bits, and some designs use an 8-bit ADC.

Are you really with Microsoft? I don't think it's good form to fake
your headers like that.
On Sun, 09 Apr 2006 19:25:46 GMT, mk<kal*@dspia.*comdelete> wrote:

>On Sun, 09 Apr 2006 18:19:44 GMT, "Anonymous" <someone@microsoft.com>
>wrote:
>
>>Regarding gigabit ethernet: I wasn't aware there was an ADC involved? If so,
>>how many bits? If it's one or two then, I agree. phase noise probably
>>doesn't matter so much.
>
>Depending on how much complexity you add to the AFE, you need at least
>6 and some designs use an 8 bit ADC. Are you really with microsoft ? I
>don't think it's good form to fake your headers like that.
I also checked the headers in his post, but in my case it was to see if
the date was 1 April.

Regards,
Allan