Reply by wuji...@163.com April 19, 2006
Hi,
I know a scrambler is needed for timing recovery in a QAM demodulator. If
I want to implement a QAM demodulator in software, that is, with everything
done by programming after an A/D converter running from a fixed clock, how
does the scrambler work? There is no training sequence available in a
multipoint broadcast system. If a carrier frequency offset exists, does this
method still work well? How can I get this kind of information? Any papers?
Thanks
Best Regards
Jeff
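
A note for readers of this thread: the scrambler's role here is to randomize the transmitted bits so that the received symbol stream has enough transitions (and a flat enough spectrum) for a timing-error detector to work on, even with no training sequence. Below is a minimal Python sketch of a self-synchronizing (multiplicative) scrambler and descrambler; the tap positions 18 and 23 match the V.34-style polynomial 1 + x^-18 + x^-23, but the polynomial choice, function names, and test data are illustrative assumptions only.

    def scramble(bits):
        # Shift register holds the 23 most recent *output* bits;
        # state[0] is the newest (delay 1), state[22] the oldest (delay 23).
        state = [0] * 23
        out = []
        for b in bits:
            s = b ^ state[17] ^ state[22]   # s[t] = b[t] ^ s[t-18] ^ s[t-23]
            out.append(s)
            state = [s] + state[:-1]
        return out

    def descramble(bits):
        # Same structure, but the register holds past *received* bits, so it
        # resynchronizes within 23 bits of any starting state -- no training
        # sequence needed.
        state = [0] * 23
        out = []
        for b in bits:
            out.append(b ^ state[17] ^ state[22])
            state = [b] + state[:-1]
        return out

    # With matching (all-zero) initial states, recovery is exact from bit 0.
    data = [1, 0, 1, 1, 0, 0, 1, 0] * 10
    assert descramble(scramble(data)) == data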

Reply by Eric Jacobsen April 13, 2006
On Sat, 08 Apr 2006 23:18:46 GMT, Randy Yates <yates@ieee.org> wrote:

>Which would be a better architecture: using a hardware VCO to drive
>the ADC as part of the PLL for timing recovery, or using a
>fixed-crystal oscillator for the ADC and resampling in the digital
>domain? Assume the latter is done on a processor and not in hardware.
Randy, sorry I didn't see this until today, but I've been travelling a lot lately. I'll add a bit to the other answers you've gotten and perhaps touch a point that hasn't been mentioned yet.

I've built systems both ways, and there are advantages and disadvantages to each. The VCO-clock-to-the-ADC approach works well as the symbol rate approaches the maximum clock rate. In other words, the digital resampling method starts to hit some jitter-associated degradation as the oversampling ratio is reduced. If you were to plot performance vs. symbol rate with everything else working well, you'd see some jitter-induced degradation as the symbol rate increases beyond some threshold rate. The VCO/ADC approach doesn't have this problem, since it naturally removes sampling jitter by synchronizing to the symbols, whereas the resampling (aka interpolation) process is inherently limited in this respect. Some of that can be overcome with processing, but in general that's the basic tradeoff as I've seen it.

The obvious downside to the VCO->ADC method is more components on the board, especially analog components, than you'd have otherwise. So if your symbol rates aren't all that high, the all-digital approach is often the best. If you're really power limited or something like that, the tradeoff may be harder, since the all-digital approach may (depending on the requirements) need a lot more gates. That has to be traded off against whatever external components would be added with the VCO (which is often an NCO and a DAC) to make it all work.

Eric Jacobsen
Minister of Algorithms, Intel Corp.
My opinions may not be Intel's opinions.
http://www.ericjacobsen.org
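
To put the resampling (interpolation) step in concrete terms, here is a minimal Python sketch of a cubic Lagrange interpolator evaluating fixed-clock ADC samples at a fractional offset mu, the basic operation inside an all-digital timing recovery loop. In a real demodulator the offset would be steered by the timing-error detector; the fixed resampling ratio, function names, and test signal below are illustrative assumptions, not anything from Eric's designs.

    import numpy as np

    def cubic_interp(x, n, mu):
        # Cubic Lagrange interpolation of x at fractional position n + mu,
        # 0 <= mu < 1, using the four neighbouring samples.
        xm1, x0, x1, x2 = x[n - 1], x[n], x[n + 1], x[n + 2]
        return (xm1 * (-mu * (mu - 1) * (mu - 2) / 6)
                + x0 * ((mu + 1) * (mu - 1) * (mu - 2) / 2)
                + x1 * (-(mu + 1) * mu * (mu - 2) / 2)
                + x2 * ((mu + 1) * mu * (mu - 1) / 6))

    def resample(x, ratio):
        # Evaluate x at times k/ratio (in input-sample units). In a timing
        # loop the step would be adjusted sample by sample, not fixed.
        out, t = [], 1.0                 # start where all four taps exist
        while t < len(x) - 2:
            n = int(t)
            out.append(cubic_interp(x, n, t - n))
            t += 1.0 / ratio
        return np.array(out)

    # Example: resample a sine wave to 1.25x the original rate.
    x = np.sin(2 * np.pi * 0.05 * np.arange(100))
    y = resample(x, 1.25)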
Reply by Bhaskar Thiagarajan April 10, 2006
"Randy Yates" <yates@ieee.org> wrote in message
news:m3hd53odro.fsf@ieee.org...
> Which would be a better architecture: using a hardware VCO to drive
> the ADC as part of the PLL for timing recovery, or using a
> fixed-crystal oscillator for the ADC and resampling in the digital
> domain? Assume the latter is done on a processor and not in hardware.
Hi Randy

I think the better architecture might depend on how flexible you need your system to be. If you are building a general-purpose demodulator that will have several software modules supporting various types of (de)modulation, I'd definitely go with option 2. If you are building a demodulator for a specific type of modulation, or even a specific standard where you know the symbol rates ahead of time, I might look into option 1 a little more (but still be biased towards option 2). I wouldn't even consider option 1 if my team didn't have expertise in building a controllable clock for an ADC input. On the other hand, if speed is an issue, I'd consider option 1 (a resampling block in software does take some processing time and can be a bottleneck for large blocks of data).

In fact, I'm facing a similar problem right now. My hardware guy tells me he can easily provide the flexibility of a half dozen ADC clocks (with some restrictions... for example, they'll only be in a small range, 10 MHz +/- 2 MHz). I think I can do decent resampling in software. I also have an FPGA that can implement a resampler (we have some IP that shouldn't take too much effort to integrate; see the sketch of the typical polyphase structure below). Given this situation, I'm inclined to stick with my firmware (FPGA) and software options, maximizing flexibility and minimizing both hardware cost and development time.

Cheers
Bhaskar
> --
> % Randy Yates % "Watching all the days go by...
> %% Fuquay-Varina, NC % Who are you and who am I?"
> %%% 919-577-9882 % 'Mission (A World Record)',
> %%%% <yates@ieee.org> % *A New World Record*, ELO
> http://home.earthlink.net/~yatescr
>
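
For reference, FPGA resampler IP of the kind Bhaskar mentions is commonly a polyphase filter bank: a set of precomputed fractional-delay FIR filters, with the phase index acting as the quantized fractional offset. The sketch below is a deliberately simplified Python model of that structure; the filter length, phase count, window choice, and names are arbitrary illustrative assumptions and say nothing about the actual IP.

    import numpy as np

    def make_bank(num_phases=32, taps=7, cutoff=0.45):
        # Bank of windowed-sinc fractional-advance FIR filters. Phase p
        # approximates an advance of p/num_phases of a sample; num_phases
        # trades memory for timing resolution, which is FPGA-friendly.
        center = (taps - 1) // 2
        k = np.arange(taps)
        bank = np.zeros((num_phases, taps))
        for p in range(num_phases):
            d = p / num_phases
            h = 2 * cutoff * np.sinc(2 * cutoff * (k - center + d))
            h *= np.hamming(taps)
            bank[p] = h / h.sum()          # normalize DC gain
        return bank

    def polyphase_resample(x, ratio, bank):
        # Evaluate x at times k/ratio by picking the nearest filter phase.
        num_phases, taps = bank.shape
        center = (taps - 1) // 2
        out, t = [], float(taps)           # start with enough history
        while t < len(x) - center - 1:
            n = int(t)
            p = int((t - n) * num_phases) % num_phases
            m = n + center                 # newest input sample the filter sees
            seg = x[m - taps + 1 : m + 1][::-1]   # seg[k] = x[m - k]
            out.append(np.dot(bank[p], seg))      # approx. x(n + (t - n))
            t += 1.0 / ratio
        return np.array(out)

    bank = make_bank()
    y = polyphase_resample(np.sin(2 * np.pi * 0.05 * np.arange(100)), 1.25, bank)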
Reply by Allan Herriman April 10, 2006
On Sun, 09 Apr 2006 19:25:46 GMT, mk<kal*@dspia.*comdelete> wrote:

>On Sun, 09 Apr 2006 18:19:44 GMT, "Anonymous" <someone@microsoft.com>
>wrote:
>
>>Regarding gigabit ethernet: I wasn't aware there was an ADC involved. If so,
>>how many bits? If it's one or two then I agree, phase noise probably
>>doesn't matter so much.
>
>Depending on how much complexity you add to the AFE, you need at least
>6 bits, and some designs use an 8-bit ADC. Are you really with microsoft? I
>don't think it's good form to fake your headers like that.
I also checked the headers in his post, but in my case it was to see if the date was 1-April. Regards, Allan
Reply by mk April 9, 2006
On Sun, 09 Apr 2006 18:19:44 GMT, "Anonymous" <someone@microsoft.com>
wrote:

>Regarding gigabit ethernet: I wasn't aware there was an ADC involved. If so,
>how many bits? If it's one or two then I agree, phase noise probably
>doesn't matter so much.
Depending on how much complexity you add to the AFE, you need at least 6 bits, and some designs use an 8-bit ADC. Are you really with microsoft? I don't think it's good form to fake your headers like that.
Reply by Anonymous April 9, 2006
"mk" <kal*@dspia.*comdelete> wrote in message
news:giii32h0mpvl21cevuk1mckuqjeium69sp@4ax.com...
> On Sun, 09 Apr 2006 13:50:26 GMT, "Anonymous" <someone@microsoft.com>
> wrote:
>
> >I don't think the first option is practicable. I would be concerned about
> >phase noise on the ADC as the clock is bounced around. I would also worry
> >about control stability since, presumably, you are trying to recover a baud
> >line from the ADC data which necessarily will have modulation imparted on it
> >from the vco changes. The feedback loop seems potentially unstable.
>
> This is such a humorous comment. VCO-based clock recovery has been in
> use for decades, and even now there are problems which can't be solved
> digitally for various reasons. (In Gigabit Ethernet over copper, you
> are supposed to transmit with your recovered clock with quite low
> jitter, so a narrow-bandwidth clock recovery pll is nearly the only
> option; for SONET the jitter rejection requirements are so high that
> I'd defy anyone to meet them without a dual-loop pll, and don't even
> think about 10 Gigabit Ethernet over copper.) The pll in the clock
> recovery loop filters the high-frequency noise and follows the
> low-frequency changes nicely, so there are no jumps etc. Of course
> these two requirements are contradictory, so you need to select the
> pll bandwidth carefully and do all the loop stability checks, but it
> is no more difficult than any other feedback control system.
You're talking about a wired link at one clock rate, correct? The original post, I believe, was about sampling an RF signal to do digital demodulation. I assume he wanted a range of baud rates. My digital demodulator tracks over 4 orders of magnitude of baud rate, for example.

Regarding gigabit ethernet: I wasn't aware there was an ADC involved. If so, how many bits? If it's one or two then I agree, phase noise probably doesn't matter so much.

-Clark
Reply by mk April 9, 2006
On Sun, 09 Apr 2006 13:50:26 GMT, "Anonymous" <someone@microsoft.com>
wrote:

>I don't think the first option is practicable. I would be concerned about
>phase noise on the ADC as the clock is bounced around. I would also worry
>about control stability since, presumably, you are trying to recover a baud
>line from the ADC data which necessarily will have modulation imparted on it
>from the vco changes. The feedback loop seems potentially unstable.
>
This is such a humorous comment. VCO-based clock recovery has been in use for decades, and even now there are problems which can't be solved digitally for various reasons. (In Gigabit Ethernet over copper, you are supposed to transmit with your recovered clock with quite low jitter, so a narrow-bandwidth clock recovery pll is nearly the only option; for SONET the jitter rejection requirements are so high that I'd defy anyone to meet them without a dual-loop pll, and don't even think about 10 Gigabit Ethernet over copper.)

The pll in the clock recovery loop filters the high-frequency noise and follows the low-frequency changes nicely, so there are no jumps etc. Of course these two requirements are contradictory, so you need to select the pll bandwidth carefully and do all the loop stability checks, but it is no more difficult than any other feedback control system.
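
To put numbers on "select the pll bandwidth carefully": for a second-order loop with a proportional-plus-integrator filter, the gains follow from the chosen loop noise bandwidth and damping factor. The Python sketch below uses the standard textbook design equations for a digital PLL; kd and k0 stand in for the phase-detector and NCO/VCO gains of whatever loop is being built, and the example numbers are arbitrary.

    def pi_loop_gains(bn_t, zeta=0.7071, kd=1.0, k0=1.0):
        # Proportional/integral gains for a second-order digital PLL, from
        # the normalized noise bandwidth Bn*T and damping factor zeta.
        theta = bn_t / (zeta + 1.0 / (4.0 * zeta))
        denom = 1.0 + 2.0 * zeta * theta + theta * theta
        kp = (4.0 * zeta * theta / denom) / (kd * k0)
        ki = (4.0 * theta * theta / denom) / (kd * k0)
        return kp, ki

    def loop_filter(err, state, kp, ki):
        # One iteration: the integrator tracks frequency offset, the
        # proportional path handles phase; the sum drives the NCO/VCO.
        state += ki * err
        return kp * err + state, state

    # Example: loop noise bandwidth of 1% of the symbol rate, zeta = 1/sqrt(2).
    kp, ki = pi_loop_gains(bn_t=0.01)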
Reply by Allan Herriman April 9, 2006
On Sat, 08 Apr 2006 23:18:46 GMT, Randy Yates <yates@ieee.org> wrote:

>Which would be a better architecture: using a hardware VCO to drive
>the ADC as part of the PLL for timing recovery, or using a
>fixed-crystal oscillator for the ADC and resampling in the digital
>domain? Assume the latter is done on a processor and not in hardware.
Whilst both could be made to work, you should stick to the all-digital approach with a fixed sample rate. DSP is getting cheaper all the time; I can't see the VCXO approach being cost effective today.

Using a VCXO made sense 10-15 years ago for high-speed modems (e.g. > a few Mbit/s). Further back, most designs would have used a VCO regardless of bitrate. Please bear this in mind if reading older academic papers.

Regards,
Allan
Reply by Allan Herriman April 9, 2006
On Sun, 09 Apr 2006 13:50:26 GMT, "Anonymous" <someone@microsoft.com>
wrote:

>I don't think the first option is practicable.
It has been used in many successful demodulators. Prior to cheap DSP, it was the only practical way to make a demodulator using coherent detection.
>I would be concerned about
>phase noise on the ADC as the clock is bounced around.
Controlling the phase of the VCXO doesn't imply excessive "bouncing".
>I would also worry
>about control stability since, presumably, you are trying to recover a baud
>line from the ADC data which necessarily will have modulation imparted on it
>from the vco changes.
The feedback loop locks the frequency to the incoming symbol rate. The VCO will change if the incoming rate changes, otherwise there will be no modulation except for noise.
>The feedback loop seems potentially unstable.
*Any* feedback loop is potentially unstable. The feedback loop is either controlling the analog clock phase, or it's controlling the phase of an NCO in a digital downconverter (or its equivalent; a minimal NCO sketch follows at the end of this post). I'm not sure why you think an analog one won't work when a digital one would be ok, since that's what your post seems to imply.
>Also, generally, the ADC is run at a high rate to digitize an IF, so you
>can't really change that clock much and still meet Nyquist. It might work at
>one baud rate, but you certainly could not do a large range of baud rates.
The OP's use of the term 'VCXO' implied frequency changes of at most 100ppm or so.
>-Clark
Allan
> >"Randy Yates" <yates@ieee.org> wrote in message >news:m3hd53odro.fsf@ieee.org... >> Which would be a better architecture: using a hardware VCO to drive >> the ADC as part of the PLL for timing recovery, or using a >> fixed-crystal oscillator for the ADC and resampling in the digital >> domain? Assume the latter is done on a processor and not in hardware. >> -- >> % Randy Yates % "Watching all the days go by... >> %% Fuquay-Varina, NC % Who are you and who am I?" >> %%% 919-577-9882 % 'Mission (A World Record)', >> %%%% <yates@ieee.org> % *A New World Record*, ELO >> http://home.earthlink.net/~yatescr >
Reply by Anonymous April 9, 2006
I don't think the first option is practicable. I would be concerned about
phase noise on the ADC as the clock is bounced around. I would also worry
about control stability since, presumably, you are trying to recover a baud
line from the ADC data which necessarily will have modulation imparted on it
from the vco changes. The feedback loop seems potentially unstable.

Also, generally, the ADC is run at a high rate to digitize an IF, so you
can't really change that clock much and still meet Nyquist. It might work at
one baud rate, but you certainly could not do a large range of baud rates.

-Clark

"Randy Yates" <yates@ieee.org> wrote in message
news:m3hd53odro.fsf@ieee.org...
> Which would be a better architecture: using a hardware VCO to drive
> the ADC as part of the PLL for timing recovery, or using a
> fixed-crystal oscillator for the ADC and resampling in the digital
> domain? Assume the latter is done on a processor and not in hardware.
> --
> % Randy Yates % "Watching all the days go by...
> %% Fuquay-Varina, NC % Who are you and who am I?"
> %%% 919-577-9882 % 'Mission (A World Record)',
> %%%% <yates@ieee.org> % *A New World Record*, ELO
> http://home.earthlink.net/~yatescr