DSPRelated.com
Forums

Demodulator Timing Recovery Architecture Question

Started by Randy Yates April 8, 2006
"Randy Yates" <yates@ieee.org> wrote in message
news:m3hd53odro.fsf@ieee.org...
> Which would be a better architecture: using a hardware VCO to drive
> the ADC as part of the PLL for timing recovery, or using a
> fixed-crystal oscillator for the ADC and resampling in the digital
> domain? Assume the latter is done on a processor and not in hardware.
Hi Randy,

I think the better architecture might depend on how flexible you need your system to be. If you are building a general-purpose demodulator with several software modules that need to support various types of (de)modulation, I'd definitely go with Option 2. If you are building a demodulator for a specific type of modulation, and even a specific standard where you know the symbol rates ahead of time, I might look into Option 1 a little more (but I'd still be biased towards Option 2). I wouldn't even consider Option 1 if my team didn't have expertise in building a controllable clock for an ADC input. Also, if speed is an issue, I'd consider Option 1 (a resampling block in software does take some processing time and can be a bottleneck for large blocks of data).

In fact, I'm facing a similar problem right now. My hardware guy tells me he can easily provide the flexibility of a half dozen ADC clocks (with some restrictions... for example, they'll only be in a small range, 10 MHz +/- 2 MHz). I think I can do decent resampling in software. I also have an FPGA that can implement a resampler (we have some IP that shouldn't take too much effort to integrate). Given this situation, I'm inclined to stick with my firmware (FPGA) and software options to maximize flexibility and minimize hardware cost as well as development time.

Cheers,
Bhaskar
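To make the "resampling in software" option concrete, here is a minimal sketch (my own illustration, not from the thread) of a fractional resampler based on linear interpolation. The function name and the fixed `ratio` parameter are assumptions for illustration; in a real timing-recovery loop the ratio (or a per-sample phase increment) would be driven by the loop filter rather than held constant.

```python
import numpy as np

def fractional_resample(x, ratio):
    """Resample x by `ratio` = out_rate / in_rate using linear interpolation.

    In a real timing-recovery loop, `ratio` would be updated dynamically
    by the loop filter; here it is fixed for illustration.
    """
    n_out = int((len(x) - 1) * ratio)  # limit output so x[i + 1] stays in bounds
    t = np.arange(n_out) / ratio       # fractional input positions
    i = t.astype(int)                  # integer sample index
    mu = t - i                         # fractional interval in [0, 1)
    return (1.0 - mu) * x[i] + mu * x[i + 1]
```

For example, resampling a unit ramp `0, 1, 2, ...` with `ratio = 2.0` yields samples at `0, 0.5, 1.0, ...`, as expected for linear interpolation. For better rejection of interpolation images, a polyphase or Farrow-structure filter would normally replace the linear interpolator.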
> --
> % Randy Yates             % "Watching all the days go by...
> %% Fuquay-Varina, NC      %  Who are you and who am I?"
> %%% 919-577-9882          % 'Mission (A World Record)',
> %%%% <yates@ieee.org>     % *A New World Record*, ELO
> http://home.earthlink.net/~yatescr
On Sat, 08 Apr 2006 23:18:46 GMT, Randy Yates <yates@ieee.org> wrote:

>Which would be a better architecture: using a hardware VCO to drive
>the ADC as part of the PLL for timing recovery, or using a
>fixed-crystal oscillator for the ADC and resampling in the digital
>domain? Assume the latter is done on a processor and not in hardware.
Randy, sorry I didn't see this until today, but I've been travelling a lot lately. I'll add a bit to the other answers you've gotten and perhaps touch a point that hasn't been mentioned yet.

I've built systems both ways, and there are advantages and disadvantages to each. The VCO-clocked ADC approach works well as the symbol rate approaches the maximum clock rate. In other words, the digital resampling method starts to hit some jitter-associated degradation as the oversampling ratio is reduced. If you were to plot performance vs. symbol rate, with everything else working well, you'd see some jitter-induced degradation as the symbol rate increases beyond some threshold rate. The VCO/ADC approach doesn't have this problem, since it removes sampling jitter by synchronizing to the symbols; the resampling (a.k.a. interpolation) process, by contrast, is inherently limited. Some of that can be overcome with processing, but in general that's the basic tradeoff as I've seen it.

The obvious downside to the VCO->ADC method is more components on the board, especially analog components, than you'd have otherwise. So if your symbol rates aren't all that high, the all-digital approach is often the best. If you're really power limited or something like that, the tradeoff may be harder, since the all-digital approach may (depending on the requirements) need a lot more gates. That has to be traded off against whatever external components would be added with the VCO (which is often an NCO and a DAC) to make it all work.

Eric Jacobsen
Minister of Algorithms, Intel Corp.
My opinions may not be Intel's opinions.
http://www.ericjacobsen.org
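As a concrete illustration of the interpolation step discussed above, here is a minimal sketch (my own illustration, not from the thread) of a cubic Catmull-Rom interpolator written in the Farrow form often used inside all-digital timing-recovery loops. The function name and signature are assumptions for illustration.

```python
import numpy as np

def farrow_cubic(x, i, mu):
    """Interpolate x at fractional index i + mu (0 <= mu < 1).

    Catmull-Rom cubic in Farrow (Horner) form, using the four samples
    x[i-1]..x[i+2]. A timing loop would supply (i, mu) from its NCO.
    """
    xm1, x0, x1, x2 = x[i - 1], x[i], x[i + 1], x[i + 2]
    # Polynomial coefficients in mu (Catmull-Rom)
    c0 = x0
    c1 = 0.5 * (x1 - xm1)
    c2 = xm1 - 2.5 * x0 + 2.0 * x1 - 0.5 * x2
    c3 = 0.5 * (x2 - xm1) + 1.5 * (x0 - x1)
    # Horner evaluation, as in a Farrow structure
    return ((c3 * mu + c2) * mu + c1) * mu + c0
```

On a linear ramp this reproduces the input exactly (e.g., interpolating `0, 1, 2, ...` at index 3 + 0.25 gives 3.25), and it is exact for quadratics as well; the residual interpolation error on band-limited signals is what drives the jitter-vs-oversampling tradeoff Eric describes.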
Hi,
I know a scrambler is needed for timing recovery in a QAM demodulator. If
I want to implement a QAM demodulator in software, that is, with everything
done by programming after an A/D with a fixed clock, how does a scrambler
work? There is no training sequence available in a multipoint broadcast
system. If a carrier frequency offset exists, does this method still work
well? How can I find this kind of information? Any papers?

Thanks
Best Regards
Jeff