I am trying to set up an ADC to interface with my DSP and I was
wondering-- what is keeping me from "underclocking" the ADC? Its
specified sampling rate is 65 MSPS, but I want to clock it at 50 MSPS
so that I can drive it from the DSP, rather than setting up a separate
clocking circuit.
circuit. Not really a DSP question, but I thought a seasoned
engineer would have an informative response. My guess is that the
device should function just fine at 50 MSPS, but I'm a rookie, so if
anybody has any advice regarding this, please oblige me.
ADC clock rate
> I am trying to set up an ADC to interface with my DSP and I was
> wondering-- what is keeping me from "underclocking" the ADC?
The answer is most probably nothing. The ADC's datasheet (easily
found using Google) will give a definitive answer, but since we're
talking about 65 MSPS vs. 50 MSPS, I wouldn't bother.
> specified sampling rate is 65 MSPS, but I clock it at 50 MSPS, so I
> can clock it with the DSP unit, rather than setting up a clocking
Now, here we have a catch. The DSP gives a 50 MHz clock, but I'm not
sure you want to use it as a sample clock. The clock has high
jitter, which means that the exact timing of each clock edge is randomly
shifted. It's a tiny shift, but since it's random, it affects the
quality of the samples.
If you analyze the sampled signal, you may discover that the SNR
is much lower than that of the original analog signal. The behavior
can also be interpreted as if the sampled signal had high phase noise.
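To put a number on it: the usual rule of thumb for the jitter-limited SNR of a full-scale sine input is SNR = -20*log10(2*pi*f_in*t_j), where f_in is the analog input frequency (not the sample rate) and t_j is the RMS clock jitter. A quick sketch (the 10 MHz and 100 ps figures below are made-up example values, not data from your setup):

```python
import math

def jitter_snr_db(f_in_hz, t_jitter_rms_s):
    """Jitter-limited SNR (dB) for a full-scale sine input.

    Standard rule of thumb: SNR = -20*log10(2*pi*f_in*t_j).
    f_in_hz is the analog input frequency, t_jitter_rms_s the RMS jitter.
    """
    return -20.0 * math.log10(2.0 * math.pi * f_in_hz * t_jitter_rms_s)

# Example (assumed numbers): a 10 MHz tone sampled with 100 ps RMS jitter
print(f"{jitter_snr_db(10e6, 100e-12):.1f} dB")  # about 44 dB
```

Note that the sample rate doesn't appear in the formula at all -- what matters is how fast the input signal is moving when the sample instant slips.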
This may still be acceptable in many applications, so if you're not
very uptight about SNR, using the DSP's clock may be an adequate
solution.
BTW, this phenomenon is evident when using the DSP's clock for a
DAC. You get a lot of noise that wasn't in the samples. Same
mechanism.
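You can also see the effect without any hardware: sample an ideal sine at randomly jittered instants and compare against ideal sampling. A minimal simulation sketch (the tone frequency and 10 ps RMS jitter are assumed values for illustration only):

```python
import numpy as np

fs = 50e6         # sample rate, 50 MSPS as in the question
f_in = 10e6       # test tone (assumed value)
sigma_t = 10e-12  # assumed RMS clock jitter, 10 ps
n = 2**14

rng = np.random.default_rng(0)
t_ideal = np.arange(n) / fs
t_jittered = t_ideal + rng.normal(0.0, sigma_t, n)  # jittered sample instants

ideal = np.sin(2 * np.pi * f_in * t_ideal)
sampled = np.sin(2 * np.pi * f_in * t_jittered)

# Treat the difference as noise and compute SNR
noise = sampled - ideal
snr_db = 10 * np.log10(np.mean(ideal**2) / np.mean(noise**2))
theory_db = -20 * np.log10(2 * np.pi * f_in * sigma_t)
print(f"simulated SNR: {snr_db:.1f} dB, rule-of-thumb: {theory_db:.1f} dB")
```

The simulated figure should land close to the rule-of-thumb prediction, which is the whole point: the jitter, not the converter, sets the noise floor here.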
Good luck & skill,