DSPRelated.com
Forums

A/D Reference Levels

Started by Randy Yates August 31, 2004
An adjacent thread in which Steve asked some questions about A/D converter
level matching brought up an old "quandary" in my head that I thought I'd
throw out for the group to comment on. 

As I said in that post, and as most of us probably know, the system
SNR (or equivalently, the system noise figure) is degraded the most in
the early gain stages. For this reason I've always wondered why we
don't design A/D converters with very small reference levels so that
the analog signal can be captured as early in the analog chain as
possible, thus avoiding the degradations that occur with each analog
gain stage. Ideally, we'd like to convert at the source (for example,
right at the antenna! with, e.g., a reference level of just a few 10s
or 100s of microvolts). 

I realize this means that the converter's voltage reference would have to be
VERY clean, namely, have a noise level at least 6N dB below the
reference level in the Nyquist bandwidth. But isn't this feasible with
careful circuit design?
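The 6N dB figure can be sanity-checked with a quick sketch, assuming
the ideal-quantizer rule of thumb SNR = 6.02N + 1.76 dB (the helper
name and the 100 microvolt reference are illustrative, not from any
real part):

```python
import math

def required_ref_noise(v_ref, n_bits):
    """RMS noise (volts) the reference must stay below so that reference
    noise does not dominate ideal N-bit quantization noise."""
    snr_db = 6.02 * n_bits + 1.76      # ideal-quantizer SNR rule of thumb
    return v_ref * 10 ** (-snr_db / 20)

# A 16-bit converter with a 100 microvolt full-scale reference:
noise = required_ref_noise(100e-6, 16)
print(f"reference noise must stay below {noise * 1e9:.2f} nV RMS")
```

For a 100 uV reference that works out to roughly a nanovolt of
allowable reference noise, which hints at why nobody builds it.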

Soooo, why isn't this done?
-- 
%  Randy Yates                  % "Maybe one day I'll feel her cold embrace,
%% Fuquay-Varina, NC            %                    and kiss her interface, 
%%% 919-577-9882                %            til then, I'll leave her alone."
%%%% <yates@ieee.org>           %        'Yours Truly, 2095', *Time*, ELO   
http://home.earthlink.net/~yatescr
Randy Yates wrote:
> [...]
> Soooo, why isn't this done?

Because a good part of the noise comes from the comparator, and that's
more or less fixed. In fact your average 100 ksps 16-bit ADC will have
more than 1 LSB of noise; the only way to get 16-bit accuracy is by
oversampling and averaging.

-- 
Tim Wescott
Wescott Design Services
http://www.wescottdesign.com
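Tim's oversample-and-average point can be sketched numerically. This
toy model assumes uncorrelated Gaussian converter noise of a few LSBs
(the 3-LSB figure is a made-up illustrative value), so averaging M
conversions should cut it by roughly sqrt(M):

```python
import random
import statistics

def quantize(x, lsb):
    """Ideal uniform quantizer."""
    return round(x / lsb) * lsb

random.seed(0)
lsb = 1 / 2 ** 16
true_value = 0.3217
noise_rms = 3 * lsb          # assumed converter noise: a few LSBs

# Single conversions vs. averages of M oversampled conversions:
single = [quantize(true_value + random.gauss(0, noise_rms), lsb)
          for _ in range(2000)]
M = 64
averaged = [statistics.mean(quantize(true_value + random.gauss(0, noise_rms), lsb)
                            for _ in range(M))
            for _ in range(2000)]

err_single = statistics.pstdev(s - true_value for s in single)
err_avg = statistics.pstdev(a - true_value for a in averaged)
print(f"single-sample error: {err_single / lsb:.2f} LSB")
print(f"{M}x averaged error:  {err_avg / lsb:.2f} LSB")
```

With M = 64 the residual error drops by close to a factor of 8,
recovering the "missing" bits at the cost of sample rate.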
Randy Yates wrote:

> [...]
> Soooo, why isn't this done?

A simple amp can have a very good noise figure. In many cases its use
need not add more than 1 dB beyond the inherent kTB noise, which no
solution can avoid. Also, an amp can be separate from the converter,
where it can be cosseted in a very electrically clean environment and
use noise-optimised silicon processes. Most other circuits do not have
such impressive noise figures. Therefore, moving the signal far away
from kTB with a simple amp is generally the best first step.

Regards,
Steve
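Steve's point can be put in numbers with the thermal noise floor and
the Friis cascade formula. The 20 dB noise figure assumed for the
converter stage below is an illustrative guess, not a measured part:

```python
import math

def ktb_dbm(bandwidth_hz, temp_k=290.0):
    """Thermal noise floor kTB, in dBm."""
    k = 1.380649e-23                        # Boltzmann constant, J/K
    return 10 * math.log10(k * temp_k * bandwidth_hz / 1e-3)

def cascade_nf_db(stages):
    """Friis formula. stages = [(noise_figure_dB, gain_dB), ...]."""
    f_total = 0.0
    g_running = 1.0
    for nf_db, gain_db in stages:
        f = 10 ** (nf_db / 10)
        if f_total == 0.0:
            f_total = f                     # first stage sets the baseline
        else:
            f_total += (f - 1) / g_running  # later stages divided by gain so far
        g_running *= 10 ** (gain_db / 10)
    return 10 * math.log10(f_total)

print(f"kTB in 1 MHz: {ktb_dbm(1e6):.1f} dBm")   # about -114 dBm
# A quiet 1 dB NF amp with 30 dB gain ahead of a noisy (assume 20 dB NF)
# converter keeps the cascade noise figure near the amp's own:
print(f"cascade NF: {cascade_nf_db([(1, 30), (20, 0)]):.2f} dB")
```

With 30 dB of quiet gain up front, the cascade noise figure stays
around 1.3 dB even though the converter behind it is 20 dB noisy,
which is exactly the "move the signal far away from kTB" argument.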
Tim Wescott <tim@wescottnospamdesign.com> writes:

> Randy Yates wrote:
>> [...]
>> Soooo, why isn't this done?
>
> Because a good part of the noise comes from the comparator, and that's
> more or less fixed.
Are you talking about a delta sigma design? Otherwise, I'm not sure what comparator you're talking about.
> In fact your average 100ksps 16-bit ADC converter will have more than
> 1LSB of noise;
You mean in the reference?
> the only way to get 16-bit accuracy is by oversampling
> and averaging.
Thanks for responding, Tim.
-- 
%  Randy Yates                  % "...the answer lies within your soul
%% Fuquay-Varina, NC            %  'cause no one knows which side
%%% 919-577-9882                %  the coin will fall."
%%%% <yates@ieee.org>           %  'Big Wheels', *Out of the Blue*, ELO
http://home.earthlink.net/~yatescr
Steve Underwood <steveu@dis.org> writes:

> Randy Yates wrote:
>> [...]
>> Soooo, why isn't this done?
>
> A simple amp can have a very good noise figure. In many cases its use
> need not add more than 1dB beyond the inherent kTB noise, which no
> solution can avoid. Also, an amp can be separate from the converter,
> where it can be cosseted in a very electrically clean environment and
> use noise optimised silicon processes. Most other circuits do not have
> such impressive noise figures. Therefore, moving the signal far away
> from kTB with a simple amp is generally the best first step.
So you're saying that an amplifier can be designed which has a better
noise figure than an A/D with a clean reference?
-- 
%  Randy Yates                  % "Remember the good old 1980's, when
%% Fuquay-Varina, NC            %  things were so uncomplicated?"
%%% 919-577-9882                %  'Ticket To The Moon'
%%%% <yates@ieee.org>           %  *Time*, Electric Light Orchestra
http://home.earthlink.net/~yatescr
Randy Yates wrote:

   ...

> So you're saying that an amplifier can be designed which has a better
> noise figure than an A/D with a clean reference?
That's probably true, but it's not the only point. If a transducer has
a digital output, then it doesn't matter to those who use it what
internal design decisions are made. If it has an analog output, then
it's up to you or me to bring that output cleanly to the converter.
Which would you rather move: a 10 V signal, or 10 mV?

If there's a chance that the transducer's and converter's grounds are
at different potentials, an instrumentation amplifier is needed
anyway*. Its output noise rises less than linearly with gain. If it
can provide 10 V P-P and your converter accepts only 3, send at the
high voltage and attenuate at the converter.

Jerry
__________________________________
* Heavy ground straps rarely work. The ground loops they make actually
increase the problem.
-- 
Engineering is the art of making what you want from things you can get.
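Jerry's 10 V versus 10 mV question can be put in numbers. This sketch
assumes a fixed amount of interference picked up on the cable run
(the 1 mV pickup figure is made up for illustration); attenuating at
the converter scales signal and pickup down together, so the ratio
established on the cable survives:

```python
import math

def snr_db(signal_v, pickup_v):
    """Signal-to-interference ratio on the cable, in dB."""
    return 20 * math.log10(signal_v / pickup_v)

pickup = 1e-3  # assumed: 1 mV of interference coupled onto the cable

# Send at 10 V, attenuate to the converter's 3 V range at the far end:
print(f"10 V on the cable:  {snr_db(10.0, pickup):.0f} dB")   # 80 dB
# Send at transducer level instead:
print(f"10 mV on the cable: {snr_db(10e-3, pickup):.0f} dB")  # 20 dB
```

The 60 dB difference is why the high-level run wins whenever the
source can drive it.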
"Randy Yates" <yates@ieee.org> wrote in message news:n00as3xf.fsf@ieee.org...
> Steve Underwood <steveu@dis.org> writes:
>> [...]
>> Therefore, moving the signal far away from kTB with a simple amp is
>> generally the best first step.
>
> So you're saying that an amplifier can be designed which has a better
> noise figure than an A/D with a clean reference?
That certainly makes sense to me. I'm no expert, but it seems a purely linear device can be made with better noise performance than one that has a whole pile of high-speed digital switching going on. In the audio world, good clean gain stages (op amps, microphone preamps) add very little noise compared to the ADCs and DACs.
Jerry Avins wrote:
> * Heavy ground straps rarely work. The ground loops they make
> actually increase the problem.
So doesn't one use single-point grounds in instrumentation? And to
convince any doubters that ground loops can be impressive: in my
college days, I worked at the student radio station, WVBR. Signal,
power, and safety grounds in the studio control room were common due
to past *POOR* decisions. We had a "HUM" problem ;{

The engineering staff balked at properly rebuilding the grounds as a
star and opted for brute-force 1" copper braid between cabinets (this
was the vacuum tube era). The lead engineer wished to impress on
fellow engineers why grounds are *IMPORTANT*. He disconnected one
strap and brushed it against where it had been connected, DRAWING A
ONE-QUARTER-INCH-LONG ARC!
"Randy Yates" <yates@ieee.org> wrote in message
news:4qmiwxij.fsf@ieee.org...
> [...]
> Soooo, why isn't this done?
Hello Randy,

With receivers, it is not unusual to have to work with signals whose
power levels range over 1,000,000 to 1. So instead of using a
"megabit" converter where most of the bits are trying to handle the
overall signal level, it is better to use 8 to 12 bits and let a
variable-gain amp make up the difference. Plus, current converters
trade the number of bits per sample against their sampling rate. This
is all a matter of scaling the signal before sampling.

Besides scaling the signal with an AGC, proper prefiltering also
removes some of the loading from the A/D. Imagine a receiver with a
very wide IF that is looking at a narrowband signal. In this case the
A/D is partially wasted sampling noise and interference outside the
band of the signal. And prefilters need a significant signal level to
function.

Since most of a receiver's noise comes from the first stage or two, a
proper choice of quiet amps with distributed gains takes care of most
of the problem.

I had a recent project where I was decoding 9 kHz wide signals in the
L band. One doesn't easily get IF filters that narrow at the needed
operating frequencies, so I used a 50 kHz wide IF and let the DSP
handle the last little bit of filtering. Also, letting the IF be wider
than the signal band allows one to obtain filters that have a flat
group delay in the passband, and receiver frequency errors can be
tolerated; the DSP handles the frequency offset.

While the idea of an A/D directly connected to the antenna sounds
neat, it turns out to have a lot of practical problems.

IHTH,
Clay S. Turner
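Clay's 1,000,000:1 power range works out to 60 dB, and at the usual
rule of thumb of about 6 dB per bit that is roughly 10 bits spent on
range alone, before any in-band SNR is budgeted. A minimal sketch
(the helper name is illustrative):

```python
import math

def bits_for_dynamic_range(power_ratio):
    """Bits needed just to span a power ratio, at ~6.02 dB per bit."""
    dr_db = 10 * math.log10(power_ratio)
    return math.ceil(dr_db / 6.02)

# Receiver signals spanning 1,000,000:1 in power (60 dB):
print(bits_for_dynamic_range(1e6))   # 10 bits just for the range

# If a variable-gain amp absorbs, say, 40 dB of that range, the
# converter only has to cover the remaining 20 dB plus the desired
# in-band SNR -- which is why 8 to 12 bits plus AGC is the usual split.
print(bits_for_dynamic_range(10 ** (20 / 10)))   # 4 bits for 20 dB
```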
Randy Yates wrote:
> Tim Wescott <tim@wescottnospamdesign.com> writes:
>
>> Because a good part of the noise comes from the comparator, and
>> that's more or less fixed.
>
> Are you talking about a delta sigma design? Otherwise, I'm not sure
> what comparator you're talking about.
>
>> In fact your average 100ksps 16-bit ADC converter will have more
>> than 1LSB of noise;
>
> You mean in the reference?
All ADCs contain at least one comparator -- that's the nub of how you
get from analog to digital. To make the ADC fast, the comparator has
to be a high-bandwidth amplifier, which means that it will be noisy
even with a perfectly quiet reference.

I misspoke in my comment about 16-bit accuracy, though: by slowing
down the comparisons and/or using a comparator on a separate chip you
can get the comparator quiet enough to make good measurements.

-- 
Tim Wescott
Wescott Design Services
http://www.wescottdesign.com