AD7687 and ADSP2186

Started by Pierre de Vos February 6, 2006
Hi all

I have a board with 6 daisy-chained AD7687 PulSAR ADCs.  These ADCs are 
connected via an FPGA to an ADSP2186 running at 40MHz.  The signals I'm 
measuring are in the 45 to 65Hz range.  I employ a synchronous sampling 
scheme where the sampling rate is always a 2^N multiple of the incoming 
base frequency.  For example, at 50Hz the sample rate is 
50 x 128 x 8 = 51200Hz.
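
As a minimal C sketch of that arithmetic (the names CYCLE_POINTS and OSR 
are only illustrative, not from my firmware):

#include <stdio.h>

#define CYCLE_POINTS 128   /* decimated samples per input cycle */
#define OSR          8     /* ADC oversampling factor           */

int main(void)
{
    const double base_hz[] = { 45.0, 50.0, 60.0, 65.0 };
    for (int i = 0; i < 4; i++) {
        /* e.g. 50 * 128 * 8 = 51200 Hz at the ADCs */
        double fs = base_hz[i] * CYCLE_POINTS * OSR;
        printf("base %.0f Hz -> ADC sample rate %.0f Hz\n", base_hz[i], fs);
    }
    return 0;
}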

In the DSP I use autobuffering to buffer 48 samples (8 samples per channel 
for the 6 channels) between SPORT interrupts.  After every autobuffer 
interrupt I average the 8 samples per channel, which gives an effective 
sample rate of 50 x 128 = 6400Hz.
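
In C terms the per-interrupt averaging is roughly as follows (the real 
code lives in the ADSP-218x SPORT ISR; the channel-interleaved buffer 
layout and the function name here are assumptions):

#define NUM_CH 6
#define OSR    8

/* Reduce one 48-sample autobuffer (8 frames of 6 channels) to one
 * decimated sample per channel by boxcar averaging. */
void decimate_autobuffer(const short rx_buf[NUM_CH * OSR], short out[NUM_CH])
{
    for (int ch = 0; ch < NUM_CH; ch++) {
        long acc = 0;
        for (int k = 0; k < OSR; k++)
            acc += rx_buf[k * NUM_CH + ch];   /* one sample per channel per frame */
        out[ch] = (short)(acc / OSR);
    }
}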

The problem I'm having is that when I change the input signal's frequency 
to, say, 60Hz, which changes the sample rate to 61440Hz, the magnitude of 
the sampled signal changes by a small amount, roughly -60dB relative to 
the signal.  I verify this by calculating the RMS at both 50 and 60Hz and 
I get a difference of about 20 counts in 18000.  By this I mean that if, 
for example, the RMS value (in raw ADC counts) is 18000 at 60Hz, it is 
18020 at 50Hz.  The weirdest thing is that going back down to 45Hz, the 
RMS value is again approximately the same as at 60Hz.
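
To put a number on that difference, the check amounts to something like 
this sketch in C (the two RMS values are the example counts above, and 
the rms() helper is only illustrative):

#include <math.h>
#include <stdio.h>

#define N 128   /* one full cycle at 128 decimated samples per cycle */

/* RMS over an integer number of cycles. */
static double rms(const double *x, int n)
{
    double sum_sq = 0.0;
    for (int i = 0; i < n; i++)
        sum_sq += x[i] * x[i];
    return sqrt(sum_sq / n);
}

int main(void)
{
    double x[N];
    for (int i = 0; i < N; i++)                   /* one cycle of a unit sine */
        x[i] = sin(2.0 * 3.14159265358979 * i / N);
    printf("unit sine RMS = %.4f\n", rms(x, N));  /* about 0.7071             */

    /* The observed discrepancy, expressed relative to the signal:            */
    double rms_50 = 18020.0, rms_60 = 18000.0;    /* raw ADC counts           */
    printf("delta = %.1f dB\n", 20.0 * log10((rms_50 - rms_60) / rms_60));
    return 0;
}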

I've used two precision AC calibrators (one being a Fluke 6100 power 
calibrator) to inject signals into the board - both exhibit the same 
phenomenon.

I've tried oversampling factors of both 4 and 8 (8 in the example above), 
with the same result.

If I keep the sample rate fixed at, say, 61440Hz and vary the input signal 
frequency from 45 to 65Hz, the sampled signal magnitude stays the same.

I only have a first-order RC anti-aliasing filter (100R and 100nF), which 
puts the -3dB point at about 16kHz.
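
For reference, a quick C sketch of what that single pole does at the two 
sample rates mentioned above (just the textbook 1/(2*pi*R*C) cutoff and 
single-pole magnitude response):

#include <math.h>
#include <stdio.h>

int main(void)
{
    const double PI = 3.14159265358979;
    double R = 100.0, C = 100e-9;                 /* 100R, 100nF            */
    double fc = 1.0 / (2.0 * PI * R * C);         /* about 15.9kHz          */
    double f[] = { 51200.0, 61440.0 };            /* the two sample rates   */

    printf("-3dB point: %.0f Hz\n", fc);
    for (int i = 0; i < 2; i++) {
        /* single-pole magnitude response */
        double mag = 1.0 / sqrt(1.0 + (f[i] / fc) * (f[i] / fc));
        printf("attenuation at %.0f Hz: %.1f dB\n", f[i], 20.0 * log10(mag));
    }
    return 0;
}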

Could this still be caused by aliasing?  I've already bypassed all the input 
amplifiers and injected directly into the ADCs with a very low distortion AC 
calibrator.

Any ideas?

Regards
Pierre