Hi all,
I am using a high-speed ADC from TI in interleaved mode.
http://www.ti.com/lit/ds/symlink/adc12d800rf.pdf
When I feed a sine wave from the generator with a signal level of 10 dBm and a frequency of 1.4 MHz and sample it at 1.4 GHz, I get peaks at around 4.6 dBm. Shouldn't they be at 7 dBm (a real sine wave splits its power between two peaks, so each sits 3 dB below the total)?
This is the best case. I normally use this ADC for subsampling applications, where my signal level is even lower. Does anyone know an explanation for this?
Moreover, I have a huge issue with interleaving spurs at the fs/2 - fin frequency. I tried all the solutions TI offers, such as DES Time Adjust, I and Q Full Scale Range Adjust, and Calibration. The DES Time Adjust setting helps a lot when the input signal level is high, but for lower signals (like -10 dBm) it does not help, or it acts quite unpredictably - sometimes it gives some improvement and sometimes none at all.
Does this behavior make sense?
My goal is to increase my SFDR, but now I am not sure whether the cause is that the spur is too high or that my signal is too low.
Is the input signal power somehow spread across harmonics?
Do you know any other good way in which I could reduce these spurs?
I really can't help much, but this is a cool problem to look at. You are 2.4 dB down from where you expect to be, which is close to a 3 dB factor - that is, sqrt(2) in voltage, or a factor of 2 in power. You have 2 channels alternating (anyway, that's what I think DES is doing), which means the mux is doing something to the input signal to get it to the internal ADCs. So maybe it has something to do with how the internal switching splits the signal between the two channels.
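Just to put numbers on those dB factors (plain arithmetic, nothing ADC-specific):

```python
import math

def db(power_ratio):
    """Power ratio expressed in dB."""
    return 10 * math.log10(power_ratio)

print(db(2))                          # factor of 2 in power    -> ~3.01 dB
print(db(math.sqrt(2)))               # sqrt(2) in power        -> ~1.51 dB
print(20 * math.log10(math.sqrt(2)))  # sqrt(2) in voltage is a factor of 2
                                      # in power                -> ~3.01 dB
print(round(7 - 4.6, 1))              # the observed gap: expected 7 dBm,
                                      # measured 4.6 dBm        -> 2.4 dB
```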
The spur at fs/2 - fin sounds like an aliasing problem. I'd first try a filter between the generator and the ADC - but even that might make things worse. That sample rate is really high (for what I do, anyway!) and you might need to match the generator to the ADC so there are absolutely no reflections at the 1.4 GHz clock frequency.
That's some nice toys you have to play with there. I'm jealous :-) Good luck figuring it out!!
Dr. mike
Your setup is not entirely clear; however, if you set the signal generator to 10 dBm, that does NOT mean the ADC input is 10 dBm. There is signal generator error of less than 1 dB, plus cable loss of a couple of dB at most (depending on the cable). So you are off by a few dB, not by the full 5.4 dB. To be exact, you would need to calibrate out the signal generator error and cable loss, so you can deduce the exact level at the ADC input.
fs/2 - fin is the image signal of fin.
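As a quick illustration, a two-way interleave with a small channel mismatch produces exactly that image. The 1 % gain error and 2 ps skew below are made-up values for the sketch, not numbers from the ADC12D800RF:

```python
import numpy as np

fs = 1.4e9                 # combined sample rate
N = 4096
fin = 301 * fs / N         # coherent input tone, ~102.9 MHz
n = np.arange(N)

x = np.sin(2 * np.pi * fin * n / fs)

# odd samples come from the second sub-ADC: assumed 1 % gain error
# and 2 ps aperture skew (illustrative values only)
eps, dt = 0.01, 2e-12
x[1::2] = (1 + eps) * np.sin(2 * np.pi * fin * (n[1::2] / fs + dt))

X = np.abs(np.fft.rfft(x))
k_fund = int(round(fin * N / fs))   # FFT bin of fin
k_img = N // 2 - k_fund             # FFT bin of fs/2 - fin
print("image at fs/2 - fin: %.1f dBc" % (20 * np.log10(X[k_img] / X[k_fund])))
```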
You can change the power level at the signal generator by x dB, measure the fs/2 - fin spur, and compare it with the original measurement. For instance, if you change the signal generator output from 10 dBm to 5 dBm, what happens to the level of the image signal? Does it change dB for dB? That would tell you more about the mechanism and whether you are seeing what you expect to see.
Also, change the input frequency and repeat the above measurement to be sure that you are seeing what you expect to see. There could be multiple undesired signals at the very same frequency, which would cause instability in the measurement - rapid changes of its level.
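A toy two-way interleave model (with an assumed 1 % gain mismatch, not a real hardware number) shows what the dB-per-dB behavior looks like:

```python
import numpy as np

fs, N = 1.4e9, 4096
fin = 301 * fs / N                  # coherent input tone
n = np.arange(N)
eps = 0.01                         # assumed gain mismatch on the odd channel

def spectrum_db(a_in):
    """FFT magnitude in dB (re: a full-scale sine) for input amplitude a_in."""
    x = a_in * np.sin(2 * np.pi * fin * n / fs)
    x[1::2] *= 1 + eps             # odd samples: the mismatched sub-ADC
    X = np.abs(np.fft.rfft(x))
    return 20 * np.log10(X / (N / 2))   # a full-scale sine reads 0 dB

k_img = N // 2 - int(round(fin * N / fs))   # FFT bin of fs/2 - fin

hi = spectrum_db(1.0)[k_img]
lo = spectrum_db(10 ** (-5 / 20))[k_img]    # input lowered by 5 dB
print(hi, lo, hi - lo)   # the image drops by ~5 dB as well: dB for dB
```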
Input signal power does get spread across harmonics; however, the harmonics are probably well below the input signal - at, say, -30 dBc, a harmonic carries 1/1000 of the input power - so for practical purposes you still get essentially everything you put into the ADC at the fundamental frequency.
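For the arithmetic, -30 dBc converts to a power ratio like this:

```python
# -30 dBc as a power ratio: 10^(-30/10) = 1/1000
ratio = 10 ** (-30 / 10)
print(ratio)   # 0.001 - such a harmonic carries 1/1000 of the carrier's power
```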
Once you do the above measurements, you will get a better picture of the spur behavior, which should tell you more about how to mitigate it.
Good Luck,
Shahram
ortenga.com