As long as we are on the subject of software PLLs, I have a question.... A software PLL is based on an NCO, and an NCO, unlike a VCO, has a minimum step size, so it can only achieve a finite set of discrete frequencies, i.e. the output frequency is quantized. Now if the input to the PLL is an arbitrary frequency, the NCO will not be able to lock exactly to the correct frequency but only to the nearest step. (I know the resolution steps can be very small, well under 1 Hz, but this is more of a theoretical question than a practical one. For the sake of discussion, let's make it easy and assume the step size is 1 Hz.)

So I assume the NCO will toggle between the two steps just above and just below the exact frequency. This will create undesired jitter or phase noise in the output, i.e. unwanted FM with a deviation of ~+/-0.5 Hz. I also assume the frequency (speed) of this toggling will be a function of the loop bandwidth, but that the magnitude of the frequency deviation is fixed by the NCO step size (resolution).

This seems to me to be analogous to any quantized system, so it seems the issue can be addressed the same way quantization is usually addressed: with dither. If random phase noise is present or added, the system will average out to the exact frequency, but it still seems to me the NCO frequency must jump back and forth +/-0.5 Hz (in this example).

It also seems that if you try to address this by adding bits to the NCO, improving its resolution and making the step size smaller, you will decrease the deviation of the FM but increase the rate of the toggling, which may or may not be better. (If you had a coarse step size of 10 Hz, the toggling might be once per second, but with a finer step size of say 0.1 Hz, the toggling would speed up to 100 Hz.)

So the questions... Are these observations correct, and why is it not a problem in practice?

thanks
Mark
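To make the averaging idea concrete, here is a minimal sketch (my own illustration, not any particular SPLL implementation) of a phase-accumulator NCO whose tuning word is quantized to whole steps. A first-order sigma-delta-style error accumulator toggles between the two adjacent steps, so the long-run average output frequency equals the unquantized target even though the instantaneous frequency never leaves the two quantized values. The function name and parameters are invented for this example.

```python
def nco_avg_freq(f_target_hz, step_hz, n_samples, fs=48000.0):
    """Run an NCO whose tuning word is quantized to multiples of step_hz.

    A running error accumulator decides, sample by sample, whether to use
    the step just below or just above f_target_hz (the classic dither /
    noise-shaping idea). Returns the average output frequency in Hz.
    """
    base = (f_target_hz // step_hz) * step_hz  # quantized step just below target
    frac = (f_target_hz - base) / step_hz      # fractional remainder, 0..1
    acc = 0.0    # sigma-delta error accumulator
    phase = 0.0  # NCO phase, in cycles
    for _ in range(n_samples):
        acc += frac
        if acc >= 1.0:            # use the upper step this sample
            f = base + step_hz
            acc -= 1.0
        else:                     # use the lower step
            f = base
        phase += f / fs           # advance phase by f/fs cycles per sample
    return phase * fs / n_samples # average frequency over the whole run

avg = nco_avg_freq(f_target_hz=1000.3, step_hz=1.0, n_samples=480000)
# avg converges to ~1000.3 Hz even though each individual sample runs at
# either 1000.0 or 1001.0 Hz; the instantaneous error stays within one step.
```

With a 1 Hz step and a 1000.3 Hz target, the upper step is selected 30% of the time, which is exactly the toggling behavior described above: the deviation is bounded by the step size, while the toggle pattern sets how that quantization energy is spread in frequency.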
Re: Software PLL (SPLL)
Started by ●February 8, 2006