Hi,

This is a small follow-up to "Model for jitter on oscillator for downconversion", but since I think I am on the right track now, I have a specific question here.

Suppose I have the following stochastic process: cos(w t + phi), where phi ~ N(0, sigma^2). I want to calculate the expected "normalized mean squared" error over one period:

e = (\int_0^T (cos(w t) - cos(w t + phi))^2 dt) / (\int_0^T cos^2(w t) dt)

I have two options now:

1.) Fix "t", calculate the expected value over phi for that specific "t", and then evaluate the integral over time.
2.) Evaluate the symbolic integral first and then take the expected value over phi.

I would think both options should give me the same result, but they do not. Did I forget something?

Thank you,
Peter

For reference, my solution for 1.) at one fixed "t" is:

NMSE(t) = 1 - 2 e^(-sigma^2/2) + e^(-2 sigma^2) + (1 - e^(-2 sigma^2)) / (2 cos^2(w t))

So the error indeed depends heavily on "t" (empirically it is largest at the peaks of the cosine). To average the error over time, I take t to be uniformly distributed over [0, T] and calculate another expected value, E{NMSE(t)}. Fortunately, the expected value of the last term results in zero, giving the net result:

NMSE1 = 1 - 2 e^(-sigma^2/2) + e^(-2 sigma^2)

The solution for 2.) is even simpler. Holding phi fixed for now, the expression for "e" above is readily evaluated as:

NMSE(phi) = 2 - 2 cos(phi)

Now, since phi ~ N(0, sigma^2):

NMSE2 = E{NMSE(phi)} = 2 - 2 e^(-sigma^2/2)

As can be seen, NMSE1 and NMSE2 are not the same (although they are very similar when plotted on a log-log scale).
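As a quick sanity check on the two approaches, here is a Monte Carlo sketch (my own illustration, not from the post) that estimates E{e} directly from the definition above, i.e. drawing phi, forming the ratio of the two time integrals, and averaging over draws. All parameter values (sigma = 0.3, sample counts) are arbitrary choices for the check. This estimates the quantity that option 2.) computes in closed form:

```python
# Monte Carlo check (illustrative sketch): estimate E{e}, where e is the
# ratio of the two time integrals in the post, and compare against the
# option-2 closed form NMSE2 = 2 - 2 exp(-sigma^2 / 2).
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.3
w = 2 * np.pi                     # angular frequency, so the period is T = 1
T = 2 * np.pi / w
N = 512                           # time samples over one period
t = np.arange(N) / N * T          # uniform grid; for a periodic integrand,
                                  # mean(f) * T approximates \int_0^T f dt well
ref = np.cos(w * t)
denom = np.mean(ref**2) * T       # \int_0^T cos^2(w t) dt = T/2

phi = rng.normal(0.0, sigma, size=20_000)
err = ref - np.cos(w * t + phi[:, None])   # rows: phi draws, cols: time samples
num = np.mean(err**2, axis=1) * T          # numerator integral for each draw
nmse_mc = np.mean(num / denom)

nmse2 = 2 - 2 * np.exp(-sigma**2 / 2)
print(nmse_mc, nmse2)   # these should be close (statistical error ~1e-3)
```

Note that this simulates "expectation of the ratio of integrals", which is exactly the definition of e; averaging the pointwise-normalized NMSE(t) of option 1.) over t is a different operation, which is one place the two routes can part ways.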
Expected time average of a stochastic process
Started July 25, 2015