DSPRelated.com
Forums

A question on autocorrelation

Started by Jay April 23, 2004
Susheem wrote:

> No, Peter is correct. What we have to understand is that at each time
> instant in the random process, the value of the random process is
> uniformly distributed. However, for the next time instant, the value
> of the random process will be uniformly distributed and independent of
> the previous time instant. This process is a white random process
> with uniformly distributed samples.
Not according to the original description. Yes, this is exactly what the
OP meant (indeed, a white noise process), but it is *not* what he wrote.
So, I have to disagree with your "what we have to understand" -- if you
want to understand that, then you're not understanding what the OP wrote
in his first message.

Carlos

--
PS: But I know, discussing this is pointless now :-)
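A minimal NumPy sketch of the process Susheem is describing (the variable
names and sample count here are illustrative, not from the thread): the
value at each time instant is an independent draw from Uniform[0, 1].

    import numpy as np

    rng = np.random.default_rng(0)

    # White random process with uniformly distributed samples: the value at
    # each time instant is drawn from Uniform[0, 1], independently of every
    # other time instant.
    N = 100_000
    x = rng.uniform(0.0, 1.0, size=N)

    print(x.mean())  # close to 1/2, the mean of Uniform[0, 1]
    print(x.var())   # close to 1/12, the variance of Uniform[0, 1]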
innocent_802000@yahoo.com (Jay) writes:

> > Nope!
> >
> > E{X(n)X(n-m)} = E{Y.Y}  if m = 0
> >                 0       if m != 0
> >
> > This is because X(n) and X(n-m) are independent random variables.
>
> Thanks for your tip on independence.
>
> rxx(m) = E{X(n)X(n-m)}
>        = integral_0^1 integral_0^1 x y (joint pdf of x,y) dx dy
>        = (integral_0^1 x (pdf of x) dx) * (integral_0^1 y (pdf of y) dy)   [by independence]
>        = 1/2 * 1/2 = 1/4
>
> Matlab also gave me the same answer, 0.254, for m != 0.
Errk! You're correct. I usually assume zero-mean random variables, hence
the slip.

Ciao,

Peter K.

--
Peter J. Kootsookos

"I will ignore all ideas for new works [..], the invention of which has
reached its limits and for whose improvement I see no further hope."
- Julius Frontinus, c. AD 84
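For the record, a quick NumPy cross-check of the two conventions discussed
above (an illustrative sketch, not the Matlab run Jay mentions; the lag of
5 is just an arbitrary nonzero lag): the raw estimate of E{X(n)X(n-m)}
comes out near 1/4 at nonzero lags and 1/3 at lag 0, while the mean-removed
(zero-mean) version Peter had in mind comes out near 0 at nonzero lags.

    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.uniform(0.0, 1.0, size=100_000)  # white process, Uniform[0, 1] samples

    def acorr(v, m):
        # Sample estimate of E{V(n) V(n-m)}, no mean removal.
        return np.mean(v * v) if m == 0 else np.mean(v[m:] * v[:-m])

    # Raw autocorrelation, as in Jay's integral:
    print(acorr(x, 0))  # close to 1/3 = E{X^2} for Uniform[0, 1]
    print(acorr(x, 5))  # close to 1/4 = (1/2)*(1/2) for any m != 0

    # Zero-mean convention (Peter's assumption): subtract the mean first.
    y = x - 0.5
    print(acorr(y, 0))  # close to 1/12, the variance of Uniform[0, 1]
    print(acorr(y, 5))  # close to 0: independent and zero-mean => uncorrelated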
Carlos Moreno <moreno_at_mochima_dot_com@xx.xxx> writes:

> PS: But I know, discussing this is pointless now :-)
Yup.

Ciao,

Peter K.

--
Peter J. Kootsookos

"I will ignore all ideas for new works [..], the invention of which has
reached its limits and for whose improvement I see no further hope."
- Julius Frontinus, c. AD 84