DSPRelated.com

Simple Question on Random Variables and Random Processes

Started by Randy Yates February 14, 2006
Let X(t) be a stationary white noise process with autocorrelation
function R_{xx}(\tau) = N0 * \delta(\tau). Let V be a random variable
determined from X(t) by

  V = \int_{-\infty}^{+\infty} X(t) dt.

How do you compute statistics on V, like E[V^2]?

Yes, this is a homework problem.
-- 
%  Randy Yates                  % "Watching all the days go by...    
%% Fuquay-Varina, NC            %  Who are you and who am I?"
%%% 919-577-9882                % 'Mission (A World Record)', 
%%%% <yates@ieee.org>           % *A New World Record*, ELO
http://home.earthlink.net/~yatescr
Randy Yates <yates@ieee.org> writes:

> Let X(t) be a stationary white noise process with autocorrelation
> function R_{xx}(\tau) = N0 * \delta(\tau). Let V be a random variable
> determined from X(t) by
>
>   V = \int_{-\infty}^{+\infty} X(t) dt.
Sorry, wrong limits:

  V = \int_{0}^{T} X(t) dt.
Randy Yates <yates@ieee.org> writes:

> Randy Yates <yates@ieee.org> writes:
>
> > Let X(t) be a stationary white noise process with autocorrelation
> > function R_{xx}(\tau) = N0 * \delta(\tau). Let V be a random variable
> > determined from X(t) by
> >
> >   V = \int_{-\infty}^{+\infty} X(t) dt.
>
> Sorry, wrong limits:
>
>   V = \int_{0}^{T} X(t) dt.
Note that integration and expectation are interchangeable:

  E[V]   = E[\int_{0}^{T} X(t) dt]
         = \int_{0}^{T} E[X(t)] dt

  E[V^2] = E[\int_{0}^{T} X(t) dt \int_{0}^{T} X(s) ds]
         = \int_{0}^{T} \int_{0}^{T} E[X(t) X(s)] dt ds

You should then be able to use the fact that X is stationary, and the
autocorrelation expression, to reduce both further.

Does that help?

It's interesting that you're saying autocorrelation rather than
autocovariance...

Ciao,

Peter K.

-- 
"And he sees the vision splendid of the sunlit plains extended
 And at night the wondrous glory of the everlasting stars."
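[Editor's note: the interchange argument above can be checked numerically. Below is a minimal Monte Carlo sketch, not from the thread; the discretization of X(t) as i.i.d. Gaussian samples of variance N0/dt, and the values N0 = 2, T = 1, are illustrative assumptions.]

```python
import numpy as np

# Discretize white noise with R(tau) = N0*delta(tau) as i.i.d. Gaussian
# samples of variance N0/dt, so the Riemann sum V = sum(x)*dt approximates
# V = int_0^T X(t) dt.  (Assumed discretization, for illustration only.)
rng = np.random.default_rng(0)
N0, T, dt = 2.0, 1.0, 1e-3
n = int(T / dt)
trials = 5000

x = rng.normal(0.0, np.sqrt(N0 / dt), size=(trials, n))  # samples of X(t)
V = x.sum(axis=1) * dt                                   # one V per trial

print(np.mean(V))      # close to 0, matching E[V] = 0
print(np.mean(V**2))   # close to N0*T = 2, matching E[V^2]
```

Shrinking dt and raising the trial count pushes both sample averages toward their theoretical values.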
p.kootsookos@remove.ieee.org (Peter K.) writes:

> Randy Yates <yates@ieee.org> writes:
>
>> Randy Yates <yates@ieee.org> writes:
>>
>> > Let X(t) be a stationary white noise process with autocorrelation
>> > function R_{xx}(\tau) = N0 * \delta(\tau). Let V be a random variable
>> > determined from X(t) by
>> >
>> >   V = \int_{-\infty}^{+\infty} X(t) dt.
>>
>> Sorry, wrong limits:
>>
>>   V = \int_{0}^{T} X(t) dt.
>
> Note that integration and expectation are interchangeable:
>
>   E[V]   = E[\int_{0}^{T} X(t) dt]
>          = \int_{0}^{T} E[X(t)] dt
>
>   E[V^2] = E[\int_{0}^{T} X(t) dt \int_{0}^{T} X(s) ds]
>          = \int_{0}^{T} \int_{0}^{T} E[X(t) X(s)] dt ds
>
> You should then be able to use the fact that X is stationary, and the
> autocorrelation expression, to reduce both further.
>
> Does that help?
Yes, thanks much, Peter! I've been staring at it for hours!

I kept getting hung up on the idea that we can easily determine a
random variable that is a function of a finite number of random
variables, but this is a function of an uncountably infinite number of
random variables!

Then I got hung up on another viewpoint, in which I modeled this as
V = V(0), where V(t) is the convolution of a pulse with X(t), and then
used the theorems on the output autocorrelation of a linear system. But
that only gets you the autocorrelation, E[V(t) * V'(t+\tau)] (where "'"
denotes conjugation). What I needed was E[V(t) * V(t)].
> It's interesting that you're saying autocorrelation rather than
> autocovariance...
Why?
Randy Yates wrote:

> Then I got hung up on another viewpoint, in which I modeled this as
> V = V(0), where V(t) is the convolution of a pulse with X(t), and then
> used the theorems on the output autocorrelation of a linear system.
> But that only gets you the autocorrelation, E[V(t) * V'(t+\tau)]
> (where "'" denotes conjugation). What I needed was E[V(t) * V(t)].
I don't understand why conjugation is a problem here. The original statement seemed to imply that X(t) was a real process, not a complex-valued process, and thus conjugation should not matter. If so, setting \tau = 0 should give the desired answer. Can you reveal what you got for E[V(t) * V(t + \tau)]?
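[Editor's note: the linear-system viewpoint Randy describes does in fact recover the same number at \tau = 0. Taking the "pulse" to be a unit rectangle of width T (an assumption consistent with the corrected limits), the white-noise input gives the output autocorrelation

  E[V(t) V(t+\tau)] = N0 \int h(s) h(s+\tau) ds
                    = N0 * (T - |\tau|),   for |\tau| <= T,

so setting \tau = 0, as suggested above, yields E[V^2] = N0 * T.]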
dvsarwate@ieee.org writes:

> Randy Yates wrote:
>
>> Then I got hung up on another viewpoint, in which I modeled this as
>> V = V(0), where V(t) is the convolution of a pulse with X(t), and then
>> used the theorems on the output autocorrelation of a linear system.
>> But that only gets you the autocorrelation, E[V(t) * V'(t+\tau)]
>> (where "'" denotes conjugation). What I needed was E[V(t) * V(t)].
>
> I don't understand why conjugation is a problem here. The original
> statement seemed to imply that X(t) was a real process, not a
> complex-valued process, and thus conjugation should not matter.
> If so, setting \tau = 0 should give the desired answer.
Hi Dilip,

Yes, that would be true. I'm trying to decide if X really is real. All
I know is that it is white, stationary, and \phi_{xx}(\tau) = N_0
\delta(\tau). Can you tell from this?
> Can you reveal what you got for E[V(t) * V(t + \tau)]?
I got E[V^2] = T*N0, where the original \phi_{xx}(\tau) was N0 *
\delta(\tau).
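[Editor's note: the reduction from Peter's double integral to this value is one sifting-property step. Substituting E[X(t) X(s)] = N0 \delta(t - s):

  E[V^2] = \int_{0}^{T} \int_{0}^{T} E[X(t) X(s)] dt ds
         = \int_{0}^{T} \int_{0}^{T} N0 \delta(t - s) dt ds
         = \int_{0}^{T} N0 ds
         = N0 * T.]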
Randy Yates <yates@ieee.org> writes:

> dvsarwate@ieee.org writes:
>
>> Randy Yates wrote:
>>
>>> Then I got hung up on another viewpoint, in which I modeled this as
>>> V = V(0), where V(t) is the convolution of a pulse with X(t), and
>>> then used the theorems on the output autocorrelation of a linear
>>> system. But that only gets you the autocorrelation,
>>> E[V(t) * V'(t+\tau)] (where "'" denotes conjugation). What I needed
>>> was E[V(t) * V(t)].
>>
>> I don't understand why conjugation is a problem here. The original
>> statement seemed to imply that X(t) was a real process, not a
>> complex-valued process, and thus conjugation should not matter.
>> If so, setting \tau = 0 should give the desired answer.
>
> Hi Dilip,
>
> Yes, that would be true. I'm trying to decide if X really is real.
> All I know is that it is white, stationary, and \phi_{xx}(\tau) =
> N_0 \delta(\tau). Can you tell from this?
Dilip et al.,

I also know that it's zero-mean. This seems to be a nasty problem. It
is problem 4.3 b in Proakis, 4th edition.

First of all, they say z(t) = x(t) + j * y(t) is the (complex) lowpass
equivalent of a bandpass signal in part a). Then they say it's white in
part b), but in the text they state that white noise cannot be
expressed in terms of quadrature components (the x(t) and y(t) are
derived from bandlimited bandpass signals, which white noise is not).

Then I started thinking: is it possible for a complex signal to have a
real autocorrelation? The answer is "yes." (Note I'm dropping the
LaTeX-ese a bit for convenience.)

  Rzz(tau) = E[(x(t) + j*y(t)) * (x(t+tau) - j*y(t+tau))]
           = Rxx(tau) + Ryy(tau) + j * [Ryx(tau) - Rxy(tau)].

So what we require is that Ryx(tau) = Rxy(tau). One case in which that
is true is when x(t) = y(t), so that z(t) = x(t) * (1 + j).

So then, if we interpret z(t) (replace X(t) with z(t) in my original
post) as being complex, then a) we do get different results for E[V^2]
vs. E[V*V'], and b) there is no way to simplify the resulting equations
in x and y, since you cannot use the conditions for bandpass signals
which state, e.g., Rxx(tau) = Ryy(tau).

Any illumination would be appreciated.
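[Editor's note: the z(t) = x(t) * (1 + j) case is easy to sanity-check numerically. A quick sketch, with an assumed white Gaussian x: each product z(t)*conj(z(t+tau)) equals 2*x(t)*x(t+tau) because (1+j)(1-j) = 2, so the sample autocorrelation of z is purely real at every lag.]

```python
import numpy as np

# If z = x*(1 + 1j) with x real, then z(t)*conj(z(t+tau)) =
# 2*x(t)*x(t+tau), which is real -- so Rzz(tau) has no imaginary part.
rng = np.random.default_rng(1)
x = rng.normal(size=100_000)
z = x * (1 + 1j)

# sample autocorrelation of z at a few lags tau
for tau in range(3):
    n = len(z) - tau
    Rzz = np.mean(z[:n] * np.conj(z[tau:tau + n]))
    print(tau, Rzz.imag)   # imaginary part is 0 at every lag
```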
Randy Yates wrote:

> > It's interesting that you're saying autocorrelation rather than
> > autocovariance...
>
> Why?
Well, because if it was autocovariance, you wouldn't necessarily know
that the RV is zero-mean. Most definitions of autocovariance I've seen
remove the mean before calculating the sequence. Autocorrelation
definitions that I'm used to generally don't mean-correct.

At least that's what I was musing when I wrote the above! :-)

Ciao,

Peter K.
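[Editor's note: the distinction is easy to see numerically. A small sketch, with an assumed mean of 3 and unit variance chosen for illustration:]

```python
import numpy as np

# For a nonzero-mean sequence, the lag-0 autocorrelation picks up the
# squared mean, while the autocovariance subtracts the mean first.
rng = np.random.default_rng(3)
x = 3.0 + rng.normal(size=200_000)   # mean 3, variance 1

R0 = np.mean(x * x)                  # autocorrelation at lag 0: ~ 1 + 3**2 = 10
C0 = np.mean((x - x.mean())**2)      # autocovariance at lag 0:  ~ 1
print(R0, C0)
```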
"Peter K." <p.kootsookos@iolfree.ie> writes:

> Randy Yates wrote:
>
>> > It's interesting that you're saying autocorrelation rather than
>> > autocovariance...
>>
>> Why?
>
> Well, because if it was autocovariance, you wouldn't necessarily know
> that the RV is zero mean.
I agree. Did I write or state something that required a zero mean?
> Most definitions of autocovariance I've seen remove the mean before
> calculating the sequence. Autocorrelation definitions that I'm used
> to generally don't mean-correct.
Yes, that's the way I'm seeing them in Proakis, Papoulis, Leon-Garcia,
etc.
"Randy Yates" <yates@ieee.org> wrote in message 
news:m3slql45tc.fsf@ieee.org...
> dvsarwate@ieee.org writes:
>
>> Can you reveal what you got for E[V(t) * V(t + \tau)]?
>
> I got E[V^2] = T*N0, where the original \phi_{xx}(\tau) was N0 *
> \delta(\tau).
That is certainly the right answer (for \tau > 0). More generally, it is true that E[V(t) * V(t + \tau)] = N0*min(t, t+\tau). {V(t)} is called a Wiener process.
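[Editor's note: the closing formula can also be checked by simulation. A sketch under the same assumed discretization as before, with N0 = 1 and the time instants chosen for illustration:]

```python
import numpy as np

# V(t) = int_0^t X(s) ds, simulated as a cumulative sum of discretized
# white noise; its covariance should follow E[V(t)V(s)] = N0*min(t, s),
# the Wiener-process covariance.
rng = np.random.default_rng(2)
N0, dt, n = 1.0, 1e-2, 200
trials = 50_000

x = rng.normal(0.0, np.sqrt(N0 / dt), size=(trials, n))
V = np.cumsum(x, axis=1) * dt            # V(k*dt) along each row

t, s = 0.5, 1.5                          # two time instants (t < s)
cov = np.mean(V[:, int(t / dt) - 1] * V[:, int(s / dt) - 1])
print(cov)                               # close to N0*min(t, s) = 0.5
```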