On Wed, 15 Feb 2006 14:35:28 -0800, "Fred Marshall" <fmarshallx@remove_the_x.acm.org> wrote:

>"Randy Yates" <yates@ieee.org> wrote in message
>news:m3bqx82t1e.fsf@ieee.org...
>> Just Cocky <just@cocky.com> writes:
>>> [...]
>>> Huh? Mean of a sample? What's the point of speaking of the mean of one
>>> known value?
>>
>> I think he meant a section of time from a realization of a random process.
>>
>> I've seen the term "sample waveform" used in Proakis and I don't like it
>> either.
>
>Right and point taken. I suppose it leaks over from statistics, where a
>"sample" is what we'd here call a "sequence" or "vector" or ... a set of
>samples taken over a temporal epoch.

Ok, got it. For some reason, "sample" has a different feel when thinking about digital electronics...
Is White Noise Necessarily Zero-Mean?
Started by ●February 15, 2006
Reply by ●February 16, 2006
Reply by ●February 16, 2006
Just Cocky wrote:
> On Wed, 15 Feb 2006 12:41:58 -0800, "Fred Marshall"
> <fmarshallx@remove_the_x.acm.org> wrote:
>
>>"Jani Huhtanen" <jani.huhtanen@kolumbus.fi> wrote in message
>>news:dsvpsq$l2f$1@phys-news4.kolumbus.fi...
>>
>>>Randy Yates wrote:
>>>
>>>>Let Z(t) be a white-noise (stationary) random process.
>>>>Can we conclude that Z(t) has zero mean?
>>>>
>>>>My thought is: yes. The intuitive reason that comes to
>>>>mind (and it may be wrong!) is this: If Z(t) has a
>>>>non-zero mean, then there would be some amount of
>>>>correlation between samples due to the means. Thus
>>>>the autocorrelation would not be a delta function.
>>>
>>>Yes. White noise is necessarily zero-mean. This follows from the definition
>>>of white (i.e. the covariance matrix is the identity matrix).
>>>
>>>Let x be a random vector. The covariance matrix of a random vector is
>>>Cx = E{x*x'}. The mean of the random vector is m = E{x}. Let y be a
>>>zero-mean white random vector so that x = y + m; then
>>>
>>>E{x*x'} = E{(y+m)*(y'+m')} = E{y*y'} + E{y*m'} + E{m*y'} + E{m*m'}
>>>        = I + E{y}*m' + m*E{y'} + m*m' = I + m*m' != I
>>>=> the random vector is not white if the mean is not zero.
>>>
>>
>>Just be careful to note that a *sample* of such white noise doesn't
>>necessarily have zero mean.
>>
>
> Huh? Mean of a sample? What's the point of speaking of the mean of one
> known value?

Sample in the statistical sense. Read "subset".

Jerry
--
Engineering is the art of making what you want from things you can get.
Reply by ●February 16, 2006
Fred Marshall wrote:
>
> "Jani Huhtanen" <jani.huhtanen@kolumbus.fi> wrote in message
> news:dsvpsq$l2f$1@phys-news4.kolumbus.fi...
>> Randy Yates wrote:
>>
>>> Let Z(t) be a white-noise (stationary) random process.
>>> Can we conclude that Z(t) has zero mean?
>>> [...]
>>
>> Yes. White noise is necessarily zero-mean. This follows from the
>> definition of white (i.e. the covariance matrix is the identity matrix).
>> [...]
>>
>> --
>> Jani Huhtanen
>> Tampere University of Technology, Pori
>
> Just be careful to note that a *sample* of such white noise doesn't
> necessarily have zero mean.
>
> Start with a white noise generator (with zero mean implied / assured).
> Grab a sample over some finite temporal epoch such that the mean *is*
> zero. It should be easy enough to do this by generating the noise and
> computing the mean from the beginning, for some time, and then continuing
> to compute the mean of the entire record until the mean is zero - then
> stop grabbing data.
> Now, cut the data into two equal halves in time.
> There is some large probability that the two halves will not each have
> zero mean. Of course the sum of the means of the two halves will be zero
> because that's what you constructed in the grab.
>
> Fred

There is a difference between mean and average. If you calculate something from the data then you're just estimating some parameter; in this case the average is used as an estimate for the mean.

Secondly, if you generate white noise (x) and calculate the average (avg_n) repeatedly after every sample generated, you should notice that the chance of the average being larger than 0 is equal to that of it being smaller than 0 (i.e. P(avg_n > 0) = 0.5). (avg_n = (1/n) sum_{i=1}^n x[i], where x is a sequence of random samples.) This follows from the central limit theorem.

(On second thought, this was probably just what you were trying to say.)

--
Jani Huhtanen
Tampere University of Technology, Pori
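[Editor's note: Jani's P(avg_n > 0) = 0.5 claim is easy to poke at numerically. Below is a minimal Python sketch; the trial count, sample count, and seed are arbitrary choices, not anything from the thread.]

```python
import random

random.seed(42)

n_trials = 2000    # independent noise realizations
n_samples = 500    # length of each realization

count_positive = 0
for _ in range(n_trials):
    total = 0.0
    for _ in range(n_samples):
        total += random.gauss(0.0, 1.0)  # zero-mean white Gaussian sample
    avg = total / n_samples              # the average avg_n at n = n_samples
    if avg > 0:
        count_positive += 1

# The average of zero-mean, symmetrically distributed samples is itself
# zero-mean and symmetric, so P(avg_n > 0) should come out near 0.5.
p_hat = count_positive / n_trials
print(p_hat)
```

Note that the symmetry of the Gaussian alone gives this result; the central limit theorem isn't actually needed for the sign probability.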
Reply by ●February 16, 2006
Jani Huhtanen wrote:
> Fred Marshall wrote:
>
>> "Jani Huhtanen" <jani.huhtanen@kolumbus.fi> wrote in message
>> news:dsvpsq$l2f$1@phys-news4.kolumbus.fi...
>>> Randy Yates wrote:
>>> [...]
>>
>> Just be careful to note that a *sample* of such white noise doesn't
>> necessarily have zero mean.
>> [...]
>>
>> Fred

/*snip*/

> Secondly, if you generate white noise (x) and calculate the average
> (avg_n) repeatedly after every sample generated, you should notice that
> the chance of the average being larger than 0 is equal to that of it
> being smaller than 0 (i.e. P(avg_n > 0) = 0.5). (avg_n = (1/n)
> sum_{i=1}^n x[i], where x is a sequence of random samples.) This follows
> from the central limit theorem.

On a third thought, the sequence avg_n isn't i.i.d. anymore, so what I said is probably incorrect (i.e. the conditional probability P(avg_n > 0 | avg_{n-1}, ..., avg_0) probably doesn't have to be symmetric around 0). Not sure about this though (shouldn't post in haste :))

--
Jani Huhtanen
Tampere University of Technology, Pori
Reply by ●February 16, 2006
I am going to discuss continuous-time random processes and white noise only. So, please do not muddy the waters by bringing in discrete-time white noise processes as obvious counterexamples for what I say below.

For a second-order (i.e. finite variance) stationary random process, the autocorrelation function Rxx(t) = E[X(s)X(s+t)] is either a periodic function or decays asymptotically, as t approaches infinity (or -infinity), to m^2, the square of the mean of the process. In other words, two random variables corresponding to time instants that are far apart on the time axis are *uncorrelated* in the statistical sense of the word; their covariance is 0; the autocovariance function Cxx(t) approaches 0 asymptotically, and Rxx(t) = Cxx(t) + m^2 approaches m^2.

If we apply these notions to white noise, which is *not* a second-order process but for many purposes can be treated as a second-order process, then Rxx(t) = N0 delta(t) *does* imply that the mean m is zero.

The process described by Tim Westcott (white noise riding on a DC term) is not really a *white* noise process, in the sense that the autocorrelation function of this other process is *not* of the form N0 delta(t). So, the real question is "what do we mean when we say that X(t) is a white noise process?" If "white" means that the autocorrelation function is of the form N0 delta(t), then the answer to Randy's question is yes, white noise has zero mean. Notice also that the power spectral density phixx(f) has value N0 for all f for this "version" of white noise. For the white noise + DC term, the power spectral density includes an impulse at f = 0. Do we still want to call it white noise? I wouldn't, but then, I am well known on this group for being a stuck-up theoretician and pedant....

Ultimately, the question of white noise is a philosophical and mathematical question.
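[Editor's note: at the risk of reaching for exactly the discrete-time stand-in Dilip asked us to keep out of it, the m^2 floor in Rxx is easy to see numerically. A rough numpy sketch, intuition only; the sequence length and DC level are arbitrary:]

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x = rng.standard_normal(n)   # zero-mean, unit-variance "white" sequence
y = x + 2.0                  # the same noise riding on a DC term, m = 2

def acf(v, max_lag):
    # Biased estimate of R(k) = E[v(t) v(t+k)], with the mean NOT removed
    return np.array([np.dot(v[:len(v) - k], v[k:]) / len(v)
                     for k in range(max_lag)])

rx = acf(x, 5)
ry = acf(y, 5)
print(rx)  # impulse at lag 0 only: roughly [1, 0, 0, 0, 0]
print(ry)  # a floor of m^2 = 4 at every lag: roughly [5, 4, 4, 4, 4]
```

The nonzero-mean version has Rxx(k) pinned near m^2 = 4 at all lags, i.e. its autocorrelation is not an impulse and its PSD picks up a spike at f = 0, matching Dilip's description.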
For many linear filters, it is an experimentally observed fact that the noise process at the output has an autocorrelation function Ryy(t) that is proportional to the autocorrelation function Rhh(t) of the impulse response h(t) of the filter. Equivalently, phiyy(f), the power spectral density of the filter output, is proportional to |H(f)|^2, the power transfer function of the filter. These results can be "justified" very easily by PRETENDING that there exists a process with autocorrelation function N0 delta(t), and that we can treat this process as a second-order process even though it has infinite variance. The standard second-order theory for linear systems says that Ryy(t) = Rxx(t) * Rhh(t), which equals N0 Rhh(t) if Rxx(t) happens to be N0 delta(t). Similarly, phiyy(f) = phixx(f) |H(f)|^2, which equals N0 |H(f)|^2 if phixx(f) happens to have value N0 for all f. Thus, the theory gives results that are in line with experimental observations.

Does white noise actually exist? What is the meaning of life, the universe, and everything? I think it is best (as Randy says in a later note in this thread) that we use the term white noise exclusively to denote a (zero-mean) process with autocorrelation function N0 delta(t) and power spectral density phixx(f) = N0 for all f.

Finally, I want to re-emphasize that this discussion is about continuous-time processes. So, let's keep discrete-time white noise processes out of it.

Hope this helps
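[Editor's note: the Ryy(t) = N0 Rhh(t) relation can also be sanity-checked in discrete time (again, intuition only). A rough sketch with an arbitrary FIR impulse response and unit-variance white input, so N0 = 1:]

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
x = rng.standard_normal(n)            # unit-variance white input (N0 = 1)

h = np.array([1.0, 0.5, 0.25])        # arbitrary short FIR impulse response
y = np.convolve(x, h, mode="same")    # filter output

def acf(v, max_lag):
    # Biased estimate of Ryy(k) = E[y(t) y(t+k)]
    return np.array([np.dot(v[:len(v) - k], v[k:]) / len(v)
                     for k in range(max_lag)])

ryy = acf(y, 4)
# Theory: Ryy(k) = N0 * Rhh(k), the deterministic autocorrelation of h
rhh = np.array([np.dot(h[:len(h) - k], h[k:]) for k in range(4)])
print(ryy)   # roughly [1.3125, 0.625, 0.25, 0.0]
print(rhh)   # exactly  [1.3125, 0.625, 0.25, 0.0]
```

The estimated output autocorrelation tracks N0 Rhh(k), which is the discrete analogue of the experimental fact Dilip describes.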
Reply by ●February 16, 2006
dvsarwate@ieee.org wrote:
...
> Hope this helps

Of course it helps. Some of us would rather think than do math; that's like walking on thin ice. Others would rather do math than think; that's like going around blindfolded. Dilip does both wonderfully well, and so usually has the last word, even when we don't realize it and keep talking.

Jerry
--
Engineering is the art of making what you want from things you can get.
Reply by ●February 17, 2006
Dilip, Tim, et al.,

Thank you for that clarification, and thanks for your help.

I claim that white noise, as defined by at least two authors, does not require a zero mean. However, in each case I feel the definitions from these authors are missing a necessary ingredient. I describe this in detail in the new paper "Mean of a White-Noise Process,"

  http://www.digitalsignallabs.com/white.pdf

Please have a look and critique - if you see an error or an oversight, I want to know about it.

--Randy Yates

dvsarwate@ieee.org writes:
> I am going to discuss continuous-time random
> processes and white noise only. So, please do
> not muddy the waters by bringing in discrete-time
> white noise processes as obvious counterexamples
> for what I say below.
>
> [...rest of Dilip's post, quoted in full above, snipped...]
>
> Hope this helps

-- 
%  Randy Yates                  % "I met someone who looks alot like you,
%% Fuquay-Varina, NC            %  she does the things you do,
%%% 919-577-9882                %  but she is an IBM."
%%%% <yates@ieee.org>           % 'Yours Truly, 2095', *Time*, ELO
http://home.earthlink.net/~yatescr
Reply by ●February 17, 2006
Hi,

Equation 7 (def. for uncorrelated): you have

  E[x(t+tau)x(t)] = E[x(t+tau)]E[x(t)]

That seems like the definition of independence and not uncorrelatedness. I think you meant:

  E[x(t+tau)x(t)] = 0

Of course independence results in uncorrelatedness, but uncorrelatedness does not result in independence. Is that a typo or am I missing something?
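[Editor's note: for reference, the usual textbook definitions are the following; this note is the editor's, not a quote from Randy's paper or the thread.]

```latex
\text{uncorrelated:}\quad \operatorname{Cov}(X,Y) = E[XY] - E[X]\,E[Y] = 0
  \;\iff\; E[XY] = E[X]\,E[Y]
\qquad
\text{orthogonal:}\quad E[XY] = 0
```

The two conditions coincide exactly when at least one of the means is zero; independence implies uncorrelatedness but is strictly stronger.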
Reply by ●February 17, 2006
Hi,

I think Brown's definition of WN gives zero mean for the process (unless I made a mistake). From his definition, a WN (stationary with constant psd) has:

  E[x(t)] = mx               (constant)
  R(x(t2), x(t1)) = R(tau)
  Sx = var                   (constant psd)

which implies R(tau) = delta(tau)*var.

So that, using this equation for the mean,

  var = E[x(t)^2] - (E[x(t)])^2

we get

  (E[x(t)])^2 = E[x(t)^2] - var = R(0) - var = var - var = 0

-Ikaro
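[Editor's note: Ikaro's rearrangement can be checked on a generated sequence. A discrete sketch; the sequence length and seed are arbitrary:]

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.standard_normal(500_000)   # zero-mean, unit-variance white samples

r0 = np.mean(x * x)                # estimate of R(0) = E[x(t)^2]
var = np.var(x)                    # variance estimate
mean_sq = r0 - var                 # Ikaro's rearrangement: (E[x(t)])^2 = R(0) - var
print(mean_sq)                     # should be ~0 (it equals the sample mean squared)
```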
Reply by ●February 17, 2006