Reply by Dilip V. Sarwate November 3, 2003
Randy Yates <yates@ieee.org> wrote in message news:<_B_ob.1666$9M3.58@newsread2.news.atl.earthlink.net>...


> It is indeed a contradiction to me, and I cannot resolve
> the contradiction. It is one I have had for years.
Randy: We have crossed swords on this point before, but what the heck, once more into the breach...

Let us first talk of energy and power for a deterministic signal. A signal x(t) delivers (say, during the interval [0, T], into a 1 ohm resistor) an amount of energy that can be expressed as

  En(T) = *integral* of x^2(t) from 0 to T

The *average power* during this interval is (1/T).En(T), while the *instantaneous power* P(t) at any time t during this interval is the ratio of the energy delivered during a small interval of duration dt to the length of the interval, i.e. P(t) = [En(t + dt) - En(t)]/dt, or more properly, the limiting value of this ratio as dt becomes very small. We can extend all these notions to the whole real line appropriately, and for many signals we will have that En(T) increases without bound as T approaches infinity while (1/T).En(T) approaches a constant value (called the average power of the signal), but let's just stick to a finite-length interval for a while. Now, underlying all this mathematical malarkey is the notion that x^2(t) is an *integrable* function of t.

Next, let us consider the random process that you and Carlos are thinking about, viz. a random process for which each X(t) is uniformly distributed on [-1, 1] and for which each X(t) is independent of all other X(t'). If we think about the *entire set* of realizations or sample functions of this process, then we see that these realizations are nothing more and nothing less than the set of *all* functions x(t) such that |x(t)| is at most 1 over the interval under consideration. We can define the energy delivered for all these signals as above. Or can we? Are *all* (or almost all) bounded functions also integrable functions? No. In fact, almost all of these sample functions of the random process are *not integrable*. Thus it is not clear how we can define the energy delivered for the sample functions of the process that you are considering.

But, you say, a typical sample function is, after all, a signal that we can produce in the lab, and therefore we can measure its energy and figure out how to define the appropriate formula for energy. Well, as I pointed out in a different subthread, a typical sample function is discontinuous almost everywhere, and requires infinite currents to flow to charge and discharge capacitors instantaneously. Can we get a pretty good approximation by creating a sampled version? Well, you have then "filtered" the process in some sense when you sample it, and the discrepancy between infinite power and finite power disappears.

In summary, I suggest that we all stop trying to look for Wiener's or Khinchine's original definitions and/or proofs in the hopes of resolving the apparent contradictions. The answer is not there, but rather in basic calculus, which we have forgotten in our rush to judgment... The Wiener-Khinchine formulation does not apply to the so-called white noise process with finite variance.

--Dilip Sarwate
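A quick numerical illustration of that last "sampled version" point (a sketch in Python; the uniform [-1, 1] distribution comes from the post, but the sample count and seed are arbitrary choices): once the process is reduced to samples, its average power is simply the variance, 1/3 watt into 1 ohm, and nothing is infinite.

import numpy as np

rng = np.random.default_rng(0)

# Sampled "white" noise: independent samples, uniform on [-1, 1].
x = rng.uniform(-1.0, 1.0, size=1_000_000)

# Average power of the sampled sequence (mean energy per sample into 1 ohm).
avg_power = np.mean(x**2)

print(avg_power)   # approximately 1/3, the variance of uniform[-1, 1]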
Reply by Piergiorgio Sartor November 3, 2003
Carlos Moreno wrote:

>> The PDS of a stochastic process is the "power"
>> or "energy" the process can have in terms of
>> probability, not in terms of physical energy.
>
> Hmm... Interesting (i.e., interesting how for
> so many years I've misunderstood all these
> concepts :-) I mean, :-( ).
Well, I'm not sure I'm really on the right track...
> I would ask you to define "power the process can
> have in terms of probability"... But I guess
Maybe see below.
> this falls again in the "I-better-go-do-my-
> homework" category (homework not in the sense
> that this question is part of some homework I
> got; but in the sense that I have my own work
> to do in investigating these things)
Well, I think that posing doubts is a good way to proceed.
>> In case of white noise, it means that there
>> is no limitation to the possible sequence.
>
> Can you elaborate?? No limitation to the possible
> sequence meaning what exactly? (what kind of
> limitation -- or lack thereof -- are we talking
> about?)
Let's say you have white noise (discrete time, it's easier). This means that if at time T1, x[T1] = 1, then at time T2, x[T2] can be anything.

If the process is not white, for example if it is low-pass filtered, then if x[T1] = 1, x[T2] cannot be anything: there is not enough "power" to change it to just any of the possible values. It will be limited to changing to values close to 1; the more the process is low-pass filtered, the more it is limited.

The wider the spectrum, the higher the probability of getting "anything" as the sequence. As the "power" of the stochastic process increases, the probability of seeing something different is higher. Something like: the probability is boosted by the statistical power.

That's my interpretation.
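A small Python sketch of this idea (my own illustration; the 10-tap moving average and the sample count are arbitrary choices, not from the post): low-pass filtering the white sequence forces neighbouring samples to stay close together, which shows up as a large lag-1 correlation.

import numpy as np

rng = np.random.default_rng(1)

# Discrete-time "white" noise: independent uniform samples on [-1, 1].
white = rng.uniform(-1.0, 1.0, size=100_000)

# Low-pass it with a simple 10-tap moving average.
lowpassed = np.convolve(white, np.ones(10) / 10, mode="valid")

def lag1_corr(x):
    # Correlation coefficient between x[n] and x[n+1].
    return np.corrcoef(x[:-1], x[1:])[0, 1]

print(lag1_corr(white))      # close to 0: the next sample can be "anything"
print(lag1_corr(lowpassed))  # close to 0.9: the next sample stays near the current one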
>> The energy or power you calculate for a possible
>> realization is something else; it does not relate
>> to the PDS in any physical sense.
>
> I'm now confused: why are the units of the PSD
> so conveniently watts per Hz?
Why not? It is "power" in the end, but it must be interpreted in a slightly different way. I think very often these relate to each other, but they must not be swapped without good sense.

bye,

-- 
Piergiorgio Sartor
Reply by Randy Yates November 2, 2003
Carlos Moreno wrote:
> Randy Yates wrote:
>>
>> Rxx(0) = E[X^2(t)], by definition.
>
> Huh??
>
> I thought Rxx(0) was defined as the time-domain correlation:
>
>   Rxx(T) = integral from -oo to +oo of X^2(t) dt
>
> Notice that in that case, this represents the total energy
> of the signal... Hmmm, though something confuses me again:
> this would be the energy of *one particular* realization.
> (So, I think this brings me back to my original doubt:
> what the hell do the PSD and Rxx represent for a random
> process?)
Yup, there's that definition too. But the autocorrelation function which the Wiener-Khinchine theorem utilizes is the probabilistic one.
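To make the distinction concrete, here is a short Python sketch (an illustration assuming a discrete IID sequence uniform on [-1, 1]; the array sizes and seed are arbitrary) that estimates the probabilistic autocorrelation E[X(n) X(n+k)] by averaging across many realizations: it gives the variance at lag 0 and roughly zero elsewhere, with no delta-function blow-up in sight.

import numpy as np

rng = np.random.default_rng(2)

num_realizations = 20_000
length = 16

# Many independent realizations of a discrete IID process, uniform on [-1, 1].
X = rng.uniform(-1.0, 1.0, size=(num_realizations, length))

# Ensemble (probabilistic) autocorrelation estimate E[X(0) * X(k)],
# averaged over realizations, for the fixed reference time n = 0.
for k in (0, 1, 5):
    Rxx_k = np.mean(X[:, 0] * X[:, k])
    print(k, Rxx_k)   # ~1/3 (the variance) for k = 0, ~0 for k != 0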
>> PS: Go to sci.math, where I posed this very question
>
> Still unanswered. We'll wait. Maybe in that reference I
> was pointed to (Priestley, vol. 1) such proof/justification
> is given?
Maybe. If you find it there, please let me know.

-- 
%  Randy Yates                  % "...the answer lies within your soul
%% Fuquay-Varina, NC            %  'cause no one knows which side
%%% 919-577-9882                %  the coin will fall."
%%%% <yates@ieee.org>           % 'Big Wheels', *Out of the Blue*, ELO
http://home.earthlink.net/~yatescr
Reply by Jerry Avins November 2, 2003
Carlos Moreno wrote:

> Jerry Avins wrote:
>
>> By applying correct geometric reasoning to a flawed figure, I can prove
>> that two lines which intersect are parallel. So what?
>>
>> Consider a line of LEDs a foot apart. Each draws 10 milliwatts. It's not
>> absurd to say that if the line extends from -infinity to infinity, then
>> the power represented by them in the aggregate is infinite. Absurdity
>> lies in the belief that a line of such extent must follow rules that
>> apply to things that can be.
>
> Jerry,
>
> You seem to keep missing my point.
  ...
It's worse than that: we're missing each other's points. Mine are:

1. When integrating from -infinity to +infinity that which exists only locally in time, the result is of no practical interest. Maybe that's too strong. Put another way: integrating over all frequencies a function that represents power, but is valid only within a limited band, can yield the surprising conclusion that troubles you.

2. The assumption that successive samples of a signal are statistically independent is only valid when the bandwidth of the signal is less than half the sample rate. Your assumed properties of the signal should assure you that the samples you hypothesize, and the calculations you perform on them, are worthless.

Jerry
-- 
Engineering is the art of making what you want from things you can get.
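For reference, the standard result underlying this kind of bandwidth-versus-sample-rate argument (a textbook relation, not something derived in this thread): noise with a flat power spectral density of height N_0/2 confined to |f| <= B has a sinc-shaped autocorrelation, and its zeros fall exactly at multiples of 1/(2B), so only samples spaced by those intervals are guaranteed uncorrelated.

S_{XX}(f) = \begin{cases} N_0/2, & |f| \le B \\ 0, & |f| > B \end{cases}
\quad\Longrightarrow\quad
R_{XX}(\tau) = \int_{-B}^{B} \frac{N_0}{2}\, e^{j 2\pi f \tau}\, df
             = N_0 B \,\mathrm{sinc}(2B\tau),
\qquad \mathrm{sinc}(x) = \frac{\sin \pi x}{\pi x}

Note that R_{XX}(0) = N_0 B stays finite for any finite bandwidth B, and R_{XX}(k/(2B)) = 0 for every nonzero integer k.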
Reply by Carlos Moreno November 2, 2003
Piergiorgio Sartor wrote:
> Carlos Moreno wrote:
> [...]
>
> My feeling is that you're mixing things that
> seem to be the same, but they're not.
>
> The PDS of a stochastic process is the "power"
> or "energy" the process can have in terms of
> probability, not in terms of physical energy.
Hmm... Interesting (i.e., interesting how for so many years I've misunderstood all these concepts :-) I mean, :-( ). I would ask you to define "power the process can have in terms of probability"... But I guess this falls again in the "I-better-go-do-my- homework" category (homework not in the sense that this question is part of some homework I got; but in the sense that I have my own work to do in investigating these things)
> In case of white noise, it means that there
> is no limitation to the possible sequence.
Can you elaborate?? No limitation to the possible sequence meaning what exactly? (what kind of limitation -- or lack thereof -- are we talking about?)
> The energy or power you calculate for a possible
> realization is something else; it does not relate
> to the PDS in any physical sense.
I'm now confused, then: why are the units of the PSD so conveniently watts per Hz?

Thanks,

Carlos
--
Reply by Piergiorgio Sartor November 2, 2003
Carlos Moreno wrote:
[...]

My feeling is that you're mixing things that
seem to be the same, but they're not.

The PDS of a stochastic process is the "power"
or "energy" the process can have in terms of
probability, not in terms of physical energy.

In case of white noise, it means that there
is no limitation to the possible sequence.
If not white, then the process is limited in
the "possible" realization and this is reflected
in the PDS as well.

The energy or power you calculate for a possible
realization is something else; it does not relate
to the PDS in any physical sense.
It just happens they can be exchanged, but they
should not be.

bye,

-- 

piergiorgio

Reply by Carlos Moreno November 2, 2003
Jerry Avins wrote:

> By applying correct geometric reasoning to a flawed figure, I can prove
> that two lines which intersect are parallel. So what?
>
> Consider a line of LEDs a foot apart. Each draws 10 milliwatts. It's not
> absurd to say that if the line extends from -infinity to infinity, then
> the power represented by them in the aggregate is infinite. Absurdity
> lies in the belief that a line of such extent must follow rules that
> apply to things that can be.
Jerry,

You seem to keep missing my point. What you're saying is more or less equivalent to deriving an incorrect property of sinusoidal signals and claiming that that's because complex numbers do not exist in nature, and thus it is absurd to pretend that we'll get a meaningful result when applied to something that exists in nature, etc. etc.

Complex numbers *are* an absurdity, the way you define absurdity. They are a mathematical abstraction that does not correspond to anything that exists in our universe (at least nothing that we humans are aware of). Complex numbers simply follow *the mathematical rules* established by their purely mathematical definition. Following **those** mathematical definitions, rules, and axioms, you derive other results. Given the definition that i^2 is equal to -1, Euler derived, following that definition and the usual axioms of mathematics/algebra/calculus, that e^(x+iy) = e^x (cos y + i sin y). You could claim that anything you derive from there is plain wrong, since the imaginary unit does not exist in nature.

Same thing with the delta -- you claim that none of what I've said makes sense because white noise cannot exist in nature! Then why do you accept that results derived by applying the properties and the definition of Dirac's delta are indeed correct?? Dirac's delta most certainly cannot exist in nature; however, the results that you obtain using that mathematical abstraction are verifiable mathematically (and, from the practical point of view, they also have relevance, since they *very closely* approximate the way certain physical systems behave). You convert a differential equation to its integral form, and applying the definition of the delta (a definition that has no place in the real world), which implies that the integral from -epsilon to epsilon is 1 (for every epsilon > 0), you obtain results that do not contradict any other result obtained using mathematical constructs.

I insist: it does not make sense to use the argument "white noise cannot exist in nature" to justify a contradiction that arises in mathematical terms -- two different ways of analytically calculating something that should be equivalent yield different results. Your argument would be valid if I claimed that in an actual experiment I expect to obtain whatever this or whatever that, and I obtain something different from what the formulas tell me... Then your argument *might* be valid (emphasis on the *might*; the conclusions you draw from properties of complex numbers, Dirac's delta, etc., *are indeed* verifiable in practice, with an accuracy given by the precision of the model, the implementation, and the precision of the measurements).

Carlos
--
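As a side note on the delta argument, here is a tiny numerical check in Python (my own sketch; the test function f and the pulse widths are arbitrary choices): replacing the delta with unit-area rectangular pulses of shrinking width, the integral of f(t) times the pulse converges to f(0), which is exactly what the idealized definition of the delta is shorthand for.

import numpy as np

def f(t):
    # An arbitrary smooth test function, chosen only for illustration; f(0) = 1.
    return np.exp(t)

# A unit-area rectangular pulse of width eps centered at t = 0 stands in for
# the Dirac delta.  The integral of f(t) * (1/eps) over [-eps/2, eps/2] is
# just the average of f on that interval.
for eps in (1.0, 0.1, 0.01, 0.001):
    t = np.linspace(-eps / 2, eps / 2, 10_001)
    approx = np.mean(f(t))
    print(eps, approx)   # approaches f(0) = 1.0 as eps shrinks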
Reply by Carlos Moreno November 2, 2003
Randy Yates wrote:

> It is indeed a contradiction to me, and I cannot resolve
> the contradiction. It is one I have had for years.
Hmmm... This thread is becoming interesting!! :-)
>> In fact, now that you "equate" the results of those two
>> different approaches, I guess one problem (maybe part of
>> the same problem?) is that I'm not sure I understand the
>> justification of defining the PSD and Rxx as a Fourier
>> transform pair.
>
> The justification of the definition? Are you asking how
> Wiener and Khinchine proved the theorem that the PSD
> is the Fourier transform of the autocorrelation function?
Well, I wasn't looking for something as rigorous as a proof of the theorem. Just an interpretation of the definition that helps me understand it better (as opposed to an interpretation *of the results* or an interpretation of the consequences of such definition -- that, I think I understand relatively well).
>> Fine. So, why the contradiction? (or the "apparent"
>> contradiction of results?).
>
> Because Rxx(tau) != F^(-1)[Sxx(w)], and they should be.
Ok.
>> In fact, how would Rxx(0) be related to the average
>> electrical power of the signal? (the average "watts"
>> that a voltage signal would dissipate on a 1-ohm resistor).
>> I seem to see clearly how the E{x^2} is related to the
>> average power of a voltage signal, but not so sure that
>> I understand the link with Rxx.
>
> Rxx(0) = E[X^2(t)], by definition.
Huh??

I thought Rxx(0) was defined as the time-domain correlation:

  Rxx(T) = integral from -oo to +oo of X^2(t) dt

Notice that in that case, this represents the total energy of the signal... Hmmm, though something confuses me again: this would be the energy of *one particular* realization. (So, I think this brings me back to my original doubt: what the hell do the PSD and Rxx represent for a random process?)
> Carlos, I am indeed not mad with you.
I never actually thought you'd be mad, of course! :-)
> PS: Go to sci.math, where I posed this very question
Still unanswered. We'll wait. Maybe in that reference I was pointed to (Priestley, vol. 1) such a proof/justification is given?

Thanks,

Carlos
--
Reply by Jerry Avins November 2, 2003
Carlos Moreno wrote:

   ...
> I'm really hoping that someone will be able and
> willing to make me understand what's happening!!
> (so that I can leave you guys alone AND sleep
> peacefully at the same time :-)) -- I'm serious!!
> I'm having nightmares about this!!! :-(((
>
> Cheers,
>
> Carlos
> --
By applying correct geometric reasoning to a flawed figure, I can prove that two lines which intersect are parallel. So what?

Consider a line of LEDs a foot apart. Each draws 10 milliwatts. It's not absurd to say that if the line extends from -infinity to infinity, then the power represented by them in the aggregate is infinite. Absurdity lies in the belief that a line of such extent must follow rules that apply to things that can be.

Jerry
-- 
Engineering is the art of making what you want from things you can get.
Reply by Randy Yates November 1, 2003
Hi Carlos,

Carlos Moreno wrote:
> Hi Randy,
>
> Randy Yates wrote:
>
>> Carlos,
>>
>> Although I haven't read all of the dozens of
>> articles in this thread, would the following
>> describe the "question in your head"? ...:
>>
>> Let X(t) be a zero-mean, IID, continuous-time process
>> with variance of sigma^2. Now we know that since this
>> function is IID, it has a white PSD and therefore its
>> autocorrelation function at lag 0, R_XX(0), should be
>> b*delta(tau), where delta(tau) is the usual Dirac delta
>> function and b is some constant.
>>
>> However, we also know that the autocorrelation function R_XX(tau)
>> is defined to be E[X(t)*X(t+tau)], and therefore that R_XX(0)
>> is E[X^2] (where we have dropped the dependence on t since the
>> process is IID). Therefore in this sense R_XX(0) = E[X^2] = sigma^2,
>> which is not infinite.
>>
>> Is this the contradiction you speak of?
>
> Maybe. As a matter of fact, this may be at the heart of
> what I perceive as a contradiction. What surprises me is
> that you show the two "contradictory" derivations, but
> don't mention why the contradiction, or if it is indeed
> a contradiction (what I mean is that it might be an
> *apparent* contradiction).
It is indeed a contradiction to me, and I cannot resolve the contradiction. It is one I have had for years.
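Spelled out in symbols (just restating the two computations quoted above, for the zero-mean IID process with variance sigma^2):

\text{From the flat (white) PSD:}\qquad S_{XX}(f) = b
\;\Longrightarrow\; R_{XX}(\tau) = b\,\delta(\tau)
\;\Longrightarrow\; R_{XX}(0) = \infty

\text{From the definition:}\qquad R_{XX}(0) = E[X(t)\,X(t+0)] = E[X^2(t)] = \sigma^2 < \infty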
> In fact, now that you "equate" the results of those two
> different approaches, I guess one problem (maybe part of
> the same problem?) is that I'm not sure I understand the
> justification of defining the PSD and Rxx as a Fourier
> transform pair.
The justification of the definition? Are you asking how Wiener and Khinchine proved the theorem that the PSD is the Fourier transform of the autocorrelation function? That I cannot say - it is an interesting question, though.
> I mean, I remember from my Signals and
> Systems course, understanding the intuitive interpretation
> of it: the more high-frequency contents, the less correlated
> close samples would be -- in particular, for white noise,
> with strong frequency contents going to infinity, samples
> arbitrarily close are still uncorrelated. That made (and
> still makes) perfect sense to me.
>
> Fine. So, why the contradiction? (or the "apparent"
> contradiction of results?).
Because Rxx(tau) != F^(-1)[Sxx(w)], and they should be.
> In fact, how would Rxx(0) be related to the average
> electrical power of the signal? (the average "watts"
> that a voltage signal would dissipate on a 1-ohm resistor).
> I seem to see clearly how the E{x^2} is related to the
> average power of a voltage signal, but not so sure that
> I understand the link with Rxx.
Rxx(0) = E[X^2(t)], by definition. Remember, Rxx(tau) is defined to be E[X(t)*X(t+tau)], so when tau = 0, this yields E[X^2(t)].
>> PS: I'm a little put off that you haven't responded to my post on
>> the units of PSD. Did you see it?
>
> Yes!! And I apologize for letting it go unnoticed!!
> The thing is that I read it together with the other
> messages that finally made me decide that I was starting
> to sound like a troll, and thus decided to withdraw...
> My sincere apologies!
Acknowledged. Thanks, Carlos.
> Ironically, that was *the* message that best seemed to
> address my initial point (I mean, it was the one message
> that did not give me that feeling of "this guy didn't
> understand what I was asking" -- not that I'm saying
> that the others didn't understand; but almost all the
> other messages (the initial ones, at least) gave me, to
> some extent, the impression that they were going off on
> a tangent).
>
> When you showed me that the units of the PSD are indeed
> watts per hertz, that seemed like a very precise attempt
> at convincing me of what the PSD is... (still, after
> reading that message, I could not seem to reconcile
> ideas that seemed contradictory, and then again, that
> contributed to my decision that "maybe I should go
> re-study these concepts before I continue to bother
> these guys" :-))
>
> So, I'm hoping that you won't be too mad at me and will
> be willing to elaborate a bit on this contradiction of
> Rxx(0) being infinite while E{x^2} being finite.
Carlos, I am indeed not mad with you. Thanks for asking these questions - it helps remind me and others of definitions and concepts that tend to fade out unless you revisit them often. Regarding the contradiction, I hope my responses to you previously in this post have clarified.

PS: Go to sci.math, where I posed this very question (in fact, I had cut and pasted it here in my last message to you). As of a few minutes ago, I hadn't yet gotten any responses.

-- 
%  Randy Yates                  % "...the answer lies within your soul
%% Fuquay-Varina, NC            %  'cause no one knows which side
%%% 919-577-9882                %  the coin will fall."
%%%% <yates@ieee.org>           % 'Big Wheels', *Out of the Blue*, ELO
http://home.earthlink.net/~yatescr