Reply by Piergiorgio Sartor ● November 3, 2003
Carlos Moreno wrote:
>> The PSD of a stochastic process is the "power"
>> or "energy" the process can have in terms of
>> probability, not in terms of physical energy.
> Hmm... Interesting (i.e., interesting how for
> so many years I've misunderstood all these
> concepts :-) I mean, :-( ).
Well, I'm not sure I'm really on the right track...
> I would ask you to define "power the process can
> have in terms of probability"... But I guess
Maybe see below.
> this falls again in the "I-better-go-do-my-
> homework" category (homework not in the sense
> that this question is part of some homework I
> got; but in the sense that I have my own work
> to do in investigating these things)
Well, I think that posing doubts is a good
way to proceed.
>> In case of white noise, it means that there
>> is no limitation to the possible sequence.
> Can you elaborate?? No limitation to the possible
> sequence meaning what exactly? (what kind of
> limitation -- or lack thereof -- are we talking about?)
Let's say you have white noise (discrete time, it's easier).
This means that if at time T1, x[T1] = 1, then at
time T2, x[T2] can be anything.
If the process is not white, for example it is low-pass
filtered, then if x[T1] = 1, x[T2] cannot be anything;
there is not enough "power" to change it to just any
of the possible values.
It will be limited to values close to 1: the more the
process is low-passed, the more it is limited.
The wider the spectrum, the higher the probability
of having "anything" as the sequence.
As the "power" of the stochastic process increases,
the probability of getting something different becomes
higher; it is as if the probability were boosted
by the statistical power.
That's my interpretation.
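This can be illustrated with a small simulation (my own sketch; the one-pole smoother and the value alpha = 0.05 are illustrative choices, not anything from the thread): for white noise the next sample really can be almost anything, while for the low-passed version successive samples stay close together.

```python
import random

random.seed(0)
N = 100_000

# White noise: each sample uniform on [-1, 1], independent of the previous one.
white = [random.uniform(-1.0, 1.0) for _ in range(N)]

# A "low-passed" version: a simple one-pole smoother of the same white sequence.
alpha = 0.05
lp = [0.0]
for w in white:
    lp.append((1 - alpha) * lp[-1] + alpha * w)
lp = lp[1:]

def max_step(x):
    """Largest jump between consecutive samples."""
    return max(abs(b - a) for a, b in zip(x, x[1:]))

print(max_step(white))  # close to 2: the next sample can be almost anything
print(max_step(lp))     # never larger than 2*alpha: changes are limited
```

The bound on the second number follows directly from the smoother: its step is alpha*(w - lp[-1]), and both terms lie in [-1, 1].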
>> The energy or power you calculate for a possible
>> realization is something else, it does not relate
>> to the PSD in any physical sense.
> I'm now confused: why, then, are the units of the
> PSD so conveniently Watts per Hz?
It's "power" at the end, but it must be interpreted in
a slightly different way.
I think the two very often relate to each other, but
they must not be swapped without good sense.
Reply by Dilip V. Sarwate ● November 3, 2003
Randy Yates <firstname.lastname@example.org> wrote in message news:<_B_ob.1666$9M3.email@example.com>...
> It is indeed a contradiction to me, and I cannot resolve
> the contradiction. It is one I have had for years.
We have crossed swords on this point before, but what the
heck, once more into the breach...
Let us first talk of energy and power for a deterministic signal.
A signal x(t) delivers (say, during the interval [0, T], into a 1
ohm resistor) an amount of energy that can be expressed as
En(T) = *integral* of x^2(t) dt from 0 to T
The *average power* during this interval is (1/T).En(T) while the
*instantaneous power* P(t) at any time t during this interval is the
ratio of the energy delivered during a small interval of duration dt
to the length of the interval, i.e. P(t) = [En(t + dt) - En(t)]/dt,
or more properly, the limiting value of this ratio as dt becomes
very small. We can extend all these notions to the whole real line
appropriately, and for many signals, we will have that En(T) increases
without bound as T approaches infinity while (1/T).En(T) approaches a
constant value (called the average power of the signal), but let's
just stick to a finite length interval for a while.
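These definitions are easy to check numerically. Here is a minimal sketch (my own choices: x(t) = sin(2*pi*t), T = 1, a plain left Riemann sum for the integral) computing En(T), the average power (1/T).En(T), and the instantaneous power P(t) = [En(t + dt) - En(t)]/dt, which recovers x^2(t):

```python
import math

dt = 1e-5
T = 1.0

def x(t):
    # Concrete test signal: a unit-amplitude sinusoid with period 1.
    return math.sin(2 * math.pi * t)

def En(T):
    # Left Riemann sum approximating the integral of x^2(t) from 0 to T.
    n = round(T / dt)
    return sum(x(k * dt) ** 2 for k in range(n)) * dt

energy = En(T)            # integral of sin^2 over one period = 1/2
avg_power = energy / T    # (1/T) * En(T)

t0 = 0.25
p_inst = (En(t0 + dt) - En(t0)) / dt   # approximates x^2(t0) = 1

print(energy, avg_power, p_inst)
```

All three numbers match the closed-form values for a sinusoid, precisely because x^2(t) here is a well-behaved integrable function, which is the assumption examined next.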
Now, underlying all this mathematical malarkey is the notion that
x^2(t) is an *integrable* function of t. Next, let us consider the
random process that you and Carlos are thinking about, viz. a random
process for which each X(t) is uniformly distributed on [-1, 1] and
for which each X(t) is independent of all other X(t'). If we think
about the *entire set* of realizations or sample functions of this
process, then we see that these realizations are nothing more and
nothing less than the set of *all* functions x(t) such that |x(t)|
is at most 1 over the interval under consideration. We can define
the energy delivered for all these signals as above. Or can we?
Are *all* (or almost all) bounded functions also integrable functions?
No. In fact, almost all of the sample functions of this random process
are *not integrable*. Thus it is not clear how we can define the
energy delivered for the sample functions of the process that you
describe.
But, you say, a typical sample function is, after all, a signal
that we can produce in the lab, and therefore we can measure its
energy and figure out how to define the appropriate formula for
energy. Well, as I pointed out in a different subthread, a typical
sample function is discontinuous almost everywhere, and requires
infinite currents to flow to charge and discharge capacitors
instantaneously. Can we get a pretty good approximation by creating
a sampled version? Well, you have then "filtered" the process in some
sense when you sample it, and the discrepancy between infinite power
and finite power disappears.
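To make that last point concrete (my own sketch, with an arbitrary seed and sample count): once the process is sampled, a realization is just a finite sequence of independent uniform values, and its time-average power is perfectly finite, close to E[X^2] = Var(X) = 1/3 for a uniform distribution on [-1, 1].

```python
import random

random.seed(1)
N = 200_000

# A sampled realization of the process: independent uniform [-1, 1] values.
samples = [random.uniform(-1.0, 1.0) for _ in range(N)]

# Time-average power of this realization: (1/N) * sum of x[n]^2.
avg_power = sum(s * s for s in samples) / N

print(avg_power)  # close to 1/3, the variance of U[-1, 1]
```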
In summary, I suggest that we all stop trying to look for Wiener's
or Khinchine's original definitions and/or proofs in the hopes of
resolving the apparent contradictions. The answer is not there but
rather in basic calculus which we have forgotten in our rush to
judgment... The Wiener-Khinchine formulation does not apply to the
so-called white noise process with finite variance.