# Rigorous definition of the Spectral Density of a random signal?

Started October 25, 2003
```Carlos Moreno wrote:
> [...]
> I feel so frustrated that I haven't been able to
> communicate what's in my mind!!

Carlos,

Although I haven't read all of the dozens of
articles in this thread, would the following help?

Let X(t) be a zero-mean, IID, continuous-time process
with variance sigma^2. Now we know that since this
process is IID, it has a white PSD, and therefore its
autocorrelation function should be R_XX(tau) =
b*delta(tau), where delta(tau) is the usual Dirac delta
function and b is some constant -- so R_XX(0) = b*delta(0)
is infinite.

However, we also know that the autocorrelation function R_XX(tau)
is defined to be E[X(t)*X(t+tau)], and therefore that R_XX(0)
is E[X^2] (where we have dropped the dependence on t since the
process is IID). Therefore in this sense R_XX(0) = E[X^2] = sigma^2,
which is not infinite.

Is this the contradiction you speak of?

PS: I'm a little put off that you haven't responded to my post on
the units of PSD. Did you see it?
--
%% Fuquay-Varina, NC            %       'cause no one knows which side
%%% 919-577-9882                %                   the coin will fall."
%%%% <yates@ieee.org>           %  'Big Wheels', *Out of the Blue*, ELO

```
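Randy's two computations can be reproduced numerically for a sampled (discrete-time) version of the process. In discrete time the "delta" in the autocorrelation is a Kronecker delta of finite height sigma^2, so no infinity appears; the Dirac delta (and the infinite R_XX(0)) belongs only to the continuous-time idealization. A minimal sketch, assuming NumPy (the names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0
n = 200_000
x = rng.normal(0.0, sigma, size=n)  # IID samples: discrete-time white noise

# Probabilistic definition: R_xx(0) = E[X^2], estimated by a sample average.
r0 = np.mean(x * x)           # ~ sigma^2 (finite), here ~4
r1 = np.mean(x[:-1] * x[1:])  # ~ 0: distinct samples are uncorrelated

print(r0, r1)
```

Here R_xx[k] = sigma^2 * delta[k] (Kronecker), the PSD is flat at sigma^2 over [-pi, pi), and the total power sigma^2 stays finite; letting the bandwidth go to infinity while keeping the PSD level fixed is what produces the Dirac delta and the infinite R_XX(0).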
```Hi Randy,

Randy Yates wrote:
>
> Carlos,
>
> Although I haven't read all of the dozens of
> articles in this thread, would the following help?
>
>   Let X(t) be a zero-mean, IID, continuous-time process
>   with variance of sigma^2. Now we know that since this
>   function is IID, it has a white PSD and therefore its
>   autocorrelation function at lag 0, R_XX(0), should be
>   b*delta(tau), where delta(tau) is the usual Dirac delta
>   function and b is some constant.
>
>   However, we also know that the autocorrelation function R_XX(tau)
>   is defined to be E[X(t)*X(t+tau)], and therefore that R_XX(0)
>   is E[X^2] (where we have dropped the dependence on t since the
>   process is IID). Therefore in this sense R_XX(0) = E[X^2] = sigma^2,
>   which is not infinite.
>
> Is this the contradiction you speak of?

Maybe.  As a matter of fact, this may be at the heart of
what I perceive as a contradiction.  What surprises me is
that you show the two "contradictory" derivations, but
don't explain why the contradiction arises, or whether it is
indeed a contradiction (what I mean is that it might be an
apparent one only).

In fact, now that you "equate" the results of those two
different approaches, I guess one problem (maybe part of
the same problem?) is that I'm not sure I understand the
justification of defining the PSD and Rxx as a Fourier
transform pair.  I mean, I remember from my Signals and
Systems course, understanding the intuitive interpretation
of it:  the more high-frequency contents, the less correlated
close samples would be -- in particular, for white noise,
with strong frequency contents going to infinity, samples
arbitrarily close are still uncorrelated.  That made (and
still makes) perfect sense to me.

Fine.  So, why the contradiction?  (or the "apparent"
contradiction?)

In fact, how would Rxx(0) be related to the average
electrical power of the signal?  (the average "watts"
that a voltage signal would dissipate on a 1-ohm resistor).
I seem to see clearly how the E{x^2} is related to the
average power of a voltage signal, but not so sure that
I understand the link with Rxx.

> PS: I'm a little put off that you haven't responded to my post on
> the units of PSD. Did you see it?

Yes!!  And I apologize for letting it go unnoticed!!
The thing is that I read it together with the other
messages that finally made me decide that I was starting
to sound like a troll, and thus decided to withdraw...
My sincere apologies!

Ironically, that was *the* message that best seemed to
address my initial point  (I mean, it was the one message
that did not give me that feeling of "this guy didn't
understand what I was asking"  -- not that I'm saying
that the others didn't understand; but almost all the
other messages (the initial ones, at least) gave me, to
some extent, the impression that they were going off on a
tangent).

When you showed me that the units of the PSD are indeed
watts per hertz, that seemed like a very precise attempt
at convincing me of what the PSD is... (still, after
reading that message, I could not seem to reconcile
ideas that seemed contradictory, and then again, that
contributed to my decision that "maybe I should go
re-study these concepts before I continue to bother
these guys"  :-))

So, I'm hoping that you won't be too mad at me and will
be willing to elaborate a bit on this contradiction of
Rxx(0) being infinite while E{x^2} being finite.

Thanks!!

Carlos
--

```
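On Carlos's question about average electrical power: for a stationary, ergodic process, the time-averaged power of a single realization (the watts a voltage waveform dissipates in 1 ohm) converges to the ensemble average E[X^2] = Rxx(0). A sketch assuming NumPy, treating the samples as a voltage across 1 ohm:

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 1.5
v = rng.normal(0.0, sigma, size=100_000)  # one realization: volts across 1 ohm

# Time-average power of this one realization, in watts...
p_time = np.mean(v**2)

# ...agrees with the ensemble quantity R_xx(0) = E[X^2] = sigma^2,
# because the process is ergodic.
print(p_time, sigma**2)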
```Hi Carlos,

Carlos Moreno wrote:
>
> Hi Randy,
>
> Randy Yates wrote:
>
>>
>> Carlos,
>>
>> Although I haven't read all of the dozens of
>> articles in this thread, would the following help?
>>
>>   Let X(t) be a zero-mean, IID, continuous-time process
>>   with variance of sigma^2. Now we know that since this
>>   function is IID, it has a white PSD and therefore its
>>   autocorrelation function at lag 0, R_XX(0), should be
>>   b*delta(tau), where delta(tau) is the usual Dirac delta
>>   function and b is some constant.
>>
>>   However, we also know that the autocorrelation function R_XX(tau)
>>   is defined to be E[X(t)*X(t+tau)], and therefore that R_XX(0)
>>   is E[X^2] (where we have dropped the dependence on t since the
>>   process is IID). Therefore in this sense R_XX(0) = E[X^2] = sigma^2,
>>   which is not infinite.
>>
>> Is this the contradiction you speak of?
>
>
> Maybe.  As a matter of fact, this may be at the heart of
> what I perceive as a contradiction.  What surprises me is
> that you show the two "contradictory" derivations, but
> don't explain why the contradiction arises, or whether it is
> indeed a contradiction (what I mean is that it might be an
> apparent one only).

It is indeed a contradiction to me, and I cannot resolve it.

> In fact, now that you "equate" the results of those two
> different approaches, I guess one problem (maybe part of
> the same problem?) is that I'm not sure I understand the
> justification of defining the PSD and Rxx as a Fourier
> transform pair.

The justification of the definition? Are you asking how
Wiener and Khinchine proved the theorem that the PSD
is the Fourier transform of the autocorrelation function?
That I cannot say - it is an interesting question, though.

>  I mean, I remember from my Signals and
> Systems course, understanding the intuitive interpretation
> of it:  the more high-frequency contents, the less correlated
> close samples would be -- in particular, for white noise,
> with strong frequency contents going to infinity, samples
> arbitrarily close are still uncorrelated.  That made (and
> still makes) perfect sense to me.
>
> Fine.  So, why the contradiction?  (or the "apparent"
> contradiction?)

Because Rxx(tau) != F^(-1)[Sxx(w)], and they should be equal.

> In fact, how would Rxx(0) be related to the average
> electrical power of the signal?  (the average "watts"
> that a voltage signal would dissipate on a 1-ohm resistor).
> I seem to see clearly how the E{x^2} is related to the
> average power of a voltage signal, but not so sure that
> I understand the link with Rxx.

Rxx(0) = E[X^2(t)], by definition. Remember, Rxx(tau) is
defined to be E[X(t)*X(t+tau)], so when tau = 0, this
yields E[X^2(t)].

>> PS: I'm a little put off that you haven't responded to my post on
>> the units of PSD. Did you see it?
>
>
> Yes!!  And I apologize for letting it go unnoticed!!
> The thing is that I read it together with the other
> messages that finally made me decide that I was starting
> to sound like a troll, and thus decided to withdraw...
> My sincere apologies!

Acknowledged. Thanks, Carlos.

> Ironically, that was *the* message that best seemed to
> address my initial point  (I mean, it was the one message
> that did not give me that feeling of "this guy didn't
> understand what I was asking"  -- not that I'm saying
> that the others didn't understand; but almost all the
> other messages (the initial ones, at least) gave me, to
> some extent, the impression that they were going off on a
> tangent).
>
> When you showed me that the units of the PSD are indeed
> watts per hertz, that seemed like a very precise attempt
> at convincing me of what the PSD is... (still, after
> reading that message, I could not seem to reconcile
> ideas that seemed contradictory, and then again, that
> contributed to my decision that "maybe I should go
> re-study these concepts before I continue to bother
> these guys"  :-))
>
> So, I'm hoping that you won't be too mad at me and will
> be willing to elaborate a bit on this contradiction of
> Rxx(0) being infinite while E{x^2} being finite.

Carlos, I am indeed not mad with you. In fact, I welcome
these questions - it helps remind me and others of definitions
and concepts that tend to fade out unless you revisit them
often.

Regarding the contradiction, I hope my responses earlier in
this post have clarified things.

PS: Go to sci.math, where I posed this very question (in fact,
I had cut and pasted it here in my last message to you). As of
a few minutes ago, I hadn't yet gotten any responses.
--
%% Fuquay-Varina, NC            %       'cause no one knows which side
%%% 919-577-9882                %                   the coin will fall."
%%%% <yates@ieee.org>           %  'Big Wheels', *Out of the Blue*, ELO

```
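The Wiener-Khinchine relation Randy invokes can at least be checked numerically: for a process with a known autocorrelation, an averaged periodogram should match the Fourier transform of R_xx. A sketch assuming NumPy, using an MA(1) process x[n] + x[n-1] built from unit-variance white input, whose autocorrelation is R(0) = 2, R(+-1) = 1, and zero elsewhere:

```python
import numpy as np

rng = np.random.default_rng(2)
nseg, nfft = 800, 256
x = rng.normal(0.0, 1.0, size=(nseg, nfft + 1))
y = x[:, 1:] + x[:, :-1]  # MA(1): R(0)=2, R(+-1)=1, R(k)=0 otherwise

# PSD estimate: periodogram |FFT|^2 / N, averaged over independent segments.
psd = np.mean(np.abs(np.fft.fft(y, axis=1)) ** 2, axis=0) / nfft

# Wiener-Khinchine: S(w) = sum_k R(k) e^{-jwk} = 2 + 2 cos(w).
w = 2.0 * np.pi * np.fft.fftfreq(nfft)
s_theory = 2.0 + 2.0 * np.cos(w)

print(np.max(np.abs(psd - s_theory)))  # small: estimate tracks the transform of R
```

The agreement improves as more segments are averaged, which is the usual numerical face of the theorem.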
```Carlos Moreno wrote:

...
>
> I'm really hoping that someone will be able and
> willing to make me understand what's happening!!
> (so that I can leave you guys alone AND sleep
> peacefully at the same time  :-)) -- I'm serious!!
>
> Cheers,
>
> Carlos
> --

By applying correct geometric reasoning to a flawed figure, I can prove
that two lines which intersect are parallel. So what?

Consider a line of LEDs a foot apart. Each draws 10 milliwatts. It's not
absurd to say that if the line extends from -infinity to infinity, then
the power represented by them in the aggregate is infinite. Absurdity
lies in the belief that a line of such extent must follow rules that
apply to things that can be.

Jerry
--
Engineering is the art of making what you want from things you can get.

```
```Randy Yates wrote:

> It is indeed a contradiction to me, and I cannot resolve

Hmmm...  This thread is becoming interesting!!  :-)

>> In fact, now that you "equate" the results of those two
>> different approaches, I guess one problem (maybe part of
>> the same problem?) is that I'm not sure I understand the
>> justification of defining the PSD and Rxx as a Fourier
>> transform pair.
>
> The justification of the definition? Are you asking how
> Wiener and Khinchine proved the theorem that the PSD
> is the Fourier transform of the autocorrelation function?

Well, I wasn't looking for something as rigorous as a proof
of the theorem.  Just an interpretation of the definition
that helps me understand it better (as opposed to an
interpretation *of the results* or an interpretation of
the consequences of such definition -- that, I think I
understand relatively well).

>> Fine.  So, why the contradiction?  (or the "apparent"
>> contradiction?)
>
> Because Rxx(tau) != F^(-1)[Sxx(w)], and they should be equal.

Ok.

>> In fact, how would Rxx(0) be related to the average
>> electrical power of the signal?  (the average "watts"
>> that a voltage signal would dissipate on a 1-ohm resistor).
>> I seem to see clearly how the E{x^2} is related to the
>> average power of a voltage signal, but not so sure that
>> I understand the link with Rxx.
>
>
> Rxx(0) = E[X^2(t)], by definition.

Huh??

I thought Rxx(0) was defined via the time-domain correlation

            oo
           /
           |
Rxx(T) =   |   X(t) X(t+T) dt
           |
           /
            -oo

evaluated at T = 0, which gives the integral of X^2(t).

Notice that in that case, this represents the total energy
of the signal...  Hmmm, though something confuses me again:
this would be the energy of *one particular* realization.
(so, I think this brings me back to my original doubt:
what the hell does PSD and Rxx represent for a random
process?)

> Carlos, I am indeed not mad with you.

I never actually thought you'd be mad, of course!  :-)

> PS: Go to sci.math, where I posed this very question

Still unanswered.  We'll wait.  Maybe in that reference I
was pointed to (Priestley, vol. 1) such a proof/justification
is given?

Thanks,

Carlos
--

```
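Carlos's integral is the deterministic (energy) correlation of one realization, while the autocorrelation in the Wiener-Khinchine theorem is the ensemble average E[X(t)X(t+tau)]. For a stationary power signal the deterministic analogue is the time-average correlation, and for an ergodic process the two coincide. A numeric sketch (assuming NumPy; names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(0.0, 1.0, size=200_000)  # one realization of an IID process

def time_avg_corr(x, lag):
    """Time-average correlation <x(t) x(t+lag)> over one realization --
    the power-signal analogue of the energy integral (which itself
    diverges for a signal that lasts forever)."""
    n = len(x)
    return np.mean(x[:n - lag] * x[lag:])

# For an ergodic process this reproduces the ensemble R_xx(lag):
r0 = time_avg_corr(x, 0)  # ~ sigma^2 = 1
r1 = time_avg_corr(x, 1)  # ~ 0
print(r0, r1)
```

This is why, in practice, one realization suffices to estimate the PSD of an ergodic process even though the PSD is defined over the ensemble.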
```Jerry Avins wrote:

> By applying correct geometric reasoning to a flawed figure, I can prove
> that two lines which intersect are parallel. So what?
>
> Consider a line of LEDs a foot apart. Each draws 10 milliwatts. It's not
> absurd to say that if the line extends from -infinity to infinity, then
> the power represented by them in the aggregate is infinite. Absurdity
> lies in the belief that a line of such extent must follow rules that
> apply to things that can be.

Jerry,

You seem to keep missing my point.  What you're saying is more
or less equivalent to deriving an incorrect property of sinusoidal
signals and claiming that that's because complex numbers do not
exist in nature, and thus it is absurd to pretend that we'll get
a meaningful result when applied to something that exists in
nature, etc. etc.

Complex numbers *are* an absurdity, the way you define absurdity.

They are a mathematical abstraction that does not correspond
to anything that exists in our universe (at least nothing that
we humans are aware of).

Complex numbers simply follow *the mathematical rules* established
by their purely mathematical definition.  Following **those**
mathematical definitions, rules, and axioms, you derive other
results.

Given the definition that i^2 = -1, Euler derived, following
that definition and the usual axioms of mathematics/algebra/
calculus, that e^(x+iy) = e^x (cos y + i sin y).

You could claim that anything that you derive from there is plain
wrong, since the imaginary unit does not exist in nature.

Same thing with the delta -- you claim that none of what I've said
makes sense because white noise cannot exist in nature!

Then how do you justify that results derived by applying the
properties and the definition of Dirac's delta are indeed correct??
Dirac's delta most certainly cannot exist in nature; however,
the results that you obtain using that mathematical abstraction
are verifiable mathematically (and, from the practical point
of view, they are also relevant, since they *very closely*
approximate the way certain physical systems behave).  You convert
a differential equation to its integral form and apply the
definition of the delta (a definition that has no place in the real
world), which implies that its integral from -epsilon to epsilon
is 1 (for every epsilon > 0), and you obtain results that do not
contradict any other result obtained using mathematical constructs.

I insist: it does not make sense to use the argument "white noise
cannot exist in nature" to justify a contradiction that arises
in purely mathematical terms -- two different ways of analytically
calculating something that should be equivalent yield different
results.  Your argument would be valid if I claimed that, in an
actual experiment, I expected to obtain this or that and then
obtained something different from what the formulas tell
me...  Then your argument *might* be valid (emphasis on
*might*; the conclusions you draw from the properties of complex
numbers, Dirac's delta, etc., *are indeed* verifiable in practice,
with an accuracy given by the precision of the model, the
implementation, and the measurements).

Carlos
--

```
```Carlos Moreno wrote:
[...]

My feeling is that you're mixing things that
seem to be the same, but are not.

The PSD of a stochastic process is the "power"
or "energy" the process can have in terms of
probability, not in terms of physical energy.

In the case of white noise, it means that there
is no limitation on the possible sequences.
If not white, then the process is limited in
its "possible" realizations, and this is reflected
in the PSD as well.

The energy or power you calculate for a particular
realization is something else; it does not relate
to the PSD in any physical sense.
It just happens that they can be exchanged, but they
should not be.

bye,

--

piergiorgio

```
```Piergiorgio Sartor wrote:
> Carlos Moreno wrote:
> [...]
>
> My feeling is that you're mixing things that
> seem to be the same, but are not.
>
> The PSD of a stochastic process is the "power"
> or "energy" the process can have in terms of
> probability, not in terms of physical energy.

Hmm...  Interesting  (i.e., interesting how for
so many years I've misunderstood all these
concepts  :-)  I mean, :-( ).

I would ask you to define "power the process can
have in terms of probability"...  But I guess
this falls again in the "I-better-go-do-my-
homework" category  (homework not in the sense
that this question is part of some homework I
got;  but in the sense that I have my own work
to do in investigating these things)

> In the case of white noise, it means that there
> is no limitation on the possible sequences.

Can you elaborate??  No limitation on the possible
sequences meaning what exactly?  (what kind of
limitation -- or lack thereof -- are we talking about?)

> The energy or power you calculate for a particular
> realization is something else; it does not relate
> to the PSD in any physical sense.

I'm now confused, then, about why the units of the PSD
are so conveniently watts per hertz?

Thanks,

Carlos
--

```
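On the watts-per-hertz question: the units are not a coincidence. If a two-sided PSD estimate is scaled as |FFT|^2 / (N * fs), its units are W/Hz, and summing it times the bin width over the full band [-fs/2, fs/2) returns the average power in watts (which, into 1 ohm, is just E[X^2] = Rxx(0)). A sketch assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(4)
fs = 1000.0   # sample rate in Hz
sigma = 3.0
nseg, nfft = 200, 512
x = rng.normal(0.0, sigma, size=(nseg, nfft))  # volts across 1 ohm

# Two-sided PSD estimate in watts per hertz (averaged periodogram / fs).
psd = np.mean(np.abs(np.fft.fft(x, axis=1)) ** 2, axis=0) / (nfft * fs)
df = fs / nfft  # bin width in Hz

# (W/Hz) * Hz summed over the whole band gives average power in watts:
power = np.sum(psd) * df  # ~ sigma^2 = 9 W, matching R_xx(0) = E[X^2]
print(power)
```

By Parseval's relation this integral equals the time-average of x^2 exactly, which is what ties the "probabilistic" PSD back to physical watts.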
```Carlos Moreno wrote:

> Jerry Avins wrote:
>
>> By applying correct geometric reasoning to a flawed figure, I can prove
>> that two lines which intersect are parallel. So what?
>>
>> Consider a line of LEDs a foot apart. Each draws 10 milliwatts. It's not
>> absurd to say that if the line extends from -infinity to infinity, then
>> the power represented by them in the aggregate is infinite. Absurdity
>> lies in the belief that a line of such extent must follow rules that
>> apply to things that can be.
>
>
> Jerry,
>
> You seem to keep missing my point.   ...

It's worse than that: we're missing each other's points. Mine are these:

1. When you integrate from -infinity to +infinity something that
exists only locally in time, the result is of no practical interest.
Maybe that's too strong. Put another way: integrating, over all
frequencies, a function that represents power but is valid only
within a limited band can yield the surprising conclusion that
troubles you.

2. The assumption that successive samples of a signal are statistically
independent is only valid when the bandwidth of the signal is less than
half the sample rate. Your assumed properties of the signal should assure
you that the samples you hypothesize, and the calculations you perform on
them, are worthless.

Jerry
--
Engineering is the art of making what you want from things you can get.

```
```Carlos Moreno wrote:
> Randy Yates wrote:
>>
>> Rxx(0) = E[X^2(t)], by definition.
>
>
> Huh??
>
> I thought Rxx(0) was defined as the time-domain correlation:
>
>             oo
>            /
>           |
> Rxx(T) =  |   X^2(t) dt
>           |
>          /
>           -oo
>
> Notice that in that case, this represents the total energy
> of the signal...  Hmmm, though something confuses me again:
> this would be the energy of *one particular* realization.
> (so, I think this brings me back to my original doubt:
> what the hell does PSD and Rxx represent for a random
> process?)

Yup, there's that definition too. But the autocorrelation function
which the Wiener-Khinchine theorem utilizes is the probabilistic
one.

>> PS: Go to sci.math, where I posed this very question
>
>
> Still unanswered.  We'll wait.  Maybe in that reference I
> was pointed to (Priestley, vol. 1) such a proof/justification
> is given?

Maybe. If you find it there, please let me know.
--