# Rigorous definition of the Spectral Density of a random signal?

Started October 25, 2003
```Carlos Moreno wrote:

...

>
> My point is that, the white noise as I defined it, as a
> mathematical abstraction, has properties that can be
> rigorously defined according to the laws of mathematics.
>
> The way I see it, the power of the white noise that I
> described *can not* be infinite.  In fact, calculating
> the variance is a trivial exercise:  E{x^2} = integral
> from -1 to 1 of (1/2) x^2 dx.  That gives you the average
> power (assuming a normalized 1-ohm resistor, etc. etc.),
> which is definitely not infinite.
>
> Yes, white noise is a mathematical abstraction;  the
> disagreement is on the mathematical properties of this
> abstraction.

Why does it surprise you that the calculated properties of impossible
abstractions are themselves not possible? White noise as you defined it
is characterized in volts per root Hz. (Or watts per Hz.) It is a useful
abstraction where the actual bandwidth is limited, even though it
implies infinite power for infinite bandwidth. That doesn't bother me
because neither infinity can be realized.

> You mention also the delta.  As an example, if someone
> tells you that the value of the delta at t = 0 is 1??
> You will undoubtedly tell them that they're wrong!!

You would do well to doubt.  I might, however, ask "1 what?"

> It doesn't matter that a delta can not exist in practice;
> the mathematical properties that define a delta are
> specific and rigorous.

Well, it behooves us to be precise. What do you mean by "value" above?
Certainly not width or height. Some would say "area", others "strength".

> *Gaussian* white noise with zero mean and variance
> sigma^2.  Well, if the variance is sigma^2, then the
> average power is not infinite!

Are you sure that sigma^2 represents power here? There is no limit to
the magnitude of true Gaussian noise, even though the frequency of a
particular amplitude's being exceeded decreases rapidly with amplitude.
If there is just one infinite-amplitude event in a year, the average
power is rather large, to say the least.

It doesn't do to say that impossible conditions are OK because they are
merely mathematical constructs, and then to reject the conclusions
mathematically constructed from them.

Jerry
--
Engineering is the art of making what you want from things you can get.

```
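Carlos's variance computation above is easy to check numerically. A minimal sketch (the grid size and sample count are arbitrary choices of mine, not from the thread):

```python
import numpy as np

# Carlos's integral: E{x^2} = integral from -1 to 1 of (1/2) x^2 dx = 1/3.
# Checked two ways: a Riemann sum of the integral, and a Monte Carlo estimate.
x = np.linspace(-1.0, 1.0, 200_001)
dx = x[1] - x[0]
analytic = np.sum(0.5 * x**2) * dx               # Riemann sum, should be ~1/3

samples = np.random.default_rng(0).uniform(-1.0, 1.0, 1_000_000)
monte_carlo = np.mean(samples**2)                # sample estimate of E{x^2}

print(analytic)      # ~0.3333
print(monte_carlo)   # ~0.333, within Monte Carlo error
```

Both agree on a finite average power of 1/3 watt into the normalized 1-ohm resistor, which is exactly Carlos's point about the per-sample statistics.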
```Carlos Moreno <moreno_at_mochima_dot_com@x.xxx> wrote in message news:<iwkob.13706\$RG1.441838@wagner.videotron.net>...

> *Gaussian* white noise with zero mean and variance
> sigma^2.  Well, if the variance is sigma^2, then the
> average power is not infinite!

Carlos:

**You** may talk about Gaussian white noise with zero
mean and variance sigma^2, but in the context of
continuous-time signals, this does not make sense.
In most systems, the thermal noise at the output of
a filter can be modeled as a Gaussian random process
whose PSD (that word you hate!) is proportional to
|H(w)|^2.  We can derive this result using the standard
general theory of second-order (i.e. finite power)
random processes by **pretending** that the filter input
is a Gaussian random process with PSD = constant for all w.
This hypothetical process is called Gaussian white noise,
and generalized Fourier theory (which allows the notion
of Fourier transforms of **power** signals) allows us to
pretend that the input process has autocorrelation that
is a delta function.  But, it does not make sense to talk
about the variance of white noise: it is a meaningless
concept.  For continuous-time signals, the only meaning to
be ascribed to white noise -- Gaussian or not -- is
that it is a hypothetical process defined by the property
that when it is passed through a filter with transfer function
H(w), it produces a random process with PSD that is proportional
to |H(w)|^2. We cannot talk of its properties except in terms
of what we can observe, and any observation necessarily
implies some filtering.  For example, we cannot talk of the
properties of the random *variable* X(5), say, and claim that
it has zero mean or is Gaussian with some specified variance
(possibly infinite), because we cannot sample the process
at t = 5 instantaneously, even though we often pretend in DSP
circles that we can and do get an instantaneous sample.  This
pretence works because the fact that an actual sampler switch
will stay closed for a small nonzero period of time does not matter
very much in typical DSP applications: the signal being sampled has
typically been filtered anyway.  But, to apply this notion of
instantaneous sampling to a white noise process just leads to
the same sort of dilemmas that you are stuck with.

For **discrete-time** random processes, the concept
of Gaussian white noise is a sequence of independent
zero-mean Gaussian random variables with fixed finite
variance sigma^2.  The power here is indeed finite (and
equal to sigma^2), but bear in mind if we think of this
discrete-time process as having been obtained from a
continuous-time process via sampling, then the continuous-time
white noise has, implicitly or explicitly, been filtered
before sampling and thus has finite variance.

Now, you can say that you are very mathematical and rigorous
and don't care a whit about practical notions, and have given
an explicit construction of a continuous-time random process
with finite variance such that X(t) and X(t') are independent
if t and t' are two different real numbers.  Now, why don't
you figure out what a typical sample function (or realization)
of this process is, whether such a realization can actually be
exhibited (remember that instantaneous changes in voltage will
require that infinite currents instantaneously change the
charges in various capacitors in the circuit!), and whether
second-order random process theory can be applied to this
process that you have described?  Since you claim that you are
not trolling or seeking help on homework, maybe you should be asking

--Dilip Sarwate
```
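Dilip's discrete-time picture can be sketched numerically: i.i.d. zero-mean Gaussian samples pushed through a filter produce an output PSD proportional to |H(w)|^2. The 2-tap moving-average filter below is my own choice, purely for illustration:

```python
import numpy as np

# Discrete-time Gaussian white noise: i.i.d. N(0, sigma^2) samples.
# Filtering through h should yield an output PSD of sigma^2 * |H(w)|^2.
rng = np.random.default_rng(1)
sigma2 = 1.0
nfft, trials = 512, 200
h = np.array([0.5, 0.5])                        # illustrative 2-tap filter

psd = np.zeros(nfft)
for _ in range(trials):
    x = rng.normal(0.0, np.sqrt(sigma2), 4 * nfft)  # white input, flat PSD
    y = np.convolve(x, h, mode="valid")             # filtered output
    psd += np.abs(np.fft.fft(y[:nfft]))**2 / nfft   # periodogram of one segment
psd /= trials                                       # average over realizations

H = np.fft.fft(h, nfft)                 # filter response on the FFT grid
expected = sigma2 * np.abs(H)**2        # theoretical output PSD
```

The averaged periodogram hugs `expected` across the band: the input's flat (hypothetical) PSD is only ever observed through the filter, exactly as the post argues.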
```Jerry Avins wrote:

> Why does it surprise you that the calculated properties of impossible
> abstractions are themselves not possible?

Because mathematical abstractions follow mathematical rules
and axioms, not the possibility or impossibility of such
abstraction to exist in nature.

The great majority of mathematical constructs are abstractions
that are not possible in nature (I would almost dare to say *all*
mathematical abstractions).

The simplest and most ubiquitous abstraction:  the notion of
"continuous" values.  It doesn't seem to exist in nature; yet
we do use differential equations and differential calculus to
obtain *real* results (that do approximate the way physical
systems behave).

(and let's not get started with complex numbers!!  Just
because complex numbers do not exist in nature, would you
dismiss as invalid all of the Fourier analysis theory or
anything that someone derives based on complex numbers
properties?)

What I'm trying to say (which applies to the white noise
discussion) is:  if we set up a mathematical abstraction
with certain properties (properties that do not contradict
other more fundamental mathematical axioms), then anything
that we derive from it will have to be consistent with that
and other mathematical axioms!

White noise should not be an exception!  So white noise can
not exist in nature?  So what?  Complex numbers don't either,
and the delta function doesn't either, and yet we don't
dismiss the properties we derive from them.

See, that's my point:  what I'm seeing when thinking of PSD
as density of power is a contradiction *in mathematical
terms*, not a contradiction with physical properties or
terms.

>> *Gaussian* white noise with zero mean and variance
>> sigma^2.  Well, if the variance is sigma^2, then the
>> average power is not infinite!
>
> Are you sure that sigma^2 represents power here?

It represents "mean power" or "average" power, of course
I'm sure!  At least it has to!

> There is no limit to
> the magnitude of true Gaussian noise

I know.  And it doesn't matter:  the average is still
finite!  The expected value of x^2, where x is a random
variable with gaussian distribution, is the variance of
it.

> If there is just one infinite-amplitude event in a year, the average
> power is rather large, to say the least.

Actually, there is never an event with infinite-amplitude.
At any time, the value of a Gaussian variable is an *actual
number*.  There is no such thing as a function, or a random
variable, taking "infinite-value".

> It doesn't do to say that impossible conditions are OK because that are
> merely mathematical constructs

Hmmm, I know that we may be shifting to the "philosophical
grounds" here, but I have to disagree with that.  As I said,
virtually all mathematical constructs are indeed impossible
to achieve in nature.  Many of them are things that we use
on a daily basis in DSP, and in general in signal analysis
and the like (e.g., the delta, complex numbers, differential
equations and differential calculus in general, integrals
from 0 to infinity, or from -infinity to infinity).

BTW, notice that I'm not saying that impossible conditions
are ok:  I'm saying that mathematical constructs representing
impossible conditions may be ok -- as long as they're ok from
the point of view of the mathematical rules that we use to
deal with them.

Applying those mathematical constructs to represent physical
things, that's a different thing, and I do agree with what
you mention about dealing with "band limited" versions of
white noise, etc.

Carlos
--

```
```Carlos Moreno wrote:

[...]

Maybe I'm wrong, but I think the sigma^2 refers
to the Gaussian variable, while the PSD refers
to the Gaussian process, i.e. a sequence of
Gaussian variables.

So the process has infinite "power", while the
variable itself has statistical power = sigma^2.

Does it fit to you?

bye,

--
Piergiorgio Sartor

```
```Carlos Moreno wrote:

> Jerry Avins wrote:
>
>> Why does it surprise you that the calculated properties of impossible
>> abstractions are themselves not possible?
>
>
> Because mathematical abstractions follow mathematical rules
> and axioms, not the possibility or impossibility of such
> abstraction to exist in nature.

This is silly. I can prove that if the moon is made of green cheese,
then you are your own grandmother. In a proof, any false premise can
lead to any conclusion, true or false. Just so, a non-realizable
abstraction can lead to a non-realizable property. That's OK. What's not
OK is to claim that it really is realizable, or that a non-realizable
conclusion invalidates the argument. In fact, all it shows is that the
abstraction was extended outside its useful domain. Just as functions
have regions of convergence, abstractions have regions of applicability.

Be happy!

>
...

Jerry
--
Engineering is the art of making what you want from things you can get.

```
```Piergiorgio Sartor wrote:
> Carlos Moreno wrote:
>
> [...]
>
> Maybe I'm wrong, but I think the sigma^2 refers
> to the Gaussian variable, while the PSD refers
> to the Gaussian process, i.e. a sequence of
> Gaussian variables.
>
> So the process has infinite "power", while the
> variable itself has statistical power = sigma^2.
>
> Does it fit to you?

I'm not sure...  The first impression is to say "no
it doesn't", but I'm not quite sure that I'm not
missing some subtlety.

I know I may be repeating myself, but applying your
concepts to the random process that I described
before  (x(t) is independent of the value of x at
any time other than t, and has uniform distribution
in [-1,1]).

Any possible realization of such process must have
an average power less than one.  In fact, for any
possible realization of such process, the power at
any given time (as an "instantaneous" measure) must
be less than 1 (since the magnitude of the signal
can not be greater than one).

Also, the variance of the signal at any given time
(i.e., the variance with respect to all possible
realizations of the process) is also less than one.

So, how could a concept that leads to "infinite
power" fit in here?   Notice that this example
emphasizes a bit more the impossibility of having
infinite power, but it is indeed equivalent to the
example of Gaussian white noise.

Carlos
--

```
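Carlos's boundedness argument is easy to exhibit on a discrete-time stand-in for his process (a sampled approximation, which I am assuming here since the process itself is continuous-time):

```python
import numpy as np

# Discrete stand-in for Carlos's process: independent uniform[-1, 1] samples.
# Every realization is bounded by 1 in magnitude, so its time-average power
# (mean of x^2) can never exceed 1; in fact it converges to 1/3.
rng = np.random.default_rng(2)
realizations = rng.uniform(-1.0, 1.0, size=(100, 10_000))
avg_power = np.mean(realizations**2, axis=1)   # time-average power, per realization

print(avg_power.max())    # every realization stays below 1
print(avg_power.mean())   # ensemble average, close to 1/3
```

No sampled realization ever shows more than unit power, which is the tension Carlos is pointing at: the "infinite power" only appears when the flat PSD is integrated over all frequencies.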
```Dilip V. Sarwate wrote:

> [...]
> --Dilip Sarwate

I guess I can only thank you for a detailed and careful
explanation of the terms causing my confusion... (in
fact, I'll have to re-read it carefully :-))

I was in the process of writing a long and detailed
message in response to yours, but to be honest, *I*
myself read the message and it *really* started to
sound like I'm trolling...  :-(

My conclusion is that I will definitely have to figure
out a way to reconcile ideas that in my mind are in
conflict.
At some point during this discussion I thought about
the possibility that this is indeed one of those
"mathematical paradoxes"  (something like the notion
that there is exactly the same number of points in
the interval (0,1) as in the interval (0,infinity),
given that I can define a one-to-one, bijective
function that maps (0,1) to (0,infinity)).

This paradox is "explained" by the fact that the set of
real numbers is not countable.  (I think -- maybe
mathematicians out there will scream at me telling
me that I have no clue of what I'm saying  :-))

See, the thing is that I'm not able to see if this
PSD thing with white noise is one of those paradoxes,
and if so, where would the origin of that paradox be.

Oh well, I guess I bothered you guys enough, so I'll
stop.  Thanks to all that participated!!

Cheers,

Carlos
--

```
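The bijection Carlos mentions can be made concrete. One explicit choice (my example, not necessarily the map he had in mind) is f(x) = x / (1 - x), which sends (0,1) onto (0,infinity):

```python
# f maps (0, 1) onto (0, infinity); f_inv undoes it, so f is one-to-one
# and onto.  This particular map is just one illustrative choice.
def f(x):
    return x / (1.0 - x)

def f_inv(y):
    return y / (1.0 + y)

# Round-tripping through f and f_inv recovers x everywhere on (0, 1):
for x in (0.1, 0.5, 0.9, 0.999):
    assert abs(f_inv(f(x)) - x) < 1e-12

print(f(0.999))   # values near 1 map to arbitrarily large outputs
```

As x approaches 1 the image runs off to infinity, so the two intervals are put in one-to-one correspondence despite their very different "lengths".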
```Carlos Moreno wrote:

...

> I know I may be repeating myself, but applying your
> concepts to the random process that I described
> before  (x(t) is independent of the value of x at
> any time other than t, and has uniform distribution
> in [-1,1]).

Then it isn't bandlimited, so you can't draw legitimate conclusions from
the samples.

...

Jerry
--
Engineering is the art of making what you want from things you can get.

```
```Carlos Moreno wrote:

> I know I may be repeating myself, but applying your
> concepts to the random process that I described
> before  (x(t) is independent of the value of x at
> any time other than t, and has uniform distribution
> in [-1,1]).

OK...

> Any possible realization of such process must have
> an average power less than one.  In fact, for any
> possible realization of such process, the power at
> any given time (as an "instantaneous" measure) must
> be less than 1 (since the magnitude of the signal
> can not be greater than one).

I do not get this.
A possible realization is:

..., 0, 0, 1, 0, 0...

which has quite a lot of spectral energy, I would say.

> Also, the variance of the signal at any given time
> (i.e., the variance with respect to all possible
> realizations of the process) is also less than one.

That signal does not have any variance; the
random variable has variance.

> So, how could a concept that leads to "infinite
> power" fit in here?   Notice that this example
> emphasizes a bit more the impossibility of having
> infinite power, but it is indeed equivalent to the
> example of Gaussian white noise.

Statistical power, but I do not see the point.

In my view the process you described has infinite
power, why not?

bye,

--

piergiorgio

```
```I know I said I would leave you guys alone and let the
discussion rest, but it feels kind of rude to leave you talking alone...

Piergiorgio Sartor wrote:
> Carlos Moreno wrote:
>
>> I know I may be repeating myself, but applying your
>> concepts to the random process that I described
>> before  (x(t) is independent of the value of x at
>> any time other than t, and has uniform distribution
>> in [-1,1]).
>
>> Any possible realization of such process must have
>> an average power less than one.  In fact, for any
>> possible realization of such process, the power at
>> any given time (as an "instantaneous" measure) must
>> be less than 1 (since the magnitude of the signal
>> can not be greater than one).
>
> I do not get this.
> A possible realization is:
>
> ..., 0, 0, 1, 0, 0...

???

How can that be a possible realization?  The process
I described is continuous-time.  But regardless, I
don't understand what you mean with "a lot of spectral
energy".

I'm talking about signals representing voltage, and
thus, the power I'm referring to is electric power,
which, as an instantaneous measure, at time t, is
equal to x(t)^2  (divided by the value of the
resistor to which it is applied, but let's assume
a normalized 1 ohm resistor)

If we talk about the average power of one particular
realization (let's call it the "time average power",
to avoid confusion with any other parameter), then
that would be given by:

TAP  =  lim (T->oo)  (1/(2T)) * integral from -T to T of x(t)^2 dt

That integral is less-than-or-equal to an integral
with the same limits and a function that is >= x(t)^2
for all t.

So, such function could be f(t) = 1.  If you plug
f(t) = 1 (replacing x(t)^2) in the above formula,
you obtain that the average power  (time average
power) is 1.  Thus, the average power of the signal
(the particular realization of the process I described)
is less than or equal to one.

If all possible realizations have a time average
power less than one, then the ensemble average power
(i.e., the average over all possible realizations
of the time average power) must also be less than
or equal to one.

I'm using basic definition and basic properties of
integrals, and I reach a conclusion that contradicts
the fact that the white noise is found to have
infinite power, if calculated from the fact that
the constant-valued PSD represents density of power.

I feel so frustrated that I haven't been able to
communicate what's in my mind!!  And that is a
fact, because I am seeing a contradiction that
no-one else sees, and no-one has been able to make
me understand why there isn't a contradiction, or
why such contradiction is to be expected  (no, I'm
still not buying the "since white noise is impossible
to achieve in nature..." -- again, the way I see it,
the contradiction arises in purely abstract mathematical
terms, using the rules and axioms of mathematics, which
apply to mathematical constructs).

> In my view the process you described has infinite
> power, why not?

Well, the way I see it, because of the above...
Maybe I'm still using the wrong terminology?  Or
maybe I'm confusing some terms or properties??

I'm really hoping that someone will be able and
willing to make me understand what's happening!!
(so that I can leave you guys alone AND sleep
peacefully at the same time  :-)) -- I'm serious!!
```
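Carlos's TAP bound can be mirrored on a sampled stand-in of his process (a discrete approximation; the sample count below is an arbitrary choice of mine):

```python
import numpy as np

# Discrete version of TAP = lim (1/(2T)) * integral of x(t)^2 dt:
# for a sampled realization, the integral becomes a mean of squares,
# and the bound x(t)^2 <= 1 becomes tap <= 1.
rng = np.random.default_rng(3)
x = rng.uniform(-1.0, 1.0, 1_000_000)   # one realization, uniform in [-1, 1]

tap = np.mean(x**2)                     # time-average power of this realization

print(tap)                              # well below the bound of 1
```

Bounding x(t)^2 by the constant 1, exactly as the post does, forces the mean of squares under 1, and the observed value sits near 1/3 rather than anywhere close to infinite.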