# Rigorous definition of the Spectral Density of a random signal?

Thread started October 25, 2003
Carlos Moreno wrote:
> They do not give a definition of what the
> PSD is, but just spit a bunch of blah-blah about what I
> would call "the consequences from an intuitive point of
> view of what PSD is"  :-\  You see how one can get pretty
> frustrated  :-(

Yes.  Look at Priestley; it focuses on the statistical properties
and is very rigorous and technical.  It's one of the top few
references on statistical properties of periodograms and
related statistics.

Part of the issue is context.  For example, if you believe your
data contains an unknown single sinusoid and Gaussian noise, then
there is a simple interpretation of the PSD as the log of the
(marginal) posterior probability density for the frequency of
the sinusoid (this is from the Bayesian viewpoint; there is an
analogous frequentist result in terms of least squares estimation
of the frequency).  If instead you believe the signal has a smooth
spectrum, things get more complicated.  For example, Whittle showed
that, under certain conditions, you can use the PSD of the data to
easily construct an approximate likelihood function for the PSD of the
underlying process (the Whittle likelihood).  So just from these two
examples perhaps you can see that there is likely to be more than one
"correct" answer, depending on the problem you are trying to solve.
Priestley discusses much of this
in great detail (though not any of the Bayesian stuff; for that
look at bayes.wustl.edu for Jaynes's article on Bayesian spectrum
and chirp analysis, and at Bretthorst's book).
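As a small numerical illustration of the first case (a sketch only; the seed, sample size, and frequency below are invented for the example, and the periodogram peak merely stands in for the posterior mode of the Jaynes/Bretthorst result):

```python
import numpy as np

# For data = one sinusoid + Gaussian noise, the Schuster periodogram is
# (approximately) proportional to the log of the marginal posterior density
# for the sinusoid's frequency, so its peak is the natural frequency estimate.
rng = np.random.default_rng(0)
n = 512
t = np.arange(n)
f_true = 0.12                                  # true frequency, cycles/sample
x = np.cos(2 * np.pi * f_true * t) + rng.normal(0.0, 1.0, n)

freqs = np.fft.rfftfreq(n)
periodogram = np.abs(np.fft.rfft(x)) ** 2 / n  # Schuster periodogram

f_hat = freqs[np.argmax(periodogram)]          # peak location ~ posterior mode
print(f_hat)                                   # lands very close to f_true
```

With a signal this strong the peak bin sits within one frequency-bin spacing (1/512) of the true frequency.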

-Tom

--

To respond by email, replace "somewhere" with "astro" in the address above.

Rune Allnor wrote:

>>If the PSD represents density of power, one contradiction that
>>I find is:  suppose that you define a signal as the following:
>>the value of x(t) at each time t is an independent random
>>variable uniformly distributed in the interval (-1,1)  (when I
>>say independent, I mean independent of x at *any other time*).
>>
>>The PSD of this signal is necessarily a constant value for all
>>frequencies.
>
>
> No, it's not.

Sorry, yes it is!

> The infinite sequence comprising only unit elements,
>
>   ....., 1, 1, 1, 1, 1,......
>
> is a valid (though highly improbable) realization of this process.

Except that I was talking about the PSD of the random signal that
I defined.  You are talking about the spectrum of one particular
realization of that random signal -- those are two entirely different
things.

Carlos
--


Carlos, I think you are abusing terminology and thus missing Rune's point.

> > If the PSD represents density of power, one contradiction that
> > I find is:  suppose that you define a signal as the following:
> > the value of x(t) at each time t is an independent random
> > variable uniformly distributed in the interval (-1,1)  (when I
> > say independent, I mean independent of x at *any other time*).

This does not define a signal; it defines a *distribution* of signals.

Carlos Moreno wrote:
>
> Except that I was talking about the PSD of the random signal that
> I defined.  You are talking about the spectrum of one particular
> realization of that random signal -- those are two entirely different
> things.

Indeed they are---and only a particular realization of the process
you defined has a PSD (and Rune did correctly identify one possible
realization with a nonconstant PSD).  The process as a whole produces
a distribution of PSD functions.
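That "distribution of PSD functions" is easy to see numerically; here is a throwaway sketch (all names and parameters invented for the example):

```python
import numpy as np

# Each realization of the uniform white-noise process has its own
# periodogram; only the ensemble average is flat, at the process variance.
rng = np.random.default_rng(1)
n, trials = 256, 2000
x = rng.uniform(-1.0, 1.0, size=(trials, n))        # many realizations
pgrams = np.abs(np.fft.rfft(x, axis=1)) ** 2 / n    # one periodogram each

avg = pgrams.mean(axis=0)          # ensemble average over realizations
print(pgrams.std(axis=0).max())    # individual periodograms scatter widely
print(avg[1:-1].mean())            # ~ 1/3 = Var(U(-1, 1)): flat
```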

-Tom

--


Tom Loredo wrote:
> Carlos, I think you are abusing terminology and thus missing Rune's point.
>
>>>If the PSD represents density of power, one contradiction that
>>>I find is:  suppose that you define a signal as the following:
>>>the value of x(t) at each time t is an independent random
>>>variable uniformly distributed in the interval (-1,1)  (when I
>>>say independent, I mean independent of x at *any other time*).
>
> This does not define a signal; it defines a *distribution* of signals.

Actually, I disagree.  Although I see that I wasn't rigorous
enough with my phrasing.

The above (my previous post) defines a *random signal*, or a
*random process*  (maybe the latter is a more appropriate
term).  But I am used to talking about "random signals",
understanding that when we say "the (random) signal" we
are talking about an experiment whose outcomes are the
realizations of that random signal.

The PSD that I'm talking about is the PSD of random signals,
and not the Spectrum of a particular realization of a
random signal.  That's why I contested Rune's point; the
spectrum of any specific realization of the (random) signal
I defined is not relevant.

White noise has a PSD that is constant for all values of
frequency (from -infinity to infinity).

The random signal (or random process) that I described is
white noise;  it has a PSD that is constant for all values
of frequency.  Yet its average power is not infinite (as
we would conclude if the PSD represented average density
of power).

So, I'm hoping that I'm clarifying the notation and what
I meant.

What troubles me is that there are a couple of posts in
this thread that claim that PSD is the average (over all
possible realizations of the random signal) of *power*
density, and no-one has contested that notion -- except
me, claiming that it represents density of *energy*.
The funny detail is that no-one has explicitly contested
my claim either!!  Are you guys under the impression
that I'm just trolling?  Or that maybe I'm cheating on
some homework?  I can truthfully tell you that neither
of these options is true.  As I said in my first post,
something that our teacher said in class triggered a
disagreement that I think goes back to different
understandings of what PSD is:  they claim that white
noise has infinite variance, and I disagree  (their
argument is that the power is obtained by integrating
the PSD with respect to frequency, and since the PSD
of white noise is constant, then the integral gives
infinite power).

Cheers,

Carlos
--


Rune Allnor wrote:

> ...
> Could you provide examples? Admittedly, a solar panel or a
> hydroelectric generator are examples of receivers that
> actually generate energy. I was thinking of radio receivers, where
> the measured signal is used to modulate a flow of energy internal
> to the receiver, energy that was provided by the receiver's power
> supply.

In Berlin (was it in the early 1930s?), people took lamp bulbs
and connected two wires to them, one of which was grounded.
By raising the other wire into the air, they could switch on the
lamp.

Surely, this was only possible because of the short distance to the
transmitting station.

This became common usage in an increasing number of garden houses,
garages, ... until in the end a law was passed which forbade the
practice.

Even today, the "Deutsche Welle" short-wave broadcasting station
causes lamps (fluorescent tubes) to shine without being switched on,
as soon as the antenna is directed toward Africa.
This happens in a village some 3 km south of the station.
(I've been told by somebody who lived there.)

Bernhard

Carlos Moreno <moreno_at_mochima_dot_com@x.xxx> wrote in message news:<eEXnb.5927$_G2.147481@wagner.videotron.net>...
> Rune Allnor wrote:
>
> >>If the PSD represents density of power, one contradiction that
> >>I find is:  suppose that you define a signal as the following:
> >>the value of x(t) at each time t is an independent random
> >>variable uniformly distributed in the interval (-1,1)  (when I
> >>say independent, I mean independent of x at *any other time*).
> >>
> >>The PSD of this signal is necessarily a constant value for all
> >>frequencies.
> >
> > No, it's not.
>
> Sorry, yes it is!
>
> > The infinite sequence comprising only unit elements,
> >
> >   ....., 1, 1, 1, 1, 1,......
> >
> > is a valid (though highly improbable) realization of this process.
>
> Except that I was talking about the PSD of the random signal that
> I defined.  You are talking about the spectrum of one particular
> realization of that random signal -- those are two entirely different
> things.

Could you please define what a "realization of a random signal" is?
I can't make any sense of that.  If you said "realization of a
random process" it would make sense.  There is a vast difference
between "signal" and "process".  The PSD is a property of the
process; the squared magnitude of the frequency spectrum of the
signal is an estimate of that spectrum.  See below.

I did ask in a different post that you define more clearly what
you mean by "random signal".  I can't find a post of yours where
you do that.  I will make an attempt to clarify the basis of my
argumentation:

If you study one sequence of numbers or one signal (one particular
realization), then by all means use the standard techniques like
DFTs.  However, once we start talking about "random signals" and
"random processes" it is implied that we regard the observed signal
(the "realization") as one glimpse into a world where some governing
rule, of which we know little (the "process"), influences the
behaviour of the observed data.  The objective of the study is to
use the observed data to infer something about these governing
rules, not to describe the data themselves in any particular detail.
The statistician admits (by using the very term "random") that he
cannot (and will not) attempt a complete, deterministic description
of the system.  He will, however, make the best use he can of what
data he has, in such a way that the inherent uncertainty of the
analysis is clear, and also such that he can use any new data that
may become available to him.

So we *assume* some property of the data, i.e. that they are a
"realization" of some "random process".  We can then impose some
more assumptions on this model ("stationary", "ergodic", "zero
mean", "Gaussian", ...) and try to characterize it or estimate some
parameters.  You may want to look into the difference between "true"
parameters and "estimators" for those parameters.

As in the example cited above, the estimated PSD of the (constant)
data series of one realization is as far from the "true" PSD of the
random process as possible, but the statistical arguments still
hold.  Of course, attempting to estimate the true PSD on the basis
of that realization is doomed to fail, but that's just the inherent
peril of statistical signal processing, or statistics in general.

I'll just remind you about a point I made in an earlier post about
interpreting the maths of the statistics in a somewhat different
manner than "only maths".

Rune

Carlos Moreno <moreno_at_mochima_dot_com@x.xxx> wrote in message news:<KW_nb.7503$_G2.329389@wagner.videotron.net>...
> Tom Loredo wrote:
> > Carlos, I think you are abusing terminology and thus missing Rune's point.
> >
> >>>If the PSD represents density of power, one contradiction that
> >>>I find is:  suppose that you define a signal as the following:
> >>>the value of x(t) at each time t is an independent random
> >>>variable uniformly distributed in the interval (-1,1)  (when I
> >>>say independent, I mean independent of x at *any other time*).
> >
> > This does not define a signal; it defines a *distribution* of signals.
>
> Actually, I disagree.  Although I see that I wasn't rigorous
> enough with my phrasing.
>
> The above (my previous post) defines a *random signal*, or a
> *random process*  (maybe the latter is a more appropriate
> term).  But I am used to talking about "random signals",
> understanding that when we say "the (random) signal" we
> are talking about an experiment with outcomes being the
> realizations of that random signal.

Then it may be useful to check out the terminology. The "random
process" usually denotes this distribution of "realizations",
and the term "signal" usually refers to one particular outcome
or observation in one particular realization or experiment.

> The PSD that I'm talking about is the PSD of random signals,
> and not the Spectrum of a particular realization of a
> random signal. That's why I contested Rune's point; the
> spectrum of any specific realization of the (random) signal
> I defined is not relevant.

The spectral densities of realizations are most certainly relevant,
they are what you observe, they are what you have to work with.
Ref the connection between the probability density function of
your dice, and the outcome of actually rolling it.

> White noise has a PSD that is constant for all values of
> frequency (from -infinity to infinity).

The generating process does. One particular realization need not.
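Rune's earlier all-ones sequence makes the distinction concrete; a quick sketch (sizes invented for the example):

```python
import numpy as np

# The all-ones sequence is a legal (if wildly improbable) realization of the
# uniform white-noise process, yet its periodogram is anything but flat:
# all of its power sits in the DC bin.
n = 64
x = np.ones(n)
pgram = np.abs(np.fft.rfft(x)) ** 2 / n

print(pgram[0])          # everything at frequency zero
print(pgram[1:].max())   # essentially zero everywhere else
```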

> The random signal (or random process) that I described is
> white noise;  it has a PSD that is constant for all values
> of frequency.  Yet its average power is not infinite (as
> we would conclude if the PSD represented average density
> of power).
>
> So, I'm hoping that I'm clarifying the notation and what
> I meant.

You have clarified your notation, but not necessarily what you meant.
Humpty Dumpty may have been a charming guy, but his approach to
language is not an example to follow.

> What troubles me is that there are a couple of posts in
> this thread that claim that PSD is the average (over all
> possible realizations of the random signal) of *power*
> density, and no-one has contested that notion -- except
> me, claiming that it represents density of *energy*.
> The funny detail is that no-one has explicitly contested
> my claim either!!

I *think* I wrote a post a couple of days ago, where I commented
on the difference between Energy Signals and Power Signals.
There need not be any contradictions between energy and power,
it's a matter of using the correct tool in each case.

Come to think of it, I actually asked in that very same post if
you could make an effort to check whether the signals you are concerned
about were best classified as Energy Signals or Power Signals...

> Are you guys under the impression
> that I'm just trolling?

The thought has occurred to me...

> Or that maybe I'm cheating on
> some homework?  I can truthfully tell you that neither
> of these options is true.  As I said in my first post,
> something that our teacher said in class triggered a
> disagreement that I think goes back to different
> understandings of what PSD is:  they claim that white
> noise has infinite variance, and I disagree  (their
> argument is that the power is obtained by integrating
> the PSD with respect to frequency, and since the PSD
> of white noise is constant, then the integral gives
> infinite power).

I don't know what your teacher said, so I can't comment on any
remarks you heard in class. But it seems more and more clear to me
that this is an issue of getting the different definitions right,
understanding those differences, and to know that different analysis
tools exist for different types of signals. You might find it useful
to review those definitions.

Rune

Carlos Moreno wrote:

...

> White noise has a PSD that is constant for all values of
> frequency (from -infinity to infinity).
>
> The random signal (or random process) that I described is
> white noise;  it has a PSD that is constant for all values
> of frequency.  Yet its average power is not infinite (as
> we would conclude if the PSD represented average density
> of power).

...

The average power that that model implies is indeed infinite. The
argument is a demonstration that ideal white noise -- constant energy
per unit bandwidth for all frequencies -- can't exist. Like the rigid
bodies of mechanics, white noise is no more than a useful abstraction.

Just as the notion of a rigid body provides an easy attack on statically
determinate systems, the notion of pure white noise is usefully applied
to bandlimited systems. Properties of the noise outside the system's
bandwidth are irrelevant. We deal regularly with abstractions that don't
represent something we can build or touch. Impulses and other
singularities are examples.
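Jerry's point can be put in numbers with a hypothetical sketch (the filter and sizes below are invented for the example): what matters is the noise power *inside* the system's bandwidth, and it scales with that bandwidth.

```python
import numpy as np

# Brick-wall-filter discrete white noise down to a fraction `frac` of the
# Nyquist band; the output power shrinks in proportion to the bandwidth kept.
rng = np.random.default_rng(2)
n = 1 << 16
x = rng.normal(0.0, 1.0, n)          # "white" noise, unit variance

def bandlimit(x, frac):
    """Zero out all frequency bins above `frac` of the Nyquist band."""
    X = np.fft.rfft(x)
    cut = int(frac * len(X))
    X[cut:] = 0.0
    return np.fft.irfft(X, len(x))

for frac in (1.0, 0.5, 0.25):
    y = bandlimit(x, frac)
    print(frac, np.mean(y ** 2))     # output power ~ frac * 1.0
```

Extrapolating the same proportionality to unbounded bandwidth is exactly the "infinite power" of the ideal model.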

Jerry
--
Engineering is the art of making what you want from things you can get.


Jerry Avins wrote:

>> White noise has a PSD that is constant for all values of
>> frequency (from -infinity to infinity).
>>
>> The random signal (or random process) that I described is
>> white noise;  it has a PSD that is constant for all values
>> of frequency.  Yet its average power is not infinite (as
>> we would conclude if the PSD represented average density
>> of power).
>
> The average power that that model implies is indeed infinite.

But it cannot be!  The power of something whose values are
bounded in absolute value cannot be infinite!

> The argument is a demonstration that ideal white noise -- constant energy
> per unit bandwidth for all frequencies -- can't exist.

Whether or not it can exist in nature is irrelevant  (I mean, any
argument could be taken to philosophical grounds, claiming
that numbers do not exist in nature and that they're only
a model, an abstraction that exists only in our minds....
With that sort of argument one could dismiss basically
*any* mathematical result, valid or not).

My point is that, the white noise as I defined it, as a
mathematical abstraction, has properties that can be
rigorously defined according to the laws of mathematics.

The way I see it, the power of the white noise that I
described *cannot* be infinite.  In fact, calculating
the variance is a trivial exercise:  E{x^2} = integral
from -1 to 1 of (1/2) x^2 dx = 1/3.  That gives you the
average power (assuming a normalized 1-ohm resistor, etc. etc.),
which is definitely not infinite.
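For what it's worth, Carlos's integral checks out numerically (a throwaway sketch, sample count invented for the example):

```python
import numpy as np

# E{x^2} for x ~ U(-1, 1):
# closed form: \int_{-1}^{1} (1/2) x^2 dx = (1/2)(2/3) = 1/3,
# and a Monte Carlo estimate agrees -- a perfectly finite average power.
rng = np.random.default_rng(3)
samples = rng.uniform(-1.0, 1.0, 1_000_000)

print(np.mean(samples ** 2))    # ~ 0.333
print(1 / 3)                    # exact value
```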

Yes, white noise is a mathematical abstraction;  the
disagreement is on the mathematical properties of this
abstraction.

You mention also the delta.  As an example, suppose someone
tells you that the value of the delta at t = 0 is 1.  You
will undoubtedly tell them that they're wrong!!  It
doesn't matter that a delta cannot exist in practice;
the mathematical properties that define a delta are
specific and rigorous.

Or take *Gaussian* white noise with zero mean and variance
sigma^2.  Well, if the variance is sigma^2, then the
average power is not infinite!

Carlos
--


Carlos Moreno wrote:
> [...]
> So, can someone help me with finding a rigorous definition
> of what the value at a given frequency of the PSD of a
> random signal means?

Carlos,

I think what you mean to ask is how we derive the units
of the PSD. If so, that's not too hard.

For this discussion, let's assume that the units of our
signal are volts, which I will denote by "[V]" ([anything]
denotes the units of "anything"). Let's also say the
voltage signal is identically distributed, and that the
pdf of the voltage (the ensemble pdf) at any time t
is simply f(x). Let's also denote the voltage signal
X(t).

First of all we need to convince ourselves that the pdf
of a random variable representing voltage has units of
[1/V]. Reason like this: The integral of the pdf gives
a probability, which is unitless. Since the integral
of a pdf is of the form \int f(x) dx, and the "dx" has
units of [V], then f(x) must be [1/V] to make the result
unitless.

OK, now what are the units of the autocorrelation function
Rxx(tau) of X(t)? Since all time lags give the same units,
for simplicity let's examine the units of Rxx(0) = E[X^2].
This is

\int x^2 f(x) dx,

which has units of [V^2] * [1/V] * [V], or [V^2]. Of course you
know the units [V^2] are directly proportional to power.

Finally, what are the units of the PSD?

\int Rxx(tau) e^{j*omega*tau} dtau

has units of [V^2] * [s] ("s" = seconds). And that is
directly proportional to watts/Hz (since [Hz] = [1/s]).
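A discrete-time sanity check of this units argument (a sketch; the sizes and "voltage" scale are invented for the example): integrating the PSD over frequency must give back the power E[X^2], which is just Parseval's theorem.

```python
import numpy as np

# Treat |X_k|^2 / n as the periodogram "PSD" in V^2 per (cycles/sample),
# with frequency spacing df = 1/n cycles/sample.  The "integral" of the
# PSD then recovers the mean-square power exactly (Parseval).
rng = np.random.default_rng(4)
n = 4096
x = rng.normal(0.0, 2.0, n)              # a "voltage" signal, variance 4 V^2

psd = np.abs(np.fft.fft(x)) ** 2 / n     # V^2 per (cycles/sample)
power = np.sum(psd) * (1.0 / n)          # sum(PSD) * df  ->  V^2

print(power, np.mean(x ** 2))            # the two agree to rounding error
```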
--