
Probability densities

Started by Unknown August 8, 2012
Suppose I add two noise signals, y = x + v,

where x has a PDF Px and v has a PDF Pv.

What is the PDF of y? I am led to believe that it is the convolution of the two PDFs of x and v - is this right, and if so, why?

Therefore, if x and v are both Gaussian, y must have a PDF which is the convolution of two Gaussian PDFs - I assume that must be Gaussian too?


On Aug 8, 4:09 pm, gyansor...@gmail.com wrote:
> Suppose I add two noise signals, y = x + v,
>
> where x has a PDF Px and v has a PDF Pv.
>
> What is the PDF of y? I am led to believe that it is the convolution of the two PDFs of x and v - is this right, and if so, why?
>
> Therefore, if x and v are both Gaussian, y must have a PDF which is the convolution of two Gaussian PDFs - I assume that must be Gaussian too?
You are correct in that it is the convolution. To derive the convolution relation, just consider that for each value of y you have to look at all the pairs of x and v that add up to y, and of course you are doing this inside an integral. Pretty much any stats book will have this. IHTH, Clay
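
A minimal numerical sketch of that relation, assuming x and v are independent. The grid, the example parameters (x ~ N(0,1), v ~ N(1,4)) and the numpy usage are just for illustration:

import numpy as np

# common grid for both densities
dz = 0.01
z = np.arange(-10.0, 10.0, dz)

# example densities: x ~ N(0, 1) and v ~ N(1, 2^2)
px = np.exp(-0.5 * z**2) / np.sqrt(2 * np.pi)
pv = np.exp(-0.5 * ((z - 1.0) / 2.0)**2) / (2.0 * np.sqrt(2 * np.pi))

# Py(y) = integral{ Px(x) Pv(y-x) dx }  ->  discrete convolution times dz
py = np.convolve(px, pv, mode="same") * dz

# the sum of two independent Gaussians should itself be Gaussian,
# with mean 0 + 1 and variance 1 + 4
py_closed_form = np.exp(-0.5 * (z - 1.0)**2 / 5.0) / np.sqrt(2 * np.pi * 5.0)

print("area under py:", np.sum(py) * dz)                          # ~ 1
print("max |py - N(1,5)|:", np.max(np.abs(py - py_closed_form)))  # small (grid/truncation error)

The same numerical check works for any two densities put on a common grid, Gaussian or not; only the closed-form comparison line is specific to the Gaussian example.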
gyansorova@gmail.com wrote:

> Suppose I add two noise signals y=x+v
> where x has a PDF Px and v has a PDF Pv
> What is the PDF of y? I am led to believe that it is
> the convolution of the two PDF's of x and v - is this
> right and if so why?
You don't say that the two distributions are statistically independent. Maybe that is obvious in context, but it is important to know to get the right answer.

-- glen
On 8/8/12 5:19 PM, glen herrmannsfeldt wrote:
> gyansorova@gmail.com wrote:
>
>> Suppose I add two noise signals y=x+v
>
>> where x has a PDF Px and v has a PDF Pv
>
>> What is the PDF of y? I am led to believe that it is
>> the convolution of the two PDF's of x and v - is this
>> right and if so why?
>
i think you can show that v can be divided into an orthogonal component and a component that is in-line with y-x. so then the p.d.f. of y is

   Py(y) = integral{ Px(x) Pv(y-x) dx }

you add up the probability of v for each possible value of x. and then, i guess, you just recognize the form of the equation as that of convolution.

the really cool thing is that you can define this thing called "the Characteristic Function", which is the Fourier Transform of the p.d.f. so what's cool is that when you add two random variables, you convolve their p.d.f.'s, and then you multiply their characteristic functions, and then you add the logarithms of the characteristic functions. it's kinda like this cepstrum thing ("cepstrum" is "spectrum" with the first syllable reversed). so you have a mapping that transforms some additive quantity, which could be random, into something else that is additive but deterministic. whenever you add r.v.'s, you add the logs of the F.T.s of the p.d.f.'s of the r.v.'s. (there is a short numerical sketch of this just after this post.)
> You don't say that the two distributions are statistically
> independent. Maybe that is obvious in context, but it
> is important to know to get the right answer.
i know that orthogonal is less restrictive than independent. i believe you can show that two independent r.v.'s are also orthogonal, in the sense that the inner product of two equal-length segments of samples has an expectation value of 0. it may not be true, it's fudging, but i'm assuming both.

--
r b-j  rbj@audioimagination.com

"Imagination is more important than knowledge."
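
A small numerical illustration of the characteristic-function point above, again assuming independence. The Gaussian parameters and the gaussian_cf helper are made up for the example:

import numpy as np

t = np.linspace(-5.0, 5.0, 1001)

def gaussian_cf(t, mu, sigma):
    # characteristic function E[exp(j*t*X)] of N(mu, sigma^2)
    return np.exp(1j * mu * t - 0.5 * (sigma * t)**2)

phi_x = gaussian_cf(t, mu=0.0, sigma=1.0)   # x ~ N(0, 1)
phi_v = gaussian_cf(t, mu=1.0, sigma=2.0)   # v ~ N(1, 4)

# convolving the p.d.f.'s corresponds to multiplying the characteristic
# functions; for Gaussians the product is again a Gaussian CF,
# here with mean 0 + 1 and variance 1 + 4
phi_y = gaussian_cf(t, mu=1.0, sigma=np.sqrt(5.0))

print(np.max(np.abs(phi_x * phi_v - phi_y)))   # ~ 0, up to round-off

Taking logarithms turns that product into a sum, which is the additive view described above; in general some care with the complex-log branch is needed when doing that numerically.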
>> characteristic functions
that's one keyword to look up. For example, Proakis, "Digital Communications", has a good section on the topic.
>so what's cool is that when you add two random variables, you convolve
>their p.d.f.'s, and then you multiply their "characteristic functions",
>and then you add the logarithms of the characteristic functions.
Or, for that matter, "moment generating functions", which can be extremely useful when the inverse transformation can't be done analytically and a (very) accurate approximation is desired.
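
A hedged sketch of that moment-generating-function remark, assuming independence. The example distributions (Exponential with rate 1 and Uniform(0,1)) and the sympy usage are illustrative only:

import sympy as sp

t = sp.symbols('t')

# for independent x and v, the MGF of y = x + v is the product M_x(t) * M_v(t)
M_x = 1 / (1 - t)            # MGF of Exponential(rate 1), valid for t < 1
M_v = (sp.exp(t) - 1) / t    # MGF of Uniform(0, 1)
M_y = sp.simplify(M_x * M_v)

# even when inverting M_y back to a density is awkward, moments of y fall
# out of the Taylor expansion about t = 0: M_y(t) = 1 + m1*t + m2*t**2/2 + ...
s = sp.series(M_y, t, 0, 3).removeO()
m1 = s.coeff(t, 1)         # E[y]   = 1 + 1/2 = 3/2
m2 = 2 * s.coeff(t, 2)     # E[y^2] = 10/3
print(m1, sp.simplify(m2 - m1**2))   # mean 3/2, variance 1 + 1/12 = 13/12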