DSPRelated.com
Forums

Function of Random Variables

Started by S Didde February 9, 2009
I am trying to determine the pdf of a function which is a convolution of
two random variables. How do I determine the pdf of the function?

Let me illustrate with a simple case cited in many textbooks:

Z=X+Y (sum of two random variables)
The final pdf is
fz(z) = (fx * fy)(z),
the convolution of the individual pdfs, if they are independent.

I am looking for the pdf fz(z) when
Z=X*Y (convolution of two random variables)

From the data I have, it appears that it is also a convolution of the
individual pdfs when they are independent.
But I am unable to prove it from first principles.
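For concreteness, here is a minimal Monte Carlo sketch (using uniform X and Y purely for illustration, not the actual data) comparing both cases against the known closed-form pdfs:

import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.uniform(0.0, 1.0, n)
y = rng.uniform(0.0, 1.0, n)

def density(samples, lo, hi, bins=100):
    # histogram normalized to integrate to 1, plus bin centers
    hist, edges = np.histogram(samples, bins=bins, range=(lo, hi), density=True)
    return hist, 0.5 * (edges[:-1] + edges[1:])

# Z = X + Y: the pdf is the convolution of the pdfs, the triangle 1 - |z - 1| on [0, 2]
f_sum, c_sum = density(x + y, 0.0, 2.0)
print("sum: max error vs triangle =", np.max(np.abs(f_sum - (1.0 - np.abs(c_sum - 1.0)))))

# Z = X * Y: for two independent uniforms the exact pdf is -ln(z) on (0, 1)
f_prod, c_prod = density(x * y, 0.0, 1.0)
print("product: max error vs -ln(z) =", np.max(np.abs(f_prod[2:] + np.log(c_prod[2:]))))
# (first bins skipped: the product density is singular at z = 0)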
Any help would be highly appreciated.
Thanks,
Stephen


On 9 Feb, 19:33, "S Didde" <s.di...@sbcglobal.net> wrote:
> I am trying to determine the pdf of a function which is a convolution of
> two random variables. How do I determine the pdf of the function?
> [snip]
> From the data I have, it appears that it is also a convolution of the
> individual pdfs when they are independent.
> But I am unable to prove it from first principles.
A quick look in Papoulis' "Probability, Random Variables and Stochastic Processes" (1992) only uncovers discussions of convolving PDFs, not data. An off-the-top-of-my-head guess would be that the convolution of data would invoke the Central Limit Theorem, such that the end PDF of z would asymptotically become Gaussian.

Rune
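A quick way to see the Central Limit Theorem effect Rune describes is to convolve a non-Gaussian pdf with itself repeatedly on a grid. A minimal numpy sketch (the uniform pdf is an arbitrary choice):

import numpy as np

# grid-sampled Uniform(0, 1) pdf; any non-Gaussian pdf works here
dx = 0.001
f = np.ones(1000)

# pdf of the sum of 8 iid uniforms = the uniform pdf convolved with itself 7 times
g = f.copy()
for _ in range(7):
    g = np.convolve(g, f) * dx  # the dx factor keeps the result normalized as a density

# compare with the Gaussian of matching mean (8 * 1/2) and variance (8 * 1/12)
t = np.arange(len(g)) * dx
mu, var = 8 * 0.5, 8 / 12.0
gauss = np.exp(-(t - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
print("max pdf mismatch after 7 convolutions:", np.max(np.abs(g - gauss)))

Even after only eight terms the mismatch is already small: the convolved pdf is visually indistinguishable from the Gaussian.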
On Feb 9, 12:33 pm, "S Didde" <s.di...@sbcglobal.net> wrote:
> I am looking for the pdf fz(z) when
> Z=X*Y (convolution of two random variables)
> [snip]
> But I am unable to prove it from first principles.
Perhaps if you defined what you mean by the convolution of two random variables, someone might be able to help you.
On 2009-02-09 14:33:42 -0400, "S Didde" <s.didde@sbcglobal.net> said:

> Z=X+Y (sum of two random variables)
> [snip]
> I am looking for the pdf fz(z) when
> Z=X*Y (convolution of two random variables)
What is your definition of convolution of random variables? In X+Y you took a single number distributed as X and likewise for Y and combined the two numbers. For a convolution you need two functions to combine to yield a new function. Where did the functions come from, and how are they related to the distribution of the points X? Methinks this is just runaway confusion!
Thanks to everyone who responded! I realized soon after posting this question that I needed to clarify it further.

The real issue is to find statistical information about a signal going through an LTI (linear time-invariant) channel. It is essential for my work to find out the pdf of the signal at the output.

To get down to the details of the problem: if I know the pdf (or pmf) of the amplitude distribution of the input signal, specifically binary bits, and if I can model the impulse response of the channel as another random variable (a probabilistic distribution of amplitudes at the sampling instants), then I can treat this as a convolution of two random variables. Hope this makes a bit more sense?
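A simulation sketch of the setup just described (the channel taps below are made up for illustration): random +/-1 bits through a short FIR channel, then the empirical pmf of the output amplitudes at the sampling instants:

import numpy as np

rng = np.random.default_rng(0)
bits = rng.choice([-1.0, 1.0], size=200_000)  # input amplitudes: equiprobable +/-1
h = np.array([1.0, 0.5, 0.25])                # hypothetical 3-tap channel response

y = np.convolve(bits, h, mode="valid")        # output samples at the bit rate
levels, counts = np.unique(np.round(y, 6), return_counts=True)
for lev, c in zip(levels, counts):
    print(f"output amplitude {lev:+.2f}: relative frequency {c / len(y):.4f}")

For this 3-tap example the output takes 8 distinct amplitudes, each with relative frequency near 1/8.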
On Feb 9, 2:37 pm, "S Didde" <s.di...@sbcglobal.net> wrote:
> The real issue is to find statistical information about a signal going
> through an LTI (linear time-invariant) channel.
> [snip]
> then I can treat this as a convolution of two random variables.
> Hope this makes a bit more sense?
one nitpick regarding terminology: i think you are considering the convolution of two random *processes*. (if you like to impress academics and other eggheads, you might call them "stochastic processes".) a random variable comes out as a number. you can do to it what you do to numbers (like add or multiply them). if you do some well defined operation to a random variable (or two RVs, like add or multiply), you can come up with an expression for the pdf of the resulting RV.

you might need to put a little bit more into defining the random process ("RP") representing the impulse response of the channel. that RP cannot be white noise, nor may it even be a finite power signal. it has to be a finite energy signal.

now what Rune said is likely true. if you have a decent model of the impulse response, restricting it to finite energy, and if the RV that is

  h(tau)*x(t-tau)

("*" means multiplication here; t is completely random, tau is specified, but variable) gives you a RV that has finite variance (for any tau), adding it up for all possible taus will add to a normal or gaussian RV.

if you know the autocorrelation (or power spectrum) of x(t) and the autocorrelation (or energy spectrum) of the h(t) processes, then i think that the power spectrum of the convolution result will be the product of the two spectrums. but i'll bet the pdf tends to be gaussian.

r b-j
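A quick numerical check of r b-j's Gaussian-tendency point (the decaying channel below is made up for illustration): as the number of significant taps grows, the excess kurtosis of the filtered bit stream shrinks toward zero, the Gaussian value:

import numpy as np

rng = np.random.default_rng(0)
bits = rng.choice([-1.0, 1.0], size=1_000_000)

for taps in (2, 8, 64):
    h = np.exp(-np.arange(taps) / (taps / 4.0))  # made-up decaying channel
    h /= np.linalg.norm(h)                       # unit energy, so var(y) = 1
    y = np.convolve(bits, h, mode="valid")
    # excess kurtosis is 0 for a Gaussian; it shrinks as the tap count grows
    kurt = np.mean(y ** 4) / np.mean(y ** 2) ** 2 - 3.0
    print(f"{taps:3d} taps: excess kurtosis = {kurt:+.3f}")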
Thanks r b-j and Rune for the helpful suggestions. Maybe I should define the problem a bit more.

What I am interested in finding out is how the amplitude probabilities spread out as the bits go through a finite impulse response channel. Ultimately I'd like to compute the probability of inter-symbol interference (ISI). Of course, there's the issue of random noise as a stochastic process interfering with the bit stream, but I am ignoring that as a second-order effect.

Signals start out without any ISI at the input to the channel and, due to non-linear phase and amplitude response in the channel, they begin to overlap. I would like to compute the probability of the amplitude distribution at the output of the channel. Not sure if this is enough information, but I would surely appreciate it if anyone has any ideas.
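Since the bits are equiprobable +/-1 and the channel is FIR, the output amplitude at a sampling instant is y = sum_k h[k]*b[k], so its exact pmf follows by enumerating all 2**L sign patterns (equivalently, by convolving L two-point pmfs). A sketch with made-up taps where the worst-case ISI closes the eye:

import numpy as np
from itertools import product

# hypothetical 4-tap channel: main tap first, then ISI taps
h = np.array([1.0, 0.6, 0.4, 0.2])

# exact pmf of y = sum_k h[k] * b[k] over all 2**L equiprobable sign patterns
pmf = {}
for signs in product((-1.0, 1.0), repeat=len(h)):
    y = round(float(np.dot(h, signs)), 10)
    pmf[y] = pmf.get(y, 0.0) + 0.5 ** len(h)

for y in sorted(pmf):
    print(f"amplitude {y:+.2f}: probability {pmf[y]:.4f}")

# probability that ISI alone pushes a transmitted +1 below the 0 threshold
p_err = sum(0.5 ** (len(h) - 1)
            for s in product((-1.0, 1.0), repeat=len(h) - 1)
            if h[0] + float(np.dot(h[1:], s)) < 0.0)
print("P(y < 0 | b[0] = +1) =", p_err)  # 0.125 for these taps

For channels too long to enumerate, the same pmf can be built by successively convolving the two-point pmfs of h[k]*b on a common amplitude grid, which is exactly the "convolution of random variables' distributions" picture from the start of the thread.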