
variance vs mean square error

Started by sdeepa November 28, 2005
can anyone please let me know how the variance of noise affects the mean
square error? Will the mean square error value increase with value of
sample variance of noise?




the mean square error _is_ - as far as I know - the variance of the error...

-------------
> can anyone please let me know how the variance of noise affects the mean
> square error? Will the mean square error value increase with value of
> sample variance of noise?
Yes - it is also the average power of the error.
The standard deviation (we assume zero-mean) is the rms value.


Naebad
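The relationships in the answer above (for a zero-mean error, the MSE is the variance and the average power, and the rms value is the standard deviation) are easy to check numerically. A minimal sketch, assuming a Gaussian error sequence; the seed and the standard deviation of 2 are arbitrary choices, not from the thread:

```python
import numpy as np

# a hypothetical zero-mean Gaussian error sequence with standard deviation 2
rng = np.random.default_rng(0)
e = rng.normal(0.0, 2.0, 100_000)

mse = np.mean(e ** 2)           # mean square error = average power of the error
var = np.var(e)                 # sample variance (about the sample mean)
rms = np.sqrt(np.mean(e ** 2))  # rms value

# For a zero-mean error, the MSE is (essentially) the variance,
# and the rms value is (essentially) the standard deviation.
print(mse, var, rms)
```

The MSE and the variance differ only by the square of the sample mean, which is tiny here because the error is zero-mean.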

sdeepa wrote:

> can anyone please let me know how the variance of noise affects the mean
> square error? Will the mean square error value increase with value of
> sample variance of noise?
Generally, as the noise variance increases the mean square error will increase --- though the relationship may not be linear. Your question is a little too ambiguous to answer more specifically than that.

The difference between the variance and the mean square error can be a little subtle to understand, but there is a difference. Just because you generate a sequence of N samples from a zero-mean noise source of variance V does not mean that a) the mean of the sequence is zero or b) the mean square error of the sequence is V.

Sure, the mean will _tend_ towards zero and the mean square error will _tend_ towards V, but the sample mean and the actual mean, and the "sample variance" (mean square error) and the actual variance, will almost never be identical.

HTH.

Ciao,

Peter K.
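Peter's point is easy to demonstrate numerically: generate samples from a source with known mean and variance, and the sample statistics come out close to, but not exactly equal to, the population values. A minimal sketch (the seed, mu = 0 and V = 1 are arbitrary choices, not from the thread):

```python
import numpy as np

rng = np.random.default_rng(1)
mu, V, N = 0.0, 1.0, 1000

# N samples from a zero-mean noise source with variance V
x = rng.normal(mu, np.sqrt(V), N)

sample_mean = x.mean()         # tends toward mu, but is not exactly mu
mean_square = np.mean(x ** 2)  # tends toward V, but is not exactly V

print(sample_mean, mean_square)
```

Increasing N pulls both statistics toward the population values, but for any finite N they almost never hit them exactly.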
Peter K. wrote:
> The difference between the variance and the mean square error can be a
> little subtle to understand, but there is a difference.
>
> Just because you generate a sequence of N samples from a zero-mean
> noise source of variance V does not mean that a) the mean of the
> sequence is zero or b) the mean square error of the sequence is V.
>
> Sure, the mean will _tend_ towards zero and the mean square error will
> _tend_ towards V, but the sample mean and the actual mean and the
> "sample variance" (mean square error) and the actual variance will
> almost never be identical.
Peter, do you mean the difference between the actual mean and variance of a random variable (we're calling it "error") and the estimated values of mean and variance obtained by sampling it?

if so, then the mean of the estimates *will* tend toward the true mean (which might be zero), but the estimated variance is biased a little, and although the biased estimate tends toward the true variance, you can get a better estimate with a finite number of samples:

http://groups.google.com/group/comp.dsp/msg/1d89215c51200101

i figger you know this but wanted to put it out there.

--

r b-j rbj@audioimagination.com

"Imagination is more important than knowledge."
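The bias r b-j mentions shows up clearly in simulation: averaged over many short sequences, the divide-by-N "sample variance" undershoots the true variance by a factor of (N-1)/N, while the divide-by-(N-1) estimator does not. A minimal sketch (seed, N = 10, true variance 4 are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
true_var, N, trials = 4.0, 10, 200_000

# Draw `trials` independent length-N sequences from a zero-mean source.
x = rng.normal(0.0, np.sqrt(true_var), size=(trials, N))
m = x.mean(axis=1, keepdims=True)
ss = np.sum((x - m) ** 2, axis=1)

biased = ss / N          # "sample variance" (divide by N)
unbiased = ss / (N - 1)  # Bessel-corrected (divide by N - 1)

# The divide-by-N estimate averages to (N-1)/N * true_var = 3.6,
# while the divide-by-(N-1) estimate averages to the true variance, 4.0.
print(biased.mean(), unbiased.mean())
```

The bias matters most for small N; by N = 1000 the factor (N-1)/N is negligible, which is why the two estimators are often used interchangeably in practice.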
"robert bristow-johnson" <rbj@audioimagination.com> writes:


robert bristow-johnson wrote:

> Peter, do you mean the difference between the actual mean and variance
> of a random variable (we're calling "error") and the estimated values
> of mean and variance obtained by sampling it?
>
> if so, then the mean of the estimation *will* tend toward the true mean
> (which might be zero) but the estimated variance is biased a little and
> although the biased estimate tends toward the true variance, you can
> with a finite number of samples get a better estimate:
>
> http://groups.google.com/group/comp.dsp/msg/1d89215c51200101
>
> i figger you know this but wanted to put it out there.
Ya. That's the estimate I use; any other is biased. It's good to see a derivation of why to use it. :-)

Ciao,

Peter K.
"robert bristow-johnson" <rbj@audioimagination.com> writes:

> Peter, do you mean the difference between the actual mean and variance
> of a random variable (we're calling "error") and the estimated values
> of mean and variance obtained by sampling it?
Not really. I just mean that if you plug mu and sigma^2 into a random number generator, then you shouldn't expect the estimates of the mean and variance of the resulting numbers to be precisely mu and sigma^2 --- regardless of what estimator you use.
> if so, then the mean of the estimation *will* tend toward the true mean
> (which might be zero) but the estimated variance is biased a little and
> although the biased estimate tends toward the true variance, you can
> with a finite number of samples get a better estimate:
>
> http://groups.google.com/group/comp.dsp/msg/1d89215c51200101
>
> i figger you know this but wanted to put it out there.
OK, I suppose I always use the unbiased variance estimator, so I don't tend to think about the 1/N version. Good to see _why_ to do it posted somewhere, though; thanks for the link.

Ciao,

Peter K.
John wrote:

> the mean square error _is_ - as far as I know - the variance of the
> error...
Actually, the mean square error is the sample variance, which in turn is just an estimator of the true variance. The sample variance is "accurate" if the noise is white noise, but generally one should take the probability distribution into account.

http://mathworld.wolfram.com/Variance.html

--
Jani Huhtanen
Tampere University of Technology, Pori
Jani Huhtanen wrote:

> Actually mean square error is sample variance
Only if the relationship between where the noise is added and where you're calculating the mean square error is direct:

y = x + n

so you're calculating the mean square error of y from x, and you're talking about the sample variance of y (about mean x).

More generally, the mean square error is taken between x and xhat:

y = f(x) + n
xhat = fhat(y)

in which case the mean square error is almost completely different from the sample variance.
> which in turn is just an estimator for the true variance.
Ayup.
> Sample variance is "accurate" if the noise > is white noise, but generally one should take the probability distribution > into account.
You mean there's a variance on the variance estimator? Never! ;-)

Ciao,

Peter K.
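Peter's distinction between the direct and indirect cases can be made concrete with a toy simulation. Here f(x) = 2x and the naive inverse fhat(y) = y/2 are arbitrary illustrative choices, not from the thread; they show that the MSE between x and xhat need not equal the noise variance:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 100_000
x = rng.uniform(-1.0, 1.0, N)  # hypothetical signal
n = rng.normal(0.0, 0.5, N)    # additive noise with variance 0.25

# Direct case: y = x + n. The error y - x is just n, so the MSE of y
# about x is the sample variance of the noise.
y = x + n
mse_direct = np.mean((y - x) ** 2)

# Indirect case: y = f(x) + n, xhat = fhat(y). With f(x) = 2x and
# fhat(y) = y/2 (toy choices for illustration), the error xhat - x
# is n/2, so the MSE is var(n)/4 -- not the noise variance.
f = lambda t: 2.0 * t
fhat = lambda t: t / 2.0
xhat = fhat(f(x) + n)
mse_indirect = np.mean((xhat - x) ** 2)

print(mse_direct, mse_indirect)  # ~0.25 vs ~0.0625
```

With a nonlinear f, or an fhat designed to account for the noise (e.g. a Wiener-style estimator), the relationship between noise variance and MSE becomes even less direct, which is Peter's point.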
Jani Huhtanen wrote:
> John wrote:
>
> > the mean square error _is_ - as far as I know - the variance of the
> > error...
>
> Actually mean square error is sample variance which in turn is just an
> estimator for the true variance. Sample variance is "accurate" if the
> noise is white noise, but generally one should take the probability
> distribution into account.
it's a semantic issue, i guess, but even given the meaning of "sample variance" taken from the Wolfram link, i am not sure what you mean by "accurate", Jani.

the sample variance (divide by N) is less accurate than the unbiased estimator (divide by N-1 instead of N) for getting to the "population variance", which is the variance in the p.d.f. this is true whether the r.v. is white or not.

r b-j