### Maximum Likelihood Sinusoid Estimation

The *maximum likelihood estimator* (MLE) is widely used in practical signal modeling [121]. A full treatment of maximum likelihood estimators (and statistical estimators in general) lies beyond the scope of this book. However, we will show that the MLE is equivalent to the least squares estimator for a wide class of problems, including well-resolved sinusoids in white noise.

Consider again the signal model of (5.32) consisting of a complex sinusoid in additive white (complex) noise:

$$
x(n) = \mathcal{A} e^{j\omega_0 n} + v(n), \qquad n = 0, 1, \ldots, N-1.
$$

Again, $\mathcal{A} = A e^{j\phi}$ is the complex amplitude of the sinusoid, and $v(n)$ is white noise. In addition to assuming $v(n)$ is white, we add the assumption that it is *Gaussian distributed* with *zero mean*; that is, we assume that its probability density function (see Appendix C) is given by

$$
p_{v(n)}(\nu) = \frac{1}{\pi \sigma_v^2} \, e^{-|\nu|^2 / \sigma_v^2}. \tag{6.46}
$$
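As a quick numerical sanity check, the density in (6.46) integrates to one over the complex plane (real and imaginary parts), and its mean square value equals $\sigma_v^2$. A minimal sketch, assuming NumPy; the variance value and grid extent below are arbitrary choices:

```python
import numpy as np

sigma2 = 0.5  # example noise variance sigma_v^2 (arbitrary choice)

# Evaluate the complex Gaussian pdf of Eq. (6.46) on a grid of
# real/imaginary parts wide enough to cover the probability mass.
re, im = np.meshgrid(np.linspace(-5, 5, 801), np.linspace(-5, 5, 801))
nu = re + 1j * im
p = np.exp(-np.abs(nu) ** 2 / sigma2) / (np.pi * sigma2)

d = (re[0, 1] - re[0, 0]) * (im[1, 0] - im[0, 0])  # area element
total = np.sum(p) * d                   # integral of the pdf: ~1
mean_sq = np.sum(np.abs(nu) ** 2 * p) * d  # E{|v|^2}: ~sigma2

print(total, mean_sq)
```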

We express the zero-mean Gaussian assumption by writing

$$
v(n) \sim \mathcal{N}(0, \sigma_v^2). \tag{6.47}
$$
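In code, one convenient way to draw such zero-mean complex white Gaussian noise is to give the real and imaginary parts half the total variance each. A sketch assuming NumPy; the variance and sample count are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 0.5      # noise variance sigma_v^2 (arbitrary choice)
N = 200_000       # number of samples (large, to check the statistics)

# Zero-mean complex Gaussian white noise: independent real and
# imaginary parts, each carrying variance sigma2/2.
v = rng.normal(0, np.sqrt(sigma2 / 2), N) \
    + 1j * rng.normal(0, np.sqrt(sigma2 / 2), N)

print(np.abs(v.mean()))          # near 0  (zero mean)
print(np.mean(np.abs(v) ** 2))   # near sigma2  (variance)
```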

The parameter $\sigma_v^2$ is called the *variance* of the random process $v(n)$, and $\sigma_v$ is called the *standard deviation*. It turns out that when Gaussian random variables are uncorrelated (*i.e.*, when $v(n)$ is white noise), they are also *independent*. This means that the probability of observing particular values of $v(n)$ and $v(m)$, $n \neq m$, is given by the product of their respective probabilities [121]. We will now use this fact to compute an explicit probability for observing any data sequence $x(n)$ in (5.44). Since the sinusoidal part of our signal model, $\mathcal{A} e^{j\omega_0 n}$, is *deterministic* (*i.e.*, it does not include any random components), it may be treated as the *time-varying mean* of a Gaussian random process $x(n)$. That is, our signal model (5.44) can be rewritten as

$$
x(n) \sim \mathcal{N}\!\left(\mathcal{A} e^{j\omega_0 n}, \sigma_v^2\right) \tag{6.48}
$$

and the probability density function for the whole set of observations $x(n)$, $n = 0, 1, \ldots, N-1$, is given by

$$
p(x) = \prod_{n=0}^{N-1} \frac{1}{\pi \sigma_v^2} \, e^{-\left|x(n) - \mathcal{A} e^{j\omega_0 n}\right|^2 / \sigma_v^2}
     = \left(\frac{1}{\pi \sigma_v^2}\right)^{N} e^{-\frac{1}{\sigma_v^2} \sum_{n=0}^{N-1} \left|x(n) - \mathcal{A} e^{j\omega_0 n}\right|^2}. \tag{6.49}
$$
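To make the factorization concrete, the following sketch draws one noisy sinusoid, evaluates the per-sample Gaussian densities with time-varying mean as in (6.48), and confirms that their product matches the closed form of (6.49). NumPy is assumed, and all parameter values are hypothetical examples:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical example parameters; amp is the complex amplitude A*e^{j*phi}.
N, sigma2, amp, w0 = 16, 0.2, 0.8 * np.exp(0.4j), 1.1
n = np.arange(N)
x = amp * np.exp(1j * w0 * n) + (
    rng.normal(0, np.sqrt(sigma2 / 2), N)
    + 1j * rng.normal(0, np.sqrt(sigma2 / 2), N)
)

# Per-sample densities: each x(n) is complex Gaussian with mean
# amp*exp(j*w0*n) and variance sigma2, as in Eq. (6.48).
err = x - amp * np.exp(1j * w0 * n)
per_sample = np.exp(-np.abs(err) ** 2 / sigma2) / (np.pi * sigma2)

# Closed form of Eq. (6.49).
closed = (1 / (np.pi * sigma2)) ** N * np.exp(-np.sum(np.abs(err) ** 2) / sigma2)

print(np.isclose(np.prod(per_sample), closed))  # True: the product matches
```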

Thus, given the noise variance $\sigma_v^2$ and the three sinusoidal parameters $A$, $\phi$, and $\omega_0$ (remember that $\mathcal{A} = A e^{j\phi}$), we can compute the relative probability of any observed data samples $x(n)$.
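Note that (6.49) depends on the sinusoidal parameters only through the sum of squared errors, so maximizing the likelihood amounts to minimizing that sum; this is the equivalence with least squares noted at the start of this section. A minimal sketch (NumPy assumed, parameter values hypothetical) evaluating the log of (6.49) at the true parameters and at a deliberately wrong amplitude:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical example parameters: amplitude A, phase phi, frequency w0.
N, sigma2 = 64, 0.1
A, phi, w0 = 1.0, 0.3, 0.7
amp = A * np.exp(1j * phi)  # complex amplitude
n = np.arange(N)
x = amp * np.exp(1j * w0 * n) + (
    rng.normal(0, np.sqrt(sigma2 / 2), N)
    + 1j * rng.normal(0, np.sqrt(sigma2 / 2), N)
)

def log_likelihood(x, amp, w0, sigma2):
    """Log of Eq. (6.49): a constant minus the sum of squared errors / sigma2."""
    err = x - amp * np.exp(1j * w0 * np.arange(len(x)))
    return -len(x) * np.log(np.pi * sigma2) - np.sum(np.abs(err) ** 2) / sigma2

# The likelihood at the true parameters exceeds that at a wrong amplitude,
# illustrating that maximizing (6.49) means minimizing the squared error.
ll_true = log_likelihood(x, amp, w0, sigma2)
ll_wrong = log_likelihood(x, 0.5 * amp, w0, sigma2)
print(ll_true, ll_wrong)
```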
