def. The Maximum Likelihood (Statistics) Estimator is the estimator $\hat\theta$ that maximizes the likelihood $L(\theta \mid x)$.
- It also works with the log likelihood $\ell(\theta) = \ln L(\theta \mid x)$, because the natural log is a monotonic function: $\arg\max_\theta L(\theta \mid x) = \arg\max_\theta \ln L(\theta \mid x)$.
Under certain regularity conditions, we can find the MLE by finding the stationary points of the log likelihood. The resulting condition is called the likelihood equation:
$$\frac{\partial}{\partial \theta} \ln L(\theta \mid x) = 0$$
To check whether this stationary point is a maximum (as opposed to a minimum), either:
- take the second derivative and check that it is negative at the stationary point…
- …or verify that it is a maximum by other means (see the worked example below).
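As a worked illustration (a standard textbook case, not a dataset from these notes): for an i.i.d. Bernoulli($p$) sample $x_1,\dots,x_n$,
$$\ln L(p \mid x) = \sum_{i=1}^n \big[x_i \ln p + (1-x_i)\ln(1-p)\big],
\qquad
\frac{\partial}{\partial p}\ln L = \frac{\sum_i x_i}{p} - \frac{n - \sum_i x_i}{1-p} = 0
\;\Rightarrow\; \hat p = \bar x.$$
The second derivative $\frac{\partial^2}{\partial p^2}\ln L = -\frac{\sum_i x_i}{p^2} - \frac{n-\sum_i x_i}{(1-p)^2}$ is negative, so the stationary point is indeed a maximum.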
Properties of MLEs:
- If a sufficient statistic exists, the MLE is a function of it.
- MLEs are not necessarily unbiased (see the example below).
- MLEs need not attain the Cramér–Rao lower bound (CRLB) on variance.
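For example (a standard fact, included here for concreteness): for $X_1,\dots,X_n \sim N(\mu,\sigma^2)$ the MLE of the variance is
$$\hat\sigma^2_{\text{MLE}} = \frac{1}{n}\sum_{i=1}^n (X_i - \bar X)^2,
\qquad
E\big[\hat\sigma^2_{\text{MLE}}\big] = \frac{n-1}{n}\,\sigma^2 \ne \sigma^2,$$
so it is biased (the bias vanishes as $n \to \infty$), even though it is a function of the sufficient statistic $\big(\sum_i X_i, \sum_i X_i^2\big)$.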
thm. Functional Equivariance of MLE: Given parameter $\theta$ with MLE $\hat\theta$, and let $\eta = g(\theta)$. Then the MLE of $\eta$ is $g(\hat\theta)$.
⇒ the MLE of any function of the parameter can then be found easily by plugging in $\hat\theta$ (example below).
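For instance (a standard example, not from the original notes): for Poisson($\lambda$) data the MLE is $\hat\lambda = \bar X$, so by equivariance the MLE of $\eta = P(X = 0) = e^{-\lambda}$ is simply $\hat\eta = e^{-\bar X}$, with no separate maximization needed.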
thm. Asymptotic Normality of MLE. [=Fisher’s Approximation]
let the data $X_1, \dots, X_n$ be generated i.i.d. by a univariate single-parameter distribution $f(x \mid \theta)$.
let also that $\hat\theta_n$ is found by the likelihood equation $\frac{\partial}{\partial\theta} \ln L(\theta \mid x) = 0$. Then both of the following are equivalently true:
$$\sqrt{n}\,\big(\hat\theta_n - \theta\big) \xrightarrow{d} N\!\left(0, \tfrac{1}{I_1(\theta)}\right)
\qquad\Longleftrightarrow\qquad
\hat\theta_n \mathrel{\dot\sim} N\!\left(\theta, \tfrac{1}{n\,I_1(\theta)}\right) \text{ for large } n,$$
where $I_1(\theta)$ is the Fisher information of a single observation.
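A minimal simulation sketch of Fisher's approximation (assumes NumPy; the Bernoulli setup and all names here are illustrative, not from the original notes):

```python
# Check Fisher's approximation for Bernoulli(p): the MLE p_hat = sample mean
# should be approximately N(p, p(1-p)/n), since I_1(p) = 1 / (p(1-p)).
import numpy as np

rng = np.random.default_rng(0)
p, n, reps = 0.3, 500, 10_000

# Each row is one i.i.d. sample of size n; the MLE of p is the row mean.
samples = rng.binomial(1, p, size=(reps, n))
p_hat = samples.mean(axis=1)

# Asymptotic standard deviation predicted by the theorem: sqrt(p(1-p)/n).
asymptotic_sd = np.sqrt(p * (1 - p) / n)
print("empirical sd of MLE  :", p_hat.std(ddof=1))
print("asymptotic (Fisher) sd:", asymptotic_sd)
```

The two printed standard deviations should agree closely for moderate $n$, and a histogram of `p_hat` would look approximately normal around $p$.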