Minimum-variance unbiased estimator



In statistics, a uniformly minimum-variance unbiased estimator (often abbreviated UMVU or MVUE) is an unbiased estimator whose variance is no larger than that of any other unbiased estimator, for every possible value of the parameter. Consider estimation of <math>\scriptstyle g(\theta) </math> based on data <math> \scriptstyle X_{1}, X_{2}, ..., X_{n} </math> i.i.d. from some family of densities <math>\scriptstyle p_{\theta}, \theta \in \Omega </math>, where <math>\scriptstyle \Omega </math> is the parameter space. An unbiased estimator <math>\scriptstyle \delta(X_{1}, X_{2}, ..., X_{n}) </math> of <math>\scriptstyle g(\theta) </math> is UMVU if <math>\scriptstyle \forall \theta \in \Omega </math>

<math> \mathrm{var}(\delta(X_{1}, X_{2}, ..., X_{n})) \leq \mathrm{var}(\tilde{\delta}(X_{1}, X_{2}, ..., X_{n})) </math>

for any other unbiased estimator <math>\scriptstyle \tilde{\delta} </math>. If an unbiased estimator of <math>\scriptstyle g(\theta) </math> exists, then one can prove there is an essentially unique UMVU estimator. By the Rao–Blackwell and Lehmann–Scheffé theorems, finding the UMVU estimator reduces to finding a complete sufficient statistic for the family <math>\scriptstyle p_{\theta}, \theta \in \Omega </math> and conditioning any unbiased estimator on it. Formally, suppose <math>\scriptstyle \delta(X_{1}, X_{2}, ..., X_{n}) </math> is unbiased for <math>\scriptstyle g(\theta) </math>, and that <math>\scriptstyle T </math> is a complete sufficient statistic for the family of densities. Then

<math> \eta(X_{1}, X_{2}, ..., X_{n}) = \mathrm{E}(\delta(X_{1}, X_{2}, ..., X_{n})|T)\, </math>

is the UMVU estimator for <math>\scriptstyle g(\theta) </math>.
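The conditioning recipe above can be checked numerically. The following sketch (not part of the article; the parameter values are arbitrary choices) uses the classic Poisson case: for <math>\scriptstyle X_{1}, ..., X_{n} </math> i.i.d. Poisson(<math>\scriptstyle \lambda </math>), the indicator <math>\scriptstyle \delta = 1\{X_{1} = 0\} </math> is unbiased for <math>\scriptstyle g(\lambda) = e^{-\lambda} </math>, the complete sufficient statistic is <math>\scriptstyle T = \sum X_{i} </math>, and conditioning gives the UMVU estimator <math>\scriptstyle \eta = (1 - 1/n)^{T} </math>.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, n, reps = 2.0, 10, 100_000  # arbitrary illustrative choices

# Each row is one i.i.d. sample of size n from Poisson(lam).
x = rng.poisson(lam, size=(reps, n))

delta = (x[:, 0] == 0).astype(float)  # naive unbiased estimator 1{X1 = 0}
T = x.sum(axis=1)                     # complete sufficient statistic
eta = (1 - 1 / n) ** T                # E(delta | T), the UMVU estimator

# Both estimators have mean near exp(-lam), but eta's variance is
# strictly smaller, as Rao-Blackwellization guarantees.
print(delta.mean(), eta.mean(), delta.var(), eta.var())
```

Both sample means land near <math>\scriptstyle e^{-2} \approx 0.135 </math>, while the conditioned estimator's variance is an order of magnitude smaller, illustrating that conditioning on <math>\scriptstyle T </math> preserves unbiasedness and can only reduce variance.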

Example

Consider the data to be a single observation from an absolutely continuous distribution on <math> \scriptstyle \mathbb{R} </math> with density

<math> p_{\theta}(x) = \frac{ \theta e^{-x} }{(1 + e^{-x})^{\theta + 1} } </math>

and we wish to find the UMVU estimator of

<math> g(\theta) = \frac{1}{\theta^{2}} </math>

First we recognize that the density can be written as

<math> \frac{ e^{-x} } { 1 + e^{-x} } \exp( -\theta \log(1 + e^{-x}) + \log(\theta)) </math>

which is an exponential family with sufficient statistic <math>\scriptstyle T = \log(1 + e^{-x})</math>. In fact this is a full-rank exponential family, and therefore <math> T </math> is complete sufficient. See exponential family for a derivation which shows

<math> \mathrm{E}(T) = \frac{1}{\theta}, \mathrm{var}(T) = \frac{1}{\theta^{2}} </math>

Therefore, using <math>\scriptstyle \mathrm{E}(T^{2}) = \mathrm{var}(T) + \mathrm{E}(T)^{2} </math>,

<math> \mathrm{E}(T^2) = \frac{2}{\theta^{2}} </math>

Thus <math>\scriptstyle \delta(X) = \frac{T^2}{2} </math> is unbiased for <math>\scriptstyle g(\theta) = \frac{1}{\theta^{2}} </math>, and since it is a function of the complete sufficient statistic, the UMVU estimator is

<math> \eta(X) = \mathrm{E}(\delta(X) | T) = \mathrm{E}\left(\frac{T^2}{2} \,\Big|\, T\right) = \frac{T^{2}}{2} = \frac{(\log(1 + e^{-X}))^{2}}{2}</math>

This example illustrates a general principle: an unbiased estimator that is a function of the complete sufficient statistic is automatically UMVU.
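The example above can also be verified by simulation. A useful fact for sampling (derivable by a change of variables, and consistent with <math>\scriptstyle \mathrm{E}(T) = 1/\theta </math>, <math>\scriptstyle \mathrm{var}(T) = 1/\theta^{2} </math>) is that <math>\scriptstyle T = \log(1 + e^{-X}) </math> has an exponential distribution with rate <math>\scriptstyle \theta </math>, so draws of <math> X </math> can be produced by inverting that transformation. The sketch below (not from the article; the parameter values are arbitrary) checks that <math>\scriptstyle \eta(X) </math> averages to <math>\scriptstyle 1/\theta^{2} </math>:

```python
import numpy as np

rng = np.random.default_rng(1)
theta, reps = 2.0, 200_000  # arbitrary illustrative choices

# T = log(1 + e^{-X}) is Exponential(rate theta), so sample T and
# invert the transformation to obtain draws of X itself.
t = rng.exponential(1 / theta, size=reps)
x = -np.log(np.expm1(t))        # solves t = log(1 + e^{-x}) for x

T = np.log1p(np.exp(-x))        # recompute the sufficient statistic from x
eta = T ** 2 / 2                # the UMVU estimator of 1/theta^2

print(eta.mean())               # near 1/theta^2 = 0.25
```

The Monte Carlo mean of <math>\scriptstyle \eta </math> agrees with <math>\scriptstyle 1/\theta^{2} = 0.25 </math> to within sampling error, confirming the unbiasedness claimed above.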

References

  • Keener, Robert W. (2006). Statistical Theory: Notes for a Course in Theoretical Statistics. Springer. pp. 47–48, 57–58.

