Method of moments (statistics)

In statistics, the method of moments is a method of estimating population parameters such as the mean, variance, or median (which need not themselves be moments). It works by equating sample moments with the corresponding unobservable population moments and then solving the resulting equations for the quantities to be estimated.

Example

Suppose X1, ..., Xn are independent identically distributed random variables with a gamma distribution with probability density function

${\displaystyle {x^{\alpha -1}e^{-x/\beta } \over \beta ^{\alpha }\,\Gamma (\alpha )}\,\!}$

for x > 0, and 0 for x ≤ 0.

The first moment, i.e., the expected value, of a random variable with this probability distribution is

${\displaystyle \operatorname {E} (X_{1})=\alpha \beta \,}$

and the second moment, i.e., the expected value of its square, is

${\displaystyle \operatorname {E} (X_{1}^{2})=\beta ^{2}\alpha (\alpha +1).\,}$

These are the "population moments".

The first and second "sample moments" m1 and m2 are respectively

${\displaystyle m_{1}={X_{1}+\cdots +X_{n} \over n}\,\!}$

and

${\displaystyle m_{2}={X_{1}^{2}+\cdots +X_{n}^{2} \over n}.\,\!}$

Equating the population moments with the sample moments, we get

${\displaystyle \alpha \beta =m_{1}\,\!}$

and

${\displaystyle \beta ^{2}\alpha (\alpha +1)=m_{2}.\,\!}$

Solving these two equations for α and β (subtracting the square of the first equation from the second gives β²α = m2 − m1², after which each parameter follows by substitution), we get

${\displaystyle \alpha ={m_{1}^{2} \over m_{2}-m_{1}^{2}}\,\!}$

and

${\displaystyle \beta ={m_{2}-m_{1}^{2} \over m_{1}}.\,\!}$

We then use these two quantities as estimates, based on the sample, of the two unobservable population parameters α and β.
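The derivation above can be sketched numerically. The following is a minimal illustration (not part of the original article) that draws a gamma sample with assumed shape α = 3 and scale β = 2, computes the two sample moments, and recovers the parameters with the formulas derived above; all variable names are illustrative.

```python
import numpy as np

# Draw a large i.i.d. gamma sample with assumed true parameters
# alpha (shape) = 3.0 and beta (scale) = 2.0.
rng = np.random.default_rng(0)
alpha_true, beta_true = 3.0, 2.0
x = rng.gamma(shape=alpha_true, scale=beta_true, size=100_000)

# First and second sample moments m1 and m2.
m1 = np.mean(x)
m2 = np.mean(x ** 2)

# Method-of-moments estimates: solve alpha*beta = m1 and
# beta^2 * alpha * (alpha + 1) = m2 for alpha and beta.
alpha_hat = m1 ** 2 / (m2 - m1 ** 2)
beta_hat = (m2 - m1 ** 2) / m1

print(alpha_hat, beta_hat)  # both should be close to 3.0 and 2.0
```

Note that m2 − m1² is the (biased) sample variance, so the estimates can equivalently be written as α̂ = m1²/s² and β̂ = s²/m1 with s² the sample variance.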