# Notes on Regression - Method of Moments
Another way of establishing the OLS formula is through the method of moments approach. The method dates back to Pearson in 1894. It can be thought of as replacing a population moment with a sample analogue and using it to solve for the parameter of interest.
## Example 1
To find an estimator for the population mean, $\mu = E[X]$, one replaces the expected value with a sample analogue,

$$\hat{\mu} = \frac{1}{n}\sum_{i=1}^{n} X_i$$
## Example 2
Let $X_1, \dots, X_n$ be drawn from a normal distribution, i.e. $X_i \sim N(\mu, \sigma^2)$. The goal is to find estimators for the two parameters, $\mu$ and $\sigma^2$. The first and second moments of a normal distribution are given by:

$$E[X] = \mu, \qquad E[X^2] = \mu^2 + \sigma^2$$
An estimator for $\mu$ is easy and is simply $\hat{\mu} = \frac{1}{n}\sum_{i=1}^{n} X_i$.
Replace the second moment condition with the sample analogue and substitute in the estimator for $\mu$ to find an estimator for $\sigma^2$:

$$\hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n} X_i^2 - \hat{\mu}^2 = \frac{1}{n}\sum_{i=1}^{n} \left(X_i - \hat{\mu}\right)^2$$
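As a quick numerical check, here is a minimal NumPy sketch of the two moment estimators above (the true parameter values and sample size are made up for illustration):

```python
import numpy as np

# Hypothetical true parameters, chosen only for illustration
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.5, size=10_000)  # mu = 2, sigma^2 = 2.25

# First moment condition: E[X] = mu
mu_hat = x.mean()

# Second moment condition: E[X^2] = mu^2 + sigma^2
sigma2_hat = (x**2).mean() - mu_hat**2

print(mu_hat, sigma2_hat)  # should be close to 2 and 2.25
```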
## Example 3
Let $X_1, \dots, X_n$ be drawn from a Poisson distribution, i.e. $X_i \sim \text{Poisson}(\lambda)$. The Poisson distribution is characterised by the following equality: $E[X] = \text{Var}(X) = \lambda$. This gives rise to two possible estimators for $\lambda$:

$$\hat{\lambda}_1 = \frac{1}{n}\sum_{i=1}^{n} X_i, \qquad \hat{\lambda}_2 = \frac{1}{n}\sum_{i=1}^{n} \left(X_i - \bar{X}\right)^2$$
Since there is only one parameter to be estimated but two moment conditions, one would need some way of 'combining' the two conditions; using only one condition would not make full use of the information at hand.
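To see the tension concretely, here is a sketch computing both estimators on simulated Poisson data (the true $\lambda$ is illustrative): in any finite sample they disagree.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.poisson(lam=3.0, size=10_000)  # hypothetical true lambda = 3

# Two sample analogues of E[X] = Var(X) = lambda
lambda_hat_1 = x.mean()  # from the first moment
lambda_hat_2 = x.var()   # from the central second moment (ddof=0 by default)

print(lambda_hat_1, lambda_hat_2)  # both close to 3, but not identical
```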
## Regression - Method of Moments
More generally, one can write the moment conditions as a vector of functions $E[g(w_i, \theta)] = 0$, where $w_i$ is the observed data, including all variables and instruments in the regression model, while $\theta$ is the vector of parameters of length $k$. The model is identified if the solution is unique, i.e. $E[g(w_i, \theta_1)] = 0$ and $E[g(w_i, \theta_2)] = 0$ imply that $\theta_1 = \theta_2$. This requires that we have at least $k$ restrictions for the $k$ parameters, i.e. $q \geq k$, where $q$ is the number of moment conditions.
For the OLS regression, one can use the moment condition $E[x_i \varepsilon_i] = 0$, or equivalently $E[x_i(y_i - x_i'\beta)] = 0$, to solve for the usual OLS estimator, $\hat{\beta}_{OLS} = (X'X)^{-1}X'y$.
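As a sketch, the sample moment condition can be solved directly with NumPy (the data-generating process and coefficients below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5_000
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one regressor
beta_true = np.array([1.0, 2.0])                       # made-up coefficients
y = X @ beta_true + rng.normal(size=n)

# Sample moment condition X'(y - X beta) = 0  =>  beta_hat = (X'X)^{-1} X'y
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_ols)  # close to [1, 2]
```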
The idea can be carried over to other more complicated regression models. For example, in the case where $g$ is linear in $\beta$, i.e. $g(w_i, \beta) = z_i(y_i - x_i'\beta)$ or $E[z_i(y_i - x_i'\beta)] = 0$, and the model is perfectly identified ($q = k$), solving the moment condition yields the formula for the IV regression:

$$\hat{\beta}_{IV} = \left(\sum_{i=1}^{n} z_i x_i'\right)^{-1}\sum_{i=1}^{n} z_i y_i = (Z'X)^{-1}Z'y$$
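A similar sketch for the just-identified IV case, with a made-up data-generating process in which $x_i$ is endogenous and $z_i$ is a valid instrument:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10_000
u = rng.normal(size=n)                       # structural error
z = rng.normal(size=n)                       # instrument: relevant, exogenous
x = 0.8 * z + 0.5 * u + rng.normal(size=n)   # regressor correlated with u
y = 1.0 + 2.0 * x + u                        # made-up structural equation

Z = np.column_stack([np.ones(n), z])
X = np.column_stack([np.ones(n), x])

# Sample moment condition Z'(y - X beta) = 0  =>  beta_hat = (Z'X)^{-1} Z'y
beta_iv = np.linalg.solve(Z.T @ X, Z.T @ y)
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_iv)   # close to [1, 2]
print(beta_ols)  # slope biased away from 2 by the endogeneity
```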
Hence an IV regression can be thought of as replacing 'problematic' OLS moment conditions with hopefully better ones constructed from instruments.
## Extension - Generalised Method of Moments (GMM)
While it is not possible to identify $\theta$ if there are too few restrictions ($q < k$), one could still identify $\theta$ if there are more restrictions than parameters ($q > k$, the overidentified case), as seen in the Poisson example.[^1] One might then wonder what is the best way to combine these restrictions. The GMM approach, introduced by Hansen in 1982, finds an estimate of $\theta$ that brings the sample moments as close to zero as possible. Note that the population moment conditions for all $q$ restrictions are still equal to zero, but the sample approximation, being computed from a finite sample, may not be exactly zero. In other words, the GMM estimator is defined as the value of $\theta$ that minimises the weighted distance of the sample moments from zero:

$$\hat{\theta}_{GMM} = \arg\min_{\theta}\; \left(\frac{1}{n}\sum_{i=1}^{n} g(w_i, \theta)\right)' \hat{W} \left(\frac{1}{n}\sum_{i=1}^{n} g(w_i, \theta)\right)$$
where $\hat{W}$ is a $q \times q$ positive definite matrix of weights which is used to select the ideal linear combination of instruments. In the case of the regression model where $g$ is linear in $\beta$ but the model is overidentified ($q > k$), the general GMM formula can be found by minimising the above condition and is given by:

$$\hat{\beta}_{GMM} = \left(X'Z\hat{W}Z'X\right)^{-1} X'Z\hat{W}Z'y$$
Note that when $\hat{W} = (Z'Z)^{-1}$, $\hat{\beta}_{GMM} = \hat{\beta}_{2SLS}$.[^2] Google 'efficient GMM' for more information on the optimal choice of the weighting matrix.
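As a sketch, one can verify numerically that the linear GMM formula with $\hat{W} = (Z'Z)^{-1}$ coincides with 2SLS in an overidentified model (two invented instruments for one endogenous regressor; the data-generating process is again made up):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 10_000
u = rng.normal(size=n)
z1, z2 = rng.normal(size=n), rng.normal(size=n)         # two instruments
x = 0.6 * z1 + 0.4 * z2 + 0.5 * u + rng.normal(size=n)  # endogenous regressor
y = 1.0 + 2.0 * x + u                                   # made-up structural equation

Z = np.column_stack([np.ones(n), z1, z2])  # q = 3 moment conditions
X = np.column_stack([np.ones(n), x])       # k = 2 parameters

# Linear GMM with weighting matrix W = (Z'Z)^{-1}
W = np.linalg.inv(Z.T @ Z)
A = X.T @ Z @ W @ Z.T
beta_gmm = np.linalg.solve(A @ X, A @ y)   # (X'Z W Z'X)^{-1} X'Z W Z'y

# 2SLS: regress y on the first-stage fitted values of X
X_hat = Z @ np.linalg.solve(Z.T @ Z, Z.T @ X)
beta_2sls = np.linalg.solve(X_hat.T @ X, X_hat.T @ y)

print(beta_gmm)   # close to [1, 2]
print(beta_2sls)  # identical to beta_gmm up to floating point
```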
[^1]: In the case of regressions, this happens when there are more instruments than endogenous regressors.

[^2]: This also shows that the 2SLS estimator is a GMM estimator for the linear model. The 2SLS estimator is also the most efficient GMM estimator if the errors are homoskedastic. In general, there may be other more efficient choices of the weighting matrix.