Chapter 14: The Gauss-Markov Theorem
This chapter brings together all the key ideas in this book:
• To do inference, one must have a model of the data generating process.
• There are many possible estimators of the population parameters.
• Estimators can be classified according to whether they are unbiased – that is, correct on average.
• Many, but by no means all, estimators are linear estimators.
• One of the main criteria for comparing estimators is the variance
of the estimator.
• When the data are generated according to the classical econometric box model, ordinary least squares is the best estimator in the class of linear, unbiased estimators – best, that is, in the sense of having the smallest variance for a given sample size. (A simulated race between two such estimators is sketched just after this list.)
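To make the race concrete before the formal treatment, here is a minimal Monte Carlo sketch in Python (an illustration only – the book's own demonstrations use Excel workbooks). It compares two linear estimators of a population average: the sample mean, which weights every observation equally, and a hypothetical diminishing-weights estimator whose weights still sum to one. The population values, sample size, and weighting scheme are assumptions chosen purely for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed population and sample size for this illustration (not from the book):
mu, sigma, n = 100.0, 15.0, 10   # population average, SD of the error box, sample size
reps = 100_000                   # number of simulated samples

# Two linear estimators of mu, i.e., weighted sums of the observations.
equal_w = np.full(n, 1.0 / n)             # sample mean: equal weights
dim_w = np.arange(n, 0, -1, dtype=float)  # hypothetical "diminishing weights" estimator
dim_w /= dim_w.sum()                      # weights still sum to 1, so it is also unbiased

# Draw reps samples of size n from the box model: each observation = mu + error.
samples = mu + sigma * rng.standard_normal((reps, n))
mean_est = samples @ equal_w   # sample mean for every simulated sample
alt_est = samples @ dim_w      # alternative estimator for every simulated sample

print(f"sample mean:         average = {mean_est.mean():.3f}, SD = {mean_est.std():.3f}")
print(f"diminishing weights: average = {alt_est.mean():.3f}, SD = {alt_est.std():.3f}")
# Both averages come out close to mu (both estimators are unbiased), but the
# sample mean has the smaller SD, as the Gauss-Markov theorem predicts.
```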
The last point on the list above is often stated in shorthand as “OLS is BLUE” (best linear unbiased estimator) and is known as the Gauss–Markov theorem, from which this chapter takes its title. This theorem explains the preeminence of the OLS estimator in econometrics.
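For readers who want the claim in symbols right away, here is a compact statement for the bivariate case. The notation (the intercept and slope, the error terms, their common spread, and the weights) anticipates the formal treatment in Sections 14.4 and 14.7 and may differ in detail from the book's own.

```latex
% Classical econometric model, bivariate case: the X's are fixed, and the errors
% are drawn with mean zero, constant SD, and no correlation across observations.
\begin{align*}
Y_i &= \beta_0 + \beta_1 X_i + \varepsilon_i, \qquad i = 1, \dots, n, \\
E(\varepsilon_i) &= 0, \qquad \operatorname{Var}(\varepsilon_i) = \sigma^2, \qquad
\operatorname{Cov}(\varepsilon_i, \varepsilon_j) = 0 \quad (i \neq j).
\end{align*}
% Conclusion: among all linear unbiased estimators of the slope, OLS has the
% smallest variance.
\[
\tilde{\beta}_1 = \sum_{i=1}^{n} w_i Y_i \ \text{unbiased}
\;\Longrightarrow\;
\operatorname{Var}\!\big(\hat{\beta}_1^{\,\mathrm{OLS}}\big)
\le \operatorname{Var}\big(\tilde{\beta}_1\big).
\]
```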
The Gauss–Markov theorem also works in reverse: when the data generating process does not follow the classical econometric model (CEM), ordinary least squares is typically no longer the preferred estimator. Much of econometrics is devoted to pointing out the deficiencies of OLS and finding better estimators when particular requirements of the CEM are violated.
Throughout this chapter, we work with the classical econometric model. To make matters as clear as possible, we begin with a simple problem: estimating the population average for a single variable. This case, considered in Section 14.2, allows us to introduce the notion of linear estimators and to demonstrate that there are many possible estimators for a given population parameter. Section 14.3 races various estimators to show how we decide the winner. Section 14.4 presents a formal proof of the Gauss–Markov theorem for the univariate case. Sections 14.5 and 14.6 consider the more complicated bivariate case. Once again, we will show that there are many possible estimators of the parameters, that some of them are linear (i.e., weighted sums of the dependent variable), and that the OLS estimator is in fact the best estimator in the bivariate case. Finally, Section 14.7 uses the algebra of expectations to present the ideas in this chapter in a more formal way.
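To preview the sense in which OLS itself is a linear estimator, the OLS slope in the bivariate case can be rewritten as a weighted sum of the Y values, with weights built entirely from the X's. This is a standard identity (the bar denotes the sample average of X); the derivation appears in the bivariate sections later in the chapter.

```latex
% The OLS slope is a weighted sum of the dependent variable:
\[
\hat{\beta}_1 = \sum_{i=1}^{n} w_i Y_i,
\qquad
w_i = \frac{X_i - \bar{X}}{\sum_{j=1}^{n} (X_j - \bar{X})^2}.
\]
% The weights depend only on the X's, not on Y -- which is exactly what
% makes OLS a linear estimator.
```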
The Gauss–Markov theorem is a crowning achievement in statistics. The time and effort spent understanding this material are well worth it.
Excel Workbooks