This document presents a proof of the Gauss-Markov theorem: under the Gauss-Markov linear model, the ordinary least squares (OLS) estimator of an estimable linear function is the best linear unbiased estimator (BLUE), i.e., it has strictly smaller variance than any other linear unbiased estimator of that function. The proof shows that the variance of any competing linear unbiased estimator exceeds the variance of the OLS estimator by exactly the variance of the difference between the two estimators, which is strictly positive; hence the OLS estimator has the smallest possible variance and is the BLUE.


Slide 1: Proof of the Gauss-Markov Theorem

Copyright 2012 Dan Nettleton (Iowa State University), Statistics 511

Slide 2: The Gauss-Markov Theorem

Under the Gauss-Markov linear model, the OLS estimator $c'\hat{\beta}$ of an estimable linear function $c'\beta$ is the unique Best Linear Unbiased Estimator (BLUE), in the sense that $\text{Var}(c'\hat{\beta})$ is strictly less than the variance of any other linear unbiased estimator of $c'\beta$.
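As a numerical illustration of the theorem, here is a minimal Monte Carlo sketch in numpy; the design matrix, true parameters, and the competing estimator are illustrative assumptions, not part of the slides. It constructs the OLS coefficient vector $\ell$ and a second linear unbiased estimator $d'y$, then compares empirical variances over simulated data sets.

    import numpy as np

    rng = np.random.default_rng(0)
    X = np.column_stack([np.ones(5), np.arange(5.0)])   # assumed full-rank design (n=5, p=2)
    beta, sigma = np.array([2.0, -1.0]), 1.0            # assumed true parameters
    c = np.array([1.0, 1.0])                            # target function c'beta = beta0 + beta1
    ell = X @ np.linalg.inv(X.T @ X) @ c                # OLS coefficients: ell'y = c'betahat
    P = X @ np.linalg.inv(X.T @ X) @ X.T                # projection onto col(X)
    d = ell + (np.eye(5) - P) @ rng.standard_normal(5)  # competitor: still satisfies d'X = c'
    assert np.allclose(ell @ X, c) and np.allclose(d @ X, c)  # both estimators are unbiased
    y = X @ beta + sigma * rng.standard_normal((200_000, 5))  # one simulated response per row
    print((y @ ell).var(), (y @ d).var())               # OLS variance is strictly smaller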

Slide 3: Unbiased Linear Estimators of $c'\beta$

- If $a$ is a fixed vector, then $a'y$ is a linear function of $y$.
- An estimator that is a linear function of $y$ is said to be a linear estimator.
- A linear estimator $a'y$ is an unbiased estimator of $c'\beta$ if and only if
  $$E(a'y) = c'\beta \ \forall \beta \in \mathbb{R}^p
  \iff a'E(y) = c'\beta \ \forall \beta \in \mathbb{R}^p
  \iff a'X\beta = c'\beta \ \forall \beta \in \mathbb{R}^p
  \iff a'X = c'.$$
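A minimal numerical sketch of this equivalence, in which the design matrix $X$, the vector $a$, and the perturbed vector $b$ are hypothetical choices: since $E(y) = X\beta$, the mean of $a'y$ is $a'X\beta$, which equals $c'\beta$ for every $\beta$ exactly when $a'X = c'$.

    import numpy as np

    rng = np.random.default_rng(1)
    X = np.column_stack([np.ones(4), np.arange(4.0)])  # assumed design matrix (n=4, p=2)
    a = np.array([0.5, 0.5, 0.0, 0.0])                 # coefficients of a linear estimator a'y
    c = a @ X                                          # define c' = a'X, so a'y is unbiased
    for beta in rng.standard_normal((3, 2)):
        assert np.isclose(a @ (X @ beta), c @ beta)    # E(a'y) = a'X beta = c'beta for any beta
    b = a + np.array([1.0, 0.0, 0.0, 0.0])             # now b'X != c' ...
    beta = np.array([1.0, 0.0])
    print(b @ (X @ beta) - c @ beta)                   # ... so b'y is biased at this beta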

Slide 4: The OLS Estimator of $c'\beta$ Is a Linear Estimator

- We have previously defined the Ordinary Least Squares (OLS) estimator of an estimable $c'\beta$ by $c'\hat{\beta}$, where $\hat{\beta}$ is any solution to the normal equations $X'Xb = X'y$.
- We have previously shown that $c'\hat{\beta}$ is the same for any $\hat{\beta}$ that solves the normal equations.
- We have previously shown that $(X'X)^- X'y$ is a solution to the normal equations for any generalized inverse $(X'X)^-$ of $X'X$.
- Thus $c'\hat{\beta} = c'(X'X)^- X'y = \ell'y$, where $\ell' = c'(X'X)^- X'$, so $c'\hat{\beta}$ is a linear estimator.
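The last two points can be checked numerically; this is a minimal sketch assuming a rank-deficient one-way-model design, a made-up response $y$, and two particular generalized inverses (Moore-Penrose, and an invert-a-nonsingular-block construction).

    import numpy as np

    X = np.array([[1., 1., 0.],
                  [1., 1., 0.],
                  [1., 0., 1.],
                  [1., 0., 1.]])                 # rank 2, so X'X is singular
    A = X.T @ X
    G1 = np.linalg.pinv(A)                       # Moore-Penrose: one generalized inverse
    G2 = np.zeros((3, 3))
    G2[:2, :2] = np.linalg.inv(A[:2, :2])        # invert a nonsingular block, pad with zeros
    assert np.allclose(A @ G1 @ A, A) and np.allclose(A @ G2 @ A, A)  # both satisfy AGA = A
    y = np.array([1.0, 2.0, 3.0, 5.0])           # illustrative response
    b1, b2 = G1 @ X.T @ y, G2 @ X.T @ y          # two different solutions of X'Xb = X'y
    c = np.array([1.0, 1.0, 0.0])                # estimable: c' = a'X with a = (1, 0, 0, 0)
    print(c @ b1, c @ b2)                        # same value (1.5) even though b1 != b2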

Slide 5: $c'\hat{\beta}$ Is an Unbiased Estimator of an Estimable $c'\beta$

- By definition, $c'\beta$ is estimable if and only if there exists a linear unbiased estimator of $c'\beta$.
- It follows from slide 3 that $c'\beta$ is estimable if and only if $c' = a'X$ for some vector $a$.
- If $c'\beta$ is estimable, then
  $$\ell'X = c'(X'X)^- X'X = a'X(X'X)^- X'X = a'P_X X = a'X = c'.$$
- Thus, by slide 3, $c'\hat{\beta} = \ell'y$ is an unbiased estimator of $c'\beta$ whenever $c'\beta$ is estimable.
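A minimal check of the identity $\ell'X = c'$, reusing the hypothetical rank-deficient design from the previous sketch; the non-estimable counterexample is an added illustration of why estimability matters.

    import numpy as np

    X = np.array([[1., 1., 0.],
                  [1., 1., 0.],
                  [1., 0., 1.],
                  [1., 0., 1.]])
    G = np.linalg.pinv(X.T @ X)                # one generalized inverse of X'X
    a = np.array([1.0, 0.0, 0.0, 0.0])
    c = a @ X                                  # estimable: c' = a'X = (1, 1, 0)
    ell = c @ G @ X.T                          # ell' = c'(X'X)^- X'
    print(np.allclose(ell @ X, c))             # True: by slide 3, ell'y is unbiased for c'beta
    c_bad = np.array([0.0, 1.0, 0.0])          # beta1 alone is not estimable in this model
    ell_bad = c_bad @ G @ X.T
    print(np.allclose(ell_bad @ X, c_bad))     # False: the identity fails for non-estimable c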

Slide 6: Proof of the Gauss-Markov Theorem

- Suppose $d'y$ is any linear unbiased estimator other than the OLS estimator $c'\hat{\beta} = \ell'y$.
- Then we know the following (both facts are checked numerically in the sketch after this slide):
  1. $d \ne \ell \implies \|d - \ell\|^2 = (d - \ell)'(d - \ell) > 0$, and
  2. $d'X = \ell'X = c' \implies d'X - \ell'X = 0' \implies (d - \ell)'X = 0'$.
- We need to show $\text{Var}(d'y) > \text{Var}(c'\hat{\beta})$.
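A minimal numerical sketch of facts (1) and (2), assuming the same illustrative full-rank design as before; any unbiased competitor has the form $d = \ell + z$ with $z'X = 0'$, which is how $d$ is built here.

    import numpy as np

    rng = np.random.default_rng(2)
    X = np.column_stack([np.ones(5), np.arange(5.0)])   # assumed full-rank design
    c = np.array([1.0, 1.0])
    ell = X @ np.linalg.inv(X.T @ X) @ c                # OLS coefficient vector
    P = X @ np.linalg.inv(X.T @ X) @ X.T                # projection onto col(X)
    d = ell + (np.eye(5) - P) @ rng.standard_normal(5)  # unbiased competitor with d != ell
    print((d - ell) @ (d - ell) > 0)                    # fact (1): ||d - ell||^2 > 0
    print(np.allclose((d - ell) @ X, 0.0))              # fact (2): (d - ell)'X = 0'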

Slide 7: Proof of the Gauss-Markov Theorem

$$\begin{aligned}
\text{Var}(d'y) &= \text{Var}(d'y - c'\hat{\beta} + c'\hat{\beta}) \\
&= \text{Var}(d'y - c'\hat{\beta}) + \text{Var}(c'\hat{\beta}) + 2\,\text{Cov}(d'y - c'\hat{\beta},\ c'\hat{\beta}).
\end{aligned}$$

$$\begin{aligned}
\text{Var}(d'y - c'\hat{\beta}) &= \text{Var}(d'y - \ell'y) = \text{Var}((d' - \ell')y) = \text{Var}((d - \ell)'y) \\
&= (d - \ell)'\,\text{Var}(y)\,(d - \ell) = (d - \ell)'(\sigma^2 I)(d - \ell) \\
&= \sigma^2 (d - \ell)'(d - \ell) > 0 \quad \text{by (1)}.
\end{aligned}$$

$$\begin{aligned}
\text{Cov}(d'y - c'\hat{\beta},\ c'\hat{\beta}) &= \text{Cov}(d'y - \ell'y,\ \ell'y) = \text{Cov}((d - \ell)'y,\ \ell'y) \\
&= (d - \ell)'\,\text{Var}(y)\,\ell = \sigma^2 (d - \ell)'\ell \\
&= \sigma^2 (d - \ell)'X[(X'X)^-]'c = 0 \quad \text{by (2)}.
\end{aligned}$$
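Each step above can be verified with the closed-form variance $\text{Var}(a'y) = \sigma^2 a'a$; a minimal sketch, with the design, $\sigma^2$, and the competitor $d$ assumed as in the earlier sketches.

    import numpy as np

    rng = np.random.default_rng(3)
    X = np.column_stack([np.ones(5), np.arange(5.0)])    # assumed full-rank design
    c = np.array([1.0, 1.0])
    ell = X @ np.linalg.inv(X.T @ X) @ c                 # OLS coefficient vector
    P = X @ np.linalg.inv(X.T @ X) @ X.T
    d = ell + (np.eye(5) - P) @ rng.standard_normal(5)   # unbiased competitor
    sigma2 = 2.0                                         # assumed error variance
    var_d = sigma2 * d @ d                               # Var(d'y) = sigma^2 d'd
    var_diff = sigma2 * (d - ell) @ (d - ell)            # Var(d'y - c'betahat) > 0, by (1)
    cov = sigma2 * (d - ell) @ ell                       # Cov(d'y - c'betahat, c'betahat)
    print(np.isclose(cov, 0.0))                          # True: 0 by (2)
    print(np.isclose(var_d, var_diff + sigma2 * ell @ ell))  # decomposition holds exactly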

Slide 8: Proof of the Gauss-Markov Theorem

It follows that
$$\text{Var}(d'y) = \text{Var}(d'y - c'\hat{\beta}) + \text{Var}(c'\hat{\beta}) > \text{Var}(c'\hat{\beta}). \qquad \square$$
