
Econometrics I, MT2

Problem set 1
1) For the model $y = \beta_1 x + u$ (regression through the origin):
a) Define the vertical distance from each point to the line being estimated, and build the sum of squared distances function.
b) Minimize the sum of squared distances to derive the OLS estimator of the slope, and check that the second-order condition is satisfied.
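As a numerical sketch of problem 1 (with hypothetical simulated data, not part of the problem), minimizing $S(b) = \sum_i (y_i - b x_i)^2$ yields the closed-form slope $b_1 = \sum_i x_i y_i / \sum_i x_i^2$, which can be checked directly:

```python
import numpy as np

# Hypothetical data: y = 2x + noise, no intercept.
rng = np.random.default_rng(0)
x = rng.uniform(1, 10, size=200)
y = 2.0 * x + rng.normal(0, 1, size=200)

# First-order condition of S(b) = sum (y_i - b*x_i)^2 is
# sum x_i*(y_i - b*x_i) = 0, giving the closed-form slope:
b1 = np.sum(x * y) / np.sum(x ** 2)

# Second-order condition: S''(b) = 2 * sum x_i^2 > 0, so this is a minimum.
assert np.sum(x ** 2) > 0
print(round(b1, 3))
```

With this sample size the estimate lands very close to the true slope of 2.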
2) Suppose the linear model $y = \beta_0 + \beta_1 x + u$ is estimated by OLS. Show that in this case:
a) $\sum_{i=1}^{n} \hat{y}_i \hat{u}_i = 0$, where $\hat{y}_i$ and $\hat{u}_i$ are the usual fitted values and residuals obtained after OLS estimation. What does this equality imply about the sample correlation between residuals and fitted values?
b) Show that the goodness-of-fit measure $R^2$ is the square of the sample correlation between the observed values of $y$ and the fitted values: $R^2 = \left[\mathrm{Corr}(y, \hat{y})\right]^2$.
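Both claims in problem 2 can be verified numerically. The sketch below uses hypothetical simulated data; the OLS formulas are the standard closed forms:

```python
import numpy as np

# Hypothetical data for the model y = b0 + b1*x + u.
rng = np.random.default_rng(1)
x = rng.normal(5, 2, size=300)
y = 1.0 + 0.5 * x + rng.normal(0, 1, size=300)

# OLS slope and intercept from the usual closed forms.
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x          # fitted values
u_hat = y - y_hat            # residuals

# (a) fitted values and residuals are orthogonal in-sample.
print(np.isclose(np.sum(y_hat * u_hat), 0.0))   # True

# (b) R^2 equals the squared sample correlation of y and y_hat.
r2 = 1 - np.sum(u_hat ** 2) / np.sum((y - y.mean()) ** 2)
corr_sq = np.corrcoef(y, y_hat)[0, 1] ** 2
print(np.isclose(r2, corr_sq))                  # True
```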
3) In the simple linear regression model, show that:
a) $b_1$ can be written as $b_1 = \beta_1 + \sum_{i=1}^{n} w_i u_i$, where $w_i = \dfrac{x_i - \bar{x}}{SST_x}$.
b) Show that $\sum_{i=1}^{n} w_i = 0$ and that, as a result, the correlation between $b_1$ and $\bar{u}$ is zero. Hint: this is equivalent to showing that $E\left[\left(b_1 - E(b_1)\right)\left(\bar{u} - E(\bar{u})\right)\right] = 0$. Note that $\bar{u} = \frac{1}{n}\sum_{i=1}^{n} u_i$, and do not mix up errors with residuals!
c) Show that $b_0$ can be written as $b_0 = \beta_0 + \bar{u} + (\beta_1 - b_1)\bar{x}$.
d) Use the results from above to show that $V(b_0 \mid x) = \dfrac{\sigma^2}{n} + \dfrac{\sigma^2 \bar{x}^2}{SST_x}$.
e) Show that the expression for the variance of $b_0$ derived in part d) is equivalent to $V(b_0 \mid x) = \dfrac{\sigma^2 \, n^{-1} \sum_{i=1}^{n} x_i^2}{SST_x}$.
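Two of the algebraic facts in problem 3 — that the weights $w_i$ sum to zero, and that the variance expressions in parts d) and e) coincide — can be sanity-checked numerically (sample values below are hypothetical):

```python
import numpy as np

# Hypothetical sample of x values; SST_x is the total sum of squares of x.
rng = np.random.default_rng(2)
x = rng.normal(3, 1.5, size=100)
n = x.size
sst_x = np.sum((x - x.mean()) ** 2)
w = (x - x.mean()) / sst_x

# (b) the weights sum to zero.
print(np.isclose(w.sum(), 0.0))   # True

# (d) vs (e): the two variance expressions coincide for any sigma^2.
sigma2 = 1.7  # arbitrary error variance for the check
v_d = sigma2 / n + sigma2 * x.mean() ** 2 / sst_x
v_e = sigma2 * np.sum(x ** 2) / (n * sst_x)
print(np.isclose(v_d, v_e))       # True
```

The equivalence in e) follows from $SST_x = \sum_i x_i^2 - n\bar{x}^2$, which the check above confirms numerically.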
4) Problem 2.2 from Wooldridge (edition 3, p.66).
5) Problem 2.5 from Wooldridge (p.67).
6) Problem 2.9 (i,ii) from Wooldridge (p.68).
7) Show that the estimator of error variance is unbiased (see Proof for Theorem 2.3 in
Wooldridge, p. 62).
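Problem 7 asks for a proof, but the unbiasedness of $\hat{\sigma}^2 = SSR/(n-2)$ can also be illustrated by simulation (a Monte Carlo sketch with hypothetical parameters, not a substitute for the proof):

```python
import numpy as np

# Monte Carlo sketch: the average of sigma_hat^2 = SSR/(n-2) across many
# simulated samples should be close to the true sigma^2.
rng = np.random.default_rng(3)
n, sigma2, reps = 30, 4.0, 5000
x = rng.uniform(0, 10, size=n)  # regressors held fixed across replications
est = np.empty(reps)
for r in range(reps):
    u = rng.normal(0, np.sqrt(sigma2), size=n)
    y = 2.0 + 1.0 * x + u
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    b0 = y.mean() - b1 * x.mean()
    ssr = np.sum((y - b0 - b1 * x) ** 2)
    est[r] = ssr / (n - 2)   # dividing by n-2 corrects the bias
print(round(est.mean(), 2))  # close to sigma2 = 4.0
```

Dividing by $n$ instead of $n-2$ would produce a systematically low average, since two degrees of freedom are used up estimating $b_0$ and $b_1$.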
8) Show that the estimator that connects the first and the last observations in the dataset in fact has larger variance than the OLS estimator $b_1$.
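The variance comparison in problem 8 can be previewed by simulation. By the Gauss-Markov theorem OLS is BLUE, so its sampling variance should be the smaller of the two; the setup below is hypothetical:

```python
import numpy as np

# Monte Carlo sketch: compare the sampling variance of the
# "connect the endpoints" slope estimator with that of OLS.
rng = np.random.default_rng(4)
n, reps = 25, 10000
x = np.sort(rng.uniform(0, 10, size=n))  # fixed, sorted regressors
b_ols = np.empty(reps)
b_endpoints = np.empty(reps)
for r in range(reps):
    y = 1.0 + 0.5 * x + rng.normal(0, 1, size=n)
    b_ols[r] = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    b_endpoints[r] = (y[-1] - y[0]) / (x[-1] - x[0])

# OLS is BLUE, so its variance should be the smaller one.
print(b_ols.var() < b_endpoints.var())   # True
```

The endpoint estimator uses only two data points, so its conditional variance is $2\sigma^2/(x_n - x_1)^2$, which exceeds the OLS variance $\sigma^2/SST_x$.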
