Sheet 3 Multicollinearity

The document is a worksheet for a course on Econometrics, focusing on the concept of multicollinearity in regression analysis. It includes true or false statements regarding multicollinearity, problems related to regression results, and interpretation of parameters. The second part contains specific regression examples and questions aimed at assessing understanding of multicollinearity and its implications in econometric modeling.


Faculty of Economics and Political Science

Economics Department
Third Year (Arabic Section)
Econometrics I
TA. Heba Mohamed

Sheet 3
Multicollinearity

First Part: State whether the following statements are True or False,
with brief justification.

1. In the simple regression model, multicollinearity refers to exact linear
relationships between regressors.
2. If multicollinearity is perfect, the regression coefficients are
determinate and their standard errors are zero.
3. If Z = X² (the square of X), there will be multicollinearity if we use
Z and X as regressors.
4. Despite perfect multicollinearity, OLS estimators are BLUE.
5. The only effect of multicollinearity is to make it hard to get
coefficient estimates with small standard error.
6. Having a small number of observations leads to estimators with small
standard errors.
7. Multicollinearity, small number of observations, and small variance
of the independent variables are all the same problem.
8. Micronumerosity is the counterpart of multicollinearity.
9. Exact micronumerosity arises when n, the sample size, is infinity.
10. Near micronumerosity arises when the number of observations barely
exceeds the number of parameters to be estimated.
11. Multicollinearity is essentially a population phenomenon.
12. The larger the variability in a regressor, the greater the precision with
which its coefficient can be estimated.
13. In the case of multicollinearity, the probability of failing to reject a
false hypothesis (type II error) increases.
14. In situations of extreme multicollinearity, dropping the highly collinear
variable will often make the other X variable statistically significant.
15. If the VIF between variables exceeds 10, these variables are said to
be highly collinear.
16. The closer TOLj is to 1, the greater the degree of collinearity of that
variable with the other regressors.
17. The TOL is a better measure of multicollinearity than the VIF.

18. You will not obtain a high R2 value in a multiple regression if all the
partial slope coefficients are individually statistically insignificant on
the basis of the usual t test.
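Statements 15-17 above rest on the definitions of the VIF and TOL diagnostics. As an illustrative sketch (not part of the worksheet; the simulated data and variable names are my own assumptions), the VIF of a regressor can be computed by regressing it on the remaining regressors and taking 1/(1 - R²) of that auxiliary fit, with TOL as its reciprocal:

```python
import numpy as np

def vif(X, j):
    """VIF of regressor j: regress column j on the remaining columns
    (with an intercept) and return 1 / (1 - R^2) of that auxiliary fit."""
    y = X[:, j]
    A = np.column_stack([np.ones(X.shape[0]), np.delete(X, j, axis=1)])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    tss = np.sum((y - y.mean()) ** 2)
    r2 = 1.0 - np.sum(resid ** 2) / tss
    return 1.0 / (1.0 - r2)

# Simulated data: x2 is almost a copy of x1, while x3 is unrelated.
rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.1, size=n)
x3 = rng.normal(size=n)
X = np.column_stack([x1, x2, x3])

for j in range(3):
    v = vif(X, j)
    print(f"x{j + 1}: VIF = {v:8.2f}   TOL = {1 / v:.4f}")
```

Comparing the printed values for the nearly collinear pair (x1, x2) against the unrelated x3 makes it easy to check the claims in statements 15-17 against the actual behaviour of the two measures.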

Second Part: Problems.

1. From the annual data for the U.S. manufacturing sector for 1899-
1992, the researcher obtained the following regression results:

a. Is there multicollinearity in regression (1)? How do you know?
b. In regression (1), what is the a priori sign of log K? Do the
results conform to this expectation? Why or why not?
c. How would you justify the functional form of regression
(1)? (Hint: Cobb-Douglas production function.)
d. Interpret regression (1). What is the role of the trend variable
in this regression?
e. What is the logic behind estimating regression (2)?
f. If there was multicollinearity in regression (1), has that been
reduced by regression (2)? How do you know?
g. If regression (2) is a restricted version of regression (1), what
restriction is imposed by the author? (Hint: returns to scale.)
How do you know if this restriction is valid? Which test do
you use? Show all your calculations.

2. If we have the following regression results:

Yi = 24.37 + 0.94 X2i - 0.04 X3i

(se)   (16.752)   (0.822)    (0.080)
(t)    (3.6690)   (1.1442)   (-0.522)

R2 = 0.963    n = 10

where:

Yi : consumption expenditure
X2i: disposable income
X3i: wealth

a. Interpret the estimated parameters.


b. Is there a multicollinearity problem? How do you know?
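Problem 2 displays the classic multicollinearity symptom: a very high R² alongside individually insignificant t-ratios. As a quick sketch (using only the R² and n reported above, and assuming k = 3 estimated parameters, i.e. the intercept plus two slopes), the overall-significance F statistic can be recovered directly from R²:

```python
# Overall significance from the reported summary statistics:
# F = (R^2 / (k - 1)) / ((1 - R^2) / (n - k)).
# R^2 = 0.963 and n = 10 come from the sheet; k = 3 is implied by the model.
r2, n, k = 0.963, 10, 3
F = (r2 / (k - 1)) / ((1 - r2) / (n - k))
print(f"F = {F:.2f}")  # ~91, compare with F(2, 7) critical values
```

An F of about 91 is far beyond conventional F(2, 7) critical values, even though neither slope is individually significant by its t-ratio, which is exactly the pattern statement 18 asks about.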

3. If you have the following correlation matrix:

1         0.9742    0.9284
          1         0.9872
                    1

a. "Since the zero-order correlations are very high, there must be
serious multicollinearity." Comment.
b. Would you drop variables X2 and X3 from the model?
c. If you drop them, what will happen to the coefficient of X1?
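When an auxiliary regression has only one other regressor, its R² is just the squared pairwise correlation, so VIF and TOL follow from a single entry of the matrix. A minimal sketch, assuming the entry 0.9872 is the correlation between the two regressors in question:

```python
# VIF from a pairwise correlation: with two regressors, the auxiliary
# R_j^2 equals r^2, so TOL = 1 - r^2 and VIF = 1 / TOL.
r = 0.9872                # off-diagonal entry from the matrix above
tol = 1.0 - r ** 2
vif = 1.0 / tol
print(f"TOL = {tol:.4f}   VIF = {vif:.2f}")
```

The resulting VIF of about 39 is well above the rule-of-thumb cutoff of 10, though note that high zero-order correlation is sufficient but not necessary evidence of multicollinearity, which is the point part (a) probes.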
