© The McGraw-Hill Companies, Inc., 2008. McGraw-Hill/Irwin
Course code: Eco-601
Class: BS 7th (2017-21)
Course Presenter
Mahwish Arshad
In the two-variable model (the consumption–income example) we assumed that only income X affects consumption Y. But besides income, a number of other variables are also likely to affect consumption expenditure. An obvious example is the wealth of the consumer.
As another example, the demand for a commodity is
likely to depend not only on its own price but also on the
prices of other competing or complementary goods,
income of the consumer, social status, etc. Therefore,
we need to extend our simple two-variable regression
model to cover models involving more than two
variables. Adding more variables leads us to the
discussion of multiple regression models, that is, models
in which the dependent variable, or regressand, Y
depends on two or more explanatory variables, or
regressors.
7.1 THE THREE-VARIABLE MODEL

The simplest possible multiple regression model is the three-variable regression, with one dependent variable and two explanatory variables:

Yi = β1 + β2X2i + β3X3i + ui    (7.1.1)
In Eq. (7.1.1) β1 is the intercept term. As
usual, it gives the mean or average effect on
Y of all the variables excluded from the
model, although its mechanical interpretation
is the average value of Y when X2 and X3
are set equal to zero.
The coefficients β2 and β3 are called the partial regression coefficients.
We continue to operate within the framework of the classical linear regression model (CLRM) first introduced in Chapter 3. Specifically, we assume the following:
Zero mean value of ui, or E(ui | X2i, X3i) = 0 for each i    (7.1.2)
No serial correlation, or cov(ui, uj) = 0, i ≠ j    (7.1.3)
Homoscedasticity, or var(ui) = σ²    (7.1.4)
Zero covariance between ui and each X variable, or cov(ui, X2i) = cov(ui, X3i) = 0    (7.1.5)
No specification bias, or the model is correctly specified    (7.1.6)
No exact collinearity between the X variables, or no exact linear relationship between X2 and X3    (7.1.7)
In addition, as in Chapter 3, we assume that the
multiple regression model is linear in the
parameters, that the values of the regressors are
fixed in repeated sampling, and that there is
sufficient variability in the values of the regressors.
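To make these assumptions concrete, the following is a minimal Python/numpy sketch (with made-up parameter values, not taken from these slides) that generates data consistent with them: regressors with sufficient variability and no exact linear relationship, and zero-mean, homoscedastic, serially uncorrelated errors.

import numpy as np

rng = np.random.default_rng(0)

n = 100
beta1, beta2, beta3 = 2.0, 0.6, 0.25   # hypothetical parameter values
sigma = 1.5                            # common error standard deviation

# Regressors: sufficient variability, no exact linear relationship
X2 = rng.uniform(10, 50, n)            # e.g., income
X3 = rng.uniform(5, 100, n)            # e.g., wealth

# Errors: zero mean, homoscedastic (var = sigma^2), independent across i
u = rng.normal(0.0, sigma, n)

# The PRF of Eq. (7.1.1)
Y = beta1 + beta2 * X2 + beta3 * X3 + u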
Assumption (7.1.7) requires that there be no exact linear relationship between X2 and X3; this is known technically as the assumption of no multicollinearity.
Suppose that in (7.1.1) Y, X2, and X3 represent
consumption expenditure, income, and wealth of the
consumer, respectively.
If there is an exact linear relationship between income
and wealth, we have only one independent variable, not
two, and there is no way to assess the separate
influence of income and wealth on consumption.
To see this clearly, let X3i = 2X2i in the consumption–income–wealth regression. Then the regression (7.1.1) becomes

Yi = β1 + β2X2i + β3X3i + ui
   = β1 + β2X2i + β3(2X2i) + ui
   = β1 + (β2 + 2β3)X2i + ui    (7.1.10)
   = β1 + αX2i + ui

where α = β2 + 2β3. We in fact have a two-variable and not a three-variable regression.
There is no way to estimate the separate influence
of X2 (= β2) and X3 (= β3) on Y, for α gives the
combined influence of X2 and X3 on Y.
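A short numpy sketch (illustrative numbers only) shows the same point numerically: when X3i = 2X2i, the cross-product matrix X'X is singular, so the OLS normal equations have no unique solution.

import numpy as np

rng = np.random.default_rng(1)
n = 50
X2 = rng.uniform(10, 50, n)
X3 = 2 * X2                          # exact linear relationship

# Design matrix with an intercept column
X = np.column_stack([np.ones(n), X2, X3])

# X'X is singular: its rank is 2 rather than 3, so (X'X)^(-1)
# does not exist and beta2 and beta3 cannot be estimated separately
print(np.linalg.matrix_rank(X.T @ X))   # 2
print(np.linalg.det(X.T @ X))           # effectively zero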
The assumption of no multicollinearity pertains to our theoretical (i.e., PRF) model. In practice, when we collect data for empirical analysis there is no guarantee that there will not be correlations among the regressors. What we require is that there be no exact linear relationships among the regressors.
Keep in mind that we are talking only about perfect linear relationships between two or more variables. Multicollinearity does not rule out nonlinear relationships between variables. For example, suppose X3i = X2i². This does not violate the assumption of no perfect collinearity, as the relationship between the variables here is nonlinear.
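A quick numerical check (again with made-up values): with X3i = X2i² the design matrix still has full column rank, so both coefficients remain estimable even though X2 and X3 are highly correlated.

import numpy as np

rng = np.random.default_rng(2)
n = 50
X2 = rng.uniform(1, 10, n)
X3 = X2 ** 2                         # exact but nonlinear relationship

X = np.column_stack([np.ones(n), X2, X3])

# Full column rank: no perfect (linear) collinearity
print(np.linalg.matrix_rank(X))      # 3
print(np.corrcoef(X2, X3)[0, 1])     # high, but strictly less than 1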
PARTIAL REGRESSION COEFFICIENTS
The partial regression coefficient β2 measures the change in the mean value of Y per unit change in X2, holding the value of X3 constant (and similarly for β3). How do we actually go about holding the influence of a regressor constant?
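One way to make "holding X3 constant" operational is the partialling-out idea (the Frisch–Waugh result). The numpy sketch below, using simulated data with hypothetical true coefficients, purges X3's influence from both Y and X2 and then regresses residual on residual; the resulting slope is the partial coefficient β̂2.

import numpy as np

rng = np.random.default_rng(3)
n = 200
X2 = rng.uniform(10, 50, n)
X3 = 5 + 0.8 * X2 + rng.normal(0, 5, n)        # correlated with X2, but not perfectly
Y = 2.0 + 0.6 * X2 + 0.25 * X3 + rng.normal(0, 1.5, n)

def residuals(v, w):
    # Residuals from an OLS regression of v on a constant and w
    W = np.column_stack([np.ones(len(w)), w])
    coef, *_ = np.linalg.lstsq(W, v, rcond=None)
    return v - W @ coef

# Remove the influence of X3 from Y and from X2, then regress
# residual on residual: the slope is beta2-hat from the full regression
e_y = residuals(Y, X3)
e_2 = residuals(X2, X3)
beta2_hat = (e_2 @ e_y) / (e_2 @ e_2)
print(beta2_hat)                               # close to the true 0.6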
7.4 OLS ESTIMATION OF THE PARTIAL REGRESSION COEFFICIENTS

OLS Estimators

The sample regression function (SRF) corresponding to the PRF of (7.1.1) is

Yi = β̂1 + β̂2X2i + β̂3X3i + ûi
The OLS estimator of the population intercept β1 is

β̂1 = Ȳ − β̂2X̄2 − β̂3X̄3    (7.4.6)

The following are the OLS estimators of the population partial regression coefficients β2 and β3, respectively, written in deviation form (yi = Yi − Ȳ, x2i = X2i − X̄2, x3i = X3i − X̄3):

β̂2 = [(Σyix2i)(Σx3i²) − (Σyix3i)(Σx2ix3i)] / [(Σx2i²)(Σx3i²) − (Σx2ix3i)²]    (7.4.7)

β̂3 = [(Σyix3i)(Σx2i²) − (Σyix2i)(Σx2ix3i)] / [(Σx2i²)(Σx3i²) − (Σx2ix3i)²]    (7.4.8)
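These formulas map directly into numpy. The sketch below (simulated data with hypothetical true values β1 = 2.0, β2 = 0.6, β3 = 0.25) evaluates (7.4.6)–(7.4.8) in deviation form.

import numpy as np

rng = np.random.default_rng(4)
n = 200
X2 = rng.uniform(10, 50, n)
X3 = rng.uniform(5, 100, n)
Y = 2.0 + 0.6 * X2 + 0.25 * X3 + rng.normal(0, 1.5, n)

# Deviation form
y, x2, x3 = Y - Y.mean(), X2 - X2.mean(), X3 - X3.mean()

# Common denominator of (7.4.7) and (7.4.8)
D = (x2 @ x2) * (x3 @ x3) - (x2 @ x3) ** 2

b2 = ((y @ x2) * (x3 @ x3) - (y @ x3) * (x2 @ x3)) / D   # (7.4.7)
b3 = ((y @ x3) * (x2 @ x2) - (y @ x2) * (x2 @ x3)) / D   # (7.4.8)
b1 = Y.mean() - b2 * X2.mean() - b3 * X3.mean()          # (7.4.6)
print(b1, b2, b3)   # close to 2.0, 0.6, 0.25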
Variances and Standard Errors of OLS Estimators

var(β̂2) = σ²(Σx3i²) / [(Σx2i²)(Σx3i²) − (Σx2ix3i)²]    (7.4.12)

or, equivalently,

var(β̂2) = σ² / [Σx2i²(1 − r23²)]    (7.4.13)

where r23 is the sample coefficient of correlation between X2 and X3. Similarly,

var(β̂3) = σ²(Σx2i²) / [(Σx2i²)(Σx3i²) − (Σx2ix3i)²]    (7.4.15)

or, equivalently,

var(β̂3) = σ² / [Σx3i²(1 − r23²)]    (7.4.16)

In each case the standard error is the positive square root of the corresponding variance.
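The variance formulas can be evaluated the same way. The sketch below (same simulated setup as before; σ² is replaced by the unbiased estimator σ̂² introduced in the next section) checks that the two equivalent forms of var(β̂2) give the same standard error.

import numpy as np

rng = np.random.default_rng(4)
n = 200
X2 = rng.uniform(10, 50, n)
X3 = rng.uniform(5, 100, n)
Y = 2.0 + 0.6 * X2 + 0.25 * X3 + rng.normal(0, 1.5, n)
y, x2, x3 = Y - Y.mean(), X2 - X2.mean(), X3 - X3.mean()

D = (x2 @ x2) * (x3 @ x3) - (x2 @ x3) ** 2
b2 = ((y @ x2) * (x3 @ x3) - (y @ x3) * (x2 @ x3)) / D
b3 = ((y @ x3) * (x2 @ x2) - (y @ x2) * (x2 @ x3)) / D

# Estimate sigma^2 from the residuals with df = n - 3, as in (7.4.18)
u_hat = y - b2 * x2 - b3 * x3
sigma2_hat = (u_hat @ u_hat) / (n - 3)

# (7.4.12) and the equivalent form (7.4.13) in terms of r23
r23 = (x2 @ x3) / np.sqrt((x2 @ x2) * (x3 @ x3))
var_b2 = sigma2_hat * (x3 @ x3) / D
var_b2_alt = sigma2_hat / ((x2 @ x2) * (1 - r23 ** 2))
print(np.sqrt(var_b2), np.sqrt(var_b2_alt))   # se(beta2-hat): the two forms agree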
An unbiased estimator of σ² is given by

σ̂² = Σûi² / (n − 3)    (7.4.18)

The estimator σ̂² can be computed from (7.4.18) once the residuals are available, where

ûi² = (yi − ŷi)²    (1)

In multiple regression,

Ŷi = β̂1 + β̂2X2i + β̂3X3i, or, in deviation form, ŷi = β̂2x2i + β̂3x3i    (2)

Substituting (2) into (1), we get

ûi² = (yi − β̂2x2i − β̂3x3i)²

and summing over all observations gives

Σûi² = Σyi² − β̂2Σyix2i − β̂3Σyix3i
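A quick numerical check of this shortcut and of (7.4.18), using the same simulated setup as in the earlier sketches:

import numpy as np

rng = np.random.default_rng(4)
n = 200
X2 = rng.uniform(10, 50, n)
X3 = rng.uniform(5, 100, n)
Y = 2.0 + 0.6 * X2 + 0.25 * X3 + rng.normal(0, 1.5, n)
y, x2, x3 = Y - Y.mean(), X2 - X2.mean(), X3 - X3.mean()

D = (x2 @ x2) * (x3 @ x3) - (x2 @ x3) ** 2
b2 = ((y @ x2) * (x3 @ x3) - (y @ x3) * (x2 @ x3)) / D
b3 = ((y @ x3) * (x2 @ x2) - (y @ x2) * (x2 @ x3)) / D

# Residual sum of squares, computed two ways
u_hat = y - b2 * x2 - b3 * x3
rss_direct = u_hat @ u_hat
rss_shortcut = y @ y - b2 * (y @ x2) - b3 * (y @ x3)
print(np.isclose(rss_direct, rss_shortcut))    # True

# Unbiased estimator of sigma^2, Eq. (7.4.18)
sigma2_hat = rss_direct / (n - 3)
print(sigma2_hat)                              # close to 1.5**2 = 2.25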
End of the Lecture