
Chapter Three

THE CLASSICAL REGRESSION ANALYSIS


Multiple Linear Regression Model

By: Habtamu Legese, 2022


3.1 Introduction
• In simple regression we study the relationship between a dependent variable and a single explanatory variable. But it is rarely the case that economic relationships involve just two variables.
• Rather, a dependent variable Y can depend on a whole series of explanatory variables, or regressors.
• For instance, in demand studies we study the relationship between the quantity demanded of a good and the price of the good, the prices of substitute goods, and the consumer's income.
• The model we assume is:

Y = β0 + β1X1 + β2X2 + U

where Y is the dependent variable, X1 and X2 are the explanatory variables, and U is the stochastic disturbance term.
3.2 Assumptions of Multiple Regression Model
• In order to specify our multiple linear regression model and proceed with our analysis of it, some assumptions are compulsory.
• These assumptions are the same as those of the single-explanatory-variable model developed earlier, except for the added assumption of no perfect multicollinearity.
These assumptions are:
• The error term has zero mean: E(Ui) = 0.
• Homoscedasticity: Var(Ui) = σ² is constant for all i.
• No autocorrelation: Cov(Ui, Uj) = 0 for i ≠ j.
• The explanatory variables are non-stochastic (fixed in repeated samples) and uncorrelated with the error term.
• No perfect multicollinearity: there is no exact linear relationship among the explanatory variables.
• The error term is normally distributed: Ui ~ N(0, σ²).
3.3.1 Estimation of parameters of the two-explanatory-variables model
• For the model Y = β0 + β1X1 + β2X2 + U, the OLS estimators are obtained by minimizing the sum of squared residuals Σei² with respect to β̂0, β̂1 and β̂2, which yields the normal equations.
• We know that, writing the variables in deviation form (y = Y − Ȳ, x1 = X1 − X̄1, x2 = X2 − X̄2), solving the normal equations gives:

β̂1 = [Σx1y·Σx2² − Σx2y·Σx1x2] / [Σx1²·Σx2² − (Σx1x2)²]
β̂2 = [Σx2y·Σx1² − Σx1y·Σx1x2] / [Σx1²·Σx2² − (Σx1x2)²]
β̂0 = Ȳ − β̂1X̄1 − β̂2X̄2
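• A minimal numerical sketch of these formulas (not part of the original slides): the data below are made up and generated from known coefficients purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50

# Made-up data generating process: Y = 10 + 2*X1 - 0.5*X2 + U  (illustrative only)
X1 = rng.uniform(1, 10, n)
X2 = rng.uniform(20, 80, n)
U = rng.normal(0, 1, n)
Y = 10 + 2 * X1 - 0.5 * X2 + U

# Work in deviations from the sample means
y, x1, x2 = Y - Y.mean(), X1 - X1.mean(), X2 - X2.mean()

# OLS estimators in deviation form (two-explanatory-variable case)
denom = (x1**2).sum() * (x2**2).sum() - (x1 * x2).sum()**2
b1 = ((x1 * y).sum() * (x2**2).sum() - (x2 * y).sum() * (x1 * x2).sum()) / denom
b2 = ((x2 * y).sum() * (x1**2).sum() - (x1 * y).sum() * (x1 * x2).sum()) / denom
b0 = Y.mean() - b1 * X1.mean() - b2 * X2.mean()

print(f"b0 = {b0:.3f}, b1 = {b1:.3f}, b2 = {b2:.3f}")  # should be close to 10, 2, -0.5
```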
Variance of the parameters
(See Gujarati, Basic Econometrics, Fourth Edition, page 208.)
• With k = 3 estimated parameters, an unbiased estimator of the error variance is σ̂² = Σei²/(n − 3).
• The variances of the slope estimators in the two-explanatory-variables case are:

var(β̂1) = σ̂²·Σx2² / [Σx1²·Σx2² − (Σx1x2)²]
var(β̂2) = σ̂²·Σx1² / [Σx1²·Σx2² − (Σx1x2)²]

• The standard error of each estimator is the square root of its variance.
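• A short sketch of these variance formulas in Python (not from the original slides); it assumes the arrays and estimates from the previous illustrative example are available.

```python
import numpy as np

def coef_variances(Y, X1, X2, b0, b1, b2):
    """Error variance and slope-estimator variances for the two-regressor model."""
    n, k = len(Y), 3
    e = Y - (b0 + b1 * X1 + b2 * X2)        # OLS residuals
    sigma2_hat = (e**2).sum() / (n - k)     # unbiased estimator of the error variance

    x1, x2 = X1 - X1.mean(), X2 - X2.mean()
    denom = (x1**2).sum() * (x2**2).sum() - (x1 * x2).sum()**2
    var_b1 = sigma2_hat * (x2**2).sum() / denom
    var_b2 = sigma2_hat * (x1**2).sum() / denom
    return sigma2_hat, var_b1, var_b2

# Example (continuing the previous sketch):
# sigma2_hat, var_b1, var_b2 = coef_variances(Y, X1, X2, b0, b1, b2)
# standard errors are np.sqrt(var_b1) and np.sqrt(var_b2)
```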
3.5. Hypothesis Testing in the Multiple Regression Model
• In multiple regression models we undertake two tests of significance.
• One is the significance of the individual parameters of the model. This test is the same as the tests discussed for the simple regression model.
• The second is the overall significance of the model.
3.5.1. Tests of individual significance
• To test the significance of an individual coefficient βi we set H0: βi = 0 against H1: βi ≠ 0 and compute t = β̂i / se(β̂i), which follows the t distribution with n − k degrees of freedom under H0.
• If |t| exceeds the critical value (or the p-value is below the chosen significance level), we reject H0 and conclude that the corresponding explanatory variable is individually significant.
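• A minimal sketch of this t test in Python (not from the original slides); the coefficient, standard error and sample sizes below are made-up numbers.

```python
from scipy import stats

def t_test(b_hat, se_b, n, k, alpha=0.05):
    """Two-sided t test of H0: beta_i = 0 with n - k degrees of freedom."""
    t_stat = b_hat / se_b
    t_crit = stats.t.ppf(1 - alpha / 2, df=n - k)
    p_value = 2 * stats.t.sf(abs(t_stat), df=n - k)
    return t_stat, t_crit, p_value

# Example with made-up numbers: b1_hat = 2.1, se(b1_hat) = 0.4, n = 50, k = 3
t_stat, t_crit, p = t_test(2.1, 0.4, n=50, k=3)
print(f"t = {t_stat:.2f}, critical = {t_crit:.2f}, p-value = {p:.4f}")
# Reject H0 (the coefficient is individually significant) when |t| > critical value.
```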
3.3.2 The coefficient of determination (R²): two-explanatory-variables case
• In the simple regression model, we introduced R² as a measure of the proportion of variation in the dependent variable that is explained by variation in the explanatory variable.
• In the multiple regression model the same measure is relevant, and the same formulas are valid, but now we speak of the proportion of variation in the dependent variable explained by all the explanatory variables included in the model.
The coefficient of determination is:

R² = ESS/TSS = 1 − RSS/TSS = 1 − Σei²/Σyi²

where TSS is the total sum of squares, ESS the explained sum of squares and RSS the residual (error) sum of squares.
• If R² is low, there is little association between the values of Y and the values predicted by the model, Ŷ, and the model does not fit the data well.
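• A small helper (not from the original slides) showing how R² follows from the formula above; the function name is illustrative.

```python
import numpy as np

def r_squared(Y, Y_hat):
    """R^2 = 1 - RSS/TSS, with TSS measured in deviations from the mean of Y."""
    rss = ((Y - Y_hat)**2).sum()        # residual sum of squares
    tss = ((Y - Y.mean())**2).sum()     # total sum of squares
    return 1 - rss / tss

# Example (continuing the earlier sketch): r2 = r_squared(Y, b0 + b1*X1 + b2*X2)
```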
3.3.3 Adjusted Coefficient of Determination (R̄²)
• One difficulty with R² is that it can be made large by adding more and more variables, even if the variables added have no economic justification.
• Algebraically, this is because as variables are added the sum of squared errors (RSS) goes down (it can remain unchanged, but this is rare) and thus R² goes up.
• If the model contains n − 1 explanatory variables (plus the intercept), then R² = 1. Manipulating the model just to obtain a high R² is not wise.
• An alternative measure of goodness of fit, called the adjusted R² and often symbolized as R̄², is usually reported by regression programs.
• It is computed as:

R̄² = 1 − (1 − R²)·(n − 1)/(n − k) = 1 − [Σei²/(n − k)] / [Σyi²/(n − 1)]

where n is the number of observations and k the number of estimated parameters (including the intercept).
• This measure does not always go up when a variable is added, because the sum of squared errors in the numerator is divided by its degrees of freedom, n − k: adding a variable lowers RSS but also lowers n − k.
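• A one-line implementation of this adjustment (not from the original slides), with made-up example values:

```python
def adjusted_r_squared(r2, n, k):
    """Adjusted R^2 = 1 - (1 - R^2)*(n - 1)/(n - k); k counts all estimated
    parameters, including the intercept."""
    return 1 - (1 - r2) * (n - 1) / (n - k)

# Example with made-up numbers: R^2 = 0.90, n = 50 observations, k = 3 parameters
print(adjusted_r_squared(0.90, n=50, k=3))   # about 0.896, slightly below R^2
```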
3.5.2 Test of Overall Significance
• Throughout the previous section we were concerned with testing the significance of the estimated partial regression coefficients individually, i.e. under the separate hypothesis that each true population partial regression coefficient was zero.
• The overall (joint) test asks whether all the slope coefficients are simultaneously zero: H0: β1 = β2 = ⋯ = βk−1 = 0 against H1: not all slope coefficients are zero.
• The test procedure for any set of hypotheses can be based on a comparison of the sum of squared errors from the original, unrestricted multiple regression model with the sum of squared errors from a regression model in which the null hypothesis is assumed to be true (the restricted model).
• For the overall test the restricted model contains only the intercept, so the test statistic can be written as

F = [ESS/(k − 1)] / [RSS/(n − k)] = [R²/(k − 1)] / [(1 − R²)/(n − k)]

which, under H0, follows the F distribution with (k − 1) and (n − k) degrees of freedom.
• If the computed value of F is greater than the critical
value of F (k-1, n-k), then the parameters of the model
are jointly significant or the dependent variable Y is
linearly related to the independent variables included
in the model.
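• A minimal sketch of this F test in Python (not from the original slides); the R², n and k values below are made up for illustration.

```python
from scipy import stats

def overall_f_test(r2, n, k, alpha=0.05):
    """Overall significance: F = [R^2/(k-1)] / [(1-R^2)/(n-k)], which follows
    F(k-1, n-k) under H0 that all slope coefficients are zero."""
    f_stat = (r2 / (k - 1)) / ((1 - r2) / (n - k))
    f_crit = stats.f.ppf(1 - alpha, dfn=k - 1, dfd=n - k)
    p_value = stats.f.sf(f_stat, dfn=k - 1, dfd=n - k)
    return f_stat, f_crit, p_value

# Example with made-up numbers: R^2 = 0.90, n = 50, k = 3
f_stat, f_crit, p = overall_f_test(0.90, n=50, k=3)
print(f"F = {f_stat:.2f}, critical = {f_crit:.2f}, p-value = {p:.2e}")
# The parameters are jointly significant when F exceeds the critical value.
```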
3.4.General Linear Regression Model and Matrix Approach
• So far we have discussed the regression models
containing one or two explanatory variables.
• Let us now generalize the model, assuming that it contains k − 1 explanatory variables, so that k parameters (including the intercept) are estimated.
• It will be of the form:

Yi = β0 + β1X1i + β2X2i + ⋯ + βk−1X(k−1)i + Ui,   i = 1, 2, …, n
3.4.1 Matrix Approach to Linear Regression Model
• In matrix notation the model can be written compactly as Y = Xβ + U, where Y is an n × 1 vector of observations on the dependent variable, X is an n × k matrix of observations on the explanatory variables (its first column a column of ones for the intercept), β is a k × 1 vector of parameters and U is an n × 1 vector of disturbances.
• OLS chooses β̂ to minimize the sum of squared residuals e′e = (Y − Xβ̂)′(Y − Xβ̂).
• Differentiating e′e with respect to β̂ gives ∂(e′e)/∂β̂ = −2X′Y + 2X′Xβ̂.
• Equating the expression to the null vector 0, we obtain the normal equations X′Xβ̂ = X′Y, and hence, provided X′X is nonsingular (no perfect multicollinearity),

β̂ = (X′X)⁻¹X′Y

with variance–covariance matrix var(β̂) = σ²(X′X)⁻¹, estimated by σ̂²(X′X)⁻¹ where σ̂² = e′e/(n − k).
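• A compact sketch of the matrix approach in Python with NumPy (not from the original slides); the data and true coefficients are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 50, 3

# Design matrix X (n x k) with a column of ones for the intercept; made-up regressors
X = np.column_stack([np.ones(n), rng.uniform(1, 10, n), rng.uniform(20, 80, n)])
beta_true = np.array([10.0, 2.0, -0.5])
Y = X @ beta_true + rng.normal(0, 1, n)

# OLS in matrix form: beta_hat = (X'X)^(-1) X'Y (solving the normal equations
# directly is numerically preferable to forming the explicit inverse)
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)

# Residuals, estimated error variance, and variance-covariance matrix of beta_hat
e = Y - X @ beta_hat
sigma2_hat = (e @ e) / (n - k)
var_cov = sigma2_hat * np.linalg.inv(X.T @ X)

print("beta_hat:", np.round(beta_hat, 3))
print("standard errors:", np.round(np.sqrt(np.diag(var_cov)), 3))
```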
• We hope that, from the discussion of the multiple regression model made so far, you can summarize the main results for yourself: the specification and assumptions of the model, OLS estimation of its parameters, measures of goodness of fit (R² and R̄²), and tests of individual and overall significance.
Thank You
