Multiple Regression Model

This document discusses multiple regression models. It covers estimating coefficients, implications of multicollinearity, goodness of fit, hypothesis testing, comparing coefficients, and dealing with multicollinearity. Key aspects include estimating beta coefficients by minimizing the sum of squared errors, adjusting R-squared, testing coefficients and overall model significance, and comparing models through F-tests and t-tests. Multicollinearity can inflate standard errors and change coefficient signs.



Model: $Y_i = \beta_1 + \beta_2 X_{1i} + \beta_3 X_{2i} + u_i$, where $u_i$ is the disturbance term.

Estimation of the Coefficients

From the model specified above, we have the sample regression function

$Y_i = \hat{\beta}_1 + \hat{\beta}_2 X_{1i} + \hat{\beta}_3 X_{2i} + e_i$

The objective is to minimize $\sum e_i^2 = \sum (Y_i - \hat{\beta}_1 - \hat{\beta}_2 X_{1i} - \hat{\beta}_3 X_{2i})^2$ with respect to $\hat{\beta}_1$, $\hat{\beta}_2$ and $\hat{\beta}_3$.

Hence, $\partial \sum e_i^2 / \partial \hat{\beta}_1 = 0$, or

$\sum Y_i = n\hat{\beta}_1 + \hat{\beta}_2 \sum X_{1i} + \hat{\beta}_3 \sum X_{2i}$ (1)

Similarly, $\partial \sum e_i^2 / \partial \hat{\beta}_2 = 0$, or

$\sum X_{1i} Y_i = \hat{\beta}_1 \sum X_{1i} + \hat{\beta}_2 \sum X_{1i}^2 + \hat{\beta}_3 \sum X_{1i} X_{2i}$ (2)

Further, $\partial \sum e_i^2 / \partial \hat{\beta}_3 = 0$, or

$\sum X_{2i} Y_i = \hat{\beta}_1 \sum X_{2i} + \hat{\beta}_2 \sum X_{1i} X_{2i} + \hat{\beta}_3 \sum X_{2i}^2$ (3)

Solving these equations, and writing $y_i = Y_i - \bar{Y}$, $x_{1i} = X_{1i} - \bar{X}_1$ and $x_{2i} = X_{2i} - \bar{X}_2$ for deviations from the means, we get

(a) $\hat{\beta}_1 = \bar{Y} - \hat{\beta}_2 \bar{X}_1 - \hat{\beta}_3 \bar{X}_2$; (b) $\hat{\beta}_2 = \dfrac{(\sum y_i x_{1i})(\sum x_{2i}^2) - (\sum y_i x_{2i})(\sum x_{1i} x_{2i})}{(\sum x_{1i}^2)(\sum x_{2i}^2) - (\sum x_{1i} x_{2i})^2}$

and (c) $\hat{\beta}_3 = \dfrac{(\sum y_i x_{2i})(\sum x_{1i}^2) - (\sum y_i x_{1i})(\sum x_{1i} x_{2i})}{(\sum x_{1i}^2)(\sum x_{2i}^2) - (\sum x_{1i} x_{2i})^2}$

Further, from (2) we have, substituting $\hat{\beta}_1 = \bar{Y} - \hat{\beta}_2 \bar{X}_1 - \hat{\beta}_3 \bar{X}_2$,

$\sum y_i x_{1i} = \hat{\beta}_2 \sum x_{1i}^2 + \hat{\beta}_3 \sum x_{1i} x_{2i}$ [the deviation form of (2)]

Similarly, from (3) we have

$\sum y_i x_{2i} = \hat{\beta}_2 \sum x_{1i} x_{2i} + \hat{\beta}_3 \sum x_{2i}^2$ [the deviation form of (3)]

Further, $\operatorname{Var}(\hat{\beta}_2) = \dfrac{\sigma^2 \sum x_{2i}^2}{(\sum x_{1i}^2)(\sum x_{2i}^2) - (\sum x_{1i} x_{2i})^2} = \dfrac{\sigma^2}{\sum x_{1i}^2 (1 - r_{12}^2)}$, where $r_{12}$ is the correlation coefficient between $X_1$ and $X_2$.

Similarly, $\operatorname{Var}(\hat{\beta}_3) = \dfrac{\sigma^2}{\sum x_{2i}^2 (1 - r_{12}^2)}$

On the other hand, $\operatorname{Cov}(\hat{\beta}_2, \hat{\beta}_3) = \dfrac{-r_{12}\,\sigma^2}{(1 - r_{12}^2)\sqrt{\sum x_{1i}^2}\sqrt{\sum x_{2i}^2}}$
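As a numerical check, the estimators above can be obtained by solving the three normal equations at once. A minimal sketch with numpy on simulated data (the variable names and true coefficient values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
# True model: Y = 1 + 2*X1 - 0.5*X2 + noise (illustrative values)
Y = 1.0 + 2.0 * X1 - 0.5 * X2 + rng.normal(scale=0.1, size=n)

# Solve the three normal equations jointly: (X'X) beta = X'Y
X = np.column_stack([np.ones(n), X1, X2])
b1, b2, b3 = np.linalg.solve(X.T @ X, X.T @ Y)
```

The same slope estimates follow from the deviation-form expressions (b) and (c).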
Implications of Multicollinearity:

(a) If there is perfect multicollinearity between X1 and X2, i.e., $X_{2i} = \lambda X_{1i}$ for some constant $\lambda$ (so that $r_{12}^2 = 1$):

Here, $\sum x_{1i} x_{2i} = \lambda \sum x_{1i}^2$ and $\sum x_{2i}^2 = \lambda^2 \sum x_{1i}^2$, implying that

$\hat{\beta}_2 = \dfrac{(\sum y_i x_{1i})(\lambda^2 \sum x_{1i}^2) - (\lambda \sum y_i x_{1i})(\lambda \sum x_{1i}^2)}{(\sum x_{1i}^2)(\lambda^2 \sum x_{1i}^2) - \lambda^2 (\sum x_{1i}^2)^2}$

Or, $\hat{\beta}_2 = \dfrac{0}{0}$, which is indeterminate.

Similarly, $\hat{\beta}_3 = \dfrac{0}{0}$ is indeterminate.

Further, since $r_{12}^2 = 1$,

$\operatorname{Var}(\hat{\beta}_2) = \dfrac{\sigma^2}{\sum x_{1i}^2 (1 - r_{12}^2)} \to \infty$; $\operatorname{Var}(\hat{\beta}_3) = \dfrac{\sigma^2}{\sum x_{2i}^2 (1 - r_{12}^2)} \to \infty$

This means that the standard errors of the OLS estimators of the slope coefficients will be infinitely large. This will subsequently invalidate the respective test statistics.
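The blow-up of the slope variances as $r_{12} \to 1$ is measured by the variance inflation factor, $VIF = 1/(1 - r_{12}^2)$. A small sketch (the construction of the nearly collinear regressor is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
X1 = rng.normal(size=n)

def vif(r12):
    # Variance inflation factor: Var(beta_hat) scales with 1 / (1 - r12^2)
    return 1.0 / (1.0 - r12 ** 2)

# X2 is almost perfectly collinear with X1
X2 = 0.99 * X1 + 0.01 * rng.normal(size=n)
r12 = np.corrcoef(X1, X2)[0, 1]
inflation = vif(r12)   # very large, since r12 is close to 1
```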

(b) If there is no multicollinearity between X1 and X2, i.e., $r_{12} = 0$ (so that $\sum x_{1i} x_{2i} = 0$):

Consider the following models:

(1) $Y_i = \beta_1 + \beta_2 X_{1i} + \beta_3 X_{2i} + u_i$; (2) $Y_i = \alpha_1 + \alpha_2 X_{1i} + v_i$; (3) $Y_i = \gamma_1 + \gamma_2 X_{2i} + w_i$

(4) and (5): the corresponding simple regressions with X1 and X2 measured in standard-deviation units

σ1 = standard deviation of X1; σ2 = standard deviation of X2

This means that when $r_{12} = 0$, $\hat{\beta}_2 = \dfrac{\sum y_i x_{1i}}{\sum x_{1i}^2} = \hat{\alpha}_2$, the simple-regression slope of Y on X1.

This means that when $r_{12} = 0$, $\hat{\beta}_3 = \dfrac{\sum y_i x_{2i}}{\sum x_{2i}^2} = \hat{\gamma}_2$, the simple-regression slope of Y on X2.

Further, if there is no multicollinearity between X1 and X2, i.e., $r_{12} = 0$,

$\operatorname{Var}(\hat{\beta}_2) = \dfrac{\sigma^2}{\sum x_{1i}^2}$ and $\operatorname{Var}(\hat{\beta}_3) = \dfrac{\sigma^2}{\sum x_{2i}^2}$, with $\operatorname{Cov}(\hat{\beta}_2, \hat{\beta}_3) = 0$.
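The coincidence of the multiple-regression and simple-regression slopes under zero correlation can be verified numerically. In this sketch the second regressor is residualized against the first so that their sample correlation is exactly zero (an illustrative construction):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 400
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)

# Demean both, then residualize X2 on X1 so their sample covariance is zero
X1 = X1 - X1.mean()
X2 = X2 - X2.mean()
X2 = X2 - (X2 @ X1 / (X1 @ X1)) * X1

Y = 1.0 + 2.0 * X1 - 1.0 * X2 + rng.normal(size=n)

# Multiple-regression slopes
X = np.column_stack([np.ones(n), X1, X2])
_, b2, b3 = np.linalg.solve(X.T @ X, X.T @ Y)

# Simple-regression slopes: sum(y*x) / sum(x^2) in deviation form
y = Y - Y.mean()
s2 = (y @ X1) / (X1 @ X1)
s3 = (y @ X2) / (X2 @ X2)
```

With orthogonal regressors, the two sets of slopes agree up to floating-point error.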
(c) Decomposition of Goodness-of-Fit

$R^2 = \dfrac{\hat{\beta}_2 \sum y_i x_{1i} + \hat{\beta}_3 \sum y_i x_{2i}}{\sum y_i^2}$; when $r_{12} = 0$, this decomposes exactly into the two simple-regression contributions, $R^2 = r_{Y1}^2 + r_{Y2}^2$.

Testing of Hypothesis

Model: $Y_i = \beta_1 + \beta_2 X_{1i} + \beta_3 X_{2i} + \dots + \beta_k X_{(k-1)i} + u_i$ (k coefficients, including the intercept)

(a) For the Overall Model

Null Hypothesis: $H_0: \beta_2 = \beta_3 = \dots = \beta_k = 0$
Alternative Hypothesis: At least one of the slope coefficients is different from zero
Test Statistic: $F = \dfrac{R^2/(k-1)}{(1-R^2)/(n-k)} \sim F_{k-1,\,n-k}$

(b) For the Individual Coefficients

Two-tailed Test: Null Hypothesis $H_0: \beta_j = 0$; Alternative Hypothesis $H_1: \beta_j \neq 0$


One-tailed Test: Null Hypothesis $H_0: \beta_j \leq 0$ (or $\beta_j \geq 0$); Alternative Hypothesis $H_1: \beta_j > 0$ (or $\beta_j < 0$)

Test Statistic: $t = \dfrac{\hat{\beta}_j}{\operatorname{se}(\hat{\beta}_j)} \sim t_{n-k}$, where, for example, $\widehat{\operatorname{Var}}(\hat{\beta}_2) = \dfrac{\hat{\sigma}^2}{\sum x_{1i}^2 (1 - r_{12}^2)}$ with $\hat{\sigma}^2 = \dfrac{\sum e_i^2}{n-k}$ (this

estimator of variance is only for the slope coefficients, not the intercept)
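Both the overall F test and the individual t tests can be computed by hand. A sketch with numpy and scipy (the simulated data and coefficient values are illustrative assumptions):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, k = 100, 3                      # k coefficients, including the intercept
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
Y = 1.0 + 2.0 * X1 + 0.5 * X2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), X1, X2])
beta = np.linalg.solve(X.T @ X, X.T @ Y)
resid = Y - X @ beta
rss = resid @ resid
r2 = 1.0 - rss / ((Y - Y.mean()) ** 2).sum()

# Overall F test: H0: all slope coefficients are zero
F = (r2 / (k - 1)) / ((1.0 - r2) / (n - k))
p_overall = stats.f.sf(F, k - 1, n - k)

# Individual t tests: H0: beta_j = 0
sigma2_hat = rss / (n - k)
se = np.sqrt(sigma2_hat * np.diag(np.linalg.inv(X.T @ X)))
t = beta / se
p_individual = 2.0 * stats.t.sf(np.abs(t), n - k)
```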

Goodness-of-Fit

Derivation of Adjusted $R^2$

We have $R^2 = 1 - \dfrac{RSS}{TSS}$

Or, adjusting each sum of squares by its degrees of freedom, $\bar{R}^2 = 1 - \dfrac{RSS/(n-k)}{TSS/(n-1)} = 1 - (1 - R^2)\dfrac{n-1}{n-k}$

When $k = 1$, $\bar{R}^2 = R^2$

Similarly, when $R^2 = 1$, $\bar{R}^2 = 1$

On the other hand, when $k > 1$ and $R^2 < 1$, $\bar{R}^2 < R^2$


In this case, $\bar{R}^2$ will be negative for $R^2 < \dfrac{k-1}{n-1}$
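The adjustment formula and its negative region are easy to check directly (the numbers are illustrative):

```python
def adjusted_r2(r2, n, k):
    # Adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - k),
    # where k counts all estimated coefficients, including the intercept
    return 1.0 - (1.0 - r2) * (n - 1) / (n - k)

high = adjusted_r2(0.90, n=50, k=3)   # slightly below 0.90
zero = adjusted_r2(0.00, n=50, k=3)   # negative: penalty for useless regressors
same = adjusted_r2(0.75, n=50, k=1)   # k = 1 leaves R^2 unchanged
```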

Comparing Two Coefficients of a Multiple Regression Model:

Model: $Y_i = \beta_1 + \beta_2 X_{1i} + \beta_3 X_{2i} + u_i$

Test Hypothesis: $H_0: \beta_2 = \beta_3$ or $H_0: \beta_2 - \beta_3 = 0$

(a) Through t Test:

Test Statistic: $t = \dfrac{\hat{\beta}_2 - \hat{\beta}_3}{\operatorname{se}(\hat{\beta}_2 - \hat{\beta}_3)} \sim t_{n-k}$
By the Null Hypothesis, $E(\hat{\beta}_2 - \hat{\beta}_3) = 0$ and

$\operatorname{Var}(\hat{\beta}_2 - \hat{\beta}_3) = \operatorname{Var}(\hat{\beta}_2) + \operatorname{Var}(\hat{\beta}_3) - 2\operatorname{Cov}(\hat{\beta}_2, \hat{\beta}_3)$; $\operatorname{se}(\hat{\beta}_2 - \hat{\beta}_3) = \sqrt{\widehat{\operatorname{Var}}(\hat{\beta}_2 - \hat{\beta}_3)}$

(b) Through Restricted F Test:

Unrestricted Model: $Y_i = \beta_1 + \beta_2 X_{1i} + \beta_3 X_{2i} + u_i$

Restriction: $\beta_2 = \beta_3$

 Restricted Model: $Y_i = \beta_1 + \beta_2 (X_{1i} + X_{2i}) + u_i$

Test Statistic: $F = \dfrac{(RSS_R - RSS_U)/m}{RSS_U/(n-k)} \sim F_{m,\,n-k}$

Here, m stands for the number of restrictions, $RSS_R$ and $RSS_U$ for the residual sums of squares of the restricted and unrestricted models, and k for the number of coefficients in the unrestricted model


Example: Examine if the production function follows constant returns to scale

Production Function: $Q_i = A L_i^{\beta_2} K_i^{\beta_3} e^{u_i}$; Unrestricted Model: $\ln Q_i = \beta_1 + \beta_2 \ln L_i + \beta_3 \ln K_i + u_i$, where $\beta_1 = \ln A$

Test Hypothesis: $H_0: \beta_2 + \beta_3 = 1$ or $H_0: \beta_2 + \beta_3 - 1 = 0$

Through t Test

Test Statistic: $t = \dfrac{(\hat{\beta}_2 + \hat{\beta}_3) - 1}{\operatorname{se}(\hat{\beta}_2 + \hat{\beta}_3)}$, where $\operatorname{Var}(\hat{\beta}_2 + \hat{\beta}_3) = \operatorname{Var}(\hat{\beta}_2) + \operatorname{Var}(\hat{\beta}_3) + 2\operatorname{Cov}(\hat{\beta}_2, \hat{\beta}_3)$

Through Restricted F Test:


Restriction: $\beta_2 + \beta_3 = 1$ or $\beta_3 = 1 - \beta_2$
Restricted Model: $\ln Q_i - \ln K_i = \beta_1 + \beta_2 (\ln L_i - \ln K_i) + u_i$

 or $\ln (Q_i/K_i) = \beta_1 + \beta_2 \ln (L_i/K_i) + u_i$

Test Statistic: $F = \dfrac{(RSS_R - RSS_U)/m}{RSS_U/(n-k)} \sim F_{m,\,n-k}$
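The constant-returns test can be sketched numerically as a restricted F test; the simulated Cobb-Douglas data (with CRS imposed in the data-generating process) are illustrative assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 120
lnL = rng.normal(size=n)
lnK = rng.normal(size=n)
# Simulated Cobb-Douglas with beta2 + beta3 = 0.6 + 0.4 = 1, so CRS holds
lnQ = 0.5 + 0.6 * lnL + 0.4 * lnK + rng.normal(scale=0.2, size=n)

def rss(X, y):
    # Residual sum of squares from an OLS fit of y on X
    beta = np.linalg.solve(X.T @ X, X.T @ y)
    e = y - X @ beta
    return e @ e

# Unrestricted: ln Q on ln L and ln K
rss_u = rss(np.column_stack([np.ones(n), lnL, lnK]), lnQ)

# Restricted (beta2 + beta3 = 1): ln(Q/K) on ln(L/K)
rss_r = rss(np.column_stack([np.ones(n), lnL - lnK]), lnQ - lnK)

m, k = 1, 3
F = ((rss_r - rss_u) / m) / (rss_u / (n - k))
p_value = stats.f.sf(F, m, n - k)
```

Imposing the restriction can never lower the residual sum of squares, so F is nonnegative by construction.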
Testing for Validity of Restrictions

1. Testing for Inclusion/Exclusion of Variables

Case I:
Unrestricted Model: $Y_i = \beta_1 + \beta_2 X_{1i} + \beta_3 X_{2i} + \beta_4 X_{3i} + \beta_5 X_{4i} + u_i$
Restriction: $\beta_4 = \beta_5 = 0$
Null Hypothesis: $H_0: \beta_4 = \beta_5 = 0$; Alternative Hypothesis: At least one of these additional slope coefficients is different from zero

Restricted Model: $Y_i = \beta_1 + \beta_2 X_{1i} + \beta_3 X_{2i} + u_i$

Restricted F Test: $F = \dfrac{(RSS_R - RSS_U)/m}{RSS_U/(n-k)} \sim F_{m,\,n-k}$ (Here, m stands for the number of restrictions and k for the number of coefficients in the unrestricted model)

Rejection of the Null Hypothesis indicates that the unrestricted model should be selected.

Case II:
Unrestricted Model: $Y_i = \beta_1 + \beta_2 X_{1i} + \beta_3 X_{2i} + u_i$
Restriction: $\beta_3 = 0$
Null Hypothesis: $H_0: \beta_3 = 0$; Alternative Hypothesis: The additional slope coefficient is different from zero

Restricted Model: $Y_i = \beta_1 + \beta_2 X_{1i} + u_i$

Restricted F Test: $F = \dfrac{(RSS_R - RSS_U)/m}{RSS_U/(n-k)} \sim F_{m,\,n-k}$ (Here, m stands for the number of restrictions and k for the number of coefficients in the unrestricted model)

Through t Test: Test Statistic: $t = \dfrac{\hat{\beta}_3}{\operatorname{se}(\hat{\beta}_3)} \sim t_{n-k}$ (for a single restriction, $F = t^2$)

Rejection of the Null Hypothesis in either case indicates that the unrestricted model should be selected.
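A sketch of the nested-model F test; the data-generating process (where one excluded regressor truly matters) is an illustrative assumption:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n = 150
X1, X2, X3, X4 = rng.normal(size=(4, n))
# X3 genuinely belongs in the model; X4 does not
Y = 1.0 + 2.0 * X1 + 1.0 * X2 + 1.5 * X3 + 0.0 * X4 + rng.normal(size=n)

def rss(X, y):
    # Residual sum of squares from an OLS fit of y on X
    beta = np.linalg.solve(X.T @ X, X.T @ y)
    e = y - X @ beta
    return e @ e

# Unrestricted model includes X3 and X4; the restricted model drops both
rss_u = rss(np.column_stack([np.ones(n), X1, X2, X3, X4]), Y)
rss_r = rss(np.column_stack([np.ones(n), X1, X2]), Y)

m, k = 2, 5   # two restrictions; five coefficients in the unrestricted model
F = ((rss_r - rss_u) / m) / (rss_u / (n - k))
p_value = stats.f.sf(F, m, n - k)
reject = p_value < 0.05   # reject H0: keep the unrestricted model
```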

2. Testing for Equality of Coefficients


Unrestricted Model: $Y_i = \beta_1 + \beta_2 X_{1i} + \beta_3 X_{2i} + u_i$
Restriction: $\beta_2 = \beta_3$; Null Hypothesis: $H_0: \beta_2 = \beta_3$; Alternative Hypothesis: These slope coefficients are significantly different
Restricted Model: $Y_i = \beta_1 + \beta_2 (X_{1i} + X_{2i}) + u_i$
Restricted F Test: $F = \dfrac{(RSS_R - RSS_U)/m}{RSS_U/(n-k)}$ (Here, m stands for the number of restrictions and k for the number of coefficients in the unrestricted model)

Through t Test: Test Statistic: $t = \dfrac{\hat{\beta}_2 - \hat{\beta}_3}{\sqrt{\widehat{\operatorname{Var}}(\hat{\beta}_2) + \widehat{\operatorname{Var}}(\hat{\beta}_3) - 2\widehat{\operatorname{Cov}}(\hat{\beta}_2, \hat{\beta}_3)}}$ or, equivalently for a single restriction, $F = t^2$

Rejection of the Null Hypothesis in either case indicates that the coefficients are statistically significantly different

3. Testing for Validity of Restriction


Production Function: $Q_i = A L_i^{\beta_2} K_i^{\beta_3} e^{u_i}$; Unrestricted Model: $\ln Q_i = \beta_1 + \beta_2 \ln L_i + \beta_3 \ln K_i + u_i$; Restriction: $\beta_2 + \beta_3 = 1$ or $\beta_3 = 1 - \beta_2$
Null Hypothesis: $H_0: \beta_2 + \beta_3 = 1$ or $H_0: \beta_2 + \beta_3 - 1 = 0$; Alternative Hypothesis: The production function does not follow CRS

Through t Test: Test Statistic: $t = \dfrac{(\hat{\beta}_2 + \hat{\beta}_3) - 1}{\operatorname{se}(\hat{\beta}_2 + \hat{\beta}_3)}$, with $\operatorname{Var}(\hat{\beta}_2 + \hat{\beta}_3) = \operatorname{Var}(\hat{\beta}_2) + \operatorname{Var}(\hat{\beta}_3) + 2\operatorname{Cov}(\hat{\beta}_2, \hat{\beta}_3)$

Through Restricted F Test:


Restricted Model: $\ln Q_i - \ln K_i = \beta_1 + \beta_2 (\ln L_i - \ln K_i) + u_i$ or $\ln (Q_i/K_i) = \beta_1 + \beta_2 \ln (L_i/K_i) + u_i$

Test Statistic: $F = \dfrac{(RSS_R - RSS_U)/m}{RSS_U/(n-k)} \sim F_{m,\,n-k}$

Rejection of the Null Hypothesis in either case indicates that the production function does not follow constant returns to scale

4. Comparing Coefficients: Use of Dummy Variables

Model Specification: $Y_i = \alpha_1 + \alpha_2 X_i + \alpha_3 D_i + \alpha_4 (D_i X_i) + u_i$ (Here, $D_i = 0$ for rural households and $D_i = 1$ for urban households)

Two Alternatives: (1) Given X and D = 0, $E(Y_i \mid X_i) = \alpha_1 + \alpha_2 X_i$; (2) Given X and D = 1, $E(Y_i \mid X_i) = (\alpha_1 + \alpha_3) + (\alpha_2 + \alpha_4) X_i$


The PRFs will differ depending on the statistical significance and sign of α3 and α4

Possibilities
(1) If both α3 and α4 are not significant, the two PRFs will coincide, indicating no significant difference between the groups

(2) If α3 is significant and α4 is not, the two PRFs will be parallel (structural difference only in respect of intercept)
(a) If α3 is positive, PRF for urban households will have a higher intercept; (b) If α3 is negative, PRF for urban households will have a lower intercept
(3) If α4 is significant and α3 is not, the two PRFs will be concurrent (structural break - difference will be only in respect of slope)
(a) If α4 is positive, PRF for urban households will be steeper; (b) If α4 is negative, PRF for these households will be flatter

(4) If both α3 and α4 are significant, the two PRFs will be dissimilar
(a) If both α3 and α4 positive, PRF for urban households will be steeper with a higher intercept
(b) If both α3 and α4 negative, PRF for urban households will be flatter with a lower intercept
(c) If α3 is positive but α4 is negative, PRF for urban households will be flatter with a higher intercept
(d) If α3 is negative but α4 is positive, PRF for urban households will be steeper with a lower intercept
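Possibility 4(a) can be illustrated by simulation; the data-generating values (urban intercept shift of +2, slope shift of +0.5) are assumptions for the example:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 300
X = rng.uniform(0.0, 10.0, size=n)
D = (rng.random(size=n) < 0.5).astype(float)   # D = 1 for urban, 0 for rural
# Urban PRF has a higher intercept (+2.0) and a steeper slope (+0.5)
Y = 1.0 + 1.0 * X + 2.0 * D + 0.5 * D * X + rng.normal(scale=0.1, size=n)

# Fit the dummy-interaction model: Y = a1 + a2*X + a3*D + a4*(D*X)
Z = np.column_stack([np.ones(n), X, D, D * X])
a1, a2, a3, a4 = np.linalg.solve(Z.T @ Z, Z.T @ Y)

rural_intercept, rural_slope = a1, a2
urban_intercept, urban_slope = a1 + a3, a2 + a4
```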

Comparing Coefficients of Two Different Models:

(1) $Y_i = \alpha_1 + \alpha_2 X_i + u_i$ with n1 observations; (2) $Y_j = \gamma_1 + \gamma_2 X_j + v_j$ with n2 observations


Null Hypothesis: $H_0: \alpha_2 = \gamma_2$; Alternative Hypothesis: $H_1: \alpha_2 \neq \gamma_2$

Test Statistic: $t = \dfrac{\hat{\alpha}_2 - \hat{\gamma}_2}{\sqrt{s^2/\sum x_i^2 + s^2/\sum x_j^2}} \sim t_{n_1 + n_2 - 4}$, where $\sum x_i^2$ and $\sum x_j^2$ are the sums of squared deviations of X in the two samples

Unbiased estimator of common variance: $s^2 = \dfrac{RSS_1 + RSS_2}{n_1 + n_2 - 4}$
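A sketch of the two-sample slope comparison with a pooled error variance (the samples and parameter values are illustrative assumptions):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

def simple_ols(x, y):
    # Returns slope, residual sum of squares, and sum of squared x-deviations
    xd = x - x.mean()
    b = (xd @ y) / (xd @ xd)
    a = y.mean() - b * x.mean()
    e = y - a - b * x
    return b, e @ e, xd @ xd

n1, n2 = 80, 90
x1 = rng.normal(size=n1)
y1 = 1.0 + 2.0 * x1 + rng.normal(scale=0.5, size=n1)
x2 = rng.normal(size=n2)
y2 = 3.0 + 2.0 * x2 + rng.normal(scale=0.5, size=n2)  # same slope: H0 true

b1, rss1, sxx1 = simple_ols(x1, y1)
b2, rss2, sxx2 = simple_ols(x2, y2)

# Pooled (common) error variance from both regressions
s2 = (rss1 + rss2) / (n1 + n2 - 4)
t = (b1 - b2) / np.sqrt(s2 / sxx1 + s2 / sxx2)
p_value = 2.0 * stats.t.sf(abs(t), n1 + n2 - 4)
```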

Multiple Regression Analysis: A Summary


Multiple Regression Model: $Y_i = \beta_1 + \beta_2 X_{1i} + \beta_3 X_{2i} + u_i$

Other Models: (1) ; (2) ; (3) ; (4)

Aspect and Estimator:

OLS estimator of $\beta_1$: $\hat{\beta}_1 = \bar{Y} - \hat{\beta}_2 \bar{X}_1 - \hat{\beta}_3 \bar{X}_2$

OLS estimator of $\beta_2$: $\hat{\beta}_2 = \dfrac{(\sum y_i x_{1i})(\sum x_{2i}^2) - (\sum y_i x_{2i})(\sum x_{1i} x_{2i})}{(\sum x_{1i}^2)(\sum x_{2i}^2) - (\sum x_{1i} x_{2i})^2}$

OLS estimator of $\beta_3$: $\hat{\beta}_3 = \dfrac{(\sum y_i x_{2i})(\sum x_{1i}^2) - (\sum y_i x_{1i})(\sum x_{1i} x_{2i})}{(\sum x_{1i}^2)(\sum x_{2i}^2) - (\sum x_{1i} x_{2i})^2}$

Variances of $\hat{\beta}_2$ and $\hat{\beta}_3$: $\operatorname{Var}(\hat{\beta}_2) = \dfrac{\sigma^2}{\sum x_{1i}^2 (1 - r_{12}^2)}$ and $\operatorname{Var}(\hat{\beta}_3) = \dfrac{\sigma^2}{\sum x_{2i}^2 (1 - r_{12}^2)}$

Explanatory Power of the Estimated Model: $R^2 = \dfrac{\hat{\beta}_2 \sum y_i x_{1i} + \hat{\beta}_3 \sum y_i x_{2i}}{\sum y_i^2}$

Adjusted R2: $\bar{R}^2 = 1 - (1 - R^2)\dfrac{n-1}{n-k}$

Covariance between $\hat{\beta}_2$ and $\hat{\beta}_3$: $\operatorname{Cov}(\hat{\beta}_2, \hat{\beta}_3) = \dfrac{-r_{12}\,\sigma^2}{(1 - r_{12}^2)\sqrt{\sum x_{1i}^2}\sqrt{\sum x_{2i}^2}}$

Multicollinearity

Reasons for Multicollinearity: sampling problem; model specification; true correlation among the regressors; use of derived variables.

Symptoms and Statistical Tests: high R2 with few or no significant coefficients; high correlation coefficients among the regressors; high variance inflation factors; severity of multicollinearity assessed through the Farrar-Glauber test.

Consequences: effect on the values of the coefficients; change in the sign of coefficients; large standard errors of coefficients; coefficients (or the model as a whole) not significant; misleading conclusions drawn from the regression results.

Remedies, with their possible consequences:
(1) Dropping variable(s): change in the significance or sign of the other variables
(2) Combination of variables: loss of information on the individual variables
(3) Derived variables: non-linearity
(4) Ratios of variables: heteroscedasticity
(5) First difference of variables: autocorrelation
Any of these remedies may also leave the model not significant.
