Econometrics II Chapter Two
Consequence of heteroskedasticity
If the assumption of homoscedasticity is violated, it has
the following consequences:
1. Heteroskedasticity increases the variances of the
sampling distributions of the OLS coefficients, thereby
rendering the OLS estimators inefficient.
2. The OLS estimators are inefficient: if the random
term Ui is heteroskedastic, the OLS estimates do not
have the minimum variance in the class of unbiased
estimators. Therefore they are not efficient in either
small or large samples.
o So heteroskedasticity has a wide impact on hypothesis
testing: the conventional t and F statistics are no longer
reliable for hypothesis testing.
Detection of heteroskedasticity
There are informal and formal methods of detecting
heteroskedasticity.
I. Informal Methods
A. Nature of the Problem
o The nature of the problem under study, drawing on earlier
empirical work on similar data, often suggests whether
heteroskedasticity is likely to be present.
B. Graphical Method
o Plotting the squared residuals against the dependent
variable gives a rough indication of the existence of
heteroskedasticity.
o If a systematic trend appears in the graph, it may
be an indication of the existence of heteroskedasticity.
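The graphical check can be sketched in Python; the residuals below are purely illustrative, and the plot is replaced by printing (x, e²) pairs:

```python
# Graphical method (sketch): inspect the squared residuals e^2 against an
# explanatory variable; a systematic trend suggests heteroskedasticity.
# The residuals here are illustrative, with spread growing in x.

x = [1, 2, 3, 4, 5, 6, 7, 8]
resid = [0.1, -0.3, 0.4, -0.6, 0.8, -1.1, 1.3, -1.6]
e2 = [e ** 2 for e in resid]

for xi, e2i in zip(x, e2):
    print(xi, round(e2i, 2))
# e^2 rises steadily with x -> a rough sign of heteroskedasticity
```

In practice the pairs would be scattered with a plotting library rather than printed, but the diagnostic idea (look for a trend in e² as x grows) is the same.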
II. Formal Methods
A. The Spearman Rank-Correlation Test
o This is the simplest approximate test for detecting
heteroskedasticity, and it can be applied to either small or
large samples.
o A high rank correlation coefficient suggests the presence
of heteroskedasticity. If we have more than one explanatory
variable, we may compute the rank correlation coefficient
between ei and each of the explanatory variables
separately.
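The rank-correlation formula behind this test can be sketched as follows; the data are illustrative and ties are ignored for simplicity:

```python
# Spearman rank-correlation test for heteroskedasticity (sketch).
# Rank the absolute residuals |e_i| and the regressor X, then apply
# r_s = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)), where d_i is the rank difference.

def ranks(values):
    """Return 1-based ranks (no ties assumed in this illustration)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman_rank_corr(a, b):
    ra, rb = ranks(a), ranks(b)
    n = len(a)
    d2 = sum((x - y) ** 2 for x, y in zip(ra, rb))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Illustrative data: absolute residuals that grow with X
x = [1, 2, 3, 4, 5, 6, 7, 8]
abs_resid = [0.2, 0.5, 0.4, 0.9, 1.1, 1.0, 1.6, 2.0]

rs = spearman_rank_corr(abs_resid, x)
print(f"rank correlation = {rs:.3f}")  # prints 0.952 -- high, so suspect heteroskedasticity
```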
B. The Breusch-Pagan Test
o This test is applicable for large samples, and it requires
that the number of observations (sample size) be at least
twice the number of explanatory variables.
o If the number of explanatory variables is 3 (X1, X2,
X3), then the sample size must be at least 6.
o If the computed value is greater than the table value,
reject the null hypothesis that there is
homoscedasticity and accept the alternative that there is
heteroskedasticity.
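A minimal sketch of the test's LM form (n·R² from an auxiliary regression of the squared residuals on the regressor), assuming a single explanatory variable and illustrative data; the 5% chi-square critical value with 1 degree of freedom is 3.841:

```python
# Breusch-Pagan test (sketch, one explanatory variable).
# Step 1: regress y on x by OLS and keep the residuals e.
# Step 2: regress e^2 on x; LM = n * R^2 of that auxiliary regression.
# Step 3: compare LM with the chi-square critical value (df = number of
#         regressors in the auxiliary regression, here 1; 5% value = 3.841).

def simple_ols(x, y):
    """Simple OLS of y on x; return (residuals, R^2)."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = ybar - b * xbar
    resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    syy = sum((yi - ybar) ** 2 for yi in y)
    r2 = (sxy ** 2) / (sxx * syy) if syy > 0 else 0.0
    return resid, r2

# Illustrative data with spread growing in x
x = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
y = [2.1, 4.3, 5.8, 8.9, 9.5, 13.0, 12.2, 17.5, 16.0, 21.8]

resid, _ = simple_ols(x, y)
e2 = [e ** 2 for e in resid]
_, r2_aux = simple_ols(x, e2)
lm = len(x) * r2_aux
print(f"LM = {lm:.3f}; reject homoscedasticity at 5% if LM > 3.841")
```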
C. White’s General Heteroskedasticity Test
o It is an LM test, but it has the advantage that it does not
require any prior knowledge about the pattern of
heteroskedasticity. The assumption of normality is also
not required. For these reasons, it is considered to be
among the more powerful tests of heteroskedasticity.
o Basic intuition → the test looks for systematic patterns
between the residual variance and the explanatory
variables, the squares of the explanatory variables, and
their cross-products.
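The intuition can be sketched for one regressor; the data are illustrative, and with two terms (X and X²) in the auxiliary regression the 5% chi-square critical value is 5.991:

```python
import numpy as np

# White's general heteroskedasticity test (sketch, one regressor).
# Auxiliary regression: e^2 on a constant, X and X^2; LM = n * R^2,
# compared with chi-square (df = 2 here; 5% critical value = 5.991).

def ols_resid_r2(X, y):
    """OLS via least squares; return residuals and R^2."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    ss_res = float(resid @ resid)
    ss_tot = float(((y - y.mean()) ** 2).sum())
    r2 = 1 - ss_res / ss_tot if ss_tot > 0 else 0.0
    return resid, r2

x = np.array([1., 2., 3., 4., 5., 6., 7., 8., 9., 10.])
y = np.array([2.1, 4.3, 5.8, 8.9, 9.5, 13.0, 12.2, 17.5, 16.0, 21.8])

# Main regression: y on a constant and x
X_main = np.column_stack([np.ones_like(x), x])
resid, _ = ols_resid_r2(X_main, y)

# Auxiliary regression: e^2 on a constant, x and x^2
X_aux = np.column_stack([np.ones_like(x), x, x ** 2])
_, r2_aux = ols_resid_r2(X_aux, resid ** 2)
lm = len(x) * r2_aux
print(f"LM = {lm:.3f}; reject homoscedasticity at 5% if LM > 5.991")
```

With several regressors the auxiliary regression also includes the cross-products, which is exactly why the degrees of freedom grow so quickly (limitation (i) below).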
Limitations of White’s test
i. When we have a large number of explanatory variables,
the number of terms in the auxiliary regression model
will be so high that we may not have adequate degrees
of freedom.
ii. It is basically a large-sample test, so when we
work with a small sample it may fail to detect the
presence of heteroskedasticity in the data even when
the problem is present.
D. Goldfeld-Quandt Test
o It may be applied when one of the explanatory variables
is suspected to be the heteroskedasticity culprit.
o The basic idea is that if the variances of the
disturbances are the same across all observations (i.e.,
homoscedastic), then the variance of one part of the
sample should be the same as the variance of another part
of the sample.
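The split-sample idea can be sketched as follows; the data, the number of dropped central observations, and the half-sizes are illustrative assumptions:

```python
# Goldfeld-Quandt test (sketch).
# Order observations by the suspect regressor, drop some central
# observations, fit OLS to each half, and compare residual sums of squares:
# F = RSS_2 / RSS_1, with the larger-variance half in the numerator.

def rss_simple_ols(x, y):
    """Residual sum of squares from a simple OLS of y on x."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = ybar - b * xbar
    return sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))

# Illustrative data already ordered by x, with spread growing in x
x = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12]
y = [2.0, 4.1, 5.9, 8.2, 9.7, 12.4, 13.1, 17.2, 16.4, 22.0, 19.5, 26.3]

drop = 2                      # central observations omitted
half = (len(x) - drop) // 2   # 5 observations in each part
rss1 = rss_simple_ols(x[:half], y[:half])    # low-x part
rss2 = rss_simple_ols(x[-half:], y[-half:])  # high-x part
f_stat = rss2 / rss1
print(f"F = {f_stat:.3f}; a large F suggests heteroskedasticity")
```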
Remedies of heteroskedasticity
1. Log-transformation of the data ⟶ taking logs compresses
the scale in which the variables are measured, which often
reduces heteroskedasticity.
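A minimal illustration of why logs help: the transformation compresses the scale of the data, so proportional differences become constant absolute differences:

```python
import math

# Values spanning three orders of magnitude become evenly spaced in logs:
# each tenfold jump in the level is the same one-unit step after log10.
levels = [1.0, 10.0, 100.0, 1000.0]
logs = [math.log10(v) for v in levels]
print(logs)  # [0.0, 1.0, 2.0, 3.0]
```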
Autocorrelation
o Autocorrelation refers to the internal correlation
between members of a series of observations ordered in
time or space.
o Autocorrelation is a special case of correlation in which
the association is not between elements of two or more
variables but between successive values of one variable,
whereas correlation refers to the relationship between
values of two or more different variables.
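The idea of correlation between successive values of one variable can be made concrete with a lag-1 autocorrelation coefficient; the series below is illustrative:

```python
# Lag-1 autocorrelation of a single series (sketch): the correlation
# between successive values u_t and u_{t-1} of one variable.

def lag1_autocorr(u):
    n = len(u)
    mean = sum(u) / n
    num = sum((u[t] - mean) * (u[t - 1] - mean) for t in range(1, n))
    den = sum((ut - mean) ** 2 for ut in u)
    return num / den

# A series with persistence: positive values tend to follow positive ones
u = [1.0, 0.8, 0.9, 0.5, -0.2, -0.6, -0.7, -0.3, 0.1, 0.4]
print(f"lag-1 autocorrelation = {lag1_autocorr(u):.3f}")  # clearly positive
```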
Source of Autocorrelation
1. Omitted Explanatory variables
o If an autocorrelated variable has been excluded from
the set of explanatory variables, then its influence will
be reflected in the random term U.
o If several autocorrelated explanatory variables are
omitted, then the random term U may not be
autocorrelated, because the autocorrelation patterns
of the omitted variables may offset each other.
2. Mis-specification of the mathematical form of the
model
o If we use a mathematical form that differs from the
correct form of the relationship, then the random term
may show serial correlation.
o Example: if we choose a linear function while the
correct form is non-linear, the values of U will be
correlated.
3. Mis-specification of the true random term U.
o Many random factors such as wars, droughts, weather
conditions, and strikes exert influences that are spread
over more than one period of time.
o Example: weather conditions in the agricultural
sector will influence the performance of other
economic variables in several future periods. A strike
in an organization affects the production process, and
the effect persists for several future periods. In such
cases the values of U become serially dependent, so
autocorrelation arises.
4. Interpolation in the statistical observation
o Most time-series data involve some interpolation and
smoothing to remove seasonal effects, which averages the
true disturbances over successive time periods. As a
result, the successive values of U are interrelated and
show an autocorrelation pattern.
Consequences of autocorrelation
1. The OLS estimator is unbiased.
2. The OLS estimator is inefficient; hence it is not BLUE.
3. The standard errors are underestimated and the
t-statistics inflated, which leads to accepting
non-significant variables as significant.
o The usual t-ratio and F-ratio tests provide misleading
results.
o R² gives an overly optimistic view of the fit.
Detection of Autocorrelation
B. Plot the residuals against time (t); we will have two
alternatives.
i. If successive residuals (e_t and e_{t-1}) change
sign rapidly, we can say there is
negative autocorrelation.
ii. If the sign of successive residuals does not
change frequently, i.e. several positive values are
followed by several negative values, we can
conclude that there is positive autocorrelation.
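The two alternatives above can be sketched by counting sign changes in the residual series; both series are illustrative:

```python
# Counting sign changes in successive residuals (sketch of the informal
# plot-based check): frequent changes hint at negative autocorrelation,
# long runs of the same sign at positive autocorrelation.

def sign_changes(e):
    return sum(1 for t in range(1, len(e)) if e[t] * e[t - 1] < 0)

persistent = [0.5, 0.7, 0.6, -0.4, -0.5, -0.6, -0.3, 0.2, 0.4, 0.3]
alternating = [0.5, -0.4, 0.6, -0.5, 0.3, -0.2, 0.4, -0.6, 0.5, -0.3]

print(sign_changes(persistent))   # 2 changes -> positive autocorrelation
print(sign_changes(alternating))  # 9 changes -> negative autocorrelation
```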
Formal test
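One standard formal test (not detailed in the slides, so this is an added sketch) is the Durbin-Watson d statistic, d = Σ(e_t − e_{t−1})² / Σ e_t²; values near 2 suggest no first-order autocorrelation, values near 0 positive and values near 4 negative autocorrelation:

```python
# Durbin-Watson statistic (sketch): d = sum((e_t - e_{t-1})^2) / sum(e_t^2).
# d close to 2 suggests no first-order autocorrelation; d near 0 suggests
# positive and d near 4 negative autocorrelation.

def durbin_watson(e):
    num = sum((e[t] - e[t - 1]) ** 2 for t in range(1, len(e)))
    den = sum(et ** 2 for et in e)
    return num / den

# Illustrative residuals with long runs of the same sign
persistent = [0.5, 0.7, 0.6, 0.4, -0.5, -0.6, -0.3, -0.2, 0.4, 0.3]
print(f"d = {durbin_watson(persistent):.3f}")  # well below 2
```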