4_autocorrelation
• Meaning
• Consequences
• Detection
• Remedial Measures
Meaning
What happens when the following assumption related to the error term is
violated?
$\mathrm{Cov}(\varepsilon_i, \varepsilon_j) = E(\varepsilon_i \varepsilon_j) = 0; \quad i \neq j$
• Autocorrelation: auto+correlation
• “correlation between members of series of observations ordered in time
[as in time series data] or space [as in cross-sectional data]”
• For instance, if one disturbance is large and positive, the next
disturbance tends to be large and positive as well!
• Autocorrelation can be either positive or negative.
Page 1 of 13
Examples: output and labour and capital inputs, household incomes and
expenditures (we do not expect such correlation, but it may be present)
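Positive first-order autocorrelation of this kind can be illustrated with a short simulation (a minimal sketch; the AR(1) coefficient of 0.8 and the sample size are illustrative assumptions, not values from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ar1_errors(n, rho, sigma=1.0):
    """Disturbances following eps_t = rho * eps_{t-1} + v_t, v_t ~ N(0, sigma^2)."""
    v = rng.normal(0.0, sigma, n)
    eps = np.zeros(n)
    for t in range(1, n):
        eps[t] = rho * eps[t - 1] + v[t]
    return eps

# With rho = 0.8, a large positive disturbance tends to be followed
# by another large positive one
eps = simulate_ar1_errors(500, rho=0.8)
r = np.corrcoef(eps[1:], eps[:-1])[0, 1]  # sample first-order autocorrelation
```

With a sample this large, the sample correlation between successive disturbances lands close to the assumed ρ of 0.8.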
[Figure: residual pattern under no autocorrelation]
• Inertia
o most macro indicators (for instance, price indexes, production,
employment) move in tandem with general macroeconomic
conditions (business cycles)
o inertia (resistance to change) -> past outcomes have a strong
influence on the current ones
o shocks such as an earthquake, tsunami, or war also have effects that
persist over subsequent periods
• Specification error
o incorrect functional form
o omission of relevant variables
o Note that in such cases, the appropriate procedure is to address the
misspecification (and not autocorrelation or heteroscedasticity)
• Cobweb phenomenon
o Suppose in period t, Pt turns out to be lower than Pt-1; in the next
period (t+1), farmers may decide to produce less. This leads to a
cobweb pattern, with farmers overproducing in one period and
underproducing in another. In such a situation the error terms
would not be random.
Consequences
$\mathrm{Var}(\hat{\beta}_2) \neq \dfrac{\sigma^2}{\sum x_i^2}$ unless $\rho = 0$
As was the case with heteroscedasticity, this bias in the estimated variances
does not disappear even as the sample size increases.
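For reference, the standard textbook expression for the true sampling variance of the OLS slope under AR(1) disturbances (a sketch following the usual derivation, with $x_t$ in deviation form) is:

```latex
\mathrm{Var}(\hat{\beta}_2)_{\mathrm{AR}(1)}
  = \frac{\sigma^2}{\sum x_t^2}
    \left[ 1
      + 2\rho \frac{\sum_{t=1}^{n-1} x_t x_{t+1}}{\sum x_t^2}
      + 2\rho^2 \frac{\sum_{t=1}^{n-2} x_t x_{t+2}}{\sum x_t^2}
      + \cdots
      + 2\rho^{n-1} \frac{x_1 x_n}{\sum x_t^2}
    \right]
```

which collapses to the usual formula $\sigma^2 / \sum x_t^2$ only when $\rho = 0$.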
The correction factors generally pertain to the first observation in the sample.
According to Kennedy, the relative efficiency of GLS vis-à-vis OLS (i.e., the ratio of
two variances) is roughly
$\dfrac{1 - \rho^2}{1 + \rho^2}$
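A quick numerical check of this approximation (the values of ρ below are illustrative assumptions, not from the notes):

```python
def relative_efficiency(rho):
    """Kennedy's approximation to the ratio of the GLS variance to the
    OLS variance under AR(1) errors: (1 - rho^2) / (1 + rho^2)."""
    return (1 - rho ** 2) / (1 + rho ** 2)

eff_low = relative_efficiency(0.3)   # mild autocorrelation: OLS loses little
eff_high = relative_efficiency(0.9)  # strong autocorrelation: OLS is very inefficient
```

With ρ = 0.9 the ratio is roughly 0.10, i.e., the GLS variance is only about a tenth of the OLS variance, while with ρ = 0.3 OLS remains reasonably efficient.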
Detection
Graphical methods
Give us clues not just about autocorrelation but also about heteroscedasticity
and specification error
Formal Methods
$d = \dfrac{\sum_{t=2}^{n} (\hat{\varepsilon}_t - \hat{\varepsilon}_{t-1})^2}{\sum_{t=1}^{n} \hat{\varepsilon}_t^{\,2}}$

$0 \le d \le 4$

$d \approx 2(1 - \hat{\rho})$

where

$\hat{\rho} = \dfrac{\sum \hat{\varepsilon}_t \hat{\varepsilon}_{t-1}}{\sum \hat{\varepsilon}_t^{\,2}}$

Remember, $\varepsilon_t = \rho \varepsilon_{t-1} + \vartheta_t$
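The d statistic and its link to $\hat{\rho}$ can be sketched directly (the white-noise residuals below are simulated for illustration; real residuals would come from an OLS fit):

```python
import numpy as np

def durbin_watson(resid):
    """d = sum_{t=2}^n (e_t - e_{t-1})^2 / sum_{t=1}^n e_t^2."""
    resid = np.asarray(resid, dtype=float)
    return np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)

def rho_hat(resid):
    """rho_hat = sum e_t e_{t-1} / sum e_t^2."""
    resid = np.asarray(resid, dtype=float)
    return np.sum(resid[1:] * resid[:-1]) / np.sum(resid ** 2)

rng = np.random.default_rng(1)
e = rng.normal(size=200)  # white-noise residuals: d should be near 2
d = durbin_watson(e)      # and d is approximately 2 * (1 - rho_hat(e))
```

For uncorrelated residuals d sits near 2; values near 0 suggest positive and values near 4 negative autocorrelation.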
Decision Rule
Durbin and Watson derived two critical values (a lower bound dL and an upper
bound dU). If the estimated d lies outside these, a decision can be made regarding
the presence of autocorrelation. The critical values depend on the level of
significance, sample size, and the number of explanatory variables.
Steps:
What if we end up in no man's land (i.e., the indecisive zone)?
• Modified d-test (one can use the upper limit dU as the critical value; beyond
the scope of this course)
• Breusch-Godfrey Test
Steps:
S1: Run OLS on the original model and obtain the residuals $\hat{\varepsilon}_t$.
S2: Run an auxiliary regression of $\hat{\varepsilon}_t$ on X and p lagged values of
$\hat{\varepsilon}_t$. Get $R^2$ from this regression. Note that the number of observations
available for this regression reduces to n − p.
S3: It has been shown that (n-p) times R2 follows the chi-square distribution with
'p' df. That is,
$(n - p) \cdot R^2 \sim \chi^2_p$
Decision rule: If the LHS exceeds the critical chi-square value, reject the null
hypothesis that there is no autocorrelation.
Limitation: The value of p, the length of the lag, cannot be specified a priori. Some
experimentation is inevitable.
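The steps above can be sketched with a small numpy implementation (the data below are simulated for illustration, with an assumed true ρ of 0.8; the 5% critical value for $\chi^2_1$ is 3.84):

```python
import numpy as np

def breusch_godfrey(y, x, p):
    """LM statistic (n - p) * R^2 from the auxiliary regression of the
    OLS residuals on the regressors and p of their own lags."""
    n = len(y)
    X = np.column_stack([np.ones(n), x])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ b                                   # S1: OLS residuals
    # S2: auxiliary regression; usable sample size is n - p
    Z = np.column_stack([X[p:]] + [e[p - j : n - j] for j in range(1, p + 1)])
    u = e[p:]
    g, *_ = np.linalg.lstsq(Z, u, rcond=None)
    r2 = 1 - np.sum((u - Z @ g) ** 2) / np.sum((u - u.mean()) ** 2)
    return (n - p) * r2                             # S3: compare with chi2(p)

# Illustration with simulated AR(1) errors (rho = 0.8)
rng = np.random.default_rng(2)
n = 300
x = rng.normal(size=n)
v = rng.normal(size=n)
eps = np.zeros(n)
for t in range(1, n):
    eps[t] = 0.8 * eps[t - 1] + v[t]
y = 1.0 + 2.0 * x + eps
stat = breusch_godfrey(y, x, p=1)  # far exceeds the 5% cutoff of 3.84
```

With strongly autocorrelated errors the statistic is far above the critical value, so the null of no autocorrelation is rejected.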
Remedial Measures
Use GLS
Or,
$Y_t - \rho Y_{t-1} = \beta_1(1 - \rho) + \beta_2(X_t - \rho X_{t-1}) + (\varepsilon_t - \rho \varepsilon_{t-1})$

$Y_t^{*} = \beta_1^{*} + \beta_2^{*} X_t^{*} + \vartheta_t$
A: Treat ρ as equal to 1
This may be a good approximation even if ρ is merely close to (and not exactly
equal to) 1. In this case, we can use the first-difference form:

$\Delta Y_t = \beta_2 \Delta X_t + \vartheta_t$
Note that
$\hat{\rho} = 1 - d/2$

$\varepsilon_t = \rho \varepsilon_{t-1} + \vartheta_t$
The above methods of first estimating ρ and then using it in GLS are known as
Feasible GLS (FGLS).
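One round of this FGLS procedure (in the Cochrane-Orcutt spirit) can be sketched as follows; the data are simulated for illustration, with assumed true parameters β₁ = 1, β₂ = 2, ρ = 0.7, and the sketch assumes |ρ| < 1:

```python
import numpy as np

def cochrane_orcutt(y, x):
    """One round of FGLS: estimate rho from the OLS residuals, then run
    OLS on the quasi-differenced data (Y_t - rho*Y_{t-1}, X_t - rho*X_{t-1})."""
    n = len(y)
    X = np.column_stack([np.ones(n), x])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ b
    rho = np.sum(e[1:] * e[:-1]) / np.sum(e ** 2)   # rho_hat from residuals
    ys = y[1:] - rho * y[:-1]                       # quasi-differences
    xs = x[1:] - rho * x[:-1]
    Xs = np.column_stack([np.ones(n - 1), xs])
    bs, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
    b1 = bs[0] / (1 - rho)                          # undo beta_1* = beta_1*(1 - rho)
    return b1, bs[1], rho

# Illustration with simulated data (true beta_2 = 2, rho = 0.7)
rng = np.random.default_rng(3)
n = 400
x = rng.normal(size=n)
v = rng.normal(size=n)
eps = np.zeros(n)
for t in range(1, n):
    eps[t] = 0.7 * eps[t - 1] + v[t]
y = 1.0 + 2.0 * x + eps
b1, b2, rho = cochrane_orcutt(y, x)
```

Iterating this step (re-estimating ρ from the new residuals) gives the iterative procedure mentioned below.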
D: Iterative Methods
Approximation, starting with an initial value of ρ; more useful when
higher-order autocorrelation is present
Correcting the OLS standard errors
The d-statistic is closer to zero. Note that with sample size n = 46, α = 0.05,
and k = 1 (excluding the intercept), dL = 1.475 and dU = 1.566. The estimated
value is smaller than dL; therefore, we reject the null of no positive
autocorrelation.
Kennedy notes that the above correction eliminates the asymptotic bias though it
may not completely eliminate the small sample bias.
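A minimal numpy sketch of this kind of correction, in the Newey-West (HAC) spirit: the OLS point estimates are kept, and only the variance estimator is replaced with a kernel-weighted "sandwich" robust to autocorrelation (and heteroscedasticity). The data and lag length below are illustrative assumptions:

```python
import numpy as np

def newey_west_se(y, X, maxlags):
    """OLS coefficients with HAC (Newey-West) standard errors, using
    Bartlett kernel weights w_l = 1 - l/(maxlags + 1)."""
    n, k = X.shape
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ b
    S = (X * (e ** 2)[:, None]).T @ X               # lag-0 ("White") term
    for l in range(1, maxlags + 1):
        w = 1 - l / (maxlags + 1)                   # Bartlett weight
        G = (X[l:] * (e[l:] * e[:-l])[:, None]).T @ X[:-l]
        S += w * (G + G.T)
    XtX_inv = np.linalg.inv(X.T @ X)
    V = XtX_inv @ S @ XtX_inv                       # sandwich covariance
    return b, np.sqrt(np.diag(V))

# Illustration with simulated data
rng = np.random.default_rng(4)
n = 200
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 2.0 * x + rng.normal(size=n)
b, se = newey_west_se(y, X, maxlags=4)
```

With maxlags = 0 the formula reduces to the heteroscedasticity-robust (White) variance; the extra lag terms are what absorb the autocorrelation, which is why only the asymptotic bias is removed.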