Regression with Stationary Time-Series Data
Walter R. Paczkowski
Rutgers University
Principles of Econometrics, 4th Edition
Chapter 9: Regression with Time Series Data: Stationary Variables
Chapter Contents
9.1 Introduction
9.2 Stationary and Weak Dependence
9.3 Serial Correlation
9.4 Other Tests for Serially Correlated Errors
9.5 Estimation with Serially Correlated Errors
9.6 Autoregressive Distributed Lag Models
9.7 Forecasting
9.8 Multiplier Analysis
(9.1) $y_t = \alpha + \beta_0 x_t + \beta_1 x_{t-1} + \beta_2 x_{t-2} + \dots + \beta_q x_{t-q} + e_t$
(9.10) $e_t = \rho e_{t-1} + v_t$
To estimate $\rho_1$:
$$\widehat{\mathrm{cov}}(x_t, x_{t-1}) = \frac{1}{T-1}\sum_{t=2}^{T}(x_t - \bar{x})(x_{t-1} - \bar{x})$$
$$\widehat{\mathrm{var}}(x_t) = \frac{1}{T-1}\sum_{t=1}^{T}(x_t - \bar{x})^2$$
(9.19) $r_1 = \dfrac{\sum_{t=2}^{T}(x_t - \bar{x})(x_{t-1} - \bar{x})}{\sum_{t=1}^{T}(x_t - \bar{x})^2}$
(9.20) $r_s = \dfrac{\sum_{t=s+1}^{T}(x_t - \bar{x})(x_{t-s} - \bar{x})}{\sum_{t=1}^{T}(x_t - \bar{x})^2}$
$Z = \dfrac{r_s - 0}{\sqrt{1/T}} = \sqrt{T}\, r_s \overset{a}{\sim} N(0, 1)$
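As an illustration (not from the text), here is a minimal Python sketch of the sample autocorrelation in Eq. 9.20 and the Z test above; the function name sample_autocorr and the simulated data are ours, standing in for a series such as DU.

```python
import numpy as np

def sample_autocorr(x, s):
    """s-th order sample autocorrelation r_s as in Eq. (9.20)."""
    x = np.asarray(x, dtype=float)
    xbar = x.mean()
    num = np.sum((x[s:] - xbar) * (x[:-s] - xbar))  # sum over t = s+1, ..., T
    den = np.sum((x - xbar) ** 2)                   # sum over t = 1, ..., T
    return num / den

# Test H0: rho_s = 0 using Z = sqrt(T) * r_s, approximately N(0, 1)
rng = np.random.default_rng(0)
x = rng.normal(size=200)          # placeholder data; replace with the actual series
T = len(x)
for s in (1, 2, 3):
    r_s = sample_autocorr(x, s)
    Z = np.sqrt(T) * r_s
    print(f"s={s}: r_s={r_s:.3f}, Z={Z:.2f}, significant at 5%: {abs(Z) > 1.96}")
```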
9.3.1 Serial Correlation in Output Growth
[Figure: scatter diagram of DU_{t-1} against DU_t, "Scatter for change in U"]
DU_t is the change in the unemployment rate in quarter t.
DU_t is described as autocorrelated or serially correlated: the assumption of no autocorrelation is violated.
9.3 Serial Correlation
FIGURE 9.5 Scatter diagram for G_t and G_{t-1}
9.3.1 Serial Correlation in Output Growth
9.3.1a Computing Autocorrelation
To test whether cov(DU_t, DU_s) = 0, we can use the correlation between DU's that are k periods apart, known as the k-th order autocorrelation (ρ_k). Its sample counterpart is:
Eq. 9.14: $r_k = \dfrac{\sum_{t=k+1}^{T}(DU_t - \overline{DU})(DU_{t-k} - \overline{DU})}{\sum_{t=1}^{T}(DU_t - \overline{DU})^2}$
9.3.1a Computing Autocorrelation
For our problem, we have:
9.3.1b The Correlogram
9.3.2a A Phillips Curve
The k-th order autocorrelation for the residuals can be written as:
Eq. 9.21: $r_k = \dfrac{\sum_{t=k+1}^{T} \hat{e}_t \hat{e}_{t-k}}{\sum_{t=1}^{T} \hat{e}_t^2}$
– The least squares equation is:
Eq. 9.22: $\widehat{INF} = 0.7776 - 0.5279\, DU$
(se)                 (0.0658)   (0.2294)
9.3.2a A Phillips Curve
The correlogram, a sequence of autocorrelations r₁, r₂, r₃, …, can be used to check whether the assumption cov(e_t, e_s) = 0 for t ≠ s is violated.
The correlogram is also called the sample autocorrelation function.
[Figure: Correlogram of Residuals]
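A correlogram like the one on this slide can be produced with statsmodels; a minimal sketch, assuming statsmodels and matplotlib are installed and using simulated residuals as a stand-in for the Phillips curve residuals:

```python
import numpy as np
import matplotlib.pyplot as plt
from statsmodels.graphics.tsaplots import plot_acf

# Placeholder residuals; in practice use the residuals from the Phillips curve
# regression, e.g. results.resid from a statsmodels OLS fit.
rng = np.random.default_rng(1)
ehat = rng.normal(size=90)

# plot_acf draws r_1, r_2, ... together with approximate 95% significance bounds
fig = plot_acf(ehat, lags=12, zero=False, title="Correlogram of Residuals")
plt.show()
```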
9.4.1 A Lagrange Multiplier Test
Consider:
$y_t = \beta_1 + \beta_2 x_t + e_t$
For the autocorrelation LM test, we write:
$y_t = \beta_1 + \beta_2 x_t + \rho \hat{e}_{t-1} + v_t$
If H₀: ρ = 0 is true, then LM = T × R² has an approximate χ²(1) distribution.
– T and R² are the sample size and goodness-of-fit statistic, respectively, from least squares estimation of Eq. 9.26.
Applying the autocorrelation LM test:
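One way to carry out this test in practice is statsmodels' Breusch-Godfrey routine, which computes the T × R² statistic from the auxiliary regression (with pre-sample residuals set to zero). A sketch with placeholder data standing in for INF and DU:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.diagnostic import acorr_breusch_godfrey

# Placeholder data standing in for the Phillips curve variables INF and DU;
# replace with the actual series.
rng = np.random.default_rng(2)
df = pd.DataFrame({"INF": rng.normal(0.8, 0.6, 90), "DU": rng.normal(0.0, 0.3, 90)})

# Step 1: least squares estimation of INF on DU
ols_res = sm.OLS(df["INF"], sm.add_constant(df["DU"])).fit()

# Step 2: LM = T * R^2 from the auxiliary regression with one lagged residual
lm_stat, lm_pval, f_stat, f_pval = acorr_breusch_godfrey(ols_res, nlags=1)
print(f"LM = {lm_stat:.3f}, p-value = {lm_pval:.4f}")
```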
9.5.1 Least Squares Estimation
Consequences of using least squares estimation without recognizing the existence of serially correlated errors:
1. Still a linear unbiased estimator, but no longer the best.
2. The standard errors computed on the basis of uncorrelated errors are no longer correct.
9.5.1 Least Squares Estimation
Let's reconsider the Phillips Curve model:
Eq. 9.29: $\widehat{INF} = 0.7776 - 0.5279\, DU$
(0.0658) (0.2294) incorrect se
(0.1030) (0.3127) HAC se
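A sketch of how both sets of standard errors could be obtained with statsmodels; the data below are placeholders for INF and DU, and maxlags is a user-chosen bandwidth, not a value from the text:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Placeholder Phillips curve data; replace with the actual INF and DU series.
rng = np.random.default_rng(3)
df = pd.DataFrame({"INF": rng.normal(0.8, 0.6, 90), "DU": rng.normal(0.0, 0.3, 90)})
X = sm.add_constant(df["DU"])

# Conventional least squares standard errors (incorrect under serial correlation)
ols_res = sm.OLS(df["INF"], X).fit()

# HAC (Newey-West) standard errors
hac_res = sm.OLS(df["INF"], X).fit(cov_type="HAC", cov_kwds={"maxlags": 4})

print(pd.DataFrame({"coef": ols_res.params,
                    "incorrect se": ols_res.bse,
                    "HAC se": hac_res.bse}))
```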
9.5.2 Estimating an AR(1) Error Model
Return to the Lagrange multiplier test for serially correlated errors, where we used the equation:
Eq. 9.30: $e_t = \rho e_{t-1} + v_t$
Eq. 9.30 describes a first-order autoregressive model, or a first-order autoregressive process, for e_t.
– The term AR(1) model is used as an abbreviation for first-order autoregressive model.
– It is called an autoregressive model because it can be viewed as a regression model.
– It is called first-order because the right-hand-side variable is e_t lagged one period.
9.5.2a Properties of an AR(1) Error
We assume that:
Eq. 9.32: $-1 < \rho < 1$
The mean and variance of e_t are:
Eq. 9.33: $E(e_t) = 0, \quad \mathrm{var}(e_t) = \sigma_e^2 = \dfrac{\sigma_v^2}{1 - \rho^2}$
The covariance term is:
Eq. 9.34: $\mathrm{cov}(e_t, e_{t-k}) = \dfrac{\rho^k \sigma_v^2}{1 - \rho^2}, \quad k > 0$
The correlation implied by the covariance is:
Eq. 9.35: $\rho_k = \mathrm{corr}(e_t, e_{t-k}) = \dfrac{\mathrm{cov}(e_t, e_{t-k})}{\sqrt{\mathrm{var}(e_t)\,\mathrm{var}(e_{t-k})}} = \dfrac{\mathrm{cov}(e_t, e_{t-k})}{\mathrm{var}(e_t)} = \dfrac{\rho^k \sigma_v^2 / (1 - \rho^2)}{\sigma_v^2 / (1 - \rho^2)} = \rho^k$
Setting k = 1:
Eq. 9.36: $\rho_1 = \mathrm{corr}(e_t, e_{t-1}) = \rho$
Each e_t depends on all past values of the errors v_t:
Eq. 9.37: $e_t = v_t + \rho v_{t-1} + \rho^2 v_{t-2} + \rho^3 v_{t-3} + \cdots$
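A small simulation (not from the text) can be used to check Eq. 9.33 and Eq. 9.35 numerically; the values ρ = 0.5 and σ_v = 1 are arbitrary choices:

```python
import numpy as np

# Simulate an AR(1) error e_t = rho * e_{t-1} + v_t and compare with theory
rho, sigma_v, T = 0.5, 1.0, 100_000
rng = np.random.default_rng(4)
v = rng.normal(0.0, sigma_v, T)

e = np.empty(T)
e[0] = v[0]
for t in range(1, T):
    e[t] = rho * e[t - 1] + v[t]

print("var(e): simulated =", round(e.var(), 3),
      " theory =", round(sigma_v**2 / (1 - rho**2), 3))
for k in (1, 2, 3):
    corr_k = np.corrcoef(e[k:], e[:-k])[0, 1]
    print(f"corr(e_t, e_t-{k}): simulated = {corr_k:.3f}  theory = {rho**k:.3f}")
```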
9.5.2b Nonlinear Least Squares Estimation
Write the Phillips curve model with an AR(1) error as:
Eq. 9.38: $y_t = \beta_1 + \beta_2 x_t + e_t \quad \text{with} \quad e_t = \rho e_{t-1} + v_t, \quad -1 < \rho < 1$
Although Eq. 9.43 is a linear function of the variables x_t, y_{t-1} and x_{t-1}, it is not a linear function of the parameters (β₁, β₂, ρ).
– The coefficient of x_{t-1} equals −ρβ₂.
– These are nonlinear least squares (NLS) estimates.
EViews: INF = C(1)*(1-C(3)) + C(2)*DU + C(3)*INF(-1) - C(3)*C(2)*DU(-1)
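The same nonlinear least squares problem can be sketched in Python with scipy.optimize.least_squares, minimizing the sum of squared v_t implied by Eq. 9.43; the data below are placeholders for INF and DU and the starting values are arbitrary:

```python
import numpy as np
import pandas as pd
from scipy.optimize import least_squares

# Placeholder data standing in for INF and DU; replace with the actual series.
rng = np.random.default_rng(5)
df = pd.DataFrame({"INF": rng.normal(0.8, 0.6, 91), "DU": rng.normal(0.0, 0.3, 91)})
y, x = df["INF"].to_numpy(), df["DU"].to_numpy()

def residuals(params):
    """v_t = y_t - b1*(1-rho) - b2*x_t - rho*y_{t-1} + rho*b2*x_{t-1} (Eq. 9.43)."""
    b1, b2, rho = params
    return (y[1:] - b1 * (1 - rho) - b2 * x[1:]
            - rho * y[:-1] + rho * b2 * x[:-1])

# Nonlinear least squares: minimize the sum of squared v_t over (b1, b2, rho)
nls = least_squares(residuals, x0=[0.5, -0.5, 0.5])
b1, b2, rho = nls.x
print(f"beta1 = {b1:.4f}, beta2 = {b2:.4f}, rho = {rho:.4f}")
```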
9.5.3 Estimating a More General Model
Eq. 9.43 is:
$y_t = \beta_1(1 - \rho) + \beta_2 x_t + \rho y_{t-1} - \rho \beta_2 x_{t-1} + v_t$
Note that:
Eq. 9.48: $\delta = \beta_1(1 - \rho), \quad \delta_0 = \beta_2, \quad \delta_1 = -\rho\beta_2, \quad \theta_1 = \rho$
9.5.3 Estimating a More General Model
Applying the least squares estimator to Eq. 9.47 using the data for the Phillips curve example yields:
Eq. 9.49: $\widehat{INF}_t = 0.3336 + 0.5593\, INF_{t-1} - 0.6882\, DU_t + 0.3200\, DU_{t-1}$
(se)                     (0.0899)   (0.0908)              (0.2575)          (0.2499)
Compare $\hat{\delta}_1 = 0.3200$ with the value implied by the AR(1) error model, $-\hat{\rho}\hat{\beta}_2 = -(0.5574)(-0.6944) = 0.3871$.
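A sketch of the unrestricted estimation and the restriction check using ordinary least squares in statsmodels; the data are placeholders for INF and DU:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Placeholder data standing in for INF and DU; replace with the actual series.
rng = np.random.default_rng(6)
df = pd.DataFrame({"INF": rng.normal(0.8, 0.6, 91), "DU": rng.normal(0.0, 0.3, 91)})

# Build the unrestricted model of Eq. 9.47:
#   INF_t = delta + theta1*INF_{t-1} + delta0*DU_t + delta1*DU_{t-1} + v_t
data = pd.DataFrame({
    "INF": df["INF"], "INF_lag1": df["INF"].shift(1),
    "DU": df["DU"], "DU_lag1": df["DU"].shift(1),
}).dropna()

res = sm.OLS(data["INF"], sm.add_constant(data[["INF_lag1", "DU", "DU_lag1"]])).fit()
print(res.params)

# The AR(1) error model imposes the restriction delta1 = -theta1 * delta0;
# compare the unrestricted estimate of delta1 with the implied value.
implied = -res.params["INF_lag1"] * res.params["DU"]
print("unrestricted delta1 =", round(res.params["DU_lag1"], 4),
      "  implied -theta1*delta0 =", round(implied, 4))
```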
9.5.4 Summary of Section 9.5 and Looking Ahead
We have described three ways of overcoming the effect of serially correlated errors:
1. Estimate the model using least squares with HAC standard errors.
2. Use nonlinear least squares to estimate the model with a lagged x, a lagged y, and the restriction implied by an AR(1) error specification.
3. Use least squares to estimate the model with a lagged x and a lagged y, but without the restriction implied by an AR(1) error specification.
9.6 Autoregressive Distributed Lag Models
Eq. 9.52: $y_t = \delta + \delta_0 x_t + \delta_1 x_{t-1} + \cdots + \delta_q x_{t-q} + \theta_1 y_{t-1} + \cdots + \theta_p y_{t-p} + v_t$
– Two examples:
ARDL(1,1): $\widehat{INF}_t = 0.3336 + 0.5593\, INF_{t-1} - 0.6882\, DU_t + 0.3200\, DU_{t-1}$
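For reference, recent statsmodels releases provide an ARDL class; a minimal sketch of estimating an ARDL(1,1) with placeholder data (the order argument sets the lags 0, …, q on DU):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.ardl import ARDL

# Placeholder data standing in for INF and DU; replace with the actual series.
rng = np.random.default_rng(7)
df = pd.DataFrame({"INF": rng.normal(0.8, 0.6, 91), "DU": rng.normal(0.0, 0.3, 91)})

# ARDL(1,1): one lag of the dependent variable, current and one lagged value of DU
model = ARDL(df["INF"], lags=1, exog=df[["DU"]], order=1, trend="c")
res = model.fit()
print(res.params)
print("AIC:", round(res.aic, 3), " SC (BIC):", round(res.bic, 3))
```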
9.6.1 The Phillips Curve
Consider the previously estimated ARDL(1,0) model:
Eq. 9.56: $\widehat{INF}_t = 0.3548 + 0.5282\, INF_{t-1} - 0.4909\, DU_t, \quad \text{obs} = 90$
(se)                     (0.0876)   (0.0851)              (0.1921)
For an ARDL(4,0) version of the model:
$\widehat{INF}_t = 0.1001 + 0.2354\, INF_{t-1} + 0.1213\, INF_{t-2} + 0.1677\, INF_{t-3} + \cdots$
Table 9.4 AIC and SC Values for Phillips Curve ARDL Models
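A sketch of how such a table could be built, looping over ARDL(p, q) specifications and collecting AIC and SC (BIC) values; the data are placeholders, and note that different lag lengths use slightly different estimation samples:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.ardl import ARDL

# Placeholder data; replace with the actual INF and DU series.
rng = np.random.default_rng(8)
df = pd.DataFrame({"INF": rng.normal(0.8, 0.6, 91), "DU": rng.normal(0.0, 0.3, 91)})

# Compare AIC and SC across ARDL(p, q) specifications
rows = []
for p in range(1, 5):
    for q in range(0, 2):
        res = ARDL(df["INF"], lags=p, exog=df[["DU"]], order=q, trend="c").fit()
        rows.append({"p": p, "q": q, "AIC": res.aic, "SC": res.bic})
print(pd.DataFrame(rows).round(3))
```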
9.7.1 Forecasting with an AR Model
$G_{T+1} = \delta + \theta_1 G_T + \theta_2 G_{T-1} + v_{T+1}$
For two quarters ahead, the forecast for G_{2010Q1} is:
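A sketch of the recursive forecasts from an estimated AR(2) model, using statsmodels' AutoReg and a simulated stand-in for the G series; the two-step-ahead forecast substitutes the one-step forecast for the unknown G_{T+1}:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.ar_model import AutoReg

# Placeholder series standing in for GDP growth G; replace with the actual data.
rng = np.random.default_rng(9)
g = pd.Series(0.8 + 0.3 * rng.normal(size=98)).rolling(2).mean().dropna()

# Estimate the AR(2) model G_t = delta + theta1*G_{t-1} + theta2*G_{t-2} + v_t
res = AutoReg(g.to_numpy(), lags=2, trend="c").fit()
delta, theta1, theta2 = res.params

# Recursive forecasts: one quarter ahead uses observed G_T and G_{T-1};
# two quarters ahead replaces the unknown G_{T+1} with its forecast.
G_T, G_Tm1 = g.iloc[-1], g.iloc[-2]
G_hat_1 = delta + theta1 * G_T + theta2 * G_Tm1
G_hat_2 = delta + theta1 * G_hat_1 + theta2 * G_T
print(f"one-step forecast: {G_hat_1:.4f}")
print(f"two-step forecast: {G_hat_2:.4f}")
```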
9.1.1 Dynamic Nature of Relationships
Ways to model the dynamic relationship:
1. Specify that a dependent variable y is a function of current and past values of an explanatory variable x:
Eq. 9.1: $y_t = f(x_t, x_{t-1}, x_{t-2}, \ldots)$
• e_t is correlated with e_{t-1}
9.1 Introduction
9.1.2 Least Squares Assumptions
The primary assumption is Assumption MR4:
$\mathrm{cov}(y_i, y_j) = \mathrm{cov}(e_i, e_j) = 0 \text{ for } i \neq j$
9.2.1 Assumptions
TSMR1. $y_t = \beta_0 x_t + \beta_1 x_{t-1} + \beta_2 x_{t-2} + \cdots + \beta_q x_{t-q} + e_t, \quad t = q+1, \ldots, T$
TSMR2. y and x are stationary random variables, and et is independent of
current, past and future values of x.
TSMR3. E(et) = 0
TSMR4. var(et) = σ2
TSMR5. cov(et, es) = 0 t ≠ s
TSMR6. et ~ N(0, σ2)
9.2.2 An Example: Okun's Law
Okun's Law implies that:
$DU_t = \beta_0 G_t + e_t$
We can expand this to include lags:
$DU_t = \beta_0 G_t + \beta_1 G_{t-1} + \beta_2 G_{t-2} + \cdots + \beta_q G_{t-q} + e_t$
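A sketch of estimating such a finite distributed lag model by least squares with statsmodels, using placeholder series for DU and G and q = 3 lags (an intercept is included via add_constant):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Placeholder series standing in for DU and G; replace with the Okun's law data.
rng = np.random.default_rng(10)
df = pd.DataFrame({"DU": rng.normal(0.0, 0.3, 100), "G": rng.normal(0.8, 0.5, 100)})

# Build the distributed lag regressors G_t, G_{t-1}, ..., G_{t-q}
q = 3
lags = pd.concat({f"G_lag{j}": df["G"].shift(j) for j in range(q + 1)}, axis=1)
data = pd.concat([df["DU"], lags], axis=1).dropna()

# Least squares estimation of the finite distributed lag model
res = sm.OLS(data["DU"], sm.add_constant(data.drop(columns="DU"))).fit()
print(res.params)
```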