
Best Linear Predictor ARMA Forecasting

Unit 12: Forecasting

Department of Statistics, University of Virginia

Spring 2016


Readings for Unit 12

Textbook chapter 3.5.


Last Unit

1 ACF for MA(q)


2 ACF for Causal ARMA(p,q)
3 Partial Autocorrelation Function


This Unit

1 Best linear predictor


2 ARMA forecasting


Motivation

In this unit, we explore forecasting: predicting future values of a time series based on observed data.


1 Best Linear Predictor

2 ARMA Forecasting


Forecasting

In forecasting, the goal is to predict future values of a time series,


xn+m , based on the observed data x = {xn , xn−1 , · · · , x1 }. In this
unit, we assume {xt } is stationary.


Forecasting

The minimum mean square error predictor of x_{n+m} is

x^n_{n+m} = E(x_{n+m} | x),  (1)

as the conditional expectation minimizes the mean square error E[x_{n+m} − g(x)]², where g(x) is a function of the observations.


Forecasting

First, we restrict our attention to predictors that are linear functions of the observations, i.e.

x^n_{n+m} = α_0 + Σ_{j=1}^n α_j x_j,  (2)

where α_0, α_1, ..., α_n ∈ R. Linear predictors of the form (2) that minimize the mean square prediction error are called best linear predictors (BLPs).

Linear prediction depends on the second-order moments of the process, which can be estimated from the data.


Projection Theorem

Theorem
Let M be a closed subspace of the Hilbert space H and let y be an element in H. Then y can be uniquely represented as y = ŷ + z, where ŷ ∈ M and z is orthogonal to M. Therefore, for any w ∈ M,
• ‖y − w‖ ≥ ‖y − ŷ‖, and
• ⟨z, w⟩ = 0.


Projection Theorem

(diagram: the orthogonal decomposition y = ŷ + z, with ŷ the projection of y onto the subspace M)

Projection Theorem: Linear Prediction

Given 1, x_1, x_2, ..., x_n ∈ {X : E(X²) < ∞}, choose α_0, α_1, ..., α_n ∈ R so that U = α_0 + Σ_{j=1}^n α_j x_j minimizes E(x_{n+m} − U)².

Note that:
M = {U = α_0 + Σ_{j=1}^n α_j x_j : α_j ∈ R} = sp{1, x_1, ..., x_n} and
y = x_{n+m}.


Projection Theorem: Linear Prediction

Let x^n_{n+m} denote the best linear predictor, i.e.

‖x^n_{n+m} − x_{n+m}‖² ≤ ‖U − x_{n+m}‖²

for all U ∈ M. The projection theorem implies

Projection Theorem: Linear Prediction

• The prediction errors x^n_{n+m} − x_{n+m} are orthogonal to the prediction variables (1, x_1, ..., x_n).
• Orthogonality of the prediction error and 1 implies we can subtract the mean from all variables x_{n+m} and x_i.
• Therefore, we typically assume µ = 0 for forecasting.


BLP for Stationary Process

Given x_1, ..., x_n, the best linear predictor x^n_{n+m} = α_0 + Σ_{j=1}^n α_j x_j of x_{n+m}, for m ≥ 1, is found by solving

E[(x^n_{n+m} − x_{n+m}) x_k] = 0 for k = 0, 1, ..., n,  (3)

where x_0 = 1, for α_0, α_1, ..., α_n. The equations (3) are called the prediction equations.


One-Step-Ahead Linear Prediction


Consider one-step-ahead prediction. Given x_1, ..., x_n, we want to forecast x_{n+1}. The BLP takes the form

x^n_{n+1} = φ_{n1} x_n + φ_{n2} x_{n−1} + ... + φ_{nn} x_1.  (4)

Therefore, the prediction equations (3) become


One-Step-Ahead Linear Prediction


In matrix form: Γ_n φ_n = γ_n, where Γ_n = {γ(k − j)}_{j,k=1}^n, φ_n = (φ_{n1}, ..., φ_{nn})′, and γ_n = (γ(1), ..., γ(n))′.

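The prediction equations can be solved numerically once the autocovariances are known. A minimal sketch, assuming the standard matrix form Γ_n φ_n = γ_n with Γ_n = {γ(k − j)} and γ_n = (γ(1), ..., γ(n))′; the AR(1) autocovariance γ(h) = φ^|h| / (1 − φ²) (with σ_w² = 1) is used purely for illustration:

```python
import numpy as np

# Solve the one-step-ahead prediction equations Gamma_n phi_n = gamma_n.
# The autocovariance below is the AR(1) one, gamma(h) = phi^|h| / (1 - phi^2)
# with sigma_w^2 = 1; it is an illustrative choice, not from the handout.
phi = 0.6
n = 5

def gamma(h):
    return phi ** abs(h) / (1 - phi ** 2)

# Gamma_n[j, k] = gamma(j - k); gamma_n[k] = gamma(k + 1)
Gamma = np.array([[gamma(j - k) for k in range(n)] for j in range(n)])
gamma_vec = np.array([gamma(h) for h in range(1, n + 1)])

phi_n = np.linalg.solve(Gamma, gamma_vec)
# For an AR(1), the BLP uses only the most recent observation:
# phi_n is (phi, 0, ..., 0), so x^n_{n+1} = phi * x_n.
print(phi_n)
```

For an AR(p) the same solve recovers the AR coefficients once n ≥ p, matching the form of the BLP in (4).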

One-Step-Ahead Linear Prediction

The mean square one-step-ahead prediction error is

P^n_{n+1} = E(x_{n+1} − x^n_{n+1})² = γ(0) − γ′_n Γ_n^{−1} γ_n.  (5)


Prediction Intervals

For Gaussian processes, the prediction error has distribution N(0, P^n_{n+1}), so we can construct the 95% prediction interval

x^n_{n+1} ± 1.96 √(P^n_{n+1}).


1 Best Linear Predictor

2 ARMA Forecasting


ARMA Forecasting

The prediction equations (3) for stationary processes provide little insight into forecasting for ARMA models. Let's consider an ARMA model that is causal and invertible,

φ(B) x_t = θ(B) w_t,

where

φ(B) = 1 − φ_1 B − ... − φ_p B^p and θ(B) = 1 + θ_1 B + ... + θ_q B^q.


ARMA Forecasting

By causality and invertibility, we have

x_{n+m} = Σ_{j=0}^∞ ψ_j w_{n+m−j},  ψ_0 = 1,  (6)

where ψ(z) = θ(z)/φ(z) = Σ_{j=0}^∞ ψ_j z^j, and

w_{n+m} = Σ_{j=0}^∞ π_j x_{n+m−j},  π_0 = 1,  (7)

where π(z) = φ(z)/θ(z) = Σ_{j=0}^∞ π_j z^j.
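The ψ_j can be computed by matching coefficients of z in φ(z) ψ(z) = θ(z), which gives the recursion ψ_j = θ_j + Σ_{i=1}^{min(j,p)} φ_i ψ_{j−i}, with θ_j = 0 for j > q. A minimal sketch; the function name and the ARMA(1,1) numbers are illustrative:

```python
def psi_weights(phi, theta, m):
    """First m psi weights of a causal ARMA model, from the coefficient
    recursion psi_j = theta_j + sum_{i=1}^{min(j,p)} phi_i * psi_{j-i}."""
    psi = [1.0]  # psi_0 = 1
    for j in range(1, m):
        th = theta[j - 1] if j <= len(theta) else 0.0
        ar = sum(phi[i - 1] * psi[j - i] for i in range(1, min(j, len(phi)) + 1))
        psi.append(th + ar)
    return psi

# ARMA(1,1) with phi_1 = 0.5, theta_1 = 0.4 (illustrative values);
# here psi_j = phi^(j-1) * (phi + theta) for j >= 1
print(psi_weights([0.5], [0.4], 4))  # approximately [1.0, 0.9, 0.45, 0.225]
```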


ARMA Forecasting

Given the past information x_n, x_{n−1}, ..., we are interested in predicting x_{n+m}. We use x̃_{n+m} = E(x_{n+m} | x_n, x_{n−1}, ...), the conditional expectation of x_{n+m} given the entire past, to forecast x_{n+m}.


ARMA Forecasting

Note also that

E(w_t | x_n, x_{n−1}, ...) = 0

for t > n because of causality. For t ≤ n, w_t is determined by x_t, x_{t−1}, ..., which are included in x_n, x_{n−1}, .... Thus,

E(w_t | x_n, x_{n−1}, ...) = w_t.


ARMA Forecasting

So

E(w_t | x_n, x_{n−1}, ...) = 0 if t > n, and w_t if t ≤ n.  (8)


ARMA Forecasting

Now, take the infinite AR representation (7) and apply the conditional expectation (conditioning on x_n, x_{n−1}, ...) to both sides of (7) to get


ARMA Forecasting

This leads to

x̃_{n+m} = − Σ_{j=1}^{m−1} π_j x̃_{n+m−j} − Σ_{j=m}^∞ π_j x_{n+m−j}.  (9)

Letting m = 1 in (9), we have

x̃_{n+1} = − Σ_{j=1}^∞ π_j x_{n+1−j} = − Σ_{j′=0}^∞ π_{j′+1} x_{n−j′}.

So, start by finding x̃_{n+1} and then recursively use (9) to find the later x̃_{n+m}. This is called the          .


Worked Example

We have an AR(2) model x_t = φ_1 x_{t−1} + φ_2 x_{t−2} + w_t. Suppose we have observations up to the nth term, i.e. x_n, x_{n−1}, .... We wish to use the observed data and the estimated AR(2) model to forecast the values of x_{n+1} and x_{n+2}. Writing out these two values, we have

x_{n+1} = φ_1 x_n + φ_2 x_{n−1} + w_{n+1},
x_{n+2} = φ_1 x_{n+1} + φ_2 x_n + w_{n+2}.


Worked Example

To forecast x_{n+1}, we use the observed values of x_n and x_{n−1} and replace w_{n+1} by its expected value of 0.

Forecasting x_{n+2} poses a challenge, since it requires the unobserved value of x_{n+1}. We use the forecasted value of x_{n+1}.


Framework in Forecasting

In general, the forecasting procedure for an ARMA(p,q) model is


as follows:
• For any wj with 1 ≤ j ≤ n, use the sample residual at time j.
• For any wj with j > n, use the expected value of wj , which is
0.
• For any xj with 1 ≤ j ≤ n, use the observed value of xj .
• For any xj with j > n, use the forecasted value of xj .
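For a pure AR model with zero mean, the rules above reduce to a short recursion: future w's are replaced by 0 and future x's by their own forecasts. A minimal sketch (the AR(2) coefficients and data are illustrative):

```python
def ar_forecast(x, phis, m):
    """m-step recursive forecasts for a zero-mean AR(p) model.
    Future noise terms are set to 0; forecasted values stand in
    for the unobserved x_j with j > n."""
    hist = list(x)  # observed series, most recent value last
    forecasts = []
    for _ in range(m):
        nxt = sum(p * hist[-(i + 1)] for i, p in enumerate(phis))
        forecasts.append(nxt)
        hist.append(nxt)
    return forecasts

# AR(2) with illustrative phi_1 = 0.5, phi_2 = -0.3, last observations 1, 2:
# first forecast is 0.5*2 - 0.3*1 = 0.7, second is 0.5*0.7 - 0.3*2 = -0.25
print(ar_forecast([1.0, 2.0], [0.5, -0.3], 2))
```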


Prediction Error

We use the infinite MA representation (6) and write


Prediction Error

Therefore, the mean square prediction error, i.e. the variance of the difference between the forecasted value and the true value at time n + m, is

P^n_{n+m} = E(x_{n+m} − x̃_{n+m})² = σ_w² Σ_{j=0}^{m−1} ψ_j²,  (10)

and the standard error of the forecast error at time n + m is

√( σ̂_w² Σ_{j=0}^{m−1} ψ_j² ).  (11)


Prediction Error

Question: Write out the standard error of the forecast error for
m = 1 and m = 2.
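As a numerical check for this question, (11) can be evaluated directly. A minimal sketch assuming AR(1) weights ψ_j = φ^j, with illustrative values φ = 0.6 and σ̂_w² = 1:

```python
import math

def forecast_se(psi, sigma2_w, m):
    """Standard error of the m-step forecast error, eq. (11):
    sqrt(sigma_w^2 * sum_{j=0}^{m-1} psi_j^2)."""
    return math.sqrt(sigma2_w * sum(psi[j] ** 2 for j in range(m)))

phi, sigma2_w = 0.6, 1.0
psi = [phi ** j for j in range(10)]  # AR(1): psi_j = phi^j (illustrative)

print(forecast_se(psi, sigma2_w, 1))  # m = 1: sigma_w = 1.0
print(forecast_se(psi, sigma2_w, 2))  # m = 2: sqrt(1 + 0.6^2) = sqrt(1.36)
```

In general the m = 1 standard error is σ_w (since ψ_0 = 1), and the m = 2 standard error is σ_w √(1 + ψ_1²).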


Prediction Error

Notice that as m gets larger, i.e. as we predict farther into the future, the prediction error increases but essentially asymptotes. This means that you get an essentially constant prediction interval beyond a certain distance into the future, as if we did not know what was going on previously.


Prediction Error

Also, for fixed sample size n, the prediction errors are correlated. For h ≥ 1,

E{(x_{n+m} − x̃_{n+m})(x_{n+m+h} − x̃_{n+m+h})} = σ_w² Σ_{j=0}^{m−1} ψ_j ψ_{j+h}.

Prediction Interval

For Gaussian processes, the 95% prediction interval for x_{n+m}, the future value of the series at time n + m, is

x^n_{n+m} ± 1.96 √( σ̂_w² Σ_{j=0}^{m−1} ψ_j² ).  (12)


Worked Example

Question: We have an AR(1) model x_t = 40 + 0.6 x_{t−1} + w_t. Suppose we have n = 100 observations, σ̂_w² = 1, and x_100 = 80. We wish to forecast the values at times 101 and 102.
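A sketch of the computation: with the intercept, the point forecasts are x̃_{n+m} = 40 + 0.6 x̃_{n+m−1}, and the AR(1) weights ψ_j = 0.6^j give the prediction intervals via (12):

```python
import math

phi, intercept, sigma2_w = 0.6, 40.0, 1.0
x100 = 80.0

# Point forecasts: replace future noise by 0 and future x's by forecasts
x101 = intercept + phi * x100  # 40 + 0.6*80 = 88.0
x102 = intercept + phi * x101  # 40 + 0.6*88 = 92.8

# 95% prediction intervals, eq. (12), with psi_j = phi^j for an AR(1)
se1 = math.sqrt(sigma2_w)                  # m = 1
se2 = math.sqrt(sigma2_w * (1 + phi**2))   # m = 2: sqrt(1.36)
print(x101, x101 - 1.96 * se1, x101 + 1.96 * se1)  # 88 with interval (86.04, 89.96)
print(x102, x102 - 1.96 * se2, x102 + 1.96 * se2)
```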

