TS Part 2
Forecasting
Seasonal variation
Removing trends

Ioannis Papastathopoulos
School of Mathematics, University of Edinburgh
MATH11131 Part 2: Forecasting, seasonal variation and filtering

Forecasting
- Forecasting is dangerous
- Straight-line regression
- ARMA forecasting
- Forecasting X_{n+1} at time n
- Finding X^n_{n+h} for h > 1
- Forecast error
- Comments
- Prediction uncertainty
- ARIMA forecasting
- Exponential smoothing
- Exponential smoothing and ARIMA(0,1,1) models
- Summary
- Adjustments for a finite number of past observations
- Holt's method
- Recursive formula for Holt's method
- Holt's method and the ARIMA(0,2,2) model
Seasonal variation
Removing trends
Forecasting is dangerous
Straight-line regression

Consider forecasting for the iid-error model

    X_t = β0 + β1(t − t̄) + w_t,   t = 1, 2, …, n,   w_t ∼ (0, σ²),   t̄ = n⁻¹ Σ_t t,

for which the (uncorrelated) estimators are β̂0 = X̄ and β̂1 = Σ X_t(t − t̄) / Σ (t − t̄)².

The natural predictor is X^n_{n+h} = β̂0 + β̂1(n + h − t̄) and this has variance

    var(X^n_{n+h}) = V(h) = σ² [ 1/n + (n + h − t̄)² / Σ (t − t̄)² ],

which increases quadratically in n + h − t̄, but tends to zero for fixed h as n → ∞.

If the model is correct, the future observations will be X_{n+h} = β0 + β1(n + h − t̄) + w_{n+h}, where w_{n+h} is independent of the previous data, and then

    var(X_{n+h} − X^n_{n+h}) = σ² + V(h) → σ²    (1)

even if the sample size n → ∞.

The terms in (1) represent the uncertainty due to intrinsic variability of the system, σ², and that due to estimating the system, V(h), to which must be added the variability due to guessing the system (here a linear model) from the data.
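The predictor and its plug-in variance V(h) can be sketched as follows (a minimal pure-Python illustration on a made-up series; using the residual estimate of σ² is a standard choice, not spelled out in the notes):

```python
def straight_line_forecast(x, h):
    """Forecast X_{n+h} from a least-squares straight line fitted to x_1..x_n,
    returning the point forecast and the plug-in variance
    V(h) = sigma^2 * (1/n + (n + h - tbar)^2 / sum_t (t - tbar)^2)."""
    n = len(x)
    tbar = (n + 1) / 2
    sxx = sum((t - tbar) ** 2 for t in range(1, n + 1))
    b1 = sum(xt * (t - tbar) for t, xt in enumerate(x, start=1)) / sxx
    b0 = sum(x) / n
    # residual estimate of sigma^2 (an assumption for this sketch)
    sigma2 = sum((xt - b0 - b1 * (t - tbar)) ** 2
                 for t, xt in enumerate(x, start=1)) / (n - 2)
    forecast = b0 + b1 * (n + h - tbar)
    V = sigma2 * (1 / n + (n + h - tbar) ** 2 / sxx)
    return forecast, V

# sanity check on a noiseless line x_t = 2 + 0.5 t: the forecast is exact
x = [2 + 0.5 * t for t in range(1, 21)]
f, V = straight_line_forecast(x, h=3)
```

On noiseless data the residuals vanish, so V(h) is zero and the forecast reproduces the line exactly.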
ARMA forecasting

– Forecasting is the process of estimating future values of an observed time series.
– The problem is to estimate the value of X_{n+h} for some integer h given observations x_1, …, x_n.
– We will discuss forecasting for regular ARMA(p,q) models.
– To simplify matters, we will assume that there is no constant term.

Notation:
– Let X^n_{n+h} denote the forecast made at time n for X_{n+h}.
– E.g. when h = 1 we have X^n_{n+1} = forecast for X_{n+1}, made at time n.

General idea:
– The general idea is to set w_t = 0 (its mean) for any future value of t. Thus, at time n, w_{n+1}, w_{n+2}, … are set to zero. X_t is set to its forecasted value for future values of t. For example, for a forecast made at time n, X_{n+1} is set to X^n_{n+1}. Then, forecasts are made using the form of the model. The approach discussed is known as the Box-Jenkins approach.
Forecasting X_{n+1} at time n

Substituting this into (2), we get our forecast:

    X^n_{n+1} = −π1 X_n − π2 X_{n−1} − …
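As a concrete illustration (with made-up numbers, not from the notes): for an AR(2) model X_t = φ1 X_{t−1} + φ2 X_{t−2} + w_t the π-weights are simply π_j = −φ_j, so the one-step forecast reduces to:

```python
# One-step Box-Jenkins forecast for an AR(2): setting w_{n+1} = 0 gives
# X^n_{n+1} = -pi_1 X_n - pi_2 X_{n-1} = phi_1 X_n + phi_2 X_{n-1}.
phi1, phi2 = 0.6, -0.3        # hypothetical AR(2) coefficients
x = [1.2, 0.8, -0.4, 0.5]     # observed series x_1, ..., x_n (made up)

forecast = phi1 * x[-1] + phi2 * x[-2]
```

For a general ARMA(p,q) the π-weights come from inverting the MA polynomial, but the forecast has the same weighted-sum form.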
Example

Finding X^n_{n+h} for h > 1
Forecast error

Hence, if {w_t} are Gaussian white noise, a 95% confidence interval for X_{n+h} is

    X^n_{n+h} ± 1.96 (1 + ψ1² + … + ψ²_{h−1})^{1/2} σ.

In particular, when h = 1, we have e_n(1) = w_{n+1} ∼ N(0, σ²), and a 95% confidence interval for X_{n+1} is

    X^n_{n+1} ± 1.96 σ.

There are other, recursive approaches that can be used to obtain point estimates of the forecasts (see e.g. pp. 83-84 of Chatfield).
Example

Forecast error

where this final simplification uses the sum of a geometric series.

Comments

Prediction uncertainty

ARIMA forecasting
Exponential smoothing

The method of exponential smoothing forecasts X_{n+1} using a weighted sum of past observations (where {X_t} is a time series with no trend).

The formula used is

    X^n_{n+1} = (1 − θ)X_n + (1 − θ)θX_{n−1} + (1 − θ)θ²X_{n−2} + …,

where θ is some constant such that 0 < θ < 1 (so all the weights are positive). A recursive version of this formula is

    X^n_{n+1} = (1 − θ)X_n + θX^{n−1}_n,

i.e. a weighted average of the current value (X_n) and the previous forecast for X_n.

A suitable constant θ may be obtained by minimising errors for forecasts of the known observations. In practice, a starting value such as X^{t−1}_t is needed, where t is some past time.

Remark: this method does not assume a particular model for the series {X_t}.
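The recursive update is a one-liner in code (a minimal pure-Python sketch; the starting value is a choice, as noted above):

```python
def exp_smooth_forecasts(x, theta, start):
    """Recursive exponential smoothing: the forecast of x_{t+1} made at time t
    is f_{t+1} = (1 - theta) * x_t + theta * f_t, with the starting forecast
    f_1 = start (needed in practice)."""
    f = start
    out = []
    for xt in x:
        f = (1 - theta) * xt + theta * f
        out.append(f)          # out[t-1] is the forecast for x_{t+1}
    return out

f = exp_smooth_forecasts([3.0, 5.0, 4.0], theta=0.5, start=3.0)
```

Unrolling the recursion reproduces the weighted-sum form (1 − θ)(X_n + θX_{n−1} + θ²X_{n−2} + …), with the starting value carrying the remaining weight.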
Exponential smoothing and ARIMA(0,1,1) models

    = −π1 X_n − π2 X_{n−1} − …

Summary

Adjustments for a finite number of past observations
Holt's method

2. b_{n+1} is an estimate of the trend (so the increase from time n). We will let

    b_{n+1} = γ(a_{n+1} − a_n) + (1 − γ)b_n   for some γ (0 < γ < 1),

so that b_{n+1} is a weighted average of the change in 'level' from time n to time n + 1 (a_{n+1} − a_n) and the estimate of the trend at time n (b_n).

It will be shown that Holt's method agrees with the usual (Box-Jenkins) method if
1. the underlying time series is ARIMA(0,2,2) (without constant);
2. α and γ are chosen suitably.

But Holt's method can be used more generally, choosing α and γ using error minimisation methods.

As with exponential smoothing, a particular model is not assumed for the data.
Recursive formula for Holt's method

    X^n_{n+1} = α(1 + γ)X_n − αX_{n−1} + (2 − α − αγ)X^{n−1}_n − (1 − α)X^{n−2}_{n−1}.

Proof. We know that

    X^n_{n+1} = a_{n+1} + b_{n+1}.

Substituting in the given definitions for a_{n+1} and b_{n+1} and rearranging we get

    X^n_{n+1} = αX_n + (1 − α)X^{n−1}_n + γ(a_{n+1} − a_n) + (1 − γ)b_n
              = αX_n + (1 − α)(a_n + b_n) + γ[αX_n + (1 − α)(a_n + b_n)] − γa_n + (1 − γ)b_n
              = α(1 + γ)X_n + (2 − α − αγ)(a_n + b_n) − a_n.    (5)

We now observe that a_n + b_n = X^{n−1}_n and a_n = αX_{n−1} + (1 − α)X^{n−2}_{n−1}. Substituting these into (5) we get

    X^n_{n+1} = α(1 + γ)X_n − αX_{n−1} + (2 − α − αγ)X^{n−1}_n − (1 − α)X^{n−2}_{n−1}.

Note: in practice, to use this recursive formula, we would need two starting values, e.g. X^{−1}_0 and X^{−2}_{−1}.
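The derivation can be checked numerically by running the level and trend updates (the level update a_{n+1} = αX_n + (1 − α)(a_n + b_n) is the one used in the proof above) and comparing the final one-step forecast with the closed recursive form. The series and starting values below are made up:

```python
alpha, gamma = 0.4, 0.3
x = [10.0, 12.0, 13.0, 15.0, 16.0]
a, b = x[0], 0.0                     # starting level and trend (a choice)

forecasts = []                       # forecasts[t-1] = X^t_{t+1}
for xt in x:
    a_new = alpha * xt + (1 - alpha) * (a + b)   # level update
    b = gamma * (a_new - a) + (1 - gamma) * b    # trend update
    a = a_new
    forecasts.append(a + b)

# closed recursive form derived above, using X_n, X_{n-1} and the two
# previous forecasts X^{n-1}_n and X^{n-2}_{n-1}
rhs = (alpha * (1 + gamma) * x[-1] - alpha * x[-2]
       + (2 - alpha - alpha * gamma) * forecasts[-2]
       - (1 - alpha) * forecasts[-3])
```

The two computations agree exactly, since the recursion is an algebraic identity in the updates.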
Holt's method and the ARIMA(0,2,2) model

Consider the ARIMA(0,2,2) model

    X_t = 2X_{t−1} − X_{t−2} + w_t + θ1 w_{t−1} + θ2 w_{t−2},

where θ1 and θ2 are constants and {w_t} ∼ WN(0, σ²).

To find a recursive formula for X^n_{n+1} using the usual (Box-Jenkins) method, set w_{n+1} = 0 and write

    X^n_{n+1} = 2X_n − X_{n−1} + θ1 w_n + θ2 w_{n−1}.

Now note that w_n = X_n − X^{n−1}_n and w_{n−1} = X_{n−1} − X^{n−2}_{n−1}, and hence

    X^n_{n+1} = 2X_n − X_{n−1} + θ1 w_n + θ2 w_{n−1}
              = 2X_n − X_{n−1} + θ1(X_n − X^{n−1}_n) + θ2(X_{n−1} − X^{n−2}_{n−1}).
This recursive formula has the same form as the recursive formula for the Holt method if we have that
1. α(1 + γ) = 2 + θ1
2. α = 1 − θ2
3. 2 − α − αγ = −θ1
4. 1 − α = θ2.

From (2) and (4), we see that α = 1 − θ2. Using (1), we get that

    γ = (2 + θ1)/(1 − θ2) − 1 = (1 + θ1 + θ2)/(1 − θ2).

Substituting these equations for α and γ into 2 − α − αγ, we get −θ1 (exercise: check), so these equations are consistent with (3).

We see that if α = 1 − θ2 and γ = (1 + θ1 + θ2)/(1 − θ2), the Holt method gives the same recursive formula as the Box-Jenkins approach, and so for ARIMA(0,2,2) models the two forecasting approaches are equivalent.
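The consistency claim is easy to verify numerically with hypothetical MA parameters:

```python
# With alpha = 1 - theta2 and gamma = (1 + theta1 + theta2)/(1 - theta2),
# condition (3), 2 - alpha - alpha*gamma = -theta1, holds automatically.
theta1, theta2 = -0.5, 0.2           # hypothetical MA coefficients
alpha = 1 - theta2
gamma = (1 + theta1 + theta2) / (1 - theta2)
check = 2 - alpha - alpha * gamma    # should equal -theta1
```

Algebraically, α(1 + γ) = (1 − θ2) + (1 + θ1 + θ2) = 2 + θ1, so check = 2 − (2 + θ1) = −θ1, as the exercise asks.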
Seasonal variation
Seasonal variation

[Figure: number of passengers (thousands) against year.] The Box-Jenkins data, with monthly totals of international airline passengers in the USA, from January 1949 to December 1960.
Forecasts can be made by forecasting the transformed stationary series, and then inverting any transformations made.

We will go through an example that has quarterly seasonal variation. The methods can be adjusted to consider other types of seasonal variation, e.g. monthly.
Quarterly seasonal variation

    Months     2003  2004  2005  2006
    Jan-Mar      21    35    39    78
    Apr-Jun      42    54    82   114
    Jul-Sept     60    91   136   160
    Oct-Dec      12    14    28    40

Figure 1: Quarterly demand for electricity in a government building. Data provided by a former MSc in OR student.

It might be suspected that there is seasonal variation in this data. In a cool country, demand for electricity might be higher in the winter than in the summer.
A plot of the data is given below. It seems that there is both seasonal variation and a trend.

[Figure: electricity demand against time.]
We model the observations as x_t = u_t + s_t + e′_t, where u_t is the 'trend' at time t, e′_t is a random error with zero mean and s_t is the seasonal variation at time t.

It is assumed that s_t takes the values s_I, s_II, s_III and s_IV in quarters I, II, III and IV respectively. By convention, we assume that s_I + s_II + s_III + s_IV = 0.

The usual procedure is first to smooth the raw data. There are various methods of doing this (one of which we will discuss later).
For quarterly seasonal variation we will use the 5-term adjusted-average formula with coefficients

    1/8, 1/4, 1/4, 1/4, 1/8,

which may be applied for all t, except the two values in the upper tail and the two values in the lower tail. The 5-term adjusted-average formula with these coefficients sets

    y_t = (1/8)x_{t−2} + (1/4)(x_{t−1} + x_t + x_{t+1}) + (1/8)x_{t+2}    (3 ≤ t ≤ n − 2)

and

    e_t = x_t − y_t    (3 ≤ t ≤ n − 2).

To calculate the seasonal variation in each quarter, we start by taking the average of e_t over all observations that relate to the first quarter, calling this s*_I. Similarly, we obtain s*_II, s*_III and s*_IV for quarters two, three and four.
To ensure that the seasonal variations add to zero, we set

    s_I = s*_I − adj
    s_II = s*_II − adj
    s_III = s*_III − adj
    s_IV = s*_IV − adj,

where adj = (s*_I + s*_II + s*_III + s*_IV)/4 is the average of the raw quarterly effects.
    t    x_t    y_t     e_t      Quarter
    6     54    48.25     5.75   II
    7     91    49.00    42.00   III
    8     14    53.00   -39.00   IV
    9     39    62.13   -23.13   I
   10     82    69.50    12.50   II
   11    136    76.13    59.88   III
   12     28    85.00   -57.00   IV
   13     78    92.00   -14.00   I
   14    114    96.50    17.50   II
   15    160      x        x     III
   16     40      x        x     IV
Example

As a result,

    s_I = −14.85
    s_II = 12.49
    s_III = 42.70
    s_IV = −40.35.
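These values can be reproduced from the Figure 1 data (a pure-Python sketch; the series is the 16 quarterly observations, t = 1, …, 16, and the computed effects match the values above up to the rounding used in the notes):

```python
x = [21, 42, 60, 12, 35, 54, 91, 14, 39, 82, 136, 28, 78, 114, 160, 40]
n = len(x)

# 5-term adjusted average y_t and residuals e_t = x_t - y_t, for t = 3..n-2 (1-based)
y = {t: x[t - 3] / 8 + (x[t - 2] + x[t - 1] + x[t]) / 4 + x[t + 1] / 8
     for t in range(3, n - 1)}
e = {t: x[t - 1] - y[t] for t in y}

# raw quarterly averages s*, then centre them so the seasonal effects sum to zero
quarters = [[t for t in e if (t - 1) % 4 == q] for q in range(4)]
s_star = [sum(e[t] for t in ts) / len(ts) for ts in quarters]
adj = sum(s_star) / 4
s = [v - adj for v in s_star]        # s_I, s_II, s_III, s_IV
```

For instance y_7 = 35/8 + (54 + 91 + 14)/4 + 39/8 = 49.00 and e_7 = 42.00, matching the table above.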
Removing trends
Trends

More formally, we will use the notation

    X_t = u_t + s_t + E_t    for t ∈ Z,

where u_t is the 'trend' at time t, s_t is the seasonal component at time t and E_t is a random error with zero mean. It is assumed that if the seasonality has period d, then

    s_t + s_{t+1} + … + s_{t+d−1} = 0.

This model is known as the classical decomposition model.

For example, if the series has a linear trend, we might set u_t = a + bt, where a and b are constants.
In the last lecture we saw how to estimate the seasonal component s_t, based on observations x_1, …, x_n. We will now discuss methods of removing the trend, u_t.

The idea is that by removing the deterministic components s_t and u_t, we will hopefully be left with a stationary stochastic process E_t. Then, the stationary models we have already studied (ARMA) can be fitted to E_t and used to estimate and forecast the process (adding the trend and seasonality back in as appropriate).

We have already discussed the use of differencing (ARIMA models) to remove trend. There are two other common procedures for trend removal (known as detrending). These are
1. fitting a parametric function to the data;
2. filtering.

In this section, we will focus on the use of filters for detrending a series.
Adjusted average graduation

Consider smoothed values of the form

    y_t = Σ_{j=α}^{β} K_j x_{t+j},

where {K_j ; j = α, α + 1, …, β} is a set of coefficients.

The values {y_t} are the smoothed or graduated observations. These smoothed observations are used as an estimate of the trend, so û_t = y_t.

Thus, a weighted average is used to estimate the trend. It therefore makes sense to have Σ_j K_j = 1.

E.g. when adjusting for seasonal variation we used a linear filter with

    K_{−2} = 1/8, K_{−1} = 1/4, K_0 = 1/4, K_1 = 1/4, K_2 = 1/8,

and then subtracted the estimated trend (y_t) from the observations (x_t) in order to estimate s_t.
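A generic linear filter can be written directly from the definition (a pure-Python sketch; here t is treated as a 0-based index into the series, whereas the notes use 1-based time):

```python
def linear_filter(x, K, alpha):
    """Apply y_t = sum_{j=alpha}^{beta} K_j x_{t+j}; K lists the coefficients
    K_alpha, ..., K_beta in order. Returns y_t for every t where the whole
    window fits inside the series."""
    beta = alpha + len(K) - 1
    return [sum(k * x[t + alpha + i] for i, k in enumerate(K))
            for t in range(-alpha, len(x) - beta)]

# the seasonal-adjustment filter used above: alpha = -2, coefficients 1/8..1/8
K = [1/8, 1/4, 1/4, 1/4, 1/8]
y = linear_filter([21, 42, 60, 12, 35, 54, 91], K, alpha=-2)
```

Applied to the first seven electricity observations this returns the smoothed values for t = 3, 4, 5 (1-based), i.e. 35.5, 38.75 and 44.125.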
Notes

The length (l) of the adjusted average formula is given by the number of coefficients, i.e.

    l = β − α + 1.

A central formula has α = −β (so the coefficients K_j run from j = −β to j = β, and the length is 2β + 1).

A symmetric formula is a central formula such that

    K_j = K_{−j}    for j = 1, 2, …, β.
Practical steps

Suppose we have that

    x_t = u_t + s_t + e_t,    for t = 1, …, n.

Brockwell and Davis (2002) suggest the following steps to remove seasonality and trend:

1. Use an adjusted average formula of the following form

       y_t = (1/d)[(1/2)x_{t−q} + x_{t−q+1} + … + x_{t+q−1} + (1/2)x_{t+q}]   for d = 2q,  q < t ≤ n − q,
       y_t = (1/d)(x_{t−q} + x_{t−q+1} + … + x_{t+q})                         for d = 2q + 1,  q < t ≤ n − q,

   to obtain an initial estimate of the trend, û_t = y_t.

3. Estimate the seasonal effects as done in the previous lecture to obtain ŝ_t (adjusted so that Σ_{t=1}^{d} ŝ_t = 0).
4. Subtract the estimates of the seasonal effects to obtain a new series x′_t = x_t − ŝ_t.

5. Use a suitable adjusted average formula to estimate the trend of the new series {x′_t} (let this new estimate now be û_t).

6. Subtract this estimate of the trend from {x′_t}, to obtain a new series x′′_t = x_t − ŝ_t − û_t.

7. Model this new series {x′′_t} with a stationary process.

Note that steps 1 to 4 are equivalent to those previously discussed for quarterly seasonal variation.
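The steps above can be sketched as follows (even period d = 2q only; a toy check with a linear trend plus an exact zero-sum seasonal pattern recovers the pattern exactly, because the centred average is exact on straight lines):

```python
def bd_steps(x, d):
    """Sketch of the Brockwell-Davis recipe for an even period d = 2q:
    step 1 (centred moving-average trend y_t), step 3 (centred seasonal
    effects) and step 4 (deseasonalised series x'). Re-estimating the trend
    from x' and modelling the remainder (steps 5-7) are as described above."""
    n, q = len(x), d // 2
    y = {t: (0.5 * x[t - q] + sum(x[t - q + 1:t + q]) + 0.5 * x[t + q]) / d
         for t in range(q, n - q)}                   # 0-based time index
    e = {t: x[t] - y[t] for t in y}
    phases = [[t for t in e if t % d == k] for k in range(d)]
    s_star = [sum(e[t] for t in ts) / len(ts) for ts in phases]
    adj = sum(s_star) / d
    s = [v - adj for v in s_star]
    x_prime = [x[t] - s[t % d] for t in range(n)]    # step 4
    return s, x_prime

pat = [2.0, -1.0, 3.0, -4.0]                         # seasonal pattern, sums to 0
x = [0.5 * t + pat[t % 4] for t in range(16)]        # linear trend + seasonality
s, x_prime = bd_steps(x, 4)
```

On this constructed series the estimated effects equal the true pattern and x′ is exactly the linear trend.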
Desirable properties of an adjusted-average formula

Applying the filter to x_t = u_t + e_t gives

    y_t = Σ_{j=α}^{β} K_j x_{t+j} = Σ_{j=α}^{β} K_j (u_{t+j} + e_{t+j})
        = Σ_{j=α}^{β} K_j u_{t+j} + Σ_{j=α}^{β} K_j e_{t+j}.

The first sum, u′_t = Σ_{j=α}^{β} K_j u_{t+j}, is the filtered trend.
Distortion of the trend

We will first derive some conditions on the coefficients that result in filters that do not distort linear trends (i.e. trends of the form a + bt for constants a and b).

Assume that the trend u_t = a + bt. In order for this trend not to be distorted, we must have that u′_t = u_t for t = 1, …, n. Substituting u_t = a + bt into the formula for u′_t we get

    u′_t = Σ_{j=α}^{β} K_j u_{t+j} = Σ_{j=α}^{β} K_j (a + b(t + j))
         = (a + bt) Σ_{j=α}^{β} K_j + b Σ_{j=α}^{β} j K_j.
So if the coefficients are such that
1. Σ_{j=α}^{β} K_j = 1 and
2. Σ_{j=α}^{β} j K_j = 0,
we will have that u′_t = a + bt = u_t, and the filter will allow the linear trend to pass without distortion.

Note that if the filter is symmetric, then α = −β and K_j = K_{−j}, so condition (2) is automatically satisfied.

The adjusted-average formula used for the example on quarterly seasonal variation had coefficients 1/8, 1/4, 1/4, 1/4, 1/8. This formula is symmetric and has

    Σ_{j=−2}^{2} K_j = 1.

Therefore, this filter does not distort linear trend, and is 'exact on straight lines'.
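A quick numerical check that this filter passes a straight line unchanged (the slope and intercept below are arbitrary):

```python
K = {-2: 1/8, -1: 1/4, 0: 1/4, 1: 1/4, 2: 1/8}
a, b = 3.0, 0.7                       # arbitrary straight-line trend u_t = a + b t
u = lambda t: a + b * t

assert abs(sum(K.values()) - 1) < 1e-12        # condition (1)
assert abs(sum(j * K[j] for j in K)) < 1e-12   # condition (2), by symmetry

filtered = sum(K[j] * u(10 + j) for j in K)    # u'_10
```

Since both conditions hold, u′_t equals u_t at every interior point.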
Cubic trend

Now suppose the trend is cubic, u_t = a + bt + ct² + dt³. We now have that

    u′_t = Σ_{j=α}^{β} K_j u_{t+j} = Σ_{j=α}^{β} K_j (a + b(t + j) + c(t + j)² + d(t + j)³)
         = Σ_{j=α}^{β} K_j (a + bt + bj + c(t² + 2tj + j²) + d(t³ + 3t²j + 3tj² + j³))
         = (a + bt + ct² + dt³) Σ K_j + (b + 2ct + 3dt²) Σ j K_j + (c + 3dt) Σ j² K_j + d Σ j³ K_j.

So, we will have u′_t = a + bt + ct² + dt³ if and only if the following four conditions hold:
1. Σ_{j=α}^{β} K_j = 1
2. Σ_{j=α}^{β} j K_j = 0
3. Σ_{j=α}^{β} j² K_j = 0
4. Σ_{j=α}^{β} j³ K_j = 0.

For symmetric filters conditions (2) and (4) are automatically satisfied, so we need only check conditions (1) and (3).

Similar conditions can be obtained for quadratic trends and trends which are polynomials of higher orders.
Spencer's 15 point average filter

Spencer's 15-point filter has coefficients

    (K_0, K_{±1}, K_{±2}, K_{±3}, K_{±4}, K_{±5}, K_{±6}, K_{±7}) = (1/320)(74, 67, 46, 21, 3, −5, −6, −3).

This filter is symmetric so we only need to check conditions (1) and (3) to check whether the filter is exact on cubics.

We have that Σ_{j=α}^{β} K_j = 1, so condition (1) holds.

Also Σ_{j=α}^{β} j² K_j = 2 Σ_{j=1}^{β} j² K_j, so condition (3) is satisfied if Σ_{j=1}^{β} j² K_j = 0. We have

    (1/320)(1² × 67 + 2² × 46 + 3² × 21 + 4² × 3 + 5² × (−5) + 6² × (−6) + 7² × (−3)) = 0,

so condition (3) holds.
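Both checks are easy to verify numerically:

```python
# Spencer's 15-point coefficients: K_j = half[|j|] / 320 for j = -7..7
half = [74, 67, 46, 21, 3, -5, -6, -3]           # K_0, K_1, ..., K_7 (times 320)
K = {j: half[abs(j)] / 320 for j in range(-7, 8)}

cond_1 = sum(K.values())                         # should be 1
cond_3 = sum(j * j * K[j] for j in K)            # should be 0
```

Conditions (2) and (4) hold by symmetry, so the filter is exact on cubics.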
The problem of the tails

Near the ends of the series there are not enough observations on both sides to apply a central filter. Instead, we must use non-central filters, that use only present and past values of x_t.

This problem is particularly important for forecasting.

Methods for adjusting the weights for a finite number of past observations can be used here.
Fitting a polynomial

An alternative is to fit a polynomial trend of the form u_t = a_0 + a_1 t + … + a_k t^k. Least squares estimation can be used to find estimates of the coefficients a_0, a_1, …, a_k. This involves minimising the sum of squares, given by

    Σ_{t=1}^{n} (x_t − u_t)².

The fitting of such a polynomial should be done after any seasonal adjustments have been made to the series.
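A minimal least-squares polynomial fit via the normal equations (a pure-Python sketch with Gaussian elimination; in practice one would use a library routine such as R's `lm` or `numpy.polyfit`):

```python
def fit_poly(x, k):
    """Least-squares polynomial trend u_t = a_0 + a_1 t + ... + a_k t^k fitted
    to x_1..x_n by solving the normal equations A a = c."""
    n = len(x)
    ts = range(1, n + 1)
    A = [[float(sum(t ** (i + j) for t in ts)) for j in range(k + 1)]
         for i in range(k + 1)]
    c = [sum(xt * t ** i for t, xt in zip(ts, x)) for i in range(k + 1)]
    for i in range(k + 1):                 # forward elimination with pivoting
        p = max(range(i, k + 1), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        c[i], c[p] = c[p], c[i]
        for r in range(i + 1, k + 1):
            f = A[r][i] / A[i][i]
            A[r] = [arv - f * aiv for arv, aiv in zip(A[r], A[i])]
            c[r] -= f * c[i]
    a = [0.0] * (k + 1)
    for i in range(k, -1, -1):             # back substitution
        a[i] = (c[i] - sum(A[i][j] * a[j] for j in range(i + 1, k + 1))) / A[i][i]
    return a

# recover an exact quadratic trend u_t = 1 + 2t + 0.5 t^2
x = [1 + 2 * t + 0.5 * t ** 2 for t in range(1, 13)]
a = fit_poly(x, 2)
```

On noiseless quadratic data the fitted coefficients recover a_0 = 1, a_1 = 2 and a_2 = 0.5.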
Differencing

As we have already seen, if the observed series {x_t} is thought to have a polynomial trend of degree k, then the k-th order differences ∇^k x_t should be considered.

For example, if the trend is quadratic, with u_t = a + bt + ct², then assuming that x_t = u_t + e_t for stationary {e_t},

    ∇²x_t = x_t − 2x_{t−1} + x_{t−2}
          = u_t + e_t − 2(u_{t−1} + e_{t−1}) + u_{t−2} + e_{t−2}
          = a + bt + ct² − 2a − 2b(t − 1) − 2c(t − 1)² + a + b(t − 2) + c(t − 2)² + (e_t − 2e_{t−1} + e_{t−2})
          = (a − 2a + a + 2b − 2c − 2b + 4c) + t(b − 2b + 4c + b − 4c) + t²(c − 2c + c) + e′_t
          = 2c + e′_t,    where e′_t = ∇²e_t.

Thus, the second differences are a stationary process, with constant mean 2c.

Differencing in this way should be applied after any seasonal adjustments have been made.
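A numerical illustration with a pure quadratic trend (no noise), where every second difference is exactly the constant 2c:

```python
# x_t = a + b t + c t^2 with made-up coefficients; second differences
# remove the linear and quadratic terms, leaving 2c.
a, b, c = 1.0, -2.0, 0.75
x = [a + b * t + c * t ** 2 for t in range(1, 11)]
d2 = [x[t] - 2 * x[t - 1] + x[t - 2] for t in range(2, len(x))]
```

With stationary noise added, the same computation would yield 2c plus the stationary process ∇²e_t.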
Differencing to remove seasonality

If the series has seasonal variation with period d, then the lag-d differences

    (1 − B^d)x_t = x_t − x_{t−d}

should be considered. Assume we have

    x_t = u_t + s_t + e_t,

with s_t representing the deterministic seasonal effect with period d, so that s_t = s_{t−d}, and {e_t} stationary. This means that

    (1 − B^d)x_t = u_t + s_t + e_t − u_{t−d} − s_{t−d} − e_{t−d} = (u_t − u_{t−d}) + (e_t − e_{t−d}).

The term u_t − u_{t−d} is the trend component of the differenced series. This can be estimated using any of the methods discussed.

The term e_t − e_{t−d} is a stationary series.

Note that (1 − B^d) ≠ ∇^d = (1 − B)^d.
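A small check that lag-d differencing removes an exact seasonal pattern (with a linear trend, so u_t − u_{t−d} is constant; the numbers are made up):

```python
d = 4
pat = [2.0, -1.0, 3.0, -4.0]                     # seasonal effects, summing to zero
x = [0.3 * t + pat[t % d] for t in range(12)]    # trend 0.3 t plus seasonality
diff = [x[t] - x[t - d] for t in range(d, len(x))]
# each difference is u_t - u_{t-d} = 0.3 * 4 = 1.2; the seasonal term cancels
```

No seasonal component remains in the differenced series, only the (here constant) trend increment.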
Example - seasonality and trend

As an example, we consider quarterly measurements of the average house price (in £) in Edinburgh.
A plot of the raw data is given below. We see a clear trend but it is not clear whether there is any seasonal variation.

[Figure: raw quarterly house prices against time.]
We will assume the classical decomposition model of

    x_t = u_t + s_t + e_t,

where x_t is the observed house price at time t, u_t is the trend, s_t is the seasonal effect and e_t is a stationary series with mean zero.

As we have quarterly data, we would expect seasonal variation to have a period of one year, so we set d = 4, and s_t = s_{t−d} for all t.
The first few lines of R output are

    Date     Price     ERR1      SAS1      SAF1      STC1
    Q1 1993  57607.90  1794.47   60798.24  -3190.34  59003.78
    Q2 1993  61504.50  -3449.07  56490.89   5013.61  59939.96
    Q3 1993  60762.60   718.42   62530.73  -1768.13  61812.31
    Q4 1993  64394.70   1156.80  64449.83    -55.13  63293.04
    Q1 1994  62848.60   1520.51  66038.94  -3190.34  64518.44
    Q2 1994  67671.20  -2216.33  62657.59   5013.61  64873.92
The values in the SAF1 column are the values s_I, s_II, s_III and s_IV. Note that the two values shown in Q1 are the same.

R has estimated the trend u_t using an adjusted average formula with coefficients (1/9)(1, 2, 3, 2, 1), with adjustments in the tails (applied to the seasonally adjusted data). This formula is exact on straight lines. This estimated trend is given in the STC1 column. We could use another formula for this if we wanted.

The ERR1 column is x_t − ŝ_t − û_t, i.e. it gives the estimates ê_t. These should hopefully form a stationary series.
We can plot the estimated trend, to help us see the general change in house prices in Edinburgh, without the seasonal effect. We see (unsurprisingly) that house prices rose fairly steadily over the time period.

[Figure: estimated trend of house prices.]
We can also plot the errors. They seem fairly stationary, although there was a big error around Q2 of 2004. We might need to investigate this.

[Figure: estimated errors against time.]
The Ljung-Box statistics for the errors imply that they are not white noise. We should therefore investigate possible stationary models to fit to the error terms. A correlogram of the errors is shown below.

[Figure: correlogram (ACF) of the errors.]
A PACF for the errors is shown below. It looks like an AR(1) model might be appropriate. What would you estimate for the parameter α1?

[Figure: PACF of the errors.]