
4. STATIONARY PROCESSES

We shall describe the fundamental concepts in the theory of time series models and introduce the concepts of stochastic processes, mean and covariance functions, stationary processes and autocorrelation functions.

4.1 Stochastic Processes



The sequence of random variables {Y_t : t = 0, ±1, ±2, ...} is a stochastic process, and the observed series y_1, y_2, ..., y_n is regarded as a realisation of the underlying process, where the Y's will have some joint distribution.

• Information in these joint distributions can be described in terms of means, variances, and covariances
• We will focus on the first and second moments

4.2 Mean, Variance, Covariance


(i) For the process {Y_t : t = 0, ±1, ±2, ...}, the mean function is defined by

    µ_t = E(Y_t)   for t = 0, ±1, ±2, ...   (4.2.1)

• µ_t is the expected value of the process at time t
• In general, µ_t can be different at each time point t

(ii) The auto-covariance function, γ_{t,s}, is defined as

    γ_{t,s} = Cov(Y_t, Y_s)   for t, s = 0, ±1, ±2, ...   (4.2.2)

where

    Cov(Y_t, Y_s) = E[(Y_t − µ_t)(Y_s − µ_s)] = E(Y_t Y_s) − µ_t µ_s
(iii) The autocorrelation function, ρ_{t,s}, is given by

    ρ_{t,s} = Corr(Y_t, Y_s)   (4.2.3)

where

    Corr(Y_t, Y_s) = Cov(Y_t, Y_s) / √[Var(Y_t) Var(Y_s)] = γ_{t,s} / √(γ_{t,t} γ_{s,s})   (4.2.4)

(iv) Recall that both covariance and correlation are measures of the linear dependence between random variables, but that the unitless correlation is somewhat easier to interpret.

(v) The following properties follow from known results and our definitions:

• γ_{t,t} = Var(Y_t),   ρ_{t,t} = 1
• γ_{t,s} = γ_{s,t},   ρ_{t,s} = ρ_{s,t}   (4.2.5)
• |γ_{t,s}| ≤ √(γ_{t,t} γ_{s,s}),   |ρ_{t,s}| ≤ 1

(vi) If c_1, c_2, ..., c_m and d_1, d_2, ..., d_n are constants, and t_1, t_2, ..., t_m and s_1, s_2, ..., s_n are time points, then

    Cov( Σ_{i=1}^{m} c_i Y_{t_i} , Σ_{j=1}^{n} d_j Y_{s_j} ) = Σ_{i=1}^{m} Σ_{j=1}^{n} c_i d_j Cov(Y_{t_i}, Y_{s_j})   (4.2.6)

(vii) Var( Σ_{i=1}^{m} c_i Y_{t_i} ) = Σ_{i=1}^{m} c_i² Var(Y_{t_i}) + 2 Σ_{i=2}^{m} Σ_{j=1}^{i−1} c_i c_j Cov(Y_{t_i}, Y_{t_j})   (4.2.7)
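As a quick numerical check of the variance expansion in Eqn (4.2.7) (and, implicitly, of the bilinearity in Eqn (4.2.6)), the following minimal Python sketch compares the expanded sum with the equivalent quadratic form c'Σc. The covariance matrix and coefficients are made-up illustrative values, and numpy is assumed to be available.

```python
import numpy as np

# Hypothetical covariance matrix of (Y_{t1}, Y_{t2}, Y_{t3}) and illustrative coefficients c_i
Sigma = np.array([[2.0, 0.8, 0.3],
                  [0.8, 1.5, 0.6],
                  [0.3, 0.6, 1.0]])
c = np.array([1.0, -2.0, 0.5])

# Var(sum_i c_i Y_{ti}) written directly as the quadratic form c' Sigma c
var_direct = c @ Sigma @ c

# The same variance via Eqn (4.2.7): the c_i^2 Var terms plus twice the i > j cross terms
var_expanded = sum(c[i] ** 2 * Sigma[i, i] for i in range(3)) + \
    2 * sum(c[i] * c[j] * Sigma[i, j] for i in range(1, 3) for j in range(i))

print(var_direct, var_expanded)   # the two numbers agree
```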

4.3 Stationarity
(i) A process {Y_t} is said to be strongly stationary or strictly stationary if the joint distribution of Y_{t_1}, Y_{t_2}, ..., Y_{t_n} is the same as the joint distribution of Y_{t_1−k}, Y_{t_2−k}, ..., Y_{t_n−k} for all sets of time points t_1, t_2, ..., t_n and all lags k

(ii) If a process is strictly stationary and has finite variance, then the covariance function must depend only on the time lag

(iii) The distribution of Y_t is the same as that of Y_{t−k} for all t, k; that is, the Y's are marginally identically distributed

• E(Y_t) = E(Y_{t−k}) for all t, k, so the mean function is constant over time
• Var(Y_t) = Var(Y_{t−k}) for all t, k, so the variance is constant over time

(iv) The bivariate distribution of Y_t and Y_s is the same as that of Y_{t−k} and Y_{s−k}

• Cov(Y_t, Y_s) = Cov(Y_{t−k}, Y_{s−k}) for all t, s, k
• Putting k = s and then k = t we get:


    γ_{t,s} = Cov(Y_{t−s}, Y_0)        (putting k = s)
            = Cov(Y_0, Y_{s−t})        (putting k = t)
            = Cov(Y_0, Y_{|t−s|})
            = γ_{0, |t−s|}

The covariance between Y_t and Y_s depends on time only through the time difference |t − s| and not on the actual times t and s. For example, Cov(Y_2, Y_4) = Cov(Y_0, Y_2) = Cov(Y_8, Y_{10}).

(v) Thus, for a stationary process:

    γ_k = Cov(Y_t, Y_{t−k})
    ρ_k = Corr(Y_t, Y_{t−k}) = γ_k / γ_0

The general properties as stated in Eqn (4.2.5) now become:

• γ_0 = Var(Y_t),   ρ_0 = 1
• γ_k = γ_{−k},   ρ_k = ρ_{−k}   (4.2.8)
• |γ_k| ≤ γ_0,   |ρ_k| ≤ 1

A process {Y_t} is weakly stationary or second-order stationary if

(i) E(Y_t) = µ for all t

(ii) Var(Y_t) = σ² for all t

(iii) Cov(Y_t, Y_{t−k}) = γ_k depends only on the lag k

• A strictly stationary process is weakly stationary
• The sequence {γ_k, k ∈ ℤ} is called the auto-covariance function
• The sequence {ρ_k, k ∈ ℤ} is called the autocorrelation function (ACF); a short sketch of how the ACF is estimated from an observed series follows this list
• A series {Y_t} is said to be lagged if its time axis is shifted: shifting by k lags gives the series {Y_{t−k}}
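The ACF defined above can be estimated from an observed realisation y_1, ..., y_n; the standard sample estimator (not derived in these notes) is r_k = Σ_{t=1}^{n−k} (y_t − ȳ)(y_{t+k} − ȳ) / Σ_{t=1}^{n} (y_t − ȳ)². A minimal Python sketch, assuming numpy and an arbitrary white-noise test series, is given below.

```python
import numpy as np

def sample_acf(y, max_lag):
    """Sample autocorrelations r_1, ..., r_max_lag of a 1-D series y."""
    y = np.asarray(y, dtype=float)
    d = y - y.mean()
    denom = np.sum(d ** 2)
    # r_k pairs deviations that are k time units apart
    return np.array([np.sum(d[:-k] * d[k:]) / denom for k in range(1, max_lag + 1)])

rng = np.random.default_rng(0)
white_noise = rng.normal(size=500)
print(np.round(sample_acf(white_noise, 5), 3))   # all values near 0 for white noise
```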

Illustration 1: Random Walk
Let e_1, e_2, ... be a sequence of independent, identically distributed random variables, each with mean zero and constant variance σ_e².


The observed time series {Y_t : t = 1, 2, ...} is constructed as follows:

    Y_1 = e_1
    Y_2 = e_1 + e_2
    ⋮
    Y_t = e_1 + e_2 + ... + e_t

so that Y_t = Y_{t−1} + e_t for t = 2, 3, ...

Find the following:


(a) mean
(b) variance
(c) auto-covariance function,
(d) autocorrelation function

Illustration 1 solution
(a) µ_t = E(Y_t)
        = E(e_1 + e_2 + ... + e_t)
        = E(e_1) + E(e_2) + ... + E(e_t)
        = 0   for all t

(b) Var(Y_t) = Var(e_1 + e_2 + ... + e_t)
            = Var(e_1) + Var(e_2) + ... + Var(e_t)   (by independence of the e_t's)
            = σ_e² + σ_e² + ... + σ_e²
            = t σ_e²

The process variance increases linearly with time; this already implies that the random walk is not a stationary process.


(c) Suppose that 1 ≤ t ≤ s. Then

    γ_{t,s} = Cov(Y_t, Y_s)
            = Cov(e_1 + e_2 + ... + e_t , e_1 + e_2 + ... + e_t + e_{t+1} + ... + e_s)

Using Eqn (4.2.6),

    γ_{t,s} = Σ_{i=1}^{s} Σ_{j=1}^{t} Cov(e_i, e_j)

where Cov(e_i, e_j) = σ_e² if i = j and 0 if i ≠ j. Since t ≤ s, there are exactly t terms with i = j, so

    γ_{t,s} = t σ_e²

Since γ_{t,s} = γ_{s,t}, this gives the auto-covariance function for all time points t and s:

    γ_{t,s} = t σ_e²   for 1 ≤ t ≤ s   (equivalently, γ_{t,s} = min(t, s) σ_e²)

(d) The autocorrelation function for the random walk:

    ρ_{t,s} = γ_{t,s} / √(γ_{t,t} · γ_{s,s})   for 1 ≤ t ≤ s
            = t σ_e² / √(t σ_e² · s σ_e²)      (since Var(Y_t) = t σ_e² and Var(Y_s) = s σ_e²)
            = √(t / s)   for 1 ≤ t ≤ s

For example:

    ρ_{1,2} = √(1/2) = 0.707        ρ_{24,25} = √(24/25) = 0.980
    ρ_{8,9} = √(8/9) = 0.943        ρ_{1,25}  = √(1/25)  = 0.200

Notice that values of Y at neighbouring time points are strongly correlated, while values of Y at distant time points are less correlated.
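This behaviour is easy to see in a simulation. The sketch below is a minimal Python check (numpy assumed; the number of realisations and the seed are arbitrary choices) that averages over many independent random walks and compares the Monte Carlo correlations with √(t/s).

```python
import numpy as np

rng = np.random.default_rng(1)
n_series, n_time = 20_000, 25

# Each row is one independent realisation of the random walk Y_t = e_1 + ... + e_t
e = rng.normal(size=(n_series, n_time))
Y = np.cumsum(e, axis=1)

def corr(a, b):
    return np.corrcoef(a, b)[0, 1]

# Monte Carlo correlations vs the theoretical value sqrt(t/s)
print(round(corr(Y[:, 0], Y[:, 24]), 3), round(np.sqrt(1 / 25), 3))    # about 0.20
print(round(corr(Y[:, 23], Y[:, 24]), 3), round(np.sqrt(24 / 25), 3))  # about 0.98
```

The agreement is only up to Monte Carlo error, which shrinks as the number of simulated realisations grows.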

Illustration 2: Moving Average


Suppose that

    Y_t = (e_t + e_{t−1}) / 2

where the e's are iid with zero mean and variance σ_e².
Find the following:
(a) mean
(b) variance
(c) auto-covariance function,
(d) autocorrelation function

Illustration 2 solution

(a) E(Y_t) = E[(e_t + e_{t−1}) / 2] = 0

(b) Var(Y_t) = Var[(e_t + e_{t−1}) / 2]
            = Var(e_t + e_{t−1}) / 4
            = [Var(e_t) + Var(e_{t−1})] / 4   (by independence)
            = 2σ_e² / 4
            = σ_e² / 2


(c) Cov(Y_t, Y_{t−1}) = Cov[ (e_t + e_{t−1}) / 2 , (e_{t−1} + e_{t−2}) / 2 ]
    = (1/4) { Cov(e_t, e_{t−1}) + Cov(e_t, e_{t−2}) + Cov(e_{t−1}, e_{t−1}) + Cov(e_{t−1}, e_{t−2}) }
    = Cov(e_{t−1}, e_{t−1}) / 4    (all other covariances are zero, by independence)
    = σ_e² / 4

i.e. γ_{t,t−1} = σ_e²/4 for all t

    Cov(Y_t, Y_{t−2}) = Cov[ (e_t + e_{t−1}) / 2 , (e_{t−2} + e_{t−3}) / 2 ]
                      = 0    since the e's are independent

Similarly Cov(Y_t, Y_{t−k}) = 0 for all k > 1.

So:

    γ_{t,s} =  σ_e²/2    for |t − s| = 0
               σ_e²/4    for |t − s| = 1
               0         for |t − s| > 1

For the autocorrelation function (ACF):

    ρ_{t,s} =  1      for |t − s| = 0
               1/2    for |t − s| = 1
               0      for |t − s| > 1

Note that ρ_{2,1} = ρ_{3,2} = ρ_{4,3} = ρ_{9,8} = 1/2. That is, values of Y precisely one time unit apart have exactly the same correlation no matter where they occur in time. Similarly, ρ_{3,1} = ρ_{4,2} = ρ_{t,t−2}, and in general ρ_{t,t−k} is the same for all values of t.
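As a quick check by simulation, the following minimal Python sketch (numpy assumed; the series length and seed are arbitrary) generates Y_t = (e_t + e_{t−1})/2 and computes sample lag-1 and lag-2 autocorrelations, which should come out near 0.5 and 0.

```python
import numpy as np

rng = np.random.default_rng(2)
e = rng.normal(size=100_001)
Y = (e[1:] + e[:-1]) / 2                 # Y_t = (e_t + e_{t-1}) / 2

def lag_corr(y, k):
    return np.corrcoef(y[:-k], y[k:])[0, 1]

print(round(lag_corr(Y, 1), 3))   # close to 0.5
print(round(lag_corr(Y, 2), 3))   # close to 0
```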

Example 4.3.1
Suppose that the observed process is given by

    Y_t = Z_t + θ Z_{t−1}   where {Z_t} ~ N(0, σ_z²)

Find the autocorrelation function (ACF) for Y_t when θ = 3 and when θ = 1/3.

Solution 4.3.1

    E(Y_t) = E(Z_t + θ Z_{t−1}) = 0

    Var(Y_t) = Var(Z_t + θ Z_{t−1})
             = Var(Z_t) + θ² Var(Z_{t−1})   (by independence)
             = σ_z² + θ² σ_z²
             = σ_z² (1 + θ²)

    Cov(Y_t, Y_{t−1}) = Cov(Z_t + θ Z_{t−1}, Z_{t−1} + θ Z_{t−2})
                      = θ Cov(Z_{t−1}, Z_{t−1})    (all other covariances are zero)
                      = θ σ_z²

For k > 1:

    Cov(Y_t, Y_{t−k}) = Cov(Z_t + θ Z_{t−1}, Z_{t−k} + θ Z_{t−k−1}) = 0

since all these errors are uncorrelated.

    Corr(Y_t, Y_{t−k}) = Cov(Y_t, Y_{t−k}) / √[Var(Y_t) Var(Y_{t−k})]

Hence

    ρ_k =  1                                       k = 0
           θσ_z² / [σ_z²(1 + θ²)] = θ/(1 + θ²)     k = 1
           0                                       k > 1

For θ = 3:    ρ_1 = 3 / (1 + 3²) = 3/10

For θ = 1/3:  ρ_1 = (1/3) / (1 + (1/3)²) = 3/10

So the two autocorrelation functions are identical.
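A small simulation illustrates this: the Python sketch below (numpy assumed; the series length and seed are arbitrary choices) estimates the lag-1 autocorrelation of Y_t = Z_t + θZ_{t−1} for θ = 3 and θ = 1/3, and both estimates settle near 3/10.

```python
import numpy as np

rng = np.random.default_rng(3)

def ma1_lag1_corr(theta, n=200_000):
    z = rng.normal(size=n + 1)
    y = z[1:] + theta * z[:-1]          # Y_t = Z_t + theta * Z_{t-1}
    return np.corrcoef(y[:-1], y[1:])[0, 1]

print(round(ma1_lag1_corr(3.0), 3))     # close to 0.30
print(round(ma1_lag1_corr(1 / 3), 3))   # also close to 0.30: the two ACFs are identical
```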


Example 4.3.2
Suppose Y_t = 5 + 2t + Z_t, where {Z_t} is a zero-mean stationary series with auto-covariance function γ_k.

(a) Find the mean function for {Y_t}
(b) Find the auto-covariance function for {Y_t}
(c) Is {Y_t} stationary? Why or why not?

Solution 4.3.2

(a) E(Y_t) = E(5 + 2t + Z_t) = 5 + 2t, which is a function of time

(b) Cov(Y_t, Y_{t−k}) = Cov(5 + 2t + Z_t , 5 + 2(t − k) + Z_{t−k})
                      = Cov(Z_t, Z_{t−k})
                      = γ_k

(c) Since the process mean varies with time, Y_t is not stationary

Example 4.3.3
Suppose that {Y_t} is stationary with auto-covariance function γ_k.

(i) Show that W_t = ∇Y_t = Y_t − Y_{t−1} is stationary by finding the mean and autocovariance function for W_t

(ii) Show that U_t = ∇W_t is stationary

Solution 4.3.3

(i) E(W_t) = E(Y_t − Y_{t−1}) = E(Y_t) − E(Y_{t−1}) = 0 since {Y_t} is stationary (a constant, not a function of time)

    Cov(W_t, W_{t−k}) = Cov(Y_t − Y_{t−1}, Y_{t−k} − Y_{t−k−1})
    = Cov(Y_t, Y_{t−k}) − Cov(Y_t, Y_{t−k−1}) − Cov(Y_{t−1}, Y_{t−k}) + Cov(Y_{t−1}, Y_{t−k−1})
    = γ_k − γ_{k+1} − γ_{k−1} + γ_k
    = 2γ_k − γ_{k+1} − γ_{k−1}

This is a function of the lag k only, so W_t is stationary.

(ii) U_t = ∇W_t = ∇(Y_t − Y_{t−1})
          = (Y_t − Y_{t−1}) − (Y_{t−1} − Y_{t−2})
          = ∇²Y_t
          = Y_t − 2Y_{t−1} + Y_{t−2}

Now, U_t is the first difference of the process {∇Y_t}, and by part (i), {∇Y_t} is stationary.

So U_t, being the first difference of a stationary process, is itself stationary by part (i).

Explicitly,

    E(U_t) = E(Y_t) − 2E(Y_{t−1}) + E(Y_{t−2}) = µ − 2µ + µ = 0,   or   E(U_t) = E(W_t) − E(W_{t−1}) = 0 − 0 = 0

and

    Cov(U_t, U_{t−k}) = Cov(W_t − W_{t−1}, W_{t−k} − W_{t−k−1})
    = Cov(W_t, W_{t−k}) − Cov(W_t, W_{t−k−1}) − Cov(W_{t−1}, W_{t−k}) + Cov(W_{t−1}, W_{t−k−1})

where each term is a function of the lag only.
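As a concrete check of the formula Cov(W_t, W_{t−k}) = 2γ_k − γ_{k+1} − γ_{k−1}, the following minimal Python sketch (numpy assumed; it reuses the moving-average process of Illustration 2 as the stationary {Y_t}, with σ_e² = 1, purely as an illustrative choice) compares sample autocovariances of W_t = Y_t − Y_{t−1} with the values predicted from γ_0 = 1/2, γ_1 = 1/4 and γ_k = 0 for k > 1.

```python
import numpy as np

rng = np.random.default_rng(4)
e = rng.normal(size=200_002)
Y = (e[1:] + e[:-1]) / 2          # Illustration 2: gamma_0 = 1/2, gamma_1 = 1/4, 0 otherwise
W = np.diff(Y)                    # W_t = Y_t - Y_{t-1}

def sample_autocov(x, k):
    x = x - x.mean()
    return np.mean(x * x) if k == 0 else np.mean(x[:len(x) - k] * x[k:])

gamma_Y = {0: 0.5, 1: 0.25}                       # theoretical autocovariances of Y
g = lambda k: gamma_Y.get(abs(k), 0.0)

for k in range(3):
    predicted = 2 * g(k) - g(k + 1) - g(k - 1)    # 2*gamma_k - gamma_{k+1} - gamma_{k-1}
    print(k, round(sample_autocov(W, k), 3), predicted)
```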

Example 4.3.4

Consider the series Y_t = α Y_{t−1} + Z_t, where {Z_t} is a white noise process with mean zero and variance σ_z². This series is known as a Markov process or an autoregressive series of order 1 (AR(1)). When α = 1 we have a random walk.

Find the autocovariance function and autocorrelation function.

Solution 4.3.4

    E(Y_t) = E(α Y_{t−1} + Z_t) = α E(Y_{t−1}) = 0 for all t      (for a constant mean: µ = αµ ⇒ (1 − α)µ = 0 ⇒ µ = 0)

We need γ_k, the autocovariance function.

Multiply the series by Y_t:

    Y_t² = α Y_t Y_{t−1} + Y_t Z_t

Take expectations:

    E(Y_t²) = α E(Y_t Y_{t−1}) + E(Y_t Z_t)
    γ_0 = α γ_1 + E(Y_t Z_t)

Now:

    E(Y_t Z_t) = E[(α Y_{t−1} + Z_t) Z_t]
               = α E(Y_{t−1} Z_t) + E(Z_t²)
               = 0 + σ_z²

Hence, γ_0 = α γ_1 + σ_z²    (1)

Multiply the series by Y_{t−1}:

    Y_t Y_{t−1} = α Y²_{t−1} + Y_{t−1} Z_t

Take expectations:

    E(Y_t Y_{t−1}) = α E(Y²_{t−1}) + E(Y_{t−1} Z_t)
    γ_1 = α γ_0 + 0    (2)

Substituting (2) into (1):   γ_0 = α² γ_0 + σ_z²

    ⇒ γ_0 = σ_z² / (1 − α²)    provided |α| < 1

Now ρ_1 = γ_1 / γ_0 = α.

In general, multiply the series by Y_{t−k}:

    Y_t Y_{t−k} = α Y_{t−1} Y_{t−k} + Z_t Y_{t−k}
    E(Y_t Y_{t−k}) = α E(Y_{t−1} Y_{t−k}) + E(Z_t Y_{t−k})
    γ_k = α γ_{k−1}    k ≥ 2

When k = 2:   γ_2 = α γ_1 = α² γ_0
     k = 3:   γ_3 = α γ_2 = α³ γ_0

So γ_k = α^k γ_0 and ρ_k = α^k for k ≥ 1.

Alternative:

    Var(Y_t) = α² Var(Y_{t−1}) + Var(Z_t) + 2α Cov(Y_{t−1}, Z_t),   where Cov(Y_{t−1}, Z_t) = 0

    γ_0 = α² γ_0 + σ_z²

    ⇒ γ_0 = σ_z² / (1 − α²)

    Cov(Y_t, Y_{t−k}) = Cov(α Y_{t−1} + Z_t, Y_{t−k})
                      = α Cov(Y_{t−1}, Y_{t−k}) + Cov(Z_t, Y_{t−k}),   where Cov(Z_t, Y_{t−k}) = 0 for k ≥ 1

    ∴ γ_k = α γ_{k−1}    k ≥ 1
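A short simulation confirms the pattern ρ_k = α^k. The Python sketch below (numpy assumed; α = 0.6, the series length, burn-in and seed are arbitrary choices) generates the AR(1) series and compares sample autocorrelations with α^k.

```python
import numpy as np

rng = np.random.default_rng(5)
alpha, n = 0.6, 100_000
z = rng.normal(size=n)

y = np.empty(n)
y[0] = z[0]
for t in range(1, n):                    # Y_t = alpha * Y_{t-1} + Z_t
    y[t] = alpha * y[t - 1] + z[t]
y = y[1000:]                             # drop a burn-in so the start-up value no longer matters

for k in (1, 2, 3):
    r_k = np.corrcoef(y[:-k], y[k:])[0, 1]
    print(k, round(r_k, 3), round(alpha ** k, 3))   # sample ACF vs alpha**k
```

The burn-in is dropped because the series is only approximately stationary near the arbitrary starting value y[0].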

4.4 General Linear Process
Let {Y_t} denote the observed time series and let {Z_t} represent an unobserved white noise series, that is, a sequence of identically distributed, zero-mean, independent random variables. The assumption of independence could be replaced by the weaker assumption that the {Z_t} are uncorrelated random variables, but we will not pursue that slight generality.

A general linear process {Y_t} is one that can be represented as a weighted linear combination of present and past white noise terms:

    Y_t = Z_t + ψ_1 Z_{t−1} + ψ_2 Z_{t−2} + ...    (4.4.1)

Since this is an infinite series, certain conditions must be placed on the ψ-weights for it to be meaningful mathematically. We assume that

    Σ_{i=1}^{∞} ψ_i² < ∞    (4.4.2)

Since {Z_t} is unobservable, we may assume without loss of generality that the coefficient on Z_t is 1; hence ψ_0 = 1.

An important special case is one in which the ψ's form an exponentially decaying sequence, as in Example 4.3.4:

    ψ_j = φ^j    where −1 < φ < 1

So:  Y_t = Z_t + φ Z_{t−1} + φ² Z_{t−2} + ...

For example:

    E(Y_t) = E(Z_t + φ Z_{t−1} + φ² Z_{t−2} + ...) = 0

Thus, {Y_t} has a constant mean of zero (the first requirement for stationarity).

Also,

    Var(Y_t) = Var(Z_t + φ Z_{t−1} + φ² Z_{t−2} + ...)
             = Var(Z_t) + φ² Var(Z_{t−1}) + φ⁴ Var(Z_{t−2}) + ...    (by independence)
             = σ_e² (1 + φ² + φ⁴ + ...)
             = σ_e² / (1 − φ²)    (by summing a geometric series)

Furthermore,

    Cov(Y_t, Y_{t−1}) = Cov(Z_t + φ Z_{t−1} + φ² Z_{t−2} + ... ,  Z_{t−1} + φ Z_{t−2} + φ² Z_{t−3} + ...)
                      = Cov(φ Z_{t−1}, Z_{t−1}) + Cov(φ² Z_{t−2}, φ Z_{t−2}) + ...
                      = φ σ_e² + φ³ σ_e² + φ⁵ σ_e² + ...
                      = φ σ_e² (1 + φ² + φ⁴ + ...)
                      = φ σ_e² / (1 − φ²)    (by summing a geometric series)

Also,

    Corr(Y_t, Y_{t−1}) = [ φ σ_e² / (1 − φ²) ] / [ σ_e² / (1 − φ²) ] = φ

Likewise,

    Cov(Y_t, Y_{t−k}) = φ^k σ_e² / (1 − φ²)    and    Corr(Y_t, Y_{t−k}) = φ^k

The process is (weakly) stationary: the autocovariance structure depends only on the time lag.

For a general linear process, Y_t = Z_t + ψ_1 Z_{t−1} + ψ_2 Z_{t−2} + ..., similar calculations yield the following results:

• E(Y_t) = 0

• γ_k = Cov(Y_t, Y_{t−k}) = σ_e² Σ_{i=0}^{∞} ψ_i ψ_{i+k}    for k ≥ 0, with ψ_0 = 1

A process with a nonzero mean µ may be obtained by adding µ to the right-hand side of equation (4.4.1). Since the mean does not affect the covariance properties of a process, we assume a zero mean until we begin fitting models to data.
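The general formula for γ_k can be checked against the exponentially decaying special case above. The minimal Python sketch below (numpy assumed; truncating the ψ-weights at 200 terms is an assumption that is harmless when |φ| < 1) compares the truncated sum σ_e² Σ ψ_i ψ_{i+k} with the closed form φ^k σ_e² / (1 − φ²).

```python
import numpy as np

phi, sigma2 = 0.7, 1.0
psi = phi ** np.arange(200)       # psi_i = phi**i, truncated; the tail is negligible for |phi| < 1

def gamma(k):
    # gamma_k = sigma_e^2 * sum_i psi_i * psi_{i+k}
    return sigma2 * np.sum(psi[:len(psi) - k] * psi[k:])

for k in range(4):
    closed_form = phi ** k * sigma2 / (1 - phi ** 2)
    print(k, round(gamma(k), 6), round(closed_form, 6))   # the two columns agree
```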

In general, a transformation of the form

    Y_t = Σ_{j=0}^{∞} a_j X_{t−j} = A(B) X_t

where A(B) = Σ_{j=−∞}^{∞} a_j B^j, is called a linear filter.

4.5 Summary

(i) Stationarity
A stationary series:
• is roughly horizontal
• has constant variance
• shows no patterns that are predictable in the long term

Transformations help to stabilise the variance. For ARIMA modelling, we also need to stabilise the mean.

(ii) Non-stationarity
To identify a non-stationary series:
• examine the time plot
• the ACF of stationary data drops to zero relatively quickly
• the ACF of non-stationary data decreases slowly
• for non-stationary data, the value of r_1 is often large and positive

(iii) Differencing
• Differencing helps to stabilise the mean
• The differenced series is the change between consecutive observations in the original series
• The differenced series will have only T − 1 values, since it is not possible to calculate a difference for the first observation
• Occasionally the differenced data will not appear stationary and it may be necessary to difference the data a second time

(iv) Seasonal differencing
A seasonal difference is the difference between an observation and the corresponding observation from the previous year.

When both seasonal and first differences are applied:
• it makes no difference which is done first; the result will be the same
• if seasonality is strong, it is recommended that seasonal differencing be done first, because the resulting series will sometimes be stationary and there will be no need for a further first difference
A short sketch of these operations follows below.
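A minimal Python sketch of the operations described in (iii) and (iv) (numpy assumed; the seasonal period m = 12 and the made-up trend-plus-seasonal series are illustrative assumptions, not data from these notes):

```python
import numpy as np

rng = np.random.default_rng(6)
t = np.arange(120)
# Hypothetical monthly series with a linear trend and a yearly seasonal pattern
y = 10 + 0.5 * t + 5 * np.sin(2 * np.pi * t / 12) + rng.normal(scale=1.0, size=t.size)

d1 = np.diff(y)                # first differences  y_t - y_{t-1}, length T - 1
d2 = np.diff(y, n=2)           # second differences, length T - 2
ds = y[12:] - y[:-12]          # seasonal differences at lag 12, length T - 12

print(len(y), len(d1), len(d2), len(ds))    # 120 119 118 108

# Seasonal differencing followed by a first difference
ds_then_d1 = np.diff(ds)
print(len(ds_then_d1))                      # 107
```

Note how each differencing operation shortens the series, as stated in (iii) above.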

(v) Interpretation of differencing
It is important that, if differencing is used, the differences are interpretable.
• First differences are the change between one observation and the next
• Seasonal differences are the change from one year to the next
