
[Exercises, continued]

... and standard deviation 0·02 litre. Set up a control chart for controlling the mean, if a sample of size 4 is taken.
   [Ans. (0·980 to 1·020) or (0·974 to 1·026), according as the 95% or the 99% level of significance is used.]

4. On the average, about 3% of the bolts produced by a firm are defective. To maintain this quality, a sample of 100 bolts produced is examined every four hours. Determine (a) 99% and (b) 95% control limits for the number of defective bolts in each sample.
   [Ans. The upper control limits are 6 and 4 defective bolts in (a) and (b) respectively. Note that only the upper control limits are needed here.]

6. A factory desires to turn out cotton thread, the breaking strength of which is to have a mean and standard deviation of 6·50 gm and 1·50 gm respectively. Assuming that this standard has been attained, find the 95% and 3-sigma control limits for the mean of the routine samples of 10 pieces of thread.
   [Ans. 5·57 to 7·43, 5·08 to 7·92.]


Chapter 8

MARKOV CHAIN

        "Man hath a weary pilgrimage,
         As through the world he wends."
                                -Southey.

Introduction. There are many phenomena, both in the natural and the social sciences, which fluctuate in a random manner. The nature of their characteristics is best understood with the help of the probabilistic methods called random processes. There are many random processes, of which the Markov process occupies a prominent place. The ideas behind this process are much clarified if one has some knowledge of the Markov Chain, which may be considered as a very particular case of the Markov process.

The present discussion is concerned with the elements of the Markov Chain. The prerequisites for this study are the theories of probability, multi-dimensional vectors, determinants and matrix algebra. [See "A Text Book of Vectors" and "A Text Book of Matrix and Tensors" by the author.]
8.1. Some preliminary definitions. We shall use the following definitions in the discussion of Markov Chains.

An n-dimensional row vector

        u = (u1, u2, ..., un)                                   ...(1)

is called a probability vector if its components are non-negative and their sum is unity, i.e.

        u1 ≥ 0, u2 ≥ 0, ..., un ≥ 0,                            ...(2)

        u1 + u2 + ... + un = 1.                                 ...(3)
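Conditions (2) and (3) are easy to test mechanically. The following small Python sketch is my own illustration, not part of the text; the function name and tolerance are arbitrary choices.

```python
import numpy as np

def is_probability_vector(u, tol=1e-12):
    """Check conditions (2) and (3): non-negative components with unit sum."""
    u = np.asarray(u, dtype=float)
    return bool(np.all(u >= -tol) and abs(u.sum() - 1.0) < tol)

print(is_probability_vector([1/2, 0, 1/3, 1/6]))      # True
print(is_probability_vector([2/5, -1/4, 4/5, 1/20]))  # False: negative component
```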
The condition (3) shows that the components u1, u2, ..., un are not all independent, but one of them can be expressed in terms of the remaining (n-1) components. It is therefore sometimes convenient to write (1) as

        u = (u1, u2, ..., u(n-1), 1 - u1 - u2 - ... - u(n-1)).  ...(4)

It is also evident from (2) and (3) that none of u1, u2, ..., un can be greater than unity or less than zero.

Any vector with non-negative components

        v = (v1, v2, ..., vn)                                   ...(5a)

can be reduced to the probability vector (1) by the definition

        u = λv = (λv1, λv2, ..., λvn),                          ...(5b)

where λ is the scalar defined by

        λ = 1/(v1 + v2 + ... + vn),                             ...(5c)

since u in (5b) satisfies the conditions (2) and (3).

A non-zero row vector a = (a1, a2, ..., an) is said to be a fixed point of a square matrix A of order n if the product aA is equal to a, i.e.

        aA = a.                                                 ...(6)

(A zero vector always satisfies (6) and hence is of no significance as a fixed point.) If a fixed point is a probability vector, then it is called the fixed point probability vector.

A square matrix

        P = [pij]                                               ...(7)

is called a stochastic matrix if each of its rows is a probability vector.

A stochastic matrix is said to be regular if there exists at least one positive integer m for which all the elements of the matrix P^m are positive (i.e. P^m does not contain any zero element).

Example 1. Define a probability vector. Test whether the following vectors are probability vectors or not:

        a = (1/2, 0, 1/3, 1/6),  b = (2/5, -1/4, 4/5, 1/20),  c = (1/7, 2/5, 1).

Solution. The definition is given by (1), (2) and (3).

a is a probability vector, since its elements are non-negative and the sum of its elements is 1/2 + 0 + 1/3 + 1/6 = 1, i.e. equal to unity.

The vector b is not a probability vector, since it contains a negative element -1/4.

The vector c is not a probability vector, since the sum of its components is 1/7 + 2/5 + 1 = 54/35, which is not equal to unity.

Example 2. Define the fixed point of a square matrix. Find a fixed point of the matrix

            [ 3    1    3 ]
        A = [ 1    0    2 ],
            [ 4   -1    8 ]

if it exists. Find also the fixed point probability vector of this matrix, if it exists.

Solution. The fixed point is defined by (6). If a = (a1, a2, a3) is a fixed point of A, then by (6),

                      [ 3    1    3 ]
        (a1, a2, a3)  [ 1    0    2 ]  = (a1, a2, a3).
                      [ 4   -1    8 ]

It gives

        3a1 + a2 + 4a3 = a1,
        a1 - a3 = a2,
        3a1 + 2a2 + 8a3 = a3.

Hence a1 = -k, a2 = -2k, a3 = k, where k is any constant. Hence the fixed point is the vector a = (-k, -2k, k), if k is non-zero.

If k is positive, the first two components of a are negative. If k is negative, the third component of a is negative. Hence a cannot be a probability vector.

Thus, the matrix A has a fixed point but has no fixed point probability vector.
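The fixed point in Example 2 can be verified numerically, and the reduction (5a)-(5c) is one line of code. The sketch below is my own check against the matrix A as reconstructed above.

```python
import numpy as np

A = np.array([[3, 1, 3],
              [1, 0, 2],
              [4, -1, 8]])

a = np.array([-1, -2, 1])     # the fixed point of Example 2, taking k = 1
print(a @ A)                  # [-1 -2  1], i.e. aA = a

# Reducing a non-negative vector to a probability vector, as in (5a)-(5c):
v = np.array([2.0, 3.0, 5.0])
u = v / v.sum()               # u = lambda * v with lambda = 1/(v1 + ... + vn)
print(u, u.sum())             # [0.2 0.3 0.5] 1.0
```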
Example 3. Define a stochastic matrix. Test whether the following matrices are stochastic matrices or not. If any of them is a stochastic matrix, test whether it is a regular stochastic matrix.

            [ 0     1     0   ]            [  0     1     0   ]
        A = [ 1/2   1/4   1/4 ],       B = [ -1/2   3/2   0   ].
            [ 1     0     0   ]            [  1/4   1/4   1/4 ]

Solution. The definitions are given in Sec. 8.1.

Since all the rows of A are probability vectors, the matrix A is a stochastic matrix. Moreover, we find that

              [ 1/2   1/4    1/4  ]
        A^2 = [ 3/8   9/16   1/16 ],
              [ 0     1      0    ]

which contains zero elements, and

                       [ 3/8     9/16    1/16 ]
        A^3 = A^2 A =  [ 11/32   33/64   9/64 ],
                       [ 1/2     1/4     1/4  ]

which does not contain any zero element. Hence A is a regular stochastic matrix.

The matrix B is not a stochastic matrix, since its second and third rows are not probability vectors.
8.2. Theorems on Probability Vector and Stochastic Matrix.

The following theorems are generally used in the discussion on Markov Chains.

Theorem 1. A square matrix A = [aij] has a fixed point if and only if the determinant of the matrix A - I is zero (where I is the unit matrix), i.e.

        | A - I | = 0.                                          ...(1)

Proof. If x = (x1, x2, ..., xn) is a fixed point of A = [aij], then its definition gives

                          [ a11  a12  ...  a1n ]
        [x1, x2, ..., xn] [ a21  a22  ...  a2n ] = [x1, x2, ..., xn].
                          [ ...                ]
                          [ an1  an2  ...  ann ]

It implies the equations

        (a11 - 1)x1 + a21 x2 + ... + an1 xn = 0,
        a12 x1 + (a22 - 1)x2 + ... + an2 xn = 0,
        ..........................................
        a1n x1 + a2n x2 + ... + (ann - 1)xn = 0.

These equations have a non-zero solution if and only if the determinant of their coefficients is zero, i.e.

        | a11-1   a21     ...   an1   |
        | a12     a22-1   ...   an2   |  = 0.
        | ...                         |
        | a1n     a2n     ...   ann-1 |

Since interchanging the rows and columns of a determinant does not alter its value, the above result implies

        | a11-1   a12     ...   a1n   |
        | a21     a22-1   ...   a2n   |  = 0,   i.e.   | A - I | = 0.
        | ...                         |
        | an1     an2     ...   ann-1 |

This is the expanded form of (1). Hence we prove the theorem.

Theorem 2. Every stochastic matrix has a fixed point.

Proof. If P = [pij] is a stochastic matrix, so that each of its rows is a probability vector, we get the determinant

                    | p11-1   p12     ...   p1n   |
        | P - I | = | p21     p22-1   ...   p2n   |
                    | ...                         |
                    | pn1     pn2     ...   pnn-1 |

                    | 0   p12     ...   p1n   |
                  = | 0   p22-1   ...   p2n   |  = 0,
                    | ...                     |
                    | 0   pn2     ...   pnn-1 |

where the last step follows by adding the elements of the second, third, ..., n-th columns to the corresponding elements of the first column; since each row of P sums to unity, every element of the new first column vanishes. Thus

        | P - I | = 0.

Hence, by Theorem 1, the matrix P has a fixed point.
Theorem 3. If u = (u1, u2, ..., un) is a probability vector and P = [pij] is a stochastic matrix of order n, then the product uP is a probability vector.

Proof. Now

                               [ p11  p12  ...  p1n ]
        uP = (u1, u2, ..., un) [ p21  p22  ...  p2n ] = (a1, a2, ..., an),   ...(i)
                               [ ...                ]
                               [ pn1  pn2  ...  pnn ]

where

        a1 = u1 p11 + u2 p21 + ... + un pn1,
        a2 = u1 p12 + u2 p22 + ... + un pn2,
        ......................................
        an = u1 p1n + u2 p2n + ... + un pnn.                    ...(ii)

We notice that a1, a2, ..., an are all non-negative, since the ui and the pij are non-negative. Also, by addition, we get

        a1 + a2 + ... + an = u1 (p11 + p12 + ... + p1n)
                           + u2 (p21 + p22 + ... + p2n)
                           + ...............................
                           + un (pn1 + pn2 + ... + pnn)
                           = u1 (1) + u2 (1) + ... + un (1) = 1,   ...(iii)

since each row of P, and u itself, is a probability vector.

Now, (i), (ii) and (iii) show that uP is a probability vector.

Theorem 4. If A and B are stochastic matrices, then the product AB is a stochastic matrix, and all powers A^m (where m is a positive integer) are also stochastic matrices.

Proof. Now

             [ a11  ...  a1n ] [ b11  ...  b1n ]
        AB = [ ...           ] [ ...           ].
             [ an1  ...  ann ] [ bn1  ...  bnn ]

The first row of the product AB is obtained by multiplying the first row of A with the columns of the matrix B, and this is a probability vector by Theorem 3. Similarly, it is proved that the other rows of the product AB are also probability vectors.

Hence AB is a stochastic matrix. Putting B = A, we find that A^2 is a stochastic matrix. Hence A^3, A^4, ..., A^m are also stochastic matrices.

Theorem 5. If P is a regular stochastic matrix and its fixed point is the probability vector t, then the components of t are all positive (i.e. t has no zero component).

Proof. Since P is a stochastic matrix, it has a fixed point t by Theorem 2. It means

        tP = t.                                                 ...(i)

Hence, if m is any positive integer, we have

        t P^m = (tP) P^(m-1) = t P^(m-1) = ... = tP = t.        ...(ii)

It shows that t is also a fixed point of P^m. Since P is also regular, there is an integer m such that the matrix

        Q = P^m                                                 ...(iii)

has no zero element, by the definition of regularity. Also, Theorem 4 shows that Q is a stochastic matrix. Let

            [ q11  q12  ...  q1n ]
        Q = [ q21  q22  ...  q2n ],                             ...(iv)
            [ ...                ]
            [ qn1  qn2  ...  qnn ]

where the qij are positive and non-zero. Let t = (t1, t2, ..., tn). Then, by (i) and (ii), we get

        (t1, t2, ..., tn) Q = (t1, t2, ..., tn).

Hence, by (iv),

        t1 q11 + t2 q21 + ... + tn qn1 = t1,
        ..........................................
        t1 q1r + t2 q2r + ... + tn qnr = tr,
        ..........................................
        t1 q1n + t2 q2n + ... + tn qnn = tn.                    ...(v)

Since t is a probability vector, all its elements are non-negative. Now, if tr = 0, then the r-th equation in (v) is satisfied only if t1 = 0, t2 = 0, ..., t(r-1) = 0, t(r+1) = 0, ..., tn = 0, since every qir is positive; but then t could not have unit sum. It shows that no component of t can be zero. This proves the theorem.
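The two facts used in this proof, that t is fixed under every power of P and that its components are strictly positive, can be confirmed numerically. The sketch below is my own check, again on the matrix A of Example 3, whose fixed point probability vector is (3/8, 1/2, 1/8).

```python
import numpy as np

P = np.array([[0.0, 1.0, 0.0],
              [0.5, 0.25, 0.25],
              [1.0, 0.0, 0.0]])

t = np.array([3/8, 1/2, 1/8])                             # fixed point probability vector
print(np.allclose(t @ P, t))                              # True: tP = t
print(np.allclose(t @ np.linalg.matrix_power(P, 7), t))   # True: t P^m = t
print((t > 0).all())                                      # True: no zero component
```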
Theorem 6. If P is a regular stochastic matrix and its fixed point is the probability vector t, then P^m (where m is a positive integer) tends to a matrix T in the limit as m tends to infinity, such that every row of T is equal to t. That is, if

            [ p11  p12  ...  p1n ]
        P = [ p21  p22  ...  p2n ]                              ...(2)
            [ ...                ]
            [ pn1  pn2  ...  pnn ]

has the fixed point probability vector

        t = (t1, t2, ..., tn),                                  ...(3)

then

                        [ t1  t2  ...  tn ]
        lim  P^m = T =  [ t1  t2  ...  tn ]                     ...(4)
        m->oo           [ ...             ]
                        [ t1  t2  ...  tn ]

[The proof of this theorem was first given by A. A. Markov and is omitted here.]

Theorem 7. If P is a regular stochastic matrix and p is any probability vector, then

        lim  (p P^m) = t,                                       ...(5)
        m->oo

where t is the fixed point probability vector of P, and m is a positive integer. [Thus the limit is independent of p.]

Proof. Since p is independent of m, we get

                                                       [ t1  t2  ...  tn ]
        lim (p P^m) = p lim P^m = (p1, p2, ..., pn)    [ t1  t2  ...  tn ]   by Theorem 6
        m->oo           m->oo                          [ ...             ]
                                                       [ t1  t2  ...  tn ]

          = [(p1 + p2 + ... + pn) t1, (p1 + p2 + ... + pn) t2, ..., (p1 + p2 + ... + pn) tn]

          = (t1, t2, ..., tn) = t,

since p1 + p2 + ... + pn = 1. This proves (5).

Example. If the matrix P is defined by

            [ 0     1     0   ]
        P = [ 1/6   1/2   1/3 ],
            [ 0     2/3   1/3 ]

prove that P is a regular stochastic matrix, and evaluate lim P^m as m tends to infinity.

Solution. Since the rows of P are probability vectors, P is a stochastic matrix. Moreover, we find

              [ 1/6    1/2    1/3  ]
        P^2 = [ 1/12   23/36  5/18 ],
              [ 1/9    5/9    1/3  ]

which does not contain any zero element. Hence P is a regular stochastic matrix.

If it has a unique fixed point probability vector t = (t1, t2, 1 - t1 - t2), then by Theorem 6 the limit of P^m as m -> oo is the matrix T, each row of which is t.

Now, by the definition of t, we have

                              [ 0     1     0   ]
        (t1, t2, 1 - t1 - t2) [ 1/6   1/2   1/3 ] = (t1, t2, 1 - t1 - t2).
                              [ 0     2/3   1/3 ]

This implies

        (1/6) t2 = t1,
        t1 + (1/2) t2 + (2/3)(1 - t1 - t2) = t2,
        (1/3) t2 + (1/3)(1 - t1 - t2) = 1 - t1 - t2,

so that

        t1 = 1/10,  t2 = 3/5,  1 - t1 - t2 = 3/10.

Hence

                     [ 1/10   3/5   3/10 ]
        lim  P^m  =  [ 1/10   3/5   3/10 ].
        m->oo        [ 1/10   3/5   3/10 ]
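The limit in this example can be watched happening. The following sketch is my own illustration: it raises P to increasing powers and shows the first row converging to t = (1/10, 3/5, 3/10); the other rows converge to the same limit, as Theorem 6 asserts.

```python
import numpy as np

P = np.array([[0, 1, 0],
              [1/6, 1/2, 1/3],
              [0, 2/3, 1/3]])

for m in (1, 4, 16, 64):
    print(m, np.linalg.matrix_power(P, m)[0])   # first row of P^m

# m = 64 prints approximately [0.1 0.6 0.3], the fixed point
# probability vector t of P.
```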

Exercises VIII (A)

1. Test whether the following vectors are probability vectors or not:

        a = (1/4, 0, 1/4, 0, 1/2),      b = (1/3, 1/3, 1/3),
        c = (1/3, 1/6, 1/4, 1/4),       d = (0, 1/2, 1/4, 1/4),
        e = (0, -1/2, 1/6, 1/7),        f = (1/5, 2, -2, 1, 0),
        g = (2, 3, 5, 9),               h = (-2, -3, -1, 7).

   [Ans. a, b, c, d are probability vectors; the others are not.]

2. Test whether the following matrices are stochastic matrices or not:

            [ 1/2   1/2   0   ]            [  1/2   0     1/2 ]
        A = [ 1/4   1/4   1/2 ],       B = [ -1/2   1     1/2 ].
            [ 0     1/2   1/2 ]            [  1/4   1/4   1/2 ]

   [Ans. A is a stochastic matrix, but B is not.]

3. Test whether the following matrices are regular stochastic matrices or not:

        A = [ 0    1  ],   B = [ 1    0  ],   C = [ 1/2  1/2 ],   D = [ 3/4  1/4 ],
            [ 1/2 1/2 ]        [ 1/2 1/2 ]        [ 1/4  3/4 ]        [ 1    0   ]

            [ 0    1    0  ]        [ 1/4  3/4  0 ]
        E = [ 0   1/2  1/2 ],   F = [ 1/2  1/2  0 ],   G = [ 0  1 ].
            [ 1/2  0   1/2 ]        [ 0    0    1 ]        [ 1  0 ]

   [Ans. A, C, D, E are regular; B, F, G are not regular.]

4. Test whether the following matrices have fixed points. If they have fixed points, find whether they have fixed point probability vectors:

            [ 0    1    0  ]
        A = [ 1/2  0   1/2 ],   B = [ 0    1  ],   C = [ 3/4  1/4 ],
            [ 0   1/2  1/2 ]        [ 1/2 1/2 ]        [ 1/2  1/2 ]

            [ 1/2  1/4  1/4 ]        [ 0    1    0  ]
        D = [ 1/4  1/2  1/4 ],   E = [ 1/3  1/2  1/6 ],
            [ 1/3  1/3  1/3 ]        [ 0    1    0  ]

        F = [  0   1 ],   G = [ 3  2 ].
            [ -1   0 ]        [ 2  1 ]

   [Ans. For the test, use Theorem 1 of Sec. 8.2. The matrices A, B, C, D, E have respectively the fixed point probability vectors (1/5, 2/5, 2/5), (1/3, 2/3), (2/3, 1/3), (4/11, 4/11, 3/11), (2/9, 2/3, 1/9). The matrices F, G have no fixed points.]

5. If the matrix A is defined by

            [ 0     1     0   ]
        A = [ 1/2   1/4   1/4 ],
            [ 0     1     0   ]

find (i) the fixed point probability vector t of A, (ii) the limit of A^m as m tends to infinity, and (iii) the limit of a A^m as m tends to infinity, where a = (1/4, 1/2, 1/4), if they exist.

   [Ans. (i) t = (2/7, 4/7, 1/7); (ii) the matrix having each row equal to t; (iii) t.]

6. If the matrix B is defined by

            [ 0     1     0     0   ]
        B = [ 1/2   0     1/4   1/4 ],
            [ 0     0     0     1   ]
            [ 0     1/2   0     1/2 ]

find (i) the fixed point probability vector t of B, (ii) B^m when m is large, (iii) (1/2, 1/2, 0, 0) B^m when m is large, and (iv) (1/2, 1/4, 1/4, 0) B^m when m is large.

   [Ans. (i) t = (2/11, 4/11, 1/11, 4/11); (ii) the matrix having each row equal to t; (iii) and (iv) give t.]

7. Find a scalar multiple of each vector which makes the vector a probability vector:

        (i) (1/2, 2/3, 0, 1/3),  (ii) (0, 3, 1, 5, 6),  (iii) (0, 0, 0, 0).

   [Ans. (i) 2/3; (ii) 1/15; (iii) none.]

8.3. Some definitions in Markov Chain.

Let us suppose that a system S can take, at a step Tk, a random state ai, where ai is one of the states (a1, a2, ..., an), so that i = 1, 2, 3, ..., n.

At the next step T(k+1), let the random state of the system S be aj, where j = 1, 2, ..., n.

We assume that the latter state aj depends at most upon its immediately preceding state ai, and not upon any state previous to ai.

Let pij denote the probability that the state aj occurs immediately after the state ai.

The random process which has the above characteristics is called a Markov Chain [A. A. Markov (1856-1922)]. The set of states (a1, a2, ..., an) may be thought of as constituting a space, called the state space of the process.

Since n is in general finite, the process is called the finite Markov Chain.

It is convenient to represent the probabilities pij in the following matrix form:

               a1    a2    ...   an
        a1 [  p11   p12   ...   p1n ]
        a2 [  p21   p22   ...   p2n ]                           ...(1)
        ...[  ...                   ]
        an [  pn1   pn2   ...   pnn ]

Here, the previous states (a1, a2, ..., an) are written in a column and the latter states (a1, a2, ..., an) are written in a row. Thus, p11 represents the probability that the state a1 changes to a1; p12 represents the probability that the state a1 changes to a2; p21 represents the probability that the state a2 changes to a1; and so on.

The matrix P defined by

            [ p11   p12   ...   p1n ]
        P = [ p21   p22   ...   p2n ]                           ...(2)
            [ ...                   ]
            [ pn1   pn2   ...   pnn ]

is called the transition matrix of the Markov Chain at the step k, or at the time Tk.

The elements of the first row of the matrix (2), i.e.

        p11, p12, ..., p1n,                                     ...(3)

are the probabilities that the state a1 changes to a1, a2, ..., an respectively. Now, since the states a1, a2, ..., an are assumed to be mutually exclusive and exhaustive, and also since the change of a1 to one of (a1, a2, ..., an) is a certainty, we get

        p11 + p12 + ... + p1n = 1,                              ...(4)

by the addition law of probability. Also, p11, p12, ..., p1n are non-negative, since a probability cannot be negative. Hence the elements (3) represent a probability vector.

Similarly, it can be shown that all the other rows of (2) are also probability vectors. Hence the transition matrix P in (2) is a stochastic matrix.

The state ai of a Markov Chain is said to be an absorbing state if the system remains in that state once it enters there. Thus, the state ai is an absorbing state if

        pii = 1,   pij = 0  (j ≠ i),                            ...(5)

i.e. the i-th row of the transition matrix P contains 1 in the main diagonal and zeros everywhere else.

A Markov Chain is said to be an ergodic Markov Chain if any two of its states can communicate with each other. This is possible if none of the elements of the transition matrix is zero at a certain step, i.e. if the transition matrix is regular.

The transition probabilities pij can be represented by a diagram, called the transition diagram, by denoting each transition with an arrow from the state ai to the state aj. Thus, if there are only three states a1, a2 and a3 in (1), the transition diagram is as shown in Fig. 8.1.

[Fig. 8.1. The transition diagram for three states a1, a2, a3, with the arrows labelled by the probabilities pij.]

If any of the probabilities pij has the zero value, it is not necessary to display it in the transition diagram.
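The defining property, that the next state depends only on the present state, is easy to demonstrate in code. The sketch below is my own illustration, using for concreteness the three-state chain of Example 1 that follows.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def simulate_chain(P, start, steps):
    """Simulate a Markov Chain: each move depends only on the current state."""
    state, path = start, [start]
    for _ in range(steps):
        state = rng.choice(len(P), p=P[state])  # row `state` gives the next-step law
        path.append(int(state))
    return path

P = np.array([[0, 1, 0],        # a1 always passes to a2
              [0, 0, 1],        # a2 always passes to a3
              [0.5, 0.5, 0]])   # a3 passes to a1 or a2 with equal probability

print(simulate_chain(P, start=0, steps=10))
```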
Example 1. (Throwing a ball amongst players.) Three players a1, a2, a3 are throwing a ball to one another: a1 always throws the ball to a2, and a2 always throws the ball to a3; but a3 throws the ball either to a1 or to a2 with equal probabilities. Explain why the possession of the ball is a Markov Chain. Find the transition matrix.

Solution. At a particular step (or time), the ball may be in the possession of either a1 or a2 or a3. The next step of possession of the ball is influenced only by the present step, and not by any step previous to the present step. Hence the possession of the ball forms a Markov Chain.

The transition matrix is as follows:

                a1    a2    a3
         a1 [  0     1     0  ]
    P =  a2 [  0     0     1  ]
         a3 [ 1/2   1/2    0  ]

The first row of the matrix is such that a1 always throws the ball to a2. The second row implies that a2 always throws the ball to a3. The third row shows that a3 throws the ball to a1 and a2 with equal probabilities 1/2, 1/2.

Example 2. (Random walk with reflecting barriers.) A particle under impulses can take any of the positions a0, a1, a2, a3, a4 marked on a straight line in this order. The end positions a0 and a4 are reflecting, i.e. the particle when at a0 or a4 can move only to a1 or a3 respectively. In the other positions, it can move to the next position at the right with probability p, and hence can move to the position at the left with probability q = 1 - p. Explain why the movement of the particle is a Markov Chain, and construct the transition matrix.

Solution. At any particular step, the particle can occupy one of the positions a0, a1, a2, a3, a4. At the next step, the position of the particle is influenced only by the present step, and not by any step previous to the present step. Hence the process of movement of the particle is a Markov Chain.

The transition matrix is as follows:

                a0   a1   a2   a3   a4
         a0 [  0    1    0    0    0 ]
         a1 [  q    0    p    0    0 ]
    P =  a2 [  0    q    0    p    0 ]
         a3 [  0    0    q    0    p ]
         a4 [  0    0    0    1    0 ]

The first row shows that a0 must change to a1, and the last row shows that a4 must change to a3. The explanations of the other rows are evident.

[In this example, if the end positions a0 and a4 are absorbing, then the particle will remain in these positions once it reaches there. The transition matrix for the absorbing barriers is as follows:

                a0   a1   a2   a3   a4
         a0 [  1    0    0    0    0 ]
         a1 [  q    0    p    0    0 ]
    P =  a2 [  0    q    0    p    0 ]
         a3 [  0    0    q    0    p ]
         a4 [  0    0    0    0    1 ]

The above type of walk is called the random walk with absorbing barriers.]
8.4. Probability distribution in Markov Chain.

At a certain step, let pi denote the probability that the system is in the state ai (i = 1, 2, ..., n). Then the vector

        p = (p1, p2, ..., pn)                                   ...(1)

is called the probability distribution vector at that step.

In accordance with this definition, the initial probability distribution vector is defined by the distribution when the process starts, and is denoted by

        p(0) = (p1(0), p2(0), ..., pn(0)).                      ...(2)

The m-th step probability distribution vector is defined as the distribution after the m-th step, and is denoted by

        p(m) = (p1(m), p2(m), ..., pn(m)).                      ...(3)

Theorem. If P is the transition matrix of a Markov Chain, and p(m) denotes the probability distribution vector after m steps, then

        p(m) = p(0) P^m,                                        ...(4)

where p(0) is the initial probability distribution vector.

Proof. Let the transition matrix P be

               a1    a2    ...   an
        a1 [  p11   p12   ...   p1n ]
        a2 [  p21   p22   ...   p2n ]                           ...(i)
        ...[  ...                   ]
        an [  pn1   pn2   ...   pnn ]

We first determine the probability distribution vector p(1) after one step only.

The system can go to the state a1 from any one of the states a1, a2, ..., an with the probabilities

        p11, p21, ..., pn1                                      ...(ii)

respectively, as in (i).

Since the probability of being at the state a1 is p1(0), and that of changing from a1 to a1 is p11, we find that the probability of the system's being at a1 and then changing from a1 to a1 is

        p1(0) p11,                                              ...(iii)

by the multiplication law of probability.

Similarly, the probability that the system was at a2 and then changed to a1 is

        p2(0) p21.                                              ...(iv)

..............................................................

The probability that the system was at an and then changed to a1 is

        pn(0) pn1.                                              ...(v)

Hence, the probability p1(1) that the system was at any one of the states a1, a2, ..., an and then changed to a1 in one step is obtained from (iii) to (v) as

        p1(1) = p1(0) p11 + p2(0) p21 + ... + pn(0) pn1.        ...(vi)

Proceeding in the same way, we can find the probability p2(1) of the system's being at any one of the states a1, a2, ..., an and then changing to a2 in one step. It is given by

        p2(1) = p1(0) p12 + p2(0) p22 + ... + pn(0) pn2.        ...(vii)

Generalizing it, we get the probability pn(1) of the system's being at any one of the states a1, a2, ..., an and then changing to an in one step, as

        pn(1) = p1(0) p1n + p2(0) p2n + ... + pn(0) pnn.        ...(5)

The relations (vi), (vii) and (5) can be written in the vector form as

                                                      [ p11  p12  ...  p1n ]
   p(1) = (p1(1), ..., pn(1)) = (p1(0), ..., pn(0))   [ p21  p22  ...  p2n ]
                                                      [ ...                ]
                                                      [ pn1  pn2  ...  pnn ]
or

        p(1) = p(0) P.                                          ...(6)

Now, the probability distribution vector p(2) after two steps can be obtained by the formula (6) if we treat p(1) as the initial state. Thus, by (6),

        p(2) = p(1) P,                                          ...(7)

so that

        p(2) = p(0) P^2.                                        ...(8)

Generalizing (8), we get the probability distribution vector after m steps as

        p(m) = p(0) P^m.

This proves (4).
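Formula (4) is one line of numpy. The following sketch is my own check: it computes the distribution of the ball-throwing chain of Sec. 8.3 after a few steps, starting from a1.

```python
import numpy as np

P = np.array([[0, 1, 0],
              [0, 0, 1],
              [0.5, 0.5, 0]])

p0 = np.array([1.0, 0.0, 0.0])                  # start at a1
for m in range(5):
    pm = p0 @ np.linalg.matrix_power(P, m)      # formula (4): p(m) = p(0) P^m
    print(m, pm)
```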
Note 1. Alternative formulas for the probability distribution.

By (5), we get

        pn(m) = p1(m-1) p1n + p2(m-1) p2n + ... + pn(m-1) pnn.  ...(9)

It is an alternative form of (4). Similarly, from (7) we get

        p(m) = p(m-1) P.                                        ...(10)

Note 2. An interpretation of P^m.

Let us choose the initial distribution as p(0) = (1, 0, 0, ...), which means that the system starts from a1. From (4) we then see that the vector p(m) becomes identical with the first row of the matrix P^m. Thus, the elements of the first row of P^m give the probabilities that the system is in a given state after the m-th step, starting from the state a1. Similarly, the elements of the second row of P^m give the probabilities that the system is in a given state after the m-th step, starting from the state a2, and so on.

Example 1. A sales organiser goes to one of the three markets A, B and C every day. He never goes to the same market on consecutive days. If he goes to A, then the next day he goes to B. If he goes to either B or C, then the next day he is twice as likely to go to A as to the other market. If he goes to B on a Sunday, find the probability of his going to C on the next Friday.

Solution. The transition matrix P, representing the probabilities of his going from one market to another on two consecutive days, is prescribed as follows:

                A     B     C
          A [  0     1     0  ]
     P =  B [ 2/3    0    1/3 ]
          C [ 2/3   1/3    0  ]

Since he goes to B on Sunday, the initial probability distribution vector p(0), i.e. the probabilities of his going to the various markets on Sunday, is

        p(0) = (0, 1, 0).

From Sunday to Friday there are five steps. Hence the probability distribution vector on Friday is

        p(5) = p(0) P^5 = (110/243, 28/81, 49/243).

Thus, the probability of his going to C on Friday is 49/243.

[We also note that the probabilities of his going to A and to B on Friday are respectively 110/243 and 28/81.]

Example 2. A particle under impulses can take any of the positions a0, a1, a2, a3, a4, a5 marked on a straight line in this order. The end points a0 and a5 form reflecting barriers. The particle can move to the right with the probability p and to the left with the probability q, where q = 1 - p. If it starts from a2, find the probability that it reaches a2 again after exactly four steps.

Solution. The transition matrix P is as follows:

                a0   a1   a2   a3   a4   a5
         a0 [  0    1    0    0    0    0 ]
         a1 [  q    0    p    0    0    0 ]
    P =  a2 [  0    q    0    p    0    0 ]
         a3 [  0    0    q    0    p    0 ]
         a4 [  0    0    0    q    0    p ]
         a5 [  0    0    0    0    1    0 ]

Also p(0) = (0, 0, 1, 0, 0, 0). Hence

        p(4) = p(0) P^4
             = (q^3 + 2pq^3, 0, pq^2 + 2p^2 q^2 + 3p^2 q^2, 0, 3p^3 q + p^3, 0).

Thus, after four steps the particle will be at a2 with the probability pq^2 + 2p^2 q^2 + 3p^2 q^2.

[It may be noted that the particle reaches a0, a1, a3, a4 and a5 with the probabilities q^3 + 2pq^3, 0, 0, 3p^3 q + p^3 and 0 respectively.]
Exercises VIII (B)

1. A room is divided into four compartments a1, a2, a3 and a4. There are two doors between a1 and a2, one door between a2 and a3, two doors between a3 and a4, and three doors between a4 and a1. A rat moves randomly from one compartment to another through the doors. Explain why its position forms a Markov Chain, and prove that the transition matrix is as follows:

                a1    a2    a3    a4
         a1 [  0     2/5   0     3/5 ]
         a2 [ 2/3    0     1/3   0   ]
         a3 [  0     1/3   0     2/3 ]
         a4 [ 3/5    0     2/5   0   ]

2. A particle under impulses can take any of the positions a1, a2, a3, a4 marked on the circumference of a circle in this order in the counter-clockwise sense. It can move one step counter-clockwise with the probability p and one step clockwise with the probability q, where q = 1 - p. Explain why the movement of the particle forms a Markov Chain, and find the transition matrix.

   [Ans. P has the rows (0, p, 0, q), (q, 0, p, 0), (0, q, 0, p), (p, 0, q, 0) for a1, a2, a3, a4 respectively.]

3. Draw the transition diagrams for the following transition matrices:

            [ 1/2   0    1/2 ]            [ 0     1     0   ]
    (i) P = [ 1     0    0   ],  (ii) P = [ 1/4   1/4   1/2 ],
            [ 1/4  1/2   1/4 ]            [ 1/2   0     1/2 ]

              [ 0     1/2   1/2   0   ]
    (iii) P = [ 1/2   1/4   0     1/4 ].
              [ 0     0     0     1   ]
              [ 0     1/2   0     1/2 ]

    [Hints. See Fig. 8.1.]

4. Three players a1, a2, a3 are throwing a ball to one another: a1 always throws the ball to a2, and a2 always throws the ball to a3; but a3 throws the ball either to a1 or to a2 with equal probabilities. If a3 is the first person to throw the ball, prove that after three throws the probabilities of having the ball by a1, a2 and a3 are respectively 1/4, 1/4 and 1/2.

8.5. Multi-step in Markov Chain.

The probability that a system which is in the state ai will be in the state aj after exactly one step has been denoted by pij.

We introduce the notation pij(m) to denote the probability that a system which is in the state ai will be in the state aj after exactly m steps.

This definition implies that if m = 1, we have

        pij(1) = pij.                                           ...(1)

It is convenient to write the pij(m) in the matrix form

               [ p11(m)   p12(m)   ...   p1n(m) ]
        P(m) = [ p21(m)   p22(m)   ...   p2n(m) ]               ...(2)
               [ ...                            ]
               [ pn1(m)   pn2(m)   ...   pnn(m) ]

We call P(m) the m-th step transition matrix. In accordance with the notation P of Sec. 8.3 and the definition (1) above, we have

        P(1) = P.                                               ...(3)

Theorem 1. If P denotes the transition matrix of a Markov Chain in one step and P(m) denotes the m-th step transition matrix, then

        P(m) = P^m,                                             ...(4)

that is, the m-th step matrix is the m-th power of P.
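Before working through the proof, the theorem can be checked numerically. The sketch below is my own illustration, reusing the ball-throwing matrix of Sec. 8.3: squaring the one-step matrix reproduces the two-step probability obtained by summing over the intermediate state.

```python
import numpy as np

P = np.array([[0, 1, 0],
              [0, 0, 1],
              [0.5, 0.5, 0]])

P2 = np.linalg.matrix_power(P, 2)       # the two-step matrix P(2) = P^2

# The same entry by hand: p_ij(2) = sum over k of p_ik * p_kj
i, j = 0, 2
print(P2[i, j], sum(P[i, k] * P[k, j] for k in range(3)))   # both 1.0
```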
Proof. Let us suppose that the system is in the state ai initially, that it changes from the state ai to the state ak in one step, and that it then changes from the state ak to the state aj in the next step.

Now, pik(1) denotes the probability of changing from the state ai to the state ak in one step, and pkj(1), which is equal to pkj, denotes the probability of changing from the state ak to the state aj in one step. It follows that the probability of changing from ai to ak and then from ak to aj is

        pik(1) pkj(1),

by the multiplication law of probability.

Since the intermediate state ak may be any one of the states a1, a2, ..., an, it follows that the probability p1j(2) that the system changes from a1 to aj in two steps, through any one of the intermediate states a1, a2, ..., an, is given by

        p1j(2) = Σk p1k(1) pkj(1)   (k = 1, 2, ..., n),         ...(i)

by the addition law of probability.

Similarly, the probability p2j(2) that the system changes from a2 to any one of the states a1, a2, ..., an and then to aj is given by

        p2j(2) = Σk p2k(1) pkj(1).                              ...(ii)

In general, the probability pij(2) that the system changes from the state ai to any one of the states a1, a2, ..., an and then to aj is given by

        pij(2) = Σk pik(1) pkj(1).                              ...(5)

Now, taking the state reached at the second step as the initial state, we can similarly calculate the probability pij(3) that the system changes from ai to ak in two steps and from ak to aj in one step. It is given by

        pij(3) = Σk pik(2) pkj(1).                              ...(iii)

In general, we prove in a similar manner that

        pij(m) = Σk pik(m-1) pkj(1),                            ...(7)

where we remember, by (1), that pkj(1) = pkj.

Now, let P be the one-step transition matrix

            [ p11(1)   p12(1)   ...   p1n(1) ]
        P = [ p21(1)   p22(1)   ...   p2n(1) ]
            [ ...                            ]
            [ pn1(1)   pn2(1)   ...   pnn(1) ]

Hence

              [ p11(1) ... p1n(1) ] [ p11(1) ... p1n(1) ]   [ p11(2)   ...   p1n(2) ]
        P^2 = [ ...               ] [ ...               ] = [ ...                   ]
              [ pn1(1) ... pnn(1) ] [ pn1(1) ... pnn(1) ]   [ pn1(2)   ...   pnn(2) ]

by the use of (i), (ii) and (5). This together with (2) gives

        P^2 = P(2).

Also, by (5) and (iii), we get

        P^3 = P^2 P = [pij(2)][pij(1)] = [pij(3)],

and hence

        P^3 = P(3).

Similarly, we prove by induction that

        P^m = P(m).

[An alternative proof of (4) may be obtained by using formula (4) and Note 2 of Sec. 8.4. However, the above proof is direct and also instructive, since the formula (7) has been deduced in the course of this proof.]

Note 1. Stationary Markov Chain.

If a Markov Chain is such that its m-th step transition matrix P(m), which is equal to P^m, tends to a definite matrix T as m tends to infinity, then it is called a stationary Markov Chain.

This definition implies that after a large number of steps m, the transition matrix P(m) changes so slowly that it may be considered to have reached a definite value.

By Theorem 6 of Sec. 8.2, it follows that a Markov Chain is stationary if its one-step transition matrix P is regular and has a fixed point probability vector. The physical meaning of regularity is that it is possible for the system to change from any one of the states a1, a2, ..., an to any one of the others, since no element of P^m is zero; this is the ergodic case. It also follows that the limiting value T of P(m) is such that each row of T is equal to the fixed point probability vector t of P.

By Theorem 7 of Sec. 8.2 and the theorem of Sec. 8.4, we get

        lim  p(m) = lim  {p(0) P^m} = t.                        ...(8)
        m->oo       m->oo

Thus each row t of T does not depend upon the initial probability distribution p(0). In other words, even if the initial states of a stationary Markov Chain are different, the limiting transition matrix T is the same. The limiting state is called the equilibrium state or the steady state.

Example 1. The transition matrix P of a Markov Chain is as follows:

            [ 0     1/2   1/2 ]
        P = [ 1/2   1/2   0   ]
            [ 0     1     0   ]

The initial probability distribution of the process is p(0) = (2/3, 0, 1/3). If pij(m) denotes the m-th step transition probabilities, find p32(2), p13(2) and p(4). Find also lim {p(0) P^m} and lim P^m as m tends to infinity.

Solution. By (2) and (4),

                      [ 1/4   3/4   0   ]
        P(2) = P^2 =  [ 1/4   1/2   1/4 ].
                      [ 1/2   1/2   0   ]

Hence

        p32(2) = 1/2,    p13(2) = 0.

By (4) of Sec. 8.4 we get

        p(4) = p(0) P^4 = (1/4, 7/12, 1/6).

It may be shown that the fixed point probability vector t of P is

        t = (2/7, 4/7, 1/7).

Hence

                     [ 2/7   4/7   1/7 ]
        lim  P^m  =  [ 2/7   4/7   1/7 ],
        m->oo        [ 2/7   4/7   1/7 ]

and

        lim  {p(0) P^m} = t = (2/7, 4/7, 1/7).
        m->oo
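The numbers in Example 1 are easy to confirm; the sketch below is my own check.

```python
import numpy as np

P = np.array([[0, 0.5, 0.5],
              [0.5, 0.5, 0],
              [0, 1, 0]])
p0 = np.array([2/3, 0, 1/3])

print(np.linalg.matrix_power(P, 2))        # P(2); entry [2,1] is 1/2 and [0,2] is 0
print(p0 @ np.linalg.matrix_power(P, 4))   # [0.25 0.5833 0.1667] = (1/4, 7/12, 1/6)
print(p0 @ np.linalg.matrix_power(P, 50))  # ~ (2/7, 4/7, 1/7)
```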
Example 2. There are 2 white marbles in the box A and 3 red marbles in the box B. Each step of a random process consists of picking one marble from each box and then putting them back after interchange. If a0, a1 and a2 denote the states of the boxes such that the box A has no red, one red and two red marbles respectively, find (i) the transition matrix P for these states, (ii) the probability that there are 2 red marbles in A after 3 steps, and (iii) the probability that there are 2 red marbles in A in the long run.

Solution. (i) The state a0 consists of 2 white in A, and 3 red in B. The state a1 consists of 1 white, 1 red in A, and 1 white, 2 red in B. The state a2 consists of 2 red in A, and 2 white, 1 red in B.

If the system is in the state a0, then a white marble must be selected from A and a red marble from B, so that the system moves to the state a1. The system cannot move to a0 or a2. Hence the first row of the transition matrix P is (0, 1, 0).

Let the system be in the state a1. It can move to a0 if 1 red is selected from A and 1 white from B; the probability of this is (1/2)(1/3) = 1/6. Thus p10 = 1/6. The system can move to a2 if 1 white is selected from A and 1 red from B; the probability of this is (1/2)(2/3) = 1/3. Thus p12 = 1/3. Accordingly, the probability of remaining at a1 is p11 = 1 - p10 - p12 = 1 - 1/6 - 1/3 = 1/2. Thus the second row of P is (1/6, 1/2, 1/3). [Note that p11 can also be obtained directly: the system remains at a1 if either 1 white is selected from each box, with probability (1/2)(1/3) = 1/6, or 1 red is selected from each box, with probability (1/2)(2/3) = 1/3, so that p11 = 1/6 + 1/3 = 1/2.]

Let the system be in the state a2. If 1 red is selected from B, with probability 1/3, the system remains at a2. If 1 white is selected from B, with probability 2/3, the system moves to a1. The system cannot move from a2 to a0. Thus the third row of P is (0, 2/3, 1/3).

Thus, the transition matrix P is

                a0    a1    a2
         a0 [  0     1     0  ]
    P =  a1 [ 1/6   1/2   1/3 ]
         a2 [  0    2/3   1/3 ]

(ii) The system begins with the state a0, so that p(0) = (1, 0, 0). Thus

        p(1) = p(0) P = (0, 1, 0),
        p(2) = p(1) P = (1/6, 1/2, 1/3),
        p(3) = p(2) P = (1/12, 23/36, 5/18).

Hence, the probability that there are 2 red marbles in A after 3 steps is 5/18.

(iii) Here we require the unique fixed point probability vector t = (t1, t2, 1 - t1 - t2) of P. Then

        (t1, t2, 1 - t1 - t2) P = (t1, t2, 1 - t1 - t2),

i.e.

        (1/6) t2 = t1,
        t1 + (1/2) t2 + (2/3)(1 - t1 - t2) = t2,
        (1/3) t2 + (1/3)(1 - t1 - t2) = 1 - t1 - t2.

These give t1 = 0·1, t2 = 0·6, t3 = 0·3.

Thus, in the long run, the probability that there will be 2 red marbles in A is 0·3.
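A numerical check of Example 2 (my own sketch) follows; both the three-step distribution and the long-run behaviour fall out of formula (4) of Sec. 8.4.

```python
import numpy as np

P = np.array([[0, 1, 0],
              [1/6, 1/2, 1/3],
              [0, 2/3, 1/3]])

p0 = np.array([1.0, 0.0, 0.0])
print(p0 @ np.linalg.matrix_power(P, 3))     # [0.0833 0.6389 0.2778] = (1/12, 23/36, 5/18)
print(p0 @ np.linalg.matrix_power(P, 100))   # ~ [0.1 0.6 0.3], the long-run distribution
```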
Exercises VIII (C)

1. There are 2 white marbles in the box A and 4 red marbles in the box B. Each step of a random process consists of picking one marble from each box and then putting them back after interchange. If a0, a1 and a2 denote the states of a box with no red, one red and two red marbles respectively, find (i) the transition matrix P for these states, (ii) the probability that there are 2 red marbles in A after 3 steps, and (iii) the probability that there are 2 red marbles in A in the long run.

                     [ 0     1     0   ]
   [Ans. (i)  P  =   [ 1/8   1/2   3/8 ],   (ii) 3/8,   (iii) 2/5.]
                     [ 0     1/2   1/2 ]

2. Solve Exercise 1, assuming that there are 3 white marbles in A and 3 red marbles in B.

                     [ 0     1     0     0   ]
   [Ans. (i)  P  =   [ 1/9   4/9   4/9   0   ],   (ii) 32/81,   (iii) 9/20.]
                     [ 0     4/9   4/9   1/9 ]
                     [ 0     0     1     0   ]

3. For a Markov Chain, the transition matrix is

            [ 0     1/2   1/2 ]
        P = [ 1/2   0     1/2 ],
            [ 1/4   1/2   1/4 ]

and the initial probability distribution is p(0) = (1/2, 1/2, 0). Find p13(2), p22(2), p(2) and p(3), where the symbols have their usual meanings.

   [Ans. 3/8; 1/2; (1/4, 3/8, 3/8); (9/32, 5/16, 13/32).]

4. A man's smoking habits are as follows. If he smokes filter cigarettes one week, he switches to non-filter cigarettes the next week with probability 0·2. On the other hand, the probability that he smokes non-filter cigarettes two weeks in succession is 0·7. In the long run, how often does he smoke filter cigarettes?

   [Ans. 60% of the time.]
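Exercise 4 is a two-state chain whose steady state can be read off numerically. The sketch below is my own verification of the stated answer.

```python
import numpy as np

# States: 0 = filter, 1 = non-filter.
P = np.array([[0.8, 0.2],     # stays with filter with probability 0.8
              [0.3, 0.7]])    # returns to filter with probability 0.3

print(np.linalg.matrix_power(P, 50)[0])   # ~ [0.6 0.4]: filter 60% of the time
```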