Markov Chain Notes
(a) and (b) respectively. Note that only the upper control limits are needed here.

6. A factory desires to turn out cotton thread, the breaking strength of which is to have a mean and a standard deviation of 6.50 gm and 1.50 gm respectively. Assuming that this standard has been attained, find the 95% and 99.73% control limits for the mean of routine samples of 10 pieces of thread.

[Ans. 5.57 to 7.43, 5.08 to 7.92.]

"Man hath a weary pilgrimage,
As through the world he wends."
- Southey.

Introduction. There are many phenomena, both in the natural and the social sciences, which fluctuate in a random manner. The actual nature of their characteristics is best understood with the help of the probabilistic methods called random processes. There are many random processes, of which the Markov process occupies a prominent place. The ideas behind this process are much clarified if one has some knowledge of the Markov Chain, which may be considered as a very particular case of the Markov process.

The present discussion is concerned with the elements of the Markov Chain. The prerequisites for this study are the theories of probability, multi-dimensional vectors, determinants and matrix algebra. [See "A text book of Vectors" and "A text book of Matrix and Tensors" by the author.]
8.1. Some preliminary definitions.

We shall use the following definitions in the discussion of Markov Chains.

(i) An n-dimensional row vector

u = (u1, u2, ..., un)    ...(1)

is called a probability vector if its components are non-negative and their sum is unity, i.e.

u1 ≥ 0, u2 ≥ 0, ..., un ≥ 0,    ...(2)

u1 + u2 + ... + un = 1.    ...(3)
The condition (3) shows that the components u1, u2, ..., un are not all independent, but any one of them can be expressed in terms of the remaining (n - 1) components. It is therefore sometimes convenient to write (1) as

u = (u1, u2, ..., u_{n-1}; 1 - u1 - u2 - ... - u_{n-1}).    ...(4)

It is also evident from (2) and (3) that none of u1, u2, ..., un can be greater than unity or less than zero.

Example 1. Define the probability vector, and examine whether the vectors

a = (1/2, 0, 1/3, 1/6),  b = (1/5, -1/4, ...),  c = (...)

are probability vectors.

Solution. The definition is given by (1), (2) and (3).

The vector a is a probability vector, since its elements are non-negative and the sum of its elements is 1/2 + 0 + 1/3 + 1/6 = 1, i.e. equal to unity.

The vector b is not a probability vector, since it contains a negative element -1/4.

The vector c is not a probability vector, since the sum of its components is not equal to unity.
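The defining conditions (2) and (3) are easy to test mechanically. The short Python sketch below is an illustration added here, not part of the original text; the helper name is_probability_vector is hypothetical. The vector a is the one from Example 1, while b and c are illustrative stand-ins with the same defects, since the printed entries of b and c are not fully legible.

```python
import numpy as np

def is_probability_vector(u, tol=1e-12):
    """Conditions (2) and (3): non-negative components summing to unity."""
    u = np.asarray(u, dtype=float)
    return bool((u >= -tol).all() and abs(u.sum() - 1.0) <= tol)

a = [1/2, 0, 1/3, 1/6]       # vector a of Example 1: sum is exactly 1
b = [1/2, -1/4, 1/2, 1/4]    # illustrative: contains a negative component
c = [1/2, 1/3, 1/2]          # illustrative: components sum to 4/3, not 1

print(is_probability_vector(a))   # True
print(is_probability_vector(b))   # False, negative component
print(is_probability_vector(c))   # False, sum is not unity
```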
Any vector with non-negative components, not all zero, can be converted into a probability vector on multiplication by the reciprocal of the sum of its components.

(ii) A square matrix P = [p_ij] is called a stochastic matrix if each of its rows is a probability vector.

(iii) A stochastic matrix P is said to be regular if some power P^m of it (m a positive integer) contains no zero element.

(iv) A row vector t is called a fixed point of the square matrix A if t is unchanged on multiplication by A, i.e. if

tA = t.

Example 2. Define the fixed point of a square matrix. Find a fixed point of the matrix

A = (...).

Solution. The definition is given in (iv). Writing t = (a1, a2, a3) and equating the corresponding components of tA and t, we get three linear equations, of the form 3a1 + a2 + 4a3 = a1, ...; solving them gives a fixed point t = (...). Note that any non-zero scalar multiple of a fixed point of A is again a fixed point of A.

Example 3. Examine whether the matrices A and B below are regular stochastic matrices:

A = (...),  B = (...).

Solution. Each row of A is a probability vector, so that A is a stochastic matrix. Although A contains zero elements, its square A² does not contain any zero element. Hence A is a regular stochastic matrix.

The matrix B is not a stochastic matrix, since its second and third rows are not probability vectors.
8.2. Theorems on Probability Vector and Stochastic Matrix.

The following theorems are generally used in the discussion on Markov Chains.

Theorem 1. A square matrix A = [a_ij] has a fixed point if and only if the determinant of the matrix A - I is zero (where I is the unit matrix), i.e.

| A - I | = 0.    ...(1)

Proof. If x = (x1, x2, ..., xn) is a fixed point of A = [a_ij], then its definition gives xA = x, i.e.

                   | a11  a12  ...  a1n |
(x1, x2, ..., xn)  | a21  a22  ...  a2n |  =  (x1, x2, ..., xn).
                   | ...                |
                   | an1  an2  ...  ann |

It implies the equations

(a11 - 1) x1 + a21 x2 + ... + an1 xn = 0,
a12 x1 + (a22 - 1) x2 + ... + an2 xn = 0,
.......................................
a1n x1 + a2n x2 + ... + (ann - 1) xn = 0.

This system of homogeneous linear equations has a non-zero solution (x1, x2, ..., xn) if and only if the determinant of its coefficients vanishes; and that determinant is | A - I |, a determinant being unaltered by transposition. This is the expanded form of (1). Hence we prove the theorem.

Theorem 2. Every stochastic matrix has a fixed point.

Proof. If P = [p_ij] is a stochastic matrix, so that each of its rows is a probability vector, we get the determinant

            | p11 - 1   p12       ...  p1n     |
| P - I | = | p21       p22 - 1   ...  p2n     |
            | ...                              |
            | pn1       pn2       ...  pnn - 1 |

            | 0   p12       ...  p1n     |
          = | 0   p22 - 1   ...  p2n     |
            | ...                        |
            | 0   pn2       ...  pnn - 1 |

where the last step follows by adding the elements of the second, third, ..., n-th columns to the corresponding elements of the first column: each element of the first column becomes (p_i1 + p_i2 + ... + p_in) - 1 = 1 - 1 = 0, since every row of P is a probability vector. A determinant with a column of zeroes vanishes. Thus

| P - I | = 0.

Hence, by Theorem 1, the matrix P has a fixed point.
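As a numerical companion to Theorems 1 and 2 (an illustrative sketch, not part of the original text), the Python lines below check that det(P - I) vanishes for a stochastic matrix and then compute a fixed point as a left eigenvector of P for the eigenvalue 1; the matrix used is the one of Example 4 below.

```python
import numpy as np

P = np.array([[0,   1,   0  ],
              [1/6, 1/2, 1/3],
              [0,   2/3, 1/3]])   # a stochastic matrix (Example 4 below)

# Theorems 1 and 2: a fixed point exists iff det(P - I) = 0.
print(np.linalg.det(P - np.eye(3)))       # ~0, up to rounding

# A fixed point t satisfies tP = t, i.e. t is a left eigenvector of P for
# the eigenvalue 1; numpy returns right eigenvectors, so pass P transposed.
w, v = np.linalg.eig(P.T)
t = np.real(v[:, np.argmin(np.abs(w - 1.0))])
t /= t.sum()                              # scale into a probability vector
print(t)                                  # [0.1 0.6 0.3]
```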
Theorem 3. If u = (u1, u2, ..., un) is a probability vector and P = [p_ij] is a stochastic matrix of order n, then the product uP is a probability vector.

Proof. We have

                        | p11  p12  ...  p1n |
uP = (u1, u2, ..., un)  | p21  p22  ...  p2n |
                        | ...                |
                        | pn1  pn2  ...  pnn |

   = (u1 p11 + u2 p21 + ... + un pn1, ..., u1 p1n + u2 p2n + ... + un pnn).

Each component of uP is non-negative, being a sum of products of non-negative numbers. Also, by addition we get the sum of the components as

u1 (p11 + p12 + ... + p1n) + u2 (p21 + p22 + ... + p2n) + ... + un (pn1 + pn2 + ... + pnn)
   = u1 (1) + u2 (1) + ... + un (1)
   = u1 + u2 + ... + un = 1,

since each row of P and u are probability vectors. Hence uP is a probability vector.

Theorem 4. If A and B are stochastic matrices of the same order, then the product AB is a stochastic matrix, and all powers A^m (where m is a positive integer) are also stochastic matrices.

Proof. Each row of AB is the product of the corresponding row of A, itself a probability vector, with the stochastic matrix B, and is therefore a probability vector by Theorem 3. Hence AB is a stochastic matrix; and, taking B = A repeatedly, every power A^m is also a stochastic matrix.

Theorem 5. Every regular stochastic matrix P has a unique fixed point probability vector t = (t1, t2, ..., tn), i.e. a unique probability vector t with

tP = t.    ...(2)

Proof. The existence of a fixed point follows from Theorem 2. Further, if t is a fixed point of P, then

t P^m = (tP) P^{m-1} = t P^{m-1} = ... = tP = t.    ...(3)

It shows that t is also a fixed point of P^m. Since P is regular, there is a positive integer m such that the matrix

Q = P^m

has no zero element, by the definition (iii); and Theorem 4 shows that Q is a stochastic matrix. Writing Q = [q_ij], we get from (3)

(t1, t2, ..., tn) Q = (t1, t2, ..., tn).

From this relation, using the fact that all the elements q_ij are positive, it can be shown that the fixed point probability vector t is unique.
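Theorems 3 and 4 are easy to confirm numerically. In the sketch below (an illustration, not part of the original text) u is an arbitrary probability vector and A is the stochastic matrix of Example 4 below; unit row sums are exactly the stochastic property.

```python
import numpy as np

u = np.array([1/2, 1/4, 1/4])        # an arbitrary probability vector
A = np.array([[0,   1,   0  ],
              [1/6, 1/2, 1/3],
              [0,   2/3, 1/3]])      # a stochastic matrix

# Theorem 3: uA is again a probability vector.
print(u @ A, (u @ A).sum())          # non-negative components, sum 1.0

# Theorem 4: products and powers of stochastic matrices stay stochastic.
print((A @ A).sum(axis=1))                        # [1. 1. 1.]
print(np.linalg.matrix_power(A, 5).sum(axis=1))   # [1. 1. 1.]
```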
Theorem 6. If P is a regular stochastic matrix with the unique fixed point probability vector t = (t1, t2, ..., tn), then

lim_{m→∞} P^m = T,    ...(4)

where

    | t1  t2  ...  tn |
T = | t1  t2  ...  tn |
    | ...             |
    | t1  t2  ...  tn |

i.e. the powers P^m approach the matrix T, each of whose rows is the fixed point probability vector t.

Theorem 7. If P is a regular stochastic matrix and p is any probability vector, then

lim_{m→∞} (p P^m) = t,    ...(5)

where t is the fixed point probability vector of P, and m is a positive integer. [Thus the limit is independent of p.]

Proof. Since p is independent of m, we get

lim_{m→∞} (p P^m) = p lim_{m→∞} P^m = p T,

by Theorem 6. Now every row of T is the vector t, so that

p T = p1 t + p2 t + ... + pn t = (p1 + p2 + ... + pn) t = t,

since the components of the probability vector p add up to unity. This proves (5).
Example 4. If the matrix P is defined by

    | 0    1    0   |
P = | 1/6  1/2  1/3 |
    | 0    2/3  1/3 |

show that P is a regular stochastic matrix, and find (i) the fixed point probability vector t of P, and (ii) the limit of P^m as m → ∞.

Solution. Since the rows of P are probability vectors, P is a stochastic matrix. Moreover, we find

     | 1/6   1/2    1/3  |
P² = | 1/12  23/36  5/18 |
     | 1/9   5/9    1/3  |

which does not contain any zero element. Hence P is a regular stochastic matrix.

(i) Let t = (t1, t2, 1 - t1 - t2) be the fixed point probability vector of P, so that tP = t. This implies

(1/6) t2 = t1,
t1 + (1/2) t2 + (2/3)(1 - t1 - t2) = t2,
(1/3) t2 + (1/3)(1 - t1 - t2) = 1 - t1 - t2,

so that

t1 = 1/10,  t2 = 3/5,  1 - t1 - t2 = 3/10,

i.e. t = (1/10, 3/5, 3/10).

(ii) Since P is regular, Theorem 6 gives

                 | 1/10  3/5  3/10 |
lim_{m→∞} P^m =  | 1/10  3/5  3/10 |
                 | 1/10  3/5  3/10 |
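A quick numerical check of this example (an illustrative sketch, not part of the original text): P² has no zero entry, and the rows of P^m approach the fixed point probability vector (1/10, 3/5, 3/10), exactly as Theorem 6 predicts.

```python
import numpy as np

P = np.array([[0,   1,   0  ],
              [1/6, 1/2, 1/3],
              [0,   2/3, 1/3]])

print(np.linalg.matrix_power(P, 2))    # no zero element: P is regular

for m in (5, 10, 50):                  # Theorem 6: rows tend to t
    print(m, np.linalg.matrix_power(P, m)[0])   # -> [0.1 0.6 0.3]
```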
Exercises VIII(A)

1. Test which of the following vectors are probability vectors:

a = (...),  b = (...),  c = (...),  d = (...),  e = (...),  f = (...).

[Ans. a, b, c, d are probability vectors; the others are not.]

2. Test whether the following matrices are stochastic matrices or not:

A = (...),  B = (...).

[Ans. A is a stochastic matrix but B is not.]

...

7. Find a scalar multiple of each of the following vectors which makes the vector a probability vector:

(i) (...),  (ii) (...),  (iii) (...).

[Ans. (i) ..., (ii) ..., (iii) none.]

8. Test which of the following matrices have fixed points, and find the fixed point probability vectors where these exist:

A = (...),  B = (...),  C = (...),  D = (...),  E = (...),  F = (...),  G = (...).

[Ans. For the test, use Theorem 1 of Sec. 8.2. The matrices A, B, C, D, E have respectively the fixed point probability vectors (...), (...), (...), (...), (...); the matrices F and G have no fixed points.]

9. If the matrix A is defined by

    | 0    1    0   |
A = | 1/2  1/4  1/4 |
    | 0    1    0   |

find (i) the fixed point probability vector t of A, and (ii) the limit of A^m as m → ∞.
8.3. Some definitions in Markov Chain.

Let us suppose that a system S can take randomly a state a_i at a step T_k, where a_i is one of the states (a1, a2, ..., an), so that i = 1, 2, 3, ..., n.

At the next step T_{k+1}, let the random state of the system S be a_j, where j = 1, 2, ..., n.
We assume that the latter state a_j depends at most upon its immediately preceding state a_i, and not upon any other state previous to a_i.

Let

p_ij = the probability that the state a_j occurs immediately after the state a_i    ...(1)

(i, j = 1, 2, ..., n).

The random process which has the above characteristics is called a Markov Chain.
Since n is in general finite, the process is called the finite Markov Chain.

It is convenient to represent the probabilities p_ij in the following matrix form:

         a1   a2   ...  an
    a1 | p11  p12  ...  p1n |
P = a2 | p21  p22  ...  p2n |    ...(2)
    .. | ...                |
    an | pn1  pn2  ...  pnn |

The matrix P is called the transition matrix of the Markov Chain. Its first row

(p11, p12, ..., p1n)    ...(3)

consists of the probabilities of transition from the state a1 to the states a1, a2, ..., an respectively. Now, since the states a1, a2, ..., an are assumed to be mutually exclusive and exhaustive, and also since the change of a1 to one of (a1, a2, ..., an) is a certainty, we get

p11 + p12 + ... + p1n = 1,    ...(4)

by the addition law of probability. Also, p11, p12, ..., p1n are non-negative, since a probability cannot be negative. Hence the elements of (3) represent a probability vector.

Similarly, it can be shown that all the other rows of (2) are also probability vectors. Hence, the transition matrix P in (2) is a stochastic matrix.

The state a_i of a Markov Chain is said to be an absorbing state if the system remains in that state once it enters there. Thus, the state a_i is an absorbing state if

p_ii = 1,  p_ij = 0  (j ≠ i),    ...(5)

i.e. the i-th row of the transition matrix P contains 1 in the main diagonal and zeroes everywhere else.

The transitions may also be exhibited in a transition diagram, in which each state is represented by a point and an arrow drawn from a_i to a_j carries the probability p_ij (Fig. 8.1). If any of the probabilities p_ij has the zero value, it is not necessary to display it in the transition diagram.

[Fig. 8.1: a transition diagram, the states shown as points and the arrows labelled with the corresponding probabilities p_ij.]

Example 1. (Throwing balls amongst players.)

Three players a1, a2, a3 are throwing a ball to one another: a1 always throws the ball to a2, and a2 always throws the ball to a3; but a3 throws the ball either to a1 or to a2 with equal probabilities. Explain why the possession of the ball is a Markov Chain, and find the transition matrix.
Solution. At any throw, the ball is in the possession of one of the three players, and the next possessor depends only upon the present possessor and not upon the earlier history of the game. Hence the possession of the ball is a Markov Chain, with the transition matrix

         a1   a2   a3
    a1 | 0    1    0  |
P = a2 | 0    0    1  |
    a3 | 1/2  1/2  0  |

The first row of the matrix is such that a1 always throws the ball to a2. The second row implies that a2 always throws the ball to a3. The third row shows that a3 throws the ball to a1 and a2 with the equal probabilities 1/2, 1/2.
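The chain just constructed is easy to simulate. In the sketch below (an illustration, not part of the original text) each new holder of the ball is drawn from the row of the current holder; that the draw depends only on the current state is precisely the Markov property.

```python
import numpy as np

rng = np.random.default_rng(0)

# Rows and columns in the order a1, a2, a3.
P = np.array([[0.0, 1.0, 0.0],    # a1 always throws to a2
              [0.0, 0.0, 1.0],    # a2 always throws to a3
              [0.5, 0.5, 0.0]])   # a3 throws to a1 or a2 equally often

state = 0                         # the game starts with a1 holding the ball
holders = []
for _ in range(10):
    state = rng.choice(3, p=P[state])   # next holder: drawn from current row
    holders.append("a%d" % (state + 1))
print(" ".join(holders))
```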
Example 2. (Random walk with reflecting barriers.)

A particle under impulses can take any of the positions a0, a1, a2, a3, a4 marked on a straight line in this order. The end positions a0 and a4 are reflecting, i.e. the particle when at a0 or a4 can move only to a1 or a3 respectively. In the other positions, it can move to the next position at the right with probability p, and hence can move to the next position at the left with probability q = 1 - p. Explain that the movement of the particle is a Markov Chain, and construct the transition matrix.

Solution. At any particular step, the particle can occupy one of the positions a0, a1, a2, a3, a4. At the next step, the position of the particle is influenced only by the present step and not by any step previous to the present step. Hence the process of movement of the particle is a Markov Chain.

The transition matrix is as follows:

         a0  a1  a2  a3  a4
    a0 | 0   1   0   0   0 |
    a1 | q   0   p   0   0 |
P = a2 | 0   q   0   p   0 |
    a3 | 0   0   q   0   p |
    a4 | 0   0   0   1   0 |

The first row shows that a0 must change to a1, and the last row shows that a4 must change to a3.

[If, instead, the end positions a0 and a4 are absorbing, i.e. the particle remains at a0 or a4 for ever once it reaches there, the transition matrix becomes

         a0  a1  a2  a3  a4
    a0 | 1   0   0   0   0 |
    a1 | q   0   p   0   0 |
P = a2 | 0   q   0   p   0 |
    a3 | 0   0   q   0   p |
    a4 | 0   0   0   0   1 |

The above type of walk is called the random walk with absorbing barriers.]

8.4. Probability distribution in Markov Chain.

At a certain step, let p_i denote the probability that the system is in the state a_i (i = 1, 2, ..., n). Then the vector

p = (p1, p2, ..., pn)    ...(1)

is called the probability distribution vector at that step.

In accordance with this definition, the initial probability distribution vector is defined by the distribution when the process starts, and is denoted by

p^(0) = (p1^(0), p2^(0), ..., pn^(0)).    ...(2)

The m-th step probability distribution vector is defined as the distribution after the m-th step, and is denoted by

p^(m) = (p1^(m), p2^(m), ..., pn^(m)).    ...(3)

Theorem. If P denotes the transition matrix of the Markov Chain, and p^(m) denotes the probability distribution vector after the first m steps, then

p^(m) = p^(0) P^m.    ...(4)
Proof. Since the probability of the system's being at the state a_i initially is p_i^(0), and that of changing from a_i to a_1 is p_i1, we find that the probability of the system's being at a_1 and then changing from a_1 to a_1 is

p_1^(0) p_11,    ...(iii)

by the multiplication law of probability.

Similarly, the probability that the system was at a_2 and then changed to a_1 is

p_2^(0) p_21,    ...(iv)

and so on; finally, the probability that the system was at a_n and then changed to a_1 is

p_n^(0) p_n1.    ...(v)

Hence, the probability p_1^(1) that the system was at any one of the states a1, a2, ..., an and then changed to a_1 in one step is obtained from (iii) to (v) as

p_1^(1) = p_1^(0) p_11 + p_2^(0) p_21 + ... + p_n^(0) p_n1.    ...(vi)

Similarly,

p_2^(1) = p_1^(0) p_12 + p_2^(0) p_22 + ... + p_n^(0) p_n2,    ...(vii)

and, in general,

p_j^(1) = p_1^(0) p_1j + p_2^(0) p_2j + ... + p_n^(0) p_nj  (j = 1, 2, ..., n).    ...(5)

The relations (vi), (vii) and (5) can be written in the vector form as

p^(1) = (p_1^(1), p_2^(1), ..., p_n^(1))

                                          | p11  p12  ...  p1n |
      = (p_1^(0), p_2^(0), ..., p_n^(0))  | p21  p22  ...  p2n |
                                          | ...                |
                                          | pn1  pn2  ...  pnn |

or

p^(1) = p^(0) P.    ...(6)

Now, the probability distribution vector p^(2) after two steps can be obtained by the formula (6) if we treat p^(1) as the initial state. Thus, by (6),

p^(2) = p^(1) P,    ...(7)

so that

p^(2) = p^(0) P².    ...(8)

Generalizing (8), we get the probability distribution vector after m steps as

p^(m) = p^(0) P^m.

This proves (4).
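The theorem is easy to verify numerically. In this sketch (an illustration, not part of the original text, using the ball-throwing chain of Example 1, Sec. 8.3) the distribution after m steps is computed both as p^(0) P^m, formula (4), and by m applications of the one-step relation; the results agree.

```python
import numpy as np

# The ball-throwing chain of Example 1, Sec. 8.3.
P = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.5, 0.5, 0.0]])

p0 = np.array([1.0, 0.0, 0.0])    # the process starts from a1
m = 4

print(p0 @ np.linalg.matrix_power(P, m))   # formula (4): p^(0) P^m

p = p0
for _ in range(m):                 # one step at a time: p^(k) = p^(k-1) P
    p = p @ P
print(p)                           # the same result: [0. 0.5 0.5]
```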
Note 1. Alternative formulas for the probability distribution.

By (5), we get

p_n^(m) = p_1^(m-1) p_1n + p_2^(m-1) p_2n + ... + p_n^(m-1) p_nn,    ...(9)

and similarly for the other components. It is an alternative form of (4). Similarly, from (7) we get

p^(m) = p^(m-1) P.    ...(10)

Note 2. An interpretation of P^m.

Let us choose the initial distribution as p^(0) = (1, 0, 0, ...), which means that the system starts from a_1. From (4) we then see that the vector p^(m) becomes identical with the first row of the matrix P^m. Thus, the elements of the first row of P^m give the probabilities that the system is in a given state after the m-th step, starting from the state a_1. Similarly, the elements of the second row of P^m give the probabilities that the system is in a given state after the m-th step, starting from the state a_2, and so on.

Example 1. A sales organiser goes to one of the three markets A, B, C every day. He never goes to the same market on consecutive days. If he goes to A, then the next day he goes to B; if he goes to either B or C, then the next day he is twice as likely to go to A as to the other market. If he goes to B on a Sunday, find the probability of his going to C on the next Friday.

Solution. Taking the states in the order A, B, C, the transition matrix is

        A    B    C
    A | 0    1    0   |
P = B | 2/3  0    1/3 |
    C | 2/3  1/3  0   |

From Sunday to Friday, there are five steps involved. Hence, the probability distribution vector on Friday is

p^(5) = p^(0) P^5 = (0, 1, 0) P^5 = (110/243, 28/81, 49/243).

Thus, the probability of his going to C on Friday is 49/243. [We also note that the probabilities of his going to A and to B on Friday are respectively 110/243 and 28/81.]

Example 2. A particle under impulses can take any of the positions a0, a1, a2, a3, a4, a5 marked on a straight line in this order. The end points a0 and a5 form reflecting barriers. The particle can move to the right with the probability p and to the left with the probability q, where q = 1 - p. If it starts from a2, find the probability that it reaches a2 again after exactly four steps.

Solution. The transition matrix P is as follows:

         a0  a1  a2  a3  a4  a5
    a0 | 0   1   0   0   0   0 |
    a1 | q   0   p   0   0   0 |
P = a2 | 0   q   0   p   0   0 |
    a3 | 0   0   q   0   p   0 |
    a4 | 0   0   0   q   0   p |
    a5 | 0   0   0   0   1   0 |

Also, p^(0) = (0, 0, 1, 0, 0, 0). Hence

p^(4) = p^(0) P^4,

and the required probability is the component of p^(4) belonging to the state a2.
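Both examples can be checked by direct computation. The sketch below (an illustration, not part of the original text; the helper walk_matrix is hypothetical) reproduces p^(5) exactly with Fractions, and evaluates p^(4) for the reflecting walk at the arbitrary illustrative value p = 0.4. Enumerating the six four-step paths that return to a2 gives the closed form p q² (1 + 5p), which the matrix computation matches.

```python
import numpy as np
from fractions import Fraction as F

# Example 1: markets A, B, C, with exact arithmetic.
P = [[F(0), F(1), F(0)],
     [F(2, 3), F(0), F(1, 3)],
     [F(2, 3), F(1, 3), F(0)]]

p = [F(0), F(1), F(0)]                    # Sunday: he goes to B
for _ in range(5):                        # Sunday -> Friday is five steps
    p = [sum(p[i] * P[i][j] for i in range(3)) for j in range(3)]
print(p)                                  # 110/243, 28/81, 49/243

# Example 2: reflecting walk on a0..a5, started from a2.
def walk_matrix(prob):
    q = 1.0 - prob
    M = np.zeros((6, 6))
    M[0, 1] = M[5, 4] = 1.0               # reflecting ends a0 and a5
    for i in range(1, 5):
        M[i, i + 1], M[i, i - 1] = prob, q
    return M

prob = 0.4
q = 1.0 - prob
p4 = np.linalg.matrix_power(walk_matrix(prob), 4)[2, 2]
print(p4, prob * q**2 * (1 + 5 * prob))   # both 0.432
```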
Exercises VIII(B)

1. ...

2. A particle under impulses can take any of the positions a1, a2, a3, a4 marked on the circumference of a circle, in this order, in the counter-clockwise sense. It can move one step counter-clockwise with the probability p and one step clockwise with the probability q, where q = 1 - p. Explain that the movement of the particle forms a Markov Chain, and find the transition matrix.

[Ans.
         a1  a2  a3  a4
    a1 | 0   p   0   q |
P = a2 | q   0   p   0 |
    a3 | 0   q   0   p |
    a4 | p   0   q   0 |]

3. Draw the transition diagrams for the following transition matrices:

A = (...),  B = (...),  C = (...).

8.5. Multi-step transition in Markov Chain.

We introduce the notation p_ij^(m) to denote the probability that a system which is in the state a_i will be in the state a_j after exactly m steps. This definition implies that if m = 1, we have

p_ij^(1) = p_ij.    ...(1)

It is convenient to write the p_ij^(m) in the matrix form

         | p_11^(m)  p_12^(m)  ...  p_1n^(m) |
P^(m) =  | p_21^(m)  p_22^(m)  ...  p_2n^(m) |    ...(2)
         | ...                               |
         | p_n1^(m)  p_n2^(m)  ...  p_nn^(m) |

We call P^(m) the m-th step transition matrix. In accordance with the notation P of Sec. 8.3 and the definition (1) above, we have

P^(1) = P.    ...(3)
Theorem 1. If P denotes the transition matrix of a Markov Chain, then the m-th step transition matrix is the m-th power of P, i.e.

P^(m) = P^m.    ...(4)

Proof. The probability p_1j^(2) that the system changes from a_1 to a_j in two steps, through any one of the intermediate states a1, a2, ..., an, is given by

p_1j^(2) = Σ_{k=1}^{n} p_1k^(1) p_kj^(1),    ...(i)

by the addition law of probability: the passage through a particular intermediate state a_k has the probability p_1k^(1) p_kj^(1) by the multiplication law, and the n intermediate states are mutually exclusive.

Similarly, the probability p_2j^(2) that the system changes from a_2 to any one of the states a1, a2, ..., an and then to a_j is given by

p_2j^(2) = Σ_{k=1}^{n} p_2k^(1) p_kj^(1).    ...(ii)

In general, the probability p_ij^(2) that the system changes from the state a_i to any one of the states a1, a2, ..., an and then to a_j is given by

p_ij^(2) = Σ_{k=1}^{n} p_ik^(1) p_kj^(1).    ...(5)

Now, let P be the one-step transition matrix, P = [p_ij^(1)]. By (i), (ii) and (5), the (i, j)-th element of the product

     | p_11^(1)  ...  p_1n^(1) |  | p_11^(1)  ...  p_1n^(1) |
P² = | ...                     |  | ...                     |
     | p_n1^(1)  ...  p_nn^(1) |  | p_n1^(1)  ...  p_nn^(1) |

is exactly the sum on the right-hand side of (5), so that

     | p_11^(2)  p_12^(2)  ...  p_1n^(2) |
P² = | p_21^(2)  p_22^(2)  ...  p_2n^(2) |
     | ...                               |
     | p_n1^(2)  p_n2^(2)  ...  p_nn^(2) |

This together with (2) gives

P² = P^(2).    ...(6)

Now, taking the state occupied at the second step as the initial state, we can similarly calculate the probability p_ij^(3) that the system changes from a_i to a_j in three steps, namely from a_i to some state a_k in two steps and from a_k to a_j in one step. It is given by

p_ij^(3) = Σ_{k=1}^{n} p_ik^(2) p_kj^(1),    ...(iii)

and, in general,

p_ij^(m) = Σ_{k=1}^{n} p_ik^(m-1) p_kj^(1).    ...(7)
Also, by (5) and (iii), we get

P^(3) = P^(2) P = P² P = P³,

and, proceeding in the same way with the help of (7),

P^(m) = P^m

for every positive integer m. This proves (4).

By Theorem 7 of Sec. 8.2 and the theorem of Sec. 8.4, we get, for a regular transition matrix P,

lim_{m→∞} p^(m) = lim_{m→∞} {p^(0) P^m} = t,    ...(8)

where t is the fixed point probability vector of P. Thus, even when the initial probability distributions chosen for the Markov Chain are different, the limiting distribution t, and with it the limiting transition matrix T, is the same. The limiting state is called the equilibrium state or the steady state. After a large number of steps m, the transition matrix P^(m) changes so slowly that it may be considered to have reached a definite value.

Example. If p_ij^(m) denotes the m-th step transition probabilities of the Markov Chain whose transition matrix is

    | 0    1    0   |
P = | 1/2  1/4  1/4 |
    | 0    1    0   |

find p_32^(2) and lim_{m→∞} P^(m).
Solution. The third row of P is (0, 1, 0), so that the system passes from a3 to a2 with certainty in one step. Hence the third row of P^(2) = P² is

(p_31^(2), p_32^(2), p_33^(2)) = the second row of P = (1/2, 1/4, 1/4),

so that p_32^(2) = 1/4.

Again, it may be shown that the fixed point probability vector t of P is (2/7, 4/7, 1/7). Hence

                                   | 2/7  4/7  1/7 |
lim_{m→∞} P^(m) = lim_{m→∞} P^m =  | 2/7  4/7  1/7 |
                                   | 2/7  4/7  1/7 |

Example 1. There are 2 white marbles in the box A and 3 red marbles in the box B. Each step of a random process consists of picking a marble from each box and then putting them back after interchange. If a0, a1 and a2 denote the states of the box A containing no red, one red and two red marbles respectively, find (i) the transition matrix, (ii) the probability that there are 2 red marbles in A after 3 steps, and (iii) the probability that there are 2 red marbles in A in the long run.
Solution. (i) The state a0 consists of 2 white marbles in A, and 3 red in B. The state a1 consists of 1 white and 1 red in A, and 1 white and 2 red in B. The state a2 consists of 2 red in A, and 2 white and 1 red in B.

If the system is in the state a0, then a white marble must be selected from A and a red marble from B, so that the system moves to the state a1; the system cannot remain at a0 or move to a2. Hence the first row of the transition matrix P is (0, 1, 0).

If the system is in the state a1, it moves to a0 when a red marble is selected from A and a white one from B, the probability of which is (1/2)(1/3) = 1/6; thus p10 = 1/6. It moves to a2 when a white marble is selected from A and a red one from B, the probability of which is (1/2)(2/3) = 1/3; thus p12 = 1/3. Accordingly, the probability of remaining at a1 is p11 = 1 - p10 - p12 = 1 - 1/6 - 1/3 = 1/2. [Note that p11 can also be obtained directly: the system remains at a1 if either a white marble is selected from each box, with probability (1/2)(1/3) = 1/6, or a red marble is selected from each box, with probability (1/2)(2/3) = 1/3, so that p11 = 1/6 + 1/3 = 1/2.] The second row of P is therefore (1/6, 1/2, 1/3).

If the system is in the state a2, a red marble must be selected from A, and the system then moves to a1 if a white marble is selected from B, with probability 2/3, while it remains at a2 if a red marble is selected from B, with probability 1/3; the system cannot move from a2 to a0. Thus, the transition matrix P is

         a0   a1   a2
    a0 | 0    1    0   |
P = a1 | 1/6  1/2  1/3 |
    a2 | 0    2/3  1/3 |

which is the matrix of Example 4, Sec. 8.2.

(ii) The initial distribution is p^(0) = (1, 0, 0). Hence

p^(1) = p^(0) P = (0, 1, 0),
p^(2) = p^(1) P = (1/6, 1/2, 1/3),
p^(3) = p^(2) P = (1/12, 23/36, 5/18),

so that the probability of there being 2 red marbles in A after 3 steps is 5/18.

(iii) Let t = (t1, t2, 1 - t1 - t2) be the fixed point probability vector of P. Then

(t1, t2, 1 - t1 - t2) P = (t1, t2, 1 - t1 - t2),

whence

(1/6) t2 = t1,
t1 + (1/2) t2 + (2/3)(1 - t1 - t2) = t2,
(1/3) t2 + (1/3)(1 - t1 - t2) = 1 - t1 - t2,

so that, as in Example 4 of Sec. 8.2, t = (1/10, 3/5, 3/10). The probability that in the long run there are 2 red marbles in A is therefore 3/10, i.e. the probability of its so happening is 0.3.
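Both of the last two results can be verified mechanically. The sketch below (an illustration, not part of the original text) first checks p_32^(2) and the steady state of the preceding example by matrix powers, and then estimates the long-run distribution of Example 1 by simulating the marble interchanges directly; the frequencies approach the fixed point (1/10, 3/5, 3/10).

```python
import numpy as np

# The example with P = [0 1 0; 1/2 1/4 1/4; 0 1 0]:
P = np.array([[0,   1,   0  ],
              [1/2, 1/4, 1/4],
              [0,   1,   0  ]])
print(np.linalg.matrix_power(P, 2)[2])    # row 3 of P^2: [0.5 0.25 0.25]
print(np.linalg.matrix_power(P, 40)[0])   # steady state ~(2/7, 4/7, 1/7)

# Example 1 checked by simulating the marble process itself:
rng = np.random.default_rng(1)
A, B = ["W", "W"], ["R", "R", "R"]            # 2 white in A, 3 red in B
counts = np.zeros(3)
steps = 200_000
for _ in range(steps):
    i, j = rng.integers(2), rng.integers(3)   # one marble from each box
    A[i], B[j] = B[j], A[i]                   # interchange them
    counts[A.count("R")] += 1                 # state: number of red in A
print(counts / steps)                     # ~[0.1 0.6 0.3] = (1/10, 3/5, 3/10)
```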
Exercises VIII(C)

1. There are 2 white marbles in the box A and 4 red marbles in the box B. Each step of a random process consists of picking a marble from each box and then putting them back after interchange. If a0, a1 and a2 denote the states of the box A containing no red, one red and two red marbles respectively, find (i) the transition matrix, (ii) the probability that there are 2
there are 2 red marbles in A in the long run. • \.J1 • .{;. ;rf/.-:~~~f. ~ /J t? t I ,- 0•1 ;,- -l u 1
~ 1-l ,
' ,:· ,1 p -'~<;:i.~,i·
0 J O] 11
L, jl ) ,';t~ ,
[Ans. (i) P= [ 1/8 1/2 3/8 , (ii) 3/8, (iii) 2/5.] ~I! i".1-J_.'_·::·V:'-'
o 112 112 . ,·
1 -l (.) 1 'b t 1 -+ iJ 1 - o , 3 t:,
1 ~ r: I
2 . Solve example J, assuming that there arc 3 wl'ute marbles in A ·]: .
and 3 red marbles in B .
-, (1- 0 1 ', -t-0 •3 ) ·b - 0 ' .3.
. O 1 0 OJ
1/9 4/9 419
0 .. . ..
(1) P=
[ 00 4/9 4/9 1/9 , 32/81, -:..,J "'1 ----·.1-
0 - 1
0 l 0
[11) (lll) 9/20.] I :
.... --- - o, {.
o ·~ r;
3. For a Markov Chain, the transition matrix is
V 0
½ o
P= l O l) ,
½] 11/-r ,~- ~ (.,.1, -:- I - t I - ! - 0 ,,
[¼ i ¼ ,/1 ~ o ·½
and the initial probability distribution is p(O ) =(½, ½, 0). Fini' :
•I r, . -lc--1..AA -''MM ,
P , P(2) .p➔(2) and p (3),
(2)
• I
~- ( ✓ ,.
, Nh ,.· , . usu~i~
~- /·,i
13 23 1
.. rc the symbols have their
11ij· . '•ri , \ I
. ll/.-V
~ ~1~(:'· 1. t· :: .
' ~
( 0 1 G1 0 · i.f)
meaning . I \. ) . ·, ')1'' ~ ~~
[ Ans.
3 1 ( 7
8' 2' \16 '
2 7) 7
16' 16 'fr_
4. A man's smoking habits are as follows. If he smokes filter cigarettes one week, he switches to non-filter cigarettes the next week ...