
MARKOV CHAIN

DEFINITION:

We define the Markov Chain as follows.

If P{X_n = a_n | X_{n-1} = a_{n-1}, X_{n-2} = a_{n-2}, ..., X_0 = a_0} = P{X_n = a_n | X_{n-1} = a_{n-1}} for all n, then the process {X_n}, n = 0, 1, 2, ..., is called a Markov Chain.

1. a_1, a_2, ..., a_k are called the states of the Markov Chain.
2. The conditional probability P{X_n = a_j | X_{n-1} = a_i} = p_ij(n-1, n) is called the one-step transition probability from state a_i to state a_j at the nth step.

3. If the one-step transition probability does not depend on the step, that is, if p_ij(n-1, n) = p_ij(m-1, m), then the Markov Chain is called a homogeneous Markov Chain, or the chain is said to have stationary transition probabilities.

4. When the Markov Chain is homogeneous, the one-step transition probability is denoted by p_ij. The matrix P = (p_ij) is called the (one-step) transition probability matrix (tpm).
UNIT 3 ssssRssasnnssstasasanssstAS#sseRASes esssene.
RANDOMPROCESSES

The tpm of a arkov chain is a Stochastic 3.47


5. matrix.
(0 P 0

() P (Sum of elements of any row is 1)

6. The conditional probability that the process is in state a_j at step n, given that it was in state a_i at step 0, i.e., P{X_n = a_j | X_0 = a_i} = p_ij^(n), is called the n-step transition probability.
7. Let p_j = the probability that the process is in state a_j at any step (j = 1, 2, ..., k). The row vector p = (p_1, p_2, ..., p_k) is called the probability distribution of the process at that time.
8. p^(0) = (p_1^(0), p_2^(0), ..., p_k^(0)) is called the initial probability distribution, where p_1^(0) = P(X_0 = 1), p_2^(0) = P(X_0 = 2), ... are the initial probabilities of the states 1, 2, ....
9. If P is the tpm of a regular chain and π (a row vector) is the steady-state distribution, then πP = π.
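The row-sum property of item 5 and the steady-state equation πP = π of item 9 can be checked numerically. Below is a minimal sketch using NumPy; the 3-state tpm is invented purely for illustration, and the steady state is read off as the eigenvector of P^T for eigenvalue 1.

```python
import numpy as np

# Hypothetical 3-state tpm, invented for illustration.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

# Item 5: a stochastic matrix has non-negative entries and unit row sums.
assert (P >= 0).all()
assert np.allclose(P.sum(axis=1), 1.0)

# Item 9: pi P = pi means pi^T is an eigenvector of P^T with eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
v = np.real(vecs[:, np.argmax(np.real(vals))])
pi = v / v.sum()                    # normalise so the entries sum to 1

assert np.allclose(pi @ P, pi)      # pi is the steady-state distribution
```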
CHAPMAN-KOLMOGOROV THEOREM

If P is the transition probability matrix of a homogeneous Markov Chain, then the n-step tpm P^(n) = P^n, the nth power of the tpm.

Thus P^(2) = P x P,
P^(3) = P^(2) x P,
P^(4) = P^(3) x P, and so on.
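The relation P^(n) = P^n is ordinary matrix exponentiation, so the n-step tpm can be built by repeated multiplication. A short sketch with an invented 2-state tpm:

```python
import numpy as np

P = np.array([[0.5, 0.5],
              [0.2, 0.8]])          # invented 2-state tpm

P2 = P @ P                          # P^(2) = P x P
P3 = P2 @ P                         # P^(3) = P^(2) x P
P4 = np.linalg.matrix_power(P, 4)   # P^(4), computed directly as P^4

assert np.allclose(P4, P3 @ P)      # consistent with P^(4) = P^(3) x P
```

Each power is again a stochastic matrix, so its rows also sum to 1.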

REGULAR MATRIX:

A stochastic matrix P is said to be a regular matrix if all the entries of P^m (for some positive integer m) are positive.

If the transition probability matrix is regular, then we say that the homogeneous Markov Chain is regular.
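Regularity can be tested by raising P to successive powers and looking for one with no zero entries. The helper name and the power cutoff below are our own choices, not part of the text:

```python
import numpy as np

def is_regular(P, max_power=50):
    """True if some power P^m (m <= max_power) has all entries positive."""
    Q = P.copy()
    for _ in range(max_power):
        if (Q > 0).all():
            return True
        Q = Q @ P
    return False

# This tpm has a zero entry, but P^2 is already all-positive,
# so the chain is regular.
P = np.array([[0.0, 1.0],
              [0.5, 0.5]])
print(is_regular(P))   # True
```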
CLASSIFICATION OF STATES OF A MARKOV CHAIN

Irreducible: A Markov Chain is said to be irreducible if every state can be reached from every other state, i.e., p_ij^(n) > 0 for some n and for all i and j. Note that the tpm of an irreducible chain is an irreducible matrix. Otherwise, the chain is said to be non-irreducible or reducible.
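Irreducibility depends only on which one-step transitions have positive probability. One standard check (the helper name is ours): with A the boolean adjacency matrix (A_ij = 1 iff p_ij > 0), a k-state chain is irreducible iff (I + A)^(k-1) has no zero entry.

```python
import numpy as np

def is_irreducible(P):
    """Every state reachable from every other state."""
    k = P.shape[0]
    A = (P > 0).astype(int)        # which one-step moves exist
    R = np.linalg.matrix_power(np.eye(k, dtype=int) + A, k - 1)
    return bool((R > 0).all())

# Reducible example: state 2 is absorbing, so states 0 and 1
# cannot be reached from it.
P = np.array([[0.5, 0.5, 0.0],
              [0.3, 0.3, 0.4],
              [0.0, 0.0, 1.0]])
print(is_irreducible(P))   # False
```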
Return State: If p_ii^(n) > 0 for some n ≥ 1, then we call the state i of the Markov Chain a return state.
Period: Let i be a return state. Then we define the period d_i of state i as

d_i = GCD {m : p_ii^(m) > 0},

where GCD stands for the greatest common divisor. State i is said to be periodic with period d_i if d_i > 1, and aperiodic if d_i = 1.
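The GCD in the definition can be evaluated over the first several powers of P. A sketch (the scan limit max_n is an assumption; the true period needs only finitely many terms once it stabilises):

```python
from math import gcd

import numpy as np

def period(P, i, max_n=50):
    """d_i = GCD{ n <= max_n : p_ii^(n) > 0 } for a return state i."""
    d = 0
    Q = np.eye(P.shape[0])
    for n in range(1, max_n + 1):
        Q = Q @ P                 # Q is now P^n
        if Q[i, i] > 0:
            d = gcd(d, n)
    return d

# A chain that alternates deterministically between its two states:
# returns to state 0 happen only at even steps, so d_0 = 2.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])
print(period(P, 0))   # 2
```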
Recurrence time probability: The probability that the chain returns to state i, starting from state i, for the first time at the nth step is called the recurrence time probability, or the first return time probability, and is denoted by f_ii^(n).
Recurrent State: If F_ii = Σ_{n≥1} f_ii^(n) = 1, the return to state i is certain and the state i is said to be persistent or recurrent. Otherwise, it is said to be transient.

μ_ii = Σ_{n≥1} n·f_ii^(n) is called the mean recurrence time of the state i.
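The first return probabilities f_ii^(n) can be recovered from the n-step probabilities p_ii^(n) via the first-return decomposition p_ii^(n) = Σ_{k=1}^{n} f_ii^(k)·p_ii^(n-k). A sketch under that identity, with an invented 2-state chain whose answers are known (F_00 = 1, μ_00 = 2):

```python
import numpy as np

def first_return_probs(P, i, N=60):
    """f_ii^(n), n = 1..N, from p_ii^(n) = sum_k f_ii^(k) p_ii^(n-k)."""
    p = [1.0]                                  # p_ii^(0) = 1
    Q = np.eye(P.shape[0])
    for _ in range(N):
        Q = Q @ P
        p.append(Q[i, i])
    f = [0.0]                                  # f_ii^(0) is unused
    for n in range(1, N + 1):
        f.append(p[n] - sum(f[k] * p[n - k] for k in range(1, n)))
    return f

P = np.array([[0.5, 0.5],
              [0.5, 0.5]])
f = first_return_probs(P, 0)
F = sum(f)                                     # F_ii, approaches 1 if recurrent
mu = sum(n * f[n] for n in range(1, len(f)))   # mean recurrence time mu_ii

print(round(F, 6), round(mu, 6))   # 1.0 2.0
```

Since F_00 = 1 and μ_00 = 2 is finite, state 0 of this chain is non-null persistent.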

The state i is said to be non-null persistent if its mean recurrence time μ_ii is finite, and null persistent if μ_ii = ∞. A non-null persistent and aperiodic state is called ergodic.

Theorems (Without Proof)
1. If a Markov chain is irreducible, then all its states are of the same type: they are all transient, all null persistent, or all non-null persistent. All its states are either aperiodic or periodic with the same period.

2. If a Markov chain is finite and irreducible, all its states are non-null persistent.