Markov Chain - HW4
HW #4
Instructor: Songfeng (Andy) Zheng
P (X0 = 1, X1 = 1, X2 = 0)
and
P (X1 = 1, X2 = 1, X3 = 0)
(c). The initial distribution is p0 = 0.5 and p1 = 0.5. Please determine the probabilities
P (X2 = 0) and P (X3 = 0).
Problem 2. Consider a Markov chain with state space {0, 1, 2, 3}. Suppose the transition
probability matrix is
P =
    0.4  0.3  0.2  0.1
    0.1  0.4  0.3  0.2
    0.3  0.2  0.1  0.4
    0.2  0.1  0.4  0.3
If the initial distribution is pi = 1/4 for i = 0, 1, 2, 3, show that for all n, P (Xn = k) = 1/4,
for k = 0, 1, 2, 3. Can you generalize the result of this example? Prove your conjecture.
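A quick numerical check of the claim (this verifies, but does not prove, the result; plain Python with no external libraries is assumed):

```python
# Numerical sanity check for Problem 2: iterate the distribution
# p_{n+1} = p_n P and confirm it stays uniform.

P = [
    [0.4, 0.3, 0.2, 0.1],
    [0.1, 0.4, 0.3, 0.2],
    [0.3, 0.2, 0.1, 0.4],
    [0.2, 0.1, 0.4, 0.3],
]

def step(p, P):
    """One step of the chain: returns the row vector p P."""
    return [sum(p[i] * P[i][j] for i in range(len(p))) for j in range(len(P[0]))]

p = [0.25, 0.25, 0.25, 0.25]   # uniform initial distribution
for n in range(10):
    p = step(p, P)
    assert all(abs(x - 0.25) < 1e-12 for x in p), f"not uniform at step {n+1}"
print(p)  # ≈ [0.25, 0.25, 0.25, 0.25] up to floating-point rounding
```

Noting which structural property of P makes the check succeed is a good starting point for the conjecture.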
Problem 3. Consider a Markov chain with state space {0, 1, 2, 3}, and the transition
probability matrix is
P =
    1    0    0    0
    0.1  0.6  0.1  0.2
    0.2  0.3  0.4  0.1
    0    0    0    1
(a) Starting in state 1, determine the probability that the Markov chain ends in state 0.
(b) Determine the mean time to absorption.
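Both parts can be checked numerically by writing out the first-step equations for the transient states 1 and 2 and solving the resulting 2×2 linear systems; the small Cramer's-rule helper below is a hypothetical utility introduced for this sketch, not part of the assignment:

```python
# First-step analysis for Problem 3, solved numerically as a sanity check.
# Transient states: 1 and 2; absorbing states: 0 and 3.
# u_i = P(absorbed at 0 | X0 = i) satisfies u_i = P[i][0] + P[i][1] u_1 + P[i][2] u_2.
# v_i = E[time to absorption | X0 = i] satisfies v_i = 1 + P[i][1] v_1 + P[i][2] v_2.

P = [
    [1.0, 0.0, 0.0, 0.0],
    [0.1, 0.6, 0.1, 0.2],
    [0.2, 0.3, 0.4, 0.1],
    [0.0, 0.0, 0.0, 1.0],
]

def solve2(a, b, c, d, e, f):
    """Solve the 2x2 system  a x + b y = e,  c x + d y = f  by Cramer's rule."""
    det = a * d - b * c
    return (e * d - b * f) / det, (a * f - e * c) / det

# (a) absorption probabilities into state 0:
#     (1 - 0.6) u1 - 0.1 u2 = 0.1   and   -0.3 u1 + (1 - 0.4) u2 = 0.2
u1, u2 = solve2(1 - P[1][1], -P[1][2], -P[2][1], 1 - P[2][2], P[1][0], P[2][0])

# (b) mean times to absorption (same coefficient matrix, right-hand side 1):
v1, v2 = solve2(1 - P[1][1], -P[1][2], -P[2][1], 1 - P[2][2], 1.0, 1.0)

print(round(u1, 4), round(v1, 4))
```

An analytic solution by hand should reproduce the same two linear systems.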
Problem 4. Consider a Markov chain with state space {0, 1, 2, 3, 4}, and the transition
probability matrix is
P =
    q  p  0  0  0
    q  0  p  0  0
    q  0  0  p  0
    q  0  0  0  p
    0  0  0  0  1
where p + q = 1. Determine the mean time to reach state 4 starting from state 0. That is,
find E[T |X0 = 0], where T = min{n ≥ 0; Xn = 4}.
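The first-step equations v_i = 1 + q v_0 + p v_{i+1} (with v_4 = 0) can be checked numerically for a particular value of p; the choice p = 0.5 below is purely illustrative, not part of the problem statement:

```python
# Problem 4 sketch: solve the first-step equations
#   v_i = 1 + q*v_0 + p*v_{i+1}  (i = 0,...,3),  v_4 = 0
# by fixed-point iteration for an illustrative value p = 0.5.

def mean_time_to_4(p):
    """Mean hitting time of state 4 from state 0."""
    q = 1.0 - p
    v = [0.0] * 5                      # v[4] = 0 stays fixed
    for _ in range(10_000):            # iterate the equations to convergence
        v = [1.0 + q * v[0] + p * v[i + 1] for i in range(4)] + [0.0]
    return v[0]

t = mean_time_to_4(0.5)
print(round(t, 4))  # ≈ 30.0 for p = 0.5
```

A closed-form answer in terms of p and q should agree with this value when p = q = 0.5.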
Problem 5. Consider a Markov chain with state space {0, 1, 2}, and the transition
probability matrix is
P =
    0.3  0.2  0.5
    0.5  0.1  0.4
    0    0    1
The process starts with X0 = 0. Eventually the process will end up in state 2. What is the
probability that when the process moves into state 2, it does so from state 1?
Hint: let T = min{n ≥ 0; Xn = 2}; the event in question occurs exactly when XT −1 = 1.
Accounting for the initial state, we want P (XT −1 = 1|X0 = 0). Let zi = P (XT −1 = 1|X0 = i),
and use first-step analysis to find the desired result.
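A short Monte Carlo run gives a target value to compare the first-step-analysis answer against; this simulation is an independent check, not the intended solution method:

```python
import random

# Monte Carlo check for Problem 5: estimate P(X_{T-1} = 1 | X0 = 0),
# i.e. the probability that the chain enters state 2 from state 1.

P = [
    [0.3, 0.2, 0.5],
    [0.5, 0.1, 0.4],
    [0.0, 0.0, 1.0],
]

def entered_2_from_1(rng):
    """Run one chain from X0 = 0 until it hits state 2; report whether X_{T-1} = 1."""
    state = 0
    while True:
        nxt = rng.choices([0, 1, 2], weights=P[state])[0]
        if nxt == 2:
            return state == 1
        state = nxt

rng = random.Random(0)               # fixed seed for reproducibility
n = 200_000
estimate = sum(entered_2_from_1(rng) for _ in range(n)) / n
print(round(estimate, 3))            # should be close to the analytic answer
```

The analytic value of z0 obtained from first-step analysis should land within simulation error of this estimate.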
where
    ρk = (q1 q2 · · · qk) / (p1 p2 · · · pk).
Problem 8. Let {Xn} be a branching process with E(ξ) = µ. Show that Zn = Xn/µ^n is a
nonnegative martingale.
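The martingale property implies E[Zn] = E[Z0] = 1 for every n, which can be sanity-checked by simulation; the offspring distribution below is an arbitrary example chosen for illustration (it is not specified in the problem):

```python
import random

# Problem 8 intuition check: if Z_n = X_n / mu^n is a martingale,
# then E[Z_n] = 1 for every n (with X_0 = 1).
# The offspring law below is an assumed example, not part of the problem.

OFFSPRING = [0, 1, 2, 3]
WEIGHTS = [0.2, 0.3, 0.3, 0.2]
MU = sum(k * w for k, w in zip(OFFSPRING, WEIGHTS))   # E(xi) = 1.5

def simulate_Z(n, rng):
    """One path of the branching process (X0 = 1); returns Z_n = X_n / MU**n."""
    x = 1
    for _ in range(n):
        # each of the x current individuals draws an independent offspring count
        x = sum(rng.choices(OFFSPRING, weights=WEIGHTS, k=x))
    return x / MU ** n

rng = random.Random(1)
n, paths = 5, 20_000
mean_Z = sum(simulate_Z(n, rng) for _ in range(paths)) / paths
print(round(mean_Z, 2))  # should be close to 1.0
```

The simulation only illustrates the constant-mean property; the proof requires verifying the conditional expectation E[Zn+1 | X0, . . . , Xn] = Zn.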