Homework_Week5_solutions
Exercise 1: Consider the Markov chain on the state space {1, 2, 3} whose transition diagram is shown below, and find a general formula for $p_{11}^{(n)}$.
[Transition diagram: from state 1 the chain moves to state 2 with probability 1; from state 2 it stays with probability 1/2 or moves to state 3 with probability 1/2; from state 3 it moves to state 1 or to state 2 with probability 1/2 each.]
Solution: The transition matrix and its square are
\[
P = \begin{pmatrix} 0 & 1 & 0 \\ 0 & 1/2 & 1/2 \\ 1/2 & 1/2 & 0 \end{pmatrix} , \qquad
P^2 = \begin{pmatrix} 0 & 1/2 & 1/2 \\ 1/4 & 1/2 & 1/4 \\ 0 & 3/4 & 1/4 \end{pmatrix} .
\]
The eigenvalues of $P$ satisfy $\det(P - \lambda I) = -(\lambda - 1)\bigl(\lambda^2 + \tfrac{\lambda}{2} + \tfrac{1}{4}\bigr) = 0$, so that
\[
\lambda \in \Bigl\{ 1, \ \frac{1}{2} e^{-i 2\pi/3}, \ \frac{1}{2} e^{i 2\pi/3} \Bigr\} ,
\]
and also
\[
\lambda^n \in \Bigl\{ 1, \ \frac{1}{2^n} e^{-i 2n\pi/3}, \ \frac{1}{2^n} e^{i 2n\pi/3} \Bigr\} ,
\]
which, written in terms of trigonometric functions, is
\[
\lambda^n \in \Bigl\{ 1, \ \frac{1}{2^n}\bigl(\cos(2n\pi/3) - i \sin(2n\pi/3)\bigr), \ \frac{1}{2^n}\bigl(\cos(2n\pi/3) + i \sin(2n\pi/3)\bigr) \Bigr\} .
\]
So we can write $p_{11}^{(n)}$ as a linear combination of these eigenvalues, that is
\[
p_{11}^{(n)} = a' + \frac{b'}{2^n}\bigl(\cos(2n\pi/3) - i \sin(2n\pi/3)\bigr) + \frac{c'}{2^n}\bigl(\cos(2n\pi/3) + i \sin(2n\pi/3)\bigr) .
\]
However, $a'$, $b'$ and $c'$ can be complex, while the result has to be real for every $n \ge 0$. Therefore, with $a, b, c \in \mathbb{R}$ and $a' = a$, $b' = (b + ic)/2$, $c' = (b - ic)/2$, we have that
\[
p_{11}^{(n)} = a + \frac{b}{2^n} \cos(2n\pi/3) + \frac{c}{2^n} \sin(2n\pi/3) .
\]
To solve for the three unknowns, we look at the first element of $P^0$, $P$ and $P^2$, obtaining
\[
\begin{aligned}
p_{11}^{(0)} &= 1 = a + b , \\
p_{11}^{(1)} &= 0 = a + \frac{b}{2}\cos(2\pi/3) + \frac{c}{2}\sin(2\pi/3) = a - \frac{b}{4} + \frac{c\sqrt{3}}{4} , \\
p_{11}^{(2)} &= 0 = a + \frac{b}{4}\cos(4\pi/3) + \frac{c}{4}\sin(4\pi/3) = a - \frac{b}{8} - \frac{c\sqrt{3}}{8} ,
\end{aligned}
\]
whose solution is $a = 1/7$, $b = 6/7$ and $c = \dfrac{2\sqrt{3}}{21}$.
It follows that
\[
p_{11}^{(n)} = \frac{1}{7} + \left(\frac{1}{2}\right)^{n} \left( \frac{6}{7} \cos(2n\pi/3) + \frac{2\sqrt{3}}{21} \sin(2n\pi/3) \right) .
\]
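As a quick numerical check (not part of the original solution, and assuming numpy is available), one can verify the eigenvalues, solve the linear system for (a, b, c) and compare the (1, 1) entry of P^n with the closed form:

import numpy as np

# Transition matrix of Exercise 1; eigenvalues should be 1 and (1/2) e^{+-i 2pi/3}.
P = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.5, 0.5],
              [0.5, 0.5, 0.0]])
print(np.linalg.eigvals(P))          # 1 and -0.25 +- 0.433j, in some order

# Solve the three conditions p_11^(0) = 1, p_11^(1) = 0, p_11^(2) = 0 for (a, b, c).
A = np.array([[1.0, 1.0, 0.0],
              [1.0, -0.25, np.sqrt(3) / 4],
              [1.0, -0.125, -np.sqrt(3) / 8]])
a, b, c = np.linalg.solve(A, np.array([1.0, 0.0, 0.0]))
print(a, b, c)                       # 1/7 = 0.1428..., 6/7 = 0.8571..., 2*sqrt(3)/21 = 0.1649...

# Compare the (1, 1) entry of P^n with the closed-form expression for small n.
for n in range(10):
    closed = 1/7 + (1/2)**n * (6/7 * np.cos(2*n*np.pi/3)
                               + 2*np.sqrt(3)/21 * np.sin(2*n*np.pi/3))
    assert abs(np.linalg.matrix_power(P, n)[0, 0] - closed) < 1e-12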
See also Example 1.1.6, p. 6, in
[N’97] J.R. Norris (1997). Markov Chains. Cambridge University Press.
Exercise 2: A particle moves on the six vertices of a hexagon in the following way: at each step the particle is equally likely to move or to stay where it is; if it moves, it is equally likely to go to either of its two adjacent vertices.
Let i be the initial vertex occupied by the particle and o the vertex opposite to i.
1. Describe the Markov chain model and classify its states and their periods.
Solution:
1. From each vertex the particle stays with probability 1/2 and moves to each of the two adjacent vertices with probability 1/4, so the transition matrix is the $6 \times 6$ circulant matrix with $1/2$ on the diagonal and $1/4$ in the two positions (cyclically) adjacent to the diagonal. All six states communicate, hence the chain is irreducible; since the state space is finite, every state is positive recurrent, and since every state has a self-loop of probability $1/2$, every state is aperiodic.
The expected number of visits to o before the first return to i is
\[
E_i\Bigl[ \sum_{n=1}^{T_i} \mathbf{1}\{X_n = o\} \Bigr] = x_o = 1 .
\]
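As a side check (not in the original solution; it assumes numpy), the hexagon chain is doubly stochastic, so its stationary distribution is uniform and $\pi_o/\pi_i = 1$, consistent with the value $x_o = 1$ above:

import numpy as np

# Lazy walk on the hexagon: stay with probability 1/2, each neighbour with probability 1/4.
n = 6
P = np.zeros((n, n))
for v in range(n):
    P[v, v] = 0.5
    P[v, (v - 1) % n] = 0.25
    P[v, (v + 1) % n] = 0.25

pi = np.ones(n) / n                  # the uniform distribution
print(np.allclose(pi @ P, pi))       # True: uniform is stationary, hence pi_o / pi_i = 1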
4. To compute the expected number of steps until the first visit to o, we can use the symmetry of the hexagon: lumping together the two vertices at distance 1 from i (state 1) and the two vertices at distance 2 from i (state 2), we obtain the following simplified chain, in which o is an absorbing state:
\[
\begin{pmatrix}
1/2 & 1/2 & 0 & 0 \\
1/4 & 1/2 & 1/4 & 0 \\
0 & 1/4 & 1/2 & 1/4 \\
0 & 0 & 0 & 1
\end{pmatrix}
\]
(states ordered as i, 1, 2, o).
[Transition diagram: i ↔ 1 ↔ 2 → o, with self-loops of probability 1/2 at i, 1 and 2, probability 1/2 from i to 1, probability 1/4 on the remaining arrows, and a self-loop of probability 1 at o.]
Writing $m_v$ for the expected number of steps to reach o starting from state $v$, we have that
\[
\begin{aligned}
m_i &= 1 + \frac{1}{2} m_i + \frac{1}{2} m_1 , \\
m_1 &= 1 + \frac{1}{4} m_i + \frac{1}{2} m_1 + \frac{1}{4} m_2 , \\
m_2 &= 1 + \frac{1}{4} m_1 + \frac{1}{2} m_2 ,
\end{aligned}
\]
whose solution is $m_2 = 10$, $m_1 = 16$ and $m_i = 18$: the expected number of steps until the first visit to o is 18.
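A short numerical confirmation of these mean hitting times (a sketch, not part of the original solution; it assumes numpy): the system can be written as (I - Q) m = 1, where Q is the transition matrix restricted to the non-absorbing states i, 1, 2.

import numpy as np

# Transition matrix restricted to the transient states i, 1, 2 (o removed).
Q = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.25, 0.50]])

# Mean hitting times of o solve (I - Q) m = 1.
m = np.linalg.solve(np.eye(3) - Q, np.ones(3))
print(m)                             # [18. 16. 10.]  ->  m_i = 18, m_1 = 16, m_2 = 10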
Exercise 3: Let $\{X_n\}_{n \ge 0}$ be an HMC on $I = \{0, 1, 2, \ldots\}$ with the following transition probabilities:
\[
p_{00} = p_{10} = p_{i,i-1} = \frac{1}{4} , \qquad p_{01} = p_{11} = p_{i,i+1} = \frac{3}{4} , \qquad i \ge 2 .
\]
Assume that the chain starts in state $3 \in I$ at time 0.
Solution:
[Transition diagram: states 0, 1, 2, 3, ...; the arrows $0 \to 1$, $i \to i+1$ ($i \ge 2$) and the self-loop at 1 carry probability 3/4; the arrows $i \to i-1$ ($i \ge 2$), $1 \to 0$ and the self-loop at 0 carry probability 1/4.]
Write $h = P_2(T_1 < \infty)$; by the homogeneity of the transition probabilities, $h = P_{i+1}(T_i < \infty)$ for every $i \ge 1$. Since on $X(0) = 3$ the event $\{T_1 < \infty\}$ is contained in $\{T_2 < \infty\}$, we have
\[
P_3(T_1 < \infty) = P_3(T_2 < \infty)\, P_2(T_1 < \infty) = h^2 ,
\]
where in the first equality we used the strong Markov property on $\{X(T_2) = 2\}$.
Therefore, conditioning on the first step out of state 2, we have
\[
h = \frac{1}{4} + \frac{3}{4}\, h^2 ,
\]
which admits the two solutions
\[
h = 1 \qquad \text{and} \qquad h = \frac{1}{3} ,
\]
and we choose the minimal one, that is $h = 1/3$.
It follows that $u_i := P_i(T_0 < \infty) = h^{i-1} = 3^{1-i}$ for $i \ge 2$ (from state 1 the chain reaches 0 with probability one, since $\{0, 1\}$ is a finite closed class); in particular, starting from state 3 the chain visits 0 with probability $u_3 = 1/9$.
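A small numerical illustration (not part of the original solution) of why the minimal root is the relevant one: iterating the fixed-point map h → 1/4 + (3/4) h² from h = 0 produces an increasing sequence that converges to the minimal non-negative solution, here 1/3.

# Fixed-point iteration for h = 1/4 + (3/4) h^2, started from 0.
h = 0.0
for _ in range(100):
    h = 0.25 + 0.75 * h * h
print(h)                             # 0.3333333... = 1/3, not the other root h = 1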
2. We know that, once it has entered the closed recurrent class $\{0, 1\}$, the chain, restricted to that class, behaves as a two-state HMC. Its limiting distribution is the stationary one, as we computed in class when we studied the two-state HMC.
Therefore, conditioning on the event that the chain visits state $0 \in I$ in finite time, we have
\[
\begin{aligned}
P(X(\infty) = 0 \mid T_0 < \infty) &= \pi_0 , \\
P(X(\infty) = 1 \mid T_0 < \infty) &= \pi_1 , \\
P(X(\infty) = j \mid T_0 < \infty) &= 0 , \quad j \ge 2 .
\end{aligned}
\]
Since the chain restricted to $\{0, 1\}$ has $p_{00} = p_{10} = 1/4$ and $p_{01} = p_{11} = 3/4$, its stationary distribution is $\pi = (1/4, 3/4)$, and therefore
\[
\begin{aligned}
P(X(\infty) = 0 \mid T_0 < \infty) &= \frac{1}{4} , \\
P(X(\infty) = 1 \mid T_0 < \infty) &= \frac{3}{4} , \\
P(X(\infty) = j \mid T_0 < \infty) &= 0 = \pi_j , \quad j \ge 2 .
\end{aligned}
\]
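A one-line check (assuming numpy; not part of the original solution) that $\pi = (1/4, 3/4)$ is stationary for the chain restricted to $\{0, 1\}$:

import numpy as np

# Two-state chain restricted to the closed class {0, 1}.
P2 = np.array([[0.25, 0.75],
               [0.25, 0.75]])
pi = np.array([0.25, 0.75])
print(np.allclose(pi @ P2, pi))      # True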
3. By the previous results, we have for i ∈ {0, 1},
\[
\begin{aligned}
\lim_{n \to \infty} p_{i0}^{(n)} &= \pi_0 = \frac{1}{4} , \\
\lim_{n \to \infty} p_{i1}^{(n)} &= \pi_1 = \frac{3}{4} , \\
\lim_{n \to \infty} p_{ij}^{(n)} &= 0 , \quad j \ge 2 ,
\end{aligned}
\]
while for $i \ge 2$,
\[
\begin{aligned}
\lim_{n \to \infty} p_{i0}^{(n)} &= P_i(X(\infty) = 0 \mid T_0 < \infty)\, P_i(T_0 < \infty) = u_i \pi_0 = \frac{3^{1-i}}{4} , \\
\lim_{n \to \infty} p_{i1}^{(n)} &= P_i(X(\infty) = 1 \mid T_0 < \infty)\, P_i(T_0 < \infty) = u_i \pi_1 = \frac{3^{2-i}}{4} , \\
\lim_{n \to \infty} p_{ij}^{(n)} &= 0 , \quad j \ge 2 .
\end{aligned}
\]
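These limits can also be checked numerically (a sketch, not part of the original solution; the truncation level N and the time horizon are illustrative choices, and numpy is assumed): truncate the state space at a large N, make N absorbing, and take a high power of the transition matrix.

import numpy as np

# Truncated chain on {0, ..., N}; the error from making N absorbing is of order 3^{-N}.
N = 60
P = np.zeros((N + 1, N + 1))
P[0, 0] = P[1, 0] = 0.25
P[0, 1] = P[1, 1] = 0.75
for i in range(2, N):
    P[i, i - 1] = 0.25
    P[i, i + 1] = 0.75
P[N, N] = 1.0

Pn = np.linalg.matrix_power(P, 2000)
print(Pn[3, 0], 3**(1 - 3) / 4)      # both ~0.02777... = 1/36
print(Pn[3, 1], 3**(2 - 3) / 4)      # both ~0.08333... = 1/12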
Exercise 4: Let $\{X_n\}_{n \ge 0}$ be an HMC on $I = \{0, 1, 2, \ldots\}$ with the following transition probabilities, for $i \ge 1$,
\[
p_{i,i+1} = p_i , \qquad p_{i,i-1} = q_i .
\]
For $i = 1, 2, \ldots$ we have $0 < p_i = 1 - q_i < 1$, while $p_{00} = 1$, making 0 an absorbing state.
Calculate the absorption probability starting from $i \ge 1$, that is $P_i(T_0 < \infty)$.
Solution: See Example 1.3.4 (Birth-and-death chain), p. 16, in
[N’97] J.R. Norris (1997). Markov Chains. Cambridge University Press.
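For completeness, a numerical sketch of the formula obtained there (not part of the original solution): with $\gamma_0 = 1$ and $\gamma_k = (q_1 \cdots q_k)/(p_1 \cdots p_k)$, the minimal non-negative solution is $P_i(T_0 < \infty) = \sum_{k \ge i} \gamma_k / \sum_{k \ge 0} \gamma_k$ when the series converges, and 1 otherwise. The choice $p_i \equiv 3/4$ below and the truncation level K are illustrative assumptions.

import numpy as np

def absorption_prob(i, p=lambda k: 0.75, K=200):
    # gamma_k = (q_1 ... q_k) / (p_1 ... p_k), truncated at K terms
    # (adequate here since gamma_k decays geometrically).
    gamma = np.ones(K)
    for k in range(1, K):
        gamma[k] = gamma[k - 1] * (1 - p(k)) / p(k)
    return gamma[i:].sum() / gamma.sum()

print(absorption_prob(1), absorption_prob(3))   # ~1/3 and ~1/27 for p_i = 3/4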