CEM615 Asset Management
TOPIC 4
ASSET DETERIORATION
Lesson Learning Outcomes (LO)
• At the conclusion of this lesson, the student should be able to:
– Define and explain a Markov Chain (MC)
– Describe transition probabilities, transition diagrams, transition matrices, and absorbing states
– Differentiate between a regular Markov Chain and a non-regular Markov Chain
– Explain the state vector, the steady-state condition, and the steady-state vector
5.1 -Markov Chains
• We have a set of states, S = {S1, S2, ..., Sr}.
• If the chain is currently in state Si, then it moves to state Sj at the next step with a probability denoted by the transition probability Pij.
• P44 = 1 marks an absorbing state: once the chain enters it, it cannot leave (unless you repair the asset).
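• As a minimal sketch of these ideas (not from the slides), the transition probabilities Pij can be stored as a square matrix whose rows each sum to 1. The condition-state values below are hypothetical, chosen only so that state 4 is absorbing (P44 = 1):

import numpy as np

# Hypothetical 4-state asset-condition chain (states 1..4).
# Row i holds the probabilities Pij of moving from state i to state j,
# so every row must sum to 1. Row 4 has P44 = 1: an absorbing state.
P = np.array([
    [0.90, 0.08, 0.02, 0.00],
    [0.00, 0.85, 0.10, 0.05],
    [0.00, 0.00, 0.80, 0.20],
    [0.00, 0.00, 0.00, 1.00],  # absorbing: once here, the chain stays
])

assert np.allclose(P.sum(axis=1), 1.0)  # sanity check: valid transition matrix

# If the asset is currently in state 1 with certainty, the state vector
# one step later is the row vector s0 multiplied by P.
s0 = np.array([1.0, 0.0, 0.0, 0.0])
print(s0 @ P)  # -> [0.9 0.08 0.02 0.], the first row of P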
• Transition Diagram
END 1
Example 11.1
• Example 11.1. The Land of Oz never has two nice days in a row. If they have a nice day, they are just as likely to have snow as rain the next day. If they have snow or rain, they have an even chance of having the same kind of weather the next day. If there is a change from snow or rain, only half of the time is this a change to a nice day. With this information we form a Markov chain as follows. We take as states the kinds of weather R (rain), N (nice), and S (snow). From the above information we determine the transition probabilities and the steady-state (long-term) condition of the weather.
• The transition probabilities are most conveniently represented in a square array:

        R     N     S
R  |  1/2   1/4   1/4 |
N  |  1/2    0    1/2 |
S  |  1/4   1/4   1/2 |
• Such a square array is called the matrix of
transition probabilities, or the transition
matrix.
• Theorem 11.1. Let P be the transition matrix of a Markov chain. The ijth entry p_ij^(n) of the matrix P^n gives the probability that the Markov chain, starting in state Si, will be in state Sj after n steps. (Quiz 5)
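• A quick check of Theorem 11.1 on the Land of Oz matrix (a sketch using NumPy, not part of the slides): raising P to higher powers, every row converges to the same long-run distribution:

import numpy as np

# Land of Oz transition matrix, states ordered R, N, S.
P = np.array([
    [0.50, 0.25, 0.25],
    [0.50, 0.00, 0.50],
    [0.25, 0.25, 0.50],
])

# The n-step transition probabilities p_ij^(n) are the entries of P^n.
for n in (1, 2, 6):
    print(f"P^{n} =\n{np.linalg.matrix_power(P, n).round(4)}")
# For large n every row approaches (0.4, 0.2, 0.4): in the long run
# the starting state no longer matters.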
• In the long term, the weather probabilities are:
– Rainy days: 40%
– Nice days: 20%
– Snowy days: 40%
END 2
Regular Markov Chain
• A Regular Markov Chain is a Markov Chain where some power of the transition matrix has all entries non-zero, i.e. where for some positive integer n, the matrix T^n has no 0 entries.
• Example: check whether T^n ever has all non-zero entries for the matrix below. The answer is No: the powers alternate between T itself and the identity matrix, so the zeros never disappear.

T = | 0  1 |
    | 1  0 |
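• A small sketch of this test (my own code, not from the slides): raise T to successive powers and stop as soon as some power has no zero entries. The cut-off max_power is an arbitrary practical limit; the summary slide below discusses when it is safe to stop.

import numpy as np

def is_regular(T, max_power=25):
    """Return True if some power T^n (1 <= n <= max_power) has no zero entries."""
    T = np.asarray(T, dtype=float)
    Tn = np.eye(len(T))
    for n in range(1, max_power + 1):
        Tn = Tn @ T  # Tn now holds T^n
        if (Tn > 0).all():
            return True
    return False

# The permutation matrix from the slide: its powers alternate between
# itself and the identity, so zeros never vanish -> not regular.
print(is_regular([[0, 1], [1, 0]]))  # False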
• Regular Markov Chains (or irreducible)
• Example 1: Determine whether each of the following is irreducible (connectable):

T = | 0.25  0.75 |
    | 0.65  0.35 |

T = | 0.2  0.4  0.4 |
    | 0.3  0    0.7 |
    | 0    0    1   |

T = | 0.6  0    0.4 |
    | 0.2  0    0.8 |
    | 0    0.8  0.2 |
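• Applying the same power test to these three matrices (a self-contained sketch; the verdicts come from checking the powers directly):

import numpy as np

matrices = {
    "T1": [[0.25, 0.75], [0.65, 0.35]],
    "T2": [[0.2, 0.4, 0.4], [0.3, 0.0, 0.7], [0.0, 0.0, 1.0]],
    "T3": [[0.6, 0.0, 0.4], [0.2, 0.0, 0.8], [0.0, 0.8, 0.2]],
}
for name, T in matrices.items():
    T = np.array(T)
    regular = any((np.linalg.matrix_power(T, n) > 0).all() for n in range(1, 11))
    print(name, "regular" if regular else "not regular")
# T1: regular (already has no zeros); T3: regular (T^2 has no zeros);
# T2: not regular, since state 3 is absorbing and its row stays (0, 0, 1) forever.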
• Regular Markov Chain:
Non-regular Markov chain
• One important special type of non-regular Markov chain is an absorbing Markov chain.
• A state Sk of a Markov chain is called an absorbing state if, once the Markov chain enters the state, it remains there forever. In other words, the probability of leaving the state is zero. This means Pkk = 1, and Pkj = 0 for j ≠ k.
Summary:
1) "Regular Markov chains always ACHIEVE A STEADY STATE. A Markov chain is regular if the transition matrix P or some power of P has no zero entries." (definition from a textbook)
• So, in principle, we have to test every power of P and see whether P^2, P^3, etc. have zero entries. How do we know when to stop testing? Strictly speaking, we could never conclude that a Markov chain is non-regular without testing powers of P forever. In practice, you will find that either the zeros vanish quickly or they are obviously never going to go away (for example, an absorbing state keeps its zeros in every power). So to test it, just go up a few powers and see whether the zeros disappear. (In fact, for an r-state chain it suffices to check the single power n = (r - 1)^2 + 1: by Wielandt's theorem on primitive matrices, if that power still has zeros, the chain is not regular.)
• If you are starting from a state of certainty (you know that you DEFINITELY lost the game you just played), then your initial state vector has a 1 in the "loss" state and 0 everywhere else.
END 4
Steady state condition
(This applies only to Regular Markov chains)
• A state vector P is called the Steady State Vector when P·T = P,
• i.e. multiplying the Steady State Vector by the Transition Matrix returns the Steady State Vector itself.
• In the following example, the steady state vector = (0.5294, 0.4706).
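• The matrix behind that particular vector is not reproduced in these notes, so as a sketch, here is the condition P·T = P solved for the Land of Oz matrix from Example 11.1 (solve the linear system pi·T = pi together with the requirement that the entries sum to 1):

import numpy as np

# Land of Oz transition matrix (states R, N, S).
T = np.array([
    [0.50, 0.25, 0.25],
    [0.50, 0.00, 0.50],
    [0.25, 0.25, 0.50],
])

# Steady state: pi @ T = pi and sum(pi) = 1.
# Rearranged as (T - I)^T pi = 0 plus a normalisation row of ones.
r = len(T)
A = np.vstack([(T - np.eye(r)).T, np.ones(r)])
b = np.array([0.0] * r + [1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi.round(4))  # -> [0.4 0.2 0.4], matching the long-term weather slide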
Example
• Continue with example 4.4 (worked on the slides; figures not reproduced).
Example
2) Gina noticed that the performance of her baseball team seemed to depend on the outcome of their previous game. When her team won, there was a 70% chance that they would win the next game. If they lost, however, there was only a 40% chance that they would win their next game.
2a) Following a loss, what is the probability that Gina's team will win two games later? Use Markov chains to find out. [The answer provided is 0.52, but no matter what I try, I can't get the answer.]
http://www.sciforums.com/threads/math-markov-chains-regular-markov-chains.55328/
• The probability that Gina's team wins again after a win is 0.7. Using P(W->W) = 0.7, P(W->L) = 0.3, P(L->W) = 0.4, P(L->L) = 0.6, you can find:
▪ S(0) = [0, 1] (states ordered [Win, Loss]; we start from a certain loss)
▪ S(1) = S(0) P = [0.4, 0.6]
▪ S(2) = S(1) P = S(0) P^2 = [0.52, 0.48]
▪ Therefore, following a loss, the probability that Gina's team will win two games later is 0.52, or 52%.
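• A short check of this computation (my sketch, states ordered [Win, Loss]):

import numpy as np

P = np.array([[0.7, 0.3],    # from Win:  P(W->W), P(W->L)
              [0.4, 0.6]])   # from Loss: P(L->W), P(L->L)
s0 = np.array([0.0, 1.0])    # certain loss
print(s0 @ np.linalg.matrix_power(P, 2))  # -> [0.52 0.48]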
END 5
Exercise 1
• Draw the transition diagram and find T². (T = the matrix below.)
Exercise 2
• A market analyst is interested in whether consumers prefer APPLE tablet
or SAMSUNG tablet. Two market surveys taken one year apart reveal the
following:
– 10% of APPLE owners had switched to SAMSUNG and the rest
continued with APPLE.
– 35% of SAMSUNG owners had switched to APPLE and the rest
continued with SAMSUNG.
• In the long run, what is the fraction of Apple and Samsung owners in the market?
• Solution: let state 0 = Apple, state 1 = Samsung.

T = | 0.90  0.10 |
    | 0.35  0.65 |

• Compute T^n until it reaches the steady state.
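• Carrying out that computation (a sketch; the long-run split is what the powers converge to):

import numpy as np

T = np.array([[0.90, 0.10],
              [0.35, 0.65]])
print(np.linalg.matrix_power(T, 50).round(4))
# Both rows converge to about [0.7778 0.2222]:
# roughly 7/9 Apple and 2/9 Samsung in the long run.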
END 6
Articles to read
• 5.2 -Regular_Markov_Chains
• 5.2a -Regular_Markov_Chains + Eg 4.3
• 5.3 -Regular Markov chains
• 5.8 Chap4part1
• 5.10 sec92
Exercise 3
• A sewage treatment plant is in one of four possible states: 0 = working without a problem; 1 = working but in need of minor repair; 2 = working but in need of major repair; 3 = out of order. The corresponding transition probability matrix is
T = | 0.80  0.14  0.04  0.02 |
    | 0     0.60  0.30  0.10 |
    | 0     0     0.65  0.35 |
    | 0.90  0     0     0.10 |

• A Regular Markov Chain is one where some power of the transition matrix has all entries non-zero, i.e. for some positive integer n, the matrix T^n has no 0 entries.
• Check T^n until it has no 0 entries.
• Then keep generating higher powers until there is no significant change in the values: that limit is the steady state.
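• For this plant the check succeeds (the 0.90 repair probability reconnects state 3 back to state 0), and the powers settle to a steady state. A sketch:

import numpy as np

T = np.array([
    [0.80, 0.14, 0.04, 0.02],
    [0.00, 0.60, 0.30, 0.10],
    [0.00, 0.00, 0.65, 0.35],
    [0.90, 0.00, 0.00, 0.10],
])
print((np.linalg.matrix_power(T, 4) > 0).all())   # True: T^4 has no zeros
print(np.linalg.matrix_power(T, 100).round(3)[0])
# Every row settles to about [0.503 0.176 0.209 0.112]: in the long run the
# plant is fully working ~50% of the time and out of order ~11% of the time.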
Exercise 4
• The corresponding transition probability matrix is:

T = | 0.33  0.33  0.33 |
    | 0.25  0.50  0.25 |
    | 0.17  0.33  0.50 |

(The 0.33 entries appear to be 1/3 rounded, since each row must sum to 1.)
Exercise 5 (a good question)
• A bridge has a 150-meter-long element, assumed to be a steel girder, that at age zero (new bridge) has 100% of the element in condition 1. The estimated Markovian transition matrix P is as shown in Table 1.0. Predict the future condition of the bridge over a 10-year life span.
S0 = ( 1  0  0  0 )

P = | 0.994  0.006  0      0     |
    | 0      0.993  0.007  0     |
    | 0      0      0.976  0.024 |
    | 0      0      0      1     |

S1 = S0 · P = ( 0.994  0.006  0  0 )
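• Iterating the same product for each year gives the 10-year prediction (a sketch; the condition-1 share simply decays as 0.994^n):

import numpy as np

P = np.array([
    [0.994, 0.006, 0.000, 0.000],
    [0.000, 0.993, 0.007, 0.000],
    [0.000, 0.000, 0.976, 0.024],
    [0.000, 0.000, 0.000, 1.000],  # condition 4 is absorbing (P44 = 1)
])
s = np.array([1.0, 0.0, 0.0, 0.0])  # new bridge: 100% in condition 1
for year in range(1, 11):
    s = s @ P
print(s.round(4))
# After 10 years roughly 94.2% of the element is still in condition 1
# (0.994**10 ~ 0.9416), with the remainder spread over conditions 2-4.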
Exercise 6
Suppose that General Motors (GM), Ford (F), and Chrysler (C) each introduce a new SUV.
• General Motors keeps 85% of its customers but loses 10% to Ford and 5% to Chrysler.
• Ford keeps 80% of its customers but loses 10% to General Motors and 10% to Chrysler.
• Chrysler keeps 60% of its customers but loses 25% to General Motors and 15% to Ford.
Find the distribution of the market in the long run, i.e. the Steady State Vector.
• Solution: let state 1 = General Motors, state 2 = Ford, state 3 = Chrysler. From the percentages above:

T = | 0.85  0.10  0.05 |
    | 0.10  0.80  0.10 |
    | 0.25  0.15  0.60 |

• Find T^n = steady state.
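• Raising T to a high power (a sketch) gives the long-run market split:

import numpy as np

T = np.array([
    [0.85, 0.10, 0.05],   # GM keeps 85%, loses 10% to Ford, 5% to Chrysler
    [0.10, 0.80, 0.10],   # Ford
    [0.25, 0.15, 0.60],   # Chrysler
])
print(np.linalg.matrix_power(T, 100).round(4)[0])
# Steady state ~ [0.4906 0.3585 0.1509]:
# about 49% GM, 36% Ford, 15% Chrysler in the long run.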
END 7
5.4 -Markov Model for Storm Water Pipe Deterioration
• P44 = 1 is an absorbing state, i.e. once entered it cannot be left.
End of students' notes
5.7 markovchain-IKRana
END 8
Exercise 1.3
• Consider a person moving on a 4 × 4 grid. He can move only to the intersection points on the right or down, each with probability 1/2. Suppose he starts his walk from the top left corner, and let Xn, n ≥ 0, denote his position after n steps. Show that {Xn, n ≥ 0} is a Markov chain. Sketch its transition graph and compute the transition probability matrix. Also find the initial distribution vector.
Exercise 1.4
• End of 5.7 markovchain-IKRana
END 9