Markov Analysis

1. Markov analysis allows prediction of future states through initial state vectors and transition probability matrices.
2. It is used in marketing, finance, human resources and other areas to model processes where the next state depends only on the current state.
3. Key assumptions include a finite number of states, time-independent transition probabilities, and a known initial state vector from which forecasting is possible.

3. Markov Analysis
A Markov chain is a discrete random process with the property
that the next state depends only on the current state. It is named
after Andrey Markov and is a mathematical tool for statistical
modelling in modern applied mathematics. A Markov process is
used to analyse decision problems in which the occurrence of an
event depends on the occurrence of a previous event.

3.1. Area of application

1. In marketing - to forecast market shares of competitors
2. In finance - to predict share prices in the stock exchange market
3. In human resource management - to analyse the shifting of personnel to various departments, branches, divisions, etc.
4. In accounting - to estimate the provision for bad debts
5. In universities and colleges - to predict future enrolments, etc.

3.2. Definition of Terms

1. Markov analysis - A type of analysis that allows the decision maker to predict the future state of nature of a system by using the initial state vector and the matrix of transition probabilities.
2. State probability - The probability of an event occurring at a given time.
3. Vector of state probabilities - A vector of all state probabilities for a given system or process. This vector could be an initial or a future state vector.
4. Transition probability - The conditional probability of moving from one state to another, or remaining in the same state, in the future.
5. Matrix of transition probabilities - A matrix that contains all transition probabilities for a certain process or system.
6. Equilibrium condition or steady state condition - The long-run stable state of a system or process. Provided the assumptions of the Markov process persist, the system finally reaches an equilibrium or steady state condition. At equilibrium, the state probabilities for a future period are the same as the state probabilities for the previous period.
7. Absorbing state - A state which, once entered, cannot be left. It has a transition probability of one to itself and zero to all other states. In business, absorbing states include the payment of a bill, termination of a contract, sale of a capital asset, etc.
8. Recurrent state - A state which can be left and re-entered many times.
9. Closed state - A state which, once left, cannot be re-entered. A closed state loses but never gains.

3.3. Assumptions of Markov processes

1. There is a finite number of possible states.
2. The probability of changing states remains the same during the period of the analysis.
3. The number and composition of possible states do not change.
4. The subjects of analysis do not change in number or composition, e.g. the same number and type of customers.
5. The various states of nature are mutually exclusive and collectively exhaustive. That is, at any given time a subject of analysis belongs to one and only one state.
6. The system begins from some known initial state vector.
7. Forecasting is possible once we have the initial state vector and the transition matrix.

Note:

a) Pij = probability of the system being in state j in the future if it is in the current state i.

b) Each row of the transition matrix sums to one:
   P11 + P12 + · · · + P1n = 1
   P21 + P22 + · · · + P2n = 1
   ...
   Pn1 + Pn2 + · · · + Pnn = 1

c) The transition matrix is a square matrix. That is, the number of rows = number of columns.

d) The transition matrix can be obtained through observation, data collection and data analysis (or research).
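The properties in notes (b) and (c) are easy to verify numerically. A minimal sketch using NumPy, with purely illustrative matrix values:

```python
import numpy as np

# A hypothetical 2-state transition matrix (illustrative values only)
T = np.array([[0.6, 0.4],
              [0.3, 0.7]])

# Note (b): every row of a transition matrix must sum to 1
assert np.allclose(T.sum(axis=1), 1.0)

# Note (c): the transition matrix must be square
assert T.shape[0] == T.shape[1]
print("T is a valid transition matrix")
```

The same two checks apply to a matrix of any size, and are a useful guard before any forecasting computation.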

Illustrations:

Question 1: Two TV stations, S1 and S2, compete for viewers. Of
those who view S1 on a given day, 40% view S2 the next day. Of
those who view S2 on a given day, 30% switch over to S1 the next
day. Yesterday's market share of viewers for S1 was 60% while
that of S2 was 40%.

Required
Section 3: Markov Analysis 46
Determine
k the percentage of viewers for each station:
a) Today
b) Tomorrow
c) In the long run (equilibrium or steady state)
Solution

To
" S1 S2 #
CUK

a)
S1 0.6 0.4
Transition matrix, T = From
S2 0.3 0.7

h S1 S2 i
Yesterday’s market share( initial state vector) = 0.6 0.4
Future state vector = ( initial state vector)×( Transition matri
JJ II
Today’s market share of viewers =
J I
Back Close
Section 3: Markov Analysis 47
" #
h
k 0.6 0.4
i 0.6 0.4 h i
= 0.48 0.52
0.3 0.7
Thus S1 = 48% and S2 = 52%

b) Tomorrow’s market
" share #of viewers =
h i 0.6 0.4 h i
0.48 0.52 = 0.444 0.556
0.3 0.7
Thus S1 = 44.4% and S2 = 55.6%
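The vector-matrix products in parts (a) and (b) above can be reproduced with a short NumPy sketch:

```python
import numpy as np

# Transition matrix and yesterday's shares from Question 1
T = np.array([[0.6, 0.4],
              [0.3, 0.7]])
v0 = np.array([0.6, 0.4])   # yesterday: S1 = 60%, S2 = 40%

v1 = v0 @ T   # today's shares:    [0.48, 0.52]
v2 = v1 @ T   # tomorrow's shares: [0.444, 0.556]
print(v1, v2)
```

Each further day is just one more multiplication by T, so an n-day forecast is `v0 @ np.linalg.matrix_power(T, n)`.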
c) Provided the assumptions of the Markov process persist, the
   system finally reaches an equilibrium or steady state (long-run)
   status. This is where, although transitions continue, for any
   one state the percentage lost equals the percentage gained;
   therefore the sum of net gains is zero.

   At equilibrium the following holds:

   (Equilibrium state vector) × (Transition matrix) = (Equilibrium state vector)

   Let x1 = long-run market share of S1
       x2 = 1 − x1 = long-run market share of S2

   In the long run,

   [x1  1 − x1] × [0.6  0.4]  =  [x1  1 − x1]
                  [0.3  0.7]

   0.6x1 + 0.3(1 − x1) = x1
   −0.7x1 = −0.3
   x1 = 0.4286 = 42.86%
   x2 = 1 − x1 = 1 − 0.4286 = 0.5714 = 57.14%
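The same steady state can be found numerically. A sketch that replaces one balance equation with the normalisation condition x1 + x2 = 1 and solves the resulting linear system:

```python
import numpy as np

T = np.array([[0.6, 0.4],
              [0.3, 0.7]])

# Steady state solves x T = x, i.e. (T' - I) x' = 0, together with
# x1 + x2 = 1. Keep one balance row and append the normalisation row.
A = np.vstack([(T.T - np.eye(2))[0], np.ones(2)])
b = np.array([0.0, 1.0])
x = np.linalg.solve(A, b)
print(x)   # approximately [0.4286, 0.5714], i.e. 3/7 and 4/7
```

Note the exact answer is x1 = 3/7, which the notes round to 0.4286.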
Question 2: Kagoro village consists of a total of 16,000 households.
A market research firm gathered data in an attempt to investigate
the loyalty of the households to soaps A, B and C sold in the
village shops. A consumer survey at the end of the month of Dec
2009 revealed the following brand switching patterns:

                 To
              A     B     C
   From A   4000   500   500
        B   1000  3500   500
        C    600  1800  3600

Required
a) Determine the transition matrix for the above Markov process
b) Determine the market share percentage and number of households using each of the three types of soaps at the end of January 2010
c) Determine the steady state market share percentage and number of households using each of the three types of soaps.

Solution
a)
                  To               Total
               A     B     C     (Nov 2009)
   From A    4000   500   500       5000
        B    1000  3500   500       5000
        C     600  1800  3600       6000
   Total
   (Dec 2009) 5600  5800  4600     16000

   Transition matrix, T (each count divided by its row total):

                    To
                  A      B      C
   From A    [40/50   5/50   5/50]     [0.8  0.1  0.1]
        B    [10/50  35/50   5/50]  =  [0.2  0.7  0.1]
        C    [ 6/60  18/60  36/60]     [0.1  0.3  0.6]
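The row normalisation in part (a) can be sketched in NumPy as:

```python
import numpy as np

# Brand-switching counts from Question 2 (rows: from A, B, C)
counts = np.array([[4000,  500,  500],
                   [1000, 3500,  500],
                   [ 600, 1800, 3600]])

# Divide each row by its row total to get transition probabilities
T = counts / counts.sum(axis=1, keepdims=True)
print(T)   # rows [0.8 0.1 0.1], [0.2 0.7 0.1], [0.1 0.3 0.6]
```

`keepdims=True` keeps the row totals as a column vector so the division broadcasts row by row.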
b) Market share in Dec 2009 (initial state vector)
   = [5600/16000  5800/16000  4600/16000] = [0.35  0.3625  0.2875]

   Future state vector = (initial state vector) × (Transition matrix)

   Market share at the end of Jan 2010
   = [0.35  0.3625  0.2875] × [0.8  0.1  0.1]
                              [0.2  0.7  0.1]
                              [0.1  0.3  0.6]
   = [0.38125  0.3750  0.24375]

   Thus
   A = 0.38125 = 38.125%;  0.38125 × 16000 = 6100 households
   B = 0.3750  = 37.50%;   0.3750 × 16000 = 6000 households
   C = 0.24375 = 24.375%;  0.24375 × 16000 = 3900 households
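Part (b) can be checked numerically with the same vector-matrix product:

```python
import numpy as np

T = np.array([[0.8, 0.1, 0.1],
              [0.2, 0.7, 0.1],
              [0.1, 0.3, 0.6]])
v_dec = np.array([0.35, 0.3625, 0.2875])   # Dec 2009 shares

v_jan = v_dec @ T          # Jan 2010 shares
households = v_jan * 16000  # shares scaled to the 16,000 households
print(v_jan)        # A = 0.38125, B = 0.375, C = 0.24375
print(households)   # 6100, 6000 and 3900 households
```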
c) Let
   x1 = steady state market share of A
   x2 = steady state market share of B
   1 − x1 − x2 = steady state market share of C
   In the long run,

   [x1  x2  1 − x1 − x2] × [0.8  0.1  0.1]  =  [x1  x2  1 − x1 − x2]
                           [0.2  0.7  0.1]
                           [0.1  0.3  0.6]

   0.8x1 + 0.2x2 + 0.1(1 − x1 − x2) = x1
   −0.3x1 + 0.1x2 = −0.1 ······ (i)

   0.1x1 + 0.7x2 + 0.3(1 − x1 − x2) = x2
   −0.2x1 − 0.6x2 = −0.3 ······ (ii)

   Solving equations (i) and (ii) simultaneously, we get
   x1 = 0.45 = 45%
   x2 = 0.35 = 35%
   x3 = 1 − x1 − x2 = 1 − 0.45 − 0.35 = 0.2 = 20%

   Thus
   A = 0.45 × 16000 = 7200 households
   B = 0.35 × 16000 = 5600 households
   C = 0.2 × 16000 = 3200 households
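The three-state steady state can likewise be obtained by solving a linear system, replacing one balance equation with the normalisation condition; a NumPy sketch:

```python
import numpy as np

T = np.array([[0.8, 0.1, 0.1],
              [0.2, 0.7, 0.1],
              [0.1, 0.3, 0.6]])

# Steady state: x T = x subject to x1 + x2 + x3 = 1.
# Build (T' - I) x' = 0 and overwrite the last row with ones = 1.
n = T.shape[0]
A = T.T - np.eye(n)
A[-1] = np.ones(n)
b = np.zeros(n)
b[-1] = 1.0
x = np.linalg.solve(A, b)
print(x)           # 0.45, 0.35 and 0.20
print(x * 16000)   # 7200, 5600 and 3200 households
```

This generalises to any number of states, whereas eliminating variables by hand, as above, becomes tedious beyond three.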

4. Activities

a) Solve the following simultaneous equations using the matrix inverse method:
   5a − 12b = 4
   4a − 7b = −2

b) Solve the following simultaneous equations using Cramer's rule:
   3x − 2y − z = 2
   −4x + y − z = 3
   2x + z = 1
5. Summary

We can now use the matrix inverse method and Cramer's rule to solve
simultaneous equations. Knowledge of Markov analysis is useful
in modelling various business problems.

6. Self assessment

A, B and C are accounting firms offering auditing and consultancy
services in Mombasa and its environs. The management of firm
A were concerned about the shrinking market share of the firm in the
region and therefore commissioned a survey to monitor the trend.
The results of the survey were as follows:

[Survey results table not reproduced in this extract.]

Required
i) Formulate the above data into a matrix of transition probabilities
ii) Estimate the market share for each of the firms for the year ended 31 Dec 2010
iii) Determine the steady state market share for each of the firms.
