Operations Management Module 3
Productivity
Efficiency
Effectiveness
Markov chains
Productivity
A measure of the effective use of resources, usually expressed as the ratio of output to input.
Productivity
A measure of the quantity of output per unit of input:
Productivity = Amount of Output / Amount of Input
Inputs: man-hours, machine hours, material consumed
Traditionally, labour productivity:
Worker's productivity = Number of units of output / Number of days taken
Group of workers' productivity = Tonnes (or kg) of output / Number of workers
These give only a limited view.
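As a minimal sketch of these traditional ratios (the output counts, days taken, tonnage, and headcount below are hypothetical illustration values, not figures from the module):

```python
# Traditional single-factor labour productivity ratios.
# All figures are hypothetical, for illustration only.

units_of_output = 1200      # number of units of output
days_taken = 30             # number of days taken

workers_productivity = units_of_output / days_taken
print(f"Worker's productivity: {workers_productivity:.1f} units per day")

output_tonnes = 450         # tonnes of output
number_of_workers = 25

group_productivity = output_tonnes / number_of_workers
print(f"Group productivity: {group_productivity:.1f} tonnes per worker")
```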
[Table: year-wise example data for 2010-2011 and 2011-2012; values in the source: 16,000; 20,000; 20,000; 15,000; 2,000; 1,600 (column headings not recoverable)]
Multi factor
Labour Productivity = Gross value added / Average daily employment
Gross value added = Ex-factory price of output - (raw material inputs + fuel + electricity)
Average daily employment = Total attendance of all persons, all shifts, all working days / Number of days worked
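A small sketch of this calculation; every monetary and attendance figure below is hypothetical and only illustrates the two definitions above:

```python
# Labour productivity = gross value added / average daily employment.
# All values below are hypothetical.

ex_factory_value_of_output = 5_000_000
raw_material_inputs = 2_000_000
fuel = 300_000
electricity = 200_000

gross_value_added = ex_factory_value_of_output - (raw_material_inputs + fuel + electricity)

total_attendance = 7_500    # all persons, all shifts, all working days
days_worked = 300

average_daily_employment = total_attendance / days_worked

labour_productivity = gross_value_added / average_daily_employment
print(f"Gross value added: {gross_value_added:,}")
print(f"Average daily employment: {average_daily_employment:.0f}")
print(f"Labour productivity: {labour_productivity:,.0f} per person employed")
```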
Capital Productivity = Output / Capital
where capital includes land, buildings, plant, machinery, etc., but does not include working capital.
Multifactor Measures
Output / Multiple Inputs
Output / (Labor + Machine)
Output / (Labor + Capital + Energy)
Productivity = (180 * 150) / (4000 + 20000 + 12000) = 27,000 / 36,000 = 0.75
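A short sketch of this calculation; the reading of the figures (180 units at a price of 150, against labour, material, and overhead inputs of 4,000, 20,000, and 12,000) is an assumption about what the slide's numbers represent, though the arithmetic is unchanged:

```python
# Multifactor productivity = value of output / sum of the input costs.
# Interpretation of the figures is assumed (see note above).

units = 180
unit_price = 150
output_value = units * unit_price                    # 27,000

labour_input = 4_000
material_input = 20_000
overhead_input = 12_000
total_inputs = labour_input + material_input + overhead_input   # 36,000

multifactor_productivity = output_value / total_inputs
print(f"Multifactor productivity: {multifactor_productivity:.2f}")   # 0.75
```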
Markov Chains
Developed by the Russian mathematician Andrei A. Markov, they are probabilistic models known as stochastic processes.
Applications
Useful in analyzing consumer buying patterns, by examining and predicting the behavior of customers in terms of their brand loyalty
For planning personnel needs
To study stock market movements
etc.
[Diagram: data set → model → parameters]
Markov Process
Markov Property: The state of the system at time t+1 depends only on the state of the system at time t:
Pr[ X_{t+1} = x_{t+1} | X_1 = x_1, ..., X_t = x_t ] = Pr[ X_{t+1} = x_{t+1} | X_t = x_t ]
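A tiny simulation sketch of the Markov property: the function that draws the next state looks only at the current state, never at the earlier history. The two states and their probabilities are illustrative, not from the module.

```python
import random

# Illustrative two-state Markov process.
transition = {
    "A": {"A": 0.7, "B": 0.3},
    "B": {"A": 0.4, "B": 0.6},
}

def next_state(current):
    # Markov property: the draw uses only the current state.
    states = list(transition[current])
    weights = [transition[current][s] for s in states]
    return random.choices(states, weights=weights)[0]

state = "A"
path = [state]
for _ in range(10):
    state = next_state(state)
    path.append(state)
print(" -> ".join(path))
```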
Markov Process
Simple Example
Weather: raining today
Stochastic FSM (state-transition diagram over the states rain and no rain):
rain → rain: 0.4,  rain → no rain: 0.6
no rain → rain: 0.2,  no rain → no rain: 0.8
Markov Process
Simple Example
Weather: raining today
P = [ 0.4  0.6 ]
    [ 0.2  0.8 ]
(state order: rain, no rain)
Stochastic matrix: rows sum up to 1.
Doubly stochastic matrix: rows and columns sum up to 1.
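A brief numpy sketch of this matrix (assuming the state order rain, no rain used above): it checks that each row sums to 1 and propagates the "raining today" distribution forward a few days.

```python
import numpy as np

# Weather transition matrix, state order: [rain, no rain].
P = np.array([[0.4, 0.6],
              [0.2, 0.8]])

# Stochastic matrix: every row sums to 1.
assert np.allclose(P.sum(axis=1), 1.0)

# Raining today: all probability mass on "rain".
dist = np.array([1.0, 0.0])

for day in range(1, 4):
    dist = dist @ P                       # one-step transition
    print(f"Day {day}: P(rain) = {dist[0]:.3f}, P(no rain) = {dist[1]:.3f}")
```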
Markov Process
Coke vs. Pepsi Example
Transition matrix (state order: Coke, Pepsi):
P = [ 0.9  0.1 ]
    [ 0.2  0.8 ]
Coke → Coke: 0.9,  Coke → Pepsi: 0.1
Pepsi → Pepsi: 0.8,  Pepsi → Coke: 0.2
Markov Process
Coke vs. Pepsi Example (cont)
Given that a person is currently a Pepsi purchaser,
what is the probability that he will purchase Coke two
purchases from now?
Pr[ Pepsi → ? → Coke ]
= Pr[ Pepsi → Coke → Coke ] + Pr[ Pepsi → Pepsi → Coke ]
= 0.2 * 0.9 + 0.8 * 0.2
= 0.18 + 0.16 = 0.34
Equivalently, this is the (Pepsi, Coke) entry of the two-step matrix:
P² = [ 0.9  0.1 ]²  =  [ 0.83  0.17 ]
     [ 0.2  0.8 ]      [ 0.34  0.66 ]
Markov Process
Coke vs. Pepsi Example (cont)
Given that a person is currently a Coke purchaser,
what is the probability that he will purchase Pepsi
three purchases from now?
Pr = 0.9 * 0.17 + 0.1 * 0.66 = 0.153 + 0.066 = 0.219
(0.17 and 0.66 are the Coke → Pepsi and Pepsi → Pepsi entries of the two-step matrix P².)
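A quick numpy check of both answers, using matrix powers of the same transition matrix (state order: Coke, Pepsi):

```python
import numpy as np

# Coke vs. Pepsi transition matrix, state order: [Coke, Pepsi].
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

P2 = np.linalg.matrix_power(P, 2)
P3 = np.linalg.matrix_power(P, 3)

# Pepsi now -> Coke two purchases from now: (Pepsi, Coke) entry of P^2.
print(f"Pr[Pepsi -> Coke in 2 purchases] = {P2[1, 0]:.2f}")    # 0.34

# Coke now -> Pepsi three purchases from now: (Coke, Pepsi) entry of P^3.
print(f"Pr[Coke -> Pepsi in 3 purchases] = {P3[0, 1]:.3f}")    # 0.219
```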
Markov Process
Coke vs. Pepsi Example (cont)
P  = [ 0.9  0.1 ]
     [ 0.2  0.8 ]
P³ = [ 0.781  0.219 ]
     [ 0.438  0.562 ]