
Continuous Time Markov Chains

Stochastics

Illés Horváth

2021/11/02



(1) Starting point: Markov Chains

(2) Infinitesimal generator matrix

(3) Another interpretation

(4) Long-term behaviour

(5) Short-term behaviour

(6) Embedded Markov chain

(7) Finite and infinite queues



Continuous time Markov chains

A (discrete time) Markov chain models systems where changes
occur at regular times (daily, monthly, etc.). In many real-life
situations, changes may occur at any time.

We aim to define continuous time Markov chains, where changes
may occur at any point in time, but the system is otherwise similar
to discrete time Markov chains.

There are several ways to define continuous time Markov chains:

(1) Start from a discrete time Markov chain, and randomize the
waiting time between transitions.

(2) Start from a discrete time Markov chain, and consider the
limit when the discrete time unit converges to 0.

(3) Abstract definition via semigroups.

We will take approach (1), but the other approaches are equivalent;
all of them result in the same process.



Starting point: Markov chains

The starting point is a discrete time Markov chain (DTMC).

Instead of the next transition occurring 1 unit of time later, we look
to randomize the time spent at the current state. Let T denote the
time spent at a state before transitioning; what should be the
distribution of T?

We still want the Markov property to hold; that is, the future
should only depend on the current state, not on what has happened
before. In this case, this also includes the time already spent at the
current state. We encountered this property before: this is the
same as saying that T has a memoryless distribution.

The only memoryless continuous distribution is the exponential
distribution, so T must have an exponential distribution. The
parameter may depend on the state i; it will be denoted by λi.



Simulating a continuous time Markov chain

Altogether, this means that a continuous time Markov chain can be
defined by the following information:

a list of states;

a transition probability matrix P;

an initial state vector v(0);

a list of numbers λ1, . . . , λk > 0 that are the parameters of
the exponential waiting time in each state.

Based on the above information, a continuous time Markov chain
can be simulated as follows.

If the Markov chain is in some state i, we generate a random
variable with distribution T ∼ EXP(λi).

After time T has passed, we select the next state randomly
according to row i of the transition probability matrix P,
independently of T (and of the past in general).

The Markov chain transitions to that state, and we iterate this
loop by generating a new T, and so on, as in the sketch below.

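A minimal sketch of this clock-and-die loop in Python, assuming
NumPy is available; P and the rates λi are taken from the example
on the next slide, and the helper name simulate is ours, not from
any library:

```python
import numpy as np

# Clock-and-die simulation of a CTMC (states are 0-indexed here).
P = np.array([[0,   1/3, 2/3],
              [1/5, 0,   4/5],
              [3/4, 0,   1/4]])   # transition probability matrix
lam = np.array([1.0, 4.0, 2.0])   # exponential rate in each state

rng = np.random.default_rng(0)

def simulate(i, t_max):
    """Return the trajectory as a list of (jump time, new state) pairs."""
    t, path = 0.0, [(0.0, i)]
    while t < t_max:
        t += rng.exponential(1 / lam[i])   # clock: waiting time ~ EXP(lam_i)
        i = rng.choice(len(lam), p=P[i])   # die: next state from row i of P
        path.append((t, i))
    return path

print(simulate(0, 10.0))
```
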
Example

We have a discrete time Markov chain with 3 states with
v(0) = ( 1 0 0 ) and transition probability matrix

    P = [  0   1/3  2/3
          1/5   0   4/5
          3/4   0   1/4 ],

and λ1 = 1, λ2 = 4, λ3 = 2. Then for the realization
1, 3, 1, 2, 3, 1, 3, 3, 1, . . . (of the discrete time Markov chain), the
corresponding realization of the continuous time Markov chain
looks like this:

[Figure: step-function trajectory over time t; each visit to state 1
lasts EXP(1) time, each visit to state 2 EXP(4), and each visit to
state 3 EXP(2).]



Infinitesimal generator

We use the previous construction as the definition of a continuous
time Markov chain (CTMC). It satisfies the Markov property.

Let v(t) denote the state vector at time t, that is,

    vi(t) = P(the Markov chain is in state i at time t).

Note that t ∈ [0, ∞) now, not just integers.

Theorem

    v(t) = v(0) e^{Qt},

where the matrix Q is obtained from P − I by multiplying row i by
λi for each i, where I is the identity matrix.

Q is known as the infinitesimal generator, or simply generator, of
the process.



Infinitesimal generator

Proof (sketch). Let

    pij(t) = P(the MC is in state j at time t | started from i at time 0),

which are collected into a matrix P(t). Then P(t) satisfies the
Chapman-Kolmogorov equation (or semigroup equation)

    P(t + s) = P(t)P(s)    ∀ s, t ≥ 0

due to the law of total probability. The solution is

    P(t) = e^{Qt},   where   Q = (d/dt) P(t) |_{t=0}.

Finally, computing (d/dt) P(t) |_{t=0} gives the Q defined in the
theorem.


Example

In the previous example with

    P = [  0   1/3  2/3
          1/5   0   4/5
          3/4   0   1/4 ],

and λ1 = 1, λ2 = 4, λ3 = 2, we have

    P − I = [ −1    1/3   2/3
              1/5   −1    4/5
              3/4    0   −3/4 ]

and

    Q = [ −1    1/3   2/3
          4/5   −4   16/5
          3/2    0   −3/2 ].
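As a quick sketch (assuming NumPy and SciPy are available), the
same Q can be built from P and the rates, and v(t) = v(0)e^{Qt}
evaluated with scipy.linalg.expm:

```python
import numpy as np
from scipy.linalg import expm

P = np.array([[0,   1/3, 2/3],
              [1/5, 0,   4/5],
              [3/4, 0,   1/4]])
lam = np.array([1.0, 4.0, 2.0])

Q = np.diag(lam) @ (P - np.eye(3))   # multiply row i of P - I by lambda_i
v0 = np.array([1.0, 0.0, 0.0])

for t in [0.5, 1.0, 5.0]:
    print(t, v0 @ expm(Q * t))       # state vector v(t) = v(0) e^{Qt}
```
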



Properties of Q

The following properties are valid for any generator:

the diagonal elements of Q are negative or 0;

the other elements of Q are positive or 0; and

each row of Q sums to 0.

Any matrix Q with the above properties is a valid generator.

For a matrix A, e^A can be computed either from the power series
form

    e^A = Σ_{n=0}^{∞} A^n / n! ,

or using the Jordan form.

We will also have some easy-to-compute approximations for e^{Qt}.



Clock and die

According to the previous construction of a continuous time
Markov chain, first the time spent at the current state is generated,
then the next state is chosen.

We are going to refer to this interpretation as clock and die,
since we can imagine it as first setting a clock that goes off after
EXP(λi) time, and then rolling a die according to row i of P to
select the next state.



Competing clocks

Next we present a different but ultimately equivalent
interpretation. It is based on the following lemma.

Lemma

Let X1, . . . , Xk be independent, Xi ∼ EXP(µi), and let

    Y = min(X1, . . . , Xk).

Then Y ∼ EXP(µ1 + · · · + µk) and

    P(Y = Xi) = µi / (µ1 + · · · + µk).

No proof (but it is a simple calculation for k = 2, and we can use
induction for general k).
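A small Monte Carlo check of the lemma, as a sketch assuming
NumPy; the rates µ = (4/5, 16/5) anticipate the two clocks used in
the next example:

```python
import numpy as np

rng = np.random.default_rng(1)
mu = np.array([4/5, 16/5])
n = 200_000

X = rng.exponential(1 / mu, size=(n, len(mu)))  # column j ~ EXP(mu_j)
Y = X.min(axis=1)                               # the winning clock

print("E[Y] =", Y.mean(), "vs 1/(mu1+mu2) =", 1 / mu.sum())
print("P(Y = X1) =", (X.argmin(axis=1) == 0).mean(), "vs", mu[0] / mu.sum())
```
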



Competing clocks

Consider the previous example:

    Q = [ −1    1/3   2/3
          4/5   −4   16/5
          3/2    0   −3/2 ],

and let's say the Markov chain is in state 2.

From state 2, the Markov chain can transition to either state 1 or
state 3 next. We now set 2 random clocks, one for each possible
transition: T2→1 ∼ EXP(4/5) and T2→3 ∼ EXP(16/5), and
assume they are independent.

The clocks are competing: whichever goes off first, that transition
occurs, and the other one does not.



Competing clocks

So what does the lemma say about this situation?

The time when the next transition occurs is

    T = min(T2→1, T2→3).

According to the lemma, T ∼ EXP(16/5 + 4/5), whose parameter
is the same as λ2 = 4.

Also,

    P(T = T2→1) = (4/5) / (4/5 + 16/5) = 1/5,

    P(T = T2→3) = (16/5) / (4/5 + 16/5) = 4/5,

in accordance with the second row of P:

    P = [  0   1/3  2/3
          1/5   0   4/5
          3/4   0   1/4 ].



Competing clocks

The above interpretation, referred to as competing (racing) clocks,
is equivalent to the previous interpretation (clock and die). This is
true in general; a continuous time Markov chain can be interpreted
either way.

Example: consider the number of customers waiting in a queue at
a shop. If currently the number of customers is 2, then it can next
change to either 1 or 3, due to

a customer being served at the front of the queue, or

a new customer entering at the end of the queue.

Whether the length of the queue changes by +1 or −1 next
depends on which event happens first. So in this sense, these two
possible changes are competing.

This concept is applicable to many real-life scenarios: a system
may change due to several competing sources.



Transition rates

Consider the example

    Q = [ −1    1/3   2/3
          4/5   −4   16/5
          3/2    0   −3/2 ]

again. The off-diagonal elements of Q are called transition rates.
They are similar to the rate of a Poisson process in the following
sense. When the Markov chain is in state 2, for example, the
waiting time until a transition to state 1 is EXP(4/5), just like in a
Poisson process, and the waiting time until a transition to state 3
is EXP(16/5).

The negative diagonal elements correspond to the outgoing rate
from each state, that is, the total rate of leaving each state (with a
− sign).

Once a transition happens, the Markov chain is in a new state, and
the possible target states and transition rates are updated.


Poisson process as a Markov chain

A PPP(λ) can be understood as a continuous time Markov chain.
The process N(t) (which denotes the number of arrivals up to time
t) may take the values 0, 1, 2, . . .

The times between the transitions 0 → 1, 1 → 2, etc. are all
EXP(λ), so N(t) is in fact a Markov chain on the infinite state
space 0, 1, 2, . . . with generator

    Q = [ −λ   λ    0   0  . . .
           0  −λ    λ   0  . . .
           0   0   −λ   λ  . . .
          ...                   ].

(Q is an infinite matrix, but let's not worry about that for now.)



Example

In a small money-exchange office, clients arrive according to a
Poisson process with a rate of 0.2 clients per minute. The average
service time of a client is 2 minutes. Apart from the client
currently served, at most 2 others can stay in the office; when a
client arrives with the office full, he goes on without entering.

Model this situation with a Markov chain! The states are 0, 1, 2
according to the number of clients in the office.

From state 0, the only possible change is that a client arrives. This
has rate 0.2, so the first row of Q is

    ( −0.2  0.2  0 ).

Note that q13 = 0 because in continuous time, two clients cannot
arrive at exactly the same time.



Example

From state 1, two events may happen next: either a new client
arrives and stands in the queue, which has rate 0.2 as before, or
the current client is served.

The only information given about the service time of a client is
that it is 2 minutes on average. Let's assume that the service time
has an exponential distribution! Then, in order to have a mean
equal to 2 minutes, the parameter must be 0.5.

Is this assumption justified? We don't really know. However, the
Markov property only holds if the service time is exponential, so if
no further information is given about the service time distribution,
we may as well make this assumption.

If the service time is not exponentially distributed, the Markov
property does not hold, so the process is not a Markov chain and
requires more complicated tools to analyze.



Example

Assuming that the service time is exponentially distributed, the
process is indeed a Markov chain. Let's figure out Q!

The transitions 0 → 1 and 1 → 2 correspond to arrivals, so their
rate is 0.2.

The transitions 1 → 0 and 2 → 1 correspond to service, so their
rate is 0.5.

The transitions 0 → 2 and 2 → 0 are not possible. If 2 customers
were to arrive, one of them would arrive a little bit earlier, and
there would be a short period of time when there is 1 customer
inside.

The diagonal elements are filled in last: they are negative, such
that each row sums to 0. Altogether, the generator is

    Q = [ −0.2   0.2    0
           0.5  −0.7   0.2
            0    0.5  −0.5 ].



Structural properties and stationary distribution

Irreducibility for continuous time Markov chains is defined in the
same way as for discrete time Markov chains.

For simplicity, we are going to examine only irreducible continuous
time Markov chains in this course.

There is no periodicity for continuous time Markov chains;
basically, all continuous time Markov chains are automatically
aperiodic.

The definition of the stationary vector is slightly different
compared to DTMCs: vst = (x1 . . . xk) is stationary if

    x1 + · · · + xk = 1,
    vst Q = 0,

where 0 denotes the constant 0 vector.

    v(0) = vst  =⇒  v(t) = vst  ∀ t > 0.
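In practice, vst can be found by solving the linear system
vst Q = 0 together with the normalization constraint. A sketch
assuming NumPy, using the generator of the money-exchange
example:

```python
import numpy as np

Q = np.array([[-0.2,  0.2,  0.0],
              [ 0.5, -0.7,  0.2],
              [ 0.0,  0.5, -0.5]])

# vst Q = 0 is Q^T vst^T = 0; replace one (redundant) equation
# with the normalization x0 + x1 + x2 = 1.
A = Q.T.copy()
A[-1, :] = 1.0
b = np.array([0.0, 0.0, 1.0])

vst = np.linalg.solve(A, b)
print(vst)   # stationary distribution of the office example
```
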


Long-term behaviour

Theorem (Main)

(a) There is always at least one stationary vector for any (finite
state) continuous time Markov chain.

(b) If the Markov chain is irreducible, then vst is unique, its
elements are strictly positive, and

    lim_{t→∞} v(t) = vst

for any initial vector v0.

(c) The long-term average ratio of time spent at state i is xi,
where vst = (x1 . . . xk).


Example

The convergence in lim_{t→∞} v(t) = vst is very fast, similar to
DTMCs. For the earlier example

    Q = [ −1    1/3   2/3
          4/5   −4   16/5
          3/2    0   −3/2 ]

and v0 = ( 1 0 0 ), we plot v(t) (3 colored curves) and vst (gray
horizontal lines):

[Figure: v1(t), v2(t) and v3(t) plotted for t ∈ [0, 5]; each curve
converges quickly to its gray stationary level.]


Long-term behaviour

Theorem (Ergodic theorem)

Denote the realization of a Markov chain by X(t), t ≥ 0.

If the Markov chain is irreducible, then for any function f given on
the states,

    lim_{T→∞} (1/T) ∫_0^T f(X(t)) dt = Est(f),

where

    Est(f) = x1 f(1) + · · · + xk f(k),   with  vst = (x1 . . . xk).



Approximations for e^{Qt}

Following from the main theorem, the long-term approximation for
e^{Qt} is

    e^{Qt} ≈ [ vst
               ...
               vst ]   for large t

(a matrix whose every row is vst). The short-term approximation is

    e^{Qt} ≈ I + tQ   for small t.

(First-order approximation of the exponential function.)

The short-term approximation has an interesting interpretation:
during a short time (when t is small), there might be

no transition, with probability close to 1;

1 transition, with probability proportional to t; and

the probability of 2 or more transitions is negligible.
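Both approximations are easy to check numerically. A sketch
assuming NumPy and SciPy, using the running example:

```python
import numpy as np
from scipy.linalg import expm

Q = np.array([[-1,  1/3,  2/3],
              [4/5, -4,  16/5],
              [3/2,  0, -3/2]])

# Stationary vector, as before: vst Q = 0, entries summing to 1.
A = Q.T.copy(); A[-1] = 1.0
vst = np.linalg.solve(A, [0, 0, 1])

t = 0.01   # short term: e^{Qt} is close to I + tQ
print(np.abs(expm(Q * t) - (np.eye(3) + t * Q)).max())

t = 20.0   # long term: every row of e^{Qt} is close to vst
print(np.abs(expm(Q * t) - np.tile(vst, (3, 1))).max())
```
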



The embedded Markov chain

For a CTMC, the embedded Markov chain is the sequence of states
that the Markov chain visits. It is a discrete time Markov chain.

For the running example, with the realization shown earlier, the
corresponding realization of the embedded Markov chain is

    1, 3, 1, 2, 3, 1, 3, 1, . . .



The embedded Markov chain

Let's compute the transition probability matrix P′ of the
embedded Markov chain. Consider the example

    Q = [ −1    1/3   2/3
          4/5   −4   16/5
          3/2    0   −3/2 ]

again. From state 1, the possible transitions are 1 → 2 and 1 → 3,
and

    P(next state is 2 | current state is 1) = (1/3) / (1/3 + 2/3) = 1/3,

    P(next state is 3 | current state is 1) = (2/3) / (1/3 + 2/3) = 2/3.

Based on this, the first row of P′ is

    ( 0  1/3  2/3 ).



The embedded Markov chain

For the second row of P′, we have

    P(next state is 1 | current state is 2) = (4/5) / (4/5 + 16/5) = 1/5,

    P(next state is 3 | current state is 2) = (16/5) / (4/5 + 16/5) = 4/5.

Note that the denominator 4/5 + 16/5 is in fact equal to |q22| = 4,
since each row of Q sums to 0.

So in fact the above calculations can be summarized as follows:
divide row i of Q by |qii|, then add the identity matrix I.
Altogether, we get

    P′ = [  0   1/3  2/3
           1/5   0   4/5
            1    0    0  ].
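The summarized recipe is one line of linear algebra. A sketch
assuming NumPy:

```python
import numpy as np

Q = np.array([[-1,  1/3,  2/3],
              [4/5, -4,  16/5],
              [3/2,  0, -3/2]])

# P' = I + diag(1/|q_ii|) Q: divide row i of Q by |q_ii|, then add I.
P_emb = np.eye(3) + Q / np.abs(np.diag(Q))[:, None]
print(P_emb)
```
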



The embedded Markov chain

The above operation is essentially reverting the original
P → P − I → Q calculation. So do we get back the same matrix
as the original P?

    P′ = [  0   1/3  2/3          P = [  0   1/3  2/3
           1/5   0   4/5                1/5   0   4/5
            1    0    0  ]              3/4   0   1/4 ].

No, not the same! The first two rows are the same, but the third
row is different.

The reason for this is that for the original DTMC, the transition
3 → 3 has positive probability. For the embedded Markov chain,
the next state is the first state actually different from the current
state. Accordingly, consecutive repeated states are considered as a
single step in the embedded Markov chain.



The embedded Markov chain

So if the realization for P is

    1, 3, 1, 2, 3, 1, 3, 3, 1, . . . ,

and the realization for Q is the trajectory shown earlier (with
EXP(λi)-distributed sojourn times in each state), then the
corresponding realization for P′ is

    1, 3, 1, 2, 3, 1, 3, 1, . . .



Stationary and embedded stationary distribution

Let vst = (x1 x2 . . . xk) denote the stationary distribution of an
irreducible CTMC with generator Q, and let ust = (y1 y2 . . . yk)
denote the stationary distribution of the embedded Markov chain
with transition probability matrix P′. Also, let λi = |qii| denote
the outgoing rates. Then

Lemma

    xi = (yi/λi) / Σj (yj/λj),        yi = (xi λi) / Σj (xj λj).



Stationary and embedded stationary distribution

Proof. yi is the average ratio of the number of intervals spent in
state i from among all intervals. However, intervals have different
lengths.

[Figure: the trajectory of the running example, with each sojourn
interval labeled by its average length 1/λi.]

xi is the ratio of time spent in state i, which is thus weighted by
the average interval lengths, 1/λi for state i. This gives

    xi = (yi/λi) / Σj (yj/λj).

The other formula is equivalent.
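A numerical sanity check of the lemma for the running example, as
a sketch assuming NumPy:

```python
import numpy as np

Q = np.array([[-1,  1/3,  2/3],
              [4/5, -4,  16/5],
              [3/2,  0, -3/2]])
lam = -np.diag(Q)                       # outgoing rates |q_ii|

# CTMC stationary distribution: vst Q = 0, normalized.
A = Q.T.copy(); A[-1] = 1.0
x = np.linalg.solve(A, [0, 0, 1])

# Embedded stationary distribution: ust P' = ust, normalized.
P_emb = np.eye(3) + Q / lam[:, None]
B = P_emb.T - np.eye(3); B[-1] = 1.0
y = np.linalg.solve(B, [0, 0, 1])

print(x, (y / lam) / (y / lam).sum())   # the two rows should agree
```
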


Birth-death processes

Next we examine a class of Markov chains with a special structure.
When a Markov chain has states 0, 1, 2, . . . such that transitions
are only possible between states with a difference of 1, it is called
a birth-death process or a Markov queue.

[Diagram: states 0, 1, 2, 3, 4, . . . in a row, with rates λ0, λ1, λ2, λ3
pointing right and µ1, µ2, µ3, µ4 pointing left.]

The number of states can be either finite or infinite. Usually λi
denotes the transition rate for i → i + 1 and µi denotes the
transition rate for i → i − 1.



Birth-death processes

It is called a birth-death process because it can model the change
in a population. Going up by 1 corresponds to the birth of a
member of the population, and going down by 1 corresponds to the
death of a member.

It is called a Markov queue because it can model the change in the
length of a queue (see the earlier example with the money-exchange
office). In this setup, going up by 1 corresponds to the arrival of a
new customer/job to the server, and going down by 1 corresponds
to finishing the service of a customer/job. It is called Markov
because when both the service times and interarrival times have
exponential distributions, this is a continuous time Markov chain.



Birth-death processes

Any birth-death process is irreducible; if it has finitely many
states, then we know that there is a unique stationary distribution,
and v(t) → vst as t → ∞ for any v(0).

If the birth-death process has infinitely many states, then we don't
even know a priori whether vst exists, or whether there are several
different stationary distributions.

However, the following theorem is valid for infinite birth-death
processes too.

Theorem (Balance equations)

For any (finite or infinite state) birth-death process, if
(x0 x1 x2 . . . ) is a stationary distribution, then

    xi λi = xi+1 µi+1.



Birth-death processes

Proof. A stationary distribution corresponds to a type of dynamic
balance; in the long run, the rate of leaving any subset of states
must be the same as the rate of entering that same subset.

For a birth-death process, the only transition that leaves the
subset {0, . . . , i} is i → i + 1, and the only transition entering
this subset is i + 1 → i.

When the Markov chain is in state i, it transitions to state i + 1
with rate λi. However, it is in state i only in an xi fraction of the
time, so the long-term average rate of i → i + 1 transitions is
xi λi.

Due to the dynamic balance, this must be equal to the long-term
average rate of i + 1 → i transitions, which is xi+1 µi+1.

[Diagram: the cut between states i and i + 1; the rate upwards
across the cut is xi λi and the rate downwards is xi+1 µi+1.]


M/M/1/K queue

A special type of birth-death process is the M/M/1/K queue. It
models a single server with a buffer that can hold K jobs (including
the job in the server). The server is serving the first job in the
queue, while the other jobs are waiting. When the first job has
finished service, it leaves the system and the server immediately
starts serving the next job in the queue. New jobs arrive and enter
the end of the queue. When the queue is full, incoming jobs are
discarded without entering the queue.

Arrivals are according to a PPP with rate λ; the service time is
EXP(µ).



M/M/1/K queue

[Diagram: states 0, 1, 2, 3, 4, . . . in a row, with rate λ to the right
and rate µ to the left between neighbouring states.]

Another type of depiction:

[Diagram: jobs arriving at rate λ into a buffer, served at rate µ.]

M/M/1/K stands for Markovian arrival, Markovian service, 1
server and buffer size K. This is known as Kendall's notation.



M/M/1 queue

The M/M/1 queue is identical to the M/M/1/K queue except that
it has an infinite buffer.

[Diagram: jobs arriving at rate λ into an infinite buffer, served at
rate µ.]

Let's compute vst = (x0 x1 . . . ). From the balance equations,

    xi+1 = (λ/µ) · xi,

so

    x1 = (λ/µ) x0,    x2 = (λ/µ) x1 = (λ/µ)^2 x0,

and so on. In general,

    xn = (λ/µ)^n x0.

M/M/1 queue

We want to find x0. We also know

    1 = x0 + x1 + x2 + · · · = x0 Σ_{n=0}^{∞} (λ/µ)^n.

If λ/µ < 1, the sum is finite:

    Σ_{n=0}^{∞} (λ/µ)^n = 1 / (1 − λ/µ),

and the solution of the balance equations is

    xn = (λ/µ)^n · (1 − λ/µ).

In this case, we say that the queue is stable; there is a unique vst,
and v(t) → vst as t → ∞ also holds for any v(0). (No proof.)
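A sketch of the stable case, assuming NumPy, with λ = 0.2 and
µ = 0.5 borrowed from the money-exchange example; the
truncation at 50 states is only for display:

```python
import numpy as np

lam, mu = 0.2, 0.5
rho = lam / mu                    # system load, < 1: stable queue

n = np.arange(50)                 # truncate the infinite state space
x = (1 - rho) * rho**n            # x_n = (lambda/mu)^n (1 - lambda/mu)

print(x[:5].round(4), "sum ≈", x.sum())
print("mean queue length:", (n * x).sum(), "vs rho/(1-rho) =", rho / (1 - rho))
```
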



M/M/1 queue

However, when λ/µ ≥ 1,

    Σ_{n=0}^{∞} (λ/µ)^n = ∞.

In this case, x0 would be 1/∞ = 0, but then 0 = x0 = x1 = . . . ,
so x0 + x1 + · · · = 1 does not hold.

Altogether, if λ ≥ µ, there is no solution to the balance equations,
and a stationary distribution does not exist.

Intuitively, the following is happening: when λ > µ, the arrival
rate is larger than the service rate, so in the long run, jobs keep
accumulating in the queue, and the number of jobs in the queue
goes to infinity, so it does not converge to any (stationary)
distribution.



M/M/1 queue

The λ/µ < 1 case is known as a stable queue. It behaves
essentially just as nicely as any finite Markov chain, with a unique
vst, and v(t) → vst also holds.

The λ/µ = 1 case is known as a critical queue. There is no vst,
and the queue goes up and down randomly. It will actually visit
every state infinitely often.

The λ/µ > 1 case is known as an unstable queue. There is no
vst, and the queue goes to infinity in the long run.

    ρ = λ/µ

is known as the system load; the condition of stability of an
M/M/1 queue is thus ρ < 1.



Outlook: Other queues

M/M/c and M/M/c/K denote similar queues, but with c servers
instead of just 1. Each server always serves a job (if there are any
in the queue), and starts serving the next job immediately upon
finishing. The condition of stability for an M/M/c queue is

    λ / (cµ) < 1.

G/M/1 denotes a queue where the interarrival time distribution is
not EXP. In this case, the queue is not a Markov chain, and other
tools are required to analyze it. G stands for general (distribution).

M/G/1 denotes a queue where the service time distribution is not
EXP. Once again, this is not a Markov chain.

For G/G/1 queues, both the interarrival time and the service time
have a general distribution.


Problem 1

A machine works for a random time with distribution EXP(0.1) (in
hours) before it fails. When it fails, repair starts immediately, and
lasts for a random time with distribution EXP(0.9) (in hours),
independently of the length of the working period.

(a) Argue that this system is a continuous time Markov chain.
Determine the generator.

(b) Calculate the stationary distribution. In the long run, what is
the ratio of time spent on repair?

(c) As long as the machine is operating, it produces an income of
60 dollars per hour. The fee of the repairman is 30 dollars per
hour. What is the long-term average net profit per hour?

(d) Determine the transition matrix of the embedded Markov
chain.



Problem 1

Solution.

The states are 1 (working) and 2 (in repair). The only possible
transitions are 1 → 2 and 2 → 1, and the waiting time in each
state is exponentially distributed, independent of the past, so the
memoryless property holds and this is a Markov chain. It is
irreducible.

    Q = [ −0.1   0.1
           0.9  −0.9 ].

Let vst = (x1 x2); then

    −0.1 x1 + 0.9 x2 = 0
     0.1 x1 − 0.9 x2 = 0
     x1 + x2 = 1,

whose solution is

    vst = (x1 x2) = ( 9/10  1/10 ).



Problem 1

The long-term average ratio of time spent in repair is x2 = 1/10.

According to the ergodic theorem, the long-term average net profit
is

    x1 · 60 + x2 · (−30) = (9/10) · 60 + (1/10) · (−30) = 51

dollars per hour.

The transition matrix of the embedded Markov chain (not very
interesting in this case):

    P′ = [ 0  1
           1  0 ].


Problem 2

In a bank, clients are served at 2 windows. In the client area, at
most 5 clients may be present at the same time (including the ones
being served). When the client area is full, the security guard turns
away further clients without service. A client arrives on average
every 5 minutes. Serving a client takes 8 minutes on average.
When a client is served, the next client in line goes to the window
and service starts immediately. If a client arrives when the client
area is empty, he will pick a window at random.


Problem 2

(a) Model this process with a CTMC. Calculate the generator.

(b) Calculate the stationary distribution.

(c) What is the probability that at a random time, 3 clients are in
the bank?

(d) What is the long term average number of clients?

(e) In the long run, what is the ratio of potential clients that are
turned away due to a full client area?

(f) What is the long term average ratio of time when the
administrator at the first window is idle?


Problem 2
Solution.
(a) The states are 0, 1, 2, 3, 4, 5 according to the number of
clients inside. Assuming that the arrivals are according to a
Poisson process and the service time is EXP(1/8) independent
from the past, this is a CTMC. Using Kendall notation, this is
actually an M/M/2/5 queue.

Note that when both administrators are working, the service
rate is 1/8 + 1/8 = 2/8; each of them will finish with rate 1/8
independently, so the rate at which either of them finishes is
1/8 + 1/8.

     ( −1/5    1/5     0       0       0      0
       1/8   −13/40   1/5      0       0      0
Q =     0     2/8   −18/40    1/5      0      0
        0      0      2/8   −18/40    1/5     0
        0      0       0     2/8    −18/40   1/5
        0      0       0      0      2/8    −2/8 ).
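
Generators of this birth-death type are easy to build programmatically.
A minimal sketch, where mmck_generator is a hypothetical helper name
(the state is the number of clients in the system):

import numpy as np

def mmck_generator(lam, mu, c, K):
    # M/M/c/K queue: states 0..K, arrival rate lam, per-server rate mu.
    Q = np.zeros((K + 1, K + 1))
    for n in range(K):
        Q[n, n + 1] = lam                  # an arrival: n -> n+1
        Q[n + 1, n] = min(n + 1, c) * mu   # a departure: min(n+1, c) busy servers
    np.fill_diagonal(Q, -Q.sum(axis=1))    # each row of a generator sums to 0
    return Q

print(mmck_generator(lam=1/5, mu=1/8, c=2, K=5))   # the matrix above
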
Problem 2
(b) Calculate the stationary distribution.

Solution. This is a birth-death process, so the balance
equations hold for the stationary distribution:

(1/8)x1 = (1/5)x0 ,  (2/8)x2 = (1/5)x1 ,  (2/8)x3 = (1/5)x2 ,
(2/8)x4 = (1/5)x3 ,  (2/8)x5 = (1/5)x4 ,  x0 + x1 + · · · + x5 = 1,

from which

x0 (1 + 8/5 + 8/5 · 4/5 + 8/5 · (4/5)^2 + 8/5 · (4/5)^3 + 8/5 · (4/5)^4) = 1

and

vst = (0.157 0.251 0.201 0.160 0.128 0.103).
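
The balance equations can also be unrolled into the usual product form
x_n ∝ ∏ (rate up)/(rate down); a short sketch with the rates of the
M/M/2/5 queue above:

import numpy as np

up   = [1/5] * 5                    # arrival rate in states 0..4
down = [1/8, 2/8, 2/8, 2/8, 2/8]    # service rate in states 1..5
w = np.cumprod([1.0] + [u / d for u, d in zip(up, down)])
vst = w / w.sum()
print(vst.round(3))   # ≈ [0.157 0.251 0.201 0.161 0.128 0.103]
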


Problem 2

(c) What is the probability that at a random time, 3 clients are in
the bank?

P(3 clients are in the bank at a random time) = x3 = 0.160.

(d) What is the long term average number of clients?

According to the ergodic theorem, the long-term average
number of clients in the bank is

0 · x0 + 1 · x1 + 2 · x2 + 3 · x3 + 4 · x4 + 5 · x5 = 2.161.


Problem 2
(e) In the long run, what is the ratio of potential clients that are
turned away due to a full client area?

Solution. Clients are turned away when the bank is full, that
is, when the CTMC is in state 5. This is a ratio x5 = 0.103 of
the total time. This means that the ratio of clients arriving when
the CTMC is in state 5 is also 0.103, since clients arrive
independently of the state.

(f) What is the long term average ratio of time when the
administrator at the first window is idle?

Solution. In state 0, both administrators are idle. In state 1,
one administrator is working, and it is the administrator at the
first window in half that time, so overall, the long term average
ratio of time when the administrator at window 1 is idle is

x0 + 1/2 x1 = 0.157 + 1/2 · 0.251 = 0.282.


Problem 4
Tivadar is a freelance programmer. He takes two types of jobs,
both of which have random length. A type A job takes on average
1 month, a type B job takes on average 2 months. When Tivadar
finishes a job, he has to wait on average 2/3 months before he is
offered a type A job, and he has to wait on average 1 month before
he is offered a type B job. He takes whichever comes first.

(a) Model the process with a CTMC. What are the states?
Calculate the infinitesimal generator.

(b) Right now, Tivadar is out of a job. What is the probability
that he will be offered a type A job within 2 days?

(c) Right now, Tivadar is out of a job. What is the approximate
probability that he will be doing a type A job 2 days from
now? (You may assume a month has 30 days.)

(d) Calculate the stationary distribution. What is the ratio of time
Tivadar spends with jobs of type A in the long run?


Problem 4

(e) Tivadar has a contract fee of 20000 HUF/day for type A jobs
and 50000 HUF/day for type B jobs. Calculate his long term
average income per day.

(f) Calculate the transition matrix of the embedded Markov chain.

(g) In the long run, from among all jobs he takes, what is the ratio
of type A jobs?

(h) Assume that the distribution of the length of a type B job is
NOT exponential, but instead each type B job consists of two
IID parts which are both exponential. How can this be
modelled by a CTMC? Calculate the generator and the
stationary distribution for this new process. Describe the
difference between this process and the original.

(i) He decides not to take type A jobs anymore. Calculate his
long term average income per day after that.


Problem 4
Solution.

(a) Model the process with a CTMC. What are the states?
Calculate the infinitesimal generator.

Solution. The states are 0, A, B (0 means he has no job at the
time). The possible transitions are
0 → A, 0 → B, A → 0, B → 0. Assuming that the waiting
time between jobs is exponentially distributed and the length
of each job is also exponentially distributed, this is a CTMC.

The arrival rate of type A jobs is 3/2, and the arrival rate of
type B jobs is 1.

The infinitesimal generator is

     ( −5/2   3/2     1
Q =     1     −1      0
       1/2     0    −1/2 ).


Problem 4

(b) Right now, Tivadar is out of a job. What is the probability
that he will be offered a type A job within 2 days?

Solution. Let TA denote the waiting time before the first type A
job offer. Then TA ∼ EXP(3/2), and

P(TA < 2/30) = 1 − e^(−2/30 · 3/2) ≈ 0.095.


Problem 4

(c) Right now, Tivadar is out of a job. What is the approximate
probability that he will be doing a type A job 2 days from
now? (You may assume a month has 30 days.)

Solution. We use the short-term approximation of the state
probability vector:

v(2/30) = v(0) e^(Q · 2/30) ≈ v(0)(I + Q · 2/30)

        = ( 1 0 0 ) · ( I + Q · 2/30 ) = ( 5/6  1/10  1/15 ),

so

P(he will be doing a type A job 2 days from now) ≈ 1/10 = 0.1.
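
The first-order approximation can be compared with the exact matrix
exponential; a small sketch (the exact values in the comment are
approximate):

import numpy as np
from scipy.linalg import expm

Q = np.array([[-2.5,  1.5,  1.0],
              [ 1.0, -1.0,  0.0],
              [ 0.5,  0.0, -0.5]])
v0 = np.array([1.0, 0.0, 0.0])
t = 2 / 30

print(v0 @ (np.eye(3) + Q * t))   # first order: [0.8333 0.1    0.0667]
print(v0 @ expm(Q * t))           # exact:     ≈ [0.850  0.089  0.060]
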


Problem 4
(d) Calculate the stationary distribution. What is the ratio of time
Tivadar spends with jobs of type A in the long run?

Solution. Let vst = (x1 x2 x3); then

−5/2 x1 + x2 + 1/2 x3 = 0
 3/2 x1 − x2 = 0
 x1 − 1/2 x3 = 0,
 x1 + x2 + x3 = 1.

The solution is

vst = (x1 x2 x3) = ( 2/9  3/9  4/9 ).

The long term average ratio of time Tivadar spends with type
A jobs is x2 = 3/9.
Problem 4

(e) Tivadar has a contract fee of 20000 HUF/day for type A jobs
and 50000 HUF/day for type B jobs. Calculate his long term
average income per day.

Solution. According to the ergodic theorem, the long-term
average income is

0 · x1 + 20000 · x2 + 50000 · x3 = 0 · 2/9 + 20000 · 3/9 + 50000 · 4/9 ≈ 28889

HUF per day.


Problem 4

(f) Calculate the transition matrix of the embedded Markov chain.

Solution. For the transition probability matrix of the
embedded Markov chain, we need to divide row i of Q by |qii|
for each i, then add I:

( −5/2  3/2    1  )    ( −1  3/5  2/5 )    ( 0  3/5  2/5 )
(   1   −1     0  ) → (  1  −1    0  ) → ( 1   0    0  ),
(  1/2   0   −1/2 )    (  1   0   −1  )    ( 1   0    0  )

so

     ( 0  3/5  2/5
P =    1   0    0
       1   0    0 ).
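
The same recipe in code: a minimal sketch computing P = I + D⁻¹Q,
where D is the diagonal matrix of the |qii|:

import numpy as np

Q = np.array([[-2.5,  1.5,  1.0],
              [ 1.0, -1.0,  0.0],
              [ 0.5,  0.0, -0.5]])
D_inv = np.diag(1.0 / np.abs(np.diag(Q)))   # divide row i by |q_ii| ...
P = np.eye(3) + D_inv @ Q                   # ... then add I
print(P)   # rows: [0 0.6 0.4], [1 0 0], [1 0 0]
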


Problem 4
(g) In the long run, from among all jobs he takes, what is the ratio
of type A jobs?

Solution. Let ust = (y1 y2 y3) be the stationary distribution of
the embedded Markov chain. Then from

ust · P = ust ,  y1 + y2 + y3 = 1

we have

y1 = y2 + y3 ,  3/5 y1 = y2 ,  2/5 y1 = y3 ,  y1 + y2 + y3 = 1,

whose solution is

ust = (y1 y2 y3) = ( 1/2  3/10  2/10 ).

y2 and y3 correspond to jobs, so from among all jobs he takes,
the ratio of type A jobs is

(3/10) / (3/10 + 2/10) = 3/5.


Problem 4

(h) Assume that the distribution of the length of a type B job is
NOT exponential, but instead each type B job consists of two
IID parts which are both exponential. How can this be
modelled by a CTMC? Calculate the generator and the
stationary distribution for this new process. Describe the
difference between this process and the original.

Solution. Now the states are 0, A, B1, B2, and

     ( −5/2   3/2    1    0
        1     −1     0    0
Q =     0      0    −1    1
        1      0     0   −1 ),

since each half of a type B job now has expectation 1 month,
so each half lasts for an EXP(1) time.


Problem 4
(h) Computing the stationary distribution for this Q gives

vst = ( 2/9  3/9  2/9  2/9 ),

while the original stationary distribution was

vst = ( 2/9  3/9  4/9 ).

Are the two processes the same?

Even though the stationary distributions (and thus the long-term
behaviour) are consistent, the two processes are still different.

The time spent at state B has different distributions in the
two processes. For the original process, it is EXP(1/2), but for
the second process, it is Erlang(2,1), which has the same mean
(this is why the stationary distributions are consistent) but
a smaller variance.



Problem 4
(h) Outlook: essentially, by replacing a state with a mini-Markov
chain on 2 states, the distribution of the waiting time spent in
state B was modified. In general, the waiting time can be
modified to approximate any distribution by using a more
complicated mini-Markov chain. See also phase-type
distributions and stochastic modelling.

(i) He decides not to take type A jobs anymore. Calculate his
long term average income per day after that.

Solution. Now the states are just 0 and B. The offer rate of
type B jobs is 1, and a type B job lasts on average 2 months,
i.e. EXP(1/2), so

Q = ( −1     1
      1/2  −1/2 ),

vst = ( 1/3  2/3 ),

and, according to the ergodic theorem, his long term average
daily income is

1/3 · 0 + 2/3 · 50000 ≈ 33333 (HUF per day).
Problem 5

Let X(t) be a CTMC on state space {1, 2} and Y(t) be a CTMC
on state space {a, b, c} with generators

QX = ( −1/2   1/2
         1     −1 ),

QY = ( −2    1    1
        1   −1    0
        3    0   −3 ).

X(t) and Y(t) are running independently, parallel to each other.
Let Z(t) = (X(t), Y(t)). Argue that Z(t) is a CTMC. Calculate
the states and generator matrix of Z(t). Calculate the stationary
distributions of X(t) and Y(t) and compare them to the stationary
distribution of Z(t).


Problem 5
Solution. The possible states are 1a, 1b, 1c, 2a, 2b, 2c. From the
state 1a, the possible transitions are:

1a→1b. This happens when an a→b transition occurs in Y(t), so
it has rate 1.

1a→1c, also with rate 1.

1a→2a. This happens when X(t) makes a 1→2 transition, so it
has rate 1/2.

Altogether, QZ is

       1a     1b     1c     2a     2b     2c
1a   −5/2     1      1     1/2     0      0
1b     1    −3/2     0      0     1/2     0
1c     3      0    −7/2     0      0     1/2
2a     1      0      0     −3      1      1
2b     0      1      0      1     −2      0
2c     0      0      1      3      0     −4


Problem 5

The structure of QZ can be understood as follows. The transitions
corresponding to transitions of Y(t) sit in the two diagonal 3×3
blocks, while the transitions due to X(t) sit in the off-diagonal
blocks:

       1a    1b    1c    2a    2b    2c
1a     ∗     1     1    1/2    0     0
1b     1     ∗     0     0    1/2    0
1c     3     0     ∗     0     0    1/2
2a     1     0     0     ∗     1     1
2b     0     1     0     1     ∗     0
2c     0     0     1     3     0     ∗

(∗'s mark the negative diagonal elements.)

We can see that the diagonal blocks are essentially two copies of QY.
The exact structure can be understood with Kronecker-product
notation.


Problem 5
The Kronecker-product of an n×n matrix A and an m×m matrix
B is the mn × mn matrix with block structure

          ( a11 B   a12 B   . . .   a1n B
A ⊗ B =     a21 B   a22 B   . . .   a2n B
             ...     ...             ...
            an1 B   an2 B   . . .   ann B ).

With this notation, the part of QZ coming from Y(t) is actually

           ( ∗  1  1  0  0  0
             1  ∗  0  0  0  0
             3  0  ∗  0  0  0
I2 ⊗ QY =    0  0  0  ∗  1  1
             0  0  0  1  ∗  0
             0  0  0  3  0  ∗ ).


Problem 5

Similarly, the transitions corresponding to X(t) are

            ( ∗    0    0   1/2    0    0
              0    ∗    0    0    1/2   0
QX ⊗ I3 =     0    0    ∗    0     0   1/2
              1    0    0    ∗     0    0
              0    1    0    0     ∗    0
              0    0    1    0     0    ∗ ),

and

QZ = I2 ⊗ QY + QX ⊗ I3 .

This is the equation that describes the joint transition rate matrix
of two CTMC's running simultaneously, independently from each
other. It is also known as the Kronecker-sum of QX and QY, and is
denoted by QX ⊕ QY.
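
NumPy's kron makes this concrete; a small sketch building QZ as the
Kronecker-sum of QX and QY:

import numpy as np

QX = np.array([[-0.5,  0.5],
               [ 1.0, -1.0]])
QY = np.array([[-2.0,  1.0,  1.0],
               [ 1.0, -1.0,  0.0],
               [ 3.0,  0.0, -3.0]])

# Generator of the two chains running independently in parallel.
QZ = np.kron(np.eye(2), QY) + np.kron(QX, np.eye(3))
print(QZ)   # the 6×6 generator above, states ordered 1a, 1b, 1c, 2a, 2b, 2c
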


Problem 6

On a telecommunications channel, three independent data streams
are present. Each stream has two states: ON and OFF. A stream in
ON state has speed 1 Mb/s; in OFF, 0 Mb/s. Each stream changes
from OFF to ON with rate λ and from ON to OFF with rate µ,
independently from everything else. Let Xt denote the total speed
of the three data streams at time t. Argue that the Markov
property holds for Xt and calculate the stationary distribution.


Problem 6
Solution. The states are 0, 1, 2 and 3 according to the total speed
of the three data streams.

All streams are memoryless (since the time spent in each state is
EXP, independent of the past), and the streams are identical, so
when there is e.g. 1 stream turned ON, it doesn't matter for the
evolution of the process which one it is.

In state 0, all streams are OFF, and the only change that can occur
is 0 → 1, when one of the streams turns ON. The rate of any single
stream turning ON is λ, so the total rate of 0 → 1 is 3λ. Similarly,
the rate of 1 → 0 is µ and the rate of 1 → 2 is 2λ, and so on;

     ( −3λ     3λ       0       0
Q =     µ   −µ − 2λ    2λ       0
        0     2µ    −2µ − λ     λ
        0      0       3µ     −3µ ).


Problem 6

This is an example where the Kronecker-product of the three 2×2
CTMC's corresponding to each stream would give a CTMC on a
state space of size 2^3 = 8, where the 8 states would include all
information about which streams are ON and which are OFF.

The states are then grouped together into 4 states according to the
number of ON streams.

Normally, when grouping together some of the states of a Markov
chain, the end result is not a Markov chain, but in this case, all
streams are identical, so it doesn't matter which of the streams are
ON, only their number, and the number of ON streams will also be
a Markov chain on its own.


Problem 6

The stationary distribution is

vst = ( µ³/(λ+µ)³   3µ²λ/(λ+µ)³   3µλ²/(λ+µ)³   λ³/(λ+µ)³ ),

which is actually a BIN(3, λ/(λ+µ)) distribution.

A possible interpretation is that each stream is a CTMC on its own,
with 2 states (OFF/ON) and stationary distribution
( µ/(λ+µ)  λ/(λ+µ) ), so in the stationary distribution of Z(t),
each stream will be ON with probability λ/(λ+µ), independently of
the others.
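
A quick numerical check of the binomial form; the rates λ = 0.7,
µ = 1.3 are arbitrary values assumed for the test only:

import numpy as np
from scipy.linalg import null_space
from math import comb

lam, mu = 0.7, 1.3
Q = np.array([[-3*lam,        3*lam,           0,     0],
              [    mu, -mu - 2*lam,        2*lam,     0],
              [     0,         2*mu, -2*mu - lam,   lam],
              [     0,            0,        3*mu, -3*mu]])
v = null_space(Q.T)[:, 0]      # stationary vector, up to normalization
v /= v.sum()

p = lam / (lam + mu)
binom = np.array([comb(3, k) * p**k * (1 - p)**(3 - k) for k in range(4)])
print(np.allclose(v, binom))   # True
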


Problem 7

A burglar commits on average 2 robberies per month when not in
prison. During each robbery, he gets caught with probability 1/4
and is sent to prison. He spends on average 6 months in prison,
then goes back to robbing immediately after he is released from
prison. The average damage per robbery is 2 million HUF.

(a) Model the process with a CTMC. What are the states?
Calculate the infinitesimal generator.

(b) Calculate the long-term average ratio of time he spends in
prison.

(c) Calculate the long-term average damage he makes per month.

(d) Right now, he is free. Estimate the probability that he will still
be free 10 days from now (10 days is 1/3 months).


Problem 7

Solution.

(a) Two states: 1 - free, 2 - in prison. The rate of robberies is 2
per month, but only robberies where he gets caught cause a
1→2 transition. Robberies where he gets caught have a rate of
1/4 · 2 = 1/2 per month.
He spends on average 6 months in prison, so the 2→1 rate is
1/6. Altogether,

Q = ( −1/2   1/2
       1/6  −1/6 ).

(b) The stationary distribution is ( 1/4  3/4 ), so he spends on
average 3/4 of the time in prison.


Problem 7

(c) When free, he commits on average 2 robberies per month, and
each robbery causes an average damage of 2 (million HUF).
This means that when he is free, the average damage per
month he causes is 2 · 2 = 4 (million HUF per month). When
he is in prison, it is 0. Overall, according to the ergodic
theorem, the long-term average damage he makes is

1/4 · 4 + 3/4 · 0 = 1

(million HUF per month).


Problem 7

(d) He is free, so v(0) = (1 0). According to the short-term
approximation,

v(1/3) ≈ v(0) · ( I + 1/3 Q )

       = (1 0) · ( 5/6    1/6
                  1/18  17/18 ) = ( 5/6  1/6 ),

so

P(he will still be free 10 days from now) ≈ 5/6.


Problem 9

A dark corridor is lit by a single light that can be turned on by a
button at either end of the corridor. When someone enters the
corridor with the light off, they turn on the light. The light turns
off automatically after 1 minute; while the light is on, pressing the
button has no effect. One person arrives at the corridor on average
every 4 minutes. Let Xt denote the state of the light at time t.

(a) What are the possible states? Is Xt a Markov chain?

(b) Calculate the long-term ratio of time when the light is on.

(c) What is the probability that the light is on when a person
arrives?

(d) It takes 20 seconds for a person to cross the corridor.
Assuming that a person found the light on when arriving at
the corridor, what is the probability that he can cross the
entire corridor with the light on?


Problem 9

Solution.

(a) The possible states are ON and OFF according to whether the
light is on or off.

This is not a Markov chain because the waiting time in the ON
state is not exponentially distributed. In other words, when the
light is ON, the future also depends on how long it has been in
the ON state, so the Markov property does not hold.

(b) So this is not a Markov chain, but the process still has a
relatively simple structure: ON intervals of length exactly 1
minute alternate with OFF intervals of length EXP(1/4).

[Figure: timeline of the light switching between ON (length 1)
and OFF (length EXP(1/4)) intervals.]


Problem 9

(b) ON and OFF intervals are alternating, each ON interval has
length 1 minute, and each OFF interval has average length 4
minutes, so altogether, the ratio of time when the light is ON
is 1/(1+4) = 0.2.

(c) Arrivals are independent from the state of the light, and the
light is ON 0.2 of the time, so the probability that a person
finds the light ON is also 0.2.

(d) If the person arrived in the first 40 seconds of the current 60
second ON interval, then he can cross; otherwise the light will
turn OFF before he crosses.

Assuming the light was ON, the arrival time is uniform over the
ON interval, so the probability that the person arrived in the
first 40 seconds is 40/60 = 2/3.


Problem 10
People are queueing at an ATM machine. On average, one person
arrives at the machine every 2 minutes. Each person uses the
machine for 1 minute on average. When a person arrives at the
machine but someone else is already using the machine, they will
wait in a queue. However, if there are at least 2 people waiting (in
addition to the person using the machine), further people will leave
immediately (and go to another ATM machine). Let X(t) denote
the number of people at the machine at time t.

(a) Model X(t) with a continuous time Markov chain. What are
the states? Calculate the generator.

(b) Right now, the machine is being used by a single person with
nobody else waiting. Estimate the probability that 20 minutes
from now, the machine will be free.

(c) Right now, the machine is being used by a single person with
nobody else waiting. Estimate the probability that 10 seconds
from now, the machine will be free.


Problem 10

(a) Model X(t) with a continuous time Markov chain. What are
the states? Calculate the generator.

Solution. The states are 0, 1, 2 and 3 according to the number
of people at the machine; if we assume that the service times
and the interarrival times are exponentially distributed and
independent, then X(t) is a continuous time Markov chain
with generator

     ( −1/2   1/2     0      0
Q =     1    −3/2    1/2     0
        0     1     −3/2    1/2
        0     0       1     −1 ).


Problem 10
(b) Right now, the machine is being used by a single person with
nobody else waiting. Estimate the probability that 20 minutes
from now, the machine will be free.

Solution. 20 minutes is a long time, so we can approximate
v(t) by vst. This is a birth-death process, so if
vst = (x0 x1 x2 x3), then the balance equations hold:

1/2 x0 = x1 ,  1/2 x1 = x2 ,  1/2 x2 = x3 ,
x0 + x1 + x2 + x3 = 1.

From the first three equations, x0 : x1 : x2 : x3 = 8 : 4 : 2 : 1,
so

vst = ( 8/15  4/15  2/15  1/15 ),

and

P(the machine will be free 20 minutes from now) ≈ 8/15.
Problem 10
(c) Right now, the machine is being used by a single person with
nobody else waiting. Estimate the probability that 10 seconds
from now, the machine will be free.

Solution. 10 seconds (= 1/6 minutes) is a short time, so we
use the short term approximation:

v(1/6) = v(0) e^(Q·1/6) ≈ v(0)(I + Q · 1/6)

                     ( 11/12   1/12    0      0
       = (0 1 0 0) ·   2/12    9/12   1/12    0
                        0      2/12   9/12   1/12
                        0       0     2/12  10/12 )

       = ( 2/12  9/12  1/12  0 ),

so P(the machine will be free 10 seconds from now) ≈ 2/12.


Problem 10

When is the short term approximation applicable? When is the long
term (stationary) approximation applicable?

For a generator Q, the speed of the process can be roughly
described by the values |qii|; the average time spent in state i is
1/|qii|.

Rule of thumb: when

t > 10 · max_i 1/|qii| ,

then the long term (stationary) approximation is applicable.

When

t < 1/3 · min_i 1/|qii| ,

then the short term approximation is applicable.
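
A short sketch contrasting the two approximations with the exact
v(t) = v(0)e^(Qt) for the ATM chain of this problem:

import numpy as np
from scipy.linalg import expm, null_space

Q = np.array([[-0.5,  0.5,  0.0,  0.0],
              [ 1.0, -1.5,  0.5,  0.0],
              [ 0.0,  1.0, -1.5,  0.5],
              [ 0.0,  0.0,  1.0, -1.0]])
v0 = np.array([0.0, 1.0, 0.0, 0.0])

vst = null_space(Q.T)[:, 0]
vst /= vst.sum()

for t in (1/6, 20.0):
    exact = v0 @ expm(Q * t)
    short = v0 @ (np.eye(4) + Q * t)   # only meaningful for small t
    print(t, exact.round(4), short.round(4), vst.round(4))
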


Problem 10
Compare the approximations to the precise values:

vst = (0.5333 0.2667 0.1333 0.0667) ,
v(20) = (0.5333 0.2667 0.1333 0.0667) .

Actually, for this Markov chain, the approximation is very good
already at t = 5 minutes:

v(5) = (0.5331 0.2669 0.1334 0.0666) .

Recommended topic: Markov chain mixing times.

For the short term approximation, the precise value is

v(1/6) = (0.1419 0.7900 0.0653 0.0028) ,

and from part (c) we got

v(1/6) ≈ (0.1667 0.7500 0.0833 0) .


Problem 11

In a self-service laundry, there are 3 identical laundry machines. On
average, 2 clients per hour arrive. When a client arrives, if there is
an available machine, he starts using it right away. The average
time of a laundry program is 1 hour. We assume the client takes
the clothes away immediately upon the machine finishing. If a
client arrives at a time when none of the machines are available, he
leaves and does not come back.

(a) Model this process with a continuous time Markov chain.
Calculate the generator.

(b) What is the long term ratio of time when at least 1 machine is
available?

(c) Calculate the average number of machines in use.

(d) Calculate the usage rate of the first machine (that is, the ratio
of time when the first machine is in use).


Problem 11

(a) Model this process with a continuous time Markov chain.
Calculate the generator.

Solution. The states are 0, 1, 2 and 3 according to the number
of machines working. It is a birth-death process; the rates
going up (corresponding to arrivals) are 2, but the rates going
down (corresponding to machines finishing a program) depend
on the number of machines working:

     ( −2    2    0    0
Q =     1   −3    2    0
        0    2   −4    2
        0    0    3   −3 ).


Problem 11
(b) What is the long term ratio of time when at least 1 machine is
available?

Solution. We compute the stationary distribution. This is a
birth-death process, so if vst = (x0 x1 x2 x3), then the balance
equations hold:

2x0 = x1 ,  2x1 = 2x2 ,  2x2 = 3x3 ,
x0 + x1 + x2 + x3 = 1.

From the first three equations, x0 : x1 : x2 : x3 = 3 : 6 : 6 : 4,
so

vst = ( 3/19  6/19  6/19  4/19 ),

and the long term ratio of time when at least 1 machine is
available is

x0 + x1 + x2 = 15/19.


Problem 11
(c) Calculate the average number of machines in use.

Solution. According to the ergodic theorem,

0 · x0 + 1 · x1 + 2 · x2 + 3 · x3 = 30/19.

(d) Calculate the usage rate of the first machine (that is, the ratio
of time when the first machine is in use).

Solution. When there is 1 machine working, the first machine
is in use 1/3 of the time; when there are 2 machines working,
the first machine is in use 2/3 of the time, and when all 3
machines are working, it is of course in use. So the average
usage rate is

1/3 · x1 + 2/3 · x2 + x3 = 10/19.

Note that this is also equal to the average number of machines
in use (30/19) divided by the total number of machines (3).
PASTA principle

Outlook: Queueing Theory. We look at a few more properties of
queues.

We know that for a stationary Markov queue (birth-death process),
if we look at the queue at a given point in time, the state of the
system will be according to the stationary distribution.
But what is the distribution that an arriving customer sees?
Intuitively, this should also be the stationary distribution.

This is true, and it is known as the PASTA principle (Poisson
Arrivals See Time Averages). Actually, as long as the arrival process
is a Poisson process, this is true even if the service times are not
exponential (in which case the system is not a Markov process,
hence the wording time average instead of stationary distribution).
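
PASTA is easy to see in a simulation. A sketch for a small finite queue
(an M/M/1/3 queue with assumed rates): the empirical distribution of
the state found by arriving customers is compared with the time-average
distribution.

import random
random.seed(2)
lam, mu, K, T = 0.5, 1.0, 3, 500000.0

t, n = 0.0, 0
time_in = [0.0] * (K + 1)    # time spent in each state
seen = [0] * (K + 1)         # state found by each arriving customer
next_arr = random.expovariate(lam)
next_dep = float('inf')
while t < T:
    t2 = min(next_arr, next_dep, T)
    time_in[n] += t2 - t
    t = t2
    if t == next_arr:
        seen[n] += 1                  # what this arrival sees
        if n < K:
            n += 1
            if n == 1:
                next_dep = t + random.expovariate(mu)
        next_arr = t + random.expovariate(lam)
    elif t == next_dep:
        n -= 1
        next_dep = t + random.expovariate(mu) if n > 0 else float('inf')

print([x / T for x in time_in])        # time averages
print([s / sum(seen) for s in seen])   # distribution seen by arrivals

The two printed distributions should nearly agree, as PASTA predicts.
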


Waiting time
Consider a Markov queue where customers arrive with rate λ. The
service rates can be arbitrary, and the queue can be either finite or
infinite (but if it is infinite, we assume it is stable).

The system time of a customer is the time between the arrival of
that customer and the completion of their service. It doesn't
matter how much of it the customer spent waiting or in service,
just the total time that the customer spends in the system.

The quality of service received by customers is often measured by
the average system time, that is, the long term average of the
system time of each customer.

For the long term behaviour of the queue, we typically compute the
stationary distribution, which is an important descriptor of the
system, but it describes the number of customers in the system,
and it doesn't directly relate to system times.

Next we look at how the average system time can be computed
from the stationary distribution.
Waiting time

We define 3 quantities first.

Let W denote the average system time. This is what we want
to compute.

Let λe denote the effective arrival rate. For a stable, infinite
queue, λe = λ; for a finite queue, the arrivals lost due to a full
queue are not included in λe. If we denote

q = the ratio of arrivals lost due to a full queue,

then λe = (1 − q)λ.

Let L denote the average number of customers in the system.
From the ergodic theorem, we know that

L = 0 · x0 + 1 · x1 + 2 · x2 + . . .


Little's Law
Theorem (Little's Law)

L = λe W.

Proof (sketch). Let's depict customers this way (ordered according
to the time of arrival): each customer is drawn as a horizontal bar
from the moment the customer arrives to the moment the customer
finishes.

[Figure: stacked horizontal bars for customers 1, 2, 3, . . . against
a time axis, with the arrival and the finish of customer 1 marked.]


Little's Law

Consider the blue area (the total area of the bars) for large T.
When T is large, the number of customers up to time T is
approximately λe T, and the average time spent in the system by
each customer is W, so the blue area is B = λe TW in total.

Little's Law

[Figure: the same blue area sliced vertically: at each point in time,
the height of the stack equals the current number of customers in
the system.]

On the other hand, we can get the average number of customers
from the blue area by dividing it by T, so

L = B/T = λe TW / T = λe W.
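
The proof idea translates directly into a simulation check. A sketch for
an M/M/1 queue with assumed rates λ = 1, µ = 1.5, comparing the
time-average number of customers L with λ · W:

import random
random.seed(1)
lam, mu, T = 1.0, 1.5, 200000.0

t, n = 0.0, 0
area = 0.0        # integral of n(t) dt, so L = area / T
arrivals = []     # arrival times of the customers currently in the system
sojourns = []     # completed system times
next_arr = random.expovariate(lam)
next_dep = float('inf')
while t < T:
    t2 = min(next_arr, next_dep, T)
    area += n * (t2 - t)
    t = t2
    if t == next_arr:
        arrivals.append(t)
        n += 1
        if n == 1:
            next_dep = t + random.expovariate(mu)
        next_arr = t + random.expovariate(lam)
    elif t == next_dep:
        sojourns.append(t - arrivals.pop(0))   # FIFO service
        n -= 1
        next_dep = t + random.expovariate(mu) if n > 0 else float('inf')

L = area / T
W = sum(sojourns) / len(sojourns)
print(L, lam * W)   # the two numbers should nearly agree (both ≈ 2 here)
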
Little's Law

Little's law actually holds for practically any queueing system. It is
valid even if the interarrival or service time distributions are not
exponential (so the system is not a Markov chain). It is valid for
larger systems including more queues; it is valid for each queue
separately or for the system as a whole.

Little's law describes the system time averaged out for all
customers in the long run. If we want more detailed information
(e.g. what is the average system time of a customer that arrives in
a queue of length i), we need more detailed computations.
