08 - Continuous Time Markov Chains
Stochastics
Illés Horváth
2021/11/02
(1) Start from a discrete time Markov chain, and randomize the
waiting time between transitions.
(2) Start from a discrete time Markov chain, and consider the
limit when the discrete time unit converges to 0.
We will take approach (1), but the other approaches are equivalent;
all of them result in the same process.
After waiting an exponentially distributed time $T$ (whose rate depends on the current state), the next state is chosen according to the transition probabilities of the discrete time chain. The Markov chain transitions to that state, and we iterate this loop by generating a new $T$, and so on.
Example
We have a discrete time Markov chain with 3 states with $v(0) = (1\ 0\ 0)$ and transition probability matrix
$$P = \begin{pmatrix} 0 & 1/3 & 2/3 \\ 1/5 & 0 & 4/5 \\ 3/4 & 0 & 1/4 \end{pmatrix},$$
and the waiting times are exponential with rates $\lambda_1 = 1$, $\lambda_2 = 4$, $\lambda_3 = 2$ in states 1, 2, 3 respectively.
[Figure: a sample path of the resulting continuous time process; the holding times are EXP(1) in state 1, EXP(4) in state 2 and EXP(2) in state 3.]
Theorem
$$v(t) = v(0)e^{Qt},$$
where the matrix $Q$ is obtained from $P - I$ by multiplying row $i$ by $\lambda_i$ for each $i$, where $I$ is the identity matrix.
The transition probabilities $P(X(t) = j \mid X(0) = i)$ are collected into a matrix $P(t)$. Then $P(t)$ satisfies the Chapman–Kolmogorov equation (or semigroup equation) $P(s+t) = P(s)P(t)$, from which
$$P(t) = e^{Qt}, \qquad \text{where } Q = \frac{d}{dt}P(t)\Big|_{t=0}.$$
Finally, computing $\frac{d}{dt}P(t)\big|_{t=0}$ gives the $Q$ as defined in the theorem.
For the example above, with
$$P = \begin{pmatrix} 0 & 1/3 & 2/3 \\ 1/5 & 0 & 4/5 \\ 3/4 & 0 & 1/4 \end{pmatrix}$$
and $\lambda_1 = 1$, $\lambda_2 = 4$, $\lambda_3 = 2$, we have
$$P - I = \begin{pmatrix} -1 & 1/3 & 2/3 \\ 1/5 & -1 & 4/5 \\ 3/4 & 0 & -3/4 \end{pmatrix}$$
and
$$Q = \begin{pmatrix} -1 & 1/3 & 2/3 \\ 4/5 & -4 & 16/5 \\ 3/2 & 0 & -3/2 \end{pmatrix}.$$
The matrix exponential can be computed from the power series
$$e^A = \sum_{n=0}^{\infty} \frac{A^n}{n!},$$
or using the Jordan form.
Lemma
If $T_1 \sim \mathrm{EXP}(\lambda_1)$ and $T_2 \sim \mathrm{EXP}(\lambda_2)$ are independent, then $\min(T_1, T_2) \sim \mathrm{EXP}(\lambda_1 + \lambda_2)$, and
$$P(\min(T_1, T_2) = T_1) = \frac{\lambda_1}{\lambda_1 + \lambda_2}.$$
For the example, recall
$$Q = \begin{pmatrix} -1 & 1/3 & 2/3 \\ 4/5 & -4 & 16/5 \\ 3/2 & 0 & -3/2 \end{pmatrix},$$
so from state 2 the transition times $T_{2\to 1} \sim \mathrm{EXP}(4/5)$ and $T_{2\to 3} \sim \mathrm{EXP}(16/5)$ compete, and the time until the next transition is
$$T = \min(T_{2\to 1}, T_{2\to 3}).$$
According to the lemma, $T \sim \mathrm{EXP}(16/5 + 4/5)$, whose parameter is the same as $\lambda_2 = 4$.
Also,
$$P(T = T_{2\to 1}) = \frac{4/5}{4/5 + 16/5} = 1/5, \qquad P(T = T_{2\to 3}) = \frac{16/5}{4/5 + 16/5} = 4/5,$$
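As a quick numerical check (not part of the lecture), the lemma can be verified by simulating the two competing exponential clocks out of state 2; the sample size below is arbitrary.

```python
# Minimal simulation sketch (not from the lecture) of the two competing
# transitions out of state 2, with rates 4/5 and 16/5.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
t21 = rng.exponential(scale=1/(4/5), size=n)    # T_{2->1} ~ EXP(4/5)
t23 = rng.exponential(scale=1/(16/5), size=n)   # T_{2->3} ~ EXP(16/5)
T = np.minimum(t21, t23)

print(T.mean())            # ~ 1/4, consistent with T ~ EXP(4)
print(np.mean(t21 < t23))  # ~ 1/5 = P(T = T_{2->1})
```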
$$Q = \begin{pmatrix} -\lambda & \lambda & 0 & 0 & \cdots \\ 0 & -\lambda & \lambda & 0 & \\ 0 & 0 & -\lambda & \lambda & \\ \vdots & & \ddots & \ddots & \end{pmatrix}$$
($Q$ is an infinite matrix, but let's not worry about that for now.)
From state 0, the only possible change is that a client arrives. This has rate 0.2, so the first row of $Q$ is
$$(-0.2 \quad 0.2 \quad 0).$$
Note that $q_{13} = 0$ because in continuous time, two clients may not arrive at exactly the same time.
From state 1, two events may happen next. Either a new client arrives and stands in the queue, which has rate 0.2 as before, or the current client is served.
The only information given about the service time of a client is that it is on average 2 minutes. Let's assume that the service time has exponential distribution! Then, in order to have mean equal to 2 minutes, the parameter must be 0.5.
The diagonal elements are filled in last: they are negative such that each row sums to 0. Altogether, the generator is
$$Q = \begin{pmatrix} -0.2 & 0.2 & 0 \\ 0.5 & -0.7 & 0.2 \\ 0 & 0.5 & -0.5 \end{pmatrix}.$$
A stationary distribution $v_{st} = (x_1 \ \dots \ x_k)$ satisfies
$$v_{st}Q = 0, \qquad x_1 + \cdots + x_k = 1.$$
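A minimal sketch (not part of the lecture) of how the stationary vector can be found numerically, using the generator of the client-arrival example above; the normalization is appended as an extra equation and the stacked system is solved by least squares.

```python
# Minimal sketch (not from the lecture): solve v_st Q = 0 together with
# x_1 + ... + x_k = 1 for the generator of the example above.
import numpy as np

Q = np.array([[-0.2, 0.2, 0.0],
              [0.5, -0.7, 0.2],
              [0.0, 0.5, -0.5]])

# v_st Q = 0 is Q^T v_st^T = 0; append the normalization as one more equation
# and solve the stacked system in the least-squares sense.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
v_st, *_ = np.linalg.lstsq(A, b, rcond=None)
print(v_st, v_st @ Q)      # stationary distribution; v_st @ Q is numerically zero
```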
Theorem (Main)
(a) There is always at least one stationary vector for any (finite state) continuous time Markov chain.
(b) If the Markov chain is irreducible, then $v_{st}$ is unique, its elements are strictly positive, and
$$\lim_{t\to\infty} v(t) = v_{st}.$$
[Figure: the components $v_1(t)$, $v_2(t)$, $v_3(t)$ of the transient distribution converging to the components of $v_{st}$ as $t$ grows.]
The short-term approximation is
$$e^{Qt} \approx I + tQ \quad \text{for small } t.$$
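A minimal sketch (not part of the lecture) comparing the exact transient distribution with the short-term approximation for the 3-state example generator; the time points are arbitrary.

```python
# Minimal sketch (not from the lecture): exact e^{Qt} versus I + tQ for the
# 3-state example generator; the time points are arbitrary.
import numpy as np
from scipy.linalg import expm

Q = np.array([[-1.0, 1/3, 2/3],
              [4/5, -4.0, 16/5],
              [3/2, 0.0, -3/2]])
v0 = np.array([1.0, 0.0, 0.0])

for t in (0.01, 0.1, 0.5):
    exact = v0 @ expm(Q * t)
    approx = v0 @ (np.eye(3) + t * Q)
    print(t, exact, approx)    # agreement is good only for small t
```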
The embedded Markov chain is the sequence of states visited by the CTMC, for example
$$1, 3, 1, 2, 3, 1, 3, 1, \dots$$
$$P(\text{next state is } 2 \mid \text{current state is } 1) = \frac{1/3}{1/3 + 2/3} = 1/3,$$
$$P(\text{next state is } 3 \mid \text{current state is } 1) = \frac{2/3}{1/3 + 2/3} = 2/3.$$
$$P(\text{next state is } 1 \mid \text{current state is } 2) = \frac{4/5}{4/5 + 16/5} = 1/5,$$
$$P(\text{next state is } 3 \mid \text{current state is } 2) = \frac{16/5}{4/5 + 16/5} = 4/5.$$
Collecting these probabilities, the transition matrix of the embedded Markov chain is $P'$. Is it the same as the original $P$?
$$P' = \begin{pmatrix} 0 & 1/3 & 2/3 \\ 1/5 & 0 & 4/5 \\ 1 & 0 & 0 \end{pmatrix}, \qquad P = \begin{pmatrix} 0 & 1/3 & 2/3 \\ 1/5 & 0 & 4/5 \\ 3/4 & 0 & 1/4 \end{pmatrix}.$$
No, not the same! The first two rows are the same, but the third row is different.
The reason for this is that for the original DTMC, the transition $3 \to 3$ has a positive probability. For the embedded Markov chain, the next state is the first state actually different from the current state. Accordingly, consecutive repeated states are considered as a single step in the embedded Markov chain.
For example, the DTMC state sequence
$$1, 3, 1, 2, 3, 1, 3, 3, 1, \dots$$
[Figure: the corresponding sample path of the CTMC; holding times are EXP(1) in state 1, EXP(4) in state 2 and EXP(2) in state 3.]
corresponds to the embedded state sequence
1, 3, 1, 2, 3, 1, 3, 1, . . .
Lemma
$$x_i = \frac{y_i \lambda_i}{\sum_j y_j \lambda_j}, \qquad y_i = \frac{x_i/\lambda_i}{\sum_j x_j/\lambda_j}.$$
[Diagram: birth-death chain on states $0, 1, 2, 3, 4, \dots$, with birth rates $\lambda_0, \lambda_1, \lambda_2, \lambda_3, \dots$ (transitions $i \to i+1$) and death rates $\mu_1, \mu_2, \mu_3, \mu_4, \dots$ (transitions $i \to i-1$).]
For a birth-death process, the only transition that leaves the subset $\{0, \dots, i\}$ is $i \to i+1$, and the only transition entering this subset is $i+1 \to i$.
When the Markov chain is in state $i$, it transitions to state $i+1$ with rate $\lambda_i$. However, it is in state $i$ only in an $x_i$ fraction of time, so the long-term average rate of $i \to i+1$ transitions is $x_i\lambda_i$.
Due to the dynamic balance, this must be equal to the long-term average rate of $i+1 \to i$ transitions, which is $x_{i+1}\mu_{i+1}$.
That is,
$$x_i\lambda_i = x_{i+1}\mu_{i+1}.$$
[Diagram: the cut between the subsets $\{0,\dots,i\}$ and $\{i+1, i+2, \dots\}$; the only transitions crossing it are $i \to i+1$ with rate $\lambda_i$ and $i+1 \to i$ with rate $\mu_{i+1}$.]

For the M/M/1 queue, $\lambda_i = \lambda$ and $\mu_i = \mu$ for every $i$, so the balance equation gives
$$x_{i+1} = \frac{\lambda}{\mu}\cdot x_i,$$
so
$$x_1 = \frac{\lambda}{\mu}x_0, \qquad x_2 = \frac{\lambda}{\mu}x_1 = \left(\frac{\lambda}{\mu}\right)^2 x_0$$
and so on. In general,
$$x_n = \left(\frac{\lambda}{\mu}\right)^n x_0.$$
M/M/1 queue
We want to find $x_0$. We also know
$$1 = x_0 + x_1 + x_2 + \cdots = x_0 \sum_{n=0}^{\infty}\left(\frac{\lambda}{\mu}\right)^n.$$
If $\frac{\lambda}{\mu} < 1$, the sum is finite:
$$\sum_{n=0}^{\infty}\left(\frac{\lambda}{\mu}\right)^n = \frac{1}{1-\lambda/\mu},$$
so $x_0 = 1 - \frac{\lambda}{\mu}$ and
$$x_n = \left(\frac{\lambda}{\mu}\right)^n\left(1 - \frac{\lambda}{\mu}\right).$$
In this case, we say that the queue is stable; there is a unique $v_{st}$, and $v(t) \to v_{st}$ as $t \to \infty$ also holds for any $v(0)$. (No proof.)
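A minimal sketch (not part of the lecture) of the stable case with arbitrarily chosen example rates $\lambda = 1$, $\mu = 2$; it evaluates the geometric stationary distribution and the corresponding mean queue length $\rho/(1-\rho)$.

```python
# Minimal sketch (not from the lecture) of the stable M/M/1 stationary
# distribution x_n = (1 - rho) rho^n; lam = 1, mu = 2 are assumed example values.
import numpy as np

lam, mu = 1.0, 2.0
rho = lam / mu                    # system load, < 1 for a stable queue
n = np.arange(25)
x = (1 - rho) * rho**n

print(x[:5])                      # first few stationary probabilities
print(x.sum())                    # close to 1 (truncated at n = 24)
print((n * x).sum(), rho / (1 - rho))   # mean number in the system, two ways
```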
However, when $\frac{\lambda}{\mu} \geq 1$,
$$\sum_{n=0}^{\infty}\left(\frac{\lambda}{\mu}\right)^n = \infty.$$
In this case, $x_0$ would be $\frac{1}{\infty} = 0$, but then $0 = x_0 = x_1 = \dots$, so $x_0 + x_1 + \cdots = 1$ does not hold.
Altogether, if $\lambda \geq \mu$, there is no solution to the balance equations, and a stationary distribution does not exist.
The $\frac{\lambda}{\mu} < 1$ case is known as a stable queue. It behaves essentially just as nicely as any finite Markov chain, with a unique $v_{st}$, and $v(t) \to v_{st}$ also holds.
The $\frac{\lambda}{\mu} = 1$ case is known as a critical queue. There is no $v_{st}$, and the queue is going up and down randomly. It will actually visit every state infinitely often.
The $\frac{\lambda}{\mu} > 1$ case is known as an unstable queue. There is no $v_{st}$, and the queue is going to infinity in the long run.
$$\rho = \frac{\lambda}{\mu}$$
is known as the system load; the condition of stability of an M/M/1 queue is thus $\rho < 1$.
For an M/M/c queue (with $c$ servers), the corresponding stability condition is
$$\frac{\lambda}{c\mu} < 1.$$
For G/G/1 queues, both the interarrival time and service time have
a general distribution.
$$-0.1x_1 + 0.9x_2 = 0$$
$$0.1x_1 - 0.9x_2 = 0$$
$$x_1 + x_2 = 1,$$
whose solution is
$$v_{st} = (x_1\ x_2) = \left(\frac{9}{10}\ \frac{1}{10}\right).$$
The long term average ratio of time spent in repair is $x_2 = \frac{1}{10}$.
According to the ergodic theorem, the long term average net profit is
$$x_1 \cdot 60 + x_2\cdot(-30) = \frac{9}{10}\cdot 60 + \frac{1}{10}\cdot(-30) = 51.$$
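A minimal simulation sketch (not part of the lecture) of this machine, assuming the rates $0.1$ (breakdown) and $0.9$ (repair) behind the balance equations above; the long-run time-average profit should come out close to the ergodic value 51.

```python
# Minimal simulation sketch (not from the lecture) of the machine: working
# periods are EXP(0.1), repair periods are EXP(0.9); profit is 60 per unit
# time while working and -30 while in repair.
import numpy as np

rng = np.random.default_rng(0)
rates = (0.1, 0.9)            # exit rates: 0 = working, 1 = in repair
reward = (60.0, -30.0)

state, total_time, total_reward = 0, 0.0, 0.0
while total_time < 500_000:
    hold = rng.exponential(1 / rates[state])
    total_time += hold
    total_reward += reward[state] * hold
    state = 1 - state          # the two states simply alternate

print(total_reward / total_time)   # close to (9/10)*60 + (1/10)*(-30) = 51
```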
The transition matrix of the embedded Markov chain, which simply alternates between the two states, is
$$P = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}.$$
(e) In the long run, what is the ratio of potential clients that are
turned away due to a full client area?
$$x_1 = \frac{1/5}{1/8}\, x_0, \quad x_2 = \frac{1/5}{2/8}\, x_1, \quad x_3 = \frac{1/5}{2/8}\, x_2, \quad x_4 = \frac{1/5}{2/8}\, x_3, \quad x_5 = \frac{1/5}{2/8}\, x_4, \quad x_0 + x_1 + \cdots + x_5 = 1,$$
from which
$$x_0\left(1 + \frac{8}{5} + \frac{8}{5}\cdot\frac{4}{5} + \frac{8}{5}\cdot\left(\frac{4}{5}\right)^2 + \frac{8}{5}\cdot\left(\frac{4}{5}\right)^3 + \frac{8}{5}\cdot\left(\frac{4}{5}\right)^4\right) = 1$$
and
$$0\cdot x_0 + 1\cdot x_1 + 2\cdot x_2 + 3\cdot x_3 + 4\cdot x_4 + 5\cdot x_5 = 2.161.$$
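A minimal sketch (not part of the lecture) of the same computation in code: the unnormalized weights are built from the balance equations and then normalized, reproducing $x_5 \approx 0.103$ and the average $\approx 2.161$.

```python
# Minimal sketch (not from the lecture): birth rate 1/5, death rate 1/8 from
# state 1 and 2/8 from states 2..5, capacity 5.
import numpy as np

lam = 1/5
mu = [1/8, 2/8, 2/8, 2/8, 2/8]          # mu_1, ..., mu_5

x = [1.0]                                # unnormalized, start from x_0 = 1
for i in range(5):
    x.append(x[-1] * lam / mu[i])        # x_{i+1} = (lambda / mu_{i+1}) x_i
x = np.array(x)
x /= x.sum()

print(x)                                 # x_0 ~ 0.157, x_1 ~ 0.251, x_5 ~ 0.103
print((np.arange(6) * x).sum())          # average number of clients ~ 2.161
```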
Solution. Clients are turned away when the bank is full, that
is, the CTMC is in state 5. This is a ratio of x5 = 0.103 of
total time. This means that the ratio of clients arriving when
the CTMC is in state 5 is also 0.103 since clients arrive
independently of the state.
$$x_0 + \frac{1}{2} x_1 = 0.157 + \frac{1}{2}\cdot 0.251 = 0.282.$$
(a) Model the process with a CTMC. What are the states? Calculate the infinitesimal generator.
(e) Tivadar has a contract fee of 20000 HUF/day for type A jobs and 50000 HUF/day for type B jobs. Calculate his long term average income per day.
(g) In the long run, from among all jobs he takes, what is the ratio of type A jobs?
The arrival rate of type A jobs is 3/2, and the arrival rate of type B jobs is 1.
$$Q = \begin{pmatrix} -5/2 & 3/2 & 1 \\ 1 & -1 & 0 \\ 1/2 & 0 & -1/2 \end{pmatrix}.$$
Solution. Let $T_A$ denote the waiting time before the first A job offer. Then $T_A \sim \mathrm{EXP}(3/2)$, and with 2 days $= 1/15$ months, $P(T_A < 1/15) = 1 - e^{-\frac{3}{2}\cdot\frac{1}{15}} \approx 0.1$, so
$$P(\text{he will be doing a type A job 2 days from now}) \approx 1/10 = 0.1.$$
$$-\tfrac{5}{2}x_1 + x_2 + \tfrac{1}{2}x_3 = 0$$
$$\tfrac{3}{2}x_1 - x_2 = 0$$
$$x_1 - \tfrac{1}{2}x_3 = 0,$$
$$x_1 + x_2 + x_3 = 1.$$
The solution is
$$v_{st} = (x_1\ x_2\ x_3) = \left(\frac{2}{9}\ \frac{3}{9}\ \frac{4}{9}\right).$$
The long term average ratio of time Tivadar spends with type A jobs is $x_2 = \frac{3}{9}$.
Problem 4
(e) Tivadar has a contract fee of 20000 HUF/day for type A jobs and 50000 HUF/day for type B jobs. Calculate his long term average income per day.
$$0\cdot x_1 + 20000\cdot x_2 + 50000\cdot x_3 = 0\cdot\frac{2}{9} + 20000\cdot\frac{3}{9} + 50000\cdot\frac{4}{9} \approx 28900,$$
For (g), consider the embedded Markov chain. From the free state, the next job taken is type A with probability $\frac{3/2}{3/2+1} = \frac{3}{5}$ and type B with probability $\frac{2}{5}$, and after finishing any job he returns to the free state, so
$$P = \begin{pmatrix} 0 & 3/5 & 2/5 \\ 1 & 0 & 0 \\ 1 & 0 & 0 \end{pmatrix}.$$
Solving
$$u_{st}\cdot P = u_{st}, \qquad y_1 + y_2 + y_3 = 1,$$
we have
$$y_1 = y_2 + y_3, \quad \frac{3}{5}y_1 = y_2, \quad \frac{2}{5}y_1 = y_3, \quad y_1 + y_2 + y_3 = 1,$$
whose solution is
$$u_{st} = (y_1\ y_2\ y_3) = \left(\frac{1}{2}\ \frac{3}{10}\ \frac{2}{10}\right).$$
Among all jobs taken, the ratio of type A jobs is therefore
$$\frac{3/10}{3/10 + 2/10} = \frac{3}{5}.$$
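A minimal numerical check (not part of the lecture) for Problem 4: it recomputes the CTMC stationary vector from $Q$ and then the jump-chain (embedded) stationary vector by weighting with the exit rates, recovering $(1/2,\ 3/10,\ 2/10)$ and the ratio $3/5$.

```python
# Minimal numerical check (not from the lecture) for Problem 4.
import numpy as np

Q = np.array([[-5/2, 3/2, 1.0],
              [1.0, -1.0, 0.0],
              [1/2, 0.0, -1/2]])

A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
v_st, *_ = np.linalg.lstsq(A, b, rcond=None)
print(v_st)                      # ~ (2/9, 3/9, 4/9)

exit_rates = -np.diag(Q)         # 5/2, 1, 1/2
u = v_st * exit_rates            # jump frequencies: time fractions times exit rates
u /= u.sum()
print(u)                         # ~ (1/2, 3/10, 2/10), the embedded chain
print(u[1] / (u[1] + u[2]))      # ratio of type A jobs among all jobs, ~ 3/5
```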
$$Q = \begin{pmatrix} -5/2 & 3/2 & 1 & 0 \\ 1 & -1 & 0 & 0 \\ 0 & 0 & -1 & 1 \\ 1 & 0 & 0 & -1 \end{pmatrix}.$$
$$Q_X = \begin{pmatrix} -1/2 & 1/2 \\ 1 & -1 \end{pmatrix}, \qquad Q_Y = \begin{pmatrix} -2 & 1 & 1 \\ 1 & -1 & 0 \\ 3 & 0 & -3 \end{pmatrix}.$$
Altogether, $Q_Z$ is (rows and columns indexed by $1a, 1b, 1c, 2a, 2b, 2c$)
$$Q_Z = \begin{pmatrix}
-5/2 & 1 & 1 & 1/2 & 0 & 0 \\
1 & -3/2 & 0 & 0 & 1/2 & 0 \\
3 & 0 & -7/2 & 0 & 0 & 1/2 \\
1 & 0 & 0 & -3 & 1 & 1 \\
0 & 1 & 0 & 1 & -2 & 0 \\
0 & 0 & 1 & 3 & 0 & -4
\end{pmatrix}.$$
The same matrix with the diagonal entries replaced by $*$:
$$\begin{pmatrix}
* & 1 & 1 & 1/2 & 0 & 0 \\
1 & * & 0 & 0 & 1/2 & 0 \\
3 & 0 & * & 0 & 0 & 1/2 \\
1 & 0 & 0 & * & 1 & 1 \\
0 & 1 & 0 & 1 & * & 0 \\
0 & 0 & 1 & 3 & 0 & *
\end{pmatrix}$$
We can see that the two diagonal $3\times 3$ blocks are essentially two copies of $Q_Y$. The exact structure can be understood with Kronecker product notation.
$$I_2 \otimes Q_Y = \begin{pmatrix}
* & 1 & 1 & 0 & 0 & 0 \\
1 & * & 0 & 0 & 0 & 0 \\
3 & 0 & * & 0 & 0 & 0 \\
0 & 0 & 0 & * & 1 & 1 \\
0 & 0 & 0 & 1 & * & 0 \\
0 & 0 & 0 & 3 & 0 & *
\end{pmatrix}$$
$$Q_X \otimes I_3 = \begin{pmatrix}
* & 0 & 0 & 1/2 & 0 & 0 \\
0 & * & 0 & 0 & 1/2 & 0 \\
0 & 0 & * & 0 & 0 & 1/2 \\
1 & 0 & 0 & * & 0 & 0 \\
0 & 1 & 0 & 0 & * & 0 \\
0 & 0 & 1 & 0 & 0 & *
\end{pmatrix}$$
and
$$Q_Z = I_2 \otimes Q_Y + Q_X \otimes I_3.$$
This is the equation that describes the joint transition rate matrix of two CTMCs running simultaneously, independently from each other. It is also known as the Kronecker sum of $Q_X$ and $Q_Y$, and is denoted by $Q_X \oplus Q_Y$.
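A minimal sketch (not part of the lecture) verifying the Kronecker-sum identity for the two generators above with numpy.

```python
# Minimal sketch (not from the lecture): Q_Z = I_2 (x) Q_Y + Q_X (x) I_3.
import numpy as np

QX = np.array([[-1/2, 1/2],
               [1.0, -1.0]])
QY = np.array([[-2.0, 1.0, 1.0],
               [1.0, -1.0, 0.0],
               [3.0, 0.0, -3.0]])

QZ = np.kron(np.eye(2), QY) + np.kron(QX, np.eye(3))   # Kronecker sum
print(QZ)                    # matches the 6x6 generator written out above
print(QZ.sum(axis=1))        # every row sums to 0, as for any generator
```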
All streams are memoryless (since the time spent in each state is
EXP, independent of the past), and the streams are identical, so
when there is e.g. 1 stream turned ON, for the evolution of the
process it doesn't matter which one.
In state 0, all streams are OFF, and the only change that can occur is $0 \to 1$, when one of the streams turns ON. The rate of any single stream turning on is $\lambda$, so the total rate of $0 \to 1$ is $3\lambda$. Similarly, the rate of $1 \to 0$ is $\mu$ and the rate of $1 \to 2$ is $2\lambda$, and so on;
$$Q = \begin{pmatrix} -3\lambda & 3\lambda & 0 & 0 \\ \mu & -\mu - 2\lambda & 2\lambda & 0 \\ 0 & 2\mu & -2\mu - \lambda & \lambda \\ 0 & 0 & 3\mu & -3\mu \end{pmatrix}.$$
The states are then grouped together into 4 states according to the
number of ON streams.
$$v_{st} = \left(\frac{\mu^3}{(\lambda+\mu)^3},\ \frac{3\mu^2\lambda}{(\lambda+\mu)^3},\ \frac{3\mu\lambda^2}{(\lambda+\mu)^3},\ \frac{\lambda^3}{(\lambda+\mu)^3}\right),$$
which is actually a BIN$\left(3, \frac{\lambda}{\lambda+\mu}\right)$ distribution for the number of ON streams.
A possible interpretation is that each stream is a CTMC on its own, with 2 states (OFF/ON) and stationary distribution $\left(\frac{\mu}{\lambda+\mu}, \frac{\lambda}{\lambda+\mu}\right)$, so in the stationary distribution of $Z(t)$, each stream will be OFF or ON with probability $\frac{\mu}{\lambda+\mu}$ and $\frac{\lambda}{\lambda+\mu}$ respectively, independently of the others.
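A minimal numerical check (not part of the lecture) of the binomial form of $v_{st}$, with arbitrarily chosen example rates $\lambda = 2$, $\mu = 3$.

```python
# Minimal numerical check (not from the lecture) with assumed example rates
# lam = 2, mu = 3: the stationary distribution of the number of ON streams
# equals BIN(3, lam/(lam+mu)).
import numpy as np
from scipy.stats import binom

lam, mu = 2.0, 3.0
Q = np.array([[-3*lam, 3*lam, 0.0, 0.0],
              [mu, -mu - 2*lam, 2*lam, 0.0],
              [0.0, 2*mu, -2*mu - lam, lam],
              [0.0, 0.0, 3*mu, -3*mu]])

A = np.vstack([Q.T, np.ones(4)])
b = np.array([0.0, 0.0, 0.0, 0.0, 1.0])
v_st, *_ = np.linalg.lstsq(A, b, rcond=None)
print(v_st)
print(binom.pmf(np.arange(4), 3, lam / (lam + mu)))   # the same four numbers
```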
(a) Model the process with a CTMC. What are the states? Calculate the infinitesimal generator.
(d) Right now, he is free. Estimate the probability that he will still be free 10 days from now (10 days is 1/3 months).
Solution.
$$Q = \begin{pmatrix} -1/2 & 1/2 \\ 1/6 & -1/6 \end{pmatrix}.$$
(b) The stationary distribution is $\left(\frac{1}{4}\ \frac{3}{4}\right)$, so he spends on average $\frac{3}{4}$ of the time in prison.
$$\frac{1}{4}\cdot 4 + \frac{3}{4}\cdot 0 = 1.$$
$$v\left(\tfrac{1}{3}\right) \approx v(0)\cdot\left(I + \tfrac{1}{3}Q\right) = (1\ 0)\cdot\left(\begin{pmatrix}1 & 0\\ 0 & 1\end{pmatrix} + \frac{1}{3}\begin{pmatrix}-1/2 & 1/2\\ 1/6 & -1/6\end{pmatrix}\right) = (1\ 0)\cdot\begin{pmatrix}5/6 & 1/6\\ 1/18 & 17/18\end{pmatrix} = \left(\tfrac{5}{6}\ \tfrac{1}{6}\right),$$
so
$$P(\text{he will still be free 10 days from now}) \approx \frac{5}{6}.$$
(b) Calculate the long-term ratio of time when the light is on.
Solution.
(a) The possible states are ON and OFF according to whether the light is on or off.
(b) So this is not a Markov chain, but the process still has a
relatively simple structure.
[Figure: sample path of the ON/OFF process; ON periods have fixed length 1, OFF periods are EXP(1/4).]
Assuming the light was ON, the probability that the person arrives in the first 40 seconds is $\frac{40}{60} = \frac{2}{3}$.
(b) Right now, the machine is being used by a single person with
nobody else waiting. Estimate the probability that 20 minutes
from now, the machine will be free.
(c) Right now, the machine is being used by a single person with
nobody else waiting. Estimate the probability that 10 seconds
from now, the machine will be free.
(a) Model $X(t)$ with a continuous time Markov chain. What are the states? Calculate the generator.
$$Q = \begin{pmatrix} -1/2 & 1/2 & 0 & 0 \\ 1 & -3/2 & 1/2 & 0 \\ 0 & 1 & -3/2 & 1/2 \\ 0 & 0 & 1 & -1 \end{pmatrix}.$$
and
$$P(\text{at 13:00 tomorrow, the machine will be free}) = \frac{8}{15}.$$
Problem 10
(c) Right now, the machine is being used by a single person with nobody else waiting. Estimate the probability that 10 seconds from now, the machine will be free.
When
$$t < \frac{1}{3}\min_i \frac{1}{|q_{ii}|},$$
the short-term approximation $e^{Qt} \approx I + tQ$ can be used.
(b) What is the long term ratio of time when at least 1 machine is available?
(d) Calculate the usage rate of the first machine (that is, the ratio of time when the first machine is in use).
$$Q = \begin{pmatrix} -2 & 2 & 0 & 0 \\ 1 & -3 & 2 & 0 \\ 0 & 2 & -4 & 2 \\ 0 & 0 & 3 & -3 \end{pmatrix}.$$
$$0\cdot x_0 + 1\cdot x_1 + 2\cdot x_2 + 3\cdot x_3 = \frac{30}{19}.$$
(d) Calculate the usage rate of the first machine (that is, the ratio of time when the first machine is in use).
The system time of a customer is the time between the arrival and the service completion of that customer. It doesn't matter how much of it the customer spent waiting or in service, just the total time that the customer spends in the system.
For the long term behaviour of the queue, we typically compute the stationary distribution, which is an important descriptor of the system, but it describes the number of customers in the system, and it doesn't directly relate to system times.
If arriving customers are turned away with probability $q$ (so only a $(1-q)$ fraction actually enter the system), then the effective arrival rate is $\lambda_e = (1-q)\lambda$.
Let $L$ denote the average number of customers in the system. From the ergodic theorem, we know that
$$L = 0\cdot x_0 + 1\cdot x_1 + 2\cdot x_2 + \cdots$$
Let $W$ denote the average system time of a customer. Then Little's law states that
$$L = \lambda_e W.$$
[Figure: each customer's time in the system drawn as a horizontal bar from arrival to departure; the number of customers in the system at any time equals the number of bars covering that time.]
Over a long time interval of length $T$, the total shaded area $B$ can be computed two ways: integrating the number of customers in the system over time gives $B \approx L\cdot T$, while summing the system times of the roughly $\lambda_e T$ customers gives $B \approx \lambda_e T\cdot W$. Hence
$$L = \frac{B}{T} = \frac{\lambda_e T\, W}{T} = \lambda_e W.$$
Little's Law
Little's law describes the system time averaged out for all
customers in the long run. If we want more detailed information
(e.g. what is the average system time of a customer that arrives in
a queue of length i ), we need more detailed computations.
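A minimal simulation sketch (not part of the lecture) illustrating Little's law for an M/M/1 queue with arbitrarily chosen rates $\lambda = 1$, $\mu = 2$: the time-average number of customers in the system and $\lambda_e W$ are computed from the same simulated sample path and come out close to each other.

```python
# Minimal simulation sketch (not from the lecture): Little's law L = lambda_e * W
# for a FIFO M/M/1 queue; lam = 1, mu = 2 are assumed example values.
import numpy as np

rng = np.random.default_rng(0)
lam, mu = 1.0, 2.0
n = 100_000

arrivals = np.cumsum(rng.exponential(1/lam, size=n))   # arrival times
services = rng.exponential(1/mu, size=n)               # service requirements

departures = np.empty(n)
free_at = 0.0                                          # time when the server becomes free
for i in range(n):
    start = max(arrivals[i], free_at)                  # service starts at arrival or when free
    departures[i] = start + services[i]
    free_at = departures[i]

system_times = departures - arrivals
T = departures[-1]                                     # length of the observed period
W = system_times.mean()                                # average system time
L = system_times.sum() / T                             # time-average number in the system
print(L, lam * W)                                      # the two sides of Little's law
```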