
Semester-IV

Probability and
Queueing Theory



The aim of this publication is to supply information taken from sources believed to be valid and
reliable. This is not an attempt to render any type of professional advice or analysis, nor is it to
be treated as such. While much care has been taken to ensure the veracity and currency of the
information presented within, neither the publisher nor its authors bear any responsibility for
any damage arising from inadvertent omissions, negligence or inaccuracies (typographical or
factual) that may have found their way into this book.



B.E./B.Tech. DEGREE EXAMINATION,
MAY/JUNE 2012
Fourth Semester
Computer Science and Engineering
MA2262 – PROBABILITY AND QUEUEING THEORY
(Common to Information Technology)
(Regulation 2008)
(Statistical Tables may be permitted)
Time: Three hours Maximum: 100 Marks
Answer ALL questions
PART A − (10 × 2 = 20 Marks)
1. Check whether the following is a probability density function or not:
   f(x) = λ e^(−λx) for x ≥ 0, λ > 0, and f(x) = 0 elsewhere.

2. If a random variable X has the moment generating function M_X(t) = 2/(2 − t), determine the variance of X.

3. The regression equations of X on Y and Y on X are respectively 5x − y = 22 and 64x − 45y = 24. Find the means of X and Y.

4. State Central limit theorem.

5. Define Wide sense stationary process.


6. If the initial state probability distribution of a Markov chain is P(0) = (5/6  1/6) and the transition probability matrix of the chain is
       [  0    1  ]
       [ 1/2  1/2 ],
   find the probability distribution of the chain after 2 steps.

7. State Little’s formula for a (M/ M/1): (GD / N/ ∞) queuing model.


8. Define steady state and transient state in Queuing theory.

9. When does an M/G/1 queueing model reduce to the classic M/M/1 queueing model?

10. State the Pollaczek–Khintchine formula for the average number of customers in an M/G/1 queueing model.

PART B – (5 × 16 = 80 Marks)
11. (a) (i) A random variable X has the following probability function:

            X:     0   1   2    3    4    5     6      7
            P(x):  0   k   2k   2k   3k   k^2   2k^2   7k^2 + k

            (1) Find the value of k.
            (2) Evaluate P(X < 6) and P(X ≥ 6).
            (3) If P(X ≤ c) > 1/2, find the minimum value of c.
(ii) Find the moment generating function of an exponential random
variable and hence find its mean and variance.
    (b) (i) If X is a Poisson variate such that P(X = 2) = 9 P(X = 4) + 90 P(X = 6), find
            (1) the mean and E(X^2), and (2) P(X ≥ 2).
        (ii) In a certain city, the daily consumption of electric power in millions of kilowatt-hours can be treated as a random variable having a Gamma distribution with parameters λ = 1/2 and k = 3. If the power plant of this city has a daily capacity of 12 million kilowatt-hours, what is the probability that this power supply will be inadequate on any given day?

12. (a) (i) Let X and Y be two random variables having the joint probability function f(x, y) = k(x + 2y), where x and y can assume only the integer values 0, 1 and 2. Find the marginal and conditional distributions.
        (ii) Two random variables X and Y have the joint probability density function
             f(x, y) = c(4 − x − y), 0 ≤ x ≤ 2, 0 ≤ y ≤ 2, and 0 elsewhere.
             Find cov(X, Y).
Or


    (b) (i) The two-dimensional random variable (X, Y) has the joint probability density function
            f(x, y) = 8xy, 0 < x < y < 1, and 0 elsewhere.
            (1) Find P(X < 1/2 ∩ Y < 1/4).
            (2) Find the marginal and conditional distributions.
            (3) Are X and Y independent?
        (ii) Suppose that in a certain circuit, 20 resistors are connected in series. The mean and variance of each resistor are 5 and 0.20 respectively. Using the central limit theorem, find the probability that the total resistance of the circuit will exceed 98 ohms, assuming independence.

13. (a) (i) The probability distribution of a process {X(t)}, under certain conditions, is given by
            P[X(t) = n] = (at)^(n−1)/(1 + at)^(n+1),   n = 1, 2, 3, …
                        = at/(1 + at),                 n = 0.
            Show that {X(t)} is not stationary.
        (ii) A salesman's territory consists of three cities A, B and C. He never sells in the same city on successive days. If he sells in city A, then the next day he sells in city B. However, if he sells in either city B or city C, the next day he is twice as likely to sell in city A as in the other city. In the long run, how often does he sell in each of the cities?
Or
    (b) (i) The transition probability matrix of a Markov chain {Xn}, n = 1, 2, 3, …, having three states 1, 2 and 3 is
                 [ 0.1  0.5  0.4 ]
            P =  [ 0.6  0.2  0.2 ]
                 [ 0.3  0.4  0.3 ]
            and the initial distribution is P(0) = (0.7  0.2  0.1). Find
            (1) P[X2 = 3]
            (2) P[X3 = 2, X2 = 3, X1 = 3, X0 = 2].
        (ii) Suppose that customers arrive at a bank according to a Poisson process with a mean rate of 3 per minute. Find the probability that during a time interval of two minutes:


(1) exactly 4 customers arrive


(2) greater than 4 customers arrive
(3) fewer than 4 customers arrive.

14. (a) (i) A T.V. repairman finds that the time spent on his job has an exponential distribution with mean 30 minutes. He repairs sets in the order in which they come in, and the arrival of sets is approximately Poisson with an average rate of 10 per 8-hour day. Find
            (1) the repairman's expected idle time each day;
            (2) how many jobs are ahead of the average set just brought in.
(ii) A supermarket has 2 girls running up sales at the counters. If
the service time for each customer is exponential with mean
4 minutes and if people arrive in Poisson fashion at the rate of
10 per hour, find the following:
(1) What is the probability of having to wait for service?
(2) What is the expected percentage of idle time for each girl?
(3) What is the expected length of customer′s waiting time?
Or
(b) (i) Trains arrive at the yard every 15 minutes and the service time
is 33 minutes. If the line capacity of the yard is limited to 5
trains, find the probability that the yard is empty and the average
number of trains in the system, given that the inter arrival time
and service time are following exponential distribution.
(ii) There are three typists in an office. Each typist can type an average
of 6 letters per hour. If letters arrive for being typed at the rate of 15
letters per hour, what fraction of times all the typists will be busy?
What is the average number of letters waiting to be typed?

15. (a) (i) An automatic car wash facility operates with only one bay. Cars arrive according to a Poisson distribution with a mean of 4 cars per hour and may wait in the facility's parking lot if the bay is busy. The parking lot is large enough to accommodate any number of cars. If the service time for all cars is constant and equal to 10 minutes, determine
(1) mean number of customers in the system Ls
(2) mean number of customers in the queue Lq
(3) mean waiting time of a customer in the system Ws
(4) mean waiting time of a customer in the queue Wq
(ii) An average of 120 students arrive each hour (inter-arrival times
are exponential) at the controller's office to get their hall tickets.


To complete the process, a candidate must pass through three


counters. Each counter consists of a single server, service times
at each counter are exponential with the following mean times:
counter 1, 20 seconds; counter 2, 15 seconds and counter 3, 12
seconds. On the average, how many students will be present in the controller's office?
Or
    (b) (i) Derive the P–K formula for the (M/G/1) : (GD/∞/∞) queueing model and hence deduce that, with constant service time, the P–K formula reduces to Ls = ρ + ρ^2/(2(1 − ρ)), where μ = 1/E(T) and ρ = λ/μ.
        (ii) For an open queueing network with three nodes 1, 2 and 3, let customers arrive from outside the system to node j according to a Poisson input process with parameter rj, and let Pij denote the proportion of customers departing from facility i to facility j.
             Given (r1, r2, r3) = (1, 4, 3) and
                   [  0   0.6  0.3 ]
             Pij = [ 0.1   0   0.3 ]
                   [ 0.4  0.4   0  ],
             determine the average arrival rate λj to node j for j = 1, 2, 3.



Solutions
PART A
1. If f(x) is a p.d.f., then ∫_{−∞}^{∞} f(x) dx = 1.
   Given f(x) = λ e^(−λx) for x ≥ 0 (λ > 0) and 0 elsewhere,
       ∫_0^∞ λ e^(−λx) dx = λ [e^(−λx)/(−λ)]_0^∞ = −(e^(−∞) − e^0) = 1.
   ∴ f(x) is a p.d.f.

2. M_X(t) = 2/(2 − t) = 1/(1 − t/2) = (1 − t/2)^(−1)
          = 1 + (t/2) + (t/2)^2 + (t/2)^3 + …
   E[X]   = coefficient of t/1! in M_X(t)   = 1/2
   E[X^2] = coefficient of t^2/2! in M_X(t) = 1/2
   ∴ Var(X) = E[X^2] − [E(X)]^2 = 1/2 − (1/2)^2 = 1/4
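As a quick numerical cross-check of this answer (not part of the original solution), the variance can be read off the given M.G.F. by symbolic differentiation; the minimal sketch below assumes the SymPy library is available.

```python
import sympy as sp

t = sp.symbols('t')
M = 2 / (2 - t)                      # given moment generating function M_X(t)

EX = sp.diff(M, t).subs(t, 0)        # E[X]   = M'(0)
EX2 = sp.diff(M, t, 2).subs(t, 0)    # E[X^2] = M''(0)
var = sp.simplify(EX2 - EX**2)       # Var(X) = E[X^2] - (E[X])^2

print(EX, EX2, var)                  # 1/2, 1/2, 1/4
```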
3. Since both regression lines pass through the point (x̄, ȳ),
       5x̄ − ȳ = 22        (1)
       64x̄ − 45ȳ = 24     (2)
   Solving (1) and (2): x̄ = 6 and ȳ = 8.

4. If X1, X2, …, Xn, … is a sequence of independent random variables with E[Xi] = μi and Var(Xi) = σi^2, i = 1, 2, …, and if Sn = X1 + X2 + ⋯ + Xn, then under certain general conditions Sn follows a normal distribution with mean μ = Σ_{i=1}^{n} μi and variance σ^2 = Σ_{i=1}^{n} σi^2 as n tends to infinity.

5. A random process {X(t)} with finite first- and second-order moments is called a weakly stationary, covariance stationary, or wide-sense stationary (WSS) process if its mean is a constant and its autocorrelation depends only on the time difference, i.e. E{X(t)} = μ and E{X(t) X(t − τ)} = R(τ).

6. Initial state probability distribution: P(0) = [5/6  1/6], and t.p.m.
       P = [  0    1  ]
           [ 1/2  1/2 ]
   P(1) = P(0) P = [5/6  1/6] P = [1/12  11/12]
   P(2) = P(1) P = [1/12  11/12] P = [11/24  13/24]
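The two-step distribution can be verified with a short matrix computation; the following minimal sketch (illustrative only) assumes NumPy is available.

```python
import numpy as np

p0 = np.array([5/6, 1/6])            # initial state distribution P(0)
P = np.array([[0.0, 1.0],
              [0.5, 0.5]])           # one-step transition probability matrix

p1 = p0 @ P                          # distribution after 1 step
p2 = p1 @ P                          # distribution after 2 steps

print(p1)                            # [1/12, 11/12]
print(p2)                            # [11/24, 13/24]
```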

7. Little's formulas for the (M/M/1) : (GD/N/∞) model:
       Ls = λ′ Ws
       Lq = Ls − λ′/μ,   where λ′ = μ(1 − P0) is the effective arrival rate
       Ws = Ls/λ′
       Wq = Lq/λ′
8. A system is said to be in transient state when its operating characteristics
(like input, output, mean queue length, etc) are dependent upon time.


   A system is said to be in steady state when the behaviour of the system is independent of time t. In the steady-state case,
       lim_{t→∞} Pn(t) = Pn (independent of t),
   which implies
       lim_{t→∞} (d/dt) Pn(t) = 0.
9. In the M/G/1 model, if G ≡ M the service time T follows an exponential distribution with parameter μ, so that E(T) = 1/μ and V(T) = 1/μ^2.
   Hence, by the Pollaczek–Khintchine formula,
       Ls = λ/μ + λ^2 (1/μ^2 + 1/μ^2) / [2(1 − λ/μ)]
          = λ/μ + (2λ^2/μ^2) / [2(μ − λ)/μ]
          = λ/μ + λ^2/(μ(μ − λ))
          = (λμ − λ^2 + λ^2)/(μ(μ − λ))
          = λ/(μ − λ),
   which is Ls of the M/M/1 model.
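A numerical sanity check of this reduction (illustrative only; the rates λ = 3 and μ = 5 are assumed for the example, they are not from the paper) is sketched below in Python.

```python
# For exponential service, E(T) = 1/mu and V(T) = 1/mu**2, so the P-K value
# of Ls should coincide with the M/M/1 value lam/(mu - lam).
lam, mu = 3.0, 5.0

ET, VT = 1/mu, 1/mu**2
Ls_pk = lam*ET + lam**2*(VT + ET**2) / (2*(1 - lam*ET))   # Pollaczek-Khintchine
Ls_mm1 = lam / (mu - lam)                                 # M/M/1 result

print(Ls_pk, Ls_mm1)   # both 1.5
```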

10. The average number of customers in an M/G/1 queue is
        Ls = λE(T) + λ^2{V(T) + E^2(T)} / [2{1 − λE(T)}].


PART B
11. (a) (i)
    (1) ΣP(x) = 1:
        0 + k + 2k + 2k + 3k + k^2 + 2k^2 + (7k^2 + k) = 1
        10k^2 + 9k − 1 = 0
        (10k − 1)(k + 1) = 0  ⇒  k = 1/10 or −1
        The value k = −1 makes some values of P(x) negative, ∴ k = 1/10.
        The distribution is
        X:     0    1     2     3     4      5      6      7
        P(X):  0  1/10  2/10  2/10  3/10  1/100  2/100  17/100
    (2) P[X < 6] = 1 − P[X ≥ 6]
                 = 1 − [P(X = 6) + P(X = 7)]
                 = 1 − [2/100 + 17/100] = 1 − 19/100 = 81/100
        P[X ≥ 6] = 1 − P[X < 6] = 1 − 81/100 = 19/100
    (3) By trial: P[X ≤ 0] = 0, P[X ≤ 1] = 1/10, P[X ≤ 2] = 3/10,
        P[X ≤ 3] = 5/10, P[X ≤ 4] = 8/10.
        ∴ The smallest value of c satisfying P[X ≤ c] > 1/2 is c = 4.

11. (a) (ii)
    M_X(t) = E[e^(tX)] = ∫_0^∞ λ e^(−λx) e^(tx) dx
           = λ ∫_0^∞ e^(−(λ−t)x) dx
           = λ/(λ − t) = (1 − t/λ)^(−1)
           = 1 + t/λ + (t/λ)^2 + (t/λ)^3 + ⋯
    Mean E[X] = coefficient of t/1!     = 1/λ
    E[X^2]    = coefficient of t^2/2!   = 2/λ^2
    ∴ Var(X) = E[X^2] − [E(X)]^2 = 2/λ^2 − 1/λ^2 = 1/λ^2
11. (b) (i) (1) X is a Poisson variate, so P[X = r] = e^(−λ) λ^r / r!, r = 0, 1, 2, …
    Given P[X = 2] = 9 P[X = 4] + 90 P[X = 6]:
        e^(−λ)λ^2/2! = 9 e^(−λ)λ^4/4! + 90 e^(−λ)λ^6/6!
        1 = (3/4)λ^2 + (1/4)λ^4
        λ^4 + 3λ^2 − 4 = 0
        (λ^2 + 4)(λ^2 − 1) = 0
        ∴ λ^2 = 1 (λ^2 = −4 is rejected since λ^2 cannot be negative), hence λ = 1.
    Mean = λ = 1
    E[X^2] = Var(X) + [E(X)]^2 = λ + λ^2 = 1 + 1 = 2     (∵ Var(X) = λ)
    (2) P[X ≥ 2] = 1 − P[X < 2]
                 = 1 − [P(X = 0) + P(X = 1)]
                 = 1 − e^(−1)(1 + 1)
                 = 1 − 2/e


11. (b) (ii) Let X denote the daily consumption of electric power (in millions of kilowatt-hours).
    The p.d.f. of X (Gamma with λ = 1/2, k = 3) is
        f(x) = (1/2)^3 x^2 e^(−x/2) / Γ(3) = (1/16) x^2 e^(−x/2),  x ≥ 0     (∵ Γ(3) = 2! = 2)
    P[the power supply is inadequate] = P[X > 12] = ∫_12^∞ f(x) dx
        = (1/16) ∫_12^∞ x^2 e^(−x/2) dx
        = (1/16) [−2x^2 e^(−x/2) − 8x e^(−x/2) − 16 e^(−x/2)]_12^∞
        = (e^(−6)/16) [288 + 96 + 16]
        = 25 e^(−6) ≈ 0.062
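The tail probability can be cross-checked with a standard Gamma survival function; the sketch below (not part of the original solution) assumes SciPy is available and uses shape k = 3 with scale 1/λ = 2.

```python
from scipy.stats import gamma

k, lam = 3, 0.5                       # shape k = 3, rate lambda = 1/2
p = gamma.sf(12, a=k, scale=1/lam)    # P(X > 12), scale = 1/lambda = 2

print(p)                              # about 0.062 (= 25 * exp(-6))
```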

12. (a) (i) With P(x, y) = k(x + 2y), the joint distribution is

        X\Y           0     1     2    Σ_y P(x, y)
        0             0    2k    4k      6k
        1             k    3k    5k      9k
        2            2k    4k    6k     12k
        Σ_x P(x, y)  3k    9k   15k     27k

    Σ_x Σ_y P(x, y) = 1  ⇒  27k = 1  ⇒  k = 1/27

        X\Y      0      1      2      P(x)
        0        0     2/27   4/27    6/27
        1       1/27   3/27   5/27    9/27
        2       2/27   4/27   6/27   12/27
        P(y)    3/27   9/27  15/27     1

    Marginal distribution of X:
        X      0      1      2
        P(X)  6/27   9/27  12/27
    Marginal distribution of Y:
        Y      0      1      2
        P(Y)  3/27   9/27  15/27
    Conditional distribution of X given Y = y:
        P[X = x / Y = y] = P[X = x, Y = y] / P[Y = y]

        P[X = 0 / Y = 0] = 0
        P[X = 1 / Y = 0] = (1/27)/(3/27) = 1/3
        P[X = 2 / Y = 0] = (2/27)/(3/27) = 2/3
        P[X = 0 / Y = 1] = (2/27)/(9/27) = 2/9
        P[X = 1 / Y = 1] = (3/27)/(9/27) = 1/3
        P[X = 2 / Y = 1] = (4/27)/(9/27) = 4/9
        P[X = 0 / Y = 2] = (4/27)/(15/27) = 4/15
        P[X = 1 / Y = 2] = (5/27)/(15/27) = 1/3
        P[X = 2 / Y = 2] = (6/27)/(15/27) = 2/5

    Conditional distribution of Y given X = x:
        P[Y = y / X = x] = P[Y = y, X = x] / P[X = x]

        P[Y = 0 / X = 0] = 0
        P[Y = 1 / X = 0] = (2/27)/(6/27) = 1/3
        P[Y = 2 / X = 0] = (4/27)/(6/27) = 2/3
        P[Y = 0 / X = 1] = (1/27)/(9/27) = 1/9
        P[Y = 1 / X = 1] = (3/27)/(9/27) = 1/3
        P[Y = 2 / X = 1] = (5/27)/(9/27) = 5/9
        P[Y = 0 / X = 2] = (2/27)/(12/27) = 1/6
        P[Y = 1 / X = 2] = (4/27)/(12/27) = 1/3
        P[Y = 2 / X = 2] = (6/27)/(12/27) = 1/2

12. (a) (ii) To find c:
        ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1
        c ∫_0^2 ∫_0^2 (4 − x − y) dx dy = 1
        c ∫_0^2 [4x − x^2/2 − yx]_0^2 dy = 1
        c ∫_0^2 (6 − 2y) dy = 1
        c [6y − y^2]_0^2 = 1
        8c = 1  ⇒  c = 1/8
        ∴ f(x, y) = (1/8)(4 − x − y), 0 ≤ x ≤ 2, 0 ≤ y ≤ 2, and 0 elsewhere.

    Marginal densities of X and Y:



        f(x) = ∫_{−∞}^{∞} f(x, y) dy = (1/8) ∫_0^2 (4 − x − y) dy
             = (1/8) [4y − xy − y^2/2]_0^2
        f(x) = (3 − x)/4,  0 ≤ x ≤ 2

        f(y) = ∫_{−∞}^{∞} f(x, y) dx = (1/8) ∫_0^2 (4 − x − y) dx
             = (1/8) [4x − x^2/2 − yx]_0^2
        f(y) = (3 − y)/4,  0 ≤ y ≤ 2

        E(X) = ∫ x f(x) dx = (1/4) ∫_0^2 (3x − x^2) dx
             = (1/4) [3x^2/2 − x^3/3]_0^2 = 5/6
        E(Y) = ∫ y f(y) dy = (1/4) ∫_0^2 (3y − y^2) dy = 5/6

        E(XY) = ∫∫ xy f(x, y) dx dy
              = (1/8) ∫_0^2 ∫_0^2 xy(4 − x − y) dx dy
              = (1/8) ∫_0^2 [2x^2 y − x^3 y/3 − x^2 y^2/2]_0^2 dy
              = (1/8) ∫_0^2 (8y − 8y/3 − 2y^2) dy
              = (1/8) [8y^2/3 − 2y^3/3]_0^2
              = (1/8) [32/3 − 16/3] = 2/3

        cov(X, Y) = E[XY] − E[X]E[Y] = 2/3 − (5/6)(5/6)
        cov(X, Y) = −1/36
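The moments and the covariance above can be verified by direct symbolic integration; a minimal sketch follows, assuming the SymPy library is available.

```python
import sympy as sp

x, y = sp.symbols('x y', nonnegative=True)
f = sp.Rational(1, 8) * (4 - x - y)                 # joint density on [0,2] x [0,2]

EX  = sp.integrate(x*f,   (x, 0, 2), (y, 0, 2))     # E[X]
EY  = sp.integrate(y*f,   (x, 0, 2), (y, 0, 2))     # E[Y]
EXY = sp.integrate(x*y*f, (x, 0, 2), (y, 0, 2))     # E[XY]

print(EX, EY, EXY, EXY - EX*EY)                     # 5/6, 5/6, 2/3, -1/36
```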

12. (b) (i) f(x, y) = 8xy, 0 < x < y < 1, and 0 elsewhere.

    (1) Since Y < 1/4 already forces X < Y < 1/4 < 1/2, the region of integration is 0 < x < y < 1/4:
        P[X < 1/2 ∩ Y < 1/4] = ∫_0^{1/4} ∫_0^y 8xy dx dy
                             = ∫_0^{1/4} 4y^3 dy
                             = [y^4]_0^{1/4} = 1/256

    (2) Marginal distributions:
        f_X(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫_x^1 8xy dy
               = 8x [y^2/2]_x^1 = 4x(1 − x^2),  0 < x < 1
        f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫_0^y 8xy dx
               = 8y [x^2/2]_0^y = 4y^3,  0 < y < 1
        Conditional distributions:
        f(x / y) = f(x, y)/f_Y(y) = 8xy/(4y^3) = 2x/y^2,        0 < x < y
        f(y / x) = f(x, y)/f_X(x) = 8xy/[4x(1 − x^2)] = 2y/(1 − x^2),  x < y < 1

    (3) If X and Y were independent, f(x, y) = f_X(x) f_Y(y).
        But f_X(x) f_Y(y) = 4x(1 − x^2) · 4y^3 ≠ f(x, y),
        ∴ X and Y are not independent.

12. (b) (ii) Let Xi represent the resistance of the i-th resistor.
    Then X1, X2, …, X20 are independent, identically distributed random variables.
    Since the resistors are connected in series, the total resistance is the R.V.
        Sn = X1 + X2 + ⋯ + X20
    Here E[Xi] = 5 and Var[Xi] = 0.20 for all i.
    By the CLT, Sn ~ N(nμ, nσ^2) approximately.
    To find P[Sn > 98], let Z = (Sn − nμ)/√(nσ^2), where
        nμ = 20 × 5 = 100,  nσ^2 = 20 × 0.20 = 4,  √(nσ^2) = 2
        ∴ Z = (Sn − 100)/2
    Sn = 98  ⇒  z = (98 − 100)/2 = −1
    ∴ P[Sn > 98] = P[Z > −1]
                 = 0.5 + P[−1 < Z < 0]
                 = 0.5 + P[0 < Z < 1]
                 = 0.5 + 0.3413
    P[Sn > 98] = 0.8413
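The same probability follows directly from the normal survival function; the short check below (illustrative only) assumes SciPy is available.

```python
from scipy.stats import norm

n, mean_i, var_i = 20, 5.0, 0.20
mu, sigma = n*mean_i, (n*var_i) ** 0.5    # S_n ~ N(100, 4), so sigma = 2

p = norm.sf(98, loc=mu, scale=sigma)      # P(S_n > 98) = P(Z > -1)

print(round(p, 4))                        # 0.8413
```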

13. (a) (i) The probability distribution of X(t) is

        X(t) = n:           0            1             2              3         …
        Pn = P[X(t) = n]: at/(1+at)  1/(1+at)^2  at/(1+at)^3  (at)^2/(1+at)^4   …

    E[X(t)] = Σ_{n=0}^{∞} n Pn
            = 1/(1+at)^2 + 2at/(1+at)^3 + 3(at)^2/(1+at)^4 + …
            = [1/(1+at)^2] [1 + 2(at/(1+at)) + 3(at/(1+at))^2 + …]
            = [1/(1+at)^2] [1 − at/(1+at)]^(−2)
            = [1/(1+at)^2] (1+at)^2 = 1

    E[X^2(t)] = Σ_{n=1}^{∞} n^2 Pn = Σ_{n=1}^{∞} [n(n+1) − n] (at)^(n−1)/(1+at)^(n+1)
              = [1/(1+at)^2] [ Σ n(n+1) (at/(1+at))^(n−1) − Σ n (at/(1+at))^(n−1) ]
              = [1/(1+at)^2] [ 2(1 − at/(1+at))^(−3) − (1 − at/(1+at))^(−2) ]
                 (∵ Σ n(n+1) x^(n−1) = 2(1 − x)^(−3) and Σ n x^(n−1) = (1 − x)^(−2))
              = 2(1+at)^3/(1+at)^2 − (1+at)^2/(1+at)^2
              = 2(1 + at) − 1
    E[X^2(t)] = 1 + 2at
    ∴ Var[X(t)] = 1 + 2at − 1 = 2at

    Although E[X(t)] is a constant, Var[X(t)] = 2at depends on t,
    ∴ {X(t)} is not stationary.

13. (a) (ii) The t.p.m. of the given problem is

              A     B     C
        A  [  0     1     0  ]
    P = B  [ 2/3    0    1/3 ]
        C  [ 2/3   1/3    0  ]

    The steady-state distribution π = (π1, π2, π3) of the Markov chain satisfies
    πP = π and π1 + π2 + π3 = 1:
        (2/3)(π2 + π3) = π1            (1)
        π1 + (1/3)π3 = π2              (2)
        (1/3)π2 = π3                   (3)
    From (2) and (3): π2 = 3π3 and π1 = 3π3 − π3/3 = (8/3)π3.
    π1 + π2 + π3 = 1 ⇒ (8/3)π3 + 3π3 + π3 = 1 ⇒ (20/3)π3 = 1
    ∴ π3 = 3/20, π2 = 9/20, π1 = 8/20.
    The steady-state probability distribution is
        π = (8/20, 9/20, 3/20) = (0.40, 0.45, 0.15).

    Thus, in the long run, he sells 40% of the time in city A, 45% of the time in city B, and 15% of the time in city C.

13. (b) (i)
            [0.1 0.5 0.4] [0.1 0.5 0.4]   [0.43 0.31 0.26]
    P(2) =  [0.6 0.2 0.2] [0.6 0.2 0.2] = [0.24 0.42 0.34]
            [0.3 0.4 0.3] [0.3 0.4 0.3]   [0.36 0.35 0.29]

    (1) P[X2 = 3] = Σ_{i=1}^{3} P[X2 = 3 / X0 = i] P[X0 = i]
                  = p13(2) P{X0 = 1} + p23(2) P{X0 = 2} + p33(2) P{X0 = 3}
                  = 0.26 × 0.7 + 0.34 × 0.2 + 0.29 × 0.1
                  = 0.182 + 0.068 + 0.029 = 0.279

    (2) P{X3 = 2, X2 = 3, X1 = 3, X0 = 2}
        = P{X3 = 2 / X2 = 3} × P{X2 = 3 / X1 = 3} × P{X1 = 3 / X0 = 2} × P{X0 = 2}
        = p32(1) × p33(1) × p23(1) × P{X0 = 2}
        = 0.4 × 0.3 × 0.2 × 0.2 = 0.0048
13. (b) (ii) For a Poisson process with mean arrival rate λ, the mean number of arrivals in time t is λt. Given λ = 3 per minute and t = 2 minutes, λt = 6 and
        P{X(t) = x} = e^(−λt) (λt)^x / x!

    (1) P{X(2) = 4} = e^(−6) 6^4 / 4! = 0.1339

    (2) P{X(2) > 4} = 1 − P{X(2) ≤ 4}
        = 1 − [P{X(2) = 0} + P{X(2) = 1} + P{X(2) = 2} + P{X(2) = 3} + P{X(2) = 4}]
        = 1 − Σ_{x=0}^{4} e^(−6) 6^x / x!
        = 1 − e^(−6) [1 + 6/1! + 6^2/2! + 6^3/3! + 6^4/4!]
        = 0.715

    (3) P[fewer than 4 customers arrive] = P{X(2) < 4}
        = P{X(2) = 0} + P{X(2) = 1} + P{X(2) = 2} + P{X(2) = 3}
        = e^(−6) [1 + 6 + 18 + 36]
        = 61 e^(−6) = 0.1512
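These three Poisson probabilities can be confirmed with a library distribution; the sketch below (not part of the original solution) assumes SciPy is available.

```python
from scipy.stats import poisson

mu = 3 * 2                              # lambda * t = 6 expected arrivals in 2 minutes

p_exactly_4 = poisson.pmf(4, mu)        # (1) P(X = 4)
p_more_than_4 = poisson.sf(4, mu)       # (2) P(X > 4) = 1 - P(X <= 4)
p_fewer_than_4 = poisson.cdf(3, mu)     # (3) P(X < 4) = P(X <= 3)

print(round(p_exactly_4, 4), round(p_more_than_4, 4), round(p_fewer_than_4, 4))
# 0.1339 0.7149 0.1512
```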
14. (a) (i) There is one server and an unlimited number of customers,
    ∴ the given model is (M/M/1) : (∞/FIFO).
        λ = 10/(8 × 60) = 1/48 set per minute
        μ = 1/30 set per minute
    Probability that there is no unit in the system:
        P0 = 1 − λ/μ = 1 − 30/48 = 3/8
    (1) Repairman's expected idle time in an 8-hour day
        = 8 × P0 = 8 × 3/8 = 3 hours
    (2) Expected number of jobs in the system:
        Ls = λ/(μ − λ) = (1/48) / (1/30 − 1/48)
           = (1/48) / ((48 − 30)/(48 × 30))
           = 30/18 = 5/3 jobs
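A compact numerical restatement of this M/M/1 computation (illustrative only) is given below in Python.

```python
# Repairman problem, rates per minute.
lam = 10 / (8 * 60)        # arrival rate: 10 sets per 8-hour day
mu = 1 / 30                # service rate: one set per 30 minutes

P0 = 1 - lam / mu          # probability the repairman is idle
idle_hours = 8 * P0        # expected idle time in an 8-hour day
Ls = lam / (mu - lam)      # expected number of jobs in the system

print(P0, idle_hours, Ls)  # 0.375, 3.0 hours, 1.666... jobs
```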
14. (a) (ii) Given λ = 1/6 per minute, μ = 1/4 per minute, s = 2.
    ∴ The given model is (M/M/s) : (∞/FIFO).
        P0 = [ Σ_{n=0}^{s−1} (1/n!)(λ/μ)^n + (λ/μ)^s / (s!(1 − λ/sμ)) ]^(−1)
    Here λ/μ = (1/6)/(1/4) = 2/3 and λ/(sμ) = 1/3, so
        P0 = [ 1 + 2/3 + (1/2!)(2/3)^2 · 1/(1 − 1/3) ]^(−1)
           = [ 1 + 2/3 + 1/3 ]^(−1) = 1/2

    (1) Probability of having to wait for service:
        P[N ≥ s] = (λ/μ)^s P0 / (s!(1 − λ/sμ))
                 = (2/3)^2 (1/2) / (2!(1 − 1/3))
                 = (4/9)(1/2)(3/4) = 1/6

    (2) The fraction of time the servers are busy is ρ = λ/(sμ) = 1/3,
        so the expected idle time for each girl is 1 − 1/3 = 2/3 ≈ 0.67,
        i.e. the expected percentage of idle time for each girl is about 67%.

    (3) Expected length of a customer's waiting time in the system:
        Ws = Wq + 1/μ
           = (λ/μ)^s P0 / (s · s! · μ · (1 − λ/sμ)^2) + 1/μ
           = (2/3)^2 (1/2) / (2 · 2! · (1/4) · (2/3)^2) + 4
           = 1/2 + 4 = 4.5 minutes
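The multi-server results above follow from the standard Erlang-C expressions; the helper below is a minimal sketch of those formulas (the function name mmc_metrics is ours, not from the text) and reproduces P0 = 1/2, P(wait) = 1/6 and Ws = 4.5 minutes for this problem.

```python
from math import factorial

def mmc_metrics(lam, mu, s):
    """Steady-state P0, P(wait), Lq, Wq, Ws for an (M/M/s):(inf/FIFO) queue."""
    r = lam / mu                        # offered load lambda/mu
    rho = lam / (s * mu)                # server utilisation (must be < 1)
    P0 = 1 / (sum(r**n / factorial(n) for n in range(s))
              + r**s / (factorial(s) * (1 - rho)))
    p_wait = r**s * P0 / (factorial(s) * (1 - rho))     # Erlang C probability
    Lq = p_wait * rho / (1 - rho)
    Wq = Lq / lam
    return P0, p_wait, Lq, Wq, Wq + 1/mu

# Supermarket problem: lam = 1/6 per min, mu = 1/4 per min, s = 2.
print(mmc_metrics(1/6, 1/4, 2))
```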
14. (b) (i) λ = 1/15 per minute, μ = 1/33 per minute,
    ∴ ρ = λ/μ = 33/15 = 2.2, and the capacity of the yard is k = 5.

    (1) P0 = (1 − ρ)/(1 − ρ^(k+1)),  λ ≠ μ
           = (1 − 2.2)/(1 − 2.2^6)
           = (−1.2)/(1 − 113.379904)
           = (−1.2)/(−112.379904) = 0.01068
        i.e. the probability that the yard is empty is about 0.0107.

    (2) Average number of trains in the system:
        Ls = Σ_{n=0}^{k} n Pn,  where Pn = ρ^n (1 − ρ)/(1 − ρ^(k+1)) = ρ^n P0
           = P0 [ρ + 2ρ^2 + 3ρ^3 + 4ρ^4 + 5ρ^5]
           = 0.01068 [2.2 + 2(2.2)^2 + 3(2.2)^3 + 4(2.2)^4 + 5(2.2)^5]
           ≈ 4.22 trains
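The finite-capacity results can be checked numerically; a short sketch follows (illustrative only).

```python
# M/M/1/k check for the train yard: rho = 2.2, capacity k = 5.
rho, k = 2.2, 5

P0 = (1 - rho) / (1 - rho**(k + 1))
Ls = sum(n * rho**n * P0 for n in range(k + 1))

print(round(P0, 5), round(Ls, 2))   # 0.01068, 4.22
```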

14. (b) (ii) Arrival rate λ = 15 per hour, service rate μ = 6 per hour, number of servers s = 3,
    ∴ λ/μ = 5/2 and λ/(sμ) = 5/6.
    Hence this is a multiple-server (M/M/s) : (∞/FIFO) model.

    (a) P[all the typists are busy] = P[N ≥ 3]
            = (λ/μ)^3 P0 / (3!(1 − λ/3μ))                        (1)
        where
            P0 = [ Σ_{n=0}^{s−1} (1/n!)(λ/μ)^n + (λ/μ)^s / (s!(1 − λ/sμ)) ]^(−1)
               = [ 1 + 2.5 + (1/2)(2.5)^2 + (2.5)^3/(3!(1 − 5/6)) ]^(−1)
               = [22.25]^(−1) = 0.0449
        (1) ⇒ P[N ≥ 3] = (2.5)^3 × 0.0449 / (3! × 1/6) = 0.7016
        Hence the fraction of the time all the typists will be busy = 0.7016.

    (b) The average number of letters waiting to be typed:
        Lq = (1/(s · s!)) (λ/μ)^(s+1) P0 / (1 − λ/sμ)^2
           = (1/(3 × 6)) × (2.5)^4 × 0.0449 / (1/6)^2
           = 3.5078

15. (a) (i) This is an (M/G/1) : (GD/∞/∞) model with constant (deterministic) service time.
        λ = 4 cars/hour.
        T is the service time, constant and equal to 10 minutes,
        so E(T) = 10 minutes and V(T) = 0.
        1/μ = 10 minutes ⇒ μ = 6 cars/hour, σ^2 = Var(T) = 0,
        ρ = λ/μ = 4/6 = 2/3.

    (1) Mean number of customers in the system:
        Ls = ρ + (λ^2 σ^2 + ρ^2)/(2(1 − ρ))
           = 2/3 + (0 + (2/3)^2)/(2(1 − 2/3))
           = 2/3 + 2/3 = 1.333 cars

    (2) Mean number of customers in the queue:
        Lq = (λ^2 σ^2 + ρ^2)/(2(1 − ρ)) = (0 + (2/3)^2)/(2(1 − 2/3)) = 0.667 cars

    (3) Mean waiting time of a customer in the system:
        Ws = Ls/λ = 1.333/4 = 0.333 hour

    (4) Mean waiting time of a customer in the queue:
        Wq = Lq/λ = 0.667/4 = 0.167 hour
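The four M/D/1 measures can be reproduced directly from the P–K formula; the following is a small illustrative check in Python.

```python
# Car wash: constant 10-minute service, lam = 4 cars per hour.
lam = 4.0
ET, VT = 10/60, 0.0          # E(T) in hours, Var(T) = 0 (deterministic service)
rho = lam * ET               # 2/3

Ls = rho + (lam**2 * VT + rho**2) / (2 * (1 - rho))
Lq = Ls - rho
Ws, Wq = Ls / lam, Lq / lam

print(Ls, Lq, Ws, Wq)        # 1.333..., 0.666..., 0.333... h, 0.166... h
```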
15. (a) (ii) Arrival rate λ = 120 students/hour = 120/60 = 2 per minute.
    Service rate of counter 1: one per 20 seconds ⇒ μ1 = 60/20 = 3 per minute.
    Service rate of counter 2: one per 15 seconds ⇒ μ2 = 60/15 = 4 per minute.
    Service rate of counter 3: one per 12 seconds ⇒ μ3 = 60/12 = 5 per minute.
    The counters form a series of M/M/1 queues, so the average number of students present in the controller's office is
        Ls = Ls1 + Ls2 + Ls3
           = λ/(μ1 − λ) + λ/(μ2 − λ) + λ/(μ3 − λ)
           = 2/(3 − 2) + 2/(4 − 2) + 2/(5 − 2)
           = 2 + 1 + 2/3
           = 11/3 ≈ 3.667 students
15. (b) (i) Let N and N′ be the numbers of customers in the system at times t and t + T, when two consecutive customers have just left the system after getting service. Thus T is the random service time, a continuous random variable. Let f(t), E(T), Var(T) be the p.d.f., mean and variance of T, and let M be the number of customers arriving in the system during the service time T. Hence
        N′ = M, if N = 0;  N′ = N − 1 + M, if N > 0,
    where M is a discrete R.V. taking the values 0, 1, 2, …
        ∴ N′ = N − 1 + M + δ                                    (1)
    where δ = 1 if N = 0 and δ = 0 if N > 0.
        ∴ E(N′) = E(N) − 1 + E(M) + E(δ)                        (2)
    When the system has reached the steady state, the probability of the number of customers in the system is constant, hence
        E(N) = E(N′) and E(N^2) = E(N′^2)                       (3)
    Using this in (2): E(δ) = 1 − E(M)                          (4)
    Squaring both sides of (1):
        N′^2 = N^2 + (M − 1)^2 + δ^2 + 2N(M − 1) + 2(M − 1)δ + 2Nδ   (5)
    Now δ^2 = δ (since δ = 0 or 1) and Nδ = 0 (when N = 0, Nδ = 0; when N > 0, δ = 0). Using these in (5):
        N′^2 = N^2 + M^2 + 2N(M − 1) + (2M − 1)δ − 2M + 1
    i.e. 2N(1 − M) = N^2 − N′^2 + M^2 + (2M − 1)δ − 2M + 1.
    Taking expectations and using (3) and the independence of M from N and δ:
        2E(N){1 − E(M)} = E(M^2) + {2E(M) − 1}E(δ) − 2E(M) + 1
        E(N) = [E(M^2) + {2E(M) − 1}{1 − E(M)} − 2E(M) + 1] / [2{1 − E(M)}]
             = [E(M^2) − 2E^2(M) + E(M)] / [2{1 − E(M)}]        (by (4))
    Since the number of arrivals in time T follows a Poisson process with parameter λ,
        E(M/T) = λT and Var(M/T) = λT
        ∴ E(M) = E{E(M/T)} = E(λT) = λE(T)                      (6)
          E(M^2) = E{E(M^2/T)} = E(λ^2 T^2 + λT) = λ^2{Var(T) + E^2(T)} + λE(T)   (7)
    Using (6) and (7):
        Ls = E(N) = [λ^2 V(T) + λ^2 E^2(T) + λE(T) − 2λ^2 E^2(T) + λE(T)] / [2{1 − λE(T)}]
           = λE(T) + λ^2{V(T) + E^2(T)} / [2{1 − λE(T)}],
    which is the Pollaczek–Khintchine formula.
    With constant service time, V(T) = 0; putting μ = 1/E(T) and ρ = λ/μ = λE(T),
        Ls = ρ + ρ^2/(2(1 − ρ)).

15. (b) (ii) Given an open queueing network with three nodes 1, 2 and 3.
    Let λ1, λ2, λ3 be the resultant (total) arrival rates and r1, r2, r3 be the arrival rates of customers coming to node j from outside the system.
    Given (r1, r2, r3) = (1, 4, 3) and
              [  0   0.6  0.3 ]
        Pij = [ 0.1   0   0.3 ]
              [ 0.4  0.4   0  ]
    Jackson's flow balance equations are
        λj = rj + Σ_{i=1}^{3} λi Pij,  j = 1, 2, 3               (1)
    We note that P12 = 0.6, P13 = 0.3, P21 = 0.1, P23 = 0.3, P31 = 0.4, P32 = 0.4, P11 = P22 = P33 = 0.

    Taking j = 1:
        λ1 = r1 + λ1 P11 + λ2 P21 + λ3 P31 = 1 + 0 + 0.1λ2 + 0.4λ3
        λ1 − 0.1λ2 − 0.4λ3 = 1                                   (2)
    Taking j = 2:
        λ2 = r2 + λ1 P12 + λ2 P22 + λ3 P32 = 4 + 0.6λ1 + 0 + 0.4λ3
        −0.6λ1 + λ2 − 0.4λ3 = 4                                  (3)
    Taking j = 3:
        λ3 = r3 + λ1 P13 + λ2 P23 + λ3 P33 = 3 + 0.3λ1 + 0.3λ2 + 0
        −0.3λ1 − 0.3λ2 + λ3 = 3                                  (4)
    Solving (2), (3) and (4), the average arrival rates are
        λ1 = 5, λ2 = 10, λ3 = 7.5.
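The traffic equations form a small linear system, so the arrival rates can be verified numerically; the sketch below assumes NumPy is available.

```python
import numpy as np

r = np.array([1.0, 4.0, 3.0])              # external arrival rates r_j
P = np.array([[0.0, 0.6, 0.3],
              [0.1, 0.0, 0.3],
              [0.4, 0.4, 0.0]])            # routing probabilities P_ij

# Traffic equations: lambda = r + P^T lambda  =>  (I - P^T) lambda = r
lam = np.linalg.solve(np.eye(3) - P.T, r)

print(lam)                                 # [ 5.  10.   7.5]
```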



B.E./B. Tech. DEGREE EXAMINATION,
NOV/ DEC 2011
Fourth Semester
Computer science and Engineering
MA 2262 – PROBABILITY AND QUEUING THEORY
(Common to Information Technology)
(Regulation 2008)
Time : Three hours Maximum : 100 marks
(Normal tables be permitted in the examination hall )
Answer All questions.
PART A-(10 × 2 = 20 marks)
1. A continuous random variable X that can assume any value between x = 2 and x = 5 has a density function given by f(x) = k(1 + x). Find P(X < 4).

2. Suppose X has an exponential distribution with mean equal to 10. Determine the value of x such that P[X < x] = 0.95.
3. The joint pdf of the RV (X, Y) is given by f(x, y) = K x y e^(−(x^2 + y^2)), x > 0, y > 0.
   Find the value of K.

4. Given the RV X with density function f(x) = 2x for 0 < x < 1, and 0 elsewhere, find the pdf of Y = 8X^3.

5. Define transition probability matrix

6. Define Markov process.

7. Bring out the assumptions of a queueing system.

8. What do the letters in the symbolic representation (a/b/c) : (d/e) of a


queueing model represent?


9. Write the Pollaczek–Khintchine formula.

10. Define series queues.

PART B-(5 × 16 = 80 marks)


11. (a) (i) The DF (c.d.f.) of a continuous random variable X is given by
        F(x) = 0,                      x < 0
             = x^2,                    0 ≤ x < 1/2
             = 1 − (3/25)(3 − x)^2,    1/2 ≤ x < 3
             = 1,                      x ≥ 3
    Find the pdf of X and evaluate P(|X| ≤ 1) and P(1/3 ≤ X < 4) using both the pdf and the DF.
        (ii) A random variable X has the following probability distribution:
             x:     −2    −1    0    1    2    3
             p(x):  0.1    k   0.2   2k  0.3   3k
             (1) Find k. (2) Evaluate P(X < 2) and P(−2 < X < 2). (3) Find the c.d.f. of X. (4) Evaluate the mean of X.
    (b) (i) The probability function of an infinite discrete distribution is given by P(X = j) = 1/2^j, j = 1, 2, 3, … Verify that the total probability is 1 and find the mean and variance of the distribution. Find also P(X is even), P(X ≥ 5) and P(X is divisible by 3).
        (ii) Define the Gamma distribution and find its mean and variance.

12. (a) (i) The joint probability mass function of (X,Y) is given by P(x,y)
= k (2x + 3y), x = 0, 1, 2; y = 1, 2, 3. Find all the marginal and
conditional probability distributions.
(ii) State and prove central limit theorem.
(Or)
    (b) (i) If X and Y are independent RVs with pdfs e^(−x), x ≥ 0, and e^(−y), y ≥ 0, respectively, find the density functions of U = X/(X + Y) and V = X + Y. Are U and V independent?
        (ii) Find the correlation coefficient for the following data:
             X: 10  14  18  22  26  30
             Y: 18  12  24   6  30  36

13. (a) (i) Define Poisson process and derive the Poisson probability law.
(ii) A man either drives a car (or) catches a train to go to office each
day. He never goes two days in a row by train but if he drives one
day, then the next day he is just as likely to drive again as he is to
travel by train. Now suppose that on the first day of the week, the
man tossed a fair die and drove to work if and only if a 6 appeared.
Find (1) the probability that he takes a train on the third day and
(2) the probability that he drives to work in the long run.
(Or)
    (b) (i) Show that the random process X(t) = A cos(ω0 t + θ) is wide-sense stationary, if A and ω0 are constants and θ is a uniformly distributed RV in (0, 2π).
(ii) If customers arrive at a counter in accordance with a Poisson pro-
cess with a mean rate of 2 per minute ,find the probability that the
interval between 2 consecutive arrivals is (1) more than 1 min.(2)
between 1 min and 2 min and (3) 4 min (or) less.

14. (a) Find the mean number of customers in the queue system, average
waiting time in the queue and system of M/M/1 queueing model.
(Or)
(b) There are three typists in an office. Each typist can type an average
of 6 letters per hour. If letters arrive for being typed at the rate of 15
letters per hour,
(i) What fraction of the time all the typists will be busy?
(ii) What is the average number of letters waiting to be typed?
(iii) What is the average time a letter has to spend for waiting and
for being typed?
(iv) What is the probability that a letter will take longer than 20 min.
waiting to be typed and being typed?

(15). (a) Discuss M/G/1 queueing model and derive Pollaczek–Khinchine


formula.
(Or)
(b) Discuss open and closed networks.



Solutions
PART- A

1. We know that ∫_{−∞}^{∞} f(x) dx = 1. Given 2 < x < 5:
       k ∫_2^5 (1 + x) dx = 1
       k [x + x^2/2]_2^5 = 1
       k [5 + 25/2 − 2 − 4/2] = 1
       (27/2) k = 1  ⇒  k = 2/27
   ∴ f(x) = (2/27)(1 + x), 2 < x < 5.
   P[X < 4] = ∫_2^4 (2/27)(1 + x) dx
            = (2/27) [x + x^2/2]_2^4
            = (2/27) [4 + 16/2 − 2 − 4/2]
            = (2/27)(8) = 16/27


A random variable X that can assume the values 0, 1, 2, … such that its probability mass function is given by
    P[X = r] = e^(−λ) λ^r / r!,  r = 0, 1, 2, …, λ > 0
is said to follow a Poisson distribution with parameter λ, with mean E[X] = λ and Var(X) = λ.

2. Refer Question 4 (Nov/Dec-2009)

3. We know that ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1.
   ∴ ∫_0^∞ ∫_0^∞ K x y e^(−(x^2 + y^2)) dx dy = 1,  x, y > 0
     K ∫_0^∞ x e^(−x^2) dx ∫_0^∞ y e^(−y^2) dy = 1
   Put x^2 = t and y^2 = s, so that x dx = dt/2 and y dy = ds/2:
     K ∫_0^∞ e^(−t) dt/2 ∫_0^∞ e^(−s) ds/2 = 1
     (K/4) [−e^(−t)]_0^∞ [−e^(−s)]_0^∞ = 1
     K/4 = 1  ⇒  K = 4


4. Since y = 8x^3 is a strictly increasing function in (0, 1),
       y = (2x)^3  ⇒  x = (1/2) y^(1/3)
   f_Y(y) = f_X(x) |dx/dy|
          = 2x × (1/6) y^(−2/3)
          = y^(1/3) × (1/6) y^(−2/3)
          = (1/6) y^(−1/3),  0 < y < 8
5. When the Markov chain is homogeneous, the one-step transition probability is denoted by pij. The matrix P = {pij} is called the transition probability matrix, satisfying the conditions (i) pij ≥ 0 and (ii) Σ_j pij = 1 for all i, i.e. the sum of the elements of any row of the t.p.m. is 1.

6. A random process {X(t)} is called a Markov process if
       P[X(tn) = an / X(tn−1) = an−1, X(tn−2) = an−2, …, X(t2) = a2, X(t1) = a1]
     = P[X(tn) = an / X(tn−1) = an−1]
   for all t1 < t2 < … < tn.
   In other words, if the future behaviour of a process depends only on the present value and not on the past, then the process is called a Markov process.
   [Figure in the original: birth–death state-transition diagram with states 0, 1, 2, …, n, n + 1, arrival rate λ and service rate μ between adjacent states.]

7. Refer Question 9 to (May/June-2009)

8. In the symbolic representation (a/b/c) : (d/e),
   a – inter-arrival time distribution
   b – service time distribution (service mechanism)
   c – number of servers
   d – capacity of the system
   e – queue discipline

9. (i) Average number of customers in the system:
       Ls = ρ + (λ^2 σ^2 + ρ^2)/(2(1 − ρ)),  where ρ = λ/μ and σ^2 = V(T).
   (ii) Average queue length:
       Lq = (λ^2 σ^2 + ρ^2)/(2(1 − ρ)).

10. A series model or tandem queue model is, in general, one in which
(i) Customers may arrive from outside the system at any node and may
leave the system from any node.
(ii) Customers may enter the system at some node, traverse from node
to node in the system and leave the system from some node, not
necessarily following the same order of nodes.
(iii) Customers may return to the nodes
previously visited, skip some nodes entirely and even choose to remain
in the system forever.

11. (a) (i) Differentiating F(x) (the points x = 0, 1/2 and 3 are points of continuity):
        f(x) = 0,                x < 0
             = 2x,               0 ≤ x < 1/2
             = (6/25)(3 − x),    1/2 ≤ x < 3
             = 0,                x ≥ 3
    Although the points x = 1/2 and 3 are points of discontinuity of f(x), we may take f(1/2) = 3/5 and f(3) = 0.

    Using the pdf:
        P[|X| ≤ 1] = P[−1 ≤ X ≤ 1]
                   = ∫_0^{1/2} 2x dx + ∫_{1/2}^{1} (6/25)(3 − x) dx = 13/25
    Using the DF:
        P[|X| ≤ 1] = F(1) − F(−1) = [1 − (3/25)(3 − 1)^2] − 0 = 13/25

    Using the pdf:
        P[1/3 ≤ X < 4] = ∫_{1/3}^{1/2} 2x dx + ∫_{1/2}^{3} (6/25)(3 − x) dx = 8/9
    Using the DF:
        P[1/3 ≤ X < 4] = F(4) − F(1/3) = 1 − 1/9 = 8/9

11. (a) (ii)
    (1) Since Σ p(x) = 1:
        0.1 + k + 0.2 + 2k + 0.3 + 3k = 1
        6k + 0.6 = 1  ⇒  k = 1/15
    ∴ The probability distribution is
        x:     −2     −1     0     1     2     3
        p(x): 1/10   1/15   1/5   2/15  3/10  1/5

    (2) P[X < 2] = P[X = −2] + P[X = −1] + P[X = 0] + P[X = 1]
                 = 1/10 + 1/15 + 1/5 + 2/15 = 1/2
        P[−2 < X < 2] = P[X = −1] + P[X = 0] + P[X = 1]
                      = 1/15 + 1/5 + 2/15 = 2/5

    (3) The c.d.f. is
        F(x) = 0      when x < −2
             = 1/10   when −2 ≤ x < −1
             = 1/6    when −1 ≤ x < 0
             = 11/30  when 0 ≤ x < 1
             = 1/2    when 1 ≤ x < 2
             = 4/5    when 2 ≤ x < 3
             = 1      when 3 ≤ x

    (4) Mean of X: E[X] = Σ x p(x)
        = (−2)(1/10) + (−1)(1/15) + 0 + 1(2/15) + 2(3/10) + 3(1/5)
        = 16/15


11. (b) (i)
    Σ_{j=1}^{∞} pj = 1/2 + 1/2^2 + 1/2^3 + … = (1/2)/(1 − 1/2) = 1     (geometric series)

    Mean of X: E[X] = Σ_{j=1}^{∞} j pj. Let a = 1/2.
        E[X] = a + 2a^2 + 3a^3 + … = a[1 + 2a + 3a^2 + …]
             = a(1 − a)^(−2) = (1/2)/(1 − 1/2)^2 = 2

    Variance of X: V(X) = E[X^2] − (E[X])^2, where
        E[X^2] = Σ_{j=1}^{∞} j^2 a^j = Σ [j(j + 1) − j] a^j
               = a[1·2 + 2·3a + 3·4a^2 + …] − a[1 + 2a + 3a^2 + …]
               = 2a(1 − a)^(−3) − a(1 − a)^(−2)
               = 8 − 2 = 6                     (∵ a = 1/2)
        ∴ V(X) = E[X^2] − [E(X)]^2 = 6 − 4 = 2

    P[X is even] = P[X = 2] + P[X = 4] + P[X = 6] + …      (mutually exclusive events)
                 = (1/2)^2 + (1/2)^4 + (1/2)^6 + …
                 = (1/4)/(1 − 1/4) = 1/3

    P[X ≥ 5] = P[X = 5] + P[X = 6] + … = (1/2)^5/(1 − 1/2) = 1/16

    P[X is divisible by 3] = P[X = 3] + P[X = 6] + P[X = 9] + …
                           = (1/2)^3 + (1/2)^6 + …
                           = (1/8)/(1 − 1/8) = 1/7

11. (b) (ii) Gamma distribution:
    A continuous random variable X is said to follow an Erlang distribution or general Gamma distribution with parameters λ > 0 and k > 0 if its probability density function is given by
        f(x) = λ^k x^(k−1) e^(−λx) / Γ(k),  for x ≥ 0
             = 0, otherwise.

    Moments:
        The r-th raw moment is μ′r = E[X^r]
            = (λ^k/Γ(k)) ∫_0^∞ x^(k+r−1) e^(−λx) dx
        Put λx = t ⇒ dx = dt/λ; when x = 0, t = 0, and when x = ∞, t = ∞:
            = (λ^k/Γ(k)) ∫_0^∞ (t/λ)^(k+r−1) e^(−t) dt/λ
            = (λ^k/Γ(k)) × (1/λ^(k+r)) ∫_0^∞ t^(k+r−1) e^(−t) dt
            = Γ(k + r)/(λ^r Γ(k))          (∵ Γ(k) = ∫_0^∞ x^(k−1) e^(−x) dx)

    ∴ Mean = E[X] = Γ(k + 1)/(λ Γ(k)) = k/λ

    Var(X) = E[X^2] − (E[X])^2
           = Γ(k + 2)/(λ^2 Γ(k)) − (k/λ)^2
           = [k(k + 1) − k^2]/λ^2
           = k/λ^2
12. (a) (i) Given P(x, y) = k(2x + 3y), x = 0, 1, 2; y = 1, 2, 3.
    The joint probability distribution of (X, Y) is given below:

        X\Y    1     2     3
        0     3k    6k    9k
        1     5k    8k   11k
        2     7k   10k   13k

    Since Σ_{y=1}^{3} Σ_{x=0}^{2} p(x, y) = 1:
        3k + 6k + 9k + 5k + 8k + 11k + 7k + 10k + 13k = 1
        72k = 1  ⇒  k = 1/72

        X\Y           1      2      3     Σ_y p(x, y)
        0            3/72   6/72   9/72    18/72
        1            5/72   8/72  11/72    24/72
        2            7/72  10/72  13/72    30/72
        Σ_x p(x, y) 15/72  24/72  33/72      1

    Marginal distribution of X: P[X = x] = Σ_y p(x, y)
        x      0      1      2
        P(x) 18/72  24/72  30/72
    Marginal distribution of Y: P[Y = y] = Σ_x p(x, y)
        y      1      2      3
        P(y) 15/72  24/72  33/72

    Conditional distribution of X given Y = y:
        P[X = x / Y = y] = P[X = x ∩ Y = y] / P[Y = y]

        P[X = 0 / Y = 1] = (3/72)/(15/72) = 1/5
        P[X = 1 / Y = 1] = (5/72)/(15/72) = 1/3
        P[X = 2 / Y = 1] = (7/72)/(15/72) = 7/15
        P[X = 0 / Y = 2] = (6/72)/(24/72) = 1/4
        P[X = 1 / Y = 2] = (8/72)/(24/72) = 1/3
        P[X = 2 / Y = 2] = (10/72)/(24/72) = 5/12
        P[X = 0 / Y = 3] = (9/72)/(33/72) = 3/11
        P[X = 1 / Y = 3] = (11/72)/(33/72) = 1/3
        P[X = 2 / Y = 3] = (13/72)/(33/72) = 13/33

    The conditional distributions of Y given X = x are obtained in the same way by dividing each row entry by its row total P(x).

12. (a) (ii) Central limit theorem: If X1, X2, …, Xn is a sequence of n independent and identically distributed random variables, each having mean μ and variance σ^2, and if X̄ = (X1 + X2 + … + Xn)/n, then the variable Z = (X̄ − μ)/(σ/√n) has a distribution that approaches the standard normal distribution as n → ∞, provided the M.G.F. exists.

    Proof: The M.G.F. of Z about the origin is
        M_Z(t) = E[e^(tZ)] = E[exp{t(X̄ − μ)√n/σ}]
               = e^(−tμ√n/σ) E[exp{(t/(σ√n))(X1 + X2 + … + Xn)}]
               = e^(−tμ√n/σ) E[e^(tX1/(σ√n))] E[e^(tX2/(σ√n))] … E[e^(tXn/(σ√n))],
    since X1, X2, …, Xn are independent. As the Xi have the same M.G.F.,
        M_Z(t) = e^(−tμ√n/σ) [M_X(t/(σ√n))]^n
        log M_Z(t) = −tμ√n/σ + n log M_X(t/(σ√n))
                   = −tμ√n/σ + n log[1 + (t/(σ√n))μ′1 + (1/2!)(t/(σ√n))^2 μ′2 + …]
    Expanding log(1 + x) = x − x^2/2 + … and putting μ′1 = μ:
        log M_Z(t) = −tμ√n/σ + n[(t/(σ√n))μ + (t^2/(2σ^2 n))μ′2 − (1/2)(t/(σ√n))^2 μ^2 + …]
                   = (t^2/(2σ^2))(μ′2 − μ^2) + terms containing n in the denominator.
    As n → ∞,
        log M_Z(t) → (t^2/(2σ^2)) σ^2 = t^2/2,   ∴ M_Z(t) → e^(t^2/2),
    which is the M.G.F. of N(0, 1). Hence, as n → ∞, the distribution of Z tends to the standard normal distribution.


12. (b) (i) Since X and Y are independent,
        f(x, y) = e^(−(x+y)),  x ≥ 0, y ≥ 0.
    Given u = x/(x + y) and v = x + y. Solving, x = uv and y = v(1 − u).
        J = ∂(x, y)/∂(u, v) = | ∂x/∂u  ∂x/∂v |   =  |  v     u    |  =  v
                              | ∂y/∂u  ∂y/∂v |      | −v   1 − u  |
    The joint pdf of (U, V) is
        g(u, v) = |J| e^(−(x+y)) = v e^(−v)
    Range space of U and V:
        x ≥ 0 ⇒ uv ≥ 0 and y ≥ 0 ⇒ v(1 − u) ≥ 0, so 0 ≤ u ≤ 1 and v ≥ 0.
        ∴ g(u, v) = v e^(−v),  0 ≤ u ≤ 1, v ≥ 0

    pdf of U: f_U(u) = ∫_0^∞ v e^(−v) dv = [−v e^(−v) − e^(−v)]_0^∞ = 1,  0 ≤ u ≤ 1
    pdf of V: f_V(v) = ∫_0^1 v e^(−v) du = v e^(−v),  v ≥ 0

    Clearly g(u, v) = f_U(u) f_V(v), ∴ U and V are independent RVs.


12. (b) (ii)

      x    y   x − x̄   y − ȳ   (x − x̄)^2   (y − ȳ)^2   (x − x̄)(y − ȳ)
     10   18    −10      −3       100          9            30
     14   12     −6      −9        36         81            54
     18   24     −2       3         4          9            −6
     22    6      2     −15         4        225           −30
     26   30      6       9        36         81            54
     30   36     10      15       100        225           150
     Σ  120  126      0       0       280        630           252

    x̄ = Σx/n = 120/6 = 20   (n = 6)
    ȳ = Σy/n = 126/6 = 21
    b_yx = Σ(x − x̄)(y − ȳ)/Σ(x − x̄)^2 = 252/280 = 0.9
    b_xy = Σ(x − x̄)(y − ȳ)/Σ(y − ȳ)^2 = 252/630 = 0.4
    r = √(b_yx × b_xy) = √(0.9 × 0.4)
    r = 0.6

13. (a) (i) Poisson process:
    If X(t) represents the number of occurrences of a certain event in (0, t), then the discrete random process {X(t)} is called a Poisson process, provided the following postulates are satisfied:
    (i) P[1 occurrence in (t, t + Δt)] = λΔt + o(Δt)
    (ii) P[0 occurrences in (t, t + Δt)] = 1 − λΔt + o(Δt)
    (iii) P[2 or more occurrences in (t, t + Δt)] = o(Δt)
    (iv) X(t) is independent of the number of occurrences of the event in any interval prior to and after the interval (0, t);
    (v) The probability that the event occurs a specified number of times in (t0, t0 + t) depends only on t, and not on t0.

    Probability law of the Poisson process X(t):
    Let λ be the number of occurrences of the event in unit time and let Pn(t) = P[X(t) = n].
    The probability that there are n occurrences in (0, t + Δt) is
        Pn(t + Δt) = P[(n − 1) occurrences in (0, t) and 1 occurrence in (t, t + Δt)]
                   + P[n occurrences in (0, t) and no occurrence in (t, t + Δt)]
                   = Pn−1(t) λΔt + Pn(t)(1 − λΔt)
        ∴ [Pn(t + Δt) − Pn(t)]/Δt = λ[Pn−1(t) − Pn(t)]
    Taking the limit as Δt → 0:
        d/dt Pn(t) + λPn(t) = λPn−1(t),
    a linear differential equation of the form dy/dx + Py = Q, whose solution y e^(∫P dx) = ∫ Q e^(∫P dx) dx gives
        Pn(t) e^(λt) = λ ∫_0^t Pn−1(t) e^(λt) dt                        (1)

    For n = 0: P0(t + Δt) = P[no occurrence in (0, t + Δt)] = P0(t)(1 − λΔt), so
        [P0(t + Δt) − P0(t)]/Δt = −λP0(t).
    Taking the limit as Δt → 0:
        d/dt P0(t) = −λP0(t)  ⇒  d[P0(t)]/P0(t) = −λ dt
        ∴ log P0(t) = −λt + c  ⇒  P0(t) = A e^(−λt)   (A = e^c)
    Since P0(0) = 1, A = 1 and P0(t) = e^(−λt).

    Taking n = 1 in (1):
        P1(t) e^(λt) = λ ∫_0^t e^(−λt) e^(λt) dt = λ ∫_0^t dt = λt,   ∴ P1(t) = e^(−λt)(λt)
    Taking n = 2 in (1):
        P2(t) e^(λt) = λ ∫_0^t e^(−λt)(λt) e^(λt) dt = λ^2 ∫_0^t t dt = λ^2 t^2/2,   ∴ P2(t) = e^(−λt)(λt)^2/2!
    Proceeding similarly, in general
        P[X(t) = n] = Pn(t) = e^(−λt) (λt)^n / n!,   n = 0, 1, 2, …


13. (a) (ii) The travel pattern is a Markov chain with state space {train (T), car (C)}.
    The t.p.m. of the chain is
              T     C
        T  [  0     1  ]
    P =  C  [ 1/2   1/2 ]

    Probability of travelling by car on day 1 = P[getting 6 in the toss of the die] = 1/6,
    ∴ probability of travelling by train on day 1 = 5/6,
    so the initial state probability distribution is p(1) = (5/6, 1/6).

        p(2) = p(1) P = (5/6, 1/6) P = (1/12, 11/12)
        p(3) = p(2) P = (1/12, 11/12) P = (11/24, 13/24)

    ∴ P[the man travels by train on the third day] = 11/24.

    Steady-state distribution π = (π1, π2) of the Markov chain: by the property πP = π,
        (1/2)π2 = π1                       (1)
        π1 + (1/2)π2 = π2                  (2)
    From (1) and (2) with π1 + π2 = 1:
        π1 = 1/3 and π2 = 2/3
    ∴ P[the man travels by car in the long run] = 2/3.
13. (b) (i) Since θ is uniformly distributed in (0, 2π),
        f(θ) = 1/(b − a) = 1/(2π),  0 < θ < 2π.

    E[X(t)] = E[A cos(ω0 t + θ)]
            = A ∫_0^{2π} (1/2π) cos(ω0 t + θ) dθ
            = (A/2π) [sin(ω0 t + θ)]_0^{2π}
            = (A/2π) [sin(2π + ω0 t) − sin ω0 t]
            = (A/2π) [sin ω0 t − sin ω0 t]         (∵ sin(2π + θ) = sin θ)
            = 0 = constant.

    E[X(t1) X(t2)] = E[A^2 cos(ω0 t1 + θ) cos(ω0 t2 + θ)]
        = (A^2/2) E[cos{(t1 + t2)ω0 + 2θ} + cos{(t1 − t2)ω0}]
        = (A^2/4π) ∫_0^{2π} [cos{(t1 + t2)ω0 + 2θ} + cos{(t1 − t2)ω0}] dθ
        = (A^2/4π) [ (1/2) sin{(t1 + t2)ω0 + 2θ} + θ cos{(t1 − t2)ω0} ]_0^{2π}
        = (A^2/4π) [2π cos{(t1 − t2)ω0}]
        = (A^2/2) cos{(t1 − t2)ω0}
    ∴ R(t1, t2) is a function of (t1 − t2) only and the mean is constant,
    ∴ {X(t)} is a wide-sense stationary process.

13. (b) (ii) The time interval T between two consecutive arrivals is a random variable following an exponential distribution with parameter λ = 2, so the pdf of T is f(t) = 2e^(−2t), t ≥ 0.

    (1) P[T > 1] = ∫_1^∞ 2e^(−2t) dt = e^(−2) = 0.1353
    (2) P[1 < T < 2] = ∫_1^2 2e^(−2t) dt = e^(−2) − e^(−4) = 0.1170
    (3) P[T ≤ 4] = ∫_0^4 2e^(−2t) dt = 1 − e^(−8) ≈ 0.9997
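These exponential-interval probabilities can be confirmed with a library distribution; the sketch below (illustrative only) assumes SciPy is available and uses scale = 1/λ.

```python
from scipy.stats import expon

T = expon(scale=1/2)          # exponential inter-arrival time, rate lambda = 2

p1 = T.sf(1)                  # P(T > 1)
p2 = T.cdf(2) - T.cdf(1)      # P(1 < T < 2)
p3 = T.cdf(4)                 # P(T <= 4)

print(round(p1, 4), round(p2, 4), round(p3, 4))   # 0.1353 0.117 0.9997
```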

14. (a) For the (M/M/1) : (∞/FIFO) model, the number N of customers in the system is a discrete random variable taking the values 0, 1, 2, …, ∞ such that
        P(N = n) = Pn = (λ/μ)^n P0
    We know that
        P0 = 1 / [1 + Σ_{n=1}^{∞} (λ/μ)^n] = 1 / Σ_{n=0}^{∞} (λ/μ)^n
           = 1 / (1 + λ/μ + λ^2/μ^2 + …)
           = 1 / (1 − λ/μ)^(−1) = 1 − λ/μ
        ∴ Pn = (λ/μ)^n (1 − λ/μ)

    Average waiting time of a customer in the queue: an arriving customer has to wait with probability λ/μ, and the conditional waiting time has density (μ − λ)e^(−(μ−λ)ω), so
        Wq = (λ/μ) ∫_0^∞ ω (μ − λ) e^(−(μ−λ)ω) dω
    Putting (μ − λ)ω = t, dω = dt/(μ − λ):
        Wq = (λ/μ) ∫_0^∞ [t/(μ − λ)] e^(−t) dt
           = λ/(μ(μ − λ))            (∵ ∫_0^∞ t e^(−t) dt = 1)

    Hence, by Little's formulas,
        Lq = λWq = λ^2/(μ(μ − λ)),   Ws = Wq + 1/μ = 1/(μ − λ),   Ls = λWs = λ/(μ − λ).
14. (b) Arrival rate λ = 15 per hour, service rate μ = 6 per hour, number of servers s = 3,
    ∴ λ/μ = 5/2 and λ/(sμ) = 5/6.
    Hence this is a multiple-server (M/M/s) : (∞/FIFO) model.

    (i) P[all the typists are busy] = P[N ≥ 3]
            = (λ/μ)^3 P0 / (3!(1 − λ/3μ))                        (1)
        where
            P0 = [ Σ_{n=0}^{s−1} (1/n!)(λ/μ)^n + (λ/μ)^s / (s!(1 − λ/sμ)) ]^(−1)
               = [ 1 + 2.5 + (1/2)(2.5)^2 + (2.5)^3/(3!(1 − 5/6)) ]^(−1)
               = [22.25]^(−1) = 0.0449
        (1) ⇒ P[N ≥ 3] = (2.5)^3 × 0.0449 / (3! × 1/6) = 0.7016
        Hence the fraction of the time all the typists will be busy = 0.7016.

    (ii) The average number of letters waiting to be typed:
        Lq = (1/(s · s!)) (λ/μ)^(s+1) P0 / (1 − λ/sμ)^2
           = (1/(3 × 6)) × (2.5)^4 × 0.0449 / (1/6)^2
           = 3.5078

    (iii) Ws = Ls/λ = (Lq + λ/μ)/λ              (∵ Ls = Lq + λ/μ)
             = (3.5078 + 2.5)/15 = 0.4005 hour ≈ 24 minutes

    (iv) P[W > t] = e^(−μt) [ 1 + (λ/μ)^s P0 {1 − e^(−μt(s − 1 − λ/μ))} / (s!(1 − λ/sμ)(s − 1 − λ/μ)) ]
        For t = 20 min = 1/3 hour, μt = 2 and s − 1 − λ/μ = −0.5:
        P[W > 1/3] = e^(−2) [ 1 + (2.5)^3 × 0.0449 × {1 − e^(−6 × (1/3) × (−0.5))} / (6(1 − 5/6)(−0.5)) ]
                   = e^(−2) [ 1 + 0.7016 (1 − e)/(−0.5) ]
                   = 0.4616

15. (a) Let N and N′ be the numbers of customers in the system at times t and t + T, when two consecutive customers have just left the system after getting service. Thus T is the random service time, a continuous random variable. Let f(t), E(T), Var(T) be the pdf, mean and variance of T, and let M be the number of customers arriving in the system during the service time T. Hence
        N′ = M, if N = 0;  N′ = N − 1 + M, if N > 0,
    where M is a discrete R.V. taking the values 0, 1, 2, …
        ∴ N′ = N − 1 + M + δ                                    (1)
    where δ = 1 if N = 0 and δ = 0 if N > 0.
        ∴ E(N′) = E(N) − 1 + E(M) + E(δ)                        (2)
    When the system has reached the steady state, the probability of the number of customers in the system is constant, hence
        E(N) = E(N′) and E(N^2) = E(N′^2)                       (3)
    Using this in (2): E(δ) = 1 − E(M)                          (4)
    Squaring (1) on both sides:
        N′^2 = N^2 + (M − 1)^2 + δ^2 + 2N(M − 1) + 2(M − 1)δ + 2Nδ   (5)
    Now δ^2 = δ and Nδ = 0, so (5) gives
        N′^2 = N^2 + M^2 + 2N(M − 1) + (2M − 1)δ − 2M + 1
    i.e. 2N(1 − M) = N^2 − N′^2 + M^2 + (2M − 1)δ − 2M + 1.
    Taking expectations and using (3) and the independence of M from N and δ:
        2E(N){1 − E(M)} = E(M^2) + {2E(M) − 1}E(δ) − 2E(M) + 1
        E(N) = [E(M^2) + {2E(M) − 1}{1 − E(M)} − 2E(M) + 1] / [2{1 − E(M)}]
             = [E(M^2) − 2E^2(M) + E(M)] / [2{1 − E(M)}]        (by (4))      (6)
    Since the number of arrivals in time T follows a Poisson process with parameter λ,
        E(M/T) = λT and Var(M/T) = λT
        ∴ E(M) = E{E(M/T)} = E(λT) = λE(T)
          E(M^2) = E{E(M^2/T)} = E(λ^2 T^2 + λT) = λ^2{Var(T) + E^2(T)} + λE(T)
    Using these in (6):
        Ls = E(N) = [λ^2 V(T) + λ^2 E^2(T) + λE(T) − 2λ^2 E^2(T) + λE(T)] / [2{1 − λE(T)}]
           = λE(T) + λ^2{V(T) + E^2(T)} / [2{1 − λE(T)}],
    which is the Pollaczek–Khintchine formula.
15. (b) Open Jackson networks:
    A network of 'k' service facilities or nodes is called an open Jackson network if it satisfies the following characteristics:
    (1) Arrivals from outside to node 'i' follow a Poisson process with mean rate ri; they join the queue at 'i' and wait for their turn for service.
    (2) Service times at the channels at node 'i' are independent and each exponentially distributed with parameter μi.
    (3) Once a customer completes service at node 'i', he joins the queue at node 'j' with probability Pij (whatever be the number of customers waiting at 'j' for service), where i = 1, 2, …, k and j = 0, 1, 2, …, k. Pi0 represents the probability that a customer leaves the system from node 'i' after getting service at 'i'.
    If we denote by λj the total arrival rate of customers to server 'j' [viz. the sum of the arrival rate rj (note: not λj) to 'j' coming from outside and the rates λi Pij of departures from the servers i], then
        λj = rj + Σ_{i=1}^{k} λi Pij,  j = 1, 2, …, k              (1)
    Pij is the probability that a departure from server 'i' joins the queue at server 'j', and hence λi Pij is the rate of arrival to server 'j' from among those coming out of server 'i'.
    Equations (1) are called traffic equations or flow balance equations.

    Jackson has proved that the steady-state solution of these traffic equations with a single server at each node is
        P(n1, n2, …, nk) = ρ1^(n1)(1 − ρ1) · ρ2^(n2)(1 − ρ2) ⋯ ρk^(nk)(1 − ρk)        (2)
    where ρj = λj/μj, provided ρj < 1 for all j.
    Since the joint probability is equal to the product of the marginal probabilities, we can interpret that the network acts as if the queue at each node 'i' is an independent M/M/1 queue with rates λi and μi.
    Closed Jackson networks:
    A queueing network of 'k' nodes is called a closed Jackson network if new customers never enter into, and the existing customers never depart from, the system, viz. if ri = 0 and Pi0 = 0 for all 'i'. In other words, it is equivalent to a finite-source queueing system of 'N' customers who traverse continuously inside the network, where the service time at server 'i' is exponentially distributed with rate μi, i = 1, 2, …, k.
    When a customer completes service at Si, he then joins the queue at Sj, j = 1, 2, …, k, with probability Pij, where it is assumed that Σ_{j=1}^{k} Pij = 1 for all i = 1, 2, …, k. We note that the matrix P = [Pij] is similar to a one-step transition probability matrix of a Markov chain that is stochastic and irreducible.
    The flow balance equations of this model become
        λj = Σ_{i=1}^{k} λi Pij,  j = 1, 2, …, k                  (1)
    The matrix [Pij] is called the routing probability matrix in this context. Jackson has proved that the steady-state solution of equation (1) is
        P(n1, n2, …, nk) = C_N ρ1^(n1) ρ2^(n2) ⋯ ρk^(nk),
    where C_N^(−1) = Σ_{n1+n2+⋯+nk = N} ρ1^(n1) ρ2^(n2) ⋯ ρk^(nk) and ρj = λj/μj.



B.E./B.Tech. DEGREE EXAMINATION,
MAY/JUNE 2011
Fourth Semester

Computer Science and Engineering

Probabilty and Queuing Theory


(Common to Information Technology)
Time: Three hours Maximum: 100 marks
Answer All Questions

Part A – (10 × 2 = 20 marks)

1. The cumulative distribution function of the random variable X is given by
       F_X(x) = 0,          x < 0
              = x + 1/2,    0 ≤ x ≤ 1/2
              = 1,          x > 1/2
   Compute P(X > 1/4).
2. Let the random variable X denote the sum obtained in rolling a pair of fair dice. Determine the probability mass function of X.

3. Given the two regression lines 3x + 12y = 19 and 3y + 9x = 46, find the coefficient of correlation between X and Y.

4. State Central limit theorem.

5. Define (a) Continuous–time random process.


(b) Discrete state random process.

6. Find the transition probability matrix of the process represented by the state transition diagram.
   [State transition diagram (figure in the original): labelled states with transition probabilities 0.1, 0.2, 0.3, 0.4 and 0.5.]

7. Arrival rate of telephone calls at a telephone booth is according to Poisson


distribution with an average time of 9 minutes between two–consecutive
arrivals. The length of a telephone call is assumed to be exponentially
distributed with mean 3 minutes. Determine the prob. that a person arriv-
ing at the booth will have to wait.

8. Trains arrive at the yard every 15 minutes and the service time is 33
minutes. If the line capacity of the yard is limited to 4 trains find the
probability the yard is empty.

9. Given that the service time is Erlang with parameters m and μ, show that
   the Pollaczek–Khintchine formula reduces to

       Ls = mρ + m(1 + m)ρ² / [2(1 − mρ)],   where ρ = λ/μ.

10. Give any two examples for series queuing situations

PART B – (5 × 16 = 80 marks)
11. (a) (i) Find the moment-generating function of the binomial r.v with
parameters m and p and hence find its mean & variance.
(ii) Define Weibull distribution and write its mean and variance.

Or

(b) (i) Derive mean & variance of a geometric distribution. Also estab-
lish the forgetfulness property of the geometric distribution.
(ii) Suppose that telephone calls arriving at a particular switchboard
     follow a Poisson process with an average of 5 calls per minute. What
     is the probability that up to a minute will elapse until 2 calls have
     come in to the switchboard?


12. (a) Given the joint density function.


     f(x, y) = x(1 + 3y²)/4,  0 < x < 2, 0 < y < 1;   0, elsewhere.

     Find the marginal densities g(x), h(y) and the conditional density f(x/y),
     and evaluate P(1/4 < X < 1/2 / Y = 1/3).
(b) (i) Determine whether the random variables X and Y are
independent given their joint probability density fn. as
⎧ 2 xyy
⎪x + 0 ≤ x≤1 0≤ y≤2
( x, y ) = ⎨ 3
⎪ 0
⎩ 0.w

Or

(b) (ii) If X and Y are independent random variables having density functions
         f(x) = 2e^(−2x), x ≥ 0 (0 otherwise) and fY(y) = 3e^(−3y), y ≥ 0 (0 otherwise),
         respectively, find the density function of Z = X − Y.

13. (a) (i) Show that the random process {X(t)} = A cost + B sint −∞ < t
<∞ is a wide sense stationary process where A and B are inde-
pendent random variables each of which has a value −2 with
probability 1/3 and a value 1 with 2/3.
(ii) Derive probability distribution of Poisson process and hence
find its auto correlation fn.
Or

(b) (i) Find the limiting-state probabilities associated with the following
        transition probability matrix:

             ⎛ 0.4  0.5  0.1 ⎞
             ⎜ 0.3  0.3  0.4 ⎟
             ⎝ 0.3  0.2  0.5 ⎠

(ii) Show that the difference of two independent Poisson Processes


is not a Poisson Process.

14. (a) (i) Customers arrive at a one window drive in bank according to
Poisson Distribution with mean 10 per hour. Service time per


customer is exponential with mean 5 minutes. The space is front


of window, including that for the serviced car can accommodate a
maximum of three cars. Others cars can wait outside this space.
1. What is the probability that an arriving customer can drive
directly to the space in front of the window.
2. What is the probability that an arriving customer will have
to wait outside the indicated space?
3. How long is an arriving customer expected to wait before
served?

14. (a) (ii) Show that for the (M/M/1) : (∞/FCFS) model, the density of the
        waiting time in the system is f(w) = (μ − λ) e^(−(μ−λ)w), w > 0.
(b) Find the steady state solution for the multi server M/M/C model and
hence find Lq, Wq, Ns, and Ls by using Little’s formulas.

15. (a) Derive the expected steady state system size for the single serve
queues with Poisson input and General services.

Or

(b) Write short notes on


(i) Series Queues (ii) Open & closed network



Solutions
PART A

1. f(x) = d/dx FX(x) = 1 for 0 ≤ x ≤ 1/2, and 0 otherwise.

   ∴ P(X > 1/4) = 1 − FX(1/4) = 1 − 3/4 = 1/4

2. Let X denote the sum obtained a pair of fair dice.


Then the form f of x is

X 2 3 4 5 6 7 8 9 10 11 12
p(x) 1/36 2/36 3/36 4/36 5/36 6/36 5/36 4/36 3/36 2/36 1/36

3.  3X + 12Y = 19  ⇒  Y = −(1/4)X + 19/12    ∴ bYX = −1/4

    3Y + 9X = 46   ⇒  X = −(1/3)Y + 46/9     ∴ bXY = −1/3

    ∴ r = ± √(bXY · bYX) = ± √((1/4)(1/3)) = ± √(1/12)

    Since both bXY and bYX are negative,

    r = −√(1/12) = −1/(2√3) = −0.2887


4. If X1, X2, …, Xn are n independent, identically distributed random
   variables with mean μ and variance σ², then the sum Sn = X1 + X2 + … + Xn
   follows a normal distribution with mean nμ and variance nσ² as n → ∞.

5. (a) If ‘t’ is continuous and the random variable X is continuous, then the
random process X(t) is called a continuous random process.
(b) If ‘t’ is continuous and the random variable X is discrete then the
random process X(t) is called discrete random process.

6. Reading the arc probabilities off the state transition diagram, the
   transition probability matrix over the states 1, 2, 3 is

               ⎡ 0.4  0.5  0.1 ⎤
   TPM  P =    ⎢ 0.3  0.3  0.4 ⎥
               ⎣ 0.3  0.2  0.5 ⎦

7. λ = 1/9 per minute,  μ = 1/3 per minute.

   P(the system is busy) = λ/μ = (1/9)/(1/3) = 1/3
8. Since the yard has a maximum capacity of 4 trains, this is the M/M/1
   finite-capacity model (model III) with k = 4.

   Given λ = 1/15 per minute and μ = 1/33 per minute, so λ/μ = 33/15.

   P(yard is empty) = P0 = (1 − λ/μ) / (1 − (λ/μ)^(k+1))
                        = (1 − 33/15) / (1 − (33/15)^5) = 0.0237
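A short numeric check of the idle-probability formula used above (a sketch, not part of the original answer):

    # Finite-capacity M/M/1: P0 = (1 - rho)/(1 - rho^(k+1))
    lam, mu, k = 1 / 15, 1 / 33, 4          # arrival rate, service rate, capacity
    rho = lam / mu                          # = 33/15 = 2.2 (rho > 1 is allowed here)

    p0 = (1 - rho) / (1 - rho ** (k + 1))
    print(round(p0, 4))                     # -> 0.0237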

9. P-k formula

        Ls = λE(T) + λ²{Var(T) + (E(T))²} / [2(1 − λE(T))]

   Here E(T) = m/μ and Var(T) = m/μ²  (Erlang distribution with parameters m and μ).

        Ls = λm/μ + λ²(m/μ² + m²/μ²) / [2(1 − λm/μ)]

           = mρ + m(1 + m)ρ² / [2(1 − mρ)],   where ρ = λ/μ.
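As a sketch (with made-up values of λ, μ and m, not from the text), one can verify that this Erlang special case agrees with the general Pollaczek–Khintchine formula:

    lam, mu, m = 0.2, 1.5, 3                # example rates and number of Erlang phases
    rho = lam / mu

    ET, varT = m / mu, m / mu ** 2          # mean and variance of Erlang(m, mu) service
    ls_general = lam * ET + lam ** 2 * (varT + ET ** 2) / (2 * (1 - lam * ET))
    ls_erlang = m * rho + m * (1 + m) * rho ** 2 / (2 * (1 - m * rho))

    print(abs(ls_general - ls_erlang) < 1e-12)   # -> True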

10. 1. A student seeking admission in a university must visit a series of
       desks: certificate verification, accounts department, head of the
       department, etc.
    2. In a passport office an applicant must pass through many counters to
       complete the process.

PART B
11. (a) (i) Let X denote the discrete binomial random variable. Then its pmf is

             P(X = x) = mCx p^x q^(m−x),   x = 0, 1, 2, …, m.

        To find the MGF:

             MX(t) = E(e^tX) = Σx e^tx p(x)

                   = Σ (x = 0 to m) e^tx · mCx p^x q^(m−x)

                   = Σ (x = 0 to m) mCx (pe^t)^x q^(m−x)

                   = q^m + mC1 (pe^t) q^(m−1) + mC2 (pe^t)² q^(m−2) + … + (pe^t)^m

             MGF = MX(t) = (q + pe^t)^m
Mean:

Mean = E ( x ) = M x′ ( 0 )

             MX(t) = (q + pe^t)^m        (1)

        Differentiating (1) w.r.t. 't',

             MX′(t) = m(q + pe^t)^(m−1) · pe^t        (2)
Put t = 0 in eqn. (2)


             E(X) = MX′(0) = mp

        E(X²):

             E(X²) = MX″(0)

        Differentiating (2) w.r.t. 't',

             MX″(t) = mp[(q + pe^t)^(m−1) e^t + e^t (m − 1)(q + pe^t)^(m−2) pe^t]

        Put t = 0:

             E(X²) = MX″(0) = mp[1 + (m − 1)p] = mp[mp + q] = m²p² + mpq    (∵ 1 − p = q)
∴Var (x) = E(x2)−{E(x)}2

= m2 p 2 + mpq − ( mp )
2

= mpq
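As an optional cross-check (a sketch using sympy, not part of the original answer), the mean and variance can be recovered symbolically from the MGF (q + pe^t)^m:

    import sympy as sp

    t, m, p = sp.symbols('t m p', positive=True)
    q = 1 - p
    M = (q + p * sp.exp(t)) ** m                           # binomial MGF

    mean = sp.simplify(sp.diff(M, t).subs(t, 0))           # M'(0)  -> m*p
    second = sp.simplify(sp.diff(M, t, 2).subs(t, 0))      # M''(0)
    var = sp.simplify(second - mean ** 2)                  # -> m*p*(1 - p) = mpq

    print(mean, var)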

11. (a) (ii) If X denotes the continuous random variable, then the pdf of the
        Weibull distribution is

             f(x) = αβ x^(β−1) e^(−αx^β),  x > 0;   0, otherwise.

             Mean = (1/α)^(1/β) Γ(1/β + 1)

             Var(X) = (1/α)^(2/β) { Γ(2/β + 1) − [Γ(1/β + 1)]² }
11. (b) (i) P(X = x) = qx−1p x = 1,2,3,4…


and p + q = 1
Mgf:

             MX(t) = Σ (x = 1 to ∞) e^tx p(x)

                   = Σ (x = 1 to ∞) e^tx q^(x−1) p

                   = (p/q) Σ (x = 1 to ∞) (qe^t)^x

                   = (p/q) [qe^t + (qe^t)² + (qe^t)³ + …]

                   = (p/q) · qe^t [1 + qe^t + (qe^t)² + …]

                   = pe^t (1 − qe^t)^(−1)

                   = pe^t / (1 − qe^t) = p / (e^(−t) − q)

             MGF = MX(t) = p(e^(−t) − q)^(−1)
Mean:

             Mean = E(X) = MX′(0)

             MX(t) = p(e^(−t) − q)^(−1)        (1)

        Differentiating (1) w.r.t. 't',

             MX′(t) = (−1) p (e^(−t) − q)^(−2) (−e^(−t))

                    = p e^(−t) (e^(−t) − q)^(−2)        (2)

        Put t = 0:

             Mean = MX′(0) = p(1 − q)^(−2) = p · p^(−2) = 1/p

        E(X²):

             E(X²) = MX″(0)

        From (2), MX′(t) = p e^(−t)(e^(−t) − q)^(−2).  Differentiating w.r.t. 't',

             MX″(t) = p[ e^(−t)(−2)(e^(−t) − q)^(−3)(−e^(−t)) + (e^(−t) − q)^(−2)(−e^(−t)) ]

        Put t = 0:

             E(X²) = p[ 2(1 − q)^(−3) − (1 − q)^(−2) ]

                   = p( 2/p³ − 1/p² ) = 2/p² − 1/p

        ∴ Var(X) = E(X²) − {E(X)}²

                 = 2/p² − 1/p − (1/p)²

                 = 1/p² − 1/p = (1 − p)/p² = q/p²        (∵ p + q = 1)

        Forgetfulness (memoryless) property of the geometric distribution:

        If X is a discrete r.v. whose pmf follows the geometric distribution, then for
        any positive integers m, n,

             P[X > m + n / X > m] = P(X > n)

        Proof:
             P(X = x) = q^(x−1) p,   x = 1, 2, …

             P(X > k) = Σ (x = k+1 to ∞) q^(x−1) p

                      = p[q^k + q^(k+1) + q^(k+2) + …]

                      = p q^k (1 + q + q² + …)

                      = p q^k (1 − q)^(−1) = p q^k p^(−1) = q^k

             P(X > m + n / X > m) = P(X > m + n, X > m) / P(X > m)

                                  = P(X > m + n) / P(X > m) = q^(m+n) / q^m

                                  = q^n = P(X > n)


11. (b) (ii) X~ Poisson(l)


Here l = 5 per minute.

e−λ λ x
∴ P( X = x) = x = 0,1, 2…
x!
e −5 ( ) 2
P( X )= = 0.0842
2!

12. (a) Taking the joint density as f(x, y) = x(1 + 3y²)/4, 0 < x < 2, 0 < y < 1
        (and 0 elsewhere),

        g(x) = ∫ f(x, y) dy  (over −∞ to ∞)

             = ∫ (y = 0 to 1) [x(1 + 3y²)/4] dy

             = (x/4)(y + y³) from 0 to 1 = (x/4)(2) = x/2

        ∴ g(x) = x/2 for 0 ≤ x ≤ 2, and 0 otherwise.

        h(y) = ∫ f(x, y) dx  (over −∞ to ∞)

             = ∫ (x = 0 to 2) [x(1 + 3y²)/4] dx

             = [(1 + 3y²)/4] (x²/2) from 0 to 2 = (1 + 3y²)/2

        ∴ h(y) = (1 + 3y²)/2 for 0 < y < 1, and 0 otherwise.

        f(x/y) = f(x, y) / h(y) = [x(1 + 3y²)/4] / [(1 + 3y²)/2] = x/2

        ∴ f(x/y) = x/2,  0 < x < 2  (independent of y).


        ∴ P(1/4 < X < 1/2 / Y = 1/3) = ∫ (x = 1/4 to 1/2) f(x / y = 1/3) dx

                                     = ∫ (x = 1/4 to 1/2) (x/2) dx = (x²/4) from 1/4 to 1/2

                                     = (1/4)(1/4 − 1/16) = 3/64

12. (b) (i) f X ( x) = ∫
−∞
f ( x, y ) dyy

2
⎛ xy ⎞
= ∫⎝x +
2
⎟ dyy
3⎠
0

2
⎛ x ⎛ y2 ⎞ ⎞
= ⎜ x2 y + ⎜ ⎟ ⎟
⎝ 3 ⎝ 2 ⎠⎠
0

⎧ 2 2x
⎪2 x + 0 ≤ x≤1
f X ( x) = ⎨ 3

⎩ 0 0.w

f y ( y) = ∫
−∞
f ( x, y ) dxx

1
⎛ xy ⎞
= ∫⎝x +
2
⎟ dxx
3⎠
0

1
⎛ x3
x2 y ⎞
= +
⎝ 3 6 ⎟⎠
0

1 y
= +
3 6
⎧1
⎪ ( y) 0 ≤ y
y)
f y ( y) = ⎨ 6
⎪ 0
⎩ 0.w
X & Y are said to be independent


if fX(x).fY(y) = f(x, y)

⎛ 2 2x ⎞ ⎛ 1 ⎞
∴ f X ( x ) fY y ) = x + ⎟ ⎜ ( 2 + y )⎟
⎝ 3 ⎠ ⎝6 ⎠
xy
= f ( x, y )
≠ 2
+
3
∴ X & Y are not independent.

12. (b) (ii) The transformation two dimensional rvs,


∂( x , y )
guv ( u, v ) f xy ( x, y )
∂( u , v )
Here z = x − y Let w = y
x=z+y
x=z+w y=w
∂x ∂y
=1 =0
∂z ∂z
∂x ∂y
=1 =1
∂w ∂w
Since X & Y are independent random variables.
fXY(x, y) = fX(x)· fY(y)
fXY(x,y) = (2e−2x)(3e−3y)
⎧6 −( 2 +3 y )
, y ≥ 0.
=⎨
⎩ 0 0.w

∂( x , y )
∴ g zw ( z , w ) = f xy ( x y )
∂( z , w )

= 6e−(2x+3y) (1) x, y ≥ 0.
= 6e−(2z+2w+3w) z + w ≥ 0, w ≥ 0.

z , w )= ⎧6e −2 z e 5w
z + w ≥ 0, w ≥ 0
g zw ( ⎨
⎩ 0 0.w

To find

gz ( z) =
−∞
∫ f zw ( z , w )ddz


   Here the region of integration is w + z ≥ 0 and w ≥ 0, i.e. the part of the
   (z, w) plane above the line w + z = 0 with w ≥ 0: for z > 0 this is simply
   w ≥ 0, while for z < 0 it is w ≥ −z.
             gZ(z) = ∫ (w = −z to ∞) 6 e^(−2z) e^(−5w) dw,   z < 0

                   = ∫ (w = 0 to ∞) 6 e^(−2z) e^(−5w) dw,    z > 0

        i.e.,

             gZ(z) = 6e^(−2z) [e^(−5w)/(−5)] from −z to ∞ = (6/5) e^(3z),   z < 0

                   = 6e^(−2z) [e^(−5w)/(−5)] from 0 to ∞  = (6/5) e^(−2z),  z > 0

        ∴ gZ(z) = (6/5) e^(3z) for z < 0, and (6/5) e^(−2z) for z > 0.


13. (a) (i) Here A & B are random variables.

A −2 1 B −2 1
p(A) 1/3 2/3 p(B) 1/3 2/3

⎛ 1⎞ ⎛ 2⎞
E ( A) = ∑ Ap( A) = − 2 ⎜ ⎟ + 1 ⎜ ⎟ = 0
⎝ 3⎠ ⎝ 3 ⎠

⎛ 1⎞ ⎛ 1⎞
E ( A2 ) = ∑ A2 p( A) = 4 ⎜ ⎟ + 1 ⎜ ⎟ = 2
⎝ 3⎠ ⎝ 3⎠
Similorly E(B) = 0 & E(B2) = 2
(i) E(X(t)) = E(A cost + B sint)
= cost E(A) + sint E(B) = 0 a constant.
(ii) Rxx(t,t + t ) = E(X(t) × (t + T))
= E(A cost + B sin t) (A cos (t + t)
+ B sin (t + t))
= E(A2 cost cos (t + t) + AB cost sin (t + t) –B
sint cos (t + t)+ B2sint sin(t + t))
= cost(t + t) E(A2) + [cost sin (t + t) + sint cost
(t + t)]E(AB)+ sint sin (t + t) E(B2)
Since A & B are independent.
E(AB) = E(A)⋅ E(B) = 0.
& E(A2) = 2, E(B2) = 2
∴Rxx(t, t + t) = 2 (cos(t + t)cost + sin(t + t) sin t)
= 2 cos(t) which is fn. of T only
∴{x(t)} is WSS process

13. (a) (ii) Poisson Process: If x(t) represents the no. of occurrences of a
certain event in (0,t) then the discrete random process {x(t)} is
called the Poisson process, provided the following postulates are
satisfied.
(i) P(1 occurrence in (t, t + Δt) = lΔt + 0(Δt)
(ii) P (0 occurrence in (t, t + Δt)) = 1−lΔt + 0(Δt)
(iii) P(2 or more occurrence in t, t + Δt) = 0 (Δt)
(iv) X(t) is independent of the no of occurrences of the event in
the any interval prior and after the interval (0,t).
Probability law for Poisson Process X(t)
Let l be the no of occurrences of the event in unit time.
Let pn(t) = p(x(t) = n).
Consider,
pn(t + Δt) = p(x(t + Δt) = n)


= p((n − 1)occurrence in (0,t) &1 occurrences in (t, t + Δt))


+ p(n occurrence in (0,t) & 0 occurrence in (t, t + Δt)
pn(t + Δt) = pn−1 (t) (lΔt + 0 (Δt)) + pn(t) (1− lΔt + 0(Δt))
omitting 0 (Δt), ÷ Δt, take Δt→0, we get
             lim (Δt → 0) [pn(t + Δt) − pn(t)] / Δt = λ pn−1(t) − λ pn(t)

             ⇒ pn′(t) + λ pn(t) = λ pn−1(t)

dp (t )
⇒ + λ pn (t ) = λ pn −1 (t )
dt
The general solution is given by
pn(t) elt = ∫lpn−1(t) elt dt (1)
If n = 1
p1(t) elt = ∫lp0(t) elt dt (2)
To find p0(t):
p0(t + Δt) = p (0 occurrence in (0,t) & 0 occurrence in t, t
+ Δt)
⇒p0(t + Δt) = p0(t) (1 − lΔt + 0 (Δt))
Omitting 0 (Δt), and take Δt →0,
We get, ′
0 (t ) = − λ p0 (t )
dp (t )
⇒ = − λ p0 ( t )
dt
dp (t )
⇒ = − λ dt
p0 (t )

             Integrating,  log p0(t) = −λt + C
             ⇒ p0(t) = Ae^(−λt), where A = e^C.
             But p0(0) = 1 ⇒ A = 1
             ∴ p0(t) = e^(−λt)
Sub in (2), we get,
p1(t)eλt = ∫lelt e−lt dt
= lt
e − λt λ t
∴ p1 ( t ) =
1!
Sub n = 2 in eqn. (1),
p2(t) elt = ∫ lp1(t) elt dt
= l2∫ t e lt e−lt dt


λ 2t 2
=
2
e − λt (λ t )2
∴ p2 ( t ) =
2!
Proceeding n = 2, 3,… in this way
we get,

e − λt ( t )n
pn (t ) = n = 0,1, 2,…
n!
Auto correlation of Poisson Process:
Rxx(t1, t2) = E(x(t1) × (t2))
= E[x(t1) (x(t2) − x(t1)) + x2(t1)]
= E[x(t1) [x(t2) − x(t1)] + E(x2(t1)]
                         = E[x(t1)] · E[x(t2) − x(t1)] + λ²t1² + λt1

                         = (λt1)(λ(t2 − t1)) + λ²t1² + λt1

                         = λ²t1t2 − λ²t1² + λ²t1² + λt1

                         = λ²t1t2 + λt1 = λ²t1t2 + λ min(t1, t2),  for t1 < t2

13. (b) (i) In limiting case, πP = π


and p 1 + p 2 + p 3 = 1 (A)
                            ⎛ 0.4  0.5  0.1 ⎞
   ∴ (π1, π2, π3)           ⎜ 0.3  0.3  0.4 ⎟  = (π1, π2, π3)
                            ⎝ 0.3  0.2  0.5 ⎠
0.4 p1 + 0.3p2 + 0.3p3 = p1
0.5 p1 + 0.3p2 + 0.2p3 = p2
0.1 p1 + 0.4p2 + 0.5p3 = p3
⇒ − 0.6 p1 + 0.3p2 + 0.3p3 = 0 (1)
0.5 p1 – 0.7p2 + 0.2p3 = 0 (2)
0.1p1 + 0.4p2 – 0.5 p3 = 0 (3)
   (1) × 0.2 − (2) × 0.3 ⇒ −0.27 π1 + 0.27 π2 = 0
⇒p1 = p2
(2) × 0.1 − (3) × 0.5 ⇒ −0.27 p2 + 0.27p3 = 0
⇒ p3 = p2
Sub in (A)
p2 + p2 + p2 = 1⇒p2 = 1/3
⇒ p1 = 1/3, p3 = 1/3.
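A quick numerical confirmation of this limiting distribution (a sketch, assuming numpy is available):

    import numpy as np

    P = np.array([[0.4, 0.5, 0.1],
                  [0.3, 0.3, 0.4],
                  [0.3, 0.2, 0.5]])

    # Solve pi (P - I) = 0 together with sum(pi) = 1 as a least-squares system
    A = np.vstack([(P.T - np.eye(3)), np.ones(3)])
    b = np.array([0.0, 0.0, 0.0, 1.0])
    pi = np.linalg.lstsq(A, b, rcond=None)[0]
    print(pi)          # -> [0.3333 0.3333 0.3333]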


13. (b) (ii) Let X1 (t) & X2(t) be two independent of Poisson process with
parameter l1 & l2 respectively.
Let X(t) = X1(t) − X2(t)
E(X(t) = E(X1(t) – X2(t)) = E(X1(t) – E(X2(t))
= l1t − l2t
= (l1 − l2) t
        E(X²(t)) = E[(X1(t) − X2(t))²]
                 = E(X1²(t)) + E(X2²(t)) − 2E(X1(t))E(X2(t))   (since X1(t), X2(t) are independent)
                 = (λ1²t² + λ1t) + (λ2²t² + λ2t) − 2(λ1t)(λ2t)
                 = (λ1 − λ2)²t² + (λ1 + λ2)t ≠ (λ1 − λ2)²t² + (λ1 − λ2)t
        ∴ X(t) is not a Poisson process.

14. (a) (i) This is of model I, (M/M/I:∞ /FIFO)


        Here λ = 10 per hour and μ = 1/5 per minute = 12 per hour.

        p0 = 1 − λ/μ = 1 − 10/12 = 2/12 = 1/6, and pn = (λ/μ)^n p0, so

        p1 = (10/12) p0,  p2 = (10/12)² p0, …

        (i)  P(an arriving customer can drive directly to the space in front of
             the window) = p0 + p1 + p2 = (1/6)[1 + 10/12 + (10/12)²] = 0.42

        (ii) P(an arriving customer will have to wait outside the indicated space)
             = P(N ≥ 3) = (λ/μ)³ = (10/12)³ = 0.58

        (iii) Wq = Ws − 1/μ, where Ws = 1/(μ − λ) = 1/(12 − 10) = 1/2 hour

              Wq = 1/2 − 1/12 = 5/12 hour = 25 minutes.
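The three measures can also be computed directly from the M/M/1 formulas; the following is a sketch of that calculation (not part of the original solution):

    # M/M/1 drive-in bank example: lambda = 10 per hour, mu = 12 per hour
    lam, mu = 10.0, 12.0
    rho = lam / mu

    p = lambda n: (1 - rho) * rho ** n          # P(N = n) for the infinite-capacity M/M/1 queue

    print(sum(p(n) for n in range(3)))          # P(N <= 2): car fits in the 3-car space  -> ~0.42
    print(rho ** 3)                             # P(N >= 3): must wait outside            -> ~0.58
    print(lam / (mu * (mu - lam)) * 60)         # Wq in minutes                           -> 25.0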


14. (a) (ii) The MGF of the exponential distribution with rate μ is (1 − t/μ)^(−1), and
        hence the sum of (n + 1) independent exponential(μ) variables has MGF
        (1 − t/μ)^(−(n+1)), which is the Erlang distribution with parameters μ and (n + 1).

        ∴ f(w) = Σ (n = 0 to ∞) f(w / n) · pn

               = Σ (n = 0 to ∞) [μ^(n+1)/n!] e^(−μw) w^n (λ/μ)^n (1 − λ/μ)

               = μ e^(−μw) (1 − λ/μ) Σ (n = 0 to ∞) (λw)^n / n!

               = (μ − λ) e^(−μw) e^(λw)

               = (μ − λ) e^(−(μ−λ)w)

14. (b) In this model λn = λ for all n, and

             μn = nμ for 0 ≤ n < c,   μn = cμ for n ≥ c.

        ∴ pn = (1/n!) (λ/μ)^n p0              if n = 0, 1, 2, …, c − 1

             = (λ/μ)^n / (c! c^(n−c)) · p0     if n = c, c + 1, …

        and  p0 = [ Σ (n = 0 to c−1) (1/n!)(λ/μ)^n + (λ/μ)^c / ( c! (1 − λ/(μc)) ) ]^(−1)

        Average length of the queue:

             Lq = E(Nq) = Σ (n = c to ∞) (n − c) pn

        Put n − c = x, so n = x + c:

             Lq = Σ (x = 0 to ∞) x p(x+c)

                = Σ (x = 0 to ∞) x · (1/(c! c^x)) (λ/μ)^(x+c) p0

                = [(λ/μ)^c p0 / c!] Σ (x = 0 to ∞) x (λ/(μc))^x

                = [(λ/μ)^c p0 / c!] (λ/(μc)) [1 + 2(λ/(μc)) + 3(λ/(μc))² + …]

                = [(λ/μ)^c p0 / c!] (λ/(μc)) (1 − λ/(μc))^(−2)

             ∴ Lq = [1/(c · c!)] · (λ/μ)^(c+1) / (1 − λ/(μc))² · p0
        By Little's formulas:

             Ls = Lq + λ/μ,    Wq = Lq / λ,    Ws = Wq + 1/μ.
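A compact computational sketch of these M/M/C results (the example rates in the call below are assumed values, not from the text):

    from math import factorial

    def mmc_measures(lam, mu, c):
        """Sketch of the M/M/c steady-state measures derived above."""
        r = lam / mu                      # offered load lambda/mu
        rho = r / c                       # server utilisation, must be < 1
        p0 = 1.0 / (sum(r ** n / factorial(n) for n in range(c))
                    + r ** c / (factorial(c) * (1 - rho)))
        lq = (r ** (c + 1) / (c * factorial(c))) * p0 / (1 - rho) ** 2
        ls = lq + r                       # Little's formulas
        wq = lq / lam
        ws = wq + 1 / mu
        return p0, lq, ls, wq, ws

    # Example (made-up rates): 3 servers, lambda = 15/hr, mu = 6/hr
    print(mmc_measures(15, 6, 3))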
15 (a) Let arrivals follow Poisson process with rate of arrival l. Service
times are independently and identically distributed random vari-
ables with an general distribution with pdf f(t) and the service time T
between two departures.
Let N(t) be the no. of customers in the system at time t 0. Let
tn be the time instant at which nth customer completes service and
departs.
Let X(t) = N(tn) , n =1,2,3… then Xn represent the no. of customers
in the system when the nth customer departs and sequence of random
variables. { xn: n = 1,2,3…} is Markov chain.
⎧Xn A if X n ≥ 1
X n +1 = ⎨
⎩ A if X n = 0


Where A is the no. of customers arriving the service time T of the


(n + 1)th customer.
If U is the unit step fn, then
⎧1 if X n > 0
U(Xn) = ⎨
⎩0 if X n = 0
∴Xn+1 = Xn − U(Xn) + A (1)
In steady-state, E(Xn+1) = E(Xn)
(1) ⇒ E(Xn+1) = E(Xn) − E(U(Xn) + E(A)
E(Xn) = E(Xn) – E(U(Xn)) + E(A)
⇒ E(U(Xn)) = E(A) (2)
        Squaring (1):

        X²(n+1) = X²n + U²(Xn) + A² − 2XnU(Xn) − 2AU(Xn) + 2AXn

        But U²(Xn) = U(Xn) and XnU(Xn) = Xn, so

        X²(n+1) = X²n + U(Xn) + A² − 2Xn − 2AU(Xn) + 2AXn

        Taking expectations, using E(X²(n+1)) = E(X²n) in the steady state, the
        independence of A and U(Xn), and (2):

        0 = E(A) + E(A²) − 2E(Xn) − 2E²(A) + 2E(A)E(Xn)

        ⇒ 2E(Xn)[1 − E(A)] = E(A) + E(A²) − 2E²(A)

        ⇒ E(Xn) = [E(A) + E(A²) − 2E²(A)] / [2(1 − E(A))]        (3)

        But E(A) = E(E(A/T)) = E(λT) = λE(T), and
            E(A²) = E(E(A²/T)) = E((λT)² + λT) = λ²E(T²) + λE(T).

        Substituting in (3),

        E(Xn) = [λ²E(T²) + 2λE(T) − 2λ²E²(T)] / [2(1 − λE(T))]

        Using E(T²) = Var(T) + (E(T))², this gives

        Ls = E(Xn) = λE(T) + λ²[Var(T) + (E(T))²] / [2(1 − λE(T))]
This is called P-k-formula.
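As a sketch (not part of the original derivation), the P-K formula can be evaluated for particular service-time distributions; the rates below are example values:

    def pk_Ls(lam, ET, varT):
        """Ls = lam*E(T) + lam^2*(Var(T)+E(T)^2) / (2*(1 - lam*E(T)))."""
        rho = lam * ET
        return rho + lam ** 2 * (varT + ET ** 2) / (2 * (1 - rho))

    print(pk_Ls(lam=0.5, ET=1.0, varT=0.0))   # M/D/1 (constant service): 0.75
    print(pk_Ls(lam=0.5, ET=1.0, varT=1.0))   # exponential service, agrees with M/M/1: 1.0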


15. (b) (i) Series Queue


A series queue consists of a series of service stations through
which an entering customer gets services in a sequence before
leaving the system. In a series queue a customer will not revisit
the station.
            (E.g., the situations given in the answer to Question 10 of Part A.)
            A series queue in which the service facilities are arranged in se-
            quence and the flow is always in a single direction from facility
            to facility is called a tandem queue.
(ii) Open and closed Queue Networks
A network of queues can be described as a group of m nodes
where each node i represents a service facility with Ci servers,
i = 1,2,… m. Customers can arrive from outside the system at
any node and can leave from any node. In a queuing network, a
customer is allowed to visit a node more than once.
There are two types of queuing network
(i) Open networks
(ii) Closed networks
Open networks:-
A queuing network is said to be a open network if customers
may enter from outside at any node, circulate among the nodes
for service and leave the system at any node.
Closed networks:
A queuing network is called a closed network if no customer
may enter the system from outside and no customer may leave
the system so that there is always a fixed no of customers in the
network.



B.E./B.Tech. DEGREE EXAMINATION,
NOV/DEC 2010
Fourth Semester
Computer Science and Engineering
PROBABILITY AND QUEUEING THEORY
(Common to Information Technology)
(Regulation 2008)
Time: Three hours Maximum: 100 marks
Answer ALL questions
PART A (10 ë 2 = 20 marks)
1. If a random variable X has the distribution function

        F(x) = 1 − e^(−ax) for x > 0;   0 for x ≤ 0,

   where a is the parameter, then find P(1 ≤ X ≤ 2).

2. Every week the average number of wrong-number phone calls received by


a certain mail order house is seven. What is the probability that they will
receive two wrong calls tomorrow?

3. If there is no linear correlation between two random variables X and Y,
   then what can you say about the regression lines?

4. Let the joint pdf of the random variable (X, Y) be given by
   f(x, y) = 4xy e^(−(x² + y²)); x > 0 and y > 0. Are X and Y independent?
   Why or why not?

5. Examine whether the Poisson process {x(t)} is stationary or not.

6. When is Markov chain, called homogeneous?

7. Arrivals at a telephone booth are considered to be Poisson with an average


time of 12 mins between one arrival and the next. The length of a phone


call is assumed to be distributed exponentially with mean 4 mins. Find


the average number of persons waiting in the system.

8. Draw the state transition rate diagram of an M/M/C queueing model.


9. What do you mean by bottleneck of a network?
10. Consider a service facility with two sequential stations with respective
service rates of 3/min and 4/min. The arrival rate is 2/min. What is the
average service time of the system, if the system could be approximated
by a two stage Tandom queue?

PART B (5 ë 16 = 80 marks)
11. (a) (i) The distribution function of a random variable X is given
by F(X) = 1 − (1 + x)e−x ; x ≥ 0. Find the density function,
mean and variance of X. (8)
(ii) A coin is tossed until the first head occurs. Assuming that
the tosses are independent and the probability of a head
occurring is ‘p’. Find the value of ‘p’ so that the probability
that an odd number of tosses required is equal to 0.6. Can
you find a value of ‘p’ so that the probability is 0.5 that
an odd number of tosses are required? (8)
or
(b) (i) If X is a random variable with a continuous distribution
function F(X), prove that Y = F(X) has a uniform
distribution in (0,1). Further if
             f(x) = (1/2)(x − 1),  1 ≤ x ≤ 3;   0, otherwise,

   find the range of Y corresponding to the range 1.1 ≤ x ≤ 2.9. (8)
(b) (ii) The time (in hours) required to repair a machine is
1
exponentially distributed with parameter l = .
2
What is the probability that the repair time exceeds 2h?
What is the conditional probability that a repair takes at
least 10h given that its duration exceeds 9h? (8)

12. (a) (i) Given f(x, y) = cx(x − y), 0 < x < 2, −x < y < x and ‘0’
elsewhere. Evaluate ‘c’ and find fx(x) and fy(y). (8)


(ii) Compute the coefficient of correlation between X and Y


using the following data: (8)
X: 1 3 5 7 8 10
Y: 8 12 15 17 18 20
or
(b) (i) For two random variables X and Y with the same mean,
the two regression equations are y = ax + b and x = cy + d.
Find the common mean, ratio of the standard deviations
b 1− a
and also show that = . (8)
d 1− c
(ii) If X1, X2, ..., Xn are Poisson variates with parameter l = 2,
use the central limit theorem to estimate P(120 ≤ Sn ≤
160), where Sn = X1, + X2 + ... + Xn and n = 75. (8)

13. (a) (i) If customers arrive at a counter in accordance with a


Poisson process with a mean rate of 2/min, find the
probability that the interval between 2 consecutive
arrivals is more than 1 min. between 1 and 2 mins. and 4
mins. or less. (8)
(ii) An engineer analyzing a series of digital signals generated
by a testing system observes that only 1 out of 15 highly
distorted signals follow a highly distorted signal, with no
recognizable signal between, whereas 20 out of 23
recognizable signals follow recognizable signals, with no
highly distorted signal between. Given that only highly
distorted signals are not recognizable, find the fraction of
signals that are highly distroted. (8)
or
(b) (i) A fair coin is tossed 10 times. Find the probability of
getting 3 or 4 or 5 heads using central limit theorem. (6)
(b) (ii) If the joint probability density function of X and Y is
             f(x, y) = e^(−(x + y)) for x > 0, y > 0;   0, elsewhere.

        Find the probability density function of Z = X / (X + Y). (10)
14. (a) If people arrive to purchase cinema tickets at the average rate
of 6 per minute, it takes an average of 7.5 seconds to purchase
a ticket. If a person arrives 2 min before the picture starts and


it takes exactly 1.5 min to reach the correct seat after


purchasing the ticket,
(i) Can he expect to be seated for the start of the picture?
(ii) What is the probability that he will be seated for the start
of the picture?
(iii) How early must be arrive in order to be 99% sure of
being seated for the start of the picture? (16)
or
14. (b) There are 3 typists in an office. Each types can type an average
of 6 letters per hour. If letters arrive for being typed at the rate
of 15 letters per hour.
(i) What fraction of the time all the typists will be busy?
(ii) What is the average number of letters waiting to be typed?
(iii) What is the average time a letter has to spend for waiting
and for being typed?

15. (a) Derive Pollaczek-Khinchin formula of M/G/l queue. (16)


or
(b) Write short notes on the following:
(i) Queue networks (4)
(ii) Series queues (4)
(iii) Open networks (4)
(iv) Closed networks. (4)



Solutions
PART A
1. P(1 ≤ X ≤ 2) = F(2) − F(1)
                = (1 − e^(−2a)) − (1 − e^(−a))
                = e^(−a) − e^(−2a).
2. X follows poisson distribution with mean l = 1.
e−λ λ x
W.K.T P ( X = x ) =
x!
e −1 (1) 2 1
∴ P ( X = 2) = =
2! 2e
3. If there is no linear correlation between X and Y i.e., γ XY = 0.
The equations of the regression lines becomes y = y and x = x .
Angle between two lines are 90°.

∫ xye − x .e
2
y2
4. f x dy
0

= 2 xe − x ;
2
> 0.
− y2
IIIly f Y ( y ) 2 ye ; y > 0.
x2 y2
= 4 xye −(( x
2
y2 )
Now f X ( x ). f Y ( y ) xe .2 ye
= f XY ( x, y ).
Hence X and Y are independent.
5. Let P{X(t) = x} = e^(−λt)(λt)^x / x!,  x = 0, 1, 2, …
   ∴ E{X(t)} = λt, which is not a constant.
   ∴ The Poisson process is not stationary.
6. If the transition probability does not depend on the step then the Markov
chain is called a homogeneous Markov chain.
1 1
7. Given l = ; m=
12 4
The average number of persons waiting in the system


λ
E( N ) =
μ−λ
⎛ 1⎞
⎜⎝ ⎟⎠
12
= = 0.5.
(1 / 4) − (1 / 12)
8. [State transition rate diagram of the M/M/C model: states 0, 1, 2, …, n, …;
   every forward arc carries the arrival rate λ, and the backward arcs carry the
   service rates μ, 2μ, 3μ, …, capped at Cμ once all C servers are busy.]

9. The bottleneck of a network is the node with maximum traffic intensity.


10. ρ1 = λ/μ1 = 2/3,  ρ2 = λ/μ2 = 2/4 = 1/2.

    Ls = ρ1/(1 − ρ1) + ρ2/(1 − ρ2) = (2/3)/(1/3) + (1/2)/(1/2) = 2 + 1 = 3,

    so the average time a customer spends in the two-stage tandem system is
    Ws = Ls/λ = 3/2 = 1.5 min.

PART B

11. (a) (i) By the property of F(x), the pdf f(x) is given by f(x) = F (x) at
points of continuity of F(x).
The given pdf is continuous for x ≥ 0.
∴ f ( x ) = (1 + x ) e − x − e − x = xe − x , x ≥ 0

E ( X ) = ∫ x 2 e − x dx = 2
0

E ( X 2 ) = ∫ x 3 e − x dx = 6
0

V ( X ) = E ( X 2 ) − [ E( X ) ] = 2.
2

11. (a) (ii) Let X denote the number of tosses required to get the first head
(success). Then X follows a geometric distribution given by
P( X ) = Pq r −1 ; r = 1, 2, …,


        ∴ P(X = an odd number) = P(X = 1 or 3 or 5, …)

              = Σ (r = 1 to ∞) p q^(2r−2) = p(1 + q² + q⁴ + …)

              = p / (1 − q²) = 1 / (1 + q)        (since p + q = 1)

        Now 1/(1 + q) = 0.6, i.e. 1/(2 − p) = 0.6
              ⇒ 0.6(2 − p) = 1 ⇒ 0.6p = 0.2 ⇒ p = 1/3.

        And 1/(1 + q) = 0.5, i.e. 1/(2 − p) = 0.5
              ⇒ 2 − p = 2 ⇒ p = 0.

        Though we get p = 0, it is meaningless, because then

              P(X = an odd number) = Σ (r = 1 to ∞) p q^(2r−2) = 0  when p = 0.

        Hence such a value of p cannot be found.
11. (b) (i) The distribution function of Y is given by
G y ( y ) = p(Y ≤ y)
= p { F(X)
X y}
=p X ≤F{ 1
( y )}
[The inverse exists, as F(x) is non-decreasing and continuous]
= F [ F −1 ( y )][∴ p { ≤ } = F ( x)]
= y.
Therefore, the density function of Y is given by
d
gy ( ⎡G y ( y ) ⎤⎦ .
ddy ⎣
Also the range of Y is 0 ≤ y ≤ 1. Since the range of F(x) is
(0,1). Therefore, Y follows a uniform distribution in (0,1).


        When f(x) = (1/2)(x − 1), 1 ≤ x ≤ 3 (and 0 otherwise),

             F(x) = ∫ (from 1 to x) (1/2)(x − 1) dx = (x − 1)²/4.

        Since Y = F(X),  Y = (X − 1)²/4.

        ∴ When 1.1 ≤ X ≤ 2.9,  (1.1 − 1)²/4 ≤ Y ≤ (2.9 − 1)²/4,

        (i.e.,) the required range of Y is 0.0025 ≤ Y ≤ 0.9025.
11. (b) (ii) f(x) = λe^(−λx) = (1/2) e^(−x/2),  x > 0

        a)  P(X > 2) = ∫ (from 2 to ∞) (1/2) e^(−x/2) dx

                     = (−e^(−x/2)) from 2 to ∞ = e^(−1) = 0.3679.

        b)  P(X ≥ 10 / X > 9) = P(X > 1)   (by the memoryless property)

                     = ∫ (from 1 to ∞) (1/2) e^(−x/2) dx

                     = (−e^(−x/2)) from 1 to ∞ = e^(−0.5) = 0.6065.
12. (a) (i)
        [Figure: the triangle OAB bounded by the lines y = x, y = −x and x = 2.]
Here the range space is the area within the triangle OAB(shown
in figure), defined by 0 < x < 2 and − x < y < x.


(a) By the property of jpdf

∫∫
Δ OAB
cx( x y )dx dy = 1

2 x

∫ ∫ cx( x
0 −x
y )dy dx = 1

i.e., 8C = 1
C = l/8.
x
1
(b) f X ( x ) = ∫ 8 x( x
−x
y )ddy

x3
= , i 0 < x < 2.
4
        (c) fY(y) = ∫ (x = −y to 2) (1/8) x(x − y) dx    in −2 ≤ y ≤ 0

                  = ∫ (x = y to 2) (1/8) x(x − y) dx     in 0 ≤ y ≤ 2.

        i.e.,  fY(y) = 1/3 − y/4 + (5/48) y³   in −2 ≤ y ≤ 0
                     = 1/3 − y/4 + (1/48) y³   in 0 ≤ y ≤ 2.

12. (a) (ii)


xi yi x2i y2i xi yi
1 8 1 64 8
3 12 9 144 36
5 15 25 225 75
7 17 49 289 119
8 18 64 324 144
10 20 100 400 200
34 90 248 1446 582
Thus n = 6

∑x i 34 ∑ y = 90 i

∑x 2
i 248 ∑ y = 1446 2
i


∑x y i i = 582
             rXY = [nΣxy − Σx·Σy] / √[(nΣx² − (Σx)²)(nΣy² − (Σy)²)]

                 = (6 × 582 − 34 × 90) / √[(6 × 248 − 34²)(6 × 1446 − 90²)]

                 = 432 / √(332 × 576) = 0.9879.
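A one-line numerical check of this correlation coefficient (a sketch using numpy):

    import numpy as np

    x = np.array([1, 3, 5, 7, 8, 10])
    y = np.array([8, 12, 15, 17, 18, 20])
    print(round(np.corrcoef(x, y)[0, 1], 4))    # -> 0.9879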

12. (b) (i) If m is the common mean, the point (m, m) less on y = ax + b and
x = cy + d [θ they intersect at x y ]( )
∴ μ = μ+b ...(1)
μ = μ+d ...(2)
b
From(1), μ =
1− a
d
From(2), μ =
1− c
b d
=
1 a 1− c
b 1− a
∴ = .
d 1− c
        Now σ²Y / σ²X = bYX / bXY = a/c,   ∴ σY / σX = √(a/c).

12. (b) (ii) E(Xi) = l = 2 and Var (Xi) = l = 2


By CLT, Sn follows N(nm, s n )
i.e., Sn follows N(150, 150 ).

        ∴ P(120 ≤ Sn ≤ 160) = P( −30/√150 ≤ (Sn − 150)/√150 ≤ 10/√150 )

                            = P(−2.45 ≤ Z ≤ 0.82),  where Z is the standard normal variable

                            = 0.4927 + 0.2939   (from the normal tables)

                            = 0.7866.


13. (a) (i) Property of Poisson processes. The interval T between 2


consecutive arrivals follows an exponential distribution with
parameter l = 2.

        i)   P(T > 1) = ∫ (from 1 to ∞) 2e^(−2t) dt = e^(−2) = 0.135

        ii)  P(1 < T < 2) = ∫ (from 1 to 2) 2e^(−2t) dt = e^(−2) − e^(−4) = 0.117

        iii) P(T ≤ 4) = ∫ (from 0 to 4) 2e^(−2t) dt = 1 − e^(−8) = 0.999.

13. (a) (ii) The state space of the Markov chain is


{recognizable, highly distorted}.

⎧1 ; if the n th signal generated is highly distorted


Xn = ⎨ th
⎩0; if the n signa
s l generated is recognizable n 1

State space {0, 1} is a Morkov chain.


The transition probability matrix (TPM) is

                 ⎛ 20/23   3/23 ⎞
             P = ⎜               ⎟ .
                 ⎝ 14/15   1/15 ⎠

Let the steady state distribution be p = [p0, p1]. Then


p = pP ,
i.e.,(p0, p1) = (p0, p1)P
and p0 + p1 = 1 ...(1)
                                  ⎛ 20/23   3/23 ⎞
             (π0, π1) = (π0, π1)  ⎜               ⎟
                                  ⎝ 14/15   1/15 ⎠

             ⇒ π0 = (20/23)π0 + (14/15)π1        ...(2)

                π1 = (3/23)π0 + (1/15)π1         ...(3)

             From (3), (14/15)π1 = (3/23)π0


             ⇒ π0 = (322/45) π1
Sub in (1)
367 p1 = 45 ⇒ p1 = 0.123
∴ p0 = 0.877
ie., p0 = 87.7%
and p1 = 12.3%
where p0 = fraction of signals that are recognizable
p1 = fraction of signals that are highly distorted.
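A short numerical check of these steady-state fractions (a sketch, assuming numpy):

    import numpy as np

    P = np.array([[20/23, 3/23],      # recognizable -> (recognizable, distorted)
                  [14/15, 1/15]])     # distorted    -> (recognizable, distorted)

    A = np.vstack([(P.T - np.eye(2)), np.ones(2)])
    pi = np.linalg.lstsq(A, np.array([0.0, 0.0, 1.0]), rcond=None)[0]
    print(pi.round(3))                # -> [0.877 0.123]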
13. (b) (i) Refer Nov/Dec 2009 13 (b) (i)
13. (b) (ii) Refer Nov/Dec 2009 13 (b) (ii)
14. (a) l = 6/minute; m = 8/minute.
1
a) E ( ) =
μ λ
1 1
= = min .
8−6 2
∴ E(total time required to purchase the ticket and to reach the seat)
1 1
= + 1 = 2 min .
2 2
Hence he can just be seated for the start of the picture.
        b)  P(he will be seated for the start) = P(total time < 2 min) = P(ω < 1/2)

            = 1 − P(ω > 1/2) = 1 − e^(−(μ−λ) × 1/2)

            = 1 − e^(−1) = 0.63
        c)  We need P(ω < t) = 0.99, i.e. P(ω > t) = 0.01:

            e^(−(μ−λ)t) = 0.01

            −2t = ln(0.01) = −4.6   ⇒   t = 2.3 min.

            (i.e.) P(ticket purchasing time < 2.3 min) = 0.99

        ∴ P[total time to get the ticket and to go to the seat < (2.3 + 1.5)] = 0.99.

        Therefore the person must arrive at least 3.8 min early so as to be
        99% sure of seeing the start of the picture.
14. (b) l = 15/hour; m = 6/hour; S = 3.
Hence this is a problem in multiple server model. i.e., model II.
a) P(all the typists are busy)
= P(N ≥ 3 )
              = (λ/μ)³ · P0 / [ 3! (1 − λ/(3μ)) ]

              = (2.5)³ · P0 / [ 6 × (1 − 2.5/3) ]                          …(1)

        Now  P0 = 1 / [ Σ (n = 0 to S−1) (1/n!)(λ/μ)^n + (λ/μ)^S / ( S! (1 − λ/(μS)) ) ]

                = 1 / [ {1 + 2.5 + (1/2)(2.5)²} + (2.5)³ / ( 6 × (1 − 2.5/3) ) ]

                = 1 / 22.25 = 0.0449                                       …(2)
Using (2) in (1), we have P(N ≥ 3) = 0.7016.
Hence the fraction of the time all the typists will be busy = 0.7016.
        b)  E(Nq) = [1/(S · S!)] · (λ/μ)^(S+1) · P0 / (1 − λ/(μS))²

                  = [1/(3 × 6)] × (2.5)⁴ × 0.0449 / (1 − 2.5/3)²

                  = 3.5078.
        c)  E(W) = (1/λ) E(N)


                 = (1/λ) { E(Nq) + λ/μ }

                 = (1/15)(3.5078 + 2.5)

                 = 0.4005 h or 24 min, nearly.

        Also, for the M/M/S model,

        P(W > t) = e^(−μt) [ 1 + ( (λ/μ)^S P0 / ( S! (1 − λ/(Sμ)) ) ) ·
                                 ( 1 − e^(−μt(S − 1 − λ/μ)) ) / ( S − 1 − λ/μ ) ]

        P(W > 1/3) = e^(−2) [ 1 + 0.7016 (1 − e) / (−0.5) ]

                   = 0.4616
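A numeric cross-check of the M/M/3 results above (a sketch; the small differences from the hand values come from rounding P0 to 0.0449):

    from math import factorial

    lam, mu, S = 15.0, 6.0, 3                 # typist example: lambda = 15/hr, mu = 6/hr, 3 servers
    r, rho = lam / mu, lam / (S * mu)

    p0 = 1 / (sum(r ** n / factorial(n) for n in range(S)) + r ** S / (factorial(S) * (1 - rho)))
    p_busy = r ** S * p0 / (factorial(S) * (1 - rho))           # P(N >= 3)
    lq = r ** (S + 1) * p0 / (S * factorial(S) * (1 - rho) ** 2)
    w = (lq + r) / lam                                          # E(W) via Little's formula

    print(round(p_busy, 3), round(lq, 3), round(w, 4))          # -> ~0.702, ~3.51, ~0.401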

15. (a) refer the Nov/Dec 2009 – 15(b)


15. (b) (i) Queueing Networks
Queueing network is a part of organized systems. Network of ser-
vice facilities where customers receive service at some or all of the
facilities.
A network of queues is a group k nodes where each node rep-
resents a service facility with Ci servers at node i (i = 1, 2,.., k).
Customers may enter the system at one node, and after completion
of service at one node may move to another node for further service
and may leave the system from some other node. They may return
to previously visited nodes, skip some nodes and may stay in the
system forever.
(ii) Series Queue
These are special types of open network in which there are a se-
ries of service facilities which each customer should visit (in the
given order) before leaving the system. The nodes form a series
system with flow always in a single direction from node to node.
Customer enter from outside only at node 1 and depart only from
node k.


There are two types of series queues:


(i) Series queues with blocking,
(ii) Series queues with infinite capacity.
(iii) Open Networks
An open Jackson network is a system of k service stations where
station i (i = 1, 2, k) has the following characteristics:
 An infinite queue capacity.
 Customer arrive at station i from outside the system according to
a Poisson process with parameter ai.
 Ci servers at station i with an exponential service time distribution
with parameter ui.
 Customers completing service at station i next go to station (j =
1,2, ..., k) with probability Pij.

(iv) Closed Network


A queueing system in which new customers never enter and existing
ones never depart is called a closed system or a closed network.
In the open network of
 ai = 0 for all i (i.e., no customer may enter the system from
outside).
             Pi0 = 0 for all i (i.e., no customer may leave the system),

            then we have a closed Jackson network.



B.E./B.Tech. DEGREE EXAMINATION,
APRIL/MAY 2010
Fourth Semester

Computer Science and Engineering


PROBABILITY AND QUEUEING THEORY
(Common to Information Technology)
(Regulation 2008)
Time: Three hours Maximum: 100 marks
Answer ALL questions
PART A (10 ë 2 = 20 marks)

1. Obtain the mean for a Geometric random variable.

2. What is meant by memoryless property? Which continuous


distribution follows this property?

3. Give a real life example each for positive correlation and negative
correlation.

4. State central limit theorem for independent and identically distributed


(iid) random variables.

5. Is a Poisson process a continuous time Markov chain? Justify your


answer.

6. Consider the Markov chain consisting of the three states 0, 1, 2 and


                                       ⎛ 1/2  1/2   0  ⎞
   transition probability matrix  P =  ⎜ 1/2  1/4  1/4 ⎟ .  Is it irreducible? Justify.
                                       ⎝  0   1/3  2/3 ⎠


7. Support that customers arrive at a Poisson rate of one per every 12


minutes and that the service time is exponential at a rate of one service
per 8 minutes. What is the average number of customers in the system?

8. Define M/M/2 queueing model. Why the notation M is used?

9. Distinguish between open and closed networks.

10. M/G/1 queueing system is Markovian. Comment on this statement.

PART B (5 ë 16 = 80 marks)

11. (a) (i) By calculating the moment generating function of Poisson


distribution with parameter l, prove that the mean and variance
of the Poisson distribution are equal.
⎧Ce −2 x , 0 < x < ∞,
(ii) If the density function of X equals f ( x ) = ⎨
find C. What is P[X > 2]? ⎩ 0, x<0

or
(b) (i) Describe the situations in which geometric distributions could be
used. Obtain its moment generating function.

(ii) A coin having probability p of coming up heads is successively


flipped until the rth head appears. Argue that X, the number of
        flips required will be n, n ≥ r, with probability

             P[X = n] = C(n−1, r−1) p^r q^(n−r),   n ≥ r.

12. (a) (i) Suppose that X and Y are independent non-negative continuous
random variables having densities fX(x) and fY(y) respectively.
Compute P[X < Y].

(ii) The joint density of X and Y is given by


             f(x, y) = (1/2) y e^(−xy),  0 < x < ∞, 0 < y < 2;   0, otherwise.

        Calculate the conditional density of X given Y = 1.
or


(b) (i) If the correlation coefficient is 0, then can we conclude that they
are independent? Justify your answer, through an example. What
about the converse?

(ii) Let X and Y be independent random variables both uniformly


distributed on (0, 1). Calculate the probability density of X + Y.
13. (a) (i) Let the Markov Chain consisting of the states 0, 1, 2, 3 have the
                                        ⎛ 0  0  1/2  1/2 ⎞
   transition probability matrix  P =   ⎜ 1  0   0    0  ⎟ .
                                        ⎜ 0  1   0    0  ⎟
                                        ⎝ 0  1   0    0  ⎠
Determine which states are transient and which are recurrent by
defining transient and recurrent states.
(ii) Suppose that whether or not it rains today depends on previous
weather conditions through the last two days. Show how this
system may be analyzed by using a Markov chain. How many
states are needed?
or
(b) (i) Derive Chapman-Kolmogorov equations.

(ii) Three out of every four trucks on the road are followed by a
car, while only one out of every five cars is followed by a truck.
What fraction of vehicles on the road are trucks?

14. (a) Define birth and death process. Obtain its steady state probabilities.
How it could be used to find the steady state solution for the M/M/1
model? Why is it called geometric?
or

(b) Calculate any four measures of effectiveness of M/M/1 queueing


model.

15. (a) Derive Pollaczek-Khintchine formula.


or

(b) Explain how queueing theory could be used to study computer


networks.



Solutions
Part A
1. If X follows the geometric distribution, P(X = x) = pq^(x−1), x = 1, 2, …

   Mean = E(X) = Σ (x = 1 to ∞) x pq^(x−1)

        = p[1 + 2q + 3q² + …]

        = p[1 − q]^(−2) = p · p^(−2) = 1/p

   (or) If P(X = x) = pq^x, x = 0, 1, 2, …, then E(X) = q/p.
2 (i) If X is a discrete/ continuous random variable, then for
any two +ve integers m, n with m > n.
[ > m + n/ X > m] = P[X
p[X [ > n]
In other words, future value depends on present not on past is called
memoryless property.
(ii) In continuous distributions, Exponential distribution follows this prop-
erty.

3. Positive correlation: “ If the demand of commodity increases then the


price will increase accordingly” is an example.
Negative correlation: “If the availability increases for the items, demand
will be decreased”, is an example.

4. Let X1, X2,….Xn be a sequence of independent and identically distributed


random variables each having mean m and variance s 2, then Sn follows
normal distribution as n→∞

5. YES, POISSON PROCESS is a continuous time Markov chain. As the


Poisson process follows memory less property, it is continuous time
Markov chain.
[X(t 3 ) = n3 / X(
i.e., By considering the value of P[X X t2) = n2; X(
X t1) = n2],
we can easily prove that
P[ X (t3 ) n3 / X (t2 n2 )]


⎡1 / 2 1 / 2 0 ⎤
6. Given P = ⎢1 / 2 1 / 4 1 / 4 ⎥
⎢ ⎥
⎢ 0 1 / 3 2 / 3⎥
⎣ ⎦
⎡1 / 2 1 / 2 0 ⎤ ⎡1 / 2 1 / 2 0 ⎤
⎢ ⎥⎢ ⎥
P = ⎢1 / 2 1 / 4 1 / 4 ⎥ ⎢1 / 2 1 / 4 1 / 4 ⎥
2

⎢ 0 1 / 3 2 / 3⎥ ⎢ 0 1 / 3 2 / 3⎥
⎣ ⎦⎣ ⎦
⎡2 / 4 3 / 8 1/ 8 ⎤
⎢ ⎥
= ⎢ 3 / 8 19 / 48 11 / 48 ⎥
⎢ 1 / 6 11 / 36 19 / 36 ⎥
⎣ ⎦
   As pij(n) > 0 for all i, j for some 'n' (here n = 2), the chain is irreducible.

1
7. Given λ = per minute
12
1
μ = per minute
8
λ 1 / 12
Average number of customers in the system = Ls = = =2
μ−λ 1 1

8 12

8. M stands for Markovian, i,e., Arrival follows Poisson & service time
follows exponential.

9.
Open network Closed network
1. Arrivals from 1. New customers
outside to the cannot enter in
node is allowed. to the system.
2. Once a customer 2. Existing cus-
gets the service tomers can-
completed at not leave the
node i, he joins system.
the queue at node
j with probabil-
ity pij or leaves
the system with
probability pio


10. M/G/1 is a non –Markovian queueing models, as the service time follows
general distribution.

PART –B
11. (a) (i) If X ~ P(λ),  P(X = x) = e^(−λ) λ^x / x!,  x = 0, 1, 2, …

        M.G.F.:  MX(t) = Σ (x = 0 to ∞) e^tx p(x)

                       = Σ (x = 0 to ∞) e^tx e^(−λ) λ^x / x!

                       = e^(−λ) Σ (x = 0 to ∞) (λe^t)^x / x!
                       = e^(−λ) {1 + λe^t/1! + (λe^t)²/2! + …}

                       = e^(−λ) e^(λe^t) = e^(λ(e^t − 1))

        Mean = E(X) = [d/dt MX(t)] at t = 0 = [λe^t e^(λ(e^t − 1))] at t = 0 = λ

        E(X²) = MX″(0) = [d/dt ( λe^t e^(λ(e^t − 1)) )] at t = 0

              = [λ e^(λ(e^t − 1)) e^t + λe^t · λe^t e^(λ(e^t − 1))] at t = 0 = λ + λ²

        Var(X) = E(X²) − [E(X)]² = λ + λ² − λ² = λ
(ii)
⎧ce −2 x , 0 < x < ∞
f(x) = ⎨
⎩ 0 , x<0

Since f(x) is a pdf, ∫ −∞

f ( x )dx = 1.

i.e., ∫ 0
Ce −2 x dx = 1.

⎧ e −2 x ⎫
⇒C⎨ ⎬ =1
⎩ −2 ⎭ 0


⎛ −C ⎞
⇒⎜ [ −1] = 1
⎝ 2 ⎟⎠
⇒ (C = ) ∴ f ( x ) = 2e −2 x
        P[X > 2] = ∫ (from 2 to ∞) f(x) dx = ∫ (from 2 to ∞) 2e^(−2x) dx

                 = 2 [ e^(−2x)/(−2) ] from 2 to ∞ = e^(−4) = 0.0183.

11. (b) (i) Geometric distribution could be used in the situations in which
probability of number of trials required to get first success.
If X N geometric distribution, P(X = x) = pq x −1 , x = 1,2..

M X (t ) = ∑e
x =1
tx
p( x )


= ∑e
x =1
tx
pq x −1

              = (p/q) Σ (x = 1 to ∞) (qe^t)^x

              = (p/q) {qe^t + (qe^t)² + …}

              = (p/q) · qe^t {1 − qe^t}^(−1)

              = pe^t / (1 − qe^t)

        ∴ MX(t) = pe^t / (1 − qe^t)
(ii) Given, A coin having probability of coming up heads is succes-
sively flipped until rth head appears.
P[X = n] = n−1Cr−1 pr qn−r, n ≥ r is a negative binomial distribu-
tion.
Let X = number of heads while tossing a coin in n trials.
As n trials are independent, getting head in each trial is
of probability ‘p’.
C x p x q n − x , x = 0,1, 2.....
∴ X N B( , p) ⇒ P ( X = x ) = nC
Where q = 1−p.
It is clear that, is the first n−1 flips (trials), we must have r−1


        heads, and in the nth flip we get the rth head.
        Since the trials are independent, by the multiplication (compound
        probability) theorem,

             P[X = n] = [ (n−1)C(r−1) p^(r−1) q^(n−r) ] · p

        i.e., P[X = n] = (n−1)C(r−1) p^r q^(n−r),   n ≥ r.

12. (a) (i) Given X & Y are independent & f (x) = P(X = x) = fX (x) & f (y) =
P(Y = y) = fY (y)
Since X & Y are independent, f(x,y) = f(x). f(y)

∴P ( X < Y = ∫∫ f (x , y )dxdy
x< y

∞y

= ∫ ∫ f x ) f y ) dx dy
0 0

        (ii) f(x, y) = (1/2) y e^(−xy),  0 < x < ∞, 0 < y < 2;  0 otherwise.

             fY(y) = ∫ (x = 0 to ∞) (1/2) y e^(−xy) dx

                   = (y/2) [e^(−xy)/(−y)] from 0 to ∞ = 1/2,   0 < y < 2.

             ∴ fY(1) = 1/2, and f(x, y = 1) = (1/2) e^(−x).

             f(x / y = 1) = f(x, y = 1) / fY(1) = (1/2)e^(−x) / (1/2) = e^(−x),  x > 0.


Cov( x, y )
12. (b) (i) We know that correlation coefficient rxy = =0
σ xσ y
Suppose x & y are independent, E(X Y) = E (X). E(y) &
Cov (X,Y) = E (XY) − E (X). E (Y) = 0
Thus, rxy = 0
Hence two independent r.v’s are uncorrelated
        But the converse is not true.


Suppose X N N (0,1) & Y N N (0,1) with Y = X2,


Then E (X) = 0, E (Y) = 0 & Ε (XY) = Ε (XX 2) = E (X 3) = 0
Since μ3′ = E (X3) is zero for standard normal variables,
Cov (X,Y) = E(XY) − Ε(X). E(Y) = 0
=>rXY = 0 => X & Y are uncorrelated
But Y =X2,X&Y are dependent random variables.

12. (b) (ii) Since X and Y are independent and uniform on (0, 1),

             fX(x) = 1, 0 < x < 1;  fY(y) = 1, 0 < y < 1;  f(x, y) = 1 on the unit square.

        Let U = X + Y, V = Y, so that x = u − v, y = v and the Jacobian is

             |J| = |∂(x, y)/∂(u, v)| = 1.

        ∴ f(u, v) = f(x, y)|J| = 1 for 0 < u − v < 1, 0 < v < 1.

        Integrating out v,

             fU(u) = ∫ (v = 0 to u) dv = u,            0 ≤ u ≤ 1
                   = ∫ (v = u−1 to 1) dv = 2 − u,      1 ≤ u ≤ 2,

        which is the triangular density of X + Y.

0 1 2 3
0 ⎛0 0 1 / 2 1 / 2⎞
13. (a) (i) Let P = 1 ⎜ 1 0 0 0 ⎟
⎜ ⎟
2 ⎜0 1 0 0 ⎟
⎜ ⎟
3 ⎝0 1 0 0 ⎠

The state diagram is given by


   [State transition diagram over the states 0, 1, 2, 3 corresponding to the matrix P.]

   The state i is said to be transient if return to state i is uncertain, i.e., if

        Fii = Σ (n = 1 to ∞) fii(n) < 1,

   and recurrent if Fii = 1.

1 1
∴F
F f 00( ) + f ( )
f 00( ) = 0 + 0 + .1.1 + + .1.1
2 2
=1

1 1
& 11 = f111(1) + f11(2) + f111(3) = 0 + 0 + .1.1 + 1. .1
2 2
=1

2 = f 22 + f 222(2) + f 22(3) + f 222(9) + f 222(12) + …


(1)
& F22

1 1 1
= 0 0 + 1. 1. . + 1. 1. . 1. 1.
2 2 2
1 1 1
= + + +…
2 4 8
1/ 2
= =1
1
1−
2
Also ⇒ F222 = 1

F333 = 1 (easily prove )


This implies that all states are recurrent & also it is easy
to prove that, since the chain is finite, it is recurrent

13. (a) (ii) Let the states of the chain be,


State 0 → If it is rained both today & yesterday
State 1 → If it is rained today but not yesterday
State 2 → If it is yesterday but not today
State 3 → If it did not rain either yesterday or today.
Then tpm would be,


0 1 2 3
0 ⎛ P1 0 1 P1 0 ⎞
1 ⎜ P2 0 1 P2 0 ⎟
P= ⎜ ⎟
2⎜0 3 0 1 P3 ⎟
⎜ ⎟
3⎝0 4 0 1 P4 ⎠

Where , P1 is the probability that if it has rained for the past two
days then it will rain tomorrow.
Similarly P2 is the probability that if it has rained today but not
yesterday.
Similarly P3 is the probability that if it rained yesterday but not
today then it will rain tomorrow.
& P4 is the probability that if it has not rained in the past two
days, it will rain tomorrow.
13. (b) (i) statement : If P is the tpm of a homogeneous Markov chain,
then the n-step tpm P(n)is equal to Pn
i.e., Pij(n)=[Pij]n
It is clear that
Proof: p j ( ) p{xp{x j / x0 i}, as the chain is homogeneous.
The state ‘j’ can be reached from the state i in two steps through
some intermediate state k.
Now,

        Pij(2) = P{X2 = j / X0 = i}
               = Σk P{X2 = j, X1 = k / X0 = i}
               = Σk P{X2 = j / X1 = k, X0 = i} · P{X1 = k / X0 = i}
               = Σk pik(1) pkj(1)
               = Σk pik pkj

        Since the transition from state i to state j in two steps can take place
        through any one of the intermediate states k = 1, 2, 3, …, and these routes
        are mutually exclusive,

             Pij(2) = Σk pik pkj ,

        i.e., the ij-th element of the 2-step tpm is the ij-th element of the
        product of two one-step tpm's, so P(2) = P².

        Now,

             Pij(3) = P{X3 = j / X0 = i}
                    = Σk P{X3 = j / X2 = k} · P{X2 = k / X0 = i}
                    = Σk pkj pik(2)

        Similarly, Pij(n) = Σk pik(n−1) pkj .

        i.e., P(3) = P(2) · P = P² · P = P³, and proceeding in this way,

             P(n) = P^n.
13. (b) (ii) Taking the states in the order (Truck, Car), the tpm of the Markov chain is

                          T      C
                   T  ⎡  1/4    3/4 ⎤
             P =      ⎢              ⎥
                   C  ⎣  1/5    4/5 ⎦

        since three out of every four trucks are followed by a car, and one out of
        every five cars is followed by a truck. The fraction of vehicles on the road
        that are trucks is given by πP = π with π1 + π2 = 1, i.e.,

             (1/4)π1 + (1/5)π2 = π1        (1)

             (3/4)π1 + (4/5)π2 = π2        (2)

        From (1), (1/5)π2 = (3/4)π1  ⇒  π2 = (15/4)π1.

        Since π1 + π2 = 1,  π1 + (15/4)π1 = 1  ⇒  π1(19/4) = 1

        ⇒ π1 = 4/19 and π2 = 15/19,

        so 4/19 of the vehicles on the road are trucks.


14. (a) Birth and Death process


If x(t) represents the number of individuals present at time t in a
population in which two types of events occur namely one represent-
ing birth which contributes to its increase and the other represent-
ing death which contribute to its decrease,then the discrete random
process {x(t)} is called the birth & death process provided they are
governed by the following postulates:
If x(t) = n(n > 0)
(i) P [1 birth in ( , t t )] λ n (t ).Δt 0( t )

(ii) P[0 birth in ( t , t t )] λ n (t ).Δ


).Δt + 0( t )
(iii) P [2 or more births in ( t , t t )] ( Δt )
(iv) Births occurring in ( , t + Δt ) are independent of time since
last births
)] = μ n ( ) Δt 0( )
(v) P[1 deaths in ( , t + Δt)]

)] 1 μ n ( ) Δt + 0(
(vi) P [0 death in ( , t + Δt)] )
( vii) P [2 or more deaths in ( , t + Δt)]
)] 0( )
(viii) Death occurring in ( , t + Δt ) are independent of time since
last death.
(ix) The birth and death occur independently of each other at any
time.

14. (b) Let P n (t ) = P[X(t) = n] be the probability that the size of the popula-
tion is ‘n’ at time . ‘t’
Consider, Pn ( , t + Δt ) = P(no birth (or) death in ( , t + Δt ) ) + P (1
birth and no death in ( , t + Δt ) ) + P (no
birth and one death in ( , t + Δt ) ) + P (1
birth and 1 death in ( , t + Δt ) )

        Pn(t + Δt) = Pn(t)(1 − λnΔt)(1 − μnΔt)
                     + Pn−1(t) λn−1Δt (1 − μn−1Δt)
                     + Pn+1(t)(1 − λn+1Δt)(μn+1Δt) + Pn(t)(λnΔt)(μnΔt)

        lim (Δt → 0) [Pn(t + Δt) − Pn(t)]/Δt = λn−1 Pn−1(t) − (λn + μn)Pn(t) + μn+1 Pn+1(t)

        i.e., Pn′(t) = λn−1 Pn−1(t) − (λn + μn)Pn(t) + μn+1 Pn+1(t),  n > 0        (1)

        ∴ P0′(t) = −λ0 P0(t) + μ1 P1(t)        (2)

        In the steady state, Pn(t) is independent of time 't', so Pn′(t) = 0.

        Also P0′(t) = 0.

        ∴ (1) and (2) give

             λn−1 Pn−1 − (λn + μn)Pn + μn+1 Pn+1 = 0        (3)

             −λ0 P0 + μ1 P1 = 0                              (4)

        From (4),  P1 = (λ0/μ1) P0.

        When n = 1, (3) gives  λ0 P0 − (λ1 + μ1)P1 + μ2 P2 = 0

             ⇒ P2 = (λ0 λ1)/(μ1 μ2) P0.

        In the same way we get P3 = (λ0 λ1 λ2)/(μ1 μ2 μ3) P0, etc.

        Thus  Pn = (λ0 λ1 … λn−1)/(μ1 μ2 … μn) P0 .

        Since Σ (n = 0 to ∞) Pn = 1,

             P0 [ 1 + Σ (n = 1 to ∞) (λ0 λ1 … λn−1)/(μ1 μ2 … μn) ] = 1

        ⇒   P0 = 1 / [ 1 + Σ (n = 1 to ∞) (λ0 λ1 … λn−1)/(μ1 μ2 … μn) ].

        For the single-server (M/M/1) model, λn = λ and μn = μ for all n, so

             Pn = (λ/μ)^n P0 ,   and

             P0 = 1 / [ 1 + Σ (n = 1 to ∞) (λ/μ)^n ]

                = 1 / [ Σ (n = 0 to ∞) (λ/μ)^n ] = 1 − λ/μ

        ∴ Pn = (λ/μ)^n (1 − λ/μ),  n = 0, 1, 2, …,

        which is a geometric distribution (hence the steady-state distribution of
        this model is called geometric).

15. (a) Let N and N′ be the number of customers in the system at time t and
t + T,when two consecutive customers have just left the system after
getting service.
T Random service time which is a continuous random variable
f(t) → pdf of T
E(T) → Mean of T & Var(T) → Variance of T
Let M be the number of customers arriving in the system during the
service time ‘T’

             N′ = M            if N = 0
                = N − 1 + M    if N > 0

        where M takes the discrete values 0, 1, 2, …

        ∴ N′ = N − 1 + M + δ,  where δ = 1 if N = 0 and δ = 0 if N > 0.        (1)

∴ E ( N ′ ) = E ( N ) − 1 + E ( M ) + E (δ ) (2)
When the system is in steady state, the probability of number of
customers in the system will be constant.
∴ E(N) = E(N′)
2
E( ) = E(N′2)
        Squaring both sides of (1), we get

        N′² = N² + (M − 1)² + δ² + 2N(M − 1) + 2(M − 1)δ + 2Nδ

        Now δ² = δ (since δ takes only the values 0 and 1), and

        Nδ = 0 (δ = 1 only when N = 0).

        Substituting these,


        N′² = N² + (M − 1)² + δ + 2N(M − 1) + 2(M − 1)δ

        Taking expectations, using E(N′²) = E(N²), E(δ) = 1 − E(M), and the
        independence of M from N and δ:

        2E(N)[1 − E(M)] = E(M²) − 2E(M) + 1 + [2E(M) − 1] E(δ)

                        = E(M²) − 2E²(M) + E(M)

        ∴ E(N) = [E(M²) − 2E²(M) + E(M)] / [2(1 − E(M))]        (6)
Since number of arrivals M in time T follows a Poisson process with
        parameter λ, we have

             E(M) = λE(T)  and  E(M²) = λ²[Var(T) + E²(T)] + λE(T).

        Substituting in (6), and using E(T²) = Var(T) + (E(T))²,

             Ls = E(N) = λE(T) + λ²[Var(T) + (E(T))²] / [2(1 − λE(T))],

        which is the Pollaczek-Khintchine formula.
(b)
Queue networks can be regarded as a group of ‘k’ inter-connected
nodes, where each node represents a service facility of some kind
with Si servers at node i (Si ≥ 1)
(i) Consider a two server system in which customers arrive at Pois-
son rate λ at server1. After being served by server1 they then
join the queue in front of server2. Suppose there is infinite wait-
ing space at both servers. Each server servers one customer at
a time with server i taking an exponential time with rate μi for a
service,i = 1,2… such a system is called a tandem system.
(ii) Consider a system of K servers. Customers arrive from outside
the system to server i, i = 1, 2,….k .in accordance with indepen-
dent Poisson processes at rate r1; they then join the queue at i
until their turn at service comes. Once a customer is served by
server i, he then joins the queue in front of server j, j = 1,… k
with prob.Pij
(iii) Jackson’s open network concept can be extended when the
nodes are multi server nodes. In this case the network behaves
as if each node is an independent M/M/S model.



B.E./B.Tech. DEGREE EXAMINATION,
NOV/DEC 2009
Fourth Semester
Computer Science and Engineering
PROBABILITY AND QUEUEING THEORY
(Common to Information Technology)
(Regulation 2004)
Time: Three hours Maximum: 100 marks
Answer ALL questions
PART A (10 ë 2 = 20 marks)
1. A lot of semiconductor chips contains 20 that are defective. Two are select-
ed, at random, without replacement from the lot what is the probability that
the second one selected is defective given that the first one was defective?

2. If the range of X is the set {0, 1, 2, 3, 4} and P[X = x] = 0.2, determine the
mean and variance of the random variable.

3. The probability of a successful optical alignment in the assembly of an opti-


cal data storage product is 0.8. Assume the trials are independent, what is the
probability that the first successful alignment requires exactly four trials?

4. Suppose X has an exponential distribution with mean equal to 10.


Determine the value of x such that P[X < x] = 0.95.

5. Determine the value of c such that the function f(x, y) = c x y, for 0 <
   x < 3 and 0 < y < 3, satisfies the properties of a joint probability density
   function.

6. Define covariance and correlation between the random variables X and Y.

7. Consider the random process {X(t), X(t) = cos(t + f) where f is uniform in


(− p/2, p/2). Check whether the process is stationary.

8. The one-step transition probability matrix of a Markov chain with states


⎛ 0 1⎞
(0, 1) is given by P = ⎜ ⎟ . Is it irreducible Markov chain?
⎝ 1 0⎠


9. Define effective arrival rate with respect to an (M|M|1):(GD /N/∞) queue-


ing model.

10. Write Pollaczek-Khintchine formula for the case when service time dis-
tribution is Erlang distribution with K phases.

PART B (5 ë 16 = 80 marks)
11. (a) Customers are used to evaluate preliminary product designs.
In the past, 95% of highly successful products received good
reviews, 60% of moderately successful products received
good reviews and 10% of poor products received good
reviews. In addition, 40% of products have been highly
successful, 35% have been moderately successful and 25%
have been poor products.
(i) What is the probability that a product attains a good review? (6)
(ii) If a new design attains a good review, what is the
probability that it will be a highly successful product? (5)
(iii) If a product does not attain a good review, what is the
probability that it will be a highly successful product? (5)
or
(b) (i) Obtain the moment generating function of the random
variable X having probability density function

⎧ x, 0 ≤ x ≤ 1

f ( x ) = ⎨ 2 − x , 1 ≤ x ≤ 2. (8)
⎪ 0, otherwise

(b) (ii) A fair coin is tossed three times. Let X be the number
of tails appearing. Find the probability distribution of X.
And also calculate E(X). (8)

12. (a) (i) Derive the mean and variance of a Binomial random
variable with parameters n and p. (10)
(ii) Suppose that X is a negative binomial random variable
with p = 0.2 and r = 4. Determine the mean of X. (6)
or
(b) (i) The time between process problems in a manufacturing
line is exponentially distributed with a mean of 30 days.
What is the expected time until the fourth problem? (4)


(ii) Find the moment generating function of a Normal random


variable with parameters m and s and hence obtain its
mean and standard deviation. (12)

13. (a) Determine the value of C that makes the function F(x, y) =
C(x + y) a joint probability density function over the range 0
< x < 3 and x < y < x + 2. Also determine the following.
(i) P (X < 1, Y < 2) (8)
(ii) P(Y > 2) (4)
(iii) E[X] (4)
or
(b) (i) A fair coin is tossed 10 times. Find the probability of
getting 3 or 4 or 5 heads using central limit theorem. (6)

(b) (ii) If the joint probability density function of X and Y is
         f(x, y) = e^(−(x + y)) for x > 0, y > 0; 0 elsewhere,
         find the probability density function of Z = X/(X + Y).

14. (a) Show that the random process X(t) = A sin(w t + q ) is wide-
sense stationary process where A and w are constants and q is
uniformly distributed in (0, 2p ). (16)
or
(b) Define Poisson process and obtain the probability distribution
for that. Also find the auto correlation function for the process.
(16)

15. (a) (i) For the (M | M | 1) : (GD | ∞ | ∞), derive the expression for Lq. (6)
(ii) Patients arrive at a clinic according to Poisson
distribution at a rate of 30 patients per hour. The waiting
room does not accommodate more than 14 patients.
Examination time per patient is exponential with mean
rate of 20 per hour.
(1) What is the probability that an arriving patient does
not have to wait?


(2) What is the expected waiting time until a patient is


discharged from the clinic?
or

(b) Derive the Pollaczek-Khintchine formula for the M|G|1


queueing model.



Solutions
Part A
1. Two chips are drawn in succession without replacement.
   Let A = {first chip is defective} and B = {second chip is defective}.
   P(B/A) = P(A ∩ B)/P(A), so P(A ∩ B) = P(A)·P(B/A)
          = (1/20)(1/19) = 1/380 = 0.0026
2. X:        0    1    2    3    4
   P(X = x): 0.2  0.2  0.2  0.2  0.2

   Mean = E(X) = (0 + 1 + 2 + 3 + 4)(0.2) = 2
   E(X²) = Σ x² p(x) = (0 + 1 + 4 + 9 + 16)(0.2) = 6
   Var(X) = E(X²) − [E(X)]² = 6 − 4 = 2
3. Let X be the number of trials needed for the first successful optical alignment.
   p = 0.8 (given), q = 1 − p = 0.2
   X follows a geometric distribution: P(X = x) = q^(x−1)·p, x = 1, 2, …
   ∴ P(X = 4) = (0.2)³(0.8) = 0.0064

4. Given X is exponential with mean 10, so λ = 1/10 = 0.1 and
   F(x) = P(X ≤ x) = 1 − e^(−0.1x), x ≥ 0.
   We need x such that P(X < x) = 0.95:
   1 − e^(−0.1x) = 0.95  ⇒  e^(−0.1x) = 0.05
   ⇒ 0.1x = −ln(0.05) = 2.996  ⇒  x = 29.96 ≈ 30

5. Since f(x, y) is a joint pdf, ∫∫ f(x, y) dx dy = 1
   i.e., c ∫₀³ ∫₀³ xy dx dy = 1
   ⇒ c [x²/2]₀³ [y²/2]₀³ = 1
   ⇒ c (9/2)(9/2) = 1  ⇒  81c/4 = 1  ⇒  c = 4/81
6. Cov(X, Y) = E(XY) − E(X)E(Y)
   Correlation coefficient of X and Y: r(X, Y) = Cov(X, Y)/(σx σy)
7. Given X(t) = cos(t + φ), where φ is uniform over (−π/2, π/2),
   so f(φ) = 1/π, −π/2 ≤ φ ≤ π/2.
   E[X(t)] = E[cos(t + φ)] = (1/π) ∫ from −π/2 to π/2 of cos(t + φ) dφ
           = (1/π)[sin(t + φ)] from −π/2 to π/2
           = (1/π)[sin(t + π/2) − sin(t − π/2)] = (2 cos t)/π
   Since E[X(t)] depends on t, the process is not stationary.
8. P = ⎛0  1⎞  with state space {0, 1}.
       ⎝1  0⎠
   Starting from 0, the chain goes 0 → 1 → 0, i.e., it returns to 0 after 2 transitions,
   so state 0 has period 2; in the same way state 1 has period 2. Each state is
   reachable from the other, so the chain is irreducible, finite and periodic with
   period 2.

9. Effective arrival rate: λ′ = μ(1 − P₀),
   where P₀ = (1 − λ/μ) / (1 − (λ/μ)^(N+1))
10. If the service time is Erlang with parameters k and μ, then E(T) = k/μ and
    Var(T) = k/μ². With ρ = λ/μ, the Pollaczek–Khintchine formula reduces to
    Ls = kρ + k(k + 1)ρ² / (2(1 − kρ))
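As a quick numerical illustration of this reduction (not part of the original answer), the general P–K value Ls = λE(T) + λ²[Var(T) + E(T)²]/(2(1 − λE(T))) can be evaluated for an Erlang-k service time and compared with the closed form above; the rates used below are arbitrary assumed values.

```python
# Hypothetical check of the P-K formula for Erlang-k service (assumed rates).
def pk_ls(lam, ET, VarT):
    """General Pollaczek-Khintchine mean number in an M/G/1 system."""
    rho = lam * ET
    return rho + lam**2 * (VarT + ET**2) / (2.0 * (1.0 - rho))

def pk_ls_erlang(lam, k, mu):
    """Reduced form Ls = k*rho + k(k+1)*rho^2 / (2(1 - k*rho)), rho = lam/mu."""
    rho = lam / mu
    return k * rho + k * (k + 1) * rho**2 / (2.0 * (1.0 - k * rho))

lam, k, mu = 2.0, 3, 10.0                  # assumed arrival rate and Erlang parameters
print(pk_ls(lam, k / mu, k / mu**2))       # general formula -> 1.2
print(pk_ls_erlang(lam, k, mu))            # reduced formula -> 1.2 (same value)
```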

PART-B
11. (a) Let A = highly successful product, B = moderately successful product,
    C = poor product, and G = the product receives a good review.
    Given: P(A) = 0.4, P(B) = 0.35, P(C) = 0.25,
           P(G/A) = 0.95, P(G/B) = 0.6, P(G/C) = 0.1

    (i)  P(G) = P(A)P(G/A) + P(B)P(G/B) + P(C)P(G/C)
              = (0.4)(0.95) + (0.35)(0.6) + (0.25)(0.1)
              = 0.38 + 0.21 + 0.025 = 0.615

    (ii) P(A/G) = P(A)P(G/A)/P(G) = 0.38/0.615 = 0.618 ≈ 0.62

    (iii) P(A/G′) = P(A)P(G′/A)/P(G′) = (0.4)(0.05)/(1 − 0.615)
                  = 0.02/0.385 = 0.052
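A short script (not in the original solution) that reproduces these Bayes calculations numerically; the priors and conditional probabilities are exactly the ones given in the question.

```python
# Bayes' theorem check for the product-review problem.
priors = {"A": 0.40, "B": 0.35, "C": 0.25}          # highly / moderately / poor
good_given = {"A": 0.95, "B": 0.60, "C": 0.10}      # P(good review | category)

p_good = sum(priors[k] * good_given[k] for k in priors)             # (i)
p_A_given_good = priors["A"] * good_given["A"] / p_good             # (ii)
p_A_given_bad = priors["A"] * (1 - good_given["A"]) / (1 - p_good)  # (iii)

print(round(p_good, 3), round(p_A_given_good, 3), round(p_A_given_bad, 3))
# 0.615 0.618 0.052
```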
11. (b) (ii) S = {HHH, HHT, HTH, THH, HTT, THT, TTH, TTT}
    X = number of tails
    X:    0    1    2    3
    P(X): 1/8  3/8  3/8  1/8

    E(X) = 0·(1/8) + 1·(3/8) + 2·(3/8) + 3·(1/8) = 12/8 = 1.5

    (b) (i) Mx(t) = E[e^(tX)] = ∫₀¹ e^(tx)·x dx + ∫₁² e^(tx)(2 − x) dx
        = [x·e^(tx)/t − e^(tx)/t²]₀¹ + [(2 − x)e^(tx)/t + e^(tx)/t²]₁²
        = (e^(2t) − 2e^t + 1)/t² = ((e^t − 1)/t)²
12. (a) (i) If X ~ B(n, p), then P(X = x) = nCx p^x q^(n−x), x = 0, 1, 2, …, n.

    Mx(t) = E[e^(tX)] = Σ (x = 0 to n) e^(tx) nCx p^x q^(n−x)
          = Σ (x = 0 to n) nCx (pe^t)^x q^(n−x) = (q + pe^t)^n

    Mean: E(X) = M′x(0) = [n(q + pe^t)^(n−1)·pe^t] at t = 0 = np
    E(X²) = M″x(0) = [np{e^t(n − 1)(q + pe^t)^(n−2)·pe^t + e^t(q + pe^t)^(n−1)}] at t = 0
          = n(n − 1)p² + np = n²p² + npq
    Var(X) = E(X²) − [E(X)]² = npq
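The same two moments can be recovered symbolically by differentiating the MGF; the following sketch (an add-on, not part of the book's working) uses sympy with symbolic n and p.

```python
# Symbolic check: mean and variance of Binomial(n, p) from its MGF (q + p*e^t)^n.
import sympy as sp

t, n, p = sp.symbols('t n p', positive=True)
q = 1 - p
M = (q + p * sp.exp(t)) ** n              # binomial moment generating function

mean = sp.simplify(sp.diff(M, t).subs(t, 0))        # E(X) = n*p
EX2 = sp.simplify(sp.diff(M, t, 2).subs(t, 0))      # E(X^2)
var = sp.simplify(EX2 - mean**2)                    # n*p*(1 - p) = npq
print(mean, var)
```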
    (ii) Given X ~ NB(r, p) with p = 0.2, q = 1 − p = 0.8 and r = 4,
         Mean = rq/p = (4 × 0.8)/0.2 = 16
12. (b) (i) Let X be the time between successive process problems in the
    manufacturing line. X is exponential with mean 30 days, i.e., λ = 1/30 per day.
    The time until the fourth problem is the sum of 4 independent exponential
    inter-event times, which is Erlang (Gamma) with parameters 4 and λ.
    ∴ Expected time until the fourth problem = 4 × (1/λ) = 4 × 30 = 120 days.

    (ii) Let X ~ N(μ, σ²) and Z = (X − μ)/σ, so that X = σZ + μ.
         Mx(t) = M(σZ + μ)(t) = e^(μt)·Mz(σt)   (by the linear-transformation property)
               = e^(μt)·e^(σ²t²/2) = exp(μt + σ²t²/2)

         Expanding, Mx(t) = 1 + (μt + σ²t²/2)/1! + (μt + σ²t²/2)²/2! + …
         Mean  = E(X) = coefficient of t/1!   = μ
         E(X²) =        coefficient of t²/2!  = σ² + μ²
         Var(X) = E(X²) − [E(X)]² = σ²,  S.D. = σ

13. (a) Since f(x, y) = C(x + y) is a joint pdf over 0 < x < 3, x < y < x + 2,

    ∫₀³ ∫ from y = x to x + 2 of C(x + y) dy dx = 1
    ⇒ C ∫₀³ [xy + y²/2] from y = x to x + 2 dx = 1
    ⇒ C ∫₀³ (4x + 2) dx = 1
    ⇒ C[2x² + 2x]₀³ = 24C = 1  ⇒  C = 1/24

    The marginal density of X is
    f(x) = ∫ from y = x to x + 2 of (1/24)(x + y) dy = (4x + 2)/24 = (2x + 1)/12,
    0 < x < 3.

    (i)  P(X < 1, Y < 2) = (1/24) ∫₀¹ ∫ from y = x to 2 of (x + y) dy dx
                         = (1/24) ∫₀¹ (2x + 2 − 3x²/2) dx
                         = (1/24)(1 + 2 − 1/2) = 5/48 ≈ 0.104

    (ii) P(Y > 2) = (1/24)[∫₀² ∫ from y = 2 to x + 2 of (x + y) dy dx
                           + ∫₂³ ∫ from y = x to x + 2 of (x + y) dy dx]
                  = (1/24)[∫₀² (3x²/2 + 2x) dx + ∫₂³ (4x + 2) dx]
                  = (1/24)[8 + 12] = 20/24 = 5/6 ≈ 0.833

    (iii) E(X) = ∫₀³ x·(2x + 1)/12 dx = (1/12)[2x³/3 + x²/2]₀³
               = (1/12)(18 + 4.5) = 15/8 = 1.875
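These values can be double-checked by direct numerical integration; the snippet below (an illustrative add-on, not part of the exam solution) uses scipy's dblquad over the strip 0 < x < 3, x < y < x + 2.

```python
# Numerical verification of C, P(X<1, Y<2), P(Y>2) and E(X) for f(x,y) = (x+y)/24.
from scipy.integrate import dblquad

f = lambda y, x: (x + y) / 24.0            # dblquad expects f(y, x)

total, _ = dblquad(f, 0, 3, lambda x: x, lambda x: x + 2)
p1, _ = dblquad(f, 0, 1, lambda x: x, lambda x: 2)                 # X < 1, Y < 2
p2, _ = dblquad(f, 0, 3, lambda x: max(x, 2), lambda x: x + 2)     # Y > 2
ex, _ = dblquad(lambda y, x: x * (x + y) / 24.0, 0, 3,
                lambda x: x, lambda x: x + 2)

print(round(total, 4), round(p1, 4), round(p2, 4), round(ex, 4))
# 1.0 0.1042 0.8333 1.875
```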
    (b) (i) X = number of heads in 10 tosses, so X ~ B(n, p) with n = 10, p = q = 1/2.
        Mean = np = 5, Variance = npq = 2.5, S.D. = √2.5 = 1.58

        By the central limit theorem (with continuity correction),
        P(3 ≤ X ≤ 5) ≈ P(2.5 < X < 5.5)
                     = P((2.5 − 5)/1.58 < Z < (5.5 − 5)/1.58)
                     = P(−1.58 < Z < 0.32)
                     = 0.4429 + 0.1255 = 0.5684 ≈ 0.57
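For reference (not part of the original answer), the normal approximation with continuity correction can be compared against the exact binomial probability:

```python
# Exact binomial P(3 <= X <= 5) for n=10, p=0.5 vs. the CLT approximation.
from math import comb, erf, sqrt

n, p = 10, 0.5
exact = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in (3, 4, 5))

mu, sigma = n * p, sqrt(n * p * (1 - p))
Phi = lambda z: 0.5 * (1 + erf(z / sqrt(2)))       # standard normal CDF
approx = Phi((5.5 - mu) / sigma) - Phi((2.5 - mu) / sigma)

print(round(exact, 4), round(approx, 4))   # 0.5684 0.5671
```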
    (ii) Given f(x, y) = e^(−(x + y)), x > 0, y > 0.
         Put Z = X/(X + Y) and W = X + Y, so that x = zw, y = w(1 − z) and
         |J| = |∂(x, y)/∂(z, w)| = w.
         ∴ f(z, w) = |J| e^(−(x + y)) = w e^(−w), 0 ≤ z ≤ 1, w ≥ 0
         f(z) = ∫₀^∞ w e^(−w) dw = 1, 0 ≤ z ≤ 1
         ∴ Z = X/(X + Y) is uniformly distributed over (0, 1).
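A quick Monte Carlo sanity check (illustrative only): for independent unit-exponential X and Y, the ratio X/(X + Y) should behave like a uniform variable on (0, 1), i.e., have mean 1/2 and variance 1/12.

```python
# Simulate Z = X/(X+Y) for independent Exp(1) variables and check its moments.
import random

random.seed(0)
N = 100_000
z = [(x := random.expovariate(1.0)) / (x + random.expovariate(1.0)) for _ in range(N)]

mean = sum(z) / N
var = sum((v - mean) ** 2 for v in z) / N
print(round(mean, 3), round(var, 4))   # ~0.5, ~0.0833 (U(0,1) values)
```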

14. (a) X(t) = A sin(ωt + θ), θ ~ U(0, 2π)  ⇒  f(θ) = 1/2π, 0 < θ < 2π

    E[X(t)] = ∫₀^(2π) A sin(ωt + θ)·(1/2π) dθ
            = (A/2π)[−cos(ωt + θ)]₀^(2π)
            = −(A/2π)[cos(ωt + 2π) − cos ωt] = 0, a constant.

    Rxx(t, t + τ) = E[X(t)X(t + τ)]
        = A² E[sin(ωt + θ) sin(ω(t + τ) + θ)]
        = (A²/2) E[cos ωτ − cos(2ωt + ωτ + 2θ)]
        = (A²/2) cos ωτ          (since E[cos(2ωt + ωτ + 2θ)] = 0)

    The mean is constant and Rxx depends only on τ, so {X(t)} is a WSS process.
    (b) Poisson process:
        A counting process {N(t) : t ≥ 0} is called a Poisson process with rate λ > 0 if
        (i)   N(0) = 0,
        (ii)  the process has independent and stationary increments,
        (iii) P{N(h) = 1} = λh + o(h),
        (iv)  P{N(h) ≥ 2} = o(h).
        Conditions (iii) and (iv) imply P{N(h) = 0} = 1 − λh + o(h).

        Probability distribution:
        Let Pn(t) = P{N(t) = n}. For n ≥ 1, the event of n occurrences in (0, t + Δt)
        can happen in (n + 1) mutually exclusive ways:
        Pn(t + Δt) = Pn(t)·P₀(Δt) + Pn−1(t)·P₁(Δt) + Pn−2(t)·P₂(Δt) + … + P₀(t)·Pn(Δt)
                   = Pn(t)[1 − λΔt + o(Δt)] + Pn−1(t)[λΔt + o(Δt)] + o(Δt)

        [Pn(t + Δt) − Pn(t)]/Δt = −λPn(t) + λPn−1(t) + o(Δt)/Δt
        Letting Δt → 0:  P′n(t) = −λPn(t) + λPn−1(t), n ≥ 1   …(1)

        By the same argument for n = 0:
        P₀(t + Δt) = P₀(t)·P₀(Δt) = P₀(t)[1 − λΔt + o(Δt)]
        ⇒ P′₀(t) = −λP₀(t)  ⇒  P₀(t) = K e^(−λt); P₀(0) = 1 gives K = 1,
        ∴ P₀(t) = e^(−λt).

        Putting n = 1 in (1): P′₁(t) + λP₁(t) = λe^(−λt), whose solution is
        P₁(t) = λt e^(−λt). Continuing in this way,
        Pn(t) = e^(−λt)(λt)^n / n!,  n = 0, 1, 2, …

        Autocorrelation of X(t): for t₂ ≥ t₁,
        Rxx(t₁, t₂) = E[X(t₁)X(t₂)]
                    = E[X(t₁){X(t₂) − X(t₁)}] + E[X²(t₁)]
                    = E[X(t₁)]·E[X(t₂) − X(t₁)] + E[X²(t₁)]   (independent increments)
                    = λt₁·λ(t₂ − t₁) + λt₁ + λ²t₁²
                    = λ²t₁t₂ + λt₁
        In general, Rxx(t₁, t₂) = λ²t₁t₂ + λ min(t₁, t₂), for any t₁, t₂.
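The distribution and the autocorrelation just derived can be checked by simulation; the sketch below (illustrative only, with arbitrarily chosen λ, t₁ and t₂) estimates E[N(t₁)] and E[N(t₁)N(t₂)] from simulated Poisson counts and compares them with λt₁ and λ²t₁t₂ + λt₁.

```python
# Monte Carlo check of E[N(t)] = lam*t and R(t1,t2) = lam^2*t1*t2 + lam*min(t1,t2).
import random

random.seed(1)
lam, t1, t2, runs = 2.0, 1.5, 3.0, 200_000

def poisson_count(rate_times_t):
    """Count unit-rate exponential inter-arrival times falling in [0, rate*t]."""
    total, n = 0.0, 0
    while True:
        total += random.expovariate(1.0)
        if total > rate_times_t:
            return n
        n += 1

acc_n1, acc_prod = 0.0, 0.0
for _ in range(runs):
    n1 = poisson_count(lam * t1)                  # N(t1)
    n2 = n1 + poisson_count(lam * (t2 - t1))      # N(t2) via an independent increment
    acc_n1 += n1
    acc_prod += n1 * n2

print(round(acc_n1 / runs, 3), lam * t1)                          # ~3.0 vs 3.0
print(round(acc_prod / runs, 2), lam**2 * t1 * t2 + lam * t1)     # ~21.0 vs 21.0
```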

15. (a) (i) Let N be the number of customers in the system; the number in the
    queue is then N − 1 (when N ≥ 1). For the (M/M/1):(GD/∞/∞) model,
    P(N = n) = Pn = (λ/μ)^n (1 − λ/μ).

    Lq = E(N − 1) = Σ (n = 1 to ∞) (n − 1)Pn
       = (1 − λ/μ) Σ (n = 1 to ∞) (n − 1)(λ/μ)^n
       = (λ/μ)² (1 − λ/μ) Σ (n = 2 to ∞) (n − 1)(λ/μ)^(n−2)
       = (λ/μ)² (1 − λ/μ)(1 − λ/μ)^(−2)

    ∴ Lq = λ² / (μ(μ − λ))

    (ii) Given λ = 30 per hour, μ = 20 per hour, K = 14 + 1 = 15.
         Model: (M/M/1):(K/FIFO),  ρ = λ/μ = 30/20 = 1.5

         P₀ = (1 − ρ)/(1 − ρ^(K+1)) = (1 − 1.5)/(1 − (1.5)^16) = 0.00076

         (1) P(an arriving patient does not have to wait) = P₀ = 0.00076

         (2) Ls = ρ/(1 − ρ) − (K + 1)ρ^(K+1)/(1 − ρ^(K+1)) = 13.02
             λ′ = μ(1 − P₀) = 20(1 − 0.00076) ≈ 20 per hour (nearly)
             Expected waiting time until a patient is discharged:
             Ws = Ls/λ′ = 13.02/20 hours = (13.02/20) × 60 = 39.06 min
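The same finite-capacity quantities are easy to evaluate programmatically; the helper below (an illustrative sketch, not taken from the text) computes P₀, Ls and Ws for any M/M/1/K system and reproduces the clinic numbers.

```python
# Steady-state metrics of an M/M/1/K queue (here lam=30, mu=20, K=15).
def mm1k_metrics(lam, mu, K):
    rho = lam / mu
    if abs(rho - 1.0) < 1e-12:                 # degenerate case rho = 1
        p0 = 1.0 / (K + 1)
        ls = K / 2.0
    else:
        p0 = (1 - rho) / (1 - rho ** (K + 1))
        ls = rho / (1 - rho) - (K + 1) * rho ** (K + 1) / (1 - rho ** (K + 1))
    lam_eff = mu * (1 - p0)                    # effective arrival rate
    ws = ls / lam_eff                          # Little's formula
    return p0, ls, ws

p0, ls, ws = mm1k_metrics(30, 20, 15)
print(round(p0, 5), round(ls, 2), round(ws * 60, 1))   # 0.00076 13.02 ~39.1 min
```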
15. (b) Pollaczek–Khintchine formula:
    Let N and N′ be the numbers of customers left in the system by two consecutive
    departing customers, and let T be the (general) service time of the second of
    them, with pdf f(t), mean E(T) and variance Var(T). Let M be the number of
    customers arriving during that service time. Then

        N′ = M            if N = 0
        N′ = N − 1 + M    if N > 0                                  …(1)

    Writing δ = 1 if N = 0 and δ = 0 if N > 0, (1) becomes
        N′ = N − 1 + M + δ                                          …(2)

    In the steady state the distribution of the number in the system is unchanged,
    so E(N′) = E(N) and E(N′²) = E(N²). Taking expectations in (2):
        E(δ) = 1 − E(M)                                             …(3)

    Squaring (2):
        N′² = N² + (M − 1)² + δ² + 2N(M − 1) + 2(M − 1)δ + 2Nδ
    Now δ² = δ and Nδ = 0 (δ = 1 only when N = 0). Since E(N′²) = E(N²) and M is
    independent of N and of δ,
        0 = E[(M − 1)²] + E(δ) + 2E(N)[E(M) − 1] + 2[E(M) − 1]E(δ)
        2E(N)[1 − E(M)] = E(M²) − 2E(M) + 1 + E(δ) + 2[E(M) − 1]E(δ)

    Using (3) and simplifying,
        Ls = E(N) = E(M) + [E(M²) − E(M)] / (2[1 − E(M)])           …(4)

    Since the arrivals during T form a Poisson process with rate λ,
        E(M) = λE(T) and E(M²) = λ²[Var(T) + E²(T)] + λE(T).
    Substituting in (4),

        Ls = λE(T) + λ²[Var(T) + E²(T)] / (2[1 − λE(T)])
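As a sanity check (not part of the derivation), the formula can be evaluated for particular service-time distributions: with exponential service it must reproduce the M/M/1 result ρ/(1 − ρ), and deterministic service halves the queueing term. The rates below are assumed values.

```python
# Evaluate Ls from the P-K formula for M/M/1 and M/D/1 with assumed rates.
def pk_ls(lam, ET, VarT):
    rho = lam * ET
    return rho + lam**2 * (VarT + ET**2) / (2 * (1 - rho))

lam, mu = 3.0, 4.0
rho = lam / mu
print(pk_ls(lam, 1/mu, 1/mu**2), rho / (1 - rho))   # M/M/1: both give 3.0
print(pk_ls(lam, 1/mu, 0.0))                        # M/D/1: 0.75 + 1.125 = 1.875
```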



B.E./B.Tech. DEGREE EXAMINATION,
MAY/JUNE 2009
Fourth Semester
Computer Science and Engineering
PROBABILITY AND QUEUEING THEORY
(Common to Information Technology)
(Regulation 2008)
Time: Three hours Maximum: 100 marks
Answer ALL questions
PART A (10 × 2 = 20 marks)
1. Is the cumulative distribution function F(x) of a random variable X
always continuous? Justify your answer.

2. When A and B are two mutually exclusive events, are the values
P(A) = 0.6 and P(A ∩ B̄) = 0.5 consistent? Why?

3. If on the average rain falls on 10 days in every 30 days, obtain the


probability that rain will fall on atleast 3 days of a given week.

4. What is the relationship between Weibull and Exponential distribution?

5. Write down any two properties of correlation coefficient.

6. The joint pdf of the random variable (X, Y) is given by
   f(x, y) = k xy e^(−(x² + y²)), x > 0, y > 0. Find the value of k.

7. What do you understand by stationary process?

8. “The additive property holds good for any number of independent


Poisson processes”. Justify.

9. Draw the state transition rate diagram for M/M/C queueing model.

10. What is the probability that a customer has to wait more than 15 mins to
get his service completed in M/M/1 queueing system, if l = 6/hr and
m = 10/hr?


PART B (5 × 16 = 80 marks)

11. (a) (i) A toy is rejected if its design is faulty. The
probability that the design is faulty is 0.1; the probability that the toy
is rejected if the design is faulty is 0.95, and otherwise it is 0.45.
If a toy is rejected, what is the probability that it is
due to a faulty design? (8)
(ii) In an exhibition, the probabilities of hitting the target are
1/2 for A, 2/3 for B and 3/4 for C. If all of them fire at the
same target, what are the probabilities that (1) only one of
them hits the target (2) atleast one of them hits the target?
(8)
(or)
(b) (i) A random variable X has the following probability
distribution:
X = x: −2 −1 0 1 2 3
P(x): 0.1 K 0.2 2K 0.3 3K
Find K, P(−2 < X < 2) and the mean of X. (8)

(b) (ii) If p(x) = x e^(−x²/2) for x ≥ 0, and p(x) = 0 for x < 0,

(1) Show that p(x) is a pdf. (2) Find F(x). (8)

12. (a) (i) Out of 800 families with 4 children each, how many
families would be expected to have (1) 2 boys and 2 girls
(2) atleast 1 boy (3) atmost 2 girls (4) children of both
sexes. Assume equal probabilities for boys and girls. (8)
(ii) The mileage which car owners get with a certain kind of
radial tire is a random variable having an exponential
distribution with mean 40,000 km. Find the probabilities
that one of these tires will last (1) atleast 20,000 km (2)
atmost 30,000 km. (8)
or
12. (b) (i) If the life X (in years) of a certain type of car has a Weibull
distribution with the parameter b = 2, find the value of the
parameter a, given the probability that the life of the car
exceeds 5 years is e−0.25. For these values of a and b, find
the mean and variance of X. (8)


(ii) If the actual amount of instant coffee which a filling


machine puts into ‘6-ounce’ jars is a random variable
having a normal distribution with SD = 0.05 ounce and if
only 3% of the jars are to contain less than 6 ounces of
coffee, what must be the mean fill of these jars? (8)

13. (a) Obtain the equation of the regression lines from the following
data, using the method of least squares. Hence find the
coefficient of correlation between X and Y. Also estimate the
value of Y when X = 38 and the value of X when Y = 18. (16)

X: 22 26 29 30 31 33 34 35
Y: 20 20 21 29 27 24 27 31

or
(b) The joint pdf of a 2-dimensional RV (X, Y) is given by
    f(x, y) = xy² + x²/8, 0 ≤ x ≤ 2, 0 ≤ y ≤ 1. Compute P(X > 1),
    P(Y < 1/2), P(X > 1 / Y < 1/2), P(Y < 1/2 / X > 1), P(X < Y),
    P(X + Y ≤ 1). (16)

14. (a) (i) Given a RV Ω with density f(w), another RV φ uniformly
            distributed in (−π, π) and independent of Ω, and X(t) = a
            cos(Ωt + φ), prove that X(t) is a WSS process. (8)
(ii) Suppose that customers arrive at a bank according to a
Poisson process with a mean rate of 3 per minute ; find the
probability that during a time interval of 2 mins.
(1) Exactly 4 customers arrive and
(2) More than 4 customers arrive. (8)
or
(b) An engineer analyzing a series of digital signals generated by
a testing system observes that only 1 out of 15 highly distorted
signals follows a highly distorted signal, with no recognizable
signal between, whereas 20 out of 23 recognizable signals
follow recognizable signals, with no highly distorted signal
between. Given that only highly distorted signals are not
recognizable, find the TPM and fraction of signals that are
highly distorted. (16)


15. (a) There are 3 typists in an office ; each typist can type an
average of 6 letters/hour. If letters arrive for being typed at
the rate of 15 letters/hour.
(i) What fraction of the time all the typists will be busy?
(ii) What is the average number of letters waiting to be typed?
(iii) What is the average time a letter has to spend for waiting
and for being typed?
(iv) What is the probability th at a letter will take longer than
20 mins. waiting to be typed and being typed? (16)
or
(b) Customers arrive at a one-man barber shop according to a
Poisson process with a mean inter arrival time of 12 mins.
Customers spend an average of 10 mins in the barber’s chair.
(i) What is the expected number of customers in the barber
shop and in the queue?
(ii) How much time can a customer expect to spend in the
barber’s shop?
(iii) What is the average time customers spend in the queue?
(iv) What is the probability that the waiting time in the system is
     greater than 30 mins? (16)



Solutions
Part A
1. The cumulative distribution function F(x) of a random variable X need not
   always be continuous. If X is discrete, then F(x) is a step function: it is right
   continuous, but need not be left continuous.

2. If A and B are mutually exclusive, then A ∩ B = ∅, so that
   P(A ∩ B̄) = P(A) − P(A ∩ B) = P(A).
   Hence P(A ∩ B̄) should equal P(A) = 0.6, which contradicts the given value 0.5.
   ∴ The given values are not consistent.

3. Let p be the probability of rain on a given day: p = 10/30 = 1/3, q = 2/3.
   If X is the number of rainy days in a week, X ~ B(7, 1/3) and
   P(X = x) = 7Cx (1/3)^x (2/3)^(7−x).
   P(X ≥ 3) = P(3) + P(4) + P(5) + P(6) + P(7)
            = 7C3(1/3)³(2/3)⁴ + 7C4(1/3)⁴(2/3)³ + 7C5(1/3)⁵(2/3)²
              + 7C6(1/3)⁶(2/3) + 7C7(1/3)⁷
            = 0.2561 + 0.1280 + 0.0384 + 0.0064 + 0.0005 = 0.4294

4. In the Weibull distribution, f(x) = αβ x^(β−1) e^(−αx^β), x > 0.
   When β = 1, f(x) = α e^(−αx), which is the pdf of the exponential distribution.
   i.e., when β = 1 the Weibull distribution reduces to the exponential
   distribution.
5. (1) −1 ≤ r(X, Y) ≤ 1, i.e., |Cov(X, Y)| ≤ σx σy.
   (2) The correlation coefficient is independent of change of origin and scale:
       if U = (X − a)/h and V = (Y − b)/k with h, k > 0, then r(X, Y) = r(U, V).
6. By the property of a joint pdf,
   k ∫₀^∞ ∫₀^∞ xy e^(−(x² + y²)) dx dy = 1
   i.e., k [∫₀^∞ x e^(−x²) dx][∫₀^∞ y e^(−y²) dy] = 1
   Putting x² = t, 2x dx = dt gives ∫₀^∞ x e^(−x²) dx = (1/2)∫₀^∞ e^(−t) dt = 1/2,
   and similarly for the y-integral.
   ∴ k (1/2)(1/2) = 1  ⇒  k = 4
7. If the probability distributions (or statistical averages) of a random process
   {X(t)} do not depend on the choice of the time origin t, the process is called
   stationary.

8. Let X(t) = X₁(t) + X₂(t), where X₁(t) and X₂(t) are independent Poisson
   processes with parameters λ₁ and λ₂.
   P[X(t) = n] = Σ (r = 0 to n) P{X₁(t) = r} P{X₂(t) = n − r}
               = Σ (r = 0 to n) [e^(−λ₁t)(λ₁t)^r / r!]·[e^(−λ₂t)(λ₂t)^(n−r) / (n − r)!]
               = e^(−(λ₁+λ₂)t) (1/n!) Σ (r = 0 to n) nCr (λ₁t)^r (λ₂t)^(n−r)
               = e^(−(λ₁+λ₂)t) [(λ₁ + λ₂)t]^n / n!
   ∴ X₁(t) + X₂(t) is a Poisson process with parameter (λ₁ + λ₂)t, which justifies
   the additive property.
9. State transition rate diagram of the (M/M/C) model: the states are 0, 1, 2, …;
   every transition n → n + 1 occurs at the arrival rate λ, while the transition
   n → n − 1 occurs at rate nμ for n ≤ C and at rate Cμ for n > C
   (service rates μ, 2μ, 3μ, …, Cμ, Cμ, …).
   (See also Nov/Dec 2010, Question 8.)


10. Given λ = 6/hour, μ = 10/hour.
    Probability that the waiting time of a customer in the system exceeds t:
    P[Ws > t] = e^(−(μ−λ)t)
    ∴ P[a customer has to wait more than 15 min] = P[Ws > 1/4 hour]
      = e^(−(10−6)/4) = e^(−1) = 0.3679

11. (a) (i) Let D₁, D₂ denote the events that the design is faulty or not, and let A
    denote the event that the toy is rejected.
    P(D₁) = 0.1, P(D₂) = 0.9, P(A/D₁) = 0.95, P(A/D₂) = 0.45
    P(rejection is due to faulty design) = P(D₁/A)
      = P(D₁)P(A/D₁) / [P(D₁)P(A/D₁) + P(D₂)P(A/D₂)]
      = (0.1 × 0.95) / (0.1 × 0.95 + 0.9 × 0.45) = 0.19

11. (a) (ii) Given P(A) = 1/2, P(B) = 2/3, P(C) = 3/4, so
    P(Ā) = 1/2, P(B̄) = 1/3, P(C̄) = 1/4.

    (1) P(only one of them hits the target)
        = P(A)P(B̄)P(C̄) + P(Ā)P(B)P(C̄) + P(Ā)P(B̄)P(C)   (by independence)
        = (1/2)(1/3)(1/4) + (1/2)(2/3)(1/4) + (1/2)(1/3)(3/4)
        = 1/24 + 2/24 + 3/24 = 6/24 = 1/4

    (2) P(none of them hits the target) = P(Ā)P(B̄)P(C̄) = (1/2)(1/3)(1/4) = 1/24
        P(at least one hits the target) = 1 − 1/24 = 23/24
11. (b) (i) To find K: since Σ p(x) = 1,
    0.1 + K + 0.2 + 2K + 0.3 + 3K = 1  ⇒  6K + 0.6 = 1  ⇒  K = 1/15

    The probability distribution becomes
    x:    −2     −1    0    1     2     3
    p(x): 1/10  1/15  1/5  2/15  3/10  1/5

    P(−2 < X < 2) = P(X = −1) + P(X = 0) + P(X = 1) = 1/15 + 1/5 + 2/15 = 2/5

    Mean of X = E(X) = Σ x p(x)
      = (−2)(1/10) + (−1)(1/15) + 0 + (1)(2/15) + (2)(3/10) + (3)(1/5)
      = −1/5 − 1/15 + 2/15 + 3/5 + 3/5 = 16/15
11. (b) (ii) Given p(x) = x e^(−x²/2), x ≥ 0.
    ∫₀^∞ p(x) dx = ∫₀^∞ x e^(−x²/2) dx.   Put t = x²/2, dt = x dx:
    = ∫₀^∞ e^(−t) dt = [−e^(−t)]₀^∞ = 1, and p(x) ≥ 0, so p(x) is a pdf.

    F(x) = P(X ≤ x):
    for x < 0, F(x) = 0;
    for x ≥ 0, F(x) = ∫₀^x t e^(−t²/2) dt = 1 − e^(−x²/2).

    ∴ F(x) = 0 for x < 0 and F(x) = 1 − e^(−x²/2) for x ≥ 0.

12. (a) (i) Consider each child as a trial: n = 4, N = 800 families. Assuming the
    birth of a boy is a success, p = 1/2 and q = 1/2. Let X denote the number of
    boys, so P(X = x) = 4Cx (1/2)^x (1/2)^(4−x) = 4Cx (1/2)⁴.

    (1) P(2 boys and 2 girls) = P(X = 2) = 4C2 (1/2)⁴ = 6/16 = 3/8
        Number of such families = 800 × 3/8 = 300

    (2) P(at least 1 boy) = 1 − P(X = 0) = 1 − 1/16 = 15/16
        Number of such families = 800 × 15/16 = 750

    (3) P(at most 2 girls) = P(0, 1 or 2 girls) = (1/2)⁴[4C0 + 4C1 + 4C2]
        = (1 + 4 + 6)/16 = 11/16
        Number of such families = 800 × 11/16 = 550

    (4) P(children of both sexes) = 1 − [P(all boys) + P(all girls)]
        = 1 − [1/16 + 1/16] = 7/8
        Number of such families = 800 × 7/8 = 700
12. (a) (ii) X = tyre mileage, exponential with mean 40,000 km:
    f(x) = (1/40000) e^(−x/40000), x > 0.

    (1) P(X ≥ 20000) = ∫ from 20000 to ∞ of (1/40000)e^(−x/40000) dx
                     = e^(−20000/40000) = e^(−0.5) = 0.6065

    (2) P(X ≤ 30000) = 1 − e^(−30000/40000) = 1 − e^(−0.75) = 0.5276

12. (b) (i) With β = 2, the Weibull density of X is f(x) = 2αx e^(−αx²), x > 0.
    P(X > 5) = ∫₅^∞ 2αx e^(−αx²) dx = [−e^(−αx²)]₅^∞ = e^(−25α)
    Given P(X > 5) = e^(−0.25), so 25α = 0.25 ⇒ α = 1/100.

    E(X) = α^(−1/β) Γ(1/β + 1) = (1/100)^(−1/2) Γ(3/2) = 10 × (√π)/2 = 5√π
    Var(X) = α^(−2/β)[Γ(2/β + 1) − {Γ(1/β + 1)}²]
           = 100[Γ(2) − {Γ(3/2)}²] = 100[1 − π/4]

12. (b) (ii) Let X be the actual amount of coffee put into a jar; X ~ N(μ, 0.05²).
    Given P(X < 6) = 0.03:
    P(Z < (6 − μ)/0.05) = 0.03
    ⇒ P(0 < Z < (μ − 6)/0.05) = 0.47   (by symmetry)
    From the normal table, P(0 < Z < 1.88) = 0.47
    ∴ (μ − 6)/0.05 = 1.88  ⇒  μ = 6 + 1.88 × 0.05 = 6.094 ounces

13. (a) Put u = x − 29 and v = y − 27.
    Let the regression line of Y on X be y = Ax + B, or equivalently v = au + b …(1)
    The normal equations for a and b are
        a Σu² + b Σu = Σuv   …(2)      a Σu + nb = Σv   …(3)

     x    y    u = x − 29   v = y − 27    u²    v²    uv
    22   20       −7           −7         49    49    49
    26   20       −3           −7          9    49    21
    29   21        0           −6          0    36     0
    30   29        1            2          1     4     2
    31   27        2            0          4     0     0
    33   24        4           −3         16     9   −12
    34   27        5            0         25     0     0
    35   31        6            4         36    16    24
    Σ               8          −17        140   163    84

    (2) ⇒ 140a + 8b = 84;  (3) ⇒ 8a + 8b = −17
    Subtracting, 132a = 101 ⇒ a = 0.765, and b = (−17 − 8a)/8 = −2.89.
    Hence the regression line of y on x is
        y − 27 = 0.765(x − 29) − 2.89,  i.e.,  y = 0.765x + 1.92   …(4)

    Let the regression line of X on Y be x = Cy + D, or equivalently u = cv + d,
    with normal equations c Σv² + d Σv = Σuv and c Σv + nd = Σu:
        163c − 17d = 84 and −17c + 8d = 8
    Solving, 1015c = 808 ⇒ c = 0.796 and d = (8 + 17c)/8 = 2.69.
    Hence the regression line of x on y is
        x − 29 = 0.796(y − 27) + 2.69,  i.e.,  x = 0.796y + 10.20   …(5)

    Comparing (4) and (5) with y − ȳ = r(σy/σx)(x − x̄) and x − x̄ = r(σx/σy)(y − ȳ),
        r² = (r σy/σx)(r σx/σy) = 0.765 × 0.796 = 0.609
    ∴ r = 0.78   (positive, since both regression coefficients are positive)

    Estimate of Y when X = 38 (from (4)):  y = 0.765 × 38 + 1.92 = 31.0
    Estimate of X when Y = 18 (from (5)):  x = 0.796 × 18 + 10.20 = 24.5
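The fitted lines and the correlation coefficient can be verified directly with a least-squares routine; the snippet below (an illustrative check, not in the original solution) runs numpy on the given data.

```python
# Least-squares check of the two regression lines and the correlation coefficient.
import numpy as np

x = np.array([22, 26, 29, 30, 31, 33, 34, 35], dtype=float)
y = np.array([20, 20, 21, 29, 27, 24, 27, 31], dtype=float)

a_yx, b_yx = np.polyfit(x, y, 1)      # regression of Y on X
a_xy, b_xy = np.polyfit(y, x, 1)      # regression of X on Y
r = np.corrcoef(x, y)[0, 1]

print(round(a_yx, 3), round(b_yx, 2))   # ~0.765, ~1.92
print(round(a_xy, 3), round(b_xy, 2))   # ~0.796, ~10.20
print(round(r, 2))                      # ~0.78
print(round(a_yx * 38 + b_yx, 1), round(a_xy * 18 + b_xy, 1))   # ~31.0, ~24.5
```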

13. (b) Given f(x, y) = xy² + x²/8, 0 ≤ x ≤ 2, 0 ≤ y ≤ 1.

    (i)  P(X > 1) = ∫₀¹ ∫₁² (xy² + x²/8) dx dy
                  = ∫₀¹ [x²y²/2 + x³/24] from x = 1 to 2 dy
                  = ∫₀¹ (3y²/2 + 7/24) dy = 1/2 + 7/24 = 19/24

    (ii) P(Y < 1/2) = ∫₀² ∫ from y = 0 to 1/2 of (xy² + x²/8) dy dx
                    = ∫₀² [xy³/3 + x²y/8] from y = 0 to 1/2 dx
                    = ∫₀² (x/24 + x²/16) dx = [x²/48 + x³/48]₀² = 12/48 = 1/4

    (iii) P(X > 1, Y < 1/2) = ∫ from y = 0 to 1/2 of ∫₁² (xy² + x²/8) dx dy
                            = ∫ from y = 0 to 1/2 of (3y²/2 + 7/24) dy
                            = 1/16 + 7/48 = 10/48 = 5/24

    (iv) P(X > 1 / Y < 1/2) = P(X > 1, Y < 1/2) / P(Y < 1/2) = (5/24)/(1/4) = 5/6

    (v)  P(Y < 1/2 / X > 1) = P(X > 1, Y < 1/2) / P(X > 1) = (5/24)/(19/24) = 5/19

    (vi) P(X < Y) = ∫₀¹ ∫ from x = 0 to y of (xy² + x²/8) dx dy   (region 0 < x < y < 1)
                  = ∫₀¹ [x²y²/2 + x³/24] from x = 0 to y dy
                  = ∫₀¹ (y⁴/2 + y³/24) dy = [y⁵/10 + y⁴/96]₀¹
                  = 1/10 + 1/96 = 53/480

    (vii) P(X + Y ≤ 1) = ∫₀¹ ∫ from x = 0 to 1 − y of (xy² + x²/8) dx dy
                       = ∫₀¹ [(1 − y)²y²/2 + (1 − y)³/24] dy
                       = (1/2)(1/30) + (1/24)(1/4)
                       = 1/60 + 1/96 = 13/480
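All of these probabilities can be reproduced symbolically; this short sympy script (an illustrative check, not part of the original answer) integrates f(x, y) = xy² + x²/8 over each region.

```python
# Symbolic verification of the probabilities for f(x,y) = x*y**2 + x**2/8.
import sympy as sp

x, y = sp.symbols('x y', nonnegative=True)
f = x * y**2 + x**2 / 8

P_x_gt_1 = sp.integrate(f, (x, 1, 2), (y, 0, 1))                     # 19/24
P_y_lt_half = sp.integrate(f, (y, 0, sp.Rational(1, 2)), (x, 0, 2))  # 1/4
P_both = sp.integrate(f, (x, 1, 2), (y, 0, sp.Rational(1, 2)))       # 5/24
P_x_lt_y = sp.integrate(f, (x, 0, y), (y, 0, 1))                     # 53/480
P_sum_le_1 = sp.integrate(f, (x, 0, 1 - y), (y, 0, 1))               # 13/480

print(P_x_gt_1, P_y_lt_half, P_both, P_both / P_y_lt_half, P_both / P_x_gt_1)
print(P_x_lt_y, P_sum_le_1)
```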

14. (a) (i) Here φ is uniform in (−π, π), independent of Ω, and X(t) = a cos(Ωt + φ).
    Using E[g(X, Y)] = E[E{g(X, Y)/X}]:

    E[X(t)] = a E[cos(Ωt + φ)] = a E[ E{cos(Ωt + φ)/Ω} ]
            = a E[ (1/2π) ∫ from −π to π of cos(Ωt + φ) dφ ] = a E[0] = 0

    Rxx(t₁, t₂) = E[X(t₁)X(t₂)] = a² E[cos(Ωt₁ + φ) cos(Ωt₂ + φ)]
        = (a²/2) E[cos Ω(t₁ − t₂) + cos(Ω(t₁ + t₂) + 2φ)]
        = (a²/2) E[cos Ω(t₁ − t₂)]
          (since, for each Ω, (1/2π) ∫ from −π to π of cos(Ω(t₁ + t₂) + 2φ) dφ = 0)

    The mean is constant and Rxx(t₁, t₂) is a function of t₁ − t₂ only, whatever the
    density f(w) of Ω may be. Hence {X(t)} is a WSS process.

14. (a) (ii) Mean of the Poisson process = λt, with arrival rate λ = 3/min, so for a
    2-minute interval λt = 6 and P[X(2) = n] = e^(−6) 6^n / n!.

    (1) P[X(2) = 4] = e^(−6) 6⁴/4! = 0.1339

    (2) P[X(2) > 4] = 1 − P[X(2) ≤ 4]
                    = 1 − e^(−6)[1 + 6 + 6²/2! + 6³/3! + 6⁴/4!] = 0.715

14. (b) For n ≥ 1, let Xn = 1 if the nth signal is highly distorted and Xn = 0 if it is
    recognizable. Then {Xn : n = 1, 2, …} is a Markov chain with state space {0, 1}
    and transition probability matrix (state order 0, 1)

        P = ⎛20/23   3/23⎞
            ⎝14/15   1/15⎠

    Let π₀ = fraction of recognizable signals and π₁ = fraction of highly distorted
    signals. The stationary distribution satisfies (π₀, π₁)P = (π₀, π₁):
        (20/23)π₀ + (14/15)π₁ = π₀   …(1)
        (3/23)π₀ + (1/15)π₁ = π₁     …(2)
    together with π₀ + π₁ = 1        …(3)

    Solving, π₀ = 322/367 = 0.877 and π₁ = 1 − π₀ = 0.123.
    Therefore about 12.3% of the signals generated by the testing system are
    highly distorted.
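The stationary vector can also be obtained numerically, e.g. by solving πP = π together with the normalisation constraint; the snippet below (illustrative only) does this for the TPM above.

```python
# Stationary distribution of the 2-state signal chain: solve pi P = pi, sum(pi) = 1.
import numpy as np

P = np.array([[20/23, 3/23],
              [14/15, 1/15]])

# Keep one balance equation and replace the other by the normalisation condition.
A = np.vstack([(P.T - np.eye(2))[0], np.ones(2)])
b = np.array([0.0, 1.0])
pi = np.linalg.solve(A, b)

print(pi.round(3))        # [0.877 0.123]
```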


15. (a) Arrival rate λ = 15/hour, service rate μ = 6/hour, number of servers s = 3.
    This is the multi-server (M/M/s):(∞/FIFO) model with λ/μ = 2.5 and λ/(sμ) = 5/6.

    P₀ = [ Σ (n = 0 to s − 1) (1/n!)(λ/μ)^n + (λ/μ)^s / (s!(1 − λ/sμ)) ]^(−1)
       = [ 1 + 2.5 + (2.5)²/2 + (2.5)³/(3!(1 − 5/6)) ]^(−1)
       = [6.625 + 15.625]^(−1) = [22.25]^(−1) = 0.0449

    (i)  P(all typists busy) = P(N ≥ 3) = (λ/μ)³ P₀ / (3!(1 − λ/3μ))
         = (2.5)³ × 0.0449 / (6 × 1/6) = 0.7016
         Hence the fraction of the time all the typists are busy is 0.7016.

    (ii) Average number of letters waiting to be typed:
         Lq = (λ/μ)^(s+1) P₀ / (s·s!(1 − λ/sμ)²)
            = (2.5)⁴ × 0.0449 / (3 × 6 × (1/6)²) = 3.51 letters

    (iii) Ws = Ls/λ = (Lq + λ/μ)/λ = (3.51 + 2.5)/15 = 0.4005 hour ≈ 24 min

    (iv) P(W > t) = e^(−μt)[1 + (λ/μ)^s P₀ {1 − e^(−μt(s − 1 − λ/μ))}
                           / (s!(1 − λ/sμ)(s − 1 − λ/μ))]
         For t = 20 min = 1/3 hour, μt = 2 and s − 1 − λ/μ = −0.5:
         P(W > 1/3) = e^(−2)[1 + 0.7016(1 − e)/(−0.5)] = 0.4616
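These M/M/s quantities follow from the standard Erlang-C machinery; the helper below (an illustrative sketch, not taken from the text) reproduces P₀, the busy probability, Lq and Ws for the typist data.

```python
# M/M/s steady-state measures for the typing pool (lam=15, mu=6, s=3).
from math import factorial

def mms_metrics(lam, mu, s):
    a = lam / mu                     # offered load
    rho = a / s                      # server utilisation, must be < 1
    p0 = 1.0 / (sum(a**n / factorial(n) for n in range(s))
                + a**s / (factorial(s) * (1 - rho)))
    p_wait = a**s * p0 / (factorial(s) * (1 - rho))      # Erlang C: P(N >= s)
    lq = p_wait * rho / (1 - rho)
    ws = (lq + a) / lam                                   # Little's formula
    return p0, p_wait, lq, ws

p0, p_wait, lq, ws = mms_metrics(15, 6, 3)
print(round(p0, 4), round(p_wait, 4), round(lq, 2), round(ws * 60, 1))
# 0.0449 0.7016 3.51 24.0
```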
1
15. (b) Mean inter-arrival time = 12 min ⇒ λ = 1/12 per min;
    mean service time = 10 min ⇒ μ = 1/10 per min. (M/M/1) model.

    (i)  Ls = λ/(μ − λ) = (1/12)/(1/10 − 1/12) = 5 customers
         Lq = λ²/(μ(μ − λ)) = (1/144)/((1/10)(1/60)) = 4.17 customers

    (ii) Ws = 1/(μ − λ) = 1/(1/60) = 60 min = 1 hour

    (iii) Wq = λ/(μ(μ − λ)) = (1/12)/((1/10)(1/60)) = 50 min

    (iv) P(W > t) = e^(−(μ−λ)t), so
         P(W > 30) = e^(−(1/10 − 1/12)×30) = e^(−0.5) = 0.6065
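For completeness, the standard M/M/1 formulas used here can be bundled into a tiny helper (illustrative only); running it with λ = 1/12 and μ = 1/10 per minute reproduces the four answers.

```python
# M/M/1 measures for the one-man barber shop (rates per minute).
from math import exp

def mm1_metrics(lam, mu, t):
    ls = lam / (mu - lam)                    # mean number in system
    lq = lam**2 / (mu * (mu - lam))          # mean number in queue
    ws = 1.0 / (mu - lam)                    # mean time in system
    wq = lam / (mu * (mu - lam))             # mean waiting time in queue
    p_wait_gt_t = exp(-(mu - lam) * t)       # P(time in system > t)
    return ls, lq, ws, wq, p_wait_gt_t

print([round(v, 3) for v in mm1_metrics(1/12, 1/10, 30)])
# [5.0, 4.167, 60.0, 50.0, 0.607]
```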



B.E./B.Tech. DEGREE EXAMINATION,
NOV/DEC 2008
Fourth Semester
(Regulation 2004)
Computer Science and Engineering
MA 1252—PROBABILITY AND QUEUEING THEORY
(Common to Information Technology)
Time: Three hours Maximum: 100 marks
Use of Statistical Table is permitted
Answer ALL questions
PART A – (10 × 2 = 20 marks)
1. When A and B are 2 mutually exclusive events, are the values P(A) = 0.6
   and P(A ∩ B̄) = 0.5 consistent? Why?

2. A continuous random variable X has a density function given by
   f(x) = k(1 + x), 2 < x < 5. Find P(X < 4).

3. The number of monthly breakdowns of a computer is a random variable


having a Poisson distribution with mean equal to 1.8. Find the probability
that this computer will function for a month (a) without a break down (b)
with only one breakdown.

4. Find the distribution function of the random variable Y = g(X), in terms of
   the distribution function of X, if it is given that
   g(x) = x − c for x > c;  0 for |x| ≤ c;  x + c for x < −c.

5. Define independence of two random variables X and Y, both in the discrete
   case and in the continuous case.

6. Comment on the following:


“The random variables X and Y are independent iff Cov (X, Y) = 0”.

7. If X(s, t) is a random process, what is the nature of X(s, t) when (a) s is


fixed (b) t is fixed?

8. What is a stochastic matrix? When is it said to be regular?

9. What do you mean by transient state and steady state in a queueing
   system?

10. If people arrive to purchase cinema tickets at the average rate of 6 per
    minute, it takes an average of 7.5 seconds to purchase a ticket. If a person
    arrives 2 min before the picture starts and it takes exactly 1.5 min to
    reach the correct seat after purchasing the ticket, can he expect to be
    seated for the start of the picture?

PART B – (5 × 16 = 80 marks)
11. (a) (i) The first bag contains 3 white balls, 2 red balls and 4 black balls.
Second bag contains 2 white, 3 red and 5 black balls and third
bag contains 3 white, 4 red and 2 black balls. One bag is chosen
at random and from it 3 balls are drawn. Out of 3 balls, 2 balls
are white 1 is red. What are the probabilities that they were taken
from first bag, second bag, third bag.
(ii) A random variable X has the p.d.f.
⎧2 x,, 0 < x < 1
( x) = ⎨
⎩ 0, otherwise
find

⎛ 1⎞ ⎛1 1⎞ ⎛ 3 1⎞
( )P x< ⎟ ,( 2) ,( ) P x > / X > ⎟
⎝ 2⎠ ⎝4 2 ⎠ ⎝ 4 2⎠

(Or)
(b) (i) If the density function of a continuous random variable X is given by
        f(x) = ax        for 0 ≤ x ≤ 1
             = a         for 1 ≤ x ≤ 2
             = 3a − ax   for 2 ≤ x ≤ 3
             = 0         otherwise,
        (1) find 'a' and (2) find the cdf of X.


(ii) If the moments of a random variable X are defined by E(X^r) = 0.6,
     r = 1, 2, 3, …, show that P(X = 0) = 0.4, P(X = 1) = 0.6, P(X ≥ 2) = 0.

12. (a) (i) A machine manufacturing screws is known to produce 5%
        defectives. In a random sample of 15 screws, what is the prob-
        ability that there are (1) exactly 3 defectives, (2) not more than
        3 defectives?
    (ii) A die is cast until 6 appears. What is the probability that it must be
         cast more than 5 times?
(ii) A die is cast until 6 appears. What is probability that it must be
cast more than 5 times?
(Or)
(b) (i) If X is uniformly distributed over (−a, a), a > 0, find a so that
        (1) P(X > 1) = 1/3 and (2) P(|X| < 1) = P(|X| > 1).
(ii) Assume that the mean height of soldiers is 68.22 inches with a
     variance of 10.8 square inches. How many soldiers in a regiment of
     1000 would you expect to be over 6 feet tall?

13. (a) (i) If the joint distribution function of X and Y is given by
        F(x, y) = (1 − e^(−x))(1 − e^(−y)) for x > 0, y > 0; 0 otherwise,
(1) Find the marginal densities of X and Y
(2) Are X and Y independent
(3) P(1 < X < 3, 1 < Y < 2).
(ii) Find the coefficient of correlation between industrial production
     and export using the following data:
     Production (X): 55  56  58  59  60  60  62
     Export (Y):     35  38  37  39  44  43  44
(Or)
(b) (i) The lifetime of a certain brand of an electric bulb may be consid-
ered a RV with mean 1200 hrs and S.D. 250 hrs. Find the prob-
ability, using central limit theorem, that the average life time of
60 bulbs exceeds 1250 hrs. (8)
(ii) The two lines of regression are
8x − 10y + 66 = 0
40x − 18y + 214 = 0
The variance of X is 9. Find


(1) The mean values of X and Y


(2) Correlation coefficient between X and Y.

14. (a) (i) Given a random variable Ω with density f(w) and another random
variable f uniformly distributed in (−p, p) and independent of Ω
and X(t) = a cos (Ωt + f), prove that [X(t) is a WSS process.
(ii) The transition probability matrix of a Markov chain {Xn}, n = 1,
2, 3,… having 3 states 1, 2 and 3 is
⎛ 0.1 0.5 0 4⎞
P = ⎜ 0.6 0.2 0 2⎟
⎜ ⎟
⎜⎝ 0.3 0.4 0 3⎟⎠

and the initial distribution is (0.7, 0.2, 0.1). Find:


(1) P{X₂ = 3}
(2) P{X₃ = 2, X₂ = 3, X₁ = 3, X₀ = 2}
(Or)

14. (b) (i) A man either drives a car or catches a train to go to office each
        day. He never goes 2 days in a row by train, but if he drives one
        day, then the next day he is just as likely to drive again as he is to
        travel by train. Suppose that on the first day of the week the
        man tossed a fair die and drove to work iff a 6 appeared. Find
        (1) the probability that he takes a train on the 3rd day, and
        (2) the probability that he drives to work in the long run.
(ii) Write a short note on recurrent state, transient state, ergodic
state.

15. (a) (i) A duplicating machine maintained for office use is operated by
        an office assistant who earns Rs. 5 per hour. The time to complete
        each job varies according to an exponential distribution with
        mean 6 min. Assume a Poisson input with an average arrival
        rate of 5 jobs per hour. If an 8-hour day is used as a base, determine
        (1) the percentage idle time of the machine,
        (2) the average time a job is in the system, and
        (3) the average earning per day of the assistant.
(ii) A super market has two girls attending to sales at the counters.
     If the service time for each customer is exponential with mean
     4 min and if people arrive in a Poisson fashion at the rate of 10
     per hour,
(1) what is the probability that a customer has to wait for ser-
vice?


(2) What is the expected percentage of idle time for each girl?
(or)

15. (b) (i) Customers arrive at a one-man barber shop according to a Pois-
        son process with a mean inter-arrival time of 12 min. Customers
        spend an average of 10 min in the barber's chair.
        (1) What is the expected number of customers in the barber shop
            and in the queue?
        (2) What is the probability that more than 3 customers are in
            the system?
(ii) Derive the Pollaczek-khinchine formula for M/G/1 queueing
model.



B.E./B.Tech. DEGREE EXAMINATION,
APRIL/MAY 2008
Fourth Semester
(Regulation 2004)
Computer Science and Engineering
MA 1252—PROBABILITY AND QUEUEING THEORY

(Common to Information Technology)


Use of statistical table is permitted
Time: Three hours Maximum: 100 marks
Answer ALL questions.
PART A – (10 × 2 = 20 marks)
1. If the probability that a communication system will have high fidelity is
   0.81 and the probability that it will have high fidelity and high selectivity is
   0.18, what is the probability that a system with high fidelity will also have
   high selectivity?

2. Given the probability density function
   f(x) = k / (1 + x²), −∞ < x < ∞,
   find k and the C.D.F. F(x).

3. If the probability is 0.10 that a certain kind of measuring device will show
excessive drift, what is the probability that the fifth measuring device
tested will be the first show excessive drift? Find its expected value also.

4. Let X be a uniform random variable over [−1, 1]. Find
   (a) P(|X| < 1/3)
   (b) P(|X| ≥ 3/4)

5. If X has mean 4 and variance 9, while Y has mean −2 and variance 5, and
   the two are independent, find
   (a) E(XY)
   (b) E(XY²)

6. Let X and Y be continuous R.Vs with joint p.d.f.
   f(x, y) = 2xy + (3/2)y², 0 < x < 1, 0 < y < 1; 0 otherwise.
   Find P(X + Y < 1).

7. Define (a) Markov chain (b) Wide-Sense stationary process.

8. Let {X(t); t ≥ 0} be a Poisson process with rate λ. Find E[X(t) X(t + τ)],
   where τ > 0.

9. In the usual notation of an M/M/1 queueing system, if l = 3/hour and m =


4/hour, find P(X ≥ 5) where X is the number of customers in the system.

10. Find P(X ≥ c + n) for an M/M/C queueing system.

PART B – (5 × 16 = 80 marks)
11. (a) (i) A box contain 5 red and 4 white balls. A ball from the box is
taken out at random and kept outside. If once again a ball is
drawn from the box, what is the probability that the drawn ball
is red?
(ii) If the cumulative distribution function of a R.V. X is given by
     F(x) = 1 − 4/x² for x > 2, and F(x) = 0 for x ≤ 2,
     find (1) P(X < 3) (2) P(4 < X < 5) (3) P(X ≥ 3).
(iii) A discrete R.V. X has moment generating function
      Mx(t) = (1/4 + (3/4)e^t)⁵.
      Find E(X), Var(X) and P(X = 2).
(b) (i) The p.d.f. of the samples of the amplitude of speech wave forms
        is found to decay exponentially at rate α, so the following p.d.f.
        is proposed:
        f(x) = C e^(−α|x|), −∞ < x < ∞.


Find the constant ‘C’, and then find the probability P (|X| < v)
and E(X).
⎛ x⎞
(ii) Let X be a R.V. with E(X) = 1 and E(X(X − 1) = 4. Find Var ⎜ ⎟
⎝ 2⎠
and Var (2 − 3x).
(iii) If X is a continuous R.V. with p.d.f.

⎧ x, 0 ≤ x <1

⎪3
f ( x) = ⎨ ( x )2 , 1≤ x < 2
⎪2
⎪⎩0, otherwise,

find the cumulative distribution function F(x) of X and use it to


⎛3 5⎞
find P X< ⎟ .
⎝2 2⎠
12. (a) (i) If a R.V. X has a geometric distribution, i.e., P(X = x) = pq^(x−1),
        x = 1, 2, 3, …, where q = 1 − p and 0 < p < 1, show that
        P(X > x + y / X > y) = P(X > x).
    (ii) Let X be a random variable with p.d.f.
         f(x) = (1/√(2π)) e^(−x²/2), −∞ < x < ∞.
         Find the p.d.f. of the R.V. Y = X².
(Or)
(b) (i) Let the p.d.f. for X be given by
        f(x) = (1/2) e^(−x/2) for x ≥ 0; 0 otherwise.
        Find (1) P(X > 1/2), (2) the moment generating function of X,
        (3) E(X), (4) Var(X).
(ii) If X is any continuous R.V having the p.d.f.
⎧2 x,, 0 < x < 1
f ( x) = ⎨
⎩0, otherwise.
and Y = e− x, find the p.d.f. of the R.V. Y.

13. (a) (i) Let X and Y have the joint p.d.f.


              X
          0     1     2
    Y  0  0.1   0.4   0.1
       1  0.2   0.2   0

    Find (1) P(X + Y > 1), (2) the probability mass function P(X = x)
    of the R.V. X, (3) P(Y = 1 / X = 1), (4) E(XY).
(ii) Suppose that orders at a restaurant are i.i.d. random variables
with mean m = Rs.8 and standard deviation s =Rs.2. Estimate
(1) the probability that first 100 customers spend a total of more
than Rs.840, i.e., (1) P(X1 + X2 + … + X100 > 840, (2) P(780 < X1
+ X2 + … + X100 < 820).
(Or)
(b) (i) Find P(X > 2/Y < 4) when the joint p.d.f. of X and Y is given
by
⎧e −(( x y)
, x ≥ 0, y ≥ 0
g ( x, y ) = ⎨
⎩0, otherwise.
Are X and Y independent R.Vs? Explain.
(ii) If the point p.d.f of the R.Vs X and Y is given by
⎧2, 0 < x < y < 1
f ( x, y ) = ⎨
⎩0, otherwise.
X
Find the p.d.f. of the R.V. U =
Y
14. (a) (i) Let X(t) be a Poisson process with arrival rate λ. Find
        E{[X(t) − X(s)]²} for t > s.
(ii) Let {Xn; n = 1, 2, 3, …} be a Markov chain on the space S =
{1,2,3} with one step transition probability matrix
⎡ 0 1 0 ⎤
⎢ ⎥
= ⎢1 / 2 0 1 / 2⎥
⎢ 1 0 0 ⎥
⎣ ⎦
(1) Sketch the transition diagram.
(2) Is the chain irreducible? Explain.
(3) Is the chain Ergodic? Explain.
(Or)


(b) (i) Consider a random process X(t) defined by X(t) = U cost + (V + 1)


sint, where U and V are independent random variables for which
E(U) = E(V) = 0; E(U2) = E(V2) = 1
(1) Find the auto-covariance function of X(t).
(2) Is X(t) wide-sense stationary? Explain your answer.
(ii) Discuss the pure birth process and hence obtain its probabilities,
mean and variance.

15. (a) (i) A concentrator receives messages from a group of terminals


and transmits them over a single transmission line. Suppose that
messages arrives according to a Poisson process at a rate of one
message every 4 milliseconds and suppose that message trans-
mission times are exponentially distributed with mean 3 ms.
Find the mean number of messages in the system and the mean
total delay in the system. What percentage increase in arrival
rate results in a doubling of the above mean total delay?
(ii) Discuss the M/M/1 queueing system with finite capacity and
obtain its steady-state probabilities and the mean number of cus-
tomers in the system.
(Or)
(b) (i) A petrol pump station has 2 pumps. The service times follow
the exponential distribution with mean of 4 minutes and cars
arrive for service is a Poisson process at the rate of 10 cars per
hour. Find the probability that a customer has to wait for service.
What is the probability that the pumps remain idle?
(ii) Automatic car wash facility operates with only one bay. Cars
arrive according to a Poisson process, with mean of 4 cars per
hour and may wait in the facility’s parking lot if the bay is busy.
If the services time for all cars is constant and equal to 10 min,
determine (1) mean number of customers in the system Ls. (2)
mean number of customers in the queue (3) mean waiting time
in the system (4) mean waiting time in the queue.



B.E./B.TECH.DEGREE EXAMINATION,
NOV/DEC 2007
Fourth Semester (Regulation 2004)
Computer Science and Engineering
MA 1252 – PROBABILITY AND QUEUEING THEORY
(Common to Information Technology)

Time: Three hours Maximum: 100 marks


Use of statistical table is permitted
Answer ALL questions
Part A – (10 × 2 = 20 marks)
1. State the law of total probability and state under which situation it could
be used.

2. Suppose that a bus arrives at a station every day between 10.00 a.m. and
10.30 a.m. at random. Let X be the arrival time; find the distributive func-
tion of X and sketch its graph.

3. Sharon and Ann play a series of backgammon games until one of them
wins the five games. Suppose that the games are independent and the
probability that Sharon win a game is 0.58.
(a) Find the probability that the series ends in 7 games.
(b) If the series ends in 7 games, what is the probability that Sharon
wins?

4. Define exponential random variable. Give an example.

5. For λ > 0, let F(x, y) = 1 − e^(−λ(x+y)) if x > 0, y > 0; 0 otherwise.
   Check whether F can be the joint probability distribution function of two
   random variables X and Y.


6. The life time a TV tube (in years) is an exponential random variable with
mean 10. What is the probability that the average lifetime of a random
sample of 36 TV tubes is atleast 10.5?

7. Suppose that people immigrate in to a territory at a Poisson rate l = 1


per day.
(a) What is the expected time until the 10th immigrant arrives?
(b) What is the probability that the elapsed time between the 10th and
the 11th arrival exceeds 2 days?

8. At an intersection, a working traffic light will be out of order the next day
with probability 0.07, and an out-of-order traffic light will be working
the next day with probability 0.88. Let Xn = 1 if on day ‘n’ the traffic light
will work; Xn = 0 if on day ‘n’ the traffic light will not work. Is { Xn; n = 0,
1, 2,…} a Markov chain? If so, write the transition probability martrix.

9. Suppose that customers arrive at a Poisson rate of one per every 12 min-
utes, and that the service time is exponential at a rate of one service per
8 mins.
(a) What is the ‘average no. of customers in the system?
(b) What is the average time of a customer spends in the system?

10. What do you mean by transient state and steady state queueing system?

PART B – (5 × 16 = 80 marks)
11. (a) (i) A box contains 7 red and 13 blue balls. Two balls are selected at
random and are discarded without their colours being seen. If a
third ball is drawn randomly and observed to be red, what is the
probability that both of the discarded balls were blue? (8)
(ii) The sales of a convenience store on a randomly selected day are
     X thousand dollars, where X is a random variable with a distri-
     bution function of the following form:
     F(x) = 0 for x < 0;  x²/2 for 0 ≤ x < 1;  k(4x − x² − 2) for 1 ≤ x < 2;
            1 for x ≥ 2.
     Suppose that this convenience store's total sales on any given
     day are less than $2000.
     (1) Find the value of k.


(2) Let A and B be the events that tomorrow the store’s total sales
are between 500 and 1500 dollars, and over 1000 dollars,
respectively. Find P(A)and P(B).
(3) Are A and B independent events? (8)
Or
(b) (i) A box contains tags marked 1, 2, …, n.
        (1) Two tags are chosen at random without replacement. Find the
        probability that the numbers on the tags will be consecutive integers.
        (2) Two tags are chosen at random with replacement. Find the
        probability that the numbers on the tags will be consecutive integers. (8)
(ii) Experience has shown that while walking in a certain park, the
time X (in mins.), between seeing two people smoking has a den-
sity function of the form
⎧λ xe − x ; x > 0
f ( x) = ⎨
⎩0 otherwise

(1) Calculate the value of l.


(2) Find the distribution function of X.
(3) What is the probability that Jeff, who has just seen a person
smoking, will see another person smoking in 2 to 5 minutes?
In at least 7 minutes? (8)

12. (a) (i) The atoms of a radioactive element are randomly disintegrating. If
        every gram of this element, on average, emits 3.9 alpha particles
        per second, what is the probability that during the next second
        the number of alpha particles emitted from 1 gram is (1) at most 6,
        (2) at least 2, (3) at least 3 and at most 6? (8)
(ii) Starting at 5.00 a.m. every half hour there is a flight from San
Francisco airport to Los Angeles International Airport. Suppose
that none of these planes is completely sold out and that they
always have room for passengers. A person who wants to fly to
L.A. arrives at the airport at a random time between 8.45 a.m.
and 9.45 a.m. Find the probability that she waits: 1. at most 10
mins, 2. at least 15 mins. 8)
(b) (i) X is normally distributed with mean 12 and S.D. 4.
        (1) Find P(X ≥ 20).
        (2) Find P(0 ≤ X ≤ 12).
        (3) Find x′ such that P(X > x′) = 0.24.
        (4) Find x₀′ and x₁′ such that P(x₀′ < X < x₁′) = 0.50 and
            P(X > x₁′) = 0.25. (8)
(ii) A man with ‘n’ keys wants to open his door and tries the keys
independently and at random. Find the mean and variance of the
number of trials required to open the door if unsuccessful keys
are not eliminated from further selection. (8)

13. (a) (i) The joint pdf of a two-dimensional RV (X, Y) is given by
        f(x, y) = xy² + x²/8, 0 ≤ x ≤ 2, 0 ≤ y ≤ 1. Compute:
        (1) P(X > 1 / Y < 1/2)
        (2) P(Y < 1/2 / X > 1)
        (3) P(X < Y)
        (4) P(X + Y ≤ 1) (8)
(ii) Find the coefficient of correlation for the following heights (in
inches) of fathers (X) and their sons (Y):
X: 65 66 67 68 69 70 71 72
Y: 67 68 65 68 72 72 69 71

Or
(b) (i) The joint probability mass function of (X, Y) is given by
        p(x, y) = k(2x + 3y), x = 0, 1, 2; y = 1, 2, 3. Find all the marginal
        probability distributions. Also find the probability distribution of
        (X + Y). (10)
    (ii) Can Y = 5 + 2.8X and X = 3 − 0.5Y be the estimated regres-
         sion equations of Y on X and X on Y respectively? Explain your
         answer with suitable theoretical arguments. (6)

14. (a) (i) Show that the random process X(t) = Acos(w0 t + q) is wide-
sense stationary, if A and w0 are constants and θ is a uniformly
distributed random variable in (0, 2p). (8)
(ii) An, engineer analyzing a series of digital signals generated by
a testing system observes that only 1 out of 15 highly distorted
signals follows a highly distorted signal, with no recognizable
signal between, whereas 20 out of 23 recognizable signals follow
recognizable signals with no highly distorted signal between.
Given that only highly distorted signals are not recognizable,
find the fraction of signals that are highly distorted. (8)


Or
(b) (i) (1) Define stationary transition probabilities. (3)
(2) Derive the Chapman-Kolmogorov equations for discrete-
time Markov chain. (5)
(ii) On a given day, a retired English professor, Dr. Charles Fish,
amuses himself with only one of the following activities: read-
ing (activity 1), gardening (activity 2), or working on his book
about a river valley (activity 3). For 1 ≤ i ≤ 3, let Xn = i if Dr. Fish
devotes day ‘n’ to activity i. Suppose that {Xn; n = 1, 2,…} is a
Markov chain, and depending on which of these activities on the
next day is given by the t.p.m.
    ⎛0.30  0.25  0.45⎞
P = ⎜0.40  0.10  0.50⎟
    ⎝0.25  0.40  0.35⎠

Find the proportion of days Dr. Fish devotes to each activity. (8)

15. (a) (i) For the steady-state M/M/1 queueing model, prove that
        Pn = (λ/μ)ⁿ P₀. (8)
    (ii) On every Sunday morning, a dental hospital renders free dental
         service to the patients. As per the hospital rules, 3 dentists who
         are equally qualified and experienced will be on duty. It
         takes on an average 10 min for a patient to get treatment, and the
         actual time taken is known to vary approximately exponentially
         around this average. The patients arrive according to the Poisson
         distribution with an average of 12 per hour. The hospital manage-
         ment wants to investigate the following:
         (1) the expected number of patients waiting in the queue,
         (2) the average time that a patient spends at the hospital. (8)

Or

(b) (i) Self-Service system is followed in a super market at a metropolis.


The customer arrivals occur according to a Poisson distribution
with mean 40 per hour. Service time per customer is exponen-
tially distributed with mean 6 mins.
(1) Find the expected number of customers in the system.
(2) What is the percentage of time that the facility is idle? (8)
(ii) Derive the Pollaczek-Khinchine formula for M/G/1 queueing
model. (8)



B.E./B.Tech. DEGREE EXAMINATION,
MAY/JUNE 2007
Fourth Semester
(Regulation 2004)
Computer Science and Engineering
MA 1252 – PROBABILITY AND QUEUEING THEORY

(Common to Information Technology)

Time: Three hours Maximum: 100 marks


Answer ALL questions.
PART A – (10 × 2 = 20 marks)
1. A lot of integrated circuit chips consists of 10 good, 4 with minor defects
and 2 with major defects. Two chips are randomly chosen from the lot.
What is the probability that at least one chip is good?

2. A continuous random variable X has a probability density function
   f(x) = k(1 + x), 2 ≤ x ≤ 5. Find P(X < 4).

3. One percent of jobs arriving at a computer system need to wait until week-
ends for scheduling, owing to core-size limitations. Find the probability
that among a sample of 200 jobs there are no jobs that have to wait until
weekends.

4. A fast food chain finds that the average time customers have to wait for
service is 45 seconds. If the waiting time can be treated as an exponential
random variable, what is the probability that a customer will have to wait
more than 5 minutes, given that he has already waited for 2 minutes?
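Hint: the exponential waiting time is memoryless, P(X > s + t | X > s) = P(X > t), so the required probability reduces to P(X > 3 minutes) for an exponential variable with mean 45 seconds.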

5. The joint pdf of 2 random variables X and Y is
       f(x, y) = cx(x − y), 0 < x < 2, −x < y < x
               = 0, elsewhere.
   Evaluate c.




6. Let (X, Y) be a two dimensional random variable. Define covariance of


(X, Y). If X and Y are independent, what will be the covariance of (X,
Y)?

7. Define a Markov chain and give an example.


8. If the transition probability matrix of a Markov chain is
       ⎡ 0     1 ⎤
       ⎣ 1/2  1/2⎦ ,
   find the limiting distribution of the chain.

9. In the usual notation of an M/M/1 queueing system, if λ = 12 per hour and
   μ = 24 per hour, find the average number of customers in the system.
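Hint: for the steady-state M/M/1 model, Ls = λ/(μ − λ); with λ = 12 and μ = 24 this gives Ls = 1.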

10. Write Pollaczek-Khintchine formula and explain the notations.

PART B − (5 × 16 = 80 marks)
11. (a) (i) A binary communication channel carries data as one of 2 types
of signals denoted by 0 and 1. Due to noise, a transmitted 0 is
sometimes received as a 1 and a transmitted 1 is sometimes
received as a 0. For a given channel assume a probability of 0.94
that a transmitted 0 is correctly received as a 0 and a probability
of 0.91 that a transmitted 1 is received as a 1. Further assume a
probability of 0.45 of transmitting a 0. If a signal is sent, deter-
mine the probability that
(1) a 1 is received
(2) a 1 was transmitted given that a 1 was received
(3) a 0 was transmitted, given that a 0 was received
(4) an error occurs. (8)
(ii) In a continuous distribution, the probability density is given by
f (x) = kx (2 − x), 0 < x < 2. Find k, mean, variance and the dis-
tribution function. (8)
Or
(b) (i) The cumulative distribution function (cdf) of a random variable
        X is given by
            F(x) = 0,                     x < 0
                 = x²,                    0 ≤ x < 1/2
                 = 1 − (3/25)(3 − x)²,    1/2 ≤ x < 3
                 = 1,                     x ≥ 3




Find the pdf of X and evaluate P (| X | ≤ 1) using both the pdf and
cdf. (8)
(ii) Find the moment generating function of the geometric random
variable with the pdf f(x) = p q^(x − 1), x = 1, 2, 3, … and hence
obtain its mean and variance. (8)
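Hint: for this pdf the standard results are MX(t) = pe^t/(1 − qe^t) for qe^t < 1, with mean 1/p and variance q/p².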

12. (a) (i) The number of monthly breakdowns of a computer is a random


variable having a Poisson distribution with mean equal to 1.8.
Find the probability that this computer will function for a month
(1) without a breakdown
(2) with only one breakdown and
(3) with at least one breakdown. (8)
(ii) The life time X in hours of a component is modeled by a Weibull
     distribution with β = 2. Starting with a large number of compo-
     nents, it is observed that 15% of the components that have lasted
     90 hrs. fail before 100 hrs. Find the parameter α. (8)
Or
(b) (i) The marks obtained by a number of students in a certain sub-
ject are approximately normally distributed with mean 65 and
standard deviation 5. If 3 students are selected at random from
this group, what is the probability that at least one of them would
have scored above 75? [Given the area between z = 0 and z = 2
under the standard normal curve is 0.4772]. (8)
(ii) Write the pdf of Gamma distribution. Find the MGF, mean and
variance. (8)

13. (a) (i) The joint density function of the random variable (X, Y) is given
            by
                f(x, y) = 8xy, 0 < x < 1, 0 < y < x
                        = 0,   elsewhere.
            Find the (1) marginal density of Y, (2) conditional density of
            X/Y = y and (3) P(X < 1/2).
(ii) Calculate the correlation coefficient for the following data:
X : 65 66 67 67 68 69 70 72
Y : 67 68 65 68 72 72 69 71 (8)

Or




(b) (i) If X and Y are independent exponential random variables each


with parameter 1, find the pdf of U = X − Y. (8)
(ii) State the central limit theorem in Lindeberg-Levy's form. A random
     sample of size 100 is taken from a population whose mean is
     60 and variance is 400. Using the central limit theorem, with what
     probability can we assert that the mean of the sample will not
     differ from μ = 60 by more than 4? Area under the standard
     normal curve from 0 to 2 is 0.4772. (8)
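Hint: by the central limit theorem the sample mean is approximately N(60, σ²/n) with σ²/n = 400/100 = 4, so the required probability is P(|X̄ − 60| ≤ 4) = P(|Z| ≤ 2) ≈ 2 × 0.4772 = 0.9544.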

14. (a) (i) At the receiver of an AM radio, the received signal contains a
            cosine carrier signal at the carrier frequency ω0 with a random
            phase θ that is uniformly distributed over (0, 2π). The received
            carrier signal is X(t) = A cos(ω0t + θ). Show that the process is
            second order stationary. (8)
(ii) Queries presented in a computer data base follow a Pois-
     son process of rate λ = 6 queries per minute. An experiment
     consists of monitoring the data base for m minutes and recording
     N(m), the number of queries presented.
     (1) What is the probability that there are no queries in a one-
         minute interval?
     (2) What is the probability that exactly 6 queries arrive in a one-
         minute interval?
     (3) What is the probability of fewer than 3 queries arriving in a
         half-minute interval? (8)
Or
(b) (i) Assume that a computer system is in any one of the three states:
busy, idle and under repair respectively denoted by 0, 1, 2.
Observing its state at 2 pm each day, we get the transition prob-
ability matrix as
    ⎡0.6  0.2  0.2⎤
P = ⎢0.1  0.8  0.1⎥
    ⎣0.6  0    0.4⎦
Find out the 3rd step transition probability matrix. Determine the
limiting probabilities. (8)
(ii) Obtain the steady state or long run probabilities for the popula-
tion size of a birth death process. (8)
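Hint: the long-run probabilities of a birth-death process take the product form pn = p0 (λ0 λ1 … λn−1)/(μ1 μ2 … μn), with p0 fixed by the normalisation Σ pn = 1.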

15 (a) (i) Arrivals at a telephone booth are considered to be Poisson with


an average time of 12 min. between one arrival and the next.




The length of phone call is distributed exponentially with mean


4 minutes
(1) What is the average number of customers in the system?
(2) What fraction of the day the phone will be in use?
(3) What is the probability that an arriving customer has to
wait? (8)
(ii) There are three typists in an office. Each typist can type an aver-
age of 6 letters per hour. If letters arrive to be typed at the
rate of 15 letters per hour
(1) What is the probability that no letters are there in the
system?
(2) What is the probability that all the typists are busy? (8)
Or
(b) (i) Explain an M/M/1, finite capacity queueing model and obtain
expressions for the steady state probabilities for the system size. (8)
(ii) Patients arrive at a clinic according to Poisson distribution at a
rate of 30 patients per hour. The waiting room does not accom-
modate more than 14 patients. Examination time per patient is
exponential with mean rate of 20 per hour.
(1) What is the probability that an arriving patient will not
wait?
(2) What is the effective arrival rate? (8)
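Hint: for the finite-capacity (M/M/1) : (N/FIFO) model, an arriving patient need not wait exactly when the system is empty, i.e. with probability P0, and the effective arrival rate is λeff = λ(1 − PN), since arrivals that find the system full are turned away.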



B.E./B.Tech. DEGREE EXAMINATION,
NOV/DEC 2006
Fourth Semester
Computer Science and Engineering
MA1252–PROBABILITY AND QUEUING THEORY
(Common to Information Technology)
Answer All Questions
PART A – (10 × 2 = 20 marks)

1. The odds in favour of A solving mathematical problem are 3 to 4 and the


odds against B solving the problems are 5 to 7. Find the probability that
the problem will be solved by at least one of them.

2. A die is loaded in such a way that each odd number is twice as likely to
occur as even number. Find P(G), where G is the event that a number
greater than 3 occurs on a single roll of the die.

3. Define a continuous random variable. Give an example.

4. Find the value of (a) C and (b) the mean of the following distribution:
       f(x) = C(x − x²), for 0 < x < 1
            = 0, elsewhere

5. If the probability is 0.40 that a child exposed to a certain contagious disease will


catch it, what is the probability that the tenth child exposed to the disease
will be the third to catch it?

6. If X is uniformly distributed over (0, 10), calculate the probability that
   (a) X > 6, (b) 3 < X < 8.

7. Find the moment generating function for the distribution where
       f(x) = 2/3, at x = 1
            = 1/3, at x = 2
            = 0, otherwise.

8. State central limit theorem.

9. Define random process and its classification.

10. What are the basic characteristics of Queuing process?




PART B – (5 × 16 = 80 marks)

11. (a) (i) If the probability density of X is given by
                f(x) = 2(1 − x), for 0 < x < 1
                     = 0, otherwise,
            (1) show that E[X^r] = 2/((r + 1)(r + 2)), and (2) use this
            result to evaluate E[(2X + 1)²].
(ii) Given a binary communication channel, where A is the input
     and E is the output, let P(A) = 0.4, P(E/A) = 0.9 and P(Ē/Ā) = 0.6.
     Find (1) P(A/E) (2) P(Ā/Ē).
(b) (i) A random variable X has density function given by
            f(x) = 1/k, for 0 < x < k
                 = 0, elsewhere.
        Find (1) the m.g.f. (2) the rth moment (3) the mean (4) the
        variance.
(ii) Given that a student studied, the probability of passing a certain
quiz is 0.99. Given that a student did not study, the probability
of passing the quiz is 0.05. Assume that the probability of study-
ing is 0.7. A student flunks the quiz. What is the probability that
he or she did not study?

12. (a) (i) Let the random variable X follow a binomial distribution with
parameters n and p. Find, (1).Probability mass function of X. (2).
Moment generating function. (3).Mean and variance of X.
(ii) The number of personal computer (PC) sold daily at a computer
World is uniformly distributed with a minimum of 2000 PC and
a maximum of 5000 PC. Find (1).The probability that daily sales
will fall between 2500 and 3000PC. (2).What is the probability
that the computer World will sell at least 4000 PC’s? (3).What
is the probability that the computer World will sell exactly 2500
PC’s?
(b) (i) Define the probability density function of normal distribution and
standard normal distribution. Write down the important proper-
ties of its distribution.
(ii) An electric firm manufactures light bulbs that have a life, before
burnout, that is normally distributed with mean equal to 800
hours and standard deviation of 40 hours. Find (1).The prob-
ability that a bulb burns more than 834 hours (2).The probability
that a bulb burns between 778 and 834 hours

13. (a) (i) In producing gallium-arsenide microchips, it is known that the


ratio between gallium and arsenide is independent of producing
a high percentage of workable wafers, which are the main compo-
nents of microchips. Let X denote the ratio of gallium to arsenide
and Y denote the percentage of workable micro wafers retrieved
during a 1 hour period. X and Y are independent random vari-




ables with the joint density
             f(x, y) = x(1 + 3y²)/4, 0 < x < 2, 0 < y < 1
                     = 0, otherwise.
         Show that E(XY) = E(X)E(Y).
(ii) If the joint density of X1 and X2 is given by
          f(x1, x2) = 6e^(−3x1 − 2x2), for x1 > 0, x2 > 0
                    = 0, otherwise,
      find the probability density of Y = X1 + X2.
(b) (i) Two random variables X and Y have joint density function
        fXY(x, y) = x² + xy/3; 0 ≤ x ≤ 1, 0 ≤ y ≤ 2. Find the conditional density
functions. Check whether the conditional density functions are
valid.
(ii) If the joint probability density of X1 and X2 is given by
          f(x1, x2) = e^−(x1 + x2), for x1 > 0, x2 > 0
                    = 0, otherwise,
      find the probability density of Y = X1/(X1 + X2).

14. (a) (i) Find the correlation coefficient and obtain the lines of regression
            from the following data:
            x : 50 55 50 60 65 65 65 60 60 50
            y : 11 14 13 16 16 15 15 14 13 13
(ii) Let z be a random variable with probability density f(z) = 1/2
     in the range −1 ≤ z ≤ 1. Let the random variable X = z and the
     random variable Y = z². Obviously X and Y are not indepen-
     dent, since Y = X². Show, nonetheless, that X and Y are uncor-
     related.
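Hint: Cov(X, Y) = E[z³] − E[z]E[z²] = 0, because every odd moment of z vanishes by the symmetry of its density about 0; hence X and Y are uncorrelated.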
(b) (i) Two random variables X and Y are defined as Y = 4X + 9. Find the
correlation coefficient between X and Y.
(ii) A stochastic process is described by X(t) = A sin t + B cos t where
A and B are independent random variables with zero means and
equal standard deviation. Show that the process is stationary of
the second order.

15. (a) (i) A raining process is considered as a two-state Markov chain. If it
            rains, it is considered to be in state 0 and if it does not rain, the
            chain is in state 1. The transition probability matrix of the Markov
            chain is defined as
                    ⎛0.6  0.4⎞
                P = ⎝0.2  0.8⎠
            Find the probability that it will rain for 3 days from today,
            assuming that it is raining today. Also find the unconditional
            probability that it will rain after three days, taking the initial
            probabilities of state 0 and state 1 as 0.4 and 0.6 respectively.
(ii) A person owning a scooter has the option to switch over to a
      scooter, bike or a car next time with probabilities (0.3, 0.5, 0.2).
      If the transition probability matrix is
              ⎛0.4   0.3   0.3⎞
              ⎜0.2   0.5   0.3⎟
              ⎝0.25  0.25  0.5⎠ ,
      what are the probabilities of the vehicles related to his fourth pur-
      chase?
(b) (i) Define Kendall’s notation. What assumptions are made
        for the simplest queueing model?




(ii) Arrivals of telephone calls at a telephone booth are according to a Pois-


son distribution with an average time of 12 min between two
consecutive call arrivals. The length of a telephone call is assumed
to be exponentially distributed with mean 4 minutes. 1).Determine
the probability that a person arriving at the booth will have to wait.
2).Find the average queue length that is formed from time to
time. 3).The telephone company will install a second booth when
convinced that an arrival would expect to have to wait at least 5
min for the phone. Find the increase in the flow of arrivals which
will justify a second booth. 4).What is the probability that an
arrival will have to wait for more than 15 min before the phone
is free?



B.E./B.Tech. DEGREE EXAMINATION,
MAY/JUNE 2006
Fourth Semester
Computer Science and Engineering
MA-1252-PROBABILITY AND QUEUEING THEORY
(Common to Information Technology)
Time: Three hours Maximum: 100 marks
Answer ALL questions
Part A – (10 × 2 = 20 marks)
1. A coin is tossed an infinite number of times. If the probability of a head
in a single toss is p, find the probability that kth head is obtained at the nth
tossing but not earlier, with q = 1 − p.
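Hint: this is the negative binomial (Pascal) situation: the first n − 1 tosses must contain exactly k − 1 heads and the nth toss must be a head, giving probability C(n − 1, k − 1) p^k q^(n−k).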

2. A continuous random variable X that can assume any value between x =
   2 and x = 5 has a density function given by f(x) = k(1 + x). Find P[X < 4].

3. If X is uniformly distributed in (−π/2, π/2), find the pdf of Y = tan X.

4. Find the moment generating function of a geometric distribution.

5. Show that correlation coefficient is independent of change of origin and


scale.

6. The two regression lines are 4x − 5y + 33 = 0 and 20x − 9y = 107, and the
   variance of x = 25. Find the means of x and y. Also find the value of r.

7. Examine whether the Poisson process {X(t)} given by the law
       P[X(t) = r] = e^(−λt) (λt)^r / r!,  r = 0, 1, 2, …
   is covariance stationary.
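Hint: E[X(t)] = λt depends on t, so the mean is not constant and the process cannot be covariance (wide-sense) stationary.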

8. Define a Markov process and a Markov chain.

9. What are the basic characteristics of a queueing system?

10. Derive the average number of customers in the system for (M/M/1): (∞/
FIFO) model.




PART B – (5 ×16 = 80 marks)


11. (i) The process {X(t)} whose probability distribution is given by
             P[X(t) = n] = (at)^(n − 1) / (1 + at)^(n + 1),  n = 1, 2, …
                         = at / (1 + at),                    n = 0.
        Show that it is not stationary.
(ii) A raining process is considered as a two-state Markov chain. If it
rains, it is considered to be in state 0 and if it does not rain, the
chain is in state 1. The transition probability of the Markov chain is
defined as
        ⎛0.6  0.4⎞
    P = ⎝0.2  0.8⎠ .
Find the probability that it will rain for
three days from today assuming that it is raining today. Find also the
unconditional probability that it will rain after three days with the
initial probabilities of state 0 and state 1 as 0.4 and 0.6 respectively.

12. (a) (i) Out of (2n + 1) tickets consecutively numbered three are drawn
at random. Find the probability that the numbers on them are in
arithmetic progression.
(ii) If A and B are independent events, then show that Ā and B are
     also independent events. Also show that Ā and B̄ are also inde-
     pendent events.

12. (b) (i) The contents of urns I, II and III are as follows:
1 white, 2 black and 3 red balls
2 white, 1 black and 1 red ball and
4 white, 5 black and 3 red balls.
One urn is chosen at random and two balls drawn. They happen
to be white and red. What is the probability that they come from
urns I, II or III?
(ii) Let the random variable X assume the value ‘r’ with the prob-
ability law: P(X = r) = q^(r − 1) p, r = 1, 2, 3, …. Find the moment
generating function and hence its mean and variance.

13. (a) (i) If ‘m’ things are distributed among ‘a’ men and ‘b’ women,
find the probability that the number of things received by men
is odd.
(ii) If X and Y are independent Poisson variates, find the conditional
distribution of X given X + Y = N.




13. (b) (i) If X1 and X2 are independent uniform variates on [0, 1],
            find the distributions of X1/X2 and X1X2.
(ii) Find the moment generating function of a normal distribution.

14. (a) (i) Two random variables X and Y have the following joint prob-
            ability density function
                f(x, y) = 2 − x − y,  0 ≤ x ≤ 1, 0 ≤ y ≤ 1
                        = 0,          otherwise
Find
(1) Marginal probability density functions of X and Y
(2) Conditional density functions
(3) Var(X) and Var(Y)
(ii) Let (X, Y) be a two-dimensional non-negative continuous random
     variable having the joint density
         f(x, y) = 4xy e^−(x² + y²),  x ≥ 0, y ≥ 0
                 = 0,                 elsewhere.
     Find the density function of U = √(X² + Y²).

14. (b) (i) Find the coefficient of correlation and obtain the lines of regression
from the data given below:
X : 62 64 65 69 70 71 72 74
Y : 126 125 139 145 165 152 180 208
(ii) Let the random variable X have the marginal density
         f1(x) = 1,  −1/2 < x < 1/2,
     and let the conditional density of Y be
         f(y/x) = 1,  x < y < x + 1,    −1/2 < x < 0
                = 1,  −x < y < 1 − x,   0 < x < 1/2.
     Show that the variables are uncorrelated.

15. (a) Customers arrive at a one-man barbershop according to a Poisson


process with a mean inter arrival time of 20 minutes. Customers
spend an average of 15 minutes in the barber chair. If an hour is used
as unit of time, then




(i) What is the probability that a customer need not wait for a hair
cut?
(ii) What is the expected number of customers in the barbershop
and in the queue?
(iii) How much time can a customer expect to spend in the barber-
shop?
(iv) Find the average time that the customer spend in the queue.
(v) What is the probability that there will be 6 or more customers
waiting for service?

15 (b) Derive the formula for the average number of customers in the queue
       and the probability that an arrival has to wait, for the (M/M/C) model with
       infinite capacity. Also derive, for the same model, the average waiting
       time of a customer in the queue as well as in the system.

