Probability and
Queueing Theory
PART B – (5 × 16 = 80 Marks)
11. (a) (i) A random variable X has the following probability function:
X :    0    1    2    3    4    5    6     7
P(x) : 0    k    2k   2k   3k   k²   2k²   7k² + k
(1) Find k, (2) evaluate P(X < 6) and P(X ≥ 6), and (3) find the smallest value of C for which P(X ≤ C) > 1/2.
12. (a) (i) Let X and Y be two random variables having the joint probability
function f(x, y) = k(x + 2y) where x and y can assume only the
integer values 0, 1 and 2. Find the marginal and conditional dis-
tributions.
(ii) Two random variables X and Y have the joint probability density
function f(x, y) = c(4 − x − y), 0 ≤ x ≤ 2, 0 ≤ y ≤ 2, and
f(x, y) = 0 elsewhere. Find cov(X, Y).
Or
(b) (i) Two dimensional random variables (X, Y) have the joint probability
density function
f(x, y) = 8xy, 0 < x < y < 1
        = 0, elsewhere
(1) Find P(X < 1/2 ∩ Y < 1/4).
(2) Find the marginal and conditional distributions.
(3) Are X and Y independent?
(ii) Suppose that in a certain circuit, 20 resistors are connected in series.
The mean and variance of each resistor are 5 and 0.20 respectively.
Using Central limit theorem, find the probability that the total resis-
tance of the circuit will exceed 98 ohms assuming independence.
13. (a) (i) The process {X (t)} whose probability distribution under certain
condition is given by
P[X(t) = n] = (at)^(n−1) / (1 + at)^(n+1),  n = 1, 2, 3, …
            = at / (1 + at),                n = 0
Show that {X (t)} is not stationary.
(ii) A salesman territory consists of three cities A, B and C. He
never sells in the same city on successive days. If he sells in
city-A, then the next day he sells in city-B. However if he sells
in either city-B or city-C, the next day he is twice as likely to
sell in city-A as in the other city. In the long run how often does
he sell in each of the cities?
Or
(b) (i) The transition probability matrix of a Markov chain {Xn}, n =
1, 2, 3, …, having three states 1, 2 and 3 is
P = ⎡0.1  0.5  0.4⎤
    ⎢0.6  0.2  0.2⎥
    ⎣0.3  0.4  0.3⎦
and the initial distribution is P(0) = (0.7  0.2  0.1). Find
(1) p[X2 = 3]
(2) p[X3 = 2, X2 = 3, X1 = 3, X0 = 2].
(ii) Suppose that customers arrive at a bank according to a Poisson
process with a mean rate of 3 per minute. Find the probability that
during a time interval of two minutes (1) exactly 4 customers arrive,
(2) more than 4 customers arrive and (3) fewer than 4 customers arrive.
14. (a) (i) A T.V. repairman finds that the time spent on his job has an
exponential distribution with mean 30 minutes. If he repairs sets
in the order in which they come in, and if the arrival of sets is
approximately Poisson with an average rate of 10 per 8-hour
day, find
(1) the repairman's expected idle time each day
(2) how many jobs are ahead of the average set just brought in.
(ii) A supermarket has 2 girls running up sales at the counters. If
the service time for each customer is exponential with mean
4 minutes and if people arrive in Poisson fashion at the rate of
10 per hour, find the following:
(1) What is the probability of having to wait for service?
(2) What is the expected percentage of idle time for each girl?
(3) What is the expected length of customer′s waiting time?
Or
(b) (i) Trains arrive at the yard every 15 minutes and the service time
is 33 minutes. If the line capacity of the yard is limited to 5
trains, find the probability that the yard is empty and the average
number of trains in the system, given that the inter arrival time
and service time are following exponential distribution.
(ii) There are three typists in an office. Each typist can type an average
of 6 letters per hour. If letters arrive for being typed at the rate of 15
letters per hour, what fraction of times all the typists will be busy?
What is the average number of letters waiting to be typed?
15. (a) (i) An automatic car wash facility operates with only one bay. Cars
arrive according to a Poisson distribution with a mean of 4 cars
per hour and may wait in the facility's parking lot if the bay
is busy. The parking lot is large enough to accommodate any
number of cars. If the service time for all cars is constant and
equal to 10 minutes, determine
(1) mean number of customers in the system Ls
(2) mean number of customers in the queue Lq
(3) mean waiting time of a customer in the system Ws
(4) mean waiting time of a customer in the queue Wq
(ii) An average of 120 students arrive each hour (inter-arrival times
are exponential) at the controller office to get their hall tickets.
1. Given f(x) = λe^(−λx), x ≥ 0, λ > 0, and f(x) = 0 elsewhere. To show ∫ f(x) dx = 1:
∫_0^∞ λ e^(−λx) dx = λ [e^(−λx)/(−λ)]_0^∞ = −(e^(−∞) − e^0) = 1
∴ f(x) is a p.d.f.
2. Given M_X(t) = 2/(2 − t).
M_X(t) = 2/(2 − t) = (1 − t/2)^(−1) = 1 + t/2 + (t/2)² + (t/2)³ + …
E[X] = coefficient of t/1! = 1/2
E[X²] = coefficient of t²/2! = 1/2
∴ var(X) = E[X²] − [E(X)]² = 1/2 − (1/2)² = 1/4
3. Since both regression lines pass through the point (x̄, ȳ),
5x̄ − ȳ = 22   (1)
64x̄ − 45ȳ = 24   (2)
Solving (1) and (2), x̄ = 6 and ȳ = 8.
By the central limit theorem, the sum of independent random variables X₁, X₂, …, Xₙ with means μᵢ and variances σᵢ² tends to a normal distribution with mean ∑_{i=1}^{n} μᵢ and variance ∑_{i=1}^{n} σᵢ² as n tends to infinity.
7. Little's formulas:
Ls = λ′Ws
Lq = Ls − λ′/μ, where λ′ = μ(1 − P0) = effective arrival rate
Ws = (1/λ′) Ls
Wq = (1/λ′) Lq
8. A system is said to be in transient state when its operating characteristics
(like input, output, mean queue length, etc) are dependent upon time.
When the service time is exponential, E(T) = 1/μ and V(T) = 1/μ². Substituting in the Pollaczek–Khinchine formula
Ls = λE(T) + λ²{V(T) + E²(T)} / 2{1 − λE(T)},
Ls = λ/μ + λ²(1/μ² + 1/μ²) / [2(1 − λ/μ)]
   = λ/μ + (λ²/μ²) / [(μ − λ)/μ]
   = λ/μ + λ² / (μ(μ − λ))
   = (μλ − λ² + λ²) / (μ(μ − λ))
   = λ/(μ − λ),
which is Ls of the M/M/1 model.
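The short Python sketch below (not part of the original answer) checks this reduction numerically; the rates λ = 2 and μ = 5 are arbitrary illustrative values chosen with λ < μ.

    # Sketch: Pollaczek-Khinchine Ls with exponential service equals the M/M/1 value.
    def pk_ls(lam, e_t, var_t):
        """Pollaczek-Khinchine mean number in system for an M/G/1 queue."""
        rho = lam * e_t
        return rho + lam**2 * (var_t + e_t**2) / (2 * (1 - rho))

    lam, mu = 2.0, 5.0
    ls_pk = pk_ls(lam, e_t=1/mu, var_t=1/mu**2)   # exponential service: E(T)=1/mu, V(T)=1/mu^2
    ls_mm1 = lam / (mu - lam)                     # textbook M/M/1 result
    print(ls_pk, ls_mm1)                          # both print 0.6666...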
PART B
11. (a) (i)
(1) ∑P(x) = 1
0 + k + 2k + 2k + 3k + k2 + 2k2 + 7k2 + k = 1
10k² + 9k = 1
10k² + 9k − 1 = 0
(10k − 1)(k + 1) = 0
∴ k = 1/10 or −1
The value k = −1 makes some values of P(n) negative.
∴ k = 1/10
The distribution is
X: 0 1 2 3 4 5 6 7
P(X): 0 1/10 2/10 2/10 3/10 1/100 2/100 17/100
(2) P[X < 6] = 1 − P[X ≥ 6]
= 1 − [P(X = 6) + P(X = 7)]
= 1 − [2/100 + 17/100]
= 1 − 19/100
∴ P[X < 6] = 81/100
P[X ≥ 6] = 1 − P[X < 6] = 1 − 81/100 = 19/100
(3) By trials, P[X ≤ 0] = 0; P[X ≤ 1] = 1/10; P[X ≤ 2] = 3/10;
P[X ≤ 3] = 5/10; P[X ≤ 4] = 8/10.
∴ The smallest value of C satisfying the condition P[X ≤ C] > 1/2 is 4.
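A quick computational check of these answers (a sketch, not part of the original solution) using exact fractions:

    # Sketch: verify k = 1/10 and the answers of 11(a)(i).
    from fractions import Fraction as F

    k = F(1, 10)
    p = [F(0), k, 2*k, 2*k, 3*k, k**2, 2*k**2, 7*k**2 + k]   # P(X = 0..7)

    assert sum(p) == 1                    # probabilities sum to one
    print(1 - (p[6] + p[7]))              # P(X < 6)  -> 81/100
    print(p[6] + p[7])                    # P(X >= 6) -> 19/100

    cdf = F(0)
    for c, prob in enumerate(p):          # smallest C with P(X <= C) > 1/2
        cdf += prob
        if cdf > F(1, 2):
            print(c)                      # -> 4
            break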
11 (a) (ii)
M_X(t) = E[e^(tX)]
= ∫_0^∞ λ e^(−λx) e^(tx) dx
= λ ∫_0^∞ e^(−(λ−t)x) dx
= λ/(λ − t) = (1 − t/λ)^(−1)
= 1 + t/λ + (t/λ)² + (t/λ)³ + ⋯
Mean E[X] = coefficient of t/1! = 1/λ
E[X²] = coefficient of t²/2! = 2/λ²
∴ var(X) = E[X²] − [E(X)]² = 2/λ² − 1/λ² = 1/λ²
11 (b) (i) (1) X is a Poisson variate, so
P[X = r] = e^(−λ) λ^r / r!
Given P[X = 2] = 9 P[X = 4] + 90 P[X = 6]:
e^(−λ)λ²/2! = 9 e^(−λ)λ⁴/4! + 90 e^(−λ)λ⁶/6!
1 = (3/4)λ² + (1/4)λ⁴
λ⁴ + 3λ² − 4 = 0
(λ² + 4)(λ² − 1) = 0
∴ λ² = −4 or λ² = 1
Hence λ = 1  [since λ² cannot be negative]
Mean = λ = 1
E[X²] = var(X) + [E(X)]² = λ + λ²   (∵ var(X) = λ)
= 1 + 1 = 2
(2) P[X ≥ 2] = 1 − P[X < 2]
= 1 − [P(X = 0) + P(X = 1)]
= 1 − e^(−1)(1 + 1)
= 1 − 2/e
(b) (ii) Let X denote the daily consumption of electric power (in millions
of kilowatt-hours). Then the p.d.f. of X is
f(x) = (1/Γ3)(1/2)³ x² e^(−x/2),  x ≥ 0    [∵ Γ3 = 2! = 2]
P[the power supply is inadequate] = P[X > 12] = ∫_12^∞ f(x) dx
= (1/16) ∫_12^∞ x² e^(−x/2) dx
= (1/16) [x²(−2e^(−x/2)) − 2x(4e^(−x/2)) + 2(−8e^(−x/2))]_12^∞
= (e^(−6)/16)[288 + 96 + 16]
= 25 e^(−6) ≈ 0.062
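As a sanity check (a sketch, not part of the original solution), the same tail probability can be obtained from the Gamma distribution with shape 3 and scale 2, which is the assumed model above; scipy is used for the survival function.

    # Sketch: check P(X > 12) = 25*e^{-6} for a Gamma(shape=3, scale=2) model.
    import math
    from scipy.stats import gamma

    p_exact = 25 * math.exp(-6)              # value obtained by integration by parts
    p_scipy = gamma.sf(12, a=3, scale=2)     # survival function P(X > 12)
    print(round(p_exact, 4), round(p_scipy, 4))   # both ~0.062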
12 (a) (i)
X \ Y            0     1     2     ∑_y P(x, y)
0                0     2k    4k    6k
1                k     3k    5k    9k
2                2k    4k    6k    12k
∑_x P(x, y)      3k    9k    15k   27k
∑_y ∑_x P(x, y) = 1 ⇒ 27k = 1 ⇒ k = 1/27
X \ Y    0       1       2       P(x)
0        0       2/27    4/27    6/27
1        1/27    3/27    5/27    9/27
2        2/27    4/27    6/27    12/27
P(y)     3/27    9/27    15/27   1
Marginal distribution of X
X 0 1 2
P(X) 6/27 9/27 12/27
Marginal distribution of Y
Y 0 1 2
P(y) 3/27 9/27 15/27
Conditional distribution of X given Y = y:
P[X = x / Y = y] = P[X = x ∩ Y = y] / P[Y = y]
P[X = 0 / Y = 0] = 0
P[X = 1 / Y = 0] = (1/27)/(3/27) = 1/3
P[X = 2 / Y = 0] = (2/27)/(3/27) = 2/3
P[X = 0 / Y = 1] = (2/27)/(9/27) = 2/9
P[X = 1 / Y = 1] = (3/27)/(9/27) = 1/3
P[X = 2 / Y = 1] = (4/27)/(9/27) = 4/9
P[X = 0 / Y = 2] = (4/27)/(15/27) = 4/15
P[X = 1 / Y = 2] = (5/27)/(15/27) = 1/3
P[X = 2 / Y = 2] = (6/27)/(15/27) = 2/5
Conditional distribution of Y given X = x:
P[Y = y / X = x] = P[Y = y ∩ X = x] / P[X = x]
P[Y = 0 / X = 0] = 0
P[Y = 1 / X = 0] = (2/27)/(6/27) = 1/3
P[Y = 2 / X = 0] = (4/27)/(6/27) = 2/3
P[Y = 0 / X = 1] = (1/27)/(9/27) = 1/9
P[Y = 1 / X = 1] = (3/27)/(9/27) = 1/3
P[Y = 2 / X = 1] = (5/27)/(9/27) = 5/9
P[Y = 0 / X = 2] = (2/27)/(12/27) = 1/6
P[Y = 1 / X = 2] = (4/27)/(12/27) = 1/3
P[Y = 2 / X = 2] = (6/27)/(12/27) = 1/2
12 (a) (ii) To find c:
∫_{−∞}^∞ ∫_{−∞}^∞ f(x, y) dx dy = 1
c ∫_0^2 ∫_0^2 (4 − x − y) dx dy = 1
c ∫_0^2 [4x − x²/2 − yx]_0^2 dy = 1
c ∫_0^2 (6 − 2y) dy = 1
c [6y − y²]_0^2 = 1
8c = 1 ⇒ c = 1/8
∴ f(x, y) = (1/8)(4 − x − y), 0 ≤ x ≤ 2, 0 ≤ y ≤ 2
          = 0, elsewhere
The marginal density of X is
f(x) = ∫_{−∞}^∞ f(x, y) dy = (1/8) ∫_0^2 (4 − x − y) dy
= (1/8)[4y − xy − y²/2]_0^2
= (1/8)(8 − 2x − 2)
∴ f(x) = (3 − x)/4, 0 ≤ x ≤ 2
Similarly, f(y) = ∫_{−∞}^∞ f(x, y) dx = (1/8) ∫_0^2 (4 − x − y) dx
= (1/8)[4x − x²/2 − yx]_0^2
∴ f(y) = (3 − y)/4, 0 ≤ y ≤ 2
E(X) = ∫_{−∞}^∞ x f(x) dx = (1/4) ∫_0^2 x(3 − x) dx
= (1/4)[3x²/2 − x³/3]_0^2
= (1/4)(6 − 8/3) = 5/6
E(Y) = ∫_{−∞}^∞ y f(y) dy = (1/4) ∫_0^2 y(3 − y) dy = 5/6
E(XY) = ∫_{−∞}^∞ ∫_{−∞}^∞ xy f(x, y) dx dy
= (1/8) ∫_0^2 ∫_0^2 xy(4 − x − y) dx dy
= (1/8) ∫_0^2 [2x²y − x³y/3 − x²y²/2]_0^2 dy
= (1/8) ∫_0^2 (8y − 8y/3 − 2y²) dy
= (1/8) ∫_0^2 (16y/3 − 2y²) dy
= (1/8)[8y²/3 − 2y³/3]_0^2
= (1/8)(32/3 − 16/3) = 2/3
cov(X, Y) = E[XY] − E[X]E[Y]
= 2/3 − (5/6)(5/6)
∴ cov(X, Y) = −1/36
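The result can also be confirmed by numerical double integration; the sketch below (not part of the original solution) uses scipy.integrate.dblquad over the square [0, 2] × [0, 2].

    # Sketch: confirm cov(X, Y) = -1/36 for f(x, y) = (4 - x - y)/8.
    from scipy import integrate

    f = lambda y, x: (4 - x - y) / 8.0
    lo, hi = lambda x: 0.0, lambda x: 2.0

    ex  = integrate.dblquad(lambda y, x: x * f(y, x), 0, 2, lo, hi)[0]      # E[X]  = 5/6
    ey  = integrate.dblquad(lambda y, x: y * f(y, x), 0, 2, lo, hi)[0]      # E[Y]  = 5/6
    exy = integrate.dblquad(lambda y, x: x * y * f(y, x), 0, 2, lo, hi)[0]  # E[XY] = 2/3
    print(exy - ex * ey)   # -> about -0.02777... = -1/36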
12 (b) (i) (1)
P(X < 1/2 ∩ Y < 1/4) = ∫_0^(1/2) ∫_0^(1/4) 8xy dy dx
= ∫_0^(1/2) 8x (y²/2)|_0^(1/4) dx
= ∫_0^(1/2) 4x(1/16) dx
= (1/4)[x²/2]_0^(1/2)
= 1/32
(2) Marginal distributions:
f_X(x) = ∫_{−∞}^∞ f(x, y) dy = ∫_x^1 8xy dy = 8x (y²/2)|_x^1
= 8x(1/2 − x²/2) = 4x(1 − x²), 0 < x < 1
f_Y(y) = ∫_{−∞}^∞ f(x, y) dx = ∫_0^y 8xy dx = 8y (x²/2)|_0^y
= 4y³, 0 < y < 1
Conditional distributions:
f(x / y) = f(x, y)/f_Y(y) = 8xy/4y³ = 2x/y², 0 < x < y
f(y / x) = f(x, y)/f_X(x) = 8xy/(4x(1 − x²)) = 2y/(1 − x²), x < y < 1
13 (a) (i)
E[X(t)] = ∑_{n=0}^∞ n Pn
= ∑_{n=1}^∞ n (at)^(n−1) / (1 + at)^(n+1)
= 1/(1 + at)² [1 + 2(at/(1 + at)) + 3(at/(1 + at))² + …]
= 1/(1 + at)² [1 − at/(1 + at)]^(−2)
= 1/(1 + at)² (1 + at)² = 1
E[X²(t)] = ∑_{n=0}^∞ n² Pn = ∑_{n=1}^∞ [n(n + 1) − n] (at)^(n−1)/(1 + at)^(n+1)
= 1/(1 + at)² [ ∑_{n=1}^∞ n(n + 1)(at/(1 + at))^(n−1) − ∑_{n=1}^∞ n(at/(1 + at))^(n−1) ]
= 1/(1 + at)² [ 2(1 − at/(1 + at))^(−3) − (1 − at/(1 + at))^(−2) ]
   [∵ ∑_{n=1}^∞ n(n + 1) x^(n−1) = 2(1 − x)^(−3) and ∑_{n=1}^∞ n x^(n−1) = (1 − x)^(−2)]
= 2(1 + at)³/(1 + at)² − (1 + at)²/(1 + at)²
= 2(1 + at) − 1
∴ E[X²(t)] = 1 + 2at
∴ Var[X(t)] = E[X²(t)] − {E[X(t)]}² = 1 + 2at − 1 = 2at
Since Var[X(t)] is a function of t, {X(t)} is not stationary.
13 (a) (ii) From the problem statement, the transition probability matrix is
        A     B     C
A   ⎡   0     1     0  ⎤
B   ⎢  2/3    0    1/3 ⎥
C   ⎣  2/3   1/3    0  ⎦
Let π = (π1, π2, π3) be the steady-state distribution, so that πP = π and π1 + π2 + π3 = 1.
(2/3)(π2 + π3) = π1 ⇒ 3π1 − 2π2 − 2π3 = 0   (1)
π1 + π3/3 = π2 ⇒ 3π1 − 3π2 + π3 = 0          (2)
π2/3 = π3 ⇒ π2 − 3π3 = 0                     (3)
From (1) and (2), 3π1 − 8π3 = 0 ⇒ π1 = 8π3/3
π1 + π2 + π3 = 1 ⇒ 8π3/3 + 3π3 + π3 = 1 ⇒ 20π3/3 = 1
∴ π3 = 3/20, π2 = 9/20, π1 = 8/20
∴ The steady-state probability distribution is
π = (8/20, 9/20, 3/20) = (0.40, 0.45, 0.15),
i.e., in the long run he sells 40% of the days in city A, 45% in city B and 15% in city C.
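A minimal numerical check of the stationary distribution (a sketch, not part of the original solution), solving πP = π together with the normalisation condition:

    # Sketch: solve pi P = pi, sum(pi) = 1 for the salesman chain (states A, B, C).
    import numpy as np

    P = np.array([[0.0, 1.0, 0.0],      # from A he always goes to B
                  [2/3, 0.0, 1/3],      # from B: twice as likely A as C
                  [2/3, 1/3, 0.0]])     # from C: twice as likely A as B

    A = np.vstack([P.T - np.eye(3), np.ones(3)])   # stationarity + normalisation
    b = np.array([0.0, 0.0, 0.0, 1.0])
    pi = np.linalg.lstsq(A, b, rcond=None)[0]
    print(pi.round(2))   # -> [0.4  0.45 0.15]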
13 (b) (i)
P(2) = P · P =
⎡0.43  0.31  0.26⎤
⎢0.24  0.42  0.34⎥
⎣0.36  0.35  0.29⎦
(i) P[X2 = 3] = ∑_{i=1}^{3} P[X2 = 3 / X0 = i] P[X0 = i]
= p13(2) P{X0 = 1} + p23(2) P[X0 = 2] + p33(2) P[X0 = 3]
= 0.26 × 0.7 + 0.34 × 0.2 + 0.29 × 0.1
= 0.182 + 0.068 + 0.029
= 0.279
(ii) P{X3 = 2, X2 = 3, X1 = 3, X0 = 2}
= P{X3 = 2 / X2 = 3, X1 = 3, X0 = 2} × P{X2 = 3, X1 = 3, X0 = 2}:
= P{X3 = 2 / X2 = 3} × P{X2 = 3/X1 = 3, X0 = 2} × P[X1 = 3, X0 =2}
=P{X3 = 2 / X2 = 3} × P{X2 = 3/X1 = 3} × P{X1 = 3 /X0 = 2} × P {X0 = 2}
= p32(1) × p33(1) × p23(1) × P{X0 = 2}
= 0.4 × 0.3 × 0.2 × 0.2
= 0.0048
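Both answers can be verified with a few lines of numpy (a sketch added for checking, not part of the original solution):

    # Sketch: numpy check of P^2, P[X2 = 3] and the path probability of 13(b)(i).
    import numpy as np

    P  = np.array([[0.1, 0.5, 0.4],
                   [0.6, 0.2, 0.2],
                   [0.3, 0.4, 0.3]])
    p0 = np.array([0.7, 0.2, 0.1])        # initial distribution over states 1, 2, 3

    P2 = P @ P
    print(P2.round(2))                    # matches the two-step matrix above
    print((p0 @ P2)[2])                   # P[X2 = 3] -> 0.279

    # P[X3=2, X2=3, X1=3, X0=2] = P(X0=2) * p23 * p33 * p32
    print(p0[1] * P[1, 2] * P[2, 2] * P[2, 1])   # -> 0.0048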
13 (b) (ii) For a Poisson process, the mean of X(t) is λt; the mean arrival
rate is λ = 3 per minute, so for a two-minute interval λt = 6.
P{X(t) = x} = e^(−λt)(λt)^x / x!
(1) ∴ P{X(2) = 4} = e^(−6) 6⁴ / 4! = 0.1339
(2) P{X(2) > 4} = 1−P{X(2) ≤ 4}
= 1−[P{X(2) = 0} + P{X(2) = 1} + P{ X(2) = 2}
+ P{X(2) = 3} + P{X(2) = 4}]
= 1 − ∑_{x=0}^{4} e^(−6) 6^x / x!
⎡ 6 6 2 63 6 4 ⎤
= 1 − e −6 ⎢1 + + + + ⎥
⎣ 1! 2 ! 3! 4 ! ⎦
= 0.715
(3) P[fewer than 4 customers arrive] = P{X(2) < 4}
= P{X(2) = 0} + P{X(2) = 1} + P{X(2) = 2} + P{X(2) = 3}
= ∑_{x=0}^{3} e^(−6) 6^x / x!
= e^(−6)[1 + 6/1! + 6²/2! + 6³/3!]
= e^(−6)[1 + 6 + 18 + 36]
= 61 e^(−6) = 0.1512
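The three probabilities can be checked with scipy's Poisson distribution (a sketch, not part of the original solution):

    # Sketch: scipy check of the Poisson(lambda*t = 6) probabilities used above.
    from scipy.stats import poisson

    mu = 3 * 2                           # rate 3 per minute over a 2-minute interval
    print(poisson.pmf(4, mu))            # P{X(2) = 4}  -> 0.1339
    print(poisson.sf(4, mu))             # P{X(2) > 4}  -> 0.7149
    print(poisson.cdf(3, mu))            # P{X(2) < 4}  -> 0.1512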
(14) (a) (i) Number of servers = 1 and the calling population is infinite.
∴ The given model is (M/M/1) : (∞/FIFO).
λ = 10/(8 × 60) = 1/48 set per minute
μ = 1/30 set per minute
Probability that there is no unit in the system:
P0 = 1 − λ/μ = 1 − 30/48 = 1 − 5/8 = 3/8
(1) Repairman's expected idle time in an 8-hour day
= 8 × P0 = 8 × 3/8 = 3 hours
(2) Expected number of jobs in the system
Ls = λ/(μ − λ)
= (1/48) / (1/30 − 1/48)
= (1/48) / ((48 − 30)/(48 × 30))
= 30/18 = 5/3 jobs
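A short numerical restatement of these M/M/1 figures (a sketch, not part of the original solution; rates are expressed per minute):

    # Sketch: M/M/1 figures for the repairman (10 arrivals per 8-hour day, mean repair 30 min).
    lam = 10 / (8 * 60)              # arrival rate per minute
    mu  = 1 / 30                     # service rate per minute

    rho = lam / mu                   # traffic intensity = 5/8
    idle_hours = 8 * (1 - rho)       # expected idle time per day -> 3.0 hours
    ls = lam / (mu - lam)            # expected number of jobs in the system -> 1.667
    print(rho, idle_hours, ls)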
(14) (a) (ii) Given λ = 10 per hour = 1/6 per minute and mean service time 4 minutes, so μ = 1/4 per minute; s = 2.
∴ The given model is (M/M/s) : (∞/FIFO).
P0 = [ ∑_{n=0}^{s−1} (1/n!)(λ/μ)^n + (1/s!)(λ/μ)^s · 1/(1 − λ/(μs)) ]^(−1)
λ/μ = (1/6)/(1/4) = 2/3 and λ/(μs) = (1/6)/(2 × 1/4) = 1/3
∴ P0 = [ 1 + 2/3 + (1/2!)(2/3)² · 1/(1 − 1/3) ]^(−1)
     = [ 1 + 2/3 + 1/3 ]^(−1) = 1/2
(1) P[a customer has to wait] = P[N ≥ 2]
= ∑_{n=2}^∞ (1/(s! s^(n−s))) (λ/μ)^n P0
= ∑_{n=2}^∞ (1/(2! 2^(n−2))) (2/3)^n P0
= (P0/2)(2/3)² ∑_{n=2}^∞ (2/(3 × 2))^(n−2)
= (2P0/9)[1 + 1/3 + (1/3)² + …]
= (2P0/9) · 1/(1 − 1/3)
= (2P0/9)(3/2) = P0/3 = 1/6    (∵ P0 = 1/2)
(2) The fraction of time the servers are busy = ρ = λ/(sμ) = 1/3
∴ The expected percentage of idle time for each girl = 1 − ρ = 2/3, i.e. about 67%.
(3) Lq = [(λ/μ)^s ρ / (s!(1 − ρ)²)] P0 = [(2/3)²(1/3) / (2(2/3)²)] (1/2) = 1/12
∴ The expected length of a customer's waiting time Wq = Lq/λ = (1/12)/(1/6) = 1/2 minute.
14 (b) (ii)
Arrival rate λ = 15 per hour; service rate μ = 6 per hour, ∴ λ/μ = 5/2 = 2.5.
Number of servers s = 3.
Hence this is a problem in the multiple-server (M/M/s) : (∞/FIFO) model.
(a) P[all the typists are busy] = P[N ≥ 3]
= (λ/μ)³ P0 / [3!(1 − λ/(3μ))]    (1)
where P0 = [ ∑_{n=0}^{s−1} (1/n!)(λ/μ)^n + (1/s!)(λ/μ)^s / (1 − λ/(μs)) ]^(−1)
= [ {1 + 2.5 + (1/2)(2.5)²} + {(1/3!)(2.5)³ / (1 − 5/6)} ]^(−1)
= [22.25]^(−1) = 0.0449
(1) ⇒ P[N ≥ 3] = (2.5)³ × 0.0449 / [3!(1 − 5/6)] = 0.7016
Hence the fraction of the time all the typists will be busy = 0.7016.
(b) The average number of letters waiting to be typed:
Lq = (1/(s·s!)) (λ/μ)^(s+1) P0 / (1 − λ/(μs))²
= (1/(3 × 6)) × (2.5)⁴ × 0.0449 / (1 − 5/6)² = 3.5078
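The same M/M/s quantities can be computed generically; the sketch below (not part of the original solution) applies the standard Erlang-C formulas to the typist data.

    # Sketch: generic M/M/s steady-state quantities, applied to the typist problem
    # (lam = 15/hr, mu = 6/hr, s = 3).
    from math import factorial

    def mms_metrics(lam, mu, s):
        r = lam / mu                      # offered load
        rho = r / s                       # utilisation per server
        p0 = 1 / (sum(r**n / factorial(n) for n in range(s))
                  + r**s / (factorial(s) * (1 - rho)))
        p_wait = r**s * p0 / (factorial(s) * (1 - rho))   # P(N >= s), Erlang C
        lq = p_wait * rho / (1 - rho)                     # mean queue length
        wq = lq / lam                                     # mean wait in queue
        ws = wq + 1 / mu                                  # mean time in system
        return p0, p_wait, lq, wq, ws

    print(mms_metrics(15, 6, 3))  # p0~0.0449, P(wait)~0.702, Lq~3.51, Wq~0.234 h, Ws~0.40 h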
When the system has reached the steady state, the probability of
the number of customers in the system will be constant.
Hence E(N) = E(N′) and E(N²) = E(N′²)    (3)
Using this in (2) ⇒ E(δ) = 1 − E(M)       (4)
Squaring both sides of (1),
N′² = N² + (M − 1)² + δ² + 2N(M − 1) + 2(M − 1)δ + 2Nδ    (5)
Now δ² = δ (∵ δ = 0 or 1), and Nδ = 0 (it is 0 × 1 when N = 0 and N × 0 when N > 0).
Using these values in (5),
N′² = N² + M² + 2N(M − 1) + (2M − 1)δ − 2M + 1
i.e., 2N(1 − M) = N² − N′² + M² + (2M − 1)δ − 2M + 1
2E{N(1 − M)} = E(N²) − E(N′²) + E(M²) + E{(2M − 1)δ} − 2E(M) + 1
i.e., 2E(N){1 − E(M)} = E(M²) + {2E(M) − 1}E(δ) − 2E(M) + 1
(∵ independence of M and δ, and (3))
E(N) = [E(M²) + {2E(M) − 1}{1 − E(M)} − 2E(M) + 1] / 2{1 − E(M)}
     = [E(M²) − 2E²(M) + E(M)] / 2{1 − E(M)}    (∵ by (4))
Since the number of arrivals M in time T follows a Poisson process with parameter λ,
E(M/T) = λT and Var(M/T) = λT, so E(M²/T) = (λT)² + λT.
Now E(M) = E{E(M/T)} = E(λT) = λE(T)    (6)
15 (b) (ii) Given an open queueing network with three nodes 1, 2 and 3.
Let λ1, λ2, λ3 be the resultant (total) arrival rates and r1, r2, r3 the
arrival rates of customers to node j coming from outside the system.
Given P = ⎡ 0    0.6  0.3⎤
          ⎢0.1    0   0.3⎥ ,  (r1, r2, r3) = (1, 4, 2)
          ⎣0.4  0.4    0 ⎦
Jackson's flow-balance equations are
λj = rj + ∑_{i=1}^{3} λi Pij,  j = 1, 2, 3    (1)
We note that P12 = 0.6, P13 = 0.3, P21 = 0.1, P23 = 0.3,
P31 = 0.4, P32 = 0.4, P11 = P22 = P33 = 0.
Taking j = 1: λ1 = 1 + 0.1λ2 + 0.4λ3
Taking j = 2: λ2 = 4 + 0.6λ1 + 0.4λ3
Taking j = 3: λ3 = 2 + 0.3λ1 + 0.3λ2
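These traffic equations form a small linear system that can be solved directly; the sketch below (added for illustration, not part of the original solution) does so with numpy.

    # Sketch: solve the flow-balance (traffic) equations lambda = r + P^T lambda.
    import numpy as np

    P = np.array([[0.0, 0.6, 0.3],
                  [0.1, 0.0, 0.3],
                  [0.4, 0.4, 0.0]])
    r = np.array([1.0, 4.0, 2.0])

    lam = np.linalg.solve(np.eye(3) - P.T, r)   # (I - P^T) lambda = r
    print(lam.round(3))                         # approx [4.286, 8.961, 5.974]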
11. (a) (i) The distribution function of a random variable X is given by
F(x) = 0,                      x < 0
     = x²,                     0 ≤ x < 1/2
     = 1 − (3/25)(3 − x)²,     1/2 ≤ x < 3
     = 1,                      x ≥ 3
Find the pdf of X and evaluate P(|X| ≤ 1) and P(1/3 ≤ X < 4)
using both the pdf and the cdf.
(ii) A random variable X has the following probability distribution:
x:     −2    −1    0    1    2    3
p(x):  0.1    k   0.2   2k  0.3   3k
(1) Find k, (2) Evaluate P(X < 2) and P(−2 < X < 2), (3) Find the
cumulative distribution function of X and (4) Evaluate the mean of X.
(b) (i) The probability function of an infinite discrete distribution is
given by P(X = j) = 1/2^j, j = 1, 2, 3, … Verify that the total probability
is 1 and find the mean and variance of the distribution. Find also
P(X is even), P(X ≥ 5) and P(X is divisible by 3).
(ii) Define Gamma distribution and find its mean and variance.
12. (a) (i) The joint probability mass function of (X,Y) is given by P(x,y)
= k (2x + 3y), x = 0, 1, 2; y = 1, 2, 3. Find all the marginal and
conditional probability distributions.
(ii) State and prove central limit theorem.
(Or)
(b) (i) If X and Y are independent RVs with pdfs e^(−x), x ≥ 0, and e^(−y),
y ≥ 0, respectively, find the density functions of U = X/(X + Y) and
V = X + Y. Are U and V independent?
(ii) Find the correlation coefficient for the following data:
X: 10  14  18  22  26  30
Y: 18  12  24   6  30  36
13. (a) (i) Define Poisson process and derive the Poisson probability law.
(ii) A man either drives a car (or) catches a train to go to office each
day. He never goes two days in a row by train but if he drives one
day, then the next day he is just as likely to drive again as he is to
travel by train. Now suppose that on the first day of the week, the
man tossed a fair die and drove to work if and only if a 6 appeared.
Find (1) the probability that he takes a train on the third day and
(2) the probability that he drives to work in the long run.
(Or)
(b) (i) Show that the random process X(t) = A cos( ω 0 t + θ ) is wide-
sense stationary , if A and ω 0 are constants and θ is a uniformly
distributed RV in (0, 2p)
(ii) If customers arrive at a counter in accordance with a Poisson pro-
cess with a mean rate of 2 per minute ,find the probability that the
interval between 2 consecutive arrivals is (1) more than 1 min.(2)
between 1 min and 2 min and (3) 4 min (or) less.
14. (a) Find the mean number of customers in the queue system, average
waiting time in the queue and system of M/M/1 queueing model.
(Or)
(b) There are three typists in an office. Each typist can type an average
of 6 letters per hour. If letters arrive for being typed at the rate of 15
letters per hour,
(i) What fraction of the time all the typists will be busy?
(ii) What is the average number of letters waiting to be typed?
(iii) What is the average time a letter has to spend for waiting and
for being typed?
(iv) What is the probability that a letter will take longer than 20 min.
waiting to be typed and being typed?
∴ k ∫_2^5 (1 + x) dx = 1
k [x + x²/2]_2^5 = 1
k [5 + 25/2 − 2 − 4/2] = 1
(27/2) k = 1
∴ k = 2/27
∴ f(x) = (2/27)(1 + x)
And P[x < 4] = ∫_2^4 (2/27)(1 + x) dx
= (2/27)[x + x²/2]_2^4
= (2/27)[4 + 16/2 − 2 − 4/2]
= (2/27)(8)
= 16/27
That can assume the values 0, 1, 2..... such that its probability mass
function is given by
e−λ λ x
P [ X = r] = ; r = 0, 1, 2... & λ > 0
r!
Then X is said to follow a Poisson distribution with parameter λ
Mean = E[ X ] = λ
Var X = λ
3. w.k.t
∫_{−∞}^∞ ∫_{−∞}^∞ f(x, y) dx dy = 1
∴ ∫_0^∞ ∫_0^∞ k xy e^(−(x² + y²)) dx dy = 1   (x, y > 0)
k ∫_0^∞ x e^(−x²) dx ∫_0^∞ y e^(−y²) dy = 1
Let x² = t and y² = s, so that 2x dx = dt and 2y dy = ds, i.e. x dx = dt/2 and y dy = ds/2.
k ∫_0^∞ e^(−t) (dt/2) ∫_0^∞ e^(−s) (ds/2) = 1
(k/4) [−e^(−t)]_0^∞ [−e^(−s)]_0^∞ = 1
k/4 = 1
∴ k = 4
y ( x )3
1
1 3
∴x y
2
dx
f y y) = f x ( x) | |
ddy
1 2
1 −3
= y3 × y
6
1
1 −3
= y 0< y<8
6
5. When the Markov chain is homogenous, the one-step transition probabily
is denoted by pij. The matrix P = {pij} is called the transition probability
matrix satisfying the conditions.
⎡ an ⎤
P ⎢ X ( tn ) = = an X (tn − 2 ) an − 2 .... X (t
(t2 ) = a2 , X (t1 ) a1 ⎥
⎣ X(t
( n - 1) ⎦
⎡ an ⎤
= P ⎢ X ( tn ) = = an −1 ⎥
⎣ X (tn −1 ) ⎦
for all t1 t2 < .....tn
In otherwords, if the future behavior of a process depends on the present
value but not on the past, then the process is called a Markov process.
model is
l l l
0 1 2 n n+1
m m m
8. (a/b/c) : (d/e)
where
a-inter arrival time
b-service mechanism
c-number of service
d- the capacity of the system
e-queue decipline
λ 2 σ 2 + ρ2 λ 2
= +ρ h ρ= ,σ =V ( )
2(1 − ρ) μ
λ 2 σ 2 + ρ2
=
2(1 − ρ)
10. A series model or tandem queue model is, in general, one in which
(i) Customers may arrive from outside the system at any node and may
leave the system from any node.
(ii) Customers may enter the system at some node, traverse from node
to node in the system and leave the system from some node, not
necessarily following the same order of nodes.
(iii) Customers may return to the nodes
previously visited, skip some nodes entirely and even choose to remain
in the system forever.
11. (a) (i) The points x = 0,1/2 and 3 are points of continuity
⎧0 , x<0
⎪ 1
⎪2 x , 0 ≤ x <
⎪ 2
∴ f ( x) = ⎨
6
⎪ ( − x ), 1
≤ x<3
⎪ 25 2
⎪0 , x ≥3
⎩
2
13
=
25
If we use property of cdf
13
P[| X | ] = P[ x ] = F ( ) F(
F( )=
25
If we use the property of pdf
1
2 3
⎡1 ⎤ 6
P
⎣3
≤ X <4
⎦ ∫
1
2 x ddx + ∫ 25 (3
1
x ) dx
3 2
8
=
9
If we use the property of cdf,
⎡1 ⎤ ⎛ 1⎞
P ≤ X < 4 F ( 4) F ⎜ ⎟
⎣ 3 ⎦ ⎝ 3⎠
1
= 1−
9
8
=
9
11.) a (ii)
(1) Since ∑ p( x) = 1
0.1 + k + 0.2 + 2k + 0.3 + 3k = 1
6k + 0.6 = 1
∴ k = 1/15
(4)
Mean of X E [ x] ∑ x p ( x)
⎛ 1⎞ ⎛ 1⎞ ⎛ 2⎞ ⎛ 3⎞ ⎛ 1⎞
= ( −2) ⎜ ⎟ + ( −1) ⎜ ⎟ + 0 + 1 ⎜ ⎟ + 2 ⎜ ⎟ + 3 ⎜ ⎟
⎝ 10 ⎠ ⎝ 15 ⎠ ⎝ 15 ⎠ ⎝ 10 ⎠ ⎝ 5⎠
16
=
15
11 (b)
∞
∑p
1 1 1
(i) j = + 2 + 3 + ....... + ∞
j =1
2 2 2
1
= 2 =1 ∵geometric series
1
1−
2
∞
The mean of X E[X]= ∑ jpj=1
j
1
Let a =
2
E [ X ] = a + 2a 2 + 3a3 + ......
= a ⎡⎣1 + 2a + 3a 2 + .....⎤⎦
1
= a[ a]
−2 2
= 2
=2
⎛ 1⎞
⎜⎝1 − ⎟⎠
2
∴E[ ]= 2
∑j a
1
= 2 j
∵a =
j =1
2
= ∑ [ j( j + 1) − j] a j
∞ ∞
= ∑ j =1
+ − ∑ ja
j =1
j
= a × 2(1 − a) −3 − a(1 − a) −2
2a a 1
= − = 8−2 = 6 ∵a =
(1− a) 3
(1 − a) 2 2
∴V ( X ) = E[ X 2 ] [ E(( )]2 = 6 − 4 = 2
P[x is even] =
= P[ X = ] + P[ X = ] + ....
∵ the events are mutually exclusive
2 4 6
⎛ 1⎞ ⎛ 1⎞ ⎛ 1⎞
= ⎜ ⎟ + ⎜ ⎟ + ⎜ ⎟ + ....
⎝ 2⎠ ⎝ 2⎠ ⎝ 2⎠
1
4 1
= =
1 3
1−
4
P[ X ] = P[ X = 5 or X 6 X = 7.....]
= P[ x ] + P[x
[x ] + ....
5
⎛ 1⎞
⎜⎝ ⎟⎠ 1
2
= =
1 16
1−
2
P[X is divisible by 3] = P[ X = 3 X = 6 or X = 9...]
= P[ X = ] + P[ X = ] + ...
3 6
⎛ 1⎞ ⎛ 1⎞
= ⎜ ⎟ + ⎜ ⎟ + ....
⎝ 2⎠ ⎝ 2⎠
3
⎛ 1⎞ 1
⎜⎝ ⎟⎠ 1
2
= 3
= 8 =
⎛ 1⎞ 1 7
1− ⎜ ⎟ 1−
⎝ 2⎠ 8
⎧ λ k x k −1e − λ x
⎪ , for x ≥ 0
f ( x) = ⎨ k
⎪0
⎩ , otherwise
M.G.F
The rthmoment μ r′ = E[ X ]
∞
λk
= ∫
0
k
x k+r −1e − λ x dx
∞
λk
∫x k+r −1 − λ x
= e dx
k 0
put λ = ⇒ = λ
when x = 0 ⇒ = 0
x = ∞ ⇒t = ∞
∞ k+r −1
λk ⎛t⎞
=
k ∫ 0
⎜⎝ ⎟⎠
λ
e −tt dt / λ
∞
λ k
1
∫t k+r-1 − t
= × k+r
e dt
k λ k+r 0
∞
1 k+r
= ∵k ∫x k
e x ddx
λr k 0
1 k +1 k
∴ Mean = E[ X ] = . =
λ k λ
Var(X) = E[ x 2 ] − [ E ( X )]2
2
1 k+2 ⎛k⎞
= −⎜ ⎟
λ2 k ⎝ λ⎠
1 ⎡
= k ( k + ) − k 2 ⎤⎦
λ2 ⎣
k
= 2
λ
12. (a) (i) Given P(x, y) = k(2x + 3y), x = 0, 1, 2; y = 1, 2, 3
The joint probability distribution of (X, Y) is given below
X/Y 1 2 3
0 3k 6k 9k
1 5k 8k 11k
2 7k 10k 13k
Since ∑_{j=1}^{3} ∑_{i=0}^{2} p(xi, yj) = 1, 72k = 1 ⇒ k = 1/72.
X \ Y          1       2       3       ∑_y p(x, y)
0             3/72    6/72    9/72    18/72
1             5/72    8/72   11/72    24/72
2             7/72   10/72   13/72    30/72
∑_x p(x, y)  15/72   24/72   33/72    1
Marginal distribution of X
P[X = x] = ∑ p( x, y)
y
x 0 1 2
P(x) 18/72 24/72 30/72
P[Y = y] = ∑ p( x, y)
x
Y 1 2 3
p(y) 15/72 24/72 33/72
Conditional distribution of X, given Y = y
P[ X x ∩ Y y]
P[ X x / Y = y] =
P[Y y ]
3 / 72 1
P[ X / y = 1] = =
15 / 72 5
5 / 72 1
P[ X / Y = 1] = =
15 / 72 3
7 / 72 7
P[ X / Y = 1] = =
15 / 72 15
6 / 72 1
P[ X / Y = 2] = =
24 / 72 4
8 / 72 1
P[ X / Y = 2] = =
24 / 72 3
10 / 72 5
P[ X / Y = 2] = =
24 / 72 12
9 / 72 3
P[ X / Y = 3] = =
33 / 72 11
11 / 72 1
P[ X / Y = 3] = =
33 / 72 3
13 / 72 13
P[ X / Y = 3] = =
33 / 72 33
e E ⎢e σ n e σ n .......e σ n ⎥
⎢ ⎥
⎣ ⎦
−
tμ n ⎛ ( tx1 ) ⎞ ⎛ tx2 ⎞ ⎛ txn ⎞
Hence Mz(t) = e E ⎜ eσ ⎟ E ⎜ eσ ⎟ ..........E ⎜ e σ n ⎟
⎜⎝ ⎟⎠ ⎜⎝ ⎟⎠ ⎜⎝ ⎟⎠
tμ n ⎡ ⎛ ⎛ t ⎞ 1⎛ t ⎞
2 ⎞⎤
=− + n log ⎢ E ⎜1 + ⎜ X + X 2
+ .....⎟⎥
σ ⎢ ⎝ ⎝ σ n ⎟⎠ 2 ! ⎜⎝ σ n ⎟⎠ ⎠ ⎥⎦
⎣
tμ n ⎡ ⎛ t ⎞ 1⎛ t ⎞
2 ⎤
=− + n log ⎢1 + ⎜ ⎟ μ1
1
+ ⎜ ⎟ μ21 + ....⎥
σ ⎢⎣ ⎝ σ n ⎠ 2! ⎝ σ n ⎠ ⎥⎦
tμ n ⎡⎛ t μ1⎛ t ⎞
2 ⎞ 1⎛ t ⎞
2 ⎤
=− + n ⎢⎜ μ11 + 2 ⎜ + ...⎟ − μ 1
+ ... + ...⎥
σ ⎢⎝ σ n 2 ! ⎝ σ n ⎟⎠ ⎠ 2⎝ 1 σ n ⎟⎠ ⎥
⎣ ⎦
Put μ1 = μ = mean
μt n t2 ⎡ 1 n t
log z (t ) μ
2 ⎣ 2
( μ11 ) 2 ⎤⎦
σ 2σ σ
+terms containing n in the denominator
t2
log z (t ) = σ2
2σ 2
= t2 / 2
2
∴ M z ( ) = et /2
, as n → ∞
∂x ∂x
∂u ∂v v u
J= = =v
∂yy ∂yy −vv ( u)
∂u ∂v
The joint pdf (u, v) is given by
g(u,v) = J e-(x + y)
= ve-u
Range space of u & v
x ≥ 0 ⇒ uv ≥ 0
y ≥ 0 ⇒ v(1 − u ) ≥ 0
⇒ 0 ≤ u ≤1 &v ≥ 0
∴ g (u
( u, v ) ve − u , 0 ≤ u ≤ 1; v ≥ 0
pdf of u:
∞
∫
fU (u ) = ve − v dv
0
∞
= ⎡⎣ − ve −vv − e v ⎤⎦
0
fU(u) = 1, 0 ≤ u ≤1
pdf of v
1
∫
fV ( v ) = ve −vv ddu = ve v , v ≥ 0
0
12 (b) (ii)
x y x x x − 20 y y y − 21 ( )2 ( )2 ( )(
)( y))
10 18 -10 -3 100 9 30
14 12 -6 -9 36 81 54
18 24 -2 3 4 9 -6
22 6 2 -15 4 225 -30
26 30 6 9 36 81 54
30 36 10 15 100 225 150
∑ 120 126 0 0 280 630 252
∑ x 120
x= = = 20 ∵n = 6
n 6
∑ y 126
y= = = 21
n 6
∑( x − x )( y − y ) 252
byx = = = 0.9
∑( x − x ) 2 280
∑( x − x )( y − y ) 252
bxy = = = 0.4
∑( y − y ) 2 630
d
P (t ) + ppn (t ) = λ Pn −1 (t )
dt n
∴Which is a linear differential equation.
dy
+ PY = Q
dn
t
∴ Pn (t )e + λ t = ∫
0
pn −1 (t )e λ t dt
x
The solution is ye ∫ = Qe ∫
∫
pdx pdx
dx
0
t
∫
= λ pn −1 (t )e λ t dt
0
(1)
Now, taking n = 1
t
∫ P ( )e
− λt λt
( ) ⇒ P1 ( )e = 0 dt
0
P0 ( ) = P[ in (0, t + Δt )]
= P0 (t ) [ λΔt Δt ]
= P0 (t ) − P0 (t )( λ t )
P0 (t + t ) − P0 (t )
∴ = − λ P0 (t )
Δt
Taking limit as Δ → 0
P0 ( ) P0 ( )
lim = − λ P0 ( )
Δt → 0 Δt
d
P (t ) = − λ P0 (t )
dt 0
d [ P0 (t )]
= − λ dt
P0 (t )
∴ log 0 (t ) = −λt + c
P0 ( ) = e − λ t + c
= Ae − λ t ∵ e c = A
∴ P0 ( ) = Ae 0 = A
ie., 1 = A
t
∫
∴ e λ P1 (t ) = λ e − λ t ⋅ e λ t ddt
0
t
∫
= λ dt
0
= λt
∴ P1 (t ) = e − λ t ( λ t )
Similarly
t
∫ P (t(t )e
λt λt
P1 (t )e 1 dt
0
∴ P1 (t ) = e−λ t λ t
t
∫
= λ e−λλ t ( t )e λ t dt
0
t
= λ 2 tdt
d ∫
0
⎛ t2 ⎞
= λ2 ⎜ ⎟
⎝ 2⎠
t2
∴ P2 (t ) = e − λ t
2!
Proceeding similarly, we have, in general
(λ t )n
P[ X ( t ) n] = Pn (t ) = e − λ t , n = 0, 1, 2 ...
n!
11
∴ P[ h l by i h hi d d y ] =
24
The steady state probability distribution of the markov chain.
π (π1 , π 2 )
By the property of π ,
πP π
( 1 , π 2 ) ( / 2 1// ) (π1 , 2)
1
∴ π2 π1 (1)
2
1
and π1 + π 2 = π 2 (2)
2
From eqn (1) and (2) with
π1 + π 2 = 1
1 2
π1 d π2 =
3 3
2
∴ P[The man travels by car in the long run =
3
13 (b) (i)
Since θ is uniformly distributed in (0, 2p)
1 1
∴ f( )= = , 0<θ < π
b − a 2π
E[ x(t )] = E[ A cos ( 0 t + ))]
2π
1
=A ∫ 2π cos(ω t
0
0 θ ) dθ
A
= [sin( wo t + θ )]0 2π
2π
A
= [sin( π + w0 t ) − inn ω 0 t ]
2π
A
= [sin ω 0 t i ω 0 t ] ∴ sin ( 2π θ ) = sin θ
2π
= 0 = const .
E[ X (t1 ) X (t2 ) E[ A2 cos(ω 0 t1 θ ) ω s (ω 0 t2 + θ )]
A2
= E[cos{(t1 + t2 ) ω 0 2θ} { 0 ( t1 − t2 )}]
2
2π
A2
=
2 ∫ cos{(t + t )ω
0
1 2 0 + {(t1 − t2 )ω 0 }dθ
2π
A2 ⎡ sin{(t1 + t2 )ω 0 + 2θ} ⎤
= ⎢ +θ s{( 1 2 )ω 0 }⎥
4π ⎣ 2 ⎦0
A2
= [ πc (1 2 )ω 0 ]
4π
A2
= cos( t1 t2 )ω 0
2
∴ R (tt1 t2 ) (t1 − t2 )
∴ { X (t )} is a wide sense stationary
a process.
13 (b) (ii) The time interval T between two consecutive arrivals is a random
variable follows exponential distribution with parameter
l = 2 ∴ the pdf of T is f (t) = 2e−2t, t ≥ 0
(i) P[T > 1] = ∫_1^∞ 2e^(−2t) dt = e^(−2) = 0.1353
(ii) P[1 < T < 2] = ∫_1^2 2e^(−2t) dt = e^(−2) − e^(−4) = 0.1170
(iii) P[T ≤ 4] = ∫_0^4 2e^(−2t) dt = 1 − e^(−8) = 0.9997
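A scipy check of the three interval probabilities (a sketch, not part of the original solution; the rate 2 per minute comes from the problem data):

    # Sketch: exponential inter-arrival time with rate 2 per minute (mean 1/2 min).
    from scipy.stats import expon

    T = expon(scale=1/2)
    print(T.sf(1))                       # P(T > 1)      -> 0.1353
    print(T.cdf(2) - T.cdf(1))           # P(1 < T < 2)  -> 0.1170
    print(T.cdf(4))                      # P(T <= 4)     -> 0.9997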
1
= n
∞
⎛ λ⎞
∑
n= 0
⎜⎝ μ ⎟⎠
1
=
λ λ ...
(1 + + 2 + )
μ μ
1 λ
= = 1−
λ μ
(1 − ) −1
μ
n
⎛ λ⎞ λ
∴ Pn = ⎜ ⎟ ⋅1 −
⎝ μ⎠ μ
Average waiting time of a customer in the queue:
Wq = (λ/μ) ∫_0^∞ (μ − λ) ω e^(−(μ−λ)ω) dω
Let (μ − λ)ω = t, so that dω = dt/(μ − λ).
Wq = (λ/(μ(μ − λ))) ∫_0^∞ t e^(−t) dt
   = λ/(μ(μ − λ))    [∵ ∫_0^∞ t e^(−t) dt = 1]
∴ Wq = λ/(μ(μ − λ))
14 (b) Arrival rate λ = 15 per hour; service rate μ = 6 per hour, ∴ λ/μ = 5/2.
Number of servers s = 3.
Hence this is a problem in multiple server (M/M/S): (∞/FIFO )
model.
(a) P [all the typists are busy]
= P[ N ≥ ]
3
⎛ λ⎞
⎜⎝ μ ⎟⎠ .Po
= (1)
⎛ ⎛ λ ⎞⎞
3! ⎜1 − ⎜ ⎟ ⎟
⎝ ⎝ 3μ ⎠ ⎠
−1
⎡ ⎧ ⎫⎤
⎢ s−1 ⎧ n⎫ ⎪ s ⎪⎥
⎪ 1⎛ λ⎞ ⎛ λ ⎞ ⎪⎥
∑
1
P0 = ⎢ ⎨ ⎜ ⎟ ⎬ + ⎨ ⎜⎝ ⎟⎠ ⎬ ⎥
⎢ n !⎝ μ ⎠ ⎛ λ ⎞ u ⎪
⎢ n = 0 ⎪⎩ ⎪⎭ ⎪ S ! 1 −
⎜ ⎟ ⎥
⎢⎣ ⎪⎩ ⎝ μ s ⎠ ⎪⎭ ⎥⎦
−1
⎡ ⎧ ⎫⎤
⎢ 3 ⎥
⎧ 1 ⎫ ⎪⎪ 1 ⎛ 5 ⎞ ⎪⎪ ⎥
= ⎢ ⎨1 + 2.5 + ( 2.5) 2 ⎬ + ⎨ ⎜ ⎟ ⎬
⎢⎩ 2 ⎭ ⎪ 3! ⎛1 − 5 ⎞ ⎝ 3 ⎠ ⎪ ⎥
⎢
⎣ ⎪⎩ ⎝ 3 ⎟⎠ ⎪⎭ ⎥⎦
= [22.25]−1 = 0.0449
4
3
(5 / 3)
1⇒ [ 3] = × 0.0449 = 0.7016
6(1 − 5 / 9)
Hence the fraction of the time all the typists will be busy = 0.7016
(b) The average number of letters waiting to be typed
s +1
⎛ λ⎞
⋅ P0
1 ⎜⎝ μ ⎟⎠
Lq = 2
s s! ⎛ λ⎞
1 −
⎜⎝ μ s ⎟⎠
4
⎛ 2 5⎞
⋅ 0.0449
1 ⎜⎝ 6 ⎟⎠
= 3.5078
3 6 ⎛ 2 5⎞ 2
⎜⎝1 − ⎟
3 ⎠
(c) 1
ws L
λ s
1⎡ λ⎤ λ
= ⎢ Lq Ls = Lq +
λ⎣ μ⎦ μ
1
= [3.5078 + 2.5] = 0.4005 h
15
or ws = 24 min, nearly
⎡ s⎡ − μt s −1− ⎟ ⎤
⎛ λ⎞ ⎤
⎢ ⎛ λ ⎞ ⎢1 − e ⎝ u⎠ ⎥ ⎥
⎢ ⎜⎝ μ ⎟⎠ ⎢ ⎥ ⎥
(d) P[w t ] e − μt ⎢1 + ⎣ ⎦P ⎥
⎢ ⎛ λ ⎞⎛ λ⎞ ⎥
0
⎢ s ! ⎜1 − ⎟ ⎜ s − 1 − ⎟ ⎥
⎢ ⎝ μs ⎠ ⎝ μ⎠ ⎥
⎣ ⎦
⎡ 1 ⎤
∴ P [w > ] = P w > h ⎥
min ⎣ 3 ⎦
⎡ ⎤
−6 × ⎢ ( × − 5)
4 ⎥
1
( .5) {1 − e
3 (−
× 0.0449
= e 3 ⎢1 + ⎥
⎢ ⎛ 2 5⎞ ⎥
⎢ 6 1− ⎟ ( −0.5) ⎥
⎣ ⎝ 3 ⎠ ⎦
⎡ 0 . 7016 (1 − e ) ⎤
= e −2 ⎢1 +
⎣ ( −0.5) ⎥⎦
= 0.4616
15. (a) Let N & N′ be the numbers at customers in the system at time t & t
+ T, when two consecutive customers have just left the system after
getting service.
⎧1, if N = 0
where =⎨
⎩0, if N > 0
∴ E ( ′ ) = E (N) − 1 + E (M) + E(δ) (2)
When the system has reached the steady state, the probability of the
number at customers in the system will be constant
N ′2 N 2 + ( M 1) 2 δ 2 2 N ( M − 1) 2( M − 1) δ 2 N δ (5)
Now d = 0
⎧ 0 × 1, if N = 0
and N δ = ⎨
⎩ N 0, iif N > 0
=0
Using these values in (5)
5⇒ ′2 N2 M2 2 N ( M −11) ( 2 M −11) δ 2 M + 1
ie 2 N (1 M ) = N 2 N 2
+ M2 ( 2 M 1) δ − 2 +1
2 E{N (1 − M )} E ( N ) E(N 2 2
( 2 M 1)δ }
E ( N ) E ( M ) E { (2 2
− 2 E (M ) 1
ie 2E ( N ) {1 E ( M )} = E ( M 2 ) + {2E ( M ) 1} − 2 E ( M ) .
∵ Independense Property and (3)
E ( M 2 ) + { E ( M ) }{ E ( M )} 2 E ( M ) + 1
E (N ) =
2{{ E ( M )}
E (M 2 ) − 2 E2( M ) + E (M )
= ∴by 4
2{{ E ( M )}
⎪⎩ ⎝ T ⎠ ⎪⎭
= λ 2 {Var
a (T ) + E 2 T } λ E (T )
λ 2V (T ) + λ 2 E 2 (T ) + λ E (T ) − 2 λ 2 2
( ) λ E (T )
Using ( ) a d ( ) in 5 Ls = (7)
2{1 λ E (T T)}
⎧V (T ) + E 2 (T ) ⎫
= λ ( ) λ2 ⎨ ⎬{ λ (T )}
⎩ 2 ⎭
15 (b) Open Jackson Networks
A networks of ‘k’ service facilities or nodes is called an open Jack-
son Network, if it satisfies the following characteristics:
(1) Arrivals, from outside, to the node ‘i’ follow a Poisson process
with mean rate ‘ri’ and join the queue at ‘i’ and wait for his turn
for service.
(2) Service times at the channels at node ‘i’ a independent and each
exponentially distributed with parameter ‘μi’.
(3) Once a customer gets the service completed at node ‘i’ he joins
the queue at node ‘j’ with probability ‘Pij” (whatever be the
number of customers waiting at ‘j’ for service), where i = 1,2, …
k, and j = 0,1,2,… k.
Pio represents the probability that a customer leaves the system from
node i after getting the service at ‘i’.
If we denote the total arrival rate of customers to server ‘j’ [viz., the
sum of the arrival rate rj (Note: It is not lj) to ‘j’ coming from outside
and the rates of departure λi from the servers i] by λ j , then
k
λj j ∑λ Pi =1
i iij , ; j k (1)
Pij is the probability that a departure from server ‘i’ joins the queue
at server ‘j’ and hence ‘liPij’ is the rate of arrival to server ‘j’ from
among those coming out from server ‘i’.
Equations (1) are called Traffic equations or Flow balance equa-
tion.
∑P
j =1
ij = 1 for all i =1,2,…, k. we note that the matrix P = [ ij ] is
λj ∑ λ P ; j = 1,2,…, k [
i =1
i ij j ] (1)
Where C N1 ∑
n1 + n2 + ...+nnk N
P1n1 P2 n Pk nk ,
λj
Pj =
μj
3. Given the two regression lines 3x + 12 y = 19, 3y + 9x = 46, find the coef-
ficient of correlation between X and Y.
Y
6. Find the transition prob. matrix of the process represented by the state
transition diagram.
0.5
2 0.3
0.4 1
0.3
0
0.1 0.2
0.3 0.4
3
0.5
8. Trains arrive at the yard every 15 minutes and the service time is 33
minutes. If the line capacity of the yard is limited to 4 trains find the
probability the yard is empty.
9. Given that the service time is Erlang with parameter m and m. Show that
m( + m)l 2
the Pollaczek –Khintchine formula reduced to Ls ml +
2(( − ml )
PART B – (5 × 16 = 80 marks)
11. (a) (i) Find the moment-generating function of the binomial r.v with
parameters m and p and hence find its mean & variance.
(ii) Define Weibull distribution and write its mean and variance.
Or
(b) (i) Derive mean & variance of a geometric distribution. Also estab-
lish the forgetfulness property of the geometric distribution.
(ii) Suppose that the telephone calls arriving a particular switch-
board follow a Poisson process with an average of 5 calls. What
is the probability that up to a minute will elapse unit 2 calls have
come in to the switch boards?
Or
(b) (ii) If X & Y are independent random variables having density fns.
⎧2e −2 x x ≥ 0 3e −3 y y ≥ 0 respectively,
⎧3e
f ( x) = ⎨ and f y ( y ) = ⎨
⎩ 0 0.w ⎩ 0 y <0
find the density fns of Z = x−y:
13. (a) (i) Show that the random process {X(t)} = A cost + B sint −∞ < t
<∞ is a wide sense stationary process where A and B are inde-
pendent random variables each of which has a value −2 with
probability 1/3 and a value 1 with 2/3.
(ii) Derive probability distribution of Poisson process and hence
find its auto correlation fn.
Or
(b) (i) Find the limiting–state probabilities associated with the follow-
⎛ 0.4 0.5 0 1⎞
ing probability matrix ⎜ 0.3 0.3 0 4⎟
⎜ ⎟
⎜⎝ 0.3 0.2 0 5⎟⎠
14. (a) (i) Customers arrive at a one window drive in bank according to
Poisson Distribution with mean 10 per hour. Service time per
14. (a) (ii) Show that for the ( / M / : FCFS / ) , the distribution of
waiting time in the system is w (t ) = ( μ λ )e −(( μ λ )w
.
(b) Find the steady state solution for the multi server M/M/C model and
hence find Lq, Wq, Ns, and Ls by using Little’s formulas.
15. (a) Derive the expected steady state system size for the single serve
queues with Poisson input and General services.
Or
1.
d ⎧0 0.w
f ( x) = F ( x) = ⎨
dxx x ⎩1 0 ≤ x ≤ 1
1
⎛ 1⎞ 1 3
∫ dx = ( x)
1
∴P X > ⎟ = = 1− =
⎝ 4⎠ 1/4
4 4
1/4
X 2 3 4 5 6 7 8 9 10 11 12
p(x) 1/36 2/36 3/36 4/36 5/36 6/36 5/36 4/36 3/36 2/36 1/36
3.
3X + 12 Y = 19
1 19
Y X+
4 3
−1
∴ bYX =
4
3 9 X = 36
1
⇒ X = − Y +4
3
−1
∴ bXY =
3
⎛ −1⎞ ⎛ −1⎞ 1
∴ r = ± bXY bXY = ⎜ ⎟ ⎜ ⎟ = ±
⎝ 4 ⎠⎝ 3⎠ 12
Since both bxy, byx are negative
1 1
r=− =−
12 2 3
5. (a) If ‘t’ is continuous and the random variable X is continuous, then the
random process X(t) is called a continuous random process.
(b) If ‘t’ is continuous and the random variable X is discrete then the
random process X(t) is called discrete random process.
6.
1 2 3
1 ⎡0.4 0.5 0 1⎤
⎢ ⎥
TPM P = 2 ⎢ 0.3 0.3 0 4 ⎥
3 ⎢⎣ 0.1 0.2 0 5⎥⎦
1 1
7. λ 9
μ=
3
λ 1/9 1
P(the system is busy) = = =
μ 1/3 3
8. Since the yard has a maximum capacity of 4 trains, this is the M/M/1
finite-capacity model (model III).
Given λ = 1/15 per minute and μ = 1/33 per minute, so λ/μ = 33/15.
P(the yard is empty) = P0 = (1 − λ/μ) / (1 − (λ/μ)^(k+1))
= (1 − 33/15) / (1 − (33/15)⁵) = 0.0237
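A two-line check of this finite-capacity probability (a sketch, not part of the original solution):

    # Sketch: P0 for the finite-capacity M/M/1/K yard (lam = 1/15, mu = 1/33, K = 4).
    lam, mu, K = 1/15, 1/33, 4
    rho = lam / mu                           # 33/15 = 2.2 (rho > 1 is allowed here)
    p0 = (1 - rho) / (1 - rho**(K + 1))
    print(round(p0, 4))                      # -> 0.0237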
9. P-k formula
λ 2{ (T ) ( E (T )) 2 }
Ls λ E (T ) +
2(1 − λ E (T ))
E (T ) = m/μ ⎫
Here ⎬ Erlang distribution
Var(T ) m/μ 2 ⎭
⎛ m ⎛ m⎞ 2 ⎞
λ2 ⎜ 2 + ⎜ ⎟ ⎟
λ ⎝μ ⎝ μ⎠ ⎠
∴ LS = m +
μ ⎛ ⎛ m⎞ ⎞
2 ⎜1 − λ ⎜ ⎟ ⎟
⎝ ⎝ μ⎠⎠
m (1 + m) l 2
= ml + where P = λ μ
2 (1 − ml )
PART B
11. (a) (i) Let X denote discrete r.v
Then its form f is
( = x) C x p x q m− x
mC x = 0,1, 2, ...m.
To find Mgf:
∞
( )= ∑e
M x (t ) = E e tx tx
p( x )
x =∞
m
= ∑e
x=0
tx
mcx p x q m − x
∑ mc ( ) q
x m− x
= x
x=0
( ) + mc q ( )
2
= q m + mc1q m −1 2
m− 2
( )
m
+ +
Mgf = (q + pet)m
Mean:
Mean = E ( x ) = M x′ ( 0 )
M x (t ) = (q
(q pe t )m (1)
Diff. (1) w.r.to ‘t’
M x′ (t ) = m(q
m( q pe t ) m −1 ⋅ pe t (2)
Put t = 0 in eqn. (2)
E ( x ) = M x′ ( ) m
mp
E(X)2
( )
E X 2 = M X′′ ( 0 )
( )
⎡ m −1
(2) ′
X () ⎢⎣ .e t ⎤⎥ (3)
⎦
Diff.(3) w.r.to. ‘t’
( ) ( )
m −1 m
M X′′ ( t ) = mp q + pe t .ee t + e t ( m − 1) q + pe
p t p
Put t = 0
( )
E x 2 = M X′′ ( 0 ) m ⎡⎣1
mp ( m 1) p⎤⎦ m [ mp
mp m + q ] = m2 p 2 + mpq
(∴1 − p = q)
∴Var (x) = E(x2)−{E(x)}2
= m2 p 2 + mpq − ( mp )
2
= mpq
11. (a) (ii) If X denotes the continuous random variable then the pdf of
Weibull distribution is
⎧⎪αβ x β − e −α x β x>0
f ( x) = ⎨
⎩⎪ 0 0.w
1/ β
⎛ 1⎞
Mean = ⎜ ⎟ 1/ β +1
⎝α⎠
{ )}
2/ β
(
2
⎛ 1⎞
Var( x ) / + − / +
⎝α⎠
∞
= ∑e
x =1
tx x −1
q ⋅p
∑ (qe )
p
= t x
q x =1
p
= [q(e t ) + ( qe t ) 2 + ( qe t )3 + ......]
q
p
( q (e t ) ⎡1 + qe t + ( ) + …⎤⎥
2
=
q ⎣ ⎦
(− )
−1
= pet
⎛ 1 ⎞ p
= pet ⎜ t⎟
= −t
⎝ 1 − qe ⎠ e − q
Mgf = MX(t) = p(e−t − q)−1
Mean:
Mean = E ( ) = M X′ ( 0 )
( )
−1
M X (t ) = p e t
q (1)
( ) ( )
−2
M X′ ( t ) = ( −1) p e t
−q e t
( )
−2
M X′ ( t ) = p e −tt e t
q (2)
Put t = 0,
Mean = M X′ ( 0 ) p (1 − q )
−2
p 1
= pp −2 = 2
=
p p
( )
E X2
E(X ) = M
2 ′′
X (0)
( 2) ⇒ M X′ (t ) = p ⎡⎢e − t ( )
−2 ⎤
t
− ⎥⎦
⎣
( ) ( −e ) (e ) ( )⎤⎥⎦
−3 −2
M X′′ ( t ) = p ⎡e −tt ( −2) e t
q t t
−q e t
⎣
Put t = 0
( )
E x 2 = p ⎡ 2 (1 q )
⎣
−33
− (1 − q ) ⎤
2
⎦
⎛ 2 1 ⎞ 2 1
= p⎜ 3 − 2 ⎟ = 2 −
⎝p p ⎠ p p
1 1 1− p q
= 2
− = 2 = 2
p p p p
∴ p+q =1
=
P( x > m + n ) = qm+ n
P ( x > m) qm
= qn = P(x > n)
e−λ λ x
∴ P( X = x) = x = 0,1, 2…
x!
e −5 ( ) 2
P( X )= = 0.0842
2!
12. (a) g( x) = ∫
−∞
f ( x, y )dyy
∫ 2 ( + ) dyy
x
=
0
1
x⎛ 3 y3 ⎞ x
= y+ ⎟ = (1 + 1) = x
2⎝ 3 ⎠ 2
0
⎧x 0 ≤ x ≤ 2
∴ g ( x) = ⎨
⎩0 0.w
∞
h( y) = ∫
−∞
x y ) dxx
f ( x,
∫ 2 ( + ) dxx
x
=
0
2
⎛ x2 ⎞
= ( + ) ⎜ 2⎟
⎝ ⎠0
⎧( y)2
⎪ 0 < y <1
h( y ) = ⎨ 2
⎪
⎩ 0 0.w
f ( xx, y ) x / ( 1 + 3 y )
2
f x y ( x / y) = =
f y ( y) 1// ( 1 + 3 y 2 )
fx/y(x/y) = x 0<x<1
1/ 2
⎛1 1 1⎞
∴P
⎝4
< x < /y = ⎟ =
2 3⎠ ∫
1/ 4
fx y ( x / y = 1 / 3) dxx
1/ 2 1/ 2
⎛ x2 ⎞
= xdx
1/ 4
∫
d =⎜ ⎟
⎝ 2 ⎠ 1/ 4
1⎛1 1 ⎞ 3
= − ⎟ =
2 ⎝ 4 16 ⎠ 32
∞
12. (b) (i) f X ( x) = ∫
−∞
f ( x, y ) dyy
2
⎛ xy ⎞
= ∫⎝x +
2
⎟ dyy
3⎠
0
2
⎛ x ⎛ y2 ⎞ ⎞
= ⎜ x2 y + ⎜ ⎟ ⎟
⎝ 3 ⎝ 2 ⎠⎠
0
⎧ 2 2x
⎪2 x + 0 ≤ x≤1
f X ( x) = ⎨ 3
⎪
⎩ 0 0.w
∞
f y ( y) = ∫
−∞
f ( x, y ) dxx
1
⎛ xy ⎞
= ∫⎝x +
2
⎟ dxx
3⎠
0
1
⎛ x3
x2 y ⎞
= +
⎝ 3 6 ⎟⎠
0
1 y
= +
3 6
⎧1
⎪ ( y) 0 ≤ y
y)
f y ( y) = ⎨ 6
⎪ 0
⎩ 0.w
X & Y are said to be independent
if fX(x).fY(y) = f(x, y)
⎛ 2 2x ⎞ ⎛ 1 ⎞
∴ f X ( x ) fY y ) = x + ⎟ ⎜ ( 2 + y )⎟
⎝ 3 ⎠ ⎝6 ⎠
xy
= f ( x, y )
≠ 2
+
3
∴ X & Y are not independent.
∂( x , y )
∴ g zw ( z , w ) = f xy ( x y )
∂( z , w )
= 6e−(2x+3y) (1) x, y ≥ 0.
= 6e−(2z+2w+3w) z + w ≥ 0, w ≥ 0.
z , w )= ⎧6e −2 z e 5w
z + w ≥ 0, w ≥ 0
g zw ( ⎨
⎩ 0 0.w
To find
∞
gz ( z) =
−∞
∫ f zw ( z , w )ddz
Here w + z ≥ 0, w ≥ 0, → w ≥ 0.
w+z=0
w 0 1
z 0 −1
w w
w≥0
z z
w
+
z=
0
⎧∞
∫
⎪ 6 −2 −5 w
d <0
⎪⎪ − z
=⎨
∞
⎪
∫ −2 z −55 w
⎪ 6 d >0
⎪⎩ 0
⎧ ∞
⎛ −5w ⎞
⎪6e −2 z ⎜ e ⎟ z<0
⎪⎪ ⎝ −5 ⎠ − z
=⎨
∞
⎪ −2 z ⎛ e −5w ⎞
⎪6e ⎜ ⎟ z>0
⎪⎩ ⎝ −5 ⎠ 0
⎧ −2 z ⎛ e5 z ⎞
⎪6 ⎜ 0 − z<0
⎪ ⎝ −5 ⎟⎠
=⎨
⎪ −2 z ⎛ 1⎞
⎪ 6 ⎜⎝ 0 + ⎟⎠ z>0
⎩ 5
⎧ 6 3z
⎪⎪ 5 e z<0
gz ( z) = ⎨
⎪ 6 e −2 z z>0
⎪⎩ 5
A −2 1 B −2 1
p(A) 1/3 2/3 p(B) 1/3 2/3
⎛ 1⎞ ⎛ 2⎞
E ( A) = ∑ Ap( A) = − 2 ⎜ ⎟ + 1 ⎜ ⎟ = 0
⎝ 3⎠ ⎝ 3 ⎠
⎛ 1⎞ ⎛ 1⎞
E ( A2 ) = ∑ A2 p( A) = 4 ⎜ ⎟ + 1 ⎜ ⎟ = 2
⎝ 3⎠ ⎝ 3⎠
Similorly E(B) = 0 & E(B2) = 2
(i) E(X(t)) = E(A cost + B sint)
= cost E(A) + sint E(B) = 0 a constant.
(ii) Rxx(t,t + t ) = E(X(t) × (t + T))
= E(A cost + B sin t) (A cos (t + t)
+ B sin (t + t))
= E(A2 cost cos (t + t) + AB cost sin (t + t) –B
sint cos (t + t)+ B2sint sin(t + t))
= cost(t + t) E(A2) + [cost sin (t + t) + sint cost
(t + t)]E(AB)+ sint sin (t + t) E(B2)
Since A & B are independent.
E(AB) = E(A)⋅ E(B) = 0.
& E(A2) = 2, E(B2) = 2
∴Rxx(t, t + t) = 2 (cos(t + t)cost + sin(t + t) sin t)
= 2 cos(t) which is fn. of T only
∴{x(t)} is WSS process
13. (a) (ii) Poisson Process: If x(t) represents the no. of occurrences of a
certain event in (0,t) then the discrete random process {x(t)} is
called the Poisson process, provided the following postulates are
satisfied.
(i) P(1 occurrence in (t, t + Δt) = lΔt + 0(Δt)
(ii) P (0 occurrence in (t, t + Δt)) = 1−lΔt + 0(Δt)
(iii) P(2 or more occurrence in t, t + Δt) = 0 (Δt)
(iv) X(t) is independent of the no of occurrences of the event in
the any interval prior and after the interval (0,t).
Probability law for Poisson Process X(t)
Let l be the no of occurrences of the event in unit time.
Let pn(t) = p(x(t) = n).
Consider,
pn(t + Δt) = p(x(t + Δt) = n)
dp (t )
⇒ + λ pn (t ) = λ pn −1 (t )
dt
The general solution is given by
pn(t) elt = ∫lpn−1(t) elt dt (1)
If n = 1
p1(t) elt = ∫lp0(t) elt dt (2)
To find p0(t):
p0(t + Δt) = p (0 occurrence in (0,t) & 0 occurrence in t, t
+ Δt)
⇒p0(t + Δt) = p0(t) (1 − lΔt + 0 (Δt))
Omitting 0 (Δt), and take Δt →0,
We get, ′
0 (t ) = − λ p0 (t )
dp (t )
⇒ = − λ p0 ( t )
dt
dp (t )
⇒ = − λ dt
p0 (t )
λ 2t 2
=
2
e − λt (λ t )2
∴ p2 ( t ) =
2!
Proceeding n = 2, 3,… in this way
we get,
e − λt ( t )n
pn (t ) = n = 0,1, 2,…
n!
Auto correlation of Poisson Process:
Rxx(t1, t2) = E(x(t1) × (t2))
= E[x(t1) (x(t2) − x(t1)) + x2(t1)]
= E[x(t1) [x(t2) − x(t1)] + E(x2(t1)]
(t2 ) x(t1 )) + λ 2 t12 + λ t1
E ( x(t1 ) E ( x(t
= ( λ 1 )( λ (t2 − t1 )) + λ 2 12 + λ t1
= λ 2t1t2 − λ 2 2
1 + λ2 2
1 + λ t1
= l2t1 t2 + l min (t1, t2)
13. (b) (ii) Let X1 (t) & X2(t) be two independent of Poisson process with
parameter l1 & l2 respectively.
Let X(t) = X1(t) − X2(t)
E(X(t) = E(X1(t) – X2(t)) = E(X1(t) – E(X2(t))
= l1t − l2t
= (l1 − l2) t
E(X2(t) = E((X1(t) – X2(t))2
( ( )+ ( ) () ( ))
Since X1 (t) X2(t) are independent
= λ12 + λ1t + λ 22 + λ 2 t − 2( λ1 ) ( λ 2 t )
= (l1 − l2)2t + (l1 + l2)t ≠ (l1 − l2)2t1 + ((l1 − l2)t
∴X(t) is not a Poisson process.
1
(i) P(Customers can arrive directly to the space) =
6
(ii) P(arriving customer will to have to wait) = P(Ν > 3)
4 4
⎛ λ⎞ ⎛ 10 ⎞
=⎜ ⎟ =⎜ ⎟
⎝ μ⎠ ⎝ 12 ⎠
1
(iii) Wq WS =
μ
1 1 1
where WS = = =
μ − λ 12 − 10 2
1 1 4
= − = =04
2 10 10
14. (a) (ii) The mgf of the exponential distribution (μ) is (1− t/μ)−1 and hence
the sum of (n + 1) exponential (m) variables is (1− t/μ)n+1 which is
Erlang distribution with parameter μ and (n + 1).
∞
∴ f (w) = ∑ f (w / n) ⋅ p
n= 0
n
∞ n
μ n +1 − μ ⎛ λ ⎞ ⎛1− λ ⎞
= ∑
n= 0
n!
e w ⎜ ⎟ ⎜
⎝ μ ⎠ ⎝ μ ⎟⎠
∞
⎛ λ⎞ (λ w )n
= μe − μw 1 − ⎟
⎝ μ⎠ ∑n= 0
n!
= ( μ − λ )e − μ e λ w
= ( μ − λ )e − ( μ − λ ) w
Put n − c = x,
∴n = x + c
∞
= ∑x p
x=0
x+c
∞
1 λ
= ∑ x .C c
x=0
x
( ) x + c p0
μ
c
⎛ λ⎞
⎜⎝ μ ⎟⎠ p0 ∞
⎛ λ⎞
x
=
C! ∑ x=0
x⎜ ⎟
⎝ μ⎠
c
⎛ λ⎞
⎜⎝ μ ⎟⎠ p0 ⎛ ⎛ λ ⎞ ⎛ λ⎞
2
⎛ λ⎞
3 ⎞
= ⎜ ⎜ ⎟
1 + 2 ⎜ ⎟ + 3 ⎜ ⎟ + …⎟
C ⎜⎝ ⎝ μc ⎠ ⎝ μc ⎠ ⎝ μc ⎠ ⎟⎠
c
⎛ λ⎞
⎜⎝ μ ⎟⎠ P0 ⎛ λ ⎞ ⎛ λ⎞
−2
= 1 −
⎜⎝ μc ⎟⎠ ⎜⎝ μc ⎟⎠
C
1 ( λ / μ )c +1
∴ Lq =
c.c ! (1 − λ / μC ) 2
By little’s formula.
λ
LS Lq +
μ
Lq
Wq =
λ
1
WS Wq +
μ
15 (a) Let arrivals follow Poisson process with rate of arrival l. Service
times are independently and identically distributed random vari-
ables with an general distribution with pdf f(t) and the service time T
between two departures.
Let N(t) be the no. of customers in the system at time t 0. Let
tn be the time instant at which nth customer completes service and
departs.
Let X(t) = N(tn) , n =1,2,3… then Xn represent the no. of customers
in the system when the nth customer departs and sequence of random
variables. { xn: n = 1,2,3…} is Markov chain.
⎧Xn A if X n ≥ 1
X n +1 = ⎨
⎩ A if X n = 0
∴ X n2+ = X n2 + U ( X n ) + A2 − X n − 2A
AU ( X n ) + AX n
A
2( n n)
2
n
2
n +1 U( n ) 2
2 AU ( X n )
2 ( n ) (1 A)) ( ) (
n n ) (U ( )) ( A )n
2
2E ( A) E (UXn )
UX
E ( A) + E ( A2 ) − 2 E 2 ( A)
⇒ E( X n ) = (2)
2(( − E ( A))
But E(A) = E(E(A/T)) = E(lT) = lE(T)
E(A2) = E(E(A2/T)) = E((lT)2 +lT)
= lE(T2) + lE(T)
Sub in (2),
λ E((T ) + λ 2 E(T
E((T 2 ) + λ E((T ) − ( λ E (T )) 2
E
E( X n ) = (3)
2(( λ E (T ))
But E(T2) = Var(T) + (E(T))2
λ2( (T ) {E ( )) 2 )
( ) ⇒ LS = λ E ( ) +
2(( λ (T ))
This is called P-k-formula.
(X, Y ) to given to
4. Let the joint pdf of the random variable (X
4 xye −(( x ; x > 0 and y > 0. Are X and Y independent? Why or
2
y2 )
f ( x, y )
why not?
PART B (5 ë 16 = 80 marks)
11. (a) (i) The distribution function of a random variable X is given
by F(X) = 1 − (1 + x)e−x ; x ≥ 0. Find the density function,
mean and variance of X. (8)
(ii) A coin is tossed until the first head occurs. Assuming that
the tosses are independent and the probability of a head
occurring is ‘p’. Find the value of ‘p’ so that the probability
that an odd number of tosses required is equal to 0.6. Can
you find a value of ‘p’ so that the probability is 0.5 that
an odd number of tosses are required? (8)
or
(b) (i) If X is a random variable with a continuous distribution
function F(X), prove that Y = F(X) has a uniform
distribution in (0,1). Further if
⎧1
⎪ ( x 1);
1);
) 1 x 3,
f (X ) = ⎨2
⎪⎩ 0; otherwise
find the range of Y corresponding to the range 1.1 ≤ x ≤
2.9. (8)
(b) (ii) The time (in hours) required to repair a machine is
1
exponentially distributed with parameter l = .
2
What is the probability that the repair time exceeds 2h?
What is the conditional probability that a repair takes at
least 10h given that its duration exceeds 9h? (8)
12. (a) (i) Given f(x, y) = cx(x − y), 0 < x < 2, −x < y < x and ‘0’
elsewhere. Evaluate ‘c’ and find fx(x) and fy(y). (8)
=( -a
− 2a
).
2. X follows poisson distribution with mean l = 1.
e−λ λ x
W.K.T P ( X = x ) =
x!
e −1 (1) 2 1
∴ P ( X = 2) = =
2! 2e
3. If there is no linear correlation between X and Y i.e., γ XY = 0.
The equations of the regression lines becomes y = y and x = x .
Angle between two lines are 90°.
∞
∫ xye − x .e
2
y2
4. f x dy
0
= 2 xe − x ;
2
> 0.
− y2
IIIly f Y ( y ) 2 ye ; y > 0.
x2 y2
= 4 xye −(( x
2
y2 )
Now f X ( x ). f Y ( y ) xe .2 ye
= f XY ( x, y ).
Hence X and Y are independent.
e − ltt ( t ) x
5. Let P { X (t ) x} = ; x = 0, 1, 2 …
x!
∴E{ } = llt ≠ a constant.
∴The Poisson process is not stationary.
6. If the transition probability does not depend on the step then the Markov
chain is called a homogeneous Markov chain.
1 1
7. Given l = ; m=
12 4
The average number of persons waiting in the system
λ
E( N ) =
μ−λ
⎛ 1⎞
⎜⎝ ⎟⎠
12
= = 0.5.
(1 / 4) − (1 / 12)
8.
l l
l
n−1 n
0 1 2
m 2m n
nm
PART B
11. (a) (i) By the property of F(x), the pdf f(x) is given by f(x) = F (x) at
points of continuity of F(x).
The given pdf is continuous for x ≥ 0.
∴ f ( x ) = (1 + x ) e − x − e − x = xe − x , x ≥ 0
∞
E ( X ) = ∫ x 2 e − x dx = 2
0
∞
E ( X 2 ) = ∫ x 3 e − x dx = 6
0
V ( X ) = E ( X 2 ) − [ E( X ) ] = 2.
2
11. (a) (ii) Let X denote the number of tosses required to get the first head
(success). Then X follows a geometric distribution given by
P( X ) = Pq r −1 ; r = 1, 2, …,
P
= 2 ( q 2 + q 4 + q6 + ...)
q
P q2 1
= . = (since p+ q = 1)
q 1− q
2 2
1+ q
1 1
= = 0.6, if =06
1+ q 2− p
0.66 p = 0.2
p = 1/ 3
1 1
= 0.5, if =05
1+ q 2 p
p
1− = 1
2
p = 0.
Though we get p = 0, if is meaningless because
P(X = an odd number)
∞
= ∑ Pq 2 r − 2
r =1
=0, h p = 0.
Hence the value of p cannot be found out.
11. (b) (i) The distribution function of Y is given by
G y ( y ) = p(Y ≤ y)
= p { F(X)
X y}
=p X ≤F{ 1
( y )}
[The inverse exists, as F(x) is non-decreasing and continuous]
= F [ F −1 ( y )][∴ p { ≤ } = F ( x)]
= y.
Therefore, the density function of Y is given by
d
gy ( ⎡G y ( y ) ⎤⎦ .
ddy ⎣
Also the range of Y is 0 ≤ y ≤ 1. Since the range of F(x) is
(0,1). Therefore, Y follows a uniform distribution in (0,1).
⎧1
⎪ ( x ), 1 ≤ x ≤ 3
When f ( x ) = ⎨ 2
⎪⎩ 0 otherwise
x
1 1
F ( x ) = ∫ ((xx )ddx (x )2 .
1
2 4
1
Since Y = F(X), Y ( X − 1) 2
4
1 1
∴ When 1.1 ≤ X ≤ 2.9, (1.1 −1) 1) 2 ( 2.9 1) 2 .
4 4
(i.e.,) the required range of Y is 0.0025 ≤ Y ≤ 0.9025.
1 − 2x
11. (b) (ii) f ( x ) = λ e − λ x e , x>0
2
∞
1
a) p( X )=∫ e x 2
ddx
2
2
( )
∞
= −e x
e −1 = 0.3679.
2
X
Y=
X
0
Y= X=2
−X
Here the range space is the area within the triangle OAB(shown
in figure), defined by 0 < x < 2 and − x < y < x.
∫∫
Δ OAB
cx( x y )dx dy = 1
2 x
∫ ∫ cx( x
0 −x
y )dy dx = 1
i.e., 8C = 1
C = l/8.
x
1
(b) f X ( x ) = ∫ 8 x( x
−x
y )ddy
x3
= , i 0 < x < 2.
4
2
1
(c) fY ( y ) = ∫ 8 X (x
−y
y )ddx in 2≤ y≤0
2
1
= ∫ x( x y )ddx in 0 ≤ y ≤ 2.
y
8
⎧1 Y 5 3
⎪⎪ 3 − 4 + 48 y in 2 ≤ y ≤ 0
i.e., fY ( y ) = ⎨
⎪ 1 − Y + 1 Y 3 i 0 y ≤ 2.
⎪⎩ 3 48
∑x i 34 ∑ y = 90 i
∑x 2
i 248 ∑ y = 1446 2
i
∑x y i i = 582
n∑ xy
x ∑ x.∑ y
rXY =
(n x − x ) / ( n∑ y ( ∑ y ))
2 2
6 × 582 − 34 × 90
=
(6 × 248 − ( ) 2 )(6 × 1446 − ( )2 )
432
= = 0.9879.
332 × 576
12. (b) (i) If m is the common mean, the point (m, m) less on y = ax + b and
x = cy + d [θ they intersect at x y ]( )
∴ μ = μ+b ...(1)
μ = μ+d ...(2)
b
From(1), μ =
1− a
d
From(2), μ =
1− c
b d
=
1 a 1− c
b 1− a
∴ = .
d 1− c
σ 2Y bYX a σY a
Now = = ∴ = .
σ 2 X bXY c σX c
⎧ −30 S − 150 10 ⎫
∴ P( ≤ Sn ≤ ) = P⎨ ≤ n ≤ ⎬
⎩ 150 150 150 ⎭
= P {−2.45 ≤ Z ≤ 0.855}
where Z is the standard normal variable
= 0.4927 + 0.2939 (from the normal tables)
= 0.7866.
i) (T > 1)) ∫
2t 2
.135
1
2
( ≤ 4)) = ∫ 2
iii) (T −2
= 1− −8
0.999.
0
⎛ 20 3 ⎞
p = ⎜ 23 23 ⎟ .
⎜ ⎟
⎝14 / 15 1 / 15⎠
322 p 1
⇒ 0 =
45
Sub in (1)
367 p1 = 45 ⇒ p1 = 0.123
∴ p0 = 0.877
ie., p0 = 87.7%
and p1 = 12.3%
where p0 = fraction of signals that are recognizable
p1 = fraction of signals that are highly distorted.
13. (b) (i) Refer Nov/Dec 2009 13 (b) (i)
13. (b) (ii) Refer Nov/Dec 2009 13 (b) (ii)
14. (a) l = 6/minute; m = 8/minute.
1
a) E ( ) =
μ λ
1 1
= = min .
8−6 2
∴ E(total time required to purchase the ticket and to reach the seat)
1 1
= + 1 = 2 min .
2 2
Hence he can just be seated for the start of the picture.
b) P(total time < 2 min)
⎛ 1⎞ ⎛ 1⎞
= P ω < ⎟ = 1− P ⎜ω > ⎟
⎝ 2⎠ ⎝ 2⎠
⎛ λ⎞
− μ ⎜ 1− ⎟ 1
⎝ μ⎠
= 1− e ×
2
= 1 − e −1 = 0.63
P (ω < t ) = .99
P (ω > t ) = 0.01
e −( μ λ )t
= 0.1
−2t = log( 0. ) = −2.3
t = 1.15 min
in .
(i.e) P(ticket purchasing time < 1.15) = 0.99
1⎧ λ⎫
= ⎨E(N q ) + ⎬
λ⎩ μ⎭
1
= 3.5078 + 2.5
15
= 0.4005h or 24 min, nearly.
⎧ ⎡ ⎛ λ⎞⎤⎫
⎪1 + ( λ μμ)) 3 ⎢1− e −μμt ⎜ S − 1 − ⎟ ⎥ ⎪
⎪ ⎣ ⎝ μ⎠ ⎦⎪
P ( W > t ) = e −μμt ⎨ ⎬
⎪ s! ⎛ 1 − λ ⎞ . ⎛ S − 1 − λ ⎞ ⎪
⎪ ⎜⎝ μ S ⎟⎠ ⎜⎝ μ ⎟⎠ ⎪
⎩ ⎭
− 2X − 0 5
⎡ 1− e × 0.0449 ⎤
⎛ 1 ⎞ −6 / 3 ⎢1 + ( 2.5)3 ⎥
P⎜W> ⎟ = e ⎛ 2 5⎞
⎝ 3⎠ ⎢ 6 1− ⎟ ( 0.5) ⎥
⎢⎣ ⎝ 3 ⎠ ⎥⎦
⎡ 0.7016 (1 − e) ⎤
= e − 2 ⎢1 +
⎣ ( − 0.5) ⎥⎦
= 0.4616
3. Give a real life example each for positive correlation and negative
correlation.
PART B (5 ë 16 = 80 marks)
or
(b) (i) Describe the situations in which geometric distributions could be
used. Obtain its moment generating function.
⎛ n − 1⎞ r n − r
P[ x n] = ⎜ p q n ≥ r.
⎝ r − 1⎟⎠
12. (a) (i) Suppose that X and Y are independent non-negative continuous
random variables having densities fX(x) and fY(y) respectively.
Compute P[X < Y].
(b) (i) If the correlation coefficient is 0, then can we conclude that they
are independent? Justify your answer, through an example. What
about the converse?
(ii) Three out of every four trucks on the road are followed by a
car, while only one out of every five cars is followed by a truck.
What fraction of vehicles on the road are trucks?
14. (a) Define birth and death process. Obtain its steady state probabilities.
How it could be used to find the steady state solution for the M/M/1
model? Why is it called geometric?
or
Mean = E(X) = ∑ x pq
x =1
x −1
∞
= p ∑ x. pq
x =1
x −1
= p{ + q + q 2 + }
1
= p[ − q]−2 = p[ p −2 ] =
p
q
(or) If p( X x) pq x x = 0,1, 2 , then E ( x ) =
P
2 (i) If X is a discrete/ continuous random variable, then for
any two +ve integers m, n with m > n.
[ > m + n/ X > m] = P[X
p[X [ > n]
In other words, future value depends on present not on past is called
memoryless property.
(ii) In continuous distributions, Exponential distribution follows this prop-
erty.
⎡1 / 2 1 / 2 0 ⎤
6. Given P = ⎢1 / 2 1 / 4 1 / 4 ⎥
⎢ ⎥
⎢ 0 1 / 3 2 / 3⎥
⎣ ⎦
⎡1 / 2 1 / 2 0 ⎤ ⎡1 / 2 1 / 2 0 ⎤
⎢ ⎥⎢ ⎥
P = ⎢1 / 2 1 / 4 1 / 4 ⎥ ⎢1 / 2 1 / 4 1 / 4 ⎥
2
⎢ 0 1 / 3 2 / 3⎥ ⎢ 0 1 / 3 2 / 3⎥
⎣ ⎦⎣ ⎦
⎡2 / 4 3 / 8 1/ 8 ⎤
⎢ ⎥
= ⎢ 3 / 8 19 / 48 11 / 48 ⎥
⎢ 1 / 6 11 / 36 19 / 36 ⎥
⎣ ⎦
As pij > 0, ν i j and for some ‘n’, the tpm is irreducible.
( n)
1
7. Given λ = per minute
12
1
μ = per minute
8
λ 1 / 12
Average number of customers in the system = Ls = = =2
μ−λ 1 1
−
8 12
8. M stands for Markovian, i,e., Arrival follows Poisson & service time
follows exponential.
9.
Open network Closed network
1. Arrivals from 1. New customers
outside to the cannot enter in
node is allowed. to the system.
2. Once a customer 2. Existing cus-
gets the service tomers can-
completed at not leave the
node i, he joins system.
the queue at node
j with probabil-
ity pij or leaves
the system with
probability pio
10. M/G/1 is a non –Markovian queueing models, as the service time follows
general distribution.
PART –B
e−λ λ x
11. (a) (i) If X N P( λ ), P(X = x) = P(x) = ,
Lx
x = 0,1, 2… ∞
∞
M.G.F = M X (t ) = ∑e x=0
tx
p( x )
∞
e−λ λ x
= ∑e
x=0
tx
Lx
∞
(λ et ) x λ et ( λ et )2
= e−λ ∑
x=0 Lx
= e − λ {1 +
L1
+
L2
+ ....}
−λ λ et
= e λ [e − ]
t
=e .e
d d
Mean = E ( X ) = [ M x (t )}t = 0 = [e λ ( e − ) ]t = 0
t
dt dt
d
= [e λ e e − λ ]t = 0 = [e − λ .λ e t .e λ e ]t = 0
t t
dt
=λ
d
E ( X 2 ) = M X " (0) = ⎡ λ e − λ e λe et ⎤
t
dt ⎣ ⎦t =0
= λ e − λ [e λ e .e t + e t .λ e t .e λ e ]
t t
= λ + λ2
Var(X) = E(X2) = [E(X)]2 = l + l2 − l2 = l
(ii)
⎧ce −2 x , 0 < x < ∞
f(x) = ⎨
⎩ 0 , x<0
∞
Since f(x) is a pdf, ∫ −∞
−
f ( x )dx = 1.
∞
i.e., ∫ 0
Ce −2 x dx = 1.
∞
⎧ e −2 x ⎫
⇒C⎨ ⎬ =1
⎩ −2 ⎭ 0
⎛ −C ⎞
⇒⎜ [ −1] = 1
⎝ 2 ⎟⎠
⇒ (C = ) ∴ f ( x ) = 2e −2 x
∞ ∞ ∞ ∞
⎡ e −2 x ⎤
]= ∫ d = 2e∫ ∫ dx = 2 ⎢
2x 2x
P[ X f ( x )dx ddx e ⎥ (5)
2 2 2 ⎣ −2 ⎦ 2
−44
= ( −1)[
) 0−e ]= e 4.
11. (b) (i) Geometric distribution could be used in the situations in which
probability of number of trials required to get first success.
If X N geometric distribution, P(X = x) = pq x −1 , x = 1,2..
∞
M X (t ) = ∑e
x =1
tx
p( x )
∞
= ∑e
x =1
tx
pq x −1
∑ (qe )
p
= t x
q x =1
p
= {qe
{ qqe t ( qe t ) 2 }
q
p
= × qe t { t
}
q
t −1
= pe t { }
t
pe
MX ( ) =
1 − qe t
(ii) Given, A coin having probability of coming up heads is succes-
sively flipped until rth head appears.
P[X = n] = n−1Cr−1 pr qn−r, n ≥ r is a negative binomial distribu-
tion.
Let X = number of heads while tossing a coin in n trials.
As n trials are independent, getting head in each trial is
of probability ‘p’.
C x p x q n − x , x = 0,1, 2.....
∴ X N B( , p) ⇒ P ( X = x ) = nC
Where q = 1−p.
It is clear that, is the first n−1 flips (trials), we must have r−1
P[ X n] [n C pr 1qn r ] [ p]
ie., p[ X n ] n −1C pr q n − r
12. (a) (i) Given X & Y are independent & f (x) = P(X = x) = fX (x) & f (y) =
P(Y = y) = fY (y)
Since X & Y are independent, f(x,y) = f(x). f(y)
∴P ( X < Y = ∫∫ f (x , y )dxdy
x< y
∞y
= ∫ ∫ f x ) f y ) dx dy
0 0
⎧ 1 − xy
⎪ ye , 0 < x < 0 < y < 2
(ii) f x , y ) = ⎨ 2
⎪⎩ 0 ; otherwise
f x , y = 1) e − x / 2 − x
P( X / Y f (x / y )= = =e
f y) 1
2
Now,
∞
e−x
f y) = ∫
−∞
f x, y d
dx f (x , y ) = ( / 2)(1)e − x ( ) =
2
∞
1
= ∫ ye − xy dx
−∞
2
∞
y ⎧ e − xy ⎫ 1 1
= ⎨ ⎬ = − [0 − 1] =
2 ⎩− y ⎭ 2 2
0
Let U = X+Y, V = Y
u v
i e., x= ,v = y
2
∂x 1 ∂x ∂x ∂x 1
= =0 0
∂u 2 ∂v ∂
&J = u ∂yy 2 1
= =
∂x 1 ∂y ∂x ∂yy 1 2
=− =1 − 1
∂v 2 ∂v ∂v ∂v 2
∴ f(u,v) = |J|.f(x,y)
=1/2.1
⎧1
⎪ ; 0 ≤ u −v≤1
⇒f u
u, ) = ⎨ 2
⎪⎩ & 0 ≤ v ≤ 1
0 1 2 3
0 ⎛0 0 1 / 2 1 / 2⎞
13. (a) (i) Let P = 1 ⎜ 1 0 0 0 ⎟
⎜ ⎟
2 ⎜0 1 0 0 ⎟
⎜ ⎟
3 ⎝0 1 0 0 ⎠
0 1
1/2 1/2 1
1
2 3
1 1
∴F
F f 00( ) + f ( )
f 00( ) = 0 + 0 + .1.1 + + .1.1
2 2
=1
1 1
& 11 = f111(1) + f11(2) + f111(3) = 0 + 0 + .1.1 + 1. .1
2 2
=1
1 1 1
= 0 0 + 1. 1. . + 1. 1. . 1. 1.
2 2 2
1 1 1
= + + +…
2 4 8
1/ 2
= =1
1
1−
2
Also ⇒ F222 = 1
0 1 2 3
0 ⎛ P1 0 1 P1 0 ⎞
1 ⎜ P2 0 1 P2 0 ⎟
P= ⎜ ⎟
2⎜0 3 0 1 P3 ⎟
⎜ ⎟
3⎝0 4 0 1 P4 ⎠
Where , P1 is the probability that if it has rained for the past two
days then it will rain tomorrow.
Similarly P2 is the probability that if it has rained today but not
yesterday.
Similarly P3 is the probability that if it rained yesterday but not
today then it will rain tomorrow.
& P4 is the probability that if it has not rained in the past two
days, it will rain tomorrow.
13. (b) (i) statement : If P is the tpm of a homogeneous Markov chain,
then the n-step tpm P(n)is equal to Pn
i.e., Pij(n)=[Pij]n
It is clear that
Proof: p j ( ) p{xp{x j / x0 i}, as the chain is homogeneous.
The state ‘j’ can be reached from the state i in two steps through
some intermediate state k.
Now,
Pij ( )
P{X
P{X
{X 2 j / X0 i}
= P{X
X2 j, X1 = k / X o i}
= P{ X 2 j / X1 = k , X o i}. P { X1 k / X 0 i}
= pkj (1)
pik (1)
= pik pkj
i.e., the ijth the element of 2-step tpm = the ijthelement of the
product of the two one step tpm’s.
i.e., p ( ) = p2
Now,
p j( )
P{X
P {X
{X j / X0 i}
= ∑ P { X3 j / X2 = k} P {X
{ X 2 k / X 0 i}
k
= ∑ p j pik ( )
= ∑ pik ( )
pkj
k
Similarly p j ( )
∑p
k
ik . pkj ( )
i.e., P ( )
P ( ) .P P . P 2 P 3
proceeding in this way, we get
P (n ) P n
13. (b) (ii) The tpm of the Markov chain is
C T
C ⎡1 / 4 3 / 4 ⎤
⎢ P= ⎥
T ⎣1 / 5 3 / 5 ⎦
It is given that three out of every four trucks on the r0ad are
followed by a car, while only one out of every five cars is fol-
lowed a truck. The fraction of vehicles on the road are truck is
given by
π P = π & π1 + π 2 =1
i.e.,
⎛ 1// / 4⎞
( 1 ,π 2 ) ⎜ = ( 1 ,π 2 )
⎝ 1// / 5⎟⎠
1 1
π1 + π 2 = π1 ()
4 5
3 4
π1 + π 2 = π 2 (2)
4 5
1 3
1 ⇒ π 2 = π1
5 4
15
⇒ π 2 = π1
4
15
∵ π1 + π 2 =1 π1 + π1 = 1
4
⎛ 19 ⎞
⇒ π1 ⎜ ⎟ = 1
⎝ 4⎠
⎛ 4⎞ ⎛ 15 ⎞
⇒ 1 = ⎜ ⎟ & π2 = ⎜ ⎟
⎝ 19 ⎠ ⎝ 19 ⎠
(vi) P[0 deaths in (t, t + Δt)] = 1 − μn(t) Δt + o(Δt)
(vii) P[2 or more deaths in (t, t + Δt)] = o(Δt)
(viii) Deaths occurring in (t, t + Δt) are independent of the time since the last death.
(ix) Births and deaths occur independently of each other at any time.
14. (b) Let Pn(t) = P[X(t) = n] be the probability that the size of the population is 'n' at time 't'.
Consider Pn(t + Δt) = P(no birth or death in (t, t + Δt)) + P(1 birth and no death in (t, t + Δt)) + P(no birth and one death in (t, t + Δt)) + P(1 birth and 1 death in (t, t + Δt))
∴ Pn(t + Δt) = Pn(t)(1 − λn Δt)(1 − μn Δt) + Pn−1(t) λn−1 Δt (1 − μn−1 Δt)
 + Pn+1(t)(1 − λn+1 Δt)(μn+1 Δt) + Pn(t)(λn Δt)(μn Δt)
lim_{Δt→0} [Pn(t + Δt) − Pn(t)] / Δt = λn−1 Pn−1(t) − (λn + μn) Pn(t) + μn+1 Pn+1(t)
i.e., Pn′(t) = λn−1 Pn−1(t) − (λn + μn) Pn(t) + μn+1 Pn+1(t), n > 0   (1)
∴ P0′(t) = −λ0 P0(t) + μ1 P1(t)   (2)
In the steady state, Pn(t) is independent of time 't', so Pn′(t) = 0 and P0′(t) = 0.
∴ (1) & (2) ⇒
λn−1 Pn−1 − (λn + μn) Pn + μn+1 Pn+1 = 0   (3)
−λ0 P0 + μ1 P1 = 0   (4)
From (4) ⇒ P1 = (λ0/μ1) P0
When n = 1, (3) ⇒
λ0 P0 − (λ1 + μ1) P1 + μ2 P2 = 0
⇒ P2 = (λ0 λ1)/(μ1 μ2) P0
In the same way we get P3 = (λ0 λ1 λ2)/(μ1 μ2 μ3) P0, etc.
Thus Pn = [λ0 λ1 … λn−1 / (μ1 μ2 … μn)] · P0
Since Σ_{n=0}^{∞} Pn = 1,
P0 + Σ_{n=1}^{∞} [λ0 λ1 … λn−1 / (μ1 μ2 … μn)] P0 = 1
⇒ P0 = 1 / [1 + Σ_{n=1}^{∞} λ0 λ1 … λn−1 / (μ1 μ2 … μn)]
When λn = λ and μn = μ for all n,
P0 = 1 / Σ_{n=0}^{∞} (λ/μ)^n = 1 / (1 − λ/μ)^(−1)
∴ P0 = 1 − λ/μ and Pn = (λ/μ)^n (1 − λ/μ), n = 0, 1, 2, …
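A small numerical sketch of the product formula just derived may help; the Python code below is illustrative only (it truncates the infinite sum at an assumed cut-off N) and, for constant rates, compares the result with the M/M/1 expression Pn = (1 − ρ)ρⁿ.

# Steady-state probabilities of a birth-death chain via
# P_n = (lam_0 ... lam_{n-1}) / (mu_1 ... mu_n) * P_0, truncated at N states.
import numpy as np

def birth_death_steady_state(lam, mu, N=200):
    """lam[n] = birth rate in state n (n >= 0), mu[n] = death rate in state n (n >= 1)."""
    ratios = np.ones(N + 1)
    for n in range(1, N + 1):
        ratios[n] = ratios[n - 1] * lam[n - 1] / mu[n]
    p0 = 1.0 / ratios.sum()
    return p0 * ratios

# M/M/1 special case: lam_n = lam, mu_n = mu  (example rates lam = 3, mu = 5)
lam, mu, N = 3.0, 5.0, 200
p = birth_death_steady_state([lam] * N, [0.0] + [mu] * N, N)
rho = lam / mu
print(p[:3], [(1 - rho) * rho**n for n in range(3)])   # the two agree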
15. (a) Let N and N′ be the number of customers in the system at times t and t + T, when two consecutive customers have just left the system after getting service.
T → random service time, which is a continuous random variable
f(t) → pdf of T
E(T) → mean of T & Var(T) → variance of T
Let M be the number of customers arriving in the system during the service time T.
∴ N′ = M, if N = 0
     = N − 1 + M, if N > 0
∴ N′ = N − 1 + M + δ   (1)   where δ = 1 if N = 0 and δ = 0 if N > 0
∴ E(N′) = E(N) − 1 + E(M) + E(δ)   (2)
When the system is in steady state, the distribution of the number of customers in the system is constant.
∴ E(N) = E(N′)   (3)   and E(N²) = E(N′²)   (4)
Squaring both sides of (1), we get
N′² = N² + (M − 1)² + δ² + 2N(M − 1) + 2(M − 1)δ + 2Nδ   (5)
Now δ² = δ (since if δ = 0, δ² = 0 and if δ = 1, δ² = 1)
& Nδ = 0 × 1 = 0 if N = 0; N × 0 = 0 if N ≠ 0. ∴ Nδ = 0.
Substituting in (5),
N′² = N² + (M − 1)² + δ + 2N(M − 1) + 2(M − 1)δ
⇒ −2N(M − 1) = N² − N′² + M² − 2M + 1 + δ + 2Mδ − 2δ
∴ 2E[N(1 − M)] = E(M²) − 2E(M) + 1 + [2E(M) − 1]E(δ)   (by independence and by (4))
⇒ 2E(N)[1 − E(M)] = E(M²) − 2E(M) + 1 + [2E(M) − 1]E(δ)
∴ E(N) = {E(M²) − 2E(M) + 1 + [2E(M) − 1]E(δ)} / {2[1 − E(M)]}
= [E(M²) − 2E²(M) + E(M)] / {2[1 − E(M)]}   (6)   [∵ E(δ) = 1 − E(M), by (2) and (3)]
Since the number of arrivals M in time T follows a Poisson process with parameter λ, we have
E(M) = λE(T) & E(M²) = λ²[Var(T) + E²(T)] + λE(T)
Substituting in (6), we get
Ls = E(N) = {λ²Var(T) + λ²E²(T) + λE(T) − 2λ²E²(T) + λE(T)} / {2[1 − λE(T)]}
∴ Ls = λE(T) + λ²[Var(T) + E²(T)] / {2[1 − λE(T)]}
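The Pollaczek–Khintchine mean just obtained is easy to evaluate numerically. The Python sketch below is an illustration (the rates used are made-up); as a sanity check it reproduces the M/M/1 value Ls = ρ/(1 − ρ) when the service time is exponential, i.e., when Var(T) = E(T)².

# Pollaczek-Khintchine mean number in system for an M/G/1 queue.
def pk_mean_number(lam, mean_service, var_service):
    rho = lam * mean_service                     # lambda * E(T), must be < 1
    return rho + lam**2 * (var_service + mean_service**2) / (2 * (1 - rho))

# Example: exponential service (Var(T) = E(T)^2) with lam = 4, E(T) = 0.2
print(pk_mean_number(4, 0.2, 0.04))              # 4.0, which equals rho/(1-rho) = 0.8/0.2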
(b)
Queue networks can be regarded as a group of ‘k’ inter-connected
nodes, where each node represents a service facility of some kind
with Si servers at node i (Si ≥ 1)
(i) Consider a two-server system in which customers arrive at Poisson rate λ at server 1. After being served by server 1 they then join the queue in front of server 2. Suppose there is infinite waiting space at both servers. Each server serves one customer at a time, server i taking an exponential time with rate μi for a service, i = 1, 2. Such a system is called a tandem system.
(ii) Consider a system of k servers. Customers arrive from outside the system to server i, i = 1, 2, …, k, in accordance with independent Poisson processes at rate ri; they then join the queue at i until their turn at service comes. Once a customer is served by server i, he then joins the queue in front of server j, j = 1, …, k, with probability Pij.
(iii) Jackson’s open network concept can be extended when the
nodes are multi server nodes. In this case the network behaves
as if each node is an independent M/M/S model.
2. If the range of X is the set {0, 1, 2, 3, 4} and P[X = x] = 0.2, determine the
mean and variance of the random variable.
5. Determine the value of c such that the function f(x, y) = cxy, 0 < x < 3 and 0 < y < 3, satisfies the properties of a joint probability density function.
10. Write Pollaczek-Khintchine formula for the case when service time dis-
tribution is Erlang distribution with K phases.
PART B (5 × 16 = 80 marks)
11. (a) Customers are used to evaluate preliminary product designs.
In the past, 95% of highly successful products received good
reviews, 60% of moderately successful products received
good reviews and 10% of poor products received good
reviews. In addition, 40% of products have been highly
successful, 35% have been moderately successful and 25%
have been poor products.
(i) What is the probability that a product attains a good review? (6)
(ii) If a new design attains a good review, what is the
probability that it will be a highly successful product? (5)
(iii) If a product does not attain a good review, what is the
probability that it will be a highly successful product? (5)
or
(b) (i) Obtain the moment generating function of the random
variable X having probability density function
⎧ x, 0 ≤ x ≤ 1
⎪
f ( x ) = ⎨ 2 − x , 1 ≤ x ≤ 2. (8)
⎪ 0, otherwise
⎩
(b) (ii) A fair coin is tossed three times. Let X be the number
of tails appearing. Find the probability distribution of X.
And also calculate E(X). (8)
12. (a) (i) Derive the mean and variance of a Binomial random
variable with parameters n and p. (10)
(ii) Suppose that X is a negative binomial random variable
with p = 0.2 and r = 4. Determine the mean of X. (6)
or
(b) (i) The time between process problems in a manufacturing
line is exponentially distributed with a mean of 30 days.
What is the expected time until the fourth problem? (4)
13. (a) Determine the value of C that makes the function F(x, y) =
C(x + y) a joint probability density function over the range 0
< x < 3 and x < y < x + 2. Also determine the following.
(i) P (X < 1, Y < 2) (8)
(ii) P(Y > 2) (4)
(iii) E[X] (4)
or
(b) (i) A fair coin is tossed 10 times. Find the probability of
getting 3 or 4 or 5 heads using central limit theorem. (6)
14. (a) Show that the random process X(t) = A sin(ωt + θ) is a wide-sense stationary process, where A and ω are constants and θ is uniformly distributed in (0, 2π). (16)
or
(b) Define Poisson process and obtain the probability distribution
for that. Also find the auto correlation function for the process.
(16)
15. (a) (i) For the (M | M | 1) : (GD | ∞ | ∞), derive the expression for Lq. (6)
(ii) Patients arrive at a clinic according to Poisson
distribution at a rate of 30 patients per hour. The waiting
room does not accommodate more than 14 patients.
Examination time per patient is exponential with mean
rate of 20 per hour.
(1) What is the probability that an arriving patient does
not have to wait?
4. Given X ~ exp(1/10), i.e., f(x) = (1/10)e^(−x/10), x > 0;
mean = 1/λ = 10 ⇒ λ = 1/10 = 0.1
5. i.e., ∫₀³ ∫₀³ c·xy dx dy = 1
⇒ c ∫₀³ y [x²/2]₀³ dy = 1
⇒ (9c/2) ∫₀³ y dy = 1
⇒ (9c/2) [y²/2]₀³ = 1
⇒ 81c/4 = 1
⇒ c = 4/81
6. cov(x, y) = E(xy) − E(x)E(y)
Correlation coefficient of x and y: rxy = cov(x, y)/(σx σy)
7. Given x(t) = cos(t + φ), φ ~ U[−π/2, π/2]
⇒ f(φ) = 1/π, −π/2 ≤ φ ≤ π/2
Consider E[x(t)] = E[cos(t + φ)]
= ∫ cos(t + φ) f(φ) dφ
= (1/π) ∫_{−π/2}^{π/2} cos(t + φ) dφ
= (1/π) {sin(t + φ)}_{−π/2}^{π/2}
= (1/π) {sin(t + π/2) − sin(t − π/2)}
= (1/π) {cos t + cos t}
= (2 cos t)/π
As E[x(t)] depends on 't', {x(t)} is not stationary.
8. P = ( 0 1 ; 1 0 ); let the states of P be {0, 1}.
where P0 = (1 − λ/μ) / [1 − (λ/μ)^(N+1)]
10. If the service time is Erlang with parameters k and μ, so that E(T) = k/μ and V(T) = k/μ², then, writing ρ = λ/μ, the P-K formula reduces to
Ls = kρ + k(k + 1)ρ² / [2(1 − kρ)]
PART-B
11. (a) Let
A = highly successful products, B = moderately successful products, C = poor products,
G = the event that a product attains a good review.
Then P(A) = 0.4, P(B) = 0.35, P(C) = 0.25,
P(G/A) = 0.95, P(G/B) = 0.6, P(G/C) = 0.1.
(i) P(G) = P(A)P(G/A) + P(B)P(G/B) + P(C)P(G/C)
= 0.38 + 0.21 + 0.025 = 0.615
(ii) P(A/G) = P(A)P(G/A) / P(G) = 0.38/0.615 = 0.618 ≈ 0.62
(iii) P(A/G̅) = P(A)P(G̅/A) / P(G̅) = (0.4 × 0.05)/(1 − 0.615)
= 0.02/0.385 = 0.052
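The total-probability and Bayes computations above are quick to verify mechanically; the Python snippet below is only an illustrative check using the priors and likelihoods given in the question.

# Bayes check for the product-review problem.
priors = {"high": 0.40, "moderate": 0.35, "poor": 0.25}
p_good = {"high": 0.95, "moderate": 0.60, "poor": 0.10}

p_g = sum(priors[k] * p_good[k] for k in priors)                    # P(G) = 0.615
p_high_given_g = priors["high"] * p_good["high"] / p_g              # ~0.618
p_high_given_not_g = priors["high"] * (1 - p_good["high"]) / (1 - p_g)   # ~0.052
print(p_g, p_high_given_g, p_high_given_not_g)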
11. (b) (ii) S = {HHH, HHT, HTH, THH, HTT, THT, TTH, TTT}
X = number of tails
X      0     1     2     3
P(X)  1/8   3/8   3/8   1/8
E(X) = 0 × 1/8 + 1 × 3/8 + 2 × 3/8 + 3 × 1/8 = 12/8 = 1.5
(b) (i)
Mx(t) = E[e^(tX)] = ∫₀¹ e^(tx) x dx + ∫₁² e^(tx)(2 − x) dx
= [x(e^(tx)/t) − e^(tx)/t²]₀¹ + [(2 − x)(e^(tx)/t) + e^(tx)/t²]₁²
= (e^(2t) − 2e^t + 1)/t²
= ((e^t − 1)/t)²
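The two integrals above can be checked symbolically; the fragment below is an optional illustration and assumes SymPy is available.

# Symbolic check of the MGF of the triangular density.
import sympy as sp

x, t = sp.symbols('x t', positive=True)
M = sp.integrate(sp.exp(t*x)*x, (x, 0, 1)) + sp.integrate(sp.exp(t*x)*(2 - x), (x, 1, 2))
print(sp.simplify(M - ((sp.exp(t) - 1)/t)**2))   # 0, i.e. M_X(t) = ((e^t - 1)/t)^2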
12. (a) (i) If X ~ B(n, p) ⇒ P(X = x) = nCx p^x q^(n−x), x = 0, 1, 2, …, n.
Mx(t) = E[e^(tX)] = Σ_{x=0}^{n} e^(tx) P(X = x)
= Σ_{x=0}^{n} nCx p^x q^(n−x) e^(tx)
= Σ_{x=0}^{n} nCx (pe^t)^x q^(n−x)
= (q + pe^t)^n
Mean = Mx′(0) = [n(q + pe^t)^(n−1) · pe^t]_{t=0} = np
E(X²) = Mx″(0) = [np{e^t (n − 1)(q + pe^t)^(n−2) · pe^t + e^t (q + pe^t)^(n−1)}]_{t=0}
= n(n − 1)p² + np = n²p² + npq
Var(X) = E(X²) − [E(X)]² = npq
(ii) Given X ~ NB(p, r) with p = 0.2 and r = 4; q = 1 − p = 0.8
Mean = rq/p = (4 × 0.8)/0.2 = 16
12. (b) (i) Let X = time between process problems in the manufacturing line.
Given X ~ exp(λ) with mean 1/λ = 30 days ⇒ λ = 1/30 per day.
The time until the fourth problem is the sum of four independent exponential inter-problem times, so its expected value is 4 × 30 = 120 days.
(ii) ∴ X ~ N(μ, σ²)
Mx(t) = M_{σZ+μ}(t), since Z = (x − μ)/σ
= e^(μt) · Mz(σt)   (by property)
⇒ Mx(t) = e^(μt + σ²t²/2)
Now, Mx(t) = 1 + (t/1!)(μ + σ²t/2) + (t²/2!)(μ + σ²t/2)² + …
Mean = E(X) = coefficient of t/1! = μ
E(X²) = coefficient of t²/2! = σ² + μ²
Var(X) = E(X²) − [E(X)]² = σ², S.D. = σ
13. (a) By the property of a joint pdf, ∫∫ f(x, y) dy dx = 1
i.e., C ∫₀³ ∫ₓ^(x+2) (x + y) dy dx = 1
⇒ C ∫₀³ [xy + y²/2]ₓ^(x+2) dx = 1
⇒ C ∫₀³ (4x + 2) dx = 1
⇒ C [2x² + 2x]₀³ = 1 ⇒ 24C = 1 ⇒ C = 1/24
(i) P(X < 1, Y < 2) = (1/24) ∫₀¹ ∫ₓ² (x + y) dy dx
= (1/24) ∫₀¹ (2x + 2 − 3x²/2) dx
= (1/24) [x² + 2x − x³/2]₀¹ = (1/24)(5/2) = 5/48 ≈ 0.104
(ii) P(Y > 2) = (1/24) ∫₀² ∫₂^(x+2) (x + y) dy dx + (1/24) ∫₂³ ∫ₓ^(x+2) (x + y) dy dx
= (1/24) ∫₀² [xy + y²/2]₂^(x+2) dx + (1/24) ∫₂³ [xy + y²/2]ₓ^(x+2) dx
= (1/24) ∫₀² (3x²/2 + 2x) dx + (1/24) ∫₂³ (4x + 2) dx
= (1/24)(8) + (1/24)(12) = 1/3 + 1/2 = 5/6
(iii) The marginal density of X is
f(x) = ∫ₓ^(x+2) (1/24)(x + y) dy = (1/24)[xy + y²/2]ₓ^(x+2) = (1/12)(2x + 1), 0 < x < 3
∴ E(X) = ∫₀³ x · (1/12)(2x + 1) dx = (1/12)[2x³/3 + x²/2]₀³ = (1/12)(45/2) = 15/8
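Because the limits of integration over the strip x < y < x + 2 are easy to get wrong, a numerical cross-check is worthwhile; the Python sketch below is an illustration only and assumes SciPy is available.

# Numeric check of C = 1/24 and the probabilities above for f(x, y) = (x + y)/24.
from scipy import integrate

f = lambda y, x: (x + y) / 24.0            # joint pdf on 0 < x < 3, x < y < x + 2

total, _ = integrate.dblquad(f, 0, 3, lambda x: x, lambda x: x + 2)
p1, _ = integrate.dblquad(f, 0, 1, lambda x: x, lambda x: 2)           # P(X<1, Y<2)
p2, _ = integrate.dblquad(f, 0, 2, lambda x: x, lambda x: 2)           # P(Y<=2)
ex, _ = integrate.dblquad(lambda y, x: x * f(y, x), 0, 3, lambda x: x, lambda x: x + 2)
print(total, p1, 1 - p2, ex)   # 1.0, 5/48 ≈ 0.1042, 5/6 ≈ 0.8333, 15/8 = 1.875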
(b) (i) X = number of heads in 10 trials, n = 10
p = probability of getting a head in a trial = 1/2
q = 1/2
∴ X ~ B(n, p)
Mean = np = 5
Variance = npq = 10 × (1/2) × (1/2) = 10/4 = 2.5
By the central limit theorem, Z = [X − E(X)] / √Var(X) = (X − 5)/√2.5 is approximately N(0, 1).
∴ P(3 ≤ X ≤ 5) ≈ P[(3 − 5)/1.58 ≤ Z ≤ 0] = P(−1.26 ≤ Z ≤ 0) = 0.3962
14. (a) E[x(t)] = E[A sin(ωt + θ)], where θ ~ U(0, 2π) ⇒ f(θ) = 1/2π, 0 < θ < 2π
= ∫₀^(2π) A sin(ωt + θ) · (1/2π) dθ
= (A/2π) {−cos(ωt + θ)}₀^(2π)
= −(A/2π) {cos(ωt + 2π) − cos ωt} = 0, a constant.
Rxx(t, t + τ) = E[x(t) x(t + τ)]
= E[A sin(ωt + θ) · A sin(ω(t + τ) + θ)]
= (A²/2) E[cos ωτ − cos(2ωt + ωτ + 2θ)]
= (A²/2) {cos ωτ − E[cos(2ωt + ωτ + 2θ)]}
= (A²/2) cos ωτ, which depends on τ only   {∵ E[cos(2ωt + ωτ + 2θ)] = 0}
∴ {x(t)} is a WSS process.
(b) Poisson Process
A counting process {N(t) : t ≥ 0} is said to be a Poisson process with rate λ, λ > 0, if
(i) N(0) = 0
(ii) the process has independent and stationary increments
(iii) P{N(h) = 1} = λh + o(h)
(iv) P{N(h) ≥ 2} = o(h)
Conditions (iii) & (iv) imply P{N(h) = 0} = 1 − λh + o(h).
Probability Distribution
Consider Pn(t + Δt) for n ≥ 1, i.e., there are 'n' occurrences of the random event in (0, t + Δt). This probability can be computed as the sum of (n + 1) mutually exclusive events as follows.
Pn(t + Δt) = P[n occurrences in (0, t + Δt)]
= Pn(t)·P0(Δt) + Pn−1(t)·P1(Δt) + Pn−2(t)·P2(Δt) + … + P0(t)·Pn(Δt)
= Pn(t)·P0(Δt) + Pn−1(t)·P1(Δt) + o(Δt)
Pn(t + Δt) = Pn(t)[1 − λΔt + o(Δt)] + Pn−1(t)[λΔt + o(Δt)] + o(Δt)
[Pn(t + Δt) − Pn(t)] / Δt = −λPn(t) + λPn−1(t) + o(Δt)/Δt
lim_{Δt→0} [Pn(t + Δt) − Pn(t)] / Δt = −λPn(t) + λPn−1(t)
Pn′(t) = −λPn(t) + λPn−1(t); n ≥ 1   (1)
Using the same argument,
P0(t + Δt) = P0(t)·P0(Δt) = P0(t)[1 − λΔt + o(Δt)]
[P0(t + Δt) − P0(t)] / Δt = −λP0(t) + P0(t)·o(Δt)/Δt
lim_{Δt→0} [P0(t + Δt) − P0(t)] / Δt = −λP0(t)
P0′(t) = −λP0(t)
Solving, P0(t) = Ke^(−λt); P0(0) = 1 ⇒ K = 1
∴ P0(t) = e^(−λt)
Put n = 1 in (1):
P1′(t) = −λP1(t) + λP0(t)
⇒ P1′(t) + λP1(t) = λe^(−λt)
Solving, P1(t) = e^(−λt)(λt)/1!
Continuing this way, we get
Pn(t) = e^(−λt)(λt)^n / n!, n = 0, 1, 2, …
Auto correlation of X(t):
Rxx(t1, t2) = E[X(t1)·X(t2)]
= E[X(t1){X(t2) − X(t1) + X(t1)}]
= E[X(t1){X(t2) − X(t1)}] + E[X²(t1)]
= E[X(t1)] E[X(t2) − X(t1)] + E[X²(t1)]   (by independent increments)
= λt1 · λ(t2 − t1) + λt1 + λ²t1²
= λ²t1t2 + λt1, if t2 ≥ t1
∴ Rxx(t1, t2) = λ²t1t2 + λ min(t1, t2), for any t1, t2
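A brief Monte-Carlo check of this autocorrelation formula is shown below; it is illustrative only (the rate and time points are made-up), assumes NumPy, and exploits the independent-increment property to simulate X(t1) and X(t2) jointly.

# Simulation check of R_xx(t1, t2) = lam^2 t1 t2 + lam * min(t1, t2).
import numpy as np

rng = np.random.default_rng(0)
lam, t1, t2, n = 2.0, 1.5, 3.0, 200_000

x_t1 = rng.poisson(lam * t1, n)                  # X(t1)
x_t2 = x_t1 + rng.poisson(lam * (t2 - t1), n)    # X(t2) = X(t1) + independent increment
print(np.mean(x_t1 * x_t2), lam**2 * t1 * t2 + lam * min(t1, t2))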
15. (a) (i) Let N denote the number of customers in the queueing system; then the number of customers in the queue is N − 1 (when N ≥ 1).
W.K.T. P(N = n) = (λ/μ)^n (1 − λ/μ)
Lq = E(N − 1) = Σ_{n=1}^{∞} (n − 1) Pn
= Σ_{n=1}^{∞} (n − 1)(λ/μ)^n (1 − λ/μ)
= (1 − λ/μ) Σ_{n=1}^{∞} (n − 1)(λ/μ)^n
= (λ/μ)² (1 − λ/μ) Σ_{n=2}^{∞} (n − 1)(λ/μ)^(n−2)
= (λ/μ)² (1 − λ/μ)(1 − λ/μ)^(−2)
∴ Lq = λ² / [μ(μ − λ)]
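A compact helper for the standard M/M/1 measures follows directly from this result; the Python sketch below is illustrative (the rates shown are made-up) and also reports Ls, Wq and Ws via Little's formula.

# M/M/1 steady-state measures built around Lq = lam^2 / (mu (mu - lam)).
def mm1_metrics(lam, mu):
    rho = lam / mu
    assert rho < 1, "steady state requires lam < mu"
    Lq = lam**2 / (mu * (mu - lam))     # result derived above
    Ls = lam / (mu - lam)
    return {"Lq": Lq, "Ls": Ls, "Wq": Lq / lam, "Ws": Ls / lam}

print(mm1_metrics(4, 5))                # example: lam = 4, mu = 5 gives Lq = 3.2, Ls = 4.0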
(ii) Here λ = 30 per hour, μ = 20 per hour, ρ = λ/μ = 1.5, and the system capacity is N = 15 (14 waiting + 1 in service).
Ls = ρ/(1 − ρ) − (N + 1)ρ^(N+1)/(1 − ρ^(N+1)) = 13.02
λ′ = μ(1 − P0) = 20(1 − 0.00076) ≈ 20 per hr (nearly)
Expected waiting time Ws = Ls/λ′
= 13.02/20 hrs
= (13.02/20) × 60 minutes
= 39.06 min
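The finite-capacity figures quoted above can be reproduced directly; the Python sketch below is illustrative and assumes, as the worked numbers do, a capacity of N = 15 (14 waiting chairs plus one patient in service).

# M/M/1/N clinic: P0, Ls and Ws for lam = 30/hr, mu = 20/hr, N = 15.
lam, mu, N = 30.0, 20.0, 15
rho = lam / mu

P0 = (1 - rho) / (1 - rho**(N + 1))
Ls = rho / (1 - rho) - (N + 1) * rho**(N + 1) / (1 - rho**(N + 1))
lam_eff = mu * (1 - P0)                 # effective arrival (= departure) rate
Ws = Ls / lam_eff
print(P0, Ls, Ws * 60)                  # ~0.00076, ~13.02, ~39.1 minutes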
15. (b) P-K Formula:
Let N and N′ be the number of customers in the system at times t and t + T, when two consecutive customers have just left the system after getting service.
T → random service time, which is a continuous random variable
f(t) → pdf of T
E(T) → mean of T and Var(T) → variance of T
Let M be the number of customers arriving during the service time T.
N′ = N − 1 + M + δ   (1)   where δ = 1 if N = 0 and δ = 0 if N > 0
∴ E(N′) = E(N) − 1 + E(M) + E(δ)   (2)
When the system is in steady state, the distribution of the number of customers in the system is constant.
∴ E(N) = E(N′)   (3)   and E(N²) = E(N′²)   (4)
Squaring both sides of (1), we get
N′² = N² + (M − 1)² + δ² + 2N(M − 1) + 2(M − 1)δ + 2Nδ   (5)
Now δ² = δ (since if δ = 0, δ² = 0 and if δ = 1, δ² = 1)
and Nδ = 0 × 1 = 0 if N = 0; N × 0 = 0 if N ≠ 0. ∴ Nδ = 0.
Substituting in (5),
N′² = N² + (M − 1)² + δ + 2N(M − 1) + 2(M − 1)δ
⇒ −2N(M − 1) = N² − N′² + M² − 2M + 1 + δ + 2Mδ − 2δ
∴ E(N) = {E(M²) − 2E(M) + 1 + [2E(M) − 1]E(δ)} / {2[1 − E(M)]}
= [E(M²) − 2E²(M) + E(M)] / {2[1 − E(M)]}   (6)   [∵ E(δ) = 1 − E(M), from (2) and (3)]
Since the number of arrivals M in time T follows a Poisson process with parameter λ, we have E(M) = λE(T) and
E(M²) = λ²[Var(T) + E²(T)] + λE(T)
Substituting in (6), we get
Ls = E(N) = {λ²Var(T) + λ²E²(T) + λE(T) − 2λ²E²(T) + λE(T)} / {2[1 − λE(T)]}
∴ Ls = λE(T) + λ²[Var(T) + E²(T)] / {2[1 − λE(T)]}
2. When A and B are two mutually exclusive events, are the values P(A) = 0.6 and P(A ∪ B) = 0.5 consistent? Why?
9. Draw the state transition rate diagram for M/M/C queueing model.
10. What is the probability that a customer has to wait more than 15 mins to get his service completed in an M/M/1 queueing system, if λ = 6/hr and μ = 10/hr?
PART B (5 × 16 = 80 marks)
11. (a) (i) A toy may be rejected whether or not its design is faulty. The probability that the design is faulty is 0.1, the probability that the toy is rejected if the design is faulty is 0.95, and otherwise it is 0.45. If a toy is rejected, what is the probability that it is due to a faulty design? (8)
(ii) In an exhibition, the probabilities of hitting the target are
1/2 for A, 2/3 for B and 3/4 for C. If all of them fire at the
same target, what are the probabilities that (1) only one of
them hits the target (2) atleast one of them hits the target?
(8)
(or)
(b) (i) A random variable X has the following probability distribution:
X = x:  −2   −1    0    1    2    3
P(x):   0.1   K   0.2  2K   0.3  3K
Find K, P(−2 < X < 2), and the mean of X. (8)
(b) (ii) If p(x) = xe^(−x²/2), x ≥ 0; 0, x < 0, show that p(x) is a probability density function and find its distribution function.
12. (a) (i) Out of 800 families with 4 children each, how many families would be expected to have (1) 2 boys and 2 girls, (2) at least 1 boy, (3) at most 2 girls, (4) children of both sexes? Assume equal probabilities for boys and girls. (8)
(ii) The mileage which car owners get with a certain kind of
radial tire is a random variable having an exponential
distribution with mean 40,000 km. Find the probabilities
that one of these tires will last (1) atleast 20,000 km (2)
atmost 30,000 km. (8)
or
12. (b) (i) If the life X (in years) of a certain type of car has a Weibull
distribution with the parameter b = 2, find the value of the
parameter a, given the probability that the life of the car
exceeds 5 years is e−0.25. For these values of a and b, find
the mean and variance of X. (8)
13. (a) Obtain the equation of the regression lines from the following
data, using the method of least squares. Hence find the
coefficient of correlation between X and Y. Also estimate the
value of Y when X = 38 and the value of X when Y = 18. (16)
X: 22 26 29 30 31 33 34 35
Y: 20 20 21 29 27 24 27 31
or
(b) The joint pdf of a 2-dimensional RV (X, Y) is given by
f(x, y) = xy² + x²/8; 0 ≤ x ≤ 2, 0 ≤ y ≤ 1. Compute P(X > 1),
P(Y < 1/2), P(X > 1 / Y < 1/2), P(Y < 1/2 / X > 1), P(X < Y),
P(X + Y ≤ 1). (16)
14. (a) (i) Given a RV Ω with density f(w) and another RV φ uniformly distributed in (−π, π) and independent of Ω, and X(t) = a cos(Ωt + φ), prove that X(t) is a WSS process. (8)
(ii) Suppose that customers arrive at a bank according to a
Poisson process with a mean rate of 3 per minute ; find the
probability that during a time interval of 2 mins.
(1) Exactly 4 customers arrive and
(2) More than 4 customers arrive. (8)
or
(b) An engineer analyzing a series of digital signals generated by
a testing system observes that only 1 out of 15 highly distorted
signals follows a highly distorted signal, with no recognizable
signal between, whereas 20 out of 23 recognizable signals
follow recognizable signals, with no highly distorted signal
between. Given that only highly distorted signals are not
recognizable, find the TPM and fraction of signals that are
highly distorted. (16)
15. (a) There are 3 typists in an office ; each typist can type an
average of 6 letters/hour. If letters arrive for being typed at
the rate of 15 letters/hour.
(i) What fraction of the time all the typists will be busy?
(ii) What is the average number of letters waiting to be typed?
(iii) What is the average time a letter has to spend for waiting
and for being typed?
(iv) What is the probability th at a letter will take longer than
20 mins. waiting to be typed and being typed? (16)
or
(b) Customers arrive at a one-man barber shop according to a
Poisson process with a mean inter arrival time of 12 mins.
Customers spend an average of 10 mins in the barber’s chair.
(i) What is the expected number of customers in the barber
shop and in the queue?
(ii) How much time can a customer expect to spend in the
barber’s shop?
(iii) What is the average time customers spend in the queue?
(iv) What is the problem that the waiting time in the system is
greater than 30 mins? (16)
2. Since A and B are mutually exclusive, A ∩ B = φ, so
P(A ∪ B) = P(A) + P(B) ≥ P(A).
But we are given P(A) = 0.6 and P(A ∪ B) = 0.5 < P(A).
∴ the given values of P(A) and P(A ∪ B) are not consistent.
4. In the Weibull distribution,
f(x) = αβ x^(β−1) e^(−αx^β), x > 0.
When β = 1,
f(x) = αe^(−αx),
which is the pdf of the exponential distribution.
i.e., when β = 1, the Weibull distribution reduces to the exponential distribution.
5. (1) −1 ≤ rxy ≤ 1, or |cov(x, y)| ≤ σx σy.
(2) The correlation coefficient is independent of change of origin and scale:
if U = (x − a)/h and V = (y − b)/k, where h, k > 0, then rxy = ruv.
6. By the property of a joint pdf,
∫∫_{x>0, y>0} k·xy·e^(−(x²+y²)) dx dy = 1
i.e., k ∫₀^∞ y e^(−y²) dy · ∫₀^∞ x e^(−x²) dx = 1
Put x² = t, 2x dx = dt, x dx = dt/2:
∫₀^∞ x e^(−x²) dx = ∫₀^∞ e^(−t) dt/2 = (1/2)[−e^(−t)]₀^∞ = 1/2
Similarly ∫₀^∞ y e^(−y²) dy = 1/2
∴ k (1/2)(1/2) = 1 ⇒ k = 4
7. If certain probability distribution or averages do not depend on t, then the
random process {X (t)} is called stationary.
P[X1(t) + X2(t) = n] = Σ_{r=0}^{n} [e^(−λ1t)(λ1t)^r / r!] · [e^(−λ2t)(λ2t)^(n−r) / (n − r)!]
= e^(−(λ1+λ2)t) (1/n!) Σ_{r=0}^{n} nCr (λ1t)^r (λ2t)^(n−r)
= e^(−(λ1+λ2)t) [(λ1 + λ2)t]^n / n!
∴ {X1(t) + X2(t)} is a Poisson process with parameter (λ1 + λ2)t.
9. State transition rate diagram for the M/M/C model: states 0, 1, 2, …, C − 1, C, C + 1, …, with arrival rate λ on every forward transition and service rates μ, 2μ, 3μ, …, Cμ on the backward transitions (the backward rate stays at Cμ for states beyond C).
11. (a) (i) Let D1, D2 denote the events that the design is faulty or not, and let A denote the event that the toy is rejected.
∴ P(D1) = 0.1 & P(D2) = 1 − 0.1 = 0.9
P(A/D1) = 0.95 & P(A/D2) = 0.45
P[rejection is due to a faulty design]
= P(D1/A) = P(D1)·P(A/D1) / [P(D1)·P(A/D1) + P(D2)·P(A/D2)]
= (0.1 × 0.95)/(0.1 × 0.95 + 0.9 × 0.45)
= 0.095/0.5 = 0.19
11. (a) (ii) Given P(A) = 1/2, P(B) = 2/3, P(C) = 3/4,
∴ P(A̅) = 1/2, P(B̅) = 1/3, P(C̅) = 1/4.
(1) P[only one of them hits the target]
= P(A)P(B̅)P(C̅) + P(A̅)P(B)P(C̅) + P(A̅)P(B̅)P(C)   (by independence)
= (1/2)(1/3)(1/4) + (1/2)(2/3)(1/4) + (1/2)(1/3)(3/4)
= 1/24 + 2/24 + 3/24 = 1/4
(2) P[at least one of them hits the target]
= 1 − P[none of them hits the target]
= 1 − P(A̅)P(B̅)P(C̅) = 1 − (1/2)(1/3)(1/4)
= 1 − 1/24 = 23/24
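Both probabilities above can be confirmed by enumerating the eight hit/miss patterns; the Python fragment below is only an illustrative check.

# Enumeration check of the target-hitting probabilities.
from itertools import product

p = {"A": 1/2, "B": 2/3, "C": 3/4}
exactly_one = at_least_one = 0.0
for hits in product([0, 1], repeat=3):             # all hit/miss patterns for A, B, C
    prob = 1.0
    for shooter, h in zip(p, hits):
        prob *= p[shooter] if h else 1 - p[shooter]
    if sum(hits) == 1:
        exactly_one += prob
    if sum(hits) >= 1:
        at_least_one += prob
print(exactly_one, at_least_one)                   # 0.25 (= 1/4) and 0.9583 (= 23/24)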
11. (b) (i)
To find K: since Σ p(x) = 1, 0.1 + K + 0.2 + 2K + 0.3 + 3K = 1 ⇒ 0.6 + 6K = 1 ⇒ K = 1/15
P(−2 < X < 2) = P(−1) + P(0) + P(1) = 1/15 + 0.2 + 2/15 = 6/15 = 0.4
Mean of X = Σ x p(x) = (−2)(0.1) + (−1)(1/15) + 0(0.2) + 1(2/15) + 2(0.3) + 3(3/15)
= 16/15
11. (b) (ii) Given p(x) = xe^(−x²/2), x ≥ 0.
∫₀^∞ p(x) dx = ∫₀^∞ x e^(−x²/2) dx
Put t = x²/2, dt = x dx:
= ∫₀^∞ e^(−t) dt = [−e^(−t)]₀^∞ = 1
∴ p(x) is a pdf.
F(x) = P[X ≤ x] = ∫_{−∞}^{x} p(x) dx
When x < 0, F(x) = 0.
When x ≥ 0, F(x) = ∫₀^x x e^(−x²/2) dx = 1 − e^(−x²/2)
∴ F(x) = 0, x < 0
       = 1 − e^(−x²/2), x ≥ 0
12. (a) (i) Let each child as a trial n = 4, N = 800 families. Assuming that
1 1
birth of boy is a success, p = & q =
2 2
Let X denote the number of successes (boys),
(1) P[2 boys and 2 girls] = P[X = 2]
= 4C2 (1/2)² (1/2)^(4−2)   [∵ P[X = x] = nCx p^x q^(n−x)]
= 6 × (1/2)⁴
= 3/8
∴ Νumber of families having 2 boys & 2 girls
=Ν. P[X = 2]
3
= 800 × = 300
8
(2) P[at least 1 boy ] = P[X ≥ 1]
= 1 − P[X < 1]
= 1 − P[X = 0]
= 1 − 4C0 (1/2)⁰ (1/2)⁴
= 1 − 1/16
= 15/16
∴ Number of families having at least one boy = 800 × 15/16 = 750
(3) P[at most 2 girls ]
= P[X = 0 or X = 1 or X = 2]
= P[X = 0]+ P[X = 1] +P[X = 2]
= 4C0 (1/2)⁴ + 4C1 (1/2)⁴ + 4C2 (1/2)⁴
= (1/2)⁴ [1 + 4 + 6]
= 11/16
∴ Νumber of families having atmost 2 girls
11
= 800 ×
16
= 550
12. (a) (ii) Let X be the tire mileage (in km); X is exponential with mean 40000, so f(x) = (1/40000)e^(−x/40000), x > 0.
(1) P[X ≥ 20000] = ∫_{20000}^{∞} (1/40000) e^(−x/40000) dx
= [−e^(−x/40000)]_{20000}^{∞}
= e^(−0.5) = 0.6065
(2) P[X ≤ 30000] = ∫₀^{30000} (1/40000) e^(−x/40000) dx
= [−e^(−x/40000)]₀^{30000}
= 1 − e^(−0.75) = 0.5276
12. (b) (i) With β = 2, f(x) = 2αx e^(−αx²), x > 0.
Now P[X > 5] = ∫₅^∞ 2αx e^(−αx²) dx = e^(−25α)
∴ e^(−25α) = e^(−0.25)
∴ 25α = 0.25
⇒ α = 1/100
∴ E[X] = α^(−1/β) Γ(1/β + 1)
= (1/100)^(−1/2) Γ(3/2)
= 10 × (1/2)√π
= 5√π
Var(X) = α^(−2/β) [Γ(2/β + 1) − {Γ(1/β + 1)}²]
= (1/100)^(−1) [Γ(2) − {Γ(3/2)}²]
= 100 [1 − ((1/2)√π)²]
= 100 [1 − π/4]
12. (b) (ii) Let X be the actual amount of coffee put into the jars. Then X follows N(μ, 0.05²).
Given P[X < 6] = 0.03
∴ P[−∞ < (X − μ)/0.05 < (6 − μ)/0.05] = 0.03
∴ P[−∞ < z < (6 − μ)/0.05] = 0.03
∴ P[−∞ < z < 0] + P[0 < z < (6 − μ)/0.05] = 0.03
0.5 + P[0 < z < (6 − μ)/0.05] = 0.03
P[0 < z < (6 − μ)/0.05] = −0.47
P[0 < z < (μ − 6)/0.05] = 0.47   (by symmetry)
From the table, P[0 < z < 1.88] = 0.47
∴ (μ − 6)/0.05 = 1.88
∴ μ = 6.094
and a ∑ u 2 + b ∑ u = ∑ uv → (3)
x    y    u = x − 29   v = y − 27    u²    v²    uv
22   20       −7           −7        49    49    49
26   20       −3           −7         9    49    21
29   21        0           −6         0    36     0
30   29        1            2         1     4     2
31   27        2            0         4     0     0
31   24        2           −3         4     9    −6
34   27        5            0        25     0     0
35   31        6            4        36    16    24
Σ              6          −17       128   163    90
2 ⇒ 6a + 8b = −17 (4)
3 ⇒ 128a + 6b = 90 (5)
Solving (4) & (5) we get a = 0.83 & b = −2.75
Hence the regression line of y on x is
y −27 = 0.83(x − 29) − 2.75
i.e., y = 0.83x + 0.18 (6)
Let the equation of the regression line x on y be x = Cy + D or equiva-
lently
u = Cv + d (7)
The normal equations for finding c & d are
c ∑ ν +nd = ∑ u (8)
and c ∑ ν 2 + d ∑ ν = ∑ uv (9)
8⇒ −17 c + 8 d = 6 → (10)
using table values
9⇒ 163 c − 17 d = 90 → (11)
Solving (10) & (11) we get c = 0.81 & d = 2.47
Hence the regression line of X on Y is
x − 29 = 0.81(y−27) + 2.47
x = 0.81y + 9.60 (12)
Comparing equation (6) with y − ȳ = (rσy/σx)(x − x̄):
rσy/σx = 0.83
Comparing equation (12) with x − x̄ = (rσx/σy)(y − ȳ):
rσx/σy = 0.81
W.K.T. r² = (rσy/σx) × (rσx/σy) = 0.83 × 0.81
∴ r = 0.82   [∵ both regression coefficients are positive]
We use the equation (6) to estimate the value at Y when X = 38
∴ Y = 0.83 × 38 + 0.18 = 31.72
Using equation (12) to estimate the values of X when Y = 18, we
have
x = 0.81 × 18 + 9.60 = 24.18
13. (b) Given f(x, y) = xy² + x²/8, 0 ≤ x ≤ 2, 0 ≤ y ≤ 1.
(i) P(X > 1) = ∫₀¹ ∫₁² (xy² + x²/8) dx dy
= ∫₀¹ [x²y²/2 + x³/24]₁² dy
= ∫₀¹ [(2y² + 1/3) − (y²/2 + 1/24)] dy
= ∫₀¹ (3y²/2 + 7/24) dy
= [y³/2 + 7y/24]₀¹
= 1/2 + 7/24 = 19/24
(ii) P[Y < 1/2] = ∫₀² ∫₀^(1/2) (xy² + x²/8) dy dx
= ∫₀² [xy³/3 + x²y/8]₀^(1/2) dx
= ∫₀² (x/24 + x²/16) dx
= [x²/48 + x³/48]₀²
= 4/48 + 8/48 = 12/48 = 1/4
(iii) P[X > 1, Y < 1/2] = ∫₀^(1/2) ∫₁² (xy² + x²/8) dx dy
= ∫₀^(1/2) [x²y²/2 + x³/24]₁² dy
= ∫₀^(1/2) [(2y² + 1/3) − (y²/2 + 1/24)] dy
= ∫₀^(1/2) (3y²/2 + 7/24) dy
= [y³/2 + 7y/24]₀^(1/2)
= 1/16 + 7/48 = 10/48 = 5/24
(iv) P[X > 1 / Y < 1/2] = P[X > 1, Y < 1/2] / P[Y < 1/2] = (5/24)/(1/4) = 5/6
(v) P[Y < 1/2 / X > 1] = P[X > 1, Y < 1/2] / P[X > 1] = (5/24)/(19/24) = 5/19
(vi) P[X < Y] = ∫₀¹ ∫₀^y (xy² + x²/8) dx dy
= ∫₀¹ [x²y²/2 + x³/24]₀^y dy
= ∫₀¹ (y⁴/2 + y³/24) dy
= [y⁵/10 + y⁴/96]₀¹
= 1/10 + 1/96 = 53/480
(vii) P[X + Y ≤ 1] = ∫₀¹ ∫₀^(1−y) (xy² + x²/8) dx dy
= ∫₀¹ [x²y²/2 + x³/24]₀^(1−y) dy
= ∫₀¹ [(1 − y)²y²/2 + (1 − y)³/24] dy
= (1/24) ∫₀¹ [12y²(1 − y)² + (1 − y)³] dy
= (1/24) ∫₀¹ [12y² − 24y³ + 12y⁴ + 1 − 3y + 3y² − y³] dy
= (1/24) ∫₀¹ [12y⁴ − 25y³ + 15y² − 3y + 1] dy
= (1/24) [12y⁵/5 − 25y⁴/4 + 5y³ − 3y²/2 + y]₀¹
= (1/24) [12/5 − 25/4 + 5 − 3/2 + 1] = 13/480
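All six probabilities above can be cross-checked numerically; the Python sketch below is illustrative only and assumes SciPy is available.

# Numeric check of the probabilities for f(x, y) = x*y^2 + x^2/8 on [0,2] x [0,1].
from scipy import integrate

f = lambda y, x: x * y**2 + x**2 / 8

checks = {
    "P(X>1)":        integrate.dblquad(f, 1, 2, lambda x: 0, lambda x: 1)[0],      # 19/24
    "P(Y<1/2)":      integrate.dblquad(f, 0, 2, lambda x: 0, lambda x: 0.5)[0],    # 1/4
    "P(X>1, Y<1/2)": integrate.dblquad(f, 1, 2, lambda x: 0, lambda x: 0.5)[0],    # 5/24
    "P(X<Y)":        integrate.dblquad(f, 0, 1, lambda x: x, lambda x: 1)[0],      # 53/480
    "P(X+Y<=1)":     integrate.dblquad(f, 0, 1, lambda x: 0, lambda x: 1 - x)[0],  # 13/480
}
print(checks)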
14. (a) (i) E[X(t)] = aE[cos(Ωt + φ)] = aE[cos Ωt cos φ − sin Ωt sin φ]
= aE[cos Ωt · (1/2π) ∫_{−π}^{π} cos φ dφ − sin Ωt · (1/2π) ∫_{−π}^{π} sin φ dφ]
= 0,   since φ is uniform in (−π, π) and independent of Ω.
Rxx(t1, t2) = E[X(t1)X(t2)] = a² E[cos(Ωt1 + φ) cos(Ωt2 + φ)]
= a² E[cos Ωt1 cos Ωt2 cos²φ + sin Ωt1 sin Ωt2 sin²φ − (sin Ωt1 cos Ωt2 + cos Ωt1 sin Ωt2) sin φ cos φ]
= a² E[cos Ωt1 cos Ωt2 · (1/2π)∫_{−π}^{π} cos²φ dφ + sin Ωt1 sin Ωt2 · (1/2π)∫_{−π}^{π} sin²φ dφ
 − sin Ω(t1 + t2) · (1/2π)∫_{−π}^{π} sin φ cos φ dφ]
= (1/2) a² E[cos Ωt1 cos Ωt2 + sin Ωt1 sin Ωt2]
= (1/2) a² E[cos Ω(t1 − t2)]
= a function of t1 − t2, whatever the value of f(w).
∴ {x(t)} is a WSS process.
14. (b) For n ≥ 1, let
Xn = 1, if the nth signal generated is highly distorted,
Xn = 0, if the nth signal generated is recognizable.
Then clearly {Xn : n = 0, 1, 2, …} is a Markov chain with state space {0, 1} and the transition probability matrix is given by
P = ( 20/23  3/23 ; 14/15  1/15 )
Let π0 = the fraction of signals that are recognizable and π1 = the fraction of signals that are distorted. Then
(π0, π1) ( 20/23  3/23 ; 14/15  1/15 ) = (π0, π1)
∴ (20/23)π0 + (14/15)π1 = π0   (1)
(3/23)π0 + (1/15)π1 = π1   (2)
and w.k.t. π0 + π1 = 1   (3)
Solving (1), (2), (3) we get
π0 = 322/367 = 0.877
∴ π1 = 1 − π0 = 1 − 0.877 = 0.123
Therefore 12.3% of the signals generated by the testing system are highly distorted.
15. (a) Here λ = 15 letters/hour, μ = 6 letters/hour, s = 3, so λ/μ = 2.5 and ρ = λ/(sμ) = 2.5/3, and P0 = 0.0449.
(ii) Lq = P0 (λ/μ)^s ρ / [s!(1 − ρ)²]
= (1/3!) × (2.5)⁴/3 × 1/(1 − 2.5/3)² × 0.0449 = 3.5078 letters
(iii) Ws = (1/λ) Ls = (1/λ)[Lq + λ/μ]   [∵ Ls = Lq + λ/μ]
= (1/15)[3.5078 + 2.5] = 0.4005 h
or Ws = 24 min, nearly
(iv) P(a letter spends longer than t in the system) is obtained from
P(W > t) = e^(−μt) [1 + {P0 (λ/μ)^s / (s!(1 − ρ))} · {1 − e^(−μt(s − 1 − λ/μ))} / (s − 1 − λ/μ)],
evaluated at t = 20 min = 1/3 hr.
(ii) Ws = 1/(μ − λ) = 1/(1/10 − 1/12) = 60 min = 1 hour
(iii) Wq = λ/[μ(μ − λ)] = (1/12) / [(1/10)(1/10 − 1/12)] = 50 min
(iv) P(waiting time in the system > 30 min) = e^(−(μ−λ)t) = e^(−(1/60)×30)
= e^(−0.5) = 0.6065
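For the multi-server figures used above, a small helper is convenient; the Python sketch below is illustrative (it implements the standard Erlang-C style formulas, not a formula taken verbatim from this book) and reproduces the typists' numbers P0 ≈ 0.0449, Lq ≈ 3.51 and Ws ≈ 0.4 h.

# Rough M/M/c helper: P0, Lq and Ws for lam arrivals/hr, mu per-server rate, c servers.
from math import factorial

def mmc_metrics(lam, mu, c):
    a = lam / mu                        # offered load
    rho = a / c                         # utilisation; needs rho < 1
    p0 = 1.0 / (sum(a**n / factorial(n) for n in range(c))
                + a**c / (factorial(c) * (1 - rho)))
    Lq = p0 * a**c * rho / (factorial(c) * (1 - rho)**2)
    return p0, Lq, Lq / lam + 1 / mu    # P0, Lq, Ws

print(mmc_metrics(15, 6, 3))            # typists: ~0.0449, ~3.51, ~0.40 h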
2. A continuous random variable X has a density function given by f(x) = k(1 + x), 2 < x < 5. Find P(X < 4).
“The random variables X and Y are independent iff Cov (X, Y) = 0”.
10. If people arrive to purchase cinema tickets at the average rate of 6 per minute, it takes an average of 7.5 seconds to purchase a ticket. If a person arrives 2 mins before the picture starts and it takes exactly 1.5 min to reach the correct seat after purchasing the ticket, can he expect to be seated for the start of the picture?
PART B – (5 × 16 = 80 marks)
11. (a) (i) The first bag contains 3 white balls, 2 red balls and 4 black balls.
Second bag contains 2 white, 3 red and 5 black balls and third
bag contains 3 white, 4 red and 2 black balls. One bag is chosen
at random and from it 3 balls are drawn. Out of 3 balls, 2 balls
are white 1 is red. What are the probabilities that they were taken
from first bag, second bag, third bag.
(ii) A random variable X has the p.d.f.
⎧2 x,, 0 < x < 1
( x) = ⎨
⎩ 0, otherwise
find
(1) P(X < 1/2), (2) P(1/4 < X < 1/2), (3) P(X > 3/4 / X > 1/2)
(Or)
(Or)
(b) (i) If the density function of continuous random variable X is given
by
f(x) = ax, 0 ≤ x ≤ 1
     = a, 1 ≤ x ≤ 2
     = 3a − ax, 2 ≤ x ≤ 3
     = 0, otherwise
(1) Find ‘a’ (2) Find the cdf of X.
14. (a) (i) Given a random variable Ω with density f(w) and another random variable φ uniformly distributed in (−π, π) and independent of Ω, and X(t) = a cos(Ωt + φ), prove that {X(t)} is a WSS process.
(ii) The transition probability matrix of a Markov chain {Xn}, n = 1,
2, 3,… having 3 states 1, 2 and 3 is
P = ( 0.1  0.5  0.4 ; 0.6  0.2  0.2 ; 0.3  0.4  0.3 )
14. (b) (i) A man either drives a car or catches a train to go to office each
day. He never goes 2 days in a row by train but it he drives one
day, then the next day he is just as likely to drive again as he is to
travel by train. Now suppose that on the first day of the week, the
man tossed a fair die and drove to work iff a 6 appeared. Find
(1) The probability that he takes a train on the 3rd day.
(2) The probability that he drives to work in the long run.
(ii) Write a short note on recurrent state, transient state, ergodic
state.
15. (a) (i) A duplicating machine maintained for office use is operated by
an office assistant who earns Rs.5 per hour. The time to complete
each jobs varies according to an exponential distribution with
mean 6 mins. Assume a Poisson input with an average arrival
rate of 5 jobs per hour. If an 8-hrs day is used as a base, deter-
mine
(1) The percentage idle time of the machine.
(2) The average time a job in the system.
(3) The average earning per day of the assistant.
(ii) A supermarket has two girls attending to sales at the counters.
If the service time for each customer is exponential with mean
4 mins and if people arrive in Poisson fashion at the rate of 10
per hour,
(1) what is the probability that a customer has to wait for ser-
vice?
(2) What is the expected percentage of idle time for each girl?
(or)
15. (b) (i) Customers arrive at a one-man barber shop according to a Poisson process with a mean inter-arrival time of 12 min. Customers spend an average of 10 min in the barber's chair.
(1) What is the expected number of customers in the barber shop and in the queue?
(2) What is the probability that more than 3 customers are in the system?
(ii) Derive the Pollaczek-Khintchine formula for the M/G/1 queueing model.
3. If the probability is 0.10 that a certain kind of measuring device will show excessive drift, what is the probability that the fifth measuring device tested will be the first to show excessive drift? Find its expected value also.
5. If X has mean 4 and variance 9, while Y has mean −2 and variance 5, and the two are independent, find
(a) E(X Y)
(b) E(X Y2)
8. Let {X(t); t ≥ 0} be a Poisson process with rate λ. Find E[X(t)·X(t + τ)], where τ > 0.
PART B – (5 × 16 = 80 marks)
11. (a) (i) A box contain 5 red and 4 white balls. A ball from the box is
taken out at random and kept outside. If once again a ball is
drawn from the box, what is the probability that the drawn ball
is red?
(ii) If the cumulative distribution function of a R.V. X is given by
F(x) = 1 − 4/x², x > 2
     = 0, x ≤ 2,
find (1) P(X < 3) (2) P(4 < X < 5) (3) P(X ≥ 3).
(iii) A discrete R.V. X has moment generating function
Mx(t) = (1/4 + (3/4)e^t)⁵.
Find E(X), Var(X) and P(X = 2).
(b) (i) The p.d.f. of the samples of the amplitude of speech wave forms
is found to decay exponentially at rate α, so the following p.d.f.
is proposed.
Find the constant ‘C’, and then find the probability P (|X| < v)
and E(X).
(ii) Let X be a R.V. with E(X) = 1 and E[X(X − 1)] = 4. Find Var(X/2) and Var(2 − 3X).
(iii) If X is a continuous R.V. with p.d.f.
⎧ x, 0 ≤ x <1
⎪
⎪3
f ( x) = ⎨ ( x )2 , 1≤ x < 2
⎪2
⎪⎩0, otherwise,
⎛ 1⎞
Find (1) P X > ⎟ (2) Moment generating function for X
⎝ 2⎠
(3) E(X) (4) Var(X).
(ii) If X is any continuous R.V having the p.d.f.
⎧2 x,, 0 < x < 1
f ( x) = ⎨
⎩0, otherwise.
and Y = e− x, find the p.d.f. of the R.V. Y.
X
0 1 2
Find (1) P(X + Y > 1), (2) the probability mass function P(X = x)
of the R.V. X, (3) P(Y = 1/ X = 1). (4) E(X Y).
(ii) Suppose that orders at a restaurant are i.i.d. random variables with mean μ = Rs.8 and standard deviation σ = Rs.2. Estimate (1) the probability that the first 100 customers spend a total of more than Rs.840, i.e., P(X1 + X2 + … + X100 > 840), and (2) P(780 < X1 + X2 + … + X100 < 820).
(Or)
(b) (i) Find P(X > 2/Y < 4) when the joint p.d.f. of X and Y is given
by
g(x, y) = e^(−(x+y)), x ≥ 0, y ≥ 0
        = 0, otherwise.
Are X and Y independent R.Vs? Explain.
(ii) If the point p.d.f of the R.Vs X and Y is given by
⎧2, 0 < x < y < 1
f ( x, y ) = ⎨
⎩0, otherwise.
X
Find the p.d.f. of the R.V. U =
Y
14. (a) (i) Let X(t) be a Poisson process with arrival rate λ. Find E{[X(t) − X(s)]²} for t > s.
(ii) Let {Xn; n = 1, 2, 3, …} be a Markov chain on the space S =
{1,2,3} with one step transition probability matrix
P = ( 0  1  0 ; 1/2  0  1/2 ; 1  0  0 )
(1) Sketch the transition diagram.
(2) Is the chain irreducible? Explain.
(3) Is the chain Ergodic? Explain.
(Or)
2. Suppose that a bus arrives at a station every day between 10.00 a.m. and
10.30 a.m. at random. Let X be the arrival time; find the distributive func-
tion of X and sketch its graph.
3. Sharon and Ann play a series of backgammon games until one of them
wins the five games. Suppose that the games are independent and the
probability that Sharon win a game is 0.58.
(a) Find the probability that the series ends in 7 games.
(b) If the series ends in 7 games, what is the probability that Sharon
wins?
5. For λ > 0, let F(x, y) = 1 − λe^(−λ(x+y)) if x > 0, y > 0; 0, otherwise. Check whether F can be the joint probability distribution function of two random variables X and Y.
6. The life time a TV tube (in years) is an exponential random variable with
mean 10. What is the probability that the average lifetime of a random
sample of 36 TV tubes is atleast 10.5?
8. At an intersection, a working traffic light will be out of order the next day
with probability 0.07, and an out-of-order traffic light will be working
the next day with probability 0.88. Let Xn = 1 if on day ‘n’ the traffic light
will work; Xn = 0 if on day ‘n’ the traffic light will not work. Is { Xn; n = 0,
1, 2,…} a Markov chain? If so, write the transition probability martrix.
9. Suppose that customers arrive at a Poisson rate of one per every 12 min-
utes, and that the service time is exponential at a rate of one service per
8 mins.
(a) What is the ‘average no. of customers in the system?
(b) What is the average time of a customer spends in the system?
10. What do you mean by transient state and steady state queueing system?
PART B – (5 × 16 = 80 marks)
11. (a) (i) A box contains 7 red and 13 blue balls. Two balls are selected at
random and are discarded without their colours being seen. If a
third ball is drawn randomly and observed to be red, what is the
probability that both of the discarded balls were blue? (8)
(ii) The sales of a convenience store on a randomly selected day are
X thousand dollars, where X is a random variable with a distri-
bution function of the following terms:
F(x) = 0; x < 0
     = x²/2; 0 ≤ x < 1
     = k(4x − x² − 2); 1 ≤ x < 2
     = 1; x ≥ 2
Suppose that this convenience store’s total sales on any given
day are less than $ 2000.
(1) Find the value of k.
(2) Let A and B be the events that tomorrow the store’s total sales
are between 500 and 1500 dollars, and over 1000 dollars,
respectively. Find P(A)and P(B).
(3) Are A and B independent events? (8)
Or
(b) (i) A box contains tags marked 1, 2, …, n.
(1) Two tags are chosen at random without replacement. Find the probability that the numbers on the tags will be consecutive integers. (2) Two tags are chosen at random with replacement. Find the probability that the numbers on the tags will be consecutive integers. (8)
(ii) Experience has shown that while walking in a certain park, the time X (in mins.) between seeing two people smoking has a density function of the form
f(x) = λxe^(−x), x > 0
     = 0, otherwise
(4) Find x10 and x11, when P(x10 < X < x11) = 0.50 and P(X > x11) = 0.25. (8)
(ii) A man with ‘n’ keys wants to open his door and tries the keys
independently and at random. Find the mean and variance of the
number of trials required to open the door if unsuccessful keys
are not eliminated from further selection. (8)
Or
(b) (i) The joint probability mass function of (X, Y) is given by p(x, y) = k(2x + 3y); x = 0, 1, 2; y = 1, 2, 3. Find all the marginal probability distributions. Also find the probability distribution of (X + Y). (10)
(ii) Can Y = 5 + 2.8X and X = 3 − 0.5Y be the estimated regres-
sion equations of Y on X and X on Y respectively? Explain your
answer with suitable theoretical arguments. (6)
14. (a) (i) Show that the random process X(t) = A cos(ω0 t + θ) is wide-sense stationary, if A and ω0 are constants and θ is a uniformly distributed random variable in (0, 2π). (8)
(ii) An, engineer analyzing a series of digital signals generated by
a testing system observes that only 1 out of 15 highly distorted
signals follows a highly distorted signal, with no recognizable
signal between, whereas 20 out of 23 recognizable signals follow
recognizable signals with no highly distorted signal between.
Given that only highly distorted signals are not recognizable,
find the fraction of signals that are highly distorted. (8)
Or
(b) (i) (1) Define stationary transition probabilities. (3)
(2) Derive the Chapman-Kolmogorov equations for discrete-
time Markov chain. (5)
(ii) On a given day, a retired English professor, Dr. Charles Fish,
amuses himself with only one of the following activities: read-
ing (activity 1), gardening (activity 2), or working on his book
about a river valley (activity 3). For 1 ≤ i ≤ 3, let Xn = i if Dr. Fish
devotes day ‘n’ to activity i. Suppose that {Xn; n = 1, 2,…} is a
Markov chain, and depending on which of these activities on the
next day is given by the t.p.m.
P = ( 0.30  0.25  0.45 ; 0.40  0.10  0.50 ; 0.25  0.40  0.35 )
Find the proportion of days Dr. Fish devotes to each activity. (8)
15. (a) (i) For the steady state M/M/1 queueing model, prove that
Pn = (λ/μ)^n P0. (8)
(ii) On every Sunday morning, a Dental hospital renders free dental
service to the patients, As per the hospital rules, 3 dentists who
are equally qualified and experienced will be on duty then. It
takes on an average 10 mins for a patient to get treatment and the
actual time taken is known to vary approximately exponentially
around this average. The patients arrive according to the Poisson
distribution with an average of 12 hours. The hospital manage-
ment wants to investigate the following:
(1) The expected number of patients waiting in the queue.
(2) The average time that a patient spends at the hospital. (8)
Or
2. A continuous random variable X has a probability density function f(x) = k(1 + x), 2 ≤ x ≤ 5. Find P(X < 4).
3. One percent of jobs arriving at a computer system need to wait until week-
ends for scheduling, owing to core-size limitations. Find the probability
that among a sample of 200 jobs there are no jobs that have to wait until
weekends.
4. A fast food chain finds that the average time customers have to wait for
service is 45 seconds. If the waiting time can be treated as an exponential
random variable, what is the probability that a customer will have to wait
more than 5 minutes given that already he waited for 2 minutes?
PART B − (5 × 16 = 80 marks)
11. (a) (i) A binary communication channel carries data as one of 2 types
of signals denoted by 0 and 1. Due to noise, a transmitted 0 is
sometimes received as a 1 and a transmitted 1 is sometimes
received as a 0. For a given channel assume a probability of 0.94
that a transmitted 0 is correctly received as a 0 and a probability
of 0.91 that a transmitted 1 is received as a 1. Further assume a
probability of 0.45 of transmitted a 0. If a signal is sent, deter-
mine the probability that
(1) a 1 is received
(2) a 1 was transmitted given that a 1 was received
(3) a 0 was transmitted, given that a 0 was received
(4) an error occurs. (8)
(ii) In a continuous distribution, the probability density is given by
f (x) = kx (2 − x), 0 < x < 2. Find k, mean, variance and the dis-
tribution function. (8)
Or
(b) (i) The cumulative distribution function (cdf) of a random variable
X is given by
F(x) = 0, x < 0
     = x², 0 ≤ x < 1/2
     = 1 − (3/25)(3 − x)², 1/2 ≤ x < 3
     = 1, x ≥ 3
Find the pdf of X and evaluate P (| X | ≤ 1) using both the pdf and
cdf. (8)
(ii) Find the moment generating function of the geometric random
variable with the pdf f (x) = pq x − 1, x = 1, 2, 3, … and hence
obtain its mean and variance. (8)
13. (a) (i) The joint density function of the random variable (X, Y) is given
by
f(x, y) = 8xy, 0 < x < 1, 0 < y < x
        = 0, elsewhere.
Find (1) the marginal density of Y, (2) the conditional density of X given Y = y, and (3) P(X < 1/2).
(ii) Calculate the correlation coefficient for the following data:
X : 65 66 67 67 68 69 70 72
Y : 67 68 65 68 72 72 69 71 (8)
Or
14. (a) (i) At the receiver of an AM radio, the received signal contains a cosine carrier signal at the carrier frequency ω0 with a random phase θ that is uniformly distributed over (0, 2π). The received carrier signal is X(t) = A cos(ω0 t + θ). Show that the process is second order stationary. (8)
(ii) Queries presented in a computer data base are following a Pois-
son process of rate λ = 6 queries per minute. An experiment
consists of monitoring the data base for m minutes and recording
N (m) the number of queries presented
(1) What is the probability that no queries in a one minute inter-
val?
(2) What is the probability that exactly 6 queries arriving in one
minute interval?
(3) What is the probability of less than 3 queries arriving in a
half minute interval? (8)
Or
(b) (i) Assume that a computer system is in any one of the three states:
busy, idle and under repair respectively denoted by 0, 1, 2.
Observing is state at 2 pm each day, we get the transition prob-
ability matrix as
P = ( 0.6  0.2  0.2 ; 0.1  0.8  0.1 ; 0.6  0  0.4 )
Find out the 3rd step transition probability matrix. Determine the
limiting probabilities. (8)
(ii) Obtain the steady state or long run probabilities for the popula-
tion size of a birth death process. (8)
2. A die is loaded in such a way that each odd number is twice as likely to
occur as even number. Find P(G), where G is the event that a number
greater than 3 occurs on a single roll of the die.
4. Find the value of (a). C and (b). mean of the following distribution given
( −x
C(x− 2), for 0 < 1 f(
f(x) = 0, elsewhere
7. Find the moment generating function for the distribution where f(x) = 2/3 at x = 1, f(x) = 1/3 at x = 2, and f(x) = 0 otherwise.
PART B – (5 × 16 = 80 marks)
11. (a) (i) If the probability density of X is given by f(x) = 2(1 − x), for 0 < x < 1; 0, otherwise, (1) show that E[X^r] = 2/[(r + 1)(r + 2)] and (2) use this result to evaluate E[(2X + 1)²].
(ii) Given a binary communication channel, where A is the input and E is the output, let P(A) = 0.4, P(E/A) = 0.9 and P(E/A̅) = 0.6. Find (1) P(A/E), (2) P(A̅/E̅).
(b) (i) A random variable X has density function given by f(x) = 1/k, for 0 ≤ x ≤ k; 0, elsewhere. Find (1) the m.g.f., (2) the rth moment, (3) the mean, (4) the variance.
(ii) Given that a student studied, the probability of passing a certain
quiz is 0.99. Given that a student did not study. The probability
of passing the quiz is 0.05. Assume that the probability of study-
ing is 0.7. A student flunks the quiz. What is the probability that
he or she did not study?
12. (a) (i) Let the random variable X following binomial distribution with
parameters n and p. Find, (1).Probability mass function of X. (2).
Moment generating function. (3).Mean and variance of X.
(ii) The number of personal computer (PC) sold daily at a computer
World is uniformly distributed with a minimum of 2000 PC and
a maximum of 5000 PC. Find (1).The probability that daily sales
will fall between 2500 and 3000PC. (2).What is the probability
that the computer World will sell at least 4000 PC’s? (3).What
is the probability that the computer World will sell exactly 2500
PC’s?
(b) (i) Define the probability density function of normal distribution and
standard normal distribution. Write down the important proper-
ties of its distribution.
(ii) An electric firm manufactures light bulbs that have a life, before
burnout, that is normally distributed with mean equal to 800
hours and standard deviation of 40 hours. Find (1).The prob-
ability that a bulb burns more than 834 hours (2).The probability
that bulb between 778 and 834 hours
13. (a) (i) Let X and Y be random variables with the joint density f(x, y) = x(1 + 3y²)/4, 0 < x < 2, 0 < y < 1; 0, otherwise. Show that E(XY) = E(X)E(Y).
(ii) If the joint density of X1 and X2 is given by f(x1, x2) = 6e^(−3x1−2x2), for x1 > 0, x2 > 0; 0, otherwise, find the probability density of Y = X1 + X2.
(b) (i) Two random variables X and Y have joint density function fXY(x, y) = x² + xy/3; 0 ≤ x ≤ 1, 0 ≤ y ≤ 2. Find the conditional density functions and check whether the conditional density functions are valid.
(ii) If the joint probability density of X1 and X2 is given by f(x1, x2) = e^(−(x1+x2)), for x1 > 0, x2 > 0; 0, otherwise, find the probability density of Y = X1/(X1 + X2).
14. (a) (i) Find the correlation coefficient and obtain the lines of regression from the following data:
x: 50 55 50 60 65 65 65 60 60 50
y: 11 14 13 16 16 15 15 14 13 13
(ii) Let Z be a random variable with probability density f(z) = 1/2 in the range −1 ≤ z ≤ 1. Let the random variable X = Z and the random variable Y = Z². Obviously X and Y are not independent, since Y = X². Show, nonetheless, that X and Y are uncorrelated.
(b) (i) Two random variables X and Y are defined as Y = 4X + 9.Find the
correlation coefficient between X and Y.
(ii) A stochastic process is described by X(t) = Asint + Bcost where
A and B are independent random variables with zero means and
equal standard deviation. Show that the process is stationary of
the second order.
15. (a) (i) A raining process is considered as a two-state Markov chain. If it rains, the chain is in state 0 and if it does not rain, the chain is in state 1. The transition probability matrix of the Markov chain is P = ( 0.6  0.4 ; 0.2  0.8 ). Find the probability that it will rain for 3 days from today, assuming that it is raining today. Assume the initial probabilities of state 0 and state 1 as 0.4 and 0.6 respectively.
(ii) A person owning a scooter has the option to switch over to a scooter, bike or a car next time with probability (0.3, 0.5, 0.2). If the transition probability matrix is ( 0.4  0.3  0.3 ; 0.2  0.5  0.3 ; 0.25  0.25  0.5 ), what are the probabilities of the vehicles related to his fourth purchase?
(b) (i) Define Kendall's notation. What assumptions are made for the simplest queueing model?
P[X(t) = r] = e^(−λt)(λt)^r / r!, r = 0, 1, 2, …
is covariance stationary.
10. Derive the average number of customers in the system for (M/M/1): (∞/
FIFO) model.
12. (a) (i) Out of (2n + 1) tickets consecutively numbered three are drawn
at random. Find the probability that the numbers on them are in
arithmetic progression.
(ii) If A and B are independent events, then show that A̅ and B are also independent events. Also show that A̅ and B̅ are independent events.
12. (b) (i) The contents of urns, I,II and III are as follows:
1 white, 2 black and 3 red balls
2 white, 1 black and 1 red ball and
4 white, 5 black and 3 red balls.
One urn is chosen at random and two balls drawn. They happen
to be white and red. What is the probability that they come from
urns I, II and III?
(ii) Let the random variable X assume the value ‘r’ with the prob-
ability law: P(X = r) = qr − 1 p, r = 1, 2, 3,….. Find the moment
generating function and hence its mean and variance.
13. (a) (i) If ‘m’ things are distributed among ‘a’ men and ‘b’ women,
find the probability that the number of things received by men
is odd.
(ii) If X and Y are independent Poisson variates, find the conditional distribution of X given X + Y = N.
13. (b) (i) If X1 and X2 are independent uniform variates on [0, 1], find the distribution of X1/X2 and X1X2.
(ii) Find the moment generating function of a normal distribution.
14. (a) (i) Two random variables X and y have the following joint prob-
ability density function
f(x, y) = 2 − x − y, 0 ≤ x ≤ 1, 0 ≤ y ≤ 1
        = 0, otherwise
Find
(1) the marginal probability density functions of X and Y
(2) the conditional density functions
(3) Var(X) and Var(Y)
(ii) Let (X, Y) be a two-dimensional non-negative continuous random
variable having the joint density.
f(x, y) = 4xye^(−(x²+y²)), x ≥ 0, y ≥ 0
        = 0, elsewhere.
14. (b) (i) Find the coefficient of correlation and obtain the lines regression
from the data given below:
X : 62 64 65 69 70 71 72 74
Y : 126 125 139 145 165 152 180 208
(ii) Let the random variable X have the marginal density
f1(x) = 1, −1/2 < x < 1/2,
and let the conditional density of Y be
f(y / x) = 1, x < y < x + 1, for −1/2 < x < 0
         = 1, −x < y < 1 − x, for 0 < x < 1/2.
Show that the variables are uncorrelated.
(i) What is the probability that a customer need not wait for a hair
cut?
(ii) What is the expected number of customers in the barbershop
and in the queue?
(iii) How much time can a customer expect to spend in the barber-
shop?
(iv) Find the average time that the customer spend in the queue.
(v) What is the probability that there will be 6 or more customers
waiting for service?
15. (b) Derive the formula for the average number of customers in the queue and the probability that an arrival has to wait, for the (M/M/C) model with infinite capacity. Also derive, for the same model, the average waiting time of a customer in the queue as well as in the system.