
Applied Statistics and Probability for Engineers, 5th edition, December 21, 2009

CHAPTER 3

Section 3-1

3-1. The range of X is {0,1,2,...,1000}


3-2. The range of X is {0, 1, 2, ..., 50}

3-3. The range of X is {0, 1, 2, ..., 99999}

3-4. The range of X is {0, 1, 2, 3, 4, 5}

3-5. The range of X is {1,2,...,491} . Because 490 parts are conforming, a nonconforming part must be selected in 491
selections.

3-6. The range of X is {0, 1, 2, ..., 100}, although the range actually obtained from lots typically might not exceed 10%.

3-7. The range of X is conveniently modeled as all nonnegative integers. That is, the range of X is {0, 1, 2, ...}

3-8. The range of X is conveniently modeled as all nonnegative integers. That is, the range of X is {0, 1, 2, ...}

3-9. The range of X is {0,1,2,...,15}


3-10. The possible totals for two orders are 1/8 + 1/8 = 1/4, 1/8 + 1/4 = 3/8, 1/8 + 3/8 = 1/2, 1/4 + 1/4 = 1/2,
1/4 + 3/8 = 5/8, 3/8 + 3/8 = 3/4.
Therefore the range of X is {1/4, 3/8, 1/2, 5/8, 3/4}

3-11. The range of X is {0, 1, 2, ..., 10000}


3-12. The range of X is {100, 101, ..., 150}

3-13. The range of X is {0, 1, 2, ..., 40000}

Section 3-2

3-14.
f_X(0) = P(X = 0) = 1/6 + 1/6 = 1/3
f_X(1.5) = P(X = 1.5) = 1/3
f_X(2) = 1/6
f_X(3) = 1/6
a) P(X = 1.5) = 1/3
b) P(0.5 < X < 2.7) = P(X = 1.5) + P(X = 2) = 1/3 + 1/6 = 1/2
c) P(X > 3) = 0
d) P(0 ≤ X < 2) = P(X = 0) + P(X = 1.5) = 1/3 + 1/3 = 2/3
e) P(X = 0 or X = 2) = 1/3 + 1/6 = 1/2
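
The arithmetic above is easy to double-check by storing the pmf and summing it over each event. The short Python sketch below is illustrative only; the dictionary pmf and the helper prob are not part of the text.

    # Check of the Exercise 3-14 answers by summing the pmf over each event.
    pmf = {0: 1/3, 1.5: 1/3, 2: 1/6, 3: 1/6}   # pmf from the solution above

    def prob(event):
        """Sum f(x) over the outcomes x for which event(x) is True."""
        return sum(p for x, p in pmf.items() if event(x))

    print(prob(lambda x: x == 1.5))           # a) 0.3333
    print(prob(lambda x: 0.5 < x < 2.7))      # b) 0.5
    print(prob(lambda x: x > 3))              # c) 0.0
    print(prob(lambda x: 0 <= x < 2))         # d) 0.6667
    print(prob(lambda x: x == 0 or x == 2))   # e) 0.5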

3-15. All probabilities are greater than or equal to zero and sum to one.
a) P(X ≤ 2) = 1/8 + 2/8 + 2/8 + 2/8 + 1/8 = 1
b) P(X > -2) = 2/8 + 2/8 + 2/8 + 1/8 = 7/8

c) P(-1 ≤ X ≤ 1) = 2/8 + 2/8 + 2/8 = 6/8 = 3/4
d) P(X ≤ -1 or X = 2) = 1/8 + 2/8 + 1/8 = 4/8 = 1/2

3-16. All probabilities are greater than or equal to zero and sum to one.
a) P(X ≤ 1) = P(X = 1) = 0.5714
b) P(X > 1) = 1 - P(X = 1) = 1 - 0.5714 = 0.4286
c) P(2 < X < 6) = P(X = 3) = 0.1429
d) P(X ≤ 1 or X > 1) = P(X = 1) + P(X = 2) + P(X = 3) = 1

3-17. Probabilities are nonnegative and sum to one.

a) P(X = 4) = 9/25
b) P(X ≤ 1) = 1/25 + 3/25 = 4/25
c) P(2 ≤ X < 4) = 5/25 + 7/25 = 12/25
d) P(X > -10) = 1

3-18. Probabilities are nonnegative and sum to one.

a) P(X = 2) = (3/4)(1/4)^2 = 3/64
b) P(X ≤ 2) = (3/4)[1 + 1/4 + (1/4)^2] = 63/64
c) P(X > 2) = 1 - P(X ≤ 2) = 1/64
d) P(X ≥ 1) = 1 - P(X = 0) = 1 - 3/4 = 1/4

3-19. X = number of successful surgeries.


P(X=0)=0.1(0.33)=0.033
P(X=1)=0.9(0.33)+0.1(0.67)=0.364
P(X=2)=0.9(0.67)=0.603

3-20. P(X = 0) = 0.02^3 = 8 × 10^-6

P(X = 1) = 3[0.98(0.02)(0.02)] = 0.0012
P(X = 2) = 3[0.98(0.98)(0.02)] = 0.0576
P(X = 3) = 0.98^3 = 0.9412

3-21. X = number of wafers that pass


P(X=0) = (0.2)^3 = 0.008
P(X=1) = 3(0.2)^2(0.8) = 0.096
P(X=2) = 3(0.2)(0.8)^2 = 0.384
P(X=3) = (0.8)^3 = 0.512

3-22. X: the number of computers that vote for a left roll when a right roll is appropriate.
p=0.0001.
P(X=0) = (1 - p)^4 = 0.9999^4 = 0.9996
P(X=1) = 4(1 - p)^3 p = 4(0.9999^3)(0.0001) = 0.0003999
P(X=2) = C(4,2)(1 - p)^2 p^2 = 5.999 × 10^-8
P(X=3) = C(4,3)(1 - p)^1 p^3 = 3.9996 × 10^-12
P(X=4) = C(4,4)(1 - p)^0 p^4 = 1 × 10^-16
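
Because X in Exercise 3-22 is binomial with n = 4 and p = 0.0001, the five probabilities above can be verified numerically. A minimal sketch, assuming scipy is available:

    # Illustrative check of Exercise 3-22 (not part of the text): X ~ binomial(4, 0.0001).
    from scipy.stats import binom

    p = 0.0001
    for x in range(5):
        print(x, binom.pmf(x, 4, p))   # 0.9996, 3.999e-4, 5.999e-8, 3.9996e-12, 1e-16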

3-23. P(X = 50 million) = 0.5, P(X = 25 million) = 0.3, P(X = 10 million) = 0.2

3-24. P(X = 10 million) = 0.3, P(X = 5 million) = 0.6, P(X = 1 million) = 0.1

3-25. P(X = 15 million) = 0.6, P(X = 5 million) = 0.3, P(X = -0.5 million) = 0.1

3-26. X = number of components that meet specifications


P(X=0) = (0.05)(0.02) = 0.001
P(X=1) = (0.05)(0.98) + (0.95)(0.02) = 0.068
P(X=2) = (0.95)(0.98) = 0.931

3-27. X = number of components that meet specifications


P(X=0) = (0.05)(0.02)(0.01) = 0.00001
P(X=1) = (0.95)(0.02)(0.01) + (0.05)(0.98)(0.01)+(0.05)(0.02)(0.99) = 0.00167
P(X=2) = (0.95)(0.98)(0.01) + (0.95)(0.02)(0.99) + (0.05)(0.98)(0.99) = 0.07663
P(X=3) = (0.95)(0.98)(0.99) = 0.92169

3-28. X = final temperature

P(X=266) = 48/200 = 0.24
P(X=271) = 60/200 = 0.30
P(X=274) = 92/200 = 0.46

       0.24, x = 266
f(x) = 0.30, x = 271
       0.46, x = 274

3-29. X = waiting time (hours)
P(X=1) = 19/500 = 0.038
P(X=2) = 51/500 = 0.102
P(X=3) = 86/500 = 0.172
P(X=4) = 102/500 = 0.204
P(X=5) = 87/500 = 0.174
P(X=6) = 62/500 = 0.124
P(X=7) = 40/500 = 0.08
P(X=8) = 18/500 = 0.036
P(X=9) = 14/500 = 0.028
P(X=10) = 11/500 = 0.022
P(X=15) = 10/500 = 0.020

       0.038, x = 1
       0.102, x = 2
       0.172, x = 3
       0.204, x = 4
       0.174, x = 5
f(x) = 0.124, x = 6
       0.080, x = 7
       0.036, x = 8
       0.028, x = 9
       0.022, x = 10
       0.020, x = 15
3-30. X = days until change
P(X=1.5) = 0.05
P(X=3) = 0.25
P(X=4.5) = 0.35
P(X=5) = 0.20
P(X=7) = 0.15

       0.05, x = 1.5
       0.25, x = 3
f(x) = 0.35, x = 4.5
       0.20, x = 5
       0.15, x = 7

3-31. X = non-failed well depth
P(X=255) = (1515+1343)/7726 = 0.370
P(X=218) = 26/7726 = 0.003
P(X=317) = 3290/7726 = 0.426
P(X=231) = 349/7726 = 0.045
P(X=267) = (280+887)/7726 = 0.151
P(X=217) = 36/7726 = 0.005

       0.005, x = 217
       0.003, x = 218
       0.045, x = 231
f(x) = 0.370, x = 255
       0.151, x = 267
       0.426, x = 317
Section 3-3

 0, x<0 
 1 / 3 0  x < 1.5  f X (0) = P ( X = 0) = 1 / 6 + 1 / 6 = 1 / 3
  f X (1.5) = P ( X = 1.5) = 1 / 3
3-32. F (x ) =  2 / 3 1.5  x < 2  where
5 / 6 2  x < 3  f X (2) = 1 / 6
  f X (3) = 1 / 6
 1 3  x 

3-33.
F(x) =  0,    x < -2
        1/8,  -2 ≤ x < -1
        3/8,  -1 ≤ x < 0
        5/8,  0 ≤ x < 1
        7/8,  1 ≤ x < 2
        1,    2 ≤ x
where f_X(-2) = 1/8, f_X(-1) = 2/8, f_X(0) = 2/8, f_X(1) = 2/8, f_X(2) = 1/8
a) P(X ≤ 1.25) = 7/8
b) P(X ≤ 2.2) = 1
c) P(-1.1 < X ≤ 1) = 7/8 - 1/8 = 3/4
d) P(X > 0) = 1 - P(X ≤ 0) = 1 - 5/8 = 3/8
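
The interval probabilities in parts a) through d) all come from evaluating the step CDF at the endpoints, F(b) - F(a). The sketch below reproduces the four answers; the helper F and the use of bisect are illustrative, not part of the text.

    # Evaluate the right-continuous step CDF of Exercise 3-33.
    import bisect

    xs  = [-2, -1, 0, 1, 2]            # support of X
    cum = [1/8, 3/8, 5/8, 7/8, 1.0]    # F at each support point

    def F(t):
        """F(t) = P(X <= t) for the step CDF above."""
        i = bisect.bisect_right(xs, t)
        return 0.0 if i == 0 else cum[i - 1]

    print(F(1.25))          # a) 0.875
    print(F(2.2))           # b) 1.0
    print(F(1) - F(-1.1))   # c) 0.75, i.e. P(-1.1 < X <= 1)
    print(1 - F(0))         # d) 0.375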

3-34.
F(x) =  0,    x < 1
        4/7,  1 ≤ x < 2
        6/7,  2 ≤ x < 3
        1,    3 ≤ x
a) P(X < 1.5) = 4/7
b) P(X ≤ 3) = 1
c) P(X > 2) = 1 - P(X ≤ 2) = 1 - 6/7 = 1/7
d) P(1 < X ≤ 2) = P(X ≤ 2) - P(X ≤ 1) = 6/7 - 4/7 = 2/7

3-35.
F(x) =  0,     x < 0
        0.008, 0 ≤ x < 1
        0.104, 1 ≤ x < 2
        0.488, 2 ≤ x < 3
        1,     3 ≤ x
where f(0) = 0.2^3 = 0.008, f(1) = 3(0.2)(0.2)(0.8) = 0.096, f(2) = 3(0.2)(0.8)(0.8) = 0.384, f(3) = (0.8)^3 = 0.512
3-36.
F(x) =  0,       x < 0
        0.9996,  0 ≤ x < 1
        0.9999,  1 ≤ x < 3
        0.99999, 3 ≤ x < 4
        1,       4 ≤ x
where f(0) = 0.9999^4 = 0.9996, f(1) = 4(0.9999^3)(0.0001) = 0.0003999, f(2) = 5.999 × 10^-8,
f(3) = 3.9996 × 10^-12, f(4) = 1 × 10^-16
3-37.
F(x) =  0,   x < 10
        0.2, 10 ≤ x < 25
        0.5, 25 ≤ x < 50
        1,   50 ≤ x
where P(X = 50 million) = 0.5, P(X = 25 million) = 0.3, P(X = 10 million) = 0.2

3-38.
F(x) =  0,   x < 1
        0.1, 1 ≤ x < 5
        0.7, 5 ≤ x < 10
        1,   10 ≤ x
where P(X = 10 million) = 0.3, P(X = 5 million) = 0.6, P(X = 1 million) = 0.1

3-39. The sum of the probabilities is 1 and all probabilities are greater than or equal to zero;
pmf: f(1) = 0.5, f(3) = 0.5

a) P(X ≤ 3) = 1
b) P(X ≤ 2) = 0.5
c) P(1 ≤ X ≤ 2) = P(X = 1) = 0.5
d) P(X > 2) = 1 - P(X ≤ 2) = 0.5

3-40. The sum of the probabilities is 1 and all probabilities are greater than or equal to zero;
pmf: f(1) = 0.7, f(4) = 0.2, f(7) = 0.1
a) P(X ≤ 4) = 0.9
b) P(X > 7) = 0
c) P(X ≤ 5) = 0.9
d) P(X > 4) = 0.1
e) P(X ≤ 2) = 0.7

3-41. The sum of the probabilities is 1 and all probabilities are greater than or equal to zero;
pmf: f(-10) = 0.25, f(30) = 0.5, f(50) = 0.25
a) P(X ≤ 50) = 1
b) P(X ≤ 40) = 0.75
c) P(40 ≤ X ≤ 60) = P(X = 50) = 0.25
d) P(X < 0) = 0.25
e) P(0 ≤ X < 10) = 0
f) P(-10 < X < 10) = 0

3-42. The sum of the probabilities is 1 and all probabilities are greater than or equal to zero;
pmf: f(1/8) = 0.2, f(1/4) = 0.7, f(3/8) = 0.1
a) P(X ≤ 1/18) = 0
b) P(X ≤ 1/4) = 0.9
c) P(X ≤ 5/16) = 0.9
d) P(X > 1/4) = 0.1
e) P(X ≤ 1/2) = 1

3-43.
F(x) =  0,    x < 266
        0.24, 266 ≤ x < 271
        0.54, 271 ≤ x < 274
        1,    274 ≤ x
where P(X=266 K) = 0.24, P(X=271 K) = 0.30, P(X=274 K) = 0.46

3-44.
F(x) =  0,     x < 1
        0.038, 1 ≤ x < 2
        0.140, 2 ≤ x < 3
        0.312, 3 ≤ x < 4
        0.516, 4 ≤ x < 5
        0.690, 5 ≤ x < 6
        0.814, 6 ≤ x < 7
        0.894, 7 ≤ x < 8
        0.930, 8 ≤ x < 9
        0.958, 9 ≤ x < 10
        0.980, 10 ≤ x < 15
        1,     15 ≤ x
where P(X=1) = 0.038, P(X=2) = 0.102, P(X=3) = 0.172, P(X=4) = 0.204, P(X=5) = 0.174, P(X=6) = 0.124,
P(X=7) = 0.08, P(X=8) = 0.036, P(X=9) = 0.028, P(X=10) = 0.022, P(X=15) = 0.020

3-45.
F(x) =  0,    x < 1.5
        0.05, 1.5 ≤ x < 3
        0.30, 3 ≤ x < 4.5
        0.65, 4.5 ≤ x < 5
        0.85, 5 ≤ x < 7
        1,    7 ≤ x
where P(X=1.5) = 0.05, P(X=3) = 0.25, P(X=4.5) = 0.35, P(X=5) = 0.20, P(X=7) = 0.15

3-46.
F(x) =  0,     x < 217
        0.005, 217 ≤ x < 218
        0.008, 218 ≤ x < 231
        0.053, 231 ≤ x < 255
        0.423, 255 ≤ x < 267
        0.574, 267 ≤ x < 317
        1,     317 ≤ x
where P(X=255) = 0.370, P(X=218) = 0.003, P(X=317) = 0.426, P(X=231) = 0.045, P(X=267) = 0.151,
P(X=217) = 0.005

Section 3-4

3-47. Mean and variance:
µ = E(X) = 0 f(0) + 1 f(1) + 2 f(2) + 3 f(3) + 4 f(4)
  = 0(0.2) + 1(0.2) + 2(0.2) + 3(0.2) + 4(0.2) = 2
V(X) = 0^2 f(0) + 1^2 f(1) + 2^2 f(2) + 3^2 f(3) + 4^2 f(4) - µ^2
  = 0(0.2) + 1(0.2) + 4(0.2) + 9(0.2) + 16(0.2) - 2^2 = 2

3-48. Mean and variance for the random variable in Exercise 3-14:
µ = E(X) = 0 f(0) + 1.5 f(1.5) + 2 f(2) + 3 f(3)
  = 0(1/3) + 1.5(1/3) + 2(1/6) + 3(1/6) = 1.333
V(X) = 0^2 f(0) + 1.5^2 f(1.5) + 2^2 f(2) + 3^2 f(3) - µ^2
  = 0(1/3) + 2.25(1/3) + 4(1/6) + 9(1/6) - 1.333^2 = 1.139

3-49. Determine E(X) and V(X) for the random variable in Exercise 3-15:
µ = E(X) = -2 f(-2) - 1 f(-1) + 0 f(0) + 1 f(1) + 2 f(2)
  = -2(1/8) - 1(2/8) + 0(2/8) + 1(2/8) + 2(1/8) = 0
V(X) = (-2)^2 f(-2) + (-1)^2 f(-1) + 0^2 f(0) + 1^2 f(1) + 2^2 f(2) - µ^2
  = 4(1/8) + 1(2/8) + 0(2/8) + 1(2/8) + 4(1/8) - 0^2 = 1.5
3-50. Determine E(X) and V(X) for the random variable in Exercise 3-16:
µ = E(X) = 1 f(1) + 2 f(2) + 3 f(3)
  = 1(0.5714286) + 2(0.2857143) + 3(0.1428571)
  = 1.571429
V(X) = 1^2 f(1) + 2^2 f(2) + 3^2 f(3) - µ^2
  = 3 - 1.571429^2 = 0.5306

3-51. Mean and variance for Exercise 3-17:

µ = E(X) = 0 f(0) + 1 f(1) + 2 f(2) + 3 f(3) + 4 f(4)
  = 0(0.04) + 1(0.12) + 2(0.2) + 3(0.28) + 4(0.36) = 2.8
V(X) = 0^2 f(0) + 1^2 f(1) + 2^2 f(2) + 3^2 f(3) + 4^2 f(4) - µ^2
  = 0(0.04) + 1(0.12) + 4(0.2) + 9(0.28) + 16(0.36) - 2.8^2 = 1.36


3-52. Mean and variance for Exercise 3-18:

E(X) = (3/4) Σ_{x=0}^∞ x (1/4)^x = (3/4) Σ_{x=1}^∞ x (1/4)^x = 1/3
The result uses a formula for the sum of an infinite series. The formula can be derived from the fact that the series to
sum is the derivative of h(a) = Σ_{x=1}^∞ a^x = a/(1 - a) with respect to a.

For the variance, another formula can be derived from the second derivative of h(a) with respect to a. Calculate from
this formula
E(X^2) = (3/4) Σ_{x=0}^∞ x^2 (1/4)^x = (3/4) Σ_{x=1}^∞ x^2 (1/4)^x = 5/9
Then V(X) = E(X^2) - [E(X)]^2 = 5/9 - 1/9 = 4/9
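
The closed-form values E(X) = 1/3, E(X^2) = 5/9 and V(X) = 4/9 can be confirmed by summing the series numerically. The sketch below truncates the infinite sums at about x = 200, which is more than enough for this rapidly decaying series (an illustration, not part of the text):

    # Numerical check of the series in Exercise 3-52.
    xs = range(1, 200)                              # remaining terms are negligible
    EX  = sum(0.75 * x    * 0.25**x for x in xs)    # about 0.3333 = 1/3
    EX2 = sum(0.75 * x**2 * 0.25**x for x in xs)    # about 0.5556 = 5/9
    print(EX, EX2, EX2 - EX**2)                     # variance about 0.4444 = 4/9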
3-53. Mean and variance for the random variable in Exercise 3-19:

µ = E(X) = 0 f(0) + 1 f(1) + 2 f(2)
  = 0(0.033) + 1(0.364) + 2(0.603)
  = 1.57
V(X) = 0^2 f(0) + 1^2 f(1) + 2^2 f(2) - µ^2
  = 0(0.033) + 1(0.364) + 4(0.603) - 1.57^2
  = 0.3111

3-54. Mean and variance for Exercise 3-20:

µ = E(X) = 0 f(0) + 1 f(1) + 2 f(2) + 3 f(3)
  = 0(8 × 10^-6) + 1(0.0012) + 2(0.0576) + 3(0.9412)
  = 2.940008
V(X) = 0^2 f(0) + 1^2 f(1) + 2^2 f(2) + 3^2 f(3) - µ^2
  = 0.05876096

3-55. Determine x where the range is [0, 1, 2, 3, x] and the mean is 6.

µ = E(X) = 6 = 0 f(0) + 1 f(1) + 2 f(2) + 3 f(3) + x f(x)
6 = 0(0.2) + 1(0.2) + 2(0.2) + 3(0.2) + x(0.2)
6 = 1.2 + 0.2x
4.8 = 0.2x
x = 24
3-56. (a) F(0) = 0.17

Nickel Charge: X    CDF
0                   0.17
2                   0.17 + 0.35 = 0.52
3                   0.17 + 0.35 + 0.33 = 0.85
4                   0.17 + 0.35 + 0.33 + 0.15 = 1

(b) E(X) = 0(0.17) + 2(0.35) + 3(0.33) + 4(0.15) = 2.29
V(X) = Σ_{i=1}^{4} f(x_i)(x_i - µ)^2 = 1.5259

3-57. X = number of computers that vote for a left roll when a right roll is appropriate.
µ = E(X) = 0 f(0) + 1 f(1) + 2 f(2) + 3 f(3) + 4 f(4)
  = 0 + 0.0003999 + 2(5.999 × 10^-8) + 3(3.9996 × 10^-12) + 4(1 × 10^-16) = 0.0004
V(X) = Σ_{i=1}^{5} f(x_i)(x_i - µ)^2 = 0.00039996

3-58. µ = E(X) = 350(0.06) + 450(0.1) + 550(0.47) + 650(0.37) = 565
V(X) = Σ_{i=1}^{4} f(x_i)(x_i - µ)^2 = 6875
σ = √V(X) = 82.92

3-59. (a)
Transaction    Frequency    Selects: X    f(X)
New order      43           23            0.43
Payment        44           4.2           0.44
Order status   4            11.4          0.04
Delivery       5            130           0.05
Stock level    4            0             0.04
Total          100

µ = E(X) = 23(0.43) + 4.2(0.44) + 11.4(0.04) + 130(0.05) + 0(0.04) = 18.694
V(X) = Σ_{i=1}^{5} f(x_i)(x_i - µ)^2 = 735.964, σ = √V(X) = 27.1287

(b)
Transaction    Frequency    All operations: X          f(X)
New order      43           23 + 11 + 12 = 46          0.43
Payment        44           4.2 + 3 + 1 + 0.6 = 8.8    0.44
Order status   4            11.4 + 0.6 = 12            0.04
Delivery       5            130 + 120 + 10 = 260       0.05
Stock level    4            0 + 1 = 1                  0.04
Total          100

µ = E(X) = 46(0.43) + 8.8(0.44) + 12(0.04) + 260(0.05) + 1(0.04) = 37.172
V(X) = Σ_{i=1}^{5} f(x_i)(x_i - µ)^2 = 2947.996, σ = √V(X) = 54.2955
3-60. µ = E(X) = 266(0.24) + 271(0.30) + 274(0.46) = 271.18
V(X) = Σ_i f(x_i)(x_i - µ)^2 = 10.11

3-61. µ = E(X) = 1(0.038) + 2(0.102) + 3(0.172) + 4(0.204) + 5(0.174) + 6(0.124) + 7(0.08) + 8(0.036) + 9(0.028)
  + 10(0.022) + 15(0.020) = 4.808 hours
V(X) = Σ_i f(x_i)(x_i - µ)^2 = 6.15

3-62. µ = E(X) = 1.5(0.05) + 3(0.25) + 4.5(0.35) + 5(0.20) + 7(0.15) = 4.45
V(X) = Σ_i f(x_i)(x_i - µ)^2 = 1.9975

3-63. µ = E(X) = 255(0.370) + 218(0.003) + 317(0.426) + 231(0.045) + 267(0.151) + 217(0.005) = 281.83
V(X) = Σ_i f(x_i)(x_i - µ)^2 = 976.24

Section 3-5

3-64. E(X) = (0 + 99)/2 = 49.5, V(X) = [(99 - 0 + 1)^2 - 1]/12 = 833.25

3-65. E(X) = (3 + 1)/2 = 2, V(X) = [(3 - 1 + 1)^2 - 1]/12 = 0.667
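
Both answers use the discrete uniform formulas E(X) = (a + b)/2 and V(X) = [(b - a + 1)^2 - 1]/12. A quick numerical cross-check (illustrative only, not part of the text):

    # Compare the discrete uniform formulas with a direct computation.
    import numpy as np

    def uniform_moments(a, b):
        x = np.arange(a, b + 1)
        mu = x.mean()
        var = np.mean((x - mu)**2)   # population variance over the support
        return mu, var

    print(uniform_moments(0, 99))    # (49.5, 833.25), Exercise 3-64
    print(uniform_moments(1, 3))     # (2.0, 0.6667), Exercise 3-65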

3-66. X = (1/100)Y, Y = 15, 16, 17, 18, 19.

E(X) = (1/100) E(Y) = (1/100)[(15 + 19)/2] = 0.17 mm
V(X) = (1/100)^2 [((19 - 15 + 1)^2 - 1)/12] = 0.0002 mm^2

1   1   1  1 
3-67. E ( X ) = 2  + 3  + 4  + 5   = 3.5
4   4   4  4 

2 1  2 1  2 1  2 1  5
V ( X ) = (2)   + ( 3)   + ( 4)   + ( 5)    (3.5) = = 1.25
2

4 4   4 4  4
3-68. X = 590 + 0.1Y, Y = 0, 1, 2, ..., 9
E(X) = 590 + 0.1[(0 + 9)/2] = 590.45 mm
V(X) = (0.1)^2 [((9 - 0 + 1)^2 - 1)/12] = 0.0825 mm^2
3-69. a = 675, b = 700
a) µ = E(X) = (a+b)/2= 687.5
V(X) = [(b – a +1)2 – 1]/12= 56.25

b) a = 75, b = 100
µ = E(X) = (a+b)/2 = 87.5
V(X) = [(b – a + 1)2 – 1]/12= 56.25
The range of values is the same, so the mean shifts by the difference in the two minimums (or maximums) whereas the
variance does not change.

3-70. X is a discrete random variable because it denotes the number of fields out of 28 that are in error.
However, X is not uniform because P(X = 0) ≠ P(X = 1).

3-71. The range of Y is 0, 5, 10, ..., 45, E(X) = (0+9)/2 = 4.5


E(Y) = 0(1/10)+5(1/10)+...+45(1/10)
= 5[0(0.1) +1(0.1)+ ... +9(0.1)]
= 5E(X)
= 5(4.5)
= 22.5
V(X) = 8.25, V(Y) = 5^2(8.25) = 206.25, σ_Y = 14.36

3-72.
E(cX) = Σ_x cx f(x) = c Σ_x x f(x) = cE(X),
V(cX) = Σ_x (cx - cµ)^2 f(x) = c^2 Σ_x (x - µ)^2 f(x) = c^2 V(X)

3-73. E(X) = (9 + 5)/2 = 7, V(X) = [(9 - 5 + 1)^2 - 1]/12 = 2, σ = 1.414

3-74.

Section 3-6

3-75. A binomial distribution is based on independent trials with two outcomes and a constant probability of success on each
trial.

a) reasonable
b) independence assumption not reasonable
c) The probability that the second component fails depends on the failure time of the first component. The binomial
distribution is not reasonable.
d) not independent trials with constant probability
e) probability of a correct answer not constant
f) reasonable
g) probability of finding a defect not constant
h) if the fills are independent with a constant probability of an underfill, then the binomial distribution for
the number packages underfilled is reasonable
i) because of the bursts, each trial (that consists of sending a bit) is not independent
j) not independent trials with constant probability

3-76. (a) P(X≤3) = 0.411


(b) P(X>10) = 1 – 0.9994 = 0.0006
(c) P(X=6) = 0.1091
(d) P(6 ≤X ≤11) = 0.9999 – 0.8042 = 0.1957

3-77. (a) P(X≤2) = 0.9298


(b) P(X>8) = 0
(c) P(X=4) = 0.0112
(d) P(5≤X≤7) = 1 - 0.9984 = 0.0016

 10
3-78. a) P( X = 5) =   0.55 (0.5) 5 = 0. 2461
 5
10  0 10 10 1 9 10  2 8
b) P( X  2) =  0  0.5 0.5 +  1  0.5 0.5 +   0.5 0.5
    2
= 0.5 10 +10(0.5) 10 +45(0.5) 10 =0.0547
 10   10 
c) P( X  9) =   0.59 (0.5)1 +   0.510 (0.5)0 = 0.0107
 9  10 

 10  10 
d) P(3  X < 5) =  0.530.5 7 +   0.54 0.56
 3 4
= 120(0.5) +210(0.5) 10 =0.3223
10

 10
3-79. a) P ( X = 5) =   0.015 ( 0.99)5 = 2.40 × 10 8
5
 10 10   10
b )P ( X  2) =   0.01 0 (0.99 ) +  0.011 (0.99) +   0.012 ( 0.99)
10 9 8

 0 1
   2
= 0.9999
 10   10
c) P( X  9) =   0.019 (0.99 ) +   0.0110 ( 0.99) = 9.91× 10 18
1 0

 9 10
 
 10   10
d )P (3  X < 5) =   0.013 ( 0.99) +   0.014 (0.99) 6 = 1.138 × 10 4
7

 3 4

3-80. [Plot of the binomial(10, 0.5) probability mass function, f(x) versus x for x = 0, 1, ..., 10.]

a) P(X = 5) = 0.2461; x = 5 is most likely, also E(X) = np = 10(0.5) = 5
b) Values x = 0 and x = 10 are the least likely, the extreme values

3-81. [Plot of the binomial(10, 0.01) probability mass function, probability of x versus x for x = 0, 1, ..., 10.]

P(X = 0) = 0.904, P(X = 1) = 0.091, P(X = 2) = 0.004, P(X = 3) = 0, P(X = 4) = 0, and so forth.
The distribution is skewed with E(X) = np = 10(0.01) = 0.1
a) The most-likely value of X is 0.
b) The least-likely value of X is 10.

3-82. n = 3 and p = 0.5

F(x) =  0,     x < 0
        0.125, 0 ≤ x < 1
        0.5,   1 ≤ x < 2
        0.875, 2 ≤ x < 3
        1,     3 ≤ x
where
f(0) = (1/2)^3 = 1/8
f(1) = 3(1/2)(1/2)^2 = 3/8
f(2) = 3(1/2)^2(1/2) = 3/8
f(3) = (1/2)^3 = 1/8
3-83. n = 3 and p = 0.25

F(x) =  0,      x < 0
        0.4219, 0 ≤ x < 1
        0.8438, 1 ≤ x < 2
        0.9844, 2 ≤ x < 3
        1,      3 ≤ x
where
f(0) = (3/4)^3 = 27/64
f(1) = 3(1/4)(3/4)^2 = 27/64
f(2) = 3(1/4)^2(3/4) = 9/64
f(3) = (1/4)^3 = 1/64
3-84. Let X denote the number of defective circuits.
Then, X has a binomial distribution with n = 40 and p = 0.01.
P(X = 0) = C(40,0) 0.01^0 (0.99)^40 = 0.6690

3-85. Let X denote the number of times the line is occupied.
Then, X has a binomial distribution with n = 10 and p = 0.4.
a) P(X = 3) = C(10,3) 0.4^3 (0.6)^7 = 0.215
b) Let Z denote the number of times the line is NOT occupied.
Then Z has a binomial distribution with n = 10 and p = 0.6.
P(Z ≥ 1) = 1 - P(Z = 0) = 1 - C(10,0) 0.6^0 0.4^10 = 0.9999
c) E(X) = 10(0.4) = 4

3-86. Let X denote the number of questions answered correctly.
Then, X is binomial with n = 25 and p = 0.25.
a) P(X ≥ 20) = C(25,20) 0.25^20 (0.75)^5 + C(25,21) 0.25^21 (0.75)^4 + C(25,22) 0.25^22 (0.75)^3
  + C(25,23) 0.25^23 (0.75)^2 + C(25,24) 0.25^24 (0.75)^1 + C(25,25) 0.25^25 (0.75)^0 = 9.677 × 10^-10
b) P(X < 5) = C(25,0) 0.25^0 (0.75)^25 + C(25,1) 0.25^1 (0.75)^24 + C(25,2) 0.25^2 (0.75)^23
  + C(25,3) 0.25^3 (0.75)^22 + C(25,4) 0.25^4 (0.75)^21 = 0.2137
3-87. Let X denote the number of mornings the light is green. Then X is binomial with p = 0.2.
a) For n = 5: P(X = 1) = C(5,1) 0.2^1 (0.8)^4 = 0.410
b) For n = 20: P(X = 4) = C(20,4) 0.2^4 (0.8)^16 = 0.218
c) P(X > 4) = 1 - P(X ≤ 4) = 1 - 0.630 = 0.370

3-88. X = number of samples mutated.
X has a binomial distribution with p = 0.01, n = 15.
(a) P(X = 0) = C(15,0) p^0 (1 - p)^15 = 0.86
(b) P(X ≤ 1) = P(X = 0) + P(X = 1) = 0.99
(c) P(X > 7) = P(X = 8) + P(X = 9) + ... + P(X = 15) = 0

3-89. (a) n=20, p=0.6122,


P(X≥1) = 1-P(X=0) = 1

(b)P(X≥3) = 1- P(X<3)= 0.999997

(c) µ = E(X) = np = 20(0.6122) = 12.244

V(X) = np(1 - p) = 4.748

σ = √V(X) = 2.179

3-90. n = 20, p = 0.13

(a) P(X = 3) = C(20,3) p^3 (1 - p)^17 = 0.235
(b) P(X ≥ 3) = 1 - P(X < 3) = 0.492
(c) µ = E(X) = np = 20(0.13) = 2.6

V(X) = np(1 - p) = 2.262

σ = √V(X) = 1.504

3-91. (a) Binomial distribution, p = 4.59394 × 10^-6, n = 1 × 10^9
(b) P(X = 0) = C(10^9, 0) p^0 (1 - p)^(10^9) = 0
(c) µ = E(X) = np = (1 × 10^9)(4.59394 × 10^-6) = 4593.9

V(X) = np(1 - p) = 4593.9

3-92. E(X) = 20(0.01) = 0.2

V(X) = 20(0.01)(0.99) = 0.198
µ_X + 3σ_X = 0.2 + 3√0.198 = 1.53
a) X is binomial with n = 20 and p = 0.01
P(X > 1.53) = P(X ≥ 2) = 1 - P(X ≤ 1)
  = 1 - [C(20,0) 0.01^0 0.99^20 + C(20,1) 0.01^1 0.99^19] = 0.0169

b) X is binomial with n = 20 and p = 0.04
P(X > 1) = 1 - P(X ≤ 1)
  = 1 - [C(20,0) 0.04^0 0.96^20 + C(20,1) 0.04^1 0.96^19] = 0.1897

c) Let Y denote the number of times X exceeds 1 in the next five samples.
Then, Y is binomial with n = 5 and p = 0.190 from part b.
P(Y ≥ 1) = 1 - P(Y = 0) = 1 - [C(5,0) 0.190^0 0.810^5] = 0.651

The probability is 0.651 that at least one sample from the next five will contain more than one defective.

3-93. Let X denote the passengers with tickets that do not show up for the flight.
Then, X is binomial with n = 125 and p = 0.1.
a) P(X ≥ 5) = 1 - P(X ≤ 4)
  = 1 - [C(125,0) 0.1^0 (0.9)^125 + C(125,1) 0.1^1 (0.9)^124 + C(125,2) 0.1^2 (0.9)^123
       + C(125,3) 0.1^3 (0.9)^122 + C(125,4) 0.1^4 (0.9)^121]
  = 0.9961
b) P(X > 5) = 1 - P(X ≤ 5) = 0.9886
3-94. Let X denote the number of defective components among those stocked.

a) P(X = 0) = C(100,0) 0.02^0 0.98^100 = 0.133
b) P(X ≤ 2) = C(102,0) 0.02^0 0.98^102 + C(102,1) 0.02^1 0.98^101 + C(102,2) 0.02^2 0.98^100 = 0.666
c) P(X ≤ 5) = 0.981
3-95. .
a) Let N denote the number of people (out of five) that wait less than or equal to 4 hours.

b) Let N denote the number of people (out of five) that wait more than 4 hours.

c) Let N denote the number of people (out of five) that wait more than 4 hours.

3-96. Probability a person leaves without being seen (LWBS) = 195/5292 = 0.037
a)
b)

c)

3-97. . Let X = number of the 10 changes made in less than 4 days.


a)
b)

c)
d) E

3-98.
a)
b)

c)

d)

Section 3-7

3-99. a) P(X = 1) = (1 - 0.5)^0 (0.5) = 0.5

b) P(X = 4) = (1 - 0.5)^3 (0.5) = 0.5^4 = 0.0625
c) P(X = 8) = (1 - 0.5)^7 (0.5) = 0.5^8 = 0.0039
d) P(X ≤ 2) = P(X = 1) + P(X = 2) = (1 - 0.5)^0 (0.5) + (1 - 0.5)^1 (0.5)
  = 0.5 + 0.5^2 = 0.75
e) P(X > 2) = 1 - P(X ≤ 2) = 1 - 0.75 = 0.25
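
These are geometric probabilities with p = 0.5 counted in trials, which matches the convention used by scipy.stats.geom. An illustrative check, not part of the text:

    # Check of Exercise 3-99 with scipy.stats.geom (support starts at 1 trial).
    from scipy.stats import geom

    p = 0.5
    print(geom.pmf(1, p))    # a) 0.5
    print(geom.pmf(4, p))    # b) 0.0625
    print(geom.pmf(8, p))    # c) 0.0039
    print(geom.cdf(2, p))    # d) P(X <= 2) = 0.75
    print(geom.sf(2, p))     # e) P(X > 2) = 0.25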

3-100. E(X) = 2.5 = 1/p giving p = 0.4

a) P(X = 1) = (1 - 0.4)^0 (0.4) = 0.4
b) P(X = 4) = (1 - 0.4)^3 (0.4) = 0.0864
c) P(X = 5) = (1 - 0.4)^4 (0.4) = 0.05184
d) P(X ≤ 3) = P(X = 1) + P(X = 2) + P(X = 3)
  = (1 - 0.4)^0 (0.4) + (1 - 0.4)^1 (0.4) + (1 - 0.4)^2 (0.4) = 0.7840
e) P(X > 3) = 1 - P(X ≤ 3) = 1 - 0.7840 = 0.2160

3-101. Let X denote the number of trials to obtain the first success.
a) E(X) = 1/0.2 = 5
b) Because of the lack of memory property, the expected value is still 5.

3-102. a) E(X) = 4/0.2 = 20

b) P(X = 20) = C(19,3) (0.80)^16 (0.2)^4 = 0.0436
c) P(X = 19) = C(18,3) (0.80)^15 (0.2)^4 = 0.0459
d) P(X = 21) = C(20,3) (0.80)^17 (0.2)^4 = 0.0411
e) The most likely value for X should be near µ_X. By trying several cases, the most likely value is x = 19.

3-103. Let X denote the number of trials to obtain the first successful alignment.
Then X is a geometric random variable with p = 0.8.
a) P(X = 4) = (1 - 0.8)^3 (0.8) = 0.2^3 (0.8) = 0.0064
b) P(X ≤ 4) = P(X = 1) + P(X = 2) + P(X = 3) + P(X = 4)
  = (1 - 0.8)^0 (0.8) + (1 - 0.8)^1 (0.8) + (1 - 0.8)^2 (0.8) + (1 - 0.8)^3 (0.8)
  = 0.8 + 0.2(0.8) + 0.2^2(0.8) + 0.2^3(0.8) = 0.9984
c) P(X ≥ 4) = 1 - P(X ≤ 3) = 1 - [P(X = 1) + P(X = 2) + P(X = 3)]
  = 1 - [(1 - 0.8)^0 (0.8) + (1 - 0.8)^1 (0.8) + (1 - 0.8)^2 (0.8)]
  = 1 - [0.8 + 0.2(0.8) + 0.2^2(0.8)] = 1 - 0.992 = 0.008
3-104. Let X denote the number of people who carry the gene.
Then X is a negative binomial random variable with r = 2 and p = 0.1.
a) P(X ≥ 4) = 1 - P(X < 4) = 1 - [P(X = 2) + P(X = 3)]
  = 1 - [C(1,1)(1 - 0.1)^0 0.1^2 + C(2,1)(1 - 0.1)^1 0.1^2] = 1 - (0.01 + 0.018) = 0.972
b) E(X) = r/p = 2/0.1 = 20
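
If this calculation is checked with scipy, note a parameterization difference: scipy.stats.nbinom counts the number of failures before the r-th success, while the text counts the total number of trials X, so P(X = x) corresponds to nbinom.pmf(x - r, r, p). A minimal sketch (an aside, not part of the solution):

    # Check of Exercise 3-104 with scipy.stats.nbinom.
    from scipy.stats import nbinom

    r, p = 2, 0.1
    print(nbinom.pmf(2 - r, r, p) + nbinom.pmf(3 - r, r, p))   # P(X < 4) = 0.028
    print(1 - nbinom.cdf(3 - r, r, p))                         # a) P(X >= 4) = 0.972
    print(r / p)                                               # b) E(X) = 20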
3-105. Let X denote the number of calls needed to obtain a connection.
Then, X is a geometric random variable with p = 0.02.
a) P(X = 10) = (1 - 0.02)^9 (0.02) = 0.98^9 (0.02) = 0.0167
b) P(X > 5) = 1 - P(X ≤ 5) = 1 - [P(X = 1) + P(X = 2) + P(X = 3) + P(X = 4) + P(X = 5)]
  = 1 - [0.02 + 0.98(0.02) + 0.98^2(0.02) + 0.98^3(0.02) + 0.98^4(0.02)]
  = 1 - 0.0961 = 0.9039
May also use the fact that P(X > 5) is the probability of no connections in 5 trials. That is,
P(X > 5) = C(5,0) 0.02^0 0.98^5 = 0.9039
c) E(X) = 1/0.02 = 50
3-106. X = number of opponents played until the player is defeated.
p = 0.8 is the probability that the player defeats an opponent, so the probability of defeat in any game is 0.2.
(a) f(x) = (1 - 0.2)^(x-1)(0.2) = 0.8^(x-1)(0.2)
(b) P(X > 2) = 1 - P(X = 1) - P(X = 2) = 0.8^2 = 0.64
(c) µ = E(X) = 1/0.2 = 5
(d) P(X ≥ 4) = 1 - P(X = 1) - P(X = 2) - P(X = 3) = 0.8^3 = 0.512
(e) The probability that a player contests four or more opponents is obtained in part (d): p_o = 0.512.
Let Y represent the number of game plays until a player contests four or more opponents.
Then f(y) = (1 - p_o)^(y-1) p_o and µ_Y = E(Y) = 1/p_o = 1.95

3-107. p = 0.13
(a) P(X = 1) = (1 - 0.13)^(1-1)(0.13) = 0.13
(b) P(X = 3) = (1 - 0.13)^(3-1)(0.13) = 0.098
(c) µ = E(X) = 1/p = 7.69 ≈ 8

3-108. X = number of attempts before the hacker selects a user password.

(a) p = 9900/36^6 = 0.0000045
µ = E(X) = 1/p = 219877
V(X) = (1 - p)/p^2 = 4.938 × 10^10
σ = √V(X) = 222222
(b) p = 100/36^3 = 0.00214
µ = E(X) = 1/p = 467
V(X) = (1 - p)/p^2 = 217892.39
σ = √V(X) = 466.78
Based on the answers to (a) and (b) above, it is clearly more secure to use a six-character password.

3-109. p = 0.005, r = 8
a) P(X = 8) = 0.005^8 = 3.91 × 10^-19
b) µ = E(X) = 1/0.005 = 200 days
c) Mean number of days until all 8 computers fail. Now we use p = 3.91 × 10^-19.
µ = E(Y) = 1/(3.91 × 10^-19) = 2.56 × 10^18 days or 7.01 × 10^15 years
3-110. Let Y denote the number of samples needed to exceed 1 in Exercise 3-92.
Then Y has a geometric distribution with p = 0.0169.
a) P(Y = 10) = (1 - 0.0169)^9 (0.0169) = 0.0145
b) Y is a geometric random variable with p = 0.1897 from Exercise 3-92.
P(Y = 10) = (1 - 0.1897)^9 (0.1897) = 0.0286
c) E(Y) = 1/0.1897 = 5.27

3-111. Let X denote the number of transactions until all computers have failed.
Then, X is a negative binomial random variable with p = 10^-8 and r = 3.
a) E(X) = 3 × 10^8
b) V(X) = [3(1 - 10^-8)]/(10^-8)^2 = 3.0 × 10^16

3-112. (a) p^6 = 0.6, p = 0.918

(b) 0.6 p^2 = 0.4, p = 0.816
 x 1 
3-113. Negative binomial random variable: f(x; p, r) =
 (1  p) x r p r.
 r  1 
When r = 1, this reduces to f(x) = (1p)x-1p, which is the pdf of a geometric random variable.
Also, E(X) = r/p and V(X) = [r(1p)]/p2 reduce to E(X) = 1/p and V(X) = (1p)/p2, respectively.

3-114.
a)

b)

c)

d)

3-115. a) Probability that the color printer will be discounted = 1/10 = 0.1


days

b)
c) Lack of memory property implies the answer equals
d)

3-116.
a)
b)
c)

d)

Section 3-8

3-117. X has a hypergeometric distribution with N = 100, n = 4, K = 20.
a) P(X = 1) = [C(20,1) C(80,3)] / C(100,4) = 20(82160)/3921225 = 0.4191
b) P(X = 6) = 0, the sample size is only 4
c) P(X = 4) = [C(20,4) C(80,0)] / C(100,4) = 4845(1)/3921225 = 0.001236
d) E(X) = np = n(K/N) = 4(20/100) = 0.8
V(X) = np(1 - p)[(N - n)/(N - 1)] = 4(0.2)(0.8)(96/99) = 0.6206

3-118. X is hypergeometric with N = 20, n = 4, and K = 4.
a) P(X = 1) = [C(4,1) C(16,3)] / C(20,4) = [(4 × 16 × 15 × 14)/6] / [(20 × 19 × 18 × 17)/24] = 0.4623
b) P(X = 4) = [C(4,4) C(16,0)] / C(20,4) = 1 / [(20 × 19 × 18 × 17)/24] = 0.00021
c) P(X ≤ 2) = P(X = 0) + P(X = 1) + P(X = 2)
  = [C(4,0) C(16,4) + C(4,1) C(16,3) + C(4,2) C(16,2)] / C(20,4)
  = [(16 × 15 × 14 × 13)/24 + (4 × 16 × 15 × 14)/6 + 6(16 × 15)/2] / [(20 × 19 × 18 × 17)/24]
  = 0.9866
d) E(X) = 4(4/20) = 0.8
V(X) = 4(0.2)(0.8)(16/19) = 0.539

3-119. N = 10, n = 3 and K = 4
[Plot of the hypergeometric probability mass function P(x) versus x for x = 0, 1, 2, 3.]

 24  12   36 
3-120. (a) f(x) = 
 x  3  x  /  3 
    
(b) µ=E(X) = np= 3*24/36=2
V(X)= np(1 -p)(N-n)/(N-1) =2*(1-24/36)(36-3)/(36-1)=0.629

(c) P(X≤2) =1-P(X=3) =0.717

3-121. Let X denote the number of men who carry the marker on the male chromosome for an increased risk for high blood
pressure. N = 800, K = 240, n = 10.
a) n = 10
P(X = 1) = [C(240,1) C(560,9)] / C(800,10) = 0.1201
b) n = 10
P(X > 1) = 1 - P(X ≤ 1) = 1 - [P(X = 0) + P(X = 1)]
P(X = 0) = [C(240,0) C(560,10)] / C(800,10) = 0.0276
P(X > 1) = 1 - P(X ≤ 1) = 1 - [0.0276 + 0.1201] = 0.8523
3-122. Let X denote the number of cards in the sample that are defective.
a) P(X ≥ 1) = 1 - P(X = 0)
P(X = 0) = [C(20,0) C(120,20)] / C(140,20) = 0.0356
P(X ≥ 1) = 1 - 0.0356 = 0.9644
b) P(X ≥ 1) = 1 - P(X = 0)
P(X = 0) = [C(5,0) C(135,20)] / C(140,20) = (135! 120!)/(115! 140!) = 0.4571
P(X ≥ 1) = 1 - 0.4571 = 0.5429

3-123. N=300
(a) K = 243, n = 3, P(X = 1)=0.087
(b) P(X≥1) = 0.9934
(c) K = 26 + 13 = 39, P(X = 1)=0.297
(d) K = 300-18 = 282
P(X ≥ 1) = 0.9998

3-124. Let X denote the count of the numbers in the state's sample that match those in the player's sample.
Then, X has a hypergeometric distribution with N = 40, n = 6, and K = 6.
a) P(X = 6) = [C(6,6) C(34,0)] / C(40,6) = 1/C(40,6) = 2.61 × 10^-7
b) P(X = 5) = [C(6,5) C(34,1)] / C(40,6) = 6(34)/C(40,6) = 5.31 × 10^-5
c) P(X = 4) = [C(6,4) C(34,2)] / C(40,6) = 0.00219
d) Let Y denote the number of weeks needed to match all six numbers.
Then, Y has a geometric distribution with p = 1/3,838,380 and
E(Y) = 1/p = 3,838,380 weeks. This is more than 738 centuries!

3-125. Let X denote the number of blades in the sample that are dull.
a) P(X ≥ 1) = 1 - P(X = 0)
P(X = 0) = [C(10,0) C(38,5)] / C(48,5) = (38! 43!)/(33! 48!) = 0.2931
P(X ≥ 1) = 1 - P(X = 0) = 0.7069
b) Let Y denote the number of days needed to replace the assembly.
P(Y = 3) = 0.2931^2 (0.7069) = 0.0607
c) On the first day, P(X = 0) = [C(2,0) C(46,5)] / C(48,5) = (46! 43!)/(41! 48!) = 0.8005
On the second day, P(X = 0) = [C(6,0) C(42,5)] / C(48,5) = (42! 43!)/(37! 48!) = 0.4968
On the third day, P(X = 0) = 0.2931 from part a). Therefore,
P(Y = 3) = 0.8005(0.4968)(1 - 0.2931) = 0.2811.

3-126. a) For Exercise 3-117, the finite population correction is 96/99.

For Exercise 3-118, the finite population correction is 16/19.
Because the finite population correction for Exercise 3-117 is closer to one, the binomial approximation to the
distribution of X should be better in Exercise 3-117.

b) Assuming X has a binomial distribution with n = 4 and p = 0.2,

P(X = 1) = C(4,1) 0.2^1 (0.8)^3 = 0.4096
P(X = 4) = C(4,4) 0.2^4 (0.8)^0 = 0.0016

The results from the binomial approximation are close to the probabilities obtained in Exercise 3-117.

c) Assume X has a binomial distribution with n = 4 and p = 0.2. Consequently, P(X = 1) and P(X = 4) are the same as
computed in part (b) of this exercise. This binomial approximation is not as close to the true answer as the results
obtained in part (b) of this exercise.

d) From Exercise 3-122, X is approximately binomial with n = 20 and p = 20/140 = 1/7.

P(X ≥ 1) = 1 - P(X = 0) = 1 - C(20,0)(1/7)^0 (6/7)^20 = 1 - 0.0458 = 0.9542
The finite population correction is 120/139 = 0.8633.

From Exercise 3-122, X is approximately binomial with n = 20 and p = 5/140 = 1/28.

P(X ≥ 1) = 1 - P(X = 0) = 1 - C(20,0)(1/28)^0 (27/28)^20 = 1 - 0.4832 = 0.5168
The finite population correction is 120/139 = 0.8633.

3-127. a)

b)

c)

3-128. a)

b)

c)

Section 3-9

e 4 4 0
3-129. a) P( X = 0 ) = = e 4 = 0 .0183
0!
b) P(X 2 ) = P( X = 0 ) + P( X = 1) + P( X = 2)

e 4 41 e 4 42
= e 4 + +
1! 2!
= 0.2381
e 4 4 4
c) P (X = 4 ) = = 0.1954
4!
e 4 4 8
d) P (X = 8 ) = = 0.0298
8!
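
All four parts are Poisson(4) probabilities, so they can be checked in one place. A minimal sketch, assuming scipy is available (not part of the text):

    # Check of Exercise 3-129 with scipy.stats.poisson, lambda = 4.
    from scipy.stats import poisson

    lam = 4
    print(poisson.pmf(0, lam))   # a) 0.0183
    print(poisson.cdf(2, lam))   # b) 0.2381
    print(poisson.pmf(4, lam))   # c) 0.1954
    print(poisson.pmf(8, lam))   # d) 0.0298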

3-130. a) P(X = 0) = e^-0.4 = 0.6703

b) P(X ≤ 2) = e^-0.4 + e^-0.4 (0.4)/1! + e^-0.4 (0.4)^2/2! = 0.9921
c) P(X = 4) = e^-0.4 (0.4)^4 / 4! = 0.000715
d) P(X = 8) = e^-0.4 (0.4)^8 / 8! = 1.09 × 10^-8


3-131. P(X = 0) = e^-λ = 0.05. Therefore, λ = -ln(0.05) = 2.996.
Consequently, E(X) = V(X) = 2.996.

3-132. a) Let X denote the number of calls in one hour. Then, X is a Poisson random variable with λ = 10.
P(X = 5) = e^-10 10^5 / 5! = 0.0378
b) P(X ≤ 3) = e^-10 + e^-10 10/1! + e^-10 10^2/2! + e^-10 10^3/3! = 0.0103
c) Let Y denote the number of calls in two hours. Then, Y is a Poisson random variable with λ = 20.
P(Y = 15) = e^-20 20^15 / 15! = 0.0516
d) Let W denote the number of calls in 30 minutes. Then W is a Poisson random variable with λ = 5.
P(W = 5) = e^-5 5^5 / 5! = 0.1755
3-133. λ = 1, Poisson distribution, f(x) = e^-λ λ^x / x!
(a) P(X ≥ 2) = 0.264
(b) In order that P(X ≥ 1) = 1 - P(X = 0) = 1 - e^-λ exceed 0.95, we need λ = 3.
Therefore 3(16) = 48 cubic light-years of space must be studied.

3-134. (a) λ = 14.4, P(X = 0) = 6 × 10^-7

(b) λ = 14.4/5 = 2.88, P(X = 0) = 0.056
(c) λ = 14.4(7)(28.35)/225 = 12.7
P(X ≥ 1) = 0.999997
(d) P(X ≥ 28.8) = 1 - P(X ≤ 28) = 0.00046. Unusual.

3-135. (a) λ=0.61. P(X≥1)=0.4566


(b) λ=0.61*5=3.05, P(X=0)= 0.047.

3-136. a) Let X denote the number of flaws in one square meter of cloth. Then, X is a Poisson random variable
with λ = 0.1. P(X = 2) = e^-0.1 (0.1)^2 / 2! = 0.0045
b) Let Y denote the number of flaws in 10 square meters of cloth. Then, Y is a Poisson random variable
with λ = 1. P(Y = 1) = e^-1 1^1 / 1! = e^-1 = 0.3679
c) Let W denote the number of flaws in 20 square meters of cloth. Then, W is a Poisson random variable
with λ = 2. P(W = 0) = e^-2 = 0.1353
d) P(Y ≥ 2) = 1 - P(Y ≤ 1) = 1 - P(Y = 0) - P(Y = 1)
  = 1 - e^-1 - e^-1
  = 0.2642

3-137. a) E(X) = λ = 0.2 errors per test area

b) P(X ≤ 2) = e^-0.2 + e^-0.2 (0.2)/1! + e^-0.2 (0.2)^2/2! = 0.9989
99.89% of test areas

3-138. a) Let X denote the number of cracks in 5 miles of highway.
Then, X is a Poisson random variable with λ = 10.
P(X = 0) = e^-10 = 4.54 × 10^-5
b) Let Y denote the number of cracks in a half mile of highway.
Then, Y is a Poisson random variable with λ = 1.
P(Y ≥ 1) = 1 - P(Y = 0) = 1 - e^-1 = 0.6321
c) The assumptions of a Poisson process require that the probability of an event is constant for all intervals. If the
probability of a count depends on traffic load and the load varies, then the assumptions of a Poisson process are not
valid. Separate Poisson random variables might be appropriate for the heavy and light load sections of the highway.

3-139. a) Let X denote the number of flaws in 10 square feet of plastic panel.
Then, X is a Poisson random variable with λ = 0.5.
P(X = 0) = e^-0.5 = 0.6065
b) Let Y denote the number of cars with no flaws.
P(Y = 10) = C(10,10) (0.6065)^10 (0.3935)^0 = 0.0067
c) Let W denote the number of cars with surface flaws. Because the number of flaws has a Poisson distribution, the
occurrences of surface flaws in cars are independent events with constant probability. From part (a), the probability a
car contains surface flaws is 1 - 0.6065 = 0.3935. Consequently, W is binomial with n = 10 and p = 0.3935.
P(W = 0) = C(10,0) (0.3935)^0 (0.6065)^10 = 0.0067
P(W = 1) = C(10,1) (0.3935)^1 (0.6065)^9 = 0.0437
P(W ≤ 1) = 0.0067 + 0.0437 = 0.0504
3-140. a) Let X denote the number of failures in 8 hours. Then, X has a Poisson distribution with λ = 0.16.
P(X = 0) = e^-0.16 = 0.8521
b) Let Y denote the number of failures in 24 hours. Then, Y has a Poisson distribution with λ = 0.48.
P(Y ≥ 1) = 1 - P(Y = 0) = 1 - e^-0.48 = 0.3812

3-141. a)
b)

c)

3-142. a)
b)

c) No, if a Poisson distribution is assumed, the intervals need not be consecutive.

Supplemental Exercises

1 1  1 1 3 1  1
3-143. E(X ) =  +   +   = ,
8  3  4 3 8 3  4
2 2 2 2
 1   1  1   1   3  1   1 
V ( X ) =     +     +        = 0.0104
 8   3  4   3   8  3   4 
 1000 
a) P( X = 1) = 
 1  0.001 (0.999) = 0.3681
1 999
3-144.
 
 1000
b)P (X  1) = 1  P (X = 0) = 1    0.0010 ( 0.999)999 = 0.6319
 0 
 1000  1000   1000
c) P( X  2) =  0.0010 (0.999)1000 +  0.0011 (0.999)999 +   0.0012 0.999998
 0   1   2 
= 0.9198
d)E (X ) = 1000(0.001) = 1
V (X )= 1000(0.001)(0.999) = 0.999
3-145. a) n = 50, p = 5/50 = 0.1, since E(X) = 5 = np
b) P(X ≤ 2) = C(50,0) 0.1^0 (0.9)^50 + C(50,1) 0.1^1 (0.9)^49 + C(50,2) 0.1^2 (0.9)^48 = 0.112
c) P(X ≥ 49) = C(50,49) 0.1^49 (0.9)^1 + C(50,50) 0.1^50 (0.9)^0 = 4.51 × 10^-48
3-146. (a) Binomial distribution, p = 0.01, n = 12.
(b) P(X > 1) = 1 - P(X ≤ 1) = 1 - C(12,0) p^0 (1 - p)^12 - C(12,1) p^1 (1 - p)^11 = 0.0062
(c) µ = E(X) = np = 12(0.01) = 0.12
V(X) = np(1 - p) = 0.1188, σ = √V(X) = 0.3447

3-147. (a) (0.5)^12 = 0.000244
(b) C(12,6) (0.5)^6 (0.5)^6 = 0.2256
(c) C(12,5) (0.5)^5 (0.5)^7 + C(12,6) (0.5)^6 (0.5)^6 = 0.4189

3-148. (a) Binomial distribution, n = 100, p = 0.01.

(b) P(X ≥ 1) = 0.634
(c) P(X ≥ 2) = 0.264
(d) µ = E(X) = np = 100(0.01) = 1

V(X) = np(1 - p) = 0.99

σ = √V(X) = 0.995

(e) Let p_d = P(X ≥ 2) = 0.264.

Y = number of messages that require two or more packets to be resent.
Y is binomially distributed with n = 10, p_m = p_d(1/10) = 0.0264.
P(Y ≥ 1) = 0.235

3-149. Let X denote the number of mornings needed to obtain a green light.
Then X is a geometric random variable with p = 0.20.
a) P(X = 4) = (1 - 0.2)^3 (0.2) = 0.1024
b) By independence, (0.8)^10 = 0.1074. (Also, P(X > 10) = 0.1074)

3-150. Let X denote the number of attempts needed to obtain a calibration that conforms to specifications.
Then, X is geometric with p = 0.6.
P(X ≤ 3) = P(X = 1) + P(X = 2) + P(X = 3) = 0.6 + 0.4(0.6) + 0.4^2(0.6) = 0.936.

3-151. Let X denote the number of fills needed to detect three underweight packages.
Then, X is a negative binomial random variable with p = 0.001 and r = 3.
a) E(X) = 3/0.001 = 3000
b) V(X) = [3(0.999)/0.001^2] = 2997000. Therefore, σ_X = 1731.18

3-152. Geometric with p = 0.1

(a) f(x) = (1 - p)^(x-1) p = (0.9)^(x-1)(0.1)
(b) P(X = 5) = 0.9^4(0.1) = 0.0656
(c) µ = E(X) = 1/p = 10
(d) P(X ≤ 10) = 0.651

3-153. (a) λ = 6(0.5) = 3.
P(X = 0) = 0.0498
(b) P(X ≥ 3) = 0.5768
(c) P(X ≤ x) ≥ 0.9 gives x = 5
(d) σ^2 = λ = 6. Not appropriate.

3-154. Let X denote the number of totes in the sample that do not conform to purity requirements. Then, X has a
hypergeometric distribution with N = 15, n = 3, and K = 2.
P(X ≥ 1) = 1 - P(X = 0) = 1 - [C(2,0) C(13,3)] / C(15,3) = 1 - (13! 12!)/(10! 15!) = 0.3714
3-155. Let X denote the number of calls that are answered in 30 seconds or less.
Then, X is a binomial random variable with p = 0.75.
a) For n = 10: P(X = 9) = C(10,9) (0.75)^9 (0.25)^1 = 0.1877
b) For n = 20: P(X ≥ 16) = P(X=16) + P(X=17) + P(X=18) + P(X=19) + P(X=20)
  = C(20,16)(0.75)^16(0.25)^4 + C(20,17)(0.75)^17(0.25)^3 + C(20,18)(0.75)^18(0.25)^2
  + C(20,19)(0.75)^19(0.25)^1 + C(20,20)(0.75)^20(0.25)^0 = 0.4148
c) E(X) = 20(0.75) = 15

3-156. Let Y denote the number of calls needed to obtain an answer in less than 30 seconds.
a) P(Y = 4) = (1 - 0.75)^3 (0.75) = 0.25^3 (0.75) = 0.0117
b) E(Y) = 1/p = 1/0.75 = 4/3

3-157. Let W denote the number of calls needed to obtain two answers in less than 30 seconds.
Then, W has a negative binomial distribution with p = 0.75.
a) P(W = 6) = C(5,1) (0.25)^4 (0.75)^2 = 0.0110
b) E(W) = r/p = 2/0.75 = 8/3
3-158. a) Let X denote the number of messages sent in one hour. Then, X is a Poisson random variable with λ = 5.
P(X = 5) = e^-5 5^5 / 5! = 0.1755
b) Let Y denote the number of messages sent in 1.5 hours.
Then, Y is a Poisson random variable with λ = 7.5.
P(Y = 10) = e^-7.5 (7.5)^10 / 10! = 0.0858
c) Let W denote the number of messages sent in one-half hour.
Then, W is a Poisson random variable with λ = 2.5.
P(W < 2) = P(W = 0) + P(W = 1) = 0.2873

3-159. X is a negative binomial with r=4 and p=0.0001


E ( X ) = r / p = 4 / 0.0001 = 40000 requests
3-160. X ~ Poisson(λ = 0.01), Y ~ Poisson(λ = 1)
P(Y ≤ 3) = e^-1 + e^-1(1)^1/1! + e^-1(1)^2/2! + e^-1(1)^3/3! = 0.9810
3-161. Let X denote the number of individuals that recover in one week. Assume the individuals are independent.
Then, X is a binomial random variable with n = 20 and p = 0.1.
P(X ≥ 4) = 1 - P(X ≤ 3) = 1 - 0.8670 = 0.1330.

3-162. a.) P(X=1) = 0 , P(X=2) = 0.0025, P(X=3) = 0.01, P(X=4) = 0.03, P(X=5) = 0.065
P(X=6) = 0.13, P(X=7) = 0.18, P(X=8) = 0.2225, P(X=9) = 0.2, P(X=10) = 0.16

b.) P(X=1) = 0.0025, P(X=1.5) = 0.01, P(X=2) = 0.03, P(X=2.5) = 0.065, P(X=3) = 0.13
P(X=3.5) = 0.18, P(X=4) = 0.2225, P(X=4.5) = 0.2, P(X=5) = 0.16

3-163. Let X denote the number of assemblies needed to obtain 5 defectives.
Then, X is a negative binomial random variable with p = 0.01 and r = 5.
a) E(X) = r/p = 500.
b) V(X) = 5(0.99)/0.01^2 = 49500 and σ_X = 222.49

3-164. Here n assemblies are checked. Let X denote the number of defective assemblies.
If P(X ≥ 1) ≥ 0.95, then P(X = 0) ≤ 0.05. Now,
P(X = 0) = C(n,0) (0.01)^0 (0.99)^n = 0.99^n and 0.99^n ≤ 0.05. Therefore,
n ln(0.99) ≤ ln(0.05)
n ≥ ln(0.05)/ln(0.99) = 298.07
This would require n = 299.
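
The same n can be found either from the log inequality or by a direct search over n; both give 299. A short illustrative sketch (not part of the text):

    # Smallest n with 0.99**n <= 0.05 (Exercise 3-164).
    import math

    n_formula = math.ceil(math.log(0.05) / math.log(0.99))
    n_search = next(n for n in range(1, 1000) if 0.99**n <= 0.05)
    print(n_formula, n_search)   # both print 299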

3-165. Require f(1) + f(2) + f(3) + f(4) = 1. Therefore, c(1+2+3+4) = 1. Therefore, c = 0.1.

3-166. Let X denote the number of products that fail during the warranty period. Assume the units are
independent. Then, X is a binomial random variable with n = 500 and p = 0.02.
a) P(X = 0) = C(500,0) (0.02)^0 (0.98)^500 = 4.1 × 10^-5
b) E(X) = 500(0.02) = 10
c) P(X > 2) = 1 - P(X ≤ 2) = 0.9995

3-167. fX (0) = (0.1)(0.7) +(0.3)(0.3) =0.16


fX (1) = (0.1)(0.7) + (0.4)(0.3) =0.19
fX (2) = (0.2)(0.7) + (0.2)(0.3) =0.20
fX (3) = (0.4)(0.7) +(0.1)(0.3) =0.31
fX ( 4) = (0.2)(0.7) +(0)(0.3) =0.14

3-168. a) P(X ≤ 3) = 0.2 + 0.4 = 0.6

b) P(X > 2.5) = 0.4 + 0.3 + 0.1 = 0.8
c) P(2.7 < X < 5.1) = 0.4 + 0.3 = 0.7
d) E(X) = 2(0.2) + 3(0.4) + 5(0.3) + 8(0.1) = 3.9
e) V(X) = 2^2(0.2) + 3^2(0.4) + 5^2(0.3) + 8^2(0.1) - (3.9)^2 = 3.09

3-169.
x 2 5.7 6.5 8.5
f(x) 0.2 0.3 0.3 0.2

3-170. Let X and Y denote the number of bolts in the sample from supplier 1 and 2, respectively.
Then, X is a hypergeometric random variable with N = 100, n = 4, and K = 30.
Also, Y is a hypergeometric random variable with N = 100, n = 4, and K = 70.
a) P(X = 4 or Y = 4) = P(X = 4) + P(Y = 4)
  = [C(30,4) C(70,0)] / C(100,4) + [C(30,0) C(70,4)] / C(100,4)
  = 0.2408
b) P[(X = 3 and Y = 1) or (Y = 3 and X = 1)] = [C(30,3) C(70,1) + C(30,1) C(70,3)] / C(100,4) = 0.4913
3-171. Let X denote the number of errors in a sector. Then, X is a Poisson random variable with λ = 0.32768.
a) P(X > 1) = 1 - P(X ≤ 1) = 1 - e^-0.32768 - e^-0.32768 (0.32768) = 0.0433
b) Let Y denote the number of sectors until an error is found.
Then, Y is a geometric random variable with p = P(X ≥ 1) = 1 - P(X = 0) = 1 - e^-0.32768 = 0.2794
E(Y) = 1/p = 3.58

3-172. Let X denote the number of orders placed in a week in a city of 800,000 people.
Then X is a Poisson random variable with λ = 0.25(8) = 2.
a) P(X ≥ 3) = 1 - P(X ≤ 2) = 1 - [e^-2 + e^-2(2) + (e^-2 2^2)/2!] = 1 - 0.6767 = 0.3233.
b) Let Y denote the number of orders in 2 weeks. Then, Y is a Poisson random variable with λ = 4, and
P(Y > 2) = 1 - P(Y ≤ 2) = 1 - [e^-4 + (e^-4 4^1)/1! + (e^-4 4^2)/2!] = 1 - [0.01832 + 0.07326 + 0.1465] = 0.7619.

3-173. a) Hypergeometric random variable with N = 500, n = 5, and K = 125
f_X(0) = [C(125,0) C(375,5)] / C(500,5) = 6.0164E10 / 2.5524E11 = 0.2357
f_X(1) = [C(125,1) C(375,4)] / C(500,5) = 125(8.10855E8) / 2.5524E11 = 0.3971
f_X(2) = [C(125,2) C(375,3)] / C(500,5) = 7750(8718875) / 2.5524E11 = 0.2647
f_X(3) = [C(125,3) C(375,2)] / C(500,5) = 317750(70125) / 2.5524E11 = 0.0873
f_X(4) = [C(125,4) C(375,1)] / C(500,5) = 9691375(375) / 2.5524E11 = 0.01424
f_X(5) = [C(125,5) C(375,0)] / C(500,5) = 2.3453E8 / 2.5524E11 = 0.00092
b)
x      0       1       2       3       4       5       6       7       8       9       10
f(x)   0.0546  0.1866  0.2837  0.2528  0.1463  0.0574  0.0155  0.0028  0.0003  0.0000  0.0000

3-174. Let X denote the number of totes in the sample that exceed the moisture content.
Then X is a binomial random variable with n = 30. We are to determine p.
If P(X ≥ 1) = 0.9, then P(X = 0) = 0.1. Then C(30,0) p^0 (1 - p)^30 = 0.1, giving 30 ln(1 - p) = ln(0.1),
which results in p = 0.0739.

3-175. Let t denote an interval of time in hours and let X denote the number of messages that arrive in time t.
Then, X is a Poisson random variable with λ = 10t.
Then, P(X = 0) = 0.9 and e^-10t = 0.9, resulting in t = 0.0105 hours = 37.8 seconds.

3-176. a) Let X denote the number of flaws in 50 panels.
Then, X is a Poisson random variable with λ = 50(0.02) = 1.
P(X = 0) = e^-1 = 0.3679.
b) Let Y denote the number of flaws in one panel.
P(Y ≥ 1) = 1 - P(Y = 0) = 1 - e^-0.02 = 0.0198.
Let W denote the number of panels that need to be inspected before a flaw is found.
Then W is a geometric random variable with p = 0.0198.
E(W) = 1/0.0198 = 50.51 panels.
c) P(Y ≥ 1) = 1 - P(Y = 0) = 1 - e^-0.02 = 0.0198
Let V denote the number of panels with 1 or more flaws.
Then V is a binomial random variable with n = 50 and p = 0.0198.
P(V ≤ 2) = C(50,0) 0.0198^0 (0.9802)^50 + C(50,1) 0.0198^1 (0.9802)^49
  + C(50,2) 0.0198^2 (0.9802)^48 = 0.9234

Mind Expanding Exercises

3-177. The binomial distribution is
P(X = x) = [n!/(x!(n - x)!)] p^x (1 - p)^(n-x)
The probability of the event can be expressed as p = λ/n and the probability mass function can be written as
P(X = x) = [n!/(x!(n - x)!)] (λ/n)^x [1 - (λ/n)]^(n-x)
         = [n(n - 1)(n - 2)(n - 3)...(n - x + 1)/n^x] (λ^x/x!) [1 - (λ/n)]^(n-x)
Now we can re-express:
[1 - (λ/n)]^(n-x) = [1 - (λ/n)]^n [1 - (λ/n)]^(-x)

In the limit as n → ∞,
n(n - 1)(n - 2)(n - 3)...(n - x + 1)/n^x → 1
As n → ∞, the limit of [1 - (λ/n)]^(-x) → 1
Also, we know that as n → ∞,
(1 - λ/n)^n → e^-λ
Thus,
P(X = x) = e^-λ λ^x / x!
The distribution of the probability associated with this process is known as the Poisson distribution and we
can express the probability mass function as
f(x) = e^-λ λ^x / x!

3-178. Show that Σ_{i=1}^∞ (1 - p)^(i-1) p = 1 using an infinite sum.
To begin, Σ_{i=1}^∞ (1 - p)^(i-1) p = p Σ_{i=1}^∞ (1 - p)^(i-1). By the formula for the sum of a geometric series this can be rewritten as
p Σ_{i=1}^∞ (1 - p)^(i-1) = p/[1 - (1 - p)] = p/p = 1

3-179.
E(X) = [a + (a + 1) + ... + b]/(b - a + 1)
     = [Σ_{i=1}^{b} i - Σ_{i=1}^{a-1} i]/(b - a + 1) = [b(b + 1)/2 - (a - 1)a/2]/(b - a + 1)
     = [(b^2 - a^2 + b + a)/2]/(b - a + 1) = [(b + a)(b - a + 1)/2]/(b - a + 1)
     = (b + a)/2

V(X) = Σ_{i=a}^{b} [i - (b + a)/2]^2 / (b - a + 1)
     = [Σ_{i=a}^{b} i^2 - (b + a) Σ_{i=a}^{b} i + (b - a + 1)(b + a)^2/4] / (b - a + 1)
     = [b(b + 1)(2b + 1)/6 - (a - 1)a(2a - 1)/6 - (b + a)(b(b + 1)/2 - (a - 1)a/2) + (b - a + 1)(b + a)^2/4] / (b - a + 1)
     = [(b - a + 1)^2 - 1]/12
3-180. Let X denote a geometric random variable with parameter p. Let q = 1 - p.

E(X) = Σ_{x=1}^∞ x(1 - p)^(x-1) p = p Σ_{x=1}^∞ x q^(x-1) = p Σ_{x=1}^∞ (d/dq) q^x
     = p (d/dq)[Σ_{x=1}^∞ q^x] = p (d/dq)[q/(1 - q)] = p [(1 - q) - q(-1)]/(1 - q)^2
     = p [1/(1 - q)^2] = p/p^2 = 1/p

V(X) = Σ_{x=1}^∞ (x - 1/p)^2 (1 - p)^(x-1) p = Σ_{x=1}^∞ (px^2 - 2x + 1/p)(1 - p)^(x-1)
     = p Σ_{x=1}^∞ x^2 q^(x-1) - 2 Σ_{x=1}^∞ x q^(x-1) + (1/p) Σ_{x=1}^∞ q^(x-1)
     = p Σ_{x=1}^∞ x^2 q^(x-1) - 2/p^2 + 1/p^2
     = p Σ_{x=1}^∞ x^2 q^(x-1) - 1/p^2
     = p (d/dq)[q + 2q^2 + 3q^3 + ...] - 1/p^2
     = p (d/dq)[q(1 + 2q + 3q^2 + ...)] - 1/p^2
     = p (d/dq)[q/(1 - q)^2] - 1/p^2 = p[2q(1 - q)^(-3) + (1 - q)^(-2)] - 1/p^2
     = [2(1 - p) + p - 1]/p^2 = (1 - p)/p^2 = q/p^2

3-181.
Let X = number of passengers with a reserved seat who arrive for the flight,
n = number of seat reservations, p = probability that a ticketed passenger arrives for the flight.
a) In this part we determine n such that P(X  120)  0.9. By testing for n in Minitab the minimum value is n =131.

b) In this part we determine n such that P(X > 120)  0.10 which is equivalent to
1 – P(X  120)  0.10 or 0.90  P(X  120).
By testing for n in Minitab the solution is n = 123.

c) One possible answer follows. If the airline is most concerned with losing customers due to over-booking, they
should only sell 123 tickets for this flight. The probability of over-booking is then at most 10%. If the airline is most
concerned with having a full flight, they should sell 131 tickets for this flight. The chance the flight is full is then at
least 90%. These calculations assume customers arrive independently and groups of people that arrive (or do not arrive)
together for travel make the analysis more complicated.

3-182. Let X denote the number of nonconforming products in the sample.

Then, X is approximately binomial with p = 0.01 and n is to be determined.
If P(X ≥ 1) ≥ 0.90, then P(X = 0) ≤ 0.10.
Now, P(X = 0) = C(n,0) p^0 (1 - p)^n = (1 - p)^n. Consequently, (1 - p)^n ≤ 0.10, and
n ≥ ln(0.10)/ln(1 - p) = 229.11. Therefore, n = 230 is required.

3-183. If the lot size is small, 10% of the lot might be insufficient to detect nonconforming product. For example, if the lot size
is 10, then a sample of size one has a probability of only 0.2 of detecting a nonconforming product in a lot that is 20%
nonconforming.

If the lot size is large, 10% of the lot might be a larger sample size than is practical or necessary. For example, if the
lot size is 5000, then a sample of 500 is required. Furthermore, the binomial approximation to the hypergeometric
distribution can be used to show the following. If 5% of the lot of size 5000 is nonconforming, then the probability of
zero nonconforming products in the sample is approximately 7 × 10^-12. Using a sample of 100, the same probability
is still only 0.0059. The sample of size 500 might be much larger than is needed.

3-184. Let X denote the number of acceptable components. Then, X has a binomial distribution with p = 0.98 and
n is to be determined such that P(X ≥ 100) ≥ 0.95.
n      P(X ≥ 100)
102    0.666
103    0.848
104    0.942
105    0.981
Therefore, 105 components are needed.
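
The table can be reproduced by evaluating P(X ≥ 100) = 1 - P(X ≤ 99) for increasing n until it reaches 0.95. A minimal sketch, assuming scipy is available (the search range is an assumption, not from the text):

    # Smallest n with P(X >= 100) >= 0.95 when X ~ binomial(n, 0.98), Exercise 3-184.
    from scipy.stats import binom

    for n in range(100, 110):
        prob = binom.sf(99, n, 0.98)   # P(X >= 100)
        print(n, round(prob, 3))
        if prob >= 0.95:
            break                      # stops at n = 105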

3-185. Let X denote the number of rolls produced.

Revenue at each demand level (0, 1000, 2000, 3000):

0 ≤ x ≤ 1000:    0.05x | 0.3x | 0.3x | 0.3x
  mean profit = 0.05x(0.3) + 0.3x(0.7) - 0.1x
1000 ≤ x ≤ 2000: 0.05x | 0.3(1000) + 0.05(x - 1000) | 0.3x | 0.3x
  mean profit = 0.05x(0.3) + [0.3(1000) + 0.05(x - 1000)](0.2) + 0.3x(0.5) - 0.1x
2000 ≤ x ≤ 3000: 0.05x | 0.3(1000) + 0.05(x - 1000) | 0.3(2000) + 0.05(x - 2000) | 0.3x
  mean profit = 0.05x(0.3) + [0.3(1000) + 0.05(x - 1000)](0.2) + [0.3(2000) + 0.05(x - 2000)](0.3) + 0.3x(0.2) - 0.1x
3000 ≤ x:        0.05x | 0.3(1000) + 0.05(x - 1000) | 0.3(2000) + 0.05(x - 2000) | 0.3(3000) + 0.05(x - 3000)
  mean profit = 0.05x(0.3) + [0.3(1000) + 0.05(x - 1000)](0.2) + [0.3(2000) + 0.05(x - 2000)](0.3)
                + [0.3(3000) + 0.05(x - 3000)](0.2) - 0.1x

                  Profit           Maximum profit
0 ≤ x ≤ 1000      0.125x           $125 at x = 1000
1000 ≤ x ≤ 2000   0.075x + 50      $200 at x = 2000
2000 ≤ x ≤ 3000   200              $200 at x = 3000
3000 ≤ x          -0.05x + 350     $200 at x = 3000

The bakery can make anywhere from 2000 to 3000 rolls and earn the same profit.
