Solution Manual for Probability, Statistics, and Random Processes for Engineers, 4th Edition, by Stark
3. The cumulative distribution function (CDF) for the waiting time is defined over [0, ∞) and given as

F_X(x) = x²/4,   0 ≤ x < 1
       = x/4,    1 ≤ x < 2
       = 1/2,    2 ≤ x < 10
       = x/20,   10 ≤ x < 20
       = 1,      20 ≤ x
Figure 2:
(b) Taking the derivative, we get the probability density function (pdf) as

f_X(x) = x/2,    0 ≤ x < 1
       = 1/4,    1 ≤ x < 2
       = 0,      2 ≤ x < 10
       = 1/20,   10 ≤ x < 20
       = 0,      20 ≤ x
Figure 3:
We notice that

∫ f_X(x) dx = ∫_0^1 (x/2) dx + ∫_1^2 (1/4) dx + ∫_10^20 (1/20) dx = 1/4 + 1/4 + 1/2 = 1.
4. The point of this problem is to be careful about whether the end-points x = a and x = b are included in the event or not. Remember F_X(b) = P[X ≤ b], which includes its end-point b.
Note: F_X(10) = P[X ≤ 10], but here the probability that X = 10 is zero.
P[X < b] = F_X(b) − P[X = b]
P[X ≤ b] = F_X(b)
P[a ≤ X < b] = F_X(b) − F_X(a) − P[X = b] + P[X = a]
P[a ≤ X ≤ b] = F_X(b) − F_X(a) + P[X = a]
P[a < X ≤ b] = F_X(b) − F_X(a)
P[a < X < b] = F_X(b) − F_X(a) − P[X = b]
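These relations are easy to check numerically for a purely discrete random variable. The short Python sketch below uses an illustrative three-point distribution (not one from the text) and compares direct summation of the point masses with the CDF formulas above.

# Check of the interval-probability formulas for a discrete random variable.
# The point masses below are illustrative, not from the text.
masses = {0: 0.2, 1: 0.5, 2: 0.3}          # P[X = v]

def F(x):                                   # CDF F(x) = P[X <= x]
    return sum(p for v, p in masses.items() if v <= x)

def P_eq(x):                                # P[X = x]
    return masses.get(x, 0.0)

def P_interval(lo, hi, lo_closed, hi_closed):   # direct summation over the interval
    def inside(v):
        left = v >= lo if lo_closed else v > lo
        right = v <= hi if hi_closed else v < hi
        return left and right
    return sum(p for v, p in masses.items() if inside(v))

a, b = 0, 2
print(P_interval(a, b, False, True),  F(b) - F(a))                       # P[a <  X <= b]
print(P_interval(a, b, True,  True),  F(b) - F(a) + P_eq(a))             # P[a <= X <= b]
print(P_interval(a, b, True,  False), F(b) - F(a) + P_eq(a) - P_eq(b))   # P[a <= X <  b]
print(P_interval(a, b, False, False), F(b) - F(a) - P_eq(b))             # P[a <  X <  b]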
5. For normalization, we integrate the probability density function (pdf) over the whole range
of the random variable and equate it to 1.
The pdf is

f_X(x) = K / (1 + ((x − μ)/α)²)

for |x| < ∞, α > 0, and −∞ < μ < ∞. For the integration, we will substitute tan θ = (x − μ)/α, so that

dθ/dx = (1/α) · 1/(1 + ((x − μ)/α)²)

and dx = α (1 + tan²θ) dθ. Therefore

∫_{−∞}^{∞} f_X(x) dx = K ∫_{−π/2}^{π/2} α dθ = K α θ |_{−π/2}^{π/2} = K α π = 1.

Therefore, K = 1/(απ).
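A quick numerical sanity check of this constant, using the Cauchy-type form above with arbitrary illustrative parameter values:

from math import pi
from scipy.integrate import quad

alpha, mu = 2.0, 1.0                 # illustrative values; any alpha > 0 works
K = 1.0 / (alpha * pi)               # the constant derived above

f = lambda x: K / (1.0 + ((x - mu) / alpha) ** 2)
total, err = quad(f, -float("inf"), float("inf"))
print(total)                         # should print a value very close to 1.0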
Similarly, for the pdf

f_X(x) = c x² e^{−x²/α²} u(x),

integration by parts (with u = x and dv = x e^{−x²/α²} dx, so v = −(α²/2) e^{−x²/α²}) gives

∫_0^∞ c x² e^{−x²/α²} dx = c [ −(α²/2) x e^{−x²/α²} |_0^∞ + (α²/2) ∫_0^∞ e^{−x²/α²} dx ]
                         = c [ 0 + (α²/2)(α√π/2) ]
                         = c α³ √π / 4 = 1.

Therefore, c = 4/(√π α³).
(a) Beta distribution. After the substitution x = sin²θ (so dx = 2 sin θ cos θ dθ), the normalization integral for the beta pdf becomes

∫_0^{π/2} 2 (sin θ)^{2b+1} (cos θ)^{2c+1} dθ.

Now, we look at the product of the Gamma function Γ(·) evaluated at b + 1 and c + 1. Substitutions used for this integration are u = x², v = y², and x = r sin θ, y = r cos θ:

Γ(b + 1) Γ(c + 1) = [∫_0^∞ u^b e^{−u} du] [∫_0^∞ v^c e^{−v} dv]
                  = ∫_0^∞ ∫_0^∞ u^b v^c e^{−(u+v)} du dv
                  = 4 ∫_0^∞ ∫_0^∞ x^{2b+1} y^{2c+1} e^{−(x²+y²)} dx dy
                  = 4 ∫_0^{π/2} ∫_0^∞ r^{2b+2c+2} e^{−r²} (sin θ)^{2b+1} (cos θ)^{2c+1} r dr dθ
                  = [2 ∫_0^∞ r^{2(b+c+1)+1} e^{−r²} dr] [∫_0^{π/2} 2 (sin θ)^{2b+1} (cos θ)^{2c+1} dθ]
                  = Γ(b + c + 2) ∫_0^{π/2} 2 (sin θ)^{2b+1} (cos θ)^{2c+1} dθ.

Therefore,

∫_0^{π/2} 2 (sin θ)^{2b+1} (cos θ)^{2c+1} dθ = Γ(b + 1) Γ(c + 1) / Γ(b + c + 2),

so the normalizing constant of the beta pdf is Γ(b + c + 2) / (Γ(b + 1) Γ(c + 1)).
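The Gamma-function identity can be spot-checked numerically with scipy (the parameter values below are arbitrary):

from math import sin, cos, pi
from scipy.integrate import quad
from scipy.special import gamma

b, c = 1.7, 2.3    # arbitrary positive test values

lhs, _ = quad(lambda t: 2 * sin(t) ** (2 * b + 1) * cos(t) ** (2 * c + 1), 0, pi / 2)
rhs = gamma(b + 1) * gamma(c + 1) / gamma(b + c + 2)
print(lhs, rhs)    # the two printed numbers should agree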
(b) Chi-square distribution. The pdf of a chi-square random variable is given by

f_X(x) = c x^{(n/2)−1} e^{−x/2},   x ≥ 0
       = 0,                        otherwise.

Integrating f_X(x), we get

∫_0^∞ c x^{(n/2)−1} e^{−x/2} dx = c 2^{n/2} ∫_0^∞ (x/2)^{(n/2)−1} e^{−x/2} d(x/2)
                                = c 2^{n/2} Γ(n/2) = 1.

Therefore, c = 1 / (2^{n/2} Γ(n/2)).
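A numerical check of the chi-square normalizing constant (the value of n is an arbitrary choice):

from math import exp
from scipy.integrate import quad
from scipy.special import gamma

n = 5                                         # illustrative degrees of freedom
c = 1.0 / (2 ** (n / 2) * gamma(n / 2))       # the constant derived above

f = lambda x: c * x ** (n / 2 - 1) * exp(-x / 2)
total, err = quad(f, 0, float("inf"))
print(total)                                  # should be very close to 1.0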
7. Here we do calculations with the Normal (Gaussian) random variable of mean 0 and given variance σ². In notation we often indicate this as X : N(0, σ²). In order to calculate the probabilities P[|X| ≥ kσ] for integer values k = 1, 2, . . . , we need to convert this to the standard Normal curve that is distributed as N(0, 1). In particular the so-called error function is defined as

erf(x) = (1/√(2π)) ∫_0^x exp(−t²/2) dt   for x ≥ 0,

and so only includes the right-hand side of the N(0, 1) distribution. Expanding P[|X| ≥ kσ] for k positive, we get P[|X| ≥ kσ] = P[{X ≤ −kσ} ∪ {X ≥ kσ}], which is somewhat cumbersome, so instead we consider the complementary event {|X| < kσ}, which satisfies P[|X| ≥ kσ] = 1 − P[|X| < kσ]. For this complementary event, we have
P[|X| < kσ] = P[−kσ < X < kσ]   for k > 0
            = (1/(√(2π) σ)) ∫_{−kσ}^{0} exp(−x²/(2σ²)) dx + (1/(√(2π) σ)) ∫_{0}^{kσ} exp(−x²/(2σ²)) dx
            = 2 (1/(√(2π) σ)) ∫_{0}^{kσ} exp(−x²/(2σ²)) dx   (by the symmetry about x = 0)
            = 2 erf(k)   for k > 0,
allowing us to use the standard Table 2.4-1 for erf(·). Looking up this value and subtracting twice it from one, we get

k = 1:  P[|X| ≥ σ]  = 1 − 2 erf(1) = 0.3174
k = 2:  P[|X| ≥ 2σ] = 0.0456
k = 3:  P[|X| ≥ 3σ] = 0.0026
k = 4:  P[|X| ≥ 4σ] = 0.0008 ≈ 0
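The same numbers can be reproduced with the complementary error function in Python's standard library; for X : N(0, σ²), P[|X| ≥ kσ] = erfc(k/√2). Small differences from the values above come from the rounding of the erf table.

from math import erfc, sqrt

for k in (1, 2, 3, 4):
    # P[|X| >= k*sigma] for a zero-mean Gaussian, independent of sigma
    print(k, erfc(k / sqrt(2)))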
8. The pdf of the Rayleigh random variable is given by

f_X(x) = (x/σ²) e^{−x²/(2σ²)} u(x).
9. For the Bernoulli random variable X, with P(0) = q, P(1) = p, and q = 1 − p, the pdf is given as

f_X(x) = q δ(x) + p δ(x − 1).

For the binomial random variable with parameters n and p we have, as a function of x,

f_X(x) = Σ_{k=0}^{n} (n choose k) p^k (1 − p)^{n−k} δ(x − k).
10. This problem does some calculations with a mixed random variable. We can represent the pdf of X as

f_X(x) = K e^{−x} [u(x − 1) − u(x − 4)] + (1/4) δ(x − 2) + (1/4) δ(x − 3).
(a) To find the constant K, we must integrate the pdf over all x to get 1:

∫_1^4 K e^{−x} dx + (1/4) ∫_{−∞}^{+∞} δ(x − 2) dx + (1/4) ∫_{−∞}^{+∞} δ(x − 3) dx = 1
K (e^{−1} − e^{−4}) + 1/4 + 1/4 = 1
2 K (e^{−1} − e^{−4}) = 1,

so K = 1/(2 (e^{−1} − e^{−4})).
(b) Taking the running integral ∫_{−∞}^{x} f_X(ξ) dξ, we get the CDF F_X(x), with sketch given in Fig. 4.

Figure 4:

F_X(x) = 0,                                                x < 1
       = (e^{−1} − e^{−x}) / (2 (e^{−1} − e^{−4})),         1 ≤ x < 2
       = (e^{−1} − e^{−x}) / (2 (e^{−1} − e^{−4})) + 1/4,   2 ≤ x < 3
       = (e^{−1} − e^{−x}) / (2 (e^{−1} − e^{−4})) + 1/2,   3 ≤ x < 4
       = 1,                                                4 ≤ x
(c) We calculate the pdf, with K substituted, as

f_X(x) = e^{−x} / (2 (e^{−1} − e^{−4})) [u(x − 1) − u(x − 4)] + (1/4) δ(x − 2) + (1/4) δ(x − 3).

So

P[2 ≤ X < 3] = ∫_{2⁻}^{3⁻} e^{−x} / (2 (e^{−1} − e^{−4})) dx + ∫_{2⁻}^{3⁻} (1/4) δ(x − 2) dx
             = (e^{−2} − e^{−3}) / (2 (e^{−1} − e^{−4})) + 1/4,
where we start the integral of the impulse at 2⁻ in order to pick up the probability mass at x = 2. Note that we must include the probability mass at x = 2 because the event {2 ≤ X < 3} includes this point.
(d) We calculate

P[2 < X ≤ 3] = ∫_{2⁺}^{3⁺} e^{−x} / (2 (e^{−1} − e^{−4})) dx + ∫_{2⁺}^{3⁺} (1/4) δ(x − 3) dx
             = (e^{−2} − e^{−3}) / (2 (e^{−1} − e^{−4})) + 1/4,

where we end the integral of the impulse at 3⁺ to pick up the probability mass at x = 3.
(e) We have

F_X(3) = P[X ≤ 3]
       = ∫_{1}^{3⁺} e^{−x} / (2 (e^{−1} − e^{−4})) dx + ∫_{1}^{3⁺} (1/4) δ(x − 2) dx + ∫_{1}^{3⁺} (1/4) δ(x − 3) dx
       = (e^{−1} − e^{−3}) / (2 (e^{−1} − e^{−4})) + 1/4 + 1/4.
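A short numerical cross-check of parts (a) and (c)-(e) (the helper name is ad hoc):

from math import exp
from scipy.integrate import quad

K = 1.0 / (2.0 * (exp(-1) - exp(-4)))                       # part (a)

def expo(a, b):                                             # continuous part over [a, b]
    return quad(lambda x: K * exp(-x), a, b)[0]

print(expo(1, 4) + 0.25 + 0.25)        # total probability, should be 1
print(expo(2, 3) + 0.25)               # P[2 <= X < 3], includes the mass at x = 2
print(expo(2, 3) + 0.25)               # P[2 <  X <= 3], includes the mass at x = 3
print(expo(1, 3) + 0.25 + 0.25)        # F_X(3)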
11. First we need to calculate the probability that X is less than 1 and the probability that it is greater than 2 (area of shaded region in Fig. 5). Now

P[X < 1] = P[X ≤ 1] = F_X(1) = 1 − e^{−1}
P[X > 2] = 1 − P[X ≤ 2] = 1 − F_X(2) = 1 − (1 − e^{−2}) = e^{−2}

Figure 5:

So
P[2 < X ≤ 4] = ∫_{2⁺}^{4} e^{−x} dx + 1/4
             = (e^{−2} − e^{−4}) + 1/4,

where we start the integral of the impulse at 2⁺ in order to not include the probability mass at x = 2. The overall answer then becomes (3/4)(e^{−2} − e^{−4}) + …
13. This is an example where the probability distribution is defined on a sample space which is not the elementary sample space. Normally, when we consider two coins tossed simultaneously, we consider the sample space containing two-tuples of heads and tails, indicating the outcomes of the two tosses, i.e., we consider the sample space Ω = {HH, HT, TH, TT}. Here we will see that we can also define probability on another set of outcomes.
The sample space Ω contains outcomes ζ1, ζ2, ζ3 that denote outcomes of two, one, and no heads, respectively. Assuming that the coins are unbiased, we first find the probabilities of these outcomes.
P[ζ1] = P[heads on both tosses] = 0.5 × 0.5 = 0.25
P[ζ2] = P[head on first, tail on second] + P[tail on first, head on second] = 0.5 × 0.5 + 0.5 × 0.5 = 0.5
P[ζ3] = P[tails on both tosses] = 0.5 × 0.5 = 0.25

(b) Before we check the independence of X and Y, we first find the probability mass functions (pmf) of X and Y.

P_X[0] = P[X = 0] = P[{ζ1}] + P[{ζ2}] = 0.25 + 0.5 = 0.75
P_X[1] = P[X = 1] = P[{ζ3}] = 0.25
P_X[k] = 0 for k ≠ 0, 1.
14. (a) We have to integrate the given density over the full domain. We know

∫_{−∞}^{+∞} f_X(x) dx = 1
1/8 + 1/16 + 1/16 + c ∫_{−2}^{+2} x² dx = 1
1/4 + 2c ∫_{0}^{+2} x² dx = 1
1/4 + 2c (x³/3) |_0^2 = 1
1/4 + 2c (8/3) = 1

Hence c = 9/64.
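Assuming the impulse areas (1/8, 1/16, 1/16) and the quadratic segment c·x² on [−2, 2] read off above, the constant can be confirmed with exact arithmetic:

from fractions import Fraction

impulse_mass = Fraction(1, 8) + Fraction(1, 16) + Fraction(1, 16)
integral_x2 = Fraction(16, 3)            # exact value of the integral of x^2 over [-2, 2]
c = (1 - impulse_mass) / integral_x2
print(c)                                 # 9/64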
(b) A labeled plot appears below in Fig. 6. Note the jumps occurring at the impulse
locations in the density. Also note the slope of the distribution function is given by the
density function in the smooth regions.
Figure 6:
Then

F_X(1) = 1/8 + 1/16 + 1/16 + (9/64) ∫_{−2}^{0} x² dx + (9/64) ∫_{0}^{1} x² dx
       = 1/4 + (3/8 + (9/64)(x³/3)|_0^1)      (using the symmetry of f_X)
       = 1/4 + 3/8 + (9/64)(1/3) = 43/64.
15. First we calculate the probability that X is even. Now X is binomially distributed with parameters n = 4 and p = 0.5, i.e., P[X = k] = b(k; 4, 0.5), 0 ≤ k ≤ 4, thus

P[{X = even}] = b(0; 4, 0.5) + b(2; 4, 0.5) + b(4; 4, 0.5)
              = (4 choose 0)(1/2)⁰(1/2)⁴ + (4 choose 2)(1/2)²(1/2)² + (4 choose 4)(1/2)⁴(1/2)⁰
              = 1 × (1/2)⁴ + 6 × (1/2)⁴ + 1 × (1/2)⁴ = 1/2.
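The same value falls out of a direct enumeration of the binomial pmf:

from math import comb

n, p = 4, 0.5
p_even = sum(comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(0, n + 1, 2))
print(p_even)    # 0.5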
⎪ 10 5≤ 10
⎪
⎩1 ≥ 10 ⎧
⎪
⎨0 ≤0
)= ( )− ( − 1) 1 0 ≤ 10
= 0 10
[ ≤ = ] = ( )− ( − 1)
⎧
⎪0 ≤ 0
⎪
⎨ −
(1 − 0) 1
10 1≤ ≤5
= ( )
− ≤ 10
⎪(1 −
1
)
1
10 5
⎪
⎩0 10
≤ =
]
( | = ) =
( )
(
(1 − − 0 ) 1≤ ≤5
= −
( ) (1 − 1 ) 5 ≤ 10
Hence, ( 1 − 0 1≤ ≤5
0
( | = )=
( )
1 − 1 5 ≤ 10
1
17. Let the numbers of bulbs produced by A and B be n_A and n_B respectively. We have n_A + n_B = n, where n is the total number of bulbs. So P[A] = n_A/n = 1/4 and P[B] = n_B/n = 3/4. Since we have

F(t | A) = (1 − e^{−0.2 t}) u(t),    F(t | B) = (1 − e^{−0.5 t}) u(t),

then

F(t) = F(t | A) P(A) + F(t | B) P(B) = (1/4)(1 − e^{−0.2 t}) u(t) + (3/4)(1 − e^{−0.5 t}) u(t).
So

F(2) = (1/4)(1 − e^{−0.2×2}) + (3/4)(1 − e^{−0.5×2}) ≈ 0.557
F(5) = (1/4)(1 − e^{−0.2×5}) + (3/4)(1 − e^{−0.5×5}) ≈ 0.846
F(7) = (1/4)(1 − e^{−0.2×7}) + (3/4)(1 − e^{−0.5×7}) ≈ 0.916
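A small sketch that evaluates the mixture CDF (the function name is ad hoc):

from math import exp

def F(t):
    # bulb from factory A with prob 1/4 (rate 0.2), from B with prob 3/4 (rate 0.5)
    return 0.25 * (1 - exp(-0.2 * t)) + 0.75 * (1 - exp(-0.5 * t))

for t in (2, 5, 7):
    print(t, round(F(t), 3))    # about 0.557, 0.846, 0.916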
iii) a < x ≤ b: Here we must calculate the actual intersection of the two sets {X ≤ x} and B = {a < X ≤ b}. Since x ≤ b, we get {X ≤ x} ∩ B = {X ≤ x} ∩ {a < X ≤ b} = {a < X ≤ x}. We can then calculate the conditional probability

F_X(x | B) = P[{X ≤ x} ∩ B] / P[B]
           = P[{a < X ≤ x}] / P[B]
           = (F_X(x) − F_X(a)) / (F_X(b) − F_X(a))   for a < x ≤ b.
19. In order to get P[N = k], we can consider P[N = k | λ] first and then do the integral over all λ, where the rate λ is uniform over (0, 5).

P[N = k] = ∫_{−∞}^{∞} P[N = k | Λ = λ] f_Λ(λ) dλ = (1/5) ∫_0^5 (λ^k e^{−λ} / k!) dλ = (1/(5 k!)) ∫_0^5 λ^k e^{−λ} dλ

For k = 0:
P[N = 0] = (1/5)(1/0!)(1 − e^{−5})
For k = 1:
P[N = 1] = (1/5)(1 − e^{−5}/0! − 5¹ e^{−5}/1!)
For k = 2:
P[N = 2] = (1/5)(1 − e^{−5}/0! − 5¹ e^{−5}/1! − 5² e^{−5}/2!)
For general k:
P[N = k] = (1/5)(1 − 5⁰ e^{−5}/0! − 5¹ e^{−5}/1! − ⋯ − 5^k e^{−5}/k!),   k ≥ 0.
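The closed form agrees with direct numerical integration of the conditional Poisson pmf against the uniform density on (0, 5):

from math import exp, factorial
from scipy.integrate import quad

def p_closed(k):
    # (1/5) * (1 - sum_{j=0..k} 5^j e^{-5} / j!)
    return (1 - sum(5 ** j * exp(-5) / factorial(j) for j in range(k + 1))) / 5

def p_numeric(k):
    return quad(lambda lam: lam ** k * exp(-lam) / factorial(k), 0, 5)[0] / 5

for k in range(4):
    print(k, p_closed(k), p_numeric(k))    # the two columns should match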
20. (a) For the pmf of X, P[X = k] = b(k; 8, 0.5). This is because the 8 votes are independent, each with p = 0.5 chance of being favorable. They are thus Bernoulli trials, which leads to the binomial distribution in the binary case.
(b) With B ≜ {X > 4},

F_X(x | B) ≜ P[X ≤ x | X > 4] = P[{X ≤ x} ∩ {X > 4}] / P[X > 4] = P[4 < X ≤ x] / P[X > 4].

For the denominator,

P[X > 4] = (1 − P[X = 4]) / 2 = 93/256,

where the second-to-last expression follows by symmetry of this binomial distribution about x = 4.
Turning to the numerator, we have

P[4 < X ≤ x] = 0 for x < 5, and
P[4 < X ≤ x] = (1/2)⁸ Σ_{k=5}^{8} (8 choose k) u(x − k) for x ≥ 5.

Then

F_X(x | B) = (256/93) (1/2)⁸ Σ_{k=5}^{8} (8 choose k) u(x − k)
           = (1/93) Σ_{k=5}^{8} (8 choose k) u(x − k)
           = (1/93) Σ_{k=5}^{⌊x⌋} (8 choose k)   for x ≥ 5,

and thus

F_X(x | B) = 56/93,   5 ≤ x < 6
           = 84/93,   6 ≤ x < 7
           = 92/93,   7 ≤ x < 8
           = 1,       x ≥ 8.

Now for x < 5, F_X(x | B) = 0, and for x ≥ 8, F_X(x | B) = 1, so we have the plot of Figure 1.
(c) From the calculations done above and from the definition

f_X(x | B) = dF_X(x | B)/dx
           = (1/93) Σ_{k=5}^{8} (8 choose k) δ(x − k)
           = (56/93) δ(x − 5) + (28/93) δ(x − 6) + (8/93) δ(x − 7) + (1/93) δ(x − 8),
Figure 7:
Figure 8:

with plot of Figure 2. Note we write the areas of the impulses in parentheses. In this figure, the impulse areas are 56/93, 28/93, 8/93, and 1/93.
P[4 < X ≤ 5 | B] = P[4 < X ≤ 5] / P[X > 4]
                 = P[X = 5] / P[X > 4]
                 = (1/2)⁸ (8 choose 5) / (93/256)
                 = (1/93) (8 choose 5) = 56/93.
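Parts (b)-(d) can be cross-checked by brute force over the binomial(8, 1/2) pmf (the helper names are illustrative):

from math import comb

pmf = {k: comb(8, k) * 0.5 ** 8 for k in range(9)}
p_B = sum(p for k, p in pmf.items() if k > 4)              # P[X > 4] = 93/256

def F_cond(x):
    # F_X(x | X > 4)
    return sum(p for k, p in pmf.items() if 4 < k <= x) / p_B

print(p_B, 93 / 256)
print([round(F_cond(x), 4) for x in (5, 6, 7, 8)])         # 56/93, 84/93, 92/93, 1
print(pmf[5] / p_B, 56 / 93)                               # part (d)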
21. The random variables X and Y have joint probability density function (pdf)

f_{XY}(x, y) = (3/4) x² (1 − y),   0 ≤ x ≤ 2, 0 ≤ y ≤ 1
             = 0,                  else.
(b) By definition

F_Y(0.5) = P[Y ≤ 0.5]
         = ∫_{−∞}^{+∞} ∫_{−∞}^{0.5} f_{XY}(x, y) dy dx
         = ∫_{0}^{2} ∫_{0}^{0.5} (3/4) x² (1 − y) dy dx
         = (3/4) (x³/3 |_0^2) ((y − y²/2) |_0^{0.5})
         = (3/4)(8/3)(1/2 − 1/8) = 3/4.
(c)

P[X ≤ 0.5 | Y ≤ 0.5] = P[X ≤ 0.5, Y ≤ 0.5] / P[Y ≤ 0.5]
                     = (4/3) ∫_{0}^{0.5} ∫_{0}^{0.5} (3/4) x² (1 − y) dx dy
                     = (4/3)(3/4)(1/24)(1/2 − 1/8) = 1/64.
(d) Here, we can note again that X and Y are independent random variables for the given joint pdf, and thus

P[Y ≤ 0.5 | X ≤ 0.5] = P[Y ≤ 0.5] = 3/4   from part (b).
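The values 3/4 and 1/64 can be confirmed with scipy's double integration (dblquad integrates f(y, x) with y as the inner variable):

from scipy.integrate import dblquad

f = lambda y, x: 0.75 * x ** 2 * (1 - y)      # the joint pdf on 0<=x<=2, 0<=y<=1

P_Y = dblquad(f, 0, 2, lambda x: 0, lambda x: 0.5)[0]       # P[Y <= 0.5]
P_XY = dblquad(f, 0, 0.5, lambda x: 0, lambda x: 0.5)[0]    # P[X <= 0.5, Y <= 0.5]

print(P_Y)           # 0.75            (part b)
print(P_XY / P_Y)    # 0.015625 = 1/64 (part c)
# Part (d): by independence, P[Y <= 0.5 | X <= 0.5] = P[Y <= 0.5] = 0.75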
22. To check for independence, we need to look at the marginal pdfs of X and Y. How do we find the pdfs? We can use the property that a pdf must integrate to 1. Say f_X(x) = a e^{−(1/2)(x/3)²}; requiring ∫_{−∞}^{∞} f_X(x) dx = 1, we find a = 1/(3√(2π)). Similarly, f_Y(y) = b e^{−(1/2)(y/2)²}, and ∫_{−∞}^{∞} f_Y(y) dy = 1, so b = 1/(2√(2π)). Multiplying the two marginal pdfs, we see that the product is indeed equal to the joint pdf; i.e., f_X(x) f_Y(y) = f_{XY}(x, y). Therefore, X and Y are independent random variables; their joint probability factors and hence P[0 ≤ |X| ≤ 3, 0 ≤ |Y| ≤ 2] = P[0 ≤ |X| ≤ 3] P[0 ≤ |Y| ≤ 2]. Thus
P[0 ≤ |X| ≤ 3] = (1/(3√(2π))) ∫_{−3}^{3} e^{−(1/2)(x/3)²} dx
               = 2 × (1/(3√(2π))) ∫_{0}^{3} e^{−(1/2)(x/3)²} dx = 2 erf(1)

P[0 ≤ |Y| ≤ 2] = (1/(2√(2π))) ∫_{−2}^{2} e^{−(1/2)(y/2)²} dy
               = 2 × (1/(2√(2π))) ∫_{0}^{2} e^{−(1/2)(y/2)²} dy = 2 erf(1)

So

P[0 ≤ |X| ≤ 3, 0 ≤ |Y| ≤ 2] = P[0 ≤ |X| ≤ 3] P[0 ≤ |Y| ≤ 2] = 2 erf(1) × 2 erf(1) = 4 erf(1)² = 0.466.
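Since the text's erf(x) is the standard-normal area from 0 to x, it equals 0.5·erf(x/√2) in terms of the usual error function, and the 0.466 figure follows:

from math import erf, sqrt

erf_book = lambda x: 0.5 * erf(x / sqrt(2))   # the text's erf(): N(0,1) area from 0 to x

p = (2 * erf_book(1.0)) ** 2                  # P[|X| <= 3] * P[|Y| <= 2] = (2 erf(1))^2
print(p)                                      # about 0.466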
= ∫_{−1}^{0} (1 + x) dx + ∫_{0}^{1} (1 − x) dx
= (x + x²/2) |_{−1}^{0} + (x − x²/2) |_{0}^{1}
= 1/2 + 1/2 = 1