Joint Probability Function 111409

Chapter 6 discusses joint probability distributions involving two or more random variables, covering concepts such as joint, marginal, and conditional probability distributions, as well as independence. It includes examples illustrating the calculation of probabilities using joint probability mass and density functions, and how to derive marginal and conditional distributions. Additionally, the chapter addresses the mean and variance of marginal and conditional distributions.

Chapter 6

Joint Probability Distributions
CHAPTER OUTLINE
Two or More Random Variables
6.1 Joint Probability Distributions
6.2 Marginal Probability Distributions
6.3 Conditional Probability Distributions
6.4 Independence
6.5 More Than Two Random Variables
Learning Objectives
After careful study of this chapter, you
should be able to do the following:
1. Use joint probability mass functions and
joint probability density functions to
calculate probabilities.
2. Calculate marginal and conditional
probability distributions from joint
probability distributions.
Joint Probability Mass Function
The joint probability mass function of the discrete random variables X and Y, denoted as fXY(x,y), gives P(X = x, Y = y) for each pair (x, y); it is nonnegative and the probabilities sum to 1.
Joint Probability Density Function
The joint probability density function for the continuous random variables X and Y, denoted as fXY(x,y), satisfies the following properties:
(1) fXY(x,y) ≥ 0 for all x, y
(2) ∫∫ fXY(x,y) dx dy = 1, integrating over the entire plane
(3) For any region R of two-dimensional space, P[(X,Y) ∈ R] = ∫∫R fXY(x,y) dx dy

Figure 6-2 Joint probability density function for the random variables X and Y. Probability that (X, Y) is in the region R is determined by the volume of fXY(x,y) over the region R.
Example 6-2: Server Access Time-1
Let the random variable X denote the time until a computer server connects to your machine (in milliseconds), and let Y denote the time until the server authorizes you as a valid user (in milliseconds). X and Y measure the wait from a common starting point (x < y). The joint probability density function for X and Y is

fXY(x,y) = k e^(−0.001x − 0.002y) for 0 < x < y < ∞, where k = 6 × 10⁻⁶

Figure 6-4 The joint probability density function of X and Y is nonzero over the shaded region where x < y.
Example 6-2: Server Access Time-2
The region with nonzero probability is shaded in Fig. 6-4. We verify that it integrates to 1 as follows:

∫₀^∞ ∫ₓ^∞ fXY(x,y) dy dx = ∫₀^∞ [ ∫ₓ^∞ k e^(−0.001x − 0.002y) dy ] dx
  = k ∫₀^∞ e^(−0.001x) · (e^(−0.002x) / 0.002) dx
  = 0.003 ∫₀^∞ e^(−0.003x) dx
  = 0.003 · (1 / 0.003) = 1
Example 6-2: Server Access Time-3
Now calculate a probability:

P(X ≤ 1000, Y ≤ 2000) = ∫₀^1000 ∫ₓ^2000 fXY(x,y) dy dx
  = k ∫₀^1000 [ ∫ₓ^2000 e^(−0.002y) dy ] e^(−0.001x) dx
  = k ∫₀^1000 e^(−0.001x) · (e^(−0.002x) − e^(−4)) / 0.002 dx
  = 0.003 ∫₀^1000 ( e^(−0.003x) − e^(−4) e^(−0.001x) ) dx
  = 0.003 [ (1 − e^(−3)) / 0.003 − e^(−4)(1 − e^(−1)) / 0.001 ]
  = 0.003 (316.738 − 11.578) = 0.915

Figure 6-5 Region of integration for the probability that X < 1000 and Y < 2000 is darkly shaded.
Marginal Probability Distributions (discrete)
The marginal probability distribution for X is found by summing the probabilities in each column, whereas the marginal probability distribution for Y is found by summing the probabilities in each row.

fX(x) = Σy f(x,y)
fY(y) = Σx f(x,y)

y = Response      x = Number of Bars of Signal Strength
time (nearest       1       2       3      f(y)
second)
    1              0.01    0.02    0.25    0.28
    2              0.02    0.03    0.20    0.25
    3              0.02    0.10    0.05    0.17
    4              0.15    0.10    0.05    0.30
  f(x)             0.20    0.25    0.55    1.00

Marginal probability distributions of X and Y
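The column and row sums above are easy to automate. A minimal sketch, assuming only the joint PMF table of this example:

```python
# Joint PMF from the signal-strength table: joint[(x, y)] = f(x, y)
joint = {
    (1, 1): 0.01, (2, 1): 0.02, (3, 1): 0.25,
    (1, 2): 0.02, (2, 2): 0.03, (3, 2): 0.20,
    (1, 3): 0.02, (2, 3): 0.10, (3, 3): 0.05,
    (1, 4): 0.15, (2, 4): 0.10, (3, 4): 0.05,
}

# Marginal of X sums each column; marginal of Y sums each row
f_x = {x: round(sum(p for (xi, _), p in joint.items() if xi == x), 2) for x in (1, 2, 3)}
f_y = {y: round(sum(p for (_, yi), p in joint.items() if yi == y), 2) for y in (1, 2, 3, 4)}

print(f_x)  # {1: 0.2, 2: 0.25, 3: 0.55}
print(f_y)  # {1: 0.28, 2: 0.25, 3: 0.17, 4: 0.3}
```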
Marginal Probability Density Function (continuous)
If the joint probability density function of random variables X and Y is fXY(x,y), the marginal probability density functions of X and Y are:

fX(x) = ∫ fXY(x,y) dy    and    fY(y) = ∫ fXY(x,y) dx

where each integral extends over all values of the variable being integrated out.
Example 6-4: Server Access Time-1
For the random variables that denote times in Example 6-2, find the probability that Y exceeds 2000 milliseconds.

Integrate the joint PDF directly, using the picture to determine the limits:

P(Y > 2000) = ∫₀^2000 ∫₂₀₀₀^∞ fXY(x,y) dy dx + ∫₂₀₀₀^∞ ∫ₓ^∞ fXY(x,y) dy dx

Dark region = left dark region + right dark region

Example 6-4: Server Access Time-2
Alternatively, find the marginal PDF and then integrate that to find the desired probability.

fY(y) = ∫₀^y k e^(−0.001x − 0.002y) dx
      = k e^(−0.002y) · (1 − e^(−0.001y)) / 0.001
      = 6 × 10⁻³ e^(−0.002y) (1 − e^(−0.001y)) for y > 0

P(Y > 2000) = ∫₂₀₀₀^∞ fY(y) dy
  = 6 × 10⁻³ ∫₂₀₀₀^∞ ( e^(−0.002y) − e^(−0.003y) ) dy
  = 6 × 10⁻³ [ e^(−4) / 0.002 − e^(−6) / 0.003 ]
  = 6 × 10⁻³ (9.158 − 0.826) = 0.05
Mean & Variance of a Marginal Distribution
E(X) and V(X) can be obtained by first calculating the marginal probability distribution of X and then determining E(X) and V(X) by the usual method.

E(X) = Σ_R x · fX(x)
V(X) = Σ_R x² · fX(x) − μX²
E(Y) = Σ_R y · fY(y)
V(Y) = Σ_R y² · fY(y) − μY²
Mean & Variance for Example 6-1

y = Response      x = Number of Bars of Signal Strength
time (nearest       1       2       3      f(y)    y·f(y)   y²·f(y)
second)
    1              0.01    0.02    0.25    0.28     0.28     0.28
    2              0.02    0.03    0.20    0.25     0.50     1.00
    3              0.02    0.10    0.05    0.17     0.51     1.53
    4              0.15    0.10    0.05    0.30     1.20     4.80
  f(x)             0.20    0.25    0.55    1.00     2.49     7.61
  x·f(x)           0.20    0.50    1.65    2.35
  x²·f(x)          0.20    1.00    4.95    6.15

E(X) = 2.35    V(X) = 6.15 − 2.35² = 6.15 − 5.5225 = 0.6275
E(Y) = 2.49    V(Y) = 7.61 − 2.49² = 7.61 − 6.2001 = 1.4099
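The moment calculations above reduce to two weighted sums per marginal. A minimal sketch, assuming only the marginal tables of this example:

```python
# Marginals from the table above
f_x = {1: 0.20, 2: 0.25, 3: 0.55}
f_y = {1: 0.28, 2: 0.25, 3: 0.17, 4: 0.30}

def mean_var(pmf):
    """E and V of a discrete distribution: V = E[X^2] - (E[X])^2."""
    m = sum(v * p for v, p in pmf.items())
    return m, sum(v * v * p for v, p in pmf.items()) - m * m

ex, vx = mean_var(f_x)
ey, vy = mean_var(f_y)
print(round(ex, 2), round(vx, 4))  # 2.35 0.6275
print(round(ey, 2), round(vy, 4))  # 2.49 1.4099
```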


Conditional Probability Density Function
Given continuous random variables X and Y with joint probability density function fXY(x,y), the conditional probability density function of Y given X = x is

fY|x(y) = fXY(x,y) / fX(x)    for fX(x) > 0
Example 6-6: Conditional Probability-1
From Example 6-2, determine the conditional PDF for Y given X = x.

fX(x) = ∫ₓ^∞ k e^(−0.001x − 0.002y) dy
      = k e^(−0.001x) · e^(−0.002x) / 0.002
      = 0.003 e^(−0.003x) for x > 0

fY|x(y) = fXY(x,y) / fX(x)
        = k e^(−0.001x − 0.002y) / ( 0.003 e^(−0.003x) )
        = 0.002 e^(0.002x − 0.002y) for 0 < x and x < y
Example 6-6: Conditional Probability-2
Now find the probability that Y exceeds 2000 given that X = 1500:

P(Y > 2000 | X = 1500) = ∫₂₀₀₀^∞ fY|1500(y) dy
  = ∫₂₀₀₀^∞ 0.002 e^(0.002(1500) − 0.002y) dy
  = 0.002 e³ [ e^(−0.002y) / (−0.002) ]₂₀₀₀^∞
  = 0.002 e³ · e^(−4) / 0.002
  = e^(−1) = 0.368
Mean & Variance of Conditional Random Variables

• The conditional mean of Y given X = x, denoted as E(Y|x) or μY|x, is

E(Y|x) = Σy y · fY|x(y)

• The conditional variance of Y given X = x, denoted as V(Y|x) or σ²Y|x, is

V(Y|x) = Σy (y − μY|x)² · fY|x(y) = Σy y² · fY|x(y) − μ²Y|x
Example 6-8: Conditional Mean and Variance
From Examples 6-2 and 6-6, what is the conditional mean for Y given that x = 1500?

E(Y | X = 1500) = ∫₁₅₀₀^∞ y · 0.002 e^(0.002(1500) − 0.002y) dy = 0.002 e³ ∫₁₅₀₀^∞ y e^(−0.002y) dy

Integrating by parts:

  = 0.002 e³ [ −y e^(−0.002y) / 0.002 |₁₅₀₀^∞ + (1 / 0.002) ∫₁₅₀₀^∞ e^(−0.002y) dy ]
  = 0.002 e³ [ 1500 e^(−3) / 0.002 + e^(−3) / 0.002² ]
  = e³ e^(−3) ( 1500 + 1 / 0.002 )
  = 1500 + 500 = 2000

If the connect time is 1500 ms, then the expected time to be authorized is 2000 ms.
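The integration-by-parts result can be cross-checked by numerical integration. A minimal sketch, assuming only the conditional PDF of Example 6-6 (the integral is truncated at y = 21500, where the tail is negligible):

```python
import math

# Numerical check of E(Y | X = 1500) = 2000 using
# f(y | x=1500) = 0.002 * exp(3 - 0.002y) for y > 1500
dy = 0.5
mean = sum((1500 + (i + 0.5) * dy) * 0.002 * math.exp(3 - 0.002 * (1500 + (i + 0.5) * dy)) * dy
           for i in range(40000))

print(round(mean, 1))  # 2000.0
```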
Example 6-9
For the discrete random variables in Example 6-1, what is the conditional mean of Y given X = 1?

y = Response      x = Number of Bars of Signal Strength
time (nearest       1       2       3      f(y)
second)
    1              0.01    0.02    0.25    0.28
    2              0.02    0.03    0.20    0.25
    3              0.02    0.10    0.05    0.17
    4              0.15    0.10    0.05    0.30
  f(x)             0.20    0.25    0.55

    y     f(y|x=1)  f(y|x=2)  f(y|x=3)   y·f(y|x=1)   y²·f(y|x=1)
    1      0.050     0.080     0.455        0.05          0.05
    2      0.100     0.120     0.364        0.20          0.40
    3      0.100     0.400     0.091        0.30          0.90
    4      0.750     0.400     0.091        3.00         12.00
  Sum      1.000     1.000     1.000        3.55         13.35

V(Y | X = 1) = 13.35 − 3.55² = 13.35 − 12.6025 = 0.7475

The conditional mean of the response time given one bar of signal strength is 3.55 seconds, with variance 0.7475.
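The conditioning step above divides one column of the joint table by its marginal. A minimal sketch, assuming only the joint PMF of Example 6-1:

```python
# Joint PMF f(x, y) from the table; condition on X = 1
joint = {(1, 1): 0.01, (1, 2): 0.02, (1, 3): 0.02, (1, 4): 0.15,
         (2, 1): 0.02, (2, 2): 0.03, (2, 3): 0.10, (2, 4): 0.10,
         (3, 1): 0.25, (3, 2): 0.20, (3, 3): 0.05, (3, 4): 0.05}

x = 1
fx = sum(p for (xi, _), p in joint.items() if xi == x)   # f_X(1) = 0.20
cond = {y: joint[(x, y)] / fx for y in (1, 2, 3, 4)}     # f(y | x=1)
mean = sum(y * p for y, p in cond.items())
var = sum(y * y * p for y, p in cond.items()) - mean ** 2

print(round(mean, 2), round(var, 4))  # 3.55 0.7475
```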
Independent Random Variables
For random variables X and Y, if any one of the following properties is true, the others are also true, and X and Y are independent:
(1) fXY(x,y) = fX(x) · fY(y) for all x and y
(2) fY|x(y) = fY(y) for all x and y with fX(x) > 0
(3) fX|y(x) = fX(x) for all x and y with fY(y) > 0
(4) P(X ∈ A, Y ∈ B) = P(X ∈ A) · P(Y ∈ B) for any sets A and B
Example 6-11: Independent Random Variables
• Suppose Example 6-2 is modified so that the joint PDF is

fXY(x,y) = 2 × 10⁻⁶ e^(−0.001x − 0.002y) for x ≥ 0 and y ≥ 0

• Are X and Y independent?

fX(x) = ∫₀^∞ 2 × 10⁻⁶ e^(−0.001x − 0.002y) dy = 0.001 e^(−0.001x) for x > 0
fY(y) = ∫₀^∞ 2 × 10⁻⁶ e^(−0.001x − 0.002y) dx = 0.002 e^(−0.002y) for y > 0

Since fXY(x,y) = fX(x) · fY(y), X and Y are independent.

• Find the probability

P(X > 1000, Y < 1000) = P(X > 1000) · P(Y < 1000) = e^(−1)(1 − e^(−2)) = 0.318
Joint Probability Density Function
The joint probability density function for the continuous random variables X1, X2, X3, …, Xp, denoted as fX1X2…Xp(x1, x2, …, xp), satisfies the following properties:
(1) fX1X2…Xp(x1, x2, …, xp) ≥ 0
(2) ∫…∫ fX1X2…Xp(x1, x2, …, xp) dx1 dx2 … dxp = 1
Example 6-14: Component Lifetimes
In an electronic assembly, let X1, X2, X3, X4 denote the lifetimes of 4 components in hours. The joint PDF is:

fX1X2X3X4(x1, x2, x3, x4) = 9 × 10⁻¹² e^(−0.001x1 − 0.002x2 − 0.0015x3 − 0.003x4) for xi ≥ 0

What is the probability that the device operates more than 1000 hours?

The joint PDF is a product of exponential PDFs, so
P(X1 > 1000, X2 > 1000, X3 > 1000, X4 > 1000) = e^(−1 − 2 − 1.5 − 3) = e^(−7.5) = 0.00055
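Because the joint PDF factors, the answer is a product of four exponential tail probabilities. A minimal sketch, assuming only the four rates read off the exponent:

```python
import math

# Each factor is an exponential tail probability: P(Xi > 1000) = exp(-rate_i * 1000)
rates = [0.001, 0.002, 0.0015, 0.003]
p = math.prod(math.exp(-r * 1000) for r in rates)

print(round(p, 5))  # 0.00055  (= exp(-7.5))
```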
Marginal Probability Density Function
The marginal probability density function of Xi is obtained by integrating the joint probability density function over all of the other variables.
Mean & Variance of a Joint Distribution
The mean and variance of Xi can be determined from either the marginal PDF or the joint PDF as follows:

E(Xi) = ∫ xi fXi(xi) dxi = ∫…∫ xi fX1X2…Xp(x1, …, xp) dx1 … dxp
Example 6-16
Points that have positive probability in the joint probability distribution of three random variables X1, X2, X3 are shown in the figure. Suppose the 10 points are equally likely with probability 0.1 each. The range is the set of non-negative integers with x1 + x2 + x3 = 3.

List the marginal PMF of X2:

P(X2 = 0) = fX1X2X3(3,0,0) + fX1X2X3(0,0,3) + fX1X2X3(1,0,2) + fX1X2X3(2,0,1) = 0.4
P(X2 = 1) = fX1X2X3(2,1,0) + fX1X2X3(0,1,2) + fX1X2X3(1,1,1) = 0.3
P(X2 = 2) = fX1X2X3(1,2,0) + fX1X2X3(0,2,1) = 0.2
P(X2 = 3) = fX1X2X3(0,3,0) = 0.1

Also, E(X2) = 0(0.4) + 1(0.3) + 2(0.2) + 3(0.1) = 1
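The point count and marginal above can be generated by enumeration. A minimal sketch, assuming only the constraint x1 + x2 + x3 = 3 with equally likely points:

```python
# Enumerate the 10 equally likely points with x1 + x2 + x3 = 3, xi >= 0
points = [(a, b, 3 - a - b) for a in range(4) for b in range(4 - a)]

# Marginal PMF of X2: each point carries probability 0.1
f_x2 = {v: round(sum(0.1 for pt in points if pt[1] == v), 1) for v in range(4)}
mean = round(sum(v * p for v, p in f_x2.items()), 6)

print(len(points), f_x2, mean)  # 10 {0: 0.4, 1: 0.3, 2: 0.2, 3: 0.1} 1.0
```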


Distribution of a Subset of Random Variables
Conditional Probability Distributions
• Conditional probability distributions can be
developed for multiple random variables by
extension of the ideas used for two random
variables.
• Suppose p = 5 and we wish to find the
distribution conditional on X4 and X5.
fX1X2X3 | x4x5 (x1, x2, x3) = fX1X2X3X4X5(x1, x2, x3, x4, x5) / fX4X5(x4, x5)

for fX4X5(x4, x5) > 0.
Independence with Multiple Variables
The concept of independence can be extended to multiple variables: X1, X2, …, Xp are independent if and only if

fX1X2…Xp(x1, x2, …, xp) = fX1(x1) · fX2(x2) · … · fXp(xp) for all x1, x2, …, xp.
Example 6-18: Layer Thickness
Suppose X1, X2, and X3 represent the thickness in μm of a substrate, an active layer, and a coating layer of a chemical product. Assume that these variables are independent and normally distributed with the parameters and specified limits tabled below.

What proportion of the product meets all specifications?

Parameters and        Normal Random Variables
specified limits        X1        X2       X3
Mean (μ)             10,000     1,000      80
Std dev (σ)             250        20       4
Lower limit           9,200       950      75
Upper limit          10,800     1,050      85
P(in limits)        0.99863   0.98758  0.78870

Answer: P(all in limits) = 0.99863 × 0.98758 × 0.78870 = 0.7778 for the three-layer product.

Which one of the three thicknesses has the least probability of meeting specs?
Answer: Layer 3, the coating layer, has the least probability.
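Each table entry is a normal probability over an interval, and independence lets the three multiply. A minimal sketch using only the standard library's error function:

```python
from math import erf, sqrt

def p_within(mu, sigma, lo, hi):
    """P(lo < X < hi) for X ~ N(mu, sigma^2), via the standard normal CDF."""
    phi = lambda z: 0.5 * (1 + erf(z / sqrt(2)))
    return phi((hi - mu) / sigma) - phi((lo - mu) / sigma)

layers = [(10000, 250, 9200, 10800), (1000, 20, 950, 1050), (80, 4, 75, 85)]
probs = [p_within(*layer) for layer in layers]
print([round(p, 5) for p in probs])  # [0.99863, 0.98758, 0.7887]

p_all = probs[0] * probs[1] * probs[2]  # independence: probabilities multiply
print(round(p_all, 4))  # 0.7778
```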
Linear Functions of Random Variables
• A function of random variables is itself a
random variable.
• A function of random variables can be
formed by either linear or nonlinear
relationships. We limit our discussion here
to linear functions.
• Given random variables X1, X2,…,Xp and
constants c1, c2, …, cp
Y= c1X1 + c2X2 + … + cpXp
is a linear combination of X1, X2,…,Xp.
Mean and Variance of a Linear Function
If X1, X2, …, Xp are random variables and Y = c1X1 + c2X2 + … + cpXp, then

E(Y) = c1E(X1) + c2E(X2) + … + cpE(Xp)

and, if X1, X2, …, Xp are independent,

V(Y) = c1²V(X1) + c2²V(X2) + … + cp²V(Xp)
Example: Error Propagation
A semiconductor product consists of three layers. The variances of the thicknesses of the layers are 25, 40, and 30 nm². What is the variance of the thickness of the finished product?

Answer: Assuming the layer thicknesses are independent, V(Y) = 25 + 40 + 30 = 95 nm².
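The variance formula for a linear combination can be packaged in a one-line helper. A minimal sketch, assuming independent layers with unit coefficients:

```python
def var_linear(coeffs, variances):
    """V(c1*X1 + ... + cp*Xp) = sum(ci^2 * Vi) for independent Xi."""
    return sum(c * c * v for c, v in zip(coeffs, variances))

# Finished thickness Y = X1 + X2 + X3, layer variances in nm^2
print(var_linear([1, 1, 1], [25, 40, 30]))  # 95
```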
Mean and Variance of an Average
If Ȳ = (X1 + X2 + … + Xp) / p with E(Xi) = μ for each i, then E(Ȳ) = μ. If the Xi are also independent with V(Xi) = σ² for each i, then V(Ȳ) = σ²/p.
General Function of a Discrete Random Variable
Suppose that X is a discrete random variable with probability distribution fX(x). Let Y = h(X) define a one-to-one transformation between the values of X and Y so that the equation y = h(x) can be solved uniquely for x in terms of y. Let this solution be x = u(y), the inverse transform function. Then the probability mass function of the random variable Y is

fY(y) = fX[u(y)]
Example: Function of a Discrete Random Variable
Let X be a geometric random variable with probability distribution

fX(x) = p(1 − p)^(x−1), x = 1, 2, …

Find the probability distribution of Y = X².
Solution:
– Since X ≥ 1, the transformation is one-to-one.
– The inverse transform function is x = u(y) = √y.
– fY(y) = p(1 − p)^(√y − 1), y = 1, 4, 9, 16, …
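The transformed PMF simply relabels each probability from x to y = x². A minimal sketch, where p = 0.3 is an illustrative value (any 0 < p < 1 works):

```python
import math

p = 0.3  # illustrative parameter, not from the example

f_x = lambda x: p * (1 - p) ** (x - 1)               # geometric PMF, x = 1, 2, ...
f_y = lambda y: p * (1 - p) ** (math.isqrt(y) - 1)   # PMF of Y = X^2, y = 1, 4, 9, ...

# The transformed PMF assigns to y = x^2 exactly the probability of x
for x in range(1, 20):
    assert f_y(x * x) == f_x(x)
print("ok")
```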
General Function of a Continuous Random Variable
Suppose that X is a continuous random variable with probability distribution fX(x). Let Y = h(X) define a one-to-one transformation between the values of X and Y so that the equation y = h(x) can be solved uniquely for x in terms of y. Let this solution be x = u(y), the inverse transform function. Then the probability distribution of Y is

fY(y) = fX[u(y)] · |J|

where J = u′(y) is called the Jacobian of the transformation and the absolute value of J is used.
Example: Function of a Continuous Random Variable
Let X be a continuous random variable with probability distribution

fX(x) = x/8 for 0 < x < 4

Find the probability distribution of Y = h(X) = 2X + 4.

Note that Y has a one-to-one relationship to X:

x = u(y) = (y − 4)/2, and the Jacobian is J = u′(y) = 1/2

fY(y) = [(y − 4)/2] / 8 · (1/2) = (y − 4)/32 for 4 < y < 12
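The derived density can be checked two ways: it should integrate to 1 over (4, 12), and P(Y ≤ 8) should equal P(X ≤ 2) = ∫₀² x/8 dx = 0.25. A minimal sketch using a midpoint sum (exact here, since the density is linear):

```python
# f_Y(y) = (y - 4)/32 on (4, 12), derived above from f_X(x) = x/8 on (0, 4)
f_y = lambda y: (y - 4) / 32

dy = 0.001
total = sum(f_y(4 + (i + 0.5) * dy) * dy for i in range(8000))  # integral over (4, 12)
p_y8 = sum(f_y(4 + (i + 0.5) * dy) * dy for i in range(4000))   # P(Y <= 8)

print(round(total, 6), round(p_y8, 6))  # 1.0 0.25
```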
