
COURSE: Math 13 Engineering Data Analysis

TOPIC 5: JOINT PROBABILITY DISTRIBUTION

Title and Learning Outcomes

Topics covered (estimated time: 3 hours):
5.1 What Is Joint Probability Distribution?
5.2 Discrete Joint Probability Distributions
5.3 Continuous Joint Probability Distributions

Learning outcomes ("I SHOULD BE ABLE TO..."):
1. Define joint probability distribution.
2. Determine the discrete joint probability distribution.
3. Determine the continuous joint probability distribution.

Study
5.1 What is Joint Probability Distribution?

Probability distributions can also be defined for groups of random variables, which leads to joint probability distributions. In this discussion we focus on two-dimensional distributions, involving only two random variables, although higher-dimensional distributions with more than two variables are also possible.

Given that all random variables are categorized into discrete and continuous types, we consequently
encounter both discrete and continuous joint probability distributions. These distributions bear
resemblance to single-variable distributions we previously examined, yet grasping certain concepts
may necessitate a foundational understanding of multivariable calculus.

In essence, joint probability distributions describe scenarios where the outcomes represented by both random variables occur together. While we previously used X to denote a single random variable, we now use X and Y as the pair of random variables.

Joint probability distributions are defined in the form

f(x, y) = P(X = x, Y = y)

where the above represents the probability that X takes the value x and Y takes the value y at the same time.

The cumulative distribution function (CDF) for a joint probability distribution is given by:

F(x, y) = P(X ≤ x, Y ≤ y)
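As a quick sketch of the definition above, the joint CDF of a discrete pair can be computed by summing the joint PMF over all pairs at or below (x, y). The toy PMF values below are made up purely for illustration.

```python
# Sketch: joint CDF F(x, y) = P(X <= x, Y <= y), obtained by summing
# a joint PMF f over all pairs (a, b) with a <= x and b <= y.
# The PMF values below are invented solely for illustration.
f = {
    (0, 0): 0.125, (0, 1): 0.250,
    (1, 0): 0.250, (1, 1): 0.375,
}

def joint_cdf(x, y):
    # Sum the PMF over the "lower-left rectangle" ending at (x, y).
    return sum(p for (a, b), p in f.items() if a <= x and b <= y)

print(joint_cdf(0, 1))  # 0.375
print(joint_cdf(1, 1))  # 1.0 (the whole sample space)
```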
5.2 Discrete Joint Probability Distributions

When discrete random variables are paired, they result in discrete joint probability distributions.
Similar to single-variable discrete probability distributions, a discrete joint probability distribution can
be presented in tabular form, as demonstrated in the example below.

The following table illustrates the joint probability distribution derived from the outcomes of rolling a die and flipping a coin:

             x = 1   x = 2   x = 3   x = 4   x = 5   x = 6   Total
y = Heads      a       b       c       d       e       f       λ
y = Tails      g       h       i       j       k       l       μ
Total          α       β       γ       δ       ε       ζ       ω

In the table, the outcomes of the die toss are represented by x = 1, 2, 3, 4, 5, 6, while the outcomes of the coin flip are denoted by y = Heads and y = Tails. The letters a through l indicate the joint probabilities of the various events formed by the combinations of x and y, whereas the Greek letters signify the totals, with ω equal to 1. The row and column sums are known as the marginal probability distribution functions (marginal PDFs).

The probability function, also known as the probability mass function for a joint
probability distribution f(x,y) is defined such that:

• f(x,y) ≥ 0 for all (x,y)


This means that every joint probability must be greater than or equal to zero, as
dictated by the fundamental rules of probability.

• ∑x ∑y f(x,y) = 1
This means that the sum of all the joint probabilities over a given sample space
must equal one.

• f(x,y) = P(X =x, Y = y)

The joint probability mass function f(x, y) can be calculated in a number of different ways, depending on the relationship between the random variables X and Y.

If X and Y are Independent:

In the example above, flipping a coin and tossing a die are independent random variables: the result of one does not influence the outcome of the other. Assuming both the coin and the die are fair, the probabilities denoted by the letters a through l can be derived by multiplying the probabilities of the various combinations of X and Y.

EXAMPLE: P(X = 2, Y = Tails) is given by

P(X = 2, Y = Tails) = P(X = 2) · P(Y = Tails) = (1/6)(1/2) = 1/12

Since the coin and the die are fair, the probabilities a through l are all equal to 1/12.

The marginal PDFs, represented by the Greek letters, are the probabilities you would expect for each individual outcome.

EXAMPLE: Summing the Tails row gives P(Y = Tails) = 6 × (1/12) = 1/2, and summing the x = 2 column gives P(X = 2) = 2 × (1/12) = 1/6.

The table thus becomes:

             x = 1   x = 2   x = 3   x = 4   x = 5   x = 6   Total
y = Heads    1/12    1/12    1/12    1/12    1/12    1/12     1/2
y = Tails    1/12    1/12    1/12    1/12    1/12    1/12     1/2
Total         1/6     1/6     1/6     1/6     1/6     1/6      1
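The die-and-coin table above can be sketched in code: under independence each cell is the product of the marginal probabilities, and summing rows and columns recovers the marginals.

```python
from fractions import Fraction

# Sketch: build the die-and-coin joint PMF from independence,
# f(x, y) = P(X = x) * P(Y = y), then recover the marginals by
# summing rows and columns. Fractions keep the arithmetic exact.
px = {x: Fraction(1, 6) for x in range(1, 7)}         # fair die
py = {y: Fraction(1, 2) for y in ("Heads", "Tails")}  # fair coin

f = {(x, y): px[x] * py[y] for x in px for y in py}

assert f[(2, "Tails")] == Fraction(1, 12)  # each cell is 1/12
assert sum(f.values()) == 1                # total probability is 1

# Marginal PMFs: summing over y recovers P(X = x), and vice versa.
marginal_x = {x: sum(f[(x, y)] for y in py) for x in px}
marginal_y = {y: sum(f[(x, y)] for x in px) for y in py}
assert marginal_x[2] == Fraction(1, 6)
assert marginal_y["Tails"] == Fraction(1, 2)
```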

If X and Y are Dependent:

If X and Y are dependent variables, their joint probabilities are calculated from the relationship between them, as in the example below.

EXAMPLE: Given a bag containing 3 black balls, 2 blue balls and 3 green balls, a random sample
of 4 balls is selected. Given that X is the number of black balls and Y is the number of blue balls, find
the joint probability distribution of X and Y.

Solution:
The random variables X and Y are dependent since the balls are drawn from the same bag, so picking one kind of ball affects the probability of picking the other. We therefore solve this problem using combinations.
We've been informed that there are four potential outcomes of X: {0, 1, 2, 3}, meaning you can select
none, one, two, or three black balls. Similarly, for Y, there are three possible outcomes: {0, 1, 2},
representing none, one, or two blue balls.

The joint probability distribution is given by the table below:

Where:

We find the joint probability mass function f(x, y) using combinations:

f(x, y) = [C(3, x) · C(2, y) · C(3, 4 − x − y)] / C(8, 4)

The three factors in the numerator count the number of ways to pick x black balls, y blue balls, and the remaining 4 − x − y green balls, while the denominator counts all ways of choosing 4 balls from 8. We substitute the different values of x (0, 1, 2, 3) and y (0, 1, 2) and solve.

f(0, 0) is a special case: without any calculation we can claim that the probability of obtaining zero black balls and zero blue balls is zero. This is because of the size of the population relative to the sample. We draw 4 balls from a bag of 8; to pick neither black nor blue balls we would need at least 4 green balls, but there are only 3, so every sample must contain at least one black or blue ball.
Similarly, f(3, 2) = 0, since 3 black and 2 blue balls would require a sample of 5, but we draw only 4.
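The combination formula above can be checked with a short script, using Python's math.comb for the binomial coefficients and exact fractions for the probabilities:

```python
from fractions import Fraction
from math import comb

# Sketch of the example above: 3 black, 2 blue, 3 green balls; draw 4.
# f(x, y) = C(3, x) * C(2, y) * C(3, 4 - x - y) / C(8, 4).
def f(x, y):
    green = 4 - x - y        # the remaining picks must be green
    if green < 0:            # more than 4 balls requested: impossible
        return Fraction(0)
    # comb(3, green) is 0 when green > 3, which also yields probability 0.
    return Fraction(comb(3, x) * comb(2, y) * comb(3, green), comb(8, 4))

assert f(0, 0) == 0  # cannot avoid black and blue with only 3 green balls
assert f(3, 2) == 0  # 3 black + 2 blue exceeds the sample of 4
assert f(1, 1) == Fraction(18, 70)
assert sum(f(x, y) for x in range(4) for y in range(3)) == 1
```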

5.3 Continuous Joint Probability Distributions

Continuous joint probability distributions stem from sets of continuous random variables.

These distributions are defined by the Joint Density Function, which resembles that of a single-
variable case but operates in two dimensions.
The joint density function f(x,y) is characterized by the following:

• f(x,y) ≥ 0, for all (x,y)

• ∫−∞^∞ ∫−∞^∞ f(x, y) dx dy = 1

• For any region A lying in the xy-plane,

P[(X, Y) ∈ A] = ∬_A f(x, y) dx dy

The marginal probability density functions are given by

g(x) = ∫−∞^∞ f(x, y) dy

whereby the above is the probability distribution of the random variable X alone.

The probability distribution of the random variable Y alone, known as its marginal
PDF, is given by

h(y) = ∫−∞^∞ f(x, y) dx

Example:

A certain farm produces two kinds of eggs on any given day: organic and non-organic.
Let these two kinds of eggs be represented by the random variables X and Y, respectively.
Given that the joint probability density function of these variables is given by

a) Find the marginal PDF of X

b) Find the marginal PDF of Y

c) Find the P(X ≤ 1⁄2, Y ≤ 1⁄2)

Solution:
a) The marginal PDF of X is given by g(x) where

b) The marginal PDF of Y is given by h(y) where

c) P(X ≤ 1⁄2, Y ≤ 1⁄2)
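The joint density for this example did not survive conversion, so the sketch below ASSUMES a hypothetical density f(x, y) = (2/3)(x + 2y) on 0 ≤ x, y ≤ 1, chosen only because it integrates to 1, to illustrate how part (c) could be checked numerically. The handout's actual density and answers may differ.

```python
# ASSUMED density for illustration only: f(x, y) = (2/3)(x + 2y)
# on the unit square (it integrates to 1); not the handout's original.
def f(x, y):
    return (2.0 / 3.0) * (x + 2.0 * y)

def double_integral(fn, x_hi, y_hi, n=200):
    # Midpoint-rule Riemann sum over the rectangle [0, x_hi] x [0, y_hi].
    dx, dy = x_hi / n, y_hi / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * dx
        for j in range(n):
            y = (j + 0.5) * dy
            total += fn(x, y)
    return total * dx * dy

print(round(double_integral(f, 1.0, 1.0), 4))  # 1.0   (total probability)
print(round(double_integral(f, 0.5, 0.5), 4))  # 0.125 (P(X <= 1/2, Y <= 1/2))
```

Under this assumed density the marginals work out analytically to g(x) = (2/3)(x + 1) and h(y) = (1 + 4y)/3, and P(X ≤ 1/2, Y ≤ 1/2) = 1/8, which the midpoint sum reproduces.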


REFERENCES
1. Montgomery, D. C., & Runger, G. C. (2011). Applied Statistics and Probability for Engineers. John Wiley and Sons.
2. Triola, M. F. (2012). Elementary Statistics. Addison-Wesley.
3. DeCoursey, W. J. (2003). Statistics and Probability for Engineering Applications. Elsevier Science.
