Math13-Topic-5
5.1 What is a Joint Probability Distribution?
Probability distributions can also be defined for groups of random variables, which leads to joint probability distributions. In this discussion we focus on two-dimensional distributions, involving only two random variables, although higher-dimensional distributions with more than two variables are also possible.
Since all random variables are classified as either discrete or continuous, we encounter both discrete and continuous joint probability distributions. These resemble the single-variable distributions we examined previously, though grasping some of the concepts may require a foundational understanding of multivariable calculus.
In essence, joint probability distributions describe scenarios in which the outcomes represented by both random variables occur together. Where we previously used X to denote a single random variable, we now work with the pair of random variables X and Y:

f(x, y) = P(X = x, Y = y)

whereby the above represents the probability that the events X = x and Y = y occur at the same time.
When discrete random variables are paired, the result is a discrete joint probability distribution. As with single-variable discrete probability distributions, a discrete joint probability distribution can be presented in tabular form, as demonstrated in the example below.
The following table illustrates the joint probability distribution derived from the outcomes when rolling a die and flipping a coin:

            x = 1   x = 2   x = 3   x = 4   x = 5   x = 6   Total
y = Heads     a       b       c       d       e       f       α
y = Tails     g       h       i       j       k       l       β
Total         γ       δ       ε       ζ       η       θ       ω

In the table, the outcomes for the die toss are represented by x = 1, 2, 3, 4, 5, 6, while the outcomes for the coin flip are denoted by y = Heads and y = Tails. The letters a through l indicate the joint probabilities corresponding to the various events formed by combinations of x and y, whereas the Greek letters signify the totals, with ω equal to 1. The sums of the rows and columns are known as the marginal probability distribution functions (PDFs).
The probability function, also known as the probability mass function, for a joint probability distribution f(x, y) is defined such that:
• f(x, y) ≥ 0 for all (x, y)
• ∑x ∑y f(x, y) = 1
which means that the joint probabilities are non-negative and that their sum over the entire sample space equals one.
The joint probability mass function f(x, y) can be calculated in a number of different ways depending on the relationship between the random variables X and Y.
In the example above, flipping a coin and tossing a die are independent random variables: the outcome of one event does not influence the outcome of the other. Assuming both the coin and the die are fair, the probabilities denoted by the letters a through l are obtained by multiplying the probabilities of the individual outcomes of X and Y:

f(x, y) = P(X = x) · P(Y = y) = (1/6)(1/2) = 1/12

Since the coin and the die are fair, the probabilities a through l are all equal to 1/12. The marginal PDFs, represented by the Greek letters, are the probabilities you would expect for each individual outcome: 1/6 for each value of x and 1/2 for each value of y.
EXAMPLE: For the fair die and fair coin, the completed table is:

            x = 1   x = 2   x = 3   x = 4   x = 5   x = 6   Total
y = Heads   1/12    1/12    1/12    1/12    1/12    1/12     1/2
y = Tails   1/12    1/12    1/12    1/12    1/12    1/12     1/2
Total        1/6     1/6     1/6     1/6     1/6     1/6      1
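As a quick numerical check, the sketch below (an illustration, not part of the original notes) builds this joint PMF as a Python dictionary and verifies that the entries sum to one and that the marginals come out to 1/6 and 1/2:

from fractions import Fraction

# Joint PMF of a fair die (x = 1..6) and a fair coin (y = 'H' or 'T').
# Because the variables are independent, each joint probability is the
# product of the individual probabilities: (1/6) * (1/2) = 1/12.
joint = {(x, y): Fraction(1, 6) * Fraction(1, 2)
         for x in range(1, 7) for y in ('H', 'T')}

# The probabilities over the whole sample space sum to 1.
assert sum(joint.values()) == 1

# Marginal PMF of X: sum over y for each x; each should be 1/6.
marginal_x = {x: sum(p for (xx, _), p in joint.items() if xx == x)
              for x in range(1, 7)}
assert all(p == Fraction(1, 6) for p in marginal_x.values())

# Marginal PMF of Y: sum over x for each y; each should be 1/2.
marginal_y = {y: sum(p for (_, yy), p in joint.items() if yy == y)
              for y in ('H', 'T')}
assert all(p == Fraction(1, 2) for p in marginal_y.values())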
If X and Y are dependent variables, their joint probabilities are calculated according to the particular relationship between them, as in the example below.
EXAMPLE: A bag contains 3 black balls, 2 blue balls and 3 green balls, and a random sample of 4 balls is selected. If X is the number of black balls in the sample and Y is the number of blue balls, find the joint probability distribution of X and Y.
Solution:
The random variables X and Y are dependent because the balls are drawn from the same bag: once any ball is picked, the probability of picking the others changes. We therefore solve this problem using combinations.
There are four possible outcomes for X: {0, 1, 2, 3}, meaning the sample can contain zero, one, two, or three black balls. Similarly, Y has three possible outcomes: {0, 1, 2}, representing zero, one, or two blue balls.
We find the joint probability mass function f(x, y) using combinations:

f(x, y) = [C(3, x) · C(2, y) · C(3, 4 − x − y)] / C(8, 4)

where C(n, k) is the number of ways of choosing k items from n. The numerator counts the ways of picking x black balls, y blue balls and the remaining 4 − x − y green balls, while the denominator C(8, 4) = 70 counts the total number of ways of picking any 4 balls from the 8 in the bag.

We substitute the different values of x (0, 1, 2, 3) and y (0, 1, 2) and solve, i.e.

f(0, 1) = 2/70    f(0, 2) = 3/70
f(1, 0) = 3/70    f(1, 1) = 18/70    f(1, 2) = 9/70
f(2, 0) = 9/70    f(2, 1) = 18/70    f(2, 2) = 3/70
f(3, 0) = 3/70    f(3, 1) = 2/70

Together with f(0, 0) = 0, these probabilities sum to 70/70 = 1, as required.
f(0, 0) is a special case: without calculating, we can claim outright that the probability of obtaining zero black balls and zero blue balls is zero. To pick neither a black nor a blue ball, all 4 balls in the sample would have to be green, but the bag contains only 3 green balls. As a rule, every sample must therefore contain at least one black or blue ball. (The formula agrees, since C(3, 4) = 0.)
f(3, 2) does not exist, since picking 3 black balls and 2 blue balls would require 5 balls, but the sample contains only 4.
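The whole table can be reproduced with a short sketch (again an illustration, not part of the original notes) using Python's math.comb:

from math import comb
from fractions import Fraction

# f(x, y) = C(3, x) * C(2, y) * C(3, 4 - x - y) / C(8, 4)
def f(x, y):
    green = 4 - x - y              # green balls needed to fill the sample
    if green < 0:                  # f(3, 2): would require 5 balls
        return Fraction(0)
    # comb(3, 4) == 0, so f(0, 0) comes out to 0 automatically.
    return Fraction(comb(3, x) * comb(2, y) * comb(3, green), comb(8, 4))

total = Fraction(0)
for x in range(4):                 # x = 0, 1, 2, 3 black balls
    for y in range(3):             # y = 0, 1, 2 blue balls
        print(f"f({x},{y}) = {f(x, y)}")
        total += f(x, y)

assert total == 1                  # the joint probabilities sum to one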
Continuous joint probability distributions arise from sets of continuous random variables. These distributions are defined by the joint density function, which resembles that of the single-variable case but operates in two dimensions.
The joint density function f(x, y) is characterized by the following:
• f(x, y) ≥ 0 for all (x, y)
• ∫∫ f(x, y) dx dy = 1, where the integral is taken over the entire (x, y) plane
• P[(X, Y) ∈ A] = ∫∫_A f(x, y) dx dy for any region A in the (x, y) plane
The probability distribution of the random variable Y alone, known as its marginal PDF, is given by

h(y) = ∫ f(x, y) dx

and similarly the marginal PDF of X alone is g(x) = ∫ f(x, y) dy, where each integral is taken over all values of the other variable.
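To make these definitions concrete, here is a small symbolic sketch using a hypothetical density (an assumption for illustration, not the density from the example below): f(x, y) = x + y on the unit square 0 ≤ x, y ≤ 1, which is non-negative and integrates to 1:

import sympy as sp

x, y = sp.symbols('x y')

# Hypothetical joint density on the unit square (illustrative only).
f = x + y

# Normalization: the double integral over the support equals 1.
total = sp.integrate(f, (x, 0, 1), (y, 0, 1))
assert total == 1

# Marginal PDF of X: integrate out y.  g(x) = x + 1/2
g = sp.integrate(f, (y, 0, 1))

# Marginal PDF of Y: integrate out x.  h(y) = y + 1/2
h = sp.integrate(f, (x, 0, 1))

print(g, h)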
Example:
A certain farm produces two kinds of eggs on any given day: organic and non-organic. Let these two kinds of eggs be represented by the random variables X and Y respectively, and suppose the joint probability density function of these variables is given by
Solution:
a) The marginal PDF of X is given by g(x), where g(x) = ∫ f(x, y) dy, with the integral taken over all values of y for which f(x, y) is non-zero.