Independent and Identically Distributed Random Variables Explained

The document provides three examples of random variables X and Y that represent the chance of a Virginia Tech football team winning and the weather conditions. In Example 1, X and Y are independent but not identically distributed (IID), as their marginal probability distributions are different. In Example 2, the numerical values assigned to the events change, but X and Y remain independent and not IID. In Example 3, the probabilities are adjusted so that X and Y have equal marginal distributions, satisfying the conditions to be both independent and identically distributed.


Independent and identically distributed (IID) random variables, example explained: These examples¹ come from Example 4.33 in Lecture Note 11.

Consider the case with two random variables 𝑋 and 𝑌. Remember that a random variable is a formalization of
a random experiment in a way that preserves the structure of events. Let's say that we want to check
the probabilistic relationship between two phenomena of interest, 'Virginia Tech football team wins'
and 'it rains', on this Saturday. These two phenomena are formalized into two random variables 𝑋 and 𝑌:
the football winning chance is represented by 𝑋, and the weather condition by 𝑌.

As we learned, the randomness structure (or, more precisely, chance regularity) of any random variable can
be expressed in terms of its probability function, in our case 𝑓(𝑥) and 𝑓(𝑦). We also need the joint
distribution 𝑓(𝑥, 𝑦) to evaluate whether 𝑋 and 𝑌 are independent and identically
distributed (IID) or not. Three examples are presented next.

Example 1

Numerical values of the events for each random variable are assigned as follows:

Phenomenon                          Event       Numerical value of each event
Virginia Tech football              Win         𝑥 = 1
team wins – 𝑋                       Lose        𝑥 = 2
It rains – 𝑌                        Rain        𝑦 = 0
                                    Not rain    𝑦 = 2

Assume that we know in advance the joint probability 𝑓(𝑥, 𝑦) as follows:

𝑥\𝑦 𝑦=0 𝑦=2


𝑥=1 𝑓(1,0) = 0.18 𝑓(1,2) = 0.12
𝑥=2 𝑓(2,0) = 0.42 𝑓(2,2) = 0.28

In terms of events, 𝑓(1,0) = 0.18 means that the probability of both a win for the Virginia Tech team
(𝑥 = 1) and rain (𝑦 = 0) is 0.18, and so on.

From this table, we can calculate the marginal probability of each event.

𝑥\𝑦       𝑦 = 0                            𝑦 = 2
𝑥 = 1     𝑓(1,0) = 0.18                    𝑓(1,2) = 0.12                    𝑓𝑥(𝑥 = 1) = 0.18 + 0.12 = 0.3
𝑥 = 2     𝑓(2,0) = 0.42                    𝑓(2,2) = 0.28                    𝑓𝑥(𝑥 = 2) = 0.42 + 0.28 = 0.7
          𝑓𝑦(𝑦 = 0) = 0.18 + 0.42 = 0.6    𝑓𝑦(𝑦 = 2) = 0.12 + 0.28 = 0.4

¹ These examples are also Example 4.39 in chapter 4 of Dr. Spanos' textbook.
The meaning of 𝑓𝑥 (𝑥 = 1) = 0.3 is that the probability of the event of Virginia Tech’s victory (averaging
over information about the weather condition) is 0.3. Now, we will check the IID relationship between 𝑋
and 𝑌.

First, two random variables 𝑋 and 𝑌 are independent if and only if:

𝑓(𝑥, 𝑦) = 𝑓𝑥(𝑥) · 𝑓𝑦(𝑦), ∀𝑥 ∈ ℝ𝑋 , 𝑦 ∈ ℝ𝑌 (1)

Now, we check whether condition (1) holds for all values of 𝑋 and 𝑌 (all events related to 𝑋 and 𝑌):

(𝑋, 𝑌) = (1,0): 𝑓(1,0) = 𝑓𝑥(1) · 𝑓𝑦(0) = 0.3 ∗ 0.6 = 0.18

(𝑋, 𝑌) = (1,2): 𝑓(1,2) = 𝑓𝑥(1) · 𝑓𝑦(2) = 0.3 ∗ 0.4 = 0.12

(𝑋, 𝑌) = (2,0): 𝑓(2,0) = 𝑓𝑥(2) · 𝑓𝑦(0) = 0.7 ∗ 0.6 = 0.42

(𝑋, 𝑌) = (2,2): 𝑓(2,2) = 𝑓𝑥(2) · 𝑓𝑦(2) = 0.7 ∗ 0.4 = 0.28

These results show that 𝑋 and 𝑌 are independent. Now, we need to check whether 𝑋 and 𝑌 are identically
distributed (ID). Two random variables 𝑋 and 𝑌 are defined as ID if and only if:

𝑓𝑥 (𝑥) = 𝑓𝑦 (𝑦), ∀𝑥 ∈ ℝ𝑋 , 𝑦 ∈ ℝ𝑌 (2)

First, we have 𝑓𝑥(2) = 0.7 ≠ 𝑓𝑦(2) = 0.4. Second, there exist no values 𝑓𝑦(1) and 𝑓𝑥(0) corresponding
to 𝑓𝑥(1) and 𝑓𝑦(0). In other words, the domain ℝ𝑋 = {1,2} of 𝑓𝑥(𝑥) and the domain ℝ𝑌 = {0,2} of
𝑓𝑦(𝑦) are not identical. Thus, condition (2) does not hold, and we conclude that 𝑋 and 𝑌 are not ID.

In summary, the two random variables 𝑋 and 𝑌 are independent but not ID.
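The checks of Example 1 can also be reproduced numerically. Below is a minimal Python sketch (the variable names and tolerance are illustrative, not from the lecture note) that computes the marginals from the joint table and then tests conditions (1) and (2):

```python
# Joint distribution of Example 1: keys are (x, y) pairs, values are f(x, y).
joint = {(1, 0): 0.18, (1, 2): 0.12, (2, 0): 0.42, (2, 2): 0.28}

# Marginal distributions: sum the joint probabilities over the other variable.
fx, fy = {}, {}
for (x, y), p in joint.items():
    fx[x] = fx.get(x, 0.0) + p
    fy[y] = fy.get(y, 0.0) + p

# Condition (1), independence: f(x, y) = fx(x) * fy(y) for every pair.
independent = all(abs(p - fx[x] * fy[y]) < 1e-12 for (x, y), p in joint.items())

# Condition (2), identically distributed: same domain and same probabilities.
identical = set(fx) == set(fy) and all(abs(fx[v] - fy[v]) < 1e-12 for v in fx)

print(independent)  # True: the joint factorizes into the marginals
print(identical)    # False: the domains {1, 2} and {0, 2} differ
```

The comparisons use a small tolerance rather than `==` because the marginals are sums of floating-point numbers.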

Example 2

In this example, the joint and marginal probability distributions between 𝑋 and 𝑌 are changed as:

𝑥\𝑦             𝑦 = 0 (raining)              𝑦 = 1 (not raining)
𝑥 = 0 (win)     𝑓(0,0) = 0.18                𝑓(0,1) = 0.12                𝑓𝑥(0) = 0.18 + 0.12 = 0.3
𝑥 = 1 (lose)    𝑓(1,0) = 0.42                𝑓(1,1) = 0.28                𝑓𝑥(1) = 0.42 + 0.28 = 0.7
                𝑓𝑦(0) = 0.18 + 0.42 = 0.6    𝑓𝑦(1) = 0.12 + 0.28 = 0.4

Compared to the joint and marginal probabilities in Example 1, the probabilities of all events in 𝑋 and 𝑌 are
the same. The only difference is that the numbers assigned to the events are modified. Now, the
domain of 𝑋 is ℝ𝑋 = {0,1} (rather than {1,2}), and that of 𝑌 is ℝ𝑌 = {0,1} (rather than {0,2}).

Because the joint probability distribution 𝑓(𝑥, 𝑦) is unchanged, 𝑋 and 𝑌 are still independent. How about
the identically distributed condition? The marginal distributions of 𝑋 and 𝑌 are not the same:

𝑓𝑥(0) = 0.3 ≠ 𝑓𝑦(0) = 0.6

𝑓𝑥(1) = 0.7 ≠ 𝑓𝑦(1) = 0.4


Thus, the ID condition for 𝑋 and 𝑌 is not satisfied. In conclusion, the two random variables 𝑋 and 𝑌 are
independent but still not IID.
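The same kind of numerical check applies to Example 2. A minimal self-contained Python sketch (variable names are illustrative): relabeling the events makes the two domains equal, but the marginal probabilities still differ, so 𝑋 and 𝑌 remain non-ID.

```python
# Joint distribution of Example 2: same probabilities as Example 1,
# but the events are relabeled so both domains are {0, 1}.
joint = {(0, 0): 0.18, (0, 1): 0.12, (1, 0): 0.42, (1, 1): 0.28}

fx, fy = {}, {}
for (x, y), p in joint.items():
    fx[x] = fx.get(x, 0.0) + p
    fy[y] = fy.get(y, 0.0) + p

same_domain = set(fx) == set(fy)  # True: both domains are {0, 1}
identical = same_domain and all(abs(fx[v] - fy[v]) < 1e-12 for v in fx)

print(same_domain)  # True
print(identical)    # False: fx(0) = 0.3 while fy(0) = 0.6
```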

Example 3

In example 3, the joint and marginal probability distributions between 𝑋 and 𝑌 are modified as:

𝑥\𝑦             𝑦 = 0 (raining)              𝑦 = 1 (not raining)
𝑥 = 0 (win)     𝑓(0,0) = 0.36                𝑓(0,1) = 0.24                𝑓𝑥(0) = 0.36 + 0.24 = 0.6
𝑥 = 1 (lose)    𝑓(1,0) = 0.24                𝑓(1,1) = 0.16                𝑓𝑥(1) = 0.24 + 0.16 = 0.4
                𝑓𝑦(0) = 0.36 + 0.24 = 0.6    𝑓𝑦(1) = 0.24 + 0.16 = 0.4

In this example, the assigned numerical values of the events and their probabilities are chosen so that 𝑋 and 𝑌
are ID:

𝑓𝑥 (0) = 0.6 = 𝑓𝑦 (0) = 0.6

𝑓𝑥 (1) = 0.4 = 𝑓𝑦 (1) = 0.4

Now, we need to evaluate whether these two variables are independent:


(𝑋, 𝑌) = (0,0): 𝑓(0,0) = 𝑓𝑥 (0). 𝑓𝑦 (0) = 0.36 = 0.6 ∗ 0.6

(𝑋, 𝑌) = (0,1): 𝑓(0,1) = 𝑓𝑥 (0). 𝑓𝑦 (1) = 0.24 = 0.6 ∗ 0.4

(𝑋, 𝑌) = (1,0): 𝑓(1,0) = 𝑓𝑥 (1). 𝑓𝑦 (0) = 0.24 = 0.4 ∗ 0.6

(𝑋, 𝑌) = (1,1): 𝑓(1,1) = 𝑓𝑥 (1). 𝑓𝑦 (1) = 0.16 = 0.4 ∗ 0.4

This check confirms that X and Y are independent. Therefore, the two random variables X
and Y are indeed IID.
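Example 3 passes both checks numerically as well. A minimal self-contained Python sketch (variable names are illustrative, as before):

```python
# Joint distribution of Example 3: probabilities chosen so both checks pass.
joint = {(0, 0): 0.36, (0, 1): 0.24, (1, 0): 0.24, (1, 1): 0.16}

fx, fy = {}, {}
for (x, y), p in joint.items():
    fx[x] = fx.get(x, 0.0) + p
    fy[y] = fy.get(y, 0.0) + p

# Both conditions now hold: the joint factorizes and the marginals coincide.
independent = all(abs(p - fx[x] * fy[y]) < 1e-12 for (x, y), p in joint.items())
identical = set(fx) == set(fy) and all(abs(fx[v] - fy[v]) < 1e-12 for v in fx)

print(independent, identical)  # True True: X and Y are IID
```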
