Random Variables: Final Notes
Why random variables?
➢ To model uncertainties in engineering systems
➢ To simulate unpredictable phenomena such as weather conditions and radio noise
➢ To optimize systems for performance and reliability by increasing robustness
➢ To enhance security and privacy of data and communication through encryption and authentication
Applications:
1. Design and maintenance of engineering applications
2. Queueing theory
3. Reliability Analysis
4. Noise modelling in signal processing
Random Experiment:
• A random experiment is an experiment whose outcome cannot be predicted with certainty and which lacks a definite pattern.
• Hence, in a random experiment the outcome depends on chance; however, each outcome has a definite probability of occurrence.
• Example:
➢Tossing of a coin
➢Drawing a card from a pack of 52 cards
➢Measuring the amount of rainfall on a random day
➢Checking the price of stock at a random time
➢Measuring the height of a randomly selected person
Random Variable:
• A random variable is a function which maps the outcome of a random
process to a real number such that each value/range of values taken
is associated with a probability.
• In other words, if S is a sample space and X is a real-valued function defined on S, then X is called a random variable:
𝑋: 𝑆 → 𝑅
• Domain of a random variable is the sample space S.
• Range of a random variable is the set of all possible values taken by X, which can be either discrete or continuous.
Types of random variables:
1. Discrete random variable: a random variable that takes a countable number of possible values.
Example: count of customers, number of items sold, total of a roll of two dice.
Example:
• A balanced coin is tossed four times. List the elements of the sample
space that are presumed to be equally likely and the corresponding
values x of the random variable X which denotes the total number of
heads.
Ans:
• Here X=no. of heads in four tosses of a coin
• Sample Space={HHHH,HHHT,HHTH,HHTT,HTHH,HTHT,HTTH,HTTT,
THHH,THHT, THTH, THTT, TTHH, TTHT, TTTH, TTTT}
• Values of the random variable X associated with each outcome are
• X(HHHH)=4, X(HHHT)=3, X(HHTH)=3, X(HHTT)=2, X(HTHH)=3, X(HTHT)=2, X(HTTH)=2, X(HTTT)=1, X(THHH)=3, X(THHT)=2, X(THTH)=2, X(THTT)=1, X(TTHH)=2, X(TTHT)=1, X(TTTH)=1, X(TTTT)=0
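These counts can be checked mechanically; a minimal Python sketch (not part of the original notes) that enumerates the 16 equally likely outcomes and tabulates X:

```python
from itertools import product
from collections import Counter
from fractions import Fraction

# All 16 equally likely outcomes of tossing a coin four times
outcomes = ["".join(seq) for seq in product("HT", repeat=4)]

# X maps each outcome to its number of heads, e.g. X("HHTT") = 2
counts = Counter(outcome.count("H") for outcome in outcomes)

# P(X = x) = (number of outcomes with x heads) / 16
for x in sorted(counts):
    print(x, Fraction(counts[x], len(outcomes)))   # 0 1/16, 1 1/4, 2 3/8, 3 1/4, 4 1/16
```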
2. Continuous random variable: a random variable that takes an uncountable number of possible values.
Example: temperature on a given day, percentage obtained in a particular examination, price of a stock.
Example:
• Consider the face of a clock, and suppose that we randomly spin the second hand around the clock face. Define X as the position where the second hand stops spinning. Discuss the sample space and the range of X in this example.
• Ans:
Here the sample space can be represented by S = {x | x is a point on a circle}.
The random variable X, defined as the position where the second hand stops spinning, is a continuous random variable with range [0, 12].
Probability Distribution of a discrete random
variable:
If X is a discrete random variable, the function given by
f(x) = P(X = x)
for each x within the range of X is called the probability distribution / probability mass function of X.
Example:
• Find the probability distribution of the total number of heads obtained in four tosses of a balanced coin. Also find the cumulative distribution function.
• Ans:
• As seen earlier the values taken by X are 0,1,2,3,4
• P(X=0)=P(zero heads)=P({TTTT})=1/16
• P(X=1)=P(one head)=P({HTTT, THTT, TTHT, TTTH})=4/16
• Similarly P(X=2)=6/16 , P(X=3)=4/16 , P(X=4)=1/16
X    : 0      1      2      3      4
P(X) : 1/16   4/16   6/16   4/16   1/16
• Cumulative distribution function is as follows:
Given f(0) = 1/16, f(1) = 4/16, f(2) = 6/16, f(3) = 4/16, f(4) = 1/16, it follows that
F(0) = f(0) = 1/16
F(1) = f(0) + f(1) = 5/16
F(2) = f(0) + f(1) + f(2) = 11/16
F(3) = f(0) + f(1) + f(2) + f(3) = 15/16
F(4) = f(0) + f(1) + f(2) + f(3) + f(4) = 1
Hence
F(x) = 0       for x < 0
       1/16    for 0 ≤ x < 1
       5/16    for 1 ≤ x < 2
       11/16   for 2 ≤ x < 3
       15/16   for 3 ≤ x < 4
       1       for x ≥ 4
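The cumulative values above are just running sums of the pmf; a small sketch (assuming the pmf from the previous slide):

```python
from fractions import Fraction
from itertools import accumulate

xs = [0, 1, 2, 3, 4]
pmf = [Fraction(n, 16) for n in (1, 4, 6, 4, 1)]   # P(X = x) for x = 0..4

# Running sums give F(0), F(1), ..., F(4)
for x, Fx in zip(xs, accumulate(pmf)):
    print(x, Fx)   # 0 1/16, 1 5/16, 2 11/16, 3 15/16, 4 1

def F(t):
    """Step-function CDF: F(t) = P(X <= t) for any real t."""
    return sum(p for x, p in zip(xs, pmf) if x <= t)

print(F(2.5))   # 11/16
```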
Graph of CDF:
• [Figure: step-function graph of F(x), jumping at x = 0, 1, 2, 3, 4 to the heights 1/16, 5/16, 11/16, 15/16 and 1.]
Practice Problems
1. If X is a random variable denoting the difference between the number of heads and the number of tails obtained when a fair coin is tossed 3 times, what are the possible values of X and its probability mass function? Also write the distribution function of X.
2. Determine k such that the following functions are p.m.f.s:
a) P(x) = kx, x = 1, 2, 3, …, 10
b) P(x) = k·2^x / x!, x = 0, 1, 2, 3
3. A random variable X takes the values −3, −1, 2, 5 with probabilities as shown in the table below. Determine the distribution of X.
X        : −3            −1           2            5
P(X = x) : (2k − 3)/10   (k − 1)/10   (k − 1)/10   (k − 2)/10
4. A random variable X has the following p.m.f.
X = x_i      : 0   1    2    3    4    5     6
P(X = x_i)   : k   3k   5k   7k   9k   11k   13k
Probability distribution of a continuous random
variable:
• Probability density function:
A function with values f(x), defined over the set of all real numbers, is called a probability density function of the continuous random variable X if and only if
P(a ≤ X ≤ b) = ∫_a^b f(x) dx
for any real constants a and b with a ≤ b.
Note:
f(c), the value of the probability density of X at c, does not give P(X = c) as in
the discrete case. This is because for continuous random variables,
probabilities are always associated with intervals and P(X = c) = 0 for any real
constant c. This agrees with the above definition of probability in continuous
case.
For example:
If an individual is selected at random from a large population, then the probability that his weight X is precisely 68 kg (i.e., 68.000… kg) would be zero. However, there is a probability greater than zero that X lies between 67.000… kg and 68.500… kg.
Therefore, for a continuous random variable,
P(a ≤ X ≤ b) = P(a < X ≤ b) = P(a ≤ X < b) = P(a < X < b)
Properties of probability density function:
• A function can serve as a probability density function of a continuous
random variable X if it satisfies the following conditions:
1. f(x) ≥ 0 for −∞ < x < ∞
2. ∫_{−∞}^{∞} f(x) dx = 1
Example:
If X has a probability density function
f(x) = k·e^(−3x)   for x > 0
       0           elsewhere
find k and P(0.5 ≤ X ≤ 1).
Solution:
Since f(x) is a pdf, it must satisfy both properties of a pdf. Using the second property, ∫_{−∞}^{∞} f(x) dx = 1, we get
∫_0^∞ k·e^(−3x) dx = 1
∴ k [e^(−3x) / (−3)]_0^∞ = 1
∴ k (0 + 1/3) = 1
∴ k = 3
Also,
P(0.5 ≤ X ≤ 1) = ∫_{0.5}^1 3e^(−3x) dx = [−e^(−3x)]_{0.5}^1 = −e^(−3) + e^(−1.5) ≈ 0.173
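A numerical check of k and the probability; a minimal sketch, assuming NumPy and SciPy are available:

```python
import numpy as np
from scipy.integrate import quad

k = 3.0                                   # value found from the normalization condition
f = lambda x: k * np.exp(-3.0 * x)        # density for x > 0

total, _ = quad(f, 0.0, np.inf)           # should integrate to 1
prob, _ = quad(f, 0.5, 1.0)               # P(0.5 <= X <= 1)
print(round(total, 6), round(prob, 4))    # 1.0 0.1733
```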
Cumulative distribution function of a
continuous random variable:
Definition: If 𝑋 is a continuous random variable with a density function 𝑓(𝑡),
then the cumulative distribution function (CDF) of 𝑋 denoted by 𝐹(𝑥) is defined as
F(x) = P(X ≤ x) = ∫_{−∞}^x f(t) dt   for −∞ < x < ∞
• Properties of cdf:
i. d/dx F(x) = f(x)
ii. P(a ≤ X ≤ b) = F(b) − F(a)
iii. F(x) is a monotonically non-decreasing function, i.e. if a < b then F(a) ≤ F(b)
iv. lim_{x→∞} F(x) = 1
v. lim_{x→−∞} F(x) = 0
Example:
• Find the cumulative distribution function of the random variable X of the previous example and use it to re-evaluate P(0.5 ≤ X ≤ 1).
Solution:
For x ≤ 0,
F(x) = ∫_{−∞}^x f(t) dt = ∫_{−∞}^x 0 dt = 0
For x > 0,
F(x) = ∫_{−∞}^x f(t) dt = ∫_{−∞}^0 f(t) dt + ∫_0^x f(t) dt = ∫_0^x 3e^(−3t) dt = 1 − e^(−3x)
Hence
F(x) = 0             for x ≤ 0
       1 − e^(−3x)   for x > 0
Using this F(x) we get
P(0.5 ≤ X ≤ 1) = F(1) − F(0.5) = (1 − e^(−3)) − (1 − e^(−1.5)) ≈ 0.173,
as obtained earlier.
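The same CDF can be obtained symbolically; a minimal sketch using SymPy (an assumed dependency, not part of the original notes):

```python
import sympy as sp

x, t = sp.symbols("x t", positive=True)
f = 3 * sp.exp(-3 * t)                    # density of the previous example

F = sp.integrate(f, (t, 0, x))            # F(x) = P(X <= x) for x > 0
print(sp.simplify(F))                     # 1 - exp(-3*x)

# Re-evaluate P(0.5 <= X <= 1) = F(1) - F(0.5)
p = F.subs(x, 1) - F.subs(x, sp.Rational(1, 2))
print(float(p))                           # about 0.1733
```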
Graphs of pdf and cdf:
• If f(x) is the density function for a random variable X, then we can represent y = f(x) graphically by a curve as shown in the figure. By the properties of a pdf, the curve cannot fall below the x-axis, and the entire area bounded by the curve and the x-axis must be 1. Geometrically, the probability that X is between a and b, i.e. P(a ≤ X ≤ b), is then represented by the shaded region between x = a and x = b.
• The distribution function F(x) = P(X ≤ x) is a monotonically non-decreasing function which increases from 0 to 1.
Practice Problems:
1. Let 𝑋 be a continuous random variable with probability density function
f(x) = kx         for 0 ≤ x ≤ 2
       2k         for 2 ≤ x ≤ 4
       −kx + 6k   for 4 ≤ x ≤ 6
Find the value of 𝑘 and the mean.
2. Let 𝑋 be a continuous random variable with the following distribution:
f(x) = kx   for 0 ≤ x ≤ 5
       0    elsewhere
(a) Find 𝑘
(b) Find 𝑃 2 ≤ 𝑋 ≤ 5
(c) Find cdf
(d) Plot the graphs of the pdf and the cdf.
3. Let X be a continuous random variable with distribution function
F(x) = 0     for x < 0
       x/2   for 0 ≤ x ≤ 2
       1     for 2 < x
Find  i. P(1/2 < X < 3/2)   ii. P(1 ≤ X ≤ 2)
Expectation/mean of a random variable
• If X is a discrete random variable with probability distribution f(x), the expectation (mean) of X is
E(X) = μ = Σ_x x·f(x)
• If X is a continuous random variable with probability density f(x), the expectation of X is
E(X) = μ = ∫_{−∞}^{∞} x·f(x) dx
Properties of expectation and variance:
• If 𝑋 & 𝑌 are any two random variables and 𝑎 & 𝑏 are constants then
i. E(X + Y) = E(X) + E(Y)
ii. E(a) = a
iii. E(aX + b) = aE(X) + b
iv. Var(a) = 0
v. Var(aX + b) = a² Var(X)
vi. Var(X) = E[(X − μ)²] = E(X²) − [E(X)]²
Example 1:
• Find 𝐸[𝑋 ] where 𝑋 is the outcome when we roll a fair die.
Solution:
Since p(1) = p(2) = p(3) = p(4) = p(5) = p(6) = 1/6, we get
E(X) = 1(1/6) + 2(1/6) + 3(1/6) + 4(1/6) + 5(1/6) + 6(1/6) = 7/2
This means that if we continually roll a fair die, then after a large number of rolls the average of all the outcomes will be approximately 7/2.
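This long-run-average interpretation can be checked by simulation; a small sketch:

```python
import random

# Exact mean of a fair die: sum of x * P(X = x)
exact = sum(x * (1 / 6) for x in range(1, 7))            # 3.5

# The average of many simulated rolls should be close to 7/2
rolls = [random.randint(1, 6) for _ in range(100_000)]
print(exact, sum(rolls) / len(rolls))                    # 3.5  approx. 3.5
```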
Example 2:
• Suppose that you are expecting a message at some time past 5 P.M. From
experience you know that X , the number of hours after 5 P.M. until the
message arrives, is a continuous random variable with the following
probability density function
f(x) = 1/1.5   if 0 < x < 1.5
       0       otherwise
Find the expected amount of time past 5 P.M. until the message arrives.
Solution:
E(X) = ∫_{−∞}^{∞} x·f(x) dx = ∫_0^{1.5} x·(1/1.5) dx = 0.75
This means that on average you have to wait 0.75 hours past 5 P.M. for the message to arrive.
Example 3:
A player tosses three fair coins. The player wins Rs 500 if three heads occur, Rs 300 if two heads occur, and Rs 100 if one head occurs. On the other hand, the player loses Rs 1500 if 3 tails occur.
Find the value of the game to the player. Is the game favorable (fair)?
Solution :
Let X denote the number of heads in 3 tosses, then 𝑋 = {0,1,2,3}
Sample space={𝐻𝐻𝐻, 𝐻𝑇𝐻, 𝑇𝐻𝐻, 𝐻𝐻𝑇, 𝑇𝑇𝐻, 𝑇𝐻𝑇, 𝐻𝑇𝑇, 𝑇𝑇𝑇}
Therefore,
P(X = 0) = 1/8, P(X = 1) = 3/8, P(X = 2) = 3/8, P(X = 3) = 1/8
Thus, the expected value of the game to the player is
E = 500(1/8) + 300(3/8) + 100(3/8) − 1500(1/8) = Rs 25
Since the expected gain is positive, the game is favorable to the player; it is not fair, since a fair game has expected value 0.
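The expected gain can also be computed directly from the pmf of the number of heads; a minimal sketch with the payoffs as given in the problem:

```python
from fractions import Fraction

# P(X = x) for the number of heads X in three tosses of a fair coin
pmf = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}

# Payoff (in Rs) for each value of X: lose 1500 on three tails, otherwise win
payoff = {0: -1500, 1: 100, 2: 300, 3: 500}

expected_gain = sum(payoff[x] * p for x, p in pmf.items())
print(expected_gain)   # 25
```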
Variance and standard deviation of a random
variable
• If X is a discrete random variable and f(x) is the value of its probability distribution at x, the variance of X, denoted by σ², is
Var(X) = σ² = E[(X − μ)²] = Σ_x (x − μ)² f(x)
Correspondingly, if X is a continuous random variable and f(x) is the value of its probability density at x, the variance of X is
Var(X) = σ² = ∫_{−∞}^{∞} (x − μ)² f(x) dx
Standard deviation of a random variable (discrete or continuous) is the positive square root of the variance and is denoted by σ,
i.e. SD(X) = σ = +√Var(X)
Example 1 :
• Compute Var(X ) when X represents the outcome when we roll a fair die
• Solution:
Since p(1) = p(2) = p(3) = p(4) = p(5) = p(6) = 1/6, we get
Var(X) = E(X²) − [E(X)]²
where
E(X²) = 1²(1/6) + 2²(1/6) + 3²(1/6) + 4²(1/6) + 5²(1/6) + 6²(1/6) = 91/6
• And, as shown earlier, E(X) = 7/2.
• Hence Var(X) = 91/6 − (7/2)² = 35/12
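A quick exact check of these moments; a minimal sketch using exact fractions:

```python
from fractions import Fraction

p = Fraction(1, 6)                                  # probability of each face
faces = range(1, 7)

mean = sum(x * p for x in faces)                    # 7/2
second_moment = sum(x**2 * p for x in faces)        # 91/6
variance = second_moment - mean**2                  # 35/12
print(mean, second_moment, variance)                # 7/2 91/6 35/12
```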
Example 2:
• The density function of a random variable X is given by
f(x) = x/2   for 0 < x < 2
       0     elsewhere
• Find the variance and standard deviation of X.
• Solution:
E(X) = ∫_0^2 x·(x/2) dx = [x³/6]_0^2 = 4/3
E(X²) = ∫_0^2 x²·(x/2) dx = [x⁴/8]_0^2 = 2
Hence Var(X) = E(X²) − [E(X)]² = 2 − (4/3)² = 2/9
and SD(X) = √(2/9) = √2 / 3
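The same integrals can be evaluated symbolically; a sketch assuming SymPy is available:

```python
import sympy as sp

x = sp.symbols("x")
f = x / 2                                    # density on (0, 2)

mean = sp.integrate(x * f, (x, 0, 2))        # E(X)   = 4/3
ex2 = sp.integrate(x**2 * f, (x, 0, 2))      # E(X^2) = 2
var = ex2 - mean**2                          # 2/9
print(mean, ex2, var, sp.sqrt(var))          # 4/3 2 2/9 sqrt(2)/3
```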
Interpretation of variance:
Variance represents the spread or dispersion of the distribution of a random
variable
Examples:
1. A box contains 8 items of which 2 are defective. A person draws 3 items from
the box. Determine the expected number of defective items he has drawn.
2. A player tosses two fair coins. The player wins $2 if two heads occur, and $1 if one head occurs. On the other hand, the player loses $3 if no heads occur. Find the expected gain of the player. Is the game fair?
3. Two cards are drawn one by one with replacement from ten cards numbered
1 to 10. Find expectation of the product of the numbers on the drawn cards.
4. If it rains, a raincoat dealer can earn Rs 500 per day. If it is a dry day, he can lose Rs 100 per day. What is his expectation if the probability of rain is 0.4?
5. Determine the discrete probability distribution, variance and standard
deviation of a discrete random variable X which denotes the minimum of two
numbers that appear when a pair of fair dice is thrown once.
Examples:
1. Let a random variable X have mean 2 and standard deviation 1/2. Find
i. E(2X − 1)   ii. Var(X + 2)   iii. SD((3X − 1)/(−4))
2. A random variable X has the following probability distribution:
X = x    : −2     −1     0     1      2
P(X = x) : 0.15   0.30   0     0.30   0.25
Find
(i) E(X)   (ii) E(2X + 3)   (iii) E(4X − 5)   (iv) E(X²)   (v) Var(X)   (vi) Var(3X + 4)
3. The probability density function of a continuous random variable 𝑋 is
f(x) = x³         for 0 < x ≤ 1
       (2 − x)³   for 1 < x < 2
       0          otherwise
Find the expected value of the random variable.
Bivariate/Joint probability distribution:
• Definition:
• If X and Y are discrete random variables taking values x_1, x_2, …, x_m and y_1, y_2, …, y_n respectively, then the function given by
f(x_i, y_j) = p_ij = P(X = x_i, Y = y_j)
for each pair of values (x_i, y_j) within the range of X and Y is called the joint probability distribution of X and Y, where
i. f(x_i, y_j) ≥ 0 for all 1 ≤ i ≤ m, 1 ≤ j ≤ n
ii. Σ_{j=1}^{n} Σ_{i=1}^{m} f(x_i, y_j) = 1
The joint distribution can be displayed in a table:
            Y = y_1   Y = y_2   …   Y = y_n
X = x_1     p_11      p_12      …   p_1n
X = x_2     p_21      p_22      …   p_2n
  ⋮          ⋮         ⋮             ⋮
X = x_m     p_m1      p_m2      …   p_mn
where p_ij = P(X = x_i, Y = y_j).
Marginal Probabilities:
The probability that X = x_i (1 ≤ i ≤ m) is obtained by adding all entries in the row corresponding to x_i and is given by
f_1(x_i) = P(X = x_i) = Σ_{j=1}^{n} f(x_i, y_j)
This is called the marginal probability of X = x_i, 1 ≤ i ≤ m.
Similarly, the probability that Y = y_j (1 ≤ j ≤ n) is obtained by adding all entries in the column corresponding to y_j and is given by
f_2(y_j) = P(Y = y_j) = Σ_{i=1}^{m} f(x_i, y_j)
This is called the marginal probability of Y = y_j, 1 ≤ j ≤ n.
The joint cumulative distribution function is
F(x_i, y_j) = P(X ≤ x_i, Y ≤ y_j) = Σ_{u ≤ x_i} Σ_{v ≤ y_j} f(u, v)
Independent random variables:
• Two discrete random variables X and Y are said to be independent if
P(X = x, Y = y) = P(X = x) · P(Y = y) for all x, y.
• Similarly, if X and Y are continuous random variables, they are said to be independent if
P(X ≤ x, Y ≤ y) = P(X ≤ x) · P(Y ≤ y) for all x, y.
Example:
The joint probability function of two discrete random variables X and Y is given by f(x, y) = c(2x + y), where x and y can assume all integers such that 0 ≤ x ≤ 2, 0 ≤ y ≤ 3, and f(x, y) = 0 otherwise.
(a) Find the value of the constant c.
(b) Find 𝑃(𝑋 = 2, 𝑌 = 1)
(c) Find 𝑃 𝑋 ≥ 1, 𝑌 ≤ 2 .
(d) Find the marginal probabilities of X and Y
(e) Check if the variables 𝑋 & 𝑌 are independent.
Solution:
(a) The sample points (x, y) for which the probabilities c(2x + y) are different from zero are shown in the table below, together with the row and column totals.

            y = 0   y = 1   y = 2   y = 3   Row total
x = 0       0       c       2c      3c      6c
x = 1       2c      3c      4c      5c      14c
x = 2       4c      5c      6c      7c      22c
Col. total  6c      9c      12c     15c     42c

Since the grand total, 42c, must be equal to 1, we have c = 1/42.
• (b) P(X = 2, Y = 1) = 5c = 5/42
• (c) P(X ≥ 1, Y ≤ 2) = Σ_{x≥1} Σ_{y≤2} f(x, y)
      = (2c + 3c + 4c) + (4c + 5c + 6c)
      = 24c = 24/42 = 4/7
• (d) The marginal probability function of X is given by P(X = x) = f_1(x) and can be obtained from the row totals in the right-hand column:
f_1(x) = 6c = 6/42 = 1/7        at x = 0
         14c = 14/42 = 1/3      at x = 1
         22c = 22/42 = 11/21    at x = 2
This is the marginal distribution of X.
• The marginal probability function of Y is given by P(Y = y) = f_2(y) and can be obtained from the column totals in the bottom row:
f_2(y) = 6c = 6/42 = 1/7        at y = 0
         9c = 9/42 = 3/14       at y = 1
         12c = 12/42 = 2/7      at y = 2
         15c = 15/42 = 5/14     at y = 3
This is the marginal distribution of Y.
(e) To check whether X and Y are independent, we must check whether P(X = x, Y = y) = P(X = x) P(Y = y) for all x, y.
For X = 2, Y = 1 we get
P(X = 2, Y = 1) = 5/42, while P(X = 2) = 11/21 and P(Y = 1) = 3/14, so P(X = 2) P(Y = 1) = 11/98,
i.e. P(X = 2, Y = 1) ≠ P(X = 2) P(Y = 1); hence X and Y are not independent.
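The whole table, the marginals, and the independence check can be reproduced mechanically; a minimal Python sketch:

```python
from fractions import Fraction

c = Fraction(1, 42)
# Joint pmf f(x, y) = c(2x + y) for x = 0, 1, 2 and y = 0, 1, 2, 3
joint = {(x, y): c * (2 * x + y) for x in range(3) for y in range(4)}
assert sum(joint.values()) == 1                       # grand total 42c = 1

# Marginals: sum over y for f1(x), over x for f2(y)
f1 = {x: sum(joint[x, y] for y in range(4)) for x in range(3)}
f2 = {y: sum(joint[x, y] for x in range(3)) for y in range(4)}
print(f1)   # marginal of X: 1/7, 1/3, 11/21
print(f2)   # marginal of Y: 1/7, 3/14, 2/7, 5/14

# Independence fails: the joint pmf is not the product of the marginals
print(joint[2, 1], f1[2] * f2[1])   # 5/42 versus 11/98
```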
Practice Problems:
1. Let 𝑋 𝑎𝑛𝑑 Y be independent random variables with the following
distributions.
X        : 1     2                 Y        : 5     10    15
P(X = x) : 0.6   0.4               P(Y = y) : 0.2   0.5   0.3
X Y –3 2 4
Properties of covariance:
The covariance of X and Y, denoted σ_XY or Cov(X, Y), is defined by Cov(X, Y) = E[(X − μ_X)(Y − μ_Y)] = E(XY) − E(X)E(Y). It satisfies:
i. 𝐶𝑜𝑣 𝑋, 𝑌 = 𝐶𝑜𝑣 𝑌, 𝑋
ii. 𝐶𝑜𝑣 𝑋, 𝑋 = 𝑉𝑎𝑟 𝑋
iii. 𝐶𝑜𝑣 𝑎𝑋, 𝑌 = 𝑎 𝐶𝑜𝑣(𝑋, 𝑌)
iv. 𝐶𝑜𝑣 𝑋 + 𝑍, 𝑌 = 𝐶𝑜𝑣 𝑋, 𝑌 + 𝐶𝑜𝑣(𝑍, 𝑌)
Example:
The joint and marginal probabilities of X and Y, the numbers of aspirin and sedative caplets among two caplets drawn at random from a bottle containing three aspirin, two sedative, and four laxative caplets, are as follows (see the sketch below, which reconstructs them by counting combinations):
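A minimal sketch that rebuilds the joint probabilities from the problem statement and applies Cov(X, Y) = E(XY) − E(X)E(Y):

```python
from fractions import Fraction
from math import comb

# Two caplets drawn at random from 3 aspirin, 2 sedative and 4 laxative (9 in all).
# X = number of aspirin drawn, Y = number of sedative drawn.
total = comb(9, 2)
joint = {(x, y): Fraction(comb(3, x) * comb(2, y) * comb(4, 2 - x - y), total)
         for x in range(3) for y in range(3) if x + y <= 2}

EX = sum(x * p for (x, y), p in joint.items())
EY = sum(y * p for (x, y), p in joint.items())
EXY = sum(x * y * p for (x, y), p in joint.items())
print(EX, EY, EXY, EXY - EX * EY)   # 2/3 4/9 1/6 -7/54
```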
Correlation coefficient:
• The strength of the linear relationship between 𝑋 𝑎𝑛𝑑 𝑌 is indicated by the
correlation between X and Y.
• It provides a measure of dependence between two random variables.
• It is a dimensionless quantity obtained by dividing the covariance by the
product of the standard deviations of X and Y. It is denoted by ρ.
∴ ρ = Corr(X, Y) = σ_XY / (σ_X σ_Y)
Where
• 𝜎𝑋𝑌 is the covariance between 𝑋 & 𝑌.
• 𝜎𝑋 is the standard deviation of 𝑋.
• 𝜎𝑌 is the standard deviation of Y.
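The correlation of any finite joint pmf can be computed directly from this formula; a small illustrative sketch (the table used here is a hypothetical example, not taken from the notes):

```python
from fractions import Fraction
import math

def corr(joint):
    """Correlation coefficient of a finite joint pmf {(x, y): P(X=x, Y=y)}."""
    EX = sum(x * p for (x, y), p in joint.items())
    EY = sum(y * p for (x, y), p in joint.items())
    cov = sum(x * y * p for (x, y), p in joint.items()) - EX * EY
    var_x = sum(x * x * p for (x, y), p in joint.items()) - EX**2
    var_y = sum(y * y * p for (x, y), p in joint.items()) - EY**2
    return float(cov) / math.sqrt(float(var_x * var_y))

# Hypothetical joint table of two independent fair 0/1 variables
joint = {(0, 0): Fraction(1, 4), (0, 1): Fraction(1, 4),
         (1, 0): Fraction(1, 4), (1, 1): Fraction(1, 4)}
print(corr(joint))   # 0.0 -- independent variables are uncorrelated
```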
Properties of correlation coefficient:
i. −1 ≤ 𝜌 ≤ 1
ii. 𝜌 = −1 indicates perfect negative linear relationship between
𝑋 & 𝑌.
iii. 𝜌 = 1 indicates perfect positive linear relationship between 𝑋 & 𝑌.
iv. 𝜌 = 0 indicates that the variables 𝑋 & 𝑌 are uncorrelated.
v. If X and Y are independent then they are uncorrelated, i.e. ρ = 0; however, the converse is not true.
Practice problems:
• A fair coin is tossed three times. Let X equal 0 or 1 according as a head or a tail occurs on the first toss. Let Y equal the total number of heads that occur in the three tosses.
a) Write the marginal distributions of 𝑋 𝑎𝑛𝑑 𝑌
b) Write the joint distribution of 𝑋 𝑎𝑛𝑑 𝑌.
c) Find 𝐶𝑜𝑣 𝑋, 𝑌
d) Find 𝜌 𝑋, 𝑌 and comment on its value
e) Determine if 𝑋 𝑎𝑛𝑑 𝑌 are independent.