Chapter 3

Probability Distributions and Applications

STAT 21513 Probability Distributions and Applications II

$$
\begin{aligned}
&= \int_{-\infty}^{\infty} \left( \int_{-\infty}^{\infty} h(y|x)\, f_1(x)\, dx \right) y\, dy \\
&= \int_{-\infty}^{\infty} \left( \int_{-\infty}^{\infty} f(x,y)\, dx \right) y\, dy \\
&= \int_{-\infty}^{\infty} y\, f_2(y)\, dy \\
&= E_Y(Y)
\end{aligned}
$$

In general, for any function g(·),


𝐸[𝑔(𝑌)] = 𝐸[𝐸[𝑔(𝑌)|𝑋]]

Ex: Refer to Example 8 and apply the above theorem.
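As a numerical sanity check (a Monte Carlo sketch of my own, using the joint density f(x, y) = 2 on 0 ≤ x ≤ y ≤ 1 from Example 8), both sides of the theorem can be estimated by simulation:

```python
import random
import statistics

random.seed(0)

# Example 8's joint density: f(x, y) = 2 on 0 <= x <= y <= 1.
# Marginal of X: f1(x) = 2(1 - x), with CDF F1(x) = 2x - x^2,
# inverted by x = 1 - sqrt(1 - u).  Given X = x, Y ~ Uniform(x, 1),
# so E(Y | x) = (x + 1) / 2.
n = 200_000
ys, cond_means = [], []
for _ in range(n):
    x = 1.0 - (1.0 - random.random()) ** 0.5   # X drawn from f1
    ys.append(random.uniform(x, 1.0))          # Y drawn from h(y | x)
    cond_means.append((x + 1.0) / 2.0)         # E(Y | x)

e_y = statistics.fmean(ys)                # direct estimate of E(Y)
e_of_cond = statistics.fmean(cond_means)  # estimate of E[E(Y | X)]
print(e_y, e_of_cond)                     # both close to 2/3
```

The two estimates agree closely, as the theorem E(Y) = E[E(Y|X)] predicts.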

5.2 Conditional Variance

Definition 5:
Let X and Y be two random variables with joint density f(x, y), and let h(y|x) be the conditional density of Y given X = x. The conditional variance of Y given X = x, denoted by Var(Y|x), is defined as

$$
\mathrm{Var}(Y|x) = E(Y^2|x) - \big(E(Y|x)\big)^2,
$$

where E(Y|x) denotes the conditional mean of Y given X = x.

Example: (Example 8 ctd.) Let X and Y have the joint pdf f(x, y) = 2, 0 ≤ x ≤ y ≤ 1. Find the
conditional variance of Y given X=x.

Recall the conditional density and the conditional mean of Y given x.


Now let’s compute the conditional density of Y given X=x

Hence, the conditional mean of Y given x is

Now let’s find the conditional variance of Y given x.
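Since the intermediate computations are not reproduced above, here is a sketch (my own derivation, following Definition 5):

```latex
\begin{aligned}
f_1(x) &= \int_x^1 2\,dy = 2(1-x), && 0 \le x < 1,\\
h(y|x) &= \frac{f(x,y)}{f_1(x)} = \frac{1}{1-x}, && x \le y \le 1
  \quad\text{(uniform on $(x,1)$)},\\
E(Y|x) &= \int_x^1 \frac{y}{1-x}\,dy = \frac{x+1}{2},\\
E(Y^2|x) &= \int_x^1 \frac{y^2}{1-x}\,dy = \frac{1+x+x^2}{3},\\
\mathrm{Var}(Y|x) &= \frac{1+x+x^2}{3} - \left(\frac{x+1}{2}\right)^2
  = \frac{(1-x)^2}{12}.
\end{aligned}
```

This agrees with the variance (b − a)²/12 of a uniform distribution on (a, b) with a = x and b = 1.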

Example 9: Let 𝑋 and 𝑌 be continuous random variables with joint probability density

$$
f(x,y) = \begin{cases} e^{-y} & \text{for } 0 < x < y < \infty \\ 0 & \text{otherwise.} \end{cases}
$$

What is the conditional variance of 𝑌 given the knowledge that 𝑋 = 𝑥?

Answer:

The marginal density f₁(x) of X is given by

$$
f_1(x) = \int_{-\infty}^{\infty} f(x,y)\, dy
= \int_{x}^{\infty} e^{-y}\, dy
= \left[ -e^{-y} \right]_{x}^{\infty}
= e^{-x}.
$$

Thus, the conditional density of Y given X = x is

$$
h(y|x) = \frac{f(x,y)}{f_1(x)}
$$


$$
= \frac{e^{-y}}{e^{-x}} = e^{-(y-x)} \qquad \text{for } y > x.
$$

Thus, given 𝑋 = 𝑥, 𝑌 has an exponential distribution with parameter 𝜃 = 1 and location parameter 𝑥. The
conditional mean of 𝑌 given 𝑋 = 𝑥 is

$$
\begin{aligned}
E(Y|x) &= \int_{-\infty}^{\infty} y\, h(y|x)\, dy \\
&= \int_{x}^{\infty} y\, e^{-(y-x)}\, dy \\
&= \int_{0}^{\infty} (z+x)\, e^{-z}\, dz \qquad \text{where } z = y - x \\
&= x \int_{0}^{\infty} e^{-z}\, dz + \int_{0}^{\infty} z\, e^{-z}\, dz \\
&= x\,\Gamma(1) + \Gamma(2) \\
&= x + 1.
\end{aligned}
$$

(Recall that the gamma function is defined as $\Gamma(z) := \int_{0}^{\infty} x^{z-1} e^{-x}\, dx$, so $\Gamma(1) = \Gamma(2) = 1$ and $\Gamma(3) = 2$.)
Similarly, we compute the second moment of the distribution h(y|x).

$$
\begin{aligned}
E(Y^2|x) &= \int_{-\infty}^{\infty} y^2\, h(y|x)\, dy \\
&= \int_{x}^{\infty} y^2\, e^{-(y-x)}\, dy \\
&= \int_{0}^{\infty} (z+x)^2\, e^{-z}\, dz \qquad \text{where } z = y - x \\
&= x^2 \int_{0}^{\infty} e^{-z}\, dz + \int_{0}^{\infty} z^2 e^{-z}\, dz + 2x \int_{0}^{\infty} z\, e^{-z}\, dz \\
&= x^2\,\Gamma(1) + \Gamma(3) + 2x\,\Gamma(2) \\
&= x^2 + 2x + 2 \\
&= (x+1)^2 + 1.
\end{aligned}
$$
Therefore,

$$
\mathrm{Var}(Y|x) = E(Y^2|x) - \big(E(Y|x)\big)^2 = (x+1)^2 + 1 - (x+1)^2 = 1.
$$
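The result Var(Y|x) = 1 can also be checked by simulation: given X = x, Y is a standard exponential shifted by x. A sketch in Python (sample sizes, seed, and the test values of x are arbitrary choices of mine):

```python
import random
import statistics

random.seed(1)

# Given X = x, h(y|x) = e^{-(y-x)} for y > x, i.e. Y is a standard
# exponential shifted by x, so Var(Y | x) should be 1 for every x.
def simulate(x, n=200_000):
    ys = [x + random.expovariate(1.0) for _ in range(n)]
    return statistics.fmean(ys), statistics.variance(ys)

m1, v1 = simulate(0.5)
m2, v2 = simulate(2.0)
print(m1, v1)   # mean close to 0.5 + 1, variance close to 1
print(m2, v2)   # mean close to 2.0 + 1, variance close to 1
```

The estimated variance stays near 1 regardless of x, matching E(Y|x) = x + 1 and Var(Y|x) = 1.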


5.3 Important Inequalities

Cauchy–Schwarz Inequality

Let X and Y have finite second moments. Then

$$
\{E(XY)\}^2 \le E(X^2)\, E(Y^2),
$$

with equality if and only if P[Y = αX] = 1 for some constant α.

Another form of the Cauchy–Schwarz inequality:

$$
|E(XY)| \le \sqrt{E(X^2)\, E(Y^2)}.
$$

Proof:
Define a new random variable W = (Y − αX)². Clearly, W is a nonnegative random variable for any value of α ∈ ℝ, so E(W) ≥ 0.

Let

$$
f(\alpha) = E(W) = E(Y^2) - 2\alpha\, E(XY) + \alpha^2\, E(X^2);
$$

then f(α) ≥ 0 for all α ∈ ℝ.

If f(α) = 0 for some α, then E[(Y − αX)²] = 0, so Y = αX with probability one; this is the equality case.

Choose

$$
\alpha = \frac{E(XY)}{E(X^2)}.
$$


Substituting this choice of α into f(α) ≥ 0 gives

$$
E(Y^2) - \frac{\{E(XY)\}^2}{E(X^2)} \ge 0,
$$

which rearranges to the Cauchy–Schwarz inequality.
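A quick numerical illustration of the inequality and its equality case (simulation only, not part of the proof; the sample sizes, seed, and the constant 3 below are arbitrary choices of mine):

```python
import random

random.seed(2)

def mean(vs):
    return sum(vs) / len(vs)

n = 100_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]
ys = [random.gauss(0.0, 2.0) for _ in range(n)]

# Strict-inequality case: X and Y independent, so E(XY) is near 0
# while E(X^2) E(Y^2) is near 4.
lhs = mean([x * y for x, y in zip(xs, ys)]) ** 2
rhs = mean([x * x for x in xs]) * mean([y * y for y in ys])
print(lhs <= rhs)   # the inequality holds on the sample as well

# Equality case: Y = alpha * X with probability one (alpha = 3).
ys2 = [3.0 * x for x in xs]
lhs2 = mean([x * y for x, y in zip(xs, ys2)]) ** 2
rhs2 = mean([x * x for x in xs]) * mean([y * y for y in ys2])
print(abs(lhs2 - rhs2) / rhs2)   # essentially zero
```

Note that the inequality holds exactly for the empirical distribution, not just in the limit, since it is Cauchy–Schwarz applied to the sample vectors.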

Exercise:
Prove that the correlation coefficient ρ(X, Y) of random variables X and Y always lies in the interval [−1, 1], i.e. −1 ≤ ρ(X, Y) ≤ 1.
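One standard route, sketched here only as a hint: apply the Cauchy–Schwarz inequality just proved to the centered variables X − μ_X and Y − μ_Y:

```latex
\{\operatorname{Cov}(X,Y)\}^2
  = \big\{E\big[(X-\mu_X)(Y-\mu_Y)\big]\big\}^2
  \le E\big[(X-\mu_X)^2\big]\, E\big[(Y-\mu_Y)^2\big]
  = \operatorname{Var}(X)\,\operatorname{Var}(Y),
```

so ρ(X, Y)² = Cov(X, Y)² / (Var(X) Var(Y)) ≤ 1, which gives −1 ≤ ρ(X, Y) ≤ 1.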

Bivariate Normal Distribution

Recall the probability density function of the normal distribution:

Definition: A random variable X is defined to be normally distributed if its density is given by

$$
f(x) = \begin{cases} \dfrac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^{2}} & \text{if } -\infty < x < \infty \\[6pt] 0 & \text{otherwise,} \end{cases}
$$

where the parameters μ and σ satisfy −∞ < μ < ∞ and σ > 0. Any distribution defined by a density function of the above form is called a normal distribution.


Now consider the bivariate normal distribution. An interactive plot of its joint density is available here:

https://demonstrations.wolfram.com/JointDensityOfBivariateGaussianRandomVariables/

It follows by inspection that the marginal density of X is a normal distribution with mean μ₁ and standard deviation σ₁ and, by symmetry, that the marginal density of Y is a normal distribution with mean μ₂ and standard deviation σ₂.
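To illustrate the claim about the marginals, one can sample from a bivariate normal using a standard construction from two independent standard normals (the parameter values below, including the correlation ρ, are arbitrary choices of mine):

```python
import random
import statistics

random.seed(3)

# Standard construction of bivariate normal pairs from two
# independent standard normals Z1, Z2:
#   X = mu1 + sigma1 * Z1
#   Y = mu2 + sigma2 * (rho * Z1 + sqrt(1 - rho^2) * Z2)
mu1, sigma1 = 1.0, 2.0
mu2, sigma2 = -1.0, 0.5
rho = 0.8

n = 200_000
xs, ys = [], []
for _ in range(n):
    z1, z2 = random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)
    xs.append(mu1 + sigma1 * z1)
    ys.append(mu2 + sigma2 * (rho * z1 + (1.0 - rho ** 2) ** 0.5 * z2))

# The marginals should be N(mu1, sigma1^2) and N(mu2, sigma2^2),
# whatever the value of rho.
mx, sx = statistics.fmean(xs), statistics.stdev(xs)
my, sy = statistics.fmean(ys), statistics.stdev(ys)
print(mx, sx)   # close to mu1 = 1.0 and sigma1 = 2.0
print(my, sy)   # close to mu2 = -1.0 and sigma2 = 0.5
```

The estimated marginal means and standard deviations match (μ₁, σ₁) and (μ₂, σ₂) even though X and Y are strongly correlated, consistent with the statement above.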
