RVSP Unit-2
Operations on Single & Multiple Random Variables – Expectations

RANDOM VARIABLES AND STOCHASTIC PROCESS

CH SIVA RAMA KRISHNA
Asst. Professor, Dept. of ECE
UNIT-II MULTIPLE RANDOM VARIABLES

Vector random variables,
Joint distribution function and properties,
Marginal distribution functions,
Joint density function and properties,
Marginal density functions,
Joint conditional distribution and density functions,
Statistical independence,
Distribution and density of a sum of random variables,
Central limit theorem.

07/16/2024 07:13 AM RANDOM VARIABLES AND STOCHASTIC PROCESS
Operations on Multiple Random Variables:

Expected value of a function of random variables,
Joint moments about the origin, Correlation,
Joint central moments, Covariance, Correlation coefficient,
Joint characteristic function and properties,
Jointly Gaussian random variables – two and N random variables, properties.

RANDOM VARIABLE

 A random variable is a real-valued function that maps all the elements of the sample space onto points on the real axis.
(or)
 A random variable is a function that maps outcomes of a random experiment to real numbers.
Random variable
 A random variable assigns a numerical value to each outcome of a particular experiment – a point on the real line.
VECTOR RANDOM VARIABLES
 Suppose two random variables X and Y are defined on a sample space S, where specific values of X and Y are denoted by x and y respectively.
 Then any ordered pair of numbers (x, y) may conveniently be considered a random point in the xy-plane.

JOINT PROBABILITY DISTRIBUTION FUNCTION
Consider two random variables X and Y with values {x} and {y} in the xy-plane.
Let A = {X ≤ x} and B = {Y ≤ y} be two events.
Then the joint probability distribution function is
F_XY(x, y) = P{X ≤ x, Y ≤ y} = P(A ∩ B)

Properties of the joint probability distribution function
1. i) F_XY(−∞, −∞) = 0  ii) F_XY(x, −∞) = 0  iii) F_XY(−∞, y) = 0
Proof:
It is known that F_XY(x, y) = P{X ≤ x, Y ≤ y}.
F_XY(−∞, −∞) = P{X ≤ −∞ ∩ Y ≤ −∞} = 0, since {X ≤ −∞} is the impossible event.
Similarly,
F_XY(x, −∞) = P{X ≤ x ∩ Y ≤ −∞} = 0
F_XY(−∞, y) = P{X ≤ −∞ ∩ Y ≤ y} = 0
2. F_XY(∞, ∞) = 1
Proof:
It is known that F_XY(x, y) = P{X ≤ x, Y ≤ y}.
F_XY(∞, ∞) = P{X ≤ ∞ ∩ Y ≤ ∞} = P{S ∩ S} = P{S} = 1

3. The joint probability distribution function always lies between 0 and 1:
0 ≤ F_XY(x, y) ≤ 1

4. Marginal distribution functions
F_XY(x, ∞) = F_X(x)
F_XY(∞, y) = F_Y(y)
Proof:
It is known that F_XY(x, y) = P{X ≤ x, Y ≤ y}.
F_XY(x, ∞) = P{X ≤ x ∩ Y ≤ ∞} = P{X ≤ x ∩ S} = P{X ≤ x} = F_X(x)
F_XY(∞, y) = P{X ≤ ∞ ∩ Y ≤ y} = P{S ∩ Y ≤ y} = P{Y ≤ y} = F_Y(y)

JOINT PROBABILITY DENSITY FUNCTION
 It gives information about the joint occurrence of events at given values of X and Y.
 The joint probability density function is the second partial derivative of the joint distribution function:
f_XY(x, y) = ∂²F_XY(x, y) / ∂x ∂y
When X and Y are discrete random variables, the joint density function is
f_XY(x, y) = Σ_{i=1}^{m} Σ_{j=1}^{n} P(X = x_i, Y = y_j) δ(x − x_i, y − y_j)

Properties of the joint probability density function
1. The joint probability density function is a non-negative quantity:
f_XY(x, y) ≥ 0
Proof:
From the definition, f_XY(x, y) = ∂²F_XY(x, y) / ∂x ∂y.
 Since the distribution function is non-decreasing in each argument, its slope is always non-negative.
 Hence the joint probability density function is a non-negative quantity.

2. The total volume under the joint probability density function is unity:
∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(x, y) dx dy = 1
Proof:
We know that f_XY(x, y) = ∂²F_XY(x, y) / ∂x ∂y.
Integrating both sides with respect to x:
∫_{−∞}^{∞} f_XY(x, y) dx = ∫_{−∞}^{∞} [∂²F_XY(x, y) / ∂x ∂y] dx
 = (∂/∂y) [F_XY(x, y)]_{x=−∞}^{∞}
 = (∂/∂y) (F_XY(∞, y) − F_XY(−∞, y))
 = (∂/∂y) F_Y(y)
Now integrating with respect to y:
∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(x, y) dx dy = ∫_{−∞}^{∞} (∂/∂y) F_Y(y) dy
 = [F_Y(y)]_{−∞}^{∞} = F_Y(∞) − F_Y(−∞) = 1 − 0 = 1
3. The joint probability distribution function can be obtained from the knowledge of the joint density function:
F_XY(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f_XY(u, v) dv du
Proof:
We know f_XY(x, y) = ∂²F_XY(x, y) / ∂x ∂y = (∂/∂y)(∂/∂x) F_XY(x, y).
Integrating with respect to x from −∞ to x (u is a dummy variable):
∫_{−∞}^{x} f_XY(u, y) du = (∂/∂y) [F_XY(u, y)]_{u=−∞}^{x}
 = (∂/∂y) (F_XY(x, y) − F_XY(−∞, y)) = (∂/∂y) F_XY(x, y)
Now integrating with respect to y from −∞ to y (v is a dummy variable):
∫_{−∞}^{x} ∫_{−∞}^{y} f_XY(u, v) dv du = [F_XY(x, v)]_{v=−∞}^{y}
 = F_XY(x, y) − F_XY(x, −∞) = F_XY(x, y)
4. The probability of the event {x₁ < X ≤ x₂, y₁ < Y ≤ y₂} can be obtained from the knowledge of the joint density function:
P{x₁ < X ≤ x₂, y₁ < Y ≤ y₂} = ∫_{x₁}^{x₂} ∫_{y₁}^{y₂} f_XY(x, y) dy dx
Proof:
Consider
∫_{x₁}^{x₂} ∫_{y₁}^{y₂} f_XY(x, y) dy dx = ∫_{x₁}^{x₂} ∫_{y₁}^{y₂} [∂²F_XY(x, y) / ∂x ∂y] dy dx
Changing the order of integration:
 = ∫_{y₁}^{y₂} (∂/∂y) [F_XY(x, y)]_{x=x₁}^{x₂} dy
 = ∫_{y₁}^{y₂} (∂/∂y) (F_XY(x₂, y) − F_XY(x₁, y)) dy
 = [F_XY(x₂, y) − F_XY(x₁, y)]_{y=y₁}^{y₂}
 = F_XY(x₂, y₂) − F_XY(x₁, y₂) − F_XY(x₂, y₁) + F_XY(x₁, y₁)

Marginal density functions
f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx
Proof:
We know f_XY(x, y) = ∂²F_XY(x, y) / ∂x ∂y.
Integrating both sides with respect to x:
∫_{−∞}^{∞} f_XY(x, y) dx = (∂/∂y) [F_XY(x, y)]_{x=−∞}^{∞}
 = (∂/∂y) (F_XY(∞, y) − F_XY(−∞, y))
 = (∂/∂y) F_Y(y) = f_Y(y)
Hence f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx.
Marginal density functions
f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy
Proof:
We know f_XY(x, y) = ∂²F_XY(x, y) / ∂x ∂y.
Integrating both sides with respect to y:
∫_{−∞}^{∞} f_XY(x, y) dy = (∂/∂x) [F_XY(x, y)]_{y=−∞}^{∞}
 = (∂/∂x) (F_XY(x, ∞) − F_XY(x, −∞))
 = (∂/∂x) F_X(x) = f_X(x)
Hence f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy.
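For discrete random variables the marginalization integrals become sums: the marginal PMF of X is obtained by summing the joint PMF over all values of y, and vice versa. A minimal sketch in Python, using a made-up joint PMF (the table values are illustrative, not from the slides):

```python
# Marginalization of a discrete joint PMF:
# P(X = x) = sum over y of P(X = x, Y = y), and similarly for Y.
joint = {
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.15, (1, 1): 0.25, (1, 2): 0.20,
}

def marginal_x(joint):
    px = {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p   # sum out y
    return px

def marginal_y(joint):
    py = {}
    for (x, y), p in joint.items():
        py[y] = py.get(y, 0.0) + p   # sum out x
    return py

px = marginal_x(joint)   # approx {0: 0.4, 1: 0.6}
py = marginal_y(joint)   # approx {0: 0.25, 1: 0.45, 2: 0.30}
assert abs(sum(px.values()) - 1.0) < 1e-12  # a marginal PMF sums to 1
```

This is the discrete counterpart of f_X(x) = ∫ f_XY(x, y) dy: each marginal collapses one axis of the joint table.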
CONDITIONAL PROBABILITY
It is the probability of an event A given the occurrence of another event B.
If A and B are two events, the conditional probability of A given the occurrence of B is
P(A|B) = P(A ∩ B) / P(B)
Similarly, the conditional probability of B given the occurrence of A is
P(B|A) = P(A ∩ B) / P(A)
CONDITIONAL DISTRIBUTION FUNCTION
The concept of conditional probability extends to random variables as well.
Let X be a random variable; then the conditional distribution function is defined as
F_X(x|B) = P{X ≤ x | B} = P{X ≤ x ∩ B} / P(B)

Properties of the conditional distribution function
1. F_X(−∞|B) = 0
2. F_X(∞|B) = 1
3. 0 ≤ F_X(x|B) ≤ 1
4. F_X(x₂|B) ≥ F_X(x₁|B) when x₂ > x₁
5. P{x₁ < X ≤ x₂ | B} = F_X(x₂|B) − F_X(x₁|B)
CONDITIONAL DENSITY FUNCTION
The derivative of the conditional distribution function is called the conditional density function. It gives the conditional probability density at a specific value.
f_X(x|B) = d F_X(x|B) / dx
Properties
1. f_X(x|B) ≥ 0
2. ∫_{−∞}^{∞} f_X(x|B) dx = 1
3. F_X(x|B) = ∫_{−∞}^{x} f_X(u|B) du
4. P{x₁ < X ≤ x₂ | B} = ∫_{x₁}^{x₂} f_X(x|B) dx

JOINT CONDITIONAL DISTRIBUTION AND DENSITY FUNCTIONS

 Two kinds of conditioning are defined:
1. Point conditioning
2. Interval conditioning
The distinction is based on the values taken by the conditioning random variable Y:
if Y takes a single value, it is called point conditioning;
if Y takes a range of values, it is called interval conditioning.

Point conditioning:
Here the conditioning random variable Y takes a value in a small interval
y − Δy < Y ≤ y + Δy
Starting from P(A|B) = P(A ∩ B) / P(B) and F_X(x|B) = P{X ≤ x ∩ B} / P(B):
F_X(x | y − Δy < Y ≤ y + Δy) = P{(X ≤ x) ∩ (y − Δy < Y ≤ y + Δy)} / P(y − Δy < Y ≤ y + Δy)
Since the distribution function is the integral of the density function,
F_X(x | y − Δy < Y ≤ y + Δy) = [∫_{y−Δy}^{y+Δy} ∫_{−∞}^{x} f_XY(u, v) du dv] / [∫_{y−Δy}^{y+Δy} f_Y(v) dv]
As Δy → 0, the integrands are nearly constant over the small interval, so
F_X(x | y − Δy < Y ≤ y + Δy) ≈ [2Δy ∫_{−∞}^{x} f_XY(u, y) du] / [2Δy f_Y(y)] = [∫_{−∞}^{x} f_XY(u, y) du] / f_Y(y)
Density function:
f_X(x|Y) = d F_X(x|Y) / dx = (d/dx) [∫_{−∞}^{x} f_XY(u, y) du] / f_Y(y)
f_X(x|Y) = f_XY(x, y) / f_Y(y)
Similarly,
f_Y(y|X) = f_XY(x, y) / f_X(x)
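The point-conditioning result f_X(x|Y = y) = f_XY(x, y) / f_Y(y) has a direct discrete analogue: divide the joint PMF by the marginal probability of the conditioning value. A small sketch with made-up PMF values:

```python
# Discrete analogue of f_X(x | Y = y) = f_XY(x, y) / f_Y(y).
# The joint PMF values below are illustrative only.
joint = {(0, 0): 0.2, (0, 1): 0.1,
         (1, 0): 0.3, (1, 1): 0.4}

def cond_x_given_y(joint, y):
    # Marginal P(Y = y), obtained by summing the joint PMF over x.
    fy = sum(p for (xi, yi), p in joint.items() if yi == y)
    # Conditional PMF of X given Y = y.
    return {xi: p / fy for (xi, yi), p in joint.items() if yi == y}

c = cond_x_given_y(joint, 1)   # P(X=0|Y=1) = 0.1/0.5 = 0.2, P(X=1|Y=1) = 0.8
assert abs(sum(c.values()) - 1.0) < 1e-12  # a conditional PMF sums to 1
```

Dividing by the marginal simply renormalizes the slice of the joint PMF at Y = y so that it sums to one.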

Interval conditioning:
Here the conditioning random variable Y takes a range of values, y_a < Y ≤ y_b:
F_X(x | y_a < Y ≤ y_b) = [∫_{y_a}^{y_b} ∫_{−∞}^{x} f_XY(u, v) du dv] / [∫_{y_a}^{y_b} f_Y(v) dv]
Differentiating with respect to x gives the corresponding density:
f_X(x | y_a < Y ≤ y_b) = [∫_{y_a}^{y_b} f_XY(x, y) dy] / [∫_{y_a}^{y_b} f_Y(y) dy]
Since f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx,
f_X(x | y_a < Y ≤ y_b) = [∫_{y_a}^{y_b} f_XY(x, y) dy] / [∫_{y_a}^{y_b} ∫_{−∞}^{∞} f_XY(x, y) dx dy]

Summary:
F_XY(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f_XY(u, v) dv du
f_XY(x, y) = ∂²F_XY(x, y) / ∂x ∂y
Marginal distribution functions:
F_XY(x, ∞) = F_X(x)
F_XY(∞, y) = F_Y(y)
Marginal density functions:
f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy
f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx
Conditional density functions:
f_X(x|Y) = f_XY(x, y) / f_Y(y)
f_Y(y|X) = f_XY(x, y) / f_X(x)
STATISTICAL INDEPENDENCE OF RANDOM VARIABLES:
Consider two random variables X and Y, and define the events A = {X ≤ x} and B = {Y ≤ y} for two real numbers x and y.
 Two random variables are said to be statistically independent if
P{X ≤ x, Y ≤ y} = P{X ≤ x} P{Y ≤ y}
 In terms of the distribution functions:
F_XY(x, y) = F_X(x) F_Y(y)
 In terms of the density functions:
f_XY(x, y) = f_X(x) f_Y(y)

In terms of the conditional density functions:
f_X(x|y) = f_XY(x, y) / f_Y(y) = f_X(x) f_Y(y) / f_Y(y) = f_X(x)
f_Y(y|x) = f_XY(x, y) / f_X(x) = f_X(x) f_Y(y) / f_X(x) = f_Y(y)

DISTRIBUTION AND DENSITY OF A SUM OF RANDOM VARIABLES
 In real-time applications, the received signal is the sum of the desired original signal and noise.
 In such a case, information about the probability law of the combined signal helps in analysing the communication system.
 Let W be a random variable equal to the sum of two independent random variables X and Y:
W = X + Y

The distribution function of W is the probability mass in the region of the xy-plane where x + y ≤ w; in the usual sketch, the shaded half-plane below the line x + y = w corresponds to the event {W ≤ w}.
F_W(w) = P{X + Y ≤ w} = ∫_{−∞}^{∞} ∫_{−∞}^{w−y} f_XY(x, y) dx dy

When X and Y are statistically independent,
F_W(w) = ∫_{−∞}^{∞} f_Y(y) [∫_{−∞}^{w−y} f_X(x) dx] dy
Differentiating with respect to w on both sides (using Leibniz's rule):
f_W(w) = (d/dw) F_W(w) = ∫_{−∞}^{∞} f_Y(y) [(d/dw) ∫_{−∞}^{w−y} f_X(x) dx] dy
 = ∫_{−∞}^{∞} f_Y(y) f_X(w − y) dy
 This expression is recognized as a convolution integral:
f_W(w) = f_X(x) ⊗ f_Y(y)

The density function of the sum of two statistically independent
random variables is the convolution of their individual density
functions.

 If there are n statistically independent random variables, the density of their sum is the n-fold convolution
f_{X1+X2+⋯+Xn}(x) = f_{X1}(x₁) ∗ f_{X2}(x₂) ∗ f_{X3}(x₃) ∗ ⋯ ∗ f_{Xn}(xₙ)
 By the central limit theorem, as n grows this repeated convolution tends toward a Gaussian density, regardless of the individual densities involved.
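The convolution rule can be checked numerically for discrete random variables: the PMF of the sum of two independent fair dice is the convolution of the two individual PMFs, and it matches direct enumeration of all 36 outcomes. A sketch, assuming fair six-sided dice:

```python
from itertools import product

def convolve_pmf(pa, pb):
    """PMF of W = X + Y for independent X, Y: f_W(w) = sum over y of f_X(w - y) f_Y(y)."""
    pw = {}
    for x, px in pa.items():
        for y, py in pb.items():
            pw[x + y] = pw.get(x + y, 0.0) + px * py
    return pw

die = {k: 1 / 6 for k in range(1, 7)}   # fair six-sided die
pw = convolve_pmf(die, die)             # PMF of the sum of two dice

# Cross-check against brute-force enumeration of all 36 equally likely outcomes.
brute = {}
for a, b in product(range(1, 7), repeat=2):
    brute[a + b] = brute.get(a + b, 0.0) + 1 / 36
assert all(abs(pw[w] - brute[w]) < 1e-12 for w in pw)
```

The most likely total is 7 with probability 6/36, and the triangular shape of the result is the first hint of the central limit theorem: convolving more dice PMFs makes the sum's PMF increasingly bell-shaped.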

OPERATIONS ON MULTIPLE RANDOM VARIABLES:

Expected Value or Mean
 When more than a single random variable is involved, expectation must be taken with respect to all the variables involved:
E[g(X, Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) f_XY(x, y) dx dy

JOINT MOMENTS:
Moments measure the deviation of a random variable from a reference value.
Joint moments indicate the deviation of multiple random variables from a reference value.
Joint moments about the origin
The expected value of a function of the form g(X, Y) = Xⁿ Yᵏ is called a joint moment about the origin:
m_nk = E[Xⁿ Yᵏ] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xⁿ yᵏ f_XY(x, y) dx dy
 The order of a joint moment is the sum of the individual orders n and k, i.e. order = n + k.
First-order joint moments:
m₁₀ = ∫_{−∞}^{∞} x f_X(x) dx = E[X]
m₀₁ = ∫_{−∞}^{∞} y f_Y(y) dy = E[Y]
Second-order joint moments:
m₂₀ = E[X²]
m₀₂ = E[Y²]
The second-order joint moment m₁₁ is called the correlation:
R_XY = m₁₁ = E[XY] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x y f_XY(x, y) dx dy
 Correlation is a measure of the similarity between two (or more) random variables.
 If the two random variables are statistically independent, then
R_XY = E[XY] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x y f_X(x) f_Y(y) dx dy
 = ∫_{−∞}^{∞} x f_X(x) dx ∫_{−∞}^{∞} y f_Y(y) dy = E[X] E[Y]
Note: two random variables are said to be orthogonal when R_XY = 0.
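The factorization R_XY = E[X]E[Y] for independent random variables can be checked by simulation: with independent samples, the sample average of XY converges to the product of the individual sample means. A sketch using independent uniform samples (the sample size and seed are arbitrary choices):

```python
import random

random.seed(1)  # fixed seed so the run is repeatable
n = 200_000
xs = [random.random() for _ in range(n)]   # X ~ Uniform(0, 1)
ys = [random.random() for _ in range(n)]   # Y ~ Uniform(0, 1), independent of X

r_xy = sum(x * y for x, y in zip(xs, ys)) / n  # sample estimate of R_XY = E[XY]
ex = sum(xs) / n                               # sample E[X]
ey = sum(ys) / n                               # sample E[Y]

# For independent X, Y: E[XY] = E[X] E[Y] = 0.5 * 0.5 = 0.25 here.
assert abs(r_xy - ex * ey) < 0.01
```

The residual difference is sampling noise and shrinks roughly like 1/√n as the sample size grows.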
Joint central moments
 The expected value of a function of the form g(X, Y) = (X − X̄)ⁿ (Y − Ȳ)ᵏ is called a joint central moment of the two random variables:
μ_nk = E[(X − X̄)ⁿ (Y − Ȳ)ᵏ] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} (x − X̄)ⁿ (y − Ȳ)ᵏ f_XY(x, y) dx dy
Properties of central moments:
1. The zeroth-order joint central moment is 1: μ₀₀ = 1.
2. The first-order joint central moments are zero: μ₀₁ = μ₁₀ = 0.

3. The second-order joint central moments give the variances:
μ₂₀ = E[(X − X̄)²] = σ_X²
μ₀₂ = E[(Y − Ȳ)²] = σ_Y²
4. The second-order joint central moment μ₁₁ is called the covariance between the two random variables X and Y:
C_XY = μ₁₁ = E[(X − X̄)(Y − Ȳ)]
Covariance measures the change in one random variable with another; it indicates how two random variables vary together.
Properties of covariance:
1. If X and Y are two random variables, the covariance between them is given as
C_XY = R_XY − E[X] E[Y]
2. If X and Y are two statistically independent random variables, then C_XY = 0:
since R_XY = E[XY] = E[X] E[Y],
C_XY = R_XY − E[X] E[Y] = E[X] E[Y] − E[X] E[Y] = 0
3. If X and Y are two random variables, then
var[X + Y] = var[X] + var[Y] + 2 C_XY
var[X − Y] = var[X] + var[Y] − 2 C_XY
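Property 3 holds exactly for sample moments as well, so it can be verified on a small data set: compute the sample covariance and check var(X + Y) = var X + var Y + 2C_XY. A minimal sketch with made-up data:

```python
from statistics import pvariance, mean

# Illustrative paired observations (not from the slides).
x = [2.0, 4.0, 6.0, 8.0]
y = [1.0, 3.0, 2.0, 5.0]

def cov(a, b):
    """Population covariance C_ab = E[(a - mean(a)) * (b - mean(b))]."""
    ma, mb = mean(a), mean(b)
    return sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / len(a)

c_xy = cov(x, y)
s = [xi + yi for xi, yi in zip(x, y)]   # samples of X + Y
d = [xi - yi for xi, yi in zip(x, y)]   # samples of X - Y

assert abs(pvariance(s) - (pvariance(x) + pvariance(y) + 2 * c_xy)) < 1e-9
assert abs(pvariance(d) - (pvariance(x) + pvariance(y) - 2 * c_xy)) < 1e-9

# Correlation coefficient rho = C_XY / (sigma_X * sigma_Y), always in [-1, 1].
rho = c_xy / (pvariance(x) ** 0.5 * pvariance(y) ** 0.5)
```

Note the population (divide-by-n) convention is used throughout, matching the expectation definition in the slides; the identity also holds with the sample (n − 1) convention as long as it is applied consistently.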

Correlation coefficient:
It is defined as
ρ = μ₁₁ / √(μ₂₀ μ₀₂) = C_XY / (σ_X σ_Y)

JOINT CHARACTERISTIC FUNCTION
The expected value of the joint function g(X, Y) = e^{jω₁X} e^{jω₂Y} is called the joint characteristic function:
φ_XY(ω₁, ω₂) = E[e^{jω₁X} e^{jω₂Y}] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} e^{jω₁x} e^{jω₂y} f_XY(x, y) dx dy
The inverse relation is
f_XY(x, y) = (1/4π²) ∫_{−∞}^{∞} ∫_{−∞}^{∞} e^{−jω₁x} e^{−jω₂y} φ_XY(ω₁, ω₂) dω₁ dω₂
 The joint characteristic function and the joint density function are a Fourier transform pair with the signs of the variables reversed.

Properties of the joint characteristic function:
1. The marginal characteristic functions can be obtained from the joint characteristic function:
φ_X(ω₁) = φ_XY(ω₁, 0)
φ_Y(ω₂) = φ_XY(0, ω₂)
Proof:
We know that φ_XY(ω₁, ω₂) = E[e^{jω₁X} e^{jω₂Y}].
Let ω₂ = 0: φ_XY(ω₁, 0) = E[e^{jω₁X}] = φ_X(ω₁)
Let ω₁ = 0: φ_XY(0, ω₂) = E[e^{jω₂Y}] = φ_Y(ω₂)

2. If X and Y are two statistically independent random variables, their joint characteristic function is the product of the individual characteristic functions:
φ_XY(ω₁, ω₂) = φ_X(ω₁) φ_Y(ω₂)
Proof:
We know that
φ_XY(ω₁, ω₂) = E[e^{jω₁X} e^{jω₂Y}] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} e^{jω₁x} e^{jω₂y} f_XY(x, y) dx dy
If X and Y are statistically independent, f_XY(x, y) = f_X(x) f_Y(y), so
φ_XY(ω₁, ω₂) = ∫_{−∞}^{∞} e^{jω₁x} f_X(x) dx ∫_{−∞}^{∞} e^{jω₂y} f_Y(y) dy = φ_X(ω₁) φ_Y(ω₂)

3. If X and Y are two statistically independent random variables, the characteristic function of their sum is the product of the individual characteristic functions:
φ_{X+Y}(ω) = φ_X(ω) φ_Y(ω)
Proof:
φ_{X+Y}(ω) = E[e^{jω(X+Y)}] = E[e^{jωX} e^{jωY}]
If X and Y are statistically independent,
φ_{X+Y}(ω) = E[e^{jωX}] E[e^{jωY}] = φ_X(ω) φ_Y(ω)
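Property 3 can be checked exactly for discrete random variables, where the characteristic function is a finite sum φ_X(ω) = Σ P(X = x) e^{jωx}. A sketch using a fair die and an arbitrary test frequency:

```python
import cmath

def char_fn(pmf, w):
    """phi(w) = E[e^{jwX}] for a discrete PMF, computed as a finite sum."""
    return sum(p * cmath.exp(1j * w * x) for x, p in pmf.items())

die = {k: 1 / 6 for k in range(1, 7)}   # fair six-sided die

# PMF of the sum of two independent dice, built by convolution.
two_dice = {}
for a in die:
    for b in die:
        two_dice[a + b] = two_dice.get(a + b, 0.0) + die[a] * die[b]

w = 0.7  # arbitrary test frequency
lhs = char_fn(two_dice, w)                 # phi_{X+Y}(w) from the sum's PMF
rhs = char_fn(die, w) * char_fn(die, w)    # phi_X(w) * phi_Y(w)
assert abs(lhs - rhs) < 1e-12
```

This product property is what makes characteristic functions convenient for sums: a convolution of densities becomes a multiplication of transforms, mirroring the convolution theorem of Fourier analysis.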

4. The joint moments of multiple random variables can be obtained from the joint characteristic function:
m_nk = (−j)^{n+k} [∂^{n+k} φ_XY(ω₁, ω₂) / ∂ω₁ⁿ ∂ω₂ᵏ] evaluated at ω₁ = 0, ω₂ = 0

JOINTLY GAUSSIAN RANDOM VARIABLES:
 Among the standard density functions, the Gaussian density function is the most widely used in science and engineering.
 In particular, it is used to estimate the noise power while calculating the signal-to-noise ratio.
 The two-variable case is sometimes called the bivariate Gaussian density.
 Two random variables are said to be jointly Gaussian if their joint density function is of the form

f_XY(x, y) = [1 / (2π σ_X σ_Y √(1 − ρ²))] · exp{ −1/(2(1 − ρ²)) [ (x − X̄)²/σ_X² − 2ρ(x − X̄)(y − Ȳ)/(σ_X σ_Y) + (y − Ȳ)²/σ_Y² ] }

Here
X̄ = E[X], Ȳ = E[Y]
σ_X² = E[(X − X̄)²], σ_Y² = E[(Y − Ȳ)²]
ρ = E[(X − X̄)(Y − Ȳ)] / (σ_X σ_Y)

The maximum value of the joint Gaussian density function occurs at (x = X̄, y = Ȳ):
max[f_XY(x, y)] = 1 / (2π σ_X σ_Y √(1 − ρ²))

1. If X and Y are statistically independent (ρ = 0), their joint Gaussian density function reduces to
f_XY(x, y) = [1/(2π σ_X σ_Y)] exp{ −(1/2) [ (x − X̄)²/σ_X² + (y − Ȳ)²/σ_Y² ] }
Observe that with ρ = 0, corresponding to uncorrelated X and Y, this can be written as
f_XY(x, y) = f_X(x) f_Y(y)
where f_X(x) and f_Y(y) are the marginal density functions of X and Y:
f_X(x) = [1/√(2π σ_X²)] exp[ −(x − X̄)² / (2σ_X²) ]
f_Y(y) = [1/√(2π σ_Y²)] exp[ −(y − Ȳ)² / (2σ_Y²) ]
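A standard way to generate a jointly Gaussian pair with a prescribed correlation coefficient ρ is to mix two independent standard normals: Y = ρX + √(1 − ρ²)Z, which keeps Y standard normal while corr(X, Y) = ρ. A sketch (the sample size, seed, and ρ value are arbitrary choices):

```python
import random
import math

random.seed(7)   # fixed seed so the run is repeatable
rho = 0.6
n = 100_000

pairs = []
for _ in range(n):
    x = random.gauss(0.0, 1.0)                  # X ~ N(0, 1)
    z = random.gauss(0.0, 1.0)                  # Z ~ N(0, 1), independent of X
    y = rho * x + math.sqrt(1 - rho**2) * z     # Y ~ N(0, 1) with corr(X, Y) = rho
    pairs.append((x, y))

# The sample correlation coefficient should be close to rho.
mx = sum(x for x, _ in pairs) / n
my = sum(y for _, y in pairs) / n
cxy = sum((x - mx) * (y - my) for x, y in pairs) / n
sx = math.sqrt(sum((x - mx) ** 2 for x, _ in pairs) / n)
sy = math.sqrt(sum((y - my) ** 2 for _, y in pairs) / n)
assert abs(cxy / (sx * sy) - rho) < 0.02
```

Because any linear combination of jointly Gaussian variables is Gaussian, this construction produces exactly the bivariate density above (with zero means and unit variances).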

Note:
 Statistically independent random variables are always uncorrelated; however, the reverse statement is not true in all cases – uncorrelated random variables need not be independent. For Gaussian random variables, the reverse statement is also true.

N Random variables
N random variables X₁, X₂, …, X_N are called jointly Gaussian if their joint density function can be written as
f_{X₁…X_N}(x₁, …, x_N) = [ |[C_X]⁻¹|^{1/2} / (2π)^{N/2} ] exp{ −[x − X̄]ᵗ [C_X]⁻¹ [x − X̄] / 2 }
where [x − X̄] is the column vector with elements xᵢ − X̄ᵢ (i = 1, …, N) and [C_X] is the N × N matrix with elements C_ij.
[·]⁻¹ denotes the matrix inverse, [·]ᵗ the matrix transpose, and |[·]| the matrix determinant.
The elements of [C_X], called the covariance matrix of the N random variables, are given by
C_ij = E[(Xᵢ − X̄ᵢ)(X_j − X̄_j)] = σ_{Xᵢ}² for i = j, and C_{Xᵢ X_j} for i ≠ j
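The covariance matrix [C_X] can be assembled element by element from the definition C_ij = E[(Xᵢ − X̄ᵢ)(X_j − X̄_j)]: the diagonal holds the variances and the matrix is symmetric. A minimal sketch, treating each row of `samples` as one joint observation of (X₁, X₂, X₃) (the numbers are made up):

```python
from statistics import mean

# Each inner list is one joint observation of (X1, X2, X3); values are illustrative.
samples = [
    [1.0, 2.0, 0.5],
    [2.0, 1.0, 1.5],
    [3.0, 4.0, 2.5],
    [4.0, 3.0, 3.5],
]

def covariance_matrix(samples):
    n = len(samples)
    dim = len(samples[0])
    means = [mean(row[i] for row in samples) for i in range(dim)]
    # C[i][j] = average of (x_i - mean_i)(x_j - mean_j) over all observations.
    return [[sum((row[i] - means[i]) * (row[j] - means[j]) for row in samples) / n
             for j in range(dim)] for i in range(dim)]

C = covariance_matrix(samples)
# Diagonal entries are the variances; the matrix is symmetric: C[i][j] == C[j][i].
assert all(abs(C[i][j] - C[j][i]) < 1e-12
           for i in range(3) for j in range(3))
```

For the N-variable Gaussian density above, this matrix (inverted and with its determinant taken) is all that is needed besides the mean vector.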