RVSP Unit-2
STOCHASTIC PROCESS
Random Variable
[Figure: a random variable maps outcomes of the sample space to points on the real line]
07/16/2024 07:13 AM RANDOM VARIABLES AND STOCHASTIC PROCESS 5
VECTOR RANDOM VARIABLES
Suppose two random variables X and Y are defined on a sample
space S, where specific values of X and Y are denoted by x and y
respectively.
Then any ordered pair of numbers (x,y) may be conveniently
considered to be a random point in the xy-plane.
JOINT PROBABILITY DISTRIBUTION FUNCTION
Consider two random variables X and Y with elements \{x\} and \{y\} in the xy-plane. The joint probability distribution function is defined as

F_{XY}(x, y) = P\{X \le x, Y \le y\}

Property: F_{XY}(-\infty, -\infty) = 0

Proof: It is known that

F_{XY}(-\infty, -\infty) = P\{X \le -\infty, Y \le -\infty\} = P\{X \le -\infty \cap Y \le -\infty\}

Since \{X \le -\infty\} is an impossible (null) event,

F_{XY}(-\infty, -\infty) = 0
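These limiting values of the joint distribution function can be checked numerically with an empirical CDF built from simulated pairs; the uniform samples below are an arbitrary illustrative choice, not part of the text:

```python
import random

random.seed(0)
# Simulated (X, Y) pairs; the uniform choice is illustrative
samples = [(random.random(), random.random()) for _ in range(10_000)]

def F_XY(x, y):
    """Empirical joint CDF: fraction of samples with X <= x and Y <= y."""
    return sum(1 for xs, ys in samples if xs <= x and ys <= y) / len(samples)

print(F_XY(-1, -1))  # -> 0.0   (F_XY(-inf, -inf) = 0)
print(F_XY(2, 2))    # -> 1.0   (F_XY(+inf, +inf) = 1)
```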
For two events A and B, the conditional probability of A given B is

P(A/B) = \frac{P(A \cap B)}{P(B)}

Similarly, the conditional probability of B upon the occurrence of A is given as

P(B/A) = \frac{P(A \cap B)}{P(A)}
The conditional distribution function F_X(x/B) = P\{X \le x / B\} has the following properties:

1. F_X(-\infty/B) = 0
2. F_X(\infty/B) = 1
3. 0 \le F_X(x/B) \le 1
4. F_X(x_2/B) \ge F_X(x_1/B) \text{ when } x_2 > x_1
5. P\{x_1 < X \le x_2 / B\} = F_X(x_2/B) - F_X(x_1/B)
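These properties can be verified for a simple discrete case; the fair die and the conditioning event B = "outcome is even" below are illustrative assumptions, not from the text:

```python
from fractions import Fraction

# Illustrative setup: X is a fair die, B = "outcome is even"
outcomes = range(1, 7)
p = Fraction(1, 6)          # P{X = k} for each face
B = {2, 4, 6}
P_B = sum(p for k in outcomes if k in B)   # P(B) = 1/2

def F_given_B(x):
    """Conditional distribution F_X(x/B) = P{X <= x and B} / P(B)."""
    return sum(p for k in outcomes if k <= x and k in B) / P_B

assert F_given_B(0) == 0                    # property 1
assert F_given_B(6) == 1                    # property 2
assert F_given_B(3) <= F_given_B(5)         # property 4 (non-decreasing)
# property 5: P{2 < X <= 4 / B} = F(4/B) - F(2/B)
print(F_given_B(4) - F_given_B(2))          # -> 1/3
```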
CONDITIONAL DENSITY FUNCTION
The derivative of the conditional distribution function is called the conditional
density function. It gives the probability density of X at a specific value,
given that the event B has occurred:

f_X(x/B) = \frac{d F_X(x/B)}{dx}
Properties
1. f_X(x/B) \ge 0
2. \int_{-\infty}^{\infty} f_X(x/B)\, dx = 1
3. F_X(x/B) = \int_{-\infty}^{x} f_X(u/B)\, du
4. P\{x_1 < X \le x_2 / B\} = \int_{x_1}^{x_2} f_X(x/B)\, dx
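Property 2 (the conditional density integrates to 1) can be checked numerically; the exponential distribution and the event B = {X > 1} are illustrative assumptions, not from the text:

```python
import math

# Illustrative assumptions: X ~ Exponential(1), conditioning event B = {X > 1}
f = lambda x: math.exp(-x) if x >= 0 else 0.0   # density of X
P_B = math.exp(-1.0)                            # P{X > 1} = e^(-1)

def f_given_B(x):
    """Conditional density f_X(x/B) = f_X(x) / P(B) for x in B, else 0."""
    return f(x) / P_B if x >= 1.0 else 0.0

# Property 2: conditional density integrates to 1 (trapezoidal rule on [1, 21])
n = 200_000
h = 20.0 / n
xs = [1.0 + i * h for i in range(n + 1)]
area = sum((f_given_B(a) + f_given_B(b)) / 2.0 * h for a, b in zip(xs, xs[1:]))
print(abs(area - 1.0) < 1e-3)   # -> True
```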
To condition on a point value of Y, let the event B be the interval \{y - \Delta y \le Y \le y + \Delta y\}. From

P(A/B) = \frac{P(A \cap B)}{P(B)}

the conditional distribution function of X given B is

F_X(x/B) = \frac{P\{(X \le x) \cap B\}}{P(B)}

F_X(x / y - \Delta y \le Y \le y + \Delta y) = \frac{P\{(X \le x) \cap (y - \Delta y \le Y \le y + \Delta y)\}}{P(y - \Delta y \le Y \le y + \Delta y)}
It is known that the distribution function is the integral of the density function, so

F_X(x / y - \Delta y \le Y \le y + \Delta y) = \frac{\int_{y-\Delta y}^{y+\Delta y} \int_{-\infty}^{x} f_{XY}(x, y)\, dx\, dy}{\int_{y-\Delta y}^{y+\Delta y} f_Y(y)\, dy}

Letting \Delta y \to 0 and differentiating with respect to x gives the conditional density function

f_X(x/Y) = \frac{f_{XY}(x, y)}{f_Y(y)}
Similarly,

f_Y(y/X) = \frac{f_{XY}(x, y)}{f_X(x)}
Joint density function

f_{XY}(x, y) = \frac{\partial^2 F_{XY}(x, y)}{\partial x\, \partial y}
Marginal distribution functions
F_{XY}(x, \infty) = F_X(x)
F_{XY}(\infty, y) = F_Y(y)
Marginal density functions
f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, y)\, dy
f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x, y)\, dx
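The marginal relations above have a direct discrete analogue (a sum over the other variable instead of an integral); the joint pmf table below is a hypothetical example:

```python
# Hypothetical discrete joint pmf P{X=x, Y=y}; entries sum to 1
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}
fX, fY = {}, {}
for (x, y), p in joint.items():
    fX[x] = fX.get(x, 0.0) + p   # f_X(x) = sum over y of f_XY(x, y)
    fY[y] = fY.get(y, 0.0) + p   # f_Y(y) = sum over x of f_XY(x, y)

print(round(fX[0], 2), round(fX[1], 2))  # -> 0.3 0.7
print(round(fY[0], 2), round(fY[1], 2))  # -> 0.4 0.6
```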
Conditional density functions
f_X(x/Y) = \frac{f_{XY}(x, y)}{f_Y(y)}
f_Y(y/X) = \frac{f_{XY}(x, y)}{f_X(x)}
STATISTICAL INDEPENDENCE OF RANDOM VARIABLES:
Consider two random variables X and Y, and define the events
A = \{X \le x\} and B = \{Y \le y\} for two real numbers x and y.
The two random variables are said to be statistically independent if

P\{X \le x, Y \le y\} = P\{X \le x\}\, P\{Y \le y\}

In terms of distribution functions:

F_{XY}(x, y) = F_X(x)\, F_Y(y)

In terms of density functions:

f_{XY}(x, y) = f_X(x)\, f_Y(y)
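The factorization test is easy to apply to a discrete joint pmf; both tables below are hypothetical examples (the first is built as a product of marginals, the second is not):

```python
from itertools import product

def is_independent(joint, tol=1e-12):
    """Check f_XY(x, y) == f_X(x) f_Y(y) for every (x, y) in the table."""
    xs = {x for x, _ in joint}
    ys = {y for _, y in joint}
    fX = {x: sum(joint[(x, y)] for y in ys) for x in xs}   # marginal of X
    fY = {y: sum(joint[(x, y)] for x in xs) for y in ys}   # marginal of Y
    return all(abs(joint[(x, y)] - fX[x] * fY[y]) <= tol
               for x, y in product(xs, ys))

# Independent: joint built as a product of marginals (hypothetical values)
indep = {(x, y): px * py for x, px in [(0, 0.4), (1, 0.6)]
                         for y, py in [(0, 0.5), (1, 0.5)]}
# Dependent: an arbitrary table that does not factor
dep = {(0, 0): 0.10, (0, 1): 0.20, (1, 0): 0.30, (1, 1): 0.40}

print(is_independent(indep))  # -> True
print(is_independent(dep))    # -> False
```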
SUM OF TWO RANDOM VARIABLES
Let W = X + Y. The distribution function of W is

F_W(w) = P\{W \le w\} = \int_{-\infty}^{\infty} \int_{-\infty}^{w-y} f_{XY}(x, y)\, dx\, dy

For statistically independent X and Y, f_{XY}(x, y) = f_X(x) f_Y(y), and differentiating with respect to w gives

f_W(w) = \int_{-\infty}^{\infty} f_X(w - y)\, f_Y(y)\, dy

The above expression is recognized as a convolution integral:

f_W(w) = f_X(x) \otimes f_Y(y)

The density function of the sum of two statistically independent
random variables is the convolution of their individual density
functions.
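A numerical sketch of this result, under the illustrative assumption that X and Y are independent Uniform(0, 1) variables, for which the convolution f_W is the triangular density on [0, 2]:

```python
# W = X + Y with X, Y independent Uniform(0, 1); f_W is triangular on [0, 2]
N = 2000
dx = 2.0 / N
grid = [i * dx for i in range(N + 1)]                 # y-grid covering [0, 2]
fY = [1.0 if 0.0 <= y <= 1.0 else 0.0 for y in grid]  # Uniform(0, 1) density

def f_W(w):
    """f_W(w) = integral of f_X(w - y) f_Y(y) dy, via a Riemann sum."""
    total = 0.0
    for j, y in enumerate(grid):
        fx = 1.0 if 0.0 <= w - y <= 1.0 else 0.0      # f_X(w - y)
        total += fx * fY[j] * dx
    return total

print(round(f_W(1.0), 2))  # peak of the triangle: 1.0
print(round(f_W(0.5), 2))  # 0.5
```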
\mu_{01} = \mu_{10} = 0

Proof: The first-order central moments are \mu_{10} = E[(X - \bar{X})] = E[X] - \bar{X} = 0 and, similarly, \mu_{01} = E[(Y - \bar{Y})] = 0.

JOINT CHARACTERISTIC FUNCTION
We know that

\phi_{XY}(\omega_1, \omega_2) = E\left[e^{j\omega_1 X} e^{j\omega_2 Y}\right] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} e^{j\omega_1 x} e^{j\omega_2 y} f_{XY}(x, y)\, dx\, dy

The joint moments are obtained from the joint characteristic function as

m_{nk} = (-j)^{n+k} \left. \frac{\partial^{n+k} \phi_{XY}(\omega_1, \omega_2)}{\partial \omega_1^{n}\, \partial \omega_2^{k}} \right|_{\omega_1 = 0,\, \omega_2 = 0}
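As a sanity check on the moment formula, take n = 1 and k = 0; differentiating under the expectation:

```latex
m_{10} = (-j)\,\frac{\partial \phi_{XY}(\omega_1,\omega_2)}{\partial \omega_1}
         \bigg|_{\omega_1=\omega_2=0}
       = (-j)\,E\!\left[\,jX\,e^{j\omega_1 X}e^{j\omega_2 Y}\right]
         \bigg|_{\omega_1=\omega_2=0}
       = (-j)(j)\,E[X] = E[X]
```

so the formula recovers the first moment, m_{10} = \bar{X}.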
Here

\bar{X} = E[X], \quad \bar{Y} = E[Y]
\sigma_X^2 = E[(X - \bar{X})^2], \quad \sigma_Y^2 = E[(Y - \bar{Y})^2]
\rho = \frac{E[(X - \bar{X})(Y - \bar{Y})]}{\sigma_X\, \sigma_Y}
The maximum value of the joint Gaussian density function occurs at (x = \bar{X}, y = \bar{Y}):

\max[f_{XY}(x, y)] = \frac{1}{2\pi\, \sigma_X \sigma_Y \sqrt{1 - \rho^2}}
Observe that if \rho = 0, corresponding to uncorrelated X and Y, the joint Gaussian density reduces to

f_{XY}(x, y) = \frac{1}{2\pi\, \sigma_X \sigma_Y} \exp\left\{ \frac{-1}{2} \left[ \frac{(x - \bar{X})^2}{\sigma_X^2} + \frac{(y - \bar{Y})^2}{\sigma_Y^2} \right] \right\}

which can be written as

f_{XY}(x, y) = f_X(x)\, f_Y(y)

where f_X(x) and f_Y(y) are the marginal density functions of X and Y:
f_X(x) = \frac{1}{\sqrt{2\pi \sigma_X^2}} \exp\left[ -\frac{(x - \bar{X})^2}{2\sigma_X^2} \right]

f_Y(y) = \frac{1}{\sqrt{2\pi \sigma_Y^2}} \exp\left[ -\frac{(y - \bar{Y})^2}{2\sigma_Y^2} \right]
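A quick numerical check (with hypothetical parameter values) that the \rho = 0 joint Gaussian density factors into the product of these marginals:

```python
import math

def gauss(u, mean, sigma):
    """One-dimensional Gaussian density."""
    return math.exp(-(u - mean) ** 2 / (2 * sigma ** 2)) \
        / math.sqrt(2 * math.pi * sigma ** 2)

# Hypothetical parameters for illustration
Xbar, Ybar, sX, sY = 1.0, -2.0, 0.5, 2.0

def f_XY(x, y):
    """Joint Gaussian density with rho = 0."""
    q = (x - Xbar) ** 2 / sX ** 2 + (y - Ybar) ** 2 / sY ** 2
    return math.exp(-q / 2.0) / (2.0 * math.pi * sX * sY)

# The joint density equals the product of the marginals at any point
x, y = 0.7, -1.3
print(abs(f_XY(x, y) - gauss(x, Xbar, sX) * gauss(y, Ybar, sY)) < 1e-12)  # -> True
```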