
Lecture 4

Probability and Random Processes

Lecture Outline

• Joint PMF
• Joint CDF
• Joint PDF
• Marginal Statistics
• Independence
• Conditional Distributions
• Correlation and Covariance

Multiple Random Variables

Sometimes we must deal with multiple random variables simultaneously.

Example: Let X and Y denote the blood pressure and heart rate of a randomly chosen ASTU student (during an exam). These two quantities are likely to be related, and describing the probability distribution of X and Y separately will not capture the relation between the two. Therefore we need a joint description of the distribution.

Let's focus first on discrete random variables.
Joint Probability Mass Functions

Definition: Joint Probability Mass Function

$$P_{XY}(x_i, y_j) = P(X = x_i, Y = y_j)$$

The joint cdf can be written as:

$$F_{XY}(x, y) = \sum_{x_i \le x} \sum_{y_j \le y} P_{XY}(x_i, y_j)$$

In particular, $0 \le P_{XY}(x_i, y_j) \le 1$, and the entries sum to one: $\sum_{x_i} \sum_{y_j} P_{XY}(x_i, y_j) = 1$.

This is quite similar to what we had before, but now we are jointly describing the two random variables.
Example
Suppose you want to study the relation between the number of bars
in your mobile, and the quality of the call. You collect data over time
and come up with the following stochastic model:

Quality Y \ Bars X     1      2      3      4
       0               0      0      0      0
       1            0.10   0.05      0      0
       2            0.05   0.10   0.04   0.01
       3               0   0.05   0.15   0.15
       4               0      0   0.05   0.25

It's easy to check that this table describes a valid joint probability mass function (see the sketch below).
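A minimal sketch of that check in Python/numpy (not from the slides; the array layout — rows for quality Y = 0..4, columns for bars X = 1..4 — is an assumption matching the table):

```python
import numpy as np

# Joint PMF table: rows are quality Y = 0..4, columns are bars X = 1..4.
pxy = np.array([[0.00, 0.00, 0.00, 0.00],
                [0.10, 0.05, 0.00, 0.00],
                [0.05, 0.10, 0.04, 0.01],
                [0.00, 0.05, 0.15, 0.15],
                [0.00, 0.00, 0.05, 0.25]])

assert (pxy >= 0).all()            # every entry satisfies 0 <= P_XY(x_i, y_j) <= 1
assert np.isclose(pxy.sum(), 1.0)  # entries sum to one, so this is a valid joint PMF
```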
Marginal Probability Distributions
Obviously we should also be able to say something about X and
Y separately.

Definition: Marginal Probability Mass Function

$$P_X(x_i) = \sum_{y_j} P_{XY}(x_i, y_j), \qquad P_Y(y_j) = \sum_{x_i} P_{XY}(x_i, y_j)$$

These are valid probability mass functions on their own, and are called the marginal p.m.f.'s of X and Y.
Example
Suppose you want to study the relation between the number of bars
in your mobile, and the quality of the call. You collect data over time
and come up with the following stochastic model:

Quality Y \ Bars X     1      2      3      4   P_Y(y)
       0               0      0      0      0     0
       1            0.10   0.05      0      0   0.15
       2            0.05   0.10   0.04   0.01   0.20
       3               0   0.05   0.15   0.15   0.35
       4               0      0   0.05   0.25   0.30
P_X(x)              0.15   0.20   0.24   0.41
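The row and column sums are mechanical; a minimal numpy sketch (same assumed layout as the earlier table):

```python
import numpy as np

# Joint PMF table: rows are quality Y = 0..4, columns are bars X = 1..4.
pxy = np.array([[0, 0, 0, 0], [.10, .05, 0, 0], [.05, .10, .04, .01],
                [0, .05, .15, .15], [0, 0, .05, .25]])

px = pxy.sum(axis=0)   # marginal P_X: [0.15, 0.20, 0.24, 0.41]
py = pxy.sum(axis=1)   # marginal P_Y: [0.00, 0.15, 0.20, 0.35, 0.30]
print(px, py)
```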
Examples on Two Random Variables
Cont’d……
• Example: Air Conditioner Maintenance

  – A company that services air conditioner units in residences and office blocks is interested in how to schedule its technicians in the most efficient manner.

  – The random variable X, taking the values 1, 2, 3 and 4, is the service time in hours.

  – The random variable Y, taking the values 1, 2 and 3, is the number of air conditioner units.
Examples on Two Random Variables
Cont’d……

Y = number     X = service time (hours)
of units        1      2      3      4
    1        0.12   0.08   0.07   0.05
    2        0.18   0.15   0.07   0.15
Conditional Probability Distribution
What information does one variable carry about the other?
In our example we might be interested in the call quality when we have 4 bars.

Definition: Conditional Probability Mass Function

$$P_{Y|X}(y_j \mid x_i) = \frac{P_{XY}(x_i, y_j)}{P_X(x_i)}, \qquad P_X(x_i) > 0$$
Example

Quality Y \ Bars X     1      2      3      4   P_Y(y)
       0               0      0      0      0     0
       1            0.10   0.05      0      0   0.15
       2            0.05   0.10   0.04   0.01   0.20
       3               0   0.05   0.15   0.15   0.35
       4               0      0   0.05   0.25   0.30
P_X(x)              0.15   0.20   0.24   0.41

For instance, conditioning on X = 4 bars gives P_{Y|X}(y | 4) = P_XY(4, y)/0.41, i.e. approximately (0, 0, 0.02, 0.37, 0.61) for y = 0, ..., 4.
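As a sketch of that computation (same assumed numpy layout as before), conditioning on X = 4 just renormalises the last column of the table by P_X(4) = 0.41:

```python
import numpy as np

pxy = np.array([[0, 0, 0, 0], [.10, .05, 0, 0], [.05, .10, .04, .01],
                [0, .05, .15, .15], [0, 0, .05, .25]])  # rows Y=0..4, cols X=1..4

px = pxy.sum(axis=0)                # marginal P_X
py_given_x4 = pxy[:, 3] / px[3]     # P_{Y|X}(y | 4) = P_XY(4, y) / P_X(4)
print(py_given_x4.round(3))         # [0. 0. 0.024 0.366 0.61 ]
assert np.isclose(py_given_x4.sum(), 1.0)   # a proper PMF: sums to one
```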
Properties

Note that the conditional probability mass function is a proper mass function, therefore

$$\sum_{y_j} P_{Y|X}(y_j \mid x_i) = 1$$
Independence of Random V.’s
There are situations where knowing the value of X doesn't tell us anything about Y, and vice versa. This brings up the notion of independence of random variables.

Definition: Independence of Random Variables

X and Y are independent if and only if

$$P_{XY}(x_i, y_j) = P_X(x_i) P_Y(y_j) \quad \text{for all } i, j$$
Independence of Random V.’s
 If two random variables X and Y are independent, then

i. from the joint cdf:

$$F_{XY}(x, y) = F_X(x) F_Y(y)$$

ii. from the joint pdf:

$$f_{XY}(x, y) = f_X(x) f_Y(y)$$

iii. from the joint pmf:

$$P_{XY}(x_i, y_j) = P_X(x_i) P_Y(y_j)$$
Example

Quality Y \ Bars X     1      2      3      4   P_Y(y)
       0               0      0      0      0     0
       1            0.10   0.05      0      0   0.15
       2            0.05   0.10   0.04   0.01   0.20
       3               0   0.05   0.15   0.15   0.35
       4               0      0   0.05   0.25   0.30
P_X(x)              0.15   0.20   0.24   0.41

Here X and Y are not independent: e.g. P_XY(1, 1) = 0.10, while P_X(1) P_Y(1) = 0.15 × 0.15 = 0.0225.
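A one-line numerical check of the factorization test (same assumed numpy layout): independence would force the table to equal the outer product of its marginals.

```python
import numpy as np

pxy = np.array([[0, 0, 0, 0], [.10, .05, 0, 0], [.05, .10, .04, .01],
                [0, .05, .15, .15], [0, 0, .05, .25]])  # rows Y=0..4, cols X=1..4

px, py = pxy.sum(axis=0), pxy.sum(axis=1)
# Independence requires P_XY(x, y) = P_X(x) * P_Y(y) for every cell:
print(np.allclose(pxy, np.outer(py, px)))   # False -> X and Y are dependent
```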
Continuous Random Variables

All the concepts we introduced in this lecture can also be defined for continuous random variables. However, this requires the use of multiple integrals (essentially replacing the double summations we've seen before).

Definition: Joint Probability Density Function

A function $f_{XY}(x, y) \ge 0$ such that, for any region A of the plane,

$$P((X, Y) \in A) = \iint_A f_{XY}(x, y)\, dx\, dy, \qquad \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{XY}(x, y)\, dx\, dy = 1$$
Marginal Probability Distributions
Obviously we should also be able to say something about X and
Y separately.

Definition: Marginal Density Function

$$f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, y)\, dy, \qquad f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x, y)\, dx$$

These are valid probability density functions on their own, and describe X and Y individually (but disregard how these two random variables are related).
Marginal Statistics of Two Random Variables
 In the case of two or more random variables, the statistics
of each individual variable are called marginal statistics.
i. Marginal cdf of X and Y

$$F_X(x) = \lim_{y \to \infty} F_{XY}(x, y) = F_{XY}(x, \infty)$$

$$F_Y(y) = \lim_{x \to \infty} F_{XY}(x, y) = F_{XY}(\infty, y)$$

ii. Marginal pdf of X and Y

$$f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, y)\, dy, \qquad f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x, y)\, dx$$
Marginal Statistics of Two Random Variables Cont’d…

iii. Marginal PMF of X and Y

$$P(X = x_i) = P_X(x_i) = \sum_{y_j} P_{XY}(x_i, y_j)$$

$$P(Y = y_j) = P_Y(y_j) = \sum_{x_i} P_{XY}(x_i, y_j)$$
Conditional Probability Distributions
What information does one variable carry about the other?

Definition: Conditional Probability Density Function

$$f_{X|Y}(x \mid y) = \frac{f_{XY}(x, y)}{f_Y(y)}, \qquad f_Y(y) > 0$$
Properties

Note that the conditional probability density function is a proper density function, therefore

$$\int_{-\infty}^{\infty} f_{X|Y}(x \mid y)\, dx = 1$$
Example

Let X be the input to a communication channel and Y the output. The input to the channel is +1 volt or −1 volt with equal probability. The output of the channel is the input plus a noise voltage N that is uniformly distributed in the interval [−2, +2] volts. Find P[X = +1, Y ≤ 0].

Solution: P[X = +1, Y ≤ y] = P[Y ≤ y | X = +1] P[X = +1], where P[X = +1] = 1/2. When the input X = +1, the output Y is uniformly distributed in the interval [−1, 3]. Therefore,

$$P[Y \le y \mid X = +1] = \frac{y + 1}{4} \quad \text{for } -1 \le y \le 3$$

Setting y = 0 gives P[Y ≤ 0 | X = +1] = 1/4, and hence P[X = +1, Y ≤ 0] = (1/2)(1/4) = 1/8.
Solution 2

We need to find P(X = +1, Y ≤ 0), where X takes the values +1 or −1 with equal probability, and Y = X + N with N ∼ U[−2, 2] (uniformly distributed noise).

Step 1: Express the probability in terms of the noise. Given X = +1, the output is Y = 1 + N, so we need P(Y ≤ 0 | X = +1). Rearranging: 1 + N ≤ 0 ⟺ N ≤ −1.

Step 2: Compute the probability. Since N ∼ U[−2, 2], the probability density function (PDF) is f_N(n) = 1/4 for −2 ≤ n ≤ 2. The probability that N ≤ −1 is

$$P(N \le -1) = \frac{1}{4} \times (-1 - (-2)) = \frac{1}{4} \times 1 = \frac{1}{4}$$

Step 3: Apply the multiplication rule. Since P(X = +1) = 1/2, we find P(X = +1, Y ≤ 0) = P(X = +1) P(Y ≤ 0 | X = +1) = 1/2 × 1/4 = 1/8.
Example

Example: Let X be the input to a communication channel and let Y be the output. The input to the channel is +1 volt or −1 volt with equal probability. The output of the channel is the input plus a noise voltage N that is uniformly distributed in the interval [−2, +2] volts. Find the probability that Y is negative given that X is +1.

Solution
If X = +1, then Y is uniformly distributed in the interval [−1, 3]. Thus,

$$F_Y(y \mid 1) = \frac{y + 1}{4}, \qquad -1 \le y \le 3$$

so P(Y < 0 | X = +1) = F_Y(0 | 1) = 1/4.

Solution 2
We need to find P(Y < 0 | X = +1).

Step 1: Express Y in terms of N. Since the input X is given as +1 or −1, we focus on X = +1: Y = X + N = 1 + N. We need to find P(Y < 0 | X = +1) = P(1 + N < 0) = P(N < −1).

Step 2: Compute the probability. The noise N follows a uniform distribution, N ∼ U[−2, 2], with probability density function f_N(n) = 1/4 for −2 ≤ n ≤ 2. The probability that N < −1 is

$$P(N < -1) = \frac{1}{4} \times (-1 - (-2)) = \frac{1}{4} \times 1 = \frac{1}{4}$$
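Both channel answers are easy to confirm by simulation; here is a small Monte Carlo sketch (not from the slides) estimating P(X = +1, Y ≤ 0) ≈ 1/8 and P(Y < 0 | X = +1) ≈ 1/4:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.choice([-1.0, 1.0], size=n)        # input: +/-1 volt with equal probability
y = x + rng.uniform(-2.0, 2.0, size=n)     # output: input plus uniform noise N

print(((x == 1) & (y <= 0)).mean())                  # ~0.125 = 1/8
print(((x == 1) & (y < 0)).sum() / (x == 1).sum())   # ~0.25  = 1/4
```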
Independence of Random Variables
As with discrete random variables, there are situations where knowing the value of X doesn't tell us anything about Y, and vice versa. This brings up again the notion of independence of random variables.

Definition: Independence of Random Variables

X and Y are independent if and only if

$$f_{XY}(x, y) = f_X(x) f_Y(y) \quad \text{for all } x, y$$
Examples on Two Random Variables
Example-1:
The joint pdf of two continuous random variables X and Y is given by:

$$f_{XY}(x, y) = \begin{cases} kxy, & 0 \le x \le 1,\ 0 \le y \le 1 \\ 0, & \text{otherwise} \end{cases}$$

a. Find the value of k.
b. Find the marginal pdfs of X and Y.
c. Are X and Y independent?
d. Find P(X + Y ≤ 1).
e. Find the conditional pdfs of X and Y.
Examples on Two Random Variables
Cont’d……
Solution:

a. Normalisation fixes k:

$$\int_0^1 \int_0^1 kxy\, dx\, dy = k \cdot \frac{1}{2} \cdot \frac{1}{2} = \frac{k}{4} = 1 \quad \Rightarrow \quad k = 4$$
Examples on Two Random Variables
Cont’d……
Solution:

b. Marginal pdfs:

$$f_X(x) = \int_0^1 4xy\, dy = 2x, \quad 0 \le x \le 1, \qquad f_Y(y) = \int_0^1 4xy\, dx = 2y, \quad 0 \le y \le 1$$
Examples on Two Random Variables
Cont’d……
Solution:

c. Since f_XY(x, y) = 4xy = (2x)(2y) = f_X(x) f_Y(y) for all x, y, the random variables X and Y are independent.
Examples on Two Random Variables
Cont’d……
Solution:

d. Integrating the joint pdf over the triangle x + y ≤ 1:

$$P(X + Y \le 1) = \int_0^1 \int_0^{1-x} 4xy\, dy\, dx = \int_0^1 2x(1-x)^2\, dx = \frac{1}{6}$$
Examples on Two Random Variables Cont’d……

Solution:

e. Conditional pdfs of X and Y:

i. Conditional pdf of X:

$$f_{X|Y}(x \mid y) = \frac{f_{XY}(x, y)}{f_Y(y)} = \frac{4xy}{2y} = 2x, \qquad 0 \le x \le 1,\ 0 \le y \le 1$$

$$f_{X|Y}(x \mid y) = 0 \quad \text{otherwise}$$
Examples on Two Random Variables
Cont’d……
Solution:

ii. Conditional pdf of Y:

$$f_{Y|X}(y \mid x) = \frac{f_{XY}(x, y)}{f_X(x)} = \frac{4xy}{2x} = 2y, \qquad 0 \le x \le 1,\ 0 \le y \le 1$$
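The whole example can be verified symbolically; a sketch with sympy, assuming the reconstructed joint pdf f_XY(x, y) = kxy on the unit square:

```python
import sympy as sp

x, y, k = sp.symbols('x y k', positive=True)
f = k * x * y                                   # assumed joint pdf on [0,1] x [0,1]

k_val = sp.solve(sp.integrate(f, (x, 0, 1), (y, 0, 1)) - 1, k)[0]
f = f.subs(k, k_val)                            # k = 4

fx = sp.integrate(f, (y, 0, 1))                 # marginal f_X(x) = 2x
fy = sp.integrate(f, (x, 0, 1))                 # marginal f_Y(y) = 2y
print(k_val, fx, fy)                            # 4  2*x  2*y
print(sp.simplify(f - fx * fy) == 0)            # True -> X and Y are independent
print(sp.integrate(f, (y, 0, 1 - x), (x, 0, 1)))  # P(X + Y <= 1) = 1/6
```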
Covariance

Definition: Covariance

$$\mathrm{Cov}(X, Y) = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X]E[Y]$$
Covariance
The covariance is a measure of the linear relationship between random variables. If one of the variables is easy to predict as a linear function of the other, then the covariance is going to be non-zero.
Correlation Coefficient
It is useful to normalize the covariance, which leads to the following definition.

Definition: Correlation Coefficient

$$\rho_{XY} = \frac{\mathrm{Cov}(X, Y)}{\sigma_X \sigma_Y}, \qquad -1 \le \rho_{XY} \le 1$$
Example
Recall our previous example:

Quality Y \ Bars X     1      2      3      4   P_Y(y)
       0               0      0      0      0     0
       1            0.10   0.05      0      0   0.15
       2            0.05   0.10   0.04   0.01   0.20
       3               0   0.05   0.15   0.15   0.35
       4               0      0   0.05   0.25   0.30
P_X(x)              0.15   0.20   0.24   0.41

From the marginals, E[X] = 2.91 and E[Y] = 2.80, while E[XY] = 9.07, so

$$\mathrm{Cov}(X, Y) = 9.07 - 2.91 \times 2.80 \approx 0.92$$

With σ_X ≈ 1.10 and σ_Y ≈ 1.03, the correlation coefficient is ρ_XY ≈ 0.82.

As the correlation is significantly different from zero, this means that Y can be well-predicted from X using a linear relationship.
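The moments above follow mechanically from the table; a short numpy sketch (same assumed layout as before) reproduces Cov(X, Y) ≈ 0.92 and ρ_XY ≈ 0.82:

```python
import numpy as np

pxy = np.array([[0, 0, 0, 0], [.10, .05, 0, 0], [.05, .10, .04, .01],
                [0, .05, .15, .15], [0, 0, .05, .25]])  # rows Y=0..4, cols X=1..4
xs = np.array([1, 2, 3, 4])       # values of X (bars)
ys = np.array([0, 1, 2, 3, 4])    # values of Y (quality)

px, py = pxy.sum(axis=0), pxy.sum(axis=1)
ex, ey = xs @ px, ys @ py                 # E[X] = 2.91, E[Y] = 2.80
exy = ys @ pxy @ xs                       # E[XY] = sum_ij y_i * P_XY(i,j) * x_j = 9.07
cov = exy - ex * ey                       # ~0.922
sx = np.sqrt(xs**2 @ px - ex**2)          # sigma_X
sy = np.sqrt(ys**2 @ py - ey**2)          # sigma_Y
print(cov, cov / (sx * sy))               # ~0.922  ~0.82
```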
Examples

(d) Find the mean and variance of X.
Correlation of Independent R.V’s
Proposition:

If X and Y are independent, then E[XY] = E[X]E[Y], and therefore

$$\mathrm{Cov}(X, Y) = 0 \qquad \text{and} \qquad \rho_{XY} = 0$$

That is, independent random variables are always uncorrelated.
Correlation

Uncorrelatedness IS NOT EQUIVALENT to independence.

It's important to note that the implication goes only in one direction:

$$X, Y \text{ independent} \;\Rightarrow\; \rho_{XY} = 0, \qquad \text{but} \qquad \rho_{XY} = 0 \;\nRightarrow\; X, Y \text{ independent}$$

For example, if X is uniform on {−1, 0, 1} and Y = X², then Cov(X, Y) = E[X³] − E[X]E[X²] = 0, yet Y is completely determined by X.
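The counterexample is easy to verify numerically; a minimal sketch (not from the slides):

```python
import numpy as np

# X uniform on {-1, 0, 1}, Y = X^2: uncorrelated but clearly dependent.
xs = np.array([-1.0, 0.0, 1.0])
p = np.full(3, 1 / 3)

ex = xs @ p                        # E[X] = 0
ey = (xs**2) @ p                   # E[Y] = E[X^2] = 2/3
exy = (xs * xs**2) @ p             # E[XY] = E[X^3] = 0
print(exy - ex * ey)               # Cov(X, Y) = 0.0, yet Y = X**2 is a function of X
```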
Thank You !!!
