Lecture 3_2: Probability
Lecture Outline
• Joint PMF
• Joint CDF
• Joint PDF
• Marginal Statistics
• Independence
• Conditional Distributions
• Correlation and Covariance
Multiple Random Variables
P_XY(x, y)    x = 1    x = 2    x = 3    x = 4
y = 0         0        0        0        0
y = 1         0.10     0.05     0        0
y = 2         0.05     0.10     0.04     0.01
y = 3         0        0.05     0.15     0.15
y = 4         0        0        0.05     0.25
It is easy to check that this table describes a valid joint probability mass function.
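To make the check concrete, here is a short Python sketch (not part of the original slides) that stores the table as a dictionary keyed by (x, y) and verifies the two conditions for a valid joint PMF: every entry is non-negative and the entries sum to 1.

```python
# Sketch: verify that the table above is a valid joint PMF.
# Keys are (x, y) pairs; values are P_XY(x, y) read off the table.
joint_pmf = {
    (1, 1): 0.10, (2, 1): 0.05,
    (1, 2): 0.05, (2, 2): 0.10, (3, 2): 0.04, (4, 2): 0.01,
    (2, 3): 0.05, (3, 3): 0.15, (4, 3): 0.15,
    (3, 4): 0.05, (4, 4): 0.25,
}  # pairs omitted from the dictionary have probability 0

# Condition 1: all probabilities are non-negative.
assert all(p >= 0 for p in joint_pmf.values())

# Condition 2: the probabilities sum to 1 (allowing for floating-point rounding).
assert abs(sum(joint_pmf.values()) - 1.0) < 1e-9

print("valid joint PMF, total =", sum(joint_pmf.values()))
```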
Marginal Probability Distributions
We should also be able to say something about X and Y separately; this is captured by the marginal distributions.
P_XY(x, y)    x = 1    x = 2    x = 3    x = 4    P_Y(y)
y = 0         0        0        0        0        0
y = 1         0.10     0.05     0        0        0.15
y = 2         0.05     0.10     0.04     0.01     0.20
y = 3         0        0.05     0.15     0.15     0.35
y = 4         0        0        0.05     0.25     0.30
P_X(x)        0.15     0.20     0.24     0.41
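A minimal Python sketch of the same computation (the joint_pmf dictionary layout is an assumption carried over from the previous block): each marginal PMF is obtained by summing the joint PMF over the other variable, and the results should match the margin row and column of the table.

```python
from collections import defaultdict

# Joint PMF from the table: keys are (x, y); omitted pairs have probability 0.
joint_pmf = {
    (1, 1): 0.10, (2, 1): 0.05,
    (1, 2): 0.05, (2, 2): 0.10, (3, 2): 0.04, (4, 2): 0.01,
    (2, 3): 0.05, (3, 3): 0.15, (4, 3): 0.15,
    (3, 4): 0.05, (4, 4): 0.25,
}

# Marginal of X: sum the joint PMF over all y values (and similarly for Y).
p_x, p_y = defaultdict(float), defaultdict(float)
for (x, y), p in joint_pmf.items():
    p_x[x] += p
    p_y[y] += p

print("P_X:", dict(sorted(p_x.items())))  # should match the column sums: 0.15, 0.20, 0.24, 0.41
print("P_Y:", dict(sorted(p_y.items())))  # should match the row sums: 0.15, 0.20, 0.35, 0.30
```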
Examples on Two Random Variables (Cont'd)
• Example: Air Conditioner Maintenance
Y = number of units; X = service time, taking the values 1, 2, 3, 4.
Example
P_XY(x, y)    x = 1    x = 2    x = 3    x = 4    P_Y(y)
y = 0         0        0        0        0        0
y = 1         0.10     0.05     0        0        0.15
y = 2         0.05     0.10     0.04     0.01     0.20
y = 3         0        0.05     0.15     0.15     0.35
y = 4         0        0        0.05     0.25     0.30
P_X(x)        0.15     0.20     0.24     0.41
Properties
Independence of Random V.’s
There are situations where knowing the value of X doesn't tell us anything about Y, and vice versa. This brings up the notion of independence of random variables.
Definition: Independence of Random Variables
Two random variables X and Y are independent if, for every x and y, the events $\{X \le x\}$ and $\{Y \le y\}$ are independent, i.e. $P(X \le x, Y \le y) = P(X \le x)\,P(Y \le y)$.
Independence of Random V.’s
If two random variables X and Y are independent, then
i. from the joint CDF: $F_{XY}(x, y) = F_X(x)\,F_Y(y)$
ii. from the joint PDF: $f_{XY}(x, y) = f_X(x)\,f_Y(y)$
iii. from the joint PMF: $P_{XY}(x_i, y_j) = P_X(x_i)\,P_Y(y_j)$
Example
P_XY(x, y)    x = 1    x = 2    x = 3    x = 4    P_Y(y)
y = 0         0        0        0        0        0
y = 1         0.10     0.05     0        0        0.15
y = 2         0.05     0.10     0.04     0.01     0.20
y = 3         0        0.05     0.15     0.15     0.35
y = 4         0        0        0.05     0.25     0.30
P_X(x)        0.15     0.20     0.24     0.41
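Presumably the table is revisited here to test the factorization condition from the previous slide. A minimal Python sketch (reusing the assumed joint_pmf dictionary): independence requires P_XY(x, y) = P_X(x) P_Y(y) for every pair, and this already fails at (x, y) = (1, 1), since 0.10 ≠ 0.15 × 0.15.

```python
import itertools
from collections import defaultdict

joint_pmf = {
    (1, 1): 0.10, (2, 1): 0.05,
    (1, 2): 0.05, (2, 2): 0.10, (3, 2): 0.04, (4, 2): 0.01,
    (2, 3): 0.05, (3, 3): 0.15, (4, 3): 0.15,
    (3, 4): 0.05, (4, 4): 0.25,
}

# Marginals, obtained by summing out the other variable.
p_x, p_y = defaultdict(float), defaultdict(float)
for (x, y), p in joint_pmf.items():
    p_x[x] += p
    p_y[y] += p

# Independence requires P_XY(x, y) == P_X(x) * P_Y(y) for EVERY pair (x, y).
independent = all(
    abs(joint_pmf.get((x, y), 0.0) - p_x[x] * p_y[y]) < 1e-9
    for x, y in itertools.product(range(1, 5), range(0, 5))
)
print("independent?", independent)  # False: e.g. P_XY(1,1)=0.10 but P_X(1)*P_Y(1)=0.0225
```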
Continuous Random Variables
Cont'd
Example
Marginal Probability Distributions
As in the discrete case, we should also be able to say something about X and Y separately. The marginal PDF of X is obtained by integrating the joint PDF over all values of y:
$f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, y)\, dy$
and similarly $f_Y(y)$ is obtained by integrating over x.
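To illustrate the formula, here is a SymPy sketch; the joint density f_XY(x, y) = x + y on the unit square is purely an assumed example, not one taken from the slides.

```python
import sympy as sp

x, y = sp.symbols("x y", nonnegative=True)

# Assumed example density (not from the slides): f_XY(x, y) = x + y on 0 <= x, y <= 1.
f_xy = x + y

# Marginal of X: integrate the joint PDF over all y (here y ranges over [0, 1]).
f_x = sp.integrate(f_xy, (y, 0, 1))
print("f_X(x) =", f_x)               # x + 1/2

# Sanity check: the marginal itself integrates to 1 over the support of X.
print(sp.integrate(f_x, (x, 0, 1)))  # 1
```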
Marginal Statistics of Two Random Variables (Cont'd)
$P(X = x_i) = P_X(x_i) = \sum_{y_j} P_{XY}(x_i, y_j)$
$P(Y = y_j) = P_Y(y_j) = \sum_{x_i} P_{XY}(x_i, y_j)$
Conditional Probability Distributions
What information does one variable carry about the other?
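For discrete random variables the conditional PMF is P_{Y|X}(y ∣ x) = P_XY(x, y) / P_X(x) whenever P_X(x) > 0. The short Python sketch below (reusing the running table, with the dictionary layout assumed in the earlier blocks) computes P_{Y|X}(y ∣ x = 2).

```python
joint_pmf = {
    (1, 1): 0.10, (2, 1): 0.05,
    (1, 2): 0.05, (2, 2): 0.10, (3, 2): 0.04, (4, 2): 0.01,
    (2, 3): 0.05, (3, 3): 0.15, (4, 3): 0.15,
    (3, 4): 0.05, (4, 4): 0.25,
}

def conditional_y_given_x(x0):
    """P_{Y|X}(y | x0) = P_XY(x0, y) / P_X(x0)."""
    p_x0 = sum(p for (x, _), p in joint_pmf.items() if x == x0)  # marginal P_X(x0)
    return {y: p / p_x0 for (x, y), p in joint_pmf.items() if x == x0}

print(conditional_y_given_x(2))  # ≈ {1: 0.25, 2: 0.5, 3: 0.25}; the values sum to 1
```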
Properties
Example
Solution 2
We need to find P(X = +1, Y ≤ 0), where X takes the values +1 or −1 with equal probability and Y = X + N, with N ∼ U[−2, 2] (uniformly distributed noise).
Step 1: Express the probability in terms of the noise
Given X = +1, the output is Y = 1 + N. We need to determine P(Y ≤ 0 ∣ X = +1).
Rearranging, 1 + N ≤ 0 is the same event as N ≤ −1.
Step 2: Compute the probability
Since N ∼ U[−2, 2], the probability density function is f_N(n) = 1/4 for −2 ≤ n ≤ 2.
The probability that N ≤ −1 is:
P(N ≤ −1) = 1/4 × (−1 − (−2)) = 1/4 × 1 = 1/4.
Step 3: Multiply by P(X = +1)
Since P(X = +1) = 1/2, the multiplication rule gives
P(X = +1, Y ≤ 0) = P(X = +1) P(Y ≤ 0 ∣ X = +1) = 1/2 × 1/4 = 1/8.
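As a cross-check (not part of the slides), a quick Monte Carlo simulation of the stated model Y = X + N with N ∼ U[−2, 2] should reproduce the 1/8 answer numerically.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# X is +1 or -1 with equal probability; N is uniform noise on [-2, 2]; Y = X + N.
x = rng.choice([-1.0, 1.0], size=n)
noise = rng.uniform(-2.0, 2.0, size=n)
y = x + noise

# Estimate P(X = +1, Y <= 0); the analytical answer above is 1/8 = 0.125.
estimate = np.mean((x == 1.0) & (y <= 0.0))
print(estimate)  # ≈ 0.125
```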
Example
Independence of Random Variables
As with discrete random variables, there are situations where knowing the value of X doesn't tell us anything about Y, and vice versa. This brings up again the notion of independence of random variables.
Examples on Two Random Variables
Example-1:
The joint pdf of two continuous random variables X and Y
is given by:
Examples on Two Random Variables (Cont'd)
Solution:
Examples on Two Random Variables (Cont'd)
Solution:
Examples on Two Random Variables (Cont'd)
Solution:
Examples on Two Random Variables (Cont'd)
Solution:
Examples on Two Random Variables (Cont'd)
Solution:
Examples on Two Random Variables (Cont'd)
Examples on Two Random Variables (Cont'd)
Examples on Two Random Variables (Cont'd)
Examples on Two Random Variables (Cont'd)
Covariance
Definition: Covariance
$\operatorname{Cov}(X, Y) = E[(X - \mu_X)(Y - \mu_Y)] = E[XY] - E[X]\,E[Y]$
Covariance
The covariance is a measure of the linear relationship between two random variables. If one of the variables is easy to predict as a linear function of the other, then the covariance will be non-zero.
Definition:
Correlation Coefficient
It is useful to normalize the covariance; this leads to the correlation coefficient.
Definition: Correlation Coefficient
$\rho_{XY} = \dfrac{\operatorname{Cov}(X, Y)}{\sigma_X\, \sigma_Y}$
Example
Recall our previous example
P_XY(x, y)    x = 1    x = 2    x = 3    x = 4    P_Y(y)
y = 0         0        0        0        0        0
y = 1         0.10     0.05     0        0        0.15
y = 2         0.05     0.10     0.04     0.01     0.20
y = 3         0        0.05     0.15     0.15     0.35
y = 4         0        0        0.05     0.25     0.30
P_X(x)        0.15     0.20     0.24     0.41
Example
P_XY(x, y)    x = 1    x = 2    x = 3    x = 4    P_Y(y)
y = 0         0        0        0        0        0
y = 1         0.10     0.05     0        0        0.15
y = 2         0.05     0.10     0.04     0.01     0.20
y = 3         0        0.05     0.15     0.15     0.35
y = 4         0        0        0.05     0.25     0.30
P_X(x)        0.15     0.20     0.24     0.41
Cont’d
P_XY(x, y)    x = 1    x = 2    x = 3    x = 4    P_Y(y)
y = 0         0        0        0        0        0
y = 1         0.10     0.05     0        0        0.15
y = 2         0.05     0.10     0.04     0.01     0.20
y = 3         0        0.05     0.15     0.15     0.35
y = 4         0        0        0.05     0.25     0.30
P_X(x)        0.15     0.20     0.24     0.41
Cont’d
P_XY(x, y)    x = 1    x = 2    x = 3    x = 4    P_Y(y)
y = 0         0        0        0        0        0
y = 1         0.10     0.05     0        0        0.15
y = 2         0.05     0.10     0.04     0.01     0.20
y = 3         0        0.05     0.15     0.15     0.35
y = 4         0        0        0.05     0.25     0.30
P_X(x)        0.15     0.20     0.24     0.41
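As a numerical companion to these slides (a sketch, reusing the joint PMF dictionary assumed earlier), the expectations, covariance and correlation coefficient for this table can be computed directly, using the shortcut Cov(X, Y) = E[XY] − E[X]E[Y].

```python
import math

joint_pmf = {
    (1, 1): 0.10, (2, 1): 0.05,
    (1, 2): 0.05, (2, 2): 0.10, (3, 2): 0.04, (4, 2): 0.01,
    (2, 3): 0.05, (3, 3): 0.15, (4, 3): 0.15,
    (3, 4): 0.05, (4, 4): 0.25,
}

# E[g(X, Y)] for a discrete joint distribution: sum of g(x, y) * P_XY(x, y) over all pairs.
def expect(g):
    return sum(g(x, y) * p for (x, y), p in joint_pmf.items())

ex, ey = expect(lambda x, y: x), expect(lambda x, y: y)
exy = expect(lambda x, y: x * y)
var_x = expect(lambda x, y: x * x) - ex ** 2
var_y = expect(lambda x, y: y * y) - ey ** 2

cov = exy - ex * ey                   # Cov(X, Y) = E[XY] - E[X]E[Y]
rho = cov / math.sqrt(var_x * var_y)  # correlation coefficient
print(f"E[X]={ex:.2f}  E[Y]={ey:.2f}  Cov={cov:.3f}  rho={rho:.3f}")
# expected roughly: E[X]=2.91, E[Y]=2.80, Cov≈0.922, rho≈0.82
```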
Examples
Examples
Examples
Examples
Examples
Correlation of Independent R.V’s
Proposition: If X and Y are independent, then $E[XY] = E[X]\,E[Y]$, and hence $\operatorname{Cov}(X, Y) = 0$ and $\rho_{XY} = 0$.
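A minimal numerical sketch of the proposition (the two marginals below are assumptions chosen only for illustration): constructing a joint PMF as the product of its marginals makes X and Y independent by construction, and the covariance then comes out as zero.

```python
import itertools

# Assumed marginals, chosen only for illustration.
p_x = {0: 0.3, 1: 0.7}
p_y = {1: 0.2, 2: 0.5, 3: 0.3}

# Independent by construction: P_XY(x, y) = P_X(x) * P_Y(y).
joint = {(x, y): p_x[x] * p_y[y] for x, y in itertools.product(p_x, p_y)}

ex = sum(x * px for x, px in p_x.items())
ey = sum(y * py for y, py in p_y.items())
exy = sum(x * y * p for (x, y), p in joint.items())

# For independent X and Y: E[XY] = E[X]E[Y], hence Cov(X, Y) = 0.
print("Cov(X, Y) =", exy - ex * ey)  # 0 up to floating-point rounding
```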
Correlation
Thank You !!!