CS115 Probability (4)
Probability
Temperature:
  T     P
  hot   0.5
  cold  0.5
Weather:
  W       P
  sun     0.6
  rain    0.1
  fog     0.3
  meteor  0.0
Probability Distributions
Unobserved random variables have distributions:
  T     P
  hot   0.5
  cold  0.5

  W       P
  sun     0.6
  rain    0.1
  fog     0.3
  meteor  0.0
Shorthand notation: P(hot) is shorthand for P(T = hot)
Must obey: P(x) >= 0 for every x, and the entries of a distribution sum to 1

Joint Distributions
A joint distribution gives the probability of every full assignment:
  T     W     P
  hot   sun   0.4
  hot   rain  0.1
  cold  sun   0.2
  cold  rain  0.3
Quiz: Events
  X    Y    P
  +x   +y   0.2
  +x   -y   0.3
  -x   +y   0.4
  -x   -y   0.1
P(+x) = ?
P(-y OR +x) = ?
Marginal Distributions
Marginal distributions are sub-tables which eliminate variables
Marginalization (summing out): combine collapsed rows by adding
Joint:
  T     W     P
  hot   sun   0.4
  hot   rain  0.1
  cold  sun   0.2
  cold  rain  0.3
Marginals:
  T     P
  hot   0.5
  cold  0.5

  W     P
  sun   0.6
  rain  0.4
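The summing-out operation above can be checked with a short Python sketch; the dict-based table representation is an illustrative choice, not part of the slides:

```python
# Joint distribution P(T, W) from the slide, keyed by (t, w).
joint = {
    ("hot", "sun"): 0.4,
    ("hot", "rain"): 0.1,
    ("cold", "sun"): 0.2,
    ("cold", "rain"): 0.3,
}

def marginal(joint, axis):
    """Sum out every variable except `axis` (0 for T, 1 for W)."""
    out = {}
    for assignment, p in joint.items():
        key = assignment[axis]
        out[key] = out.get(key, 0.0) + p
    return out

print(marginal(joint, 0))  # P(T): hot 0.5, cold 0.5
print(marginal(joint, 1))  # P(W): sun ~0.6, rain ~0.4
```

Each collapsed row of the joint is simply added into the surviving entry, which is exactly the "combine collapsed rows by adding" rule above.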
Quiz: Marginal Distributions
  X    Y    P
  +x   +y   0.2
  +x   -y   0.3
  -x   +y   0.4
  -x   -y   0.1
  X    P
  +x   ?
  -x   ?
  Y    P
  +y   ?
  -y   ?
Conditional Probabilities
A simple relation between joint and conditional probabilities
In fact, this is taken as the definition of a conditional probability:
  P(a | b) = P(a, b) / P(b)
Example:
  T     W     P
  hot   sun   0.4
  hot   rain  0.1
  cold  sun   0.2
  cold  rain  0.3
  P(W = sun | T = cold) = P(W = sun, T = cold) / P(T = cold) = 0.2 / 0.5 = 0.4
Quiz: Conditional Probabilities
  X    Y    P
  +x   +y   0.2
  +x   -y   0.3
  -x   +y   0.4
  -x   -y   0.1
P(+x | +y) = ?
P(-x | +y) = ?
P(-y | +x) = ?
Conditional Distributions
Conditional distributions are probability distributions over some variables given fixed values of others
Joint Distribution:
  T     W     P
  hot   sun   0.4
  hot   rain  0.1
  cold  sun   0.2
  cold  rain  0.3
Conditional Distributions:
  P(W | T = hot):
  W     P
  sun   0.8
  rain  0.2

  P(W | T = cold):
  W     P
  sun   0.4
  rain  0.6
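The two conditional tables above follow from the definition P(w | t) = P(t, w) / P(t). A minimal Python sketch (the representation and helper name are mine, not from the slides):

```python
# Joint P(T, W) from the slide.
joint = {
    ("hot", "sun"): 0.4,
    ("hot", "rain"): 0.1,
    ("cold", "sun"): 0.2,
    ("cold", "rain"): 0.3,
}

def conditional_w_given_t(joint, t):
    """P(W | T=t): select the rows with T=t, then divide by P(T=t)."""
    selected = {w: p for (ti, w), p in joint.items() if ti == t}
    z = sum(selected.values())  # P(T=t)
    return {w: p / z for w, p in selected.items()}

print(conditional_w_given_t(joint, "hot"))   # sun ~0.8, rain ~0.2
print(conditional_w_given_t(joint, "cold"))  # sun ~0.4, rain ~0.6
```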
Normalization Trick
Select the joint entries consistent with the evidence (here T = cold), then normalize so they sum to 1:
  T     W     P
  hot   sun   0.4
  hot   rain  0.1
  cold  sun   0.2
  cold  rain  0.3
  P(W | T = cold):
  W     P
  sun   0.4
  rain  0.6
Normalization Trick
Example 1 (normalize, Z = 0.5):
  W     P          W     P
  sun   0.2   ->   sun   0.4
  rain  0.3        rain  0.6
Example 2 (normalize, Z = 50):
  T     W     P         T     W     P
  hot   sun   20   ->   hot   sun   0.4
  hot   rain  5         hot   rain  0.1
  cold  sun   10        cold  sun   0.2
  cold  rain  15        cold  rain  0.3
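Both examples above are the same operation: divide every entry by the normalization constant Z. A sketch in Python (names are mine):

```python
def normalize(table):
    """Divide each entry by Z = sum of entries so the result sums to 1."""
    z = sum(table.values())
    return {k: v / z for k, v in table.items()}

# Example 1 from the slide: Z = 0.2 + 0.3 = 0.5.
print(normalize({"sun": 0.2, "rain": 0.3}))  # sun ~0.4, rain ~0.6

# Example 2: raw counts with Z = 50 normalize to a distribution.
print(normalize({("hot", "sun"): 20, ("hot", "rain"): 5,
                 ("cold", "sun"): 10, ("cold", "rain"): 15}))
```

Note that the inputs need not sum to 1 (or even be probabilities, as the counts in Example 2 show); normalization makes them a distribution.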
Probabilistic Inference
Probabilistic inference: compute a desired probability from other known probabilities (e.g. a conditional from the joint)
Step 1: Select the entries consistent with the evidence
Step 2: Sum out the hidden variables H to get the joint of the query and evidence
Step 3: Normalize
Inference by Enumeration
  S       T     W     P
  summer  hot   sun   0.30
  summer  hot   rain  0.05
  summer  cold  sun   0.10
  summer  cold  rain  0.05
  winter  hot   sun   0.10
  winter  hot   rain  0.05
  winter  cold  sun   0.15
  winter  cold  rain  0.20
P(W)?
P(W | winter)?
P(W | winter, hot)?
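The three queries can all be answered by the select / sum-out / normalize recipe. A Python sketch (the evidence encoding by axis index is an illustrative choice):

```python
# Joint P(S, T, W) from the slide, keyed by (season, temperature, weather).
joint = {
    ("summer", "hot", "sun"): 0.30, ("summer", "hot", "rain"): 0.05,
    ("summer", "cold", "sun"): 0.10, ("summer", "cold", "rain"): 0.05,
    ("winter", "hot", "sun"): 0.10, ("winter", "hot", "rain"): 0.05,
    ("winter", "cold", "sun"): 0.15, ("winter", "cold", "rain"): 0.20,
}

def query_w(joint, evidence):
    """P(W | evidence), where evidence maps axis index -> required value.
    Step 1: select consistent rows.  Step 2: sum out hidden variables.
    Step 3: normalize."""
    out = {}
    for row, p in joint.items():
        if all(row[i] == v for i, v in evidence.items()):
            w = row[2]
            out[w] = out.get(w, 0.0) + p
    z = sum(out.values())
    return {w: p / z for w, p in out.items()}

print(query_w(joint, {}))                       # P(W)
print(query_w(joint, {0: "winter"}))            # P(W | winter)
print(query_w(joint, {0: "winter", 1: "hot"}))  # P(W | winter, hot)
```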
Inference by Enumeration
Obvious problems (for n variables with domain size d):
  Worst-case time complexity O(d^n)
  Space complexity O(d^n) to store the joint distribution
The Product Rule
Sometimes we have conditional distributions but want the joint:
  P(x, y) = P(x | y) P(y)
Example:
  P(W):
  W    P
  sun  0.8
  rain 0.2

  P(D | W):
  D    W     P
  wet  sun   0.1
  dry  sun   0.9
  wet  rain  0.7
  dry  rain  0.3

  P(D, W) = P(D | W) P(W):
  D    W     P
  wet  sun   0.08
  dry  sun   0.72
  wet  rain  0.14
  dry  rain  0.06
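The joint table above is just a cell-by-cell product of the two given tables; a minimal sketch (variable names are mine):

```python
# Marginal P(W) and conditional P(D | W) from the slide.
p_w = {"sun": 0.8, "rain": 0.2}
p_d_given_w = {("wet", "sun"): 0.1, ("dry", "sun"): 0.9,
               ("wet", "rain"): 0.7, ("dry", "rain"): 0.3}

# Product rule: P(d, w) = P(d | w) * P(w) for every cell.
p_dw = {(d, w): p * p_w[w] for (d, w), p in p_d_given_w.items()}
print(p_dw)  # wet/sun ~0.08, dry/sun ~0.72, wet/rain ~0.14, dry/rain ~0.06
```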
The Chain Rule
More generally, any joint distribution can always be written as an incremental product of conditional distributions:
  P(x1, x2, ..., xn) = P(x1) P(x2 | x1) P(x3 | x1, x2) ...
                     = prod_i P(xi | x1, ..., x(i-1))

Bayes' Rule
Two ways to factor a joint distribution over two variables:
  P(x, y) = P(x | y) P(y) = P(y | x) P(x)
Dividing, we get:
  P(x | y) = P(y | x) P(x) / P(y)
Example:
  M: meningitis, S: stiff neck
  Given P(s | m), P(m), and P(s) as the givens, Bayes' rule yields:
  P(m | s) = P(s | m) P(m) / P(s)
Example:
I am 90% confident that I’m a good singer. P(good singer) = 0.9
If I’m a good singer, then 99% of people will like my singing. P(like | good singer) = 0.99
If I’m a bad singer, then 10% of people will like my singing. P(like | bad singer) = 0.10
I sing in my living room and my roommate covers his ears.
I need to update my beliefs to account for what I’ve learned.
I need to calculate: P(good singer | roommate doesn’t like my singing) = ?
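Bayes' rule finishes the calculation. This Python sketch uses only the three numbers given above (the event of interest is "roommate does not like the singing"):

```python
# Givens from the example.
p_good = 0.9
p_like_given_good = 0.99
p_like_given_bad = 0.10

# Evidence: the roommate does NOT like the singing.
p_notlike_given_good = 1 - p_like_given_good  # 0.01
p_notlike_given_bad = 1 - p_like_given_bad    # 0.90

# Bayes' rule: P(good | not like)
#   = P(not like | good) P(good) / P(not like),
# expanding P(not like) over good/bad singer.
p_notlike = (p_notlike_given_good * p_good
             + p_notlike_given_bad * (1 - p_good))
posterior = p_notlike_given_good * p_good / p_notlike
print(round(posterior, 4))  # ~0.0909: confidence drops from 0.9 to about 9%
```

The evidence is 90 times likelier under "bad singer" than under "good singer", which is why one observation moves the belief so far.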
Quiz: Bayes' Rule
Given:
  P(W):
  W    P
  sun  0.8
  rain 0.2

  P(D | W):
  D    W     P
  wet  sun   0.1
  dry  sun   0.9
  wet  rain  0.7
  dry  rain  0.3
What is P(W | dry)?
Independence
Two variables X and Y are independent if:
  P(x, y) = P(x) P(y) for all x, y
This says that their joint distribution factors into a product of two simpler distributions
Another form:
  P(x | y) = P(x) for all x, y
We write: X ⊥ Y

Example: are T and W independent?
  P(T):
  T     P
  hot   0.5
  cold  0.5

  P(W):
  W     P
  sun   0.6
  rain  0.4

  Observed joint P(T, W):
  T     W     P
  hot   sun   0.4
  hot   rain  0.1
  cold  sun   0.2
  cold  rain  0.3

  Product P(T) P(W):
  T     W     P
  hot   sun   0.3
  hot   rain  0.2
  cold  sun   0.3
  cold  rain  0.2

The two tables differ (e.g. 0.4 vs 0.3 for hot, sun), so T and W are not independent.
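The cell-by-cell comparison above is easy to automate; a small Python sketch (the tolerance check guards against floating-point round-off):

```python
# Observed joint P(T, W) and marginals P(T), P(W) from the slide.
joint = {("hot", "sun"): 0.4, ("hot", "rain"): 0.1,
         ("cold", "sun"): 0.2, ("cold", "rain"): 0.3}
p_t = {"hot": 0.5, "cold": 0.5}
p_w = {"sun": 0.6, "rain": 0.4}

# T and W are independent iff P(t, w) == P(t) P(w) for every cell.
independent = all(abs(p - p_t[t] * p_w[w]) < 1e-9
                  for (t, w), p in joint.items())
print(independent)  # False: e.g. P(hot, sun) = 0.4 but P(hot) P(sun) = 0.3
```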
Example: Independence
N fair, independent coin flips:
  P(X1) = P(X2) = ... = P(Xn), each uniform over heads/tails
  By independence, P(x1, ..., xn) = P(x1) P(x2) ... P(xn) = (1/2)^n

Conditional Independence
Unconditional (absolute) independence is very rare (why?)
Equivalent statements:
  P(Toothache | Catch, Cavity) = P(Toothache | Cavity)
  P(Toothache, Catch | Cavity) = P(Toothache | Cavity) P(Catch | Cavity)
One can be derived from the other easily
Trivial decomposition (just the chain rule):
  P(Toothache, Catch, Cavity) = P(Cavity) P(Catch | Cavity) P(Toothache | Catch, Cavity)