6.1. Quantifying Uncertainty: Probability I

Instructor: Tamer Elsayed

Sec 13.1-13.5

[Many slides were created by Dan Klein and Pieter Abbeel at UC Berkeley (ai.berkeley.edu)]
 We’re done with:
  Part I: Search
  Part II: Logical Agents

 Now: Part III: Probabilistic Reasoning

 Example applications:
  Diagnosis
  Speech recognition
  Tracking objects
  … lots more!
 Probability
 Random Variables
 Joint and Marginal Distributions
 Conditional Distribution
 Product Rule, Chain Rule, Bayes’ Rule
 Inference
 Independence

 You’ll need all this stuff A LOT for the next few weeks, so make sure you go over it now!
 A ghost is in the grid somewhere
 Sensor readings tell how close a
square is to the ghost
 On the ghost: red
 1 or 2 away: orange
 3 or 4 away: yellow
 5+ away: green

 Sensors are noisy, but we know P(Color | Distance)


 P(red | 3)   P(orange | 3)   P(yellow | 3)   P(green | 3)
 0.05         0.15            0.5             0.3
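The noisy sensor model above can be sketched as a lookup table plus a sampler. This Python sketch is not from the slides; only the Distance = 3 column is given there, so other distances are left out, and the function names are made up for illustration:

```python
import random

# Sensor model P(Color | Distance). Only the Distance = 3 column
# appears on the slide; other distances would need their own columns.
sensor_model = {
    3: {"red": 0.05, "orange": 0.15, "yellow": 0.5, "green": 0.3},
}

def reading_probability(color, distance):
    """Return P(color | distance) from the table."""
    return sensor_model[distance][color]

def sample_reading(distance, rng=random):
    """Draw one noisy color reading given the true distance."""
    colors = list(sensor_model[distance])
    weights = [sensor_model[distance][c] for c in colors]
    return rng.choices(colors, weights=weights, k=1)[0]

print(reading_probability("yellow", 3))  # 0.5
```

Each row of the model is itself a distribution, so its entries must sum to one.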
 General situation:
 Observed variables (evidence): Agent knows certain things
about the state of the world (e.g., sensor readings or
symptoms).
 Unobserved variables: Agent needs to reason about other
aspects (e.g. where an object is or what disease is present).
 Model: Agent knows something about how the known
variables relate to the unknown variables.

 Probabilistic reasoning gives us a framework for managing our beliefs and knowledge.
 A random variable is some aspect of the world about
which we (may) have uncertainty.
 R = Is it raining?
 T = Is it hot or cold?
 D = How long will it take to drive to work?
 L = Where is the ghost?

 We denote random variables with capital letters.

 Like variables in a CSP, random variables have domains:
  R in {true, false} (often written as {+r, -r})
  T in {hot, cold}
  D in [0, ∞)
  L in possible locations, maybe {(0,0), (0,1), …}
 Associate a probability with each value.

 Temperature:
   T     P
   hot   0.5
   cold  0.5

 Weather:
   W       P
   sun     0.6
   rain    0.1
   fog     0.3
   meteor  0.0
 Random variables have distributions:

   T     P          W       P
   hot   0.5        sun     0.6
   cold  0.5        rain    0.1
                    fog     0.3
                    meteor  0.0

 Shorthand notation: P(hot) means P(T = hot); OK if all domain entries are unique.
 A distribution is a TABLE of probabilities of values.
 A probability (lower-case value) is a single number.
 Must have: P(X = x) ≥ 0 for every x, and Σx P(X = x) = 1.
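Those two requirements are easy to check mechanically. A minimal Python sketch (not from the slides; `is_distribution` is a made-up helper name):

```python
def is_distribution(table, tol=1e-9):
    """A value -> probability table is a distribution iff every entry
    is nonnegative and all entries sum to 1 (within float tolerance)."""
    values = table.values()
    return all(p >= 0 for p in values) and abs(sum(values) - 1.0) < tol

weather = {"sun": 0.6, "rain": 0.1, "fog": 0.3, "meteor": 0.0}
print(is_distribution(weather))  # True
```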
 A joint distribution over a set of random variables X1, …, Xn specifies a real number for each assignment (or outcome): P(X1 = x1, X2 = x2, …, Xn = xn).

   T     W     P
   hot   sun   0.4
   hot   rain  0.1
   cold  sun   0.2
   cold  rain  0.3

 Must obey: P(x1, x2, …, xn) ≥ 0, and the probabilities of all assignments must sum to 1.

 Size of distribution if n variables with domain sizes d? d^n entries.
 For all but the smallest distributions, impractical to write out!
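The table above is small enough to write out explicitly. A Python sketch (illustrative, not from the slides) that stores it as a dict keyed by outcome and checks the requirements:

```python
from itertools import product

# The joint distribution P(T, W) from the slide, keyed by outcome.
joint_TW = {
    ("hot", "sun"): 0.4,
    ("hot", "rain"): 0.1,
    ("cold", "sun"): 0.2,
    ("cold", "rain"): 0.3,
}

# n variables with domain size d give d**n rows; here 2**2 = 4.
domains = {"T": ("hot", "cold"), "W": ("sun", "rain")}
assert set(joint_TW) == set(product(*domains.values()))
print(len(joint_TW))                             # 4
print(abs(sum(joint_TW.values()) - 1.0) < 1e-9)  # True (sums to one)
```

The exponential blow-up is visible here: ten binary variables would already need 2^10 = 1024 rows.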
 A probabilistic model is a joint distribution over a set of random variables.

 Distribution over T, W:
   T     W     P
   hot   sun   0.4
   hot   rain  0.1
   cold  sun   0.2
   cold  rain  0.3

 Probabilistic models:
  (Random) variables with domains
  Assignments are called outcomes
  Joint distributions: say whether assignments (outcomes) are likely
  Normalized: sum to 1.0
  Ideally: only certain variables directly interact
 An event is a set E of outcomes: P(E) = Σ P(x1, …, xn), summed over the outcomes (x1, …, xn) in E.

 From a joint distribution, we can calculate the probability of any event:
  Probability that it’s hot AND sunny?
  Probability that it’s hot?
  Probability that it’s hot OR sunny?

   T     W     P
   hot   sun   0.4
   hot   rain  0.1
   cold  sun   0.2
   cold  rain  0.3

 Typically, the events we care about are partial assignments, like P(T = hot).
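Computing an event's probability is just a sum over the matching rows. A Python sketch (illustrative, not from the slides) answering the three questions above:

```python
# Joint distribution P(T, W) from the slide.
joint_TW = {
    ("hot", "sun"): 0.4,
    ("hot", "rain"): 0.1,
    ("cold", "sun"): 0.2,
    ("cold", "rain"): 0.3,
}

def p_event(joint, event):
    """P(E): sum the probabilities of the outcomes in the event set E."""
    return sum(p for outcome, p in joint.items() if outcome in event)

hot_and_sunny = {("hot", "sun")}
hot = {(t, w) for (t, w) in joint_TW if t == "hot"}
hot_or_sunny = {(t, w) for (t, w) in joint_TW if t == "hot" or w == "sun"}

print(round(p_event(joint_TW, hot_and_sunny), 10))  # 0.4
print(round(p_event(joint_TW, hot), 10))            # 0.5
print(round(p_event(joint_TW, hot_or_sunny), 10))   # 0.7
```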
 P(+x, +y)?
 P(+x)?
 P(-y OR +x)?

   X    Y    P
   +x   +y   0.2
   +x   -y   0.3
   -x   +y   0.4
   -x   -y   0.1
 Marginal distributions are sub-tables which eliminate variables.
 Marginalization (summing out): combine collapsed rows by adding, e.g. P(t) = Σw P(t, w) and P(w) = Σt P(t, w).

   T     W     P           T     P          W     P
   hot   sun   0.4    →    hot   0.5        sun   0.6
   hot   rain  0.1         cold  0.5        rain  0.4
   cold  sun   0.2
   cold  rain  0.3
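Summing out is a one-pass accumulation over the joint table. A Python sketch (illustrative, not from the slides; `marginal` is a made-up helper name):

```python
from collections import defaultdict

joint_TW = {
    ("hot", "sun"): 0.4,
    ("hot", "rain"): 0.1,
    ("cold", "sun"): 0.2,
    ("cold", "rain"): 0.3,
}

def marginal(joint, axis):
    """Sum out every variable except the one at position `axis`
    of the outcome tuple (0 = T, 1 = W here)."""
    table = defaultdict(float)
    for outcome, p in joint.items():
        table[outcome[axis]] += p
    return dict(table)

print(marginal(joint_TW, 0))  # P(T): hot 0.5, cold 0.5
print(marginal(joint_TW, 1))  # P(W): sun 0.6, rain 0.4 (up to float rounding)
```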
   X    Y    P
   +x   +y   0.2          X    P          Y    P
   +x   -y   0.3    →     +x              +y
   -x   +y   0.4          -x              -y
   -x   -y   0.1
 A simple relation between joint and conditional probabilities, taken as the definition of a conditional probability:

   P(a | b) = P(a, b) / P(b)

 The conditional probability of A given B is how many times A occurred together with B, divided by how many times B occurred, i.e. only the A's "within" the B's.

   T     W     P
   hot   sun   0.4
   hot   rain  0.1
   cold  sun   0.2
   cold  rain  0.3

 Example: P(W = sun | T = cold) = P(W = sun, T = cold) / P(T = cold) = 0.2 / 0.5 = 0.4
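The definition translates directly into code: one lookup for the joint term, one marginal sum for the denominator. A Python sketch (illustrative, not from the slides; `p_w_given_t` is a made-up name):

```python
joint_TW = {
    ("hot", "sun"): 0.4,
    ("hot", "rain"): 0.1,
    ("cold", "sun"): 0.2,
    ("cold", "rain"): 0.3,
}

def p_w_given_t(joint, w_value, t_value):
    """P(W = w | T = t) = P(w, t) / P(t), from the definition above."""
    p_joint = joint[(t_value, w_value)]
    p_t = sum(p for (t, _), p in joint.items() if t == t_value)
    return p_joint / p_t

print(p_w_given_t(joint_TW, "sun", "cold"))  # 0.2 / 0.5 = 0.4
```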
   X    Y    P
   +x   +y   0.2
   +x   -y   0.3
   -x   +y   0.4
   -x   -y   0.1

 P(+x | +y)?
 P(-x | +y)?
 P(-y | +x)?
 Conditional distributions are probability distributions over some variables, given fixed values of others.

 Joint distribution:
   T     W     P
   hot   sun   0.4
   hot   rain  0.1
   cold  sun   0.2
   cold  rain  0.3

 Conditional distributions:
 P(W | T = hot):          P(W | T = cold):
   W     P                  W     P
   sun   0.8                sun   0.4
   rain  0.2                rain  0.6
   T     W     P          P(W | T = cold):
   hot   sun   0.4          W     P
   hot   rain  0.1          sun   0.4
   cold  sun   0.2          rain  0.6
   cold  rain  0.3
 SELECT the joint probabilities        NORMALIZE the selection
 matching the evidence:                (make it sum to one):

   T     W     P          T     W     P          W     P
   hot   sun   0.4   →    cold  sun   0.2   →    sun   0.4
   hot   rain  0.1        cold  rain  0.3        rain  0.6
   cold  sun   0.2
   cold  rain  0.3
 Why does this work? The sum of the selection is P(evidence)! (P(T = cold), here)
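The SELECT-then-NORMALIZE steps can be sketched in Python (illustrative, not from the slides; `select_and_normalize` is a made-up name):

```python
joint_TW = {
    ("hot", "sun"): 0.4,
    ("hot", "rain"): 0.1,
    ("cold", "sun"): 0.2,
    ("cold", "rain"): 0.3,
}

def select_and_normalize(joint, t_value):
    """P(W | T = t_value): SELECT the rows matching the evidence,
    then NORMALIZE by their sum, which equals P(evidence)."""
    selected = {w: p for (t, w), p in joint.items() if t == t_value}
    z = sum(selected.values())  # z = P(T = t_value)
    return {w: p / z for w, p in selected.items()}

print(select_and_normalize(joint_TW, "cold"))  # {'sun': 0.4, 'rain': 0.6}
```

Dividing by the sum of the selected rows is exactly the division by P(evidence) in the definition of conditional probability, which is why the trick works.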
 P(X | Y = -y)?

 SELECT the joint probabilities matching the evidence, then NORMALIZE the selection (make it sum to one):

   X    Y    P
   +x   +y   0.2
   +x   -y   0.3
   -x   +y   0.4
   -x   -y   0.1
 (Dictionary) To bring or restore to a normal condition.

 Here: make all entries sum to ONE.

 Procedure:
  Step 1: Compute Z = sum over all entries
  Step 2: Divide every entry by Z

 Example 1 (Z = 0.5):
   W     P                  W     P
   sun   0.2   Normalize →  sun   0.4
   rain  0.3                rain  0.6

 Example 2 (Z = 50):
   T     W     P                  T     W     P
   hot   sun   20   Normalize →   hot   sun   0.4
   hot   rain  5                  hot   rain  0.1
   cold  sun   10                 cold  sun   0.2
   cold  rain  15                 cold  rain  0.3
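The two-step procedure is a one-liner per step. A Python sketch (illustrative, not from the slides) covering both examples:

```python
def normalize(table):
    """Step 1: compute Z = sum over all entries. Step 2: divide by Z."""
    z = sum(table.values())
    return {k: v / z for k, v in table.items()}

# Example 1 from the slide: entries that don't yet sum to one.
print(normalize({"sun": 0.2, "rain": 0.3}))   # {'sun': 0.4, 'rain': 0.6}

# Example 2: raw counts normalize to a distribution the same way.
print(normalize({("hot", "sun"): 20, ("hot", "rain"): 5,
                 ("cold", "sun"): 10, ("cold", "rain"): 15}))
```

Note that the same function handles both probabilities and raw counts, since normalization only cares about relative sizes.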
