Lecture 5, 2023
Prediction / Update
Bel(x_{t−1}) -> [prediction step] -> Bel⁻(x_t) -> [update step] -> Bel(x_t)
Parametric representations:
• Gaussian, χ², etc. distributions
• Summarized concisely (e.g. μ, σ)
• May or may not represent the actual data or pdf
• Single hypothesis
• Makes assumptions!

Non-parametric representations:
• The data itself, samples from a pdf, a histogram
• Sometimes can be summarized by median, mode, etc.
• Can represent non-Gaussian, etc. pdfs
• Multiple hypotheses
• Doesn't assume!
Particle filter (Thrun 4.3.1)
• A non-parametric Bayes Filter that
represents probability density
functions with samples from those
pdfs
• Bel(xt) is updated by reweighting and
resampling the samples, i.e. particles
(we’ll come back to how those steps
are actually done later)
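The reweight/resample cycle described above can be sketched as a minimal 1-D particle filter. This is an illustrative sketch only: the motion model (x' = x + u + noise) and the Gaussian sensor model below are hypothetical stand-ins, not the course's models.

```python
import math
import random

# Minimal 1-D particle filter sketch. The motion and sensor models are
# illustrative placeholders, not the lecture's models.

def predict(particles, u, noise=0.1):
    """Propagate each particle through a toy motion model x' = x + u + noise."""
    return [x + u + random.gauss(0.0, noise) for x in particles]

def reweight(particles, z, sigma=0.5):
    """Weight each particle by how well it explains measurement z (Gaussian)."""
    w = [math.exp(-0.5 * ((z - x) / sigma) ** 2) for x in particles]
    s = sum(w)
    return [wi / s for wi in w]

def resample(particles, weights):
    """Draw a new particle set with probability proportional to weight."""
    return random.choices(particles, weights=weights, k=len(particles))

random.seed(0)
particles = [random.uniform(-5.0, 5.0) for _ in range(500)]
for z in [1.0, 1.1, 0.9]:          # three noisy measurements near x = 1
    particles = predict(particles, u=0.0)
    particles = resample(particles, reweight(particles, z))

estimate = sum(particles) / len(particles)
```

After a few measurements near x = 1, the particle cloud concentrates there and the sample mean approximates the true state.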
Particle Filter implementation of robot localization
(Figure: animation frames cycling through robot motion, sensor information / importance sampling, and robot motion again.)
Conditional probabilities for this course
Measurement model: p(zt | xt)
Motion model: p(xt | xt-1 , ut )
Belief: bel(xt) = p(xt | z1:t , u1:t)
Sample-based Density Representation
• Particle Filters represent probability density
functions (pdfs) by taking m samples
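As a quick illustration (with made-up distribution parameters), m samples drawn from a Gaussian recover its summary statistics from the samples alone, without storing the pdf in closed form:

```python
import random

# Represent N(2, 1) non-parametrically by m samples, then recover the mean
# and variance from the samples alone (the parameters here are made up).
random.seed(42)
m = 10_000
samples = [random.gauss(2.0, 1.0) for _ in range(m)]

sample_mean = sum(samples) / m
sample_var = sum((s - sample_mean) ** 2 for s in samples) / m
```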
Each particle m bundles a state hypothesis with a weight:

x_t^[m] = [x, y, w]ᵀ
Particle Filter with Velocity Motion Model

Control input: u_t = [v_t, ω_t]ᵀ

For each particle [i]:

v̂_t^[i] = sample_normal( v_t , α₃ v_t² + α₄ ω_t² )
ω̂_t^[i] = sample_normal( ω_t , α₁ v_t² + α₂ ω_t² )

x_t^[i] = x_{t−1}^[i] − (v̂_t^[i] / ω̂_t^[i]) sin θ_{t−1}^[i] + (v̂_t^[i] / ω̂_t^[i]) sin( θ_{t−1}^[i] + ω̂_t^[i] Δt )
y_t^[i] = y_{t−1}^[i] + (v̂_t^[i] / ω̂_t^[i]) cos θ_{t−1}^[i] − (v̂_t^[i] / ω̂_t^[i]) cos( θ_{t−1}^[i] + ω̂_t^[i] Δt )
θ_t^[i] = θ_{t−1}^[i] + ω̂_t^[i] Δt
w_t^[i] = w_{t−1}^[i]

where sample_normal(μ, σ²) draws a sample from N(μ, σ²) and the α noise parameters characterize the motion uncertainty. The weights are unchanged by the motion update.
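A per-particle sketch of the sampling step above. The α noise parameter values are assumed placeholders, and the equation structure follows the velocity motion model of Thrun et al.:

```python
import math
import random

def sample_normal(mu, var):
    """Draw a sample from N(mu, var), matching the slide's notation."""
    return random.gauss(mu, math.sqrt(var))

def motion_update(x, y, theta, v, omega, dt, a=(0.01, 0.01, 0.01, 0.01)):
    """Propagate one particle with the velocity motion model.
    The alpha noise parameters `a` are assumed placeholder values."""
    a1, a2, a3, a4 = a
    v_hat = sample_normal(v, a3 * v**2 + a4 * omega**2)      # noisy velocity
    w_hat = sample_normal(omega, a1 * v**2 + a2 * omega**2)  # noisy turn rate
    # Circular-arc pose update (assumes w_hat is nonzero)
    r = v_hat / w_hat
    x_new = x - r * math.sin(theta) + r * math.sin(theta + w_hat * dt)
    y_new = y + r * math.cos(theta) - r * math.cos(theta + w_hat * dt)
    theta_new = theta + w_hat * dt
    return x_new, y_new, theta_new
```

Sanity check: with the noise parameters set to zero, driving at v = ω = 1 for Δt = π/2 traces a quarter circle from the origin to (1, 1) with final heading π/2.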
Propagated uncertainty
Particle filter odometry motion model
(Figure: sampled particle poses spreading out from the start position.)
Using sensor data to update w
1. Assign a new value to each weight w_t^[i] according to how well the particle state x_t^[i] corresponds to the sensor data z_t, based on the sensor model p(z_t | x_t)
2. Re-normalize all weights to maintain Σᵢ₌₁..m w_t^[i] = m
Particle filter sensor model weight example

2D GPS sensor: z_t = [x_GPS, y_GPS] (meters); assume the GPS error is Gaussian.

p( x_GPS | x_t^[i] ) = 1/√(2π σ²_GPS) · exp( −(x_GPS − x_t^[i])² / (2 σ²_GPS) )

p( y_GPS | y_t^[i] ) = 1/√(2π σ²_GPS) · exp( −(y_GPS − y_t^[i])² / (2 σ²_GPS) )
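Putting the weighting and renormalization steps together, here is a sketch of the GPS weight update. The value of σ_GPS, the particle positions, and the GPS reading below are all made-up example values:

```python
import math

SIGMA_GPS = 2.0  # assumed GPS standard deviation, meters (placeholder)

def gauss_pdf(z, mu, sigma):
    """Gaussian likelihood of reading z given true value mu."""
    return math.exp(-0.5 * ((z - mu) / sigma) ** 2) / math.sqrt(2 * math.pi * sigma**2)

def gps_weight(x_i, y_i, x_gps, y_gps, sigma=SIGMA_GPS):
    # x and y errors are modeled as independent, so the joint likelihood is
    # the product of the two 1-D Gaussians above
    return gauss_pdf(x_gps, x_i, sigma) * gauss_pdf(y_gps, y_i, sigma)

particles = [(0.0, 0.0), (1.0, 1.0), (5.0, 5.0)]  # hypothetical (x, y) states
x_gps, y_gps = 1.0, 1.0                           # hypothetical GPS reading

w = [gps_weight(px, py, x_gps, y_gps) for (px, py) in particles]
m = len(particles)
s = sum(w)
w = [wi * m / s for wi in w]  # renormalize so the weights sum to m
```

The particle at (1, 1), which matches the GPS reading exactly, ends up with the largest weight.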
Resampling
• Imagine the weights of our M particles as widths, stacked end to end
• Re-normalize the widths so that the total width sums to M
• Randomly draw r from [0, M] (uniform dist.); the probability of drawing particle m is then w_t^[m]/M
• To figure out which particle you actually drew: loop through particles from [1] upwards while the sum of weights so far, Σᵢ₌₁..m w_t^[i], is less than r; choose particle [m]
• Do this M times to randomly select M new particles
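The stacked-width procedure above can be sketched directly (the example weights passed in are arbitrary):

```python
import random

def resample(weights):
    """Multinomial resampling: M independent draws of r in [0, M)."""
    M = len(weights)
    s = sum(weights)
    w = [wi * M / s for wi in weights]  # renormalize widths to sum to M
    indices = []
    for _ in range(M):
        r = random.uniform(0, M)
        c, m = w[0], 0
        while c < r and m < M - 1:      # walk the stacked widths
            m += 1
            c += w[m]
        indices.append(m)               # choose particle [m]
    return indices

random.seed(0)
idx = resample([1.33, 1.26, 1.24, 1.00, 0.17])  # arbitrary example weights
```

Each of the M draws is independent, which is exactly what the low-variance variant later in the lecture improves upon.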
Resampling
• After resampling, all the new particles
again have equal weight, wt[m] = 1 for
all m
Recall Markov Assumption
Particle deprivation
• Example: 5 particles, no motion; the first sensor reading gives us their (normalized) weights:

  Particle:  1     2     3     4     5
  Weight:    1.33  1.26  1.24  1.00  0.17
Particle deprivation
• Example: 5 particles, no motion
• After 3 rounds of sensor model + resampling:

  Particle:  1     1     1     2     3
  Weight:    1.03  1.03  1.03  0.96  0.95
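The collapse above can be reproduced in a toy simulation (the IDs and weights are the slide's example; the resampler is a generic multinomial draw). The key point: since resampling only copies existing particles, the number of distinct survivors can never increase.

```python
import random

random.seed(1)
ids = [1, 2, 3, 4, 5]                        # particle IDs from the example
weight_of = {1: 1.33, 2: 1.26, 3: 1.24, 4: 1.00, 5: 0.17}

history = [len(set(ids))]                    # distinct particles per round
for _ in range(3):                           # three reweight + resample rounds
    ids = random.choices(ids, weights=[weight_of[p] for p in ids], k=5)
    history.append(len(set(ids)))
```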
Combating Particle Deprivation
• Have lots of particles
• Use low-variance resampling
• Don't invoke the sensor model or resampling when stationary
  • In fact, it is even better to set a threshold of minimum motion before invoking the sensor model + resampling
  • E.g. 1 grid square (10 cm) or ~5 degrees
Resampling (Low-Variance Algorithm)
• Imagine the weights of our M particles as widths, stacked end to end
• Re-normalize the widths so that the total width sums to M
• Randomly draw r from [0, 1] (uniform dist.)
• To figure out which particle you actually drew: loop through particles from [1] upwards while the sum of weights so far is less than r
• For the next particle (and, in general, the kth particle): continue the loop where you left off, while the sum of all weights so far is less than r + 1 (less than r + k for the kth draw)
• The selection points are therefore r, r + 1, …, r + k along the stacked widths
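A sketch of the low-variance resampler described above: a single random offset r, then M equally spaced pointers r, r+1, …, r+M−1 swept across the stacked widths (the example weights are arbitrary):

```python
import random

def low_variance_resample(weights):
    """Systematic (low-variance) resampling with pointers r, r+1, ..., r+M-1."""
    M = len(weights)
    s = sum(weights)
    w = [wi * M / s for wi in weights]  # renormalize widths to sum to M
    r = random.uniform(0, 1)            # the single random draw
    indices, c, m = [], w[0], 0
    for k in range(M):
        target = r + k                  # k-th selection pointer
        while c < target and m < M - 1:
            m += 1
            c += w[m]
        indices.append(m)
    return indices

random.seed(0)
idx = low_variance_resample([1.33, 1.26, 1.24, 1.00, 0.17])
```

Because the pointers are spaced exactly 1 apart, any particle whose renormalized width is at least 1 is guaranteed to survive, which is why this scheme preserves diversity better than M independent draws.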