Lecture 5 (2023): Particle Filters

Particle filters represent probability density functions in a non-parametric way using samples, or particles. The key steps of a particle filter are: (1) the motion model, which updates each particle's state from the motion/control inputs in a prediction step; (2) the sensor/measurement model, which updates each particle's weight from the sensor measurements in an update step; and (3) resampling, which concentrates particles in high-weight regions to avoid particle deprivation. Together, these steps maintain the posterior belief over the state given the observations over time.

Bayes Filters are really important!

Bel(xt) = η P(zt | xt) ∫ P(xt | ut , xt−1) Bel(xt−1) dxt−1

Bel(xt−1)  --(prediction step)-->  B̄el(xt)  --(update step)-->  Bel(xt)
How do we represent the probability density functions, in particular bel(xt)?
• Particle filter (non-parametric)
• Kalman filter (parametric)
Parametric vs. non-parametric statistics

Parametric representations:
• Gaussian, χ2, etc. distributions
• Summarized concisely (e.g. μ, σ)
• May or may not represent the actual data or pdf
• Single hypothesis
• Makes assumptions!

Non-parametric representations:
• The data itself, samples from a pdf, histogram
• Sometimes can be summarized by median, mode, etc.
• Can represent non-Gaussian, etc. pdfs
• Multiple hypotheses
• Doesn’t assume!

Particle filter (Thrun 4.3.1)
• A non-parametric Bayes Filter that represents probability density functions with samples from those pdfs
• Bel(xt) is updated by reweighting and resampling the samples, i.e. particles (we’ll come back to how those steps are actually done later)
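As a concrete reference for the reweight-and-resample idea, here is a minimal Python sketch of one particle filter step, assuming the [x, y, θ, w] particle layout used later in the lecture; motion_model, measurement_model, and resample are placeholders for the motion model, sensor model, and resampling routines developed in the following slides.

```python
import numpy as np

def particle_filter_step(particles, u_t, z_t, motion_model, measurement_model, resample):
    """One Bayes-filter step over a sample-based belief.

    particles is an (m, 4) array of [x, y, theta, w] rows (state plus weight).
    """
    # Prediction step: push every particle's state through the (stochastic) motion model
    particles = np.array([motion_model(p, u_t) for p in particles])

    # Update step: reweight every particle by how well its state explains z_t
    particles[:, 3] = np.array([measurement_model(z_t, p) for p in particles])
    particles[:, 3] *= len(particles) / np.sum(particles[:, 3])  # keep the weights summing to m

    # Resampling: draw a new, equally weighted particle set in proportion to the weights
    return resample(particles)
```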
Particle Filter implementation of robot localization

[Figure slides: “Robot Motion”, “Sensor Information: Importance Sampling”, “Robot Motion”]
Conditional probabilities for this course
Measurement model: p(zt | xt)
Motion model: p(xt | xt-1 , ut )
Belief: bel(xt) = p(xt | z1:t , u1:t)

[Diagram: in each update timestep, the belief “Where (on the map) am I?” is passed through the motion model (prediction step) and then the measurement (sensor) model (update step), giving the updated belief “Where (on the map) am I?”]
Uncertainty in Particle Filters
• Particle Filters represent probability density functions (pdfs) by taking many samples

Sample-based Density Representation
• Particle Filters represent probability density functions (pdfs) by taking m samples

xt = { x[1], x[2], x[3], x[4], … , x[m] }
   = { 0.08, 0.99, 1.01, 1.02, 1.20, … }
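A tiny Python sketch of this idea, drawing m samples from an illustrative 1-D Gaussian pdf (the specific numbers on the slide are just example samples, not output of this code):

```python
import numpy as np

# Represent a pdf non-parametrically with m samples ("particles").
# The pdf here is a 1-D Gaussian purely for illustration.
m = 500
x_t = np.random.normal(loc=1.0, scale=0.3, size=m)  # x_t = { x[1], x[2], ..., x[m] }

print(x_t[:5])         # a handful of the sampled states
print(np.median(x_t))  # non-parametric summaries: median, histogram, etc.
```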
Tracking and updating bel(xt)

Each particle, xt[i], has a particular state (pose), but also a particular likelihood (weight, w):

xt[i] = [ x, y, θ, w ]ᵀ

Initially, all the particles have equal weight, wt[i] = 1 for all i, so that ∑i=1..m wt[i] = m.
Main parts of a particle filter
• Motion model
• Measurement/Sensor model
• Resampling

Conditional probabilities for this course
Measurement model: p(zt | xt)
Motion model: p(xt | xt-1 , ut )
Belief: bel(xt) = p(xt | z1:t , u1:t)

[Diagram: in each update timestep, the belief “Where (on the map) am I?” is passed through the motion model (prediction step) and then the measurement (sensor) model (update step), giving the updated belief “Where (on the map) am I?”]
What each part (function call) of a Particle Filter does to the particles

• Motion (i.e. odometry) updates modify each particle’s state (pose)

xt[m] = [ x, y, θ, w ]ᵀ
Particle Filter with Velocity Motion Model
ut = [ vt , ωt ]ᵀ

For each particle [i]:

v̂t[i] = sample_normal( vt , α3 vt² + α4 ωt² )

ω̂t[i] = sample_normal( ωt , α2 vt² + α1 ωt² )

xt[i] = [ xt−1[i] − (v̂t[i]/ω̂t[i]) sin θt−1[i] + (v̂t[i]/ω̂t[i]) sin( θt−1[i] + ω̂t[i] Δt ) ,
          yt−1[i] + (v̂t[i]/ω̂t[i]) cos θt−1[i] − (v̂t[i]/ω̂t[i]) cos( θt−1[i] + ω̂t[i] Δt ) ,
          θt−1[i] + ω̂t[i] Δt ,
          wt−1[i] ]ᵀ
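A minimal Python sketch of this per-particle prediction step, assuming the αi convention described on the slide that follows (α3, α4 scale the translational noise and α1, α2 the rotational noise), treating the second argument of sample_normal as a variance (hence the square root), and adding a straight-line branch for ω̂ ≈ 0 which the arc equations above do not cover; the function name and signature are illustrative.

```python
import numpy as np

def sample_velocity_motion_model(particle, v_t, w_t, dt, a1, a2, a3, a4):
    """Propagate one particle [x, y, theta, w] with the sampled velocity motion model."""
    x, y, theta, weight = particle

    # Perturb the commanded velocities (variances built from the alpha coefficients)
    v_hat = np.random.normal(v_t, np.sqrt(a3 * v_t**2 + a4 * w_t**2))
    w_hat = np.random.normal(w_t, np.sqrt(a2 * v_t**2 + a1 * w_t**2))

    if abs(w_hat) > 1e-6:
        # Exact arc motion, matching the slide's update equations
        x_new = x - (v_hat / w_hat) * np.sin(theta) + (v_hat / w_hat) * np.sin(theta + w_hat * dt)
        y_new = y + (v_hat / w_hat) * np.cos(theta) - (v_hat / w_hat) * np.cos(theta + w_hat * dt)
    else:
        # Straight-line limit when the sampled angular velocity is ~0
        x_new = x + v_hat * dt * np.cos(theta)
        y_new = y + v_hat * dt * np.sin(theta)
    theta_new = theta + w_hat * dt

    return np.array([x_new, y_new, theta_new, weight])  # motion does not change the weight
```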
Propagated uncertainty

[Figure: propagated particle clouds for low vs. high α1, α2 and for low vs. high α3, α4, compared against typical αi values]

Coefficient   describes error in   due to
α1            rotation             rotation
α2            rotation             translation
α3            translation          translation
α4            translation          rotation
Particle filter motion model example

Particle filter odometry motion model

500 particles, all starting at the same initial state, xt−1:
Representing p(xt|u,xt-1) with samples
• This is how a particle filter models motion

[Figure: particle cloud propagated forward from the “Start” pose]

• How do we keep uncertainty bounded?


Main parts of a particle filter
• Motion model
• Measurement/Sensor model
• Resampling

Conditional probabilities for this course
Measurement model: p(zt | xt)
Motion model: p(xt | xt-1 , ut )
Belief: bel(xt) = p(xt | z1:t , u1:t)

[Diagram: in each update timestep, the belief “Where (on the map) am I?” is passed through the motion model (prediction step) and then the measurement (sensor) model (update step), giving the updated belief “Where (on the map) am I?”]
What each part (function call) of a Particle Filter does to the particles

• Motion (i.e. odometry) updates modify each particle’s state (pose)
• Sensor (i.e. laser scanner) updates modify each particle’s likelihood (weight)

xt[m] = [ x, y, θ, w ]ᵀ
Using sensor data to update w
1. Assign a new value to all wt[i] according to how well each xt[i] corresponds to the sensor data zt, based on the sensor model p(zt | xt)
2. Re-normalize all weights to maintain ∑i=1..m wt[i] = m

Particle filter sensor model weight example
2D GPS sensor:  zt = [ xGPS , yGPS ] (metres);  assume GPS error is Gaussian

p( xGPS | xt[i] ) = 1/√(2π σGPS²) exp( −(xGPS − xt[i])² / (2 σGPS²) )

p( yGPS | yt[i] ) = 1/√(2π σGPS²) exp( −(yGPS − yt[i])² / (2 σGPS²) )

wt[i] = p( xGPS | xt[i] ) p( yGPS | yt[i] )
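A hedged Python sketch of this GPS reweighting step, combined with the re-normalization from the previous slide; the [x, y, θ, w] particle layout and the σGPS value are illustrative assumptions.

```python
import numpy as np

def gps_measurement_weights(particles, z_gps, sigma_gps=2.0):
    """Reweight particles [x, y, theta, w] against a 2-D GPS fix with Gaussian error.

    z_gps     : (x_GPS, y_GPS) in metres
    sigma_gps : GPS standard deviation in metres (illustrative value)
    """
    x_gps, y_gps = z_gps
    norm = 1.0 / np.sqrt(2.0 * np.pi * sigma_gps**2)

    # p(x_GPS | x_t[i]) and p(y_GPS | y_t[i]) as independent 1-D Gaussians
    p_x = norm * np.exp(-0.5 * (x_gps - particles[:, 0])**2 / sigma_gps**2)
    p_y = norm * np.exp(-0.5 * (y_gps - particles[:, 1])**2 / sigma_gps**2)

    weights = p_x * p_y                        # w_t[i] = p(x_GPS | x_t[i]) p(y_GPS | y_t[i])
    weights *= len(weights) / np.sum(weights)  # re-normalize so the weights sum to m
    particles[:, 3] = weights                  # store the updated weight in each particle
    return particles
```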
Main parts of a particle filter
• Motion model
• Sensor model
• Resampling

Conditional probabilities for this course
Measurement model: p(zt | xt)
Motion model: p(xt | xt-1 , ut )
Belief: bel(xt) = p(xt | z1:t , u1:t)

[Diagram: in each update timestep, the belief “Where (on the map) am I?” is passed through the motion model (prediction step) and then the measurement (sensor) model (update step), giving the updated belief “Where (on the map) am I?”]
What each part (function call) of a Particle Filter does to the particles

• Motion (i.e. odometry) updates modify each particle’s state (pose)
• Sensor (i.e. laser scanner) updates modify each particle’s likelihood (weight)
• After particles are reweighted, resample in proportion to the new likelihoods

xt[m] = [ x, y, θ, w ]ᵀ
Resampling
• Imagine the weights of our M particles as widths, stacked end to end
• Re-normalize the widths so that total sum width = M
• Randomly draw r from [0, M] (uniform dist.)
• Probability of drawing particle m is wt[m]/M
• To figure out which particle you actually drew:
  • Loop through particles from [1] upwards while the sum of weights thus far, ∑i=1..m wt[i], is less than r
  • Choose particle [m]
• Do M times to randomly select M new particles

[Figure: the particle weights stacked as widths along [0, M], with the random draw r marked]
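A Python sketch of this resampling procedure as described above (weights treated as widths summing to M, one uniform draw in [0, M] per new particle); the [x, y, θ, w] particle layout is an assumption carried over from earlier slides.

```python
import numpy as np

def resample(particles):
    """Draw M new particles in proportion to their weights (stored in column 3)."""
    M = len(particles)
    weights = particles[:, 3] * M / np.sum(particles[:, 3])  # re-normalize: total "width" = M
    cumulative = np.cumsum(weights)                          # widths stacked end to end

    new_particles = np.empty_like(particles)
    for k in range(M):
        r = np.random.uniform(0.0, M)                        # independent draw per new particle
        m = min(np.searchsorted(cumulative, r), M - 1)       # first particle whose summed width reaches r
        new_particles[k] = particles[m]

    new_particles[:, 3] = 1.0                                # equal weights after resampling
    return new_particles
```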
Resampling
• After resampling, all the new particles again have equal weight, wt[m] = 1 for all m

Recall Markov Assumption

p(zt | x0:t , z1:t−1 , u1:t) = p(zt | xt)

This assumption gets worse the smaller u is; it is really bad when u = 0 (i.e. no motion).

Particle deprivation
• Invoking the sensor model and resampling multiple times when stopped implicitly makes sensor measurements in that spot much more important than all other sensor measurements

Particle deprivation
• Example: 5 particles, no motion, first sensor reading gives us their (normalized) weights:

  Particle:  1     2     3     4     5
  weight:    1.33  1.26  1.24  1.00  0.17

Resampling results in:

  Particle:  1     1 (copy)  2     3     4
  weight:    1     1         1     1     1

Repeat sensor reading gives:

  Particle:  1     1 (copy)  2     3     4
  weight:    1.08  1.08      1.02  1.01  0.81

Particle deprivation
• Example: 5 particles, no motion
• After 3x sensor model + resampling:

  Particle:  1     1     1     2     3
  weight:    1.03  1.03  1.03  0.96  0.95

After 10x sensor model + resampling:

  Particle:  1     1     1     1     2
  weight:    1.01  1.01  1.01  1.01  0.96

After 13x sensor model + resampling:

  Particle:  1  1  1  1  1
  weight:    1  1  1  1  1

Combating Particle Deprivation
• Have lots of particles
• Use low-variance resampling
• Don’t invoke the sensor model or resampling when stationary
  • In fact, it’s even better to set a threshold for minimum motion before invoking the sensor model + resampling (see the sketch below)
  • E.g. 1 gridsquare (10 cm) or ~5 degrees

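A small Python sketch of such a minimum-motion check, using the illustrative thresholds from the slide above; the pose layout (x, y, θ) and the function name are assumptions.

```python
import numpy as np

# Illustrative thresholds from the slide: ~1 gridsquare (10 cm) or ~5 degrees
MIN_TRANS = 0.10            # metres
MIN_ROT = np.deg2rad(5.0)   # radians

def moved_enough(pose, last_update_pose):
    """Gate the sensor model + resampling on a minimum amount of motion since the last update."""
    dx = pose[0] - last_update_pose[0]
    dy = pose[1] - last_update_pose[1]
    dtheta = np.arctan2(np.sin(pose[2] - last_update_pose[2]),
                        np.cos(pose[2] - last_update_pose[2]))  # wrapped heading change
    return np.hypot(dx, dy) > MIN_TRANS or abs(dtheta) > MIN_ROT
```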
Resampling (Low Variance Algorithm)
• Imagine the weights of our M particles as widths, stacked end to end
• Re-normalize the widths so that total sum width = M
• Randomly draw r from [0, 1] (uniform dist.)
• To figure out which particle you actually drew:
  • Loop through particles from [1] upwards while the sum of weights thus far is less than r
• For the next particle (and the kth particle):
  • Continue the loop where you left off, while the sum of all weights thus far is less than r + 1 (less than r + k)

[Figure: the particle widths stacked along [0, M], with evenly spaced pointers r, r + 1, …, r + k marked]
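A Python sketch of the low-variance resampler described above: a single uniform draw r in [0, 1), then evenly spaced pointers r, r + 1, …, r + M − 1 swept once across the cumulative weights; the [x, y, θ, w] particle layout is again an assumption.

```python
import numpy as np

def low_variance_resample(particles):
    """Low-variance resampler: one random draw, then M evenly spaced pointers r, r+1, ..., r+M-1."""
    M = len(particles)
    weights = particles[:, 3] * M / np.sum(particles[:, 3])  # re-normalize: total "width" = M
    cumulative = np.cumsum(weights)

    r = np.random.uniform(0.0, 1.0)          # single random offset
    new_particles = np.empty_like(particles)
    i = 0                                    # the sweep continues where the previous draw left off
    for k in range(M):
        while i < M - 1 and cumulative[i] < r + k:   # the k-th pointer sits at r + k
            i += 1
        new_particles[k] = particles[i]

    new_particles[:, 3] = 1.0                # equal weights after resampling
    return new_particles
```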
