Lecture 19: Tracking
C280, Computer Vision
Prof. Trevor Darrell
[email protected]
Tracking scenarios
- Follow a point
- Follow a template
- Follow a changing template
- Follow all the elements of a moving person, and fit a model to it.
Simplifying Assumptions
[Graphical model: states x_1, x_2, x_3 with measurements y_1, y_2, y_3]
P(x_1, x_2, x_3, y_1, y_2, y_3) = P(x_1) P(y_1 | x_1) P(x_2 | x_1) P(y_2 | x_2) P(x_3 | x_2) P(y_3 | x_3)
Tracking as induction
- Make a measurement starting in the 0th frame.
- Then: assume you have an estimate at the ith frame, after the measurement step.
- Show that you can do prediction for the (i+1)th frame, and measurement for the (i+1)th frame.
Base case
- Prediction step: given y_0, ..., y_i, predict the state at frame i+1.
- Update step: given y_0, ..., y_{i+1}, update the estimate of the state at frame i+1.
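Written out, the induction above takes the standard recursive-estimation form (the conditional-density notation here is assumed, not taken from the slides):

```latex
% Prediction: propagate the posterior at frame i through the dynamics
P(x_{i+1} \mid y_0,\dots,y_i) = \int P(x_{i+1} \mid x_i)\, P(x_i \mid y_0,\dots,y_i)\, dx_i
% Update: fold in the new measurement via Bayes' rule
P(x_{i+1} \mid y_0,\dots,y_{i+1}) \propto P(y_{i+1} \mid x_{i+1})\, P(x_{i+1} \mid y_0,\dots,y_i)
```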
[figure from http://www.cs.unc.edu/~welch/kalman/kalmanIntro.html]
Notation
- Predicted mean: the state estimate before the current measurement is incorporated.
- Corrected mean: the state estimate after the current measurement is incorporated.
Notice:
- If the measurement noise is small, we rely mainly on the measurement; if it is large, mainly on the prediction.
- The gain does not depend on y.
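The scalar case makes the remark above concrete. This is a minimal sketch (function names and the flat-prior setup are assumed, not from the slides): the gain is the ratio of predicted variance to total variance, so small measurement noise pushes it toward 1 and large measurement noise toward 0, and it never looks at the measurement value y.

```python
# Scalar Kalman gain and correction step (illustrative sketch).

def kalman_gain(var_pred, var_meas):
    """Gain K = predicted variance / (predicted + measurement variance)."""
    return var_pred / (var_pred + var_meas)

def correct(x_pred, var_pred, y, var_meas):
    """Blend prediction and measurement; note K does not depend on y."""
    k = kalman_gain(var_pred, var_meas)
    x_corr = x_pred + k * (y - x_pred)
    var_corr = (1.0 - k) * var_pred
    return x_corr, var_corr

# Small measurement noise -> gain close to 1 (trust the measurement).
print(kalman_gain(1.0, 1e-6))
# Large measurement noise -> gain close to 0 (trust the prediction).
print(kalman_gain(1.0, 1e6))
```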
Example: d_i = 1, m_i = 1, sigma_{d_i} = 0, sigma_{m_i} = 1

Initial conditions: x_0 = 0 with a flat (uninformative) prior.

Iteration (corrected means and variances):
  xbar_0+ = y_0
  xbar_1+ = (y_0 + y_1)/2,        sigma^2 = 1/2
  xbar_2+ = (y_0 + y_1 + y_2)/3,  sigma^2 = 1/3

With no process noise, the corrected estimate is the running average of the measurements.
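A quick numerical sketch confirms the running-average behavior (the function name and the large-but-finite prior variance are assumed for illustration): with identity dynamics, zero process noise, and unit measurement noise, each corrected mean is the average of the measurements seen so far.

```python
# Scalar Kalman filter for x_i = x_{i-1} (no process noise), y_i = x_i + N(0, 1).

def filter_constant(ys, var0=1e6):
    """Start from a nearly flat prior; return the corrected means."""
    x, var = 0.0, var0
    means = []
    for y in ys:
        # Prediction is trivial: identity dynamics, no process noise added.
        k = var / (var + 1.0)   # measurement variance sigma_m^2 = 1
        x = x + k * (y - x)
        var = (1.0 - k) * var
        means.append(x)
    return means

print(filter_constant([2.0, 4.0, 9.0]))
# Each corrected mean approaches the running average: ~[2.0, 3.0, 5.0].
```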
Example: d_i = 1, m_i = 1, sigma_{d_i} = 1, sigma_{m_i} = 1

Initial conditions: x_0 = 0 with a flat (uninformative) prior.

Iteration (corrected means and variances):
  xbar_0+ = y_0
  xbar_1+ = (y_0 + 2 y_1)/3,          sigma^2 = 2/3
  xbar_2+ = (y_0 + 2 y_1 + 5 y_2)/8,  sigma^2 = 5/8

With process noise, recent measurements are weighted more heavily than older ones.
Linear dynamic model:
  x_i ~ N(D_{i-1} x_{i-1}; Sigma_{d_i})
  y_i ~ N(M_i x_i; Sigma_{m_i})
  x_i ~ N(D_{i-1} x_{i-1}; Sigma_{d_i})
  y_i ~ N(M_i x_i; Sigma_{m_i})

Assume that the new position of the point is the old one, plus noise:
  D = Identity

[images: cic.nist.gov/lipman/sciviz/images/random3.gif,
http://www.grunch.net/synergetics/images/random3.jpg]
Constant velocity
  x_i ~ N(D_{i-1} x_{i-1}; Sigma_{d_i})
  y_i ~ N(M_i x_i; Sigma_{m_i})

We have
  u_i = u_{i-1} + (Δt) v_{i-1} + epsilon_i
  v_i = v_{i-1} + zeta_i
(the Greek letters denote noise terms)

Stacking position and velocity into the state x_i = (u_i, v_i), this is the form we had above, with

  D_{i-1} = [ 1  Δt ]
            [ 0  1  ]
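The constant-velocity dynamics can be sketched in a few lines (Δt = 1, the initial state, and the noise-free propagation are illustrative assumptions): the position advances by v·Δt each step while the velocity stays fixed.

```python
import numpy as np

dt = 1.0
D = np.array([[1.0, dt],
              [0.0, 1.0]])   # state x = (u, v): position and velocity
M = np.array([[1.0, 0.0]])   # we measure position only

x = np.array([0.0, 2.0])     # start at u = 0 with velocity v = 2
for _ in range(5):
    x = D @ x                # noise-free propagation, for illustration
print(x)                     # position advances by v*dt per step -> [10., 2.]
```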
[Figure: Constant Velocity Model. Panels show velocity and position vs. time; x's mark position measurements.]
Constant acceleration
  x_i ~ N(D_{i-1} x_{i-1}; Sigma_{d_i})
  y_i ~ N(M_i x_i; Sigma_{m_i})

We have
  u_i = u_{i-1} + (Δt) v_{i-1} + epsilon_i
  v_i = v_{i-1} + (Δt) a_{i-1} + zeta_i
  a_i = a_{i-1} + xi_i
(the Greek letters denote noise terms)

With state x_i = (u_i, v_i, a_i):

  D_{i-1} = [ 1  Δt  0  ]
            [ 0  1   Δt ]
            [ 0  0   1  ]
[Figure: Constant Acceleration Model. Panels show velocity and position vs. time.]
Periodic motion
  x_i ~ N(D_{i-1} x_{i-1}; Sigma_{d_i})
  y_i ~ N(M_i x_i; Sigma_{m_i})

Periodic motion can be defined with the state defined as stacked position and velocity, u = (p, v). Discretizing dp/dt = v, dv/dt = -p with time step Δt gives

  D_{i-1} = [ 1    Δt ]
            [ -Δt  1  ]
n-D
Generalization to n-D is straightforward but more complex.
n-D Prediction
Prediction: multiply the estimate at the prior time by the forward model.

n-D Correction
Correction: update the a priori estimate with the measurement to form the a posteriori estimate.
n-D correction
Find the linear filter on the innovations that minimizes the expected squared error E[(x - xbar)^T (x - xbar)].
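In the usual notation (predicted mean xbar_i^-, corrected mean xbar_i^+, covariances Sigma_i^-, Sigma_i^+), the standard n-D forms are:

```latex
% Prediction
\bar{x}_i^- = D_i \bar{x}_{i-1}^+ \qquad
\Sigma_i^- = D_i \Sigma_{i-1}^+ D_i^T + \Sigma_{d_i}
% Gain: the linear filter on the innovation y_i - M_i \bar{x}_i^-
K_i = \Sigma_i^- M_i^T \left( M_i \Sigma_i^- M_i^T + \Sigma_{m_i} \right)^{-1}
% Correction
\bar{x}_i^+ = \bar{x}_i^- + K_i \left( y_i - M_i \bar{x}_i^- \right) \qquad
\Sigma_i^+ = (I - K_i M_i)\, \Sigma_i^-
```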
Limiting behavior of the gain K_i:
  as Sigma_{m_i} -> 0:  K_i -> M_i^{-1}   (perfect measurements: invert the measurement)
  as Sigma_{d_i} -> 0:  K_i -> 0          (perfect dynamics: ignore the measurement)
[This is figure 17.3 of Forsyth and Ponce. We plot the true state as open circles, measurements as x's, predicted means as *'s with three standard deviation bars, and corrected means as +'s with three standard deviation bars. Axes: position vs. time.]
[Figure: position vs. time. The o's give state, the x's measurements.]
Smoothing
Idea:
- We don't have the best estimate of state: what about the future?
- Run two filters, one moving forward, the other backward in time.
- Now combine the state estimates.
- The crucial point here is that we can obtain a smoothed estimate by viewing the backward filter's prediction as yet another measurement for the forward filter.
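In the scalar Gaussian case, treating the backward prediction as one more measurement for the forward filter reduces to a product of two Gaussians, i.e. a precision-weighted average (a minimal sketch; the function and variable names are assumed):

```python
# Combine forward and backward Gaussian estimates of the same state.

def combine(mu_f, var_f, mu_b, var_b):
    """Precision-weighted average: product of two Gaussian estimates."""
    var_s = 1.0 / (1.0 / var_f + 1.0 / var_b)
    mu_s = var_s * (mu_f / var_f + mu_b / var_b)
    return mu_s, var_s

mu, var = combine(1.0, 2.0, 3.0, 2.0)
print(mu, var)   # equal variances -> midpoint 2.0, with smaller variance 1.0
```

Note that the smoothed variance is always smaller than either input variance, which is why the smoothed uncertainty ellipses below are tighter than the filtered ones.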
[Figure: forward estimates of position vs. time.]
[Figure: backward estimates of position vs. time.]
[Figure: combined smoothed estimates of position vs. time.]
[figure from http://www.ai.mit.edu/~murphyk/Software/Kalman/kalman.html]
2-D constant velocity example from Kevin Murphy's Matlab toolbox:
- MSE of the filtered estimate is 4.9; of the smoothed estimate, 3.2.
- Not only is the smoothed estimate better, but we know that it is better, as illustrated by the smaller uncertainty ellipses.
- Note how the smoothed ellipses are larger at the ends, because these points have seen less data.
- Also, note how rapidly the filtered ellipses reach their steady-state (Riccati) values.
Resources
- Kalman filter homepage: http://www.cs.unc.edu/~welch/kalman/ (Kalman filter demo applet)
- Kevin Murphy's Matlab toolbox: http://www.ai.mit.edu/~murphyk/Software/Kalman/kalman.html
Abrupt changes
- What if the environment is sometimes unpredictable?
- Do people move with constant velocity?
- Test several models of assumed dynamics, and use the best.
[figures from Welch and Bishop 2001]
- MM estimate. Two models: Position (P), Position+Velocity (PV)
- P likelihood
- No lag
Use an on-line EM algorithm to fit the (changing) model parameters to the recent appearance data.
Mixing probabilities for the Stable (black), Wandering (red), and Lost (green) processes.
Noise added to that prediction.
Distribution propagation
[Isard 1998]
Sample positions
This gives us two knobs to adjust when representing a probability density by samples: the locations of the samples, and the probability weight on each sample.
Sampled representation of a probability distribution
[Isard 1998]
[Graphical model: states x_1, x_2, x_3 with measurements y_1, y_2, y_3]

Sampled representation:
  p_f(x) = sum_i w_i δ(x - u_i)

The prediction step and the update step both operate on this sampled representation.
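Under the sampled representation p_f(x) = sum_i w_i δ(x - u_i), expectations become weighted sums over the samples. A small sketch (the sample locations and weights here are made up for illustration):

```python
import numpy as np

u = np.array([0.0, 1.0, 2.0])      # sample locations u_i
w = np.array([0.5, 0.25, 0.25])    # normalized weights w_i

mean = np.sum(w * u)               # E[x]   ~ sum_i w_i u_i
second = np.sum(w * u**2)          # E[x^2] ~ sum_i w_i u_i^2
print(mean, second)                # 0.75 1.25
```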
Particle filter
- Let's apply this sampled probability density machinery to generalize the Kalman filtering framework.
- More general probability density representation than a uni-modal Gaussian.
- Allows for general state dynamics, f(x) + noise.
Sampled Prediction
Run each sample through the dynamics and add noise; the result is a sampled representation of the predicted density.
Sampled Correction
Reweighting each sample by the likelihood of the new measurement yields a set of samples describing the probability distribution after the correction (update) step.
Naive PF Tracking
- Start with samples from something simple (Gaussian).
- Repeat:
  - Predict: run every particle through the dynamics function and add noise.
  - Correct: take each particle from the prediction step and modify the old weight by multiplying by the new likelihood.
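The loop above can be sketched for a 1-D example (everything specific here is an assumption for illustration: identity dynamics with Gaussian noise, Gaussian measurement likelihood, the noise scales, the particle count, and the fake measurements):

```python
import numpy as np

rng = np.random.default_rng(0)

def predict(particles, sigma_d=0.5):
    # Run every particle through the dynamics (identity here) and add noise.
    return particles + rng.normal(0.0, sigma_d, size=particles.shape)

def correct(particles, weights, y, sigma_m=1.0):
    # Multiply the old weights by the new measurement likelihood, renormalize.
    lik = np.exp(-0.5 * ((y - particles) / sigma_m) ** 2)
    weights = weights * lik
    return weights / weights.sum()

particles = rng.normal(0.0, 1.0, size=100)   # start from something simple
weights = np.full(100, 1.0 / 100)
for y in [0.2, 0.4, 0.5]:                    # a few fake measurements
    particles = predict(particles)
    weights = correct(particles, weights, y)
estimate = np.sum(weights * particles)       # weighted-mean state estimate
print(estimate)
```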
Sample impoverishment
10 of the 100 particles, along with the true Kalman filter track, with variance:
[Figure: particle positions vs. time]
These new samples are a representation of the same density. I.e., make N draws with replacement from the original set of samples, using the weights as the probability of drawing a sample.
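The resampling step described above is a one-liner with weighted draws (a sketch; the particle values and weights are made up). After resampling, the weights are reset to uniform, since the density is now represented by the sample locations alone:

```python
import numpy as np

rng = np.random.default_rng(1)

def resample(particles, weights):
    # N draws with replacement, using the weights as draw probabilities.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

particles = np.array([0.0, 1.0, 2.0, 3.0])
weights = np.array([0.7, 0.1, 0.1, 0.1])
new_p, new_w = resample(particles, weights)
print(new_p)   # dominated by copies of the high-weight sample 0.0
```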
Pictorial view
[Isard 1998]