Tracking Models
Presented By:
Sajid Ali
CSE (Regular)-2010
NITTTR, Chandigarh
Contents:
Tracking
Three main issues
Assumptions
Tracking as Induction
Linear Dynamic Models
Kalman Filter
References
Tracking
Tracking is the problem of generating an inference about
the motion of an object given a sequence of images.
Tracking (Cont.)
Very general model:
We assume there are moving objects, which have an
underlying state X
There are measurements Y, some of which are functions
of this state
There is a clock
at each tick, the state changes
at each tick, we get a new observation
Examples
object is a ball, state is 3D position + velocity, measurements are stereo pairs
object is a person, state is body configuration, measurements are video frames, the clock is the camera frame rate (30 fps)
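To make this abstraction concrete, here is a minimal Python sketch of the state / measurement / clock view; the class and function names (Track, run_clock, advance, measure) are purely illustrative, not from the slides.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Track:
    state: List[float]   # the underlying state X (e.g. 3D position + velocity)

def run_clock(track: Track,
              advance: Callable[[List[float]], List[float]],
              measure: Callable[[List[float]], List[float]],
              ticks: int) -> List[List[float]]:
    """At each clock tick the state changes, and we get a new observation Y,
    which is a function of the current state."""
    observations = []
    for _ in range(ticks):
        track.state = advance(track.state)         # the state changes at this tick
        observations.append(measure(track.state))  # the measurement depends on the state
    return observations
```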
Three main Issues in Tracking
Prediction: what does the object's state look like in the next frame, given past states and measurements?
Data association: which of the current measurements actually belong to the tracked object?
Correction: given the new (associated) measurement, how do we update the state estimate?
Simplifying Assumptions
Only the immediate past matters (the dynamics are Markovian):
P(X_i | X_1, ..., X_{i-1}) = P(X_i | X_{i-1})
Measurements depend only on the current state:
P(Y_i | X_1, ..., X_N, Y_1, ..., Y_{i-1}) = P(Y_i | X_i)
Tracking as induction
Assume data association is done
(we'll talk about this later; it is a dangerous assumption)
Do correction for the 0'th frame (the base case)
Assume we have a corrected estimate for the i'th frame
Show that we can then do prediction for frame i+1, and correction for frame i+1
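A minimal sketch of this induction as a predict/correct loop; predict and correct are placeholders for the model-specific steps (for a linear-Gaussian model they become the Kalman filter updates derived later), and the names are illustrative, not from the slides.

```python
def track(prior, measurements, predict, correct):
    """Generic tracking-by-induction loop.
    prior: initial belief about the state before any measurement.
    predict(estimate) -> predicted belief for the next frame.
    correct(predicted, y) -> corrected belief given measurement y."""
    # Base case: correct the prior with the measurement for frame 0.
    estimate = correct(prior, measurements[0])
    estimates = [estimate]
    # Induction step: given the corrected estimate for frame i,
    # predict frame i+1, then correct it with the new measurement.
    for y in measurements[1:]:
        estimate = correct(predict(estimate), y)
        estimates.append(estimate)
    return estimates
```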
Base case
Correct the prior for frame 0 with the first measurement, using Bayes' rule:
P(X_0 | Y_0 = y_0) ∝ P(y_0 | X_0) P(X_0)
Induction step
Given the corrected estimate P(X_{i-1} | y_0, ..., y_{i-1}) for frame i-1, predict frame i by integrating out X_{i-1}:
P(X_i | y_0, ..., y_{i-1}) = ∫ P(X_i | X_{i-1}) P(X_{i-1} | y_0, ..., y_{i-1}) dX_{i-1}
Induction step (Cont.)
Correct the prediction with the new measurement y_i, again using Bayes' rule:
P(X_i | y_0, ..., y_i) ∝ P(y_i | X_i) P(X_i | y_0, ..., y_{i-1})
Linear dynamic models
Use the notation ~ to mean "has the pdf of"; N(a, b) is a normal distribution with mean a and covariance b.
Then a linear dynamic model has the form
x_i ~ N(D_i x_{i-1}; Σ_{d_i})
y_i ~ N(M_i x_i; Σ_{m_i})
This is much, much more general than it looks, and extremely powerful.
Intuition: we are on a boat at night and have lost our position; we know the positions of the stars, and sightings of them give noisy measurements of our state.
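As a concrete illustration, a short NumPy sketch that simulates one such model; the particular D, M and covariances below are made-up example values (a drifting position/velocity model), not numbers from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

# Linear dynamic model: x_i ~ N(D x_{i-1}; Sigma_d), y_i ~ N(M x_i; Sigma_m).
D = np.array([[1.0, 1.0],    # position += velocity at each tick
              [0.0, 1.0]])
M = np.array([[1.0, 0.0]])   # we only measure the position component
Sigma_d = 0.01 * np.eye(2)   # process-noise covariance
Sigma_m = np.array([[0.1]])  # measurement-noise covariance

x = np.array([0.0, 1.0])     # initial state: position 0, velocity 1
for i in range(5):
    x = rng.multivariate_normal(D @ x, Sigma_d)   # state changes at each tick
    y = rng.multivariate_normal(M @ x, Sigma_m)   # noisy measurement of the state
    print(f"tick {i}: state {np.round(x, 2)}, measurement {np.round(y, 2)}")
```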
Examples
Drifting points
we assume that the new position of the point is the old
one, plus noise.
For the measurement model, we may not need to observe
the whole state of the object
e.g. a point moving in 3D, at the 3k’th tick we see x, 3k+1’th tick
we see y, 3k+2’th tick we see z
in this case, we can still make decent estimates of all three
coordinates at each tick.
This property, which does not apply to every model, is
called Observability
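A hedged sketch of this drifting-point example with the cyclic one-coordinate-per-tick measurement pattern; all noise levels are illustrative values, not from the slides.

```python
import numpy as np

rng = np.random.default_rng(1)

# Drifting point in 3D: new position = old position + noise.
# At tick 3k we observe x, at tick 3k+1 we observe y, at tick 3k+2 we observe z,
# i.e. the measurement matrix M picks out a single coordinate each tick.
p = np.zeros(3)
for i in range(9):
    p = p + rng.normal(scale=0.05, size=3)        # drift (process noise)
    M = np.zeros((1, 3))
    M[0, i % 3] = 1.0                             # observe one coordinate this tick
    y = M @ p + rng.normal(scale=0.1, size=1)     # noisy scalar measurement
    print(f"tick {i}: observed coordinate {i % 3}, y = {y[0]:+.3f}")
```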
Examples
Points moving with constant velocity
Periodic motion
Points moving with constant acceleration
Points moving with constant velocity
We have
u_i = u_{i-1} + Δt · v_{i-1} + ε_i
v_i = v_{i-1} + ζ_i
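Stacking (u, v) into a single state vector puts this into the D_i form used above; this is the same rewriting the next slide applies to the constant-acceleration case.

```latex
\begin{pmatrix} u \\ v \end{pmatrix}_{i}
  = \begin{pmatrix} 1 & \Delta t \\ 0 & 1 \end{pmatrix}
    \begin{pmatrix} u \\ v \end{pmatrix}_{i-1}
  + \text{noise}
```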
Points moving with constant acceleration
We have
u_i = u_{i-1} + Δt · v_{i-1} + ε_i
v_i = v_{i-1} + Δt · a_{i-1} + ζ_i
a_i = a_{i-1} + ξ_i
(the Greek letters denote noise terms)
Stack (u, v, a) into a single state vector:

  ( u )     ( 1  Δt  0  ) ( u )
  ( v )  =  ( 0  1   Δt ) ( v )      +  noise
  ( a )_i   ( 0  0   1  ) ( a )_{i-1}

which is the form we had above
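A quick NumPy check of this stacked form; Δt = 1 and the noise scale are illustrative choices, not values from the slides.

```python
import numpy as np

rng = np.random.default_rng(2)

dt = 1.0
# State vector (u, v, a): position, velocity, acceleration.
D = np.array([[1.0, dt, 0.0],
              [0.0, 1.0, dt],
              [0.0, 0.0, 1.0]])

x = np.array([0.0, 0.0, 0.5])   # start at rest with acceleration 0.5
for i in range(5):
    noise = rng.normal(scale=0.01, size=3)   # the Greek-letter noise terms
    x = D @ x + noise                        # same update as the component equations
    print(f"tick {i}: u={x[0]:.2f}, v={x[1]:.2f}, a={x[2]:.2f}")
```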
The Kalman Filter
Key ideas:
Linear models interact uniquely well with Gaussian noise: make the prior Gaussian, and everything else stays Gaussian, so the calculations are easy.
Gaussians are really easy to represent: once you know the mean and covariance, you're done.
What is it used for?
Tracking missiles
Tracking heads/hands/drumsticks
Extracting lip motion from video
Fitting Bezier patches to point data
Lots of computer vision applications
Economics
Navigation
The Kalman Filter in 1D
Dynamic model:
x_i ~ N(d_i x_{i-1}; σ_{d_i}²)
y_i ~ N(m_i x_i; σ_{m_i}²)
Notation:
x̄_i^- : predicted mean (before seeing y_i), with variance (σ_i^-)²
x̄_i^+ : corrected mean (after seeing y_i), with variance (σ_i^+)²
Prediction for 1D Kalman filter
The new state is obtained by
multiplying the old state by a known constant
and adding zero-mean noise
Therefore, the predicted mean for the new state is the constant times the mean of the old state:
x̄_i^- = d_i x̄_{i-1}^+
The predicted variance is the constant squared times the old-state variance, plus the noise variance:
(σ_i^-)² = d_i² (σ_{i-1}^+)² + σ_{d_i}²
Because: the old state is a normal random variable; multiplying a normal random variable by a constant multiplies its mean by that constant and its variance by the square of the constant; adding zero-mean noise adds zero to the mean; and adding independent random variables adds their variances.
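A minimal sketch of this prediction step in code, using the 1D notation above; the function and argument names are illustrative.

```python
def predict_1d(mean_corr, var_corr, d, var_d):
    """1D Kalman prediction: the state is multiplied by a known constant d and
    zero-mean noise of variance var_d is added, so the predicted mean is
    d * mean and the predicted variance is d**2 * var + var_d."""
    mean_pred = d * mean_corr
    var_pred = d * d * var_corr + var_d
    return mean_pred, var_pred
```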
Correction for 1D Kalman filter
Pattern match to identities given in the book (basically, guess the integrals) to get:
x̄_i^+ = ( x̄_i^- σ_{m_i}² + m_i y_i (σ_i^-)² ) / ( σ_{m_i}² + m_i² (σ_i^-)² )
(σ_i^+)² = ( σ_{m_i}² (σ_i^-)² ) / ( σ_{m_i}² + m_i² (σ_i^-)² )
Notice: if the measurement noise is small, we rely mainly on the measurement; if it is large, mainly on the prediction.
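A matching sketch of the correction step, followed by a tiny usage loop that reuses predict_1d from the sketch above; all numbers are made up for illustration.

```python
def correct_1d(mean_pred, var_pred, y, m, var_m):
    """1D Kalman correction for a measurement y ~ N(m * x, var_m):
    a precision-weighted blend of the prediction and the measurement."""
    denom = var_m + m * m * var_pred
    mean_corr = (mean_pred * var_m + m * y * var_pred) / denom
    var_corr = (var_m * var_pred) / denom
    return mean_corr, var_corr

# Usage: track a slowly growing scalar state from noisy measurements.
mean, var = 0.0, 1.0                    # corrected estimate for frame 0 (here just a prior)
for y in [1.1, 2.0, 2.9, 4.2]:          # made-up measurements
    mean, var = predict_1d(mean, var, d=1.0, var_d=0.05)
    mean, var = correct_1d(mean, var, y, m=1.0, var_m=0.5)
    print(f"corrected mean {mean:.2f}, variance {var:.3f}")
```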
In higher dimensions, the derivation follows the same lines, but isn't as easy. The resulting expressions are summarised below.
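For reference, the standard textbook expressions in the notation x_i ~ N(D_i x_{i-1}; Σ_{d_i}), y_i ~ N(M_i x_i; Σ_{m_i}) used earlier (Kalman-gain form):

```latex
\text{Prediction:}\qquad
\bar{x}_i^{-} = D_i\,\bar{x}_{i-1}^{+},
\qquad
\Sigma_i^{-} = D_i\,\Sigma_{i-1}^{+}\,D_i^{\top} + \Sigma_{d_i}

\text{Kalman gain:}\qquad
K_i = \Sigma_i^{-} M_i^{\top}\left(M_i\,\Sigma_i^{-} M_i^{\top} + \Sigma_{m_i}\right)^{-1}

\text{Correction:}\qquad
\bar{x}_i^{+} = \bar{x}_i^{-} + K_i\left(y_i - M_i\,\bar{x}_i^{-}\right),
\qquad
\Sigma_i^{+} = \left(I - K_i M_i\right)\Sigma_i^{-}
```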
Smoothing
Idea
The forward filter doesn't give the best estimate of the state, because it ignores measurements from the future.
Run two filters, one moving forward and the other backward in time.
Now combine the two state estimates.
The crucial point is that we can obtain a smoothed estimate by viewing the backward filter's prediction as yet another measurement for the forward filter,
so we've already done the equations.
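A minimal sketch of that combination step for a 1D state, assuming the forward and backward filters have produced independent Gaussian estimates; the numbers in the example are made up.

```python
def combine_gaussian_estimates(mean_f, var_f, mean_b, var_b):
    """Fuse a forward estimate N(mean_f, var_f) with a backward estimate
    N(mean_b, var_b) by treating one as an extra measurement of the other:
    precisions (inverse variances) add, and the means are precision-weighted."""
    var_s = 1.0 / (1.0 / var_f + 1.0 / var_b)
    mean_s = var_s * (mean_f / var_f + mean_b / var_b)
    return mean_s, var_s

# Example: forward filter says N(2.0, 0.5), backward filter says N(2.6, 0.25);
# the smoothed estimate lies between the two, closer to the lower-variance one.
print(combine_gaussian_estimates(2.0, 0.5, 2.6, 0.25))
```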
Data Association
Nearest neighbours
choose the measurement with the highest probability given the predicted state
popular, but can lead to catastrophe: a single wrong association can derail the track
Probabilistic Data Association
combine measurements, weighting each by its probability given the predicted state
gate candidate measurements using the predicted state
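A hedged sketch of gating plus nearest-neighbour selection under a Gaussian predicted measurement; the gate threshold (a chi-square value for 2 degrees of freedom) and the example data are illustrative choices, not from the slides.

```python
import numpy as np

def nearest_neighbour(candidates, pred_mean, pred_cov, gate=9.21):
    """Gate candidate measurements by squared Mahalanobis distance to the
    predicted measurement N(pred_mean, pred_cov), then pick the closest
    (most probable) one. Returns None if every candidate falls outside the gate."""
    cov_inv = np.linalg.inv(pred_cov)
    best, best_d2 = None, np.inf
    for y in candidates:
        r = y - pred_mean
        d2 = float(r @ cov_inv @ r)          # squared Mahalanobis distance
        if d2 <= gate and d2 < best_d2:
            best, best_d2 = y, d2
    return best

# Example: three candidate measurements, prediction centred at the origin.
candidates = [np.array([0.2, -0.1]), np.array([3.0, 3.0]), np.array([-0.5, 0.4])]
print(nearest_neighbour(candidates, np.zeros(2), 0.25 * np.eye(2)))
```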
References:
http://www.cs.utexas.edu/~grauman/courses/fall2008/slides/lecture23_tracking.ppt
http://luthuli.cs.uiuc.edu/~daf/book/bookpages/Slides/Tracking.ppt
Thanks