
Recursive Estimation

Sebastian Trimpe
Spring 2013
Problem Set 2:
Bayes' Theorem and Bayesian Tracking
Last updated: March 19, 2013
Notes:

Notation: Unless otherwise noted, $x$, $y$, and $z$ denote random variables, $f_x(x)$ (or the shorthand $f(x)$) denotes the probability density function of $x$, and $f_{x|y}(x|y)$ (or $f(x|y)$) denotes the conditional probability density function of $x$ conditioned on $y$. The expected value is denoted by $E[\cdot]$, the variance is denoted by $\mathrm{Var}[\cdot]$, and $\Pr(Z)$ denotes the probability that the event $Z$ occurs. A normally distributed random variable $x$ with mean $\mu$ and variance $\sigma^2$ is denoted by $x \sim \mathcal{N}(\mu, \sigma^2)$.
Please report any errors found in this problem set to the teaching assistants
([email protected] or [email protected]).
Problem Set
Problem 1
Mr. Jones has devised a gambling system for winning at roulette. When he bets, he bets on
red, and places a bet only when the ten previous spins of the roulette have landed on a black
number. He reasons that his chance of winning is quite large since the probability of eleven
consecutive spins resulting in black is quite small. What do you think of this system?
Problem 2
Consider two boxes, one containing one black and one white marble, the other, two black and
one white marble. A box is selected at random with equal probability and a marble is drawn at
random with equal probability from the selected box. What is the probability that the marble
is black?
Problem 3
In Problem 2, what is the probability that the first box was the one selected given that the
marble is white?
Problem 4
Urn 1 contains two white balls and one black ball, while urn 2 contains one white ball and five
black balls. One ball is drawn at random with equal probability from urn 1 and placed in urn
2. A ball is then drawn from urn 2 at random with equal probability. It happens to be white.
What is the probability that the transferred ball was white?
Problem 5
Stores A, B and C have 50, 75, 100 employees, and respectively 50, 60 and 70 percent of these
are women. Resignations are equally likely among all employees, regardless of sex. One employee resigns, and she is a woman. What is the probability that she works in store C?
Problem 6
a) A gambler has in his pocket a fair coin and a two-headed coin. He selects one of the coins at random with equal probability, and when he flips it, it shows heads. What is the probability that it is the fair coin?

b) Suppose that he flips the same coin a second time and again it shows heads. What is now the probability that it is the fair coin?

c) Suppose that he flips the same coin a third time and it shows tails. What is now the probability that it is the fair coin?
Problem 7
Urn 1 has ve white and seven black balls. Urn 2 has three white and twelve black balls. An
urn is selected at random with equal probability and a ball is drawn at random with equal
probability from that urn. Suppose that a white ball is drawn. What is the probability that the
second urn was selected?
Problem 8
An urn contains b black balls and r red balls. One of the balls is drawn at random with equal
probability, but when it is put back into the urn c additional balls of the same color are put in
with it. Now suppose that we draw another ball at random with equal probability. Show that
the probability that the first ball drawn was black, given that the second ball drawn was red, is $b/(b + r + c)$.
Problem 9
Three prisoners are informed by their jailer that one of them has been chosen to be executed at
random with equal probability, and the other two are to be freed. Prisoner A asks the jailer to
tell him privately which of his fellow prisoners will be set free, claiming that there would be no
harm in divulging this information, since he already knows that at least one will go free. The
jailer refuses to answer this question, pointing out that if A knew which of his fellows were to be
set free, then his own probability of being executed would rise from 1/3 to 1/2, since he would
then be one of two prisoners. What do you think of the jailer's reasoning?
Problem 10
Let $x$ and $y$ be independent random variables. Let $g(\cdot)$ and $h(\cdot)$ be arbitrary functions of $x$ and $y$, respectively. Define the random variables $v = g(x)$ and $w = h(y)$. Prove that $v$ and $w$ are independent. That is, functions of independent random variables are independent.
Problem 11
Let $x$ be a continuous, uniformly distributed random variable with $x \in \mathcal{X} = [-5, 5]$. Let
$$z_1 = x + n_1, \qquad z_2 = x + n_2,$$
where $n_1$ and $n_2$ are continuous random variables with probability density functions
$$f_{n_1}(n_1) = \begin{cases} \alpha_1 (1 + n_1) & \text{for } -1 \le n_1 \le 0 \\ \alpha_1 (1 - n_1) & \text{for } 0 \le n_1 \le 1 \\ 0 & \text{otherwise,} \end{cases} \qquad f_{n_2}(n_2) = \begin{cases} \alpha_2 \left(1 + \tfrac{1}{2} n_2\right) & \text{for } -2 \le n_2 \le 0 \\ \alpha_2 \left(1 - \tfrac{1}{2} n_2\right) & \text{for } 0 \le n_2 \le 2 \\ 0 & \text{otherwise,} \end{cases}$$
where $\alpha_1$ and $\alpha_2$ are normalization constants. Assume that the random variables $x$, $n_1$, $n_2$ are independent, i.e. $f(x, n_1, n_2) = f(x)\, f(n_1)\, f(n_2)$.

a) Calculate $\alpha_1$ and $\alpha_2$.

b) Use the change of variables formula from Lecture 2 to show that $f_{z_i|x}(z_i|x) = f_{n_i}(z_i - x)$.

c) Calculate $f(x \mid z_1 = 0, z_2 = 0)$.

d) Calculate $f(x \mid z_1 = 0, z_2 = 1)$.

e) Calculate $f(x \mid z_1 = 0, z_2 = 3)$.
Problem 12
Consider the following estimation problem: an object B moves randomly on a circle with radius
1. The distance to the object can be measured from a given observation point P. The goal is to
estimate the location of the object, see Figure 1.
Figure 1: Illustration of the estimation problem (object B at angle $\theta(k)$ on the unit circle; observation point P at distance $L$ from the origin on the x-axis; the sensor measures the distance between B and P).
The object B can only move in discrete steps. The object's location at time $k$ is given by $x(k) \in \{0, 1, \ldots, N-1\}$, where
$$\theta(k) = 2\pi \frac{x(k)}{N}.$$
The dynamics are
$$x(k) = \mathrm{mod}\,(x(k-1) + v(k),\, N), \qquad k = 1, 2, \ldots,$$
where $v(k) = 1$ with probability $p$ and $v(k) = -1$ otherwise. Note that $\mathrm{mod}\,(N, N) = 0$ and $\mathrm{mod}\,(-1, N) = N - 1$. The distance sensor measures
$$z(k) = \sqrt{(L - \cos\theta(k))^2 + (\sin\theta(k))^2} + w(k),$$
where $w(k)$ represents the sensor error, which is uniformly distributed on $[-e, e]$. We assume that $x(0)$ is uniformly distributed and that $x(0)$, $v(k)$ and $w(k)$ are independent.

Simulate the object movement and implement a Bayesian tracking algorithm that calculates for each time step $k$ the probability density function $f(x(k) \mid z(1{:}k))$.
a) Test the following settings and discuss the results: $N = 100$, $x(0) = \frac{N}{4}$, $e = 0.5$,

(i) $L = 2$, $p = 0.5$,
(ii) $L = 2$, $p = 0.55$,
(iii) $L = 0.1$, $p = 0.55$,
(iv) $L = 0$, $p = 0.55$.
b) How robust is the algorithm? Set $N = 100$, $x(0) = \frac{N}{4}$, $e = 0.5$, $L = 2$, $p = 0.55$ in the simulation, but use slightly different values for $p$ and $e$ in your estimation algorithm, $\hat p$ and $\hat e$, respectively. Test the algorithm and explain the result for:

(i) $\hat p = 0.45$, $\hat e = e$,
(ii) $\hat p = 0.5$, $\hat e = e$,
(iii) $\hat p = 0.9$, $\hat e = e$,
(iv) $\hat p = p$, $\hat e = 0.9$,
(v) $\hat p = p$, $\hat e = 0.45$.
Sample solutions
Problem 1
We introduce a discrete random variable $x_i$, representing the outcome of the $i$-th spin, with $x_i \in \{\text{red}, \text{black}\}$. We assume that both are equally likely and ignore the fact that there is a 0 or a 00 on a roulette board.

Assuming independence between spins, we have
$$\Pr(x_k, x_{k-1}, x_{k-2}, \ldots, x_1) = \Pr(x_k)\, \Pr(x_{k-1}) \cdots \Pr(x_1).$$
The probability of eleven consecutive spins resulting in black is
$$\Pr(x_{11} = \text{black},\, x_{10} = \text{black},\, \ldots,\, x_1 = \text{black}) = \left(\tfrac{1}{2}\right)^{11} \approx 0.0005.$$
This value is actually quite small. However, given that the previous ten spins were black, we calculate
$$\Pr(x_{11} = \text{black} \mid x_{10} = \text{black}, \ldots, x_1 = \text{black}) = \Pr(x_{11} = \text{black}) = \tfrac{1}{2} = \Pr(x_{11} = \text{red} \mid x_{10} = \text{black}, \ldots, x_1 = \text{black})$$
by the independence assumption, i.e. it is equally likely that ten black spins in a row are followed by a red spin as by another black spin. Mr. Jones' system is therefore no better than randomly betting on black or red.
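As a numerical illustration (my addition, not part of the original solution), the following Python sketch simulates a fair wheel and estimates the probability of red after ten consecutive blacks; the estimate comes out near 1/2, in line with the independence argument:

```python
import random

# Monte Carlo sketch of Mr. Jones' system: simulate fair spins and estimate
# Pr(red | ten preceding blacks). Sample size and seed are arbitrary choices.
random.seed(0)
n_spins = 1_000_000
consecutive_blacks = 0
wins = trials = 0

for _ in range(n_spins):
    spin = random.choice(("red", "black"))
    if consecutive_blacks >= 10:      # the situation in which Mr. Jones bets
        trials += 1
        wins += spin == "red"
    # update the run length of consecutive blacks
    consecutive_blacks = consecutive_blacks + 1 if spin == "black" else 0

print(f"Pr(red | 10 preceding blacks) ~ {wins / trials:.3f} ({trials} bets)")
```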
Problem 2
We introduce two discrete random variables. Let $x \in \{1, 2\}$ represent which box is chosen (box 1 or 2) with probability $f_x(1) = f_x(2) = \tfrac{1}{2}$. Furthermore, let $y \in \{b, w\}$ represent the color of the drawn marble, where $b$ is a black and $w$ a white marble, with probabilities
$$f_{y|x}(b|1) = f_{y|x}(w|1) = \tfrac{1}{2}, \qquad f_{y|x}(b|2) = \tfrac{2}{3}, \quad f_{y|x}(w|2) = \tfrac{1}{3}.$$
Then, by the total probability theorem, we find
$$f_y(b) = f_{y|x}(b|1)\, f_x(1) + f_{y|x}(b|2)\, f_x(2) = \tfrac{1}{2} \cdot \tfrac{1}{2} + \tfrac{2}{3} \cdot \tfrac{1}{2} = \tfrac{7}{12}.$$
Problem 3
$$f_{x|y}(1|w) = \frac{f_{y|x}(w|1)\, f_x(1)}{f_y(w)} = \frac{\tfrac{1}{2} \cdot \tfrac{1}{2}}{1 - f_y(b)} = \frac{\tfrac{1}{4}}{\tfrac{5}{12}} = \frac{3}{5}.$$
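Both results can be verified with exact arithmetic; the following Python sketch (my addition, with the prior and likelihoods hard-coded from the problem statement) reproduces 7/12 and 3/5:

```python
from fractions import Fraction as F

# Exact check of Problems 2 and 3: box prior and marble-color likelihoods.
prior = {1: F(1, 2), 2: F(1, 2)}                      # f_x(1), f_x(2)
likelihood = {1: {"b": F(1, 2), "w": F(1, 2)},        # box 1: 1 black, 1 white
              2: {"b": F(2, 3), "w": F(1, 3)}}        # box 2: 2 black, 1 white

# Total probability theorem: f_y(b)
p_black = sum(likelihood[x]["b"] * prior[x] for x in prior)
print(p_black)                                        # 7/12

# Bayes' theorem: f_{x|y}(1|w)
print(likelihood[1]["w"] * prior[1] / (1 - p_black))  # 3/5
```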
Problem 4
Let $x \in \{b, w\}$ represent the color of the ball drawn from urn 1, where $f_x(b) = \tfrac{1}{3}$, $f_x(w) = \tfrac{2}{3}$, and let $y \in \{b, w\}$ be the color of the ball subsequently drawn from urn 2. Considering the different possibilities, we have
$$f_{y|x}(b|b) = \tfrac{6}{7}, \quad f_{y|x}(b|w) = \tfrac{5}{7}, \quad f_{y|x}(w|b) = \tfrac{1}{7}, \quad f_{y|x}(w|w) = \tfrac{2}{7}.$$
We seek the probability that the transferred ball was white, given that the second ball drawn is white, and calculate
$$f_{x|y}(w|w) = \frac{f_{y|x}(w|w)\, f_x(w)}{f_y(w)} = \frac{\tfrac{2}{7} \cdot \tfrac{2}{3}}{f_{y|x}(w|b)\, f_x(b) + f_{y|x}(w|w)\, f_x(w)} = \frac{\tfrac{2}{7} \cdot \tfrac{2}{3}}{\tfrac{1}{7} \cdot \tfrac{1}{3} + \tfrac{2}{7} \cdot \tfrac{2}{3}} = \frac{4}{5}.$$
Problem 5
We introduce two discrete random variables, $x \in \{A, B, C\}$ and $y \in \{M, F\}$, where $x$ represents which store an employee works in, and $y$ the sex of the employee. From the problem description, we have
$$f_x(A) = \tfrac{50}{225} = \tfrac{2}{9}, \quad f_x(B) = \tfrac{75}{225} = \tfrac{1}{3}, \quad f_x(C) = \tfrac{100}{225} = \tfrac{4}{9},$$
and the probability that an employee is a woman is
$$f_{y|x}(F|A) = \tfrac{1}{2}, \quad f_{y|x}(F|B) = \tfrac{3}{5}, \quad f_{y|x}(F|C) = \tfrac{7}{10}.$$
We seek the probability that the resigning employee works in store C, given that she is a woman, and calculate
$$f_{x|y}(C|F) = \frac{f_{y|x}(F|C)\, f_x(C)}{f_y(F)} = \frac{\tfrac{7}{10} \cdot \tfrac{4}{9}}{\sum_{i \in \{A,B,C\}} f_{y|x}(F|i)\, f_x(i)} = \frac{\tfrac{7}{10} \cdot \tfrac{4}{9}}{\tfrac{1}{2} \cdot \tfrac{2}{9} + \tfrac{3}{5} \cdot \tfrac{1}{3} + \tfrac{7}{10} \cdot \tfrac{4}{9}} = \frac{1}{2}.$$
Problem 6
a) Let $x \in \{F, U\}$ represent whether it is a fair (F) or an unfair (U) coin, with $f_x(F) = f_x(U) = \tfrac{1}{2}$. We introduce $y \in \{h, t\}$ to represent how the toss comes up (heads or tails), with
$$f_{y|x}(h|F) = \tfrac{1}{2}, \quad f_{y|x}(t|F) = \tfrac{1}{2}, \quad f_{y|x}(h|U) = 1, \quad f_{y|x}(t|U) = 0.$$
We seek the probability that the drawn coin is fair, given that the toss result is heads, and calculate
$$f_{x|y}(F|h) = \frac{f_{y|x}(h|F)\, f_x(F)}{f_y(h)} = \frac{\tfrac{1}{2} \cdot \tfrac{1}{2}}{\underbrace{f_{y|x}(h|F)\, f_x(F)}_{\text{fair coin}} + \underbrace{f_{y|x}(h|U)\, f_x(U)}_{\text{unfair coin}}} = \frac{\tfrac{1}{4}}{\tfrac{1}{2} \cdot \tfrac{1}{2} + 1 \cdot \tfrac{1}{2}} = \frac{1}{3}.$$

b) Let $y_1$ represent the result of the first flip and $y_2$ that of the second flip. We assume conditional independence between flips (conditioned on $x$), yielding
$$f_{y_1, y_2 | x}(y_1, y_2 | x) = f_{y_1|x}(y_1|x)\, f_{y_2|x}(y_2|x).$$
We seek the probability that the drawn coin is fair, given that both tosses resulted in heads, and calculate
$$f_{x|y_1,y_2}(F|h,h) = \frac{f_{y_1,y_2|x}(h,h|F)\, f_x(F)}{f_{y_1,y_2}(h,h)} = \frac{\tfrac{1}{2} \cdot \tfrac{1}{2} \cdot \tfrac{1}{2}}{\underbrace{\left(\tfrac{1}{2}\right)^2 \cdot \tfrac{1}{2}}_{\text{fair coin}} + \underbrace{1 \cdot \tfrac{1}{2}}_{\text{unfair coin}}} = \frac{1}{5}.$$

c) Obviously, the probability then is 1, because the unfair coin cannot show tails. Formally, we show this by introducing $y_3$ to represent the result of the third flip and computing
$$f_{x|y_1,y_2,y_3}(F|h,h,t) = \frac{f_{y_1,y_2,y_3|x}(h,h,t|F)\, f_x(F)}{f_{y_1,y_2,y_3}(h,h,t)} = \frac{\tfrac{1}{2} \cdot \tfrac{1}{2} \cdot \tfrac{1}{2} \cdot \tfrac{1}{2}}{\underbrace{\left(\tfrac{1}{2}\right)^3 \cdot \tfrac{1}{2}}_{\text{fair coin}} + \underbrace{1 \cdot 1 \cdot 0 \cdot \tfrac{1}{2}}_{\text{unfair coin}}} = 1.$$
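Parts a)–c) are one recursive computation: after each flip the posterior becomes the prior for the next flip. A minimal Python sketch of this update (my addition; dictionary keys are my own naming) reproduces the values 1/3, 1/5, and 1:

```python
from fractions import Fraction as F

# Observation likelihoods for the fair and the two-headed coin.
likelihood = {"fair":   {"h": F(1, 2), "t": F(1, 2)},
              "unfair": {"h": F(1),    "t": F(0)}}

def update(prior, obs):
    """One Bayesian measurement update over the two coin hypotheses."""
    unnorm = {c: likelihood[c][obs] * prior[c] for c in prior}
    total = sum(unnorm.values())
    return {c: p / total for c, p in unnorm.items()}

belief = {"fair": F(1, 2), "unfair": F(1, 2)}   # equal prior
for obs in ("h", "h", "t"):
    belief = update(belief, obs)
    print(obs, belief["fair"])                  # 1/3, then 1/5, then 1
```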
Problem 7
We introduce the following two random variables: $x \in \{1, 2\}$ represents which urn is chosen (urn 1 or 2) with probability $f_x(1) = f_x(2) = \tfrac{1}{2}$, and $y \in \{b, w\}$ represents the color of the drawn ball, where $b$ is a black and $w$ a white ball, with probabilities
$$f_{y|x}(b|1) = \tfrac{7}{12}, \quad f_{y|x}(w|1) = \tfrac{5}{12}, \quad f_{y|x}(b|2) = \tfrac{12}{15}, \quad f_{y|x}(w|2) = \tfrac{3}{15}.$$
We seek the probability that the second urn was selected, given that the ball drawn is white, and calculate
$$f_{x|y}(2|w) = \frac{f_{y|x}(w|2)\, f_x(2)}{f_y(w)} = \frac{\tfrac{3}{15} \cdot \tfrac{1}{2}}{\tfrac{5}{12} \cdot \tfrac{1}{2} + \tfrac{3}{15} \cdot \tfrac{1}{2}} = \frac{12}{37}.$$
Problem 8
Let $x_1 \in \{B, R\}$ be the color (black or red) of the first ball drawn, with $f_{x_1}(B) = \tfrac{b}{b+r}$ and $f_{x_1}(R) = \tfrac{r}{b+r}$; and let $x_2 \in \{B, R\}$ be the color of the second ball, with
$$f_{x_2|x_1}(B|R) = \frac{b}{b+r+c}, \quad f_{x_2|x_1}(R|R) = \frac{c+r}{b+r+c}, \quad f_{x_2|x_1}(B|B) = \frac{b+c}{b+r+c}, \quad f_{x_2|x_1}(R|B) = \frac{r}{b+r+c}.$$
We seek the probability that the first ball drawn was black, given that the second ball drawn is red, and calculate
$$f_{x_1|x_2}(B|R) = \frac{f_{x_2|x_1}(R|B)\, f_{x_1}(B)}{f_{x_2}(R)} = \frac{\frac{r}{b+r+c} \cdot \frac{b}{b+r}}{\underbrace{\frac{c+r}{b+r+c} \cdot \frac{r}{b+r}}_{\text{first red}} + \underbrace{\frac{r}{b+r+c} \cdot \frac{b}{b+r}}_{\text{first black}}} = \frac{b}{b+r+c}.$$
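If the sympy library is available, the same algebra can be checked symbolically; a minimal sketch (my addition):

```python
import sympy as sp

# Symbolic check of the Problem 8 result.
b, r, c = sp.symbols("b r c", positive=True)

prior_B = b / (b + r)                 # f_{x1}(B)
prior_R = r / (b + r)                 # f_{x1}(R)
R_given_B = r / (b + r + c)           # f_{x2|x1}(R|B)
R_given_R = (c + r) / (b + r + c)     # f_{x2|x1}(R|R)

posterior = R_given_B * prior_B / (R_given_B * prior_B + R_given_R * prior_R)
print(sp.simplify(posterior))         # b/(b + c + r)
```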
Problem 9
We can approach the solution in two ways:

Descriptive solution: The probability that A is to be executed is $\tfrac{1}{3}$, and there is a chance of $\tfrac{2}{3}$ that one of the others was chosen. If the jailer gives away the name of one of the fellow prisoners who will be set free, prisoner A does not get new information about his own fate, but the probability of the remaining prisoner (B or C) being executed is now $\tfrac{2}{3}$. The probability of A being executed is still $\tfrac{1}{3}$.

Bayesian analysis: Let $x$ represent which prisoner is to be executed, where $x \in \{A, B, C\}$. We assume that it is a random choice, i.e. $f_x(A) = f_x(B) = f_x(C) = \tfrac{1}{3}$. Now let $y \in \{B, C\}$ be the prisoner name given away by the jailer. We can now write the conditional probabilities:
$$f_{y|x}(y|x) = \begin{cases} 0 & \text{if } x = y \text{ (the jailer does not lie)} \\ \tfrac{1}{2} & \text{if } x = A \text{ (A is to be executed; the jailer mentions B and C with equal probability)} \\ 1 & \text{if } x \neq A,\, x \neq y \text{ (the jailer is forced to give the name of the other prisoner to be set free).} \end{cases}$$
You could also do this with a table:

  y    x    f_{y|x}(y|x)
  B    A    1/2
  B    B    0
  B    C    1
  C    A    1/2
  C    B    1
  C    C    0

To answer the question, we have to compare $f_{x|y}(A|\bar y)$, $\bar y \in \{B, C\}$, with $f_x(A)$:
$$f_{x|y}(A|\bar y) = \frac{f_{y|x}(\bar y|A)\, f_x(A)}{f_y(\bar y)} = \frac{\tfrac{1}{2} \cdot \tfrac{1}{3}}{\sum_{k \in \{A,B,C\}} f_{y|x}(\bar y|k)\, f_x(k)} = \frac{\tfrac{1}{6}}{\underbrace{\tfrac{1}{2}}_{f_{y|x}(\bar y|A)} \cdot\, \tfrac{1}{3} + \underbrace{0}_{f_{y|x}(\bar y|\bar y)} \cdot\, \tfrac{1}{3} + \underbrace{1}_{f_{y|x}(\bar y|\text{not } \bar y)} \cdot\, \tfrac{1}{3}} = \frac{1}{3},$$
where $(\text{not } \bar y) = C$ if $\bar y = B$ and $(\text{not } \bar y) = B$ if $\bar y = C$. The value of the posterior probability is the same as the prior one, $f_x(A)$. The jailer is wrong: prisoner A gets no additional information from the jailer about his own fate!

See also Wikipedia: Three prisoners problem, Monty Hall problem.
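As an empirical complement to the Bayesian analysis (my addition, not part of the original solution), a Monte Carlo simulation of the jailer protocol gives the same answer:

```python
import random

# Simulate the jailer protocol and estimate Pr(A executed | name given).
random.seed(0)
stats = {"B": [0, 0], "C": [0, 0]}    # per named prisoner: [A executed, count]

for _ in range(100_000):
    condemned = random.choice("ABC")
    if condemned == "A":
        named = random.choice("BC")               # jailer picks at random
    else:
        named = "C" if condemned == "B" else "B"  # jailer's forced choice
    stats[named][0] += condemned == "A"
    stats[named][1] += 1

for named, (a_executed, n) in stats.items():
    print(f"Pr(A executed | jailer names {named}) ~ {a_executed / n:.3f}")
# Both estimates come out near 1/3, matching the analysis above.
```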
Problem 10
Consider the joint cumulative distribution
$$F_{v,w}(\bar v, \bar w) = \Pr((v \le \bar v) \text{ and } (w \le \bar w)) = \Pr((g(x) \le \bar v) \text{ and } (h(y) \le \bar w)).$$
We define the sets $A_{\bar v}$ and $A_{\bar w}$ as below:
$$A_{\bar v} = \left\{ \bar x \in \mathcal{X} : g(\bar x) \le \bar v \right\}, \qquad A_{\bar w} = \left\{ \bar y \in \mathcal{Y} : h(\bar y) \le \bar w \right\}.$$
Now we can deduce
$$F_{v,w}(\bar v, \bar w) = \Pr((x \in A_{\bar v}) \text{ and } (y \in A_{\bar w})) \quad \forall\, \bar v, \bar w$$
$$= \Pr(x \in A_{\bar v})\, \Pr(y \in A_{\bar w}) \quad \text{(by independence assumption)}$$
$$= \Pr(g(x) \le \bar v)\, \Pr(h(y) \le \bar w) = \Pr(v \le \bar v)\, \Pr(w \le \bar w) = F_v(\bar v)\, F_w(\bar w).$$
Therefore $v$ and $w$ are independent.
Problem 11
a) By definition of a PDF, the integrals of the PDFs $f_{n_i}(n_i)$, $i = 1, 2$, must evaluate to one, which we use to find the $\alpha_i$. We integrate the probability density functions, sketched in the following figure:

(Figure: the triangular PDF $f_{n_1}$, supported on $[-1, 1]$ with peak $\alpha_1$ at $n_1 = 0$, and the triangular PDF $f_{n_2}$, supported on $[-2, 2]$ with peak $\alpha_2$ at $n_2 = 0$.)

For $n_1$, we obtain
$$\int f_{n_1}(\bar n_1)\, d\bar n_1 = \alpha_1 \left[ \int_{-1}^{0} (1 + \bar n_1)\, d\bar n_1 + \int_{0}^{1} (1 - \bar n_1)\, d\bar n_1 \right] = \alpha_1 \left[ 1 - \tfrac{1}{2} + 1 - \tfrac{1}{2} \right] = \alpha_1.$$
Therefore $\alpha_1 = 1$. For $n_2$, we get
$$\int f_{n_2}(\bar n_2)\, d\bar n_2 = \alpha_2 \left[ \int_{-2}^{0} \left(1 + \tfrac{1}{2} \bar n_2\right) d\bar n_2 + \int_{0}^{2} \left(1 - \tfrac{1}{2} \bar n_2\right) d\bar n_2 \right] = \alpha_2 (2 - 1 + 2 - 1) = 2\alpha_2.$$
Therefore $\alpha_2 = \tfrac{1}{2}$.
b) The goal is to calculate $f_{z_i|x}(z_i|x)$ from $z_i = g(n_i, x) := x + n_i$ and the given PDFs $f_{n_i}(n_i)$, $i = 1, 2$. We apply the change of variables formula for CRVs with conditional PDFs:
$$f_{z_i|x}(z_i|x) = \frac{f_{n_i|x}(n_i|x)}{\left| \frac{\partial g}{\partial n_i}(n_i, x) \right|}$$
(just think of $x$ as a parameter that parametrizes the PDFs). The proof for this formula is analogous to the proof in the lecture notes for a change of variables of CRVs with unconditional PDFs. We find that
$$\frac{\partial g}{\partial n_i}(n_i, x) = 1 \quad \text{for all } n_i, x$$
and, therefore, the fraction in the change of variables formula is well-defined for all values of $n_i$ and $x$. Due to the independence of $n_i$ and $x$, $f_{n_i|x}(n_i|x) = f_{n_i}(n_i)$:
$$f_{z_i|x}(z_i|x) = \frac{f_{n_i|x}(n_i|x)}{\left| \frac{\partial g}{\partial n_i}(n_i, x) \right|} = f_{n_i}(n_i).$$
Substituting $n_i = z_i - x$, we finally obtain
$$f_{z_i|x}(z_i|x) = f_{n_i}(n_i) = f_{n_i}(z_i - x).$$
c) We use Bayes' theorem to calculate
$$f_{x|z_1,z_2}(x|z_1, z_2) = \frac{f_{z_1,z_2|x}(z_1, z_2|x)\, f_x(x)}{f_{z_1,z_2}(z_1, z_2)}. \qquad (1)$$
First, we calculate the prior PDF of the CRV $x$, which is uniformly distributed:
$$f_x(x) = \begin{cases} \tfrac{1}{10} & \text{for } -5 \le x \le 5 \\ 0 & \text{otherwise.} \end{cases}$$
In b), we showed by change of variables that $f(z_i|x) = f_{n_i}(z_i - x)$. Since $n_1$, $n_2$ are independent, it follows that $z_1$, $z_2$ are conditionally independent given $x$. Therefore, we can rewrite the measurement likelihood as
$$f_{z_1,z_2|x}(z_1, z_2|x) = f_{z_1|x}(z_1|x)\, f_{z_2|x}(z_2|x).$$
We calculate the individual conditional PDFs $f_{z_i|x}(z_i|x)$:
$$f_{z_1|x}(z_1|x) = f_{n_1}(z_1 - x) = \begin{cases} \alpha_1 (1 - z_1 + x) & \text{for } 0 \le z_1 - x \le 1 \\ \alpha_1 (1 + z_1 - x) & \text{for } -1 \le z_1 - x \le 0 \\ 0 & \text{otherwise.} \end{cases}$$

(Figure: the conditional PDF $f_{z_1|x}(z_1|x)$, a triangle in $z_1$ supported on $[x - 1, x + 1]$ with peak $\alpha_1$ at $z_1 = x$.)

Analogously,
$$f_{z_2|x}(z_2|x) = f_{n_2}(z_2 - x) = \begin{cases} \alpha_2 \left(1 - \tfrac{1}{2} z_2 + \tfrac{1}{2} x\right) & \text{for } 0 \le z_2 - x \le 2 \\ \alpha_2 \left(1 + \tfrac{1}{2} z_2 - \tfrac{1}{2} x\right) & \text{for } -2 \le z_2 - x \le 0 \\ 0 & \text{otherwise.} \end{cases}$$
Let $\mathrm{num}(x)$ be the numerator of the Bayes' rule fraction (1). Given the measurements $z_1 = 0$, $z_2 = 0$, we find
$$\mathrm{num}(x) = \tfrac{1}{10}\, f_{z_1|x}(0|x)\, f_{z_2|x}(0|x).$$
We consider four different intervals of $x$: $[-5, -1]$, $[-1, 0]$, $[0, 1]$ and $[1, 5]$. Evaluating $\mathrm{num}(x)$ on these intervals results in:

for $x \in [-5, -1]$ or $x \in [1, 5]$: $\mathrm{num}(x) = 0$;

for $x \in [-1, 0]$: $\mathrm{num}(x) = \tfrac{1}{10}\, \alpha_1 (1 + x)\, \alpha_2 \left(1 + \tfrac{x}{2}\right) = \tfrac{1}{20} (1 + x) \left(1 + \tfrac{x}{2}\right)$;

for $x \in [0, 1]$: $\mathrm{num}(x) = \tfrac{1}{10}\, \alpha_1 (1 - x)\, \alpha_2 \left(1 - \tfrac{x}{2}\right) = \tfrac{1}{20} (1 - x) \left(1 - \tfrac{x}{2}\right)$.

Finally, we need to calculate the denominator of the Bayes' rule fraction (1), the normalization constant, which can be calculated using the total probability theorem:
$$f_{z_1,z_2}(z_1, z_2) = \int f_{z_1,z_2|x}(z_1, z_2|\bar x)\, f_x(\bar x)\, d\bar x = \int \mathrm{num}(\bar x)\, d\bar x.$$
Evaluated at $z_1 = z_2 = 0$, we find
$$f_{z_1,z_2}(0, 0) = \frac{1}{20} \left[ \int_{-1}^{0} (1 + x)\left(1 + \tfrac{x}{2}\right) dx + \int_{0}^{1} (1 - x)\left(1 - \tfrac{x}{2}\right) dx \right] = \frac{1}{20} \left[ \frac{5}{12} + \frac{5}{12} \right] = \frac{1}{24}.$$
Finally, we obtain the posterior PDF
$$f_{x|z_1,z_2}(x|0, 0) = \frac{\mathrm{num}(x)}{f_{z_1,z_2}(0, 0)} = 24\, \mathrm{num}(x) = \begin{cases} 0 & \text{for } -5 \le x \le -1 \text{ and } 1 \le x \le 5 \\ \tfrac{6}{5} (1 + x)\left(1 + \tfrac{x}{2}\right) & \text{for } -1 \le x \le 0 \\ \tfrac{6}{5} (1 - x)\left(1 - \tfrac{x}{2}\right) & \text{for } 0 \le x \le 1, \end{cases}$$
which is illustrated in the following figure:

(Figure: the posterior PDF $f_{x|z_1,z_2}(x|0, 0)$, supported on $[-1, 1]$ with peak $\tfrac{6}{5}$ at $x = 0$.)

The posterior PDF is symmetric about $x = 0$ with a maximum at $x = 0$: both sensors agree.
d) Similar to the above. Given $z_1 = 0$, $z_2 = 1$, we write the numerator as
$$\mathrm{num}(x) = \tfrac{1}{10}\, f_{z_1|x}(0|x)\, f_{z_2|x}(1|x). \qquad (2)$$
Again, we consider the same four intervals:

for $x \in [-5, -1]$ or $x \in [1, 5]$: $\mathrm{num}(x) = 0$;

for $x \in [-1, 0]$: $\mathrm{num}(x) = \tfrac{1}{10}\, \alpha_1 (1 + x)\, \alpha_2 \left(\tfrac{1}{2} + \tfrac{1}{2} x\right) = \tfrac{1}{40} (1 + x)^2$;

for $x \in [0, 1]$: $\mathrm{num}(x) = \tfrac{1}{10}\, \alpha_1 (1 - x)\, \alpha_2 \left(\tfrac{1}{2} + \tfrac{1}{2} x\right) = \tfrac{1}{40} \left(1 - x^2\right)$.

Normalizing yields
$$f_{z_1,z_2}(0, 1) = \int_{-1}^{1} \mathrm{num}(x)\, dx = \frac{1}{120} + \frac{2}{120} = \frac{1}{40}.$$
The solution is therefore
$$f_{x|z_1,z_2}(x|0, 1) = \begin{cases} 0 & \text{for } -5 \le x \le -1 \text{ and } 1 \le x \le 5 \\ (1 + x)^2 & \text{for } -1 \le x \le 0 \\ 1 - x^2 & \text{for } 0 \le x \le 1. \end{cases}$$
The posterior PDF is depicted in the following figure:

(Figure: the posterior PDF $f_{x|z_1,z_2}(x|0, 1)$, supported on $[-1, 1]$ with peak 1 at $x = 0$.)

The probability values are higher for positive $x$ because of the measurement $z_2 = 1$.
e) We start in the same fashion: given $z_1 = 0$ and $z_2 = 3$,
$$\mathrm{num}(x) = \tfrac{1}{10}\, f_{z_1|x}(0|x)\, f_{z_2|x}(3|x).$$
However, the intervals of positive probability of $f_{z_1|x}(0|x)$ and $f_{z_2|x}(3|x)$ do not overlap, i.e.
$$\mathrm{num}(x) = 0 \quad \forall\, x \in [-5, 5].$$
In other words, given our noise model for $n_1$ and $n_2$, there is no chance to measure $z_1 = 0$ and $z_2 = 3$. Therefore, $f_{x|z_1,z_2}(x|0, 3)$ is not defined.
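As a cross-check of c)–e), one can evaluate the posterior numerically on a grid. The following Python sketch (my addition; function names are my own) uses the normalization constants from a) and the trapezoidal rule for the total probability integral:

```python
import numpy as np

def f_n1(n):   # triangular PDF on [-1, 1], alpha_1 = 1
    return np.where(np.abs(n) <= 1, 1.0 - np.abs(n), 0.0)

def f_n2(n):   # triangular PDF on [-2, 2], alpha_2 = 1/2
    return np.where(np.abs(n) <= 2, 0.5 * (1.0 - np.abs(n) / 2), 0.0)

x = np.linspace(-5.0, 5.0, 100_001)
prior = np.full_like(x, 1 / 10)                 # uniform f_x on [-5, 5]

def posterior(z1, z2):
    num = prior * f_n1(z1 - x) * f_n2(z2 - x)   # f(z1|x) f(z2|x) f(x)
    return num / np.trapz(num, x)               # total probability theorem

print(posterior(0.0, 0.0).max())                # ~1.2 = 6/5, the peak found in c)
print(np.trapz(posterior(0.0, 1.0), x))         # ~1.0: the posterior in d) integrates to 1
# posterior(0.0, 3.0) would divide by zero, matching e): inconsistent measurements.
```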
Problem 12
The Matlab code is available on the class webpage. We notice the following:

a) (i) The PDF remains bimodal for all times. The symmetry in measurements and motion makes it impossible to differentiate between the upper and lower half circle.

(ii) The PDF is initially bimodal, but the bias in the particle motion causes one of the two modes to have higher values after a few time steps.

(iii) We note that this sensor placement also works. Note that the resulting PDF is not as sharp because more positions explain the measurements.

(iv) The resulting PDF is uniform for all $k = 1, 2, \ldots$. With the sensor in the center, the distance measurement provides no information on the particle position (all particle positions have the same distance to the sensor).

b) (i) The PDF has higher values at the position mirrored from the actual position, because the estimator model uses a value $\hat p$ that biases the particle motion in the opposite direction.

(ii) The PDF remains bimodal, even though the particle motion is biased in one direction. This is caused by the estimator assuming that the particle motion is unbiased, which makes both halves of the circle equally likely.

(iii) The incorrect assumption on the motion probability $\hat p$ causes the estimated PDF to drift away from the real particle position until it is forced back by measurements that cannot be explained otherwise.

(iv) The resulting PDF is not as sharp because the larger value of $\hat e$ implies that more states are possible for the given measurements.

(v) The Bayesian tracking algorithm fails after a few steps (division by zero in the measurement update step). This is caused by a measurement that is impossible to explain with the present particle position PDF and the measurement error distribution defined by $\hat e$. Note that $\hat e$ underestimates the possible measurement error, which leads to measurements whose true error (uniformly distributed according to $e$) is larger than the algorithm assumes possible. A sketch of one possible implementation follows below.
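Since the official Matlab code is distributed via the class webpage, the following Python sketch is only an illustration of the algorithm's structure (my addition; names such as `p_est` and `e_est` stand for $\hat p$ and $\hat e$ and are my own). The prediction step convolves the belief with the two-point motion model, and the measurement update multiplies by the indicator-form likelihood of the uniform noise:

```python
import numpy as np

# Bayesian tracking on the circle, cf. Problem 12. All parameter names are
# mine; set p_est/e_est different from p/e to reproduce the cases in b).
rng = np.random.default_rng(0)
N, L, p, e = 100, 2.0, 0.55, 0.5     # simulation parameters
p_est, e_est = p, e                  # estimator's assumed parameters
steps = 200

theta = 2 * np.pi * np.arange(N) / N
dist = np.sqrt((L - np.cos(theta)) ** 2 + np.sin(theta) ** 2)  # noise-free distances

x = N // 4                           # true initial position x(0) = N/4
belief = np.full(N, 1.0 / N)         # uniform prior f(x(0))

for k in range(1, steps + 1):
    # Simulation: move the object one step and take a noisy distance measurement.
    x = (x + (1 if rng.random() < p else -1)) % N
    z = dist[x] + rng.uniform(-e, e)

    # Prediction step: x(k) = x(k-1) + 1 w.p. p, x(k-1) - 1 otherwise.
    belief = p_est * np.roll(belief, 1) + (1 - p_est) * np.roll(belief, -1)

    # Measurement update: uniform noise on [-e, e] gives an indicator likelihood.
    belief *= (np.abs(z - dist) <= e_est) / (2 * e_est)
    norm = belief.sum()
    if norm == 0:                    # no state explains z, cf. case b)(v)
        raise ZeroDivisionError(f"measurement at step {k} cannot be explained")
    belief /= norm

print("posterior peak:", belief.argmax(), "true position:", x)
```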