
HOMEWORK EXERCISE I

FILTERING AND IDENTIFICATION (SC 42025)

Hand in pictures or scans of your hand-written solutions as a PDF for exercises one and two. For the MATLAB exercise, please export your live script as a PDF (instructions in the template). Then merge all files and upload them through Brightspace on November 22nd 2024 before 18:00. You are allowed and encouraged to discuss the exercises together, but you must hand in individual solutions.

Please highlight your final answer!

Exercise 1
For the system y = F θ + ϵ, with ϵ ∼ N (0, I) and a full rank matrix F ,

a) Please write down the expression for the least-squares estimate θ̂ and its covariance (expressed in terms of F and y).

Based on a), given three measurements of the unknown vector θ ∈ R^2:

    F = [1 1; 2 1; 1 0],   y = [1; 2; 0],   (1)

b) Please compute the least-squares solution θ̂ and the residual ‖y − F θ̂‖₂² (specific values are required), and show the different steps of your derivation.
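A hand derivation for b) can be cross-checked numerically. The following sketch (in Python/NumPy rather than the course's MATLAB, purely for verification) solves the normal equations for the data in (1):

```python
import numpy as np

# Least-squares estimate via the normal equations: theta_hat = (F^T F)^{-1} F^T y.
F = np.array([[1.0, 1.0],
              [2.0, 1.0],
              [1.0, 0.0]])
y = np.array([1.0, 2.0, 0.0])

theta_hat = np.linalg.solve(F.T @ F, F.T @ y)        # solve the normal equations
residual = float(np.sum((y - F @ theta_hat) ** 2))   # ||y - F theta_hat||_2^2
cov = np.linalg.inv(F.T @ F)                         # covariance, since eps ~ N(0, I)
```

Comparing `theta_hat` and `residual` against the hand-derived values catches arithmetic slips early.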

Assume that, very rarely, it is possible to obtain very accurate measurements. Consider the case in which N noisy measurements are made that can be used to identify the unknown parameter vector θ. Then, two perfect measurements (without any error) are made. The researchers are glad to include the new perfect measurements, which results in the following least-squares problem:

    min_θ ϵ⊤ϵ,   [y; 2; 3] = [F; Fa; Fb] θ + [ϵ; 0; 0]   (2)

Here, θ ∈ R^n and F ∈ R^(N×n) with N ≫ n and full column rank. Furthermore, Fa ∈ R^(1×n), Fb ∈ R^(1×n). Let's partition Fa and Fb as Fa = [1 1 f⊤], Fb = [2 1 3f⊤], for f ∈ R^((n−2)×1). Similarly, F can be partitioned as F = [fa fb M] for M ∈ R^(N×(n−2)), fa ∈ R^(N×1), fb ∈ R^(N×1).
c) Please give the expression for the least-squares solution of θ using all the information above. You can assume that all necessary (pseudo-)inverses exist in this exercise.
Hint: you may also want to partition the unknown θ.
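A closed-form answer to c) can be cross-checked numerically by treating the two perfect measurements as hard equality constraints C θ = d and solving the constrained least-squares problem with a null-space method. This is a sketch only: the dimensions, the random data, and the helper `null_basis` are illustrative assumptions, not part of the exercise.

```python
import numpy as np

def null_basis(C, tol=1e-10):
    # Orthonormal basis of null(C) via the SVD (illustrative helper).
    _, s, Vt = np.linalg.svd(C)
    rank = int(np.sum(s > tol))
    return Vt[rank:].T

rng = np.random.default_rng(0)
N, n = 20, 5                              # illustrative sizes
f = rng.standard_normal(n - 2)
Fa = np.concatenate(([1.0, 1.0], f))      # Fa = [1 1 f^T]
Fb = np.concatenate(([2.0, 1.0], 3 * f))  # Fb = [2 1 3 f^T]
F = rng.standard_normal((N, n))
y = rng.standard_normal(N)

C = np.vstack([Fa, Fb])
d = np.array([2.0, 3.0])                  # values of the two perfect measurements in (2)

theta_p = np.linalg.pinv(C) @ d           # particular solution of C theta = d
Z = null_basis(C)                         # directions that preserve C theta = d
w, *_ = np.linalg.lstsq(F @ Z, y - F @ theta_p, rcond=None)
theta = theta_p + Z @ w                   # constrained least-squares solution
```

The result satisfies C θ = d exactly and minimizes ‖y − Fθ‖₂ over the remaining directions, so it can be compared entry-by-entry with a derived closed form.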
Exercise 2
Assume that you are interested in the posterior Gaussian distribution

p(θ | y) = N (µθ|y , Pθ|y ), (3)

where µθ|y and Pθ|y are unknown, given the measurements

y = F θ + Lϵ, ϵ ∼ N (0, I),

with F , L known deterministic matrices and W −1 = LL⊤ square and invertible. Before obtaining any
measurements we define the prior distribution of θ as

p(θ) = N (µθ , Pθ ),

where Pθ > 0 is a known symmetric covariance matrix.


a) Write out the prior p(θ), likelihood p(y|θ) and posterior p(θ|y) in terms of their explicit multivariate
normal distributions (with the exponent). For the posterior, write it as a function of the unknown
mean and covariance from Equation (3). Then show that p(θ|y) ∝ p(y|θ)p(θ) can be written as
 
    exp(−½ θ⊤ Pθ|y⁻¹ θ + θ⊤ Pθ|y⁻¹ µθ|y − ½ µθ|y⊤ Pθ|y⁻¹ µθ|y)
    ∝ exp(−½ θ⊤ (F⊤W F + Pθ⁻¹) θ + θ⊤ (F⊤W y + Pθ⁻¹ µθ) − ½ y⊤W y − ½ µθ⊤ Pθ⁻¹ µθ).
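The matching of quadratic forms in a) can be spot-checked numerically: log p(y|θ) + log p(θ) and the quadratic with information matrix F⊤WF + Pθ⁻¹ must differ only by a θ-independent constant. All matrices in this sketch are randomly generated placeholders, not values from the exercise.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 3, 4                               # illustrative dimensions
F = rng.standard_normal((m, n))
L = np.eye(m) + 0.1 * rng.standard_normal((m, m))
W = np.linalg.inv(L @ L.T)                # W^{-1} = L L^T
mu_theta = rng.standard_normal(n)
Ph = rng.standard_normal((n, n))
P_theta = Ph @ Ph.T + n * np.eye(n)       # symmetric positive definite prior
y = rng.standard_normal(m)

A = F.T @ W @ F + np.linalg.inv(P_theta)             # posterior information matrix
b = F.T @ W @ y + np.linalg.inv(P_theta) @ mu_theta

def log_joint(theta):
    # log p(y|theta) + log p(theta), dropping theta-independent constants
    r = y - F @ theta
    dlt = theta - mu_theta
    return -0.5 * r @ W @ r - 0.5 * dlt @ np.linalg.solve(P_theta, dlt)

def quad(theta):
    return -0.5 * theta @ A @ theta + theta @ b

# Differences between two evaluation points cancel the unknown constant.
t0, t1 = rng.standard_normal(n), rng.standard_normal(n)
gap = (log_joint(t1) - log_joint(t0)) - (quad(t1) - quad(t0))
```

A `gap` at machine precision confirms the two exponents agree up to a constant, which is exactly the proportionality claimed above.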

b) Assuming that E[(θ − µθ)ϵ⊤] = 0, please prove (4) and (5) by deriving an unbiased estimate µθ|y = [M N] [y; µθ] such that E[(θ − µθ|y)(θ − µθ|y)⊤] is minimized.
    µθ|y = µθ + Pθ F⊤ (F Pθ F⊤ + W⁻¹)⁻¹ (y − F µθ)   (4)

    Pθ|y = Pθ − Pθ F⊤ (F Pθ F⊤ + W⁻¹)⁻¹ F Pθ   (5)

Hint: you might need to use the Schur complement (Lemma 2.3 on page 19 of the book by Verhaegen and Verdult).
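Equations (4) and (5) can be sanity-checked by Monte Carlo simulation: drawing (θ, y) jointly, the error θ − µθ|y should average to zero (unbiasedness) and have sample covariance close to Pθ|y. The dimensions and parameter values in this sketch are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n, m, Ns = 2, 3, 200_000                  # illustrative sizes and sample count
F = rng.standard_normal((m, n))
L = 0.5 * np.eye(m)
Winv = L @ L.T                            # W^{-1} = L L^T
mu_theta = np.array([1.0, -1.0])
P_theta = np.array([[2.0, 0.5], [0.5, 1.0]])

K = P_theta @ F.T @ np.linalg.inv(F @ P_theta @ F.T + Winv)  # gain in (4)
P_post = P_theta - K @ F @ P_theta                           # right-hand side of (5)

# Joint draws: theta ~ N(mu_theta, P_theta), y = F theta + L eps, one row per sample.
theta = mu_theta + rng.standard_normal((Ns, n)) @ np.linalg.cholesky(P_theta).T
y = theta @ F.T + rng.standard_normal((Ns, m)) @ L.T
mu_post = mu_theta + (y - F @ mu_theta) @ K.T                # (4), applied row-wise
err = theta - mu_post
```

With 2·10⁵ samples the empirical mean and covariance of `err` match 0 and `P_post` to a few times 10⁻³, which is a useful plausibility check before attempting the proof.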
c) Show that

    µθ|y = µθ + Pθ F⊤ (F Pθ F⊤ + W⁻¹)⁻¹ (y − F µθ)
         = µθ + (Pθ⁻¹ + F⊤W F)⁻¹ F⊤W (y − F µθ),   (6)
using Lemma 2.2 on page 19 of the book by Verhaegen and Verdult.
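The identity in (6) is an instance of the matrix inversion lemma, and it is easy to confirm numerically before proving it. The sketch below uses random well-conditioned matrices of illustrative size and compares the two gain expressions.

```python
import numpy as np

rng = np.random.default_rng(3)
n, m = 3, 5                               # illustrative dimensions
F = rng.standard_normal((m, n))
L = np.eye(m) + 0.2 * rng.standard_normal((m, m))
Winv = L @ L.T                            # W^{-1} = L L^T
W = np.linalg.inv(Winv)
Ph = rng.standard_normal((n, n))
P_theta = Ph @ Ph.T + n * np.eye(n)       # symmetric positive definite

# Covariance form of the gain, as in the first line of (6):
G1 = P_theta @ F.T @ np.linalg.inv(F @ P_theta @ F.T + Winv)
# Information form, as in the second line of (6):
G2 = np.linalg.inv(np.linalg.inv(P_theta) + F.T @ W @ F) @ F.T @ W
```

`G1` and `G2` coincide, so both lines of (6) produce the same µθ|y for every y.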

MATLAB exercise
See the MATLAB live script Matlab 1 template.mlx.
