

Bilkent University

EEE-539 DETECTION AND ESTIMATION THEORY

Fall 2018 - Final Exam

January 15, 2019

Duration: 2 hours

Surname
Name
ID #
Signature

Question-1 (35 pts)
Question-2 (35 pts)
Question-3 (30 pts)
TOTAL (100 pts)

1) Consider observations Y1 and Y2 which are independent and identically distributed with the following
probability density function (PDF):
p_\theta(y_i) = \begin{cases} 3 y_i^2 / \theta^3, & \text{if } 0 < y_i < \theta \\ 0, & \text{otherwise} \end{cases}
for i ∈ {1, 2}, where θ > 0.
a) Find a one-dimensional (scalar) sufficient statistic for θ based on Y1 and Y2 . Justify your answer.
b) Denote the sufficient statistic found in Part a) as Z and derive the PDF of Z.
c) Prove whether your sufficient statistic, Z, is a complete sufficient statistic.
(Hint: If g(α) = 0 for all α, then ∂g(α)/∂α = 0 for all α, as well.)
d) Does there exist a minimum variance unbiased estimator (MVUE) for θ based on Y1 and Y2 ? If so,
derive it. If not, explain the reason.
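The density above can be sampled by inverting its CDF, F_\theta(y) = (y/\theta)^3 on (0, \theta). A minimal Python sanity check of the setup is sketched below; the value theta = 2 and the sample size are arbitrary choices for illustration, not part of the problem:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0  # arbitrary true parameter, chosen for illustration only

# Inverse-CDF sampling: F(y) = (y/theta)^3 on (0, theta), so Y = theta * U^(1/3)
u = rng.random(200_000)
y = theta * u ** (1 / 3)

# The model implies E[Y] = integral of y * 3y^2/theta^3 over (0, theta) = 3*theta/4
print(y.mean())  # should be close to 3*theta/4 = 1.5
```

Such a simulation can also be used to check any closed-form moments derived while solving Parts a)-d).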

2) Consider a scalar observation Y which is modeled as


Y = \begin{cases} g(\theta), & \text{with probability } \rho \\ h(\theta), & \text{with probability } 1 - \rho \end{cases}
where ρ ∈ [0, 1] and g(θ) and h(θ) are continuous, deterministic and one-to-one functions over the closed
interval [0, 1]. The unknown parameter θ is modeled to have a uniform prior probability density function
(PDF) over the closed interval [0, 1]. It is assumed that ρ, g(·) and h(·) are known.
a) Obtain the posterior PDF of θ given Y = y.
b) Derive the minimum mean-squared error (MMSE) estimator for θ based on Y = y.
c) Assume, only for this part, that g(θ) = θ Iθ∈[0,1] and h(θ) = (1−θ) Iθ∈[0,1] , where Iθ∈[0,1] represents the
indicator function that is equal to one for θ ∈ [0, 1] and zero otherwise. Then, calculate the mean-squared
error (MSE) of the MMSE estimator in Part b) and simplify it as much as possible.
d) For which values of ρ is the MSE in Part c) maximized, and for which is it minimized? Derive the answer mathematically and explain it intuitively.
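The observation model of Part c) can be simulated in a few lines; this is a hedged sketch (the values of rho and the sample size are illustrative assumptions), useful for empirically scoring any candidate estimator against the derived MSE:

```python
import numpy as np

rng = np.random.default_rng(1)
rho, n = 0.3, 100_000          # illustrative values, not specified by the exam

theta = rng.random(n)          # uniform prior over [0, 1]
use_g = rng.random(n) < rho    # select g(.) with probability rho, else h(.)

# Part c) special case: g(theta) = theta, h(theta) = 1 - theta on [0, 1]
y = np.where(use_g, theta, 1 - theta)

# Any candidate estimator thetahat(y) can then be scored empirically, e.g.:
# mse = np.mean((thetahat(y) - theta) ** 2)
print(y.mean())  # both branches are uniform on [0, 1], so E[Y] = 1/2
```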

3) Consider an estimation problem in which an observation Y is used to estimate a scalar parameter θ, where Y ∈ Γ. The observation is a continuous random variable, and its possible probability density functions (PDFs) are assumed to be differentiable with respect to θ.
Consider any two different PDFs for observation Y, denoted as p_\theta^{(1)}(y) and p_\theta^{(2)}(y). Also, define a new PDF for observation Y as p_\theta^{(3)}(y) = (1 - \gamma)\, p_\theta^{(1)}(y) + \gamma\, p_\theta^{(2)}(y), where 0 < \gamma < 1. The Fisher information corresponding to the different PDFs of Y is expressed as

I_\theta^{(i)} = E_\theta\left\{ \left( \frac{\partial}{\partial \theta} \log p_\theta^{(i)}(Y) \right)^2 \right\}

for i \in \{1, 2, 3\}.

Prove or disprove the following inequality: I_\theta^{(3)} < (1 - \gamma) I_\theta^{(1)} + \gamma I_\theta^{(2)} for any p_\theta^{(1)}(y) and p_\theta^{(2)}(y) with p_\theta^{(1)}(y) \neq p_\theta^{(2)}(y), and for any 0 < \gamma < 1.

(You can assume that the I_\theta^{(i)}'s are finite.) Hint: The derivative of \log f(x) is f'(x)/f(x).
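Before attempting a proof, the inequality can be probed numerically. The sketch below assumes, purely for illustration, Gaussian location families N(θ, σ²) (for which the Fisher information is known to be 1/σ²) and evaluates the Fisher information of the mixture by numerical integration at θ = 0:

```python
import numpy as np

def fisher_of_mixture(gamma, s1=1.0, s2=2.0):
    """Fisher information of p3 = (1-gamma)*N(theta, s1^2) + gamma*N(theta, s2^2),
    evaluated numerically at theta = 0 for the location-family model."""
    y = np.linspace(-25.0, 25.0, 200_001)

    def phi(y, s):  # N(0, s^2) density
        return np.exp(-y**2 / (2 * s * s)) / (s * np.sqrt(2 * np.pi))

    p3 = (1 - gamma) * phi(y, s1) + gamma * phi(y, s2)
    # d/dtheta of phi(y - theta; s) at theta = 0 is (y / s^2) * phi(y; s)
    dp3 = (1 - gamma) * (y / s1**2) * phi(y, s1) + gamma * (y / s2**2) * phi(y, s2)

    integrand = dp3**2 / p3  # (score)^2 * p3
    # trapezoidal rule on the uniform grid
    return float(np.sum((integrand[:-1] + integrand[1:]) * (y[1] - y[0]) / 2))

gamma = 0.5
i1, i2 = 1.0 / 1.0**2, 1.0 / 2.0**2   # closed-form Fisher info of each component
i3 = fisher_of_mixture(gamma)
print(i3, (1 - gamma) * i1 + gamma * i2)  # mixture vs. convex combination
```

Comparing the two printed values for a few choices of gamma, s1, s2 suggests which direction of the inequality to try to establish in general.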
