I3 Sta2 MX 25122019
85, 90, 70, 50, 45, 60, 80, 75, 52, 35, 68, 95, 86, 88, 65
4. Let $X_1, X_2, \dots, X_n$ be a random sample from a distribution with mean $\mu$ and unknown variance $\sigma^2 < \infty$, and let $a_1, a_2, \dots, a_n$ be real numbers such that $\sum_{i=1}^{n} a_i = 1$. Define $\hat{X} = \sum_{i=1}^{n} a_i X_i$.
Solution
Thus, $Y = -\ln X \sim \mathrm{Exp}(\theta)$.
(1.2) Find the MLE of θ
$$L(\theta) = \prod_{i=1}^{n} f(x_i;\theta) = \prod_{i=1}^{n} \frac{1}{\theta}\, x_i^{\frac{1}{\theta}-1} = \theta^{-n} \left(\prod_{i=1}^{n} x_i\right)^{\frac{1}{\theta}-1}$$
$$\ln L(\theta) = -n \ln\theta + \left(\frac{1}{\theta}-1\right) \sum_{i=1}^{n} \ln x_i$$
$$\frac{\partial}{\partial\theta} \ln L(\theta) = -\frac{n}{\theta} - \frac{1}{\theta^{2}} \sum_{i=1}^{n} \ln x_i$$
$$\Rightarrow\; \theta = -\frac{1}{n} \sum_{i=1}^{n} \ln x_i = \bar{y}$$
Thus the MLE of $\theta$ is $\hat{\theta}_n = -\frac{1}{n}\sum_{i=1}^{n} \ln X_i = \bar{Y}$.
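As a quick numerical sanity check of this estimator (a minimal sketch; the values of $\theta$, the sample size, and the seed below are illustrative assumptions, not part of the exam), one can simulate from $f(x;\theta) = \frac{1}{\theta} x^{\frac{1}{\theta}-1}$ on $(0,1)$ using the fact that $X = V^{\theta}$ with $V \sim \mathrm{Uniform}(0,1)$ has exactly this density:

import numpy as np

# Numerical check of the MLE theta_hat = -(1/n) * sum(ln x_i).
# Illustrative assumptions: theta = 2.5, n = 10_000, fixed seed.
rng = np.random.default_rng(0)
theta, n = 2.5, 10_000

# If V ~ Uniform(0,1), then X = V**theta has pdf (1/theta) * x**(1/theta - 1) on (0,1).
x = rng.uniform(size=n) ** theta

theta_hat = -np.log(x).mean()   # the MLE, i.e. the sample mean of Y = -ln X
print(theta_hat)                # should be close to the true theta = 2.5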
(1.3) Is θ̂n an efficient estimator?
$$E(\hat{\theta}_n) = E\!\left(\frac{1}{n}\sum_{i=1}^{n} Y_i\right) = \frac{1}{n}\sum_{i=1}^{n} E(Y_i) = \frac{n\theta}{n} = \theta \qquad (1)$$
$$V(\hat{\theta}_n) = V\!\left(\frac{1}{n}\sum_{i=1}^{n} Y_i\right) = \frac{1}{n^{2}}\sum_{i=1}^{n} V(Y_i) = \frac{n\theta^{2}}{n^{2}} = \frac{\theta^{2}}{n}$$
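A sketch of the standard Cramér–Rao comparison that answers the efficiency question, assuming the usual regularity conditions hold for this family (this step is not spelled out above):
$$\frac{\partial^{2}}{\partial\theta^{2}} \ln f(x;\theta) = \frac{1}{\theta^{2}} + \frac{2\ln x}{\theta^{3}}, \qquad I(\theta) = -E\!\left[\frac{\partial^{2}}{\partial\theta^{2}} \ln f(X;\theta)\right] = -\frac{1}{\theta^{2}} + \frac{2\theta}{\theta^{3}} = \frac{1}{\theta^{2}},$$
using $E(\ln X) = -E(Y) = -\theta$. The Cramér–Rao lower bound is therefore $\frac{1}{nI(\theta)} = \frac{\theta^{2}}{n} = V(\hat{\theta}_n)$; since $\hat{\theta}_n$ is unbiased by (1), it attains the bound and is efficient.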
Since, for each $i$,
$$E\!\left(e^{\frac{2t}{\theta}Y_i}\right) = E\!\left(X^{-\frac{2t}{\theta}}\right) = \int_0^1 x^{-\frac{2t}{\theta}}\,\frac{1}{\theta}\,x^{\frac{1}{\theta}-1}\,dx = \frac{1}{\theta}\int_0^1 x^{\frac{1-2t}{\theta}-1}\,dx = \frac{1}{\theta}\left[\frac{x^{\frac{1-2t}{\theta}}}{\frac{1-2t}{\theta}}\right]_0^1 = (1-2t)^{-1}, \qquad t < \frac{1}{2},$$
it follows that, for $U = \frac{2n\hat{\theta}_n}{\theta} = \frac{2}{\theta}\sum_{i=1}^{n} Y_i$,
$$M_U(t) = \left[(1-2t)^{-1}\right]^{n} = (1-2t)^{-\frac{2n}{2}}, \qquad t < \frac{1}{2}.$$
Hence, $U \sim \chi^{2}(2n)$.
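This distributional fact can be corroborated by simulation (a sketch only; the values of $\theta$ and $n$, the number of replications, and the use of a Kolmogorov–Smirnov test are illustrative assumptions):

import numpy as np
from scipy import stats

# Check by simulation that U = 2*n*theta_hat/theta ~ chi-square(2n).
# Illustrative assumptions: theta = 1.7, n = 8, 20_000 replications.
rng = np.random.default_rng(1)
theta, n, reps = 1.7, 8, 20_000

v = rng.uniform(size=(reps, n))
x = v ** theta                         # X has pdf (1/theta) * x**(1/theta - 1) on (0,1)
theta_hat = -np.log(x).mean(axis=1)    # MLE computed in each replication
u = 2 * n * theta_hat / theta

# A large p-value is consistent with U ~ chi2(2n).
print(stats.kstest(u, stats.chi2(df=2 * n).cdf))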
(1.4) Find 100(1 − α)% CI for θ
Since $U \sim \chi^{2}(2n)$, putting probability $\alpha/2$ in each tail, we have
$$1-\alpha = P\!\left(\chi^{2}_{1-\frac{\alpha}{2},\,2n} \le U \le \chi^{2}_{\frac{\alpha}{2},\,2n}\right) = P\!\left(\chi^{2}_{1-\frac{\alpha}{2},\,2n} \le \frac{2n\hat{\theta}_n}{\theta} \le \chi^{2}_{\frac{\alpha}{2},\,2n}\right) = P\!\left(\frac{2n\hat{\theta}_n}{\chi^{2}_{\frac{\alpha}{2},\,2n}} \le \theta \le \frac{2n\hat{\theta}_n}{\chi^{2}_{1-\frac{\alpha}{2},\,2n}}\right)$$
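The interval $\left[\,2n\hat{\theta}_n/\chi^{2}_{\frac{\alpha}{2},2n},\; 2n\hat{\theta}_n/\chi^{2}_{1-\frac{\alpha}{2},2n}\,\right]$ is easy to evaluate with chi-square quantiles; a minimal sketch, where the values of $n$, $\hat{\theta}_n$, and $\alpha$ are illustrative assumptions rather than exam data:

from scipy.stats import chi2

# Illustrative assumptions: n = 20 observations, theta_hat = 1.4, alpha = 0.05.
n, theta_hat, alpha = 20, 1.4, 0.05

# The notation chi^2_{p, 2n} above is the upper-tail quantile: P(U >= chi^2_{p, 2n}) = p.
q_hi = chi2.isf(alpha / 2, df=2 * n)       # chi^2_{alpha/2, 2n}   (the larger quantile)
q_lo = chi2.isf(1 - alpha / 2, df=2 * n)   # chi^2_{1-alpha/2, 2n} (the smaller quantile)

ci = (2 * n * theta_hat / q_hi, 2 * n * theta_hat / q_lo)
print(ci)   # 100(1 - alpha)% confidence interval for theta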
(1.5) H0 : θ = 1 vs Ha : θ = θa , θa > 1
By the Neyman–Pearson lemma, we have, for $k > 0$,
$$\frac{L(\theta_0)}{L(\theta_a)} \le k \;\Leftrightarrow\; \frac{L(1)}{L(\theta_a)} \le k$$
$$\Leftrightarrow\; \frac{1}{\theta_a^{-n}\left(\prod_{i=1}^{n} x_i\right)^{\frac{1}{\theta_a}-1}} \le k$$
$$\Leftrightarrow\; \left(\prod_{i=1}^{n} x_i\right)^{1-\frac{1}{\theta_a}} \le k\,\theta_a^{-n}$$
$$\Leftrightarrow\; \left(1-\frac{1}{\theta_a}\right)\sum_{i=1}^{n} \ln x_i \le \ln\!\left(k\,\theta_a^{-n}\right)$$
$$\Leftrightarrow\; \sum_{i=1}^{n} \ln x_i \le \frac{\theta_a}{\theta_a-1}\,\ln\!\left(k\,\theta_a^{-n}\right) \;\stackrel{\text{def}}{=}\; c$$
Thus, $RR = \left\{(x_1,\dots,x_n) : \sum_{i=1}^{n} \ln x_i \le c\right\}$, where the constant $c$ is defined by
$$\alpha = P\!\left(\sum_{i=1}^{n} \ln X_i \le c \,\Big|\, \theta = 1\right) = P\!\left(U \ge -\frac{2c}{\theta} \,\Big|\, \theta = 1\right) = P(U \ge -2c), \qquad U \sim \chi^{2}(2n).$$
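Solving $\alpha = P(U \ge -2c)$ with $U \sim \chi^{2}(2n)$ gives $-2c = \chi^{2}_{\alpha,2n}$, i.e. $c = -\chi^{2}_{\alpha,2n}/2$, which is straightforward to evaluate; a minimal sketch, where the values of $n$ and $\alpha$ are illustrative assumptions:

from scipy.stats import chi2

# Critical value for RR = { sum(ln x_i) <= c } at level alpha, under theta = 1.
# Illustrative assumptions: n = 10 observations, alpha = 0.05.
n, alpha = 10, 0.05

c = -chi2.isf(alpha, df=2 * n) / 2   # from P(U >= -2c) = alpha with U ~ chi2(2n)
print(c)                             # reject H0 when sum(log(x_i)) <= c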
0.05 = P(V ≥ a)
Since χ² = 15.53 < 23.685, χ² ∉ RR and H0 is not rejected. Thus, there is not sufficient evidence to support the instructor's claim.
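For reference, the reported statistic is consistent with a right-tailed chi-square test of a variance applied to the 15 scores listed at the start of this excerpt; a sketch of that computation, where the hypothesized variance σ₀² = 300 is an assumption inferred from the reported χ² = 15.53 rather than a value stated in the excerpt:

import numpy as np
from scipy.stats import chi2

scores = np.array([85, 90, 70, 50, 45, 60, 80, 75, 52, 35, 68, 95, 86, 88, 65])

sigma0_sq = 300                  # ASSUMPTION: hypothesized variance, inferred from chi^2 = 15.53
n = scores.size                  # 15 observations, hence n - 1 = 14 degrees of freedom
s_sq = scores.var(ddof=1)        # sample variance

chi_sq = (n - 1) * s_sq / sigma0_sq          # approximately 15.52
critical = chi2.isf(0.05, df=n - 1)          # approximately 23.685
print(chi_sq, critical, chi_sq > critical)   # False, so H0 is not rejected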
4. Given $X_1, X_2, \dots, X_n \sim N(\mu, \sigma^{2})$.
$$V(\hat{X}) = V\!\left(\sum_{i=1}^{n} a_i X_i\right) = \sum_{i=1}^{n} a_i^{2}\, V(X_i) = \sum_{i=1}^{n} a_i^{2}\,\sigma^{2} = \sigma^{2} \sum_{i=1}^{n} a_i^{2}$$
Since $\sum_{i=1}^{n} a_i = 1$, by the Cauchy–Schwarz inequality, we have
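A sketch of the standard continuation of this Cauchy–Schwarz step, assuming the goal is to show that the weights $a_i = 1/n$ minimize $V(\hat{X})$:
$$1 = \left(\sum_{i=1}^{n} a_i \cdot 1\right)^{2} \le \left(\sum_{i=1}^{n} a_i^{2}\right)\left(\sum_{i=1}^{n} 1^{2}\right) = n \sum_{i=1}^{n} a_i^{2},$$
so $\sum_{i=1}^{n} a_i^{2} \ge \frac{1}{n}$ and hence $V(\hat{X}) = \sigma^{2}\sum_{i=1}^{n} a_i^{2} \ge \frac{\sigma^{2}}{n}$, with equality if and only if $a_i = \frac{1}{n}$ for all $i$, i.e. when $\hat{X} = \bar{X}$.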