Problem Sets 2 & 3
1. Let $X_1, \dots, X_n$ be an i.i.d. sample from the $N(\mu, \sigma^2)$ distribution. Find the Fisher information matrix, $I_n(\theta)$, for the parameter $\theta = (\mu, \sigma^2)$.
[Note: The Fisher information matrix for a vector-valued parameter $\theta$ is defined as
$$[I_n(\theta)]_{jk} = -E\left[\frac{\partial^2}{\partial\theta_j\,\partial\theta_k}\log L(\theta)\right].]$$
Further, from the above definition of $I_n(\theta)$ show that, when $X_1, \dots, X_n$ are i.i.d., then $I_n(\theta) = nI_1(\theta)$, where $I_1(\theta)$ is the Fisher information matrix for one sample.
Solution:
The PDF of a normal distribution $N(\mu, \sigma^2)$ is given by:
$$f(x; \mu, \sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}}\exp\left\{-\frac{(x-\mu)^2}{2\sigma^2}\right\}$$
The log-likelihood function for $n$ i.i.d. observations is:
$$\log L(\mu, \sigma^2) = -\frac{n}{2}\log(2\pi) - \frac{n}{2}\log(\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^n (x_i - \mu)^2$$
The first-order partial derivatives are:
$$\frac{\partial \log L}{\partial\mu} = \frac{1}{\sigma^2}\sum_{i=1}^n (x_i - \mu), \qquad \frac{\partial \log L}{\partial\sigma^2} = -\frac{n}{2\sigma^2} + \frac{1}{2\sigma^4}\sum_{i=1}^n (x_i - \mu)^2$$
and the second-order partial derivatives are:
$$\frac{\partial^2 \log L}{\partial\mu^2} = -\frac{n}{\sigma^2}, \qquad \frac{\partial^2 \log L}{\partial\mu\,\partial\sigma^2} = -\frac{1}{\sigma^4}\sum_{i=1}^n (x_i - \mu), \qquad \frac{\partial^2 \log L}{\partial(\sigma^2)^2} = \frac{n}{2\sigma^4} - \frac{1}{\sigma^6}\sum_{i=1}^n (x_i - \mu)^2$$
Taking expectations, $E[X_i - \mu] = 0$ and $E[(X_i - \mu)^2] = \sigma^2$, so
$$-E\left[\frac{\partial^2 \log L}{\partial\mu^2}\right] = \frac{n}{\sigma^2}, \qquad -E\left[\frac{\partial^2 \log L}{\partial\mu\,\partial\sigma^2}\right] = 0, \qquad -E\left[\frac{\partial^2 \log L}{\partial(\sigma^2)^2}\right] = \frac{n}{2\sigma^4}$$
Thus, the Fisher information matrix is:
$$I_n(\theta) = \begin{pmatrix} \dfrac{n}{\sigma^2} & 0 \\ 0 & \dfrac{n}{2\sigma^4} \end{pmatrix}$$
For one sample ($n = 1$), the Fisher information matrix is:
$$I_1(\theta) = \begin{pmatrix} \dfrac{1}{\sigma^2} & 0 \\ 0 & \dfrac{1}{2\sigma^4} \end{pmatrix}$$
so $I_n(\theta) = nI_1(\theta)$, as claimed. This holds in general for i.i.d. samples: the log-likelihood is a sum of $n$ i.i.d. terms, so its expected Hessian is $n$ times that of a single term.
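As a quick numerical sanity check (a sketch added for illustration, not part of the original solution), $I_1(\theta)$ can be estimated by averaging outer products of the score vector over simulated data:

```python
import numpy as np

# Monte Carlo check of I_1(theta) for N(mu, sigma^2), theta = (mu, sigma^2).
# Score of one observation: d/dmu log f   = (x - mu)/s2,
#                           d/dsig2 log f = -1/(2 s2) + (x - mu)^2/(2 s2^2).
rng = np.random.default_rng(0)
mu, s2 = 1.0, 2.0
x = rng.normal(mu, np.sqrt(s2), size=1_000_000)
score = np.stack([(x - mu) / s2,
                  -0.5 / s2 + (x - mu) ** 2 / (2 * s2 ** 2)])
I1_hat = score @ score.T / x.size            # estimates E[score score^T]
print(np.round(I1_hat, 4))                   # approx [[1/s2, 0], [0, 1/(2 s2^2)]]
print(np.diag([1 / s2, 1 / (2 * s2 ** 2)]))  # exact I_1(theta)
```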
2. (Method of Moments) For an i.i.d. sample from the Gamma$(\alpha, \beta)$ distribution, the population moments are:
$$E[X] = \alpha\beta, \qquad E[X^2] = \alpha\beta^2 + (\alpha\beta)^2$$
The sample moments are:
$$m_1 = \frac{1}{n}\sum_{i=1}^n X_i = \bar{X}, \qquad m_2 = \frac{1}{n}\sum_{i=1}^n X_i^2$$
Equating the population moments to the sample moments:
$$\alpha\beta = \bar{X}, \qquad \alpha\beta^2 + (\alpha\beta)^2 = m_2$$
so that
$$\alpha\beta^2 = m_2 - \bar{X}^2 = \frac{1}{n}\sum_{i=1}^n X_i^2 - \bar{X}^2 = S^2$$
where $S^2$ is the sample variance.
Dividing $\alpha\beta^2$ by $\alpha\beta$:
$$\hat\beta = \frac{S^2}{\bar{X}}$$
Substituting into the first equation:
$$\hat\alpha = \frac{\bar{X}}{\hat\beta} = \frac{\bar{X}^2}{S^2}$$
Therefore, the method of moments estimators for $\alpha$ and $\beta$ are:
$$\hat\alpha = \frac{\bar{X}^2}{S^2}, \qquad \hat\beta = \frac{S^2}{\bar{X}}$$
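A minimal simulation sketch of these estimators (illustrative parameter values, not from the original solution):

```python
import numpy as np

# Method-of-moments estimators for Gamma(alpha, beta), scale parametrization.
rng = np.random.default_rng(1)
alpha, beta = 3.0, 2.0
x = rng.gamma(shape=alpha, scale=beta, size=100_000)
xbar = x.mean()
s2 = x.var()                 # biased sample variance, as in the derivation
alpha_hat = xbar ** 2 / s2   # alpha-hat = Xbar^2 / S^2
beta_hat = s2 / xbar         # beta-hat  = S^2 / Xbar
print(alpha_hat, beta_hat)   # approx 3.0 and 2.0
```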
For the Beta$(\alpha, \beta)$ distribution, the population moments are:
$$E[X] = \frac{\alpha}{\alpha+\beta}, \qquad E[X^2] = \frac{\alpha(\alpha+1)}{(\alpha+\beta)(\alpha+\beta+1)}$$
From the first equation:
$$\alpha = \bar{X}(\alpha + \beta) \;\Longrightarrow\; \alpha(1 - \bar{X}) = \bar{X}\beta \;\Longrightarrow\; \beta = \frac{\alpha(1 - \bar{X})}{\bar{X}}$$
Now, let $m_2 = \frac{1}{n}\sum_{i=1}^n X_i^2$. The sample variance is $S^2 = m_2 - \bar{X}^2$. We also know that
$$\frac{\alpha(\alpha+1)}{(\alpha+\beta)(\alpha+\beta+1)} = E[X^2] = S^2 + \bar{X}^2$$
Substituting $\beta = \alpha(1-\bar{X})/\bar{X}$, so that $\alpha + \beta = \alpha/\bar{X}$:
$$\frac{\alpha(\alpha+1)}{\frac{\alpha}{\bar{X}}\left(\frac{\alpha}{\bar{X}} + 1\right)} = m_2 \;\Longrightarrow\; \frac{\bar{X}^2(\alpha+1)}{\alpha + \bar{X}} = m_2$$
Solving for $\alpha$ gives the method of moments estimators:
$$\hat\alpha = \bar{X}\left[\frac{\bar{X}(1-\bar{X})}{S^2} - 1\right], \qquad \hat\beta = (1-\bar{X})\left[\frac{\bar{X}(1-\bar{X})}{S^2} - 1\right]$$
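The same check for the Beta estimators (a sketch; parameter values are illustrative):

```python
import numpy as np

# Method-of-moments estimators for Beta(alpha, beta).
rng = np.random.default_rng(2)
alpha, beta = 2.0, 5.0
x = rng.beta(alpha, beta, size=100_000)
xbar, s2 = x.mean(), x.var()
common = xbar * (1 - xbar) / s2 - 1   # shared factor in both estimators
alpha_hat = xbar * common
beta_hat = (1 - xbar) * common
print(alpha_hat, beta_hat)            # approx 2.0 and 5.0
```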
For the shifted exponential distribution with PDF $f(x; \mu, \sigma) = \frac{1}{\sigma}e^{-(x-\mu)/\sigma}$ for $x > \mu$, the population moments are $E[X] = \mu + \sigma$ and $E[X^2] = \mu^2 + 2\mu\sigma + 2\sigma^2$. Equating to the sample moments:
$$\mu + \sigma = \bar{X}, \qquad \mu^2 + 2\mu\sigma + 2\sigma^2 = m_2$$
From the first equation:
$$\hat\mu = \bar{X} - \hat\sigma$$
Now, let $m_2 = \frac{1}{n}\sum_{i=1}^n X_i^2$. We also know that
$$\sigma^2 = E[X^2] - E[X]^2 = m_2 - \bar{X}^2$$
Substituting $\hat\sigma = \sqrt{m_2 - \bar{X}^2}$ into the first equation gives the MOM estimators:
$$\hat\sigma = \sqrt{m_2 - \bar{X}^2}, \qquad \hat\mu = \bar{X} - \sqrt{m_2 - \bar{X}^2}$$
For the Binomial$(m, \theta)$ distribution with $m$ known, $E[X] = m\theta$. The MOME solves
$$\sum_{i=1}^n X_i = nm\hat\theta \;\Longrightarrow\; \hat\theta = \frac{\sum_{i=1}^n X_i}{nm} = \frac{\bar{X}}{m}$$
3. (Maximum Likelihood Estimators)
For $\hat\theta$ to be an integer, the likelihood $L(\theta) = \prod_{i=1}^n f(x_i; \theta)$ must be maximized by discrete optimization, and $\theta$ must be greater than or equal to the largest observed value.
(c) Binomial$(m, \theta)$:
$$g(\theta) = P(X_1 + X_2 = 0) = P(X_1 = 0, X_2 = 0) = P(X_1 = 0)P(X_2 = 0) = (1-\theta)^m(1-\theta)^m = (1-\theta)^{2m}$$
Because the MLE of $\theta$ is $\bar{X}/m$, by the invariance property of MLEs, the MLE of $(1-\theta)^{2m}$ is $(1 - \bar{X}/m)^{2m}$.
(d) Hypergeometric$(m, n, \theta)$:
The likelihood function is a ratio of binomial coefficients in $\theta$,
$$L(\theta) = \frac{\binom{\theta}{x}\binom{m-\theta}{n-x}}{\binom{m}{n}}$$
Since $\theta$ is integer-valued, the MLE is the value of $\theta$ that maximizes $L(\theta)$ over the integers.
(e) Double Exponential:
The likelihood function is:
$$L(\theta) = \prod_{i=1}^n \frac{1}{2}e^{-|x_i - \theta|} = \frac{1}{2^n}\exp\left\{-\sum_{i=1}^n |x_i - \theta|\right\}$$
To maximize $L(\theta)$, we need to minimize $\sum_{i=1}^n |x_i - \theta|$. This is minimized when $\theta$ is the median of the $x_i$.
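A quick numerical illustration (a sketch, not part of the original solution) that the sample median minimizes the sum of absolute deviations:

```python
import numpy as np

# The double-exponential MLE minimizes sum_i |x_i - theta|; compare the grid
# minimizer of the sum of absolute deviations with the sample median.
rng = np.random.default_rng(3)
x = rng.laplace(loc=1.5, scale=1.0, size=1_001)       # odd n: unique median
grid = np.linspace(x.min(), x.max(), 2_001)
sad = np.abs(x[:, None] - grid[None, :]).sum(axis=0)  # sum |x_i - theta|
print(grid[np.argmin(sad)], np.median(x))             # nearly identical
```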
(f) Uniform$(\alpha, \beta)$:
$$f(x; \alpha, \beta) = \frac{1}{\beta - \alpha}, \qquad \alpha \le x \le \beta$$
$$L(\alpha, \beta) = \prod_{i=1}^n \frac{1}{\beta - \alpha} = \frac{1}{(\beta - \alpha)^n}, \qquad \alpha \le \min_i x_i, \; \max_i x_i \le \beta$$
The likelihood is maximized by making $\beta - \alpha$ as small as the constraints allow, so $\hat\alpha = \min_i(X_i)$ and $\hat\beta = \max_i(X_i)$. For $g(\theta) = \alpha + \beta$, the invariance property gives
$$\widehat{g(\theta)} = \min_i(X_i) + \max_i(X_i)$$
(g) Normal$(\theta, \theta^2)$:
The likelihood function is:
$$L(\theta) = \prod_{i=1}^n (2\pi\theta^2)^{-1/2}\exp\left\{-\frac{(x_i - \theta)^2}{2\theta^2}\right\}$$
Setting $\frac{d}{d\theta}\log L(\theta) = 0$ and multiplying through by $\theta^3$ reduces to the quadratic
$$n\theta^2 + \theta\sum_{i=1}^n x_i - \sum_{i=1}^n x_i^2 = 0,$$
so, taking the positive root (for $\theta > 0$),
$$\hat\theta = \frac{-\sum_i x_i + \sqrt{\left(\sum_i x_i\right)^2 + 4n\sum_i x_i^2}}{2n}$$
4. (Regression through the origin) Let $Y_i = \beta x_i + \varepsilon_i$ with $\varepsilon_i \stackrel{iid}{\sim} N(0, \sigma^2)$. The joint density is
$$f(y_1, \dots, y_n \mid \beta, \sigma^2) = (2\pi\sigma^2)^{-n/2}\exp\left\{-\frac{1}{2\sigma^2}\sum_{i=1}^n (y_i - \beta x_i)^2\right\}$$
Maximizing over $\beta$ gives the MLE
$$\hat\beta = \frac{\sum_{i=1}^n x_i Y_i}{\sum_{i=1}^n x_i^2}$$
Since $\hat\beta$ is a linear combination of the independent normal $Y_i$'s, $\hat\beta$ is also normally distributed.
The mean of $\hat\beta$ is:
$$E[\hat\beta] = \frac{\sum_i x_i E[Y_i]}{\sum_i x_i^2} = \frac{\beta\sum_i x_i^2}{\sum_i x_i^2} = \beta$$
The variance of $\hat\beta$ is:
$$\operatorname{Var}(\hat\beta) = \operatorname{Var}\left(\frac{\sum_i x_i Y_i}{\sum_i x_i^2}\right) = \frac{\sum_i x_i^2 \operatorname{Var}(Y_i)}{\left(\sum_i x_i^2\right)^2} = \frac{\sigma^2}{\sum_i x_i^2}$$
Thus, the distribution of the MLE of $\beta$ is:
$$\hat\beta \sim N\!\left(\beta, \; \frac{\sigma^2}{\sum_{i=1}^n x_i^2}\right)$$
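A simulation sketch of this result (design and parameter values are illustrative):

```python
import numpy as np

# Simulate the regression-through-origin MLE and compare its empirical mean
# and variance with the theoretical N(beta, sigma^2 / sum x_i^2) law.
rng = np.random.default_rng(4)
beta, sigma = 2.0, 1.5
x = rng.uniform(0.5, 2.0, size=50)                 # fixed design
denom = (x ** 2).sum()
y = beta * x + sigma * rng.standard_normal((100_000, x.size))
beta_hat = (y * x).sum(axis=1) / denom             # MLE, one per replication
print(beta_hat.mean(), beta)                       # approx equal
print(beta_hat.var(), sigma ** 2 / denom)          # approx equal
```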
Problem 4(d): Unbiased Estimator of Beta
Let $\tilde\beta = \frac{\sum_{i=1}^n Y_i}{\sum_{i=1}^n x_i}$. Then
$$E[\tilde\beta] = \frac{\sum_i E[Y_i]}{\sum_i x_i} = \frac{\beta\sum_i x_i}{\sum_i x_i} = \beta$$
Therefore, $\tilde\beta = \frac{\sum_i Y_i}{\sum_i x_i}$ is an unbiased estimator of $\beta$.
Problem 4(e): Variance Comparison
The variance of $\tilde\beta = \frac{\sum_i Y_i}{\sum_i x_i}$ is:
$$\operatorname{Var}(\tilde\beta) = \frac{\sum_i \operatorname{Var}(Y_i)}{\left(\sum_i x_i\right)^2} = \frac{n\sigma^2}{\left(\sum_i x_i\right)^2}$$
The variance of the MLE is:
$$\operatorname{Var}(\hat\beta) = \frac{\sigma^2}{\sum_i x_i^2}$$
By the Cauchy–Schwarz inequality, $\left(\sum_i x_i\right)^2 \le n\sum_i x_i^2$, so $\operatorname{Var}(\hat\beta) \le \operatorname{Var}(\tilde\beta)$.
To show unbiasedness:
$$E[\beta^*] = \frac{1}{n}\sum_{i=1}^n \frac{E[Y_i]}{x_i} = \frac{1}{n}\sum_{i=1}^n \frac{\beta x_i}{x_i} = \beta$$
Thus, $\beta^* = \frac{1}{n}\sum_{i=1}^n \frac{Y_i}{x_i}$ is an unbiased estimator of $\beta$.
Problem 4(g): Variance of $\beta^*$
The variance of $\beta^* = \frac{1}{n}\sum_i \frac{Y_i}{x_i}$ is:
$$\operatorname{Var}(\beta^*) = \frac{1}{n^2}\sum_{i=1}^n \frac{\operatorname{Var}(Y_i)}{x_i^2} = \frac{\sigma^2}{n^2}\sum_{i=1}^n \frac{1}{x_i^2}$$
Comparing to $\operatorname{Var}(\hat\beta) = \sigma^2/\sum_i x_i^2$: the Cauchy–Schwarz inequality gives $n^2 = \left(\sum_i x_i \cdot \frac{1}{x_i}\right)^2 \le \sum_i x_i^2 \sum_i \frac{1}{x_i^2}$, so $\operatorname{Var}(\hat\beta) \le \operatorname{Var}(\beta^*)$.
The MLE, $\hat\beta$, is the best linear unbiased estimator (BLUE): among these unbiased estimators it has the smallest variance.
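An empirical comparison of the three unbiased estimators (a sketch with illustrative values):

```python
import numpy as np

# Compare the variances of the MLE, sum(Y)/sum(x), and mean(Y_i/x_i).
rng = np.random.default_rng(5)
beta, sigma = 2.0, 1.5
x = rng.uniform(0.5, 2.0, size=30)                 # fixed design
y = beta * x + sigma * rng.standard_normal((200_000, x.size))
b_mle = (y * x).sum(axis=1) / (x ** 2).sum()
b_tilde = y.sum(axis=1) / x.sum()
b_star = (y / x).mean(axis=1)
print(b_mle.var(), b_tilde.var(), b_star.var())    # MLE variance is smallest
```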
Problem 5: Best Unbiased Estimator
Let $W_1, \dots, W_k$ be unbiased estimators of $\theta$ with known variances $\operatorname{Var}(W_i) = \sigma_i^2$.
Consider an estimator of the form:
$$\hat\theta = \sum_{i=1}^k a_i W_i, \qquad \sum_{i=1}^k a_i = 1,$$
which is unbiased for any weights summing to one. Minimizing $\operatorname{Var}(\hat\theta) = \sum_i a_i^2\sigma_i^2$ subject to this constraint gives $a_i \propto 1/\sigma_i^2$, so the best unbiased estimator of this form is
$$\hat\theta = \frac{\sum_{i=1}^k W_i/\sigma_i^2}{\sum_{i=1}^k 1/\sigma_i^2}.$$
For a normal sample $R_1, \dots, R_n$ with mean $\theta$, $E[\bar R^2] = \theta^2 + \operatorname{Var}(\bar R)$, and
$$S^2 = \frac{\sum_{i=1}^n (R_i - \bar R)^2}{n-1}$$
is unbiased for the variance. So, the unbiased estimator of $\theta^2$ is:
$$\hat\theta^2_{\text{unbiased}} = \bar R^2 - \frac{S^2}{n}$$
Since the normal family is complete and $(\bar R, S^2)$ is sufficient, this estimator is the UMVUE.
Problem Set 3
1. (Additive Properties)
(a) Binomial:
Let $X_i \stackrel{ind}{\sim}$ Binomial$(n_i, p)$ for $i = 1, \dots, k$. The moment generating function (MGF) of $X_i$ is:
$$M_{X_i}(t) = (1 - p + pe^t)^{n_i}$$
Since the $X_i$'s are independent, the MGF of $T = \sum_{i=1}^k X_i$ is:
$$M_T(t) = \prod_{i=1}^k M_{X_i}(t) = \prod_{i=1}^k (1 - p + pe^t)^{n_i} = (1 - p + pe^t)^{\sum_{i=1}^k n_i}$$
This is the MGF of a Binomial distribution with parameters $\sum_i n_i$ and $p$. Therefore:
$$T \sim \text{Binomial}\left(\sum_{i=1}^k n_i, \; p\right)$$
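A simulation check of the additive property (a sketch; parameters are illustrative):

```python
import numpy as np
from scipy import stats

# Sum of independent Binomial(n_i, p) with common p is Binomial(sum n_i, p).
rng = np.random.default_rng(6)
n_list, p = [3, 5, 7], 0.4
t = sum(rng.binomial(n, p, size=200_000) for n in n_list)
ks = np.arange(sum(n_list) + 1)
emp = np.bincount(t, minlength=ks.size) / t.size
print(np.abs(emp - stats.binom.pmf(ks, sum(n_list), p)).max())  # small
```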
(b) Poisson:
Let $X_i \stackrel{ind}{\sim}$ Poisson$(\lambda_i)$ for $i = 1, \dots, n$. The MGF of $X_i$ is:
$$M_{X_i}(t) = e^{\lambda_i(e^t - 1)}$$
Since the $X_i$'s are independent, the MGF of $T = \sum_{i=1}^n X_i$ is:
$$M_T(t) = \prod_{i=1}^n e^{\lambda_i(e^t - 1)} = e^{\left(\sum_i \lambda_i\right)(e^t - 1)},$$
which is the Poisson MGF. Therefore:
$$T \sim \text{Poisson}\left(\sum_{i=1}^n \lambda_i\right)$$
(c) Normal:
Let $X_i \stackrel{ind}{\sim}$ Normal$(\mu_i, \sigma_i^2)$ for $i = 1, \dots, n$. The MGF of $X_i$ is:
$$M_{X_i}(t) = e^{\mu_i t + \frac{1}{2}\sigma_i^2 t^2}$$
Since the $X_i$'s are independent, the MGF of $T = \sum_{i=1}^n X_i$ is:
$$M_T(t) = \prod_{i=1}^n M_{X_i}(t) = \prod_{i=1}^n e^{\mu_i t + \frac{1}{2}\sigma_i^2 t^2} = e^{\left(\sum_i \mu_i\right)t + \frac{1}{2}\left(\sum_i \sigma_i^2\right)t^2}$$
This is the MGF of a Normal distribution with mean $\sum_{i=1}^n \mu_i$ and variance $\sum_{i=1}^n \sigma_i^2$. Therefore:
$$T \sim \text{Normal}\left(\sum_{i=1}^n \mu_i, \; \sum_{i=1}^n \sigma_i^2\right)$$
2. Let $X \sim$ Normal$(\mu, \sigma^2)$ and $T = aX + b$. Then
$$M_T(t) = e^{bt}M_X(at) = e^{(a\mu + b)t + \frac{1}{2}a^2\sigma^2 t^2}$$
This is the MGF of a Normal distribution with mean $a\mu + b$ and variance $a^2\sigma^2$. Therefore:
$$T \sim \text{Normal}(a\mu + b, \; a^2\sigma^2)$$
3. Let $X \sim$ Gamma$(\alpha, \beta)$ and $T = aX$. The MGF of $X$ is:
$$M_X(t) = (1 - \beta t)^{-\alpha}, \qquad t < \frac{1}{\beta}$$
so $M_T(t) = M_X(at) = (1 - a\beta t)^{-\alpha}$ for $t < \frac{1}{a\beta}$. Therefore:
$$T \sim \text{Gamma}(\alpha, \; a\beta)$$
4. Let $X \sim$ Beta$(n/2, m/2)$ and
$$T = \frac{mX}{n(1-X)}.$$
To find the distribution of $T$, we use the transformation method.
First, find the CDF of $T$: the map $x \mapsto \frac{mx}{n(1-x)}$ is increasing, with inverse $x = \frac{nt}{m + nt}$, so
$$F_T(t) = P(T \le t) = P\!\left(X \le \frac{nt}{m + nt}\right)$$
Now, differentiate $F_T(t)$ to find the PDF $f_T(t)$. The PDF of $X$ is:
$$f_X(x) = \frac{x^{n/2 - 1}(1-x)^{m/2 - 1}}{B\!\left(\frac{n}{2}, \frac{m}{2}\right)}$$
where $B(\cdot, \cdot)$ is the beta function. Using the transformation formula and simplifying, we find that:
$$T \sim F_{n,m}$$
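A quick distributional check of this transformation (a sketch with illustrative degrees of freedom):

```python
import numpy as np
from scipy import stats

# T = m X / (n (1 - X)) with X ~ Beta(n/2, m/2) should be F(n, m).
rng = np.random.default_rng(7)
n, m = 5, 9
x = rng.beta(n / 2, m / 2, size=100_000)
t = m * x / (n * (1 - x))
print(stats.kstest(t, stats.f(n, m).cdf))  # large p-value: consistent with F
```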
5. Let $X \sim$ Uniform$(0, 1)$ and $a > 0$. Then $T = X^{1/a}$. The CDF of $X$ is $F_X(x) = x$ for $0 \le x \le 1$. The CDF of $T$ is:
$$F_T(t) = P(T \le t) = P(X^{1/a} \le t) = P(X \le t^a) = F_X(t^a) = t^a$$
for $0 \le t \le 1$. Differentiating $F_T(t)$ with respect to $t$, we get the PDF of $T$:
$$f_T(t) = at^{a-1}$$
for $0 \le t \le 1$. This is the PDF of a Beta distribution with parameters $a$ and 1. Therefore:
$$T \sim \text{Beta}(a, 1)$$
6. Let $X \sim$ Cauchy$(0, 1)$. Then $T = \frac{1}{1 + X^2}$. The PDF of $X$ is:
$$f_X(x) = \frac{1}{\pi(1 + x^2)}, \qquad -\infty < x < \infty$$
The CDF of $T$ is, for $0 < t < 1$, using the symmetry of the Cauchy density:
$$F_T(t) = P\!\left(\frac{1}{1 + X^2} \le t\right) = P\!\left(X^2 \ge \frac{1-t}{t}\right) = 2\,P\!\left(X \ge \sqrt{\frac{1-t}{t}}\right)$$
Differentiating $F_T(t)$ to find the PDF $f_T(t)$ and simplifying, we find that:
$$T \sim \text{Beta}(0.5, 0.5)$$
7. Let $X \sim$ Uniform$(0, 1)$. Then $T = -2\log X$. The CDF of $X$ is $F_X(x) = x$ for $0 \le x \le 1$. The CDF of $T$ is:
$$F_T(t) = P(T \le t) = P(-2\log X \le t) = P\!\left(\log X \ge -\frac{t}{2}\right) = P\!\left(X \ge e^{-t/2}\right)$$
$$F_T(t) = 1 - e^{-t/2}, \qquad t > 0$$
This is the CDF of the Exponential distribution with mean 2 (equivalently, the $\chi^2_2$ distribution), so $T \sim \text{Exp}(2)$.
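A quick check (a sketch) that the simulated transform matches both equivalent laws:

```python
import numpy as np
from scipy import stats

# T = -2 log X with X ~ Uniform(0, 1) is Exponential(scale 2) = chi-squared(2).
rng = np.random.default_rng(8)
t = -2 * np.log(rng.uniform(size=100_000))
print(stats.kstest(t, stats.expon(scale=2).cdf))  # large p-value
print(stats.kstest(t, stats.chi2(2).cdf))         # same distribution
```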
8. Let $X$ have the half-normal PDF $f_X(x) = \sqrt{\frac{2}{\pi}}\exp\{-x^2/2\}$, $x > 0$. Then
$$E(X) = \sqrt{\frac{2}{\pi}}\int_0^\infty x\,e^{-x^2/2}\,dx = \sqrt{\frac{2}{\pi}}$$
To find the variance, we first need $E(X^2)$.
Using integration by parts, or recognizing that this is the second moment of a half-normal distribution (which equals that of a standard normal), we have $E(X^2) = 1$.
Therefore:
$$\operatorname{Var}(X) = E(X^2) - [E(X)]^2 = 1 - \frac{2}{\pi}$$
9. With $f_X(x) = \sqrt{\frac{2}{\pi}}\exp\{-x^2/2\}$, $x > 0$, let $Y = X^2/2$, so $x = \sqrt{2y}$.
The PDF of $Y$ can be found using the transformation formula:
$$f_Y(y) = f_X(\sqrt{2y})\left|\frac{dx}{dy}\right| = \sqrt{\frac{2}{\pi}}\exp\{-y\}\cdot\frac{1}{\sqrt{2y}} = \frac{1}{\sqrt{\pi}}\,y^{1/2 - 1}e^{-y}, \qquad y > 0$$
Since $\Gamma(1/2) = \sqrt{\pi}$, this is the PDF of a Gamma distribution with parameters $\alpha = 1/2$ and $\beta = 1$. Therefore:
$$Y \sim \text{Gamma}\left(\frac{1}{2}, \, 1\right)$$
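A simulation sketch verifying both half-normal results above (seed and sample size are illustrative):

```python
import numpy as np
from scipy import stats

# X = |Z| is half-normal; check E(X) = sqrt(2/pi) and Y = X^2/2 ~ Gamma(1/2, 1).
rng = np.random.default_rng(9)
x = np.abs(rng.standard_normal(100_000))
print(x.mean(), np.sqrt(2 / np.pi))                      # E(X) check
print(stats.kstest(x ** 2 / 2, stats.gamma(a=0.5).cdf))  # large p-value
```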
10. Let $X$ be distributed as Gamma$(\alpha, \beta)$. Then show that $E(X^r) = \frac{\beta^r\,\Gamma(\alpha + r)}{\Gamma(\alpha)}$ for $r > -\alpha$.
The expected value of $X^r$ is given by:
$$E(X^r) = \int_0^\infty x^r\,\frac{x^{\alpha - 1}e^{-x/\beta}}{\beta^\alpha\Gamma(\alpha)}\,dx = \frac{1}{\beta^\alpha\Gamma(\alpha)}\int_0^\infty x^{\alpha + r - 1}e^{-x/\beta}\,dx = \frac{\beta^{\alpha + r}\,\Gamma(\alpha + r)}{\beta^\alpha\Gamma(\alpha)} = \frac{\beta^r\,\Gamma(\alpha + r)}{\Gamma(\alpha)}$$
The integral converges only when $\alpha + r > 0$, i.e., $r > -\alpha$.
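A numerical check of the moment formula, including a negative $r$ (a sketch; values are illustrative):

```python
import numpy as np
from scipy import stats
from scipy.special import gamma as G

# E(X^r) = beta^r * Gamma(alpha + r) / Gamma(alpha) for X ~ Gamma(alpha, beta).
alpha, beta, r = 3.0, 2.0, -1.0        # any r > -alpha works
exact = beta ** r * G(alpha + r) / G(alpha)
x = stats.gamma(a=alpha, scale=beta).rvs(size=1_000_000, random_state=0)
print(exact, (x ** r).mean())          # close agreement
```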
11. Let $(X, Y)$ have joint PDF $f(x, y) = \frac{x + 2y}{4}$ for $0 < x < 2$, $0 < y < 1$ (and 0 otherwise).
(a) The marginal distribution of $Y$ is
$$f_Y(y) = \int_0^2 \frac{x + 2y}{4}\,dx = \frac{1}{2} + y$$
for $0 < y < 1$.
(b) Find the conditional distribution of $Y$ given $X = 1$.
First, find the marginal distribution of $X$:
$$f_X(x) = \int_0^1 \frac{x + 2y}{4}\,dy = \frac{x + 1}{4}, \qquad 0 < x < 2$$
so
$$f_{Y|X}(y \mid 1) = \frac{f(1, y)}{f_X(1)} = \frac{(1 + 2y)/4}{2/4} = \frac{1}{2} + y$$
for $0 < y < 1$.
(c) Compare the expectations of the above two distributions of $Y$.
The expectation of the marginal distribution of $Y$ is:
$$E[Y] = \int_0^1 y\left(\frac{1}{2} + y\right)dy = \frac{1}{4} + \frac{1}{3} = \frac{7}{12}$$
Since the conditional density of $Y$ given $X = 1$ coincides with the marginal density of $Y$, the two expectations are equal.
(d) Find $\operatorname{Cov}(X, Y)$.
$$E[XY] = \int_0^1\!\!\int_0^2 xy\,\frac{x + 2y}{4}\,dx\,dy = \frac{1}{4}\int_0^1 y\left[\frac{x^3}{3} + x^2 y\right]_0^2 dy = \frac{1}{4}\int_0^1\left(\frac{8}{3}y + 4y^2\right)dy = \frac{1}{4}\left[\frac{4}{3} + \frac{4}{3}\right] = \frac{2}{3}$$
Now, we need:
$$E[X] = \int_0^2 x\,\frac{x + 1}{4}\,dx = \frac{7}{6}, \qquad \operatorname{Cov}(X, Y) = E[XY] - E[X]E[Y] = \frac{2}{3} - \frac{7}{6}\cdot\frac{7}{12} = -\frac{1}{72}$$
(e) Find the distribution of $Z = 9/(2Y + 1)^2$.
Since $0 < Y < 1$, $Z$ takes values in $(1, 9)$. For $1 < z < 9$:
$$P(Z \le z) = P\!\left(\frac{9}{(2Y + 1)^2} \le z\right) = P\!\left((2Y + 1)^2 \ge \frac{9}{z}\right) = P\!\left(2Y + 1 \ge \frac{3}{\sqrt{z}}\right) = P\!\left(Y \ge \frac{1}{2}\left(\frac{3}{\sqrt{z}} - 1\right)\right)$$
$$F_Z(z) = \int_{\frac{1}{2}\left(3z^{-1/2} - 1\right)}^{1}\left(\frac{1}{2} + y\right)dy$$
(f) What is $P(X > Y)$?
$$P(X > Y) = \iint_{x > y} f(x, y)\,dx\,dy = \int_0^1\!\!\int_y^2 \frac{x + 2y}{4}\,dx\,dy$$
$$= \frac{1}{4}\int_0^1\left[\frac{x^2}{2} + 2xy\right]_y^2 dy = \frac{1}{4}\int_0^1\left(2 + 4y - \frac{y^2}{2} - 2y^2\right)dy = \frac{1}{4}\left(2 + 2 - \frac{1}{6} - \frac{2}{3}\right) = \frac{1}{4}\cdot\frac{19}{6} = \frac{19}{24}$$
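A Monte Carlo check of the Problem 11 answers (a sketch using rejection sampling from the joint density above):

```python
import numpy as np

# Rejection-sample f(x, y) = (x + 2y)/4 on 0 < x < 2, 0 < y < 1
# (its maximum on the rectangle is 1), then check the computed moments.
rng = np.random.default_rng(10)
n = 2_000_000
x = rng.uniform(0, 2, n)
y = rng.uniform(0, 1, n)
keep = rng.uniform(0, 1, n) < (x + 2 * y) / 4
x, y = x[keep], y[keep]
print(y.mean(), 7 / 12)          # E[Y]
print((x * y).mean(), 2 / 3)     # E[XY]
print((x > y).mean(), 19 / 24)   # P(X > Y)
```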
12. Let $X \sim$ Normal$(0, 1)$. Define $Y = -X\,\mathbb{1}(|X| \le 1) + X\,\mathbb{1}(|X| > 1)$. Find the distribution of $Y$. (Hint: Apply the CDF approach.)
$$Y = \begin{cases} X, & X < -1 \\ -X, & -1 \le X \le 1 \\ X, & X > 1 \end{cases}$$
Let's find the CDF of $Y$, $F_Y(y) = P(Y \le y)$, writing $\Phi$ for the standard normal CDF.
If $y < -1$: $\{Y \le y\} = \{X \le y\}$, so $F_Y(y) = \Phi(y)$.
If $-1 \le y < 1$: $\{Y \le y\} = \{X < -1\} \cup \{-1 \le X \le 1, \, X \ge -y\}$, so
$$F_Y(y) = \Phi(-1) + P(-y \le X \le 1) = \Phi(-1) + \Phi(1) - \Phi(-y) = 1 - \Phi(-y) = \Phi(y)$$
using $\Phi(-1) + \Phi(1) = 1$.
If $y \ge 1$:
$$F_Y(y) = P(X \le 1) + P(1 < X \le y) = \Phi(y)$$
In every case $F_Y(y) = \Phi(y)$, so $Y \sim$ Normal$(0, 1)$: the transformation reflects $X$ only on the symmetric set $[-1, 1]$, which leaves the standard normal law unchanged.
13. Let $X \sim$ Normal$(0, 1)$. Define $Y = \operatorname{sign}(X)$ and $Z = |X|$. Here $\operatorname{sign}(\cdot)$ is a $\mathbb{R} \to \{-1, 1\}$ function such that $\operatorname{sign}(a) = 1$ if $a > 0$, and $\operatorname{sign}(a) = -1$ otherwise.
(a) Find the marginal distributions of $Y$ and $Z$.
$Y$ is a discrete random variable taking values $-1$ and $1$:
$$P(Y = 1) = P(X > 0) = 0.5, \qquad P(Y = -1) = P(X \le 0) = 0.5$$
$Z = |X|$ is half-normal, with density $f_Z(z) = 2\phi(z) = \sqrt{\frac{2}{\pi}}e^{-z^2/2}$ for $z > 0$.
(b) Find the joint CDF of $(Y, Z)$. Hence or otherwise prove that $Y$ and $Z$ are independently distributed.
For $z > 0$, by the symmetry of the standard normal,
$$P(Y = 1, Z \le z) = P(0 < X \le z) = \tfrac{1}{2}F_Z(z), \qquad P(Y = -1, Z \le z) = P(-z \le X \le 0) = \tfrac{1}{2}F_Z(z)$$
In both cases $P(Y = y, Z \le z) = P(Y = y)\,P(Z \le z)$: the joint distribution factors into the marginals, so $Y$ and $Z$ are independent.
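A simulation sketch of this independence (sample size is illustrative):

```python
import numpy as np

# If Y = sign(X) and Z = |X| are independent, the conditional law of Z
# given the sign of X should not depend on that sign.
rng = np.random.default_rng(11)
x = rng.standard_normal(1_000_000)
y, z = np.sign(x), np.abs(x)
for q in (0.25, 0.5, 0.75):
    thr = np.quantile(z, q)
    # P(Z <= thr | Y = 1) vs P(Z <= thr | Y = -1): both approx q
    print(q, (z[y > 0] <= thr).mean(), (z[y < 0] <= thr).mean())
```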
14. Suppose $X_1, \dots, X_n \stackrel{iid}{\sim}$ Normal$(\mu_x, \sigma^2)$, $Y_1, \dots, Y_m \stackrel{iid}{\sim}$ Normal$(\mu_y, \sigma^2)$, and all the random variables $\{X_1, \dots, X_n, Y_1, \dots, Y_m\}$ are mutually independent. Then find the distribution of $T := S_X^2/S_Y^2$, where $S_X^2$ and $S_Y^2$ are the unbiased sample variances of $X$ and $Y$, respectively.
Here $S_X^2 = \frac{1}{n-1}\sum_{i=1}^n (X_i - \bar{X})^2$ and $S_Y^2 = \frac{1}{m-1}\sum_{i=1}^m (Y_i - \bar{Y})^2$.
We know that
$$\frac{(n-1)S_X^2}{\sigma^2} \sim \chi^2_{n-1} \quad\text{and}\quad \frac{(m-1)S_Y^2}{\sigma^2} \sim \chi^2_{m-1},$$
and the two are independent. Hence
$$T = \frac{S_X^2}{S_Y^2} = \frac{\chi^2_{n-1}/(n-1)}{\chi^2_{m-1}/(m-1)} \sim F_{n-1,\,m-1}$$
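A distributional check of the variance ratio (a sketch; sample sizes are illustrative):

```python
import numpy as np
from scipy import stats

# Ratio of unbiased sample variances from two independent normal samples
# with common sigma should follow F(n-1, m-1).
rng = np.random.default_rng(12)
n, m, sigma = 8, 12, 2.0
x = rng.normal(0.0, sigma, size=(100_000, n))
y = rng.normal(5.0, sigma, size=(100_000, m))
t = x.var(axis=1, ddof=1) / y.var(axis=1, ddof=1)
print(stats.kstest(t, stats.f(n - 1, m - 1).cdf))  # large p-value
```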
For an i.i.d. sample $X_1, \dots, X_n$ with CDF $F$, the CDF of the $r$-th order statistic $X_{(r)}$ is
$$F_{X_{(r)}}(x) = P(X_{(r)} \le x) = P(\text{at least } r \text{ of } X_1, \dots, X_n \text{ are} \le x) = \sum_{i=r}^{n}\binom{n}{i}[F(x)]^i[1 - F(x)]^{n-i}$$
since the number of observations $\le x$ is Binomial$(n, F(x))$.
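A simulation check of the order-statistic CDF formula (a sketch for a Uniform(0,1) sample, where $F(x) = x$):

```python
import numpy as np
from scipy import stats

# P(X_(r) <= x) = sum_{i=r}^{n} C(n, i) F(x)^i (1 - F(x))^(n-i).
rng = np.random.default_rng(13)
n, r, x0 = 7, 3, 0.4
samples = np.sort(rng.uniform(size=(200_000, n)), axis=1)
emp = (samples[:, r - 1] <= x0).mean()
exact = sum(stats.binom.pmf(i, n, x0) for i in range(r, n + 1))
print(emp, exact)  # close agreement
```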