Problem Set 2 - Econ 710 - Spring 2018 - Part 2

This document contains instructions for three exercises: 1. Estimating the parameter θ0 of an exponential distribution using maximum likelihood and generalized method of moments. The asymptotic distributions of the estimators are derived and compared. 2. Estimating the variance σ2 of a random variable using generalized method of moments. Two GMM estimators are derived and their asymptotic distributions are compared. The estimators are also applied to real data. 3. Estimating the mean μ of a variable Y using generalized method of moments, under the assumption that the mean of another variable X is zero. An efficient GMM estimator for μ is to be derived.


Econ 710 - Part 2 - Problem Set 2

Spring 2018

Due April 23, 2018

For all questions below, if you need to make additional assumptions to derive the results,
state them clearly.

Exercise 1. Suppose X has a distribution with a density function


f(x, θ0) = (1/θ0) e^{−x/θ0},

where θ0 > 0. Let {Xi}_{i=1}^n be a random sample from the distribution of X.

(a) Derive an expression for the MLE of θ0 and denote it by θ̂mle .



(b) Use the CLT to derive the asymptotic distribution of √n(θ̂mle − θ0).

(c) Now use the general asymptotic normality theorem to derive the asymptotic distribution
of √n(θ̂mle − θ0). That is, verify all conditions of Theorem 2 in the class notes,
derive A(θ0) and B(θ0), and use these results to derive the asymptotic distribution of
√n(θ̂mle − θ0). For the last condition of Theorem 2, you can simply apply a LLN to
show that the estimator is consistent.

(d) Show that E[(X − θ0)² − θ0²] = 0 and use this moment condition to derive a GMM
estimator of θ0. Denote the estimator by θ̂gmm.

(e) Derive the asymptotic distribution of √n(θ̂gmm − θ0).

(f) Does the asymptotic distribution of √n(θ̂gmm − θ0) or the asymptotic distribution of
√n(θ̂mle − θ0) have a smaller variance?

Exercise 2. Let X be a random variable such that E(X) = σ and var(X) = σ², with
σ > 0. Suppose we observe a random sample {Xi}_{i=1}^n from the distribution of X and we are
interested in estimating σ.

(a) Show that E[X − σ] = 0 and E[X² − 2Xσ] = 0.

(b) Define g(X, σ) = (X − σ, X² − 2Xσ)′ and

Qn(σ) = [ (1/n) Σ_{i=1}^n g(Xi, σ) ]′ [ (1/n) Σ_{i=1}^n g(Xi, σ) ].

Derive an expression for the minimizer of Qn (σ) and derive its asymptotic distribution.
You can derive your expressions in matrix notation.
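Whatever expression you derive can be checked numerically by minimizing Qn(σ) on simulated data. The sketch below uses an exponential sample, for which E[X] = σ and var(X) = σ² both hold, and a plain grid search as a stand-in for solving the first-order condition; all settings are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
sigma0 = 1.5
# An exponential with scale sigma0 satisfies E[X] = sigma0 and var(X) = sigma0^2
x = rng.exponential(scale=sigma0, size=50_000)

m1, m2 = x.mean(), (x**2).mean()  # the two sample moments

def Qn(s):
    gbar = np.array([m1 - s, m2 - 2.0 * s * m1])  # stacked sample moments
    return gbar @ gbar                            # identity weight matrix

# Grid search for the minimizer over an arbitrary bracket around sigma0
grid = np.linspace(0.5, 3.0, 20_001)
sigma_hat = grid[np.argmin([Qn(s) for s in grid])]
print(sigma_hat)  # should be close to sigma0
```

If your analytic minimizer disagrees with the grid-search value on a large simulated sample, revisit the algebra.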

(c) Using the two moments from part (a), derive the optimal GMM weight matrix. Recall
that the optimal weight matrix is

W = ( E[g(X, σ) g(X, σ)′] )^{−1}.

(d) Derive an expression for the GMM estimator with the optimal weight matrix and derive
its asymptotic distribution. Recall that the GMM estimator with the optimal weight
matrix is the minimizer of
[ (1/n) Σ_{i=1}^n g(Xi, σ) ]′ Ŵ [ (1/n) Σ_{i=1}^n g(Xi, σ) ],

where Ŵ is an estimator of W .
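The usual two-step recipe (identity weight first, then the plug-in Ŵ evaluated at the first-step estimate) can be sketched on simulated data as follows; the data-generating process, grid bounds, and variable names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
sigma0 = 1.5
x = rng.exponential(scale=sigma0, size=50_000)  # E[X] = sigma0, var(X) = sigma0^2
n = x.size
m1, m2 = x.mean(), (x**2).mean()

def gbar(s):
    return np.array([m1 - s, m2 - 2.0 * s * m1])  # stacked sample moments

grid = np.linspace(0.5, 3.0, 5_001)

# Step 1: minimize the objective with the identity weight matrix
s1 = grid[np.argmin([gbar(s) @ gbar(s) for s in grid])]

# Step 2: plug-in estimate of W = (E[g g'])^{-1} at the first-step estimate,
# then re-minimize with that weight
gi = np.column_stack([x - s1, x**2 - 2.0 * s1 * x])  # n x 2 moment observations
W_hat = np.linalg.inv(gi.T @ gi / n)
s2 = grid[np.argmin([gbar(s) @ W_hat @ gbar(s) for s in grid])]

print(s1, s2)  # both steps should land near sigma0
```

With a correctly specified model both steps are consistent; the payoff of the second step shows up in the asymptotic variance, which the exercise asks you to derive.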

(e) Now take the data on the class website and estimate σ using the two different estimators.
Also calculate two corresponding 95% confidence intervals.

(f) Perform an over-identification test, which tests the null hypothesis that the model is
correctly specified. What do you conclude?
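With 2 moments and 1 parameter, the J statistic here has 1 degree of freedom under the null. A sketch of the computation on simulated (correctly specified) data, reusing the two-step estimates from above; the 3.841 cutoff is the 95% critical value of the χ²(1) distribution:

```python
import numpy as np

rng = np.random.default_rng(4)
sigma0 = 1.5
x = rng.exponential(scale=sigma0, size=50_000)  # correctly specified by construction
n = x.size
m1, m2 = x.mean(), (x**2).mean()

def gbar(s):
    return np.array([m1 - s, m2 - 2.0 * s * m1])

grid = np.linspace(0.5, 3.0, 5_001)
s1 = grid[np.argmin([gbar(s) @ gbar(s) for s in grid])]       # step 1: identity weight
gi = np.column_stack([x - s1, x**2 - 2.0 * s1 * x])
W_hat = np.linalg.inv(gi.T @ gi / n)
s2 = grid[np.argmin([gbar(s) @ W_hat @ gbar(s) for s in grid])]  # step 2: optimal weight

# J statistic: n times the optimally weighted objective at the efficient estimate
J = n * gbar(s2) @ W_hat @ gbar(s2)
print(J)  # compare with the chi-squared(1) 95% critical value, 3.841
```

On the real class data the conclusion may of course differ; the point of the sketch is only the mechanics of the statistic.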

Exercise 3. Suppose you want to estimate µ = E[Y] under the assumption that E[X] =
0, where Y and X are scalar random variables. Suppose we observe a random sample
{(Yi, Xi)}_{i=1}^n. Find an efficient GMM estimator for µ.
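As a numerical companion (not the requested derivation), it is instructive to compare the plain sample mean of Y with a regression-adjusted mean that exploits the known restriction E[X] = 0; whether the adjusted estimator coincides with the efficient GMM estimator is exactly what the exercise asks you to work out. The data-generating process below is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)
n, mu = 100_000, 1.0
x = rng.normal(0.0, 1.0, n)                  # E[X] = 0 holds by construction
y = mu + 0.8 * x + rng.normal(0.0, 1.0, n)   # Y is correlated with X

# Candidate 1: ignore X entirely
mu_plain = y.mean()

# Candidate 2: subtract the part of the sample mean of Y explained by X,
# using that the population mean of X is known to be zero
beta = np.cov(y, x, bias=True)[0, 1] / x.var()
mu_adj = y.mean() - beta * (x.mean() - 0.0)

print(mu_plain, mu_adj)  # both should be close to mu
```

Comparing the sampling variability of the two candidates across repeated simulations is a good way to check whichever estimator you derive as efficient.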
