
Exercise 6.1 (Practice Questions)

Subject: SME
Course: BA(Hons), 2nd Sem
1. If X1, X2, ..., Xn constitute a random sample of size n from an infinite population with mean µ and variance σ², compute E(X̄) and Var(X̄).

2. Suppose X ∼ N(µ, σ²) and we draw a random sample of size n from this population. The following are two estimators of µ:
(i) X̄ = Σ Xi / n
(ii) X* = Σ Xi / (n + 1)
Check if the above estimators are unbiased.
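A quick numerical check of question 2 (a Monte Carlo sketch; the values of µ, σ, n, the seed, and the number of repetitions are illustrative choices, not part of the question):

import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n, reps = 5.0, 2.0, 10, 100_000   # illustrative values

samples = rng.normal(mu, sigma, size=(reps, n))
x_bar = samples.sum(axis=1) / n              # (i)  sum of Xi divided by n
x_star = samples.sum(axis=1) / (n + 1)       # (ii) sum of Xi divided by n + 1

# An unbiased estimator should average out to mu across many samples.
print("mean of X-bar:", x_bar.mean())        # close to mu
print("mean of X*   :", x_star.mean())       # close to n*mu/(n+1), i.e. systematically low

Compare the two printed averages with the expectations you derive algebraically.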

3. Let X1, X2, ..., Xn be a sample of size n from a normal distribution X ∼ N(µ, σ²). Consider the following point estimators of σ²:
(i) S² = Σ(Xi − X̄)² / (n − 1)
(ii) S*² = Σ(Xi − X̄)² / n
Check if the above estimators are unbiased.
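The same kind of check for question 3 (again a sketch with illustrative parameter values):

import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n, reps = 0.0, 3.0, 8, 100_000    # illustrative values

x = rng.normal(mu, sigma, size=(reps, n))
dev2 = (x - x.mean(axis=1, keepdims=True)) ** 2   # squared deviations from the sample mean

s2 = dev2.sum(axis=1) / (n - 1)   # (i)  divisor n - 1
s2_star = dev2.sum(axis=1) / n    # (ii) divisor n

print("true sigma^2   :", sigma ** 2)
print("average of S^2 :", s2.mean())        # close to sigma^2
print("average of S*^2:", s2_star.mean())   # close to (n - 1)/n * sigma^2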

4. If X has the binomial distribution with parameters n and P, show that the sample proportion p = X/n is an unbiased estimator of P.

5. If X1, X2, and X3 constitute a random sample of size 3 from the normal population with mean µ and variance σ², find the efficiency of
X̄1 = (X1 + 2X2 + X3) / 4
relative to
X̄2 = (X1 + X2 + X3) / 3
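A simulation sketch for question 5 (µ, σ, the seed, and the number of repetitions are illustrative); compare the two empirical variances with the ones you derive algebraically:

import numpy as np

rng = np.random.default_rng(2)
mu, sigma, reps = 10.0, 2.0, 200_000          # illustrative values

x1, x2, x3 = rng.normal(mu, sigma, size=(3, reps))
est1 = (x1 + 2 * x2 + x3) / 4                 # X-bar_1
est2 = (x1 + x2 + x3) / 3                     # X-bar_2 (the ordinary sample mean)

# Both estimators are unbiased, so relative efficiency is the ratio of their variances.
print("Var(X-bar_1):", est1.var())
print("Var(X-bar_2):", est2.var())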

6. Suppose X1 and X2 are a random sample from a population with mean µ, and let the estimator be a weighted average of X1 and X2, i.e. Y = λ1X1 + λ2X2. What values of λ1 and λ2 will give us an efficient (unbiased, minimum-variance) estimator of µ?

7. Show that the sample mean X̄ is the best linear unbiased estimator.

8. Explain the meaning of a BLUE estimator. If X1 and X2 are a random sample from a population with mean µ and variance σ², and we define two estimators of µ as X̄ = 0.5X1 + 0.5X2 and X̂ = 0.3X1 + 0.7X2, show that X̄ is BLUE but X̂ is not.

9. Let X be a random variable with mean µ and variance σ². Three independent observations X1, X2, and X3 are drawn from this distribution. Consider three different estimators of µ:
X̂ = 0.2X1 + 0.3X2 + 0.5X3
X* = 0.4X1 + 0.2X2 + 0.4X3
X̄ = 0.3X1 + 0.3X2 + 0.3X3
Which of the above three are BLUE?
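For question 9, the whole check reduces to two numbers per estimator, since for independent observations a linear estimator Σ wi·Xi has expectation (Σ wi)·µ and variance (Σ wi²)·σ². A minimal sketch:

import numpy as np

# Weights of the three linear estimators in question 9.
weights = {
    "X-hat":  np.array([0.2, 0.3, 0.5]),
    "X-star": np.array([0.4, 0.2, 0.4]),
    "X-bar":  np.array([0.3, 0.3, 0.3]),
}

for name, w in weights.items():
    # Unbiased for mu  <=>  the weights sum to 1.
    # Among unbiased linear estimators, a smaller sum of squared weights
    # means a smaller variance, hence a more efficient estimator.
    print(f"{name}: sum of weights = {w.sum():.2f}, sum of squared weights = {(w ** 2).sum():.2f}")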

10. (a) Let X1, X2, ..., Xn be a random sample drawn from a population with mean µ and variance σ². Let X̄ = Σ Xi / n be the sample mean. Show that E(X̄) = µ and Var(X̄) = σ²/n.
(b) Let Y = Σ ai Xi / n, where the ai's are fixed constants. Derive E(Y) and Var(Y). What is the condition for E(Y) to be equal to µ?
(c) Construct a new random variable Z that depends only on X̄, µ, σ², and n, such that it has zero mean and variance 1.

11. What is mean square error? Why and when is the rule to minimize the mean square error
useful? Prove that MSE equals variance plus the square of the bias of the estimator.
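The identity in question 11 can be verified numerically before proving it. The sketch below uses a deliberately biased estimator (0.8 times the sample mean) purely as an illustration; all parameter values are arbitrary:

import numpy as np

rng = np.random.default_rng(3)
mu, sigma, n, reps = 4.0, 2.0, 5, 200_000     # illustrative values

x = rng.normal(mu, sigma, size=(reps, n))
theta_hat = 0.8 * x.mean(axis=1)              # a deliberately biased estimator of mu

mse = ((theta_hat - mu) ** 2).mean()          # E[(theta_hat - mu)^2]
var = theta_hat.var()                         # Var(theta_hat)
bias = theta_hat.mean() - mu                  # E[theta_hat] - mu

print("MSE         :", mse)
print("Var + Bias^2:", var + bias ** 2)       # should agree up to simulation noise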

12. Given two estimators t1 and t2 of the population mean (µ = 10), with E(t1) = 10, E(t2) = 11, V(t1) = 4 and V(t2) = 1, which estimator is better and why?

13. Find the mean square error when the sample mean X̄ is used to estimate the mean of an X ∼ N(µ, σ²) distribution, based on a random sample of n observations.

14. A random variable X has mean µ and variance σ². Two independent observations X1 and X2 are drawn. Consider an estimator of µ given by µ̂ = 1.1X1 + bX2.
(a) What value of b will make µ̂ unbiased?
(b) Compute the variance of µ̂ in terms of σ². What value of b will minimize this variance? Do you get the same value of b as in (a) above? What do you conclude?
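A small harness for question 14 (the values of µ, σ, and the candidate b's are illustrative); it estimates the bias and variance of µ̂ = 1.1X1 + bX2 for a few values of b so you can compare with your algebra:

import numpy as np

def bias_and_variance(b, mu=3.0, sigma=2.0, reps=200_000, seed=4):
    """Monte Carlo bias and variance of mu-hat = 1.1*X1 + b*X2."""
    rng = np.random.default_rng(seed)
    x1, x2 = rng.normal(mu, sigma, size=(2, reps))
    mu_hat = 1.1 * x1 + b * x2
    return mu_hat.mean() - mu, mu_hat.var()

for b in (-0.2, -0.1, 0.0, 0.1):              # candidate values of b
    bias, var = bias_and_variance(b)
    print(f"b = {b:+.1f}: bias ~ {bias:+.3f}, variance ~ {var:.3f}")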

15. If X is a binomial random variable, show that
(a) p̂ = X/n is an unbiased estimator of p
(b) p̃ = (X + √n/2) / (n + √n) is a biased estimator of p
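A numerical check of question 15 (the form of p̃ above is the standard textbook one with √n in both numerator and denominator; n, p, the seed, and the number of repetitions below are illustrative):

import numpy as np

rng = np.random.default_rng(5)
n, p, reps = 20, 0.3, 300_000                 # illustrative values

x = rng.binomial(n, p, size=reps)
p_hat = x / n                                       # (a)
p_tilde = (x + np.sqrt(n) / 2) / (n + np.sqrt(n))   # (b)

print("true p            :", p)
print("average of p-hat  :", p_hat.mean())     # close to p (unbiased)
print("average of p-tilde:", p_tilde.mean())   # pulled towards 1/2, so biased unless p = 1/2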

16. If X1, X2, ..., Xn constitute a random sample from a normal distribution with mean zero and variance θ, 0 < θ < ∞, examine two estimators of θ:
(a) θ̂ = Σ Xi² / n
(b) θ̃ = Σ Xi² / (n − 1)
Which one is better?
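A sketch for question 16 comparing the two estimators on both bias and mean square error (θ, n, the seed, and the number of repetitions are illustrative):

import numpy as np

rng = np.random.default_rng(6)
theta, n, reps = 4.0, 6, 200_000              # theta is the population variance; values illustrative

x = rng.normal(0.0, np.sqrt(theta), size=(reps, n))
ssq = (x ** 2).sum(axis=1)                    # sum of Xi^2 (the mean is known to be zero)

theta_hat = ssq / n                           # (a)
theta_tilde = ssq / (n - 1)                   # (b)

for name, est in (("theta-hat", theta_hat), ("theta-tilde", theta_tilde)):
    bias = est.mean() - theta
    mse = ((est - theta) ** 2).mean()
    print(f"{name}: bias ~ {bias:+.3f}, MSE ~ {mse:.3f}")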

17. PYQ(2017) Q10


(a)
(b) Assume a random sample (X1, X2, X3, ..., Xn) from a population with mean µ and variance σ².
i. Show that µ̂ = (1·X1 + 2·X2 + 3·X3 + ..... + n·Xn) / (0.5n(n + 1)) is unbiased.
ii. Show that the variance of µ̂ is 2(2n + 1)σ² / (3n(n + 1)).

(c) What is the mean squared error (MSE) of an estimator θ̂? Show that MSE(θ̂) = Variance(θ̂) + (Bias(θ̂))².
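A check of 17(b) (the population is taken to be normal here only for convenience; µ, σ, n, and the seed are illustrative):

import numpy as np

rng = np.random.default_rng(7)
mu, sigma, n, reps = 2.0, 3.0, 10, 200_000    # illustrative values

x = rng.normal(mu, sigma, size=(reps, n))
w = np.arange(1, n + 1)                       # weights 1, 2, ..., n
mu_hat = (x * w).sum(axis=1) / (0.5 * n * (n + 1))

print("average of mu-hat :", mu_hat.mean())   # close to mu (unbiasedness, part i)
print("variance of mu-hat:", mu_hat.var())
print("stated formula    :", 2 * (2 * n + 1) * sigma ** 2 / (3 * n * (n + 1)))   # part ii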

18. PYQ(2018) Q7
(a)
(b) Differentiate between a parameter and a statistic. Which of the following are statistics, and why?
i. Σ(Xi − µ) / σ
ii. Σ(Xi − X̄) / n
iii. Σ Xi / n
iv. [max(Xi − µ) − min(Xi − µ)] / n

19. PYQ(2018) Q10


(a)
(b) Let X11, X12, ..., X1n1 and X21, X22, ..., X2n2 be two random samples from a population following a binomial distribution. The parameter to be estimated is p, defined as the proportion of successes in the two samples. Which of the following is a better point estimate in terms of efficiency (i.e., lesser variance)?
i. (X1 + X2) / (n1 + n2)
ii. (X1/n1 + X2/n2) / 2
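A simulation sketch for 19(b), reading X1 and X2 as the success counts in the two samples (an assumption about the notation); p, n1, n2, and the seed are illustrative, with unequal sample sizes chosen on purpose:

import numpy as np

rng = np.random.default_rng(8)
p, n1, n2, reps = 0.4, 10, 40, 300_000        # illustrative values

x1 = rng.binomial(n1, p, size=reps)           # successes in the first sample
x2 = rng.binomial(n2, p, size=reps)           # successes in the second sample

pooled = (x1 + x2) / (n1 + n2)                # estimator i
averaged = (x1 / n1 + x2 / n2) / 2            # estimator ii

# Both estimators are unbiased for p; compare their variances to judge efficiency.
print("Var of pooled estimator  :", pooled.var())
print("Var of averaged estimator:", averaged.var())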

20. PYQ(2019) Q10


(a) Let X1, X2, ..., Xn denote a random sample from a normal distribution with mean zero and variance σ², 0 < σ² < ∞. Examine the two estimators of σ²: (1) Σ Xi² / n (2) Σ Xi² / (n − 1).
Which of these two is an unbiased estimator of σ² for a finite sample? Will your answer change as n → ∞?

