Applied Statistics and Probability

for Engineers
Seventh Edition
Douglas C. Montgomery George C. Runger

Chapter 7
Point Estimation of Parameters and Sampling Distributions
Chapter 7 Title Slide
7 Point Estimation of Parameters and
Sampling Distributions

CHAPTER OUTLINE
7.1 Point Estimation
7.2 Sampling Distributions and the Central Limit Theorem
7.3 General Concepts of Point Estimation
    7.3.1 Unbiased Estimators
    7.3.2 Variance of a Point Estimator
    7.3.3 Standard Error: Reporting a Point Estimate
    7.3.4 Bootstrap Standard Error
    7.3.5 Mean Squared Error of an Estimator
7.4 Methods of Point Estimation
    7.4.1 Method of Moments
    7.4.2 Method of Maximum Likelihood
    7.4.3 Bayesian Estimation of Parameters

Chapter 7 Contents
Learning Objectives for Chapter 7
After careful study of this chapter, you should be able to do the following:
1. Explain the general concepts of estimating the parameters of a population or a
probability distribution
2. Explain the important role of the normal distribution as a sampling distribution and
the central limit theorem
3. Explain important properties of point estimators, including bias, variance, and
mean squared error
4. Construct point estimators using the method of moments and the method of
maximum likelihood
5. Compute and explain the precision with which a parameter is estimated
6. Construct a point estimator using the Bayesian approach

Chapter 7 Learning Objectives

Point Estimation
• A point estimate is a reasonable value of a population parameter.
• X₁, X₂, …, Xₙ are random variables.
• Functions of these random variables, such as the sample mean X̄ and the sample
variance S², are also random variables, called statistics.
• Each statistic has its own probability distribution, called a sampling distribution.
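To make the idea concrete, here is a minimal sketch (Python, with hypothetical data values) of computing the usual point estimates of μ and σ² from an observed sample:

```python
import numpy as np

# Hypothetical sample of n = 8 observations from some population
x = np.array([12.6, 11.9, 13.2, 12.1, 12.8, 13.0, 11.7, 12.4])

x_bar = x.mean()      # point estimate of the population mean mu
s2 = x.var(ddof=1)    # point estimate of sigma^2 (n - 1 divisor)

print(f"x-bar = {x_bar:.3f}, s^2 = {s2:.3f}")
```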

Sec 7.1 Point Estimation

Point Estimator
A point estimate of some population parameter θ is a single numerical value θ̂ of a
statistic Θ̂. The statistic Θ̂ is called the point estimator.

Sec 7.1 Point Estimation

Some Parameters & Their Statistics
• There may be several choices for the point estimator of a parameter.
• To estimate the mean of a population,
we could choose the:
▪ Sample mean.
▪ Sample median.
▪ Average of the largest & smallest
observations in the sample.

Sec 7.1 Point Estimation

Some Definitions
• The random variables X₁, X₂, …, Xₙ are a random sample of size n if:
a) the Xᵢ's are independent random variables, and
b) every Xᵢ has the same probability distribution.
• A statistic is any function of the observations in a random
sample.
• The probability distribution of a statistic is called a sampling
distribution.

Sec 7.2 Sampling Distributions and the Central Limit Theorem

Central Limit Theorem
If X₁, X₂, …, Xₙ is a random sample of size n taken from a population (either finite or
infinite) with mean μ and finite variance σ², and if X̄ is the sample mean, then the
limiting form of the distribution of
Z = (X̄ − μ) / (σ/√n)
as n → ∞ is the standard normal distribution.
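The theorem is easy to see in a simulation. The sketch below (an illustration, not from the text) draws many samples from a strongly skewed exponential population and checks that the standardized sample mean already behaves almost like a standard normal variable at n = 30:

```python
import numpy as np

rng = np.random.default_rng(7)

# Exponential population: skewed, with mean = 1 and variance = 1
n, reps = 30, 100_000
samples = rng.exponential(scale=1.0, size=(reps, n))

# Standardize each sample mean: Z = (X-bar - mu) / (sigma / sqrt(n))
z = (samples.mean(axis=1) - 1.0) / (1.0 / np.sqrt(n))

# A standard normal puts about 95% of its mass in (-1.96, 1.96)
print(np.mean(np.abs(z) < 1.96))   # close to 0.95
```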

Sec 7.2 Sampling Distributions and the Central Limit Theorem

Example 7.2 | Central Limit Theorem

Sec 7.2 Sampling Distributions and the Central Limit Theorem

Sampling Distribution of a Difference in Sample Means
If we have two independent populations with means μ₁ and μ₂ and variances σ₁² and σ₂²,
and if X̄₁ and X̄₂ are the sample means of two independent random samples of sizes n₁ and
n₂ from these populations, then the sampling distribution of
Z = (X̄₁ − X̄₂ − (μ₁ − μ₂)) / √(σ₁²/n₁ + σ₂²/n₂)
is approximately standard normal if the conditions of the central limit theorem apply.

Sec 7.2 Sampling Distributions and the Central Limit Theorem

Unbiased Estimators
The point estimator Θ̂ is an unbiased estimator of the parameter θ if E(Θ̂) = θ. If the
estimator is biased, the difference E(Θ̂) − θ is called the bias of the estimator Θ̂.

Sec 7.3.1 Unbiased Estimators

Example 7.3 | Sample Mean and Variance
are Unbiased
• Suppose X is a random variable with mean μ and variance σ². Let X₁, X₂, …, Xₙ be a
random sample of size n from the population represented by X.
• Show that the sample mean X̄ and sample variance S² are unbiased estimators of μ and
σ², respectively.
• For the mean: E(X̄) = (1/n) Σᵢ E(Xᵢ) = (1/n)(nμ) = μ. A similar (longer) expectation
calculation shows that E(S²) = σ².
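The claim can also be checked by simulation. In this sketch (illustrative values μ = 5 and σ² = 4), averaging X̄ and S² over many samples recovers μ and σ², while dividing by n instead of n − 1 underestimates σ²:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n, reps = 5.0, 2.0, 10, 200_000

data = rng.normal(mu, sigma, size=(reps, n))

print(data.mean(axis=1).mean())          # ~ 5.0: E(X-bar) = mu
print(data.var(axis=1, ddof=1).mean())   # ~ 4.0: E(S^2) = sigma^2
print(data.var(axis=1, ddof=0).mean())   # ~ 3.6: n divisor is biased low
```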

Sec 7.3.1 Unbiased Estimators

Variance of a Point Estimator
If we consider all unbiased estimators of θ, the one with the smallest variance is called
the minimum variance unbiased estimator (MVUE). For a random sample from a normal
distribution, the sample mean X̄ is the MVUE for μ.

Sec 7.3.2 Variance of a Point Estimator

Standard Error: Reporting a Point Estimate
The standard error of an estimator Θ̂ is its standard deviation, σ_Θ̂ = √V(Θ̂). If the
standard error involves unknown parameters that can be estimated, substituting those
estimates into σ_Θ̂ produces the estimated standard error. For the sample mean,
σ_X̄ = σ/√n.

Sec 7.3.3 Standard Error: Reporting a Point Estimate

Example 7.4 | Thermal Conductivity
• The following 10 measurements of thermal conductivity of Armco iron were obtained:

• A point estimate of the mean thermal conductivity at 100 ℉ and 550 watts is the sample
mean, x̄ = 41.924.
• Because σ is unknown, we may replace it with the sample standard deviation s = 0.284 to
obtain the estimated standard error of x̄:

$$\hat{\sigma}_{\bar{X}} = \frac{s}{\sqrt{n}} = \frac{0.284}{\sqrt{10}} = 0.0898$$

• Practical Interpretation: The standard error is about 0.2 percent of the sample mean,
implying that we have obtained a relatively precise point estimate of thermal
conductivity. If we can assume that thermal conductivity is normally distributed, two
times the standard error is 2σ̂_x̄ = 2(0.0898) = 0.1796, and we are highly confident that
the true mean thermal conductivity lies in the interval 41.924 ± 0.1796, or between
41.744 and 42.104.
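The estimated standard error follows directly from s and n; a one-line check:

```python
import math

n, s = 10, 0.284
se = s / math.sqrt(n)
print(round(se, 4))   # 0.0898, so 2 * se = 0.1796
```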

Sec 7.3.3 Standard Error: Reporting a Point Estimate


Bootstrap Standard Error
• For some of the standard probability distributions, such as the exponential and Weibull
distributions, there is no simple formula for the standard error of a point estimator.
• The bootstrap is a computer-intensive technique for estimating a standard error in such
cases.
• The bootstrap procedure uses the computer to generate B bootstrap samples from the
fitted probability distribution and calculates the bootstrap estimate θ̂ᵢ* from each one.
• Sample mean of the bootstrap estimates:

$$\bar{\theta}^{*} = \frac{1}{B}\sum_{i=1}^{B}\hat{\theta}_i^{*}$$

• Bootstrap standard error (the sample standard deviation of the bootstrap estimates):

$$s_{\hat{\theta}} = \sqrt{\frac{1}{B-1}\sum_{i=1}^{B}\left(\hat{\theta}_i^{*} - \bar{\theta}^{*}\right)^{2}}$$
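Below is a minimal sketch of the parametric bootstrap for an exponential sample (the data are simulated and the estimator is the rate estimate λ̂ = 1/x̄, both chosen for illustration); it follows the two formulas above with B bootstrap samples:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical observed sample, assumed exponential
x = rng.exponential(scale=2.0, size=25)
lam_hat = 1.0 / x.mean()              # estimate of the rate from the data

# Parametric bootstrap: resample from the *fitted* distribution
B = 2000
boot = rng.exponential(scale=1.0 / lam_hat, size=(B, x.size))
lam_star = 1.0 / boot.mean(axis=1)    # bootstrap estimates, one per sample

theta_bar = lam_star.mean()                                  # sample mean
se_boot = np.sqrt(np.sum((lam_star - theta_bar) ** 2) / (B - 1))
print(se_boot)                        # bootstrap standard error of lam_hat
```

A larger B gives a more stable standard-error estimate at the cost of more computation.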

Sec 7.3.4 Bootstrap Standard Error

Mean Squared Error of an Estimator
The mean squared error of an estimator Θ̂ of the parameter θ is

$$\text{MSE}(\hat{\Theta}) = E(\hat{\Theta} - \theta)^2 = V(\hat{\Theta}) + (\text{bias})^2$$

That is, the mean squared error is equal to the variance of the estimator plus the squared bias.

Sec 7.3.5 Mean Squared Error of an Estimator

Mean Squared Error of an Estimator
Given two estimators Θ̂₁ and Θ̂₂ of θ, the relative efficiency of Θ̂₁ with respect to Θ̂₂ is
the ratio MSE(Θ̂₁)/MSE(Θ̂₂). If this relative efficiency is less than 1, we would conclude
that the first estimator is a more efficient estimator than the second, in the sense that
it has a smaller mean squared error.

An estimator that has a mean squared error that is less than or equal to the mean squared
error of any other estimator, for all values of the parameter θ, is called an optimal
estimator of θ. Optimal estimators rarely exist.
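As an illustration (a simulation sketch, not from the text), the following compares the mean squared errors of the sample mean and the sample median as estimators of a normal population mean:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, n, reps = 0.0, 15, 100_000

data = rng.normal(mu, 1.0, size=(reps, n))

mse_mean = np.mean((data.mean(axis=1) - mu) ** 2)
mse_median = np.mean((np.median(data, axis=1) - mu) ** 2)

# Relative efficiency MSE(mean) / MSE(median) is below 1: for normal
# data the sample mean is the more efficient estimator
print(mse_mean / mse_median)   # roughly 0.7
```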

Sec 7.3.5 Mean Squared Error of an Estimator

Methods of Point Estimation
• The general idea behind the method of moments is to equate population moments, which
are defined in terms of expected values, to the corresponding sample moments. The
population moments are functions of the unknown parameters, and solving the resulting
equations yields the moment estimators.
• The kth population moment is E(Xᵏ) and the kth sample moment is (1/n) Σᵢ₌₁ⁿ Xᵢᵏ, for
k = 1, 2, …

• The sample mean X̄ is the moment estimator of the population mean μ.

Sec 7.4.1 Method of Moments


Example 7.7 | Normal Distribution Moment
Estimators
• Suppose that X₁, X₂, …, Xₙ is a random sample from a normal distribution with
parameters μ and σ², where E(X) = μ and E(X²) = μ² + σ².
• Equating the first two population moments to the first two sample moments gives

$$\hat{\mu} = \bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i \qquad \text{and} \qquad \hat{\mu}^2 + \hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n} X_i^2$$

• Solving for σ̂²:

$$\hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n} X_i^2 - \bar{X}^2 = \frac{\sum_{i=1}^{n} X_i^2 - n\bar{X}^2}{n} = \frac{\sum_{i=1}^{n}\left(X_i - \bar{X}\right)^2}{n} \quad \text{(biased)}$$
Sec 7.4.1 Method of Moments
Example 7.8 | Gamma Distribution Moment
Estimators
Suppose that X₁, X₂, …, Xₙ is a random sample from a gamma distribution with parameters r
and λ, where E(X) = r/λ and E(X²) = r(r + 1)/λ². The moment estimators are

$$\hat{r} = \frac{\bar{X}^2}{(1/n)\sum_{i=1}^{n} X_i^2 - \bar{X}^2} \qquad\qquad \hat{\lambda} = \frac{\bar{X}}{(1/n)\sum_{i=1}^{n} X_i^2 - \bar{X}^2}$$

For a sample of n = 8 observations with x̄ = 21.646 and Σxᵢ² = 6645.4247:

$$\hat{r} = \frac{21.646^2}{(1/8)\,6645.4247 - 21.646^2} = 1.29 \qquad\qquad \hat{\lambda} = \frac{21.646}{(1/8)\,6645.4247 - 21.646^2} = 0.0598$$
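The gamma moment estimators translate directly into code. This sketch (the helper function and the simulated data are illustrative, not from the text) checks the estimators against known parameter values:

```python
import numpy as np

rng = np.random.default_rng(3)

def gamma_moment_estimates(x):
    """Method-of-moments estimates (r-hat, lambda-hat) for a gamma sample."""
    x = np.asarray(x, dtype=float)
    x_bar = x.mean()
    denom = np.mean(x ** 2) - x_bar ** 2   # (1/n) sum x_i^2 - x-bar^2
    return x_bar ** 2 / denom, x_bar / denom

# Check on simulated data with known r = 2, lambda = 0.5
x = rng.gamma(shape=2.0, scale=1.0 / 0.5, size=5000)
print(gamma_moment_estimates(x))   # close to (2.0, 0.5)
```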

Sec 7.4.1 Method of Moments


Method of Maximum Likelihood
Suppose that X is a random variable with probability distribution f(x; θ), where θ is a
single unknown parameter. Let x₁, x₂, …, xₙ be the observed values in a random sample of
size n. Then the likelihood function of the sample is
L(θ) = f(x₁; θ) · f(x₂; θ) ⋯ f(xₙ; θ)
The maximum likelihood estimator (MLE) of θ is the value of θ that maximizes the
likelihood function L(θ).

Sec 7.4.2 Method of Maximum Likelihood


Example 7.9 | Bernoulli Distribution MLE
• Let X be a Bernoulli random variable.
• The probability mass function is f(x; p) = pˣ(1 − p)¹⁻ˣ, x = 0, 1, where p is the
parameter to be estimated.
• The likelihood function of a random sample of size n is:

$$L(p) = p^{x_1}(1-p)^{1-x_1} \cdot p^{x_2}(1-p)^{1-x_2} \cdots p^{x_n}(1-p)^{1-x_n} = p^{\sum_{i=1}^{n} x_i}\,(1-p)^{\,n-\sum_{i=1}^{n} x_i}$$

$$\ln L(p) = \left(\sum_{i=1}^{n} x_i\right)\ln p + \left(n - \sum_{i=1}^{n} x_i\right)\ln(1-p)$$

$$\frac{d \ln L(p)}{dp} = \frac{\sum_{i=1}^{n} x_i}{p} - \frac{n - \sum_{i=1}^{n} x_i}{1-p}$$

• Setting this derivative equal to zero and solving for p gives the maximum likelihood
estimator p̂ = (1/n) Σᵢ₌₁ⁿ xᵢ = x̄.
Sec 7.4.2 Method of Maximum Likelihood
Example 7.11 | Normal Distribution MLEs for μ and σ²
Let X be a normal random variable with unknown mean μ and unknown variance σ². The
likelihood function of a random sample of size n is:

$$L(\mu, \sigma^2) = \prod_{i=1}^{n} \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x_i-\mu)^2/(2\sigma^2)} = \frac{1}{(2\pi\sigma^2)^{n/2}}\, e^{-\frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\mu)^2}$$

Maximizing ln L(μ, σ²) gives the maximum likelihood estimators μ̂ = X̄ and
σ̂² = Σᵢ₌₁ⁿ(Xᵢ − X̄)²/n.

Conclusion: Once again, the maximum likelihood estimators are equal to the moment estimators.

Sec 7.4.2 Method of Maximum Likelihood


Properties of the Maximum Likelihood Estimator
Under very general and nonrestrictive conditions, when the sample size n is large, the
maximum likelihood estimator θ̂ of a parameter θ:
1. is an approximately unbiased estimator for θ,
2. has a variance that is nearly as small as the variance that could be obtained with any
other estimator, and
3. has an approximate normal distribution.

Sec 7.4.2 Method of Maximum Likelihood


Invariance Property
Let θ̂₁, θ̂₂, …, θ̂ₖ be the maximum likelihood estimators of the parameters θ₁, θ₂, …, θₖ.
Then the maximum likelihood estimator of any function h(θ₁, θ₂, …, θₖ) of these
parameters is the same function h(θ̂₁, θ̂₂, …, θ̂ₖ) of the estimators.

Sec 7.4.2 Method of Maximum Likelihood


Example 7.12

Sec 7.4.2 Method of Maximum Likelihood


Complications in Using Maximum
Likelihood Estimation
The method of maximum likelihood is an excellent technique; however, there are two
complications:

1. It may not be easy to maximize the likelihood function, because the equation obtained
by setting the derivative of ln L(θ) to zero may be difficult to solve algebraically.
2. It may not always be possible to use calculus methods directly to determine the
maximum of L(θ). In such cases the likelihood can be maximized numerically, as in the
sketch below.
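For instance (an illustrative sketch, not from the text), the shape parameter β of a Weibull distribution with unit scale has a likelihood equation with no closed-form solution, but the log-likelihood is easily maximized numerically:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(11)

# Hypothetical sample from a unit-scale Weibull with shape beta = 2
x = rng.weibull(2.0, size=100)

def neg_log_lik(beta):
    # log f(x; beta) = ln(beta) + (beta - 1) ln(x) - x**beta  (scale = 1)
    return -np.sum(np.log(beta) + (beta - 1) * np.log(x) - x ** beta)

res = minimize_scalar(neg_log_lik, bounds=(0.1, 10.0), method="bounded")
print(res.x)   # numerical MLE of beta, close to 2
```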

Sec 7.4.2 Method of Maximum Likelihood


Bayesian Estimation of Parameters
• Methods of statistical inference that interpret probabilities as relative frequencies
are called objective or frequentist methods.
• The Bayesian approach combines sample information with other information that may be
available prior to collecting the sample.
• The random variable X has a probability distribution that depends on a parameter θ,
denoted f(x|θ).
• Additional information about θ is summarized as f(θ), the prior distribution, with mean
μ₀ and variance σ₀². Probabilities associated with the prior distribution are subjective
probabilities.
• The joint distribution of the sample is f(x₁, x₂, …, xₙ|θ).
• The posterior distribution f(θ|x₁, x₂, …, xₙ) expresses our degree of belief about θ
after observing the sample data.

Sec 7.4.3 Bayesian Estimation of Parameters


Bayesian Estimation of Parameters
• The joint probability distribution of the sample and θ is
f(x₁, x₂, …, xₙ, θ) = f(x₁, x₂, …, xₙ|θ) ∙ f(θ)
• The marginal distribution of the sample is

$$f(x_1, x_2, \ldots, x_n) = \begin{cases} \displaystyle\sum_{\theta} f(x_1, x_2, \ldots, x_n, \theta), & \theta \text{ discrete} \\[2ex] \displaystyle\int_{-\infty}^{\infty} f(x_1, x_2, \ldots, x_n, \theta)\, d\theta, & \theta \text{ continuous} \end{cases}$$

• The desired posterior distribution is

$$f(\theta \mid x_1, x_2, \ldots, x_n) = \frac{f(x_1, x_2, \ldots, x_n, \theta)}{f(x_1, x_2, \ldots, x_n)}$$
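For a discrete θ these three formulas are a few lines of code. The sketch below (the prior, the candidate values, and the data are all hypothetical) computes the posterior for a normal mean restricted to three candidate values:

```python
import numpy as np
from scipy.stats import norm

# Hypothetical setup: theta takes one of three values, with a
# subjective prior f(theta); the data are N(theta, 1)
theta = np.array([0.0, 0.5, 1.0])
prior = np.array([0.25, 0.50, 0.25])
x = np.array([0.8, 1.1, 0.4, 0.9])    # observed sample

# f(x1,...,xn | theta) for each candidate theta
lik = np.array([norm.pdf(x, loc=t, scale=1.0).prod() for t in theta])

joint = lik * prior           # f(x1,...,xn, theta)
marginal = joint.sum()        # f(x1,...,xn): sum over the discrete theta
posterior = joint / marginal  # f(theta | x1,...,xn)
print(posterior)
```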

Sec 7.4.3 Bayesian Estimation of Parameters


Example 7.14a | Bayes Estimator for the
Mean of a Normal Distribution
• Let X₁, X₂, …, Xₙ be a random sample from a normal distribution with unknown mean μ and
known variance σ². Assume that the prior distribution for μ is normal with mean μ₀ and
variance σ₀²:

$$f(\mu) = \frac{1}{\sqrt{2\pi}\,\sigma_0}\, e^{-(\mu-\mu_0)^2/(2\sigma_0^2)} = \frac{1}{\sqrt{2\pi}\,\sigma_0}\, e^{-\left(\mu^2 - 2\mu_0\mu + \mu_0^2\right)/(2\sigma_0^2)}$$

• The joint probability distribution of the sample is:

$$f(x_1, x_2, \ldots, x_n \mid \mu) = \frac{1}{(2\pi\sigma^2)^{n/2}}\, e^{-\frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\mu)^2} = \frac{1}{(2\pi\sigma^2)^{n/2}}\, e^{-\frac{1}{2\sigma^2}\left(\sum_{i=1}^{n} x_i^2 - 2\mu\sum_{i=1}^{n} x_i + n\mu^2\right)}$$

Sec 7.4.3 Bayesian Estimation of Parameters


Example 7.14b | Bayes Estimator for the
Mean of a Normal Distribution
Now the joint probability distribution of the sample and μ is

$$f(x_1, \ldots, x_n, \mu) = f(x_1, \ldots, x_n \mid \mu) \cdot f(\mu)$$

Upon completing the square in the exponent, the posterior distribution of μ is seen to be
normal with mean

$$\tilde{\mu} = \frac{(\sigma^2/n)\,\mu_0 + \sigma_0^2\,\bar{x}}{\sigma_0^2 + \sigma^2/n}$$

and variance

$$\frac{\sigma_0^2\,(\sigma^2/n)}{\sigma_0^2 + \sigma^2/n}$$

The Bayes estimator of μ is the posterior mean μ̃.

Sec 7.4.3 Bayesian Estimation of Parameters


Example 7.14c | Bayes Estimator for the
Mean of a Normal Distribution
If the sample mean is x̄ = 0.75, the Bayes estimate of μ is the posterior mean μ̃ computed
from the formula in Example 7.14b.

Conclusion: Note that the maximum likelihood estimate of μ is x̄ = 0.75. The Bayes
estimate lies between the maximum likelihood estimate and the prior mean μ₀.
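A minimal sketch of the Bayes estimate in this form (only x̄ = 0.75 comes from the slide; the prior parameters, σ², and n below are hypothetical):

```python
def bayes_estimate_normal_mean(x_bar, n, sigma2, mu0, sigma02):
    """Posterior mean for a normal mean with a normal prior."""
    w = sigma2 / n    # variance of x-bar
    return (w * mu0 + sigma02 * x_bar) / (sigma02 + w)

# Prior N(mu0 = 0, sigma0^2 = 1); data variance sigma^2 = 1, n = 10
print(bayes_estimate_normal_mean(x_bar=0.75, n=10, sigma2=1.0,
                                 mu0=0.0, sigma02=1.0))
# 0.6818...: between the prior mean 0 and the MLE x-bar = 0.75
```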

Sec 7.4.3 Bayesian Estimation of Parameters


Important Terms and Concepts
• Bayes estimator
• Bias in parameter estimation
• Bootstrap method
• Central limit theorem
• Estimator versus estimate
• Likelihood function
• Maximum likelihood estimator
• Minimum variance unbiased estimator
• Moment estimator
• Normal distribution as the sampling distribution of a sample mean
• Normal distribution as the sampling distribution of the difference in two sample means
• Parameter estimation
• Point estimator
• Population or distribution moments
• Posterior distribution
• Prior distribution
• Sample moments
• Sampling distribution
• Standard error and estimated standard error of an estimator
• Statistic
• Statistical inference
• Unbiased estimator

Chapter 7 Important Terms and Concepts

