Lecture 4
Gamma distribution
A random variable X is said to possess a gamma probability distribution with parameters
a > 0 and b > 0 if it has the probability density function (PDF) given by:
f(x) = [1/(b^a Γ(a))] x^(a−1) e^(−x/b)  if x > 0,  and  f(x) = 0 otherwise.
The gamma density has two parameters, a and b. We denote this by Gamma(a, b).
The parameter a is called the shape parameter, and b is called the scale parameter.
If X is a gamma random variable with parameters a > 0 and b > 0, then E(X) = ab and Var(X) = ab².

Example The daily fuel consumption (in millions of gallons) at a certain airport can be modeled as a gamma random variable with a = 3 and b = 1.
(a) What is the probability that on a given day the fuel consumption will be less than
1 million gallons?
(b) Suppose the airport can store only 2 million gallons of fuel. What is the probability
that the fuel supply will be inadequate on a given day?
Solution Let X be the fuel consumption in millions of gallons on a given day at a certain
airport. Then, X ∼ Gamma(3, 1).
(a)

P(X < 1) = ∫₀¹ (1/2) x² e^(−x) dx ≈ 0.0803.
Thus, there is about an 8% chance that on a given day the fuel consumption will
be less than 1 million gallons.
(b) Because the airport can store only 2 million gallons, the fuel supply will be inadequate if the fuel consumption X is greater than 2. Thus,
P(X > 2) = ∫₂^∞ (1/2) x² e^(−x) dx ≈ 0.677.
We can conclude that there is about a 67.7% chance that the fuel supply of 2 million
gallons will be inadequate on a given day. So, if the model is correct, the airport
needs to store more than 2 million gallons of fuel.
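These probabilities can also be checked numerically. Below is a minimal sketch using SciPy's gamma distribution, whose shape argument plays the role of our a and whose scale argument plays the role of our b:

# Numerical check of the gamma probabilities above (sketch using SciPy).
from scipy.stats import gamma

X = gamma(a=3, scale=1)   # X ~ Gamma(3, 1): shape a = 3, scale b = 1
print(X.cdf(1))           # P(X < 1) ~ 0.0803
print(X.sf(2))            # P(X > 2) ~ 0.6767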
Limit Theorems
Limit theorems play a very important role in the study of probability theory and statistics.
We already observed that some binomial probabilities can be computed using the Poisson
distribution. They can also be computed using the normal distribution by employing
limiting arguments.
In this section, we discuss modes of convergence, the law of large numbers, and the
Central Limit Theorem. First, we present Chebyshev’s theorem, which is a useful result
for proving limit theorems.
Theorem 1 (Chebyshev’s Theorem). Let the random variable X have a mean µ and
standard deviation σ. Then, for any constant K > 0,

P(|X − µ| ≥ Kσ) ≤ 1/K².
Proof. We will work with the continuous case. By the definition of the variance of X,

σ² = E[(X − µ)²] = ∫₋∞^∞ (x − µ)² f(x) dx.

Restricting the integration to the region |x − µ| ≥ Kσ can only decrease the integral, and on that region (x − µ)² ≥ K²σ², so

σ² ≥ K²σ² [P(X ≤ µ − Kσ) + P(X ≥ µ + Kσ)] = K²σ² P(|X − µ| ≥ Kσ).

Dividing both sides by K²σ² gives

P(|X − µ| ≥ Kσ) ≤ 1/K²,

or equivalently,

P(|X − µ| < Kσ) ≥ 1 − 1/K².
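Chebyshev's bound holds for any distribution with finite variance, but it is often conservative. The following sketch compares the bound with empirical tail probabilities, assuming (as an arbitrary illustrative choice) X ∼ Exponential(1), for which µ = σ = 1:

# Compare Chebyshev's bound 1/K^2 with the actual tail probability (sketch).
# Assumes X ~ Exponential(1), so mu = sigma = 1.
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=1_000_000)
for K in (1.5, 2.0, 3.0):
    empirical = np.mean(np.abs(x - 1.0) >= K)   # P(|X - mu| >= K*sigma)
    print(K, empirical, 1.0 / K**2)             # empirical <= bound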
Example A random variable X has mean µ = 24 and variance σ² = 9. Use Chebyshev's theorem to find a lower bound for the probability that the random variable X assumes values between 16.5 and 31.5.
Solution: From Chebyshev's theorem, we have

P(µ − Kσ < X < µ + Kσ) ≥ 1 − 1/K².

Equating µ + Kσ = 31.5 and µ − Kσ = 16.5, with µ = 24 and σ = √9 = 3, we obtain K = 2.5. Hence, the probability is

P(16.5 < X < 31.5) ≥ 1 − 1/(2.5)² = 0.84.
Modes of Convergence
In this section, we discuss modes of convergence, namely, convergence in distribution (or
law) and convergence in probability, and their relationship.
Convergence in distribution
Let {Xn} be a sequence of random variables with distribution functions F_{Xn}. We say that {Xn} converges in distribution (or in law) to a random variable X if F_{Xn}(x) → F_X(x) at every point x at which F_X is continuous. We write Xn →d X; equivalently, F_{Xn} converges weakly to F_X, written F_{Xn} →w F_X.

Example Let X1, . . . , Xn be iid Uniform(0, θ) random variables, and let X(n) = max(X1, . . . , Xn) denote the largest order statistic, so that F_{X(n)}(x) = (x/θ)ⁿ for 0 ≤ x ≤ θ. We see that as n → ∞,

F_{X(n)}(x) → F_X(x) = 0 if x < θ, and 1 if x ≥ θ,

which is a distribution function (that of a random variable degenerate at θ). Thus, F_{X(n)}(x) →w F_X(x).
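A small simulation makes this convergence concrete. The sketch below uses the arbitrary choices θ = 2 and the evaluation point x = 1.9 < θ:

# Empirical check that X(n) = max of n Uniform(0, theta) draws
# concentrates at theta (sketch; theta = 2, x = 1.9 are arbitrary).
import numpy as np

rng = np.random.default_rng(0)
theta, x = 2.0, 1.9
for n in (10, 100, 1000):
    maxima = rng.uniform(0, theta, size=(10_000, n)).max(axis=1)
    print(n, np.mean(maxima <= x), (x / theta) ** n)   # empirical vs exact F(x)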
Convergence in probability
Let {Xn } be a sequence of random variables defined on some probability space (Ω, S, P ).
We say that the sequence {Xn } converges in probability to the random variable X if for
every ϵ > 0,
P (|Xn − X| > ϵ) → 0 as n → ∞.
We write Xn →P X.
Example Let {Xn } be a sequence of random variables with the probability mass function
(PMF)
P{Xn = 1} = 1/n,  P{Xn = 0} = 1 − 1/n.

Then

P(|Xn| > ϵ) = P{Xn = 1} = 1/n,  for 0 < ϵ < 1,

and

P(|Xn| > ϵ) = 0,  if ϵ ≥ 1.

It follows that

P(|Xn| > ϵ) → 0 as n → ∞,

and we conclude that Xn →P 0. That is, Xn converges in probability to a random variable X that is degenerate at 0, i.e., P(X = 0) = 1.
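This convergence is easy to see by simulation. A minimal sketch (with the arbitrary choice ϵ = 0.5):

# Monte Carlo estimate of P(|Xn| > eps) for the example above (sketch).
import numpy as np

rng = np.random.default_rng(0)
eps, reps = 0.5, 100_000
for n in (10, 100, 1000):
    xn = (rng.random(reps) < 1.0 / n).astype(float)   # Xn = 1 w.p. 1/n, else 0
    print(n, np.mean(np.abs(xn) > eps))               # ~ 1/n, tending to 0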
Remark
(a) Xn →P X implies Xn →d X, but the converse is not always true.

(b) Let k be a constant. Then Xn →d k iff Xn →P k.
Theorem 2 (Slutsky’s Theorem). Let {Xn , Yn }, n = 1, 2, . . . be a sequence of pairs of
random variables, and let c be a constant. Then:
(a) If Xn →d X and Yn →P c, then Xn + Yn →d X + c.

(b) If Xn →d X and Yn →P c with c ≠ 0, then Xn Yn →d cX. If c = 0, then Xn Yn →P 0.

(c) If Xn →d X and Yn →P c with c ≠ 0, then Xn /Yn →d X/c.
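Part (b) can be illustrated numerically. In the sketch below (all distributional choices are arbitrary), Xn is a standardized sum of uniforms, so Xn →d N(0, 1) by the CLT, and Yn is a sample mean converging in probability to c = 2; the product should then be approximately N(0, 4):

# Numerical illustration of Slutsky's theorem, part (b) (sketch).
import numpy as np

rng = np.random.default_rng(0)
n, reps = 200, 10_000
u = rng.random((reps, n))
xn = (u.sum(axis=1) - n / 2) / np.sqrt(n / 12)            # Xn ->d N(0, 1)
yn = rng.uniform(1.5, 2.5, size=(reps, n)).mean(axis=1)   # Yn ->P 2
print(np.std(xn * yn))   # ~ 2, consistent with Xn Yn ->d 2X with X ~ N(0, 1)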
Theorem 3 (Law of Large Numbers). Let X1, . . . , Xn be pairwise independent and identically distributed random variables with E(Xi) = µ and Var(Xi) = σ² < ∞. Then for any c > 0,

P(µ − c ≤ X̄ ≤ µ + c) ≥ 1 − σ²/(nc²),

and this probability approaches 1 as n → ∞. Equivalently, for any ϵ > 0,

P(|Sn/n − µ| ≥ ϵ) → 0 as n → ∞,

where X̄ = (1/n)(X1 + · · · + Xn) and Sn = X1 + · · · + Xn; that is,

Sn/n →P µ as n → ∞.
Proof. Since X1, . . . , Xn are independent and identically distributed (iid) random variables (a random sample), we know that:
Var(Sn) = nσ²,  and  Var(Sn/n) = σ²/n.

Also,

E(Sn/n) = µ.

By Chebyshev's theorem, for any ϵ > 0,

P(|Sn/n − µ| ≥ ϵ) ≤ σ²/(nϵ²).

Equivalently,

P(|Sn/n − µ| < ϵ) → 1 as n → ∞.
The law of large numbers states that if the sample size n is large, the sample mean is unlikely to deviate much from the mean of the distribution of X, which in statistics is called the population mean. In other words, we can start with a random experiment whose outcome cannot be predicted with certainty, and, by taking averages, obtain an experiment whose outcome can be predicted with a high degree of accuracy.
Exercise Consider n rolls of a balanced die. Let Xi be the outcome of the i-th roll, and let Sn = X1 + · · · + Xn. Show that, for any ϵ > 0,

P(|Sn/n − 7/2| ≥ ϵ) → 0 as n → ∞.
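A Monte Carlo sketch of this exercise (the sample sizes, ϵ = 0.1, and 100 replications are arbitrary choices):

# Estimate P(|Sn/n - 7/2| >= eps) for fair-die rolls (sketch).
import numpy as np

rng = np.random.default_rng(0)
eps = 0.1
for n in (100, 1_000, 100_000):
    rolls = rng.integers(1, 7, size=(100, n))       # 100 replications of n rolls
    means = rolls.mean(axis=1)                      # Sn/n for each replication
    print(n, np.mean(np.abs(means - 3.5) >= eps))   # tends to 0 as n grows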
Now we will discuss one of the most important results in probability theory: the Central Limit Theorem.

Theorem 4 (Central Limit Theorem). Let X1, X2, . . . be iid random variables with E(Xi) = µ and Var(Xi) = σ² < ∞, and let Sn = X1 + · · · + Xn. Then the distribution of

Zn = (Sn − nµ)/(σ√n)

converges to the standard normal distribution N(0, 1) as n → ∞.

Example Let X1, . . . , Xn be iid Bernoulli(p) random variables. Show that

Zn = (Sn − np)/√(npq)

is approximately normal for large n, where Sn = X1 + · · · + Xn and q = 1 − p.
Solution: For the given random variable Xi, we have E(Xi) = p and Var(Xi) = pq, so µ = p and σ = √(pq). Then

Zn = (Sn − nµ)/(σ√n) = (Sn − np)/√(npq).
Hence, by the Central Limit Theorem (CLT), the limiting distribution of Zn as n → ∞
is the standard normal probability distribution.
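The normal approximation can be verified by simulating Zn directly. A minimal sketch, with the arbitrary choices p = 0.3 and n = 1000:

# Simulate Zn = (Sn - np)/sqrt(npq) for Bernoulli(p) trials (sketch).
import numpy as np

rng = np.random.default_rng(0)
p, n, reps = 0.3, 1000, 50_000
sn = rng.binomial(n, p, size=reps)              # Sn for each replication
zn = (sn - n * p) / np.sqrt(n * p * (1 - p))
print(zn.mean(), zn.std())                      # ~ 0 and ~ 1
print(np.mean(zn <= 1.96))                      # ~ 0.975, as for N(0, 1)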
Example A soft-drink vending machine is set so that the amount of drink dispensed
is a random variable with a mean of 8 ounces and a standard deviation of 0.4 ounces.
What is the approximate probability that the average of 36 randomly chosen fills exceeds
8.1 ounces?
Solution: From the CLT,

(X̄ − 8)/(0.4/√36) ∼ N(0, 1), approximately.

Hence, from the normal table,

P(X̄ > 8.1) = P(Z > (8.1 − 8.0)/(0.4/√36)) = P(Z > 1.5) = 0.0668.
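The same number can be obtained directly (a sketch using SciPy's standard normal survival function):

# Check the normal-approximation probability above (sketch using SciPy).
from math import sqrt
from scipy.stats import norm

z = (8.1 - 8.0) / (0.4 / sqrt(36))
print(z, norm.sf(z))   # z = 1.5 and P(Z > 1.5) ~ 0.0668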