Subject - Statistics
Paper - Probability I
Module - Probability Generating Functions

1 Generating Functions
A fundamental principle of mathematics is to map a class of objects that are of interest into
a class of objects where computations are easier. This map can be one-to-one, as with linear
maps and matrices, or it may map only some properties uniquely, as with matrices and
determinants. In probability theory, in the second category fall quantities such as the median,
mean and variance of random variables. In the first category, we have characteristic functions,
Laplace transforms and probability generating functions. Generating functions are widely used
in mathematics, and play an important role in probability theory.
The power series
$$A(s) = a_0 + a_1 s + a_2 s^2 + a_3 s^3 + \cdots$$
is called a generating function of the sequence $\{a_n\}$ if $A(s)$ is convergent for all $s \in (-s_0, s_0)$ for
some $s_0 > 0$; $s_0$ is the radius of convergence.
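For concreteness, here is a small computational sketch (added to this write-up, not part of the original module, and using the SymPy library): for the constant sequence $a_n = 1$ the generating function is $\sum_{n \ge 0} s^n = 1/(1-s)$, valid for $|s| < 1$.

    # Minimal sketch: the generating function of the sequence a_n = 1
    # is 1/(1 - s) for |s| < 1.
    import sympy as sp

    s, n = sp.symbols('s n')
    A = sp.summation(s**n, (n, 0, sp.oo))   # closed form of sum_{n>=0} s^n
    print(sp.simplify(A))                   # Piecewise(1/(1 - s) for |s| < 1)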
Definition 1 Let $X$ be a non-negative, integer valued random variable. Suppose $P(X = j) = p_j$, $j = 0, 1, 2, \dots$. Define
$$P(s) = p_0 + p_1 s + p_2 s^2 + \cdots + p_j s^j + \cdots = \sum_{j=0}^{\infty} p_j s^j = E[s^X].$$
Then $P(s)$ is called the probability generating function of $X$. As the name suggests, this
function generates the probabilities for non-negative, integer valued random variables.
For instance, if $X$ is degenerate at a constant $c$, i.e. $P(X = c) = 1$, then $P(s) = E[s^X] = s^c$.
Example 2 Suppose $X \sim \text{Geometric}(p)$, i.e. $X$ is defined as the number of trials necessary to
get the first success. Then,
$$P(X = x) = p q^{x-1}, \quad x = 1, 2, \dots$$
$$P(s) = E[s^X] = \sum_{j=1}^{\infty} p q^{j-1} s^j = ps \sum_{j=1}^{\infty} (qs)^{j-1} = \frac{ps}{1 - qs},$$
provided $1 - qs > 0$, i.e. $s < \frac{1}{q}$.
Similarly, if $X \sim \text{Binomial}(n, p)$ with $q = 1 - p$, then
$$P(s) = \sum_{x=0}^{n} \binom{n}{x} (ps)^x q^{n-x} = (q + ps)^n.$$
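As a quick sanity check (a sketch added here, not part of the original text, with $p = 1/3$ and $n = 4$ chosen only as assumed example values), the power-series coefficients of the closed-form p.g.f.s above reproduce the corresponding mass functions.

    # Sketch: compare the coefficients of the closed-form p.g.f.s with the
    # Geometric(p) and Binomial(n, p) mass functions.
    import sympy as sp

    s = sp.symbols('s')
    p = sp.Rational(1, 3)
    q = 1 - p

    # Geometric(p): P(s) = ps/(1 - qs); coefficient of s^x should be p*q**(x-1)
    P_geom = p*s / (1 - q*s)
    series = sp.series(P_geom, s, 0, 6).removeO()
    for x in range(1, 6):
        assert series.coeff(s, x) == p*q**(x - 1)

    # Binomial(n, p): P(s) = (q + ps)**n; coefficient of s^x = C(n,x) p^x q^(n-x)
    n = 4
    P_binom = sp.expand((q + p*s)**n)
    for x in range(n + 1):
        assert P_binom.coeff(s, x) == sp.binomial(n, x) * p**x * q**(n - x)

    print("p.g.f. coefficients match the mass functions")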
(a) The generating function is defined at least for $|s| \le 1$. This is because it is
a power series with coefficients in $[0, 1]$. Moreover, $P(1) = \sum_{j=0}^{\infty} p_j = 1$.
(b) $P(s)$ converges absolutely and uniformly within its interval of convergence.
(c) $P(s)$ can be differentiated term-by-term within its interval of convergence.
$$P'(s) = \sum_{j} j p_j s^{j-1} \;\Rightarrow\; E[X] = \begin{cases} P'(1) & \text{if } \sum_{j} j p_j < \infty \\ \infty & \text{otherwise} \end{cases}$$
$$P''(s) = \sum_{j} j(j-1) p_j s^{j-2} \;\Rightarrow\; E[X(X-1)] = P''(1) \quad \text{if } \sum_{j} j(j-1) p_j < \infty.$$
(These formulas are illustrated in the sketch below.)
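To illustrate property (c) (a computational sketch added for this write-up, with the assumed value $p = 1/4$), the mean and variance of a Geometric($p$) variable can be read off from $P'(1)$ and $P''(1)$.

    # Sketch: factorial moments of Geometric(p) from derivatives of its p.g.f.
    # P(s) = ps/(1 - qs).  Expected results: E[X] = 1/p, Var(X) = q/p**2.
    import sympy as sp

    s = sp.symbols('s')
    p = sp.Rational(1, 4)
    q = 1 - p
    P = p*s / (1 - q*s)

    EX   = sp.diff(P, s).subs(s, 1)          # P'(1)  = E[X]
    EXX1 = sp.diff(P, s, 2).subs(s, 1)       # P''(1) = E[X(X-1)]
    Var  = sp.simplify(EXX1 + EX - EX**2)    # Var(X) = P''(1) + P'(1) - P'(1)^2

    print(EX)    # 4   (= 1/p)
    print(Var)   # 12  (= q/p**2)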
After defining the probability generating function of a non-negative, integer valued random
variable, it is natural to ask whether it characterises the probability distribution. Our next
theorem answers this question.

Theorem 1 Suppose $X$ and $Y$ are non-negative, integer valued random variables with probability generating functions $P_X(s)$ and $P_Y(s)$ respectively. Then $P_X(s) = P_Y(s)$ for all $s$ if and only if $P(X = k) = P(Y = k)$ for $k = 0, 1, 2, \dots$,
i.e. integer valued random variables have the same probability generating function if and only
if they have the same probability mass function.
Proof:
The radii of convergence of $P_X(s)$ and $P_Y(s)$ are $\ge 1$, so they have unique power series
expansions about the origin:
$$P_X(s) = \sum_{k=0}^{\infty} s^k P(X = k), \qquad P_Y(s) = \sum_{k=0}^{\infty} s^k P(Y = k).$$
If $P_X(s) = P_Y(s)$, these two power series have identical coefficients, i.e. $P(X = k) = P(Y = k)$ for all $k$. The converse is immediate from the definition.
Knowing the mass function of a random variable $X$, one can easily compute the p.g.f.
of $X$. But sometimes one might be interested in the converse of this procedure,
i.e. given the p.g.f. of a r.v. $X$, is it possible to compute $p_k = P(X = k)$? In this context we
have two options to choose from (both are illustrated in the sketch below):
Option 1: Expand $P_X(s)$ in a power series in $s$ and set $p_k =$ coefficient of $s^k$.
Option 2: Differentiate $P_X(s)$ $k$ times with respect to $s$, put $s = 0$, and divide by $k!$, i.e. $p_k = P_X^{(k)}(0)/k!$.
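As an illustration (a sketch added here, using the standard Poisson($\lambda$) p.g.f. $P(s) = e^{\lambda(s-1)}$, which also appears in the example of Section 3, with the assumed value $\lambda = 2$), both options recover the Poisson probabilities $p_k = e^{-\lambda}\lambda^k/k!$.

    # Sketch: recovering p_k = P(X = k) from a p.g.f., using the Poisson(lam)
    # p.g.f. P(s) = exp(lam*(s - 1)) as a test case.
    import sympy as sp

    s = sp.symbols('s')
    lam = sp.Rational(2)
    P = sp.exp(lam*(s - 1))

    for kk in range(5):
        # Option 1: coefficient of s^k in the power-series expansion of P(s)
        p_series = sp.series(P, s, 0, kk + 1).removeO().coeff(s, kk)
        # Option 2: k-th derivative at s = 0, divided by k!
        p_deriv = sp.diff(P, s, kk).subs(s, 0) / sp.factorial(kk)
        # Both must equal the Poisson mass function exp(-lam)*lam**k / k!
        expected = sp.exp(-lam) * lam**kk / sp.factorial(kk)
        assert sp.simplify(p_series - expected) == 0
        assert sp.simplify(p_deriv - expected) == 0

    print("both inversion options reproduce the Poisson probabilities")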
In practice, it is often more convenient to work with the p.g.f. than with the mass function. Suppose $X$ is a
random variable with values in the non-negative integers. Then the moments of $X$ are easily
found from the p.g.f. of $X$ by differentiating the function at $s = 1$. Our next theorem describes
the procedure.
Theorem 2 Let $X$ be a random variable with p.g.f. $P(s)$. Then the $r$-th derivative of $P(s)$ at $s = 1$ equals $E[X(X-1)\cdots(X-r+1)]$ for $r = 1, 2, \dots$, i.e.
$$P^{(r)}(1) = E[X(X-1)\cdots(X-r+1)].$$
Proof:
From the definition of $P(s)$ we can write
$$P'(s) = \frac{d}{ds} \sum_{k=0}^{\infty} s^k P(X = k) = \sum_{k=0}^{\infty} \frac{d}{ds} s^k P(X = k) \quad [\text{by interchanging the order of summation and differentiation}]$$
$$= \sum_{k=0}^{\infty} k s^{k-1} P(X = k).$$
Setting $s = 1$ gives $P'(1) = \sum_{k} k P(X = k) = E[X]$. Repeating the argument, $P^{(r)}(s) = \sum_{k} k(k-1)\cdots(k-r+1) s^{k-r} P(X = k)$, and putting $s = 1$ yields $P^{(r)}(1) = E[X(X-1)\cdots(X-r+1)]$.
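For a concrete case (a sketch added here, assuming the standard Poisson p.g.f.), Theorem 2 gives the factorial moments of $X \sim \text{Poisson}(\lambda)$ as $E[X(X-1)\cdots(X-r+1)] = \lambda^r$.

    # Sketch: factorial moments of Poisson(lam) via r-th derivatives of
    # P(s) = exp(lam*(s - 1)) at s = 1; each equals lam**r.
    import sympy as sp

    s, lam = sp.symbols('s lam', positive=True)
    P = sp.exp(lam*(s - 1))

    for r in range(1, 5):
        factorial_moment = sp.diff(P, s, r).subs(s, 1)
        assert sp.simplify(factorial_moment - lam**r) == 0
        print(f"r = {r}: E[X(X-1)...(X-r+1)] =", factorial_moment)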
Theorem 3 If $X$ and $Y$ are independent random variables, each taking values in the set $\{0, 1, 2, \dots\}$, then
their sum has p.g.f.
$$P_{X+Y}(s) = P_X(s) P_Y(s).$$
Proof:
By definition,
$$P_{X+Y}(s) = E[s^{X+Y}] = E[s^X s^Y] = E[s^X] E[s^Y] \quad [\text{since } X \text{ and } Y \text{ are independent}] = P_X(s) P_Y(s).$$
It follows from the above theorem that the sum $S_n = X_1 + X_2 + X_3 + \cdots + X_n$ of $n$ independent
random variables, each taking values in $\{0, 1, 2, \dots\}$, has p.g.f. given by
$$P_{S_n}(s) = P_{X_1}(s) P_{X_2}(s) \cdots P_{X_n}(s).$$
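A small numerical sketch (with assumed example distributions, added for illustration): since a p.g.f. is a power series whose coefficients are the probabilities, multiplying two p.g.f.s amounts to convolving the two mass functions.

    # Sketch: Theorem 3 in coefficient form.  Multiplying the p.g.f.s of two
    # independent variables (here a Binomial(3, 0.4) and a uniform variable on
    # {0,...,5}, chosen purely as examples) convolves their mass functions.
    import numpy as np
    from math import comb

    # pmfs as coefficient arrays: index k holds P(X = k)
    pmf_X = np.array([comb(3, k) * 0.4**k * 0.6**(3 - k) for k in range(4)])
    pmf_Y = np.full(6, 1/6)

    # Product of p.g.f.s <-> convolution of coefficient sequences
    pmf_sum_via_pgf = np.convolve(pmf_X, pmf_Y)

    # Direct computation of P(X + Y = k) by summing over all pairs
    pmf_sum_direct = np.zeros(len(pmf_X) + len(pmf_Y) - 1)
    for i, px in enumerate(pmf_X):
        for j, py in enumerate(pmf_Y):
            pmf_sum_direct[i + j] += px * py

    assert np.allclose(pmf_sum_via_pgf, pmf_sum_direct)
    print(pmf_sum_via_pgf.round(4))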
3 Random sum formula
Theorem 4 Let $X_1, X_2, \dots$ be independent random variables each taking values in $\{0, 1, 2, \dots\}$, and let $N$ be a random variable, independent of the $X_i$, also taking values in $\{0, 1, 2, \dots\}$, with p.g.f. $P_N(s)$.
If the $X_i$ are identically distributed with a common p.g.f. $P_X(s)$, then the sum
$$S_N = X_1 + X_2 + \cdots + X_N$$
has p.g.f. $P_{S_N}(s) = P_N(P_X(s))$.
Proof:
By definition,
$$P_{S_N}(s) = E[s^{X_1 + X_2 + \cdots + X_N}]$$
$$= \sum_{n=0}^{\infty} E[s^{X_1 + X_2 + \cdots + X_N} \mid N = n]\, P(N = n)$$
$$= \sum_{n=0}^{\infty} E[s^{X_1 + X_2 + \cdots + X_n}]\, P(N = n)$$
$$= \sum_{n=0}^{\infty} [P_X(s)]^n\, P(N = n)$$
$$= P_N(P_X(s)). \qquad (3.2)$$
Notes:
(a) Differentiating both sides of the relation $P_{S_N}(s) = P_N(P_X(s))$ we have
$P'_{S_N}(s) = P'_N(P_X(s))\, P'_X(s)$. Set $s = 1$; then
$$P'_{S_N}(1) = P'_N(P_X(1))\, P'_X(1) = P'_N(1)\, P'_X(1),$$
i.e. $E[S_N] = E[N]\, E[X_1]$.
(b) $\operatorname{Var}(S_N) = E[N]\operatorname{Var}(X) + \operatorname{Var}(N)\, E^2[X]$.
In particular, if $N \sim \text{Poisson}(\mu)$ and each $X_i \sim \text{Poisson}(\lambda)$, then $P_{S_N}(s) = P_N(P_X(s)) = e^{-\mu\left(1 - e^{-\lambda(1-s)}\right)}$.
For example, suppose a hen lays $N$ eggs, where $N \sim \text{Poisson}(\lambda)$. Each egg hatches with
probability $p$, independently of the other eggs. We are to find the probability distribution
of the number of chicks, $Z$. Here $Z = X_1 + X_2 + \cdots + X_N$ where $X_i \sim \text{Bernoulli}(p)$. Then
$$P_N(s) = e^{\lambda(s-1)}, \qquad P_X(s) = 1 - p + ps,$$
$$P_Z(s) = P_N(P_X(s)) = e^{\lambda p (s-1)},$$
which is the p.g.f. of a Poisson($\lambda p$) random variable, so $Z \sim \text{Poisson}(\lambda p)$.
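A Monte Carlo sketch of this example (added for illustration; $\lambda = 6$ and $p = 0.3$ are assumed values) supports the conclusion: the simulated chick counts have the mean and variance of Poisson($\lambda p$), consistent with notes (a) and (b) above.

    # Sketch: simulate the hen-and-eggs example and compare the random sum
    # Z = X_1 + ... + X_N (N ~ Poisson(lam), X_i ~ Bernoulli(p)) with Poisson(lam*p).
    import numpy as np

    rng = np.random.default_rng(0)
    lam, p, reps = 6.0, 0.3, 200_000

    N = rng.poisson(lam, size=reps)     # number of eggs per hen
    Z = rng.binomial(N, p)              # chicks: binomial thinning of N

    print("simulated mean:", Z.mean(), " theory E[Z] = lam*p =", lam * p)
    print("simulated var :", Z.var(),  " theory Var(Z) = lam*p =", lam * p)
    # Note (a): E[S_N] = E[N]E[X] = lam*p.
    # Note (b): Var(S_N) = E[N]Var(X) + Var(N)E[X]^2 = lam*p*(1-p) + lam*p**2 = lam*p.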
Theorem 5 Let $X$ be a non-negative, integer valued random variable with mass function $p_j = P(X = j)$ and p.g.f. $P(s)$, and let $q_j = P(X > j) = \sum_{k=j+1}^{\infty} p_k$, $j = 0, 1, 2, \dots$. Then the generating function $Q(s) = \sum_{j=0}^{\infty} q_j s^j$ of the tail probabilities satisfies
$$Q(s) = \frac{1 - P(s)}{1 - s} \quad \text{for } |s| < 1, \qquad \text{and} \qquad Q(1) = E[X].$$
Proof:
$$Q(s) = \sum_{j=0}^{\infty} q_j s^j = \sum_{j=0}^{\infty} \sum_{k=j+1}^{\infty} p_k s^j = \sum_{k=1}^{\infty} \sum_{j=0}^{k-1} p_k s^j$$
$$\{\text{since } j + 1 \le k < \infty,\ 0 \le j < \infty \iff 0 \le j \le k-1,\ k \ge 1\}$$
$$= \sum_{k=1}^{\infty} p_k \frac{1 - s^k}{1 - s} = \frac{1}{1 - s}\bigl(1 - P(s)\bigr).$$
Similarly,
$$Q(1) = \sum_{k=0}^{\infty} q_k = \sum_{k=0}^{\infty} \sum_{m=k+1}^{\infty} p_m = \sum_{m=1}^{\infty} \sum_{k=0}^{m-1} p_m = \sum_{m=1}^{\infty} m\, p_m = P'(1) = E[X]. \qquad (3.4)$$
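As a closing check (a sketch added here, with the Geometric($p$) distribution of Example 2 as the assumed test case), both identities of Theorem 5 can be verified symbolically: the tail probabilities are $q_j = P(X > j) = q^j$, so $Q(s) = 1/(1-qs)$ and $Q(1) = 1/p = E[X]$.

    # Sketch: Theorem 5 for the Geometric(p) example, where
    # P(s) = p*s/(1 - q*s), q_j = P(X > j) = q**j, hence Q(s) = 1/(1 - q*s).
    import sympy as sp

    s, p = sp.symbols('s p', positive=True)
    q = 1 - p
    P = p*s / (1 - q*s)
    Q = 1 / (1 - q*s)                     # generating function of the tails

    # Q(s) = (1 - P(s)) / (1 - s)
    assert sp.simplify(Q - (1 - P) / (1 - s)) == 0
    # Q(1) = E[X] = 1/p  (the tail-sum formula for the mean)
    assert sp.simplify(Q.subs(s, 1) - 1/p) == 0
    print("Theorem 5 identities hold for Geometric(p)")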