Summary:
1. The document discusses probability generating functions (PGFs), which are useful tools for characterising the probability distributions of non-negative integer-valued random variables.
2. A PGF is defined as the expected value of $s$ raised to the power of the random variable. Examples are provided of PGFs for common distributions such as the binomial, geometric and Poisson.
3. Key properties of PGFs are that they uniquely determine the probability mass function of a random variable, and that derivatives of the PGF provide moments of the random variable.

Subject - Statistics

Paper - Probability I
Module - Probability Generating Functions

1 Generating Functions

A fundamental principle of mathematics is to map a class of objects that are of interest into
a class of objects where computations are easier. This map can be one-to-one, as with linear
maps and matrices, or it may map only some properties uniquely, as with matrices and
determinants. In probability theory, quantities such as the median, mean and variance of
random variables fall into the second category. In the first category we have characteristic
functions, Laplace transforms and probability generating functions. Generating functions are
widely used in mathematics, and play an important role in probability theory.

A power series
$$A(s) = a_0 + a_1 s + a_2 s^2 + a_3 s^3 + \cdots$$
is called the generating function of the sequence $\{a_n\}$ if $A(s)$ is convergent for all $s \in (-s_0, s_0)$ for
some $s_0 > 0$; $s_0$ is the radius of convergence.
Definition 1 Let $X$ be a non-negative, integer-valued random variable. Suppose $P(X = j) = p_j$, $j = 0, 1, 2, \ldots$. Define
$$P(s) = p_0 + p_1 s + p_2 s^2 + \cdots = \sum_{j=0}^{\infty} p_j s^j = E[s^X].$$

Then $P(s)$ is called the probability generating function of $X$. As the name suggests, this
function generates the probabilities for non-negative, integer-valued random variables.
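As a quick numerical illustration of the definition (a sketch; the pmf below is an arbitrary example, not one from the text), one can evaluate $P(s) = \sum_j p_j s^j$ for a finite pmf and confirm that $P(1) = 1$:

```python
# Sketch of Definition 1: evaluate P(s) = sum_j p_j s^j for a
# finite pmf (the pmf values below are an arbitrary illustration).

def pgf(pmf, s):
    """P(s) = E[s^X] for a pmf given as a list with pmf[j] = P(X = j)."""
    return sum(p * s**j for j, p in enumerate(pmf))

pmf = [0.1, 0.3, 0.4, 0.2]   # P(X = 0), ..., P(X = 3)

total = pgf(pmf, 1.0)        # P(1) = sum of all probabilities = 1
value = pgf(pmf, 0.5)        # E[0.5^X]
```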

Example 1 (Constant r.v.) If $p_c = 1$ and $p_k = 0$ for $k \neq c$, i.e. $X$ is a constant r.v., then
$$P(s) = E[s^X] = s^c.$$
Example 2 Suppose $X \sim \mathrm{Geometric}(p)$, i.e. $X$ is defined as the number of trials necessary to
get the first success. Then, with $q = 1 - p$,
$$P(X = x) = p q^{x-1}, \qquad x = 1, 2, \ldots$$
$$P(s) = E[s^X] = \sum_{j=1}^{\infty} p q^{j-1} s^j = \frac{ps}{1 - qs}, \qquad 1 - qs > 0 \;\Rightarrow\; s < \frac{1}{q}.$$

In this case the radius of convergence is $s_0 = \dfrac{1}{q}$.

Example 3 Suppose $X \sim \mathrm{Bin}(n, p)$, i.e. $X$ is defined as the number of successes in $n$
Bernoulli trials. Then
$$P(s) = \sum_{x=0}^{n} \binom{n}{x} (ps)^x q^{n-x} = (ps + q)^n.$$

Example 4 Suppose $X \sim \mathrm{Poisson}(\lambda)$. Then
$$P(s) = E[s^X] = \sum_{x=0}^{\infty} e^{-\lambda} \frac{(\lambda s)^x}{x!} = e^{\lambda(s-1)}.$$

Example 5 Suppose $X \sim \text{Negative Binomial}(p)$, i.e. $X$ is the number of failures before the
$n$-th success. Then
$$P(s) = E[s^X] = \sum_{k=0}^{\infty} \binom{n+k-1}{n-1} p^n (qs)^k = \left(\frac{p}{1 - qs}\right)^n, \qquad |s| < \frac{1}{q}.$$

Here the radius of convergence is $s_0 = \dfrac{1}{q}$.
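The closed forms in Examples 2–4 can be checked against truncated versions of the defining series. A sketch (the parameter values are arbitrary choices, not from the text):

```python
import math

# Compare truncated series sum_j p_j s^j with the closed-form PGFs
# of the geometric, binomial and Poisson distributions.

def series_pgf(pmf_fn, s, terms=100):
    """Truncated sum of p_j s^j, with pmf_fn(j) = P(X = j)."""
    return sum(pmf_fn(j) * s**j for j in range(terms))

p, s = 0.3, 0.5
q = 1 - p
lam, n = 2.0, 10

# Geometric(p) on {1, 2, ...}: P(s) = ps / (1 - qs)
geom = series_pgf(lambda j: p * q**(j - 1) if j >= 1 else 0.0, s)
geom_closed = p * s / (1 - q * s)

# Bin(n, p): P(s) = (ps + q)^n
binom = series_pgf(lambda x: math.comb(n, x) * p**x * q**(n - x) if x <= n else 0.0, s)
binom_closed = (p * s + q)**n

# Poisson(lam): P(s) = exp(lam * (s - 1))
pois = series_pgf(lambda x: math.exp(-lam) * lam**x / math.factorial(x), s)
pois_closed = math.exp(lam * (s - 1))
```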

The probability generating function has some very interesting properties:

(a) The generating function is defined for at least $|s| < 1$. This is because it is
a power series with coefficients in $[0, 1]$. Moreover $P(1) = \sum_{j=0}^{\infty} p_j = 1$.

(b) $P(s)$ is absolutely and uniformly convergent within its interval of convergence.

(c) $P(s)$ can be differentiated term-by-term within its interval of convergence:
$$P'(s) = \sum_j j p_j s^{j-1} \;\Rightarrow\; E[X] = \begin{cases} P'(1) & \text{if } \sum_j j p_j < \infty \\ \infty & \text{otherwise} \end{cases}$$
$$P''(1) = E[X(X-1)] \quad \text{if } \sum_j j(j-1) p_j < \infty.$$

Having defined the probability generating function for a non-negative, integer-valued random
variable, it is natural to ask whether it characterises the probability distribution. Our next
theorem answers this question.

Theorem 1 (Uniqueness Theorem) If $X$ and $Y$ have PGFs $P_X(s)$ and $P_Y(s)$ respectively,
then
$$P_X(s) = P_Y(s) \text{ for all } s \quad \text{iff} \quad P(X = k) = P(Y = k) \text{ for } k = 0, 1, 2, \ldots,$$
i.e. non-negative integer-valued random variables have the same probability generating function if and only
if they have the same probability mass function.

PROOF:
The radii of convergence of $P_X(s)$ and $P_Y(s)$ are $\geq 1$, so both have unique power series
expansions about the origin:
$$P_X(s) = \sum_{k=0}^{\infty} s^k P(X = k), \qquad P_Y(s) = \sum_{k=0}^{\infty} s^k P(Y = k).$$

If $P_X(s) = P_Y(s)$, then these two power series have identical coefficients. $\square$

Knowing the mass function of a random variable $X$, one can easily compute the p.g.f.
of $X$. But sometimes one might be interested in the converse: given the p.g.f. of a r.v. $X$, is it possible to compute $p_k = P(X = k)$? In this context we
have two options to choose from:

Option 1: Expand $P_X(s)$ in a power series in $s$ and set $p_k = $ coefficient of $s^k$.

Option 2: Differentiate $P_X(s)$ $k$ times with respect to $s$, put $s = 0$ and divide by $k!$.
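When the PGF is a polynomial, Option 1 can be carried out mechanically. A sketch for the $\mathrm{Bin}(n, p)$ PGF $(ps + q)^n$ (parameter values are arbitrary): multiply out the polynomial and read off the coefficient of $s^k$, which should equal $\binom{n}{k} p^k q^{n-k}$.

```python
import math

# Option 1 for the Bin(n, p) PGF: expand (ps + q)^n as a polynomial
# in s and read off the coefficients, which recover the pmf.

def poly_mul(a, b):
    """Multiply two polynomials given as coefficient lists (index = power of s)."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

p, n = 0.4, 5
q = 1 - p

coeffs = [1.0]                          # the constant polynomial 1
for _ in range(n):
    coeffs = poly_mul(coeffs, [q, p])   # multiply by (q + p s)

# coefficient of s^k should equal p_k = C(n, k) p^k q^(n-k)
pmf = [math.comb(n, k) * p**k * q**(n - k) for k in range(n + 1)]
```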
In practice, it is often more convenient to work with the p.g.f. than with the mass function. Suppose $X$ is a
random variable taking values in the non-negative integers. Then the moments of $X$ are easily
found from the p.g.f. of $X$ by differentiating the function at $s = 1$. Our next theorem makes
this precise.

Theorem 2 Let $X$ be a random variable with PGF $P(s)$. Then the $r$-th derivative of $P(s)$ at $s = 1$ equals $E[X(X-1)\cdots(X-r+1)]$ for $r = 1, 2, \ldots$, i.e.
$$P^{(r)}(1) = E[X(X-1)\cdots(X-r+1)].$$
PROOF:
From the definition of $P(s)$ we can write
$$P'(s) = \frac{d}{ds} \sum_{k=0}^{\infty} s^k P(X = k) = \sum_{k=0}^{\infty} \frac{d}{ds}\, s^k P(X = k) \quad [\text{by interchanging the order of summation and differentiation}]$$
$$= \sum_{k=0}^{\infty} k s^{k-1} P(X = k).$$

Now putting $s = 1$ in $P'(s)$ we get
$$P'(1) = \sum_{k=0}^{\infty} k\, P(X = k) = E[X]. \qquad \square$$
The case $r > 1$ follows similarly by repeated differentiation.
Note: In the proof of the above theorem we interchanged the differential operator and the
summation. Is this permissible? The answer is yes: for $|s| < 1$, using Abel's lemma one can
show that this interchange is valid.
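Theorem 2 can be illustrated numerically for the Poisson PGF $P(s) = e^{\lambda(s-1)}$ of Example 4: finite-difference derivatives at $s = 1$ should approximate $E[X] = \lambda$ and $E[X(X-1)] = \lambda^2$. A sketch ($\lambda$ and the step size are arbitrary choices):

```python
import math

# Factorial moments of Poisson(lam) from its PGF P(s) = exp(lam*(s-1)),
# approximated by central finite differences at s = 1.

lam = 3.0

def P(s):
    return math.exp(lam * (s - 1))

h = 1e-5
d1 = (P(1 + h) - P(1 - h)) / (2 * h)           # ~ P'(1)  = E[X]        = lam
d2 = (P(1 + h) - 2 * P(1) + P(1 - h)) / h**2   # ~ P''(1) = E[X(X-1)]   = lam**2
```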

2 Sums of independent random variables

Theorem 3 If $X$ and $Y$ are independent random variables, each taking values in the set $\{0, 1, 2, \ldots\}$, then
their sum has p.g.f.
$$P_{X+Y}(s) = P_X(s)\, P_Y(s).$$
PROOF:
By definition,
$$P_{X+Y}(s) = E[s^{X+Y}] = E[s^X s^Y] = E[s^X]\, E[s^Y] \quad [\text{since } X \text{ and } Y \text{ are independent}] = P_X(s)\, P_Y(s). \qquad \square$$
It follows from the above theorem that the sum $S_n = X_1 + X_2 + X_3 + \cdots + X_n$ of $n$ independent
random variables, each taking values in $\{0, 1, 2, \ldots\}$, has p.g.f. given by
$$P_{S_n}(s) = P_{X_1}(s)\, P_{X_2}(s) \cdots P_{X_n}(s).$$
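Since multiplying PGFs multiplies power series, Theorem 3 says the pmf of a sum is the convolution of the pmfs. A sketch using two fair dice (an illustration not taken from the text): the product of the two PGF coefficient lists gives the familiar pmf of the total.

```python
# Theorem 3 via polynomial multiplication: the pmf of the sum of two
# independent fair dice is the coefficient list of the PGF product.

def poly_mul(a, b):
    """Multiply two polynomials given as coefficient lists (index = power of s)."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

# PGF coefficients of one die: P(X = k) = 1/6 for k = 1..6
die = [0.0] + [1.0 / 6] * 6

two_dice = poly_mul(die, die)   # pmf of X + Y, indexed by the total
```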

3 Random sum formula

Theorem 4 Let $X_1, X_2, \ldots$ be independent random variables, each taking values in $\{0, 1, 2, \ldots\}$,
identically distributed with a common p.g.f. $P_X(s)$, and let $N$ be a non-negative integer-valued
random variable independent of the $X_i$. Then the sum
$$S_N = X_1 + X_2 + \cdots + X_N$$
has p.g.f. $P_{S_N}(s) = P_N(P_X(s))$.

PROOF:
By definition,
$$P_{S_N}(s) = E[s^{X_1 + X_2 + \cdots + X_N}]
= \sum_{n=0}^{\infty} E[s^{X_1 + X_2 + \cdots + X_N} \mid N = n]\, P(N = n)$$
$$= \sum_{n=0}^{\infty} E[s^{X_1 + X_2 + \cdots + X_n}]\, P(N = n)
= \sum_{n=0}^{\infty} [P_X(s)]^n\, P(N = n)
= P_N(P_X(s)). \qquad \square \quad (3.2)$$

Notes:
(a) Differentiating both sides of the relation $P_{S_N}(s) = P_N(P_X(s))$ we have
$$P'_{S_N}(s) = P'_N(P_X(s))\, P'_X(s).$$
Setting $s = 1$, since $P_X(1) = 1$,
$$P'_{S_N}(1) = P'_N(P_X(1))\, P'_X(1) = P'_N(1)\, P'_X(1),$$
i.e. $E[S_N] = E[N]\, E[X_1]$.

(b) $\mathrm{Var}(S_N) = E[N]\, \mathrm{Var}(X) + \mathrm{Var}(N)\, E^2[X]$.

(c) Special case: Compound Poisson distribution

Let $N \sim P(\mu)$, $X_1 \sim P(\lambda)$. Then the PGF of $S_N$ will be
$$e^{-\mu\left(1 - e^{-\lambda(1-s)}\right)}.$$

For example, suppose a hen lays $N$ eggs, where $N \sim \mathrm{Poisson}(\lambda)$. Each egg hatches with
probability $p$, independently of the other eggs. We are to find the probability distribution
of the number of chicks, $Z$. Here $Z = X_1 + X_2 + \cdots + X_N$ where $X_i \sim \mathrm{Ber}(p)$. Then
$$P_N(s) = e^{\lambda(s-1)}, \qquad P_X(s) = q + ps,$$
$$P_Z(s) = P_N(P_X(s)) = e^{\lambda p (s-1)},$$
so $Z \sim \mathrm{Poisson}(\lambda p)$.
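The hen-and-eggs conclusion can be checked by simulation: if $Z \sim \mathrm{Poisson}(\lambda p)$, the sample mean of the chick count should approach $\lambda p$. A sketch (the parameter values, seed and the Poisson sampler are illustrative choices, not from the text):

```python
import math
import random

# Simulate N ~ Poisson(lam) eggs, each hatching with probability p;
# the chick count Z should behave like Poisson(lam * p).

def sample_poisson(lam, rng):
    """Knuth's multiplication method; adequate for small lam."""
    limit = math.exp(-lam)
    k, prod = 0, 1.0
    while True:
        prod *= rng.random()
        if prod < limit:
            return k
        k += 1

lam, p = 6.0, 0.5
rng = random.Random(1)
trials = 100_000

total = 0
for _ in range(trials):
    n_eggs = sample_poisson(lam, rng)
    total += sum(1 for _ in range(n_eggs) if rng.random() < p)

mean_chicks = total / trials    # should be close to lam * p = 3.0
```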

Theorem 5

Suppose $q_j = P(X > j) = p_{j+1} + p_{j+2} + \cdots$. Define $Q(s) = \sum_{j=0}^{\infty} q_j s^j$. Then
$$Q(s) = \frac{1 - P(s)}{1 - s}, \qquad |s| < 1.$$

PROOF:
$$Q(s) = \sum_{j=0}^{\infty} q_j s^j = \sum_{j=0}^{\infty} \sum_{k=j+1}^{\infty} p_k s^j = \sum_{k=1}^{\infty} p_k \sum_{j=0}^{k-1} s^j$$
$$\{j + 1 \leq k < \infty,\; 0 \leq j < \infty \iff 0 \leq j \leq k - 1,\; k \geq 1\}$$
$$= \sum_{k=1}^{\infty} p_k\, \frac{1 - s^k}{1 - s} = \frac{1}{1 - s}\left(1 - P(s)\right). \qquad \square$$
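Theorem 5 is easy to verify numerically for the geometric distribution of Example 2, whose tail probabilities are $P(X > j) = q^j$. A sketch (parameter values are arbitrary):

```python
# Check Q(s) = (1 - P(s)) / (1 - s) for Geometric(p) on {1, 2, ...},
# where P(s) = ps / (1 - qs) and the tails are P(X > j) = q^j.

p, s = 0.3, 0.4
q = 1 - p

P = p * s / (1 - q * s)                            # PGF of Geometric(p)
Q_closed = (1 - P) / (1 - s)                       # Theorem 5's right-hand side
Q_series = sum(q**j * s**j for j in range(200))    # direct tail series sum_j q_j s^j
```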

Result: $E[X] = P'(1) = Q(1)$, since
$$Q(1) = \sum_{k=0}^{\infty} q_k = \sum_{k=0}^{\infty} \sum_{m=k+1}^{\infty} p_m = \sum_{m=1}^{\infty} \sum_{k=0}^{m-1} p_m = \sum_{m=1}^{\infty} m\, p_m = P'(1). \qquad \square \quad (3.4)$$
