
Chapter 5

Mathematical Expectation
(Univariate)
Pierre-Simon Laplace (1749-1827) was a French mathematician and astronomer whose work on the analytic theory of probability helped create the foundation for mathematical statistics. In two important papers in 1810 and 1811, Laplace first developed the characteristic function as a tool for large-sample theory and proved the first general central limit theorem. We use the moment generating function as a type of Laplace transform.
Contents
5.1 Introduction
5.2 Mathematical Expectation
5.3 Expectation of a Function of a Random Variable
5.4 Theorems on Expectation
5.5 Variance of a Random Variable
5.6 Effect of Change of Origin and Scale on Variance
5.7 Moments of a Random Variable
5.8 Relations between Raw Moments and Moments about 'a'
5.9 Relations between Raw Moments and Central Moments
5.10 Relations between Central Moments and Moments about 'a'
5.11 Effect of Change of Origin and Scale on Central Moments
5.12 Measures of Skewness and Kurtosis Based on Moments
5.13 Factorial Moments
5.14 Moment Generating Function (M.G.F.)
5.15 Properties of Moment Generating Function
5.16 Cumulant Generating Function (C.G.F.)
Key Words :
Mean, Variance of a random variable, Moment generating function, Cumulant generating function, Raw moments, Central moments, Factorial moments.
Objectives :
• Understand the concept of expectation of a random variable and its function.
• Learn the m.g.f. and c.g.f. and their properties.
• Compute raw and central moments of a random variable.
• Solve numerical problems on moments and compute coefficients of skewness and kurtosis.
5.1 Introduction
The probability distribution of a random variable (r.v.) specifies the chances (probabilities) of a r.v. taking different values. However, we might be interested in various characteristics of a probability distribution such as average, spread, symmetry, shape etc. In order to study these characteristics, statistical measures are developed. The development of measures such as mean, variance, moments, coefficients of skewness and kurtosis is on similar lines as that for a frequency distribution. The basis for all this is mathematical expectation. Mathematical expectation of a r.v. or its function provides a representative figure for the probability distribution. It takes into account probabilities of all possible values that the r.v. can take and summarizes them into a single average.
5.2 Mathematical Expectation
Definition : Let X be a discrete r.v. taking values x₁, x₂, ..., xₙ with probabilities p₁, p₂, ..., pₙ respectively. The mathematical expectation of X, denoted by E(X), is defined as,

E(X) = x₁p₁ + x₂p₂ + ... + xₙpₙ = Σ xᵢpᵢ

E(X) is also called the expected value of X, or the mean of X.
Remark 1 : E(X) is the arithmetic mean (A.M.) of X. To see this, let us consider the following frequency distribution of X.

x : x₁  x₂  ...  xₙ
f : f₁  f₂  ...  fₙ

We know that the A.M. is given by

A.M. = (f₁x₁ + f₂x₂ + ... + fₙxₙ)/N, where N = Σ fᵢ

     = Σ (fᵢ/N) xᵢ = Σ pᵢxᵢ = E(X)

where pᵢ = fᵢ/N, i = 1, 2, ..., n are the relative frequencies of x₁, x₂, ..., xₙ respectively. Thus in E(X), the relative frequencies are replaced by the probabilities of the respective values of X.
Remark 2 : If the p.m.f. is in functional form P(x), then E(X) = Σ x P(x).

Remark 3 : If a random variable takes countably infinite values, then E(X) = Σᵢ xᵢpᵢ, the sum running over i = 1, 2, ....

The expectation is well defined if the series Σ |xᵢ| pᵢ < ∞ (i.e. it is absolutely convergent). Otherwise we say E(X) does not exist.
Remark 4 : The value of E(X) may not be a possible value of the r.v. X. For example, when we toss a fair die, P(x) = 1/6 for x = 1, 2, ..., 6, where X = number observed on the face of the die.

Hence, E(X) = Σ x P(x) = (1/6)(1 + 2 + 3 + 4 + 5 + 6) = 3.5, which is not a possible value of X.
Remark 5: Arithmetic mean of X, i.e. E (X) is considered to be the centre of gravity of
the probability distribution of X. It is the average of values of X, if we perform the
experiment several times and observe a large number of values of X.
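The defining sum translates directly into a few lines of code. The short Python sketch below is our own illustration (the helper name expectation is not from the text); it evaluates E(X) for a finite discrete distribution, using the fair-die values of Remark 4.

```python
def expectation(values, probs):
    """E(X) = sum of x * P(x) over all values of a discrete r.v."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(x * p for x, p in zip(values, probs))

# Fair die: X = 1, ..., 6, each with probability 1/6 (Remark 4)
print(expectation(range(1, 7), [1/6] * 6))  # 3.5
```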
Illustrative Examples
Example 5.1 : Obtain the expectation of a r.v. X with the following probability distribution.

X    : 1    3    5    6
P(x) : 0.1  0.2  0.4  0.3

Solution :

x      P(x)   x P(x)
1      0.1    0.1
3      0.2    0.6
5      0.4    2.0
6      0.3    1.8
Total  1.0    4.5

E(X) = Σ x P(x) = 4.5
Example 5.2 : Obtain the expected value of the number of heads when three fair coins are tossed simultaneously.

Solution : We know that in this case

Ω = {HHH, HHT, HTH, THH, HTT, THT, TTH, TTT}

and hence, if X denotes the number of heads, the probability distribution of X is,

X    : 0    1    2    3
P(x) : 1/8  3/8  3/8  1/8

Accordingly, E(X) = Σ x P(x) = 0 + 3/8 + 6/8 + 3/8 = 12/8 = 3/2 = 1.5
Example 5.3 : A box contains 5 tickets. Two of the tickets carry a prize of ₹ 10 each, the other three carry prizes of ₹ 2 each. (i) If one ticket is drawn at random, what is the expected value of the prize ? (ii) If two tickets are drawn, without replacement, what is the expected value of the prize ?

Solution : (i) Let the tickets be numbered as 1, 2, 3, 4, 5.

Ω = {1, 2, 3, 4, 5}. Without loss of generality, let tickets numbered 1 and 2 carry prizes of ₹ 10 each and the others carry prizes of ₹ 2 each. Suppose X denotes the prize amount, then the following is the probability distribution of X.

Ticket : 1    2    3    4    5
X      : 10   10   2    2    2
P(x)   : 1/5  1/5  1/5  1/5  1/5

P(X = 2) = 3/5 ; P(X = 10) = 2/5

E(X) = Σ x P(x) = 2 × (3/5) + 10 × (2/5) = 26/5 = 5.2

The expected amount of prize is ₹ 5.20.
(ii) When we draw two tickets without replacement, the equiprobable sample space contains ⁵C₂ = 10 points. Now, let X denote the amount of prize when the experiment is performed. There are three possibilities.

(a) Both tickets drawn are of ₹ 10. This can happen in ²C₂ = 1 way, i.e. when tickets numbered 1 and 2 are drawn.

P(X = 20) = 1/10

(b) One ticket is of ₹ 10 and the other is of ₹ 2. This corresponds to ²C₁ × ³C₁ = 6 sample points in Ω.

P(X = 12) = 6/10

(c) Both tickets are of ₹ 2. There are ³C₂ = 3 ways in which this can happen.

P(X = 4) = 3/10

Hence, the probability distribution of X is

X    : 4     12    20
P(x) : 3/10  6/10  1/10

and E(X) = Σ x P(x) = 4 × (3/10) + 12 × (6/10) + 20 × (1/10) = 10.40

∴ The expected prize would be of ₹ 10.40.
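As a quick sanity check, the expectation in part (ii) can be approximated by simulating the draw. The Python sketch below is our own illustration, not part of the original text.

```python
import random

prizes = {1: 10, 2: 10, 3: 2, 4: 2, 5: 2}  # ticket number -> prize in rupees

# Estimate E(X) for two tickets drawn without replacement
trials = 100_000
total = sum(sum(prizes[t] for t in random.sample(list(prizes), 2))
            for _ in range(trials))
print(total / trials)  # close to 10.40
```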
Example 5.4 : There are three proposals before a manager to start a new project.
Proposal A : Profit of ₹ 50,000 with probability 0.6 or loss of ₹ 8,000 with probability 0.4.
Proposal B : Profit of ₹ 1,00,000 with probability 0.4 or otherwise a loss of ₹ 20,000.
Proposal C : Profit of ₹ 45,000 with probability 0.8, otherwise loss of ₹ 5,000.
Which proposal should the manager choose ? Justify.
Solution : Let X = profit in ₹. We assign a positive sign to profit and a negative sign to loss. We obtain the expected profit due to each proposal.

         Proposal A              Proposal B                Proposal C
xᵢ       pᵢ    xᵢpᵢ       xᵢ       pᵢ    xᵢpᵢ        xᵢ      pᵢ    xᵢpᵢ
50000    0.6   30000      100000   0.4   40000        45000   0.8   36000
-8000    0.4   -3200      -20000   0.6   -12000       -5000   0.2   -1000
E(X)           26800                     28000                      35000

Expected profits from proposals A, B, C are ₹ 26,800; ₹ 28,000 and ₹ 35,000 respectively. Since proposal C is expected to give the maximum profit, the manager should choose proposal C.

Example 5.5 : A r.v. X takes values 0, 1, 2, ..., n with probabilities proportional to the binomial coefficients ⁿC₀, ⁿC₁, ..., ⁿCₙ respectively. Find E(X).

Solution : Let pᵢ = P(X = i) = k ⁿCᵢ, where i = 0, 1, ..., n.

Since Σᵢ pᵢ = 1, we get k Σᵢ ⁿCᵢ = k 2ⁿ = 1

∴ k = 2⁻ⁿ

E(X) = Σ x P(x) = 2⁻ⁿ Σᵢ i ⁿCᵢ

Using the identity Σᵢ i ⁿCᵢ = n 2ⁿ⁻¹,

E(X) = 2⁻ⁿ × n 2ⁿ⁻¹ = n/2

Example 5.6 : Let the p.m.f. of a r.v. X be

P(x) = (3 - x)/10 ; x = -1, 0, 1, 2

Calculate E(X).

Solution : The probability distribution of X is

X    : -1   0    1    2
P(x) : 0.4  0.3  0.2  0.1

Hence, E(X) = Σ x P(x) = -0.4 + 0 + 0.2 + 0.2 = 0.

5.3 Expectation of a Function of a Random Variable


In an earlier chapter we have seen that if Y = g(X) is a function of a r.v. X, then Y is also a r.v., and its expectation can be computed directly from the probability distribution P(x) of X. Using this property we can define the expectation of Y as follows :

E(Y) = E[g(X)] = Σ g(x) P(x)

For example, suppose X has the following probability distribution.

X    : 0    1    2
P(x) : 0.3  0.3  0.4

Let Y = 2X + 3. Hence the values of Y are 3, 5, 7,

and E(Y) = Σ y P(x) = 0.9 + 1.5 + 2.8 = 5.2
The above concept is useful in deriving some important results.
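Computationally, E[g(X)] needs no separate tabulation of Y, since E[g(X)] = Σ g(x) P(x). A minimal Python sketch using the distribution above (our own illustration):

```python
xs = [0, 1, 2]
ps = [0.3, 0.3, 0.4]
g = lambda x: 2 * x + 3  # Y = 2X + 3

# E[g(X)] = sum of g(x) * P(x); no need to tabulate Y separately
ey = sum(g(x) * p for x, p in zip(xs, ps))
print(ey)  # 5.2
```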
5.4 Theorems on Expectation
Theorem 1 : The expected value of a constant is the constant itself. That is,

E(C) = C

Proof : Let {(xᵢ, pᵢ)}, i = 1, 2, ..., n denote the probability distribution of a discrete r.v. X. Let g(X) = C, a constant.

E(C) = E[g(X)] = Σ g(xᵢ) pᵢ = Σ C pᵢ = C Σ pᵢ = C, since Σ pᵢ = 1.
Theorem 2 : Effect of change of origin and scale on E(X).
(i) E(X + b) = E(X) + b
(ii) E(aX) = a E(X)
(iii) E(aX + b) = a E(X) + b

Proof : (i) Let g(X) = X + b.

E[g(X)] = Σ (xᵢ + b) pᵢ = Σ xᵢpᵢ + b Σ pᵢ

= E(X) + b

(ii) E(aX) = Σ a xᵢpᵢ = a Σ xᵢpᵢ = a E(X)

(iii) E(aX + b) = Σ (a xᵢ + b) pᵢ

= a Σ xᵢpᵢ + b Σ pᵢ

= a E(X) + b

Remark 1 : In particular, E(-X) = -E(X) and E(3X - 6) = 3 E(X) - 6 etc.

Remark 2 : If we define Y = (X - a)/h then E(Y) = [E(X) - a]/h, or E(X) = a + h E(Y), which is a property of the a.m. You have studied it in Paper I.
5.5 Variance of a Random Variable

The expected value of X, viz. E(X), provides a measure of central tendency of the probability distribution. However, it does not provide any idea regarding the spread of the distribution. For this purpose, the variance of a random variable is defined as follows.

Definition : Let X be a discrete r.v. with probability distribution (xᵢ, pᵢ), i = 1, ..., n. The variance of X, denoted by σ², is defined as,

σ² = Var(X) = E[X - E(X)]²

Note : (i) Var(X) is the expected value of the function g(X) = [X - E(X)]². The mean of X, viz. E(X), is generally denoted by μ. Using this notation, we can write

σ² = E(X - μ)²

(ii) The above formula for σ² is difficult to compute. For computational convenience, the following simplification is used.

σ² = Σ (xᵢ - μ)² pᵢ

= Σ xᵢ²pᵢ - 2μ Σ xᵢpᵢ + μ² Σ pᵢ

= E(X²) - 2μ² + μ²

= E(X²) - μ²

Thus, Var(X) = E(X²) - [E(X)]²

Remark 1 : Var(X) ≥ 0. This is because variance is the expected value of the square [X - E(X)]², which cannot be negative. Therefore, we get

E(X²) ≥ [E(X)]²

Remark 2 : The variance of X is zero if and only if X is a degenerate r.v. That is, X takes only one value with probability 1. For example, if P[X = C] = 1, then E(X) = C,

and σ² = E(X - C)² = (C - C)² · 1 = 0

Remark 3 : The positive square root of variance is called the standard deviation of X and is denoted by σ.

σ = √Var(X) = √E(X - μ)²

Standard deviation is used to compare variability between two distributions.
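Both the defining formula and the computational shortcut Var(X) = E(X²) - [E(X)]² are easy to check numerically. A small Python sketch (our own illustration):

```python
def variance(values, probs):
    mu = sum(x * p for x, p in zip(values, probs))       # E(X)
    ex2 = sum(x * x * p for x, p in zip(values, probs))  # E(X^2)
    return ex2 - mu ** 2                                 # Var(X) = E(X^2) - [E(X)]^2

# Fair die (Example 5.7): Var(X) = 91/6 - 3.5^2 = 2.9167
print(round(variance(range(1, 7), [1/6] * 6), 4))
```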
Illustrative Examples

Example 5.7 : Calculate the variance of X, if X denotes the number obtained on the face of a fair die.

Solution : We know that,

P(x) = 1/6 ; x = 1, 2, ..., 6

and E(X) = 3.5 (Ref. Remark 4 to 5.2)

Now, σ² = E(X²) - [E(X)]²

Consider, E(X²) = Σ x² P(x)

= (1/6) (1² + 2² + 3² + 4² + 5² + 6²)

= 91/6

σ² = Var(X) = 91/6 - (3.5)² = 2.9167
Example 5.8 : Obtain the variance of a r.v. X having the following p.m.f.

X    : 0     1     2    3    4     5
P(x) : 0.05  0.15  0.2  0.5  0.09  0.01

Solution :

xᵢ     pᵢ     xᵢpᵢ    xᵢ²pᵢ
0      0.05   0       0
1      0.15   0.15    0.15
2      0.20   0.40    0.80
3      0.50   1.50    4.50
4      0.09   0.36    1.44
5      0.01   0.05    0.25
Total         2.46    7.14

E(X) = Σ xᵢpᵢ = 2.46

Var(X) = Σ xᵢ²pᵢ - [E(X)]²

= 7.14 - (2.46)² = 1.0884
Example 5.9 : Compute the variance of X for the following probability distribution.

P(x) = x²/30 ; x = 0, 1, 2, 3, 4

Solution : E(X) = Σ x P(x)

= (1/30) Σ x³

= (1/30) (0 + 1 + 8 + 27 + 64)

= 100/30 = 10/3

E(X²) = (1/30) Σ x⁴

= (1/30) (1 + 16 + 81 + 256) = 354/30

= 11.8

Var(X) = E(X²) - [E(X)]²

= 11.8 - (10/3)²

= 0.6889
Example 5.10 : Consider the following probability distribution.

X    : 0    1       2
P(x) : p    1 - 2p  p

where 0 ≤ p ≤ 1/2. For what value of p is Var(X) maximum ?

Solution : E(X) = 0 · p + 1 · (1 - 2p) + 2 · p = 1

E(X²) = 1 · (1 - 2p) + 4p

= 1 + 2p

Var(X) = E(X²) - [E(X)]² = 1 + 2p - 1 = 2p

Since 0 ≤ p ≤ 1/2, Var(X) will be maximum when p = 1/2.

For p = 1/2 : Var(X) = 1.
5.6 Effect of Change of Origin and Scale on Variance

Theorem 3 : Let X be a discrete r.v. with mean μ and variance σ². Then,
(i) Var(X + b) = Var(X)
(ii) Var(aX) = a² Var(X) = a²σ²
(iii) Var(aX + b) = a²σ²

Proof : (i) By definition,

Var(X + b) = E[(X + b) - E(X + b)]²

= E[X + b - E(X) - b]²

= E[X - E(X)]² = σ²

Thus variance is invariant to the change of origin.

(ii) Var(aX) = E[aX - E(aX)]², by the definition of variance of a r.v.

= E[aX - a E(X)]²

= E[a (X - E(X))]²

= a² E[X - E(X)]²

= a²σ²

(iii) On similar lines,

Var(aX + b) = E[aX + b - E(aX + b)]²

= E[aX + b - a E(X) - b]²

= E[a (X - E(X))]²

= a² E[X - E(X)]²

= a²σ²

Thus variance is not invariant to the change of scale.

Remark 1 : If we define Y = (X - a)/h, then Var(X) = h² Var(Y).

Remark 2 : Let X be a r.v. with mean μ and s.d. σ. Define Y = (X - μ)/σ.

Then, E(Y) = (1/σ) [E(X) - μ] = 0

and Var(Y) = (1/σ²) Var(X) = σ²/σ² = 1.

∴ Y has mean 0 and variance 1. Therefore, Y = (X - μ)/σ is called a standardized r.v.

Remark 3 : If Y = aX, 'a' constant, then the standard deviation of Y is given by

σ_Y = |a| σ_X

We know that σ_Y² = a² σ_X², and s.d. is defined to be the positive square root of variance.

∴ σ_Y = |a| σ_X

Thus, Var(-3X + 5) = 9 Var(X)

and s.d.(-3X + 5) = 3 s.d.(X).

Theorem 4 : The variance of a constant is zero.

Proof :

Var(c) = E(c²) - [E(c)]² = c² - c² = 0
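A numeric spot-check of Theorem 3, using the fair-die distribution with a = -3 and b = 5 (an illustrative Python sketch, not part of the text):

```python
# Check Var(aX + b) = a^2 Var(X) for the fair die, a = -3, b = 5
xs, ps = list(range(1, 7)), [1/6] * 6
a, b = -3, 5

def var(values, probs):
    mu = sum(x * p for x, p in zip(values, probs))
    return sum((x - mu) ** 2 * p for x, p in zip(values, probs))

print(var([a * x + b for x in xs], ps))   # 26.25
print(a ** 2 * var(xs, ps))               # 26.25, i.e. 9 * 35/12
```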
Illustrative Examples
Example 5.11 : The mean and variance of marks in Statistics (X) are 60 and 25 respectively. Find the mean and variance of

(i) Y = (X - 60)/5   (ii) Z = (X - 50)/10

Solution : (i) We know that, if Y = (X - a)/h then,

E(Y) = [E(X) - a]/h and V(Y) = V(X)/h², where V denotes variance.

Here, E(Y) = (60 - 60)/5 = 0

and Var(Y) = 25/25 = 1

Thus Y is a standardized variable of X.

(ii) E(Z) = [E(X) - 50]/10 = (60 - 50)/10 = 1

Var(Z) = V(X)/100 = 25/100 = 0.25

Example 5.12 : A r.v. X assumes n values 1, 2, ..., n with equal probability. If the ratio of Var(X) to E(X) is equal to 4, find the value of n. What will be the value of n if Var(X) = E(X) ?

Solution : The probability distribution of X is as follows :

X    : 1    2    ...  n
P(x) : 1/n  1/n  ...  1/n

Hence, E(X) = Σ x/n = (1/n) · n(n + 1)/2 = (n + 1)/2

E(X²) = Σ x²/n = (1/n) · n(n + 1)(2n + 1)/6 = (n + 1)(2n + 1)/6

Var(X) = E(X²) - [E(X)]²

= (n + 1)(2n + 1)/6 - (n + 1)²/4

= (n + 1)(n - 1)/12

= (n² - 1)/12

Given : Var(X)/E(X) = 4

[(n² - 1)/12] × [2/(n + 1)] = (n - 1)/6 = 4

∴ n = 25

To answer the second part of the problem, let

Var(X) = E(X)

(n² - 1)/12 = (n + 1)/2

(n - 1)/6 = 1

∴ n = 7
Example 5.13 : Let X be a discrete r.v. with mean 5 and s.d. 3. Compute the mean and s.d. of (i) 2X - 5, (ii) 3 - 7X, (iii) (X + 1)/2.

Solution : (i) Let Y = 2X - 5,

E(Y) = 2 E(X) - 5

= 10 - 5 = 5

σ_Y = |2| σ_X = 6

(ii) Let Y = 3 - 7X

E(Y) = 3 - 7 E(X) = 3 - 35 = -32

s.d.(Y) = σ_Y = |-7| σ_X = 7 × 3 = 21

(iii) Let Y = (X + 1)/2

E(Y) = (1/2) E(X) + 1/2 = (5 + 1)/2 = 3

s.d. of Y = σ_Y = (1/2) σ_X = 3/2 = 1.5
Example 5.14 : Prove that E(X - k)² = Var(X) + [E(X) - k]², where k is any constant.

Solution : Var(X) = Var(X - k) ... (From Theorem 3)

= E(X - k)² - [E(X - k)]²

= E(X - k)² - [E(X) - k]²

∴ E(X - k)² = Var(X) + [E(X) - k]²
5.7 Moments of a Random Variable

So far we have studied the mean and variance of a random variable. The mean measures central tendency, while the variance measures spread. In order to get complete information on the probability distribution, we also have to study the shape of the probability distribution. For example, we need measures of skewness (lack of symmetry) and kurtosis (peakedness) of a probability distribution. Moments of a random variable (or probability distribution) serve this purpose.

We shall study four types of moments of a r.v. in this chapter. Let (xᵢ, pᵢ), i = 1, 2, ..., n represent the probability distribution of a discrete r.v. X.

1. Moments about any arbitrary point 'a' : The rth moment of X about 'a' is denoted by μᵣ(a) and is defined as,

μᵣ(a) = E(X - a)ʳ = Σ (xᵢ - a)ʳ pᵢ ; r = 1, 2, 3, ...

In particular, μ₁(a) = E(X - a) = E(X) - a

μ₂(a) = E(X - a)²


2. Raw moments (Moments about the origin i.e. zero) : The rth raw moment of X is defined as the rth moment about 0. It is denoted by μ'ᵣ.

Hence, μ'ᵣ = μᵣ(0) = E(Xʳ) = Σ xᵢʳ pᵢ ; r = 1, 2, 3, ...

In particular, μ'₁ = E(X) = mean

μ'₂ = E(X²) = Σ xᵢ² pᵢ

μ'₃ = E(X³) = Σ xᵢ³ pᵢ

μ'₄ = E(X⁴) = Σ xᵢ⁴ pᵢ and so on.
3. Central moments (Moments about the arithmetic mean) : (April 2012) The rth central moment of X is defined as the rth moment of X about E(X). It is denoted by μᵣ.

Hence, μᵣ = E[X - E(X)]ʳ = Σ [xᵢ - E(X)]ʳ pᵢ ; r = 1, 2, 3, ...

In particular, μ₁ = E[X - E(X)] = E(X) - E(X) = 0

Thus, the first central moment is always zero.

μ₂ = E[X - E(X)]² = Var(X)

μ₃ = E[X - E(X)]³ and so on.
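These definitions are mechanical to compute. Below is an illustrative Python helper (the names are our own) returning the rth moment of a finite discrete distribution about any point; about = 0 gives raw moments and the default gives central moments.

```python
def moment(values, probs, r, about=None):
    """rth moment of X about 'about'; about=0 gives raw moments,
    about=E(X) (the default) gives central moments."""
    if about is None:
        about = sum(x * p for x, p in zip(values, probs))  # E(X)
    return sum((x - about) ** r * p for x, p in zip(values, probs))

xs, ps = range(1, 7), [1/6] * 6              # fair die
print(moment(xs, ps, 1, about=0))            # mu'_1 = 3.5
print(moment(xs, ps, 2))                     # mu_2 = 35/12 = 2.9167
```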
5.8 Relations Between Raw Moments and Moments About 'a'
Consider, μ₁(a) = E(X - a) = E(X) - a = μ'₁ - a

μ₂(a) = E(X - a)² = Σ (xᵢ - a)² pᵢ

= Σ (xᵢ² - 2axᵢ + a²) pᵢ

= Σ xᵢ²pᵢ - 2a Σ xᵢpᵢ + a² Σ pᵢ

= E(X²) - 2a E(X) + a²

= μ'₂ - 2a μ'₁ + a²

On similar lines we can prove the following :

μ₃(a) = μ'₃ - 3μ'₂ a + 3μ'₁ a² - a³

μ₄(a) = μ'₄ - 4μ'₃ a + 6μ'₂ a² - 4μ'₁ a³ + a⁴
5.9 Relations Between Raw Moments and Central Moments
μ₁ = 0

μ₂ = E[X - E(X)]² = E(X - μ'₁)²

It can be proved that :

μ₂ = μ'₂ - μ'₁²

μ₃ = E[X - E(X)]³ = E(X - μ'₁)³

It can be shown that :

μ₃ = μ'₃ - 3μ'₂ μ'₁ + 2μ'₁³

μ₄ = E[X - E(X)]⁴ = E(X - μ'₁)⁴

It can be shown that :

μ₄ = μ'₄ - 4μ'₃ μ'₁ + 6μ'₂ μ'₁² - 3μ'₁⁴

and so on.
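The first four conversions can be written directly in code; an illustrative sketch, checked against the fair-die raw moments worked out in Example 5.15 further on:

```python
def central_from_raw(m1, m2, m3, m4):
    """First four central moments from raw moments mu'_1 .. mu'_4."""
    mu2 = m2 - m1 ** 2
    mu3 = m3 - 3 * m2 * m1 + 2 * m1 ** 3
    mu4 = m4 - 4 * m3 * m1 + 6 * m2 * m1 ** 2 - 3 * m1 ** 4
    return mu2, mu3, mu4

# Fair die raw moments (Example 5.15): 21/6, 91/6, 441/6, 2275/6
print(central_from_raw(21/6, 91/6, 441/6, 2275/6))  # (2.9167, 0.0, 14.729)
```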
5.10 Relations between Central Moments and Moments about 'a' (Without Proof)

μ₁ = 0

μ₂ = μ₂(a) - [μ₁(a)]²

μ₃ = μ₃(a) - 3μ₂(a) μ₁(a) + 2[μ₁(a)]³

μ₄ = μ₄(a) - 4μ₃(a) μ₁(a) + 6μ₂(a) [μ₁(a)]² - 3[μ₁(a)]⁴
5.11 Effect of Change of Origin and Scale on Central Moments

Let X be a discrete r.v. with rth central moment μᵣ(x). Define Y = (X - a)/h. Then the rth central moment of Y, denoted by μᵣ(y) say, is given by,

μᵣ(y) = μᵣ(x)/hʳ, i.e. μᵣ(x) = hʳ μᵣ(y)

Proof : Y = (X - a)/h, so X = a + hY and E(X) = a + h E(Y).

Now, μᵣ(x) = E[X - E(X)]ʳ

= E[a + hY - a - h E(Y)]ʳ

= E[h (Y - E(Y))]ʳ

= hʳ E[Y - E(Y)]ʳ

= hʳ μᵣ(y).

Thus central moments are invariant to the change of origin, but not to the change of scale.
5.12 Measures of Skewness and Kurtosis Based on Moments

Concepts of skewness and kurtosis of a probability distribution are similar to those of a frequency distribution, which you study in Statistics Paper I. Skewness means the lack of symmetry of the probability distribution, while kurtosis means the peakedness of the distribution. Following are the measures based on moments.

1. Coefficient of skewness (γ₁) : The coefficient of skewness is defined as,

γ₁ = μ₃ / μ₂^(3/2), where μ₂ = σ², the variance.

The sign of γ₁ is that of μ₃.

If γ₁ = 0, the distribution is symmetric.
γ₁ > 0, the distribution is positively skewed.
γ₁ < 0, the distribution is negatively skewed.

2. Coefficient of kurtosis (γ₂) : The coefficient of kurtosis is defined as,

γ₂ = β₂ - 3 = μ₄/μ₂² - 3

γ₂ is also called the 'excess of kurtosis'.

If γ₂ = 0, the distribution is mesokurtic i.e. moderately peaked.
γ₂ > 0, the distribution is leptokurtic i.e. more peaked.
γ₂ < 0, the distribution is platykurtic i.e. less peaked.
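Combining the moment computations above with these definitions gives γ₁ and γ₂ in a few lines; an illustrative Python sketch:

```python
def skew_kurtosis(values, probs):
    mu = sum(x * p for x, p in zip(values, probs))
    m = lambda r: sum((x - mu) ** r * p for x, p in zip(values, probs))
    g1 = m(3) / m(2) ** 1.5    # coefficient of skewness
    g2 = m(4) / m(2) ** 2 - 3  # excess kurtosis
    return g1, g2

# Fair die: symmetric (g1 = 0) and platykurtic (g2 < 0)
print(skew_kurtosis(range(1, 7), [1/6] * 6))  # (0.0, -1.2686)
```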
5.13 Factorial Moments (April 2012)

Consider the following product.

X⁽ʳ⁾ = X(X - 1)(X - 2) ... (X - r + 1), r = 1, 2, ...

This product is read as 'X factorial r' and is denoted by X⁽ʳ⁾. It is the product of r factors starting from X and each time reducing a factor by 1.

For example, 10⁽³⁾ = 10 × 9 × 8

n⁽²⁾ = n(n - 1) etc.

For a r.v. X,

X⁽¹⁾ = X, X⁽²⁾ = X(X - 1), X⁽³⁾ = X(X - 1)(X - 2) etc.

Definition : rth factorial moment : Let X be a discrete r.v. taking values x₁, x₂, ..., xₙ with respective probabilities p₁, p₂, ..., pₙ. The rth factorial moment of X or of its probability distribution is denoted by μ₍ᵣ₎ and is defined as,

μ₍ᵣ₎ = E(X⁽ʳ⁾) = E[X(X - 1) ... (X - r + 1)]

= Σ xᵢ (xᵢ - 1) ... (xᵢ - r + 1) pᵢ ; r = 1, 2, 3, ...

In particular, μ₍₁₎ = E(X⁽¹⁾) = E(X) = μ'₁

μ₍₂₎ = E[X(X - 1)] = Σ xᵢ(xᵢ - 1) pᵢ

= E(X²) - E(X)

= μ'₂ - μ'₁

Similarly, one can prove that

μ₍₃₎ = μ'₃ - 3μ'₂ + 2μ'₁

μ₍₄₎ = μ'₄ - 6μ'₃ + 11μ'₂ - 6μ'₁.

Note : Sometimes, it is easier to compute factorial moments than raw or central moments. Using the relations between factorial and other moments as expressed above, the required types of moments can be computed.
Example 5.15 : Let a discrete r.v. X assume values 1, 2, ..., 6 with equal probability. (i) Find the first two factorial moments of X and hence the mean and variance. (ii) Find the first four raw moments. (iii) Using the relation between raw and central moments, find the first three central moments. (iv) Also compute γ₁ and comment on the nature of the skewness of the distribution.

Solution : The probability distribution of X is

X    : 1    2    3    4    5    6
P(x) : 1/6  1/6  1/6  1/6  1/6  1/6

(i) μ₍₁₎ = E(X⁽¹⁾) = E(X) = Σ xᵢpᵢ = 21/6

μ₍₂₎ = E[X(X - 1)] = Σ xᵢ(xᵢ - 1) pᵢ

= 0 + 2·1/6 + 3·2/6 + 4·3/6 + 5·4/6 + 6·5/6

= 70/6

∴ Mean = μ'₁ = μ₍₁₎ = 21/6 = 3.5

Variance = μ'₂ - μ'₁² = μ₍₂₎ + μ₍₁₎ - μ₍₁₎²

= 70/6 + 21/6 - (21/6)² = 35/12

(ii) Raw moments :

μ'₁ = 21/6, μ'₂ = μ₍₂₎ + μ₍₁₎ = 91/6, μ'₃ = Σ xᵢ³/6 = 441/6, μ'₄ = Σ xᵢ⁴/6 = 2275/6

(iii) Central moments :

μ₁ = 0, μ₂ = 35/12

μ₃ = μ'₃ - 3μ'₂ μ'₁ + 2μ'₁³

= 441/6 - 3 (91/6)(21/6) + 2 (21/6)³ = 0

(iv) γ₁ = μ₃ / μ₂^(3/2) = 0 / (35/12)^(3/2) = 0

∴ The distribution is symmetric.
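A short Python check of these factorial-moment computations (our own illustrative sketch):

```python
from math import prod

def factorial_moment(values, probs, r):
    """mu_(r) = E[X(X-1)...(X-r+1)]"""
    return sum(prod(x - j for j in range(r)) * p
               for x, p in zip(values, probs))

xs, ps = range(1, 7), [1/6] * 6
m1, m2 = factorial_moment(xs, ps, 1), factorial_moment(xs, ps, 2)
print(m1)                 # 3.5 = 21/6
print(m2 + m1 - m1 ** 2)  # variance = 35/12 = 2.9167
```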
5.14 Moment Generating Function (M.G.F.)

The moment generating function is an elegant way to find moments of probability distributions. Moreover, it is handy in deriving the probability distribution of functions of random variables. One can verify using the M.G.F. whether two or more random variables are independent. Thus, the M.G.F. is useful in many ways in distribution theory.

Definition : Suppose X is a random variable with p.m.f. P(x), then the moment generating function of X is denoted by M_X(t) and is defined as,

M_X(t) = E(e^{tX}) = Σ e^{tx} P(x)

provided Σ e^{tx} P(x) is convergent for the values of t in some neighbourhood of zero (i.e. -h < t < h, h > 0).

The M.G.F. M_X(t) can be expressed in powers of t as follows :

M_X(t) = E(e^{tX}) = E[1 + tX + t²X²/2! + t³X³/3! + ...]

= 1 + t E(X) + (t²/2!) E(X²) + (t³/3!) E(X³) + ...
5.15 Properties of M.G.F.

(1) M_X(0) = 1

Proof :

M_X(t) = E(e^{tX})

∴ M_X(0) = E(e⁰) = E(1) = 1.
(2) Effect of change of origin and scale :

Result (i) : If a r.v. X has M.G.F. M_X(t), then the M.G.F. of X + a is M_{X+a}(t) = e^{at} M_X(t), a being a constant.

Proof : M_{X+a}(t) = E[e^{(X+a)t}]

= E[e^{Xt + at}]

= E[e^{Xt} · e^{at}]

= e^{at} E(e^{Xt}) = e^{at} M_X(t).

Result (ii) : If M_X(t) is the M.G.F. of a r.v. X, then the M.G.F. of cX is M_{cX}(t) = M_X(ct), c being a constant.

Proof : M_{cX}(t) = E[e^{(cX)t}] = E[e^{X(ct)}]

= M_X(ct)

Result (iii) : If Y = a + cX then M_Y(t) = e^{at} M_X(ct).

Proof : M_Y(t) = E(e^{Yt}) = E[e^{(a + cX)t}]

= e^{at} E[e^{(ct)X}] = e^{at} M_X(ct).

Note : M_{(X-a)/b}(t) = e^{-at/b} M_X(t/b)

(3) If X and Y are independent random variables with M.G.F.s M_X(t) and M_Y(t) respectively, then,

M_{X+Y}(t) = M_X(t) · M_Y(t)

Proof : M_{X+Y}(t) = E[e^{t(X+Y)}]

= E[e^{tX} · e^{tY}]

= E(e^{tX}) E(e^{tY}) (∵ X and Y are independent)

= M_X(t) · M_Y(t).
(4) Uniqueness property :

Statement : For a given probability distribution there is a unique M.G.F., if it exists, and for a given M.G.F. there is a unique probability distribution.

The proof is beyond the scope of this book. However, we use this property quite often. Especially, to obtain the distribution of g(X), a transformation of the r.v. X, we find the M.G.F. of g(X). If this coincides with that of a standard probability distribution, then due to the uniqueness property we conclude that g(X) follows that particular probability distribution.

Note : Two different probability distributions may have the same mean, variance or even the first few moments; however, the corresponding M.G.F.s will not be the same. The properties of the M.G.F. are illustrated below.

Raw moments using M.G.F. :

Method 1 : It is clear from the above power series expansion of M_X(t) that the rth raw moment

μ'ᵣ = coefficient of tʳ/r! in the expansion of M_X(t).

In particular, the first four moments will be obtained as follows :

μ'₁ = coefficient of t in M_X(t)

μ'₂ = coefficient of t²/2! in M_X(t)

μ'₃ = coefficient of t³/3! in M_X(t)

μ'₄ = coefficient of t⁴/4! in M_X(t)
Method 2 : Using successive differentiation of M_X(t) (with respect to t), one can find the raw moments. Note that,

dM_X(t)/dt = E(X) + t E(X²) + (t²/2!) E(X³) + ...

∴ [dM_X(t)/dt] at t = 0 is E(X) = μ'₁

Similarly,

[d²M_X(t)/dt²] at t = 0 is μ'₂

[d³M_X(t)/dt³] at t = 0 is μ'₃

and

[d⁴M_X(t)/dt⁴] at t = 0 is μ'₄

In general, μ'ᵣ = [dʳM_X(t)/dtʳ] at t = 0.

Note :

(1) The generating function for central moments is defined as follows,

M_{X-m}(t) = E[e^{t(X-m)}], where m = E(X).

It gives the central moments :

μᵣ = coefficient of tʳ/r! in the expansion of M_{X-m}(t)

as well as,

μᵣ = [dʳM_{X-m}(t)/dtʳ] at t = 0

(2) M_{X-a}(t) = E[e^{t(X-a)}] is a generating function for the moments about 'a'.

(3) The generating function for factorial moments is W_X(t) = E[(1 + t)^X]. The rth factorial moment μ₍ᵣ₎ is the coefficient of tʳ/r! in the expansion of W_X(t).
Illustration 1 : If X is a r.v. with p.m.f.

p(x) = ⁿCₓ pˣ qⁿ⁻ˣ ; x = 0, 1, ..., n, 0 < p < 1, q = 1 - p

= 0 ; otherwise.

Find the M.G.F. of X.

Solution : M_X(t) = E(e^{tX}) = Σ e^{tx} p(x)

= Σ ⁿCₓ (pe^t)ˣ qⁿ⁻ˣ = (q + pe^t)ⁿ

Since the series has a finite number of terms, each term finite, M_X(t) exists for -∞ < t < ∞.
Illustration 2 : If M_X(t) = e^{t²/2}, find the first four raw moments of X. Hence, find the first four central moments.

Solution :

M_X(t) = e^{t²/2} = 1 + (t²/2) + (t²/2)²/2! + (t²/2)³/3! + ...

= 1 + t²/2 + t⁴/8 + ...

∴ μ'₁ = coefficient of t in M_X(t) = 0

μ'₂ = coefficient of t²/2! in M_X(t) = 1

μ'₃ = coefficient of t³/3! in M_X(t) = 0

μ'₄ = coefficient of t⁴/4! in M_X(t) = 4!/8 = 3.

Since μ'₁ = 0, raw moments and central moments are identical.

∴ μ₁ = 0, μ₂ = 1, μ₃ = 0, μ₄ = 3.
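Moment extraction from an M.G.F. can be automated with symbolic differentiation; an illustrative sketch using sympy (assuming it is installed), applied to Illustration 2:

```python
import sympy as sp

t = sp.symbols('t')
M = sp.exp(t**2 / 2)  # M.G.F. of Illustration 2

# mu'_r = r-th derivative of M(t) at t = 0 (Method 2)
for r in range(1, 5):
    print(r, sp.diff(M, t, r).subs(t, 0))  # 0, 1, 0, 3
```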
5.16 Cumulant Generating Function (C.G.F.)

In order to obtain the M.G.F. we have used the transformation e^{tX}; what we get if we use the inverse transformation log may be a question of interest. Accordingly, log_e M_X(t) is called the cumulant generating function, which is found to be useful to find the central moments easily.

Definition : If a r.v. X has M.G.F. M_X(t), then the cumulant generating function is denoted by K_X(t) and defined as follows :

K_X(t) = log_e M_X(t).
Like the M.G.F., K_X(t) can be expanded in powers of t. The coefficient of tʳ/r! in the expansion of K_X(t) is called the rth cumulant of X and is denoted by Kᵣ.

Properties of C.G.F. :
1. Effect of change of origin : If Y = X - a then, except the first cumulant, all the cumulants of X and Y are the same, a being a constant.

Proof : Let M_X(t) be the M.G.F. of X. Therefore,

M_Y(t) = e^{-at} M_X(t)

K_Y(t) = log M_Y(t) = -at + log M_X(t)

= -at + K_X(t)

K_Y(t) = (K₁ - a) t + K₂ t²/2! + K₃ t³/3! + ...

Equating coefficients of t on both the sides we get,

K₁ of Y = K₁ - a = (first cumulant of X) - a

Equating coefficients of t²/2! we get,

K₂ of Y = K₂ of X, and similarly for the higher cumulants.
2. Effect of change of scale : If Y = hX then,

rth cumulant of Y = hʳ × (rth cumulant of X), where h is a constant.

Proof : Let M_X(t) be the M.G.F. of X, therefore,

M_Y(t) = M_{hX}(t) = M_X(ht)

K_Y(t) = log M_X(ht) = K_X(ht)

K_Y(t) = K₁(ht) + K₂ (ht)²/2! + ... + Kᵣ (ht)ʳ/r! + ...

Equating coefficients of tʳ/r! on both the sides we get,

Kᵣ of Y = (Kᵣ of X) hʳ.
3. Additive property of cumulants : If X and Y are independent random variables, then,

Kᵣ of (X + Y) = Kᵣ of X + Kᵣ of Y

Proof : Since X and Y are independent random variables,

M_{X+Y}(t) = M_X(t) · M_Y(t)

log M_{X+Y}(t) = log M_X(t) + log M_Y(t)

K_{X+Y}(t) = K_X(t) + K_Y(t)

Equating coefficients of tʳ/r! on both the sides we get,

Kᵣ of (X + Y) = Kᵣ of X + Kᵣ of Y.
Note :

In the expansion of K_X(t), the constant term is absent, whereas in M_X(t) it is 1.

The rth cumulant Kᵣ can be obtained by successive differentiation of K_X(t) as follows.

Kᵣ = [dʳK_X(t)/dtʳ] at t = 0

Thus, K₁ = coefficient of t in K_X(t) = [dK_X(t)/dt] at t = 0

K₂ = coefficient of t²/2! in K_X(t) = [d²K_X(t)/dt²] at t = 0

K₃ = coefficient of t³/3! in K_X(t) = [d³K_X(t)/dt³] at t = 0

K₄ = coefficient of t⁴/4! in K_X(t) = [d⁴K_X(t)/dt⁴] at t = 0

In general,

Kᵣ = coefficient of tʳ/r! in K_X(t) = [dʳK_X(t)/dtʳ] at t = 0
Relation between moments and cumulants :

We can express the first four central moments in terms of cumulants as follows :

μ₁ = 0, μ₂ = K₂, μ₃ = K₃, μ₄ = K₄ + 3K₂²

or K₁ = μ'₁, K₂ = μ₂, K₃ = μ₃, K₄ = μ₄ - 3μ₂²
Illustration 3 : If a r.v. X has M.G.F. M_X(t) = e^{2t + 3t²}, find its cumulants and hence obtain the first four central moments.

Solution : K_X(t) = log_e M_X(t) = log_e e^{2t + 3t²} = 2t + 3t²

K₁ = coefficient of t in K_X(t) = 2

K₂ = coefficient of t²/2! in K_X(t) = 3 × 2! = 6

Since the terms of t³ and higher order are absent, we get K₃ = K₄ = ... = 0.

∴ μ₁ = 0, μ₂ = K₂ = 6,

μ₃ = K₃ = 0, μ₄ = K₄ + 3K₂² = 3(6)² = 108.
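The same symbolic approach works for cumulants by differentiating log M(t); an illustrative sympy sketch for Illustration 3:

```python
import sympy as sp

t = sp.symbols('t')
M = sp.exp(2*t + 3*t**2)                 # M.G.F. of Illustration 3
K = sp.log(M)                            # C.G.F. K(t) = 2t + 3t^2

# K_r = r-th derivative of K(t) at t = 0
k = [sp.diff(K, t, r).subs(t, 0) for r in (1, 2, 3, 4)]
print(k)                                 # [2, 6, 0, 0]
print(k[3] + 3 * k[1]**2)                # mu_4 = K_4 + 3*K_2^2 = 108
```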
Points to Remember

• The concept of expectation of a random variable is the same as that of the arithmetic mean for a frequency distribution.
• The m.g.f. and c.g.f. can be used to compute moments of the probability distribution.
• Coefficients of skewness and kurtosis based on moments give us an idea about the symmetry, spread and shape of the probability distribution.
• E(X) = Σ x P(x)
• Var(X) = E(X²) - [E(X)]²
• Variance is invariant to change of origin.
• M_X(t) = E(e^{tX}) ; μ'ᵣ = coefficient of tʳ/r! in the expansion of M_X(t) = [dʳM_X(t)/dtʳ] at t = 0
• K_X(t) = log M_X(t) ; Kᵣ = coefficient of tʳ/r! in the expansion of K_X(t) = [dʳK_X(t)/dtʳ] at t = 0

Exercise 5 (A)

Theory Questions :
1. Define mathematical expectation of a discrete r.v. X.
2. Explain how E(X) is the arithmetic mean of X. Can E(X) always be one of the possible values of X ? Explain.
3. What is the physical interpretation of E(X) ?
4. Define expectation of a function of a random variable.
5. Define variance of a discrete r.v.
6. Let X be a discrete random variable. Define E(X) and E(X²). Hence give the formula for the variance of X.
7. With usual notations prove that
E(X - k)² = Var(X) + [E(X) - k]², where k is a constant.
8. Define standard deviation of a r.v. X. What is its use ?
9. If Y = (X - a)/h, prove that (i) E(Y) = [E(X) - a]/h, (ii) V(Y) = V(X)/h².
10. Show that variance is invariant to the change of origin, but not of scale.
11. What is meant by a standardized r.v. ? Explain with the help of an illustration.
12. Prove that (i) the variance of a constant is zero, (ii) E(X²) ≥ [E(X)]².
M.G.F. :
13. Define the moment generating function of a random variable X. State and prove its properties. Also describe its uses.
14. Explain how the raw moments can be obtained using the M.G.F.
15. With usual notations show that,
(i) M_X(0) = 1
(ii) M_{cX}(t) = M_X(ct)
(iii) M_{aX+b}(t) = e^{bt} M_X(at)
(iv) M_{-X}(t) = M_X(-t)
(v) M_{aX+bY}(t) = M_X(at) M_Y(bt) (where X and Y are independent r.v.s.)
(vi) M_{X+Y}(t) = M_X(t) M_Y(t) (where X and Y are independent r.v.s.)
(vii) M_{X-Y}(t) = M_X(t) M_Y(-t) (where X and Y are independent r.v.s.)
16. Show that the M.G.F. of the sum of two independent random variables is the product of their M.G.F.s. (P.U. 2002)
17. Find the rth raw moment of X if its M.G.F. is M_X(t) = (1 - t)⁻¹.
18. Is M_X(t) = … a M.G.F. ? Justify.


19. If X is a r.v. with p.m.f.
P(x) = 1/n ; x = 1, 2, ..., n,
= 0 ; otherwise,
find the M.G.F. of X.
C.G.F. :
20. Define the cumulant generating function of a r.v. X. Also explain how to obtain the cumulants from the C.G.F.
21. State expressions for the first four central moments in terms of the first four cumulants.
22. State the properties of the C.G.F.
23. Show that the C.G.F. of the sum of two independent r.v.s. is the sum of their C.G.F.s.
24. What are moments of a r.v. ?
25. What is the need of studying moments ?
26. Define (i) raw moments, (ii) moments about 'a', (iii) central moments of a discrete r.v.
27. Prove that the central moments are invariant to the change of origin.
28. Let X be a discrete r.v. with rth central moment μᵣ(x). Let Y = (X - a)/h with rth central moment μᵣ(y). Then prove that μᵣ(x) = hʳ μᵣ(y).
29. Define the coefficient of skewness γ₁. Describe how γ₁ is used to measure the skewness of the probability distribution.
30. Define the coefficient of kurtosis γ₂. What is meant by (i) γ₂ > 0, (ii) γ₂ = 0 and (iii) γ₂ < 0 ?
31. Define the rth factorial moment of a discrete r.v. X. What is the use of factorial moments ?
Exercise 5 (B)
1. A discrete r.v. X assumes values 1, 2, ..., n, with P(X = i) proportional to i, i = 1, 2, ..., n. Find E(X).
2. Let X be a r.v. with the following as the p.m.f.
X    : 0    1    2    3
P(x) : 0.1  0.3  0.4  0.2
Find E(X) and Var(X).
