Mathematical Expectation (Univariate)
Pierre-Simon Laplace (1749-1827) was a French mathematician and astronomer whose work on the analytic theory of probability helped create the foundation for mathematical statistics. In two important papers in 1810 and 1811, Laplace first developed the characteristic function as a tool for large-sample theory and proved the first general central limit theorem. We use the moment generating function as a type of Laplace transform.
Contents
5.1 Introduction
5.2 Mathematical Expectation
5.3 Expectation of a Function of a Random Variable
5.4 Theorems on Expectation
5.5 Variance of a Random Variable
5.6 Effect of Change of Origin and Scale on Variance
5.7 Moments of a Random Variable
5.8 Relations between Raw Moments and Moments about 'a'
5.9 Relations between Raw Moments and Central Moments
5.10 Relations between Central Moments and Moments about 'a'
5.11 Effect of Change of Origin and Scale on Central Moments
5.12 Measures of Skewness and Kurtosis Based on Moments
5.13 Factorial Moments
5.14 Moment Generating Function (M.G.F.)
5.15 Properties of Moment Generating Function
5.16 Cumulant Generating Function (C.G.F.)
Key Words :
Mean, Variance of a random variable, Moment generating function, Cumulant generating function, Raw moments, Central moments, Factorial moments.
Objectives :
• Understand the concept of expectation of a random variable and its function.
• Learn the m.g.f. and c.g.f. and their properties.
• Compute raw and central moments of a random variable.
• Solve numerical problems on moments and compute coefficients of skewness and kurtosis.
5.1 Introduction
The probability distribution of a random variable (r.v.) specifies the chances (probabilities) of a r.v. taking different values. However, we might be interested in various characteristics of a probability distribution such as average, spread, symmetry, shape etc. In order to study these characteristics, statistical measures are developed. The development of measures such as mean, variance, moments, coefficients of skewness and kurtosis is on similar lines as that for a frequency distribution. The basis for all this is mathematical expectation. Mathematical expectation of a r.v. or its function provides a representative figure for the probability distribution. It takes into account probabilities of all possible values that the r.v. can take and summarizes them into a single average.
5.2 Mathematical Expectation
Definition : Let X be a discrete r.v. taking values $x_1, x_2, \ldots, x_i, \ldots, x_n$ with probabilities $p_1, p_2, \ldots, p_i, \ldots, p_n$ respectively. The mathematical expectation of X, denoted by E(X), is defined as
$$E(X) = x_1 p_1 + x_2 p_2 + \cdots + x_n p_n = \sum_{i=1}^{n} x_i p_i$$
E(X) is also called the expected value of X.
Remark 1 : E(X) is the arithmetic mean (A.M.) of X. To see this, let us consider the following frequency distribution of X: the values $x_1, x_2, \ldots, x_i, \ldots, x_n$ with frequencies $f_1, f_2, \ldots, f_i, \ldots, f_n$. We know that the A.M. is given by
$$\bar{x} = \frac{f_1 x_1 + f_2 x_2 + \cdots + f_i x_i + \cdots + f_n x_n}{N} = \sum_{i=1}^{n} \frac{f_i}{N}\, x_i, \quad \text{where } N = \sum_i f_i$$
$$= \sum_i p_i x_i = E(X)$$
where $p_i = f_i / N$, $i = 1, 2, \ldots, n$ are the relative frequencies of $x_1, x_2, \ldots, x_n$ respectively. Thus in E(X), the relative frequencies are replaced by the probabilities of the respective values of X.
Remark 2 : If the p.m.f. is in functional form P(x), then $E(X) = \sum_x x\, P(x)$.
Remark 3 : If a random variable takes countably infinite values, then $E(X) = \sum_{i=1}^{\infty} x_i p_i$. The expectation is well defined if the series $\sum_i |x_i|\, p_i < \infty$ (i.e. the series is absolutely convergent). Otherwise we say E(X) does not exist.
Remark 4 : The value of E(X) may not be a possible value of the r.v. X. For example, when we toss a fair die, $P(x) = \frac{1}{6}$ for $x = 1, 2, \ldots, 6$, where X = number observed on the face of the die. Hence,
$$E(X) = \sum_x x\, P(x) = \frac{1}{6}(1 + 2 + 3 + 4 + 5 + 6) = 3.5$$
which is not a possible value of X.
Remark 5 : The arithmetic mean of X, i.e. E(X), is considered to be the centre of gravity of the probability distribution of X. It is the average of the values of X if we perform the experiment several times and observe a large number of values of X.
Illustrative Examples
Example 5.1 : Obtain the expectation of a r.v. X with the following probability distribution.

X     : 1    3    5    6
P(x)  : 0.1  0.2  0.4  0.3

Solution :

x      P(x)   x P(x)
1      0.1    0.1
3      0.2    0.6
5      0.4    2.0
6      0.3    1.8
Total  1.0    4.5

$E(X) = \sum_x x\, P(x) = 4.5$
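The same computation can be sketched in a few lines of Python; the dictionary-based p.m.f. representation used here is just an illustrative choice.

```python
# Expectation of a discrete r.v.: E(X) = sum of x * P(x) over the support.
def expectation(pmf):
    return sum(x * p for x, p in pmf.items())

pmf = {1: 0.1, 3: 0.2, 5: 0.4, 6: 0.3}        # p.m.f. of Example 5.1
assert abs(sum(pmf.values()) - 1.0) < 1e-12    # probabilities must sum to 1
print(expectation(pmf))                        # 4.5
```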
Example 5.2 : Obtain the expected value of the number of heads when three fair coins are tossed simultaneously.
Solution : We know that in this case
Ω = {HHH, HHT, HTH, THH, HTT, THT, TTH, TTT}
and hence, if X denotes the number of heads, the probability distribution of X is

X     : 0    1    2    3
P(x)  : 1/8  3/8  3/8  1/8

Accordingly, $E(X) = \sum_x x\, P(x) = 0 + \frac{3}{8} + \frac{6}{8} + \frac{3}{8} = \frac{12}{8} = \frac{3}{2}$.
Example 5.3 : A box contains 5 tickets. Two of the tickets carry a prize of ₹10 each, the other three carry prizes of ₹2 each. (i) If one ticket is drawn at random, what is the expected value of the prize ? (ii) If two tickets are drawn, without replacement, what is the expected value of the prize ?
Solution : (i) Let the tickets be numbered as 1, 2, 3, 4, 5.
∴ Ω = {1, 2, 3, 4, 5}. Without loss of generality, let tickets numbered 1 and 2 carry prizes of ₹10 each and the others carry prizes of ₹2 each. Suppose X denotes the prize amount; then the following is the probability distribution of X.

X     : 10   2
P(x)  : 2/5  3/5

$E(X) = 10 \cdot \frac{2}{5} + 2 \cdot \frac{3}{5} = \frac{26}{5} = 5.2$
∴ The expected prize is ₹5.20.
(ii) Now let X denote the total prize amount when two tickets are drawn without replacement. Two tickets can be drawn out of five in $\binom{5}{2} = 10$ equally likely ways.
(a) Both tickets are of ₹10. There is $\binom{2}{2} = 1$ way in which this can happen.
∴ P(X = 20) = 1/10
(b) One ticket is of ₹10 and the other is of ₹2. This corresponds to $2 \times 3 = 6$ sample points in Ω.
∴ P(X = 12) = 6/10
(c) Both tickets are of ₹2. There are $\binom{3}{2} = 3$ ways in which this can happen.
∴ P(X = 4) = 3/10
Hence, the probability distribution of X is

X     : 4     12    20
P(x)  : 3/10  6/10  1/10

and $E(X) = \sum_x x\, P(x) = \frac{4 \cdot 3 + 12 \cdot 6 + 20 \cdot 1}{10} = \frac{104}{10} = 10.40$
∴ The expected prize would be ₹10.40.
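Both parts of Example 5.3 can be checked by brute-force enumeration. The sketch below assumes the same labelling of tickets (1 and 2 carry ₹10; 3, 4, 5 carry ₹2) and uses exact fractions.

```python
from fractions import Fraction
from itertools import combinations

prize = {1: 10, 2: 10, 3: 2, 4: 2, 5: 2}       # ticket number -> prize in Rs.

# (i) one ticket drawn at random: all 5 tickets equally likely
e_one = Fraction(sum(prize.values()), 5)
print(e_one, float(e_one))                      # 26/5 = 5.2

# (ii) two tickets without replacement: all C(5,2) = 10 pairs equally likely
pairs = list(combinations(prize, 2))
e_two = Fraction(sum(prize[a] + prize[b] for a, b in pairs), len(pairs))
print(e_two, float(e_two))                      # 52/5 = 10.4
```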
Example 5.4 : There are three proposals before a manager to start a new project.
Proposal A : Profit of ₹50,000 with probability 0.6 or a loss of ₹8,000 with probability 0.4.
Proposal B : Profit of ₹1,00,000 with probability 0.4 or otherwise a loss of ₹20,000.
Proposal C : Profit of ₹45,000 with probability 0.8, otherwise a loss of ₹5,000.
Which proposal should the manager choose ? Justify.
Solution : Let X = profit in ₹. We assign a positive sign to profit and a negative sign to loss. We obtain the expected profit due to each proposal.

Proposal A : E(X) = 50000(0.6) + (-8000)(0.4) = 30000 - 3200 = 26800
Proposal B : E(X) = 100000(0.4) + (-20000)(0.6) = 40000 - 12000 = 28000
Proposal C : E(X) = 45000(0.8) + (-5000)(0.2) = 36000 - 1000 = 35000

Expected profits from proposals A, B, C are ₹26,800, ₹28,000 and ₹35,000 respectively. Since proposal C is expected to give the maximum profit, the manager should choose proposal C.
5.5 Variance of a Random Variable
Definition : Let X be a discrete r.v. with mean $\mu = E(X)$. The variance of X, denoted by Var(X) or $\sigma^2$, is defined as
$$\mathrm{Var}(X) = E[X - E(X)]^2 = \sum_i (x_i - \mu)^2\, p_i$$
$$= \sum_i x_i^2 p_i - 2\mu \sum_i x_i p_i + \mu^2 \sum_i p_i$$
$$= E(X^2) - 2\mu^2 + \mu^2 = E(X^2) - \mu^2$$
Thus, $\mathrm{Var}(X) = E(X^2) - [E(X)]^2$.
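This identity is easy to verify numerically. The following sketch computes the variance both from the definition and from the shortcut formula, on an arbitrary illustrative p.m.f. (the particular numbers are assumptions, not from the text).

```python
def expectation(pmf, f=lambda x: x):
    """E[f(X)] for a discrete r.v. given as {value: probability}."""
    return sum(f(x) * p for x, p in pmf.items())

pmf = {0: 0.2, 1: 0.5, 2: 0.3}                  # an assumed example p.m.f.
mu = expectation(pmf)

var_definition = expectation(pmf, lambda x: (x - mu) ** 2)   # E[(X - mu)^2]
var_shortcut = expectation(pmf, lambda x: x ** 2) - mu ** 2  # E(X^2) - mu^2
print(mu, var_definition, var_shortcut)   # both variance forms agree: ~0.49
```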
Remark 1 : Var(X) ≥ 0. This is because variance is the expected value of the square $[X - E(X)]^2$, which cannot be negative. Therefore, we get
$$E(X^2) \ge [E(X)]^2$$
Remark 2 : The variance of X is zero if and only if X is a degenerate r.v., that is, X takes only one value with probability 1. For example, if P[X = c] = 1, then E(X) = c and $\sigma^2 = E(X - c)^2 = (c - c)^2 \cdot 1 = 0$.
Remark 3 : The positive square root of the variance is called the standard deviation of X and is denoted by σ.
Example : A discrete r.v. X has p.m.f. $P(x) = \frac{x^2}{30}$, $x = 0, 1, 2, 3, 4$. Find E(X) and Var(X).
Solution : $E(X) = \sum_x x\, P(x) = \frac{1}{30}(0 + 1 + 8 + 27 + 64) = \frac{100}{30} = \frac{10}{3}$
$E(X^2) = \sum_x x^2\, P(x) = \frac{1}{30}(0 + 1 + 16 + 81 + 256) = \frac{354}{30} = 11.8$
$\mathrm{Var}(X) = E(X^2) - [E(X)]^2 = 11.8 - \left(\frac{10}{3}\right)^2 = 11.8 - 11.1111 = 0.6889$
Example 5.10 : Consider the following probability distribution.

X     : 0  1     2
P(x)  : p  1-2p  p     (0 ≤ p ≤ 1/2)

For what value of p is Var(X) maximum ?
Solution : $E(X) = 0 \cdot p + 1 \cdot (1 - 2p) + 2p = 1$
$E(X^2) = 0 \cdot p + 1 \cdot (1 - 2p) + 4p = 1 + 2p$
$\mathrm{Var}(X) = E(X^2) - [E(X)]^2 = 1 + 2p - 1 = 2p$
Since $0 \le p \le \frac{1}{2}$, Var(X) will be maximum when $p = \frac{1}{2}$.
For $p = \frac{1}{2}$ : Var(X) = 1.
5.6 Effect of Change of Origin and Scale on Variance
Theorem 3 : Let X be a discrete r.v. with mean μ and variance σ². Then,
(i) Var(X + b) = Var(X)
(ii) Var(aX) = a² Var(X) = a²σ²
(iii) Var(aX + b) = a²σ²
Proof : (i) By definition,
Var(X + b) = E[(X + b) - E(X + b)]²
= E[X + b - E(X) - b]²
= E[X - E(X)]² = σ²
Thus variance is invariant to the change of origin.
(ii) Var(aX) = E[aX - E(aX)]², by the definition of variance of a r.v.
= E[aX - aE(X)]²
= E[a(X - E(X))]²
= a² E[X - E(X)]²
= a²σ²
(iii) On similar lines,
Var(aX + b) = E[aX + b - E(aX + b)]²
= E[aX + b - aE(X) - b]²
= E[a(X - E(X))]²
= a² E[X - E(X)]²
= a²σ²
Thus variance is not invariant to the change of scale.
We know that $\sigma^2_{aX} = a^2 \sigma^2_X$, and s.d. is defined to be the positive square root of the variance.
∴ $\sigma_{aX} = |a|\, \sigma_X$
Thus, Var(-3X + 5) = 9 Var(X)
and s.d.(-3X + 5) = 3 s.d.(X).
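Theorem 3 can also be checked numerically. A small sketch, using an assumed p.m.f.; the helper `transform` builds the exact distribution of Y = aX + b from that of X.

```python
def mean(pmf):
    return sum(x * p for x, p in pmf.items())

def var(pmf):
    m = mean(pmf)
    return sum((x - m) ** 2 * p for x, p in pmf.items())

def transform(pmf, a, b):
    """p.m.f. of Y = a*X + b (the map x -> a*x + b is one-to-one for a != 0)."""
    return {a * x + b: p for x, p in pmf.items()}

pmf = {1: 0.25, 2: 0.5, 3: 0.25}                 # an assumed distribution
print(var(pmf))                                  # Var(X)       = 0.5
print(var(transform(pmf, 1, 5)))                 # Var(X + 5)   = 0.5
print(var(transform(pmf, -3, 0)))                # Var(-3X)     = 9 * 0.5 = 4.5
print(var(transform(pmf, -3, 5)))                # Var(-3X + 5) = 4.5
```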
Theorem 4 : The variance of a constant is zero.
Proof : Var(c) = E(c²) - [E(c)]² = c² - c² = 0
Illustrative Examples
Example 5.11 : The mean and variance of marks in Statistics (X) are 60 and 25 respectively. Find the mean and variance of $Y = \frac{X - 60}{5}$ and $Z = \frac{X - 50}{10}$.
Solution : If $Y = \frac{X - a}{h}$, then $E(Y) = \frac{E(X) - a}{h}$ and $V(Y) = \frac{V(X)}{h^2}$, where V denotes variance.
Here, $E(Y) = \frac{60 - 60}{5} = 0$ and $V(Y) = \frac{25}{25} = 1$.
Similarly, $E(Z) = \frac{60 - 50}{10} = 1$ and $\mathrm{Var}(Z) = \frac{25}{100} = 0.25$.
Example 5.12 : A r.v. X assumes n values 1, 2, ..., n with equal probability. If the ratio of Var(X) to E(X) is equal to 4, find the value of n. What will be the value of n if Var(X) = E(X) ?
Solution : The probability distribution of X is as follows :

X     : 1    2    ...  n
P(x)  : 1/n  1/n  ...  1/n

Hence, $E(X) = \sum_i \frac{x_i}{n} = \frac{1}{n} \cdot \frac{n(n+1)}{2} = \frac{n+1}{2}$
$E(X^2) = \frac{1}{n} \cdot \frac{n(n+1)(2n+1)}{6} = \frac{(n+1)(2n+1)}{6}$
$\mathrm{Var}(X) = \frac{(n+1)(2n+1)}{6} - \frac{(n+1)^2}{4} = \frac{n^2 - 1}{12}$
$$\frac{\mathrm{Var}(X)}{E(X)} = \frac{(n^2 - 1)/12}{(n+1)/2} = \frac{n-1}{6} = 4 \implies n = 25$$
If Var(X) = E(X), then $\frac{n^2 - 1}{12} = \frac{n+1}{2} \implies \frac{n-1}{6} = 1 \implies n = 7$.
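A brute-force check of Example 5.12 (a sketch using exact fractions; the formulas for E(X) and Var(X) are the ones derived above):

```python
from fractions import Fraction

for n in range(2, 50):
    e = Fraction(n + 1, 2)                 # E(X) = (n + 1)/2
    v = Fraction(n * n - 1, 12)            # Var(X) = (n^2 - 1)/12
    if v == 4 * e:
        print("Var(X)/E(X) = 4 at n =", n)   # n = 25
    if v == e:
        print("Var(X) = E(X) at n =", n)     # n = 7
```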
Example 5.13 : Let X be a discrete r.v. with mean 5 and s.d. 3. Compute the mean and s.d. of (i) 2X - 5, (ii) 3 - 7X.
Solution : (i) Let Y = 2X - 5.
E(Y) = 2E(X) - 5 = 10 - 5 = 5
$\sigma_Y = 2\sigma_X = 2 \times 3 = 6$
(ii) Let Y = 3 - 7X.
E(Y) = 3 - 7E(X) = 3 - 35 = -32
s.d. of Y = $\sigma_Y = |-7|\,\sigma_X = 7 \times 3 = 21$
Example 5.14 : Prove that $E(X - k)^2 = \mathrm{Var}(X) + [E(X) - k]^2$, where k is any constant.
Solution : Var(X) = Var(X - k)    ... (from Theorem 3)
$= E(X - k)^2 - [E(X - k)]^2$
$= E(X - k)^2 - [E(X) - k]^2$
$\therefore E(X - k)^2 = \mathrm{Var}(X) + [E(X) - k]^2$
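The identity of Example 5.14 holds for every constant k, as a quick numeric sketch (with an assumed p.m.f.) confirms:

```python
pmf = {0: 0.3, 1: 0.4, 2: 0.3}                  # an assumed p.m.f.
mean = sum(x * p for x, p in pmf.items())
var = sum((x - mean) ** 2 * p for x, p in pmf.items())

for k in (-1.0, 0.0, 2.5):
    lhs = sum((x - k) ** 2 * p for x, p in pmf.items())   # E(X - k)^2
    rhs = var + (mean - k) ** 2                           # Var(X) + [E(X) - k]^2
    print(k, lhs, rhs)                                    # lhs equals rhs for each k
```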
5.7 Moments of a Random Variable
So far we studied the mean and variance of a random variable. The mean measures central tendency, while the variance measures spread. In order to get complete information on the probability distribution, we also have to study the shape of the probability distribution. For example, we need measures of skewness (lack of symmetry) and kurtosis (peakedness) of a probability distribution. Moments of a random variable (or probability distribution) serve this purpose.
We shall study four types of moments of a r.v. in this chapter. Let $(x_i, p_i)$, $i = 1, 2, \ldots, n$ represent the probability distribution of a discrete r.v. X.
Moments about any arbitrary point 'a' : The rth moment of X about 'a' is denoted by $\mu_r'(a)$ and is defined as
$$\mu_r'(a) = E(X - a)^r = \sum_{i=1}^{n} (x_i - a)^r\, p_i, \quad r = 1, 2, 3, \ldots$$
In particular, $\mu_1'(a) = E(X - a) = E(X) - a$.
Raw moments : Taking a = 0, we get the rth raw moment $\mu_r' = E(X^r)$. In particular,
$$\mu_2' = E(X^2) = \sum_{i=1}^{n} x_i^2\, p_i, \qquad \mu_3' = E(X^3) = \sum_{i=1}^{n} x_i^3\, p_i$$
Central moments : Taking a = μ = E(X), we get the rth central moment $\mu_r = E[X - E(X)]^r$. Thus,
$$\mu_3 = E[X - E(X)]^3 = E(X - \mu)^3 = \sum_i (x_i - \mu)^3\, p_i$$
It can be shown that :
$$\mu_3 = \mu_3' - 3\mu_2'\mu_1' + 2\mu_1'^3$$
$$\mu_4 = E[X - E(X)]^4 = E(X - \mu)^4 = \sum_i (x_i - \mu)^4\, p_i$$
It can be shown that :
$$\mu_4 = \mu_4' - 4\mu_3'\mu_1' + 6\mu_2'\mu_1'^2 - 3\mu_1'^4$$
and so on.
5.10 Relations between Central Moments and Moments About 'a' (Without Proof)
$$\mu_1 = 0$$
Consider, $\mu_2 = E(X - \mu)^2 = \mu_2'(a) - [\mu_1'(a)]^2$
Similarly, $\mu_3 = E(X - \mu)^3 = \mu_3'(a) - 3\mu_2'(a)\,\mu_1'(a) + 2[\mu_1'(a)]^3$
Also, $\mu_4 = \mu_4'(a) - 4\mu_3'(a)\,\mu_1'(a) + 6\mu_2'(a)[\mu_1'(a)]^2 - 3[\mu_1'(a)]^4$
5.11 Effect of Change of Origin and Scale on Central Moments
Let X be a discrete r.v. with rth central moment $\mu_r(x)$. Define $Y = \frac{X - a}{h}$. Then the rth central moment of Y, denoted by $\mu_r(y)$ say, is given by
$$\mu_r(y) = \frac{1}{h^r}\, \mu_r(x), \quad \text{i.e.} \quad \mu_r(x) = h^r\, \mu_r(y)$$
Proof : $Y = \frac{X - a}{h} \implies X = a + hY \implies E(X) = a + h\,E(Y)$
$$\mu_r(x) = E[X - E(X)]^r = E[h(Y - E(Y))]^r = h^r\, E[Y - E(Y)]^r = h^r\, \mu_r(y)$$
Thus central moments are invariant to the change of origin, but not to the change of scale.
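The relation $\mu_r(x) = h^r \mu_r(y)$ can be verified directly. A sketch with an assumed distribution and assumed constants a, h, using exact fractions:

```python
from fractions import Fraction

def central_moment(pmf, r):
    m = sum(x * p for x, p in pmf.items())
    return sum((x - m) ** r * p for x, p in pmf.items())

pmf_x = {2: Fraction(1, 4), 4: Fraction(1, 2), 6: Fraction(1, 4)}
a, h = 4, 2
pmf_y = {Fraction(x - a, h): p for x, p in pmf_x.items()}   # Y = (X - a)/h

for r in (2, 3, 4):
    # mu_r(x) equals h^r * mu_r(y) for every r
    print(r, central_moment(pmf_x, r), h ** r * central_moment(pmf_y, r))
```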
5.12 Measures of Skewness and Kurtosis Based on Moments
Concepts of skewness and kurtosis of a probability distribution are similar to those of a frequency distribution, which you study in Statistics Paper I. Skewness means the lack of symmetry of the probability distribution, while kurtosis means the peakedness of the distribution. Following are the measures based on moments.
1. Coefficient of skewness ($\gamma_1$) : The coefficient of skewness is defined as
$$\gamma_1 = \frac{\mu_3}{\mu_2^{3/2}} = \sqrt{\beta_1}, \quad \text{where } \beta_1 = \frac{\mu_3^2}{\mu_2^3}$$
2. Coefficient of kurtosis ($\gamma_2$) : $\gamma_2 = \beta_2 - 3$, where $\beta_2 = \frac{\mu_4}{\mu_2^2}$.
5.13 Factorial Moments
The rth factorial moment of X is denoted by $\mu_{(r)}$ and is defined as
$$\mu_{(r)} = E[X(X-1)(X-2)\cdots(X-r+1)] = \sum_i x_i (x_i - 1) \cdots (x_i - r + 1)\, p_i, \quad r = 1, 2, 3, \ldots$$
In particular, $\mu_{(1)} = E(X) = \mu_1'$
$$\mu_{(2)} = E[X(X-1)] = \sum_i x_i (x_i - 1)\, p_i$$
$$\mu_{(2)} = E(X^2) - E(X) = \mu_2' - \mu_1'$$
Similarly, one can prove that
$$\mu_{(3)} = \mu_3' - 3\mu_2' + 2\mu_1'$$
$$\mu_{(4)} = \mu_4' - 6\mu_3' + 11\mu_2' - 6\mu_1'$$
Note : Sometimes it is easier to compute factorial moments than raw or central moments. Using the relations between factorial and other moments as expressed above, the required types of moments can be computed.
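The coefficients in these relations come from expanding the falling product X(X−1)⋯(X−r+1) and taking expectations term by term. A short sympy sketch reproduces them:

```python
import sympy as sp

X = sp.symbols('X')
for r in (2, 3, 4):
    falling = sp.expand(sp.prod([X - k for k in range(r)]))
    print(r, falling)
# r = 2: X**2 - X             -> mu_(2) = mu'_2 - mu'_1
# r = 3: X**3 - 3*X**2 + 2*X  -> mu_(3) = mu'_3 - 3 mu'_2 + 2 mu'_1
# r = 4: X**4 - 6*X**3 + 11*X**2 - 6*X
#                             -> mu_(4) = mu'_4 - 6 mu'_3 + 11 mu'_2 - 6 mu'_1
```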
Example 5.15 : Let a discrete r.v. X assume the values 1, 2, ..., 6 with equal probability. (i) Find the first two factorial moments of X and hence the mean and variance. (ii) Find the first four raw moments. (iii) Using the relation between raw and central moments, find the first three central moments. (iv) Also compute $\gamma_1$ and comment on the nature of the skewness of the distribution.
Solution : The probability distribution of X is

X     : 1    2    3    4    5    6
P(x)  : 1/6  1/6  1/6  1/6  1/6  1/6

(i) $\mu_{(1)} = E(X) = \sum_i x_i p_i = \frac{21}{6} = \frac{7}{2}$
$$\mu_{(2)} = E[X(X-1)] = \sum_i x_i (x_i - 1)\, p_i = 0 + \frac{2 \cdot 1}{6} + \frac{3 \cdot 2}{6} + \frac{4 \cdot 3}{6} + \frac{5 \cdot 4}{6} + \frac{6 \cdot 5}{6} = \frac{70}{6}$$
Mean = $\mu_1' = \mu_{(1)} = \frac{21}{6} = 3.5$
$$\mu_2' = \mu_{(2)} + \mu_1' = \frac{70}{6} + \frac{21}{6} = \frac{91}{6}$$
$$\text{Variance} = \mu_2' - \mu_1'^2 = \frac{70}{6} + \frac{21}{6} - \frac{(21)^2}{36} = \frac{35}{12}$$
(ii) Raw moments :
$$\mu_1' = \frac{21}{6}, \quad \mu_2' = \frac{91}{6}, \quad \mu_3' = \frac{441}{6}, \quad \mu_4' = \frac{2275}{6}$$
(iii) Central moments :
$$\mu_1 = 0, \quad \mu_2 = \frac{35}{12}$$
$$\mu_3 = \mu_3' - 3\mu_2'\mu_1' + 2\mu_1'^3 = \frac{441}{6} - 3 \cdot \frac{91}{6} \cdot \frac{21}{6} + 2\left(\frac{21}{6}\right)^3 = 0$$
(iv) $\gamma_1 = \sqrt{\beta_1} = \frac{\mu_3}{\mu_2^{3/2}} = 0$
∴ The distribution is symmetric.
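All parts of Example 5.15 can be reproduced with exact arithmetic. A sketch:

```python
from fractions import Fraction

xs, p = range(1, 7), Fraction(1, 6)      # fair die: P(x) = 1/6, x = 1..6

raw = [sum(p * x**r for x in xs) for r in (1, 2, 3, 4)]
print(raw)                               # [7/2, 91/6, 147/2, 2275/6]

fact2 = sum(p * x * (x - 1) for x in xs)
print(fact2)                             # 35/3, i.e. 70/6

mu1p = raw[0]
mu2 = raw[1] - mu1p**2                   # variance = 35/12
mu3 = raw[2] - 3 * raw[1] * mu1p + 2 * mu1p**3
print(mu2, mu3)                          # 35/12 and 0 -> gamma_1 = 0, symmetric
```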
5.14 Moment Generating Function (M.G.F.)
The moment generating function is an elegant way to find moments of probability distributions. Moreover, it is handy in deriving distributions of functions of random variables. One can verify using the M.G.F. whether two or more random variables are independent. Thus, the M.G.F. is useful in many ways in distribution theory.
Definition : Suppose X is a random variable with p.m.f. P(x); then the moment generating function of X is denoted by $M_X(t)$ and defined as
$$M_X(t) = E(e^{tX}) = \sum_x e^{tx}\, P(x)$$
provided $\sum_x e^{tx}\, P(x)$ is convergent for the values of t in some neighbourhood of zero (i.e. $-h < t < h$, $h > 0$).
The M.G.F. $M_X(t)$ can be expressed in powers of t as follows :
$$M_X(t) = E(e^{tX}) = E\left[1 + tX + \frac{t^2 X^2}{2!} + \cdots\right] = 1 + t\,\mu_1' + \frac{t^2}{2!}\,\mu_2' + \frac{t^3}{3!}\,\mu_3' + \cdots + \frac{t^r}{r!}\,\mu_r' + \cdots$$
5.15 Properties of Moment Generating Function
(1) $M_X(0) = E(e^0) = 1$.
(2) $M_{cX}(t) = M_X(ct)$, where c is a constant.
(3) If X and Y are independent random variables with M.G.F.s $M_X(t)$ and $M_Y(t)$ respectively, then
$$M_{X+Y}(t) = M_X(t) \cdot M_Y(t)$$
Proof : $M_{X+Y}(t) = E[e^{t(X+Y)}]$
$= E[e^{tX} \cdot e^{tY}]$
$= E(e^{tX}) \cdot E(e^{tY})$    (∵ X and Y are independent)
$= M_X(t) \cdot M_Y(t)$.
(4) Uniqueness property :
Statement : For a given probability distribution there is a unique M.G.F., if it exists, and for a given M.G.F. there is a unique probability distribution.
The proof is beyond the scope of this book. However, we use this property quite often, especially to obtain the distribution of g(X), a transformation of the r.v. X: we find the M.G.F. of g(X), and if it coincides with that of a standard probability distribution, then by the uniqueness property we conclude that g(X) follows that particular probability distribution.
Note : Two different probability distributions may have the same mean, the same variance, or even the same first few moments; however, the corresponding M.G.F.s will not be the same. The properties of the M.G.F. are illustrated below.
Raw moments using M.G.F. :
Method 1 : It is clear from the above power series expansion of $M_X(t)$ that the rth raw moment
$$\mu_r' = \text{coefficient of } \frac{t^r}{r!} \text{ in the expansion of } M_X(t).$$
In particular, the first four moments will be obtained as follows :
$\mu_1'$ = coefficient of $t$ in $M_X(t)$
$\mu_2'$ = coefficient of $\frac{t^2}{2!}$ in $M_X(t)$
$\mu_3'$ = coefficient of $\frac{t^3}{3!}$ in $M_X(t)$
$\mu_4'$ = coefficient of $\frac{t^4}{4!}$ in $M_X(t)$
Method 2 : Raw moments can also be obtained by successive differentiation of $M_X(t)$ at t = 0 :
$$\mu_1' = \left[\frac{d\,M_X(t)}{dt}\right]_{t=0}$$
Similarly,
$$\mu_2' = \left[\frac{d^2 M_X(t)}{dt^2}\right]_{t=0}, \quad \mu_3' = \left[\frac{d^3 M_X(t)}{dt^3}\right]_{t=0}, \quad \mu_4' = \left[\frac{d^4 M_X(t)}{dt^4}\right]_{t=0}$$
In general,
$$\mu_r' = \left[\frac{d^r M_X(t)}{dt^r}\right]_{t=0}$$
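Both methods are easy to carry out with sympy. The sketch below builds $M_X(t)$ for a fair die (an assumed example) and recovers the raw moments by differentiating at t = 0:

```python
import sympy as sp

t = sp.symbols('t')
M = sum(sp.Rational(1, 6) * sp.exp(t * x) for x in range(1, 7))  # M_X(t)

for r in (1, 2, 3, 4):
    mu_r = sp.diff(M, t, r).subs(t, 0)   # r-th derivative at t = 0
    print(r, mu_r)                        # 7/2, 91/6, 147/2, 2275/6
```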
Note :
(1) The generating function for central moments is defined as follows :
$$M_{X-m}(t) = E[e^{t(X-m)}], \quad \text{where } m = E(X)$$
It gives the central moments :
$$\mu_r = \text{coefficient of } \frac{t^r}{r!} \text{ in the expansion of } M_{X-m}(t)$$
as well as,
$$\mu_r = \left[\frac{d^r M_{X-m}(t)}{dt^r}\right]_{t=0}$$
(2) $M_{X-a}(t) = E[e^{t(X-a)}]$ is a generating function for the moments about 'a'.
(3) The generating function for factorial moments is $W_X(t) = E[(1+t)^X]$. The rth factorial moment $\mu_{(r)}$ is the coefficient of $\frac{t^r}{r!}$ in the expansion of $W_X(t)$.
Illustration 1 : If X is a r.v. with p.m.f.
$$p(x) = \binom{n}{x} p^x q^{n-x}; \quad x = 0, 1, \ldots, n, \quad 0 < p < 1, \quad q = 1 - p$$
$= 0$ ; otherwise,
find the M.G.F. of X.
Solution : $M_X(t) = E(e^{tX}) = \sum_x e^{tx}\, p(x) = \sum_{x=0}^{n} \binom{n}{x} (pe^t)^x q^{n-x} = (q + pe^t)^n$.
Illustration 2 : If a r.v. X has M.G.F. $M_X(t) = e^{t^2/2}$, find its first four raw moments and central moments.
Solution : $M_X(t) = e^{t^2/2} = 1 + \frac{t^2}{2} + \frac{1}{2!}\left(\frac{t^2}{2}\right)^2 + \cdots = 1 + \frac{t^2}{2} + \frac{t^4}{8} + \cdots$
$\mu_1'$ = coefficient of $t$ in $M_X(t)$ = 0
$\mu_2'$ = coefficient of $\frac{t^2}{2!}$ in $M_X(t)$ = 1
$\mu_3'$ = coefficient of $\frac{t^3}{3!}$ in $M_X(t)$ = 0
$\mu_4'$ = coefficient of $\frac{t^4}{4!}$ in $M_X(t)$ = $\frac{4!}{8}$ = 3
Since $\mu_1' = 0$, raw moments and central moments are identical.
∴ $\mu_1 = 0, \quad \mu_2 = 1, \quad \mu_3 = 0, \quad \mu_4 = 3$.
5.16 Cumulant Generating Function (C.G.F.)
In order to obtain the M.G.F. we have used the transformation $e^{tX}$; what we get if we use the inverse transformation (the logarithm) may be a question of interest. Accordingly, $\log_e M_X(t)$ is called the cumulant generating function, which is found to be useful for finding the central moments easily.
Definition : If a r.v. X has M.G.F. $M_X(t)$, then the cumulant generating function is denoted by $K_X(t)$ and defined as follows :
$$K_X(t) = \log_e M_X(t)$$
Like the M.G.F., $K_X(t)$ can be expanded in powers of t :
$$K_X(t) = \kappa_1 t + \kappa_2 \frac{t^2}{2!} + \kappa_3 \frac{t^3}{3!} + \cdots + \kappa_r \frac{t^r}{r!} + \cdots$$
where $\kappa_r$ is called the rth cumulant of X.
1. Effect of change of origin : If Y = X - a, then $M_Y(t) = e^{-at} M_X(t)$, so that
$$K_Y(t) = -at + \log_e M_X(t) = -at + K_X(t) = (\kappa_1 - a)t + \kappa_2 \frac{t^2}{2!} + \kappa_3 \frac{t^3}{3!} + \cdots$$
Equating coefficients of t on both sides we get,
$\kappa_1$ of Y = $\kappa_1$ - a = (first cumulant of X) - a
Equating coefficients of $\frac{t^2}{2!}$ we get,
$\kappa_2$ of Y = $\kappa_2$ of X
Thus, except for the first, the cumulants are invariant to the change of origin.
2. Effect of change of scale : If Y = hX, then
rth cumulant of Y = $h^r$ × (rth cumulant of X), where h is a constant.
Proof : Let $M_X(t)$ be the M.G.F. of X; therefore,
$$M_Y(t) = M_{hX}(t) = M_X(ht)$$
$$K_Y(t) = \log_e M_X(ht) = K_X(ht) = \kappa_1 (ht) + \kappa_2 \frac{(ht)^2}{2!} + \cdots + \kappa_r \frac{(ht)^r}{r!} + \cdots$$
Equating coefficients of $\frac{t^r}{r!}$ on both sides we get,
$\kappa_r$ of Y = ($\kappa_r$ of X) $h^r$.
3. Additive property of cumulants : If X and Y are independent random variables, then
$\kappa_r$ of (X + Y) = $\kappa_r$ of X + $\kappa_r$ of Y.
Proof : Since X and Y are independent random variables,
$$M_{X+Y}(t) = M_X(t) \cdot M_Y(t)$$
$$\therefore \log_e M_{X+Y}(t) = \log_e M_X(t) + \log_e M_Y(t), \quad \text{i.e.} \quad K_{X+Y}(t) = K_X(t) + K_Y(t)$$
Equating coefficients of $\frac{t^r}{r!}$ on both sides gives the result.
Cumulants using the C.G.F. :
$$\kappa_1 = \text{coefficient of } t \text{ in } K_X(t) = \left[\frac{d\, K_X(t)}{dt}\right]_{t=0}$$
In general,
$$\kappa_r = \text{coefficient of } \frac{t^r}{r!} \text{ in } K_X(t) = \left[\frac{d^r K_X(t)}{dt^r}\right]_{t=0}$$
Relation between moments and cumulants :
We can express the first four central moments in terms of cumulants as follows :
$$\mu_1 = 0, \quad \mu_2 = \kappa_2, \quad \mu_3 = \kappa_3, \quad \mu_4 = \kappa_4 + 3\kappa_2^2$$
or equivalently,
$$\kappa_1 = \mu_1', \quad \kappa_2 = \mu_2, \quad \kappa_3 = \mu_3, \quad \kappa_4 = \mu_4 - 3\mu_2^2$$
Illustration 13 : If a r.v. X has M.G.F. $M_X(t) = e^{2t + 3t^2}$, find its cumulants and hence obtain the first four central moments.
Solution : $K_X(t) = \log_e M_X(t) = \log_e e^{2t + 3t^2} = 2t + 3t^2$
$\kappa_1$ = coefficient of $t$ in $K_X(t)$ = 2
$\kappa_2$ = coefficient of $\frac{t^2}{2!}$ in $K_X(t)$ = 3 × 2! = 6
$\kappa_3 = \kappa_4 = 0$
Hence the central moments are
$$\mu_1 = 0, \quad \mu_2 = \kappa_2 = 6, \quad \mu_3 = \kappa_3 = 0, \quad \mu_4 = \kappa_4 + 3\kappa_2^2 = 0 + 3(6)^2 = 108$$
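Illustration 13 can be verified with sympy: take the log of the M.G.F., differentiate to get the cumulants, then convert to central moments. (Declaring t real lets sympy simplify log(exp(·)) automatically.)

```python
import sympy as sp

t = sp.symbols('t', real=True)
K = sp.log(sp.exp(2*t + 3*t**2))            # K_X(t) = 2t + 3t^2

kappa = [sp.diff(K, t, r).subs(t, 0) for r in (1, 2, 3, 4)]
print(kappa)                                 # [2, 6, 0, 0]

mu2 = kappa[1]                               # mu_2 = kappa_2 = 6
mu4 = kappa[3] + 3 * kappa[1]**2             # mu_4 = kappa_4 + 3*kappa_2^2 = 108
print(mu2, mu4)
```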
Exercise 5 (A)
Theory Questions :
1. Define the mathematical expectation of a discrete r.v. X.
2. Explain how E(X) is the arithmetic mean of X. Can E(X) always be one of the possible values of X ? Explain.
3. What is the physical interpretation of E(X) ?
4. Define the expectation of a function of a random variable.
5. Define the variance of a discrete r.v.
6. Let X be a discrete random variable. Define E(X) and E(X²). Hence give the formula for the variance of X.
7. With usual notations prove that
E(X - k)² = Var(X) + [E(X) - k]², where k is a constant.
8. Define the standard deviation of a r.v. X. What is its use ?
9. If $Y = \frac{X - a}{h}$, prove that (i) $E(Y) = \frac{E(X) - a}{h}$, (ii) $V(Y) = \frac{V(X)}{h^2}$.
10. Show that variance is invariant to the change of origin, but not of scale.
11. What is meant by a standardized r.v. ? Explain with the help of an illustration.
12. Prove that (i) the variance of a constant is zero, (ii) E(X²) ≥ [E(X)]².
M.G.F. :
13. Define the moment generating function of a random variable X. State and prove its properties. Also describe its uses.
14. Explain how the raw moments can be obtained using the M.G.F.
15. With usual notations show that,
(i) $M_X(0) = 1$
(ii) $M_{cX}(t) = M_X(ct)$
(iii) $M_{aX+b}(t) = e^{bt} M_X(at)$
(iv) $M_{-X}(t) = M_X(-t)$
(v) $M_{aX+bY}(t) = M_X(at) \cdot M_Y(bt)$ (where X and Y are independent r.v.s)
(vi) $M_{X+Y}(t) = M_X(t) \cdot M_Y(t)$ (where X and Y are independent r.v.s)
(vii) $M_{X-Y}(t) = M_X(t) \cdot M_Y(-t)$ (where X and Y are independent r.v.s)
16. Show that the M.G.F. of the sum of two independent random variables is the product of their M.G.F.s. (P.U. 2002)
17. Find the rth raw moment of X if its M.G.F. is $M_X(t) = (1 - t)^{-1}$.
C.G.F. :
20. Define the cumulant generating function of a r.v. X. Also explain how to obtain the cumulants from the C.G.F.
21. State expressions for the first four central moments in terms of the first four cumulants.
22. State the properties of the C.G.F.
23. Show that the C.G.F. of the sum of two independent r.v.s is the sum of their C.G.F.s.
24. What are moments of a r.v. ?
25. What is the need to study moments ?
26. Define (i) raw moments, (ii) moments about 'a', (iii) central moments of a discrete r.v.
27. Prove that the central moments are invariant to the change of origin.
28. Let X be a discrete r.v. with rth central moment $\mu_r(x)$. Let $Y = \frac{X - a}{h}$. Show that $\mu_r(x) = h^r\, \mu_r(y)$.