Week 5 Lectures
Example
Let X be a Bernoulli random variable such that

    X = 1, with probability p
        0, with probability 1 − p

We have that EX = 1 · p + 0 · (1 − p) = p.
Note that X takes only the values 0 and 1; it never takes the value
p = EX if p ∈ (0, 1). This is why the term "the expected value"
should not be taken literally.
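The calculation can be sketched in a few lines of Python (the helper name bernoulli_mean is ours, not a library function):

```python
def bernoulli_mean(p):
    # EX = 1 * P(X = 1) + 0 * P(X = 0) = p
    return 1 * p + 0 * (1 - p)

# EX equals p, a value the variable itself never takes when 0 < p < 1.
assert bernoulli_mean(0.3) == 0.3
```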
Note that the median of the Bernoulli distribution is not a very useful
parameter. If p = 1/2, then for any a ∈ (0, 1)

    P(X ≤ a) = P(X = 0) = P(X = 1) = P(X ≥ a) = 1/2,

so any such a qualifies as a median.
Example
A fair die is tossed once and X is the outcome:

    x                 1    2    3    4    5    6
    p(x) = P(X = x)   1/6  1/6  1/6  1/6  1/6  1/6

    E(X) = 1 × 1/6 + 2 × 1/6 + 3 × 1/6 + 4 × 1/6 + 5 × 1/6 + 6 × 1/6 = 3.5
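The same sum over the six faces can be checked exactly with Python's fractions module (a sketch, using only the standard library):

```python
from fractions import Fraction

# Each face 1..6 carries probability 1/6; sum x * p(x) exactly.
EX = sum(x * Fraction(1, 6) for x in range(1, 7))
assert EX == Fraction(7, 2)  # i.e. 3.5
```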
Example
A pair of dice is tossed once and X is the sum of the outcomes:

    x                 2     3     4     5     6     7     8     9     10    11    12
    p(x) = P(X = x)   1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36

    EX = ∑_{x=2}^{12} x f (x) = 2 × 1/36 + 3 × 2/36 + . . . + 12 × 1/36 = 7
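A short enumeration (standard library only) confirms both the table of frequencies and the value EX = 7:

```python
from fractions import Fraction
from collections import Counter

# All 36 equally likely ordered outcomes of two dice, tallied by their sum.
counts = Counter(a + b for a in range(1, 7) for b in range(1, 7))
assert counts[7] == 6 and counts[2] == counts[12] == 1  # matches the table

EX = sum(x * Fraction(n, 36) for x, n in counts.items())
assert EX == 7
```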
Example
A gambler bets on a coin-tossing game: he pays $100 for a head and
receives $100 for a tail. What is his expected gain?
Solution: let X be the gain. It is a random variable with
frequency pX (x) given by

    x        −100   100
    pX (x)   1/2    1/2

It gives EX = 100 × (1/2) + (−100) × (1/2) = 0. This is a fair
game!
Example
A roulette wheel has the numbers 1, 2, .., 36, as well as 0 and 00 (total
38 numbers). If you bet $1 that an odd non-zero number comes up,
you win or lose $1 according to whether or not that event occurs. Let
X be the gain. Since 18 of the 38 numbers are odd, we have

    x        1      −1
    pX (x)   18/38  20/38

Hence

    EX = 1 × 18/38 + (−1) × 20/38 = −2/38 = −1/19 ∼ −$0.05.

Your expected loss is about $0.05.
Example
Consider the roulette wheel again. If you bet $1 that the number 1 comes
up, you win $35 or lose $1 according to whether or not that event
occurs. Let X be the gain. We have

    x        35    −1
    pX (x)   1/38  37/38

Hence

    EX = 35 × 1/38 + (−1) × 37/38 = −2/38 = −1/19 ∼ −$0.05.
Your expected loss is about $0.05 again. Note that the expected value
is the same as in the previous example.
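Both roulette expectations can be verified exactly; the fractions below restate the win/loss probabilities of the two bets (18 odd numbers out of 38, and a single number out of 38):

```python
from fractions import Fraction

# Bet $1 on "odd": win $1 with prob 18/38, lose $1 with prob 20/38.
ex_odd = 1 * Fraction(18, 38) + (-1) * Fraction(20, 38)

# Bet $1 on a single number: win $35 with prob 1/38, lose $1 otherwise.
ex_single = 35 * Fraction(1, 38) + (-1) * Fraction(37, 38)

# Both bets have the same expected gain, −1/19 of a dollar.
assert ex_odd == ex_single == Fraction(-1, 19)
```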
Example
Consider the uniform probability density over (0, a):

    f (x) = 1/a,  0 ≤ x ≤ a
            0,    elsewhere

Then

    E(X) = ∫_0^a x · (1/a) dx = (1/a) · x²/2 |_0^a = a/2.
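As a numerical sanity check of E(X) = a/2 (a midpoint Riemann sum; the choices a = 4 and n are arbitrary, not part of the example):

```python
# Approximate E(X) = ∫_0^a x * (1/a) dx by a midpoint Riemann sum.
a, n = 4.0, 100_000
h = a / n
EX = sum(((i + 0.5) * h) * (1 / a) * h for i in range(n))
assert abs(EX - a / 2) < 1e-9
```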
Example
Consider the exponential distribution

    f (x) = λe^(−λx),  x ≥ 0
            0,         elsewhere

Then

    EX = E(X) = ∫_0^{+∞} x λe^(−λx) dx = 1/λ.
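A similar numerical check, truncating the integral at a large T where the exponential tail is negligible (the values λ = 2, T and n are arbitrary choices for illustration):

```python
import math

# Midpoint rule for ∫_0^T x * λ * exp(−λx) dx; the tail beyond T is negligible.
lam, T, n = 2.0, 40.0, 400_000
h = T / n
EX = sum(((i + 0.5) * h) * lam * math.exp(-lam * (i + 0.5) * h) * h
         for i in range(n))
assert abs(EX - 1 / lam) < 1e-6
```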
Similar rules apply in the case of discrete random
variables.
Example
Cauchy distribution: Consider a random variable X = X1 /X2 ,
where the Xi ∼ N (0, 1) are independent. We found in Week 4 that X has density

    f (x) = 1 / (π(1 + x²)).

For this density ∫ |x| f (x) dx = +∞, so EX does not exist.
The case when there is more than one x such that yk = g(x)
requires additional analysis. We have

    P(Y = yk ) = ∑_{j: g(xj )=yk} pX (xj ).

Hence

    EY = ∑_k yk pY (yk ) = ∑_k yk ∑_{j: g(xj )=yk} pX (xj )
       = ∑_k ∑_{j: g(xj )=yk} g(xj ) pX (xj ) = ∑_x g(x) p(x).
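The two routes in this derivation, building the distribution of Y first versus summing g(x) pX(x) directly, can be compared on a small hypothetical example (Y = X² with X uniform on five points, chosen by us so that several x map to one y):

```python
from fractions import Fraction
from collections import defaultdict

# X uniform on {−2, −1, 0, 1, 2}; Y = g(X) = X², so x and −x share one y.
pX = {x: Fraction(1, 5) for x in (-2, -1, 0, 1, 2)}
g = lambda x: x * x

# Route 1: build p_Y by collecting the mass of every x with g(x) = y_k.
pY = defaultdict(Fraction)
for x, p in pX.items():
    pY[g(x)] += p
EY_direct = sum(y * p for y, p in pY.items())

# Route 2: EY = Σ_x g(x) p_X(x), no distribution of Y needed.
EY_lotus = sum(g(x) * p for x, p in pX.items())

assert EY_direct == EY_lotus == 2  # (4 + 1 + 0 + 1 + 4) / 5
```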
Example
Let X be a Bernoulli random variable such that

    X = 1, with probability p
        0, with probability 1 − p

We have that EX² = 1² · p + 0² · (1 − p) = p.
Notations:
It is common to write EX² instead of E(X²). Therefore, EX²
means E(X²) rather than (EX)².
Example
Consider the uniform distribution over (0, a):

    f (x) = 1/a,  0 ≤ x ≤ a
            0,    elsewhere

Then

    E(X²) = ∫_0^a x² · (1/a) dx = (1/a) · x³/3 |_0^a = a²/3.
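The same midpoint-sum check as before, now for the second moment (a = 3 and n are arbitrary choices):

```python
# Approximate E(X²) = ∫_0^a x² * (1/a) dx by a midpoint Riemann sum.
a, n = 3.0, 100_000
h = a / n
EX2 = sum(((i + 0.5) * h) ** 2 * (1 / a) * h for i in range(n))
assert abs(EX2 - a * a / 3) < 1e-8
```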
Example
Consider the exponential distribution

    f (x) = λe^(−λx),  x ≥ 0
            0,         elsewhere

Then

    EX² = E(X²) = ∫_0^{+∞} x² λe^(−λx) dx = 2/λ².
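Both exponential moments can also be checked by simulation; the rate λ, the sample size and the seed below are arbitrary choices for illustration:

```python
import random

random.seed(0)  # fixed seed so the check is reproducible
lam = 2.0
xs = [random.expovariate(lam) for _ in range(200_000)]

m1 = sum(xs) / len(xs)               # sample mean
m2 = sum(x * x for x in xs) / len(xs)  # sample second moment

assert abs(m1 - 1 / lam) < 0.01       # EX  = 1/λ
assert abs(m2 - 2 / lam ** 2) < 0.02  # EX² = 2/λ²
```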
Example
Consider a pair of discrete random variables (X, Y ) with joint frequency

    p(x, y) = (x + y)/30,  x = 0, 1, 2; y = 0, 1, 2, 3
              0,           elsewhere.

Then

    E(XY ) = ∑_{x=0}^{2} ∑_{y=0}^{3} xy · (x + y)/30.
It gives

    E(XY ) = (1/30) ∑_{x=0}^{2} {x × 0 × (x + 0) + x × 1 × (x + 1)
                                 + x × 2 × (x + 2) + x × 3 × (x + 3)}
           = (1/30) ∑_{x=0}^{2} {x(x + 1) + 2x(x + 2) + 3x(x + 3)}
           = (1/30) {0(0 + 1) + 2 × 0 × (0 + 2) + 3 × 0 × (0 + 3)
                     + 1(1 + 1) + 2 × 1 × (1 + 2) + 3 × 1 × (1 + 3)
                     + 2(2 + 1) + 2 × 2 × (2 + 2) + 3 × 2 × (2 + 3)}
           = 72/30 = 2.4.
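The double sum can be checked exactly in Python (the inline p is our shorthand for the joint frequency above):

```python
from fractions import Fraction

p = lambda x, y: Fraction(x + y, 30)

# Sanity check: the frequencies sum to 1.
assert sum(p(x, y) for x in range(3) for y in range(4)) == 1

EXY = sum(x * y * p(x, y) for x in range(3) for y in range(4))
assert EXY == Fraction(72, 30)  # = 12/5 = 2.4
```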
Example
In the previous example

    E(X + Y ) = ∑_{x=0}^{2} ∑_{y=0}^{3} (x + y) · (x + y)/30
              = ∑_{x=0}^{2} ∑_{y=0}^{3} (x + y)²/30 = 98/30 ≈ 3.267.
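Again an exact check, together with a cross-check via the linearity property E(X + Y) = EX + EY (the inline p is our shorthand for the joint frequency):

```python
from fractions import Fraction

p = lambda x, y: Fraction(x + y, 30)

EXpY = sum((x + y) * p(x, y) for x in range(3) for y in range(4))
assert EXpY == Fraction(98, 30)

# Cross-check: the marginal means add up to the same value.
EX = sum(x * p(x, y) for x in range(3) for y in range(4))
EY = sum(y * p(x, y) for x in range(3) for y in range(4))
assert EX + EY == EXpY
```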
Special cases
Let g(X, Y ) = X. Let us verify that Eg(X, Y ) = E(X).
We have

    E(X) = ∑_x ∑_y x f (x, y),                      if X, Y are discrete
           ∫_{−∞}^{∞} ∫_{−∞}^{∞} x f (x, y) dx dy,  if X, Y are continuous

In the continuous case,

    ∫_{−∞}^{∞} ∫_{−∞}^{∞} x f (x, y) dx dy = ∫_{−∞}^{∞} x fX (x) dx = E(X),

where fX (x) ≡ ∫_{−∞}^{∞} f (x, y) dy is the marginal density of X.
A similar conclusion holds in the discrete case.
Theorem Let a be a constant. Then
E(a) = a.
E(aX) = aEX.
Example
A roulette wheel has the numbers 1,2,..,36, as well as 0 and 00 (total
38 numbers). If you bet $1 that an odd non-zero number comes up,
you win or lose $1 according to whether or not that event occurs. Let
X be the gain. We found above that
EX = −$1/19 ∼ −$0.05.
If you bet $100 that an odd non-zero number comes up, your gain
is Y = 100X, and EY = 100 EX ∼ −$5.
Example
A roulette wheel has the numbers 1,2,..,36, as well as 0 and 00 (total
38 numbers). If you bet $1 that an odd non-zero number comes up,
you win or lose $1 according to whether or not that event occurs. Let
X be the gain. We found above that EX = −$1/19 ∼ −$0.05.
Assume that you have $10 in your pocket, and let Y be the total
amount of money in your pocket after the game. Then Y = $10 + X
and EY = $10 + EX ∼ $9.95.
Theorem Let Z = a0 + ∑_{i=1}^{n} ai Xi , where a0 , a1 , . . . , an are
constants. Then

    EZ = a0 + ∑_{i=1}^{n} ai EXi .
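A sketch of this rule with two fair dice (the constants a0, a1, a2 are arbitrary choices); the mean of Z over all 36 outcomes matches a0 + a1 EX1 + a2 EX2:

```python
from itertools import product
from fractions import Fraction

# Z = a0 + a1*X1 + a2*X2 with X1, X2 fair dice; constants chosen arbitrarily.
a0, a1, a2 = 5, 2, 3
EZ = sum(Fraction(a0 + a1 * x1 + a2 * x2, 36)
         for x1, x2 in product(range(1, 7), repeat=2))

# Linearity: a0 + a1*EX1 + a2*EX2 with EXi = 7/2.
assert EZ == a0 + (a1 + a2) * Fraction(7, 2)
```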
Example
Let X be the total number of successes in n Bernoulli trials. We have
that X = X1 + . . . + Xn , where the Xi are Bernoulli variables such that

    Xi = 1, with probability p
         0, with probability 1 − p

Hence EX = EX1 + . . . + EXn = np.
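Hypothetically taking n = 10 and p = 0.3, the binomial mean computed term by term from the frequency function agrees with the sum of the n Bernoulli means:

```python
from math import comb

n, p = 10, 0.3
# EX computed directly from the binomial frequency function ...
EX = sum(k * comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1))
# ... equals np, the sum of the n Bernoulli means.
assert abs(EX - n * p) < 1e-9
```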
5.10 Moments
Definition: The kth moment of the random variable X is defined as
µ′k = E(X^k ), k = 1, 2, . . . The first moment (i.e., EX) is called the mean of X
and is often denoted by µ, i.e.,

    µ′1 = E(X) = mean of X ≡ µ.
Example
Let X be a Bernoulli random variable such that

    X = 1, with probability p
        0, with probability 1 − p

We have that µ′k = EX^k = 1^k · p + 0^k · (1 − p) = p for every k = 1, 2, . . .
Definition: The kth central moment (or the moment about the mean)
of the random variable X is defined as E[(X − µ)^k ]. It is usually
denoted by µk . By the rule of expectation, we have that

    µk = ∑_x (x − µ)^k f (x),              if X is discrete
         ∫_{−∞}^{∞} (x − µ)^k f (x) dx,    if X is continuous
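A sketch of this rule for a Bernoulli(p) variable (p = 1/4 is an arbitrary choice); the second central moment comes out as p(1 − p), the variance:

```python
from fractions import Fraction

# Central moments of Bernoulli(p) via µ_k = Σ_x (x − µ)^k f(x).
p = Fraction(1, 4)
f = {0: 1 - p, 1: p}

mu = sum(x * w for x, w in f.items())                # µ = p
mu2 = sum((x - mu) ** 2 * w for x, w in f.items())   # second central moment

assert mu == p
assert mu2 == p * (1 - p)
```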
Example
Let X be a Bernoulli random variable such that

    X = 1, with probability p
        0, with probability 1 − p

We have that µ = EX = p.
In particular,