Conditional Expectations E(X|Y) As Random Variables: Sums of a Random Number of Random Variables (Random Sums)
Let X and Y be two discrete r.v.'s with a joint p.m.f. fX,Y(x, y) = P(X = x, Y = y). Remember
that the distributions (or the p.m.f.'s) fX(x) = P(X = x) of X and fY(y) = P(Y = y) of Y are
called the marginal distributions of the pair (X, Y) and that

fX(x) = Σy fX,Y(x, y),   fY(y) = Σx fX,Y(x, y).

The conditional expectation

E(g(X)|Y = y) = Σx g(x)P(X = x|Y = y)

is defined for any real valued function g(X). In particular, E(X²|Y = y) is obtained when
g(X) = X², and

Var(X|Y = y) = E(X²|Y = y) − [E(X|Y = y)]².
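For concreteness, a small Python sketch that computes E(X|Y = y), E(X²|Y = y) and Var(X|Y = y) from a joint p.m.f.; the joint table below is an illustrative example of my own, not one from the notes.

```python
from fractions import Fraction as F

# Illustrative joint p.m.f. fX,Y(x, y) = P(X = x, Y = y)
# (a made-up example, not taken from the notes).
joint = {
    (0, 0): F(1, 8), (1, 0): F(1, 8), (2, 0): F(1, 4),
    (0, 1): F(1, 4), (1, 1): F(1, 8), (2, 1): F(1, 8),
}

def cond_exp(g, y):
    """E(g(X) | Y = y) = sum over x of g(x) * P(X = x | Y = y)."""
    f_y = sum(p for (_, yy), p in joint.items() if yy == y)  # marginal P(Y = y)
    return sum(g(x) * p for (x, yy), p in joint.items() if yy == y) / f_y

def cond_var(y):
    """Var(X | Y = y) = E(X^2 | Y = y) - [E(X | Y = y)]^2."""
    return cond_exp(lambda x: x**2, y) - cond_exp(lambda x: x, y) ** 2
```

Exact fractions keep the arithmetic free of rounding error, so the identities can be checked exactly.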
Definition. Denote ϕ(y) = E(X|Y = y). Then E(X|Y) := ϕ(Y). In words, E(X|Y) is a random
variable which is a function of Y, taking the value E(X|Y = y) when Y = y.
The r.v. E(g(X)|Y) is defined similarly. In particular, E(X²|Y) is obtained when g(X) = X², and

Var(X|Y) = E(X²|Y) − [E(X|Y)]².
Remark. Note that E(X|Y ) is a random variable whereas E(X|Y = y) is a number (y is fixed).
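To see the distinction concretely, here is a sketch (the joint table is an illustrative assumption, not from the notes): `phi` returns the number E(X|Y = y) for each fixed y, while the r.v. E(X|Y) = ϕ(Y) has a distribution induced by the marginal of Y.

```python
from fractions import Fraction as F

# Illustrative joint p.m.f. (an assumed example, not from the notes).
joint = {
    (0, 0): F(1, 8), (1, 0): F(1, 8), (2, 0): F(1, 4),
    (0, 1): F(1, 4), (1, 1): F(1, 8), (2, 1): F(1, 8),
}

def phi(y):
    """phi(y) = E(X | Y = y) -- a number for each fixed y."""
    f_y = sum(p for (_, yy), p in joint.items() if yy == y)
    return sum(x * p for (x, yy), p in joint.items() if yy == y) / f_y

# E(X|Y) = phi(Y) is a random variable: it takes the value phi(y)
# with probability P(Y = y).
marginal_Y = {}
for (_, y), p in joint.items():
    marginal_Y[y] = marginal_Y.get(y, F(0)) + p
dist_E_X_given_Y = {phi(y): p for y, p in marginal_Y.items()}
```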
Theorem 1. (i) E[E(X|Y)] = E(X).
(ii) E(S_N) = E(N)E(X), where S_N = X_1 + X_2 + · · · + X_N is a sum of a random number N of random variables, N is a non-negative integer-valued r.v. independent of the X_j's, and the r.v.'s X_j are independent of each other and have the same distribution as a given integer-valued r.v. X.
Proof. Was given in lectures (and a different proof can be found in Notes 4).
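For the random sum S_N = X_1 + · · · + X_N, with N independent of the i.i.d. X_j's, the identity E(S_N) = E(N)E(X) can be checked by a quick Monte Carlo sketch; the dice distributions below are my own illustrative choice.

```python
import random

random.seed(12345)  # fixed seed for reproducibility

def sample_random_sum():
    """One draw of S_N: roll a die for N, then sum N further die rolls."""
    n = random.randint(1, 6)                            # N uniform on {1,...,6}
    return sum(random.randint(1, 6) for _ in range(n))  # i.i.d. X_j, same law

trials = 200_000
estimate = sum(sample_random_sum() for _ in range(trials)) / trials
# Theory: E(S_N) = E(N) * E(X) = 3.5 * 3.5 = 12.25
```

With 200,000 trials the estimate should land within a few hundredths of 12.25.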
2. E[E(g(X)|Y )] = E(g(X))
Proof. Set Z = g(X). Statement (i) of Theorem 1 applies to any two r.v.'s. Hence, applying it
to Z and Y we obtain E[E(Z|Y)] = E(Z), which is the same as E[E(g(X)|Y)] = E(g(X)). □
This property may seem to be a more general statement than (i) in Theorem 1. The proof above
shows that in fact the two statements are equivalent.
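A direct exact check of E[E(g(X)|Y)] = E(g(X)) with g(x) = x², using an illustrative joint p.m.f. (an assumed example, not from the notes):

```python
from fractions import Fraction as F

# Illustrative joint p.m.f. (assumed example).
joint = {
    (0, 0): F(1, 8), (1, 0): F(1, 8), (2, 0): F(1, 4),
    (0, 1): F(1, 4), (1, 1): F(1, 8), (2, 1): F(1, 8),
}

def g(x):
    return x ** 2

# Left side: E[E(g(X)|Y)] = sum over y of E(g(X)|Y = y) * P(Y = y)
lhs = F(0)
for y in {yy for (_, yy) in joint}:
    f_y = sum(p for (_, yy), p in joint.items() if yy == y)
    cond = sum(g(x) * p for (x, yy), p in joint.items() if yy == y) / f_y
    lhs += cond * f_y

# Right side: E(g(X)) computed directly from the joint p.m.f.
rhs = sum(g(x) * p for (x, _), p in joint.items())
```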
3. E(XY |Y ) = Y E(X|Y ).
Proof. E(XY|Y = y) = E(yX|Y = y) = yE(X|Y = y) (because y is a constant). Hence, E(XY|Y) =
Y E(X|Y) by the definition of the conditional expectation. □
Corollary. E(XY) = E[Y E(X|Y)].
Proof. E(XY) = E[E(XY|Y)] = E[Y E(X|Y)]. □
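The corollary can also be verified exactly on a small example (the joint table is an illustrative assumption, not from the notes):

```python
from fractions import Fraction as F

# Illustrative joint p.m.f. (assumed example).
joint = {
    (0, 0): F(1, 8), (1, 0): F(1, 8), (2, 0): F(1, 4),
    (0, 1): F(1, 4), (1, 1): F(1, 8), (2, 1): F(1, 8),
}

# Left side: E(XY) directly from the joint p.m.f.
lhs = sum(x * y * p for (x, y), p in joint.items())

# Right side: E[Y * E(X|Y)] = sum over y of y * E(X|Y = y) * P(Y = y)
rhs = F(0)
for y in {yy for (_, yy) in joint}:
    f_y = sum(p for (_, yy), p in joint.items() if yy == y)
    e_x_given_y = sum(x * p for (x, yy), p in joint.items() if yy == y) / f_y
    rhs += y * e_x_given_y * f_y
```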
Exercise. Use the same method to prove that E(Xh(Y )|Y ) = h(Y )E(X|Y ) for any real valued
function h(y).