CS 70 Discrete Mathematics and Probability Theory

Fall 2019 Alistair Sinclair and Yun S. Song HW 10

Note: This homework consists of two parts. The first part (questions 1-5) will be graded and will
determine your score for this homework. The second part (questions 6-7) will be graded if you
submit them, but will not affect your homework score in any way. You are strongly advised to
attempt all the questions in the first part. You should attempt the problems in the second part only
if you are interested and have time to spare.

For each problem, justify all your answers unless otherwise specified.

Part 1: Required Problems

1 Random Variables Warm-Up


Let X and Y be random variables, each taking values in the set {0, 1, 2}, with joint distribution

P[X = 0,Y = 0] = 1/3 P[X = 0,Y = 1] = 0 P[X = 0,Y = 2] = 1/3


P[X = 1,Y = 0] = 0 P[X = 1,Y = 1] = 1/9 P[X = 1,Y = 2] = 0
P[X = 2,Y = 0] = 1/9 P[X = 2,Y = 1] = 1/9 P[X = 2,Y = 2] = 0.

(a) What are the marginal distributions of X and Y ?

(b) What are E[X] and E[Y ]?


(c) What are var(X) and var(Y )?
(d) Let I be the indicator that X = 1, and J be the indicator that Y = 1. What are E[I], E[J] and
E[IJ]?

(e) In general, let IA and IB be the indicators for events A and B in a probability space (Ω, P). What
is E[IA IB ], in terms of the probability of some event?

Solution:

(a) By the law of total probability,

P[X = 0] = P[X = 0,Y = 0] + P[X = 0,Y = 1] + P[X = 0,Y = 2] = 1/3 + 0 + 1/3 = 2/3



and similarly

P[X = 1] = 0 + 1/9 + 0 = 1/9


P[X = 2] = 1/9 + 1/9 + 0 = 2/9.

As a sanity check, these three numbers are all positive and they add up to 2/3 + 1/9 + 2/9 = 1
as they should. The same kind of calculation gives

P[Y = 0] = 1/3 + 0 + 1/9 = 4/9


P[Y = 1] = 0 + 1/9 + 1/9 = 2/9
P[Y = 2] = 1/3.

(b) From the above marginal distributions, we can compute

E[X] = 0P[X = 0] + 1P[X = 1] + 2P[X = 2] = 5/9


E[Y ] = 0P[Y = 0] + 1P[Y = 1] + 2P[Y = 2] = 8/9

(c) Again using our marginal distributions,

E[X²] = 0·P[X = 0] + 1·P[X = 1] + 4·P[X = 2] = 1/9 + 8/9 = 1


E[Y²] = 0·P[Y = 0] + 1·P[Y = 1] + 4·P[Y = 2] = 2/9 + 12/9 = 14/9

and thus

var(X) = E[X²] − E[X]² = 1 − (5/9)² = 56/81


var(Y) = E[Y²] − E[Y]² = 14/9 − (8/9)² = 62/81.

(d) We know that taking the expectation of an indicator for some event gives the probability of
that event, so

E[I] = P[X = 1] = 1/9


E[J] = P[Y = 1] = 2/9.

The random variable IJ is equal to one if I = 1 and J = 1, and is zero otherwise. In other
words, it is the indicator for the event that I = 1 and J = 1:

E[IJ] = P[I = 1, J = 1] = 1/9.

(e) By what we said in the previous part of the solution, IA IB is the indicator for the event A ∩ B,
so
E[IA IB ] = P[A ∩ B].
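
As an optional numerical sanity check, all of these quantities can be recomputed exactly with a few
lines of Python (a minimal sketch, not part of the required solution; exact fractions avoid rounding):

from fractions import Fraction as F

# Joint distribution P[X = x, Y = y] from the problem statement.
joint = {(0, 0): F(1, 3), (0, 1): F(0, 1), (0, 2): F(1, 3),
         (1, 0): F(0, 1), (1, 1): F(1, 9), (1, 2): F(0, 1),
         (2, 0): F(1, 9), (2, 1): F(1, 9), (2, 2): F(0, 1)}

# Marginals by the law of total probability.
pX = {x: sum(joint[(x, y)] for y in range(3)) for x in range(3)}
pY = {y: sum(joint[(x, y)] for x in range(3)) for y in range(3)}

EX = sum(x * p for x, p in pX.items())                   # 5/9
EY = sum(y * p for y, p in pY.items())                   # 8/9
varX = sum(x * x * p for x, p in pX.items()) - EX ** 2   # 56/81
varY = sum(y * y * p for y, p in pY.items()) - EY ** 2   # 62/81
EIJ = joint[(1, 1)]                                      # E[IJ] = P[X = 1, Y = 1] = 1/9
print(pX, pY, EX, EY, varX, varY, EIJ)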



2 Marginals
(a) Can there exist three random variables X1 , X2 , X3 , each taking values in the set {+1, −1}, with
the property that for every i ≠ j, the joint distribution of Xi and Xj is given by

P[Xi = 1, Xj = −1] = P[Xi = −1, Xj = 1] = 1/2,    P[Xi = Xj] = 0?    (1)

If so, specify the joint distribution of X1 , X2 , X3 ; if not, prove it.
(b) For which natural numbers n ≥ 3 can there exist random variables X1 , X2 , ..., Xn , each taking
values in the set {+1, −1}, with the property that for every i and j satisfying i − j ≡ 1 (mod n),
the joint distribution of Xi and Xj is given by (1)? For any n that work, specify the joint
distribution; for those that do not, prove it.

Solution:

(a) No such random variables can exist; let’s prove it by contradiction. From the desired joint
distribution of X1 and X2 , we claim that X1 = −X2 (by which we mean that for every ω in the
sample space X1 (ω) = −X2 (ω)). Similarly, we would need to have X2 = −X3 and X3 = −X1 .
But now
X1 = −X2 = X3 = −X1 ,
a contradiction since X1 ∈ {+1, −1}.
(b) This is only possible if n is even. When n = 2k + 1, the same argument as above gives us

X1 = −X2 = X3 = · · · = −X2k = X2k+1 = −X1 ,

a contradiction for the same reason as before. However, when n = 2k, we can set X1 , ..., X2k to
have the joint distribution

P[X1 = 1, X2 = −1, X3 = 1, . . . , X2k = −1] = 1/2


P[X1 = −1, X2 = 1, X3 = −1, . . . , X2k = 1] = 1/2,

i.e., the two alternating sign patterns each occur with probability 1/2, so every pair of consecutive
variables (including X2k and X1 ) takes opposite signs with probability 1.
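
This construction is easy to verify mechanically; here is a minimal Python sketch of that check
(the function name is illustrative, and the check brute-forces the two-point distribution (1)):

from fractions import Fraction as F

def check_construction(k):
    """Verify the two alternating sign patterns for n = 2k satisfy (1) for consecutive pairs."""
    n = 2 * k
    omega1 = tuple(+1 if i % 2 == 0 else -1 for i in range(n))  # (+1, -1, +1, ..., -1)
    omega2 = tuple(-x for x in omega1)                          # (-1, +1, -1, ..., +1)
    dist = {omega1: F(1, 2), omega2: F(1, 2)}
    for i in range(n):
        j = (i + 1) % n  # a consecutive pair, wrapping around mod n
        assert sum(p for w, p in dist.items() if w[i] == +1 and w[j] == -1) == F(1, 2)
        assert sum(p for w, p in dist.items() if w[i] == -1 and w[j] == +1) == F(1, 2)
        assert sum(p for w, p in dist.items() if w[i] == w[j]) == 0
    return True

print(check_construction(3))  # n = 6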

3 Random Tournaments
A tournament is a directed graph in which every pair of vertices has exactly one directed edge
between them—for example, here are two tournaments on the vertices {1, 2, 3}:



In the first tournament above, (1, 2, 3) is a Hamiltonian path, since it visits all the vertices exactly
once, without repeating any edges, but (1, 2, 3, 1) is not a valid Hamiltonian cycle, because the
tournament contains the directed edge 1 → 3 and not 3 → 1. In the second tournament, (1, 2, 3, 1)
is a Hamiltonian cycle, as are (2, 3, 1, 2) and (3, 1, 2, 3); for this problem we’ll say that these are all
different Hamiltonian cycles, since their start/end points are different.
Consider the following way of choosing a random tournament T on n vertices: independently
for each (unordered) pair of vertices {i, j} ⊂ {1, ..., n}, flip a fair coin and include the edge i → j in
the graph if the outcome is heads, and the edge j → i if tails. What is the expected number of
Hamiltonian paths in T ? What is the expected number of Hamiltonian cycles?
Solution:
Each possible Hamiltonian path in the graph corresponds to a permutation σ of the numbers 1, ..., n,
where σ(1) is the starting vertex, σ(2) is the second vertex visited, etc. If we write Iσ for the
indicator random variable that σ corresponds to an actual Hamiltonian path in T, then

E[# Hamiltonian Paths] = E[∑_σ Iσ] = ∑_σ P[σ is a Hamiltonian path in T].

In order for σ to correspond to an actual Hamiltonian path in T, the edges σ(i) → σ(i + 1),
for i = 1, ..., n − 1, must all be included in the graph. Since the orientations of the edges in T are
independent, with σ(i) → σ(i + 1) occurring with probability 1/2, the probability that they are all
included is 2^(−(n−1)). There are n! possible permutations, so we have

E[# Hamiltonian Paths] = n!/2^(n−1).

The situation for Hamiltonian cycles is similar. Each possible Hamiltonian cycle corresponds to a
permutation σ, but this time, in order for σ to be a valid Hamiltonian cycle,
T must include the edges σ(i) → σ(i + 1) for all i = 1, ..., n − 1, as well as the edge σ(n) → σ(1).
As above, these n edges are oriented independently of one another, so

E[# Hamiltonian Cycles] = n!/2^n.
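
Both formulas are easy to corroborate by Monte Carlo for small n; here is a minimal Python sketch
(the helper names and parameters are illustrative):

import itertools, math, random

def ham_counts(n, heads):
    """Count Hamiltonian paths and cycles in the tournament encoded by heads.

    heads[(i, j)] (for i < j) is True iff the coin flip put the edge i -> j in T.
    """
    def edge(u, v):  # is the directed edge u -> v present?
        return heads[(u, v)] if u < v else not heads[(v, u)]
    paths = cycles = 0
    for p in itertools.permutations(range(n)):
        if all(edge(p[i], p[i + 1]) for i in range(n - 1)):
            paths += 1
            if edge(p[-1], p[0]):
                cycles += 1
    return paths, cycles

n, trials = 5, 2000
rng = random.Random(0)
tp = tc = 0
for _ in range(trials):
    heads = {(i, j): rng.random() < 0.5 for i in range(n) for j in range(i + 1, n)}
    p, c = ham_counts(n, heads)
    tp += p
    tc += c
print(tp / trials, math.factorial(n) / 2 ** (n - 1))  # estimate vs n!/2^(n-1) = 7.5
print(tc / trials, math.factorial(n) / 2 ** n)        # estimate vs n!/2^n = 3.75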

4 Triangles in Random Graphs


Let’s say we make a simple, undirected graph G on n vertices by randomly adding m edges,
without replacement. In other words, we choose the first edge uniformly from all C(n, 2) possible
edges (writing C(a, b) for the binomial coefficient “a choose b”), then the second one uniformly
from among the remaining C(n, 2) − 1 edges, etc. What is the expected number of triangles in G?
(A triangle is a triplet of distinct vertices with all three edges present between them.)
Solution:



Let’s label our vertices 1, ..., n, and first compute the probability that vertices 1, 2, 3 form a
triangle. This event is described by a hypergeometric distribution with parameters C(n, 2), 3, m:
when we make the graph, we are drawing the m edges from a bucket of C(n, 2) possible edges,
3 of which are the ones connecting vertices 1, 2, and 3. Thus the probability that all three of these
edges exist is

P[1, 2, and 3 form a triangle] = C(3, 3) · C(C(n, 2) − 3, m − 3) / C(C(n, 2), m).

In fact, there was nothing special about vertices 1, 2, 3 in this calculation. The probability that
the three edges connecting any triplet of distinct vertices i, j, k all exist is equal to the quantity above.
Now, for each subset {i, j, k} ⊂ {1, ..., n}, let Ii,j,k be the indicator that these three vertices form a
triangle. We then have

E[# of triangles] = E[∑_{{i,j,k}⊂{1,...,n}} Ii,j,k]
                  = ∑_{{i,j,k}⊂{1,...,n}} E[Ii,j,k]        (linearity of expectation)
                  = ∑_{{i,j,k}⊂{1,...,n}} P[i, j, and k form a triangle]
                  = C(n, 3) · C(3, 3) · C(C(n, 2) − 3, m − 3) / C(C(n, 2), m).
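
As a sanity check, the closed form can be compared against a direct simulation of the random
graph; here is a minimal Python sketch (function names and parameters are illustrative; math.comb
requires Python 3.8+):

import itertools, math, random

def expected_triangles(n, m):
    """The closed-form answer: C(n, 3) * C(C(n, 2) - 3, m - 3) / C(C(n, 2), m)."""
    N = math.comb(n, 2)
    return math.comb(n, 3) * math.comb(N - 3, m - 3) / math.comb(N, m)

def average_triangles(n, m, trials=2000, seed=0):
    """Monte Carlo average of the triangle count; assumes m >= 3."""
    rng = random.Random(seed)
    all_edges = list(itertools.combinations(range(n), 2))
    total = 0
    for _ in range(trials):
        g = set(rng.sample(all_edges, m))  # m distinct edges, chosen uniformly
        total += sum({(i, j), (i, k), (j, k)} <= g
                     for i, j, k in itertools.combinations(range(n), 3))
    return total / trials

print(average_triangles(6, 8), expected_triangles(6, 8))  # both ~2.46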

5 Variance
A building has n upper floors numbered 1, 2, . . . , n, plus a ground floor G. At the ground floor, m
people get on the elevator together, and each person gets off at one of the n upper floors uniformly
at random and independently of everyone else. What is the variance of the number of floors the
elevator does not stop at?
Solution: Let N be the number of floors the elevator does not stop at. We can represent N as the
sum of the indicator variables I1 , . . . , In , where Ii = 1 if no one gets off on floor i. Thus, we have
E[Ii] = P[Ii = 1] = ((n − 1)/n)^m,

and from linearity of expectation,

E[N] = ∑_{i=1}^n E[Ii] = n((n − 1)/n)^m.



To find the variance, we cannot simply sum the variances of our indicator variables, since they are
not independent. However, since var(N) = E[N²] − E[N]², the only piece we don’t already know is
E[N²]. We can calculate this by again expanding N as a sum:

E[N²] = E[(I1 + · · · + In)²] = E[∑_{i,j} Ii Ij] = ∑_{i,j} E[Ii Ij] = ∑_i E[Ii²] + ∑_{i≠j} E[Ii Ij].

The first term is simple to calculate: since Ii is an indicator, Ii² = Ii, so we have

E[Ii²] = E[Ii] = P[Ii = 1] = ((n − 1)/n)^m,

meaning that

∑_{i=1}^n E[Ii²] = n((n − 1)/n)^m.

From the definition of the variables Ii, we see that Ii Ij = 1 exactly when both Ii and Ij are 1, which
means no one gets off the elevator on floor i and no one gets off on floor j. This happens with
probability

P[Ii = Ij = 1] = P[Ii = 1 ∩ Ij = 1] = ((n − 2)/n)^m.

Thus we now know

∑_{i≠j} E[Ii Ij] = n(n − 1)((n − 2)/n)^m,

and we can assemble everything we’ve done so far to see that

var(N) = E[N²] − E[N]² = n((n − 1)/n)^m + n(n − 1)((n − 2)/n)^m − n²((n − 1)/n)^(2m).
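
These formulas are easy to corroborate by simulation; here is a minimal Python sketch (the function
name and the choice of n, m below are illustrative):

import random

def simulate_skipped_floors(n, m, trials=100_000, seed=0):
    """Estimate E[N] and var(N), where N = number of floors the elevator skips."""
    rng = random.Random(seed)
    samples = []
    for _ in range(trials):
        stops = {rng.randrange(1, n + 1) for _ in range(m)}  # floors chosen by the m riders
        samples.append(n - len(stops))
    mean = sum(samples) / trials
    var = sum((x - mean) ** 2 for x in samples) / trials
    return mean, var

n, m = 10, 7
EN = n * ((n - 1) / n) ** m
varN = EN + n * (n - 1) * ((n - 2) / n) ** m - EN ** 2
print(simulate_skipped_floors(n, m))  # should be close to:
print(EN, varN)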

Note: This concludes the first part of the homework. The problems below are optional, will not
affect your score, and should be attempted only if you have time to spare.

Part 2: Optional Problems

6 Indicators, Probabilities, and Positivity


(a) Let X be a positive random variable, i.e. X(ω) ≥ 0 for every ω ∈ Ω. Prove that E[X] ≥ 0.
(b) Let n be a natural number, α1 , . . . , αn ∈ R, and let (Ω, P) be a probability space with some
events A1 , . . . , An ⊂ Ω. Prove that ∑_{i=1}^n ∑_{j=1}^n αi αj P(Ai ∩ Aj) ≥ 0. Note that the αi may be
negative.



(c) Again let X be a positive random variable, and let I be the indicator that X > 0. Prove that

P[X > 0] ≥ (E[X])² / E[X²].

It may be useful to prove that X = XI, and to consider the random variable (X + aI)² for various
values of a ∈ R.

Solution:

(a) Directly from the definition, E[X] = ∑_{ω∈Ω} P[ω] X(ω), and the right-hand side is a sum of
nonnegative terms, since P[ω] ≥ 0 and X(ω) ≥ 0 for every ω ∈ Ω.
(b) Let’s write the sum using the indicators I1 , ..., In for the events A1 , ..., An . Then

∑_{i=1}^n ∑_{j=1}^n αi αj P[Ai ∩ Aj] = ∑_{i=1}^n ∑_{j=1}^n αi αj E[Ii Ij] = E[∑_{i=1}^n ∑_{j=1}^n αi αj Ii Ij]
= E[(∑_{i=1}^n αi Ii)(∑_{j=1}^n αj Ij)] = E[(∑_{i=1}^n αi Ii)²] ≥ 0,

where the final inequality follows from part (a), since (∑_{i=1}^n αi Ii)² is a positive random variable.
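
This quadratic-form inequality can be spot-checked numerically; here is a minimal Python sketch
using a randomly generated finite probability space (all names and parameters below are arbitrary):

import random

rng = random.Random(1)
n_omega, n_events = 8, 4
P = [1 / n_omega] * n_omega                     # a uniform probability space
events = [{w for w in range(n_omega) if rng.random() < 0.5} for _ in range(n_events)]
alpha = [rng.uniform(-2, 2) for _ in range(n_events)]

# The quadratic form sum_{i,j} alpha_i alpha_j P[A_i ∩ A_j] should be nonnegative.
quad = sum(alpha[i] * alpha[j] * sum(P[w] for w in events[i] & events[j])
           for i in range(n_events) for j in range(n_events))
assert quad >= -1e-12  # nonnegative, up to floating-point error
print(quad)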

(c) As suggested, note that X = XI, since if X(ω) = 0, then X(ω)I(ω) = 0 = X(ω), and if X(ω) > 0,
then I(ω) = 1. We know that for any a ∈ R, (aX + I)² is a positive random variable, so from
part (a),

0 ≤ E[(aX + I)²] = a² E[X²] + 2a E[XI] + E[I²] = a² E[X²] + 2a E[X] + P[X > 0],

where we used E[XI] = E[X] (since X = XI) and E[I²] = E[I] = P[X > 0] (since I² = I).

Rearranging gives

P[X > 0] ≥ −2a E[X] − a² E[X²],

and setting a = −E[X]/E[X²] makes the right-hand side equal to

2(E[X])²/E[X²] − (E[X])²/E[X²] = (E[X])²/E[X²],

which is the requested lower bound.
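
For intuition, the bound is easy to test numerically on any nonnegative random variable; here is a
minimal Python sketch over a small finite sample space (the values below are arbitrary):

# A nonnegative random variable on a 6-point uniform sample space (arbitrary values).
values = [0, 0, 1, 2, 5, 2]
probs = [1 / 6] * 6

EX = sum(p * x for p, x in zip(probs, values))
EX2 = sum(p * x * x for p, x in zip(probs, values))
P_pos = sum(p for p, x in zip(probs, values) if x > 0)
assert P_pos >= EX ** 2 / EX2  # the second-moment bound from part (c)
print(P_pos, EX ** 2 / EX2)    # 0.666... >= 0.490...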

7 Swaps and Cycles


We’ll say that a permutation π = (π(1), ..., π(n)) contains a swap if there exist distinct i, j ∈ {1, ..., n}
such that π(i) = j and π(j) = i.

(a) What is the expected number of swaps in a random permutation?


(b) What about the variance?
(c) We say that π is an involution if π(π(i)) = i for every i = 1, ..., n. What is the probability that
π is an involution? The answer may depend on n...
(d) In the same spirit as above, we’ll say that π contains an s-cycle if there exist distinct i1 , ..., is ∈ {1, ..., n}
with π(i1 ) = i2 , π(i2 ) = i3 , ..., π(is ) = i1 . Compute the expectation and variance of the number
of s-cycles.



Solution:

(a) As a warm-up, let’s compute the probability that 1 and 2 are swapped. There are n! possible
permutations, and (n − 2)! of them have π(1) = 2 and π(2) = 1. This means
P[(1, 2) are a swap] = (n − 2)!/n! = 1/(n(n − 1)).
There was nothing special about 1 and 2 in this calculation, so for any {i, j} ⊂ {1, ..., n}, the
probability that i and j are swapped is the same as above. Let’s write Ii,j for the indicator that
i and j are swapped, and N for the total number of swaps, so that

E[N] = E[∑_{{i,j}⊂{1,...,n}} Ii,j] = ∑_{{i,j}⊂{1,...,n}} P[(i, j) are swapped] = C(n, 2) · 1/(n(n − 1)) = 1/2.

(b) For the variance, when we expand N² as a sum over pairs of pairs {i, j}, {k, l}, we’ll need to know
E[Ii,j Ik,l], which is just the probability that i and j are swapped and that k and l are swapped as
well. There are three cases to consider. If {i, j} = {k, l}, then the probability is exactly what
we computed above. If the two pairs share exactly one element, then it is impossible that they are both
swaps. If the pairs are disjoint, then of the n! possible permutations, (n − 4)! include the two
swaps we are concerned with. Thus

E[N²] = ∑_{{i,j}} E[Ii,j²] + ∑_{{i,j}∩{k,l}=∅} E[Ii,j Ik,l]
      = C(n, 2) · 1/(n(n − 1)) + C(n, 2) C(n − 2, 2) · 1/(n(n − 1)(n − 2)(n − 3))
      = 1/2 + (n(n − 1)/2) · ((n − 2)(n − 3)/2) · 1/(n(n − 1)(n − 2)(n − 3)) = 1/2 + 1/4,

and var(N) = E[N²] − E[N]² = 1/2 + 1/4 − 1/4 = 1/2.


(c) Let π be an involution, and consider some i ∈ {1, ..., n}. If we call j = π(i), we know from the
definition that i = π(π(i)) = π(j). Note that j = i is possible: a fixed point π(i) = i satisfies
π(π(i)) = i automatically. Thus an involution partitions the set {1, ..., n} into fixed points and
swaps. To count the involutions with exactly k swaps, we first choose which 2k elements belong
to swaps, and then match those 2k elements into pairs. Listing the pairs of a matching one at a
time can be done in

C(2k, 2) C(2k − 2, 2) · · · C(4, 2) C(2, 2) = (2k)!/2^k

ways, and since this counts each matching once for each of the k! orders in which its pairs can
be listed, there are (2k)!/(2^k k!) matchings. Hence there are

C(n, 2k) · (2k)!/(2^k k!) = n!/(2^k k! (n − 2k)!)

involutions with exactly k swaps, and summing over k, the probability that a random permutation
is an involution is

(1/n!) · ∑_{k=0}^{⌊n/2⌋} n!/(2^k k! (n − 2k)!) = ∑_{k=0}^{⌊n/2⌋} 1/(2^k k! (n − 2k)!).
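
Since counts like this are easy to get wrong, it is worth confirming the formula by brute force; here
is a minimal Python sketch comparing it against direct enumeration (the function name is illustrative):

import itertools, math
from fractions import Fraction as F

def involution_probability(n):
    """Exact P[pi is an involution]: brute force vs. the summation formula."""
    brute = sum(all(pi[pi[i]] == i for i in range(n))
                for pi in itertools.permutations(range(n)))
    formula = sum(F(1, 2 ** k * math.factorial(k) * math.factorial(n - 2 * k))
                  for k in range(n // 2 + 1))
    return F(brute, math.factorial(n)), formula

for n in range(1, 7):
    b, f = involution_probability(n)
    assert b == f  # e.g. n = 3 gives 2/3 both ways
print("ok")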


(d) The idea here is quite similar to the above, so we’ll be a little less verbose in the exposition.
However, as a first aside, we need the notion of a cyclic ordering of s elements from the set
{1, ..., n}. We mean by this a labelling of the s beads of a necklace with distinct elements of the
set, where two labellings are considered the same if we can move the beads along the string to
turn one into the other. For example, (1, 2, 3, 4) and (1, 2, 4, 3) are different cyclic orderings,
but (1, 2, 3, 4) and (2, 3, 4, 1) are the same. There are

C(n, s) · s!/s = n!/((n − s)! · s)

possible cyclic orderings of length s from a set with n elements, since if we first count all
subsets of size s, and then all permutations of each of those subsets, we have overcounted each
cyclic ordering by a factor of s.
Now, let N be a random variable counting the number of s-cycles, and for each cyclic order-
ing (i1 , ..., is ) of s elements of {1, ..., n}, let I(i1,...,is) be the indicator that π(i1 ) = i2 , π(i2 ) =
i3 , ..., π(is ) = i1 . There are (n − s)! permutations in which (i1 , ..., is ) form an s-cycle (since
we are free to do whatever we want with the remaining n − s elements of {1, ..., n}), so the
probability that (i1 , ..., is ) form such a cycle is (n − s)!/n!, and

E[N] = E[∑ I(i1,...,is)] = n!/((n − s)! · s) · (n − s)!/n! = 1/s,

where the sum runs over all cyclic orderings (i1 , ..., is ).

For the variance, we need to know E[I(i1,...,is) I(j1,...,js)], the probability that both (i1 , ..., is ) and
(j1 , ..., js ) are s-cycles. We have already computed this probability when the two cyclic orderings
are the same, and we know that it is zero if they overlap but are not equal. If they are dis-
joint, then there are (n − 2s)! permutations in which both are s-cycles, so E[I(i1,...,is) I(j1,...,js)] =
(n − 2s)!/n!, and there are

n!/((n − s)! · s) · (n − s)!/((n − 2s)! · s) = n!/((n − 2s)! · s²)

ordered ways to choose the two disjoint cyclic orderings, so mirroring the calculation earlier,

E[N²] = 1/s + n!/((n − 2s)! · s²) · (n − 2s)!/n! = 1/s + 1/s²,

and var(N) = E[N²] − E[N]² = 1/s + 1/s² − (1/s)² = 1/s.
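
For small n and s, all of these answers can be confirmed by brute force over all n! permutations;
here is a minimal Python sketch (function names are illustrative) checking parts (a), (b), and (d):

import itertools, math
from fractions import Fraction as F

def count_s_cycles(pi, s):
    """Number of cycles of length exactly s in the permutation pi (a 0-indexed tuple)."""
    n = len(pi)
    seen = [False] * n
    count = 0
    for i in range(n):
        if not seen[i]:
            length, j = 0, i
            while not seen[j]:       # walk around the cycle containing i
                seen[j] = True
                j = pi[j]
                length += 1
            if length == s:
                count += 1
    return count

def stats(n, s):
    """Exact E[N] and var(N) for the number of s-cycles, by enumeration."""
    counts = [count_s_cycles(pi, s) for pi in itertools.permutations(range(n))]
    EN = F(sum(counts), math.factorial(n))
    EN2 = F(sum(c * c for c in counts), math.factorial(n))
    return EN, EN2 - EN ** 2

print(stats(6, 2))  # swaps: expect (1/2, 1/2), matching parts (a) and (b)
print(stats(6, 3))  # 3-cycles: expect (1/3, 1/3), matching part (d)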

