
IEOR E4102 Stochastic Modeling for MSE Spring 2025

Dr. A. B. Dieker

Homework 1
due on Friday January 31, 2025, 11:59pm EST
Include all intermediate steps of the computations in your answers. If the answer is readily
available on the web (e.g., on wikipedia), then credit is only given for the intermediate steps.

1. In the Ivy League conference of the NCAA soccer championship, Columbia University plays
seven games per season. In this problem, we model Columbia’s season as seven experiments and
choose S = {W, N }7 for the underlying sample space, where W stands for win and N stands
for a non-win (loss or tie). For i = 1, . . . , 7, we define Xi as a random variable on S signifying
whether or not the i-th match results in a win.

(a) Write down three events on S; you are completely free to choose the events you write down.
Do not use words to describe your events.
(b) Write down three outcomes in S; you are completely free to choose the outcomes you write
down.
(c) Give the value X1 assigns to each of your three outcomes in part (b).

Solution.
(a) There are many correct answers to this question. Here is one: S, ∅, {(W, W, W, W, W, W, W )}
are three events.
(b) There are many correct answers to this question. Here is one: Three outcomes are ω1 =
(W, W, W, W, W, W, W ), ω2 = (W, N, W, W, W, W, W ), and ω3 = (W, W, N, W, W, W, W ).
(c) We have X1 (ω1 ) = X1 (ω2 ) = X1 (ω3 ) = 1.
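As an ungraded sanity check, the sample space and the indicator random variable X1 can be sketched in Python (the names S and X below are chosen for illustration; the problem itself requires no code):

```python
from itertools import product

# Sample space S = {W, N}^7: all sequences of seven match results.
S = list(product("WN", repeat=7))
assert len(S) == 2 ** 7  # 128 outcomes

def X(i, omega):
    # Indicator random variable X_i: 1 if the i-th match (1-indexed) is a win.
    return 1 if omega[i - 1] == "W" else 0

# The three outcomes from part (b) all start with W, so X_1 assigns 1 to each.
omega1 = ("W",) * 7
omega2 = ("W", "N", "W", "W", "W", "W", "W")
omega3 = ("W", "W", "N", "W", "W", "W", "W")
assert X(1, omega1) == X(1, omega2) == X(1, omega3) == 1
```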
2. Suppose you own a four-sided (tetrahedral) die. Such a die has four faces, each of which is equally likely to appear. (Most commonly, such dice have three numbers between 1 and 4 on each face; the number rolled is the one that appears upright on all three visible faces. You do not need this information for this problem.) You roll this die and you choose S = {1, 2, 3, 4} for your sample space.
(a) Specify all outcomes in S.
(b) Specify the random variable X representing the square of the number rolled.
(c) Specify all events in S.
(d) Specify the probability function P .
Solution.
(a) There are four possible outcomes in S: ω1 = 1, ω2 = 2, ω3 = 3, and ω4 = 4.
(b) We have X(ω) = ω² for all ω ∈ S.
(c) Since events are defined as subsets of S, we need to list all possible subsets of the sample
space S: ∅, {1}, {2}, {3}, {4}, {1, 2}, {1, 3}, {1, 4}, {2, 3}, {2, 4}, {3, 4}, {1, 2, 3}, {1, 2, 4},
{1, 3, 4}, {2, 3, 4}, and S = {1, 2, 3, 4}. Note that the total number of subsets is 2⁴ = 16.
(d) The probability function is given by

P(E) = |E| / |S| = |E| / 4

for all events E ⊆ S, where | · | stands for the cardinality of a set, i.e., the number of
elements contained in the set.
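As an ungraded check of parts (c) and (d), one can enumerate all 2⁴ = 16 events and confirm that P(E) = |E|/4 behaves like a probability function; this short Python sketch is illustrative only:

```python
from itertools import combinations

S = {1, 2, 3, 4}

# All subsets (events) of S, grouped by size: 2^4 = 16 in total.
events = [set(c) for r in range(len(S) + 1) for c in combinations(sorted(S), r)]
assert len(events) == 16

def P(E):
    # Uniform probability function: P(E) = |E| / |S|.
    return len(E) / len(S)

assert P(set()) == 0.0 and P(S) == 1.0
# Singleton outcomes are equally likely and their probabilities sum to 1.
assert sum(P({omega}) for omega in S) == 1.0
```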
3. Three events A, B, and C satisfy P(A) = 4/9, P(B) = 5/9, P(A|B) = 3/5, and P(A^c ∩ B^c ∩ C^c) = 0. Calculate P(A^c ∩ B^c ∩ C).
Solution. First note that P(A ∩ B) = P(A|B)P(B) = 1/3. Since P(A^c ∩ B^c ∩ C^c) = 0, we have

P(A^c ∩ B^c ∩ C) = P(A^c ∩ B^c) − P(A^c ∩ B^c ∩ C^c) = P(A^c ∩ B^c)

and, by inclusion-exclusion,

P(A^c ∩ B^c) = 1 − P(A ∪ B) = 1 − P(A) − P(B) + P(A ∩ B) = 1 − 4/9 − 5/9 + 1/3 = 1/3.
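The arithmetic can be double-checked with exact rational arithmetic; a minimal Python sketch:

```python
from fractions import Fraction as F

P_A, P_B, P_A_given_B = F(4, 9), F(5, 9), F(3, 5)

P_AB = P_A_given_B * P_B               # P(A ∩ B) = P(A|B) P(B)
assert P_AB == F(1, 3)

# Inclusion-exclusion: P(Ac ∩ Bc) = 1 - P(A ∪ B).
P_AcBc = 1 - (P_A + P_B - P_AB)
# Since P(Ac ∩ Bc ∩ Cc) = 0, all of Ac ∩ Bc lies inside C.
P_AcBcC = P_AcBc
assert P_AcBcC == F(1, 3)
```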

4. Santa Claus delivers gifts with the help of eight reindeer. Answer each of the following with one
word only: What would be a reasonable distribution for:
(a) The number of reindeer with sore feet after their Christmas Eve ride.
(b) Whether or not he has a gift for you.
Solution. (a) Binomial
(b) Bernoulli
5. The k-th central moment of a random variable X is defined by µk = E[(X − E(X))k ]. For
instance, the second central moment equals the variance.
Compute all central moments of a standard uniform random variable. (A standard uniform
random variable is uniform on [0, 1].)
Solution. We have fU(x) = 1 for 0 ≤ x ≤ 1 (and 0 otherwise), so

E(U) = ∫_{−∞}^{∞} x fU(x) dx = ∫_0^1 x fU(x) dx = ∫_0^1 x dx = 1/2.

Therefore, for any k ≥ 1, we have


μk = E[(U − E(U))^k] = ∫_0^1 (u − 1/2)^k du = ∫_{−1/2}^{1/2} u^k du = u^{k+1}/(k + 1) |_{−1/2}^{1/2}

     { 0                if k is odd
   = {
     { 2^{−k}/(k + 1)   if k is even.
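The closed form can be checked numerically; the sketch below approximates the defining integral with a midpoint rule (the function name and sample sizes are illustrative):

```python
def central_moment_uniform(k, n=100_000):
    # Midpoint-rule approximation of mu_k = integral_0^1 (u - 1/2)^k du.
    h = 1.0 / n
    return sum(((i + 0.5) * h - 0.5) ** k for i in range(n)) * h

for k in range(1, 7):
    # Odd central moments vanish by symmetry; even ones equal 2^(-k) / (k + 1).
    exact = 0.0 if k % 2 == 1 else 2.0 ** (-k) / (k + 1)
    assert abs(central_moment_uniform(k) - exact) < 1e-9
```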
6. Consider the function g defined by

       { −(x − 3)(x − 4)   for 3 ≤ x ≤ 4
g(x) = {
       { 0                 otherwise.
(a) Can g be a probability density function? Explain why or why not.
(b) Can g be a cumulative distribution function? Explain why or why not.

Solution.

(a) First note that g(x) ≥ 0 for every x, so that requirement of a pdf is satisfied. If g is a
probability density function, we should also have ∫_{−∞}^{∞} g(x) dx = 1, so let's check that. We
find that

∫_3^4 −(x − 3)(x − 4) dx = −∫_3^4 (x² − 7x + 12) dx = 1/6 ≠ 1.
Therefore g cannot be a probability density function.
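The value 1/6 can be verified exactly with rational arithmetic; a minimal sketch in which the antiderivative G is computed by hand:

```python
from fractions import Fraction as F

def G(x):
    # Antiderivative of g(x) = -(x - 3)(x - 4) = -x^2 + 7x - 12.
    x = F(x)
    return -x**3 / 3 + 7 * x**2 / 2 - 12 * x

total = G(4) - G(3)
assert total == F(1, 6)   # integrates to 1/6, not 1, so g is not a pdf
```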
(b) There are two ways to answer this question, and both lead to the conclusion that g cannot
be a cumulative distribution function:
• Any cumulative distribution function F is non-decreasing because F (x) = P (X ≤ x)
for some random variable X, but g is not non-decreasing.
• Any cumulative distribution function F has the property that lim_{x→∞} F(x) = 1, but
here lim_{x→∞} g(x) = 0, so g cannot be a cumulative distribution function.

7. The discrete random variable X has probability mass function

       { 1/10,   a = −1
       { 2/10,   a = 0
p(a) = { 4/10,   a = 1
       { 3/10,   a = 2
       { 0,      otherwise.

(a) Are {X ≤ 3/2} and {X > 0} disjoint? Explain.


(b) Are {X ≤ 3/2} and {X > 0} independent? Explain.
(c) Compute E(X).
(d) Compute Var(10X + 7).

Solution.

(a) No. Because

{X ≤ 3/2} ∩ {X > 0} = {X = 1},

we have P(X ≤ 3/2, X > 0) = P(X = 1) = p(1) = 4/10 > 0, and therefore

{X ≤ 3/2} ∩ {X > 0} ≠ ∅.

(b) No. We saw that P(X ≤ 3/2, X > 0) = 4/10. Since

P(X ≤ 3/2) = P(X = −1) + P(X = 0) + P(X = 1) = p(−1) + p(0) + p(1) = 7/10

and

P(X > 0) = P(X = 1) + P(X = 2) = p(1) + p(2) = 7/10,

we deduce that P(X ≤ 3/2, X > 0) = 4/10 ≠ 49/100 = P(X ≤ 3/2)P(X > 0). We conclude that {X ≤ 3/2}
and {X > 0} are not independent.

(c) We find that
E(X) = (−1) × p(−1) + 0 × p(0) + 1 × p(1) + 2 × p(2) = 9/10.

(d) We know that Var(10X + 7) = 100Var(X). We find the latter as follows:


Var(X) = E(X 2 ) − [E(X)]2
= 1 × p(−1) + 0 × p(0) + 1 × p(1) + 4 × p(2) − (9/10)2
= 89/100.
Therefore, Var(10X + 7) = 89.
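Parts (c) and (d) can be re-derived mechanically from the pmf with exact fractions; a short ungraded sketch:

```python
from fractions import Fraction as F

pmf = {-1: F(1, 10), 0: F(2, 10), 1: F(4, 10), 2: F(3, 10)}
assert sum(pmf.values()) == 1   # valid pmf

EX = sum(a * p for a, p in pmf.items())          # E(X)
EX2 = sum(a**2 * p for a, p in pmf.items())      # E(X^2)
var = EX2 - EX**2                                # Var(X)

assert EX == F(9, 10)
assert var == F(89, 100)
assert 100 * var == 89   # Var(10X + 7) = 10^2 Var(X) = 89
```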
8. The joint probability density function f of two random variables X and Y is given by

          { 1/4   for −1 ≤ x ≤ 1 and x⁴ ≤ y² ≤ (1 + x²)²
f(x, y) = {
          { 0     otherwise.

A picture of the support of (X, Y) is given below. (The support is the set of all points where f
is positive, more specifically the closure of this set.)

[Figure: the support consists of the band between the curves y = x² and y = 1 + x², together
with its mirror image between y = −x² and y = −1 − x², for −1 ≤ x ≤ 1.]

(a) Calculate the marginal cumulative distribution function FX of X.


(b) Calculate P (X ≤ −1/2 | Y ≤ −1).

Solution.
(a) It is clear that FX(a) = 0 when a ≤ −1 and FX(a) = 1 when a > 1. Now we consider the
case −1 < a ≤ 1. We get that

FX(a) = ∫_{−∞}^{a} ∫_{−∞}^{∞} f(x, y) dy dx = 2 ∫_{−1}^{a} ∫_{x²}^{1+x²} (1/4) dy dx = (a + 1)/2,

where the factor 2 accounts for the symmetric part of the support below the x-axis.
Therefore, the marginal cumulative distribution function of X is

        { 0,            a ≤ −1
FX(a) = { (a + 1)/2,   −1 < a ≤ 1
        { 1,            a > 1.

(b) We have that

P(Y ≤ −1) = ∫_{−1}^{1} ∫_{−1−x²}^{−1} (1/4) dy dx = 1/6,

P(X ≤ −1/2, Y ≤ −1) = ∫_{−1}^{−1/2} ∫_{−1−x²}^{−1} (1/4) dy dx = 7/96.

The desired conditional probability is then found as follows:

P(X ≤ −1/2 | Y ≤ −1) = P(X ≤ −1/2, Y ≤ −1) / P(Y ≤ −1) = (7/96)/(1/6) = 7/16.
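The answer 7/16 can be checked by Monte Carlo. The sketch below uses the fact that the marginal density of X is 1/2 on [−1, 1] and that, given X = x, |Y| − x² is uniform on [0, 1] with an independent random sign; the sample size and tolerances are illustrative:

```python
import random

random.seed(0)
n = 200_000
hits_y = hits_both = 0
for _ in range(n):
    x = random.uniform(-1.0, 1.0)
    sign = 1.0 if random.random() < 0.5 else -1.0
    y = sign * (x * x + random.random())   # |y| uniform on [x^2, 1 + x^2]
    if y <= -1.0:
        hits_y += 1
        if x <= -0.5:
            hits_both += 1

assert abs(hits_y / n - 1 / 6) < 0.01           # P(Y <= -1) = 1/6
assert abs(hits_both / hits_y - 7 / 16) < 0.02  # P(X <= -1/2 | Y <= -1) = 7/16
```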
9. The function f given by

          { 2x,   0 ≤ x ≤ 1 and 0 ≤ y ≤ 1
f(x, y) = {
          { 0,    otherwise

is the joint probability density function of the random variables X and Y.
(a) Verify that f is indeed a joint probability density function.
(b) Calculate the marginal probability density function of X.
(c) Ungraded challenge problem. Calculate the cumulative distribution function of XY .
Solution.
(a) First, we note that f(x, y) ≥ 0 for any (x, y) ∈ R². Second, we note that

∫∫_{R²} f(x, y) dx dy = ∫_0^1 ∫_0^1 2x dx dy = ∫_0^1 1 dy = 1.
So f is indeed a joint probability density function.
(b) We denote the marginal density function of X by fX(·). For x ∈ [0, 1], we have

fX(x) = ∫_R f(x, y) dy = ∫_0^1 2x dy = 2x,

and we conclude that

        { 2x,   0 ≤ x ≤ 1
fX(x) = {
        { 0,    otherwise.
(c) We first note that XY takes values in [0, 1] since (X, Y) takes values in [0, 1]². For t ∈ [0, 1],
we first calculate P(XY > t) by noting that

P(XY > t) = ∫∫_{(x,y): xy > t} f(x, y) dx dy
          = ∫_t^1 ∫_{t/x}^{1} 2x dy dx
          = ∫_t^1 2(x − t) dx
          = (x − t)² |_t^1
          = (1 − t)².

Therefore we find that P(XY ≤ t) = 1 − (1 − t)² = 2t − t² for t ∈ [0, 1], so the distribution
function of XY is

         { 0,          t < 0
FXY(t) = { 2t − t²,    0 ≤ t < 1
         { 1,          t ≥ 1.
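The distribution function of XY can also be checked by simulation: FX(x) = x², so X can be sampled as the square root of a uniform, while Y is an independent uniform. A short Monte Carlo sketch at one illustrative value of t:

```python
import random

random.seed(1)
n = 200_000
t = 0.3
# X has density 2x on [0, 1], so X = sqrt(U) with U uniform; Y is uniform on [0, 1].
count = sum(1 for _ in range(n)
            if random.random() ** 0.5 * random.random() <= t)
assert abs(count / n - (2 * t - t * t)) < 0.01   # F_XY(t) = 2t - t^2
```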
