Problem Set 9 Solutions
1 Probability measure
Given a complete probability space (Ω, F, P) and an event B ∈ F with P(B) > 0, prove that the conditional probability defined by

P(A | B) = P(A ∩ B) / P(B)

is itself a probability measure on (Ω, F).
Answer: From the lecture notes we know that a probability measure is defined by the three Kolmogorov axioms, i.e.
1. P(∅) = 0,
2. P(Ω) = 1,
3. P(∪_{i=1}^∞ A_i) = ∑_{i=1}^∞ P(A_i) for pairwise disjoint sets A_1, A_2, … ∈ F.
Thus, we just have to check whether the conditional probability measure satisfies all three conditions. To show that all conditions hold, we fix the event B and write Q(A) := P(A | B) as shorthand notation. We may assume that all axioms hold for the unconditional probability P(A), for any set A ∈ F.
Ad 1.

Q(∅) = P(∅ ∩ B) / P(B) = P(∅) / P(B) = 0.

Ad 2.

Q(Ω) = P(Ω ∩ B) / P(B) = P(B) / P(B) = 1.

Ad 3. For pairwise disjoint A_1, A_2, … ∈ F the sets B ∩ A_i are also pairwise disjoint, so

Q(∪_{i=1}^∞ A_i) = P((∪_{i=1}^∞ A_i) ∩ B) / P(B) = P(∪_{i=1}^∞ (B ∩ A_i)) / P(B) = ∑_{i=1}^∞ P(B ∩ A_i) / P(B) = ∑_{i=1}^∞ Q(A_i).
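The three verifications above can be sanity-checked numerically. Here is a minimal sketch on a small finite probability space; the space (a fair six-sided die), the conditioning event B, and the disjoint test sets are illustrative choices, not part of the problem.

```python
# Check the three Kolmogorov axioms for Q(A) := P(A | B) on a
# small finite probability space (a fair six-sided die).
omega = frozenset({1, 2, 3, 4, 5, 6})
P = lambda A: len(A) / len(omega)      # uniform measure
B = {2, 4, 6}                          # conditioning event with P(B) > 0
Q = lambda A: P(A & B) / P(B)          # the conditional measure

assert Q(set()) == 0                   # axiom 1: Q(emptyset) = 0
assert Q(omega) == 1                   # axiom 2: Q(Omega) = 1
A1, A2, A3 = {1, 2}, {3, 4}, {5}       # pairwise disjoint events
assert abs(Q(A1 | A2 | A3) - (Q(A1) + Q(A2) + Q(A3))) < 1e-12  # axiom 3
print("Q satisfies all three axioms on this example")
```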
2 Joint Distribution
Consider the following random variables. The random pair (X, Y) has the following joint distribution:

                 X
            1     2     3
       2  1/12   1/6   1/12
  Y    3  1/6     0    1/6
       4   0     1/3    0

Are X and Y dependent random variables? Explain your answer. What is the marginal distribution of X and Y, respectively?
Answer:
X and Y are dependent: for example, P(X = 2, Y = 3) = 0, while P(X = 2) · P(Y = 3) = 1/2 · 1/3 = 1/6 ≠ 0.

Marginal distribution of X (as a cdf):

x < 1:      F(x) = 0
1 ≤ x < 2:  F(x) = 1/12 + 1/6 = 1/4
2 ≤ x < 3:  F(x) = 1/4 + 1/6 + 1/3 = 3/4
x ≥ 3:      F(x) = 3/4 + 1/12 + 1/6 = 1

Marginal distribution of Y (as a cdf):

y < 2:      F(y) = 0
2 ≤ y < 3:  F(y) = 1/12 + 1/6 + 1/12 = 1/3
3 ≤ y < 4:  F(y) = 1/3 + 1/6 + 1/6 = 2/3
y ≥ 4:      F(y) = 2/3 + 1/3 = 1
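The marginals and the dependence claim can be verified exactly with rational arithmetic; this is a small check of the table above, not part of the assignment.

```python
# Verify the marginals and the dependence claim for the joint table of (X, Y).
from fractions import Fraction as F

joint = {  # joint[(x, y)] = P(X = x, Y = y), taken from the table
    (1, 2): F(1, 12), (2, 2): F(1, 6), (3, 2): F(1, 12),
    (1, 3): F(1, 6),  (2, 3): F(0),    (3, 3): F(1, 6),
    (1, 4): F(0),     (2, 4): F(1, 3), (3, 4): F(0),
}

pX = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (1, 2, 3)}
pY = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (2, 3, 4)}

assert sum(joint.values()) == 1
assert pX == {1: F(1, 4), 2: F(1, 2), 3: F(1, 4)}
assert pY == {2: F(1, 3), 3: F(1, 3), 4: F(1, 3)}
# Dependence: P(X=2, Y=3) = 0, but P(X=2) * P(Y=3) = 1/6
assert joint[(2, 3)] != pX[2] * pY[3]
print("marginals check out; X and Y are dependent")
```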
3 Transformation
Standard quantitative models of the stock market assume that stock returns follow a log-normal distribution, i.e. log X ∼ N(µ, σ²), where 0 < X < ∞, −∞ < µ < ∞, σ² > 0.
1. Find the pdf of X.
2. Compute E(X) and Var(X).
Answer:
1. Set Y = g(X) = log(X), so X = g^{-1}(Y) = e^Y. Since g is a strictly increasing monotone transformation,

F_X(x) = P(X ≤ x) = P(g(X) ≤ g(x)) = P(Y ≤ log x) = F_Y(log x).

For the pdf, differentiate with the chain rule:

f_X(x) = d/dx F_X(x) = d/dx F_Y(log x) = f_Y(log x) · (1/x) = 1/(√(2π) σ x) · e^{−(log x − µ)²/(2σ²)},   x > 0.
2. Writing X = e^Y with Y ∼ N(µ, σ²) and completing the square in the exponent,

E(X) = E(e^{log X}) = E(e^Y) = ∫ e^y · 1/(√(2π)σ) · e^{−(y−µ)²/(2σ²)} dy
     = ∫ 1/(√(2π)σ) · e^{−[(y−µ)² − 2σ²y]/(2σ²)} dy
     = ∫ 1/(√(2π)σ) · e^{−[(y−µ−σ²)² − σ⁴ − 2σ²µ]/(2σ²)} dy
     = e^{σ²/2 + µ} ∫ 1/(√(2π)σ) · e^{−(y−µ−σ²)²/(2σ²)} dy
     = e^{µ + σ²/2},

since (y−µ)² − 2σ²y = (y−µ−σ²)² − σ⁴ − 2σ²µ, and the remaining integral is that of an N(µ+σ², σ²) density, hence equals 1. The same computation with 2y in place of y gives E(e^{2Y}) = e^{2µ+2σ²}, so

Var(X) = E(X²) − (E X)² = E(e^{2Y}) − (E(e^Y))² = e^{2µ+2σ²} − e^{2µ+σ²}.
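The closed forms for E(X) and Var(X) can be checked by numerically integrating against the normal density; the values of µ and σ below are arbitrary test choices.

```python
# Numerical check of E(X) and Var(X) for X = e^Y, Y ~ N(mu, sigma^2),
# by trapezoidal integration of the normal density (mu, sigma chosen arbitrarily).
import math

mu, sigma = 0.3, 0.5

def normal_pdf(y):
    return math.exp(-(y - mu) ** 2 / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)

def trapz(f, a=-10.0, b=10.0, n=200_000):
    h = (b - a) / n
    return h * (sum(f(a + i * h) for i in range(1, n)) + 0.5 * (f(a) + f(b)))

EX  = trapz(lambda y: math.exp(y) * normal_pdf(y))       # E(e^Y)
EX2 = trapz(lambda y: math.exp(2 * y) * normal_pdf(y))   # E(e^{2Y})

assert abs(EX - math.exp(mu + sigma ** 2 / 2)) < 1e-6
assert abs((EX2 - EX ** 2)
           - (math.exp(2*mu + 2*sigma**2) - math.exp(2*mu + sigma**2))) < 1e-6
print("log-normal moment formulas verified")
```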
4 Mixed Distribution
Answer: The cdf of the random variable Z is a mixture of a continuous part and a step function, thus Z is neither continuous nor discrete. Nevertheless, on the continuous part the density is given by

f_Z(x) = 1/2 for 0 < x ≤ 1, and 0 otherwise.

Additionally, we have an atom at x = 0 with probability P(Z = 0) = 1/2. Having found the density of Z and identified the atom, it is not too difficult to obtain its expectation and variance. We have

E(Z) = (1/2) · 0 + ∫_0^1 (1/2) x dx = 1/4,

Var(Z) = (1/2) · 0² + ∫_0^1 (1/2) x² dx − (1/4)² = 1/6 − 1/16 = 5/48.
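The two moments can be confirmed with exact rational arithmetic; this is a small check of the computation above.

```python
# Check E(Z) and Var(Z) for the mixed distribution: an atom at 0 with
# mass 1/2 plus density 1/2 on (0, 1], using exact fractions.
from fractions import Fraction as F

atom_mass = F(1, 2)                       # P(Z = 0)
# k-th moment of the continuous part: integral of x^k * 1/2 over (0, 1]
cont_moment = lambda k: F(1, 2) * F(1, k + 1)

EZ  = atom_mass * 0 + cont_moment(1)      # the atom contributes 0^k = 0
EZ2 = atom_mass * 0 + cont_moment(2)

assert EZ == F(1, 4)
assert EZ2 - EZ ** 2 == F(5, 48)
print("E(Z) = 1/4, Var(Z) = 5/48")
```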
5 PDF
Consider the pdf defined by f_X(x) = 2/x³ for x > 1 and zero elsewhere.
Answer:
1. f_X(x) ≥ 0 for all x, and ∫_1^∞ 2/x³ dx = [−1/x²]_1^∞ = 1, so f_X is indeed a pdf.
2. E(X) = ∫_1^∞ x · 2/x³ dx = ∫_1^∞ 2/x² dx = 2. However, E(X²) = ∫_1^∞ x² · 2/x³ dx = ∫_1^∞ 2/x dx diverges, so Var(X) = ∞.
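These claims can be checked numerically by integrating up to a truncation point B and comparing with the antiderivatives; B and the grid size are illustrative choices.

```python
# Numerical check that f(x) = 2/x^3 on (1, inf) integrates to 1, that
# E(X) = 2, and that the second moment grows without bound (Var(X) = inf).
import math

def trapz(f, a, b, n=200_000):
    h = (b - a) / n
    return h * (sum(f(a + i * h) for i in range(1, n)) + 0.5 * (f(a) + f(b)))

f = lambda x: 2 / x ** 3
B = 1_000.0  # truncation point for the improper integrals

# total mass: integral_1^B 2/x^3 dx = 1 - 1/B^2 -> 1
assert abs(trapz(f, 1, B) - (1 - 1 / B ** 2)) < 1e-4
# first moment: integral_1^B 2/x^2 dx = 2 - 2/B -> 2
assert abs(trapz(lambda x: x * f(x), 1, B) - (2 - 2 / B)) < 1e-4
# second moment: integral_1^B 2/x dx = 2 log B -> infinity as B grows
assert abs(trapz(lambda x: x * x * f(x), 1, B) - 2 * math.log(B)) < 1e-4
print("pdf integrates to 1, E(X) = 2, E(X^2) diverges like 2 log B")
```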
6 Exponential Distribution
Assume a random variable X follows the exponential distribution with pdf

f(x; λ) = λe^{−λx} for x ≥ 0, and 0 for x < 0.

1. Compute the cdf, mean, variance and moment generating function.
2. Show that X is memoryless, which means P(X > s + t | X > t) = P(X > s) for all s, t ≥ 0.
Answer:
1. For x ≥ 0, F(x) = ∫_0^x λe^{−λt} dt = [−e^{−λt}]_0^x = 1 − e^{−λx}. Therefore the cdf is

F(x) = 1 − e^{−λx} for x ≥ 0, and 0 for x < 0.
For the mean, variance and E(e^{uX}):

E(X) = ∫_{−∞}^∞ x f(x) dx
     = ∫_0^∞ x λe^{−λx} dx
     = [−x e^{−λx}]_0^∞ + ∫_0^∞ e^{−λx} dx          (integration by parts)
     = lim_{x→∞} (−x / e^{λx}) + [−(1/λ) e^{−λx}]_0^∞
     = 0 + 1/λ = 1/λ.                               (L'Hôpital for the limit)

Var(X) = E[(X − E X)²] = ∫_0^∞ (x − 1/λ)² λe^{−λx} dx
       = ∫_0^∞ x² λe^{−λx} dx − ∫_0^∞ 2x e^{−λx} dx + ∫_0^∞ (1/λ) e^{−λx} dx
       = [−x² e^{−λx}]_0^∞ + 2 ∫_0^∞ x e^{−λx} dx − 2 ∫_0^∞ x e^{−λx} dx + [−(1/λ²) e^{−λx}]_0^∞
       = 1/λ².

E(e^{uX}) = ∫_{−∞}^∞ e^{ux} f(x) dx
          = ∫_0^∞ e^{ux} λe^{−λx} dx
          = λ ∫_0^∞ e^{−(λ−u)x} dx
          = [−λ/(λ−u) · e^{−(λ−u)x}]_0^∞,

so for u < λ we have E(e^{uX}) = λ/(λ − u).
2.

P(X > s + t | X > t) = P({X > s + t} ∩ {X > t}) / P(X > t)
                     = P(X > s + t) / P(X > t)
                     = e^{−λ(s+t)} / e^{−λt}
                     = e^{−λs} = P(X > s).

Hence X is memoryless.
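All results of this problem can be verified numerically; the rate λ, the mgf argument u, and the times s, t below are arbitrary test values.

```python
# Numerical check of the exponential results: cdf, mean, variance, mgf,
# and the memoryless property, for an illustrative rate lambda = 1.7.
import math

lam = 1.7

def trapz(f, a, b, n=200_000):
    h = (b - a) / n
    return h * (sum(f(a + i * h) for i in range(1, n)) + 0.5 * (f(a) + f(b)))

pdf = lambda x: lam * math.exp(-lam * x)
B = 50.0  # effective upper limit; the tail beyond it is negligible

assert abs(trapz(pdf, 0, 2.0) - (1 - math.exp(-lam * 2.0))) < 1e-6            # cdf
assert abs(trapz(lambda x: x * pdf(x), 0, B) - 1 / lam) < 1e-6                # mean
assert abs(trapz(lambda x: (x - 1/lam) ** 2 * pdf(x), 0, B) - 1 / lam**2) < 1e-6  # variance
u = 0.8  # any u < lambda
assert abs(trapz(lambda x: math.exp(u * x) * pdf(x), 0, B) - lam / (lam - u)) < 1e-4  # mgf

# memorylessness via the survival function S(x) = exp(-lam * x)
S = lambda x: math.exp(-lam * x)
s, t = 0.4, 1.3
assert abs(S(s + t) / S(t) - S(s)) < 1e-12
print("exponential formulas and memorylessness verified")
```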