
Mathematics for Economics and Finance (Fall 2023)

Problem Set 9 Solutions: Probability & Statistics


Professor: Norman Schürhoff
You do not need to hand in any solutions!
24 November 2023

1 Probability measure
Given a complete probability space (Ω, F, P), prove that the conditional probability defined by

    P(A | B) = P(A ∩ B) / P(B),

with A, B ∈ F and P(B) > 0, is a probability measure.

Answer: From the lecture notes we know that a probability measure is defined by the three Kolmogorov axioms, i.e.,
1. P(∅) = 0,
2. P(Ω) = 1,
3. P(⋃_{i=1}^∞ A_i) = Σ_{i=1}^∞ P(A_i) for disjoint sets A_1, A_2, … ∈ F.
Thus, we just have to check whether the conditional probability satisfies all three conditions. To do so, we fix the event B and write Q(A) := P(A | B) as shorthand notation. We assume that all axioms hold for the unconditional probability P(A), for any A ∈ F.
Ad 1.
    Q(∅) = P(∅ ∩ B) / P(B) = P(∅) / P(B) = 0.
Ad 2.
    Q(Ω) = P(Ω ∩ B) / P(B) = P(B) / P(B) = 1.
Ad 3.
    Q(⋃_{i=1}^∞ A_i) = P((⋃_{i=1}^∞ A_i) ∩ B) / P(B)
                     = P(⋃_{i=1}^∞ (B ∩ A_i)) / P(B)
                     = Σ_{i=1}^∞ P(B ∩ A_i) / P(B)
                     = Σ_{i=1}^∞ Q(A_i),
because the sets B ∩ A_i are disjoint for all i.


This completes our proof.
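As a sanity check, the three axioms can be verified mechanically on a small finite example; the sketch below uses a hypothetical fair die as the sample space and conditions on the even outcomes. The events and names are illustrative, not part of the original problem.

```python
from fractions import Fraction

# Finite sample space: a fair six-sided die (illustrative example).
omega = {1, 2, 3, 4, 5, 6}
P = lambda A: Fraction(len(A & omega), len(omega))

B = {2, 4, 6}                      # conditioning event (even outcomes)
Q = lambda A: P(A & B) / P(B)      # Q(A) := P(A | B)

# Axiom 1: Q(emptyset) = 0
assert Q(set()) == 0
# Axiom 2: Q(Omega) = 1
assert Q(omega) == 1
# Axiom 3 (finite additivity on disjoint sets A1, A2)
A1, A2 = {1, 2}, {3, 4}
assert Q(A1 | A2) == Q(A1) + Q(A2)
print("all three axioms hold on this example")
```

Exact `Fraction` arithmetic avoids any floating-point ambiguity in the checks.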

2 Joint Distribution
Consider the following random variables. The random pair (X, Y) has the following joint distribution:

                 X
             1     2     3
         2  1/12  1/6   1/12
     Y   3  1/6    0    1/6
         4   0    1/3    0

Are X and Y dependent random variables? Explain your answer. What is the marginal distribution of X
and Y, respectively?

Answer:

X and Y are independent random variables

    ⇔ P(X = a, Y = b) = P(X = a) P(Y = b) for all a, b.

The marginal probabilities are

    P(X = 1) = 1/4,  P(X = 2) = 1/2,  P(X = 3) = 1/4,
    P(Y = 2) = 1/3,  P(Y = 3) = 1/3,  P(Y = 4) = 1/3.

Checking the product condition:

    P(X = 1) P(Y = 2) = 1/4 · 1/3 = 1/12 = P(X = 1, Y = 2),
    P(X = 1) P(Y = 3) = 1/4 · 1/3 = 1/12 ≠ 1/6 = P(X = 1, Y = 3),
    P(X = 1) P(Y = 4) = 1/4 · 1/3 = 1/12 ≠ 0 = P(X = 1, Y = 4).

Since the condition fails, X and Y are dependent random variables.

Marginal distribution (cdf) of X:

    F_X(x) = 0                         for x < 1,
    F_X(x) = 1/12 + 1/6 = 1/4          for 1 ≤ x < 2,
    F_X(x) = 1/4 + 1/6 + 1/3 = 3/4     for 2 ≤ x < 3,
    F_X(x) = 3/4 + 1/12 + 1/6 = 1      for x ≥ 3.

Marginal distribution (cdf) of Y:

    F_Y(y) = 0                          for y < 2,
    F_Y(y) = 1/12 + 1/6 + 1/12 = 1/3    for 2 ≤ y < 3,
    F_Y(y) = 1/3 + 1/6 + 1/6 = 2/3      for 3 ≤ y < 4,
    F_Y(y) = 2/3 + 1/3 = 1              for y ≥ 4.

3 Transformation
Standard quantitative models of the stock market assume that stock returns follow a log-normal distribution. Suppose log X ∼ N(µ, σ²), where 0 < X < ∞, −∞ < µ < ∞, and σ² > 0.
1. Find the pdf of X.
2. Compute E(X) and Var(X).
Answer:
1. Set Y = g(X) = log(X); then X = g^{-1}(Y) = e^Y. Since g is a monotone increasing transformation,

    F_X(x) = P(X ≤ x) = P(g^{-1}(Y) ≤ x) = P(g(g^{-1}(Y)) ≤ g(x)) = P(Y ≤ g(x)) = F_Y(g(x)).

For the pdf, by the chain rule,

    f_X(x) = d/dx F_X(x) = d/dx F_Y(g(x)) = f_Y(g(x)) · g'(x) = f_Y(log x) · (1/x)
           = 1/(√(2π) σ x) · exp(−(log x − µ)² / (2σ²)).

2. For the mean,

    E(X) = E(e^{log X}) = E(e^Y)
         = ∫ e^y · 1/(√(2π) σ) · e^{−(y−µ)²/(2σ²)} dy
         = ∫ 1/(√(2π) σ) · e^{−[(y−µ)² − 2σ² y]/(2σ²)} dy
         = ∫ 1/(√(2π) σ) · e^{−[(y−µ)² − 2σ²(y−µ) + σ⁴]/(2σ²)} · e^{σ²/2 + µ} dy
         = e^{µ + σ²/2} ∫ 1/(√(2π) σ) · e^{−(y−µ−σ²)²/(2σ²)} dy
         = e^{µ + σ²/2},

since the last integrand is the N(µ + σ², σ²) density, which integrates to 1. For the variance, the same calculation with 2Y in place of Y gives E(e^{2Y}) = e^{2µ + 2σ²}, so

    Var(X) = E(X²) − (E X)² = E(e^{2Y}) − (E X)² = e^{2µ+2σ²} − e^{2µ+σ²}.
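The closed-form moments can be sanity-checked by Monte Carlo; this is a rough simulation sketch with illustrative parameter values (µ = 0.1, σ = 0.3 are arbitrary choices, not from the problem).

```python
import math
import random

random.seed(0)
mu, sigma = 0.1, 0.3          # illustrative parameters
n = 200_000

# Draw X = e^Y with Y ~ N(mu, sigma^2), i.e. X is log-normal.
xs = [math.exp(random.gauss(mu, sigma)) for _ in range(n)]

mean_mc = sum(xs) / n
var_mc = sum((x - mean_mc) ** 2 for x in xs) / n

# Closed-form results derived above.
mean_th = math.exp(mu + sigma**2 / 2)
var_th = math.exp(2*mu + 2*sigma**2) - math.exp(2*mu + sigma**2)

print(f"mean: MC {mean_mc:.4f} vs formula {mean_th:.4f}")
print(f"var:  MC {var_mc:.4f} vs formula {var_th:.4f}")
```

With 200,000 draws the Monte Carlo estimates agree with the formulas to roughly two decimal places.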

4 CDF, PDF, mean, variance of a random variable


Suppose that the random variable Z has cdf

    F_Z(z) = 0            if z < 0,
    F_Z(z) = 1/2          if z = 0,
    F_Z(z) = (z + 1)/2    if 0 < z ≤ 1,
    F_Z(z) = 1            if z > 1.

Is Z a discrete or continuous random variable? Calculate the mean and variance of Z.

Answer: The cdf of the random variable Z is a mixture of a continuous function and a step function, so Z is neither purely continuous nor purely discrete. Nevertheless, the continuous part has pdf

    f_Z(z) = 1/2 for 0 < z ≤ 1, and 0 otherwise.

Additionally, there is an atom at z = 0 with probability P(Z = 0) = 1/2. Having found the pdf of Z and identified the atom, it is not too difficult to obtain its expectation and variance. We have

    E(Z) = (1/2) · 0 + ∫₀¹ z · (1/2) dz = 1/4,

    Var(Z) = (1/2) · 0² + ∫₀¹ z² · (1/2) dz − (1/4)² = 1/6 − 1/16 = 5/48.
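The mixed-distribution moment calculation above can be reproduced exactly with rational arithmetic: each moment is the atom's contribution plus the continuous part's integral, ∫₀¹ zᵏ (1/2) dz = 1/(2(k+1)).

```python
from fractions import Fraction as F

# Z has an atom at 0 with P(Z = 0) = 1/2 and density 1/2 on (0, 1].
def moment(k):
    # E[Z^k] = (1/2) * 0^k  +  integral_0^1 z^k * (1/2) dz
    return F(1, 2) * 0**k + F(1, 2) * F(1, k + 1)

mean = moment(1)
var = moment(2) - mean**2

print(mean, var)  # 1/4 and 5/48, matching the hand calculation
```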

5 PDF
Consider the pdf defined by f_X(x) = 2/x³ for x > 1 and zero elsewhere.

1. Show that the pdf is correctly defined.

2. Find E(X) and Var(X).

Answer:

1. f_X(x) ≥ 0 for all x, and ∫₁^∞ 2/x³ dx = [−1/x²]₁^∞ = 1.

2. E(X) = ∫₁^∞ x · 2/x³ dx = [−2/x]₁^∞ = 2, while E(X²) = ∫₁^∞ x² · 2/x³ dx = 2 ∫₁^∞ dx/x diverges, so Var(X) = ∞.
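The divergence of the variance can be made concrete by evaluating the truncated integrals ∫₁^b at growing cutoffs b: the normalization and mean converge, while the second-moment integral 2 ln b grows without bound. A small illustration:

```python
import math

# Truncated integrals for f(x) = 2/x^3 on (1, b], in closed form:
#   integral_1^b 2/x^3 dx       = 1 - 1/b^2   -> 1  (valid pdf)
#   integral_1^b x * 2/x^3 dx   = 2(1 - 1/b)  -> 2  (E[X] = 2)
#   integral_1^b x^2 * 2/x^3 dx = 2 ln(b)     -> infinity (so Var(X) = inf)
for b in (10.0, 1e3, 1e6):
    total = 1 - 1 / b**2
    ex = 2 * (1 - 1 / b)
    ex2 = 2 * math.log(b)
    print(f"b = {b:>9}: mass = {total:.6f}, E[X] part = {ex:.6f}, "
          f"E[X^2] part = {ex2:.2f}")
```

The first two columns stabilize at 1 and 2, while the last keeps growing, which is exactly why Var(X) = ∞.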

6 Exponential Distribution
Assume a random variable X follows the exponential distribution with pdf

    f(x; λ) = λ e^{−λx} for x ≥ 0, and 0 for x < 0.

1. Compute the cdf, mean, variance, and moment generating function.

2. Show that X is memoryless, i.e., P(X > s + t | X > t) = P(X > s) for all s, t ≥ 0.

Answer:

1. For x ≥ 0, F(x) = ∫₀ˣ λ e^{−λu} du = [−e^{−λu}]₀ˣ = 1 − e^{−λx}. Therefore, the cdf is

    F(x) = 1 − e^{−λx} for x ≥ 0, and 0 for x < 0.
For the mean, variance, and E(e^{uX}):

    E(X) = ∫_{−∞}^∞ x · f(x) dx
         = ∫₀^∞ x · λ e^{−λx} dx
         = [−x e^{−λx}]₀^∞ + ∫₀^∞ e^{−λx} dx          (integration by parts)
         = lim_{x→∞} (−x / e^{λx}) + [−(1/λ) e^{−λx}]₀^∞
         = 0 + 1/λ = 1/λ,                              (L'Hôpital for the limit)

    Var(X) = E[(X − E X)²] = ∫₀^∞ (x − 1/λ)² λ e^{−λx} dx
           = ∫₀^∞ x² λ e^{−λx} dx − ∫₀^∞ 2x e^{−λx} dx + ∫₀^∞ (1/λ) e^{−λx} dx
           = [−x² e^{−λx}]₀^∞ + 2 ∫₀^∞ x e^{−λx} dx − 2 ∫₀^∞ x e^{−λx} dx + [−(1/λ²) e^{−λx}]₀^∞
           = 1/λ².

    E(e^{uX}) = ∫_{−∞}^∞ e^{ux} f(x) dx
              = ∫₀^∞ e^{ux} · λ e^{−λx} dx
              = λ ∫₀^∞ e^{−(λ−u)x} dx
              = [−λ/(λ−u) · e^{−(λ−u)x}]₀^∞,

so for u < λ we have E(e^{uX}) = λ/(λ−u).

2. Since P(X ≤ t) = 1 − e^{−λt}, we have P(X > t) = e^{−λt}. Then

    P(X > s + t | X > t) = P({X > s + t} ∩ {X > t}) / P(X > t)
                         = P(X > s + t) / P(X > t)
                         = e^{−λ(s+t)} / e^{−λt}
                         = e^{−λs} = P(X > s).

Thus X is memoryless; this property is the continuous-time analogue of the Markov property.
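The memoryless property can also be observed empirically: among exponential draws that have already exceeded t, the fraction exceeding s + t matches the unconditional fraction exceeding s. A simulation sketch with illustrative values λ = 0.5, s = 1, t = 2:

```python
import math
import random

random.seed(1)
lam, s, t = 0.5, 1.0, 2.0     # illustrative parameters
n = 200_000

xs = [random.expovariate(lam) for _ in range(n)]

# Empirical P(X > s + t | X > t): restrict to draws already past t.
survivors = [x for x in xs if x > t]
cond = sum(x > s + t for x in survivors) / len(survivors)

# Empirical P(X > s), with the theoretical value e^{-lam*s} for reference.
uncond = sum(x > s for x in xs) / n

print(f"P(X > s+t | X > t) = {cond:.4f}")
print(f"P(X > s)           = {uncond:.4f} (theory {math.exp(-lam * s):.4f})")
```

Both empirical probabilities agree with e^{−λs} up to Monte Carlo noise.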
