Stats ch1
Def (1.1.1): The set, S, of all possible outcomes of a particular experiment is called
the sample space for the experiment.
Ex: tossing a coin, SAT results, reaction time
Operations:
Union: set of elements that belong to A or B or both;
𝐴 ∪ 𝐵 = {𝑥: 𝑥 ∈ 𝐴 𝑜𝑟 𝑥 ∈ 𝐵}
Intersection: set of elements that belong to A and B ;
𝐴 ∩ 𝐵 = {𝑥: 𝑥 ∈ 𝐴 & 𝑥 ∈ 𝐵}
Complement of A: set of all elements that are not in A; 𝐴∁ = {𝑥: 𝑥 ∉ 𝐴}
Ex: the empty set ∅ is 𝑆∁
Thm (1.1.4): For any events 𝐴, 𝐵 and 𝐶 defined on sample space 𝑆,
a. Commutativity: 𝐴 ∪ 𝐵 = 𝐵 ∪ 𝐴 and 𝐴 ∩ 𝐵 = 𝐵 ∩ 𝐴
b. Associativity: 𝐴 ∪ (𝐵 ∪ 𝐶 ) = (𝐴 ∪ 𝐵 ) ∪ 𝐶
and 𝐴 ∩ (𝐵 ∩ 𝐶 ) = (𝐴 ∩ 𝐵) ∩ 𝐶
c. Distributive law: 𝐴 ∩ (𝐵 ∪ 𝐶 ) = (𝐴 ∩ 𝐵) ∪ (𝐴 ∩ 𝐶 )
𝐴 ∪ (𝐵 ∩ 𝐶 ) = (𝐴 ∪ 𝐵 ) ∩ (𝐴 ∪ 𝐶 )
d. DeMorgan’s laws: (𝐴 ∪ 𝐵)∁ = 𝐴∁ ∩ 𝐵∁ and (𝐴 ∩ 𝐵)∁ = 𝐴∁ ∪ 𝐵∁
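A quick sanity check of these identities with Python sets; the sample space and events below are my own toy choices:

# Verify Thm 1.1.4 (c)-(d) on a small example (S, A, B chosen arbitrarily)
S = {1, 2, 3, 4, 5, 6}          # sample space
A = {1, 2, 3}                   # event A
B = {3, 4}                      # event B
comp = lambda E: S - E          # complement relative to S
assert comp(A | B) == comp(A) & comp(B)        # (A ∪ B)^c = A^c ∩ B^c
assert comp(A & B) == comp(A) | comp(B)        # (A ∩ B)^c = A^c ∪ B^c
assert A & (B | {5}) == (A & B) | (A & {5})    # distributive law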
Def (1.2.1): A collection of subsets of S is called a Borel field (or sigma algebra)
denoted by 𝓑, if it satisfies the following three properties:
a. ∅ ∈ ℬ (the empty set is an element of 𝓑)
b. If 𝐴 ∈ ℬ, then 𝐴∁ ∈ ℬ (or 𝓑 is closed under complementation)
c. If 𝐴_1, 𝐴_2, … ∈ ℬ, then ⋃_{𝑖=1}^∞ 𝐴_𝑖 ∈ ℬ (𝓑 is closed under countable unions)
Thus 𝑆 = ∅∁ ∈ ℬ and, by DeMorgan’s laws, ⋂_{𝑖=1}^∞ 𝐴_𝑖 ∈ ℬ as well (𝓑 is also closed under countable intersections).
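A minimal sketch checking the three properties for a candidate collection, assuming a toy three-element 𝑆 and taking ℬ to be its power set (the largest sigma algebra on 𝑆):

from itertools import combinations

S = frozenset({1, 2, 3})
# power set of S -- the largest sigma algebra on S
B = {frozenset(c) for r in range(len(S) + 1) for c in combinations(S, r)}

assert frozenset() in B                     # (a) empty set is in B
assert all(S - A in B for A in B)           # (b) closed under complementation
# (c) for a finite S, countable unions reduce to pairwise unions
assert all(A1 | A2 in B for A1 in B for A2 in B)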
Thm (1.2.6): Let 𝑆 = {𝑠_1, …, 𝑠_𝑛} be a finite set. Let ℬ be any sigma algebra of subsets
of 𝑆. Let 𝑝_1, …, 𝑝_𝑛 be nonnegative numbers such that ∑_{𝑖=1}^𝑛 𝑝_𝑖 = 1.
If, for any 𝐴 ∈ ℬ, we define 𝑃(𝐴) = ∑_{𝑖: 𝑠_𝑖 ∈ 𝐴} 𝑝_𝑖, then 𝑃 is a probability function.
This remains true even if 𝑆 = {𝑠_1, 𝑠_2, …} is countably infinite, provided ∑_{𝑖=1}^∞ 𝑝_𝑖 = 1.
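A minimal sketch of this construction; the outcomes and weights 𝑝_𝑖 below are invented for illustration:

# Thm 1.2.6: nonnegative weights summing to 1 induce a probability function
S = ["s1", "s2", "s3"]
p = {"s1": 0.5, "s2": 0.3, "s3": 0.2}   # nonnegative, sums to 1

def P(A):
    """P(A) = sum of p_i over the outcomes s_i in A."""
    return sum(p[s] for s in A)

print(P(["s1", "s3"]))  # 0.7
print(P(S))             # 1.0 -- P(S) = 1, as required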
Ex: risk-neutral (RN) probabilities
We can price any security paying cash flows in states 𝑢 and 𝑑 without knowing the actual
probabilities the market assigns to 𝑢 and 𝑑. Instead, the weights appearing in the
numerator, 0.6 and 0.4, are RN probabilities: they already capture the risk premium,
allowing us to discount future payoffs at the risk-free rate (i.e., as if agents were risk
neutral). For instance, if the market risk premium is 0.005, then the actual 𝑃_𝑢 is 0.65.
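A sketch of the pricing step; only the RN probabilities 0.6 and 0.4 come from the example above, while the payoffs CF_u, CF_d and the risk-free rate rf are hypothetical numbers:

# One-period, two-state (u/d) risk-neutral pricing sketch
q_u, q_d = 0.6, 0.4          # risk-neutral probabilities from the example
rf = 0.02                    # assumed risk-free rate
CF_u, CF_d = 110.0, 95.0     # assumed payoffs in states u and d

# discount the RN expectation of the payoff at the risk-free rate
price = (q_u * CF_u + q_d * CF_d) / (1 + rf)
print(round(price, 2))       # 101.96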
Thm (1.2.8): If P is a probability function and A is any set in ℬ, then:
a. 𝑃(∅) = 0
b. 𝑃(𝐴) ≤ 1
c. 𝑃(𝐴∁ ) = 1 − 𝑃(𝐴)
Thm (1.2.9): If P is a probability function and A & B are any sets in ℬ, then
a. 𝑃(𝐵 ∩ 𝐴∁ ) = 𝑃(𝐵) − 𝑃(𝐴 ∩ 𝐵)
b. 𝑃(𝐴 ∪ 𝐵) = 𝑃(𝐴) + 𝑃(𝐵) − 𝑃(𝐴 ∩ 𝐵)
c. If 𝐴 ⊂ 𝐵, then 𝑃(𝐴) ≤ 𝑃(𝐵)
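A numerical check of Thms 1.2.8 and 1.2.9 on a toy equally likely die, my own example:

# Uniform probability on a six-sided die
S = set(range(1, 7))
P = lambda E: len(E & S) / len(S)
A, B = {1, 2, 3}, {2, 3, 4, 5}

assert P(set()) == 0 and P(A) <= 1                          # 1.2.8 (a), (b)
assert abs(P(S - A) - (1 - P(A))) < 1e-12                   # P(A^c) = 1 - P(A)
assert abs(P(A | B) - (P(A) + P(B) - P(A & B))) < 1e-12     # inclusion-exclusion
assert P({2, 3}) <= P(B)                                    # {2,3} ⊂ B ⇒ monotone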
Def (1.3.2): If 𝐴 and 𝐵 are events in 𝑆 and 𝑃(𝐵) > 0, then the conditional
probability of 𝐴 given 𝐵 is:
𝑃(𝐴 | 𝐵) = 𝑃(𝐴 ∩ 𝐵) / 𝑃(𝐵)
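A worked toy example of the definition on two fair dice:

from itertools import product

# P(sum = 8 | first die shows 3) by direct counting
S = set(product(range(1, 7), repeat=2))   # 36 equally likely outcomes
P = lambda E: len(E) / len(S)
A = {s for s in S if sum(s) == 8}         # event: the sum is 8
B = {s for s in S if s[0] == 3}           # event: first die shows 3
print(P(A & B) / P(B))                    # (1/36)/(6/36) = 1/6 ~ 0.1667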
Thm (1.3.5) Bayes’ Rule: Let 𝐴1 , 𝐴2 , … be a partition of the sample space, and let 𝐵
be any set. Then, for each 𝑖 = 1, 2, …
𝑃(𝐴_𝑖 | 𝐵) = 𝑃(𝐵 | 𝐴_𝑖) 𝑃(𝐴_𝑖) / ∑_{𝑗=1}^∞ 𝑃(𝐵 | 𝐴_𝑗) 𝑃(𝐴_𝑗)
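A sketch with a two-set partition {𝐴_1, 𝐴_2}, the classic test-for-a-condition setup; all the probabilities below are invented:

# Bayes' rule: P(A1 | B) from P(B | A_j) and the prior P(A_j)
P_A1 = 0.01                  # P(condition), the prior
P_A2 = 1 - P_A1              # P(no condition) -- {A1, A2} partitions S
P_B_given_A1 = 0.95          # P(positive test | condition)
P_B_given_A2 = 0.05          # P(positive test | no condition)

num = P_B_given_A1 * P_A1
den = P_B_given_A1 * P_A1 + P_B_given_A2 * P_A2   # total probability over the partition
print(num / den)             # P(A1 | B) ~ 0.161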
Def (1.3.7): Two events, 𝐴 & 𝐵 are statistically independent if
𝑃(𝐴 ∩ 𝐵) = 𝑃(𝐴)𝑃(𝐵)
Thm (1.3.9) If 𝐴 & 𝐵 are independent events, then the following pairs are also
independent:
a. 𝐴 and 𝐵∁
b. 𝐴∁ and 𝐵
c. 𝐴∁ and 𝐵∁
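A numerical check on two fair dice, with events chosen (by me) to be independent:

from itertools import product

# If A and B are independent, so are their complements (Thm 1.3.9)
S = set(product(range(1, 7), repeat=2))
P = lambda E: len(E) / len(S)
A = {s for s in S if s[0] <= 3}      # first die is at most 3
B = {s for s in S if s[1] == 6}      # second die shows 6
indep = lambda E, F: abs(P(E & F) - P(E) * P(F)) < 1e-12
assert indep(A, B) and indep(A, S - B) and indep(S - A, B) and indep(S - A, S - B)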
Ex (1.3.10)
Def (1.4.1) A random variable is a function from a sample space 𝑆 into the real
numbers ℝ
1.5 Distribution Functions
Def (1.5.8 & 1.5.10): The random variables 𝑋 & 𝑌 are identically distributed if
𝐹_𝑋(𝑥) = 𝐹_𝑌(𝑥) ∀𝑥, where 𝐹_𝑋(𝑥) = 𝑃(𝑋 ≤ 𝑥) is the cumulative distribution function (cdf)
1.6 Density and Mass functions
Def (1.6.1): The probability mass function (pmf) of a discrete random variable 𝑋
is given by 𝑓𝑋 (𝑥 ) = 𝑃(𝑋 = 𝑥 ) ∀𝑥
Thm (1.6.5): A function 𝑓𝑋 (𝑥 ) is a pdf (or pmf) of a random variable 𝑋 if and only
if:
a. 𝑓𝑋 (𝑥 ) ≥ 0, ∀𝑥
b. ∑_𝑥 𝑓_𝑋(𝑥) = 1 (pmf) or ∫_{−∞}^{∞} 𝑓_𝑋(𝑥) 𝑑𝑥 = 1 (pdf)
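A quick check of both conditions, using a Binomial(4, 0.3) pmf and an exp(2) pdf as arbitrary test cases; the integral is approximated by a crude Riemann sum:

import math

# Thm 1.6.5 for a Binomial(n=4, p=0.3) pmf
n, p = 4, 0.3
f = lambda x: math.comb(n, x) * p**x * (1 - p)**(n - x)
assert all(f(x) >= 0 for x in range(n + 1))                 # (a) nonnegative
assert abs(sum(f(x) for x in range(n + 1)) - 1) < 1e-12     # (b) sums to 1

# ... and numerically for the exp(lam = 2) pdf
lam, dx = 2.0, 1e-4
pdf = lambda x: lam * math.exp(-lam * x)
print(sum(pdf(k * dx) * dx for k in range(200_000)))        # ~ 1 (integral over [0, 20])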
Exponential distribution, 𝑋 ~ exp(𝜆): the continuous analogue of the geometric
distribution, with pdf 𝑓_𝑋(𝑥) = 𝜆𝑒^{−𝜆𝑥} for 𝑥 ≥ 0.
𝑋 is the time it takes until an event happens (or the time between events).
The events follow a Poisson process, thus they occur:
independently of each other
at a constant rate (𝜆 occurrences per unit of time)
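A simulation sketch, with 𝜆 chosen arbitrarily, showing the mean 1/𝜆 and the memoryless property that marks the exponential as the continuous analogue of the geometric:

import random

# exp(lam) waiting times between events of a Poisson process
random.seed(0)
lam = 1.5
draws = [random.expovariate(lam) for _ in range(100_000)]
print(sum(draws) / len(draws))         # ~ 1/lam, the mean time between events

# memorylessness: P(X > s + t | X > s) ~ P(X > t)
s, t = 0.5, 1.0
tail = lambda c: sum(x > c for x in draws) / len(draws)
print(tail(s + t) / tail(s), tail(t))  # both ~ exp(-lam * t) ~ 0.223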