Lecture 2

This document contains notes on Bayesian statistical inference. It includes discussion of probability distributions like the Bernoulli, binomial, geometric, and negative binomial distributions. It provides the formulas for the expected value and variance of these distributions. It also covers topics like the cumulative distribution function, probability density function, and how to calculate probabilities, expected values, and variances for different random variables and distributions.

STSM 2626

BAYESIAN STATISTICAL INFERENCE


2022
Notes prepared by Dr I. Garisch. Notes edited by Dr D. Chikobvu and Dr M. Sjölander

Slides (from notes) by Dr M. Sjölander


Presented by Dr M. Sjölander
Notation for the lie-detector example:
+ = machine says she's lying.
− = machine says she's not lying.
L = she's lying. T = she's not lying.
Complement rule: P(Aᶜ|B) = 1 − P(A|B).
To find P(T|+), use Bayes' theorem with A = + and the partition B₁ = T, B₂ = L.
[Tree diagram on slide: branches T and L, each splitting into + and −.]
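The P(T|+) calculation can be sketched with the law of total probability and Bayes' theorem. The numbers below (prior P(L), hit rate, false-alarm rate) are hypothetical illustrations, not from the notes:

```python
# Hypothetical inputs (not from the notes): prior and machine error rates.
P_L = 0.10              # P(L): prior probability she's lying
P_T = 1 - P_L           # P(T): prior probability she's telling the truth
P_plus_given_L = 0.90   # P(+|L): machine flags an actual lie
P_plus_given_T = 0.20   # P(+|T): false alarm on a truthful subject

# Law of total probability over the partition {T, L}:
P_plus = P_plus_given_L * P_L + P_plus_given_T * P_T

# Bayes' theorem: P(T|+) = P(+|T) P(T) / P(+)
P_T_given_plus = P_plus_given_T * P_T / P_plus
print(P_T_given_plus)  # 2/3 with these numbers
```

With these made-up rates, a positive reading still leaves a 2/3 chance she is telling the truth, because truthful subjects are far more common in the prior.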
[Number-line diagrams: a finite support x = 0, 1, 2, 3, 4 and a countably infinite support x = 0, 1, 2, …]

Independent trials of an experiment are run, each resulting in a success or failure.


P(success) = p. P(failure) = 1 – p.
Distribution    | # trials     | # successes | # failures  | Range of X (or K)
Bernoulli       | 1            | X           | 1 − X       | x = 0, 1.
Binomial        | n            | X           | n − X       | x = 0, 1, …, n.
Geometric**     | X + 1 (= K)  | 1           | X (= K − 1) | x = 0, 1, … (k = 1, 2, …)
Neg. Binomial*  | X + r (= K)  | r           | X (= K − r) | x = 0, 1, … (k = r, r + 1, …)

** Trials run until the 1st success.  * Trials run until the rth success.
Bernoulli:
E(X) = p
V(X) = p(1 − p)

Binomial:
E(X) = np
V(X) = np(1 − p)

Geometric (X = # failures before the 1st success):
E(X) = (1 − p)/p
V(X) = (1 − p)/p²

Geometric (K = X + 1 = # trials up to and including the 1st success):
E(K) = 1/p
V(K) = (1 − p)/p²

Neg. Binomial (X = # failures before the rth success):
E(X) = r(1 − p)/p
V(X) = r(1 − p)/p²

Neg. Binomial (K = X + r = # trials up to and including the rth success):
E(K) = r/p
V(K) = r(1 − p)/p²
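A quick simulation sketch checking the geometric formulas above; p = 0.3 is an arbitrary choice for the check:

```python
import random

random.seed(1)
p = 0.3  # arbitrary success probability for the check

def geometric_failures():
    """X = number of failures before the first success."""
    x = 0
    while random.random() >= p:
        x += 1
    return x

xs = [geometric_failures() for _ in range(200_000)]
mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)
# Theory: E(X) = (1-p)/p ≈ 2.333 and V(X) = (1-p)/p² ≈ 7.778;
# for K = X + 1 the mean shifts to 1/p while the variance is unchanged.
```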
Hypergeometric:

You have n objects.
r are of type 1 and the remaining n − r are of type 2.
You take a sample of m objects from the n objects.
x = # objects of type 1 in the sample.
Thus m − x = # objects of type 2 in the sample.

E(X) = mr/n ; V(X) = (mr/n)(1 − r/n)((n − m)/(n − 1))
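The mean and variance formulas can be checked directly from the hypergeometric pmf; n = 10, r = 4, m = 3 are arbitrary illustrative values:

```python
from math import comb

n, r, m = 10, 4, 3  # arbitrary: 10 objects, 4 of type 1, sample of 3

def pmf(x):
    # P(X = x) = C(r, x) C(n-r, m-x) / C(n, m)
    return comb(r, x) * comb(n - r, m - x) / comb(n, m)

support = range(max(0, m - (n - r)), min(r, m) + 1)
mean = sum(x * pmf(x) for x in support)                # theory: mr/n = 1.2
var = sum((x - mean) ** 2 * pmf(x) for x in support)
# theory: (mr/n)(1 - r/n)((n-m)/(n-1)) = 1.2 * 0.6 * (7/9) = 0.56
```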

Poisson:
E(X) = λ
V(X) = λ

Let λ = the average # of times a certain event happens in an interval of a certain length (e.g. an hour).
Then X = # times the event happens in a given such interval (e.g. in a given hour).
e.g. On average 10 red cars drive past Mimosa Mall per minute. The probability that 9
red cars drive past Mimosa Mall in the next minute is P(X = 9 | λ = 10) = 10⁹e⁻¹⁰/9! ≈ 0.125.
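The red-car probability can be evaluated directly from the Poisson pmf:

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    # P(X = x | λ) = λ^x e^{-λ} / x!
    return lam ** x * exp(-lam) / factorial(x)

p = poisson_pmf(9, 10)
print(round(p, 4))  # 0.1251
```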
The definition of a cdf is the same for discrete and continuous random variables.

The range of a continuous random variable can take forms such as:
(a, b), (a, ∞), (−∞, b), (−∞, ∞),
[a, b], [a, ∞), (−∞, b], [a, b), (a, b].

f(x) = d/dx F(x)
m = Median(X) implies that F(m) = 0.5.
P(a < X ≤ b) = F(b) − F(a)
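These definitions can be illustrated with an Exponential(λ) cdf; the distribution and the value λ = 2 are assumed for the sketch, not taken from the slides:

```python
from math import exp, log

lam = 2.0  # assumed rate parameter for the illustration

def F(x):
    # cdf of an Exponential(lam) random variable: F(x) = 1 - e^{-λx} for x > 0
    return 1 - exp(-lam * x) if x > 0 else 0.0

# The median m solves F(m) = 0.5, giving m = ln(2)/λ.
m = log(2) / lam

# P(a < X <= b) = F(b) - F(a), e.g. with a = 0.5, b = 1:
p = F(1.0) - F(0.5)
```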
[E(X), V(X) and Median(X) for several further continuous distributions: the formulas were images on the slides and are not recoverable here.]

Normal:
E(X) = μ ; V(X) = σ²
Median(X) = μ


● If x is below the range of X, then F(x|…)=0. If x is above the range of X, then F(x|…)=1
● If x isn’t in the range of X, then f(x|…) = 0.

We had: P(a ≤ X ≤ b) = ∫ₐᵇ f(x) dx.

Be careful when the range is restrictive, e.g. x > 0 or a ≤ x ≤ b,

e.g. the Uniform distribution!
If X ~ U(2, 3), then the range of X is [2, 3], i.e. 2 ≤ x ≤ 3, so f(x) = 0 outside [2, 3].
Then P(1 ≤ X ≤ 2.5) = ∫₁^2.5 f(x) dx
= ∫₁² f(x) dx + ∫₂^2.5 f(x) dx
= ∫₁² 0 dx + ∫₂^2.5 1/(3 − 2) dx
= 0 + ∫₂^2.5 1 dx
= x |₂^2.5
= 2.5 − 2
= 0.5
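The same calculation can be sketched numerically with the piecewise pdf; the midpoint-rule integrator below is just for illustration:

```python
# Piecewise pdf of X ~ U(2, 3): 1/(3-2) = 1 on [2, 3], zero elsewhere.
def f(x):
    return 1.0 if 2 <= x <= 3 else 0.0

def integrate(g, a, b, n=100_000):
    # Simple midpoint rule; accurate enough for this sketch.
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

p = integrate(f, 1, 2.5)
print(round(p, 3))  # 0.5
```

Note how the part of the integration range below 2 contributes nothing, exactly as in the calculation above.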
pdf (transformation) method:
f_Y(y) = f_X(g⁻¹(y)) |d/dy g⁻¹(y)|, or we can divide by |g′(x)|.

N.B. Finding the range of Y, e.g. for Y = −ln X:
y = −ln x ⟺ −y = ln x ⟺ x = e⁻ʸ.
0 ≤ x ≤ 1
⟹ 0 ≤ e⁻ʸ ≤ 1
⟹ −∞ < −y ≤ 0
⟹ 0 ≤ y < ∞.
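This transformation is the classic inverse-transform fact: if X ~ U(0, 1), then Y = −ln X follows an Exponential(1) distribution on [0, ∞). A quick simulation sketch:

```python
import math
import random

random.seed(0)
# Use 1 - random.random() so the argument lies in (0, 1], avoiding log(0).
ys = [-math.log(1.0 - random.random()) for _ in range(100_000)]

mean = sum(ys) / len(ys)                          # Exponential(1): E(Y) = 1
var = sum((y - mean) ** 2 for y in ys) / len(ys)  # and V(Y) = 1
```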
