Lecture 8

Bernoulli, Binomial and Uniform Distributions

Let (S, Σ, P ) be a probability space corresponding to a random experiment E.

• Each repetition of the random experiment E will be called a trial.


• We say that a collection of trials forms a collection of independent trials if any
collection of corresponding events forms a collection of independent events.

1. Bernoulli Distribution

A random experiment is said to be a Bernoulli experiment if each of its trials results in
just two possible outcomes, labeled success (s) and failure (f). Each repetition of
a Bernoulli experiment is called a Bernoulli trial. For example, consider a sequence of
independent rolls of a fair die, where in each roll a person bets on the occurrence of the upper
face with six dots. Let the event of occurrence of the upper face with six dots be denoted by
E. Here, in each trial, one is interested only in the occurrence or non-occurrence of the
event E. In such situations, the occurrence of the event E is labeled a success and the
non-occurrence of the event E is labeled a failure.
For a Bernoulli trial, the sample space is S = {s, f}, the event space is Σ = P(S)
and the probability function is P : Σ → R defined by P({s}) = p, P({f}) = 1 − p,
P(∅) = 0 and P(S) = 1, where p ∈ (0, 1) is a fixed real number, the
probability of success of the trial. Define the random variable X : S → R by

    X(w) = 1, if w = s
           0, if w = f

Then the r.v. X is of discrete type with support EX = {0, 1} and p.m.f.

    (1)  fX(x) = P({X = x}) = 1 − p, if x = 0
                              p,     if x = 1
                              0,     otherwise.

The random variable X is called a Bernoulli random variable and the distribution with
p.m.f. (1) is called a Bernoulli distribution with success probability p ∈ (0, 1).
The d.f. of X is given by

    FX(x) = P({X ≤ x}) = 0,     if x < 0
                         1 − p, if 0 ≤ x < 1
                         1,     if x ≥ 1.

Now, the expectation of X is

    E(X) = Σ_{x ∈ {0,1}} x fX(x) = p   and   E(X²) = Σ_{x ∈ {0,1}} x² fX(x) = p.

Thus the variance is Var(X) = p − p² = p(1 − p). Also the moment generating function is

    MX(t) = E(e^{tX}) = Σ_{x ∈ {0,1}} e^{tx} fX(x) = p(e^t − 1) + 1, ∀ t ∈ R.
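The Bernoulli p.m.f., moments, and m.g.f. above can be checked numerically. A minimal sketch; the value p = 0.3 is illustrative, not taken from the notes.

```python
import math

# Bernoulli(p) p.m.f. on the support {0, 1}
def bernoulli_pmf(x, p):
    if x == 0:
        return 1.0 - p
    if x == 1:
        return p
    return 0.0

# M_X(t) = p(e^t - 1) + 1, as derived in the notes
def bernoulli_mgf(t, p):
    return p * (math.exp(t) - 1.0) + 1.0

p = 0.3  # illustrative success probability
mean = sum(x * bernoulli_pmf(x, p) for x in (0, 1))       # E(X) = p
second = sum(x**2 * bernoulli_pmf(x, p) for x in (0, 1))  # E(X^2) = p
var = second - mean**2                                    # p(1 - p)
print(mean, var, bernoulli_mgf(0.0, p))
```

As expected, the mean and second moment coincide at p, and the m.g.f. equals 1 at t = 0.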
2. Binomial Distribution

Consider a sequence of n independent Bernoulli trials with probability of success (s) in
each trial being p ∈ (0, 1). In this case, the sample space is S = {(w1, w2, . . . , wn) | wi ∈
{s, f}, i = 1, 2, . . . , n}, where wi represents the outcome of the i-th Bernoulli trial, and
the event space is Σ = P(S). Define the random variable X : S → R by

X((w1 , w2 , . . . , wn )) = number of successes among w1 , w2 , . . . , wn

Clearly, Im X = {0, 1, 2, . . . , n} and P({X = x}) = 0 if x ∉ {0, 1, 2, . . . , n}. For
x ∈ {0, 1, 2, . . . , n},

    P({X = x}) = P({(w1, w2, . . . , wn) ∈ S | X((w1, w2, . . . , wn)) = x})
               = Σ_{(w1,...,wn) ∈ Sx} P({(w1, w2, . . . , wn)}),

where Sx = {(w1, w2, . . . , wn) ∈ S | x of the wi's are s and the remaining n − x of the wi's are f}.

For x ∈ {0, 1, 2, . . . , n} and (w1, w2, . . . , wn) ∈ Sx,

    P({(w1, w2, . . . , wn)}) = p^x (1 − p)^(n−x),

since the trials are independent and P({s}) = p and P({f}) = 1 − p. Therefore, for
x ∈ {0, 1, 2, . . . , n},

    P({X = x}) = Σ_{(w1,...,wn) ∈ Sx} p^x (1 − p)^(n−x) = C(n, x) p^x (1 − p)^(n−x),

since Sx contains exactly C(n, x) elements.

Thus the r.v. X is of discrete type with support EX = {0, 1, 2, . . . , n} and p.m.f.

    (2)  fX(x) = P({X = x}) = C(n, x) p^x (1 − p)^(n−x), if x ∈ {0, 1, 2, . . . , n}
                              0,                         otherwise.

The random variable X is called a Binomial random variable with n trials and success
probability p ∈ (0, 1), written X ∼ Bin(n, p). The probability distribution
with the p.m.f. (2) is called a Binomial distribution with n trials and success probability
p ∈ (0, 1). It is clear that

    Σ_{x ∈ EX} fX(x) = Σ_{x=0}^{n} C(n, x) p^x (1 − p)^(n−x) = (p + (1 − p))^n = 1.
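The normalization above can be verified directly from the p.m.f. (2); the values of n and p below are illustrative.

```python
from math import comb

# Binomial p.m.f.: f_X(x) = C(n, x) p^x (1 - p)^(n - x) on {0, 1, ..., n}
def binom_pmf(x, n, p):
    if x < 0 or x > n:
        return 0.0
    return comb(n, x) * p**x * (1.0 - p)**(n - x)

n, p = 10, 0.4  # illustrative values
total = sum(binom_pmf(x, n, p) for x in range(n + 1))
print(total)  # ≈ 1, by the binomial theorem
```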
Now, the expectation of X ∼ Bin(n, p) is

    E(X) = Σ_{x ∈ EX} x fX(x)
         = Σ_{x=0}^{n} x C(n, x) p^x (1 − p)^(n−x)
         = Σ_{x=0}^{n} [x n! / ((n − x)! x!)] p^x (1 − p)^(n−x)
         = Σ_{x=1}^{n} [n! / ((n − x)! (x − 1)!)] p^x (1 − p)^(n−x)
         = np Σ_{x=1}^{n} [(n − 1)! / ((n − x)! (x − 1)!)] p^(x−1) (1 − p)^(n−x)
         = np Σ_{x=0}^{n−1} C(n − 1, x) p^x (1 − p)^(n−1−x)
         = np (p + (1 − p))^(n−1) = np.
Now, the moment generating function of X ∼ Bin(n, p) is

    MX(t) = E(e^{tX})
          = Σ_{x ∈ EX} e^{tx} fX(x)
          = Σ_{x=0}^{n} C(n, x) e^{tx} p^x (1 − p)^(n−x)
          = Σ_{x=0}^{n} C(n, x) (pe^t)^x (1 − p)^(n−x)
          = (pe^t + (1 − p))^n, t ∈ R.

Therefore,

    MX^(1)(t) = npe^t (pe^t + (1 − p))^(n−1), t ∈ R;
    MX^(2)(t) = npe^t (pe^t + (1 − p))^(n−1) + n(n − 1) p² e^{2t} (pe^t + (1 − p))^(n−2), t ∈ R;
    E(X) = MX^(1)(0) = np;
    E(X²) = MX^(2)(0) = np + n(n − 1)p²;

and Var(X) = E(X²) − (E(X))² = np(1 − p).
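The closed forms E(X) = np and Var(X) = np(1 − p) can be checked numerically against moments computed from the p.m.f.; the values of n and p below are illustrative, not from the notes.

```python
from math import comb

n, p = 12, 0.25  # illustrative values
pmf = [comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]

# First and second moments computed directly from the p.m.f.
mean = sum(x * f for x, f in zip(range(n + 1), pmf))
second = sum(x * x * f for x, f in zip(range(n + 1), pmf))
var = second - mean**2

# Compare with the closed forms E(X) = np and Var(X) = np(1 - p)
print(mean, n * p)
print(var, n * p * (1 - p))
```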
Example 1. Four fair coins are flipped. If the outcomes are assumed independent, what
is the probability that two heads and two tails are obtained?

Solution: Let us label the occurrence of a head in a trial as success and the
occurrence of a tail in a trial as failure. Let X be the number of successes (i.e., heads) that
appear. Then X ∼ Bin(4, 1/2). Hence the required probability is

    P(X = 2) = C(4, 2) (1/2)² (1/2)² = 3/8.
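The answer to Example 1 can be confirmed with exact rational arithmetic:

```python
from math import comb
from fractions import Fraction

# P(X = 2) for X ~ Bin(4, 1/2): C(4, 2) (1/2)^2 (1/2)^2
prob = comb(4, 2) * Fraction(1, 2)**2 * Fraction(1, 2)**2
print(prob)  # → 3/8
```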
Example 2. A fair die is rolled six times independently. Find the probability that on
two occasions we get an upper face with 2 or 3 dots.

Solution: Let us label the occurrence of an upper face having 2 or 3 dots as success and
the occurrence of any other face as failure. Let X be the number of occasions on
which we get a success (i.e., an upper face having 2 or 3 dots). Then X ∼ Bin(6, 1/3). Hence
the required probability is

    P(X = 2) = C(6, 2) (1/3)² (2/3)⁴ = 80/243.
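Likewise, the probability in Example 2 checks out exactly:

```python
from math import comb
from fractions import Fraction

# P(X = 2) for X ~ Bin(6, 1/3): C(6, 2) (1/3)^2 (2/3)^4
prob = comb(6, 2) * Fraction(1, 3)**2 * Fraction(2, 3)**4
print(prob)  # → 80/243
```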

3. Discrete Uniform Distribution

For a given positive integer N (≥ 2) and real numbers x1 < x2 < · · · < xN, a random
variable X of discrete type is said to follow a discrete uniform distribution on
the set {x1, x2, . . . , xN} (written as X ∼ U({x1, x2, . . . , xN})) if the support of X is
EX = {x1, x2, . . . , xN} and its p.m.f. is given by

    fX(x) = P({X = x}) = 1/N, if x ∈ EX = {x1, x2, . . . , xN}
                         0,   otherwise.
Now, for r ∈ {1, 2, · · · }, E(X^r) = (1/N) Σ_{i=1}^{N} x_i^r. Therefore the mean is
E(X) = (1/N) Σ_{i=1}^{N} x_i and the variance is Var(X) = (1/N) Σ_{i=1}^{N} (x_i − E(X))².
Also the m.g.f. is MX(t) = E(e^{tX}) = (1/N) Σ_{i=1}^{N} e^{t x_i}, t ∈ R.
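These moment formulas can be evaluated exactly on a small support; the support {2, 5, 7, 10} below is an illustrative choice, not one from the notes.

```python
from fractions import Fraction

# Discrete uniform on an illustrative support {x1, ..., xN}
support = [Fraction(v) for v in (2, 5, 7, 10)]
N = len(support)

# E(X) = (1/N) sum x_i and Var(X) = (1/N) sum (x_i - E(X))^2
mean = sum(support) / N
var = sum((x - mean)**2 for x in support) / N
print(mean, var)  # → 6 17/2
```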

Now, suppose that X ∼ U({1, 2, . . . , N}). Then

    E(X) = (1/N) Σ_{i=1}^{N} i = (N + 1)/2,
    E(X²) = (1/N) Σ_{i=1}^{N} i² = (N + 1)(2N + 1)/6,
    Var(X) = E(X²) − (E(X))² = (N² − 1)/12.

Also the m.g.f. of X ∼ U({1, 2, . . . , N}) is

    MX(t) = E(e^{tX}) = (1/N) Σ_{i=1}^{N} e^{it} = e^t (e^{Nt} − 1) / (N (e^t − 1)), if t ≠ 0
                                                   1,                                if t = 0.
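The closed forms for U({1, 2, . . . , N}) can be checked against direct sums; N = 8 and t = 0.3 below are illustrative values.

```python
import math

N = 8  # illustrative value

# Moments by direct summation over {1, ..., N}
mean = sum(range(1, N + 1)) / N
second = sum(i * i for i in range(1, N + 1)) / N
var = second - mean**2

# Closed forms: (N+1)/2 and (N^2 - 1)/12
print(mean, (N + 1) / 2)
print(var, (N * N - 1) / 12)

# M_X(t) = e^t (e^{Nt} - 1) / (N (e^t - 1)) for t != 0, and 1 at t = 0
def uniform_mgf(t, N):
    if t == 0:
        return 1.0
    return math.exp(t) * (math.exp(N * t) - 1.0) / (N * (math.exp(t) - 1.0))

# The direct sum (1/N) sum e^{it} should match the closed form
direct = sum(math.exp(i * 0.3) for i in range(1, N + 1)) / N
print(direct, uniform_mgf(0.3, N))
```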
