
Introduction to Probability Distributions

Random Variables
• Random Variable (RV): a numeric outcome that results from an experiment
• For each element of an experiment's sample space, the random variable can take on exactly one value
• Discrete Random Variable: an RV that can take on only a finite or countably infinite set of outcomes
• Continuous Random Variable: an RV that can take on any value along a continuum (but may be reported "discretely")
• Random variables are denoted by upper case letters (Y)
• Individual outcomes for an RV are denoted by lower case letters (y)
Random Variable
• A random variable x takes on a defined set of values with different probabilities.
• For example, if you roll a die, the outcome is random (not fixed) and there are 6 possible outcomes, each of which occurs with probability one-sixth.
• For example, if you poll people about their voting preferences, the percentage of the sample that responds "Yes on Proposition 100" is also a random variable (the percentage will be slightly different every time you poll).

• Roughly, probability is how frequently we expect different outcomes to occur if we repeat the experiment over and over (the "frequentist" view).
Random variables can be discrete or continuous
• Discrete random variables have a countable number of outcomes
  • Examples: dead/alive, treatment, dice, counts, etc.
• Continuous random variables have an infinite continuum of possible values.
  • Examples: blood pressure, weight, the speed of a car, the real numbers from 1 to 6.
2.1.1 Definition of a Random Variable (1/2)
• Random variable: a numerical value assigned to each outcome of a particular experiment

(Diagram: each outcome in the sample space S is mapped to a point on the real number line, -3 -2 -1 0 1 2 3)
2.1.1 Definition of a Random Variable (2/2)
• Example 1: Machine Breakdowns
  • Sample space: S = {electrical, mechanical, misuse}
  • Each of these failures may be associated with a repair cost
  • State space: {50, 200, 350}
  • Cost is a random variable: 50, 200, and 350
Probability Distributions
• Probability Distribution: a table, graph, or formula that describes the values a random variable can take on and the corresponding probability (discrete RV) or density (continuous RV)
• Discrete Probability Distribution: assigns probabilities (masses) to the individual outcomes
• Continuous Probability Distribution: assigns density at individual points; the probability of a range is obtained by integrating the density function
• Discrete probabilities are denoted by p(y) = P(Y = y)
• Continuous densities are denoted by f(y)
• Cumulative Distribution Function: F(y) = P(Y ≤ y)
Probability functions
• A probability function maps the possible values of x against their respective probabilities of occurrence, p(x)
• p(x) is a number from 0 to 1.0.
• The area under a probability function is always 1.
2.1.2 Probability Mass Function (1/2)
• Discrete RV
• Probability Mass Function (p.m.f.): a set of probability values p_i assigned to each of the values x_i taken by the discrete random variable, with
  0 ≤ p_i ≤ 1 and Σ_i p_i = 1
• Probability: P(X = x_i) = p_i
2.1.2 Probability Mass Function (2/2)
• Example 1: Machine Breakdowns
  • P(cost = 50) = 0.3, P(cost = 200) = 0.2, P(cost = 350) = 0.5
  • 0.3 + 0.2 + 0.5 = 1

  x_i   50    200   350
  p_i   0.3   0.2   0.5

(Bar chart of f(x): masses 0.3, 0.2, 0.5 at cost $50, $200, $350)

2.1.3 Cumulative Distribution Function (1/2)
• Cumulative Distribution Function (abbreviation: c.d.f.)
• Function: F(x) = P(X ≤ x) = Σ_{y: y ≤ x} P(X = y)

(Step plot of F(x) for the repair-cost example, rising to 0.3 at $50, 0.5 at $200, and 1.0 at $350)

2.1.3 Cumulative Distribution Function (2/2)
• Example 1: Machine Breakdowns
  x < 50           F(x) = P(cost ≤ x) = 0
  50 ≤ x < 200     F(x) = P(cost ≤ x) = 0.3
  200 ≤ x < 350    F(x) = P(cost ≤ x) = 0.3 + 0.2 = 0.5
  350 ≤ x          F(x) = P(cost ≤ x) = 0.3 + 0.2 + 0.5 = 1.0

(Step plot of F(x) versus cost in $, with jumps at 50, 200, and 350)
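To tie the p.m.f. and c.d.f. together, here is a minimal Python sketch (my addition, not from the slides) for the repair-cost example:

pmf = {50: 0.3, 200: 0.2, 350: 0.5}            # repair-cost p.m.f. from Example 1
assert abs(sum(pmf.values()) - 1.0) < 1e-12    # the masses sum to 1

def cdf(x):
    """F(x) = P(cost <= x): add up the masses of all values not exceeding x."""
    return sum(p for value, p in pmf.items() if value <= x)

print(cdf(49), cdf(50), cdf(200), cdf(350))    # 0 0.3 0.5 1.0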


Discrete example: roll of a die

(Bar chart of p(x): p(x) = 1/6 for each face x = 1, 2, 3, 4, 5, 6)

Σ_{all x} P(x) = 1
Probability mass function (pmf)

  x   p(x)
  1   p(x=1) = 1/6
  2   p(x=2) = 1/6
  3   p(x=3) = 1/6
  4   p(x=4) = 1/6
  5   p(x=5) = 1/6
  6   p(x=6) = 1/6
      Total:   1.0
Cumulative distribution function (CDF)

(Step plot of P(x): the CDF rises by 1/6 at each face, through 1/6, 1/3, 1/2, 2/3, 5/6, reaching 1.0 at x = 6)
Cumulative distribution function

  x   P(x≤A)
  1   P(x≤1) = 1/6
  2   P(x≤2) = 2/6
  3   P(x≤3) = 3/6
  4   P(x≤4) = 4/6
  5   P(x≤5) = 5/6
  6   P(x≤6) = 6/6
Practice Problem:
• The number of patients seen in any given hour is a random variable represented by x. The probability distribution for x is:

  x      10    11    12    13    14
  P(x)   0.4   0.2   0.2   0.1   0.1

Find the probability that in a given hour:
a. Exactly 14 patients arrive: p(x=14) = 0.1
b. At least 12 patients arrive: p(x≥12) = 0.2 + 0.1 + 0.1 = 0.4
c. At most 11 patients arrive: p(x≤11) = 0.4 + 0.2 = 0.6
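As a quick check (my addition, not part of the original slides), the same three probabilities computed directly from the table in Python:

pmf = {10: 0.4, 11: 0.2, 12: 0.2, 13: 0.1, 14: 0.1}   # P(x) from the table
print(pmf[14])                                         # a. P(x = 14)  -> 0.1
print(sum(p for x, p in pmf.items() if x >= 12))       # b. P(x >= 12) -> 0.4 (up to float rounding)
print(sum(p for x, p in pmf.items() if x <= 11))       # c. P(x <= 11) -> 0.6 (up to float rounding)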
Review Question 1
If you toss a die, what’s the probability that you
roll a 3 or less?

a. 1/6
b. 1/3
c. 1/2
d. 5/6
e. 1.0
Review Question 1
If you toss a die, what's the probability that you roll a 3 or less?

a. 1/6
b. 1/3
c. 1/2
d. 5/6
e. 1.0

Answer: c. P(x ≤ 3) = 3/6 = 1/2.
Review Question 2
Two dice are rolled and the sum of the face values is six. What is the probability that at least one of the dice came up a 3?

a. 1/5
b. 2/3
c. 1/2
d. 5/6
e. 1.0
Review Question 2
Two dice are rolled and the sum of the face values is six. What is the probability that at least one of the dice came up a 3?

a. 1/5
b. 2/3
c. 1/2
d. 5/6
e. 1.0

Answer: a. How can you get a 6 on two dice? 1-5, 5-1, 2-4, 4-2, 3-3. One of these five has a 3, so the probability is 1/5.
Continuous case
• The probability function that accompanies a continuous random variable is a continuous mathematical function that integrates to 1.
• For example, recall the negative exponential function (in probability, this is called an "exponential distribution"):
  f(x) = e^(-x)
• This function integrates to 1:
  ∫₀^∞ e^(-x) dx = [-e^(-x)]₀^∞ = 0 - (-1) = 1
2.2 Continuous Random Variables
2.2.1 Example of Continuous Random Variables (1/1)
• Example 14: Metal Cylinder Production
  • Suppose that the random variable X is the diameter of a randomly chosen cylinder manufactured by the company. Since this random variable can take any value between 49.5 and 50.5, it is a continuous random variable.
• Probability Density Function (p.d.f.)
  • Probabilistic properties of a continuous random variable:
    f(x) ≥ 0
    ∫_{state space} f(x) dx = 1
2.2.2 Probability Density Function (2/4)
• Suppose that the diameter of a metal cylinder has p.d.f.
  f(x) = 1.5 - 6(x - 50.0)²  for 49.5 ≤ x ≤ 50.5
  f(x) = 0, elsewhere
• Check that the density integrates to 1:
  ∫_{49.5}^{50.5} (1.5 - 6(x - 50.0)²) dx = [1.5x - 2(x - 50.0)³]_{49.5}^{50.5}
  = [1.5 × 50.5 - 2(50.5 - 50.0)³] - [1.5 × 49.5 - 2(49.5 - 50.0)³]
  = 75.5 - 74.5 = 1.0

(Plot of f(x) over 49.5 ≤ x ≤ 50.5)
2.2.2 Probability Density Function (4/4)
• The probability that a metal cylinder has a diameter between 49.8 and 50.1 mm can be calculated to be
  ∫_{49.8}^{50.1} (1.5 - 6(x - 50.0)²) dx = [1.5x - 2(x - 50.0)³]_{49.8}^{50.1}
  = [1.5 × 50.1 - 2(50.1 - 50.0)³] - [1.5 × 49.8 - 2(49.8 - 50.0)³]
  = 75.148 - 74.716 = 0.432

(Plot of f(x) with the area between x = 49.8 and x = 50.1 shaded)

2.2.3 Cumulative Distribution Function (1/3)
• Cumulative Distribution Function
  F(x) = P(X ≤ x) = ∫_{-∞}^{x} f(y) dy
  f(x) = dF(x)/dx
• P(a < X ≤ b) = P(X ≤ b) - P(X ≤ a) = F(b) - F(a)
• P(a ≤ X ≤ b) = P(a < X < b), since individual points carry zero probability for a continuous random variable
2.2.3 Cumulative Distribution Function (2/3)
• For the metal cylinder diameter:
  F(x) = P(X ≤ x) = ∫_{49.5}^{x} (1.5 - 6(y - 50.0)²) dy
  = [1.5y - 2(y - 50.0)³]_{49.5}^{x}
  = [1.5x - 2(x - 50.0)³] - [1.5 × 49.5 - 2(49.5 - 50.0)³]
  = 1.5x - 2(x - 50.0)³ - 74.5
• P(49.7 ≤ X ≤ 50.0) = F(50.0) - F(49.7)
  = (1.5 × 50.0 - 2(50.0 - 50.0)³ - 74.5) - (1.5 × 49.7 - 2(49.7 - 50.0)³ - 74.5)
  = 0.5 - 0.104 = 0.396
2.2.3 Cumulative Distribution Function (3/3)
• P(49.7 ≤ X ≤ 50.0) = 0.396

(Plot of the c.d.f. F(x) on 49.5 ≤ x ≤ 50.5: F(49.7) = P(X ≤ 49.7) = 0.104 and F(50.0) = P(X ≤ 50.0) = 0.5, so the shaded difference is 0.396)
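To make this arithmetic easy to reproduce, here is a small Python sketch (my addition) that codes the closed-form c.d.f. F(x) = 1.5x - 2(x - 50.0)³ - 74.5 derived above:

def F(x):
    # c.d.f. of the cylinder diameter on [49.5, 50.5], from the antiderivative above
    if x < 49.5:
        return 0.0
    if x > 50.5:
        return 1.0
    return 1.5 * x - 2 * (x - 50.0) ** 3 - 74.5

print(round(F(50.1) - F(49.8), 3))   # 0.432 = P(49.8 <= X <= 50.1)
print(round(F(50.0) - F(49.7), 3))   # 0.396 = P(49.7 <= X <= 50.0)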


Continuous case: "probability density function" (pdf)

  p(x) = e^(-x)

• The probability that x is any exact particular value (such as 1.9976) is 0; we can only assign probabilities to possible ranges of x.
• For example, the probability of x falling within 1 to 2:

Clinical example: survival times after lung transplant may roughly follow an exponential function. Then, the probability that a patient will die in the second year after surgery (between years 1 and 2) is 23%:

  P(1 ≤ x ≤ 2) = ∫₁² e^(-x) dx = [-e^(-x)]₁² = -e^(-2) + e^(-1) = -0.135 + 0.368 ≈ 0.23

(Plot of p(x) = e^(-x) with the area between x = 1 and x = 2 shaded)
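A quick numerical check of this example (my addition, standard library only):

import math

# P(1 <= x <= 2) for the exponential density e^(-x): closed form e^(-1) - e^(-2)
print(round(math.exp(-1) - math.exp(-2), 3))   # 0.233, i.e. about a 23% chance of dying in year 2

# the total probability is 1: the antiderivative 1 - e^(-x) tends to 1 as x grows
print(round(1 - math.exp(-20), 6))             # 1.0 for practical purposes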
Expected Value and Variance
• All probability distributions are characterized by an expected value (mean) and a variance (standard deviation squared).
2.3 The Expectation of a Random Variable
2.3.1 Expectations of Discrete Random Variables (1/2)
• Expectation of a discrete random variable with p.m.f. P(X = x_i) = p_i:
  E(X) = Σ_i p_i x_i
• Expectation of a continuous random variable with p.d.f. f(x):
  E(X) = ∫_{state space} x f(x) dx
• Example 1 (discrete random variable): the expected repair cost is
  E(cost) = 50(0.3) + 200(0.2) + 350(0.5) = $230
• The expected value of a random variable is also called the mean of the random variable.
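The discrete formula is just a probability-weighted sum; a one-line Python check of the repair-cost example (my addition):

pmf = {50: 0.3, 200: 0.2, 350: 0.5}          # repair-cost p.m.f. from Example 1
print(sum(x * p for x, p in pmf.items()))    # 230.0 = E(X), the expected repair cost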
2.3.2 Expectations of Continuous Random Variables (2/2)
• Symmetric Random Variables
  • If X has a p.d.f. f(x) that is symmetric about a point µ, so that f(µ + x) = f(µ - x)
  • Then E(X) = µ (why?)
  • So the expectation of the random variable is equal to the point of symmetry

(Plot of a symmetric density f(x) centered at µ, with E(X) = µ)
Expected value of a random variable
• Expected value is just the average or mean (µ) of random variable x.
• It's sometimes called a "weighted average" because more frequent values of X are weighted more highly in the average.
• It's also how we expect X to behave on average over the long run (the "frequentist" view again).
Expected value, formally
Discrete case:
  E(X) = Σ_{all x} x_i p(x_i)
Continuous case:
  E(X) = ∫_{all x} x p(x) dx
Symbol Interlude
• E(X) = µ
• These symbols are used interchangeably.
Example: expected value
• Recall the following probability distribution:

  x      10    11    12    13    14
  P(x)   0.4   0.2   0.2   0.1   0.1

  Σ_i x_i p(x_i) = 10(0.4) + 11(0.2) + 12(0.2) + 13(0.1) + 14(0.1) = 11.3
Sample Mean is a special case of Expected Value…
• Sample mean, for a sample of n subjects:
  X̄ = (Σ_{i=1}^{n} x_i) / n = Σ_{i=1}^{n} x_i (1/n)
• The probability (frequency) of each person in the sample is 1/n.
Expected Value
• Expected value is an extremely useful concept for good decision-making!

Example: the lottery
• The Lottery (also known as a tax on people who are bad at math…)
• A certain lottery works by picking 6 numbers from 1 to 49. It costs $1.00 to play the lottery, and if you win, you win $2 million after taxes.
• If you play the lottery once, what are your expected winnings or losses?
Lottery
Calculate the probability of winning in 1 try:

  1 / C(49, 6) = 1 / (49! / (43! 6!)) = 1 / 13,983,816 ≈ 7.2 x 10^-8

"49 choose 6": out of 49 numbers, this is the number of distinct combinations of 6.

The probability function (note, sums to 1.0):

  x$            p(x)
  -1            0.999999928
  + 2 million   7.2 x 10^-8
Expected Value
The probability function:

  x$            p(x)
  -1            0.999999928
  + 2 million   7.2 x 10^-8

Expected Value:
E(X) = P(win) x $2,000,000 + P(lose) x (-$1.00)
     = 2.0 x 10^6 x 7.2 x 10^-8 + 0.999999928 x (-1) = 0.144 - 0.999999928 ≈ -$0.86

Negative expected value is never good!
You shouldn't play if you expect to lose money!
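A short Python sketch of the lottery calculation (my addition; math.comb requires Python 3.8+):

import math

p_win = 1 / math.comb(49, 6)                     # 1 / 13,983,816, about 7.2e-8
expected = 2_000_000 * p_win - 1 * (1 - p_win)   # win $2 million, else lose the $1 ticket
print(p_win)                                     # ~7.15e-08
print(round(expected, 2))                        # ~ -0.86 dollars per $1 ticket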
Expected Value
If you play the lottery every week for 10 years, what are your
expected winnings or losses?
 
520 x (-0.86) = -$447.20
Gambling (or how casinos can afford to give so many free drinks…)

A roulette wheel has the numbers 1 through 36, as well as 0 and 00. If you bet $1 that an odd number comes up, you win or lose $1 according to whether or not that event occurs. If random variable X denotes your net gain, X = 1 with probability 18/38 and X = -1 with probability 20/38.

E(X) = 1(18/38) - 1(20/38) = -$0.053

On average, the casino wins (and the player loses) 5 cents per game.

The casino rakes in even more if the stakes are higher:

E(X) = 10(18/38) - 10(20/38) = -$0.53

If the cost is $10 per game, the casino wins an average of 53 cents per game. If 10,000 games are played in a night, that's a cool $5300.
Expected value isn't everything though…
• Take the hit new show "Deal or No Deal"
• Everyone know the rules?
• Let's say you are down to two cases left: $1 and $400,000. The banker offers you $200,000.
• So, Deal or No Deal?

Deal or No Deal…
• This could really be represented as a probability distribution and a non-random variable:

  No Deal (keep playing):
  x$          p(x)
  +1          0.50
  +$400,000   0.50

  Deal (take the offer):
  x$          p(x)
  +$200,000   1.0
Expected value doesn't help…

  No Deal:
  x$          p(x)
  +1          0.50
  +$400,000   0.50

  E(X) = Σ_{all x} x_i p(x_i) = 1(0.50) + 400,000(0.50) ≈ $200,000

  Deal:
  x$          p(x)
  +$200,000   1.0

  E(X) = $200,000
How to decide? Variance!
• If you take the deal, the variance/standard deviation is 0.
• If you don't take the deal, what is the average deviation from the mean?
• What's your gut guess?
Variance/standard deviation
σ² = Var(x) = E(x - µ)²

"The expected (or average) squared distance (or deviation) from the mean"

  σ² = Var(x) = E[(x - µ)²] = Σ_{all x} (x_i - µ)² p(x_i)
Variance, continuous
Discrete case:
  Var(X) = Σ_{all x} (x_i - µ)² p(x_i)
Continuous case:
  Var(X) = ∫_{all x} (x - µ)² p(x) dx
Symbol Interlude
• Var(X) = σ²
• SD(X) = σ
• These symbols are used interchangeably.
Similarity to empirical variance
The variance of a sample:
  s² = Σ_{i=1}^{n} (x_i - x̄)² / (n - 1) = Σ_{i=1}^{n} (x_i - x̄)² (1 / (n - 1))

Division by n - 1 reflects the fact that we have lost a "degree of freedom" (piece of information) because we had to estimate the sample mean before we could estimate the sample variance.
Variance
  σ² = Σ_{all x} (x_i - µ)² p(x_i)
     = (1 - 200,000)²(0.5) + (400,000 - 200,000)²(0.5) ≈ 200,000²

  σ = √(200,000²) = 200,000

Now you examine your personal risk tolerance…
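For concreteness, a small Python sketch (my addition) comparing the two options' means and standard deviations:

import math

no_deal = {1: 0.5, 400_000: 0.5}     # keep playing: $1 or $400,000, 50/50
deal    = {200_000: 1.0}             # take the banker's offer

def mean(pmf):
    return sum(x * p for x, p in pmf.items())

def sd(pmf):
    mu = mean(pmf)
    return math.sqrt(sum((x - mu) ** 2 * p for x, p in pmf.items()))

print(mean(no_deal), round(sd(no_deal)))   # ~200000.5 and ~200000: same mean, huge spread
print(mean(deal), sd(deal))                # 200000.0 and 0.0: same mean, no risk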


Practice Problem
On the roulette wheel, X = 1 with probability 18/38 and X = -1 with probability 20/38.
• We already calculated the mean to be µ = -$0.053. What's the variance of X?
Answer
  σ² = Σ_{all x} (x_i - µ)² p(x_i)
     = (1 - (-0.053))²(18/38) + (-1 - (-0.053))²(20/38)
     = (1.053)²(18/38) + (-0.947)²(20/38)
     = 0.997

  σ = √0.997 ≈ 0.99

Standard deviation is $0.99. Interpretation: on average, you're either 1 dollar above or 1 dollar below the mean, which is just under zero. Makes sense!
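The same calculation in Python (my addition), reusing the probability-weighted-sum idea from the expected-value slides:

import math

pmf = {+1: 18 / 38, -1: 20 / 38}                      # net gain on a $1 odd-number bet
mu = sum(x * p for x, p in pmf.items())
var = sum((x - mu) ** 2 * p for x, p in pmf.items())
print(round(mu, 3), round(var, 3), round(math.sqrt(var), 2))   # -0.053, 0.997, ~1.0 (the slides round to $.99)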
Review Question 3
The expected value and variance of a
coin toss (H=1, T=0) are?

a. .50, .50
b. .50, .25
c. .25, .50
d. .25, .25
Review Question 3
The expected value and variance of a coin toss (H=1, T=0) are?

a. .50, .50
b. .50, .25
c. .25, .50
d. .25, .25

Answer: b. E(X) = 0(.5) + 1(.5) = .50 and Var(X) = (0 - .5)²(.5) + (1 - .5)²(.5) = .25.
Important discrete probability distribution: The binomial

Bernoulli Trials
• Two possible outcomes
  • Success, with probability p
  • Failure, with probability q = 1 - p
• Trials are independent.

Binomial Distribution
• For n Bernoulli trials, the probability of k successes is given by the binomial probability formula:
  P_{n,p}(k) = C(n, k) p^k (1 - p)^(n-k)
• As k varies with fixed n and p, the binomial probabilities define a binomial probability distribution over {0, 1, 2, …, n}.
Mean and Mode of a Binomial
• Mean: the expected number of successes in n trials.
  µ = np
• Mode: the most likely number of successes in n trials.
  m = int[np + p]
Binomial Probability Distribution
• A fixed number of observations (trials), n
  • e.g., 15 tosses of a coin; 20 patients; 1000 people surveyed
• A binary outcome
  • e.g., head or tail in each toss of a coin; disease or no disease
  • Generally called "success" and "failure"
  • Probability of success is p, probability of failure is 1 - p
• Constant probability for each observation
  • e.g., probability of getting a tail is the same each time we toss the coin
Binomial distribution
Take the example of 5 coin tosses. What's the probability that you flip exactly 3 heads in 5 coin tosses?

Binomial distribution
Solution:
One way to get exactly 3 heads: HHHTT

What's the probability of this exact arrangement?
P(heads) x P(heads) x P(heads) x P(tails) x P(tails) = (1/2)^3 x (1/2)^2

Another way to get exactly 3 heads: THHHT
Probability of this exact outcome = (1/2)^1 x (1/2)^3 x (1/2)^1 = (1/2)^3 x (1/2)^2
Binomial distribution
In fact, (1/2)^3 x (1/2)^2 is the probability of each unique outcome that has exactly 3 heads and 2 tails.

So, the overall probability of 3 heads and 2 tails is:
(1/2)^3 x (1/2)^2 + (1/2)^3 x (1/2)^2 + (1/2)^3 x (1/2)^2 + ….. for as many unique arrangements as there are—but how many are there??
  Outcome   Probability
  THHHT     (1/2)^3 x (1/2)^2
  HHHTT     (1/2)^3 x (1/2)^2
  TTHHH     (1/2)^3 x (1/2)^2
  HTTHH     (1/2)^3 x (1/2)^2
  HHTTH     (1/2)^3 x (1/2)^2
  HTHHT     (1/2)^3 x (1/2)^2
  THTHH     (1/2)^3 x (1/2)^2
  HTHTH     (1/2)^3 x (1/2)^2
  HHTHT     (1/2)^3 x (1/2)^2
  THHTH     (1/2)^3 x (1/2)^2

C(5, 3) ways to arrange 3 heads in 5 trials; the probability of each unique outcome is the same.

10 arrangements x (1/2)^3 x (1/2)^2

5C3 = 5!/(3!2!) = 10

Factorial review: n! = n(n-1)(n-2)…

P(3 heads and 2 tails) = C(5, 3) x P(heads)^3 x P(tails)^2 = 10 x (1/2)^5 = 31.25%
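A brute-force check of this count (my addition): enumerate all 2^5 sequences and keep those with exactly 3 heads.

import math
from itertools import product

arrangements = [seq for seq in product("HT", repeat=5) if seq.count("H") == 3]
print(len(arrangements))              # 10, matching C(5, 3)
print(math.comb(5, 3) * 0.5 ** 5)     # 0.3125 = 31.25%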
Binomial distribution function:
X = the number of heads tossed in 5 coin tosses

(Bar chart of p(x) versus x = 0, 1, 2, 3, 4, 5 heads)
Binomial distribution, generally
Note the general pattern emerging: if you have only two possible outcomes (call them 1/0 or yes/no or success/failure) in n independent trials, then the probability of exactly X "successes" is

  P(X successes) = C(n, X) p^X (1 - p)^(n - X)

where n = number of trials, X = # successes out of n trials, p = probability of success, and 1 - p = probability of failure.
Binomial distribution: example
• If I toss a coin 20 times, what's the probability of getting exactly 10 heads?

  C(20, 10) (.5)^10 (.5)^10 = .176
Binomial distribution: example
• If I toss a coin 20 times, what's the probability of getting 2 or fewer heads?

  C(20, 0) (.5)^0 (.5)^20 = (20!/(20!0!)) (.5)^20 = 9.5 x 10^-7 +
  C(20, 1) (.5)^1 (.5)^19 = (20!/(19!1!)) (.5)^20 = 20 x 9.5 x 10^-7 = 1.9 x 10^-5 +
  C(20, 2) (.5)^2 (.5)^18 = (20!/(18!2!)) (.5)^20 = 190 x 9.5 x 10^-7 = 1.8 x 10^-4
  ≈ 2.0 x 10^-4
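Summing those three terms in Python (my addition) confirms the cumulative probability:

import math

n, p = 20, 0.5
prob = sum(math.comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(3))   # k = 0, 1, 2
print(prob)   # ~2.0e-4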
All probability distributions are characterized by an expected value and a variance:

If X follows a binomial distribution with parameters n and p: X ~ Bin(n, p)
Then:
  E(X) = np
  Var(X) = np(1 - p)
  SD(X) = √(np(1 - p))

Note: the variance will always lie between 0 and .25n, because p(1 - p) reaches its maximum of .25 at p = .5.
Practice Problem
• 1. You are performing a cohort study. If the probability of developing disease in the exposed group is .05 for the study duration, then if you (randomly) sample 500 exposed people, how many do you expect to develop the disease? Give a margin of error (+/- 1 standard deviation) for your estimate.
• 2. What's the probability that at most 10 exposed people develop the disease?
Answer
1. How many do you expect to develop the disease? Give a margin of error (+/- 1 standard deviation) for your estimate.

X ~ binomial (500, .05)
E(X) = 500 (.05) = 25
Var(X) = 500 (.05) (.95) = 23.75
StdDev(X) = square root (23.75) = 4.87

25 ± 4.87
Answer
2. What's the probability that at most 10 exposed subjects develop the disease?

This is asking for a CUMULATIVE PROBABILITY: the probability of 0 getting the disease or 1 or 2 or 3 or 4 or up to 10.

P(X≤10) = P(X=0) + P(X=1) + P(X=2) + P(X=3) + P(X=4) + … + P(X=10)
  = C(500, 0)(.05)^0(.95)^500 + C(500, 1)(.05)^1(.95)^499 + C(500, 2)(.05)^2(.95)^498 + … + C(500, 10)(.05)^10(.95)^490 ≈ .0005
Practice Problem:
You are conducting a case-control study of smoking and lung cancer. If the probability of being a smoker among lung cancer cases is .6, what's the probability that in a group of 8 cases you have:

a. Less than 2 smokers?
b. More than 5?
c. What are the expected value and variance of the number of smokers?
Answer
  X   P(X)
  0   1(.6)^0(.4)^8  = .00065
  1   8(.6)^1(.4)^7  = .008
  2   28(.6)^2(.4)^6 = .04
  3   56(.6)^3(.4)^5 = .12
  4   70(.6)^4(.4)^4 = .23
  5   56(.6)^5(.4)^3 = .28
  6   28(.6)^6(.4)^2 = .21
  7   8(.6)^7(.4)^1  = .090
  8   1(.6)^8(.4)^0  = .0168

(Bar chart of P(X) for X = 0 through 8)
Answer, continued
P(<2) = .00065 + .008 = .00865
P(>5) = .21 + .09 + .0168 = .3168

(Bar chart of P(X) for X = 0 through 8, with the tails X < 2 and X > 5 highlighted)

E(X) = 8 (.6) = 4.8
Var(X) = 8 (.6) (.4) = 1.92
StdDev(X) = 1.38
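The whole table can be generated in a few lines of Python (my addition); the exact tail values differ slightly from the rounded sums above:

import math

n, p = 8, 0.6
pmf = {k: math.comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1)}

print(pmf[0] + pmf[1])            # P(X < 2) ~0.0085
print(pmf[6] + pmf[7] + pmf[8])   # P(X > 5) ~0.315
print(n * p, n * p * (1 - p))     # E(X) = 4.8, Var(X) = 1.92 (up to float rounding)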
Review Question 4
In your case-control study of smoking and lung cancer, 60% of cases are smokers versus only 10% of controls. What is the odds ratio between smoking and lung cancer?

a. 2.5
b. 13.5
c. 15.0
d. 6.0
e. .05
Review Question 4
In your case-control study of smoking and lung cancer, 60% of cases are smokers versus only 10% of controls. What is the odds ratio between smoking and lung cancer?

a. 2.5
b. 13.5
c. 15.0
d. 6.0
e. .05

Answer: b. OR = (.6/.4) / (.1/.9) = (3/2) x (9/1) = 27/2 = 13.5
Review Question 5
What's the probability of getting exactly 5 heads in 10 coin tosses?

a. C(10, 5) (.50)^5 (.50)^5
b. (.50)^5 (.50)^5
c. C(10, 5) (.50)^10 (.50)^5
d. C(10, 10) (.50)^10 (.50)^0
Review Question 5
What's the probability of getting exactly 5 heads in 10 coin tosses?

a. C(10, 5) (.50)^5 (.50)^5
b. (.50)^5 (.50)^5
c. C(10, 5) (.50)^10 (.50)^5
d. C(10, 10) (.50)^10 (.50)^0

Answer: a. C(10, 5) (.50)^5 (.50)^5
Review Question 6
A coin toss can be thought of as an example of a binomial
distribution with N=1 and p=.5. What are the expected
value and variance of a coin toss?

a. .5, .25
b. 1.0, 1.0
c. 1.5, .5
d. .25, .5
e. .5, .5
Review Question 6
A coin toss can be thought of as an example of a binomial distribution with N=1 and p=.5. What are the expected value and variance of a coin toss?

a. .5, .25
b. 1.0, 1.0
c. 1.5, .5
d. .25, .5
e. .5, .5

Answer: a. E(X) = np = .5 and Var(X) = np(1-p) = .25.
Review Question 7
If I toss a coin 10 times, what is the expected value
and variance of the number of heads?

a. 5, 5
b. 10, 5
c. 2.5, 5
d. 5, 2.5
e. 2.5, 10
Review Question 7
If I toss a coin 10 times, what is the expected value and variance of the number of heads?

a. 5, 5
b. 10, 5
c. 2.5, 5
d. 5, 2.5
e. 2.5, 10

Answer: d. E(X) = np = 10(.5) = 5 and Var(X) = np(1-p) = 10(.5)(.5) = 2.5.
Review Question 8
In a randomized trial with n=150, the goal is to randomize half to treatment and half to control. The number of people randomized to treatment is a random variable X. What is the probability distribution of X?

a. X ~ Normal(µ=75, σ=10)
b. X ~ Exponential(µ=75)
c. X ~ Uniform
d. X ~ Binomial(N=150, p=.5)
e. X ~ Binomial(N=75, p=.5)
Review Question 8
In a randomized trial with n=150, every subject has a 50% chance of being randomized to treatment. The number of people randomized to treatment is a random variable X. What is the probability distribution of X?

a. X ~ Normal(µ=75, σ=10)
b. X ~ Exponential(µ=75)
c. X ~ Uniform
d. X ~ Binomial(N=150, p=.5)
e. X ~ Binomial(N=75, p=.5)

Answer: d. X ~ Binomial(N=150, p=.5)
Review Question 9
In the same RCT with n=150, if 69 end up in the treatment group and 81 in the control group, how far off is that from expected?

a. Less than 1 standard deviation
b. 1 standard deviation
c. Between 1 and 2 standard deviations
d. More than 2 standard deviations
Review Question 9
In the same RCT with n=150, if 69 end up in the treatment group and 81 in the control group, how far off is that from expected?

a. Less than 1 standard deviation
b. 1 standard deviation
c. Between 1 and 2 standard deviations
d. More than 2 standard deviations

Answer: b. Expected = 75, and 81 and 69 are both 6 away from the expected. Variance = 150(.25) = 37.5, so Std Dev ≈ 6. Therefore, about 1 SD away from expected.
Proportions…
• The binomial distribution forms the basis of statistics for proportions.
• A proportion is just a binomial count divided by n.
  • For example, if we sample 200 cases and find 60 smokers, X = 60 but the observed proportion = .30.
• Statistics for proportions are similar to binomial counts, but differ by a factor of n.
Stats for proportions
For the binomial count:
  µ_x = np
  σ_x² = np(1 - p)
  σ_x = √(np(1 - p))

For the proportion ("p-hat" stands for "sample proportion"):
  µ_p̂ = p
  σ_p̂² = np(1 - p)/n² = p(1 - p)/n
  σ_p̂ = √(p(1 - p)/n)

The mean and standard deviation of the count differ from those of the proportion by a factor of n.
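A small numeric illustration (my addition), using the 60-smokers-out-of-200 example above and treating the observed .30 as p for the sake of the calculation:

import math

n, p = 200, 0.30
count_sd = math.sqrt(n * p * (1 - p))          # SD of the binomial count X
prop_sd  = math.sqrt(p * (1 - p) / n)          # SD of the sample proportion p-hat
print(round(count_sd, 2), round(prop_sd, 4))   # 6.48 and 0.0324: the count SD is n times the proportion SD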
It all comes back to normal…
• Statistics for proportions are based on a normal distribution, because the binomial can be approximated as normal if np > 5.
