
1 Binomial numbers and related distributions

1.1 Binomial numbers

Texans (used to) play the lottery by selecting six different numbers between 1 and 50, at a cost of $1 per combination. Twice a week, on Wednesday and Saturday at 10 PM, six ping-pong balls are released without replacement from a rotating plastic ball containing 50 ping-pong balls numbered 1 through 50. The winner of the jackpot (which occasionally accumulates to 60 or more million dollars!) is the one who has all six drawn numbers correct, where the order in which the numbers are drawn does not matter. What are the odds of winning if you play one set of six numbers only?

In order to answer this question, suppose first that the order of the numbers does matter. Then the number of ordered sets of 6 out of 50 numbers is: 50 possibilities for the first drawn number, times 49 possibilities for the second drawn number, times 48 possibilities for the third drawn number, times 47 possibilities for the fourth drawn number, times 46 possibilities for the fifth drawn number, times 45 possibilities for the sixth drawn number:
$$\prod_{j=0}^{5} (50 - j) = \frac{\prod_{k=1}^{50} k}{\prod_{k=1}^{50-6} k} = \frac{50!}{(50-6)!}.$$

The notation n! (read: n factorial) stands for the product of the natural numbers 1 through n:

$$n! = 1 \cdot 2 \cdots (n-1) \cdot n \quad \text{if } n > 0, \qquad 0! = 1.$$

The reason for defining 0! = 1 will be explained below. Since a set of six given numbers can be permuted in 6! ways, we need to correct the above number for the 6! replications of each unordered set of six given numbers. Therefore, the number of sets of six unordered numbers out of 50 is:

$$\binom{50}{6} = \frac{50!}{6!\,(50-6)!} = 15{,}890{,}700.$$

Thus, the probability of winning the Texas lottery if you play only one combination of six numbers is 1 out of 15,890,700.

Note: In the Spring of 2000 the Texas lottery changed the rules: now one has to choose 6 different numbers between 1 and 54, which reduces the probability of winning to 1 out of 25,827,165. The reason for this change is to boost the jackpot, because the higher the jackpot, the more people play. The setup of the Pennsylvania lottery is different: one has to choose 5 out of 59 numbers and then a power ball (numbered from 1 to 39). The probability of winning in this case is 1 out of

$$\binom{59}{5} \cdot 39 = \frac{59!}{5!\,(59-5)!} \cdot 39 = 5{,}006{,}386 \cdot 39 = 195{,}249{,}054,$$

which is clearly a rip-off compared with the Texas lotto! In general, the number of ways we can draw a set of k unordered objects out of a set of n objects without replacement is:

$$\binom{n}{k} = \frac{n!}{k!\,(n-k)!}.$$
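As a quick numerical check (a sketch added to these notes, not part of the original text), Python's math.comb reproduces the lottery counts above:

```python
from math import comb

# Original Texas lotto: 6 unordered numbers out of 50
print(comb(50, 6))        # 15890700
# After the Spring 2000 rule change: 6 out of 54
print(comb(54, 6))        # 25827165
# Pennsylvania: 5 out of 59, times 39 possible power balls
print(comb(59, 5) * 39)   # 195249054
```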

These numbers, read as "n choose k," also appear as coefficients in the binomial expansion

$$(a + b)^n = \sum_{k=0}^{n} \binom{n}{k} a^k b^{n-k}.$$

The reason for defining 0! = 1 is now that the first and last coefficients in this binomial expansion are always equal to 1:

$$\binom{n}{0} = \binom{n}{n} = \frac{n!}{0!\,n!} = \frac{1}{0!} = 1.$$
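The expansion can be verified numerically for small n; the sketch below (an illustrative addition, with arbitrary values n = 5, a = 2, b = 3) also shows that the end coefficients comb(n, 0) and comb(n, n) are indeed 1:

```python
from math import comb

n, a, b = 5, 2.0, 3.0
lhs = (a + b) ** n
rhs = sum(comb(n, k) * a**k * b ** (n - k) for k in range(n + 1))
print(lhs, rhs)                  # both 3125.0
print(comb(n, 0), comb(n, n))    # 1 1, consistent with 0! = 1
```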

1.2 The binomial distribution

Consider a bowl containing r red balls and N − r white balls, where 0 < r < N. Randomly draw n balls from this bowl with replacement, i.e., shake the bowl thoroughly, draw a ball blindfolded, take the blindfold off, observe the color of the ball you have drawn, put the ball back in the bowl (and the blindfold on!), and repeat this procedure n times. The number of ways you can draw an ordered sequence of k red balls and n − k white balls in this way is $r^k (N-r)^{n-k}$, and the number of ways you can draw an ordered sequence of n balls (of any color) is $N^n$. Thus, the probability that you draw a sequence of k red balls and n − k white balls in a particular order is

$$\frac{r^k (N-r)^{n-k}}{N^n} = p^k (1-p)^{n-k}, \quad \text{where } p = r/N.$$

But the number of ordered sequences of k red balls and n − k white balls is $\binom{n}{k}$. Therefore, if X is the number of red balls you have drawn, then

$$P(X = k) = \binom{n}{k} p^k (1-p)^{n-k}, \quad k = 0, 1, \ldots, n.$$

This distribution is called the Binomial (n, p) distribution. The expectation of X is E[X] = np.
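A minimal sketch (added here; the values n = 10 and p = 0.3 are arbitrary) that implements this pmf directly and confirms that the probabilities sum to 1 and that E[X] = np:

```python
from math import comb

def binomial_pmf(k: int, n: int, p: float) -> float:
    """P(X = k) for X ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

n, p = 10, 0.3
pmf = [binomial_pmf(k, n, p) for k in range(n + 1)]
print(sum(pmf))                                # 1.0 (up to rounding)
print(sum(k * q for k, q in enumerate(pmf)))   # expectation n*p = 3.0
```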

1.3 The hypergeometric distribution

Consider again a bowl containing r red balls and N − r white balls, where 0 < r < N. Randomly draw n balls (n ≤ N) from this bowl without replacement, i.e., shake the bowl thoroughly, draw a ball blindfolded, don't put the ball back in the bowl, and repeat this procedure n times. The number of ways you can draw k red balls and n − k white balls in this way is $\binom{r}{k}\binom{N-r}{n-k}$, and the number of ways you can draw n balls (of any color) is $\binom{N}{n}$. Clearly, if n > r and k > r, or if k > n, there is no way to draw k red balls and n − k white balls. Therefore, if X is the number of red balls you have drawn, then

$$P(X = k) = \frac{\binom{r}{k}\binom{N-r}{n-k}}{\binom{N}{n}} \quad \text{for } k = 0, 1, \ldots, \min(n, r),$$

and P(X = k) = 0 for k > min(n, r). This distribution is called the Hypergeometric (n, r, N) distribution. If n is very small relative to N, this distribution is approximately equal to the Binomial (n, p) distribution with p = r/N. The expectation of X is E[X] = nr/N.
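The sketch below (an illustrative addition; the numbers N = 10000, r = 3000, n = 5 are arbitrary) implements the hypergeometric pmf and shows that for n much smaller than N it is close to the Binomial(n, r/N) pmf:

```python
from math import comb

def hypergeometric_pmf(k: int, n: int, r: int, N: int) -> float:
    """P(X = k) for X ~ Hypergeometric(n, r, N): k red balls in n draws without replacement."""
    return comb(r, k) * comb(N - r, n - k) / comb(N, n)

N, r, n = 10_000, 3_000, 5
p = r / N
for k in range(n + 1):
    hyper = hypergeometric_pmf(k, n, r, N)
    binom = comb(n, k) * p**k * (1 - p) ** (n - k)
    print(k, round(hyper, 6), round(binom, 6))   # the two columns nearly coincide
```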

1.4 The negative binomial distribution

Consider a sequence of independent repetitions of a random experiment with constant probability p of success. Let the random variable X be the total number of failures in this sequence before the m-th success, where m ≥ 1. Thus, X + m is equal to the number of trials necessary to produce exactly m successes. The probability P(X = k), k = 0, 1, 2, ..., is the product of the probability of obtaining exactly m − 1 successes in the first k + m − 1 trials, which is the (binomial) probability

$$\binom{k+m-1}{m-1} p^{m-1} (1-p)^{k+m-1-(m-1)},$$

and the probability p of a success on the (k + m)-th trial:

$$P(X = k) = \binom{k+m-1}{m-1} p^m (1-p)^k, \quad k = 0, 1, 2, \ldots$$

This distribution is called the Negative Binomial (m, p) distribution. The expectation of X is E[X] = m(p^{-1} − 1).
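As a small check (added; m = 3 and p = 0.4 are arbitrary example values), the pmf can be implemented directly and its mean compared with m(1/p − 1):

```python
from math import comb

def neg_binomial_pmf(k: int, m: int, p: float) -> float:
    """P(X = k): probability of k failures before the m-th success."""
    return comb(k + m - 1, m - 1) * p**m * (1 - p) ** k

m, p = 3, 0.4
# Truncate the infinite sum; the tail beyond k = 500 is negligible here.
mean = sum(k * neg_binomial_pmf(k, m, p) for k in range(500))
print(mean, m * (1 / p - 1))   # both approximately 4.5
```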

1.5 The Poisson distribution


Let $X_n$ be Binomial $(n, p_n)$ distributed:

$$P(X_n = k) = \binom{n}{k} p_n^k (1-p_n)^{n-k}, \quad k = 0, 1, \ldots, n,$$

and suppose that for n = 1, 2, ..., $p_n \downarrow 0$ as $n \to \infty$, such that for n > c, $n p_n = c$, where c > 0 is a constant. Then for fixed k and n > c,

$$P(X_n = k) = \left(1 - \frac{c}{n}\right)^{-k} \cdot \frac{n!}{n^k (n-k)!} \cdot \left(1 - \frac{c}{n}\right)^{n} \cdot \frac{c^k}{k!}. \tag{1}$$

The first factor in (1) converges to 1:

$$\lim_{n \to \infty} \left(1 - \frac{c}{n}\right)^{-k} = \left(\lim_{n \to \infty} \left(1 - \frac{c}{n}\right)\right)^{-k} = 1.$$

The second factor in (1) equals 1 if k = 0, and converges to 1 if k ≥ 1:

$$\lim_{n \to \infty} \frac{n!}{n^k (n-k)!} = \lim_{n \to \infty} \frac{n(n-1)\cdots(n-k+1)}{n^k} = \lim_{n \to \infty} \prod_{j=1}^{k-1} \left(1 - \frac{j}{n}\right) = 1.$$

The third factor in (1) converges to exp(−c), because

$$\lim_{n \to \infty} \ln\left(1 - \frac{c}{n}\right)^{n} = \lim_{n \to \infty} n \ln\left(1 - \frac{c}{n}\right) = -c \lim_{n \to \infty} \frac{\ln(1 - c/n) - \ln 1}{-c/n} = -c \lim_{\delta \to 0} \frac{\ln(1 + \delta) - \ln 1}{\delta} = -c \left.\frac{d \ln(x)}{dx}\right|_{x=1} = -c \left.\frac{1}{x}\right|_{x=1} = -c.$$

Thus, for fixed k = 0, 1, 2, ..., $\lim_{n \to \infty} P(X_n = k) = P(X = k)$, where

$$P(X = k) = \exp(-c) \frac{c^k}{k!}, \quad k = 0, 1, 2, \ldots$$

This distribution is called the Poisson (c) distribution. Since it is the limit of a Binomial (n, p) distribution with p = c/n for n > c, the Poisson distribution is often used to model the distribution of rare events. The expectation of X is E[X] = c.
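The limit can be illustrated numerically (a sketch added to these notes; c = 2 and k = 3 are arbitrary example values): the Binomial(n, c/n) probabilities approach the Poisson(c) probability as n grows:

```python
from math import comb, exp, factorial

def binomial_pmf(k: int, n: int, p: float) -> float:
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def poisson_pmf(k: int, c: float) -> float:
    return exp(-c) * c**k / factorial(k)

c, k = 2.0, 3
for n in (10, 100, 1_000, 10_000):
    print(n, binomial_pmf(k, n, c / n))
print("Poisson limit:", poisson_pmf(k, c))   # approximately 0.180447
```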
