

Probability
SOME BASIC DEFINITIONS
EXPERIMENT
An operation which results in some well defined outcome is called an experiment.
RANDOM EXPERIMENT
An experiment whose outcome cannot be predicted with certainty is called a random experiment.
For example, tossing of a fair coin or throwing an unbiased die or drawing a card from a well shuffled
pack of 52 cards is a random experiment.
SAMPLE SPACE
The set of all possible outcomes of a random experiment is called the sample space.
It is usually denoted by S .
For example, if we toss a coin, there are two possible outcomes, a head H or a tail T .
So, the sample space in this experiment is given by S = {H, T}.
EVENT
A subset of the sample space S is called an Event.
Note :
(a) Sample space S plays the same role as the universal set for all problems related to the particular
experiment.
(b) ∅ is also a subset of S, which is called an impossible event.
(c) S is also a subset of S, which is called a sure event.
SIMPLE EVENT
An event having only a single sample point is called a simple event.
For example, when a coin is tossed, the sample space S = {H, T}.
Let E1 = {H} = the event of occurrence of a head, and
E2 = {T} = the event of occurrence of a tail.
Then, E1 and E2 are simple events.
MIXED EVENT
A subset of the sample space S which contains more than one element is called a mixed event.
For example, when a coin is tossed, the sample space S = {H, T}.
Let E = {H, T} = the event of occurrence of a head or a tail.
Then, E is a mixed event.
EQUALLY LIKELY EVENTS
A set of events is said to be equally likely if none of them is expected to occur in preference to the other.
For example, when a fair coin is tossed, then occurrence of head or tail are equally likely cases and there
is no reason to expect a ‘head’ or a ‘tail’ in preference to the other.
EXHAUSTIVE EVENTS
A set of events is said to be exhaustive if the performance of the experiment always results in the occurrence of
at least one of them.
For example, when a die is thrown, then the events
A1 = {1, 2} and A2 = {2, 3, 4} are not exhaustive, as we can get 5 as an outcome of the experiment,
which is not a member of either of the events A1 and A2.
But, if we consider the events E1 = {1, 2, 3} and E2 = {2, 4, 5, 6}, then the set of events {E1, E2} is exhaustive.
MUTUALLY EXCLUSIVE EVENTS
A set of events is said to be mutually exclusive if occurrence of one of them precludes the occurrence of any of
the remaining events.
Thus, E1, E2, ..., En are mutually exclusive if and only if Ei ∩ Ej = ∅ for i ≠ j.
For example, when a coin is tossed, the event of occurrence of a head and the event of occurrence of a tail
are mutually exclusive events.
INDEPENDENT EVENTS
Two events are said to be independent, if the occurrence of one does not depend on the occurrence of the other.
For example, when a coin is tossed twice, the event of occurrence of head in the first throw and the event
of occurrence of head in the second throw are independent events.
COMPLEMENT OF AN EVENT
The complement of an event E, denoted by E', Ē or E^c, is the set of all sample points of the space other
than the sample points in E.
For example, when a die is thrown, sample space
S = {1, 2, 3, 4, 5, 6}.
If E = {1, 2, 3, 4}, then E' = {5, 6}.
Note that E ∪ E' = S.
MUTUALLY EXCLUSIVE AND EXHAUSTIVE EVENTS
A set of events E1 , E2 ,..., En of a sample space S form a mutually exclusive and exhaustive system of events,
if
(i) Ei ∩ Ej = ∅ for i ≠ j, and
(ii) E1 ∪ E2 ∪ ... ∪ En = S.
For example, when a die is thrown, sample space S = {1, 2, 3, 4, 5, 6}.
Let E1 = {1, 3, 5} = the event of occurrence of an odd number, and
E2 = {2, 4, 6} = the event of occurrence of an even number.
Then, E1 ∪ E2 = S and E1 ∩ E2 = ∅.
PROBABILITY OF OCCURRENCE OF AN EVENT
Let S be a sample space. Then the probability of occurrence of an event E is denoted by P(E) and is
defined as
P(E) = n(E)/n(S) = (number of elements in E)/(number of elements in S)
                 = (number of cases favourable to event E)/(total number of cases)
Note :
(a) 0 ≤ P(E) ≤ 1, i.e. the probability of occurrence of an event is a number lying between 0 and 1.
(b) P(∅) = 0, i.e. probability of occurrence of an impossible event is 0.
(c) P(S) = 1, i.e. probability of occurrence of a sure event is 1.
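For illustration, the classical definition P(E) = n(E)/n(S) can be checked by direct enumeration. The following is a minimal Python sketch, not part of the standard theory; the event chosen ("the sum of two fair dice is 7") is only an assumed example.

    # Classical probability by counting: P(E) = n(E) / n(S).
    from itertools import product

    S = list(product(range(1, 7), repeat=2))             # sample space for a pair of dice
    E = [outcome for outcome in S if sum(outcome) == 7]  # cases favourable to the event

    print(len(E), len(S), len(E) / len(S))               # 6, 36, 0.1666... = 1/6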
ODDS IN FAVOUR OF AN EVENT AND ODDS AGAINST AN EVENT
If the number of ways in which an event can occur be m and the number of ways in which it does not occur be
n, then
(i) odds in favour of the event = m/n, and
(ii) odds against the event = n/m.
If odds in favour of an event are a : b, then the probability of the occurrence of that event is a/(a + b) and the
probability of the non-occurrence of that event is b/(a + b).
SOME IMPORTANT RESULTS ON PROBABILITY
1. P(A') = 1 − P(A)
2. If A and B are any two events, then P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
3. If A and B are mutually exclusive events, then A ∩ B = ∅ and hence P(A ∩ B) = 0.
   ∴ P(A ∪ B) = P(A) + P(B).
4. If A, B, C are any three events, then P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(A ∩ B)
   − P(B ∩ C) − P(C ∩ A) + P(A ∩ B ∩ C).
5. If A, B, C are mutually exclusive events, then A ∩ B = ∅, B ∩ C = ∅, C ∩ A = ∅, A ∩ B ∩ C = ∅ and
   hence P(A ∩ B) = 0, P(B ∩ C) = 0, P(C ∩ A) = 0, P(A ∩ B ∩ C) = 0.
   ∴ P(A ∪ B ∪ C) = P(A) + P(B) + P(C).
6. P(A' ∪ B') = 1 − P(A ∩ B).
7. P(A' ∩ B') = 1 − P(A ∪ B).
8. P(A) = P(A ∩ B) + P(A ∩ B')
9. P(B) = P(B ∩ A) + P(B ∩ A')
10. If A1, A2, ..., An are independent events, then P(A1 ∩ A2 ∩ ... ∩ An) = P(A1) P(A2) ... P(An).
11. If A1, A2, ..., An are mutually exclusive events, then
    P(A1 ∪ A2 ∪ ... ∪ An) = P(A1) + P(A2) + ... + P(An).
12. If A1, A2, ..., An are exhaustive events, then P(A1 ∪ A2 ∪ ... ∪ An) = 1.
13. If A1, A2, ..., An are mutually exclusive and exhaustive events, then
    P(A1 ∪ A2 ∪ ... ∪ An) = P(A1) + P(A2) + ... + P(An) = 1.
14. If A1, A2, ..., An are n events, then
    (a) P(A1 ∪ A2 ∪ ... ∪ An) ≤ P(A1) + P(A2) + ... + P(An)
    (b) P(A1 ∩ A2 ∩ ... ∩ An) ≥ 1 − P(A1') − P(A2') − ... − P(An')
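Result 2 (and similarly results 4 and 14) can be verified numerically. Below is a small Python sketch assuming one throw of a die, with the events A = "an even number" and B = "a number greater than 3" chosen purely for illustration.

    # Check of result 2: P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
    S = set(range(1, 7))
    A = {2, 4, 6}          # even number
    B = {4, 5, 6}          # number greater than 3

    def P(E):
        return len(E) / len(S)

    print(P(A | B), P(A) + P(B) - P(A & B))   # both sides equal 4/6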
CONDITIONAL PROBABILITY
Let A and B be two events associated with a random experiment. Then, the probability of occurrence of A
under the condition that B has already occurred and P(B) ≠ 0, is called the conditional probability and it is
denoted by P(A/B).
Thus, P(A/B) = Probability of occurrence of A, given that B has already happened
= P(A ∩ B)/P(B) = n(A ∩ B)/n(B).
Similarly, P(B/A) = Probability of occurrence of B, given that A has already happened
= P(A ∩ B)/P(A) = n(A ∩ B)/n(A).
Sometimes, P(A|B) is also used to denote the probability of occurrence of A when B occurs.
Similarly, P(B|A) is used to denote the probability of occurrence of B when A occurs.
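The counting form P(A/B) = n(A ∩ B)/n(B) can be seen in a small Python sketch; the events below (A = "the sum of two dice is 8", B = "the first die shows an even number") are assumed only for illustration.

    # Conditional probability by counting: P(A/B) = n(A ∩ B) / n(B).
    from itertools import product

    S = list(product(range(1, 7), repeat=2))
    A = {s for s in S if sum(s) == 8}        # sum of the two dice is 8
    B = {s for s in S if s[0] % 2 == 0}      # first die is even

    print(len(A & B) / len(B))                          # 3/18 = 1/6
    print((len(A & B) / len(S)) / (len(B) / len(S)))    # P(A ∩ B)/P(B), the same value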
1. Multiplication theorems on probability
(i) If A and B are two events associated with a random experiment, then P(A ∩ B) = P(A)·P(B/A),
if P(A) ≠ 0, or P(A ∩ B) = P(B)·P(A/B), if P(B) ≠ 0.
(ii) Extension of multiplication theorem : If A1, A2, ..., An are n events related to a random experiment,
then P(A1 ∩ A2 ∩ A3 ∩ ... ∩ An) = P(A1) P(A2/A1) P(A3/A1 ∩ A2) ... P(An/A1 ∩ A2 ∩ ... ∩ An−1),
where P(Ai/A1 ∩ A2 ∩ ... ∩ Ai−1) represents the conditional probability of the event Ai, given that
the events A1, A2, ..., Ai−1 have already happened.
(iii) Multiplication theorem for independent events : If A and B are independent events associated
with a random experiment, then P(A ∩ B) = P(A)·P(B), i.e. the probability of simultaneous
occurrence of two independent events is equal to the product of their probabilities. By the multiplication
theorem, we have P(A ∩ B) = P(A)·P(B/A). Since A and B are independent events,
P(B/A) = P(B). Hence, P(A ∩ B) = P(A)·P(B).
(iv) Extension of multiplication theorem for independent events : If A1, A2, ..., An are independent
events associated with a random experiment, then P(A1 ∩ A2 ∩ A3 ∩ ... ∩ An) = P(A1) P(A2) ... P(An).
By the multiplication theorem, we have
P(A1 ∩ A2 ∩ A3 ∩ ... ∩ An) = P(A1) P(A2/A1) P(A3/A1 ∩ A2) ... P(An/A1 ∩ A2 ∩ ... ∩ An−1).
Since A1, A2, ..., An−1, An are independent events,
P(A2/A1) = P(A2), P(A3/A1 ∩ A2) = P(A3), ..., P(An/A1 ∩ A2 ∩ ... ∩ An−1) = P(An).
Hence, P(A1 ∩ A2 ∩ ... ∩ An) = P(A1) P(A2) ... P(An).
2. Probability of at least one of the n independent events :
If p1, p2, p3, ..., pn be the probabilities of happening of n independent events A1, A2, A3, ..., An respectively, then
(i) Probability of happening none of them
= P(A1' ∩ A2' ∩ A3' ∩ ... ∩ An') = P(A1')·P(A2')·P(A3')...P(An') = (1 − p1)(1 − p2)(1 − p3)...(1 − pn)
(ii) Probability of happening at least one of them
= P(A1 ∪ A2 ∪ A3 ∪ ... ∪ An) = 1 − P(A1') P(A2') P(A3')...P(An') = 1 − (1 − p1)(1 − p2)(1 − p3)...(1 − pn)
(iii) Probability of happening of the first event and not happening of the remaining
= P(A1)·P(A2')·P(A3')...P(An') = p1(1 − p2)(1 − p3)...(1 − pn)
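These formulas translate directly into a short Python sketch; the probabilities p1 = 0.6, p2 = 0.2, p3 = 0.1 below are arbitrary illustrative values.

    # At least one of n independent events: 1 − (1 − p1)(1 − p2)...(1 − pn).
    from math import prod

    p = [0.6, 0.2, 0.1]                                   # assumed probabilities of A1, A2, A3
    p_none = prod(1 - pi for pi in p)                     # none of them happens
    p_at_least_one = 1 - p_none                           # at least one happens
    p_only_first = p[0] * prod(1 - pi for pi in p[1:])    # first happens, the rest do not

    print(p_none, p_at_least_one, p_only_first)           # 0.288, 0.712, 0.432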
LAW OF TOTAL PROBABILITY
Let S be the sample space and let E1 , E2 ,...., En be n mutually exclusive and exhaustive events associated
with a random experiment. If A is any event which occurs with E1 or E2 or ... En , then
P(A) = P(E1) P(A/E1) + P(E2) P(A/E2) + ... + P(En) P(A/En)
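As a numerical sketch of the law (with made-up numbers): suppose three causes E1, E2, E3 are machines producing 50%, 30% and 20% of the items in a factory, and 1%, 2% and 3% of their respective outputs are defective; these figures are assumptions for illustration only.

    # Law of total probability: P(A) = Σ P(Ei) · P(A/Ei), with A = "item is defective".
    P_E = [0.5, 0.3, 0.2]             # P(E1), P(E2), P(E3): mutually exclusive and exhaustive causes
    P_A_given_E = [0.01, 0.02, 0.03]  # P(A/E1), P(A/E2), P(A/E3)

    P_A = sum(pe * pa for pe, pa in zip(P_E, P_A_given_E))
    print(P_A)                        # 0.005 + 0.006 + 0.006 = 0.017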
BAYES' RULE
Let S be a sample space and E1, E2, ..., En be n mutually exclusive events such that E1 ∪ E2 ∪ ... ∪ En = S
and P(Ei) > 0 for i = 1, 2, ..., n. We can think of the Ei's as the causes that lead to the outcome of an experiment. The
probabilities P(Ei), i = 1, 2, ..., n are called prior probabilities. Suppose the experiment results in an outcome of
event A, where P(A) > 0. We have to find the probability that the observed event A was due to cause Ei, that
is, we seek the conditional probability P(Ei/A). These probabilities are called posterior probabilities, given by
Bayes' rule as
P(Ei/A) = P(Ei) P(A/Ei) / Σ_{k=1}^{n} P(Ek) P(A/Ek).
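Continuing the machine example used above for the law of total probability (the numbers remain illustrative assumptions), Bayes' rule gives the posterior probability that a defective item came from each machine.

    # Bayes' rule: P(Ei/A) = P(Ei) P(A/Ei) / Σk P(Ek) P(A/Ek).
    P_E = [0.5, 0.3, 0.2]
    P_A_given_E = [0.01, 0.02, 0.03]

    P_A = sum(pe * pa for pe, pa in zip(P_E, P_A_given_E))            # denominator, = P(A)
    posterior = [pe * pa / P_A for pe, pa in zip(P_E, P_A_given_E)]   # P(E1/A), P(E2/A), P(E3/A)
    print(posterior)        # approximately [0.294, 0.353, 0.353]; the posteriors sum to 1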
RANDOM VARIABLE
A random variable is a real valued function whose domain is the sample space of a random experiment.
A random variable is usually denoted by the capital letters X , Y , Z , ...., etc.
DISCRETE RANDOM VARIABLE
A random variable which can take only a finite or countably infinite number of values is called a discrete random
variable.
CONTINUOUS RANDOM VARIABLE
A random variable which can take any value between two given limits is called a continuous random variable.
Geometrical method for probability: When the number of points in the sample space is infinite, it becomes
difficult to apply the classical definition of probability. For instance, if we are interested in finding the probability that
a point selected at random from the interval [1, 6] lies either in the interval [1, 2] or [5, 6], we cannot apply the
classical definition of probability. In this case we define the probability as follows:
P(x ∈ A) = (Measure of region A) / (Measure of the sample space S)
where measure stands for length, area or volume depending upon whether S is a one-dimensional,
two-dimensional or three-dimensional region.
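For the example above, P(x ∈ [1, 2] ∪ [5, 6]) = (length of [1, 2] + length of [5, 6]) / (length of [1, 6]) = (1 + 1)/5 = 2/5.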
Probability distribution : Let S be a sample space. A random variable X is a function from the set S to R ,
the set of real numbers.
For example, the sample space for a throw of a pair of dice is
S = { 11, 12, ..., 16,
      21, 22, ..., 26,
      .................
      61, 62, ..., 66 }
Let X be the sum of the numbers on the dice. Then X(12) = 3, X(43) = 7, etc. Also, {X = 7} is the event
{61, 52, 43, 34, 25, 16}. In general, if X is a random variable defined on the sample space S and r is a real
number, then {X = r} is an event.
If the random variable X takes n distinct values x1, x2, ..., xn, then {X = x1}, {X = x2}, ..., {X = xn} are
mutually exclusive and exhaustive events.
(Figure : the sample space S partitioned into the events {X = x1}, {X = x2}, ..., {X = xn}.)
Now, since {X = xi} is an event, we can talk of P(X = xi).
If P(X = xi) = pi, where 1 ≤ i ≤ n, then the system of numbers
x1   x2   ...   xn
p1   p2   ...   pn
is said to be the probability distribution of the random variable X.
The expectation (mean) of the random variable X is defined as E(X) = Σ_{i=1}^{n} pi xi, and the variance of X is
defined as var(X) = Σ_{i=1}^{n} pi (xi − E(X))² = Σ_{i=1}^{n} pi xi² − (E(X))².
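As a quick numerical sketch, the mean and variance of the dice-sum variable X used in the example above can be computed directly from its distribution; the code below only re-derives that distribution by counting.

    # E(X) = Σ pi·xi and var(X) = Σ pi·xi² − (E(X))² for X = sum of two fair dice.
    from itertools import product
    from collections import Counter

    S = list(product(range(1, 7), repeat=2))
    counts = Counter(sum(s) for s in S)
    dist = {x: c / len(S) for x, c in counts.items()}   # probability distribution of X

    EX = sum(p * x for x, p in dist.items())
    varX = sum(p * x * x for x, p in dist.items()) - EX ** 2
    print(EX, varX)        # 7.0 and 35/6 ≈ 5.833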
Bernoullian trials : A set of n trials is said to be Bernoullian if
(a) The value of n is finite, i.e. the number of trials is finite
(b) Each and every trial/experiment is independent
(c) Each trial (experiment) has only two outcomes, namely success and failure
(d) The probability of success and failure for each trial is fixed (the same)
3. Binomial probability distribution : A random variable X which takes values 0, 1, 2, ..., n is said to
follow binomial distribution if its probability distribution function is given by
P(X = r) = nCr p^r q^(n−r),  r = 0, 1, 2, ..., n
where p, q > 0 such that p + q = 1.
The notation X ~ B(n, p) is generally used to denote that the random variable X follows binomial
distribution with parameters n and p.
We have P(X = 0) + P(X = 1) + ... + P(X = n)
= nC0 p^0 q^(n−0) + nC1 p^1 q^(n−1) + ... + nCn p^n q^(n−n) = (q + p)^n = 1^n = 1.
Now probability of
(a) Occurrence of the event exactly r times : P(X = r) = nCr q^(n−r) p^r.
(b) Occurrence of the event at least r times : P(X ≥ r) = nCr q^(n−r) p^r + ... + p^n = Σ_{X=r}^{n} nCX p^X q^(n−X).
(c) Occurrence of the event at the most r times :
    P(0 ≤ X ≤ r) = q^n + nC1 q^(n−1) p + ... + nCr q^(n−r) p^r = Σ_{X=0}^{r} nCX p^X q^(n−X).
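A short Python sketch of these three binomial probabilities follows; the parameters n = 10, p = 0.3 and r = 3 are arbitrary illustrative choices.

    # Binomial probabilities: exactly r, at least r, and at most r successes.
    from math import comb

    n, p, r = 10, 0.3, 3
    q = 1 - p

    def pmf(x):
        return comb(n, x) * p**x * q**(n - x)     # P(X = x) = nCx p^x q^(n−x)

    print(pmf(r))                                 # exactly r times
    print(sum(pmf(x) for x in range(r, n + 1)))   # at least r times, P(X ≥ r)
    print(sum(pmf(x) for x in range(0, r + 1)))   # at the most r times, P(0 ≤ X ≤ r)
    print(sum(pmf(x) for x in range(0, n + 1)))   # total probability, equals 1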
Note :
1. If the probability of happening of an event in one trial be p, then the probability of successive
happening of that event in r trials is p^r.
2. If n trials constitute an experiment and the experiment is repeated N times, then the frequencies of
0, 1, 2, ...., n successes are given by
N·P(X = 0), N·P(X = 1), N·P(X = 2), ..., N·P(X = n).
(i) Mean and variance of the binomial distribution : The binomial probability distribution is
X    :   0              1                2                 ....   n
P(X) :   nC0 q^n p^0    nC1 q^(n−1) p    nC2 q^(n−2) p^2   ....   nCn q^0 p^n
The mean of the binomial distribution is Σ_{i=1}^{n} xi pi = Σ_{X=1}^{n} X · nCX q^(n−X) p^X = np.
The variance of the binomial distribution is σ² = npq and the standard deviation is σ = √(npq).
(ii) Use of multinomial expansion : If a die has m faces marked with the numbers 1, 2, 3, ..., m and
if n such dice are thrown, then the probability that the sum of the numbers exhibited on the upper
faces equals p is given by the coefficient of x^p in the expansion of (x + x² + x³ + ... + x^m)^n / m^n.
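The coefficient in this expansion can be picked out by multiplying the polynomial numerically; the sketch below assumes ordinary six-faced dice (m = 6), n = 3 dice and a target sum of 10, all chosen only for illustration.

    # Probability that the sum on n m-faced dice equals `target`:
    # coefficient of x^target in (x + x² + ... + x^m)^n, divided by m^n.
    m, n, target = 6, 3, 10

    one_die = [0] + [1] * m          # coefficients of x + x² + ... + x^m
    coeffs = [1]                     # start from the constant polynomial 1
    for _ in range(n):               # multiply in one die at a time
        new = [0] * (len(coeffs) + len(one_die) - 1)
        for i, a in enumerate(coeffs):
            for j, b in enumerate(one_die):
                new[i + j] += a * b
        coeffs = new

    print(coeffs[target], coeffs[target] / m**n)   # 27 ways, probability 27/216 = 0.125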
4. Poisson distribution : It is the limiting case of the binomial distribution (B.D.) under the following conditions:
(i) The number of trials is very large, i.e. n → ∞
(ii) p → 0 (here p is not exactly 0, but nearly approaches zero)
(iii) np = λ, a finite quantity (λ is called the parameter)
The probability of r successes for the Poisson distribution is given by
P(X = r) = (e^(−λ) λ^r) / r!,  r = 0, 1, 2, ....
For the Poisson distribution the recurrence formula is given by
P(r + 1) = (λ / (r + 1)) · P(r)
Note :
(a) For the Poisson distribution, mean = variance = np = λ.
(b) If X and Y are independent Poisson variates with parameters λ1 and λ2, then X + Y also has a
Poisson distribution with parameter λ1 + λ2.
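A minimal Python sketch of the Poisson probabilities and the recurrence formula, assuming an illustrative parameter λ = 2:

    # Poisson distribution: P(X = r) = e^(−λ) λ^r / r!, with recurrence P(r+1) = λ/(r+1) · P(r).
    from math import exp, factorial

    lam = 2.0
    def pmf(r):
        return exp(-lam) * lam**r / factorial(r)

    print(pmf(0), pmf(1), pmf(2))
    print(pmf(3), lam / (2 + 1) * pmf(2))   # P(3) directly and via the recurrence; both agree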
5. Normal Distribution
For a normal distribution, the number of trials is infinite.
The normal probability function or distribution is given by
P(X = x) = (1 / (σ√(2π))) e^(−(1/2)((x − μ)/σ)²),  −∞ < x < ∞,
where z = (x − μ)/σ is known as the standard variate.
Facts About Normal Distribution
(i) It is the limiting case of B.D., i.e. B(n, p), when n → ∞
(ii) mean = mode = median
(iii) Total area under a standard normal (curve) distribution is 1.
(iv) The normal curve is bell-shaped and unimodal.
(v) Q3 − Median = Median − Q1
(vi) ∫_{−∞}^{∞} P(X = x) dx = 1
(Figure : the standard normal curve P(z) plotted against z, bell-shaped and symmetric about z = 0, marked from −3σ to 3σ, with mean = mode = median at the centre.)
If X ~ N(μ, σ²) and Z = (X − μ)/σ, then
P(μ − σ < x < μ + σ) = P(−1 < z < 1) = 0.6827
P(μ − 2σ < x < μ + 2σ) = P(−2 < z < 2) = 0.9544
P(μ − 3σ < x < μ + 3σ) = P(−3 < z < 3) = 0.9973
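These three probabilities can be reproduced with the standard normal CDF Φ(z) = (1 + erf(z/√2))/2; the sketch below uses only the Python math module.

    # P(μ − kσ < X < μ + kσ) = P(−k < z < k) = Φ(k) − Φ(−k) for k = 1, 2, 3.
    from math import erf, sqrt

    def Phi(z):
        return (1 + erf(z / sqrt(2))) / 2     # standard normal cumulative distribution function

    for k in (1, 2, 3):
        print(k, round(Phi(k) - Phi(-k), 4))  # 0.6827, 0.9545 (≈ .9544), 0.9973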
Note :
The probabilities P(z1 < z < z2), P(z1 ≤ z < z2), P(z1 < z ≤ z2) and P(z1 ≤ z ≤ z2) are all treated to be the same.