
Chapter One

1. Overview of Basic Probability Theory


1.1. Introduction
In our daily lives we face many decisions that involve uncertainty.
You may, for example, find yourself analysing one of the following situations:
- What is the chance that I will score an "A" in statistics?
- What is the likelihood that my weekend picnic will be successful? etc.
In such situations we use the concept of probability in daily life without detailed knowledge of the
concept; in other words, we use it intuitively.
Probability is:
- a numerical measure of the chance or likelihood that a particular event will occur;
- a quantitative measure of uncertainty;
- a measure of the strength of belief in the occurrence of an uncertain event;
- measured by a number between 0 and 1 (or between 0% and 100%).
1.2. Definition of Basic Concepts in Probability
- Experiment: the process that leads to the occurrence of one of several possible outcomes, e.g.:
  - Tossing a coin: Heads, Tails
  - Throwing a die: 1, 2, 3, 4, 5, 6
Note: Each trial of an experiment has a single observed outcome.
The precise outcome of a random experiment is unknown before a trial.
- Sample Space: the set of all possible outcomes that may occur as a result of a particular
experiment.
- Event: a collection of outcomes having a common characteristic.
Example

If someone takes three shots at a target and we care only whether each shot is a hit or a miss,
describe a suitable sample space, the elements of the sample space that constitute event M that
the person will miss the target three times in a row, and the elements of event N that the person
will hit the target once and miss it twice

Solution: If we let 0 and 1 represent a miss and a hit, respectively, the eight possibilities are

S = {(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 0), (1, 0, 1), (0, 1, 1), (1, 1, 1)}

M = {(0, 0, 0)}
N = {(1, 0, 0), (0, 1, 0), (0, 0, 1)}
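The same enumeration can be sketched in a few lines of Python (an illustration, not part of the original notes): itertools.product lists every hit/miss triple, and the events are picked out by the number of hits.

```python
# A minimal sketch of the three-shot example: enumerate the sample space
# with 0 = miss and 1 = hit, then pick out the events M and N described above.
from itertools import product

S = list(product([0, 1], repeat=3))   # all 8 possible outcomes
M = [s for s in S if sum(s) == 0]     # three misses in a row
N = [s for s in S if sum(s) == 1]     # exactly one hit and two misses

print(S)   # [(0, 0, 0), (0, 0, 1), ..., (1, 1, 1)]
print(M)   # [(0, 0, 0)]
print(N)   # [(0, 0, 1), (0, 1, 0), (1, 0, 0)]
```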
Events can be:
Simple Event: – is a subset of the sample space that has exactly one sample point. It is also called an
elementary or fundamental event.
Compound Event: – is a subset of the sample space that has two or more sample points.
Complement Event: – the complement of event A is denoted by A’. A’ is the event that has all the
points in a sample space that are not in A.
E.g. rolling a die: S = {1, 2, 3, 4, 5, 6}
Event A = {1, 3, 5}
Complement event of A, A’ = {2, 4, 6}.
Impossible Event: – is a subset of the sample space that contains none of the points.
E.g. Rolling a die: S = {1, 2, 3, 4, 5, 6}
E = {7} or E = {0}
Independent Events: – Two events are said to be independent when the occurrence of one event
does not affect the occurrence of the other.
E.g. the outcomes of two successive rolls of a die
Dependent Events: – Two events are said to be dependent when the occurrence or non-occurrence
of one event affects the occurrence of another event.
Mutually Exclusive Events: – Events are said to be mutually exclusive if one and only one of them can
take place at a time.
Collectively Exhaustive Events/Lists: - When a set of events for an experiment includes every
possible outcome the set is said to be collectively exhaustive event/list.
E.g. flipping a fair coin twice: S = {HH, HT, TH, TT}
Having looked at the basic concepts, we now pass to formal definitions of probability.

1.3. Postulates and Basic Theorems of Probability


1.3.1. Postulates of Probability
i) Given a sample space, S, of a random experiment, the probability of the entire sample
space is 1. i.e. P (S) = 1
ii) The probability of an event ranges from 0 to 1, i.e. 0 ≤ P(A) ≤ 1
Where: A is any event in a random experiment
P (A) is the probability of A
iii) If two events A and B are mutually exclusive (disjoint events), then the probability that
either A or B occurs is
P(A or B) = P(A∪B) = P(A) + P(B) … Addition rule
1.3.2. Basic theorems of Probability
Theorem 1.1. If two events A and B are not mutually exclusive, then the probability that either
A or B occurs is given by:
P(A or B) = P(A∪B) = P(A) + P(B) – P(A∩B)
Theorem 1.2. If A is an event from a sample space, S, and A′ is its complement, then:
P(A) + P(A′) = 1
Proof: By the definition of a complement, A and A′ are mutually exclusive and A ∪ A′ = S. Thus
we write

1 = P(S) (by Postulate i)

= P(A ∪ A′)
= P(A) + P(A′) (by Postulate iii)
and it follows that P(A′) = 1 – P(A).
Example 1: Let A and B be the events that Consumer Union will rate a car stereo good or poor,
respectively, with P(A) = 0.24 and P(B) = 0.35. Determine the following probabilities:
a) P(A′), b) P(A∪B), c) P(A∩B)
Answer
a) From Theorem 1.2, P(A′) = 1 – P(A)
= 1 – 0.24 = 0.76
b) Since the two events are mutually exclusive (the consumer can rate the car either good or
poor, but not both at the same time),
P(A∪B) = P(A) + P(B) = 0.24 + 0.35 = 0.59
c) Since A and B are mutually exclusive events, P(A∩B) = P(∅) = 0
Example 2: The probability that it will rain in Jigjiga on a particular day is 0.27, the probability
that there will be a thunderstorm on that day is 0.24, and the probability that there will be rain as
well as a thunderstorm is 0.15. What is the probability that there will be rain or a thunderstorm in
Jigjiga on such a day?
Answer:
Let A represent the event that there is rain and B the event that there is a thunderstorm.
P (A) =0.27, P (B) = 0.24, and P (A∩B) =0.15
Since the two events are not mutually exclusive events, we use
P(A∪B) = P(A) + P(B) – P(A∩B)
= 0.27+0.24-0.15
= 0.36
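A one-line numerical check of the addition rule for Example 2 (a sketch, not part of the original notes):

```python
# Addition rule for events that are not mutually exclusive:
# P(A or B) = P(A) + P(B) - P(A and B)
p_rain, p_storm, p_both = 0.27, 0.24, 0.15
p_rain_or_storm = p_rain + p_storm - p_both
print(round(p_rain_or_storm, 2))  # 0.36
```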
1.4. Joint and Marginal Distributions
1.4.1. Discrete Case
Definition: If X and Y are discrete random variables, the function given by f(x, y) = P(X = x,
Y = y) for each pair (x, y) within the ranges of X and Y is called the joint probability distribution
of X and Y.
Properties
1. 0 ≤ f(x, y) ≤ 1
2. ∑ₓ ∑ᵧ f(x, y) = 1
i.e. the sum of the probabilities over all pairs (x, y) is equal to one.
Suppose that X can assume any one of m values x1, x2, . . . ,xm and Y can assume any one of n values y1,
y2, . . . yn. Then the probability of the event that X=xj, and Y=yk is given by
P(X=xj, Y=yk) = f(xj, yk)
A joint probability function for X and Y can be represented by a joint probability table as in Table 1.1.
Definition: If X and Y are discrete random variables and f(x, y) is the value of their joint
probability distribution at (x, y), then the function given by g(x) = ∑ᵧ f(x, y) for each x within the
range of X is called the marginal distribution of X; correspondingly, h(y) = ∑ₓ f(x, y) for each y
within the range of Y is called the marginal distribution of Y.
Table 1.1 Joint probability distribution

           y1          y2          …    yn          Totals
x1         f(x1, y1)   f(x1, y2)   …    f(x1, yn)   fx(x1)
x2         f(x2, y1)   f(x2, y2)   …    f(x2, yn)   fx(x2)
…          …           …           …    …           …
xm         f(xm, y1)   f(xm, y2)   …    f(xm, yn)   fx(xm)
Totals     fy(y1)      fy(y2)      …    fy(yn)      1 (Grand Total)
Because the probabilities P(Y = yk) = fy(yk) = ∑ⱼ f(xj, yk), and similarly P(X = xj) = fx(xj) = ∑ₖ f(xj, yk), are
obtained from the margins of the table, we often refer to fx(xj) and fy(yk) (or simply fx(x) and fy(y)) as the
marginal probability functions of X and Y respectively.
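As an illustration (not part of the original notes), the marginal probability functions can be read off a joint table stored as a dictionary; the table below is a hypothetical example chosen only so that the entries sum to 1.

```python
# A minimal sketch: marginal probability functions from a joint probability table.
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

def marginal_x(joint):
    """g(x) = sum over y of f(x, y)."""
    g = {}
    for (x, y), p in joint.items():
        g[x] = g.get(x, 0.0) + p
    return g

def marginal_y(joint):
    """h(y) = sum over x of f(x, y)."""
    h = {}
    for (x, y), p in joint.items():
        h[y] = h.get(y, 0.0) + p
    return h

print(marginal_x(joint))  # {0: 0.30..., 1: 0.70...}
print(marginal_y(joint))  # {0: 0.40..., 1: 0.60...}
```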
It should also be noted that

∑ⱼ fx(xj) = 1 and ∑ₖ fy(yk) = 1

This can be written as:

∑ⱼ ∑ₖ f(xj, yk) = 1

This is simply the statement that the total probability of all entries is 1. The grand total of 1 is indicated
in the lower right-hand corner of the table.
The joint distribution function of X and Y is defined by

F(x, y) = P(X ≤ x, Y ≤ y) = ∑ ∑ f(u, v), where the sum extends over all u ≤ x and v ≤ y.

In Table 1.1, F(x, y) is the sum of all entries for which xj ≤ x and yk ≤ y.
Example 1.2: The joint probability function of two discrete random variables X and Y is given by f(x, y)
= c(2x + y), where x and y can assume all integers such that 0 ≤ x ≤ 2, 0 ≤ y ≤ 3, and f(x, y) = 0
otherwise.
a) Find the value of the constant c.
b) Find P(X = 2, Y = 1).
c) Find P(X ≥ 1, Y ≤ 2).
Solution:
a) The sample points (x, y) for which the probabilities are different from zero are the pairs with
x = 0, 1, 2 and y = 0, 1, 2, 3. The probabilities associated with these points, given by c(2x + y),
are shown in Table 1.2. Since the grand total, 42c, must equal 1, we have c = 1/42.

Table 1.2

          y = 0   y = 1   y = 2   y = 3   Totals
x = 0     0       c       2c      3c      6c
x = 1     2c      3c      4c      5c      14c
x = 2     4c      5c      6c      7c      22c
Totals    6c      9c      12c     15c     42c
b) From Table 1.2 we see that
P(X = 2, Y = 1) = 5c = 5/42
c) From Table 1.2 we see that
P(X ≥ 1, Y ≤ 2) = ∑ₓ≥₁ ∑ᵧ≤₂ f(x, y)
= (2c + 3c + 4c) + (4c + 5c + 6c)
= 24c = 24/42 = 4/7
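A short numerical check of this example (a sketch, not part of the original notes), using exact fractions:

```python
# Verify the constant c and the probabilities in Example 1.2, where
# f(x, y) = c * (2x + y) for x in {0, 1, 2} and y in {0, 1, 2, 3}.
from fractions import Fraction

xs, ys = range(3), range(4)
total = sum(2 * x + y for x in xs for y in ys)   # 42
c = Fraction(1, total)                           # 1/42

def f(x, y):
    return c * (2 * x + y)

print(c)                                         # 1/42
print(f(2, 1))                                   # 5/42
print(sum(f(x, y) for x in xs if x >= 1
          for y in ys if y <= 2))                # 4/7
```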
For discrete random variables X and Y, the marginal probabilities are:
Pₓ(x) = ∑ᵢ P(X = x, Y = yᵢ)
Pᵧ(y) = ∑ᵢ P(X = xᵢ, Y = y)
Find the marginal probability functions (a) of X and (b) of Y for the random variables of Example 1.2.
Solution:
a) The marginal probability function of X is given by g(x) = P(X = x) = fx(x) and can be obtained from
the row totals in the right-hand column of Table 1.2. From these we see that
g(x) = P(X = x) = fx(x) =
  6c = 1/7     for x = 0
  14c = 1/3    for x = 1
  22c = 11/21  for x = 2

b) The marginal probability function of Y is given by h(y) = P(Y = y) = fy(y) and can be obtained from the
column totals in the last row of Table 1.2. From these we see that
h(y) = P(Y = y) = fy(y) =
  6c = 1/7     for y = 0
  9c = 3/14    for y = 1
  12c = 2/7    for y = 2
  15c = 5/14   for y = 3
1.4.2. Continuous Case


The case where both variables are continuous is obtained easily by analogy with the discrete case on
replacing sums by integrals.
If X and Y are jointly continuous random variables with a joint density function f(x, y), then
1. f(x, y) ≥ 0
2. ∫_{-∞}^{∞} ∫_{-∞}^{∞} f(x, y) dx dy = 1

The joint distribution function of X and Y in this case is defined by

F(x, y) = P(X ≤ x, Y ≤ y) = ∫_{-∞}^{x} ∫_{-∞}^{y} f(u, v) dv du

The probability that X lies between a1 and a2 while Y lies between b1 and b2 is:

P(a1 < X < a2, b1 < Y < b2) = ∫_{a1}^{a2} ∫_{b1}^{b2} f(x, y) dy dx
It follows that

f(x, y) = ∂²F(x, y) / ∂x∂y

i.e. the joint density function is obtained by differentiating the distribution function with respect to x
and y. Furthermore, we obtain

P(X ≤ x) = Fx(x) = ∫_{-∞}^{x} ∫_{-∞}^{∞} f(u, y) dy du

P(Y ≤ y) = Fy(y) = ∫_{-∞}^{∞} ∫_{-∞}^{y} f(x, v) dv dx

We call the above equations the marginal distribution functions, or simply the distribution functions, of
X and Y respectively. Their derivatives with respect to x and y are then called the marginal density
functions, or simply the density functions, of X and Y, and are given by

g(x) = fx(x) = ∫_{-∞}^{∞} f(x, y) dy   and   h(y) = fy(y) = ∫_{-∞}^{∞} f(x, y) dx
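As an illustration (the density below is an assumed example, not one given in the notes), the marginal densities can be obtained by symbolic integration:

```python
# A minimal sketch with an assumed joint density f(x, y) = x + y on the unit
# square (0 <= x <= 1, 0 <= y <= 1), which integrates to 1.
import sympy as sp

x, y = sp.symbols('x y', nonnegative=True)
f = x + y

total = sp.integrate(f, (x, 0, 1), (y, 0, 1))   # total probability, should be 1
g = sp.integrate(f, (y, 0, 1))                  # marginal density of X: x + 1/2
h = sp.integrate(f, (x, 0, 1))                  # marginal density of Y: y + 1/2

print(total, g, h)                              # 1  x + 1/2  y + 1/2
```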

Example 1.3: Check whether or not the following function can be a valid joint density function.

Example 1.4: Given the following function, find the marginal densities of X and Y, g(x) and h(y).

1.5. Conditional Probability
Definition: If A and B are any two events in a sample space S and P(A) ≠ 0, the conditional
probability of B given A is

P(B|A) = P(A∩B) / P(A)

Example: A bag contains 3 red and 7 black balls. Two balls are drawn at random without replacement. If the
second ball is red, what is the probability that the first ball is also red?
Solution:
Let A: event of selecting a red ball in first draw
B: event of selecting a red ball in second draw
P(A ∩ B) = P(selecting both red balls) = 3/10 × 2/9 = 1/15
P(B) = P(selecting a red ball in the second draw) = P(red then red, or black then red)
= P(red ball and red ball) + P(black ball and red ball)
= 3/10 × 2/9 + 7/10 × 3/9 = 3/10
∴ P(A|B) = P(A ∩ B)/P(B) = 1/15 ÷ 3/10 = 2/9.
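A brute-force check of this conditional probability (a sketch, not part of the original notes), enumerating all ordered draws of two balls:

```python
# Enumerate ordered draws of two balls without replacement from a bag of
# 3 red ('R') and 7 black ('B') balls, then compute P(first red | second red).
from itertools import permutations
from fractions import Fraction

bag = ['R'] * 3 + ['B'] * 7
draws = list(permutations(bag, 2))               # 10 * 9 = 90 equally likely ordered pairs

second_red = [d for d in draws if d[1] == 'R']   # event B: second ball red
both_red = [d for d in second_red if d[0] == 'R']

print(Fraction(len(both_red), len(second_red)))  # 2/9
```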
