
CALICUT UNIVERSITY

II BSc COMPLEMENTARY
STATISTICS
(Mathematics/Computer science)

STA2C02 - PROBABILITY THEORY

LAST MINUTE REVISION


Module 1 - PROBABILITY SUMMARY

Random experiment : A random experiment is an experiment with several
possible outcomes, in which we cannot predict which outcome will turn
up in a particular trial.
Example :- Tossing a coin, throwing a die, selecting a card from a pack of
cards, etc.

Sample space : Sample space is the set of all possible outcomes of a


random experiment. It is denoted by the letter S or Ω
Example :-
In the coin tossing experiment sample space S = { H , T }
In the die throwing experiment sample space S = { 1,2,3,4,5,6 }
Events : Out of all the outcomes of a sample space, certain outcomes satisfy
a particular condition. A set of such outcomes is called an event. Events are
subsets of the sample space. They are denoted by the letters A, B, C, ...

Example :-
In the coin tossing experiment A = getting a head = {H} and
B = getting a tail = {T} are events.
Equally likely events : Two or more events are said to be equally likely if
they have the same chance of occurrence in a trial.
Example :-
In the coin tossing experiment the events getting a head {H} and
getting a tail {T} are equally likely events.

Mutually exclusive events : Two or more events are said to be mutually


exclusive if they cannot happen simultaneously in the same trial.

Note: If the events A and B are mutually exclusive then AՈB = ϕ

Example :-
In the coin tossing experiment the events getting a head {H} and
getting a tail {T} are mutually exclusive events.
Classical definition of probability
If a random experiment results in n equally likely, mutually exclusive
and exhaustive cases, out of which m cases are favourable to
the occurrence of an event A, then the probability of the event A is
given by
P(A) = favourable number of cases / exhaustive number of cases = m/n
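
For revision, here is a minimal Python sketch of this counting rule (the die
experiment and the event "even number" are illustrative choices):

from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}            # sample space of a die throw
A = {x for x in S if x % 2 == 0}  # event: getting an even number
P_A = Fraction(len(A), len(S))    # favourable / exhaustive = m/n
print(P_A)                        # 1/2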
Frequency definition of probability
Consider a random experiment which is repeated n times. Let an event A
occur in f out of the n repetitions. Then f is called the frequency and f/n is
called the frequency ratio or relative frequency of the event A. When n
becomes very large, the frequency ratio becomes more regular and
approaches a constant. This constant value is called the probability of the
event A, and the process of the frequency ratio approaching a constant as n
becomes very large is called statistical regularity.
That is, by the frequency definition, P(A) = lim(n→∞) f/n
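
A short simulation sketch of statistical regularity (the fair coin and the
seed are illustrative choices):

import random

random.seed(1)                      # fixed seed, for reproducibility only
for n in (100, 10_000, 1_000_000):  # increasing number of repetitions
    f = sum(random.random() < 0.5 for _ in range(n))  # frequency of heads
    print(n, f / n)                 # relative frequency f/n settles near 0.5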
Union, intersection and complement of an event

For any two events A and B

Occurrence of an event A --> A
Non-occurrence of an event A --> Aᶜ or A'
Occurrence of both A and B --> AՈB
Occurrence of at least one of A, B (A or B) --> AUB
Occurrence of only A --> AՈBᶜ
Occurrence of only B --> AᶜՈB
Occurrence of exactly one --> (AՈBᶜ) U (AᶜՈB)
Occurrence of none --> AᶜՈBᶜ
Field or Algebra
Let S be the sample space of a random experiment and F be a class of
subsets of S . Then F is called a Field or Algebra if it satisfies the following
conditions.
(i) F should be nonempty
(ii) If A ∈ F , then Aᶜ ∈ F
(iii) If A , B ∈ F , then AUB ∈ F

σ-Field or σ-Algebra or Borel field

Let S be the sample space of a random experiment and F be a class of
subsets of S . Then F is called a σ-Field or σ-Algebra or Borel field if it
satisfies the following conditions.
(i) F should be nonempty
(ii) If A ∈ F , then Aᶜ ∈ F
(iii) If A₁, A₂, A₃, ... ∈ F , then A₁UA₂UA₃U... ∈ F
(that is, F is closed under countable unions, not merely finite ones)
Axiomatic definition of probability
Let S be the sample space of a random experiment and F be a σ-field of
subsets of S. For each event A which is an element of the σ-field , we can
define a real valued function P(A) , and this function P(A) is called the
probability of the event A if it satisfies the following axioms.
(i) 0 ≤ P(A) ≤ 1, for every event A
(ii) P(S) = 1
(iii)If A₁, A₂, A₃.....are mutually exclusive events then
P(A₁ UA₂ U A₃ U....) = P(A₁) + P(A₂) + P(A₃) +.....

Probability space
The triplet (S,F,P) is called the probability space , where S is the sample
space , F is the σ-field of subsets of S and P is the probability function
Addition theorem of probability
For any two events A and B , P(AUB) = P(A) + P(B) – P(AՈB)

Results
P(Aᶜ) = 1 – P(A)
P(AՈBᶜ) = P(A) - P(AՈB)
P(AᶜՈB) = P(B) - P(AՈB)

De Morgan’s law
(AUB)ᶜ = AᶜՈBᶜ
(AՈB)ᶜ = AᶜUBᶜ
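
A quick Python check of the addition theorem, the complement result and
De Morgan's law, using equally likely outcomes of a die (an illustrative
choice):

from fractions import Fraction

S = set(range(1, 7))                       # die throw, equally likely outcomes
P = lambda E: Fraction(len(E), len(S))     # classical probability
A = {2, 4, 6}                              # even numbers
B = {4, 5, 6}                              # numbers greater than 3

assert P(A | B) == P(A) + P(B) - P(A & B)  # addition theorem
assert P(S - A) == 1 - P(A)                # P(Ac) = 1 - P(A)
assert S - (A | B) == (S - A) & (S - B)    # (A U B)c = Ac n Bc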
Conditional probability
The probability of an event A , given that the event B has already occurred ,
is called the conditional probability of A given B , and it is given by
P(A/B) = P(AՈB) / P(B) , provided P(B) ≠ 0

Similarly the conditional probability of B given A is given by

P(B/A) = P(AՈB) / P(A) , provided P(A) ≠ 0
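
The same die example (illustrative) also checks the conditional probability
formula:

from fractions import Fraction

S = set(range(1, 7))
P = lambda E: Fraction(len(E), len(S))
A = {2, 4, 6}                    # even number
B = {4, 5, 6}                    # number greater than 3

P_A_given_B = P(A & B) / P(B)    # P(A/B) = P(AnB)/P(B)
print(P_A_given_B)               # 2/3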
Dependent and independent events
Two events A and B are called dependent events if the happening of
one of them affects the probability of happening of the other.

Two events A and B are said to be independent events if the


happening of one event does not affect the probability of happening
of the other.

If A and B are independent events , then


P(A/B) = P(A) and P(B/A) = P(B)
Multiplication theorem of probability
For any two events A and B ,
P(AՈB) = P(A)P(B/A) or
P(AՈB) = P(B)P(A/B)

If A and B are two independent events then P(AՈB) = P(A)P(B)


Bayes' theorem
Let S be the sample space of a random experiment which is partitioned
into n mutually exclusive events B₁, B₂, ..., Bₙ. Let A be an event in the
sample space which can happen only if one of the events B₁, B₂, ..., Bₙ
happens. Then
P(Bᵢ/A) = P(Bᵢ)P(A/Bᵢ) / [P(B₁)P(A/B₁) + P(B₂)P(A/B₂) + ... + P(Bₙ)P(A/Bₙ)] ,
i = 1, 2, ..., n
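
A small sketch of Bayes' theorem; the three machines and their rates below
are invented numbers for illustration, not from the text:

from fractions import Fraction

# Machines B1, B2, B3 produce 50%, 30%, 20% of the items; their defect
# rates are 2%, 3%, 4%. A = the event that a randomly chosen item is
# defective.
prior = [Fraction(1, 2), Fraction(3, 10), Fraction(1, 5)]     # P(Bi)
like = [Fraction(1, 50), Fraction(3, 100), Fraction(1, 25)]   # P(A/Bi)

denom = sum(p * l for p, l in zip(prior, like))               # P(A)
posterior = [p * l / denom for p, l in zip(prior, like)]      # P(Bi/A)
print(posterior)        # [10/27, 1/3, 8/27]; posteriors sum to 1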
MODULE 2 - RANDOM VARIABLES SUMMARY

Random variable
A random variable is a real valued function defined over the sample
space of a random experiment.
Random variables are classified into two - discrete and continuous.

Discrete random variables


A random variable is said to be discrete if it takes a finite or countably
infinite number of values.

Continuous random variables


A random variable is said to be continuous if it can take an uncountably
infinite number of values.
Probability mass function (pmf) of a discrete random variable
Let X be a discrete random variable then the function f(x) = P(X = x) is
called the probability mass function or pmf of X , if it satisfies the
following conditions
f(x) ≥ 0 , for every x
Σf(x) = 1 ( Total pmf = 1)

Probability density function (pdf) of a continuous random variable


Let X be a continuous random variable and let f(x)dx = P(x ≤ X ≤ x+dx) ,
then the function f(x) is called the probability density function (pdf) of X
if it satisfies the following conditions
f(x) ≥ 0 , for every x
∫f(x)dx =1 (Total pdf =1)
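
A numeric sketch of the total-probability condition for a hypothetical pdf
f(x) = 2x on (0, 1), using a simple midpoint sum in place of the integral:

f = lambda x: 2 * x          # hypothetical pdf on (0, 1)
n = 100_000
h = 1 / n
total = sum(f((k + 0.5) * h) for k in range(n)) * h  # midpoint rule on (0, 1)
print(round(total, 6))       # approximately 1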
Cumulative distribution function (cdf) / Distribution function
Let X be a random variable having the pmf/pdf f(x) , then the cumulative
distribution function (cdf) of X is given by

F(x) = P(X ≤ x) = Σ f(t) , summed over all t ≤ x , if X is discrete

= ∫ f(t)dt , integrated from -∞ to x , if X is continuous
Properties of cdf

* F(x) is defined for all real values of x


* F(-∞) =0 and F(+∞) = 1
* If a < b , F(a) ≤ F(b) .That is F (x) is a nondecreasing function
* If X is a discrete random variable then the graph of F(x) will be a step
function
* If X is a continuous random variable then the graph of F(x) will be a
continuous function
* If X is a discrete random variable then P(a < X ≤ b) = F(b) – F(a)
* If X is a continuous random variable then
P(a < X < b) = P(a ≤ X < b) = P(a < X ≤ b) = P(a ≤ X ≤ b) = F(b) – F(a)
* If f(x) is the pdf and F(x) is the cdf of a continuous random variable X then
f(x) = dF(x) / dx
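
A sketch of the discrete cdf as a running total of the pmf, using a fair die
(an illustrative choice):

from fractions import Fraction
from itertools import accumulate

xs = [1, 2, 3, 4, 5, 6]
f = [Fraction(1, 6)] * 6
F = dict(zip(xs, accumulate(f)))   # F(x) = P(X <= x), a step function

print(F[5] - F[2])                 # P(2 < X <= 5) = F(5) - F(2) = 1/2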
Change of variable (Continuous case)
Let X be a continuous random variable having pdf f(x) and let Y be a
one-to-one function of X , then the pdf of the random variable Y is given by
g(y) = f(x) / |dy/dx| , with x expressed in terms of y
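
A simulation sketch of the change-of-variable formula: take X uniform on
(0, 1) with f(x) = 1 and Y = X² (one-to-one here), so the formula gives
g(y) = 1/(2√y). These choices are illustrative:

import random

random.seed(2)
ys = [random.random() ** 2 for _ in range(200_000)]   # samples of Y = X²

# The integral of g(y) = 1/(2*sqrt(y)) over (0, 0.25] is sqrt(0.25) = 0.5,
# so the simulated proportion of Y values below 0.25 should be near 0.5.
print(sum(y <= 0.25 for y in ys) / len(ys))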
MODULE 3 - EXPECTATION SUMMARY

Expectation
Let X be a random variable having pmf/pdf f(x) . Then the Mathematical
Expectation of X or Mean of the random variable X is given by
E(X) = ∑ xf(x) ,if X is a discrete random variable

= ∫ xf(x)dx ,if X is a continuous random variable

Expectation of a function of a random variable
Let X be a random variable having pmf/pdf f(x) and let g(X) be a function
of X , then the Expectation of g(X) is given by
E[g(X)] = ∑ g(x)f(x) ,if X is a discrete random variable

= ∫ g(x)f(x)dx ,if X is a continuous random variable


Properties of Expectation
(i) E(c) = c , where c is a constant
(ii) E(cX)=cE(X)
(iii) E(aX+b) = aE(X) + b, where a and b are constants
(iv) E(X+Y) = E(X) + E(Y)
(v) E(XY) = E(X)E(Y), if X and Y are independent
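
A quick check of these properties on an illustrative pmf (X takes 0, 1, 2
with probabilities 1/4, 1/2, 1/4):

from fractions import Fraction

pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
E = lambda g: sum(g(x) * p for x, p in pmf.items())    # E[g(X)]

EX = E(lambda x: x)
print(EX)                                  # mean = 1
print(E(lambda x: 3 * x + 2), 3 * EX + 2)  # E(aX+b) = aE(X) + b, both 5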
Variance of a random variable
Let X be a random variable having pmf/pdf f(x) , then the variance of X is
given by
V(X) = E[X-E(X)]²
= ∑[x-E(X)]² f(x) , if X is a discrete random variable

= ∫[x-E(X)]² f(x)dx , if X is a continuous random variable

The variance of a random variable X can also be written as


V(X) = E(X²) – {E(X)}²

Properties of Variance
V(c) = 0
V(cX) = c²V(X)
V(aX+b) = a²V(X)
V(X±Y) = V(X)+V(Y) if X and Y are independent random variables
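
Continuing with the same illustrative pmf, a check of V(X) = E(X²) – {E(X)}²
and V(aX+b) = a²V(X):

from fractions import Fraction

pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
E = lambda g: sum(g(x) * p for x, p in pmf.items())
V = lambda g: E(lambda x: g(x) ** 2) - E(g) ** 2   # V = E(g²) - [E(g)]²

print(V(lambda x: x))              # V(X) = 1/2
print(V(lambda x: 3 * x + 2))      # V(3X+2) = 9 V(X) = 9/2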
Moments
There are two types of moments – Raw moments and Central moments

Raw moments
Let X be a random variable having pmf/pdf f(x) , then the rth raw moment
about the origin A is given by μr'(A) = E(X – A)ʳ

In particular if the origin A = 0, then the rth raw moment about the origin 0
is given by
μr' = μr'(0) = E(Xʳ)

Central moments
Let X be a random variable having pmf/pdf f(x) , then the rth central
moment is given by
μr = E[X – E(X)]ʳ
Note :-
The first raw moment about the origin 0 , μ1' is the mean
The first central moment, μ1 is always zero.
The second central moment, μ2 is the variance.
Relation between raw moments and central moments
μr = μr' – ʳC₁ μr-1' μ1' + ʳC₂ μr-2' (μ1')² – ... + (-1)ʳ (μ1')ʳ

In particular
μ1 = 0
μ2 = μ2' – (μ1' )²
μ3 = μ3' - 3 μ2' μ1' + 2(μ1' )³
μ4 = μ4' – 4 μ3' μ1' + 6 μ2'(μ1' )² – 3(μ1' )⁴
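
These relations can be verified directly on the same illustrative pmf:

from fractions import Fraction

pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
raw = lambda r: sum(x ** r * p for x, p in pmf.items())        # μr' about 0
m1, m2, m3 = raw(1), raw(2), raw(3)

mu2 = m2 - m1 ** 2                       # μ2 from raw moments
mu3 = m3 - 3 * m2 * m1 + 2 * m1 ** 3     # μ3 from raw moments
central = lambda r: sum((x - m1) ** r * p for x, p in pmf.items())
assert mu2 == central(2) and mu3 == central(3)   # both relations hold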
Moment generating function (mgf)
Let X be a random variable having pmf/pdf f(x) , then the moment
generating function (mgf) is given by

MX(t) = E[eᵗˣ] = ∑ eᵗˣ f(x) ,if X is discrete

= ∫ eᵗˣ f(x)dx ,if X is continuous

The moment generating function MX(t) generates raw moments.

μr' is the coefficient of tʳ/r! in the expansion of the mgf
Or
μr' = dʳ MX(t) / dtʳ , evaluated at t = 0
Properties of moment generating function
* Moment generating function uniquely identifies the distribution of a
random variable.
* McX(t) = MX(ct) , where c is a constant
* MaX+b(t) = eᵇᵗ MX(at)
* MX+Y(t) = MX(t) MY(t) , if X and Y are independent
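
A symbolic sketch of how the mgf generates raw moments, assuming sympy
is available. For a Bernoulli(p) variable the mgf is M(t) = (1-p) + p·eᵗ,
and every raw moment equals p:

import sympy as sp

t, p = sp.symbols('t p')
M = (1 - p) + p * sp.exp(t)             # mgf of Bernoulli(p)
for r in (1, 2, 3):
    print(sp.diff(M, t, r).subs(t, 0))  # rth derivative at t = 0 gives p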
Characteristic function
Let X be a random variable having pmf/pdf f(x) , then the characteristic
function is given by ϕX(t) = E[eⁱᵗˣ] , where i = √-1
= ∑ eⁱᵗˣ f(x) , if X is discrete
= ∫ eⁱᵗˣ f(x)dx , if X is continuous
Skewness
Skewness means lack of symmetry .
The graph of the probability distribution of random variable can be either
symmetric , positively skewed or negatively skewed .

Measures of skewness based on moments


Coefficient of skewness , β1 = μ3² / μ2³ and γ1 = √β1 = μ3 / √μ2³

A probability distribution will be negatively skewed , symmetric or positively


skewed according as γ₁ < , = , > 0 or μ3 < , = , > 0
Kurtosis
Kurtosis measures the peakedness of the curve of the probability
distribution.

Measures of kurtosis based on moments


Coefficient of kurtosis , β2 = μ4 / μ2² and γ2 = β2 - 3

A probability distribution will be platykurtic , mesokurtic or leptokurtic


according as β2 < , = , > 3 or γ2 < , = , > 0
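
A small computation of both coefficients for an illustrative two-point pmf
(P(X=0) = 9/10, P(X=1) = 1/10), which is positively skewed and leptokurtic:

from fractions import Fraction

pmf = {0: Fraction(9, 10), 1: Fraction(1, 10)}
mean = sum(x * p for x, p in pmf.items())
mu = lambda r: sum((x - mean) ** r * p for x, p in pmf.items())

beta1 = mu(3) ** 2 / mu(2) ** 3      # 64/9, with μ3 > 0: positively skewed
beta2 = mu(4) / mu(2) ** 2           # 73/9
print(beta1, beta2 - 3)              # γ2 = 46/9 > 0: leptokurtic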
MODULE 4 - BIVARIATE DISTRIBUTION SUMMARY

Joint pmf
Let (X,Y) be a pair of discrete random variables , then the function
f(x,y) = P(X=x ,Y =y) is called the joint pmf of the random variables (X,Y) if
it satisfies the following conditions
f(x,y) ≥ 0 , for every (x,y)
∑ ∑ f(x,y)=1

Joint pdf
Let (X,Y) be a pair of continuous random variables and if
f(x,y)dxdy= P(x ≤ X ≤ x+dx , y ≤ Y ≤ y+dy) then the function f(x,y) is called
the joint pdf of the random variables (X,Y) if it satisfies the following
conditions
f(x,y ) ≥ 0 , for every (x,y)
∫ ∫ f(x,y)dxdy =1
Joint cdf
Let (X,Y) be a pair of random variables having the joint pmf/pdf f(x,y) ,
then the joint cumulative distribution function of (X,Y) is given by
F(x,y) = P(X≤x , Y≤y)

= ∑ ∑ f(u,v) , summed over all u ≤ x and v ≤ y , if X and Y are discrete rvs

= ∫ ∫ f(u,v)dudv , integrated over u ≤ x and v ≤ y , if X and Y are continuous rvs


Properties of Joint cdf

* F(+∞ ,+∞) = 1
* F(-∞ ,-∞) = 0
* F(-∞ ,y) = F(x ,-∞) = 0
* F(x ,+∞) = F₁(x) , the marginal cdf of X
* F(+∞ ,y) = F₂(y) , the marginal cdf of Y
* If (X,Y) is a pair of discrete random variables then
P(a < X ≤ b , c < Y ≤ d) = F(b,d) – F(a,d) – F(b,c) + F(a,c)
* If (X,Y) is a pair of continuous random variables having joint cdf F(x,y) ,
then their joint pdf f(x,y) = ∂²F(x,y) / ∂x∂y
Marginal pmf
Let (X,Y) be a pair of discrete random variables having the joint pmf f(x,y),
then the marginal pmf of the random variable X is given by
f1(x) = ∑ f(x,y) , summed over all values of y

and the marginal pmf of the random variable Y is given by

f2(y) = ∑ f(x,y) , summed over all values of x

Marginal pdf
Let (X,Y) be a pair of continuous random variables having the joint pdf f(x,y) ,
then the marginal pdf of the random variable X is given by
f1(x)= ∫ f(x,y)dy

and the marginal pdf of the random variable Y is given by


f2(y)= ∫f(x,y)dx
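
A sketch of marginals computed from a joint pmf; the 2x2 table below is an
illustrative choice:

from fractions import Fraction

f = {(0, 0): Fraction(1, 8), (0, 1): Fraction(3, 8),
     (1, 0): Fraction(3, 8), (1, 1): Fraction(1, 8)}   # joint pmf, sums to 1

f1 = {x: sum(p for (u, v), p in f.items() if u == x) for x in (0, 1)}  # over y
f2 = {y: sum(p for (u, v), p in f.items() if v == y) for y in (0, 1)}  # over x
print(f1, f2)          # each marginal is {0: 1/2, 1: 1/2}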
Conditional distribution
Let (X,Y) be a pair of random variables having joint pmf/pdf f(x,y) and
marginal pmfs/pdfs f1(x) and f2(y), then the conditional pmf/pdf of X
given Y is given by
f(x/y) = f(x,y) / f2(y)
And the conditional pmf/pdf of Y given X is given by
f(y/x) = f(x,y) / f1(x)

Independence of two random variables


Two random variables X and Y having joint pmf/pdf f(x,y) and marginal
pmfs/pdfs f1(x) and f2(y) are said to be independent if and only if
f(x,y) = f1(x)f2(y) , for every (x,y)
If the two random variables X and Y are independent then
f(x/y) = f1(x) and f(y/x) = f2(y)
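
Using the same illustrative joint pmf as above, the factorisation test shows
X and Y are dependent, since f(x,y) ≠ f1(x)f2(y):

from fractions import Fraction

f = {(0, 0): Fraction(1, 8), (0, 1): Fraction(3, 8),
     (1, 0): Fraction(3, 8), (1, 1): Fraction(1, 8)}
f1 = {x: sum(p for (u, _), p in f.items() if u == x) for x in (0, 1)}
f2 = {y: sum(p for (_, v), p in f.items() if v == y) for y in (0, 1)}

print(all(f[(x, y)] == f1[x] * f2[y] for (x, y) in f))  # False: dependent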
Bivariate Expectation
Let (X,Y) be a pair of random variables having the joint pmf/pdf f(x,y) and
let g(X,Y) be a function of (X,Y) , then the expectation of g(X,Y) is given by
E[g(X,Y)] = ∑ ∑ g(x,y)f(x,y) ,if (X,Y) is discrete

= ∫ ∫ g(x,y)f(x,y)dxdy ,if (X,Y) is continuous


Addition theorem of expectation
If (X ,Y) is a pair of any two random variables having joint pmf/pdf f(x,y)
then E(X+Y) = E(X) + E(Y)

Multiplication theorem of expectation


If X and Y are two independent random variables having joint pmf/pdf
f(x,y) , then E(XY) = E(X)E(Y)

Result
If X and Y are two independent random variables , then
MX+Y(t) = MX(t) MY(t)
Product moments

Product raw moments


Let (X,Y) be a pair of random variables having joint pmf/pdf f(x,y) , then
the (r,s)th product raw moment of the random variables X and Y is given
by µ'rs = E(XʳYˢ) = ∑ ∑ xʳyˢ f(x,y) ,if (X,Y) is discrete

= ∫ ∫ xʳyˢ f(x,y)dxdy ,if (X,Y) is continuous

Product central moments


Let (X,Y) be a pair of random variables having joint pmf/pdf f(x,y) , then
the (r,s)th product central moment of the random variables X and Y is
given by µrs = E{[X – E(X)]ʳ [Y – E(Y)]ˢ}
Covariance of X and Y
Let (X,Y) be a pair of random variables having joint pmf/pdf f(x,y) , then
the covariance between X and Y is given by
Cov(X,Y) = E{[X – E(X)][Y – E(Y)]} = µ₁₁

On simplification Cov(X,Y) can also be written as


Cov(X,Y) = E(XY) – E(X)E(Y)
Karl Pearson’s correlation coefficient
Let (X,Y) be a pair of random variables having joint pmf/pdf f(x,y) , then the
correlation coefficient between X and Y is given by
ρxy = Cov(X,Y) / [√V(X) √V(Y)]

= E{[X – E(X)][Y – E(Y)]} / [√E[X – E(X)]² √E[Y – E(Y)]²]

= µ₁₁ / [√µ₂₀ √µ₀₂]

On simplification this formula can also be written as

ρxy = [E(XY) – E(X)E(Y)] / [√(E(X²) – [E(X)]²) √(E(Y²) – [E(Y)]²)]
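
Covariance and correlation for the same illustrative joint pmf:

import math
from fractions import Fraction

f = {(0, 0): Fraction(1, 8), (0, 1): Fraction(3, 8),
     (1, 0): Fraction(3, 8), (1, 1): Fraction(1, 8)}
E = lambda g: sum(g(x, y) * p for (x, y), p in f.items())

cov = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)
vx = E(lambda x, y: x * x) - E(lambda x, y: x) ** 2
vy = E(lambda x, y: y * y) - E(lambda x, y: y) ** 2
print(cov, cov / math.sqrt(vx * vy))   # Cov(X,Y) = -1/8, ρ = -0.5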
Conditional Expectation
Let (X,Y) be a pair of random variables having joint pmf/pdf f(x,y) , then the
conditional expectation of the random variable X given Y is given by
E(X/Y) = ∑ xf(x/y) , if X and Y are discrete

=∫ xf(x/y )dx , if X and Y are continuous

Similarly the conditional expectation of the random variable Y given X is


E(Y/X) = ∑ yf(y/x) , if X and Y are discrete

=∫ yf(y/x)dy , if X and Y are continuous


Conditional variance
Let (X,Y) be a pair of random variables having joint pmf/pdf f(x,y) , then
the conditional variance of the random variable X given Y is given by
V(X/Y) = E(X²/Y) – {E(X/Y)}²
And the conditional variance of the random variable Y given X is given by
V(Y/X) = E(Y²/X) – {E(Y/X)}²
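
Conditional mean and variance of X given Y = 0, again on the illustrative
joint pmf used above:

from fractions import Fraction

f = {(0, 0): Fraction(1, 8), (0, 1): Fraction(3, 8),
     (1, 0): Fraction(3, 8), (1, 1): Fraction(1, 8)}
y0 = 0
f2_y0 = sum(p for (x, y), p in f.items() if y == y0)   # marginal f2(0) = 1/2
cond = {x: f[(x, y0)] / f2_y0 for x in (0, 1)}         # f(x/y0): {0: 1/4, 1: 3/4}

EX = sum(x * p for x, p in cond.items())               # E(X/Y=0) = 3/4
EX2 = sum(x * x * p for x, p in cond.items())
print(EX, EX2 - EX ** 2)                               # V(X/Y=0) = 3/16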

Cauchy-Schwarz inequality


Let (X,Y) be a pair of random variables having joint pmf/pdf f(x,y) , then
[E(XY)]² ≤ E(X²)E(Y²)
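
A numeric confirmation of the inequality on the same illustrative joint pmf:

from fractions import Fraction

f = {(0, 0): Fraction(1, 8), (0, 1): Fraction(3, 8),
     (1, 0): Fraction(3, 8), (1, 1): Fraction(1, 8)}
E = lambda g: sum(g(x, y) * p for (x, y), p in f.items())

lhs = E(lambda x, y: x * y) ** 2                       # [E(XY)]² = 1/64
rhs = E(lambda x, y: x * x) * E(lambda x, y: y * y)    # E(X²)E(Y²) = 1/4
print(lhs, rhs, lhs <= rhs)                            # inequality holds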
