II Sem - Last Minute Revision
II BSc COMPLEMENTARY
STATISTICS
(Mathematics/Computer science)
Example :-
In the coin tossing experiment A = getting a head = {H} and
B = getting a tail = {T} are events.
Equally likely events : Two or more events are said to be equally likely if
they have the same chance of occurrence in a trial.
Example :-
In the coin tossing experiment the events getting a head {H} and
getting a tail {T} are equally likely events.
Mutually exclusive events : Two or more events are said to be mutually exclusive
if they cannot occur together in the same trial.
Example :-
In the coin tossing experiment the events getting a head {H} and
getting a tail {T} are mutually exclusive events.
Classical definition of probability
If a random experiment results in n equally likely, mutually exclusive
and exhaustive number of cases out of which m cases are favourable to
the occurrence of an event A then the probability of the event A is
given by
P(A) = favourable no. of cases / exhaustive no. of cases = m/n
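The classical definition is just counting. A minimal Python sketch (the die example is illustrative, not from the syllabus):

```python
from fractions import Fraction

def classical_probability(favourable, exhaustive):
    """P(A) = favourable no. of cases / exhaustive no. of cases."""
    return Fraction(favourable, exhaustive)

# Rolling a fair die: S = {1,...,6}, A = getting an even number = {2, 4, 6}
sample_space = {1, 2, 3, 4, 5, 6}
A = {x for x in sample_space if x % 2 == 0}
p = classical_probability(len(A), len(sample_space))
print(p)  # 1/2
```

Using Fraction keeps the answer exact (m/n in lowest terms) instead of a rounded decimal.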
Frequency definition of probability
Consider a random experiment which is repeated n times. Let an event A
occur in f out of the n repetitions. Then f is called the frequency and f/n is
called the frequency ratio or relative frequency of the event A. When n
becomes very large, the frequency ratio becomes more regular and approaches
a constant. This constant value is called the probability of the event A, and
the process of the frequency ratio approaching a constant as n becomes very
large is called statistical regularity.
That is, by the frequency definition, P(A) = lim (n→∞) f/n
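Statistical regularity can be seen by simulation. A short sketch (fixed seed so the run is repeatable; a fair coin is assumed, so f/n should settle near 0.5):

```python
import random

random.seed(0)  # fixed seed: the simulated tosses are reproducible

def relative_frequency(n):
    """Toss a fair coin n times and return the relative frequency f/n of heads."""
    heads = sum(random.choice("HT") == "H" for _ in range(n))
    return heads / n

# As n grows, the frequency ratio becomes more regular (statistical regularity)
for n in (10, 1000, 100_000):
    print(n, relative_frequency(n))
```

For small n the ratio fluctuates noticeably; for n = 100 000 it is very close to the true probability 0.5.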
Union, intersection and complement of an event
The union AUB occurs if at least one of A and B occurs, the intersection AՈB
occurs if both A and B occur, and the complement Aᶜ occurs if A does not occur.
Probability space
The triplet (S,F,P) is called the probability space , where S is the sample
space ,F is the σfield of subsets of S and P is the probability function
Addition theorem of probability
For any two events A and B , P(AUB) = P(A) + P(B) – P(AՈB)
Results
P(Aᶜ) = 1 – P(A)
P(A Ո Bᶜ) = P(A) – P(AՈB)
P(Aᶜ Ո B) = P(B) – P(AՈB)
De Morgan’s law
(AUB)ᶜ = Aᶜ Ո Bᶜ
(AՈB)ᶜ = Aᶜ U Bᶜ
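The addition theorem, the complement results and De Morgan’s laws can all be checked on a small finite sample space where P(E) = |E|/|S|. A sketch using a die roll (the sets A and B are illustrative choices):

```python
from fractions import Fraction

S = frozenset(range(1, 7))             # sample space of a die roll
P = lambda E: Fraction(len(E), len(S))  # classical probability on S

A = {x for x in S if x % 2 == 0}       # even outcomes {2, 4, 6}
B = {x for x in S if x > 3}            # outcomes {4, 5, 6}
Ac, Bc = S - A, S - B                  # complements

# Addition theorem: P(AUB) = P(A) + P(B) - P(A∩B)
assert P(A | B) == P(A) + P(B) - P(A & B)
# P(Aᶜ) = 1 - P(A) and P(A ∩ Bᶜ) = P(A) - P(A∩B)
assert P(Ac) == 1 - P(A)
assert P(A & Bc) == P(A) - P(A & B)
# De Morgan: (AUB)ᶜ = Aᶜ ∩ Bᶜ and (A∩B)ᶜ = Aᶜ U Bᶜ
assert S - (A | B) == Ac & Bc
assert S - (A & B) == Ac | Bc
print("all identities hold")
```

Python's set operators `|`, `&` and `-` map directly onto union, intersection and complement, which makes these identities one-liners to verify.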
Conditional probability
Probability of an event A given that the event B has already occurred is
called the conditional probability of A given B, and it is denoted by
P(A/B) = P(AՈB)/P(B) , provided P(B) ≠ 0
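A worked die example as a sketch (the events chosen are illustrative): given that the score is even, what is the probability it exceeds 3?

```python
from fractions import Fraction

S = frozenset(range(1, 7))
P = lambda E: Fraction(len(E), len(S))

def conditional(A, B):
    """P(A/B) = P(A∩B) / P(B), provided P(B) != 0."""
    if P(B) == 0:
        raise ValueError("P(B) must be non-zero")
    return P(A & B) / P(B)

A = {4, 5, 6}   # score greater than 3
B = {2, 4, 6}   # score is even
print(conditional(A, B))  # 2/3: of the three even scores, two exceed 3
```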
Random variable
A random variable is a real valued function defined over the sample
space of a random experiment.
Random variables are classified into two types – discrete and continuous.
Cumulative distribution function (cdf)
Let X be a random variable having pmf/pdf f(x) , then the cumulative
distribution function of X is given by
F(x) = P(X ≤ x) = ∑ f(x) , if X is discrete
= ∫ f(x)dx , if X is continuous
Properties of cdf
* F(-∞) = 0 and F(+∞) = 1
* F is a non-decreasing function of x
* F is right continuous
Expectation
Let X be a random variable having pmf/pdf f(x) . Then the Mathematical
Expectation of X or Mean of the random variable X is given by
E(X) = ∑ xf(x) , if X is a discrete random variable
= ∫ xf(x)dx , if X is a continuous random variable
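For a discrete random variable the expectation is a weighted sum. A sketch with the pmf stored as a dictionary (the fair-die pmf is an illustrative choice):

```python
from fractions import Fraction

def expectation(pmf):
    """E(X) = Σ x f(x) for a discrete pmf given as {x: f(x)}."""
    return sum(x * p for x, p in pmf.items())

# Fair die: f(x) = 1/6 for x = 1, ..., 6
die = {x: Fraction(1, 6) for x in range(1, 7)}
print(expectation(die))  # 7/2
```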
Variance
The variance of X is V(X) = E[X – E(X)]² = E(X²) – [E(X)]²
Properties of Variance
V(c) = 0
V(cX) = c²V(X)
V(aX+b) = a²V(X)
V(X±Y) = V(X)+V(Y) if X and Y are independent random variables
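The property V(aX+b) = a²V(X) can be verified numerically: shifting by b changes nothing, scaling by a multiplies the variance by a². A sketch using the fair-die pmf (a and b are arbitrary illustrative values):

```python
from fractions import Fraction

def expectation(pmf):
    """E(X) = Σ x f(x) for a pmf given as {x: f(x)}."""
    return sum(x * p for x, p in pmf.items())

def variance(pmf):
    """V(X) = E(X²) − [E(X)]²."""
    m = expectation(pmf)
    return sum(x * x * p for x, p in pmf.items()) - m * m

die = {x: Fraction(1, 6) for x in range(1, 7)}
a, b = 3, 5
shifted = {a * x + b: p for x, p in die.items()}  # pmf of Y = aX + b

assert variance(shifted) == a * a * variance(die)  # V(aX+b) = a²V(X)
print(variance(die))  # 35/12
```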
Moments
There are two types of moments – Raw moments and Central moments
Raw moments
Let X be a random variable having pmf/pdf f(x) , then the rth raw moment
about the origin A is given by μr'(A) = E(X – A)ʳ
In particular if the origin A = 0, then the rth raw moment about the origin 0
is given by
μr' = μr'(0) = E(Xʳ)
Central moments
Let X be a random variable having pmf/pdf f(x) , then the rth central
moment (the rth moment about the mean) is given by
μr = E[X – E(X)]ʳ
Note :-
The first raw moment about the origin 0 , μ1' is the mean
The first central moment, μ1 is always zero.
The second central moment, μ2 is the variance.
Relation between raw moments and central moments
In particular
μ1 = 0
μ2 = μ2' – (μ1' )²
μ3 = μ3' - 3 μ2' μ1' + 2(μ1' )³
μ4 = μ4' – 4 μ3' μ1' + 6 μ2'(μ1' )² – 3(μ1' )⁴
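These relations can be checked by computing both sides from a pmf. A sketch for the fair die (μ₃ also comes out 0, as expected for a symmetric distribution):

```python
from fractions import Fraction

def raw_moment(pmf, r):
    """μr' = E(X^r), the rth raw moment about the origin 0."""
    return sum(x ** r * p for x, p in pmf.items())

def central_moment(pmf, r):
    """μr = E[(X − E(X))^r], the rth central moment."""
    m = raw_moment(pmf, 1)
    return sum((x - m) ** r * p for x, p in pmf.items())

die = {x: Fraction(1, 6) for x in range(1, 7)}
m1, m2, m3 = (raw_moment(die, r) for r in (1, 2, 3))

# μ2 = μ2' − (μ1')²  and  μ3 = μ3' − 3μ2'μ1' + 2(μ1')³
assert central_moment(die, 2) == m2 - m1 ** 2
assert central_moment(die, 3) == m3 - 3 * m2 * m1 + 2 * m1 ** 3
print(central_moment(die, 2))  # 35/12 — the variance
```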
Moment generating function (mgf)
Let X be a random variable having pmf/pdf f(x) , then the moment
generating function (mgf) of X is given by
MX(t) = E(eᵗˣ)
Joint pmf
Let (X,Y) be a pair of discrete random variables , then the function
f(x,y) = P(X=x ,Y =y) is called the joint pmf of the random variables (X,Y) if
it satisfies the following conditions
f(x,y) ≥ 0 , for every (x,y)
∑ ∑ f(x,y)=1
Joint pdf
Let (X,Y) be a pair of continuous random variables and if
f(x,y)dxdy= P(x ≤ X ≤ x+dx , y ≤ Y ≤ y+dy) then the function f(x,y) is called
the joint pdf of the random variables (X,Y) if it satisfies the following
conditions
f(x,y ) ≥ 0 , for every (x,y)
∫ ∫ f(x,y)dxdy =1
Joint cdf
Let (X,Y) be a pair of random variables having the joint pmf/pdf f(x,y) ,
then the joint cumulative distribution function of (X,Y) is given by
F(x,y)= P(X≤x ,Y ≤y)
* F(+∞ ,+∞) = 1
* F(-∞ ,-∞) = 0
* F(-∞ ,y) = F(x ,-∞) = 0
* F(x ,+∞) = F1(x) , the marginal cdf of X
* F(+∞ ,y) = F2(y) , the marginal cdf of Y
* For any pair of random variables (X,Y),
P(a < X ≤ b , c < Y ≤ d) = F(b,d) – F(b,c) – F(a,d) + F(a,c)
* If (X,Y) is a pair of continuous random variables having joint cdf F(x,y) ,
then their joint pdf f(x,y) = ∂²F(x,y)
∂x∂y
Marginal pmf
Let (X,Y) be a pair of discrete random variables having the joint pmf f(x,y),
then the marginal pmf of the random variable X is given by
f1(x)= ∑ f(x,y) , the sum being taken over all values of y
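Marginalising a discrete joint pmf is just summing out the other variable. A sketch with a small hypothetical joint pmf (the table values are made up for illustration):

```python
from fractions import Fraction

# Hypothetical joint pmf of (X, Y), stored as {(x, y): f(x, y)}
joint = {
    (0, 0): Fraction(1, 8), (0, 1): Fraction(2, 8),
    (1, 0): Fraction(3, 8), (1, 1): Fraction(2, 8),
}
assert sum(joint.values()) == 1  # Σ Σ f(x,y) = 1

def marginal_x(joint):
    """f1(x) = Σ_y f(x, y): sum the joint pmf over all values of y."""
    f1 = {}
    for (x, y), p in joint.items():
        f1[x] = f1.get(x, 0) + p
    return f1

m = marginal_x(joint)
print(m)  # f1(0) = 3/8, f1(1) = 5/8
```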
Marginal pdf
Let (X,Y) be a pair of continuous random variables having the joint pdf f(x,y) ,
then the marginal pdf of the random variable X is given by
f1(x)= ∫ f(x,y)dy
Result
If X and Y are two independent random variables , then
MX+Y(t) = MX(t) MY(t)
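The product rule for mgfs of independent random variables can be checked numerically: build the pmf of X+Y for two independent dice and compare the mgfs at some value of t (t = 0.3 is an arbitrary illustrative choice):

```python
import math
from fractions import Fraction

def mgf(pmf, t):
    """M_X(t) = E(e^{tX}) = Σ e^{tx} f(x) for a discrete pmf."""
    return sum(math.exp(t * x) * p for x, p in pmf.items())

die = {x: Fraction(1, 6) for x in range(1, 7)}

# pmf of X + Y for two independent fair dice (convolution of the two pmfs)
total = {}
for x, px in die.items():
    for y, py in die.items():
        total[x + y] = total.get(x + y, 0) + px * py

t = 0.3
assert math.isclose(mgf(total, t), mgf(die, t) * mgf(die, t))
print("M_{X+Y}(t) = M_X(t) · M_Y(t) holds at t =", t)
```

Independence is what licenses forming the pmf of X+Y as a convolution; without it the product rule fails in general.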
Product moments
Let (X,Y) be a pair of random variables , then the (r,s)th central product
moment is given by
μrs = E[(X – E(X))ʳ (Y – E(Y))ˢ]
In particular μ11 = Cov(X,Y) , μ20 = V(X) and μ02 = V(Y).
Correlation coefficient
The correlation coefficient of X and Y is
ρ = μ11 / (√μ20 √μ02)
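Product moments and the correlation coefficient can be computed directly from a joint pmf. A sketch reusing a small hypothetical joint pmf (values made up for illustration):

```python
import math
from fractions import Fraction

# Hypothetical joint pmf of (X, Y), stored as {(x, y): f(x, y)}
joint = {
    (0, 0): Fraction(1, 8), (0, 1): Fraction(2, 8),
    (1, 0): Fraction(3, 8), (1, 1): Fraction(2, 8),
}

def product_moment(joint, r, s):
    """μrs = E[(X − E(X))^r (Y − E(Y))^s]."""
    ex = sum(x * p for (x, y), p in joint.items())   # E(X)
    ey = sum(y * p for (x, y), p in joint.items())   # E(Y)
    return sum((x - ex) ** r * (y - ey) ** s * p
               for (x, y), p in joint.items())

mu11 = product_moment(joint, 1, 1)   # Cov(X, Y)
mu20 = product_moment(joint, 2, 0)   # V(X)
mu02 = product_moment(joint, 0, 2)   # V(Y)

rho = mu11 / (math.sqrt(mu20) * math.sqrt(mu02))
print(rho)  # ≈ -0.258, a weak negative correlation
```

Note that ρ always lies between −1 and +1, whatever joint pmf is supplied.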