
STATISTICS AND PROBABILITY

STATISTICS:
MEASURES OF CENTRAL TENDENCY:
An average value or a central value of a distribution is a value of the variable which
is representative of the entire distribution; such representative values are called
measures of central tendency.
Three types of mathematical averages are (i) arithmetic mean, (ii) geometric
mean, and (iii) harmonic mean.
Five types of positional averages are (i) median, (ii) quartiles, (iii) deciles, (iv)
percentiles, and (v) mode.
Arithmetic mean
Individual observations: $\bar{X} = \dfrac{\Sigma x}{N}$, where $\Sigma x$ = sum of items and $N$ = number of observations.
Discrete series: $\bar{X} = \dfrac{\Sigma f x}{N}$, where $f$ = frequency and $N = \Sigma f$.
Short-cut method: $\bar{X} = A + \dfrac{\Sigma f d}{N}$, where $d = X - A$, $N = \Sigma f$, and $A$ = assumed mean.
Continuous series: $\bar{X} = \dfrac{\Sigma f m}{N}$, where $m$ = mid-values of the various classes and $N = \Sigma f$.
Combined mean: If a series of $N$ observations consists of two components with means $\bar{X}_1, \bar{X}_2$ and numbers of items $N_1$ and $N_2$, then the combined mean is
$$\bar{X}_{12} = \frac{N_1 \bar{X}_1 + N_2 \bar{X}_2}{N_1 + N_2}$$
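A minimal Python sketch of these mean formulas (not part of the original notes; the sample values, frequencies, and class limits below are made up for illustration):

```python
# Arithmetic mean for individual, discrete, and continuous data.
x = [4, 6, 8, 10, 12]                      # individual observations
mean_individual = sum(x) / len(x)          # X̄ = Σx / N

values = [2, 4, 6, 8]                      # discrete series: values ...
freq   = [1, 3, 4, 2]                      # ... with frequencies
N = sum(freq)
mean_discrete = sum(f * v for v, f in zip(values, freq)) / N   # X̄ = Σfx / N

# Short-cut (assumed-mean) method: gives the same answer for any assumed mean A.
A = 6
mean_shortcut = A + sum(f * (v - A) for v, f in zip(values, freq)) / N

# Continuous series: use class mid-values m.
classes = [(0, 10), (10, 20), (20, 30)]
freq_c  = [5, 8, 7]
mids = [(lo + hi) / 2 for lo, hi in classes]
mean_continuous = sum(f * m for m, f in zip(mids, freq_c)) / sum(freq_c)

print(mean_individual, mean_discrete, mean_shortcut, mean_continuous)
```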
GEOMETRIC MEAN:
(i) For ungrouped distribution: If $x_1, x_2, \ldots, x_n$ are $n$ positive values of a variate, then their geometric mean $G$ is given by
$$G = (x_1 x_2 \cdots x_n)^{1/n} \;\Rightarrow\; G = \operatorname{antilog}\left[\frac{1}{n}\sum_{i=1}^{n} \log x_i\right]$$
(ii) For frequency distribution: If $x_1, x_2, \ldots, x_n$ are $n$ positive values with corresponding frequencies $f_1, f_2, \ldots, f_n$ respectively, then their G.M. is
$$G = \left(x_1^{f_1} \times x_2^{f_2} \times \cdots \times x_n^{f_n}\right)^{1/N} \;\Rightarrow\; G = \operatorname{antilog}\left[\frac{1}{N}\sum_{i=1}^{n} f_i \log x_i\right]$$
Note: If $G_1$ and $G_2$ are the geometric means of two series containing $n_1$ and $n_2$ positive values respectively, and $G$ is the geometric mean of their combined series, then
$$G = \left(G_1^{n_1} \times G_2^{n_2}\right)^{1/(n_1 + n_2)} \;\Rightarrow\; G = \operatorname{antilog}\left[\frac{n_1 \log G_1 + n_2 \log G_2}{n_1 + n_2}\right]$$
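A short Python illustration of the log form of the geometric mean (the data values are invented for the example):

```python
import math

# Geometric mean via logs (values must be positive).
x = [2.0, 8.0, 4.0]
gm = math.exp(sum(math.log(v) for v in x) / len(x))   # (x1*x2*...*xn)^(1/n)

# Weighted (frequency) form: G = antilog[(1/N) * Σ f_i log x_i]
values = [2.0, 4.0, 8.0]
freq   = [3, 2, 1]
N = sum(freq)
gm_freq = math.exp(sum(f * math.log(v) for v, f in zip(values, freq)) / N)

print(gm, gm_freq)
```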
HARMONIC MEAN:
(i) For ungrouped distribution: If $x_1, x_2, \ldots, x_n$ are $n$ non-zero values of a variate, then their harmonic mean $H$ is defined as
$$H = \frac{n}{\dfrac{1}{x_1} + \dfrac{1}{x_2} + \cdots + \dfrac{1}{x_n}} = \frac{n}{\sum_{i=1}^{n} \dfrac{1}{x_i}}$$
(ii) For frequency distribution: If $x_1, x_2, \ldots, x_n$ are $n$ non-zero values of a variate with corresponding frequencies $f_1, f_2, \ldots, f_n$ respectively, then their H.M. is
$$H = \frac{N}{\dfrac{f_1}{x_1} + \dfrac{f_2}{x_2} + \cdots + \dfrac{f_n}{x_n}} = \frac{N}{\sum_{i=1}^{n} \dfrac{f_i}{x_i}}, \quad N = \sum_{i=1}^{n} f_i$$
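A similar Python sketch for the harmonic mean, again with invented data:

```python
# Harmonic mean (values must be non-zero).
x = [1.0, 2.0, 4.0]
hm = len(x) / sum(1.0 / v for v in x)       # H = n / Σ(1/x_i)

values = [1.0, 2.0, 4.0]
freq   = [2, 3, 5]
N = sum(freq)
hm_freq = N / sum(f / v for v, f in zip(values, freq))   # H = N / Σ(f_i/x_i)

print(hm, hm_freq)
```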
Median: Median is defined as the central value of a set of observations. In order to
calculate the median, first of all arrange the data in ascending or descending order of
magnitude of the observations.
Individual observations: If $N$ is odd, then median = size of the $\left(\dfrac{N+1}{2}\right)$th item.
If $N$ is even, median = average of the $\left(\dfrac{N}{2}\right)$th and $\left(\dfrac{N}{2} + 1\right)$th items.
Discrete series: First, arrange the data in ascending or descending order, find the
cumulative frequencies; then the median is the size of the observation which lies in
the class having cumulative frequency just greater than $N/2$.
Continuous series: The median class is the class corresponding to the cumulative
frequency just greater than $N/2$, and the median is given by the formula
$$\text{Median} = l + \frac{\dfrac{N}{2} - c.f.}{f} \times i$$
where $l$ = lower limit of the median class, $c.f.$ = cumulative frequency of the class
preceding the median class, $f$ = frequency of the median class, and $i$ = class
interval of the median class.
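A Python sketch of the grouped-median formula; the classes and frequencies are made up, and the `>=` test is one reasonable reading of "cumulative frequency just greater than N/2":

```python
# Median of a continuous (grouped) frequency distribution.
classes = [(0, 10), (10, 20), (20, 30), (30, 40)]
freq    = [5, 8, 12, 5]

N = sum(freq)
cum = 0
for (lo, hi), f in zip(classes, freq):
    if cum + f >= N / 2:                 # first class whose c.f. reaches N/2 (median class)
        c_f, l, i = cum, lo, hi - lo     # c.f. of preceding class, lower limit, width
        median = l + (N / 2 - c_f) / f * i
        break
    cum += f

print(median)
```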
Mode:
It is that value of the variable which occurs the greatest number of times, i.e., the
value of the variable with maximum frequency.
In the case of a discrete frequency distribution, the value of the mode is determined by
the method of grouping.
In the case of a grouped or continuous frequency distribution, the mode is given by
the formula
$$\text{Mode} = l + \frac{f - f_1}{2f - f_1 - f_2} \times h$$
where $l$ = lower limit of the modal class, $h$ = width of the modal class, $f_1$ =
frequency of the class preceding the modal class, $f_2$ = frequency of the class
following the modal class, and $f$ = frequency of the modal class.
Note:
If there are two observations (or modal classes) with the same maximum
frequency, then the mode can be found by using the formula (known as the empirical
formula):
Mode $= 3\,\text{Median} - 2\,\text{Mean}$
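A matching Python sketch for the grouped-mode formula, using the same invented classes; the fallback to frequency 0 for a modal class at either end is an assumption:

```python
# Mode of a grouped frequency distribution.
classes = [(0, 10), (10, 20), (20, 30), (30, 40)]
freq    = [5, 8, 12, 5]

k = max(range(len(freq)), key=lambda j: freq[j])   # index of the modal class
l, h = classes[k][0], classes[k][1] - classes[k][0]
f  = freq[k]
f1 = freq[k - 1] if k > 0 else 0                   # class preceding the modal class
f2 = freq[k + 1] if k < len(freq) - 1 else 0       # class following the modal class

mode = l + (f - f1) / (2 * f - f1 - f2) * h
print(mode)
```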
Measures of Dispersion
Dispersion may be defined as the extent of the scatter of items around a
measure of central tendency.
Methods of measuring dispersion
The following are the methods of measuring dispersion: (i) the range; (ii) the
semi-interquartile range or quartile deviation; (iii) the mean deviation; and (iv) the
standard deviation.
Range: It is the difference between the highest and the lowest value in the series,
i.e., Range $= x_h - x_l$, where $x_h$ is the highest value and $x_l$ is the lowest value. The
coefficient of range $= \left(x_h - x_l\right)/\left(x_h + x_l\right)$.
Mean deviation
Individual series:
$$MD = \frac{1}{n}\sum_{i=1}^{n} |x_i - M|$$
where $M$ = median / mean / mode and $n$ = number of observations.
Discrete series:
$$MD = \frac{1}{N}\sum_{i=1}^{n} f_i\,|x_i - M|, \quad N = \sum_{i=1}^{n} f_i$$
Note: In general, mean deviation (MD) always stands for the mean deviation about
the median.
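A small Python example of the mean deviation about the median, with invented data:

```python
# Mean deviation about the median: MD = (1/n) Σ |x_i − M|.
x = [4, 7, 2, 9, 5]
xs = sorted(x)
n = len(xs)
median = xs[n // 2] if n % 2 else (xs[n // 2 - 1] + xs[n // 2]) / 2

md = sum(abs(v - median) for v in x) / n
print(md)
```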
Standard deviation: The arithmetic mean of the squares of the deviations of the
variable values from their actual arithmetic mean is known as the variance, and its
square root is known as the standard deviation $(\sigma)$.
Individual series:
$$\sigma^2 = \text{variance} = \frac{1}{n}\sum_{i=1}^{n}\left(x_i - \bar{x}\right)^2 = \frac{1}{n}\sum_{i=1}^{n} x_i^2 - \left(\frac{1}{n}\sum_{i=1}^{n} x_i\right)^2$$
Discrete series:
$$\sigma^2 = \text{variance} = \frac{1}{N}\sum_{i=1}^{n} f_i\left(x_i - \bar{x}\right)^2 = \frac{1}{N}\sum_{i=1}^{n} f_i x_i^2 - \left(\frac{1}{N}\sum_{i=1}^{n} f_i x_i\right)^2$$
Standard deviation:
$$\sigma = \sqrt{\frac{1}{N}\sum_{i=1}^{n} f_i x_i^2 - \left(\frac{1}{N}\sum_{i=1}^{n} f_i x_i\right)^2}$$
Note: If $d_i = \dfrac{x_i - A}{h}$, then
$$\sigma^2 = h^2\left[\frac{1}{N}\sum_{i=1}^{n} f_i d_i^2 - \left(\frac{1}{N}\sum_{i=1}^{n} f_i d_i\right)^2\right]$$
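A Python check that the direct and step-deviation forms of the variance agree; the data, assumed mean A, and width h are illustrative:

```python
# Variance and standard deviation, including the step-deviation shortcut.
values = [5, 15, 25, 35]
freq   = [4, 6, 8, 2]
N = sum(freq)

mean = sum(f * v for v, f in zip(values, freq)) / N
var_direct = sum(f * v * v for v, f in zip(values, freq)) / N - mean ** 2

A, h = 25, 10                                   # assumed mean and common class width
d = [(v - A) / h for v in values]               # d_i = (x_i − A)/h
var_step = h * h * (sum(f * di * di for di, f in zip(d, freq)) / N
                    - (sum(f * di for di, f in zip(d, freq)) / N) ** 2)

print(var_direct, var_step, var_direct ** 0.5)  # both variances agree
```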
Properties of Mean, Median, and Mode
Mean:
1 The sum of the squares of deviations from the mean is minimum, i.e., $\Sigma (X - \bar{X})^2$ is least.
2 The sum of deviations of items from their mean is equal to zero, i.e.,
$\Sigma (X - \bar{X}) = 0$.
3 If every observation is increased, decreased, multiplied, or divided by a constant,
the mean is affected in the same way.
4 The arithmetic mean computed by the short-cut method is independent of origin, i.e.,
it is not affected by the choice of the assumed mean $A$.
Median
1 The sum of the absolute values of deviations of the items from the median is
minimum.
2 It is a positional average and is not influenced by the values of extreme items.
Mode
It is not affected by the presence of extremely large or small items.
Combined standard deviation:
If there are two sets of observations containing $n_1$ and $n_2$ items with respective
means $\bar{X}_1$ and $\bar{X}_2$ and standard deviations $\sigma_1$ and $\sigma_2$, then the mean $\bar{X}_{12}$ and
standard deviation $\sigma$ of the $n_1 + n_2$ observations, taken together, are given by
$$\bar{X}_{12} = \frac{n_1 \bar{X}_1 + n_2 \bar{X}_2}{n_1 + n_2}, \qquad \sigma^2 = \frac{1}{n_1 + n_2}\left[n_1\left(\sigma_1^2 + d_1^2\right) + n_2\left(\sigma_2^2 + d_2^2\right)\right]$$
where $d_1 = \bar{X}_{12} - \bar{X}_1$ and $d_2 = \bar{X}_{12} - \bar{X}_2$.
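A short Python sketch of the combined mean and standard deviation; the group sizes, means, and SDs are made up:

```python
# Combined mean and standard deviation of two groups.
n1, m1, s1 = 10, 50.0, 4.0      # size, mean, SD of the first group
n2, m2, s2 = 15, 60.0, 5.0      # size, mean, SD of the second group

m12 = (n1 * m1 + n2 * m2) / (n1 + n2)
d1, d2 = m12 - m1, m12 - m2
var12 = (n1 * (s1**2 + d1**2) + n2 * (s2**2 + d2**2)) / (n1 + n2)

print(m12, var12 ** 0.5)
```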
Properties of Standard Deviation:
1 The standard deviation of the first $n$ natural numbers $1, 2, 3, \ldots, n$ is $\sqrt{\dfrac{n^2 - 1}{12}}$.
2 The variance, and consequently the standard deviation, of a distribution is
independent of change of origin.
3 The variance and the standard deviation are not independent of change of scale.
4 If all the values of the variable are the same, then the standard deviation is zero.
5 The standard deviation does not alter when a constant quantity $k$ is added to
or subtracted from each value of the variable of the series.
Some more properties:
6 For a symmetrical distribution,
Mean $\pm\, 0.6745\sigma$ covers $50\%$ of items
Mean $\pm\, 1\sigma$ covers $68.27\%$ of items
Mean $\pm\, 2\sigma$ covers $95.45\%$ of items
Mean $\pm\, 3\sigma$ covers $99.73\%$ of items
7 The sum of squares of deviations of items in the series from their arithmetic
mean is minimum.
8 $QD \approx \dfrac{2}{3}\sigma$, $MD \approx \dfrac{4}{5}\sigma$, and $QD \approx \dfrac{5}{6}MD$ (approximate relations for a normal distribution).
9 Coefficient of mean deviation $= \dfrac{MD}{\text{Median}} \times 100\%$.
10 Coefficient of variation $= \dfrac{\sigma}{\text{mean}} \times 100\%$.
11 A graph of cumulative frequency is sometimes called an ogive; it is obtained by
plotting the cumulative frequencies against the end points of the class
intervals.
Probability:
Probability gives us a measure of the likelihood that something will happen. However,
probability can never predict the exact number of times that an occurrence actually
happens. Being able to quantify the likely occurrence of an event is nevertheless
important, because most of the decisions that affect our daily lives are based on
likelihoods and not on absolute certainties.
Experiment: An operation which results in some well-defined outcomes is called
an experiment.
Random experiment: An experiment whose outcome cannot be predicted with
certainty is called a random experiment. In other words, if an experiment is
performed many times under similar conditions and the outcome each time is
not the same, then this experiment is called a random experiment.
Sample space: The set of all possible outcomes of a random experiment is called
the sample space for that experiment. It is usually denoted by S.
Example: When two coins are tossed,
sample space $S = \{(H, H), (H, T), (T, H), (T, T)\}$, where $H$, $T$
denote the occurrence of head and tail respectively.
Sample point or event point:
Each element of the sample space is called a sample point or an event point.
Example: When a die is thrown, the sample space is $S = \{1, 2, 3, 4, 5, 6\}$.
Here 1, 2, 3, 4, 5, and 6 are the sample points.
Trial: When an experiment is repeated under similar conditions and it does not
give the same result each time but may result in any one of the several possible
outcomes, the experiment is called a trial and the outcomes are called cases. The
number of times the experiment is repeated is called the number of trials.
Examples:
1 One toss of a coin is a trial when the coin is tossed 5 times.
2 One throw of a die is a trial when the die is thrown 4 times.
Different types of events
A subset of the sample space $S$ is called an event.
Example: When a die is thrown, the sample space is $S = \{1, 2, 3, 4, 5, 6\}$.
Let $A = \{1, 3, 5\}$; here $A$ is the event of occurrence of an odd number.
Let $B = \{5, 6\}$; here $B$ is the event of occurrence of a number greater than 4.
Simple event or elementary event: An event is called a simple event if it is a
singleton subset of the sample space S .
Mixed event or compound event or composite event: A subset of the sample
space S which contains more than one element is called a mixed event.
Example: When a die is thrown, sample space S={1,2, 3,4,5,6 }.
Let A={1,3,5 }=¿ the event of occurrence of an odd number and
B={5,6 }=¿ the event of occurrence of a number greater than 4 . Here A and
B are mixed events.
Equally likely cases (events): Cases (outcomes) are said to be equally likely
when we have no reason to believe that one is more likely to occur than
another. Thus, when an unbiased die is thrown, all six faces 1, 2, 3, 4, 5, and 6
are equally likely to come up. Similarly, when an unbiased coin is tossed, the
occurrences of head and tail are equally likely cases.
Exhaustive cases (events): For a random experiment, a set of cases (events) is
said to be exhaustive if one of them must necessarily happen every time the
experiment is performed. For example, when a die is thrown, the cases (events) 1, 2,
3, 4, 5, 6 form an exhaustive set of cases (events).
Mutually exclusive or disjoint events: Two or more events are said to be
mutually exclusive if, when one of them occurs, the others cannot occur. Thus, two or more
events are said to be mutually exclusive if no two of them can occur together.
Hence events $A_1, A_2, \ldots, A_n$ are mutually exclusive if and only if $A_i \cap A_j = \phi$
for $i \neq j$.
Independent or mutually independent events:
Two or more events are said to be independent if the occurrence or non-occurrence
of any of them does not affect the probability of occurrence or non-occurrence of
the other events.
Example: When two cards are drawn out of a full pack of 52 playing cards with
replacement (the first card drawn is put back in the pack and then the second
card is drawn), then the event of occurrence of a king in the first draw and the
event of occurrence of a king in the second draw are independent events because
the probability of drawing a king in the second draw is 4 /52 whether a king is
drawn in the first draw or not. But if the two cards are drawn without
replacement, then the two events are not independent.
Note: By the definition of independent events, it is clear that if $A$ and $B$ are
independent events, then
• $A$ and $B'$ are independent events.
• $A'$ and $B$ are independent events.
• $A'$ and $B'$ are independent events.
• Non-impossible mutually exclusive events are not independent, and non-impossible
independent events are not mutually exclusive.
Probability: Definition
Let $S$ be the sample space; then the probability of occurrence of an event $E$ is
denoted by $P(E)$ and is defined as
$$P(E) = \frac{n(E)}{n(S)} = \frac{\text{number of elements in } E}{\text{number of elements in } S} = \frac{\text{number of cases favorable to } E}{\text{total number of cases}}$$
Complement of an event: The complement of an event $E$ is denoted by $E'$ or $E^c$
or $\bar{E}$. $E'$ means non-occurrence of event $E$. Thus $E'$ occurs if and only if $E$ does
not occur. We have $P(E) + P(E') = 1$.
Odds in favor and odds against an event: Let $S$ be the sample space and $E$ be
an event. Let $E'$ denote the complement of event $E$; then
1 Odds in favor of event $E = \dfrac{n(E)}{n(E')} = \dfrac{\text{number of cases favorable to } E}{\text{number of cases against } E}$
2 Odds against event $E = \dfrac{n(E')}{n(E)} = \dfrac{\text{number of cases against } E}{\text{number of cases favorable to } E}$
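A Python illustration of the classical definition and of odds, using one throw of a fair die:

```python
from fractions import Fraction

# E = "an odd number" on one throw of a fair die.
S = {1, 2, 3, 4, 5, 6}
E = {1, 3, 5}
E_c = S - E                          # complement E'

def P(X):
    return Fraction(len(X), len(S))  # P(X) = n(X)/n(S)

print(P(E), P(E_c))                  # P(E) + P(E') = 1
print(Fraction(len(E), len(E_c)))    # odds in favor of E = n(E) : n(E')
```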
Addition Theorem of Probability:
If $A$ and $B$ are any two events in a sample space $S$, then the probability of
occurrence of at least one of the events $A$ and $B$ is given by
$$P(A \cup B) = P(A) + P(B) - P(A \cap B)$$
If $A$, $B$, and $C$ are any three events in a sample space $S$, then
$$P(A \cup B \cup C) = P(A) + P(B) + P(C) - P(A \cap B) - P(B \cap C) - P(A \cap C) + P(A \cap B \cap C)$$
If $A$ and $B$ are mutually exclusive events, then $A \cap B = \phi$ and hence
$P(A \cap B) = 0$.
$$\therefore\ P(A \cup B) = P(A) + P(B)$$
If $A$, $B$, $C$ are mutually exclusive events, then
$$P(A \cup B \cup C) = P(A) + P(B) + P(C)$$
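A quick numerical check of the addition theorem on the same die experiment:

```python
from fractions import Fraction

# Check P(A∪B) = P(A) + P(B) − P(A∩B) for one throw of a die.
S = {1, 2, 3, 4, 5, 6}
A = {1, 3, 5}                    # odd number
B = {5, 6}                       # number greater than 4

def P(X):
    return Fraction(len(X), len(S))

assert P(A | B) == P(A) + P(B) - P(A & B)
print(P(A | B))                  # 4/6 = 2/3
```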
Conditional probability:
Let $A$ and $B$ be any two events, $B \neq \phi$; then $P(A/B)$ denotes the conditional
probability of occurrence of event $A$ when $B$ has already occurred, which is given
by
$$P(A/B) = \frac{P(A \cap B)}{P(B)} \quad \text{or} \quad P(A \cap B) = P(B) \cdot P(A/B)$$
If $A$ and $B$ are independent events, then
$$P(A/B) = P(A)$$
Two events $A$ and $B$ are independent if and only if
$$P(A \cap B) = P(A) \cdot P(B)$$
If $A$, $B$ are independent events, then $A'$ and $B'$ are independent events. Hence
$$P(A' \cap B') = P(A') \cdot P(B')$$
Notes:
• $P(A \cup B) = 1 - P(A') \cdot P(B')$ [valid only when $A$ and $B$ are independent].
• If $A_1, A_2, \ldots, A_n$ are independent events, then $P(A_1 \cup A_2 \cup \cdots \cup A_n) = 1 - P(A_1') \cdot P(A_2') \cdots P(A_n')$.
• If $A$ and $B$ are independent events, then $A$ and $B'$ are independent events,
$A'$ and $B$ are independent events, and $A'$ and $B'$ are independent events.
• If $A$ and $B$ are two events such that $B \neq \phi$, then $P(A/B) + P(A'/B) = 1$.
• If $A$ and $B$ are two events such that $A \neq \phi$, then
$$P(B) = P(A) \cdot P(B/A) + P(A') \cdot P(B/A')$$
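A small Python check of the conditional-probability formula on the die events above:

```python
from fractions import Fraction

# P(A/B) = P(A∩B)/P(B) for one throw of a die.
S = {1, 2, 3, 4, 5, 6}
A = {1, 3, 5}                    # odd number
B = {5, 6}                       # number greater than 4

def P(X):
    return Fraction(len(X), len(S))

p_A_given_B = P(A & B) / P(B)
print(p_A_given_B)               # 1/2: of {5, 6}, only 5 is odd
```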

Total probability theorem:
Let $A_1, A_2, \ldots, A_n$ be mutually exclusive and exhaustive events, and let $A$ be any event of $S$.
We can write $A = (A_1 \cap A) \cup (A_2 \cap A) \cup \cdots \cup (A_n \cap A)$.
Since $A_1, A_2, \ldots, A_n$ are mutually exclusive, $(A_1 \cap A), (A_2 \cap A), \ldots, (A_n \cap A)$ are also mutually exclusive.
$$\Rightarrow P(A) = P(A_1 \cap A) + P(A_2 \cap A) + \cdots + P(A_n \cap A) = P(A_1)\,P(A/A_1) + P(A_2)\,P(A/A_2) + \cdots + P(A_n)\,P(A/A_n)$$
This is known as the total probability of event $A$. Here $P(A/A_i)$ gives the
contribution of $A_i$ in the occurrence of $A$.
Bayes' theorem
If $A_1, A_2, A_3, \ldots, A_n$ are $n$ mutually exclusive and exhaustive events and $A$ is
an event which occurs together (in conjunction) with any of the $A_i$, i.e., if
$A_1, A_2, \ldots, A_n$ form a partition of the sample space $S$ and $A$ is any event,
then
$$P(A_k / A) = \frac{P(A_k)\,P(A/A_k)}{P(A_1)\,P(A/A_1) + P(A_2)\,P(A/A_2) + \cdots + P(A_n)\,P(A/A_n)}$$
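A minimal Python sketch of the total-probability/Bayes computation; the two-urn setup and its numbers are invented for illustration:

```python
from fractions import Fraction

# Two urns chosen with equal probability; likelihood of drawing white from each.
priors = [Fraction(1, 2), Fraction(1, 2)]        # P(A1), P(A2): which urn was chosen
likelihoods = [Fraction(3, 5), Fraction(1, 4)]   # P(white/A1), P(white/A2)

# Total probability of drawing a white ball.
p_white = sum(p * l for p, l in zip(priors, likelihoods))

# Posterior P(A1/white) = P(A1)·P(white/A1) / P(white)  (Bayes' theorem).
p_A1_given_white = priors[0] * likelihoods[0] / p_white
print(p_white, p_A1_given_white)
```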
Probability distribution
Let $S$ be the sample space associated with a given random experiment. A
random variable is a real-valued function whose domain is a subset of the sample
space of the experiment. If the random variable $X$ takes the values
$x_1, x_2, \ldots, x_n$ with probabilities $p_1, p_2, \ldots, p_n$, its probability distribution is written as

x:     $x_1$   $x_2$   $x_3$   $\cdots$   $x_n$
P(x):  $p_1$   $p_2$   $p_3$   $\cdots$   $p_n$
Binomial distribution
A probability distribution representing binomial trials is said to be a binomial
distribution. Consider a binomial experiment which has been repeated $n$
times. Let the probability of success and failure in any trial be $p$ and $q$
respectively in these $n$ trials. The number of ways of choosing $r$ successes in $n$
trials is $^nC_r$, and the probability of $r$ successes and $(n - r)$ failures is $p^r \cdot q^{n-r}$. Thus the
probability of having exactly $r$ successes is $^nC_r \cdot p^r \cdot q^{n-r}$.
Let $X$ be the random variable representing the number of successes; then
$$P(X = r) = {}^nC_r \cdot p^r \cdot q^{n-r}, \quad r = 0, 1, 2, \ldots, n$$
Notes:
• Probability of at most $r$ successes in $n$ trials $= \sum_{\lambda=0}^{r} {}^nC_\lambda\, p^\lambda q^{n-\lambda}$
• Probability of at least $r$ successes in $n$ trials $= \sum_{\lambda=r}^{n} {}^nC_\lambda\, p^\lambda q^{n-\lambda}$
• Probability of having the first success at the $r$th trial $= q^{r-1} \cdot p$
Thus if $X$ is a random variable taking the values $0, 1, 2, \ldots, n$, it is said to follow a
binomial distribution (BD). Its probability distribution or probability function is given by
$$P(X = r) = {}^nC_r\, p^r q^{n-r}, \quad r = 0, 1, 2, \ldots, n; \quad p, q > 0 \text{ and } p + q = 1$$
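A Python sketch of these binomial formulas; n, p, and r below are arbitrary illustrative choices:

```python
from math import comb

n, p = 10, 0.3
q = 1 - p

def pmf(r):
    # P(X = r) = C(n, r) * p^r * q^(n−r)
    return comb(n, r) * p**r * q**(n - r)

print(pmf(4))                                   # exactly 4 successes
print(sum(pmf(r) for r in range(0, 4 + 1)))     # at most 4 successes
print(sum(pmf(r) for r in range(4, n + 1)))     # at least 4 successes
print(q**(4 - 1) * p)                           # first success at the 4th trial
```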
Poisson distribution
It is the limiting case of the BD under the following conditions:
• The number of trials is very large, i.e., $n \to \infty$, and $p \to 0$, such that
$np \to \lambda$, a finite quantity ($\lambda$ is called the parameter).
The probability of $r$ successes for the Poisson distribution is given by
$$P(X = r) = \frac{e^{-\lambda} \lambda^r}{r!}, \quad r = 0, 1, 2, \ldots$$
For the Poisson distribution the recurrence formula is given by
$$P(r + 1) = \frac{\lambda}{r + 1} P(r)$$
Notes:
• For the Poisson distribution, mean $=$ variance $= \lambda = np$.
• If $X$ and $Y$ are independent Poisson variates with parameters $\lambda_1$ and $\lambda_2$, then
$X + Y$ also has a Poisson distribution with parameter $\lambda_1 + \lambda_2$.
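A short Python check of the Poisson pmf and its recurrence formula, with an illustrative λ:

```python
from math import exp, factorial

lam = 2.5

def poisson(r):
    # P(X = r) = e^{-λ} λ^r / r!
    return exp(-lam) * lam**r / factorial(r)

p = poisson(0)                      # P(0) = e^{-λ}
for r in range(5):
    assert abs(p - poisson(r)) < 1e-12
    p = lam / (r + 1) * p           # recurrence: P(r+1) = λ/(r+1) · P(r)

print(poisson(3))
```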


Mean of BD
The mean of the BD of the variable $X$ is given by
$$\bar{X} = E(X) = \sum_{i=1}^{n} p_i x_i = \sum_{r=0}^{n} r\,P(X = r) = \sum_{r=0}^{n} r\,{}^nC_r\, p^r q^{n-r} = np$$
Variance of BD
$$\operatorname{Var}(X) = \sum_{r=0}^{n} r^2\, p(r) - \left(\sum_{r=0}^{n} r\, p(r)\right)^2 = npq$$
Notes:
• Mean $= np$, variance $= npq$
• Mean $-$ variance $= np(1 - q) > 0$ for $0 < q < 1$; thus mean $>$ variance
• $SD = \sqrt{npq}$
Mode of BD
The mode is the value of the variable $X$ which occurs with the maximum (largest)
probability.
Recurrence relation (formula) for the BD:
$$P(r + 1) = \left(\frac{n - r}{r + 1}\right)\left(\frac{p}{q}\right) P(r), \quad r \in \{0, 1, 2, \ldots, n - 1\}$$
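A numerical verification (with illustrative n and p) that the mean is np, the variance is npq, and the recurrence relation holds:

```python
from math import comb

n, p = 12, 0.4
q = 1 - p
pmf = [comb(n, r) * p**r * q**(n - r) for r in range(n + 1)]

mean = sum(r * pr for r, pr in enumerate(pmf))
var = sum(r * r * pr for r, pr in enumerate(pmf)) - mean**2
assert abs(mean - n * p) < 1e-12 and abs(var - n * p * q) < 1e-12

# Recurrence: P(r+1) = ((n−r)/(r+1)) (p/q) P(r)
for r in range(n):
    assert abs(pmf[r + 1] - (n - r) / (r + 1) * (p / q) * pmf[r]) < 1e-12
print(mean, var)
```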
