Prob_Week5_Slides
1
Summary of Facts About Conditional
Expectations
Remember!
2
Summary of Facts About Conditional
Expectations
Event B
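For reference, the standard summary (the formulas on these slides did not survive; what follows is a reconstruction of the usual textbook list, for a discrete random variable X, an event B with P(B) > 0, and a partition A_1, ..., A_n of the sample space with P(A_i) > 0):
\[
E[X \mid B] = \sum_x x\,p_{X \mid B}(x), \qquad E[g(X) \mid B] = \sum_x g(x)\,p_{X \mid B}(x),
\]
\[
E[X] = \sum_{i=1}^{n} P(A_i)\,E[X \mid A_i], \qquad
E[X \mid B] = \sum_{i=1}^{n} P(A_i \mid B)\,E[X \mid A_i \cap B], \qquad
E[X] = \sum_{y} p_Y(y)\,E[X \mid Y = y].
\]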
3
Total Expectation Theorem
• The last three equalities above apply in different situations, but are
essentially equivalent, and will be referred to collectively as the total
expectation theorem.
• They all follow from the total probability theorem and express the
fact that "the unconditional average can be obtained by averaging
the conditional averages."
• They can be used to calculate the unconditional expectation E[X] from
the conditional PMF or expectation, using a divide-and-conquer
approach (see the sketch below).
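A minimal numeric sketch of this divide-and-conquer computation (not from the slides; the partition probabilities and conditional means are hypothetical):

```python
# Total expectation theorem: E[X] = sum_i P(A_i) * E[X | A_i].
# Hypothetical partition {A_1, A_2, A_3} with conditional means of X.
p_events = [0.3, 0.5, 0.2]      # P(A_i); a partition, so these sum to 1
cond_means = [2.0, 3.0, 1.0]    # E[X | A_i]

# Unconditional mean, obtained by averaging the conditional averages:
e_x = sum(p * m for p, m in zip(p_events, cond_means))
print(e_x)  # 0.3*2 + 0.5*3 + 0.2*1 = 2.3
```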
4
Total Expectation Theorem: Example 1
5
Total Expectation Theorem: Example 2
7
Total Expectation Theorem: Example 2
8
Example 3
9
Example 3
10
Example 4: Total Expectation Theorem
• In a class, 30% of the students are British (event A_B), and 50% and 20% of the students are Turkish
(A_T) and German (A_G), respectively. British, Turkish and German students consume an average of
2, 3 and 1 cups of coffee per day, respectively.
• When asked, 50%, 40% and 30% of British, Turkish and German students, respectively, say they
also like chocolate (CL: Chocolate Lovers).
• The average daily coffee consumption of British, Turkish and German students who like (CL) and
do not like (no CL) chocolate is given in the table below:

                 CL    no CL   Total
  A_B (British)   3     1       2
  A_T (Turkish)   4.5   2       3
  A_G (German)    1.7   0.7     1
A) What is the expected value of daily coffee consumption per person for the students in this class?
B) What is the percentage of chocolate lovers (CL) in this class?
C) What is the expected value of daily coffee consumption per person for the chocolate lovers?
11
Example 4: Total Expectation Theorem
[Solution slide for part A: the total expectation theorem applied over the events A_B, A_T, A_G.]
12
Example 4: Total Expectation Theorem
[Solution slide for parts B and C: the total probability theorem and Bayes' rule applied with the events A_B, A_T, A_G and the event CL.]
Remember! «B» in the recalled Bayes' rule formula is the event «CL» here.
13
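A worked reconstruction of the three answers (a sketch consistent with the data above, not necessarily the slides' own derivation):

A) By the total expectation theorem over the partition {A_B, A_T, A_G}:
\[
E[X] = 0.3 \cdot 2 + 0.5 \cdot 3 + 0.2 \cdot 1 = 2.3 \text{ cups per day.}
\]
B) By the total probability theorem:
\[
P(CL) = 0.3 \cdot 0.5 + 0.5 \cdot 0.4 + 0.2 \cdot 0.3 = 0.41,
\]
so 41% of the class are chocolate lovers.

C) By Bayes' rule, $P(A_B \mid CL) = 0.15/0.41$, $P(A_T \mid CL) = 0.20/0.41$, $P(A_G \mid CL) = 0.06/0.41$, and the total expectation theorem conditioned on CL gives
\[
E[X \mid CL] = \frac{0.15 \cdot 3 + 0.20 \cdot 4.5 + 0.06 \cdot 1.7}{0.41} = \frac{1.452}{0.41} \approx 3.54 \text{ cups per day.}
\]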
Independence
• We now discuss concepts of independence related to random
variables.
• These are analogous to the concepts of independence between
events.
• They are developed by simply introducing suitable events involving
the possible values of various random variables, and by considering
the independence of these events.
• We will start the discussion with the independence of a random
variable from an event.
14
Independence
15
Independence
16
Independence
• Consider the random variable X that takes the value 0 if the first toss is a head, and the
value 1 if the first toss is a tail (the setting is still two independent tosses of a fair coin).
• It can be shown that this random variable X is independent of the event considered in the
previous slide, namely that the number of heads in the two tosses is even.
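A brute-force check of this independence claim by enumerating the four equally likely outcomes (a sketch; the variable names are mine):

```python
from itertools import product

outcomes = list(product("HT", repeat=2))   # HH, HT, TH, TT, each with probability 1/4
event_a = [o for o in outcomes if o.count("H") % 2 == 0]  # number of heads is even

def x(outcome):
    # X = 0 if the first toss is a head, 1 if it is a tail
    return 0 if outcome[0] == "H" else 1

# Outcomes are equally likely, so probabilities reduce to counting.
p_x0 = sum(1 for o in outcomes if x(o) == 0) / len(outcomes)
p_x0_given_a = sum(1 for o in event_a if x(o) == 0) / len(event_a)
print(p_x0, p_x0_given_a)  # 0.5 and 0.5: P(X = 0 | A) = P(X = 0), as claimed
```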
17
Independence of Random Variables
18
Conditional Independence of Random Variables
19
Conditional Independence of Random Variables
20
Independence of Random Variables
21
Independence of Random Variables
22
Independence of Random Variables
In the previous slide,
23
Independence of Random Variables
24
Independence of Random Variables: Summary
25
Independence of Random Variables: Summary
26
Independence of Several Random Variables
The preceding discussion extends naturally to the case of more than
two random variables. For example, three random variables X, Y and Z
are said to be independent if
\[
p_{X,Y,Z}(x, y, z) = p_X(x)\,p_Y(y)\,p_Z(z) \qquad \text{for all } x, y, z.
\]
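A small sketch that checks this factorization condition numerically for a hypothetical joint PMF (the marginals are recovered by summing the joint, then every triple is tested):

```python
from itertools import product

# Hypothetical joint PMF of three binary random variables X, Y, Z,
# constructed here so that the factorization condition holds.
joint = {(x, y, z): 0.5 * (0.25, 0.75)[y] * (0.9, 0.1)[z]
         for x, y, z in product((0, 1), repeat=3)}

# Recover the marginal PMFs by summing out the other variables.
p_x = {x: sum(v for k, v in joint.items() if k[0] == x) for x in (0, 1)}
p_y = {y: sum(v for k, v in joint.items() if k[1] == y) for y in (0, 1)}
p_z = {z: sum(v for k, v in joint.items() if k[2] == z) for z in (0, 1)}

# Independence: p_{X,Y,Z}(x,y,z) = p_X(x) p_Y(y) p_Z(z) for all x, y, z.
print(all(abs(joint[x, y, z] - p_x[x] * p_y[y] * p_z[z]) < 1e-12
          for x, y, z in joint))  # True
```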
27
Independence of Several Random Variables
28
Variance of the Binomial and the Poisson
29
Variance of the Binomial and the Poisson
30
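For reference, the standard results (presumably the ones derived on these two slides, using the independence of the Bernoulli indicators):
\[
X = X_1 + \cdots + X_n \text{ with independent } X_i \sim \text{Bernoulli}(p)
\;\Rightarrow\; \operatorname{var}(X) = \sum_{i=1}^{n} \operatorname{var}(X_i) = np(1-p),
\]
and for a Poisson random variable with parameter $\lambda$ (the $n \to \infty$, $np = \lambda$ limit of the binomial):
\[
E[X] = \operatorname{var}(X) = \lambda.
\]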
Mean and Variance of the Sample Mean - I
31
Mean and Variance of the Sample Mean - II
32
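For reference, the standard statement (presumably the content of these two slides): if $X_1, \dots, X_n$ are i.i.d. with mean $\mu$ and variance $\sigma^2$, the sample mean
\[
M_n = \frac{X_1 + \cdots + X_n}{n}
\]
satisfies
\[
E[M_n] = \mu, \qquad
\operatorname{var}(M_n) = \frac{\operatorname{var}(X_1 + \cdots + X_n)}{n^2} = \frac{n\sigma^2}{n^2} = \frac{\sigma^2}{n},
\]
where the variance step uses the independence of the $X_i$.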
Continuous Random Variables
• Random variables with a continuous range of possible values are quite
common: The velocity of a vehicle traveling along the highway could be
one example.
• Models involving continuous random variables can be useful for several
reasons:
Besides being finer-grained (detailed) and possibly more accurate, they
allow the use of powerful tools from calculus and often admit an insightful
analysis that would not be possible under a discrete model.
• All of the concepts and methods introduced for discrete random variables,
such as expectation, PMFs, and conditioning, have continuous
counterparts.
• Developing and interpreting these counterparts is the subject of this
chapter.
33
Continuous Random Variables
34
Continuous Random Variables
35
Continuous Random Variables
36
Probability Density Functions
37
Probability Density Functions
38
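For reference, the standard definition (presumably what these slides presented): a random variable $X$ is continuous if there is a nonnegative function $f_X$, the probability density function (PDF) of $X$, such that
\[
P(a \le X \le b) = \int_a^b f_X(x)\,dx \quad \text{for all } a \le b,
\qquad \int_{-\infty}^{\infty} f_X(x)\,dx = 1.
\]
In particular, $P(X = a) = 0$ for every single value $a$, so including or excluding the endpoints of an interval does not change its probability.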
Continuous Uniform Random Variable
39
Continuous Uniform Random Variable
40
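For reference, the standard uniform PDF on an interval $[a, b]$ (presumably the definition given on these slides):
\[
f_X(x) = \begin{cases} \dfrac{1}{b-a}, & a \le x \le b, \\[4pt] 0, & \text{otherwise}. \end{cases}
\]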
Piecewise Constant PDF
41
Piecewise Constant PDF
42
Piecewise Constant PDF
43
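For reference, the general form (presumably the one developed over these slides): a piecewise constant PDF takes a constant value $c_i$ on each of finitely many intervals,
\[
f_X(x) = \begin{cases} c_i, & a_{i-1} \le x < a_i,\ i = 1, \dots, n, \\ 0, & \text{otherwise}, \end{cases}
\qquad \text{with } c_i \ge 0 \text{ and } \sum_{i=1}^{n} c_i\,(a_i - a_{i-1}) = 1.
\]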
Probability Density Functions: Summary
44
Probability Density Functions: Summary
45
Expectation
• The expected value or expectation or mean of a continuous random
variable X with PDF f_X is defined by
\[
E[X] = \int_{-\infty}^{\infty} x\,f_X(x)\,dx.
\]
This is similar to the discrete case, except that the PMF is replaced by
the PDF and summation is replaced by integration.
• As in the discrete case, E[X] can be interpreted as the "center of gravity"
of the PDF and, also, as the anticipated average value of X in a large
number of independent repetitions of the experiment.
• Its mathematical properties are similar to the discrete case - after all,
an integral is just a limiting form of a sum.
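For instance, linearity carries over verbatim from the discrete case (a standard property, stated here for reference):
\[
E[aX + b] = a\,E[X] + b, \qquad \operatorname{var}(aX + b) = a^2\operatorname{var}(X).
\]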
46
Expectation
• If X is a continuous random variable with a given PDF, any real-valued
function Y = g(X) of X is also a random variable.
• Note that Y can be a continuous random variable: for example,
consider the trivial case where Y = g(X) = X.
• But Y can also turn out to be discrete. For example, suppose that
g(x) = 1 for x > 0, and g(x) = 0 otherwise. Then Y = g(X) is
a discrete random variable taking values in the finite set {0, 1}.
• In either case, the mean of g(X) satisfies the expected value rule
\[
E[g(X)] = \int_{-\infty}^{\infty} g(x)\,f_X(x)\,dx.
\]
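A quick numeric sketch of the expected value rule for a hypothetical case, $X$ uniform on $[0, 2]$ and $g(x) = x^2$, where the rule gives $\int_0^2 x^2 \cdot \tfrac{1}{2}\,dx = 4/3$:

```python
import random

# Monte Carlo estimate of E[g(X)] for X ~ Uniform(0, 2) and g(x) = x**2.
# The expected value rule gives the exact answer 4/3.
n = 10**6
mc = sum(random.uniform(0.0, 2.0) ** 2 for _ in range(n)) / n
print(mc)  # close to 1.3333
```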
48
Uniform Random Variable: Mean
50
Uniform Random Variable: Variance
51
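For reference, the standard results (presumably what these two slides computed) for $X$ uniform on $[a, b]$:
\[
E[X] = \int_a^b \frac{x}{b-a}\,dx = \frac{a+b}{2}, \qquad
\operatorname{var}(X) = \int_a^b \frac{\left(x - \frac{a+b}{2}\right)^2}{b-a}\,dx = \frac{(b-a)^2}{12}.
\]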
Exponential Random Variable
52
Exponential Random Variable
An exponential random variable can, for example, be a good model for
the amount of time until an incident of interest takes place, such as
• a message arriving at a computer,
• some equipment breaking down,
• a lightbulb burning out,
• an accident occurring, etc.
It is closely connected with the geometric random variable.
The mean and variance can be calculated to be:
\[
E[X] = \frac{1}{\lambda}, \qquad \operatorname{var}(X) = \frac{1}{\lambda^2}.
\]
53
Exponential Random Variable: Mean
\[
E[X] = \int_0^{\infty} x\,\lambda e^{-\lambda x}\,dx.
\]
Here, $u = x$ and $dv = \lambda e^{-\lambda x}\,dx$; therefore $du = dx$ and $v = -e^{-\lambda x}$, and integration by parts gives
\[
E[X] = \left[-x e^{-\lambda x}\right]_0^{\infty} + \int_0^{\infty} e^{-\lambda x}\,dx
= 0 + \frac{1}{\lambda} = \frac{1}{\lambda}.
\]
54
Exponential Random Variable: Variance
Recall the integration by parts formula:
\[
\int u\,dv = uv - \int v\,du.
\]
Here, $u = x^2$ and $dv = \lambda e^{-\lambda x}\,dx$; therefore $du = 2x\,dx$ and $v = -e^{-\lambda x}$, so
\[
E[X^2] = \int_0^{\infty} x^2\,\lambda e^{-\lambda x}\,dx
= \left[-x^2 e^{-\lambda x}\right]_0^{\infty} + \int_0^{\infty} 2x\,e^{-\lambda x}\,dx
= \frac{2}{\lambda}\,E[X] = \frac{2}{\lambda^2},
\]
and therefore
\[
\operatorname{var}(X) = E[X^2] - \left(E[X]\right)^2 = \frac{2}{\lambda^2} - \frac{1}{\lambda^2} = \frac{1}{\lambda^2}.
\]
55
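A Monte Carlo sanity check of the two derivations above, with a hypothetical rate $\lambda = 2$ (so the mean should be $1/\lambda = 0.5$ and the variance $1/\lambda^2 = 0.25$):

```python
import random

# Sample an exponential random variable with rate lam and compare the
# empirical mean and variance against 1/lam and 1/lam**2.
lam = 2.0
n = 10**6
xs = [random.expovariate(lam) for _ in range(n)]
mean = sum(xs) / n
var = sum((x - mean) ** 2 for x in xs) / n
print(mean, var)  # close to 0.5 and 0.25
```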
Exponential Random Variable: Example
(using the exponential PDF defined four slides earlier)
56