Prob_Week5_Slides

Introduction to Probability: Lecture Notes

Conditional Expectation

• A conditional PMF can be thought of as an ordinary PMF over a new
universe determined by the conditioning event.
• In the same spirit, a conditional expectation is the same as an
ordinary expectation, except that it refers to the new universe:
all probabilities and PMFs are replaced by their conditional
counterparts.
• Conditional variances can be treated similarly.
• We list the main definitions and relevant facts on the next slide:

Summary of Facts About Conditional Expectations

[Figure: a partition A_1, A_2, A_3 of the sample space, together with an event B.]
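In outline, for a discrete random variable X, an event A with P(A) > 0, and a partition A_1, ..., A_n of the sample space with P(A_i) > 0 for all i:

E[X | A] = \sum_x x \, p_{X|A}(x)

E[g(X) | A] = \sum_x g(x) \, p_{X|A}(x)

E[X | Y = y] = \sum_x x \, p_{X|Y}(x | y)

E[X] = \sum_{i=1}^{n} P(A_i) \, E[X | A_i]

E[X | B] = \sum_{i=1}^{n} P(A_i | B) \, E[X | A_i \cap B]    (for an event B with P(A_i \cap B) > 0 for all i)

E[X] = \sum_y p_Y(y) \, E[X | Y = y]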
Total Expectation Theorem

• The last three equalities above apply in different situations, but are
essentially equivalent, and will be referred to collectively as the total
expectation theorem.
• They all follow from the total probability theorem and express the
fact that "the unconditional average can be obtained by averaging
the conditional averages."
• They can be used to calculate the unconditional expectation E[X] from
the conditional PMF or expectation, using a divide-and-conquer
approach.

Total Expectation Theorem: Example 1

Total Expectation Theorem: Example 2

We'll try something different.
Example 3
Example 4: Total Expectation Theorem
• In a class, 30% of the students are British (event A_B), while 50% and 20% of the students are
Turkish (A_T) and German (A_G), respectively. British, Turkish and German students consume an
average of 2, 3 and 1 cups of coffee per day, respectively.
• When asked, 50%, 40% and 30% of British, Turkish and German students, respectively, say they
also like chocolate (CL: Chocolate Lovers).
• The average daily coffee consumption of British, Turkish and German students, who like (CL) and
do not like (no CL) chocolate, is given in the table below:

         CL      no CL    Total
  A_B    3       1        2
  A_T    4.5     2        3
  A_G    1.7     0.7      1

[Figure: a partition A_B, A_T, A_G of the class, together with the event CL.]

A) What is the expected value of daily coffee consumption per person for the students in this class?
B) What is the percentage of chocolate lovers in this class?
C) What is the expected value of daily coffee consumption per person for the chocolate lovers?
Example 4: Total Expectation Theorem

A) Using the total expectation theorem, we can calculate the mean daily coffee
consumption per person as

E[X] = P(A_B) E[X | A_B] + P(A_T) E[X | A_T] + P(A_G) E[X | A_G]
     = 0.3 \cdot 2 + 0.5 \cdot 3 + 0.2 \cdot 1 = 2.3 cups per day.

B) Using the total probability theorem, we can calculate the probability of the
event CL (simply the percentage of chocolate lovers) as

P(CL) = P(A_B) P(CL | A_B) + P(A_T) P(CL | A_T) + P(A_G) P(CL | A_G)
      = 0.3 \cdot 0.5 + 0.5 \cdot 0.4 + 0.2 \cdot 0.3 = 0.41, i.e. 41%.
Example 4: Total Expectation Theorem

C) Using Bayes' theorem, we can calculate the following probabilities:

P(A_B | CL) = P(A_B) P(CL | A_B) / P(CL) = (0.3 \cdot 0.5) / 0.41 ≈ 0.366

Similarly, we can obtain P(A_T | CL) ≈ 0.488 and P(A_G | CL) ≈ 0.146.
Finally, we can obtain E[X | CL] using the total expectation theorem as follows:

E[X | CL] = P(A_B | CL) E[X | A_B \cap CL] + P(A_T | CL) E[X | A_T \cap CL] + P(A_G | CL) E[X | A_G \cap CL]
          ≈ 0.366 \cdot 3 + 0.488 \cdot 4.5 + 0.146 \cdot 1.7 ≈ 3.54 cups per day.

Remember: in Bayes' theorem as stated earlier, the generic event «B» plays the role of «CL» here!
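As a sanity check, here is a minimal simulation sketch of this example in Python. The group proportions, chocolate-lover percentages and conditional means are the ones given above; each simulated student contributes the conditional mean consumption of their group, which is enough to reproduce the three expectations:

import random

random.seed(0)

# P(CL | nationality), from the problem statement
p_cl = {"B": 0.50, "T": 0.40, "G": 0.30}
# mean cups/day given (nationality, likes chocolate), from the table
mean_cups = {("B", True): 3.0, ("B", False): 1.0,
             ("T", True): 4.5, ("T", False): 2.0,
             ("G", True): 1.7, ("G", False): 0.7}

n = 1_000_000
n_cl = 0
cups_all = cups_cl = 0.0
for _ in range(n):
    g = random.choices(["B", "T", "G"], weights=[0.30, 0.50, 0.20])[0]
    likes = random.random() < p_cl[g]
    c = mean_cups[(g, likes)]     # expected consumption for this student
    cups_all += c
    if likes:
        n_cl += 1
        cups_cl += c

print(cups_all / n)     # E[X]      -> about 2.3
print(n_cl / n)         # P(CL)     -> about 0.41
print(cups_cl / n_cl)   # E[X | CL] -> about 3.54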
Independence
• We now discuss concepts of independence related to random
variables.
• These are analogous to the concepts of independence between
events.
• They are developed by simply introducing suitable events involving
the possible values of various random variables, and by considering
the independence of these events.
• We will start the discussion with the independence of a random
variable from an event.

Independence
• Consider the random variable that takes the value 0 if the first toss is a head, and the
value 1 if the first toss is a tail (there are still two independent tosses of a fair coin).
• It can be shown that this random variable is independent of the event that the number
of heads is even, as the enumeration below confirms:
Independence of Random Variables

Conditional Independence of Random Variables
Independence of Random Variables

• In conclusion, the variance of the sum of two independent random
variables is equal to the sum of their variances.
• For an interesting comparison, note that the mean of the sum of two
random variables is always equal to the sum of their means even if
they are not independent.
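A quick numerical illustration, as a minimal NumPy sketch (the two dice below are an arbitrary choice of independent random variables):

import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.integers(1, 7, n)   # a fair die
y = rng.integers(1, 7, n)   # a second, independent fair die

print(np.var(x + y), np.var(x) + np.var(y))     # approximately equal
print(np.mean(x + y), np.mean(x) + np.mean(y))  # means add regardless

z = x                        # a fully dependent "copy" of x
print(np.var(x + z), np.var(x) + np.var(z))     # 4*var(x) vs 2*var(x): not equal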

Independence of Random Variables: Summary
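In outline, discrete random variables X and Y are independent if

p_{X,Y}(x, y) = p_X(x) \, p_Y(y)    for all x, y,

in which case:

E[XY] = E[X] \, E[Y]

E[g(X) h(Y)] = E[g(X)] \, E[h(Y)]

var(X + Y) = var(X) + var(Y)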
Independence of Several Random Variables
The preceding discussion extends naturally to the case of more than
two random variables. For example, three random variables X, Y, and Z
are said to be independent if

p_{X,Y,Z}(x, y, z) = p_X(x) \, p_Y(y) \, p_Z(z)    for all x, y, z.
Independence of Several Random Variables

Variance of the Binomial and the Poisson
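In brief: a binomial random variable X with parameters n and p is a sum X = X_1 + ... + X_n of independent Bernoulli(p) random variables, each with variance p(1 - p), so the variance-of-sums rule above gives

var(X) = \sum_{i=1}^{n} var(X_i) = n p (1 - p).

For a Poisson random variable X with parameter \lambda, using E[X] = \lambda and E[X^2] = \lambda^2 + \lambda,

var(X) = E[X^2] - (E[X])^2 = \lambda.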
Mean and Variance of the Sample Mean
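In brief, for independent, identically distributed random variables X_1, ..., X_n, each with mean \mu and variance \sigma^2, the sample mean is M_n = (X_1 + ... + X_n)/n, and

E[M_n] = \frac{1}{n} \sum_{i=1}^{n} E[X_i] = \mu,

var(M_n) = \frac{1}{n^2} \sum_{i=1}^{n} var(X_i) = \frac{\sigma^2}{n},

so the variability of the sample mean shrinks as the number of observations grows.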

Continuous Random Variables
• Random variables with a continuous range of possible values are quite
common: the velocity of a vehicle traveling along the highway is one
example.
• Models involving continuous random variables can be useful for several
reasons:
Besides being finer-grained (detailed) and possibly more accurate, they
allow the use of powerful tools from calculus and often admit an insightful
analysis that would not be possible under a discrete model.
• All of the concepts and methods introduced for discrete random variables,
such as expectation, PMFs, and conditioning, have continuous
counterparts.
• Developing and interpreting these counterparts is the subject of this
chapter.

Probability Density Functions
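In brief, a random variable X is continuous if there is a nonnegative function f_X, called the probability density function (PDF) of X, such that

P(a \le X \le b) = \int_a^b f_X(x) \, dx

for every interval [a, b], with the normalization \int_{-\infty}^{\infty} f_X(x) \, dx = 1. Any single point has zero probability, so including or excluding interval endpoints does not change probabilities.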
Continuous Uniform Random Variable
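For reference, a continuous uniform random variable on [a, b] has PDF

f_X(x) = \frac{1}{b - a} for a \le x \le b, and f_X(x) = 0 otherwise,

so that subintervals of [a, b] of equal length are equally likely.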
Piecewise Constant PDF
Probability Density Functions: Summary
Expectation
• The expected value or expectation or mean of a continuous random
variable X with PDF f_X is defined by

E[X] = \int_{-\infty}^{\infty} x f_X(x) \, dx.

This is similar to the discrete case, except that the PMF is replaced by
the PDF, and summation is replaced by integration.
• As in the discrete case, E[X] can be interpreted as the "center of gravity"
of the PDF and, also, as the anticipated average value of X in a large
number of independent repetitions of the experiment.
• Its mathematical properties are similar to the discrete case - after all,
an integral is just a limiting form of a sum.
Expectation
• If X is a continuous random variable with a given PDF, any real-valued
function Y = g(X) of X is also a random variable.
• Note that Y can be a continuous random variable: for example,
consider the trivial case where Y = g(X) = X.
• But Y can also turn out to be discrete. For example, suppose that
g(x) = 1 for x > 0, and g(x) = 0 otherwise. Then Y = g(X) is
a discrete random variable taking values in the finite set {0, 1}.
• In either case, the mean of g(X) satisfies the expected value rule

E[g(X)] = \int_{-\infty}^{\infty} g(x) f_X(x) \, dx,

in complete analogy with the discrete case.


Expectation
• The nth moment of a continuous random variable X is defined as
E[X^n], the expected value of the random variable X^n.
• The variance, denoted by var(X), is defined as the expected
value of the random variable (X - E[X])^2.
• Below, we summarize this discussion and list a
number of additional facts that are practically identical to their
discrete counterparts:
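In outline, for a continuous random variable X with PDF f_X and constants a, b:

E[g(X)] = \int_{-\infty}^{\infty} g(x) f_X(x) \, dx

var(X) = E[(X - E[X])^2] = E[X^2] - (E[X])^2

E[aX + b] = a E[X] + b,    var(aX + b) = a^2 \, var(X)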

Uniform Random Variable: Mean
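The standard computation, for X uniform on [a, b]:

E[X] = \int_a^b x \cdot \frac{1}{b-a} \, dx = \frac{b^2 - a^2}{2(b-a)} = \frac{a+b}{2},

the midpoint of the interval, as symmetry suggests.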

Uniform Random Variable: Variance
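Similarly, for the second moment and the variance:

E[X^2] = \int_a^b \frac{x^2}{b-a} \, dx = \frac{b^3 - a^3}{3(b-a)} = \frac{a^2 + ab + b^2}{3},

var(X) = E[X^2] - (E[X])^2 = \frac{a^2 + ab + b^2}{3} - \frac{(a+b)^2}{4} = \frac{(b-a)^2}{12}.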

Exponential Random Variable
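For reference, an exponential random variable X with parameter \lambda > 0 has PDF

f_X(x) = \lambda e^{-\lambda x} for x \ge 0, and f_X(x) = 0 otherwise,

so that P(X \ge a) = e^{-\lambda a} for every a \ge 0.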

An exponential random variable can, for example, be a good model for
the amount of time until an incident of interest takes place, such as
• a message arriving at a computer,
• some equipment breaking down,
• a lightbulb burning out,
• an accident occurring, etc.
It is closely connected with the geometric random variable.
The mean and variance can be calculated to be

E[X] = \frac{1}{\lambda},    var(X) = \frac{1}{\lambda^2}.
Exponential Random Variable: Mean

Remember the integration by parts formula:

\int u \, dv = uv - \int v \, du

Here, u = x and dv = \lambda e^{-\lambda x} dx.
Therefore, du = dx and v = -e^{-\lambda x}.
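Carrying out the integration over x \ge 0, where the PDF is nonzero:

E[X] = \int_0^\infty x \lambda e^{-\lambda x} \, dx
     = \left[ -x e^{-\lambda x} \right]_0^\infty + \int_0^\infty e^{-\lambda x} \, dx
     = 0 + \left[ -\frac{e^{-\lambda x}}{\lambda} \right]_0^\infty
     = \frac{1}{\lambda}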

Exponential Random Variable: Variance
Recall the integration by parts formula:

\int u \, dv = uv - \int v \, du

Here, u = x^2 and dv = \lambda e^{-\lambda x} dx.
Therefore, du = 2x \, dx and v = -e^{-\lambda x}.
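Carrying out the integration and subtracting the squared mean:

E[X^2] = \int_0^\infty x^2 \lambda e^{-\lambda x} \, dx
       = \left[ -x^2 e^{-\lambda x} \right]_0^\infty + \int_0^\infty 2x e^{-\lambda x} \, dx
       = 0 + \frac{2}{\lambda} \int_0^\infty x \lambda e^{-\lambda x} \, dx
       = \frac{2}{\lambda} \cdot \frac{1}{\lambda} = \frac{2}{\lambda^2}

var(X) = E[X^2] - (E[X])^2 = \frac{2}{\lambda^2} - \frac{1}{\lambda^2} = \frac{1}{\lambda^2}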

Exponential Random Variable: Example

