
Probability and Stochastic Processes

A Friendly Introduction for Electrical and Computer Engineers

Chapter 1 Viewgraphs

1
Set Theory Preliminaries

• Venn Diagrams

• Universal Set/Empty Set

• Union/Intersection

• Complement

• Mutually Exclusive/Collectively Exhaustive

2
What is Probability?

• A number between 0 and 1.

• A physical property (like mass or volume) that can be measured?

• A measure of our knowledge?

3
Experiments

• Procedure + Observations

• Real Experiments are TOO complicated

• Instead we analyze/develop models of experiments

– A coin flip is equally likely to be H or T

4
Example 1.1
An experiment consists of the following procedure, observation and model:

• Procedure: Flip a coin and let it land on a table.

• Observation: Observe which side (head or tail) faces you after the coin
lands.

• Model: Heads and tails are equally likely. The result of each flip is
unrelated to the results of previous flips.

5
Definition Outcome, Sample Space
An outcome of an experiment is any possible observation of that experiment.
The sample space of an experiment is the finest-grain, mutually exclusive,
collectively exhaustive set of all possible outcomes. An event is a set of
outcomes of an experiment.

6
Correspondences
Set Algebra       Probability
set               event
universal set     sample space
element           outcome

7
Example 1.9
Flip four coins, a penny, a nickel, a dime, and a quarter. Examine the coins in
order (penny, then nickel, then dime, then quarter) and observe whether each
coin shows a head (h) or a tail (t). What is the sample space? How many
elements are in the sample space?
......................................................................
The sample space consists of 16 four-letter words:

{tttt, ttth, ttht, . . . , hhhh}
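
A minimal sketch (not part of the viewgraphs) that enumerates this sample space in Python:

    from itertools import product

    # All 16 ways the penny, nickel, dime, and quarter can land (t or h each).
    sample_space = [''.join(flips) for flips in product('th', repeat=4)]
    print(len(sample_space))    # 16
    print(sample_space[:3])     # ['tttt', 'ttth', 'ttht']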

8
Event Spaces

• An event space is a collectively exhaustive, mutually exclusive set of events.

• Example 1.10: For i = 0, 1, 2, 3, 4,

Bi = {outcomes with i heads}

• Each Bi is an event containing one or more outcomes.

• The set B = {B0 , B1 , B2 , B3 , B4 } is an event space.
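
A short illustration (my own sketch) of this event space, grouping the 16 outcomes of Example 1.9 by the number of heads:

    from itertools import product

    # Partition the 16 four-coin outcomes by the number of heads: B0, ..., B4.
    outcomes = [''.join(flips) for flips in product('th', repeat=4)]
    B = {i: {w for w in outcomes if w.count('h') == i} for i in range(5)}
    print([len(B[i]) for i in range(5)])    # [1, 4, 6, 4, 1], and 1+4+6+4+1 = 16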

9
Theorem 1.2

• For an event space B = {B1 , B2 , . . .} and any event A, let Ci = A ∩ Bi .

• For i ≠ j, the events Ci and Cj are mutually exclusive: Ci ∩ Cj = ∅.


A = C1 ∪ C2 ∪ · · ·

10
Axioms of Probability
A probability measure P [·] is a function that maps events in the sample space
to real numbers such that

Axiom 1 For any event A, P [A] ≥ 0.

Axiom 2 P [S] = 1.

Axiom 3 For any countable collection A1 , A2 , . . . of mutually exclusive events,

P [A1 ∪ A2 ∪ · · ·] = P [A1 ] + P [A2 ] + · · ·

11
Consequences of the Axioms
Theorem 1.4: If

B = B1 ∪ B2 ∪ · · · ∪ Bm

and for i ≠ j,

Bi ∩ Bj = ∅

then

P [B] = Σ_{i=1}^{m} P [Bi]

12
Problem 1.3.5
A student’s score on a 10-point quiz is equally likely to be any integer
between 0 and 10. What is the probability of an A, which requires the student
to get a score of 9 or more? What is the probability the student gets an F by
getting less than 4?
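
One way to work this out under the stated model (11 equally likely scores), sketched in Python:

    from fractions import Fraction

    # Each score 0, 1, ..., 10 has probability 1/11 under the equally-likely model.
    p = Fraction(1, 11)
    P_A = sum(p for score in range(11) if score >= 9)   # A: score of 9 or more
    P_F = sum(p for score in range(11) if score < 4)    # F: score below 4
    print(P_A, P_F)                                     # 2/11 4/11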

13
Consequences of the Axioms
Theorem 1.7: The probability measure P [·] satisfies

• P [∅] = 0.

• P [Ac ] = 1 − P [A].

• For any A and B,

P [A ∪ B] = P [A] + P [B] − P [A ∩ B]

• If A ⊂ B, then P [A] ≤ P [B].
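
A quick numerical check of the union rule (my own example, not from the slides), using one roll of a fair die with A = even outcomes and B = {4, 5, 6}:

    from fractions import Fraction

    # P[A ∪ B] = P[A] + P[B] - P[A ∩ B] on a fair six-sided die.
    S = {1, 2, 3, 4, 5, 6}

    def P(E):
        return Fraction(len(E), len(S))

    A, B = {2, 4, 6}, {4, 5, 6}
    assert P(A | B) == P(A) + P(B) - P(A & B)   # 2/3 == 1/2 + 1/2 - 1/3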

14
Problem 1.4.5
A cellphone is equally likely to make zero handoffs (H0 ), one handoff (H1 ),
or more than one handoff (H2 ). Also, a caller is on foot (F ) with probability
5/12 or in a vehicle (V ).

• Find three ways to fill in the following probability table:

         H0     H1     H2
    F
    V

• If 1/4 of all callers are on foot making calls with no handoffs and 1/6 of all
callers are vehicle users making calls with a single handoff, what is the table?
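
For the second part, a sketch of one consistent fill, assuming the handoff events are equally likely (probability 1/3 each) and P [F] = 5/12, so P [V] = 7/12; the variable names are just illustrative:

    from fractions import Fraction as F

    # Known marginals and two given joint cells; the rest follow by subtraction.
    P_F, P_V, P_H = F(5, 12), F(7, 12), F(1, 3)
    FH0, VH1 = F(1, 4), F(1, 6)     # given: P[F H0] and P[V H1]
    VH0 = P_H - FH0                 # column H0 sums to 1/3  -> 1/12
    FH1 = P_H - VH1                 # column H1 sums to 1/3  -> 1/6
    FH2 = P_F - FH0 - FH1           # row F sums to 5/12     -> 0
    VH2 = P_H - FH2                 # column H2 sums to 1/3  -> 1/3
    print(FH0, FH1, FH2)            # 1/4 1/6 0
    print(VH0, VH1, VH2)            # 1/12 1/6 1/3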

15
Conditioning

• P [A] = our knowledge of the likelihood of A

• P [A] = a priori probability

• Suppose we cannot completely observe an experiment

– We learn that event B occurred


– We do not learn the precise outcome

16
Conditional Probability Definition

• Learning B occurred changes P [A]

• The conditional probability of A given the occurrence of B is

P [A|B] = P [AB] / P [B]

17
Problem 1.5.6
For deer ticks in the Midwest,

• 16% carried Lyme disease (event L)

• 10% had HGE (event H)

• 10% of the ticks that had either Lyme or HGE carried both

Find P [LH] and then P [H|L].
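
One way to set this up (a sketch, reading the third bullet as P [LH] = 0.10 · P [L ∪ H] and using P [L ∪ H] = P [L] + P [H] − P [LH]):

    # Solve P[LH] = 0.10 * (P[L] + P[H] - P[LH]) for P[LH], then condition on L.
    P_L, P_H = 0.16, 0.10
    P_LH = 0.10 * (P_L + P_H) / 1.10    # ≈ 0.0236
    P_H_given_L = P_LH / P_L            # ≈ 0.1477
    print(round(P_LH, 4), round(P_H_given_L, 4))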

18
Law of Total Probability

• If B1 , B2 , . . . , Bm is an event space and P [Bi] > 0 for i = 1, . . . , m, then

P [A] = Σ_{i=1}^{m} P [A|Bi] P [Bi]

19
Bayes’ Theorem


P [B|A] = P [A|B] P [B] / P [A]

• For an event space B1 , B2 , . . . , Bm ,

P [Bi|A] = P [A|Bi] P [Bi] / Σ_{i=1}^{m} P [A|Bi] P [Bi]

20
Sequential Experiments - Example

• Two coins, one biased, one fair, but you don’t know which is which.

• Coin 1: P [H] = 3/4. Coin 2: P [H] = 1/2

• Pick a coin at random and flip it. Let Ci denote the event that coin i is
picked. What is P [C1|H]?

22
Solution: Tree Diagram

[Tree diagram] First stage: pick C1 or C2, each with probability 1/2.
Second stage: flip the chosen coin.

Given C1: H with probability 3/4, so P [C1 H] = 3/8; T with probability 1/4, so P [C1 T] = 1/8.

Given C2: H with probability 1/2, so P [C2 H] = 1/4; T with probability 1/2, so P [C2 T] = 1/4.

P [C1|H] = P [C1 H] / (P [C1 H] + P [C2 H]) = (3/8) / (3/8 + 1/4) = 3/5
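
A small simulation sketch (illustrative, not part of the solution) that estimates P [C1|H] by repeating the two-stage experiment:

    import random

    # Estimate P[C1 | H]: pick a coin uniformly, flip it, and condition on heads.
    trials, c1_and_h, heads = 200_000, 0, 0
    for _ in range(trials):
        coin = random.choice((1, 2))
        p_heads = 0.75 if coin == 1 else 0.5
        if random.random() < p_heads:
            heads += 1
            if coin == 1:
                c1_and_h += 1
    print(c1_and_h / heads)    # should be close to 3/5 = 0.6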

23
2 Independent Events
Definition 1.6: Events A and B are independent if and only if

P [AB] = P [A] P [B]

......................................................................
Equivalent definitions:

P [A|B] = P [A]   and   P [B|A] = P [B]

Always check if you are asked!
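
As one illustration (mine, not from the viewgraphs): in the four-coin experiment of Example 1.9 with equally likely outcomes, "the penny shows h" and "the nickel shows h" are independent:

    from fractions import Fraction
    from itertools import product

    # Check P[AB] = P[A]P[B] for A = {penny is h}, B = {nickel is h}.
    outcomes = [''.join(w) for w in product('th', repeat=4)]   # penny, nickel, dime, quarter

    def P(E):
        return Fraction(len(E), len(outcomes))

    A = {w for w in outcomes if w[0] == 'h'}
    B = {w for w in outcomes if w[1] == 'h'}
    assert P(A & B) == P(A) * P(B)    # 1/4 == 1/2 * 1/2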

24
3 Independent Events
Definition 1.7: A1 , A2 , and A3 are independent if and only if

• A1 and A2 are independent.

• A2 and A3 are independent.

• A1 and A3 are independent.

• P [A1 ∩ A2 ∩ A3 ] = P [A1 ]P [A2 ]P [A3 ].

25
Fundamental Principle of Counting

• Experiment A has n possible outcomes,

• Experiment B has k possible outcomes,

• There are nk possible outcomes when you perform both experiments.

26
Permutations

• k-permutation: an ordered sequence of k distinguishable objects

• (n)k = no. of k-permutations of n dist. objects.

(n)k = n(n − 1)(n − 2) · · · (n − k + 1) = n! / (n − k)!
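
A quick check of this identity in Python (assuming Python 3.8+ for math.perm and math.prod):

    import math

    # (n)_k as a falling product, via math.perm, and via factorials.
    n, k = 7, 3
    falling = math.prod(range(n, n - k, -1))    # 7 * 6 * 5 = 210
    assert falling == math.perm(n, k) == math.factorial(n) // math.factorial(n - k)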

27
Combinations

• Pick a subset of k out of n objects.

• Order of selection doesn’t matter

• Each subset is a k-combination

28
How Many Combinations?

• (n choose k) denotes the binomial coefficient, the number of k-combinations of n objects.

• Two steps for a k-permutation:

1. Choose a k-combination out of n objects.


2. Choose a k-permutation of the k objects in the k-combination.

(n)k = (n choose k) · k!,   so   (n choose k) = n! / (k! (n − k)!)
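
The two-step argument can be checked numerically (a small sketch, again assuming Python 3.8+):

    import math

    # k! * C(n, k) recovers the number of k-permutations of n objects.
    n, k = 7, 3
    assert math.perm(n, k) == math.comb(n, k) * math.factorial(k)   # 210 == 35 * 6
    assert math.comb(n, k) == math.factorial(n) // (math.factorial(k) * math.factorial(n - k))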

29
Problem 1.8.6
A basketball team has

• 3 pure centers, 4 pure forwards, 4 pure guards

• one swingman who can play either guard or forward.

A pure player can play only the designated position. How many lineups are
there (1 center, 2 forwards, 2 guards)?

30
Problem 1.8.6 Solution
Three possibilities:

1. swingman plays guard: N1 lineups

2. swingman plays forward: N2 lineups

3. swingman doesn't play: N3 lineups

N = N1 + N2 + N3

N1 = (3 choose 1) (4 choose 2) (4 choose 1) = 72

N2 = (3 choose 1) (4 choose 1) (4 choose 2) = 72

N3 = (3 choose 1) (4 choose 2) (4 choose 2) = 108
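
A sketch that reproduces these counts with math.comb:

    from math import comb

    # 1 center, 2 forwards, 2 guards; three cases for the swingman.
    N1 = comb(3, 1) * comb(4, 2) * comb(4, 1)   # swingman plays guard:   72
    N2 = comb(3, 1) * comb(4, 1) * comb(4, 2)   # swingman plays forward: 72
    N3 = comb(3, 1) * comb(4, 2) * comb(4, 2)   # swingman does not play: 108
    print(N1, N2, N3, N1 + N2 + N3)             # 72 72 108 252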

31
Multiple Outcomes

• n independent trials

• r possible trial outcomes (s1 , . . . , sr )

• P [sk ] = pk

32
Multiple Outcomes (2)

• Outcome is a sequence:

– Example: s3 s4 s3 s1

P [s3 s4 s3 s1] = p3 p4 p3 p1 = p1 p3^2 p4 = p1^n1 p2^n2 p3^n3 p4^n4

(here n1 = 1, n2 = 0, n3 = 2, n4 = 1)

• Prob depends on how many times each outcome occurred

33
Multiple Outcomes (3)
Ni = number of times si occurs

P [N1 = n1 , . . . , Nr = nr ] = M p1^n1 p2^n2 · · · pr^nr

M = multinomial coefficient = n! / (n1! n2! · · · nr!)
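
A minimal sketch of this formula as a function (the example probabilities are my own illustration):

    from math import factorial, prod

    # P[N1 = n1, ..., Nr = nr] = n!/(n1! ... nr!) * p1^n1 * ... * pr^nr
    def multinomial_pmf(counts, probs):
        n = sum(counts)
        coeff = factorial(n) // prod(factorial(c) for c in counts)
        return coeff * prod(p ** c for c, p in zip(counts, probs))

    # Counts (1, 0, 2, 1) match the sequence s3 s4 s3 s1 on the previous slide.
    print(multinomial_pmf((1, 0, 2, 1), (0.1, 0.2, 0.3, 0.4)))    # ≈ 0.0432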

34
