Welcome to DMD Recitation 1!


¡ Due electronically on Monday February 2, 2015
¡ Airline on Ground (AOG) case – work with your team!
¡ Exercises 2.13 and 2.30 – complete individually!
¡ Submit the PDF files on Stellar.

¡ Google Doc for Teams: bit.ly/DMD16-Teams

¡ To reduce background noise, please mute your phone/computer!

¡ Please feel free to raise your hand or chat through WebEx if you have any questions or comments!


Data, Models, and Decisions

¡ Decision making is hard due to uncertainty

¡ Use data and build models to make more informed decisions

[Diagram: Data, Uncertainty, Models]


Outline
¡ Decision Analysis
¡ Tree construction
¡ EMV calculations
¡ Sensitivity Analysis

¡ Binomial Distribution
¡ Understand the formula
¡ Compute probabilities


Decision Analysis
¡ Decision Tree
¡ Logical and systematic way of organizing and
representing various decisions and uncertainties

¡ Two types of nodes


¡ Decision node: point where we can choose
between alternatives
¡ Event node: point with uncertainty that we cannot
control


Examples of Nodes

[Decision node figure] We have control!

Examples of Nodes

[Event node figure] Outcome is random!

Event Node Branches


¡ Mutually Exclusive:
No two outcomes can happen simultaneously

¡ Collectively Exhaustive:
The branches cover the entire range of possible outcomes
Sum of probabilities equals ONE


Should We Play the Lottery?

¡ Compare EMVs of each decision and pick the highest!

[Decision tree figure: a decision node with two alternatives, play (pay $1) or don't play ($0); the play branch leads to an event node whose outcomes are winning $1,000,000 or winning $0]

Decision Analysis Procedure


¡ List choices (decision nodes)

¡ List uncertain events (event nodes)

¡ Construct a decision tree

¡ Determine the probabilities of each outcome

¡ Determine the numerical values of the endpoints

¡ Solve using backward induction


¡ Event nodes: calculate EMV
¡ Decision nodes: choose decision with highest EMV

¡ Perform sensitivity analysis (What-if scenarios)
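A minimal sketch of this backward-induction procedure in Python (not part of the original slides; the dictionary tree encoding and the toy lottery probability are illustrative assumptions):

```python
# Illustrative backward induction over a decision tree (sketch, not from the slides).
# A node is a terminal payoff, an event node (probability-weighted branches),
# or a decision node (pick the alternative with the highest EMV).

def emv(node):
    kind = node["type"]
    if kind == "terminal":
        return node["value"]
    if kind == "event":
        # EMV of an event node: probability-weighted average of branch EMVs
        return sum(p * emv(child) for p, child in node["branches"])
    if kind == "decision":
        # EMV of a decision node: best EMV among the alternatives
        return max(emv(child) for _, child in node["alternatives"])
    raise ValueError(f"unknown node type: {kind}")

# Toy example: pay $1 to play a lottery that pays $1,000,000 with an assumed
# (made-up) probability of 1 in 10 million, or do not play.
lottery = {"type": "decision", "alternatives": [
    ("play", {"type": "event", "branches": [
        (1e-7, {"type": "terminal", "value": 1_000_000 - 1}),
        (1 - 1e-7, {"type": "terminal", "value": -1}),
    ]}),
    ("don't play", {"type": "terminal", "value": 0.0}),
]}
print(emv(lottery))  # 0.0: not playing has the higher EMV here
```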


Kendall Crab and Lobster Case
¡ Shortly before noon, Jeff Daniels, director of
Overnight Delivery Operations at Kendall Crab
and Lobster (KCL) watched the weather
channel:
¡ Weather forecast predicted 50% chance that the
storm hits Boston around 5pm

¡ With the chance of Logan airport closing,


business travelers were also nervously awaiting
further weather information
¡ In the past, 1 in 5 storms of this magnitude came with strong winds that forced Logan to close


Operations
¡ Customers can order lobsters for next-day delivery prior to 5pm on the day before delivery
¡ Typical daily order of 3,000 lobsters
¡ At 5:30pm, trucks from United Express pick up the
lobsters and truck them to Logan airport
¡ At 6:30pm, packed lobsters are flown to a processing
and distributing facility in DC
¡ By 10:30am of next day, lobsters are delivered


Earnings and Refund Policy


¡ Price charged to customers is $30/lobster, which
includes all transportation costs

¡ When KCL ships a lobster via United Express, its


unit contribution to earnings is $10/lobster

¡ If KCL cannot deliver the lobsters to customers, its


policy is to give each customer a $20 discount
coupon per lobster
¡ Market research has shown that only ~70% of the customers redeem the coupons
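Worked check (not on the original slide): with a ~70% redemption rate, the expected coupon cost is 0.7 * $20 = $14 per lobster, which is where the (-20)*(0.7) term in the branch-value calculations later in the deck comes from.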


Changes to Operations Due to Weather
¡ Rely on the Massachusetts Air Freight (MAF)
which operates 50 miles away from Boston
¡ If contacted before 5:30pm, MAF will pick up the
lobsters from KCL and deliver them to an airport in
Worcester to fly them to DC
¡ Additional transportation cost of using MAF is $13/lobster roughly 67% of the time and $19/lobster the remaining 33%


Changes to Operations Due to Weather
¡ Cancel orders and issue coupons
¡ If the lobsters are not packaged yet, the incremental
cost of cancelling the orders is ~$1/lobster
¡ If the lobsters were already packaged, the incremental cost is ~$1.25/lobster

¡ Deliver lobsters by truck to DC via the Eastern


Parcel Delivery (EPD)
¡ Arrangement needs to be made by noon!
¡ Cost is $4/lobster 50% of the time, $3/lobster 25% of
the time and $2/lobster 25% of the time.


What is the Best Decision?

¡ List decisions that can be made over time

Noon:
1. Use EPD to deliver by land
2. Wait until 5:00pm
3. Cancel orders and issue coupons

5:00pm (if we waited):
1. Use MAF to deliver to Worcester
2. Cancel orders and issue coupons

What is the Best Decision?

¡ List uncertainties and their probability of occurrence

Noon - If EPD, additional cost is uncertain:
- $4 w.p. 0.5
- $3 w.p. 0.25
- $2 w.p. 0.25

5:00pm - Storm:
- Yes w.p. 0.5
- No w.p. 0.5

5:00pm - Logan Airport (given the storm):
- Closed w.p. 0.2
- Open w.p. 0.8

5:30pm - If MAF, cost is uncertain:
- $13 w.p. 0.67
- $19 w.p. 0.33

5:30pm - If order canceled:
- Redeem w.p. 0.7
- Not w.p. 0.3
Decision Tree

[Decision tree figure, built left to right across the Noon, 5:00pm, and 5:30pm stages:
- Noon decision node: use EPD (event node for the trucking cost: $4 w.p. 0.5, $3 w.p. 0.25, $2 w.p. 0.25), wait until 5:00pm, or cancel orders and issue coupons
- If we wait: event node for the storm (Yes w.p. 0.5, No w.p. 0.5); if there is no storm, ship via United Express as usual
- If the storm hits: event node for Logan (Closed w.p. 0.2, Open w.p. 0.8); if Logan stays open, ship via United Express as usual
- If Logan closes: 5:30pm decision node: use MAF (event node for its cost: $13 w.p. 0.67, $19 w.p. 0.33) or cancel orders and issue coupons]

Calculating the Branch Values ($10 profit/lobster, 3,000 lobsters)

¡ EPD: (10-4)*3,000 = 18,000; (10-3)*3,000 = 21,000; (10-2)*3,000 = 24,000
¡ Cancel at noon: (-1)*3,000 + (-20)*(0.7)*3,000 = -45,000
¡ No storm, or storm with Logan open (ship via United Express): 10*3,000 = 30,000
¡ Cancel at 5:00pm (lobsters already packaged): (-1.25)*3,000 + (-20)*(0.7)*3,000 = -45,750
¡ MAF: (10-13)*3,000 = -9,000; (10-19)*3,000 = -27,000

Calculating the EMV (backward induction)

¡ MAF event node: 0.67*(-9,000) + 0.33*(-27,000) = -15,000
¡ 5:30pm decision node (Logan closed): max(-15,000, -45,750) = -15,000, so use MAF
¡ Logan event node: 0.2*(-15,000) + 0.8*(30,000) = 21,000
¡ Storm event node (value of waiting): 0.5*(21,000) + 0.5*(30,000) = 25,500
¡ EPD event node: 0.5*(18,000) + 0.25*(21,000) + 0.25*(24,000) = 20,250

Comparing the EMV

¡ Wait (25,500) > EPD (20,250) > Cancel at noon (-45,000): the best decision at noon is to wait

Sensitivity Analysis | Storm Probability

¡ Even if the storm is certain (no-storm probability set to 0), waiting is worth the storm-branch EMV of 21,000, which still beats EPD's 20,250
¡ The decision to wait is robust to the storm forecast

Sensitivity Analysis | Breakeven

¡ Let p be the probability that Logan closes given the storm
¡ EMV(wait) = 0.5*(30,000) + 0.5*[p*(-15,000) + (1-p)*(30,000)]
¡ Breakeven versus EPD: 0.5*(30,000) + 0.5*[p*(-15,000) + (1-p)*(30,000)] = 20,250 gives p = 0.433
¡ If p ≤ 0.433, then wait and go with MAF if Logan closes
¡ If p ≥ 0.433, then ship via trucks with EPD
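A minimal Python sketch (not part of the original slides) that reproduces these EMV and breakeven numbers; the variable names and list-of-pairs encoding are illustrative assumptions:

```python
# Sketch reproducing the KCL EMV and breakeven numbers (illustrative, not from the slides).
N = 3000        # lobsters in a typical daily order
PROFIT = 10     # $/lobster contribution when delivered via United Express

def emv(branches):
    """Probability-weighted average payoff of an event node."""
    return sum(p * v for p, v in branches)

epd = emv([(0.50, (PROFIT - 4) * N),
           (0.25, (PROFIT - 3) * N),
           (0.25, (PROFIT - 2) * N)])            # 20,250
maf = emv([(0.67, (PROFIT - 13) * N),
           (0.33, (PROFIT - 19) * N)])           # about -15,000
cancel_noon = (-1 - 20 * 0.7) * N                # -45,000
cancel_5pm = (-1.25 - 20 * 0.7) * N              # -45,750
ship_united = PROFIT * N                         # 30,000

closed = max(maf, cancel_5pm)                    # 5:30pm decision if Logan closes
storm = emv([(0.2, closed), (0.8, ship_united)]) # about 21,000
wait = emv([(0.5, storm), (0.5, ship_united)])   # about 25,500
print(epd, cancel_noon, wait)                    # waiting has the highest EMV

# Breakeven probability that Logan closes, using -15,000 for the MAF node as on the slides:
# 0.5*30,000 + 0.5*(p*(-15,000) + (1-p)*30,000) = 20,250  =>  p = 0.433
p_star = (30_000 - 20_250) / (0.5 * (30_000 + 15_000))
print(round(p_star, 3))                          # 0.433
```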

Result of Decision Analysis


¡ An optimal decision rule (for every possible
scenario): a “complete contingency plan”

¡ Insights:
¡ If risk neutral, then computing EMV is good enough
¡ But what if risk averse?
¡ Maybe we want to be on the safe side and just
ship by truck to DC!
¡ Variability might be important to look at


Random Variables

Discrete
Continuous

Probability and Random Variables

¡ Remember!
¡ Probabilities are always between 0 and 1
¡ Considering all possible outcomes, the sum of their probabilities must add to ONE

R (% per year)   Probability
10               0.22
11               0.23
12               0.25
13               0.21
14               0.09

Simple Questions: Complements

¡ What is the probability that the return is at least 12%?
P(R ≥ 12) = P(R=12) + P(R=13) + P(R=14) = 0.25 + 0.21 + 0.09 = 0.55

¡ What is the probability that the return is less than 12%?
P(R < 12) = P(R=10) + P(R=11) = 0.22 + 0.23 = 0.45

¡ Notice that these two probabilities add up to 1: 0.55 + 0.45 = 1
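A minimal sketch (not from the slides) answering these two questions in Python; the dictionary simply transcribes the return table above:

```python
# Transcription of the return table, used to answer the two probability questions above.
R = {10: 0.22, 11: 0.23, 12: 0.25, 13: 0.21, 14: 0.09}

p_at_least_12 = sum(p for r, p in R.items() if r >= 12)   # 0.55
p_less_than_12 = sum(p for r, p in R.items() if r < 12)   # 0.45
print(p_at_least_12, p_less_than_12)                      # complements: they sum to 1
```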


Expected Value

¡ Example: return on investment
E[R] = 10*0.22 + 11*0.23 + 12*0.25 + 13*0.21 + 14*0.09 = 11.72
(Excel tip: SUMPRODUCT(column of probabilities, column of returns))

¡ Interpretation: center of gravity; the distribution can be balanced at the mean

[Bar chart of the probability distribution of R, with bars 0.22, 0.23, 0.25, 0.21, 0.09 over returns 10 through 14]



Expected Value Doesn't Tell the Whole Story!

¡ Consider a second portfolio with the following return distribution

S (% per year)   Probability
10               0.57
11               0
12               0
13               0
14               0.43

¡ The two portfolios have the same average return:
E[S] = 10*0.57 + 14*0.43 = 11.72 = E[R]

[Bar chart of the distribution of S; it still balances at 11.72]


How Are They Different?

[Bar charts of the two distributions: R spreads its probability across 10 through 14 (0.22, 0.23, 0.25, 0.21, 0.09), while S concentrates on the extremes, 10 (0.57) and 14 (0.43)]


Variance

¡ Example calculation of variance:
σ_R² = (10-11.72)^2 * (0.22)
     + (11-11.72)^2 * (0.23)
     + (12-11.72)^2 * (0.25)
     + (13-11.72)^2 * (0.21)
     + (14-11.72)^2 * (0.09)
     = 1.6016


Variance

¡ Example calculation of variance:
σ_S² = (10-11.72)^2 * (0.57) + (14-11.72)^2 * (0.43) = 3.9216 > 1.6016 = σ_R²

¡ Portfolio S has more variability in its potential return than portfolio R

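A minimal sketch (not from the slides) of these mean and variance calculations in Python; the lists simply transcribe the two tables above:

```python
# Mean and variance of a discrete distribution given as (value, probability) pairs.
R = [(10, 0.22), (11, 0.23), (12, 0.25), (13, 0.21), (14, 0.09)]
S = [(10, 0.57), (14, 0.43)]

def mean(dist):
    return sum(v * p for v, p in dist)          # like Excel's SUMPRODUCT

def variance(dist):
    m = mean(dist)
    return sum((v - m) ** 2 * p for v, p in dist)

print(mean(R), variance(R))   # 11.72, about 1.6016
print(mean(S), variance(S))   # 11.72, about 3.9216
```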

Binomial Distribution
¡ Describes the distribution of
¡ the number of successes
¡ out of n independent “trials”
¡ where each trial has the same probability of success p

¡ Binomial(n,p)

¡ Need to properly define what “trial” and “success” mean for our problem!


Binomial Distribution
¡ Suppose we flip 4 coins: n = 4

¡ Success is obtaining a head, with p = 1/3

¡ X = number of heads

[Figure: four coin positions labeled 1, 2, 3, 4]


Binomial Distribution
¡ How can we obtain 2 heads?
1 2 3 4 1 2 3 4
H H T T T T H H

H T H T T H T H

H T T H T H H T

4!/(2!(4-2)!) = (4*3*2*1)/((2*1)*(2*1)) = 6 ways we can obtain 2 heads!

Binomial Distribution
¡ How can we obtain 2 heads?

¡ We can think of it a little differently!

¡ Total number of permutations = 4!
¡ To place the first head, we have 4 possibilities
¡ To place the second head, we have 3 possibilities
¡ To place the first tail, we have 2 possibilities
¡ To place the second tail, we have 1 possibility
¡ That gives 4*3*2*1 = 4! ordered arrangements

¡ But we don't care about the order of the heads or the tails!
¡ Suppose we choose the following break-down: H T H T (heads in the grey bins, tails in the red bins)
¡ In how many ways can I place the heads in the grey bins? 2!
¡ In how many ways can I place the tails in the red bins? (4-2)!
¡ Since we do not care about the order: 4!/(2! * 2!) = 6

Binomial Distribution
¡ P(X=2) = probability that we obtain 2 heads and the remaining are tails
¡ Each grey bin has a probability p = 1/3 to have a head: 1/3 * 1/3 = (1/3)^2
¡ Each red bin has a probability 1-p = 2/3 to have a tail: 2/3 * 2/3 = (2/3)^(4-2)

P(X=2) = [4!/(2!(4-2)!)] * (1/3)^2 * (2/3)^(4-2)

Binomial Distribution
¡ In general, if X is Binomial(n,p)
¡ X can take on only the values 0, 1, …, n-1, n
¡ The probability that we have x successes out of n trials:

P(X = x) = [n!/(x!(n-x)!)] * p^x * (1-p)^(n-x)

where n!/(x!(n-x)!) is the number of combinations with x successes out of n trials, p^x is the probability that those x trials are successes, and (1-p)^(n-x) is the probability that the remaining (n-x) trials are failures
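A minimal sketch of this formula in Python (not part of the original slides); it only uses the standard library:

```python
from math import comb

def binom_pmf(x, n, p):
    """P(X = x) for X ~ Binomial(n, p): comb(n, x) counts the arrangements of the
    successes, p**x and (1-p)**(n-x) weight the successes and failures."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

print(binom_pmf(2, 4, 1/3))   # the 2-heads-in-4-flips example above, about 0.296
```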

Binomial RVs
¡ Expected value

E[X] = np

¡ Variance

Var(X) = np(1-p)


Revisiting Taylor Swift

¡ n = 50.9M, p = 0.005:
E(X) = 254,500
Std(X) = 503
CV(X) = Std(X)/E(X) = 0.0019

¡ Same p, much smaller n:
E(X) = 0.67
Std(X) = 0.816
CV(X) = 1.219

¡ p = 0.0005:
E(X) = 0.067
Std(X) = 0.258
CV(X) = 3.862
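A minimal sketch (not from the slides) that reproduces these summary statistics in Python using the E[X] = np and Var(X) = np(1-p) formulas above:

```python
from math import sqrt

def binom_summary(n, p):
    """Mean, standard deviation, and coefficient of variation of Binomial(n, p)."""
    mean = n * p
    std = sqrt(n * p * (1 - p))
    return mean, std, std / mean

print(binom_summary(50_900_000, 0.005))   # roughly (254500, 503, 0.002)
```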

Binomial Distribution - Example
¡ United's first class cabin has 10 seats in each plane.

¡ Overbooking policy is to sell up to 11 first class tickets, since cancellations and no-shows are always possible

¡ Suppose that for a given flight
¡ 11 first class tickets are sold
¡ Each passenger has an 80% chance of showing up for the flight
¡ Whether a passenger shows up is independent of other passengers

¡ Can we model this as a binomial distribution? YES!
¡ Trial = passenger, Success = showing up; n = 11, p = 0.8

Binomial Distribution - Example
¡ X = number of passengers that show up

¡ X is Binomial(11, 0.8)

¡ What is the probability that exactly 10 passengers show up?

P(X=10) = [11!/(10!(11-10)!)] * (0.8)^10 * (1-0.8)^(11-10)
        = 11 * (0.8)^10 * 0.2 ≈ 0.236


Binomial Distribution - Example
¡ X = number of passengers that show up

¡ X is Binomial(11, 0.8)

¡ What is the probability that the airline gets away with overbooking?
P(X<=10) = P(X=0) + P(X=1) + … + P(X=10)

¡ Or alternatively:
P(X<=10) = 1 – P(X>10) = 1 – P(X=11)
         = 1 – [11!/(11!(11-11)!)] * 0.8^11 * (1-0.8)^(11-11)
         ≈ 0.914

Binomial Distribution -
Example
¡ X = number of passengers that show up

¡ X is Binomial(11, 0.8)

¡ What if the airline overbooked by selling


¡ 11 seats: P(X<=10) ~ 0.914

¡ 12 seats: P(X<=10) ~ 0.725

¡ 13 seats: P(X<=10) ~ 0.498

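A minimal sketch (not from the slides) that reproduces these three overbooking probabilities in Python by summing the binomial pmf:

```python
from math import comb

def prob_at_most(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p), summing the pmf term by term."""
    return sum(comb(n, x) * p**x * (1 - p)**(n - x) for x in range(k + 1))

# Probability that no more than 10 of the ticketed passengers show up (p = 0.8),
# for the three overbooking levels on the slide:
for tickets in (11, 12, 13):
    print(tickets, round(prob_at_most(10, tickets, 0.8), 3))   # 0.914, 0.725, 0.498
```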

Extensions
¡ Given some additional data
¡ Fare prices
¡ Cost of too many passengers showing up (refunds,
damage to customer relations, etc.)

¡ Is it worthwhile to overbook flights?

¡ Check our assumptions


¡ Is 80% an accurate probability?
¡ Are passengers really independent?


Alternate Model
¡ We can define the “success” to be a passenger not showing up

¡ Y = number of passengers not showing up is Binomial(11, 0.2)

¡ P(X=10) = P(exactly 10 passengers show up)
          = P(exactly 1 passenger does not show up)
          = P(Y=1) = [11!/(1!(11-1)!)] * 0.2^1 * (1-0.2)^(11-1)

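A minimal sketch (not from the slides) checking that the two ways of counting this event agree:

```python
from math import comb

p_show = 0.8
# The same event counted two ways: exactly 10 of 11 show up,
# i.e. exactly 1 of 11 does not show up.
p_x_equals_10 = comb(11, 10) * p_show**10 * (1 - p_show)**1
p_y_equals_1 = comb(11, 1) * (1 - p_show)**1 * p_show**10
print(p_x_equals_10, p_y_equals_1)   # both about 0.236
```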

Wrap-up
¡ Due electronically on Monday February 2, 2015
¡ AOG case – work with your team!
¡ Exercises 2.13 and 2.30 – complete individually!
¡ Submit the PDF files on Stellar.

¡ Google Doc for Teams: bit.ly/DMD16-Teams

¡ Office hours for 30 minutes!

¡ Feel free to raise your hand and unmute yourself to ask questions!


Office Hour
