
CHAPTER 5

Sensitivity Analysis
Notes
Because no optimal procedure exists for performing sensitivity analysis, this chapter is somewhat “looser”
than the preceding. An effort has been made to present some of the more basic sensitivity analysis
approaches and tools. It is important to keep in mind that the purpose of sensitivity analysis is to refine the
decision model, with the ultimate objective of obtaining a requisite model.

For those instructors who enjoy basing lectures on problems and cases, an appealing way to introduce
sensitivity analysis is through the Southern Electronics cases from Chapter 4. If these cases have just been
discussed, they can be used as a platform for launching into sensitivity analysis. Start by constructing a simple
(one bar) tornado diagram for the value to Steve of Banana’s offer. The endpoints of the bar are based on
the value of Banana’s stock ($20 to $60, say). Because the bar crosses the $10 million mark, the value of
Big Red’s offer, it would appear that the uncertainty about the stock price should be modeled. A second
tornado diagram can be created which considers the sensitivity of the expected value to 1) the probability of
success (from 0.35 up to 0.5, say); 2) the value of Banana’s stock if the EYF succeeds ($35 to $60); and 3)
the value of Banana’s stock if the EYF fails ($20 to $30). An advantage of this is to show that tornado
diagrams can be used on EMVs as well as sure consequences.
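For instructors who want to show the arithmetic behind the second diagram, a minimal sketch follows. The ranges are the ones suggested above; the base-case values are assumptions for illustration, since the case supplies its own numbers. Each input is swung one at a time, holding the others at base, and the bars are sorted widest first.

```python
# One-at-a-time tornado bars for the expected stock price under Banana's offer.
# EMV = p * (value if EYF succeeds) + (1 - p) * (value if EYF fails).
base = {"p": 0.45, "v_success": 50.0, "v_failure": 25.0}   # assumed base case
ranges = {"p": (0.35, 0.50), "v_success": (35.0, 60.0), "v_failure": (20.0, 30.0)}

def emv(p, v_success, v_failure):
    return p * v_success + (1 - p) * v_failure

bars = {}
for var, (lo, hi) in ranges.items():
    endpoints = [emv(**{**base, var: x}) for x in (lo, hi)]
    bars[var] = (min(endpoints), max(endpoints))

# Print the bars in tornado order (widest swing first).
for var, (lo, hi) in sorted(bars.items(), key=lambda kv: kv[1][0] - kv[1][1]):
    print(f"{var:10s} EMV from {lo:6.2f} to {hi:6.2f} (swing {hi - lo:5.2f})")
```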

After showing tornado diagrams for Southern Electronics, it is natural to construct a two-way sensitivity
graph for the problem. One possibility is to construct a three-point probability distribution for the value of
the stock if the EYF succeeds. Denote two of the three probabilities by p and q, and construct the graph to
show the region for which Banana’s offer is preferred.

Sensitivity analysis is one of those topics in decision analysis that can be tedious and boring if done by
hand but quite exciting when a computer can be used. PrecisionTree, along with Goal Seek and Data Tables
in Excel, provides some very powerful tools for sensitivity analysis. The software makes it possible to
determine which inputs have the largest effect on a particular output, how much change you can expect in
an output when a given input changes by a defined amount, and which variables in the model change the
rank ordering of the alternatives. The chapter provides step-by-step instructions for setting up a sensitivity
analysis in PrecisionTree to create tornado diagrams, sensitivity graphs, and spider graphs. Instructions are
also provided on how to use Goal Seek and Data Tables as sensitivity analysis tools.

PrecisionTree saves the entries for the sensitivity analysis dialog box, which is helpful upon returning to the
model. When creating student handout worksheets, however, you may want your students to make their
own entries, in which case, make sure the dialog box is empty.

In the Excel solution files, the variables or cells used in the sensitivity analysis have been shaded green.
This should help in reading and understanding the analysis a little better.

Topical cross-reference for problems


Cost-to-loss ratio problem          5.6, 5.7
Linked Decision Trees               5.9, 5.12
Multiple objectives                 5.10, 5.12, Job Offers Part II, MANPADS
Negotiations                        DuMond International
Net present value                   5.10
PrecisionTree                       5.8-5.12, DuMond International, Strenlar Part II, Job Offers Part II, MANPADS
Requisite models                    5.4
Sensitivity analysis                5.1-5.12, DuMond International, Strenlar Part II, Job Offers Part II, MANPADS
Texaco-Pennzoil                     5.11
Tornado diagrams                    5.12, Strenlar Part II
Trade-off weights                   5.9, Job Offers Part II
Two-way sensitivity analysis        5.5, 5.10, 5.11, Job Offers Part II, Strenlar Part II, MANPADS
Umbrella problem                    5.6

Solutions
5.1. Sensitivity analysis answers the question “What matters in this decision?” Or, “How do the results
change if one or more inputs change?” To ask it still another way, “How much do the inputs have to change
before the decision changes?” Or “At what point does the most preferred alternative become the second
most preferred and which alternative moves into the number one spot?”

We have framed the main issue in sensitivity analysis as “What matters” because of our focus on
constructing a requisite model. Clearly, if a decision is insensitive to an input—the decision does not
change as the input is varied over a reasonable range—then variation in that input does not matter. An
adequate model will fix such an input at a “best guess” level and proceed.

By answering the question, “What matters in this decision,” sensitivity analysis helps identify elements of
the decision situation that must be included in the model. If an input can vary widely without affecting the
decision, then there is no need to model variation in that input. On the other hand, if the variation matters,
then the input’s uncertainty should be modeled carefully, as an uncertain variable or as a decision if it is
under the decision maker’s control.

5.2. This is a rather amorphous question. The decision apparently is, “In which house shall we live?”
Important variables are the value of the current home, costs of moving, purchase price, financing
arrangements (for the current house as well as for a new one), date of the move, transportation costs, and so
on.

What role would sensitivity analysis play? The couple might ask whether the decision would change if they
took steps to reduce driving time in the future (e.g., by deciding to have no more children). How does the
attractiveness of the different alternatives vary as the family’s size and the nature of future outings are
varied? (For example, how much more skiing would the family have to do before living out of town is
preferred?) Can the family put a price tag on moving into town; that is, is there an amount of money (price,
monthly payment, etc.) such that if a house in town costs more than this amount, the family would prefer to
stay in the country?

5.3. Another rather amorphous question. Students may raise such issues as:
• Is a retail business the right thing to pursue? (Is the right problem being addressed?)
• Does the father really want to be a retailer?
• Operating costs and revenues may vary considerably. These categories cover many possible
inputs that might be subjected to sensitivity analysis.

To use sensitivity analysis in this problem, the first step would be to determine some kind of performance
measure (NPV, cash flow, payback, profit). Then a tornado diagram could be constructed showing how the
selected performance measure varies over the range of values for the inputs. The tornado diagram will
suggest further modeling steps.

5.4. From the discussion in the text, the basic issue is whether some relevant uncertainty could be resolved
during the life of the option. Some possibilities include:
• Obtaining a new job, promotion, or raise,
• Obtaining cash for a down payment,
• Learning about one’s preferences. “Is this house right for me?”
• Are there hidden defects in the house that will require repairs?
• Are zoning decisions or other developments hanging in the balance?

If such an uncertainty exists, then the option may have value. If not, it may be a dominated alternative.

5.5. Each line is a line of indifference where two of the alternatives have the same EMV. Now imagine a
point where two lines cross. At that point, all three of the alternatives must have the same EMV.

Point D is at t = 0.4565 and v = 0.3261 and is the point at which all three EMVs are equal. Thus, at D,
EMV(High-Risk Stock) = EMV(Low-Risk Stock) = $500. The exact location of D can be found using
algebra, using Solver, or using a combination of algebra and Goal Seek. Because there are two unknowns,
Goal Seek needs additional information. For example, to use Goal Seek: substitute v = (9 - 14t)/8,
which corresponds to Line CDE, into the expression for EMV(High-Risk Stock). Now you have one
equation and one unknown, and you can use Goal Seek to find the value of t for which the new expression
equals 500. Point D is unique: there is no other point at which all three alternatives have equal expected
monetary values.
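The Goal Seek step can also be mimicked in a few lines of code. In the sketch below, emv_high stands in for the actual EMV(High-Risk Stock) expression from the problem (the coefficients are placeholders, not the case values); after substituting v = (9 - 14t)/8, a bisection search finds the t at which the EMV equals 500. With the real payoffs from the decision tree, this reproduces t = 0.4565 and v = 0.3261.

```python
# Mimic Goal Seek: substitute v = (9 - 14t)/8 (line CDE) and solve
# emv_high(t) = 500 by bisection. The coefficients a, b, c below are
# placeholders; use the payoffs from the actual decision tree.
def emv_high(t, a=1700.0, b=-500.0, c=300.0):
    v = (9 - 14 * t) / 8              # stay on line CDE
    return a * t + b * v + c          # stand-in for EMV(High-Risk Stock)

def goal_seek(f, target, lo, hi, tol=1e-9):
    """Find t in [lo, hi] with f(t) = target, assuming f is monotone."""
    sign_lo = f(lo) - target
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if (f(mid) - target) * sign_lo <= 0:
            hi = mid
        else:
            lo = mid                  # sign at lo is unchanged here
    return (lo + hi) / 2

t = goal_seek(emv_high, 500.0, 0.0, 1.0)
print(t, (9 - 14 * t) / 8)            # the (t, v) coordinates of point D
```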

5.6. Cost of protective action = C. Expected loss if no action is taken = pL.

Set C = pL and solve for p: p = C/L. Thus, if p ≥ C/L, take protective action.

The only information needed is p and the ratio C/L. Note that the specific values of C and L are not required, only their relative values.
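A one-line version of the rule, with illustrative numbers (not from the problem):

```python
# Cost-to-loss decision rule: take protective action whenever p >= C / L.
def take_action(p, C, L):
    """True when the expected loss p*L is at least the protection cost C."""
    return p >= C / L

# Illustrative values: $30 protection cost against a possible $100 loss.
print(take_action(p=0.25, C=30.0, L=100.0))   # False: 0.25 < 0.30
print(take_action(p=0.40, C=30.0, L=100.0))   # True:  0.40 >= 0.30
```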

5.7. The best way to see whether it is necessary to model the uncertainty about D is to revert to the regular cost-to-loss problem. If pL < C, then one would not take action, and if pL > C + D, then the optimal choice would be to take action; in neither case does D affect the decision. However, if C < pL < C + D, then the magnitude of the damage D does matter. The choice between taking action and not is unclear, and one would want to include D in the decision tree.
5.8. This problem can be set up either to maximize expected crop value or to minimize expected loss. The solution is the same either way; the difference is only the perspective (crop value vs. loss). The decision tree that maximizes expected crop value is modeled in the Excel file “Problem 5.8.xlsx” along with the sensitivity reports; the tree for minimizing expected loss is summarized below.

The expected loss from doing nothing is much greater than for either of the two protective measures, and so it is
certainly appropriate to take some action. The expected loss for burners is almost entirely below that for
sprinklers, the only overlap being between $14.5K and $15K. It would be reasonable to set the burners
without pursuing the analysis further.

Another argument in favor of this is that most likely the same factors lead to more or less damage for both
burners and sprinklers. With this reasoning, there would be a negligible chance that the burners would
produce a high loss and the sprinklers a low loss.

A final note: Some students may solve this problem without calculating the expected loss, comparing the
range of losses from burners or sprinklers if damage occurs with the $50K loss from doing nothing.
However, if uncertainty about the weather is ignored altogether, the appropriate analysis has the loss
ranging from $0 to $50K for no action, $5 to $25K for the burners, and $2 to $32K for the sprinklers.
Because the three ranges overlap so much, no obvious choice can be made. It is, therefore, appropriate and
necessary to include the probability of adverse weather and calculate the expected losses.

Decision tree for minimizing expected loss (losses in $000s, P(freeze) = 0.5):

Do nothing:     Freeze (0.5) → 50;       No freeze (0.5) → 0;   Expected loss = $25K
Set burners:    Freeze (0.5) → 20 to 25; No freeze (0.5) → 5;   Expected loss = $12.5K to $15K
Use sprinklers: Freeze (0.5) → 27 to 32; No freeze (0.5) → 2;   Expected loss = $14.5K to $17K
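The expected-loss arithmetic behind the tree takes only a few lines to verify; the sketch below reproduces the ranges by evaluating both endpoints of each freeze-damage interval.

```python
# Expected loss ($000s) with P(freeze) = 0.5. Damage ranges and the
# no-freeze losses are taken from the decision tree above.
P_FREEZE = 0.5
alternatives = {
    "Do nothing":     ((50.0, 50.0), 0.0),   # (freeze-damage range, no-freeze loss)
    "Set burners":    ((20.0, 25.0), 5.0),
    "Use sprinklers": ((27.0, 32.0), 2.0),
}
for name, ((low, high), no_freeze) in alternatives.items():
    e_low = P_FREEZE * low + (1 - P_FREEZE) * no_freeze
    e_high = P_FREEZE * high + (1 - P_FREEZE) * no_freeze
    print(f"{name:15s} expected loss ${e_low:.1f}K to ${e_high:.1f}K")
# Do nothing: $25.0K; burners: $12.5K to $15.0K; sprinklers: $14.5K to $17.0K
```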

5.9. This decision tree (shown in Figure 4.40 in the text) is modeled in the Excel file “Problem 5.9.xlsx.”
The model is a linked tree where the uncertainty node for the amount of fun is linked to cell $F$6 in the
spreadsheet model (“Fun Level for Forest Job”), and the uncertainty node for the amount of work is linked
to cell $G$7 in the spreadsheet model (“Salary Level for In-town Job”). The outcome nodes for the Forest
Job are linked to cell $F$8 and the outcome nodes for the In-Town Job to cell $G$8. The user can then vary
the weights to see that Sam will still prefer the forest job. The sensitivity analysis gives the following
results:
Expected Overall Score
ks      Forest Job    In-Town Job
0.50    71.25         57.50
0.75    76.125        56.25

Thus, regardless of the precise value of ks, the optimal choice is the forest job. In fact, a much stronger statement can be made: it turns out that for no value of ks between zero and one is the in-town job preferred. Smaller values of ks favor the in-town job, but even setting ks = 0 leaves the expected overall scores at 60 and 61.5 for the in-town and forest jobs, respectively.

Another way to show the same result is to realize that the expected overall scores are linear in the weights
and in the expected scores for the individual attributes. Because the forest job has higher expected scores
on both attributes, there cannot exist a set of weights that makes the in-town job have the higher overall
expected score.
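This linearity is easy to demonstrate numerically. In the sketch below, the per-attribute expected scores are backed out from the figures above (forest: 81 on salary, 61.5 on fun; in-town: 55 on salary, 60 on fun, using overall = ks × salary + (1 − ks) × fun); treat these inferred scores as an assumption, not as numbers from the text.

```python
# Expected overall score is linear in ks:
#   overall = ks * E(salary score) + (1 - ks) * E(fun score).
# Per-attribute expected scores inferred from the results above.
scores = {"Forest Job": (81.0, 61.5), "In-Town Job": (55.0, 60.0)}

def overall(job, ks):
    salary, fun = scores[job]
    return ks * salary + (1 - ks) * fun

for ks in (0.0, 0.25, 0.50, 0.75, 1.0):
    f, t = overall("Forest Job", ks), overall("In-Town Job", ks)
    winner = "forest" if f > t else "in-town"
    print(f"ks = {ks:4.2f}: forest = {f:7.3f}, in-town = {t:7.3f} -> {winner}")
# The forest job wins at every ks, since it scores higher on both attributes.
```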

5.10. Using the base values of $5000 for the cash flows and 11% for the interest rate,

NPV = -14,000 + 5000/1.11 + 5000/1.11^2 + ... + 5000/1.11^6 = $7153.
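A short check of this figure, which also evaluates the tornado endpoints used below:

```python
# NPV at the base case: $14,000 outlay, $5,000 per year for 6 years at 11%.
def npv(rate, outlay, cash_flow, years):
    return -outlay + sum(cash_flow / (1 + rate) ** t for t in range(1, years + 1))

print(round(npv(0.11, 14_000, 5_000, 6)))     # 7153, the base-case NPV
print(round(npv(0.095, 14_000, 5_000, 6)))    # interest-rate endpoint (9.5%)
print(round(npv(0.12, 14_000, 5_000, 6)))     # interest-rate endpoint (12%)
print(round(npv(0.11, 14_000, 2_500, 6)))     # cash-flow low endpoint
print(round(npv(0.11, 14_000, 7_000, 6)))     # cash-flow high endpoint
```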

When the cash flows are varied from $2500 to $7000, and the interest rate is varied from 9.5% to 12%, the
following tornado diagram results:

[Tornado diagram: Cash flows bar from $2500 to $7000; Interest rate bar from 12% to 9.5%; NPV axis from -$4000 to $16,000.]

This graph assumes that the cash flows vary between $2500 and $7000 and the amount is the same across
all 6 years. The range of possible interest rates appears not to pose a problem; NPV remains positive within
a relatively narrow range. On the other hand, NPV is more sensitive to the range of cash flows. It would be
appropriate to set the interest rate at 11% for the analysis but to model the uncertainty about the cash flows
with some care.

The tornado diagram generated by PrecisionTree is shown in the Excel file “Problem 5.10.xlsx.” The
solution file also shows a two-way data table to calculate the swing of NPV as the annual payment and the
interest rate vary. Note that the table reports many more values than the tornado diagram, as the diagram
only incorporates one column and one row of the table.

[Tornado graph of decision tree 'Problem 5.10' (expected value of entire model): bars for Annual Payment (B8) and Interest Rate (B13); expected-value axis from -$4,000 to $16,000.]

The problem can also be modeled by varying each year between $2500 and $7000, one at a time. This
model is shown in the Excel file “Problem 5.10.xlsx.” Because of the discounting factor, the early year
payments are more influential than the later years. Said differently, NPV is more sensitive to the early year
payments than to later years.

[Tornado graph of decision tree 'Problem 5.10 by Year' (expected value of entire model): bars for Year 1 (B8) through Year 6 (G8), the earlier years showing the wider swings; expected-value axis from $4,500 to $9,500.]

Some students may see the error message “Model Extends Beyond Allowed Region of Worksheet,” which
means that their version of the software is limited to one decision model per workbook.

5.11. First, sensitivity analysis by hand: Let’s establish some notation for convenience:

Strategy A = Accept $2 billion.


Strategy B = Counteroffer $5 billion, then refuse if Texaco offers $3 billion.
Strategy C = Counteroffer $5 billion, then accept if Texaco offers $3 billion.

EMV(A) = 2

EMV(B) = 0.17(5) + 0.50[p(10.3) + q(5) + (1 - p - q)(0)] + 0.33[p(10.3) + q(5) + (1 - p - q)(0)]
       = 0.85 + 8.549p + 4.15q

EMV(C) = 0.17(5) + 0.50[p(10.3) + q(5) + (1 - p - q)(0)] + 0.33(3)
       = 1.85 + 5.15p + 2.5q

Now construct three inequalities:

• EMV(A) > EMV(B):
  2 > 0.85 + 8.549p + 4.15q, or equivalently p < 0.135 - 0.485q.  (1)

• EMV(A) > EMV(C):
  2 > 1.85 + 5.15p + 2.5q, or equivalently p < 0.03 - 0.485q.  (2)

• EMV(B) > EMV(C):
  0.85 + 8.549p + 4.15q > 1.85 + 5.15p + 2.5q, or equivalently p > 0.294 - 0.485q.  (3)

Plot these three inequalities as lines on a graph with p on the vertical axis and q on the horizontal axis. Note
that only the region below the line p + q = 1 is feasible because p + q must be less than or equal to one.
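The region map can also be generated numerically, which is a handy check on the algebra. The sketch below evaluates the three EMVs (as derived above, in $ billions) on a grid of feasible (p, q) points and marks each point with the letter of the winning strategy.

```python
# Classify each feasible (p, q) point by the strategy with the highest EMV,
# using the EMV expressions derived above (values in $ billions).
def best_strategy(p, q):
    emvs = {
        "A": 2.0,                            # accept $2 billion
        "B": 0.85 + 8.549 * p + 4.15 * q,    # counter $5B, refuse $3B
        "C": 1.85 + 5.15 * p + 2.5 * q,      # counter $5B, accept $3B
    }
    return max(emvs, key=emvs.get)

steps = 10
for i in range(steps, -1, -1):               # p from 1.0 down to 0.0
    p = i / steps
    row = "".join(
        best_strategy(p, j / steps) if p + j / steps <= 1 else "."
        for j in range(steps + 1)            # "." marks infeasible p + q > 1
    )
    print(f"p = {p:3.1f}  {row}")
```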

[Graph: the three indifference lines plotted with p on the vertical axis and q on the horizontal axis (both 0 to 1.0); together with the feasibility line p + q = 1, they divide the feasible triangle into four regions, labeled I, II, III, and IV.]

These three lines divide the graph into four separate regions, labeled I, II, III, and IV. Inequality (3) divides
regions I and II. For points above this line, p > 0.294 - 0.485 q, and so EMV(B) > EMV (C). Inequality (1)
divides regions II and III. For points above this line, p > 0.135 - 0.485 q, and EMV(B) > EMV(A). As a
result of this, we know that B is the preferred choice in region I and that C is the preferred choice in region
II [where EMV(C) > EMV (B) > EMV(A)].

Inequality (2) divides regions III and IV. For points above this line, p > 0.03 - 0.485 q, and EMV(C) >
EMV (A). Thus, we now know that C is the preferred choice in region III [where EMV(C) > EMV(A) and
EMV(C) > EMV(B)], and A is preferred in region IV. Thus, we can redraw the graph, eliminating the line
between regions II and III:

[Graph: the redrawn regions with the line between regions II and III removed. Region B lies above the line p = 0.294 - 0.485q, region C between that line and p = 0.03 - 0.485q, and region A below. The shaded rectangle where p > 0.15 and q > 0.35 lies entirely within the “Choose B” region.]

The shaded area in the figure represents those points for which p > 0.15 and q > 0.35. Note that all of these
points fall in the “Choose B” region. Thus, Liedtke should adopt strategy B: Counteroffer $5 billion, then
refuse if Texaco offers $3 billion.

The two-way SA feature of PrecisionTree can be used here, but with limited results. Any one probability can be varied only as far as keeps all of the probabilities between 0 and 1 while they sum to one. In this case, the most we can vary “Probability of Large Award” is between 0 and 0.5. The model is in the Excel file “Problem 5.11.xlsx.” Because cell D18 (“Probability of No Award”) contains the formula “=1 – (D12 + D16)”, “Probability of Large Award” can vary at most from 0 to 0.5. The same is true for “Probability of Medium Award.” In deciding how much you can vary the variables in a two-way analysis, the complete rectangle defined by the ranges of values must fit inside the large triangular region shown above.

The two-way sensitivity graph is on the second tab and is hard to read. Where there is a kink in the 3D graph, the optimal alternative has changed; in other words, the alternative with the maximum EMV has changed. Easier to read, but perhaps not available in the student version, is the strategy graph. The workbook contains two strategy graphs; the second one uses 50 steps for each probability and thus computes 2,500 value combinations. From these graphs, we can see clearly which alternative has the maximum EMV. Remember that PrecisionTree is limited to working with rectangular regions and that the whole region needs to fit within the triangle. Therefore, PrecisionTree’s two-way SA provides a limited, incomplete analysis compared to the algebraic solution given previously.

If the student version does not allow strategy graphs, then by calculating the differences from row to row we can see where the value changes (i.e., where the kink occurs). As shown below, the differences reveal which alternative is optimal. Taking differences in this way works only because the model is linear in the probabilities.
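A sketch of the row-difference trick, using the first column of the table below (Prob Medium Award = 0). The maximum EMV is linear in “Probability of Large Award” within a strategy region, so the row-to-row increment is constant inside a region and changes where the optimal strategy switches.

```python
# First column of the two-way data table below (Prob Medium Award = 0):
# maximum EMV as Prob Large Award goes 0, 0.05, ..., 0.5.
emv = [2, 2.0975, 2.355, 2.6125, 2.87, 3.1275, 3.4147,
       3.84215, 4.2696, 4.69705, 5.1245]

diffs = [round(b - a, 5) for a, b in zip(emv, emv[1:])]
print(diffs)
# 0.2575 = 5.15 * 0.05, the slope of "counter, accept $3B" -> that region;
# 0.42745 = 8.549 * 0.05, the slope of "counter, refuse $3B" -> that region;
# other values (0.0975, 0.2872) are rows where the optimal strategy switches.
```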

Maximum EMV ($ billions) as the two probabilities vary:

Prob Large     Prob Medium Award
Award          0        0.05     0.1      0.15     0.2      0.25     0.3      0.35     0.4      0.45     0.5
0              2        2        2.09     2.215    2.34     2.465    2.59     2.715    2.84     2.965    3.09
0.05           2.0975   2.2225   2.3475   2.4725   2.5975   2.7225   2.8475   2.9725   3.0975   3.2225   3.35245
0.1            2.355    2.48     2.605    2.73     2.855    2.98     3.105    3.23     3.3649   3.5724   3.7799
0.15           2.6125   2.7375   2.8625   2.9875   3.1125   3.2375   3.37735  3.58485  3.79235  3.99985  4.20735
0.2            2.87     2.995    3.12     3.245    3.3898   3.5973   3.8048   4.0123   4.2198   4.4273   4.6348
0.25           3.1275   3.2525   3.40225  3.60975  3.81725  4.02475  4.23225  4.43975  4.64725  4.85475  5.06225
0.3            3.4147   3.6222   3.8297   4.0372   4.2447   4.4522   4.6597   4.8672   5.0747   5.2822   5.4897
0.35           3.84215  4.04965  4.25715  4.46465  4.67215  4.87965  5.08715  5.29465  5.50215  5.70965  5.91715
0.4            4.2696   4.4771   4.6846   4.8921   5.0996   5.3071   5.5146   5.7221   5.9296   6.1371   6.3446
0.45           4.69705  4.90455  5.11205  5.31955  5.52705  5.73455  5.94205  6.14955  6.35705  6.56455  6.77205
0.5            5.1245   5.332    5.5395   5.747    5.9545   6.162    6.3695   6.577    6.7845   6.992    7.1995

Successive differences (row to row). Cells with a constant difference of 0.2575 lie where “counter and accept $3B” is preferred; cells with 0.42745 lie where “counter and refuse $3B” is preferred; the cells near the upper left correspond to “accept $2B” preferred:

0.0975    0.2225   0.2575   0.2575   0.2575   0.2575   0.2575   0.2575   0.2575   0.2575   0.26245
0.2575    0.2575   0.2575   0.2575   0.2575   0.2575   0.2575   0.2575   0.2674   0.3499   0.42745
0.2575    0.2575   0.2575   0.2575   0.2575   0.2575   0.27235  0.35485  0.42745  0.42745  0.42745
0.2575    0.2575   0.2575   0.2575   0.2773   0.3598   0.42745  0.42745  0.42745  0.42745  0.42745
0.2575    0.2575   0.28225  0.36475  0.42745  0.42745  0.42745  0.42745  0.42745  0.42745  0.42745
0.2872    0.3697   0.42745  0.42745  0.42745  0.42745  0.42745  0.42745  0.42745  0.42745  0.42745
0.42745   0.42745  0.42745  0.42745  0.42745  0.42745  0.42745  0.42745  0.42745  0.42745  0.42745
0.42745   0.42745  0.42745  0.42745  0.42745  0.42745  0.42745  0.42745  0.42745  0.42745  0.42745
0.42745   0.42745  0.42745  0.42745  0.42745  0.42745  0.42745  0.42745  0.42745  0.42745  0.42745
0.42745   0.42745  0.42745  0.42745  0.42745  0.42745  0.42745  0.42745  0.42745  0.42745  0.42745

5.12. The first thing to notice is that the net annual cost values have changed for Barnard and Charlotte, but not for Astoria, as Astoria has no rental. Specifically, when incorporating the taxes on rental income and depreciation, Barnard’s net annual cost has dropped by almost $5,000 and Charlotte’s by over $3,000. Thus, instead of an $11,000 difference in annual costs among the three properties, there is only a $6,000 difference. This improves the attractiveness of Barnard and Charlotte.

The Excel file for the model is “Problem 5.12.xlsx” and the sensitivity graphs are found in “Problem 5.12
Sensitivity Graphs.” Remember: what we call sensitivity graphs, PrecisionTree calls strategy graphs.

Many of the sensitivity analysis insights discussed in the chapter also hold here. Astoria is still the cheapest across the entire range of each of the eight input variables, except monthly rent. The main difference, as mentioned above, is that the properties are now closer in value. For example, when varying the interest rate between 4% and 8%, Astoria’s net annual cost is always less than the other two properties’ net annual costs. At an interest rate of 4%, Charlotte is within $700 of Astoria’s net annual cost, but previously, when we did not incorporate depreciation and taxes on the rental income, the closest Charlotte came to Astoria was $4,000.

A subtle difference between the analyses is the influence of the tax rate. Previously, taxes could swing Charlotte’s and Barnard’s net annual costs by $4,000 to $5,000; now the swing is only $2,000 to $2,500. Remember that the tax rate’s impact is counterintuitive in that costs go down as the tax rate increases. When a more complete tax analysis is incorporated, the overall effect of the tax rate is muted by the taxes on rental income.

While the months-occupied variable never changes the rank ordering of the properties, the analysis does show that it has a significant impact on net annual cost. Previously, when “Months Occupied” = 0, Barnard’s net annual cost was nearly $60,000; now it is only $48,218.

The monthly rent value again can make Charlotte the least expensive property. Previously, Charlotte was cheaper than Astoria when her rent was $2,100 or more per month; now Charlotte’s rent need only be above $1,900 per month for her to have the smaller net annual cost. Barnard is always the most expensive no matter the monthly rent (up to $2,900/month), but now it comes within $200 at $2,900/month.

The Excel file “Problem 5.12 Tornado Diagrams.xlsx” reports the three tornado diagrams, which now show that the loan’s interest rate is more influential than months occupied for both Barnard and Charlotte. For example, months occupied previously could vary Barnard’s net annual cost by $24,000 ($2,000 monthly rent), but now varies it by approximately $16,000. This is the buffer we mentioned. Because the interest rate can vary Barnard’s net annual cost by $19,276, interest rate has become the most influential variable for Barnard. The same holds true for Charlotte: interest rate is now more influential than months occupied.

We chose to run a two-way SA on “Monthly Rent” and “Interest Rate.” Remember that Charlotte’s rent is set at $500 less than Barnard’s. The Excel file is “Problem 5.12 Two-Way SA.xlsx.” As expected, the results show a larger region in which Charlotte is the cheapest than in the previous analysis, and, again, Barnard is never the cheapest within this rectangle. The two-way graphs show that when the interest rate is 4%, Charlotte has a lower net annual cost than Astoria when Charlotte’s rent is $1,629 or more.

[Two-way sensitivity graph of decision tree 'Problem 5.12': expected value of node 'Housing Decision' (B34), roughly $20,000 to $36,000, as Monthly Rent (Barnard) (C10) varies from $1,500 to $2,871 and Interest Rate (B4) varies from 4.0% to 8.0%.]

[Strategy region graph for node 'Housing Decision': Interest Rate (B4), 4.0% to 8.0%, versus Monthly Rent (Barnard) (C10), $1,400 to $3,000, showing the regions in which Astoria St and Charlotte St are cheapest.]

5.13. The consequence measure “Appreciation + Equity – Net Annual Cost” substantially changes the
preference order. Barnard St is now ranked first at $15,645 and Astoria St. is ranked last at $5,582. With
this measure, larger values are more preferred. The Excel file “Problem 5.13.xlsx” contains the spreadsheet
model and note that the values given in the text for Barnard and Charlotte are off. Charlotte should be
$9,584.

[Strategy region graph of decision tree 'Problem 5.13': expected value of node 'Decision' (B34), $0 to $25,000, with variation of Interest Rate (B4) from 3.5% to 8.5%; lines for Astoria St, Barnard St, and Charlotte St.]

The Barnard Street house is the most preferred property, for every value in the range, for all the variables listed in Table 5.2 except “Appreciation” and “Months Occupied.” For example, the sensitivity graph for interest rates between 4% and 8% is shown above; for each interest rate in this range, Barnard always has the maximum value. The only variables for which the rank ordering of the alternatives changes are “Appreciation” and “Months Occupied.”

[Strategy region graph of decision tree 'Problem 5.13': expected value of node 'Decision' (B34), -$10,000 to $25,000, with variation of Months Occupied (C18); lines for Astoria St, Barnard St, and Charlotte St.]

[Strategy region graph of decision tree 'Problem 5.13': expected value of node 'Decision' (B34), -$50,000 to $50,000, with variation of Appreciation Percentage (B26) from -4% to 12%; lines for Astoria St, Barnard St, and Charlotte St.]

From 0 to 12 months occupied, Barnard’s “Appreciation + Equity – Net Annual Cost” value is always larger than Charlotte’s, but it is larger than Astoria’s only when months occupied is greater than or equal to 5.

“Appreciation” is now the most influential variable. Notice that the y-axis now runs from -$50,000 to $50,000, showing that as the appreciation percentage changes, the “Appreciation + Equity – Net Annual Cost” value can vary by up to $100,000. For annual appreciation rates less than 2.8%, Astoria is the better choice; above 2.8%, Barnard is preferred.

Part c asks the students to append a chance node to Figure 5.14 to account for the uncertainty in the appreciation percentage. You can either let the students define their own chance node (number of outcomes, probabilities, etc.), or you can give them specific instructions. The file “Problem 5.13 part c.xlsx” contains our solution, and, as shown below, it has three outcomes. Note also that this is a linked tree; you can tell because the branch values are not payoffs but the input values used to calculate the payoffs.

Running a two-way SA on the probabilities “p1” and “q1” in Figure 5.14 shows that Barnard is optimal for all probability values. We set the number of steps to 50 for each of “p1” and “q1,” resulting in 2,500 evaluation points. On this measure, Barnard dominates the other two properties.

In part d, it is clear that “Net Annual Cost” attempts to capture the real, out-of-pocket costs of home ownership: these are items that Sanjay and Sarah will be paying cash for. The other consequence measure (“Appreciation + Equity – Net Annual Cost”) is harder to interpret. In some ways, it measures the annual value Sanjay and Sarah are earning in their home. When they sell their home, they will realize both the appreciation and the equity. This, however, does not happen incrementally as the measure suggests; it happens either when they sell their house or when they take out a home-equity loan. In the screenshot of the model, we see that Appreciation + Equity – Net Annual Cost = $35,470 for Barnard when there is full occupancy, $8,000 in repair and upkeep, and a 9% appreciation rate. This does not mean that Sanjay and Sarah have an additional $35,000 in hand, nor can they take out a loan for that amount.

[Strategy region graph for node 'Housing Decision': Prob of High R&U Cost (J12), 0% to 100%, versus Prob of 12 Months Occup (J11), 0% to 35%; Barnard is optimal throughout the region.]

Case Study: DuMond International
1. If the changes suggested by Dilts and Lillovich are incorporated, EMV(Current product) increases to
$930,000, but EMV(New product) stays the same at $1,100,000. Thus, the new product would be preferred.

If the changes suggested by Jenkins and Kellogg are incorporated, EMV(New product) drops to $925,000.
Recall, though, that Jenkins and Kellogg were satisfied with Milnor’s analysis of the current product, so
their changes still leave the new product as the preferred choice.

If all of the suggested changes are incorporated, EMV(New product) = $925,000 and EMV(Current
product) = $930,000. Thus, the combination of optimism about the current product and pessimism about
the new leads to a scenario in which the current product barely has a better EMV than the new one.
Because no one embraced all of the changes, though, all board members should be convinced that the new
product is the preferred choice.

This case is set up as a spreadsheet model in the Excel file “Dumond Case.xlsx.” The decision tree is structured to reference cells in the spreadsheet, so the user can vary the parameters of the model and see how the preferred decision changes.

Case Study: Strenlar, Part II


1. The solution to this case depends to a great extent on how the decision was modeled in the first place.
The sensitivity analysis that follows is based on the model discussed above in the solution for Strenlar, Part
I, in Chapter 4. The Excel solution file is “Strenlar Case Part II.xlsx.”

The table below shows the parameters for which we wish to perform a sensitivity analysis. For the Refuse PI option, these include P(Win Lawsuit), P(Manufacturing Process Works), legal fees, and profits. For the Accept Job option, the parameters are the interest rate, gross sales, and P(Manufacturing Process Works). For the Accept Lump Sum and Options alternative, the analysis focuses on the interest rate, the stock price if Strenlar succeeds, and P(Manufacturing Process Works).

                                  Pessimistic     Base           Optimistic
P(Win Lawsuit)                    50%             60%            75%
P(Mfg Process Works)              70%             80%            90%
Gross Sales                       $25,000,000     $35,000,000    $45,000,000
Legal Fees                        $90,000         $60,000        $20,000
Fixed Cost                        $8,000,000      $5,000,000     $2,000,000
Variable Cost (% of Gross Sales)  80.0%           62.9%          60.0%
PI Stock Price                    $48.00          $52.00         $57.00
Interest Rate                     5.0%            7.5%           12.5%

Sensitivity Analysis Table for Strenlar Model.

Because Refuse PI depends on profits, which in turn depend on gross sales, we have chosen to expand the model slightly. We have assumed as base values that fixed costs equal $5 million and that, for sales of $35 million, variable costs equal $22 million. This leaves profits of $8 million, as specified in the case. Specifically, we assumed:

Profit = Gross Sales – Variable Cost – Fixed Cost,

where Fixed Cost = $5 million and Variable Cost = (22/35) × Gross Sales.

Now it is possible to run a sensitivity analysis using PrecisionTree on all three variables (Gross Sales, Variable Cost, Fixed Cost) and obtain comparable results for the Refuse PI and Accept Job alternatives.
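A quick check that this assumed profit model reproduces the case figures:

```python
# Assumed profit model: variable cost is 22/35 of gross sales (62.9%,
# the base value in the table) and fixed cost is $5 million.
def profit(gross_sales, var_rate=22 / 35, fixed_cost=5_000_000):
    return gross_sales - var_rate * gross_sales - fixed_cost

print(round(profit(35_000_000)))   # 8000000 at base-case sales, as in the case
print(round(profit(25_000_000)))   # pessimistic gross-sales endpoint
print(round(profit(45_000_000)))   # optimistic gross-sales endpoint
```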

The tornado graph for the Refuse PI alternative shows that Variable Cost and Gross Sales are the two most influential variables. These are also the two variables that can cause Fred’s payoff to drop below $3.9 million.

[Tornado graph of Refuse PI (expected value of entire model): bars for Variable Costs (B14), Gross Sales (B5), Fixed Costs (B13), Prob of Winning Case (B4), Prob(Mfg Process Works) (B3), Legal Fees (B8), Interest Rate (B9), and PI Stock Price (B10); expected-value axis from $1,000,000 to $7,000,000.]

The tornado diagram for Accept Job shows that Fred’s payoffs are always below $3.2 million and that Gross Sales is the most influential variable, capable of pushing his payoff below $2 million. The sensitivity graph shows that Refuse PI has the better payoff unless gross sales fall below $27 million.

[Tornado graph of Accept Job (expected value of entire model): bars for Gross Sales (B5), Prob(Mfg Process Works) (B3), Interest Rate (B9), Legal Fees (B8), Prob of Winning Case (B4), PI Stock Price (B10), Fixed Costs (B13), and Variable Costs (B14); expected-value axis from $1,800,000 to $3,200,000.]

[Strategy region graph of decision tree 'Strenlar': expected value of node 'Decision' (B23), $1,000,000 to $5,500,000, with variation of Gross Sales (B5) from $20,000,000 to $50,000,000; lines for Refuse PI, Accept job, and Accept lump sum.]

The only other variable that causes a change in the rank ordering of the alternatives is Variable Costs. The sensitivity graph shows that variable cost affects only the Refuse PI alternative: if Fred takes the job at PI, his royalties are tied to gross sales, not profit.

[Strategy region graph of decision tree 'Strenlar': expected value of node 'Decision' (B23), $1,000,000 to $7,000,000, with variation of Variable Costs (B14) from 55% to 85%; lines for Refuse PI, Accept job, and Accept lump sum.]
Another intriguing possibility is to see how pessimistic Fred could be before Refuse PI is no longer optimal. Consider Scenario 1 in the table below. In this pessimistic scenario, every variable is at or below its base (best guess) value, and four are at their lowest values. Even in this case, Refuse PI has a larger EMV than Accept Job (compare $1.77 million to $1.74 million). In Scenario 2, we kept all the values from Scenario 1, except that we increased the stock price to its upper bound of $57. Doing so changed only the lump-sum payoff, from $0.9 million to $1.4 million, a $500,000 gain, but not enough to overtake the other alternatives. Scenario 3 keeps the same values as Scenario 1, except that the fixed cost is increased by $89,505; the EMVs of Refuse PI and Accept Job are then equal. In other words, Fred would have to be quite pessimistic overall before the Refuse PI option stops being optimal.

                                  Scenario 1                     Scenario 2     Scenario 3
P(Win Lawsuit)                    60% (Base Value)               60%            60%
P(Mfg Process Works)              70% (Pessimistic)              70%            70%
Gross Sales                       $25,000,000 (Pessimistic)      $25,000,000    $25,000,000
Legal Fees                        $90,000 (Pessimistic)          $90,000        $90,000
Fixed Cost                        $5,000,000 (Base Value)        $5,000,000     $5,089,505
Variable Cost (% of Gross Sales)  60% (Slightly Pessimistic)     60%            60%
PI Stock Price                    $48.00 (Pessimistic)           $57.00         $48.00
Interest Rate                     7.5% (Base Value)              7.5%           7.5%

Case Study: Job Offers, Part II


1. The sensitivity analysis gives the following results:

Expected Overall Score
P($1500)    Madison    MPR    Pandemonium
0           50         36     50
1           58         36     50

Thus, there is no question about the preference for Madison Publishing; with the probability set at the most
pessimistic value of 0, Madison and Pandemonium are equivalent. MPR, of course, is never in the running
at all. A one-way sensitivity plot for the probability of disposable income being $1500 for the Madison job
generated by PrecisionTree is shown in the second worksheet. The one-way plot does not show the
alternative policies, only the range of possible outcome results.

2. The algebraically generated two-way sensitivity graph is shown below. The labels indicate the optimal
choice in each region. The graph makes good sense! When the weights on snowfall and income are small
(and hence the weight for the magazine score is high), Pandemonium is the best, reflecting its strong
showing in the magazine dimension. Likewise, when the magazine weight is low, MPR is best, reflecting
its strength in the income and snowfall dimensions, but poor showing with the magazine score.

[Two-way sensitivity graph: ki (vertical axis, 0 to 1.0) versus ks (horizontal axis, 0 to 1.0). MPR is optimal in the upper region, Madison in the middle band, and Pandemonium in the small lower-left region.]

Sensitivity analysis using PrecisionTree. The decision model is in the Excel file “Job Offers Case II.xlsx.”

Question 1: To vary only the probability of $1,500 from 0 to 1, you must choose Spreadsheet Cell for Type of Value in the SA dialog box.

Question 2: You can use PrecisionTree to create the same optimal regions we derived algebraically above. To do so, run a two-way SA on the model, varying the weight for snowfall on the x-axis and the weight for income on the y-axis. The weight for magazine rating is given by the formula 1 – (snowfall wt + income wt). Vary both weights from 0 to 1. This creates a rectangular region rather than the desired triangular region (see the figure below), but we can simply ignore the region above the main diagonal. We can do this for weights, but not for probabilities. With the weights, if one of them is negative, the model still calculates a value; that value is meaningless, which is why we ignore the upper triangular region of the two-way analysis. We cannot do the same for probabilities, because probabilities can never be negative: once PrecisionTree encounters a negative probability, it will not calculate any results.
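The “ignore the upper triangle” device is easy to replicate outside PrecisionTree. The sketch below sweeps both weights over a grid and simply masks the cells where the implied magazine weight would be negative; the three attribute scores per job are placeholders, not the case values.

```python
# Two-way weight sweep with the infeasible triangle masked out. Overall
# score = ks*snow + ki*income + (1 - ks - ki)*magazine. The per-job
# attribute scores below are placeholders, not the case values.
jobs = {"Madison": (40, 58, 80), "MPR": (90, 90, 10), "Pandemonium": (20, 40, 95)}
marks = {"Madison": "M", "MPR": "R", "Pandemonium": "P"}

def best(ks, ki):
    if ks + ki > 1:                    # magazine weight would be negative
        return "."                     # infeasible: ignore this cell
    def overall(job):
        snow, income, magazine = jobs[job]
        return ks * snow + ki * income + (1 - ks - ki) * magazine
    return marks[max(jobs, key=overall)]

for i in range(10, -1, -1):            # ki from 1.0 down to 0.0
    ki = i / 10
    print(f"ki = {ki:3.1f}  " + "".join(best(j / 10, ki) for j in range(11)))
```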

[Strategy region graph for node 'Decision': Income Wt (F4) versus Snowfall Wt (G4), each 0 to 1.00, marking the regions in which Madison Publishing, MPR Manufacturing, and Pandemonium Pizza are optimal.]

Case Study: MANPADS


See file “MANPADS.xlsx” for the decision tree and the full analysis.

1. Decision tree shown below. Using the values given in the case,
E(Cost | Countermeasures) = $14,574 million
E(Cost | No Countermeasures) = $17,703 million

Putting the tree together is relatively straightforward. The only odd aspect of the model is working out P(Interdiction | Attempt, Countermeasures). The information in the case is not entirely clear; in this solution, f is interpreted as the extent to which P(Interdiction | Attempt) is increased. The wording in the case suggests that f should be interpreted as the proportional decrease in P(Interdiction | Attempt), but that does not necessarily make sense. In the original article, the base value for f was set to 0, and the range was 0 to 0.25. You may want to be lenient with the students on this point.

2. Risk profiles shown below. The real difference is that adopting countermeasures greatly reduces the
chance of a very large loss. So the policy question is whether the cost is worth that reduction in risk. The
cumulative graph (see the Excel file) shows that neither alternative stochastically dominates the other.

[Risk profiles for decision tree 'MANPADS' (choice comparison for node 'Decision'): probability, 0% to 80%, against cost, -$20,000 to $120,000 ($ millions), for Countermeasures and No Countermeasures.]

3. See the Excel file for both one-way and two-way sensitivity analyses. Changing the inputs in this model
can result in changes to both alternatives, so this sensitivity analysis has been run on the difference between
the expected costs (B45 on "MANPADS Tree"). When the difference is positive, Countermeasures has the
lower expected cost. Also, without any real guidance on reasonable ranges, I've varied each input by plus or
minus 25%.

The tornado and spider charts in the Excel file show that the top three variables in terms of sensitivity are
P(Attempt), Economic Loss, and P(Hit | Attack). Moreover, only the first two could result in No
Countermeasures being preferred. But remember that this is a one-way analysis.

The two-way analysis in the file shows how the strategy changes as we vary P(Attempt), Economic Loss,
and P(Hit | Attack). (So it is actually a three-way analysis.) When P(Hit | Attack) = 0.80, the region for No
Countermeasures is fairly small. When P(Hit | Attack) decreases to 0.60, though, the No Countermeasures
region is much larger. So it would not take too much movement of these three variables together to result in
No Countermeasures having the lower expected cost.
