Decision Analysis
Learning Objectives:
After the completion of the chapter, the students will be able to:
list the steps of the decision making process;
describe the types of decision making environments;
make decisions under certainty;
use probability values to make decisions under risk;
develop accurate and useful decision trees;
revise probability estimates using Bayesian analysis; and
explain the importance and use of utility theory in decision making.
Step 1. The problem that John Thompson identifies is whether to expand his product
line by manufacturing and marketing a new product, backyard storage sheds.
Step 2. Thompson’s second step is to generate the alternatives that are available to
him. In decision theory, an alternative is defined as a course of action or a strategy
that the decision maker can choose. John decides that his alternatives are to construct
(1) a large new plant to manufacture the storage sheds, (2) a small plant, or (3) no
plant at all (i.e., he has the option of not developing the new product line).
The next step involves identifying the possible outcomes of the various alternatives. A
common mistake is to forget about some of the possible outcomes. Optimistic decision
makers tend to ignore bad outcomes, whereas pessimistic managers may discount a
favorable outcome. If you don’t consider all possibilities, you will not be making a
logical decision, and the results may be undesirable. In decision theory, those
outcomes over which the decision maker has little or no control are called states of
nature.
Step 3. Thompson determines that there are only two possible outcomes: the market
for the storage sheds could be favorable, meaning that there is a high demand for the
product, or it could be unfavorable, meaning that there is a low demand for the sheds.
Once the alternatives and states of nature have been identified, the next step is to
express the payoff resulting from each possible combination of alternatives and
outcomes. In decision theory, we call such payoffs or profits conditional values. Not
every decision, of course, can be based on money alone—any appropriate means of
measuring benefit is acceptable.
Step 4. Because Thompson wants to maximize his profits, he can use profit to evaluate
each consequence.
John Thompson has already evaluated the potential profits associated with the various
outcomes. With a favorable market, he thinks a large facility would result in a net profit
of PhP200,000 to his firm. This PhP200,000 is a conditional value because
Thompson’s receiving the money is conditional upon both his building a large factory
and having a good market. The conditional value if the market is unfavorable would be
a PhP180,000 net loss. A small plant would result in a net profit of PhP100,000 in a
favorable market, but a net loss of PhP20,000 would occur if the market was
unfavorable. Finally, doing nothing would result in PhP0 profit in either market. The
easiest way to present these values is by constructing a decision table, sometimes
called a payoff table. A decision table for Thompson’s conditional values is shown in
Table 3.1. All of the alternatives are listed down the left side of the table, and all of the
possible outcomes or states of nature are listed across the top. The body of the table
contains the actual payoffs.
Table 3.1 Decision Table for Thompson’s Conditional Values
                        State of Nature
Alternative             Favorable Market    Unfavorable Market
Construct a large plant 200,000 -180,000
Construct a small plant 100,000 -20,000
Do nothing 0 0
Note: it is important to include all alternatives, including “do nothing.”
Steps 5 and 6. The last two steps are to select a decision theory model and apply it to
the data to help make the decision. Selecting the model depends on the environment
in which you’re operating and the amount of risk and uncertainty involved.
The types of decisions people make depend on how much knowledge or information
they have about the situation. There are three decision-making environments:
In the environment of decision making under certainty, decision makers know with
certainty the consequence of every alternative or decision choice. Naturally, they will
choose the alternative that will maximize their well-being or will result in the best
outcome. For example, let’s say that you have PhP1,000 to invest for a 1-year period.
One alternative is to open a savings account paying 4% interest, and another is to
invest in a government Treasury bond paying 6% interest. If both investments are
secure and guaranteed, there is a certainty that the Treasury bond will pay a higher
return. The return after 1 year will be PhP60 in interest.
In decision making under uncertainty, there are several possible outcomes for each
alternative, and the decision maker does not know the probabilities of the various
outcomes.
In decision making under risk, there are several possible outcomes for each
alternative, and the decision maker knows the probability of occurrence of each
outcome. We know, for example, that when playing cards using a standard deck, the
probability of being dealt a club is 0.25. The probability of rolling a 5 on a die is 1/6. In
decision making under risk, the decision maker usually attempts to maximize his or her
expected well-being. Decision theory models for business problems in this
environment typically employ two equivalent criteria: maximization of expected
monetary value and minimization of expected opportunity loss.
In the Thompson Lumber example, John Thompson is faced with decision making
under uncertainty. If either a large plant or a small plant is constructed, the actual
payoff depends on the state of nature, and probabilities are not known. If probabilities
for a favorable market and for an unfavorable market were known, the environment
would change from uncertainty to risk. For the third alternative, do nothing, the payoff
does not depend on the state of nature and is known with certainty.
The criteria for decision making under uncertainty (and also for decision making under
risk) are based on the assumption that larger payoff values are better and high values
are desirable. For payoffs such as profit, total sales, total
return on investment, and interest earned, the best decision would be one that resulted
in some type of maximum payoff. However, there are situations in which lower payoff
values (e.g., cost) are better, and these payoffs would be minimized rather than
maximized. The statement of the decision criteria would be modified slightly for such
minimization problems.
Several criteria exist for making decisions under conditions of uncertainty. They are as
follows:
1. Optimistic
2. Pessimistic
3. Criterion of realism (Hurwicz)
4. Equally likely (Laplace)
5. Minimax regret
The first four criteria can be computed directly from the decision (payoff) table, whereas
the minimax regret criterion requires use of an opportunity loss table. Let’s take a look
at each of the five models and apply them to the Thompson Lumber example.
Optimistic
In using the optimistic criterion, the best (maximum) payoff for each alternative is
considered, and the alternative with the best (maximum) of these is selected. Hence,
the optimistic criterion is sometimes called the maximax criterion. In Table 3.2, we see
that Thompson’s optimistic choice is the first alternative, “construct a large plant.” By
using this criterion, the highest of all possible payoffs (PhP200,000 in this example)
may be achieved, while if any other alternative were selected, it would be impossible
to achieve a payoff this high.
In using the optimistic criterion for minimization problems in which lower payoffs (e.g.,
cost) are better, you would look at the best (minimum) payoff for each alternative and
choose the alternative with the best (minimum) of these.
Table 3.2 Thompson’s Maximax Decision

                        State of Nature
Alternative             Favorable Market   Unfavorable Market   Maximum in a Row
Construct a large plant 200,000            -180,000             200,000 (Maximax)
Construct a small plant 100,000            -20,000              100,000
Do nothing              0                  0                    0
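The maximax rule can be sketched in a few lines of code (Python is used here purely for illustration; the chapter prescribes no language):

```python
# Conditional payoffs from Table 3.1, in PhP: [favorable market, unfavorable market].
payoffs = {
    "construct a large plant": [200_000, -180_000],
    "construct a small plant": [100_000, -20_000],
    "do nothing": [0, 0],
}

# Maximax: take the best (maximum) payoff in each row, then choose the
# alternative whose best payoff is the largest.
row_best = {alt: max(vals) for alt, vals in payoffs.items()}
choice = max(row_best, key=row_best.get)
print(choice, row_best[choice])  # construct a large plant 200000
```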
Pessimistic
In using the pessimistic criterion, the worst (minimum) payoff for each alternative is
considered, and the alternative with the best (maximum) of these is selected. Hence,
the pessimistic criterion is sometimes called the maximin criterion. This criterion
guarantees the payoff will be at least the maximin value (the best of the worst values).
Choosing any other alternative may allow a worse (lower) payoff to occur.
Thompson’s maximin choice, “do nothing,” is shown in Table 3.3. This decision is
associated with the maximum of the minimum number within each row or alternative.
In using the pessimistic criterion for minimization problems in which lower payoffs (e.g.,
cost) are better, you would look at the worst (maximum) payoff for each alternative and
choose the alternative with the best (minimum) of these.
Both the maximax and maximin criteria consider only one extreme payoff for each
alternative, while all other payoffs are ignored. The next criterion considers both of
these extremes.
Table 3.3 Thompson’s Maximin Decision

                        State of Nature
Alternative             Favorable Market   Unfavorable Market   Minimum in a Row
Construct a large plant 200,000            -180,000             -180,000
Construct a small plant 100,000            -20,000              -20,000
Do nothing              0                  0                    0 (Maximin)
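The maximin rule differs from maximax only in which extreme it examines first. A minimal Python sketch (our illustration, not the chapter's notation):

```python
# Conditional payoffs from Table 3.1, in PhP: [favorable market, unfavorable market].
payoffs = {
    "construct a large plant": [200_000, -180_000],
    "construct a small plant": [100_000, -20_000],
    "do nothing": [0, 0],
}

# Maximin: take the worst (minimum) payoff in each row, then choose the
# alternative whose worst payoff is the largest.
row_worst = {alt: min(vals) for alt, vals in payoffs.items()}
choice = max(row_worst, key=row_worst.get)
print(choice, row_worst[choice])  # do nothing 0
```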
Often called the weighted average, the criterion of realism (the Hurwicz criterion)
is a compromise between an optimistic and a pessimistic decision. To begin, a
coefficient of realism, 𝛼, is selected. This measures the degree of optimism of the
decision maker and is between 0 and 1. When 𝛼 is 1, the decision maker is 100%
optimistic about the future. When 𝛼 is 0, the decision maker is 100% pessimistic about
the future. The advantage of this approach is that it allows the decision maker to build
in personal feelings about relative optimism and pessimism. The weighted average is
computed as follows:

Weighted average = α(best payoff in row) + (1 - α)(worst payoff in row)
For a maximization problem, the best payoff for an alternative is the highest value, and
the worst payoff is the lowest value. Note that when 𝛼 = 1, this is the same as the
optimistic criterion, and when 𝛼 = 0, this is the same as the pessimistic criterion. This
value is computed for each alternative, and the alternative with the highest weighted
average is then chosen.
If we assume that John Thompson sets his coefficient of realism, α, to be 0.80, the
best decision would be to construct a large plant. As seen in Table 3.4, this alternative
has the highest weighted average: (0.80)(PhP200,000) + (0.20)(-PhP180,000) = PhP124,000.
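The Hurwicz weighted averages for all three alternatives can be sketched as follows (Python used for illustration only):

```python
# Criterion of realism (Hurwicz) with John's coefficient of realism alpha = 0.80.
payoffs = {
    "construct a large plant": [200_000, -180_000],
    "construct a small plant": [100_000, -20_000],
    "do nothing": [0, 0],
}
alpha = 0.80

# Weighted average = alpha * (best payoff) + (1 - alpha) * (worst payoff).
weighted = {alt: alpha * max(v) + (1 - alpha) * min(v) for alt, v in payoffs.items()}
choice = max(weighted, key=weighted.get)
print(choice, weighted[choice])  # construct a large plant, about 124,000
```

Note that setting `alpha = 1` reproduces the maximax choice and `alpha = 0` reproduces the maximin choice, as the text states.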
One criterion that uses all the payoffs for each alternative is the equally likely, also
called Laplace, decision criterion. This involves finding the average payoff for each
alternative and selecting the alternative with the best or highest average. The equally
likely approach assumes that all probabilities of occurrence for the states of nature are
equal, and thus each state of nature is equally likely.
The equally likely choice for Thompson Lumber is the second alternative, “construct a
small plant.” This strategy, shown in Table 3.5, is the one with the maximum average
payoff.
In using the equally likely criterion for minimization problems, the calculations are
exactly the same, but the best alternative is the one with the lowest average payoff.
Table 3.5 Thompson’s Equally Likely Decision

                        State of Nature
Alternative             Favorable Market   Unfavorable Market   Row Average
Construct a large plant 200,000            -180,000             10,000
Construct a small plant 100,000            -20,000              40,000 (Equally likely)
Do nothing              0                  0                    0
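The equally likely (Laplace) averages can be sketched as follows (Python used for illustration only):

```python
# Equally likely (Laplace): average the payoffs in each row and pick the best.
payoffs = {
    "construct a large plant": [200_000, -180_000],
    "construct a small plant": [100_000, -20_000],
    "do nothing": [0, 0],
}
row_avg = {alt: sum(v) / len(v) for alt, v in payoffs.items()}
choice = max(row_avg, key=row_avg.get)
print(choice, row_avg[choice])  # construct a small plant 40000.0
```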
Minimax Regret
Opportunity loss or regret refers to the difference between the optimal profit or payoff
for a given state of nature and the actual payoff received for a particular decision for
that state of nature. In other words, it’s the amount lost by not picking the best
alternative in a given state of nature.
The first step is to create the opportunity loss table by determining the opportunity
loss for not choosing the best alternative for each state of nature. Opportunity loss for
any state of nature, or any column, is calculated by subtracting each payoff in the
column from the best payoff in the same column. For a favorable market, the best
payoff is PhP200,000 as a result of the first alternative, “construct a large plant.” The
opportunity loss is 0, meaning that it is impossible to achieve a higher payoff in this
state of nature. If the second alternative is selected, a profit of PhP100,000 would be
realized in a favorable market, and this is compared to the best payoff of PhP200,000.
Thus, the opportunity loss is 200,000 - 100,000 = 100,000. Similarly, if “do nothing” is
selected, the opportunity loss would be 200,000 - 0 = 200,000.
For an unfavorable market, the best payoff is PhP0 as a result of the third alternative,
“do nothing,” so this has 0 opportunity loss. The opportunity losses for the other
alternatives are found by subtracting the payoffs from this best payoff (PhP0) in this
state of nature, as shown in Table 3.6.
In calculating the opportunity loss for minimization problems such as those involving
costs, the best (lowest) payoff or cost in a column is subtracted from each payoff in
that column. Once the opportunity loss table has been constructed, the minimax regret
criterion is applied in exactly the same way as just described. The maximum
opportunity loss for each alternative is found, and the alternative with the minimum of
these maximums is selected. As with maximization problems, the opportunity loss can
never be negative.
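The two steps just described, building the opportunity loss table and then applying minimax regret, can be sketched as follows (Python used for illustration only):

```python
# Minimax regret: build the opportunity loss table, then choose the
# alternative with the smallest maximum regret.
payoffs = {
    "construct a large plant": [200_000, -180_000],
    "construct a small plant": [100_000, -20_000],
    "do nothing": [0, 0],
}
n_states = 2

# Best payoff in each column (state of nature).
col_best = [max(p[s] for p in payoffs.values()) for s in range(n_states)]

# Regret = best payoff in the column minus the payoff actually received.
regret = {alt: [col_best[s] - p[s] for s in range(n_states)]
          for alt, p in payoffs.items()}

max_regret = {alt: max(r) for alt, r in regret.items()}
choice = min(max_regret, key=max_regret.get)
print(choice, max_regret[choice])  # construct a small plant 100000
```

As the text notes, every regret computed this way is nonnegative, because each payoff is subtracted from the best payoff in its own column.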
Decision making under risk is a decision situation in which several possible states
of nature may occur and the probabilities of these states of nature are known. We
consider one of the most popular methods of making decisions under risk: selecting
the alternative with the highest expected monetary value (or simply expected
value). We also use the probabilities with the opportunity loss table to minimize the
expected opportunity loss.
Expected Monetary Value
Given a decision table with conditional values (payoffs) that are monetary values and
probability assessments for all states of nature, it is possible to determine the
expected monetary value (EMV) for each alternative. The expected value, or the
mean value, is the long-run average value of that decision. The EMV for an alternative
is just the sum of possible payoffs of the alternative, each weighted by the probability
of that payoff occurring.
EMV(alternative) = ∑ (payoff in state of nature i) x (probability of state of nature i)

where ∑ denotes the sum over all states of nature. Written out,

EMV(alternative) = (payoff in first state of nature) x (probability of first state of nature)
+ (payoff in second state of nature) x (probability of second state of nature)
+ ... + (payoff in last state of nature) x (probability of last state of nature)
The alternative with the maximum EMV is then chosen. Suppose that John Thompson
now believes that the probability of a favorable market is exactly the same as the
probability of an unfavorable market; that is, each state of nature has a 0.50 probability.
Which alternative would give the greatest EMV? To determine this, John has expanded
the decision table, as shown in Table 3.9. His calculations follow:

EMV(large plant) = (0.50)(PhP200,000) + (0.50)(-PhP180,000) = PhP10,000
EMV(small plant) = (0.50)(PhP100,000) + (0.50)(-PhP20,000) = PhP40,000
EMV(do nothing) = (0.50)(PhP0) + (0.50)(PhP0) = PhP0
The largest expected value (40,000) results from the second alternative, “construct a
small plant.” Thus, Thompson should proceed with the project and put up a small plant
to manufacture storage sheds. The EMVs for constructing the large plant and for doing
nothing are 10,000 and 0, respectively.
When using the EMV criterion with minimization problems, the calculations are the
same, but the alternative with the smallest EMV is selected.
Table 3.9 Decision Table with Probabilities and EMVs for Thompson Lumber
                        State of Nature
Alternative             Favorable Market   Unfavorable Market   EMV
Construct a large plant 200,000            -180,000             10,000
Construct a small plant 100,000            -20,000              40,000 (best EMV)
Do nothing              0                  0                    0
Probabilities           0.50               0.50
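The EMV calculations in Table 3.9 can be sketched as follows (Python used for illustration only):

```python
# EMV with P(favorable) = P(unfavorable) = 0.50, as in Table 3.9.
payoffs = {
    "construct a large plant": [200_000, -180_000],
    "construct a small plant": [100_000, -20_000],
    "do nothing": [0, 0],
}
probs = [0.50, 0.50]

# EMV = sum over states of (probability x payoff).
emv = {alt: sum(p * v for p, v in zip(probs, vals)) for alt, vals in payoffs.items()}
choice = max(emv, key=emv.get)
print(choice, emv[choice])  # construct a small plant 40000.0
```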
There are times when accurate and complete information is available for the decision
maker to take advantage of. But such information is not free. Perfect information can
come from accurate historical data or from a technical analysis of the market or of a
production process. The question is, how much does such information cost? To help a
manager assess how much that information is worth, he can compute the EMV with perfect
information.
The expected value with perfect information is the expected or average return, in the
long run, if we have perfect information before a decision has to be made. To calculate
this value, we choose the best alternative for each state of nature and multiply its
payoff times the probability of occurrence of that state of nature.
EVwPI = (best payoff in first state of nature) x (probability of first state of nature)
+ (best payoff in second state of nature) x (probability of second state of nature)
+ ... + (best payoff in last state of nature) x (probability of last state of nature)
By referring to Table 3.9, Thompson can calculate the maximum that he would pay for
information, that is, the EVPI. He follows a three-stage process. First, the best payoff
in each state of nature is found. If the perfect information says the market will be
favorable, the large plant will be constructed, and the profit will be PhP200,000. If the
perfect information says the market will be unfavorable, the “do nothing” alternative is
selected, and the profit will be 0. These values are shown in the “with perfect
information” row in Table 3.10. Second, the expected value with perfect information is
computed:

EVwPI = (0.50)(PhP200,000) + (0.50)(PhP0) = PhP100,000

Then, using this result, EVPI is calculated:

EVPI = EVwPI - best EMV = PhP100,000 - PhP40,000 = PhP60,000
Table 3.10 Decision Table with Perfect Information for Thompson Lumber

                          State of Nature
Alternative               Favorable Market   Unfavorable Market   EMV
Construct a large plant   200,000            -180,000             10,000
Construct a small plant   100,000            -20,000              40,000
Do nothing                0                  0                    0
With perfect information  200,000            0                    100,000 (EVwPI)
Probabilities             0.50               0.50
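The EVPI computation just described can be sketched as follows (Python used for illustration only):

```python
# EVPI = EVwPI - best EMV, using the Thompson Lumber data.
payoffs = {
    "construct a large plant": [200_000, -180_000],
    "construct a small plant": [100_000, -20_000],
    "do nothing": [0, 0],
}
probs = [0.50, 0.50]

emv = {alt: sum(p * v for p, v in zip(probs, vals)) for alt, vals in payoffs.items()}
best_emv = max(emv.values())  # 40,000 (small plant)

# Best payoff in each state of nature, weighted by that state's probability.
col_best = [max(p[s] for p in payoffs.values()) for s in range(2)]
evwpi = sum(p * b for p, b in zip(probs, col_best))  # 0.5*200,000 + 0.5*0

evpi = evwpi - best_emv
print(evpi)  # 60000.0
```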
In finding the EVPI for minimization problems, the approach is similar. The best payoff
in each state of nature is found, but this is the lowest payoff for that state of nature
rather than the highest. The EVwPI is calculated from these lowest payoffs, and this is
compared to the best (lowest) EMV without perfect information. The EVPI is the
improvement that results, and this is the best EMV - EVwPI.
It is important to note that minimum EOL will always result in the same decision as
maximum EMV and that the EVPI will always equal the minimum EOL. Referring to
the Thompson case, we used the payoff table to compute the EVPI to be PhP60,000.
Note that this is the minimum EOL we just computed.
Sensitivity Analysis
In previous sections, we determined that the best decision (with the probabilities
known) for Thompson Lumber was to construct the small plant, with an expected value
of PhP 40,000. This conclusion depends on the values of the economic consequences
and the two probability values of a favorable and an unfavorable market. Sensitivity
analysis investigates how our decision might change given a change in the problem
data. In this section, we investigate the impact that a change in the probability values
would have on the decision facing Thompson Lumber. We first define the following
variable:

P = probability of a favorable market

Because there are only two states of nature, the probability of an unfavorable market
must be 1 - P.
We can now express the EMVs in terms of P, as shown in the following equations. A
graph of these EMV values is shown in Figure 3.1.
EMV(large plant) = PhP200,000P - PhP180,000(1 - P) = PhP380,000P - PhP180,000
EMV(small plant) = PhP100,000P - PhP20,000(1 - P) = PhP120,000P - PhP20,000
EMV(do nothing) = PhP0
As you can see in Figure 3.1, the best decision is to do nothing as long as P is between
0 and the probability associated with point 1, where the EMV for doing nothing is equal
to the EMV for the small plant. When P is between the probabilities for points 1 and 2,
the best decision is to build the small plant. Point 2 is where the EMV for the small
plant is equal to the EMV for the large plant. When P is greater than the probability for
point 2, the best decision is to construct the large plant.
Figure 3.1 Expected Monetary Value of Each Alternative as a Function of P (the EMV lines cross at points 1 and 2)
Of course, this is what you would expect as P increases. The value of P at points 1
and 2 can be computed as follows:

Point 1: EMV(do nothing) = EMV(small plant)
0 = PhP120,000P - PhP20,000
P = 20,000/120,000 = 0.167

Point 2: EMV(small plant) = EMV(large plant)
PhP120,000P - PhP20,000 = PhP380,000P - PhP180,000
260,000P = 160,000
P = 160,000/260,000 = 0.615
The results of this sensitivity analysis are displayed in the following table:
Best Alternative Range of P Values
Do nothing Less than 0.167
Construct a small plant 0.167 – 0.615
Construct a large plant Greater than 0.615
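The sensitivity analysis above can be sketched in code (Python used for illustration; the helper `best_alternative` is our own, not the chapter's notation):

```python
# Indifference probabilities from the EMV lines:
#   EMV(large plant) = 380,000P - 180,000
#   EMV(small plant) = 120,000P - 20,000
#   EMV(do nothing)  = 0
p1 = 20_000 / 120_000    # Point 1: 0 = 120,000P - 20,000
p2 = 160_000 / 260_000   # Point 2: 120,000P - 20,000 = 380,000P - 180,000

def best_alternative(p):
    """Hypothetical helper: the alternative with the highest EMV at P = p."""
    emv = {
        "do nothing": 0.0,
        "construct a small plant": 120_000 * p - 20_000,
        "construct a large plant": 380_000 * p - 180_000,
    }
    return max(emv, key=emv.get)

print(round(p1, 3), round(p2, 3))  # 0.167 0.615
print(best_alternative(0.10))      # do nothing
print(best_alternative(0.40))      # construct a small plant
print(best_alternative(0.80))      # construct a large plant
```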
A Minimization Example
The previous examples have illustrated how to apply the decision-making criterion
when the payoffs are to be maximized. The following example illustrates how the
criteria are applied to problems in which the payoffs are costs that should be
minimized.
Table 3.12 Payoff Table with Monthly Copy Costs for DIT
Alternative   10,000 Copies per Month   20,000 Copies per Month   30,000 Copies per Month
Machine A 950 1,050 1,150
Machine B 850 1,100 1,350
Machine C 700 1,000 1,300
If the decision maker is pessimistic, only the worst (maximum) payoff for each decision
is considered. These are also shown in Table 3.13, and the best (minimum) of these
is 1,150. Thus, Machine A would be selected based on the pessimistic criterion. This
would guarantee that the cost would be no more than 1,150, regardless of which state
of nature occurred.
Using the Hurwicz criterion, if we assume that the decision maker is 70% optimistic
(the coefficient of realism is 0.7), the weighted average of the best and the worst payoff
for each alternative would be calculated using the formula

Weighted average = 0.7(best payoff) + 0.3(worst payoff)

For these costs, the best payoff is the lowest cost and the worst payoff is the highest cost:
Machine A: 0.7(950) + 0.3(1,150) = 1,010
Machine B: 0.7(850) + 0.3(1,350) = 1,000
Machine C: 0.7(700) + 0.3(1,300) = 880
The decision would be to select Machine C based on this criterion because it has the
lowest weighted average cost.
For the equally likely criterion, the average payoff for each machine would be
calculated:
Machine A: (950 + 1,050 + 1,150)/3 = 1,050
Machine B: (850 + 1,100 + 1,350)/3 = 1,100
Machine C: (700 + 1,000 + 1,300)/3 = 1,000
Based on the equally likely criterion, Machine C would be selected because it has the
lowest average cost.
To apply the EMV criterion, probabilities must be known for each state of nature. Past
records indicate that 40% of the time the number of copies made in a month was
10,000, while 30% of the time it was 20,000 and 30% of the time it was 30,000. The
probabilities for the three states of nature would be 0.4, 0.3, and 0.3. We can use these
to calculate the EMVs, and the results are shown in Table 3.14. Machine C would be
selected because it has the lowest EMV. The monthly cost would average PhP970
with this machine, while the other machines would average a higher cost.
To find the EVPI, we first find the payoffs (costs) that would be experienced with perfect
information. The best payoff in each state of nature is the lowest value (cost) in that
state of nature, as shown in the bottom row of Table 3.14. These values are used to
calculate the EVwPI. With these costs, we find

EVwPI = (0.4)(PhP700) + (0.3)(PhP1,000) + (0.3)(PhP1,150) = PhP925
EVPI = best EMV - EVwPI = PhP970 - PhP925 = PhP45
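The EMV and EVPI calculations for this minimization problem can be sketched as follows (Python used for illustration only; note that "best" now means lowest cost):

```python
# DIT copier costs (Table 3.12), PhP per month; states = 10k, 20k, 30k copies.
costs = {
    "Machine A": [950, 1_050, 1_150],
    "Machine B": [850, 1_100, 1_350],
    "Machine C": [700, 1_000, 1_300],
}
probs = [0.4, 0.3, 0.3]

# Minimization: the best EMV is the LOWEST expected cost.
emv = {m: sum(p * c for p, c in zip(probs, row)) for m, row in costs.items()}
best = min(emv, key=emv.get)

# EVwPI uses the lowest cost in each state; EVPI = best EMV - EVwPI.
col_best = [min(row[s] for row in costs.values()) for s in range(3)]
evwpi = sum(p * c for p, c in zip(probs, col_best))
evpi = emv[best] - evwpi
print(best, emv[best], evpi)  # Machine C, expected cost 970, EVPI 45
```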
To apply criteria based on opportunity loss, we must first develop the opportunity loss
table. In each state of nature, the opportunity loss indicates how much worse each
payoff is than the best possible payoff in that state of nature. The best payoff (cost)
would be the lowest cost. Thus, to get the opportunity loss in this case, we subtract the
lowest value in each column from all the values in that column, and we obtain the
opportunity loss table.
Once the opportunity loss table has been developed, the minimax regret criterion is
applied exactly as it was for the Thompson Lumber example. The maximum regret for
each alternative is found, and the alternative with the minimum of these maximums is
selected. As seen in Table 3.15, the minimum of these maximums is 150, so Machine
C would be selected based on this criterion.
The probabilities are used to compute the expected opportunity losses as shown in
Table 3.15. Machine C has the lowest EOL of PhP45, so it would be selected based
on the minimum EOL criterion. As previously noted, the minimum EOL is equal to the
expected value of perfect information.
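The opportunity loss and EOL computations for the copier problem can be sketched as follows (Python used for illustration only):

```python
# Expected opportunity loss (EOL) for the DIT copier problem (Table 3.15).
costs = {
    "Machine A": [950, 1_050, 1_150],
    "Machine B": [850, 1_100, 1_350],
    "Machine C": [700, 1_000, 1_300],
}
probs = [0.4, 0.3, 0.3]

# Regret in each state = cost incurred minus the lowest cost in that state.
col_best = [min(row[s] for row in costs.values()) for s in range(3)]
regret = {m: [row[s] - col_best[s] for s in range(3)] for m, row in costs.items()}

# Weight each regret by its state's probability.
eol = {m: sum(p * r for p, r in zip(probs, reg)) for m, reg in regret.items()}
choice = min(eol, key=eol.get)
print(choice, eol[choice])  # Machine C has the lowest EOL (PhP45), equal to the EVPI
```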
Decision Trees
Any problem that can be presented in a decision table can also be graphically
illustrated in a decision tree. All decision trees are similar in that they contain decision
nodes or decision points and state-of-nature nodes or state-of-nature points. A decision
node, drawn as a square, is a point where one of several alternatives may be chosen; a
state-of-nature node, drawn as a circle, is a point where one of the states of nature
will occur.
In drawing the tree, we begin at the left and move to the right. Thus, the tree presents
the decisions and outcomes in sequential order. Lines or branches from the squares
(decision nodes) represent alternatives, and branches from the circles represent the
states of nature. Figure 3.2 gives the basic decision tree for the Thompson Lumber
example. First, John decides whether to construct a large plant, a small plant, or no
plant. Then, once that decision is made, the possible states of nature or outcomes
(favorable or unfavorable market) will occur. The next step is to put the payoffs and
probabilities on the tree and begin the analysis.
Figure 3.3 Completed and Solved Decision Tree for Thompson Lumber
When sequential decisions need to be made, decision trees are much more powerful
tools than decision tables. Let’s say that John Thompson has two decisions to make,
with the second decision dependent on the outcome of the first. Before deciding about
building a new plant, John has the option of conducting his own marketing research
survey, at a cost of PhP10,000. The information from his survey could help him decide
whether to construct a large plant or a small plant or not to build at all. John recognizes
that such a market survey will not provide him with perfect information, but it may help
quite a bit nevertheless.
John’s new decision tree is represented in Figure 3.4. Let’s take a careful look at this
more complex tree. Note that all possible outcomes and alternatives are included in
their logical sequence. This is one of the strengths of using decision trees in making
decisions. The user is forced to examine all possible outcomes, including unfavorable
ones. He or she is also forced to make decisions in a logical, sequential manner.
Examining the tree, we see that Thompson’s first decision point is whether to conduct
the PhP10,000 market survey. If he chooses not to do the study (the lower part of the
tree), he can construct a large plant, a small plant, or no plant. This is John’s second
decision point. The market will be either favorable (0.50 probability) or unfavorable
(also 0.50 probability) if he builds. The payoffs for each of the possible consequences
are listed along the right side. As a matter of fact, the lower portion of John’s tree is
identical to the simpler decision tree shown in Figure 3.3.
The upper part of Figure 3.4 reflects the decision to conduct the market survey. State-
of-nature node 1 has two branches. There is a 45% chance that the survey results will
indicate a favorable market for storage sheds. We also note that the probability is 0.55
that the survey results will be negative.
The rest of the probabilities shown in parentheses in Figure 3.4 are all conditional
probabilities or posterior probabilities. For example, 0.78 is the probability of a
favorable market for the sheds given a favorable result from the market survey. Of
course, you would expect to find a high probability of a favorable market given that the
research indicated that the market was good. Don’t forget, though, there is a chance
that John’s PhP10,000 market survey didn’t result in perfect or even reliable information.
Any market research study is subject to error. In this case, there is a 22% chance that
the market for sheds will be unfavorable given that the survey results are positive.
We note that there is a 27% chance that the market for sheds will be favorable given
that John’s survey results are negative. The probability is much higher, 0.73, that the
market will actually be unfavorable given that the survey was negative.
Finally, when we look to the payoff column in Figure 3.4, we see that PhP10,000, the
cost of the marketing study, had to be subtracted from each of the top 10 tree branches.
Thus, a large plant with a favorable market would normally net a PhP 200,000 profit.
But because the market study was conducted, this figure is reduced by PhP 10,000 to
PhP190,000. In the unfavorable case, the loss of PhP 180,000 would increase to a
greater loss of PhP190,000. Similarly, conducting the survey and building no plant now
results in a -PhP10,000 payoff.
Figure 3.4 Larger Decision Tree with Payoffs and Probabilities for Thompson Lumber (the payoff column runs from PhP190,000 down to -PhP190,000 on the survey branches and from PhP200,000 down to PhP0 on the no-survey branches)
With all probabilities and payoffs specified, we can start calculating the EMV at each
state-of-nature node. We begin at the end, or right side, of the decision tree and work
back toward the origin. When we finish, the best decision will be known.

1. Given favorable survey results,
EMV(large plant | positive survey) = (0.78)(PhP190,000) + (0.22)(-PhP190,000) = PhP106,400
EMV(small plant | positive survey) = (0.78)(PhP90,000) + (0.22)(-PhP30,000) = PhP63,600
The EMV of no plant in this case is -PhP10,000. Thus, if the survey results are
favorable, a large plant should be built.
2. Given negative survey results,
EMV(large plant | negative survey) = (0.27)(PhP190,000) + (0.73)(-PhP190,000) = -PhP87,400
EMV(small plant | negative survey) = (0.27)(PhP90,000) + (0.73)(-PhP30,000) = PhP2,400
The EMV of no plant is again -PhP10,000. Thus, if the survey results are negative, a
small plant should be built.
3. Moving backward, the expected value of conducting the market survey is
EMV(conduct survey) = (0.45)(PhP106,400) + (0.55)(PhP2,400) = PhP49,200
4. If the market survey is not conducted,
EMV(large plant) = (0.50)(PhP200,000) + (0.50)(-PhP180,000) = PhP10,000
EMV(small plant) = (0.50)(PhP100,000) + (0.50)(-PhP20,000) = PhP40,000
The EMV of no plant is PhP0. Thus, building a small plant is the best choice, given that
the marketing research is not performed.
5. We move back to the first decision node and choose the best alternative. The
EMV of conducting the survey is PhP 49,200, versus an EMV of PhP 40,000
for not conducting the study, so the best choice is to seek marketing
information. If the survey results are favorable, John should construct a large
plant, but if the research is negative, John should construct a small plant.
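The folding-back procedure can be sketched in code, assuming Python and using the probabilities and net payoffs from Figure 3.4 (the helper `emv` is our own, not the chapter's notation):

```python
# Backward induction (folding back) on the expanded Thompson tree.
# Payoffs on the survey branches are already net of the PhP10,000 survey cost.
def emv(branches):
    """Expected value of a state-of-nature node: branches = [(probability, value)]."""
    return sum(p * v for p, v in branches)

# Decision nodes given each survey result: keep the best alternative.
best_if_positive = max(
    emv([(0.78, 190_000), (0.22, -190_000)]),  # large plant: 106,400
    emv([(0.78, 90_000), (0.22, -30_000)]),    # small plant: 63,600
    -10_000,                                   # no plant
)
best_if_negative = max(
    emv([(0.27, 190_000), (0.73, -190_000)]),  # large plant: -87,400
    emv([(0.27, 90_000), (0.73, -30_000)]),    # small plant: 2,400
    -10_000,                                   # no plant
)

# Node 1: expected value of conducting the survey.
ev_survey = emv([(0.45, best_if_positive), (0.55, best_if_negative)])

# Lower branch: no survey, prior probabilities 0.50/0.50.
ev_no_survey = max(
    emv([(0.50, 200_000), (0.50, -180_000)]),  # large plant: 10,000
    emv([(0.50, 100_000), (0.50, -20_000)]),   # small plant: 40,000
    0,                                         # no plant
)
print(ev_survey, ev_no_survey)  # ~49,200 vs 40,000: conduct the survey
```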
In Figure 3.5, these expected values are placed on the decision tree. Notice on the
tree that a pair of slash lines / / through a decision branch indicates that particular
alternative is dropped from further consideration. This is because its EMV is lower than
the EMV for the best alternative.
Figure 3.5 Thompson’s Decision Tree with EMVs Shown (EMV of conducting the survey = PhP49,200; EMV of not conducting it = PhP40,000)
With the market survey he intends to conduct, John Thompson knows that his best
decision will be to build a large plant if the survey is favorable or a small plant if the
survey results are negative. But John also realizes that conducting the market research
is not free. He would like to know what the actual value of doing a survey is. One way
of measuring the value of market information is to compute the expected value of
sample information (EVSI), which is the increase in expected value resulting from
the sample information.
The expected value with sample information (EV with SI) is found from the decision
tree, and the cost of the sample information is added to this, since this was subtracted
from all the payoffs before the EV with SI was calculated. The expected value without
sample information (EV without SI) is then subtracted from this to find the value of the
sample information.
EVSI = (EV with SI + cost of information) - (EV without SI)

where
EV with SI = expected value with sample information
EV without SI = expected value without sample information
In John’s case, his EMV would be PhP59,200 if he hadn’t already subtracted the
PhP10,000 study cost from each payoff. (Do you see why this is so? If not, add
PhP10,000 back into each payoff, as in the original Thompson problem, and
recompute the EMV of conducting the market study.) From the lower branch of Figure
3.5, we see that the EMV of not gathering the sample information is PhP40,000. Thus,
EVSI = PhP59,200 - PhP40,000 = PhP19,200
This means that John could have paid up to PhP19,200 for a market study and still
come out ahead. Since it costs only PhP10,000, the survey is indeed worthwhile.
How much of the (perfect) information available does this sample information capture? One measure is the efficiency of sample information:

Efficiency of sample information = (EVSI / EVPI) × 100%    (3-5)

For Thompson Lumber, with EVPI = PhP60,000:

Efficiency of sample information = (PhP19,200 / PhP60,000) × 100% = 32%
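The EVSI and efficiency calculations can be sketched in Python. This is an illustrative aside, not part of the original text; the function name is our own, and the EVPI figure of PhP60,000 is an assumption carried over from the earlier EVPI discussion, consistent with the 32% efficiency quoted above.

```python
# Sketch of the EVSI (expected value of sample information) and
# efficiency-of-sample-information calculations for Thompson Lumber.

def evsi(ev_with_si, cost_of_si, ev_without_si):
    """EVSI = (EV with sample information + cost of sample information)
              - (EV without sample information)."""
    return (ev_with_si + cost_of_si) - ev_without_si

# EV with SI on the decision tree is PhP49,200; adding back the PhP10,000
# survey cost gives the PhP59,200 figure used in the text.
value = evsi(49_200, 10_000, 40_000)
print(value)  # 19200

evpi = 60_000  # assumed from the earlier part of the chapter
efficiency = 100 * value / evpi
print(efficiency)  # 32.0
```

Because the survey costs only PhP10,000 but is worth PhP19,200, the efficiency figure confirms the survey captures about a third of the value of perfect information.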
Sensitivity Analysis
As with payoff tables, sensitivity analysis can be applied to decision trees as well. The
overall approach is the same. Consider the decision tree for the expanded Thompson
Lumber problem shown in Figure 3.5. How sensitive is our decision (to conduct the
marketing survey) to the probability of favorable survey results?
We are indifferent when the EMV of conducting the marketing survey, node 1, is the
same as the EMV of not conducting the survey, which is PhP40,000. We can find the
indifference point by equating EMV(node 1) to PhP40,000:
(PhP106,400)p + (PhP2,400)(1 - p) = PhP40,000
PhP104,000p + PhP2,400 = PhP40,000
PhP104,000p = PhP37,600
p = PhP37,600 / PhP104,000 = 0.36
As long as the probability of favorable survey results, p, is greater than 0.36, our
decision will stay the same. When p is less than 0.36, our decision will be not to conduct
the survey.
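The indifference-point calculation above can be sketched as a short Python function (an illustrative aside; the function name is our own, and the inputs are the EMVs following favorable and unfavorable survey results and the no-survey EMV from Figure 3.5):

```python
# Solve 106,400 p + 2,400 (1 - p) = 40,000 for the indifference
# probability p of favorable survey results.

def indifference_probability(emv_fav, emv_unfav, emv_no_survey):
    """Probability of favorable survey results at which the survey
    and no-survey branches have equal EMV."""
    return (emv_no_survey - emv_unfav) / (emv_fav - emv_unfav)

p = indifference_probability(106_400, 2_400, 40_000)
print(round(p, 2))  # 0.36
```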
We could also perform sensitivity analysis for other problem parameters. For example,
we could find how sensitive our decision is to the probability of a favorable market
given favorable survey results. At this time, this probability is 0.78. If this value goes
up, the large plant becomes more attractive. In this case, our decision would not
change. What happens when this probability goes down? The analysis becomes more
complex. As the probability of a favorable market (given favorable survey) result goes
down, the small plant becomes more attractive. At some point, the small plant will result
in a higher EMV (given favorable survey results) than the large plant. This, however,
does not conclude our analysis. As the probability of a favorable market (given
favorable survey results) continues to fall, there will be a point where not conducting
the survey, with an EMV of PhP 40,000, will be more attractive than conducting the
marketing survey. We leave the actual calculations to you. It is important to note that
sensitivity analysis should consider all possible consequences.
There are many ways of getting probability data for a problem such as Thompson’s.
The numbers (such as 0.78, 0.22, 0.27, and 0.73 in Figure 3.4) can be assessed by a
manager based on experience and intuition. They can be derived from historical data,
or they can be computed from other available data using Bayes’ Theorem. The
advantage of Bayes’ Theorem is that it incorporates both our initial estimates of the
probabilities and information about the accuracy of the information source (e.g., market
research survey).
The Bayes’ Theorem approach recognizes that a decision maker does not know with
certainty what state of nature will occur. It allows the manager to revise his or her initial
or prior probability assessments based on new information. The revised probabilities
are called posterior probabilities.
In the Thompson Lumber case, we made the assumption that the following four
conditional probabilities were known:
P(favorable market (FM) | survey results positive) = 0.78
P(unfavorable market (UM) | survey results positive) = 0.22
P(FM | survey results negative) = 0.27
P(UM | survey results negative) = 0.73
We now show how John Thompson was able to derive these values with Bayes’
Theorem. From the discussions, John knows that special surveys such as his can
either be positive (i.e., predict a favorable market) or be negative (i.e., predict an
unfavorable market). The experts have told John that, statistically, of all new products
with a favorable market (FM), market surveys were positive and predicted success
correctly 70% of the time. Thirty percent of the time the surveys falsely predicted
negative results or an unfavorable market (UM). On the other hand, when there was
actually an unfavorable market for a new product, 80% of the surveys correctly
predicted negative results. The surveys incorrectly predicted positive results the
remaining 20% of the time. These conditional probabilities are summarized in Table
3.16. They are an indication of the accuracy of the survey that John is thinking of
undertaking.
Recall that without any market survey information, John’s best estimates of a favorable
and unfavorable market are
P(FM) = 0.50
P(UM) = 0.50
P(A | B) = P(B | A)P(A) / [P(B | A)P(A) + P(B | A′)P(A′)]    (3-6)

where
A′ = complement of A

Table 3.16 Market Survey Reliability in Predicting Actual States of Nature

RESULT OF SURVEY                      STATE OF NATURE
                                      FAVORABLE MARKET (FM)            UNFAVORABLE MARKET (UM)
Positive (predicts favorable          P(survey positive | FM) = 0.70   P(survey positive | UM) = 0.20
market for product)
Negative (predicts unfavorable        P(survey negative | FM) = 0.30   P(survey negative | UM) = 0.80
market for product)
We can let A represent a favorable market and B represent a positive survey. Then,
substituting the appropriate numbers into this equation, we obtain the conditional
probabilities given that the market survey is positive:
P(FM | survey positive) = (0.70)(0.50) / [(0.70)(0.50) + (0.20)(0.50)] = 0.35 / 0.45 = 0.78
P(UM | survey positive) = (0.20)(0.50) / [(0.20)(0.50) + (0.70)(0.50)] = 0.10 / 0.45 = 0.22
Note that the denominator (0.45) in these calculations is the probability of a positive
survey. An alternative method for these calculations is to use a probability table as
shown in Table 3.17.

Table 3.17 Probability Revisions Given a Positive Survey

STATE OF   CONDITIONAL PROBABILITY        PRIOR         JOINT          POSTERIOR
NATURE     P(SURVEY POSITIVE | STATE)     PROBABILITY   PROBABILITY    PROBABILITY
FM         0.70                           × 0.50        = 0.35         0.35/0.45 = 0.78
UM         0.20                           × 0.50        = 0.10         0.10/0.45 = 0.22
                                          P(survey results positive) = 0.45      1.00
The conditional probabilities, given that the market survey is negative, are
P(FM | survey negative) = (0.30)(0.50) / [(0.30)(0.50) + (0.80)(0.50)] = 0.15 / 0.55 = 0.27
P(UM | survey negative) = (0.80)(0.50) / [(0.80)(0.50) + (0.30)(0.50)] = 0.40 / 0.55 = 0.73
Note that the denominator (0.55) in these calculations is the probability of a negative
survey. These computations could also have been performed in a table instead, as in
Table 3.18.
Table 3.18 Probability Revisions Given a Negative Survey

STATE OF   CONDITIONAL PROBABILITY        PRIOR         JOINT          POSTERIOR
NATURE     P(SURVEY NEGATIVE | STATE)     PROBABILITY   PROBABILITY    PROBABILITY
FM         0.30                           × 0.50        = 0.15         0.15/0.55 = 0.27
UM         0.80                           × 0.50        = 0.40         0.40/0.55 = 0.73
                                          P(survey results negative) = 0.55      1.00
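The probability revisions above follow directly from Bayes' Theorem (equation 3-6). As an illustrative aside (not part of the original text; the function name is our own), a small Python function reproduces all four posterior probabilities:

```python
# Bayes' Theorem (equation 3-6): P(A | B) from P(B | A), P(A), and P(B | A').

def posterior(p_b_given_a, p_a, p_b_given_not_a):
    """P(A|B) = P(B|A)P(A) / [P(B|A)P(A) + P(B|A')P(A')]."""
    joint = p_b_given_a * p_a
    return joint / (joint + p_b_given_not_a * (1 - p_a))

# Positive survey results (denominator 0.45):
print(round(posterior(0.70, 0.50, 0.20), 2))  # 0.78  P(FM | positive)
print(round(posterior(0.20, 0.50, 0.70), 2))  # 0.22  P(UM | positive)

# Negative survey results (denominator 0.55):
print(round(posterior(0.30, 0.50, 0.80), 2))  # 0.27  P(FM | negative)
print(round(posterior(0.80, 0.50, 0.30), 2))  # 0.73  P(UM | negative)
```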
The calculations shown in Tables 3.17 and 3.18 can easily be performed in Excel
spreadsheets.
The posterior probabilities now provide John Thompson with estimates for each state
of nature if the survey results are positive or negative. As you know, John’s prior
probability of success without a market survey was only 0.50. Now he is aware that
the probability of successfully marketing storage sheds will be 0.78 if his survey shows
positive results. His chances of success drop to 27% if the survey report is negative.
This is valuable management information, as we saw in the earlier decision tree
analysis.
Utility Theory
Why do people make decisions that don’t maximize their EMV? They do this because
the monetary value is not always a true indicator of the overall value of the result of
the decision. The overall worth of a particular outcome is called utility, and rational
people make decisions that maximize the expected utility. Although at times the
monetary value is a good indicator of utility, there are other times when it is not. This
is particularly true when some of the values involve an extremely large payoff or an
extremely large loss. Suppose that you are the lucky holder of a lottery ticket. Five
minutes from now a fair coin could be flipped, and if it comes up tails, you would win 5
million. If it comes up heads, you would win nothing. Just a moment ago a wealthy
person offered you 2 million for your ticket. Let’s assume that you have no doubts
about the validity of the offer. The person will give you a certified check for the full
amount, and you are absolutely sure the check would be good.
A decision tree for this situation is shown in Figure 3.6. The EMV for rejecting the offer
indicates that you should hold on to your ticket, but what would you do? Just think, 2
million for sure instead of a 50% chance at nothing. Suppose you were greedy enough
to hold on to the ticket, and then lost. How would you explain that to your friends?
Wouldn’t 2 million be enough to be comfortable for a while?
Figure 3.6 Your Decision Tree for the Lottery Ticket (accept the offer: 2,000,000 for certain; reject the offer: heads (0.5) wins 0, tails (0.5) wins 5,000,000)
Most people would choose to sell the ticket for 2 million. Most of us, in fact, would
probably be willing to settle for a lot less. Just how low we would go is, of course, a
matter of personal preference. People have different feelings about seeking or
avoiding risk. Using the EMV alone is not always a good way to make these types of
decisions. One way to incorporate your own attitudes toward risk is through utility
theory.
The first step in using utility theory is to assign utility values to each monetary value in
a given situation. It is convenient to begin utility assessment by assigning the worst
outcome a utility of 0 and the best outcome a utility of 1. Although any values may be
used as long as the utility for the best outcome is greater than the utility for the worst
outcome, using 0 and 1 has some benefits. Because we have chosen to use 0 and 1,
all other outcomes will have a utility value between 0 and 1. In determining the utilities
of all outcomes, other than the best or worst outcome, a standard gamble is
considered. This gamble is shown in Figure 3.7.
Figure 3.7 Standard Gamble for Utility Assessment (alternative 1: a gamble with probability p of the best outcome and 1 − p of the worst outcome; alternative 2: the other outcome for certain, with utility = ?)
In Figure 3.7, p is the probability of obtaining the best outcome, and (1 – p) is the
probability of obtaining the worst outcome. Assessing the utility of any other outcome
involves determining the probability (p) that makes you indifferent between alternative
1, which is the gamble between the best and worst outcomes, and alternative 2, which
is obtaining the other outcome for sure. When you are indifferent between alternatives
1 and 2, the expected utilities for these two alternatives must be equal. This
relationship is shown as
Expected utility of alternative 2 = Expected utility of alternative 1
Utility of other outcome = (p)(utility of best outcome, which is 1) + (1 − p)(utility of worst outcome, which is 0)
Utility of other outcome = (p)(1) + (1 − p)(0) = p
Now all you have to do is to determine the value of the probability (p) that makes you
indifferent between alternatives 1 and 2. In setting the probability, you should be aware
that utility assessment is completely subjective. It’s a value set by the decision maker
that can’t be measured on an objective scale.
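As a quick illustrative sketch (not part of the original text; the function name is our own), the standard-gamble relationship can be written out: with U(best outcome) = 1 and U(worst outcome) = 0, the expected utility of the gamble reduces to p itself.

```python
# Standard gamble: expected utility of alternative 1 (the gamble).
# With u_best = 1 and u_worst = 0, this reduces to p, so the utility
# of the "other outcome" equals the indifference probability.

def gamble_expected_utility(p, u_best=1.0, u_worst=0.0):
    """p * U(best outcome) + (1 - p) * U(worst outcome)."""
    return p * u_best + (1 - p) * u_worst

print(gamble_expected_utility(0.8))  # 0.8
```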
Example:
Desirei Petite would like to construct a utility curve revealing her preference for money
between 0 and 10,000. A utility curve is a graph that plots utility value versus monetary
value. She can invest her money either in a bank savings account or in a real estate
deal.
If the money is invested in the bank, in 3 years Desirei would have 5,000. If she
invested in the real estate, after 3 years she could either have nothing or have 10,000.
Desirei, however, is very conservative. Unless there is an 80% chance of getting
10,000 from the real estate deal, Desirei would prefer to have her money in the bank,
where it is safe. What Desirei has done here is to assess her utility for 5,000. When
there is an 80% chance (this means that p is 0.8) of getting 10,000, Desirei is indifferent
between putting her money in real estate and putting it in the bank. Desirei’s utility for
5,000 is thus equal to 0.8, which is the same as the value for p. This utility assessment
is shown in Figure 3.8.
Other utility values can be assessed in the same way. For example, what is Desirei's
utility for 7,000? What value of p would make Desirei indifferent between 7,000 and
the gamble that would result in either 10,000 or 0? For Desirei, there must be a 90%
chance of getting the 10,000. Otherwise, she would prefer the 7,000 for sure. Thus,
her utility for 7,000 is 0.90. Desirei’s utility for 3,000 can be determined in the same
way. If there were a 50% chance of obtaining the 10,000, Desirei would be indifferent
between having 3,000 for sure and taking the gamble of either winning the 10,000 or
getting nothing. Thus, the utility of 3,000 for Desirei is 0.5. Of course, this process can
be continued until Desirei has assessed her utility for as many monetary values as she
wants. These assessments, however, are enough to get an idea of Desirei's feelings
toward risk. In fact, we can plot these points in a utility curve, as is done in Figure 3.9.
In the figure, the assessed utility points of 3,000, 5,000, and 7,000 are shown by dots
and the rest of the curve is inferred from these.
Figure 3.8 Utility of 5,000 (alternative 1: real estate, with p = 0.80 yielding 10,000, U(10,000) = 1.0, and 1 − p = 0.20 yielding 0, U(0) = 0.0; alternative 2: bank, 5,000 for certain, U(5,000) = p = 0.80)
Figure 3.9 Utility Curve for Desirei (assessed points: U(10,000) = 1.0, U(7,000) = 0.90, U(5,000) = 0.80, U(3,000) = 0.50, U(0) = 0; utility plotted against monetary value)
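The assessed points can be collected in a small Python sketch (an illustrative aside, not part of the original text). The text infers the rest of the curve from the dots; linear interpolation between assessed points is one simple, assumed way to approximate those in-between values.

```python
# Desirei's assessed utility points: (monetary value, utility).
ASSESSED = [(0, 0.0), (3_000, 0.50), (5_000, 0.80), (7_000, 0.90), (10_000, 1.0)]

def utility(x):
    """Piecewise-linear approximation of the utility curve between
    adjacent assessed points (an assumed interpolation scheme)."""
    for (x0, u0), (x1, u1) in zip(ASSESSED, ASSESSED[1:]):
        if x0 <= x <= x1:
            return u0 + (u1 - u0) * (x - x0) / (x1 - x0)
    raise ValueError("monetary value outside the assessed range")

print(utility(5_000))  # 0.8
print(utility(3_000))  # 0.5
```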
Desirei’s utility curve is typical of a risk avoider. A risk avoider is a decision maker
who gets less utility or pleasure from a greater risk and tends to avoid situations in
which high losses might occur. As monetary value increases on her utility curve, the
utility increases at a slower rate.
Figure 3.10 illustrates that a person who is a risk seeker has an opposite-shaped utility
curve. This decision maker gets more utility from a greater risk and higher potential
payoff. As monetary value increases on his or her utility curve, the utility increases at
an increasing rate. A person who is indifferent to risk has a utility curve that is a straight
line. The shape of a person’s utility curve depends on the specific decision being
considered, the monetary values involved in the situation, the person’s psychological
frame of mind, and how the person feels about the future. It may well be that you have
one utility curve for some situations you face and completely different curves for others.
Figure 3.10 Preferences for Risk (utility versus monetary outcome: the risk avoider's curve rises at a decreasing rate, the risk seeker's at an increasing rate, and the risk-indifferent curve is a straight line)
After a utility curve has been determined, the utility values from the curve are used in
making decisions. Monetary outcomes or values are replaced with the appropriate
utility values, and then decision analysis is performed as usual. The expected utility for
each alternative is computed instead of the EMV. Let’s take a look at an example in
which a decision tree is used and expected utility values are computed in selecting the
best alternative.
Example:
Hendrik loves to gamble. He decides to play a game that involves tossing thumbtacks
in the air. If the point on the thumbtack is facing up after it lands, Hendrik wins 10,000.
If the point on the thumbtack is down, Hendrik loses 10,000. Should Hendrik play the
game (alternative 1), or should he not play the game (alternative 2)?
Alternatives 1 and 2 are displayed in the tree shown in Figure 3.11. As can be seen,
alternative 1 is to play the game. Hendrik believes that there is a 45% chance of winning
10,000 and a 55% chance of suffering the 10,000 loss. Alternative 2 is not to gamble.
What should Hendrik do? Of course, this depends on Hendrik’s utility for money. As
stated previously, he likes to gamble. Using the procedure just outlined, Hendrik was
able to construct a utility curve showing his preference for money. Hendrik has a total
of 20,000 to gamble, so he has constructed the utility curve based on a best payoff of
20,000 and a worst payoff of a 20,000 loss. This curve appears in Figure 3.12.
Figure 3.11 Decision Facing Hendrik (play the game: tack lands point up (0.45) wins 10,000, tack lands point down (0.55) loses 10,000; or do not play the game)
We see that Hendrik’s utility for –10,000 is 0.05, his utility for not playing (0) is 0.15,
and his utility for 10,000 is 0.30. These values can now be used in the decision tree.
Hendrik’s objective is to maximize his expected utility, which can be done as follows:
Step 1. Read the utility values from Figure 3.12:
U(−10,000) = 0.05
U(0) = 0.15
U(10,000) = 0.30

Figure 3.12 Utility Curve for Hendrik (utility versus monetary outcome)
Step 2. Replace monetary values with utility values. Refer to Figure 3.13. Here are the
expected utilities for alternatives 1 and 2:
E(alternative 1: play the game) = (0.45)U(10,000) + (0.55)U(−10,000)
= (0.45)(0.30) + (0.55)(0.05)
= 0.135 + 0.0275 = 0.1625
E(alternative 2: don't play the game) = U(0) = 0.15
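Step 2 can also be sketched in Python (an illustrative aside, not part of the original text): replace the monetary outcomes with the utility values read from Hendrik's curve and compare expected utilities.

```python
# Utility values read from Figure 3.12 for Hendrik's three outcomes.
U = {-10_000: 0.05, 0: 0.15, 10_000: 0.30}

# Expected utility of playing: point up (0.45) vs. point down (0.55).
eu_play = 0.45 * U[10_000] + 0.55 * U[-10_000]
eu_dont_play = U[0]

print(round(eu_play, 4))  # 0.1625
print(eu_dont_play)       # 0.15
# Hendrik maximizes expected utility, so he compares these two figures.
```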