Note1_Class_Part1
References
1. B. Render, R.M. Stair and M.E. Hanna, Quantitative Analysis for Management, 9th ed., Pearson, 2006.
2. W.L. Winston, Introduction to Probability Models, 4th ed., Thomson, 2004.
3. J.W. Pratt, H. Raiffa and R. Schlaifer, Introduction to Statistical Decision Theory, MIT Press, 1995.

Teaching schedule
1. Introduction and probability review (4)
2. Basic Decision Analysis (4)
3. Utility theory (4)
4. Multi-attributes (4)
5. Game theory (4)
6. Decision analysis with sampling (4)
• If the travelling time depends not on the displacement but on the actual distance and the traffic, what is your decision?
• If we have various pieces of relevant information, how do we use them and solve the problem in a mathematical way?
Why Study Decision Analysis (DA)?
• Based on limited information, DA may help us to obtain the best decision.
• Although decision analysis cannot improve your luck, it helps you to understand better the problems you face and hence make better decisions.
• It captures the structure of the problem as well as the uncertainty and the trade-offs.
• Furthermore, it helps people deal with difficult decisions. Even if we are not perfect decision makers, we can do better through more structure and guidance.
• DA offers guidance to normal people working on hard decisions.
• Instead of providing solutions, DA is perhaps best thought of as simply an information source, providing insight about the situation, uncertainty, objectives, and trade-offs, and possibly yielding a recommended course of action.
• DA can be used to justify why a previously chosen action was appropriate.
• Using DA in finance, you may enjoy the same expected return (outcome) with the lowest risk (unpleasant surprises).

What is the best decision? The one that gives the best outcome.
Where can DA be used? Almost everywhere!!
• Managing research and development programs
• Negotiating for oil and gas leases
• Forecasting sales for new products
• Even in game strategy

DA can be applied to any decision, but whether it is worth using DA depends on the particular problem.

Example
It is cloudy outside. Should we collect the weather report and perform DA to decide whether to bring an umbrella?

[Flowchart] Define the problem → Acquire input data → Develop a solution → Test the solution → Analyze the results → Implement the results
Decision Analysis - Introduction

1.1 Decision Making Problems
• We need to act in a world that is full of uncertainties.
• Action under uncertainty + state of the world ⇒ consequence.
• Decision maker's preference on consequences + judgments concerning the uncertain states ⇒ action.
• We can only act on the basis of what we know. The more information, the better the decision.
• Decision Analysis: the mathematical analysis of decision making problems under uncertainty.

An inventory problem
A retailer is about to place an order for a number of units of a perishable commodity which spoils if it is not sold by the end of the day on which it is stocked. The retailer does not know what the demand for the item will be, but must nevertheless decide on a definite number of units to stock.
We want to minimize the loss caused by either the overage or the shortage cost. If an overage occurs, should we give a discount to customers in the last hour(s) before closing?
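To make the stocking trade-off concrete, here is a minimal sketch in Python. The demand distribution and the per-unit overage and shortage costs are hypothetical (none of these numbers come from the notes); it simply evaluates the expected loss of each candidate stock level.

```python
# Minimal newsvendor-style sketch (hypothetical numbers, for illustration only).
# Loss of stocking q units when demand turns out to be d:
#   loss(q, d) = c_over * max(q - d, 0) + c_short * max(d - q, 0)

demand_dist = {0: 0.10, 1: 0.20, 2: 0.30, 3: 0.25, 4: 0.15}  # assumed P(D = d)
c_over = 2.0    # cost per unsold (spoiled) unit -- assumed
c_short = 5.0   # cost per unit of unmet demand -- assumed

def expected_loss(q):
    return sum(p * (c_over * max(q - d, 0) + c_short * max(d - q, 0))
               for d, p in demand_dist.items())

# Choose the stock level with the smallest expected loss.
for q in demand_dist:
    print(f"stock {q}: expected loss = {expected_loss(q):.2f}")
best_q = min(demand_dist, key=expected_loss)
print("best stock level:", best_q)
```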
An investment problem
• Cost of drilling
• Amount of oil or gas discovered
• Price at which oil or gas can be sold
• and others
His problem is further complicated by the fact that it is possible to perform various tests or experiments that will yield a certain amount of information on the geophysical structure below the land.

2. This choice (or choices) will ultimately lead to some consequence, but the decision maker cannot be sure in advance what this consequence will be, because it depends not only on his or her choice(s) but on an unpredictable event or sequence of events.
Such a kind of problem can be represented by a diagram known as a decision tree.
1.3 The Problem of Analysis

Analysis of the Simplest Problems:
Mr Lee, a manufacturer, has experienced a serious decline in demand for his product and will be forced to lay off a substantial portion of his work force, etc., unless he can obtain a large order which the XYZ Company is about to place with some suppliers.

A simplest tree
The problem can be represented by a decision tree:
[Decision tree figure]

The Oil Wildcatter's Decision Tree
[Decision tree figure]
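As an illustration of how such a tree can be evaluated, here is a minimal Python sketch that folds back a two-branch drill / don't-drill tree by expected value. The probabilities and payoffs are invented for the sketch; they are not the wildcatter's actual figures.

```python
# Minimal decision-tree fold-back sketch (hypothetical numbers, for illustration only).
# Each decision alternative leads to chance outcomes with probabilities and payoffs.

alternatives = {
    "drill": [             # (probability, payoff) pairs -- assumed values
        (0.3, 700_000),    # strike oil
        (0.7, -100_000),   # dry hole (drilling cost is lost)
    ],
    "do not drill": [
        (1.0, 0),          # nothing gained, nothing lost
    ],
}

def expected_value(outcomes):
    return sum(p * payoff for p, payoff in outcomes)

for name, outcomes in alternatives.items():
    print(f"{name}: expected value = {expected_value(outcomes):,.0f}")

best = max(alternatives, key=lambda a: expected_value(alternatives[a]))
print("choice with highest expected value:", best)
```

Folding back by expected value is only one possible criterion; utility theory, covered later in the course, refines it.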
• The problem becomes complicated if there is a time constraint or the variables vary with time.
• The cost of drilling depends on the nature of the ground and the depth required.
• The labor cost, the amount of oil, etc.
• All the available information is required for DA.

Bases of decision: preference and judgment
• There is in general no "objectively correct" solution to any realistic decision problem.
• A "reasonable" decision must necessarily rely on
  - the decision maker's personal preference for consequences;
  - the decision maker's personal judgment concerning the chances of events.
If the decision maker behaves reasonably, she or he will choose a solution which is consistent with his or her personal preference and personal judgment.
Given that event E has occurred, the conditional probability of event R is denoted and defined by

    P(R | E) = P(R ∩ E) / P(E)    (if P(E) > 0).

Example 1.2

    Quality required:   Regular   Special   De-luxe
    Home                   30        20        10
    Export                  5        15        20

Then, with H = Home, E = Export, R = Regular and S = Special,

    P(R | E) = P(R ∩ E) / P(E) = (5/100) / (40/100) = 5/40 = 1/8,
    P(S | H) = P(S ∩ H) / P(H) = (20/100) / (60/100) = 20/60 = 1/3.
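A small Python sketch of the same calculation. Only the joint-count table comes from the notes; the helper names are mine.

```python
# Conditional probabilities from a joint-count table: P(A | B) = P(A and B) / P(B).
counts = {
    ("Home", "Regular"): 30, ("Home", "Special"): 20, ("Home", "De-luxe"): 10,
    ("Export", "Regular"): 5, ("Export", "Special"): 15, ("Export", "De-luxe"): 20,
}
total = sum(counts.values())  # 100

def prob(pred):
    """Probability of the set of table cells satisfying pred."""
    return sum(c for cell, c in counts.items() if pred(cell)) / total

p_E = prob(lambda cell: cell[0] == "Export")                  # P(E) = 0.40
p_R_and_E = prob(lambda cell: cell == ("Export", "Regular"))  # P(R ∩ E) = 0.05
print("P(R | E) =", p_R_and_E / p_E)                          # 0.125 = 1/8

p_H = prob(lambda cell: cell[0] == "Home")                    # P(H) = 0.60
p_S_and_H = prob(lambda cell: cell == ("Home", "Special"))    # P(S ∩ H) = 0.20
print("P(S | H) =", p_S_and_H / p_H)                          # ≈ 0.333 = 1/3
```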
Independence between two events
Two events A and B are said to be independent if

    P(A ∩ B) = P(A) P(B),

which is equivalent to

    P(B | A) = P(B) and P(A | B) = P(A)    (if P(A) > 0, P(B) > 0).

Example 1.3. Toss a fair die and let A = {1, 3, 5}, B = {1, 2, 3}, C = {1, 2}. Then

    P(B | A) = P({1, 3}) / P(A) = (2/6) / (3/6) = 2/3 ≠ P(B) = 1/2.

Hence A and B are dependent (not independent). But A and C are independent since

    P(A ∩ C) = P({1}) = 1/6 = P(A) P(C).

Prior and posterior probability
When the unconditional probability P(R) is compared with the conditional probability P(R | E), we refer to the former as a prior probability and to the latter as a posterior probability.
Rewriting the definition of the conditional probability P(R | E), we obtain the multiplication law (second law):

    P(R ∩ E) = P(R | E) P(E).

Example 1.2 (cont.)

    P(R ∩ E) = 5/100 = (5/40) × (40/100) = P(R | E) P(E).
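A quick Python check of Example 1.3 and the multiplication law. The events and counts come from the notes; the code itself is just an illustrative verification.

```python
from fractions import Fraction

# Fair die: each face 1..6 has probability 1/6.
omega = {1, 2, 3, 4, 5, 6}
def P(event):
    return Fraction(len(event & omega), len(omega))

A, B, C = {1, 3, 5}, {1, 2, 3}, {1, 2}

# Independence test: does P(X ∩ Y) equal P(X) * P(Y)?
print("A,B independent?", P(A & B) == P(A) * P(B))   # False: 1/3 != 1/2 * 1/2
print("A,C independent?", P(A & C) == P(A) * P(C))   # True:  1/6 == 1/2 * 1/3

# Multiplication law with the Example 1.2 numbers: P(R ∩ E) = P(R | E) P(E).
print(Fraction(5, 100) == Fraction(5, 40) * Fraction(40, 100))  # True
```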
Bayes' formula
Assume that E1, E2, …, En are mutually exclusive and exhaustive and that A is any other outcome. Then

    P(Ei | A) P(A) = P(Ei ∩ A) = P(A | Ei) P(Ei),

that is,

    P(Ei | A) = P(A | Ei) P(Ei) / P(A).

Example 1.4 (Decision Analysis, G. Gregory, p. 28)
• Machines E and F make printed circuits;
• E and F produce 2 and 4 per cent defective circuits, respectively;
• When their outputs are combined, F makes 3 printed circuits for every 2 made by E.
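As a hedged illustration of the formula, here is a small Python sketch using the Example 1.4 data. The question it answers, the chance that a defective circuit came from machine E, is my assumption; the example's actual question is not shown in this extract.

```python
# Bayes' formula sketch with the Example 1.4 data (question assumed, see lead-in).
# Production shares: F makes 3 circuits for every 2 made by E.
p_E, p_F = 2 / 5, 3 / 5
# Defective rates: 2% for E, 4% for F.
p_def_given_E, p_def_given_F = 0.02, 0.04

# Total probability of a defective circuit.
p_def = p_def_given_E * p_E + p_def_given_F * p_F    # 0.032

# Bayes: P(E | defective) = P(defective | E) P(E) / P(defective).
p_E_given_def = p_def_given_E * p_E / p_def
print(f"P(defective)     = {p_def:.3f}")
print(f"P(E | defective) = {p_E_given_def:.3f}")      # 0.25
```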
Example 1.5 (cont.)
In fact,

    P(B | C) = P(B | A1 ∪ A2)
             = P(B ∩ (A1 ∪ A2)) / P(A1 ∪ A2)
             = [P(B ∩ A1) + P(B ∩ A2)] / P(A1 ∪ A2)
             = [P(B | A1) P(A1) + P(B | A2) P(A2)] / P(C)
             = (0.99 × 0.0004 + 0.65 × 0.0006) / 0.001
             = 0.786.
Thus

    P(C | B) = P(B | C) P(C) / P(B) = (0.786 × 0.001) / 0.100686 = 0.007806.
Observing the test result B has increased the probability that the tested person has cancer from the prior value of 0.001 to the posterior value of 0.007806!
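A minimal Python check of this posterior calculation, using only the numbers quoted above (P(A1), P(A2), the two conditional probabilities, and P(B) = 0.100686); the variable names are mine.

```python
# Verify the Example 1.5 posterior P(C | B) from the quoted figures.
p_A1, p_A2 = 0.0004, 0.0006              # P(C) = P(A1) + P(A2), as in the derivation above
p_B_given_A1, p_B_given_A2 = 0.99, 0.65
p_B = 0.100686                           # overall probability of the test outcome B, as quoted

p_C = p_A1 + p_A2                                                  # 0.001 (prior)
p_B_given_C = (p_B_given_A1 * p_A1 + p_B_given_A2 * p_A2) / p_C    # ≈ 0.786
p_C_given_B = p_B_given_C * p_C / p_B                              # Bayes' formula

print(f"prior     P(C)     = {p_C:.4f}")
print(f"posterior P(C | B) = {p_C_given_B:.6f}")                   # ≈ 0.007806
```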