A Genetic Algorithm For Minimax Optimization Problems
Jeffrey W. Herrmann
Department of Mechanical Engineering
and Institute for Systems Research
University of Maryland
College Park, Maryland 20742
jwh2@eng.umd.edu
Abstract- Robust discrete optimization is a technique for structuring uncertainty in the decision-making process. The objective is to find a robust solution that has the best worst-case performance over a set of possible scenarios. However, this is a difficult optimization problem. This paper proposes a two-space genetic algorithm as a general technique to solve minimax optimization problems. This algorithm maintains two populations. The first population represents solutions. The second population represents scenarios. An individual in one population is evaluated with respect to the individuals in the other population. The populations evolve simultaneously, and they converge to a robust solution and its worst-case scenario. Since minimax optimization problems occur in many areas, the algorithm will have a wide variety of applications. To illustrate its potential, we use the two-space genetic algorithm to solve a parallel machine scheduling problem with uncertain processing times. Experimental results show that the two-space genetic algorithm can find robust solutions.

1 Introduction

Many decisions involve uncertainty. Recently, researchers have begun to study such problems and to develop approaches for finding robust solutions that have the best worst-case performance over a set of possible scenarios. However, this is a difficult optimization problem, and no general techniques exist. Kouvelis and Yu [9] discuss approaches for handling uncertainty, and they review robust discrete optimization problems, complexity results, and solution procedures.

This paper proposes a two-space genetic algorithm as a general technique to solve minimax optimization problems. This algorithm maintains two populations. The first population represents solutions. The second population represents scenarios. An individual in one population is evaluated with respect to the individuals in the other population. This causes the algorithm to converge to the robust solution and its worst-case scenario. Since minimax optimization problems occur in many areas, the algorithm will have a wide variety of applications. To illustrate the algorithm's potential, we use the algorithm to solve a parallel machine scheduling problem with uncertain processing times. The objective is to find a schedule that minimizes the worst-case makespan. Experimental results show the two-space genetic algorithm can find robust solutions.

The remainder of this paper is structured as follows: Section 2 describes robust discrete optimization problems. Section 3 presents the two-space genetic algorithm. Section 4 describes the parallel machine scheduling problem. Section 5 presents the experimental results. Section 6 concludes the paper by identifying topics for future research that could extend the algorithm's utility.

2 Robust Discrete Optimization Problems

Making decisions under uncertainty is a difficult problem. However, accepting and structuring uncertainty can lead to effective decision-making. Stochastic optimization recognizes uncertainty but asks the decision-maker to assign probability distributions to possible outcomes. Then, after solving the associated optimization problem, one can select the decision that has the best average performance over time or one can select the decision that has the best performance in the expected outcome.

Robust discrete optimization, on the other hand, seeks to identify decisions that will perform well under any circumstances. Although many criteria are available, one reasonable choice is the minimax criterion, which allows one to identify a robust decision (or solution) as one that has the best worst-case performance.

Robust discrete optimization (as described by Kouvelis and Yu [9]) uses scenarios to structure uncertainty. Decision-makers must use their intuition about the decision environment to define a set of scenarios. Each scenario in this set represents a possible future. That is, the scenario occurs with some positive but unknown probability.

In general, a robust discrete optimization problem can be formulated as follows. Let X be the set of all solutions. Let S be the set of all possible scenarios. The performance of a solution x ∈ X in scenario s ∈ S is F(x, s). The problem is to find the solution that has the best worst-case performance, which is the same as minimizing (over all solutions) the maximum (over all scenarios) performance:

    min_{x ∈ X} max_{s ∈ S} F(x, s)

In general, robust discrete optimization problems are more difficult to solve than the deterministic versions that have no uncertainty. The robust discrete optimization versions of many polynomially solvable optimization problems are NP-hard, although some easily solvable cases do exist. (For results in the more general area of minimax theory, see Du and Pardalos [3].)
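To make this criterion concrete, the following brute-force sketch enumerates small, explicit sets of solutions and scenarios; the sets, the performance values, and the function names are hypothetical illustrations, not taken from the paper. For realistic problems, X and S are far too large to enumerate, which is what motivates the genetic algorithm of Section 3.

    # Minimal sketch of the minimax criterion min_{x in X} max_{s in S} F(x, s),
    # evaluated by brute force over small hypothetical sets X and S.

    def robust_solution(X, S, F):
        """Return the solution with the smallest worst-case performance."""
        best_x, best_worst = None, float("inf")
        for x in X:
            worst = max(F(x, s) for s in S)        # worst-case performance of x
            if worst < best_worst:
                best_x, best_worst = x, worst
        return best_x, best_worst

    if __name__ == "__main__":
        X = ["x1", "x2", "x3"]                     # hypothetical solutions
        S = ["s1", "s2", "s3"]                     # hypothetical scenarios
        perf = {("x1", "s1"): 7, ("x1", "s2"): 6, ("x1", "s3"): 8,
                ("x2", "s1"): 4, ("x2", "s2"): 9, ("x2", "s3"): 10,
                ("x3", "s1"): 5, ("x3", "s2"): 10, ("x3", "s3"): 9}
        print(robust_solution(X, S, lambda x, s: perf[(x, s)]))   # -> ('x1', 8)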
Kouvelis and Yu [9] describe a branch-and-bound algorithm that uses surrogate relaxation to generate bounds, and they use this procedure to solve four robust optimization problems. However, there are no general techniques for solving robust discrete optimization problems.

Some authors have proposed genetic algorithms for problems with uncertainty. (For more information on genetic algorithms, see, for example, [2, 4, 7, 12].) Tsutsui and Ghosh [16] present a genetic algorithm that applies noise to the decoding procedure. This allows the algorithm to find solutions whose performance is insensitive to small changes in the solution values. They present results for some functions on one- and two-dimensional search spaces. However, they do not explicitly consider the scenarios present in robust discrete optimization problems.

The design of robust control systems is an important related area, and a few authors have proposed genetic algorithms for this problem. Taranto and Falcao [15] consider robust decentralized power system damping controllers and use a genetic algorithm to find designs that maximize the sum of the spectrum damping ratio over all operating conditions. Marrison and Stengel [10] measure compensator robustness as the probability that the system will behave unacceptably. They use a genetic algorithm to find parameter values that minimize this probability. This is a stochastic optimization approach, though their genetic algorithm does handle uncertainty in estimating the cost function, which cannot be evaluated directly. Tang, Man, and Gu [14] use a genetic algorithm to search for distillation column controllers that perform well for all possible plant characteristics. For a problem with four plants, they formulate a multiple objective problem and use the genetic algorithm to find non-dominated solutions. The authors note that, when there is a large set of scenarios, the problem of efficiently determining a solution's worst-case performance still remains. This is the problem that the two-space genetic algorithm addresses.

3 Two-space Genetic Algorithm

The two-space genetic algorithm searches the solution space and the scenario space simultaneously. The first population P1 contains solutions, and the second population P2 contains scenarios. The fitness h(x) of a solution x is its worst-case performance over the scenarios in the second population:

    h(x) = max{F(x, s) : s ∈ P2}

The algorithm penalizes large h(x) and rewards small h(x), so solutions with better worst-case performance will survive. The fitness g(s) of a scenario s evaluates the best solution in the first population:

    g(s) = min{F(x, s) : x ∈ P1}

The algorithm penalizes small g(s) and rewards large g(s), so scenarios with worse optimal solutions will survive.

For instance, consider Example 1, which displays hypothetical populations of solutions and scenarios. The entry in each cell of the table is F(x_i, s_j). x1 is more likely to survive since it has the best worst-case performance (h(x1) = 8), and s3 is more likely to survive since it has a poor optimal solution (g(s3) = 8).

Example 1. (Table of F(x_i, s_j) for solutions x1, x2, x3 and scenarios s1, s2, s3, with each solution's h(x_i) in the last column and each scenario's g(s_j) in the last row.)

A traditional, simple genetic algorithm has the following steps:

1. Create initial generation P(0). Let t = 0.
2. For each individual x ∈ P(t), evaluate its fitness f(x).
3. Create generation P(t + 1) by reproduction, crossover, and mutation.
4. Let t = t + 1. Unless t equals the maximum number of generations, return to Step 2.

The two-space genetic algorithm can be summarized as follows:

1. Create initial generations P1(0) and P2(0). Let t = 0.
2. For each individual x ∈ P1(t), evaluate h(x) = max{F(x, s) : s ∈ P2(t)}.
3. For each individual s ∈ P2(t), evaluate g(s) = min{F(x, s) : x ∈ P1(t)}.
4. Create generations P1(t + 1) and P2(t + 1) by reproduction, crossover, and mutation.
5. Let t = t + 1. Unless t equals the maximum number of generations, return to Step 2.
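The following sketch shows how this co-evolutionary loop might be organized. The helper routines (random_solution, random_scenario, select_and_vary) and the performance function F are assumed placeholders, and the sketch is not the GENESIS-based implementation described in Section 5.

    # Sketch of the two-space genetic algorithm, assuming hypothetical helpers:
    # random_solution(), random_scenario(), and
    # select_and_vary(population, fitness, minimize) for reproduction,
    # crossover, and mutation.

    def two_space_ga(F, random_solution, random_scenario, select_and_vary,
                     pop_size=50, generations=100):
        """Co-evolve a population P1 of solutions and a population P2 of scenarios."""
        P1 = [random_solution() for _ in range(pop_size)]   # solutions
        P2 = [random_scenario() for _ in range(pop_size)]   # scenarios
        for t in range(generations):
            # Step 2: h(x) = max{F(x, s) : s in P2(t)}; small h(x) is rewarded.
            h = [max(F(x, s) for s in P2) for x in P1]
            # Step 3: g(s) = min{F(x, s) : x in P1(t)}; large g(s) is rewarded.
            g = [min(F(x, s) for x in P1) for s in P2]
            # Steps 4-5: build the next generations (an elitist strategy would
            # also copy each population's best individual forward).
            P1 = select_and_vary(P1, h, minimize=True)
            P2 = select_and_vary(P2, g, minimize=False)
        best = min(P1, key=lambda x: max(F(x, s) for s in P2))
        worst_case = max(P2, key=lambda s: min(F(x, s) for x in P1))
        return best, worst_case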
The two-space genetic algorithm reduces the computational effort needed. It takes a sample population from the set of scenarios and allows this to evolve while the algorithm is searching for solutions. Thus, it searches two spaces simultaneously. Moreover, it is evaluating the solutions in parallel, since it uses the same scenarios for all solutions.

The chosen objective functions encourage the two-space genetic algorithm to converge to a robust solution. Although we do not prove the convergence, the following argument provides the necessary insight. Suppose that there exists a solution z ∈ X and a scenario t ∈ S such that

    F(z, t) = min_{x ∈ X} F(x, t) = max_{s ∈ S} F(z, s)

Consequently,

    F(z, t) = min_{x ∈ X} max_{s ∈ S} F(x, s) = max_{s ∈ S} min_{x ∈ X} F(x, s)

If the initial populations are sufficiently large, then for all x ∈ P1, h(x) is approximately max_{s ∈ S} F(x, s). Likewise, for all s ∈ P2, g(s) is approximately min_{x ∈ X} F(x, s). Thus, the populations are likely to converge towards z and t.

Now, consider any generation such that z is in P1 and t is in P2. Then, h(z) = F(z, t) and g(t) = F(z, t). For all other x ∈ P1, h(x) ≥ F(x, t) ≥ F(z, t) = h(z). Thus, z is more likely to survive. Similarly, for all other s ∈ P2, g(s) ≤ F(z, s) ≤ F(z, t) = g(t). Thus, t is more likely to survive.

Consequently, we can see that, in this case, the genetic algorithm will converge to z, the most robust solution, and t, that solution's worst-case scenario.

To test its potential, we used the two-space genetic algorithm to solve a specific robust discrete optimization problem. Before discussing the implementation details for the two-space genetic algorithm, we will present the scheduling problem under consideration.

4 Parallel Machine Scheduling

There are a wide variety of parallel machine scheduling problems. Pinedo [13] provides a review of the most important research results. Most previous work has considered deterministic scheduling, where all information is known ahead of time, and stochastic scheduling, where some information has a known probability distribution. We will consider a problem with more general uncertainty. Previously, Daniels and Kouvelis [1] and Kouvelis et al. [8] have studied scheduling problems with uncertain processing times and developed methods to optimize the worst-case performance. In addition, researchers have developed genetic algorithms to find good solutions to deterministic scheduling and sequencing problems (see, for example, [5, 6, 11]).

We will consider the following problem. There is a finite set of jobs and a set of parallel, identical machines. All jobs are available at the current time. Each job requires processing on any one of the machines. Each job's processing time is uncertain. The only available data are a lower bound and an upper bound on the required time. The problem is to create a schedule that minimizes the total time needed to complete all jobs. No preemption is allowed. No rescheduling can occur once processing begins.

Let {J1, ..., Jn} be the set of jobs. Each job Jj has a minimum processing time pj and a maximum processing time qj, where 0 < pj < qj. There are m machines M1, ..., Mm. The decision variables are the nm binary variables x_{jk}, where x_{jk} = 1 if and only if Jj is assigned to machine Mk. An assignment x is a feasible assignment (or solution) if, for each job Jj, it assigns Jj to exactly one machine.

A scenario s is a combination of realized processing times. Thus, s = (p_1^s, ..., p_n^s). For each job Jj, pj ≤ p_j^s ≤ qj. S is the set of all possible scenarios.

F(x, s) is the makespan of a solution x in scenario s. The makespan is the maximum total processing time on any machine:

    F(x, s) = max_{1 ≤ k ≤ m} Σ_{j=1}^{n} x_{jk} p_j^s

The problem is to minimize (over all schedules) the maximum possible makespan:

    min_{x ∈ X} max_{s ∈ S} F(x, s)

We will use the two-space genetic algorithm to find robust solutions.

For this simple problem it is easy to find the worst-case scenario, and thus we can judge the two-space genetic algorithm's performance. Note that, for any solution, one of its worst-case scenarios is the one when all jobs require their maximum processing time. Let t represent this scenario: t = (q1, ..., qn). Then, F(x, t) = max_{s ∈ S} F(x, s). Thus, min_{x ∈ X} F(x, t) is the minimum worst-case performance. A lower bound on this worst-case performance is La = (1/m) Σ_{j=1}^{n} qj. In the experiments described later, we will consider instances with four machines and nine jobs. Thus, at least one machine will have at least three jobs. A second lower bound is the sum of the three smallest maximum processing times: Lb = q1 + q2 + q3, if we number the jobs so that q1 ≤ q2 ≤ q3 ≤ ... ≤ qn. L = max{La, Lb} combines these lower bounds for min_{x ∈ X} F(x, t).
a known probability distribution. We will consider a prob-
lem with more general uncertainty. Previously, Daniels and 5 Experimental Results
Kouvelis [ l ] and Kouvelis et al. [8] have studied schedul-
ing problems with uncertain processing times and developed To implement the two-space genetic algorithm, we used
methods to optimize the worst-case performance. In addition, GENESIS, a genetic algorithm software developed by John
researchers have developed genetic algorithms to find good J. Grefenstette. We modified the code so that the algorithm
solutions to deterministic scheduling and sequencing prob- creates and maintains two populations. Each individual in
lems (see, for example, [ 5 , 6 , 111). each population is evaluated with respect to the individuals in
We will consider the following problem. There is a finite the other population. Each run of the algorithm created 100
set of jobs and a set of parallel, identical machines. All jobs generations of each population. Each population had 50 indi-
are available at the current time. Each job requires processing viduals. The algorithm used an elitist strategy that copied the
on any one of the machines. Each job’s processing time is best individual in one generation to the next generation.
Recall that a solution assigns every job to a machine, and a scenario is a combination of processing times. The objective is to minimize the worst-case schedule makespan.

5.1 Problem Generation

We tested the two-space genetic algorithm on randomly generated problem instances. All instances had m = 4 identical parallel machines. We created various problem sets, each with ten instances. Various parameters governed the instance generation. The problem size was n = 9 jobs. The parameters α1 and α2 governed the processing times. Each job's minimum processing time pj was selected from a uniform distribution with the range [10, 50α1]. α1 equals 0.2, 0.6, or 1.0. Each job's maximum delay dj was selected from a uniform distribution with the range [0, α2 pj]. α2 equals 0.2, 0.6, or 1.0. The maximum processing time qj = pj + dj.

Each set of ten instances can be denoted by the combination of the parameter values. We used all nine combinations of α1 and α2. Thus, there were 90 instances altogether.

To solve these problems, the genetic algorithm uses 27-bit strings. Each individual in both populations has nine genes, one for each job. Each gene contains three bits. Individuals in the first population represent assignments. After transforming the gene's value to an integer in the set {1, ..., m}, gene j identifies a machine k, and the job is assigned to that machine. That is, x_{jk} = 1 and x_{jp} = 0 if p ≠ k. Each individual in the second population represents a scenario s. After transforming the gene's value to a real number in the interval [pj, qj], gene j describes the job's realized processing time p_j^s.
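A sketch of how such a 27-bit string could be decoded is shown below. The paper does not spell out the exact mapping used in the modified GENESIS code, so the modulo mapping to machines and the linear scaling onto [pj, qj] are illustrative assumptions.

    # Illustrative decoding of a 27-bit chromosome (nine 3-bit genes).
    # The modulo and linear-scaling mappings are assumptions, not the
    # paper's exact implementation.

    def decode_assignment(bits, m=4):
        """First population: map each 3-bit gene to a machine in {1, ..., m}."""
        genes = [int(bits[3 * j:3 * j + 3], 2) for j in range(len(bits) // 3)]
        return [(g % m) + 1 for g in genes]

    def decode_scenario(bits, p, q):
        """Second population: map each 3-bit gene (0..7) linearly onto [p[j], q[j]]."""
        genes = [int(bits[3 * j:3 * j + 3], 2) for j in range(len(bits) // 3)]
        return [p[j] + (g / 7.0) * (q[j] - p[j]) for j, g in enumerate(genes)]

    # Hypothetical processing-time bounds for nine jobs:
    p = [10, 12, 14, 16, 18, 20, 22, 24, 26]
    q = [14, 15, 20, 20, 25, 28, 30, 31, 40]
    chromosome = "011000111101001110010100101"   # 27 bits
    print(decode_assignment(chromosome))          # machine of each job
    print(decode_scenario(chromosome, p, q))      # realized processing times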
5.2 Results

To evaluate the algorithm, we compare the solutions to the lower bound L, as described in Section 4. In addition, we used a standard version of GENESIS to find good solutions to the problem min_{x ∈ X} F(x, t). Recall that scenario t is the worst-case scenario, and the optimal makespan of this problem is the optimal makespan of the robust scheduling problem.

All results are relative to the lower bounds and averaged over the ten instances in the problem set. If, for each instance i = 1, ..., 10, an algorithm finds a solution x_i whose worst-case performance is h(x_i), and the lower bound for that instance is L_i, then the algorithm's average relative performance is H:

    H = (1/10) Σ_{i=1}^{10} h(x_i) / L_i

Table 1 lists the results. The deviation from the lower bound is the same for both algorithms. This shows that the two-space genetic algorithm is able to find robust solutions.

    Problem Set          Two-space            Worst-case
    n    α1    α2        Genetic Algorithm    Optimization
    9    0.2   0.2       1.000                1.000
    9    0.2   0.6       1.002                1.008
    9    0.2   1.0       1.028                1.037
    9    0.6   0.2       1.044                1.041
    9    0.6   0.6       1.058                1.047
    9    0.6   1.0       1.053                1.053
    9    1.0   0.2       1.056                1.064
    9    1.0   0.6       1.068                1.062
    9    1.0   1.0       1.062                1.062

    Table 1: Results for the Parallel Machine Scheduling Problem

6 Summary and Conclusions

This paper presented a two-space genetic algorithm and suggested that it can be a general technique for solving minimax and robust discrete optimization problems. The two-space genetic algorithm should be useful for solving minimax optimization problems in a wide variety of domains. It will be particularly useful when determining the worst-case scenario is a difficult problem due to the number of scenarios.

To illustrate its potential, we used the algorithm to solve a parallel machine scheduling problem with uncertain processing times. For this particular problem, we can find good lower bounds and thus evaluate the algorithm's performance. The results show that a two-space genetic algorithm is a very suitable technique for robust discrete optimization problems such as these. For the specific problem considered here, one can verify that there exists a solution z ∈ X and a scenario t ∈ S such that

    F(z, t) = min_{x ∈ X} F(x, t) = max_{s ∈ S} F(z, s)

However, the algorithm may require adjustment to solve problems when this condition does not hold. Preliminary results suggest that maintaining diversity in the population of scenarios may be necessary. If the second population converges to just one scenario, the first population will converge to the solutions that are optimal for that scenario. However, these solutions may have terrible performance on other scenarios. Evaluating how often a scenario is a worst-case scenario may be an appropriate objective function, and sharing mechanisms can maintain a variety of worst-case scenarios in the second population. This yields a more accurate evaluation of the individuals in the first population and causes it to converge to truly robust solutions.

Another possibility is to give each individual a memory so that, if an individual survives intact from one generation to the next, it can compare its performance (with respect to the current population of scenarios) to its worst-case performance in the past. This also yields more accurate evaluations.
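One way such an alternative scenario fitness might be computed, sketched here only as an assumption for illustration, is to score each scenario in P2 by the fraction of solutions in P1 for which it attains the worst case:

    # Illustrative sketch of the alternative scenario fitness suggested above:
    # the fraction of solutions in P1 whose worst case (over P2) scenario s attains.

    def worst_case_frequency(P1, P2, F):
        counts = [0] * len(P2)
        for x in P1:
            worst = max(F(x, s) for s in P2)
            for j, s in enumerate(P2):
                if F(x, s) == worst:          # several scenarios can tie
                    counts[j] += 1
        return [c / len(P1) for c in counts]

A sharing mechanism could then be applied to this fitness so that several distinct worst-case scenarios persist in the second population.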
Bibliography

[1] Daniels, R.L., and P. Kouvelis, "Robust scheduling to hedge against processing time uncertainty in single-stage production," Management Science, Volume 41, Number 2, pages 363-376, 1995.

[2] Davis, L., ed., Handbook of Genetic Algorithms, Van Nostrand Reinhold, New York, 1991.

[3] Du, Ding-Zhu, and Panos M. Pardalos, Minimax and Applications, Kluwer Academic Publishers, Dordrecht, 1995.

[4] Goldberg, D.E., Genetic Algorithms in Search, Optimization, and Machine Learning, Addison-Wesley, Reading, Massachusetts, 1989.

[5] Herrmann, Jeffrey W., and Chung-Yee Lee, "Solving a class scheduling problem with a genetic algorithm," ORSA Journal on Computing, Volume 7, Number 4, 1995.

[6] Herrmann, Jeffrey W., Chung-Yee Lee, and Jim Hinchman, "Global job shop scheduling with a genetic algorithm," Production and Operations Management, Volume 4, Number 1, pages 30-45, 1995.

[7] Holland, J.H., Adaptation in Natural and Artificial Systems, University of Michigan Press, Ann Arbor, Michigan, 1975.

[8] Kouvelis, P., R.L. Daniels, and G. Vairaktarakis, "Robust scheduling of a two-machine flow shop with uncertain processing times," working paper, Fuqua School of Business, Duke University, 1996.

[9] Kouvelis, Panos, and Gang Yu, Robust Discrete Optimization and Its Applications, Kluwer Academic Publishers, Norwell, MA, 1997.

[10] Marrison, Christopher, and Robert F. Stengel, "Robust control system design using random search and genetic algorithms," IEEE Transactions on Automatic Control, Volume 42, Number 6, pages 835-839, 1997.

[11] Mattfeld, Dirk C., Evolutionary Search and the Job Shop: Investigations on Genetic Algorithms for Production Scheduling, Physica-Verlag, Heidelberg, 1996.

[12] Michalewicz, Zbigniew, Genetic Algorithms + Data Structures = Evolution Programs, Springer-Verlag, Berlin, 1992.

[13] Pinedo, Michael, Scheduling: Theory, Algorithms, and Systems, Prentice Hall, Englewood Cliffs, N.J., 1995.

[14] Tang, K.S., K.F. Man, and D.-W. Gu, "Structured genetic algorithm for robust H-infinity control systems design," IEEE Transactions on Industrial Electronics, Volume 43, Number 5, pages 575-582, 1996.

[15] Taranto, G.N., and D.M. Falcao, "Robust decentralised control design using genetic algorithms in power system damping control," IEE Proceedings - Generation, Transmission and Distribution, Volume 145, Number 1, pages 1-6, 1998.

[16] Tsutsui, Shigeyoshi, and Ashish Ghosh, "Genetic algorithms with a robust solution searching scheme," IEEE Transactions on Evolutionary Computation, Volume 1, Number 3, pages 201-208, 1997.