
An introduction to Metaheuristics

Andrea Roli
[email protected]

Università degli Studi “G. D’Annunzio” – Chieti

DEIS - Università degli Studi di Bologna



Outline
Definition of Metaheuristics
Combinatorial Optimization Problems
Trajectory Methods
Population-based Methods
Hybrid approaches



Metaheuristics
Approximate algorithms: they do not
guarantee to find the optimal solution in
bounded time.
Applied to Combinatorial Optimization
Problems and Constraint Satisfaction
Problems
Applied when:
problem instances are large
the goal is to find a (near-)optimal solution
quickly

Metaheuristics
OBJECTIVE: Effectively and efficiently explore
the search space
Techniques:
Use of the search history
Adaptivity
General strategies to balance intensification
and diversification
Sometimes, they are erroneously simply called
“local search methods”.

Two-faced Janus
Intensification and Diversification are the driving
forces of metaheuristic search.
Intensification: exploitation of the accumulated
search experience (e.g., by concentrating the
search in a confined, small search space area)
Diversification: exploration ‘in the large’ of the
search space


Two-faced Janus
I&D are contrary and complementary: their
dynamical balance determines the effectiveness
of metaheuristics.
Two levels of I&D balance:
Basic level: intrinsic exploration mechanism
Strategic level: general criteria to guide the
exploration of the search space


Other characteristics
Metaheuristics are strategies that guide the search
process.
Metaheuristic algorithms are usually non-deterministic.
The basic concepts of metaheuristics permit an
abstract-level description.
Metaheuristics are not problem-specific.
Metaheuristics may make use of domain-specific
knowledge in the form of heuristics controlled by the
upper-level strategy.


Metaheuristics
Metaheuristics encompass and combine:
Constructive methods (e.g., random,
heuristic, adaptive, etc.)
Local search-based methods (e.g., Tabu
Search, Simulated Annealing, Iterated Local
Search, etc.)
Population-based methods (e.g., Evolutionary
Algorithms, Ant Colony Optimization, Scatter
Search, etc.)


A classification
We will classify metaheuristics in two basic
classes:
Trajectory methods
Population-based methods
Other classifications are possible, based on
different key concepts (e.g., the use of memory)


Trajectory methods
The search process is characterized by a
trajectory in the search space
The search process can be seen as the
evolution in (discrete) time of a discrete
dynamical system
Examples: Tabu Search, Simulated Annealing,
Iterated Local Search, ...


Population-based methods
Deal in every iteration of the algorithm with a
set – a population – of solutions
The search process can be seen as the
evolution in (discrete) time of a set of points in
the search space
Examples: Evolutionary Algorithms, Ant Colony
Optimization, Scatter Search, ...


Combinatorial Optimization Problems
A Combinatorial Optimization Problem can
be defined by:
a set of variables $x_1, \ldots, x_n$;
variable domains $D_1, \ldots, D_n$;
constraints among variables;
an objective function $f : D_1 \times \cdots \times D_n \rightarrow \mathbb{R}^+$ to be
minimized.
The set of all possible feasible assignments is
$S = \{ s = \{(x_1, v_1), \ldots, (x_n, v_n)\} \mid v_i \in D_i,\ s \text{ satisfies all the constraints} \}$
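
For instance (an illustrative instantiation, not part of the original
slides): in MAXSAT the variables are the propositional variables of a
CNF formula, each domain is $D_i = \{0, 1\}$, there are no hard
constraints, and $f(s)$ counts the clauses left unsatisfied by the
assignment $s$.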


Combinatorial Optimization Problems
Objective: find a solution with minimum
objective function value, i.e., a solution $s^* \in S$ such
that $f(s^*) \leq f(s)$ for all $s \in S$.
Many COPs are NP-hard $\Rightarrow$ no polynomial-time
algorithm exists (assuming $P \neq NP$)
Examples: Traveling Salesman problem (TSP), Quadratic
Assignment problem (QAP), Maximum Satisfiability
problem (MAXSAT), Timetabling and Scheduling
problems.


Preliminary definitions
A neighborhood structure is a function
$\mathcal{N} : S \rightarrow 2^S$ that assigns to every $s \in S$ a set of
neighbors $\mathcal{N}(s) \subseteq S$. $\mathcal{N}(s)$ is called the
neighborhood of $s$.
A locally minimal solution (or local minimum)
with respect to a neighborhood structure $\mathcal{N}$ is a
solution $\hat{s}$ such that $f(\hat{s}) \leq f(s)$ for all $s \in \mathcal{N}(\hat{s})$. We
call $\hat{s}$ a strict locally minimal solution if
$f(\hat{s}) < f(s)$ for all $s \in \mathcal{N}(\hat{s})$.


Neighborhood: Examples
For problems defined on binary variables, the
neighborhood can be defined on the basis of the
Hamming distance $d_H$ between two
assignments, e.g.,
$\mathcal{N}(s) = \{ s' \in S \mid d_H(s, s') \leq k \}$
For example, $s = (0,1,0,1,0)$ and $s' = (0,1,1,1,0)$ are at
Hamming distance 1, hence neighbors for any $k \geq 1$.
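
As an illustration (a minimal Python sketch, not part of the original
slides), the Hamming-distance-1 neighborhood of a binary assignment
can be enumerated by flipping one variable at a time:

def hamming_neighbors(s):
    """Return all assignments at Hamming distance 1 from s."""
    neighbors = []
    for i in range(len(s)):
        t = list(s)
        t[i] = 1 - t[i]          # flip the i-th variable
        neighbors.append(tuple(t))
    return neighbors

print(hamming_neighbors((0, 1, 0)))
# [(1, 1, 0), (0, 0, 0), (0, 1, 1)]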


Neighborhood: Examples
In TSP, the neighborhood can be defined by
means of arc exchanges on Hamiltonian tours.
[Figure: a 2-exchange move on a tour over cities A, B, C, D: two
arcs of the tour are removed and replaced by two new arcs, yielding
a different tour.]


Trajectory methods
Iterative Improvement
Simulated Annealing
Tabu Search
Variable Neighborhood Search
Guided Local Search
Iterated Local Search



Iterative Improvement
Very basic local search
Each move is only performed if the solution it
produces is better than the current solution
(also called hill-climbing).
The algorithm stops as soon as it finds a local
minimum.



Iterative Improvement
[Figure: from the current solution s, its neighborhood is examined,
the best solution in the neighborhood is identified, and it becomes
the next solution s'.]


A pictorial view
[Figure: objective function over the solution space; starting from an
initial solution, iterative improvement descends until it reaches a
local minimum s*.]


The algorithm
s ← GenerateInitialSolution()
repeat
  s ← BestOf(s, N(s))
until no improvement is possible
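
A direct transcription of this loop (a hedged Python sketch:
objective and neighbors are placeholders for problem-specific
functions, not names used in the slides):

def iterative_improvement(initial, objective, neighbors):
    s = initial
    while True:
        best = min(neighbors(s), key=objective)  # best candidate in N(s)
        if objective(best) >= objective(s):      # no improving neighbor:
            return s                             # s is a local minimum
        s = best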


Escaping strategies...
Problem: Iterative Improvement stops at local
minima, which can be very “poor”.
Strategies are required to prevent the search
from getting trapped in local minima and to
escape from them.


Three basic ideas
1) Accept up-hill moves,
i.e., the search moves toward a solution with a
worse objective function value
Intuition: climb the hills and go downward in
another direction


Three basic ideas
2) Change the neighborhood structure during
the search
Intuition: different neighborhoods generate
different search space topologies


Three basic ideas
3) Change the objective function so as to
“fill in” local minima
Intuition: modify the search space with the aim of
making not-yet-explored areas more “desirable”


Simulated Annealing
Simulated Annealing exploits the first idea:
accept also up-hill moves.
Origins in statistical mechanics (Metropolis
algorithm)
It allows moves resulting in solutions of worse
quality than the current solution
The probability of doing such a move is
decreased during the search.
Usually, an up-hill move from $s$ to $s'$ is accepted with
probability $p = \exp\left(-\frac{f(s') - f(s)}{T}\right)$, where $T$ is the temperature.

A pictorial view
[Figure: objective function over the solution space; starting from an
initial solution, the search also accepts up-hill moves, climbing out
of local minima and eventually reaching deeper ones.]


SA: the algorithm
s ← GenerateInitialSolution()
T ← T0
while termination conditions not met do
  s' ← PickAtRandom(N(s))
  if f(s') < f(s) then
    s ← s'   {s' replaces s}
  else
    accept s' as new solution with probability p(T, s', s)
  end if
  Update(T)
end while

Cooling schedules
The temperature $T$ can be varied in different ways:
Logarithmic: $T_k = \frac{\Gamma}{\log(k + k_0)}$.
The algorithm is guaranteed to converge to the optimal
solution with probability 1. Too slow for applications.
Geometric: $T_{k+1} = \alpha T_k$, where $0 < \alpha < 1$.
Non-monotonic: the temperature is decreased
(intensification is favored), then increased again (to
increase diversification).
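
A compact Python sketch of SA with a geometric cooling schedule (all
names and parameter values are illustrative, not prescribed by the
slides):

import math
import random

def simulated_annealing(initial, objective, random_neighbor,
                        t0=10.0, alpha=0.95, n_iters=10000):
    s, t = initial, t0
    for _ in range(n_iters):
        s_new = random_neighbor(s)
        delta = objective(s_new) - objective(s)
        # improving moves are always accepted; up-hill moves are
        # accepted with probability exp(-delta / T)
        if delta < 0 or random.random() < math.exp(-delta / t):
            s = s_new
        t *= alpha            # geometric cooling: T_{k+1} = alpha * T_k
    return s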


Applications
SA is usually not very effective when used as a
stand-alone metaheuristic.
References:
- E. H. L. Aarts, J. H. M. Korst and P. J. M. van Laarhoven,
Simulated Annealing, in Local Search in Combinatorial
Optimization, Wiley-Interscience, 1997.
- S. Kirkpatrick, C. D. Gelatt and M. P. Vecchi,
Optimization by simulated annealing, Science, 13 May 1983.

Tabu Search
Tabu Search exploits the second idea: change
the neighborhood structure.
Explicitly exploits the search history to dynamically
change the neighborhood to explore
Tabu list: keeps track of recently visited solutions or
moves and forbids them ⇒ escape from local minima
and no cycling
Many important concepts developed “around” the basic
TS version (e.g., general exploration strategies)


A pictorial view
[Figure: from solution s, only the ALLOWED neighborhood of s (the
neighbors not forbidden by the tabu list) is considered; the new
solution s' is chosen from it.]


Basic TS: the algorithm
s ← GenerateInitialSolution()
TabuList ← ∅
while termination conditions not met do
  s ← ChooseBestOf(N(s) \ TabuList)
  Update(TabuList)
end while


Tabu Search
Storing a list of solutions is often inefficient,
therefore moves are stored instead.
BUT: by storing moves we could cut off good,
not-yet-visited solutions
⇒ we use ASPIRATION CRITERIA (e.g., accept a
forbidden move toward a solution better than the
current one)


TS: the algorithm
s ← GenerateInitialSolution()
InitializeTabuLists(TL_1, ..., TL_r)
while termination conditions not met do
  A(s) ← { s' ∈ N(s) | no tabu condition is violated
           or at least one aspiration condition is satisfied }
  s ← ChooseBestOf(A(s))
  UpdateTabuListsAndAspirationConditions()
end while
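
A hedged Python sketch of this scheme, storing recent moves in a
fixed-length tabu list and using a simple aspiration criterion (a
tabu move is allowed if it beats the best solution found so far);
moves and apply_move are illustrative placeholders:

from collections import deque

def tabu_search(initial, objective, moves, apply_move,
                tenure=7, n_iters=1000):
    s = best = initial
    tabu = deque(maxlen=tenure)   # oldest moves drop out automatically
    for _ in range(n_iters):
        allowed = []
        for m in moves(s):
            s_new = apply_move(s, m)
            # tabu condition, overridden by the aspiration criterion
            if m not in tabu or objective(s_new) < objective(best):
                allowed.append((s_new, m))
        if not allowed:
            break
        s, m = min(allowed, key=lambda pair: objective(pair[0]))
        tabu.append(m)            # forbid this move for a while
        if objective(s) < objective(best):
            best = s
    return best
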
Applications
Among the best performing metaheuristics
(when applied with general strategies to
balance intensification and diversification)
Applied to many COPs
TS approaches dominate the Job Shop
Scheduling area
References:
- F. Glover and M. Laguna, Tabu Search, Kluwer
Academic Publishers, 1997.

Variable Neighborhood Search
VNS exploits the second idea: change
the neighborhood structure.
VNS uses different neighborhood structures
during the search
A neighborhood $\mathcal{N}_k$ is substituted by
neighborhood $\mathcal{N}_{k+1}$ as soon as local search cannot
improve the current best solution.


VNS: the algorithm
Select a set of neighborhood structures N_k, k = 1, ..., k_max
s ← GenerateInitialSolution()
while termination conditions not met do
  k ← 1
  while k ≤ k_max do   {Inner loop}
    s' ← PickAtRandom(N_k(s))   {Shaking phase}
    s'' ← LocalSearch(s')
    if f(s'') < f(s) then
      s ← s''
      k ← 1
    else
      k ← k + 1
    end if
  end while
end while
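
A minimal Python sketch of this loop (assumptions: neighborhoods is a
list of functions, each returning the k-th neighborhood of a solution
as a list; all names are illustrative):

import random

def vns(initial, objective, neighborhoods, local_search, n_iters=100):
    s = initial
    for _ in range(n_iters):
        k = 0
        while k < len(neighborhoods):
            s_shaken = random.choice(neighborhoods[k](s))  # shaking phase
            s_new = local_search(s_shaken)
            if objective(s_new) < objective(s):
                s, k = s_new, 0  # success: restart from first neighborhood
            else:
                k += 1           # failure: switch to the next neighborhood
    return s
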
Applications
Graph based COPs (e.g., -Median problem,

L
the Steiner tree problem, -Cardinality Tree

k
problem)
Some variants also very effective

References:

- P. Hansen and N. Mladenović, Variable neighborhood


search: Principles and applications, European Journal
of Operational Research, 2001.

An introduction to Metaheuristics – p.39


Guided Local Search
GLS exploits the third idea: dynamically change
the objective function.
Basic principle: help the search to move
gradually out of local optima by changing the
search landscape
The objective function is dynamically
changed with the aim of making the current
local optimum “less desirable”


A pictorial view
[Figure: the objective function is repeatedly modified so that the
current local minimum is “filled in”, pushing the search toward other
areas of the solution space.]


Guided Local Search
GLS penalizes solutions which contain some
defined features (e.g., arcs in a tour, unsatisfied
clauses, etc.)
If feature $i$ is present in solution $s$, then $I_i(s) = 1$,
otherwise $I_i(s) = 0$


Guided Local Search
Each feature $i$ is associated with a penalty $p_i$ which
weights the importance of the feature.
The objective function is modified so as to take
into account the penalties:
$f'(s) = f(s) + \lambda \sum_i p_i \cdot I_i(s)$
$\lambda$ scales the contribution of the penalties with respect to
the original objective function


GLS: The Algorithm
s ← GenerateInitialSolution()
while termination conditions not met do
  s ← LocalSearch(s, f')
  for all selected features i do
    p_i ← p_i + 1
  end for
  Update(f', p)   {where p is the penalty vector}
end while
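
A simplified Python sketch (an assumption-laden illustration: full GLS
selects which features to penalize via a utility criterion, while this
sketch penalizes every feature of the current local optimum; f,
features and local_search are placeholders):

def guided_local_search(initial, f, features, local_search,
                        lam=0.3, n_iters=50):
    penalties = {}                  # one penalty p_i per feature i

    def f_aug(s):                   # augmented objective f'(s)
        return f(s) + lam * sum(penalties.get(i, 0) for i in features(s))

    s = local_search(initial, f_aug)
    best = s
    for _ in range(n_iters):
        for i in features(s):       # penalize features of the local optimum
            penalties[i] = penalties.get(i, 0) + 1
        s = local_search(s, f_aug)  # search again on the modified landscape
        if f(s) < f(best):          # compare on the original objective
            best = s
    return best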


GLS: Applications
Crucial: optimally tune the parameters and the penalty
updating procedure
Successfully applied to the weighted MAXSAT,
the Vehicle Routing problem, the TSP and the QAP
References:
- C. Voudouris and E. Tsang, Guided Local
Search, European Journal of Operational
Research, 1999.

Iterated Local Search
ILS can be seen as a general trajectory
method framework
Three basic blocks:
Local Search
Perturbation
Acceptance criteria



ILS: basic scheme
1. Generate an initial solution
2. Apply local search (e.g., SA, TS, etc.)
3. Perturb (i.e., slightly change) the obtained
solution
4. Apply again local search with the perturbed
solution as starting solution
5. Decide whether to accept the new solution or
not
6. Go to step 3

A pictorial view
[Figure: local search descends to a local minimum; a perturbation
kicks the solution into a different basin of attraction, from which
local search finds another, possibly better, local minimum.]


ILS: the algorithm
s0 ← GenerateInitialSolution()
s ← LocalSearch(s0)
while termination conditions not met do
  s' ← Perturbation(s, history)
  s'' ← LocalSearch(s')
  s ← ApplyAcceptanceCriterion(s, s'', history)
end while
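
A minimal Python sketch with the simple “accept if better” criterion
(local_search and perturb are problem-specific placeholders):

def iterated_local_search(initial, objective, local_search, perturb,
                          n_iters=100):
    s = local_search(initial)
    for _ in range(n_iters):
        s_pert = perturb(s)                  # kick s out of its basin
        s_new = local_search(s_pert)
        if objective(s_new) < objective(s):  # acceptance criterion
            s = s_new
    return s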


Design principles
’Local search‘ can be any kind of trajectory method.
The perturbation should be strong enough to move the
staring point to another local minimum basin of
attraction, but it should keep some parts of the current
solution.
The acceptance criteria can range from very simple
(e.g., accept the new solution if better than the current
one) to more complex (e.g., with probabilistic
acceptance)

An introduction to Metaheuristics – p.50


Applications
TSP, QAP, Single Machine Total Weighted
Tardiness (SMTWT) problem, Graph Coloring
Problem

References:

- H. R. Lourenço and O. Martin and T. Stützle,


Iterated Local Search, in Handbook of
Metaheuristics, Kluwer Academic Publishers,
2002.

An introduction to Metaheuristics – p.51


Lessons learnt
The effectiveness of a metaheuristic strongly
depends on the dynamical interplay of
intensification and diversification.
General search strategies have to be applied
to effectively explore the search space.
The use of the search history characterizes most
of today’s most effective algorithms.
Optimal parameter tuning is crucial and
sometimes very difficult to achieve.


Population-based methods
Evolutionary Algorithms
Evolutionary Programming
Evolution Strategies
Genetic Algorithms
Ant Colony Optimization
But also (not covered by this introduction): Scatter Search,
Population-Based Incremental Learning and Estimation of
Distribution Algorithms.


Evolutionary Algorithms
Inspired by Nature’s capability to evolve living
beings well adapted to their environment.
Computational models of evolutionary
processes.
They include:
Evolutionary Programming
Evolution Strategies
Genetic Algorithms


Evolutionary Algorithms
Basic principle: moving a population of solutions
toward good regions of the search space.
[Figure: a population of points on the objective function landscape
progressively concentrates around the good regions of the solution
space.]

The Evolutionary Cycle
[Figure: Population → Selection → Parents → Recombination and
Mutation → Offspring → Replacement → Population]


EA: the algorithm
P ← GenerateInitialPopulation()
Evaluate(P)
while termination conditions not met do
  P' ← Recombine(P)
  P'' ← Mutate(P')
  Evaluate(P'')
  P ← Select(P'' ∪ P)
end while
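
A minimal generational EA on bit-strings (a Python sketch; tournament
selection, one-point crossover and bit-flip mutation are common
choices, and all parameter values are illustrative). For example,
evolutionary_algorithm(sum) maximizes the number of 1-bits (OneMax):

import random

def evolutionary_algorithm(fitness, n_bits=20, pop_size=30,
                           p_mut=0.05, generations=100):
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]

    def select():                    # 2-tournament selection
        a, b = random.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        offspring = []
        while len(offspring) < pop_size:
            p1, p2 = select(), select()
            cut = random.randrange(1, n_bits)        # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [1 - g if random.random() < p_mut else g
                     for g in child]                 # bit-flip mutation
            offspring.append(child)
        pop = offspring              # generational replacement
    return max(pop, key=fitness)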


The Seven Features
1. Description of the individuals
2. Evolution process
3. Neighborhood structure
4. Information sources
5. Infeasibility
6. Intensification strategy
7. Diversification strategy



Description of the individuals
Solutions can be represented in many ways:
bit-strings
integer/real arrays
tree structures or more complex data
structures
The representation is crucial for the success of
the algorithm


Evolution process
The evolution process determines which
individuals will enter the population at each
iteration.
Generational replacement: the offspring
entirely replaces the old population
Steady state: some new individuals are
inserted into the old population
The population size can be constant or
varying


Neighborhood structure
The neighborhood function $\mathcal{N}_{EA} : I \rightarrow 2^I$ defines,
for every individual $i \in I$, the set of individuals
which can be recombined with it.
Unstructured population: every individual can
be recombined with any other one (e.g.,
Simple Genetic Algorithm)
Structured population: otherwise (e.g.,
Parallel Genetic Algorithm with “islands”)


Information sources
Several kinds of recombination are possible:

two-parent crossover
multi-parent crossover
population statistics-based recombination
operators



Infeasibility
The result of a recombination could be an
individual violating some constraints.
Three possible ways of dealing with infeasibility:
Reject: discard infeasible solutions
Penalize: decrease the fitness of individuals
violating constraints
Repair: apply some operators to change the
solution, trying to obtain a feasible one


Intensification strategy
It is possible to apply operators or algorithms to
improve the fitness of single individuals.
For example:
Before the replacement, apply local search to
every individual of the population (→ memetic
algorithms).
Apply mutation operators based on local
improvements (e.g., some steps of local
search).


Diversification strategy
To avoid premature convergence of the search,
diversification techniques are introduced.
For example:
Random mutation (most often adopted)
Introduce into the population new individuals
‘coming’ from not yet explored areas of the
search space


Applications
Applied to nearly any COP and optimization
problem
Particularly effective in robotics applications
References:
- T. Bäck, D. Fogel and Z. Michalewicz,
eds., Handbook of Evolutionary Computation,
Institute of Physics Publishing Ltd., 1997.


Ant Colony Optimization
Population-based metaheuristic inspired by the foraging
behavior of ants, which enables them to find the shortest
path between the nest and a food source.
While walking, ants deposit a substance called
pheromone on the ground.
When they decide about a direction to go, they choose
with higher probability paths that are marked by
stronger pheromone concentrations.
This basic behavior is the basis for a cooperative
interaction which leads to the emergence of shortest
paths.

Ant Colony Optimization
ACO algorithms are based on a parametrized
probabilistic model – the pheromone model –
that is used to model the chemical pheromone
trails.
Artificial ants incrementally construct solutions by
adding opportunely defined solution components
to a partial solution under consideration
Artificial ants perform randomized walks on the
construction graph: a completely connected
graph $G = (C, L)$.


ACO construction graph
$G = (C, L)$:
the vertices are the solution components $C$
$L$ are the connections
states are paths in $G$
Solutions are states, i.e., encoded as paths on $G$
Constraints are also provided in order to
construct feasible solutions


Example
One possible TSP model for ACO:
- nodes of $G$ (the components) are the cities to
be visited;
- states are partial or complete paths in the
graph;
- a solution is a Hamiltonian tour in the graph;
- constraints are used to avoid cycles (an ant
cannot visit a city more than once).


Sources of information
Connections, components (or both) can have
an associated pheromone trail and heuristic
value.
The pheromone trail takes the place of natural
pheromone and encodes a long-term memory
about the whole ants’ search process
The heuristic value represents a priori information about
the problem or dynamic heuristic information
(in the same way as static and dynamic
heuristics are used in constructive
algorithms).

A pictorial view
[Figure: artificial ants incrementally extend paths on the
construction graph, one component at a time, until a complete
solution is constructed.]


ACO: the algorithm
while termination conditions not met do
ScheduleActivities
AntBasedSolutionConstruction()
PheromoneUpdate()
DaemonActions() {optional}
end ScheduleActivities
end while



Solution construction
Ants move by applying a stochastic local
decision policy that makes use of the
pheromone values and the heuristic values
on components of the construction graph.
While moving, the ant keeps in memory the
partial solution it has built in terms of the path
it was walking on the construction graph.



Pheromone Update
When adding a component to the current partial
solution, an ant can update the pheromone trail (online
step-by-step pheromone update).
Once an ant has built a solution, it can retrace the
same path backward and update the pheromone trails
of the used components according to the quality of the
solution it has built (online delayed pheromone
update).
Pheromone evaporation is always applied ⇒ the
pheromone trail intensity on the components
decreases over time.
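
A hedged Python sketch of one iteration of a basic Ant System for the
TSP: probabilistic tour construction driven by pheromone (tau) and
heuristic desirability (1/distance), followed by evaporation and an
online delayed pheromone update. Parameter names and values (alpha,
beta, rho) are illustrative:

import random

def ant_system_step(dist, tau, n_ants=10, alpha=1.0, beta=2.0, rho=0.1):
    n = len(dist)
    tours = []
    for _ in range(n_ants):
        tour = [random.randrange(n)]
        while len(tour) < n:
            i = tour[-1]
            choices = [j for j in range(n) if j not in tour]  # no revisits
            weights = [tau[i][j] ** alpha * (1.0 / dist[i][j]) ** beta
                       for j in choices]
            tour.append(random.choices(choices, weights)[0])
        tours.append(tour)
    for i in range(n):                # pheromone evaporation
        for j in range(n):
            tau[i][j] *= 1.0 - rho
    for tour in tours:                # delayed update: shorter tours
        length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
        for k in range(n):            # deposit more pheromone
            i, j = tour[k], tour[(k + 1) % n]
            tau[i][j] += 1.0 / length
            tau[j][i] += 1.0 / length
    return tours
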
Daemon Actions
Can be used to implement centralized actions
which cannot be performed by single ants.
E.g.,
a local search procedure applied to the
solutions built by the ants
collection of global information used to
decide whether to deposit additional
pheromone to bias the search process
from a non-local perspective


Variants
Ant Colony System
MAX–MIN Ant System – among the most
effective ACO implementations
Hyper-Cube Framework: generalizes
pheromone-based construction mechanisms


Applications
Routing in communication networks, the
Sequential Ordering problem, Resource
Constrained Project Scheduling
ACO is effective when combined with local
search and/or tree-search strategies.
References:
- M. Dorigo and G. Di Caro, The Ant Colony
Optimization Meta-Heuristic, in New Ideas in
Optimization, McGraw-Hill, 1999.

Hybrid metaheuristics
Component exchange among metaheuristics
Cooperative search
Integrating metaheuristics and systematic
methods


Component exchange
Integrate trajectory methods into
population-based methods (e.g., memetic
algorithms)
Apply general I&D strategies of one
metaheuristic in another one (e.g., a TS
restart strategy applied in ILS)
The combination of population-based methods with
trajectory methods produces hybrid algorithms which are
often more efficient than the individual methods.


Cooperative search
Loose form of integration
The search is performed by different
algorithms (possibly running in parallel)
The algorithms exchange information during
the search process
Crucial:
kind of information exchanged
implementation



Metaheuristics and systematic methods
1. Metaheuristics are applied before systematic
methods, providing a valuable input, or vice
versa.
2. Metaheuristics use CP and/or tree search to
efficiently explore the neighborhood.


Metaheuristics and systematic methods
3. A “tree search”-based algorithm applies a
metaheuristic in order to improve a solution
(i.e., a leaf of the tree) or a partial solution
(i.e., an inner node). Metaheuristic concepts
can also be used to obtain incomplete but
efficient tree exploration strategies.
References:
- F. Focacci, F. Laburthe and A. Lodi, Local
Search and Constraint Programming, in Handbook of
Metaheuristics, Kluwer Academic Publishers, 2002.


Overview references
- C. Blum and A. Roli, Metaheuristics in Combinatorial
Optimization: Overview and Conceptual Comparison,
Technical Report TR/IRIDIA/2001-13, IRIDIA, Université
Libre de Bruxelles, Belgium.
- F. Glover and G. Kochenberger, eds., Handbook of
Metaheuristics, Kluwer Academic Publishers, 2002.
- A. Roli and M. Milano, MAGMA: A Multiagent Architecture for
Metaheuristics, LIA Technical Report 02-007, no. 60. To be
published in IEEE Trans. on Systems, Man and Cybernetics –
Part B.


Internet resources
www.metaheuristics.net
tew.ruca.ua.ac.be/eume/

