
Derivative Free Optimization

Genetic Algorithm

Presentation by:
C. Vinoth Kumar
SSN College of Engineering
Derivative Free Optimization
• Four of the most popular derivative-free optimization methods are:

– Genetic Algorithms (GA)
– Simulated Annealing (SA)
– Random search method
– Downhill simplex search
Derivative Free Optimization
• They are characterized by the following:

(i) Derivative freeness: they do not require functional derivative information; they rely exclusively on repeated evaluations of the objective function, and the subsequent search direction after each evaluation follows certain heuristic guidelines

(ii) Slowness: because they search for a global optimum, they are slower than derivative-based optimization methods

(iii) Intuitive guidelines: they are based on simple intuitive concepts that pertain to nature's wisdom, including evolution and thermodynamics
Derivative Free Optimization
(iv) Flexibility: they are flexible, since they do not require any differentiability of the objective function

(v) Randomness: all are stochastic (except downhill simplex search); hence these methods are global optimizers that will find a global optimum given enough computation time

(vi) Analytic opacity: their randomness makes analytic studies difficult; only empirical studies are possible

(vii) Iterative nature: these methods are iterative, so a stopping criterion is required to determine when to terminate the optimization process
Derivative Free Optimization
• Let k denote an iteration count and f_k denote the best objective function value obtained at count k; common stopping criteria for a maximization problem include:

(i) Computation time
(ii) Optimization goal (f_k is greater than a certain preset goal value)
(iii) Minimal improvement (f_k - f_{k-1} is less than a preset value)
(iv) Minimal relative improvement ((f_k - f_{k-1}) / f_{k-1} is less than a preset value)
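To make criteria (ii)–(iv) concrete, here is a minimal sketch in Python (the slides give no code; the function name and default thresholds are illustrative), assuming f_history holds the best fitness value recorded at each iteration of a maximization run:

    def should_stop(f_history, goal=None, min_improvement=1e-6,
                    min_rel_improvement=1e-6):
        # f_history: best objective value f_k recorded at each iteration k
        f_k = f_history[-1]
        # (ii) optimization goal: the best value has reached the preset goal
        if goal is not None and f_k >= goal:
            return True
        if len(f_history) >= 2:
            f_prev = f_history[-2]
            # (iii) minimal improvement: f_k - f_{k-1} below a preset value
            if f_k - f_prev < min_improvement:
                return True
            # (iv) minimal relative improvement: (f_k - f_{k-1}) / f_{k-1}
            if f_prev != 0 and (f_k - f_prev) / f_prev < min_rel_improvement:
                return True
        return False

Criterion (i), computation time, would be checked with a wall-clock timer outside this function.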
Derivative Free Optimization

[Figure: soft computing as a combination of a model space (adaptive networks, neural networks, fuzzy inference systems) and an approach space (derivative-free and derivative-based optimization).]
Genetic Algorithm (GA)

• GAs are derivative-free, stochastic optimization methods based loosely on the concepts of natural selection and evolutionary processes.

• GAs are parallel-search procedures that can be implemented on parallel-processing machines to massively speed up their operations.

• GAs are applicable to both continuous and discrete optimization problems.
Genetic Algorithm

• GAs are stochastic, and therefore less likely to get trapped in the local minima that are inevitably present in any practical optimization application.

• GAs' flexibility facilitates both structure and parameter identification in complex models such as neural networks and fuzzy inference systems.
Genetic Algorithm
• Each point in a parameter space is encoded into a binary string called a chromosome

• Example: the 3D point (11, 6, 9) can be represented as a concatenated binary string (or chromosome):

     1011    0110    1001
    gene A  gene B  gene C
     (11)    (6)     (9)
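In Python, the encoding of this example can be sketched as follows (encode and decode are illustrative names, not from the slides), assuming 4 bits per coordinate:

    def encode(point, bits=4):
        # Concatenate the fixed-width binary form of each coordinate into
        # one chromosome string: (11, 6, 9) -> "1011" + "0110" + "1001"
        return "".join(format(v, "0{}b".format(bits)) for v in point)

    def decode(chromosome, bits=4):
        # Split the chromosome back into genes and convert each to an integer
        return tuple(int(chromosome[i:i + bits], 2)
                     for i in range(0, len(chromosome), bits))

    assert encode((11, 6, 9)) == "101101101001"
    assert decode("101101101001") == (11, 6, 9)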
Genetic Algorithm
• Each point is associated with a fitness value (function value)

• A GA keeps a set of points as a population (or gene pool), which is then evolved repeatedly toward a better overall fitness value

• In each generation, the GA constructs a new population using genetic operators such as crossover and mutation; members with higher fitness values are more likely to survive and to participate in mating (crossover) operations, as in the selection sketch below
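Fitness-proportionate (roulette-wheel) selection is one common way to realize "more likely to survive and mate"; the slides do not prescribe a particular scheme, so the sketch below is an assumption, and it requires nonnegative fitness values:

    import random

    def roulette_select(population, fitnesses):
        # Spin a "roulette wheel" whose slots are proportional to fitness:
        # members with higher fitness values are more likely to be chosen.
        total = sum(fitnesses)
        pick = random.uniform(0, total)
        cumulative = 0.0
        for member, f in zip(population, fitnesses):
            cumulative += f
            if cumulative >= pick:
                return member
        return population[-1]  # guard against floating-point round-off

Python's random.choices(population, weights=fitnesses) gives the same behavior in one call; the loop is spelled out here to show the mechanism.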
Genetic Algorithm
• After a number of generations, the population contains members with better fitness values (the Darwinian model of evolution: random mutation and natural selection)

• GA is a population-based optimization method that improves performance by upgrading entire populations rather than individual members.
Comparison of Natural and GA Terminology
Natural        Genetic Algorithm
-----------    ------------------------------------------
chromosome     string
gene           feature, character, or detector
allele         feature value
locus          string position
genotype       structure, or population
phenotype      parameter set, alternative solution, a decoded structure
Genetic Algorithm
• The major components of GAs are

(i) Encoding schemes: binary and Gray coding

(ii) Fitness evaluation

(iii) Selection

(iv) Crossover: one-point or two-point crossover

(v) Mutation
Genetic Algorithm – Binary encoding

Chromosome:
(11, 6, 9) → 1011 0110 1001

One-point crossover (crossover point after the 4th bit):
parents:    1001|1110    1011|0010
offspring:  1001|0010    1011|1110

Mutation (a single bit is flipped):
10011110 → 10011010 (6th bit flipped)
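The two operators in the diagram can be sketched in Python as follows (the function names are illustrative); the assertion reproduces the crossover example above, whose cut point sits after the 4th bit:

    import random

    def one_point_crossover(parent1, parent2, point=None):
        # Cut both parents at the same position and swap the tails:
        # 1001|1110 x 1011|0010 -> 1001|0010 and 1011|1110
        if point is None:
            point = random.randint(1, len(parent1) - 1)
        return (parent1[:point] + parent2[point:],
                parent2[:point] + parent1[point:])

    def mutate(chromosome, rate=0.01):
        # Flip each bit independently with probability `rate`
        flipped = {"0": "1", "1": "0"}
        return "".join(flipped[bit] if random.random() < rate else bit
                       for bit in chromosome)

    assert one_point_crossover("10011110", "10110010", point=4) == \
        ("10010010", "10111110")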
Genetic Algorithm

• Elitism: retaining a certain number of the best members when each new population is generated

[Figure: the current generation is transformed into the next generation through selection, crossover, and mutation; the elite members (10010110, 01100010, 10100100) are carried over unchanged, while the remaining members (e.g., 10011001 → 10011101, 01111101 → 01111001) are altered by the genetic operators.]
Genetic Algorithm
Step 1: Initialize a population with randomly generated individuals and evaluate the fitness value of each individual

Step 2:
(a) Select 2 members from the population with probabilities proportional to their fitness values
(b) Apply crossover with a probability equal to the crossover rate
(c) Apply mutation with a probability equal to the mutation rate
(d) Repeat (a) to (c) until enough members are generated to form the next generation

Step 3: Repeat step 2 until a stopping criterion is met
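Putting steps 1–3 together with the elitism of the previous slide gives this minimal sketch; it reuses the illustrative helpers defined earlier (roulette_select, one_point_crossover, mutate) and, like them, assumes nonnegative fitness values:

    import random

    def genetic_algorithm(fitness, chromosome_length, pop_size=20,
                          crossover_rate=1.0, mutation_rate=0.01,
                          n_elite=2, n_generations=50):
        # Step 1: a random initial population of bit strings
        population = ["".join(random.choice("01")
                              for _ in range(chromosome_length))
                      for _ in range(pop_size)]
        for _ in range(n_generations):              # Step 3 (fixed count here)
            fitnesses = [fitness(c) for c in population]
            # Elitism: copy the best members into the next generation unchanged
            ranked = sorted(zip(fitnesses, population), reverse=True)
            next_gen = [member for _, member in ranked[:n_elite]]
            while len(next_gen) < pop_size:         # Step 2(d)
                # (a) fitness-proportionate selection of two parents
                p1 = roulette_select(population, fitnesses)
                p2 = roulette_select(population, fitnesses)
                # (b) crossover with probability equal to the crossover rate
                if random.random() < crossover_rate:
                    p1, p2 = one_point_crossover(p1, p2)
                # (c) mutation applied bitwise at the mutation rate
                next_gen.extend([mutate(p1, mutation_rate),
                                 mutate(p2, mutation_rate)])
            population = next_gen[:pop_size]
        return max(population, key=fitness)

A fixed generation count stands in for the stopping criteria discussed earlier; any of those tests could replace it.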
Genetic Algorithm
Example: Maximization of the “peaks” function using GAs

[Figure: surface plot of the "peaks" function (from MATLAB) over the x, y domain.]
Genetic Algorithm
• Use of GA on the domain [-3, 3] × [-3, 3]
• Use 8-bit binary coding for each variable ⇒ the search space is 2^8 × 2^8 = 65,536
• Each generation contains 20 points (or individuals)
• Use of a simple one-point crossover scheme with rate = 1.0
• Use of a uniform mutation with rate = 0.01
• Apply elitism to keep the best individuals across generations
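This setup can be sketched with the genetic_algorithm function above. The peaks expression below is MATLAB's documented surface; the +10 shift of the fitness is an added assumption (peaks stays above -7 on this domain) to keep values nonnegative for roulette-wheel selection:

    import math

    def peaks(x, y):
        # MATLAB's "peaks" function
        return (3 * (1 - x) ** 2 * math.exp(-x ** 2 - (y + 1) ** 2)
                - 10 * (x / 5 - x ** 3 - y ** 5) * math.exp(-x ** 2 - y ** 2)
                - math.exp(-(x + 1) ** 2 - y ** 2) / 3)

    def fitness(chromosome, bits=8, lo=-3.0, hi=3.0):
        # Decode two 8-bit genes into real x, y on [-3, 3] (256 levels each),
        # then evaluate peaks; the +10 shift keeps fitness nonnegative
        # (an implementation choice, not something specified on the slides).
        step = (hi - lo) / (2 ** bits - 1)
        x = lo + int(chromosome[:bits], 2) * step
        y = lo + int(chromosome[bits:], 2) * step
        return peaks(x, y) + 10.0

    best = genetic_algorithm(fitness, chromosome_length=16, pop_size=20,
                             crossover_rate=1.0, mutation_rate=0.01)
    print(best, fitness(best) - 10.0)   # best chromosome and its peaks value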
Genetic Algorithm

[Figure: population distribution at the initial population, the 5th generation, and the 10th generation.]

