Genetic Algorithm-Contd

The genetic algorithm begins with a randomly generated population of potential solutions. It then applies genetic operators like selection, crossover, and mutation to produce a new generation. The fitness of each solution is evaluated using a fitness function, and higher-fitting solutions are more likely to be selected to pass their genes on to the next generation. Over many generations, the population evolves to include increasingly fit solutions, converging on an optimal or near-optimal solution to the given problem.

Uploaded by Vinay Rai
© All Rights Reserved

GENETIC ALGORITHM

Genetic Algorithm
• The GA starts with a set of solutions or populations.
• Each individual in the population set represents a solution to the problem.
• Some solutions may be very good, while others are very bad.
• The goodness of the solution varies from individual to individual within the population.
• To measure the goodness of the solution, we use a function called the fitness function,
which must be made according to the program logic.
• Initially, the population set contains only randomly generated solutions. This is
called the first generation of the population.
• Then the various genetic operators are applied over the population.
• The genetic operators help in the generation of a new population set from the old
population set.
• The new population set is formed by the genetic reproduction of the parents to form the
children.
• These specialized genetic operators help in better optimization of the solution.
Genetic Algorithm
• In this manner, we generate higher-order generations from the lower order.
• The fitness of any solution or individual may be measured by the fitness function. A
higher value of the fitness function means better solutions.
• The fitness of the population is measured by the average fitness of all the individuals that
make up the population.
• It is observed that the fitness keeps increasing as we move forward with the algorithm.
• The higher-order generations are more fit when compared with the lower-order
generations.
• This is true for the best fitness in the population, the average fitness in the
population, as well as the worst fitness in the population. The GA thus improves
as the generations proceed.
• This improvement is very rapid in the first few generations.
• After that, however, the algorithm more or less converges to some point, giving an
optimal or near-optimal solution to the problem.
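As a sketch, the whole loop described above can be put together in a few lines of Python. The OneMax fitness function (counting 1-bits) and all parameter values here are illustrative assumptions, not part of the original description:

```python
import random

def onemax(individual):
    # Illustrative fitness function: number of 1-bits (the OneMax problem)
    return sum(individual)

def run_ga(pop_size=20, n_bits=16, generations=50, seed=0):
    rng = random.Random(seed)
    # First generation: all randomly generated solutions
    population = [[rng.randint(0, 1) for _ in range(n_bits)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: fitness-proportional choice of parent pairs
        weights = [onemax(ind) or 1 for ind in population]
        new_population = []
        while len(new_population) < pop_size:
            p1, p2 = rng.choices(population, weights=weights, k=2)
            # Single-point crossover: left bits from p1, right bits from p2
            point = rng.randint(1, n_bits - 1)
            child = p1[:point] + p2[point:]
            # Mutation: flip each bit with a small probability
            child = [b ^ 1 if rng.random() < 1.0 / n_bits else b
                     for b in child]
            new_population.append(child)
        population = new_population
    # Best individual of the final generation
    return max(population, key=onemax)
```

Each pass of the outer loop produces one higher-order generation from the previous one; the average fitness of the population rises over the generations, as described above.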
Genetic Operators
1) Selection
• The first operator applied is selection.
• Selection forms pairs of individuals that will later cross-breed to generate
individuals.
• Selection is an important step, because it decides the parents that are
responsible for the individuals being generated.
• Thus a number of strategies for the selection process have been developed.
• All try to cross-breed good solutions in the hope of generating good individuals in
the resulting populations.
• Selection is done on the basis of the individual’s reproductive capability, or fitness
value.
• An individual having a higher fitness has a high reproductive capability and thus is
likely to be selected numerous times.
Genetic Operators
2)Crossover
• It is the genetic operator that results in the mixing of two parents to
generate individuals of the new population.
• Crossover is done by taking some bits from the first parent and other bits
from the other parent.
• One of the commonly used crossovers is the single-point crossover, in
which we randomly select a point in the chromosome. The bits to the left
of this point are taken from the first parent and the bits to the right are
taken from the other parent.
• This generates new individuals in the new population.
• Crossover is motivated by the intrinsic hope to form offspring that takes
the best characteristics from the two parents.
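A minimal sketch of single-point crossover, assuming individuals are encoded as Python lists of 0/1 bits:

```python
import random

def single_point_crossover(parent1, parent2, rng=random):
    # Randomly select a crossover point; bits to the left of the point
    # come from the first parent, bits to the right from the other.
    assert len(parent1) == len(parent2)
    point = rng.randint(1, len(parent1) - 1)
    child1 = parent1[:point] + parent2[point:]
    child2 = parent2[:point] + parent1[point:]
    return child1, child2
```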
Genetic Operators
3) Mutation
• Crossover results in the mixing of characteristics between two individuals. Many
characteristics may be completely absent in the solution pool, even if many of
those missing characteristics may have been desirable for good solutions.
• This absence may be due to the fact that these characteristics were never
generated, or that these characteristics somehow got deleted.
• The term characteristic here refers to the bits in the genotype representation.
• These characteristics need to be introduced into the population.
• Mutation is the process by which we try to introduce random characteristics into
the population.
• We do this by flipping bits.
• A random bit in the individual is selected and flipped during mutation.
• Thus if it represented 1, it now represents 0, and vice versa.
• This mutated bit is a characteristic that we have artificially introduced into the
system.
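Bit-flip mutation as described above can be sketched as follows, again assuming a list-of-bits encoding:

```python
import random

def mutate(individual, rng=random):
    # Select a random bit and flip it: 1 becomes 0 and vice versa
    mutated = individual.copy()
    i = rng.randrange(len(mutated))
    mutated[i] ^= 1
    return mutated
```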
Other Operators
Elitism
• Elitism selects some predefined proportion of the highest-fitness individuals
from one generation and passes them straight to the next generation.
• This operator saves these high-fitness individuals from being destroyed by
crossover or mutation and makes them available in the next generation.
• This operator also ensures that the best populations will be carried
forward, which in turn ensures that the globally best individual will be
selected as the final solution.
• Without this operator, the generated individuals may be of fitness lower
than the fitness of the best fit individual of the preceding population.
• This would result in lowering the best fitness value across the two
generations.
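A minimal sketch of the elitism step, assuming `fitness` is a function scoring an individual (all names here are illustrative):

```python
def next_generation_with_elitism(old_pop, offspring, fitness, elite_count=2):
    # Copy the elite_count fittest individuals of the old generation
    # straight into the new one, untouched by crossover or mutation,
    # then fill the remaining slots with offspring.
    elites = sorted(old_pop, key=fitness, reverse=True)[:elite_count]
    return elites + offspring[:len(old_pop) - elite_count]
```

Because the elites are copied unmodified, the best fitness value can never decrease across two generations.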
Other Operators
Insert and Delete
• The insert operator allows the insertion of new individuals in the
population pool.
• This makes the entire algorithm flexible to accepting new individuals.
• Based on the solution, at any point, we can generate new individuals with
high fitness.
• These individuals may be added to the pool by using the insert operator.
• Similarly, the delete operator allows deletion of some individuals from the
population set.
• We may wish to delete certain individuals that have a low fitness value.
• This operator may be used in such circumstances.
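The two operators can be sketched directly (the fitness-threshold deletion policy is one illustrative choice among many):

```python
def insert_individual(population, individual):
    # Insert operator: add a new (e.g. externally generated,
    # high-fitness) individual to the population pool
    return population + [individual]

def delete_unfit(population, fitness, threshold):
    # Delete operator: drop individuals whose fitness falls below threshold
    return [ind for ind in population if fitness(ind) >= threshold]
```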
Other Operators
Hard and Soft Mutation
• Many times the mutation operator may be split in two.
• One part, or the soft mutation, brings about small changes in the individual.
• Because the changes are small, big changes might not be expected in the
behavior of these solutions.
• This operator may be used to control the system convergence or for general
addition of new characteristics.
• The other operator, or the hard mutation, is used to carry out big changes in the
individual.
• This mutation may completely transform the individual. It brings major changes in
the solution’s behavior.
• This mutation may be used, very rarely, to add entirely new characteristics to the
system.
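One possible way to realize the split, assuming a bitstring encoding: soft mutation flips a single bit, while hard mutation flips each bit with high probability and may completely transform the individual:

```python
import random

def soft_mutation(individual, rng=random):
    # Small change: flip a single randomly chosen bit
    out = individual.copy()
    out[rng.randrange(len(out))] ^= 1
    return out

def hard_mutation(individual, flip_prob=0.5, rng=random):
    # Big change: flip each bit independently with high probability
    return [b ^ 1 if rng.random() < flip_prob else b for b in individual]
```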
Other Operators
Repair
• Many times, the GA may generate infeasible solutions.
• The repair operator may be used to repair the infeasible solutions before
they are stored in the population pool.
• This operator converts an infeasible solution into a solution that could be
feasible.
• For this, we need to study the problem constraints and examine the
individual.
• We modify the solution as little as possible in order to make it feasible.
• The fitness value of this newly generated solution cannot be guaranteed.
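As an illustration, consider a hypothetical 0/1 knapsack encoding where a solution is infeasible when the selected items exceed a weight capacity; the repair below drops the heaviest selected items, one at a time, until the constraint holds:

```python
def repair_knapsack(individual, weights, capacity):
    # Hypothetical repair operator for a 0/1 knapsack encoding.
    # While the selected items exceed capacity, deselect the heaviest
    # selected item -- the slightest modification at each step.
    repaired = individual.copy()
    while sum(w for bit, w in zip(repaired, weights) if bit) > capacity:
        selected = [i for i, bit in enumerate(repaired) if bit]
        heaviest = max(selected, key=lambda i: weights[i])
        repaired[heaviest] = 0
    return repaired
```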
Fitness Function
• The quality of the solution or of any individual is measured by the fitness function.
• In general, a good solution will have a high value of fitness function, whereas a bad
solution will have a low value.
• Sometimes the converse relation may also be applied (e.g., when the fitness
function measures a cost to be minimized).
• Using the fitness function, we may be able to rank the solutions from good to bad, which
is very helpful in the selection operator.
• The fitness function is hence very helpful for knowing how the algorithm is behaving
over time and over generations.
• Based on the fitness of the solutions as given by the fitness function, we are generally
interested in three kinds of fitness. The first is the best fitness. The best solutions have
the best value of the fitness function out of the entire population set. The best fitness
gives us an idea of the algorithm’s performance or of the most optimal solutions so far.
• At the time that the stopping criterion is met, the best solution is used as the final
solution of the GA.
• The second type of fitness is the average fitness, which gives us an idea of the average
solution in the population set.
• The third type is the worst fitness, which is the worst solution in the entire population
set.
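The three kinds of fitness can be computed directly; a minimal sketch (bit-count is used as an illustrative fitness function in the example):

```python
def population_stats(population, fitness):
    # Best, average, and worst fitness of the population
    values = [fitness(ind) for ind in population]
    return max(values), sum(values) / len(values), min(values)
```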
Stopping Condition
• The GA proceeds in an iterative manner and runs generation by generation.
• This procedure goes on and on, and the solutions generally keep
improving.
• However, we must be able to decide when we would be willing to
terminate the program.
• This is done by the stopping criterion, which decides when the program will
be terminated.
• The best solution at that point would be regarded as the final solution.
• There are numerous ways to set the stopping criterion:
1)If the solutions being generated are sufficiently optimized and further
optimizations may not be helpful, we may wish to stop the algorithm. This is
when the goal that was set before training has been met.
2)If the time taken by the algorithm or the number of generations exceeds a
certain number, we may wish to stop.
3) If the improvement by subsequent generations or time is too low and the
solution seems to have converged, we may choose to stop the algorithm.
This is also known as stall time or stall generation.
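The three criteria can be combined into a single predicate checked once per generation; the parameter names and default values below are illustrative assumptions:

```python
def should_stop(history, goal=None, max_generations=200,
                stall_generations=20, tol=1e-9):
    # history: best fitness recorded for each generation so far
    if goal is not None and history and history[-1] >= goal:
        return True                      # 1) the preset goal has been met
    if len(history) >= max_generations:
        return True                      # 2) generation budget exhausted
    if len(history) > stall_generations:
        recent = history[-stall_generations:]
        if max(recent) - min(recent) < tol:
            return True                  # 3) stall: improvement too low
    return False
```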
Fitness Scaling
• Fitness-based selection has some problems. In such systems, the highest-fitness
individuals are likely to be selected in very large numbers. This would mean the
new generation would be dominated by a few individuals, and the convergence
would be very fast. In practice, however, we may only get a local optimum and
not the global optimum from this system. These problems with raw fitness give
rise to the need for additional methods that can be used for effective selection.
• Scaling is one such mechanism for effective selection. In scaling, the
individuals in the population are rescaled according to their fitness values.
The scaling of fitness values gives rise to the expected value of the
individuals. This new value will then help the selection operator to select
the parents. The expected value denotes the fitness of the solution in the
new scale.
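One common concrete form is linear fitness scaling (a sketch; the constant c and the clamping of negative values to zero are design choices): raw fitness f is mapped to an expected value a·f + b so that the average is preserved and the best individual's expectation is c times the average, damping domination by a few very fit individuals:

```python
def linear_scaling(fitnesses, c=2.0):
    # Rescale raw fitness values f to expected values e = a*f + b so that
    # the average expectation equals the average fitness and the best
    # individual's expectation is c times the average.
    avg = sum(fitnesses) / len(fitnesses)
    best = max(fitnesses)
    if best == avg:
        return [1.0] * len(fitnesses)    # all equal: uniform expectations
    a = (c - 1.0) * avg / (best - avg)
    b = avg * (1.0 - a)
    # Clamp at zero so no individual gets a negative expectation
    return [max(a * f + b, 0.0) for f in fitnesses]
```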
Roulette Wheel Selection
In roulette wheel selection, the probability of selecting any individual is directly proportional to that individual’s expectation value.
Hence the more fit solutions have a greater chance of being selected again and again.
This makes the fitter individuals participate more and more in the crossover.
The weaker individuals get eliminated from the process. The simplest way of implementing this is by the roulette wheel.
Each individual in the population represents a section of the roulette wheel.
The higher the individual’s fitness, the larger its area on the roulette wheel, thus increasing the probability of the selection of this
individual.
To perform selection, the roulette wheel is spun N times, where N is the number of individuals in the population.
After each spin, the reading of where the wheel stops is noted.
The wheel is naturally more likely to stop at individuals whose fitness value is large.
For implementation purposes, the roulette wheel is implemented by first summing up all the expectations of all individuals.
Let this sum be T. We then select any random number r between 0 and T.
We move through the individuals one by one, summing their expected values, and keep moving as long as the running sum is at most r.
The individual whose expectation makes the running sum exceed r is the one selected.
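The procedure above can be sketched directly in Python (the `expectations` argument is the list of scaled fitness/expected values, one per individual):

```python
import random

def roulette_select(population, expectations, rng=random):
    # Sum all expectations (T), draw a random r in [0, T), then walk
    # through the individuals accumulating their expectations until the
    # running sum exceeds r; that individual is the one selected.
    total = sum(expectations)
    r = rng.uniform(0, total)
    running = 0.0
    for individual, e in zip(population, expectations):
        running += e
        if running > r:
            return individual
    return population[-1]   # guard against floating-point round-off
```

Individuals with a larger expectation occupy a larger slice of [0, T) and are therefore proportionally more likely to be selected, just as a larger section of the roulette wheel is more likely to be where the wheel stops.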
Types of Mutation
