Local Search Algorithms

Chapter 4 discusses various local search algorithms including best-first search, A*, hill climbing, simulated annealing, and genetic algorithms. It highlights the strengths and weaknesses of each method, such as A* being complete and optimal but having high space complexity, and hill climbing's tendency to get stuck in local optima. Additionally, it covers iterative improvement search techniques and the importance of heuristic functions in guiding the search process.

Uploaded by

Muhammad Khalid

Local Search Algorithms
Chapter 4
Summary: Informed search
• Best-first search is general search where the minimum-cost nodes (according
to some measure) are expanded first.
• A* search combines uniform-cost search and greedy search: f(n) = g(n) + h(n).
A* handles state repetitions and h(n) never overestimates.
– A* is complete and optimal, but space complexity is high.
– The time complexity depends on the quality of the heuristic function.
– IDA* and SMA* reduce the memory requirements of A*.
• Hill-climbing algorithms keep only a single state in memory, but can get
stuck on local optima.
• Simulated annealing escapes local optima, and is complete and optimal
given a “long enough” cooling schedule.
• Genetic algorithms can search a large space by modeling biological evolution.
Iterative improvement search
• Another approach to search involves starting with an initial guess at a solution and gradually improving it until it becomes a solution.
• Some examples:
– Hill Climbing
– Simulated Annealing
– Genetic Algorithms
– Constraint satisfaction
Hill Climbing
• An iterative algorithm that starts with an arbitrary solution to a problem, then attempts to find a better solution by making an incremental change to that solution.
• Always moves in a single direction (uphill); it never backtracks.
Hill climbing on a surface of states
[Figure: the state-space landscape; height is defined by the evaluation function]
Hill-climbing search
• If there exists a successor s for the current state n such that
– h(s) < h(n)
– h(s) ≤ h(t) for all successors t of n,
then move from n to s. Otherwise, halt at n.
• Looks one step ahead to determine if any successor is better than the current state; if so, moves to the best successor.
• Similar to greedy search in that it uses h, but does not allow backtracking or jumping to an alternative path, since it doesn't "remember" where it has been.
• Not complete, since the search can terminate at local minima, plateaus, and ridges.
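The rule above can be sketched in Python. This is a minimal illustration, not the chapter's own code; the successor generator and heuristic (`successors`, `h`) are assumed to be supplied by the problem.

```python
def hill_climb(start, successors, h):
    """Greedy hill climbing for a minimization problem:
    move to the best successor only if it strictly improves h."""
    current = start
    while True:
        neighbors = successors(current)
        if not neighbors:
            return current
        best = min(neighbors, key=h)
        if h(best) >= h(current):   # no strictly better successor: halt
            return current
        current = best

# Toy usage: minimize h(x) = (x - 3)^2 over integers, moving by ±1.
result = hill_climb(
    0,
    successors=lambda x: [x - 1, x + 1],
    h=lambda x: (x - 3) ** 2,
)
print(result)  # → 3
```

On a convex toy landscape like this one the algorithm reaches the global optimum; on the landscapes discussed next it can halt at a local optimum instead.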
Exploring the Landscape
• Local maxima: peaks that aren't the highest point in the state space.
• Plateaus: broad flat regions of the space that give the search algorithm no direction (random walk).
• Ridges: flat like a plateau, but the main difference between a ridge and a plateau is the scale of the flat region: a ridge is a narrow flat path, while a plateau is a broad flat region.
[Figure: a landscape showing a local maximum, a plateau, and a ridge.
Image from: http://classes.yale.edu/fractals/CA/GA/Fitness/Fitness.html]
Drawbacks of hill climbing
The hill climbing algorithm has certain limitations to keep in mind:
• Local optima: it may get stuck in local optima, suboptimal solutions that are better than their neighbors but worse than the global optimum.
• Plateaus: it has difficulty navigating flat regions (plateaus) where there are no uphill moves.
• No backtracking.
• Greedy approach; not optimal.
Example 1
Apply the hill climbing algorithm to solve the blocks-world problem shown in the figure.
Solution
To use the hill climbing algorithm we need an evaluation function, or heuristic function. We consider the following evaluation function:
h(n) = Add one point for every block that is resting on the thing it is supposed to be resting on.
Example 1 (continued)
We call the initial state "State 0" and the goal state "Goal". Then, considering the blocks A, B, C, D, E, F, G, H in that order, we have:
h(n) = Add one point for every block that is resting on the thing it is supposed to be resting on. Subtract one point for every block that is sitting on the wrong thing.
There is only one move from State 0, namely, to move block A to the table. This produces State 1 with a score of 6.
Example 1 (continued)
There are three possible moves from State 1, as shown in the figure. We denote these three states State 2(a), State 2(b) and State 2(c). We also have:
Hill climbing will halt because all these states have lower scores than the current state. The process has reached a local maximum that is not a global maximum. We have reached this situation because of the particular choice of heuristic function; a different choice of heuristic function may not produce such a situation.
Example 2
Given the 8-puzzle shown in the figure, use the hill-climbing algorithm with the Manhattan distance heuristic to find a path to the goal state.

Solution
By definition, the Manhattan distance heuristic is the sum of the Manhattan distances of the tiles from their goal positions. In the figure, only tiles 5, 6 and 8 are misplaced, and their distances from their goal positions are 2, 2 and 1 respectively. Therefore h(initial state) = 2 + 2 + 1 = 5.
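As a sketch of how this heuristic is computed (an illustration, not the chapter's own code; the goal layout is assumed to be the standard one with the blank in the last cell):

```python
def manhattan(state, goal):
    """Sum of Manhattan distances of tiles (blank excluded) from goal.
    States are tuples of 9 values read row by row; 0 is the blank."""
    total = 0
    for i, tile in enumerate(state):
        if tile == 0:
            continue
        j = goal.index(tile)
        total += abs(i // 3 - j // 3) + abs(i % 3 - j % 3)
    return total

goal = (1, 2, 3, 4, 5, 6, 7, 8, 0)
# One tile (8) sits one step from its goal position:
print(manhattan((1, 2, 3, 4, 5, 6, 7, 0, 8), goal))  # → 1
```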
Example 2 (continued)
The figure shows the various states reached by applying the operations specified in the hill-climbing algorithm. The numbers written alongside the grids are the values of the heuristic function.
Note that the problem is a minimization problem: the value of the heuristic function at the goal state is 0, which is the minimum value of the function. Note further that once we find an operation which produces a better value than the current state's value, we proceed to that state without considering whether any other move produces a state with a still better value.
Simulated annealing
• Simulated annealing (SA) exploits an analogy between the way
in which a metal cools and freezes into a minimum-energy
crystalline structure (the annealing process) and the search for
a minimum [or maximum] in a more general system.
• SA can avoid becoming trapped at local minima.
• SA uses a random search that accepts changes that increase
objective function f, as well as some that decrease it.
• SA uses a control parameter T, which by analogy with the
original application is known as the system
“temperature.”
• T starts out high and gradually decreases toward 0.
Simulated annealing (cont.)
• A "bad" move from A to B is accepted with probability
P(move A→B) = e^((f(B) − f(A)) / T)
• The higher the temperature, the more likely it is that a bad move will be accepted.
• As T tends to zero, this probability tends to zero, and SA becomes more like hill climbing.
• If T is lowered slowly enough, SA is complete.
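The acceptance rule above can be sketched as follows. This is a minimal illustration, not the book's algorithm figure; the neighbour generator and objective `f` are assumptions, and a simple exponential cooling schedule is used.

```python
import math
import random

def simulated_annealing(start, neighbor, f, t0=10.0, alpha=0.95, steps=2000):
    """Maximize f. Accept downhill moves with probability
    e^((f(B) - f(A)) / T), per the formula above."""
    current, t = start, t0
    for _ in range(steps):
        cand = neighbor(current)
        delta = f(cand) - f(current)
        # Uphill moves are always taken; downhill moves sometimes.
        if delta >= 0 or random.random() < math.exp(delta / t):
            current = cand
        t = max(alpha * t, 1e-9)   # exponential cooling, floored above zero
    return current

# Toy usage: maximize f(x) = -(x - 4)^2 over integers.
random.seed(0)
best = simulated_annealing(0, lambda x: x + random.choice([-1, 1]),
                           lambda x: -(x - 4) ** 2)
```

Early on (T large) almost every move is accepted, giving broad exploration; as T shrinks the loop behaves like hill climbing and settles into an optimum.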
Simulated annealing (cont.)
[Figure: the simulated annealing algorithm]
Calculation of the temperature schedule
• A common temperature schedule used in simulated annealing is to start with a high initial temperature and gradually decrease it over time. The decrease can be linear, exponential, or follow some other function. The key idea is to start with a high temperature to allow more exploration of the search space, and then gradually reduce the temperature to focus on exploitation.
• An example of a linear temperature schedule is T_new = T_old − ΔT for a fixed decrement ΔT; the current temperature T is then used in the acceptance probability above.
Genetic algorithms
• Start with k random states (the initial population).
• New states are generated by "mutating" a single state or "reproducing" (combining via crossover) two parent states (selected according to their fitness).
• The encoding used for the "genome" of an individual strongly affects the behavior of the search.
• Genetic algorithms / genetic programming are a large and active area of research.
Genetic algorithms
Algorithm 2.1 : SGA

Begin
  gen = 0
  Initialize population P(gen)
  While termination criteria are not met
    Assess fitness of each candidate solution of the current generation
    Select parents to generate offspring
    Produce offspring by applying genetic operators
    Replace some members of the population with new offspring
    gen = gen + 1
  End while
End

Figure 2.6 : Algorithm for SGA


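The SGA loop above can be sketched concretely. This is a minimal illustration under assumed choices (binary chromosomes, a "one-max" fitness counting 1-bits, tournament selection, single-point crossover, bit-flip mutation), not the book's own code.

```python
import random

def sga(n_bits=16, pop_size=20, generations=60, p_mut=0.05):
    """Simple GA maximizing the number of 1-bits in a bit string."""
    fitness = lambda c: sum(c)
    # Population initialization P(gen): random bit strings.
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def select():
            # Tournament selection: keep the fitter of two random picks.
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        offspring = []
        while len(offspring) < pop_size:
            p1, p2 = select(), select()
            cut = random.randint(1, n_bits - 1)        # single-point crossover
            child = p1[:cut] + p2[cut:]
            child = [1 - g if random.random() < p_mut else g
                     for g in child]                    # bit-flip mutation
            offspring.append(child)
        pop = offspring                                 # replace generation
    return max(pop, key=fitness)

random.seed(1)
best = sga()
```

After a few dozen generations the best chromosome is at or near the all-ones string, the optimum for this fitness function.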
GA: Chromosome
• Representation of solutions: the chromosome.
• A single chromosome may comprise several parts, described below.
• Gene: the basic unit of a chromosome; it represents one single feature or attribute. A single chromosome is a collection of many genes; in other words, a single chromosome may represent several attributes or features.
• Allele: the set of all possible values of a single gene.
Problem Encoding
• Binary coded vector: when a binary coded vector representation is used, a single allele may take one of two values, 1 or 0, so a chromosome is a string of bits. The length of the string equals the total number of features (genes).
• Real coded vector: in a real coded vector, a single gene may take any decimal value.
• Population: a group of chromosomes forms a population, where one chromosome represents one individual, one solution to a problem, or one point in the search space.
• Importance of solution representation: the representation of the problem affects the complexity and efficiency of the overall algorithm.
Genetic Operators
Fitness Function
The fitness function is the evaluation measure of each individual in the population. It is very important in a GA, as all results depend on it: on the basis of the fitness value we decide which individuals survive and which die.
Reproduction Operators
• Genetic operators are applied to the chromosomes selected as the best parents (on the basis of the fitness function) to create offspring for the next generation.
• Reproduction operators are basically of two types:
• Crossover
• Mutation
Crossover Operator
• In a crossover operation we take two candidate solutions and swap parts of them with each other, forming two new offspring that carry good features of both parents.
Genetic Operators
Depending on where the split is made and how the chromosome is divided to share attributes, crossover can be divided into different categories:
 Single-point crossover
 Uniform crossover
Mutation Operator
• Mutation takes a single candidate. For diversity, small random changes are made in a single chromosome. Mutation is helpful in the scenario where the whole current population is missing some attribute at the same location, for example if the initial population has no 1 at the first bit position.
• If the mutation rate is low, new traits appear slowly; if the mutation rate is high, each generation will be unrelated to the previous one.
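The two operators can be sketched on bit-string chromosomes. An illustration only; the optional fixed `cut` argument is an assumption added to make the behaviour easy to show.

```python
import random

def single_point_crossover(p1, p2, cut=None):
    """Swap the tails of two parents at a cut point, producing two children."""
    if cut is None:
        cut = random.randint(1, len(p1) - 1)
    return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]

def mutate(chrom, rate=0.1):
    """Flip each bit independently with probability `rate`."""
    return [1 - g if random.random() < rate else g for g in chrom]

c1, c2 = single_point_crossover([1, 1, 1, 1], [0, 0, 0, 0], cut=2)
print(c1, c2)  # → [1, 1, 0, 0] [0, 0, 1, 1]
```

Note how mutation with any nonzero rate can reintroduce a 1 at the first bit position even if no parent carries one, which crossover alone can never do.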
Parent Selection
• Selection Process
• The selection process chooses the two best parents from the existing population on the basis of the fitness function. So, after defining the fitness function, it is critical to choose a proper selection procedure.
Next Generation
• The new offspring replace the old population, and some good old solutions are also included in the next generation. Elitism describes the ratio of old-generation members carried over into the new generation; it ensures that the maximum fitness does not decrease from one generation to the next.
• Generation gap means the percentage of the population replaced from one generation to the next. If the new generation contains all new individuals, this gap is 100%.
• Allowing duplicates: removing duplicates may increase the efficiency of the search and reduce premature convergence, but can increase processing time for a large population.
• Population size: the number of solutions in the current population.
Termination Requirement
• The GA continues generation after generation until some predefined criterion is met:
• A solution is found whose fitness meets a predefined threshold value.
• A predefined number of generations has been reached.
• No further improvement is observed; results are stable.