AI Lecture 9-10

The document discusses the local search algorithms covered in a lecture on artificial intelligence: hill-climbing search, which moves uphill to a peak; simulated annealing search, which can escape local maxima; and genetic algorithms, which generate successive populations using selection, crossover, and mutation to evolve solutions.

Uploaded by Mohammad Bangee

Artificial Intelligence (CS-401)
Lecture 5: Local Search Algorithms

Dr. Fahad Sherwani (Assistant Professor – FAST NUCES)
PhD in Artificial Intelligence
Universiti Tun Hussein Onn Malaysia
[email protected]
Local Search Algorithms

In this lecture:
❑ Part 1: What/Why Local Search
❑ Part 2: Hill-Climbing Search
❑ Part 3: Simulated Annealing Search
❑ Part 4: Genetic Algorithms
Local Search Algorithms

In many optimization problems, the path to the goal is irrelevant; the goal state itself is the solution.

Examples:
– to reduce cost, as in cost functions
– to reduce conflicts, as in n-queens

The idea: keep a single "current" state and try to improve it according to an objective function.

Local search algorithms:
– use little memory
– find reasonable solutions in large or infinite spaces
Local Search Algorithms

• Local search can be used on problems that can be formulated as finding a solution that maximizes a criterion among a number of candidate solutions.

• Local search algorithms move from solution to solution in the space of candidate solutions (the search space) until a solution deemed optimal is found or a time bound has elapsed.

• For example, in the travelling salesman problem a solution is a cycle containing all nodes of the graph, and the target is to minimize the total length of the cycle; i.e. a candidate solution is such a cycle, and the criterion to optimize is the length of the cycle.

• A local search algorithm starts from a candidate solution and then iteratively moves to a neighbor solution.
Local Search Algorithms

Terminate on a time bound, or if the situation has not improved after a number of steps.

Local search algorithms are typically incomplete: the search may stop even though the best solution found by the algorithm is not optimal.
Search Landscape (two dimensions)

Search Landscape (three dimensions)
Hill-Climbing Search
• Continually moves in the direction of increasing value (i.e., uphill).
• Terminates when it reaches a "peak": no neighbor has a higher value.
• Records only the current state and its objective function value.
• Does not look ahead beyond the immediate neighbors of the current state.

Sometimes called Greedy Local Search.

• Problem: can get stuck in local maxima.
• Its success depends very much on the shape of the state-space landscape: if there are few local maxima, random-restart hill climbing will find a "good" solution very quickly.
Example: n-queens

Put n queens on an n×n board with no two queens on the same row, column, or diagonal.

Move a queen to reduce the number of conflicts.
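The conflict count used as the objective here can be sketched as a short function. This is a minimal illustration, assuming the common encoding in which position c of the state holds the row of the queen in column c (so column conflicts are impossible by construction); this encoding is an assumption, not stated on the slide.

```python
# Hedged sketch: count attacking pairs of queens in an n-queens state.
# state[c] is the row of the queen in column c (assumed encoding).
def conflicts(state):
    n = len(state)
    count = 0
    for c1 in range(n):
        for c2 in range(c1 + 1, n):
            same_row = state[c1] == state[c2]
            # Two queens share a diagonal when their row distance
            # equals their column distance.
            same_diag = abs(state[c1] - state[c2]) == c2 - c1
            if same_row or same_diag:
                count += 1
    return count

# A solved 4-queens board has zero conflicts:
print(conflicts((1, 3, 0, 2)))  # 0
```

A hill-climbing move then picks the queen relocation that lowers this count the most.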
Hill-Climbing Search

[Figure: a search tree with root S, intermediate nodes A, B, C, D, E, F, and leaves H, G1, I, G2, with cost values on the arcs.]

Hill climbing follows S → B → E → G1.
SEARCH PATH = [S0, B1, E2, G13]; Cost = 1 + 2 + 3 = 6.

NOTE: The values on the arcs are evaluation function (cost function) values, NOT heuristic estimate values.

NOTE: Hill climbing does NO revising or backtracking. This is the only difference between hill climbing and best-first search.
Hill-Climbing Search Disadvantages
Hill-Climbing Search Algorithm

The hill-climbing search algorithm is the most basic local search technique. At each step the current node is replaced by the best neighbor; in this version, that means the neighbor with the highest VALUE, but if a heuristic cost estimate h is used, we would find the neighbor with the lowest h.
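The step described above, replacing the current node by its best neighbor until no neighbor is higher, can be sketched as follows. The objective function and neighbor generator here are illustrative choices, not taken from the slides.

```python
# Hedged sketch of steepest-ascent hill climbing.
# f: objective function to maximize; neighbors: returns the
# candidate successor states of a state.
def hill_climb(f, start, neighbors):
    current = start
    while True:
        best = max(neighbors(current), key=f)
        if f(best) <= f(current):
            return current  # a peak: no neighbor has a higher value
        current = best

# Illustrative objective with a single peak at x = 3.
f = lambda x: -(x - 3) ** 2
result = hill_climb(f, 10, lambda x: [x - 1, x + 1])
print(result)  # 3
```

Because the loop stops at the first state with no better neighbor, a landscape with several peaks would trap this sketch at a local maximum, which is exactly the weakness the slides note.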
Simulated Annealing Search

What is annealing? – the physical origin
Simulated Annealing Search

To avoid getting stuck in a local maximum, it tries randomly (using a probability function) to move to another state; if this new state is better it moves there, otherwise it tries another move, and so on.
Properties of Simulated Annealing Search

The problem with this approach is that the neighbors of a state are not guaranteed to contain any of the existing better solutions, which means that failure to find a better solution among them does not guarantee that no better solution exists.

It will not get stuck in a local optimum.

If it runs for an infinite amount of time, the global optimum will be found.
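The probabilistic acceptance rule described above can be sketched as follows. The geometric cooling schedule, the parameter values, and the toy objective are all illustrative assumptions; real applications tune these carefully.

```python
import math
import random

# Hedged sketch of simulated annealing for maximization.
# A worse move (delta < 0) is accepted with probability
# exp(delta / t), which shrinks as the temperature t cools.
def simulated_annealing(f, start, neighbor, t0=10.0, cooling=0.95, steps=1000):
    current = start
    t = t0
    for _ in range(steps):
        nxt = neighbor(current)
        delta = f(nxt) - f(current)
        if delta > 0 or random.random() < math.exp(delta / t):
            current = nxt
        t *= cooling  # geometric cooling schedule (illustrative choice)
    return current

# Toy objective with a single global peak at x = 3.
f = lambda x: -(x - 3) ** 2
random.seed(0)
result = simulated_annealing(f, 20, lambda x: x + random.choice([-1, 1]))
print(result)  # 3: once t is tiny, only improving moves are accepted
```

Early on, when t is high, downhill moves are accepted often enough to escape local maxima; as t approaches zero the behavior reduces to plain hill climbing, matching the "infinite time finds the global optimum" property quoted above.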
Genetic Algorithms

• A genetic algorithm is a search heuristic inspired by Charles Darwin's theory of natural evolution.

• This algorithm reflects the process of natural selection, where the fittest individuals are selected for reproduction in order to produce the offspring of the next generation.

• A successor state is generated by combining two parent states, rather than by modifying a single state.

• Start with k randomly generated states (the population); each state is an individual.
Genetic Algorithms

• A state is represented as a string over a finite alphabet (often a string of 0s and 1s).

• An evaluation function (fitness function) assigns higher values to better states.

• Produce the next generation of states by selection, crossover, and mutation.

• Commonly, the algorithm terminates when either a maximum number of generations has been produced or a satisfactory fitness level has been reached for the population.
The 8-queens Problem

Five phases are considered in a genetic algorithm:
1. Initial population
2. Fitness function
3. Selection
4. Crossover
5. Mutation
Initial Population

The process begins with a set of individuals called a Population. Each individual is a solution to the problem you want to solve.

An individual is characterized by a set of parameters (variables) known as Genes. Genes are joined into a string to form a Chromosome (a solution).

In a genetic algorithm, the set of genes of an individual is represented as a string over an alphabet. Usually, binary values are used (a string of 1s and 0s). We say that we encode the genes in a chromosome.
Fitness Function

• The fitness function determines how fit an individual is (the ability of an individual to compete with other individuals).
• It gives a fitness score to each individual.
• The probability that an individual will be selected for reproduction is based on its fitness score.
Selection

• The idea of the selection phase is to select the fittest individuals and let them pass their genes to the next generation.
• Pairs of individuals (parents) are selected based on their fitness scores. Individuals with high fitness have a greater chance of being selected for reproduction.
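One common way to realize fitness-based selection is fitness-proportionate ("roulette wheel") selection, sketched below. The population strings and the count-the-1s fitness function are illustrative assumptions; the slide does not fix a particular selection scheme.

```python
import random

# Hedged sketch of fitness-proportionate selection: each
# individual's chance of being picked equals its fitness divided
# by the population's total fitness. For simplicity this draw is
# with replacement, so the same parent can be picked twice.
def select(population, fitness):
    weights = [fitness(ind) for ind in population]
    return random.choices(population, weights=weights, k=2)

random.seed(1)
population = ["00000", "01110", "11111"]
# Illustrative fitness: the number of 1 bits in the string.
parents = select(population, fitness=lambda s: s.count("1"))
```

Note that "00000" has fitness 0 here, so it can never be selected; fitter strings dominate the draw, which is exactly the bias the slide describes.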
Crossover

• For each pair of parents to be mated, a crossover point is chosen at random from within the genes.
• For example, consider the crossover point to be 3, as shown below.
• Offspring are created by exchanging the genes of the parents among themselves until the crossover point is reached.
• The new offspring are added to the population.
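Single-point crossover at the slide's example point of 3 can be sketched in a few lines; the parent bit strings are made up for illustration.

```python
# Hedged sketch of single-point crossover: each child takes one
# parent's genes up to the crossover point and the other parent's
# genes after it.
def crossover(p1, p2, point):
    child1 = p1[:point] + p2[point:]
    child2 = p2[:point] + p1[point:]
    return child1, child2

print(crossover("11111", "00000", 3))  # ('11100', '00011')
```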
Mutation

• In certain new offspring formed, some of their genes can be subjected to a mutation with a low random probability.
• This implies that some of the bits in the bit string can be flipped.
• Mutation occurs to maintain diversity within the population and prevent premature convergence.
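Bit-flip mutation can be sketched as below; the mutation rate of 0.05 is an illustrative default, not a value from the slides.

```python
import random

# Hedged sketch of bit-flip mutation: each bit is flipped
# independently with a small probability.
def mutate(chromosome, rate=0.05):
    bits = []
    for bit in chromosome:
        if random.random() < rate:
            bits.append("1" if bit == "0" else "0")  # flip this bit
        else:
            bits.append(bit)
    return "".join(bits)

print(mutate("11111", rate=1.0))  # "00000": every bit flips
```

With a low rate most offspring pass through unchanged, which preserves good solutions while still injecting the diversity the slide mentions.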
Termination

• The algorithm terminates if the population has converged (does not produce offspring significantly different from the previous generation).
• Then it is said that the genetic algorithm has provided a set of solutions to our problem.
Genetic Algorithm Pseudocode

1) Randomly initialize population p
2) Determine fitness of population
3) Until convergence, repeat:
   a) Select parents from population
   b) Crossover and generate new population
   c) Perform mutation on new population
   d) Calculate fitness for new population
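The pseudocode above can be turned into a small runnable sketch. The toy "OneMax" objective (maximize the number of 1s in a bit string), the population size, the generation count, and the rates are all illustrative assumptions; a fixed generation count stands in for the convergence test in step 3.

```python
import random

LENGTH, POP_SIZE, GENERATIONS, MUTATION_RATE = 20, 30, 100, 0.02

def fitness(ind):
    # Illustrative fitness: count of 1 bits (the OneMax problem).
    return ind.count("1")

def random_individual():
    return "".join(random.choice("01") for _ in range(LENGTH))

def select(population):
    # Fitness-proportionate selection of one parent (+1 avoids
    # all-zero weights).
    weights = [fitness(i) + 1 for i in population]
    return random.choices(population, weights=weights)[0]

def crossover(p1, p2):
    # Single-point crossover at a random point.
    point = random.randrange(1, LENGTH)
    return p1[:point] + p2[point:]

def mutate(ind):
    # Flip each bit independently with a low probability.
    return "".join(
        ("1" if b == "0" else "0") if random.random() < MUTATION_RATE else b
        for b in ind
    )

def genetic_algorithm():
    population = [random_individual() for _ in range(POP_SIZE)]  # step 1
    for _ in range(GENERATIONS):                                 # step 3
        population = [
            mutate(crossover(select(population), select(population)))  # 3a-3c
            for _ in range(POP_SIZE)
        ]
    return max(population, key=fitness)                          # steps 2/3d

random.seed(0)
best = genetic_algorithm()
print(fitness(best))
```

Each generation applies selection, crossover, and mutation exactly as in steps 3a to 3c; over the run, the best individual's fitness drifts toward the maximum of 20.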
