
Genetic Algorithm and Particle Swarm Optimization: An Overview

Dr. Pravat Kumar Rout
Prof. and HOD,
EEE Department,
ITER, SOA University
Goal of Optimization

Find values of the variables that minimize or maximize the objective function while satisfying the constraints.
Problem Definition

• Optimization of continuous nonlinear functions
• Finding the best solution in the problem space
Calculus

The maximum and minimum of a smooth function are reached at a stationary point, where its gradient vanishes.
Why Evolutionary Computation

• No exact mathematical expression of the objective is required (classical techniques are not applicable to certain classes of objective functions)
• Derivative free
• Global optimization (not trapped at local minima, unlike classical techniques)
• Less computational complexity
Other Classes of Heuristics Inspired by Nature

• Evolutionary Algorithms (EA): all methods inspired by evolution
• Swarm Intelligence (SI): all methods inspired by collective intelligence
• Geographical Nature Algorithms (GNA): all methods inspired by the geographical structure of the environment/earth
[Figure: three nature-inspired families — Evolution inspired: Evolutionary Computation; Collective intelligence inspired: Swarm Intelligence; Geographical nature inspired: Geographical Nature Computation]
[Figure: Evolutionary Computation — Genetic Algorithm, Memetic Algorithm, Artificial Immune System]
[Figure: Swarm Intelligence — Particle Swarm Optimization, Ant Colony Optimization, Bat Optimization]
[Figure: Geographical Nature Optimization — Weed Optimization, Biogeographical Optimization, River Formation Dynamics]
Terms
• Stochastic optimization: for use with random (noisy) function measurements or random inputs in the search process.
• Combinatorial optimization: concerned with problems where the set of feasible solutions is discrete or can be reduced to a discrete one.
• Multi-modal optimization: optimization problems are often multi-modal; that is, they possess multiple good solutions.
• Heuristics and meta-heuristics: make few or no assumptions about the problem being optimized. Heuristics usually do not guarantee that an optimal solution will be found; rather, they are used to find approximate solutions to many complicated optimization problems.
Components of an Optimization Problem
• Objective function: the function we want to minimize or maximize.
• For example, in a manufacturing process, we might want to maximize the profit or minimize the cost.
• In fitting experimental data to a user-defined model, we might minimize the total deviation of the observed data from the model predictions.
• In designing an inductor, we might want to maximize the quality factor and minimize the area.
Components of an Optimization Problem
• Design variables: a set of unknowns or variables that affect the value of the objective function.
• In the manufacturing problem, the variables might include the amounts of the different resources used or the time spent on each activity.
• In the data-fitting problem, the unknowns are the parameters that define the model.
• In the inductor design problem, the variables define the layout geometry of the panel.
Components of an Optimization Problem
• Constraints: a set of constraints that allow the unknowns to take on certain values but exclude others.
• For the manufacturing problem, it does not make sense to spend a negative amount of time on any activity, so we constrain all the "time" variables to be non-negative.
• In the inductor design problem, we would probably want to limit the upper and lower values of the layout parameters and to target an inductance value within the tolerance level.
Mathematical Formulation of Optimization Problems

Minimize the objective function:

    min f(x),   x = (x1, x2, ..., xn)

subject to the constraints:

    c_i(x) >= 0   (inequality constraints)
    c_j(x)  = 0   (equality constraints)

Example:

    min (x1 - 2)^2 + (x2 - 1)^2
    subject to the inequality constraints
        x1^2 - x2^2 <= 0
        x1 + x2 <= 2
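To make the formulation concrete, the example above can be written as MATLAB anonymous functions. This is only an illustrative sketch; the handle names f, g1, g2 and the trial point x0 are assumptions, not part of the original slide.

% Example problem expressed in MATLAB (illustrative sketch only).
f  = @(x) (x(1) - 2)^2 + (x(2) - 1)^2;    % objective to minimize
g1 = @(x) x(1)^2 - x(2)^2;                % inequality constraint: g1(x) <= 0
g2 = @(x) x(1) + x(2) - 2;                % inequality constraint: g2(x) <= 0

x0 = [1, 1];                              % a feasible trial point (g1 = 0, g2 = 0)
fprintf('f = %.4f, g1 = %.4f, g2 = %.4f\n', f(x0), g1(x0), g2(x0));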
Origin of PSO

Concept: based on bird flocks and fish schools.

How can birds or fish exhibit such a coordinated collective behavior?
What is PSO?
• In PSO, each single solution is a "bird" in the search space; we call it a "particle".
• All particles have fitness values, which are evaluated by the fitness function to be optimized.
• All particles have velocities, which direct the flying of the particles.
• The particles fly through the problem space by following the current optimum particles.
Particle Swarm Optimization
• An evolutionary computational technique based on the movement and intelligence of swarms looking for the most fertile feeding location.
• It was developed in 1995 by James Kennedy and Russell Eberhart.
• A simple algorithm: easy to implement, with few parameters to adjust (mainly the velocity).
• A "swarm" is an apparently disorganized collection (population) of moving individuals that tend to cluster together, while each individual seems to be moving in a random direction.
Continued…
• It uses a number of agents (particles) that constitute a swarm moving around in the search space looking for the best solution (based on bird flocks and fish schools).
• Each particle is treated as a point in a D-dimensional space that adjusts its "flying" according to its own flying experience as well as the flying experience of other particles.
• Each particle keeps track of the coordinates in the problem space associated with the best solution (fitness) it has achieved so far. This value is called pbest.
Continued…
• Another best value tracked by PSO is the best value obtained so far by any particle in the neighbourhood of the particle. This value is called gbest.
• The PSO concept consists of changing the velocity of (accelerating) each particle toward its pbest and the gbest position at each time step.
PSO Basic Mathematical Equations

    v(t+1) = c1*v(t) + c2*(p_i,t - x_t) + c3*(p_g,t - x_t)
    x(t+1) = x_t + v(t+1)

where
    v_t    : velocity at time step t
    x_t    : position at time step t
    p_i,t  : particle's best previous position (pbest) at time step t
    p_g,t  : best neighbour's previous best position (gbest) at time step t
    c1     : inertia factor
    c2     : personal (cognitive) influence factor
    c3     : social influence factor

[Figure: the new velocity combines the particle's current velocity, a pull toward its personal best p_i, and a pull toward the neighbourhood best p_g.]
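As an illustration, a minimal MATLAB sketch of these two update equations for a single particle could look as follows; all variable names and numeric values here are assumptions for the example, not part of the slide.

% Minimal sketch of the basic PSO update for one particle (illustrative values).
x       = [0.5, -1.0];          % current position
v       = [0.0,  0.0];          % current velocity
pbest_i = [1.0, -0.5];          % particle's own best position so far
gbest   = [2.0,  1.0];          % best position found in the neighbourhood
c1 = 0.7; c2 = 1.5; c3 = 1.5;   % inertia, personal and social coefficients

v = c1*v + c2*(pbest_i - x) + c3*(gbest - x);   % velocity update
x = x + v;                                      % position update

In practice, and in the implementation later in these slides, the personal and social terms are each multiplied by a uniform random number to keep the search stochastic.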
PSO Velocity Update Equations Using the Constriction Factor Method

    v_id_new = K * [ v_id_old + c1*rand1*(p_id - x_id) + c2*rand2*(p_gd - x_id) ]
    x_id_new = x_id_old + v_id_new

    K = 2 / | 2 - phi - sqrt(phi^2 - 4*phi) |,   phi = c1 + c2,   phi > 4

(phi was set to 4.1, so K = 0.729)
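A corresponding MATLAB sketch of the constriction-factor update is shown below; again, the variable names and sample values are assumptions for illustration.

% Constriction-factor PSO velocity/position update (illustrative sketch).
c1 = 2.05; c2 = 2.05;                            % phi = c1 + c2 = 4.1
phi = c1 + c2;
K   = 2 / abs(2 - phi - sqrt(phi^2 - 4*phi));    % constriction coefficient, ~0.729

x     = [0.5, -1.0];                             % current position
v     = [0.1,  0.2];                             % current velocity
pbest = [1.0, -0.5];                             % particle's personal best
gbest = [2.0,  1.0];                             % neighbourhood best

v = K * (v + c1*rand*(pbest - x) + c2*rand*(gbest - x));   % velocity update
x = x + v;                                                  % position update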
PSO Algorithm (Flowchart)

1. Start.
2. Initialize particles with random positions and zero velocities.
3. Evaluate the fitness value of each particle.
4. Compare each particle's fitness with pbest and gbest and update them.
5. If the stopping criterion is met, end; otherwise update velocity and position and return to step 3.

pbest = the best solution (fitness) a particle has achieved so far.
gbest = the global best solution of all particles.
Iteration Procedure for PSO

Pseudo code of the iteration procedure:

    For each particle
        Initialize particle
    End

    Do
        For each particle
            Calculate fitness value
            If the fitness value is better than the best fitness value (pBest) in history
                set current value as the new pBest
        End

        Choose the particle with the best fitness value of all the particles as the gBest

        For each particle
            Update particle velocity
            Update particle position
        End
    While maximum iterations or minimum error criterion is not attained
PSO and GA Comparison
• Commonalities
  – PSO and GA are both population-based stochastic optimization methods.
  – Both algorithms start with a group of randomly generated individuals (a population).
  – Both use fitness values to evaluate the population.
  – Both update the population and search for the optimum with random techniques.
  – Neither system guarantees success.
PSO and GA Comparison
• Differences
  – PSO does not have genetic operators like crossover and mutation; particles update themselves with the internal velocity.
  – Particles also have memory, which is important to the algorithm.
  – Particles do not die.
  – The information-sharing mechanism in PSO is significantly different: information flows from the best particle to the others, whereas in GA the whole population moves together.
• PSO has a memory: not "what" that best solution was, but "where" that best solution was.
• Quality: the population responds to the quality factors pbest and gbest.
• Diverse response: responses are allocated between pbest and gbest.
• Stability: the population changes state only when gbest changes.
• Adaptability: the population does change state when gbest changes.
• There is no selection in PSO: all particles survive for the length of the run. PSO is the only EA that does not remove candidate population members.
• In PSO, the topology is constant; a neighbour is a neighbour.
• Typical population size: 20–40.
Schwefel's Function

    f(x) = sum_{i=1}^{n} -x_i * sin( sqrt( |x_i| ) )

where
    -500 <= x_i <= 500

Global minimum:
    f(x*) = -n * 418.9829   at   x_i = 420.9687,  i = 1 : n
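For reference, Schwefel's function could be coded in the same style as the objective function used later in these slides; the function name objective_function_schwefel is an assumption for this sketch.

function fitness = objective_function_schwefel(population)
% Schwefel's function, evaluated for every row (particle) of the population matrix.
[row_population, ~] = size(population);
fitness = zeros(1, row_population);
for i = 1:row_population
    x = population(i, :);
    fitness(i) = sum(-x .* sin(sqrt(abs(x))));   % f(x) = sum_i -x_i * sin(sqrt(|x_i|))
end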
Advantages Over Other Optimization Techniques
• It is a derivative-free technique, unlike many conventional techniques.
• It has the flexibility to be integrated with other optimization techniques to form hybrid tools.
• It is less sensitive to the nature of the objective function, such as convexity or continuity.
• It has fewer parameters to adjust than many other competing evolutionary techniques.
• It has the ability to escape local minima.
• It is easy to implement and program with basic mathematical and logic operations.
• It does not require a good initial solution to start its iteration process.
• It can handle objective functions with a stochastic nature, such as when one of the optimization variables is represented as random.
Disadvantages of PSO
• Lack of a solid mathematical background.
• Failure to assure a globally optimal solution.
• The social influence aspect of the algorithm.
• Lack of generalized rules on how to tune its parameters to suit different optimization problems.
• No clear methodology for coefficient adjustment.
DECLARATION OF VARIABLES AND THEIR MEANING

% itermax         : maximum iteration number
% c1, c2          : two parameters (acceleration coefficients) of the PSO algorithm
% wmax, wmin      : maximum and minimum values of the inertia weight w
% population_size : size of the population / number of particles
% var_max         : maximum value of each variable
% var_min         : minimum value of each variable
% var_size        : total number of variables
% population      : matrix of the values of all particles / solutions
% pbest           : personal best position of each particle
% pbest_value     : personal best fitness value
% gbest           : best position among all particles (group best)
% gbest_value     : fitness value of the best among the group
% velocity_max    : maximum value of the velocity
% velocity_min    : minimum value of the velocity
Step-1: Initialization of Variables

itermax = 100;
c1 = 2;
c2 = 2;
wmax = 0.9;
wmin = 0.4;
population_size = 20;
var_max = [5.12 5.12];
var_min = [-5.12 -5.12];
velocity_max = var_max;
velocity_min = var_min;
var_size = length(var_max);
Step-2: Initial Position

population   = zeros(population_size, var_size);
velocity     = zeros(population_size, var_size);
velocity_new = zeros(population_size, var_size);

for i = 1:population_size
    for j = 1:var_size
        population(i,j) = var_min(1,j) + rand*(var_max(1,j) - var_min(1,j));
    end
end
Step-3: Initial Velocity

for i = 1:population_size
    for j = 1:var_size
        velocity(i,j) = velocity_min(1,j) + rand*(velocity_max(1,j) - velocity_min(1,j));
    end
end
Step-4: Determination of Pbest & Gbest

fitness     = objective_function(population);
pbest       = population;
pbest_value = fitness;

[xx, yy]    = min(fitness);
gbest       = population(yy,:);
gbest_value = xx;
Loop

for iter = 1:itermax
    % Step-5: Update weight and velocity, and check velocity limits
    % Step-6: Update position and check position limits
    % Step-7: Modify pbest and gbest
    % Step-8: Graph and data presentation
end
Step-5: Update Weight, Velocity and Check Limit

w = wmax - ((wmax - wmin)/itermax)*iter;
for i = 1:population_size
    velocity_new(i,:) = w*velocity(i,:) + ...
        c1*rand*(pbest(i,:) - population(i,:)) + ...
        c2*rand*(gbest(1,:) - population(i,:));
end
Contd. …

for i = 1:population_size
    for j = 1:var_size
        if velocity_new(i,j) > velocity_max(j)
            velocity_new(i,j) = velocity_max(j);
        elseif velocity_new(i,j) < velocity_min(j)
            velocity_new(i,j) = velocity_min(j);
        end
    end
end
Step-6: Update Position & Check Limit

population_new = population + velocity_new;

for i = 1:population_size
    for j = 1:var_size
        if population_new(i,j) > var_max(j)
            population_new(i,j) = var_max(j);
        elseif population_new(i,j) < var_min(j)
            population_new(i,j) = var_min(j);
        end
    end
end
Step-7: Modifying Pbest

fitness_new = objective_function(population_new);
[x, y] = min(fitness_new);
for i = 1:population_size
    if fitness_new(i) < pbest_value(i)   % compare against this particle's own pbest
        pbest(i,:)     = population_new(i,:);
        pbest_value(i) = fitness_new(i);
    end
end
Step-7: Modifying Gbest

if x < gbest_value
    gbest       = population_new(y,:);
    gbest_value = x;
end

population = population_new;
velocity   = velocity_new;
Step-8: Graphs & Data Presentation

best_value(iter) = gbest_value;
plot(best_value);
drawnow
Objective Function

function fitness = objective_function(population)
% Rastrigin function in two dimensions:
% f(x) = 20 + x1^2 + x2^2 - 10*(cos(2*pi*x1) + cos(2*pi*x2))
[row_population, col_population] = size(population);
fitness = zeros(1, row_population);
for i = 1:row_population
    xx = population(i,:);
    fitness(i) = 20 + xx(1)^2 + xx(2)^2 - 10*(cos(2*pi*xx(1)) + cos(2*pi*xx(2)));
end
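As a quick sanity check before running the full loop, the objective function can be evaluated on a few hand-picked points; the test points below are assumptions for illustration (the Rastrigin function has its global minimum of 0 at the origin).

% Evaluate the Rastrigin objective on a few sample positions (illustrative check).
test_points  = [0 0; 1 1; -2.5 3.1];
test_fitness = objective_function(test_points);
disp(test_fitness)                        % the first entry should be 0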
Thank You All
