Centurian University Talk On Pso
Optimization: An Overview
Problem Definition
6/21/2013 3
Calculus
Why Evolutionary Computation?
• Derivative-free
Other Classes of Heuristics Inspired by Nature
Evolutionary Algorithms (EA)
– all methods inspired by evolution
Families of nature-inspired computation (diagram):
• Evolution-inspired: Evolutionary Computation
• Swarm-inspired: Collective/Swarm Intelligence
• Nature/geography-inspired: Geographical Computation
Evolutionary Computation includes:
• Genetic Algorithm
• Memetic Algorithm
• Artificial Immune System
Swarm Intelligence includes:
• Ant Colony Optimization
• Particle Swarm Optimization
• Bat Optimization
Geographical/Nature-Inspired Optimization includes:
• Weed Optimization
• Biogeography-Based Optimization
• River Formation Dynamics
Terms
• Stochastic optimization: used when the function measurements or the inputs in the search process are random (noisy).
• Combinatorial optimization: concerned with problems where the set of feasible solutions is discrete or can be reduced to a discrete one.
• Multi-modal optimization: optimization problems are often multi-modal; that is, they possess multiple good solutions.
• Heuristics and meta-heuristics: make few or no assumptions about the problem being optimized. Heuristics usually do not guarantee that an optimal solution will be found; instead, they are used to find approximate solutions to many complicated optimization problems.
Components of an Optimization Problem
• Objective Function: An objective function which we want to
minimize or maximize.
• For example, in a manufacturing process, we might want to
maximize the profit or minimize the cost.
• In fitting experimental data to a user-defined model, we might
minimize the total deviation of observed data from predictions
based on the model.
• In designing an inductor, we might want to maximize the Quality
Factor and minimize the area.
Components of an Optimization Problem
• Design Variables: A set of unknowns or variables which affect
the value of the objective function.
• In the manufacturing problem, the variables might include the
amounts of different resources used or the time spent on each
activity.
• In the data-fitting problem, the unknowns are the parameters
that define the model.
• In the inductor design problem, the variables define the
layout geometry.
Components of an Optimization Problem
• Constraints: A set of constraints that allow the unknowns to take
on certain values but exclude others.
Mathematical Formulation of
Optimization Problems
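The slide's formulation itself did not survive extraction; the standard form given below is reproduced from common optimization texts, not from the original slide:

```latex
\begin{aligned}
\min_{x \in \mathbb{R}^n} \quad & f(x) \\
\text{subject to} \quad & g_i(x) \le 0, \quad i = 1, \dots, m \\
& h_j(x) = 0, \quad j = 1, \dots, p \\
& x^{L} \le x \le x^{U}
\end{aligned}
```

Here f is the objective function, g_i the inequality constraints, h_j the equality constraints, and x^L, x^U the bounds on the design variables — matching the three components (objective, variables, constraints) listed above.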
Origin of PSO
Concept: based on bird flocks and fish schools.
What is PSO?
• In PSO, each single solution is a "bird" in the search space; call it a "particle".
• Particles have velocities, which direct their flight.
Particle Swarm Optimization
• An evolutionary computational technique based on the movement and intelligence of
swarms looking for the most fertile feeding location
• A simple algorithm: easy to implement, with few parameters to adjust (mainly the
velocity)
Continued…
• It uses a number of agents (particles) that constitute a swarm
moving around in the search space looking for the best solution
(based on bird flocks and fish schools).
Continued…
• Each particle tracks the best solution it has achieved so far; this value is
called pbest.
• Another best value that is tracked by the PSO is the best value obtained so far
by any particle in the neighborhood of the particle. This value is called gbest.
PSO Basic Mathematical Equations

\[
\begin{cases}
v_{t+1} = c_1 v_t + c_2 \,(p_{i,t} - x_t) + c_3 \,(p_{g,t} - x_t) \\
x_{t+1} = x_t + v_{t+1}
\end{cases}
\]

where
v_t := velocity at time step t
x_t := position at time step t
p_{i,t} := particle's best previous position (personal best) at time step t
p_{g,t} := best neighbour's previous best position at time step t
c_1 := inertia factor
c_2 := personal (cognitive) influence factor
c_3 := social influence factor
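The update equations can be sketched in a few lines of code. The deck's own implementation later is in MATLAB and multiplies the attraction terms by rand; the deterministic form here follows the equation as written, and the function and variable names are illustrative, not from the slides:

```python
def pso_step(v, x, p_i, p_g, c1, c2, c3):
    """One PSO update per the slide's equations:
    v_{t+1} = c1*v_t + c2*(p_i - x_t) + c3*(p_g - x_t)
    x_{t+1} = x_t + v_{t+1}
    """
    v_new = [c1 * vd + c2 * (pid - xd) + c3 * (pgd - xd)
             for vd, xd, pid, pgd in zip(v, x, p_i, p_g)]
    x_new = [xd + vd for xd, vd in zip(x, v_new)]
    return v_new, x_new

# illustrative numbers: particle at origin, pulled toward p_i and p_g
v1, x1 = pso_step(v=[1.0, 0.0], x=[0.0, 0.0],
                  p_i=[1.0, 1.0], p_g=[2.0, 2.0],
                  c1=0.7, c2=1.5, c3=1.5)
```

Each dimension is updated independently, which is why the full MATLAB code later loops over (or vectorizes across) the variables.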
PSO Velocity Update Equations Using
Constriction Factor Method
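The equation on this slide was lost in extraction; the widely used Clerc–Kennedy constriction form, reproduced here from the standard literature rather than from the original slide, is:

```latex
v_{t+1} = \chi \left[ v_t + c_1 r_1 (p_{i,t} - x_t) + c_2 r_2 (p_{g,t} - x_t) \right],
\qquad
\chi = \frac{2}{\left| 2 - \varphi - \sqrt{\varphi^2 - 4\varphi} \right|},
\quad \varphi = c_1 + c_2 > 4
```

where r_1, r_2 are uniform random numbers in [0, 1]. The common choice φ = 4.1 (c_1 = c_2 = 2.05) gives χ ≈ 0.7298, which damps the velocity and helps the swarm converge.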
PSO algorithm
Start
Do
    For each particle
        Calculate fitness value
        If the fitness value is better than the best fitness value (pBest) in history
            set current value as the new pBest
    End
    Choose the particle with the best fitness value of all the particles as the gBest
    For each particle
        Calculate particle velocity
        Update particle position
    End
While maximum iterations or minimum error criterion is not attained
End
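The pseudocode above maps directly onto a short gbest-PSO implementation. The deck's own code (later slides) is MATLAB; this is a minimal Python sketch, where the coefficient values w = 0.7, c1 = c2 = 1.5 and the sphere test function are illustrative assumptions, not taken from the slides:

```python
import random

def pso(f, dim, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal gbest PSO following the pseudocode above."""
    rng = random.Random(seed)
    lo, hi = bounds
    # random initial positions, zero initial velocities
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    P = [x[:] for x in X]                  # personal bests (pBest)
    Pv = [f(x) for x in X]                 # personal best fitness values
    g = min(range(n_particles), key=lambda i: Pv[i])
    G, Gv = P[g][:], Pv[g]                 # global best (gBest)
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (P[i][d] - X[i][d])
                           + c2 * r2 * (G[d] - X[i][d]))
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))
            fx = f(X[i])
            if fx < Pv[i]:                 # set current value as new pBest
                P[i], Pv[i] = X[i][:], fx
                if fx < Gv:                # best of all particles becomes gBest
                    G, Gv = X[i][:], fx
    return G, Gv

sphere = lambda x: sum(xi * xi for xi in x)  # simple convex test function
best, val = pso(sphere, dim=2, bounds=(-5.12, 5.12))
```

On the 2-D sphere function the swarm drives the best fitness close to the true minimum of 0 at the origin.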
PSO and GA Comparison
• Differences
– PSO does not have genetic operators like crossover and
mutation; particles update themselves with an internal
velocity.
– Particles also have memory, which is important to the algorithm.
– Particles do not die.
– The information-sharing mechanism in PSO is significantly
different: information flows from the best particle to the
others, whereas in GA the whole population moves together.
• PSO has a memory:
not "what" that best solution was, but "where" that best solution
was
• Quality: the population responds to the quality factors pbest and gbest
• Diverse response: responses are allocated between pbest and gbest
• Stability: the population changes state only when gbest changes
• Adaptability: the population does change state when gbest changes
• There is no selection in PSO:
all particles survive for the length of the run;
PSO is the only EA that does not remove candidate population
members
• In PSO, the topology is constant; a neighbor is a neighbor
• Typical population size: 20-40
Schwefel's function

\[
f(x) = \sum_{i=1}^{n} -x_i \sin\!\left(\sqrt{|x_i|}\right)
\]

where -500 \le x_i \le 500

global minimum:
\[
f(x^*) = -418.9829\,n \quad \text{at } x_i = 420.9687,\; i = 1:n
\]
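As a quick numerical check of the function and its reported optimum (a sketch added here; the original slides give no code for this function):

```python
import math

def schwefel(x):
    """Schwefel's function: f(x) = sum_i -x_i * sin(sqrt(|x_i|))."""
    return sum(-xi * math.sin(math.sqrt(abs(xi))) for xi in x)

# Near the known optimum x_i = 420.9687, each dimension contributes about -418.9829.
val = schwefel([420.9687, 420.9687])
```

The many regularly spaced local minima of this function are what make it a standard stress test for global optimizers such as PSO.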
Advantages Over Other Optimization Techniques
• It is a derivative-free technique, unlike many conventional techniques
• It has the flexibility to be integrated with other optimization techniques to form
hybrid tools
• It is less sensitive to the nature of the objective function, such as its convexity
or continuity
• It has fewer parameters to adjust than many other competing evolutionary
techniques
• It has the ability to escape local minima
• It is easy to implement and program with basic mathematical and logic
operations
• It does not require a good initial solution to start its iteration process
• It can handle objective functions of a stochastic nature, as in the case of
representing one of the optimization variables as random
Disadvantages of PSO
• Lack of a solid mathematical background
• Failure to assure a globally optimal solution
• Sensitivity to the social-influence aspect of the algorithm
• No generalized rules on how to tune its parameters to suit
different optimization problems
• No clear methodology for coefficient adjustment
DECLARATION OF VARIABLES AND
THEIR MEANING
% itermax : maximum iteration number
% c1, c2 : two parameters of the PSO algorithm
% wmax, wmin : maximum and minimum values of the inertia weight w
% population_size : size of the population / number of particles
% var_max : maximum value of each variable
% var_min : minimum value of each variable
% var_size : total number of variables
% population : matrix of the values of all particles / solutions
% pbest : personal best position
% pbest_value : personal best fitness value
% gbest : best position among all particles
% gbest_value : fitness value of the best among the group
% velocity_max : maximum value of the velocity
% velocity_min : minimum value of the velocity
Step-1: INITIALIZATION OF VARIABLES
itermax = 100;
c1 = 2;
c2 = 2;
wmax = 0.9;
wmin = 0.4;
population_size = 20;
var_max = [5.12 5.12];
var_min = [-5.12 -5.12];
velocity_max = var_max;
velocity_min = var_min;
var_size = length(var_max);
Step-2: Initial Position
population = zeros(population_size, var_size);
velocity = zeros(population_size, var_size);
velocity_new = zeros(population_size, var_size);
for i = 1:population_size
    for j = 1:var_size
        population(i,j) = var_min(1,j) + rand*(var_max(1,j) - var_min(1,j));
    end
end
Step-3: Initial Velocity
for i = 1:population_size
    for j = 1:var_size
        velocity(i,j) = velocity_min(1,j) + rand*(velocity_max(1,j) - velocity_min(1,j));
    end
end
Step-4: Determination of Pbest & Gbest
fitness = objective_function(population);
pbest = population;
pbest_value = fitness;
[xx, yy] = min(fitness);
gbest = population(yy,:);
gbest_value = xx;
Loop
for iter = 1:itermax
    % Step-5: Update Weight, Velocity and Check Limit
    % Step-6: Update Position & Limit Checking
    % Step-7: Modifying Pbest & Gbest
    % Step-8: Graph & Data Presentation
end
Step-5: Update Weight, Velocity and Check Limit
w = wmax - ((wmax - wmin)/itermax)*iter;
for i = 1:population_size
    velocity_new(i,:) = w*velocity(i,:) ...
        + c1*rand*(pbest(i,:) - population(i,:)) ...
        + c2*rand*(gbest(1,:) - population(i,:));
end
Contd. …
for i = 1:population_size
    for j = 1:var_size
        if velocity_new(i,j) > velocity_max(j)
            velocity_new(i,j) = velocity_max(j);
        elseif velocity_new(i,j) < velocity_min(j)
            velocity_new(i,j) = velocity_min(j);
        end
    end
end
Step-6: Update Position & Check Limit
population_new = population + velocity_new;
for i = 1:population_size
    for j = 1:var_size
        if population_new(i,j) > var_max(j)
            population_new(i,j) = var_max(j);
        elseif population_new(i,j) < var_min(j)
            population_new(i,j) = var_min(j);
        end
    end
end
Step-7: Modifying Pbest
fitness_new = objective_function(population_new);
[x, y] = min(fitness_new);
for i = 1:population_size
    if fitness_new(i) < pbest_value(i)
        pbest(i,:) = population_new(i,:);
        pbest_value(i) = fitness_new(i);
    end
end
Step-7: Modifying Gbest
if x < gbest_value
    gbest = population_new(y,:);
    gbest_value = x;
end
population = population_new;
velocity = velocity_new;
Step-8: Graphs & Data Presentation
best_value(iter) = gbest_value;
plot(best_value);
drawnow
Objective Function
% 2-D Rastrigin function: global minimum f = 0 at x = (0, 0)
function fitness = objective_function(population)
[row_population, col_population] = size(population);
fitness = zeros(1, row_population);
for i = 1:row_population
    xx = population(i, 1:col_population);
    fitness(i) = 20 + xx(1)^2 + xx(2)^2 ...
        - 10*(cos(2*pi*xx(1)) + cos(2*pi*xx(2)));
end
end
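A quick Python check of the objective's known values (a sketch added here, not part of the original slides; the function name is illustrative):

```python
import math

def rastrigin2(x1, x2):
    """2-D Rastrigin: 20 + x1^2 + x2^2 - 10*(cos(2*pi*x1) + cos(2*pi*x2))."""
    return (20 + x1**2 + x2**2
            - 10 * (math.cos(2 * math.pi * x1) + math.cos(2 * math.pi * x2)))

f_min = rastrigin2(0.0, 0.0)   # global minimum
f_one = rastrigin2(1.0, 1.0)   # a nearby local minimum
```

At the global minimum (0, 0) the value is 0; at (1, 1), one of the many local minima that PSO must escape, it is 2.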
Thank You All