Notes On PSO
1.
• The update is time dependent through t, the iteration index.
• The position at iteration k+1 is estimated from the previous position x_k and the predicted velocity v_{k+1}.
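The two update rules above can be sketched as follows. This is a minimal one-dimensional sketch; the function name and the values of w, c1, c2 are assumptions (a common choice in the standard PSO formulation), not taken from the notes:

```python
import random

# Standard PSO update equations (one particle, one dimension):
#   v_{k+1} = w*v_k + c1*r1*(pbest - x_k) + c2*r2*(gbest - x_k)
#   x_{k+1} = x_k + v_{k+1}
def pso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=random.random):
    r1, r2 = rng(), rng()  # uniform random factors in [0, 1)
    v_new = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x_new = x + v_new      # position estimate at k+1
    return x_new, v_new
```

Passing a deterministic `rng` (e.g. `lambda: 0.5`) makes a single step reproducible, which is handy when checking the update by hand.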
3.
• The update follows the general form of a gradient line search (like Newton's or the secant method, not bisection): x_{n+1} = x_n + alpha*d, a step size times a search direction.
• The first two terms of the update can be attributed to the previous step, while the third term is a factor (the step size) multiplied by the direction.
• Setting r1 and r2 to their extremes, (0, 0) and (1, 1), gives the minimum and maximum of the factor and the direction. (Check paper)
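The bracketing by r1 and r2 can be checked numerically. Reading the update as a line search x_{k+1} = x_k + alpha*d, evaluating the attraction term at the extremes (0, 0) and (1, 1) gives its minimum and maximum; the values of w, c1, c2 and the particle state below are assumed purely for illustration:

```python
# Assumed illustrative values, not taken from the paper:
w, c1, c2 = 0.7, 2.0, 2.0
x, v, pbest, gbest = 0.0, 1.0, 1.0, 2.0

def velocity(r1, r2):
    return w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)

v_min = velocity(0.0, 0.0)  # r1 = r2 = 0: pure inertia, w*v only
v_max = velocity(1.0, 1.0)  # r1 = r2 = 1: full attraction toward both bests
```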
• For the convergence analysis, both update equations (1) and (2) are written in matrix form.
• At convergence, x_{k+1} = x_k as k → ∞.
• This only holds when the velocity is zero and the current position, the local best, and the global best all coincide.
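The fixed-point condition can be verified directly: with zero velocity and x = pbest = gbest, one update step leaves the state unchanged. The values of w, c1, c2 and the random factors below are assumed for the check:

```python
# Assumed coefficients and random factors; any values give the same result
# when v = 0 and x = pbest = gbest:
w, c1, c2, r1, r2 = 0.7, 1.5, 1.5, 0.4, 0.9
x = pbest = gbest = 3.0
v = 0.0
v_next = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
x_next = x + v_next
assert v_next == 0.0 and x_next == x  # stationary: x_{k+1} = x_k
```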
4.
• Constraints
• The penalty parameters are built from the average of the objective function and the average level of violation of each constraint at each iteration.
where f(x) is the objective function, m is the number of constraints, g_i(x) is a specific constraint value (with violated constraints having values larger than zero), fbar is the average of the objective function values in the current swarm, and gbar_i is the violation of the i-th constraint averaged over the current population.
• The expression distributes the penalty parameter such that harder constraints receive larger penalties. If x satisfies all constraints, the objective is simply f(x); otherwise f(x) is increased by a penalty term. Suppose one constraint is not satisfied, so g_1 > 0. For a particle whose violation g_i equals the average violation gbar_i over the swarm, the penalty parameter is k = fbar/gbar in Equation 20; the penalty is therefore large for an individual that contributes most of the violations.
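The adaptive penalty described above can be sketched as follows. The exact expression in the source paper (its Equation 20) may differ; the function name and argument names here are illustrative, following only the description in these notes, with penalty parameter k_i = fbar/gbar_i per violated constraint:

```python
# Penalized objective: F(x) = f(x) + sum over violated constraints of
# (f_bar / gbar_i) * g_i(x), so constraints that are violated most on
# average across the swarm get the largest penalty parameter.
def penalized(f_x, g_x, f_bar, g_bar):
    """f_x: objective at x; g_x: constraint values (violated if > 0);
    f_bar: swarm-average objective; g_bar: swarm-average violation
    per constraint."""
    penalty = 0.0
    for g_i, gbar_i in zip(g_x, g_bar):
        if g_i > 0 and gbar_i > 0:  # only violated constraints contribute
            penalty += (f_bar / gbar_i) * g_i
    return f_x + penalty
```

For a feasible particle (all g_i ≤ 0) this returns f(x) unchanged, matching the note that f(x) is only inflated when constraints are violated.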
=== Conclusion ===
• PSO mimics the social behaviour of animals in a flock.
• Individual and group memory are used to update each particle's position, allowing both global and local search.
• PSO can be formulated with a stochastic step length and search direction.
• The effects of the social and individual parameters and of dynamic inertia were studied; c1 = c2 was found favourable, with global convergence and fast search.
• PSO can find better or equivalent structural optimization results for different tasks.