Ga-Matlab

The document discusses various optimization techniques available in MATLAB, including the genetic algorithm (ga), particle swarm optimization (particleswarm), simulated annealing (simulannealbnd), the multi-objective genetic algorithm (gamultiobj) and pattern search (patternsearch). Examples are provided for each technique to minimize different objective functions subject to various constraints. It is also mentioned that supplying analytical gradients of the objective function can improve optimization performance for some solvers.


gatool

x = ga(fitnessfcn,nvars,A,b,Aeq,beq,LB,UB)

nvars: number of variables in the problem

H.W.: Solve the following problem using gatool.

min f(x) = -x1·x2·x3

Subject to:
-x1 - 2x2 - 2x3 ≤ 0
x1 + 2x2 + 2x3 ≤ 72
0 ≤ x1, x2, x3 ≤ 30
Solution
clear
clc
nvars = 3;                 % number of decision variables
A = [-1 -2 -2; 1 2 2];     % linear inequality constraints A*x <= B
B = [0 72]';
LB = [0 0 0]';             % lower bounds
UB = [30 30 30]';          % upper bounds
fitnessfcn = @myfun;
x = ga(fitnessfcn,nvars,A,B,[],[],LB,UB)

function f = myfun(x)
f=-x(1)*x(2)*x(3);
end
• Example: Use the genetic algorithm to minimize the ps_example function (built into MATLAB), with the following constraints:

Rearranging the constraints x1 + x2 >= 1 and x2 - x1 = 5 into the solver form A*x <= b and Aeq*x = beq gives A = [-1 -1], b = -1 and Aeq = [-1 1], beq = 5.

Matlab Code
A = [-1 -1]; b = -1;
Aeq = [-1 1]; beq = 5;
lb = [1 -3]; ub = [6 8]; % Set bounds lb and ub
fun = @ps_example;
x = ga(fun,2,A,b,Aeq,beq)

• Results
x =
   -2.0000    2.9990
particleswarm
• Bound-constrained optimization using Particle Swarm Optimization (PSO)
• Minimizes an objective function subject to bound constraints
• MATLAB built-in function particleswarm
– May be run as an unconstrained problem or restricted to a search range
– Various options may be specified using optimoptions
• Syntax
x = particleswarm(fun,nvars,lb,ub,options)
• Example: Minimize the objective function
fun = @(x)x(1)*exp(-norm(x)^2)
Set bounds on the variables: lb = [-10,-15] and ub = [15,20].
• Matlab Code
%Define the objective function.
fun = @(x)x(1)*exp(-norm(x)^2);
%Call particleswarm to minimize the function.
rng default % For reproducibility
nvars = 2;
x = particleswarm(fun,nvars)

Result
x =
  629.4474  311.4814

%This solution is far from the true minimum, as
%you see in a function plot.
fsurf(@(x,y)x.*exp(-(x.^2+y.^2)))
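The bounds stated in the example are not passed in the call above, which is why the run wanders far from the minimum. As a minimal follow-up sketch (the SwarmSize value is an illustrative assumption, not from the slides), the same problem can be solved within the stated bounds, with options built via optimoptions:

%Restrict the search to the stated bounds and set an illustrative swarm size
lb = [-10,-15];
ub = [15,20];
options = optimoptions('particleswarm','SwarmSize',100);
rng default % For reproducibility
x = particleswarm(fun,nvars,lb,ub,options)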
simulannealbnd
• Finds the minimum of a given function using the
Simulated Annealing Algorithm.
• May define upper and lower bounds
• Useful in minimizing functions that may have many local
minima
• Syntax

[x,fval] = simulannealbnd(fun,x0,lb,ub,options)
• Example: Minimize De Jong's fifth function, a 2-D function with many local minima.
• Starting point: [0, 0]
• UB is 64 and LB is -64

De Jong's fifth function is a two-dimensional function with many (25) local minima. In a plot of the function, it is unclear which of these local minima is the global minimum. Many standard optimization algorithms become stuck in local minima.
No Lower and Upper Bounds
fun = @dejong5fcn;
x0 = [0 0];
x = simulannealbnd(fun,x0)

Output
x =
  -31.9785  -31.9797

With LB and UB
fun = @dejong5fcn;
x0 = [0 0];
lb = [-64 -64];
ub = [64 64];
x = simulannealbnd(fun,x0,lb,ub)

Output
x =
    0.0009  -31.9644

Note: The simulannealbnd algorithm uses the MATLAB® random number stream, so different results may be obtained on each run.
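For a repeatable result, the random number generator can be seeded before calling the solver, as in the particleswarm example above; a minimal sketch reusing the De Jong example:

rng default % Fix the random seed for reproducibility
fun = @dejong5fcn;
x0 = [0 0];
x = simulannealbnd(fun,x0)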
Multi-objective GA
• Finds the Pareto front of multiple fitness functions using the genetic algorithm
• Syntax
x = gamultiobj(fitnessfcn,nvars,A,b,Aeq,beq,lb,ub)

• Example Compute the Pareto front for a simple multi-objective problem.


There are two objectives and two decision variables. From the fitness function below, the objectives are f1(x) = ||x||^2 and f2(x) = 0.5*||x - [2; -1]||^2 + 2.
• Matlab code
fitnessfcn = @(x)[norm(x)^2, 0.5*norm(x(:)-[2;-1])^2+2];
rng default % For reproducibility
x = gamultiobj(fitnessfcn,2,[],[],[],[],[],[])
Results
Optimization terminated: average change in the spread of Pareto solutions less
than options.FunctionTolerance.
x =
   -0.0072    0.0003
    0.0947   -0.0811
    1.0217   -0.6946
    1.1254   -0.0857
   -0.0072    0.0003
    0.4489   -0.2101
    1.8039   -0.8394
    0.5115   -0.6314
    1.5164   -0.7277
    1.7082   -0.7006
    1.8330   -0.9337
    0.7657   -0.6695
    0.7671   -0.4882
    1.2080   -0.5407
    1.0075   -0.5348
    0.6281   -0.1454
    2.0040   -1.0064
    1.5314   -0.9184

• Plot the solution points
plot(x(:,1),x(:,2),'ko')
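The plot above shows the Pareto solutions in decision-variable space. gamultiobj can also return the objective values at each solution as a second output, which lets you plot the front in objective space; a minimal sketch (the axis labels are illustrative):

[x,fval] = gamultiobj(fitnessfcn,2,[],[],[],[],[],[]);
%Plot the two objective values of each Pareto solution
plot(fval(:,1),fval(:,2),'ko')
xlabel('f_1(x)'), ylabel('f_2(x)')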
patternsearch
• Finds a local minimum of a given function using pattern search, subject to
– linear equalities/inequalities,
– lower/upper bounds,
– non-linear inequalities/equalities
• Specific options may be given using optimoptions
• Syntax

x = patternsearch(fun,x0,A,b,Aeq,beq,lb,ub,nonlcon)
• Example 1: Minimize an unconstrained problem using a user-defined function and the patternsearch solver:
y = exp(-x(1)^2-x(2)^2)*(1+5*x(1) + 6*x(2) + 12*x(1)*cos(x(2)));
Function Code
function [y] = obj1(x)
y = exp(-x(1)^2-x(2)^2)*(1+5*x(1) + 6*x(2) + 12*x(1)*cos(x(2)));
end

Main Code
fun = @obj1;
x0 = [0,0];
x = patternsearch(fun,x0)

Output
x =
   -0.7037   -0.1860
• Example 2: Solve the previous problem with a lower bound (LB) and upper bound (UB) specified, starting at [1,-5]:

Main Code
fun = @obj1;
lb = [0,-Inf];
ub = [Inf,-3];
A = [];
b = [];
Aeq = [];
beq = [];
x0 = [1,-5];
x = patternsearch(fun,x0,A,b,Aeq,beq,lb,ub)

Output
x =
    0.1880   -3.0000
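The patternsearch syntax also accepts a nonlinear constraint function nonlcon as the ninth argument. As a sketch (the circular constraint x(1)^2 + x(2)^2 <= 25 is an illustrative assumption, not taken from the slides), the constraint function returns the nonlinear inequality values c and equality values ceq:

Constraint Function
function [c,ceq] = circlecon(x)
c = x(1)^2 + x(2)^2 - 25; % nonlinear inequality, enforced as c(x) <= 0
ceq = []; % no nonlinear equality constraints
end

Main Code
fun = @obj1;
x0 = [1,-2];
x = patternsearch(fun,x0,[],[],[],[],[],[],@circlecon)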
Supplying gradients
It may be desirable to specify the gradient of the objective function analytically.
To do this, the named function must return two outputs: the function value and the gradient.
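The solvers covered above are derivative-free, so this sketch assumes a gradient-based Optimization Toolbox solver (fminunc) and an illustrative quadratic objective; the same two-output pattern applies to other gradient-aware solvers such as fmincon:

Function Code
function [f,g] = myGradObj(x)
f = x(1)^2 + 3*x(2)^2; % function value
g = [2*x(1); 6*x(2)]; % gradient [df/dx1; df/dx2]
end

Main Code
%Tell the solver that the objective also returns its gradient
options = optimoptions('fminunc','SpecifyObjectiveGradient',true);
x0 = [1,2];
x = fminunc(@myGradObj,x0,options)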
