
NETAJI SUBHAS UNIVERSITY OF TECHNOLOGY, DWARKA

LAB File
Optimization Techniques in AI & ML
EAEPE21
ECAM-2

Submitted to: Prof. Aarti Jain
Submitted by: GARVIT (2020UEA6591)
INDEX

S.No.  Title
1      Write a MATLAB program to initialize parameters for the simulated annealing algorithm
2      Write a MATLAB program to compute the optimal output using the simulated annealing algorithm
3      Using the Optimization Toolbox, apply a genetic algorithm to Rosenbrock's 2D function
4      Write a MATLAB program to initialize parameters for the firefly algorithm
5      Write a MATLAB program to compute the optimal output using the firefly algorithm
6      Write a MATLAB program to initialize parameters for the particle swarm optimization algorithm
7      Write a MATLAB program to compute the optimal output using the particle swarm optimization algorithm
EXPERIMENT 1

Aim: Write a MATLAB program to initialize parameters for the simulated annealing algorithm.

Theory: Simulated annealing is an optimization algorithm inspired by the physical process of annealing. It starts from randomly chosen input parameters and evaluates their score, then repeatedly perturbs them and evaluates the new score. The algorithm accepts the new parameters if the new score is better, or with a certain probability if it is worse; that probability depends on a temperature parameter that gradually decreases over time. Simulated annealing can solve a wide range of optimization problems, but its performance depends on parameters such as the cooling schedule and the initial temperature, so experimentation and tuning of these parameters are important for achieving good results. It is particularly useful for complex optimization problems where traditional methods work poorly: as a metaheuristic, it finds a globally optimal solution with higher probability than plain local search algorithms. However, the computational cost can be high, especially for problems with many input parameters.
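
The heart of the method is the Metropolis acceptance rule: a move that worsens the objective by DeltaE = E_new - E_old is still accepted with probability exp(-DeltaE/(k*T)), so at high temperature T almost any move is accepted, while as T approaches zero only improvements survive. This is exactly the test exp(-DeltaE/(k*T)) > rand used in the code of Experiment 2.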

Code:
% Lower and upper bounds of the search space
Lb = [-2 -2];
Ub = [2 2];
nd = length(Lb);                        % problem dimension
% Initializing parameters and settings
T_init = 1.0;                           % initial temperature
T_min = 1e-10;                          % final stopping temperature
F_min = -1e+100;                        % stop if the objective falls below this
max_rej = 250;                          % maximum number of rejections
max_run = 150;                          % maximum number of trials per temperature
max_accept = 15;                        % maximum number of acceptances per temperature
k = 1;                                  % Boltzmann constant
alpha = 0.95;                           % cooling factor
Enorm = 1e-2;                           % improvement threshold (energy norm)
guess = Lb + (Ub - Lb).*rand(size(Lb)); % random initial guess within the bounds
% Initializing the counters i, j, etc.
i = 0; j = 0; accept = 0; totaleval = 0;
% Initializing various values
T = T_init;
E_init = f_obj(guess);
E_old = E_init; E_new = E_old;
best = guess;

% Local function (must appear at the end of the script file)
function z = f_obj(u)
% Rosenbrock's function with f* = 0 at (1, 1)
z = (u(1) - 1)^2 + 100*(u(2) - u(1)^2)^2;
end
Output:
EXPERIMENT 2
Aim: Write a MATLAB program to compute the optimal output using the simulated annealing algorithm.

Theory: Simulated annealing is an optimization algorithm inspired by the physical process of annealing. It starts from randomly chosen input parameters and evaluates their score, then repeatedly perturbs them and evaluates the new score. The algorithm accepts the new parameters if the new score is better, or with a certain probability if it is worse; that probability depends on a temperature parameter that gradually decreases over time. Simulated annealing can solve a wide range of optimization problems, but its performance depends on parameters such as the cooling schedule and the initial temperature, so experimentation and tuning of these parameters are important for achieving good results. It is particularly useful for complex optimization problems where traditional methods work poorly: as a metaheuristic, it finds a globally optimal solution with higher probability than plain local search algorithms. However, the computational cost can be high, especially for problems with many input parameters.

Code:
% Start simulated annealing (continues from the initialization in Experiment 1)
while ((T > T_min) && (j <= max_rej) && (E_new > F_min))
    i = i + 1;
    % Cool down after max_run trials or max_accept acceptances at this temperature
    if (i >= max_run) || (accept >= max_accept)
        T = alpha*T;
        totaleval = totaleval + i;
        i = 1; accept = 1;
    end

    % Evaluate the objective at a new trial location near the current best
    s = 0.01*(Ub - Lb);
    ns = best + s.*randn(1, nd);
    E_new = f_obj(ns);
    % Decide whether to accept the new solution
    DeltaE = E_new - E_old;
    if (-DeltaE > Enorm)
        % Accept: the new solution is a clear improvement
        best = ns; E_old = E_new;
        accept = accept + 1; j = 0;
    elseif (exp(-DeltaE/(k*T)) > rand)
        % Accept a worse (or marginal) move with probability exp(-DeltaE/(k*T))
        best = ns; E_old = E_new;
        accept = accept + 1;
    else
        % Reject the move and count the rejection
        j = j + 1;
    end

    % Update the estimated optimal objective value
    f_opt = E_old;
end
% Displaying results
disp(strcat('Evaluations : ', num2str(totaleval)))
disp(strcat('Best solution : [', num2str(best), ']'))
disp(strcat('Best objective : ', num2str(f_opt)))
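
As a quick sanity check (not part of the original listing), one can record the running best objective each iteration and plot the convergence curve. The vector history below is an assumed addition, filled inside the while-loop with history(end+1) = E_old;

% Optional convergence plot (sketch; assumes history(end+1) = E_old was
% appended at the end of each while-loop iteration above)
semilogy(history, 'LineWidth', 1.5);
xlabel('Iteration');
ylabel('Best objective value (log scale)');
title('Simulated annealing convergence');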

Output:
EXPERIMENT 3
Aim: Using the Optimization Toolbox, apply a genetic algorithm to Rosenbrock's 2D function.

Theory: Genetic algorithms are optimization algorithms inspired by evolution and natural selection. A population of candidate solutions is generated and evolved over generations using genetic operators, such as crossover and mutation, to find a solution with the highest fitness score. Genetic algorithms are well suited to problems with large search spaces and complex or noisy fitness functions, though performance depends on factors such as population size, the selection mechanism, and the genetic operators used, so experimentation and parameter tuning are important for achieving good results. Genetic algorithms have been used in various fields, including engineering, finance, and machine learning. They are particularly useful for problems with no obvious solution or where traditional methods are inefficient or impractical, but the computational cost can be high, especially for problems with many parameters.

Code:
% Initialize values
x0 = [5; 0];   % initial point used to seed the population
a = 100;       % Rosenbrock coefficient (the factor 100 in the objective)

% Objective function: Rosenbrock's 2D function
function f = objectiveFcn(x, a)
f = a*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
end

% Nonlinear constraint function: 3 <= x1^2 + x2^2 <= 5
function [c, ceq] = constraintFcn(x)
c(1) = x(1)^2 + x(2)^2 - 5;
c(2) = 3 - x(1)^2 - x(2)^2;
ceq = []; % no equality constraints
end

Optimization Toolbox Setup:

Steps to solve the optimization via the toolbox are:
● First, define the initial population and initial parameters of the algorithm, along with the objective function and the constraints.
● Next, choose the solver-based approach in the dialog shown below.
● Then a toolbox window pops up containing all the tools for setting up the optimization process, as shown in the figure below. Configure it to suit the problem; here the Genetic Algorithm solver with multi-objective optimization was used, since two variables are being optimized under different constraints. After setting up the toolbox, run it and wait for the results; an equivalent command-line call is sketched below.
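
For reproducibility, the same setup can also be run programmatically. The following is a minimal sketch, assuming the ga solver from MATLAB's Global Optimization Toolbox and the objectiveFcn/constraintFcn defined above; the single-objective ga is used here because the Rosenbrock objective returns a scalar:

% Minimal command-line equivalent (sketch; assumes the Global Optimization Toolbox)
a = 100;
nvars = 2;                              % two decision variables
fun = @(x) objectiveFcn(x, a);          % scalar Rosenbrock objective
opts = optimoptions('ga', 'Display', 'iter');
[x_best, f_best] = ga(fun, nvars, [], [], [], [], [], [], @constraintFcn, opts);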
Output:
EXPERIMENT 4
Aim: Write a MATLAB program to initialize parameters for the firefly algorithm.

Theory: The firefly algorithm is a metaheuristic optimization algorithm inspired by the flashing behavior of fireflies. The algorithm simulates the attraction of fireflies towards brighter ones, and their movement is controlled by a set of parameters that determine the step size and attractiveness. The firefly algorithm has been applied to many optimization problems and can be effective when traditional methods are not feasible or efficient, but performance depends on parameter choice and problem nature, so experimentation and tuning are important for achieving good results. The firefly algorithm is easy to implement and has few parameters to tune compared to other metaheuristic algorithms. It is also a population-based algorithm, which makes it suitable for problems with multiple optima or non-convex search spaces.
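
In the standard formulation, a firefly's attractiveness decays with distance r as beta(r) = beta0 * exp(-gamma * r^2), and firefly i moves towards a brighter firefly j according to x_i = x_i + beta*(x_j - x_i) + alpha*epsilon, where epsilon is a random perturbation. These correspond to the parameters beta0, gamma, and alpha initialized in the code below, and to the movement rule implemented in Experiment 5.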

Code:
% func       - function to be optimized
% lb         - lower bound of the solution space
% ub         - upper bound of the solution space
% dim        - dimension of the solution space
% nFireflies - number of fireflies in the population
% maxGen     - maximum number of generations

% Define the optimization function
func = @(x) (x(1) - 5)^2 + (x(2) - 3)^2;
% Set the lower and upper bounds of the solution space
lb = -5;
ub = 5;
% Set the number of fireflies and dimensions
nFireflies = 100;
dim = 2;
% Set the maximum number of generations
maxGen = 50;
% Initialize the fireflies uniformly at random within the bounds
fireflies = lb + (ub - lb)*rand(nFireflies, dim);
% Evaluate the fitness of the fireflies
fit = zeros(nFireflies, 1);
for i = 1 : nFireflies
    fit(i) = func(fireflies(i,:));
end
% Find the best firefly
[best_fit, bestIdx] = min(fit);
best_solution = fireflies(bestIdx, :);
% Set the algorithm parameters
beta0 = 1;    % attractiveness at distance zero
gamma = 1;    % light absorption coefficient
alpha = 0.2;  % randomization step size

Output:
EXPERIMENT 5
Aim: Write a MATLAB program to compute the optimal output using the firefly algorithm.

Theory: The firefly algorithm is a metaheuristic optimization algorithm inspired by the flashing behavior of fireflies. The algorithm simulates the attraction of fireflies towards brighter ones, and their movement is controlled by a set of parameters that determine the step size and attractiveness. The firefly algorithm has been applied to many optimization problems and can be effective when traditional methods are not feasible or efficient, but performance depends on parameter choice and problem nature, so experimentation and tuning are important for achieving good results. The firefly algorithm has the advantage of being easy to implement and having few parameters to tune compared to other metaheuristic algorithms. Additionally, it is a population-based algorithm, which makes it suitable for problems with multiple optima or non-convex search spaces.

Code:
% Firefly main loop (continues from the initialization in Experiment 4)
% Iterate through generations
for g = 1 : maxGen
    for i = 1 : nFireflies
        for j = i+1 : nFireflies
            % Calculate the distance between fireflies i and j
            dist = norm(fireflies(i,:) - fireflies(j,:));

            % If firefly j is brighter than firefly i, move firefly i towards j
            if fit(j) < fit(i)
                beta = beta0 * exp(-gamma * dist.^2);
                fireflies(i,:) = fireflies(i,:) ...
                    + beta * (fireflies(j,:) - fireflies(i,:)) ...
                    + alpha * randn(1, dim);

                % Update the fitness of firefly i
                fit(i) = func(fireflies(i,:));

                % Update the best firefly found so far
                if fit(i) < best_fit
                    best_fit = fit(i);
                    best_solution = fireflies(i,:);
                end
            end
        end
    end
end
fprintf('Best solution: [%f, %f]\n', best_solution(1), best_solution(2));
fprintf('Best fitness: %f\n', best_fit);
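
Since the objective (x1 - 5)^2 + (x2 - 3)^2 attains its minimum of 0 at (5, 3), which lies within the bounds [-5, 5], the reported best_solution should approach [5, 3] and best_fit should approach zero.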
Output:
EXPERIMENT 6
Aim: Write a MATLAB program to initialize parameters for the particle swarm optimization algorithm.

Theory: Particle swarm optimization (PSO) is a metaheuristic optimization algorithm inspired by the collective behavior of birds flocking or fish schooling. In PSO, particles adjust their position and velocity toward the optimal solution based on their own experience and the experience of their neighboring particles. The algorithm has been successfully applied to many optimization problems, but performance depends on parameter choice and problem nature, so experimentation and tuning are needed. PSO has the advantage of being easy to implement and efficient at finding the global optimum in high-dimensional search spaces. Additionally, PSO is a population-based algorithm that can handle problems with multiple optima or non-convex search spaces, making it a popular choice for optimization problems in various domains.
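
In the standard formulation, each particle i updates its velocity and position as v_i = w*v_i + c1*r1.*(pbest_i - x_i) + c2*r2.*(gbest - x_i) followed by x_i = x_i + v_i, where w is the inertia weight, c1 and c2 are the cognitive and social weights, and r1, r2 are uniform random vectors. This is exactly the update implemented in Experiment 7, using the commonly used values w = 0.729 and c1 = c2 = 1.49445 initialized below.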

Code:
% fitness_func     - the fitness function to be optimized
% n_vars           - the number of variables in the problem
% lb               - lower bound for each variable
% ub               - upper bound for each variable
% max_iters        - maximum number of iterations
% swarm_size       - number of particles in the swarm
% inertia_weight   - inertia weight parameter
% cognitive_weight - cognitive weight parameter
% social_weight    - social weight parameter
fitness_func = @(x) sum(x.^2);  % sphere function, minimum 0 at the origin
n_vars = 2;
lb = -10;
ub = 10;
max_iters = 100;
swarm_size = 50;
inertia_weight = 0.729;
cognitive_weight = 1.49445;
social_weight = 1.49445;
% Initialize the swarm particles uniformly at random within the bounds
swarm = lb + (ub - lb)*rand(swarm_size, n_vars);
velocities = zeros(swarm_size, n_vars);
fitnesses = zeros(swarm_size, 1);
pbests = swarm;                        % personal best positions
pbest_fitnesses = inf(swarm_size, 1);  % personal best fitness values
gbest = zeros(1, n_vars);              % global best position
gbest_fitness = inf;                   % global best fitness value
Output:
EXPERIMENT 7
Aim: Write a MATLAB program to compute the optimal output using the particle swarm optimization algorithm.

Theory: Particle swarm optimization (PSO) is a metaheuristic optimization algorithm inspired by the collective behavior of birds flocking or fish schooling. In PSO, particles adjust their position and velocity toward the optimal solution based on their own experience and the experience of their neighboring particles. The algorithm has been successfully applied to many optimization problems, but performance depends on parameter choice and problem nature, so experimentation and tuning are needed. PSO has the advantage of being easy to implement and efficient at finding the global optimum in high-dimensional search spaces. Additionally, PSO is a population-based algorithm that can handle problems with multiple optima or non-convex search spaces, making it a popular choice for optimization problems in various domains.

Code:
% PSO main loop (continues from the initialization in Experiment 6)
for i = 1:max_iters
    % Evaluate the fitness of each particle
    for j = 1:swarm_size
        fitnesses(j) = fitness_func(swarm(j,:));
        % If the fitness value is better than the previous pbest,
        % set the current position as the new pbest
        if fitnesses(j) < pbest_fitnesses(j)
            pbests(j,:) = swarm(j,:);
            pbest_fitnesses(j) = fitnesses(j);
        end
        % Update the global best
        if fitnesses(j) < gbest_fitness
            gbest = swarm(j,:);
            gbest_fitness = fitnesses(j);
        end
    end
    % Update velocities and positions for each particle
    for j = 1:swarm_size
        r1 = rand(1, n_vars);
        r2 = rand(1, n_vars);
        velocities(j,:) = inertia_weight * velocities(j,:) ...
            + cognitive_weight * r1 .* (pbests(j,:) - swarm(j,:)) ...
            + social_weight * r2 .* (gbest - swarm(j,:));
        swarm(j,:) = swarm(j,:) + velocities(j,:);
    end
end
% Display the best solution and best fitness found
fprintf('Best solution and best fitness:\n');
best_fitness = gbest_fitness;
best_solution = gbest;
disp(best_fitness);
disp(best_solution);
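
Because the sphere function sum(x.^2) has its global minimum of 0 at the origin, the swarm should converge so that best_solution approaches [0, 0] and best_fitness approaches zero.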
Output:
