MATLAB Optimization
Toolbox
Presented by
Chin Pei
February 28, 2003
Presentation Outline
 Introduction
 Function Optimization
 Optimization Toolbox
 Routines / Algorithms available
 Minimization Problems
 Unconstrained
 Constrained
 Example
 The Algorithm Description
 Multiobjective Optimization
 Optimal PID Control Example
Function Optimization
 Optimization concerns the minimization
or maximization of functions
 Standard Optimization Problem
Introduction | Unconstrained Minimization | Constrained Minimization | Multiobjective Optimization | Conclusion
min_x f(x)

Subject to:
g_j(x) ≤ 0   (inequality constraints)
h_i(x) = 0   (equality constraints)
x_k^L ≤ x_k ≤ x_k^U   (side constraints)
Function Optimization
f(x) is the objective function, which measures
and evaluates the performance of a system.
In a standard problem, we are minimizing
the function.
Maximization is equivalent to minimization
of the negative of the objective function.
x is a column vector of design variables, which
affect the performance of the system.
Function Optimization
Constraints – Limitations on the design space.
They can be linear or nonlinear, explicit or implicit functions.
g_j(x) ≤ 0   (inequality constraints)
h_i(x) = 0   (equality constraints)
x_k^L ≤ x_k ≤ x_k^U   (side constraints)
Most algorithms require the inequality constraints in "less than or equal to" form!!!
Optimization Toolbox
 Is a collection of functions that extend the capability of
MATLAB. The toolbox includes routines for:
 Unconstrained optimization
 Constrained nonlinear optimization, including goal
attainment problems, minimax problems, and semi-
infinite minimization problems
 Quadratic and linear programming
 Nonlinear least squares and curve fitting
 Nonlinear systems of equations solving
 Constrained linear least squares
 Specialized algorithms for large scale problems
Minimization Algorithm
Minimization Algorithm (Cont.)
Equation Solving Algorithms
Least-Squares Algorithms
Implementing Opt. Toolbox
 Most of these optimization routines require
the definition of an M-file containing the
function, f, to be minimized.
 Maximization is achieved by supplying the
routines with –f.
 Optimization options passed to the routines
change optimization parameters.
 Default optimization parameters can be
changed through an options structure.
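The pattern above (define f in a file, minimize it, negate f to maximize) can be sketched outside MATLAB as well. The following Python snippet uses SciPy as a stand-in — an illustrative assumption, not part of the Optimization Toolbox — and the concave function f is made up for the example:

```python
from scipy.optimize import minimize_scalar

# Maximization via minimizing -f, the convention described above.
# f is a hypothetical concave function with its maximum at x = 4.
def f(x):
    return -(x - 4.0)**2 + 10.0

# Minimize the negated objective to maximize f.
res = minimize_scalar(lambda x: -f(x))
print(res.x)  # the maximizer, near 4.0
```

The same sign trick works with any of the minimization routines discussed below.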
Unconstrained Minimization
 Consider the problem of finding a set of
values x = [x1 x2]^T that solves

min_x f(x) = e^(x1) * (4x1^2 + 2x2^2 + 4x1x2 + 2x2 + 1)
Steps
 Create an M-file that returns the
function value (Objective Function)
 Call it objfun.m
 Then, invoke the unconstrained
minimization routine
 Use fminunc
Step 1 – Obj. Function

function f = objfun(x)
f=exp(x(1))*(4*x(1)^2+2*x(2)^2+4*x(1)*x(2)+2*x(2)+1);

where x = [x1 x2]^T
Step 2 – Invoke Routine
x0 = [-1,1];
options = optimset('LargeScale','off');
[xmin,feval,exitflag,output] = fminunc('objfun',x0,options);

Inputs: x0 is the starting guess and options holds the optimization
parameter settings. Outputs: [xmin,feval,exitflag,output].
Results
xmin =
0.5000 -1.0000
feval =
1.3028e-010
exitflag =
1
output =
iterations: 7
funcCount: 40
stepsize: 1
firstorderopt: 8.1998e-004
algorithm: 'medium-scale: Quasi-Newton line search'
Minimum point of design variables
Objective function value
exitflag tells whether the algorithm converged.
If exitflag > 0, a local minimum was found.
Some other information
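For readers without MATLAB, the same problem can be sketched with SciPy's BFGS quasi-Newton solver, a rough analogue of medium-scale fminunc (scipy.optimize is an assumption here, not part of the toolbox):

```python
import numpy as np
from scipy.optimize import minimize

def objfun(x):
    # Same objective as objfun.m:
    # exp(x1) * (4*x1^2 + 2*x2^2 + 4*x1*x2 + 2*x2 + 1)
    return np.exp(x[0]) * (4*x[0]**2 + 2*x[1]**2
                           + 4*x[0]*x[1] + 2*x[1] + 1)

# Same starting guess as the MATLAB example above.
res = minimize(objfun, x0=[-1.0, 1.0], method="BFGS")
print(res.x)  # converges near [0.5, -1.0], matching xmin above
```

As in the slide output, the minimum sits at approximately (0.5, -1) with an objective value near zero.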
More on fminunc – Input
[xmin,feval,exitflag,output,grad,hessian]=
fminunc(fun,x0,options,P1,P2,…)
 fun: The objective function to be minimized.
 x0: The initial guess. It must be a vector whose length equals the
number of design variables.
 options: Sets some of the optimization parameters. (More in a few
slides)
 P1,P2,…: Additional parameters to pass to the objective function.
Ref. Manual: Pg. 5-5 to 5-9
More on fminunc – Output
[xmin,feval,exitflag,output,grad,hessian]=
fminunc(fun,x0,options,P1,P2,…)
 xmin: Vector of the minimum point (optimal point). Its size is the
number of design variables.
 feval: The objective function value at the optimal point.
 exitflag: A value showing whether the optimization routine
terminated successfully. (converged if >0)
 output: This structure gives more details about the optimization.
 grad: The gradient value at the optimal point.
 hessian: The Hessian value at the optimal point.
Ref. Manual: Pg. 5-5 to 5-9
Options Setting – optimset
Options = optimset('param1',value1,'param2',value2,…)
 The routines in the Optimization Toolbox have a set of default
optimization parameters.
 However, the toolbox allows you to alter some of those parameters,
for example: the tolerances, the step size, the gradient or Hessian
values, the maximum number of iterations, etc.
 There is also a list of features available, for example: displaying the
values at each iteration, comparing the user-supplied gradient or
Hessian, etc.
 You can also choose the algorithm you wish to use.
Ref. Manual: Pg. 5-10 to 5-14
Options Setting (Cont.)
Options = optimset('param1',value1,'param2',value2,…)
 Type help optimset in the command window to display the list of
available option settings.
 How to read? For example:
LargeScale - Use large-scale algorithm if
possible [ {on} | off ]
The default is marked with { }. Here LargeScale is the parameter
(param1) and on/off are its possible values (value1).
Options Setting (Cont.)
LargeScale - Use large-scale algorithm if
possible [ {on} | off ]
 Since the default is on, to turn it off we just type:
Options = optimset('LargeScale','off')
and pass Options to the input of fminunc.
Useful Option Settings
 Display - Level of display [ off | iter |
notify | final ]
 MaxIter - Maximum number of iterations
allowed [ positive integer ]
 TolCon - Termination tolerance on the
constraint violation [ positive scalar ]
 TolFun - Termination tolerance on the
function value [ positive scalar ]
 TolX - Termination tolerance on X [ positive
scalar ]
Ref. Manual: Pg. 5-10 to 5-14
Highly recommended to use!!!
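SciPy's minimize exposes the same knobs per call — a rough analogue of optimset, offered here only as an illustrative comparison (the quadratic objective below is made up):

```python
from scipy.optimize import minimize

# 'maxiter' and 'disp' play roughly the role of MaxIter and Display,
# and tol plays roughly the role of the termination tolerances.
res = minimize(lambda x: (x[0] - 3.0)**2, x0=[0.0], method="BFGS",
               tol=1e-9, options={"maxiter": 200, "disp": False})
print(res.x)  # near [3.0]
```

As with optimset, sensible tolerances and iteration limits are worth setting explicitly rather than relying on defaults.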
fminunc and fminsearch
 fminunc uses algorithms that exploit gradient
and Hessian information.
 Two modes:
 Large-Scale: interior-reflective Newton
 Medium-Scale: quasi-Newton (BFGS)
 Not preferred for highly discontinuous
functions.
 This function may only give local solutions.
fminunc and fminsearch
 fminsearch is generally less efficient than
fminunc for problems of order greater than
two. However, when the problem is highly
discontinuous, fminsearch may be more
robust.
 This is a direct search method that does not
use numerical or analytic gradients as in
fminunc.
 This function may only give local solutions.
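fminsearch uses a Nelder-Mead simplex search; SciPy exposes the same method, which lets us sketch the derivative-free behaviour on a non-smooth function (an illustrative SciPy analogue, not the toolbox itself; the objective is made up):

```python
from scipy.optimize import minimize

def nonsmooth(x):
    # |x1 - 1| + |x2 + 2| is non-differentiable at its minimum,
    # the kind of objective where gradient-based solvers struggle.
    return abs(x[0] - 1.0) + abs(x[1] + 2.0)

# Derivative-free simplex search, analogous to fminsearch.
res = minimize(nonsmooth, x0=[0.0, 0.0], method="Nelder-Mead")
print(res.x)  # near [1.0, -2.0]
```

No gradients are evaluated at any point, which is why this style of search tolerates kinks and discontinuities better than quasi-Newton methods.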
Constrained Minimization
[xmin,feval,exitflag,output,lambda,grad,hessian] =
fmincon(fun,x0,A,B,Aeq,Beq,LB,UB,NONLCON,options,P1,P2,…)

lambda: vector of Lagrange multipliers at the optimal point
Example
min_x f(x) = -x1*x2*x3

Subject to:
2x1^2 + x2 ≤ 0
-x1 - 2x2 - 2x3 ≤ 0
x1 + 2x2 + 2x3 ≤ 72
0 ≤ x1, x2, x3 ≤ 30

In matrix form for fmincon:
A = [-1 -2 -2; 1 2 2],  B = [0; 72]
LB = [0; 0; 0],  UB = [30; 30; 30]
function f = myfun(x)
f=-x(1)*x(2)*x(3);
Example (Cont.)
For the nonlinear constraint 2x1^2 + x2 ≤ 0, create a function nonlcon
which returns the two constraint vectors [C,Ceq]:

function [C,Ceq]=nonlcon(x)
C=2*x(1)^2+x(2);
Ceq=[];

Remember to return an empty matrix if a constraint type does not apply.
Example (Cont.)
x0=[10;10;10];
A=[-1 -2 -2;1 2 2];
B=[0 72]';
LB = [0 0 0]';
UB = [30 30 30]';
[x,feval]=fmincon(@myfun,x0,A,B,[],[],LB,UB,@nonlcon)
Initial guess x0 (3 design variables).

CAREFUL with the argument sequence:
fmincon(fun,x0,A,B,Aeq,Beq,LB,UB,NONLCON,options,P1,P2,…)
Example (Cont.)
Warning: Large-scale (trust region) method does not currently solve this type of problem, switching to
medium-scale (line search).
> In D:\Programs\MATLAB6p1\toolbox\optim\fmincon.m at line 213
In D:\usr\CHINTANGO\OptToolbox\min_con.m at line 6
Optimization terminated successfully:
Magnitude of directional derivative in search direction less than 2*options.TolFun and maximum
constraint violation is less than options.TolCon
Active Constraints:
2
9
x =
0.00050378663220
0.00000000000000
30.00000000000000
feval =
-4.657237250542452e-035
The nine constraints are numbered in the argument sequence
A,B,Aeq,Beq,LB,UB,C,Ceq:
Const. 1-2: the linear inequalities -x1 - 2x2 - 2x3 ≤ 0 and x1 + 2x2 + 2x3 ≤ 72
Const. 3-5: the lower bounds 0 ≤ x1, x2, x3
Const. 6-8: the upper bounds x1, x2, x3 ≤ 30
Const. 9: the nonlinear inequality 2x1^2 + x2 ≤ 0
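The same constrained problem can be sketched with SciPy's SLSQP solver as a loose stand-in for fmincon (scipy.optimize is an assumption, not the toolbox; a feasible starting point is chosen here for illustration, whereas the MATLAB script starts at [10;10;10]):

```python
from scipy.optimize import minimize

def myfun(x):
    return -x[0] * x[1] * x[2]

# SciPy's inequality convention is g(x) >= 0, so each g(x) <= 0
# constraint from the slides is negated.
constraints = [
    {"type": "ineq", "fun": lambda x: -(2*x[0]**2 + x[1])},          # 2x1^2+x2 <= 0
    {"type": "ineq", "fun": lambda x: x[0] + 2*x[1] + 2*x[2]},        # -x1-2x2-2x3 <= 0
    {"type": "ineq", "fun": lambda x: 72 - (x[0] + 2*x[1] + 2*x[2])}, # x1+2x2+2x3 <= 72
]
bounds = [(0, 30)] * 3  # side constraints 0 <= xk <= 30

res = minimize(myfun, x0=[0.0, 0.0, 10.0], method="SLSQP",
               bounds=bounds, constraints=constraints)
print(res.fun)  # near 0, matching feval ~ -4.7e-35 above
```

Note why the optimum is 0: the constraint 2x1^2 + x2 ≤ 0 together with the lower bounds forces x1 = x2 = 0, so the product x1*x2*x3 vanishes at every feasible point.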
Multiobjective Optimization
 Previous examples involved problems
with a single objective function.
 Now let us look at solving a problem with
multiple objective functions using lsqnonlin.
 The example designs an optimal PID
controller for a plant.
Simulink Example
Goal: Optimize the control parameters in Simulink model optsim.mdl
in order to minimize the error between the output and input.
Plant description:
• Third order under-damped with actuator limits.
• Actuation limits are a saturation limit and a slew rate limit.
• Saturation limit cuts off input: +/- 2 units
• Slew rate limit: 0.8 unit/sec
Simulink Example (Cont.)
Initial PID Controller Design
Solving Methodology
 Design variables are the gains in PID
controller (KP, KI and KD) .
 Objective function is the error between
the output and input.
Solving Methodology (Cont.)
 Let pid = [Kp Ki Kd]^T
 Assume also that the step input is unity.
 Then F = yout - 1
 Construct a function tracklsq for the
objective function.
Objective Function
function F = tracklsq(pid,a1,a2)
Kp = pid(1);
Ki = pid(2);
Kd = pid(3);
% Compute function value
opt = simset('solver','ode5','SrcWorkspace','Current');
[tout,xout,yout] = sim('optsim',[0 100],opt);
F = yout-1;
Getting the simulation
data from Simulink
The idea is to perform nonlinear least-squares
minimization of the errors from time 0 to 100
at a time step of 1.
So, there are 101 objective functions to minimize.
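The Simulink plant cannot be reproduced here, but the lsqnonlin idea — minimize a whole vector of residuals, one per time step — can be sketched with SciPy's least_squares. The first-order model and the synthetic "measured" step response below are illustrative assumptions standing in for tracklsq and the Simulink output:

```python
import numpy as np
from scipy.optimize import least_squares

t = np.linspace(0.0, 100.0, 101)      # time 0..100 at step 1
y_meas = 1.0 - np.exp(-0.2 * t)       # synthetic "measured" step response

def resid(p):
    # Toy plant model parameterized by a single gain p[0];
    # the residual vector has 101 entries, one error per time step.
    yout = 1.0 - np.exp(-p[0] * t)
    return yout - y_meas

res = least_squares(resid, x0=[0.05])
print(res.x)  # recovers the gain near 0.2
```

The solver drives all 101 residuals toward zero simultaneously, which is exactly the multiobjective structure the slides describe for the PID gains.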
The lsqnonlin
[X,RESNORM,RESIDUAL,EXITFLAG,OUTPUT,LAMBDA,JACOBIAN]
= LSQNONLIN(FUN,X0,LB,UB,OPTIONS,P1,P2,..)
Invoking the Routine
clear all
Optsim;
pid0 = [0.63 0.0504 1.9688];
a1 = 3; a2 = 43;
options = optimset('LargeScale','off','Display','iter',...
    'TolX',0.001,'TolFun',0.001);
pid = lsqnonlin(@tracklsq,pid0,[],[],options,a1,a2)
Kp = pid(1); Ki = pid(2); Kd = pid(3);
Results
Optimal gains
Results (Cont.)
Initial Design
Optimization Process
Optimal Controller Result
Conclusion
 Easy to use! But we do not know what is happening
behind the routine, so it is still important to
understand the limitations of each routine.
 Basic steps:
 Recognize the class of optimization problem
 Define the design variables
 Create objective function
 Recognize the constraints
 Start an initial guess
 Invoke suitable routine
 Analyze the results (they might not make sense)
Thank You!
Questions & Suggestions?

More Related Content

What's hot (20)

PPTX
Access specifier
zindadili
 
PPTX
Practical Swarm Optimization (PSO)
khashayar Danesh Narooei
 
PPTX
Regularization in deep learning
Kien Le
 
PPTX
Greedy algorithms
sandeep54552
 
PPTX
Activation_function.pptx
Mohamed Essam
 
PDF
Variational Autoencoders VAE - Santiago Pascual - UPC Barcelona 2018
Universitat Politècnica de Catalunya
 
PDF
Regularization
Darren Yow-Bang Wang
 
PPTX
Activation function
RakshithGowdakodihal
 
PDF
Genetic algorithm
Respa Peter
 
PPTX
Viterbi algorithm
Supongkiba Kichu
 
PPT
Python ppt
Rohit Verma
 
PPTX
Optimization tutorial
Northwestern University
 
PPTX
Fuzzy arithmetic
Mohit Chimankar
 
PPTX
Python Programming Essentials - M23 - datetime module
P3 InfoTech Solutions Pvt. Ltd.
 
PPTX
02 data types in java
রাকিন রাকিন
 
PPTX
Postfix Notation | Compiler design
Shamsul Huda
 
PPTX
Fuzzy inference
swati singh
 
PPSX
Modules and packages in python
TMARAGATHAM
 
PDF
Classes and objects
Nilesh Dalvi
 
PDF
Python Functions Tutorial | Working With Functions In Python | Python Trainin...
Edureka!
 
Access specifier
zindadili
 
Practical Swarm Optimization (PSO)
khashayar Danesh Narooei
 
Regularization in deep learning
Kien Le
 
Greedy algorithms
sandeep54552
 
Activation_function.pptx
Mohamed Essam
 
Variational Autoencoders VAE - Santiago Pascual - UPC Barcelona 2018
Universitat Politècnica de Catalunya
 
Regularization
Darren Yow-Bang Wang
 
Activation function
RakshithGowdakodihal
 
Genetic algorithm
Respa Peter
 
Viterbi algorithm
Supongkiba Kichu
 
Python ppt
Rohit Verma
 
Optimization tutorial
Northwestern University
 
Fuzzy arithmetic
Mohit Chimankar
 
Python Programming Essentials - M23 - datetime module
P3 InfoTech Solutions Pvt. Ltd.
 
02 data types in java
রাকিন রাকিন
 
Postfix Notation | Compiler design
Shamsul Huda
 
Fuzzy inference
swati singh
 
Modules and packages in python
TMARAGATHAM
 
Classes and objects
Nilesh Dalvi
 
Python Functions Tutorial | Working With Functions In Python | Python Trainin...
Edureka!
 

Similar to Optimization toolbox presentation (20)

PPT
R/Finance 2009 Chicago
gyollin
 
PDF
2009 : Solving linear optimization problems with MOSEK
jensenbo
 
PPTX
MACHINE LEARNING NEURAL NETWORK PPT UNIT 4
MulliMary
 
PDF
Xgboost
Vivian S. Zhang
 
PDF
Handout2.pdf
Shoukat13
 
PPTX
Techniques in Deep Learning
Sourya Dey
 
PDF
Xgboost
Vivian S. Zhang
 
PPTX
JNTUK python programming python unit 3.pptx
Venkateswara Babu Ravipati
 
PDF
18.1 combining models
Andres Mendez-Vazquez
 
PPTX
KabirDataPreprocessingPyMMMMMMMMMMMMMMMMMMMMthon.pptx
ratnapatil14
 
PDF
Economic Dispatch of Generated Power Using Modified Lambda-Iteration Method
IOSR Journals
 
PPSX
Algorithms, Structure Charts, Corrective and adaptive.ppsx
DaniyalManzoor3
 
PDF
4optmizationtechniques-150308051251-conversion-gate01.pdf
BechanYadav4
 
PPTX
Optmization techniques
Deepshika Reddy
 
PDF
optmizationtechniques.pdf
SantiagoGarridoBulln
 
PDF
Code Optimizatoion
rajuvermadsvv
 
PPTX
Repair dagstuhl jan2017
Abhik Roychoudhury
 
PDF
XGBoost: the algorithm that wins every competition
Jaroslaw Szymczak
 
PDF
Functional Programming in Java 8
Omar Bashir
 
PDF
Optimization
QuantUniversity
 
R/Finance 2009 Chicago
gyollin
 
2009 : Solving linear optimization problems with MOSEK
jensenbo
 
MACHINE LEARNING NEURAL NETWORK PPT UNIT 4
MulliMary
 
Handout2.pdf
Shoukat13
 
Techniques in Deep Learning
Sourya Dey
 
JNTUK python programming python unit 3.pptx
Venkateswara Babu Ravipati
 
18.1 combining models
Andres Mendez-Vazquez
 
KabirDataPreprocessingPyMMMMMMMMMMMMMMMMMMMMthon.pptx
ratnapatil14
 
Economic Dispatch of Generated Power Using Modified Lambda-Iteration Method
IOSR Journals
 
Algorithms, Structure Charts, Corrective and adaptive.ppsx
DaniyalManzoor3
 
4optmizationtechniques-150308051251-conversion-gate01.pdf
BechanYadav4
 
Optmization techniques
Deepshika Reddy
 
optmizationtechniques.pdf
SantiagoGarridoBulln
 
Code Optimizatoion
rajuvermadsvv
 
Repair dagstuhl jan2017
Abhik Roychoudhury
 
XGBoost: the algorithm that wins every competition
Jaroslaw Szymczak
 
Functional Programming in Java 8
Omar Bashir
 
Optimization
QuantUniversity
 
Ad

Recently uploaded (20)

PPTX
Introduction to Internal Combustion Engines - Types, Working and Camparison.pptx
UtkarshPatil98
 
PPTX
Water Resources Engineering (CVE 728)--Slide 4.pptx
mohammedado3
 
PPTX
How Industrial Project Management Differs From Construction.pptx
jamespit799
 
PDF
Tesia Dobrydnia - An Avid Hiker And Backpacker
Tesia Dobrydnia
 
PDF
3rd International Conference on Machine Learning and IoT (MLIoT 2025)
ClaraZara1
 
PDF
20ES1152 Programming for Problem Solving Lab Manual VRSEC.pdf
Ashutosh Satapathy
 
PDF
Digital water marking system project report
Kamal Acharya
 
PDF
William Stallings - Foundations of Modern Networking_ SDN, NFV, QoE, IoT, and...
lavanya896395
 
PDF
Artificial Neural Network-Types,Perceptron,Problems
Sharmila Chidaravalli
 
PPTX
Seminar Description: YOLO v1 (You Only Look Once).pptx
abhijithpramod20002
 
PDF
MODULE-5 notes [BCG402-CG&V] PART-B.pdf
Alvas Institute of Engineering and technology, Moodabidri
 
PPTX
MODULE 03 - CLOUD COMPUTING AND SECURITY.pptx
Alvas Institute of Engineering and technology, Moodabidri
 
PPTX
fatigue in aircraft structures-221113192308-0ad6dc8c.pptx
aviatecofficial
 
PDF
Module - 4 Machine Learning -22ISE62.pdf
Dr. Shivashankar
 
PDF
Bayesian Learning - Naive Bayes Algorithm
Sharmila Chidaravalli
 
PDF
Water Industry Process Automation & Control Monthly July 2025
Water Industry Process Automation & Control
 
PDF
REINFORCEMENT LEARNING IN DECISION MAKING SEMINAR REPORT
anushaashraf20
 
PPTX
darshai cross section and river section analysis
muk7971
 
PPTX
Alan Turing - life and importance for all of us now
Pedro Concejero
 
PPTX
2025 CGI Congres - Surviving agile v05.pptx
Derk-Jan de Grood
 
Introduction to Internal Combustion Engines - Types, Working and Camparison.pptx
UtkarshPatil98
 
Water Resources Engineering (CVE 728)--Slide 4.pptx
mohammedado3
 
How Industrial Project Management Differs From Construction.pptx
jamespit799
 
Tesia Dobrydnia - An Avid Hiker And Backpacker
Tesia Dobrydnia
 
3rd International Conference on Machine Learning and IoT (MLIoT 2025)
ClaraZara1
 
20ES1152 Programming for Problem Solving Lab Manual VRSEC.pdf
Ashutosh Satapathy
 
Digital water marking system project report
Kamal Acharya
 
William Stallings - Foundations of Modern Networking_ SDN, NFV, QoE, IoT, and...
lavanya896395
 
Artificial Neural Network-Types,Perceptron,Problems
Sharmila Chidaravalli
 
Seminar Description: YOLO v1 (You Only Look Once).pptx
abhijithpramod20002
 
MODULE-5 notes [BCG402-CG&V] PART-B.pdf
Alvas Institute of Engineering and technology, Moodabidri
 
MODULE 03 - CLOUD COMPUTING AND SECURITY.pptx
Alvas Institute of Engineering and technology, Moodabidri
 
fatigue in aircraft structures-221113192308-0ad6dc8c.pptx
aviatecofficial
 
Module - 4 Machine Learning -22ISE62.pdf
Dr. Shivashankar
 
Bayesian Learning - Naive Bayes Algorithm
Sharmila Chidaravalli
 
Water Industry Process Automation & Control Monthly July 2025
Water Industry Process Automation & Control
 
REINFORCEMENT LEARNING IN DECISION MAKING SEMINAR REPORT
anushaashraf20
 
darshai cross section and river section analysis
muk7971
 
Alan Turing - life and importance for all of us now
Pedro Concejero
 
2025 CGI Congres - Surviving agile v05.pptx
Derk-Jan de Grood
 
Ad

Optimization toolbox presentation

  • 2. Presentation Outline  Introduction  Function Optimization  Optimization Toolbox  Routines / Algorithms available  Minimization Problems  Unconstrained  Constrained  Example  The Algorithm Description  Multiobjective Optimization  Optimal PID Control Example
  • 3. Function Optimization  Optimization concerns the minimization or maximization of functions  Standard Optimization Problem IntroductionIntroduction Unconstrained MinimizationUnconstrained Minimization Constrained MinimizationConstrained Minimization Multiobjective OptimizationMultiobjective Optimization ConclusionConclusion ( )~ ~ min x f x ( )~ 0jg x ≤ ( )~ 0ih x = L U k k kx x x≤ ≤ Equality ConstraintsSubject to: Inequality Constraints Side Constraints
  • 4. Function Optimization ( )~ f x is the objective function, which measure and evaluate the performance of a system. In a standard problem, we are minimizing the function. For maximization, it is equivalent to minimization of the –ve of the objective function. ~ x is a column vector of design variables, which can affect the performance of the system. IntroductionIntroduction Unconstrained MinimizationUnconstrained Minimization Constrained MinimizationConstrained Minimization Multiobjective OptimizationMultiobjective Optimization ConclusionConclusion
  • 5. Function Optimization Constraints – Limitation to the design space. Can be linear or nonlinear, explicit or implicit functions ( )~ 0jg x ≤ ( )~ 0ih x = L U k k kx x x≤ ≤ Equality Constraints Inequality Constraints Side Constraints IntroductionIntroduction Unconstrained MinimizationUnconstrained Minimization Constrained MinimizationConstrained Minimization Multiobjective OptimizationMultiobjective Optimization ConclusionConclusion Most algorithm require less than!!!
  • 6. Optimization Toolbox  Is a collection of functions that extend the capability of MATLAB. The toolbox includes routines for:  Unconstrained optimization  Constrained nonlinear optimization, including goal attainment problems, minimax problems, and semi- infinite minimization problems  Quadratic and linear programming  Nonlinear least squares and curve fitting  Nonlinear systems of equations solving  Constrained linear least squares  Specialized algorithms for large scale problems IntroductionIntroduction Unconstrained MinimizationUnconstrained Minimization Constrained MinimizationConstrained Minimization Multiobjective OptimizationMultiobjective Optimization ConclusionConclusion
  • 7. Minimization Algorithm IntroductionIntroduction Unconstrained MinimizationUnconstrained Minimization Constrained MinimizationConstrained Minimization Multiobjective OptimizationMultiobjective Optimization ConclusionConclusion
  • 8. Minimization Algorithm (Cont.) IntroductionIntroduction Unconstrained MinimizationUnconstrained Minimization Constrained MinimizationConstrained Minimization Multiobjective OptimizationMultiobjective Optimization ConclusionConclusion
  • 9. Equation Solving Algorithms IntroductionIntroduction Unconstrained MinimizationUnconstrained Minimization Constrained MinimizationConstrained Minimization Multiobjective OptimizationMultiobjective Optimization ConclusionConclusion
  • 10. Least-Squares Algorithms IntroductionIntroduction Unconstrained MinimizationUnconstrained Minimization Constrained MinimizationConstrained Minimization Multiobjective OptimizationMultiobjective Optimization ConclusionConclusion
  • 11. Implementing Opt. Toolbox  Most of these optimization routines require the definition of an M-file containing the function, f, to be minimized.  Maximization is achieved by supplying the routines with –f.  Optimization options passed to the routines change optimization parameters.  Default optimization parameters can be changed through an options structure. IntroductionIntroduction Unconstrained MinimizationUnconstrained Minimization Constrained MinimizationConstrained Minimization Multiobjective OptimizationMultiobjective Optimization ConclusionConclusion
  • 12. Unconstrained Minimization  Consider the problem of finding a set of values [x1 x2]T that solves IntroductionIntroduction Unconstrained Minimization Constrained MinimizationConstrained Minimization Multiobjective OptimizationMultiobjective Optimization ConclusionConclusion ( ) ( )1 ~ 2 2 1 2 1 2 2 ~ min 4 2 4 2 1x x f x e x x x x x= + + + + [ ]1 2 ~ T x x x=
  • 13. Steps  Create an M-file that returns the function value (Objective Function)  Call it objfun.m  Then, invoke the unconstrained minimization routine  Use fminunc IntroductionIntroduction Unconstrained Minimization Constrained MinimizationConstrained Minimization Multiobjective OptimizationMultiobjective Optimization ConclusionConclusion
  • 14. Step 1 – Obj. Function IntroductionIntroduction Unconstrained Minimization Constrained MinimizationConstrained Minimization Multiobjective OptimizationMultiobjective Optimization ConclusionConclusion function f = objfun(x) f=exp(x(1))*(4*x(1)^2+2*x(2)^2+4*x(1)*x(2)+2*x(2)+1); [ ]1 2 ~ T x x x= Objective function
  • 15. Step 2 – Invoke Routine IntroductionIntroduction Unconstrained Minimization Constrained MinimizationConstrained Minimization Multiobjective OptimizationMultiobjective Optimization ConclusionConclusion x0 = [-1,1]; options = optimset(‘LargeScale’,’off’); [xmin,feval,exitflag,output]= fminunc(‘objfun’,x0,options); Output arguments Input arguments Starting with a guess Optimization parameters settings
  • 16. Results IntroductionIntroduction Unconstrained Minimization Constrained MinimizationConstrained Minimization Multiobjective OptimizationMultiobjective Optimization ConclusionConclusion xmin = 0.5000 -1.0000 feval = 1.3028e-010 exitflag = 1 output = iterations: 7 funcCount: 40 stepsize: 1 firstorderopt: 8.1998e-004 algorithm: 'medium-scale: Quasi-Newton line search' Minimum point of design variables Objective function value Exitflag tells if the algorithm is converged. If exitflag > 0, then local minimum is found Some other information
  • 17. More on fminunc – Input [xmin,feval,exitflag,output,grad,hessian]= fminunc(fun,x0,options,P1,P2,…)  fun: Return a function of objective function.  x0: Starts with an initial guess. The guess must be a vector of size of number of design variables.  option: To set some of the optimization parameters. (More after few slides)  P1,P2,…: To pass additional parameters. IntroductionIntroduction Unconstrained Minimization Constrained MinimizationConstrained Minimization Multiobjective OptimizationMultiobjective Optimization ConclusionConclusion Ref. Manual: Pg. 5-5 to 5-9
  • 18. More on fminunc – Output IntroductionIntroduction Unconstrained Minimization Constrained MinimizationConstrained Minimization Multiobjective OptimizationMultiobjective Optimization ConclusionConclusion [xmin,feval,exitflag,output,grad,hessian]= fminunc(fun,x0,options,P1,P2,…)  xmin: Vector of the minimum point (optimal point). The size is the number of design variables.  feval: The objective function value of at the optimal point.  exitflag: A value shows whether the optimization routine is terminated successfully. (converged if >0)  output: This structure gives more details about the optimization  grad: The gradient value at the optimal point.  hessian: The hessian value of at the optimal point Ref. Manual: Pg. 5-5 to 5-9
  • 19. Options Setting – optimset
    options = optimset('param1',value1,'param2',value2,...)
     The routines in the Optimization Toolbox have a set of default optimization parameters.
     However, the toolbox allows you to alter some of them, for example: the tolerances, the step size, user-supplied gradient or Hessian values, the maximum number of iterations, etc.
     A number of features are also available, for example: displaying the values at each iteration, checking user-supplied gradients or Hessians, etc.
     You can also choose the algorithm you wish to use.
    Ref. Manual: Pg. 5-10 to 5-14
  • 20. Options Setting (Cont.)
    options = optimset('param1',value1,'param2',value2,...)
     Type help optimset in the command window to display the list of available options.
     How to read the list? For example:
       LargeScale - Use large-scale algorithm if possible [ {on} | off ]
     LargeScale is the parameter (param1); on and off are its possible values (value1); the default is the one shown in { }.
  • 21. Options Setting (Cont.)
    LargeScale - Use large-scale algorithm if possible [ {on} | off ]
     Since the default is on, to turn it off we just type:
       options = optimset('LargeScale','off')
     and pass options as an input to fminunc.
  • 22. Useful Option Settings
     Display - Level of display [ off | iter | notify | final ]
     MaxIter - Maximum number of iterations allowed [ positive integer ]
     TolCon - Termination tolerance on the constraint violation [ positive scalar ]
     TolFun - Termination tolerance on the function value [ positive scalar ]
     TolX - Termination tolerance on X [ positive scalar ]
    Highly recommended to use!
    Ref. Manual: Pg. 5-10 to 5-14
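The settings above can be combined in a single optimset call. A short sketch with illustrative values (tune the tolerances and iteration cap to your own problem):

```matlab
% Illustrative values only; not defaults from the toolbox.
options = optimset('Display','iter', ...  % print progress at each iteration
                   'MaxIter',200, ...     % cap the number of iterations
                   'TolFun',1e-6, ...     % stop when f changes less than this
                   'TolX',1e-6);          % stop when the step is smaller than this
% Then pass the structure to a routine, e.g. fminunc(@objfun, x0, options)
```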
  • 23. fminunc and fminsearch
     fminunc uses gradient (and, optionally, Hessian) information.
     Two modes:
       Large-Scale: interior-reflective Newton
       Medium-Scale: quasi-Newton (BFGS)
     Not suited to highly discontinuous functions.
     This function may only find local solutions.
  • 24. fminunc and fminsearch
     fminsearch is generally less efficient than fminunc for problems of order greater than two. However, when the problem is highly discontinuous, fminsearch may be more robust.
     It is a direct search method that does not use numerical or analytic gradients, as fminunc does.
     This function may only find local solutions.
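As a sketch, fminsearch is invoked much like fminunc but needs no gradient information; Rosenbrock's function is used here purely as an illustration, not as one of the slide examples:

```matlab
% Nelder-Mead direct search on Rosenbrock's "banana" function.
banana = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
x0 = [-1.2, 1];
[xmin, feval] = fminsearch(banana, x0);
% xmin approaches [1 1], where the function value is 0.
```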
  • 25. Constrained Minimization
    [xmin,feval,exitflag,output,lambda,grad,hessian] = fmincon(fun,x0,A,B,Aeq,Beq,LB,UB,NONLCON,options,P1,P2,...)
     lambda: vector of Lagrange multipliers at the optimal point.
  • 26. Example
    min f(x) = -x1*x2*x3
    Subject to:
      2*x1^2 + x2 <= 0
      -x1 - 2*x2 - 2*x3 <= 0
      x1 + 2*x2 + 2*x3 <= 72
      0 <= x1, x2, x3 <= 30
    The linear constraints and bounds in matrix form:
      A = [-1 -2 -2; 1 2 2],  B = [0; 72]
      LB = [0; 0; 0],  UB = [30; 30; 30]
    The objective function:
      function f = myfun(x)
      f = -x(1)*x(2)*x(3);
  • 27. Example (Cont.)
    For the nonlinear constraint 2*x1^2 + x2 <= 0, create a function called nonlcon which returns the two constraint vectors [C,Ceq]:
      function [C,Ceq] = nonlcon(x)
      C = 2*x(1)^2 + x(2);
      Ceq = [];
    Remember to return an empty matrix if a constraint type does not apply.
  • 28. Example (Cont.)
    x0 = [10;10;10];              % initial guess (3 design variables)
    A = [-1 -2 -2; 1 2 2];
    B = [0 72]';
    LB = [0 0 0]';
    UB = [30 30 30]';
    [x,feval] = fmincon(@myfun,x0,A,B,[],[],LB,UB,@nonlcon)
    CAREFUL with the argument order:
    fmincon(fun,x0,A,B,Aeq,Beq,LB,UB,NONLCON,options,P1,P2,...)
  • 29. Example (Cont.)
    Warning: Large-scale (trust region) method does not currently solve this type of problem, switching to medium-scale (line search).
    Optimization terminated successfully:
      Magnitude of directional derivative in search direction less than 2*options.TolFun and maximum constraint violation is less than options.TolCon
    Active Constraints: 2 9
    x =
      0.00050378663220
      0.00000000000000
      30.00000000000000
    feval = -4.657237250542452e-035
    The constraints are numbered in the sequence A, B, Aeq, Beq, LB, UB, C, Ceq, giving nine constraints in this problem (two linear inequalities, three lower bounds, three upper bounds, and the nonlinear constraint 2*x1^2 + x2 <= 0); constraints 2 and 9 are active at the solution.
  • 30. Multiobjective Optimization
     Previous examples involved problems with a single objective function.
     Now let us look at solving a problem with multiple objective functions using lsqnonlin.
     The example designs an optimal PID controller for a plant.
  • 31. Simulink Example
    Goal: Optimize the control parameters in the Simulink model optsim.mdl in order to minimize the error between the output and the input.
    Plant description:
     Third order, under-damped, with actuator limits.
     The actuation limits are a saturation limit and a slew rate limit.
     Saturation limit cuts off the input at +/- 2 units.
     Slew rate limit: 0.8 unit/sec.
  • 32. Simulink Example (Cont.)
    Initial PID Controller Design
  • 33. Solving Methodology
     The design variables are the gains in the PID controller (KP, KI and KD).
     The objective function is the error between the output and the input.
  • 34. Solving Methodology (Cont.)
     Let pid = [Kp Ki Kd]^T.
     Assume also that the step input is unity.
     Then F = yout - 1.
     Construct a function tracklsq as the objective function.
  • 35. Objective Function
    function F = tracklsq(pid,a1,a2)
    Kp = pid(1);
    Ki = pid(2);
    Kd = pid(3);
    % Compute function value
    opt = simset('solver','ode5','SrcWorkspace','Current');
    [tout,xout,yout] = sim('optsim',[0 100],opt);  % get the simulation data from Simulink
    F = yout - 1;
    The idea is to perform a nonlinear least-squares minimization of the errors from time 0 to 100 at a time step of 1, so there are 101 objective functions (residuals) to minimize.
  • 36. The lsqnonlin
    [X,RESNORM,RESIDUAL,EXITFLAG,OUTPUT,LAMBDA,JACOBIAN] = LSQNONLIN(FUN,X0,LB,UB,OPTIONS,P1,P2,..)
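Before the Simulink example, here is a minimal standalone sketch of lsqnonlin on a hypothetical curve-fitting problem; the data and model are invented for illustration and require no Simulink model:

```matlab
% Fit y = a*exp(-b*t) to synthetic data generated with a=2, b=0.5.
t = (0:9)';
y = 2*exp(-0.5*t);                       % "measured" data (noise-free here)
residuals = @(p) p(1)*exp(-p(2)*t) - y;  % one residual per sample
p0 = [1, 1];                             % initial guess for [a, b]
p = lsqnonlin(residuals, p0);
% p converges to approximately [2, 0.5].
```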
  • 37. Invoking the Routine
    clear all
    optsim;                       % open the Simulink model
    pid0 = [0.63 0.0504 1.9688];  % initial controller gains
    a1 = 3; a2 = 43;
    options = optimset('LargeScale','off','Display','iter',...
                       'TolX',0.001,'TolFun',0.001);
    pid = lsqnonlin(@tracklsq,pid0,[],[],options,a1,a2)
    Kp = pid(1);
    Ki = pid(2);
    Kd = pid(3);
  • 38. Results
    Optimal gains
  • 39. Results (Cont.)
    (Figures: initial design, optimization process, and optimal controller result)
  • 40. Conclusion
     Easy to use! But we do not know what is happening behind the routine, so it is still important to understand the limitations of each routine.
     Basic steps:
      Recognize the class of optimization problem
      Define the design variables
      Create the objective function
      Recognize the constraints
      Choose an initial guess
      Invoke a suitable routine
      Analyze the results (they might not make sense)
  • 41. Thank You! Questions & Suggestions?