Analytical Multidimensional (Multivariable) Unconstrained Optimization

Learning Objectives

After studying this chapter, you should be able to:
1. Describe the method of solving problems of multivariable unconstrained optimization.
2. Classify the multivariable optimization problems.
3. Explain the necessary and sufficient conditions for solving multivariable unconstrained optimization problems.
4. Solve unconstrained multivariable functions.

In Chapter 5, we formulated the single variable optimization problems without constraints. Now let us extend those concepts to solve multivariable optimization problems without constraints. The search in such problems proceeds in more than one direction. For instance, if there are two variables in the objective function, we call such problems two-dimensional, as we have to search in two directions. Similarly, a three-variable problem can be called three-dimensional, and so on. However, the method of solving is similar in all such cases, and therefore all these problems can be named multidimensional or multivariable unconstrained optimization problems.

Recollect that we made use of complete differentiation to solve single variable unconstrained optimization problems. Here we use the concept of partial differentiation in solving multivariable unconstrained optimization problems by analytical methods.

6.1 Classification of Multivariable Optimization Problems

Multivariable optimization problems can be classified broadly into two categories: multivariable unconstrained optimization, i.e. without constraints, and multivariable constrained optimization, i.e. with constraints. Further, since we categorize the constraints into two types, equality and inequality, we can further classify the multivariable constrained optimization problems into two types. Summarily, the classification of multivariable optimization problems is shown in Figure 6.1.
[Figure 6.1: Classification of multivariable optimization problems into multivariable unconstrained optimization (without constraints) and multivariable constrained optimization (with constraints), the latter subdivided into equality constraints and inequality constraints.]

In this chapter, we discuss the analytical solution methods for multivariable unconstrained optimization.

6.2 Optimization Techniques to Solve Unconstrained Multivariable Functions

Let us now discuss the necessary and sufficient conditions for optimization of an unconstrained (without constraints) multivariable function, for which we make use of Taylor's series expansion of a multivariable function. Taylor's theorem/series expansion is given in Exhibit 6.1.

Exhibit 6.1  Taylor's Theorem/Series Expansion

Taylor's formula with remainder, for a function f(x) with continuous derivatives up to the (n + 1)th order, is

    f(x) = f(a) + (x - a) f'(a) + ((x - a)²/2!) f''(a) + ... + ((x - a)ⁿ/n!) f⁽ⁿ⁾(a) + Rₙ(x)    (i)

with the remainder

    Rₙ(x) = ((x - a)ⁿ⁺¹/(n + 1)!) f⁽ⁿ⁺¹⁾(ξ),  where ξ lies in the interval (a, x)    (ii)

In the Taylor formula with remainder (Eqs. (i) and (ii)), if the remainder Rₙ(x) → 0 as n → ∞, then we obtain

    f(x) = f(a) + (x - a) f'(a) + ((x - a)²/2!) f''(a) + ...

which is called the Taylor series. When a = 0, we obtain the Maclaurin series. Hence, to establish that lim (n→∞) |Rₙ(x)| = 0, it is sufficient to show that lim (n→∞) (x - a)ⁿ⁺¹/(n + 1)! = 0 for any fixed numbers x and a.

6.3 Necessary Condition (Theorem 6.1)

If f(x) has an extreme point (maximum or minimum) at x = x* and if the first partial derivatives of f(x) exist at x*, then

    ∂f(x*)/∂x₁ = ∂f(x*)/∂x₂ = ... = ∂f(x*)/∂xₙ = 0

6.4 Sufficient Condition (Theorem 6.2)

A sufficient condition for a stationary point x* to be an extreme point is that the matrix of second partial derivatives (Hessian matrix) of f(x) evaluated at x* is:
(i) Positive definite when x* is a relative minimum point.
(ii) Negative definite when x* is a relative maximum point.

6.5 Working Rule (Algorithm) for Unconstrained Multivariable Optimization

From Theorems 6.1 and 6.2, we can obtain the following procedure to solve the problems of multivariable functions without constraints.

Step 1: Check the function to ensure that it belongs to the category of multivariable unconstrained optimization problems, i.e. it should contain more than one variable, say x₁, x₂, ..., xₙ, and no constraints.

Step 2: Find the first partial derivatives of the function w.r.t. x₁, x₂, ..., xₙ.

Step 3: Equate all the first partial derivatives (obtained from Step 2) to zero to find the values x₁*, x₂*, ..., xₙ*.

Step 4: Find all the second partial derivatives of the function.

Step 5: Prepare the Hessian matrix. For example, if we have two variables x₁, x₂ in f(x), then the Hessian matrix is

    J = [ ∂²f/∂x₁²     ∂²f/∂x₁∂x₂ ]
        [ ∂²f/∂x₂∂x₁   ∂²f/∂x₂²   ]

Step 6: Find the values of the determinants of the leading square sub-matrices J₁, J₂, .... In our example,

    J₁ = | ∂²f/∂x₁² |   and   J₂ = | ∂²f/∂x₁²     ∂²f/∂x₁∂x₂ |
                                   | ∂²f/∂x₂∂x₁   ∂²f/∂x₂²   |

Step 7: Evaluate whether x₁*, x₂* are relative maxima or minima depending on the positive or negative nature of J₁, J₂, and so on. The sign of J₁ decides whether J is positive or negative, while the sign of J₂ decides the definiteness or indefiniteness of J (Table 6.1). More clearly:

(i) If J₁ > 0 and J₂ > 0, then J is positive definite and x* is a relative minimum.
(ii) If J₁ > 0 and J₂ < 0, then J is indefinite and x* is a saddle point.
(iii) If J₁ < 0 and J₂ > 0, then J is negative definite and x* is a relative maximum.
(iv) If J₁ < 0 and J₂ < 0, then J is indefinite and x* is a saddle point.

Table 6.1 presents the sign notation of J.
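The seven steps above can be sketched in code. The following minimal Python sketch (not from the text) uses exact rational arithmetic and, for concreteness, the cubic objective f(x₁, x₂) = x₁³ + x₂³ + 2x₁² + 4x₂² + 6 that is analysed in Illustration 6.2; the derivative expressions are worked out by hand as in Steps 2 to 6.

```python
from fractions import Fraction as F

# Working rule (Steps 1-7) applied to the two-variable cubic of Illustration 6.2:
# f(x1, x2) = x1^3 + x2^3 + 2*x1^2 + 4*x2^2 + 6
def f(x1, x2):
    return x1**3 + x2**3 + 2*x1**2 + 4*x2**2 + 6

# Steps 2-3: df/dx1 = x1*(3*x1 + 4) = 0  ->  x1 = 0 or -4/3
#            df/dx2 = x2*(3*x2 + 8) = 0  ->  x2 = 0 or -8/3
roots1 = [F(0), F(-4, 3)]
roots2 = [F(0), F(-8, 3)]

# Steps 4-6: the second partials give the Hessian [[6*x1 + 4, 0], [0, 6*x2 + 8]],
# so J1 = 6*x1 + 4 and J2 = (6*x1 + 4)*(6*x2 + 8).
def classify(x1, x2):
    J1 = 6*x1 + 4
    J2 = (6*x1 + 4) * (6*x2 + 8)
    if J1 > 0 and J2 > 0:
        return 'relative minimum'
    if J1 < 0 and J2 > 0:
        return 'relative maximum'
    return 'saddle point'

# Step 7: classify every stationary point by the signs of J1 and J2
results = {}
for a in roots1:
    for b in roots2:
        results[(a, b)] = classify(a, b)
        print((a, b), results[(a, b)], f(a, b))
```

Running the sketch lists the four stationary points with their classification and function values, following rules (i) to (iv) above.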
Table 6.1  Sign Notation of J

                            J₂ positive, i.e. J₂ > 0        J₂ negative, i.e. J₂ < 0
                            (J is definite)                 (J is indefinite)
J₁ positive, i.e. J₁ > 0    J is positive definite and      J is indefinite and x* is
                            x* is a relative minimum        a saddle point
J₁ negative, i.e. J₁ < 0    J is negative definite and      J is indefinite and x* is
                            x* is a relative maximum        a saddle point

Note: If in a function of two variables f(x₁, x₂) the Hessian matrix is neither positive definite nor negative definite, then the point (x₁*, x₂*) at which ∂f/∂x₁ = ∂f/∂x₂ = 0 is called a saddle point, which means it may be a maximum or a minimum with respect to one variable when the other is fixed.

Illustration 6.2  Determine the extreme points of the following function and evaluate the function at those points:

    f(x) = x₁³ + x₂³ + 2x₁² + 4x₂² + 6

Solution  The necessary conditions for the existence of an extreme point are ∂f/∂x₁ = 0 and ∂f/∂x₂ = 0:

    ∂f/∂x₁ = 3x₁² + 4x₁ = x₁(3x₁ + 4) = 0    (i)
    ∂f/∂x₂ = 3x₂² + 8x₂ = x₂(3x₂ + 8) = 0    (ii)

From Eq. (i), we get x₁ = 0 or -4/3, and from Eq. (ii), we get x₂ = 0 or -8/3. Hence, the above equations are satisfied by the following points:

    (0, 0), (0, -8/3), (-4/3, 0) and (-4/3, -8/3)

Now, let us find the nature of these extreme points, for which we use the sufficiency conditions, i.e. the second-order partial derivatives of f:

    ∂²f/∂x₁² = 6x₁ + 4,   ∂²f/∂x₂² = 6x₂ + 8,   ∂²f/∂x₁∂x₂ = 0

The Hessian matrix of f is given by

    J = [ 6x₁ + 4      0      ]
        [    0      6x₂ + 8   ]

Hence,

    J₁ = | 6x₁ + 4 |   and   J₂ = | 6x₁ + 4      0     | = (6x₁ + 4)(6x₂ + 8)
                                  |    0      6x₂ + 8  |

The values of J₁ and J₂ and the nature of the extreme points are given in Table 6.2.

Table 6.2  Sign Notation of J in Illustration 6.2

Point (x₁*, x₂*)   Value of J₁   Value of J₂   Nature of J         Nature of point    f(x₁*, x₂*)
(0, 0)             +4            +32           Positive definite   Relative minimum   6
(0, -8/3)          +4            -32           Indefinite          Saddle point       418/27
(-4/3, 0)          -4            -32           Indefinite          Saddle point       194/27
(-4/3, -8/3)       -4            +32           Negative definite   Relative maximum   50/3

The second partial derivatives are ∂²f/∂x₁² = 2, ∂²f/∂x₂² = -2 and ∂²f/∂x₁∂x₂ = 0. Therefore, the Hessian matrix at x₁ = 0, x₂ = 0 is

    J = [ 2   0 ]
        [ 0  -2 ]
We have J₁ = |2| = 2 and

    J₂ = | 2   0 | = -4
         | 0  -2 |

Since J₁ is positive (+2) and J₂ is negative (-4), the nature of J is indefinite and the point x₁* = 0, x₂* = 0 is a saddle point; the value of the function at this saddle point is f(0, 0) = 0.

Note: At the saddle point, if one of the values of x₁ or x₂ is fixed, the other may show an extremum.

Illustration 6.5  Two frictionless rigid bodies R₁ and R₂ are connected by three linear elastic springs of constants k₁, k₂ and k₃ as shown in Figure 6.3. The springs are at their natural positions when the applied force P is zero. Find the displacements x₁ and x₂ under a force of 26 units by using the principle of minimum potential energy, assuming the spring constants k₁, k₂ and k₃ to be 2, 3 and 4 respectively. Examine whether the displacements correspond to the minimum potential energy.
However, the mathematical concepts of matrices, calculus and other simple algebraic principles are prerequisites to these problems. We will take up the multivariable constrained problems in the next chapter. Key Concepts Multidimensional optimization: The optimization problems routed in more than one direction, ic. having more than one variable. Saddle point: If there is a function of two variables /(x,, x,) whose Hessian matrix is neither positive definite nor negative definite, then the point (x;, x3) at which 9f/ax," = 9f/2x) is called the saddle point. Hessian matrix: J, _ ,. = [0°//2xx,_,.] is the matrix of the second partial derivative. Positive definite: If J, > 0, J,> 0, then J is positive definite and x" is a relative minimum. Positive indefinite: If J,> 0, J,< 0, then J is indefinite and x" is a saddle point. Negative definite: If J, < 0, J, > 0, then J is negative definite and x* is a relative maximum. Negative indefinite: If J, < 0, J, < 0, then J is indefinite and x* is.a saddle point,