Ch17 OPT

This chapter discusses practical search methods for constrained optimization problems. When the objective function and/or constraints are not explicit, a search method may be the most practical approach. The chapter covers penalty functions, using vector algebra to determine search directions, and the generalized reduced gradient method. The GRG method iteratively moves tangent to constraints in a favorable direction, returns to the constraints, and terminates if no improvement is possible with a small step size. The Kuhn-Tucker conditions provide necessary conditions for optimality of constrained optimization problems.
CHAPTER 17
VECTOR AND REDUCED GRADIENT SEARCHES

By:
Mahendra Adhi K
23114310
The intent of this chapter is to push further into practical search
methods that apply to constrained optimization problems.
When the objective function and/or constraints are not in explicit form, a
search method may be the most practical choice.

SEARCH METHODS:
3 EFFORTS IN SOLVING CONSTRAINED OPTIMIZATION PROBLEMS
PENALTY FUNCTIONS

A penalty function, illustrated here for maximization, converts the constrained problem into an unconstrained problem whose maximum approaches the constrained maximum as the penalty factor grows.
Example: using a search method with penalty functions, determine x1 and x2 that minimize

subject to the constraint

Solution: establish a new function to minimize

(17.1)
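The example's actual objective and constraint appear as equations not reproduced in this text, so the sketch below illustrates the penalty technique on a HYPOTHETICAL stand-in problem: minimize y = x1^2 + x2^2 subject to x1 + x2 = 2, using the penalized function Y = y + P(x1 + x2 - 2)^2 in the spirit of Eq. (17.1).

```python
# Minimal sketch of the penalty-function idea on an assumed problem:
#   minimize  y = x1^2 + x2^2   subject to  x1 + x2 = 2
# The unconstrained substitute is  Y = y + P * (x1 + x2 - 2)^2.

def penalized(x1, x2, P):
    """Objective plus a penalty on the constraint violation."""
    y = x1**2 + x2**2
    phi = x1 + x2 - 2.0          # constraint residual (zero when satisfied)
    return y + P * phi**2

def grad(x1, x2, P):
    """Analytic gradient of the penalized function."""
    phi = x1 + x2 - 2.0
    return (2*x1 + 2*P*phi, 2*x2 + 2*P*phi)

def minimize_penalty(P=1000.0, step=1e-4, iters=20000):
    x1, x2 = 0.0, 0.0            # arbitrary starting point
    for _ in range(iters):
        g1, g2 = grad(x1, x2, P)
        x1 -= step * g1          # steepest-descent move on Y
        x2 -= step * g2
    return x1, x2

x1, x2 = minimize_penalty()
print(x1, x2)   # both ≈ 0.9995; the exact optimum x1 = x2 = 1 is approached as P grows
```

A finite P shifts the unconstrained minimum slightly off the constraint; increasing P (or solving a sequence of problems with growing P) drives the solution toward the true constrained optimum.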
2 SEARCH DIRECTIONS FROM VECTOR ALGEBRA

For two- and three-variable constrained problems, vector algebra using dot products and cross products provides a straightforward means of specifying search directions.
The three classes of moves for a three-variable constrained problem are as follows:
1. Moving freely in the direction of the greatest rate of change of y.
2. Moving tangent to one constraint surface in the most favorable direction with respect to another constraint surface or with respect to y.
3. Moving tangent to two constraint surfaces simultaneously in the direction of improving y.
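Move class 3 above can be sketched directly: with two constraint surfaces in three variables, a direction tangent to both is the cross product of their gradients, and the sign is chosen by a dot product with the objective gradient. The constraint and objective gradients below are HYPOTHETICAL values for illustration.

```python
# Tangent direction to two constraint surfaces via the cross product.

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return sum(ai*bi for ai, bi in zip(a, b))

# Assumed constraint gradients at the current point, e.g. for
# phi1 = x1 + x2 + x3 and phi2 = x1 - x3:
grad_phi1 = (1.0, 1.0, 1.0)
grad_phi2 = (1.0, 0.0, -1.0)

tangent = cross(grad_phi1, grad_phi2)

# Tangency check: the cross product is orthogonal to both gradients.
print(dot(tangent, grad_phi1), dot(tangent, grad_phi2))  # 0.0 0.0

# To improve (here, decrease) y, pick the sign of the tangent that
# points against the objective gradient:
grad_y = (2.0, -1.0, 0.5)        # assumed objective gradient
direction = tangent if dot(tangent, grad_y) < 0 else tuple(-t for t in tangent)
```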
3 GENERALIZED REDUCED GRADIENT

The GRG method uses a repeated sequence of steps that:
a. starts on the constraint(s);
b. moves tangent to the constraint(s) in a favorable direction with respect to y;
c. returns to the constraint(s).
The individual steps for a problem of n variables and m constraints are as follows:
1. Start with a trial point on the constraint(s).
2. At the point on the constraint(s), compute the components of the gradient vector.
3. Apply the reduced-gradient concept to move tangent to the constraints in a favorable direction with respect to y.
4. Return to the constraint(s) by adjusting m of the variables through application of Newton-Raphson to the m simultaneous equations provided by the constraints.
5. If, in the process of returning to the constraint(s), the value of y is degraded relative to its value at the starting point, return to step 3 with a smaller step size.
6. If no improvement is possible with a sufficiently small step size, terminate; otherwise, return to step 2.
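The six steps above can be sketched in code. The chapter's worked example is not fully reproduced in this text, so the sketch uses a HYPOTHETICAL two-variable problem with one constraint: minimize y = x1^2 + x2^2 subject to phi = x1*x2 - 1 = 0, with x2 as the decision variable and x1 as the state variable adjusted by Newton-Raphson in the return step (optimum: x1 = x2 = 1).

```python
# Minimal GRG sketch for:  minimize y = x1^2 + x2^2  s.t.  x1*x2 = 1.

def y(x1, x2):   return x1**2 + x2**2
def phi(x1, x2): return x1*x2 - 1.0

def grg(x1=0.5, x2=2.0, step=0.1, tol=1e-8):
    # Step 1: the trial point (0.5, 2.0) already lies on the constraint.
    while step > tol:
        y0, x1_old, x2_old = y(x1, x2), x1, x2
        # Step 2: gradient components at the current point.
        dy_dx1, dy_dx2 = 2*x1, 2*x2
        dphi_dx1, dphi_dx2 = x2, x1
        # Step 3: reduced gradient w.r.t. the decision variable x2,
        # then a tangent move (x1 follows so phi is stationary to 1st order).
        red = dy_dx2 - dy_dx1 * dphi_dx2 / dphi_dx1
        dx2 = -step * red
        x2 += dx2
        x1 -= (dphi_dx2 / dphi_dx1) * dx2
        # Step 4: return to the constraint by Newton-Raphson on x1
        # (phi is linear in x1, so this converges immediately).
        for _ in range(20):
            x1 -= phi(x1, x2) / x2
        # Steps 5-6: if y got worse, retry from the old point with a
        # smaller step; terminate once the step is negligibly small.
        if y(x1, x2) >= y0:
            x1, x2, step = x1_old, x2_old, step / 2
    return x1, x2

x1, x2 = grg()
print(round(x1, 3), round(x2, 3))  # 1.0 1.0 (the optimum x1 = x2 = 1)
```

With m constraints and n variables the same loop generalizes: the reduced gradient is taken with respect to the n - m decision variables, and the return step solves the m constraint equations for the m state variables.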
Example: use the GRG method to minimize

subject to

(The solution)
Arbitrarily choose x3 as the decision variable and denote:

So:
KUHN-TUCKER CONDITIONS

The Kuhn-Tucker conditions (KTC) applicable to a minimization problem are:

which may be combined with the second, fourth, and fifth KTC.
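The slide's equations are not reproduced in this text; as a reference, the standard statement of the conditions for minimizing y(x) subject to g_i(x) ≤ 0 and h_j(x) = 0 can be written as below (the multiplier symbols u_i, v_j are this sketch's notation, not necessarily the chapter's):

```latex
% Standard Kuhn-Tucker necessary conditions for
%   minimize y(x)  subject to  g_i(x) <= 0,  h_j(x) = 0
\begin{aligned}
\nabla y(x^*) + \sum_i u_i \,\nabla g_i(x^*)
  + \sum_j v_j \,\nabla h_j(x^*) &= 0 && \text{(stationarity)} \\
g_i(x^*) \le 0, \qquad h_j(x^*) &= 0 && \text{(feasibility)} \\
u_i &\ge 0 && \text{(nonnegative multipliers)} \\
u_i \, g_i(x^*) &= 0 && \text{(complementary slackness)}
\end{aligned}
```

At a candidate minimum, each inequality constraint is either active (g_i = 0) or has a zero multiplier, which is what the complementary-slackness line enforces.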
Thank you
