Studies in Systems, Decision and Control
Volume 463
Series Editor
Janusz Kacprzyk, Systems Research Institute, Polish Academy of Sciences,
Warsaw, Poland
The series “Studies in Systems, Decision and Control” (SSDC) covers new developments, advances, and the state of the art in the various areas of broadly perceived systems, decision making, and control, published quickly, up to date, and to a high quality. The intent is to cover the theory, applications, and perspectives on the state of the art and future developments relevant to systems, decision making, control, complex processes, and related areas, as embedded in the fields of engineering, computer science, physics, economics, and the social and life sciences, as well as the paradigms and methodologies behind them. The series contains monographs, textbooks, lecture notes, and edited volumes in systems, decision making, and control, spanning the areas of Cyber-Physical Systems, Autonomous Systems, Sensor Networks, Control Systems, Energy Systems, Automotive Systems, Biological Systems, Vehicular Networking and Connected Vehicles, Aerospace Systems, Automation, Manufacturing, Smart Grids, Nonlinear Systems, Power Systems, Robotics, Social Systems, Economic Systems, and others. Of particular value to both the contributors and the readership are the short publication timeframe and the worldwide distribution and exposure, which enable a wide and rapid dissemination of research output.
Indexed by SCOPUS, DBLP, WTI Frankfurt eG, zbMATH, SCImago.
All books published in the series are submitted for consideration in Web of Science.
Ali Kaveh · Ataollah Zaerreza
Structural Optimization
Using Shuffled Shepherd
Meta-Heuristic Algorithm
Extensions and Applications
Ali Kaveh
Department of Civil Engineering
Iran University of Science and Technology
Tehran, Iran

Ataollah Zaerreza
Department of Civil Engineering
Iran University of Science and Technology
Tehran, Iran
© The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature
Switzerland AG 2023
This work is subject to copyright. All rights are solely and exclusively licensed by the Publisher, whether
the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse
of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and
transmission or information storage and retrieval, electronic adaptation, computer software, or by similar
or dissimilar methodology now known or hereafter developed.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication
does not imply, even in the absence of a specific statement, that such names are exempt from the relevant
protective laws and regulations and therefore free for general use.
The publisher, the authors, and the editors are safe to assume that the advice and information in this book
are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or
the editors give a warranty, expressed or implied, with respect to the material contained herein or for any
errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional
claims in published maps and institutional affiliations.
This Springer imprint is published by the registered company Springer Nature Switzerland AG
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
Preface
Due to restrictions in available resources and the growth of the human population, the significance of optimization continues to grow in the modern world. Engineers always seek to create structural systems that are both cost-effective and robust enough to handle the most demanding functional requirements that may arise throughout their service life. The typical trial-and-error approach to structural design is insufficient to provide solutions that fulfill both economic and safety criteria. Therefore, computational methods have been developed for optimization, among which metaheuristic algorithms are the most popular.
This book introduces a new metaheuristic algorithm named the Shuffled Shepherd Optimization Algorithm, together with different versions of it. New methods for structural damage detection and reliability-based design optimization are also introduced. The concepts presented in this book are not only applicable to the design of skeletal structures, but can equally be applied to different optimization problems in civil engineering. These concepts are also applicable to the optimal design of other systems, such as hydraulic and electrical networks.
The authors and their colleagues have been involved in various developments and applications of metaheuristic algorithms to structural optimization over the last two decades. This book contains the part of this research suitable for various aspects of optimization in civil engineering.
The book is likely to be of interest to civil, mechanical, industrial, and electrical engineers who use optimization methods for design, as well as to students and researchers in structural optimization, who will find it necessary professional reading.
Chapter 1 explains the purpose of the book and provides an overview of the remaining chapters. In Chap. 2, a newly developed multi-community metaheuristic optimization algorithm known as the Shuffled Shepherd Optimization Algorithm (SSOA) is introduced, in which the agents are first decomposed into multi-communities, and the optimization process is then performed, mimicking the behavior of a shepherd in nature, operating on each community. In Chap. 3, the SSOA is modified to make it less dependent on parameter tuning. The new version is called the parameter-reduced SSOA (PRSSOA), requiring fewer parameters to be tuned. Chapter 4 presents the enhanced SSOA (ESSOA).
1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2 The Main Phase of the Metaheuristic Algorithms . . . . . . . . . . . . . 3
1.2.1 The Problem Definition Phase . . . . . . . . . . . . . . . . . . . . . . 3
1.2.2 The Algorithm Parameter Definition Phase . . . . . . . . . . . 3
1.2.3 The Initialization Phase . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
1.2.4 The Main Loop . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
1.3 Structural Optimum Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
1.4 Goals and Organization of the Present Book . . . . . . . . . . . . . . . . . . 6
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
2 Shuffled Shepherd Optimization Method: A New
Meta-Heuristic Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
2.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
2.2 Shuffled Shepherd Optimization Algorithm . . . . . . . . . . . . . . . . . . 12
2.2.1 Inspiration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
2.2.2 Mathematical Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
2.2.3 Steps of the Optimization Algorithm . . . . . . . . . . . . . . . . . 14
2.3 Validation of the SSOA . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
2.3.1 Mathematical Optimization Problems . . . . . . . . . . . . . . . . 17
2.3.2 Engineering Optimization Problems . . . . . . . . . . . . . . . . . 17
2.4 Numerical Examples . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
2.4.1 The 25-Bar Spatial Truss . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
2.4.2 The 47-Bar Planar Truss . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
2.4.3 The 72-Bar Spatial Truss . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
2.4.4 The 120-Bar Dome Truss . . . . . . . . . . . . . . . . . . . . . . . . . . 32
2.4.5 A 272-Bar Transmission Tower . . . . . . . . . . . . . . . . . . . . . 34
2.4.6 A 1016-Bar Double-Layer Grid . . . . . . . . . . . . . . . . . . . . . 39
2.5 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50
1.1 Introduction
The world we live in today is very competitive, and people are constantly striving to
maximize their output or profit using a limited amount of resources that they have
access to. As an example, in engineering design, it is essential to select design vari-
ables that satisfy all design criteria and have the lowest feasible cost. The primary
purpose of engineering design is to adhere to fundamental standards while simul-
taneously achieving the best possible economic results. The optimization methods
provide a solution for these kinds of problems [1].
Optimizing is the process of making the most of a situation or resource to the greatest extent possible. Mathematically, optimization is defined as follows:

$$\begin{aligned}
\text{minimize} \quad & f_i(x), && i = 1, 2, 3, \ldots, n \\
\text{subject to} \quad & \mu_j(x) = 0, && j = 1, 2, 3, \ldots, m \\
& \varphi_h(x) \le 0, && h = 1, 2, 3, \ldots, b \\
& \omega_g(x) \ge 0, && g = 1, 2, 3, \ldots, v
\end{aligned}$$

where $f_i(x)$ is called the objective function; $\mu_j(x)$, $\varphi_h(x)$, and $\omega_g(x)$ are the constraint functions; and $x$ is the design vector.
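As a concrete illustration of this formulation, a small two-variable problem (a hypothetical example, not one drawn from the book) can be coded as follows:

```python
def objective(x):
    # f(x): minimize the sum of squares of the design variables
    return x[0] ** 2 + x[1] ** 2

def equality_constraint(x):
    # mu(x) = 0: the design variables must sum to one
    return x[0] + x[1] - 1.0

def inequality_constraint(x):
    # phi(x) <= 0: the first design variable may not exceed 0.8
    return x[0] - 0.8

# Evaluate a candidate design vector x
x = [0.5, 0.5]
print(objective(x))            # 0.5
print(equality_constraint(x))  # 0.0
```

An optimizer would search over x for the smallest objective value among the points for which all constraint functions are satisfied.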
If an optimization problem has only one objective function, it is called a single-objective problem; otherwise, it is known as a multi-objective problem. The objective and constraint functions may have a simple formulation, allowing the optimal solution to be found using a simple program or a manual method. The other type of function is one that does not have a simple formulation, or has no explicit formulation at all (called a black-box problem). Stochastic optimization techniques were created to solve this kind of optimization problem. Optimization methods can thus be divided into gradient-based methods and metaheuristic algorithms.
Compared to stochastic methods, gradient-based methods converge faster and produce more accurate results. However, gradient-based methods are susceptible to becoming trapped in local optima, are dependent on the starting point of the process, and cannot be applied to black-box optimization problems. Metaheuristic algorithms do not have these shortcomings; in addition, they can be utilized for a wide variety of optimization problems and are simple to implement [2]. Therefore, metaheuristic algorithms have grown in popularity during the past decades.
No single metaheuristic algorithm is capable of finding the optimum solution for all types of optimization problems [3]. Therefore, new metaheuristic algorithms continue to be developed by researchers. In addition, the performance of existing approaches is improved by modifying their various components to suit the problems under study. The hybridization of existing techniques is another strategy for enhancing the performance of metaheuristic algorithms [4].
Depending on their inspiration, the four primary categories of metaheuristic algorithms are evolution-based, physics-based, swarm-based, and human-based. Evolutionary algorithms are inspired by the characteristics of biological evolution, including crossover, mutation, and selection. Swarm intelligence algorithms are inspired by the social behavior of creatures living in a group, which might be a swarm, herd, or flock. The human-based algorithms consist of optimizers that simulate certain human behaviors. As the fourth class of metaheuristic algorithms, physics-based algorithms are motivated by physical laws [1]. Examples for each category are provided in Fig. 1.1.
Metaheuristic algorithms have two phases: exploration (diversification) and exploitation (intensification) [5]. Exploration of the search space is performed to acquire better solutions, while the exploitation phase consists of searching close to the best answer discovered thus far. If the algorithm exhibits more exploration than exploitation, it cannot converge to the optimal solution. Conversely, if the exploitation phase of the algorithm is greater than its exploration phase, the algorithm becomes trapped in local optima. Consequently, there should always be an equilibrium between the exploration and exploitation of the algorithm [6].
There are four main phases that should be considered when metaheuristic algorithms are utilized to solve optimization problems. These phases are problem definition, algorithm parameter definition, initialization, and the main loop of the optimization algorithm. Each phase is explained in the following sections.
The problem definition (also known as the objective function definition) phase is the most crucial stage of optimization using metaheuristic algorithms. To achieve the desired outcome, the optimization problem must be precisely coded. In addition, the problem is analyzed repeatedly to determine the value of the objective function, and the majority of the algorithm's run time is spent in this phase.
There are two types of optimization problems. The first type is the unconstrained problem. Handling this type of objective function is straightforward, and there is no need for further explanation. The other type is the constrained problem; most real problems are of this type. Different methods have been developed to handle the constraints, and two of them are described here.

One of the simplest ways to consider the constraints is to add a penalty to the objective function value when a constraint is not satisfied. The penalty can be a constant value or a variable depending on the constraint function. For example, in structural optimization problems with a constraint on the stress ratio, the value of the stress ratio is added to the objective function when it exceeds one. The other way is to use multi-objective optimization algorithms, where the constraint functions are treated as additional objective functions and the optimization process is performed. At the end of the optimization process, the solution which satisfies the constraint functions is selected from the Pareto front.
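The stress-ratio penalty described above can be sketched as follows (a minimal illustration; the penalty factor and the exact penalty form are assumptions for this example, not the book's formulation):

```python
def penalized_objective(weight, stress_ratios, penalty_factor=10.0):
    """Add a penalty to the structural weight whenever a stress ratio
    exceeds one; feasible designs are returned unchanged."""
    violation = sum(max(0.0, r - 1.0) for r in stress_ratios)
    return weight + penalty_factor * violation

# A feasible design: no ratio exceeds one, so no penalty is added
print(penalized_objective(100.0, [0.80, 0.95]))  # 100.0
# An infeasible design is penalized in proportion to the violation
print(penalized_objective(100.0, [1.20, 0.95]))  # about 102.0
```

With such a penalty, the metaheuristic can treat the constrained problem as an unconstrained one, since violating designs simply receive worse objective values.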
All metaheuristic algorithms have at least some basic optimization parameters, which are defined by the user. The basic parameters of a metaheuristic algorithm are the population size and the maximum number of iterations (or the maximum number of function evaluations). Any metaheuristic algorithm that has only these two parameters is called parameter-less. An algorithm may contain additional parameters beyond the fundamental ones. These parameters influence the main step size and have advantages and disadvantages. Adjusting these parameters can help the algorithm perform well in a variety of optimization problems. On the other hand, parameter adjustment can be time-consuming.
The initialization phase of most optimization algorithms is the same. In this phase, the population (solutions) is randomly generated in the search space and then evaluated. This randomly generated population is passed to the next phase. In some optimization algorithms, after the random generation of the population, further strategies are employed to improve the quality of the initialized population. For example, different opposition-based learning (OBL) techniques are utilized in the enhanced shuffled shepherd optimization algorithm (ESSOA).
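The OBL idea can be sketched as follows: for each randomly generated solution x, its opposite point x_min + x_max − x is also evaluated, and the better of the two is kept. This is a simplified sketch for minimization; the ESSOA itself employs several OBL variants:

```python
import random

def obl_initialize(n_pop, x_min, x_max, objective):
    """Opposition-based initialization: keep the better of each random
    solution and its opposite point within the bounds."""
    population = []
    for _ in range(n_pop):
        x = [random.uniform(lo, hi) for lo, hi in zip(x_min, x_max)]
        # Opposite point: x_opp = x_min + x_max - x (component-wise)
        x_opp = [lo + hi - xi for xi, lo, hi in zip(x, x_min, x_max)]
        population.append(min(x, x_opp, key=objective))
    return population

random.seed(0)
sphere = lambda x: sum(v * v for v in x)
pop = obl_initialize(5, [-5.0, -5.0], [5.0, 5.0], sphere)
print(len(pop))  # 5
```

Because each retained agent is the better of a random point and its opposite, the initial population starts with some prior knowledge of the search space at the cost of extra function evaluations.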
In the main loop of the algorithm, one tries to find new and better solutions for the considered optimization problem. The main loop has at least one step size, which the algorithm uses to search the optimization space. Other mechanisms can also exist in the main loop. One of them is the replacement strategy, in which each new solution is compared to its old solution and the better of the two is kept. Another mechanism is memory, which is used in algorithms that do not have a replacement strategy in order to retain the best solution found so far. A further mechanism that can be used in this phase is making the algorithm multi-population (multi-community). For example, the shuffling technique is utilized in the SSOA to obtain a multi-population algorithm. The shuffling process for a population of size NP is provided in Fig. 1.2.
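The shuffling step can be sketched as follows, under the convention (borrowed from shuffled-frog-leaping-style algorithms) that the population is sorted by objective value and dealt round-robin into communities so that each community receives a mix of good and bad agents; the exact scheme used by the SSOA may differ in detail:

```python
def shuffle_into_communities(population, fitness, n_communities):
    """Sort agents by fitness (smaller is better) and deal them
    round-robin into communities."""
    order = sorted(range(len(population)), key=lambda i: fitness[i])
    communities = [[] for _ in range(n_communities)]
    for rank, idx in enumerate(order):
        communities[rank % n_communities].append(population[idx])
    return communities

fitness = [5.0, 1.0, 3.0, 4.0, 2.0, 6.0]
pop = ['a', 'b', 'c', 'd', 'e', 'f']
# Sorted order by fitness is b, e, c, d, a, f, dealt into 2 communities:
print(shuffle_into_communities(pop, fitness, 2))
# [['b', 'c', 'a'], ['e', 'd', 'f']]
```

Each community thus spans the full quality range of the population, which is what allows good and bad agents to interact within every community.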
Fig. 1.3 Flowchart for the analysis and optimum design of structures
The contribution of this book concerns the sizing optimization and simultaneous size/layout optimization of benchmark and real-size structures using the shuffled shepherd optimization algorithm and its variants. The chapters of the book are organized into two parts. The first part is entitled 'Extensions'. In this part, the shuffled shepherd optimization algorithm is introduced first. Then, two other versions of the SSOA, named the parameter-reduced SSOA (PRSSOA) and the enhanced SSOA (ESSOA), are introduced. The second part is entitled 'Applications'. In this part, the efficiency of the SSOA and its variants is investigated in different problems, including structural damage detection, the optimum design of portal frames, the optimum design of castellated beams, and reliability-based design optimization. Additionally, the statistical regeneration mechanism (SRM), which is used to improve the SSOA, is added to the Rao algorithms and the particle swarm optimization algorithm to enhance their performance.
The remaining chapters of this book are organized as follows:
Chapter 2 introduces a newly developed multi-community metaheuristic opti-
mization algorithm. This algorithm is called the shuffled shepherd optimization algo-
rithm (SSOA), in which the agents are first separated into multi-communities, and
the optimization process is then performed mimicking the behavior of a shepherd
in nature, operating on each community. The SSOA is tested on 17 mathematical benchmark optimization problems, 2 classic engineering problems, 5 truss design problems, and one double-layer grid design problem. The results show that the SSOA is competitive with the other metaheuristic algorithms considered [8].
Chapter 3 introduces a modified version of the shuffled shepherd optimization algorithm, adjusted to make it less dependent on parameter tuning. The new version is called the parameter-reduced SSOA (PRSSOA), requiring fewer parameters to be tuned [9].
Chapter 4 presents the Enhanced Shuffled Shepherd Optimization Algorithm (ESSOA). The Shuffled Shepherd Optimization Algorithm (SSOA) is a swarm intelligence-based optimizer inspired by the herding behavior of shepherds in nature. The SSOA may suffer from some shortcomings, including being trapped in a local optimum and starting from a random population without prior knowledge. In order to address these issues, the SSOA is modified with two efficient mechanisms in this chapter. The first mechanism is the opposition-based learning (OBL) concept, which was first presented by Tizhoosh [10]. The OBL is used to improve the initialization phase of the SSOA, because it improves the convergence rate of the algorithm by providing prior knowledge about the search space. The second mechanism is a newly introduced solution generator based on the statistical results of the solutions, called the statistically regenerated step size. This mechanism provides good exploration in the early iterations of the algorithm and enables the algorithm to escape from local optima in the last iterations [11].
Chapter 5 contains a new strategy, namely the Boundary Strategy (BS), for the process of optimization-based damage detection. This strategy gradually neutralizes the effects of structural elements that are healthy during the optimization process. The BS enables the optimization method to find the optimum solution better than conventional methods that do not use it. This technique improves both the accuracy and the convergence speed of the algorithms in identifying and quantifying the damage [12].
Chapter 6 describes the discrete optimum design of two types of portal frames,
including planar steel Curved Roof Frame (CRF) and Pitched Roof Frame (PRF)
with tapered I-section members. The optimal design aims to minimize the weight of
these frame structures while satisfying some design constraints based on the require-
ments of ANSI/AISC 360-16 and ASCE 7-10. Four population-based metaheuristic
optimization algorithms are applied to the optimal design of these frames. These
algorithms consist of Teaching–Learning-Based Optimization (TLBO), Enhanced
Colliding Bodies Optimization (ECBO), Shuffled Shepherd Optimization Algorithm
(SSOA), and Water Strider Algorithm (WSA) [13].
Chapter 7 presents the optimum design of castellated beams utilizing the SSOA. The use of castellated beams has received much attention in recent decades because these beams have openings in their webs, and the bending capacity of the cross-section increases without increasing the weight of the beam. These beams are also more practical from an architectural point of view, since installations and plumbing can be passed through the openings of these beams when they are used in roofs [14].
Chapter 8 investigates an efficient graph-theoretical force method. A graph-
theoretical force method is utilized in the analysis of the frame structures to decrease
the time required for optimization. The performance and speed of the graph-
theoretical force method are compared to those of the displacement method in the
optimal design of frame structures. Additionally, the standard particle swarm opti-
mization algorithm (PSO) is improved to enhance its performance in the optimal
design of the steel frames [15].
Chapter 9 presents a new framework for reliability-based design optimization (RBDO) using metaheuristic algorithms based on decoupled methods. This framework is named the sequential optimization and reliability assessment-double meta-heuristic (SORA-DM). The efficiency of the SORA-DM is investigated using the enhanced shuffled shepherd optimization algorithm (ESSOA) and evaluated on six RBDO problems. The results show that the SORA-DM can perform better than gradient-based methods in RBDO and can easily be utilized in a wide range of RBDO problems [16].
Chapter 10 provides the reliability-based design optimization (RBDO) of the
frame structures using the force method and sequential optimization and reliability
assessment-double meta-heuristic framework (SORA-DM). In the SORA-DM, the
meta-heuristic algorithm is utilized in both the optimization process and reliability
assessment. The considered frames have a lower degree of statical indeterminacy
than the degree of kinematical indeterminacy. The force method is used for the first
time in the structural analysis of the RBDO problems [17].
References
1. Kaveh, A.: Advances in Metaheuristic Algorithms for Optimal Design of Structures, 3rd edn.
Springer (2021)
2. Mirjalili, S., Mirjalili, S.M., Lewis, A.: Grey wolf optimizer. Adv. Eng. Softw. 69, 46–61 (2014)
3. Wolpert, D.H., Macready, W.G.: No free lunch theorems for optimization. IEEE Trans. Evol.
Comput. 1(1), 67–82 (1997)
4. Blum, C., Raidl, G.R.: Hybrid Metaheuristics: Powerful Tools for Optimization. Springer
(2016)
5. Kaveh, A., Bakhshpoori, T.: Metaheuristics: Outlines, MATLAB Codes and Examples,
Springer (2019)
6. Yang, X.-S.: Nature-Inspired Metaheuristic Algorithms. Luniver Press (2010)
7. Kaveh, A., Ilchi Ghazaan, M.: Meta-Heuristic Algorithms for Optimal Design of Real-Size
Structures. Springer (2018)
8. Kaveh, A., Zaerreza, A.: Shuffled shepherd optimization method: a new meta-heuristic
algorithm. Eng. Comput. 37(7), 2357–2389 (2020)
9. Kaveh, A., Zaerreza, A., Hosseini, S.M.: Shuffled shepherd optimization method simplified for
reducing the parameter dependency. Iranian J. Sci. Technol. Trans. Civ. Eng. 45(3), 1397–1411
(2021)
10. Tizhoosh, H.R.: Opposition-based learning: a new scheme for machine intelligence. In: Inter-
national Conference on Computational Intelligence for Modelling, Control and Automation
and International Conference on Intelligent Agents, Web Technologies and Internet Commerce
(CIMCA-IAWTIC’06). IEEE (2005)
11. Kaveh, A., Zaerreza, A., Hosseini, S.M.: An enhanced shuffled shepherd optimization algorithm for optimal design of large-scale space structures. Eng. Comput. (2021)
12. Kaveh, A., Hosseini, S.M., Zaerreza, A.: Boundary strategy for optimization-based structural
damage detection problem using metaheuristic algorithms. Periodica Polytech. Civ. Eng. 65(1),
150–167 (2021)
13. Kaveh, A., Karimi Dastjerdi, M.I., Zaerreza, A., Hosseini, M.: Discrete optimum design
of planar steel curved roof and pitched roof portal frames using metaheuristic algorithms.
Periodica Polytech. Civ. Eng. 65(4), 1092–1113 (2021)
14. Kaveh, A., Almasi, P., Khodagholi, A.: Optimum design of castellated beams using four recently
developed meta-heuristic algorithms. Iranian J. Sci. Technol. Trans. Civ. Eng. (2022)
15. Kaveh, A., Zaerreza, A.: Comparison of the graph-theoretical force method and displacement
method for optimal design of frame structures. Structures 43, 1145–1159 (2022)
16. Kaveh, A., Zaerreza, A.: A new framework for reliability-based design optimization using
metaheuristic algorithms. Structures 38, 1210–1225 (2022)
17. Kaveh, A., Zaerreza, A.: Reliability-based design optimization of the frame structures using
the force method and SORA-DM framework. Structures 45, 814–827 (2022)
Chapter 2
Shuffled Shepherd Optimization Method:
A New Meta-Heuristic Algorithm
2.1 Introduction
In the first stage of this algorithm, the agents are partitioned into communities, and the optimization process is inspired by the behavior of a shepherd in nature operating in each community. During the process of optimization, consideration is given to both good and bad agents; this ultimately results in an improvement in the algorithm's overall performance.
This chapter is organized as follows: In Sect. 2.2, the inspiration, the mathematical model, and the steps of the Shuffled Shepherd Optimization Algorithm are described. In Sect. 2.3, some benchmark functions and 2 classic engineering problems are investigated using the SSOA. In Sect. 2.4, five truss design problems and a large-scale double-layer grid design problem are optimized utilizing the SSOA, and finally, conclusions are derived in Sect. 2.5.
2.2.1 Inspiration
In nature, animals utilize instinct to determine the best way to live, and human beings have learned how to use animals' instincts for their own goals. A shepherd utilizes animal instinct to determine the best route to the pasture: in a herd, the shepherd always puts one horse in front to find the firmest and fastest way to pasture.

Horses have an instinct for finding the firmest and fastest way. In nature, we can see the trails of animals' movement, and horses and other animals always follow these quick and firm routes. In addition to the use of these trails by shepherds, road engineers in the past have used them to build new roads.

Shepherds put one or more horses in the herd to carry their tools and find the way. The shepherd tries to guide the sheep behind the horses to pasture and bring them back, because this trail is the best trail they can ever find. The shepherd's behavior has been the inspiration for this chapter, and it is utilized for the mathematical modeling of optimization.
2.2.2.1 Herd

In nature, a district contains a large number of herds. The sheep in a herd are not all the same, and each herd has both good and bad sheep. These characteristics of the herd are identical to those of the community in a multi-community method: each community contains both the worst and best agents compared to the others, similar to a herd.
2.2.2.2 Shepherd
In the course of time, humans have learned how to use animal abilities for their own benefit. As an example, shepherds have used fast-riding horses to herd domesticated sheep and cows. To do this, the shepherd tries to lead the animals toward the horses. This behavior is used as the step size of the algorithm introduced in this chapter. Each community member is selected in order, and the step size is calculated for each member. The selected member is named the shepherd. A better and a worse member are then chosen from the same community to which the shepherd belongs, based on their objective function values. These selected members are called the horse and the sheep, respectively. According to the herding behavior of the shepherd, the shepherd first moves toward the sheep and then leads the sheep toward the horse. This movement is the step size of the SSOA, and it is described mathematically as follows:
$$\text{Stepsize}_i = \beta \times rand \circ (X_d - X_i) + \alpha \times rand \circ (X_j - X_i) \tag{2.1}$$

in which $X_i$, $X_d$, and $X_j$ are the solution vectors of the shepherd, the selected horse, and the selected worse sheep in an m-dimensional search space, respectively; $rand$ is a random vector in which each component is in the range [0, 1]; $\alpha$ is a parameter set to $\alpha_0$ at the start of the algorithm, which then decreases to zero as the iteration number of the algorithm increases and can be computed by Eq. (2.2); $\beta$ is a parameter equal to $\beta_0$ at the start of the algorithm, which then increases to $\beta_{max}$ and can be computed by Eq. (2.3); and the sign "$\circ$" represents element-by-element multiplication.
The first sheep selected in the herd has no better member than itself, so the first term of the step size is set to zero; and for the last sheep selected in the herd, which has no worse member than itself, the second term of the step size is zero. Decreasing α and increasing β gradually reduce the exploration and increase the exploitation of the algorithm.
α = α0 − (α0 / maxiteration) × iteration    (2.2)

β = β0 + ((βmax − β0) / maxiteration) × iteration    (2.3)
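Equations (2.1)–(2.3) can be sketched in Python as follows; this is a minimal illustration, the function and variable names are ours rather than from the reference, and the `rand ∘` operation is realized as element-wise multiplication by a fresh random vector:

```python
import numpy as np

def alpha_beta(iteration, max_iteration, alpha0, beta0, beta_max):
    """Parameter schedules of Eqs. (2.2) and (2.3)."""
    alpha = alpha0 - alpha0 / max_iteration * iteration
    beta = beta0 + (beta_max - beta0) / max_iteration * iteration
    return alpha, beta

def step_size(x_i, x_horse, x_sheep, alpha, beta, rng):
    """Step size of Eq. (2.1): beta-term toward the better member (horse),
    alpha-term toward the worse member (sheep)."""
    m = x_i.size
    return (beta * rng.random(m) * (x_horse - x_i)
            + alpha * rng.random(m) * (x_sheep - x_i))

rng = np.random.default_rng(0)
a, b = alpha_beta(iteration=50, max_iteration=200,
                  alpha0=0.5, beta0=2.4, beta_max=2.8)
step = step_size(np.zeros(3), np.ones(3), -np.ones(3), a, b, rng)
```

At iteration 50 of 200 with the chapter's default parameters, α has already dropped from 0.5 to 0.375 while β has grown from 2.4 to 2.5, illustrating the shift from exploration to exploitation.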
After computing the step size for all sheep in a herd, a temporary solution vector is computed for each sheep by the following equation:

x_i^temporary = x_i^old + Stepsize_i    (2.4)
If the temporary objective function value is not worse than the old one, the position of the sheep is updated, so that x_i^new = x_i^temporary; otherwise, x_i^new = x_i^old.
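This greedy acceptance rule can be sketched as follows (a minimal illustration, assuming minimization):

```python
import numpy as np

def update_position(x_old, step, objective):
    """Greedy replacement of Eq. (2.4): keep the move only if not worse."""
    x_temp = x_old + step
    if objective(x_temp) <= objective(x_old):  # "not worse" for minimization
        return x_temp
    return x_old

sphere = lambda x: float(np.sum(x ** 2))
x_improved = update_position(np.array([1.0, 1.0]), np.array([-0.5, -0.5]), sphere)
x_rejected = update_position(np.array([0.0, 0.0]), np.array([1.0, 1.0]), sphere)
```

An improving step toward the origin is accepted, while a worsening step away from it leaves the agent in place.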
The flowchart of the SSOA is illustrated in Fig. 2.2, and the steps are as follows:
Step 1: Initialization
The SSOA parameters are defined, and the initial position of the ith agent is generated randomly in an n-dimensional search space by the following equation:

x_i^0 = x_min + rand ∘ (x_max − x_min),  i = 1, 2, ..., NP

in which x_i^0 represents the initial solution vector of the ith sheep; x_max and x_min represent the bounds of the design variables; rand is a random vector with each component in the range [0, 1]; and NP is the total number of sheep.
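This initialization can be sketched as below; the matrix layout, one row per sheep, is our choice for illustration:

```python
import numpy as np

def initialize(n_agents, x_min, x_max, rng):
    """x_i^0 = x_min + rand o (x_max - x_min), one row per sheep."""
    return x_min + rng.random((n_agents, x_min.size)) * (x_max - x_min)

rng = np.random.default_rng(1)
flock = initialize(16, np.full(5, -10.0), np.full(5, 10.0), rng)
```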
Step 2: Evaluations
The value of the objective function for each sheep is evaluated.
Step 3: Shuffling process
The agents are divided into communities based on the shuffling process described in Sect. 2.2.2.1.
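Section 2.2.2.1 is only partly reproduced above. A common realization of such a shuffling step, as used in shuffled-complex methods, sorts the agents by objective value and then deals them into communities in round-robin fashion, so each community receives a mix of good and poor agents; treat the dealing rule below as an assumption rather than the book's exact procedure:

```python
import numpy as np

def shuffle_into_herds(objectives, n_herds):
    """Sort agents best-to-worst, then deal them round-robin into herds,
    so every herd holds a comparable spread of good and poor agents."""
    order = np.argsort(objectives)                 # best (smallest) first
    return [order[k::n_herds] for k in range(n_herds)]

objs = np.array([3.0, 0.5, 2.0, 1.0, 4.0, 2.5, 0.1, 3.5])
herds = shuffle_into_herds(objs, 2)
```

With two herds, the two best agents end up leading different herds, matching the idea that each community contains both the best and the worst agents relative to the others.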
Step 4: Calculate the step size
The step size is computed for each agent as described in Sect. 2.2.2.2 employing
Eq. (2.1).
Step 5: Calculate the temporary solution vector
The temporary solution vector is computed using Eq. (2.4), and its objective function value is evaluated.
Step 6: Update the agents and merge
If the temporary objective function value is not worse than the old one, the position of the sheep is updated, and the herds are then merged.
Step 7: Update the parameters
The values of α and β are updated using Eqs. (2.2) and (2.3).
Step 8: Termination condition
Steps 3 to 7 are repeated until the specified maximum number of iterations is
reached.
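Putting Steps 1–8 together, one compact sketch of the whole loop might look as follows. This is a hedged reconstruction, not the authors' code: minimization, random horse/sheep selection within each herd, and simple bound clipping are all assumptions not spelled out above.

```python
import numpy as np

def ssoa(objective, x_min, x_max, n_herds=4, herd_size=4,
         max_iter=200, alpha0=0.5, beta0=2.4, beta_max=2.8, seed=0):
    """Minimal SSOA-style loop following Steps 1-8 (minimization assumed)."""
    rng = np.random.default_rng(seed)
    n, dim = n_herds * herd_size, x_min.size
    x = x_min + rng.random((n, dim)) * (x_max - x_min)      # Step 1
    f = np.array([objective(p) for p in x])                 # Step 2
    for it in range(max_iter):
        alpha = alpha0 - alpha0 / max_iter * it             # Eq. (2.2)
        beta = beta0 + (beta_max - beta0) / max_iter * it   # Eq. (2.3)
        order = np.argsort(f)                               # Step 3: shuffle
        herds = [order[k::n_herds] for k in range(n_herds)]
        for herd in herds:
            for j, i in enumerate(herd):                    # j: rank in herd
                step = np.zeros(dim)
                if j > 0:                                   # a better member exists
                    horse = herd[rng.integers(0, j)]
                    step += beta * rng.random(dim) * (x[horse] - x[i])
                if j < len(herd) - 1:                       # a worse member exists
                    sheep = herd[rng.integers(j + 1, len(herd))]
                    step += alpha * rng.random(dim) * (x[sheep] - x[i])
                x_tmp = np.clip(x[i] + step, x_min, x_max)  # Eq. (2.4) + bound clip
                f_tmp = objective(x_tmp)
                if f_tmp <= f[i]:                           # Step 6: greedy update
                    x[i], f[i] = x_tmp, f_tmp
    best = int(np.argmin(f))
    return x[best], f[best]

sphere = lambda p: float(np.sum(p ** 2))
xb, fb = ssoa(sphere, np.full(2, -10.0), np.full(2, 10.0), max_iter=100)
```

On a simple 2-D sphere function, 16 agents and 100 iterations are enough for the sketch to approach the origin, since the greedy rule guarantees that no agent ever worsens.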
2.3 Validation of the SSOA

In order to verify the efficiency of the new algorithm, seventeen mathematical benchmark problems, two classic engineering design problems, five truss structures, and a double-layer grid are optimized using the SSOA and compared with other algorithms. Section 2.3.1 examines the mathematical problems, Sect. 2.3.2 examines the engineering design problems, and Sect. 2.4 optimizes the trusses and the double-layer grid.
The mathematical problems, taken from Ref. [3], are given in Table 2.1. As with any other meta-heuristic algorithm, the number of agents and the maximum number of iterations should be balanced so that the algorithm converges quickly at the least computational cost. Here, the number of herds and the size of each herd are both set to 4, and the maximum number of permitted iterations is 200. The value of α0 is taken as 0.5, β0 is set to 2.4, and βmax is taken as 2.8. However, for Rosenbrock, the values of β0 and βmax are set to 2 and 3.5, respectively. The optimization results of GA [3], CPA [4], CSS [5], and the present work are compared in Table 2.2. Each mathematical function is optimized 50 times independently using the SSOA, and the mean numbers of function evaluations are reported in Table 2.2. The numbers in parentheses indicate the ratio of successful runs in which the algorithm located the global minimum with a predefined accuracy, taken as ε = f_min − f_max = 10^−4. The absence of parentheses means that the algorithm was successful in all independent runs. Table 2.2 demonstrates that the SSOA has generally performed better than the variants of GA, CPA, and CSS. Moreover, the SSOA outperforms CPA and the variants of GA on every function except BL and Rosenbrock.
The first classic engineering problem considered is the design optimization of the welded beam shown in Fig. 2.3. The aim of this problem is to minimize the fabrication cost of the welded beam subjected to constraints on the shear stress (τ), bending stress (σ), buckling load (Pc), deflection (δ), and side constraints. The design variables are the thickness of the weld h (= x1), the length of the attached part of the bar l (= x2), the height of the bar t (= x3), and the thickness of the bar b (= x4).
Table 2.1 Specification of the mathematical optimization problems

Function name: Griewank; Interval: X ∈ [−100, 100]^2; Global minimum: 0.0
f(X) = 1 + (1/200) Σ_{i=1}^{2} x_i^2 − Π_{i=1}^{2} cos(x_i/√i)

Function name: Hartman 3; Interval: X ∈ [0, 1]^3; Global minimum: −3.862782
f(X) = −Σ_{i=1}^{4} c_i exp(−Σ_{j=1}^{3} a_ij (x_j − p_ij)^2), with

a = [ 3     10  30
      0.1   10  35
      3     10  30
      0.1   10  35 ],   c = [1, 1.2, 3, 3.2], and

p = [ 0.3689   0.117   0.2673
      0.4699   0.4387  0.747
      0.1091   0.8732  0.5547
      0.03815  0.5743  0.8828 ]

Function name: Rastrigin; Interval: X ∈ [−1, 1]^2; Global minimum: −2.0
f(X) = Σ_{i=1}^{2} (x_i^2 − cos(18 x_i))

Function name: Rosenbrock; Interval: X ∈ [−30, 30]^n, n = 2; Global minimum: 0.0
f(x) = Σ_{i=1}^{n−1} [100 (x_{i+1} − x_i^2)^2 + (x_i − 1)^2]
20 2 Shuffled Shepherd Optimization Method: A New Meta-Heuristic …
g3(x) = x1 − x4 ≤ 0
g5(x) = 0.125 − x1 ≤ 0
g7(x) = P − Pc(x) ≤ 0

in which

τ(x) = √( (τ′)^2 + 2 τ′ τ″ x2/(2R) + (τ″)^2 )

τ′ = P / (√2 x1 x2),   τ″ = M R / J

M = P (L + x2/2),   R = √( x2^2/4 + ((x1 + x3)/2)^2 )

J = 2 { √2 x1 x2 [ x2^2/12 + ((x1 + x3)/2)^2 ] }

σ(x) = 6 P L / (x4 x3^2),   δ(x) = 4 P L^3 / (E x3^3 x4)

Pc(x) = (4.013 E √(x3^2 x4^6 / 36) / L^2) × (1 − (x3/(2L)) √(E/(4G)))

P = 6000 lb,   L = 14 in

0.1 ≤ x1 ≤ 2.0
0.1 ≤ x2 ≤ 10
0.1 ≤ x3 ≤ 10
0.1 ≤ x4 ≤ 2.0
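The welded beam response quantities above can be evaluated directly. The sketch below implements the listed stress, deflection, and buckling expressions; the cost function and the remaining constraints (g1, g2, g4, g6) appear earlier in the chapter and are omitted here, and the values E = 30e6 psi and G = 12e6 psi are our assumption, being the ones commonly used for this benchmark:

```python
import math

P, L, E, G = 6000.0, 14.0, 30e6, 12e6  # load, span; E and G are assumed (psi)

def welded_beam_responses(x1, x2, x3, x4):
    """Shear stress tau, bending stress sigma, deflection delta, and
    buckling load Pc for the welded beam, per the expressions above."""
    M = P * (L + x2 / 2.0)
    R = math.sqrt(x2 ** 2 / 4.0 + ((x1 + x3) / 2.0) ** 2)
    J = 2.0 * (math.sqrt(2.0) * x1 * x2 *
               (x2 ** 2 / 12.0 + ((x1 + x3) / 2.0) ** 2))
    tau_p = P / (math.sqrt(2.0) * x1 * x2)
    tau_pp = M * R / J
    tau = math.sqrt(tau_p ** 2
                    + 2.0 * tau_p * tau_pp * x2 / (2.0 * R)
                    + tau_pp ** 2)
    sigma = 6.0 * P * L / (x4 * x3 ** 2)
    delta = 4.0 * P * L ** 3 / (E * x3 ** 3 * x4)
    Pc = (4.013 * E * math.sqrt(x3 ** 2 * x4 ** 6 / 36.0) / L ** 2
          * (1.0 - x3 / (2.0 * L) * math.sqrt(E / (4.0 * G))))
    return tau, sigma, delta, Pc

# Evaluate at the SSOA optimum reported in Table 2.4
tau, sigma, delta, Pc = welded_beam_responses(0.2057296, 3.4704888,
                                              9.0366236, 0.2057297)
```

At the tabulated optimum, the shear stress sits at its 13,600 psi limit, the bending stress at its 30,000 psi limit, and the buckling load at the applied load of 6000 lb, which is consistent with g1, g2, and g7 being (near-)active there.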
Table 2.3 demonstrates that with almost all parameter combinations, the SSOA identifies near-optimal solutions. However, parameter adjustment is required to obtain the best solution. In 7 parameter combinations, the SSOA identifies the optimal solution, but the worst solutions differ; this shows which parameter combination is most appropriate for this problem. In parameter combination number 36 (α0 = 1.5, β0 = 2, βmax = 2.5), the worst solution is 1.724871, which is extremely close to the optimum value (1.724852). As a result, this parameter combination is appropriate for this problem. It can be seen from Table 2.4 that the SSOA found the minimum weight, with the constraint values g1(x) = −9.048E−07, g2(x) = −6.979E−04, g3(x) = −1.779E−08, g4(x) = −3.433, g5(x) = −0.801, g6(x) = −0.236, and g7(x) = −0.001. Therefore, g1(x), g2(x), and g3(x) play the controlling role. Table 2.5 demonstrates that the SSOA has the minimum average; the average of the SSOA is less than the best solutions of the other methods, except for MCSS [6] and IGMM [7].
Table 2.3 Results of the sensitivity analysis for the welded beam problem
No. α0 β0 βmax Best Worst Mean Std
1 0.5 1.5 2 1.960431 3.825033 2.667163 0.502
2 0.5 1.5 2.5 1.831520 3.005574 2.306228 0.308
3 0.5 1.5 3 1.724857 3.450027 2.073674 0.384
4 0.5 1.5 3.5 1.727421 2.220498 1.823992 0.124
5 0.5 1.5 4 1.724855 1.890547 1.771640 0.044
6 0.5 2 2.5 1.724852 1.994225 1.775230 0.083
7 0.5 2 3 1.724853 1.741982 1.726642 0.004
8 0.5 2 3.5 1.724856 1.753322 1.727385 0.005
9 0.5 2 4 1.724862 1.746769 1.728080 0.005
10 0.5 2.5 3 1.724854 1.726242 1.724955 2.68E-04
11 0.5 2.5 3.5 1.724872 1.726221 1.725263 3.85E-04
12 0.5 2.5 4 1.724870 1.732302 1.725973 0.002
13 0.5 3 3.5 1.724960 1.728071 1.725438 6.06E-04
14 0.5 3 4 1.725013 1.731427 1.726302 0.001
15 0.5 3.5 4 1.725214 1.744705 1.728456 0.004
16 1 1.5 2 1.724853 2.781531 2.178187 0.285
17 1 1.5 2.5 1.724852 2.570316 2.043011 0.240
18 1 1.5 3 1.724852 2.269081 1.872679 0.172
19 1 1.5 3.5 1.724853 1.907858 1.754319 0.043
20 1 1.5 4 1.724856 1.762913 1.729981 0.009
21 1 2 2.5 1.724852 1.729270 1.725003 8.06E-04
22 1 2 3 1.724853 1.725898 1.724937 2.55E-04
23 1 2 3.5 1.724861 1.725345 1.724928 9.98E-05
24 1 2 4 1.724888 1.727619 1.725423 7.03E-04
25 1 2.5 3 1.724856 1.725370 1.724949 1.01E-04
26 1 2.5 3.5 1.724891 1.727444 1.725278 5.49E-04
27 1 2.5 4 1.724890 1.728569 1.725809 8.20E-04
28 1 3 3.5 1.724954 1.727341 1.725683 6.79E-04
29 1 3 4 1.725115 1.730349 1.726995 0.001
30 1 3.5 4 1.725605 1.735186 1.728416 0.002
31 1.5 1.5 2 1.724852 2.307840 1.840313 0.162
32 1.5 1.5 2.5 1.724852 2.015799 1.750629 0.074
33 1.5 1.5 3 1.724853 1.865128 1.729554 0.026
34 1.5 1.5 3.5 1.724858 1.726011 1.724985 2.56E-04
35 1.5 1.5 4 1.724870 1.729607 1.725228 8.72E-04
36 1.5 2 2.5 1.724852 1.724871 1.724855 4.32E-06
(continued)
Table 2.4 Optimization results for the welded beam design problem
Method h(= x 1 ) l(= x 2 ) t(= x 3 ) b(= x 4 ) fcost
GA2 [8] 0.208800 3.420500 8.997500 0.210000 1.748310
ESs [9] 0.199742 3.61206 9.0375 0.206082 1.7373
RO [10] 0.203687 3.528467 9.00423 0.20724 1.735344
CDE [11] 0.203137 3.542998 9.033498 0.206179 1.733462
WOA [12] 0.205396 3.484293 9.037426 0.206276 1.730499
GA3 [13] 0.205986 3.471328 9.020224 0.206480 1.728226
CPSO [14] 0.202369 3.544214 9.04821 0.205723 1.728024
CE-CBA [15] 0.205726 3.47056 9.036630 0.20573 1.724858
IAFOA [16] 0.205726 3.470562 9.036630 0.20573 1.724856
IGMM [7] 0.205729 3.470496 9.036625 0.205730 1.724853
MCSS [6] 0.205729 3.470493 9.03662 0.20572 1.724853
SSOA [1] 0.2057296 3.4704888 9.0366236 0.2057297 1.7248524
g3(x) = −π x3^2 x4 − (4/3) π x3^3 + 1,296,000 ≤ 0
g4(x) = x4 − 240 ≤ 0
0 ≤ x1 ≤ 99
0 ≤ x2 ≤ 99
Table 2.5 Statistical results of different methods for the welded beam problem
Methods Best Mean Worst Std Dev
GA2 [8] 1.748309 1.771973 1.785835 0.011220
ESs [9] 1.737300 1.813290 1.994651 0.070500
RO [10] 1.735344 1.9083 N/A 0.173744
CDE [11] 1.733461 1.768158 1.824105 0.022194
GA3 [13] 1.728226 1.792654 1.993408 0.074713
CPSO [14] 1.728024 1.748831 1.782143 0.012926
CE-CBA [15] 1.724858 1.724858 1.724858 3.5641E-15
IAFOA [16] 1.724856 1.724856 1.724856 8.991E-07
IGMM [7] 1.724853 1.732152 1.74769 7.14E-03
MCSS [6] 1.724853 1.735438 1.753681 0.009527
SSOA [1] 1.724852 1.724855 1.724871 4.32E-06
10 ≤ x3 ≤ 200
10 ≤ x4 ≤ 200
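The pressure vessel constraints listed above can be checked directly in code. The formulas for g1 and g2 appear earlier in the chapter; the sketch restates them in their commonly used form (minimum shell and head thicknesses proportional to the radius x3), which is an assumption here, and the design values used for illustration are likewise ours, chosen close to the best weight reported in Table 2.8:

```python
import math

def pressure_vessel_constraints(x1, x2, x3, x4):
    """g <= 0 means feasible. x1, x2: thicknesses; x3: radius; x4: length."""
    g1 = -x1 + 0.0193 * x3           # assumed standard form of g1
    g2 = -x2 + 0.00954 * x3          # assumed standard form of g2
    g3 = (-math.pi * x3 ** 2 * x4
          - 4.0 / 3.0 * math.pi * x3 ** 3 + 1_296_000.0)
    g4 = x4 - 240.0
    return g1, g2, g3, g4

# Illustrative near-optimal design (our values, not from the table)
g1, g2, g3, g4 = pressure_vessel_constraints(0.7782, 0.3846, 40.3196, 200.0)
```

At such a design, g1, g2, and g3 are close to zero (near-active) while g4 = −40, which matches the pattern of constraint values discussed for Table 2.7.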
Table 2.6 demonstrates that with almost all parameter combinations the SSOA can identify solutions that are close to the optimum. In 7 combinations, the SSOA is able to determine the optimum solution, but in combination number 26 (α0 = 1, β0 = 2.5, βmax = 3.5) the difference between the worst and the best solution is only 0.611. To this end, this combination provides the best parameters for this problem. The reduced α0 and increased β0 values compared with the welded beam problem indicate that the pressure vessel problem needs less exploration and more exploitation than the welded beam problem.
Table 2.7 compares the optimization outcomes, whereas Table 2.8 compares the statistical outcomes of the current study with those of previous optimization techniques.
Table 2.6 Results of the sensitivity analysis for the pressure vessel problem
No. α0 β0 βmax Best Worst Mean Std
1 0.5 1.5 2 5937.9221 9678.2863 6784.5424 937.052
2 0.5 1.5 2.5 6105.8884 7318.993 6642.038 392.250
3 0.5 1.5 3 5960.321 7407.2313 6589.6663 459.588
4 0.5 1.5 3.5 5957.5773 7167.1373 6272.9405 340.698
5 0.5 1.5 4 5887.6915 6737.8591 6074.1398 212.064
6 0.5 2 2.5 5885.5534 7110.4809 6058.9547 279.313
7 0.5 2 3 5885.4021 6538.2835 5942.1205 136.0747
8 0.5 2 3.5 5885.3516 6000.4418 5898.0198 25.609
9 0.5 2 4 5885.4596 5916.5175 5890.7515 8.903
10 0.5 2.5 3 5885.3417 5893.1437 5886.7253 1.869
11 0.5 2.5 3.5 5885.3491 5902.4701 5887.0622 3.170
12 0.5 2.5 4 5885.3328 8834.4409 5985.0822 538.170
13 0.5 3 3.5 5885.3318 8834.4091 5984.3788 538.288
14 0.5 3 4 5885.3374 8620.3227 6250.4736 9.453
15 0.5 3.5 4 5885.345 8834.4091 6264.8624 983.313
16 1 1.5 2 5987.6779 7305.9870 6403.9379 347.046
17 1 1.5 2.5 5885.8300 7283.3699 6304.6366 373.328
18 1 1.5 3 5885.6348 6940.9005 6113.3912 283.922
19 1 1.5 3.5 5885.3666 6299.4091 5940.4521 89.152
20 1 1.5 4 5885.5543 6291.8489 5920.6347 74.624
21 1 2 2.5 5885.3290 6043.9412 5894.7704 34.052
22 1 2 3 5885.3278 5890.2633 5885.7295 0.957
23 1 2 3.5 5885.3290 5896.7648 5886.4192 2.412
24 1 2 4 5885.3262 5890.5371 5885.7512 1.010
25 1 2.5 3 5885.3265 8893.5559 6076.9052 729.349
26 1 2.5 3.5 5885.3258 5885.9368 5885.4415 0.178
27 1 2.5 4 5885.3285 5889.6133 5885.5488 0.774
28 1 3 3.5 5885.3268 8627.6884 5976.8349 500.666
29 1 3 4 5885.3364 8893.5559 6359.7345 107.963
30 1 3.5 4 5885.3285 8893.5559 6359.5763 107.957
31 1.5 1.5 2 5885.3279 7282.0770 6114.4404 340.909
32 1.5 1.5 2.5 5885.3269 6730.1070 5966.3242 170.228
33 1.5 1.5 3 5885.3261 5986.5065 5899.7867 27.082
34 1.5 1.5 3.5 5885.3265 5900.9876 5886.7982 3.374
35 1.5 1.5 4 5885.3258 5929.4165 5887.9769 8.8108
36 1.5 2 2.5 5885.3258 6103.6344 5893.6254 39.922
(continued)
Table 2.7 shows that the SSOA finds better results than the other considered algorithms, with constraint values g1(x) = −2.65E−07, g2(x) = −5.87E−06, g3(x) = −33.33, and g4(x) = −40.00. This indicates that g1(x) and g2(x) have controlled the optimization progress more than the other constraints. Table 2.8 shows that the SSOA has the smallest average and standard deviation in comparison with the other algorithms.
2.4 Numerical Examples

In this section, six numerical examples are provided to examine the performance of the SSOA in the optimum design of structures. These examples are divided into three categories. The first two examples are the size and shape optimization of truss structures; the third and fourth examples are the size optimization of truss structures with frequency constraints; and the last two examples are the size optimization of truss structures with stress and displacement constraints. The results obtained for each example are compared with those of other optimization techniques. All numerical examples are run 30 times independently to provide statistically meaningful results. The parameter settings of the SSOA and the iteration limits for the numerical examples are listed in Table 2.9.

Table 2.8 Statistical results of different methods for the pressure vessel problem

Methods Best Mean Worst Std Dev
GA [17] 6059.95 6177.25 6469.32 130.9297
CPSO [14] 6061.08 6147.13 6363.8 86.4545
ESs [9] 6059.75 6850 7332.88 426
CSS [5] 6059.09 6067.91 6085.48 10.2564
CDE [11] 6059.73 6085.23 6371.05 43
IACO [18] 6059.7258 6081.7812 6150.1289 67.2418
CE-CBA [15] 6059.7143 6099.9218 6336.3404 104.25721
IGMM [7] 6059.7143 6060.1598 6061.2868 0.5421
MCSS [6] 6058.623 6073.5931 6108.4579 24.6712
LWOA [19] 5893.339 6223.765 7070.343 418.7902
SSOA [1] 5885.3258 5885.4415 5885.9368 0.178

Table 2.9 Parameter settings of the SSOA for the truss optimization problems

Problem α0 β0 βmax Number of herds Size of herds Maximum iteration number
25-bar spatial truss 0.5 2.4 2.6 4 4 300
47-bar planar truss 0.5 2 2.3 4 5 1100
72-bar spatial truss 0.5 2.3 2.6 4 5 1000
120-bar dome truss 0.5 2.3 2.6 4 5 1000
272-bar transmission tower 0.5 2.0 2.3 4 5 700
1016-bar double-layer grid 0.5 2.3 2.7 4 5 600
The first example is the layout optimization of the 25-bar spatial truss shown in Fig. 2.5. The optimization problem includes 13 design variables: 8 discrete sizing variables for the cross-sectional areas and 5 continuous layout variables for the nodal coordinates. All members are subjected to a stress limitation of ±40 ksi, and the nodal displacements in all directions are limited to ±0.35 in. The optimization variables and input data of this truss are provided in Table 2.10.
Table 2.11 compares the result obtained by the SSOA with those of the other methods. According to this table, the SSOA has found the solution with the smallest number of analyses among the considered algorithms. This shows that the SSOA can easily escape from local optima and converge to the optimum solution. The average weight and standard deviation for 30 independent runs of the SSOA are 122.4073 lb and 6.3443 lb, respectively. The optimum layout found by the SSOA is shown in Fig. 2.6. Convergence curves for the best result and the mean performance of the 30 independent runs for the 25-bar spatial truss are shown in Fig. 2.7.
The 47-bar planar truss shown in Fig. 2.8 has been optimized by different researchers for the three load cases given in Table 2.12. The optimization problem includes 44 design variables: 27 discrete sizing variables for the cross-sectional areas and 17 continuous layout variables for the nodal coordinates. All members are subjected to stress limitations of 20 ksi in tension and 15 ksi in compression. The Euler buckling stress for compression members (the buckling strength of the ith element) is set to 3.96EA/L^2, and there is no limitation on the nodal displacements. The optimization variables and input data of the truss are given in Table 2.12.
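The member buckling limit just stated (buckling strength 3.96EA/L^2 for the ith element) can be checked per member as sketched below. This is illustrative only: the sign convention (compression negative) and the default E = 30,000 ksi are our assumptions, not taken from the example data:

```python
def buckling_ok(force, area, length, E=30_000.0):
    """True if a member satisfies the Euler limit sigma <= 3.96*E*A/L**2.
    force in kips (compression negative), area in in^2, length in in, E in ksi."""
    if force >= 0.0:                          # tension member: no buckling check
        return True
    sigma = abs(force) / area                 # compressive stress
    sigma_allow = 3.96 * E * area / length ** 2
    return sigma <= sigma_allow
```

Note that the allowable stress grows with the area, so a compression member can be made buckling-feasible either by shortening it or by enlarging its cross-section.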
The comparison of the optimal design found by this work with the optimum designs obtained by Salajegheh and Vanderplaats [26], Hasançebi and Erbatur [27, 28], and Panagant and Bureerat [29] is provided in Table 2.13. It can be seen that the SSOA found the lightest weight (1869.876 lb) in a smaller number of analyses (20,020), with the average and standard deviation being 1929.91 lb and 29.55 lb, respectively. The optimum layout found by the SSOA is shown in Fig. 2.9. Figure 2.10 shows the convergence curves for the best result and the mean performance of the 30 independent runs for the 47-bar planar truss.
The third example is the 72-bar spatial truss with frequency constraints. The structural members are divided into 16 groups, and their cross-sectional areas are selected from the range [0.645, 4] cm^2. The material density and elastic modulus are 2767.99 kg/m^3 and 68.95 GPa, respectively.

Fig. 2.6 Comparison of optimized layout for the 25-bar spatial truss

Fig. 2.7 Convergence histories of the optimization for the 25-bar spatial truss

As shown in Fig. 2.11, the nonstructural masses are added to the last-story nodes. There are two frequency constraints: the first frequency must be 4 Hz, and the minimum value of the third frequency is 6 Hz.
The optimized designs found by the standard CSS [30], enhanced CSS [30], HS [31], CBO [32], CS [33], WEO [33], CPA [33], and SSOA are compared in Table 2.14. The SSOA has found better results than the other methods. Additionally, the statistical results obtained using the SSOA are better than those of the other considered methods. The first five natural frequencies of the optimum designs are given in Table 2.15. According to this table, the frequency constraints are satisfied by all of the methods. The convergence history of the SSOA is given in Fig. 2.12.
The 120-bar dome truss is the second example with frequency constraints considered in this chapter. The members are divided into 7 groups, as shown in Fig. 2.13, and their cross-sectional areas vary between 1 and 129.3 cm^2. The material density is 7971.810 kg/m^3, and the modulus of elasticity is 210 GPa. A 3000 kg nonstructural mass is added at node 1, 500 kg nonstructural masses are added at nodes 2–13, and 100 kg nonstructural masses are added to the remaining nodes. The frequency constraints are as follows:
The comparison of the results of the SSOA with the other optimization methods is provided in Table 2.16. The best weight, 8707.32 kg, is found by the SSOA. The SSOA also finds the minimum average weight. In terms of standard deviation, the best result is obtained by the OMGSA, and the SSOA takes second place. According to Table 2.17, all of the constraints are satisfied at the optimum weight found by the SSOA. The convergence history of the SSOA is given in Fig. 2.14.
The 272-bar transmission tower presented by Kaveh and Massoudi [37] is shown in Fig. 2.15. All nodal coordinates and the end nodes of the members are presented in Ref. [37]. The members are divided into 28 groups because of symmetry, as shown in Fig. 2.15. In this chapter, 11 load cases are imposed based on the basic load case, as shown in Table 2.18. The displacement of nodes 1, 2, 11, 20, and 29 is limited to 20 mm in the Z-direction and to 100 mm in the X- and Y-directions. The modulus of elasticity is 2 × 10^8 kN/m^2, and the maximum available stress for all members is ±275,000 kN/m^2.
Fig. 2.10 Convergence histories of the optimization for the 47-bar planar truss
The optimum volume found by the SSOA is presented in Table 2.19. The SSOA found the optimum volume after 14,020 analyses. The maximum stress ratio is 0.7639, which occurs in element 169 under load case 10, and the maximum displacement is −20 mm, at node 11 in the Z-direction under load case 3. The displacements of nodes 1, 2, 11, 20, and 29 are shown in Fig. 2.16. Figure 2.17 shows the convergence curves for the best result and the mean performance of the 30 independent runs for the 272-bar transmission tower.
Table 2.15 Natural frequencies (Hz) evaluated at the optimum designs of the 72-bar spatial truss

Frequency number: Standard CSS [30], Enhanced CSS [30], HS [31], CBO [32], CS [33], WEO [33], CPA [33], Present work (SSOA)
1 4.000 4.000 4.0000 4.000 4.0003 4.0000 4.0000 4.0000
2 4.000 4.000 4.0000 4.000 4.0003 4.0000 4.0000 4.0000
3 6.006 6.004 6.0000 6.000 6.0001 6.0002 6.0000 6.0000
4 6.210 6.155 6.2723 6.267 6.2502 6.2614 6.2696 6.2452
5 6.684 8.390 9.0749 9.101 9.0143 9.0780 9.0981 9.0761
Fig. 2.12 Convergence histories of the optimization for the 72-bar spatial truss
where pu is the required strength; pr is the nominal axial strength; Ag is the gross area of the member; Ae is the effective net area; Fy is the specified minimum yield stress; and Fu is the specified minimum tensile strength.

Compression member constraint:

pu ≤ pr;   pr = φc Fcr Ag;   φc = 0.9
Fcr = (0.658^(Fy/Fe)) Fy,   if KL/r ≤ 4.71 √(E/Fy)

Fcr = 0.877 Fe,             if KL/r > 4.71 √(E/Fy)

Fe = π^2 E / (KL/r)^2

where Fe is the elastic buckling stress; Fcr is the critical stress; E is the modulus of elasticity; L is the laterally unbraced length of the member; r is the radius of gyration; and K is the effective length factor, taken equal to 1.
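These compressive strength expressions translate directly into code; the sketch below assumes consistent units (here ksi) and K = 1 as stated, and the numerical inputs in the usage line are our own illustrative values:

```python
import math

def critical_stress(Fy, E, KL_over_r):
    """Critical stress Fcr for a compression member, per the piecewise
    expression above (inelastic vs. elastic buckling)."""
    Fe = math.pi ** 2 * E / KL_over_r ** 2        # elastic buckling stress
    if KL_over_r <= 4.71 * math.sqrt(E / Fy):     # inelastic branch
        return (0.658 ** (Fy / Fe)) * Fy
    return 0.877 * Fe                             # elastic branch

def design_strength(Fy, E, KL_over_r, Ag, phi_c=0.9):
    """pr = phi_c * Fcr * Ag."""
    return phi_c * critical_stress(Fy, E, KL_over_r) * Ag

# Illustrative check: Fy = 36 ksi, E = 29,000 ksi
pr = design_strength(36.0, 29000.0, 50.0, Ag=10.0)
```

For Fy = 36 ksi and E = 29,000 ksi, the branch boundary falls at KL/r ≈ 133.7, so a slenderness of 50 uses the inelastic expression while a slenderness of 200 uses the elastic one.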