Intelligent Systems Reference Library 160

Erik Cuevas
Fernando Fausto
Adrián González

New Advancements in Swarm Algorithms: Operators and Applications
Intelligent Systems Reference Library

Volume 160

Series Editors
Janusz Kacprzyk, Polish Academy of Sciences, Warsaw, Poland

Lakhmi C. Jain, Faculty of Engineering and Information Technology, Centre for Artificial Intelligence, University of Technology, Sydney, NSW, Australia;
Faculty of Science, Technology and Mathematics, University of Canberra,
Canberra, ACT, Australia;
KES International, Shoreham-by-Sea, UK;
Liverpool Hope University, Liverpool, UK
The aim of this series is to publish a Reference Library, including novel advances
and developments in all aspects of Intelligent Systems in an easily accessible and
well structured form. The series includes reference works, handbooks, compendia,
textbooks, well-structured monographs, dictionaries, and encyclopedias. It contains
well integrated knowledge and current information in the field of Intelligent
Systems. The series covers the theory, applications, and design methods of
Intelligent Systems. Virtually all disciplines such as engineering, computer science,
avionics, business, e-commerce, environment, healthcare, physics and life science
are included. The list of topics spans all the areas of modern intelligent systems
such as: Ambient intelligence, Computational intelligence, Social intelligence,
Computational neuroscience, Artificial life, Virtual society, Cognitive systems,
DNA and immunity-based systems, e-Learning and teaching, Human-centred
computing and Machine ethics, Intelligent control, Intelligent data analysis,
Knowledge-based paradigms, Knowledge management, Intelligent agents,
Intelligent decision making, Intelligent network security, Interactive entertainment,
Learning paradigms, Recommender systems, Robotics and Mechatronics including
human-machine teaming, Self-organizing and adaptive systems, Soft computing
including Neural systems, Fuzzy systems, Evolutionary computing and the Fusion
of these paradigms, Perception and Vision, Web intelligence and Multimedia.
** Indexing: The books of this series are submitted to ISI Web of Science,
SCOPUS, DBLP and Springerlink.

More information about this series at http://www.springer.com/series/8578


Erik Cuevas • Fernando Fausto • Adrián González

New Advancements
in Swarm Algorithms:
Operators and Applications

Erik Cuevas
CUCEI, Universidad de Guadalajara
Guadalajara, Mexico

Fernando Fausto
CUCEI, Universidad de Guadalajara
Guadalajara, Mexico

Adrián González
CUCEI, Universidad de Guadalajara
Guadalajara, Mexico

ISSN 1868-4394   ISSN 1868-4408 (electronic)
Intelligent Systems Reference Library
ISBN 978-3-030-16338-9 ISBN 978-3-030-16339-6 (eBook)
https://doi.org/10.1007/978-3-030-16339-6

Library of Congress Control Number: 2019935485

© Springer Nature Switzerland AG 2020


This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part
of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations,
recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission
or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar
methodology now known or hereafter developed.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this
publication does not imply, even in the absence of a specific statement, that such names are exempt from
the relevant protective laws and regulations and therefore free for general use.
The publisher, the authors and the editors are safe to assume that the advice and information in this
book are believed to be true and accurate at the date of publication. Neither the publisher nor the
authors or the editors give a warranty, express or implied, with respect to the material contained herein or
for any errors or omissions that may have been made. The publisher remains neutral with regard to
jurisdictional claims in published maps and institutional affiliations.

This Springer imprint is published by the registered company Springer Nature Switzerland AG
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
Preface

The most common term for methods that employ stochastic schemes to produce
search strategies is metaheuristics. In general, there is no strict classification of these methods. However, several kinds of algorithms have been distinguished depending on criteria such as the source of inspiration, the cooperation among agents, or the type of operators.
Among metaheuristic methods, there is a special set of approaches designed around the interaction among the search agents of a group. Members of the group cooperate to solve a global objective by using locally accessible knowledge that is propagated through the set of members. With this mechanism, complex problems can be solved more efficiently than by following the strategy of a single individual. In general terms, this group is referred to as a swarm,
where social agents interact with each other in a direct or indirect manner by using
local information from the environment. This cooperation among agents produces
an effective distributive strategy to solve problems. Swarm intelligence (SI)
represents a problem-solving methodology that results from the cooperation among
a set of agents with similar characteristics. During this cooperation, local behaviors
of simple elements produce the existence of complex collective patterns.
The study of biological entities such as animals and insects which manifest a
social behavior has produced several computational models of swarm intelligence.
Some examples include ants, bees, locust swarms, spiders and bird flocks. In the
swarm, each agent maintains a simple strategy. However, due to its social behavior,
the final collective strategy produced by all agents is usually very complex. The
complex operation of a swarm is a consequence of the cooperative behavior among
the agents generated during their interaction.
The complex operation of the swarm cannot be reduced to the aggregation of the behaviors of each agent in the group. The association of all simple agent behaviors is so complex that it is usually not easy to predict or deduce the global behavior of the whole swarm. This concept is known as emergence. It refers to the process of producing complex behavioral patterns from the interaction of simple and unsophisticated strategies. Something remarkable is that these behavioral patterns appear
without the existence of a coordinated control system but emerge from the

exchange of local information among agents. Therefore, there is a close relationship between individual and collective behavior. In general, the collective
behavior of agents determines the behavior of the swarm. On the other hand, swarm
behavior is also strongly influenced by the conditions under which each agent
executes its operations.
The operations of each agent can modify its own behavior and the behavior of
other neighboring agents, which also alters the global swarm performance. Under such conditions, the most significant element of swarm intelligence is the model of interaction or cooperation among the agents. Cooperation in biological entities that operate as swarm systems happens through different mechanisms, of which social interaction is the most important. This social interaction can be conducted through physical contact, visual information, audio messages, or chemical perceptual inputs. Examples of cooperation models in nature are numerous: the dynamic task assignment performed in an ant colony without any central control or task coordination; the adoption of optimal spatial patterns built by self-organization in bird flocks and fish schools; and the hunting strategies developed by predators. The purpose of computational swarm intelligence schemes is to model the simple behaviors of agents and their local interactions with other neighboring agents to perform an effective search strategy for solving optimization problems.
One example is the particle swarm optimization (PSO) which models two simple
actions. Each agent (1) moves toward the best agent of the swarm and (2) moves
toward the position where the agent has reached its best location. As a consequence, the collective behavior of the swarm causes all agents to be attracted to the best positions experienced by the swarm. Another example is the ant colony optimization (ACO), which models the biological pheromone-trail-following behavior of
ants. Under this mechanism, each ant senses pheromone concentrations in its local
position. Then, it probabilistically selects the path with the highest pheromone
concentration. Considering this model, the collective effect in the swarm is to find
the best option (shortest path) from a group of alternatives available in a
decision-making problem.
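As a rough illustration of how the two PSO rules just mentioned are usually written, the following Python sketch (an added illustration, not code from this book) shows a standard PSO velocity and position update for a single particle; the inertia weight w and the acceleration coefficients c1 and c2 are conventional PSO parameters whose values here are only illustrative assumptions.

```python
import random

def pso_update(x, v, p_best, g_best, w=0.7, c1=1.5, c2=1.5):
    """One standard PSO step for a single particle (illustrative sketch).

    x, v    : current position and velocity (lists of floats)
    p_best  : best position found so far by this particle (rule 2)
    g_best  : best position found so far by the whole swarm (rule 1)
    """
    new_x, new_v = [], []
    for xi, vi, pi, gi in zip(x, v, p_best, g_best):
        r1, r2 = random.random(), random.random()
        vi = w * vi + c1 * r1 * (pi - xi) + c2 * r2 * (gi - xi)
        new_v.append(vi)
        new_x.append(xi + vi)
    return new_x, new_v
```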
There exist several features that clearly appear in most of the metaheuristic and
swarm approaches, such as the use of diversification to force the exploration of
regions of the search space, rarely visited until now, and the use of intensification or
exploitation, to investigate thoroughly some promising regions. Another interesting
feature is the use of memory to store the best solutions encountered. For these
reasons, metaheuristics and swarm methods quickly became popular amongst
researchers to solve from simple to complex optimization problems in different
areas.
Most of the problems in science, engineering, economics, and life can be
translated as an optimization or a search problem. According to their characteristics, some problems are simple enough to be solved by traditional optimization methods based on mathematical analysis. However, most problems of practical importance, such as system identification, parameter estimation, and energy systems, represent conflicting scenarios that are very hard to solve by
using traditional approaches. Under such circumstances, metaheuristic and swarm algorithms have emerged as the best alternative to solve this kind of complex
formulations. Therefore, swarm techniques have consolidated as a very active
research subject in the last ten years. During this time, various new swarm
approaches have been introduced. They have been experimentally examined on a
set of artificial benchmark problems and in a large number of practical applications.
Although metaheuristic and swarm methods represent one of the most exploited
research paradigms in computational intelligence, there are a large number of open
challenges in the area of swarm intelligence. They range from premature conver-
gence, inability to maintain population diversity and the combination of swarm
paradigms with other algorithmic schemes, toward extending the available tech-
niques to tackle ever more difficult problems.
Numerous books have been published taking into account the most widely known swarm methods, namely ant colony algorithms and particle swarm optimization, but attempts to discuss new alternative approaches are scarce. Initial swarm schemes maintain in their design several limitations, such as premature convergence and an inability to maintain population diversity. Recent swarm methods have addressed these difficulties, providing in general better results. Many of these novel swarm approaches have been introduced only lately. In general, they propose new and innovative cooperation models for producing an adequate exploration and exploitation of large search spaces with a significant number of dimensions. Most of the new swarm metaheuristics present promising results. Nevertheless, they are still in their initial stage. To grow and
attain their complete potential, new swarm methods must be applied in a great
variety of problems and contexts, so that they not only perform well in their reported sets of optimization problems, but also in new complex formulations. The
only way to accomplish this is by making possible the transmission and presen-
tation of these methods in different technical areas as optimization tools. In general, once a scientist, engineer, or practitioner recognizes a problem as a particular instance of a more generic class, he/she can select one of the different swarm algorithms that guarantee an expected optimization performance. Unfortunately, the set of options is concentrated in algorithms whose popularity and high proliferation exceed those of the new developments.
The excessive publication of developments based on the simple modification of
popular swarm methods presents an important disadvantage: it forgoes the opportunity to discover new techniques and procedures which can be useful to solve problems formulated by the academic and industrial communities. In recent years, several promising swarm schemes that consider very interesting concepts and
operators have been introduced. However, they seem to have been completely
overlooked in the literature, in favor of the idea of modifying, hybridizing, or
restructuring popular swarm approaches.
The goal of this book is to present advances that discuss new alternative swarm
developments which have proved to be effective in their application to several
complex problems. The book considers different new metaheuristic methods and
their practical applications. This structure is important to us, because we recognize
this methodology as the best way to assist researchers, lecturers, engineers, and
practitioners in the solution of their own optimization problems.
This book has been structured so that each chapter can be read independently
from the others. Chapter 1 describes the main characteristics and properties of
metaheuristic and swarm methods. This chapter analyses the most important con-
cepts of metaheuristic and swarm schemes.
Chapter 2 discusses the performance and main applications of each metaheuristic
and swarm method in the literature. The idea is to establish the strengths and weaknesses of each traditional scheme from a practical perspective.
The first part of the book, which involves Chaps. 3, 4, 5, and 6, presents recent swarm algorithms, their operators, and their characteristics. In Chap. 3, an interesting
swarm optimization algorithm called the Selfish Herd Optimizer (SHO) is presented
for solving global optimization problems. SHO is based on the simulation of the
widely observed selfish herd behavior manifested by individuals within a herd of
animals subjected to some form of predation risk. In SHO, individuals emulate the
predatory interactions between groups of prey and predators by two types of search
agents: the members of a selfish herd (the prey) and a pack of hungry predators.
Depending on their classification as either a prey or a predator, each individual is
conducted by a set of unique evolutionary operators inspired by such prey–predator
relationship. These unique traits allow SHO to improve the balance between
exploration and exploitation without altering the population size. The experimental
results show the remarkable performance of our proposed approach against those
of the other compared methods, and as such SHO is proven to be an excellent
alternative to solve global optimization problems.
Chapter 4 considers a recent swarm algorithm called the Social Spider
Optimization (SSO) for solving optimization tasks. The SSO algorithm is based on
the simulation of cooperative behavior of social spiders. In the proposed algorithm,
individuals emulate a group of spiders which interact with each other based on the
biological laws of the cooperative colony. The algorithm considers two different
search agents (spiders): males and females. Depending on gender, each individual is
conducted by a set of different evolutionary operators which mimic different
cooperative behaviors that are typically found in the colony. In order to illustrate the
proficiency and robustness of the proposed approach, it is compared to other
well-known evolutionary methods. The comparison examines several standard
benchmark functions that are commonly considered within the literature of evo-
lutionary algorithms. The outcome shows a high performance of the proposed
method for searching a global optimum with several benchmark functions.
In Chap. 5, a swarm algorithm called Locust Search (LS) is presented for solving
optimization tasks. The LS algorithm is based on the simulation of the behavior
presented in swarms of locusts. In the proposed algorithm, individuals emulate a
group of locusts which interact with each other based on the biological laws of the
cooperative swarm. The algorithm considers two different behaviors: solitary and
social. Depending on the behavior, each individual is conducted by a set of evo-
lutionary operators which mimic the different cooperative behaviors that are typi-
cally found in the swarm. In order to illustrate the proficiency and robustness of the
proposed approach, it is compared to other well-known evolutionary methods. The comparison examines several standard benchmark functions that are commonly
considered within the literature of evolutionary algorithms. The outcome shows a
high performance of the proposed method for searching a global optimum with
several benchmark functions.
Chapter 6 presents an algorithm for global optimization called the collective
animal behavior (CAB). Animal groups, such as schools of fish, flocks of birds,
swarms of locusts, and herds of wildebeest, exhibit a variety of behaviors including
swarming about a food source, milling around a central location, or migrating over
large distances in aligned groups. These collective behaviors are often advanta-
geous to groups, allowing them to increase their harvesting efficiency, to follow
better migration routes, to improve their aerodynamics, and to avoid predation. In the
presented swarm algorithm, the searcher agents emulate a group of animals which
interact with each other based on the biological laws of collective motion. The
method has been compared to other well-known optimization algorithms. The
results show good performance of the proposed method when searching for a global
optimum of several benchmark functions.
The second part of the book, which involves Chaps. 7, 8, and 9, presents the use of recent swarm algorithms in different domains. The idea is to show the potential of new alternative swarm algorithms from a practical perspective.
In Chap. 7, an algorithm for the optimal parameter calibration of fractional fuzzy
controllers (FCs) is presented. Fuzzy controllers (FCs) based on integer schemes
have demonstrated their performance in an extensive variety of applications.
However, several dynamic systems can be more accurately controlled by fractional
controllers. Under such conditions, there is currently an increasing interest in
generalizing the design of FCs with fractional operators. In the design stage of
fractional FCs, the parameter calibration process is transformed into a multidi-
mensional optimization problem where fractional orders as well as controller
parameters of the fuzzy system are considered as decision variables. To determine
the parameters, the proposed method uses the swarm method called Social Spider
Optimization (SSO) which is inspired by the emulation of the collaborative
behavior of social spiders. In SSO, solutions imitate a set of spiders which cooperate with each other based on the natural laws of the cooperative colony. Different from most existing evolutionary algorithms, it explicitly avoids the concentration
of individuals in the best positions, avoiding critical flaws such as the premature
convergence to suboptimal solutions and the limited exploration–exploitation bal-
ance. Numerical simulations have been conducted on several plants to show the
effectiveness of the proposed scheme.
Chapter 8 presents an algorithm for the automatic selection of pixel classes for
image segmentation. The presented method combines a swarm method with the
definition of a new objective function that appropriately evaluates the segmentation
quality with respect to the number of classes. The employed swarm algorithm is the
Locust Search (LS), which is based on the behavior of swarms of locusts. Different from most existing evolutionary algorithms, it explicitly avoids the concentration of individuals in the best positions, avoiding critical flaws such as the
premature convergence to suboptimal solutions and the limited exploration–exploitation balance. Experimental tests over several benchmark functions and
images validate the efficiency of the proposed technique with regard to accuracy
and robustness.
Chapter 9 presents an algorithm for the automatic detection of circular shapes
embedded into cluttered and noisy images without considering conventional Hough
transform techniques. The approach is based on a swarm technique known as the
collective animal behavior (CAB). In CAB, searcher agents emulate a group of
animals which interact with each other based on simple biological laws that are
modeled as swarm operators. The approach uses the encoding of three non-collinear
points embedded into an edge-only image as candidate circles. Guided by the
values of the objective function, the set of encoded candidate circles (charged
particles) are evolved using the CAB algorithm so that they can fit into actual
circular shapes over the edge-only map of the image. Experimental evidence from
several tests on synthetic and natural images which provide a varying range of
complexity validates the efficiency of our approach regarding accuracy, speed, and
robustness.
Finally, in Chap. 10, the swarm optimization algorithm Locust Search (LS) is applied to a template-matching scheme. In the approach, the LS method is considered as a search strategy in order to find the pattern that best matches the original image. According to a series of experiments, LS achieves the best trade-off between estimation accuracy and computational load.
As authors, we wish to thank many people who were somehow involved in the
writing process of this book. We express our gratitude to Prof. Lakhmi C. Jain, who
so warmly sustained this project. Acknowledgments also go to Dr. Thomas
Ditzinger and Varsha Prabakaran, who so kindly agreed to its appearance.

Guadalajara, Mexico
Erik Cuevas
Fernando Fausto
Adrián González
Contents

1 An Introduction to Nature-Inspired Metaheuristics
and Swarm Methods . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.1 Optimization Techniques: A Brief Summary . . . . . . . . . . . . . . . 1
1.2 The Rise of Nature-Inspired Metaheuristics . . . . . . . . . . . . . . . 3
1.3 General Framework of a Nature-Inspired Metaheuristic . . . . . . . 4
1.4 Classification of Nature-Inspired Metaheuristics . . . . . . . . . . . . 5
1.4.1 Evolution-Based Methods . . . . . . . . . . . . . . . . . . . . . . 6
1.4.2 Swarm-Based Methods . . . . . . . . . . . . . . . . . . . . . . . . 10
1.4.3 Physics-Based Methods . . . . . . . . . . . . . . . . . . . . . . . 26
1.4.4 Human-Based Methods . . . . . . . . . . . . . . . . . . . . . . . . 32
1.5 Nature-Inspired Metaheuristics on the Literature . . . . . . . . . . . . 36
1.6 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
2 Metaheuristics and Swarm Methods: A Discussion
on Their Performance and Applications . . . . . . . . . . . . . . . . . . . . . 43
2.1 On the Performance of Nature-Inspired Metaheuristics . . . . . . . 43
2.1.1 Computational Complexity . . . . . . . . . . . . . . . . . . . . . 44
2.1.2 Memory Efficiency . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
2.1.3 Exploration Versus Exploitation . . . . . . . . . . . . . . . . . 46
2.1.4 Implementation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
2.2 Nature-Inspired Metaheuristics and Their Applications . . . . . . . 50
2.2.1 Engineering Design . . . . . . . . . . . . . . . . . . . . . . . . . . 50
2.2.2 Digital Image Processing and Computer Vision . . . . . . 51
2.2.3 Networks and Communications . . . . . . . . . . . . . . . . . . 51
2.2.4 Power and Energy Management . . . . . . . . . . . . . . . . . 52
2.2.5 Data Analysis and Machine Learning . . . . . . . . . . . . . 53
2.2.6 Robotics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
2.2.7 Medical Diagnosis . . . . . . . . . . . . . . . . . . . . . . . . . . . 56


2.3 Nature-Inspired Metaheuristics: Research Gaps
and Future Directions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
2.4 Concluding Remarks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
3 The Selfish Herd Optimizer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
3.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
3.2 The Selfish Herd Theory . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
3.3 The Selfish Herd Optimizer Algorithm . . . . . . . . . . . . . . . . . . . 72
3.3.1 Initializing the Population . . . . . . . . . . . . . . . . . . . . . . 72
3.3.2 Survival Value Assignation . . . . . . . . . . . . . . . . . . . . . 73
3.3.3 Structure of a Selfish Herd . . . . . . . . . . . . . . . . . . . . . 74
3.3.4 Herd Movement Operators . . . . . . . . . . . . . . . . . . . . . 79
3.3.5 Predators Movement Operators . . . . . . . . . . . . . . . . . . 85
3.3.6 Predation Phase . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
3.3.7 Restoration Phase . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89
3.4 Summary of the SHO Algorithm . . . . . . . . . . . . . . . . . . . . . . . 93
3.5 Discussion About the SHO Algorithm . . . . . . . . . . . . . . . . . . . 96
3.6 Comparative Experiments and Results . . . . . . . . . . . . . . . . . . . 97
3.7 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 108
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 108
4 The Swarm Method of the Social-Spider . . . . . . . . . . . . . . . . . . . . 111
4.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 111
4.2 Biological Concepts . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 113
4.3 The SSO Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 115
4.3.1 Fitness Assignation . . . . . . . . . . . . . . . . . . . . . . . . . . . 116
4.3.2 Modeling of the Vibrations Through
the Communal Web . . . . . . . . . . . . . . . . . . . . . . . . . . 116
4.3.3 Initializing the Population . . . . . . . . . . . . . . . . . . . . . . 117
4.3.4 Cooperative Operators . . . . . . . . . . . . . . . . . . . . . . . . 118
4.3.5 Mating Operator . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 120
4.3.6 Computational Procedure . . . . . . . . . . . . . . . . . . . . . . 123
4.3.7 Discussion About the SSO Algorithm . . . . . . . . . . . . . 124
4.4 Experimental Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 125
4.4.1 Performance Comparison to Other Metaheuristic
Algorithms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 126
4.5 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 129
Appendix: List of Benchmark Functions . . . . . . . . . . . . . . . . . . . . . . 131
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 136
5 The Locust Swarm Optimization Algorithm . . . . . . . . . . . . . . . . . . 139
5.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 139
5.2 Biological Fundamentals and Mathematical Models . . . . . . . . . 141

5.2.1 Solitary Phase . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 142


5.2.2 Social Phase . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 143
5.3 The Locust Search (LS) Algorithm . . . . . . . . . . . . . . . . . . . . . 144
5.3.1 Solitary Operation (A) . . . . . . . . . . . . . . . . . . . . . . . . 145
5.3.2 Social Operation (B) . . . . . . . . . . . . . . . . . . . . . . . . . . 147
5.3.3 Complete LS Algorithm . . . . . . . . . . . . . . . . . . . . . . . 149
5.3.4 Discussion About the LS Algorithm . . . . . . . . . . . . . . 150
5.4 Experimental Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 151
5.4.1 Performance Comparison . . . . . . . . . . . . . . . . . . . . . . 151
5.5 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 154
Appendix: List of Benchmark Functions . . . . . . . . . . . . . . . . . . . . . . 155
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 158
6 A Swarm Algorithm Inspired by the Collective
Animal Behavior . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 161
6.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 161
6.2 Biological Fundamentals . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 163
6.3 Collective Animal Behavior Algorithm (CAB) . . . . . . . . . . . . . 164
6.3.1 Description of the CAB Algorithm . . . . . . . . . . . . . . . 164
6.4 Experimental Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 168
6.4.1 Effect of the CAB Parameters . . . . . . . . . . . . . . . . . . . 168
6.4.2 Performance Comparison . . . . . . . . . . . . . . . . . . . . . . 169
6.5 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 185
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 185
7 Auto-calibration of Fractional Fuzzy Controllers
by Using the Swarm Social-Spider Method . . . . . . . . . . . . . . . . . . . 189
7.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 189
7.2 Fractional-Order Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 191
7.2.1 Fractional Calculus . . . . . . . . . . . . . . . . . . . . . . . . . . . 191
7.2.2 Approximation of Fractional Operators . . . . . . . . . . . . 192
7.3 Fuzzy Controller . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 193
7.4 Social Spider Optimization (SSO) . . . . . . . . . . . . . . . . . . . . . . 195
7.5 Problem Formulation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 198
7.6 Numerical Simulations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 199
7.6.1 Results Over High-Order Plants ðG1 ðsÞÞ . . . . . . . . . . . 200
7.6.2 Results Over Non-minimum Systems ðG2 ðsÞÞ . . . . . . . 201
7.6.3 Results Over Fractional Systems ðG3 ðsÞÞ . . . . . . . . . . . 203
7.7 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 206
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 207

8 Locust Search Algorithm Applied to Multi-threshold
Segmentation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 211
8.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 211
8.2 Gaussian Mixture Modelling . . . . . . . . . . . . . . . . . . . . . . . . . . 214
8.3 The Locust Search (LS) Algorithm . . . . . . . . . . . . . . . . . . . . . 215
8.3.1 LS Solitary Phase . . . . . . . . . . . . . . . . . . . . . . . . . . . . 215
8.3.2 LS Social Phase . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 218
8.4 Segmentation Algorithm Based on LS . . . . . . . . . . . . . . . . . . . 218
8.4.1 New Objective Function J new . . . . . . . . . . . . . . . . . . . 220
8.4.2 Complete Segmentation Algorithm . . . . . . . . . . . . . . . 222
8.5 Segmentation Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 224
8.5.1 Performance of LS Algorithm in Image
Segmentation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 224
8.5.2 Histogram Approximation Comparisons . . . . . . . . . . . . 227
8.5.3 Performance Evaluation of the Segmentation
Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 233
8.6 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 237
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 238
9 Multimodal Swarm Algorithm Based on the Collective Animal
Behavior (CAB) for Circle Detection . . . . . . . . . . . . . . . . . . . . . . . 241
9.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 241
9.2 Biological Fundaments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 244
9.3 Collective Animal Behavior Algorithm (CAB) . . . . . . . . . . . . . 245
9.3.1 Description of the CAB Algorithm . . . . . . . . . . . . . . . 245
9.4 Results on Multi-modal Benchmark Functions . . . . . . . . . . . . . 253
9.4.1 Experiment Methodology . . . . . . . . . . . . . . . . . . . . . . 254
9.4.2 Comparing CAB Performance for Smooth
Landscapes Functions . . . . . . . . . . . . . . . . . . . . . . . . . 256
9.4.3 Comparing CAB Performance in Rough
Landscapes Functions . . . . . . . . . . . . . . . . . . . . . . . . . 259
9.5 Application of CAB in Multi-circle Detection . . . . . . . . . . . . . . 261
9.5.1 Individual Representation . . . . . . . . . . . . . . . . . . . . . . 261
9.5.2 Objective Function . . . . . . . . . . . . . . . . . . . . . . . . . . . 265
9.5.3 The Multiple Circle Detection Procedure . . . . . . . . . . . 265
9.5.4 Implementation of CAB Strategy for Circle
Detection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 267
9.6 Results on Multi-circle Detection . . . . . . . . . . . . . . . . . . . . . . . 269
9.7 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 275
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 276

10 Locust Search Algorithm Applied for Template Matching . . . . . . . 279


10.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 279
10.2 Template Matching Process . . . . . . . . . . . . . . . . . . . . . . . . . . . 281
10.3 The Locust Search (LS) Algorithm . . . . . . . . . . . . . . . . . . . . . 283
10.3.1 LS Solitary Phase . . . . . . . . . . . . . . . . . . . . . . . . . . . . 284
10.3.2 LS Social Phase . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 287
10.4 Template Matching (TM) Algorithm Based on the Locust
Search (LS) Method . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 287
10.5 Experimental Setup and Results . . . . . . . . . . . . . . . . . . . . . . . . 289
10.6 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 294
Appendix . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 294
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 295
Chapter 1
An Introduction to Nature-Inspired
Metaheuristics and Swarm Methods

Abstract Mathematical optimization is a recurring problem in many different areas of science and technology; due to this, in the last few years, the interest in the development of methods for solving such kind of problems has increased in an unprecedented way. As a result of the intensification in research aimed at the development of more powerful and flexible optimization tools, many different and unique approaches have been proposed and successfully applied to solve a wide array of real-world problems, but none has become as popular as the family of optimization methods known as nature-inspired metaheuristics. This compelling family of problem-solving approaches has become well known among researchers around the world not only for its many interesting characteristics, but also due to its ability to handle complex optimization problems where other traditional techniques are known to fail at delivering competent solutions. Nature-inspired algorithms have become a worldwide phenomenon. Only in the last decade, the literature related to this compelling family of techniques and their applications has experienced an astonishing increase in numbers, with hundreds of papers being published every single year. In this chapter, we present a broad review of nature-inspired optimization algorithms, highlighting some of the most popular methods currently reported in the literature and their impact on current research.

1.1 Optimization Techniques: A Brief Summary

Mathematical optimization is a branch of applied mathematics and computer science which deals with the selection of the optimal solution for a particular mathematical function (or problem) with the purpose of either minimizing or maximizing the output of such a function. In simpler terms, optimization could be described as the process of selecting the best element(s) from among a set of available alternatives to get the best possible results when solving a particular problem [1, 2]. Optimization
is a recurring problem for many different areas of application such as robotics, com-
puter networks, security, engineering design, data mining, finances, economics, and
many others [2]. Independently of the area of application, optimization problems are
wide-ranging and numerous, so much that the development of methods for solving
such problems has remained a hot topic for many years.
Traditionally, optimization techniques can be roughly classified as either deterministic or stochastic [3]. Deterministic optimization approaches, whose design heavily relies on the mathematical formulation of the problem and its properties, are known to have some
remarkable advantages, such as fast convergence and implementation simplicity [4].
On the other hand, stochastic approaches, which resort to the integration of random-
ness into the optimization process, stand as promising alternatives to deterministic
methods for being far less dependent on problem formulation and due to their ability
to thoroughly explore a problem's design space, which in turn allows them to overcome
local optima more efficiently [5]. While both deterministic and stochastic methods
have been successfully applied to solve a wide variety of optimization problems,
these classical approaches are known to be subject to some significant limitations;
first of all, deterministic methods are often conditioned by problem properties (such
as differentiability in the case of gradient-based optimization approaches) [6]. Fur-
thermore, due to their nature, deterministic methods are highly susceptible to get
trapped into local optima, which is something undesirable for most (if not all) appli-
cations. As for stochastic techniques, while these are far easier to adapt to most
black-box formulations or ill-behaved optimization problems, these methods tend to
have a notably slower convergence speed in comparison to their deterministic counterparts, which naturally poses an important limitation for applications where time
is critical.
The many shortcomings of classical methods, along with the inherent challenges
of real-life optimization problems, eventually led researchers to the development of heuristics as an alternative to tackle such complex problems [1]. Generally speak-
ing, a heuristic could be described as a technique specifically tailored for solving
specific problems, often considered too difficult to handle with classic techniques.
In this sense, heuristics trade essential qualities such as optimality, accuracy, precision, or completeness to either solve a problem in reasonably less time or to find
an approximate solution in situations in which traditional methods fail to deliver an
exact solution. However, while heuristic methods have demonstrated to be excellent
to handle otherwise hard to solve problems, there are still subject to some issues. Like
most traditional approaches, heuristics are usually developed by considering at least
some specifications about the target problem, and as such, it is hard to apply them to
different problems without changing some or most of their original framework [7].
Recently, the idea of developing methodologies that could potentially solve a
wide variety of problems in a generic fashion has caught the attention of many
researchers, leading to the development of a new breed of “intelligent” optimization
techniques formally known as metaheuristics [8]. A metaheuristic is a particular kind
of heuristic-based methodology, devised with the idea of being able to solve many different problems without the need to change the algorithm's basic framework. For this purpose, metaheuristic techniques employ a series of generic procedures and abstractions aimed at iteratively improving a set of candidate solutions. With that
being said, metaheuristics are often praised due to their ability to find adequate
solutions for most problems independently of their structure and properties.

1.2 The Rise of Nature-Inspired Metaheuristics

The word “nature” refers to many phenomena observed in the physical world. It com-
prises virtually everything perceptible to our senses and even some things that are
not as easy to perceive. Nature is the perfect example of adaptive problem solving;
it has shown countless times how it can solve many different problems by applying
an optimal strategy, suited to each particular natural phenomenon. Many researchers
around the world have become captivated by how nature can adapt to such an exten-
sive array of situations, and for many years they have tried to emulate these intriguing
problem-solving schemes to develop tools with real-world applications. In fact, for
the last two decades, nature has served as the most important source of inspiration
in the development of metaheuristics. As a result of this, a whole new class of optimization techniques was born in the form of the so-called Nature-inspired optimization algorithms. These methods (often referred to as bio-inspired algorithms)
are a particular kind of metaheuristics, developed with a single idea in mind: mimick-
ing a biological or a physical phenomenon to solve optimization problems. With that
being said, depending on their source of inspiration, nature-inspired metaheuristics
can be classified in four main categories: evolution-based, swarm-based, physics-
based and human-based methods [9, 10]. Evolution-based methods are developed by
drawing inspiration in the laws of natural evolution. From these methods, the most
popular is without a doubt the Genetic Algorithms approach, which simulates Dar-
winian evolution [11]. Other popular methods grouped within this category include
Evolution Strategy [12], Differential Evolution [13] and Genetic Programming [14].
On the other hand, swarm-based techniques are devised to simulate the social and
collective behavior manifested by groups of animals (such as birds, insects, fishes,
and others). The Particle Swarm Optimization [15] algorithm, which is inspired in
the social behavior of bird flocking, stands as the most representative and successful
example within this category, although other relevant methods include Ant Colony
Optimization [16], Artificial Bee Colony [17], Firefly Algorithm [18], Social Spi-
der Optimization [19], among others. Also, there are the physics-based algorithms,
which are developed with the idea of emulating the laws of physics observed within
our universe. Some of the most popular methods grouped within this category are
Simulated Annealing [20], Gravitational Search Algorithm [21], Electromagnetism-
like Mechanism [22], States of Matter Search [23], to name a few. Finally, we can
mention human-based algorithms. These kinds of nature-inspired methods are unique due to the fact that they draw inspiration from several phenomena commonly associated with human behaviors, lifestyle or perception. Some of the most well-known
methods found in the literature include Harmony Search [24], Firework Algorithm
[25], Imperialist Competitive Algorithm [26], and many more.
Most nature-inspired methods are modeled as population-based algorithms, in which a group of randomly generated search agents (often referred to as individuals) explores different candidate solutions by applying a particular set of rules derived from some specific natural phenomenon. This kind of framework offers important advantages in both the interaction among individuals, which promotes a wider knowledge
about different solutions, and the diversity of the population, which is an important
aspect on ensuring that the algorithm has the power to efficiently explore the design
space while also being able to overcome local optima [8]. Due to this and many other
distinctive qualities, nature-inspired methods have become a popular choice among
researchers. As a result, literature related to nature-inspired optimization algorithms
and its applications for solving otherwise challenging optimization problems has
become extremely vast, with hundreds of new papers being published every year.
In this chapter, we analyze some of the most popular nature-inspired optimization methods currently reported in the literature, while also discussing their impact on current research. The rest of this chapter is organized as follows: in Sect. 1.3, we analyze the general framework applied by most nature-inspired metaheuristics in terms of design. In Sect. 1.4, we present nature-inspired methods according to their classification while also reviewing some of the most popular algorithms for each case. Finally, in Sect. 1.5, we present a brief study concerning the growth in the number of publications related to nature-inspired methods.

1.3 General Framework of a Nature-Inspired Metaheuristic

With some exceptions, most of the nature-inspired metaheuristics currently reported in the literature are modeled as population-based algorithms, which implies that the
general framework employed by most of these methods remains almost identical,
independently of the natural phenomenon from which the algorithm is inspired [2].
Usually, the first step of a nature-inspired algorithm involves the definition of a set of $N$ randomly initialized solutions $\mathbf{X} = \{\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_N\}$ (commonly referred to as the population), such that:

$\mathbf{x}_i = \left[x_{i,1}, x_{i,2}, \ldots, x_{i,d}\right]$   (1.1)

where the elements $x_{i,n}$ represent the decision variables (parameters) related to a given optimization problem, while $d$ denotes the dimensionality (number of decision variables) of the target solution space.
From an optimization point of view, each set of parameters $\mathbf{x}_i \in \mathbf{X}$ (also known as an individual) is considered as a candidate solution for the specified optimization task; as such, each of these solutions is also assigned a corresponding quality value (or fitness) related to the objective function $f(\cdot)$ that describes the optimization task, such that:

$f_i = f(\mathbf{x}_i)$   (1.2)

Nature-inspired methods usually follow an iterative search scheme, in which new candidate solutions are generated by modifying currently available individuals; this is achieved by applying some previously specified criteria (usually devised by drawing inspiration from an observed natural phenomenon). For most cases, this process may be illustrated by the following expression:

$\mathbf{x}_i' = \mathbf{x}_i + \Delta\mathbf{x}_i$   (1.3)

where $\mathbf{x}_i'$ denotes the candidate solution generated by adding up a specified update vector $\Delta\mathbf{x}_i$ to $\mathbf{x}_i$. It is worth noting that the value(s) adopted by the update vector $\Delta\mathbf{x}_i$ depend on the specific operators employed by each individual algorithm.
Finally, most nature-inspired algorithms include some kind of selection process,
in which the newly generated solutions are compared against those in the current
population $\mathbf{X}^k$ (with $k$ denoting the current iteration) in terms of solution quality, typically with the purpose of choosing the best individual(s) among them. As a result of this process, a new set of solutions $\mathbf{X}^{k+1} = \left\{\mathbf{x}_1^{k+1}, \mathbf{x}_2^{k+1}, \ldots, \mathbf{x}_N^{k+1}\right\}$, corresponding to the following iteration (or generation) '$k + 1$', is generated.
This whole process is iteratively repeated until a particular stop criterion is met
(i.e., if a maximum number of iterations is reached). Once this happens, the best
solution found by the algorithm is reported as the best approximation for the global
optimum [2].
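To make the generic scheme of Eqs. (1.1)–(1.3) concrete, the following minimal Python sketch (an illustration added here, not code from the original text) implements the population loop just described; the update_operator argument and the default Gaussian perturbation are placeholders for the algorithm-specific rule that each nature-inspired method defines.

```python
import random

def generic_metaheuristic(f, bounds, N=30, max_iter=100, update_operator=None):
    """Skeleton of the population-based scheme of Eqs. (1.1)-(1.3).

    f               : objective function to minimize
    bounds          : list of (low, high) tuples, one per decision variable
    update_operator : callable (x_i, X) -> candidate solution; each
                      nature-inspired method supplies its own rule here
    """
    d = len(bounds)
    # Eq. (1.1): random initial population X = {x_1, ..., x_N}
    X = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(N)]
    fitness = [f(x) for x in X]  # Eq. (1.2)

    for k in range(max_iter):
        for i in range(N):
            # Eq. (1.3): generate a candidate by perturbing the current individual
            if update_operator is not None:
                x_new = update_operator(X[i], X)
            else:  # placeholder rule: small Gaussian perturbation
                x_new = [xi + 0.1 * (hi - lo) * random.gauss(0, 1)
                         for xi, (lo, hi) in zip(X[i], bounds)]
            f_new = f(x_new)
            if f_new < fitness[i]:  # simple greedy selection
                X[i], fitness[i] = x_new, f_new

    best = min(range(N), key=lambda i: fitness[i])
    return X[best], fitness[best]
```

A concrete algorithm such as DE or PSO is obtained by replacing the placeholder perturbation with its own operators and, where required, a different selection scheme.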

1.4 Classification of Nature-Inspired Metaheuristics

Nature-Inspired optimization algorithms have become so numerous and so varied that illustrating every single method in existence has become an undoubtedly challenging
task. However, several algorithms have become widely popular among researchers,
either for their fascinating characteristics or their ease of implementation. In this
section, we present some of the most popular nature-inspired optimization techniques
currently reported in the literature. The algorithms presented in this section were
chosen by considering a balance between both classical and modern approaches.
Also, in order to give the reader the facility to understand, analyze and compare each
of the described methods in the same terms, we have taken some liberties regarding
nomenclature and formulation presented on each case so that it is consistent with
the general framework of nature-inspired methods presented in Sect. 1.3. While
the introduced formulations may look slightly different to those reported on their
sources, we have made a special effort to keep the essence and particular traits
that distinguish each method unaltered; with that being said, the reader is always
invited to refer to the original paper(s) in order to get a deeper understanding of these
techniques. All approaches described in this section are presented according to the typical classification of nature-inspired metaheuristics (see Fig. 1.1).

Fig. 1.1 Classification of nature-inspired metaheuristics

1.4.1 Evolution-Based Methods

Evolution-based methods comprise a series of optimization algorithms developed by drawing inspiration from the laws of natural evolution. In this kind of technique,
solutions are typically represented by a set of individuals, which compete and com-
bine in ways that allow only the most suitable individuals to prevail. The process for
modifying existing solutions in evolution-based techniques often involves the implementation of a series of operators inspired by several processes commonly observed
in natural evolution, such as crossover, mutation, and selection.

1.4.1.1 Differential Evolution

The Differential Evolution (DE) approach is an evolutionary algorithm introduced by Rainer Storn and Kenneth Price in 1996 [13] and, along with Genetic Algorithms (GA), is one of the most popular optimization approaches inspired by the evolution phenomenon.
At each generation '$k$', DE applies a series of mutation, crossover and selection operators in order to allow a population of solutions $\mathbf{X} = \{\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_N\}$ to "evolve" toward an optimal solution. For DE's mutation operation, new candidate (mutant) solutions $\mathbf{m}_i^k = \left[m_{i,1}^k, m_{i,2}^k, \ldots, m_{i,d}^k\right]$ are generated for each individual $\mathbf{x}_i$ as follows:

$\mathbf{m}_i^k = \mathbf{x}_{r_3}^k + F\left(\mathbf{x}_{r_1}^k - \mathbf{x}_{r_2}^k\right)$   (1.4)

where $r_1, r_2, r_3 \in \{1, 2, \ldots, N\}$ (with $r_1 \neq r_2 \neq r_3 \neq i$) each denote a randomly chosen solution index, while the parameter $F \in [0, 2]$ is called the differential weight and is used to control the magnitude of the differential variation $\left(\mathbf{x}_{r_1}^k - \mathbf{x}_{r_2}^k\right)$.

 kFurthermore, for the crossover operation, DE generates a trial solution vector uik =
u i,1 , u i,2 , . . . , u i,d corresponding to each population member ‘i’. The components
k k
k
u i,n in such a trial vector are given by combining both the candidate solution xik and
its respective mutant solution mik as follows:
 k
m i,n if (rand ≤ C R) or n = n ∗
k
u i,n = for n = 1, 2, . . . , d (1.5)
xi,n if(rand > C R) otherwise
k

where n ∗ ∈ {1, 2, . . . , d} denotes a randomly chosen dimension index, while rand


stand for a random number from within the interval [0, 1]. Furthermore, the parameter
C R ∈ [0, 1] represents the DE’s crossover rate which is used to control the probability
k
of an element u i,n being given by either a component from the candidate solution xik
 k   k 
xi,n or a component from the mutant solution mik m i,n .
Finally, for DE's selection process, each trial solution $\mathbf{u}_i^k$ is compared against its
respective candidate solution $\mathbf{x}_i^k$ in terms of solution quality (fitness) by applying a
greedy criterion. This means that, if the trial solution $\mathbf{u}_i^k$ yields a better fitness
value than $\mathbf{x}_i^k$, then the candidate solution for the next generation $k+1$
takes the value of $\mathbf{u}_i^k$; otherwise, it remains unchanged. That is:

$$\mathbf{x}_i^{k+1} = \begin{cases} \mathbf{u}_i^k & \text{if } f\left( \mathbf{u}_i^k \right) > f\left( \mathbf{x}_i^k \right) \\ \mathbf{x}_i^k & \text{otherwise} \end{cases} \qquad (1.6)$$

As one of the most popular evolution-based algorithms currently reported in
the literature, DE has been extensively studied and applied by researchers in many
different areas of science [27–32].
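
To make the interplay of Eqs. (1.4)–(1.6) more concrete, the following minimal Python sketch implements one generation of the scheme described above (mutation, binomial crossover, and greedy selection). The population size, search bounds, and the toy objective function in the usage example are illustrative assumptions and are not part of the original formulation; the fitness is treated as a quantity to be maximized, in line with Eq. (1.6).

```python
import numpy as np

def de_generation(X, fitness, F=0.5, CR=0.9, rng=np.random.default_rng()):
    """One DE generation (mutation, crossover, greedy selection), Eqs. (1.4)-(1.6).

    X       : (N, d) array of candidate solutions.
    fitness : callable returning a quality value to be maximized.
    """
    N, d = X.shape
    X_next = X.copy()
    for i in range(N):
        # Mutation (Eq. 1.4): three distinct random indices, all different from i.
        r1, r2, r3 = rng.choice([j for j in range(N) if j != i], size=3, replace=False)
        m = X[r3] + F * (X[r1] - X[r2])

        # Binomial crossover (Eq. 1.5): mix mutant and parent components.
        n_star = rng.integers(d)                      # forced crossover position
        mask = (rng.random(d) <= CR)
        mask[n_star] = True
        u = np.where(mask, m, X[i])

        # Greedy selection (Eq. 1.6): keep the better of trial and parent.
        if fitness(u) > fitness(X[i]):
            X_next[i] = u
    return X_next

# Illustrative usage on a toy maximization problem (assumed, not from the text):
f = lambda x: -np.sum(x**2)                           # maximum at the origin
X = np.random.default_rng(0).uniform(-5, 5, size=(20, 3))
for _ in range(100):
    X = de_generation(X, f)
```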

1.4.1.2 Evolution Strategies

Evolution Strategies (ES) are a series of optimization techniques which draw inspiration
from natural evolution [12]. The first ES approach was introduced by Ingo
Rechenberg in the early 1960s and further developed during the 1970s. The most
straightforward ES approach is the so-called (1 + 1)-ES (or two-membered ES).
This approach considers the existence of only a single parent $\mathbf{x} = [x_1, x_2, \ldots, x_d]$,
which is assumed to be able to produce a new candidate solution (offspring)
$\mathbf{x}' = \left[ x_1', x_2', \ldots, x_d' \right]$ by means of mutation as follows:

$$\mathbf{x}' = \mathbf{x} + \mathbf{N}(0, \sigma) \qquad (1.7)$$

where $\mathbf{N}(0, \sigma)$ denotes a d-dimensional random vector whose values are drawn
from a Gaussian distribution with mean 0 and fixed standard deviation $\sigma$ (although later
approaches consider a dynamic value based on the number of successful mutations)
[12].

Furthermore, the (1 + 1)-ES implements a selection operator which excludes the
individual with the worse performance between the parent $\mathbf{x}$ and its respective
offspring $\mathbf{x}'$, so that only the best of these two solutions is considered as the parent for the
next generation (iteration).
In later approaches, Rechenberg introduced the concept of population to ES
by proposing the first multimembered ES in the form of the so-called (μ + 1)-ES.
In such an approach, a population $P = \left\{ I_1, \ldots, I_\mu \right\}$ consisting of $\mu > 1$
parents $I_i = \{\mathbf{x}_i, \boldsymbol{\sigma}_i\}$ (with $\mathbf{x}_i = \left[ x_{i,1}, x_{i,2}, \ldots, x_{i,d} \right]$ and $\boldsymbol{\sigma}_i = \left[ \sigma_{i,1}, \sigma_{i,2}, \ldots, \sigma_{i,d} \right]$)
is considered. Furthermore, a discrete recombination mechanism which considers
information drawn from a pair of randomly chosen parents is implemented to generate
a new offspring $I' = \left\{ \mathbf{x}', \boldsymbol{\sigma}' \right\}$ as follows:

$$x_n' = \begin{cases} x_{r_1,n} & \text{if } (\mathrm{rand} > 1/2) \\ x_{r_2,n} & \text{otherwise} \end{cases} \quad \text{for } n = 1, 2, \ldots, d \qquad (1.8)$$

$$\sigma_n' = \begin{cases} \sigma_{r_1,n} & \text{if } (\mathrm{rand} > 1/2) \\ \sigma_{r_2,n} & \text{otherwise} \end{cases} \quad \text{for } n = 1, 2, \ldots, d \qquad (1.9)$$

where $r_1, r_2 \in \{1, \ldots, \mu\}$ denote two randomly chosen solution indexes corresponding
to the parents' population, while rand stands for a random number drawn from within the
interval [0, 1].
Similarly to the (1 + 1)-ES, the (μ + 1)-ES also implements a mutation operator which
generates a new offspring solution by perturbing a currently existing parent. Furthermore,
a selection operator which chooses the best μ solutions from among the population of
parents and offspring (generated through recombination and mutation) is also implemented
to define a new parents' population for the next generation.
Later approaches, such as the (μ + λ)-ES and the (μ, λ)-ES, were further proposed
to consider the generation of multiple offspring rather than a single one [12]. Furthermore,
several variations of the recombination and mutation processes employed in classical ES
have also been proposed, giving birth to some interesting variants such as the
(μ, λ)-MSC-ES and CMA-ES, the latter of which is considered by many authors to be
the state-of-the-art in ES [33–36].
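
As a complement to the description above, the sketch below illustrates a basic (1 + 1)-ES loop built around Eq. (1.7), together with a simple adaptation of σ based on the fraction of successful mutations (the classical 1/5-success heuristic). The test function, adaptation constants, and iteration budget are assumptions made only for illustration.

```python
import numpy as np

def one_plus_one_es(fitness, x0, sigma=1.0, iters=500, adapt_every=20,
                    rng=np.random.default_rng()):
    """Minimal (1+1)-ES: mutate the single parent (Eq. 1.7) and keep the better
    of parent and offspring; sigma is adapted with a 1/5-success rule (assumed)."""
    x, fx = np.asarray(x0, dtype=float), fitness(x0)
    successes = 0
    for t in range(1, iters + 1):
        x_new = x + rng.normal(0.0, sigma, size=x.shape)   # Gaussian mutation
        fx_new = fitness(x_new)
        if fx_new > fx:                                     # greedy (elitist) selection
            x, fx = x_new, fx_new
            successes += 1
        if t % adapt_every == 0:                            # adapt sigma periodically
            sigma *= 1.22 if successes / adapt_every > 0.2 else 0.82
            successes = 0
    return x, fx

# Illustrative usage (maximizing a toy function; assumed example):
best_x, best_f = one_plus_one_es(lambda v: -np.sum(np.asarray(v)**2), x0=[3.0, -2.0])
```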

1.4.1.3 Genetic Algorithms

Genetic Algorithms (GA) is one of the earliest metaheuristics inspired by the concepts
of natural selection and evolution, and is among the most successful Evolutionary
Algorithms (EA) due to its conceptual simplicity and easy implementation [37]. GA
was initially developed by John Henry Holland in 1960 (and further extended in
1975) with the goal of understanding the phenomenon of natural adaptation, and how
this mechanism could be implemented in computer systems to solve complex
problems.

In GA, a population of N solutions $\mathbf{x}_i = \left[ x_{i,1}, x_{i,2}, \ldots, x_{i,d} \right]$ is first initialized;
each of these solutions (called chromosomes) comprises a bitstring (that is, $x_{i,n} \in \{0, 1\}$),
which further represents a possible solution for a particular binary problem.
At each iteration (also called generation) of GA's evolution process, the chromosome
population is modified by applying a set of three evolutionary operators, namely
selection, crossover and mutation. For the selection operation, GA randomly selects
a pair of chromosomes $\mathbf{x}_{p_1}$ and $\mathbf{x}_{p_2}$ (with $p_1, p_2 \in \{1, 2, \ldots, N\}$ and $p_1 \neq p_2$)
from within the entire chromosome population, based on their individual selection
probabilities. The probability $P_i$ for a given chromosome $i$ ($\mathbf{x}_i$) to be selected
depends on its quality (fitness value), as follows:

$$P_i = \frac{f(\mathbf{x}_i)}{\sum_{j=1}^{N} f\left( \mathbf{x}_j \right)} \qquad (1.10)$$

Then, for the crossover operation, the bitstring information of the selected chromosomes
(now called parents) is recombined to produce two new chromosomes, $\mathbf{x}_{s_1}$
and $\mathbf{x}_{s_2}$ (referred to as offspring), as follows:

$$x_{s_1,n} = \begin{cases} x_{p_1,n} & \text{if } (n < l) \\ x_{p_2,n} & \text{otherwise} \end{cases} \quad x_{s_2,n} = \begin{cases} x_{p_2,n} & \text{if } (n < l) \\ x_{p_1,n} & \text{otherwise} \end{cases} \quad \text{for } n = 1, 2, \ldots, d \qquad (1.11)$$

where $l \in \{1, 2, \ldots, d\}$ is a randomly selected pivot index (usually referred to as the locus).


Finally, for the mutation operation, some elements (bits) of the newly generated
offspring are flipped (changed from 1 to 0 or vice versa). Mutation can occur over
each bit position in the string with a particular probability $P_m$ (typically as low as
0.001), as follows:

$$x_{s_r,n} = \begin{cases} \bar{x}_{s_r,n} & \text{if } (\mathrm{rand} < P_m) \\ x_{s_r,n} & \text{otherwise} \end{cases} \quad \text{for } n = 1, 2, \ldots, d \qquad (1.12)$$

where $x_{s_r,n}$ (with $r \in \{1, 2\}$) stands for the nth element (bit) of the $s_r$th offspring,
while rand stands for a random number drawn from within the interval [0, 1].
This process of selection, crossover, and mutation of individuals takes place until
a population of N new chromosomes (mutated offspring) has been produced, and
then the N best chromosomes among the original and new populations are taken for
the next generation, while the remaining individuals are discarded [38–41].
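
The sketch below ties together roulette-wheel selection (Eq. 1.10), one-point crossover (Eq. 1.11), and bit-flip mutation (Eq. 1.12) into a single GA generation over bitstrings. The OneMax-style fitness function and the population parameters used in the example are illustrative assumptions, not part of the original description.

```python
import numpy as np

def ga_generation(pop, fitness, Pm=0.001, rng=np.random.default_rng()):
    """One GA generation over a (N, d) array of bitstrings.
    Roulette-wheel selection (Eq. 1.10), one-point crossover (Eq. 1.11) and
    bit-flip mutation (Eq. 1.12), followed by keeping the N best individuals."""
    N, d = pop.shape
    fit = np.array([fitness(c) for c in pop])
    probs = fit / fit.sum()                      # selection probabilities (Eq. 1.10)
    offspring = []
    while len(offspring) < N:
        p1, p2 = rng.choice(N, size=2, replace=False, p=probs)
        l = rng.integers(1, d)                   # crossover point (locus)
        child1 = np.concatenate([pop[p1, :l], pop[p2, l:]])
        child2 = np.concatenate([pop[p2, :l], pop[p1, l:]])
        for child in (child1, child2):
            flip = rng.random(d) < Pm            # bit-flip mutation (Eq. 1.12)
            child[flip] ^= 1
            offspring.append(child)
    merged = np.vstack([pop, np.array(offspring[:N])])
    merged_fit = np.array([fitness(c) for c in merged])
    return merged[np.argsort(merged_fit)[-N:]]   # keep the N best chromosomes

# Illustrative usage: OneMax (maximize the number of ones), an assumed toy problem.
rng = np.random.default_rng(1)
pop = rng.integers(0, 2, size=(30, 40))
for _ in range(50):
    pop = ga_generation(pop, fitness=lambda c: c.sum() + 1e-9, rng=rng)
```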

1.4.1.4 Genetic Programming

Genetic Programming (GP) is a unique optimization technique proposed by John
R. Koza in 1992 [14]. The development of GP is closely related to the popularization
gained by evolutionary algorithms between the 1960s and 1970s. In essence, GP can
be considered an extension of evolutionary methods such as Rechenberg's Evolution
Strategies (ES) or Holland's Genetic Algorithms (GA). Different from these traditional
methods, however, is the fact that in GP solutions are represented by a set of operations
or computer programs; that is, instead of finding a set of decision variables
that optimizes a given objective function, the outputs of GP are computer programs
specifically tailored (evolved) to perform optimally on predefined tasks. Traditionally,
solutions in GP are represented as tree structures which group a set of functions
and operands, although other representations are also common. Furthermore, GP is
also distinctive due to its variable-length representation of output solutions, which
drastically differs from the fixed-length representations adopted by most traditional
techniques [42–46].
Typically, most GP approaches comprise the following four fundamental steps (a
minimal sketch of this loop is given after the list):
1. Generate an initial population of computer programs, composed of the available
functions and terminals (operands).
2. Execute each program in the population and assign it a fitness value according
to how well it solves a given problem.
3. Generate a new population of programs by:
a. Copying the current best computer programs (reproduction).
b. Creating new offspring programs by randomly changing some parts of a
program (mutation).
c. Creating new offspring programs by recombining parts from two existing
programs (crossover).
4. If a specified stop criterion is met, return the single best program in the population
as the solution for the pre-specified problem. Otherwise, return to step 2.
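
The following toy sketch illustrates the tree-based representation and the reproduction/mutation loop outlined above, using nested Python tuples as program trees. The function set, depth limits, target data, and the omission of crossover are simplifying assumptions made purely for illustration.

```python
import random, operator

# Assumed function set (binary operators) and terminal set.
FUNCS = {'+': operator.add, '-': operator.sub, '*': operator.mul}
TERMINALS = ['x', 1.0, 2.0]

def random_tree(depth=3):
    """Grow a random program tree as nested tuples ('op', left, right)."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    op = random.choice(list(FUNCS))
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    """Execute a program tree for a given input value x."""
    if tree == 'x':
        return x
    if isinstance(tree, tuple):
        op, left, right = tree
        return FUNCS[op](evaluate(left, x), evaluate(right, x))
    return tree                                   # numeric terminal

def mutate(tree, depth=2):
    """Subtree mutation: replace a random node with a freshly grown subtree."""
    if not isinstance(tree, tuple) or random.random() < 0.2:
        return random_tree(depth)
    op, left, right = tree
    if random.random() < 0.5:
        return (op, mutate(left, depth), right)
    return (op, left, mutate(right, depth))

# Illustrative fitness: error against the assumed target f(x) = x*x + x.
def fitness(tree):
    return -sum((evaluate(tree, x) - (x * x + x)) ** 2 for x in range(-3, 4))

population = [random_tree() for _ in range(50)]
for _ in range(30):                               # simple reproduction + mutation loop
    population.sort(key=fitness, reverse=True)
    population = population[:25] + [mutate(t) for t in population[:25]]
best = max(population, key=fitness)
```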

1.4.2 Swarm-Based Methods

Swarm-based optimization algorithms comprise a series of techniques which draw
inspiration from the collective behavior manifested by a wide range of living organisms,
such as birds, insects, fish, and others. In these kinds of techniques, search
agents are modeled by a population of individuals (usually from the same species)
which are capable of interacting with each other and with the environment that surrounds
them. While the movement of search agents in swarm algorithms is often based on
simplified behavioral rules abstracted from those observed in nature, the collective
manifestation of these individual behaviors allows the entire population to exhibit
global and complex behavioral patterns, thus allowing it to explore an extensive
number of candidate solutions.

1.4.2.1 Ant Colony Optimization

The Ant Colony Optimization (ACO) algorithm is one of the most well-known nature-inspired
metaheuristics. The ACO approach was first proposed by Marco Dorigo in
1992 under the name of Ant Systems (AS) and draws inspiration from the natural
behavior of ants [47]. In nature, ants move randomly while foraging for food, and
when an appropriate source is found, they return to their colony while leaving a
pheromone trail behind. Ants are able to guide themselves toward a previously found
food source by following the path traced by pheromones left by themselves or by other ants.
However, as time passes, pheromones start to evaporate; intuitively, the more time an
ant takes to travel down a given path back and forth, the more time the pheromones
have to dissipate; on the other hand, shorter paths are traversed more frequently, so
the pheromone density on them becomes higher in comparison to that on longer
routes. In this sense, if an ant finds a good (short) path from the colony to a food source,
other members are more likely to follow the route traced by that ant. The positive
feedback provided by the increase in pheromone density along paths traversed by
an increasing number of ants eventually leads all members of the colony to follow a
single optimal route [16].
The first ACO approach was conceived as an iterative process devised to handle
the task of finding optimal paths in a graph [47]. For this purpose, ACO considers a
population of N ants which move through the nodes and arcs of a graph $G(\mathcal{N}, \mathcal{P})$
(with $\mathcal{N}$ and $\mathcal{P}$ denoting its sets of nodes and arcs, respectively). Depending
on their current state (node), each ant is able to choose from among a set of
adjacent paths (arcs) to traverse based on the pheromone density and length associated
to each of them. With that being said, at each iteration $k$, the probability for
a given ant $i$ to follow a specific path $xy$ (which connects states $x$ and $y$) is
given by the following expression:

$$p_{i(xy)}^k = \frac{\left( \tau_{(xy)}^k \right)^{\alpha} \cdot \left( \eta_{(xy)}^k \right)^{\beta}}{\sum_{z \in Y_x} \left( \tau_{(xz)}^k \right)^{\alpha} \cdot \left( \eta_{(xz)}^k \right)^{\beta}} \qquad (1.13)$$

where $\tau_{(xy)}^k$ denotes the pheromone density over the given path $xy$, while $\eta_{(xy)}^k$
stands for the preference for traversing said path, which is relative to its distance
(cost). Furthermore, $Y_x$ represents the set of all states adjacent to the given current
state $x$. Finally, $\alpha$ and $\beta$ are constant parameters used to control the influence of
$\tau_{(xy)}^k$ and $\eta_{(xy)}^k$, respectively.
By applying this mechanism, each ant moves through several paths within the
graph until a specific criterion is met (i.e., a particular destination node has been
reached). Once this happens, each ant backtracks its traversed route while releasing
some pheromones on each of the paths it used. In ACO, the amount of pheromones
released by an ant $i$ over any given path $xy$ is given by:

$$\Delta\tau_{i(xy)}^k = \begin{cases} Q/L_i & \text{if the ant used the path } xy \text{ in its tour} \\ 0 & \text{otherwise} \end{cases} \qquad (1.14)$$

where $L_i$ denotes the length (cost) associated with the route taken by the ant $i$, while
$Q$ stands for a constant value.
Finally, ACO includes a procedure used to update the pheromone density over all
paths in the graph for the following iteration (k + 1). For this purpose, it considers
both the amount of pheromones released by each ant while backtracking its traced
route and the natural dissipation of pheromones which takes place as time passes.
This is applied by considering the following expression:

$$\tau_{(xy)}^{k+1} = (1 - \rho) \cdot \tau_{(xy)}^k + \sum_{i=1}^{N} \Delta\tau_{i(xy)}^k \qquad (1.15)$$

where $\rho$ is a constant value known as the pheromone evaporation coefficient, while
$\Delta\tau_{i(xy)}^k$ stands for the amount of pheromones released by an ant $i$ over a specific path
$xy$ (as given by Eq. 1.14).
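
To illustrate how Eqs. (1.13)–(1.15) interact, the sketch below applies the basic Ant System rules to a small, fully connected graph in which each ant builds a complete tour (a TSP-like interpretation). The distance matrix, parameter values, and the tour-construction details are assumptions introduced only to obtain a runnable example.

```python
import numpy as np

def ant_system(dist, n_ants=10, iters=100, alpha=1.0, beta=2.0, rho=0.5, Q=1.0,
               rng=np.random.default_rng()):
    """Basic Ant System on a fully connected graph given by a distance matrix.
    Path probabilities follow Eq. (1.13), pheromone deposits Eq. (1.14) and the
    global update Eq. (1.15)."""
    n = len(dist)
    eta = 1.0 / (dist + np.eye(n))           # heuristic preference (inverse distance)
    tau = np.ones((n, n))                    # initial pheromone density
    best_tour, best_len = None, np.inf
    for _ in range(iters):
        delta = np.zeros((n, n))
        for _ant in range(n_ants):
            tour = [rng.integers(n)]
            while len(tour) < n:             # build a tour node by node (Eq. 1.13)
                x = tour[-1]
                allowed = [y for y in range(n) if y not in tour]
                w = np.array([tau[x, y]**alpha * eta[x, y]**beta for y in allowed])
                tour.append(rng.choice(allowed, p=w / w.sum()))
            L = sum(dist[tour[k], tour[(k + 1) % n]] for k in range(n))
            if L < best_len:
                best_tour, best_len = tour, L
            for k in range(n):               # deposit Q/L on every arc of the tour (Eq. 1.14)
                x, y = tour[k], tour[(k + 1) % n]
                delta[x, y] += Q / L
                delta[y, x] += Q / L
        tau = (1 - rho) * tau + delta        # evaporation + deposit (Eq. 1.15)
    return best_tour, best_len

# Illustrative usage on a random symmetric distance matrix (assumed data):
rng = np.random.default_rng(2)
D = rng.uniform(1, 10, size=(8, 8)); D = (D + D.T) / 2; np.fill_diagonal(D, 0)
tour, length = ant_system(D, rng=rng)
```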

1.4.2.2 Artificial Bee Colony

Bees are among the most well-known examples of insects which manifest collective
behavior, either for food foraging or mating. Based on this premise, many
researchers have proposed several different swarm intelligence approaches inspired
by the behavior of bees. In particular, the Artificial Bee Colony (ABC) approach
proposed by Dervis Karaboga and Bahriye Basturk in 2007 is known to be among
the most popular of these bee-inspired methods [48].
In the ABC approach, search agents are represented as a colony of artificial honey
bees which explore a d-dimensional search space while looking for optimal food
(nectar) sources. The locations of these food sources each represent a possible solution
for a given optimization problem, and their amount of nectar (quality) is related
to the fitness value associated with each of these solutions. Furthermore, the members of
the bee colony are divided into three groups: employed bees, onlooker bees and scout
bees. Each of these groups of bees has distinctive functions inspired by the mechanics
employed by bees while foraging for food. For example, the employed bees comprise
the members of the colony whose function is to explore the surroundings of
individually known food sources in the hope of finding places with greater amounts
of nectar. In addition, employed bees are able to share the information of currently
known food sources with the rest of the members of the colony, so that they can also
exploit them. With that being said, at each iteration $k$ of ABC's search process,
each employed bee generates a new candidate solution $\mathbf{v}_i$ around a currently known
food source $\mathbf{x}_i$ as follows:

$$\mathbf{v}_i^k = \mathbf{x}_i^k + \phi\left( \mathbf{x}_i^k - \mathbf{x}_r^k \right) \qquad (1.16)$$

where $\mathbf{x}_i^k$ denotes the location of the food source remembered by a particular
employed bee $i$, while $\mathbf{x}_r^k$ (with $r \neq i$) stands for the location of any other randomly
chosen food source. Furthermore, $\phi$ is a random number drawn from within
the interval [−1, 1].
On the other hand, onlooker bees can randomly visit any food source known by
the employed bees. For this purpose, each available food source is assigned a
certain probability of being visited by an onlooker bee, as follows:

$$P_i^k = \frac{f\left( \mathbf{x}_i^k \right)}{\sum_{j=1}^{N} f\left( \mathbf{x}_j^k \right)} \qquad (1.17)$$

Similarly to the employed bees, once an onlooker bee has decided to visit a
particular food source, a new candidate solution $\mathbf{v}_i^k$ is generated around the chosen
location $\mathbf{x}_i^k$ by applying Eq. (1.16). Furthermore, any candidate solution $\mathbf{v}_i^k$ generated
by either an employed or an onlooker bee is compared against its originating location
$\mathbf{x}_i^k$ in terms of solution quality, and then the best among them is chosen as the new
food source location for the following iteration; that is:

$$\mathbf{x}_i^{k+1} = \begin{cases} \mathbf{v}_i^k & \text{if } f\left( \mathbf{x}_i^k \right) < f\left( \mathbf{v}_i^k \right) \\ \mathbf{x}_i^k & \text{otherwise} \end{cases} \qquad (1.18)$$

Finally, scout bees are the members of the colony whose function is to randomly explore
the whole terrain for new food sources. Scout bees are deployed to look
for new solutions only if a currently known food source is chosen to be "abandoned"
(and thus forgotten by all members of the colony). In ABC, a solution is considered
to be abandoned only if it cannot be improved by either the employed or onlooker
bees after a determined number of iterations, indicated by the algorithm's parameter
"limit". This mechanism is important for the ABC approach since it allows it to
maintain the diversity of solutions during the search process.
In general, ABC’s local search performance may be attributed to the neighborhood
exploration and greedy selection mechanisms applied by the employed and onlooker
bees, while the global search performance is mainly related to the diversification
attributes of scout bees.
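
The sketch below condenses the employed-bee, onlooker-bee, and scout-bee phases built around Eqs. (1.16)–(1.18) into a single loop. The objective function, colony size, the value of the "limit" parameter, and the shifting of fitness values to keep the probabilities of Eq. (1.17) positive are all illustrative assumptions.

```python
import numpy as np

def abc_optimize(fitness, bounds, N=20, limit=30, iters=200,
                 rng=np.random.default_rng()):
    """Minimal Artificial Bee Colony loop for maximizing `fitness`.
    bounds: one (low, high) tuple per dimension."""
    low, high = np.array(bounds).T
    d = len(low)
    X = rng.uniform(low, high, size=(N, d))          # food source locations
    fit = np.array([fitness(x) for x in X])
    trials = np.zeros(N, dtype=int)

    def try_neighbor(i):
        """Neighbor via Eq. (1.16), then greedy selection (Eq. 1.18)."""
        r = rng.choice([j for j in range(N) if j != i])
        v = X[i] + rng.uniform(-1, 1, d) * (X[i] - X[r])
        v = np.clip(v, low, high)
        fv = fitness(v)
        if fv > fit[i]:
            X[i], fit[i], trials[i] = v, fv, 0
        else:
            trials[i] += 1

    for _ in range(iters):
        for i in range(N):                            # employed-bee phase
            try_neighbor(i)
        # Visiting probabilities (Eq. 1.17); shifted to stay positive (an assumption).
        probs = fit - fit.min() + 1e-12
        probs = probs / probs.sum()
        for _bee in range(N):                         # onlooker-bee phase
            try_neighbor(rng.choice(N, p=probs))
        for i in range(N):                            # scout-bee phase
            if trials[i] > limit:                     # abandon exhausted sources
                X[i] = rng.uniform(low, high)
                fit[i], trials[i] = fitness(X[i]), 0
    return X[np.argmax(fit)], fit.max()

# Illustrative usage on a toy maximization problem (assumed):
best_x, best_f = abc_optimize(lambda x: -np.sum(x**2), bounds=[(-5, 5)] * 3)
```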

1.4.2.3 Bat Algorithm

The Bat Algorithm (BA) is a bio-inspired metaheuristic proposed by Xin-She Yang
in 2010. The BA approach draws inspiration from the behavior manifested by certain
species of bats (particularly microbats) [49]. In nature, most bats are equipped with
a type of biological sonar known as echolocation. In simple terms, echolocation
consists of two steps: the emission of loud, frequency-modulated sound pulses and
the reception (listening) of the echoing sounds that bounce back from surrounding
objects, which essentially allows building a three-dimensional picture of the
surrounding environment.