
Heliyon 10 (2024) e26187

Contents lists available at ScienceDirect

Heliyon
journal homepage: www.cell.com/heliyon

Research article

Fast random opposition-based learning Aquila optimization algorithm
Gopi S., Prabhujit Mohapatra ∗
Department of Mathematics, School of Advanced Sciences, Vellore Institute of Technology, Vellore, 632 014, Tamil Nadu, India

ARTICLE INFO

Keywords:
Opposition-based learning
Optimization algorithms
Meta-heuristic algorithm
Fast random opposition-based learning
OBL
FROBL

ABSTRACT

Meta-heuristic algorithms are usually employed to address a variety of challenging optimization problems. In recent years, there has been a continuous effort to develop new and efficient meta-heuristic algorithms. The Aquila Optimization (AO) algorithm is a newly established swarm-based method that mimics the hunting strategy of Aquila birds in nature. However, in complex optimization problems, the AO has shown a sluggish convergence rate and gets stuck in local optimal regions throughout the optimization process. To overcome this problem, in this study a new mechanism named Fast Random Opposition-Based Learning (FROBL) is combined with the AO algorithm to improve the optimization process. The proposed approach is called the FROBLAO algorithm. To validate the performance of the FROBLAO algorithm, the CEC 2005, CEC 2019, and CEC 2020 test functions, along with six real-life engineering optimization problems, are tested. Moreover, statistical analyses such as the Wilcoxon rank-sum test, the t-test, and the Friedman test are performed to analyze the significant difference between the proposed FROBLAO algorithm and other algorithms. The results demonstrate that FROBLAO achieves outstanding performance and effectiveness in solving an extensive variety of optimization problems.

1. Introduction

Optimization is the act of determining the best combination of decision variables to address a given optimization problem. This approach has surfaced in a variety of fields, disciplines, and real-life applications [1–3]. Finding answers to optimization problems has become the standard in almost all fields of engineering and science [4–6], where the desire for more powerful solutions is always growing. This implies that we must have sensible algorithms capable of dealing with the complexities of present-day scientific and engineering problems. An extensive review of the literature on existing meta-heuristic algorithms reveals that there are numerous such methods [7–9]. These methods range from traditional techniques that employ either linear or non-linear programming [10] to newly developed, nature-inspired methods, each with its own set of advantages and disadvantages. Traditional approaches, while effective in tackling well-known optimization problems [11–13], have two drawbacks: they require a promising initial start vector within the search area, and they are inherently dependent on gradient information [14,15]. Moreover, as technology and science have advanced, traditional approaches can no longer be employed to address challenging real-world problems. Consequently, meta-heuristic algorithms were developed. Because of their straightforward design and adaptable parameters, meta-heuristic algorithms are frequently applied to these challenging real-world problems [16]. Due to

* Corresponding author.
E-mail addresses: [email protected] (S. Gopi), [email protected] (P. Mohapatra).

https://doi.org/10.1016/j.heliyon.2024.e26187
Received 18 July 2023; Received in revised form 30 January 2024; Accepted 8 February 2024
Available online 15 February 2024
2405-8440/© 2024 The Author(s). Published by Elsevier Ltd. This is an open access article under the CC BY license
(http://creativecommons.org/licenses/by/4.0/).

Table 1
Categorization of meta-heuristic algorithms.

Category Algorithm

Evolutionary algorithms Evolution Strategy (ES) [17]


Genetic Algorithm (GA) [18]
Biogeography-Based Optimizer (BBO) [19]
Genetic Programming (GP) [20]
Differential Evolution (DE) [21]
Evolutionary Deduction Algorithm (ED) [22]
Probability-Based Incremental Learning (PBIL) [23]
Tree Growth Algorithm (TGA) [24]
Arithmetic Optimization Algorithm (AOA) [25]

Physics-based algorithms Gravitational Local Search (GLSA) [26]


Gravitational Search Algorithm (GSA) [27]
Simulated Annealing (SA) [28]
Multi-verse Optimizer (MVO) [29]
Electromagnetic Field Optimization (EFO) [30]
Equilibrium Optimizer (EO) [31]
Central Force Optimization (CFO) [32]
Big-Bang Big-Crunch (BBBC) [33]
Ray Optimization (RO) [34]
Henry Gas Solubility Optimization (HGSO) [35]
Curved Space Optimization (CSO) [36]

Swarm-based algorithms Ant Colony Optimization (ACO) [37]


Salp Swarm Algorithm (SSA) [38]
Particle Swarm Optimization (PSO) [39,40]
Artificial Bee Colony (ABC) [41]
Ant Lion Optimizer (ALO) [42]
Slime Mold Algorithm (SMA) [43]
Cuckoo Search (CS) [44]
Moth Flame Optimization (MFO) [45]
Firefly Algorithm (FA) [46]
Seagull optimization algorithm (SOA) [47]
Grey Wolf Optimizer (GWO) [48]
American zebra optimization algorithm (AZOA) [49]
Sooty Tern Optimization Algorithm (STOA) [50]
Whale Optimization Algorithm (WOA) [51]
Bald Eagle Search (BES) [52]
Marine Predators Algorithm (MPA) [53]

Human-based algorithms Collective Decision Optimization (CDO) [54]


Teaching–learning-based optimization (TLBO) [55]
Harmony Search (HS) [56]
Fireworks Algorithm (FWA) [57]
Socio Evolution & Learning Optimization Algorithm (SELOA) [58]
Poor and Rich Optimization (PRO) [59]
Human Eye Vision Algorithm (HEVA) [60]
Brain Storm Optimization Algorithm (BSOA) [61]
Human Mental Search (HMS) [62]

their benefits, including flexibility, efficiency, and the ability to produce a near-optimal solution in a reasonable amount of time, meta-heuristic algorithms have been utilized to address such problems. Meta-heuristic algorithms mimic randomness in nature to find the optimum solution. These algorithms can be separated into four classes: evolutionary algorithms, physics-based algorithms, swarm-based algorithms, and human-based algorithms. (i) Evolutionary algorithms: the most frequent and earliest type of meta-heuristic algorithm, imitating the evolutionary behavior of species in nature based on the principle of survival of the fittest. In evolutionary algorithms, an initial random population improves over iterations, creating new solutions and eliminating the poorest ones to improve the fitness value. Since they make no assumptions about the underlying fitness landscape, these algorithms frequently do well in discovering optimal or near-optimal solutions. (ii) Physics-based algorithms: these algorithms take their inspiration from the established physical principles prevalent throughout the universe. They evolved from natural physics laws and typically govern the interaction of search agents according to known physical processes. (iii) Swarm-based algorithms: an important branch of meta-heuristics that models the dynamic, collective, intelligent, and cooperative social behavior of flocks in nature. Such communities include flocks of birds, schools of fish, insect colonies such as bee and ant colonies, herds of animals, and many other groups of organisms. Each agent in a swarm-based algorithm has its own intelligence and behavior, but combining agents gives the algorithm additional power to tackle complicated optimization problems. (iv) Human-based algorithms: this group characterizes phenomena connected to human behavior, non-physical activities like thinking, and human perceptions in society. Algorithms in this class have begun to grab the attention of researchers as a new trend in the last decade but are still unable to compete with evolutionary and swarm-based algorithms. Table 1 illustrates this categorization of meta-heuristic algorithms.


The recently developed Aquila Optimizer (AO), which imitates the four different stages of Aquila hunting behavior, was proposed by Abualigah et al. [14]. By swapping the AO's original exploitation phase with the Harris Hawks Optimizer's exploitation phase, Wang et al. [63] created an improved version of the AO; they also included a nonlinear escape operator and a random opposition-based learning technique in their proposed algorithm. Furthermore, the AO and the Arithmetic Optimization Algorithm (AOA) [25] were hybridized by Mahajan et al. [64], who named the result the hybrid arithmetic optimization algorithm with AO (AOAAO). The AOAAO results were compared with those of the original AO, the original AOA, the Grey Wolf Optimizer (GWO), the Grasshopper Optimization Algorithm (GOA) [65], and the Whale Optimization Algorithm (WOA). A simplified AO method was created by Zhao et al. [66] by maintaining the first two techniques and eliminating the control equation from the exploitation and exploration processes; they employed 23 functions to evaluate their approach against different optimizers. Furthermore, Gao et al. [67] implemented three different methods to improve the AO algorithm: a search control operator, random opposition-based learning, and Gaussian mutation (GM). They claimed that their improved AO provided better results than other optimizers. Huangjing et al. [68] employed three different approaches to enhance the AO algorithm: a restart strategy, opposition-based learning, and chaotic local search. Also, Yufei Wang et al. [69] introduced a new strategy, called an adaptive opposition-based learning strategy, which helps the AO algorithm escape local optima. Furthermore, Ekinci et al. [70] introduced an enhanced AO (enAO) algorithm by employing two mechanisms, the Nelder-Mead (NM) method and a modified opposition-based learning (OBL) strategy, to improve exploration and exploitation. The AO has been applied successfully in a variety of applications; AlRassas et al. [71], for instance, attempted to forecast oil production by utilizing the AO to optimize an adaptive neuro-fuzzy inference system model. Despite the algorithm's strength and superiority, according to the No Free Lunch (NFL) theorem [72] the AO cannot solve every optimization problem. Therefore, the AO still requires improvements and innovations.
This paper introduces the Fast Random Opposition-Based Learning (FROBL) strategy [73] and combines it with the AO algorithm to produce a novel contribution, the Fast Random Opposition-Based Learning Aquila Optimizer (FROBLAO), which reformulates the update rules to prevent falling into local optima while accelerating convergence. The use of FROBL together with the optimization operators of AO raises the reliability and performance of the original AO. The benefits of this design include helping the algorithm escape local optima while preserving the complexity and optimization progress of the AO. The adaptability of the proposed FROBLAO algorithm is demonstrated by careful testing later in the paper. The following are the main features of this paper:

• An improved AO algorithm, namely the FROBLAO algorithm, has been developed using the Fast Random Opposition-Based Learning strategy.
• The proposed FROBLAO algorithm has been compared with the original AO, an opposition-based learning variant of AO (OBLAO), five popular algorithms, two top-performing algorithms, and two recent high-performance algorithms: AO [14], OBLAO, TSA [74], SSA [38], MVO [29], GWO [48], SCA [75], LSHADE [76], CMA-ES [77], MRFO [78], and AVOA [79].
• FROBLAO was tested on the CEC 2005, CEC 2019, and CEC 2020 test functions, along with six engineering design problems.

The remainder of this study is organized as follows: Section 2 presents the preliminaries of the original AO algorithm, the OBL strategy, the FROBL strategy, and the proposed FROBLAO algorithm. Section 3 presents the experimental results and discussion for the test functions. Section 4 discusses the real-life engineering problems. Finally, Section 5 concludes the proposed work.

2. Preliminaries

2.1. Aquila optimizer (AO)

The Aquila Optimization algorithm is one of the most recent population-based swarm-intelligence optimizers. The Aquila is one of the most well-known predatory birds inhabiting the northern hemisphere; its body and back are golden. The Aquila catches a variety of prey, primarily squirrels, rabbits, marmots, and hares, using its quickness and strength as well as its strong feet and wide claws. The AO simulates four different hunting techniques, which are modeled mathematically as follows:
Phase 1: expanded exploration (𝑋1 ) The Aquila soars high above ground level during this phase to properly scan the area before
diving vertically after it has discovered the prey. This behavior is expressed in two mathematical equations as follows:
$$X_1(l+1) = X_B(l) \times \left(1 - \frac{l}{L}\right) + \left(X_{mean}(l) - X_B(l) \times rand\right) \tag{1}$$

$$X_{mean}(l) = \frac{1}{M} \sum_{i=1}^{M} X_i(l), \quad \forall j = 1, 2, \dots, Dim \tag{2}$$

where $X_{mean}(l)$ denotes the mean location of the present solutions at the $l$-th iteration, computed using Equation (2), $X_B$ is the global best solution found so far, $rand$ is a random value in $[0, 1]$, $l$ is the current iteration, $L$ is the maximum number of iterations, $M$ is the population size, and $Dim$ is the dimension size.
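A minimal NumPy sketch of the expanded-exploration step of Equations (1) and (2); the function and variable names are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

# Hypothetical sketch of the expanded-exploration update (Equations (1)-(2)).
# X is the population matrix of shape (M, Dim), X_best the best solution so
# far, and l, L the current and maximum iteration counters.
def expanded_exploration(X, X_best, l, L, rng):
    X_mean = X.mean(axis=0)   # Equation (2): mean of all current solutions
    rand = rng.random()
    # Equation (1): soar high, then dive toward a point between best and mean
    return X_best * (1 - l / L) + (X_mean - X_best * rand)
```

Each call returns one candidate position of shape (Dim,); in the full algorithm this candidate would then be evaluated and compared against the current one.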
Phase 2: narrowed exploration ($X_2$) The majority of the Aquila's hunting uses this particular phase: contour flight combined with a short glide to attack the prey. The Aquila's position is updated using the following equations:


$$X_2(l+1) = X_B(l) \times Levy(D) + X_R(l) + (y - x) \times rand \tag{3}$$

where $X_R$ denotes a random location of the Aquila, $D$ denotes the dimension space, and $Levy$ denotes the Levy probability distribution function, which is evaluated by employing Equations (4), (5), (6), and (7).

$$Levy(D) = s \times \frac{u \times \sigma}{|\nu|^{1/\beta}} \tag{4}$$

$$\sigma = \frac{\Gamma(1+\beta) \times \sin\left(\frac{\pi\beta}{2}\right)}{\Gamma\left(\frac{1+\beta}{2}\right) \times \beta \times 2^{\frac{\beta-1}{2}}} \tag{5}$$

Here, $s$ and $\beta$ are constants whose values are 0.01 and 1.5, respectively, and $u$ and $\nu$ are random numbers in $[0, 1]$. The following two equations are used to calculate the values of $y$ and $x$, which simulate the spiral shape:

$$y = \gamma \times \cos(\theta) \tag{6}$$

$$x = \gamma \times \sin(\theta) \tag{7}$$

where $\gamma$ and $\theta$ are determined using Equations (8), (9), and (10):

$$\gamma = \gamma_1 + V \times D_1 \tag{8}$$

$$\theta = -W \times D_1 + \theta_1 \tag{9}$$

$$\theta_1 = \frac{3\pi}{2} \tag{10}$$
where $\gamma_1$ takes a value between 1 and 20, $V$ is equal to 0.0265, $W$ is equal to 0.005, and $D_1$ consists of integer values from 1 to the dimension size ($Dim$).
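The Levy-flight term of Equations (4) and (5) can be sketched as below. The text describes $u$ and $\nu$ simply as random numbers; the common Mantegna-style implementation assumed here draws them from normal distributions:

```python
import math
import numpy as np

# Hedged sketch of the Levy step with s = 0.01 and beta = 1.5 as in the text.
def levy(D, rng, s=0.01, beta=1.5):
    # Equation (5): scale factor sigma built from the gamma function
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)))
    u = rng.normal(0.0, sigma, size=D)   # numerator draw, scaled by sigma
    v = rng.normal(0.0, 1.0, size=D)
    return s * u / np.abs(v) ** (1 / beta)   # Equation (4)
```

The heavy-tailed steps this produces give the occasional long jump that the narrowed-exploration phase relies on.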
Phase 3: expanded exploitation (𝑋3 ) The third phase involves locating the prey location so that agents can launch a low-flying
preliminary strike vertically. The following are some possible ways that agents can attack their prey:

$$X_3(l+1) = (X_B(l) - X_{mean}(l)) \times \alpha - rand + ((UB - LB) \times rand + LB) \times \delta \tag{11}$$

where $\alpha$ and $\delta$ are exploitation adjustment parameters fixed at 0.1, and $UB$ and $LB$ denote the upper and lower limits of the search space.
Phase 4: narrowed exploitation ($X_4$) In the fourth phase, the Aquila quickly tracks and attacks its target along its escape trajectory, which is calculated using Equations (12), (13), (14), and (15).

$$X_4(l+1) = QF \times X_B(l) - (P_1 \times X(l) \times rand) - P_2 \times Levy(D) + rand \times P_1 \tag{12}$$

$$QF(l) = l^{\frac{2 \times rand - 1}{(1-L)^2}} \tag{13}$$

$$P_1 = 2 \times rand - 1 \tag{14}$$

$$P_2 = 2 \times \left(1 - \frac{l}{L}\right) \tag{15}$$
where $QF(l)$ denotes the quality function value, $P_1$ denotes the various motions of the AO, and $P_2$ denotes the flight slope used while chasing the target.

2.2. Opposition-based learning (OBL)

Opposition-Based Learning (OBL) is an effective optimization tool developed by Tizhoosh in 2005 [80]. The fundamental idea of the OBL method is to simultaneously evaluate the fitness of an estimate and its corresponding opposite estimate in order to find a superior candidate solution. Several meta-heuristic algorithms have effectively employed the OBL principle to speed up convergence. To define the OBL strategy, let $\hat{a}$ be the opposite of the real value $a \in [LB, UB]$, calculated using Equation (16).

$$\hat{a} = LB + UB - a. \tag{16}$$

The following equation can be employed to expand this definition to n dimensions:

$$\hat{a}_i = LB_i + UB_i - a_i, \tag{17}$$


where $\hat{a}_i$ and $a_i$ are two candidate solutions compared during the optimization process; the better of the two is maintained while the other is eliminated through evaluation of the objective function. For example, in a minimization problem, if $f(\hat{a}) \leq f(a)$, then $\hat{a}$ is maintained; otherwise $a$ is maintained. Fig. 1, Fig. 2, and Fig. 3 depict the formation of $a$ and its opposite $\hat{a}$ in one, two, and three-dimensional spaces, respectively.
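The selection step above can be sketched as a small routine; a minimal illustration of Equations (16)-(17) under a minimization assumption, with all names hypothetical:

```python
import numpy as np

# Minimal sketch of the OBL selection in Equations (16)-(17): evaluate a
# candidate and its elementwise opposite, and keep the better one
# (minimization is assumed). Function and variable names are illustrative.
def obl_select(a, lb, ub, f):
    a_hat = lb + ub - a               # Equation (17), elementwise opposite
    return a_hat if f(a_hat) <= f(a) else a

sphere = lambda x: float(np.sum(x ** 2))
lb, ub = np.zeros(3), np.full(3, 10.0)
a = np.array([9.0, 9.0, 9.0])
best = obl_select(a, lb, ub, sphere)  # opposite [1, 1, 1] has the lower value
```

When the candidate sits near one end of the search interval, its opposite lands near the other end, which is what lets OBL probe the far side of the space for free.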


Fig. 1. One dimensional space of OBL strategy.

Fig. 2. Two dimensional space of OBL strategy.

Fig. 3. Three dimensional space of OBL strategy.

2.3. Proposed method

2.3.1. Fast random opposition-based learning (FROBL)


In 2023, Sarada et al. [73] developed the novel Fast Random Opposition-Based Learning (FROBL) technique to increase the diversification (exploration) and intensification (exploitation) abilities of an algorithm. The FROBL approach is applied to base solutions to obtain opposite solutions, constructing opposite populations that enhance diversity and help the population jump out of local optima. Let $\hat{a}$ be the opposite value of the real value $a \in [LB, UB]$, evaluated by employing Equation (18).
$$\hat{a} = \begin{cases} M + r^2 \times \sin(2\pi r) \times \frac{a}{2}, & |a| < |M|; \\ M - r^2 \times \sin(2\pi r) \times \frac{a}{2}, & \text{otherwise.} \end{cases} \tag{18}$$

The following equation can be employed to expand this definition to n dimensions:

$$\hat{a}_{i,j} = \begin{cases} M + r^2 \times \sin(2\pi r) \times \frac{a_{i,j}}{2}, & |a_{i,j}| < |M|; \\ M - r^2 \times \sin(2\pi r) \times \frac{a_{i,j}}{2}, & \text{otherwise,} \end{cases} \tag{19}$$


Fig. 4. One dimensional space of FROBL strategy.

Fig. 5. Two dimensional space of FROBL strategy.

Fig. 6. Three dimensional space of FROBL strategy.

+ 
where 𝑀 = and 𝑟 express the random value lies between 0 and 1, and 𝜋 value is 3.14. The || || denotes the Euclidean
2
distance between the origin to the position of the particle. The sine function represents the cyclic structure combined with 𝜋 and 𝑟
according to the solution can be relocated around another solution. This can ensure that the region defined between the two solutions
is exploited. The solutions should be able to search outside the distance between their corresponding destinations to fully explore the
search space. The new functions have been introduced to modify the update conditions in comparison to Equation (17) to prevent
increased convergence while also avoiding poor diversity and local optima. Fig. 4, Fig. 5, and Fig. 6 express the formation of 𝑎 and
its opposite 𝑎̂ in one, two, and three-dimensional spaces, respectively.
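A minimal vectorized sketch of the FROBL opposite of Equation (19); the midpoint $M$, the per-element random draw, and all names are illustrative assumptions:

```python
import numpy as np

# Hedged sketch of the FROBL opposite in Equation (19). M is the interval
# midpoint (LB + UB)/2 and r is a fresh uniform random number per element;
# the vectorized form is an assumption for illustration.
def frobl_opposite(a, lb, ub, rng):
    M = (lb + ub) / 2.0
    r = rng.random(a.shape)
    term = r ** 2 * np.sin(2 * np.pi * r) * a / 2.0
    # add the perturbation to the midpoint when |a| < |M|, otherwise subtract
    return np.where(np.abs(a) < np.abs(M), M + term, M - term)
```

Unlike the fixed reflection of Equation (17), the $r^2 \sin(2\pi r)$ factor randomizes both the sign and the magnitude of the offset around the midpoint, which is what gives the opposite population its extra diversity.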

2.3.2. Fast random opposition-based learning Aquila optimizer (FROBLAO)


This section describes the structure of the proposed method for improving the performance of the AO algorithm. To add the capability of thoroughly exploring the search domain and quickly reaching the optimal value, the AO is modified by combining its original structure with the FROBL strategy. After the exploration phase of the AO is complete, the FROBL technique is used to maintain 25% of the AO-calculated domain space. This process permits the initial domain space to quickly approach the optimum value and restores out-of-range values. The proposed method is named FROBLAO, and Algorithm 1 illustrates its main steps. According to the No Free Lunch (NFL) theorem, no optimization algorithm can effectively address every optimization problem; no algorithm can be improved on some problems without sacrificing performance on others. The proposed FROBLAO algorithm's pseudocode is shown in Algorithm 1, and Fig. 7 depicts the flowchart of the FROBLAO algorithm.


Algorithm 1 The proposed FROBLAO's pseudo-code.

1: Start FROBLAO.
2: Input: the optimization problem information.
3: Initialize the population X of the AO.
4: Initialize the AO parameters.
5: while l < L do
6:     Compute the fitness function values.
7:     Select the best candidate solution X_B.
8:     for (i = 1, 2, ..., N) do
9:         Update the present mean location.
10:        Compute the parameters.
11:        if (l ≤ (2/3) * L) then
12:            if rand ≤ 0.5 then
13:                Update the present location using Equation (1).
14:                Compute the opposite position using Equation (19).
15:            else
16:                Update the present location using Equation (3).
17:                Compute the opposite position using Equation (19).
18:            end if
19:        else
20:            if rand ≤ 0.5 then
21:                Update the present location using Equation (11).
22:                Compute the opposite position using Equation (19).
23:            else
24:                Update the present location using Equation (12).
25:                Compute the opposite position using Equation (19).
26:            end if
27:        end if
28:    end for
29: end while
30: Return the best solution.
31: End FROBLAO.
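A compact, self-contained sketch of the loop in Algorithm 1 follows. This is a hypothetical illustration, not the authors' implementation: the four AO position updates of Equations (1), (3), (11), and (12) are replaced by a simple placeholder step, while the control flow (exploration for the first two-thirds of iterations) and the FROBL opposition step of Equation (19) follow the pseudocode:

```python
import numpy as np

def froblao_sketch(f, lb, ub, M=30, L=200, seed=0):
    """Illustrative FROBLAO-style loop; all names are assumptions."""
    rng = np.random.default_rng(seed)
    dim = lb.size
    X = lb + rng.random((M, dim)) * (ub - lb)      # initialize population
    fit = np.apply_along_axis(f, 1, X)
    best = X[fit.argmin()].copy()

    def frobl_opposite(a):
        # Equation (19), vectorized around the interval midpoint
        mid = (lb + ub) / 2.0
        r = rng.random(a.shape)
        term = r ** 2 * np.sin(2 * np.pi * r) * a / 2.0
        return np.where(np.abs(a) < np.abs(mid), mid + term, mid - term)

    def ao_update(i, l, explore):
        # Placeholder standing in for Equations (1)/(3) (exploration) and
        # (11)/(12) (exploitation): a random step toward the best solution.
        step = 1.0 - l / L if explore else 0.1
        return X[i] + step * rng.standard_normal(dim) * (best - X[i] + 1e-9)

    for l in range(L):
        explore = l <= (2 / 3) * L       # Algorithm 1, line 11
        for i in range(M):
            cand = ao_update(i, l, explore)
            for trial in (cand, frobl_opposite(cand)):
                trial = np.clip(trial, lb, ub)   # restore out-of-range values
                ft = f(trial)
                if ft < fit[i]:                  # greedy replacement
                    X[i], fit[i] = trial.copy(), ft
        best = X[fit.argmin()].copy()
    return best, float(fit.min())
```

On a simple sphere function this sketch converges toward the origin, illustrating how the greedy comparison between each candidate and its FROBL opposite drives the population to the optimum.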

Fig. 7. The flowchart of the FROBLAO algorithm.

3. Experimental results and discussions

This study analyzes the FROBLAO algorithm's performance using a variety of test functions. For numerical validation, forty-three benchmark test functions have been used: twenty-three CEC 2005 benchmark functions [81,82], ten CEC 2019 benchmark functions [83], and ten CEC 2020 benchmark functions [84]. The details of the

Table 2
CEC 2005 test functions.

Function | Dim | Range | $F_{min}$

$F_1(h) = \sum_{i=1}^{n} h_i^2$ | 30 | $[-100, 100]$ | 0
$F_2(h) = \sum_{i=1}^{n} |h_i| + \prod_{i=1}^{n} |h_i|$ | 30 | $[-10, 10]$ | 0
$F_3(h) = \sum_{i=1}^{n} \left( \sum_{j=1}^{i} h_j \right)^2$ | 30 | $[-100, 100]$ | 0
$F_4(h) = \max_i \{ |h_i|, 1 \leq i \leq n \}$ | 30 | $[-100, 100]$ | 0
$F_5(h) = \sum_{i=1}^{n-1} [100(h_{i+1} - h_i^2)^2 + (h_i - 1)^2]$ | 30 | $[-30, 30]$ | 0
$F_6(h) = \sum_{i=1}^{n} ([h_i + 0.5])^2$ | 30 | $[-100, 100]$ | 0
$F_7(h) = \sum_{i=1}^{n} i h_i^4 + random[0, 1)$ | 30 | $[-1.28, 1.28]$ | 0
$F_8(h) = \sum_{i=1}^{n} -h_i \sin(\sqrt{|h_i|})$ | 30 | $[-500, 500]$ | $-12569.5$
$F_9(h) = \sum_{i=1}^{n} [h_i^2 - 10\cos(2\pi h_i) + 10]$ | 30 | $[-5.12, 5.12]$ | 0
$F_{10}(h) = -20\exp\left(-0.2\sqrt{\frac{1}{n}\sum_{i=1}^{n} h_i^2}\right) - \exp\left(\frac{1}{n}\sum_{i=1}^{n} \cos(2\pi h_i)\right) + 20 + e$ | 30 | $[-32, 32]$ | 0
$F_{11}(h) = \frac{1}{4000}\sum_{i=1}^{n} h_i^2 - \prod_{i=1}^{n} \cos\left(\frac{h_i}{\sqrt{i}}\right) + 1$ | 30 | $[-600, 600]$ | 0
$F_{12}(h) = \frac{\pi}{n}\left\{10 \sin^2(\pi y_1) + \sum_{i=1}^{n-1}(y_i - 1)^2 \left[1 + 10\sin^2(\pi y_{i+1})\right] + (y_n - 1)^2\right\} + \sum_{i=1}^{n} u(h_i, 10, 100, 4)$, where $y_i = 1 + \frac{h_i + 1}{4}$ and $u(h_i, a, k, m) = \begin{cases} k(h_i - a)^m, & h_i > a \\ 0, & -a < h_i < a \\ k(-h_i - a)^m, & h_i < -a \end{cases}$ | 30 | $[-50, 50]$ | 0
$F_{13}(h) = 0.1\left\{\sin^2(3\pi h_1) + \sum_{i=1}^{n}(h_i - 1)^2 [1 + \sin^2(3\pi h_i + 1)] + (h_n - 1)^2 [1 + \sin^2(2\pi h_n)]\right\} + \sum_{i=1}^{n} u(h_i, 5, 100, 4)$ | 30 | $[-50, 50]$ | 0
$F_{14}(h) = \left( \frac{1}{500} + \sum_{j=1}^{25} \frac{1}{j + \sum_{i=1}^{2}(h_i - a_{ij})^6} \right)^{-1}$ | 2 | $[-65, 65]$ | 1
$F_{15}(h) = \sum_{i=1}^{11} \left[ a_i - \frac{h_1(b_i^2 + b_i h_2)}{b_i^2 + b_i h_3 + h_4} \right]^2$ | 4 | $[-5, 5]$ | 0.0003
$F_{16}(h) = 4h_1^2 - 2.1h_1^4 + \frac{1}{3}h_1^6 + h_1 h_2 - 4h_2^2 + 4h_2^4$ | 2 | $[-5, 5]$ | $-1.0316$
$F_{17}(h) = \left(h_2 - \frac{5.1}{4\pi^2}h_1^2 + \frac{5}{\pi}h_1 - 6\right)^2 + 10\left(1 - \frac{1}{8\pi}\right)\cos h_1 + 10$ | 2 | $[-5, 5]$ | 0.398
$F_{18}(h) = [1 + (h_1 + h_2 + 1)^2(19 - 14h_1 + 3h_1^2 - 14h_2 + 6h_1 h_2 + 3h_2^2)] \times [30 + (2h_1 - 3h_2)^2(18 - 32h_1 + 12h_1^2 + 48h_2 - 36h_1 h_2 + 27h_2^2)]$ | 2 | $[-2, 2]$ | 3
$F_{19}(h) = -\sum_{i=1}^{4} c_i \exp\left(-\sum_{j=1}^{3} a_{ij}(h_j - p_{ij})^2\right)$ | 3 | $[1, 3]$ | $-3.86$
$F_{20}(h) = -\sum_{i=1}^{4} c_i \exp\left(-\sum_{j=1}^{6} a_{ij}(h_j - p_{ij})^2\right)$ | 6 | $[0, 1]$ | $-3.32$
$F_{21}(h) = -\sum_{i=1}^{5} [(H - a_i)(H - a_i)^T + c_i]^{-1}$ | 4 | $[0, 10]$ | $-10.1532$
$F_{22}(h) = -\sum_{i=1}^{7} [(H - a_i)(H - a_i)^T + c_i]^{-1}$ | 4 | $[0, 10]$ | $-10.4028$
$F_{23}(h) = -\sum_{i=1}^{10} [(H - a_i)(H - a_i)^T + c_i]^{-1}$ | 4 | $[0, 10]$ | $-10.5363$

Table 3
CEC 2019 test functions.

Functions | Dim | Range | $CEC_{min}$

CEC01 (Storn's Chebyshev polynomial fitting problem) | 9 | $[-8192, 8192]$ | 1
CEC02 (Inverse Hilbert matrix problem) | 16 | $[-16384, 16384]$ | 1
CEC03 (Lennard-Jones minimum energy cluster) | 18 | $[-4, 4]$ | 1
CEC04 (Rastrigin's function) | 10 | $[-100, 100]$ | 1
CEC05 (Griewangk's function) | 10 | $[-100, 100]$ | 1
CEC06 (Weierstrass function) | 10 | $[-100, 100]$ | 1
CEC07 (Modified Schwefel's function) | 10 | $[-100, 100]$ | 1
CEC08 (Expanded Schaffer's F6 function) | 10 | $[-100, 100]$ | 1
CEC09 (Happy Cat function) | 10 | $[-100, 100]$ | 1
CEC10 (Ackley function) | 10 | $[-100, 100]$ | 1

CEC 2005, CEC 2019, and CEC 2020 test functions are given in Table 2, Table 3, and Table 4, respectively. The FROBLAO algorithm has been run thirty times independently to evaluate its stability and dependability. The average (Mean) and standard deviation (Std) values of the FROBLAO algorithm are provided for comparison with AO variants, popular algorithms, top-performing algorithms, and recent high-performance algorithms: AO [14], OBLAO, TSA [74], SSA [38], MVO [29], GWO [48], SCA [75], LSHADE [76], CMA-ES [77], MRFO [78], and AVOA [79]. The initial parameters of all algorithms are given in Table 5. To make a valid comparison, the evaluated algorithms use the same number of runs (30), population size (30), and maximum number of iterations (500). All experiments are carried out on Windows 11, Intel Core i3, 2.10 GHz, 8.00 GB RAM, MATLAB R2022b.

3.1. Time complexity of the FROBLAO

The time complexity of the proposed FROBLAO algorithm can be determined by evaluating the time complexities of the initialization, fitness-calculation, and position-updating processes independently. Therefore, $O(FROBLAO)$ = $O$(initialize the positions) + $O$(calculate the fitness values) + $O$(update the positions) + $O(FROBL)$. Assume that $M$ denotes the population size and $L$ denotes
8
S. Gopi and P. Mohapatra Heliyon 10 (2024) e26187

Table 4
CEC 2020 test functions.

Functions | Dim | Range | $H_{min}$

H1 (CEC 2017, F1) (Shifted and Rotated Bent Cigar Function) | 10 | $[-100, 100]$ | 100
H2 (CEC 2014, F11) (Shifted and Rotated Schwefel's Function) | 10 | $[-100, 100]$ | 1100
H3 (CEC 2017, F7) (Shifted and Rotated Lunacek bi-Rastrigin Function) | 10 | $[-100, 100]$ | 700
H4 (CEC 2017, F19) (Expanded Rosenbrock's plus Griewangk's Function) | 10 | $[-100, 100]$ | 1900
H5 (CEC 2014, F17) (Hybrid Function 1) | 10 | $[-100, 100]$ | 1700
H6 (CEC 2017, F16) (Hybrid Function 2) | 10 | $[-100, 100]$ | 1600
H7 (CEC 2014, F21) (Hybrid Function 3) | 10 | $[-100, 100]$ | 2100
H8 (CEC 2017, F22) (Composition Function 1) | 10 | $[-100, 100]$ | 2200
H9 (CEC 2017, F24) (Composition Function 2) | 10 | $[-100, 100]$ | 2400
H10 (CEC 2017, F25) (Composition Function 3) | 10 | $[-100, 100]$ | 2500

Table 5
The control parameters of the competing algorithms are set to certain values.

Algorithm | Parameter | Value

FROBLAO | $V$, $\gamma_1$, $W$, $\alpha$, $\delta$ | 0.0265, 10, 0.005, 0.1, 0.1
AO | $V$, $\gamma_1$, $W$, $\alpha$, $\delta$ | 0.0265, 10, 0.005, 0.1, 0.1
OBLAO | $V$, $\gamma_1$, $W$, $\alpha$, $\delta$ | 0.0265, 10, 0.005, 0.1, 0.1
TSA | $P_{min}$, $P_{max}$ | 1, 4
SSA | Leader position update probability | 0.5
MVO | WEP_Max, WEP_Min | 1, 0.2
GWO | $l$, $r$ | $[-1, 1]$, $[0, 1]$
SCA | $a$ | 2
LSHADE | Pbest rate, Arc rate, Memory size | 0.11, 1.4, 5
CMA-ES | $\alpha$ | 2
MRFO | $S$, $r_1$, $r_2$, $r_3$ | 2, $[0, 1]$, $[0, 1]$, $[0, 1]$
AVOA | $\alpha$, $\beta$, $\delta$, $P_1$, $P_2$, $P_3$ | 0.8, 0.2, 2.5, $[0, 1]$, $[0, 1]$, $[0, 1]$

the total number of iterations, and $Dim$ represents the number of dimensions. Then the time complexities of the components of the proposed FROBLAO are as follows:
𝑂 (Initialize the position) = 𝑂(𝑀)
𝑂 (Calculate the fitness value) = 𝑂(𝑀 × 𝐿)
𝑂 (Update the position) = 𝑂(𝑀 × 𝐿 × 𝐷𝑖𝑚)
𝑂(𝐹 𝑅𝑂𝐵𝐿) = 𝑂(𝑀 × 𝐿 × 𝐷𝑖𝑚).
Hence the total time complexity of the proposed FROBLAO is equal to 𝑂(𝐹 𝑅𝑂𝐵𝐿𝐴𝑂) = 𝑂(𝑀) + 𝑂(𝑀 × 𝐿) + 𝑂(𝑀 × 𝐿 × 𝐷𝑖𝑚)
+ 𝑂(𝑀 × 𝐿 × 𝐷𝑖𝑚) = 𝑂(𝑀 × 𝐿 × 𝐷𝑖𝑚).

3.2. Measurements of performance

• Average ($Mean$): the average value is calculated using Equation (20):

$$Mean = \frac{1}{N} \sum_{i=1}^{N} M_i, \tag{20}$$

where $M_i$ signifies the best solution obtained from the $i$-th run and $N$ denotes the number of independent runs (30).
• Standard deviation ($Std$): the standard deviation is calculated using Equation (21):

$$Std = \sqrt{\frac{1}{N} \sum_{i=1}^{N} (M_i - \overline{M})^2}, \tag{21}$$

where $\overline{M}$ denotes the mean over the 30 runs.
• t-test: a statistical test used to determine the significant difference between the proposed method and the other meta-heuristics, calculated using Equation (22):

$$t = \frac{Mean_1 - Mean_2}{\sqrt{\frac{Std_1^2 + Std_2^2}{N}}}, \tag{22}$$

where $Mean_1$, $Mean_2$, $Std_1$, and $Std_2$ are the averages and standard deviations of the two algorithms being compared.
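As a concrete illustration of Equations (20)-(22), the sketch below computes the mean, standard deviation, and t value for two made-up sets of run results; the data and names are hypothetical:

```python
import numpy as np

# Worked example of Equations (20)-(22): mean, standard deviation, and the
# t statistic comparing two algorithms over N independent runs.
def t_value(runs1, runs2):
    N = len(runs1)
    m1, m2 = np.mean(runs1), np.mean(runs2)        # Equation (20)
    s1, s2 = np.std(runs1), np.std(runs2)          # Equation (21), population std
    return (m1 - m2) / np.sqrt((s1 ** 2 + s2 ** 2) / N)   # Equation (22)

algo_a = np.array([0.10, 0.12, 0.09, 0.11, 0.10])  # fictitious run errors
algo_b = np.array([0.30, 0.28, 0.35, 0.31, 0.29])
t = t_value(algo_a, algo_b)   # strongly negative: algorithm A attains lower error
```

A large absolute t value signals that the difference between the two mean results is unlikely to be due to run-to-run noise alone.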


Table 6
Results of CEC 2005 test functions on FROBLAO to compare with other algorithms.
Functions 𝐴𝑂 𝑂𝐵𝐿𝐴𝑂 𝑇 𝑆𝐴 𝑆𝑆𝐴 𝑀𝑉 𝑂 𝐺𝑊 𝑂 𝑆𝐶𝐴 𝐿𝑆𝐻𝐴𝐷𝐸 𝐶𝑀𝐴 − 𝐸𝑆 𝑀𝑅𝐹 𝑂 𝐴𝑉 𝑂𝐴 𝐹 𝑅𝑂𝐵𝐿𝐴𝑂
F1
𝑀𝑒𝑎𝑛 1.7163E-111 0 2.5252E-21 1.7682E-07 1.4701E-02 2.4998E-27 6.7704E-12 1.0820E-01 1.1333E-05 0 5.9672E-302 0
𝑆𝑡𝑑 9.4008E-111 0 6.5609E-21 2.1148E-07 7.5624E-03 4.8313E-27 2.6459E-11 1.4983E-01 4.6012E-06 0 0 0

F2
𝑀𝑒𝑎𝑛 3.8906E-55 0 7.8971E-14 1.3435E-02 3.7292E-02 8.5561E-17 1.1659E-09 1.4620E-01 5.1791E-03 1.2923E-205 1.0365E-148 0
𝑆𝑡𝑑 2.1293E-54 0 8.0448E-14 3.7141E-02 1.3708E-02 5.5446E-17 2.6378E-09 1.6020E-01 1.5425E-03 0 5.6753E-148 0

F3
𝑀𝑒𝑎𝑛 5.7030E-102 0 3.1817E-04 7.7395E-07 1.2050E-01 8.7098E-05 1.2070E-02 5.1011E+02 4.3656E-01 0 1.1742E-228 0
𝑆𝑡𝑑 3.1220E-101 0 7.4468E-04 2.2226E-06 7.3420E-02 2.5361E-04 4.8676E-02 3.2016E+02 2.1998E-01 0 0 0

F4
𝑀𝑒𝑎𝑛 1.2953E-51 0 2.9942E-01 4.2172E-05 8.4682E-02 8.1885E-07 2.3103E-03 1.4483E+01 2.3320E-02 5.5620E-200 4.7178E-145 0
𝑆𝑡𝑑 7.0907E-51 0 3.5015E-01 1.1192E-04 2.3954E-02 1.1604E-06 4.9523E-03 3.0416E+00 6.2271E-03 0 2.5798E-144 0

F5
𝑀𝑒𝑎𝑛 1.9136E-03 1.0983E-05 2.8236E+01 1.6916E+02 2.5696E+02 2.7139E+01 7.4362E+00 1.5630E+02 4.3648E+01 2.2849E+01 4.6278E-05 2.8763E-07
𝑆𝑡𝑑 2.5562E-03 9.3549E-06 9.5719E-01 4.8774E+02 5.1973E+02 8.2774E-01 4.1879E-01 1.1556E+02 4.3488E+01 4.7244E-01 3.6732E-05 3.5913E-07

F6
𝑀𝑒𝑎𝑛 2.0674E-05 1.0051E-06 3.5443E+00 8.8172E-10 1.4239E-02 7.4424E-01 4.3737E-01 5.1390E-02 1.0834E-05 0 4.7940E-07 1.2535E-08
𝑆𝑡𝑑 2.9670E-05 1.0187E-06 7.3726E-01 3.3196E-10 6.4076E-03 3.6459E-01 1.7695E-01 4.5090E-02 4.0063E-06 0 2.7300E-07 2.1314E-08

F7
𝑀𝑒𝑎𝑛 9.1969E-05 1.0125E-05 1.0861E-02 1.5511E-02 3.1080E-03 1.8978E-03 2.4682E-03 8.4200E-02 2.4237E-02 1.5070E-04 1.5228E-04 4.3932E-07
𝑆𝑡𝑑 1.1608E-04 5.9546E-06 5.7467E-03 8.9977E-03 2.3910E-03 9.8852E-04 1.7364E-03 3.9555E-02 6.4990E-03 1.4087E-04 1.3585E-04 9.4442E-07

F8
𝑀𝑒𝑎𝑛 -3.6295E+03 -4.1898E+03 -5.8489E+03 -2.6808E+03 -2.9450E+03 -5.9188E+03 -2.2056E+03 -1.2117E+04 -1.1770E+05 -8.2331E+03 -1.2195E+04 -4.1898E+03
𝑆𝑡𝑑 8.0617E+02 1.1158E-02 6.4755E+02 3.1134E+02 3.2022E+02 1.0274E+03 1.7423E+02 1.9635E+02 6.5578E+04 7.5921E+02 9.1477E+02 1.3412E-02

F9
𝑀𝑒𝑎𝑛 0 0 1.8217E+02 1.5090E+01 1.4599E+01 4.0971E+00 1.5276E+00 5.4683E+00 1.3110E+02 0 0 0
𝑆𝑡𝑑 0 0 4.6500E+01 7.1178E+00 5.8972E+00 4.4188E+00 7.0681E+00 2.5698E+00 7.0854E+01 0 0 0

F10
𝑀𝑒𝑎𝑛 4.4409E-16 4.4409E-16 1.6831E+00 7.3684E-01 5.8901E-01 1.0430E-13 1.0500E-03 2.9171E+00 1.0363E-03 4.4409E-16 4.4409E-16 4.4409E-16
𝑆𝑡𝑑 3.0088E-31 3.0088E-31 1.6232E+00 9.8093E-01 6.4644E-01 1.9608E-14 5.7332E-03 8.3701E-01 2.4638E-04 3.0088E-31 3.0088E-31 3.0088E-31

F11
𝑀𝑒𝑎𝑛 0 0 1.1127E-02 1.9688E-01 3.6576E-01 3.1189E-03 7.6846E-02 1.5184E-01 1.3608E-04 0 0 0
𝑆𝑡𝑑 0 0 1.0123E-02 1.0612E-01 1.3126E-01 7.2341E-03 1.0683E-01 1.2428E-01 4.5562E-05 0 0 0

F12
𝑀𝑒𝑎𝑛 9.5407E-06 2.1123E-07 7.1429E+00 7.4179E-01 3.2364E-02 4.9874E-02 9.2963E-02 1.6222E+00 1.3504E-06 9.1020E-09 2.2530E-08 5.8385E-12
𝑆𝑡𝑑 2.3033E-05 1.6230E-07 3.2399E+00 1.0493E+00 9.6468E-02 2.2068E-02 3.2778E-02 1.4785E+00 6.0598E-07 1.9949E-08 1.3650E-08 1.3199E-11

F13
𝑀𝑒𝑎𝑛 1.7980E-05 1.4196E-07 3.0684E+00 1.7995E-03 6.5473E-03 5.1398E-01 3.1885E-01 3.0158E+00 1.5074E-05 2.4219E+00 4.6219E-08 1.9155E-08
𝑆𝑡𝑑 4.2318E-05 1.1463E-07 6.1632E-01 4.9376E-03 5.3018E-03 1.8370E-01 1.0016E-01 3.7334E+00 5.8219E-06 1.1242E+00 3.3144E-08 3.1011E-08

F14
𝑀𝑒𝑎𝑛 2.5053E+00 1.3948E+00 9.8551E+00 1.2297E+00 9.9800E-01 4.6626E+00 1.8907E+00 9.9800E-01 4.9447E+00 9.9800E-01 1.4280E+00 9.9800E-01
𝑆𝑡𝑑 2.7753E+00 8.0721E-01 4.7878E+00 5.6408E-01 3.3876E-16 3.6583E+00 1.8950E+00 3.3876E-16 3.0422E+00 3.3876E-16 8.1025E-01 3.3876E-16

F15
𝑀𝑒𝑎𝑛 4.9413E-04 3.3059E-04 1.2708E-02 4.0687E-03 4.2868E-03 5.7283E-03 1.0920E-03 4.4194E-04 2.3521E-03 4.4973E-04 3.6969E-04 3.1016E-04
𝑆𝑡𝑑 1.0819E-04 1.6409E-05 2.6755E-02 7.4170E-03 7.4084E-03 8.9776E-03 3.7198E-04 3.5372E-04 1.3403E-03 3.5189E-04 1.0432E-04 2.8158E-06

F16
𝑀𝑒𝑎𝑛 -1.0312E+00 -1.0316E+00 -1.0316E+00 -1.0316E+00 -1.0316E+00 -1.0316E+00 -1.0316E+00 -1.0316E+00 -1.0316E+00 -1.0316E+00 -1.0316E+00 -1.0316E+00
𝑆𝑡𝑑 4.2135E-04 1.0501E-05 3.9165E-07 2.3469E-14 4.0256E-07 2.8019E-08 7.6526E-05 0 0 0 0 0

F17
𝑀𝑒𝑎𝑛 3.9804E-01 3.9789E-01 3.9794E-01 3.9789E-01 3.9789E-01 3.9790E-01 4.0016E-01 3.9789E-01 3.9789E-01 3.9789E-01 3.9789E-01 3.9790E-01
𝑆𝑡𝑑 2.4679E-04 7.4273E-06 5.3285E-05 1.3357E-14 8.2593E-07 6.3724E-05 2.2952E-03 1.1292E-16 1.1292E-16 1.1292E-16 8.7029E-16 1.1292E-16

F18
𝑀𝑒𝑎𝑛 3.0285E+00 3.0015E+00 1.1368E+01 3.0000E+00 5.7000E+00 3.0000E+00 3.0001E+00 3.0000E+00 3.0000E+00 3.0000E+00 3.0000E+00 3.0000E+00
𝑆𝑡𝑑 2.7722E-02 1.4704E-03 2.2418E+01 1.8192E-13 1.4789E+01 5.4917E-05 1.8330E-04 2.5657E-15 2.5657E-15 1.8142E-15 9.1421E-06 1.8142E-15

F19
𝑀𝑒𝑎𝑛 -3.8551E+00 -3.8625E+00 -3.8624E+00 -3.8628E+00 -3.8628E+00 -3.8608E+00 -3.8548E+00 -3.0048E-01 -3.8628E+00 -3.8628E+00 -3.8628E+00 -3.8625E+00
𝑆𝑡𝑑 5.3641E-03 2.7689E-04 1.4292E-03 7.1564E-11 1.5117E-06 2.9353E-03 3.1724E-03 0 2.7101E-15 2.7101E-15 2.3705E-11 2.6409E-04

F20
𝑀𝑒𝑎𝑛 -3.1293E+00 -3.2306E+00 -3.2517E+00 -3.2270E+00 -3.2493E+00 -3.2496E+00 -2.9377E+00 -3.3022E+00 -3.2705E+00 -3.2863E+00 -3.2859E+00 -3.3197E+00
𝑆𝑡𝑑 8.9217E-02 6.1566E-02 7.3541E-02 5.9148E-02 6.0363E-02 1.1545E-01 3.3429E-01 4.5066E-02 5.9923E-02 5.5415E-02 5.6017E-02 2.2248E-03

F21
𝑀𝑒𝑎𝑛 -1.0129E+01 -1.0153E+01 -5.2940E+00 -7.3946E+00 -6.9634E+00 -8.3909E+00 -2.6755E+00 -8.6519E+00 -8.4084E+00 -8.7937E+00 -1.0153E+01 -1.0153E+01
𝑆𝑡𝑑 5.1031E-02 4.6493E-04 2.9851E+00 3.3091E+00 3.1573E+00 2.8084E+00 2.2941E+00 2.8296E+00 3.2168E+00 2.2930E+00 5.4758E-05 7.4626E-13

F22
𝑀𝑒𝑎𝑛 -1.0393E+01 -1.0403E+01 -6.4279E+00 -8.8387E+00 -8.3044E+00 -1.0401E+01 -3.4356E+00 -9.6205E+00 -1.0403E+01 -8.6312E+00 -1.0403E+01 -1.0403E+01
𝑆𝑡𝑑 1.4369E-02 4.6314E-04 3.5541E+00 2.9226E+00 2.8591E+00 9.6699E-04 1.9653E+00 2.0646E+00 0 2.5485E+00 4.0616E-05 1.0137E-13

F23
𝑀𝑒𝑎𝑛 -1.0524E+01 -1.0536E+01 -6.5208E+00 -8.6442E+00 -8.3275E+00 -1.0535E+01 -3.7448E+00 -1.0134E+01 -1.0266E+01 -9.4548E+00 -1.0536E+01 -1.0536E+01
𝑆𝑡𝑑 2.3294E-02 3.5260E-04 3.7865E+00 3.2224E+00 3.2621E+00 1.2180E-03 1.6062E+00 1.5402E+00 1.4815E+00 2.2002E+00 1.0265E-03 2.4788E-13

3.3. Comparison results of CEC 2005 test functions

In this subsection, we discuss the efficiency of the proposed FROBLAO and compare it with AO variants, popular algorithms, and top-performing algorithms. Table 6 reports the mean and standard deviation achieved by each competing algorithm. In comparison to AO, FROBLAO produces better results in 20 functions (F1 to F7 and F12 to F23) and identical results in 3 functions (F9 to F11). In comparison to OBLAO, FROBLAO produces better results in 13 functions (F5 to F7, F12 to F18, and F20 to F22) and identical results in 9 functions (F1 to F4, F8 to F11, and F19). In comparison to TSA, FROBLAO produces better results in 22 functions (F1 to F7 and F9 to F23). In comparison to SSA, FROBLAO produces better results in 23 functions; similar results hold against GWO and SCA. In comparison to MVO, FROBLAO produces better results in 22 functions (F1 to F13 and F15 to F23). In comparison to LSHADE,


Table 7
Results of FROBLAO and the compared algorithms on the CEC 2019 test functions.
Functions AO OBLAO TSA SSA MVO GWO SCA LSHADE CMA-ES MRFO AVOA FROBLAO
CEC01
𝑀𝑒𝑎𝑛 5.8162E+04 4.3500E+04 4.1623E+08 6.9628E+09 4.0931E+09 2.6073E+08 8.3745E+09 2.0417E+07 3.8659E+08 3.8362E+04 4.6338E+04 3.7674E+04
𝑆𝑡𝑑 1.2167E+04 3.1767E+03 8.4702E+08 8.1850E+09 2.9698E+09 5.2126E+08 1.1427E+10 5.3397E+07 2.9751E+08 7.1216E+02 4.2265E+03 9.3083E+02

CEC02
𝑀𝑒𝑎𝑛 1.7371E+01 1.7349E+01 1.8309E+01 1.7347E+01 1.8954E+01 1.7344E+01 1.7480E+01 1.7343E+01 1.0647E+02 1.7347E+01 1.7343E+01 1.7343E+01
𝑆𝑡𝑑 1.2233E-02 2.5097E-03 5.6451E-01 1.3780E-02 7.9543E-01 3.1615E-04 8.1389E-02 3.4407E-14 4.6060E+01 7.8493E-04 7.4226E-08 0

CEC03
𝑀𝑒𝑎𝑛 1.2702E+01 1.2702E+01 1.2703E+01 1.2702E+01 1.2702E+01 1.2702E+01 1.2703E+01 1.2702E+01 1.2702E+01 1.2702E+01 1.2702E+01 1.2702E+01
𝑆𝑡𝑑 6.3450E-06 4.1688E-07 1.4095E-03 2.6513E-11 4.6965E-09 3.6996E-06 1.2184E-04 9.0336E-15 1.3329E-04 9.0336E-15 1.9787E-09 9.0336E-15

CEC04
𝑀𝑒𝑎𝑛 6.6005E+02 5.0794E+01 4.2087E+03 4.7559E+01 3.8414E+01 2.0358E+02 1.7648E+03 9.5887E+00 2.1930E+01 3.3299E+01 1.3232E+02 4.3987E+01
𝑆𝑡𝑑 5.4468E+02 2.1377E+01 2.7740E+03 2.7312E+01 1.2130E+01 6.5305E+02 9.8911E+02 4.2700E+00 6.9889E+00 1.9416E+01 6.1470E+01 2.0421E+01

CEC05
𝑀𝑒𝑎𝑛 1.5377E+00 1.2566E+00 2.6898E+00 1.2305E+00 1.3061E+00 1.4244E+00 2.2222E+00 1.0357E+00 1.0276E+00 1.1439E+00 1.4281E+00 1.2104E+00
𝑆𝑡𝑑 1.7008E-01 1.3845E-01 5.8219E-01 1.2223E-01 1.4195E-01 2.6916E-01 1.0522E-01 2.6004E-02 3.4842E-02 1.0126E-01 3.2860E-01 9.0954E-02

CEC06
𝑀𝑒𝑎𝑛 1.0662E+01 8.3942E+00 1.1122E+01 6.2934E+00 8.4414E+00 1.0866E+01 1.0838E+01 8.7360E+00 1.0517E+01 1.0426E+01 6.3738E+00 5.7330E+00
𝑆𝑡𝑑 1.1467E+00 9.8413E-01 6.3773E-01 3.5783E-01 1.1857E+00 5.8197E-01 6.7443E-01 1.9273E+00 6.5943E-01 7.2000E-01 1.5924E+00 9.0350E-01

CEC07
𝑀𝑒𝑎𝑛 3.6239E+02 1.4045E+02 6.5854E+02 3.0983E+02 3.1114E+02 4.8478E+02 8.3450E+02 2.4262E+02 2.8153E+02 2.3616E+02 3.6043E+02 3.3918E+01
𝑆𝑡𝑑 1.6835E+02 1.3710E+02 2.2301E+02 2.2817E+02 2.2466E+02 3.1424E+02 1.6636E+02 1.5391E+02 2.4113E+02 1.3423E+02 1.9949E+02 1.1933E+02

CEC08
𝑀𝑒𝑎𝑛 5.4593E+00 4.5671E+00 6.2935E+00 5.0889E+00 5.3875E+00 5.0082E+00 6.0604E+00 4.5729E+00 5.1209E+00 4.6699E+00 5.7072E+00 3.3854E+00
𝑆𝑡𝑑 7.1525E-01 5.5741E-01 5.7205E-01 6.1588E-01 5.8004E-01 1.1802E+00 4.1349E-01 5.3735E-01 1.7053E+00 8.9526E-01 5.0759E-01 7.6536E-01

CEC09
𝑀𝑒𝑎𝑛 5.1682E+00 2.4782E+00 4.7013E+02 2.6484E+00 2.4781E+00 4.4617E+00 1.2319E+02 2.4700E+00 2.4009E+00 2.3690E+00 3.3665E+00 2.3572E+00
𝑆𝑡𝑑 8.5265E-01 8.3635E-02 5.4262E+02 1.5708E-01 6.5192E-02 7.8120E-01 1.0049E+02 1.1041E-01 1.2125E-02 2.0034E-02 5.0230E-01 9.3767E-03

CEC10
𝑀𝑒𝑎𝑛 1.8980E+01 1.1731E+01 2.0504E+01 2.0038E+01 2.0104E+01 2.0472E+01 2.0508E+01 2.0098E+01 2.0433E+01 1.6432E+01 2.0045E+01 4.6705E+00
𝑆𝑡𝑑 4.4574E+00 9.6746E+00 7.6757E-02 6.4503E-02 4.5829E-02 2.3391E-01 1.0464E-01 8.5590E-02 8.3900E-02 8.0856E+00 4.8026E-02 7.8510E+00

FROBLAO produces better results in 19 functions (F1 to F7, F9 to F13, F15, and F18 to F23) and identical results in 3 functions (F14, F16, and F17). In comparison to CMA-ES, FROBLAO produces better results in 19 functions (F1 to F7, F9 to F15, F18, and F20 to F23) and identical results in 3 functions (F16, F17, and F19). In comparison to MRFO, FROBLAO produces better results in 11 functions (F2, F4, F5, F7, F12, F13, F15, and F20 to F23) and identical results in 9 functions (F1, F3, F9 to F11, F14, and F16 to F18). In comparison to AVOA, FROBLAO produces better results in 18 functions (F1 to F7, F12 to F15, and F17 to F23) and identical results in 4 functions (F9 to F11 and F16). Hence, the proposed FROBLAO algorithm combines strong exploitation with strong spatial exploration, which enables it to handle optimization problems successfully.

3.4. Comparison results of CEC 2019 test functions

In this subsection, we discuss the efficiency of the proposed FROBLAO and compare it with AO variants, popular algorithms, top-performing algorithms, and recent high-performance algorithms. Table 7 reports the mean and standard deviation achieved by each competing algorithm. In comparison to AO, FROBLAO produces better results in all 10 functions; similar results hold against OBLAO, TSA, SSA, GWO, SCA, and AVOA. In comparison to MVO, FROBLAO produces better results in 9 functions (CEC01 to CEC03 and CEC05 to CEC10). In comparison to LSHADE, FROBLAO produces better results in 7 functions (CEC01, CEC02, and CEC06 to CEC10); similar results hold against MRFO. In comparison to CMA-ES, FROBLAO produces better results in 8 functions (CEC01 to CEC03 and CEC06 to CEC10). Hence, the proposed FROBLAO algorithm again combines strong exploitation with strong spatial exploration, which enables it to handle optimization problems successfully.

3.5. Comparison results of CEC 2020 test functions

In this subsection, we discuss the efficiency of the proposed FROBLAO and compare it with AO variants, popular algorithms, top-performing algorithms, and recent high-performance algorithms. Table 8 reports the mean and standard deviation achieved by each competing algorithm. In comparison to AO, FROBLAO produces better results in all 10 functions; similar results hold against OBLAO, TSA, GWO, and SCA. In comparison to SSA, FROBLAO produces better results in 9 functions (H2 to H10). In comparison to MVO, FROBLAO produces better results in 8 functions (H2 to H9). In comparison to LSHADE, FROBLAO produces better results in 8 functions (H2 and H4 to H10). In comparison to CMA-ES, FROBLAO produces better results in 9 functions (H1 to H7, H9, and H10). In comparison to MRFO, FROBLAO produces better results in 7 functions (H2 to H7 and H9). In comparison to AVOA, FROBLAO produces better results in 7 functions (H2 to H8). Hence, the proposed FROBLAO algorithm once more combines strong exploitation with strong spatial exploration, which enables it to handle optimization problems successfully.


Table 8
Results of FROBLAO and the compared algorithms on the CEC 2020 test functions.
Functions AO OBLAO TSA SSA MVO GWO SCA LSHADE CMA-ES MRFO AVOA FROBLAO
H1
𝑀𝑒𝑎𝑛 5.3937E+07 6.6997E+04 3.7038E+09 9.5186E+03 1.8114E+04 4.1726E+07 1.1156E+09 1.0178E+02 1.9557E+06 2.2555E+03 1.3503E+03 3.6926E+04
𝑆𝑡𝑑 1.2885E+08 3.6281E+04 3.2885E+09 1.4634E+04 6.3596E+03 1.2054E+08 3.5526E+08 5.4331E+00 2.3868E+06 2.2992E+03 6.7700E+02 1.7903E+04

H2
𝑀𝑒𝑎𝑛 1.8928E+03 1.6988E+03 2.2212E+03 2.0386E+03 1.8327E+03 1.6993E+03 2.6044E+03 1.5674E+03 2.7081E+03 1.9045E+03 2.0143E+03 1.4400E+03
𝑆𝑡𝑑 2.5834E+02 2.4273E+02 2.4395E+02 4.5386E+02 3.6731E+02 3.3469E+02 1.9383E+02 1.4373E+02 2.8746E+02 3.0315E+02 3.0895E+02 1.2494E+02

H3
𝑀𝑒𝑎𝑛 7.5764E+02 7.4425E+02 7.9804E+02 7.3953E+02 7.3142E+02 7.3475E+02 7.8508E+02 7.1924E+02 7.4131E+02 7.4716E+02 7.7124E+02 7.2467E+02
𝑆𝑡𝑑 1.3519E+01 1.3226E+01 2.6156E+01 1.6429E+01 1.1723E+01 1.0591E+01 1.0617E+01 2.5758E+00 5.3341E+00 1.7168E+01 2.7312E+01 9.1039E+00

H4
𝑀𝑒𝑎𝑛 2.6933E+04 2.9383E+03 2.2371E+05 2.8999E+04 4.7528E+03 1.0046E+04 1.1533E+04 1.9023E+03 1.0481E+04 2.0784E+03 3.5674E+03 1.9011E+03
𝑆𝑡𝑑 3.9796E+04 2.0497E+03 4.3595E+05 3.7706E+04 4.6306E+03 7.8146E+03 6.8273E+03 6.5134E-01 9.5934E+03 1.7093E+02 2.4832E+03 0

H5
𝑀𝑒𝑎𝑛 2.3973E+04 1.0122E+04 4.3749E+05 2.7461E+04 5.3430E+03 6.0681E+04 7.4164E+04 2.1038E+03 2.2704E+05 4.8614E+03 2.9098E+04 1.9249E+03
𝑆𝑡𝑑 4.6236E+04 1.9864E+03 3.7701E+05 4.0720E+04 3.3606E+03 1.4449E+05 8.4941E+04 1.4989E+02 3.0841E+05 2.6132E+03 3.3555E+04 5.9604E+01

H6
𝑀𝑒𝑎𝑛 1.8223E+03 1.6758E+03 1.9372E+03 1.8425E+03 1.8010E+03 1.7269E+03 1.8190E+03 1.6190E+03 1.6944E+03 1.7180E+03 1.6895E+03 1.6030E+03
𝑆𝑡𝑑 1.4345E+02 1.0138E+02 1.1380E+02 1.3211E+02 1.4509E+02 9.8699E+01 1.0883E+02 3.1654E+01 7.3012E+01 1.3524E+02 7.4358E+01 4.9858E+00

H7
𝑀𝑒𝑎𝑛 1.7860E+04 6.4242E+03 3.6986E+04 1.0420E+04 9.1449E+03 9.5589E+03 1.5709E+04 2.1363E+03 1.8374E+04 2.5611E+03 8.6528E+03 2.1005E+03
𝑆𝑡𝑑 1.3407E+04 3.6964E+03 6.6871E+04 1.0591E+04 6.8934E+03 5.8879E+03 7.9336E+03 4.6843E+01 1.8243E+04 3.9366E+02 7.6166E+03 4.0189E-02

H8
𝑀𝑒𝑎𝑛 2.3108E+03 2.3031E+03 2.5290E+03 2.3412E+03 2.3291E+03 2.3131E+03 2.3954E+03 2.3000E+03 2.3000E+03 2.2995E+03 2.3500E+03 2.3009E+03
𝑆𝑡𝑑 1.3011E+01 1.0623E+01 2.3526E+02 2.0494E+02 1.3665E+02 9.4496E+00 4.9442E+01 2.9714E-06 5.9561E-10 1.3422E+01 0 1.7646E+00

H9
𝑀𝑒𝑎𝑛 2.7495E+03 2.7289E+03 2.8375E+03 2.7491E+03 2.7325E+03 2.7547E+03 2.7912E+03 2.7626E+03 2.7549E+03 2.7017E+03 2.6000E+03 2.6308E+03
𝑆𝑡𝑑 8.0106E+01 7.8274E+01 3.5210E+01 6.9673E+01 6.3609E+01 1.3915E+01 8.5907E+00 9.2825E+01 9.5704E+00 1.0332E+02 0 6.2973E+01

H10
𝑀𝑒𝑎𝑛 2.9318E+03 2.7289E+03 3.0582E+03 2.9412E+03 2.9156E+03 2.9367E+03 2.9787E+03 2.9304E+03 2.9312E+03 2.9137E+03 2.7000E+03 2.9266E+03
𝑆𝑡𝑑 2.4452E+01 7.8274E+01 1.4756E+02 3.3831E+01 2.2960E+01 1.6963E+01 2.2485E+01 2.1707E+01 2.1181E+01 6.3589E+01 0 2.3041E+01

3.6. Statistical analysis

A smaller mean value indicates that an algorithm is better able to discover solutions close to the optimum. Although two algorithms may have equivalent average values, their ability to reach the optimal solution may differ from iteration to iteration; the standard deviation (Std) is therefore employed for a fairer comparison, and it should be small so that the results fall within a narrow range. Statistical tests are necessary to fully evaluate the performance of meta-heuristic algorithms: comparing average and standard deviation values alone is insufficient, and a statistical test is required to demonstrate that a new meta-heuristic significantly outperforms the other meta-heuristics on particular optimization problems. The assumption of normality is simply the presumption that the random variable of interest, in our case the data sample, follows the normal probability distribution characterized by its mean and standard deviation. A Kolmogorov-Smirnov test is used to assess normality [85], while Levene's test, which is based on averages, is used to assess homoscedasticity [86]. Histograms and quantile-quantile (Q-Q) plots are two further graphical methods for checking the normality of the data [87]. The entry 𝑁𝐴 denotes "Not Applicable", which means that the FROBLAO results are identical to those of the compared algorithm.

• Wilcoxon rank-sum: Table 9 reports the Wilcoxon rank-sum test [88] results on the CEC 2005 test functions, Table 10 on the CEC 2019 test functions, and Table 11 on the CEC 2020 test functions; these tests compare the statistical performance of the proposed FROBLAO algorithm with that of the other algorithms. A p-value of less than 0.05 denotes a significant difference between the two compared algorithms, and under this criterion FROBLAO is superior to all other algorithms to varying degrees. The symbol + expresses that FROBLAO is better, = that it is identical, and − that it is poorer than the compared algorithm; 𝐻𝑦𝑝 denotes the hypothesis. In conclusion, the FROBLAO algorithm outperforms the other compared algorithms on a majority of the test functions.
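The rank-sum decision described above can be sketched without external dependencies. This is a minimal illustration using the normal approximation and omitting the tie correction; an equivalent, more careful implementation is available as `scipy.stats.ranksums`:

```python
import math

def rank_sum_p(a, b):
    """Two-sided Wilcoxon rank-sum p-value via the normal approximation.

    Pools the two samples, assigns average ranks to ties, and compares the
    rank sum of `a` against its mean and variance under the null hypothesis.
    """
    n1, n2 = len(a), len(b)
    pooled = sorted((v, i < n1) for i, v in enumerate(list(a) + list(b)))
    ranks = [0.0] * (n1 + n2)
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j][0] == pooled[i][0]:
            j += 1
        for k in range(i, j):             # positions i..j-1 hold ranks i+1..j
            ranks[k] = (i + j + 1) / 2.0  # average rank of the tied block
        i = j
    w = sum(r for r, (_, in_a) in zip(ranks, pooled) if in_a)
    mu = n1 * (n1 + n2 + 1) / 2.0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (w - mu) / sigma
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))

# two made-up 30-run samples: one algorithm always hits 0, the other does not
p = rank_sum_p([0.0] * 30, [1e-6 * (k + 1) for k in range(30)])
assert p < 0.05  # significant difference, recorded as '+' or '-' in Tables 9-11
```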

• t-test: Two-tailed 𝑡-tests [12] are used to compare the statistical outcomes at a significance level of 0.05. The 𝑡 values are computed from the 𝑀𝑒𝑎𝑛 and 𝑆𝑡𝑑 values; a negative value indicates that FROBLAO's optimization errors are significantly smaller, and vice versa. The corresponding t-value is highlighted when the difference is statistically significant. The symbols +∕ = ∕− denote the functions that FROBLAO wins, ties, and loses, respectively. Table 12 shows the t-test outcomes on the CEC 2005 test functions, Table 13 on the CEC 2019 test functions, and Table 14 on the CEC 2020 test functions. The statistical outcomes of the optimization errors demonstrate that FROBLAO achieves much better overall performance than the other algorithms.
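The win/tie/loss bookkeeping described above can be sketched as follows. The ±2.00 threshold is our assumption for illustration (approximately the two-tailed critical value at the 0.05 level with 58 degrees of freedom, i.e. 30 runs per algorithm):

```python
def classify(t, t_crit=2.0):
    """Map a t-value (FROBLAO minus competitor, Eq. 22) to the +/=/- symbols.

    '+' : FROBLAO's errors are significantly smaller, '-' : significantly
    larger, '=' : no significant difference at the chosen threshold.
    """
    if t < -t_crit:
        return '+'
    if t > t_crit:
        return '-'
    return '='

# tally wins/ties/losses over a few sample t-values taken from Table 12
symbols = [classify(t) for t in (-4.100, 0.0, 2.554, -9.310)]
summary = (symbols.count('+'), symbols.count('='), symbols.count('-'))
print(summary)  # (2, 1, 1)
```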

• Friedman test: The Friedman test [89] is used to further highlight the overall performance of all the compared algorithms. The test ranks the algorithms according to how well they handle each problem independently. The


Table 9
Wilcoxon rank-sum test results for FROBLAO against the compared algorithms on the CEC 2005 test functions.

Functions AO vs FROBLAO OBLAO vs FROBLAO TSA vs FROBLAO SSA vs FROBLAO MVO vs FROBLAO GWO vs FROBLAO
𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝 𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝 𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝 𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝 𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝 𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝
F1 1.6783𝐸 − 06 + 𝑁𝐴 = 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 +
F2 1.6783𝐸 − 06 + 𝑁𝐴 = 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 +
F3 1.6783𝐸 − 06 + 𝑁𝐴 = 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 +
F4 1.6783𝐸 − 06 + 𝑁𝐴 = 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 +
F5 1.6783𝐸 − 06 + 1.7571𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 +
F6 1.6783𝐸 − 06 + 2.5802𝐸 − 06 + 1.6783𝐸 − 06 + 5.8925𝐸 − 01 − 1.6783𝐸 − 06 + 1.6783𝐸 − 06 +
F7 1.6783𝐸 − 06 + 1.9016𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 +
F8 1.6783𝐸 − 06 + 3.5200𝐸 − 01 − 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 3.6037𝐸 − 06 +
F9 𝑁𝐴 = 𝑁𝐴 = 1.6783𝐸 − 06 + 1.7626𝐸 − 06 + 1.6783𝐸 − 06 + 1.7516𝐸 − 06 +
F10 𝑁𝐴 = 𝑁𝐴 = 1.6783𝐸 − 06 + 1.7671𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 +
F11 𝑁𝐴 = 𝑁𝐴 = 2.0509𝐸 − 04 + 1.7671𝐸 − 06 + 1.6783𝐸 − 06 + 3.1250𝐸 − 02 +
F12 1.6783𝐸 − 06 + 1.7571𝐸 − 06 + 1.6783𝐸 − 06 + 2.6564𝐸 − 05 + 1.6783𝐸 − 06 + 1.7783𝐸 − 06 +
F13 1.6783𝐸 − 06 + 2.1912𝐸 − 05 + 1.6783𝐸 − 06 + 4.0773𝐸 − 01 − 1.6783𝐸 − 06 + 1.6783𝐸 − 06 +
F14 3.8754𝐸 − 06 + 3.1250𝐸 − 02 + 1.0385𝐸 − 06 + 𝑁𝐴 = 𝑁𝐴 = 2.8474𝐸 − 06 +
F15 1.6783𝐸 − 06 + 4.8468𝐸 − 06 + 1.5288𝐸 − 05 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 6.4874𝐸 − 06 +
F16 2.4136𝐸 − 06 + 9.9590𝐸 − 01 − 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 2.4136𝐸 − 06 + 1.9705𝐸 − 06 +
F17 2.9085𝐸 − 05 + 2.8861𝐸 − 02 + 2.1418𝐸 − 03 + 1.6783𝐸 − 06 + 2.1813𝐸 − 06 + 6.7653𝐸 − 04 +
F18 1.6783𝐸 − 06 + 2.6564𝐸 − 05 + 9.6309𝐸 − 01 − 1.3271𝐸 − 06 + 1.8777𝐸 − 02 + 2.3898𝐸 − 01 −
F19 2.4136𝐸 − 06 + 3.1106𝐸 − 01 − 2.2297𝐸 − 01 − 1.6783𝐸 − 06 + 1.6793𝐸 − 06 + 1.0309𝐸 − 01 −
F20 1.6783𝐸 − 06 + 5.3456𝐸 − 06 + 4.8468𝐸 − 06 + 5.3754𝐸 − 05 + 1.5105𝐸 − 03 + 3.8484𝐸 − 01 −
F21 1.9705𝐸 − 06 + 1.6764𝐸 − 02 + 1.6783𝐸 − 06 + 1.0308𝐸 − 01 − 1.6783𝐸 − 06 + 1.6783𝐸 − 06 +
F22 1.6783𝐸 − 06 + 5.4047𝐸 − 03 + 1.6783𝐸 − 06 + 3.7369𝐸 − 01 − 2.6697𝐸 − 06 + 1.6783𝐸 − 06 +
F23 4.8468𝐸 − 06 + 1.0041𝐸 + 00 − 1.6783𝐸 − 06 + 6.7704𝐸 − 01 − 2.1418𝐸 − 03 + 2.6663𝐸 − 04 +

Functions SCA vs FROBLAO LSHADE vs FROBLAO CMA-ES vs FROBLAO MRFO vs FROBLAO AVOA vs FROBLAO
𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝 𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝 𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝 𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝 𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝
F1 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 𝑁𝐴 = 2.0437𝐸 − 04 +
F2 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6772𝐸 − 06 +
F3 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 𝑁𝐴 = 1.6783𝐸 − 06 +
F4 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.7783𝐸 − 06 +
F5 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6772𝐸 − 06 +
F6 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 8.3142𝐸 − 07 + 1.6783𝐸 − 06 +
F7 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.7783𝐸 − 06 +
F8 3.6037𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.7783𝐸 − 06 +
F9 2.6318𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 𝑁𝐴 = 𝑁𝐴 =
F10 1.6783𝐸 − 06 + 1.6726𝐸 − 06 + 1.6782𝐸 − 06 + 𝑁𝐴 = 𝑁𝐴 =
F11 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6760𝐸 − 06 + 𝑁𝐴 = 𝑁𝐴 =
F12 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 5.3893𝐸 − 04 +
F13 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6446𝐸 − 06 + 7.3837𝐸 − 03 +
F14 1.7682𝐸 − 06 + 𝑁𝐴 = 1.6783𝐸 − 06 + 𝑁𝐴 = 7.8125𝐸 − 03 +
F15 1.6783𝐸 − 06 + 1.2097𝐸 − 02 + 1.6783𝐸 − 06 + 1.5360𝐸 − 01 − 2.6564𝐸 − 05 +
F16 1.1743𝐸 − 04 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 +
F17 1.9705𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.7783𝐸 − 06 +
F18 3.0134𝐸 − 01 − 1.3271𝐸 − 06 + 1.3271𝐸 − 06 + 1.3271𝐸 − 06 + 1.6210𝐸 − 03 +
F19 2.1813𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.7783𝐸 − 06 +
F20 1.6783𝐸 − 06 + 5.2385𝐸 − 02 − 9.9717𝐸 − 02 − 9.7917𝐸 − 01 − 9.7937𝐸 − 01 −
F21 1.6783𝐸 − 06 + 3.7369𝐸 − 01 − 3.7368𝐸 − 01 − 6.7704𝐸 − 01 − 1.7782𝐸 − 06 +
F22 1.6783𝐸 − 06 + 1.5007𝐸 − 02 + 1.6783𝐸 − 06 + 6.4721𝐸 − 01 − 1.7793𝐸 − 06 +
F23 1.6783𝐸 − 06 + 3.6599𝐸 − 04 + 3.1832𝐸 − 05 + 1.6660𝐸 − 01 − 1.7783𝐸 − 06 +

performance rank of each algorithm for the CEC2005, CEC2019, and CEC2020 benchmark functions is shown in Tables 15, 16, and 17, respectively. From Table 15, the mean rank of FROBLAO is 1.52173913, which indicates that FROBLAO has the best overall performance on the CEC2005 test functions. According to Table 16, FROBLAO has the best overall performance on the CEC2019 test functions, with a mean rank of 1.6. Table 17 shows that FROBLAO likewise has the best overall performance on the CEC2020 test functions, with a mean rank of 2.4. The Friedman test results once again demonstrate FROBLAO's superiority over the other optimizers analyzed.

3.7. Convergence curve analysis

The convergence curve demonstrates the relationship between the fitness function solution and the maximum number of it-
erations. In the early stages of the optimization process, the population investigates the search region and deviates significantly.
The major motive behind the convergence investigation is to understand the behavior and graphical explanation of the FROBLAO.
Figs. 8(a)-8(r) show the convergence curves of the all-compared algorithms for the CEC 2005 test functions. From, Fig. 8 it is noticed
that the proposed method FROBLAO converges faster among all objective functions except for 𝐹 6, 𝐹 12, and 𝐹 19. However, the


Table 10
Wilcoxon rank-sum test results for FROBLAO against the compared algorithms on the CEC 2019 test functions.

Functions AO vs FROBLAO OBLAO vs FROBLAO TSA vs FROBLAO SSA vs FROBLAO MVO vs FROBLAO GWO vs FROBLAO
𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝 𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝 𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝 𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝 𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝 𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝
CEC01 1.6783𝐸 − 06 + 4.1210𝐸 − 02 + 1.9705𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 +
CEC02 1.6783𝐸 − 06 + 7.6771𝐸 − 05 + 1.6783𝐸 − 06 + 2.8122𝐸 − 03 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 +
CEC03 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 5.4603𝐸 − 06 + 1.6726𝐸 − 06 + 1.6726𝐸 − 06 + 1.5004𝐸 − 02 +
CEC04 1.6783𝐸 − 06 + 1.7299𝐸 − 01 − 1.6783𝐸 − 06 + 6.4721𝐸 − 01 − 2.4728𝐸 − 01 − 2.7387𝐸 − 02 +
CEC05 1.9705𝐸 − 06 + 2.0041𝐸 − 01 − 1.6783𝐸 − 06 + 8.0107𝐸 − 01 − 5.0718𝐸 − 03 − 1.7390𝐸 − 03 +
CEC06 1.6783𝐸 − 06 + 2.5802𝐸 − 06 + 1.6783𝐸 − 06 + 1.0348𝐸 − 02 + 2.4136𝐸 − 06 + 2.4136𝐸 − 06 +
CEC07 3.9795𝐸 − 06 + 1.6210𝐸 − 03 + 1.6783𝐸 − 06 + 5.9202𝐸 − 05 + 5.8934𝐸 − 06 + 5.3456𝐸 − 06 +
CEC08 2.4136𝐸 − 06 + 1.2642𝐸 − 05 + 1.6783𝐸 − 06 + 2.4136𝐸 − 06 + 1.9705𝐸 − 06 + 2.9085𝐸 − 05 +
CEC09 1.6783𝐸 − 06 + 7.8780𝐸 − 06 + 1.6783𝐸 − 06 + 2.1813𝐸 − 06 + 3.6037𝐸 − 06 + 1.6783𝐸 − 06 +
CEC10 1.6783𝐸 − 06 + 1.2634𝐸 − 02 + 1.6783𝐸 − 06 + 1.1535𝐸 − 05 + 3.9795𝐸 − 06 + 1.6783𝐸 − 06 +

Functions SCA vs FROBLAO LSHADE vs FROBLAO CMA-ES vs FROBLAO MRFO vs FROBLAO AVOA vs FROBLAO
𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝 𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝 𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝 𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝 𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝
CEC01 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 3.4825𝐸 − 05 +
CEC02 1.6783𝐸 − 06 + 1.4299𝐸 − 06 + 1.6783𝐸 − 06 + 1.4299𝐸 − 06 + 1.7505𝐸 − 06 +
CEC03 1.6783𝐸 − 06 + 1.6726𝐸 − 06 + 1.6783𝐸 − 06 + 8.3083𝐸 − 07 + 9.3629𝐸 − 07 +
CEC04 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 7.1545𝐸 − 06 + 9.9897𝐸 − 03 + 9.9897𝐸 − 03 +
CEC05 1.6783𝐸 − 06 + 1.9705𝐸 − 06 + 1.6783𝐸 − 06 + 7.3837𝐸 − 03 + 2.2941𝐸 − 03 +
CEC06 1.6783𝐸 − 06 + 2.1293𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 +
CEC07 1.6783𝐸 − 06 + 2.5855𝐸 − 05 + 8.3144𝐸 − 05 + 5.4246𝐸 − 05 + 2.1813𝐸 − 06 +
CEC08 1.6783𝐸 − 06 + 6.4947𝐸 − 06 + 1.6774𝐸 − 04 + 1.2770𝐸 − 04 + 1.9705𝐸 − 06 +
CEC09 1.6783𝐸 − 06 + 5.8934𝐸 − 06 + 5.3754𝐸 − 05 + 7.6060𝐸 − 02 + 1.6783𝐸 − 06 +
CEC10 1.6783𝐸 − 06 + 2.6697𝐸 − 06 + 1.6783𝐸 − 06 + 2.4608𝐸 − 04 + 7.1545𝐸 − 06 +

Table 11
Wilcoxon rank-sum test results for FROBLAO against the compared algorithms on the CEC 2020 test functions.

Functions AO vs FROBLAO OBLAO vs FROBLAO TSA vs FROBLAO SSA vs FROBLAO MVO vs FROBLAO GWO vs FROBLAO
𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝 𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝 𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝 𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝 𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝 𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝
H1 1.6783𝐸 − 06 + 4.6220𝐸 − 04 + 1.6783𝐸 − 06 + 2.1813𝐸 − 06 + 1.6783𝐸 − 06 + 4.9685𝐸 − 05 +
H2 2.4136𝐸 − 06 + 1.5082𝐸 − 04 + 1.6783𝐸 − 06 + 7.8780𝐸 − 06 + 1.6780𝐸 − 05 + 6.2739𝐸 − 04 +
H3 1.6783𝐸 − 06 + 1.1535𝐸 − 05 + 1.6783𝐸 − 06 + 1.5082𝐸 − 04 + 1.6783𝐸 − 06 + 2.6287𝐸 − 03 +
H4 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 +
H5 1.6783𝐸 − 06 + 1.7079𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 +
H6 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.9705𝐸 − 06 + 1.6783𝐸 − 06 +
H7 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 +
H8 3.1832𝐸 − 05 + 4.9685𝐸 − 05 + 2.9517𝐸 − 06 + 2.9085𝐸 − 05 + 7.8780𝐸 − 06 + 1.9705𝐸 − 06 +
H9 2.2132𝐸 − 05 + 2.4608𝐸 − 04 + 1.6783𝐸 − 06 + 1.2676𝐸 − 05 + 2.9085𝐸 − 05 + 7.1545𝐸 − 06 +
H10 6.9500𝐸 − 02 + 1.6783𝐸 − 06 + 7.6771𝐸 − 05 + 1.6783𝐸 − 06 + 8.3117𝐸 − 02 + 1.6783𝐸 − 06 +

Functions SCA vs FROBLAO LSHADE vs FROBLAO CMA-ES vs FROBLAO MRFO vs FROBLAO AVOA vs FROBLAO
𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝 𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝 𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝 𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝 𝑝 − 𝑣𝑎𝑙𝑢𝑒 𝐻𝑦𝑝
H1 1.7783𝐸 − 06 + 1.6783𝐸 − 06 + 2.9517𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 +
H2 1.6783𝐸 − 06 + 4.4401𝐸 − 03 + 1.6783𝐸 − 06 + 6.4947𝐸 − 06 + 1.6783𝐸 − 06 +
H3 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 2.3842𝐸 − 06 + 1.3924𝐸 − 05 + 3.2621𝐸 − 06 +
H4 1.6783𝐸 − 06 + 2.9250𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 +
H5 1.6783𝐸 − 06 + 1.3348𝐸 − 04 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 +
H6 1.6783𝐸 − 06 + 1.3881𝐸 − 04 + 1.9705𝐸 − 06 + 7.0429𝐸 − 05 + 2.1813𝐸 − 06 +
H7 1.6783𝐸 − 06 + 2.6318𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 +
H8 1.9705𝐸 − 06 + 4.9958𝐸 − 01 − 1.5625𝐸 − 02 − 6.3412𝐸 − 02 + 4.2997𝐸 − 07 +
H9 1.6783𝐸 − 06 + 2.1587𝐸 − 05 + 7.8780𝐸 − 06 + 4.4506𝐸 − 03 + 3.1250𝐸 − 02 −
H10 2.1813𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 + 1.6783𝐸 − 06 +

FROBLAO algorithm still compares favorably in convergence with the other algorithms, namely AO, OBLAO, TSA, SSA, MVO, GWO, SCA, LSHADE, CMA-ES, MRFO, and AVOA. Figs. 9(a)-9(i) show the convergence curves of all compared algorithms on the CEC 2019 test functions: FROBLAO converges faster on all fitness functions except CEC04 and CEC05, outperforming the existing algorithms in terms of convergence. Figs. 10(a)-10(i) show the convergence curves of all compared algorithms on the CEC 2020 test functions: FROBLAO converges faster on all fitness functions except H1, H9, and H10, again outperforming the existing algorithms. Thus, the enhancements in the proposed FROBLAO method result in higher-quality searches and faster convergence.
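A convergence curve of the kind plotted in Figs. 8-10 simply records the best fitness found so far at each iteration. The sketch below produces such a curve for a toy random search on the sphere function; the optimizer here is a stand-in for illustration, not the FROBLAO update rules:

```python
import random

def sphere(x):
    """Sphere benchmark: sum of squares, global optimum 0 at the origin."""
    return sum(v * v for v in x)

def random_search_curve(dim=10, iters=200, seed=0):
    """Return the best-so-far fitness per iteration (a convergence curve)."""
    rng = random.Random(seed)
    best = float('inf')
    curve = []
    for _ in range(iters):
        x = [rng.uniform(-100.0, 100.0) for _ in range(dim)]
        best = min(best, sphere(x))     # keep the best fitness seen so far
        curve.append(best)
    return curve

curve = random_search_curve()
# a best-so-far curve is monotonically non-increasing by construction
assert all(a >= b for a, b in zip(curve, curve[1:]))
```

Plotting `curve` against the iteration index (typically with a logarithmic fitness axis) yields exactly the kind of comparison shown in the convergence figures.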


Table 12
𝑡-test results for FROBLAO against the compared algorithms on the CEC 2005 test functions.
Functions AO vs FROBLAO OBLAO vs FROBLAO TSA vs FROBLAO SSA vs FROBLAO MVO vs FROBLAO GWO vs FROBLAO
𝑡 − 𝑣𝑎𝑙𝑢𝑒 +∕ = ∕− 𝑡 − 𝑣𝑎𝑙𝑢𝑒 +∕ = ∕− 𝑡 − 𝑣𝑎𝑙𝑢𝑒 +∕ = ∕− 𝑡 − 𝑣𝑎𝑙𝑢𝑒 +∕ = ∕− 𝑡 − 𝑣𝑎𝑙𝑢𝑒 +∕ = ∕− 𝑡 − 𝑣𝑎𝑙𝑢𝑒 +∕ = ∕−
F1 −1.000𝐸 + 00 = 0 = −2.108𝐸 + 00 + −4.580𝐸 + 00 + −1.065𝐸 + 01 + −2.834𝐸 + 00 +
F2 −1.001𝐸 + 00 = 0 = −5.377𝐸 + 00 + −1.981𝐸 + 00 + −1.490𝐸 + 01 + −8.452𝐸 + 00 +
F3 −1.001𝐸 + 00 = 0 = −2.340𝐸 + 00 + −1.907𝐸 + 00 + −8.989𝐸 + 00 + −1.881𝐸 + 00 +
F4 −1.001𝐸 + 00 = 0 = −4.684𝐸 + 00 + −2.064𝐸 + 00 + −1.936𝐸 + 01 + −3.865𝐸 + 00 +
F5 −4.100𝐸 + 00 + −6.257𝐸 + 00 + −1.616𝐸 + 02 + −1.900𝐸 + 00 + −2.708𝐸 + 00 + −1.796𝐸 + 02 +
F6 −3.814𝐸 + 00 + −5.336𝐸 + 00 + −2.633𝐸 + 01 + 2.994𝐸 + 00 − −1.217𝐸 + 01 + −1.118𝐸 + 01 +
F7 −4.319𝐸 + 00 + −8.799𝐸 + 00 + −1.035𝐸 + 01 + −9.442𝐸 + 00 + −7.119𝐸 + 00 + −1.051𝐸 + 01 +
F8 −3.807𝐸 + 00 + 8.229𝐸 − 01 = 1.403𝐸 + 01 − −2.655𝐸 + 01 + −2.129𝐸 + 01 + 9.217𝐸 + 00 −
F9 0 = 0 = −2.146𝐸 + 01 + −1.161𝐸 + 01 + −1.356𝐸 + 01 + −5.079𝐸 + 00 +
F10 0 = 0 = −5.679𝐸 + 00 + −4.114𝐸 + 00 + −4.991𝐸 + 00 + −2.901𝐸 + 01 +
F11 0 = 0 = −6.020𝐸 + 00 + −1.016𝐸 + 01 + −1.526𝐸 + 01 + −2.361𝐸 + 00 +
F12 −2.267𝐸 + 00 + −6.770𝐸 + 00 + −1.208𝐸 + 01 + −3.872𝐸 + 00 + −1.838𝐸 + 00 + −1.238𝐸 + 01 +
F13 −2.325𝐸 + 00 + −5.665𝐸 + 00 + −2.727𝐸 + 01 + −1.996𝐸 + 00 + −6.764𝐸 + 00 + −1.532𝐸 + 01 +
F14 −2.975𝐸 + 00 + −2.693𝐸 + 00 + −1.013𝐸 + 01 + −2.250𝐸 + 00 + 2.903𝐸 + 00 − −5.487𝐸 + 00 +
F15 −9.310𝐸 + 00 + −6.719𝐸 + 00 + −2.538𝐸 + 00 + −2.776𝐸 + 00 + −2.940𝐸 + 00 + −3.306𝐸 + 00 +
F16 −5.571𝐸 + 00 + 6.469𝐸 − 01 = 4.221𝐸 + 00 − 4.389𝐸 + 00 − 4.260𝐸 + 00 − 4.381𝐸 + 00 −
F17 −3.081𝐸 + 00 + 2.373𝐸 + 00 − −3.607𝐸 + 00 + 4.985𝐸 + 00 − 4.675𝐸 + 00 − −8.884𝐸 − 02 =
F18 −5.571𝐸 + 00 + −4.508𝐸 + 00 + −2.044𝐸 + 00 + 3.126𝐸 + 00 − −9.999𝐸 − 01 = 2.554𝐸 + 00 −
F19 −7.584𝐸 + 00 + −8.230𝐸 − 01 = −6.820𝐸 − 01 = 4.971𝐸 + 00 − 4.940𝐸 + 00 − −3.235𝐸 + 00 +
F20 −1.168𝐸 + 01 + −7.920𝐸 + 00 + −5.061𝐸 + 00 + −8.578𝐸 + 00 + −6.382𝐸 + 00 + −3.326𝐸 + 00 +
F21 −2.608𝐸 + 00 + −2.051𝐸 + 00 + −8.916𝐸 + 00 + −4.566𝐸 + 00 + −5.534𝐸 + 00 + −3.437𝐸 + 00 +
F22 −3.860𝐸 + 00 + −3.090𝐸 + 00 + −6.126𝐸 + 00 + −2.931𝐸 + 00 + −4.020𝐸 + 00 + −1.123𝐸 + 01 +
F23 −2.732𝐸 + 00 + 1.002𝐸 + 00 = −5.808𝐸 + 00 + −3.215𝐸 + 00 + −3.708𝐸 + 00 + −3.971𝐸 + 00 +
Functions SCA vs FROBLAO LSHADE vs FROBLAO CMA-ES vs FROBLAO MRFO vs FROBLAO AVOA vs FROBLAO
𝑡 − 𝑣𝑎𝑙𝑢𝑒 +∕ = ∕− 𝑡 − 𝑣𝑎𝑙𝑢𝑒 +∕ = ∕− 𝑡 − 𝑣𝑎𝑙𝑢𝑒 +∕ = ∕− 𝑡 − 𝑣𝑎𝑙𝑢𝑒 +∕ = ∕− 𝑡 − 𝑣𝑎𝑙𝑢𝑒 +∕ = ∕−
F1 −1.402𝐸 + 00 + −3.956𝐸 + 00 + −1.349𝐸 + 01 + 0 = 0 =
F2 −2.421𝐸 + 00 + −4.999𝐸 + 00 + −1.839𝐸 + 01 + 0 = −1.0003𝐸 + 00 =
F3 −1.358𝐸 + 00 + −8.727𝐸 + 00 + −1.087𝐸 + 01 + 0 = 0 =
F4 −2.555𝐸 + 00 + −2.608𝐸 + 01 + −2.051𝐸 + 01 + 0 = −1.0016𝐸 + 00 =
F5 −9.725𝐸 + 01 + −7.408𝐸 + 00 + −5.497𝐸 + 00 + −2.6490𝐸 + 02 + −6.8573𝐸 + 00 +
F6 −1.354𝐸 + 01 + −6.243𝐸 + 00 + −1.479𝐸 + 01 + 3.2213𝐸 + 00 − −9.3385𝐸 + 00 +
F7 −7.784𝐸 + 00 + −1.166𝐸 + 01 + −2.043𝐸 + 01 + −5.8423𝐸 + 00 + −6.1218𝐸 + 00 +
F8 −6.238𝐸 + 01 + 2.211𝐸 + 02 − 9.481𝐸 + 00 − 2.9169𝐸 + 01 − 4.7934𝐸 + 01 −
F9 −1.184𝐸 + 00 + −1.166𝐸 + 01 + −1.013𝐸 + 01 + 0 = 0 =
F10 −1.003𝐸 + 00 = −1.909𝐸 + 01 + −2.304𝐸 + 01 + 0 = 0 =
F11 −3.940𝐸 + 00 + −6.692𝐸 + 00 + −1.636𝐸 + 01 + 0 = 0 =
F12 −1.553𝐸 + 01 + −6.010𝐸 + 00 + −1.212𝐸 + 01 + 2.4974𝐸 + 00 − −3.0426𝐸 + 00 +
F13 −1.744𝐸 + 01 + −4.424𝐸 + 00 + −1.416𝐸 + 01 + −1.1800𝐸 + 01 + −3.2659𝐸 + 00 +
F14 −2.580𝐸 + 00 + 4.731𝐸 + 00 − −7.106𝐸 + 00 + 4.7309𝐸 + 00 − −2.9064𝐸 + 00 +
F15 −1.151𝐸 + 01 + −2.040𝐸 + 00 + −8.345𝐸 + 00 + −2.1723𝐸 + 00 + −3.1245𝐸 + 00 +
F16 −3.271𝐸 + 00 + 4.389𝐸 + 00 − 4.389𝐸 + 00 − 4.3895𝐸 + 00 − 4.3895𝐸 + 00 −
F17 −5.395𝐸 + 00 + 4.985𝐸 + 00 − 4.985𝐸 + 00 − 4.9855𝐸 + 00 − 4.9855𝐸 + 00 −
F18 1.706𝐸 + 00 − 3.126𝐸 + 00 − 3.126𝐸 + 00 − 3.1264𝐸 + 00 − 3.0742𝐸 + 00 −
F19 −1.337𝐸 + 01 + −7.388𝐸 + 04 + 4.971𝐸 + 00 − 4.9708𝐸 + 00 − 4.9708𝐸 + 00 −
F20 −6.260𝐸 + 00 + −2.127𝐸 + 00 + −4.497𝐸 + 00 + −3.2965𝐸 + 00 + −3.2980𝐸 + 00 +
F21 −1.785𝐸 + 01 + −2.906𝐸 + 00 + −2.971𝐸 + 00 + −3.2473𝐸 + 00 + 3.3111𝐸 + 00 −
F22 −1.942𝐸 + 01 + −2.075𝐸 + 00 + 1.777𝐸 + 01 − −3.8076𝐸 + 00 + 1.7770𝐸 + 01 −
F23 −2.316𝐸 + 01 + −1.428𝐸 + 00 + −9.980𝐸 − 01 = −2.6912𝐸 + 00 + 2.9390𝐸 + 00 −

Table 13
Results on 𝑡-test for FROBLAO with other compared algorithms of CEC 2019 test functions.
Functions AO vs FROBLAO OBLAO vs FROBLAO TSA vs FROBLAO SSA vs FROBLAO MVO vs FROBLAO GWO vs FROBLAO
t-cost +/=/− t-cost +/=/− t-cost +/=/− t-cost +/=/− t-cost +/=/− t-cost +/=/−
CEC01 −9.1963𝐸 + 00 + −9.6403𝐸 + 00 + −2.6913𝐸 + 00 + −4.6593𝐸 + 00 + −7.5488𝐸 + 00 + −2.7393𝐸 + 00 +
CEC02 −1.2778𝐸 + 01 + −1.2787𝐸 + 01 + −9.3760𝐸 + 00 + −1.7762𝐸 + 00 + −1.1091𝐸 + 01 + −1.6805𝐸 + 01 +
CEC03 −7.0577𝐸 + 00 + −9.4426𝐸 + 00 + −2.4577𝐸 + 00 + −1.0312𝐸 + 00 + −6.6376𝐸 + 00 + −1.7183𝐸 + 00 +
CEC04 −6.1907𝐸 + 00 + −1.2611𝐸 + 00 + −8.2229𝐸 + 00 + −5.7366𝐸 − 01 = 1.2853𝐸 + 00 − −1.3379𝐸 + 00 +
CEC05 −9.2935𝐸 + 00 + −1.5279𝐸 + 00 + −1.3751𝐸 + 01 + −7.2125𝐸 − 01 = −3.1095𝐸 + 00 + −4.1248𝐸 + 00 +
CEC06 −1.8494𝐸 + 01 + −1.0910𝐸 + 01 + −2.6692𝐸 + 01 + −1.8824𝐸 + 00 + −9.9515𝐸 + 00 + −2.6159𝐸 + 01 +
CEC07 −8.7188𝐸 + 00 + −3.2101𝐸 + 00 + −1.3526𝐸 + 01 + −5.8691𝐸 + 00 + −5.9688𝐸 + 00 + −7.3466𝐸 + 00 +
CEC08 −1.0844𝐸 + 01 + −6.8359𝐸 + 00 + −1.6670𝐸 + 01 + −9.4979𝐸 + 00 + −9.0220𝐸 + 00 + −6.3188𝐸 + 00 +
CEC09 −1.8056𝐸 + 01 + −7.8734𝐸 + 00 + −4.7217𝐸 + 00 + −1.0134𝐸 + 01 + −1.0049𝐸 + 01 + −1.4754𝐸 + 01 +
CEC10 −8.6813𝐸 + 00 + −3.1040𝐸 + 00 + −1.1046𝐸 + 01 + −1.0721𝐸 + 01 + −1.0767𝐸 + 01 + −1.1019𝐸 + 01 +
Functions SCA vs FROBLAO LSHADE vs FROBLAO CMA-ES vs FROBLAO MRFO vs FROBLAO AVOA vs FROBLAO
t-cost +/=/− t-cost +/=/− t-cost +/=/− t-cost +/=/− t-cost +/=/−
CEC01 −4.0141𝐸 + 00 + −2.0904𝐸 + 00 + −7.1166𝐸 + 00 + −3.2162𝐸 + 00 + −1.0964𝐸 + 01 +
CEC02 −9.2177𝐸 + 00 + −2.2622𝐸 + 00 + −1.0598𝐸 + 01 + −2.9000𝐸 + 01 + −2.8330𝐸 + 00 +
CEC03 −5.2516𝐸 + 00 + 0 = −2.0604𝐸 + 00 + 0 = −1.0002𝐸 + 00 =
CEC04 −9.5271𝐸 + 00 + 9.0309𝐸 + 00 − 5.5973𝐸 + 00 − 2.0776𝐸 + 00 − −7.4697𝐸 + 00 +
CEC05 −3.9844𝐸 + 01 + 1.0115𝐸 + 01 − 1.0280𝐸 + 01 − 2.6750𝐸 + 00 − −3.4974𝐸 + 00 +
CEC06 −2.4801𝐸 + 01 + −7.7271𝐸 + 00 + −2.3426𝐸 + 01 + −2.2247𝐸 + 01 + −1.9168𝐸 + 00 +
CEC07 −2.1418𝐸 + 01 + −5.8694𝐸 + 00 + −5.0410𝐸 + 00 + −6.1675𝐸 + 00 + −7.6934𝐸 + 00 +
CEC08 −1.6843𝐸 + 01 + −6.9555𝐸 + 00 + −5.0856𝐸 + 00 + −5.9736𝐸 + 00 + −1.3848𝐸 + 01 +
CEC09 −6.5860𝐸 + 00 + −5.5750𝐸 + 00 + −1.5600𝐸 + 01 + −2.9052𝐸 + 00 + −1.1004𝐸 + 01 +
CEC10 −1.1048𝐸 + 01 + −1.0762𝐸 + 01 + −1.0996𝐸 + 01 + −5.7163𝐸 + 00 + −1.0726𝐸 + 01 +
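The entries in these tables are two-sample t statistics over independent runs, with "+", "=", and "−" marking whether FROBLAO is significantly better than, statistically equivalent to, or worse than the rival algorithm. A minimal sketch of one such comparison using Welch's t statistic; the run values and run count below are illustrative, not the paper's data:

```python
import math
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two independent samples with unequal variances."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)
    return (mean(sample_a) - mean(sample_b)) / math.sqrt(va / na + vb / nb)

# Hypothetical best-fitness values from 5 independent runs (minimization).
froblao_runs = [1.02, 0.98, 1.00, 1.01, 0.99]
rival_runs = [1.52, 1.48, 1.55, 1.50, 1.45]

# Sign convention visible in the tables: a negative t means FROBLAO's mean
# fitness is lower (better), summarized as '+'; the '=' mark additionally
# requires a significance test on top of the raw statistic.
t = welch_t(froblao_runs, rival_runs)
mark = "+" if t < 0 else "-"
print(round(t, 3), mark)
```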

4. FROBLAO for solving real-life engineering problems

This section evaluates the proposed algorithm’s performance in six real-life engineering problems using constrained engineering
benchmarks. The tension/compression spring design, the pressure vessel design, the welded beam design, the speed reducer design,
the gear train design, and the three-bar truss design are all part of the engineering design problems. The FROBLAO runs indepen-

S. Gopi and P. Mohapatra Heliyon 10 (2024) e26187

Table 14
Results on 𝑡-test for FROBLAO with other compared algorithms of CEC 2020 test functions.
Functions AO vs FROBLAO OBLAO vs FROBLAO TSA vs FROBLAO SSA vs FROBLAO MVO vs FROBLAO GWO vs FROBLAO
t-cost +/=/− t-cost +/=/− t-cost +/=/− t-cost +/=/− t-cost +/=/− t-cost +/=/−
H1 −2.2912𝐸 + 00 + −4.0711𝐸 + 00 + −6.1688𝐸 + 00 + 5.7111𝐸 + 00 − 5.4235𝐸 + 00 − −1.8943𝐸 + 00 +
H2 −8.6425𝐸 + 00 + −5.1919𝐸 + 00 + −1.5611𝐸 + 01 + −6.9648𝐸 + 00 + −5.5446𝐸 + 00 + −3.9755𝐸 + 00 +
H3 −1.1080𝐸 + 01 + −6.6810𝐸 + 00 + −1.4511𝐸 + 01 + −4.3347𝐸 + 00 + −2.4907𝐸 + 00 + −3.9553𝐸 + 00 +
H4 −3.4453𝐸 + 00 + −2.7717𝐸 + 00 + −2.7868𝐸 + 00 + −3.9362𝐸 + 00 + −3.3731𝐸 + 00 + −5.7090𝐸 + 00 +
H5 −2.2592𝐸 + 01 + −8.3911𝐸 + 00 + −6.3279𝐸 + 00 + −3.4348𝐸 + 00 + −5.5701𝐸 + 00 + −2.2272𝐸 + 00 +
H6 −8.3700𝐸 + 00 + −3.9279𝐸 + 00 + −1.6070𝐸 + 01 + −9.9252𝐸 + 00 + −7.4720𝐸 + 00 + −6.8687𝐸 + 00 +
H7 −6.4387𝐸 + 00 + −6.4067𝐸 + 00 + −2.8574𝐸 + 00 + −4.3024𝐸 + 00 + −5.5972𝐸 + 00 + −6.9381𝐸 + 00 +
H8 −4.1360𝐸 + 00 + −1.1112𝐸 + 00 + −5.3102𝐸 + 00 + −1.0761𝐸 + 00 + −1.1279𝐸 + 00 + −6.9509𝐸 + 00 +
H9 −6.3795𝐸 + 00 + −5.3496𝐸 + 00 + −1.5694𝐸 + 01 + −6.9008𝐸 + 00 + −6.2250𝐸 + 00 + −1.0522𝐸 + 01 +
H10 −8.5002𝐸 − 01 + 1.3268𝐸 + 01 − −4.8284𝐸 + 00 + −1.9492𝐸 + 00 + 1.8424𝐸 + 00 − −1.9300𝐸 + 00 +
Functions SCA vs FROBLAO LSHADE vs FROBLAO CMA-ES vs FROBLAO MRFO vs FROBLAO AVOA vs FROBLAO
t-cost +/=/− t-cost +/=/− t-cost +/=/− t-cost +/=/− t-cost +/=/−
H1 −1.7200𝐸 + 01 + 1.1266𝐸 + 01 − −4.4030𝐸 + 00 + 1.0521𝐸 + 01 − 1.0876𝐸 + 01 −
H2 −2.7658𝐸 + 01 + −3.6651𝐸 + 00 + −2.2161𝐸 + 01 + −7.7591𝐸 + 00 + −9.4395𝐸 + 00 +
H3 −2.3658𝐸 + 01 + 3.1439𝐸 + 00 − −8.6408𝐸 + 00 + −6.3381𝐸 + 00 + −8.8609𝐸 + 00 +
H4 −7.7271𝐸 + 00 + −9.9863𝐸 + 00 + −4.8987𝐸 + 00 + −5.6806𝐸 + 00 + −3.6755𝐸 + 00 +
H5 −4.6582𝐸 + 00 + −6.0743𝐸 + 00 + −3.9980𝐸 + 00 + −6.1530𝐸 + 00 + −4.4354𝐸 + 00 +
H6 −1.0858𝐸 + 01 + −2.7441𝐸 + 00 + −6.8386𝐸 + 00 + −4.6564𝐸 + 00 + −6.3569𝐸 + 00 +
H7 −9.3950𝐸 + 00 + −4.1850𝐸 + 00 + −4.8858𝐸 + 00 + −6.4080𝐸 + 00 + −4.7118𝐸 + 00 +
H8 −1.0456𝐸 + 01 + 2.8455𝐸 + 00 − 2.8455𝐸 + 00 − 5.6207𝐸 − 01 = −1.5235𝐸 + 02 +
H9 −1.3825𝐸 + 01 + −6.4329𝐸 + 00 + −1.0670𝐸 + 01 + −3.2098𝐸 + 00 + 2.6803𝐸 + 00 −
H10 −8.8717𝐸 + 00 + −6.5339𝐸 − 01 = −8.0680𝐸 − 01 = 1.0457𝐸 + 00 = 5.3863𝐸 + 01 −

Table 15
Performance rank of each algorithm for CEC 2005 test functions.
Functions AO OBLAO TSA SSA MVO GWO SCA LSHADE CMA-ES MRFO AVOA FROBLAO
F1 5 2 7 9 11 6 8 12 10 3 4 1
F2 5 2 7 10 11 6 8 12 9 3 4 1
F3 5 2 8 6 10 7 9 12 11 3 4 1
F4 5 2 11 7 10 6 8 12 9 3 4 1
F5 4 2 8 11 12 7 5 10 9 6 3 1
F6 7 5 12 2 8 11 10 9 6 1 4 3
F7 3 2 9 10 8 6 7 12 11 4 5 1
F8 9 8 6 11 10 5 12 3 1 4 2 7
F9 5 2 11 10 9 7 6 8 12 4 3 1
F10 4 3 11 10 9 6 8 12 7 5 2 1
F11 3 2 8 11 12 7 9 10 6 5 4 1
F12 6 4 12 10 7 8 9 11 5 2 3 1
F13 5 3 12 6 7 9 8 11 4 10 2 1
F14 9 6 12 5 3 10 8 2 11 4 7 1
F15 6 2 12 9 8 7 11 4 10 5 3 1
F16 12 10 8 6 9 7 11 2 3 5 4 1
F17 11 8 9 6 7 10 12 2 4 3 5 1
F18 10 9 12 5 11 7 8 3 4 2 6 1
F19 11 7 8 4 6 9 10 12 2 1 3 5
F20 11 9 6 10 8 7 12 2 5 3 4 1
F21 4 3 11 9 10 8 12 6 7 5 2 1
F22 6 4 11 8 10 5 12 7 2 9 3 1
F23 5 2 11 9 10 4 12 7 6 8 3 1
Sum of ranks 151 99 222 184 206 165 215 181 154 98 84 35
Mean of ranks 6.57 4.30 9.65 8.00 8.96 7.17 9.35 7.87 6.70 4.26 3.65 1.52
Overall ranks 5 4 12 9 10 7 11 8 6 3 2 1
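The "sum of ranks", "mean of ranks", and "overall ranks" rows follow from ranking the algorithms' mean results per function and averaging over functions. A minimal sketch with hypothetical per-function mean fitness values (the numbers below are illustrative, not the paper's):

```python
def mean_ranks(scores_per_function):
    """scores_per_function: list of {algorithm: mean fitness} dicts (minimization).
    Returns {algorithm: mean rank over all functions}; rank 1 = best."""
    totals = {}
    for scores in scores_per_function:
        ordered = sorted(scores, key=scores.get)  # smallest fitness first
        for rank, algo in enumerate(ordered, start=1):
            totals[algo] = totals.get(algo, 0) + rank
    n = len(scores_per_function)
    return {algo: total / n for algo, total in totals.items()}

# Hypothetical mean fitness of three algorithms on three test functions.
per_function = [
    {"FROBLAO": 0.0, "AO": 0.3, "TSA": 1.2},
    {"FROBLAO": 0.1, "AO": 0.5, "TSA": 0.4},
    {"FROBLAO": 0.0, "AO": 0.2, "TSA": 0.9},
]
ranks = mean_ranks(per_function)
print(ranks)
```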

Table 16
Performance rank of each algorithm for CEC 2019 test functions.

Functions AO OBLAO TSA SSA MVO GWO SCA LSHADE CMA-ES MRFO AVOA FROBLAO


CEC01 5 3 9 11 10 7 12 6 8 2 4 1
CEC02 7 6 10 8 11 4 9 2 12 5 3 1
CEC03 9 7 12 4 6 8 10 2 11 3 5 1
CEC04 10 7 12 6 4 9 11 1 2 3 8 5
CEC05 10 6 12 5 7 8 11 2 1 4 9 3
CEC06 9 4 12 2 5 11 10 6 8 7 3 1
CEC07 9 2 12 6 7 10 11 4 5 3 8 1
CEC08 9 2 12 6 8 5 11 3 7 4 10 1
CEC09 10 6 12 7 5 9 11 4 3 2 8 1
CEC10 4 2 9 5 8 12 10 7 11 3 6 1
Sum of ranks 82 45 112 60 71 83 106 37 68 36 64 16
Mean of ranks 8.2 4.5 11.2 6 7.1 8.3 10.6 3.7 6.8 3.6 6.4 1.6
Overall ranks 9 4 12 5 8 10 11 3 7 2 6 1

dently for each engineering problem, with an Aquila population size of 30, 500 iterations, and 15,000 function
evaluations (NFEs).


Table 17
Performance rank of each algorithm for CEC 2020 test functions.

Functions AO OBLAO TSA SSA MVO GWO SCA LSHADE CMA-ES MRFO AVOA FROBLAO


H1 10 7 12 4 5 9 11 1 8 3 2 6
H2 6 3 10 9 5 4 11 2 12 7 8 1
H3 9 7 12 5 3 4 11 1 6 8 10 2
H4 10 4 12 11 6 7 9 2 8 3 5 1
H5 6 5 11 7 4 9 10 2 12 3 8 1
H6 10 4 12 11 8 7 9 2 5 6 3 1
H7 10 4 12 8 6 7 9 2 11 3 5 1
H8 6 5 12 9 8 7 11 3 2 1 10 4
H9 7 4 12 6 5 8 11 10 9 3 1 2
H10 8 2 12 10 4 9 11 7 6 3 1 5
Sum of ranks 82 45 117 80 54 71 103 32 79 40 53 24
Mean of ranks 8.2 4.5 11.7 8 5.4 7.1 10.3 3.2 7.9 4 5.3 2.4
Overall ranks 10 4 12 9 6 7 11 2 8 3 5 1

Table 18
The tension/compression spring problem’s comparing results.

Algorithms Optimum values for factors Optimum weight


1 2 3
FROBLAO 5.993E-02 5.850E-01 4.651E+00 1.276E-02
AO 6.189E-02 5.704E-01 6.212E+00 1.794E-02
OBLAO 6.551E-02 7.750E-01 2.956E+00 1.648E-02
TSA 5.000E-02 3.169E-01 1.411E+01 1.276E-02
SSA 6.871E-02 1.334E+00 1.177E+00 1.627E+14
MVO 6.989E-02 9.727E-01 2.000E+00 1.900E-02
GWO 5.355E-02 4.030E-01 9.045E+00 1.398E-02
SCA 5.134E-02 3.468E-01 1.203E+01 1.283E-02
LSHADE 6.899E-02 9.334E-01 2.000E+00 1.777E-02
MRFO 5.398E-02 4.144E-01 8.575E+00 1.277E-02
AVOA 5.534E-02 4.511E-01 7.335E+00 1.290E-02

4.1. Tension/compression spring design problem

The tension/compression spring design problem is described in [90]. The primary goal of this problem is to minimize the weight of a given spring. The design of the spring is shown in Fig. 11. Three design variables are used in this problem: the wire diameter ($x_1$), the mean coil diameter ($x_2$), and the number of active coils ($x_3$).
This design problem can be expressed mathematically as follows:

⃗ = [1 2 3 ],
Consider 
⃗ = (3 + 2)2  2 ,
Minimize 𝑓 () 1
23 3
⃗ =1−
Subject to 𝑔1 () ≤ 0,
7178514
422 − 1 2 1
⃗ =
𝑔2 () + ≤ 0,
12566(2 13 − 14 ) 510812
140.451
⃗ =1−
𝑔3 () ≤ 0,
22 3
1 + 2
⃗ =
𝑔4 () − 1 ≤ 0,
1.5
Variable range 0.05 ≤ 1 ≤ 2.00,

0.25 ≤ 2 ≤ 1.30,
2.00 ≤ 3 ≤ 15.0.

Table 18 compares the performance of FROBLAO with the standard algorithms. The results show that FROBLAO solved this problem with optimal variable values (5.993E-02, 5.850E-01, and 4.651E+00) and an optimal weight of 1.276E-02, outperforming the other algorithms. Convergence graphs for FROBLAO and the other compared algorithms are shown in Fig. 12.


Fig. 8. Convergence curves of FROBLAO algorithm in comparison to other algorithms on CEC 2005 test functions.


Fig. 8. (continued)

4.2. Pressure vessel design problem

The goal is to produce a pressure vessel design of minimum cost. Fig. 13 illustrates the pressure vessel and its design parameters. This problem has four variables: shell thickness ($x_1$), head thickness ($x_2$), inner radius ($x_3$), and the length of the cylindrical section excluding the head ($x_4$).
This design problem can be expressed mathematically as follows:

⃗ = [1 2 3 4 ],
Consider 

⃗ = 0.62241 3 4 + 1.77812  2 + 3.1661 2 4 + 19.84 2 3 ,


Minimize 𝑓 () 3 1 1

⃗ = −1 + 0.01933 ≤ 0,
Subject to 𝑔1 ()

⃗ = −1 + 0.009543 ≤ 0,
𝑔2 ()
⃗ = −𝜋 2 4 − 4 𝜋 2 + 1, 296, 000 ≤ 0,
𝑔3 () 3 3 3

⃗ = 4 − 240 ≤ 0,
𝑔4 ()

Variable range 0 ≤ 1 ≤ 99,

0 ≤ 2 ≤ 99,

10 ≤ 3 ≤ 200,

10 ≤ 4 ≤ 20.

Table 19 compares the performance of FROBLAO with the standard algorithms. The results show that FROBLAO solved this problem with optimal variable values (7.842E-01, 3.891E-01, 4.058E+01, and 1.995E+02) and an optimal cost of 5.975E+03, outperforming the other algorithms. Convergence graphs for FROBLAO and the other compared algorithms are shown in Fig. 14.


Fig. 9. Convergence curves of FROBLAO algorithm in comparison to other algorithms on CEC 2019 test functions.

Table 19
The pressure vessel problem’s comparative results.

Algorithms Optimum values for factors Optimum weight


1 2 3 4
FROBLAO 7.842E-01 3.891E-01 4.058E+01 1.995E+02 5.975E+03
AO 1.292E+00 6.636E-01 6.579E+01 1.023E+01 7.882E+03
OBLAO 1.243E+00 6.035E-01 6.322E+01 1.894E+01 7.247E+03
TSA 1.115E+00 5.559E-01 5.711E+01 5.102E+01 6.857E+03
SSA 2.596E+01 2.304E+01 6.331E+01 6.936E+01 1.230E+06
MVO 9.922E-01 4.928E-01 5.112E+01 8.985E+01 6.405E+03
GWO 1.221E+00 6.039E-01 6.323E+01 1.901E+01 7.173E+03
SCA 1.236E+00 5.541E-01 5.683E+01 7.023E+01 8.316E+03
LSHADE 1.000E+01 1.000E+01 5.368E+01 7.159E+01 2.043E+05
MRFO 7.937E-01 4.334E-01 4.109E+01 1.896E+02 6.041E+03
AVOA 1.147E+00 5.669E-01 5.943E+01 3.759E+01 6.775E+03


Fig. 10. Convergence curves of FROBLAO algorithm in comparison to other algorithms on CEC 2020 test functions.

Fig. 11. The design of the tension/compression spring problem.

4.3. Welded beam design problem

A popular welded beam design [91], shown in Fig. 15, is used to examine the performance of FROBLAO in the engineering domain. The goal is to find the design factors that minimize the total manufacturing cost of the welded beam. This problem has seven constraints and four variables: weld thickness ($x_1$), bar length ($x_2$), bar height ($x_3$), and bar thickness ($x_4$).
This design problem can be expressed mathematically as follows:

⃗ = [1 2 3 4 ],
Consider 


Fig. 12. Convergence graphs for the tension/compression spring problem.

Fig. 13. The design of the pressure vessel problem.

Fig. 14. Convergence graphs for the pressure vessel problem.

⃗ = 1.10471 2 2 + 0.048113 4 (14.0 + 2 ),


Minimize 𝑓 () 1
⃗ = 𝜏()
Subject to 𝑔1 () ⃗ − 𝜏𝑚𝑎𝑥 ≤ 0,
⃗ = 𝜎()
𝑔2 () ⃗ − 𝜎𝑚𝑎𝑥 ≤ 0,
⃗ = 𝛿()
𝑔3 () ⃗ − 𝛿𝑚𝑎𝑥 ≤ 0,
⃗ = 1 − 4 ≤ 0,
𝑔4 ()
⃗ = 𝑃 − 𝑃𝑐 ()
𝑔5 () ⃗ ≤ 0,
⃗ = 0.125 − 1 ≤ 0,
𝑔6 ()
⃗ = 1.1047 2 − 0.048113 4 (14.0 + 2 ) − 5.0 ≤ 0,
𝑔7 () 1

Variable range 0.1 ≤ 1 ≤ 2,

0.1 ≤ 2 ≤ 10,

22
S. Gopi and P. Mohapatra Heliyon 10 (2024) e26187

Fig. 15. The design of the welded beam design problem.

0.1 ≤ 3 ≤ 10,
0.1 ≤ 4 ≤ 2,


where 𝜏()⃗ = (𝜏 ′ )2 + 2𝜏 ′ 𝜏 ′′ 2 + (𝜏 ′′ )2 ,
2𝑅
𝑃
𝜏 =√

,
21 2
𝑀𝑅
𝜏 ′′ = ,
𝐽

𝑀 = 𝑃 (𝐿 + 2 ),
2

22
 + 3 2
𝑅= +( 1 ) ,
4 2
√ 2  + 3 2
𝐽 = 2{ 21 2 [ 2 + ( 1 ) ]},
4 2
⃗ = 6𝑃 𝐿 ,
𝜎()
4 32

⃗ = 6𝑃 𝐿3
𝛿() ,
𝐸32 4

32 46 √
4.013𝐸 3
⃗ = 36 𝐸
𝑃𝑐 () (1 − ),
𝐿2 2𝐿 4𝐺
𝑃 = 6.00𝐸 + 03 𝑙𝑏,
𝐿 = 14 𝑖𝑛,
𝛿𝑚𝑎𝑥 = 0.25 𝑖𝑛,
𝐸 = 3.00𝐸 + 07 𝑝𝑠𝑖,
𝐺 = 1.20𝐸 + 07 𝑝𝑠𝑖,
𝜏𝑚𝑎𝑥 = 1.36𝐸 + 04 𝑝𝑠𝑖,
𝜎𝑚𝑎𝑥 = 1.36𝐸 + 04 𝑝𝑠𝑖.
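The stress, deflection, and buckling expressions can be coded directly. The sketch below (function names are ours; $\sigma_{max}$ is taken as the standard 3.00E+04 psi, under which the reported solution is feasible) evaluates FROBLAO's reported solution from Table 20:

```python
import math

# Constants of the welded beam problem (lb, in, psi).
P, L, E, G = 6.0e3, 14.0, 3.0e7, 1.2e7
TAU_MAX, SIGMA_MAX, DELTA_MAX = 1.36e4, 3.0e4, 0.25

def beam_cost(x):
    x1, x2, x3, x4 = x
    return 1.10471 * x1**2 * x2 + 0.04811 * x3 * x4 * (14.0 + x2)

def beam_constraints(x):
    """Constraint values; g_i(x) <= 0 means feasible."""
    x1, x2, x3, x4 = x
    tau_p = P / (math.sqrt(2.0) * x1 * x2)                      # tau'
    M = P * (L + x2 / 2.0)
    half = x2**2 / 4.0 + ((x1 + x3) / 2.0) ** 2
    R = math.sqrt(half)
    J = 2.0 * math.sqrt(2.0) * x1 * x2 * half
    tau_pp = M * R / J                                          # tau''
    tau = math.sqrt(tau_p**2 + 2.0 * tau_p * tau_pp * x2 / (2.0 * R) + tau_pp**2)
    sigma = 6.0 * P * L / (x4 * x3**2)
    delta = 6.0 * P * L**3 / (E * x3**2 * x4)
    Pc = (4.013 * E * math.sqrt(x3**2 * x4**6 / 36.0) / L**2
          * (1.0 - x3 / (2.0 * L) * math.sqrt(E / (4.0 * G))))
    return [tau - TAU_MAX, sigma - SIGMA_MAX, delta - DELTA_MAX,
            x1 - x4, P - Pc, 0.125 - x1,
            0.10471 * x1**2 + 0.04811 * x3 * x4 * (14.0 + x2) - 5.0]

# FROBLAO's reported (rounded) solution from Table 20.
x_best = (1.986e-01, 3.687e+00, 9.067e+00, 2.056e-01)
cost = beam_cost(x_best)
feasible = all(g <= 0.0 for g in beam_constraints(x_best))
print(round(cost, 4), feasible)
```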

Table 20 compares the performance of FROBLAO with the standard algorithms. The results show that FROBLAO solved this problem with optimal variable values (1.986E-01, 3.687E+00, 9.067E+00, and 2.056E-01) and an optimal cost of 1.725E+00, outperforming the other algorithms. Convergence graphs for FROBLAO and the other compared algorithms are shown in Fig. 16.

4.4. Speed reducer design problem

The speed reducer [92] is a crucial component of the gearbox in mechanical systems and has a wide range of uses. In this optimization problem, the weight of the speed reducer must be minimized subject to 11 constraints (see Fig. 17). The problem has seven variables, $x_1$ through $x_7$.


Table 20
The welded beam problem’s comparative results.

Algorithms Optimum values for factors Optimum weight


1 2 3 4
FROBLAO 1.986E-01 3.687E+00 9.067E+00 2.056E-01 1.725E+00
AO 2.478E-01 2.895E+00 9.249E+00 2.591E-01 2.145E+00
OBLAO 1.746E-01 4.320E+00 9.101E+00 2.073E-01 1.809E+00
TSA 1.955E-01 3.775E+00 9.117E+00 2.058E-01 1.764E+00
SSA 1.398E+00 9.842E-01 1.949E+00 1.848E+00 1.750E+24
MVO 1.901E-01 3.714E+00 9.400E+00 2.040E-01 1.783E+00
GWO 1.976E-01 3.667E+00 9.030E+00 2.060E-01 1.740E+00
SCA 1.775E-01 3.994E+00 9.866E+00 2.034E-01 1.876E+00
LSHADE 1.963E+00 1.939E+00 2.000E+00 2.000E+00 1.089E+24
MRFO 2.056E-01 3.469E+00 9.036E+00 2.056E-01 1.747E+00
AVOA 1.742E-01 4.308E+00 9.038E+00 2.057E-01 1.782E+00

Fig. 16. Convergence graphs for the welded beam problem.

Fig. 17. The design of the speed reducer design problem.

This design problem can be expressed mathematically as follows:

⃗ = [1 2 3 4 5 6 7 ],
Consider 
⃗ = 0.78541  2 (3.3333 2 + 14.93343 − 43.0934) − 1.5081 ( 2 +  2 )
Minimize 𝑓 () 2 3 6 7

+7.4777(62 + 72 ) + 0.7854(4 62 + 5 72 ),


Table 21
The speed reducer design problem’s comparative results.

Algorithms Optimum values for factors Optimum weight


1 2 3 4 5 6 7
FROBLAO 3.5359 0.7 17 7.3803 7.8849 3.3549 5.2906 2994.4739
AO 3.6 0.7 17 8.3 7.9518 3.3999 5.3037 3071.7139
OBLAO 3.5476 0.7 17.0455 8.2744 7.7850 3.4791 5.2924 3069.7601
TSA 3.5817 0.7 17 7.3 8.3 3.3655 5.2869 3043.4951
SSA 3.5134 2.7941 3.1298 3.5494 2.8633 3.0239 3.3793 1.878E+16
MVO 3.5106 0.7 17 7.5247 8.1840 3.4735 5.2869 3043.8232
GWO 3.5046 0.7 17 7.4628 8.0724 3.4021 5.2909 3021.8463
SCA 3.6 0.7 17 7.3 8.3 3.4702 5.3891 3144.8691
LSHADE 5.5 1.4308 5.5 5.5 5.5 3.2804 5.1030 3.270E+14
MRFO 3.5 0.7 17 7.3 7.7153 3.3502 5.2866 3016.7605
AVOA 3.5 0.7 17 8.0272 8.1225 3.3516 5.2867 3010.2794

\[
\begin{aligned}
\text{Subject to } \quad & g_1(\vec{x}) = \frac{27}{x_1 x_2^2 x_3} - 1 \le 0,\\
& g_2(\vec{x}) = \frac{397.5}{x_1 x_2^2 x_3^2} - 1 \le 0,\\
& g_3(\vec{x}) = \frac{1.93\, x_4^3}{x_2 x_6^4 x_3} - 1 \le 0,\\
& g_4(\vec{x}) = \frac{1.93\, x_5^3}{x_2 x_7^4 x_3} - 1 \le 0,\\
& g_5(\vec{x}) = \frac{\sqrt{\left(745\, x_4 / (x_2 x_3)\right)^2 + 16.9 \times 10^6}}{110\, x_6^3} - 1 \le 0,\\
& g_6(\vec{x}) = \frac{\sqrt{\left(745\, x_5 / (x_2 x_3)\right)^2 + 157.5 \times 10^6}}{85\, x_7^3} - 1 \le 0,\\
& g_7(\vec{x}) = \frac{x_2 x_3}{40} - 1 \le 0,\\
& g_8(\vec{x}) = \frac{5 x_2}{x_1} - 1 \le 0,\\
& g_9(\vec{x}) = \frac{x_1}{12 x_2} - 1 \le 0,\\
& g_{10}(\vec{x}) = \frac{1.5\, x_6 + 1.9}{x_4} - 1 \le 0,\\
& g_{11}(\vec{x}) = \frac{1.1\, x_7 + 1.9}{x_5} - 1 \le 0,\\
\text{Variable range } \quad & 2.6 \le x_1 \le 3.6,\quad 0.7 \le x_2 \le 0.8,\quad x_3 \in \{17, 18, 19, \dots, 28\},\\
& 7.3 \le x_4 \le 8.3,\quad 7.3 \le x_5 \le 8.3,\quad 2.9 \le x_6 \le 3.9,\quad 5 \le x_7 \le 5.5.
\end{aligned}
\]
Table 21 compares the performance of FROBLAO with the standard algorithms. The results show that FROBLAO solved this problem with optimal variable values (3.536E+00, 7.000E-01, 1.700E+01, 7.380E+00, 7.885E+00, 3.355E+00, and 5.291E+00) and an optimal weight of 2.994E+03, outperforming the other algorithms. Convergence graphs for FROBLAO and the other compared algorithms are shown in Fig. 18.


Fig. 18. Convergence graphs for the speed reducer problem.

Fig. 19. The design of the gear train problem.

4.5. Gear train design problem

Sandgren proposed the gear train design problem [93]. Its aim is to minimize the gear ratio error, where the gear ratio is the ratio of the angular velocities of the output and input shafts. The variables represent the number of teeth in gears $A$ ($x_1$), $B$ ($x_2$), $C$ ($x_3$), and $D$ ($x_4$). Fig. 19 depicts the gear train problem.
This design problem can be expressed mathematically as follows:

⃗ = [1 2 3 4 ],
Consider 
1  
⃗ =(
Minimize 𝑓 () − 3 2 )2 ,
6.931 1 4
Variable range 1 2 3 4 ∈ {12, 13, 14, ⋯ , 60}.

Table 22 compares the performance of FROBLAO with the standard algorithms. The results show that FROBLAO solved this problem with optimal variable values (5.918E+01, 1.640E+01, 3.123E+01, and 5.999E+01) and an optimal gear ratio error of 1.233E-32, outperforming the other algorithms. Convergence graphs for FROBLAO and the other compared algorithms are shown in Fig. 20.

4.6. Three-bar truss design problem

A three-bar planar truss is depicted in Fig. 21. The volume of the statically loaded three-bar truss must be minimized while the stress ($\sigma$) limit on each member is maintained. The aim is to find the optimal cross-sectional areas $H_1$ ($x_1$) and $H_2$ ($x_2$).
This design problem can be expressed mathematically as follows:

⃗ = [1 2 ],
Consider 

⃗ = (2 21 + 2 ) × 𝑙,
Minimize 𝑓 ()


Table 22
The gear train problem’s comparative results.

Algorithms Optimum values for factors Optimum weight


1 2 3 4
FROBLAO 5.918E+01 1.640E+01 3.123E+01 5.999E+01 1.233E-32
AO 5.246E+01 3.049E+01 1.200E+01 4.834E+01 3.058E-12
OBLAO 5.093E+01 3.359E+01 1.261E+01 5.767E+01 1.743E-13
TSA 6.000E+01 2.096E+01 1.504E+01 3.642E+01 1.160E-11
SSA 4.660E+01 2.238E+01 1.751E+01 5.123E+01 3.968E-04
MVO 4.526E+01 3.086E+01 1.200E+01 5.672E+01 5.224E-12
GWO 6.001E+01 2.415E+01 1.201E+01 3.348E+01 3.182E-12
SCA 4.394E+01 1.937E+01 1.478E+01 4.519E+01 1.537E-10
LSHADE 5.945E+01 2.091E+01 2.390E+01 5.827E+01 3.397E-31
MRFO 5.998E+01 4.324E+01 1.200E+01 5.998E+01 1.374E-19
AVOA 5.447E+01 1.356E+01 1.948E+01 3.363E+01 9.322E-30

Fig. 20. Convergence graphs for the gear train problem.

Fig. 21. The design of the three-bar truss problem.


21 + 2

Subject to 𝑔1 () = √  − 𝜎 ≤ 0,
212 + 21 2
2
⃗ =√
𝑔2 ()  − 𝜎 ≤ 0,
212 + 21 2

⃗ =√ 1
𝑔3 ()  − 𝜎 ≤ 0,
22 + 1
𝑙 = 100𝑐𝑚,
 = 2𝑘 ∕(𝑐𝑚)3 ,
𝜎 = 2𝑘 ∕(𝑐𝑚)3 ,


Table 23
The three-bar truss problem’s comparative results.

Algorithms Optimum values for factors Optimum weight


𝐻1 𝐻2
FROBLAO 7.880E-01 4.113E-01 264.0251
AO 7.610E-01 5.111E-01 266.3447
OBLAO 7.915E-01 4.030E-01 264.1778
TSA 7.881E-01 4.103E-01 263.9257
SSA 8.721E-01 2.255E-01 269.2571
MVO 7.924E-01 3.979E-01 263.9087
GWO 7.877E-01 4.109E-01 263.9106
SCA 7.729E-01 4.554E-01 264.1940
LSHADE 7.887E-01 4.082E-01 263.8958
MRFO 7.885E-01 4.089E-01 263.8958
AVOA 7.961E-01 3.876E-01 263.9354

Fig. 22. Convergence graphs for the three-bar truss problem.

Variable range 0 ≤ 1 , 1 ≤ 1.
Table 23 compares the performance of FROBLAO with the standard algorithms. The results show that FROBLAO solved this problem with optimal variable values (7.880E-01 and 4.113E-01) and an optimal volume of 264.0251, competitive with the best-performing algorithms in the table. Convergence graphs for FROBLAO and the other compared algorithms are shown in Fig. 22.

5. Conclusion

This paper introduced an improved Aquila optimizer that employs the Fast Random Opposition-Based Learning (FROBL) strategy; the resulting algorithm is named FROBLAO. The strategy enhances convergence speed and helps escape local optima by generating an opposite solution to the current one. FROBLAO was tested on the CEC 2005, CEC 2019, and CEC 2020 test functions, and the results were compared with eleven competitive algorithms. The experimental results demonstrate that FROBLAO improves optimization by balancing exploration and exploitation while converging more quickly. In addition, FROBLAO was applied to six real-life engineering problems, and the results were compared with other state-of-the-art meta-heuristic algorithms. The statistical outcomes show that FROBLAO also performs better on engineering optimization problems than the other meta-heuristic techniques. In future work, FROBLAO can be applied to feature selection, combinatorial optimization, job scheduling, and stress suitability problems. Future versions may include binary and multi-objective variants.

CRediT authorship contribution statement

Gopi S.: Writing – original draft, Validation, Methodology, Formal analysis, Data curation. Prabhujit Mohapatra: Writing –
review & editing, Validation, Supervision, Resources, Methodology, Investigation, Conceptualization.

Declaration of competing interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to
influence the work reported in this paper.


Data availability

No data was used for the research described in the article.

References

[1] P. Mohapatra, K.N. Das, S. Roy, A modified competitive swarm optimizer for large scale optimization problems, Appl. Soft Comput. 59 (2017) 340–362, https://
doi.org/10.1016/j.asoc.2017.05.060.
[2] S. Mohapatra, P. Sarangi, P. Mohapatra, An improvised grey wolf optimiser for global optimisation problems, Int. J. Math. Oper. Res. 26 (2) (2023) 263–281,
https://doi.org/10.1504/IJMOR.2023.134490.
[3] P. Mohapatra, Combined economic emission dispatch in hybrid power systems using competitive swarm optimization, J. King Saud Univ, Comput. Inf. Sci.
34 (10) (2022) 8955–8971, https://doi.org/10.1016/j.jksuci.2022.08.022.
[4] P. Sarangi, P. Mohapatra, Modified hybrid gwo-sca algorithm for solving optimization problems, in: International Conference on Data Analytics and Computing,
Springer, 2022, pp. 121–128.
[5] S. Mohapatra, P. Mohapatra, An improved golden jackal optimization algorithm using opposition-based learning for global optimization and engineering
problems, Int. J. Comput. Intell. Syst. 16 (1) (2023) 147, https://doi.org/10.1007/s44196-023-00320-8.
[6] P. Sarangi, P. Mohapatra, A novel cosine swarm algorithm for solving optimization problems, in: Proceedings of 7th International Conference on Harmony
Search, Soft Computing and Applications: ICHSA 2022, Springer, 2022, pp. 427–434.
[7] M.S. Braik, Chameleon swarm algorithm: a bio-inspired optimizer for solving engineering design problems, Expert Syst. Appl. 174 (2021) 114685, https://
doi.org/10.1016/j.eswa.2021.114685.
[8] S. Gopi, P. Mohapatra, Opposition-based learning cooking algorithm (olca) for solving global optimization and engineering problems, Int. J. Mod. Phys. C (2023),
https://doi.org/10.1142/S0129183124500517.
[9] P. Sarangi, P. Mohapatra, Evolved opposition-based mountain gazelle optimizer to solve optimization problems, J. King Saud Univ, Comput. Inf. Sci. (2023)
101812, https://doi.org/10.1016/j.jksuci.2023.101812.
[10] T. Ommen, W.B. Markussen, B. Elmegaard, Comparison of linear, mixed integer and non-linear programming methods in energy system dispatch modelling,
Energy 74 (2014) 109–118, https://doi.org/10.1016/j.energy.2014.04.023.
[11] V. Chandran, P. Mohapatra, Enhanced opposition-based grey wolf optimizer for global optimization and engineering design problems, Alex. Eng. J. 76 (2023)
429–467, https://doi.org/10.1016/j.aej.2023.06.048.
[12] P. Mohapatra, S. Roy, K.N. Das, S. Dutta, M.S.S. Raju, A review of evolutionary algorithms in solving large scale benchmark optimisation problems, Int. J. Math.
Oper. Res. 21 (1) (2022) 104–126, https://doi.org/10.1504/IJMOR.2022.120340.
[13] M. Braik, A. Sheta, H. Al-Hiary, A novel meta-heuristic search algorithm for solving optimization problems: capuchin search algorithm, Neural Comput. Appl.
33 (2021) 2515–2547, https://doi.org/10.1007/s00521-020-05145-6.
[14] L. Abualigah, D. Yousri, M. Abd Elaziz, A.A. Ewees, M.A. Al-Qaness, A.H. Gandomi, Aquila optimizer: a novel meta-heuristic optimization algorithm, Comput.
Ind. Eng. 157 (2021) 107250, https://doi.org/10.1016/j.cie.2021.107250.
[15] N. Kumar, R. Kumar, P. Mohapatra, R. Kumar, Modified competitive swarm technique for solving the economic load dispatch problem, J. Inf. Optim. Sci. 41 (1)
(2020) 173–184, https://doi.org/10.1080/02522667.2020.1714184.
[16] L. Abualigah, A. Diabat, Advances in sine cosine algorithm: a comprehensive survey, Artif. Intell. Rev. 54 (4) (2021) 2567–2608, https://doi.org/10.1007/s10462-020-09909-3.
[17] B. Schneider, U. Ranft, Simulationsmethoden in der Medizin und Biologie, in: Workshop, Hannover, 29. Sept.–1. Okt. 1977, vol. 8, Springer-Verlag, 2013.
[18] M. Srinivas, L.M. Patnaik, Genetic algorithms: a survey, Computer 27 (6) (1994) 17–26, https://doi.org/10.1109/2.294849.
[19] D. Simon, Biogeography-based optimization, IEEE Trans. Evol. Comput. 12 (6) (2008) 702–713, https://doi.org/10.1109/TEVC.2008.919004.
[20] J.R. Koza, J.P. Rice, Automatic programming of robots using genetic programming, in: AAAI, vol. 92, 1992, pp. 194–207.
[21] R.A. Sarker, S.M. Elsayed, T. Ray, Differential evolution with dynamic parameters selection for optimization problems, IEEE Trans. Evol. Comput. 18 (5) (2013)
689–707, https://doi.org/10.1109/TEVC.2013.2281528.
[22] X. Yao, Y. Liu, K.-H. Liang, G. Lin, Fast evolutionary algorithms, in: Advances in Evolutionary Computing: Theory and Applications, 2003, pp. 45–94.
[23] D. Dasgupta, Z. Michalewicz, Evolutionary algorithms in engineering applications, Int. J. Evol. Optim. 1 (1) (1999) 93–94.
[24] A. Cheraghalipour, M. Hajiaghaei-Keshteli, M.M. Paydar, Tree growth algorithm (tga): a novel approach for solving optimization problems, Eng. Appl. Artif.
Intell. 72 (2018) 393–414, https://doi.org/10.1016/j.engappai.2018.04.021.
[25] L. Abualigah, A. Diabat, S. Mirjalili, M. Abd Elaziz, A.H. Gandomi, The arithmetic optimization algorithm, Comput. Methods Appl. Mech. Eng. 376 (2021)
113609, https://doi.org/10.1016/j.cma.2020.113609.
[26] B. Webster, P.J. Bernhard, A local search optimization algorithm based on natural principles of gravitation, tech. report, 2003.
[27] E. Rashedi, H. Nezamabadi-Pour, S. Saryazdi, Gsa: a gravitational search algorithm, Inf. Sci. 179 (13) (2009) 2232–2248, https://doi.org/10.1016/j.ins.2009.
03.004.
[28] D. Bertsimas, J. Tsitsiklis, Simulated annealing, Stat. Sci. 8 (1) (1993) 10–15, https://doi.org/10.1214/ss/1177011077.
[29] S. Mirjalili, S.M. Mirjalili, A. Hatamlou, Multi-verse optimizer: a nature-inspired algorithm for global optimization, Neural Comput. Appl. 27 (2016) 495–513,
https://doi.org/10.1007/s00521-015-1870-7.
[30] H. Abedinpourshotorban, S.M. Shamsuddin, Z. Beheshti, D.N. Jawawi, Electromagnetic field optimization: a physics-inspired metaheuristic optimization algo-
rithm, Swarm Evol. Comput. 26 (2016) 8–22, https://doi.org/10.1016/j.swevo.2015.07.002.
[31] A. Faramarzi, M. Heidarinejad, B. Stephens, S. Mirjalili, Equilibrium optimizer: a novel optimization algorithm, Knowl.-Based Syst. 191 (2020) 105190, https://
doi.org/10.1016/j.knosys.2019.105190.
[32] R. Formato, Central force optimization: a new metaheuristic with applications in applied electromagnetics, Prog. Electromagn. Res. 77 (2007) 425–491.
[33] O.K. Erol, I. Eksin, A new optimization method: big bang–big crunch, Adv. Eng. Softw. 37 (2) (2006) 106–111, https://doi.org/10.1016/j.advengsoft.2005.04.
005.
[34] A. Kaveh, M. Khayatazad, A new meta-heuristic method: ray optimization, Comput. Struct. 112 (2012) 283–294, https://doi.org/10.1016/j.compstruc.2012.09.
003.
[35] F.A. Hashim, E.H. Houssein, M.S. Mabrouk, W. Al-Atabany, S. Mirjalili, Henry gas solubility optimization: a novel physics-based algorithm, Future Gener.
Comput. Syst. 101 (2019) 646–667, https://doi.org/10.1016/j.future.2019.07.015.
[36] F.F. Moghaddam, R.F. Moghaddam, M. Cheriet, Curved space optimization: a random search based on general relativity theory, arXiv preprint, arXiv:1208.2214,
https://doi.org/10.48550/arXiv.1208.2214, 2012.
[37] M. Dorigo, V. Maniezzo, A. Colorni, Ant system: optimization by a colony of cooperating agents, IEEE Trans. Syst. Man Cybern., Part B, Cybern. 26 (1) (1996)
29–41, https://doi.org/10.1109/3477.484436.
[38] S. Mirjalili, A.H. Gandomi, S.Z. Mirjalili, S. Saremi, H. Faris, S.M. Mirjalili, Salp swarm algorithm: a bio-inspired optimizer for engineering design problems,
Adv. Eng. Softw. 114 (2017) 163–191, https://doi.org/10.1016/j.advengsoft.2017.07.002.

S. Gopi and P. Mohapatra Heliyon 10 (2024) e26187

[39] R. Eberhart, J. Kennedy, A new optimizer using particle swarm theory, in: Proceedings of the Sixth International Symposium on Micro Machine and Human Science, MHS'95, 1995, pp. 39–43.
[40] S. Gopi, P. Mohapatra, A modified whale optimisation algorithm to solve global optimisation problems, in: Proceedings of 7th International Conference on Harmony Search, Soft Computing and Applications: ICHSA 2022, 2022, pp. 465–477.
[41] D. Karaboga, B. Basturk, A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm, J. Glob. Optim. 39 (2007) 459–471, https://doi.org/10.1007/s10898-007-9149-x.
[42] S. Mirjalili, The ant lion optimizer, Adv. Eng. Softw. 83 (2015) 80–98, https://doi.org/10.1016/j.advengsoft.2015.01.010.
[43] S. Li, H. Chen, M. Wang, A.A. Heidari, S. Mirjalili, Slime mould algorithm: a new method for stochastic optimization, Future Gener. Comput. Syst. 111 (2020) 300–323, https://doi.org/10.1016/j.future.2020.03.055.
[44] A.H. Gandomi, X.-S. Yang, A.H. Alavi, Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems, Eng. Comput. 29 (2013) 17–35, https://doi.org/10.1007/s00366-011-0241-y.
[45] S. Mirjalili, Moth-flame optimization algorithm: a novel nature-inspired heuristic paradigm, Knowl.-Based Syst. 89 (2015) 228–249, https://doi.org/10.1016/j.knosys.2015.07.006.
[46] N.F. Johari, A.M. Zain, M.H. Noorfa, A. Udin, Firefly algorithm for optimization problem, Appl. Mech. Mater. 421 (2013) 512–517, https://doi.org/10.4028/www.scientific.net/AMM.421.512.
[47] G. Dhiman, V. Kumar, Seagull optimization algorithm: theory and its applications for large-scale industrial engineering problems, Knowl.-Based Syst. 165 (2019) 169–196, https://doi.org/10.1016/j.knosys.2018.11.024.
[48] S. Mirjalili, S.M. Mirjalili, A. Lewis, Grey wolf optimizer, Adv. Eng. Softw. 69 (2014) 46–61, https://doi.org/10.1016/j.advengsoft.2013.12.007.
[49] S. Mohapatra, P. Mohapatra, American zebra optimization algorithm for global optimization problems, Sci. Rep. 13 (1) (2023) 5211, https://doi.org/10.1038/s41598-023-31876-2.
[50] G. Dhiman, A. Kaur, STOA: a bio-inspired based optimization algorithm for industrial engineering problems, Eng. Appl. Artif. Intell. 82 (2019) 148–174, https://doi.org/10.1016/j.engappai.2019.03.021.
[51] S. Mirjalili, A. Lewis, The whale optimization algorithm, Adv. Eng. Softw. 95 (2016) 51–67, https://doi.org/10.1016/j.advengsoft.2016.01.008.
[52] H.A. Alsattar, A. Zaidan, B. Zaidan, Novel meta-heuristic bald eagle search optimisation algorithm, Artif. Intell. Rev. 53 (2020) 2237–2264, https://doi.org/10.1007/s10462-019-09732-5.
[53] A. Faramarzi, M. Heidarinejad, S. Mirjalili, A.H. Gandomi, Marine predators algorithm: a nature-inspired metaheuristic, Expert Syst. Appl. 152 (2020) 113377, https://doi.org/10.1016/j.eswa.2020.113377.
[54] Q. Zhang, R. Wang, J. Yang, K. Ding, Y. Li, J. Hu, Collective decision optimization algorithm: a new heuristic optimization method, Neurocomputing 221 (2017) 123–137, https://doi.org/10.1016/j.neucom.2016.09.068.
[55] R.V. Rao, V.J. Savsani, D. Vakharia, Teaching-learning-based optimization: an optimization method for continuous non-linear large scale problems, Inf. Sci. 183 (1) (2012) 1–15, https://doi.org/10.1016/j.ins.2011.08.006.
[56] Z.W. Geem, J.H. Kim, G.V. Loganathan, A new heuristic optimization algorithm: harmony search, Simulation 76 (2) (2001) 60–68, https://doi.org/10.1177/003754970107600201.
[57] Y. Tan, Y. Zhu, Fireworks algorithm for optimization, in: Advances in Swarm Intelligence: First International Conference, ICSI 2010, Beijing, China, June 12–15, 2010, in: Proceedings, Part I, vol. 1, 2010, pp. 355–364.
[58] M. Kumar, A.J. Kulkarni, S.C. Satapathy, Socio evolution & learning optimization algorithm: a socio-inspired optimization methodology, Future Gener. Comput. Syst. 81 (2018) 252–272, https://doi.org/10.1016/j.future.2017.10.052.
[59] S.H.S. Moosavi, V.K. Bardsiri, Poor and rich optimization algorithm: a new human-based and multi populations algorithm, Eng. Appl. Artif. Intell. 86 (2019) 165–181, https://doi.org/10.1016/j.engappai.2019.08.025.
[60] D. Panwar, G. Saini, P. Agarwal, Human eye vision algorithm (HEVA): a novel approach for the optimization of combinatorial problems, Artif. Intell. Healthc. (2022) 61–71, https://doi.org/10.1007/978-981-16-6265-2_5.
[61] A.R. Jordehi, Brainstorm optimisation algorithm (BSOA): an efficient algorithm for finding optimal location and setting of FACTS devices in electric power systems, Int. J. Electr. Power Energy Syst. 69 (2015) 48–57, https://doi.org/10.1016/j.ijepes.2014.12.083.
[62] S.J. Mousavirad, H. Ebrahimpour-Komleh, Human mental search: a new population-based metaheuristic optimization algorithm, Appl. Intell. 47 (2017) 850–887, https://doi.org/10.1007/s10489-017-0903-6.
[63] S. Wang, H. Jia, L. Abualigah, Q. Liu, R. Zheng, An improved hybrid Aquila optimizer and Harris hawks algorithm for solving industrial engineering optimization problems, Processes 9 (9) (2021) 1551, https://doi.org/10.3390/pr9091551.
[64] Y.-J. Zhang, Y.-X. Yan, J. Zhao, Z.-M. Gao, AOAAO: the hybrid algorithm of arithmetic optimization algorithm with Aquila optimizer, IEEE Access 10 (2022) 10907–10933, https://doi.org/10.1109/ACCESS.2022.3144431.
[65] S. Saremi, S. Mirjalili, A. Lewis, Grasshopper optimisation algorithm: theory and application, Adv. Eng. Softw. 105 (2017) 30–47, https://doi.org/10.1016/j.advengsoft.2017.01.004.
[66] J. Zhao, Z.-M. Gao, H.-F. Chen, The simplified Aquila optimization algorithm, IEEE Access 10 (2022) 22487–22515, https://doi.org/10.1109/ACCESS.2022.3153727.
[67] B. Gao, Y. Shi, F. Xu, X. Xu, An improved Aquila optimizer based on search control factor and mutations, Processes 10 (8) (2022) 1451, https://doi.org/10.3390/pr10081451.
[68] H. Yu, H. Jia, J. Zhou, A. Hussien, Enhanced Aquila optimizer algorithm for global optimization and constrained engineering problems, Math. Biosci. Eng. 19 (12) (2022) 14173–14211, https://doi.org/10.3934/mbe.2022660.
[69] Y. Wang, Y. Zhang, Y. Yan, J. Zhao, Z. Gao, An enhanced Aquila optimization algorithm with velocity-aided global search mechanism and adaptive opposition-based learning, Math. Biosci. Eng. 20 (4) (2023) 6422–6467, https://doi.org/10.3934/mbe.2023278.
[70] S. Ekinci, D. Izci, E. Eker, L. Abualigah, An effective control design approach based on novel enhanced Aquila optimizer for automatic voltage regulator, Artif. Intell. Rev. 56 (2) (2023) 1731–1762, https://doi.org/10.1007/s10462-022-10216-2.
[71] A.M. AlRassas, M.A. Al-qaness, A.A. Ewees, S. Ren, M. Abd Elaziz, R. Damaševičius, T. Krilavičius, Optimized ANFIS model using Aquila optimizer for oil production forecasting, Processes 9 (7) (2021) 1194, https://doi.org/10.3390/pr9071194.
[72] D.H. Wolpert, W.G. Macready, No free lunch theorems for optimization, IEEE Trans. Evol. Comput. 1 (1) (1997) 67–82, https://doi.org/10.1109/4235.585893.
[73] S. Mohapatra, P. Mohapatra, Fast random opposition-based learning golden jackal optimization algorithm, Knowl.-Based Syst. (2023) 110679, https://doi.org/10.1016/j.knosys.2023.110679.
[74] S. Kaur, L.K. Awasthi, A. Sangal, G. Dhiman, Tunicate swarm algorithm: a new bio-inspired based metaheuristic paradigm for global optimization, Eng. Appl. Artif. Intell. 90 (2020) 103541, https://doi.org/10.1016/j.engappai.2020.103541.
[75] S. Mirjalili, SCA: a sine cosine algorithm for solving optimization problems, Knowl.-Based Syst. 96 (2016) 120–133, https://doi.org/10.1016/j.knosys.2015.12.022.
[76] R. Tanabe, A.S. Fukunaga, Improving the search performance of SHADE using linear population size reduction, in: 2014 IEEE Congress on Evolutionary Computation (CEC), 2014, pp. 1658–1665.
[77] N. Hansen, S.D. Müller, P. Koumoutsakos, Reducing the time complexity of the derandomized evolution strategy with covariance matrix adaptation (CMA-ES), Evol. Comput. 11 (1) (2003) 1–18, https://doi.org/10.1162/106365603321828970.

[78] W. Zhao, Z. Zhang, L. Wang, Manta ray foraging optimization: an effective bio-inspired optimizer for engineering applications, Eng. Appl. Artif. Intell. 87 (2020) 103300, https://doi.org/10.1016/j.engappai.2019.103300.
[79] B. Abdollahzadeh, F.S. Gharehchopogh, S. Mirjalili, African vultures optimization algorithm: a new nature-inspired metaheuristic algorithm for global optimization problems, Comput. Ind. Eng. 158 (2021) 107408, https://doi.org/10.1016/j.cie.2021.107408.
[80] H.R. Tizhoosh, Opposition-based learning: a new scheme for machine intelligence, in: International Conference on Computational Intelligence for Modelling, Control and Automation and International Conference on Intelligent Agents, Web Technologies and Internet Commerce (CIMCA-IAWTIC'06), vol. 1, 2005, pp. 695–701.
[81] J.G. Digalakis, K.G. Margaritis, On benchmarking functions for genetic algorithms, Int. J. Comput. Math. 77 (4) (2001) 481–506, https://doi.org/10.1080/00207160108805080.
[82] J.-J. Liang, P.N. Suganthan, K. Deb, Novel composition test functions for numerical global optimization, in: Proceedings 2005 IEEE Swarm Intelligence Symposium, SIS 2005, 2005, pp. 68–75.
[83] K. Price, N. Awad, M. Ali, P. Suganthan, Problem definitions and evaluation criteria for the 100-digit challenge special session and competition on single objective numerical optimization, Technical Report, Nanyang Technological University, Singapore, 2018, p. 1.
[84] J.-J. Liang, B. Qu, D. Gong, C. Yue, Problem Definitions and Evaluation Criteria for the CEC 2019 Special Session on Multimodal Multiobjective Optimization, Computational Intelligence Laboratory, Zhengzhou University, 2019.
[85] H.W. Lilliefors, On the Kolmogorov-Smirnov test for normality with mean and variance unknown, J. Am. Stat. Assoc. 62 (318) (1967) 399–402, https://doi.org/10.1080/01621459.1967.10482916.
[86] B.B. Schultz, Levene's test for relative variation, Syst. Zool. 34 (4) (1985) 449–456, https://doi.org/10.1093/sysbio/34.4.449.
[87] G.S. Easton, R.E. McCulloch, A multivariate generalization of quantile-quantile plots, J. Am. Stat. Assoc. 85 (410) (1990) 376–386, https://doi.org/10.1080/01621459.1990.10476210.
[88] F. Wilcoxon, Individual Comparisons by Ranking Methods, Springer, 1992.
[89] M. Friedman, The use of ranks to avoid the assumption of normality implicit in the analysis of variance, J. Am. Stat. Assoc. 32 (200) (1937) 675–701, https://doi.org/10.1080/01621459.1937.10503522.
[90] J. Arora, Introduction to Optimum Design, fourth ed., Elsevier, 2012.
[91] C.A.C. Coello, Use of a self-adaptive penalty approach for engineering optimization problems, Comput. Ind. 41 (2) (2000) 113–127, https://doi.org/10.1016/S0166-3615(99)00046-9.
[92] D. Sattar, R. Salim, A smart metaheuristic algorithm for solving engineering problems, Eng. Comput. 37 (3) (2021) 2389–2417, https://doi.org/10.1007/s00366-020-00951-x.
[93] E. Sandgren, Nonlinear integer and discrete programming in mechanical design, Am. Soc. Mech. Eng. 26584 (1988) 95–105, https://doi.org/10.1115/DETC1988-0012.