An Improved Tunicate Swarm Algorithm With Best-Random Mutation Strategy For Global Optimization Problems
https://doi.org/10.1007/s42235-022-00185-1
RESEARCH ARTICLE
F. S. Gharehchopogh
Received: 5 November 2021 / Revised: 25 February 2022 / Accepted: 28 February 2022 / Published online: 28 March 2022
© Jilin University 2022
Abstract
The Tunicate Swarm Algorithm (TSA) is inspired by the life of tunicates at sea and the way they obtain food. Despite its simplicity and the quality of its solutions, the algorithm is easily trapped in local optima, leading to premature convergence compared with some other metaheuristic algorithms. This paper seeks to improve the algorithm's performance on global optimization problems using mutation operators, namely the Lévy, Cauchy, and Gaussian mutation operators. Thus, we introduce a version of this algorithm called the QLGCTSA algorithm. Each of these operators behaves differently, increasing the QLGCTSA algorithm's performance at a specific stage of the optimization process. The algorithm has been run on benchmark functions covering three groups (composition, unimodal (UM), and multimodal (MM) functions), and its performance has been evaluated on six large-scale engineering problems. Experimental results show that the QLGCTSA algorithm outperforms other competing optimization algorithms.
Inspired by the life of tunicates at sea and how they obtain food, Kaur et al. proposed the TSA algorithm [20]. It is one of the most recent metaheuristic algorithms for solving engineering optimization problems. Tunicates can search for a food source even though they have no prior knowledge of its location. The TSA algorithm is easily trapped in local optima despite its simplicity and the quality of its solutions, leading to premature convergence compared with some other metaheuristic algorithms. This paper seeks to improve the algorithm's performance using mutation operators based on the Lévy, Cauchy, and Gaussian distributions. Each of these distributions behaves differently, increasing the algorithm's performance at a particular stage of the optimization process. The conducted experiments suggest that applying the three mutation operators simultaneously further improves the performance of the algorithm. The Gaussian mutation creates small jumps in the solutions. The Cauchy mutation, on the other hand, creates larger mutations than the Gaussian mutation, thus increasing the overall ability of the search agents to search globally in each generation.

Different classes of metaheuristic algorithms have been proposed in the literature. Regardless of class, one can observe that several of these algorithms are based on collective behavior and on how animals hunt in nature [21]. Our goal in this subsection is to review some nature-inspired metaheuristics and basic algorithms for optimization problems. The Spider Monkey Optimization (SMO) algorithm [22] performs numerical optimization by modeling the foraging of spider monkeys, which have fission-fusion (division and combination) social structures: such animals move from larger groups to smaller ones when food is scarce, and vice versa. The symbiotic organisms search metaheuristic is based on the interaction of animals in nature [23]; it considers three stages of coexistence (mutual assistance, food sharing, and parasitic life) such that two animals may benefit or harm each other. The prey-predator metaheuristic algorithm [24] randomly assigns the produced solutions the roles of predator or prey according to their values in the objective function. A metaheuristic optimization algorithm inspired by the movement of stars, galaxies, and massive galaxy clusters under the influence of gravity, called the Galactic Swarm Optimization (GSO) algorithm, was proposed in 2015 [25]. It applies several exploration and exploitation stages to establish the best balance between the exploration of new solutions and their exploitation. Also, a metaheuristic algorithm provided by Mirjalili et al. (2014), named the Grey Wolf Optimizer (GWO), is based on the hierarchical behavior of wolves [26]. The three best solutions are, in order, the alpha, beta, and delta wolves, with the remaining solutions considered regular wolves [27].

A population-based metaheuristic algorithm known as the Sine Cosine Algorithm (SCA) solves optimization problems [28]. In this algorithm, several solutions are produced as random initial candidates and move toward the best solution in a sinusoidal/co-sinusoidal way (up and down) driven by sine and cosine functions. Several random and adaptive variables are incorporated into the algorithm to emphasize exploration and exploitation of the search space at different optimization milestones. Simulation results confirm that the SCA algorithm effectively explores different areas of the search space, avoids local optima, converges towards the global optimum, and exploits promising regions of the search space during optimization. The Spotted Hyena Optimization (SHO) algorithm [29] is a metaheuristic algorithm with a collective approach based on the cooperative hunting behavior of spotted hyenas. In this algorithm, each solution to the problem is encoded as a hyena. The hyenas approach the optimal answer, or the prey, by following the best solution or the group-leading hyena. The three primary SHO stages are searching for the prey, encircling it, and attacking it, all of which are mathematically modeled and executed.

A population-based metaheuristic optimization algorithm inferred from the collaborative behavior and hunting style of Harris hawks in nature, using a surprise-pounce model, is known as the Harris Hawks Optimization (HHO) algorithm [18]. In this algorithm, several Harris hawks collaborate and attack from different angles to surprise the prey. Harris hawks can produce different chase patterns based on the dynamics of the scenario and the prey's escape patterns. This technique computationally imitates these patterns and behaviors to create an optimization algorithm.

We only briefly mention several other metaheuristic algorithms here, such as the Starling Murmuration Optimizer (SMO) [30], the Simulated Annealing (SA) algorithm [31], Biogeography-Based Optimization (BBO) [32], the Clonal Selection Algorithm (CSA) [33], the Gorilla Troops Optimizer (GTO) [34], the Conscious Neighborhood-based Crow Search Algorithm (CCSA) [35], the Farmland Fertility Algorithm (FFA) [36], the Quantum-based Avian Navigation Optimizer Algorithm [37], the Gravitational Search Algorithm (GSA) [38], the African Vultures Optimization Algorithm (AVOA) [39], the League Championship Algorithm (LCA) [40], Henry Gas Solubility Optimization (HGSO) [41], the Group Counseling Optimizer (GCO) [42], and the Intelligent Water Drops (IWD) algorithm [43].

In this paper, we present a new solution for solving engineering problems by improving the TSA algorithm; we describe the QLGCTSA algorithm in detail in Sect. 3, after introducing the TSA algorithm. Moreover, the Lévy mutation is used to strengthen deeper search capacities.
3 QLGCTSA Algorithm

The TSA algorithm becomes trapped in local optima, leading to premature convergence. For this reason, we have improved the TSA algorithm's performance using mutation operators. These mutation operators are designed based on three distributions: Lévy, Cauchy, and Gaussian. Each of these distributions behaves differently, increasing the QLGCTSA algorithm's performance at a given stage of the optimization process. Moreover, the conducted experiments illustrate that the simultaneous application of the three mutation operators increases the performance of the QLGCTSA algorithm. The algorithm and the way its optimization process is conducted are presented in Fig. 1.

3.1 Lévy Mutation-TSA (QLTSA)

The Lévy flight operator is used to grow the capability of the search agents. It helps all search agents find better positions and increases their ability to search more deeply during the optimization process. Therefore, the QLTSA algorithm has greater ability and efficiency in global search. The Lévy mutation is described in Eq. (9):

Levy(β) ∼ u = t^(−1−β), 0 < β ≤ 2  (9)

In Eq. (9), β is a stability index, and the Lévy flight is calculated as in Eq. (10):

Levy(β) ∼ u × σ / |v|^(1/β)  (10)

In Eq. (10), u and v are drawn from standard normal distributions. σ was obtained using Eq. (11).
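As an illustration, the following is a minimal sketch of how the Lévy step of Eqs. (9) and (10) can be drawn. The scale σ is assumed to follow the standard Mantegna formulation (playing the role of Eq. (11), which is not reproduced in the extracted text), and the function name levy_step and the default β = 1.5 are illustrative choices rather than values prescribed by the paper.

```python
import numpy as np
from math import gamma, pi, sin

def levy_step(dim, beta=1.5, rng=None):
    """Draw a dim-dimensional Levy(beta) step following Eqs. (9)-(10).

    u and v are normal samples; sigma is the Mantegna scale assumed here
    to stand in for Eq. (11).
    """
    rng = rng or np.random.default_rng()
    sigma = (gamma(1 + beta) * sin(pi * beta / 2)
             / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, size=dim)      # u * sigma in Eq. (10)
    v = rng.normal(0.0, 1.0, size=dim)        # standard normal v
    return u / np.abs(v) ** (1 / beta)        # Levy(beta) ~ u * sigma / |v|^(1/beta)
```

A search agent xi could then be perturbed as xi * (1 + levy_step(len(xi))), which corresponds to Eq. (20) given later in this section.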
Fig. 1 Flowchart of the QLGCTSA algorithm: starting from the initial tunicate population, while t ≤ T the jet propulsion and swarm behaviors of the tunicates are determined using Eq. (7) for each of the Pop search agents, and the fitness of the updated search agents is then calculated.
In this subsection, the Cauchy mutation is applied to the CTSA algorithm. The Cauchy mutation is expressed using Eq. (13):

y = 1/2 + (1/π) arctan(γ/g)  (13)

The corresponding density function is given by Eq. (14):

f_Cauchy(0,g)(γ) = (1/π) · g / (g² + γ²)  (14)

In Eq. (14), g is a scale parameter with a value of 1, y is a random value between 0 and 1, and γ = tan(π(y − 1/2)). The Cauchy distribution function is related to the Gaussian one; still, there are differences: the Cauchy distribution is smaller in the vertical direction, while it is horizontally wider than the Gaussian distribution. It is therefore possible to modify the search capability of the search agents, or the neighbors generated in each generation, using the Cauchy mutation. On the other hand, this increases the reliability of the search agents in improving the solutions found on a large scale.

In Eq. (17), (α) is a uniform random value taken from the Gaussian mutation. The Gaussian mutation, like the Lévy flight, is incorporated in the rotational motion phase; however, Eq. (17) is used instead of Eq. (12).

3.4 Quantum Rotation Gate (QRG)

In QLGCTSA, the QRG mechanism [44] is proposed to enhance local search capabilities and increase population diversity. The population data, consisting of several dimensions, are generated as floating-point data by the GTSA algorithm. Therefore, the discrete quantum-bit data must be converted to the continuous data of the algorithm, so these values are placed directly as quantum bits to work in the QRG. A QRG is set up and updated as follows [44]:

U(θi) = [cos θi  −sin θi; sin θi  cos θi]  (18)

[αi′; βi′] = U(θi) [αi; βi] = [cos θi  −sin θi; sin θi  cos θi] [αi; βi]  (19)

where θi represents the rotation angle of the ith quantum bit. The method of setting it in QLGCTSA is shown in Table 1. (αi, βi)^T denotes the quantum-bit state vector before the update, and (αi′, βi′)^T the state vector after it.
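As an illustration, the following is a minimal sketch of the rotation-gate update of Eqs. (18) and (19). The rotation angle θi is assumed to be supplied externally (it is set according to Table 1, which is not reproduced here), and the function name is an illustrative choice.

```python
import numpy as np

def qrg_update(alpha, beta, theta):
    """Apply the rotation gate U(theta_i) of Eq. (18) to every quantum
    bit (alpha_i, beta_i), exactly as written in Eq. (19)."""
    alpha, beta, theta = (np.asarray(a, dtype=float) for a in (alpha, beta, theta))
    new_alpha = np.cos(theta) * alpha - np.sin(theta) * beta
    new_beta = np.sin(theta) * alpha + np.cos(theta) * beta
    return new_alpha, new_beta   # the rotation preserves alpha_i^2 + beta_i^2 = 1

# Example: rotate one quantum bit initialised at (1/sqrt(2), 1/sqrt(2)) by 0.05 rad.
a, b = qrg_update([2 ** -0.5], [2 ** -0.5], [0.05])
```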
Using only one mutation operator rarely balances the exploration and exploitation phases. For this reason, four significant changes were made to increase performance. In the QLGCTSA algorithm, the Lévy flight, Gaussian mutation, and Cauchy mutation operators yield new solutions, and the new solution is obtained by determining the most appropriate of the proposed distributions. The three main equations, Eqs. (20)-(22), used for the mutation operators are:

xnewl = xi × (1 + L(β))  (20)

xnewg = xi × (1 + G(α))  (21)

xnewc = xi × (1 + C(γ))  (22)

Two equations (Eqs. 20 and 21) were used for the QLGTSA algorithm, two equations (Eqs. 20 and 22) for the QLCTSA algorithm, and two equations (Eqs. 21 and 22) for the QGCTSA algorithm. In each of these three algorithms the two equations are applied simultaneously, and the best solution is selected. The construct of the QLGTSA algorithm is illustrated in Fig. 1. It should be kept in mind that the construct used for the QLCTSA and QGCTSA algorithms is the same as that of the QLGTSA algorithm, except that Eqs. (20 and 22) are used in the QLCTSA algorithm and Eqs. (21 and 22) in the QGCTSA algorithm instead of Eqs. (20-21). The Lévy flight makes more extensive mutations in the produced solutions than the Gaussian and Cauchy mutations; it can take significant strides towards new solutions and therefore prevents early convergence. In the QLGCTSA algorithm, all three Eqs. (20)-(22) are used simultaneously, and the best of the three created solutions is selected. The construct used for the QLGCTSA algorithm is the same as that of the QLGTSA algorithm, except that all three Eqs. (20-22) are employed. The QRG is used in all algorithms. The pseudo-code of the QLGCTSA algorithm is shown in Algorithm (3).
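As an illustration, the following is a minimal sketch of this best-of-three selection. It reuses the levy_step sampler sketched in Sect. 3.1; the unit Gaussian scale, β = 1.5, and the greedy selection by fitness value are illustrative assumptions consistent with the description above.

```python
import numpy as np

def best_of_three(x, fitness, levy_step, beta=1.5, rng=None):
    """Create the three mutated candidates of Eqs. (20)-(22) and keep the
    one with the lowest fitness, as in the QLGCTSA strategy."""
    rng = rng or np.random.default_rng()
    d = len(x)
    y = rng.uniform(size=d)
    c = np.tan(np.pi * (y - 0.5))              # Cauchy sample, gamma = tan(pi(y - 1/2))
    g = rng.normal(0.0, 1.0, size=d)           # Gaussian sample (unit scale assumed)
    candidates = [
        x * (1 + levy_step(d, beta)),          # Eq. (20): Levy mutation
        x * (1 + g),                           # Eq. (21): Gaussian mutation
        x * (1 + c),                           # Eq. (22): Cauchy mutation
    ]
    return min(candidates, key=fitness)

# Example with a simple sphere fitness:
# x_new = best_of_three(np.ones(30), lambda z: float(np.sum(z ** 2)), levy_step)
```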
3.6 Time Complexity

In this subsection, the time complexity of the QLGCTSA algorithm is investigated. The computational complexity of initial population generation in the TSA algorithm is O(n × d), where n is the population size and d represents the dimension of the problem. The cost-calculation operation over all solutions is O(Maxiteration × n × d), where Maxiteration is the total number of iterations of the optimization operation. The TSA algorithm takes O(n) time to improve the solutions. The total computational complexity of the TSA algorithm is therefore O(Maxiteration × n × d × N).

4 Experiments and Results

The QLGCTSA algorithm and the other algorithms presented in this paper were run and tested using Matlab 9.9 on a Windows 7 Enterprise 32-bit machine with an Intel® Core™ i7-4510U processor and 16.00 GB of RAM. All the experiments and the review of the algorithms' performance were conducted using 30 populations and 500 iterations, and the reported results are the mean of 30 stored executions. However, in solving the CEC 2017 competition problems, all experiments were performed using 30 populations and 1000 iterations. Moreover, Wilcoxon's nonparametric rank-sum test [45] with a 0.05 significance level was conducted to determine whether the QLGCTSA algorithm produces a statistically significant improvement. It should be noted that the symbols "+", "=", and "–" indicate whether the QLGCTSA algorithm had better, equal, or worse performance than the compared algorithms. In addition, the Friedman test [46] was conducted to assess the mean performance of all competitors, and the Average Rank Value (ARV) is reported to indicate these test results.

4.1 Experiments on Test Functions

To investigate the QLGCTSA algorithm, a diverse set of criterion functions was used [47, 48]. This set of criterion functions includes three groups: UM, MM, and composition functions. The UM criterion functions (F1-F7) have a single global optimum and reveal the exploitation capacity of each optimization algorithm, while the MM criterion functions (F8-F23) better expose the exploration capabilities of the optimization algorithms. The mathematical formulas and features of the UM and MM criterion functions are illustrated in Tables 2, 3, and 4.

Simulation results and findings from improving the TSA algorithm's performance with the Gaussian, Cauchy, and Lévy-flight mutations on the fixed-dimension multimodal benchmark functions are reported through the Min, Max, Mean, and STD criteria. Table 5 evaluates the TSA algorithm and its various improvements. Benchmark functions one to thirteen with dimension 30 have been investigated. It should be noted that the QRG was used in all experiments except for the TSA algorithm, which was tested without any improvement. In this section, the effect of the mutation operators on the QLGCTSA algorithm is therefore investigated.

According to Table 5, it is clear that the QLGCTSA algorithm has the highest performance compared with the other algorithms; it is followed, in order, by the QLGTSA, QGCTSA, QLCTSA, QLTSA, QGTSA, and QCTSA algorithms. The results in Table 5 demonstrate that the TSA algorithm substantially increases its performance using the mutation operators.

According to Fig. 2, the QLGCTSA algorithm achieves the best convergence performance. The QLGTSA, QGCTSA, and QLCTSA algorithms have also shown acceptable performance. Examining Fig. 2 shows that all algorithms integrated with mutation operators have performed well.

Table 6 shows the results of Wilcoxon's nonparametric rank-sum test. According to these results, it can easily be seen that the QLGCTSA algorithm has a statistically sound ability to discover quality solutions, because in most cases its solutions are statistically significantly different from those of the other algorithms. Only functions 18 and 20 show statistically weak results compared with the other algorithms.

4.1.1 The Scalability Test for QLGCTSA

An experiment was conducted using large-dimension benchmark functions to examine the QLGCTSA algorithm and its scalability (Table 7). The dimensions used to perform the scalability test were 100, 500, and 1000. Table 7 reports the scalability test findings for the TSA and QLGCTSA algorithms. It is clear that the QLGCTSA algorithm still finds better solutions even as the dimension of the problems increases; using the proposed mutation operators, the TSA algorithm's performance increases significantly, yielding the desired performance on high-dimensional problems. The CEC 2017 competition benchmark functions (shown in Table 8) were applied as the third group, covering hybrid rotated, composite, and shifted MM cases.

The QLGCTSA algorithm's results and performance are compared with the TSA [20], SCA [28], SHO [29], HHO [49], MVO [50], and AVOA [39] optimization algorithms. The parameter settings used for these algorithms are listed in Table 9. This comparison is based on the STD and AVG criteria, as shown in Table 10, which compares the QLGCTSA algorithm's simulation results with those of the other algorithms using the STD and AVG parameters on the CEC2017 functions.
Tables 2/3 (continued): unimodal and multimodal benchmark functions
No | Function | Type | Fmin | Dimensions | Range
F6 | f(x) = Σ_{i=1}^{d} (⌊x_i + 0.5⌋)² | US | 0 | 30, 100, 500, 1000 | [−100, 100]^d
F7 | f(x) = Σ_{i=1}^{d} i·x_i⁴ + random[0, 1) | US | 0 | 30, 100, 500, 1000 | [−1.28, 1.28]^d
F9 | f(x) = 10d + Σ_{i=1}^{d} [x_i² − 10cos(2πx_i)] | MS | 0 | 30, 100, 500, 1000 | [−5.12, 5.12]^d
F10 | f(x) = −20exp(−0.2√((1/d)Σ_{i=1}^{d} x_i²)) − exp((1/d)Σ_{i=1}^{d} cos(2πx_i)) + 20 + e | MN | 0 | 30, 100, 500, 1000 | [−32, 32]^d
F11 | f(x) = (1/4000)Σ_{i=1}^{d} x_i² − Π_{i=1}^{d} cos(x_i/√i) + 1 | MN | 0 | 30, 100, 500, 1000 | [−600, 600]^d
F12 | f(x) = (π/d){10sin²(πy_1) + Σ_{i=1}^{d−1} (y_i − 1)²[1 + 10sin²(πy_{i+1})] + (y_d − 1)²} + Σ_{i=1}^{d} U(x_i, 10, 100, 4), with y_i = 1 + (x_i + 1)/4 and U(x_i, a, k, m) = k(x_i − a)^m if x_i > a; 0 if −a < x_i < a; k(−x_i − a)^m if x_i < −a | MN | 0 | 30, 100, 500, 1000 | [−50, 50]^d
F13 | f(x) = 0.1{sin²(3πx_1) + Σ_{i=1}^{d} (x_i − 1)²[1 + sin²(3πx_i + 1)] + (x_d − 1)²[1 + sin²(2πx_d)]} + Σ_{i=1}^{d} U(x_i, 5, 100, 4) | MN | 0 | 30, 100, 500, 1000 | [−50, 50]^d
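As an illustration, the following is a minimal sketch of two of the functions defined above (F9 and F10), as they might be coded to reproduce the Min/Max/Mean/STD statistics reported in Table 5; the vectorized NumPy form is an implementation choice, not taken from the paper.

```python
import numpy as np

def f9(x):
    """F9: 10d + sum(x_i^2 - 10 cos(2*pi*x_i)); global minimum 0 at x = 0."""
    x = np.asarray(x, dtype=float)
    return 10 * x.size + float(np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x)))

def f10(x):
    """F10: Ackley-type function from the table above; global minimum 0 at x = 0."""
    x = np.asarray(x, dtype=float)
    d = x.size
    return float(-20 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / d))
                 - np.exp(np.sum(np.cos(2 * np.pi * x)) / d) + 20 + np.e)
```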
The results indicate that the QLGCTSA algorithm outperforms the algorithms it is compared with. The Table 10 data evaluate the QLGCTSA algorithm against the other optimizing algorithms and suggest a better performance of the QLGCTSA algorithm than the other optimization algorithms, including TSA, SCA, SHO, HHO, and MVO. It becomes clear that the QLGCTSA algorithm has outperformed the other optimization algorithms and has an excellent ability to solve several benchmark functions. Figure 3 illustrates the convergence diagrams over the iterations on the CEC2017 benchmark functions for the different optimization algorithms and demonstrates that the QLGCTSA algorithm is better than the others. The QLGCTSA algorithm outperformed the other algorithms in convergence and showed better desirability. The MVO and SHO algorithms performed worst, finding the poorest solutions among the compared optimization algorithms. The HHO algorithm outperformed the other optimization algorithms but under-performed the QLGCTSA algorithm on most benchmark functions. Finally, one can see that the QLGCTSA algorithm outperformed the TSA algorithm while having a remarkable ability to escape local optima. As shown in the diagrams of Fig. 3, it becomes clear that the QLGCTSA algorithm uses the proposed mutation operators to substantially improve the solving of optimization problems.

Table 11 shows the results of the Wilcoxon rank-sum test with 5% significance. According to these results, it can be concluded that the QLGCTSA algorithm, in comparison with the other algorithms on the CEC2017 benchmark functions, has good capability and has achieved significant performance in most cases. This shows that the solutions obtained in testing the CEC2017 benchmark functions are of high quality.
Table 4 Describes fixed-dimension multimodal benchmark functions
No | Function | Type | Fmin | Dimensions | Range
F14 | f(x) = (1/500 + Σ_{j=1}^{25} 1/(j + Σ_{i=1}^{2} (x_i − a_{ij})⁶))^(−1) | FM | ≈ 1 | 2 | [−65, 65]^d
F15 | f(x) = Σ_{i=1}^{11} [a_i − x_1(b_i² + b_i x_2)/(b_i² + b_i x_3 + x_4)]² | FM | 0.00030 | 4 | [−5, 5]^d
F16 | f(x) = 4x_1² − 2.1x_1⁴ + (1/3)x_1⁶ + x_1x_2 − 4x_2² + 4x_2⁴ | FM | −1.0316 | 2 | [−5, 5]^d
F17 | f(x) = (x_2 − (5.1/4π²)x_1² + (5/π)x_1 − 6)² + 10(1 − 1/(8π))cos x_1 + 10 | FM | 0.398 | 2 | [−5, 5]^d
F18 | f(x) = [1 + (x_1 + x_2 + 1)²(19 − 14x_1 + 3x_1² − 14x_2 + 6x_1x_2 + 3x_2²)] × [30 + (2x_1 − 3x_2)²(18 − 32x_1 + 12x_1² + 48x_2 − 36x_1x_2 + 27x_2²)] | FM | 3 | 2 | [−2, 2]^d
F19 | f(x) = −Σ_{i=1}^{4} a_i exp(−Σ_{j=1}^{3} b_{ij}(x_j − p_{ij})²) | FM | −3.86 | 3 | [1, 3]^d
F20 | f(x) = −Σ_{i=1}^{4} a_i exp(−Σ_{j=1}^{6} b_{ij}(x_j − p_{ij})²) | FM | −3.32 | 6 | [0, 1]^d
F21 | f(x) = −Σ_{i=1}^{5} [(X − a_i)(X − a_i)^T + c_i]^(−1) | FM | −10.1532 | 4 | [0, 10]^d
F22 | f(x) = −Σ_{i=1}^{7} [(X − a_i)(X − a_i)^T + c_i]^(−1) | FM | −10.4028 | 4 | [0, 10]^d
F23 | f(x) = −Σ_{i=1}^{10} [(X − a_i)(X − a_i)^T + c_i]^(−1) | FM | −10.5363 | 4 | [0, 10]^d
Table 5 Examining the effect of Gaussian mutation, Cauchy mutation, and Lévy flights on the QLGCTSA algorithm's performance
Function Metric TSA QLGCTSA QLCTSA QLGTSA QLTSA QGCTSA QGTSA QCTSA
F13 Min 7.92E-01 3.93E-09 3.07E-07 3.31E-07 8.28E-06 3.61E-07 5.31E-07 3.51E-05
Max 7.09E + 00 1.40E-07 1.85E-04 1.33E-05 2.83E-02 1.22E-05 1.14E-02 2.60E-02
Mean 3.17E + 00 6.22E-08 2.57E-05 2.79E-06 2.43E-03 3.49E-06 8.36E-04 4.31E-03
STD 1.49E + 00 3.79E-08 4.63E-05 3.26E-06 7.23E-03 3.26E-06 2.91E-03 7.88E-03
F14 Min 9.98E-01 9.98E-01 9.98E-01 9.98E-01 9.98E-01 9.98E-01 9.98E-01 9.98E-01
Max 1.64E + 01 9.98E-01 2.98E + 00 2.98E + 00 1.99E + 00 3.97E + 00 2.98E + 00 5.93E + 00
Mean 8.39E + 00 9.98E-01 1.33E + 00 1.20E + 00 1.06E + 00 1.59E + 00 1.26E + 00 1.92E + 00
STD 5.79E + 00 2.59E-16 7.18E-01 5.56E-01 2.57E-01 1.11E + 00 6.98E-01 1.47E + 00
F15 Min 3.08E-04 3.07E-04 3.08E-04 3.08E-04 3.08E-04 3.07E-04 3.08E-04 3.09E-04
Max 5.66E-02 1.22E-03 2.04E-02 4.74E-04 2.04E-02 1.22E-03 2.04E-02 2.05E-02
Mean 8.24E-03 3.72E-04 3.23E-03 3.45E-04 1.94E-03 4.32E-04 1.86E-03 3.19E-03
STD 1.57E-02 2.36E-04 6.97E-03 5.25E-05 5.11E-03 2.45E-04 5.13E-03 7.02E-03
F16 Min – 1.03E + 00 – 1.03E + 00 – 1.03E + 00 – 1.03E + 00 – 1.03E + 00 – 1.03E + 00 – 1.03E + 00 – 1.03E + 00
Max – 1.03E + 00 – 1.03E + 00 – 1.03E + 00 – 1.03E + 00 – 1.03E + 00 – 1.03E + 00 – 1.03E + 00 – 1.03E + 00
Mean – 1.03E + 00 – 1.03E + 00 – 1.03E + 00 – 1.03E + 00 – 1.03E + 00 – 1.03E + 00 – 1.03E + 00 – 1.03E + 00
STD 3.29E-07 2.85E-16 5.57E-16 3.20E-16 5.73E-14 3.66E-16 1.78E-15 1.97E-10
F17 Min 3.98E-01 3.98E-01 3.98E-01 3.98E-01 3.98E-01 3.98E-01 3.98E-01 3.98E-01
Max 3.98E-01 3.98E-01 3.98E-01 3.98E-01 3.98E-01 3.98E-01 3.98E-01 3.98E-01
Mean 3.98E-01 3.98E-01 3.98E-01 3.98E-01 3.98E-01 3.98E-01 3.98E-01 3.98E-01
STD 3.34E-05 0.00E + 00 4.13E-14 1.30E-13 1.42E-11 6.25E-16 5.74E-12 8.00E-11
F18 Min 3.00E + 00 3.00E + 00 3.00E + 00 3.00E + 00 3.00E + 00 3.00E + 00 3.00E + 00 3.00E + 00
Max 8.40E + 01 3.00E + 00 3.00E + 00 3.00E + 00 3.00E + 00 3.00E + 00 3.00E + 00 3.00E + 00
Mean 2.28E + 01 3.00E + 00 3.00E + 00 3.00E + 00 3.00E + 00 3.00E + 00 3.00E + 00 3.00E + 00
STD 3.30E + 01 1.22E-06 2.16E-07 2.60E-08 2.09E-06 5.47E-08 2.11E-07 4.45E-05
F19 Min – 3.86E + 00 – 3.86E + 00 – 3.86E + 00 – 3.86E + 00 – 3.86E + 00 – 3.86E + 00 – 3.86E + 00 – 3.86E + 00
Max – 3.85E + 00 – 3.86E + 00 – 3.86E + 00 – 3.86E + 00 – 3.86E + 00 – 3.86E + 00 – 3.86E + 00 – 3.85E + 00
Mean – 3.86E + 00 – 3.86E + 00 – 3.86E + 00 – 3.86E + 00 – 3.86E + 00 – 3.86E + 00 – 3.86E + 00 – 3.86E + 00
STD 2.01E-03 6.83E-14 1.39E-06 1.36E-08 7.46E-05 3.20E-08 4.64E-06 2.36E-03
F20 Min – 3.32E + 00 – 3.32E + 00 – 3.32E + 00 – 3.32E + 00 – 3.32E + 00 – 3.32E + 00 – 3.32E + 00 – 3.32E + 00
Max – 3.14E + 00 – 3.20E + 00 – 3.20E + 00 – 3.20E + 00 – 3.19E + 00 – 3.20E + 00 – 3.20E + 00 – 3.20E + 00
Mean – 3.26E + 00 – 3.26E + 00 – 3.28E + 00 – 3.27E + 00 – 3.30E + 00 – 3.28E + 00 – 3.29E + 00 – 3.30E + 00
STD 7.72E– 02 6.14E– 02 5.81E– 02 6.14E– 02 5.07E– 02 5.80E– 02 5.47E– 02 4.94E– 02
F21 Min – 1.01E + 01 – 1.02E + 01 – 1.02E + 01 – 1.02E + 01 – 1.02E + 01 – 1.02E + 01 – 1.02E + 01 – 1.02E + 01
Max – 2.60E + 00 – 1.02E + 01 – 1.02E + 01 – 1.02E + 01 – 1.02E + 01 – 1.02E + 01 – 1.02E + 01 – 1.02E + 01
Mean – 7.14E + 00 – 1.02E + 01 – 1.02E + 01 – 1.02E + 01 – 1.02E + 01 – 1.02E + 01 – 1.02E + 01 – 1.02E + 01
STD 3.18E + 00 1.36E-12 3.73E-10 2.37E-11 5.32E-08 3.60E-11 2.22E-09 8.06E-07
F22 Min – 1.03E + 01 – 1.04E + 01 – 1.04E + 01 – 1.04E + 01 – 1.04E + 01 – 1.04E + 01 – 1.04E + 01 – 1.04E + 01
Max – 1.84E + 00 – 1.04E + 01 – 1.04E + 01 – 1.04E + 01 – 1.04E + 01 – 1.04E + 01 – 1.04E + 01 – 1.04E + 01
Mean – 6.46E + 00 – 1.04E + 01 – 1.04E + 01 – 1.04E + 01 – 1.04E + 01 – 1.04E + 01 – 1.04E + 01 – 1.04E + 01
STD 3.72E + 00 1.23E– 12 2.65E– 10 3.30E– 11 1.06E– 07 3.04E– 11 3.63E– 09 4.76E– 07
F23 Min – 1.05E + 01 – 1.05E + 01 – 1.05E + 01 – 1.05E + 01 – 1.05E + 01 – 1.05E + 01 – 1.05E + 01 – 1.05E + 01
Max – 1.85E + 00 – 1.05E + 01 – 1.05E + 01 – 1.05E + 01 – 1.05E + 01 – 1.05E + 01 – 1.05E + 01 – 1.05E + 01
Mean – 6.91E + 00 – 1.05E + 01 – 1.05E + 01 – 1.05E + 01 – 1.05E + 01 – 1.05E + 01 – 1.05E + 01 – 1.05E + 01
STD 3.88E + 00 6.51E– 13 6.66E– 10 8.22E– 11 1.86E– 07 9.04E– 11 2.63E– 09 4.81E– 07
Overall Rank 8 1 4 2 5 3 6 7
Rank + / = /– 22/1/0 ~ 7/14/1 3/19/1 8/14/1 6/17/0 10/12/1 15/8/0
ARV 6.7941 3.6724 3.9825 3.9587 4.3715 3.8607 4.6948 4.6231
Fig. 2 Convergence diagram about Gaussian, Cauchy, and Lévy flight effects on the QLGCTSA algorithm
On the other hand, because of the high complexity of the CEC2017 benchmark functions, an optimization algorithm must perform well in both the exploration and exploitation phases and strike a good balance between them in order to solve these problems, so that it can easily escape local optima and maintain good diversity in all optimization stages. Based on the explanations given and the observed performance, it can be concluded that the QLGCTSA algorithm has not only performed well on the various standard functions but has also been able to achieve quality solutions and maintain its optimization capabilities as the dimension of the problem increases. Furthermore, according to the convergence diagrams, it shows good convergence ability and, in all stages of optimization, can escape local optima and repeatedly find better solutions. Finally, according to all the experiments performed, it can be concluded that the QLGCTSA algorithm can be considered a good and effective option for solving various problems.

4.2 Engineering Optimization Problem

The QLGCTSA algorithm's performance in solving engineering problems is evaluated in this subsection. These problems were solved using P-metaheuristics, one of the hot research fields [51]. A total of six engineering problems are used for this evaluation.

4.2.1 Speed Reducer Design Engineering Problem

This problem aims to minimize the total weight of the speed reducer, whose constraints are surface stress, transverse deflection, and bending stress. It has various components.
Table 6 P-values of the Wilcoxon rank-sum test with 5% significance for F1–F23 problems
No | QLGCTSA vs TSA | QLGCTSA vs QLGTSA | QLGCTSA vs QLCTSA | QLGCTSA vs QGCTSA | QLGCTSA vs QLTSA | QLGCTSA vs QGTSA | QLGCTSA vs QCTSA
(each comparison reports the P-value followed by the R flag)
F1 1.2118E-12 + 1.2118E-12 + 1.2118E-12 + 1.2118E-12 + 1.2118E-12 + 1.2118E-12 + 1.2118E-12 +
F2 3.0199E-11 + 3.0199E-11 + 3.0199E-11 + 3.0199E-11 + 1.2118E-12 + 3.0199E-11 + 3.0199E-11 +
F3 1.2118E-12 + 1.2118E-12 + 1.2118E-12 + 1.2118E-12 + 1.2118E-12 + 1.2118E-12 + 1.2118E-12 +
F4 3.0199E-11 + 3.0199E-11 + 3.0199E-11 + 3.0199E-11 + 3.0199E-11 + 3.0199E-11 + 3.0199E-11 +
F5 3.0199E-11 + 3.0199E-11 + 3.0199E-11 + 3.0199E-11 + 3.0199E-11 + 3.0199E-11 + 3.0199E-11 +
F6 3.0199E-11 + 3.0199E-11 + 3.0199E-11 + 3.0199E-11 + 3.0199E-11 + 3.0199E-11 + 3.0199E-11 +
F7 3.0199E-11 + 1.3017E-02 + 1.0467E-03 + 2.1590E-03 + 4.6054E-03 + 1.1564E-02 + 4.4411E-03 +
F8 3.0199E-11 + 3.9512E-01 = 6.4789E-01 = 4.2039E-01 = 4.4425E-01 = 5.5412E-02 = 2.6145E-01 =
F9 1.2118E-12 + NaN = NaN = NaN = NaN = NaN = NaN =
F10 1.2118E-12 + NaN = NaN = NaN = NaN = NaN = NaN =
F11 1.2118E-12 + NaN = NaN = NaN = NaN = NaN = NaN =
F12 3.0199E-11 + 3.0199E-11 + 3.0199E-11 + 3.0199E-11 + 3.0199E-11 + 3.0199E-11 + 3.0199E-11 +
F13 3.0199E-11 + 3.0199E-11 + 3.0199E-11 + 3.0199E-11 + 3.0199E-11 + 3.0199E-11 + 3.0199E-11 +
F14 1.2108E-12 + 1.2108E-12 + 1.2108E-12 + 1.2108E-12 + 1.2108E-12 + 1.2108E-12 + 1.2108E-12 +
F15 8.7584E-09 + 7.9022E-01 = 1.4025E-08 + 2.4038E-01 = 4.9074E-09 + 4.7984E-09 + 4.8947E-08 +
F16 1.7203E-12 + 1.7203E-12 + 1.7203E-12 + 1.7203E-12 + 1.7203E-12 + 1.7203E-12 + 1.7203E-12 +
F17 1.2118E-12 + 1.2118E-12 + 1.2118E-12 + 1.2118E-12 + 1.2118E-12 + 1.2118E-12 + 1.2118E-12 +
F18 1.2118E-12 + 8.1340E-05 – 2.2531E-02 – 1.8655E-03 – 7.0892E-01 = 2.5103E-02 – 1.5846E-01 =
F19 1.2118E-12 + 1.2118E-12 + 1.2118E-12 + 1.2118E-12 + 1.2118E-12 + 1.2118E-12 + 1.2118E-12 +
F20 0.097091067596477 = 6.4820E-01 = 2.4549E-01 = 7.0892E-01 = 5.0691E-01 = 3.1951E-01 = 5.0691E-01 =
F21 1.3369E-11 + 1.3369E-11 + 1.3369E-11 + 1.3369E-11 + 1.3369E-11 + 1.3369E-11 + 1.3369E-11 +
F22 5.1436E-12 + 5.1436E-12 + 5.1436E-12 + 5.1436E-12 + 5.1436E-12 + 5.1436E-12 + 5.1436E-12 +
F23 1.4488E-11 + 1.4488E-11 + 1.4488E-11 + 1.4488E-11 + 1.4488E-11 + 1.4488E-11 + 1.4488E-11 +
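As an illustration, the following is a minimal sketch of how pairwise p-values such as those in Table 6 can be computed with SciPy's rank-sum test at the 0.05 level; the 30-run input arrays, the mean-based direction check, and the function name are illustrative assumptions rather than the paper's exact procedure.

```python
import numpy as np
from scipy.stats import ranksums

def compare(qlgctsa_runs, rival_runs, alpha=0.05):
    """Wilcoxon rank-sum test between two arrays of final best fitness values
    (e.g., 30 independent runs each). Returns the p-value and a '+'/'='/'-'
    flag following the convention used in the tables."""
    _, p = ranksums(qlgctsa_runs, rival_runs)
    if p >= alpha:
        return p, "="                                       # not significantly different
    better = np.mean(qlgctsa_runs) < np.mean(rival_runs)    # minimization assumed
    return p, "+" if better else "-"
```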
g3(u) = 1.93u4³ / (u2u6⁴u3) − 1 ≤ 0

where 2.6 ≤ u1 ≤ 3.6, 0.7 ≤ u2 ≤ 0.8, 17 ≤ u3 ≤ 28, 7.3 ≤ u4 ≤ 8.3, 7.3 ≤ u5 ≤ 8.3, 2.9 ≤ u6 ≤ 3.9, and 5.0 ≤ u7 ≤ 5.5; u = (u1, u2, u3, u4, u5, u6, u7).

Table 12 summarizes the solutions to this problem obtained with several optimization algorithms. The results are shown with the parameter settings used to compare the QLGCTSA algorithm's performance.
Table 7 Investigating the effect of Cauchy mutation, Gaussian mutation, and Lévy flight on the QLGCTSA algorithm
F1 Avg Std F2 Avg Std
QLGCTSA TSA QLGCTSA TSA QLGCTSA TSA QLGCTSA TSA
100 0.00E + 00 2.56E-10 0.00E + 00 3.11E-10 100 8.17E-217 2.52E-07 0.00E + 00 1.55E-07
500 0.00E + 00 2.10E-03 0.00E + 00 1.37E-02 500 4.73E-213 1.02E-02 0.00E + 00 3.73E-03
1000 0.00E + 00 4.91E + 00 0.00E + 00 2.41E + 00 1000 0.00E + 00 3.30E-02 0.00E + 00 2.71E-02
F3 Avg Std F4 Avg Std
QLGCTSA TSA QLGCTSA TSA QLGCTSA TSA QLGCTSA TSA
100 0.00E + 00 1.09E + 04 0.00E + 00 8.27E + 03 100 7.05E-222 6.23E + 01 0.00E + 00 1.36E + 01
500 0.00E + 00 1.28E + 06 0.00E + 00 1.57E + 05 500 2.60E-217 9.93E + 01 0.00E + 00 1.72E-01
1000 0.00E + 00 6.11E + 06 0.00E + 00 8.13E + 05 1000 1.01E-216 9.95E + 01 0.00E + 00 1.65E-01
F5 Avg Std F6 Avg Std
QLGCTSA TSA QLGCTSA TSA QLGCTSA TSA QLGCTSA TSA
100 3.80E-04 9.85E + 01 2.56E-04 2.03E-01 100 2.22E-05 1.44E + 01 1.09E-05 1.10E + 00
500 5.24E-03 9.83E + 04 4.81E-03 9.27E + 04 500 7.59E-04 1.02E + 03 1.26E-03 1.97E + 00
1000 9.77E-03 4.51E + 07 6.93E-03 2.74E + 07 1000 5.44E-02 2.35E + 03 1.61E-01 6.32E + 00
F7 Avg Std F8 Avg Std
QLGCTSA TSA QLGCTSA TSA QLGCTSA TSA QLGCTSA TSA
100 6.35E-05 5.04E-02 5.03E-05 1.00E-02 100 – 3.97E + 04 – 1.32E + 04 2.39E + 03 1.23E + 03
500 1.07E-04 3.37E + 00 9.64E-05 1.32E + 00 500 – 1.96E + 05 – 3.14E + 04 1.45E + 04 2.69E + 03
1000 4.49E-05 3.87E + 02 3.41E-05 2.40E + 02 1000 – 3.95E + 05 – 4.39E + 04 3.58E + 04 3.68E + 03
F9 Avg Std F10 Avg Std
QLGCTSA TSA QLGCTSA TSA QLGCTSA TSA QLGCTSA TSA
100 0.00E + 00 9.09E + 02 0.00E + 00 1.47E + 02 100 8.89E-16 2.09E-01 0.00E + 00 8.10E-01
500 0.00E + 00 6.00E + 03 0.00E + 00 4.78E + 02 500 8.89E-16 1.26E-02 0.00E + 00 4.44E-03
1000 0.00E + 00 9.12E + 03 0.00E + 00 1.70E + 03 1000 8.89E-16 9.50E-02 0.00E + 00 4.02E-03
F11 Avg Std F12 Avg Std
QLGCTSA TSA QLGCTSA TSA QLGCTSA TSA QLGCTSA TSA
100 0.00E + 00 5.78E-04 0.00E + 00 1.21E-02 100 7.42E-08 1.39E + 01 3.34E-08 7.67E + 00
500 0.00E + 00 3.91E-03 0.00E + 00 7.71E-02 500 1.91E-07 3.52E + 06 1.08E-07 5.73E + 06
1000 0.00E + 00 3.59E-01 0.00E + 00 2.34E-01 1000 1.86E-07 6.22E + 08 1.89E-07 2.36E + 08
F13 Avg Std Overall rank Rank
QLGCTSA TSA QLGCTSA TSA QLGCTSA TSA QLGCTSA TSA
The experimental results show the superiority of the QLGCTSA algorithm over the other optimization algorithms.

Table 9 Parameter settings of optimization algorithms
Algorithm | Parameter | Value
SHO | Control parameter (h) | [5, 0]
SHO | Constant M | [0.5, 1]
HHO | Energy of a rabbit | E ∈ [0, 2]
TSA | Parameter Pmin | 1
TSA | Parameter Pmax | 4
SCA | Flight length | 0.01
SCA | Awareness probability | 0.1
MVO | Wmax (max WEP) | 1
MVO | Wmin (min WEP) | 0.2
MVO | p (exploitation accuracy) | 6
AVOA | L1 | 0.8
AVOA | L2 | 0.2
AVOA | w | 2.5
AVOA | P1 | 0.6
AVOA | P2 | 0.4
AVOA | P3 | 0.6

4.2.2 Cantilever Beam Design Engineering Problem

This problem is related to optimizing the weight of a cantilever beam with a square cross-section [52]. The analytical expression is as follows:

Minimize f2(u) = 0.0624(u1 + u2 + u3 + u4 + u5)

Subject to

g1(u) = 61/u1³ + 37/u2³ + 19/u3³ + 7/u4³ + 1/u5³ − 1 ≤ 0

where 0.01 ≤ uj ≤ 100 (j = 1, 2, 3, 4, 5); u = (u1, u2, u3, u4, u5).

The QLGCTSA algorithm solves this problem and obtains the optimum output, as shown in Table 13. The experimental result of the QLGCTSA algorithm is compared with the SCA [28], CS [52], m-SCA [64], and GCA [65] optimization algorithms. The optimum cost is 1.3401, and the optimum values of the widths (or heights) of the different beams are 6.01604, 5.30915, 4.4943, 3.5015, and 2.1527. As a result, the execution of the QLGCTSA algorithm is efficient in contrast to the other optimization algorithms.

4.2.3 Tension/Compression Spring Design Problem

The objective of this problem is to minimize the weight of a tension/compression spring. Its optimum design must satisfy constraints on deflection, shear stress, and surge frequency. The wire diameter (u1), mean coil diameter (u2), and the number of active coils (u3) are the three design variables.
Fig. 3 Convergence diagram on the evaluation of the QLGCTSA algorithm and other optimizing algorithms in solving CEC2017 problems
The analytical expression is as follows:

Minimize f3(u) = (u3 + 2)u2u1²

Subject to

g1(u) = 1 − u2³u3 / (71785u1⁴) ≤ 0

g2(u) = (4u2² − u1u2) / (12566(u2u1³ − u1⁴)) + 1 / (5108u1²) − 1 ≤ 0

g3(u) = 1 − 140.45u1 / (u2²u3) ≤ 0

where 0.05 ≤ u1 ≤ 2.0, 0.25 ≤ u2 ≤ 1.3, 2 ≤ u3 ≤ 15; u = (u1, u2, u3).

This engineering optimization problem has been solved with mathematical techniques as well as metaheuristics such as WOA [53], GSA [66], HS [67], Improved HS [67], GA [68], ES [55], DE [69], and PSO [54]. The QLGCTSA algorithm's optimized results are compared with the above algorithms in Table 14 with respect to the design variables and the optimum compression spring weight. The setting of the standard control parameters in the QLGCTSA algorithm is the same as in [53]. The experimental results in Table 14 show that the QLGCTSA algorithm performs better than all the other optimization algorithms.

4.2.4 Three-Bar Truss Design Problem

The objective of this problem is to determine the optimum cross-sectional areas. It involves only two decision parameters, which can be adjusted to obtain the minimum weight subject to stress, buckling, and deflection constraints. The analytical expression is as follows:

Minimize f4(u) = (2√2 u1 + u2) × l

Subject to

g1(u) = ((√2 u1 + u2) / (√2 u1² + 2u1u2)) P − σ ≤ 0
optimum compression spring weight.
13
An Improved Tunicate Swarm Algorithm with Best‑random Mutation Strategy for Global Optimization… 1197
F24 3.02E-11 + 3.02E-11 + 3.02E-11 + 4.50E-11 + 3.02E-11 + 3.81E-02 +
F25 3.02E-11 + 3.02E-11 + 3.02E-11 + 3.02E-11 + 3.02E-11 + 3.39E-06 +
F26 5.65E-04 + 1.29E-05 + 3.02E-11 + 8.03E-01 - 3.02E-11 + 2.23E-02 +
F27 4.81E-05 + 9.66E-05 + 3.02E-11 + 1.89E-04 + 3.02E-11 + 9.01E-01 =
F28 6.15E-06 + 2.80E-05 + 3.02E-11 + 1.60E-04 + 1.89E-04 + 4.12E-02 +
F29 3.36E-05 + 3.39E-06 + 3.02E-11 + 5.74E-05 + 3.02E-11 + 3.40E-02 +
F30 1.87E-03 + 3.08E-04 = 3.02E-11 + 8.13E-05 + 3.02E-11 + 7.40E-02 =
F31 1.44E-02 + 9.67E-01 = 3.02E-11 + 8.13E-05 + 3.02E-11 + 5.61E-02 =
F32 1.99E-01 = 4.81E-05 + 3.02E-11 + 2.29E-01 = 3.02E-11 + 1.35E-02 +
F33 1.33E-05 + 4.14E-06 + 3.02E-11 + 2.23E-04 + 3.02E-11 + 8.95E-01 =
F34 9.07E-06 + 3.39E-06 + 3.02E-11 + 3.36E-05 + 3.02E-11 + 1.87E-03 +
F35 7.80E-04 + 4.02E-05 + 3.02E-11 + 9.34E-01 = 5.76E-04 + 3.34E-01 +
F36 2.62E-04 + 1.28E-02 + 3.02E-11 + 1.15E-01 = 3.02E-11 + 1.28E-02 +
F37 2.82E-03 + 3.39E-06 + 3.02E-11 + 2.23E-04 + 3.02E-11 + 2.15E-03 +
F38 7.94E-03 + 2.81E-01 = 3.02E-11 + 4.21E-02 + 3.02E-11 + 2.29E-01 =
F39 2.15E-03 + 3.44E-02 + 3.02E-11 + 1.81E-02 + 3.02E-11 + 1.47E-01 =
F40 6.78E-01 = 1.89E-04 + 3.02E-11 + 5.64E-02 = 3.02E-11 + 1.90E-02 +
F41 1.81E-02 + 2.46E-03 + 3.02E-11 + 7.45E-02 + 3.02E-11 + 1.14E-02 +
F42 1.40E-03 + 6.80E-02 = 3.02E-11 + 2.25E-02 + 3.02E-11 + 4.81E-02 +
F43 3.39E-06 + 3.39E-06 + 3.02E-11 + 3.39E-06 + 3.02E-11 + 3.39E-06 +
F44 9.07E-06 + 3.39E-06 + 3.02E-11 + 5.64E-02 = 3.02E-11 + 4.48E-02 +
F45 2.33E-05 + 1.87E-03 = 3.02E-11 + 3.23E-03 + 1.40E-03 + 3.81E-02 +
F46 1.47E-01 = 1.99E-01 = 3.02E-11 + 1.87E-03 + 3.02E-11 + 8.90E-03 +
F47 9.07E-06 + 5.76E-04 = 3.02E-11 + 7.72E-01 = 3.02E-11 + 3.20E-01 =
F48 2.79E-02 + 1.58E-01 = 3.02E-11 + 6.20E-02 = 3.02E-11 + 4.46E-01 =
F49 1.33E-05 + 3.10E-02 + 3.02E-11 + 1.89E-04 + 3.02E-11 + 2.29E-02 =
F50 3.39E-06 + 3.39E-06 + 3.02E-11 + 4.81E-05 + 3.02E-11 + 3.10E-02 +
F51 1.61E-02 + 5.07E-01 = 3.02E-11 + 3.20E-01 = 3.02E-11 + 7.40E-01 =
F52 2.02E-02 + 2.62E-04 + 3.02E-11 + 9.71E-02 = 3.02E-11 + 7.94E-03 +
Table 12 Solving the speed reducer problem with QLGCTSA and other optimization algorithms
Algorithms u1 u2 u3 u4 u5 u6 u7 fmin
4.2.6 Pressure Vessel Design Problem

This problem minimizes the total cost of material, forming, and single 60° welding [85]. The design components are the diameter of the shell (u1), the diameter of the head (u2), the internal radius (u3), and the length of the cylindrical part of the vessel (u4). The analytical expression is as follows:

Minimize f6(u) = 0.6224u1u3u4 + 1.7781u2u3² + 3.1661u1²u4 + 19.84u1²u3

g4(u) = u4 − 240 ≤ 0

Here, 1 ≤ u1 ≤ 99, 1 ≤ u2 ≤ 99, 10 ≤ u3 ≤ 200, 10 ≤ u4 ≤ 200; u = (u1, u2, u3, u4).

Table 17 compares the QLGCTSA algorithm with the other optimization algorithms. The experimental results show that the QLGCTSA algorithm obtained the best optimum results. As shown in Table 17, the optimum values of the QLGCTSA algorithm's decision parameters are 13.39411, 7.075665, 42.09845, and 176.636596, and the optimum cost of the cylindrical pressure vessel is 6059.7143. The QLGCTSA algorithm is superior to the other optimization algorithms in terms of the best solution.

5 Conclusions and Future Work

The Tunicate Swarm Algorithm (TSA) has been used to solve some engineering problems. This paper used three mutation operators, based on three different distributions (Gaussian mutation, Cauchy mutation, and Lévy flight), to increase the TSA algorithm's global search capability. We then evaluated the impact of each mutation operator on the performance of the algorithm. We also assessed the TSA algorithm's performance using the mutation operators in pairwise (two-by-two) form, finally presenting an algorithm that applies all three mutation operators in the optimization process; we called this algorithm the QLGCTSA algorithm. To perform the evaluations, 52 benchmark functions covering the UM, MM, and composition groups were used, 29 of which belong to CEC 2017. The QLGCTSA algorithm was also applied to six large-dimension engineering problems to evaluate its performance, and an additional experiment was conducted to test the QLGCTSA algorithm's scalability on benchmarks of large and various sizes. The comparison with competing algorithms indicates that the QLGCTSA algorithm outperforms the other optimization algorithms and can be used in the future as a powerful and efficient algorithm to solve various problems. The paper mainly considers the efficiency of the QLGCTSA algorithm on six technical implementation problems. Experimental results show that the QLGCTSA algorithm has improved dramatically compared with several optimization algorithms. In future work, the QLGCTSA algorithm can be applied to different engineering problems, and we intend to develop and evaluate a multi-objective version of this algorithm to solve multi-objective problems with excellent and significant performance.

Declarations

Conflict of interest The authors declare that they have no conflict of interest.

References

1. Gharehchopogh, F. S., Maleki, I., & Dizaji, Z. A. (2021). Chaotic vortex search algorithm: Metaheuristic algorithm for feature selection. Evolutionary Intelligence, 1–32.
2. Nadimi-Shahraki, M. M., Banaie-Dezfouli, M., Zamani, H., Shokooh, T., & Seyedali, M. (2021). B-MFO: A binary moth-flame optimization for feature selection from medical datasets. Computers, 10(11), 136.
3. Benyamin, A., Farhad, S. G., & Saeid, B. (2021). Discrete farmland fertility optimization algorithm with metropolis acceptance criterion for traveling salesman problems. International Journal of Intelligent Systems, 36(3), 1270–1303.
4. Gharehchopogh, F. S., Farnad, B., & Alizadeh, A. (2021). A modified farmland fertility algorithm for solving constrained engineering problems. Concurrency and Computation: Practice and Experience, 33(17), e6310.
5. Houssein, E. H., Emam, M. M., & Ali, A. A. (2021). An efficient multilevel thresholding segmentation method for thermography breast cancer imaging based on improved chimp optimization algorithm. Expert Systems with Applications, 185, 115651.
6. Furugi, A. (2021). A tabu search algorithm for the unrelated parallel machine scheduling problem with machine availability constraint and sequence-dependent setup time. Journal of the Faculty of Engineering and Architecture of Gazi University, 36(3), 1539–1550.
7. Houssein, E. H. (2021). Hybrid slime mould algorithm with adaptive guided differential evolution algorithm for combinatorial and global optimization problems. Expert Systems with Applications, 174, 114689.
8. Nadimi-Shahraki, M. H. (2020). MTDE: An effective multi-trial vector-based differential evolution algorithm and its applications for engineering design problems. Applied Soft Computing, 97, 106761.
9. Solimanpur, M., Foroughi, A., & Mohammadi, M. (2016). Optimum route selection in hole-making operations using a dynamic programming-based method. Cogent Engineering, 3(1), 1201991.
algorithms indicates that the QLGCTSA algorithm out- 10. Furugi, A. (2022). Sequence-dependent time-and cost-oriented
assembly line balancing problems: A combinatorial Benders’
performs other competing optimization algorithms and can decomposition approach. Engineering Optimization, 54(1),
be used in the future as a powerful and efficient algorithm 170–184.
to solve various problems. The paper major considers the
efficiency of the QLGCTSA algorithm using six technical
11. Abdollahzadeh, B., Barshandeh S., Javadi H., Epicoco N. (2021). 32. Simon, D. (2008). Biogeography-based optimization. IEEE Trans-
An enhanced binary slime mould algorithm for solving the 0–1 actions on Evolutionary Computation, 12(6), 702–713.
knapsack problem. Engineering with Computers, 1–22. 33. De Castro L.N., & Von Zuben. F.J. (2000). The clonal selection
12. Essam, H. H., Bahaa, E., Diego, O., & Ahmed, A. E. S. (2021). A algorithm with engineering applications. In Workshop on Artifi-
novel black widow optimization algorithm for multilevel thresh- cial Immune Systems and Their Application, Las Vegas, USA.
olding image segmentation. Expert Systems with Applications., 34. Abdollahzadeh, B., Soleimanian, G. F., & Mirjalili, S. (2021).
167, 114159. Artificial gorilla troops optimizer: A new nature-inspired
13. Taghian, S., & Nadimi-Shahraki M.H. (2019). A binary metaheuristic algorithm for global optimization problems. Inter-
metaheuristic algorithm for wrapper feature selection. Interna- national Journal of Intelligent Systems, 36(10), 5887–5958.
tional Journal of Computer Science Engineering (IJCSE), 8(5), 35. Zamani, H., Nadimi-Shahraki, M. H., & Gandomi, A. H. (2019).
168–172. CCSA: Conscious neighborhood-based crow search algorithm for
14. Houssein, E. H. (2020). Hybrid harris hawks optimization with solving global optimization problems. Applied Soft Computing,
cuckoo search for drug design and discovery in chemoinformatics. 85, 105583.
Scientific Reports, 10(1), 1–22. 36. Shayanfar, H., & Gharehchopogh, F. S. (2018). Farmland fertility:
15. Furugi, A. & Yapici, F. (2021). Optimization of production param- A new metaheuristic algorithm for solving continuous optimiza-
eters in oriented strand board (osb) manufacturing by using tagu- tion problems. Applied Soft Computing, 71, 728–746.
chi method. Wood Industry/Drvna Industrija, 72(4). 37. Zamani, H., Nadimi-Shahraki, M. H., & Gandomi, A. H. (2021).
16. Gharehchopogh, F. S., Shayanfar, H., & Gholizadeh, H. (2020). A QANA: Quantum-based avian navigation optimizer algorithm.
comprehensive survey on symbiotic organisms search algorithms. Engineering Applications of Artificial Intelligence, 104, 104314.
Artificial Intelligence Review, 53(3), 2265–2312. 38. Rashedi, E., Nezamabadi-Pour, H., & Saryazdi, S. (2009). GSA:
17. Gharehchopogh, F.S. (2022). Advances in tree seed algorithm: A gravitational search algorithm. Information Sciences, 179(13),
A comprehensive survey. Archives of Computational Methods in 2232–2248.
Engineering, 1–24. 39. Abdollahzadeh, B., Gharehchopogh, F. S., & Mirjalili, S. (2021).
18. Ghafori, S., & Gharehchopogh F. S. (2021). Advances in spotted African vultures optimization algorithm: A new nature-inspired
hyena optimizer: A comprehensive survey. Archives of Compu- metaheuristic algorithm for global optimization problems. Com-
tational Methods in Engineering, 1–22. puters & Industrial Engineering, 158, 107408.
19. Gharehchopogh, F. S., & Gholizadeh H. (2019). A comprehensive 40. Kashan, A. H. (2014). League championship algorithm (lca): An
survey: Whale optimization algorithm and its applications. Swarm algorithm for global optimization inspired by sport champion-
and Evolutionary Computation, 48, 1–24. ships. Applied Soft Computing, 16, 171–200.
20. Kaur, S., Lalit, K. A., Sangal, A. L., & Gaurav, D. (2020). Tuni- 41. Fatma, A. H., Essam, H. H., Mai, S. M., Walid, A., & Mirjalili, S.
cate swarm algorithm: a new bio-inspired based metaheuristic (2019). Henry gas solubility optimization: A novel physics-based
paradigm for global optimization. Engineering Applications of algorithm. Future Generation Computer Systems, 101, 646–667.
Artificial Intelligence, 90, 103541. 42. Eita, M. & Fahmy, M. (2010). Group counseling optimization:
21. Zamani, H., & Nadimi-Shahraki, M. H. (2016). Feature selection A novel approach. In: Research and development in intelligent
based on whale optimization algorithm for diseases diagnosis. systems XXVI. Springer, 195–208.
International Journal of Computer Science and Information Secu- 43. Shah-Hosseini, H. (2009). The intelligent water drops algorithm:
rity, 14(9), 1243. A nature-inspired swarm-based optimization algorithm. Interna-
22. Bansal, J. C. H., & S., Shimpi S. J. & Maurice C. (2014). Spi- tional Journal of Bio-Inspired Computation, 1(1–2), 71–79.
der monkey optimization algorithm for numerical optimization. 44. Hegen, X., Zhiyuan, W., Huali, F., Gongfa, L., & Guozhang, J.
Memetic Computing, 6(1), 31–47. (2018). Quantum rotation gate in quantum-inspired evolutionary
23. Cheng, M.Y& Prayogo D. (2014). Symbiotic organisms search: algorithm: A review, analysis and comparison study. Swarm and
A new metaheuristic optimization algorithm. Computers & Struc- Evolutionary Computation, 42, 43–57.
tures, 139, 98–112. 45. Salvador, G., Alberto, F., Julian, L., & Francisco, H. (2010).
24. Tilahun, N. H. S. L., Sathasivam, S., & Choon, O. H. (2013). Advanced nonparametric tests for multiple comparisons in the
Prey-predator algorithm as a new optimization technique using in design of experiments in computational intelligence and data
radial basis function neural networks. Research Journal of Applied mining: Experimental analysis of power. Information sciences,
Sciences, 8(7), 383–387. 180(10), 2044–2064.
25. Muthiah-Nakarajan, V., & Noel, M. M. (2016). Galactic swarm 46. Alcala-Fdez, J., Sanchez, L., Garcia, S., Del, M. J., Ventura, S.,
optimization: A new global optimization metaheuristic inspired Garrell, J. M., Otero, J., Romero, C., Bacardit, J., Rivas, V. M.,
by galactic motion. Applied Soft Computing, 38, 771–787. Fernandez, J. C., & Herrera, F. (2009). KEEL: A software tool
26. Mirjalili, S., Mirjalili, S. M., & Lewis, A. (2014). Grey wolf opti- to assess evolutionary algorithms for data mining problems. Soft
mizer. Advances in Engineering Software, 69, 46–61. Computing, 13(3), 307–318.
27. Nadimi-Shahraki, M. H., Taghian, S., & Mirjalili, S. (2021). An 47. Yao, X., Liu, Y., & Lin, G. (1999). Evolutionary programming
improved grey wolf optimizer for solving engineering problems. made faster. IEEE Transactions on Evolutionary computation,
Expert Systems with Applications, 166, 113917. 3(2), 82–102.
28. Mirjalili, S. (2016). SCA: A sine cosine algorithm for solving 48. Digalakis, J. G., & Margaritis, K. G. (2001). On benchmarking
optimization problems. Knowledge-Based Systems, 96, 120–133. functions for genetic algorithms. International Journal of Com-
29. Dhiman, G., & Kumar, V. (2017). Spotted hyena optimizer: A puter Mathematics, 77(4), 481–506.
novel bio-inspired based metaheuristic technique for engineering 49. Heidari, A. A., Mirjalili, S., Hossam, F., Ibrahim, A., Majdi, M.,
applications. Advances in Engineering Software, 114, 48–70. & Huiling, C. (2019). Harris hawks optimization: Algorithm and
30. Zamani, H., Nadimi-Shahraki, M. H., & Gandomi A.H. (2022). applications. Future Generation Computer Systems, 97, 849–872.
Starling murmuration optimizer: A novel bio-inspired algorithm 50. Mirjalili, S., Mirjalili, S. M., & Hatamlou, A. (2016). Multi-verse
for global and engineering optimization. Computer Methods in optimizer: A nature-inspired algorithm for global optimization.
Applied Mechanics and Engineering, 392, 114616. Neural Computing and Applications, 27(2), 495–513.
31. Kirkpatrick, S., Gelatt, C. D., & Vecchi, M. P. (1983). Optimiza- 51. Eneko, O., Esther, V.R., Javier, D. S., Antonio, J.N., Daniel, M.,
tion by simulated annealing. Science, 220(4598), 671–680. Antonio, L.T., Ponnuthurai, N.S., Carlos, A.C.C., & Francisco, H.
(2021). A tutorial on the design, experimentation and application 70. Moosavi, S. H. S., & Bardsiri, V. K. (2019). Poor and rich opti-
of metaheuristic algorithms to real-world optimization problems. mization algorithm: A new human-based and multi populations
Swarm and Evolutionary Computation, 100888. algorithm. Engineering Applications of Artificial Intelligence, 86,
52. Gandomi, A. H., Yang, X. S., & Alavi, A. H. (2013). Cuckoo 165–181.
search algorithm: A metaheuristic approach to solve structural 71. Mirjalili, S. (2015). The ant lion optimizer. Advances in Engineer-
optimization problems. Engineering with Computers, 29(1), ing Software, 83, 80–98.
17–35. 72. Zhang, M., Luo, W., & Wang, X. (2008). Differential evolution
53. Mirjalili, S., & Lewis, A. (2016). The whale optimization algo- with dynamic stochastic selection for constrained optimization.
rithm. Advances in Engineering Software, 95, 51–67. Information Sciences, 178(15), 3043–3074.
54. He, Q., & Wang, L. (2007). An effective co-evolutionary particle 73. Wang, C., & Liu, K. (2019). A randomly guided firefly algorithm
swarm optimization for constrained engineering design problems. based on elitist strategy and its applications. IEEE Access, 7,
Engineering Applications of Artificial Intelligence, 20(1), 89–99. 130373–130387.
55. Mezura-Montes, E., & Coello, C. A. C. (2008). An empirical 74. Huiling, C., Yueting, X., Mingjing, W., & Xuehua, Z. (2019). A
study about the usefulness of evolution strategies to solve con- balanced whale optimization algorithm for constrained engineer-
strained optimization problems. International Journal of General ing design problems. Applied Mathematical Modelling, 71, 45–59.
Systems, 37(4), 443–473. 75. Chen, D., Ziqi, X., Ximeng, L., Yin, Y., Yang, Y., & Wenzhong,
56. Gupta, S., Deep, K., Moayedi, H., Foong, L.K., & Assif, A. G. (2019). Dual-search artificial bee colony algorithm for engi-
(2020). Sine cosine grey wolf optimizer to solve engineering neering optimization. IEEE Access, 7, 24571–24584.
design problems. Engineering with Computers, 1–27. 76. Saremi, S., Mirjalili, S., & Lewis, A. (2017). Grasshopper optimi-
57. Ray, T., & Saini, P. (2001). Engineering design optimization using zation algorithm: Theory and application. Advances in Engineer-
a swarm with an intelligent information sharing among individu- ing Software, 105, 30–47.
als. Engineering Optimization, 33(6), 735–748. 77. Liu, H., Cai, Z., & Wang, Y. (2010). Hybridizing particle swarm
58. Akhtar, S., Tai, K., & Ray, T. (2002). A socio-behavioural simu- optimization with differential evolution for constrained numeri-
lation model for engineering design optimization. Engineering cal and engineering optimization. Applied Soft Computing, 10(2),
Optimization, 34(4), 341–354. 629–640.
59. Mittal, N., Singh, U., & Sohi, B.S. (2016). Modified grey wolf 78. Tsai, J. F. (2005). Global optimization of nonlinear fractional pro-
optimizer for global engineering optimization. Applied Compu- gramming problems in engineering design. Engineering Optimi-
tational Intelligence and Soft Computing, pp. 1–17. zation, 37(4), 399–409.
60. Gupta, S., & Deep, K. (2019). A hybrid self-adaptive sine cosine 79. Ali, S., Ardeshir, B., Hadi, E., & Mohd, H. (2013). Mine blast
algorithm with opposition based learning. Expert Systems with algorithm: A new population based algorithm for solving con-
Applications, 119, 210–230. strained engineering optimization problems. Applied Soft Comput-
61. Mirjalili, S. (2015). Moth-flame optimization algorithm: A novel ing, 13(5), 2592–2612.
nature-inspired heuristic paradigm. Knowledge-Based Systems, 80. Kamalinejad, M., Arzani, H., & Kaveh, A. (2019). Quantum
89, 228–249. evolutionary algorithm with rotational gate and SS Hepsilon SS-
62. Mirjalili, S., Gandomi, A. H., Mirjalili, S. Z., Saremi, S., Faris, H., gate updating in real and integer domains for optimization. Acta
& Mirjalili, S. M. (2017). Salp swarm algorithm: A bio-inspired Mechanica, 230(8), 2937–2961.
optimizer for engineering design problems. Advances in Engineer- 81. Wang, C., & Chu X. (2019). An improved firefly algorithm with
ing Software, 114, 163–191. specific probability and its engineering application. IEEE Access,
63. Gupta, S., & Deep, K. (2019). Improved sine cosine algorithm 7, 57424–57439.
with crossover scheme for global optimization. Knowledge-Based 82. Ray, T., & Liew, K. M. (2003). Society and civilization: An opti-
Systems, 165, 374–406. mization algorithm based on the simulation of social behavior.
64. Sayed, G. I., Khoriba, G., & Haggag, M. H. (2018). A novel cha- IEEE Transactions on Evolutionary Computation, 7(4), 386–396.
otic salp swarm algorithm for global optimization and feature 83. Kandikonda, H. R., Sharma, R. S., Mishra, G. S. A., & Patvard-
selection. Applied Intelligence, 48(10), 3462–3481. han, C. (2005). An evolutionary computational technique for
65. Chickermane, H., & Gea, H. C. (1996). Structural optimization constrained optimisation in engineering design. Journal of the
using a new local approximation method. International Journal Institution of Engineers India Part Me Mechanical Engineering
for Numerical Methods in Engineering, 39(5), 829–846. Division, 86, 121–128.
66. Ashok, D. B., & Jasbir, S. A. (1985). A study of mathematical 84. Wang, G. G. (2003). Adaptive response surface method using
programming methods for structural optimization: Part I: Theory. inherited latin hypercube design points. Journal of Mechanical
Numerical Method in Engineering, 21(9), 1583–1599. Design, 125(2), 210–220.
67. Mahdavi, M., Fesanghary, M., & Damangir E. (2007). An 85. Holland, J.H. (1992). Adaptation in natural and artificial systems:
improved harmony search algorithm for solving optimization an introductory analysis with applications to biology, control, and
problems. Applied Mathematics and Computation, 188(2), artificial intelligence. MIT Press.
1567–1579.
68. Lee, K. S., & Geem, Z. W. (2005). A new meta-heuristic algo- Publisher's Note Springer Nature remains neutral with regard to
rithm for continuous engineering optimization: Harmony search jurisdictional claims in published maps and institutional affiliations.
theory and practice. Computer Methods in Applied Mechanics and
Engineering, 194(36–38), 3902–3933.
69. Li, L. J., Huang, Z. B., Liu, F., & Wu, Q. H. (2007). A heuristic
particle swarm optimizer for optimization of pin connected struc-
tures. Computers & Structures, 85(7–8), 340–349.