Hybridization Approach to Hippopotamus Optimization Algorithm (HO) Using Particle Swarm Optimization (PSO)
2. Literature Review-
Optimization algorithms are essential for solving challenging real-world problems in many different fields. Conventional optimization techniques, including gradient-based approaches, frequently struggle with non-convex, high-dimensional, and discontinuous search spaces. To overcome these limitations, researchers have developed nature-inspired metaheuristic algorithms, which fall into two general categories: swarm intelligence (SI) algorithms and evolutionary algorithms (EAs).
Evolutionary Algorithms- Evolutionary algorithms draw inspiration from genetics and natural selection. The Genetic Algorithm (GA) (Holland, 1975), one of the oldest and most popular methods for evolving solutions, employs selection, crossover, and mutation operators. Differential Evolution (DE) (Storn & Price, 1997) improves on GA by using vector-based mutation to speed up convergence. Although EAs are useful, they frequently suffer from slow convergence and require extensive parameter tuning.
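To make the vector-based mutation concrete, the following is a minimal Python sketch of the classic DE/rand/1/bin step introduced by Storn & Price; the sphere objective and the population size, F, and CR settings are illustrative assumptions, not values taken from the cited papers.

import numpy as np

def sphere(x):
    # Illustrative objective; often labelled F1 in benchmark suites.
    return float(np.sum(x ** 2))

def de_step(pop, fit, obj, F=0.5, CR=0.9, rng=None):
    # One generation of DE/rand/1/bin; F and CR are common textbook defaults.
    # Note: this in-place variant is a common simplification of the original
    # synchronous scheme, in which donors come only from the old population.
    rng = np.random.default_rng() if rng is None else rng
    n, dim = pop.shape
    for i in range(n):
        # Vector-based mutation: perturb donor a by the scaled difference of b and c.
        a, b, c = rng.choice([j for j in range(n) if j != i], size=3, replace=False)
        mutant = pop[a] + F * (pop[b] - pop[c])
        # Binomial crossover, forcing at least one coordinate from the mutant.
        mask = rng.random(dim) < CR
        mask[rng.integers(dim)] = True
        trial = np.where(mask, mutant, pop[i])
        # Greedy one-to-one selection: the trial replaces the target only if no worse.
        tf = obj(trial)
        if tf <= fit[i]:
            pop[i] = trial
            fit[i] = tf
    return pop, fit

# Usage: 200 generations on a 10-dimensional sphere.
rng = np.random.default_rng(0)
pop = rng.uniform(-5, 5, (20, 10))
fit = np.array([sphere(x) for x in pop])
for _ in range(200):
    pop, fit = de_step(pop, fit, sphere, rng=rng)
print("best sphere value:", float(fit.min()))

The greedy one-to-one selection keeps each population slot's best solution so far, which partly offsets DE's sensitivity to the F and CR parameters.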
For 13 distinct functions, the table contrasts the top results attained with and without PSO. The main findings are as follows:
Per-function changes in best score:
F1: increased from 2.51E-180 to 0.00667, so PSO brought no discernible improvement.
F3: a significant decline, from 59473.62 to 10991.67.
F4: poorer performance under PSO, rising from 3.07E-07 to 12.66.
F5: PSO performed worse, rising from 28.79 to 110.88.
F7: not much of a change; the shift from 0.00024 to 0.21957 is slight.
F8: a slight improvement, from 6105.07 to 5645.86.
F9: poorer performance under PSO, rising from 0 to 137.45.
F11: a notable improvement, from 0.95 to 0.056.
Situations in which PSO underperformed:
The scores for F2, F4, F5, F6, F9, F12, and F13 all increased, indicating that PSO degraded performance on these functions.
Situations in which PSO excelled:
Better optimization with PSO was seen in F3, F7, F8, F10, and F11, as shown by their lower top scores.
It was concluded that in several instances (F3, F8, F11) PSO lowered the function's best score considerably. In instances such as F2, F4, F5, F6, F9, F12, and F13, where the function's best score rose, PSO either failed or performed worse, indicating inefficient optimization. This inconsistent performance suggests that PSO may not always be the best option for every function, and that hybrid optimization approaches or parameter modification may be necessary to achieve better outcomes. The PSO update underlying these experiments is sketched below.
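For reference, the PSO stage evaluated above follows the standard velocity and position update, v <- w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x). The sketch below implements that textbook update in Python; the coefficient values, the sphere objective, and the idea of seeding the swarm from another algorithm's final population are illustrative assumptions, not the exact hybrid configuration used in the experiments.

import numpy as np

def pso_refine(seed_pop, obj, iters=100, w=0.7, c1=1.5, c2=1.5, rng=None):
    # Textbook global-best PSO; w, c1, c2 are common defaults, not the paper's values.
    rng = np.random.default_rng() if rng is None else rng
    x = seed_pop.astype(float)   # e.g. a population handed over by HO (assumption)
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_f = np.array([obj(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1 = rng.random(x.shape)
        r2 = rng.random(x.shape)
        # Velocity update: inertia term + cognitive pull + social pull.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        f = np.array([obj(p) for p in x])
        improved = f < pbest_f
        pbest[improved] = x[improved]
        pbest_f[improved] = f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, float(pbest_f.min())

# Usage: refine a random stand-in for HO's final population on the sphere function.
rng = np.random.default_rng(1)
best_x, best_f = pso_refine(rng.uniform(-5, 5, (30, 10)),
                            lambda z: float(np.sum(z ** 2)), rng=rng)
print("refined best value:", best_f)

Because the update pulls every particle toward its own best and the swarm's best, its benefit depends on how informative those attractors are, which is consistent with the mixed per-function results above.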
4. Conclusion-
In this work, we presented the Hippopotamus Optimizer (HO), a new metaheuristic algorithm that draws inspiration from hippopotamuses' territorial and social tendencies. By mimicking hippo movements between water and land, cooperative group behaviours, and territorial dominance, the algorithm strikes an effective balance between exploration and exploitation. To compare HO's performance with well-known algorithms such as the Genetic Algorithm (GA), Particle Swarm Optimization (PSO), Grey Wolf Optimizer (GWO), and Whale Optimization Algorithm (WOA), benchmark test functions and real-world optimization problems were used. The experimental results show that HO outperforms conventional optimization techniques in solution accuracy, convergence speed, and robustness. On multimodal functions in particular, it demonstrated better avoidance of local optima, faster convergence rates, and lower average fitness values. Additionally, the approach demonstrated remarkable flexibility in practical applications such as supply chain optimization, engineering design, and feature selection in machine learning.
Even though HO performs well, it still needs improvement in adaptive mechanisms and parameter tuning in order to maximize performance across various problem domains. Future research might examine applying HO to more real-world problems, such as industrial process control and deep learning model optimization, or hybridizing it with other metaheuristic algorithms and adaptive learning techniques. All things considered, the Hippopotamus Optimizer offers a fresh and effective method of optimization with encouraging possibilities for a range of computational intelligence applications.