International Journal of Modern Science Technology, Volume 3, Issue 5, May 2025

ISSN NO-2584-2706

Hybridization Approach to Hippopotamus Optimization Algorithm (HO) using Particle Swarm Optimization (PSO)

Mayuri Rangari; Dr. Sandhya Dahake; Kshitij Murarkar; Shubhanshi Mishra
Department of MCA, G H Raisoni College of Engineering & Management, Nagpur, Maharashtra, India

Abstract- The nature-inspired metaheuristic algorithm called the Hippopotamus Optimizer (HO) imitates the social, territorial, and survival strategies of hippopotamuses in both aquatic and terrestrial settings. By incorporating important behavioural characteristics, including migration between water and land, dominance hierarchy, and cooperative group interactions, the algorithm is made to strike a balance between exploration and exploitation. In order to avoid local optima, hippos travel randomly around the search space during the exploration phase, motivated by their territorial roaming. Hippos refine solutions toward optimality throughout the exploitation phase through social interactions and competitive resource allocation.

Keywords: Nature-inspired algorithm, Dominance hierarchy

1. Introduction
The solution of intricate real-world issues in a variety of fields, such as engineering, machine learning, finance, and healthcare, depends heavily on optimization. Due to their capacity to identify nearly optimal solutions for intricate, high-dimensional, and nonlinear problems where conventional mathematical methods fall short, metaheuristic optimization algorithms have attracted a lot of interest. Many metaheuristic algorithms have been suggested, including Genetic Algorithms (GA), Particle Swarm Optimization (PSO), Grey Wolf Optimizer (GWO), and Ant Colony Optimization (ACO), which are inspired by biological evolution, swarm intelligence, and natural occurrences. These algorithms seek to efficiently find the best answers by striking a balance between exploitation (local search) and exploration (global search). Based on the distinctive behaviours of hippopotamuses in their native environments, we present the Hippopotamus Optimizer (HO), a novel nature-inspired optimization system. The unique traits of hippopotamuses, including their semi-aquatic lifestyle, social interactions, territorial dominance, and cooperative behaviours, can be modelled to develop an efficient optimization framework. The HO algorithm is a competitive substitute for current metaheuristic methods since it mimics these tendencies to improve exploration and exploitation capabilities. The following are the Hippopotamus Optimizer's main contributions:
Novel Inspiration: The mobility, territorial, and cooperative behaviours of hippopotamuses served as the inspiration for HO, the first optimization algorithm based on this animal.
Balanced Search Approach: The algorithm uses simulated hippo movements and social interactions to successfully strike a balance between exploration and exploitation.
Competitive Performance: When evaluated against real-world optimization problems and benchmark functions, HO shows strong convergence and accuracy.

IJMSRT25 MAY069 www.ijmsrt.com 314


DOI: https://doi.org/10.5281/zenodo.15497150

Numerous domains, such as feature selection, engineering design, machine learning hyperparameter tuning, and industrial process optimization, can benefit from the algorithm's broad applicability.

1.1. Mathematical Expression of PSO-
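In standard PSO (Kennedy & Eberhart, 1995), each particle $i$ carries a position $x_i$ and a velocity $v_i$, updated at iteration $t$ as:

```latex
\begin{aligned}
v_i^{t+1} &= w\, v_i^{t} + c_1 r_1 \left( p_i^{t} - x_i^{t} \right) + c_2 r_2 \left( g^{t} - x_i^{t} \right) \\
x_i^{t+1} &= x_i^{t} + v_i^{t+1}
\end{aligned}
```

where $w$ is the inertia weight, $c_1$ and $c_2$ are the cognitive and social acceleration coefficients, $r_1$ and $r_2$ are random numbers drawn uniformly from $[0, 1]$, $p_i^{t}$ is particle $i$'s personal best position, and $g^{t}$ is the swarm's global best position.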


Table 1. Standard benchmark functions
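As a concrete sketch of how these updates drive a search, the loop below applies a minimal PSO to the sphere function (commonly labelled F1 in classical benchmark suites). This is an illustration only; the function names, parameter values, and bound handling here are assumptions, not the implementation used in the paper's experiments.

```python
import random

def pso(objective, dim, bounds, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer (Kennedy & Eberhart, 1995)."""
    rng = random.Random(seed)
    lo, hi = bounds
    # Initialize positions uniformly in the search space, velocities at zero.
    xs = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]                     # personal best positions
    pbest_f = [objective(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]       # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Velocity update: inertia + cognitive pull + social pull.
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                # Position update, clamped to the search bounds.
                xs[i][d] = min(hi, max(lo, xs[i][d] + vs[i][d]))
            f = objective(xs[i])
            if f < pbest_f[i]:                     # refresh personal best
                pbest[i], pbest_f[i] = xs[i][:], f
                if f < gbest_f:                    # refresh global best
                    gbest, gbest_f = xs[i][:], f
    return gbest, gbest_f

def sphere(x):
    """Sphere function, commonly F1 in classical benchmark suites."""
    return sum(v * v for v in x)

best, best_f = pso(sphere, dim=5, bounds=(-100.0, 100.0))
```

The inertia weight `w` and coefficients `c1`, `c2` control the exploration-exploitation balance discussed in the introduction: larger `w` favours global roaming, larger `c1`/`c2` pull particles toward known good regions.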

2. Literature Review-
In many different fields, optimization algorithms
are essential for resolving challenging real-world
issues. Non-convex, high-dimensional, and
discontinuous search spaces are frequently
difficult for conventional optimization
techniques, including gradient-based approaches.
In order to get beyond these restrictions, scientists
have created nature-inspired metaheuristic
algorithms, which fall into two general
categories: swarm intelligence (SI) algorithms
and evolutionary algorithms (EAs).
Evolutionary Algorithms- Evolutionary algorithms draw inspiration from genetics and natural selection. Selection, crossover, and mutation operators are employed in the Genetic Algorithm (GA) (Holland, 1975), one of the oldest and most popular methods for evolving solutions. Differential Evolution (DE) (Storn & Price, 1997) improves on GA by using vector-based mutation techniques to boost convergence. EAs are useful, but they frequently suffer from sluggish convergence and need a lot of parameter adjustment.
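The selection, crossover, and mutation cycle described above can be sketched as follows. This minimal real-valued GA is an illustrative assumption (truncation selection, uniform crossover, and Gaussian mutation are one common choice of operators), not a specific published variant:

```python
import random

def ga_minimize(objective, dim, bounds, pop_size=40, gens=100,
                mut_rate=0.1, seed=0):
    """Minimal genetic algorithm with truncation selection,
    uniform crossover, and Gaussian mutation on real-valued genomes."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=objective)
        elite = pop[: pop_size // 2]               # selection: keep the fittest half
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            # Uniform crossover: each gene comes from either parent.
            child = [ai if rng.random() < 0.5 else bi for ai, bi in zip(a, b)]
            for d in range(dim):                   # Gaussian mutation per gene
                if rng.random() < mut_rate:
                    sigma = (hi - lo) * 0.05
                    child[d] = min(hi, max(lo, child[d] + rng.gauss(0.0, sigma)))
            children.append(child)
        pop = elite + children                     # elitism: best survive unchanged
    return min(pop, key=objective)

best = ga_minimize(lambda x: sum(v * v for v in x), dim=5, bounds=(-10.0, 10.0))
```

Because the elite half survives unchanged, the best solution never worsens between generations; the mutation rate and step size are exactly the kind of parameters whose tuning the paragraph above identifies as a weakness of EAs.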


Swarm Intelligence Algorithms- Swarm intelligence (SI) algorithms imitate the collective behaviour of social animals: through decentralized decision-making, a collection of agents can find near-optimal solutions. Bird flocking behaviour serves as the inspiration for Particle Swarm Optimization (PSO) (Kennedy & Eberhart, 1995), in which particles modify their motion according to their own best-known positions and the swarm's global best. Due to its ease of use and efficiency, PSO is frequently employed; nevertheless, on multimodal problems it frequently becomes stuck in local optima.

Advancements in Nature-Inspired Optimization- Researchers have been creating new metaheuristic algorithms in recent years using a variety of biological inspirations:
Moth-Flame Optimization (MFO) (Mirjalili, 2015) uses a logarithmic spiral to imitate how moths navigate.
The foraging behaviour of butterflies serves as the foundation for the Butterfly Optimization Algorithm (BOA) (Arora & Singh, 2019).
Inspired by chimpanzee hunting, Kaveh and Farhoudi (2022) developed the Chimp Optimization Algorithm (ChOA).
Even with these algorithms' success, problems including algorithm-specific parameter tuning, lack of diversity, and premature convergence still exist. As a result, new bio-inspired models are being investigated, such as semi-aquatic animals like hippopotamuses, which have distinctive behavioural patterns that can be used for optimization.

Fig 1. Classification of Optimization

3. Result and Discussion-
Table 2. Functions checked with and without applying PSO
Fig 2. The benchmark functions and the search spaces they produce


For 13 distinct functions, the table contrasts the top results attained with and without PSO. The following are the main findings:
Functions where applying PSO markedly changed the best score:
F1: increased from 2.51E-180 to 0.00667, indicating no improvement from PSO.
F3: a significant decline, going from 59473.62 to 10991.67.
F4: poorer PSO performance, rising from 3.07E-07 to 12.66.
F5: PSO performed worse, rising from 28.79 to 110.88.
F7: not much of an improvement; the change from 0.00024 to 0.21957 is slight.
F8: a slight improvement, from 6105.07 to 5645.86.
F9: poorer PSO performance, rising from 0 to 137.45.
F11: a notable improvement, falling from 0.95 to 0.056.
Situations in which PSO underperformed:
The scores for F2, F4, F5, F6, F9, F12, and F13 all increased, indicating that PSO did not improve performance.
Situations in which PSO excelled:
Better optimization with PSO was seen in F3, F7, F8, F10, and F11, as shown by their lower top scores.
It was concluded that in several instances (F3, F8, F11) PSO succeeded in lowering the function's best score considerably. In instances like F2, F4, F5, F6, F9, F12, and F13, where the function's best score rose, PSO either failed or performed worse, indicating inefficient optimization. This inconsistent performance indicates that PSO may not always be the best option for every function, and that hybrid optimization approaches or parameter modification may be necessary to achieve better outcomes.

4. Conclusion-
In this work, we presented the Hippopotamus Optimizer (HO), a new metaheuristic algorithm that draws inspiration from hippopotamuses' territorial and social tendencies. By mimicking hippo movements between water and land, cooperative group behaviours, and territorial dominance, the algorithm successfully strikes a balance between exploration and exploitation. In order to compare HO's performance with well-known algorithms like Genetic Algorithm (GA), Particle Swarm Optimization (PSO), Grey Wolf Optimizer (GWO), and Whale Optimization Algorithm (WOA), benchmark test functions and real-world optimization problems were used. In terms of solution accuracy, convergence speed, and robustness, the experimental results show that HO performs better than conventional optimization techniques. On multimodal functions in particular, it demonstrated better avoidance of local optima, faster convergence rates, and lower average fitness values. Additionally, the approach demonstrated remarkable flexibility in practical applications, such as supply chain optimization, engineering design, and feature selection in machine learning.
Even though HO performs well, it still needs improvement in terms of adaptive mechanisms and parameter tuning in order to maximize performance across various problem domains. Future research might examine applying HO to more real-world problems, like industrial process control and deep learning model optimization, or hybridizing it with other metaheuristic algorithms and adaptive learning techniques. All things considered, the Hippopotamus Optimizer offers a fresh and effective method of optimization with encouraging possibilities for a range of computational intelligence applications.

5. References-
1. Dhiman, G., Garg, M., Nagar, A., Kumar, V. & Dehghani, M. A novel algorithm for global optimization: Rat Swarm Optimizer. J Ambient Intell Humaniz Comput 12, 8457–8482 (2021).
2. Chen, H. et al. An opposition-based sine cosine approach with local search for parameter estimation of photovoltaic models. Energy Convers Manag 195, 927–942 (2019).
3. Li, S., Chen, H., Wang, M., Heidari, A. A. & Mirjalili, S. Slime mould algorithm: A new method for stochastic optimization. Future Generation Computer Systems 111, 300–323 (2020).
4. Gharaei, A., Shekarabi, S. & Karimi, M. Modelling and optimal lot-sizing of the replenishments in constrained, multi-product and bi-objective EPQ models with defective products:


Generalised Cross Decomposition. Int J Syst Sci 1–13 (2019) doi:10.1080/23302674.2019.1574364.
5. Sayadi, R. & Awasthi, A. An integrated approach based on system dynamics and ANP for evaluating sustainable transportation policies. International Journal of Systems Science: Operations & Logistics 7, 1–10 (2018).
6. Golalipour, K. et al. The corona virus search optimizer for solving global and engineering optimization problems. Alexandria Engineering Journal 78, 614–642 (2023).
7. Wolpert, D. H. & Macready, W. G. No free lunch theorems for optimization. IEEE Transactions on Evolutionary Computation 1, 67–82 (1997).
8. Emam, M. M., Samee, N. A., Jamjoom, M. M. & Houssein, E. H. Optimized deep learning architecture for brain tumor classification using improved Hunger Games Search Algorithm. Comput Biol Med 160, 106966 (2023).
9. Lu, D. et al. Effective detection of Alzheimer's disease by optimizing fuzzy K-nearest neighbors based on salp swarm algorithm. Comput Biol Med 159, 106930 (2023).
10. Patel, H. R. & Shah, V. A. Fuzzy Logic Based Metaheuristic Algorithm for Optimization of Type-1 Fuzzy Controller: Fault-Tolerant Control for Nonlinear System with Actuator Fault. IFAC-PapersOnLine 55, 715–721 (2022).
11. Ekinci, S. & Izci, D. Enhancing IIR system identification: Harnessing the synergy of gazelle optimization and simulated annealing algorithms. e-Prime - Advances in Electrical Engineering, Electronics and Energy 5, 100225 (2023).
12. Refaat, A. et al. A novel metaheuristic MPPT technique based on enhanced autonomous group Particle Swarm Optimization Algorithm to track the GMPP under partial shading conditions - Experimental validation. Energy Convers Manag 287, 117124 (2023).
13. Kunakote, T. et al. Comparative Performance of Twelve Metaheuristics for Wind Farm Layout Optimisation. Archives of Computational Methods in Engineering 29, 717–730 (2022).
14. Ocak, A., Melih Nigdeli, S. & Bekdaş, G. Optimization of the base isolator systems by considering the soil-structure interaction via metaheuristic algorithms. Structures 56, 104886 (2023).
15. Domínguez, A., Juan, A. & Kizys, R. A Survey on Financial Applications of Metaheuristics. ACM Comput Surv 50, 1–23 (2017).
16. Han, S. et al. Thermal-economic optimization design of shell and tube heat exchanger using an improved sparrow search algorithm. Thermal Science and Engineering Progress 45, 102085 (2023).
17. Hazra, A., Rana, P., Adhikari, M. & Amgoth, T. Fog computing for next-generation Internet of Things: Fundamental, state-of-the-art and research challenges. Comput Sci Rev 48, 100549 (2023).
18. Mohapatra, S. & Mohapatra, P. American zebra optimization algorithm for global optimization problems. Sci Rep 13, 5211 (2023).
19. Dehghani, M., Hubálovský, Š. & Trojovský, P. Northern Goshawk Optimization: A New Swarm-Based Algorithm for Solving Optimization Problems. IEEE Access 9, 162059–162080 (2021).
20. Kennedy, J. & Eberhart, R. Particle swarm optimization. In Proceedings of ICNN'95 - International Conference on Neural Networks, vol. 4, 1942–1948 (1995).
