Autonomous Navigation System For Hexa-Legged Search and Rescue Robot Using Lidar
Corresponding Author:
Aris Budiyarto
Department of Automation and Mechatronics Engineering, Politeknik Manufaktur Bandung
Jl. Kanayakan 21, Dago, Bandung, Jawa Barat, Indonesia
Email: [email protected]
1. INTRODUCTION
The hexa-legged robot, a six-legged walking platform, has attracted widespread interest in recent years owing to its design. This robotic system derives its leg morphology from biological creatures, a biomimetic approach aimed at enhancing adaptability across a spectrum of challenging terrains [1]–[5]. The leg configuration, inspired by natural biomechanics, gives the hexa-legged robot notable stability and versatility. Research indicates that, compared with quadrupedal counterparts, the hexa-legged robot demonstrates superior stability during locomotion. This heightened stability is attributed to its use of a static walking technique, in contrast to the dynamic walking techniques employed by other legged robots [6]. The static walking technique minimizes control complexity, enabling the robot to traverse complex terrains more efficiently. By mimicking the steady, reliable locomotion of certain biological organisms, this design promises enhanced adaptability, stability, and efficiency, making it well suited to applications such as search and rescue, exploration of rugged terrain, and tasks requiring precise, stable movement in unpredictable environments.
An autonomous navigation system enables a robot to travel from one point to another, whether the destination is familiar or uncharted. Such a system integrates several components to facilitate seamless movement. First, perception: sensors allow the robot to observe its environment comprehensively and provide the data inputs on which the later stages depend. Second, localization: the robot determines its position within the environment in real time using localization algorithms. Third, recognition: the robot makes decisions autonomously by analyzing the gathered information, identifying obstacles and candidate paths, and formulating a strategy to achieve its objective. Finally, motion control translates these decisions into physical actions, with the robot executing its planned movements in response to the environment's dynamics. Together, perception, localization, recognition, and motion control enable robots to navigate autonomously with efficiency, adaptability, and intelligence [6]–[9].
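To make the interplay of these four stages concrete, the following minimal Python sketch outlines one iteration of such a control loop. The class and method names are illustrative assumptions for exposition, not the interface of the system described in this paper.

```python
class AutonomousNavigator:
    """Illustrative perception-localization-recognition-control loop."""

    def __init__(self, lidar, localizer, planner, controller):
        self.lidar = lidar            # perception: range sensing
        self.localizer = localizer    # localization: e.g., scan matching
        self.planner = planner        # recognition/decision: path planning
        self.controller = controller  # motion control: gait execution

    def step(self, goal):
        scan = self.lidar.read()                    # 1. perceive the surroundings
        pose = self.localizer.update(scan)          # 2. estimate the robot's pose
        path = self.planner.plan(pose, goal, scan)  # 3. decide on a path
        self.controller.follow(path)                # 4. act on the decision
```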
The unpredictable and hazardous conditions arising from natural disasters in regions such as Indonesia's Ring of Fire necessitate the deployment of advanced robotic solutions for efficient search and rescue operations [10], [11]. A hexa-legged robot capable of maneuvering through complex terrain can
significantly improve the effectiveness of rescue missions. This paper presents an autonomous navigation
system for such a robot, utilizing light detection and ranging (LiDAR) technology to enhance perception,
localization, and obstacle detection capabilities [12]–[14].
This research focuses on advancing the implementation of an autonomous navigation system customized for the hexapod robot. Prior work using Navstack_Pub for navigation and Hector_SLAM for mapping, with LiDAR as the primary sensor, demonstrated notable accuracy and repeatability in controlled environments. This study further improves that implementation by addressing synchronization challenges associated with hardware limitations encountered during subsequent trials. The overarching aim is to strengthen the adaptability and reliability of the navigation system for the hexapod robot, particularly its performance in challenging scenarios such as search and rescue operations under dynamic and hazardous conditions.
2. METHOD
This research encompasses the creation of an autonomous navigation system tailored for a hexa-legged search and rescue robot. The system distinguishes itself by its methodical integration of perception,
localization, recognition, and motion control components. Through this cohesive framework, the robot
acquires the ability to independently navigate and execute search and rescue operations with a high degree of
efficiency. This study not only advances the field of robotics but also underscores the system's practical
significance in various scenarios, including both familiar and unfamiliar environments [6], [7].
Figure 1. Illustration of the hexapod system: (a) robot construction diagram and (b) general system diagram
Figure 2 elaborates the design layout of the robot, presenting an overhead view in Figure 2(a) and a side perspective view in Figure 2(b). The design integrates the essential components: the placement of the Dynamixel servo motors, the LiDAR, and the IMU sensor ensures optimal functionality while keeping the layout compact and visually coherent. The 18 servo motors dedicated to leg movements, together with the two motors allocated to the gripper and lifter modules, are distributed to support the robot's stability and versatility. The LiDAR and IMU sensors are positioned to enable comprehensive environmental perception and precise localization, enhancing the robot's navigational capabilities in various scenarios. This design reflects consideration of both the technical requirements and the overall form factor.
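As a rough illustration of this actuator allocation, the Python snippet below sketches one possible ID map for the 18 leg servos and the two module servos; the IDs and joint names are assumptions for exposition, not the robot's actual Dynamixel configuration.

```python
# Hypothetical actuator map: six legs with three joints each (18 servos),
# plus two servos for the gripper and lifter modules.
LEG_JOINTS = {
    f"leg_{leg}": {"coxa": 3 * leg + 1, "femur": 3 * leg + 2, "tibia": 3 * leg + 3}
    for leg in range(6)
}
MODULE_JOINTS = {"gripper": 19, "lifter": 20}
```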
Figure 2. Hexapod robot design: (a) overhead view and (b) side perspective view
The robot's steps are driven with a step motion designed to be smooth: the trajectory follows a curve formed by a third-order polynomial equation. A gait pattern is then used to maintain the robot's balance during movement. This pattern is a sequential arrangement of the robot's legs that allows the robot to move dynamically [15]–[18].
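A minimal sketch of such a trajectory generator is given below, assuming zero start and end velocities so the foot accelerates and decelerates smoothly; the boundary conditions and timing are illustrative assumptions, not the authors' exact parameters.

```python
import numpy as np

def cubic_coefficients(q0, qf, v0=0.0, vf=0.0, T=1.0):
    """Coefficients of q(t) = a0 + a1*t + a2*t^2 + a3*t^3 satisfying
    q(0) = q0, q(T) = qf, q'(0) = v0, q'(T) = vf."""
    a0, a1 = q0, v0
    a2 = (3.0 * (qf - q0) - (2.0 * v0 + vf) * T) / T**2
    a3 = (-2.0 * (qf - q0) + (v0 + vf) * T) / T**3
    return a0, a1, a2, a3

def cubic_step(q0, qf, T=1.0, samples=50):
    """Sample a smooth third-order polynomial step trajectory."""
    a0, a1, a2, a3 = cubic_coefficients(q0, qf, T=T)
    t = np.linspace(0.0, T, samples)
    return a0 + a1 * t + a2 * t**2 + a3 * t**3

# Example: lift a foot 4 cm over 0.5 s, e.g., for a swing phase.
z_profile = cubic_step(0.0, 0.04, T=0.5)
```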
Commonly used gait patterns include the tripod gait, ripple gait, and wave gait, depicted in Figure 4. In the tripod gait, three legs support the body while the other three step, as shown in Figure 4(a). In the ripple gait, two legs step alternately, as shown in Figure 4(b). In the wave gait, a single leg steps at a time, as shown in Figure 4(c) [19]–[21].
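The three patterns can be summarized as swing-leg sequences. The sketch below encodes them in Python; the leg numbering and the exact groupings are illustrative assumptions, since the paper does not specify them.

```python
# Legs indexed 0-5: 0/1/2 = left front/middle/rear, 3/4/5 = right front/middle/rear.
GAITS = {
    "tripod": [[0, 4, 2], [3, 1, 5]],          # two alternating groups of three legs
    "ripple": [[0, 4], [1, 5], [2, 3]],        # two legs swing at a time
    "wave":   [[0], [1], [2], [3], [4], [5]],  # one leg swings at a time
}

def swing_sequence(gait, cycles=1):
    """Yield the legs in swing phase at each step of the gait cycle;
    all remaining legs stay in stance to support the body."""
    for _ in range(cycles):
        for group in GAITS[gait]:
            yield group

for legs in swing_sequence("tripod", cycles=2):
    print("swing legs:", legs)
```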
Figure 4. Gait patterns: (a) tripod gait, (b) ripple gait, and (c) wave gait
The Hector SLAM algorithm is a scan matching-based technique that aligns incoming LiDAR scans with the map using the estimated platform pose. Each scan is transformed into a point cloud of endpoint positions, taking the estimated platform orientation into account. Pre-processing steps, such as point reduction or outlier removal, can be applied to the point cloud depending on the scenario. In the approach used here, filtering is performed solely on the z-coordinate of the endpoint positions, so that only endpoints within the desired scanning-plane threshold participate in the scan matching process [22].
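A minimal sketch of this scanning-plane filter is shown below; the threshold values are assumptions, since the paper does not state them.

```python
import numpy as np

def filter_scan_endpoints(endpoints, z_min=-0.05, z_max=0.05):
    """Keep only endpoints whose z-coordinate lies within the desired
    scanning-plane band; endpoints is an (N, 3) array of x, y, z."""
    endpoints = np.asarray(endpoints)
    mask = (endpoints[:, 2] >= z_min) & (endpoints[:, 2] <= z_max)
    return endpoints[mask]
```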
The technique presented in Figure 6 applies bilinear filtering to the occupancy grid map in order to interpolate the value of a point $P_m$. This method approximates both the occupancy value $M(P_m)$ and the gradient $\nabla M(P_m) = \left(\frac{\partial M}{\partial x}(P_m), \frac{\partial M}{\partial y}(P_m)\right)$ using the four closest integer grid coordinates $P_{00}$, $P_{01}$, $P_{10}$, and $P_{11}$, as depicted in Figure 6(a). Through linear interpolation along the x- and y-axes, the approximations for $M(P_m)$ and its partial derivatives are derived:

$$M(P_m) \approx \frac{y - y_0}{y_1 - y_0}\left(\frac{x - x_0}{x_1 - x_0}\,M(P_{11}) + \frac{x_1 - x}{x_1 - x_0}\,M(P_{01})\right) + \frac{y_1 - y}{y_1 - y_0}\left(\frac{x - x_0}{x_1 - x_0}\,M(P_{10}) + \frac{x_1 - x}{x_1 - x_0}\,M(P_{00})\right)$$

$$\frac{\partial M}{\partial x}(P_m) \approx \frac{y - y_0}{y_1 - y_0}\bigl(M(P_{11}) - M(P_{01})\bigr) + \frac{y_1 - y}{y_1 - y_0}\bigl(M(P_{10}) - M(P_{00})\bigr)$$

$$\frac{\partial M}{\partial y}(P_m) \approx \frac{x - x_0}{x_1 - x_0}\bigl(M(P_{11}) - M(P_{10})\bigr) + \frac{x_1 - x}{x_1 - x_0}\bigl(M(P_{01}) - M(P_{00})\bigr)$$
Note that the sample points, i.e., the grid cells of the map, lie on a regular grid with a spacing of 1 in map coordinates. This regularity simplifies the gradient approximations above, streamlining the calculation. Figure 6(b) visualizes the resulting occupancy grid map and its spatial derivatives [22], [23].
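The following Python sketch implements these three approximations directly, exploiting the unit grid spacing noted above; the row-major map indexing is an assumption of this illustration.

```python
import numpy as np

def interpolate_map(M, x, y):
    """Bilinearly interpolate the occupancy value M(Pm) and its gradient
    at the continuous map coordinate (x, y). M is indexed as M[y, x]
    and its cells are spaced 1 apart, as in the text."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1, y1 = x0 + 1, y0 + 1
    u, v = x - x0, y - y0  # fractional offsets, since x1 - x0 = y1 - y0 = 1

    m00, m10 = M[y0, x0], M[y0, x1]   # P00, P10
    m01, m11 = M[y1, x0], M[y1, x1]   # P01, P11

    value = v * (u * m11 + (1 - u) * m01) + (1 - v) * (u * m10 + (1 - u) * m00)
    d_dx = v * (m11 - m01) + (1 - v) * (m10 - m00)
    d_dy = u * (m11 - m10) + (1 - u) * (m01 - m00)
    return value, np.array([d_dx, d_dy])
```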
Figure 6. Bilinear filtering of the occupancy grid map: (a) point $P_m$, whose value is to be interpolated, and (b) occupancy grid map and spatial derivatives
The navstack_pub system supports optimized path planning and responsive control, ensuring efficient and reliable performance in various scenarios. It demonstrates the successful integration of state-of-the-art algorithms and mapping techniques within the Robot Operating System (ROS), providing a robust and versatile solution for robot navigation and path planning. The system's capabilities have been validated through comprehensive simulations and real-world experiments, showcasing its efficacy and potential for a wide range of applications in autonomous robotic systems [23], [29]–[31].
3.2. Navigation
With this program implemented, the hexapod robot can navigate effectively using the Navstack_pub system within an environment previously mapped by hector_slam, as depicted in Figure 10. Navigation is further supported by sensor data from the IMU and LiDAR. The communication established between these interconnected nodes is the cornerstone of the robot's navigation, enabling the hexapod to execute its navigation tasks with coordination, precision, and efficiency.
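As an illustration of how such a navigation goal is typically issued in ROS, the sketch below sends a 2D pose goal through the standard move_base action interface; whether navstack_pub exposes exactly this action name is an assumption here.

```python
#!/usr/bin/env python
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

def send_goal(x, y, frame="map"):
    """Send a 2D navigation goal in the map frame built by hector_slam."""
    client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = frame
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    goal.target_pose.pose.orientation.w = 1.0  # face along the map x-axis

    client.send_goal(goal)
    client.wait_for_result()
    return client.get_state()

if __name__ == "__main__":
    rospy.init_node("hexapod_goal_sender")
    send_goal(2.0, 0.0)  # e.g., a target 2 m ahead of the map origin
```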
For a quantitative summary, consult Table 3, which systematically presents the hector_SLAM algorithm's performance in dual-obstacle scenarios. Figure 13 visualizes the corresponding mapping results, illustrating the algorithm's ability to navigate and represent environments containing two obstacles.
Navigation testing in this arena verifies the successful execution of the robot within a designed arena that simulates various road conditions and the predefined paths in the main program. The robot is commanded to traverse obstacles such as cracked paths, inclined surfaces, rocks, and marbles toward targets 1, 2, and 3, as depicted in Figure 15. This testing approach ensures that the robot's implementation aligns with the intended functionality across diverse terrain challenges and predefined routes.
To present the findings in a structured manner, the navigation accuracy results are captured in Table 4. The table lists the target coordinates in meters alongside the resulting positions on the x and y axes, together with the calculated errors on both axes, providing a comprehensive depiction of the accuracy achieved by the navigation system.
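As a sketch of how such per-axis errors can be computed, the snippet below reports the absolute deviation on each axis and, where the commanded coordinate is nonzero, the deviation as a percentage of it; whether the paper normalizes its percentage errors this way is an assumption.

```python
def axis_errors(target, actual):
    """Per-axis absolute error (m) and percentage error relative to the
    commanded coordinate, for target/actual (x, y) pairs in meters."""
    errors = {}
    for axis, t, a in zip(("x", "y"), target, actual):
        abs_err = abs(a - t)
        pct_err = 100.0 * abs_err / abs(t) if t != 0 else None
        errors[axis] = (abs_err, pct_err)
    return errors

# Example: target (2.0, 0.0) m, measured (2.024, 0.003) m.
print(axis_errors((2.0, 0.0), (2.024, 0.003)))
```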
4. CONCLUSION
The navigation system was successfully implemented on the hexapod robot using Navstack_Pub, based on the mapping results from Hector_SLAM. The mapping process employed LiDAR as the sole sensor for mapping and localization. The LiDAR mapping results, at a pixel resolution of 2.5 cm, exhibit an average error of 0.21%; for circular or rounded surfaces, the average error is higher, at 5.34%.
The accuracy of the navigation system holds promise, especially within a 2-meter range, showing an
average error of 1.2% on the x-axis and 0.011% on the y-axis during straight-line motion. The system's
repeatability is reliable, evident in its ability to return to the home position with a minimal average error of
4.33 cm on the x-axis and 0.5 cm on the y-axis, as demonstrated in the repeatability table.
Furthermore, the implementation testing within a designated arena showcased the robot's capability
to navigate through various obstacles like cracked paths, inclined surfaces, rocks, and marbles. The robot
successfully navigated these challenges and reached its intended destinations during the initial trials. In a
subsequent iteration, however, the robot encountered difficulty reaching target 3 due to synchronization
issues attributed to hardware limitations. Notably, the Raspberry Pi 4 CPU reached usage rates of up to 97%,
affecting program synchronization and resulting in navigational challenges.
These results collectively underscore the successful integration of the navigation system,
highlighting its potential for various applications. The achieved accuracy, repeatability, and obstacle
navigation capability substantiate its effectiveness in real-world scenarios.
ACKNOWLEDGEMENTS
The authors express their sincere gratitude to Politeknik Manufaktur Bandung for sponsoring and financially supporting the development of the robot used in this research; without this generous support, the project would not have been possible. The guidance and assistance of the institution's faculty and staff were instrumental in executing the research, and the collaboration and coordination with its various departments and facilities played a pivotal role in implementing the robot and carrying out the experiments. The authors also extend their appreciation to all individuals who directly or indirectly supported the project; their technical, logistical, and intellectual contributions have greatly enriched this work.
REFERENCES
[1] Z. Chen, J. Li, S. Wang, J. Wang, and L. Ma, “Flexible gait transition for six wheel-legged robot with unstructured terrains,”
Robotics and Autonomous Systems, vol. 150, Apr. 2022, doi: 10.1016/j.robot.2021.103989.
[2] Z. Chen, J. Liu, and F. Gao, “Real-time gait planning method for six-legged robots to optimize the performances of terrain
adaptability and walking speed,” Mechanism and Machine Theory, vol. 168, Feb. 2022, doi:
10.1016/j.mechmachtheory.2021.104545.
[3] J. Chen, Z. Liang, Y. Zhu, and J. Zhao, “Improving kinematic flexibility and walking performance of a six-legged robot by rationally designing leg morphology,” Journal of Bionic Engineering, vol. 16, no. 4, pp. 608–620, Jul. 2019, doi: 10.1007/s42235-019-0049-9.
[4] B. Qin, Y. Gao, and Y. Bai, “Sim-to-real: six-legged robot control with deep reinforcement learning and curriculum learning,” in
2019 4th International Conference on Robotics and Automation Engineering, Nov. 2019, pp. 1–5, doi:
10.1109/ICRAE48301.2019.9043822.
[5] M. Schilling, K. Konen, and T. Korthals, “Modular deep reinforcement learning for emergent locomotion on a six-legged robot,”
in Proceedings of the IEEE RAS and EMBS International Conference on Biomedical Robotics and Biomechatronics, Nov. 2020,
pp. 946–953, doi: 10.1109/BioRob49111.2020.9224332.
[6] F. Rubio, F. Valero, and C. Llopis-Albert, “A review of mobile robots: Concepts, methods, theoretical framework, and
applications,” International Journal of Advanced Robotic Systems, vol. 16, no. 2, Mar. 2019, doi: 10.1177/1729881419839596.
[7] A. Nguyen, N. Nguyen, K. Tran, E. Tjiputra, and Q. D. Tran, “Autonomous navigation in complex environments with deep
multimodal fusion network,” in IEEE International Conference on Intelligent Robots and Systems, Oct. 2020, pp. 5824–5830, doi:
10.1109/IROS45743.2020.9341494.
[8] Y. D. V. Yasuda, L. E. G. Martins, and F. A. M. Cappabianco, “Autonomous visual navigation for mobile robots: A systematic
literature review,” ACM Computing Surveys, vol. 53, no. 1, pp. 1–34, Feb. 2020, doi: 10.1145/3368961.
[9] S. A. S. Mohamed, M. H. Haghbayan, T. Westerlund, J. Heikkonen, H. Tenhunen, and J. Plosila, “A survey on odometry for
autonomous navigation systems,” IEEE Access, vol. 7, pp. 97466–97486, 2019, doi: 10.1109/ACCESS.2019.2929133.
[10] T. H. Siagian, P. Purhadi, S. Suhartono, and H. Ritonga, “Social vulnerability to natural hazards in Indonesia: driving factors and
policy implications,” Natural Hazards, vol. 70, no. 2, pp. 1603–1617, Oct. 2014, doi: 10.1007/s11069-013-0888-3.
[11] A. Boucher et al., “The AROUND project: adapting robotic disaster response to developing countries,” Nov. 2009, doi:
10.1109/SSRR.2009.5424156.
[12] T. Yoshiike et al., “The experimental humanoid robot E2-DR: design for inspection and disaster response in industrial
environments,” IEEE Robotics and Automation Magazine, vol. 26, no. 4, pp. 46–58, Dec. 2019, doi:
10.1109/MRA.2019.2941241.
[13] S. Park, Y. Oh, and D. Hong, “Disaster response and recovery from the perspective of robotics,” International Journal of
Precision Engineering and Manufacturing, vol. 18, no. 10, pp. 1475–1482, Oct. 2017, doi: 10.1007/s12541-017-0175-4.
[14] R. Siegwart et al., “Legged and flying robots for disaster response,” 2015, doi: 10.3929/ethz-a-010643912.
[15] F. Delcomyn and M. E. Nelson, “Architectures for a biomimetic hexapod robot,” Robotics and Autonomous Systems, vol. 30, no.
1, pp. 5–15, Jan. 2000, doi: 10.1016/S0921-8890(99)00062-7.
[16] U. Saranli, M. Buehler, and D. E. Koditschek, “Design, modeling and preliminary control of a compliant hexapod robot,” in
Proceedings-IEEE International Conference on Robotics and Automation, 2000, vol. 3, pp. 2589–2596, doi:
10.1109/ROBOT.2000.846418.
[17] U. Saranli, M. Buehler, and D. E. Koditschek, “RHex: a simple and highly mobile hexapod robot,” International Journal of
Robotics Research, vol. 20, no. 7, pp. 616–631, Jul. 2001, doi: 10.1177/02783640122067570.
[18] J. Coelho, F. Ribeiro, B. Dias, G. Lopes, and P. Flores, “Trends in the control of hexapod robots: a survey,” Robotics, vol. 10, no.
3, Aug. 2021, doi: 10.3390/robotics10030100.
[19] W. Chen, G. Ren, J. Zhang, and J. Wang, “Smooth transition between different gaits of a hexapod robot via a central pattern
generators algorithm,” Journal of Intelligent and Robotic Systems: Theory and Applications, vol. 67, no. 3–4, pp. 255–270, Mar.
2012, doi: 10.1007/s10846-012-9661-1.
[20] M. A. Lewis, A. H. Fagg, and G. A. Bekey, “Genetic algorithms for gait synthesis in a hexapod robot,” in World Scientific Series
in Robotics and Intelligent Systems, WORLD SCIENTIFIC, 1994, pp. 317–331.
[21] D. Li, Y. Gao, W. Wei, and X. Liu, “Terrain adaptation of hexapod robot based on ground detection and sliding mode control,” in
ACM International Conference Proceeding Series, Jun. 2022, pp. 59–65, doi: 10.1145/3556267.3556271.
[22] S. Kohlbrecher, O. Von Stryk, J. Meyer, and U. Klingauf, “A flexible and scalable SLAM system with full 3D motion
estimation,” in 9th IEEE International Symposium on Safety, Security, and Rescue Robotics, Nov. 2011, pp. 155–160, doi:
10.1109/SSRR.2011.6106777.
[23] X. Zhang, J. Lai, D. Xu, H. Li, and M. Fu, “2D LiDAR-based SLAM and path planning for indoor rescue using mobile robots,”
Journal of Advanced Transportation, vol. 2020, pp. 1–14, Nov. 2020, doi: 10.1155/2020/8867937.
[24] F. H. Ajeil, I. K. Ibraheem, A. T. Azar, and A. J. Humaidi, “Autonomous navigation and obstacle avoidance of an omnidirectional
mobile robot using swarm optimization and sensors deployment,” International Journal of Advanced Robotic Systems, vol. 17, no.
3, May 2020, doi: 10.1177/1729881420929498.
[25] O. C. Barawid, A. Mizushima, K. Ishii, and N. Noguchi, “Development of an autonomous navigation system using a two-dimensional laser scanner in an orchard application,” Biosystems Engineering, vol. 96, no. 2, pp. 139–149, Feb. 2007, doi: 10.1016/j.biosystemseng.2006.10.012.
[26] J. Li, H. Qin, J. Wang, and J. Li, “OpenStreetMap-based autonomous navigation for the four wheel-legged robot via 3D-LiDAR
and CCD camera,” IEEE Transactions on Industrial Electronics, vol. 69, no. 3, pp. 2708–2717, Mar. 2022, doi:
10.1109/TIE.2021.3070508.
[27] A. A. Zhilenkov, S. G. Chernyi, S. S. Sokolov, and A. P. Nyrkov, “Intelligent autonomous navigation system for UAV in
randomly changing environmental conditions,” Journal of Intelligent and Fuzzy Systems, vol. 38, no. 5, pp. 6619–6625, May
2020, doi: 10.3233/JIFS-179741.
[28] A. N. A. Rafai, N. Adzhar, and N. I. Jaini, “A review on path planning and obstacle avoidance algorithms for autonomous mobile
robots,” Journal of Robotics, vol. 2022, pp. 1–14, Dec. 2022, doi: 10.1155/2022/2538220.
[29] W. Hess, D. Kohler, H. Rapp, and D. Andor, “Real-time loop closure in 2D LIDAR SLAM,” in Proceedings - IEEE International
Conference on Robotics and Automation, May 2016, pp. 1271–1278, doi: 10.1109/ICRA.2016.7487258.
[30] X. Yang, “Slam and navigation of indoor robot based on ROS and LiDAR,” Journal of Physics: Conference Series, vol. 1748, no.
2, Jan. 2021, doi: 10.1088/1742-6596/1748/2/022038.
[31] S. Kohlbrecher, O. Von Stryk, J. Meyer, and U. Klingauf, “A flexible and scalable SLAM system with full 3D motion estimation,” in 9th IEEE International Symposium on Safety, Security, and Rescue Robotics, Nov. 2011, pp. 155–160, doi: 10.1109/SSRR.2011.6106777.