A Review of Perception Sensors, Techniques, and Hardware Architectures For
ARTICLE INFO

Keywords: Non-cooperative obstacle avoidance; Perception sensor; Low-altitude surveillance; Autonomous UAV

ABSTRACT

Unmanned Aerial Vehicles (UAVs) can detect and communicate with cooperative obstacles through established protocols. However, non-cooperative obstacles pose a significant threat to UAVs during low-flight operations. These obstacles include static obstacles like buildings, trees, or communication towers and dynamic objects like other UAVs. The application of autonomous UAVs in low-altitude surveillance has motivated research into non-cooperative local obstacle avoidance. This paper provides an overview of such solutions proposed within the last decade. Unlike most literature that limits obstacle avoidance to algorithms, this work provides an in-depth review of the obstacle avoidance components, namely the perception sensors, techniques, and hardware architecture of the obstacle avoidance system. This review categorizes the non-cooperative obstacle avoidance techniques into four groups: gap-based methods, geometric methods, repulsive force-based methods, and Artificial Intelligence (AI) based methods. This paper provides a comprehensive resource for researchers working on collision-free surveillance by autonomous UAVs at low altitudes.
* Corresponding author.
E-mail address: [email protected] (N. Nasir).
https://doi.org/10.1016/j.robot.2024.104629
Received 3 August 2023; Received in revised form 2 December 2023; Accepted 2 January 2024
Available online 3 January 2024
0921-8890/© 2024 Elsevier B.V. All rights reserved.
M.Z. Butt et al., Robotics and Autonomous Systems 173 (2024) 104629

1. Introduction

UAVs have the remarkable capacity to access dangerous or difficult-to-reach areas. The development of these UAV platforms has several advantages, and they are widely used in numerous applications worldwide. As of November 2020, there were 1.7 million registered UAVs in the USA, according to the Federal Aviation Administration (FAA), and in the same year, China had 392,000 registered UAVs with more than 1.25 million flight hours [11]. With a global market value of $127 billion, the UAV market is predicted to expand at a Compounded Annual Growth Rate (CAGR) of 16.2 % [3,14]. Furthermore, market analysts have given optimistic forecasts regarding the growth of UAV payload markets. Cameras, Inertial Measurement Units (IMUs), radar, LiDAR, communications devices, weapons, and other equipment are examples of UAV payloads. By 2027, the drone payload market is anticipated to grow at a compound yearly growth rate of 12.22 %, reaching around $13.53 billion [17].

UAVs can traverse remote and inaccessible dynamic environments better than Unmanned Vehicles (UVs); this feature increases the likelihood of a collision. Based on statistical data, human error accounts for 80 % of air traffic accidents, with 53 % of these incidents attributed to individual pilot errors [19,20]. Autonomous flight systems can potentially reduce these human errors and improve overall safety. UAVs are a category of aircraft that can fly autonomously. The notion of autonomous UAVs raises the question of how their flight control is achieved. Different methods exist to control the flight of UAVs: remotely piloted, semi-autonomous, and fully autonomous. A remotely piloted UAV is used in [23], where a ground-based pilot uses visual feedback to control the UAV's speed, location, and direction. The UAV must, however, be in the pilot's line of sight for this to work. As an alternative, the UAV can be remotely operated from a command-and-control center, where a telemetry device is used to transmit the flight parameters and position of the UAV as determined by onboard sensors.

With the help of onboard sensors (e.g., GPS module, barometer, magnetic compass, accelerometer, gyroscope), a flight controller controls a semi-autonomous UAV's speed, heading, and altitude [27]. However, the waypoints and flight parameters are pre-programmed into the flight controller by the ground control station, so in semi-autonomous flights, UAVs still cannot make decisions such as path planning and obstacle avoidance on their own. A completely autonomous UAV conducts flight operations without receiving guidance from a ground control station. It can make intelligent real-time decisions while in flight. This intelligent
decision-making uses an onboard companion computer (e.g., Raspberry Pi, Jetson Nano, Intel NUC, ODROID XU4) based on feedback from perception sensors like a camera or LiDAR. These intelligent decisions include essential capabilities like navigation, path optimization, localization, and motion planning. Yet, obstacle avoidance is the most critical for low-altitude surveillance applications.

The capacity to navigate to a predetermined destination while avoiding obstacles obstructing the path is an essential element of autonomous flight. As stated in [31], low-flying autonomous UAVs rely on their Obstacle Avoidance System, specifically designed to meet operational needs. As shown in Fig. 1, operating UAVs at lower altitudes causes more obstruction and reduces operational speed compared to flying at higher altitudes. Most low-altitude UAV applications have a maximum permitted flight level of 122 m [4]. This study focuses on multi-rotor UAV surveillance in low-altitude flight. A robust obstacle avoidance system is necessary for safe operation in all weather conditions. A UAV encounters unanticipated obstacles when flying at low altitude, so the obstacle avoidance system must be quick and effective. Additionally, the system must recognize and avoid obstacles even without knowledge of the local obstacle map.

Obstacle avoidance can be divided into cooperative and non-cooperative categories depending on the extent of cooperation between the flying agents. In cooperative obstacle avoidance, UAVs work together to avoid collisions or share information about their position, altitude, and heading with a third party. According to [11], there are three techniques for cooperative obstacle avoidance: the Traffic Collision Avoidance System (TCAS), the Automatic Dependent Surveillance-Broadcast (ADS-B), and the Flight Alarm (FLARM). These techniques allow the UAVs to convey warning information and negotiate their trajectories to prevent potential collisions. However, when operating at low altitudes, UAVs may collide with other UAVs and a range of non-cooperative obstacles, including structures, trees, and cables. Thus, cooperative techniques have limitations against non-cooperative obstacles at low altitudes. On the other hand, non-cooperative obstacle avoidance relies on individual UAVs sensing and reacting to their surroundings without coordinating with other flying agents. As a result of these limitations, a lot of research has been pursued in the area of non-cooperative local obstacle avoidance for low-altitude surveillance.

As per the findings in [39], non-cooperative obstacle avoidance at low altitudes can be segregated into two categories: global obstacle avoidance and local obstacle avoidance. Global path planning establishes the optimal route between the starting and ending points. Various algorithms have been formulated over the past few years to facilitate global and local UAV path planning. The study in [25] demonstrates that prior knowledge of the exact map of the environment is critical for global path planning. Many path planning techniques, including heuristic pathfinding [41], Rapidly exploring Random Tree (RRT) [42], breadth-first search, and the A* algorithm [43], construct the best flight path to avoid obstacles while knowing the map of the environment. These global path planning algorithms seek to identify the most effective route that minimizes flight time and avoids collisions with obstacles. Global obstacle avoidance techniques have several limitations when it comes to low-altitude surveillance: they can be prone to errors as they rely on maps, which can be computationally expensive to build and update, and latency in map updating can increase the likelihood of collisions.

In contrast, localized path planning algorithms require continuous monitoring of the UAV's proximity. Without prior knowledge of the area map, the next action vector is decided based on feedback from the perception sensor to accomplish the goal of real-time obstacle avoidance. Local obstacle avoidance methods incorporate various solutions, such as gap-based, geometric, repulsive-force, and AI-based obstacle avoidance techniques. Gap-based methodologies, such as open-sector collision avoidance [8], the Vector Field Histogram (VFH) [44], and the Nearness Diagram (ND) [3], exploit the presence of localized gaps between obstacles to facilitate UAV movement through complex environments. Geometric approaches like the collision cone [45] and velocity obstacle methods [46] consider the obstacles' shape, location, and relative velocity to avoid a collision. Repulsive-force techniques, such as the Potential Field (PF) algorithm [25] and the Artificial Potential Field (APF) algorithm [47], generate repulsive force fields to avoid obstacles and attractive force functions to move toward the goal. AI-based techniques, specifically those based on machine learning, depend on training datasets to effectively train Convolutional Neural Networks (CNNs) [48] and Deep Neural Networks (DNNs) [49] for obstacle avoidance. Deep Reinforcement Learning (DRL) [50] is another AI-based technique that uses data from perception sensors rather than a dataset to prevent collisions. Finally, the multi-layered obstacle avoidance method in [51] employs both global and local path planners to offer obstacle avoidance trajectories that are both energy-efficient and optimal.
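To make the repulsive/attractive-force idea behind PF and APF methods concrete, a single 2D potential-field step can be sketched as below. This is a minimal illustration, not the formulation of any cited work; the gains `k_att`, `k_rep` and the influence radius `d0` are illustrative values chosen for the example.

```python
import numpy as np

def apf_step(pos, goal, obstacles, k_att=1.0, k_rep=100.0, d0=5.0):
    """One Artificial Potential Field step: an attractive pull toward the
    goal plus a repulsive push away from each obstacle within range d0."""
    pos, goal = np.asarray(pos, float), np.asarray(goal, float)
    # Attractive force: proportional to the vector from the UAV to the goal.
    force = k_att * (goal - pos)
    for obs in obstacles:
        diff = pos - np.asarray(obs, float)
        d = np.linalg.norm(diff)
        if 1e-9 < d < d0:
            # Repulsive force grows as the UAV approaches the obstacle
            # and vanishes beyond the influence radius d0.
            force += k_rep * (1.0 / d - 1.0 / d0) / d**2 * (diff / d)
    return force

# Example: goal straight ahead, one obstacle slightly off the direct path.
f = apf_step(pos=(0.0, 0.0), goal=(10.0, 0.0), obstacles=[(4.0, 0.5)])
print(f)  # net force points toward the goal while veering away from the obstacle
```

The returned vector would typically be normalized and fed to the velocity controller; the well-known local-minimum problem of this formulation is exactly what variants such as PF-IPA [25] try to address.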
Fig. 1. UAV collision risk at low altitude [4] (see Section 2 for UAV’s low-altitude application areas).
This paper reviews non-cooperative and local obstacle avoidance strategies for UAVs and focuses primarily on collision-free navigation systems. In contrast to previous reviews, as shown in Table 1, this work provides an in-depth assessment of the obstacle avoidance components. To our knowledge, the existing literature reviews have a limited scope: they do not compare the different obstacle avoidance strategies and hardware designs used in the reviewed literature. For example, the previous reviews presented in [4,52] focus on future directions for the utilization of UAVs in potential applications, [53] presents sensors and a classification of obstacle avoidance techniques from a broader perspective, and [31] provides a detailed insight into swarms of multiple UAVs using computer vision algorithms. The key contributions of this paper are as follows:

• A review of perception sensors for non-cooperative and local obstacle avoidance, including their classification, advantages, and drawbacks.
• A structured review and categorization of the various non-cooperative local obstacle avoidance techniques into four groups.
• An overview for researchers of the hardware architectures used in low-altitude local obstacle avoidance techniques.

The rest of the paper is organized as follows. The numerous UAV applications that demand low-altitude operation are described in Section 2. Section 3 describes the sensing systems used by UAVs to avoid obstacles; sensing systems comprise active sensors, passive sensors, and a fusion of both. The obstacle avoidance techniques used to date in low-altitude surveillance are reviewed in detail in Section 4. Hardware architectures used in the reviewed literature for collision-free navigation solutions are presented in Section 5. The discussion and research directions for the future are presented in Section 6.

Table 1
Summary of previous reviews.

[4] Core area: a survey on UAV civil applications and their future research challenges. Scope and limitations: the main topics covered are UAV-based application areas and their research challenges; perception sensors, hardware architecture, and collision avoidance techniques are not covered.
[54] Core area: reviews computer vision algorithms and their utilization in intelligent applications. Scope and limitations: only covers vision-based systems; hardware architecture and non-visual obstacle avoidance systems are not covered; low-altitude obstacle avoidance systems are not discussed.
[52] Core area: UAV classification, application areas, controllers, limitations, and future challenges are discussed. Scope and limitations: UAV applications in different sectors are the main topic covered in this review; perception sensors, obstacle avoidance, and the hardware architecture of obstacle avoidance are not included.
[53] Core area: reviews the sensors and obstacle avoidance strategies for Unmanned Aerial Vehicles (UAVs). Scope and limitations: discusses perception sensors and obstacle avoidance techniques from a broader perspective; hardware architecture for low-altitude surveillance applications is not covered.
[31] Core area: UAV obstacle avoidance approaches are reviewed, applicable specifically to a team or a swarm of UAVs. Scope and limitations: path planning and multi-UAV obstacle avoidance trajectory planning are the main scope of the review; perception sensors and hardware architecture for low-altitude surveillance UAVs are not covered.

2. UAV low-altitude application areas

In the age of Industry 4.0, there is a trend towards intelligent automation, which includes using UAVs in various practical applications alongside other types of robots. UAVs have been extensively employed in low-flight applications, benefiting from their agility, versatility, and ability to operate at lower altitudes. These unmanned aerial systems offer a range of advantages in such scenarios, including enhanced situational awareness, efficient data collection, and reduced risk to human operators. Some of the relevant application areas for UAVs involving low-flight operation include surveillance [6], security [55], search and rescue [32], construction [21] and infrastructure inspection [25], disaster response, precision agriculture [30,39,56,57], delivery of goods [38], traffic monitoring [7], and wireless network coverage [18]. UAVs are presently utilized for a wide range of tasks [52], and their usefulness in novel application areas cannot be dismissed from a future perspective [58]. The diverse sectors that employ UAVs for various uses are listed in Table 2. As suggested in [4], the increasing demand for UAVs is expected to create 100,000 new jobs by the end of 2025, leading to an increased number of UAVs worldwide.

The UAV must eventually descend into a low-altitude flight phase to be used for any of these applications. The collision threats in these application areas that could endanger the UAV or the surveillance site's safety include trees, windmills, wooden electric poles, small houses, communication towers, tall buildings, power lines, bridges, and other UAVs. The types of obstacles encountered during low-altitude flight depend on the application area, as mentioned in Table 2.

3. Perception sensors for non-cooperative obstacle avoidance

Perception sensors used for obstacle avoidance are an essential part of low-altitude navigation. Other onboard sensors include GPS, IMUs, barometers, altimeters, and telemetry sensors. The three categories of perception sensors used for obstacle avoidance are visual, non-visual, and sensor fusion. Visual sensing is known for its low cost, low payload, and wide field of view [59]. The primary mode of operation for visual sensors is passive sensing. Various sensors can capture visual data, including RGB, stereo, color infrared, multispectral, hyperspectral, and thermal cameras. The second category comprises non-visual sensors that operate on the principle of active sensing. These sensors emit their own energy to illuminate the surrounding environment and receive the reflected signal rather than relying on the energy emitted by surrounding structures or obstacles. Active sensors include radar, LiDAR, and ultrasonic sensors. Integrating active and passive sensors, known as sensor fusion, constitutes the third category of perception sensors. This method aims to improve the precision and dependability of obstacle identification and avoidance by combining the strengths of visual and non-visual sensors. Sensor fusion techniques can make a more detailed perception of the environment possible, enabling navigation systems working in low-altitude circumstances to make more reliable decisions.

An extensive categorization of perception sensors is presented in this section, facilitating an in-depth review of their strengths and weaknesses. The reviewed literature on low-altitude obstacle avoidance utilizes a variety of visual and non-visual sensors, as depicted in Fig. 2. Visual sensing uses spectral wavelengths in the visible, near-infrared, and thermal infrared frequency regions, while millimeter-wave, radio-wave, and ultraviolet frequency bands are utilized for non-visual sensing. The selection of an optimal sensor for a given application depends upon multiple considerations, such as weight, cost, detection range, spectral resolution, computational complexity, and the processing capabilities of the onboard companion computer.

A systematic examination of the Web of Science (WoS) database is undertaken to explore the prevailing patterns of publications and countries engaged in the field of "UAV" and "sensors". As shown in Fig. 3, the search queries "UAV" and "sensors" exhibited a notable
upsurge in publications since 2014, reaching 1286 by 2022. The results indicate that China, the United States, and Germany are the primary contributors to this domain, with the most significant publication counts. This study considers scholarly literature that satisfies the inclusion criteria for multi-rotor UAVs operating at low altitudes.

Fig. 3. Results from WoS database as of March 5, 2023, show the publication trend, yearly (left) and country-wise (right), using the keywords "UAV and sensors".

Table 2
UAV's low-altitude application areas.
Application (Collision threats) | Description | Purpose | Refs.

3.1. Visual obstacle avoidance

The low cost and lightweight attributes of visual sensors have seen them employed as perception sensors for UAV applications [60]. In visual sensing, the UAV's surroundings are captured in the form of images or a video stream. Utilizing the camera, significant information such as obstacle detection, tracking [60], and depth perception [61] can be obtained. Notably, visual sensors have a wider field of view compared to other perception sensors. Moreover, they can distinguish and identify obstacles based on their color and texture information. However, employing vision perception sensors in UAVs comes with certain challenges, including variations in lighting conditions, occlusion, and motion blur [54].

This review paper categorizes the vision-based perception sensors into three categories: monocular, stereo vision systems, and RGB-D cameras. Vision-based perception sensors promise to enable advanced capabilities such as autonomous flight and obstacle avoidance in indoor and GPS-denied environments [62]. A detailed overview of the vision-based sensors applied in the reviewed articles is compiled in Table 3.

3.1.1. Monocular vision systems

A monocular vision system uses a single camera to capture images of the environment and extract relevant information about objects in the scene. The camera captures light reflected from the objects in its field of view and converts it into a digital image. Monocular cameras lack depth perception. In the cited study [60], the images undergo initial processing to extract salient features like edges, corners, and contours. Subsequently, the "Size Explanation Algorithm" is employed, using these extracted features to estimate the obstacle depth. In [62], obstacle avoidance is achieved using the "Vanishing Point algorithm" by processing the frames of a monocular camera. A CNN methodology estimates depth from an RGB monocular camera in [61,63]. Another study based on CNN in [48] predicts the collision probability and the UAV's steering angle for obstacle avoidance. A Deep Reinforcement Learning based approach is proposed in [50], learning from a minimal memory of past observations in the form of monocular images. Being based on a single-lens camera, monocular vision systems cannot directly perceive the depth of an obstacle; multiple frames are utilized for image processing to predict velocity, depth, and future position.

Another technique widely used to estimate depth from a monocular camera is the optical flow technique, based on motion parallax. This technique works by analyzing the motion of pixels in successive frames of an image sequence. The displacement of pixels between frames can be used to estimate the direction and magnitude of the motion vector, which can be interpreted as the 2D projection of the 3D motion vector of the corresponding object in the scene. In [59], the optical flow technique achieves local obstacle avoidance, as mentioned earlier; in addition, a global path planner utilizing the RRT algorithm is integrated into the system. The disadvantage of optical flow is that it can be computationally expensive and result in processing delays, mainly when used with multiple high-resolution images or complex scenes with multiple moving objects.

3.1.2. Stereo vision systems

Stereo vision systems provide depth perception and spatial information. They rely on two cameras placed a fixed distance apart to capture two images of the same scene from slightly different perspectives, enabling them to compute the distance to objects based on their disparity. Using stereo vision systems as sensors, UAVs can accurately detect and avoid obstacles in their path, improving their safety and maneuverability. However, this adds more computational cost and hardware complexity to the system. A previous study in [64] evaluated the performance of stereo vision, showing promising results; however, unlike laser ranging, it exhibited vulnerability to poor calibration under varying lighting conditions. Stereo vision is used in [65] to generate an online 3D occupancy map of the environment. A roadmap study is shown in [66] to use the Infrared (IR) Lepton sensor as a stereo vision system for UAVs to locate hot objects. An object detection algorithm is proposed in [67] based on stereo vision, which can detect targets within a 15 m range. In [68], a laser transmitter-assisted stereo vision approach is presented using a pair of Logitech® HD Webcam C310.
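The disparity-to-depth computation underlying these stereo rigs is the standard triangulation relation Z = f·B/d. The sketch below illustrates it; the focal length, baseline, and disparity values are illustrative, not taken from the cited systems.

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Triangulate depth from stereo disparity: Z = f * B / d.
    A larger pixel disparity between the left and right images
    means the object is closer to the camera pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Example: 700 px focal length, 12 cm baseline, 21 px disparity -> 4 m.
print(depth_from_disparity(21, 700, 0.12))  # 4.0
```

The relation also explains the calibration sensitivity noted in [64]: errors in the estimated focal length or baseline scale directly into the depth estimate, and small disparities (distant objects) amplify pixel-matching noise.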
Radar sensors are accurate and precise for UAV obstacle avoidance. However, they can produce false readings in cluttered environments due to multipath reflections. The limited range resolution (on the order of meters) of radar sensors [83] can also hinder the detection of small and distant obstacles. Because of radar's increased payload weight and high power requirement, its use is mainly limited to fixed-wing UAVs [84,85] flying at higher altitudes [83]. Recently, millimeter-wave radars have gained prominence due to their higher operational frequency, enabling the acquisition of high-resolution radar images of the surroundings [42,43,86]. Most millimeter-wave radars operate in the 24 to 77 GHz frequency range, but at the same time, they are more easily attenuated by atmospheric conditions than radars operating at lower frequencies. The lighter weight and compact size of millimeter-wave radars have made them suitable for obstacle avoidance on small UAVs.

3.2.3. Ultrasonic

Ultrasonic sensors are based on the principle of sound waves, which are emitted by a transducer and reflected when they encounter an obstacle. These reflected waves are then detected by the sensor and used to determine the distance and direction of the obstacle. The frequency range of ultrasonic sensors typically falls between 20 kHz and 200 kHz. Ultrasonic sensors are lightweight and cheap [69,84], making them highly suitable for use in UAVs. Ultrasonic sensors are unidirectional and are used in multi-sensor configurations [87,88] for complete 360° coverage. A previous study in [88] shows an array of 4 ultrasonic sensors fused with an IMU for indoor UAV localization in a GPS-denied environment. Similarly, an array of 12 ultrasonic sensors is fused with an IMU in [85] for collision-free autonomous indoor UAV navigation.

Ultrasonic sensors have a limited detection range, typically up to a few meters, making them unsuitable for detecting obstacles at far distances. Additionally, they can be affected by external interferences, including ambient noise and weather conditions, which can impair their performance, as concluded in [89]. Furthermore, their ability to accurately measure the size and shape of obstacles is limited, which may restrict their efficacy in specific scenarios.

To comprehensively understand the various categories of perception sensors, this paper presents a comparative analysis in Table 5. The comparison matrix emphasizes 12 key evaluation metrics deemed critical for low-altitude obstacle detection and collision avoidance. This analysis aims to facilitate the reader's comprehension and evaluation of the different sensor categories with respect to their performance capabilities.

The first metric, robustness, refers to the ability of a sensor to function accurately in diverse and challenging environments (see Table 5 for the comparison matrix). The literature review shows that active sensors such as LiDAR are more robust when compared to monocular and stereo cameras. The second metric deals with the ability to detect multiple obstacles, which depends on the sensor's Field of View (FOV). Both cameras and scanning LiDAR have a wider FOV, which enables them to detect and react to multiple obstacles. On the other hand, ultrasonic sensors with a narrower FOV cannot handle multiple obstacles in the environment. The depth perception of the sensor, the third metric in the comparison matrix, is critical in detecting obstacles in the path of a quadrotor UAV. While most sensors possess depth perception capabilities, monocular cameras cannot perceive obstacle distance; consequently, CNN-based methods are adopted to predict the obstacle distance. In contrast, a study referenced in [60] avoids estimating obstacle range and instead relies on a size expansion algorithm to avoid obstacles.

In the context of this research, the fourth metric pertains to calibration errors, identified as a significant concern for visual sensing. Calibration errors are attributed to inaccuracies in the measurement process, which can lead to incorrect depth perception and obstacle detection, ultimately resulting in collisions. As such, it is crucial …

Table 4
Active or non-visual sensors for obstacle avoidance by autonomous UAV.

LiDAR
• [8] Application: open sector-based UAV navigation system in an outdoor, unstructured environment; fully autonomous outdoor navigation with the UAV flying smoothly around obstacles. Consideration: a 2D laser scan is used, and a virtual target is defined for navigation (if the actual target is not in sight); unable to reach a goal near and in front of an obstacle, hence a hybrid approach utilizing the PF algorithm.
• [25] Application: autonomous chemical sensing aerial robot for outdoor navigation; uses a novel approach, namely Potential Field that Incorporates Past Actions (PF-IPA), for obstacle avoidance. Consideration: a 2D laser scan is used with a new PF approach to solve the famous local minima problem.
• [23] Application: obstacle avoidance system for a remotely piloted, semi-autonomous UAV. Consideration: a 2D spinning LiDAR is used; collision prediction resulted in reduced speed and increased avoidance distance.
• [78] Application: 3D LiDAR (Velodyne VLP-16 Puck) is used for obstacle avoidance; the system uses angle displacement and edge detection methods. Consideration: the system needs to be thoroughly tested with multiple obstacles.
• [76] Application: the UAV uses a Sense and Avoid (SAA) method based on the Fast Obstacle Avoidance Motion (FOAM) algorithm. Consideration: sensor fusion of a 2D spinning LiDAR and a monocular camera is used; the obstacle-cluttered scenario has only been tested in a simulated environment.

Radar
• [42] Application: Bi-directional RRT* path planning is used to create a local map based on sensor data. Consideration: millimeter-wave radar and monocular camera sensor fusion is performed using an Extended Kalman Filter (EKF); sensor fusion provides outlines and spatial information on obstacles.
• [70] Application: a MIMO (Multi-Input Multi-Output) millimeter-wave radar is used as both a mapping and obstacle-avoidance sensor. Consideration: obstacle avoidance and 3D mapping were performed in the forward direction; obstacle avoidance successfully tested with different types of single targets.
• [43] Application: an improved A* algorithm for path planning is proposed for a plant protection UAV. Consideration: data fusion of millimeter-wave radar and a monocular camera has shown improved results compared to the simple A* algorithm.

Ultrasonic
• [90] Application: Triple Awareness Fusion (TAF) is used for collision avoidance using low-cost sensors. Consideration: an array of 12 Ultrasonic (US) and 8 Infrared (IR) sensors is used for awareness of the surroundings; the scope of the study comprises low-cost sensors for fully autonomous UAV navigation.
• [88] Application: Ultrasonic and Inertial Measurement Unit (IMU) fusion-based UAV localization and obstacle avoidance in an indoor environment. Consideration: localization is based on an EKF, with four ultrasonic sensors fused with IMU readings; the computational complexity of the proposed navigation system is low; not tested indoors with electronic equipment that could interfere with the IMU's magnetic compass.
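The time-of-flight principle behind the ultrasonic ranging described above reduces to halving the round-trip travel time of the sound pulse. A minimal sketch (the 343 m/s speed of sound assumes roughly 20 °C air and is an illustrative constant, not a value from the cited works):

```python
def ultrasonic_distance_m(echo_time_s, speed_of_sound_mps=343.0):
    """Time-of-flight ranging: the pulse travels to the obstacle and
    back, so the one-way distance is half the round-trip path."""
    return speed_of_sound_mps * echo_time_s / 2.0

# A 23.3 ms echo corresponds to roughly 4 m of one-way distance at 20 degC.
print(round(ultrasonic_distance_m(0.0233), 2))  # 4.0
```

The temperature dependence of the speed of sound is one reason such sensors degrade in varying weather conditions, as noted in [89].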
… outdoor environments is limited by the potential for real-time data loss. The literature reviewed in Table 5 demonstrates a primary reliance on perception sensors to provide a collision-free solution in GPS-denied situations.

… evaluates the ability of a sensor to detect and sense the environment in 3D space. With CNN-based prediction methods, sensors such as 3D LiDAR, radar, and cameras can detect and avoid obstacles in 3D space, whereas 2D LiDAR sensors and ultrasonic sensor arrays can only detect … effectiveness of sensor technology in complex environments where 3D sensing is essential for safe navigation and obstacle avoidance. The … effective detection and avoidance of obstacles. In contrast, nonvisual … contrast, passive sensors like cameras have advantages such as low cost and a wide field of view, but they are vulnerable to environmental elements and lighting conditions [53]. The benefits of both sensor types … fusion.

Table 5. Comparison of perception sensor types (monocular camera, RGB-D, LiDAR, radar) used in the reviewed literature [8,23,25,42,43,48,50,59,60,61,64,68,69,70,71,72,76,78,88,90] in terms of computational complexity, environment dependency, and ambient-light sensitivity.

… multiple sensors allows for better fault tolerance, ensuring that the UAV can continue to operate even if one of the sensors fails. Thirdly, sensor fusion can provide better situational awareness, enabling the UAV to
make more informed decisions about its actions in real-time.

In [42], a monocular camera is fused with millimeter-wave radar with the help of an EKF to estimate the 3D location of the obstacle. This information then feeds the Bi-directional RRT* algorithm for path planning. Similarly, [43] also uses a monocular camera and millimeter-wave radar for sensor fusion, achieving improved results in unstructured farmland. A 2D LiDAR is utilized in [76] to generate a Probabilistic Occupancy Map (POM) in the horizontal plane, subsequently integrated with the visual information acquired from camera images. A study by [90] explores the fusion of two low-cost active sensors, an ultrasonic and an infrared range finder, to design an obstacle avoidance system despite their limited accuracy in providing readings. Another work in [88] utilizes the onboard IMU and an ultrasonic sensor array for environment mapping and obstacle avoidance simultaneously.

4. Non-cooperative techniques for obstacle avoidance

Non-cooperative local obstacle avoidance techniques enable UAVs to autonomously detect and avoid obstacles in their immediate vicinity to prevent collisions, particularly when the UAVs lack prior knowledge of the obstacles' flight trajectories and cooperative communication is unavailable [31]. Lack of prior knowledge about the obstacle is particularly relevant in low-flight surveillance scenarios where non-cooperative obstacles are encountered. Using non-cooperative local reactive obstacle avoidance techniques is essential to ensure UAVs' safe and efficient operation in shared airspace. One of the main advantages of non-cooperative techniques is that they allow for obstacle avoidance without prior knowledge of the obstacle map of the environment [8]. Additionally, since the UAVs are equipped with sensors that continuously sense nearby proximity [39,53], the obstacle map is updated in real-time, thus avoiding any latency issues that may result in mishaps.

This section presents a comprehensive review of non-cooperative local obstacle avoidance techniques, crucial for navigating UAVs to their final target by merging the local planner with the global path planner. These techniques are classified into four approaches: gap-based methods, geometric methods, repulsive force-based methods, and AI-based methods. Instead of reacting to obstacles, gap-based methods utilize the presence of admissible gaps between the obstacles, reducing the probability of collision in densely cluttered environments [8]. Geometric-based methods utilize the obstacle's geometry and velocity information to maneuver the UAV to prevent the potential predicted collision. Repulsive force-based methods respond to the attractive and repulsive forces generated by the target and obstacles [25]. AI-based methods use machine learning models trained on a dataset or a reinforcement learning method optimized by a defined policy that maximizes the reward function. AI-based obstacle avoidance systems use real-time situational awareness from perception sensors to make autonomous decisions.

4.1. Gap-based methods

Gap-based methods have been a viable solution for collision-free UAV navigation and pathfinding in recent years. A roadmap of feasible paths is updated based on various algorithms as the UAV traverses through the dynamic environment. Pathfinding is based on the available gaps detected around the UAV's current location. The most suitable gap for the UAV to move through is selected by a pathfinding algorithm under predefined constraints [44]. These methods have shown promising results in densely cluttered environments by avoiding static and dynamic obstacles [3,91]. Moreover, gap-based methods are promising for real-time UAV applications because of low computational requirements with fast reactive decision-making.

4.1.1. Open sector

The underlying notion of Open-Sector (OS) is to divide the UAV's proximity into open and closed sectors, as illustrated in Fig. 4 [8]. This algorithm is particularly suitable for real-time applications, as it does not require maps and is computationally efficient. Experimental results show a smooth trajectory at a relatively high speed of 3 m/s, demonstrating the algorithm's effectiveness in handling multiple obstacles. The algorithm selects the most appropriate open sector available based on the horizontal 2D laser scan of the environment to determine the action vector θV. In the first step, the UAV's circular proximity is categorized into open and closed sectors. In the second step, a virtual target is introduced to enhance the UAV's navigation capability in the presence of path traps. Even when the actual target is not in the line of sight, the UAV can avoid traps by travelling towards the virtual target instead of the actual target. The proposed method incorporates the contribution of the past m action vectors into the vector A, defined in Eq. (1), to overcome the local minima problem encountered in the PF algorithm.

A = ∑_{i=k−m}^{k} Vi   (1)

The local minima problem in the PF algorithm is addressed by incorporating the contribution of past actions in the proposed method. Eq. (2) shows how the direction of the virtual target is determined by incorporating the contribution of past actions into the actual target direction.

θϑ = Ψ(θT, θA) u + θT   (2)

The contribution of past actions is utilized to determine the direction of the virtual target action vector θϑ, with the past actions vector θA being assigned a weight u ∈ (0,1]. The proposed method only avoids obstacles within the maximum look-ahead distance da and ignores any obstacles situated outside this range. The included angle between the past actions vector θA and the target vector θT is denoted by Ψ(θT, θA). Once the suitable open sector and virtual target action vector θϑ are determined, the OS algorithm ensures safety by incorporating safety boundaries. If the minimum obstacle detection distance (rm1 and rm2) is less than the minimum safety distance rs, the calculation of the safety boundaries (∅1 and ∅2), depending on the shape and distance of the obstacle, is shown in Eq. (3) and Eq. (4).

∅1 = π/2 − cos⁻¹(rs/r1),  if rm1 > rs
∅1 = π/2 − cos⁻¹(rs/da) − k(rs − rm1),  otherwise   (3)

and

∅2 = π/2 − cos⁻¹(rs/r2),  if rm2 > rs
∅2 = π/2 − cos⁻¹(rs/da) − k(rs − rm2),  otherwise   (4)

The aggressiveness of the maneuvers performed by the aerial robot to avoid obstacles is determined by the parameter k in Eq. (3) and Eq. (4). Additionally, the maximum look-ahead distance da is influenced by sensor characteristics, environmental factors, and the dynamics of the UAV.

The final action vector θV is established after verifying that the ideal open sector satisfies the safety boundary condition. However, it should be noted that the algorithm does not account for the aerial robot's dynamics, which may reduce its effectiveness in highly dynamic environments. Moreover, in cases where there are no viable open sectors in a densely cluttered environment, the algorithm switches to the PF-IPA method, which incorporates past actions, as described in [25], to guide the UAV towards the final waypoint.

Another method, the Admissible Gap (AG) method, has emerged as a viable approach for reactive obstacle avoidance in autonomous navigation. It offers a safer, more accurate, and easier-to-implement solution than
Fig. 4. (a) Segmentation of a scan into sectors. Sectors ω1, ω3 and ω5 are identified as open sectors. Sectors ω2 and ω4 are identified as closed sectors. Each open sector has six parameters, as in the case of sector ω3: the start and stop angles θ21, θ22; the ranges at the start and stop angles r21, r22; the minimum ranges in the adjacent closed sectors r2m1, r2m2. (b) Illustration of the safety boundaries ∅1 and ∅2; the actual target vector T; the virtual target vector ϑ; the action vector V [8].
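The two-case safety-boundary computation of Eqs. (3) and (4) can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation; the argument names (`r_m`, `r_s`, `r_edge`, `d_a`, `k`) are ours.

```python
import math

def safety_boundary(r_m, r_s, r_edge, d_a, k):
    """Safety boundary angle around an open sector, per Eqs. (3)-(4).

    r_m: minimum range in the adjacent closed sector; r_s: minimum safety
    distance; r_edge: range at the sector edge; d_a: maximum look-ahead
    distance; k: aggressiveness parameter.
    """
    if r_m > r_s:
        # Obstacle outside the safety distance: purely geometric clearance.
        return math.pi / 2 - math.acos(r_s / r_edge)
    # Obstacle inside the safety distance: widen the boundary, scaled by k.
    return math.pi / 2 - math.acos(r_s / d_a) - k * (r_s - r_m)
```

A larger k widens the boundary faster as the obstacle penetrates the safety distance, which corresponds to more aggressive avoidance maneuvers.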
other gap-based methods [91]. Although the proposed AG method in [91] employs multiple laser scanners with a wider field of view, it can be applied to sensor types with a limited field of view. This method presents a novel gap detection methodology while considering the aerial robot's geometry and kinematic constraints. However, the AG approach is unsuitable for dynamic obstacles moving abruptly and in a random pattern. Because the AG method additionally takes the shape and kinematic constraints into account when exploring the admissible gaps, it is computationally expensive compared to other gap-based methods.

4.1.2. Nearness diagram

The ND navigation solution, proposed in [3], utilizes a "divide and conquer" strategy to reduce the complexity of the navigation problem. This reactive obstacle avoidance method outperforms other collision-less navigation methods in densely cluttered and troublesome scenarios. The ND method considers the obstacle distribution and the final goal location to navigate the obstacle map. Available gaps are identified using the nearness diagram from the robot's center point, as illustrated in Fig. 5. However, the ND method's performance depends on the availability of a clear sensory scan, underlining its limitations in the case of a noisy perception sensor. Additionally, the ND method solely considers circular-shaped vehicles and overlooks vehicle kinematics and dynamic limitations. In contrast, other approaches, such as the AG method described in [91], consider the robot's shape, kinematics, and dynamics, making them more adaptable to various types of robots. Additionally, as highlighted in [79], the faster a robot moves, the more distortion occurs in the LiDAR point cloud scan, indicating that gap-based methods in [3,91] that rely on clear laser scans may have limitations in applications requiring rapid navigation. Nevertheless, the issue of local traps in highly cluttered environments and U-shaped obstacles has been resolved in [3].

Another gap-based approach for motion planning of robots in constrained workspaces is presented in [92]. The authors propose a two-phase method, where the first phase involves a learning process that constructs a probabilistic roadmap for a given environment. A heuristic evaluator is used to locate difficult areas in the free Configuration Space (C-space) and enhance the roadmap density in those regions. The roadmap is an undirected graph that captures the connectivity of the free C-space. In the second phase, the roadmap is leveraged to process path-planning queries quickly. It should be noted, however, that the method assumes a static workspace and does not account for dynamic obstacles or environmental changes. Additionally, the approach is also …
Fig. 5. (a) Obstacles and gaps in the scan (b) ND from center point [3].
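The two-phase probabilistic roadmap idea of [92], a learning phase that builds the graph and a query phase that searches it, can be illustrated with a minimal sketch. The sampling set, connection radius, and collision checker below are hypothetical placeholders, not details from [92].

```python
import math
from collections import defaultdict, deque

def build_roadmap(samples, radius, collision_free):
    """Learning phase: keep collision-free samples, connect nearby pairs."""
    nodes = [s for s in samples if collision_free(s)]
    edges = defaultdict(list)
    for i in range(len(nodes)):
        for j in range(i + 1, len(nodes)):
            if math.dist(nodes[i], nodes[j]) <= radius:
                edges[i].append(j)
                edges[j].append(i)
    return nodes, edges

def query(edges, start, goal):
    """Query phase: breadth-first search over the roadmap graph."""
    parent = {start: None}
    frontier = deque([start])
    while frontier:
        u = frontier.popleft()
        if u == goal:
            path = []
            while u is not None:
                path.append(u)
                u = parent[u]
            return path[::-1]
        for v in edges[u]:
            if v not in parent:
                parent[v] = u
                frontier.append(v)
    return None  # start and goal lie in disconnected roadmap components
```

The heuristic densification step of [92] would add extra samples where the roadmap is poorly connected; it is omitted here for brevity.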
Fig. 6. (a) Obstacles around UAV (b) Linear polar histogram [2].
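A linear polar histogram like the one in Fig. 6 can be built by binning planar obstacle points into angular sectors around the UAV. The sector count and range threshold below are illustrative choices of ours, not parameters from [2].

```python
import math

def polar_histogram(points, n_sectors=36, max_range=5.0):
    """Mark each angular sector open or closed from 2-D obstacle points."""
    closest = [math.inf] * n_sectors
    sector_width = 2 * math.pi / n_sectors
    for x, y in points:
        r = math.hypot(x, y)
        if r <= max_range:
            # atan2 gives [-pi, pi]; wrap into [0, 2*pi) before binning.
            idx = int((math.atan2(y, x) % (2 * math.pi)) / sector_width) % n_sectors
            closest[idx] = min(closest[idx], r)
    return [r == math.inf for r in closest]  # True = open (gap) sector
```

A gap-based planner would then steer the action vector toward the open sector closest to the target direction.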
by directing the UAV's velocity vector outside the collision cone region. The collision cone approach is not limited to single UAVs in the literature. Researchers have extended the collision-cone concept to the formation of UAVs [97]. They propose a simple algorithm with low computational requirements and define the collision avoidance envelope while considering dynamic constraints and obstacle detection range limits. The guidance law for collision avoidance is analyzed for stability using Lyapunov theory. Moreover, the pioneers of the collision cone method address the problem of a team of n UAVs pursuing a swarm of intruder UAVs on a 2D plane [98]. The guidance laws are developed for situations where the intruder swarm is confined to a bounding circle, which may vary in size with time. The collision cone method serves as the foundation for these pursuit guidance laws. Simulation results indicate that the target UAV swarm is successfully surrounded by the chasing UAVs. However, the proposed strategies in [97,98] do not consider communication delays between UAVs, sensor noise in range detection, or factors such as wind and turbulence that affect the system's performance in real-world scenarios.

In [45], researchers propose a reactive obstacle avoidance strategy for a multirotor UAV to avoid potential collisions in 3D space. This strategy utilizes a LiDAR sensor to detect a single moving obstacle, and a Kalman filter is incorporated into the collision cone technique to estimate the obstacle's position, velocity, and acceleration. However, the accuracy of the LiDAR sensor and the Kalman filter is crucial to the algorithm's performance, and any errors in these components may affect its effectiveness. In contrast, the algorithm proposed in [99] can avoid multiple obstacles simultaneously and is superior to other 3D navigation algorithms, such as the one presented in [45], which can only avoid obstacles one at a time. The study in [99] proposes a modified 3D vision cone-based reactive navigation algorithm that enables small quadcopter UAVs to avoid collisions while navigating around closely spaced 3D obstacles.

The collision avoidance techniques discussed so far in this section did not consider time as a constraint. The method proposed in [100] uses the time-efficient collision cone approach to predict collisions. The comparative analysis considers heading change, speed change, and the fusion of both approaches. The study demonstrates that, in terms of the UAV's flight time, the heading-based obstacle avoidance strategy is the most efficient of the three. A DJI Matrice 600 Pro hexacopter has been used as a platform for experimental testing. The outcome shows effective obstacle avoidance while minimizing UAV flight time.

4.2.2. Velocity obstacle method

Velocity Obstacle (VO) is a geometric method that works by estimating the collision region based on the velocity vector vB of the obstacle. The UAV defines its velocity obstacle space by considering the velocities of the obstacles in its vicinity. The two primary components of the VO method are determining the velocity barriers for each obstacle and choosing the UAV's velocity outside the velocity obstacle region. The velocity obstacle region of an obstacle is the set of all velocities that would result in a collision if the UAV continued along its current trajectory. The avoidance maneuvers are generated by selecting the UAV's velocities outside the velocity obstacles. To obtain the velocity obstacle region, the collision cone is translated by the obstacle velocity vB, as first shown in [101].

There are many improved variants of the VO method in the literature. Many multi-agent navigation algorithms, such as the original VO, face common oscillation problems. As a solution to the oscillatory behavior, a new method called the Reciprocal Velocity Obstacle (RVO) is proposed [102]. The method guarantees safe and oscillation-free motions for each agent. It is applied to agents' navigation in densely populated environments containing static and moving obstacles. In RVO, the contribution of the velocity vector of the UAV, i.e., vA, is introduced by shifting the position of the velocity obstacle by ½(vA + vB), as shown in Fig. 8.

The Hybrid Reciprocal Velocity Obstacle (HRVO) in [5] considers obstacles in the environment; uncertainty in radius, position, and velocity; and the dynamics and kinematics of the robots. In this method, both VO and RVO are calculated based on the velocity of the UAV, i.e., vA. If the velocity of the UAV vA is on the right side of the center line of the RVO, then the region of the RVO is extended up to the VO region on the left side, as shown in Fig. 8. Otherwise, in the second case of vA, the region of the RVO is extended up to the VO region on the right side. The hybrid reciprocal velocity obstacle is reported in [5] to provide better collision-free and oscillation-free navigation of multiple agents compared to simple VO and RVO.

All the VO variants, including RVO and HRVO, utilize information on the agent's shape, position, and velocity. These methods ensure collision-free and oscillation-free navigation, but none of them ensures the time to avoid the collision. In [1], the author presents an Optimal Reciprocal Collision Avoidance (ORCA) method, which introduces a safety time τ to avoid the collision earlier than the actual collision time. As shown in Fig. 9, the obstacle appears to be nearer the UAV because of the introduced scaling factor τ.
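The collision-cone test underlying these methods reduces to comparing the bearing of the relative velocity with the cone's half-angle. The planar sketch below is our own formulation of that geometric check, not code from [97-100].

```python
import math

def in_collision_cone(rel_pos, rel_vel, combined_radius):
    """True when the relative velocity points inside the collision cone
    subtended by a disc of combined_radius centred at rel_pos."""
    dist = math.hypot(rel_pos[0], rel_pos[1])
    if dist <= combined_radius:
        return True  # already overlapping
    speed = math.hypot(rel_vel[0], rel_vel[1])
    if speed == 0.0:
        return False  # no relative motion, no predicted collision
    # Angle between the relative velocity and the line of sight.
    dot = rel_pos[0] * rel_vel[0] + rel_pos[1] * rel_vel[1]
    angle = math.acos(max(-1.0, min(1.0, dot / (dist * speed))))
    return angle <= math.asin(combined_radius / dist)
```

An avoidance maneuver then amounts to choosing a velocity whose relative component fails this test, i.e., one that points outside the cone.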
Fig. 8. (a) Velocity Obstacle VOA|B of UAV A induced by obstacle B (b) Reciprocal velocity obstacle RVOA|B of UAV A induced by obstacle B (c) Hybrid reciprocal
velocity obstacle HRVOA|B of UAV A induced by obstacle B [5].
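The apex placements that distinguish VO, RVO, and ORCA's τ-truncation can be summarized in a short sketch; the 2-D tuple representation and helper names are ours, assumed for illustration.

```python
def vo_apex(v_b):
    """VO: the cone apex sits at the obstacle's velocity vB [101]."""
    return v_b

def rvo_apex(v_a, v_b):
    """RVO: apex at (vA + vB)/2, so both agents take half of the
    avoidance responsibility, removing the oscillations of plain VO [102]."""
    return ((v_a[0] + v_b[0]) / 2.0, (v_a[1] + v_b[1]) / 2.0)

def orca_scaling(rel_pos, combined_radius, tau):
    """ORCA: scaling the obstacle geometry by 1/tau makes a collision that
    would occur within the safety time tau appear imminent [1]."""
    return (rel_pos[0] / tau, rel_pos[1] / tau), combined_radius / tau
```

HRVO then mixes the VO and RVO cones depending on which side of the RVO center line the agent's velocity lies, as described in the text above.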
depends on initial conditions and is limited by local minima problems. The study in [103] uses a PF controller to track dynamic targets while avoiding static obstacles in the environment. Simple PF navigation gets stuck in narrow passages where the cumulative sums of attractive and repulsive forces cancel each other, a situation referred to as a local minimum [25]. Many modified PF algorithms have been proposed to solve the local minima problem.

The APF method relies on two interacting forces to guide the movement of a robot in an environment. Specifically, a robot is attracted towards a desired destination through an attractive force while being repelled from obstacles by a repulsive force. By combining these forces, a path can be generated from the starting point to the target location along the direction of the net force. The APF can be expressed through the following equations [106].

Uatt = ½ kα (p − pg)²   (5)

Urep = ½ kβ (1/‖d⃗o‖ − 1/rsafe)²,  if ‖d⃗o‖ ≤ rsafe
Urep = 0,  if ‖d⃗o‖ > rsafe   (6)

Fatt = kα (pg − p)   (7)

Frep = kβ (1/‖d⃗o‖ − 1/rsafe) · d⃗o/‖d⃗o‖,  if ‖d⃗o‖ ≤ rsafe
Frep = 0,  if ‖d⃗o‖ > rsafe   (8)

Ftotal = Fatt + ∑i Frepⁱ   (9)

where Uatt in Eq. (5) and Urep in Eq. (6) are the attractive and repulsive potential field functions; Fatt in Eq. (7) and Frep in Eq. (8) are the force field functions of their respective potential fields; kα and kβ are the attractive and repulsive force constants; d⃗o is the distance of the obstacle from the UAV and rsafe is the threshold safety distance from the obstacle; pg is the position of the goal, and p is the position of the UAV.

Fig. 9. Optimal reciprocal collision avoidance [1].

In [107], a 2D Gaussian Mixture Model (GMM) of the obstacles provides the potentials at different coordinates of a two-dimensional environment. Potential vectors are defined using the derivative of the defined potential functions. The GMM-based approach is adopted to handle complex-shaped objects, which is one of the limitations of the simple PF algorithm. The inability of the PF algorithm to pass through narrow passages has been addressed in [25], where a PF-IPA has been proposed. The strategy to navigate around the obstacle accommodates the repulsive forces from the obstacles, the direction of the next waypoint, and the contribution of the past action vectors. The magnitude of the contribution is determined experimentally.

Another solution to the local minima problem, based on a virtual target, is introduced in [104]: an improved algorithm that generates a virtual target within the neighborhood of the local minimum point to avoid falling into the local minimum in a complex environment with many obstacles. The APF algorithm has also been used for multi-UAV formation control and obstacle avoidance systems [108,109]. In [108], along with the APF algorithm, a sliding mode control is established to ensure the UAVs follow the desired trajectory and maintain the desired formation. A saturation function is used to avoid chattering. This study introduced a robust formation control and target tracking algorithm for multiple UAVs, which can effectively achieve formation flight and target tracking while improving the system's robustness. Similarly, [109] proposes a comprehensive optimal obstacle avoidance mechanism for UAV path planning using an APF algorithm for multi-UAV systems in a complex environment. This method saves the UAV's energy and solves the UAV local minimization problem.

Another method proposed in [106] generates a real-time reactive collision-free path for UAVs flying in dynamic airspace. The method is
based on the Dynamic Artificial Potential Field (DAPF) algorithm and aims to guarantee flight safety while minimizing the effect of surrounding obstacles. According to [106], the safety distance between the UAV and obstacles is treated as a variable threshold that scales adaptively based on the relative motion states of obstacles and the performance parameters of the UAV. This adaptive safety distance helps to resolve conflicts between the UAV and obstacles and improves operational efficiency and safety. Secondly, the potential field functions are modified to adjust automatically based on the threat levels of the surrounding obstacles. The relative position, speed, and flight behavior between the UAV and the obstacle define the obstacle's threat level. In the DAPF algorithm, the repulsive force of the classic APF algorithm is retained, and a steering force is added to shift the flight direction of the UAV to accelerate obstacle avoidance.

4.3.2. Improved potential field algorithm

Extensive research has been undertaken to address limitations in repulsive force-based methods for obstacle avoidance. For instance, significant attention has been dedicated to solving jitter problems in the APF algorithm. Furthermore, efforts have focused on enhancing obstacle avoidance under the constraint of finding the shortest possible path between the start and endpoints. In [105], the proposed method improves upon the APF method by adding a distance correction factor to the repulsive potential field function to solve the inherited GNRON (goals non-reachable with obstacles nearby) problem in APF, and by using a standard hexagon-guided method to mitigate the local minima problem. Additionally, the proposed method uses the relative velocity method for moving object detection and avoidance in dynamic environments. The repulsive force function is divided into two parts, as shown in Eq. (10), to address the issue of GNRON with the APF algorithm [105]. Doing so allows the repulsive force to gradually lessen as the objective point is approached with nearby obstacles.

Frep = Frep1 + Frep2,  if ‖d⃗o‖ ≤ rsafe
Frep = 0,  if ‖d⃗o‖ > rsafe   (10)

where Frep1 and Frep2 are defined in Eq. (11) and Eq. (12) as

Frep1 = kβ (1/‖d⃗o‖ − 1/rsafe) · (d⃗gⁿ / d⃗o²)   (11)

Frep2 = (n/2) kβ (1/‖d⃗o‖ − 1/rsafe)² · d⃗gⁿ⁻¹   (12)

where d⃗g and d⃗o are the distances of the goal and the obstacle from the UAV, respectively. As the UAV approaches the goal with obstacles in the nearby vicinity, d⃗gⁿ and d⃗gⁿ⁻¹ approach zero, and the UAV moves towards the goal under the influence of only the attractive force Fatt, solving the GNRON problem.

Another obstacle avoidance method for a swarm of UAVs in [110] uses a combination of a second-order consensus algorithm and an improved potential field (IPF) algorithm. A "leader-follower" strategy is used by the swarm of UAVs, where the leader UAV flies autonomously in accordance with the mission requirements, while the follower UAVs follow the leader using a second-order consensus algorithm. The method combines the target's attractive force and the obstacle's repulsive force to control the UAV's subsequent movement. The technique significantly reduces the inherited jitter problem of the classical APF method.

A Bi-directional APF-RRT* algorithm with a goal-biased strategy is proposed in [47]. Path finding is carried out by the proposed algorithm using two alternating random search trees to speed up convergence. The APF method is incorporated into the Bi-directional search tree to significantly reduce the number of iterations. Moreover, the proposed algorithm uses cubic spline interpolation as the fitness function to optimize the trajectory and evaluate the shortest collision-free path length from the starting point to the endpoint.

In [111], an improved version of the APF method for UAV navigation in 3D space is proposed. The proposed algorithm modifies the repulsive function according to the influence range of obstacles on the UAV and divides the repulsive forces exerted by obstacles at different distances.

4.4. AI-based methods

Numerous studies have been conducted on applying AI to UAV obstacle avoidance. These approaches are categorized based on learning techniques, with supervised and unsupervised learning algorithms being the most common. AI approaches, particularly supervised learning, are primarily used in path planning on obstacle maps to process data and extract useful information. AI algorithms have become a crucial resource for processing and analyzing the enormous amounts of data that UAV sensing systems produce. The advancements in cloud computing, graphics processing units (GPUs), and parallel computing have made it computationally possible for these AI algorithms to process an unprecedented volume of data in real-time. Machine learning and deep reinforcement learning are the two most often applied AI-based techniques for avoiding obstacles.

4.4.1. Deep reinforcement learning

Deep Reinforcement Learning (DRL) is an AI technique that combines reinforcement learning with deep neural networks. It learns to take actions based on feedback in the form of a reward or a penalty signal. This feedback about reward and penalty defines the policy that maximizes the reward function. The policy is represented as a neural network whose input is the current state and whose output is the probability distribution of possible actions. Inherently, reinforcement learning is well suited for applications without a training data set. In deep reinforcement learning, the decision is made based on feedback from the environment.

A DRL method using a monocular camera for UAV obstacle avoidance, which utilizes information about the ambient environment for decision-making, is proposed in [50]. It uses recurrent neural networks (RNNs) with temporal attention to retain relevant information about the environment structure to make better future navigation decisions. The proposed method can be integrated with a high-level planner, which inputs an overall path objective with a start and a goal position. The experimental results show that the proposed method outperforms the Deep Q-Network (DQN) and Double Deep Q-Network (D3QN) algorithms regarding the distance covered by UAVs without collisions. The paper concludes that the proposed method can be used for practical obstacle avoidance of UAVs in cluttered and unseen environments.

In [112], Probability Distribution-Based Collision Avoidance (PICA) and Reinforced Learning-Based Collision Avoidance (RELIANCE) algorithms are proposed for avoiding collisions while saving energy consumption in UAVs. The RELIANCE method has demonstrated superior performance to PICA in dynamic environments with multiple UAVs. Test results highlight its effectiveness in terms of obstacle avoidance and policy convergence. The study, however, did not consider the impact of communication delays between the UAV and Multiaccess-Edge Computing (MEC) on the performance of the proposed algorithms.

A deep reinforcement learning system with a map-based architecture is presented for environments devoid of communication [113]. This map-based approach uses the Distributed Proximal Policy Optimization (DPPO) algorithm to train neural networks. In the case of noisy data from the perception sensors, it has been claimed that this map-based multi-robot obstacle avoidance solution outperforms conventional reinforcement learning methods. The study adds substantially to our understanding that DRL is particularly well suited to applications involving high-dimensional input data, such as images or other sensor data.
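The reward-and-penalty feedback loop described above can be made concrete with a tabular toy example. The grid, reward values, and learning rates below are illustrative choices of ours and are far simpler than the deep networks used in [50,112,113].

```python
import random

def train_policy(episodes=2000, seed=1):
    """Tabular Q-learning on a 3x3 grid: start (0,0), goal (2,2) gives +10,
    obstacle (1,1) gives -10, each step costs -1 (illustrative rewards)."""
    rng = random.Random(seed)
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    q = {(x, y): [0.0] * 4 for x in range(3) for y in range(3)}
    goal, obstacle = (2, 2), (1, 1)
    for _ in range(episodes):
        s = (0, 0)
        for _ in range(20):  # cap the episode length
            # Epsilon-greedy action selection (epsilon = 0.2).
            a = rng.randrange(4) if rng.random() < 0.2 else max(range(4), key=lambda i: q[s][i])
            s2 = (min(2, max(0, s[0] + moves[a][0])), min(2, max(0, s[1] + moves[a][1])))
            r = 10.0 if s2 == goal else -10.0 if s2 == obstacle else -1.0
            # Q-update with learning rate 0.5 and discount factor 0.9.
            q[s][a] += 0.5 * (r + 0.9 * max(q[s2]) - q[s][a])
            s = s2
            if s in (goal, obstacle):
                break
    return q

def greedy_path(q, start=(0, 0), goal=(2, 2), limit=10):
    """Roll out the learned greedy policy from the start cell."""
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    s, path = start, [start]
    while s != goal and len(path) <= limit:
        a = max(range(4), key=lambda i: q[s][i])
        s = (min(2, max(0, s[0] + moves[a][0])), min(2, max(0, s[1] + moves[a][1])))
        path.append(s)
    return path
```

After training, the greedy rollout reaches the goal while steering around the penalized cell, which is the same feedback-driven behavior the DRL methods above learn from high-dimensional sensor input.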
4.4.2. Machine learning

Machine learning algorithms are trained on images and sensor datasets to identify potential environmental obstacles and hazards. The quality and size of the dataset have a significant impact on the performance of machine learning algorithms. Large-scale datasets are used for training to improve system performance on untrained datasets. The DRL-based scheme proposed in [50] increases decision accuracy by utilizing collected data over time. However, its limited applicability arises from the requirement that the UAV repeatedly operate in the same workspace, posing a challenge to utilizing the system in safety-critical environments.

A two-stage CNN-based learning scheme for a quadrotor UAV to avoid obstacles automatically in unknown and unstructured environments is proposed in [48]. It uses a forward-facing monocular camera as a perception sensor. The first stage uses a CNN-based model as the prediction mechanism, simultaneously predicting the steering angle and the collision probability. The second stage involves mapping the steering angle to an instruction that changes the yaw angle of the UAV, while the collision probability is mapped to a forward speed to maintain the flight or stop going forward. Similarly, in [61], a CNN is used to estimate the depth of the RGB image, further using this depth prediction as input to the obstacle avoidance mechanism. However, the platforms used for experiments in [48,61] rely on wireless LAN, which imposes restrictions on the communication distance between the UAV and the ground control station.

Researchers have recently investigated machine-learning techniques to optimize obstacle avoidance performance on resource-constrained hardware with minimal processing requirements. The proposed scheme in [49] uses a machine learning-based approach to reduce the memory footprint of the cost tables used by the Airborne Collision Avoidance System (ACAS-Xu) for collision avoidance in UAVs. Instead of a classical cost table, using a trained DNN to predict cost vectors considerably reduces the required memory size, by a factor of 500. Moreover, the performance of ACAS-Xu is not degraded by using the DNN to approximate the classical cost table. A framework for an energy-efficient UAV surveillance network is proposed in [114]. In this study, a CNN is utilized to extract features of the moving objects from UAVs. However, the proposed framework in [114] is tested in a simulated environment, and the real-world implementation may have different challenges and limitations. A categorized summary of the existing obstacle avoidance techniques is shown in Table 6.

In comparison, each non-cooperative local obstacle avoidance method has its advantages and disadvantages. In summary, compared to repulsive force-based methods, gap-based methods exhibit smooth …

… components like types of airframes, propellers, motors, GPS modules, magnetometer, gyrocompass, accelerometer, gyroscopes, and onboard sensors. Regarding obstacle avoidance in UAVs, hardware topologies can be segregated into three categories in terms of flight controllers and flight computers.

The first category directly incorporates obstacle avoidance capabilities into the flight controller. This hardware topology integrates multiple sensors such as LiDAR, camera, and ultrasonic sensors and employs the flight controller's onboard processing power to make real-time decisions. A TFmini laser sensor integrated into Pixhawk hardware is utilized in [116] for range and direction estimation of obstacles near a quadrotor UAV. Moreover, the study employs the Dijkstra method locally to search for possible paths, and an improved particle swarm optimization algorithm is utilized to obtain the globally optimized path.

The second hardware topology utilizes a companion computer alongside a primary flight controller to handle intelligent tasks such as autonomous obstacle avoidance. In this hardware configuration, the companion computer is an additional computing unit responsible for processing sensor data, executing complex algorithms, and sending navigation commands to the flight controller for executing avoidance maneuvers. This configuration allows for more computational power and flexibility in implementing advanced obstacle avoidance strategies. Table 7 summarizes combinations of flight controllers and companion computers for UAV obstacle avoidance. An Arduino Nano is used as a flight controller in [117,118]; other flight controllers utilized in hybrid architectures include Pixhawk [23,76–78,116,119–123], Mateksys H743-WING [124], DJI A3 [8] and Pixracer R15 [75]. Most UAV obstacle avoidance applications utilizing visual perception sensors employ the Jetson Nano as a companion computer [76,117,118,123,124]. Other companion computers using single or multiple perception sensors include different models of Raspberry Pi [77,120–122], ODROID [8,23] and Intel NUC [75,78,119].

The third scheme uses additional hardware components specifically dedicated to obstacle avoidance. This additional hardware often includes a dedicated obstacle avoidance system such as an ADS-B receiver that can detect other aircraft. An obstacle avoidance system combining a Thermal-Infrared (TIR) camera with an ADS-B receiver is proposed in [125]. The proposed obstacle avoidance scheme estimates the detected aircraft's range by matching the TIR images with the corresponding ADS-B messages. Another study in [126] used a sampling-based approach on ADS-B data for obstacle avoidance path planning. It has been validated in Software-In-the-Loop (SITL) simulation using ADS-B data from commercial aircraft flying over Phoenix Sky Harbor airport.
navigation and are computationally efficient [8]. However, the noise in In conclusion, the first hardware topology that exclusively in
the sensory scan affects their performance and lowers their navigation corporates a flight controller is unable to navigate precisely to achieve
speed [3]. In contrast to gap-based methods, repulsive force-based the optimal trajectory [116]. In principle, the flight controllers, namely
methods are more likely to become trapped in local minima and Pixhawk, Arduino Nano, and DJI A3, serve as the autopilots, responsible
GNRON [105]. The challenge of navigating a swarm of UAVs is an issue for real-time flight stabilization, control, and localization sensor data
that geometric methods can handle in addition to a single UAV [97]. processing. These flight controllers can integrate various types of sen
Furthermore, in contrast to previous approaches, geometric methods sors, such as a GPS module, gyroscope, and accelerometer, to sustain
make use of the obstacle’s shape, position, and velocity information to stable flight and execute autonomous flight as planned by the remote
guarantee collision-free navigation. Lastly, Al-based obstacle avoidance pilot. However, this hardware architecture has limitations in the form of
methods are appropriate for applications involving enormous amounts processing power, computational capabilities, and hardware interfaces
of data from perception sensors. Machine learning algorithms are with higher data rates.
computationally expensive as their performance depends on the size and In contrast, the subsequent hardware architecture, as shown in
quality of the dataset. AI-based methods using DRL have limited appli Table 7, employs a hybrid configuration with companion computers
cability because UAVs need to be repeatedly operated at the site to (Jetson, Raspberry Pi, ODRIOD, and Intel NUC) in conjunction with a
collect data over time [50]. flight controller that serves as an autopilot. Each of these companion
computers comes with a variety of interfaces, including USB and
5. Hardware architecture for non-cooperative obstacle Ethernet, capable of facilitating networking and extracting data from
avoidance perception sensors at higher data rates. With its dedicated GPU
(Graphics Processing Unit) with CUDA cores, Jetson outperforms the
The hardware architecture of a UAV plays a vital role in its obstacle Raspberry Pi, rendering it a more viable option for implementing AI-
avoidance performance and capabilities. This section introduces the based obstacle avoidance methods that involve the execution of deep
classification of hardware topologies used in UAV obstacle avoidance. learning and machine learning algorithms.
The hardware architecture of a UAV consists of many essential
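The division of labor in the hybrid topology can be sketched as follows. This is an illustrative Python fragment, not code from any of the reviewed systems: the 30° frontal cone, 3 m safety range, and speed values are assumed placeholders, and `VelocitySetpoint` stands in for whatever command message (for example, a MAVLink velocity setpoint) the flight controller actually accepts.

```python
from dataclasses import dataclass

@dataclass
class VelocitySetpoint:
    """Command forwarded from the companion computer to the flight
    controller, which handles stabilization and low-level control."""
    vx: float        # forward speed command, m/s
    yaw_rate: float  # yaw-rate command, rad/s

def companion_step(scan, cruise_speed=2.0, safe_range=3.0, turn_rate=0.5):
    """One cycle on the companion computer (assumed thresholds):
    process perception data, then emit a setpoint for the autopilot.

    scan: list of (angle_deg, range_m) readings from a perception
          sensor, with angle 0 pointing straight ahead.
    """
    # Consider only returns inside a frontal cone of +/- 30 degrees.
    frontal = [r for a, r in scan if abs(a) <= 30.0]
    nearest = min(frontal, default=float("inf"))
    if nearest < safe_range:
        # Obstacle ahead: stop forward motion and yaw to search for a gap.
        return VelocitySetpoint(vx=0.0, yaw_rate=turn_rate)
    return VelocitySetpoint(vx=cruise_speed, yaw_rate=0.0)
```

In a real system the `scan` would come from one of the sensors in Table 7 and the setpoint would be sent over USB or Ethernet, while the flight controller keeps sole responsibility for attitude stabilization.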
M.Z. Butt et al. Robotics and Autonomous Systems 173 (2024) 104629
Table 6
A summary of non-cooperative obstacle avoidance techniques.

Gap-based, Ref. [8]
Methodology: Reactive obstacle avoidance algorithm that uses a 2D scan to detect obstacles and determine open sectors for collision-free navigation.
Sensors: Laser rangefinder: Hokuyo UST-10LX.
Performance: • Computationally efficient and does not require maps. • Smooth trajectory at a relatively high speed of 3 m/s in an unstructured urban/suburban environment.

Gap-based, Ref. [91]
Methodology: • AG method for reactive obstacle avoidance in robotics. • The exact robot shape and kinematic constraints are considered, which improves the safety and efficiency of robot navigation.
Sensors: Laser rangefinder 1: Hokuyo UTM-30LX; Laser rangefinder 2: Hokuyo URG-04LX.
Performance: • The AG approach outperforms the other methods in terms of success rate and path length. • Outstanding navigation performance is achieved in unknown dense environments. • The AG approach has not been compared in terms of computational efficiency.

Gap-based, Ref. [3]
Methodology: The ND navigation method uses a "divide and conquer" strategy to simplify the difficulty of navigation in very dense, cluttered, and complex scenarios.
Sensors: The paper mentions robots equipped with a 3D laser, ultrasound sensors, and a 2D laser (exact names are not mentioned).
Performance: • It navigates robots in troublesome scenarios where other methods present a high degree of difficulty in navigating. • It does not consider the robot's shape, kinematics, and dynamics, which makes it more suitable for different types of robots.

Gap-based, Ref. [94]
Methodology: Designing an obstacle avoidance system for a hexacopter using the Vector Field Histogram-Plus (VFH+) algorithm.
Sensors: Laser rangefinder: 2D LiDAR, resolution of 10 (exact sensor model is not mentioned).
Performance: • The obstacle avoidance system is tested in the Gazebo environment running on the Robot Operating System (ROS). • This work needs to be tested in a real environment for accuracy and reliability analysis.

Geometric, Ref. [100]
Methodology: Proposing a heading-change algorithm for time-efficient obstacle avoidance for aerial drones using collision cone-based approaches.
Sensors: –
Performance: • UAVs can avoid collisions with dynamic obstacles by changing their speed, heading, or both; the paper proposes the purely heading-based method as the most efficient. • The proposed algorithm does not consider the limitations of the UAV's sensors, such as their range and accuracy.

Geometric, Ref. [115]
Methodology: The paper proposes a 3D velocity obstacle method to avoid collisions with multiple dynamic and static obstacles for UAVs.
Sensors: Laser sensor.
Performance: • Introduces improvements to the existing velocity obstacle cone and presents a tall cylinder-shaped protected zone to approximate all kinds of static obstacles. • Presents the velocity obstacle pyramid to handle 3D obstacles.

Repulsive force-based, Ref. [25]
Methodology: • Use of a potential-field algorithm for collision-free monitoring in areas with obstacles. • The UAV uses a modified PF algorithm that avoids the local minima problem of a simple PF algorithm.
Sensors: Laser rangefinder: Hokuyo UST-10LX.
Performance: • The paper presents a novel and advanced system for autonomous chemical sensing and monitoring in complex environments. • The system is tested in outdoor environments with obstacles such as buildings and trees, but requires further evaluation in complex and dynamic environments.

Repulsive force-based, Ref. [111]
Methodology: An improved algorithm based on the traditional APF method for a quadrotor UAV in three-dimensional space.
Sensors: –
Performance: • The proposed algorithm is only tested in a simulation environment, not a real-world scenario; therefore, its effectiveness in a real-world environment must be further investigated.

AI-based, Ref. [48]
Methodology: • A two-stage obstacle avoidance architecture utilizing a camera as the perception sensor. • A CNN-based model predicts the steering angle initially. • The control system changes the UAV's yaw angle in the second stage.
Sensors: Front-facing monocular camera.
Performance: • The system uses a low-cost sensor, a lightweight network topology, strong learning capability, and environmental adaptability. • Wi-Fi is utilized as the protocol for communication with the ground control station, which limits the operational range of the UAV.

AI-based, Ref. [50]
Methodology: A deep reinforcement learning-based method is proposed for UAV obstacle avoidance in unstructured and unknown indoor environments.
Sensors: Monocular RGB camera: image size 84 × 84 × 3 pixels.
Performance: • The problem of partial observability in obstacle avoidance is addressed using a recurrent neural network with temporal attention. • Outperforms prior works in terms of distance covered without collisions. • The method requires a lot of training data to learn effective obstacle avoidance behavior, which may be difficult to obtain in certain situations. • The method assumes a static environment and does not account for moving obstacles or outdoor scenarios.
6. Discussion and future directions

This paper accounts for non-cooperative obstacle avoidance solutions for low-altitude surveillance UAVs. The following are the key findings derived from the comprehensive analysis of the reviewed literature.

• In aerial surveillance applications, diverse types of UAV platforms have been employed. The airspace above an altitude of 122 m is categorized as a "High-speed UAV traffic" area, and the risk of collision is considered relatively lower in this zone. However, conducting aerial surveillance at low altitudes poses a significant risk of collision, necessitating the selection of appropriate UAV platforms. In this regard, rotary-wing multi-copters offer several advantages, such as their hovering capability and lower speed of operation, making them a more suitable option for low-altitude aerial surveillance. It is anticipated that the utility of rotary-wing UAVs integrated with cutting-edge sensing technologies will increase in the future. Hence, there is a need for a regulatory framework to ensure the safe use of multi-rotor UAVs in various industries.
• In the context of low-altitude UAV surveillance, two sensing technologies, namely visual and non-visual sensing, were thoroughly examined and compared. Non-visual sensing employing active sensors is less affected by lighting conditions, calibration errors, and weather, and offers more resilience and accuracy. Visual sensing, in contrast, relies on favorable conditions but offers insight into obstacle shape and texture. However, obstacle avoidance based on visual sensing in low-light and adverse weather is still a challenge and is expected to draw attention from researchers in years to come.
Table 7
A summary of onboard hardware architecture for obstacle avoidance.

Jetson Xavier NX, Ref. [124], Flight controller: Mateksys H743-WING
Sensors: Intel RealSense L515 LiDAR; Intel RealSense T265 tracking camera
Application: Autonomous navigation by a multirotor UAV in an unfamiliar GPS-denied environment utilizing visual navigation, black lane segmentation, and neural network models

Jetson Nano, Ref. [123], Flight controller: Pixhawk
Sensors: Intel RealSense D435i camera
Application: A simple and robust approach for reactive obstacle avoidance using spirals for escape in unknown scenarios

Jetson Nano, Ref. [76], Flight controller: Pixhawk 4
Sensors: Slamtec RPLiDAR S1; Logitech C930e web camera
Application: Fast Obstacle Avoidance Motion (FOAM) algorithm for sense-and-avoid operations in a cluttered environment using multi-sensor fusion

Jetson Nano, Ref. [117], Flight controller: Arduino Nano
Sensors: 2D LiDAR sensor
Application: Implementation of SLAM and the development of an autonomous robot using a 2D LiDAR sensor

NVIDIA Jetson TX2, Ref. [118], Flight controller: Arduino Nano
Sensors: Velodyne VLP-16 LiDAR
Application: A path-following and obstacle avoidance system for autonomous surface vehicles

Raspberry Pi Zero, Ref. [122], Flight controller: Pixhawk 2
Sensors: LiDAR-Lite v3 sensors
Application: A computationally inexpensive obstacle avoidance control of a drone for infrastructure inspection

Raspberry Pi 3 Model B, Ref. [77], Flight controller: Pixhawk 2.0 Cube
Sensors: Scanse Sweep laser scanner; PX4Flow flow camera
Application: Guidance, Navigation and Control (GNC) system architecture for lightweight UAV autonomous indoor flight using ROS

Raspberry Pi 3 Model B, Ref. [120], Flight controller: Pixhawk 2.4.8
Sensors: RPLiDAR A1
Application: A low-cost obstacle detection and collision avoidance system for multi-rotor drones based on Coulomb's inverse-square law

Raspberry Pi 4 Model B, Ref. [121], Flight controller: 3DR Pixhawk Mini
Sensors: Ouster OS1-16 LiDAR
Application: UAV indoor mapping to support disaster management in a GPS-denied environment

ODROID XU4, Ref. [23], Flight controller: Pixhawk
Sensors: RPLiDAR 360°
Application: A model-based automatic obstacle avoidance algorithm for remotely piloted UAVs

ODROID-C2, Ref. [8], Flight controller: DJI A3
Sensors: Hokuyo UST-10LX LiDAR
Application: A reactive obstacle avoidance method that does not require maps and is computationally efficient

Intel NUC, Ref. [75], Flight controller: Pixracer R15
Sensors: Hokuyo UST-10LX LiDAR; Intel RealSense T265 camera
Application: Comparison of APFs and CBFs for obstacle avoidance in mobile robots

Intel NUC, Ref. [119], Flight controller: Pixhawk 2.1
Sensors: Ouster OS1-16 LiDAR; Intel T265 camera; Pozyx ultra-wideband (UWB) positioning system
Application: An indoor navigation solution for a UAV in areas where GPS and magnetometer sensor measurements are unavailable

Intel NUC, Ref. [78], Flight controller: Pixhawk 2
Sensors: Velodyne VLP-16 LiDAR
Application: An obstacle detection and avoidance system for a multicopter UAV using LiDAR data

Others, Ref. [127], Flight controller: –
Sensors: –
Application: An air pollution source tracking algorithm based on APF and particle swarm optimization

Others, Ref. [116], Flight controller: Pixhawk
Sensors: TFmini laser sensor
Application: A new obstacle avoidance strategy for UAVs using the Dijkstra method and an improved particle swarm optimization
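The repulsive force-based entries in Table 6 ([25,111]) build on the artificial potential field idea: an attractive term pulls the UAV toward the goal while nearby obstacles push it away. A minimal 2D APF velocity update might look like the following; the gains and influence radius are assumed values for illustration, and real implementations add the local-minima remedies discussed in the review.

```python
import math

def apf_velocity(pos, goal, obstacles,
                 k_att=1.0, k_rep=1.0, influence=3.0):
    """One artificial-potential-field velocity update in 2D (illustrative).

    pos, goal: (x, y) tuples; obstacles: list of (x, y) tuples.
    Gains and the influence radius are assumed, not taken from [25] or [111].
    """
    # Attractive component: proportional to the vector toward the goal.
    vx = k_att * (goal[0] - pos[0])
    vy = k_att * (goal[1] - pos[1])
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy)
        if 0.0 < d < influence:
            # Repulsion grows as the obstacle gets closer
            # (the classic 1/d - 1/d0 potential gradient form).
            gain = k_rep * (1.0 / d - 1.0 / influence) / (d ** 3)
            vx += gain * dx
            vy += gain * dy
    return vx, vy
```

When the attractive and repulsive terms cancel, the commanded velocity vanishes short of the goal, which is exactly the local-minima and GNRON behavior that the modified PF algorithms surveyed here try to avoid.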
• LiDAR-based active sensing, which has been utilized in a few works, is one of the most promising non-visual sensing technologies. It provides accurate and high spatial resolution without additional computational requirements compared to other non-visual sensors, e.g., radar. Its extended detection range makes LiDAR suitable for applications like aerial surveying and surveillance. Notably, LiDAR facilitates real-time data acquisition without adding computational overhead, making it a suitable option for integration with other sensors in a sensor fusion framework.
• Sensor fusion, which integrates multiple perception sensors, including both visual and non-visual sensing, enhances redundancy, reliability, and decision-making in the UAV's obstacle detection and avoidance systems. The field of sensor fusion remains nascent in non-cooperative and local obstacle avoidance and is anticipated to be a key area of exploration in the forthcoming years. Moreover, future work needs to focus on optimizing energy consumption in the sensor fusion framework, prioritizing smart utilization of sensors based on operational needs.
• In this paper, four distinctive methods for non-cooperative and local obstacle avoidance, i.e., gap-based, geometric, repulsive force-based, and AI-based methods, have been systematically classified. Due to their fast and efficient computation, gap-based methods are effective for low-altitude obstacle avoidance. Nevertheless, the efficacy of gap-based methods degrades when confronted with noisy sensor data from the environment scan in complex and cluttered environments. Moreover, several studies have addressed the oscillations in the UAV flight trajectory and the local minima problems encountered in geometric and repulsive force-based methods. These challenges constitute an active area for prospective research in the field of non-cooperative obstacle avoidance.
• AI-based methods have shown great potential in enhancing the adaptability and flexibility of obstacle avoidance systems by learning through training datasets or by reinforcement learning through information gathered from perception sensors. AI-based methods can make accurate and timely decisions by analyzing vast amounts of data and adapting to changing environments and unexpected situations. Furthermore, given that AI-based obstacle avoidance techniques are still in their infancy, future research will be directed towards exploring energy-efficient AI models and low-power hardware solutions to meet the constraint of extended operational endurance of small UAVs.
• This paper provides an account of different hardware topologies for UAV obstacle avoidance. The investigated topologies include the flight controller-based configuration, the hybrid architecture, and the dedicated obstacle avoidance architecture. While the flight controller-based architecture exhibits simplicity, its limited computational resources result in less accurate obstacle avoidance performance. On the contrary, the hybrid architecture is more complex but affords more computational capability, improved navigation accuracy, and the ability to handle complex decision-making processes. The dedicated obstacle avoidance architecture incorporates a dedicated subsystem for obstacle avoidance, yet only the hybrid topology provides a platform for future upgrades by assimilating new sensors as technology evolves, thereby enhancing the companion computers' capabilities.

CRediT authorship contribution statement

Muhammad Zohaib Butt: Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Writing – original draft. Nazri Nasir: Funding acquisition, Project administration, Supervision, Validation, Visualization, Writing – review & editing. Rozeha Bt A. Rashid: Supervision, Validation.
perishable food, Appl. Math. Model. 106 (2022) 844–866, https://doi.org/10.1016/j.apm.2022.02.024.
[39] S. Ahmed, B. Qiu, F. Ahmad, C.W. Kong, H. Xin, A state-of-the-art analysis of obstacle avoidance methods from the perspective of an agricultural sprayer UAV's operation scenario, Agronomy 11 (6) (2021) 1069. [Online]. Available: https://www.mdpi.com/2073-4395/11/6/1069.
[40] X.D. Deng, M.K. Guan, Y.F. Ma, X.J. Yang, T. Xiang, Vehicle-assisted UAV delivery scheme considering energy consumption for instant delivery, Sensors 22 (5) (2022), Art. no. 2045, https://doi.org/10.3390/s22052045.
[41] Y. Wang, M.X. Zhang, Y.J. Zheng, A hyper-heuristic method for UAV search planning, in: Advances in Swarm Intelligence: 8th International Conference, ICSI 2017, Fukuoka, Japan, July 27–August 1, 2017, Proceedings, Part II, Springer, 2017, pp. 454–464.
[42] H. Yu, F. Zhang, P. Huang, C. Wang, L. Yuanhao, Autonomous obstacle avoidance for UAV based on fusion of radar and monocular camera, in: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), IEEE, 2020, pp. 5954–5961.
[43] X. Huang, et al., The improved A* obstacle avoidance algorithm for the plant protection UAV with millimeter wave radar and monocular camera data fusion, Remote Sens. 13 (17) (2021), Art. no. 3364, https://doi.org/10.3390/rs13173364.
[44] S.S. Bolbhat, A.S. Bhosale, G. Sakthivel, D. Saravanakumar, R. Sivakumar, J. Lakshmipathi, Intelligent obstacle avoiding AGV using vector field histogram and supervisory control, J. Phys. Conf. Ser. 1716 (1) (2020) 012030, https://doi.org/10.1088/1742-6596/1716/1/012030.
[45] J. Park, N. Cho, Collision avoidance of hexacopter UAV based on LiDAR data in dynamic environment, Remote Sens. 12 (6) (2020), Art. no. 975, https://doi.org/10.3390/rs12060975.
[46] M. Choi, A. Rubenecia, T. Shon, H.H. Choi, Velocity obstacle based 3D collision avoidance scheme for low-cost micro UAVs, Sustainability 9 (7) (2017) 1174. [Online]. Available: https://www.mdpi.com/2071-1050/9/7/1174.
[47] J.M. Fan, X. Chen, X. Liang, UAV trajectory planning based on bi-directional APF-RRT* algorithm with goal-biased, Expert Syst. Appl. 213 (2023), Art. no. 119137, https://doi.org/10.1016/j.eswa.2022.119137.
[48] X. Dai, Y.X. Mao, T.P. Huang, N. Qin, D.Q. Huang, Y.N. Li, Automatic obstacle avoidance of quadrotor UAV via CNN-based learning, Neurocomputing 402 (2020) 346–358, https://doi.org/10.1016/j.neucom.2020.04.020.
[49] I. Lahsen-Cherif, H. Liu, C. Lamy-Bergot, Real-time drone anti-collision avoidance systems: an edge artificial intelligence application, in: Proceedings of the IEEE Radar Conference (RadarConf22), 2022, pp. 1–6, https://doi.org/10.1109/RadarConf2248738.2022.9764175.
[50] A. Singla, S. Padakandla, S. Bhatnagar, Memory-based deep reinforcement learning for obstacle avoidance in UAV with limited environment knowledge, IEEE Trans. Intell. Transp. Syst. 22 (1) (2021) 107–118, https://doi.org/10.1109/tits.2019.2954952.
[51] Y. Choi, H. Jimenez, D.N. Mavris, Two-layer obstacle collision avoidance with machine learning for more energy-efficient unmanned aircraft trajectories, Robot. Auton. Syst. 98 (2017) 158–173, https://doi.org/10.1016/j.robot.2017.09.004.
[52] N. Elmeseiry, N. Alshaer, T. Ismail, A detailed survey and future directions of unmanned aerial vehicles (UAVs) with potential applications, Aerospace 8 (12) (2021) 363. [Online]. Available: https://www.mdpi.com/2226-4310/8/12/363.
[53] J.N. Yasin, S.A.S. Mohamed, M.H. Haghbayan, J. Heikkonen, H. Tenhunen, J. Plosila, Unmanned aerial vehicles (UAVs): collision avoidance systems and approaches, IEEE Access 8 (2020) 105139–105155, https://doi.org/10.1109/ACCESS.2020.3000064.
[54] A. Al-Kaff, D. Martín, F. García, A.d.l. Escalera, J. María Armingol, Survey of computer vision algorithms and applications for unmanned aerial vehicles, Expert Syst. Appl. 92 (2018) 447–463, https://doi.org/10.1016/j.eswa.2017.09.033.
[55] A.M. Gaber, et al., Development of an autonomous IoT-based drone for campus security, ELEKTRIKA J. Electr. Eng. 20 (2–2) (2021) 70–76. [Online]. Available: https://elektrika.utm.my/index.php/ELEKTRIKA_Journal/article/view/295.
[56] U. Emmanuel, B. Yekini, Review of agricultural unmanned aerial vehicles (UAV) obstacle avoidance system, in: Proceedings of the IEEE Nigeria 4th International Conference on Disruptive Technologies for Sustainable Development (NIGERCON), 2022, pp. 1–4, https://doi.org/10.1109/NIGERCON54645.2022.9803184.
[57] P. Radoglou-Grammatikis, P. Sarigiannidis, T. Lagkas, I. Moscholios, A compilation of UAV applications for precision agriculture, Comput. Netw. 172 (2020), Art. no. 107148, https://doi.org/10.1016/j.comnet.2020.107148.
[58] C. Wargo, G.C. Church, J. Glaneueski, M. Strout, Unmanned Aircraft Systems (UAS) research and future analysis, in: Proceedings of the IEEE Aerospace Conference, 2014, pp. 1–16.
[59] H.Y. Lin, X.Z. Peng, Autonomous quadrotor navigation with vision based obstacle avoidance and path planning, IEEE Access 9 (2021) 102450–102459, https://doi.org/10.1109/ACCESS.2021.3097945.
[60] A. Al-Kaff, F. Garcia, D. Martin, A. De la Escalera, J.M. Armingol, Obstacle detection and avoidance system based on monocular camera and size expansion algorithm for UAVs, Sensors 17 (5) (2017), Art. no. 1061, https://doi.org/10.3390/s17051061.
[61] Z. Zhang, M. Xiong, H. Xiong, Monocular depth estimation for UAV obstacle avoidance, in: Proceedings of the 4th International Conference on Cloud Computing and Internet of Things (CCIOT), IEEE, 2019, pp. 43–47.
[62] R.P. Padhy, F. Xia, S.K. Choudhury, P.K. Sa, S. Bakshi, Monocular vision aided autonomous UAV navigation in indoor corridor environments, IEEE Trans. Sustain. Comput. 4 (1) (2019) 96–108, https://doi.org/10.1109/TSUSC.2018.2810952.
[63] S. Back, G. Cho, J. Oh, X.T. Tran, H. Oh, Autonomous UAV trail navigation with obstacle avoidance using deep neural networks, J. Intell. Robot. Syst. 100 (3–4) (2020) 1195–1211, https://doi.org/10.1007/s10846-020-01254-5.
[64] S. Hrabar, An evaluation of stereo and laser-based range sensing for rotorcraft unmanned aerial vehicle obstacle avoidance, J. Field Robot. 29 (2) (2012) 215–239, https://doi.org/10.1002/rob.21404.
[65] S. Hrabar, 3D path planning and stereo-based obstacle avoidance for rotorcraft UAVs, in: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, IEEE, 2008, pp. 807–814.
[66] Z. Cook, L. Zhao, J. Lee, W. Yim, Unmanned aerial vehicle for hot-spot avoidance with stereo FLIR cameras, in: Proceedings of the 12th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI), IEEE, 2015, pp. 318–319.
[67] Y. Xiao, X. Lei, S. Liao, Research on UAV multi-obstacle detection algorithm based on stereo vision, in: Proceedings of the IEEE 3rd Information Technology, Networking, Electronic and Automation Control Conference (ITNEC), IEEE, 2019, pp. 1241–1245.
[68] M.K. Cheong, M.R. Bahiki, S. Azrad, Development of collision avoidance system for useful UAV applications using image sensors with laser transmitter, in: Proceedings of the 6th AEROTECH Conference - Innovation in Aerospace Engineering and Technology, Kuala Lumpur, Malaysia, Nov. 8–9, 2016, IOP Conference Series: Materials Science and Engineering 152 (2016), https://doi.org/10.1088/1757-899x/152/1/012026.
[69] J. Hu, Y. Niu, Z. Wang, Obstacle avoidance methods for rotor UAVs using RealSense camera, in: Proceedings of the Chinese Automation Congress (CAC), IEEE, 2017, pp. 7151–7155.
[70] L. Miccinesi, et al., Geo-referenced mapping through an anti-collision radar aboard an unmanned aerial system, Drones 6 (3) (2022), Art. no. 72, https://doi.org/10.3390/drones6030072.
[71] J. Hou, Q. Zhang, Y. Zhang, K. Zhu, Y. Lv, C. Yu, Low altitude sense and avoid for MUAV based on stereo vision, in: Proceedings of the 35th Chinese Control Conference (CCC), IEEE, 2016, pp. 5579–5584.
[72] M. Iacono, A. Sgorbissa, Path following and obstacle avoidance for an autonomous UAV using a depth camera, Robot. Auton. Syst. 106 (2018) 38–46, https://doi.org/10.1016/j.robot.2018.04.005.
[73] D.M. Randelovic, G.S. Vorotovic, A.C. Bengin, P.N. Petrovic, Quadcopter altitude estimation using low-cost barometric, infrared, ultrasonic and LiDAR sensors, FME Trans. 49 (1) (2021) 21–28, https://doi.org/10.5937/fme2101021R.
[74] S. Ramasamy, R. Sabatini, A. Gardi, J. Liu, LiDAR obstacle warning and avoidance system for unmanned aerial vehicle sense-and-avoid, Aerosp. Sci. Technol. 55 (2016) 344–358, https://doi.org/10.1016/j.ast.2016.05.020.
[75] A. Singletary, K. Klingebiel, J. Bourne, A. Browning, P. Tokumaru, A. Ames, Comparative analysis of control barrier functions and artificial potential fields for obstacle avoidance, in: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2021, pp. 8129–8136, https://doi.org/10.1109/IROS51168.2021.9636670.
[76] C.S. Gadde, M.S. Gadde, N. Mohanty, S. Sundaram, Fast obstacle avoidance motion in small quadcopter operation in a cluttered environment, in: Proceedings of the IEEE International Conference on Electronics, Computing and Communication Technologies (CONECCT), 2021, pp. 1–6, https://doi.org/10.1109/CONECCT52877.2021.9622631.
[77] Y. Li, M. Scanavino, E. Capello, F. Dabbene, G. Guglieri, A. Vilardi, A novel distributed architecture for UAV indoor navigation, Transp. Res. Procedia 35 (2018) 13–22, https://doi.org/10.1016/j.trpro.2018.12.003.
[78] A. Moffatt, E.G. Platt, B. Mondragon, A. Kwok, D. Uryeu, S. Bhandari, Obstacle detection and avoidance system for small UAVs using a LiDAR, in: Proceedings of the International Conference on Unmanned Aircraft Systems (ICUAS), 2020, pp. 633–640.
[79] L. Zheng, P. Zhang, J. Tan, F. Li, The obstacle detection method of UAV based on 2D LiDAR, IEEE Access 7 (2019) 163437–163448, https://doi.org/10.1109/ACCESS.2019.2952173.
[80] S. Huang, H.Z. Huang, Q. Zeng, P. Huang, A robust 2D LiDAR SLAM method in complex environment, Photonic Sens. 12 (4) (2022), Art. no. 220416, https://doi.org/10.1007/s13320-022-0657-6.
[81] K.V. Stefanik, J.C. Gassaway, K. Kochersberger, A.L. Abbott, UAV-based stereo vision for rapid aerial terrain mapping, GISci. Remote Sens. 48 (1) (2011) 24–49, https://doi.org/10.2747/1548-1603.48.1.24.
[82] M.Z. Butt, S.T. Gul, Range and doppler estimation of multiple moving targets for pulsed doppler radars with CFAR detector at very low SNRs, in: Proceedings of
the International Conference on Emerging Technologies (ICET), 2014, pp. 147–152, https://doi.org/10.1109/ICET.2014.7021034.
[83] Y.K. Kwag, M.S. Choi, C.H. Jung, K.Y. Hwang, Collision avoidance radar for UAV, in: Proceedings of the CIE International Conference on Radar, 2006, pp. 1–4, https://doi.org/10.1109/ICR.2006.343231.
[84] M.P. Owen, S. Duffy, M.W.M. Edwards, Unmanned aircraft sense and avoid radar: surrogate flight testing performance evaluation, in: Proceedings of the IEEE Radar Conference, 2014, pp. 0548–0551.
[85] A. Viquerat, L. Blackhall, A. Reid, S. Sukkarieh, G. Brooker, Reactive collision avoidance for unmanned aerial vehicles using doppler radar, in: Field and Service Robotics: Results of the 6th International Conference, Springer, 2008, pp. 245–254.
[86] G. Rankin, A. Tirkel, A. Leukhin, Millimeter wave array for UAV imaging MIMO radar, in: Proceedings of the 16th International Radar Symposium (IRS), Dresden, 2015.
[87] N. Gageik, T. Müller, S. Montenegro, Obstacle Detection and Collision Avoidance Using Ultrasonic Distance Sensors for an Autonomous Quadrocopter, University of Wurzburg, Aerospace Information Technology, Wurzburg, Germany, 2012, pp. 3–23.
[88] L.Y. Yang, X.K. Feng, J. Zhang, X.Q. Shu, Multi-ray modeling of ultrasonic sensors and application for micro-UAV localization in indoor environments, Sensors 19 (8) (2019), Art. no. 1770, https://doi.org/10.3390/s19081770.
[89] S. Suherman, R.A. Putra, M. Pinem, Ultrasonic sensor assessment for obstacle avoidance in quadcopter-based drone system, in: Proceedings of the 3rd International Conference on Mechanical, Electronics, Computer, and Industrial Technology (MECnIT), IEEE, 2020, pp. 50–53.
[90] N. Gageik, P. Benz, S. Montenegro, Obstacle detection and collision avoidance for a UAV with complementary low-cost sensors, IEEE Access 3 (2015) 599–609, https://doi.org/10.1109/access.2015.2432455.
[91] M. Mujahed, D. Fischer, B. Mertsching, Admissible gap navigation: a new

Conference of the Society of Instrument and Control Engineers of Japan (SICE), 2017, pp. 1316–1319, https://doi.org/10.23919/SICE.2017.8105744.
[108] H. Song, S. Hu, W. Jiang, Q. Guo, M. Zhu, Artificial potential field-based multi-UAV formation control and target tracking, Int. J. Aerosp. Eng. 2022 (2022), Art. no. 4253558, https://doi.org/10.1155/2022/4253558.
[109] Y.A. Yan, Z.Y. Lv, J.B. Yuan, S.F. Zhang, Obstacle avoidance for multi-UAV system with optimized artificial potential field algorithm, Int. J. Robot. Autom. 36 (2021), https://doi.org/10.2316/j.2021.206-0610.
[110] Y. Huang, J. Tang, S.Y. Lao, UAV group formation collision avoidance method based on second-order consensus algorithm and improved artificial potential field, Symmetry 11 (9) (2019), Art. no. 1162, https://doi.org/10.3390/sym11091162.
[111] X.J. Jiang, Y. Deng, UAV track planning of electric tower pole inspection based on improved artificial potential field method, J. Appl. Sci. Eng. 24 (2) (2021) 123–132, https://doi.org/10.6180/jase.202104_24(2).0001.
[112] S. Ouahouah, M. Bagaa, J. Prados-Garzon, T. Taleb, Deep-reinforcement-learning-based collision avoidance in UAV environment, IEEE Internet Things J. 9 (6) (2022) 4015–4030, https://doi.org/10.1109/jiot.2021.3118949.
[113] G.D. Chen, et al., Distributed non-communicating multi-robot collision avoidance via map-based deep reinforcement learning, Sensors 20 (17) (2020), Art. no. 4836, https://doi.org/10.3390/s20174836.
[114] H.T. Do, et al., Energy-efficient unmanned aerial vehicle (UAV) surveillance utilizing artificial intelligence (AI), Wirel. Commun. Mob. Comput. 2021 (2021), Art. no. 8615367, https://doi.org/10.1155/2021/8615367.
[115] C.Y. Tan, S. Huang, K.K. Tan, R.S.H. Teo, Three dimensional collision avoidance for multi unmanned aerial vehicles using velocity obstacle, J. Intell. Robot. Syst. 97 (1) (2020) 227–248, https://doi.org/10.1007/s10846-019-01055-5.
[116] Z. Chen, F. Luo, C. Zhai, Obstacle avoidance strategy for quadrotor UAV based on
collision avoidance approach, Robot. Auton. Syst. 103 (2018) 93–110, https:// improved particle swarm optimization algorithm, in: Proceedings of the Chinese
doi.org/10.1016/j.robot.2018.02.008, 2018/05/01/. Control Conference (CCC), 2019, pp. 8115–8120, https://doi.org/10.23919/
[92] L.E. Kavraki, P. Svestka, J.C. Latombe, M.H. Overmars, Probabilistic roadmaps for ChiCC.2019.8865866, 27-30 July 2019.
path planning in high-dimensional configuration spaces, IEEE Trans. Robot. [117] A. Santos, J. Castellanos, A.M. Reyes Duke, Ros: an autonomous robot operating
Autom. 12 (4) (1996) 566–580, https://doi.org/10.1109/70.508439. system for simultaneous localization and mapping using A 2D LiDAR sensor, in:
[93] I. Ulrich, J. Borenstein, VFH+: reliable obstacle avoidance for fast mobile robots, Proceedings of the 19th LACCEI International Multi-Conference for Engineering,
in: Proceedings of the IEEE International Conference on Robotics and Automation Education, and Technology, 2021, https://doi.org/10.18687/
(Cat. No.98CH36146) 2, 1998, pp. 1572–1577, https://doi.org/10.1109/ LACCEI2021.1.1.520.
ROBOT.1998.677362, 20-20 May 1998vol.2. [118] A. Gonzalez-Garcia, I. Collado-Gonzalez, R. Cuan-Urquizo, C. Sotelo, D. Sotelo,
[94] I.P. Sary, Y.P. Nugraha, M. Megayanti, E.M.I. Hidayat, B.R. Trilaksono, Design of H. Castañeda, Path-following and LiDAR-based obstacle avoidance via NMPC for
obstacle avoidance system on hexacopter using vector field histogram-plus, in: an autonomous surface vehicle, Ocean Eng. 266 (2022) 112900, https://doi.org/
Proceedings of the IEEE 8th International Conference on System Engineering and 10.1016/j.oceaneng.2022.112900, 2022/12/15/.
Technology (ICSET), 2018, pp. 18–23. [119] L. Markovic, M. Kovac, R. Milijas, M. Car, S. Bogdan, Error state extended kalman
[95] I. Ulrich, J. Borenstein, VFH/sup */: local obstacle avoidance with look-ahead filter multi-sensor fusion for unmanned aerial vehicle localization in GPS and
verification, in: Proceedings of the ICRA, Millennium Conference, IEEE magnetometer denied indoor environments, in: Proceedings of the International
International Conference on Robotics and Automation, Symposia Proceedings Conference on Unmanned Aircraft Systems, ICUAS 2022, 2022, pp. 184–190,
(Cat. No.00CH37065) 3, 2000, pp. 2505–2511, https://doi.org/10.1109/ https://doi.org/10.1109/ICUAS54217.2022.9836124 [Online]. Available,
ROBOT.2000.846405, 24-28 April 2000vol.3. https://www.scopus.com/inward/record.uri?eid=2-s2.0-85136117940&doi=10
[96] A. Chakravarthy, D. Ghose, Obstacle avoidance in a dynamic environment: a .1109%2fICUAS54217.2022.9836124&partnerID=40&md5=35daa5dd9075e6
collision cone approach, IEEE Trans. Syst. Man Cybern. Part A Syst. Hum. 28 (5) 3edd8a832ce9b4b5c8.
(1998) 562–574, https://doi.org/10.1109/3468.709600. [120] A. Singh, A. Payal, Development of a low-cost Collision Avoidance System based
[97] J. Seo, Y. Kim, S. Kim, A. Tsourdos, Collision avoidance strategies for unmanned on Coulomb’s inverse-square law for Multi-rotor Drones (UAVs), in: Proceedings
aerial vehicles in formation flight, IEEE Trans. Aerosp. Electron. Syst. 53 (6) of the International Conference on Computational Performance Evaluation,
(2017) 2718–2734, https://doi.org/10.1109/taes.2017.2714898. Dec. ComPE 2021, 2021, pp. 306–316, https://doi.org/10.1109/
[98] A. Chakravarthy, D. Ghose, Cooperative pursuit guidance to surround intruder ComPE53109.2021.9752133 [Online]. Available, https://www.scopus.com/inwa
swarms using collision cones, J. Aerosp. Inf. Syst. 17 (8) (2020) 454–469, https:// rd/record.uri?eid=2-s2.0-85128952920&doi=10.1109%2fComPE53109.2021.9
doi.org/10.2514/1.I010790. Aug. 752133&partnerID=40&md5=685c53dc851aebbeb2782c3f23e23f63.
[99] Z.X. Ming, H.L. Huang, A 3D vision cone based method for collision free [121] S. Karam, et al., Micro and macro quadcopter drones for indoor mapping to
navigation of a quadcopter UAV among moving obstacles, Drones 05 (04) (2021) support disaster management, ISPRS Ann. Photogramm. Remote Sens. Spat. Inf.
134, https://doi.org/10.3390/drones5040134. DecArt no. Sci. V-1-2022 (2022) 203–210, https://doi.org/10.5194/isprs-annals-V-1-2022-
[100] M. Gnanasekera, J. Katupitiya, A time-efficient method to avoid collisions for 203-2022, 05/17.
collision cones: an implementation for UAVs navigating in dynamic [122] A. Devos, E. Ebeid, P. Manoonpong, Development of autonomous drones for
environments, Drones 6 (5) (2022) 106, https://doi.org/10.3390/ adaptive obstacle avoidance in real world environments, in: Proceedings of the
drones6050106. MayArt no. 21st Euromicro Conference on Digital System Design (DSD), 2018, pp. 707–710,
[101] P. Fiorini, Z. Shiller, Motion planning in dynamic environments using velocity https://doi.org/10.1109/DSD.2018.00009, 29-31 Aug. 2018.
obstacles, Int. J. Robot. Res. 17 (7) (1998) 760–772. [123] F. Azevedo, J.S. Cardoso, A. Ferreira, T. Fernandes, M. Moreira, L. Campos,
[102] J.V.d. Berg, L. Ming, D. Manocha, Reciprocal Velocity Obstacles for real-time Efficient reactive obstacle avoidance using spirals for escape, Drones 5 (2) (2024),
multi-agent navigation, in: Proceedings of the IEEE International Conference on https://doi.org/10.3390/drones5020051.
Robotics and Automation, 2008, pp. 1928–1935, https://doi.org/10.1109/ [124] A. Abdulov, A. Abramenkov, K. Rusakov, A. Shevlyakov, Problems solved during
ROBOT.2008.4543489, 19-23 May 2008. AEROBOT-2021 UAV challenge, Procedia Comput. Sci. 207 (2022) 2077–2085,
[103] A.C. Woods, H.M. La, Dynamic target tracking and obstacle avoidance using a https://doi.org/10.1016/j.procs.2022.09.267, 2022/01/01/[Online]. Available,
drone. Advances in Visual Computing, Springer International Publishing, Cham, https://www.sciencedirect.com/science/article/pii/S187705092201153X.
2015, pp. 857–866. G. Bebis et al., Eds., 2015//. [125] A. Carrio, Y. Lin, S. Saripalli, P. Campoy, Obstacle detection system for small
[104] Z. Yingkun, Flight path planning of agriculture UAV based on improved artificial UAVs using ADS-B and thermal imaging, J. Intell. Robot. Syst. 88 (2) (2017)
potential field method, in: Proceedings of the Chinese Control And Decision 583–595, https://doi.org/10.1007/s10846-017-0529-2, 2017/12/01.
Conference (CCDC), 2018, pp. 1526–1530, https://doi.org/10.1109/ [126] L. Yucong, S. Saripalli, Sense and avoid for Unmanned Aerial Vehicles using ADS-
CCDC.2018.8407369, 9-11 June 2018. B, in: Proceedings of the IEEE International Conference on Robotics and
[105] X. Fan, Y. Guo, H. Liu, B. Wei, W. Lyu, Improved artificial potential field method Automation (ICRA), 2015, pp. 6402–6407, https://doi.org/10.1109/
applied for AUV path planning, Math. Probl. Eng. 2020 (2020) 6523158, https:// ICRA.2015.7140098, 26-30 May 2015.
doi.org/10.1155/2020/6523158, 2020/04/27. [127] Z. Fu, Y. Chen, Y. Ding, D. He, Pollution source localization based on multi-UAV
[106] Y. Du, X. Zhang, Z. Nie, A real-time collision avoidance strategy in dynamic cooperative communication, IEEE Access 7 (2019) 29304–29312, https://doi.
airspace based on dynamic artificial potential field algorithm, IEEE Access 7 org/10.1109/ACCESS.2019.2900475.
(2019) 169469–169479.
[107] J. Mok, Y. Lee, S. Ko, I. Choi, H.S. Choi, Gaussian-mixture based potential field
approach for UAV collision avoidance, in: Proceedings of the 56th Annual
M.Z. Butt et al. Robotics and Autonomous Systems 173 (2024) 104629
Muhammad Zohaib Butt received his Bachelor of Engineering in Electrical Engineering from the National University of Sciences and Technology, Pakistan, in 2012, and an M.S. in Systems Engineering from the Pakistan Institute of Engineering and Applied Sciences, Islamabad, in 2014. He is currently a research assistant and Ph.D. candidate in the Department of Mechanical Engineering, Universiti Teknologi Malaysia. His research interests include radar signal processing, embedded control systems, electronics, and robotics.

Rozeha A. Rashid received her B.Sc. degree in electrical engineering from the University of Michigan, Ann Arbor, USA, and her M.E.E. and Ph.D. degrees in telecommunication engineering from Universiti Teknologi Malaysia (UTM). She is a senior lecturer in the Communication Engineering Program, School of Electrical Engineering, Universiti Teknologi Malaysia, and is currently the Head of the Telecommunication Software and System (TeSS) research group. She is an IEEE member and has more than 140 publications, mostly in the area of telecommunication engineering. Her current research interests include wireless communications, sensor networks, cognitive radio, and the Internet of Things (IoT).