
Intelligent Raspberry Pi for UAV Swarm Scheme

Abdelrahman R. S. Almassri1, Naim Ajlouni2


1
Istanbul Aydin University / Faculty of Software Engineering, Department of Artificial Intelligence & Data Science, Istanbul, Turkey
2
Istanbul Atlas University / Faculty of Software Engineering, Istanbul, Turkey

Corresponding author: Abdelrahman R. S. Almassri (e-mail: [email protected]) ORCID 0000-0001-5450-7956

Article Info
Received: January 04, 2021
Accepted: April 03, 2021
Published: April 26, 2021

Keywords: Artificial Intelligence; UAV; A* algorithm; Obstacle avoidance; Swarm; Autonomous flight

ABSTRACT
Swarms of drones are increasingly being requested to carry out missions that single drones cannot complete. Particularly in civil security, high needs emerge in the surveillance and observation of hostile areas. Existing solutions do not meet this demand, as they utilise heavy infrastructures without consideration of the quality of service (especially in terms of throughput). This paper proposes a light and efficient solution to synchronise a swarm of drones based only on ad hoc communications to position the drones. The proposed method operates a semi-autonomous leader drone, while all the others (the followers) are autonomous and follow the Leader using commands sent by the Leader via a communication channel. The proposed method emulates squadrons' fundamental flight operations. The test results show that the process is very efficient, with all follower drones responding efficiently.

I. INTRODUCTION

Human beings have achieved great technological development in rescue methods; however, it remains difficult to locate missing persons as quickly as needed. In search and rescue cases, many studies of numerous operations indicate that the survival ratio declines significantly during the first 18 hours and falls to zero after 20 hours (A. L. Adams 2007). Therefore, this study focuses on developing a system utilising multiple drones as a swarm of unmanned aerial vehicles (UAVs). Recently, UAVs, or drones, have been widely used in a variety of applications, such as mapping (Madawalagama 2016), precision agriculture (Chandra 2018), surveying, infrastructure inspection (I. Sa 2015), and search and rescue (SAR) (Rudol 2007). Many studies on drones have been conducted for search and rescue operations and for detecting single or multiple objects, which have become a global standard in rescue operations. The following are considered the main challenges in the use of a UAV platform:

• Covering a specific area in the shortest possible time.
• Coordinating between drones covering different areas, which helps to utilise search time effectively.
• Protecting the data transmitted between the drones themselves on the one hand and with the rescue team on the other.
• Reducing power consumption to preserve the battery and achieve the longest flying time.
• Limited memory space and computational power, since a typical drone is resource-constrained.
• Processing the camera's input data with low latency to ensure suitable performance.

This study is concerned with using a swarm of drones for search and rescue plus surveillance operations. The swarm utilises the idea of a Leader and Followers. The Leader receives the information about the task that needs to be executed; in turn, the Leader passes the required information to its followers, including their particular sub-tasks. The followers also receive flight instructions, including the coordinates of the location they are travelling to. Both the Leader and its followers utilise several algorithms to achieve their tasks during

VOLUME 02, NO: 01. 2021



their flight. These algorithms include path planning, object avoidance, detection, identification, and tracking. This process depends on several indicators, including creating a squadron of drones, because it provides many advantages compared to non-swarms, such as area coverage, speed, and coordinated effects (R. Arnold 2019) (B. Abruzzo 2019). One more important advantage is that it allows collective control of many agents while only a limited number of mission objectives need to be prepared. UAV swarms can be equipped to communicate and take decisions quickly (R. Arnold 2019), and this has proven to be more efficient than using one drone with many decision-making individuals.

Each drone can receive information that helps in the searching and rescuing processes, such as missing persons' data, speed, and location. Meanwhile, human decision-makers and rescuers cannot receive or process this data as quickly as it can be transmitted, whereas drones running artificial intelligence algorithms can process data and take decisions in milliseconds. This combination of advantages undoubtedly makes the squadron more efficient. Second, the drones are equipped with cameras to assist in the finding and rescue process, the camera being one of the most important elements of drone surveillance. A drone is able to cover large spaces, although that depends on the shooting height, which affects the apparent location and size of the object.

Moreover, the visibility and size of the object (S. Sambolek 2019) are affected by lighting and weather. Making object detection efficient requires creating an artificial intelligence model and training it to detect and recognise a group of objects, such as a human, a rock, or a tree. Lately, deep learning, especially models based on convolutional neural networks, has contributed to achieving object detection.

Third, when a group of drones is flying, they have to communicate and coordinate to execute the required operations efficiently. The basic data exchange is based on long-range protocols (LoRa) (Redmon 2016) and IEEE 802.11s (Seller 2013); for example, this data is the path information for each drone to take, which opens up the possibility for many applications in wide-area coverage. Long Range Wide Area Network (LoRaWAN (The Things Network 2020), at the Medium Access Control (MAC) layer) as well as LoRa (at the physical (PHY) layer) are attractive as they allow users to reach long distances (on the order of several kilometres) in Line-of-Sight (LOS) applications (Callebaut 2019)(Petäjäjärvi 2015), e.g., inside the swarm. In addition, IEEE 802.11 (Wi-Fi) (IEEE 2016) has been adopted in many applications, especially those using live video broadcasting.

The remainder of this paper is organised as follows: Section 2 discusses UAV architecture and sensor payload. Section 3 presents autonomous obstacle avoidance. Section 4 describes the ground control unit. Section 5 presents the results and discussion. Finally, Section 6 provides some conclusions.

II. UAV ARCHITECTURE AND SENSOR PAYLOAD

In this study, a mix of Raspberry Pi 3 and 4 boards is used to control the swarm of drones autonomously. This confirms that the Raspberry Pi 3 and 4 can perform the same task and that their differences do not hinder their operational capabilities. Along with the Raspberry Pi, each drone also carries two cameras (visual and infrared) plus ultrasound sensors to aid in positioning the drones within the swarm and to detect obstacles during the execution of drone duties. In this section, the overall structure of the proposed system is presented.

A. System Overview

Among the huge set of applications for drones, there is a real interest in video surveillance and observation. A fixed-wing drone or a quadcopter can be operated to realise such a mission, but the possibilities are limited in the sense that the covered area is reduced (because of radio range and the characteristics of a single drone). To cover an extended area or allow long-distance real-time video, we propose a basic drone swarm using the notion of Leader and followers. In the proposed swarm, followers act as relays; the Leader is human-piloted and responsible for real-time video capture, deciding on targets, and operational changes. The drones and the control centre are all interconnected using extended-coverage internet access, and all are capable of peer-to-peer communication. This solves the problem of leader loss due to technical or operational problems: if for any reason the Leader is lost or damaged during an operation, the command centre will promote a follower to leader, which will then control the remaining followers.
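The leader-replacement rule described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation; the class and method names, the heartbeat mechanism, and the 5-second timeout are all assumptions.

```python
import time

HEARTBEAT_TIMEOUT_S = 5.0  # assumed timeout; the paper does not specify one

class CommandCentre:
    """Minimal sketch of the leader-failover rule: if the Leader's
    status messages stop arriving, promote the next follower."""

    def __init__(self, leader_id, follower_ids):
        self.leader_id = leader_id
        self.followers = list(follower_ids)
        self.last_seen = {}  # drone id -> time of last heartbeat

    def heartbeat(self, drone_id, now=None):
        """Record a status message from a drone."""
        self.last_seen[drone_id] = time.monotonic() if now is None else now

    def check_leader(self, now=None):
        """Return the current leader, promoting a follower if the
        leader has been silent longer than the timeout."""
        now = time.monotonic() if now is None else now
        seen = self.last_seen.get(self.leader_id)
        if seen is not None and now - seen <= HEARTBEAT_TIMEOUT_S:
            return self.leader_id  # leader is healthy
        if not self.followers:
            raise RuntimeError("no follower available for promotion")
        self.leader_id = self.followers.pop(0)  # promote the next follower
        return self.leader_id
```

A ground station loop would call `heartbeat` on every received status message and `check_leader` periodically; how the real system detects loss is not detailed in the paper.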

VOLUME 02, NO: 01. 2021


Abdelrahman R. S. Almassri | Intelligent Raspberry Pi for UAV Swarm Scheme.

FIGURE 1. A proposed solution with a basic line formation.

By the proposed method, leader L executes the commands transmitted from the command centre and relays sub-commands (tasks) to the followers to execute. The followers and the Leader all transmit their videos and images back to the command centre. Should any of the followers or a sub-swarm operate outside the internet zone for any reason, the nearest swarm member will act as a relay, transmitting the information back to the command centre.

The distance between drones is predefined to be 3 meters (± 10%). The maintain_distance function keeps track of the current position relative to all neighbouring drones and adjusts the drone's position accordingly, utilising both GPS and distance measurements from a distance sensor mounted on the drone. Each drone uses several sensors to cover 360°.

The communication between the command centre and the drones, and from drone to drone, is achieved via the internet. Figure 2 shows the actual setup and the different signals and commands exchanged between the drones, the command centre, and the leader drone.

FIGURE 2. Communication setup between the command centre and leader drone, and drone to drone.

B. SYSTEM COMPONENT

The drone architecture and main-component connectivity are illustrated in Figure 3. The figure shows that the drone's main controller is the Pixhawk (PX4); the drone is also equipped with GPS, plus all the internal flight-controller sensors, including an accelerometer, magnetometer, barometer, and gyroscope. In Loiter mode, the quadrotor holds its position based on Global Positioning System (GPS) and barometric altitude measurements. The data collected by the onboard sensors is stored in a log file by the Pixhawk, which is accessed and analysed post-flight. The quadcopter was assembled as given in Figure 3, based on the following:

• Body frame (F450 quadcopter and F550 hexacopter)
• Propulsion system (lift power)
• Pixhawk flight controller (PX4)
• Raspberry Pi 4 (companion computer)
• Raspberry Pi Camera version 2
• 6-channel 2.4 GHz wireless radio telemetry
• Neo GPS module

FIGURE 3. Drone structure and main components connectivity.

The above configuration also includes the ground controller, a customised controller that allows a human operator to control, customise, and plan operations and tasks for the swarm system. In this study, the swarm is made up of two UAVs, a quadcopter and a hexacopter; both are based on the same hardware and software, with the exception of the additional motors in the hexacopter.
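The maintain_distance behaviour described earlier in this section can be sketched as a simple proportional correction. The function signature, the local-frame coordinates, and the gain are illustrative assumptions; only the 3 m (± 10%) spacing comes from the paper.

```python
import math

TARGET_M = 3.0    # predefined inter-drone distance (from the paper)
TOLERANCE = 0.10  # ± 10% band (from the paper)

def maintain_distance(own_xy, neighbours_xy, gain=0.5):
    """Return a 2-D position correction (metres) that nudges the drone
    back toward the 3 m (± 10%) spacing relative to each neighbour.

    own_xy and neighbours_xy are local-frame coordinates in metres,
    e.g. fused from GPS and the onboard distance sensors; the fusion
    itself and the proportional gain are assumptions, not from the paper."""
    dx_total, dy_total = 0.0, 0.0
    for nx, ny in neighbours_xy:
        dx, dy = own_xy[0] - nx, own_xy[1] - ny
        dist = math.hypot(dx, dy)
        if dist == 0.0 or abs(dist - TARGET_M) <= TARGET_M * TOLERANCE:
            continue  # inside the tolerance band: no correction needed
        error = dist - TARGET_M  # > 0 means too far: move closer
        dx_total -= gain * error * dx / dist
        dy_total -= gain * error * dy / dist
    return dx_total, dy_total
```

For example, a drone 5 m from a single neighbour receives a correction of 1 m toward it, while a drone within the ± 10% band receives no correction.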

The Raspberry Pi 4 and the Pixhawk communicate through the TX-RX pins connected to the telemetry

port on the Pixhawk; the two boards communicate using the MAVLink protocol. A library called DroneKit was downloaded onto the Pi, containing all the various dependencies for Pi and Pixhawk communication.

C. EQUATION OF MOTION

In this study, we use North-East-Down (NED) coordinates as our world frame to determine the position and orientation of the copter. Our body frame is fixed at the centre of the copter, and initially its three axes are parallel to the axes of the world frame. We require all propellers to thrust upwards, not downwards, to use the propellers efficiently; this can be guaranteed by matching the propeller type and motor spin direction. The thrust and torque produced by the i-th motor are characterised by u_i (propeller thrust), d_i (motor orientation in the body frame), b_i (motor spin direction: -1 for counterclockwise, 1 for clockwise), and λ_i (propeller torque-thrust ratio). From Newton's second law and Euler's equation, the dynamics are:

m·p̈ = m·g + Σ_{i=1..n} u_i d_i   (1)

J·ω̇ + ω × Jω = Σ_{i=1..n} (b_i λ_i u_i d_i + r_i × u_i d_i)   (2)

Since the net thrust and torque on the right-hand side are linear in u_i, we introduce the matrices M_f (mapping from thrusts to net force; its i-th column is d_i) and M_t (mapping from thrusts to net torque; its i-th column is b_i λ_i d_i + r_i × d_i) for a compact representation:

m·p̈ = m·g + R·M_f·u   (3)

J·ω̇ + ω × Jω = M_t·u   (4)

With M_f and M_t defined explicitly, the motor positions {r_i} can be computed from the shape parameter s by r_i = A_i s + b_i, where all A_i and b_i are constant matrices and vectors from the parametric shape representation; we therefore omit {r_i} and use only s.

D. PERCEPTION: OBSTACLE DETECTION

Perception is the first step in any collision avoidance system. To detect obstacles, the drone should perceive its surroundings and environment. For this, it needs to be equipped with one or more sensors working as a perception unit (C. H. R. Everett 1989). For remote sensing systems, sensors such as imaging sensors with diverse resolutions are the essential components. Some of the sensors used in observation are LiDAR, visual cameras, thermal or IR cameras, and solid-state or optomechanical devices (C. A. Wargo 2014). The usage of sensors is quite diverse, depending on the needs. To detect an obstacle, different types of sensors are used, which can be mainly categorised into two sets:

• Passive sensors
• Active sensors

Passive sensors detect the energy discharged by the objects or the scenery under observation. Most passive sensors employed in sensing applications are optical or visual cameras, thermal or infrared (IR) cameras, and spectrometers (J. Kim 2012). Different types of cameras work at different wavelengths, for instance visible light, infrared (short-wave, near-wave, mid-wave, and long-wave), and the ultraviolet (UV) band.

Optical or visual sensors are cameras such as monocular or stereo cameras that work in visible light (M. B. van Leeuwen 2005). Thermal or infrared cameras, in turn, have a longer wavelength than the visible light range and work in infrared light, i.e. from 700 nm to 14 µm. Consequently, the fundamental difference is that visual cameras form images using visible light, whereas thermal cameras form images based on infrared radiation. While traditional cameras perform poorly in low lighting, IR cameras excel in such conditions (J. Kim 2012). All cameras rely on heavy image processing to extract useful information from the chunks of raw data provided by the sensor, and the extraction of points of interest requires a separate algorithm. An additional algorithm is needed to calculate the range and other parameters of the obstacles, which therefore requires extra processing power (F. Kóta 2019). Besides the limited field of view of the sensor used, visual cameras depend heavily on weather conditions, such as lighting level, fog, or rain (A. Mcfadyen 2014).

1. CAMERA

Visual sensors or cameras rely on capturing images of the environment and objects from which useful information can be extracted. Visual cameras are typically monocular, stereo, or event-based cameras (S. Saha 2014), (S. A. S. Mohamed 2020). The benefits of using cameras are their small size, lesser


weight, lower power consumption, flexibility, and ease of mounting. In contrast, the disadvantages of such sensors include their high dependency on weather conditions, lack of image clarity, and sensitivity to lighting conditions and background colour contrast. These factors have a significant impact on the outcome, as the quality of the captured image drops drastically if any of them comes into play.

An improved Inverse Perspective Mapping (IPM) with a vertical plane model is used to perform a coarse obstacle detection in the bottom third of the image, which makes it only appropriate for slow-moving robots (about 1 m/s). Afterwards, the obstacles are segmented using the Markov random field (MRF) framework, and the distance between the robot and the nearest obstacle is obtained.

2. ACTIVE SENSORS

Active sensors work on the mechanism of emitting radiation and reading the reflected radiation. An active sensor has a transmitter (source) and a receiver/detector. The transmitter emits a signal in the form of a light wave, an electrical signal, or an acoustic signal; this signal bounces off an object, and the sensor receiver reads the reflected signal (M. Rouse 2020), (Active Sensors 2020). Such sensors are able to penetrate the atmosphere in most conditions.

Ultrasonic sensors work on the principle of emitting sound waves and listening for their reflection back from an object to calculate the distance between the object and the sensor (S. Chaulya 2016). The sound waves are generated at frequencies above the human hearing band, typically 25-50 kHz (J. M. Armingol 2018). The basic principle used to calculate the distance is similar to the one used by radars or LiDARs, i.e., emit a wave, wait until the wave bounced off an object arrives, and calculate the distance from how long the wave took to reflect, using this simple formula:

d = (v · t) / 2   (5)

where d is the distance, v is the speed of the wave, and t is the time of flight.

Ultrasonic sensors are readily available and much cheaper than most other ranging sensors on the market. Unlike LiDARs, ultrasonic sensors are unaffected by an object's transparency; for instance, LiDARs have difficulty detecting clear glass, while ultrasonic sensors are not affected by the colour of objects.

III. Autonomous Obstacle Avoidance

The study requires developing autonomous navigation for the UAV without any external control (remote controls or another control source) while respecting real-time constraints and ensuring high-performance, robust navigation. Indeed, our UAV has to calculate an optimal path from the World Geodetic System (WGS) coordinates indicated by the server, to optimise battery consumption and increase operating performance over the search area. The main features of our UAV are:

• Calculation of the optimal UAV navigation trajectory to cover a search area.
• Automatic piloting of the Raspberry Pi 3B+ embedded Android from the Ubuntu server.
• Detection of obstacles during the navigation of the UAV.
• Live streaming of the mission.

In the sequel, we detail the features mentioned above, starting with the preparation of the area to cover and the computation of the optimal trajectory.

A. Optimal Trajectory Computation

To design a suitable algorithm for the optimal trajectory computation, the terrain must be modelled as an oriented graph. However, intelligent modelling is chosen instead of basing the graph on conventional meshing; this allows for a reduced calculation time on the graph without losing precision (which is the classical problem with meshing). The modelling is based on the notion of a visibility graph. It integrates the consideration of obstacles, danger zones, and the non-holonomy constraint of the aircraft, which implies a maximum steering angle. The resulting graph is then cleaned to keep only the strict minimum necessary for trajectory calculation. The generation of the graph can require much computation time, but this generation is done only once, before the planning stage, and therefore does not affect the trajectory computation times.

The objective of the first part of the problem is to find a path between the start (0) and the goal (1) by

avoiding obstacles (−1). Many algorithms can find a path connecting two points on such a graph; among the best known are Dijkstra's algorithm and the A* algorithm.

In this part, we illustrate the tests performed to validate and verify the behaviour of the A* algorithm in computing an optimal path from the WGS coordinates sent by the server. Figure 4 shows a scenario for finding the optimal path from Point 1 (s_start) to the endpoint (s_goal) while passing through several nodes s1 to s4. To illustrate the advantages of this algorithm, we used the previous example of a map without obstacles, with s_start at (100, 50) and s_goal at (200, 50). The A* cost function is:

F(s) = g(s) + h(s)   (6)

FIGURE 4. A* Algorithm before expansion.

FIGURE 5. A* Algorithm after expansion.

IV. Ground Control Station

In this study, a customised ground control unit is developed to control and edit the operations of the swarms. The ground control unit GUI is shown in Figure 6. The ground control has the ability to:

• Define the swarm leader
• Assign followers to the swarm
• Split the swarm into two or more swarms, with a condition of at least 2 units per swarm
• Direct the swarm to a predefined area by passing the coordinates of the area of operation
• Select the operation for each swarm:
  o Search for an object
  o Track an object
  o Surveillance
• Replace the Leader in any of the swarms

The swarms will send their video feeds back to the ground control; the user will then decide whether there are to be any changes or modifications to the assigned task. The user command will be transmitted to the Leader, and the Leader will pass the new instructions to all assigned followers. The Leader and the followers will execute their assigned tasks autonomously.

FIGURE 6. Ground Control GUI.

The pseudo-code for the selected tasks is presented and discussed in this section. The first activity presented is the basic activity of taking off and landing. The following algorithms describe some of the main operations that the drones perform during all tasks.

Algorithm 1: Automatic take-off algorithm
Data: an existing ad hoc network between the drones and C; L and at least one follower already flying
Result: landing of a follower at the initial position
begin
    if P(C, l) > T_high then
        land the drone labelled l
        assign label l to the drone ahead
    repeat the condition until all the followers are on the ground
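The A* cost function F(s) = g(s) + h(s) of Equation (6) can be illustrated with a minimal grid search. The start = 0, goal = 1, obstacle = −1 encoding follows Section III; the 4-connectivity, unit step cost, and Manhattan heuristic are assumptions for this sketch, not details from the paper.

```python
import heapq

def a_star(grid, start, goal):
    """A* on a 4-connected grid with F(s) = g(s) + h(s), where h is the
    Manhattan distance. Cells holding -1 are obstacles. Returns the path
    as a list of (row, col) cells, or None if the goal is unreachable."""
    def h(cell):
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    rows, cols = len(grid), len(grid[0])
    open_set = [(h(start), 0, start, [start])]  # (F, g, cell, path so far)
    best_g = {start: 0}
    while open_set:
        f, g, cell, path = heapq.heappop(open_set)
        if cell == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            r, c = cell[0] + dr, cell[1] + dc
            nxt = (r, c)
            if 0 <= r < rows and 0 <= c < cols and grid[r][c] != -1:
                ng = g + 1  # unit cost per move
                if ng < best_g.get(nxt, float("inf")):
                    best_g[nxt] = ng
                    heapq.heappush(open_set, (ng + h(nxt), ng, nxt, path + [nxt]))
    return None
```

Because the Manhattan heuristic never overestimates the remaining cost on a 4-connected grid, the first path returned is optimal, which is the property the trajectory tests above rely on.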


Algorithm 2: Automatic landing algorithm
Data: an existing internet network between the drones and C; L already flying and controlled by the operator
Result: automatic landing of all followers
begin
    if P(C, l) > T_low then
        land a new drone F_l
        assign label l to the new drone
    repeat the condition until no drone is left

Algorithm 3: Automatic track algorithm
Data: frame and objects; Followers (F) and Leader (L) in Search mode
Result:
begin
    send frame from F to L
    while objects
        send object(i) to L
    repeat until no objects
    send frame from F to UC
    while objects
        send object(i) to UC
    repeat until no objects
    L will receive frame, object and command(track)
    repeat

Algorithm 4: Automatic point-travel algorithm
Data: frame and objects; Followers (F) and Leader (L) in Search mode
Result:
begin
    while Followers
        calculate distance to point
        Min = F#
    repeat until no Followers
    send point to F#   // this function sends the point to the nearest Follower
    repeat

In Algorithm 1 (take-off), the ground control controls the take-off of the leader drone; this is followed by the automatic take-off of the assigned followers one by one. During this time, the ground control sends the operation details to the Leader. The Leader will pass the relevant information to the followers. Once the Leader and all the followers are in the air, they execute their tasks autonomously, including path planning and object avoidance, which always include a position-correction factor based on the role of the drone, be it a leader or a follower. The priority of the drones is always to execute their operation as a swarm. Once the operation is completed, the drones will always return to their flight pad and land unless ordered to do otherwise. The drones also have to monitor their battery life, as low battery will always force a drone to return to base; this applies to all leader and follower drones. Once the battery life is around 20%, the drone's priority switches to the task of returning and landing safely. If this happens to the Leader, a second drone will be assigned the leadership to continue the operation.

Drone software is embedded in every drone, allowing it to periodically send status messages to its Follower (if any), forwarded to C. A drone receiving such a message treats the content and delivers it to the next drone in the direction of C. Status messages are used to maintain information about the drones (e.g. whether a drone is a follower, the height of the leader drone, etc.).

V. Results and Discussions

This section presents the testing environment and records the results for each test. The intention is to modify the algorithms to improve the results whenever possible. Several tests were carried out, including a simple take-off, hover, and landing in swarm formation; an autonomous flight to a predefined zone based on coordinates received from the ground controller; and, finally, a coordinated zone search for a predefined object.

A. Taking Off and Landing

This test checks the ability of the leader drone to execute the ground control take-off command and its ability to communicate the same command to the follower drone. In the first test, the leader drone is given the command to take off, hover at a height of 2.0 meters, and do nothing for a period of 3 minutes; the Leader executes the command and transmits it to its Follower. The Follower executed the command and hovered 2.5 meters from the Leader for

3 minutes; the test was completed by both Leader and Follower landing.

During the test, it was noticed that both drones drifted a short distance depending on wind direction. However, both drones compensated for this movement and adjusted their position via a correction action.

B. Autonomous Flight to Predefined Coordinates

In this test, the ground control transmitted the flight command to the leader drone; the command included taking off to a height of 2.5 meters, hovering for 2 minutes, then flying from the predefined coordinates (X: 41.098077, Y: 28.98001) to (X: 41.097945, Y: 28.9817).

The leader drone, once the command was received, executed it in sequence:

• Take off.
• Transmit the command to the Follower.
• Hover for 2 minutes.
• Fly to the predefined coordinates.
• Once arrived, continue to hover for 5 minutes.
• The Follower followed the Leader as it moved between the two coordinates, maintaining the minimum distance of 2.5 meters.
• Both Leader and Follower landed after hovering for 2 minutes.

A number of measurements were carried out during the flight to check the accuracy of the execution of the commands. The results of this test are given in Table 1.

Table 1: Test results at a height of 2 m

Test | Transmitted command | Leader | Follower
A | 41.098077, 28.98001 | 41.098078, 28.98002 | 41.0981, 28.98002
B | 41.098004, 28.98045 | 41.098004, 28.98046 | 41.098024, 28.98047
C | 41.097954, 28.98085 | 41.097952, 28.98084 | 41.097965, 28.98086
D | 41.097904, 28.98125 | 41.097903, 28.98124 | 41.097908, 28.98127
E | 41.097945, 28.9817 | 41.097942, 28.9817 | 41.097957, 28.98172

The flight paths are illustrated in Figures 7 and 8. Figure 7 shows the swarm leader's flight path as a result of the commands and flight coordinates received by the Leader from the ground control. The slight variation between the coordinates received and the actual UAV coordinates can be due to GPS accuracy and the slight delay between satellite and GPS communication. From the results, it can be seen that the swarm leader could execute the command efficiently and follow the predefined route accurately.

FIGURE 7. Transmitted coordinates & Leader actual coordinates (latitude vs. longitude plot).

Figure 8 illustrates the flight path of both the Leader and the Follower. The results show that the Leader was able to receive and execute the commands sent by the ground controller and forward a relative command to the designated Follower. It also shows that the Follower received the necessary command from its swarm leader, executed it efficiently, and followed the flight path specified by the Leader. The follower algorithm specifies that the minimum allowable distance between the Follower and the Leader or any other swarm member is 2.5 m ± 10%. The results show that the Follower maintained the minimum-distance condition during the defined test flight.

FIGURE 8. Leader coordinates & Follower coordinates (latitude vs. longitude plot).
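As a plausibility check on the deviations in Table 1, the great-circle distance between a transmitted coordinate and the corresponding actual Leader coordinate can be computed with the standard haversine formula. This is a generic sketch, not part of the paper's toolchain.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS-84 coordinates."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Test A from Table 1: transmitted command vs. actual Leader position
transmitted = (41.098077, 28.98001)
leader = (41.098078, 28.98002)
error_m = haversine_m(*transmitted, *leader)
```

For test A, the Leader's deviation comes out below one metre, which is consistent with the GPS-accuracy explanation given above.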


VI. Conclusions

In this study, a new swarm scheme has been introduced. The scheme is based on human combat flight formation (leader-follower). The swarm operates in an autonomous mode while executing its duties. The platform used to test the scheme consists of two drones: a six-rotor and a four-rotor drone. Autonomous control is achieved using a Raspberry Pi mini-computer mounted on each drone. For communication, the scheme uses regular Wi-Fi or an available internet service. A number of sensors were used for detecting obstacles, objects, and other swarm members. Special artificial intelligence control algorithms were developed to control the swarm and monitor its status during operation. The proposed scheme was tested under different environmental conditions: daylight flying, night flying, and dry and windy weather. The test results show that the algorithms developed for the scheme performed well under all testing conditions. They also show that the flight path was executed very accurately, with both Leader and Follower following the defined flight path closely. Figures 7 and 8 also show that the compensation algorithm used for correcting the drone position during flight performed with high efficiency and applied corrections rapidly, resulting in efficient use of energy and hence an extended battery flight time.

It is intended to extend the work presented in this study by applying the developed scheme to a larger swarm equipped with more sophisticated sensors, which may include MMWave radar for maintaining distance within the swarm and for detecting and identifying obstacles. The use of MMWave technology will enable further applications, including security and surveillance.
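The paper does not give the internals of the position-compensation algorithm, but the idea of correcting the drone toward its commanded waypoint can be sketched as a simple proportional step; everything below (function names, gain, speed limit) is our own hypothetical illustration, not the authors' implementation.

```python
import math

EARTH_R = 6371000.0  # mean Earth radius, metres

def enu_offset(cmd, actual):
    """Approximate east/north offset in metres from `actual` to `cmd`
    (both (lat, lon) in degrees), using a local flat-Earth approximation."""
    dlat = math.radians(cmd[0] - actual[0])
    dlon = math.radians(cmd[1] - actual[1])
    north = dlat * EARTH_R
    east = dlon * EARTH_R * math.cos(math.radians(actual[0]))
    return east, north

def correction_velocity(cmd, actual, gain=0.8, v_max=1.0):
    """Proportional correction: a velocity command (m/s) toward the waypoint,
    clamped to v_max so corrections stay gentle on the battery."""
    east, north = enu_offset(cmd, actual)
    ve, vn = gain * east, gain * north
    speed = math.hypot(ve, vn)
    if speed > v_max:
        ve, vn = ve * v_max / speed, vn * v_max / speed
    return ve, vn

# Example using test point A from Table 1 (commanded vs. Leader position)
ve, vn = correction_velocity((41.098077, 28.98001), (41.098078, 28.98002))
print(f"correction: east {ve:+.2f} m/s, north {vn:+.2f} m/s")
```

A proportional step with a speed clamp keeps corrections small and fast for sub-metre errors, which is consistent with the rapid, energy-efficient correction behaviour reported above.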

