
Proceedings of the International MultiConference of Engineers and Computer Scientists 2018 Vol II

IMECS 2018, March 14-16, 2018, Hong Kong

Development of Drone Capable of Autonomous Flight Using GPS

Masataka Kan, Shingo Okamoto, Member, IAENG, and Jae Hoon Lee, Member, IAENG

Abstract— An experimental drone capable of autonomous flight was developed by equipping a Raspberry Pi 2.0 (Model B) microcomputer for flight control and a GPS sensor on the AR.Drone 2.0 (Power Edition). The GPS was calibrated by measuring the deviation between the position where the GPS sensor was set and the positions measured by the GPS. An experiment on autonomous flight control was then performed, and the autonomous flight control of the drone was evaluated by comparing planned flight routes of straight and L-shaped lines with the routes where the drone actually flew. Consequently, the experimental drone could fly autonomously along the planned flight routes.

Index Terms— Drone, Autonomous Flight, Global Positioning System, Calibration

I. INTRODUCTION

In recent years, drones have been used in a wide range of fields because of their high mobility. KOMATSU Ltd. [1] has carried out a three-dimensional measurement service for building sites using drones. In this service, photos taken from the sky by drones are converted to point-cloud data, and a three-dimensional map is created from those data. Such services reduce measurement costs. Amazon.com Inc. [2] is planning to deliver goods that customers order within 30 minutes by drone. Thus, drones capable of autonomous flight have been receiving a lot of attention.

In addition, drones using GPS are expected to be applied to rescue work in disaster areas. In disaster sites, it can be impossible for rescuers to pass because of the fragments of destroyed buildings. Therefore, swift searches for victims are required regardless of the conditions in the affected areas. Using drones capable of autonomous flight, the situation in affected areas can be checked safely from the sky. Thereby, victims can be found even in places where rescuers would be at risk. Therefore, drones capable of autonomous flight are required in various fields.

Several studies on drones have been carried out. P. J. Bristeau, et al. [3] derived the equations for the attitudes and velocities of the AR.Drone during flight and proposed methods to estimate the attitude of the AR.Drone during hovering. L. V. Santana, et al. [4] derived the equations for the behavior of the AR.Drone during flight and determined optimal gains for velocity feedback control from the derived equations. Furthermore, K. Shetti, et al. [5] proposed methods to reduce the amount of data communication between drones and computers for flight control by applying compressive sensing techniques.

In addition, various methods for autonomous flight of drones have been reported. For instance, several studies performed autonomous flight of drones by detecting markers with specific feature quantities using a camera. P. Smyczyński, et al. [6] developed a drone that can autonomously follow markers, and land at the position of markers, detected by cameras installed on the front and the bottom of the drone using the Canny edge-detection algorithm. M. A. Garratt, et al. [7] performed velocity feedback control with markers detected by a camera installed on the bottom of an AR.Drone as reference points, and achieved stable hovering of the drone. Moreover, M. A. Ma'sum, et al. [8] developed a drone that can autonomously follow objects detected in images taken by a camera installed on the front of the drone using machine learning. Y. P. Huang, et al. [9] proposed methods to detect the positions of objects using HOG (Histograms of Oriented Gradients) and a linear SVM (support vector machine) algorithm.

Furthermore, studies on autonomous flight of drones in disaster environments have also been reported. A. J. A. Rivera, et al. [10] developed a drone capable of autonomously following humans detected by optical and thermal sensors installed on the front of the drone. D. Yavuz, et al. [11] performed map creation using a panoramic image created, with image-stitching technology, from images taken by a camera installed on the front of a drone. Humans were then detected in those images using computer-vision technology. Furthermore, J. P. Carvalho [12] developed a drone equipped with an Odroid microcomputer for flight control on its top, and performed autonomous flight in which the drone flew along a route determined by the PTAM (Parallel Tracking and Mapping) algorithm.

In this study, an experimental drone capable of autonomous flight was developed by equipping a microcomputer for flight control and a GPS sensor on the AR.Drone 2.0. Experiments on autonomous flight control were then performed, and the autonomous flight control of the drone was evaluated by comparing planned flight routes with the routes where the drone actually flew.

II. COMPOSITION OF EXPERIMENTAL DRONE

A. Composition of Hardware

Figures 1 and 2 show the composition of the experimental drone and the composition of the control device. The AR.Drone 2.0 (Power Edition) was used as the base drone. A three-axis gyroscope, a three-axis accelerometer, a three-axis magnetic sensor, a barometric pressure sensor and an ultrasonic sensor are equipped on the drone.

Manuscript received January 11, 2018; revised February 8, 2018.
M. Kan, S. Okamoto, and J. H. Lee are with the Mechanical Engineering Course, Graduate School of Science and Engineering, Ehime University, Matsuyama, Japan.
E-mail: Masataka Kan < [email protected] >, Shingo Okamoto < [email protected] >

ISBN: 978-988-14048-8-6 IMECS 2018


ISSN: 2078-0958 (Print); ISSN: 2078-0966 (Online)

Fig.1. Composition of experimental drone

Fig.2. Composition of control device

The experimental drone was developed by equipping a Raspberry Pi 2.0 (Model B) microcomputer and a GPS sensor. The maximum total input current of the power supply of the Raspberry Pi 2.0 is 460 [mA], which is within the permissible current of 500 [mA] of the drone.

The Ultimate GPS Breakout (Adafruit) was used as the GPS sensor. GPS technology is more effective than image-processing technology for position detection in unknown environments, so GPS was used to detect the self-position of the experimental drone in this study. The sensor is connected to the Raspberry Pi 2.0 via a serial port and receives radio signals sent from satellites. Self-position coordinates are estimated from distances calculated from the difference between the time information in the radio signals sent from the satellites and the system clock of the Raspberry Pi 2.0. The GPS updates the self-position from the received radio signals at a rate of 10 [Hz], and the self-position is calculated from the radio signals of six to eight satellites received at the same time. In addition, the MTK 3339 chipset (MediaTek Inc.) was used as the GPS antenna. This antenna can also use QZSS (Quasi-Zenith Satellite System), developed by the Japan Aerospace Exploration Agency.

A Real Time Clock (Adafruit DS1307) was used as the real-time clock module. A crystal oscillator and a battery are equipped in the module, so it retains accurate time even if the power supply of the microcomputer is cut off. This makes it possible to match the system clock of the microcomputer to the current time when estimating self-position coordinates with the GPS. Equipping the module made it possible to retain accurate time without synchronizing with NTP (Network Time Protocol) servers, which synchronize a device's clock to the correct time over the Internet.

Wi-Fi was used for communication between the microcomputer of the AR.Drone 2.0 and the Raspberry Pi 2.0. A vertical camera is installed on the bottom of the drone, and an HD (High Definition) camera is installed on its front. Images captured by these cameras can be transmitted to the microcomputer of the AR.Drone 2.0 over the Wi-Fi link.

The maximum payload of the AR.Drone 2.0 is around 100 [g]. The total mass of the Raspberry Pi 2.0, the GPS sensor, the real-time clock module and the USB cable on the experimental drone is 94 [g]. The experimental drone can therefore fly stably.

B. Configuration of Software

Fig.3. Configuration of control system

Figure 3 shows the configuration of the control system. Ubuntu 14.04 LTS (Long Term Support) was used as the OS (Operating System) in this research because it is suitable for using ROS (Robot Operating System). ROS was used because it makes it easy to exchange sensor data among the Raspberry Pi 2.0, the microcomputer of the AR.Drone 2.0, and a personal computer.

III. CALIBRATION OF GPS

A. Measurement of Positions Using GPS

Positions measured by the GPS had certain deviations from the position where the GPS sensor was set. Therefore, it was necessary to develop programs that account for the deviation between the position where the GPS sensor was set and the positions measured by the GPS. Measurement experiments to find the deviations of the values measured by the GPS were conducted on the campus of Ehime University, and the GPS was then calibrated. Figure 4 shows the position where the GPS sensor was located for its calibration. The latitude and longitude of the position where the GPS sensor was set were 33.849586 [deg] and 132.770918 [deg], respectively. The positions were measured 10 times at the position shown in Fig. 4, where one measurement means measuring for 10 minutes every hour. The GPS was then calibrated by measuring the shifts in the x and y directions between the position where the GPS sensor was set and the average of the positions measured by the GPS.
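The Adafruit receiver reports fixes as NMEA 0183 sentences over the serial port. As a rough illustration of how such a fix is turned into the decimal-degree coordinates used above, the sketch below parses a GGA sentence (the sentence shown is fabricated for illustration, not a logged fix from this experiment; function names are our own, and the checksum is not validated here):

```python
def nmea_to_decimal(value: str, hemisphere: str) -> float:
    """Convert an NMEA ddmm.mmmm / dddmm.mmmm field to signed decimal degrees."""
    head, _, frac = value.partition(".")
    degrees = int(head[:-2])                    # everything before the minutes digits
    minutes = float(head[-2:] + "." + frac)     # last two integer digits + fraction
    decimal = degrees + minutes / 60.0
    return -decimal if hemisphere in ("S", "W") else decimal

def parse_gga(sentence: str):
    """Extract (latitude, longitude) in decimal degrees from a $GPGGA sentence."""
    fields = sentence.split(",")
    # Fields 2-5 of a GGA sentence: latitude, N/S, longitude, E/W.
    return (nmea_to_decimal(fields[2], fields[3]),
            nmea_to_decimal(fields[4], fields[5]))

# Illustrative sentence only; roughly the coordinates reported in Section III-A.
lat, lon = parse_gga(
    "$GPGGA,123519,3350.9752,N,13246.2551,E,1,08,0.9,20.0,M,34.0,M,,*47")
```

In practice the raw sentences would be read line by line from the serial device before being parsed this way.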


Fig.4. Position where GPS was located for its calibration

B. Experimental Result

Fig.5. Position where GPS was set and average of positions measured by GPS. Legend: position where GPS was set; average of positions measured by GPS; position measured by GPS; error range of position measured by GPS; R: distance between the average of the positions measured by GPS and the farthest measured position from that average.

Figure 5 shows the position where the GPS sensor was set and the average of the positions measured by the GPS. The positions measured by the GPS were off by 2.86 [m] in the x direction and 3.42 [m] in the y direction, as shown in Figure 5. The distance between the average of the positions measured by the GPS and the farthest measured position from that average was 2.7 [m]. Programs accounting for the shifts between the position where the GPS sensor was set and the positions measured by the GPS were developed based on the calibration values obtained from the measurement experiments.

IV. AUTONOMOUS FLIGHT EXPERIMENT

A. Flight Control Method of Drone

The velocity command given to the drone was calculated by multiplying the deviation between the goal coordinates and the self-position coordinates measured with the GPS by a proportional gain. The flight velocity u(t) of the drone is given by the following expression.

u(t) = Kp e(t)    (1)

Here, Kp is the proportional gain, and e(t) is the distance deviation between the goal coordinates and the self-position coordinates. The distance deviation e(t) was calculated using the Hubeny formula, given as follows.

e(t) = √{ [M (φ_GP − φ_SP)]² + [N cos(φ̄) (λ_GP − λ_SP)]² }    (2)

Here, φ_GP and λ_GP are the latitude and longitude of the goal coordinates, respectively, and φ_SP and λ_SP are the latitude and longitude of the self-position, respectively. φ̄ is the average of the latitudes of the goal coordinates and the self-position. M is the radius of curvature of the meridian, and N is the radius of curvature of the prime vertical circle. They are given by the following expressions.

M = a (1 − e²) / (1 − e² sin²(φ̄))^(3/2)    (3)

N = a / √(1 − e² sin²(φ̄))    (4)

Here, e and a denote the first eccentricity and the equatorial radius, respectively. Their values depend on the geodetic reference frame; since WGS84 is used as the geodetic reference frame here, e is 0.081819 (dimensionless) and a is 6378137 [m]. In addition, the direction of flight of the drone is determined by the positive or negative signs of the deviations of latitude and longitude between the drone's position during flight and the goal position.

B. Straight Flight

The experiments were performed on the campus of Ehime University. Figure 6 shows the planned flight route along a straight line. The experimental drone took off from the start position and was then flown autonomously to the goal position. The latitude and longitude of the goal position are 33.849600 [deg] and 132.770905 [deg], respectively. Figure 7 shows the flight route when the drone autonomously flew along the straight line. The experimental drone was able to autonomously fly to a position around 2 [m] from the goal position, as shown in Figure 7.

C. L-shaped Flight

Experiments of autonomous flight control on a planned flight route of an L-shaped line were performed.
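The proportional control law of Eqs. (1)-(4) in Section IV-A can be sketched as follows. This is a minimal illustration under the stated WGS84 constants; the gain value and function names are our assumptions, not values from the paper, and the sign-based direction selection is omitted:

```python
import math

# WGS84 constants as given in Section IV-A
A_EQ = 6378137.0   # equatorial radius a [m]
ECC = 0.081819     # first eccentricity e (dimensionless)

def hubeny_distance(goal, pos):
    """Distance e(t) of Eq. (2) between goal and self-position, each (lat, lon) in degrees."""
    phi_g, lam_g = map(math.radians, goal)
    phi_s, lam_s = map(math.radians, pos)
    phi_m = (phi_g + phi_s) / 2.0                  # mean latitude (phi-bar)
    w = 1.0 - ECC**2 * math.sin(phi_m)**2
    m = A_EQ * (1.0 - ECC**2) / w**1.5             # meridian curvature radius M, Eq. (3)
    n = A_EQ / math.sqrt(w)                        # prime-vertical curvature radius N, Eq. (4)
    dy = m * (phi_g - phi_s)                       # north-south offset [m]
    dx = n * math.cos(phi_m) * (lam_g - lam_s)     # east-west offset [m]
    return math.hypot(dx, dy)

def velocity_command(goal, pos, kp=0.5):
    """Proportional control law u(t) = Kp * e(t) of Eq. (1); kp is an assumed gain."""
    return kp * hubeny_distance(goal, pos)
```

With the calibrated set position and the straight-flight goal of Sections III and IV-B, `hubeny_distance` yields a separation on the order of 2 [m], consistent with the scale of the reported results.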


Fig.6. Planned flight route of straight line

Fig.7. Flight route when drone autonomously flew above straight line. Legend: start position; goal position; planned flight route; route where drone actually flew; position where drone hovered.

Figure 8 shows the planned flight route along an L-shaped line. The experimental drone took off from the start position and was autonomously flown to the via position, where it hovered. Finally, the drone was autonomously flown to the goal position. The latitude and longitude of the via position are 33.849600 [deg] and 132.770275 [deg], respectively. The latitude and longitude of the goal position are 33.849600 [deg] and 132.770357 [deg], respectively. Figure 9 shows the flight route when the drone autonomously flew along the L-shaped line. The experimental drone was able to autonomously fly from the start position to the via position and then to the goal position, as shown in Figure 9. The experimental drone hovered at a position 1.38 [m] away from the via position, and it landed 4.47 [m] away from the goal position.

Fig.8. Planned flight route of L-shaped line

Fig.9. Flight route when drone autonomously flew above L-shaped line. Legend: start position; via position; goal position; planned flight route; route where drone actually flew; position where drone hovered.

V. CONCLUSION

The experimental drone was developed by equipping the microcomputer for flight control and the GPS sensor on the drone in order to verify the practical effectiveness of methods for outdoor autonomous flight using GPS. Measurement experiments were performed to evaluate the GPS sensor; as a result, it was found that the sensor had certain deviations. The experiment on autonomous flight control of the drone was performed taking the results of the GPS calibration into account. The usefulness of the proposed autonomous flight control method was evaluated by comparing the planned flight routes of straight and L-shaped lines with the routes where the drone actually flew.


The experimental drone was able to autonomously fly to the goal position by using the proposed method.

REFERENCES

[1] Komatsu Ltd. (2015. 1. 20). KOMATSU: Komatsu Embarks on SMARTCONSTRUCTION: ICT solutions to construction job sites. Available: http://www.komatsu.com/CompanyInfo/press/2015012012283202481.html (accessed 2017. 12. 27)
[2] Amazon.com Inc. (2016. 12. 7). Amazon Prime Air. Available: https://www.amazon.com/b?node=8037720011 (accessed 2017. 12. 27)
[3] P. J. Bristeau, F. Callou, D. Vissière, and N. Petit, "The Navigation and Control technology inside the AR.Drone micro UAV," IFAC Proceedings Volumes, vol. 44, issue 1, Sept. 2011, pp. 1477-1484.
[4] L. V. Santana, A. S. Brandão, and M. S. Filho, "Navigation and Cooperative Control Using the AR.Drone Quadrotor," Journal of Intelligent & Robotic Systems, vol. 84, Dec. 2016, pp. 327-350.
[5] K. Shetti and A. Vijayakumar, "Evaluation of compressive sensing encoding on AR drone," Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, Dec. 2015, pp. 204-207.
[6] P. Smyczyński, Ł. Starzec, and G. Granosik, "Autonomous drone control system for object tracking: Flexible system design with implementation example," 22nd International Conference on Methods and Models in Automation and Robotics, Aug. 2017, pp. 734-738.
[7] M. A. Garratt, L. K. Scott, A. J. Lambert, and P. Li, "Flight Test Results of a 2D Snapshot Hover," IFAC Proceedings Volumes, vol. 46, issue 10, June 2013, pp. 17-22.
[8] M. A. Ma'sum, M. K. Arrofi, G. Jati, F. Arifin, M. N. Kurniawan, P. Mursanto, and W. Jatmiko, "Simulation of intelligent Unmanned Aerial Vehicle (UAV) for military surveillance," International Conference on Advanced Computer Science and Information Systems, Sept. 2013, pp. 161-166.
[9] Y. P. Huang, L. Sithole, and T. T. Lee, "Structure From Motion Technique for Scene Detection Using Autonomous Drone Navigation," IEEE Transactions on Systems, Man, and Cybernetics: Systems, vol. 99, pp. 1-12.
[10] A. J. A. Rivera, A. D. C. Villalobos, J. C. N. Monje, J. A. G. Mariñas, and C. M. Oppus, "Post-disaster rescue facility: Human detection and geolocation using aerial drones," IEEE Region 10 Conference, Nov. 2016, pp. 384-386.
[11] D. Yavuz, H. Akbıyık, and E. Bostancı, "Intelligent drone navigation for search and rescue operations," 24th Signal Processing and Communication Application Conference, May 2016, pp. 565-568.
[12] J. P. Carvalho, M. A. Jucá, A. Menezes, L. R. Olivi, A. L. M. Marcato, and A. B. dos Santos, "Autonomous UAV Outdoor Flight Controlled by an Embedded System Using Odroid and ROS," Lecture Notes in Electrical Engineering, vol. 402, Sep. 2016, pp. 423-437.
