Development of Drone Capable of Autonomous Flight Using GPS
Abstract— An experimental drone capable of autonomous flight was developed by equipping the AR.Drone 2.0 (Power Edition) with a Raspberry Pi 2.0 (Model B) microcomputer for flight control and a GPS sensor. The GPS was calibrated by measuring the deviation between the position where the GPS sensor was set and the positions measured by the GPS. An experiment on autonomous flight control was performed, and the autonomous flight control of the drone was evaluated by comparing planned flight routes of straight and L-shaped lines with the routes where the drone actually flew. Consequently, the experimental drone could fly autonomously along the planned flight routes.

Index Terms— Drone, Autonomous Flight, Global Positioning System, Calibration

Manuscript received January 11, 2018; revised February 8, 2018.
M. Kan, S. Okamoto, and J. H. Lee are with the Mechanical Engineering Course, Graduate School of Science and Engineering, Ehime University, Matsuyama, Japan.
E-mail: Masataka Kan < [email protected] >, Shingo Okamoto < [email protected] >

I. INTRODUCTION

In recent years, drones have been used in a wide range of fields because of their high mobility. KOMATSU Ltd. [1] has carried out a three-dimensional measurement service for building sites using drones. In this service, photos taken from the sky by drones are converted to point-cloud data, and a three-dimensional map is created from those data. Such services reduce the cost of measurements. Amazon.com Inc. [2] is planning to deliver commodities that customers order within 30 minutes by drones. Thus, drones capable of autonomous flight have been receiving a lot of attention.

In addition, the application of drones using GPS is expected in rescue work in disaster areas. In disaster sites, it can be impossible for rescuers to pass due to the fragments of destroyed buildings. Therefore, swift searches for sufferers are required regardless of the situation in the affected areas. Drones capable of autonomous flight make it possible to check the situation of affected areas safely from the sky. Thereby, sufferers can be found even in places where rescuers would be at risk. Therefore, drones capable of autonomous flight are required in various fields.

Several researches on drones have been carried out. P. J. Bristeau, et al. [3] derived the equations on the postures and velocities of the AR.Drone during flight and proposed methods to estimate the postures of the AR.Drone during hovering. L. V. Santana, et al. [4] derived the equations on the behavior of the AR.Drone during flight and determined optimal gains for velocity feedback control from the derived equations. Furthermore, K. Shetti, et al. [5] proposed methods to reduce the amount of data communication between drones and computers for flight control by applying compressive sensing techniques.

In addition, various methods for autonomous flight of drones have been reported. For instance, several researches on autonomous flight of drones performed by detecting markers with specific feature quantities using a camera have been reported. P. Smyczyński, et al. [6] developed a drone that can autonomously follow markers and land at the position of markers detected by cameras installed on the front and the bottom of the drone using the Canny edge detection algorithm. M. A. Garratt, et al. [7] performed velocity feedback control with markers detected by a camera installed on the bottom of an AR.Drone as reference points, and stable hovering by the drone was achieved. Moreover, M. A. Ma'sum, et al. [8] developed a drone that can autonomously follow objects detected from images taken by a camera installed on the front of the drone using machine learning. Y. P. Huang, et al. [9] proposed methods to detect the positions of objects using HOG (Histograms of Oriented Gradients) and a linear SVM (Support Vector Machine) algorithm.

Furthermore, researches on autonomous flight of drones in disaster environments have also been reported. A. J. A. Rivera, et al. [10] developed a drone capable of autonomously following humans detected by optical and thermal sensors installed on the front of the drone. D. Yavuz, et al. [11] performed map creation using a panoramic image that was created from images taken by a camera installed on the front of a drone using image stitching technology. Then, humans were detected from these images using computer vision technology. Furthermore, J. P. Carvalho, et al. [12] developed a drone equipped with an Odroid microcomputer for flight control on its top, and autonomous flight was performed so that the drone could fly along the route determined by the PTAM (Parallel Tracking and Mapping) algorithm.

In this study, an experimental drone capable of autonomous flight was developed by equipping the AR.Drone 2.0 with a microcomputer for flight control and a GPS sensor. Then, experiments on autonomous flight control were performed, and the autonomous flight control of the drone was evaluated by comparing planned flight routes with the routes where the drone actually flew.

II. COMPOSITION OF EXPERIMENTAL DRONE

A. Composition of Hardware
Figures 1 and 2 show the composition of the experimental drone and the composition of the control device. The AR.Drone 2.0 (Power Edition) was used as the basic drone. A three-axis gyroscope, a three-axis accelerometer, a three-axis magnetic sensor, a barometric pressure sensor, and an ultrasonic sensor are equipped on the drone.
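The wiring of the GPS sensor and the Raspberry Pi 2.0 is given in Figs. 1 and 2. As an illustration only, the sketch below shows one way the Raspberry Pi could read position fixes from such a GPS sensor; the serial port name, baud rate, and the pyserial/pynmea2 libraries are assumptions and are not taken from the paper.

```python
# Sketch: reading latitude/longitude from a GPS sensor attached to the
# Raspberry Pi over a serial link. Port, baud rate, and libraries are
# assumptions for illustration; the paper does not specify the interface.
import serial
import pynmea2

def read_position(port='/dev/ttyUSB0', baud=9600):
    """Return the first (latitude, longitude) fix found in the NMEA stream."""
    with serial.Serial(port, baud, timeout=1.0) as gps:
        while True:
            line = gps.readline().decode('ascii', errors='ignore').strip()
            # GGA and RMC sentences both carry a position fix
            if line.startswith('$GPGGA') or line.startswith('$GPRMC'):
                msg = pynmea2.parse(line)
                if msg.latitude and msg.longitude:
                    return msg.latitude, msg.longitude

if __name__ == '__main__':
    lat, lon = read_position()
    print('lat=%.7f deg, lon=%.7f deg' % (lat, lon))
```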
B. Configuration of Software
Figure 3 shows the configuration of the control system. Ubuntu 14.04 LTS (Long Term Support) was used as the OS (Operating System) in this research because it is well suited to running the ROS (Robot Operating System). The ROS was used because it makes it possible to easily exchange sensor data among the Raspberry Pi 2.0, the microcomputer of the AR.Drone 2.0, and the personal computer.
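As a concrete illustration of this data exchange, the following is a minimal ROS node sketch that subscribes to the drone's navigation data and publishes velocity commands. It assumes the ardrone_autonomy driver and its standard topics (/ardrone/navdata, /ardrone/takeoff, /ardrone/land, /cmd_vel); the paper does not state which driver was used.

```python
#!/usr/bin/env python
# Sketch of a ROS node exchanging data with the AR.Drone 2.0 through the
# ardrone_autonomy driver (an assumption; not named in the paper).
import rospy
from geometry_msgs.msg import Twist
from std_msgs.msg import Empty
from ardrone_autonomy.msg import Navdata

class DroneLink(object):
    def __init__(self):
        # Velocity commands consumed by the on-board flight controller
        self.cmd_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
        self.takeoff_pub = rospy.Publisher('/ardrone/takeoff', Empty, queue_size=1)
        self.land_pub = rospy.Publisher('/ardrone/land', Empty, queue_size=1)
        # Navigation data (altitude, battery, attitude) reported by the drone
        rospy.Subscriber('/ardrone/navdata', Navdata, self.on_navdata)

    def on_navdata(self, msg):
        # Log the on-board sensor data at most once per second
        rospy.loginfo_throttle(1.0, 'alt=%d mm batt=%d%%', msg.altd, msg.batteryPercent)

    def send_velocity(self, vx, vy, yaw_rate):
        cmd = Twist()
        cmd.linear.x = vx        # forward velocity (normalized)
        cmd.linear.y = vy        # leftward velocity (normalized)
        cmd.angular.z = yaw_rate
        self.cmd_pub.publish(cmd)

if __name__ == '__main__':
    rospy.init_node('drone_link')
    link = DroneLink()
    rospy.sleep(1.0)
    link.takeoff_pub.publish(Empty())
    rospy.spin()
```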
Fig. 4. Position where the GPS was located for its calibration.

e(t) = \sqrt{\{M(\lambda_{GP}-\lambda_{SP})\}^{2} + \{N\bar{\varphi}(\varphi_{GP}-\varphi_{SP})\}^{2}}    (2)

M = \frac{a(1-e^{2})}{(1-e^{2}\sin^{2}\bar{\varphi})^{3/2}}    (3)

N = \frac{a}{\sqrt{1-e^{2}\sin^{2}\bar{\varphi}}}    (4)
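As a worked illustration of Eqs. (2)-(4), the sketch below evaluates the horizontal deviation between the set position and a GPS reading. The WGS-84 constants and the conventional geodetic pairing (the meridian radius M scaling the latitude difference, N cos(mean latitude) scaling the longitude difference) are assumptions made for this sketch, not values quoted from the paper.

```python
# Sketch: deviation between the set position and the GPS reading, using the
# radii of curvature M and N of Eqs. (3)-(4) on the WGS-84 ellipsoid.
import math

A = 6378137.0                    # WGS-84 semi-major axis a [m]
F = 1.0 / 298.257223563          # WGS-84 flattening
E2 = F * (2.0 - F)               # squared eccentricity e^2

def deviation(lat_gp, lon_gp, lat_sp, lon_sp):
    """Horizontal deviation [m]; all angles given in degrees."""
    phi_gp, lam_gp = math.radians(lat_gp), math.radians(lon_gp)
    phi_sp, lam_sp = math.radians(lat_sp), math.radians(lon_sp)
    phi_bar = 0.5 * (phi_gp + phi_sp)            # mean latitude
    w = math.sqrt(1.0 - E2 * math.sin(phi_bar) ** 2)
    M = A * (1.0 - E2) / w ** 3                  # meridian radius, Eq. (3)
    N = A / w                                    # prime-vertical radius, Eq. (4)
    d_north = M * (phi_gp - phi_sp)              # north-south offset
    d_east = N * math.cos(phi_bar) * (lam_gp - lam_sp)  # east-west offset
    return math.hypot(d_north, d_east)

# Hypothetical example: GPS reading offset from the set position by a few metres
print(deviation(35.0000300, 135.0000200, 35.0000000, 135.0000000))
```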
the drone actually flew. The experimental drone was able to autonomously fly to the goal position by using the proposed method.

REFERENCES

[1] Komatsu Ltd. (2015. 1. 20). KOMATSU: Komatsu Embarks on SMARTCONSTRUCTION: ICT solutions to construction job sites. Available: http://www.komatsu.com/CompanyInfo/press/2015012012283202481.html (accessed 2017. 12. 27)
[2] Amazon.com Inc. (2016. 12. 7). Amazon Prime Air. Available: https://www.amazon.com/b?node=8037720011 (accessed 2017. 12. 27)
[3] P. J. Bristeau, F. Callou, D. Vissière, and N. Petit, "The Navigation and Control technology inside the AR.Drone micro UAV," IFAC Proceedings Volumes, vol. 44, Issue 1, Sept. 2011, pp. 1477–1484.
[4] L. V. Santana, A. S. Brandão, and M. S. Filho, "Navigation and Cooperative Control Using the AR.Drone Quadrotor," Journal of Intelligent & Robotic Systems, vol. 84, Dec. 2016, pp. 327–350.
[5] K. Shetti and A. Vijayakumar, "Evaluation of compressive sensing encoding on AR drone," Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, Dec. 2015, pp. 204–207.
[6] P. Smyczyński, Ł. Starzec, and G. Granosik, "Autonomous drone control system for object tracking: Flexible system design with implementation example," 22nd International Conference on Methods and Models in Automation and Robotics, Aug. 2017, pp. 734–738.
[7] M. A. Garratt, L. K. Scott, A. J. Lambert, and P. Li, "Flight Test Results of a 2D Snapshot Hover," IFAC Proceedings Volumes, vol. 46, Issue 10, June 2013, pp. 17–22.
[8] M. A. Ma'sum, M. K. Arrofi, G. Jati, F. Arifin, M. N. Kurniawan, P. Mursanto, and W. Jatmiko, "Simulation of intelligent Unmanned Aerial Vehicle (UAV) For military surveillance," International Conference on Advanced Computer Science and Information Systems, Sept. 2013, pp. 161–166.
[9] Y. P. Huang, L. Sithole, and T. T. Lee, "Structure From Motion Technique for Scene Detection Using Autonomous Drone Navigation," IEEE Transactions on Systems, Man, and Cybernetics: Systems, vol. 99, pp. 1–12.
[10] A. J. A. Rivera, A. D. C. Villalobos, J. C. N. Monje, J. A. G. Mariñas, and C. M. Oppus, "Post-disaster rescue facility: Human detection and geolocation using aerial drones," IEEE Region 10 Conference, Nov. 2016, pp. 384–386.
[11] D. Yavuz, H. Akbıyık, and E. Bostancı, "Intelligent drone navigation for search and rescue operations," 24th Signal Processing and Communication Application Conference, May 2016, pp. 565–568.
[12] J. P. Carvalho, M. A. Jucá, A. Menezes, L. R. Olivi, A. L. M. Marcato, and A. B. dos Santos, "Autonomous UAV Outdoor Flight Controlled by an Embedded System Using Odroid and ROS," Lecture Notes in Electrical Engineering, vol. 402, Sep. 2016, pp. 423–437.