
Proceedings of The Canadian Society for Mechanical Engineering International Congress 2014
CSME International Congress 2014
June 1-4, 2014, Toronto, Ontario, Canada

Design and Development of the Hardware for a Vision-based UAV Autopilot

Nikolai Kummer, Craig Jee, Jamie Garbowski, Ephraim Nowak, Homayoun Najjaran
School of Engineering, University of British Columbia
Kelowna, BC, Canada
[email protected]

Jenner Richards, Afzal Suleman
Department of Mechanical Engineering, University of Victoria
Victoria, BC, Canada

Abstract— Vision-based control of unmanned aerial vehicles has the potential to increase their autonomy and safety during flight and landing, thus increasing their prevalence in various industries. This paper presents the design and development of the hardware for a vision-based autopilot. The aim was to develop a modular, low-cost system that consists of non-proprietary components. The main components are a custom switching module that interfaces between the autopilot and the RC receiver, a single-board computer for image analysis and control, and an autopilot for low-level servomotor control. Hardware-in-the-loop tests show that the switching module is capable of switching between the manual and vision-based control inputs. It can also return control of the UAV to the operator when needed.

Keywords—unmanned aerial vehicle; autopilot; UAV; fixed-wing; vision-based control; visual servoing

The authors would like to acknowledge the financial support of the Natural Sciences and Engineering Research Council of Canada (NSERC) for this project under the Engage program.

Figure 1: UAV with vision-based autopilot, currently under development at UBC.

I. INTRODUCTION

The civilian use of unmanned aerial vehicles (UAVs) has experienced rapid growth over the past years. Current applications include surveying, forestry, law enforcement, search and rescue, and wildlife monitoring. Numerous other applications remain to be discovered as UAV performance and reliability improve, and the UAV industry is projected to continue to grow.

The two common architectures are the fixed-wing aircraft and the helicopter-type UAV. The helicopter-type UAV (single and multirotor) is capable of hovering stationary in the air and has a good payload capacity, at the cost of limited flight time. The fixed-wing UAV offers higher endurance, but higher payload capacities require faster flight speeds to generate greater lift. The fixed-wing UAV's ability to cover large areas quickly makes it better suited for surveying applications than the helicopter-type UAV. In this paper, the focus is on the fixed-wing aircraft (Figure 1).

There are hurdles that need to be overcome before these unmanned systems become more prevalent in industry. The cost of hardware has been rapidly declining. Still, UAV operation is quite costly due to the loss of hardware and the downtime associated with crashed landings. Due to the remote nature of UAV operation, where signal and communication loss is a possibility, the UAV is required to react autonomously to unforeseen circumstances. Transport Canada regulates the operation of UAVs and requires a special flight operation certificate (SFOC) before flight permission is granted. These regulatory restrictions ensure that the UAV is safe, but they also restrict the maximum UAV weight to 35 kg and operation to within line-of-sight [1]. Line-of-sight operation increases UAV cost, as it requires personnel and equipment to travel to the destination to perform the task. There are also practical constraints due to the large open space required to operate a (non-micro) fixed-wing UAV. An increased level of UAV autonomy would alleviate some of these problems, as increased safety would reduce UAV operation cost and allow the regulatory bodies to loosen restrictions.

It is because of these restrictions and challenges that a great deal of UAV-related research is currently implemented in simulation only. Further research into the field of UAV operation is necessary to fully utilize the current technology, increase safety and derive economic benefit.

Most of the previously mentioned UAV applications include a vision or extraspectral camera sensor. However, the camera is only used for remote sensing and does not actively control the UAV. Active vision control has proven itself in other fields of robotics. In image-based control schemes it has been shown
Copyright © 2014 by CSME
that positioning will converge to the desired position, regardless of onboard sensor accuracy and in the presence of camera calibration errors [2]. The addition of vision-based control has great potential to increase the autonomy of the UAV, due to a decreased reliance on onboard positioning sensors. Vision-based control can be categorized into feature tracking and optical flow. Optical-flow sensors have been used for obstacle avoidance in flight [3] and landing [4]. In this paper the focus is on feature-tracking-based methods, due to the large pool of potential features to choose from, such as point features, line features and image moments.

Visual servoing is the practice of controlling a robotic system using vision sensor information as feedback. It has been extensively researched for robot arms [2], mobile robots and quadrotor helicopters [5][6], with improvements to positioning accuracy. Visual servoing of fixed-wing aircraft has received limited attention. Extensive research deals with information that can be extracted from images to augment onboard sensors, such as horizon estimation in [7] and height-above-ground estimation in [4]. Reference [8] uses visual servoing to control heading for skid-to-turn manoeuvres for inspection tasks, which were tested in simulations. Vision-based control methods for UAVs are often implemented in simulation or in offline analysis of images captured during flight. Major challenges associated with vision-based fixed-wing control include nonlinear equations of motion, coupled attitude and velocity dynamics, a lack of readily available hardware and the practical requirement of a suitable testing area.

A prominent area of research for vision-based fixed-wing control is automated landing. Landing accounts for a significant percentage of damage to UAVs [9]. Manual landing, performed by trained human pilots, is susceptible to human error [9][10]. Partially responsible is the third-person view that the pilot has of the UAV during landing. Landing automation has great potential to reduce UAV operational cost and increase safety to ground personnel. The motivation behind the work presented in this paper is fuelled by landing automation.

Controlled collision landings are popular for small fixed-wing UAVs. The authors in [10] present a vision-based landing method with a large coloured net. The method was tested in experiments, and the image processing was done on a ground station computer, which makes the system highly vulnerable to signal loss. Controlled collision landing with a large inflatable airbag addresses some problems with the net-based landing method, as landing can occur from any heading direction. Airbag landing has been implemented in simulation in [11] and in experiments in [9] for a blended-wing-body UAV with onboard image processing.

The presented literature shows research aimed at increasing the autonomy and safety of UAVs. However, the algorithms are either implemented in simulation only or are implemented on a custom-created vision autopilot system, interfacing on custom-designed boards. Quadrotor helicopters have the PIXHAWK vision autopilot [12], but to the best of the authors' knowledge no such system exists for fixed-wing aircraft. The availability of a standard vision-based autopilot can fuel further research into UAV control. In this paper, the design and development of a vision-based autopilot is presented, to assist the creation of a robust and reliable vision-based autopilot and to reduce the time from algorithm development to implementation.

A. Design Requirements

This section summarizes the guidelines for the development of the vision-based autopilot. The two main areas of development should be the autopilot hardware and a corresponding hardware-in-the-loop (HIL) system that allows the safe testing of algorithms prior to implementation on a real UAV. A summary of the design requirements for the autopilot follows. The system should be:

• capable of onboard processing of images, due to the potential for signal loss;
• inexpensive, to reduce the cost associated with potential crashes;
• robust enough to detect if the image processing computer has become unresponsive;
• able to return manual control to the operator at any time;
• modular, to facilitate improvements on each subsystem;
• assembled from off-the-shelf components, where possible, to reduce development cost and time;
• lightweight enough to be carried by a small UAV.

B. Contributions

This research focuses on fixed-wing UAVs, in view of their increased range and endurance. The progress on the development of a vision-based autopilot for testing vision-based algorithms is presented. An overview of the proposed system is shown in Figure 2.

Figure 2: Hardware Connection Diagram for the Vision Autopilot

The three major components of the vision autopilot are a vision computer, an autopilot and a switching module. Figure 3 presents the proposed circuit diagram for a switching module



that switches between manual and automated input and is capable of detecting when the vision computer becomes unresponsive. We have tested the switching performance in the laboratory environment, but flight tests are currently pending an SFOC application to Transport Canada.

Figure 3: Heartbeat-Sensing Circuit Diagram (SN74F257)

In addition to the autopilot hardware, an HIL setup is presented that allows the testing of vision algorithms by interfacing the autopilot to Matlab® and Simulink®. The hardware and the HIL simulator are presented in Sections II and III, respectively. Preliminary HIL flight results are presented in Section IV and the conclusions in Section V.

II. HARDWARE

This section outlines the hardware components of the vision-based autopilot. The major components are the Ardupilot Mega (APM2.5)¹ autopilot, the switching module, the Beaglebone Black² vision computer, and the airframe hardware. The proposed system will be implemented on fixed-wing UAVs, but can also be extended to multi-rotor helicopters.

The pilot controls the UAV via a 6-channel RC remote. Four of the channels control the roll, pitch, throttle, and yaw of the UAV. One channel is used to indicate the flight mode that the APM2.5 is currently in and allows the pilot to switch between "fly-by-wire", "manual" and "return-to-launch" modes. The last channel is the auto-switch, which changes UAV control from the operator to the vision computer. The operator can regain control of the UAV at any time via the auto-switch.

The APM2.5 controls the low-level functions, such as the servomotors and the propeller thrust. During normal operation, the APM2.5 receives commands from the RC receiver, which receives signals from the pilot via the RC remote.

In the proposed system, a custom-designed switching module (shown in Figure 4) interfaces between the RC receiver and the APM2.5 and allows the vision computer to take control if the auto-switch is engaged. The switching module also monitors the health of the vision computer and, if necessary, ignores inputs from the vision computer and switches back to manual control. The switching module has been implemented in hardware, so as not to interfere with regular autopilot operation. The onboard vision computer receives images from a camera, performs image analysis and implements a control thread. The control thread receives telemetry data from the APM2.5 via the UART interface. The control thread controls the UAV by sending PWM commands to the APM2.5 that mimic the signal from the RC receiver. The vision computer also transmits an alternating high-low heartbeat signal, which indicates that the computer is still responsive. The switching module will ignore inputs from the computer should the heartbeat signal stop alternating.

The proposed system is highly modular, allowing for the replacement of components to suit the user's needs or price requirements. This modularity is attributed to the fact that the vision computer mimics the RC receiver signal. Any autopilot that communicates with the RC receiver could be a potential replacement for the APM2.5. The rest of this section describes the APM2.5 autopilot, the switching module and the vision computer, which are all installed in the UAV airframe.

A. APM2.5 Autopilot

The APM2.5 is an autopilot that is distributed by 3DRobotics and has been under development since 2007. This autopilot was selected due to its low cost (approximately $200), its numerous features and its ability to be used on fixed-wing aircraft as well as multirotor helicopters. The autopilot features an onboard GPS module, a 3-axis gyroscope, a 3-axis accelerometer and a magnetometer. The APM2.5 can transmit telemetry wirelessly to a ground station computer using a 3DR Radio. APM Mission Planner, the Ardupilot ground station software, interfaces with the XPlane³ and Flightgear⁴ flight simulators, which eases the creation of the HIL system.

¹ Ardupilot website: http://ardupilot.com/
² http://beagleboard.org/
³ http://www.x-plane.com/desktop/home/
⁴ http://www.flightgear.org/
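The gating behaviour described in this section, where vision-computer commands are forwarded only while the auto-switch is engaged and the heartbeat keeps alternating, can be modelled in software. The sketch below is illustrative only (the real module is pure hardware), and the timeout value is an assumed placeholder rather than a figure from the paper:

```python
class HeartbeatMonitor:
    """Software model of the heartbeat-sensing behaviour.

    Mirrors the retriggerable monostable stage: every edge
    (high/low transition) of the heartbeat restarts a timeout;
    if no edge arrives within `timeout_s`, the vision computer
    is declared unresponsive. A pin stuck high or low therefore
    fails the check, just as the high-pass stage of the real
    circuit blocks a steady-state voltage.
    """

    def __init__(self, timeout_s=0.5):  # timeout is an assumed value
        self.timeout_s = timeout_s
        self.last_level = None
        self.last_edge_t = None

    def sample(self, level, t):
        """Feed one heartbeat sample (0 or 1) taken at time t (seconds)."""
        if self.last_level is None or level != self.last_level:
            self.last_edge_t = t  # edge seen: retrigger the timeout
        self.last_level = level
        return self.alive(t)

    def alive(self, t):
        """True while an edge was seen within the timeout window."""
        return (self.last_edge_t is not None
                and (t - self.last_edge_t) < self.timeout_s)


def select_input(rc_cmd, vision_cmd, auto_switch_on, heartbeat_alive):
    """Multiplexer logic: forward vision-computer commands only when
    the auto-switch is engaged AND the heartbeat is alive; otherwise
    the manual RC receiver signal is passed through unchanged."""
    return vision_cmd if (auto_switch_on and heartbeat_alive) else rc_cmd
```

With an alternating heartbeat the monitor stays alive and vision commands pass through; as soon as the signal sticks at one level, the timeout lapses and control falls back to the RC receiver, which is the failsafe behaviour the hardware module enforces.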

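On the vision-computer side, the heartbeat is simply a pin toggled high-low in a loop. A minimal sketch follows; the toggle rate and the GPIO path in the comment are assumptions for illustration, not values taken from the paper:

```python
import itertools

def heartbeat_levels(rate_hz=50):
    """Yield an infinite alternating 0/1 sequence together with the
    hold time per level. Toggling at 2*rate_hz edges per second keeps
    the edge spacing well inside the switching module's timeout."""
    hold_s = 1.0 / (2 * rate_hz)  # time to hold each logic level
    for level in itertools.cycle((0, 1)):
        yield level, hold_s

# Driving a real pin might look like this (hypothetical sysfs path):
# with open("/sys/class/gpio/gpio60/value", "w") as pin:
#     for level, hold in heartbeat_levels():
#         pin.write(str(level)); pin.flush(); time.sleep(hold)
```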


Figure 4: Beaglebone Black vision computer (left) and switching module (right).

The APM2.5 is in charge of the low-level servomotor and propeller thrust control of the UAV. The fly-by-wire autopilot mode was selected, which automatically calculates the control surface (aileron, elevator, and rudder) deflections required to achieve and hold an attitude angle (pitch, roll and yaw). The attitude angle is set by the stick deflection on the RC remote. The fly-by-wire mode can be used by the vision computer to calculate the attitude angles required to follow an image trajectory. In addition to the fly-by-wire mode, the APM2.5 features a "manual" mode, which allows direct control of the control surface deflections. This mode was not ruled out for vision-based control, but it requires extensive system identification (for a review of UAV system identification see [13]). The APM also features a "waypoint" mode, which guides the UAV to specific GPS coordinates, and a "return-to-launch" mode, which returns the UAV to the launch area.

The APM2.5 is powered by a battery eliminator circuit (BEC), which connects directly to the onboard flight batteries and also powers the servomotors. A 10 A Castle BEC was selected, which powers the autopilot and the servomotors. The APM2.5 supplies power to the RC receiver and the switching module.

B. Switching Module

The main purpose of the switching module is to switch between the RC input and the vision computer commands, while monitoring the status of the vision computer in case it becomes unresponsive. The switching module interfaces between the autopilot and the RC receiver and connects to the vision computer. The vision computer supplies an alternating high-low heartbeat signal to ensure that the switching module continues to forward vision computer commands to the autopilot. The heartbeat must be an alternating signal, as the vision computer output pins may fail in either a high or a low voltage state.

The switching module was implemented in hardware rather than software for two reasons: (i) to ensure safe operation regardless of the computer state and (ii) to use the autopilot without modification to the autopilot software, preserving normal operation and pre-existing safety protocols. The switching module is powered by the APM2.5 to ensure that the switching module is always powered whenever the autopilot is operational.

The heartbeat-sensing circuit, which connects to a multiplexer (SN54F257), is shown in Figure 3. The switching circuit requires two voltage levels for proper operation. A 555 timer on the heartbeat-sensing circuit runs on 3.3 V, and the multiplexer uses the 5 V supply to switch the input signals between the RC receiver and the vision computer.

The voltage level shifter, shown in Figure 2, was also placed on the switching module to keep the system compact. The APM autopilot is Arduino-based and transmits telemetry as a 5 V UART signal. The Beaglebone Black UART port uses 3.3 V. The voltage level shifter (SN74LVC245A) allows the vision computer to read the 5 V telemetry.

The heartbeat-sensing circuit consists of four stages: an active high-pass filter, an inverter, a 555 timer in monostable mode, and an active low-pass filter. The first stage uses the capacitor C1 to pass the high-frequency heartbeat signal while blocking any steady-state voltage in case the vision computer fails in a high or low state.

The output of transistor Q1 is an inverted pulsed signal, or a low logic state if the vision computer is unresponsive, since Q1 is normally on. The output of Q1 is passed to the second stage, the base of Q2. The function of transistor Q2 is to invert the signal from Q1. The output of Q2 is therefore the original pulsed signal, or a logic high state if the vision computer is unresponsive.

The third stage of the circuit is an NE555 timer IC in monostable mode. When the trigger pin receives a low signal, the output of the 555 timer goes to a high logic state for a fixed time. As long as the pulsed signal is present, the trigger pin is constantly reset. If the pulsed signal stops, the output of Q2 stays high and the output of the 555 eventually goes low, signaling that the vision computer has become unresponsive.

The final stage is a low-pass filter that removes the high-frequency noise from the 555 timer switching. The filtering is accomplished by resistor R6 and capacitor C4, with transistor Q3 inverting the steady-state logic-level signal.

C. Vision Computer

A Beaglebone Black (BBB) single-board computer was used as the onboard vision computer. The BBB retails for approximately $45 and features a 1 GHz processor with 512 MB DDR RAM. The BBB is lightweight (40 g) and compact (3.4 in × 2.1 in), which makes it well suited for onboard UAV applications. The BBB runs Angstrom Linux and uses the OpenCV C++ libraries for image processing, which allows for easy transfer of code to future platforms. A Logitech Quickcam Pro 9000 webcam was connected to the BBB.

The BBB also features 65 general-purpose input/output (GPIO) pins, which allow it to receive and send data to sensors and actuators. The Raspberry Pi (http://www.raspberrypi.org/) was previously considered



for the vision computer application, but was slower than the BBB and had fewer GPIO pins.

The BBB is powered by a portable 5 V, 2600 mAh universal backup battery through the micro-USB port. The BBB consumes approximately 210-460 mA, depending on the processor load and the connected peripheral devices, which gives the BBB well over 4 hours of operation on a single charge.

The vision computer runs three threads in parallel: the control, image analysis, and heartbeat threads. If the vision computer becomes unresponsive, the alternating heartbeat will stop and the switching module will ignore any vision computer commands.

The heartbeat thread checks whether the auto-switch has been engaged. The auto-switch signal is sent by the RC receiver. The RC receiver output signal is a 5 V square wave with a period of 20 ms and a pulse width between 1 ms and 2 ms. The output is converted from a digital to an analog signal via a first-order RC filter consisting of a 4.7 kΩ resistor and a 47 µF capacitor. The resulting analog signal is read by the vision computer to detect whether the auto-switch is engaged. Once engagement is detected, the vision computer starts alternating the heartbeat signal from high to low in a loop.

The image analysis thread collects images from the webcam and detects target features. The feature locations are forwarded to the control thread. The control thread reads the autopilot telemetry through the UART interface and receives the target information from the image analysis thread. The control thread implements a visual servoing controller and calculates the required UAV attitude angles. It then sends the mimicked RC receiver signals over the vision computer's PWM-capable pins to achieve the desired attitude.

D. Airframe

The EPP-FPV airframe by Hobbyking was selected. The airframe (shown in Figure 1) weighs approximately 3 kg when loaded with the batteries, autopilot, switching module and vision computer. The airframe was selected because it was designed for first-person-view (FPV) flight and therefore features additional interior space for onboard electronics. The airframe's pusher configuration (rear-facing propeller) makes it well suited for mounting a forward-facing camera.

A 35-36B 1400 kV brushless DC motor, powered by a Turnigy 45 A electronic speed controller (ESC), drives a 10-inch propeller. The control surfaces are moved by four 9-gram analog servomotors. The airframe contains two 2200 mAh lithium polymer batteries, which power the motor ESC and the servo/autopilot BEC. Based on the batteries, the approximate flight time is 20-30 minutes.

III. HARDWARE IN THE LOOP SYSTEM

Successful simulation results are required prior to the implementation of a control scheme on a real UAV. These tests can be performed via an HIL setup (shown in Figure 5). The setup was used to test the switching module and will be used for future research on vision-based UAV control schemes. The HIL setup consists of two computers, the simulator computer and the HIL vision computer, which are explained in the rest of this section.

A. Simulator Computer

The simulator computer runs the APM Mission Planner, which connects to the XPlane 9 flight simulator. The flight simulator provides response telemetry of the simulated UAV due to the APM2.5 input, as well as image data for the HIL vision computer. The simulated UAV can be tuned to react similarly to the real UAV by examining flight test data. XPlane sends telemetry data over the network, which is collected by the HIL vision computer and the APM Mission Planner.

B. HIL Vision Computer

The HIL vision computer runs Simulink and Matlab, which allow fast prototyping of aircraft control schemes. The prototyped control schemes can be tuned and finalized on the simulated UAV and then converted to C++ code and run on the Beaglebone Black.

The HIL vision computer has to emulate the functionality of the BBB, which analyzes the images, runs the control thread and sends the control signals over the GPIO pins. To emulate the GPIO pins of the BBB, the HIL vision computer uses an Arduino UNO microcontroller to send and receive digital and analog signals. The Arduino I/O library for Simulink allows the sending and receiving of commands in real time. The Simulink image processing library is used to analyze images received from the simulator computer. The XPlane telemetry data is collected in the form of UDP packets over the local area network.

Figure 5: Hardware-in-the-loop setup to test vision-based UAV control algorithms.
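The auto-switch detection described above can be sanity-checked numerically: a 5 V pulse of 1-2 ms in a 20 ms period averages to 0.25-0.5 V after the low-pass filter, and the 4.7 kΩ / 47 µF combination has a cutoff far below the 50 Hz PWM frequency, so the square wave is smoothed to a near-DC level. A small sketch of this arithmetic:

```python
import math

def pwm_average_voltage(v_high, pulse_ms, period_ms=20.0):
    """DC level left after low-pass filtering a PWM pulse train."""
    return v_high * pulse_ms / period_ms

def rc_cutoff_hz(r_ohm, c_farad):
    """-3 dB corner frequency of a first-order RC filter."""
    return 1.0 / (2.0 * math.pi * r_ohm * c_farad)

v_min = pwm_average_voltage(5.0, 1.0)  # 1 ms pulse -> 0.25 V
v_max = pwm_average_voltage(5.0, 2.0)  # 2 ms pulse -> 0.50 V
fc = rc_cutoff_hz(4.7e3, 47e-6)        # ~0.72 Hz, well below 50 Hz
```

The two average voltages are easily distinguished by an ADC read, which is how the vision computer can tell the auto-switch position from a single analog sample.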

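Collecting the XPlane telemetry on the HIL vision computer amounts to reading UDP datagrams from the local network. The sketch below shows only the socket plumbing; the port number is a placeholder and the XPlane-specific payload decoding is omitted, since the packet layout is not described here:

```python
import socket

def collect_udp_packets(port, count, timeout_s=1.0):
    """Receive `count` raw UDP datagrams on the given port.

    Each datagram is returned as raw bytes; decoding the
    simulator-specific payload is left to the caller.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    sock.settimeout(timeout_s)
    packets = []
    try:
        while len(packets) < count:
            data, _addr = sock.recvfrom(2048)
            packets.append(data)
    finally:
        sock.close()
    return packets
```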


Figure 6: Pitch and roll angle response (top). Four input channels perceived by the autopilot (bottom). Grey shaded areas correspond to the auto-switch "off" state. Red shaded area corresponds to a disconnection of the heartbeat-signal wire.

IV. RESULTS

This section outlines the HIL test that was performed to evaluate whether the switching module is capable of proper switching between two different, simultaneously received inputs during a simulated flight. The results were obtained using the X-Plane 9 flight simulator, the APM Mission Planner and the HIL setup described in Section III. The simulated airplane was placed into level flight in the "fly-by-wire" mode.

The first input was the manual input from the RC remote, which was set with its pitch, roll and yaw sticks centered (0% input) and the throttle at 100%. The second input was sent from the HIL vision computer. A sinusoidal signal was sent on the pitch, roll and yaw channels, and a square wave ranging from 0% to 20% was sent to the throttle input. The frequencies of the roll, pitch, throttle and yaw signals were 0.3 Hz, 0.16 Hz, 0.2 Hz and 0.08 Hz, respectively. A sinusoidal signal was used to ensure a continuous disturbance from level flight.

During the test, the auto-switch was turned on and off repeatedly. Figure 6 shows the pitch and roll response of the UAV, as well as the inputs perceived by the APM2.5. The grey shaded regions indicate the auto-switch "off" state, which reduces the roll angle to zero and the pitch angle to approximately 3°, corresponding to trim flight conditions. The red shaded region shows the response to a physical removal of the heartbeat-signal wire from the switching module. The result is the same as disengaging the auto-switch. Figure 6 shows that the switching module is capable of switching between two simultaneously received inputs and, more importantly, of returning manual control to the operator.

V. CONCLUSIONS

The design and development of the hardware of a vision-based autopilot for a fixed-wing UAV was presented in this paper. A switching module that interfaces between an autopilot and the RC receiver was presented and tested in HIL simulations. The switching module was tested in experiments and shown to be capable of switching between the manual and vision computer inputs. The resulting system is a highly modular, low-cost, reliable autopilot system that is not limited to fixed-wing UAVs.

REFERENCES

[1] "Unmanned Air Vehicle Working Group Final Report," Transport Canada, 2007. [Online]. Available: http://www.tc.gc.ca/eng/civilaviation/standards/general-recavi-uavworkinggroup-2266.htm.
[2] S. Hutchinson, G. D. Hager, and P. I. Corke, "A tutorial on visual servo control," IEEE Transactions on Robotics and Automation, vol. 12, no. 5, pp. 651–670, 1996.
[3] A. Beyeler, J.-C. Zufferey, and D. Floreano, "Vision-based control of near-obstacle flight," Autonomous Robots, vol. 27, no. 3, pp. 201–219, Aug. 2009.
[4] R. Beard, S. Griffiths, T. McLain, and D. Barber, "Autonomous landing of miniature aerial vehicles," Journal of Aerospace Computing, Information, and Communication, vol. 4, no. 5, pp. 770–784, May 2007.
[5] L. Mejías, S. Saripalli, P. Campoy, and G. S. Sukhatme, "Visual servoing of an autonomous helicopter in urban areas using feature tracking," Journal of Field Robotics, vol. 23, no. 3–4, pp. 185–199, Mar. 2006.
[6] M. Achtelik, S. Weiss, and R. Siegwart, "Onboard IMU and monocular vision based control for MAVs in unknown in- and outdoor environments," in Proc. IEEE International Conference on Robotics and Automation (ICRA), 2011, pp. 3056–3063.
[7] S. Ettinger, M. Nechyba, P. G. Ifju, and M. Waszak, "Towards flight autonomy: Vision-based horizon detection for micro air vehicles," in Florida Conference on Recent Advances in Robotics, 2002.
[8] S. J. Mills, J. J. Ford, and L. Mejías, "Vision based control for fixed wing UAVs inspecting locally linear infrastructure using skid-to-turn maneuvers," Journal of Intelligent & Robotic Systems, vol. 61, no. 1–4, pp. 29–42, Oct. 2010.
[9] S. Huh and D. Shim, "A vision-based landing system for small unmanned aerial vehicles using an airbag," Control Engineering Practice, vol. 18, no. 7, pp. 812–823, 2010.
[10] H. J. Kim, M. Kim, H. Lim, C. Park, S. Yoon, D. Lee, H. Choi, G. Oh, J. Park, and Y. Kim, "Fully autonomous vision-based net-recovery landing system for a fixed-wing UAV," pp. 1–14, 2013.
[11] N. Kummer and H. Firouzi, "Autonomous UAV landing via eye-in-hand visual servoing," in Unmanned Systems Canada, 2011, pp. 2–7.
[12] L. Meier, P. Tanskanen, F. Fraundorfer, and M. Pollefeys, "PIXHAWK: A system for autonomous flight using onboard computer vision," in Proc. IEEE International Conference on Robotics and Automation, 2011, pp. 2992–2997.
[13] N. V. Hoffer, C. Coopmans, A. M. Jensen, and Y. Chen, "Small low-cost unmanned aerial vehicle system identification: A survey and categorization," in International Conference on Unmanned Aircraft Systems (ICUAS), 2013, pp. 897–904.

