
2016 International Conference on Collaboration Technologies and Systems

Software and Hardware Architectures in Cooperative Aerial and Ground Robots for Agricultural Disease Detection

Pablo Menendez-Aponte, Christian Garcia, Douglas Freese, Sinem Defterli, and Yunjun Xu
Department of Mechanical and Aerospace Engineering
University of Central Florida
Orlando, FL
[email protected]

Abstract— Currently, disease detection tasks performed on strawberry crops are conducted by human scouts. This qualitative approach is susceptible to error and relies on the visibility of symptoms, which are discernible only after the disease has progressed to a certain stage. Precision agriculture is a thriving new discipline that promises to integrate new autonomous machine technology with agronomy. In this research, the cooperation of dedicated ground and aerial vehicles is proposed to undertake the challenging task of early-stage disease detection for commercial farms. The focus of this paper is the architectures of the communication, software, and hardware in such a network.

Keywords - agricultural robot; cooperative vehicle network; disease detection; software architecture; precision agriculture; UAV; field robot

I. INTRODUCTION

The competitive nature of the agriculture market creates an inevitable demand for automation. In agriculture automation, the implementation of robotic technologies brings efficiency in routine operations, high standards in production quality, and long-term cost reduction [1]. The development of state-of-the-art sensor technologies, such as spectral cameras for disease detection, makes the automation of disease detection a feasible and reliable solution [2]. The tasks in field and indoor cultivation places are mostly repetitive, so the cyclic and continuous operation of robots fits very well [3]. To date, robotics and sensor technologies have been successfully implemented for automation in some agricultural tasks such as harvesting [4-5], grading [6], weed control [7], irrigation [8], spraying [9], and disease detection [10].

Advancements in crop disease detection are long overdue. Disease can cause huge economic losses; a case study of soybean production in the United States showed that removing 20% of soybean rust, a fungal disease, could result in an $11 million profit for farmers [11]. Current methods for disease detection are slow and labor intensive, and sometimes require a lab test [12]. In practice, many farmers of commercial strawberry orchards rely on human scouting, which is prone to error and oftentimes finds the disease in the late stages, after it has already begun to affect neighboring crops. Spectroscopic disease detection techniques have been presented for detecting disease in citrus [13], grapevine [14], and many other crops [12].

The relatively small presence of automated mobile robots in the agricultural industry has led to debate over the deployment of autonomous vehicles for agriculture [15-16]. In this project, a cooperative autonomous robot network, which consists of an Unmanned Aerial Vehicle (UAV) and an Unmanned Ground Vehicle (UGV), is utilized. The UAV has the advantage of speed and thus brings a wide range of reachability in an expedited fashion. Fitted with a spectral imaging camera, the UAV will build a georeferenced map calling attention to areas of crop that are suspected to be diseased. To provide a robust and reliable solution, the UGV will go to the marked suspected regions and perform high-resolution spectral analysis on the crops. Furthermore, because spectral imaging is a fairly new technology in disease detection, the UGV is equipped with a 6-DoF manipulator arm system for collecting leaf samples so that further testing methods can be used to verify the accuracy.

This paper outlines the architecture of such a networked system used to realize the goal of automated disease detection in a strawberry orchard. As stated earlier, the disease detection system is split into two mobile robotic platforms. The software and hardware architectures are discussed here, as well as the communication between the two platforms.

All the software packages are modularized, which gives control engineers the ability to add or remove a component while minimally affecting the existing system. Another design consideration was the use of high-level languages such as MATLAB and C++ so that other engineers can easily contribute to the design.
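To make the modularity idea concrete, the following minimal Python sketch (illustrative only; the class and function names are ours, and the actual implementation uses MATLAB and C++ classes) shows how a hardware component can sit behind a common interface so it can be added or swapped with minimal changes to the rest of the system.

from abc import ABC, abstractmethod

class RangeSensor(ABC):
    # Common interface that any range-finding component must implement.
    @abstractmethod
    def read_distance_m(self) -> float: ...

class UltrasonicSensor(RangeSensor):
    def read_distance_m(self) -> float:
        return 0.50   # placeholder: would query the microcontroller hub over serial

class LaserRangeFinder(RangeSensor):
    def read_distance_m(self) -> float:
        return 0.52   # a different device behind the same interface

def payload_protection_ok(sensor: RangeSensor, floor_m: float = 0.3) -> bool:
    # Higher-level logic depends only on the interface, so components can be
    # added or removed without touching the code that uses them.
    return sensor.read_distance_m() > floor_m

print(payload_protection_ok(UltrasonicSensor()))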

II. HARDWARE ARCHITECTURE

A. Octorotor Hardware

The UAV was designed with components tailored to satisfy its flight performance and mission criteria. The vehicle will take spectral images, identify suspected regions in the images, and estimate each region's position in GPS coordinates. In order to take accurate spectral images, the vehicle should hover above the desired point with minimal disturbance. Additionally, the UAV requires a flight time of 30 min to perform its tasks across the span of a commercial strawberry field.

An eight-rotor vehicle frame was used, with the eight brushless motors positioned in-plane. Four 22.2 V lithium polymer batteries are run in parallel to achieve an overall battery capacity of 32,000 mAh, allowing the UAV to reach the desired flight time. A 3-axis aerial gimbal is implemented to stabilize the camera against rotational motion from the UAV. The electrical architecture, shown in Fig. 1, consists of running power directly from the batteries, through a power distribution board, to eight electronic speed controllers (ESCs), which in turn power the avionics and their respective motors. The gimbal motors and controller also receive power from the power distribution board.

Figure 1. UAV mechatronic architecture
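As a rough feasibility check (ours, not from the paper), the stated pack capacity and endurance requirement bound the average current draw the airframe can sustain; the short calculation below assumes the full rated capacity is usable.

# Endurance check from the stated numbers (assumes full 32,000 mAh is usable).
capacity_ah = 32.0                                   # four packs in parallel
flight_time_h = 0.5                                  # required 30 min endurance
max_avg_current_a = capacity_ah / flight_time_h      # = 64 A total draw allowed
per_motor_a = max_avg_current_a / 8                  # = 8 A per motor on average
print(max_avg_current_a, per_motor_a)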
The UAV avionics architecture is governed by the Arducopter APM 2.6 (APM), an open-source flight controller, which regulates the signals to the eight ESC-motor subsystems based on the desired flight command. It receives feedback from an array of sensors, including a GPS module and a 6-axis IMU [17], as well as an ultrasonic/laser range-finder for more precise altitude information below 7 meters. The avionics architecture is displayed in Fig. 2. The range-finder's primary function is to serve a custom protection system designed for the payload on the UAV. However, having the camera lens pointed downwards allows the range-finder to also assist the flight controller with altitude measurements.

To accomplish image processing and real-time position estimation of diseased regions, a Raspberry Pi board was installed. It was chosen because of its programmability and its ability to interface with the other peripherals onboard. An XBee radio telemetry set is connected to the Pi for transmission of the position estimates once they are determined. The estimates are sent to the control station on the UGV via a paired XBee receiver connected to the Arduino Mega, granting the UGV direct and immediate access to the predicted diseased locations. The estimation data communication layout is shown in Fig. 2.

Figure 2. Avionics and UAV data architecture

B. Ground Robot Hardware

The UGV platform is made mobile by a skid-steer style, differentially driven robot designed to drive over and across strawberry beds. The structure of the robot has telescoping tubes that give it a transforming capability for different-size strawberry beds. Between the tires of the robot is a crop analysis and leaf sample collection subsystem. Creating a suitable environment for disease detection is accomplished by lowering curtains and turning on halogen lights, ensuring an appropriately lit testing environment for the spectral camera. The testing area onboard the UGV is large enough to encompass up to four strawberry crops; to position the spectral camera directly over any of the plants, it is secured to an XYZ table. For acquisition of a leaf sample, a manipulator arm with three joints and a leaf-gripping end effector is also fastened to the XYZ table, giving the UGV a manipulator with 6 DoF capable of grabbing any leaf in the testing environment.

The realization of a cohesive control system was done by integrating sensors and actuators as USB peripherals of a laptop. This allows all of the control software to be implemented on a stable computer system in a higher-level computer language. In this case a Windows operating system with the MATLAB scripting language is used, but the architecture described here can easily be used with open-source, production-ready systems such as the Robot Operating System and its many compatible programming languages. The design consists of microprocessors attached to the laptop; the laptop and microprocessors communicate over a Universal Asynchronous Receiver/Transmitter (UART) to USB link. The microprocessors' other ports are then connected either directly to sensors/actuators or, in some cases, to an integrated circuit (IC) that allows the microprocessor to communicate indirectly with a sensor/actuator. The role of the microprocessor is to collect, prepare, and send feedback data from the sensors to the laptop, and also to collect commands from the laptop and relay them to the actuators. As will be seen later, in some cases the IC or microprocessors can execute some of the lower-level, time-sensitive control loops. The architecture described is shown in Fig. 3. Since all components in the system are electrical or electromechanical, the arrows in Fig. 3 can be thought of as electrical signals, analog or digital depending on the component. This design allows modular hardware packages to be added to and removed from the central laptop with ease.

Figure 3. Overall hardware architecture
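The laptop-to-microprocessor link described above is, in essence, a framed serial exchange over the UART-to-USB connection. The sketch below is a minimal illustration, assuming pyserial and a comma-separated "sentence" format of our own invention; the actual system uses MATLAB serial objects and its own framing.

import serial  # pyserial

# Open the Arduino Mega's FTDI UART-to-USB port (port name is an example).
hub = serial.Serial("/dev/ttyACM0", baudrate=115200, timeout=0.1)

def read_feedback():
    # Read one newline-terminated sensor sentence, e.g. b"ENC,1023,987\n".
    line = hub.readline().decode(errors="ignore").strip()
    if not line:
        return None
    fields = line.split(",")
    return fields[0], fields[1:]          # (sentence id, payload)

def send_drive_command(left_pwm: int, right_pwm: int):
    # Send a drive command for the two DC motors; the format is illustrative.
    hub.write(f"DRV,{left_pwm},{right_pwm}\n".encode())

while True:
    packet = read_feedback()
    if packet and packet[0] == "ENC":
        # Hand the encoder counts to the navigation control loop here.
        send_drive_command(120, 120)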
The first mechatronic system, shown in Fig. 4, connects most of the components necessary for the navigation and mobility of the robot. The actuator equipment consists of a pair of electric DC motors, while the sensor equipment is comprised of an IMU, GPS, ultrasonic range-finding sensors, and optical quadrature encoders. For data processing, the navigation system uses an Arduino Mega, which includes an Atmel 2560 microprocessor and an FTDI UART-to-USB integrated circuit for communication with the onboard computer. The range finders, IMU, and GPS communicate directly with the Atmel 2560 via PWM signals, SPI serial, and UART serial communication, respectively. The two DC motors of the drivetrain require their own circuits for quadrature encoding and power regulation. The motor driver circuit has its own microprocessor that communicates with the Atmel 2560 through a UART serial link.

Figure 4. Navigation and control architecture

The second mechatronic system, shown in Fig. 5, combines much of the hardware necessary for crop analysis and leaf sample collection. It is designed in the same architecture and uses an Arduino Mega as a hub for components, much like the navigation hardware. Again, a motor driver is necessary to provide power to the DC motors that position the XYZ table and raise and lower the curtains. This motor driver, however, is not capable of quadrature encoding, so position control of the XYZ table components is done directly by the Arduino. The halogen lights used to illuminate the scene for spectral imaging are also controlled from this Arduino hub. The last component is the XBee Pro (ZigBee) radio used for communication between the UAV and UGV.

Figure 5. Leaf acquisition architecture

In addition to the mechatronic systems described above, the UGV's onboard computer also performs spectral and visual image processing. For visual image processing, the laptop is linked to several RGB webcams via its USB ports. The current configuration for processing the near-infrared spectral image includes a Raspberry Pi micro-computer that transmits frequency-domain data of the spectral image to the laptop via a wireless Wi-Fi link.

III. SOFTWARE ARCHITECTURE

The software structure is modularized so that it can be easily expanded in the future to include new hardware, new functionalities, or even additional vehicles in this network.

In this section, the software structures of the network that realize the communication, information processing, guidance, navigation, and control functionalities of the UGV and UAV are discussed.

A. Octorotor Software

Excluding the communication component with the UGV, the octorotor software architecture is composed of a flight-path navigation component, a spectral image processing component, and a geolocation component.

The flight control algorithms are self-contained in the software of the APM, about which more information can be found in [17].

The UAV's flight-path navigation component is based on GPS waypoint navigation, interfaced through the Mission Planner human-machine interface from a ground station. Depending on the flight altitude, predetermined waypoints can be placed to collect spectral images across the entire field. The UAV will then maneuver through the waypoints at a specified altitude, hovering at each point for a given time while taking images. Waypoint paths can be uploaded from the interface to the flight controller directly or wirelessly through radio telemetry.

The geolocation component was designed according to the structure given in Fig. 6. This component is implemented in the Python language and is launched from a bash script. A data object is used to store the UAV GPS and altitude information accessed via a MAVLink connection. A video object interfaces with the camera, storing the images taken in sequence. To identify suspected regions based on abnormal spectral intensities, an image processing function is used. Its identification criteria depend on the spectral data being observed and will vary. The function outputs the center-pixel measurements of the suspected regions in each image. These pixels are given to the estimator function, which determines and tracks the GPS coordinates that correspond to the center pixels. A convergence function determines whether the estimates have converged and places the final GPS estimate in a serial object to send to the UGV.

Figure 6. Geolocation software structure
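For illustration, the sketch below shows one plausible form of the estimator step: projecting a suspected region's center pixel to a ground offset and then to GPS coordinates. It assumes a nadir-pointing camera, a pinhole model, and a locally flat field; the constants and function names are ours, and the paper's actual estimator and convergence test may differ.

import math

def pixel_to_gps(px, py, img_w, img_h, alt_m, hfov_deg, uav_lat, uav_lon, yaw_rad):
    # Meters on the ground spanned by the image width at this altitude.
    ground_width_m = 2.0 * alt_m * math.tan(math.radians(hfov_deg) / 2.0)
    m_per_px = ground_width_m / img_w
    # Offset of the pixel from the image center, in camera axes (meters).
    dx = (px - img_w / 2.0) * m_per_px          # right of center
    dy = (img_h / 2.0 - py) * m_per_px          # forward of center
    # Rotate the offset into north/east using the UAV heading.
    north = dy * math.cos(yaw_rad) - dx * math.sin(yaw_rad)
    east = dy * math.sin(yaw_rad) + dx * math.cos(yaw_rad)
    # Convert the metric offset to a latitude/longitude offset.
    dlat = north / 111320.0
    dlon = east / (111320.0 * math.cos(math.radians(uav_lat)))
    return uav_lat + dlat, uav_lon + dlon

# Example: center pixel of a suspected region in a 1280x960 frame at 10 m altitude.
print(pixel_to_gps(800, 300, 1280, 960, 10.0, 60.0, 28.60, -81.20, 0.0))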
B. Ground Robot

The software for the UGV is designed around the Arduino C++ and MATLAB environments. To accomplish the software architecture shown in Fig. 7, the Object Oriented Programming paradigm is adopted. The C++ software on board the Arduinos uses a similar architecture for interfacing with the sensors. However, the C++ software handles much simpler tasks and thus is only briefly discussed here. The role of the data processing software is to partition the binary sentences received over the serial interfaces with the sensors and the onboard laptop. Additionally, the Arduino for the manipulator has three PID components for regulating the position of the XYZ table DC motors, and the Arduino for the navigation architecture contains a component for estimating the posture of the robot. The algorithm used for position estimation can be found in [18].
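A minimal discrete PID of the kind mentioned above for one XYZ-table axis might look like the following sketch (illustrative only; the real controllers run in C++ on the Arduino, and the gains here are placeholders).

class PID:
    # Simple discrete PID used here to position one XYZ-table axis.
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: drive an encoder count toward a commanded position at 100 Hz.
axis = PID(kp=2.0, ki=0.1, kd=0.05, dt=0.01)
pwm = axis.update(setpoint=1000, measurement=880)   # PWM duty (clamping omitted)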
Class definitions are written in MATLAB for the UGV’s C. Communication in the network
main software structures; the navigation structure and the leaf The UAV and UGV communicate through a 2.4 Hz IEEE
acquisition structure. The navigation structure shown in Fig. 7 802.15.4 based radio telecommunication link. As the
has three main components: path generation, cross-bed and geolocation software finds areas of the orchard that it suspects
over-bed control components. The algorithms used can be to have ailments, it interfaces with an xBee radio device to send
referred to in [19] for the over-bed component and in [20-21] the UGV the location. The software is written to periodically
for the path planning component. The cross-bed component check a location queue on this Arduino, when it is interfacing
algorithm interfaces with vision spectrum cameras to with it, either during crop analyzing, leaf sampling, or in the
case that the UGV is not busy and at its home position.
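The UGV-side check of the location queue can be pictured with the short sketch below, assuming pyserial access to the XBee-connected Arduino and a simple "LOC,lat,lon" message format; the names and format are ours, not the paper's.

import serial  # pyserial

xbee = serial.Serial("/dev/ttyUSB1", baudrate=57600, timeout=0.2)   # XBee link (example port)
pending_targets = []            # queue of suspected-disease GPS locations

def poll_location_queue():
    # Drain any "LOC,lat,lon" messages relayed by the UAV and queue them.
    while True:
        line = xbee.readline().decode(errors="ignore").strip()
        if not line:
            return
        if line.startswith("LOC,"):
            _, lat, lon = line.split(",")
            pending_targets.append((float(lat), float(lon)))

# Called periodically between crop analysis and leaf sampling, or while idle at home.
poll_location_queue()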
IV. CONCLUSIONS

This paper presented the hardware and software architectures for a UGV and UAV network for agricultural disease detection scenarios. A benefit of the modularized software package is the ability to easily add and switch between hardware interfaces. The modular architecture allows engineers to develop software that switches between interfaces with minimal code modifications. Field testing of the robot has begun and has demonstrated that the architecture is capable of performing the control tasks.

ACKNOWLEDGMENT

The authors would like to thank the United States Department of Agriculture - National Institute of Food and Agriculture for the financial support (#2013-67021-20934).

REFERENCES

[1] S. G. Defterli, Y. Shi, Y. Xu, and R. Ehsani, "Review of robotic technology for strawberry production," American Society of Agricultural and Biological Engineers - Applied Engineering in Agriculture, 2016, accepted, to be published.
[2] S. Yaghoubi, N. A. Akbarzadeh, S. S. Bazargani, M. Bamizan, and M. I. Asl, "Autonomous robots for agricultural tasks and farm assignment and future trends in agro robots," International Journal of Mechanical & Mechatronics Engineering, vol. 13, no. 3, 2013, pp. 1-6.
[3] D. Gardner, "Robotics and smart engineering," Journal of Farm Management, vol. 14, no. 3, 2011, pp. 249-270.
[4] N. Kondo, K. Ninomiya, S. Hayashi, T. Ota, and K. Kubota, "A new challenge of robot for harvesting strawberry grown on table top culture," ASAE Annual Meeting Presentation, Tampa, Florida, 17-20 July 2005.
[5] K. F. Sanders, "Orange harvesting systems review," Biosystems Engineering, vol. 90, no. 2, 2005, pp. 115-125.
[6] J. Qiao, A. Sassao, S. Shibusawa, N. Kondo, and E. Morimoto, "Mobile fruit grading robot (part 1) - development of a robotic system for grading sweet peppers," Journal of Japanese Society of Agricultural Machinery, vol. 66, no. 2, 2004, pp. 113-122.
[7] D. C. Slaughter, D. K. Giles, and D. Downey, "Autonomous robotic weed control systems: a review," Computers and Electronics in Agriculture, vol. 61, no. 1, 2008, pp. 63-78.
[8] S. Blackmore, B. Stout, M. Wang, and B. Runov, "Robotic agriculture - the future of agricultural mechanization," 5th European Conference on Precision Agriculture, Wageningen Academic Publishers, ed. J. V. Stafford, Uppsala, Sweden, 2005, pp. 621-628.
[9] P. J. Sammons, T. Furukawa, and A. Bulgin, "Autonomous pesticide spraying robot for use in a greenhouse," Proceedings of the Australasian Conference on Robotics & Automation, 5-7 December 2005, Sydney, New South Wales, Australia, pp. 1-9.
[10] Y. Xu, R. Ehsani, J. Kaplan, I. Ahmed, W. Kuzma, J. Orlandi, K. Nehila, K. Waller, and S. G. Defterli, "An octo-rotor ground network for autonomous strawberry disease detection - year 1 status update," 2nd International Conference on Robotics and Associated High-Technologies and Equipment for Agriculture and Forestry, Madrid, Spain, 21-23 May 2014, pp. 457-466.
[11] M. J. Roberts, D. Schimmelpfennig, E. Ashley, M. Livingston, M. Ash, and U. Vasavada, "The value of plant disease early-warning systems: a case study of USDA's soybean rust coordinated framework," Economic Research Service, United States Department of Agriculture, no. 7208, 2006.
[12] S. Sankaran, A. Mishra, R. Ehsani, and C. Davis, "A review of advanced techniques for detecting plant diseases," Computers and Electronics in Agriculture, vol. 72, no. 1, 2010, pp. 1-13.
[13] J. Belasque Jr., M. C. G. Gasparoto, and L. Gustavo-Marcassa, "Detection of mechanical and disease stresses in citrus plants by fluorescence spectroscopy," Applied Optics, vol. 47, no. 11, 2008, pp. 1922-1926.
[14] R. A. Naidu, E. Perry, F. Pierce, and T. Mekuria, "The potential of spectral reflectance technique for the detection of Grapevine leafroll-associated virus-3 in two red-berried wine grape cultivars," Computers and Electronics in Agriculture, vol. 66, no. 1, 2009, pp. 38-45.
[15] M. Li, K. Imou, K. Wakabayashi, and S. Yokoyama, "Review of research on agricultural vehicle autonomous guidance," International Journal of Agricultural and Biological Engineering, vol. 2, no. 3, 2009, pp. 1-16.
[16] S. S. H. Hajjaj and K. S. M. Sahari, "Review of research in the area of agriculture mobile robots," The 8th International Conference on Robotics, Vision, Signal Processing and Power Applications, Lecture Notes in Electrical Engineering, vol. 291, eds. H. A. Mat-Sakim and M. T. Mustaffa, 2014, pp. 107-118.
[17] H. Lim, J. Park, D. Lee, and H. J. Kim, "Build your own quadrotor: open source projects on unmanned aerial vehicles," IEEE Robotics & Automation Magazine, September 2012, pp. 33-45.
[18] J. Yi, H. Wang, J. Zhang, D. Song, S. Jayasuriya, and J. Liu, "Kinematic modeling and analysis of skid-steered mobile robots with applications to low-cost inertial-measurement-unit-based motion estimation," IEEE Transactions on Robotics, vol. 25, no. 5, 2009, pp. 1087-1097.
[19] F. Dong, H. Wolfgang, and R. Kasper, "Development of a row guidance system for an autonomous robot for white asparagus harvesting," Computers and Electronics in Agriculture, vol. 79, no. 2, 2011, pp. 216-225.
[20] Y. Xu, "Motion camouflage and constrained suboptimal trajectory design," AIAA Guidance, Navigation, and Control Conference, 20-23 August 2007, Hilton Head, South Carolina.
[21] N. Li, C. Remeikas, Y. Xu, S. Jayasuriya, and R. Ehsani, "Task assignment and trajectory planning algorithm for a class of cooperative agricultural robots," ASME Journal of Dynamic Systems, Measurement, and Control, vol. 137, no. 5, 2015, pp. 051004-1 - 051004-9.
