A survey on deep learning-based identification of plant and crop diseases from UAV-based aerial images
Cluster Computing (2023) 26:1297–1317
https://doi.org/10.1007/s10586-022-03627-x
Received: 18 December 2021 / Revised: 12 April 2022 / Accepted: 10 May 2022 / Published online: 3 August 2022
The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2022
Abstract
Agricultural crop productivity can be reduced by many factors, such as weeds, pests, and diseases. Traditional methods based on terrestrial engines, devices, and farmers' naked-eye inspection face many limitations in terms of accuracy and the time required to cover large fields. Currently, precision agriculture based on deep learning algorithms and Unmanned Aerial Vehicles (UAVs) provides an effective solution for agricultural applications, including plant disease identification and treatment. In the last few years, plant disease monitoring using UAV platforms has become one of the most important agricultural applications and has gained increasing interest among researchers. Accurate detection and treatment of plant diseases at early stages is crucial to improving agricultural production. To this end, in this review, we analyze recent advances in the use of computer vision techniques based on deep learning algorithms and UAV technologies to identify and treat crop diseases.
Keywords Computer vision · Deep learning · Unmanned Aerial Vehicles · Precision agriculture · Plant disease · Convolutional neural network
Abdelmalek Bouguettaya ([email protected]) · Hafed Zarzour ([email protected]) · Ahmed Kechida ([email protected]) · Amine Mohammed Taberkit ([email protected])

Abdelmalek Bouguettaya and Hafed Zarzour have contributed equally to this work.

1 Research Centre in Industrial Technologies (CRTI), P.O. Box 64, Cheraga, 16014 Algiers, Algeria
2 LIM Research, Department of Mathematics and Computer Science, Souk Ahras University, 41000 Souk Ahras, Algeria

1 Introduction

Agriculture is one of the most important human activities and plays a crucial role in improving the economy of any country Thangaraj et al. [76]. However, several issues have imposed additional challenges on the agricultural field in terms of crop productivity and food security, including continuous population growth, climate change, the shortage of arable lands, plant diseases, and, more recently, the spread of the COVID-19 pandemic Bouguettaya et al. [18], Rahman et al. [61].

Plant diseases have always been considered one of the most significant threats to crops, restricting food productivity Vishnoi et al. [78], Jiang et al. [41] and increasing economic losses. They occur in agricultural fields due to many factors, including climate change, water stress, and insects Card et al. [21], Bondre and Sharma [16]. According to Thangaraj et al. [76], plant diseases cause losses of around 40% of the food supply every year. Moreover, more than 5 million tons of wheat are lost every year to yellow rust disease alone Beddow et al. [14]. Unfortunately, due to the Russian-Ukrainian war, wheat losses are expected to increase, because these two countries are among the main sources of wheat production and export.

Reduced crop yields eventually result in starvation and insufficient food supplies. Therefore, early and efficient crop and plant disease detection and diagnosis are required to increase food productivity. Traditional methods based on human experts and on-ground machine scouting can be useful for monitoring small crops, but covering large crops is very difficult, sometimes impossible, as well as time-consuming and exhausting, making these methods unsuitable for early crop and plant disease identification Zhang et al. [84], Kerkech et al. [43], Guo et al. [33].

To overcome the aforementioned issues sustainably, at low cost, and without harming the environment, an agricultural revolution based on innovative ideas and technologies is needed. To this end, over the last few decades, agriculture has been changing, and is still changing, from work accomplished by human laborers to smart agricultural machines and robots, thanks to the continuous adoption of advanced technologies, including Internet of Things devices, intelligent algorithms, sophisticated sensors, and modern machines Reddy Maddikunta et al. [63], Moysiadis et al. [53], Ouhami et al. [55]. This new paradigm is called Precision Agriculture or Smart Farming; its main role is to optimize the use of agricultural resources such as water, agrochemical products, and labor to improve crop quality and quantity while saving time and money Neupane and Baysal-Gurel [54].

To detect crop diseases, several studies have adopted satellite, airplane, on-ground machinery, and UAV platforms to collect high-resolution images. However, airplane and satellite technologies face several limitations in terms of spatial/temporal resolution and limited viewing capacity due to weather conditions. Moreover, they are a very expensive solution. Similarly, on-ground technologies suffer from low area coverage and long coverage times for large fields. As a result, UAVs equipped with intelligent visual systems could be an effective low-cost solution for crop and plant disease detection in small, medium, and large agricultural fields, allowing farmers to apply the right treatment, in the right amount, at the right place, and at the right time.

To analyze the collected images, efficient algorithms are required. Several studies have adopted traditional machine learning techniques, including SVM and Random Forest Sujatha et al. [73]. However, these techniques face many limitations because they depend on manual feature extraction methods, making them inefficient, especially in complex environments. Recently, deep learning algorithms have emerged as a new and effective solution for improving computer vision-based systems for automatic crop disease monitoring. They perform automatic feature extraction without any human intervention, providing valuable information that could help farmers make the right decisions while reducing crop treatment costs and increasing productivity. Therefore, the combination of recent UAV camera sensor technologies and deep learning algorithms could be an efficient solution for the early detection of crop diseases.

Since the advent of the AlexNet architecture in 2012, the Convolutional Neural Network (CNN) has been considered one of the most effective deep learning approaches in the computer vision field. Currently, a wide range of CNN-based deep learning algorithms and architectures are used to detect and classify different crop diseases Bouguettaya et al. [18]. Nowadays, the application of computer vision techniques, deep learning algorithms, and UAV platforms to crop disease identification is an active research field that can provide solutions to some problems related to the early and effective identification of different plant diseases. To make UAV platforms able to achieve autonomous detection and treatment of crop diseases, several disciplines need to be addressed, including agriculture, electronic control, remote sensing technologies, computer vision, and artificial intelligence, among others. Therefore, in this review paper, we investigate the effectiveness of UAV platforms, remote sensing technologies, preprocessing techniques, and deep learning-based computer vision models for detecting and treating different crop diseases at their early stages.

There are several review papers in the literature targeting crop and plant disease identification through deep learning methods. However, most of these studies have targeted disease recognition using datasets collected with on-ground technologies. For example, the authors in Thangaraj et al. [76] provided a review of different machine learning and deep learning algorithms targeting tomato leaf diseases in non-aerial images. However, in addition to the adoption of datasets collected from non-aerial platforms, there is a lack of methods based on object detection and image segmentation, as most of the reviewed papers are based on image classification techniques. Similarly, the authors in Sirohi et al. [68] provided a short review of some recent deep learning algorithms to identify and classify plant diseases from their leaves. However, none of these studies focused on the use of UAVs as the main platform for collecting data. Some other reviews, like Zhang et al. [86] and Barbedo [12], covered a few studies targeting plant disease identification through UAV technologies, but they did not investigate either the use of UAVs or deep learning techniques in detail. To the best of our knowledge, this is the first review study that focuses in detail on the combination of deep learning techniques and UAV technologies to identify crop and plant diseases. The main contributions of the present paper are summarized as follows:

• Presenting different UAV and remote sensing technologies adopted for crop and plant disease recognition from aerial images, while providing their important characteristics and advantages over other available technologies.
• Providing new and effective solutions that may improve crop productivity, while reducing cost and drudgery.
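The CNN-based pipelines discussed above all share the same core mechanism: stacked convolution filters extract visual features (spots, discolorations, lesion edges) automatically, and a classifier head maps those features to disease labels. The toy forward pass below illustrates this mechanism with NumPy only; the random filter weights, the 64 × 64 patch size, and the two-class healthy/diseased setup are hypothetical illustrations, not a model from any of the cited studies.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(image, kernels):
    """Valid 2-D convolution of an HxWxC image with K kernels of shape kh x kw x C."""
    kh, kw, _, k = kernels.shape
    h, w, _ = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1, k))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = image[i:i + kh, j:j + kw, :]
            # Each kernel produces one activation per spatial position.
            out[i, j, :] = np.tensordot(patch, kernels, axes=([0, 1, 2], [0, 1, 2]))
    return out

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# A 64x64 RGB crop patch (random stand-in for a UAV image tile).
patch = rng.random((64, 64, 3))

# One convolutional layer: eight 5x5 filters, then ReLU and global average
# pooling -- the "automatic feature extraction" stage.
filters = rng.normal(0, 0.1, size=(5, 5, 3, 8))
features = np.maximum(conv2d(patch, filters), 0.0)   # ReLU
pooled = features.mean(axis=(0, 1))                  # 8 feature activations

# Classifier head: dense layer + softmax over two hypothetical labels.
labels = ["healthy", "diseased"]
w, b = rng.normal(0, 0.1, size=(8, 2)), np.zeros(2)
probs = softmax(pooled @ w + b)

print(dict(zip(labels, probs.round(3))))
```

A real pipeline stacks many such layers and learns the filter weights by backpropagation; the surveyed works rely on established architectures (AlexNet, VGG, ResNet, Inception) rather than hand-built layers, but the feature-extraction-then-classification structure is the same.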
smart UAVs are widely applied to perform a wide range of applications, including search and rescue operations Martinez-Alpiste et al. [52], wildfire detection Bouguettaya et al. [20], vehicle detection Bouguettaya et al. [19], precision agriculture Di Nisio et al. [27], Delavarpour et al. [26], package delivery Shahzaad et al. [64], and smart cities Abualigah et al. [5], to name a few.

The agricultural sector has changed greatly over the past few decades, and new technologies have played a significant role in this transformation. Several spatial and aerial platforms have been used to perform different agricultural tasks, including satellites, airplanes, and UAVs. Satellites and airplanes can cover very large areas in a very short time compared to UAV platforms, which take longer to cover large fields. However, in addition to the huge budget required to accomplish missions with satellites and airplanes, these platforms also suffer from poor spatial and temporal image resolution compared to UAVs and terrestrial technologies. Furthermore, they are very sensitive to cloudy and rainy weather, which can affect the overall performance of these systems.

The introduction of agricultural UAVs is one of the most significant changes leading to smart farming. Due to their high flexibility and mobility, UAVs can perform flight missions at different altitudes and viewing angles above dangerous and difficult areas that were impossible to reach with piloted airplanes or satellites. Recently, different UAV types equipped with high-resolution camera sensors have been widely adopted to perform different agriculture-related activities, including crop disease identification and treatment. They can be categorized into three main classes: fixed-wing, rotary-wing, and hybrid VTOL UAVs (Fig. 2). In this section, we present the various UAV types along with their properties and their agricultural purposes.

3.1 Rotary-wing UAVs

Multirotor UAVs, also called rotary-wing UAVs (Fig. 2), are the most popular UAV types for different agricultural tasks. Their high flexibility and their capability of hovering and flying at low altitudes above the targeted crop provide farmers with good images, allowing them to detect crop diseases at early stages through small visual symptoms on the different parts of the plant, including at the leaf level Bouguettaya et al. [18]. Multirotor UAVs depend on multiple propellers to fly, making them the best choice for covering crops in difficult areas from different altitudes. They can be classified according to the number of rotors mounted on them. UAVs with three rotors are called tri-copters; those with four rotors are called quad-copters. There are also hexacopters and octocopters, with six and eight propellers, respectively. The latter categories are mostly preferred for lifting heavy payloads, making them more suitable for precise crop spraying operations. For example, the authors in Pittu and Gorantla [58] adopted a hexacopter to detect the exact diseased areas and spray pesticides. However, UAVs with six and eight rotors face a serious problem of high energy consumption, resulting in short flying times. Thus, quad-copter UAVs are considered the most appropriate UAV category for crop disease monitoring due to many factors, including high flexibility, ease of use, and higher endurance compared to hexacopters and octocopters. Another rotary-wing type, similar to helicopters, was adopted in Théau et al. [77] for potato crop surveying to reduce the impact of diseases and pests on crop productivity. This type of UAV has a higher payload capacity than the aforementioned rotary-wing UAVs.

3.2 Fixed-wing UAVs

Fixed-wing UAVs (Fig. 2) are capable of covering larger areas in less time than rotary-wing UAVs due to their long endurance, high speed, and high altitude Raeva et al. [60]. These properties make them more suitable for forest and large crop surveillance. Fixed-wing UAVs are already widely used for crop disease monitoring. For example, the authors in Albetis et al. [10], Albetis et al. [9] adopted the long-range DT-18 fixed-wing UAV to detect Flavescence dorée and grapevine trunk diseases within seven different vineyards. However, in addition to their high cost, fixed-wing UAVs suffer from low flexibility, making small crop monitoring very difficult. Moreover, they require runways and space to land and take off Zhang et al. [85].

3.3 Hybrid VTOL UAVs

More recently, hybrid Vertical Take-Off and Landing (VTOL) UAVs (Fig. 2) have emerged as a new, effective solution to overcome the problems of multirotor and fixed-wing UAVs Song and Park [70]. They combine multirotor UAVs' VTOL capability with fixed-wing UAVs' cruise flight Delavarpour et al. [26]. Unfortunately, this type of UAV is still not widely used to monitor crop diseases.

In Table 1, we highlight the agricultural UAVs most used in the literature for crop disease monitoring, where we find that DJI UAVs are the most used ones. Similarly, the pros and cons of the different agricultural UAV categories are summarized in Table 2.

Table 1 Most used UAVs in the literature for crop disease monitoring
UAV brand | UAV category | Product name | References

4 UAV-based visual remote sensing systems used to identify crop diseases

Sensors represent one of the fundamental building blocks of any UAV, enabling it to perform different tasks autonomously, with little or no human intervention, through intelligent algorithms; these tasks include navigation, detecting and geolocating potential diseases in crops from visual data, and providing a map of crop status that could be helpful to farmers or to other machines working in coordination with the UAV Dammer et al. [24], Lytridis et al. [51]. Remote sensing systems based on camera sensors mounted on UAV platforms can be classified according to two main factors: the UAV type and the camera sensor type. UAV-based aerial imaging is one of the most important and useful data types for improving the agricultural field. Usually, the choice of UAV platform and sensor type depends on the purpose of the targeted application and on the crop type. Therefore, UAVs can be equipped with different types of cameras, including RGB, spectral, and thermal. Fixed-wing and VTOL UAVs are capable of carrying more sophisticated cameras than multirotor UAVs, especially heavy hyperspectral cameras. Also, UAVs equipped with sophisticated cameras can help farmers improve crop yield while saving time and money by automating processes that would otherwise require a group of people. However, cameras mounted on multirotor UAVs provide better Ground Sampling Distance resolution due to their ability to fly at lower altitudes. To this end, in this section, we present the camera sensors most used for crop disease monitoring.

4.1 RGB cameras

According to Table 3, the visible (or RGB) camera is one of the most adopted sensor types to achieve different
agricultural tasks, including crop disease identification. Their high adoption is due to many factors, such as low price, ease of use, low weight, and high spatial resolution Heidarian Dehkordi et al. [35], Abdulridha et al. [3]. These cameras represent the best choice for smallholder farmers with low financial resources, providing valuable visual information for easily identifying visual plant disease symptoms at the leaf level Kerkech et al. [43], Tetila et al. [75].

Several studies adopted this type of camera to detect plant diseases from UAV imagery. For example, Tetila et al. [75] used a Sony EXMOR RGB camera mounted on a DJI Phantom 3 Pro to recognize soybean leaf diseases. Also, the authors in Wu et al. [82] used an RGB camera mounted on a DJI Phantom 4 RTK UAV to diagnose pine wilt disease at an early stage. However, compared to other available technologies, visible cameras provide lower performance for detecting diseases at their early stages, due to many factors. They are extremely susceptible to environmental conditions, such as sunlight angle and shadows, that can affect the crop disease identification system and cause false disease detections Ganchenko and Doudkin [29]. The authors in Li et al. [49] used another type of visible camera, called RGB-D, that is less sensitive to light. This camera provides depth information that could be used to improve the performance of targeted object detection and localization in agricultural tasks.

To obtain more information about crop status, other types of camera sensors that provide more details are needed. For example, a multispectral sensor operating in the Near-InfraRed (NIR) wavelength is required to generate a Normalized Difference Vegetation Index (NDVI) map, where NDVI is computed per pixel as (NIR − Red)/(NIR + Red). On the other hand, multispectral cameras are more expensive than conventional RGB cameras and require complex and time-consuming calibration procedures. To this end, the authors in Costa et al. [23] provide a solution for creating an NDVI map from RGB data; NDVI is one of the most useful vegetation indices for identifying crop diseases. Genetic algorithms were used to achieve this objective.

4.2 Multispectral and hyperspectral cameras

Depending on their spectral resolution, spectral imaging systems can be categorized into two main types: multispectral and hyperspectral. They provide information across the electromagnetic spectrum, from the visible to the Near-Infrared (NIR), allowing the calculation of different robust vegetation indices such as NDVI Zhang et al. [84]. These characteristics make spectral cameras one of the most adopted sensor types for crop and plant disease identification from UAV platforms (Table 3). The main differences between multispectral and hyperspectral imaging systems can be summarized as follows: (1) hyperspectral cameras have many more channels than multispectral ones, and (2) hyperspectral cameras are more expensive. Multispectral and hyperspectral cameras can be an effective tool for the automatic detection of disease symptoms. They are more robust than RGB cameras to different illumination conditions, making them more reliable for distinguishing between healthy and stressed plants Zhang et al. [87]. According to Théau et al. [77], it is difficult to distinguish between different crop stress types from multispectral data. However, hyperspectral cameras provide more details, allowing the
Table 3 Deep learning-based crop and plant disease identification methods from UAV imagery

References | Crop | Disease | Altitude/GSD (m) | Data type | Model | Data size | Input data size | Performance
Tetila et al. [75] | Soybean | Leaf diseases | 2/– | RGB | Inception-v3 (FT 75%) | 3000 | 256 × 256 | 99.04% (accuracy)
| | | | | ResNet-50 (FT 75%) | | | 99.02% (accuracy)
| | | | | VGG-19 (FT 100%) | | | 99.02% (accuracy)
| | | | | Xception (FT 100%) | | | 98.56% (accuracy)
Gomez Selvaraj et al. [31] | Banana | BBTD, BXW | 50 to 100/1 to 3 | RGB | VGG-16 | 3300 | 64 × 64 | 85% (accuracy)
| | | | | Custom CNN | | | 92% (accuracy)
Görlich et al. [32] | Sugar beet | Cercospora Leaf Spot | – | RGB | FCN | – | 320 × 320 | 76% (precision), 83.87% (recall), 75.74% (F1)
Dang et al. [25] | Radish | Fusarium Wilt | 3, 7, and 15/– | RGB | RadRGB | 1700 | 64 × 64 | 96.4% (accuracy), 0.043 s/image (testing time)
| | | | | Inception-v3 | | | 95.7% (accuracy), 0.1 s/image (testing time)
| | | | | VGG-16 | | | 93.1% (accuracy), 0.22 s/image (testing time)
Wu et al. [82] | Pine | Pine Wilt Disease | 120/0.04 | RGB | Faster R-CNN (ResNet-50) | 476 | – | 60.2% (mAP), 134 MB (model size), 0.191 (FPS)
| | | | | Faster R-CNN (ResNet-101) | | | 62.2% (mAP), 208 MB (model size), 0.18 (FPS)
| | | | | YOLOv3 (DarkNet-53) | | | 64% (mAP), 241 MB (model size), 1.066 (FPS)
| | | | | YOLOv3 (MobileNet) | | | 63.2% (mAP), 95 MB (model size), 1.393 (FPS)
Hu et al. [38] | Pine | – | 103/– | RGB | Proposed approach (with Augmentor) | – | – | 46.4% (precision), 92.9% (recall), 61.9% (F1)
| | | | | Proposed approach (with DCGAN) | | | 56.5% (precision), 92.9% (recall), 70.3% (F1)
Hu et al. [37] | Pine | – | 103/0.025 | RGB | AlexNet | 1486 | 64 × 64 | 39.1% (recall)
| | | | | VGGNet | | | 91.3% (recall)
| | | | | Inception-v3 | | | 73.7% (F1), 61.8% (precision), 91.3% (recall)
| | | | | AlexNet + AdaBoost | | | 71.2% (F1), 58.3% (precision), 91.3% (recall)
| | | | | VGGNet + AdaBoost | | | 76.9% (F1), 69% (precision), 87% (recall)
| | | | | Proposed method | | | 86.3% (F1), 78.6% (precision), 95.7% (recall)
Qin et al. [59] | Pine | Pine Wood Nematode Disease | 150–200/0.1–0.125 | Multispectral | SCANet | 4862 | – | 79.33% (OA), 86% (precision), 91% (recall), 88.43% (F1)
| | | | | DeepLabv3+ | | | 56.62% (OA), 68% (precision), 77% (recall), 72.22% (F1)
| | | | | HRNet | | | 56.9% (OA), 75.66% (precision), 68.66% (recall), 72% (F1)
| | | | | DenseNet | | | 54.7% (OA), 64.33% (precision), 76.66% (recall), 70% (F1)
Yu et al. [83] | Pine | Pine Wilt Disease | 100/12 | Multispectral | Faster R-CNN | 1905 | 800 × 800 | 60.98% (mAP), 113.43 MB (model size), 10.51 (FPS)
| | | | | YOLOv4 | | | 57.07% (mAP), 243.96 MB (model size), 25.55 (FPS)
Shi et al. [65] | Potato | Late Blight Disease | 30/2.5 | Hyperspectral | CropdocNet | – | – | 98.2% (OA), 0.812 (Kappa), 721 ms (computing time)
| | | | | SVM | | | 82.7% (OA), 0.571 (Kappa), 162 ms (computing time)
| | | | | RF | | | 78.8% (OA), 0.615 (Kappa), 117 ms (computing time)
| | | | | 3D-CNN | | | 88.8% (OA), 0.771 (Kappa), 956 ms (computing time)
Abdulridha et al. [2] | Tomato | Target Spot, Bacterial Spot | 30/0.1 | Hyperspectral | MLP | – | – | TS: 97% (accuracy); BS: 98% (accuracy)
Duarte-Carvajalino et al. [28] | Potato | Late Blight | 30/0.008 | Multispectral | MLP (NIR-G-B) | 748,071 | 50 × 40 | 16.37 (MAE), 23.25 (RMSE), 0.47 (R2)
| | | | | MLP (NDVI) | | | 18.71 (MAE), 21.98 (RMSE), 0.44 (R2)
| | | | | MLP (Band Differences) | | | 13.23 (MAE), 16.28 (RMSE), 0.75 (R2)
| | | | | MLP (PCA) | | | 16.60 (MAE), 21.87 (RMSE), 0.48 (R2)
| | | | | SVR (Band Differences) | | | 17.34 (MAE), 21.06 (RMSE), 0.45 (R2)
| | | | | RF (Band Differences) | | | 12.96 (MAE), 16.15 (RMSE), 0.75 (R2)
| | | | | CNN (NIR-G-B) | | | 11.72 (MAE), 15.09 (RMSE), 0.74 (R2)
Kerkech et al. [43] | Vineyard | Esca | 25/0.01 | RGB | CNN + YUV + ExGR | 70,560 | (16 × 16), (32 × 32), (64 × 64) | 95.92% (accuracy)
Wiesner-Hanks et al. [79] | Vineyard | Mildew disease | 25/– | Multispectral | VddNet | – | 256 × 256 | 93.72% (accuracy)
| | | | | SegNet | | | 92.75% (accuracy)
| | | | | U-Net | | | 90.69% (accuracy)
| | | | | DeepLabv3+ | | | 88.58% (accuracy)
| | | | | PSPNet | | | 84.63% (accuracy)
Raj et al. [62] | Vineyard | Mildew disease (leaf-level) | 25/0.01 | RGB, Infrared | SegNet (RGB) | 105,515 (RGB), 98,895 (IR) | 360 × 480 | 85.13% (accuracy)
| | | | | SegNet (Infrared) | | | 78.72% (accuracy)
| | | | | SegNet (Fusion AND) | | | 82.20% (accuracy)
| | | | | SegNet (Fusion OR) | | | 90.23% (accuracy)
| | Mildew disease (grapevine-level) | | | SegNet (RGB) | | | 94.41% (accuracy)
| | | | | SegNet (Infrared) | | | 89.16% (accuracy)
| | | | | SegNet (Fusion AND) | | | 88.14% (accuracy)
| | | | | SegNet (Fusion OR) | | | 95.02% (accuracy)
Wu et al. [81] | Maize (Corn) | Northern Leaf Blight | 6/– | RGB | CNN (ResNet-34) | 6267 | 224 × 224 | 97.76% (accuracy), 97.85% (recall), 98.42% (precision)
Wiesner-Hanks et al. [80] | Maize (Corn) | Northern Leaf Blight | 6/– | RGB | CNN (ResNet-34) + CRF | 18,222 | 224 × 224 | 99.79% (accuracy), 71.53 (F1)
Stewart et al. [71] | Maize (Corn) | Northern Leaf Blight | 6/– | RGB | Mask R-CNN | 3000 | 512 × 512 | 96% (average precision at IoU = 0.5)
Huang et al. [39] | Wheat | Helminthosporium Leaf Blotch | 80/0.034 | RGB | Color Histogram + SVM | 246 | 100 × 100 | 85.92% (OA)
| | | | | LBPH + SVM | | | 65.10% (OA)
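Most entries in Table 3 report accuracy, precision, recall, F1, or overall accuracy (OA). As a reference for how these figures relate to one another, the sketch below computes them from raw prediction counts; the tiny label vectors are made-up illustrative data, not results from any cited study.

```python
def classification_metrics(y_true, y_pred, positive=1):
    """Precision, recall, F1, and overall accuracy from two label sequences."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)

    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    accuracy = correct / len(y_true)
    return {"precision": precision, "recall": recall, "f1": f1, "accuracy": accuracy}

# Hypothetical per-tile labels: 1 = diseased, 0 = healthy.
truth      = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
prediction = [1, 1, 1, 0, 1, 0, 0, 0, 0, 0]

m = classification_metrics(truth, prediction)
print({k: round(v, 3) for k, v in m.items()})
```

Overall accuracy (OA) in the segmentation studies is the same quantity computed per pixel; mAP in the detection entries is the mean, over classes, of the area under the precision-recall curve.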
measurement of different Vegetation Indices (VIs) that can be used to discriminate stress types accurately.

4.3 Thermal infrared cameras

Thermal infrared cameras are other available sensors that could be used for crop disease identification from aerial images. The InfraRed (IR) region consists of several spectral bands, including Near InfraRed (NIR), Short-Wave InfraRed (SWIR), Mid-Wave InfraRed (MWIR), Long-Wave InfraRed (LWIR), and Far InfraRed (FIR) Khalid et al. [46]. Due to their characteristics, these types of cameras can be used during either day or night. Recently, UAV-based thermal cameras have been widely used in several agriculture-related tasks, including water stress and crop disease monitoring, providing valuable information. Unlike RGB cameras, which measure visible light, thermal cameras are sensitive to infrared spectra, providing information about plant status that is not obtainable with visible, multispectral, or hyperspectral cameras. They provide a heatmap by measuring the radiation emitted from the targeted crop. The measured energy can be used to estimate crop status through temperature irregularities and anomalies Sishodia et al. [69]. This information can improve the visual inspection abilities of UAV-based crop disease identification systems. Therefore, thermal remote sensing data could be used to identify crop diseases even before they can be distinguished by the naked eye Kerkech et al. [43], Khanal et al. [47]. However, according to Raj et al. [62], when employing thermal images, a few concerns must be handled, including the images' temporal and spatial resolution, environmental conditions, crop species diversity and growth stage, and the flight altitude and viewing angle.

5 Deep learning algorithms to identify crop diseases from UAV-based visual data

Over the last decade, deep learning-based computer vision techniques have achieved interesting results in many fields, including agriculture. Crops affected by diseases show several visual symptoms, such as changes in plant color, leaf winding, leaf spots, and fruit spots Thangaraj et al. [76], Kerkech et al. [43]. This can make deep learning algorithms the best choice to identify these diseases. To achieve better crop disease identification from UAV imagery, three main computer vision tasks can be used: image classification Tetila et al. [75], object detection Wu et al. [82], and image segmentation Pan et al. [56], Qin et al. [59]. The workflow of the deep learning algorithms used to detect and classify crop diseases from UAV imagery is illustrated in Figure 3.

Recently, there has been considerable interest in crop disease diagnosis using deep learning algorithms to process images acquired through UAV platforms. Several recent studies on crop disease detection from UAV imagery are based on deep learning models, especially Convolutional Neural Network (CNN) algorithms, to overcome the limitations of traditional techniques. Most of these studies targeted subsistence crops such as wheat Pan et al. [56], Su et al. [72], Zhang et al. [87], maize Wiesner-Hanks et al. [80], Stewart et al. [71], potato Théau et al. [77], Siebring et al. [66], and tomato Abdulridha et al. [3], Abdulridha et al. [2].

5.1 Major grain crop disease identification

Several recent studies have successfully used the combination of UAV-based images and deep learning algorithms to identify different diseases that affect the major grain crops, including wheat and maize. For example, the authors in Pan et al. [56], Su et al. [72], Zhang et al. [87] targeted the detection of yellow rust disease using different deep learning-based computer vision models. Yellow rust is one of the most dangerous diseases, causing great wheat production losses each year, estimated at more than 5 million tonnes Beddow et al. [14]. To minimize the impact of this disease, the authors in Zhang et al. [87] proposed a novel semantic segmentation method derived from the U-Net model to identify wheat crop regions infected with yellow rust using multispectral data collected through a UAV platform. To improve the main U-Net architecture, they embedded three modules: an Irregular Encoder Module (IEM), an Irregular Decoder Module (IDM), and a Content-aware Channel Re-weight Module (CCRM). They investigated the impact of the input data type on the overall performance of the deep learning model for detecting yellow rust in wheat crops. They found that the proposed Ir-UNet model provides good results using all five bands gathered with the RedEdge multispectral camera, achieving an overall accuracy of 96.95% and an F1-score of 94.66%, exceeding the results obtained in Su et al. [72], where an F1-score of only 92% was achieved. Furthermore, they achieved an even slightly better overall accuracy of 96.97% using a combination of all the raw bands and their measured Selected Vegetation Indices (SVIs). Also, by applying feature re-weighting with the CCRM, the Ir-UNet model achieved an overall accuracy of 97.13%. Similarly, the authors in Pan et al. [56] investigated the performance of different machine learning and deep learning models to identify yellow rust disease in wheat crops from UAV-based RGB images. They found that the PSPNet (98%) and SVM (96%) algorithms provide the best accuracy among all the tested models, including
Random Forest (73%), BPNN (86%), FCN (90%), and U-Net (94%). Liu et al. [50] proposed a BPNN model to monitor Fusarium Head Blight from aerial hyperspectral imagery, achieving the highest overall accuracy of 98% and outperforming SVM and RF, which both reached 95%. Huang et al. [39] targeted another wheat disease from UAV-based RGB images, Helminthosporium Leaf Blotch Disease (HLBD). They proposed a LeNet-based CNN model to classify HLBD according to its severity at different disease progression stages. The adopted CNN model achieved an overall accuracy of 91.43%, outperforming the different techniques combined with the SVM model (Table 3).

Other studies targeted one of the major maize crop diseases, Northern Leaf Blight (NLB); to achieve NLB identification in maize crops, they used the dataset created in Wiesner-Hanks et al. [79]. This disease caused annual production losses estimated at approximately 14 million tonnes between 2012 and 2015 in the United States and Ontario alone. In Stewart et al. [71], the authors adopted an instance segmentation technique (Mask R-CNN) to detect NLB disease from low-altitude RGB aerial images collected using a DJI Matrice 600. The proposed approach was able to identify and segment individual lesions with an average precision of 96%. In Wu et al. [81], a ResNet-based model was adopted to classify healthy and diseased maize leaves from low-altitude UAV imagery, providing an accuracy of 97.76%, a recall of 97.85%, and a precision of 98.42%. Similarly, Wiesner-Hanks et al. [80] combined a crowdsourced ResNet-based CNN and Conditional Random Field (CRF) techniques to segment UAV-based RGB images into regions affected or not affected by NLB, where the crowdsourced CNN is used to generate heatmaps and the CRF to classify each pixel in the image as lesion or non-lesion. Applying this approach, they were able to identify NLB disease in maize crops down to the millimeter level, achieving an impressive accuracy of 99.79% and surpassing the approach adopted in Wu et al. [81] by more than 2% (Table 3).

5.2 Vineyard disease identification

Vines are other important crops that are susceptible to different diseases. Therefore, several studies targeted vineyard diseases from aerial images collected through UAVs. In Kerkech et al. [43], the authors adopted the LeNet-5 architecture to identify infected grapevine regions from RGB aerial images gathered by a UAV flown at an altitude of 25 meters above the ground. Combining the YUV color space and the ExGR vegetation index as input to the CNN model, they achieved the best accuracy of 95.92%
(using patches of 64×64 and a filter size of 5×5) among all the adopted combinations, including YUV & ExG (95.41%), YUV & ExR (95.70%), YUV & GRVI (95.70%), YUV & NDI (95.52%), and YUV & RGI (95.73%). In Kerkech et al. [45], a deep learning-based semantic segmentation approach was developed to automatically identify mildew disease in vineyards from RGB images, infrared images, and multispectral data by combining the visible and infrared bands collected through a UAV platform. They used the SegNet model to classify each pixel in the image as diseased or not at both leaf level and grapevine level. According to Table 3, the proposed method achieved accuracies of 85.13%, 78.72%, 82.20%, and 90.23% at leaf level and 94.41%, 89.16%, 88.14%, and 95.02% at grapevine level using visible, infrared, fusion AND, and fusion OR data, respectively. Also, Kerkech et al. [44] developed VddNet, a semantic segmentation model inspired by the VGGNet, SegNet, and U-Net architectures. It consists of three parallel VGGNet-based encoders, one for each type of the used data (RGB, Near-InfraRed (NIR), and depth map), and one decoder to generate a disease map at the pixel level. The proposed model achieved an accuracy of 93.72%, overcoming state-of-the-art semantic segmentation algorithms, namely SegNet (92.75%), U-Net (90.69%), DeepLabV3+ (88.58%), and PSPNet (84.63%) (Table 3).

5.3 Pine tree crops diseases identification

Several studies targeted the detection of pine tree diseases from UAV-based aerial imagery. For example, Qin et al. [59] developed the Spatial-Context-Attention Network (SCANet) architecture to segment pine nematode disease in multispectral aerial images of pine trees gathered using UAV technology. The SCANet architecture consists of a Spatial Information Retention Module (SIRM) and a Context Information Module (CIM), where the SIRM was used to obtain low-level features and the CIM was designed to expand the receptive field. Their approach provides better results than state-of-the-art semantic segmentation algorithms, achieving a mean F1-score of more than 88%, while DeepLabV3+, HRNet, and DenseNet provide only 72.22%, 72%, and 70%, respectively. Also, Hu et al. [37] combined a Deep Convolutional Neural Network (DCNN), a Deep Convolutional Generative Adversarial Network (DCGAN), and an AdaBoost classifier to improve the detection of diseased pinus trees from RGB UAV imagery. The proposed approach overcomes traditional machine learning methods, providing an F1-score of 86.3% and a recall of 95.7% against recall rates of 78.3% and 65.2% for the SVM and AdaBoost classifiers, respectively (Table 3). Similarly, in Hu et al. [38], a combination of MobileNet, Faster R-CNN, Augmentor, and DCGAN architectures was adopted to recognize diseased pinus trees from UAV imagery. The DCGAN model was used to increase the number of images used in the training process, while the MobileNet architecture was used to reduce the complex background information, such as roads, soils, and shadows, that has some feature similarities with the targeted pine tree disease; then, Faster R-CNN was used to detect diseased pine trees. According to Table 3, the proposed method with DCGAN-based data augmentation provides acceptable results, achieving an F1-score of 70.3%, a recall rate of 92.9%, and a precision rate of 56.5%, while it achieved only 61.9%, 92.9%, and 46.4%, respectively, with Augmentor-based data augmentation. Also, the authors in Wu et al. [82] focused on the detection of Pine Wilt Disease (PWD) using UAV technology and two state-of-the-art deep learning-based detectors. The first detector is Faster R-CNN (a two-stage detector), while the second is YOLOv3 (a one-stage detector). These detectors are based on different backbone architectures for feature extraction, including ResNet-50 and ResNet-101 for Faster R-CNN, and DarkNet-53 and MobileNet for YOLOv3. They achieved a mAP of 60.2%, 62.2%, 64%, and 63.2% using Faster R-CNN (ResNet-50), Faster R-CNN (ResNet-101), YOLOv3 (DarkNet-53), and YOLOv3 (MobileNet), respectively (Table 3). The DarkNet-53-based YOLOv3 provides the best mAP among all the tested detectors, but with the largest model size of 241 MB. On the other hand, the MobileNet-based YOLOv3 provides the highest inference speed with around 1.4 FPS while keeping a competitive mAP. Similarly, in another study, Yu et al. [83] adopted a ResNet50-based Faster R-CNN and a DarkNet53-based YOLOv4 detector to identify PWD in pinus trees at different growing stages, including green, early, middle, and late stages. According to Table 3, Faster R-CNN provides a better mAP (61%) and a smaller model size (113 MB) than YOLOv4, which achieved only a mAP of around 57% with a model size of 224 MB. Compared with the work of Wu et al. [82], YOLOv4 performs worse than YOLOv3, which could be due to the dataset type, image size, image preprocessing, and model configuration. However, YOLOv4 provides real-time PWD detection, achieving an inference speed of more than 25 FPS. Still, these results are relatively low, which could be due to many factors such as the flight altitude and the small size of the targeted disease.

5.4 Other crops diseases identification

The use of deep learning and UAVs was adopted to identify diseases in many other crop types, including potato, tomato, soybean, and banana, among others. For example, the authors in Tetila et al. [75] proposed a computer vision technique that combines the SLIC algorithm and different CNN models to identify soybean diseases at
leaf level from RGB images captured through a DJI Phantom 3 Pro equipped with a Sony EXMOR camera. To classify soybean leaf diseases, they adopted several deep CNN architectures with different parameter fine-tuning, including Inception-v3 Szegedy et al. [74], ResNet-50 He et al. [34], VGG-19 Simonyan and Zisserman [67], and Xception Chollet [22]. According to Table 3, Inception-v3 with a fine-tuning rate of 75% slightly overcomes the other architectures in terms of accuracy and training time. It achieved an accuracy of 99.04% against 99.02%, 99.02%, and 98.56% achieved by ResNet-50, VGG-19, and Xception, respectively. Moreover, it took 6, 31, and 42 hours less training time than ResNet-50, VGG-19, and Xception, respectively. The authors in Gomez Selvaraj et al. [31] used a RetinaNet detector based on the ResNet-50 architecture to detect banana plants from UAV-based RGB aerial images, achieving an F1-score of around 84% on the training set and 70% on the test set, which is relatively low. The banana plant's small size when seen from high altitudes could explain the detector's poor performance. Then, they investigated the performance of two CNN models to classify each of the detected plants as healthy or diseased, achieving an accuracy of 85% for VGG-16 and 92% for the proposed architecture. To determine the severity of potato Late Blight disease from multispectral UAV imagery, the authors in Duarte-Carvajalino et al. [28] used different methods, including traditional machine learning approaches (MLP, RF, and SVR) and a deep learning model (CNN). According to Table 3, CNN and RF provide better results than MLP and SVR, achieving an R² of 0.74 for CNN and 0.75 for RF. Similarly, the authors in Shi et al. [65] developed a 3D-CNN model called CropdocNet to detect potato Late Blight disease from hyperspectral images collected using a UAV platform. The proposed model provided impressive results, achieving average accuracies of around 98% and 96% on the training and independent testing sets, respectively. However, their study targeted only one single potato disease type. The authors in Abdulridha et al. [2] developed a system based on MLP and VIs to detect Target Spot (TS) and Bacterial Spot (BS) diseases in tomato crops from UAV-based hyperspectral images in both laboratory and field conditions and at three different disease development stages, which are healthy, early, and late. Applying MLP, they were able to identify TS and BS diseases from UAV imagery with accuracies of 97% and 98%, respectively. Dang et al. [25] proposed the RadRGB model to classify Fusarium Wilt Disease in radish crops. Compared to VGG-16 and Inception-V3, the proposed architecture provides the best results in terms of accuracy and testing time, at 96.4% and 0.043 s/image, respectively, while VGG-16 and Inception-V3 achieved slightly lower accuracies of 93.1% and 95.7%, respectively, and longer testing times of 0.1 (VGG-16) and 0.22 (Inception-V3) s/image (Table 3). Cercospora leaf spot detection from UAV imagery of sugar beet crops was targeted in Görlich et al. [32] using a semantic segmentation model (FCN) based on the fully convolutional DenseNet (FC-DenseNet) proposed in Jégou et al. [40]. The proposed FCN approach takes RGB images as input and provides a pixel-wise map. The adopted method achieved an F1-score of 75.74% on data under field conditions similar to the training data and a similar F1-score of 75.55% under changing field conditions.

6 Discussions

Early crop and plant disease identification is a crucial task to improve crop productivity. The adoption of recent UAV-based technologies and advanced deep learning algorithms has emerged as a new effective solution, and their use to detect crop diseases has gained high importance in several studies over the last few years. In this review paper, we investigated the importance of several UAV platforms, camera sensor technologies, and deep learning algorithms to improve crop and plant disease identification. These techniques and technologies provide better performance than traditional ones that are based on spatial and terrestrial technologies and machine learning-based methods such as SVM and random forest classifiers. In this section, we aim to provide readers and farmers with the most challenging issues they could face when detecting crop diseases from UAV-based aerial images, and to show how they can select the appropriate technology and algorithm to achieve better results.

6.1 Challenges, limitations, and potential solutions related to the UAV and camera technologies

Even with all their benefits, the UAV industry is still facing several challenges in achieving different agricultural tasks, including large field monitoring and pesticide spraying. In addition to the regulations that could restrict flying in several countries and areas across the world, the short flight time is considered one of the most important limitations of UAVs due to many factors, including battery capacity, the computational power needed to run deep learning algorithms, and the high payload. Unfortunately, these issues could affect and limit the use of UAVs in modern smart agriculture. Several solutions were proposed in many studies to minimize the impact of these factors on the overall efficiency of UAV technology. For example, the authors in Gao et al. [30] proposed to plan the UAV flight route in advance to ensure that the entire crop is checked following the shortest flight path to reduce energy
consumption. However, this approach is not always applicable due to many factors, including crop structure and location. Similarly, to reduce energy consumption and increase flight time, the adoption of lightweight models and offboard processing could be other effective solutions Bouguettaya et al. [17]. Furthermore, to cover larger fields, the use of UAV swarms is an interesting solution Albani et al. [8], Ju and Son [42]. However, it is still very challenging to control a large number of UAVs simultaneously. Also, fixed-wing and hybrid VTOL UAVs can be used for large crop monitoring. However, they suffer from many limitations that could restrict and reduce the performance of the system for early disease identification due to their lower spatial resolution compared to multirotor UAVs, which can fly at lower altitudes and provide better resolution, allowing farmers to monitor crops even at leaf level.

The camera type selection is a crucial task that we need to consider. Multispectral, hyperspectral, and thermal camera sensors provide richer information than visible cameras, making them more suitable for early crop disease identification from aerial images Wu et al. [82], Pineda et al. [57]. Several studies have focused on calculating different VIs using multispectral and hyperspectral data, allowing farmers to identify general crop stress, including different diseases Kerkech et al. [43], Abdulridha et al. [1], Bagheri [11]. However, UAVs equipped with visible cameras are the most used in the agriculture field due to their price, weight, and availability Bouguettaya et al. [18], Wu et al. [82]. Also, the performance of the different imaging systems depends on the flight altitude and viewing angle. For example, flying at high altitudes could increase the scene complexity, which may restrict the deep learning model's performance in the case of small-scale visual symptoms. Also, flying at very low altitudes can affect the detection performance due to the wind generated by the rotation of the propellers, resulting in continuous movement of the plants' leaves that can hide some symptoms.

6.2 Challenges, limitations, and potential solutions related to the deep learning models

Recently, deep learning models have emerged as a novel technology that has shown promising results in visual data processing. They provide significant benefits over classical machine learning approaches. For example, deep learning algorithms can extract relevant features automatically instead of requiring them to be extracted manually, which is a time-consuming task. Generalization is another important parameter that would favor deep learning models over classical techniques. For example, the model in Görlich et al. [32] provides similar results both on field conditions similar to the training data and under changing field conditions, such as varying illumination and orientation. However, deep learning models still face several challenges and limitations.

The use of deep learning methods requires a huge amount of data, which is not always achievable in the case of crop disease identification from UAV imagery due to the large variety of crops, plants, and diseases. Several approaches were adopted in different studies to reduce the impact of the low availability of datasets, including transfer learning, fine-tuning, and techniques employed within the deep learning architecture like dropout and normalization Tetila et al. [75], Gomez Selvaraj et al. [31], Duarte-Carvajalino et al. [28]. Also, data augmentation could be another effective solution to overcome this issue by artificially increasing the training dataset size. This can be done through image processing techniques like rotation, zooming, mirroring, and adding some noise and brightness to the images Stewart et al. [71], Kerkech et al. [44]. For example, the authors in Stewart et al. [71] applied rotations to each image to increase the number of images used for the training process, resulting in seven additional augmented images per image. Similarly, the authors in Tetila et al. [75] used a total of 3000 images for six different classes to train the adopted CNN architectures. Each of the six classes consists of 500 UAV-based images, which is not sufficient to train deep learning models. To this end, they applied different geometric transformations to increase the number of image samples in each class, including rotation, rescaling, scrolling, and zooming operations. Another way of data augmentation is to use deep learning algorithms to generate new unseen artificial data, including different types of Generative Adversarial Networks (GANs). For example, the authors in Hu et al. [38], Hu et al. [37] used the DCGAN architecture to increase the data size by generating new unseen data to improve the model performance. Also, to overcome the lack of data to train an efficient deep learning model, the authors in Tetila et al. [75] applied dropout and data augmentation techniques using the Keras module. They also investigated the impact of transfer learning and fine-tuning techniques on the overall performance of the CNN model, where various models pre-trained on the ImageNet dataset were tested. These techniques provide much better results than training deep learning models from scratch with random weight initialization. Data type and data preprocessing are other fundamental factors we should consider to develop efficient deep learning models, especially in the case of crop disease identification from UAV-based aerial images. Thus, the data type selection may improve the performance of the developed model. Several studies adopted different data fusion approaches to improve the model performance.
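The simplest form of such fusion is early fusion: stacking co-registered bands into one multi-channel input for the CNN, optionally together with a derived vegetation index such as NDVI. A minimal NumPy sketch of this idea follows; the array names, patch size, and random data are illustrative only and are not taken from any of the cited works:

```python
import numpy as np

# Illustrative co-registered 256x256 patches from a UAV survey.
rgb = np.random.rand(256, 256, 3).astype(np.float32)  # visible bands (R, G, B)
nir = np.random.rand(256, 256, 1).astype(np.float32)  # near-infrared band

# Early fusion: stack the bands into one 4-channel CNN input.
fused = np.concatenate([rgb, nir], axis=-1)           # shape (256, 256, 4)

# A vegetation index such as NDVI = (NIR - R) / (NIR + R) can be
# appended as an extra channel to emphasize crop stress.
red = rgb[..., 0:1]
ndvi = (nir - red) / (nir + red + 1e-8)               # epsilon avoids division by zero
fused_ndvi = np.concatenate([fused, ndvi], axis=-1)   # shape (256, 256, 5)

print(fused.shape, fused_ndvi.shape)
```

The fused array can then be fed to the first convolution layer of the segmentation or classification network, provided that layer is configured for the corresponding number of input channels.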
For example, the authors in Zhang et al. [84] proposed to combine spectral and spatial information. Similarly, the authors in Kerkech et al. [45] combined visible and infrared data to improve the model performance in identifying diseases. Also, combining different vegetation indices and color spaces was adopted in Kerkech et al. [43], where the choice of the right vegetation index could affect the overall performance of crop disease identification systems, as shown in many studies, including Guo et al. [33], Abdulridha et al. [2], and Bagheri [11].

Also, several studies focused on the deep learning model architecture, such as the chosen backbone for feature extraction, the number of layers, and the loss functions, to name a few. The selected architecture is a fundamental parameter that could affect the detection performance. As presented in Table 3, the authors in Gomez Selvaraj et al. [31], Wu et al. [82], and Kerkech et al. [45] investigated the impact of the selected backbone architecture on the effectiveness of the used algorithms. For example, the authors in Wu et al. [82] showed that YOLOv3 based on the DarkNet architecture as the main feature extractor achieved a better mAP of 64% than the one based on MobileNet, which achieved a relatively smaller mAP of 63.2%. However, the latter provides better processing speed and model size, making it more suitable for small devices with limited computational resources.

7 Comparison

Regarding crop and plant disease detection, computer vision-based methods often vary among three main tasks, which are image classification, object detection, and image segmentation. However, the selection of the appropriate computer vision task depends on the targeted objective. Therefore, in this section, we aim to provide a detailed analysis of the most adopted deep learning-based computer vision categories for crop disease identification to help researchers and farmers select the appropriate models and tools for the most important study cases. Also, different evaluation metrics for computer vision tasks are presented.

7.1 Image classification case

Image classification is one of the first deep learning approaches to be widely used in the field of crop disease detection, where the classifier receives an image as input and tries to assign a label to the entire image from a predefined set of categories. According to Table 3, deep learning-based image classification is the most used technique for crop and plant disease identification from RGB images collected from UAVs that fly at low altitudes.

Case study Deep learning-based image classification models are widely used to recognize different plant and crop diseases. In most cases, the image classification task is utilized to recognize diseases at the leaf level. Thus, most of the reviewed studies adopted image classification techniques to identify plant diseases from UAV-based RGB images collected from low altitudes (Table 3).

Adopted techniques Due to their high effectiveness in the image classification task, CNNs are considered the main deep learning architectures for plant and crop disease identification. According to the reviewed papers, in addition to some custom CNN architectures, AlexNet, VGGNet, ResNet, Inception, and Xception were among the most employed CNN architectures for plant disease classification. For example, the authors in Tetila et al. [75] adopted four state-of-the-art CNN architectures to classify soybean diseases, which are Inception-V3, ResNet-50, VGG-19, and Xception. Other researchers developed their own custom CNN architectures to classify diseases, including the study of Dang et al. [25].

Performance The performance of plant disease identification using image classification techniques is calculated using different evaluation metrics, including accuracy, precision, recall, F1-score, learning error, training time, and inference time. Image classification based on deep learning models provides high recognition rates compared to traditional machine learning models. For example, the authors in Duarte-Carvajalino et al. [28] showed that CNN achieved lower error rates than Random Forest and MLP. Shallow CNN models are among the preferred solutions for plant disease classification from small-size images. For example, the authors in Dang et al. [25] showed the effectiveness of using a shallow CNN architecture with only five convolution layers in identifying fusarium wilt of radish from 64×64 images. The opposite holds for deeper networks, as shown in Gomez Selvaraj et al. [31], where ResNet-50 was adopted as the main CNN classifier: using such deep architectures may result in losing significant information about small objects at the deeper layer levels.

Image classification could be an effective and even faster way when we target single disease identification per image. However, in the case of multiple diseases in the same image, we need more advanced techniques, such as object detection and image segmentation, that can identify multiple diseases in the same image.

7.2 Object detection case

Object detection is another important computer vision task adopted to identify crop and plant diseases from UAV imagery. Unlike image classification, deep learning-based object detection models can classify and localize multiple
diseases that are present in the input image, providing bounding boxes around each detected disease with its appropriate class.

Case study According to the studies available in the literature, object detection-based techniques are mainly adopted to identify diseases in tree crops, including pine and banana (Table 3).

Adopted techniques There are two main object detection categories to detect plant and crop diseases from aerial images, which are two-stage and single-stage algorithms. Due to its high performance compared to other region-based models, Faster R-CNN is the most used two-stage detector, adopted in several studies, including Wu et al. [82] and Yu et al. [83], whereas different YOLO versions are used as the main single-stage detectors. YOLOv3 and YOLOv4 were adopted in Wu et al. [82] and Yu et al. [83], respectively, to identify pine wilt disease. Other studies combined object detection and image classification techniques to improve the model performance. The authors in Gomez Selvaraj et al. [31] used a RetinaNet based on the ResNet-50 architecture to detect banana plants from high-altitude UAV imagery. Then, they cropped the detected plants from the original image and fed them to VGG-16 or a custom CNN architecture to classify each detected plant as diseased or not.

Performance Compared to image classification and image segmentation techniques, object detection-based plant and crop disease identification provides relatively low performance. For example, the detection performance achieved in Gomez Selvaraj et al. [31] using the RetinaNet model (an F1-score of around 84% on the training set and 70% on the test set) could be improved using a larger input data size, but this comes at the cost of longer training time and higher processing power. Thus, one of the major drawbacks of object detection approaches is that their overall performance depends on the flight altitude and also on the adopted feature extractor.

According to Table 3, object detection-based methods are the least adopted techniques for plant and crop disease identification from UAV imagery due to their relatively low performance.

7.3 Image segmentation case

Image segmentation aims to classify each pixel in the image according to its class. In the plant and crop disease identification field, deep learning-based image segmentation techniques provide a more precise location of the detected disease by classifying each pixel in the image according to the disease type. The problem of detecting diseases that are in arbitrary poses and cluttered and/or occluded environments using object detection algorithms remains unsolved, making image segmentation techniques more appropriate in such cases.

Case study Image segmentation techniques based on deep learning models are mostly used to identify plant and crop diseases from multispectral and hyperspectral images collected using UAV platforms that fly at different altitudes.

Adopted techniques Image segmentation algorithms are divided into two main categories: semantic segmentation and instance segmentation. Several studies adopted image segmentation algorithms to identify crop and plant diseases from UAV imagery. For example, the authors in Stewart et al. [71] adopted an instance segmentation algorithm called Mask R-CNN to identify Northern Leaf Blight in maize crops from images acquired through a UAV, achieving an average precision of 96%. However, according to Table 3, semantic segmentation is the most used image segmentation technique, adopted in several studies. FCN, U-Net, SegNet, PSPNet, and DeepLab-V3 are the most adopted semantic segmentation models, but their performance depends on several factors, including the data type, flight altitude, crop type, and disease type.

Performance In the case of plant and crop disease identification from high-altitude UAV imagery, according to Table 3, image segmentation models provide better recognition rates compared to object detection-based techniques. For example, using object detection models, the authors in Wu et al. [82] achieved precision rates varying between 60% and 64% to identify pine disease, whereas, adopting semantic segmentation techniques, the authors in Qin et al. [59] were able to achieve a precision rate between 68% and 86%. However, these results depend not only on the used model but also on the disease type.

Object detection and image segmentation techniques may require more computational power, which UAV platforms do not have onboard. To overcome this problem, several approaches were adopted over the last few years, including online off-board data processing Li et al. [48], Hu et al. [37], offline data processing Bayraktar et al. [13], and lightweight deep learning model implementations Bouguettaya et al. [17]. However, each of these techniques has its pros and cons. For example, the online off-board processing method has some issues in terms of data privacy and security, because we need to transmit the collected data in real time using different communication protocols, whereas offline data processing does not provide real-time solutions in the case of high-resolution data. Also, lightweight versions of deep learning models have lower accuracy compared to large and complex deep learning models.

Deep learning-based crop and plant disease identification through visual data acquired from UAVs may suffer in
terms of performance due to many factors related to the The abbreviations TP, TN, FP, FN in Table 4 denote: -
crop and diseases characteristics. Thus, several techniques True Positives (represent the number of correctly identified
can be used to improve their performance. For example, the diseases); - True Negatives (represent the number of
choice of CNN architectures to extract effective features identified diseases); - False Positives (represent the number
play a crucial role in terms of accuracy and speed. Table 3 of incorrectly identified diseases); - False Negatives (rep-
shows that the selected CNN has a significant impact on the resent the number of non-identified diseases). Also, p0 and
model accuracy. Using different model-related parameters, pe denote the observed proportional agreement and the
like learning rate, optimizers, normalization, and the expected agreement by chance, respectively.
number of epochs, may improve the recognition accuracy.
Also, various bio-inspired meta-heuristic optimization
techniques were proposed over the last few years to 8 Conclusions and future directions
improve the performance of developed models, including
Reptile Search Algorithm (RSA) Abualigah et al. [7], In the present paper, we reviewed the existing deep
Arithmetic Optimization Algorithm (AOA) Abualigah learning-based computer vision methods to identify and
et al. [4], and Aquila Optimizer (AO) Abualigah et al. [6]. classify crop and plant diseases from UAV-based aerial
images. Firstly, we introduced the differet UAV types used
7.4 Evaluation metrics in the agricultural field, including rotary-wing, fixed-wing,
and hybrid VTOL UAVs. Secondly, we highlighted various
To evaluate the performance of deep learning models, camera sensors that could be mounted on UAV platforms
several evaluation metrics were proposed over the years, adopted to identify crop diseases such as RGB, multi-
including accuracy, precision, recall, f1-score, among spectral, hyperspectral, and thermal cameras. Thirdly, dif-
others. Table 4 summarizes the most used metrics to ferent deep learning models used to identify crop and plant
evaluate the performance of the developed deep learning- diseases from aerial images were presented. Finally, we
based computer vision models. investigated the most challenging issues that could face
farmers to detect crop diseases from UAV imagery and
Table 4 List of the most used metrics to evaluate deep learning-based computer vision models

Accuracy: Acc = (TP + TN) / (TP + TN + FP + FN). The accuracy is the most used evaluation metric; it measures how often the developed model makes the correct prediction in the classification task.

Recall (sensitivity): R = TP / (TP + FN). The recall rate indicates how many of the actual instances of a plant disease class the developed model correctly identifies. A higher recall rate means fewer false-negative predictions.

Precision: P = TP / (TP + FP). The precision rate indicates how many of the samples the developed model assigns to a certain disease class actually belong to it, rather than to another disease type. A higher precision rate means fewer false-positive predictions.

F1-score (F-measure): F1 = 2PR / (P + R). The F1-score represents the harmonic mean of the precision and recall rates.

Average Precision (AP): AP = ∫_0^1 P(R) dR. The AP evaluates the performance of an object detection model; it is calculated for each class as the area under the precision-recall curve.

Mean Average Precision (mAP): mAP = (1/N) Σ_{i=1}^{N} AP_i. The mAP is the average of AP over all N classes. When there is a single class, mAP and AP are the same metric.

Kappa coefficient: Kappa = (Po - Pe) / (1 - Pe). The Kappa coefficient is a statistic that measures inter-annotator agreement.

Intersection over Union (IoU): IoU = |Predicted area ∩ Ground truth area| / |Predicted area ∪ Ground truth area|. The IoU measures the overlap between the ground truth area and the predicted area.

FPS: FPS = Number of frames / (Current time - Start time). The FPS is used to compute the detection speed.
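To make the formulas of Table 4 concrete, the following minimal sketch (plain Python with no external dependencies; the function and variable names are our own illustration, not taken from the surveyed works) computes the classification metrics from raw TP/TN/FP/FN counts, the Kappa coefficient from the two agreement probabilities, and the IoU of two axis-aligned boxes:

```python
def classification_metrics(tp, tn, fp, fn):
    """Accuracy, precision, recall, and F1 from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f1


def cohen_kappa(p_observed, p_expected):
    """Kappa = (Po - Pe) / (1 - Pe), with Po the observed agreement
    and Pe the agreement expected by chance."""
    return (p_observed - p_expected) / (1 - p_expected)


def iou(box_a, box_b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)


# Example with invented counts: 80 TP, 90 TN, 10 FP, 20 FN.
acc, p, r, f1 = classification_metrics(tp=80, tn=90, fp=10, fn=20)
print(round(acc, 3), round(p, 3), round(r, 3), round(f1, 3))  # 0.85 0.889 0.8 0.842
print(round(iou((0, 0, 10, 10), (5, 5, 15, 15)), 3))  # 0.143
```

The mAP of Table 4 would then simply be the mean of the per-class AP values, each obtained by integrating the precision-recall curve of one class.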
1314 Cluster Computing (2023) 26:1297–1317
how they can select the appropriate deep learning model and technology to achieve better results.

Developing accurate, real-time, reliable, and autonomous UAV-based systems for plant and crop disease identification is becoming more and more essential in modern agriculture. These systems require complex and efficient algorithms that can overcome the encountered problems and challenges, such as lighting condition changes, disease size, occlusion, and changes in viewpoint, among others. In addition, it is necessary to combine recent deep learning architectures and UAV platforms with advanced technologies to build a system that works efficiently to improve crop productivity. Another major problem that we have to deal with is agricultural data availability. Therefore, we need to collect more data or develop sophisticated algorithms based on generative deep learning architectures to generate realistic datasets.

Author contributions All authors contributed to the study conception and design. Material preparation, data collection and analysis were performed by [AB], [HZ], [AK] and [AMT]. The first draft of the manuscript was written by [AB] and [HZ], and all authors commented on previous versions of the manuscript. All authors read and approved the final manuscript.

Funding The authors declare that no funds, grants, or other support were received during the preparation of this manuscript.

Data availability Data sharing not applicable to this article as no datasets were generated or analyzed during the current study.

Declarations

Conflict of interest The authors declare that they have no conflict of interest.

Consent to participate Not applicable.

References

1. Abdulridha, J., Ampatzidis, Y., Ehsani, R., et al.: Evaluating the performance of spectral features and multivariate analysis tools to detect laurel wilt disease and nutritional deficiency in avocado. Comput. Electron. Agric. 155, 203–211 (2018). https://doi.org/10.1016/j.compag.2018.10.016
2. Abdulridha, J., Ampatzidis, Y., Kakarla, S.C., et al.: Detection of target spot and bacterial spot diseases in tomato using UAV-based and benchtop-based hyperspectral imaging techniques. Precision Agric. 21(5), 955–978 (2020). https://doi.org/10.1007/s11119-019-09703-4
3. Abdulridha, J., Ampatzidis, Y., Qureshi, J., et al.: Laboratory and UAV-based identification and classification of tomato yellow leaf curl, bacterial spot, and target spot diseases in tomato utilizing hyperspectral imaging and machine learning. Remote Sens. (2020). https://doi.org/10.3390/rs12172732
4. Abualigah, L., Diabat, A., Mirjalili, S., et al.: The arithmetic optimization algorithm. Comput. Methods Appl. Mech. Eng. 376, 113609 (2021). https://doi.org/10.1016/j.cma.2020.113609
5. Abualigah, L., Diabat, A., Sumari, P., et al.: Applications, deployments, and integration of internet of drones (IoD): a review. IEEE Sens. J. 21(22), 25532–25546 (2021). https://doi.org/10.1109/JSEN.2021.3114266
6. Abualigah, L., Yousri, D., Abd Elaziz, M., et al.: Aquila optimizer: a novel meta-heuristic optimization algorithm. Comput. Ind. Eng. 157, 107250 (2021). https://doi.org/10.1016/j.cie.2021.107250
7. Abualigah, L., Elaziz, M.A., Sumari, P., et al.: Reptile search algorithm (RSA): a nature-inspired meta-heuristic optimizer. Expert Syst. Appl. 191, 116158 (2022). https://doi.org/10.1016/j.eswa.2021.116158
8. Albani, D., Nardi, D., Trianni, V.: Field coverage and weed mapping by UAV swarms. In: 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 4319–4325 (2017). https://doi.org/10.1109/IROS.2017.8206296
9. Albetis, J., Duthoit, S., Guttler, F., et al.: Detection of flavescence dorée grapevine disease using unmanned aerial vehicle (UAV) multispectral imagery. Remote Sens. (2017). https://doi.org/10.3390/rs9040308
10. Albetis, J., Jacquin, A., Goulard, M., et al.: On the potentiality of UAV multispectral imagery to detect flavescence dorée and grapevine trunk diseases. Remote Sens. (2019). https://doi.org/10.3390/rs11010023
11. Bagheri, N.: Application of aerial remote sensing technology for detection of fire blight infected pear trees. Comput. Electron. Agric. 168, 105147 (2020). https://doi.org/10.1016/j.compag.2019.105147
12. Barbedo, J.G.A.: A review on the use of Unmanned Aerial Vehicles and imaging sensors for monitoring and assessing plant stresses. Drones (2019). https://doi.org/10.3390/drones3020040
13. Bayraktar, E., Basarkan, M.E., Celebi, N.: A low-cost UAV framework towards ornamental plant detection and counting in the wild. ISPRS J. Photogramm. Remote Sens. 167, 1–11 (2020). https://doi.org/10.1016/j.isprsjprs.2020.06.012
14. Beddow, J.M., Pardey, P.G., Chai, Y., et al.: Research investment implications of shifts in the global geography of wheat stripe rust. Nat. Plants 1(10), 1–5 (2015). https://doi.org/10.1038/nplants.2015.132
15. Bohnenkamp, D., Behmann, J., Mahlein, A.K.: In-field detection of yellow rust in wheat on the ground canopy and UAV scale. Remote Sens. (2019). https://doi.org/10.3390/rs11212495
16. Bondre, S., Sharma, A.K.: Review on leaf diseases detection using deep learning. In: 2021 Second International Conference on Electronics and Sustainable Communication Systems (ICESC), pp. 1455–1461 (2021). https://doi.org/10.1109/ICESC51422.2021.9532697
17. Bouguettaya, A., Kechida, A., Taberkit, A.M.: A survey on lightweight CNN-based object detection algorithms for platforms with limited computational resources. Int. J. Inf. Appl. Math. 2(2), 28–44 (2019)
18. Bouguettaya, A., Zarzour, H., Kechida, A., et al.: Recent advances on UAV and deep learning for early crop diseases identification: a short review. In: 2021 International Conference on Information Technology (ICIT), pp. 334–339 (2021). https://doi.org/10.1109/ICIT52682.2021.9491661
19. Bouguettaya, A., Zarzour, H., Kechida, A., et al.: Vehicle detection from UAV imagery with deep learning: a review. In: IEEE Transactions on Neural Networks and Learning Systems, pp. 1–21 (2021). https://doi.org/10.1109/TNNLS.2021.3080276
20. Bouguettaya, A., Zarzour, H., Taberkit, A.M., et al.: A review on early wildfire detection from Unmanned Aerial Vehicles using
deep learning-based computer vision algorithms. Signal Process. 190, 108309 (2022). https://doi.org/10.1016/j.sigpro.2021.108309
21. Card, S.D., Bastías, D.A., Caradus, J.R.: Antagonism to plant pathogens by Epichloë fungal endophytes - a review. Plants (2021). https://doi.org/10.3390/plants10101997
22. Chollet, F.: Xception: deep learning with depthwise separable convolutions. In: 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 1800–1807 (2017). https://doi.org/10.1109/CVPR.2017.195
23. Costa, L., Nunes, L., Ampatzidis, Y.: A new visible band index (VNDVI) for estimating NDVI values on RGB images utilizing genetic algorithms. Comput. Electron. Agric. 172, 105334 (2020). https://doi.org/10.1016/j.compag.2020.105334
24. Dammer, K.H., Garz, A., Hobart, M., et al.: Combined UAV- and tractor-based stripe rust monitoring in winter wheat under field conditions. Agron. J. (2021). https://doi.org/10.1002/agj2.20916
25. Dang, L.M., Wang, H., Li, Y., et al.: Fusarium wilt of radish detection using RGB and near infrared images from Unmanned Aerial Vehicles. Remote Sens. (2020). https://doi.org/10.3390/rs12172863
26. Delavarpour, N., Koparan, C., Nowatzki, J., et al.: A technical study on UAV characteristics for precision agriculture applications and associated practical challenges. Remote Sens. (2021). https://doi.org/10.3390/rs13061204
27. Di Nisio, A., Adamo, F., Acciani, G., et al.: Fast detection of olive trees affected by Xylella fastidiosa from UAVs using multispectral imaging. Sensors (2020). https://doi.org/10.3390/s20174915
28. Duarte-Carvajalino, J.M., Alzate, D.F., Ramirez, A.A., et al.: Evaluating late blight severity in potato crops using Unmanned Aerial Vehicles and machine learning algorithms. Remote Sens. (2018). https://doi.org/10.3390/rs10101513
29. Ganchenko, V., Doudkin, A.: Agricultural vegetation monitoring based on aerial data using convolutional neural networks. Opt. Mem. Neural Netw. 28(2), 129–134 (2019). https://doi.org/10.3103/S1060992X1902005X
30. Gao, D., Sun, Q., Hu, B., et al.: A framework for agricultural pest and disease monitoring based on internet-of-things and Unmanned Aerial Vehicles. Sensors (2020). https://doi.org/10.3390/s20051487
31. Gomez Selvaraj, M., Vergara, A., Montenegro, F., et al.: Detection of banana plants and their major diseases through aerial images and machine learning methods: a case study in DR Congo and Republic of Benin. ISPRS J. Photogramm. Remote Sens. 169, 110–124 (2020). https://doi.org/10.1016/j.isprsjprs.2020.08.025
32. Görlich, F., Marks, E., Mahlein, A.K., et al.: UAV-based classification of cercospora leaf spot using RGB images. Drones (2021). https://doi.org/10.3390/drones5020034
33. Guo, A., Huang, W., Dong, Y., et al.: Wheat yellow rust detection using UAV-based hyperspectral technology. Remote Sens. (2021). https://doi.org/10.3390/rs13010123
34. He, K., Zhang, X., Ren, S., et al.: Deep residual learning for image recognition. In: 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 770–778 (2016). https://doi.org/10.1109/CVPR.2016.90
35. Heidarian Dehkordi, R., El Jarroudi, M., Kouadio, L., et al.: Monitoring wheat leaf rust and stripe rust in winter wheat using high-resolution UAV-based red-green-blue imagery. Remote Sens. (2020). https://doi.org/10.3390/rs12223696
36. Hu, G., Wu, H., Zhang, Y., et al.: A low shot learning method for tea leaf's disease identification. Comput. Electron. Agric. 163, 104852 (2019). https://doi.org/10.1016/j.compag.2019.104852
37. Hu, G., Yin, C., Wan, M., et al.: Recognition of diseased Pinus trees in UAV images using deep learning and AdaBoost classifier. Biosys. Eng. 194, 138–151 (2020). https://doi.org/10.1016/j.biosystemseng.2020.03.021
38. Hu, G., Zhu, Y., Wan, M., et al.: Detection of diseased pine trees in unmanned aerial vehicle images by using deep convolutional neural networks. Geocarto Int. (2021). https://doi.org/10.1080/10106049.2020.1864025
39. Huang, H., Deng, J., Lan, Y., et al.: Detection of helminthosporium leaf blotch disease based on UAV imagery. Appl. Sci. (2019). https://doi.org/10.3390/app9030558
40. Jégou, S., Drozdzal, M., Vazquez, D., et al.: The one hundred layers tiramisu: fully convolutional DenseNets for semantic segmentation. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, pp. 11–19 (2017)
41. Jiang, F., Lu, Y., Chen, Y., et al.: Image recognition of four rice leaf diseases based on deep learning and support vector machine. Comput. Electron. Agric. 179, 105824 (2020). https://doi.org/10.1016/j.compag.2020.105824
42. Ju, C., Son, H.I.: Multiple UAV systems for agricultural applications: control, implementation, and evaluation. Electronics (2018). https://doi.org/10.3390/electronics7090162
43. Kerkech, M., Hafiane, A., Canals, R.: Deep leaning approach with colorimetric spaces and vegetation indices for vine diseases detection in UAV images. Comput. Electron. Agric. 155, 237–243 (2018). https://doi.org/10.1016/j.compag.2018.10.006
44. Kerkech, M., Hafiane, A., Canals, R.: VddNet: vine disease detection network based on multispectral images and depth map. Remote Sens. (2020). https://doi.org/10.3390/rs12203305
45. Kerkech, M., Hafiane, A., Canals, R.: Vine disease detection in UAV multispectral images using optimized image registration and deep learning segmentation approach. Comput. Electron. Agric. 174, 105446 (2020). https://doi.org/10.1016/j.compag.2020.105446
46. Khalid, B., Akram, M.U., Khan, A.M.: Multistage deep neural network framework for people detection and localization using fusion of visible and thermal images. In: El Moataz, A., Mammass, D., Mansouri, A., et al. (eds.) Image Signal Process., pp. 138–147. Springer International Publishing, Cham (2020)
47. Khanal, S., Kc, K., Fulton, J.P., et al.: Remote sensing in agriculture-accomplishments, limitations, and opportunities. Remote Sens. (2020). https://doi.org/10.3390/rs12223783
48. Li, Y., Qian, M., Liu, P., et al.: The recognition of rice images by UAV based on capsule network. Clust. Comput. 22(4), 9515–9524 (2019). https://doi.org/10.1007/s10586-018-2482-7
49. Li, D., Sun, X., Elkhouchlaa, H., et al.: Fast detection and location of Longan fruits using UAV images. Comput. Electron. Agric. 190, 106465 (2021). https://doi.org/10.1016/j.compag.2021.106465
50. Liu, L., Dong, Y., Huang, W., et al.: Monitoring wheat fusarium head blight using unmanned aerial vehicle hyperspectral imagery. Remote Sens. (2020). https://doi.org/10.3390/rs12223811
51. Lytridis, C., Kaburlasos, V.G., Pachidis, T., et al.: An overview of cooperative robotics in agriculture. Agronomy (2021). https://doi.org/10.3390/agronomy11091818
52. Martinez-Alpiste, I., Golcarenarenji, G., Wang, Q., et al.: Search and rescue operation using UAVs: a case study. Expert Syst. Appl. 178, 114937 (2021). https://doi.org/10.1016/j.eswa.2021.114937
53. Moysiadis, V., Sarigiannidis, P., Vitsas, V., et al.: Smart farming in Europe. Comput. Sci. Rev. 39, 100345 (2021). https://doi.org/10.1016/j.cosrev.2020.100345
54. Neupane, K., Baysal-Gurel, F.: Automatic identification and monitoring of plant diseases using Unmanned Aerial Vehicles: a review. Remote Sens. (2021). https://doi.org/10.3390/rs13193841
55. Ouhami, M., Hafiane, A., Es-Saady, Y., et al.: Computer vision, IoT and data fusion for crop disease detection using machine
learning: a survey and ongoing research. Remote Sens. (2021). https://doi.org/10.3390/rs13132486
56. Pan, Q., Gao, M., Wu, P., et al.: A deep-learning-based approach for wheat yellow rust disease recognition from unmanned aerial vehicle images. Sensors (2021). https://doi.org/10.3390/s21196540
57. Pineda, M., Barón, M., Pérez-Bueno, M.L.: Thermal imaging for plant stress detection and phenotyping. Remote Sens. (2021). https://doi.org/10.3390/rs13010068
58. Pittu, V.R., Gorantla, S.R.: Diseased area recognition and pesticide spraying in farming lands by multicopters and image processing system. J. Eur. Syst. Autom. 53(1), 123–130 (2020)
59. Qin, J., Wang, B., Wu, Y., et al.: Identifying pine wood nematode disease using UAV images and deep learning algorithms. Remote Sens. (2021). https://doi.org/10.3390/rs13020162
60. Raeva, P.L., Šedina, J., Dlesk, A.: Monitoring of crop fields using multispectral and thermal imagery from UAV. Eur. J. Remote Sens. 52(sup1), 192–201 (2019). https://doi.org/10.1080/22797254.2018.1527661
61. Rahman, M.F.F., Fan, S., Zhang, Y., et al.: A comparative study on application of unmanned aerial vehicle systems in agriculture. Agriculture (2021). https://doi.org/10.3390/agriculture11010022
62. Raj, M., Gupta, S., Chamola, V., et al.: A survey on the role of internet of things for adopting and promoting agriculture 4.0. J. Netw. Comput. Appl. 187, 103107 (2021). https://doi.org/10.1016/j.jnca.2021.103107
63. Reddy Maddikunta, P.K., Hakak, S., Alazab, M., et al.: Unmanned Aerial Vehicles in smart agriculture: applications, requirements, and challenges. IEEE Sens. J. 21(16), 17608–17619 (2021). https://doi.org/10.1109/JSEN.2021.3049471
64. Shahzaad, B., Bouguettaya, A., Mistry, S., et al.: Resilient composition of drone services for delivery. Futur. Gener. Comput. Syst. 115, 335–350 (2021). https://doi.org/10.1016/j.future.2020.09.023
65. Shi, Y., Han, L., Kleerekoper, A., et al.: Novel cropdocnet model for automated potato late blight disease detection from unmanned aerial vehicle-based hyperspectral imagery. Remote Sens. (2022). https://doi.org/10.3390/rs14020396
66. Siebring, J., Valente, J., Domingues Franceschini, M.H., et al.: Object-based image analysis applied to low altitude aerial imagery for potato plant trait retrieval and pathogen detection. Sensors (2019). https://doi.org/10.3390/s19245477
67. Simonyan, K., Zisserman, A.: Very deep convolutional networks for large-scale image recognition (2014). arXiv:1409.1556
68. Sirohi, A., Malik, A., Luhach, A.K., et al.: A review on various deep learning techniques for identification of plant diseases. In: International Conference on Advanced Informatics for Computing Research, pp. 487–498. Springer, Berlin (2020). https://doi.org/10.1007/978-981-16-3660-8_46
69. Sishodia, R.P., Ray, R.L., Singh, S.K.: Applications of remote sensing in precision agriculture: a review. Remote Sens. 12(19) (2020). https://doi.org/10.3390/rs12193136
70. Song, B., Park, K.: Detection of aquatic plants using multispectral UAV imagery and vegetation index. Remote Sens. (2020). https://doi.org/10.3390/rs12030387
71. Stewart, E.L., Wiesner-Hanks, T., Kaczmar, N., et al.: Quantitative phenotyping of northern leaf blight in UAV images using deep learning. Remote Sens. (2019). https://doi.org/10.3390/rs11192209
72. Su, J., Yi, D., Su, B., et al.: Aerial visual perception in smart farming: field study of wheat yellow rust monitoring. IEEE Trans. Ind. Inf. 17(3), 2242–2249 (2021). https://doi.org/10.1109/TII.2020.2979237
73. Sujatha, R., Chatterjee, J.M., Jhanjhi, N., et al.: Performance of deep learning vs machine learning in plant leaf disease detection. Microprocess. Microsyst. 80, 103615 (2021). https://doi.org/10.1016/j.micpro.2020.103615
74. Szegedy, C., Vanhoucke, V., Ioffe, S., et al.: Rethinking the inception architecture for computer vision. In: 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 2818–2826 (2016). https://doi.org/10.1109/CVPR.2016.308
75. Tetila, E.C., Machado, B.B., Menezes, G.K., et al.: Automatic recognition of soybean leaf diseases using UAV images and deep convolutional neural networks. IEEE Geosci. Remote Sens. Lett. 17(5), 903–907 (2020). https://doi.org/10.1109/LGRS.2019.2932385
76. Thangaraj, R., Anandamurugan, S., Pandiyan, P., et al.: Artificial intelligence in tomato leaf disease detection: a comprehensive review and discussion. J. Plant Dis. Prot. (2021). https://doi.org/10.1007/s41348-021-00500-8
77. Théau, J., Gavelle, E., Ménard, P.: Crop scouting using UAV imagery: a case study for potatoes. J. Unmanned Veh. Syst. 8(2), 99–118 (2020). https://doi.org/10.1139/juvs-2019-0009
78. Vishnoi, V.K., Kumar, K., Kumar, B.: Plant disease detection using computational intelligence and image processing. J. Plant Dis. Prot. 128(1), 19–53 (2021). https://doi.org/10.1007/s41348-020-00368-0
79. Wiesner-Hanks, T., Stewart, E.L., Kaczmar, N., et al.: Image set for deep learning: field images of maize annotated with disease symptoms. BMC Res. Notes 11(1), 1–3 (2018). https://doi.org/10.1186/s13104-018-3548-6
80. Wiesner-Hanks, T., Wu, H., Stewart, E., et al.: Millimeter-level plant disease detection from aerial photographs via deep learning and crowdsourced data. Front. Plant Sci. 10, 1550 (2019). https://doi.org/10.3389/fpls.2019.01550
81. Wu, H., Wiesner-Hanks, T., Stewart, E.L., et al.: Autonomous detection of plant disease symptoms directly from aerial imagery. Plant Phenome J. 2(1), 1–9 (2019). https://doi.org/10.2135/tppj2019.03.0006
82. Wu, B., Liang, A., Zhang, H., et al.: Application of conventional UAV-based high-throughput object detection to the early diagnosis of pine wilt disease by deep learning. For. Ecol. Manag. 486, 118986 (2021). https://doi.org/10.1016/j.foreco.2021.118986
83. Yu, R., Luo, Y., Zhou, Q., et al.: Early detection of pine wilt disease using deep learning algorithms and UAV-based multispectral imagery. For. Ecol. Manag. 497, 119493 (2021). https://doi.org/10.1016/j.foreco.2021.119493
84. Zhang, X., Han, L., Dong, Y., et al.: A deep learning-based approach for automated yellow rust disease detection from high-resolution hyperspectral UAV images. Remote Sens. (2019). https://doi.org/10.3390/rs11131554
85. Zhang, H., Zhang, B., Wei, Z., et al.: Lightweight integrated solution for a UAV-borne hyperspectral imaging system. Remote Sens. 12(4) (2020). https://doi.org/10.3390/rs12040657
86. Zhang, N., Yang, G., Pan, Y., et al.: A review of advanced technologies and development for hyperspectral-based plant disease detection in the past three decades. Remote Sens. (2020). https://doi.org/10.3390/rs12193188
87. Zhang, T., Xu, Z., Su, J., et al.: Ir-unet: irregular segmentation u-shape network for wheat yellow rust detection by UAV multispectral imagery. Remote Sens. (2021). https://doi.org/10.3390/rs13193892

Publisher's Note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.