0% found this document useful (0 votes)
8 views

Deep Learning Techniques for Weed Detection in Agricultural Environments a Comprehensive Review

This document is a comprehensive review of deep learning techniques for weed detection in agricultural environments, highlighting the challenges posed by the similarities between crops and weeds. It discusses various machine learning methods, the integration of AI with IoT technologies, and advancements in precision agriculture that enhance weed management. The review emphasizes the potential of deep learning to improve accuracy in weed identification and the importance of sustainable practices in agriculture to address challenges such as resource scarcity and climate change.

Uploaded by

mmk.info25
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PDF, TXT or read online on Scribd
0% found this document useful (0 votes)
8 views

Deep Learning Techniques for Weed Detection in Agricultural Environments a Comprehensive Review

This document is a comprehensive review of deep learning techniques for weed detection in agricultural environments, highlighting the challenges posed by the similarities between crops and weeds. It discusses various machine learning methods, the integration of AI with IoT technologies, and advancements in precision agriculture that enhance weed management. The review emphasizes the potential of deep learning to improve accuracy in weed identification and the importance of sustainable practices in agriculture to address challenges such as resource scarcity and climate change.

Uploaded by

mmk.info25
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PDF, TXT or read online on Scribd
You are on page 1/ 22

Received 6 April 2024, accepted 17 June 2024, date of publication 24 June 2024, date of current version 26 August 2024.

Digital Object Identifier 10.1109/ACCESS.2024.3418454

Deep Learning Techniques for Weed Detection in


Agricultural Environments: A Comprehensive
Review
DEEPTHI G PAI , RADHIKA KAMATH , AND
MAMATHA BALACHANDRA , (Senior Member, IEEE)
Department of Computer Science and Engineering, Manipal Institute of Technology, Manipal Academy of Higher Education, Manipal 576104, India
Corresponding author: Mamatha Balachandra ([email protected])
This work was supported in part by the Dr. TMA Pai Research Scholarship under Grant 230900101-2023; and in part by the
Department of Computer Science and Engineering, Manipal Institute of Technology, Manipal Academy of Higher Education.

ABSTRACT Agriculture has been completely transformed by Deep Learning (DL) techniques, which allow
for quick object localization and detection. However, because weeds and crops are similar in color, form,
and texture, weed detection and categorization can be difficult. Advantages in object detection, recognition,
and image classification can be obtained with deep learning (DL), a vital aspect of machine learning (ML).
Because crops and weeds are similar, ML techniques have difficulty extracting and choosing distinguishing
traits. This literature review demonstrates the potential of various DL methods for crop weed identification,
localization, and classification. This research work investigates the present status of Deep Learning based
weed identification and categorization systems. The majority of research employs supervised learning
strategies, polishing pre-trained models on sizable, labeled datasets to achieve high accuracy. Innovations
are driven by the need for sustainable weed management methods, and deep learning is demonstrating
encouraging outcomes in image-based weed detection systems. To solve issues like resource scarcity,
population increase, and climate change, precision agriculture holds great promise for the integration of AI
with IoT-enabled farm equipment.

INDEX TERMS Agriculture, artificial intelligence, deep learning, weed detection, neural networks.

NOMENCLATURE M− Unet Multispectral U-net.


AI Artificial Intelligence. ML Machine Learning.
ASFF Adaptive Spatial Feature Fusion. MSR Multi Scale Retinex.
CBAM Convolutional Block Attention Module. MT− Unet Multispectral Thermal U-net.
CFFI Channel Feature Fusion with Involution. RCNN Regional Convolutional Neural Network.
CLAHE Contrast Limited Adaptive Histogram RF Random Forest.
Equalization. Soft− NMS Soft-Non-Maximum Suppression.
DCGAN Deep Convolutional Generative SSR Single Scale Retinex.
Adver-sarial Network. SV M Support Vector Machine.
DCNN Deep Convolutional Neural Network. XGB eXtreme Gradient Boosting.
DL Deep Learning.
FPN Feature Pyramid Network. I. INTRODUCTION
GAN Generative Adversarial Network. Agriculture is the art and science of cultivating soil for the
KNN K Nearest Neighbor. growth of crops that will supply people with food, fiber, and
other goods to buy and consume [1], [2]. The global populace
The associate editor coordinating the review of this manuscript and is predicted to rise quickly to nine billion people by the
approving it for publication was Claudio Zunino. year 2050. To meet the projected demand, agricultural output
2024 The Authors. This work is licensed under a Creative Commons Attribution 4.0 License.
VOLUME 12, 2024 For more information, see https://creativecommons.org/licenses/by/4.0/ 113193
D. G Pai et al.: DL Techniques for Weed Detection in Agricultural Environments: A Comprehensive Review

needs to expand by nearly 70% [3]. Nevertheless, agriculture also have disadvantages [13]. Herbicide resistance, environ-
encounters several substantial challenges, including the risk mental contamination, and ecological imbalance can result
of diseases, a critical shortage of cultivable land and water from using the same class of herbicides repeatedly over time.
resources, and the impact of a changing climate as well as Overuse can lead to herbicide-resistant weed populations,
threats from weeds and pests [4]. Implementing intelligent which lowers farmland biodiversity and dominates hard-
farming practices is essential to addressing the problems asso- to-control weed species in agricultural settings. Negative
ciated with agricultural production including sustainability, side effects from chemical pesticides include contaminating
food security, productivity, and environmental impact [5]. ground and surface waters and releasing residues into the
Plants that spread quickly and negligently are known as food chain [14]. This puts the long-term sustainability of
weeds, and they can negatively impact crop production and the farming industry and biodiversity conservation at risk by
quality [6]. They contend with crops for nutrients, water, increasing contamination of the environment from agricul-
sunlight, and growth space, necessitating farmers to allo- tural chemical inputs.
cate resources to control them [7]. To mitigate the impact Because of the higher bulk density and compaction of
of weeds, various management tactics are employed, which topsoil, little tillage or non-tillage can also raise the phy-
can be categorized into five main groups according to [8]: totoxicity of the soil. Reducing tillage may force farmers
‘‘preventative’’ (preventing weed establishment), ‘‘cultural’’ to use additional pesticides and herbicides to counter these
(preserving field cleanliness) to reduce the weed seed bank), risks. The criteria of soil quality, such as biological diver-
‘‘mechanical’’ (utilizing techniques such as mulching, till- sity, soil structure, and water storage capacity, are negatively
ing, and cutting), ‘‘biological’’ (employing natural enemies impacted by overuse of tillage. Tillage causes soil erosion
like insects, grazing animals, or diseases), and ‘‘chemical’’ and degradation by depriving microorganisms of carbon and
(using herbicides). Despite their effectiveness, each strat- nitrogen resources. This leads to an increase in agricultural
egy has drawbacks, typically being costly, time-consuming, contamination of the environment.
and labor-intensive. Additionally, control measures may have There are restrictions on other ground cover techniques
adverse effects on the health of humans, plants, soil, animals, as well, like fire, mulching, and cattle grazing. Mulching
or the environment [9]. Until now, various techniques and can induce soil alterations, be costly, and have allelopathic
technologies have been employed for weed detection. impacts on crops if certain organic mulches are used [15].
Living mulches compete with other plants for nutrients and
II. TRADITIONAL METHODS AND CHALLENGES IN WEED water, they can stunt crop development and yield and raise
REMOVAL the danger of disease and pest infestation. Livestock grazing
1) Manual Inspection: Traditional weed detection involves can disperse weed seeds, harm non-target species and the
labor-intensive and time-consuming manual inspection soil’s structure, and even result in an animal’s condition or
and removal by human workers. Despite its drawbacks, liveweight being lost.
this method is still utilized in certain situations. Precision Weed Management (PWM) technology can be
2) Chemical herbicides: They are frequently used in agri- integrated to reduce or eliminate these constraints, opening
culture to suppress weeds, however, their application the door where precision is the norm [16].
can have detrimental effects on the environment and be
non-selective, which could hurt crops as well as weeds. III. TECHNOLOGICAL PROGRESS IN WEED DETECTION
3) Mechanical weed control techniques: Plowing, tilling, In expansive agricultural regions, the use of remote sensing
and mechanical weeding equipment, are useful for get- technology—such as drones and satellite imaging is essen-
ting rid of weeds from fields, but they can be inaccurate tial for identifying and monitoring weed infestations [17].
and harm crops. By differentiating between weeds and crops, computer vision
4) Crop rotation: To increase soil fertility, nutrient levels, and machine learning enhance weed detection, with accuracy
and control weeds and pests, crop rotation is the prac- being continuously enhanced through training [18].
tice of growing various crops one after another. The With cameras and automated equipment, robotic weed
distinct growth requirements of various crops, however, eaters provide real-time weed detection and elimination,
may present difficulties for farmers. which might cut down on manual labor and the need for
5) Mulching: To efficiently inhibit the growth of weeds, herbicides [19], [20], [21], [22].
organic materials such as leaves, wood chips, or straw With the use of precision agricultural technologies, such
are added to the soil [10], [11], [12]. as GPS-guided equipment, farmers may spray herbicides
selectively, sparing areas that are not affected by weed infes-
Figure 1 depicts various weed management techniques. tations while focusing on those that are instruments for
chemical sensing that identify biological fingerprints, such
A. LIMITATIONS OF TRADITIONAL WEED CONTROL as changes in chlorophyll content, indicate the presence of
STRATEGIES weeds [23], [24].
Herbicides work well to manage weeds, however because Additionally, weed management apps utilizing image
of the shortcomings of traditional spraying systems, they recognition technology assist farmers in identifying and

113194 VOLUME 12, 2024


D. G Pai et al.: DL Techniques for Weed Detection in Agricultural Environments: A Comprehensive Review

FIGURE 1. Various weed management techniques [26].

FIGURE 4. Solar powered weeding robot [31].


managing weeds, often based on photos provided by
users [25].
a Raspberry Pi microcontroller and achieved 92.9% accuracy
A. ROBOTIC TECHNOLOGY in identifying sugarcane crops among various weed species.
Expected to transform farming, the agriculture robot, or agri- Reference [33] demonstrated the Adigo robot platform for
bot, is driving an exponential increase in global investment autonomous herbicide application. The Ladybird robot from
and research in robotics, science, and engineering [27]. the University of Sydney, equipped with a spraying end actu-
Robots that carry out in-field weeding operations using com- ator and a machine learning algorithm, effectively controls
puter vision techniques are shown in Figure 2, 3, 4 and 5. weeds with targeted herbicide application.

FIGURE 2. BoniRob terrestrial robot [28].

FIGURE 5. Agricultural robotic platform with four wheel steering for weed
detection [20].

[34] developed the AgBotII, a modular weeding robot that


identifies crops and weeds using image processing techniques
and removes weeds with different tools. Reference [35]
merged a multifunctional agricultural automated terrain vehi-
cle with the aerial survey capability of a small UAV to
achieve thorough weed management. Reference [36] pro-
posed a weeding robot that navigates autonomously in paddy
FIGURE 3. Tertill weeding robot [29]. fields, disrupting soil to remove weeds and inhibiting their
growth. The Sinobot prototype, equipped with independently
Several researchers have made progress in the devel- steered wheels, was designed for weeding and route planning.
opment of robotic systems for controlling and detecting These advancements indicate progress in robotic weed
weeds, although the practical application is still a significant control, but practical implementation remains challenging.
challenge [30].
Reference [32] created a robot with dual-gimbal capabil- B. PRECISION AGRICULTURE AND AI INTEGRATION
ities, successfully identifying and targeting weeds indoors, Precision agriculture combines the advancements of the
achieving a high hit rate of 97% with specific laser param- information age with an established agricultural sector [37].
eters. Reference [22] designed a weed-detecting robot using It serves as a comprehensive crop management system,

VOLUME 12, 2024 113195


D. G Pai et al.: DL Techniques for Weed Detection in Agricultural Environments: A Comprehensive Review

aiming to align input types and quantities with the specific frameworks to create accurate herbicide application maps.
requirements of small sections within a farm field. Color analysis methods have been implemented for detecting
Although the objective is not novel, recent technologi- specific weed types in various environmental conditions effi-
cal developments have made precision agriculture feasible cacy and limits of unmanned aerial vehicle (UAV) technology
in practical farming scenarios. Precision agriculture is fre- for weed seedling detection as affected by sensor resolution.
quently identified by the enabling technologies, commonly For example, in a vineyard field, a quadcopter UAV with
known as GPS (Global Positioning System) agriculture RGB photos mapped weed patches using an OBIA approach,
or variable-rate farming [38]. Despite the significance of optimizing site-specific weed control. Reference [50] argued
devices, it becomes evident upon reflection that information that the md4-1000 quadcopter employs a weed mapping rule
is the crucial element for achieving accuracy in farming set method to categorize crop rows, differentiate between
practices. crop plants and weeds, and develop a weed infestation map.
To enhance the efficiency of modern agriculture, the inte- This technique aims to reduce herbicide applications by tai-
gration of drones for aerial applications is crucial. This loring doses based on observed weed infestation levels Weed
approach standardizes chemical spraying processes and mapping in early-season maize fields using object-based
addresses the labor shortage in rural areas. The use of drones analysis of unmanned aerial vehicle (UAV) images. Figure 6
ensures precise deposition of products on target areas, min- depicts an image of a drone utilized for data collection pro-
imizing environmental losses. Reference [39] proposed that cedure in agricultural domain.
UAVs enable the monitoring of individual plants and weed
patches, a capability previously unavailable.
References [40] and [41] presented a method involving
UAV imagery to apply herbicides selectively, demonstrating
the identification of weeds in row crops through aerial image
analysis.
The concept put out by [42] suggests that weed man-
agement tactics have evolved to use drones equipped with FIGURE 6. Pictures taken of the drone at various points during the data
cameras and Geographic Information Systems (GIS). collection procedure [51].

Improved results may be achieved by optimizing agri-


cultural activities linked to weed detection and eradication The detection capability of algorithms, indicating the accu-
through the combination of drones, robots, artificial intelli- racy in classifying pixels as crops or weeds, reaches 91% with
gence, and sensors, as proposed by [43], [44], [45], and [46]. a spatial resolution of 21.6 mm/pixel.
Reference [47] argued that technology not only reduces Reference [52] carried out research on UAVs equipped
manual labor but also enhances food quality by utilizing with visible and multispectral cameras, utilizing automated
drones for various agricultural purposes. OBIA approaches, effectively map weed patches such as
Johnsongrass.
Overall, the integration of drones and advanced imaging
C. UTILIZING DRONES FOR WEED CONTROL technologies enhances the precision and efficiency of weed
The assertion made by [48] in the area of weed management management in agriculture.
is that drones play a crucial role in detecting and identifying
weed patches efficiently. They use near-infrared and visible IV. SENSOR TECHNIQUE IN WEED CONTROL
light for crop condition assessment, offering a significant Reference [53] focused on digital video cameras called
advantage of reduced surveying time, especially among crop Robocrop inter-row and Robocrop Inrow® that are used in
rows. The capacity of UAVs to cover large areas quickly and agricultural techniques to control weeds particular to a given
generate photographic images facilitates weed patch identi- place. They help to reduce the amount of herbicide used
fication. The processing of these images involves advanced in spraying applications and to steer mechanical weeding
technologies such as deep neural networks and convolutional equipment. These methods use shape, color, and crop row
neural networks. spacing data to increase classification rates for transplanted
RGB, multispectral, and hyperspectral cameras are the crops. Variable herbicide rates depend on online sensors
three primary types of cameras used in [49]’s research on for weed detection. Comparing field trials with traditional
UAV-based weed identification. Still, other parameters like application, cereal, and pea trials revealed average pesticide
drone kind, flight height, and camera resolution affect how savings of 24.6%, no yield reduction, and no variations in
well these cameras identify weed patches. Differentiating weed density between lowered and standard dosage areas.
between crop seedlings and weeds is crucial for designing an
effective automated weed management system. Specific UAV A. NON IMAGING SENSORS
models equipped with GPS and cameras, like the md4-1000 Using spectral and height features, non-imaging sensors (e.g.,
quadcopter, are employed for weed detection and mapping. spectrometers and fluorescence sensors) quantify weed spots
These systems utilize object-based image analysis (OBIA) in fields [17].

113196 VOLUME 12, 2024


D. G Pai et al.: DL Techniques for Weed Detection in Agricultural Environments: A Comprehensive Review

From the ultraviolet (UV) to the near-infrared (NIR), spec- Reference [56] classified soil with 100% accuracy and
trum analyzers measure the strength of reflections at different detected weeds with over 90% accuracy in other circum-
electromagnetic spectrum wavelengths [54]. Although they stances. These systems are not yet commercially available,
can’t tell different species of plants apart, they can provide despite their shown ability to distinguish between different
information that can help separate plants from soil. The species of weeds. Certain devices, like the H-sensor, use
reflectance of bare soil increases linearly from blue to near- different pictures of the red and infrared wavebands taken
infrared light, whereas green leaves have low reflectivity in under active illumination to implement shape-based species
the red and blue spectra and high reflectance in the green and discrimination.
near-infrared wavelengths.
A plant’s spectrum response varies with its growth stage, V. WEED MANAGEMENT APPLICATIONS
and the signal that is received is a blend of various plant Below are the few innovative weed management apps that
species and soil composition. Approaches to spectral iden- bring enhanced control right to our fingertips [57].
tification are intricate and necessitate appropriate prior 1) Site of Action Lookup Tool:
knowledge, which is unavailable in the field. For weed Purpose: Swiftly identify the site of action (SOA)
identification and quantification, chemometrics works well; of commonly used herbicides and diversify your
however, it is not effective for weed detection. approach.
Optoelectronic sensors distinguish between the presence Available on: Android, iPhone, iPad
and absence of plants by focusing on particular spectral 2) ID Weeds:
bands in the red/near-infrared (R/NIR) spectrum. In rows Purpose: Quickly and easily identify weeds with this
of crops, these sensors can identify weeds in between the app from the University of Missouri, offering a list of
rows. To calculate an index similar to the NDVI, commercial suspects based on characteristics.
sensors evaluate reflectance characteristics in the NIR and Available on: Android, iPhone, iPad
R wavelengths. 3) Windfinder:
The DetectSpray spot-spraying system and the Weed- Purpose: A weather app displaying wind speed and
Seeker, GreenSeeker, WEEDit, and Crop Circle ACS-470 are direction, crucial information for spray preparation.
a few examples. When paired with a sprayer, these active Available on: Android, iPhone, iPad
sensors indicate a high level of vegetation cover. Underes- 4) Calibrate My Sprayer:
timating weed coverage is the most common inaccuracy that Purpose: User-friendly app by Clemson University for
has been reported. sprayer calibration, optimizing weed control and mini-
Because of flavonol anthocyanins, polyphenols, and mizing crop damage.
chlorophyll, plants’ leaves generate fluorescent light, which Available on: Android, iPhone, iPad
is detected by fluorescence sensors. While chlorophyll a and 5) Agrian:
b emit fluorescence in the red to the far-red range, UV light Purpose: Access chemical labels, including supple-
causes blue-green fluorescence (BGF) to be stimulated in mental labels and updates, quickly. Note: Information
leaves. Identification of plants can be done using the ratio covers the entire U.S., and product registration varies
of BGF to CLa fluorescence, which has a strong relationship by state.
with plant species. Available on: Android, iPhone, iPad
6) Mix Tank:
Purpose: Determine the right order for adding products
B. IMAGING SENSORS to the spray tank for compatibility, featuring integrated
For more than thirty years, the use of image sensors for weather data and GPS information in spray logs.
weed identification has been an important area of research. Available on: Android, iPhone, iPad
In agricultural fields, portable imaging and analysis tools like 7) SpraySelect:
RGB sensors and NDVI cameras have been used to identify Purpose: Easily select the appropriate spray tip by
weed patches, distinguish weeds from crops, and identify entering speed, tip spacing, and target rate, providing
various weed species. The procedure entails capturing dig- a list of recommended tips.
ital images, segmenting them, and then extracting plant Available on: Android, iPhone, iPad
properties.
Using red and NIR wavelengths, [55] created a bi-spectral VI. DEEP LEARNING
camera to identify different species of weeds. They produced By incorporating hierarchical functions and adding depth to
high-resolution images with pixel sizes of 0.23 mm and a data, deep learning (DL) is a technique that increases the
classification accuracy of 95%. They also employed RGB complexity of machine learning (ML). Because of its intricate
imagery and the active shape models (ASM) matching tech- models, which enable large parallelization, it is very good at
nique to get comparable outcomes. The RGB color space addressing complicated issues [58]. When extensive datasets
was converted to HSI values to apply the color co-occurrence are available, deep learning (DL) can improve classification
method (CCM) for species differentiation. accuracy or decrease errors in regression studies. Depending

VOLUME 12, 2024 113197


D. G Pai et al.: DL Techniques for Weed Detection in Agricultural Environments: A Comprehensive Review

on which network architecture is being utilized, DL might effects, precision agriculture, grounded on deep learn-
consist of different components (Unsupervised Pre-trained ing, allows for the targeted application of resources
Networks, Convolutional Neural Networks, Recurrent Neural (fertilizers, herbicides, and water).
Networks, and Recursive Neural Networks). It can handle 9) Market Analysis and Decision Making: Farmers may
many different complicated data analysis problems due to make well-informed choices about crop selection and
its huge learning capacity and hierarchical structure [59]. production by employing deep learning to analyze mar-
Although natural language processing (DL) is widely used in ket trends, pricing data, and consumer preferences.
systems that work with raster-based data, it may be used with Figure 8 depicts the application of Deep Learning in
any type of data, including speech, audio, natural language, Agriculture.
weather data, and soil chemistry. Figure 7 depicts the CNN
architecture.

FIGURE 7. CNN architecture [60].

VII. IMPORTANCE OF DEEP LEARNING IN AGRICULTURE


Deep learning has found many applications in agriculture and
has changed various aspects of the field as mentioned below:
1) Crop monitoring and yield forecasting: Deep learning FIGURE 8. Deep learning in agriculture.

models process data from drones, satellites, and IoT


devices to monitor crop health, detect disease, estimate
yields, and optimize irrigation and fertilization. VIII. EVOLUTION OF DEEP LEARNING IN WEED
2) Weed and pest detection: Deep learning algorithms DETECTION
help identify and differentiate plants from unwanted Reference [61] quoted that the gathering of weed data and
plants (weeds) or pests, enabling targeted and precise weed management strategies are determined by sensing tech-
management strategies. nologies. Weed data is essential for creating and comparing
3) Crop disease detection: Deep learning models employ weed identification techniques.
plant image analysis to identify illnesses early on, Thanks to developments in imaging techniques includ-
allowing for prompt intervention to avoid crop loss. ing multispectral imaging, near-infrared imaging, and depth
4) Soil Health Management in Harvest Automation: Tech- imaging, interest in image-based weed identification has
nologies such as harvest automation and soil health increased. The development of novel algorithms for weed
management, identify ripe crops and suggest crops that identification tasks is facilitated by the availability of exten-
are appropriate for specific soil types, increase agricul- sive public datasets in the field [62], [63], [64].
tural production, and lower labor expenses. Figure 9 depicts various approaches considered in weed
5) Climate Forecasting and Management: To forecast detection using deep learning.
climate change, deep learning models examine past Although the public datasets provide useful annotation
weather patterns and historical data. This information data and photos for benchmarking, they are not consistent
helps farmers decide when to plant and harvest their in terms of metadata reporting requirements or contextual
crops. information. Comprehending the types of weeds is essential
6) Optimization of supply chains: Deep learning enhances to creating weed management strategies that work.
distribution efficiency, cuts waste, and optimizes sup- In weed control scenarios, annotated dataset construction is
ply chains by evaluating a variety of data points, such time-consuming and can result in overfitting and inadequate
as demand forecasting and transportation logistics. diversity. Several data augmentation techniques, such as rota-
7) Genomics and breeding: By forecasting desired traits tion, random cropping, and generative approaches, have been
and genetic combinations, deep learning assists in used to improve the quantity and quality of training sets to
genotype and phenotype prediction and speeds up agri- solve this.
cultural breeding procedures. Depending on the identification method, different criteria
8) Precision Agriculture: Utilizing real-time data to are used to evaluate the effectiveness of weed identification
improve resource use and minimize environmental algorithms. Based on the categorization of an input sample,

113198 VOLUME 12, 2024


D. G Pai et al.: DL Techniques for Weed Detection in Agricultural Environments: A Comprehensive Review

FIGURE 9. Deep learning in weed detection.

four different outcomes for binary image classification can be


derived: true positive (TP), false positive (FP), true negative
(TN), and false negative (FN).
Four categories of studies exist for weed identification:
categorization of weed images, detection of weed objects,
segmentation of weed objects, and segmentation of weed
instances.

IX. RELATED WORK


The accuracy of mapping infestations of maize weed was
evaluated by Villiers et al. [65] through the use of a multitem-
poral UAV and data from PlanetScope. During the mid-to-late FIGURE 10. Architecture for machine learning-based crop monitoring
stages of maize crop growth, they employed machine learn- system classification [66].
ing techniques such as support vector machine and random
forest to identify weeds. For PlanetScope, accuracy of less
than 49% was attained out of eight experiments. A greater A lightweight YOLO v4-tiny model for weed detection in
comprehension of the relationships between weeds and maize maize seedlings is proposed in this paper [67]. Corn and weed
throughout their life cycles is necessary, which is the study’s data photos were manually labeled and then separated into
shortcoming. three sets: test, validation, and training. The training set was
In this research, Vijayalaksmi et al. [66], proposed a novel preprocessed and input into enhanced network models. After
crop-monitoring system based on machine learning-based training, the ideal weights were determined, and the models
categorization and UAVs is presented. The proposed archi- were tested using the test set.
tecture is depicted in Figure 10. It uses CNN to track crops In an Australian chilli crop field, UAV photographs are
in remote areas with below-average cultivation and local analyzed for weed identification. Three machine learning
climate, classifying them as either crops or weeds. Metrics algorithms are tested for this task: random forest (RF), sup-
like accuracy, precision, and specificity are used to evaluate port vector machine (SVM), and k-nearest neighbors (KNN).
the accuracy of the system. Results for weed detection accuracy with UAV photos

VOLUME 12, 2024 113199


D. G Pai et al.: DL Techniques for Weed Detection in Agricultural Environments: A Comprehensive Review

indicate 96%, 94%, and 63%, respectively, suggesting that an average f1-score of 93% to 97.5%, showing promising
RF and SVM algorithms work well and are useful [68]. outcomes for site-specific weed management in precision
In soybean fields, the authors in this paper [69] used image agriculture. Figure 12 shows the Captured images of weed
datasets to create an edge-based vision system for weed iden- species and crops from a greenhouse, preprocessed to extract
tification. After testing three CNN architectures—ResNet50, the green component, allowing for a better interpretation of
MobileNet, and others—they discovered that a five-layer color references.
CNN architecture had the greatest results in terms of per- YOLOv4-Tiny and an improved model were used by the
formance, lowest latency, and maximum accuracy of 97.7%. authors to create a weed recognition framework that outper-
Custom lightweight deep learning models were used in the formed 33,572 labels for 1000 pictures, with a mean average
system’s design, and Raspberry Pi images were used for precision of 86.89% [74].
training and inference. Precision, recall, and F1 score criteria The study [75] used images of paddy crops and
were used to assess the system’s correctness. broadleaved and sedge-type weeds to segment them using the
This study [70] focuses on object detection models in semantic segmentation models PSPNet, UNet, and SegNet.
inpasture environments, specifically weed identification. PSPNet fared better than SegNet and UNet, suggesting its
Three dataset types were created using synthetic method- potential for safe food production and weed control at the site
ology. Tuning experiments improved model performance, level. It may also be able to advise farmers on the appropriate
achieving over 95% accuracy for testing photos and 93% herbicides.
mAP accuracy for training images. The leaf-based model Utilizing data from an unmanned aerial aircraft in a
performed marginally better. barley field, the study offers a rule-based approach for
Figure 11 shows an illustration of deep learning and trans- classifying perennial weed data. The multispectral-thermal-
fer learning process. canopy-height model yielded the best F1 score when used
in conjunction with the Normalized Difference Vegetation
Index (NDVI) and U-net models [76].
The paper [77] offers a faster R-CNN-based technique
that uses the CBAM module and field photographs to detect
weeds in soybean seedlings. With VGG19 having the best
structure, the model gets an accuracy rate of 99.16% on
average. Using one hundred soybean data samples, the gen-
eralizability of the model is verified.
The authors in this paper [78] propose a pixel-level syn-
thesization data augmentation technique and a TIA-YOLOv5
network for weed and crop detection in complex field
FIGURE 11. Deep learning and transfer learning process illustration [70]. environments. The pixel-level synthesization method creates
synthetic images, while the TIA-YOLOv5 network adds a
The study [71] proposes a multiscale detection and atten- transformer encoder block and a channel feature fusion with
tion mechanism-based weed identification model called an involution strategy to increase sensitivity to weeds and
EM-YOLOv4-Tiny, based on YOLOv4-Tiny. It uses a Fea- minimize information loss.
ture Pyramid Network with an Efficient Channel Attention The study [79] uses a remotely piloted airplane to
module, soft Non-Maximum Suppression, and Complete map weed-occupied areas, calculate percentages, and pro-
Intersection over Union loss. The model detects a single vide field-based treatment and control measures. Data
image in 10.4 ms and achieves a 94.54% mAP, making it is analyzed using R, QGIS, and PIX4D, with random
suitable for rapid and precise weed identification in peanut forest and support vector machine methods used for
fields. classification.
This paper [72] demonstrated the effectiveness of Deep The study [80] proposes a soybean field weed recognition
Convolutional Neural Networks (DCNN) in identifying model using an enhanced DeepLabv3+ model, incorporating
weeds in perennial ryegrass. AlexNet and VGGNet showed a Swin transformer for feature extraction and a convolution
similar performance on datasets with one weed species. block attention module. The model outperformed tradi-
However, VGGNet showed the highest MCC values for mul- tional semantic segmentation models in identifying densely
tiple weed species, demonstrating increased precision and distributed weedy soybean seedlings, with an average inter-
improved F1 score. section ratio of 91.53%. The study suggests further use of
The study [73] compared SVM and VGG16 classification transformers in weed recognition.
models using RGB picture texture data to categorize weeds The study [81] trained convolutional neural networks
and crop species. The researchers used 3792 RGB photos (CNNs) on images of various plant species, resulting in
from a greenhouse and selected crucial features for prediction a Top-1 accuracy of 77% to 98% in plant detection and weed
models. Six crop species and four weeds were classified species discrimination, using three different CNNs( VGG16,
using SVM and VGG16 classifiers. The VGG16 model had ResNet-50, and Xception)from a pool of 93,000 photos.

113200 VOLUME 12, 2024


D. G Pai et al.: DL Techniques for Weed Detection in Agricultural Environments: A Comprehensive Review

TABLE 1. Challenges and gaps in crop and weed detection: Addressing dataset limitations and controlled environment studies.

Table 1 addresses dataset limitations and controlled envi- identification in real environment images with complex back-
ronmental studies. grounds. It analyzes the performance of DCGANs using
This study [85] uses the Weed-ConvNet model to integrate various architectural configurations, compares transfer learn-
IoT and digital image processing for weed plant detection ing approaches like Random, ImageNet, and Agricultural
in agriculture. The model achieves higher accuracy with col- datasets, and compares traditional and GAN-based data aug-
orsegmented images (0.978) than with grayscale-segmented mentation techniques. The optimal configuration achieved
images (0.942). 99.07% performance on a tomato and black nightshade
This paper [87] presents a two-stage methodology dataset, with other designs achieving similar results. Future
combining GANs and transfer learning to improve weed research should focus on larger, more complicated datasets.

VOLUME 12, 2024 113201


D. G Pai et al.: DL Techniques for Weed Detection in Agricultural Environments: A Comprehensive Review

FIGURE 12. Captured images of weed species and crops from a greenhouse, preprocessed to extract the green
component, allowing for better interpretation of color references [73].

In order to enhance a weed dataset, this work [86] presents in difficult field settings, demonstrating its effectiveness and
WeedGan, a novel generative adversarial network. It gener- capacity for visual servoing.
ates synthetic images with low resolution, which are then Figure 13 shows different types of data collection methods.
processed by ESRGAN to produce high resolution versions. The authors in this paper [93] developed a unique crop row
The process comprises gathering datasets, enhancing images, identification algorithm for visual servoing in agricultural
identifying images, and evaluating them. The study vali- fields, outperforming the baseline by 37.66%. They identified
dates the efficacy of the dataset using seven transfer learning weed population and row discontinuities as the most chal-
approaches. lenging conditions. They also developed an EOR detector to
The study [90] used device visualization and deep learn- safely direct robots away from crop rows.
ing to detect weeds in wheat crop fields in real-time. The paper [94] presents a system design for an autonomous
Using 6000 images from PMAS Arid Agriculture University agricultural robot aimed at real-time weed identification,
research farm, the study found that the PyTorch framework potentially extending to other farming applications like weed
outperformed other networks in terms of speed and accuracy. removal and plowing.
The study also compared the inference time and detection The research [95] presents a base model framework for
accuracy of various deep learning models, with the NVIDIA an instructor framework to improve semantic segmentation
RTX2070 GPU showing the best results. models for crops and weeds in uncontrolled field settings.
This study [82] proposes an improved YOLOv4 model for It suggests using a teacher model trained on various tar-
weed detection in potato fields. The model uses Depthwise get crops and weeds to instruct a student model, and a
separable convolutions, convolutional block attention mod- meta-architecture to enhance performance.
ule, K-means++ clustering algorithm, and image processing Figure 14 depicts different models used for weed detection.
techniques to improve detection accuracy. The model’s learn- This study [96] proposes a multi-layer attention technique
ing rate is modified using cosine annealing decay, and the using a transformer and fusion rule to interpret deep neu-
MC-YOLOv4 model has a 98.52% mAP value for weed ral network decisions. The fusion rule integrates attention
detection in the potato field. maps based on saliency. The model uses the Plant Seedlings
A GCN graph was created using recovered weed CNN Dataset (PSD) and Open Plant Phenotyping Dataset (OPPD)
characteristics and Euclidean distances. The GCN-ResNet- to train and assess the model. Attention maps are marked with
101 strategy outperformed leading techniques, achieving red needs and misclassification information for cross-dataset
recognition accuracy scores of 97.80%, 99.37%, 98.93%, analyses. Modern comparisons show improved classification,
and 96.51% on four weed datasets. This CNN feature-based with an average gain of 95.42% for negative and posi-
method is effective for real-time field weed control [91]. tive explanations in PSD test sets and 97.78% and 97.83%
This research [92] proposes a crop row recognition system in OPPD evaluations. High-resolution information is also
using low-cost cameras to detect field variations. It uses included in visual comparisons.
a deep learning-based method to segment crop rows and This research [97] aims to develop a new crop row recogni-
extracts the central crop using a new central crop row selec- tion technique using orthomosaic UAV photos. Using wheat
tion algorithm. The system outperforms industry standards and nitrogen field trials, the new crop detection technique

113202 VOLUME 12, 2024


D. G Pai et al.: DL Techniques for Weed Detection in Agricultural Environments: A Comprehensive Review

image-based pesticide applicators [100]. Figure 16 shows the


presence of weeds in bell paper grown in polyhouse.
The study [101] reveals that deep learning CNN
(DL-CNN) models are effective in identifying broadleaf
weeds in turfgrasses. VGGNet was the best model for detect-
ing various broadleaf weeds in dormant bermudagrass, while
DetectNet was the best for detecting cutleaf evening primrose
in bahiagrass. These models have high recall values, strong
F1 scores, and overall accuracy, indicating their potential for
turfgrass weed detection.
The research carried out in this paper [102] used object
identification Convolutional Neural Networks to detect weed
species and differentiate between broadleaved and grasses.
YOLOv3 outperformed other networks for spotting grass
weeds. Faster R-CNN and YOLOv3 were outperformed by
GoogleNet and VGGNet. VGGNet was the most successful
FIGURE 13. Different data collection methods used. for spotting grass and broadleaf plants in alfalfa.
The authors in this paper [103] recommend the RetinaNet-
based WeedNet-R model for sugar beet fields, enhancing
weed recognition accuracy without significant parameter
increase. They also implemented an untuned exponential
warmup schedule for the Adam optimizer and manually rela-
beled nearly 5,000 photos for object detection.
Three deep learning-based image processing techniques
are compared in this study [104] to detect weeds in lettuce
fields. First, YOLOV3 is used for object identification, fol-
lowed by Mask R-CNN for instance segmentation, and last,
histograms of oriented gradients (HOG) as a feature descrip-
FIGURE 14. Different models used for weed detection. tor in the second. Remove non-photosynthetic elements using
the NDVI index. For edge detection and crop identification,
based on least squares fitting was compared to the Hough the methods additionally make use of CNN features and
transform method. The new approach showed better crop row masks.
detection accuracy (CRDA) for cotton nitrogen levels and The study [105] presents a method for identifying weed
wheat nitrogen and water levels, outperforming the Hough species threatening tomato crops using RetinaNet neural net-
transform method. works for object detection. The technique was tested against
The first RGB-D photo dataset for the semantic seg- popular models like YOLOv7 and Faster-R Results showed
mentation of plant species in crop farming is presented by RetinaNet performed best with an AP ranging from 0.900 to
the authors as WE3DS. The dataset consists of a bench- 0.977, while Faster-RCNN and YOLOv7 also achieved good
mark, 2568 images, and hand-annotated ground-truth masks. results. The study suggests CNN-based weed recognition
The trained models are capable of distinguishing between techniques could be more relevant for real-time applications.
10 weed species, seven crop species, and soil [98]. The research [106] aims to create a lightweight weed detect-
The authors [99] propose a new framework for data aug- ing system for laser weeding robots using a dataset of 9,000
mentation based on the random image cropping and patching photos from six Pakistani fields. The YOLO5 single-shot
(RICAP) technique for semantic segmentation and catego- object detection model was chosen due to its superior perfor-
rization of weeds and crops as shown in Figure 15. The mance in predicting true positives and false negatives. The
framework enhances segmentation accuracies, with improve- model is used to identify and categorize crops and weeds,
ments over the original RICAP. Experiments show that with the YOLO model being the best choice due to its strong
the proposed method improves deep neural network mean performance in frame extraction and detection. The system
accuracy and intersection over union, but has limitations, is implemented using an embedded Nvidia Xavier AGX chip
especially when using large training data. for high-performance and low-power operation.
The study evaluated deep learning-based weed identifica- The study [83] used multispectral data from UAVs to iden-
tion methods from RGB photographs of a bell pepper field. tify hawkweed leaves and flowers using traditional Machine
The models, trained using different epochs and batch sizes, Learning techniques. Results showed that RF, KNN, and
achieved varying accuracy rates. InceptionV3, with 97.7% XGB models accurately identified flowers at 0.65 cm/pixel,
accuracy, 98.5% precision, and 97.8% recall, outperformed demonstrating the potential of ML and remote sensing for
others, enabling accurate weed management integration with large-scale hawkweed detection.

VOLUME 12, 2024 113203


D. G Pai et al.: DL Techniques for Weed Detection in Agricultural Environments: A Comprehensive Review

FIGURE 15. The proposed method divides an image region into horizontal and vertical parts, randomly selecting 6 images and labels from the
dataset, and cropping and patching these parts to create new images and labels [99].

FIGURE 17. YOLO-v3 failed to cover all vegetables due to occlusion, with
FIGURE 16. Presence of weeds in bell paper grown in polyhouse [100]. yellow dashed boxes representing missed detection and yellow dashed
boxes representing erroneous detection [108].

The Faster R-CNN network model is proposed for weed robotic weeders for machine vision control. Further research
identification in cropping region images. It incorporates the is needed to refine the deep learning models for improved
feature pyramid network (FPN) method for increased recog- accuracy in weed detection.
nition precision. The model combines the ResNeXt network In this paper [109] the author addresses the issue of occlu-
with FPN for feature extraction. Tests show a recognition sion, the Efficient Channel Attention Network ECA module
accuracy of over 95%, making it suitable for weed manage- was incorporated into the Spatial Pyramid Pooling (SPP)
ment systems. The model outperforms the ResNet feature structure layer, which resolves the SPP layer’s channel com-
extraction network in terms of quick and accurate target pression issue, and enhances PSPnet’s capacity to access
recognition, demonstrating the high effectiveness of deep global context information and also uses a semi-supervised
learning techniques in this area [84]. semantic segmentation was used to increase the effectiveness
This paper [107] introduces ‘‘DenseHHO’’, a deep learning of small datasets.
framework for weed identification using pre-trained CNNs. A new Generative Adversarial Network called WeedGan
The model’s architecture is chosen based on weed images is introduced in the study to enhance weed recognition in
from sprayer drones, and the model’s hyperparameters are real-world photos of the cotton Weed ID15 dataset. Low
automatically adjusted using HHO for binary class classifi- resolution synthetic images are produced by WeedGan and
cation. then processed with ESRGAN to produce super-resolution
Table 2 and 3 addresses the need for a deep learning versions. In the cotton weed dataset, federated learning and
framework in detail. generative adversarial network principles are being applied
The study [108] demonstrates that deep learning can indi- for the first time. The two-stage system enhances the reso-
rectly detect weeds by identifying vegetables. The strategy lution and characteristics of generated synthetic images by
involves detecting vegetable instances and creating bounding combining innovative generative adversarial networks with
boxes, then identifying plants growing outside these boxes transfer learning techniques [86].
as weeds. Three deep learning models ( CenterNet, YOLO- The study [110] suggests three techniques to lessen the
v3, and Faster R-CNN) were tested, with YOLO-v3 being requirement for manual image annotation. While the second
the best. YOLO-v3 failed to cover all vegetables due to includes constructing false datasets from a single plant image
occlusion, with yellow dashed boxes representing missed (dataset B) and genuine field datasets from several plants of
detection and yellow dashed boxes representing erroneous a single weed species (dataset C), the first requires altering
detection as shown in Figure 17. The approach can be used in real image datasets (dataset A).

113204 VOLUME 12, 2024


D. G Pai et al.: DL Techniques for Weed Detection in Agricultural Environments: A Comprehensive Review

TABLE 2. Challenges and gaps in crop and weed detection: Addressing the need for a deep learning framework.

This research [111] aims to use a deep learning model RetinaNet by 5.5% mAP and 5.6 frames per second, with
to accurately identify weeds in rice crop images, achieving a high mAP of 94.1% and a frame rate of 24.3 fps. They
real-time identification and low machine cost. A dataset of suggest using Det-ResNet to reduce detailed information loss
rice and eight weed types is created, and a model called and ERetina-Head for more potent feature maps. The net-
WeedDet is proposed to address overlap issues. The authors work also outperforms YOLOv3 by slightly slower fps and
propose a new detection network, WeedDet, that outperforms higher mAP.

VOLUME 12, 2024 113205


D. G Pai et al.: DL Techniques for Weed Detection in Agricultural Environments: A Comprehensive Review

This study [112] developed deep-learning models for weed fluorescence crop signaling material excites fluorochrome,
development phase classification, focusing on Consolida which facilitates its easy detection by machine vision algo-
regalis weeds. Three models were created, each with a dif- rithms. Systemic markers, plant labels labeled with fluo-
ferent backbone, using a weed dataset. The models were rescence signaling compounds, expressing a fluorescence
trained using RetinaNet, YOLOv5 models, and Faster R- gene through agrobacterium transformation into plants, and
CNN. The results showed Yolo had the highest precision in topical markers are the four crop signaling techniques
identifying growth stages, while RetinaNet with ResNet-101- that have been successfully applied. Promising outcomes
FPN achieved the highest average precision. RetinaNet with emerged from in-field trials, with 100% and 99.7% classi-
ResNet-101-FPN,the final model is suggested for real-time fication accuracy, respectively, and low false positive error
performance. rates. This method could assist in removing technical obsta-
This paper [113] analyzes the Faster Region-based Convo- cles in the way of completely automated weed-control
lutional Neural Network (RCNN) with ResNet-101, focusing robots [116].
on weed classification and localization. The model’s perfor- The authors developed an automated method for measuring
mance is influenced by anchor box generation. Enhancements maize seedling growth using RGB imagery from unmanned
to scales and aspect ratios were made, resulting in the best aerial vehicles (UAVs). They improved the color difference
classification and localization for all weed classes, with a between young and old leaves, created a maize seedling cen-
24.95% enhancement in Chinee apple weed. ter detection index, and used morphological processing and a
The study introduces Conditional Random Field (CRF)- dual-threshold technique to eliminate weed noise. The study
based post-processing for the ResNet-50 U-Net model as calculates quantity, canopy coverage, emergence uniformity,
shown in Figure 18, enhancing crop/weed segmentation and rate of maize emergence [117].
using a publicly available sunflower dataset. U-Net improves The authors propose a method using drone photos to
accuracy in underrepresented weed classes and is ideal for automatically identify weeds using Convolutional Neural
real-time weed detection due to its computational efficiency Networks and unsupervised training data. The technique
and limited parameters. However, future work should explore involves finding crop rows, and weeds growing between
deep learning models to reduce misclassification [114]. them, creating a training dataset, and creating a Deep Learn-
ing model. This method is robust and adaptive, allowing for
field adaptation without feature selection [118].
CRowNet is a deep network designed for crop row recog-
nition in UAV photos, a crucial task in precision agriculture
as depicted in Figure 19. It uses Convolutional Neural Net-
works to analyze images and recognize crop rows based on
their visual characteristics. Compared to other approaches
like semantic segmentation and Hough transform, CRowNet
outperformed them in terms of Intersection over Union (IoU)
scores. In a maize field with shadows, CRowNet achieved an
IoU score of 93.58% for crop rows in a beet field [119].
This research proposes a method using GANs to gener-
ate synthetic images and transfer learning for early weed
identification in agriculture. It compares the performance
of different architectural configurations and conventional vs.
GAN-based data augmentation strategies for weed detection.
The methodology combines different techniques for early
weed identification, addressing the lack of real-world datasets
in agricultural areas [87].
FIGURE 18. U-Net architecture [114]. This research [120] proposes a method using GANs to gen-
erate synthetic images and transfer learning for early weed
To detect crops from weeds in challenging natural situa- identification in agriculture. It compares the performance of
tions, such as high weed concentrations on organic farms, different architectural configurations and conventional vs.
this research introduces a revolutionary crop signaling tech- GAN-based data augmentation strategies for weed detection.
nique that uses machine vision. The technology is made for The methodology combines different techniques for early
a vision-based weeding robot with a micro-jet herbicide- weed identification, addressing the lack of real-world datasets
spraying system and uses a machine-readable signaling in agricultural areas.
chemical to create visual features [115]. Using a single-leaf labeling method, the research [121]
Crop signaling is a novel idea that enables automated provides a deep-learning methodology for crop seedling
onfarm plant care tasks by creating a machine-readable sig- detection in challenging field situations. Examined on a
nal on crop plants at planting. Under excitation lights, the dataset of four crops, the technique demonstrated excellent

113206 VOLUME 12, 2024


D. G Pai et al.: DL Techniques for Weed Detection in Agricultural Environments: A Comprehensive Review

FIGURE 20. The weed detection process involves labeling images with
the Roboflow tool, training the YOLOv7 model, and detecting weeds in
the input images, resulting in a final output map [89].

FIGURE 19. Flowchart for crop row detection with CNN (CRowNet) [119].

accuracy under dense planting circumstances and enhanced


the model’s mAP 0.5 by 1.2%, resolving difficulties with
missed detection. Farmers can gain from this strategy by
increasing crop output and decreasing weed interference, and
it is more appropriate for high-density farms.
The research carried out in the paper [122] introduces
a metaheuristic optimization method for weed detection in FIGURE 21. DUAL PSPNet scheme [110].
wheat fields. It uses an optimal voting ensemble model,
ADSCFGWO algorithm for feature selection, and transfer
learning for feature extraction. The method outperforms cur- in this paper [110]. Targeting species of grass, broadleaf, and
rent optimization techniques, with a detection accuracy of crop, three datasets were created. Real field photos made up
97.70%, F-score of 98.60%, specificity of 95.20%, and sen- the first dataset, single-species plots made up the second,
sitivity of 98.40%. and artificially generated images made up the third. An addi-
In order to facilitate the development of visual perception tional classification loss was added to a PSPNet architecture
algorithms for tasks such as semantic segmentation, panoptic to create a semantic segmentation architecture as shown in
segmentation, leaf instance segmentation, and plant and leaf Figure 21. The research shows that augmenting the real
recognition, the paper [88] provides a sizable dataset and field images dataset with other datasets improves network
standards for semantic interpretation of agricultural imagery. performance without human annotation, surpassing the state-
Using You Only Look Once version 7 (YOLOv7), the arti- of-the-art method.
cle [89] tests deep weed object detection on a new weed The Figure 22 shows various steps involved in weed detec-
and crop dataset named Chicory Plant (CP). Using more tion using deep learning.
than 3000 RGB photos of chicory plantings, the dataset
provides 12,113 bounding box annotations. With strong X. ANALYSIS OF THE RELATED WORKS
[email protected] scores, the YOLOv7 model surpasses other YOLO Dataset limitations are revealed by the identified research
variations in CP and LB. Figure 20 represents the weed papers on weed detection in agriculture. The majority of
detection process invloving labeling images using Roboflow studies rely on datasets gathered from particular experimen-
tool and training using YOLOv7 model. tal fields or controlled environments, which restricts the
According to the paper [123], machine learning techniques applicability of their conclusions to a variety of real-world
are used to assess the normal RGB or 4-channel NIR + RGB agricultural scenarios. These artificial settings might not
images that serve as the input for weed detection on robots. accurately capture the diversity found in various geographic
To improve the model’s ability to accurately and precisely regions, soil types, weather patterns, and crop types.
detect weeds, the authors train it on a large dataset of cotton Furthermore, some study’s inability to make their datasets
crop weeds (which can subsequently be applied to many dif- publicly available compromises the reproducibility and com-
ferent crops). Here, the authors employed a machine learning parability of their findings across various research projects.
algorithm with an accuracy of up to 90% to 95% as the SSD To overcome these dataset constraints, further research is
MobileNet model. required that integrates more representative and diverse
A deep learning segmentation model that can differentiate datasets that depict the intricacies of real-world agricultural
between different plant species at the pixel level is presented contexts. Figure 23 depicts different models or algorithms

VOLUME 12, 2024 113207


D. G Pai et al.: DL Techniques for Weed Detection in Agricultural Environments: A Comprehensive Review

FIGURE 24. Different crops used for the study.

Creating models that can adapt to different crops and field


conditions can significantly enhance the practical utility of
weed detection systems in various agricultural settings.

XI. CROP STUDY INSIGHTS: MODALITY AND DATA


COLLECTION METHODS IN FOCUS
FIGURE 22. Steps involved in weed detection using deep learning.
A key strategy for combating herbicide misuse is site-specific
weed management (SSWM), which focuses on weed control
implementation, decision-making algorithms for herbicide
administration, and crop and weed detection systems [124],
[125]. According to [126], high spatial resolution images
are ideal for precision weed management and customizable
spraying systems. Digital image-based weed detection is
a crucial technological tool for precise weed identification
and localization in agricultural areas. Conventional machine
learning is utilized in wheat fields to detect weeds which
frequently call for manual characteristics including color,
position, morphology, and texture. These techniques, how-
ever, are sensitive to sample variation, have a high time
complexity, and are ineffective for multi-scale object detec-
tion tasks [127], [128]. Deep learning approaches based on
CNNs have significantly improved recognition efficiency in
weeds detection. The Figure 25 depicts the visual representa-
tion of words based on frequency and relevance.

XII. THE SCARCITY OF DATA AND POTENTIAL


SOLUTIONS
Researchers have investigated diverse approaches to tackle
FIGURE 23. Different models or algorithms used for weed detection in the issue of data scarcity in agricultural systems. The first
agricultural fields. is data augmentation, which enhances the quality of images
by applying geometric and color transformations. The sec-
ond tactic is transfer learning, which is using knowledge
used for weed detection in agricultural fields and Figure 24 gained in one context to another, frequently in agricultural
depicts various types of crops used for the study. settings. Weed identification and disease categorization sys-
Despite advancements in deep learning and machine vision tems have been made more functional with the use of modern
technology, research on weed detection in agriculture concen- architectures such as Xception, ResNet, VGGNets, Incep-
trates on specific or limited crops. Given their great diversity, tion, or DenseNet. Generative Adversarial Networks (GANs),
the proposed model’s narrow scope raises questions regarding which generate synthetic data from preexisting datasets, con-
their application in other agricultural environments, requiring stitute the third tactic.
the creation of extended models to recognize and classify In several fields, including the creation of synthetic sam-
weeds in different crops. ples of plants and the classification of plant illnesses in

113208 VOLUME 12, 2024


D. G Pai et al.: DL Techniques for Weed Detection in Agricultural Environments: A Comprehensive Review

TABLE 3. Challenges and gaps in crop and weed detection: Addressing the need for a deep learning framework (contd..).

FIGURE 25. Word cloud, visual representation of words based on frequency and relevance.

authentic settings, Deep Convolutional GAN (DCGAN) has and setups in comparable benchmarks are also required.
demonstrated encouraging outcomes. For instance, a neural Overall, these methods offer promising solutions for address-
network design with two stages was utilized to achieve a final ing data scarcity in agricultural systems.
accuracy of 93.67% using conventional data augmentation
and GANs approaches. A. DATA AUGMENTATION
However, more experimental results are needed to fully Utilizing modified versions of current data or producing syn-
understand the effectiveness of these deep learning-based thetic data from existing data, data augmentation techniques
techniques. As GANs are a relatively young topic, more expands the actual amount of available data. It is used to apply
through empirical investigations comparing various designs random, realistic data transformations like flipping or rotating

VOLUME 12, 2024 113209


D. G Pai et al.: DL Techniques for Weed Detection in Agricultural Environments: A Comprehensive Review

images, to a model’s training sets to increase their diversity. why transfer learning is feasible, we might take inspira-
By providing variants of the data that the model would meet tion from non-technical, real-world events. Take the case of
in the real world, data augmentation techniques help deep two individuals who aspire to become proficient pianists.
learning models become more accurate and robust. While one individual has never played an instrument before,
On the other hand, data augmentation may be detrimental the other has played the guitar for a long time and has
if it produces inferior prediction outcomes. It’s vital to find a vast knowledge of music. A person with a strong foun-
equilibrium between bias and variance and experiment with dation in music will be able to pick up the piano more
different combinations of data augmentation to determine quickly because they will be able to use their prior under-
which works best for the issue statement to prevent this. standing of music to the challenge of learning how to play
When a model fits too closely to the training dataset and is the instrument. An individual can effectively apply knowl-
unable to generalize, this is known as overfitting. edge from a previously acquired task to acquire a related
The data’s variability, the model’s potential for gen- task [132].
eralization, the reduction of overfitting, the cost savings
associated with gathering and labeling extra data, and the XIII. DISCUSSION
enhanced prediction accuracy of the deep learning model Regarding the identification and classification of crop weeds,
are all advantages of data augmentation. However, because the Deep Learning model works incredibly well. RGB images
augmented datasets contain the biases of current datasets, are captured for most research using a digital camera; some
mechanisms must be put in place to monitor and evaluate their use multi-spectral or hyper-spectral data. By using detec-
quality. tion accuracy as the primary parameter, researchers train the
Adding Gaussian noise, brightness, hue, contrast, satura- model using supervised learning techniques. The application
tion, flipping, rotating, scaling, cropping, and translating are of new technology and different spectrum indices are two
examples of common data augmentation techniques. Data areas where there is still opportunity for advancement. Vast
augmentation can greatly increase the accuracy of deep learn- datasets are required for weeds and crops, however, the cost
ing models by creating methods to reduce bias and boost of annotating these vast datasets is high. This issue can
neural network learning capacity. be solved by using weakly supervised or semi-supervised
techniques. Large datasets can be produced for automated
B. GENERATIVE ADVERSARIAL NETWORK
weed detection systems using deep learning techniques and
Generative Adversarial Networks (GAN). Nevertheless, class
A deep learning architecture known as a Generative Adver-
imbalance is present in most datasets, which causes biases
sarial Network (GAN) pits two neural networks against one
and over-fitting. To address this issue, future research should
another in an attempt to produce more genuine new data from
use class-balancing classifiers, cost-sensitive learning, or data
a given dataset [129]. Up until it is unable to discriminate
redistribution. The main objective is to increase crop yields
between fake and original data, the predicting network gen-
while reducing expenses.
erates newer, better versions of the fake data values in an
attempt to ascertain whether the generated data is part of the
XIV. FUTURE SCOPE
original dataset.
Further studies in this area could enhance intercropping mod-
Mathematical formulas and the relationship between the
els for smallholder farmers by adding more sensors and
generator and discriminator serve as the foundation for GAN
cloud computing, integrating a larger range of color spaces,
models. The simplest model, known as vanilla GAN [130],
vegetation indexes, and spectral bands, and expanding their
generates data variation without feedback. By introducing
use to other crops for yield prediction and disease detection.
conditionality, conditional GAN enables the production of
One interesting direction for the development of intelligent
tailored data. Using transposed convolutions to upscale data
weeding technology is the integration of machine learning
distribution and convolutional layers for data classification,
algorithms with robotic systems, which will increase the
deep convolutional GAN incorporates convolutional neural
efficiency of weed detection and removal.
networks (CNNs) into GANs. The goal of super-resolution
Subsequent investigation could improve real-time herbi-
GANs (SRGANs) [131] is to upsample low-resolution pho-
cide administration by integrating more weed species in
tos to high resolution while preserving detail and quality.
pastures and refining models of deep convolutional neural
Laplacian Pyramid GANs (LAPGANs) employ a hierarchical
networks for weed identification in turfgrass species.
method with several generators and discriminators operating
WeedGan is a suggested semantic segmentation model that
at various sizes or resolutions to divide the problem into
might be made better by adding new versions, growing the
stages. The procedure starts with producing a low-resolution
dataset, and improving training results on real-world datasets.
image, whose quality increases as the GAN phases advance.
Both the study’s coverage of weed species and the experimen-
tal field setting might be lacking in disruptive elements. Down
C. TRANSFER LEARNING the line, more research is necessary to determine whether
By using knowledge from a related area, transfer learning the performance of the suggested data augmentation strategy
helps learners in one domain become better. To comprehend decreases with larger training photos.

113210 VOLUME 12, 2024


D. G Pai et al.: DL Techniques for Weed Detection in Agricultural Environments: A Comprehensive Review

XV. CONCLUSION [18] J. M. Guerrero, J. J. Ruz, and G. Pajares, ‘‘Crop rows and weeds detection
The research publications on identification and categoriza- in maize fields applying a computer vision system based on geometry,’’
Comput. Electron. Agricult., vol. 142, pp. 461–472, Nov. 2017.
tion of weed species using deep learning in value crops are [19] D. C. Slaughter, D. K. Giles, and D. Downey, ‘‘Autonomous robotic weed
reviewed in this study. The majority of research employs control systems: A review,’’ Comput. Electron. Agricult., vol. 61, no. 1,
supervised learning strategies and uses plant datasets to refine pp. 63–78, Apr. 2008.
[20] T. Bak and H. Jakobsen, ‘‘Agricultural robotic platform with four wheel
pretrained models. When enough labeled data is provided, steering for weed detection,’’ Biosyst. Eng., vol. 87, no. 2, pp. 125–136,
high accuracy can be reached. However, the excellent accu- Feb. 2004.
racy and processing speed achieved by current research are [21] X. Wu, S. Aravecchia, P. Lottes, C. Stachniss, and C. Pradalier, ‘‘Robotic
weed control using automated weed and crop classification,’’ J. Field
limited to tiny datasets. Future work should focus on class Robot., vol. 37, no. 2, pp. 322–340, Mar. 2020.
imbalance issues, weed growth phase identification, large- [22] M. Sujaritha, S. Annadurai, J. Satheeshkumar, S. K. Sharan, and
scale datasets with a variety of crop and weed species, L. Mahesh, ‘‘Weed detecting robot in sugarcane fields using fuzzy real
time classifier,’’ Comput. Electron. Agricult., vol. 134, pp. 160–171,
efficient detection approaches, and comprehensive field test- Mar. 2017.
ing for commercial deployments. [23] N. Khan, G. Medlock, S. Graves, and S. Anwar, ‘‘Gps guided autonomous
navigation of a small agricultural robot with automated fertilizing sys-
tem,’’ SAE Tech. Paper 2018-01-0031, 2018, doi: 10.4271/2018-01-0031.
REFERENCES [24] R. Gerhards and H. Oebel, ‘‘Practical experiences with a system for
[1] A. Steensland and M. Zeigler, ‘‘Productivity in agriculture for a sustain- site-specific weed control in arable crops using real-time image analysis
able future,’’ in The Innovation Revolution in Agriculture. Cambridge, and GPS-controlled patch spraying,’’ Weed Res., vol. 46, no. 3, 2006,
U.K.: Cambridge Univ. Press, 2021. Art. no. 185193.
[2] D. Hemathilake and D. Gunathilake, ‘‘Agricultural productivity and [25] I. Dasgupta, J. Saha, P. Venkatasubbu, and P. Ramasubramanian, ‘‘AI crop
food supply to meet increased demands,’’ in Future Foods. Amsterdam, predictor and weed detector using wireless technologies: A smart applica-
The Netherlands: Elsevier, 2022, pp. 539–553. tion for farmers,’’ Arabian J. Sci. Eng., vol. 45, no. 12, pp. 11115–11127,
[3] M. van Dijk, T. Morley, M. L. Rau, and Y. Saghai, ‘‘A meta-analysis of Dec. 2020.
projected global food demand and population at risk of hunger for the [26] ‘‘Images related to weed control: Aleiodes indiscretus wasp parasitizing
period 2010–2050,’’ Nature Food, vol. 2, no. 7, pp. 494–501, Jul. 2021. gypsy moth caterpillar, migrants weeding sugar beets, people weed-
[4] K. Barman, V. Singh, R. Dubey, P. Singh, A. Dixit, and A. Sharma, ing, and a weeding tool,’’ Images Taken From Wikimedia Commons,
‘‘Challenges and opportunities in weed management under a chang- Tech. Rep.
ing agricultural scenario,’’ in Recent Advances in Weed Management. [27] A. Gollakota and M. B. Srinivas, ‘‘Agribot—A multipurpose agricultural
New York, NY, USA: Springer, 2014, pp. 365–390. robot,’’ in Proc. Annu. IEEE India Conf., Dec. 2011, pp. 1–4.
[5] S. Tshewang, B. M. Sindel, M. Ghimiray, and B. S. Chauhan, ‘‘Weed [28] N. Chebrolu, P. Lottes, A. Schaefer, W. Winterhalter, W. Burgard, and
management challenges in rice (Oryza sativa L.) for food security in C. Stachniss, ‘‘Agricultural robot dataset for plant classification, local-
bhutan: A review,’’ Crop Protection, vol. 90, pp. 117–124, Dec. 2016. ization and mapping on sugar beet fields,’’ Int. J. Robot. Res., vol. 36,
[6] D. Patel and B. Kumbhar, ‘‘Weed and its management: A major threat to no. 10, pp. 1045–1052, Sep. 2017.
crop economy,’’ J. Pharm. Sci. Biosci. Res., vol. 6, no. 6, pp. 453–758, [29] Tertill: The Solar Powered Weeding Robot, Tertill Corp., USA.
2016. [30] G. Amer, S. M. M. Mudassir, and M. A. Malik, ‘‘Design and operation of
[7] N. Iqbal, S. Manalil, B. S. Chauhan, and S. W. Adkins, ‘‘Germination Wi-Fi agribot integrated system,’’ in Proc. Int. Conf. Ind. Instrum. Control
biology of sesbania (Sesbania cannabina): An emerging weed in the (ICIC), May 2015, pp. 207–212.
Australian cotton agro-environment,’’ Weed Sci., vol. 67, no. 1, pp. 68–76, [31] U-Idaho’s College of Agricultural and L-Sciences, Solar Powered Weed-
Jan. 2019. ing Robot, Univ. Idaho, Moscow, Russia, 2022.
[8] K. F. Egbadzor and E. K. Sakyi, ‘‘Effects of synthetic agricultural chem- [32] Y. Xiong, Y. Ge, Y. Liang, and S. Blackmore, ‘‘Development of a pro-
icals on health: Views of smallholder farmers in the ho west district,’’ totype robot and fast path-planning algorithm for static laser weeding,’’
J. Agricult. Food Sci., vol. 18, no. 2, pp. 73–84, Feb. 2021. Comput. Electron. Agricult., vol. 142, pp. 494–503, Nov. 2017.
[9] T. Kankam, K. Okese, and J. F. Boamah, ‘‘Types, principles, methods [33] T. Utstumo, F. Urdal, A. Brevik, J. Dørum, J. Netland, Ø. Overskeid,
and importance,’’ 2020. [Online]. Available: https://blog.agrihomegh. T. W. Berge, and J. T. Gravdahl, ‘‘Robotic in-row weed control in veg-
com/organic-farming-typesprinciples/ etables,’’ Comput. Electron. Agricult., vol. 154, pp. 36–45, Nov. 2018.
[10] A. A. Bajwa, ‘‘Sustainable weed management in conservation agricul- [34] O. Bawden, J. Kulk, R. Russell, C. McCool, A. English, F. Dayoub,
ture,’’ Crop Protection, vol. 65, pp. 105–113, Nov. 2014. C. Lehnert, and T. Perez, ‘‘Robot for weed species plant-specific man-
[11] K. N. Harker and J. T. O’Donovan, ‘‘Recent weed control, weed manage- agement,’’ J. Field Robot., vol. 34, no. 6, pp. 1179–1199, Sep. 2017.
ment, and integrated weed management,’’ Weed Technol., vol. 27, no. 1, [35] A. Pretto, S. Aravecchia, W. Burgard, N. Chebrolu, C. Dornhege, T. Falck,
pp. 1–11, Mar. 2013. F. Fleckenstein, A. Fontenla, M. Imperoli, R. Khanna, and F. Liebisch,
[12] A. Monteiro and S. Santos, ‘‘Sustainable approach to weed management: ‘‘Building an aerial–ground robotics system for precision farming: An
The role of precision weed management,’’ Agronomy, vol. 12, no. 1, adaptable solution,’’ IEEE Robot. Autom. Mag., vol. 28, no. 3, pp. 29–49,
p. 118, Jan. 2022. Sep. 2021.
[13] V. Partel, S. C. Kakarla, and Y. Ampatzidis, ‘‘Development and evaluation [36] H. Sori, H. Inoue, H. Hatta, and Y. Ando, ‘‘Effect for a paddy weed-
of a low-cost and smart technology for precision weed management ing robot in wet rice culture,’’ J. Robot. Mechatronics, vol. 30, no. 2,
utilizing artificial intelligence,’’ Comput. Electron. Agricult., vol. 157, pp. 198–205, Apr. 2018.
pp. 339–350, Feb. 2019. [37] U. Shafi, R. Mumtaz, J. García-Nieto, S. A. Hassan, S. A. R. Zaidi, and
[14] M. J. Mia, F. Massetani, G. Murri, and D. Neri, ‘‘Sustainable alternatives N. Iqbal, ‘‘Precision agriculture techniques and practices: From consid-
to chemicals for weed control in the orchard–a review,’’ Horticultural Sci., erations to applications,’’ Sensors, vol. 19, no. 17, p. 3796, Sep. 2019.
vol. 47, no. 1, pp. 1–12, Mar. 2020. [38] M. R. Yousefi and A. M. Razdari, ‘‘Application of GIS and GPS in
[15] S. K. P. G. Peera, S. Debnath, and S. Maitra, ‘‘Mulching: Materials, precision agriculture (a review),’’ Int. J. Adv. Biol. Biomed. Res., vol. 3,
advantages and crop production,’’ in Protected Cultivation Smart Agri- no. 1, pp. 7–9, 2015.
culture, S. Maitra, D. J. Gaikwad, and S. Tanmoy, Eds., New Delhi, India: [39] H. Xiang and L. Tian, ‘‘Development of a low-cost agricultural remote
New India Publishing Agency, 2020, pp. 55–66. sensing system based on an autonomous unmanned aerial vehicle
[16] S. S. Valle and J. Kienzle, ‘‘Agriculture 4.0—Agricultural robotics and (UAV),’’ Biosyst. Eng., vol. 108, no. 2, pp. 174–190, Feb. 2011.
automated equipment for sustainable crop production,’’ Food Agricult. [40] R. Mink, A. Dutta, G. G. Peteinatos, M. Sökefeld, J. J. Engels, M. Hahn,
Org. United Nations (FAO), Rome, Italy, 2020. and R. Gerhards, ‘‘Multi-temporal site-specific weed control of Cirsium
[17] G. G. Peteinatos, M. Weis, D. Andújar, V. R. Ayala, and R. Gerhards, arvense (L.) Scop. and Rumex crispus L. in maize and sugar beet using
‘‘Potential use of ground-based sensor technologies for weed detection,’’ unmanned aerial vehicle based mapping,’’ Agriculture, vol. 8, no. 5, p. 65,
Pest Manage. Sci., vol. 70, no. 2, pp. 190–199, Feb. 2014. Apr. 2018.

VOLUME 12, 2024 113211


D. G Pai et al.: DL Techniques for Weed Detection in Agricultural Environments: A Comprehensive Review

[41] M. T. Islam, K. Yoshida, S. Nishiyama, and K. Sakai, ‘‘Mutual validation [63] A. Brilhador, M. Gutoski, L. T. Hattori, A. de Souza Inacio,
of remote hydraulic estimates and flow model simulations using UAV- A. E. Lazzaretti, and H. S. Lopes, ‘‘Classification of weeds and crops
borne LiDAR and deep learning-based imaging techniques,’’ Results at the pixel-level using convolutional neural networks and data aug-
Eng., vol. 20, Dec. 2023, Art. no. 101415. mentation,’’ in Proc. IEEE Latin Amer. Conf. Comput. Intell. (LA-CCI),
[42] M. C. Oliveira, O. A. Osipitan, K. Begcy, and R. Werle, ‘‘Cover crops, Nov. 2019, pp. 1–6.
hormones and herbicides: Priming an integrated weed management strat- [64] A. Wang, Y. Xu, X. Wei, and B. Cui, ‘‘Semantic segmentation of crop
egy,’’ Plant Sci., vol. 301, Dec. 2020, Art. no. 110550. and weed using an encoder–decoder network and image enhancement
[43] N. Jihani, M. N. Kabbaj, and M. Benbrahim, ‘‘Kalman filter based sensor method under uncontrolled outdoor illumination,’’ IEEE Access, vol. 8,
fault detection in wireless sensor network for smart irrigation,’’ Results pp. 81724–81734, 2020.
Eng., vol. 20, Dec. 2023, Art. no. 101395. [65] C. de Villiers, C. Munghemezulu, Z. Mashaba-Munghemezulu,
[44] A. Rabak, K. Uppuluri, F. F. Franco, N. Kumar, V. P. Georgiev, G. J. Chirima, and S. G. Tesfamichael, ‘‘Weed detection in rainfed maize
C. Gauchotte-Lindsay, C. Smith, R. A. Hogg, and L. Manjakkal, ‘‘Sen- crops using UAV and PlanetScope imagery,’’ Sustainability, vol. 15,
sor system for precision agriculture smart watering can,’’ Results Eng., no. 18, p. 13416, Sep. 2023.
vol. 19, Sep. 2023, Art. no. 101297. [66] K. Vijayalakshmi, S. Al-Otaibi, L. Arya, M. A. Almaiah,
[45] D. Bini, D. Pamela, and S. Prince, ‘‘Machine vision and machine learning T. P. Anithaashri, S. S. Karthik, and R. Shishakly, ‘‘Smart
for intelligent agrobots: A review,’’ in Proc. 5th Int. Conf. Devices, agricultural–industrial crop-monitoring system using unmanned aerial
Circuits Syst. (ICDCS), Mar. 2020, pp. 12–16. vehicle–Internet of Things classification techniques,’’ Sustainability,
[46] H. Kamyab, T. Khademi, S. Chelliapan, M. SaberiKamarposhti, vol. 15, no. 14, p. 11242, Jul. 2023.
S. Rezania, M. Yusuf, M. Farajnezhad, M. Abbas, B. H. Jeon, and Y. Ahn, [67] S. Liu, Y. Jin, Z. Ruan, Z. Ma, R. Gao, and Z. Su, ‘‘Real-time detection of
‘‘The latest innovative avenues for the utilization of artificial intelligence seedling maize weeds in sustainable agriculture,’’ Sustainability, vol. 14,
and big data analytics in water resource management,’’ Results Eng., no. 22, p. 15088, Nov. 2022.
vol. 20, Dec. 2023, Art. no. 101566. [68] N. Islam, M. M. Rashid, S. Wibowo, C.-Y. Xu, A. Morshed, S. A. Wasimi,
[47] B. K. I. Ahmed, A. Prakash, and A. Prakash, ‘‘An approach for digital S. Moore, and S. M. Rahman, ‘‘Early weed detection using image pro-
farming using mobile robot,’’ in Proc. 2nd Int. Conf. Inventive Res. cessing and machine learning techniques in an Australian chilli farm,’’
Comput. Appl. (ICIRCA), Jul. 2020, pp. 580–585. Agriculture, vol. 11, no. 5, p. 387, Apr. 2021.
[48] U. Shapira, I. Herrmann, A. Karnieli, and J. Bonfil, ‘‘Weeds detection [69] N. Razfar, J. True, R. Bassiouny, V. Venkatesh, and R. Kashef, ‘‘Weed
by ground-level hyperspectral data,’’ Theory Pract, vol. 38, pp. 27–33, detection in soybean crops using custom lightweight deep learning mod-
Jan. 2010. els,’’ J. Agricult. Food Res., vol. 8, Jun. 2022, Art. no. 100308.
[49] W. H. Maes and K. Steppe, ‘‘Perspectives for remote sensing with [70] H. Chegini, F. Beltran, and A. Mahanti, ‘‘Designing and developing
unmanned aerial vehicles in precision agriculture,’’ Trends Plant Sci., a weed detection model for California thistle,’’ ACM Trans. Internet
vol. 24, no. 2, pp. 152–164, Feb. 2019. Technol., vol. 23, no. 3, pp. 1–29, Aug. 2023.
[50] A. I. de Castro, J. M. Peña, J. Torres-Sánchez, F. Jiménez-Brenes, and
[71] H. Zhang, Z. Wang, Y. Guo, Y. Ma, W. Cao, D. Chen, S. Yang, and R. Gao,
F. López-Granados, ‘‘Mapping cynodon dactylon in vineyards using UAV
‘‘Weed detection in peanut fields based on machine vision,’’ Agriculture,
images for site-specific weed control,’’ Adv. Animal Biosciences, vol. 8,
vol. 12, no. 10, p. 1541, Sep. 2022.
no. 2, pp. 267–271, 2017.
[72] J. Yu, A. W. Schumann, Z. Cao, S. M. Sharpe, and N. S. Boyd, ‘‘Weed
[51] L. Parra, D. Mostaza-Colado, S. Yousfi, J. F. Marin, P. V. Mauri,
detection in perennial ryegrass with deep learning convolutional neural
and J. Lloret, ‘‘Drone RGB images as a reliable information source to
network,’’ Frontiers Plant Sci., vol. 10, p. 1422, Oct. 2019.
determine legumes establishment success,’’ Drones, vol. 5, no. 3, p. 79,
Aug. 2021. [73] G. C. Sunil, Y. Zhang, C. Koparan, M. R. Ahmed, K. Howatt, and X. Sun,
[52] J. Peña, J. Torres-Sánchez, A. Serrano-Pérez, A. De Castro, and ‘‘Weed and crop species classification using computer vision and deep
F. López-Granados, ‘‘Quantifying efficacy and limits of unmanned learning technologies in greenhouse conditions,’’ J. Agricult. Food Res.,
aerial vehicle (UAV) technology for weed seedling detection as vol. 9, Sep. 2022, Art. no. 100325.
affected by sensor resolution,’’ Sensors, vol. 15, no. 3, pp. 5609–5626, [74] H. Pei, Y. Sun, H. Huang, W. Zhang, J. Sheng, and Z. Zhang, ‘‘Weed
Mar. 2015. detection in maize fields by UAV images based on crop row preprocessing
[53] K.-H. Dammer and G. Wartenberg, ‘‘Sensor-based weed detection and and improved YOLOv4,’’ Agriculture, vol. 12, no. 7, p. 975, Jul. 2022.
application of variable herbicide rates in real time,’’ Crop Protection, [75] R. Kamath, M. Balachandra, A. Vardhan, and U. Maheshwari, ‘‘Classi-
vol. 26, no. 3, pp. 270–277, Mar. 2007. fication of paddy crop and weeds using semantic segmentation,’’ Cogent
[54] U. Shapira, I. Herrmann, A. Karnieli, and D. J. Bonfil, ‘‘Field spec- Eng., vol. 9, no. 1, Dec. 2022, Art. no. 2018791.
troscopy for weed detection in wheat and chickpea fields,’’ Int. J. Remote [76] L.-F. Thomas, M. Änäkkälä, and A. Lajunen, ‘‘Weakly supervised peren-
Sens., vol. 34, no. 17, pp. 6094–6108, Sep. 2013. nial weed detection in a barley field,’’ Remote Sens., vol. 15, no. 11,
[55] J. Machleb, G. G. Peteinatos, M. Sökefeld, and R. Gerhards, ‘‘Sensor- p. 2877, Jun. 2023.
based intrarow mechanical weed control in sugar beets with motorized [77] X. Zhang, J. Cui, H. Liu, Y. Han, H. Ai, C. Dong, J. Zhang, and
finger weeders,’’ Agronomy, vol. 11, no. 8, p. 1517, Jul. 2021. Y. Chu, ‘‘Weed identification in soybean seedling stage based on opti-
[56] T. F. Burks, S. A. Shearer, J. P. Fulton, and C. J. Sobolik, ‘‘Effects of mized faster R-CNN algorithm,’’ Agriculture, vol. 13, no. 1, p. 175,
timevarying inflow rates on combine yield monitor accuracy,’’ Appl. Eng. Jan. 2023.
Agricult., vol. 20, no. 3, pp. 269–275, 2004. [78] A. Wang, T. Peng, H. Cao, Y. Xu, X. Wei, and B. Cui, ‘‘TIA-YOLOv5:
[57] Weed Management Apps for Your Phone or Tablet, North Carolina Soy- An improved YOLOv5 network for real-time detection of crop and weed
bean Producers Assoc., Raleigh, NC, USA, 2018. in the field,’’ Frontiers Plant Sci., vol. 13, Dec. 2022, Art. no. 1091655.
[58] Y. LeCun, Y. Bengio, and G. Hinton, ‘‘Deep learning,’’ Nature, vol. 521, [79] N. L. Bento, G. A. E. S. Ferraz, J. D. S. Amorim, L. S. Santana,
no. 7553, pp. 436–444, 2015. R. A. P. Barata, D. V. Soares, and P. F. P. Ferraz, ‘‘Weed detection and
[59] S. J. Pan and Q. Yang, ‘‘A survey on transfer learning,’’ IEEE Trans. mapping of a coffee farm by a remotely piloted aircraft system,’’ Agron-
Knowl. Data Eng., vol. 22, no. 10, pp. 1345–1359, Oct. 2010. omy, vol. 13, no. 3, p. 830, Mar. 2023.
[60] C. Ren, D.-K. Kim, and D. Jeong, ‘‘A survey of deep learning in agri- [80] H. Yu, M. Che, H. Yu, and J. Zhang, ‘‘Development of weed detection
culture: Techniques and their applications,’’ J. Inf. Process. Syst., vol. 16, method in soybean fields utilizing improved DeepLabv3+ platform,’’
no. 5, pp. 1015–1033, 2020. Agronomy, vol. 12, no. 11, p. 2889, Nov. 2022.
[61] J. Machleb, G. G. Peteinatos, B. L. Kollenda, D. Andújar, and [81] G. G. Peteinatos, P. Reichel, J. Karouta, D. Andújar, and R. Gerhards,
R. Gerhards, ‘‘Sensor-based mechanical weed control: Present state ‘‘Weed identification in maize, sunflower, and potatoes with the aid of
and prospects,’’ Comput. Electron. Agricult., vol. 176, Sep. 2020, convolutional neural networks,’’ Remote Sens., vol. 12, no. 24, p. 4185,
Art. no. 105638. Dec. 2020.
[62] A. Farooq, J. Hu, and X. Jia, ‘‘Analysis of spectral bands and spatial [82] J. Zhao, G. Tian, C. Qiu, B. Gu, K. Zheng, and Q. Liu, ‘‘Weed detection in
resolutions for weed classification via deep convolutional neural net- potato fields based on improved YOLOv4: Optimal speed and accuracy
work,’’ IEEE Geosci. Remote Sens. Lett., vol. 16, no. 2, pp. 183–187, of weed detection in potato fields,’’ Electronics, vol. 11, no. 22, p. 3709,
Feb. 2019. Nov. 2022.

113212 VOLUME 12, 2024


D. G Pai et al.: DL Techniques for Weed Detection in Agricultural Environments: A Comprehensive Review

[83] N. Amarasingam, M. Hamilton, J. E. Kelly, L. Zheng, J. Sandino, [104] K. Osorio, A. Puerto, C. Pedraza, D. Jamaica, and L. Rodríguez, ‘‘A deep
F. Gonzalez, R. L. Dehaan, and H. Cherry, ‘‘Autonomous detection of learning approach for weed detection in lettuce crops using multispectral
mouse-ear hawkweed using drones, multispectral imagery and supervised images,’’ AgriEngineering, vol. 2, no. 3, pp. 471–488, Aug. 2020.
machine learning,’’ Remote Sens., vol. 15, no. 6, p. 1633, Mar. 2023. [105] J. M. López-Correa, H. Moreno, A. Ribeiro, and D. Andújar, ‘‘Intelligent
[84] Y. Mu, R. Feng, R. Ni, J. Li, T. Luo, T. Liu, X. Li, H. Gong, Y. Guo, Y. Sun, weed management based on object detection neural networks in tomato
Y. Bao, S. Li, Y. Wang, and T. Hu, ‘‘A faster R-CNN-based model for crops,’’ Agronomy, vol. 12, no. 12, p. 2953, Nov. 2022.
the identification of weed seedling,’’ Agronomy, vol. 12, no. 11, p. 2867, [106] H. S. Fatima, I. U. Hassan, S. Hasan, M. Khurram, D. Stricker, and
Nov. 2022. M. Z. Afzal, ‘‘Formation of a lightweight, deep learning-based weed
[85] S. Tiwari, A. K. Sharma, A. Jain, D. Gupta, M. Gono, R. Gono, detection system for a commercial autonomous laser weeding robot,’’
Z. Leonowicz, and M. Jasiński, ‘‘IoT-enabled model for weed seedling Appl. Sci., vol. 13, no. 6, p. 3997, Mar. 2023.
classification: An application for smart agriculture,’’ AgriEngineering, [107] P. P. F. Rajeena, W. N. Ismail, and M. A. S. Ali, ‘‘A metaheuristic Harris
vol. 5, no. 1, pp. 257–272, Jan. 2023. hawks optimization algorithm for weed detection using drone images,’’
[86] V. Sharma, A. K. Tripathi, H. Mittal, A. Parmar, A. Soni, and R. Amarwal, Appl. Sci., vol. 13, no. 12, p. 7083, Jun. 2023.
‘‘WeedGan: A novel generative adversarial network for cotton weed [108] X. Jin, Y. Sun, J. Che, M. Bagavathiannan, J. Yu, and Y. Chen, ‘‘A novel
identification,’’ Vis. Comput., vol. 39, no. 12, pp. 6503–6519, Dec. 2023. deep learning-based method for detection of weeds in vegetables,’’ Pest
[87] B. Espejo-Garcia, N. Mylonas, L. Athanasakos, E. Vali, and S. Fountas, Manage. Sci., vol. 78, no. 5, pp. 1861–1869, May 2022.
‘‘Combining generative adversarial networks and agricultural transfer [109] Y. Cai, F. Zeng, J. Xiao, W. Ai, G. Kang, Y. Lin, Z. Cai, H. Shi, S. Zhong,
learning for weeds identification,’’ Biosyst. Eng., vol. 204, pp. 79–89, and X. Yue, ‘‘Attention-aided semantic segmentation network for weed
Apr. 2021. identification in pineapple field,’’ Comput. Electron. Agricult., vol. 210,
[88] J. Weyler, F. Magistri, E. Marks, Y. Linn Chong, M. Sodano, Jul. 2023, Art. no. 107881.
G. Roggiolani, N. Chebrolu, C. Stachniss, and J. Behley, ‘‘PhenoBench— [110] A. Picon, M. G. San-Emeterio, A. Bereciartua-Perez, C. Klukas,
A large dataset and benchmarks for semantic image interpretation in the T. Eggers, and R. Navarra-Mestre, ‘‘Deep learning-based segmentation
agricultural domain,’’ 2023, arXiv:2306.04557. of multiple species of weeds and corn crop using synthetic and real
[89] I. Gallo, A. U. Rehman, R. H. Dehkordi, N. Landro, R. La Grassa, and image datasets,’’ Comput. Electron. Agricult., vol. 194, Mar. 2022,
M. Boschetti, ‘‘Deep object detection of crop weeds: Performance of Art. no. 106719.
YOLOv7 on a real case dataset from UAV images,’’ Remote Sens., vol. 15, [111] H. Peng, Z. Li, Z. Zhou, and Y. Shao, ‘‘Weed detection in paddy field
no. 2, p. 539, Jan. 2023. using an improved RetinaNet network,’’ Comput. Electron. Agricult.,
[90] S. I. U. Haq, M. N. Tahir, and Y. Lan, ‘‘Weed detection in wheat crops vol. 199, Aug. 2022, Art. no. 107179.
using image analysis and artificial intelligence (AI),’’ Appl. Sci., vol. 13,
[112] A. M. Almalky and K. R. Ahmed, ‘‘Deep learning for detecting and
no. 15, p. 8840, Jul. 2023.
classifying the growth stages of consolida regalis weeds on fields,’’
[91] H. Jiang, C. Zhang, Y. Qiao, Z. Zhang, W. Zhang, and C. Song, ‘‘CNN Agronomy, vol. 13, no. 3, p. 934, Mar. 2023.
feature based graph convolutional network for weed and crop recognition
[113] M. H. Saleem, J. Potgieter, and K. M. Arif, ‘‘Weed detection by faster
in smart farming,’’ Comput. Electron. Agricult., vol. 174, Jul. 2020,
RCNN model: An enhanced anchor box approach,’’ Agronomy, vol. 12,
Art. no. 105450.
no. 7, p. 1580, Jun. 2022.
[92] R. de Silva, G. Cielniak, G. Wang, and J. Gao, ‘‘Deep learning-based
[114] H. M. Sahin, T. Miftahushudur, B. Grieve, and H. Yin, ‘‘Segmentation of
crop row detection for infield navigation of agri-robots,’’ J. Field Robot.,
weeds and crops using multispectral imaging and CRF-enhanced U-Net,’’
pp. 1–23, Aug. 2023.
Comput. Electron. Agricult., vol. 211, Aug. 2023, Art. no. 107956.
[93] R. de Silva, G. Cielniak, and J. Gao, ‘‘Vision based crop row naviga-
tion under varying field conditions in arable fields,’’ Comput. Electron. [115] R. Raja, T. T. Nguyen, D. C. Slaughter, and S. A. Fennimore, ‘‘Real-
Agricult., vol. 217, Feb. 2024, Art. no. 108581. time weed-crop classification and localisation technique for robotic weed
control in lettuce,’’ Biosyst. Eng., vol. 192, pp. 257–274, Apr. 2020.
[94] D. Patel, M. Gandhi, H. Shankaranarayanan, and A. D. Darji, ‘‘Design
of an autonomous agriculture robot for real-time weed detection using [116] R. Raja, D. C. Slaughter, S. A. Fennimore, T. T. Nguyen, V. L. Vuong,
CNN,’’ in Advances in VLSI and Embedded Systems. Berlin, Germany: N. Sinha, L. Tourte, R. F. Smith, and M. C. Siemens, ‘‘Crop signalling:
Springer, 2022, pp. 141–161. A novel crop recognition technique for robotic weed control,’’ Biosyst.
[95] M. H. Asad, S. Anwar, and A. Bais, ‘‘Improved crop and weed detection Eng., vol. 187, pp. 278–291, Nov. 2019.
with diverse data ensemble learning,’’ 2023, arXiv:2310.01055. [117] M. Gao, F. Yang, H. Wei, and X. Liu, ‘‘Automatic monitoring of maize
[96] S. Farkhani, S. K. Skovsen, M. Dyrmann, R. N. Jørgensen, and seedling growth using unmanned aerial vehicle-based RGB imagery,’’
H. Karstoft, ‘‘Weed classification using explainable multi-resolution slot Remote Sens., vol. 15, no. 14, p. 3671, Jul. 2023.
attention,’’ Sensors, vol. 21, no. 20, p. 6705, Oct. 2021. [118] M. D. Bah, A. Hafiane, and R. Canals, ‘‘Deep learning with unsupervised
[97] P. Chen, X. Ma, F. Wang, and J. Li, ‘‘A new method for crop row detection data labeling for weed detection in line crops in UAV images,’’ Remote
using unmanned aerial vehicle images,’’ Remote Sens., vol. 13, no. 17, Sens., vol. 10, no. 11, p. 1690, Oct. 2018.
p. 3526, Sep. 2021. [119] M. D. Bah, A. Hafiane, and R. Canals, ‘‘CRowNet: Deep net-
[98] F. Kitzler, N. Barta, R. W. Neugschwandtner, A. Gronauer, and V. Motsch, work for crop row detection in UAV images,’’ IEEE Access, vol. 8,
‘‘WE3DS: An RGB-D image dataset for semantic segmentation in agri- pp. 5189–5200, 2020.
culture,’’ Sensors, vol. 23, no. 5, p. 2713, Mar. 2023. [120] B. Costello, O. O. Osunkoya, J. Sandino, W. Marinic, P. Trotter,
[99] D. Su, H. Kong, Y. Qiao, and S. Sukkarieh, ‘‘Data augmentation for deep B. Shi, F. Gonzalez, and K. Dhileepan, ‘‘Detection of parthenium weed
learning based semantic segmentation and crop-weed classification in (Parthenium hysterophorus L.) and its growth stages using artificial intel-
agricultural robotics,’’ Comput. Electron. Agricult., vol. 190, Nov. 2021, ligence,’’ Agriculture, vol. 12, no. 11, p. 1838, Nov. 2022.
Art. no. 106418. [121] S. Kong, J. Li, Y. Zhai, Z. Gao, Y. Zhou, and Y. Xu, ‘‘Real-time detection
[100] A. Subeesh, S. Bhole, K. Singh, N. S. Chandel, Y. A. Rajwade, of crops with dense planting using deep learning at seedling stage,’’
K. V. R. Rao, S. P. Kumar, and D. Jat, ‘‘Deep convolutional neural net- Agronomy, vol. 13, no. 6, p. 1503, May 2023.
work models for weed detection in polyhouse grown bell peppers,’’ Artif. [122] E.-S.-M. El-Kenawy, N. Khodadadi, S. Mirjalili, T. Makarovskikh,
Intell. Agricult., vol. 6, pp. 47–54, Jan. 2022. M. Abotaleb, F. K. Karim, H. K. Alkahtani, A. A. Abdelhamid,
[101] J. Yu, S. M. Sharpe, A. W. Schumann, and N. S. Boyd, ‘‘Detection M. M. Eid, T. Horiuchi, A. Ibrahim, and D. S. Khafaga, ‘‘Metaheuristic
of broadleaf weeds growing in turfgrass with convolutional neural net- optimization for improving weed detection in wheat images captured by
works,’’ Pest Manage. Sci., vol. 75, no. 8, pp. 2211–2218, Aug. 2019. drones,’’ Mathematics, vol. 10, no. 23, p. 4421, Nov. 2022.
[102] J. Yang, Y. Wang, Y. Chen, and J. Yu, ‘‘Detection of weeds growing in [123] S. K. Nagothu, G. Anitha, B. Siranthini, V. Anandi, and P. S. Prasad,
alfalfa using convolutional neural networks,’’ Agronomy, vol. 12, no. 6, ‘‘Weed detection in agriculture crop using unmanned aerial vehicle and
p. 1459, Jun. 2022. machine learning,’’ Mater. Today, Proc., Mar. 2023.
[103] Z. Guo, H. H. Goh, X. Li, M. Zhang, and Y. Li, ‘‘WeedNet-R: A [124] C. C. D. Lelong, P. Burger, G. Jubelin, B. Roux, S. Labbé, and F. Baret,
sugar beet field weed detection algorithm based on enhanced RetinaNet ‘‘Assessment of unmanned aerial vehicles imagery for quantitative moni-
and context semantic fusion,’’ Frontiers Plant Sci., vol. 14, Jul. 2023, toring of wheat crop in small plots,’’ Sensors, vol. 8, no. 5, pp. 3557–3585,
Art. no. 1226329. May 2008.

VOLUME 12, 2024 113213


D. G Pai et al.: DL Techniques for Weed Detection in Agricultural Environments: A Comprehensive Review

[125] S. Christensen, H. T. Søgaard, P. Kudsk, M. Nørremark, I. Lund, RADHIKA KAMATH is currently an Associate
E. S. Nadimi, and R. Jørgensen, ‘‘Site-specific weed control technolo- Professor with the Department of Computer Sci-
gies,’’ Weed Res., vol. 49, no. 3, pp. 233–241, Jun. 2009. ence and Engineering, Manipal Institute of Tech-
[126] A. Bakhshipour, A. Jafari, S. M. Nassiri, and D. Zare, ‘‘Weed segmenta- nology, Manipal Academy of Higher Education,
tion using texture features extracted from wavelet sub-images,’’ Biosyst. Manipal, India. Her research interests include
Eng., vol. 157, pp. 1–12, May 2017. computer vision-based applications for agriculture
[127] A. Tellaeche, G. Pajares, X. P. Burgos-Artizzu, and A. Ribeiro, ‘‘A com- and wireless visual sensor network applications in
puter vision approach for weeds identification through support vector agriculture.
machines,’’ Appl. Soft Comput., vol. 11, no. 1, pp. 908–915, Jan. 2011.
[128] P. Bosilj, T. Duckett, and G. Cielniak, ‘‘Analysis of morphology-based
features for classification of crop and weeds in precision agriculture,’’
IEEE Robot. Autom. Lett., vol. 3, no. 4, pp. 2950–2956, Oct. 2018.
[129] Y. Lu, D. Chen, E. Olaniyi, and Y. Huang, ‘‘Generative adversar-
ial networks (GANs) for image augmentation in agriculture: A sys-
tematic review,’’ Comput. Electron. Agricult., vol. 200, Sep. 2022,
Art. no. 107208.
[130] M. Durgadevi, ‘‘Generative adversarial network (GAN): A general review
on different variants of GAN and applications,’’ in Proc. 6th Int. Conf.
Commun. Electron. Syst. (ICCES), Jul. 2021, pp. 1–8.
[131] J. Zhang, X. Wang, J. Liu, D. Zhang, Y. Lu, Y. Zhou, L. Sun, S. Hou,
X. Fan, S. Shen, and J. Zhao, ‘‘Multispectral drone imagery and SRGAN
for rapid phenotypic mapping of individual Chinese cabbage plants,’’
Plant Phenomics, vol. 2022, p. 0007, Jan. 2022.
[132] K. Weiss, T. M. Khoshgoftaar, and D. Wang, ‘‘A survey of transfer
learning,’’ J. Big data, vol. 3, pp. 1–40, Dec. 2016.
MAMATHA BALACHANDRA (Senior Mem-
ber, IEEE) is currently a Professor with the
DEEPTHI G PAI is currently a Research Scholar Department of Computer Science and Engineer-
with the Department of Computer Science and ing, Manipal Institute of Technology, Manipal
Engineering, Manipal Institute of Technology, Academy of Higher Education, Manipal, India.
Manipal Academy of Higher Education, Manipal, She has around 23 years of teaching experience
India. Her research interests include deep learning and had published around 20 research papers
in agriculture and computer vision-based applica- in national/international journals/conferences. Her
tions in agriculture. research interests include mobile ad hoc networks,
the IoT, and network security. She is also on the
editorial board of some journals.

113214 VOLUME 12, 2024

You might also like