Weed Detection Among Crops by Convolutional Neural Networks With Sliding Windows
Keywords. Convolutional neural network, true weed detection rate, crop wastage, sliding
window, sub-image
The authors are solely responsible for the content of this paper, which is not a refereed publication. Citation of this work should state that it
is from the Proceedings of the 14th International Conference on Precision Agriculture. EXAMPLE: Lastname, A. B. & Coauthor, C. D. (2018).
Title of paper. In Proceedings of the 14th International Conference on Precision Agriculture (unpaginated, online). Monticello, IL: International
Society of Precision Agriculture.
1.2. Dataset
The dataset consists of 60 images, along with annotations, available online
(https://github.com/cwfid/dataset). The images were acquired by two German researchers,
Sebastian Haug and Jörn Ostermann, with the help of the autonomous field robot Bonirob on an
organic carrot farm while the plants were still at an early growth stage [9]. At the time the
images were captured, the weeds and crops were of approximately the same size.
2. Approach:
The weed detection process consists of two sub-processes: extracting and labelling sub-images
from the input, and building the network architecture. The image extraction process divides each
incoming training image into sub-images, and the resulting sub-image collections are passed to
the Convolutional Neural Network model to predict the potential weed regions in the test images.
Fig. 1 shows the sequence of layers in the Convolutional Neural Network model used for predictions.
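The sub-image extraction step can be sketched as follows. This is a minimal illustration, not the authors' exact procedure: the function name, the `(image, mask)` input format, and the rule that a sub-image is labelled "weed" when its annotation mask contains any weed pixels are all assumptions made for the example.

```python
import numpy as np

def extract_subimages(image, mask, window=80, stride=80, weed_label=1):
    """Slice an image into square sub-images with a sliding window.

    Each sub-image is labelled 1 ("weed") when the corresponding region of
    the annotation mask contains any weed pixels, otherwise 0. The labelling
    rule and box geometry are illustrative assumptions.
    """
    subimages, labels = [], []
    h, w = image.shape[:2]
    for y in range(0, h - window + 1, stride):
        for x in range(0, w - window + 1, stride):
            subimages.append(image[y:y + window, x:x + window])
            # Label from the annotation mask of the same region
            labels.append(int(np.any(mask[y:y + window, x:x + window] == weed_label)))
    return np.array(subimages), np.array(labels)
```

For a 160 x 160 image with an 80-pixel window and stride, this yields four labelled sub-images, which would then be fed to the CNN.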
3. Experimental evaluation:
3.1. Performance measures
After the model is built, the sub-image collection for a given sliding-window size, along with its
labels, is passed into the network to predict the potential weed regions, and the performance of
the model is measured in terms of true weed detection rate and crop wastage.
CW = mean(percentage of crop pixels contained in the predicted weed boxes) over all images (3)
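Equation (3) can be computed as in the sketch below. The box format `(y, x, h, w)` and the function signature are assumptions for illustration; the metric itself follows the definition above: for each image, the percentage of crop pixels falling inside predicted weed boxes, averaged over all images.

```python
import numpy as np

def crop_wastage(crop_masks, weed_boxes_per_image):
    """Crop wastage (Eq. 3): mean over all images of the percentage of
    crop pixels contained in the predicted weed boxes.

    crop_masks: list of boolean arrays marking crop pixels per image.
    weed_boxes_per_image: list of lists of (y, x, h, w) predicted boxes.
    """
    percentages = []
    for crop_mask, boxes in zip(crop_masks, weed_boxes_per_image):
        total_crop = crop_mask.sum()
        if total_crop == 0:
            continue  # skip images with no crop pixels
        covered = np.zeros_like(crop_mask, dtype=bool)
        for (y, x, h, w) in boxes:
            covered[y:y + h, x:x + w] = True
        percentages.append(100.0 * (crop_mask & covered).sum() / total_crop)
    return float(np.mean(percentages))
```

For example, if a predicted weed box covers half of an image's crop pixels, that image contributes 50% to the mean.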
Collections of sub-images sliced with sliding windows of various sizes are passed into the same
network in order to choose the optimum window size, i.e. the one that yields a high true weed
detection rate together with low crop wastage.
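The selection step above amounts to maximizing the WD/CW ratio across candidate window sizes, which can be sketched as follows. The dictionary shape and the numbers in the usage example are illustrative assumptions, not measured results, except that the paper reports WD = 63.28% and CW = 13.33% for the [80 80] window.

```python
def select_window(results):
    """Pick the sliding-window size with the highest WD/CW ratio.

    results: dict mapping window size to a (WD, CW) pair in percent.
    """
    return max(results, key=lambda size: results[size][0] / results[size][1])
```

With illustrative entries such as `{40: (90.0, 45.0), 80: (63.28, 13.33), 160: (20.0, 10.0)}`, the ratio 63.28/13.33 is the largest and the 80-pixel window is selected.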
Table 1 shows the recall, precision, WD (F-score, i.e. weed detection rate), CW (crop wastage) and Ratio (WD/CW) values for
different sliding-window sizes when the corresponding collections of sub-images were passed into the CNN model.
Fig. 2 plots the recall, precision, F-score and crop wastage values against increasing window sizes.
4. Conclusion:
A sliding-window approach was proposed to predict the most likely weed regions in the organic
carrot farm imagery while causing the least possible damage to the crops. The experiments
conducted showed that the window size strongly affects both the weed detection rate and the
percentage of crop wastage. When the sliding window is too small, the true weed detection rate
is very high, but so is the crop damage rate. This indicates that although small sliding windows
are good predictors of true weed regions, they also capture many crop pixels, so there is a high
chance of farmers spraying chemicals on crop regions too, thereby damaging crop health. When the
sliding window is too large, the crop wastage percentage is very small, but the true weed
detection rate is no longer high enough to detect all the weeds in the field. When the ratio
between true weed detection rate and crop wastage was computed, a sliding-window size of [80 80]
was found to give effective weed detection (WD = 63.28%) while causing the least crop damage
(CW = 13.33%).
5. References
Article by DOI
[1] Tellaeche, A., Burgos-Artizzu, X. P., Pajares, G., Ribeiro, A. (2007). A vision-based method for weeds
identification through the Bayesian decision theory. Pattern Recognition, 41, 521-530. doi:
10.1016/j.patcog.2007.07.007
Proceedings paper
[2] Shapira, U., Karnieli, A., Bonfil, D. (2010). Weeds detection by ground-level hyperspectral data. International
Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences - ISPRS Archives, 38.
Article by DOI
[3] Hemming, J., Rath, T. (2001). PA—Precision Agriculture: Computer-Vision-based Weed Identification under Field
Conditions using Controlled Lighting. Journal of Agricultural Engineering Research. 78. 233-243. doi:
10.1006/jaer.2000.0639.
Online document
[4] Milioto, A., Lottes, P., Stachniss, C. (2017). Real-time Semantic Segmentation of Crop and Weed for Precision
Agriculture Robots Leveraging Background Knowledge in CNNs. https://arxiv.org/abs/1709.06764.
Article by DOI
[5] Sa, I., Chen, Z., Popovic, M., Khanna, R., Liebisch, F., Nieto, J., Siegwart, R. (2017). weedNet: Dense Semantic
Weed Classification Using Multispectral Images and MAV for Smart Farming. IEEE Robotics and Automation Letters.
PP. doi:10.1109/LRA.2017.2774979.
Proceedings paper
[6] Potena, C., Nardi, D., Pretto, A. (2017). Fast and Accurate Crop and Weed Identification with Summarized Train
Sets for Precision Agriculture. Intelligent Autonomous Systems 14: Proceedings of the 14th International Conference
IAS-14 (pp. 105-121). doi: 10.1007/978-3-319-48036-7_9
Article by DOI
[7] Nejati, H., Azimifar, Z., Zamani, M. (2008). Using fast Fourier transform for weed detection in corn fields.
1215-1219. doi: 10.1109/ICSMC.2008.4811448.
Article by DOI
[8] Watchareeruetai, U., Takeuchi, Y., Matsumoto, T., Kudo, H., Ohnishi, N. (2006). Computer Vision Based Methods
for Detecting Weeds in Lawns. Machine Vision and Applications 17. 287-296. doi:10.1109/ICCIS.2006.252275.
Proceedings paper
[9] Haug, S., Ostermann, J. (2014). A Crop/Weed Field Image Dataset for the Evaluation of Computer Vision Based
Precision Agriculture Tasks. Computer Vision - ECCV 2014 Workshops: Zurich, Switzerland, September 6-7 and 12,
2014, Proceedings, Part IV (pp. 105-116). doi: 10.1007/978-3-319-16220-1_8