Computer Vision and Machine Learning in Agriculture
Algorithms for Intelligent Systems
Series Editors
Jagdish Chand Bansal, Department of Mathematics, South Asian University,
New Delhi, Delhi, India
Kusum Deep, Department of Mathematics, Indian Institute of Technology Roorkee,
Roorkee, Uttarakhand, India
Atulya K. Nagar, School of Mathematics, Computer Science and Engineering,
Liverpool Hope University, Liverpool, UK
This book series publishes research on the analysis and development of algorithms for
intelligent systems with their applications to various real world problems. It covers
research related to autonomous agents, multi-agent systems, behavioral modeling,
reinforcement learning, game theory, mechanism design, machine learning, meta-
heuristic search, optimization, planning and scheduling, artificial neural networks,
evolutionary computation, swarm intelligence and other algorithms for intelligent
systems.
The book series includes recent advancements, modifications, and applications of
artificial neural networks, evolutionary computation, swarm intelligence, artificial
immune systems, fuzzy systems, autonomous and multi-agent systems, machine
learning, and other intelligent systems related areas. The material will be beneficial
for graduate students, postgraduate students, as well as researchers who want a
broader view of advances in algorithms for intelligent systems. The contents will
also be useful to researchers from other fields who have no knowledge of the power
of intelligent systems, e.g., researchers in the field of bioinformatics, biochemists,
mechanical and chemical engineers, economists, musicians, and medical
practitioners.
The series publishes monographs, edited volumes, advanced textbooks and
selected proceedings.
Computer Vision and Machine Learning in Agriculture

Editors

Mohammad Shorif Uddin
Department of Computer Science and Engineering
Jahangirnagar University
Dhaka, Bangladesh

Jagdish Chand Bansal
Department of Applied Mathematics
South Asian University
New Delhi, India
© The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Singapore
Pte Ltd. 2021
This work is subject to copyright. All rights are solely and exclusively licensed by the Publisher, whether
the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse
of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and
transmission or information storage and retrieval, electronic adaptation, computer software, or by similar
or dissimilar methodology now known or hereafter developed.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication
does not imply, even in the absence of a specific statement, that such names are exempt from the relevant
protective laws and regulations and therefore free for general use.
The publisher, the authors and the editors are safe to assume that the advice and information in this book
are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or
the editors give a warranty, expressed or implied, with respect to the material contained herein or for any
errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional
claims in published maps and institutional affiliations.
This Springer imprint is published by the registered company Springer Nature Singapore Pte Ltd.
The registered company address is: 152 Beach Road, #21-01/04 Gateway East, Singapore 189721,
Singapore
Preface
of fresh and rotten fruits and confirms that the proposed deep learning architec-
ture outperforms the existing approaches. Chapter “Deep Learning-Based Essen-
tial Paddy Pests’ Filtration Technique for Economic Damage Management” illus-
trates a region-based deep convolutional neural network known as Faster R-CNN to
perform the detection and identification of both beneficial and non-beneficial paddy
pests from the images. It has investigated three models of Faster R-CNN based on
ResNet-101, VGG-16, and MobileNet and has obtained the highest accuracy from
the ResNet-101-based Faster R-CNN. Besides, it developed an extensive dataset
of paddy pests. Chapter “Deep CNN-Based Mango Insect Classification” discusses
the creation of a quality dataset containing three different types of common mango
insects. After that, it performs a classification of insects using an ensemble of three
fine-tuned deep learning models, namely Xception, MobileNet, and VGG19. The
ensemble model achieves a very good classification accuracy. Chapter “Implemen-
tation of a Convolutional Neural Network for the Detection of Tomato-Leaf Diseases”
shows the implementation of a deep convolutional neural network for the early detec-
tion of tomato leaf diseases. Chapter “A Multi-Plant Disease Diagnosis Method
Using Convolutional Neural Network” presents an optimal plant disease diagnosis
model for multiple plants using convolutional neural networks. Chapter “A Deep
Learning-Based Approach for Potato Disease Classification” investigates an early
detection of potato disease through different deep CNN strategies by developing a
dataset containing 7870 images of various diseases. Based on accuracy, precision,
recall, and F1 score it finds that the ResNet is the best model for this particular
application. Chapter “An In-Depth Analysis of Different Segmentation Techniques
in Automated Local Fruit Disease Recognition” describes four different segmenta-
tion strategies, namely Otsu’s method, K-means clustering, fuzzy c-means clustering,
and region growing, for the extraction of defective regions of
three common fruits of Bangladesh, namely guava, jackfruit, and papaya. The K-means
clustering technique gives the best performance among these segmentation tech-
niques based on six quantitative analysis metrics by attaining an aggregate accuracy
of 81.65%. Chapter “Machine Vision-Based Fruit and Vegetable Disease Recog-
nition: A Review” presents a comprehensive survey of the recent advancement of
computer vision and machine learning research efforts for fruit and vegetable disease
recognition. It also presents a comparative study of these efforts to identify state-
of-the-art techniques and outlines directions for future research. Chapter “An Efficient
Bag-of-Features for Diseased Plant Identification” proposes a bag-of-features based
diseased plant identification method using gray relational analysis. It presents the
experimental results on a publicly available leaf-image dataset (PlantVillage).
This book is expected to be very useful to researchers, academicians, undergrad-
uate and postgraduate students who wish to work and explore the applications of
computer vision and machine learning systems in the agricultural sector for boosting
production.
We sincerely appreciate the time, effort, and contribution of the authors and
esteemed reviewers in maintaining the quality of the papers. Special thanks to the
editors and supporting team of Springer for helping in publishing this book.
Prof. Mohammad Shorif Uddin received his Ph.D. degree in Information Science
from Kyoto Institute of Technology, Japan, in 2002, his Master of Technology Educa-
tion degree from Shiga University, Japan, in 1999, his Bachelor of Electrical and
Electronic Engineering degree from Bangladesh University of Engineering and Tech-
nology in 1991, and an MBA from Jahangirnagar University in 2013. He began
his teaching career as Lecturer in 1991 at Bangladesh Institute of Technology, Chit-
tagong (Renamed as CUET). He joined the Department of Computer Science and
Engineering of Jahangirnagar University in 1992, and currently, he is Professor of
this department. In addition, he is Teacher-in-Charge of ICT Cell of Jahangirnagar
University. He served as Chairman of Computer Science and Engineering of Jahangir-
nagar University from June 2014 to June 2017. He undertook postdoctoral research
at Bioinformatics Institute, Singapore, Toyota Technological Institute, Japan, Kyoto
Institute of Technology, Japan, Chiba University, Japan, Bonn University, Germany,
and Institute of Automation, Chinese Academy of Sciences, China. His research is
motivated by applications in the fields of artificial intelligence, imaging informatics,
and computer vision. Mohammad Uddin is IEEE Senior Member and Fellow of
Bangladesh Computer Society (BCS) and The Institution of Engineers Bangladesh
(IEB). He has taught a good number of undergraduate and graduate courses, written
more than 140 journal and conference papers, and organized several national and
international conferences and seminars. He has delivered a remarkable number of
keynotes and invited talks and acted as General Chair as well as TPC Chair of
many international conferences. He holds two patents for his scientific inventions
and received the Best Paper Award in the International Conference on Informatics,
Electronics & Vision (ICIEV2013), Dhaka, Bangladesh, and Best Presenter Award
from the International Conference on Computer Vision and Graphics (ICCVG 2004),
Warsaw, Poland. He is an Associate Editor of IEEE Access.
Dr. Jagdish Chand Bansal is Associate Professor at South Asian University, New
Delhi, and Visiting Faculty in Maths and Computer Science at Liverpool Hope Univer-
sity, UK. Dr. Bansal obtained his Ph.D. in Mathematics from IIT Roorkee. Before
joining SAU New Delhi, he worked as Assistant Professor at ABV-Indian Insti-
tute of Information Technology and Management Gwalior and BITS Pilani. His
primary area of interest is swarm intelligence and nature-inspired optimization tech-
niques. Recently, he proposed a fission–fusion social structure-based optimization
algorithm, Spider Monkey Optimization (SMO), which is being applied to various
problems in the engineering domain. He has published more than 70 research papers
in various international journals/conferences. He is Series Editor of the book series
Algorithms for Intelligent Systems (AIS) and Studies in Autonomic, Data-driven and
Industrial Computing (SADIC) published by Springer. He is Editor-in-Chief of the
International Journal of Swarm Intelligence (IJSI) published by Inderscience. He is
also Associate Editor of IEEE Access published by IEEE and Array published
by Elsevier. He is Steering Committee Member and General Chair of the annual
conference series SocProS. He is General Secretary of Soft Computing Research
Society (SCRS). He has also received Gold Medal at UG and PG levels.
1 Introduction
M. S. Uddin (B)
Department of Computer Science and Engineering, Jahangirnagar University, Savar, Dhaka,
Bangladesh
e-mail: [email protected]
J. C. Bansal
Department of Applied Mathematics, South Asian University, New Delhi, India
e-mail: [email protected]
© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2021
M. S. Uddin and J. C. Bansal (eds.), Computer Vision and Machine Learning
in Agriculture, Algorithms for Intelligent Systems,
https://doi.org/10.1007/978-981-33-6424-0_1
Fig. 2 A schematic flow diagram of computer vision-based machine learning as well as a deep
learning system
• LeNet
• VGGNet (VGG 16, VGG 19)
• ResNet (ResNet 50, ResNeXt 50)
• DenseNet (DenseNet 121, DenseNet 161, DenseNet 201)
• GoogLeNet (Inception V1, Inception V3, Inception V4)
• Xception
• AlexNet
• MobileNet
• NASNetMobile
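In practice, architectures such as those listed above are most often used as pretrained backbones and fine-tuned on agricultural image data. The sketch below is a minimal, hedged illustration of that workflow, not code from this book: it assumes the torchvision library, a ResNet-50 backbone, and a hypothetical five-class crop-disease task.

```python
# Minimal transfer-learning sketch (illustrative assumptions: torchvision,
# ResNet-50 backbone, a hypothetical 5-class crop-disease task).
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 5  # hypothetical number of crop/disease categories

# Load an ImageNet-pretrained backbone and freeze its feature extractor.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
for param in model.parameters():
    param.requires_grad = False

# Replace the final fully connected layer with a task-specific head.
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

# Forward pass on a dummy batch of 224x224 RGB images.
dummy = torch.randn(4, 3, 224, 224)
logits = model(dummy)
print(logits.shape)  # torch.Size([4, 5])
```

Only the new head would then be trained on labeled leaf or fruit images, a common strategy when agricultural datasets are small.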
production efficiency by reducing monitoring time and labor. CVML tech-
niques can detect delicate changes in crop growth due to malnutrition at an
early stage and efficiently monitor crop health on a regular basis [18–20].
(b) Diseases, Weeds, and Insects Detection
CVML employs diverse techniques for detecting different types of crop diseases,
pests, and weeds [21–28]. An overall review of the use of vision-based systems
in pest, disease, and weed detection was presented in [29].
(c) Automatic Crop Harvesting
CVML has brought revolutionary changes in the automation of harvesting different
types of vegetables and fruits, such as cucumbers, apples, and cherries, using
robotic systems [30–32].
(d) Product Inspection and Quality Testing
Several CVML techniques have been applied for inspection and quality testing
of agricultural products particularly for fruits and vegetables and are described
in [33–35].
(e) Plant Phenotyping
Plant phenotyping is a scientific process of identifying physical plant charac-
teristics and function (known as the phenotype) that can be jointly affected by
genotype and environment. In recent years, computer vision technologies with
deep learning have been widely used in plant phenology and phenotyping to
improve plant productivity [36–38].
(f) Species Recognition
CVML can be used for faster detection and classification of plant species to
reduce the classification time without human effort. A research study based on
the identification and classification of three legume species (white beans, red
beans, and soybean) through leaf vein patterns was presented in [39].
(g) Yield Prediction
Yield prediction has become one of the most popular research topics in preci-
sion agriculture, as machine learning approaches have outperformed traditional
methods based on simple prediction of crop yield. Several surveys on yield prediction using machine
learning algorithms have been conducted during the past few years [40–42].
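As a hedged illustration of this idea (not a method taken from the cited surveys), the sketch below fits a regression model to synthetic per-field features; the feature names, value ranges, and model choice are assumptions made purely for demonstration.

```python
# Illustrative yield-prediction sketch on synthetic data (all features,
# ranges, and coefficients below are invented for demonstration only).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
# Hypothetical per-field features: rainfall (mm), mean NDVI, nitrogen (kg/ha).
X = np.column_stack([
    rng.uniform(200, 800, n),
    rng.uniform(0.2, 0.9, n),
    rng.uniform(0, 150, n),
])
# Synthetic yield (t/ha) with noise.
y = 0.004 * X[:, 0] + 5.0 * X[:, 1] + 0.01 * X[:, 2] + rng.normal(0, 0.3, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("MAE (t/ha):", mean_absolute_error(y_te, model.predict(X_te)))
```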
(h) Water Management
Water management has significant impacts on hydrological, climatological, and
agronomical balance in agriculture. Several machine learning algorithms have
been developed to build an effective regular irrigation system based on weather
conditions and evaporation [43–45].
(i) Soil Management
Machine learning algorithms are used to study evaporation processes and measure
soil moisture and temperature for a better understanding of the essential eco-elements
in agriculture [46, 47].
4 Conclusion
This chapter presented an introduction to computer vision and machine learning appli-
cations in agriculture that can serve as a strong reference by illustrating the latest
advancements in agricultural machine vision applications. It will certainly moti-
vate researchers to contribute to the development of agricultural tools for crop
health growth monitoring, disease and pest detection and control, weeding, irriga-
tion, crop management, and harvesting with low cost and high efficiency. Finally, it
can be concluded that in the future, machine vision technology associated with large-
scale datasets will be immensely used in every aspect of agricultural automation to
overcome the current challenges in agriculture.
References
1. Liakos, K.G., Busato, P., Moshou, D., Pearson, S., Bochtis, D.: Machine learning in agriculture:
A review. Sensors 18(8), 1–29 (2018)
2. Bochtis, D.D., Sørensen, C.G.C., Busato, P.: Advances in agricultural machinery management:
A review. Biosys. Eng. 126, 69–81 (2014)
3. Hunter, M., Smith, R., Schipanski, M., Atwood, L., Mortensen, D.: Agriculture in 2050:
Recalibrating targets for sustainable intensification. Bioscience 67(4), 386–391 (2017)
4. Wu, X., Guo, J., Han, M., Chen, G.: An overview of arable land use for the world economy:
From source to sink via the global supply chain. Land Use Policy 76, 201–214 (2018)
5. Aubert, B.A., Schroeder, A., Grimaudo, J.: IT as enabler of sustainable farming: An empirical
analysis of farmers’ adoption decision of precision agriculture technology. Decis. Support Syst.
54(1), 510–520 (2012)
6. Gomes, J.F.S., Leta, F.R.: Applications of computer vision techniques in the agriculture and
food industry: A review. Eur. Food Res. Technol. 235(6), 989–1000 (2012)
7. Rehman, T.U., Mahmud, M.S., Chang, Y.K., Jin, J., Shin, J.: Current and future applications
of statistical machine learning algorithms for agricultural machine vision systems. Comput.
Electron. Agric. 156, 585–605 (2019)
8. Mahajan, S., Das, A., Sardana, H.K.: Image acquisition techniques for assessment of legume
quality. Trends Food Sci. Technol. 42(2), 116–133 (2015)
9. Vithu, P., Moses, J.A.: Machine vision system for food grain quality evaluation: A review.
Trends Food Sci. Technol. 56, 13–20 (2016)
10. Kirk, D.B., Hwu, W.W.: Programming Massively Parallel Processors: A Hands-on Approach,
3rd edn. Morgan Kaufmann, San Francisco, CA, USA (2016)
11. Tripicchio, P., Satler, M., Dabisias, G., Ruffaldi, E., Avizzano, C.A.: Towards smart farming
and sustainable agriculture with drones. Int. Conf. Intell. Environ. 140–143 (IEEE, 2015)
12. Wójtowicz, M., Wójtowicz, A., Piekarczyk, J.: Application of remote sensing methods in
agriculture. Commun. Biometry Crop Sci. 11(1), 31–50 (2016)
13. Krizhevsky, A., Sutskever, I., Hinton, G.E.: ImageNet classification with deep convolutional
neural networks. Adv. Neural. Inf. Process. Syst. 25(2), 1097–1105 (2012)
14. Szegedy, C., Liu, W., Jia, Y., Sermanet, P., Reed, S., Anguelov, D., Erhan, D., Vanhoucke, V.,
Rabinovich, A.: Going deeper with convolutions. In: IEEE Conference on Computer Vision and
Pattern Recognition, pp. 1–9, Boston, MA (2015)
15. Simonyan, K., Zisserman, A.: Very Deep Convolutional Networks for Large-Scale Image
Recognition (2014). Available Online: https://arxiv.org/pdf/1409.1556
16. Cheng, G., Han, J., Lu, X.: Remote sensing image scene classification: Benchmark and state
of the art. Proc. IEEE 105(10), 1865–1883 (2017)
17. Mavridou, E., Vrochidou, E., Papakostas, G.A., Pachidis, T., Kaburlasos, V.G.: Machine vision
systems in precision agriculture for crop farming. J. Imaging 5(12), 1–32 (2019)
18. Li, K., Lian, H., Deun, R.V., Brik, M.G.: A far-red-emitting NaMgLaTeO6:Mn4+ phosphor
with perovskite structure for indoor plant growth. Dyes Pigm. 162, 214–221 (2019)
19. Shimizu, H., Heins, R.: Computer-vision-based system for plant growth analysis. Trans. ASAE
38(3), 959–964 (1995)
20. Tombe, R.: Computer vision for smart farming and sustainable agriculture. In: 2020 IST-Africa
Conference (IST-Africa), Kampala, Uganda, pp. 1–8 (2020)
21. Maharlooei, M., Sivarajan, S., Bajwa, S.G., Harmon, J.P., Nowatzki, J.: Detection of soybean
aphids in a greenhouse using an image processing technique. Comput. Electron. Agric. 132,
63–70 (2017)
22. Xiaolong, L., Ma, Z., Bienvenido, F., Feng, Q., Haiguang, W., Álvarez-Bermejo, J.A.: Devel-
opment of automatic counting system for Urediospores of wheat stripe rust based on image
processing. Int. J. Agric. Biol. Eng. 10(5), 134–143 (2017)
23. Wang, G., Sun, Y., Wang, J.: Automatic image-based plant disease severity estimation using
deep learning. Comput. Intell. Neurosci. 2017, 1–8 (2017)
24. Liu, H., Chahl, J.S.: A multispectral machine vision system for invertebrate detection on green
leaves. Comput. Electron. Agric. 150, 279–288 (2018)
25. Zhong, Y., Gao, J., Lei, Q., Zhou, Y.: A vision-based counting and recognition system for flying
insects in intelligent agriculture. Sensors 18(5), 1–19 (2018)
26. Tellaeche, A., Pajares, G., Burgos-Artizzu, X.P., Ribeiro, A.: A computer vision approach for
weeds identification through support vector machines. Appl. Soft Comput. 11(1), 908–915
(2011)
27. Sabzi, S., Abbaspour-Gilandeh, Y., Garcia-Mateos, G.: A fast and accurate expert system for
weed identification in potato crops using metaheuristic algorithms. Comput. Industry 98, 80–89
(2018)
28. Pantazi, X.E., Tamouridou, A.A., Alexandridis, T.K., Lagopodi, A.L., Kashefi, J., Moshou, D.:
Evaluation of hierarchical self-organising maps for weed mapping using UAS multispectral
imagery. Comput. Electron. Agric. 139, 224–230 (2017)
29. Muppala, C., Guruviah, V.: Machine vision detection of pests, diseases and weeds: A review.
J. Phytol. 12, 9–19 (2020)
30. Yuan, T., Xu, C.-G., Ren, Y.-X., Feng, Q.-C., Tan, Y.-Z., Li, W.: Detecting the information of
cucumber in greenhouse for picking based on NIR image. Guang Pu Xue Yu Guang Pu Fen Xi
29(8), 2054–2058 (2009)
31. Davidson, J.R., Silwal, A., Hohimer, C.J., Karkee, M., Mo, C, Zhang, Q.: Proof-of-concept
of a robotic apple harvester. In: International Conference on Intelligent Robots and Systems
(IROS). IEEE/RSJ, Daejeon, pp. 634–639 (2016)
32. Zhang, Q., Chen, S., Yu, T., Wang, Y.: Cherry recognition in natural environment based on the
vision of picking robot. IOP Conf. Series Earth Environ. Sci. 61(1), 1–6 (2017)
33. Patel, K.K., Kar, A., Jha, S.N., Khan, M.A.: Machine vision system: A tool for quality inspection
of food and agricultural products. J. Food Sci. Technol. 49(2), 123–141 (2012)
34. Saldaña, E., Siche, R., Luján, M., Quevedo, R.: Review: Computer vision applied to the
inspection and quality control of fruits and vegetables. Braz. J. Food Technol. 16(4), 254–272
(2013)
35. Bhargava, A., Bansal, A.: Fruits and vegetables quality evaluation using computer vision: A
review. J. King Saud Univ. Computer Inf. Sci. 1–15 (2018)
36. Mochida, K., Koda, S., Inoue, K., Hirayama, T., Tanaka, S., Nishii, R., Melgani, F.:
Computer vision-based phenotyping for improvement of plant productivity: A machine learning
perspective. GigaScience 8(1), 1–12 (2018)
37. Li, Z., Guo, R., Li, M., Chen, Y., Li, G.: A review of computer vision technologies for plant
phenotyping. Comput. Electron. Agric. 176, 1–21 (2020)
38. Chandra, A.L., Desai, S.V., Guo, W., Balasubramanian, V.N.: Computer vision with deep
learning for plant phenotyping in agriculture: A survey. Adv. Comput. Commun. 1–27 (2020)
39. Grinblat, G.L., Uzal, L.C., Larese, M.G., Granitto, P.M.: Deep learning for plant identification
using vein morphological patterns. Comput. Electron. Agric. 127, 418–424 (2016)
40. Chlingaryan, A., Sukkarieh, S., Whelan, B.: Machine learning approaches for crop yield predic-
tion and nitrogen status estimation in precision agriculture: A review. Comput. Electron. Agric.
151, 61–69 (2018)
41. Elavarasan, D., Vincent, D.R., Sharma, V., Zomaya, A.Y., Srinivasan, K.: Forecasting yield by
integrating agrarian factors and machine learning models: A survey. Comput. Electron. Agric.
155, 257–282 (2018)
42. van Klompenburg, T., Kassahun, A., Catal, C.: Crop yield prediction using machine learning:
A systematic literature review. Comput. Electron. Agric. 177, 1–18 (2020)
43. Mohammadi, K., Shamshirband, S., Motamedi, S. Petkovic, D. Hashim, R., Gocic, M.: Extreme
learning machine based prediction of daily dew point temperature. Comput. Electron. Agric.
117, 214–225 (2015)
44. Patil, A.P., Deka, P.C.: An extreme learning machine approach for modeling evapotranspiration
using extrinsic inputs. Comput. Electron. Agric. 121, 385–392 (2016)
45. Mehdizadeh, S., Behmanesh, J., Khalili, K.: Using MARS, SVM, GEP and empirical equations
for estimation of monthly mean reference evapotranspiration. Comput. Electron. Agric. 139,
103–114 (2017)
46. Morellos, A., Pantazi, X.-E., Moshou, D., Alexandridis, T., Whetton, R., Tziotzios, G., Wieben-
sohn, J., Bill, R., Mouazen, A.M.: Machine learning based prediction of soil total nitrogen,
organic carbon and moisture content by using VIS-NIR spectroscopy. Biosys. Eng. 152,
104–116 (2016)
47. Nahvi, B., Habibi, J., Mohammadi, K., Shamshirband, S., Razgan, O.S.A.: Using self-adaptive
evolutionary algorithm to improve the performance of an extreme learning machine for
estimating soil temperature. Comput. Electron. Agric. 124, 150–160 (2016)
Robots and Drones
in Agriculture—A Survey
1 Introduction
Agriculture is a crucial sector that contributes significantly to the global economy,
as more than 60% of the world's population depends entirely on agriculture for survival
[1]. Additionally, continuous expansion of urbanization, which is responsible for
the gradual destruction of the land area for cultivation, causes large-scale damage
to agriculture [2]. Despite being the leading source of food and income, agronomy
is a tremendously time-consuming, labor-intensive, and slow process. Thus,
agricultural robotics has been introduced to eliminate these barriers and increase the
accuracy of an efficient autonomous agricultural system [3]. During the past few
decades, robotics has been widely applied in different fields, including smart
homes, medical research and diagnosis, manufacturing, the agricultural industry [4–7],
and so on. An agricultural robot is an automated machine that runs
different computational algorithms to increase production efficiency by treating
the agro-products as objects, based on environmental perception [8]. In recent years,
precision agriculture has emerged with artificial intelligence technologies for the automation of
farming processes to minimize labor-intensive work and time [9]. This precision
concept has brought a tremendous change in the design of agricultural tools by
connecting them with small smart devices such as different types of sensors, drones,
trackers, etc. which can easily detect, spray, weed, and pick crops. An agricultural
robot can be designed by using software and GSM to interface the robot with a
computer [10]. Agricultural production has increased to a great extent owing to the
substantial use of these agricultural robots. In this chapter, we review the progress
of research on agricultural robots in terms of design, classification, grafting, picking,
weeding, spraying, harvesting [11–15], etc. along with their features and opera-
tions. Furthermore, we provide a summary of robotic developments along with technical
challenges and future scope in agricultural robotics.
The remainder of this chapter is structured as follows: Sect. 2 presents a
basic architecture and classification of agricultural robots with their functional-
ities. Sections 3 and 4 illustrate various applications of agricultural robots and
drones, respectively. Section 5 shows the commercialization and current challenges
of agricultural robots and drones. Finally, Sect. 6 provides a conclusion.
2 Robotics Basic
Due to the vast application of modern technologies, researchers have recently shown
a high interest in the research and development of agricultural robots. According to
the use of robots in diverse agricultural tasks, agricultural robots can be categorized
mainly as outdoor and indoor robots, which can be further classified according to
their operations. Table 2 illustrates the classification of agricultural robots along with
their specific functions [28, 29].
During the past few decades, even with manpower available, robots
have been used to perform several challenging agricultural tasks such as autonomous
path navigation, grafting, seeding, weeding, spraying, harvesting, and so on. All these
tasks should be studied well for better understanding.
Automatic path navigation is the most critical task when it comes to high-value crops.
It requires localization, tracking, mapping, motion control, and path planning. Robots
generally navigate by using cameras to capture images of the path and plan
the desired route by recognizing the color of the plants, usually by detecting whether
pixels are green or not. Path navigation and planning in agriculture have been studied well, and research
in this sector is developing day by day. A brief review of different path navigation
modules is described as follows and a basic navigation strategy is shown in Fig. 2.
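As a minimal sketch of the green-detection step just described (the HSV thresholds, fallback image, and steering cue are illustrative assumptions, not the method of any system cited in this chapter), a vision-based row follower might mask green vegetation and use its horizontal centroid as a steering signal:

```python
# Illustrative green-vegetation masking for row following (OpenCV).
# Threshold values, the image path, and the fallback image are assumptions.
import cv2
import numpy as np

img = cv2.imread("field_row.jpg")  # hypothetical field image
if img is None:
    # Synthetic fallback so the sketch runs without data: a green "crop row".
    img = np.zeros((240, 320, 3), np.uint8)
    img[:, 150:190] = (40, 180, 60)  # BGR greenish stripe

hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
# Rough "green vegetation" band in HSV; real systems calibrate this per camera.
mask = cv2.inRange(hsv, (35, 40, 40), (85, 255, 255))
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

# Simple row-following cue: the horizontal centroid of the vegetation mask,
# which a steering controller would try to keep at the image centre.
m = cv2.moments(mask)
if m["m00"] > 0:
    cx = m["m10"] / m["m00"]
    print("lateral offset (px):", cx - img.shape[1] / 2.0)
```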
Table 2 (continued)

Category     | Class                    | Name                        | Function
(continued)  | (continued)              | Lumbering robot             | Cleans up different slopes and terrain using a hydraulic system
Indoor robot | Harvesting robots        | Greenhouse harvesting robot | Uses vision-based machines for movement through greenhouse aisles to harvest some sensitive crops like tomatoes and strawberries
Indoor robot | Material handling robots | Greenhouse Material Handler | Performs plant spacing and optimizes plant placement to reduce production costs by controlling the use of water, pesticides, herbicides, and fertilizers for high-quality plants
Hellström [31] combined GPS with Inertial Navigation System (INS) and other
sensors to collect accurate information for navigation by measuring the derivatives
of robot position using gyros and accelerometers. Nagasaka et al. [32] presented
an automated rice transplanter that accurately transplanted rice using a real-time kine-
matic global positioning system (RTK-GPS), a fiber optic gyroscope (FOG) sensor
to measure the direction, and a simple steering controller, keeping the deviation from the desired path below
12 cm. Hellström et al. [33] introduced a superior algorithm for path tracking, where a radar
performed obstacle avoidance and a real-time kinematic differential GPS/GLONASS
completed localization. Li et al. [34] developed a conceptual framework along with
reviewing the guidance of an autonomous agricultural vehicle consisting of naviga-
tion sensors with GPS, computational methods, and steering controllers. Ringdahl
et al. [35] designed an automated vehicle to track two different paths more accu-
rately by using a forwarder with GPS and a gyro. However, this system can track
only previously demonstrated paths.
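The studies above all combine an absolute but relatively slow source (GPS position or heading) with fast inertial measurements (gyros, accelerometers). As a generic, hedged illustration of that fusion idea, and not the filter used in any cited work, a one-dimensional complementary filter for heading could look like this:

```python
# Generic complementary filter sketch: fuse a drifting gyro yaw rate with a
# noisy but absolute GPS-derived heading. All values below are synthetic.
import math

def fuse_heading(heading, gyro_rate, gps_heading, dt, alpha=0.98):
    """Blend dead-reckoned heading (weight alpha) with GPS heading (1 - alpha)."""
    predicted = heading + gyro_rate * dt  # integrate gyro rate (rad)
    # Wrap the correction into (-pi, pi] before blending.
    error = math.atan2(math.sin(gps_heading - predicted),
                       math.cos(gps_heading - predicted))
    return predicted + (1.0 - alpha) * error

heading, dt, turn_rate = 0.0, 0.1, 0.1   # start at 0 rad, 10 Hz, 0.1 rad/s turn
for k in range(50):
    gps = turn_rate * (k + 1) * dt + 0.02 * math.sin(k)  # synthetic noisy fix
    heading = fuse_heading(heading, turn_rate, gps, dt)
print("estimated heading (rad):", round(heading, 3))  # close to the true 0.5
```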
(b) Vision-Based Navigation
Zhao et al. [19] presented a robot that used the Hough transform to track the path from
the continuous sample images captured by the camera. Gottschalk et al. [36] devel-
oped a machine to navigate between two-crop rows for automatic field inspection
through image segmentation, classification, and geometrical line extraction by using
a webcam. Wang et al. [37] presented an improved path navigation framework against
local map navigation based on near-to-far perception. Besides, a large amount of
meticulous research has been conducted to develop automated vision-based navi-
gation systems [38–41]. Weiss et al. [42] introduced a stereo vision system to map
3D field images for autonomous navigation. Ball et al. [43] defined a cost-effective
USA started to produce different types of commercial robot models of automatic fruit
and vegetable grafting [59–64]. Li et al. [65] developed a plantlet-cutting mechanism
for the grafting robot with an efficiency of 99.14%. On the other hand, Zhao et al. [66]
applied a scion cutting mechanism for a sapling grafting robot and obtained 96.5%
efficiency. Jiang [67] developed a grape sapling grafting robot for stick-to-stick grafting. A
grafting robot was presented using the cutting grafting method for camellia oleifera
seedlings in [68].
(b) Picking Robot
California Machinery Company introduced an automatic tomato picking robot in
2004. The robot picked fruits and separated them
from leaves into a sorting bin, with the picked leaves further processed as fertilizer. Nezhad
et al. [69] introduced a tomato picking robot that can identify and pick tomatoes
through image processing. Feng et al. [70] developed an intelligent tomato picking
robot to reduce the labor effort of harvesting fresh tomatoes, with a success rate of 83.9%.
Xiong et al. [71] presented a novel mechanism for faster picking of strawberries
Weeds or unwanted plants must be controlled for faster growth of crops using
automatic weeding [73]. A four-wheeled weeding robot was developed that was
composed of a diesel engine and hydraulic transmission and was able to steer 360° to
detect and remove weeds automatically [74]. A co-robot system along with odometry
was described for intra-row weeding without damaging the crop rows [75]. Fenni-
more et al. [76] focused on the improvement of weeding efficiency with a reduction in
production cost. Bechar and Vigneault [77] discussed the development of weeding
robots for real-field operations. Kunz et al. [78] introduced integrated weed
management in sugar beet, maize, and soybean fields, combining multiple tactics
to increase weeding performance. Chang and Lin [79] proposed a vision-based
agricultural robot for weeding in real time with an average plant and weed classifica-
tion rate of 90% or higher. It can also perform smart watering while maintaining the
moisture content of the deep soil at an efficiency of 80%. Steward et al. [80] empha-
sized the development of efficient weed robots including perception system and weed
control mechanisms. Wu et al. [81] used a multi-camera system for removing weeds
by classifying plants and weeds.
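A common first step in such vision-based weeding pipelines is separating vegetation from soil before plants and weeds are told apart. The sketch below is an illustrative assumption, not the specific method of [79] or [81]: it computes the excess-green (ExG) index, thresholds it with Otsu's method, and counts the resulting vegetation blobs that a classifier would then label as crop or weed.

```python
# Illustrative vegetation segmentation via the excess-green (ExG) index.
# The image path and fallback data are assumptions for demonstration only.
import cv2
import numpy as np

img = cv2.imread("plot.jpg")  # hypothetical field plot image
if img is None:
    # Synthetic fallback: brown "soil" with two green "plants".
    img = np.full((200, 200, 3), (40, 70, 110), np.uint8)   # BGR soil tone
    cv2.circle(img, (60, 100), 20, (50, 170, 60), -1)
    cv2.circle(img, (140, 100), 12, (50, 170, 60), -1)

b, g, r = cv2.split(img.astype(np.float32) / 255.0)
exg = 2.0 * g - r - b  # excess-green index per pixel

# Otsu threshold on the rescaled index gives a binary vegetation mask.
exg_u8 = cv2.normalize(exg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
_, veg_mask = cv2.threshold(exg_u8, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Each connected blob would then be classified (by size, shape, or a CNN)
# as crop or weed before the weeding tool is actuated.
n_labels, _ = cv2.connectedComponents(veg_mask)
print("vegetation blobs found:", n_labels - 1)
```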
(b) Disease and Pest Detector
Detecting diseases and controlling pests at an early stage is essential for high-quality
crop production. Previously farmers used different strategies to detect crop diseases
[82]. Hengstler et al. [83] designed a smart visual surveillance system (Mesh Eye)
to detect diseases of fruits and identify the affected area by determining the shape
and color of the leaf and stem. Additionally, one major problem was controlling
pest nuisance, caused by harmful insects or germs that destroyed crops. Pests cannot
entirely be eliminated but can be effectively controlled to reduce their nuisance.
Thus, early detection and monitoring of pests are essential for controlling them and
preventing crop damage. Laothawornkitkul et al. [84] used a potential electronic-
nose technology for remote sensing and monitoring of diseases and pests. Pest
identification on leaves along with automatic spraying was performed by using a
vision-based system [85]. Camargoa et al. [86] proposed a visual method, capable
of detecting all visual symptoms of plant diseases by analyzing colored images.
López et al. [87] presented an effective pest control technique, known as insect
traps, to efficiently perform automatic pest monitoring and inspection using high-
resolution images. Francis et al. [88] used soft computing techniques to identify
leaf diseases in pepper plants. Yazgac et al. [89] applied a signal processing method to
detect the sunn pests in wheat and barley by sound, but this method was useful only for
a single leaf. Gonzalez-de-Santos et al. [90] presented a fleet of heterogeneous
ground and aerial robots that were used for effective weed and pest control to reduce
the use of chemical substances.
(c) Pest Control and Spraying Robot
In recent years, different types of pesticides and chemical sprays have been used to
control pests, which are very dangerous for humans and can result in skin cancer, asthma,
or other chronic diseases. In that case, an automated robotic system can be used to
spray pesticides to avoid human contact and save time, as robots are programmed
to spray pesticides on crops only if they can detect pests. Blackmore et al. [91]
applied a multi-purpose robotic sprayer that can operate automatically according
to the weather conditions. Jian-sheng [92] proposed a wireless controlled robot for
spraying pesticides. Pilli et al. [93] designed a robot that was capable of moving
along the intercrop rows to detect diseases and spray pesticides automatically
and showed promising results in cotton and groundnut fields. Sharma and Borse
[94] described a mobile robot to monitor plant growth and detect disease with the
spraying mechanism for pesticides, fertilizers, and water. Sudha et al. [95] proposed
an automated technique to detect pests and spray pesticides based on a pre-defined
threshold value to indicate the pesticide level. Chaitanya et al. [96] designed an
autonomous robot, capable of spraying pesticides in a limited quantity only when
pests were detected.
For many years, robots have been continuously used for harvesting a variety of
fruits and vegetables such as apple, grapes, watermelon, tomatoes, cucumbers, etc.
Ceres et al. [97] proposed a new robot (Agribot) for fruit harvesting in a particularly
unstructured environment. Hua et al. [98] demonstrated the current development
of automatic fruit harvesting robots in horticulture and experimented on apple, kiwi,
tomato, and sweet pepper. Onishi et al. [99] presented an automated fruit detection
and harvesting robot using a single shot multi-box detector and a stereo camera and
achieved a performance of more than 90%. Some robots are described below based on
their application for specific fruits.
(a) Apple Harvesting
Yuan et al. [100] applied an ant colony algorithm to improve the performance of
the apple harvesting system and found an optimal result. Lv et al. [101] proposed
a vision-based apple harvesting robot to efficiently recognize apples using video.
De-An et al. [102] presented a 5 DOF (degrees of freedom) robot to detect, locate,
and pick apples automatically using the support vector machine algorithm.
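De-An et al. [102] mention a support vector machine for this detection step, but the details are beyond this survey, so the sketch below only illustrates the generic idea under assumed synthetic data: an SVM trained on colour features to separate "apple" pixels from "foliage" background.

```python
# Generic SVM colour-classification sketch (synthetic data; illustrative only,
# not the feature set or training procedure of the cited harvesting robot).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(1)
# Synthetic RGB samples: reddish "apple" pixels vs greenish "foliage" pixels.
apple = rng.normal([200, 60, 60], 25, size=(300, 3))
foliage = rng.normal([70, 140, 60], 25, size=(300, 3))
X = np.clip(np.vstack([apple, foliage]), 0, 255) / 255.0
y = np.array([1] * 300 + [0] * 300)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)
print("pixel classification accuracy:", clf.score(X_te, y_te))
```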
4 Drones in Agriculture
6 Conclusion
Automation in agriculture using intelligent robotic systems is one of the most chal-
lenging tasks that has already grabbed the attention of researchers due to its diverse
practical applications for commercial use. This chapter reported an extensive
review of the development of robotic applications over time, including their classi-
fication and some useful practical applications in agriculture. Agriculture is the most
valuable source of income, as more than 60% of the world population earn their liveli-
hood from farming. Thus, the technical scope of agricultural robotics should also be
extended to cities, where the scope for farming is limited. More automation in agricultural robots
is expected to help farmers by efficiently increasing crop productivity using solar
energy, so that the robots can work for many hours without any break. Moreover, the invention of
drones or aerial devices has already brought a huge revolution in digital farming,
which will extend the scope for smart crop management, such as crop scouting, crop
monitoring, weed and pest control, spraying, and selective harvesting in precision
agriculture.
References
1. Alston, J.M., Pardey, P.G.: Agriculture in the Global Economy. J. Econ. Perspect. 28(1),
121–146 (2014)
2. Calabi-Floody, M., Medina, J., Rumpel, C., Condron, L.M., Hernandez, M., Dumont, M.,
Mora, M.L.: Smart fertilizers as a strategy for sustainable agriculture. Adv. Agron. 147,
119–157 (2018)
3. Ahmed, H., Juraimi, A.S., Hamdani, S.M.: Introduction to robotics agriculture in pest control:
A review. Pertanika J. Sch. Res. Rev. 2(2), 80–93 (2016)
4. Lee, K.W., Kim, H.R., Yoon, W.C., Yoon, Y.S., Kwon D.S.: Designing a human-robot inter-
action framework for home service robot. In: International Workshop on Robots and Human
Interactive Communication, pp. 286–293. IEEE (2005)
5. Barbash, G.I., Glied, S.A.: New technology and health care costs—The case of robot-assisted
surgery. N. Engl. J. Med. 363(8), 701–704 (2010)
6. Wang, W., Li, R., Chen, Y., Diekel, Z.M., Jia, Y.: Facilitating human-robot collaborative tasks
by teaching-learning-collaboration from human demonstrations. In: IEEE Transactions on
Automation Science and Engineering, pp. 1–14. IEEE (2018)
7. Bechar, A., Edan, Y.: Human-robot collaboration for improved target recognition of
agricultural robots. Ind. Robot 30(5), 432–436 (2003)
8. Monta, M., Kondo, N., Shibano, Y.: Agricultural robot in grape production system. In: Inter-
national Conference on Robotics and Automation, vol. 3, pp. 2504–2509. IEEE, Nagoya,
Japan (1995)
9. Amer, G., Mudassir, S.M.M., Malik, M.A.: Design and operation of Wi-Fi agribot inte-
grated system. In: International Conference on Industrial Instrumentation and Control (ICIC),
pp. 207–212. IEEE, Pune (2015)
10. Auat Cheein, F.A., Carelli, R.: Agricultural robotics: Unmanned robotic service units in
agricultural tasks. Ind. Electron. Mag. 7(3), 48–58, IEEE (2013)
11. Kurata, K.: Cultivation of grafted vegetables II. Development of grafting robots in Japan.
HortScience 29, 240–244 (1994)
12. Van Henten, E.J., Van Tuijl, B.A.J., Hemming, J., Kornet, J.G., Bontsema, J., Os, E.A.V.:
Field test of an autonomous cucumber picking robot. Biosys. Eng. 86(3), 305–313 (2003)
13. Bawden, O., Kulk, J., Russell, R., McCool, C., English, A., Dayoub, F., Lehnert, C., Perez, T.:
Robot for weed species plant-specific management. J. Field Robot. 34(6), 1179–1199 (2017)
14. Sammons, P.J., Furukawa, T., Bulgin, A.: Autonomous pesticide spraying robot for use in a
greenhouse. Aust. Conf. Robotics Autom. 1–9 (2005)
15. Tanigaki, K., Fujiura, T., Akase, A., Imagawa, J.: Cherry-harvesting robot. Comput. Electron.
Agric. 63(1), 65–72 (2008)
16. Zhou, J., Zhang, M., Liu, G., Li, S.: Fuzzy control for automatic steering and line tracking of
agricultural robot. Int. Conf. Comput. Sci. Softw. Eng. 1094–1097 (2008)
17. Prema, K., Kumar, N.S., Dash, S.S., Chowdary, S.: Online control of remote operated agri-
cultural robot using Fuzzy controller and virtual instrumentation. Int. Conf. Adv. Eng. Sci.
Manag. 196–201 (2012)
18. de Sousa, R.V., Lopes, W.C., Pereira, R.R.D., Inamasu, R.Y., Porto, A.J.V.: A methodology for
composing and coordinating primitive Fuzzy behaviors to guide mobile agricultural robots.
In: 9th International Conference on Control and Automation, pp. 280–285. IEEE (2011)
19. Zhao, C.J., Jiang, G.Q.: Baseline detection and matching to vision-based navigation of
agricultural robot. Int. Conf. Wavelet Anal. Pattern Recogn. 44–48 (2010)
20. Zhang, Y., Gao, F., Tian, L.: INS/GPS integrated navigation for wheeled agricultural robot
based on sigma-point Kalman filter. In: 7th International Conference on System Simulation
and Scientific Computing, pp. 1425–1431. Asia Simulation Conference (2008)
21. Thamrin, N.M., Arshad, N.H.M., Adnan, R., Sam, R., Razak, N.A., Misnan, M.F., Mahmud,
S.F.: Tree detection profile using a single non-intrusive ultrasonic sensor for inter-row tracking
application in agriculture field. In: 9th international colloquium on signal processing and its
applications, pp. 310–313. IEEE (2013)
22. Xue, J., Xu, L.: Autonomous agricultural robot and its row guidance. In: International
Conference on Measuring Technology and Mechatronics Automation, pp. 725–729 (2010)
23. Liu, P., Bi, S., Zang, G., Wang, W., Gao, Y., Deng, Z.: Obstacle avoidance system for agri-
cultural robots based on multi-sensor information fusion. In: International Conference on
Computer Science and Network Technology, pp. 1181–1185 (2011)
24. Srivastava, A., Vijay, S., Negi, A., Shrivastava, P., Singh, A.: DTMF based intelligent farming
robotic vehicle: An ease to farmers. In: International Conference on Embedded Systems
(ICES), pp. 206–210, Coimbatore (2014)
25. Agarwal, N., Thakur, R.: Agricultural robot: Intelligent robot for farming. Int. Adv. Res. J.
Sci. Eng. Technol. (IARJSET) 3(8), 1–5 (2016)
26. Srilekha, K., Monika, J.: Design and operation of Wi-Fi Agribot integrated system. Int. J. Sci.
Eng. Technol. Res. (IJSETR) 5(22), 4473–4478 (2016)
27. Phanomchoeng, G., Saadi, M., Sasithong, P., Tangmongkhonsuk, J., Wijayasekara, S.K.,
Wuttisittikulkij, L.: Hardware software co-design of a farming robot. Eng. J. 24(1), 1–10
(2020)
28. Jiang, L., Zhang, Y.: Making agriculture more intelligent: Progress of agricultural robots.
Rob. Autom. Eng. J. 4(1), 1–7 (2018)
29. Agriculture Robots. Available Online: https://www.postscapes.com/agriculture-robots/#indoor-ag-robots
30. Mikhaylov, M.N., Lositskii, I.A.: Control and navigation of forest robot. In: 25th Saint
Petersburg International Conference on Integrated Navigation Systems (ICINS), pp. 1–2,
St. Petersburg (2018)
31. Hellström, T.: Autonomous navigation for forest machines. A project pre-study in the
Department of Computer Science, Umea University, Sweden (2002)
32. Nagasaka, Y., Umeda, N., Kanetai, Y., Taniwaki, K., Sasaki, Y.: Autonomous guidance for
rice transplanting using global positioning and gyroscopes. Comput. Electron. Agric. 43(3),
223–234 (2004)
33. Hellström, T., Johansson, T., Ringdahl, O.: Development of an autonomous forest machine
for path tracking. In: P. Corke, S. Sukkariah (eds.) Field and Service Robotics, Springer Tracts
in Advanced Robotics, vol. 25, Springer, Berlin, Heidelberg (2006)
34. Li, M., Imou, K., Wakabayashi, K., Yokoyama, S.: Review of research on agricultural vehicle
autonomous guidance. Int. J. Agric. Biol. Eng. 2(3), 1–16 (2008)
35. Ringdahl, O., Lindroos, O., Hellström, T., Bergström, D., Athanassiadis, D., Nordfjell, T.:
Path tracking in forest terrain by an autonomous forwarder. Scand. J. For. Res. 26(4), 350–359
(2011)
36. Gottschalk, R., Burgos-Artizzu, X.P., Ribeiro, A., Pajares, G.: Real-time image processing
for the guidance of a small agricultural field inspection vehicle. Int. J. Intell. Syst. Technol.
Appl. 8(1–4), 434–443 (2010)
37. Wang, M., Wang, X., Yi, X., Tu, J.: Experimental study on long-range navigation behavior of
agricultural robots. Int. Conf. Comput. Measure. Control Sens. Netw. 409–412 (2012)
38. Ayala, M., Soria, C., Carelli, R.: Visual servo control of a mobile robot in agriculture
environments. Mech. Based Des. Struct. Mach. 36(4), 392–410 (2008)
39. Gao, F., Xun, Y. Wu, J. Bao, G., Tan, Y.: Navigation line detection based on robotic vision
in natural vegetation-embraced environment. In: 3rd International Congress on Image and
Signal Processing (CISP), pp. 2596–260 (2010)
40. Torres-Sospedra, J., Nebot, P.: A new approach to visual-based sensory system for navigation
into orange groves. Sensors 11(4), 4086–4103 (2011)
41. Zhang, J., Kantor, G., Bergerman, M., Singh, S.: Monocular visual navigation of an
autonomous vehicle in natural scene corridor-like environments. In: International Conference
on Intelligent Robots and Systems (IROS), pp. 3659–3666, IEEE/RSJ, Vilamoura, Algarve,
Portugal (2012)
42. Weiss, U., Biber, P.: Plant detection and mapping for agricultural robots using a 3D LIDAR
sensor. Rob. Auton. Syst. 59, 265–273 (2011)
43. Ball, D., Upcroft, B., Wyeth, G., Corke, P., English, A., Ross, P., Patten, T., Fitch, R., Sukkarieh,
S., Bate, A.: Vision-based obstacle detection and navigation for an agricultural robot. J. Field
Rob. 33, 1107–1130 (2016)
44. Han, S., Zhang, Q., Noh, H.: Kalman filtering of DGPS positions for a parallel tracking
application. Trans. Am. Soc. Agric. Eng. 45(3), 553–560 (2002)
45. Nørremark, M., Griepentrog, H.W., Nielsen, J., Søgaard, H.T.: The development and assess-
ment of the accuracy of an autonomous GPS-based system for intra-row mechanical weed
control in row crops. Biosys. Eng. 101(4), 396–410 (2008)
46. Libby, J., Kantor, G.: Deployment of a point and line feature localization system for an
outdoor agriculture vehicle. International Conference on Robotics and Automation (ICRA),
pp. 1565–1570. IEEE, Shanghai, China (2011)
47. Christiansen, M.: Localization in Orchards using extended Kalman filter for sensor-fusion.
Master thesis, University of Southern Denmark (2011)
48. Dørum, J.: Autonomous navigation and row detection in crop fields using computer vision.
Master thesis, Norwegian University of Science and Technology (2015)
49. Boubertakh, H., Tadjine, M., Glorennec, P., Labiod, S.: A simple goal seeking navigation
method for a mobile robot using human sense, fuzzy logic and reinforcement learning. J.
Autom. Control 18(1), 23–27 (2008)
50. Yousfi, N., Rekik, C., Jallouli, M., Derbel, N.: Optimized Fuzzy controller for mobile robot
navigation in a cluttered environment. In: 7th International Multi-Conference on Systems,
Signals and Devices, pp. 1–7, IEEE, Amman, Jordan (2010)
51. Troyer, T.A., Pitla, S., Nutter, E.: Inter-row robot navigation using 1D ranging sensors. IFAC-
PapersOnLine 49(16), 463–468 (2016)
52. Gavrilov, A. V., Lee, S.: An architecture of hybrid neural network based navigation system
for mobile robot. In: 7th International Conference on Intelligent Systems Design and
Applications, pp. 587–590, IEEE, Brazil (2007)
53. Ryerson, A., Zhang, Q.: Vehicle path planning for complete field coverage using genetic
algorithms. In: Proceedings of the Automation Technology for Off-road Equipment (ATOE),
pp. 309–317, Boon, Germany (2007)
54. Nichols, E., McDaid, L.J., Siddique, N.: Biologically inspired SNN for robot control. IEEE
Trans. Cybern. 15(2), 115–128 (2013)
55. Motlagh, O., Nakhaeinia, D., Tang, S.H., Karasfi, B., Khaksar, W.: Automatic navigation of
mobile robots in unknown environments. Neural Comput. Appl. 24(7), 1569–1581 (2014)
56. Nishiura, Y., Honami, N., Taira, T.: Development of a new grafting method, 4: Robotization
of grafting operation. J. Jpn. Soc. Agric. Mach. 61(6), 103–112 (1999)
57. Osamu, Y., Akiko, F.: Growth adjustment technique of seedling optimal for grafting by
seedling storage for cucumber full-automatic machine grafting. Tokyo Agric. Res. 55(1),
201–202 (2002)
58. Ken, K., Kenta, S., Sadao, S.: Study on automation of seedlings feeding for grafting robot for
cucurbitaceous vegetables (Part1). J. Jpn. Soc. Agric. Mach. 68(6), 117–123 (2006)
59. Kang, C.-H., Han, G.-S., Noh, T.-H., Choi, H.-G.: Splice Grafting Robot for Fruit and
Vegetable Plants. World Intellectual Property Organization, WO/2005/089532 (2005)
60. Chen, S., Chiu, Y.C., Chang, Y.C.: Development of a tubing-grafting robotic system for
fruit-bearing vegetable seedlings. Appl. Eng. Agric. 26(4), 707–714 (2010)
61. Lee, J.-M., Kubota, C., Tsao, S.J., Bie, Z.-L., Echevarría, P., Morra, L., Oda, M.: Current
status of vegetable grafting: Diffusion, grafting techniques, automation. Sci. Hortic. 127(2),
93–105 (2010)
62. Chang, Y.-C., Chen, S., Chiu, Y.-C., Lin, L.-H., Chang, Y.-S.: Growth and union acclima-
tion process of sweet pepper grafted by a tubing-grafting robotic system. Hortic. Environ.
Biotechnol. 53(2), 93–101 (2012)
63. Libin, Z., Qinghua, Y., Guanjun, B., Yan, W., Liyong, Q., Feng, G., Fang, X.: Overview of
research on agricultural robots in China. Int. J. Agric. Biol. Eng. 1(1), 12–21 (2008)
64. Kubota, C., McClure, M.A., Kokalis-Burelle, N., Bausher, M.G., Rosskopf, E.N.: Vegetable
grafting: History, use, and current technology status in North America. HortScience 43(6),
1664–1669 (2008)
65. Li, M., Dai, S., Tang, C., Xiang, Y.: Simulation test on plant let-cutting mechanism of grafting
robot. Trans. Chin. Soc. Agric. Eng. 24(6), 129–132 (2008)
66. Zhao, Y., Zhang, T., Wang, H.: Cutting mechanism of root parental stock in automatic sapling
grafting machine. Trans. Chin. Soc. Agric. Eng. 24(9), 79–83 (2008)
67. Jiang, X.: Design of model PJJ-50 grape grafting machine. Agric. Equip. Vehicle Eng. 11,
7–9 (2011)
68. Wang, F., Liu, M., Wu, X., Cao, X.: Study and design of grafting robot for camellia oleifera
seedlings. Forest Machine and Equipment 39(4), 36–39 (2011)
69. Nezhad, B., Massah, J., Ebrahimpour-Komleh, H.: Design and construction of intelligent
tomato picking machine vision. Majlesi J. Electr. Eng (2012)
70. Feng, Q., Wang, X., Wang, G., Li, Z.: Design and test of tomatoes harvesting robot. In:
International Conference on Information and Automation, pp. 949–952. IEEE, Lijiang (2015)
71. Xiong, Y., From, P.J., Isler, V.: Design and evaluation of a novel cable-driven gripper with
perception capabilities for strawberry picking robots. In: International Conference on Robotics
and Automation (ICRA), pp. 7384–739. IEEE, Brisbane, QLD (2018)
72. Ashwini, K.: Survey paper on fruit picking robots. Int. J. Comput. Sci. Mobile Comput. 5(1),
96–101 (2016)
73. Slaughter, D.C., Giles, D.K., Downey, D.: Autonomous robotic weed control systems: A
review. Comput. Electron. Agric. 61(1), 63–78 (2008)
74. Bakker, T., van Asselt, K., Bontsema, J., Müller, J., van Straten, G.: Systematic design of an autonomous
platform for robotic weeding. J. Terramech. 47(2), 63–73 (2010)
75. Pérez-Ruíz, M., Slaughter, D.C., Fathallah, F.A., Gliever, C.J., Miller, B.J.: Co-Robotic intra-
row weed control system. Biosys. Eng. 126, 45–55 (2014)
76. Fennimore, S.A., Slaughter, D.C., Siemens, M.C., Leon, R.G., Saber, M.N.: Technology for
automation of weed control in specialty crops. Weed Technol. 30(4), 823–837 (2016)
77. Bechar, A., Vigneault, C.: Agricultural robots for field operations. Part 2: Operations and
systems. Biosys. Eng. 153, 110–128 (2017)
78. Kunz, C., Weber, J.F., Peteinatos, G.G., Sokefeld, M., Gerhards, R.: Camera steered mechan-
ical weed control in sugar beet, maize and soybean. Precision Agric. 19(4), 708–720
(2018)
79. Chang, C.-L., Lin, K.-M.: Smart agricultural machine with a computer vision-based weeding
and variable rate irrigation scheme. Robotics 7(3), 38 (2018)
80. Steward, B.L., Gai, J., Tang, L.: The Use of Agricultural Robots in Weed Management
and Control. Robotics and Automation for Improving Agriculture, pp. 161–186. Biosystems
Engineering Publications (2019)
81. Wu, X., Aravecchia, S., Lottes, P., Stachniss, C., Pradalier, C.: Robotic weed control using
automated weed and crop classification. J. Field Robot. 37(1), 1–29 (2020)
82. Sankaran, S., Mishra, A., Ehsani, R., Davis, C.: A review of advanced techniques for detecting
plant diseases. Comput. Electron. Agric. 72(1), 1–13 (2010)
83. Hengstler, S., Prashanth, D., Fong, S., Hamid, A.: Mesh: A hybrid-resolution smart camera
mote for applications in distributed intelligent surveillance. In: 6th International Conference
on Information Processing in Sensor Networks, pp. 360–369. New York, USA (2007)
84. Laothawornkitkul, J., Moore, J.P., Taylor, J.E., Possell, M., Gibson, T.D., Hewitt, C.N., Paul,
N.D.: Discrimination of plant volatile signatures by an electronic nose: A potential technology
for plant pest and disease monitoring. Environ. Sci. Technol. 42(22), 8433–8439 (2008)
85. Li, Y., Xia, C., Lee, J.: Vision-based pest detection and automatic spray of greenhouse plant.
In: International Symposium on Industrial Electronics, pp. 920–925. IEEE, Seoul (2009)
86. Camargoa, A., Smith, J.S.: An image-processing based algorithm to automatically identify
plant disease visual symptoms. Biosys. Eng. 102(1), 9–21 (2009)
87. López, O., Rach, M.M., Migallon, H., Malumbres, M.P., Bonastre, A., Serrano, J.J.: Moni-
toring pest insect traps by means of low-power image sensor technologies. Sensors 12(11),
15801–15819 (2012)
88. Francis, J., Anto Sahaya Dhas, D., Anoop, B.K.: Identification of leaf diseases in pepper plants
using soft computing techniques. In: Conference on Emerging Devices and Smart Systems
(ICEDSS), pp. 168–173. Namakkal (2016)
89. Yazgaç, B.G., Kirci, M., Kivan, M.: Detection of Sunn pests using sound signal processing
methods. In: 5th International Conference on Agro-Geoinformatics (Agro-Geoinformatics),
pp. 1–6. Tianjin (2016)
90. Gonzalez-de-Santos, P., Ribeiro, A., Fernandez-Quintanilla, C., Lopez-Granados, F., Brand-
stoetter, M., Tomic, S., Pedrazzi, S., Peruzzi, A., Pajares, G., Kaplanis, G., Perez-Ruiz,
M., Valero, C., del Cerro, J., Vieri, M., Rabatel, G., Debilde, B.: Fleets of robots for
environmentally-safe pest control in agriculture. Precision Agric. 18, 574–614 (2017)
91. Blackmore, S., Stout, B., Wang, M., Runov, B.: Robotic agriculture—The future of agricultural
mechanization. In: 5th European Conference on Precision Agriculture (ECPA), pp. 621–628.
Upsala, Sweden (2005)
92. Jian-sheng, P.: An intelligent robot system for spraying pesticides. Open Electr. Electron. Eng.
J. 8(1), 435–444 (2014)
93. Pilli, S.K., Nallathambi, B., George, S. J., Diwanji, V.: eAGROBOT—A robot for early crop
disease detection using image processing. In: 2nd International Conference on Electronics
and Communication Systems (ICECS), pp. 1684–1689. Coimbatore (2015)
94. Sharma, S., Borse, R.: Automatic agriculture spraying robot with smart decision making.
In: Corchado J., Rodriguez, S., Mitra, S., Thampi, E., El-Alfy, S. (eds) Intelligent Systems
Technologies and Applications (ISTA), Advances in Intelligent Systems and Computing, vol.
530, pp. 743–758. Springer, Cham (2016)
95. Sudha, B., Bhuvana, L., Divya, V., Mamathashree, S.R., Pallavi: Automated pest detection
and pesticide spraying robot. Int. J. Recent Trends Eng. Res. (IJRTER) 4(4), 1–8 (2018)
96. Chaitanya, P., Kotte, D., Srinath, A., Kalyan, K.B.: Development of smart pesticide spraying
robot. Int. J. Recent Technol. Engi. (IJRTE) 8(5), 2193–2202 (2020)
97. Ceres, R., Pons, J.L., Jiménez, A.R., Martín, J.M., Calderón, L.: Design and implementation
of an aided fruit-harvesting robot (Agribot). Industr. Robot: An Int. J. 25(5), 337–346 (1998)
98. Hua, Y., Zhang, N., Yuan, X., Quan, L., Yang, J., Nagasaka, K., Zhou, X.-G.: Recent advances
in intelligent automated fruit harvesting robots. Open Agric. J. 13(1), 101–106 (2019)
99. Onishi, Y., Yoshida, T., Kurita, H., Fukao, T., Arihara, H., Iwai, A.: An automated fruit
harvesting robot by using deep learning. ROBOMECH J. 6(1), 1–8 (2019)
100. Yuan, Y., Zhang, X., Zhao, H.: Apple harvesting robot picking path planning and simulation.
In: International Conference on Information Engineering and Computer Science, pp. 1–4,
Wuhan (2009)
101. Lv, J., Zhao, D., Ji, W., Chen, Y., Shen, H.: Design and research on vision system of apple
harvesting robot. In: 3rd International Conference on Intelligent Human-Machine Systems
and Cybernetics, pp. 177–180, Zhejiang (2011)
102. De-An, Z., Jidong, L., Wei, J., Ying, Z., Yu, C.: Design and control of an apple harvesting
robot. Biosys. Eng. 110(2), 112–122 (2011)
103. Li, Z., Liu, J., Li, P., Li, W.: Analysis of workspace and kinematics for a tomato harvesting
robot. In: International Conference on Intelligent Computation Technology and Automation
(ICICTA), pp. 823–827, Hunan (2008)
104. Wang, J., Zhou, Z., Du, X.: Design and co-simulation for tomato harvesting robots. In: 31st
Chinese Control Conference, pp. 5105–5108, Hefei (2012)
105. Liu, J., Li, Z., Wang, F., Li, P., Xi, N.: Hand-arm coordination for a tomato harvesting robot
based on commercial manipulator. In: International Conference on Robotics and Biomimetics
(ROBIO), pp. 2715–2720. IEEE, Shenzhen (2013)
106. Yaguchi, H., Nagahama, K., Hasegawa, T., Inaba, M.: Development of an autonomous tomato
harvesting robot with rotational plucking gripper. In: International Conference on Intelligent
Robots and Systems (IROS), pp. 652–657. IEEE/RSJ, Daejeon, (2016)
107. Wang, G., Yu, Y., Feng, Q.: Design of end-effector for tomato robotic harvesting. IFAC-
PapersOnLine 49(16), 190–193 (2016)
108. Sakai, S., Osuka, K., Fukushima, H., Iida, M.: Watermelon harvesting experiment of a
heavy material handling agricultural robot with LQ control. In: International Conference
on Intelligent Robots and Systems, vol. 1, pp. 769–774. IEEE/RSJ, Lausanne, Switzerland
(2002)
28 R. Basri et al.
109. Arima, S., Shibusawa, S., Kondo, N., Yamashita, J.: Traceability based on multi-operation
robot; Information from spraying, harvesting and grading operation robot. In: International
Conference on Advanced Intelligent Mechatronics (AIM 2003), vol. 2, pp. 1204–1209.
IEEE/ASME, Kobe, Japan (2003)
110. Qingchun, F., Wengang, Z., Quan, Q., Kai, J., Rui, G.: Study on strawberry robotic harvesting
system. In: International Conference on Computer Science and Automation Engineering
(CSAE), pp. 320–324. IEEE, Zhangjiajie (2012)
111. Irie, N., Taguchi, N., Horie, T., Ishimatsu, T.: Asparagus harvesting robot coordinated with
3-D vision sensor. In: International Conference on Industrial Technology, pp. 1–6. IEEE,
Gippsland, VIC (2009)
112. Birrell, S., Hughes, J., Cai, J.Y., Iida, F.: A field-tested robotic harvesting system for iceberg
lettuce. J. Field Rob. 37(2), 225–245 (2020)
113. Hajjaj, S.S.H., Sahari, K.S.M.: Review of agriculture robotics: Practicality and feasibility.
In: International Symposium on Robotics and Intelligent Sensors (IRIS), pp. 194–198. IEEE,
Tokyo (2016)
114. Behmanesh, M., Hong, T.S., Kassim, M.S.M., Azim, A., Dashtizadeh, Z.: A brief survey on
agricultural robots. Int. J. Mech. Eng. Rob. Res. (IJMERR) 6(3), 178–182 (2017)
115. Shamshiri, R.R., Weltzien, C., Hameed, I.A., Yule, I.J., Grift, T.E., Balasundram, S.K., Piton-
akova, L., Ahmad, D., Chowdhary, G.: Research and development in agricultural robotics: A
perspective of digital farming. Int. J. Agric. Biol. Eng. (IJABE) 11(4), 1–14 (2018)
116. Jadhav, P.K., Deshmukh, S.S., Khairnar, P.N.: Survey paper on AgRo-bot autonomous robot.
Int. Res. J. Eng. Technol. (IRJET) 6(12), 434–441 (2019)
117. Fue, K.G., Porter, W.M., Barnes, E.M., Rains, G.C.: An extensive review of mobile agricultural
robotics for field operations: Focus on cotton harvesting. AgriEngineering 2(1), 150–174
(2020)
118. Veroustraete, F.: The rise of the drones in agriculture. Ecronicon 2(2), 1–3 (2015)
119. Natu, A.S., Kulkarni, S.C.: Adoption and utilization of drones for advanced precision farming:
A review. Int. J. Recent Innov. Trends Comput. Commun. 4(5), 563–565 (2016)
120. Ahirwar, S., Swarnkar, R., Bhukya, S., Namwade, G.: Application of drone in agriculture.
Int. J. Curr. Microbiol. Appl. Sci. 8(1), 2500–2505 (2019)
121. Mogili, U.R., Deepak, B.: Review on application of drone systems in precision agriculture.
Proc. Comput. Sci. 133, 502–509 (2018)
122. Daponte, P., De Vito, L., Glielmo, L., Iannelli, L., Liuzza, D., Picariello, F., Silano, G.: A
review on the use of drones for precision agriculture. IOP Conf. Ser. Earth Environ. Sci. 275,
1–11 (2019)
123. Abdullahi, H.S., Mahieddine, F., Sheriff, R.E.: Technology impact on agricultural produc-
tivity: A review of precision agriculture using unmanned aerial vehicles. In: Pillai, P., Hu, Y.,
Otung, I., Giambene, G. (eds) Wireless and Satellite Systems, WiSATS, Lecture Notes of the
Institute for Computer Sciences, Social Informatics and Telecommunications Engineering,
vol. 154, pp. 388–400. Springer, Cham (2015)
124. Primicerio, J., Di Gennaro, S.F., Fiorillo, E., Genesio, L., Lugato, E., Matese, A., Vaccari, F.P.:
A flexible unmanned aerial vehicle for precision agriculture. Precis. Agric. 13(4), 517–523
(2012)
125. Bendig, J., Bolten, A., Bareth, G.: Introducing a low-cost mini-UAV for thermal-and
multispectral-imaging. International Archives of the Photogrammetry, Remote Sensing
and Spatial Information Sciences, XXII ISPRS Congress, vol. XXXIX-B1, pp. 345–349,
Melbourne, Australia (2012)
126. Anthony, D., Elbaum, S., Lorenz, A., Detweiler, C.: On crop height estimation with UAVs.
In: International Conference on Intelligent Robots and Systems, pp. 4805–4812. IEEE/RSJ,
Chicago, IL (2014)
127. Huang, Y., Hoffmann, W.C., Lan, Y., Wu, W., Fritz, B.K.: Development of a spray system for
an unmanned aerial vehicle platform. Appl. Eng. Agric. 25(6), 803–809 (2009)
128. Faiçal, B.S., Costa, F.G., Pessin, G., Ueyama, J., Freitas, H., Colombo, A., Fini, P.H., Villas,
L., Osório, F.S., Vargas, P.A., Braun, T.: The use of unmanned aerial vehicles and wireless
sensor networks for spraying pesticides. J. Syst. Architect. 60(4), 393–404 (2014)
Robots and Drones in Agriculture—A Survey 29
129. Faiçal, B.S., Freitas, H., Gomes, P.H., Mano, L.Y., Pessin, G., de Carvalho, A.C.P.L.F., Krish-
namachar, B., Ueyama, J.: An adaptive approach for UAV-based pesticide spraying in dynamic
environments. Comput. Electron. Agric. 138, 210–223 (2017)
130. Kurkute, S.R., Deore, B.D., Kasar, P., Bhamare, M., Sahane, M.: Drones for smart agriculture:
A technical report. Int. J. Res. Appl. Sci. Eng. Technol. (IJRASET) 6(IV), 341–346 (2018)
131. Talaviya, T., Shah, D., Patel, N., Yagnik, H., Shah, M.: Implementation of artificial intelligence
in agriculture for optimisation of irrigation and application of pesticides and herbicides. Artif.
Intell. Agric. 4, 58–73 (2020)
132. Thompson, L.J., Shi, Y., Ferguson, R.B.: Getting Started with Drones in Agriculture (G-2296).
University of Nebraska Extension, NebGuide (2017)
133. Zheng, H., Zhou, X., Cheng, T., Yao, X., Tian, Y., Cao, W., Zhu, Y.: Evaluation of a UAV
based hyperspectral frame camera for monitoring the leaf nitrogen concentration in rice. In:
International Geoscience and Remote Sensing Symposium (IGARSS), pp. 7350–7353. IEEE,
Beijing (2016)
134. Reinecke, M., Prinsloo, T.: The influence of drone monitoring on crop health and harvest size.
In: 1st International Conference in Next Generation Computing Applications (NextComp),
pp. 5–10. IEEE, Mauritius (2017)
135. Psirofonia, P., Samaritakis, V., Eliopoulos, P., Potamitis, I.: Use of unmanned aerial vehicles
for agricultural applications with emphasis on crop protection: Three novel case-studies. Int.
J. Agric. Sci. Technol. 5(1), 30–39 (2017)
136. Yallappa, D., Veerangouda, M., Maski, D., Palled, V., Bheemanna, M.: Development and
evaluation of drone mounted sprayer for pesticide applications to crops. Glob. Humanitarian
Technol. Conf. (GHTC), pp. 1–7. IEEE, San Jose, CA (2017)
137. Pharne, I.D., Kanase, S., Patwegar, S., Patil, P., Pore, A., Kadam, Y.: Agriculture drone sprayer.
Int. J. Recent Trends Eng. Res. 4(3), 181–185 (2018)
138. Xiang, H., Tian, L.: Development of a low-cost agricultural remote sensing system based on
an autonomous unmanned aerial vehicle (UAV). Biosys. Eng. 108(2), 174–190 (2011)
139. Senthilnath, J., Dokania, A., Kandukuri, M.K.N.R., Anand, G., Omkar, S.N.: Detection of
tomatoes using spectral-spatial methods in remotely sensed RGB images captured by UAV.
Biosys. Eng. 146, 16–36 (2016)
140. Kushwaha, H.L., Sinha, J.P., Khura, T., Kushwaha, D.K.: Status and scope of robotics in
agriculture. Int. Conf. Emerg. Technol. Agric. Food Eng. 264–277 (2016)
141. Roldán, J.J., del Cerro, J., Garzón-Ramos, D., Garcia-Aunon, P., Garzón, M., de León, J., Barri-
entos, A.: Robots in agriculture: State of art and practical experiences. Appl. Rob. Greenhouse
Farming 67–90 (2018)
142. Emmi, L., de Soto, M.G., Pajares, G., de Santos, P.G.: New trends in robotics for agriculture:
Integration and assessment of a real fleet of robots. Sci. World J. 2014, 21 (2014)
Detection of Rotten Fruits and Vegetables Using Deep Learning
1 Introduction
Fruits and vegetables are essential items in our daily life, and nature offers many species of edible fruits and vegetables. Fresh fruits and vegetables are not only delicious to eat but also a good source of many important vitamins and minerals, and they are used by the food processing industries to prepare a wide range of food products. Fruits and vegetables pass through several stages between harvesting and reaching the customer: harvesting, sorting, classification, grading, and so on. Performing these tasks manually requires many expert workers and a great deal of time, and many countries face a shortage of such labor because few people are interested in this laborious work. Hence, automation is needed in every aspect of fruit and vegetable processing. Computer vision and machine learning have achieved great success in solving automation problems in many industries, and researchers have also applied these techniques to various problems in fruit and vegetable processing. This chapter explores those problems and challenges of fruit and vegetable processing using computer vision and machine learning techniques, with a major focus on the automatic detection of rotten fruits and vegetables.
In most cases, the shape, color, and texture of the surface change when a fruit or vegetable rots, and a bad smell is another important indication. Fruits and vegetables mostly rot in the inventory. Several factors cause a fruit or vegetable to rot [1, 2]: temperature, moisture, air, light, and microorganisms. Fruits and vegetables can also rot during transportation [3, 4]. A single rotten fruit or vegetable can spoil many fresh ones in the inventory, and such inventory damage causes a considerable loss in the fruit and vegetable business. Early detection of rotten fruits and vegetables therefore reduces the damage inside the inventory or store and also enhances food safety. A human inspector detects rotten produce by smell, by observing shape deformation, and by noticing changes in surface color and texture. Smell cannot be used when rotten fruits and vegetables are detected automatically with computer vision and machine learning; the system has to rely only on changes of surface features relative to fresh samples. This makes computer-based detection of rotten fruits and vegetables a challenging task for researchers. This chapter addresses the problem of rotten fruit and vegetable detection using state-of-the-art deep learning techniques. A convolutional neural network (CNN) architecture is proposed to classify a captured image of a fruit or vegetable as rotten or fresh.
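To make the classification task concrete, the following is a minimal, illustrative sketch of a binary fresh-versus-rotten image classifier written with TensorFlow/Keras. The layer configuration, the 128 x 128 input size, and the data/train and data/val directory layout (one sub-folder per class) are assumptions made for this example only; they do not represent the architecture proposed later in this chapter.

# Illustrative sketch only: a small CNN that labels an image as fresh (0)
# or rotten (1). Layer sizes, input resolution, and the directory layout
# are assumed for demonstration, not taken from this chapter.
import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SIZE = (128, 128)   # assumed input resolution
BATCH = 32

def build_model():
    model = models.Sequential([
        layers.Rescaling(1.0 / 255, input_shape=(*IMG_SIZE, 3)),  # scale pixels to [0, 1]
        layers.Conv2D(16, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),  # single probability: rotten vs. fresh
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # image_dataset_from_directory expects one sub-folder per class,
    # e.g. data/train/fresh and data/train/rotten (assumed layout).
    train_ds = tf.keras.utils.image_dataset_from_directory(
        "data/train", image_size=IMG_SIZE, batch_size=BATCH, label_mode="binary")
    val_ds = tf.keras.utils.image_dataset_from_directory(
        "data/val", image_size=IMG_SIZE, batch_size=BATCH, label_mode="binary")
    model = build_model()
    model.fit(train_ds, validation_data=val_ds, epochs=10)

A single sigmoid output is used here because the task is binary; a multi-class variant (for example, grading freshness per species) would instead end with a softmax layer and categorical cross-entropy.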
This chapter is structured as follows: Sect. 2 describes the state-of-the-art problems and challenges of fruit and vegetable processing using computer vision and machine learning techniques. Section 3 elucidates the materials and the proposed method in detail. Section 4 presents the experiments and results. A detailed discussion of this work is given in Sect. 5. Section 6 concludes the chapter with future scope.
Computer vision and machine learning have already achieved astounding success in many automation challenges related to fruit and vegetable processing. Computer vision relies entirely on the appearance of the outer surface of the fruit or vegetable. The literature on fruit and vegetable processing can be broadly categorized by problem. This section highlights some very challenging problems in fruit and vegetable processing, i.e., segmentation and detection of fruits and vegetables in the natural environment, classification of fruit and vegetable type, grading of fruits and vegetables, and sorting of defective fruits and vegetables.