base_paper_for_project
Abstract
The research targets an intelligent agriculture system that uses IoT sensors and camera assistance to support ecologically friendly crop prediction and disease detection. The objectives are to guide crop selection with real-time environmental data, detect crop diseases in the field using cameras or uncrewed aerial vehicles (UAVs) together with Convolutional Neural Networks (CNNs), and improve farming operations through AI-driven resource allocation and crop-loss reduction. IoT sensors supply real-time soil and environmental data. First, machine learning modules built on classification algorithms forecast the most suitable crops. In addition, cameras collect the necessary aerial and ground images, which are then processed with a CNN to identify any disease on the crops. This approach will subsequently be tested and verified in a farm-adjusted simulation environment. The intelligent core of the project is a system that reliably predicts and selects the best crops, and the use of real-time sensor data improves the accuracy of crop selection. Furthermore, HD cameras or UAVs, paired with CNNs, produce detailed images in which the earlier stages of crop disease become visible. The proposed approach brings gains in resource optimization, reduces crop losses, and improves data-driven decision-making in precision agriculture. The project differs from earlier work in showing that combining IoT sensors, camera assistance, and AI models makes it possible to predict crops and diseases with higher precision in the farming industry, with the advantages of saving resources, increasing yields, and reducing losses. Further research should examine other types of sensors, improve the capabilities of cameras and drones, and establish reliability through testing in various farming environments for general applicability.
Introduction
Agriculture plays a vital role in global food security; farmers worldwide, in both traditional and contemporary settings, must keep raising production to meet society's ever-growing demand for food. Conventional farming methods suffer from inefficient use of resources, unpredictable crop yields, and slow responses to plant diseases. By using advanced equipment, precision farming offers an effective new model to solve these issues, since incorporating data-driven algorithms can improve resource use, disease detection, and overall efficiency. IoT sensors and drones, two of the most effective precision technologies, provide live in-field information on soil conditions, weather patterns, and crop health status. IoT sensors constantly monitor soil parameters such as nitrogen, phosphorus, and potassium (NPK), temperature, humidity, pH, and rainfall. These sensors produce abundant data that can be exploited to make the best crop predictions for a specific type of environment. Camera assistance simultaneously provides aerial and ground-level views of farm fields, enabling crop growth monitoring and the early diagnosis of diseases. Combined with sophisticated AI approaches such as CNNs, these technologies give farmers instant, actionable insights.
Nevertheless, a significant challenge remains: smart farming is only as safe as the data created by its IoT devices and camera assistance, so that data must be protected from attacks. The study therefore addresses two main areas of intelligent farming operations that rely on precision technologies. The crops suited to a particular locality may change over time; as the climate warms, growing conditions are likely to shift, and farmers must then decide to cultivate other crops suited to the new conditions. The model introduces a data-driven approach to selecting crops that are cost-effective and do not damage the environment in a given area under different weather conditions. Combining camera assistance with the machine learning mechanism of Convolutional Neural Networks (CNNs) yields a largely autonomous monitoring model: cameras or UAVs act as surveillance instruments for early crop disease detection, making timely, short-term interventions tangible. In this manner, wide-scale applications can emerge that not only enhance farm production but also secure farm IoT networks and avert cyber threats. The world may then witness easier adoption of advanced technologies and far more efficient farm practices, supported by drones and IoT, in the face of future dangers from cyber-attacks.
Methodology
The project integrates IoT sensors, camera assist, and machine learning to build a system for
crop prediction and disease detection in intelligent farming. The methodology is divided into
two key components: IoT-based crop prediction and Camera-Assisted-based crop disease
detection. Here's a step-by-step breakdown:
1. IoT-Based Crop Prediction
• Sensors: IoT sensors are deployed in the farm fields to measure essential soil and environmental parameters, including:
o Soil Properties: Nitrogen (N), Phosphorous (P), Potassium (K), and pH levels.
o Environmental Parameters: temperature, humidity, and rainfall.
• Edge Computing & Cloud Storage: Data is transmitted to storage systems, either
locally using edge computing or remotely via cloud platforms. This ensures that the
collected information is accessible for further processing and analysis.
• Data Cleaning: Remove noise, missing values, and inconsistencies in the collected
sensor data.
• Feature Engineering: Create relevant features from raw sensor data, such as time-
based aggregations (e.g., average rainfall over a week).
• Normalization: Normalize the data so that all features contribute equally to the
machine learning model, ensuring uniformity in data scaling.
• Model Training: Train classification algorithms on the pre-processed sensor data to predict the most suitable crop, for example:
o Decision Trees
o Random Forest
• Validation: Use cross-validation techniques (e.g., k-fold) to ensure the model performs well on unseen data; a minimal training sketch follows this list.
• AI-Driven Insights: Based on the analysis, the decision support system offers actionable insights to farmers, which are displayed through a user-friendly interface. These insights help optimize farming practices and allow farmers to predict the crops that should be planted based on the IoT sensor data.
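The crop-prediction training and validation steps above can be summarized in a minimal sketch, assuming the crop-recommendation CSV described later in the Data Description section (columns N, P, K, temperature, humidity, ph, rainfall, label); the file name, hyperparameters, and sample reading are illustrative, not values fixed by the project.

# Sketch: crop prediction from IoT sensor data with a Random Forest and k-fold validation.
# Column names follow the Kaggle crop-recommendation dataset; the file path is a placeholder.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("crop_recommendation.csv").dropna()   # data cleaning: drop incomplete rows

X = df[["N", "P", "K", "temperature", "humidity", "ph", "rainfall"]]
y = df["label"]                                         # recommended crop

# Normalization and the classifier in one pipeline, so scaling is fitted only on training folds
model = make_pipeline(StandardScaler(),
                      RandomForestClassifier(n_estimators=200, random_state=42))

# k-fold cross-validation (k = 5) checks performance on unseen data
scores = cross_val_score(model, X, y, cv=5)
print(f"Mean cross-validation accuracy: {scores.mean():.3f}")

# Fit on the full dataset and suggest a crop for a new sensor reading
model.fit(X, y)
sample = pd.DataFrame([[90, 42, 43, 21.0, 82.0, 6.5, 200.0]], columns=X.columns)
print("Suggested crop:", model.predict(sample)[0])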
1.5 Actuation:
• Automated Systems & Alert Mechanisms: The system can trigger automated irrigation
or pest control actions. Farmers are also alerted to take manual actions when
necessary.
• Continuous Learning & Farmer Input: Farmers provide input and feedback into the
system, allowing continuous learning and adaptation. The system refines its
recommendations based on real-world outcomes over time, making the process more
efficient and intelligent.
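To illustrate the actuation and alerting step, the sketch below applies simple threshold rules to the latest sensor reading; the thresholds, field names, and notification hook are hypothetical placeholders rather than values specified by the project.

# Sketch: rule-based actuation and alerts on the latest sensor reading.
# Thresholds, field names, and notify() are illustrative placeholders.

def notify(message: str) -> None:
    # In a real deployment this could push an SMS, app notification, or dashboard alert.
    print("ALERT:", message)

def actuate(reading: dict) -> None:
    if reading["rainfall_mm_week"] < 10.0:        # low recent rainfall (assumed weekly aggregate)
        print("Triggering automated irrigation...")
    if reading["ph"] < 5.5 or reading["ph"] > 7.5:
        notify(f"Soil pH {reading['ph']} outside the preferred range; manual check advised.")
    if reading["temperature"] > 40.0:             # degrees Celsius
        notify("Extreme heat detected; consider adjusting the irrigation schedule.")

actuate({"rainfall_mm_week": 4.0, "ph": 5.2, "temperature": 36.5})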
2. Camera-Assisted-Based Crop Disease Detection Using CNN
• Image Acquisition: Fly camera-equipped UAVs over the farm fields to collect aerial imagery periodically for disease monitoring.
• Image Resizing: Resize the collected images to ensure they are compatible with the
input size of the CNN model.
• Labelling: Annotate the images with the type of disease or healthy crop label based on
visual inspection or expert knowledge.
• Convolutional Neural Network (CNN): CNN is used for image classification to detect
crop diseases.
• Model Architecture: Customize the final layers of the CNN for disease classification
based on the number of disease classes.
2.4 Model Training and Evaluation:
• Evaluation: Use metrics like accuracy, precision, recall, and confusion matrix to assess
the performance of the CNN model in detecting crop diseases.
• Real-Time Analysis: Deploy the trained model to analyze camera-captured images in real time and detect any disease in the crops.
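A minimal sketch of this real-time analysis step is given below, assuming a disease-classification CNN has already been trained and saved with Keras (for example, as in the transfer-learning sketch in the Technologies section); the model file name, input size, class list, and image path are illustrative assumptions.

# Sketch: classify a newly captured crop image with a trained Keras CNN.
# Model path, input size, class names, and image path are assumptions for illustration.
import numpy as np
import tensorflow as tf

model = tf.keras.models.load_model("crop_disease_cnn.keras")
class_names = ["healthy", "early_blight", "late_blight"]   # placeholder; normally taken from the training data

def classify_image(path: str) -> str:
    img = tf.keras.utils.load_img(path, target_size=(224, 224))      # resize to the CNN input size
    arr = np.expand_dims(tf.keras.utils.img_to_array(img), axis=0)   # add a batch dimension
    probs = model.predict(arr, verbose=0)[0]   # the saved model is assumed to contain its own rescaling layer
    return class_names[int(np.argmax(probs))]

print("Detected class:", classify_image("captured_frames/frame_001.jpg"))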
3. Intrusion Detection System (IDS) for Farm Network Security
• Traffic Capture: Network traffic from the farm's IoT and camera-assist devices is captured to form the raw input for the Intrusion Detection System (IDS).
• Feature Construction: After capturing the traffic, relevant features are constructed
from the raw traffic to ensure the data suits the machine learning (ML) process. This
involves transforming and extracting critical attributes from the traffic data.
• Dataset Creation: The processed traffic data (now with constructed features) is
organized into a dataset, forming the IDS Raw Traffic Dataset.
• Data Splitting: The raw dataset is split into Training and Testing Traffic, each stored as a
CSV file.
• ML Algorithm: The Training Traffic CSV file is fed into an appropriate machine learning
algorithm to train the model.
• ML Model: A machine learning model is generated based on the algorithm and training
data. This model is used to classify traffic and predict whether it is benign or malicious
(B/M).
• Tuned Model: The machine learning model is refined and tuned for optimization using
the Testing Traffic data.
• Prediction: The tuned model is then used to predict whether new traffic data is Benign
(B) or Malicious (M).
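A minimal sketch of this training-and-prediction flow is shown below, assuming the training and testing traffic have already been exported to CSV files with a binary label column; the file names, the label column, and the choice of a Random Forest are illustrative assumptions.

# Sketch: train a traffic classifier from the training CSV and label flows as Benign (B) or Malicious (M).
# File names and the "label" column are assumed for illustration.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report

train = pd.read_csv("ids_training_traffic.csv")
test = pd.read_csv("ids_testing_traffic.csv")

X_train, y_train = train.drop(columns=["label"]), train["label"]   # label values: "B" or "M"
X_test, y_test = test.drop(columns=["label"]), test["label"]

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# Tune/evaluate against the held-out testing traffic before deployment
print(classification_report(y_test, clf.predict(X_test)))

# Predict whether a new, unlabeled traffic record is benign or malicious
new_flow = X_test.iloc[[0]]
print("Prediction for new flow:", clf.predict(new_flow)[0])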
4. Cloud Deployment and Decision Support
• Deploy the IoT sensor-based crop prediction and camera-assisted disease detection models on a cloud platform for real-time analysis.
• Cloud computing can handle large datasets and enable farmers to monitor remotely.
• Develop a user-friendly dashboard for farmers to visualize sensor data, crop predictions,
disease alerts, and security notifications.
5. Performance Evaluation
• Deploy the integrated system in real-world farming scenarios to test the effectiveness of
crop prediction and disease detection.
• Collect feedback from farmers to refine the models and improve usability.
• Ensure the system can handle increased data loads from large-scale farming
operations.
Technologies Used in the Project
The project combines advanced IoT, AI, machine learning, and Camera-Assisted technologies to
deliver a comprehensive, innovative farming solution. Below is a description of the critical
technologies used in the project:
1. IoT Sensors
IoT plays a crucial role in precision farming by enabling real-time monitoring of soil and environmental conditions. Various sensors are deployed in the farm fields to collect data, which are transmitted wirelessly to a centralized system for analysis.
• Soil Nutrient Sensors: An NPK sensor measures the levels of Nitrogen (N), Phosphorus (P), and Potassium (K), which are critical for plant growth.
• Temperature and Humidity Sensors: Monitor the local climate, which affects crop
growth.
• Rainfall Sensors: Track the amount of precipitation, assisting in irrigation and water
management.
Communication Protocols:
• LoRaWAN, Zigbee, or MQTT are used for wireless data transmission from IoT sensors to
a cloud-based platform or local server.
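As one concrete possibility, the sketch below publishes a sensor reading over MQTT with the paho-mqtt client library; the broker address, topic name, and payload fields are hypothetical placeholders rather than values defined by the project.

# Sketch: publish an IoT sensor reading to an MQTT broker with paho-mqtt's one-shot helper.
# Broker host, topic, and field names are illustrative assumptions.
import json
import paho.mqtt.publish as publish

BROKER = "broker.example.com"      # placeholder broker address
TOPIC = "farm/field1/sensors"      # placeholder topic

reading = {"N": 90, "P": 42, "K": 43, "temperature": 21.0,
           "humidity": 82.0, "ph": 6.5, "rainfall": 200.0}

publish.single(TOPIC, payload=json.dumps(reading), hostname=BROKER, port=1883, qos=1)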
2. Camera Assistance
UAVs (drones) or cameras are well-equipped to capture high-resolution aerial images of crops,
enabling real-time crop monitoring and early detection of diseases. The drones can be
programmed to fly over farm fields, covering large areas efficiently and autonomously.
Technologies Used:
• HD Cameras: Capture high-resolution aerial and ground-level images of the crops.
• Flight Controllers: Autonomous drones rely on controllers such as PX4 or ArduPilot for stable flight and navigation.
• GPS Navigation: Ensures the drones cover specific areas and return to base after data
collection.
• Image Transmission: Camera images are transmitted to a central server or cloud for
further analysis.
3. Machine Learning and AI
Machine learning algorithms are applied to the IoT sensor data and to camera-captured images to predict suitable crops and detect crop diseases.
Technologies Used:
• Scikit-learn: A Python library used for building machine learning models, such as
Decision Trees, Random Forests, or Support Vector Machines (SVM) for crop prediction.
• TensorFlow/Keras: Deep learning frameworks are used to build and train convolutional
neural networks (CNNs) to analyze camera images and detect crop diseases.
• Transfer Learning: Pre-trained CNN models like MobileNet, ResNet, or VGG16 are
fine-tuned for disease detection tasks.
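A minimal fine-tuning sketch along these lines is shown below, using MobileNetV2 as a frozen backbone with a new classification head; the dataset directory layout, image size, class count, and training settings are assumptions for illustration rather than the project's exact configuration.

# Sketch: fine-tune a pre-trained MobileNetV2 for crop disease classification with Keras.
# Directory layout (one sub-folder per class), image size, and class count are assumed.
import tensorflow as tf

IMG_SIZE = (224, 224)
NUM_CLASSES = 88          # e.g., the merged plant-disease dataset described later

train_ds = tf.keras.utils.image_dataset_from_directory(
    "plant_disease_dataset/train", image_size=IMG_SIZE, batch_size=32)

base = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False    # freeze the pre-trained feature extractor

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),           # map pixels to [-1, 1] as MobileNetV2 expects
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),    # customized final layer for the disease classes
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=5)
model.save("crop_disease_cnn.keras")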
4. Convolutional Neural Networks (CNN)
CNNs are employed for image classification tasks, specifically for detecting crop diseases from images captured by cameras. CNNs automatically learn features from the image data, making them suitable for pattern recognition and image analysis tasks.
Technologies Used:
• Convolution Layers: Extract features like edges, shapes, and textures from crop
images.
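To make the role of convolution layers concrete, the short sketch below stacks two Conv2D and pooling layers and prints the resulting feature-map shapes; the filter counts and kernel sizes are arbitrary illustration values.

# Sketch: convolution layers turning a crop image into progressively smaller, deeper feature maps.
import tensorflow as tf

feature_extractor = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(224, 224, 3)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),   # low-level features: edges and color blobs
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),   # higher-level features: textures and shapes
    tf.keras.layers.MaxPooling2D(),
])

feature_extractor.summary()   # shows spatial size shrinking while channel depth grows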
5. Cloud Computing
Cloud computing stores, processes, and analyzes large amounts of data from IoT sensors and
Camera-assists. It enables remote monitoring and real-time analysis of crop conditions.
Cloud Platforms:
• AWS (Amazon Web Services), Google Cloud, or Microsoft Azure: These host data
analysis models and provide real-time insights to farmers.
• Cloud compute services: Platforms such as Google Colab or Kaggle provide the computing power to process data and train machine learning models on vast amounts of data.
• IoT Cloud: Arduino IoT Cloud is a service that lets users create IoT applications in a few simple steps by combining intelligent technology, user-friendly interfaces, and powerful features.
6. Intrusion Detection System (IDS)
The IDS ensures the security of the IoT device and camera-assistance networks by monitoring network traffic for signs of malicious activity. It helps prevent cyber-attacks that could disrupt farming operations or corrupt data.
Technologies Used:
• Machine Learning-Based IDS: A machine learning model that classifies network traffic as normal or suspicious based on patterns.
7. User Interface and Data Visualization
A user-friendly interface allows farmers to monitor real-time data, receive insights on crop health, and take timely actions based on AI predictions. Data visualization helps present complex data understandably.
Technologies Used:
• Dashboards: Tools like Plotly Dash, Tableau, or Grafana visually display real-time
sensor data and camera imagery analysis results.
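As one possible starting point, the sketch below builds a very small Plotly Dash page that plots sensor readings from a log file; the CSV name and its timestamp/temperature columns are placeholders, not part of the project specification.

# Sketch: a minimal Plotly Dash dashboard showing temperature readings over time.
# The CSV file and its "timestamp"/"temperature" columns are illustrative assumptions.
import pandas as pd
import plotly.express as px
from dash import Dash, dcc, html

df = pd.read_csv("sensor_log.csv", parse_dates=["timestamp"])
fig = px.line(df, x="timestamp", y="temperature", title="Field temperature")

app = Dash(__name__)
app.layout = html.Div([
    html.H2("Farm monitoring dashboard"),
    dcc.Graph(figure=fig),
])

if __name__ == "__main__":
    app.run(debug=True)   # serves the dashboard locally in a browser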
Summary of Technologies:
Data Description
1. IoT-Based Crop Prediction
The dataset contains environmental and soil data to assist with predicting the most suitable crops for farming. The data includes 2200 rows and eight columns. Here's an overview of the key features in the dataset:
1. N (Nitrogen): Essential for plant growth, nitrogen is one of the three primary nutrients
that influence crop yield and health.
2. P (Phosphorus): A critical nutrient that promotes root development and increases the
plant's ability to absorb water.
3. K (Potassium): Enhances a plant's resistance to disease, aids in water regulation, and
improves the overall health of crops.
4. Temperature: Recorded in degrees Celsius, this is a vital environmental factor that affects plant metabolism and growth.
5. Humidity: Relative humidity, recorded as a percentage, which affects transpiration and crop water requirements.
6. pH: The acidity or alkalinity of the soil, which influences nutrient availability and overall plant health.
7. Rainfall: Measured in mm, it indicates the water available to crops, affecting their
growth and irrigation needs.
8. Label: The target variable represents the recommended crop for environmental and soil
conditions.
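A quick way to verify this layout, assuming the Kaggle crop-recommendation CSV linked in the references, is sketched below; the file name is a placeholder.

# Sketch: inspect the crop-recommendation dataset (2200 rows x 8 columns expected).
import pandas as pd

df = pd.read_csv("crop_recommendation.csv")

print(df.shape)                  # expected: (2200, 8)
print(df.columns.tolist())       # N, P, K, temperature, humidity, ph, rainfall, label
print(df["label"].nunique())     # expected: 22 distinct crop labels
print(df.describe())             # value ranges of the numeric soil and climate features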
This data contains 22 unique labels of different crops with their percentage weightage in the
figure below.
The image below displays a series of bar plots comparing various soil and environmental
parameters for different crops. Here's a description of each parameter based on the dataset:
1. Nitrogen (N): The first plot shows the nitrogen levels across different crops. Crops like
rice, pigeon peas, and cotton have higher nitrogen requirements, while crops like lentils
and kidney beans require significantly less nitrogen.
2. Phosphorus (P): The second plot highlights phosphorus requirements. Rice and
watermelon need more phosphorus than other crops, while crops like moth beans and
chickpeas demand lower phosphorus.
3. Potassium (K): The third plot reveals potassium levels, with crops like coconut and
watermelon requiring very high amounts, while chickpeas and lentils need
comparatively less.
4. Temperature (°C): The fourth plot presents the optimal temperature ranges for various
crops. Crops like coffee and oranges thrive in higher temperatures, while crops like
chickpeas and kidney beans grow better at lower temperatures.
5. Humidity (%): The fifth plot indicates the humidity levels suitable for different crops.
Coconut and papaya are more tolerant of humid conditions, while chickpeas and black
gram grow well in lower humidity.
6. pH Level: The sixth plot compares soil pH levels for each crop. Most crops prefer slightly
acidic to neutral pH, with crops like watermelon and coconut tolerating higher pH
values, while pomegranate and chickpea perform better in more acidic soils.
7. Rainfall (mm): The final plot displays rainfall requirements, with crops like rice and
papaya needing much more water than lentils, black gram, and moth beans, which
thrive in lower rainfall environments.
2. Camera-Assisted Based Crop Disease Detection
This data set was created as part of a study project on plant disease classification, so several self-imposed requirements shaped its composition: a large number of images was needed; for each plant, at least a healthy class and at least one disease class for its most common diseases were required; the images had to be annotated; both laboratory and field images were needed; and the species had to include important staple food plants and the plants with the highest global production for four years running. No existing data set met all of these requirements as it was, so a new one was compiled: images from 14 different existing data sets were collected and merged. The table below lists all the data sets used, their characteristics, and how many images were ultimately imported from each. For the total data set, images from all of these sources and classes were taken. Watermarked images were removed, singular condition classes were removed, non-food plants were removed, and every class with fewer than 50 samples was removed. As a result, the data set used for training in this work has 88 classes and 76,084 images, with a total file size of about 17.6 GB. A class bias may exist where, for some classes, images were shot against very different backgrounds or only against laboratory backgrounds.
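The class-filtering step described above can be sketched as follows, assuming the merged images are stored in one sub-folder per class; the directory name is a placeholder, while the 50-image threshold follows the description.

# Sketch: count images per class folder and list classes below the 50-sample cut-off.
# Assumes a layout like merged_dataset/<class_name>/<image files>.
from pathlib import Path

DATASET_DIR = Path("merged_dataset")   # placeholder path
MIN_SAMPLES = 50

counts = {
    class_dir.name: sum(1 for f in class_dir.iterdir() if f.is_file())
    for class_dir in DATASET_DIR.iterdir() if class_dir.is_dir()
}

too_small = {name: n for name, n in counts.items() if n < MIN_SAMPLES}
print(f"{len(counts)} classes found; {len(too_small)} fall below {MIN_SAMPLES} samples:")
for name, n in sorted(too_small.items()):
    print(f"  {name}: {n} images")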
Data Description
This table provides a detailed overview of multiple plant disease datasets, each with unique
properties and image types, used for plant disease classification tasks. Here's a summary of the
key aspects:
• PlantDisease65: Includes 65 classes with 62,600 images, featuring PlantVillage images and photos of single plant leaves. Non-food plants and singular classes were excluded.
• PlantDoc: Contains 2,598 images across 27 classes, primarily field images taken under
different lighting conditions, with some lab images included.
• Coffee Plant Disease: This dataset has 1,000 images across three classes, focusing on
field images of coffee plants.
• Wheat Leaf: Comprises 407 images in 3 classes, containing authentic field wheat
images from the Holeta wheat farm in Ethiopia, verified by a plant pathologist.
• Chili Plant Disease: Includes 500 images across five classes, featuring field images of
whole chili plants or plant parts.
• Soybean Leaves (Mendeley): Offers 6,410 images across three classes, with field
images of soybean plants taken at various times and heights using drones and
smartphones.
• Rice Leaf Disease: Features 120 images in 3 classes, with single rice leaves
photographed against a plain white background.
• Rice Leaf: Contains 3,355 images in 4 classes, focused solely on Hispa disease images
of rice leaves.
• Cucumber Plant Disease: Includes 691 images in 2 classes (healthy and ill), with field
images of cucumber leaves.
• Plant Disease Expert: Contains additional images from various classes, such as six
classes of tea leaves and grape black rot images.
• Leaf Disease: Adds 2,596 field images across five classes for cassava.
• Sugarcane Disease: Includes 299 images across three classes, focusing on field
images of sugarcane leaves.
• Sugarcane Leaf Disease: Contains 224 images from sugarcane farms across three
classes of field images.
The image above presents disease counts across 23 different plant species. Each subplot
shows the occurrence of various diseases in a specific plant, along with the count of healthy
samples for that species. Here's a breakdown:
• Apple: The significant diseases affecting apples include scab, black rot, and rust, with
many healthy samples.
• Cassava: Brown streak disease, bacterial blight, and mosaic disease are prominent,
with fewer healthy samples.
• Cherry: Primarily affected by powdery mildew, though there are many healthy samples.
• Chili: Whitefly, red spot, yellowish symptoms, and leaf curl are common, with relatively fewer healthy samples.
• Coffee: Rust, red spider mite, and cercospora leaf spot are common diseases.
• Corn: Affected by common rust, northern leaf blight, and gray leaf spot.
• Grape: Black rot, black measles, and leaf blight are the primary diseases.
• Potato: Early blight and late blight are significant issues, with many healthy samples.
• Rice: Neck blast, leaf blast, and brown spot are notable diseases, with hispa in fewer
cases.
• Soybean: Powdery mildew, rust, bacterial blight, and southern blight are common
diseases.
• Strawberry: Leaf scorch is the primary disease, but many healthy samples exist.
• Sugarcane: Affected by red rot, bacterial blight, rust, and red stripe.
• Tea: Red leaf spot, brown blight, and bird eye spot are significant concerns.
• Tomato: Yellow leaf curl virus, bacterial spot, and early blight are common, with a notable number of healthy samples.
• Wheat: Major diseases include yellow rust, brown rust, and septoria.
Limitations
Despite the promising results from the integration of IoT, Camera-assist, and AI for precision
agriculture, several limitations need to be acknowledged:
1. Data Limitations:
• Limited Data for Certain Crops: While the system successfully predicts crop health
and diseases for the available datasets, there may be limitations due to the lack of data
diversity for less common crops. This could lead to reduced accuracy when applying the
system in regions where the dataset does not adequately represent local crops or
conditions.
• Quality of Camera-Captured Images: The quality of images captured by the camera is affected by environmental conditions such as weather, light exposure, and flight stability. Poor image quality can negatively impact the performance of the disease detection models, as machine learning models rely heavily on sharp, high-resolution data for accurate predictions.
• Sensor Data Gaps: The IoT sensors used in the soil and environmental data collection
are prone to failure due to hardware issues or network connectivity problems, leading to
gaps in the data. These data gaps can hinder the accuracy of real-time predictions and
limit the robustness of the crop prediction models.
3. Environmental Sensitivity:
• Weather Conditions: Harsh weather conditions, such as rain, wind, or extreme heat, can disrupt the performance of IoT sensors and camera assistance. For example, sensors may malfunction in saturated soils, and camera-assisted imaging may be unavailable during rainfall, reducing the system's effectiveness during critical farming seasons.
4. Model Generalization:
• Models trained on the available datasets may not generalize well to crops, regions, or imaging conditions that are poorly represented in the training data, limiting accuracy when the system is deployed in new environments.
Future Work
There are several avenues for extending and improving the system to address the limitations above and advance the capabilities of AI-driven precision agriculture:
o Data Augmentation and Expansion: Future work can focus on expanding the
dataset to cover a broader range of crops, geographic regions, and growing
conditions. This will help improve the generalization of crop prediction and
disease detection models, making them more applicable across diverse farming
environments.
o Use of UAVs: Drones can cover larger land areas and provide aerial crop
coverage.
o Edge Computing and Local Processing: Edge computing can be integrated into
the system to mitigate network issues in rural areas. This involves processing
data locally at the sensor or drone level before transmitting only essential data
to the cloud, reducing the dependency on continuous internet connectivity.
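One way such edge-side processing could look is sketched below: readings are aggregated locally and only a summary, plus any out-of-range values, is forwarded to the cloud; the window contents, thresholds, and upload hook are hypothetical.

# Sketch: edge-side aggregation so only summaries and anomalies leave the device.
# Window contents, thresholds, and the upload() stand-in are illustrative assumptions.
from statistics import mean

def upload(payload: dict) -> None:
    # Stand-in for an MQTT or HTTP transfer to the cloud platform.
    print("Uploading:", payload)

def process_window(temperature_readings: list) -> None:
    summary = {
        "mean_temperature": round(mean(temperature_readings), 2),
        "max_temperature": max(temperature_readings),
        "samples": len(temperature_readings),
    }
    anomalies = [t for t in temperature_readings if t > 40.0]   # assumed heat threshold
    if anomalies:
        summary["anomalies"] = anomalies
    upload(summary)   # only the compact summary is transmitted, not every raw reading

process_window([31.2, 30.8, 41.5, 33.0, 32.1])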
3. AI Model Advancements:
4. Environmental Resilience:
Conclusion
Modern technologies such as IoT, camera-assist, machine learning, and deep learning emerged
as the harbinger of the agricultural revolution characterized by precision farming. The adoption
of IoT sensors that make an exact variable selection of the crops based on the critical soil and
environmental parameters are the technologies that permit the farmers to make a conclusion
on the crop types to be planted, the land area to be scheduled, the resources to be used, and
the predicted yield respectively. Such a criterion is essential as it enables divine steam engine
machines in agriculture.
In this project, blending IoT sensor and camera-assist technology with AI models has demonstrated promising results for precision agriculture. The system uses real-time environmental data to determine the most suitable crops to plant among the available options, steering farmers toward lower input use and higher yields. The camera-assisted plant disease detection system built on Convolutional Neural Networks (CNNs) allows rapid detection of crop diseases, improving the chances of timely intervention and minimizing economic loss.
This combination of IoT for crop prediction and camera assistance for disease surveillance enhances farming efficiency by automating strategic decision-making and reducing reliance on manual labor. The AI-based intelligent farming solutions show impressive results, confirming their potential to substantially improve productivity and sustainability.
References
1. Gopi PS, Karthikeyan M. Multimodal machine learning based crop recommendation and yield prediction model. Intell Autom Soft Comput. 2023 Jan;36:313–26.
2. Rajak P, Lachure J, Doriya R. CNN-LSTM-based IDS on precision farming for IIoT data.
3. Oliver MA, Bishop TFA, Marchant BP (eds.). Precision Agriculture for Sustainability and Environmental Protection.
4. Vinod Chandra SS, Anand Hareendran S, Albaaji GF. Precision farming for sustainability: An agricultural intelligence model.
5. Shadrin D, Menshchikov A, Somov A, Bornemann G, Hauslage J, Fedorov M. Enabling precision agriculture through embedded sensing with artificial intelligence.
6. Shaikh TA, Rasool T, Lone FR. Towards leveraging the role of machine learning and artificial intelligence in precision agriculture and smart farming.
7. Dataset, IoT-based crop prediction: https://www.kaggle.com/datasets/aksahaha/crop-recommendation
8. Dataset, plant disease classification: https://www.kaggle.com/datasets/alinedobrovsky/plant-disease-classification-merged-dataset/data