AGRIEYE: AUTONOMOUS CROP HEALTH MONITORING FOR PRECISION AGRICULTURE

CERTIFICATE

Date:
Place:

Executive Director
Dr. Jayaraj U. Kidav
ACKNOWLEDGEMENT
We express our heartfelt gratitude to all those who have contributed to the successful completion
of our project titled "AGRIEYE: AUTONOMOUS CROP HEALTH MONITORING FOR
PRECISION AGRICULTURE," which was undertaken as part of the requirements for the
award of the Post Graduate Diploma in Unmanned Aircraft System Programming (PGDUASP),
September 2023 batch, from the National Institute of Electronics and Information Technology
(NIELIT), Aurangabad.
First and foremost, we extend our sincere thanks to Mr. Rishidas E.V., whose guidance,
encouragement, and invaluable insights were instrumental in shaping this project. His expertise
and unwavering support throughout the duration of this endeavor have been truly invaluable.
We are also grateful to Mr. Saurabh Bansode, Scientist 'C,' for his valuable inputs,
encouragement, and guidance as the course coordinator. His expertise in the field has greatly
enriched our learning experience.
We would also like to thank all the faculty members and staff of NIELIT, Aurangabad, for their
support and assistance throughout the duration of the course.
Last but not least, we are grateful to our families and friends for their unwavering support and
encouragement, which has been a constant source of motivation throughout this journey.
Thank you to everyone who has been a part of this project and has contributed to its successful
completion.
LIST OF FIGURES
LIST OF ABBREVIATIONS
ABSTRACT
1. INTRODUCTION
1.1 Introduction to Precision Agriculture
1.2 Challenges in Crop Health Monitoring
1.3 Role of Autonomous Systems in Agriculture
1.4 Overview of AGRIEYE Project
1.5 Significance and Potential Impact
2. LITERATURE SURVEY
2.1 Traditional Crop Monitoring Methods
2.3 Data Analytics and Machine Learning for Crop Health Monitoring
2.4 Integration of Autonomous Systems and Data Analytics in Precision
Agriculture
2.5 Case Studies and Best Practices
2.6 Challenges and Opportunities in Autonomous Crop Health Monitoring
3. SYSTEM OVERVIEW
3.1 Introduction to AGRIEYE Project
3.2 Key Components of AGRIEYE System
3.3 Integration and Interoperability
4. SYSTEM DEVELOPMENT
4.1 Workflow and Operations
4.2 Architecture of the Project
4.3 Working Principle of AgriEYE
4.4 Presentation of the Project
4.5 Code
4.6 Performance and Validation
4.7 Scalability and Adaptability
4.8 Future Development and Enhancement
BLDC Brushless DC
LiPo Lithium-Polymer
NIR Near-Infrared
In the world of modern agriculture, the adoption of precision farming techniques has become
increasingly crucial for enhancing productivity while minimizing resource consumption. The
project "AGRIEYE: Autonomous Crop Health Monitoring for Precision Agriculture" stands
at the forefront of this technological revolution, aiming to revolutionize crop monitoring
practices through the integration of autonomous systems and advanced data analytics.
The significance of the AGRIEYE project extends beyond individual farm operations. By
enabling proactive disease management, optimizing resource allocation, and promoting
sustainable agricultural practices, AGRIEYE has the potential to drive significant
improvements in crop yields, economic returns for farmers, and environmental stewardship.
Furthermore, the scalability and adaptability of the AGRIEYE system make it well-suited for
deployment across diverse agricultural landscapes, from smallholder farms to large-scale
plantations.
Through collaborative research and innovation, the AGRIEYE project represents a paradigm
shift in crop monitoring methodologies, paving the way for a more efficient, resilient, and
sustainable agricultural future.
Introduction
The advent of precision agriculture can be traced back to the late 20th century, with the
proliferation of global positioning systems (GPS), geographic information systems (GIS),
and remote sensing technologies. These advancements empowered farmers to collect
detailed spatial and temporal data about their fields, enabling them to make informed
decisions tailored to specific crop needs and environmental conditions.
At the heart of precision agriculture lies the concept of variability – recognizing that soils,
crops, and environmental conditions vary spatially and temporally within a field. By
understanding and managing this variability, farmers can maximize yields, minimize input
costs, and mitigate environmental impacts. This targeted approach not only enhances
agricultural productivity but also contributes to sustainability by reducing chemical runoff,
soil erosion, and greenhouse gas emissions.
The adoption of precision agriculture continues to grow globally, driven by the need to
feed a growing population while facing challenges such as climate change, resource
scarcity, and sustainability concerns. By embracing innovation and leveraging cutting-edge
technologies, precision agriculture offers a pathway towards a more resilient, efficient, and
sustainable future for global food production.
Crop health monitoring is a critical aspect of modern agriculture, essential for ensuring
optimal yields, minimizing losses, and promoting sustainable farming practices. However,
several challenges persist in traditional crop monitoring methods, hindering the ability of
farmers to effectively manage crop health. These challenges include the subjectivity of visual
assessments, high labor requirements, poor capture of spatial variability, and delays between
data collection and analysis.
One of the key roles of autonomous systems in agriculture is in the domain of crop
monitoring and management. Drones equipped with advanced sensors, including
multispectral and thermal cameras, can autonomously survey large agricultural fields with
high spatial resolution and capture detailed imagery of crop health indicators. By
leveraging artificial intelligence and machine learning algorithms, these systems can
analyze the collected data in real-time to detect early signs of crop diseases, nutrient
deficiencies, and pest infestations. This proactive approach to crop monitoring enables
farmers to take timely and targeted action, such as adjusting irrigation schedules, applying
fertilizers, or deploying pest control measures, thereby minimizing yield losses, and
optimizing resource use.
Furthermore, the role of autonomous systems extends beyond individual farm operations
to encompass broader agricultural ecosystems. Autonomous sensor networks deployed
across agricultural landscapes can collect real-time environmental data, monitor soil
health, and assess weather conditions, providing valuable insights for decision-making at
regional and even global scales. This data-driven approach to agriculture facilitates
precision farming practices, enhances resilience to climate change, and promotes
sustainable agricultural development.
Real-time Data Analysis: Data collected by the AGRIEYE system are analyzed in
real-time using advanced data analytics techniques. Machine learning algorithms
are trained to recognize patterns in the imagery and identify anomalies indicative of
crop health issues. By processing data on-the-fly, the AGRIEYE system can
provide farmers with timely insights into emerging threats and recommend
appropriate management actions.
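As a hedged illustration of what such real-time anomaly detection could look like, the sketch below flags outlying NDVI readings with scikit-learn's IsolationForest. The data are simulated, and the choice of feature, library, and contamination fraction are assumptions for demonstration, not part of the AGRIEYE implementation.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Simulated per-plot NDVI readings: healthy plots cluster near 0.8,
# while a few stressed plots fall well below (values are illustrative).
rng = np.random.default_rng(0)
healthy = rng.normal(0.80, 0.03, size=(95, 1))
stressed = np.array([[0.35], [0.40], [0.30], [0.38], [0.33]])
ndvi = np.vstack([healthy, stressed])

# Fit an unsupervised anomaly detector; `contamination` is the assumed
# fraction of anomalous plots in the data.
detector = IsolationForest(contamination=0.05, random_state=0)
labels = detector.fit_predict(ndvi)  # -1 = anomaly, 1 = normal

flagged = np.where(labels == -1)[0]
print(f"{len(flagged)} plots flagged for inspection")
```

Because the detector is unsupervised, no labeled training data is needed; newly streamed readings can be scored against the fitted model as they arrive.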
Decision Support Tools: The AGRIEYE project also includes the development of
decision support tools to assist farmers in interpreting and acting upon the insights
generated by the system. These tools provide actionable recommendations for
optimizing input management, adjusting irrigation schedules, and implementing
pest control measures based on the detected crop health issues.
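A minimal sketch of how one such decision rule might be expressed is shown below. The NDVI and canopy-temperature thresholds and the recommendation texts are purely hypothetical placeholders, not AGRIEYE's actual logic.

```python
def recommend(ndvi: float, canopy_temp_c: float, air_temp_c: float) -> list[str]:
    """Return illustrative management recommendations for one field zone.

    All thresholds below are placeholders chosen for demonstration only.
    """
    actions = []
    if ndvi < 0.4:
        actions.append("scout zone for disease or nutrient deficiency")
    # A canopy noticeably warmer than the surrounding air can indicate
    # water stress (reduced transpirational cooling).
    if canopy_temp_c - air_temp_c > 3.0:
        actions.append("increase irrigation in this zone")
    if not actions:
        actions.append("no action needed")
    return actions

print(recommend(ndvi=0.35, canopy_temp_c=34.0, air_temp_c=29.0))
```

In practice, such rules would be combined with agronomic models and weather forecasts rather than fixed cutoffs.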
The significance and potential impact of the AGRIEYE project extend far beyond
individual farm operations, encompassing broader societal, environmental, and economic
dimensions. By revolutionizing crop health monitoring and management through the
integration of autonomous systems and advanced data analytics, the project holds immense
promise for transforming agriculture and addressing key challenges facing the global food
system.
Improved Crop Yields and Food Security: Timely and accurate detection of crop
diseases, nutrient deficiencies, and pest infestations facilitated by the AGRIEYE
system can lead to improved crop yields and enhanced food security. By enabling
farmers to identify and address emerging threats promptly, the project helps
mitigate yield losses, ensuring a stable and reliable food supply for growing
populations.
The significance and potential impact of the AGRIEYE project are multifaceted, spanning
agricultural, environmental, economic, and social dimensions. By leveraging autonomous
systems and advanced data analytics, the project holds the promise of transforming
agriculture and paving the way towards a more resilient, efficient, and sustainable food
system for future generations.
Traditional crop monitoring methods have long been the cornerstone of agricultural
management practices, providing farmers with essential insights into the health and
performance of their crops. These methods, rooted in observational techniques and manual
labor, have been instrumental in guiding decision-making processes. However, they come
with inherent limitations and challenges that have prompted the exploration of more
advanced technologies in recent years.
Manual Sampling and Soil Testing: Manual sampling involves collecting soil and
plant tissue samples from various locations within a field for laboratory analysis. Soil
tests assess soil fertility, pH levels, and nutrient content, providing valuable
information for nutrient management decisions. Plant tissue analysis evaluates nutrient
uptake and identifies potential deficiencies or imbalances affecting crop health.
However, manual sampling is labor-intensive, time-consuming, and may not capture
spatial variability adequately.
Despite their historical significance, traditional crop monitoring methods have several
limitations that hinder their effectiveness in modern agriculture. These limitations include
subjectivity, labor intensiveness, spatial variability, and temporal delays in data collection
and analysis. As a result, there has been a growing interest in adopting advanced
technologies such as remote sensing, drones, and data analytics to complement or replace
traditional methods and enhance crop monitoring capabilities.
3. LiDAR (Light Detection and Ranging): LiDAR technology uses laser pulses to
measure distances to objects and generate highly accurate three-dimensional (3D)
maps of terrain and vegetation structure. In agriculture, LiDAR is used to assess
crop canopy height, biomass, and structural characteristics. LiDAR-derived data can
help quantify crop growth dynamics, identify areas of lodging or canopy closure,
and assess the effects of topography on water flow and soil erosion.
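As an illustrative sketch of one common LiDAR-derived product, the snippet below computes a canopy height model (CHM) as the difference between a digital surface model (DSM) and a digital terrain model (DTM). The toy rasters are invented for demonstration; this is not code from the AGRIEYE system.

```python
import numpy as np

# Toy 3x3 elevation rasters in metres:
# DSM = top-of-canopy elevation, DTM = bare-ground elevation.
dsm = np.array([[102.0, 103.5, 101.0],
                [104.0, 105.2, 102.5],
                [101.5, 102.0, 100.8]])
dtm = np.array([[100.0, 100.5, 100.0],
                [100.8, 101.0, 100.5],
                [100.5, 100.2, 100.3]])

# Canopy height model: vegetation height above ground, clipped at zero
# so that noise cannot produce negative heights.
chm = np.clip(dsm - dtm, 0.0, None)
print(f"mean canopy height: {chm.mean():.2f} m, max: {chm.max():.2f} m")
```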
2.3 Data Analytics and Machine Learning for Crop Health Monitoring
In recent years, data analytics and machine learning have emerged as powerful tools for
analyzing agricultural data and improving crop health monitoring practices. By leveraging
large volumes of data collected from various sources such as remote sensing, field sensors,
and weather stations, these techniques enable farmers to make data-driven decisions and
optimize agricultural management strategies. In the context of crop health monitoring, data
analytics and machine learning offer several advantages, including:
Predictive Modeling: Machine learning models can predict crop health indicators,
such as disease occurrence, nutrient deficiencies, or yield potential, based on input
data collected from multiple sources. By analyzing historical data and learning
from past observations, these models can forecast future trends and anticipate
potential risks, allowing farmers to take proactive measures to mitigate crop health
issues.
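A hedged sketch of such predictive modeling is given below: a random forest trained on synthetic historical records to predict disease occurrence from NDVI and humidity. The data-generating rule, features, and model choice are assumptions for illustration only.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic historical records: features are [NDVI, relative humidity %];
# label 1 = disease observed. The generating rule (low NDVI combined with
# high humidity leads to disease) is purely illustrative.
rng = np.random.default_rng(42)
ndvi = rng.uniform(0.2, 0.9, 500)
humidity = rng.uniform(40, 100, 500)
X = np.column_stack([ndvi, humidity])
y = ((ndvi < 0.5) & (humidity > 75)).astype(int)

# Hold out a test split to estimate how well the model generalizes.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```

A real deployment would use many more features (weather history, soil data, spectral indices) and careful validation across seasons and fields.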
Scalability and Adaptability: Data analytics and machine learning techniques are
scalable and adaptable to different agricultural contexts and cropping systems.
Whether applied to smallholder farms or large-scale plantations, these techniques
can be tailored to specific needs and environmental conditions, making them
versatile tools for crop health monitoring.
Data analytics and machine learning offer promising solutions for enhancing crop health
monitoring practices in agriculture. By harnessing the power of data-driven insights and
predictive modeling, farmers can optimize resource allocation, reduce risks, and improve
overall crop productivity and sustainability. As these technologies continue to evolve, they
hold the potential to revolutionize agricultural decision-making and contribute to the
advancement of precision agriculture.
The integration of autonomous systems and data analytics in precision agriculture holds
tremendous potential for revolutionizing farming practices. By harnessing the power of
real-time data, advanced algorithms, and autonomous technologies, farmers can optimize
crop management decisions, enhance productivity, and promote sustainability in
agriculture.
1. Afef Krichen, Issam Kerkeb, et al. (2023), "Recent Advances in Crop Disease
Detection Using UAV and Deep Learning Techniques": a thorough review that
summarizes the state of drone-based crop disease detection using deep learning. It
covers different sensors (multispectral, hyperspectral), image processing, and the
machine/deep learning techniques used, and provides a taxonomy for categorizing
research approaches. (Remote Sensing, 2023;
Link: https://www.mdpi.com/2072-4292/15/9/2450)
The integration of autonomous technologies for crop health monitoring presents both
challenges and opportunities in modern agriculture. While these technologies offer
significant opportunities, their adoption also involves practical, technical, and economic
challenges.
SYSTEM OVERVIEW
Validation and Adoption: Conduct field trials and validation studies to assess
the performance and effectiveness of the AGRIEYE system in real-world
agricultural settings, with the ultimate goal of facilitating widespread adoption
and integration into existing precision agriculture workflows.
The significance of the AGRIEYE project in advancing precision agriculture lies in its
potential to address critical challenges faced by farmers in managing crop health and
optimizing productivity. By leveraging autonomous technologies and data analytics,
AGRIEYE offers several key advantages:
Timely Insights: Real-time data analytics provide farmers with timely insights
into crop health status, allowing for early detection of issues and prompt
intervention to mitigate risks and minimize yield losses.
In the dynamic landscape of modern agriculture, the quest for precision and efficiency
has led to the development of innovative technologies aimed at optimizing crop health
monitoring practices. Among these advancements stands an integrated system poised
to revolutionize how farmers assess and manage crop health: AGRIEYE. Designed at
the intersection of autonomous systems and advanced data analytics, AGRIEYE
represents a paradigm shift in precision agriculture, offering farmers unprecedented
insights and capabilities to enhance productivity while minimizing environmental
impact.
At its core, the AGRIEYE system leverages autonomous aerial platforms equipped
with state-of-the-art sensors to capture high-resolution imagery of agricultural fields.
These platforms, often drones, operate autonomously, flying predefined flight paths or
following GPS waypoints to cover vast areas efficiently. Equipped with specialized
sensors such as multispectral and thermal cameras, they capture detailed data on crop
health indicators, soil conditions, and environmental parameters with remarkable
accuracy and precision.
However, the true power of AGRIEYE lies in its advanced data analytics capabilities.
Collected data are processed in real-time using sophisticated algorithms, including
machine learning techniques, to extract actionable insights into crop health status. By
analyzing imagery and detecting patterns indicative of crop diseases, nutrient
deficiencies, and pest infestations, the system provides farmers with timely,
actionable alerts.
Moreover, the integration of decision support tools within the AGRIEYE system
further enhances its effectiveness. These tools synthesize analyzed data with
agronomic models, weather forecasts, and historical records to generate customized
recommendations tailored to specific field conditions. Whether it is adjusting input
application rates, optimizing irrigation schedules, or implementing targeted pest
control measures, farmers can rely on AGRIEYE to guide their actions with precision
and confidence.
Airframe:
The airframe serves as the structural framework of the aerial platform, providing
stability and aerodynamic efficiency during flight. It is typically constructed from
lightweight materials such as carbon fiber or durable plastics to minimize weight
while maintaining structural integrity.
The S500 multirotor airframe offers a multitude of benefits that make it an ideal
choice for various aerial applications, including crop monitoring within the
AGRIEYE system. Its robust and lightweight construction, typically made from
durable materials such as carbon fiber or high-strength plastics, ensures structural
integrity while minimizing weight, resulting in enhanced flight performance and
agility. The S500's modular design facilitates easy assembly and customization,
allowing for the integration of different payloads, sensors, and electronics to suit
specific operational requirements. Its spacious frame provides ample room for
mounting batteries, sensors, and other payload components.
Propeller:
Propellers are responsible for generating thrust to propel the aerial platform
through the air. They come in assorted sizes and configurations, with factors such
as diameter, pitch, and blade count influencing flight characteristics such as speed,
maneuverability, and efficiency.
Motor:
Motors power the propellers, converting electrical energy from the battery into
rotational motion. Brushless DC (BLDC) motors are commonly used in
autonomous aerial platforms due to their high efficiency, reliability, and torque-to-
weight ratio.
Figure no. 3.3: BLDC motors: A2212/13T 1000KV and Flycat I-Rotor 5010-360KV
Electronic Speed Controller (ESC):
ESCs regulate the speed and direction of the motors by adjusting the voltage and
current supplied to them. They play a crucial role in controlling the rotational speed
of the propellers, enabling stable flight and maneuverability.
Flight Controller:
The flight controller is the brain of the aerial platform, responsible for processing
sensor data, executing control algorithms, and stabilizing the aircraft during flight.
It integrates gyroscopes, accelerometers, and other sensors to maintain stability and
responsiveness.
Figure no. 3.5: Flight Controller: PIXHAWK 2.4 and Orange Cube +
Co-microprocessor:
Co-microprocessors such as the Raspberry Pi 4 and NVIDIA Jetson Nano handle
onboard data processing and communication, complementing the flight controller
with the computing power needed for image capture and analysis.
Battery:
Batteries provide the electrical power necessary to operate the aerial platform and
its onboard systems. Lithium polymer (LiPo) batteries are commonly used due to
their high energy density, light weight, and rechargeable nature, providing sufficient
power for extended flight durations.
The adoption of the Orange 18650 Li-ion battery, specifically the 3-cell 11.1V
10000mAh variant, presents a multitude of benefits for aerial platforms employed
in agricultural applications such as crop monitoring within the AGRIEYE system.
Renowned for their high energy density, lightweight design, and long cycle life, Li-
ion batteries offer unparalleled performance and reliability in demanding
environments. The 3-cell configuration provides a stable voltage output of 11.1V,
ensuring consistent power delivery to the aircraft's propulsion system and onboard
electronics. With a capacity of 10000mAh, this battery variant offers extended
flight durations, enabling aerial platforms to cover large agricultural areas in a
single mission without the need for frequent battery swaps. Additionally, the
Orange 18650 Li-ion battery's compact form factor and robust construction make it
well-suited for integration into aerial platforms, optimizing weight distribution and
ensuring durability during prolonged use.
Receiver:
The receiver communicates wirelessly with the ground control station or remote
pilot, receiving commands and transmitting telemetry data from the aerial platform.
It enables remote operation and monitoring of the platform's status and
performance during flight.
Each of these components plays a critical role in the functionality and autonomy of
the aerial platform within the AGRIEYE system, enabling precise and reliable data
collection for crop health monitoring and management. Through their integration
and coordination, autonomous aerial platforms empower farmers with actionable
insights to optimize agricultural practices and enhance crop productivity.
The utilization of Sony IMX 290-83 and IMX 219-83 cameras offers a
multitude of benefits in agricultural applications, particularly in crop
monitoring within the AGRIEYE system. Renowned for their high resolution,
exceptional image quality, and low-light performance, these cameras provide
detailed and accurate imagery essential for assessing crop health and
environmental conditions. The IMX 290-83 excels in low-light situations,
making it ideal for capturing clear images during dawn, dusk, or overcast
conditions, while the IMX 219-83 offers superior dynamic range and color
accuracy, ensuring precise visualization of crop characteristics and anomalies.
The Sony IMX 290-83 and IMX 219-83 cameras exhibit capabilities that are
instrumental in detecting crucial crop health indicators, including chlorophyll
content, canopy temperature, and water stress. Leveraging their high resolution
and sensitivity to different spectral bands, these cameras can capture detailed
imagery that enables precise analysis of crop health parameters. The IMX 290-
83 excels in low-light conditions, facilitating the detection of chlorophyll
content variations through spectral reflectance analysis, crucial for assessing
plant vigor and photosynthetic activity. Meanwhile, the IMX 219-83's superior
dynamic range and color accuracy enable accurate measurement of canopy
temperature differentials, providing insights into plant stress levels and water
distribution within the crop canopy. Additionally, both cameras can capture
thermal infrared imagery, allowing for the assessment of water stress by
analyzing temperature variations in plant tissues. By leveraging the capabilities
of the IMX 290-83 and IMX 219-83 cameras, farmers can effectively monitor
crop health indicators, enabling timely interventions and optimized
management practices within the AGRIEYE system.
The data processing pipeline for analyzing imagery and extracting relevant crop
health information within the AGRIEYE system involves a series of
sophisticated steps to transform raw imagery into actionable insights for
farmers. Initially, captured imagery from sensors such as cameras or
multispectral sensors is preprocessed to enhance clarity and remove noise.
Subsequently, image segmentation techniques are applied to identify regions of
interest within the imagery, such as crop rows or individual plants. Feature
extraction algorithms then analyze these regions to quantify various crop health
indicators, including chlorophyll content, canopy temperature, and vegetation
indices like NDVI (Normalized Difference Vegetation Index). Machine learning
algorithms are often employed to classify pixels or regions based on predefined
criteria, enabling the detection of anomalies such as disease outbreaks or
nutrient deficiencies. Finally, the extracted information is aggregated and
visualized in user-friendly formats, such as maps or graphs, to provide farmers
with actionable insights into crop health status and potential management
interventions. Through this data processing pipeline, the AGRIEYE system
empowers farmers with timely and accurate information to optimize agricultural
practices and maximize yields while minimizing resource inputs and
environmental impact.
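The NDVI step of such a pipeline can be sketched as follows, using toy red and near-infrared reflectance values; the actual band handling and calibration in the AGRIEYE system are not specified in this report.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Values near +1 indicate dense, healthy vegetation; values near 0
    indicate bare soil or severely stressed plants.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    denom = nir + red
    out = np.zeros_like(denom)
    # Guard against division by zero on empty (zero-reflectance) pixels.
    np.divide(nir - red, denom, out=out, where=denom != 0)
    return out

# Toy reflectance values for three pixels: healthy, stressed, bare soil.
nir_band = np.array([0.60, 0.40, 0.30])
red_band = np.array([0.08, 0.25, 0.28])
print(np.round(ndvi(nir_band, red_band), 2))
```

The same formula is applied per pixel across an entire orthomosaic, producing the field-scale NDVI maps described above.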
Machine learning algorithms play a pivotal role in the AGRIEYE system for
crop monitoring by enabling sophisticated analysis and interpretation of
agricultural data. Anomaly detection algorithms leverage pattern recognition
techniques to identify deviations from normal crop health conditions, such as
disease outbreaks or pest infestations, allowing for timely intervention and
mitigation. Feature extraction algorithms extract relevant information from raw
data, such as spectral signatures or texture patterns in imagery, enabling the
quantification of crop health indicators like chlorophyll content or water stress
levels. Moreover, predictive modeling algorithms utilize historical data and
environmental variables to forecast future trends and potential outcomes,
assisting farmers in decision-making processes related to irrigation scheduling,
pest management, and crop yield optimization. Through the application of
machine learning algorithms, the AGRIEYE system enhances the efficiency,
accuracy, and effectiveness of crop monitoring practices, ultimately
empowering farmers to make informed decisions and achieve sustainable
agricultural outcomes.
The user interface of the AGRIEYE system is designed to provide farmers with
intuitive access to the wealth of data generated by the platform, facilitating
informed decision-making and optimized management strategies. Through a
user-friendly dashboard, farmers can visualize key crop health indicators,
environmental parameters, and predictive analytics in a clear and
comprehensible manner. Interactive tools and customizable dashboards enable
farmers to tailor the display of information according to their specific
preferences and priorities. Additionally, the interface incorporates features such
as real-time alerts, trend analysis, and scenario modeling, empowering farmers
to identify emerging issues, assess potential risks, and explore alternative
management strategies. By providing farmers with easy-to-understand insights
and actionable recommendations, the user interface of the AGRIEYE system
enables them to optimize agricultural practices, maximize yields, and promote
sustainability effectively.
Data collected from autonomous aerial platforms within the AGRIEYE system
undergo a comprehensive processing and analysis pipeline before being integrated
into decision support tools. Initially, raw imagery captured by specialized sensors is
preprocessed to enhance clarity and remove noise. Subsequently, advanced
algorithms, often based on machine learning techniques, analyze the imagery to
extract relevant crop health indicators, such as chlorophyll content, canopy
temperature, and vegetation indices. These indicators are then aggregated and
interpreted within decision support tools, where they are combined with agronomic
models, historical data, and environmental parameters. Through this integration,
decision support tools generate personalized recommendations and actionable
insights for farmers.
4.1 Workflow and Operations
The workflow and operation of the AGRIEYE system encompass several key steps, from
mission planning to data analysis and decision-making. The steps are as follows:
Crop Assessment:
o Farmers or operators assess the current status of their crops and identify areas of
concern or potential issues.
o Based on this assessment, they determine the need for aerial monitoring and
prioritize areas for data collection.
Mission Planning:
o Using mission planning software such as Mission Planner, farmers define the
parameters of the aerial survey, including the area to be covered, altitude, and
flight path.
o They set waypoints to ensure comprehensive coverage of the target area and
specify any specific objectives or areas of interest.
Autonomous Flight:
o Autonomous aerial platforms equipped with sensors are deployed to execute the
predefined mission.
o These platforms fly autonomously along the designated flight path, capturing
high-resolution imagery and sensor data of the agricultural fields below.
Data Collection:
o The sensors onboard the aerial platforms capture several types of data, including
RGB imagery, multispectral imagery, and environmental parameters such as
temperature and humidity.
o Data collected during the flight are transmitted in real-time or stored onboard for
later analysis, depending on the system configuration.
Data Processing:
o Raw data collected from the aerial platforms are preprocessed to enhance quality
and remove noise, ensuring accurate analysis.
Decision Support:
o Analyzed data are integrated into decision support tools, which leverage
agronomic models, historical data, and environmental parameters to generate
actionable insights and recommendations.
o Farmers interpret the analyzed data to assess crop health indicators such as
chlorophyll content, canopy temperature, and vegetation indices.
o Based on these insights, farmers make informed decisions regarding crop
management practices, input application rates, irrigation scheduling, and pest
control measures.
Implementation and Monitoring:
o Throughout the growing season, farmers continuously monitor crop health and
performance, using aerial imagery and sensor data collected by AGRIEYE to
track progress and identify emerging issues.
o Feedback from field observations and monitoring activities is fed back into the
system, informing future missions and refinements to decision support algorithms.
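The mission-planning step above can be sketched as a simple back-and-forth ("lawnmower") waypoint generator. The field dimensions and line spacing below are hypothetical, and real planning would be done in tools like Mission Planner, which also handle GPS conversion, altitude, and camera triggering.

```python
def survey_waypoints(width_m: float, height_m: float, spacing_m: float):
    """Generate a lawnmower survey path over a rectangular field.

    Returns a list of (x, y) corner waypoints in local metres; a real
    mission would convert these to GPS coordinates and add altitude.
    """
    waypoints = []
    x = 0.0
    leg = 0
    while x <= width_m:
        if leg % 2 == 0:           # fly "up" the field on even legs
            waypoints += [(x, 0.0), (x, height_m)]
        else:                      # fly "down" the field on odd legs
            waypoints += [(x, height_m), (x, 0.0)]
        x += spacing_m
        leg += 1
    return waypoints

# Example: a 100 m x 50 m field with 25 m between survey lines.
wps = survey_waypoints(width_m=100.0, height_m=50.0, spacing_m=25.0)
print(len(wps), "waypoints:", wps)
```

The line spacing would in practice be derived from camera field of view, flight altitude, and the desired image overlap.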
The diagram outlines the general process of using a drone for crop disease detection. It
begins with image acquisition, where the drone captures images of the field. Image
preprocessing then prepares the images for analysis (e.g., cropping, resizing, noise
reduction). A decision is made whether to train a disease classification model. If training is
needed, experts perform disease identification and validation to create labeled data.
Feature extraction then analyzes image characteristics like texture, color, and shape that
are indicative of specific diseases. This information is used to either train a disease
classification model or, if a model is already trained, the model directly classifies the
images into healthy or diseased categories.
Figure no. 4.2: Wiring Diagram of the Crop disease detection UAVs (Source:
Internet)
The wiring diagram of the crop disease detection drone encompasses several
interconnected components designed to facilitate efficient data collection and analysis for
detecting crop diseases. At its core, power distribution is established through a central
power distribution board, providing energy to various components. Brushless DC motors,
coupled with electronic speed controllers, are wired to the flight controller, ensuring
precise control over the drone's movements. Cameras equipped with advanced sensors,
such as the Sony IMX 290-83 or IMX 219-83, are strategically positioned on the drone's
frame and connected to onboard computers, like the Raspberry Pi or NVIDIA Jetson Nano,
for real-time data processing. Telemetry receivers establish communication between the
drone and the ground control station, enabling remote monitoring and control. Mission
planning software, such as Mission Planner, is utilized to set flight paths and waypoints,
ensuring systematic coverage of the target fields.
4.3 Working Principle of AgriEYE
The working principle of the AGRIEYE project involves the integration of autonomous
aerial platforms equipped with advanced sensors, data processing algorithms, and decision
support systems to monitor crop health and optimize agricultural practices.
(Workflow cycle: Data Collection → Data Processing → Decision Support →
Implementation → Monitoring and Feedback)
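This cycle can be sketched as a simple sense-process-decide-act loop; the function below is an illustrative skeleton only, and the stage names are assumptions rather than AGRIEYE APIs:

```python
def monitoring_cycle(capture, process, decide, act):
    """Run one pass of the data collection -> processing -> decision
    support -> implementation loop; the result feeds back into monitoring."""
    data = capture()          # data collection (e.g. aerial imagery)
    insights = process(data)  # data processing and analysis
    plan = decide(insights)   # decision support: actionable recommendations
    return act(plan)          # implementation, observed by the next cycle
```

In practice each stage would be a separate subsystem (drone, analytics pipeline, advisory tool), but the control flow is the same.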
3. Decision Support: The analyzed data are integrated into decision support tools,
which leverage agronomic models, historical data, and environmental parameters to
generate actionable insights and recommendations for farmers. These
recommendations may include optimized management strategies for irrigation
scheduling, nutrient management, and pest control.
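As a concrete illustration, a minimal rule-based recommender might map sensed indicators to actions. The thresholds below are illustrative assumptions, not calibrated agronomic values:

```python
def recommend(ndvi, soil_moisture):
    """Toy decision-support rule mapping crop indicators to actions.
    The cut-offs (0.4 NDVI, 0.25 moisture) are illustrative only."""
    actions = []
    if ndvi < 0.4:
        actions.append("inspect for disease or nutrient stress")
    if soil_moisture < 0.25:
        actions.append("schedule irrigation")
    return actions or ["no action needed"]
```

A production system would instead combine agronomic models, historical data, and environmental parameters, as described above.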
The quadcopter and hexacopter models used in this project feature an S500 multirotor
airframe, providing stability and durability for aerial operations. Paired with 14-inch and
15.5-inch propellers for the quadcopter and hexacopter configurations respectively, the
airframe ensures efficient thrust generation and maneuverability. The propulsion system
comprises BLDC motors, including the A2212/13T 1000KV and Flycat I-Rotor 5010-360KV,
combined with SimonK 30A electronic speed controllers for precise motor control. The flight controller,
whether PIXHAWK 2.4 or Orange Cube +, serves as the brain of the quadcopter,
orchestrating flight operations and navigation. Power is supplied by an Orange 18650 Li-
ion battery, providing ample energy for extended flight missions. Navigation is facilitated
by a GPS navigator, while onboard data processing and communication are handled by co-
microprocessors like Raspberry Pi 4 and NVIDIA Jetson Nano. Imaging capabilities are
enabled by Sony IMX 290-83 and IMX 219-83 cameras, capturing high-resolution
imagery for analysis. Finally, communication with ground control is ensured through
receivers such as the FlySky FS-T6 and the Cubepilot Herelink Controller Unit and Air Unit,
allowing for real-time telemetry and remote operation of the quadcopter. This comprehensive
setup enables reliable autonomous flight, imaging, and data transmission for crop health
monitoring missions.
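A rough endurance estimate for such a battery-powered platform can be computed from pack capacity and average current draw. The 80% usable-capacity figure below is an assumption, and real flight times vary with payload, wind, and battery health:

```python
def flight_time_minutes(capacity_mah, avg_current_a, usable_fraction=0.8):
    """Rough hover endurance: usable capacity (Ah) / average current (A) * 60."""
    return (capacity_mah / 1000.0) * usable_fraction / avg_current_a * 60.0

# e.g. a 5200 mAh pack at a 20 A average draw
print(flight_time_minutes(5200, 20))
```

Estimates like this are useful for mission planning, since waypoint coverage must fit within the achievable flight time.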
4.5 Code:
import pandas as pd
import numpy as np
import os
import splitfolders
from keras.applications.mobilenet import MobileNet, preprocess_input
from keras.preprocessing.image import ImageDataGenerator
from keras.layers import Dense, GlobalAveragePooling2D
from keras.models import Model

# Pretrained backbone without its classification head; MobileNet is assumed
# here because the original listing omits the base_model definition.
base_model = MobileNet(weights='imagenet', include_top=False)

x = base_model.output
x = GlobalAveragePooling2D()(x)
x = Dense(1024, activation='relu')(x)  # we add dense layers so that the model can learn
# more complex functions and classify for better results.
preds = Dense(3, activation='softmax')(x)  # early_blight, healthy, late_blight
model = Model(inputs=base_model.input, outputs=preds)

# Freeze the pretrained backbone and train only the new head
for layer in base_model.layers:
    layer.trainable = False
for layer in model.layers[-3:]:
    layer.trainable = True

# To only split into training and validation sets, set a tuple as `ratio`, i.e. `(.8, .2)`.
splitfolders.ratio('../Downloads/Project/JetbotV2.A-English/crop%20detection/dataset',
                   output="output", seed=1337, ratio=(.8, .2), group_prefix=None)  # default values

train_datagen = ImageDataGenerator(preprocessing_function=preprocess_input)  # included in our dependencies
train_generator = train_datagen.flow_from_directory('output/train',
                                                    target_size=(224, 224),
                                                    color_mode='rgb',
                                                    batch_size=32,
                                                    class_mode='categorical',
                                                    shuffle=True)
val_generator = train_datagen.flow_from_directory('output/val',
                                                  target_size=(224, 224),
                                                  color_mode='rgb',
                                                  batch_size=32,
                                                  class_mode='categorical',
                                                  shuffle=False)

model.compile(optimizer='Adam', loss='categorical_crossentropy', metrics=['accuracy'])  # Adam optimizer
step_size_train = train_generator.n // train_generator.batch_size
import matplotlib.pyplot as plt

history = model.fit(train_generator,
                    validation_data=val_generator,
                    steps_per_epoch=step_size_train // 50,
                    epochs=5)

# Plot the training curves
plt.plot(history.history['accuracy'], label='train accuracy')
plt.plot(history.history['val_accuracy'], label='val accuracy')
plt.grid(True)
plt.legend()
plt.show()

# Summarize and evaluate the model, then save it
model.summary()
accuracy_score = model.evaluate(val_generator)
print(f"Accuracy: {accuracy_score[1]*100}")
model.save("network.h5")
import numpy as np
import cv2
import os
import pickle
import requests
import matplotlib.pyplot as plt
%matplotlib inline
from skimage.color import rgb2gray

Img_size = 256
clas = ['early_blight', 'healthy', 'late_blight']

def segment(pth):
    # Accept either a file path or an already-loaded image array
    if type(pth) == str:
        new_img = cv2.cvtColor(cv2.imread(pth), cv2.COLOR_BGR2RGB)
        gray = rgb2gray(new_img)
    else:
        gray = rgb2gray(pth)
    # Threshold the grayscale image; the mean is used as the cut-off
    # (the original listing omitted the threshold condition)
    gray_r = gray.reshape(gray.shape[0] * gray.shape[1])
    for i in range(gray_r.shape[0]):
        if gray_r[i] > gray_r.mean():
            gray_r[i] = 255
        else:
            gray_r[i] = 0
    gray = gray_r.reshape(gray.shape[0], gray.shape[1])
    plt.imshow(gray, cmap='gray')

    # Count the bright pixels and compute the dark-pixel ratio z
    x1 = 0
    gr = gray.reshape(-1)
    for i in range(gray.shape[0] * gray.shape[1]):
        if gr[i] != 0:
            x1 += 1
    y1 = gray.shape[0] * gray.shape[1]
    z = (y1 - x1) / y1
    if z < 0.3:
        print('Leaf appears healthy')
    else:
        print('Possible disease detected')

def cal_per(url):
    # Load an image from a local path or a URL, then segment it
    if type(url) == str:
        if os.path.exists(url):
            img = cv2.cvtColor(cv2.imread(url), cv2.COLOR_BGR2RGB)
            segment(img)
        else:
            response = requests.get(url)
            if response.status_code == 200:
                arr = np.frombuffer(response.content, dtype=np.uint8)
                img = cv2.cvtColor(cv2.imdecode(arr, cv2.IMREAD_COLOR), cv2.COLOR_BGR2RGB)
                segment(img)
            else:
                print('Could not fetch image')
    else:
        pass

os.getcwd()
cal_per('C:\\Users\\CHOTA DON\\Downloads\\Project\\JetbotV2.A-English\\crop%20detection\\dataset\\healthy_leaves\\ca69740a-ce43-11ee-abe0-16f63a1aa8c9.JPG')
Performance metrics play a critical role in evaluating the effectiveness and accuracy of the
AGRIEYE system's crop health assessments.
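Classification quality is typically summarized with precision, recall, and F1 computed from confusion-matrix counts; the helper below is a generic sketch, not tied to the project's actual evaluation code:

```python
def precision_recall_f1(tp, fp, fn):
    """Compute precision, recall, and F1 from confusion-matrix counts
    (true positives, false positives, false negatives)."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1
```

Comparing these scores against ground-truth labels from manual field observations is how the validation procedures described below quantify accuracy.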
Validation procedures, field trials, and case studies are essential components of assessing
the performance of the AGRIEYE system in real-world agricultural settings. Field trials
involve deploying the system in agricultural fields under varying environmental conditions
and management practices to evaluate its performance in capturing accurate data and
providing meaningful insights. These trials often include comparison with ground truth
data collected through manual measurements or field observations to validate the accuracy
of the system's predictions.
Additionally, case studies are conducted to demonstrate the practical application and
effectiveness of the AGRIEYE system in addressing specific agricultural challenges.
These case studies typically involve collaboration with farmers, agronomists, and
agricultural researchers to implement the system in real-world scenarios and evaluate its
impact on crop management practices, yield outcomes, and resource efficiency.
The AGRIEYE system exhibits scalability and adaptability to diverse agricultural contexts
and cropping systems, making it a versatile solution for precision agriculture. One key
aspect of scalability is the modular design of the system, which allows for customization
and integration of various components to suit specific agricultural needs. Additionally, the
system's compatibility with several types of autonomous aerial platforms, sensors, and data
analysis techniques enhances its adaptability across a wide range of agricultural settings.
Moreover, the AGRIEYE system can accommodate varying field sizes, crop types, and
management practices, thanks to its ability to collect and analyze high-resolution imagery
and sensor data at scale. Whether deployed in small-scale vegetable farms or large-scale
agribusiness operations, the system can provide valuable insights into crop health,
optimize management strategies, and improve overall productivity.
Despite its scalability, scaling up the deployment of autonomous crop health monitoring
technologies such as the AGRIEYE system presents several challenges. One challenge is
the high initial investment required for acquiring equipment, implementing infrastructure,
and training personnel. Additionally, ensuring reliable connectivity and data transmission
in remote or rural areas may pose logistical challenges.
Furthermore, capacity-building initiatives and training programs can empower farmers and
agricultural professionals to effectively utilize autonomous technologies and interpret data
generated by the AGRIEYE system. By overcoming these challenges and implementing
scalable solutions, the deployment of autonomous crop health monitoring technologies can
be expanded, leading to improved agricultural productivity, sustainability, and resilience in
diverse farming communities.
o Validation and Field Trials: Ongoing validation studies and field trials
assess the performance of the AGRIEYE system in diverse agricultural
settings, providing valuable feedback for further improvements and
refinements.
By continuously investing in research and innovation efforts, the AGRIEYE system can
evolve to meet the changing needs and challenges of modern agriculture, ultimately
contributing to improved productivity, sustainability, and resilience in global food
production systems.
CONCLUSION
One of the most significant achievements of AGRIEYE lies in its ability to provide
farmers with actionable insights derived from real-time monitoring and analysis of crop
health indicators. By leveraging advanced sensor technology and data analytics
algorithms, AGRIEYE empowers farmers to make informed decisions regarding
irrigation scheduling, nutrient management, pest control, and other critical aspects of
crop management. This has not only led to improvements in crop yields and quality but
has also minimized resource usage and environmental impact.
Furthermore, the scalability and adaptability of the AGRIEYE system have been
instrumental in its widespread adoption across diverse agricultural contexts and
cropping systems. Whether deployed in smallholder farms or large-scale agribusiness
operations, the system adapts readily to the scale and needs of the operation.
As we reflect on the journey of the AGRIEYE project, it becomes evident that ongoing
research and innovation efforts are crucial for its continued success and impact. Future
developments may focus on enhancing sensor technology, refining data analytics
algorithms, and improving decision support capabilities to meet the evolving needs of
farmers and agricultural stakeholders. Moreover, collaborations with farmers,
agronomists, and researchers will be essential for validating the system's performance in
real-world agricultural settings and incorporating user feedback to drive continuous
improvement.
The AGRIEYE project represents a paradigm shift in crop health monitoring practices,
heralding a new era of data-driven precision agriculture. By harnessing the power of
technology, AGRIEYE has empowered farmers to optimize agricultural practices,
maximize yields, and promote sustainability, thus shaping the future of agriculture for
generations to come. As we embark on the next phase of this journey, let us continue to
innovate, collaborate, and strive towards a more resilient and productive agricultural
ecosystem.
The AGRIEYE system comprises autonomous aerial platforms equipped with advanced
sensors, data analytics tools, and decision support systems tailored for precision
agriculture. Key components include autonomous aerial platforms with sensors such as
RGB cameras and multispectral imagers, data processing algorithms implemented in
Python-based environments like Jupyter Notebook, and decision support tools for
generating actionable insights and recommendations. The system enables farmers to
monitor crop health indicators, optimize management practices, and maximize yields
through data-driven decision-making.
In conclusion, the AGRIEYE system has the potential to revolutionize precision
agriculture by providing farmers with unprecedented insights into crop health and
environmental conditions. By leveraging advanced sensor technology and data
analytics, the system enables precise monitoring of crop health indicators such as
chlorophyll content, canopy temperature, and vegetation indices. This empowers
farmers to make informed decisions regarding irrigation scheduling, nutrient
management, and pest control, leading to improved resource efficiency, enhanced
productivity, and reduced environmental impact. The AGRIEYE system plays a pivotal
role in shaping the future of crop health monitoring practices by promoting data-driven
precision agriculture solutions that optimize agricultural practices and ensure global
food security.