Thesis Malam

The document outlines the development of an unmanned ground robot, 'Agri-bot', aimed at autonomous navigation in agricultural fields. It addresses the challenges of navigating unstructured terrains and proposes a solution involving robust hardware, sensor integration, and advanced navigation algorithms, including self-supervised learning for traversability prediction. Future work includes enhancing the robot's adaptability to dynamic environments and improving its navigation capabilities in uneven agricultural terrains.

Development of an Unmanned Ground Robot "Agri-bot" for
Autonomous Navigation in an
Agricultural Field

Name: Syed Muhammad Alam


Supervisor: Dr. Hassan Jaleel
Background and
Motivation
• Robotics can increase efficiency in agricultural
production, such as planting, harvesting, and
monitoring of crops
• UGVs are designed to traverse various types of terrain.
• Sensors and cameras assist with navigation, object
detection and manipulation, mapping, and other
tasks.
Problem Statement
• The deployment of field robots in unstructured field environments presents a
formidable challenge.
• These robots are tasked with navigating through diverse terrains, including
obstacle-ridden landscapes and expansive fields, often lacking accurate maps.
• Traditional methods face challenges in these dynamic environments:
• Sensors’ limitations in distinguishing outdoor obstacles
• Navigating unstructured terrains
• Finding traversable paths
• The challenge is to develop a comprehensive solution that combines:
• Robust robotic structure
• Learning-based traction
• Adaptable autonomous navigation
Proposed Objective
• Objective: Development of a ground robot equipped with sensors
to autonomously navigate through crop rows
• Requirements:
• Development of Robot platform
• Interfacing sensors/camera for robot perception
• Create a navigation controller to follow the required path
• Navigation Algorithm to guide the robot across uneven, unstructured
terrains
1. Hardware development
• The hardware design includes the following:
• Brushless DC Hub motors
• Motor drivers
• Buck converter
• ESP8266 Wi-Fi module for remote
access
• Jetson Nano computational module
• RealSense D455 depth camera
• Razor IMU module
Stage-I Overview
• Simulation model
Issues in deployment
• In unstructured terrains like grassy fields, muddy patches, or uneven
ground, the robot’s movement became noticeably hindered.
• Limitations in its power output
• Limitations in its physical design
• Given the importance of agricultural settings in our study,
encountering areas with such terrains is almost inevitable.
• This realization emphasized the need for an enhanced traversability
algorithm suited for rough and unstructured terrains
Robot Perception for Traversability
• Robots for outdoor applications need to account for a terrain’s
geometric properties such as:
• Slope, elevation, texture, bumpiness, softness/deformability, etc.
• Classical Navigation: GNSS, Sensor fusion, SLAMs
• Learned Navigation
• Supervised Learning
• Self-supervised Learning
Traversability using supervised
methods
• Pixel-wise semantic segmentation classifies a
terrain into multiple predefined classes such as
traversable, non-traversable, forbidden, etc.
• Utilize large hand-labeled datasets of images to
train classifiers.
• Disadvantages:
• Manually annotating datasets is time- and labor-
intensive
• May not be applicable for robots of different sizes,
inertias, and dynamics
• Do not consider the robot’s dynamics, velocities, or other
constraints

- T. Guan et al., "GA-Nav: Efficient Terrain Segmentation for Robot Navigation in Unstructured Outdoor Environments," IEEE Robotics and Automation Letters, 2022.
- H. Lee et al., "Learning Terrain-Aware Kinodynamic Model for Autonomous Off-Road Rally Driving With Model Predictive Path Integral Control," IEEE Robotics and Automation Letters.
- L. Wellhausen et al., "Where Should I Walk? Predicting Terrain Properties From Images Via Self-Supervised Learning," IEEE Robotics and Automation Letters, vol. 4, no. 2, pp. 1509-1516, April 2019, doi: 10.1109/LRA.2019.2895390.
Traversability using Self-supervised
methods
• Overcome the need for large datasets by automating the labeling
process, collecting terrain-interaction data:
• Forces/torques, contact vibrations, acoustic data, vertical acceleration
experienced [24], and stereo depth, and associating them with visual
features
• This allows the robot’s dynamics and constraints to be part of the
learning process
• This thesis focuses on self-supervised learning to estimate
traversability coefficient values in the image space.
M. V. Gasparino et al., "WayFAST: Navigation With Predictive Traversability in the Field," in IEEE Robotics and Automation Letters,
vol. 7, no. 4, pp. 10651-10658, Oct. 2022, doi: 10.1109/LRA.2022.3193464.

A. J. Sathyamoorthy et al., "TerraPN: Unstructured Terrain Navigation using Online Self-Supervised Learning," 2022 IEEE/RSJ
International Conference on Intelligent Robots and Systems (IROS), Japan, 2022.
Robot kinematics model
• A differential drive comprises two drive wheels placed on a shared axis.
Each wheel can operate independently.
• To achieve rolling motion, the robot pivots around a specific point
along the common axis of the left and right wheels.
• The point that the robot rotates about is known as the instantaneous
center of curvature (ICC).

• L: distance between the centers of the two wheels


• Vr , Vl: right and left wheel velocities,
• R: distance from the ICC to the midpoint between the wheels
• ω: rate of rotation
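The geometry above can be sketched in a few lines of Python (function and variable names are illustrative, not from the thesis):

```python
import math

def diff_drive_icc(v_l: float, v_r: float, L: float):
    """Rotation rate and ICC radius for a differential-drive robot.

    v_l, v_r: left/right wheel velocities (m/s)
    L: distance between the centers of the two wheels (m)
    Returns (omega, R): rate of rotation (rad/s) and signed distance
    from the midpoint between the wheels to the ICC (m).
    """
    if math.isclose(v_l, v_r):
        return 0.0, math.inf  # straight-line motion: ICC at infinity
    omega = (v_r - v_l) / L
    R = (L / 2.0) * (v_r + v_l) / (v_r - v_l)
    return omega, R
```

For example, with v_l = 1, v_r = 2 and L = 0.5 m, the robot turns at 2 rad/s about a point 0.75 m to its left.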
Robot kinematics model
• Assume the robot is at some position (x, y), headed in a direction
making an angle θ with the X axis.
• By manipulating the control parameters Vl , Vr, we can get the robot
to move to different positions and orientations.
Finding Odometry for Traversability
• x˙, y ˙,θ˙ are the derivatives of the robot's position and orientation
over time, representing the rate of change of these quantities.
• v is the linear velocity of the robot.
• ω is the angular velocity of the robot.
• θ is the orientation of the robot.
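The rates above define the standard unicycle odometry model, which can be integrated with a simple Euler step (a minimal sketch; the step size and names are illustrative):

```python
import math

def integrate_odometry(x, y, theta, v, omega, dt):
    """One Euler step of the unicycle odometry model:
    x_dot = v*cos(theta), y_dot = v*sin(theta), theta_dot = omega."""
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta
```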
Robot kinematics model
• x(t) is the state vector composed of px(t), py(t) and θ(t)
• Represents position in x and y axis and heading angle.
• μ and ν are unknown parameters related to a skid coefficient caused
by the interaction between the wheels and terrain.
• v(t) and ω(t) are the commanded linear and angular velocities, which
are the control inputs.
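A sketch of the skid-scaled kinodynamic model described above (WayFAST-style): linear motion is attenuated by μ, rotation by ν. The Euler integration and function name are assumptions for illustration:

```python
import math

def skid_model_step(px, py, theta, v, omega, mu, nu, dt):
    """Euler step of the traversability-scaled kinodynamic model:
    px_dot = mu*v*cos(theta), py_dot = mu*v*sin(theta),
    theta_dot = nu*omega. mu = nu = 1 on smooth ground; both tend
    toward 0 on impassable terrain."""
    px += mu * v * math.cos(theta) * dt
    py += mu * v * math.sin(theta) * dt
    theta += nu * omega * dt
    return px, py, theta
```

With μ = 0.5, a commanded 1 m/s forward velocity only advances the robot 0.5 m per second, which is exactly the signal the estimator later exploits.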
Traversability using self-supervised
methods
• Traversability prediction in image space provides traversability
coefficient for trajectories within robot’s field-of-view.
• The traversability coefficient represents how much control effort is
needed to reach a defined location.
• The algorithm has two main parts:
a) Generating Traversability Labels
b) Self-Supervised Learned Perception for Traversability.

M. V. Gasparino et al., "WayFAST: Navigation With Predictive Traversability in the Field," in IEEE
Robotics and Automation Letters, vol. 7, no. 4, pp. 10651-10658, Oct. 2022, doi:
10.1109/LRA.2022.3193464.
Generating Traversability Labels
• The research employs a nonlinear moving horizon estimator (NMHE) to
generate traversability labels for training.
• It is a model-based estimator that provides real-time estimates of the
state variables of a dynamic system
• Dynamic System Model: How the system's state variables evolve over time.
• Measurement Data: Measurement data from sensors
• Estimation Horizon: Finite time horizon to estimate the system's state variables.
• Optimization Problem: Aims to find the state variables that best explain the
observed measurements over the estimation horizon.
• Horizon Shift: As time progresses, the MHE algorithm continually shifts the
estimation horizon forward in time.
Nonlinear moving horizon estimator
• This NMHE makes use of the affine kinodynamic model:
• μ and ν assume the role of the traversability coefficients
• The coefficients indicate the robot’s capability to traverse different
terrains
• Objective:
• Traversability coefficient values of 1 on smooth surfaces
• Traversability coefficient values of 0 on obstacles or impassable terrain
NMHE for traversability
• We solve the following finite horizon optimization formulation to obtain the
system states and unknown parameters μ and ν :

• xk is the state vector composed of px(t), py(t) and θ(t); x̃k is the initial estimated state vector
• m = [μ, ν] is the vector of parameters; m̃ is the estimated vector of parameters from the previous
iteration
• N ∈ N is the length of the horizon.
• The measurement equation model, subject to additive noise:

• Where zk = [zxk, zyk, zθk ], zxk and zyk are the pair of measured position coordinates from the
GPS, and zθk is the measured heading angle from the embedded inertial sensor.
NMHE for traversability
• The optimization seeks to minimize the difference between the actual
states and their estimated states , the parameter vectors, and the
model-predicted measurements while adhering to specified
constraints.
• The constraints: traversability coefficients μ and ν are confined within
[0, 1].
• In practice, the NMHE operates in conjunction with an Extended
Kalman Filter (EKF) for precise state estimation.
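A toy version of this horizon optimization can be sketched with SciPy. This is a deliberate simplification of the NMHE described above, not the thesis formulation: here the states are not free decision variables; instead the skid model is rolled out from the first measurement and only m = [μ, ν] is optimized, with the [0, 1] bounds and a regularizer toward the previous estimate m̃:

```python
import numpy as np
from scipy.optimize import minimize

def estimate_traversability(z, v_cmd, w_cmd, dt, m_prev=(1.0, 1.0), w_reg=1e-3):
    """Simplified moving-horizon estimate of m = [mu, nu].

    z: (N+1, 3) array of measurements [zx, zy, ztheta] over the horizon
       (GPS position and IMU heading)
    v_cmd, w_cmd: length-N arrays of commanded linear/angular velocities
    m_prev: estimate from the previous iteration (regularization target)
    """
    def rollout(m):
        mu, nu = m
        x = z[0].copy()
        traj = [x.copy()]
        for v, w in zip(v_cmd, w_cmd):
            x = x + dt * np.array([mu * v * np.cos(x[2]),
                                   mu * v * np.sin(x[2]),
                                   nu * w])
            traj.append(x.copy())
        return np.array(traj)

    def cost(m):
        resid = rollout(m) - z  # model-predicted vs. measured states
        return np.sum(resid ** 2) + w_reg * np.sum((m - np.asarray(m_prev)) ** 2)

    # traversability coefficients confined within [0, 1]
    res = minimize(cost, x0=np.asarray(m_prev),
                   bounds=[(0.0, 1.0), (0.0, 1.0)])
    return res.x
```

Driving straight on slippery ground, the measured displacement falls short of the commanded one, and the optimizer recovers μ < 1.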
NMHE Block Diagram

[Block diagram: collected RGB & depth images and odometry data feed the NMHE, which outputs the traversability coefficient]
Experimentation settings for
Traversability labels
• The labels for the neural network are generated using RGB and depth
images, traversability coefficients, and state vectors
• Manually navigating the robot through diverse outdoor settings,
including unstructured grassy and semi-urban environments
• 20 minutes of driving, equivalent to a dataset comprising 15,000 images.
• These included:
• Grassy area
• Muddy area
• Collision with trees
• Collision with poles
• Collision with obstacles
• Plain area
Examples

[Example with odometry data: grassy area shows low traversability and high cost; less grassy area shows high traversability and low cost]
Projection
• We projected the traversed path onto the image space using precise
point-to-pixel transformation matrices.
• These matrices facilitated the creation of label images for each
corresponding RGB frame.
• The relationship between the 3D world point, the camera's intrinsic
parameters (focal length, principal point), and the 2D image point:

• 𝑋 = (𝑥 − 𝑐𝑥) ∗ 𝑍 / 𝑓𝑥
• 𝑌 = (𝑦 − 𝑐𝑦) ∗ 𝑍 / 𝑓𝑦
• 𝑍 = 𝑍
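The deprojection above and its inverse (the point-to-pixel direction used to paint the traversed path into the image) can be written directly from the pinhole model; function names here are illustrative:

```python
def deproject(u, v, Z, fx, fy, cx, cy):
    """Back-project pixel (u, v) at depth Z into a 3D camera-frame
    point using the pinhole intrinsics, per the slide:
    X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy,  Z = Z."""
    X = (u - cx) * Z / fx
    Y = (v - cy) * Z / fy
    return X, Y, Z

def project(X, Y, Z, fx, fy, cx, cy):
    """Inverse mapping: project a 3D camera-frame point to pixel
    coordinates, as needed to build label images from odometry."""
    u = fx * X / Z + cx
    v = fy * Y / Z + cy
    return u, v
```

A round trip through both functions returns the original pixel, which is a quick sanity check for the transformation matrices.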
Sparse Labels
• A binary mask image serves as our labeling mechanism.
• This mask image adopts a value of one in regions where labels are
present, signifying instances where the robot successfully navigated.
• Regions outside these successful traversals are designated with a zero
value, forming a clear distinction in the mask image.
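The sparse label/mask pair can be sketched as follows (a minimal illustration; the function name and per-pixel labeling are assumptions, e.g. a real pipeline would rasterize the robot's footprint rather than single pixels):

```python
import numpy as np

def make_label_images(h, w, pixels, coeffs):
    """Build a sparse traversability label image and its binary mask.

    pixels: list of (row, col) pixels the robot traversed
    coeffs: matching traversability coefficients in [0, 1]
    Out-of-frame pixels are skipped.
    """
    label = np.zeros((h, w), dtype=np.float32)  # traversability values
    mask = np.zeros((h, w), dtype=np.uint8)     # 1 where a label exists
    for (r, c), t in zip(pixels, coeffs):
        if 0 <= r < h and 0 <= c < w:
            label[r, c] = t
            mask[r, c] = 1
    return label, mask
```

During training, the mask restricts the loss to labeled pixels, so the unlabeled zero regions do not penalize the network.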
Block diagram for label images

[Block diagram: odometry gives X, Y coordinates → point-to-pixel transformation → mask image; the nonlinear moving horizon estimator gives the traversability coefficient value; together they produce the traversability label image]
Example

[High traversability value on clear terrain; low traversability upon collision]
Examples
TravNet
• ResNet-18 backbone pretrained on the ImageNet dataset
• Architecture is modified by truncating the network prior to the
average-pooling layer.
• A bottleneck block with two convolutional layers is introduced,
followed by a decoder incorporating four blocks of convolutional
layers.
• For depth information, a parallel branch encodes this data and
combines it with the RGB-encoded block after every convolutional layer

M. V. Gasparino et al., "WayFAST: Navigation With Predictive Traversability in the Field," in IEEE
Robotics and Automation Letters, vol. 7, no. 4, pp. 10651-10658, Oct. 2022, doi:
10.1109/LRA.2022.3193464.
Architecture

M. V. Gasparino et al., ”WayFAST: Navigation With Predictive Traversability in the Field,” IEEE Robot. Autom. Lett.,
vol. 7, no. 4, pp. 10651–10658, Oct. 2022, doi: 10.1109/LRA.2022.3193464.
Prediction results
• The results showcased the model’s ability to extract semantic insights that are
otherwise unattainable
• The model successfully differentiated between grassy and less grassy areas
• Similarly, the model was also able to predict obstacles such as trees
Realtime Results
Robot navigation
• The second part of the research involves using the traversability
prediction image as a cost map for the robot navigation
• Navigation is based on Dynamic Window Approach (DWA)
• A range of velocity combinations (v, ω) is generated
• For each combination, the algorithm predicts the robot’s trajectory
• The algorithm selects the trajectory with the lowest cost
• To adapt the DWA algorithm for outdoor terrains, the parameter of
surface costs is introduced to capture the terrain’s navigability
characteristics
Robot navigation
• Distance Cost: the minimum distance to the goal position.
• Traversability Cost: captures the navigability characteristics of the
outdoor terrain.
• Control Cost: represents the control effort or deviation from a desired
motion command, i.e., the difference between the current and desired
velocities.
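The DWA selection with the added traversability term can be sketched as follows. The candidate velocity grid, rollout length, and cost weights are illustrative assumptions, and `trav_cost` stands in for a lookup into the predicted traversability image:

```python
import math

def dwa_select(x, y, theta, goal, trav_cost, v_cur, w_cur,
               dt=0.2, steps=10, alpha=1.0, beta=2.0, gamma=0.1):
    """Pick the (v, w) command whose simulated trajectory minimizes the
    weighted sum of distance, traversability, and control costs.

    trav_cost(px, py): surface cost at a predicted position, taken from
    the traversability prediction used as a cost map.
    """
    best, best_cost = None, float("inf")
    for v in [0.2, 0.4, 0.6, 0.8, 1.0]:           # candidate linear vels
        for w in [-0.6, -0.3, 0.0, 0.3, 0.6]:     # candidate angular vels
            px, py, pth = x, y, theta
            t_cost = 0.0
            for _ in range(steps):                # forward-simulate
                px += v * math.cos(pth) * dt
                py += v * math.sin(pth) * dt
                pth += w * dt
                t_cost += trav_cost(px, py)
            d_cost = math.hypot(goal[0] - px, goal[1] - py)  # distance
            c_cost = abs(v - v_cur) + abs(w - w_cur)         # control
            cost = alpha * d_cost + beta * t_cost + gamma * c_cost
            if cost < best_cost:
                best, best_cost = (v, w), cost
    return best
```

With a zero surface cost everywhere and a goal straight ahead, the selector simply drives forward; raising `trav_cost` over a patch steers the rollouts around it.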
Robot navigation
Results
Challenges
• Structure: The current structural design of the robot necessitates
some adjustments.
• Surge Current Requirement: On uneven surfaces, such as grassy areas,
the robot exhibited a significant surge current requirement during its
initial movement.
• Localization Accuracy: The accuracy of the robot’s localization system,
particularly in GPS-denied environments
• Microcomputer Processing Power: The Jetson Nano has limited
processing power compared to more high-end GPUs and CPUs.
Future work
• Adaptive Terrain Navigation:
• Equip the robot with the capability to navigate uneven and muddy
agricultural terrains while simultaneously navigating through crop rows
• Dynamic Environment Perception:
• Advancing real-time perception and decision-making algorithms allows for
identification of and navigation around dynamic obstacles in agricultural
environments.
References
• G. Kahn, P. Abbeel, and S. Levine, "BADGR: An Autonomous Self-Supervised Learning-Based Navigation System," IEEE Robot. Autom. Lett., vol. 6, no. 2, pp. 1312–1319, Apr. 2021, doi: 10.1109/LRA.2021.3057023.
• T. Guan, D. Kothandaraman, R. Chandra, A. J. Sathyamoorthy, K. Weerakoon, and D. Manocha, "GA-Nav: Efficient Terrain Segmentation for Robot Navigation in Unstructured Outdoor Environments," IEEE Robot. Autom. Lett., vol. 7, no. 3, pp. 8138–8145, Jul. 2022, doi: 10.1109/LRA.2022.3187278.
• T. Guan, Z. He, D. Manocha, and L. Zhang, "TTM: Terrain Traversability Mapping for Autonomous Excavator Navigation in Unstructured Environments," Accessed: Aug. 15, 2023. [Online]. Available: https://gamma.umd.edu/ttm.
• F. Schilling, X. Chen, J. Folkesson, and P. Jensfelt, "Geometric and visual terrain classification for autonomous mobile navigation," IEEE Int. Conf. Intell. Robot. Syst., pp. 2678–2684, Dec. 2017, doi: 10.1109/IROS.2017.8206092.
• M. V. Gasparino et al., "WayFAST: Navigation With Predictive Traversability in the Field," IEEE Robot. Autom. Lett., vol. 7, no. 4, pp. 10651–10658, Oct. 2022, doi: 10.1109/LRA.2022.3193464.
• D. Fox, W. Burgard, and S. Thrun, "The dynamic window approach to collision avoidance," IEEE Robot. Autom. Mag., vol. 4, no. 1, pp. 23–33, Mar. 1997, doi: 10.1109/100.580977.
Thank you for your time
