Thesis Malam
Autonomous Navigation in an
Agricultural Field
A. J. Sathyamoorthy et al., "TerraPN: Unstructured Terrain Navigation using Online Self-Supervised Learning," 2022 IEEE/RSJ
International Conference on Intelligent Robots and Systems (IROS), Japan, 2022.
Robot kinematics model
• A differential drive comprises two drive wheels mounted on a shared axis; each wheel can be driven independently.
• To achieve rolling motion, the robot pivots around a specific point along the common axis of the left and right wheels.
• The point that the robot rotates about is known as the Instantaneous Center of Curvature (ICC); see the relations below.
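As a reminder (not spelled out on the slide), the textbook differential-drive relations give the ICC radius and the robot's angular velocity from the wheel velocities; the wheel separation l and wheel velocities v_l, v_r are the assumed symbol names here:

```latex
% Standard differential-drive kinematics (wheel separation l and wheel
% velocities v_l, v_r are assumed symbol names, not taken from the slides):
\begin{aligned}
  \omega &= \frac{v_r - v_l}{l}, &
  R &= \frac{l}{2}\,\frac{v_r + v_l}{v_r - v_l}, &
  \mathrm{ICC} &= \bigl(x - R\sin\theta,\; y + R\cos\theta\bigr)
\end{aligned}
```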
M. V. Gasparino et al., "WayFAST: Navigation With Predictive Traversability in the Field," in IEEE
Robotics and Automation Letters, vol. 7, no. 4, pp. 10651-10658, Oct. 2022, doi:
10.1109/LRA.2022.3193464.
Generating Traversability Labels
• The research employs a nonlinear moving horizon estimator (NMHE) to
generate traversability labels for training.
• It is a model-based estimator that provides real-time estimates of the
state variables of a dynamic system
• Dynamic System Model: How the system's state variables evolve over time.
• Measurement Data: sensor observations collected over the horizon.
• Estimation Horizon: Finite time horizon to estimate the system's state variables.
• Optimization Problem: Aims to find the state variables that best explain the
observed measurements over the estimation horizon.
• Horizon Shift: as time progresses, the MHE algorithm continually shifts the
estimation horizon forward in time (a toy loop illustrating this follows the list).
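To make the horizon-shift idea concrete, here is a toy, self-contained moving-horizon loop on a scalar system; the window length, noise level, and closed-form least-squares update are illustrative assumptions, not the paper's NMHE.

```python
import numpy as np

# Toy moving-horizon estimator for a scalar state with trivial dynamics
# x_{k+1} = x_k and measurements z_k = x_k + noise. Over each window the
# estimator solves a least-squares problem combining an arrival-cost prior
# with the windowed measurements; as time advances, the window shifts.
rng = np.random.default_rng(0)
true_x = 2.0
N = 10                               # horizon length (assumed value)
prior, prior_weight = 0.0, 1.0       # arrival-cost prior and its weight
window = []

for k in range(100):
    z_k = true_x + 0.1 * rng.standard_normal()   # simulated measurement
    window.append(z_k)
    if len(window) > N:
        window.pop(0)                # horizon shift: drop the oldest sample
    # Closed-form least-squares solution of the windowed problem
    x_hat = (prior_weight * prior + sum(window)) / (prior_weight + len(window))
    prior = x_hat                    # carry the estimate forward as the next prior

print(f"estimated x = {x_hat:.3f} (true {true_x})")
```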
Nonlinear moving horizon estimator
• The NMHE makes use of an affine kinodynamic model (reconstructed in the sketch after this list).
• μ and ν assume the role of traversability coefficients.
• Each coefficient indicates the robot’s capability to traverse different terrains.
• Objective:
• Traversability coefficient values of 1 on smooth surfaces.
• Traversability coefficient values of 0 on obstacles or impassable terrain.
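The model equation itself appeared as an image on the slide; based on the state (p_x, p_y, θ) and the coefficients μ, ν defined above, a reconstruction is sketched below, where v and ω denote the commanded linear and angular velocities (assumed notation, not copied from the paper).

```latex
% Affine kinodynamic model with traversability coefficients \mu and \nu;
% v and \omega are the commanded linear and angular velocities (assumed).
\begin{aligned}
  \dot{p}_x &= \mu \, v \cos\theta \\
  \dot{p}_y &= \mu \, v \sin\theta \\
  \dot{\theta} &= \nu \, \omega
\end{aligned}
```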
NMHE for traversability
• We solve the following finite-horizon optimization to obtain the system states and the unknown parameters μ and ν (a sketch of the formulation follows this list):
• x_k is the state vector composed of p_x(t), p_y(t) and θ(t); x̃_k is the initial estimated state vector.
• m = [μ, ν] is the vector of parameters; m̃ is the parameter estimate from the previous iteration.
• N ∈ ℕ is the length of the horizon.
• The measurement model, subject to additive noise:
• where z_k = [z_x,k, z_y,k, z_θ,k]; z_x,k and z_y,k are the measured position coordinates from the GPS, and z_θ,k is the measured heading angle from the embedded inertial sensor.
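The formulation itself was shown as an equation image; a generic finite-horizon NMHE sketch consistent with the bullets above is given below. The weight matrices P, Q, R and the exact arrival-cost form are assumptions of this sketch, not taken verbatim from the paper.

```latex
% Generic finite-horizon NMHE consistent with the definitions above.
% P, Q, R are weight matrices (assumed); f is the kinodynamic model and
% h the measurement model.
\begin{aligned}
  \min_{x_0,\dots,x_N,\; m}\quad
    & \lVert x_0 - \tilde{x}_0 \rVert_P^2
      + \lVert m - \tilde{m} \rVert_Q^2
      + \sum_{k=0}^{N} \lVert z_k - h(x_k) \rVert_R^2 \\
  \text{s.t.}\quad
    & x_{k+1} = f(x_k, u_k, m), \qquad k = 0,\dots,N-1, \\
    & z_k = h(x_k) + \varepsilon_k, \qquad 0 \le \mu, \nu \le 1
\end{aligned}
```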
NMHE for traversability
• The optimization seeks to minimize the mismatch between the states and their
prior estimates, between the parameter vector and its previous estimate, and
between the model-predicted and actual measurements, while adhering to
specified constraints.
• The constraints: traversability coefficients μ and ν are confined within
[0, 1].
• In practice, the NMHE operates in conjunction with an Extended
Kalman Filter (EKF) for precise state estimation.
NMHE Block Diagram
[Figure: NMHE block diagram; inputs include the collected odometry data]
Experimentation Settings for Traversability Labels
• The labels for the neural network are generated from RGB and depth images, traversability coefficients, and state vectors.
• Data was collected by manually navigating the robot through diverse outdoor settings, including unstructured grassy and semi-urban environments.
• 20 minutes of driving, equivalent to a dataset comprising 15,000 images.
• These included:
• Grassy area
• Muddy area
• Collision with trees
• Collision with poles
• Collision with obstacles
• Plain area
Examples
Example
Odom data
• Each 3D point is computed from the depth Z, the camera intrinsic parameters (focal length, principal point), and the 2D image point (x, y), as follows (a short code sketch follows):
• X = (x − c_x) · Z / f_x
• Y = (y − c_y) · Z / f_y
• Z = Z
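As a quick illustration of the back-projection above, here is a minimal sketch; the intrinsic values are placeholders, not the robot's actual calibration.

```python
import numpy as np

# Minimal sketch: back-project a pixel (x, y) with depth Z into a 3D
# camera-frame point using pinhole intrinsics. The intrinsics below are
# placeholder values for illustration only.
fx, fy = 600.0, 600.0       # focal lengths in pixels (assumed)
cx, cy = 320.0, 240.0       # principal point (assumed)

def pixel_to_point(x, y, Z):
    """Return the 3D point (X, Y, Z) corresponding to pixel (x, y) at depth Z."""
    X = (x - cx) * Z / fx
    Y = (y - cy) * Z / fy
    return np.array([X, Y, Z])

print(pixel_to_point(400, 300, 2.0))   # e.g. pixel (400, 300) at 2 m depth
```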
Sparse Labels
• A binary mask image is used as the labeling mechanism.
• This mask image takes a value of one in regions where labels are present,
signifying places the robot successfully navigated.
• Regions outside these successful traversals are assigned a value of zero,
forming a clear distinction in the mask image (a construction sketch follows this list).
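A minimal sketch of how such a sparse label and mask pair could be assembled from projected path pixels and their traversability values; the image size, patch radius, and variable names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Sparse label construction: stamp traversability values at the projected
# robot-footprint pixels and mark those pixels as labeled in a binary mask.
H, W = 240, 424                              # label image size (assumed)
label = np.zeros((H, W), dtype=np.float32)   # traversability label image
mask  = np.zeros((H, W), dtype=np.uint8)     # 1 where a label exists, else 0

def add_label(u, v, traversability, radius=4):
    """Stamp a small square patch around the projected footprint pixel (u, v)."""
    u0, u1 = max(u - radius, 0), min(u + radius, W)
    v0, v1 = max(v - radius, 0), min(v + radius, H)
    label[v0:v1, u0:u1] = traversability
    mask[v0:v1, u0:u1] = 1

add_label(200, 180, 0.9)   # e.g. smooth ground traversed with high traversability
add_label(60, 150, 0.1)    # e.g. a spot where the robot collided or got stuck
```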
Block diagram for label images
[Block diagram: odometry X, Y coordinates pass through a point-to-pixel transformation to form the mask image; the nonlinear moving horizon estimator supplies the traversability coefficient values, producing the traversability label image]
Example
[Figure: high traversability value along the traversed path; low traversability upon collision]
Examples
TravNet
• ResNet-18 backbone pretrained on the ImageNet dataset
• The architecture is modified by truncating the network just before the
average-pooling layer.
• A bottleneck block with two convolutional layers is introduced,
followed by a decoder incorporating four blocks of convolutional
layers.
• For depth information, a parallel branch encodes this data and combines it
with the RGB-encoded features after every convolutional block (see the sketch after the citation below).
M. V. Gasparino et al., "WayFAST: Navigation With Predictive Traversability in the Field," in IEEE
Robotics and Automation Letters, vol. 7, no. 4, pp. 10651-10658, Oct. 2022, doi:
10.1109/LRA.2022.3193464.
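The following is a rough, runnable sketch of a TravNet-like network based only on the bullets above; channel counts, the depth-branch design, and the exact fusion scheme are assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torchvision.models as models

class ConvBlock(nn.Module):
    """Conv + BatchNorm + ReLU building block used for the depth branch and decoder."""
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )
    def forward(self, x):
        return self.block(x)

class TravNetSketch(nn.Module):
    def __init__(self):
        super().__init__()
        resnet = models.resnet18(weights=None)  # in practice: ImageNet-pretrained weights
        # RGB encoder: ResNet-18 truncated before the average-pooling layer
        self.rgb_stem = nn.Sequential(resnet.conv1, resnet.bn1, resnet.relu, resnet.maxpool)
        self.rgb_layers = nn.ModuleList([resnet.layer1, resnet.layer2, resnet.layer3, resnet.layer4])
        # Parallel depth encoder, one block per RGB stage (channel counts assumed)
        depth_ch = [64, 64, 128, 256, 512]
        self.depth_stem = ConvBlock(1, 64, stride=4)
        self.depth_layers = nn.ModuleList(
            [ConvBlock(depth_ch[i], depth_ch[i + 1], stride=2 if i else 1) for i in range(4)]
        )
        # Bottleneck with two convolutional layers
        self.bottleneck = nn.Sequential(ConvBlock(512, 512), ConvBlock(512, 512))
        # Decoder with four upsampling blocks down to a 1-channel traversability map
        dec_ch = [512, 256, 128, 64, 32]
        self.decoder = nn.Sequential(*[
            nn.Sequential(nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
                          ConvBlock(dec_ch[i], dec_ch[i + 1]))
            for i in range(4)
        ])
        self.head = nn.Conv2d(32, 1, 1)

    def forward(self, rgb, depth):
        r = self.rgb_stem(rgb)
        d = self.depth_stem(depth)
        for rgb_layer, depth_layer in zip(self.rgb_layers, self.depth_layers):
            r = rgb_layer(r + d)          # fuse depth features into the RGB stream
            d = depth_layer(d)
        x = self.bottleneck(r + d)
        return torch.sigmoid(self.head(self.decoder(x)))

# Example forward pass with dummy inputs
net = TravNetSketch()
out = net(torch.randn(1, 3, 224, 224), torch.randn(1, 1, 224, 224))
print(out.shape)
```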
Architecture
M. V. Gasparino et al., "WayFAST: Navigation With Predictive Traversability in the Field," in IEEE
Robotics and Automation Letters, vol. 7, no. 4, pp. 10651-10658, Oct. 2022, doi: 10.1109/LRA.2022.3193464.
Prediction results
• The results showcased the model’s ability to extract semantic insights that would otherwise be unattainable.
• The model successfully differentiated between grassy and less grassy areas.
• Similarly, the model was also able to predict obstacles such as trees.
Realtime Results
Robot navigation
• The second part of the research uses the traversability prediction image as
a cost map for robot navigation.
• Navigation is based on Dynamic Window Approach (DWA)
• A range of velocity combinations (v, ω) is generated.
• For each combination, the algorithm predicts the robot’s trajectory
• The algorithm selects the trajectory with the lowest cost
• To adapt the DWA algorithm to outdoor terrains, a surface-cost term is
introduced to capture the terrain’s navigability characteristics (a scoring
sketch follows this list).
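A minimal sketch of DWA-style command selection with an added surface (traversability) cost; the rollout model, cost weights, and cost-map lookup are illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np

def rollout(x, y, theta, v, w, dt=0.1, steps=20):
    """Forward-simulate a unicycle model for a (v, w) command; return the trajectory."""
    traj = []
    for _ in range(steps):
        x += v * np.cos(theta) * dt
        y += v * np.sin(theta) * dt
        theta += w * dt
        traj.append((x, y))
    return traj

def surface_cost(traj, costmap, resolution=0.1):
    """Average cost of the cells the trajectory crosses (cost = 1 - traversability)."""
    costs = []
    for x, y in traj:
        i = min(max(int(y / resolution), 0), costmap.shape[0] - 1)
        j = min(max(int(x / resolution), 0), costmap.shape[1] - 1)
        costs.append(costmap[i, j])
    return float(np.mean(costs))

def choose_command(state, goal, costmap, v_range=(0.0, 1.0), w_range=(-1.0, 1.0)):
    """Evaluate a grid of (v, w) samples and return the lowest-cost command."""
    best, best_cmd = np.inf, (0.0, 0.0)
    for v in np.linspace(*v_range, 5):
        for w in np.linspace(*w_range, 9):
            traj = rollout(*state, v, w)
            dist = np.hypot(goal[0] - traj[-1][0], goal[1] - traj[-1][1])
            cost = 1.0 * dist + 2.0 * surface_cost(traj, costmap)  # weights assumed
            if cost < best:
                best, best_cmd = cost, (v, w)
    return best_cmd

costmap = np.zeros((100, 100))        # toy cost map: 0 = fully traversable
costmap[40:60, 20:40] = 1.0           # an impassable patch
print(choose_command((0.0, 0.0, 0.0), goal=(5.0, 5.0), costmap=costmap))
```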
Robot navigation
• Distance Cost: minimum distance to the goal position.