2019 - Slam I - After Lecture

ASL

Autonomous Systems Lab

SLAM I: The problem of SLAM

Autonomous Mobile Robots

Margarita Chli, Roland Siegwart and Nick Lawrance
ASL
Autonomous Systems Lab

SLAM | today’s lecture

Section 5.8 + some extras…


§ SLAM: what is it?

§ Approaches to SLAM:
§ Bundle Adjustment
§ Filtering (UKF/EKF/Particle Filter SLAM)
§ Keyframes

§ EKF SLAM in detail


§ EKF SLAM case study: MonoSLAM
§ Components for a scalable SLAM system

Computer Vision & Robotics

[Figure: application areas — digitization in archaeology, search & rescue, crop monitoring, industrial inspection]
SLAM | WHAT IS IT? SLAM = Simultaneous Localization And Mapping

The SLAM problem:

How can a body navigate in a previously unknown environment, while constantly building and
updating a map of its workspace using onboard sensors & onboard computation?

WHEN IS IT NECESSARY?
§ When a robot must be truly autonomous (no human input)
§ When there is no prior knowledge about the environment
§ When we cannot rely exclusively on pre-placed beacons or external positioning systems (e.g. GPS)
§ When the robot needs to know where it is

Motion tracking

§ How to track the motion of a robot while it is moving?

Localization system: not always available
+ Map of the workspace: not always available
= SLAM

SLAM | Simultaneous Localization And Mapping

§ The backbone of spatial awareness of a robot
§ One of the most challenging problems in probabilistic robotics

§ An unbiased map is necessary for localizing the robot
§ Pure localization with a known map: robot localization using satellite images [Senlet and Elgammal, ICRA 2012]
§ SLAM: no a priori knowledge of the robot’s workspace

§ An accurate pose estimate is necessary for building a map of the environment
§ Mapping with known robot poses: helicopter position given by Vicon tracker (ETH Zurich Flying Machine Arena, IDSC & NCCR DF, 2013)
§ SLAM: the robot poses have to be estimated along the way

ASL
Autonomous Systems Lab

SLAM | a short history of photogrammetry


Motivation | Maps from Aerial Imagery

§ Originated from efforts to formalize the production of topographic maps from aerial imagery
§ “Photogrammetry” – the practice of determining the geometric properties of objects from images
§ Uses surveyed points on the ground and image points in common between images
§ Goal: align the images to build a topographic map of the area

Surveying for Mapping, part 1, http://www.icsm.gov.au/mapping/surveying1.html

(Bundle adjustment slides courtesy of Paul Furgale, May 3, 2014)
ASL
Autonomous Systems Lab

SLAM | a short history of photogrammetry

Motivation | Maps from Aerial Imagery

[Figure: slotted templets being laid at the Soil Conservation Service, U.S. Department of Agriculture. Points surveyed on the ground are physically nailed into the board; other tie points loosely constrain the cardboard templates.]

References:
– The impact on topographic mapping of developments in land and air survey: 1900–1939. Cartography and Geographic Information Science, 29(3):155–174, 2002.
– Bagley, J.W. 1917. The use of the panoramic camera in topographic surveying. U.S. Geological Survey Bulletin no. 657. Washington, D.C.: Government Printing Office.
– Chasseaud, P. 1999. Artillery’s astrologers: A history of British survey and mapping on the Western Front. Lewes, Sussex: Mapbooks.
ASL
Autonomous Systems Lab

SLAM | a short history of photogrammetry

Motivation | Maps from Aerial Imagery – Bundle Adjustment

Bundle adjustment has its roots in photogrammetry and geodesy, and its principles were applied in many different areas:

“A rigorous least squares adjustment, believed to be of unprecedented universality, is given for the simultaneous adjustment of the entire set of observations arising from a general m-station photogrammetric net. A computing program for automatic electronic computers is outlined.”
– D.C. Brown, A Solution to the General Problem of Multiple Station Analytical Stereotriangulation, RCA Technical Report No. 43, February 1958

In 1961, Duane Brown joined the Instrument Corporation of Florida and two years later purchased the Research and Analysis Division, where he was the Director, and formed DBA (Duane Brown and Associates). DBA soon established itself as a leader in close-range photogrammetry and analytical photogrammetry. During his stay at DBA, the company developed a number of high-accuracy, large-format, close-range photogrammetric cameras. Brown continued to refine the bundle adjustment for large photogrammetric blocks to include self-calibration: adjusting for decentering distortion, principal point calibration, and film unflatness. Applying his principles to the survey of the radio telescope at Greenbank, WV, using a modified ballistic camera, Brown was able to achieve accuracies at around the 1:50,000 level, or about 2 mm [Brown, 2005].

[Figure 48: Duane Brown with the D.C. Brown CRC-1 camera, from Brown, 2005. Figure 49: BC4 Satellite Triangulation Camera station operated by the U.S. Coast and Geodetic Survey.]
considerable deformation of glass
ASL
Autonomous Systems Lab

SLAM | a short history of photogrammetry

Bundle Adjustment | Unprecedented Universality

[Figure: panels (a) and (b) – 3D lunar terrain reconstruction from Apollo images]

Michael J. Broxton, Ara V. Nefian, Zachary Moratto, Taemin Kim, Michael Lundy, and Aleksandr V. Segal, “3D Lunar Terrain Reconstruction from Apollo Images”, International Symposium on Visual Computing, 2009.
ASL
Autonomous Systems Lab

SLAM | a short history of photogrammetry

Bundle Adjustment | Unprecedented Universality

[Fig. 12: Residuals of check points onto HiRISE image mosaics before and after BA. (a) Before BA (exaggerated 50 times). (b) After BA (exaggerated 200 times). Fig. 14: HiRISE orthophoto draped on the corresponding DEM with a vertical exaggeration of two times.]

R. Li, J. Hwangbo, Y. Chen, and K. Di. Rigorous photogrammetric processing of HiRISE stereo imagery for Mars topographic mapping. IEEE Transactions on Geoscience and Remote Sensing, 49(7), July 2011.
ASL
Autonomous Systems Lab

SLAM | from photogrammetry to SFM (SFM: Structure From Motion)


[Agarwal et al., “Reconstructing Rome”, IEEE Computer, 2010 ]

SLAM | from photogrammetry to SFM to SLAM

SLAM | how does it work?

§ How can we track the motion of a camera/robot while


it is moving?

§ Traditional SLAM:
Pick natural scene features as landmarks, observe their
motion & reason about robot motion

§ Research into:
§ Good features to track, sensors, trackers, representations,
assumptions
§ Ways of dealing with uncertainty in the processes
involved

The videos are courtesy of A. Davison.
ASL
Autonomous Systems Lab

how to do SLAM | with a Gaussian Filter

§ Use internal representations (the “state”) for
§ the positions of landmarks (: the map)
§ the camera parameters

§ Assumption:
Robot’s uncertainty at starting position is zero

Start: robot has zero uncertainty
ASL
Autonomous Systems Lab

how to do SLAM | with a Gaussian Filter

On every frame:
§ Predict how the robot has moved
§ Measure
§ Update the internal representations

First measurement of feature A
ASL
Autonomous Systems Lab

how to do SLAM | with a Gaussian Filter

§ The robot observes a feature, which is mapped with an uncertainty related to the measurement model, e.g. the camera model, describing how world points map into pixels in the image.

On every frame:
§ Predict how the robot has moved
§ Measure
§ Update the internal representations
ASL
Autonomous Systems Lab

how to do SLAM | with a Gaussian Filter

§ As the robot moves, its pose uncertainty increases, obeying the robot’s motion model, e.g. the driver’s commands (turn right, drive on for 1 m) – uncertainty is added due to wheel slippage and other imprecisions.

On every frame:
§ Predict how the robot has moved
§ Measure
§ Update the internal representations

Robot moves forwards: uncertainty grows
ASL
Autonomous Systems Lab

how to do SLAM | with a Gaussian Filter

§ Robot observes two new features, B & C.

On every frame:
§ Predict how the robot has moved
§ Measure
§ Update the internal representations

Robot makes first measurements of B & C
ASL
Autonomous Systems Lab

how to do SLAM | with a Gaussian Filter

§ Their position uncertainty results from the combination of the measurement error with the robot pose uncertainty
→ the map becomes correlated with the robot pose estimate.

On every frame:
§ Predict how the robot has moved
§ Measure
§ Update the internal representations

Robot makes first measurements of B & C
ASL
Autonomous Systems Lab

how to do SLAM | with a Gaussian Filter

§ Robot moves again and its uncertainty increases (motion model).

On every frame:
§ Predict how the robot has moved
§ Measure
§ Update the internal representations

Robot moves again: uncertainty grows more
ASL
Autonomous Systems Lab

how to do SLAM | with a Gaussian Filter

§ Robot re-observes an old feature
→ Loop closure detection

On every frame:
§ Predict how the robot has moved
§ Measure
§ Update the internal representations

Robot re-measures A: “loop closure”
ASL
Autonomous Systems Lab

how to do SLAM | with a Gaussian Filter

§ Robot updates its position: the resulting pose estimate becomes correlated with the feature location estimates.

§ Robot’s uncertainty shrinks, and so does the uncertainty in the rest of the map.

On every frame:
§ Predict how the robot has moved
§ Measure
§ Update the internal representations

Robot re-measures A: “loop closure”
uncertainty shrinks
ASL
Autonomous Systems Lab

SLAM | probabilistic formulation


Using the notation of [Davison et al., PAMI 2007]:

§ Robot pose at time t: x_t → robot path up to this time: {x_0, x_1, …, x_t}
§ Robot motion between time t-1 and t: u_t (e.g. control inputs) → sequence of robot relative motions: {u_0, u_1, …, u_t}
§ The true map of the environment: {m_0, m_1, …, m_N}
§ At each time t the robot makes measurements z_i → set of all measurements (observations): {z_0, z_1, …, z_k}

§ The Full SLAM problem: estimate the posterior p(x_{0:t}, m_{0:N} | z_{0:k}, u_{0:t})
§ The Online SLAM problem: estimate the posterior p(x_t, m_{0:N} | z_{0:k}, u_{0:t})
ASL
Autonomous Systems Lab

SLAM | graphical representation

m0 m1 m2 m3 m4 m5 m6 m7 m8

z0 z1 z2 z3 z4 z5 z6 z7 z8 z9 z10 z11 z12 z13 z14 z15 z16

x0 x1 x2 x3 . . .
u0 u1 u2 u3

ASL
Autonomous Systems Lab

SLAM | approaches to SLAM

Full graph optimization (Bundle Adjustment)


m0 m1 m2 m3 m4 m5 m6 m7 m8

x0 x1 x2 x3

§ Eliminate observations & control-input nodes and solve for the constraints between poses and
landmarks.
§ Globally consistent solution, but infeasible for large-scale SLAM

ASL
Autonomous Systems Lab

SLAM | full graph optimization

Full graph optimization (Bundle Adjustment)


§ Minimize the total least-squares cost function – the reprojection error
§ Use a batch Maximum Likelihood approach
§ Assume Gaussian noise densities
§ Pros
ü Information can move backward in time
ü Best possible (most likely) estimate given the data
and models
§ Cons
x Computationally demanding
x Difficult to provide the online estimates for a controller

→ If real-time is a requirement, we need to sparsify the SLAM graph
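The reprojection-error minimization above can be illustrated with a toy Gauss-Newton iteration. This is a minimal sketch under strong assumptions (a 1D world, a single unknown camera offset, noiseless observations); all names and numbers are illustrative, not the lecture's formulation:

```python
import numpy as np

# Toy least-squares "bundle adjustment" in 1D: a camera at an unknown
# offset observes known landmark coordinates, and we minimise the total
# squared reprojection error with Gauss-Newton.

landmarks = np.array([1.0, 2.0, 4.0, 7.0])   # landmark positions
true_offset = 0.5
z = landmarks - true_offset                  # observed "projections"

def residual(offset):
    """Reprojection error: predicted projection minus observation."""
    return (landmarks - offset) - z

offset = 0.0                                 # initial guess
for _ in range(5):
    r = residual(offset)
    J = -np.ones_like(landmarks)             # Jacobian d(residual)/d(offset)
    offset += -(J @ r) / (J @ J)             # Gauss-Newton step

print(offset)  # recovers the true offset, 0.5
```

In full bundle adjustment the state also contains all landmark positions and 6-DoF poses, which is what makes the batch solve computationally demanding.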


ASL
Autonomous Systems Lab

SLAM | approaches to SLAM

Filtering
m0 m1 m2 m3 m4 m5 m6 m7 m8

x0 x1 x2 x3

§ Eliminate all past poses: ‘summarize’ all experience with respect to the last pose, using a state
vector and the associated covariance matrix

ASL
Autonomous Systems Lab

SLAM | filtering
§ Gaussian Filtering (EKF, UKF) (Courtesy of P. Furgale)
§ Tracks a Gaussian belief of the state/landmarks
§ Assumes all noise is Gaussian
§ Follows the “predict/measure/update” approach

§ Pros
ü Can run online
ü Works well for problems experiencing expected perturbations/uncertainty

§ Cons
x Unimodal estimate
x States must be well approximated by a Gaussian
x The vanilla implementation does not scale very well with larger maps
ASL
Autonomous Systems Lab

SLAM | filtering
§ Particle Filtering: p_i = { y_t^i, w^i }
§ Represents belief by a series of samples
§ Each particle = a hypothesis of the state with an associated weight (all weights should add up to 1)
§ Follows the “predict/measure/update” approach
ASL
Autonomous Systems Lab

how to do SLAM | with a Particle Filter


§ Use internal representations (the “state”) for
§ the positions of landmarks (: the map)
§ the camera parameters

§ Assumption:
Robot’s uncertainty at starting position is zero

§ Initialize N particles at the origin, each with weight 1/N

Start: robot has zero uncertainty
ASL
Autonomous Systems Lab

how to do SLAM | with a Particle Filter


On every frame:
§ Predict how the robot has moved
§ Measure
§ Update the internal representations

First measurement of feature A
ASL
Autonomous Systems Lab

how to do SLAM | with a Particle Filter


§ The robot observes a feature, which is mapped with an uncertainty related to the measurement model.

On every frame:
§ Predict how the robot has moved
§ Measure
§ Update the internal representations
ASL
Autonomous Systems Lab

how to do SLAM | with a Particle Filter


§ As the robot moves, its pose uncertainty increases, obeying the robot’s motion model.

§ Apply the motion model to each particle

On every frame:
§ Predict how the robot has moved
§ Measure
§ Update the internal representations

Robot moves forwards: uncertainty grows
ASL
Autonomous Systems Lab

how to do SLAM | with a Particle Filter


§ Robot observes two new features, B & C.

On every frame:
§ Predict how the robot has moved
§ Measure
§ Update the internal representations

Robot makes first measurements of B & C
ASL
Autonomous Systems Lab

how to do SLAM | with a Particle Filter


§ Their position uncertainty is encoded for each particle individually.

For each particle:
§ Compare the particle’s predicted measurements with the obtained measurements
§ Re-weigh such that particles with good predictions get higher weight & re-normalize particle weights
§ Re-sample according to likelihood

On every frame:
§ Predict how the robot has moved
§ Measure
§ Update the internal representations

Robot makes first measurements of B & C
ASL
Autonomous Systems Lab

how to do SLAM | with a Particle Filter


§ Robot moves again and its uncertainty increases (motion model).

§ Apply the motion model to each particle

On every frame:
§ Predict how the robot has moved
§ Measure
§ Update the internal representations

Robot moves again: uncertainty grows more
ASL
Autonomous Systems Lab

how to do SLAM | with a Particle Filter


§ Robot moves again and its uncertainty increases (motion model).

On every frame:
§ Predict how the robot has moved
§ Measure
§ Update the internal representations
ASL
Autonomous Systems Lab

how to do SLAM | with a Particle Filter


§ Robot re-observes an old feature
→ Loop closure detection

On every frame:
§ Predict how the robot has moved
§ Measure
§ Update the internal representations

Robot re-measures A: “loop closure”
ASL
Autonomous Systems Lab

how to do SLAM | with a Particle Filter


For each particle:
§ Compare the particle’s predicted measurements with the obtained measurements
§ Re-weigh such that particles with good predictions get higher weight & re-normalize particle weights
§ Re-sample according to likelihood

On every frame:
§ Predict how the robot has moved
§ Measure
§ Update the internal representations

Robot re-measures A: “loop closure”
uncertainty shrinks
ASL
Autonomous Systems Lab

SLAM | filtering
§ Particle Filtering (Courtesy of P. Furgale)
§ Represents belief by a series of samples
§ Each particle = a hypothesis of the state with an associated weight (all weights should add up to 1)
§ Follows the “predict/measure/update” approach

§ Pros
ü Noise densities can be from any distribution
ü Works for multi-modal distributions
ü Easy to implement

§ Cons
x Does not scale to high-dimensional problems
x Requires many particles to have good convergence

Distribution in the robot’s position estimate: red dots – particle filtering; red ellipse – EKF filtering
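The predict/measure/update loop of the particle-filter slides can be sketched in one dimension. The motion model, measurement model, and all noise values below are illustrative assumptions, not the lecture's exact models:

```python
import numpy as np

# Minimal 1D particle-filter localization sketch.
rng = np.random.default_rng(0)

N = 1000
particles = np.zeros(N)               # start: zero uncertainty at the origin
weights = np.full(N, 1.0 / N)

def predict(particles, u, motion_noise=0.1):
    # apply the motion model (plus noise) to each particle
    return particles + u + rng.normal(0.0, motion_noise, size=particles.shape)

def update(weights, particles, z, meas_noise=0.2):
    # re-weigh by measurement likelihood, then re-normalise
    lik = np.exp(-0.5 * ((z - particles) / meas_noise) ** 2)
    w = weights * lik
    return w / w.sum()

def resample(particles, weights):
    # draw particles proportionally to their weights
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

true_pos = 0.0
for _ in range(10):
    true_pos += 1.0                           # robot drives 1 m forward
    particles = predict(particles, 1.0)
    z = true_pos + rng.normal(0.0, 0.2)       # noisy position measurement
    weights = update(weights, particles, z)
    particles, weights = resample(particles, weights)

estimate = particles.mean()                   # close to true_pos = 10
```

The weighted-mean estimate tracks the true position; in SLAM, each particle would additionally carry its own map hypothesis, which is why the approach does not scale to high-dimensional problems.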
ASL
Autonomous Systems Lab

SLAM | approaches to SLAM

Key-frames
m0 m1 m2 m3 m4 m5 m6 m7 m8

x0 x1 x2 x3

§ Retain the most ‘representative’ poses (key-frames) and their dependency links → optimize the resulting graph

ASL
Autonomous Systems Lab

SLAM | keyframes

§ Keyframe-based SLAM
[PTAM, Klein & Murray, ISMAR 2007]
§ Minimizes the least-squares cost function
§ Typically optimizes over a window of recent
keyframes for efficiency
§ Assumes Gaussian noise densities
§ Pros
ü Known to provide better balance between
accuracy & efficiency than filtering
ü Permits processing of many more features
per frame than filtering
§ Cons
x Size of optimization window affects scalability
and convergence
ASL
Autonomous Systems Lab

EKF SLAM | overview

§ SLAM using an Extended Kalman Filter (EKF)

ASL
Autonomous Systems Lab

EKF SLAM | overview

§ EKF SLAM summarizes all past experience in an extended state vector y_t, comprising the robot pose x_t and the positions of all the features m_i in the map, and an associated covariance matrix P_{y_t}:

$$
y_t = \begin{bmatrix} x_t \\ m_0 \\ \vdots \\ m_{n-1} \end{bmatrix}, \qquad
P_{y_t} = \begin{bmatrix}
P_{xx} & P_{x m_0} & \dots & P_{x m_{n-1}} \\
P_{m_0 x} & P_{m_0 m_0} & \dots & P_{m_0 m_{n-1}} \\
\vdots & \vdots & \ddots & \vdots \\
P_{m_{n-1} x} & P_{m_{n-1} m_0} & \dots & P_{m_{n-1} m_{n-1}}
\end{bmatrix}
$$

§ If we sense 2D line-landmarks, the size of y_t is 3+2n (and the size of P_{y_t} is (3+2n)×(3+2n)):
§ 3 variables to represent the robot pose, and
§ 2n variables for the n line-landmarks. Hence,
$$ y_t = [X_t, Y_t, \theta_t, \alpha_0, r_0, \dots, \alpha_{n-1}, r_{n-1}]^T $$
§ As the robot moves and makes measurements, y_t and P_{y_t} are updated using the standard EKF equations.
ASL
Autonomous Systems Lab

EKF SLAM | prediction step

§ The predicted robot pose x̂_t at time-stamp t is computed using the estimated pose x_{t-1} at time-stamp t-1 and the odometric control input u_t = {Δs_l, Δs_r}, where Δs_l, Δs_r are the distances travelled by the left and right wheels respectively, and b is the distance between the two robot wheels (based on the example of Section 5.8.4 of the AMR book):

$$
\hat{x}_t = f(x_{t-1}, u_t) =
\begin{bmatrix} \hat{X}_t \\ \hat{Y}_t \\ \hat{\theta}_t \end{bmatrix} =
\begin{bmatrix} X_{t-1} \\ Y_{t-1} \\ \theta_{t-1} \end{bmatrix} +
\begin{bmatrix}
\frac{\Delta s_r + \Delta s_l}{2}\cos\left(\theta_{t-1} + \frac{\Delta s_r - \Delta s_l}{2b}\right) \\
\frac{\Delta s_r + \Delta s_l}{2}\sin\left(\theta_{t-1} + \frac{\Delta s_r - \Delta s_l}{2b}\right) \\
\frac{\Delta s_r - \Delta s_l}{b}
\end{bmatrix}
$$

§ During this step, the positions of the features remain unchanged: the odometry increment is applied to the pose entries and zeros to the feature entries:

$$
\hat{y}_t =
\begin{bmatrix} \hat{X}_t \\ \hat{Y}_t \\ \hat{\theta}_t \\ \hat{\alpha}_0 \\ \hat{r}_0 \\ \vdots \\ \hat{\alpha}_{n-1} \\ \hat{r}_{n-1} \end{bmatrix} =
\begin{bmatrix} X_{t-1} \\ Y_{t-1} \\ \theta_{t-1} \\ \alpha_0 \\ r_0 \\ \vdots \\ \alpha_{n-1} \\ r_{n-1} \end{bmatrix} +
\begin{bmatrix}
\frac{\Delta s_r + \Delta s_l}{2}\cos\left(\theta_{t-1} + \frac{\Delta s_r - \Delta s_l}{2b}\right) \\
\frac{\Delta s_r + \Delta s_l}{2}\sin\left(\theta_{t-1} + \frac{\Delta s_r - \Delta s_l}{2b}\right) \\
\frac{\Delta s_r - \Delta s_l}{b} \\
0 \\ 0 \\ \vdots \\ 0 \\ 0
\end{bmatrix}
$$

§ EKF prediction equation for the covariance, with F_y, F_u the Jacobians of f, Q_t the covariance of the noise associated to the motion, and P_{y_{t-1}} the covariance at the previous time-stamp:

$$ \hat{P}_{y_t} = F_y P_{y_{t-1}} F_y^T + F_u Q_t F_u^T $$
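The pose part of this prediction step can be sketched numerically. This follows the differential-drive odometry function above (ds_l, ds_r: wheel travel, b: wheel base); the landmark entries of the state stay unchanged, and the noise term F_u Q_t F_u^T is omitted here for brevity:

```python
import numpy as np

def predict_pose(x, ds_l, ds_r, b):
    # odometry function f: apply the wheel increments to the pose
    X, Y, theta = x
    ds = (ds_r + ds_l) / 2.0
    dtheta = (ds_r - ds_l) / b
    a = theta + (ds_r - ds_l) / (2.0 * b)      # heading at mid-motion
    return np.array([X + ds * np.cos(a), Y + ds * np.sin(a), theta + dtheta])

def jacobian_Fx(x, ds_l, ds_r, b):
    # d f / d (X, Y, theta), obtained by differentiating predict_pose
    _, _, theta = x
    ds = (ds_r + ds_l) / 2.0
    a = theta + (ds_r - ds_l) / (2.0 * b)
    return np.array([[1.0, 0.0, -ds * np.sin(a)],
                     [0.0, 1.0,  ds * np.cos(a)],
                     [0.0, 0.0,  1.0]])

x = np.array([0.0, 0.0, 0.0])                  # robot at the origin
P = np.diag([0.01, 0.01, 0.001])               # pose covariance

x_pred = predict_pose(x, 1.0, 1.0, 0.5)        # drive 1 m straight ahead
Fx = jacobian_Fx(x, 1.0, 1.0, 0.5)
P_pred = Fx @ P @ Fx.T                         # heading noise leaks into Y
```

Note how the off-diagonal Jacobian terms couple heading uncertainty into position uncertainty, which is exactly why the pose covariance grows as the robot moves.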
ASL
Autonomous Systems Lab

EKF SLAM | vs. EKF localization

EKF LOCALIZATION
§ The state x_t is only the robot configuration:
$$ x_t = [X_t, Y_t, \theta_t]^T $$
§ The prediction function is $\hat{x}_t = f(x_{t-1}, u_t)$:
$$
\begin{bmatrix} \hat{X}_t \\ \hat{Y}_t \\ \hat{\theta}_t \end{bmatrix} =
\begin{bmatrix} X_{t-1} \\ Y_{t-1} \\ \theta_{t-1} \end{bmatrix} +
\begin{bmatrix}
\frac{\Delta s_r + \Delta s_l}{2}\cos\left(\theta_{t-1} + \frac{\Delta s_r - \Delta s_l}{2b}\right) \\
\frac{\Delta s_r + \Delta s_l}{2}\sin\left(\theta_{t-1} + \frac{\Delta s_r - \Delta s_l}{2b}\right) \\
\frac{\Delta s_r - \Delta s_l}{b}
\end{bmatrix}
$$
§ Covariance prediction: $\hat{P}_{x_t} = F_x P_{x_{t-1}} F_x^T + F_u Q_t F_u^T$

EKF SLAM
§ The state y_t comprises the robot configuration x_t and that of each feature m_i:
$$ y_t = [X_t, Y_t, \theta_t, \alpha_0, r_0, \dots, \alpha_{n-1}, r_{n-1}]^T $$
§ The prediction function is $\hat{y}_t = f(y_{t-1}, u_t)$: the same odometry increment is applied to the pose entries, while the feature entries receive zeros.
§ Covariance prediction: $\hat{P}_{y_t} = F_y P_{y_{t-1}} F_y^T + F_u Q_t F_u^T$
ASL
Autonomous Systems Lab

EKF SLAM | measurement prediction & update

§ The application of the measurement model is the same as in EKF localization. The predicted observation of each feature m_i is
$$ \hat{z}_i = \begin{bmatrix} \hat{\alpha}_i \\ \hat{r}_i \end{bmatrix} = h_i(\hat{x}_t, m_i), $$
i.e. the predicted new pose is used to predict where each feature lies in measurement space.

§ After obtaining the set of actual observations z_{0:n-1}, the EKF state gets updated:
$$ y_t = \hat{y}_t + K_t \left( z_{0:n-1} - h_{0:n-1}(\hat{x}_t, m_{0:n-1}) \right) $$
$$ P_{y_t} = \hat{P}_{y_t} - K_t \, \Sigma_{IN} \, K_t^T $$
where H is the Jacobian of h, R is the measurement noise, the “innovation” is the difference observation − prediction, and
$$ \Sigma_{IN} = H \hat{P}_{y_t} H^T + R \;\; \text{(innovation covariance)}, \qquad K_t = \hat{P}_{y_t} H^T \, \Sigma_{IN}^{-1} \;\; \text{(Kalman gain)} $$
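These update equations can be run numerically on a toy problem. The sketch below assumes a linear measurement model h(y) = H·y so the Jacobian H is exact; dimensions and noise values are illustrative:

```python
import numpy as np

# Numeric sketch of the EKF update equations.
y_pred = np.array([1.0, 2.0])          # predicted state
P_pred = np.eye(2) * 0.5               # predicted covariance
H = np.eye(2)                          # measurement Jacobian
R = np.eye(2) * 0.1                    # measurement noise

z = np.array([1.2, 1.9])               # actual observation
innovation = z - H @ y_pred            # observation - prediction
S = H @ P_pred @ H.T + R               # innovation covariance Sigma_IN
K = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain

y = y_pred + K @ innovation
P = P_pred - K @ S @ K.T               # uncertainty shrinks after the update
```

The gain K weighs the innovation by how confident the filter is relative to the sensor: with P_pred = 0.5 and R = 0.1, most of the innovation is absorbed into the state, and every diagonal entry of P ends up smaller than the corresponding entry of P_pred.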
ASL
Autonomous Systems Lab

MonoSLAM

§ An example of EKF SLAM: MonoSLAM

[Davison, Reid, Molton and Stasse, PAMI 2007]
ASL
Autonomous Systems Lab

MonoSLAM | single camera SLAM

Vision for SLAM:
§ Images = information-rich snapshots of a scene
§ Compactness + affordability of cameras
§ HW advances

§ SLAM using a single, handheld camera:
§ Hard but … (e.g. cannot recover depth from 1 image)
§ very applicable, compact, affordable, …

Image courtesy of G. Klein
ASL
Autonomous Systems Lab

MonoSLAM | from SFM to SLAM

Structure from Motion (SFM):

§ Take some images of the object/scene to reconstruct
§ Features (points, lines, …) are extracted from all frames and matched among them
§ Process all images simultaneously: images do not need to be ordered
§ Optimization to recover both camera motion and 3D structure, up to a scale factor
§ Not real-time

San Marco square, Venice: 14,079 images, 4,515,157 points
[Agarwal et al., “Reconstructing Rome”, IEEE Computer, 2010]
ASL
Autonomous Systems Lab

MonoSLAM | problem statement


§ Can we track the motion of a hand-held camera while it is moving? i.e. online
§ Extract Shi-Tomasi features & track them in image space

scene view camera view


ASL
Autonomous Systems Lab

MonoSLAM | problem statement


§ SLAM using a single camera, grabbing frames at 30Hz
§ Ellipses (in camera view) and Ellipsoids (in map view) represent uncertainty

camera view internal SLAM map


ASL
Autonomous Systems Lab

MonoSLAM | representation of the world


§ The belief about the state of the world x is approximated with a single, multivariate Gaussian distribution, with mean (the state vector, d×1) and covariance matrix (d×d).

§ Camera state:
§ position [3 dim.]
§ orientation using quaternions [4 dim.]
§ linear velocity [3 dim.]
§ angular velocity [3 dim.]

§ Landmark state: e.g. the 3D position for point-features
ASL
Autonomous Systems Lab

MonoSLAM | motion & probabilistic prediction

§ How has the camera moved from frame t-1 to frame t?

$$ \hat{x}_t = f(x_{t-1}, u_t), \qquad \hat{P}_t = F \, P_{t-1} F^T + F_u Q_t F_u^T $$

§ The camera is hand-held → no odometry or control input data
→ Use a motion model

§ But how can we model the unknown intentions of a human carrier?

§ Davison et al. use a constant linear velocity, constant angular velocity motion model: “we assume that the camera moves at a constant velocity over all time, […] but on average we expect undetermined accelerations occur with a Gaussian profile”
MonoSLAM | constant linear & angular velocity model

At each time step, the unknown linear acceleration a and angular acceleration α (characterized by zero-mean Gaussian distributions) cause an impulse of velocity:

$$ V = a \, \Delta t, \qquad \Omega = \alpha \, \Delta t $$

The constant velocity motion model imposes a certain smoothness on the expected camera motion.

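A 1D sketch of this constant-velocity prediction: between frames the camera keeps its linear velocity, while an unknown zero-mean Gaussian acceleration adds a velocity impulse V = a·Δt. The noise level is an illustrative assumption, and MonoSLAM's full state also carries a quaternion orientation and an angular velocity:

```python
import numpy as np

rng = np.random.default_rng(1)

dt = 1.0 / 30.0            # 30 Hz frame rate, as in MonoSLAM
r, v = 0.0, 0.3            # position and linear velocity

for _ in range(30):        # predict over one second of frames
    a = rng.normal(0.0, 0.5)   # undetermined acceleration (assumed sigma)
    V = a * dt                 # velocity impulse
    r += (v + V) * dt          # integrate position
    v += V                     # velocity persists between frames
# r stays close to 0.3 m: the model expects smooth motion
```

Because the impulses are small relative to the carried velocity, the predicted trajectory stays smooth, which is precisely the prior the model encodes about a human carrier.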
ASL
Autonomous Systems Lab

MonoSLAM | motion & probabilistic prediction

§ Based on the predicted new camera pose → predict which known features will be visible and where they will lie in the image

§ Use the measurement model h to identify the predicted location $\hat{z}_i = h_i(\hat{x}_t, y_i)$ of each feature and an associated search region (defined in the corresponding diagonal block of $\Sigma_{IN} = H \hat{P}_t H^T + R$)

§ Essentially: project the 3D ellipsoids from the SLAM map onto the image space
ASL
Autonomous Systems Lab

MonoSLAM | measurement & EKF update steps

§ Search for the known feature-patches inside their corresponding search


regions to get the set of all observations

§ Update the state using the EKF equations


$$ x_t = \hat{x}_t + K_t \left( z_{0:n-1} - h_{0:n-1}(\hat{x}_t, y_{0:n-1}) \right) $$
$$ P_t = \hat{P}_t - K_t \, \Sigma_{IN} \, K_t^T $$
where
$$ \Sigma_{IN} = H \hat{P}_t H^T + R, \qquad K_t = \hat{P}_t H^T \, \Sigma_{IN}^{-1} $$
MonoSLAM | applications

§ MonoSLAM for Augmented Reality

§ HRP-2 Humanoid at JRL, AIST, Japan

The videos are courtesy of Andrew J. Davison

ASL
Autonomous Systems Lab

EKF SLAM | a note on correlations

§ At start-up, the robot makes the first measurements and the covariance matrix is populated assuming that these (initial) features are uncorrelated → the off-diagonal elements are zero:

$$
P_0 = \begin{bmatrix}
P_{xx} & 0 & 0 & \dots & 0 \\
0 & P_{y_0 y_0} & 0 & \dots & 0 \\
0 & 0 & P_{y_1 y_1} & \dots & 0 \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
0 & 0 & 0 & \dots & P_{y_{n-1} y_{n-1}}
\end{bmatrix}
$$

§ When the robot starts moving & taking new measurements, both the robot pose and the features start becoming correlated:
$$ \hat{P}_{y_t} = F_y P_{y_{t-1}} F_y^T + F_u Q_t F_u^T $$
§ Accordingly, the covariance matrix becomes dense.
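How the off-diagonal terms fill in can be shown with a tiny example: the covariance starts block-diagonal (robot and one landmark uncorrelated in 1D), and a single EKF update with a relative measurement z = landmark − robot correlates them. All numbers are illustrative:

```python
import numpy as np

P = np.diag([0.5, 0.2])            # [robot, landmark] variances, uncorrelated
H = np.array([[-1.0, 1.0]])        # Jacobian of the relative measurement
R = np.array([[0.1]])              # measurement noise

S = H @ P @ H.T + R                # innovation covariance
K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
P_new = P - K @ S @ K.T            # updated covariance

# robot and landmark are now correlated (off-diagonal entry != 0)
print(P_new[0, 1])
```

One relative observation is enough to make the blocks interact; repeated covisibility keeps strengthening these correlations, which is the convergence mechanism noted on the next slide.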
ASL
Autonomous Systems Lab

EKF SLAM | a note on correlations

§ Correlations arise as:


§ the (uncertain) robot pose is used to obtain the (uncertain) position of the features.
§ the observed features are used to update the robot’s pose.
Chli & Davison, ICRA 2009

§ Regularly covisible features become correlated, and when their motion is coherent, their correlation is even stronger.

§ Correlations are very important for convergence: the more observations are made, the more the correlations between the features grow, and the better the solution to SLAM.

ASL
Autonomous Systems Lab

EKF SLAM | drawbacks

§ The state vector in EKF SLAM is much larger than the state vector in EKF localization

§ Newly observed features are added to the state vector → the covariance matrix grows quadratically with the number of features → computationally expensive for large-scale SLAM.

§ Approach to attack this: sparsify the structure of the covariance matrix (via approximations) [Chli & Davison, ICRA 2009]

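The quadratic growth is easy to put in numbers: with n line-landmarks of state (α, r), the state vector has 3 + 2n entries and the covariance matrix (3 + 2n)² entries.

```python
# Worked numbers for the quadratic growth of the EKF SLAM covariance.
for n in (10, 100, 1000):
    d = 3 + 2 * n       # state dimension: 3 pose + 2 per line-landmark
    print(f"n = {n:5d}   state dim = {d:5d}   covariance entries = {d * d}")
```

At n = 1000 landmarks the filter already maintains a 2003×2003 matrix (over four million entries), and each EKF update touches all of them, which motivates the sparsification approaches above.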
ASL
Autonomous Systems Lab

SLAM Challenges | components for scalable SLAM

§ Robust local motion estimation
§ Map management & optimisation
§ Mapping & loop-closure detection

[Chli, PhD Thesis, 2009]
