
Mobile Robot Positioning:

Sensors and Techniques


J. Borenstein*
The University of Michigan
Advanced Technologies Lab
1101 Beal Avenue
Ann Arbor, MI 48109-2110
e-mail: johannb@umich.edu

H. R. Everett
Naval Command, Control, and Ocean Surveillance
Center
RDT&E Division 5303
271 Catalina Boulevard
San Diego, CA 92152-5001

L. Feng
The University of Michigan
Advanced Technologies Lab
1101 Beal Avenue
Ann Arbor, MI 48109-2110

D. Wehe
The University of Michigan
Dept. of Nuclear Engineering and Radiological
Sciences
239 Cooley Bldg.
Ann Arbor, MI 48109

Submitted April 1996, accepted and resubmitted: September 1996

*To whom all correspondence should be addressed.

Journal of Robotic Systems 14(4), 231-249 (1997)


© 1997 by John Wiley & Sons, Inc. CCC 0741-2223/97/040231-19

Exact knowledge of the position of a vehicle is a fundamental problem in mobile robot applications. In search of a solution, researchers and engineers have developed a variety of systems, sensors, and techniques for mobile robot positioning. This article provides a review of relevant mobile robot positioning technologies. The article defines seven categories for positioning systems: (1) Odometry, (2) Inertial Navigation, (3) Magnetic Compasses, (4) Active Beacons, (5) Global Positioning Systems, (6) Landmark Navigation, and (7) Model Matching. The characteristics of each category are discussed and examples of existing technologies are given for each category. The field of mobile robot navigation is active and vibrant, with more great systems and ideas being developed continuously. For this reason the examples presented in this article serve only to represent their respective categories, but they do not represent a judgment by the authors. Many ingenious approaches can be found in the literature, although, for reasons of brevity, not all could be cited in this article. © 1997 John Wiley & Sons, Inc.

1. INTRODUCTION

This article surveys the state-of-the-art in sensors, systems, methods, and technologies that aim at finding a mobile robot's position in its environment. In surveying the literature on this subject, it became evident that a benchmark-like comparison of different approaches is difficult because of the lack of commonly accepted test standards and procedures. The research platforms used differ greatly, and so do the key assumptions used in different approaches. Further challenges arise from the fact that different systems are at different stages in their development. For example, one system may be commercially available, while another system, perhaps with better performance, has been tested only under a limited set of laboratory conditions. For these reasons we generally refrain from comparing or even judging the performance of different systems or techniques. Furthermore, we have not tested most of the systems and techniques, so the results and specifications given in this article are derived from the literature.

Finally, we should point out that a large body of literature related to navigation of aircraft, spacecraft, or even artillery addresses some of the problems found in mobile robot navigation.1,2 However, we have focused our survey only on literature pertaining directly to mobile robots. This is because sensor systems for mobile robots must usually be relatively small, lightweight, and inexpensive. Similarly, we are not considering Automated Guided Vehicles (AGVs) in this article. AGVs use magnetic tape, buried guide wires, or painted stripes on the ground for guidance. These vehicles are thus not freely programmable, and they cannot alter their path in response to external sensory input (e.g., obstacle avoidance). However, the interested reader may find a survey of guidance techniques for AGVs in ref. 3.

Perhaps the most important result from surveying the literature on mobile robot positioning is that, to date, there is no truly elegant solution for the problem. The many partial solutions can roughly be categorized into two groups: relative and absolute position measurements. Because of the lack of a single good method, developers of mobile robots usually combine two methods, one from each group. The two groups can be further divided into the following seven categories:

I: Relative position measurements (also called dead-reckoning)

1. Odometry
2. Inertial navigation

II: Absolute position measurements (reference-based systems)

3. Magnetic compasses
4. Active beacons
5. Global positioning systems
6. Landmark navigation
7. Model matching

2. REVIEW OF SENSORS AND TECHNIQUES

In this section we review some of the sensors and techniques used in mobile robot positioning. Examples of commercially available systems or well-documented research results will also be given.

2.1. Odometry

Odometry is the most widely used navigation method for mobile robot positioning; it provides good short-term accuracy, is inexpensive, and allows very high sampling rates. However, the fundamental idea of odometry is the integration of incremental motion information over time, which leads inevitably to the unbounded accumulation of errors. Specifically, orientation errors will cause large lateral position errors, which increase proportionally with the distance traveled by the robot. Despite these limitations, most researchers agree that odometry is an important part of a robot navigation system and that navigation tasks will be simplified if odometric accuracy can be improved. For example, Cox,4 Byrne et al.,5 and Chenavier and Crowley6 propose methods for fusing odometric data with absolute position measurements to obtain more reliable position estimation.

Odometry is based on simple equations,7 which hold true when wheel revolutions can be translated accurately into linear displacement relative to the floor. However, in the case of wheel slippage and some other more subtle causes, wheel rotations may not translate proportionally into linear motion. The resulting errors can be categorized into one of two groups: systematic errors and non-systematic errors.8 Systematic errors are those resulting from kinematic imperfections of the robot, for example, unequal wheel diameters or uncertainty about the exact wheelbase. Non-systematic errors are those that result from the interaction of the floor with the wheels, e.g., wheel slippage or bumps and cracks. Typically, when a mobile robot system is installed with a hybrid odometry/landmark navigation system, the density in which the landmarks must be placed in the environment is determined empirically and is based on the worst-case systematic errors. Such systems are likely to fail when one or more large non-systematic errors occur.

2.1.1. Measurement of Odometry Errors

One important but rarely addressed difficulty in mobile robotics is the quantitative measurement of odometry errors. Lack of well-defined measuring procedures for the quantification of odometry errors results in the poor calibration of mobile platforms and incomparable reports on odometric accuracy in scientific communications. To overcome this problem, Borenstein and Feng8 developed a method for quantitatively measuring systematic odometry errors and, to a limited degree, non-systematic odometry errors. This method, called University of Michigan Benchmark (UMBmark), requires that the mobile robot be programmed to follow a preprogrammed square path of 4 × 4 m side-length and four on-the-spot 90-degree turns. This run is to be performed five times in clockwise (cw) and five times in counter-clockwise (ccw) direction.

When the return position of the robot as computed by odometry is compared to the actual return position, an error plot similar to the one shown in Figure 1 will result. The results of Figure 1 can be interpreted as follows:

- The stopping positions after cw and ccw runs are clustered in two distinct areas.
- The distribution within the cw and ccw clusters is the result of non-systematic errors. However, Figure 1 shows that in an uncalibrated vehicle, traveling over a reasonably smooth concrete floor, the contribution of systematic errors to the total odometry error can be notably larger than the contribution of non-systematic errors.

The asymmetry of the centers of gravity in cw and ccw results from the dominance of two types of systematic errors, collectively called Type A and Type B.8 Type A errors are defined as orientation errors that reduce (or increase) the amount of rotation of the robot during the square path experiment in both cw and ccw direction. By contrast, Type B errors reduce (or increase) the amount of rotation when traveling in cw but have the opposite effect when traveling in ccw direction. One typical source for Type A errors is the uncertainty about the effective wheelbase; a typical source for Type B errors is unequal wheel diameters.
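Both error types enter through the basic odometry update itself. As a purely illustrative sketch of the "simple equations" referred to above (names and structure are ours, not code from ref. 7), one dead-reckoning step for a differential-drive vehicle can be written as:

```python
import math

def odometry_update(x, y, theta, ticks_left, ticks_right,
                    ticks_per_rev, wheel_diameter, wheelbase):
    """One dead-reckoning step for a differential-drive robot.

    Encoder ticks are converted to wheel travel; an error in
    wheel_diameter biases the travel (Type B), an error in
    wheelbase biases the heading change (Type A).
    """
    mm_per_tick = math.pi * wheel_diameter / ticks_per_rev
    d_left = ticks_left * mm_per_tick     # travel of left wheel [mm]
    d_right = ticks_right * mm_per_tick   # travel of right wheel [mm]

    d_center = (d_left + d_right) / 2.0        # translation of the midpoint
    d_theta = (d_right - d_left) / wheelbase   # change of heading [rad]

    # Update the pose; the small-step approximation used by most implementations.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta = (theta + d_theta) % (2.0 * math.pi)
    return x, y, theta
```

Note how the wheelbase appears only in the heading update and the wheel diameter only in the distance conversion, which is why errors in these two parameters produce the Type A and Type B behavior described above.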

Figure 1. Typical results from running UMBmark (a square path run five times in cw and five times in ccw directions) with an uncalibrated TRC LabMate robot.

Figure 2. Position errors after completion of the bi-directional square-path experiment (4 × 4 m).

After conducting the UMBmark experiment a single numeric value that expresses the odometric accuracy (with respect to systematic errors) of the tested vehicle can be found from8:

E_{max,syst} = max(r_{c.g.,cw}; r_{c.g.,ccw})     (1)

where

r_{c.g.,cw} = \sqrt{(x_{c.g.,cw})^2 + (y_{c.g.,cw})^2}

and

r_{c.g.,ccw} = \sqrt{(x_{c.g.,ccw})^2 + (y_{c.g.,ccw})^2}.
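In code, Eq. (1) is a comparison of the two cluster centers of gravity. The sketch below assumes the five cw and five ccw return-position errors have already been measured; it only illustrates the definition and is not the UMBmark software of ref. 8.

```python
import math

def e_max_syst(cw_errors, ccw_errors):
    """Eq. (1): odometric accuracy measure from UMBmark return-position errors.

    cw_errors / ccw_errors are lists of (x, y) offsets [mm] between the
    odometry-computed and the actual return position of each run.
    """
    def center_of_gravity(errors):
        n = len(errors)
        x_cg = sum(e[0] for e in errors) / n
        y_cg = sum(e[1] for e in errors) / n
        return math.hypot(x_cg, y_cg)   # r_c.g. = sqrt(x_cg^2 + y_cg^2)

    return max(center_of_gravity(cw_errors), center_of_gravity(ccw_errors))
```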
Based on the UMBmark test, Borenstein and Feng8 developed a calibration procedure for reducing systematic odometry errors in differential drive vehicles. In this procedure the UMBmark test is performed five times in cw and ccw direction to find x_{c.g.,cw} and x_{c.g.,ccw}. From a set of equations defined in ref. 8, two calibration constants are found that can be included in the basic odometry computation of the robot. Application of this procedure to several differential-drive platforms resulted consistently in a 10- to 20-fold reduction in systematic errors. Figure 2 shows the result of a typical calibration session, E_{max,syst}. The results for many calibration sessions with TRC's LabMate robots averaged E_{max,syst} = 330 mm for uncalibrated vehicles and E_{max,syst} = 24 mm after calibration.

2.1.2. Measurement of Non-Systematic Errors

Borenstein and Feng10 also proposed a method for measuring non-systematic errors. This method, called extended UMBmark, can be used for comparison of different robots under similar conditions, although the measurement of non-systematic errors is less useful because it depends strongly on the floor characteristics. However, using a set of well-defined floor irregularities and the UMBmark procedure, the susceptibility of a differential-drive platform to non-systematic errors can be expressed. Experimental results from six different vehicles, which were tested for their susceptibility to non-systematic error by means of the extended UMBmark test, are presented in ref. 10.

Borenstein11 developed a method for detecting and rejecting non-systematic odometry errors in mobile robots. With this method, two collaborating platforms continuously and mutually correct their non-systematic (and certain systematic) odometry errors, even while both platforms are in motion.

Figure 3. The OmniMate is a commercially available fully omni-directional platform. The two linked trucks mutually correct their odometry errors.

A video clip entitled CLAPPER showing this system in operation is included in refs. 12 and 20. A commercial version of this robot, shown in Figure 3, is now available under the name OmniMate.13 Because of its internal odometry error correction, the OmniMate is almost completely insensitive to bumps, cracks, or other irregularities on the floor.9

2.2. Inertial Navigation

Inertial navigation uses gyroscopes and accelerometers to measure rate of rotation and acceleration, respectively. Measurements are integrated once (or twice, for accelerometers) to yield position. Inertial navigation systems have the advantage that they are self-contained, that is, they don't need external references. However, inertial sensor data drift with time because of the need to integrate rate data to yield position; any small constant error increases without bound after integration. Inertial sensors are thus mostly unsuitable for accurate positioning over an extended period of time.

2.2.1. Accelerometers

Test results from the use of accelerometers for mobile robot navigation have been generally poor. In an informal study at the University of Michigan it was found that there is a very poor signal-to-noise ratio at lower accelerations (i.e., during low-speed turns). Accelerometers also suffer from extensive drift, and they are sensitive to uneven ground because any disturbance from a perfectly horizontal position will cause the sensor to detect a component of the gravitational acceleration g. One low-cost inertial navigation system aimed at overcoming the latter problem included a tilt sensor.14,15 The tilt information provided by the tilt sensor was supplied to the accelerometer to cancel the gravity component projecting on each axis of the accelerometer. Nonetheless, the results obtained from the tilt-compensated system indicate a position drift rate of 1 to 8 cm/s (0.4 to 3.1 in/s), depending on the frequency of acceleration changes. This is an unacceptable error rate for most mobile robot applications.
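The unbounded growth of integrated sensor errors is easy to reproduce numerically: a small constant accelerometer bias, integrated twice, yields a position error that grows with the square of elapsed time. The following sketch uses illustrative numbers only.

```python
def position_drift(bias, dt, duration):
    """Double-integrate a constant accelerometer bias [m/s^2]
    and return the resulting position error [m] after `duration` seconds."""
    velocity_error, position_error, t = 0.0, 0.0, 0.0
    while t < duration:
        velocity_error += bias * dt             # first integration
        position_error += velocity_error * dt   # second integration
        t += dt
    return position_error

# A bias of only 0.01 m/s^2 already drifts by roughly 18 m after 60 s:
print(position_drift(bias=0.01, dt=0.01, duration=60.0))
```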
2.2.2. Gyroscopes
Gyroscopes (also known as rate gyros or just "gyros") are of particular importance to mobile robot positioning because they can help compensate for the foremost weakness of odometry: in an odometry-based positioning method, any small momentary orientation error will cause a constantly growing lateral position error. For this reason it would be of great benefit if orientation errors could be detected and corrected immediately.

Until recently, highly accurate gyros were too expensive for mobile robot applications. For example, a high-quality inertial navigation system (INS) such as those found in a commercial airliner would have a typical drift of about 1850 meters (1 nautical mile) per hour of operation, and cost between $50K and $70K.5

High-end INS packages used in ground applications have shown performance of better than 0.1% of distance traveled, but cost in the neighborhood of $100K to $200K, while lower performance versions (i.e., 1% of distance traveled) run between $20K and $50K.16

However, very recently fiber-optic gyros (also called laser gyros), which are known to be very accurate, have fallen dramatically in price and have become a very attractive solution for mobile robot navigation.

One commercially available laser gyro is the Autogyro Navigator from Andrew Corp.,17 shown in Figure 4. It is a single-axis interferometric fiber-optic gyroscope (see ref. 3 for technical details) based on polarization-maintaining fiber and precision fiber-optic gyroscope technology. Technical specifications for Andrew's most recent model, the Autogyro Navigator, are shown in Table I. This laser gyro costs under $1,000 and is well suited for mobile robot navigation.

Figure 4. The Andrew AUTOGYRO Navigator (courtesy of Andrew Corporation17).

Table I. Selected specifications for the Andrew Autogyro Navigator (courtesy of Andrew Corporation17).

Parameter | Value | Units
Input rotation rate | ±100 | °/s
Instantaneous bandwidth | 100 | Hz
Bias drift (at stabilized temperature), RMS | 0.005 (18) | °/s (°/hr)
Temperature range, operating | -40 to +75 | °C
Temperature range, storage | -50 to +85 | °C
Warm-up time | 1 | s
Size (excluding connector) | 115 × 90 × 41 | mm
 | 4.5 × 3.5 × 1.6 | in
Weight (total) | 0.25 | kg
 | 0.55 | lb
Power, analog | <2 | W
Power, digital | <3 | W

2.3. Magnetic Compasses

Vehicle heading is the most significant of the navigation parameters (x, y, and θ) in terms of its influence on accumulated dead-reckoning errors. For this reason, sensors that provide a measure of absolute heading are extremely important in solving the navigation needs of autonomous platforms. The magnetic compass is such a sensor. One disadvantage of any magnetic compass, however, is that the earth's magnetic field is often distorted near power lines or steel structures.5 This makes the straightforward use of geomagnetic sensors difficult for indoor applications.

Based on a variety of physical effects related to the earth's magnetic field, different sensor systems are available:

- Mechanical magnetic compasses.
- Fluxgate compasses.
- Hall-effect compasses.
- Magnetoresistive compasses.
- Magnetoelastic compasses.

The compass best suited for use with mobile robot applications is the fluxgate compass. When maintained in a level attitude, the fluxgate compass will measure the horizontal component of the earth's magnetic field, with the decided advantages of low power consumption, no moving parts, intolerance to shock and vibration, rapid start-up, and relatively low cost. If the vehicle is expected to operate over uneven terrain, the sensor coil should be gimbal-mounted and mechanically dampened to prevent serious errors introduced by the vertical component of the geomagnetic field.
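For a level two-axis fluxgate, heading follows directly from the two horizontal field components. The sketch below is a generic illustration (not KVH firmware); it ignores magnetic declination and assumes the sensor is level, which is precisely why gimballing and damping matter on uneven terrain.

```python
import math

def heading_from_fluxgate(b_x, b_y):
    """Heading [deg, 0..360) from the horizontal components of the earth's
    magnetic field measured by a level two-axis fluxgate.  Here b_x is taken
    along the vehicle's longitudinal axis and b_y along its lateral axis."""
    heading = math.degrees(math.atan2(b_y, b_x))
    return heading % 360.0

# Example: equal x and y components correspond to a 45 degree heading.
print(heading_from_fluxgate(0.2, 0.2))
```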

Example: KVH Fluxgate Compasses

KVH Industries, Inc., Middletown, RI, offers a complete line of fluxgate compasses and related accessories, ranging from inexpensive units targeted for the individual consumer up through sophisticated systems intended for military applications.18 The C100 COMPASS ENGINE shown in Figure 5 is a versatile, low-cost (less than $700) developer's kit that includes a microprocessor-controlled stand-alone fluxgate sensor subsystem based on a two-axis toroidal ring-core sensor.

Two different sensor options are offered with the C100: (1) the SE-25 sensor, recommended for applications with a tilt range of ±16 degrees, and (2) the SE-10 sensor, for applications anticipating a tilt angle of up to ±45 degrees.

The SE-25 sensor provides internal gimballing by floating the sensor coil in an inert fluid inside the lexan housing. The SE-10 sensor provides a 2 degree-of-freedom (DOF) pendulous gimbal in addition to the internal fluid suspension. The SE-25 sensor mounts on top of the sensor PC board, while the SE-10 is suspended beneath it. The sensor PC board can be separated as much as 122 cm (48 in) from the detachable electronics PC board with an optional cable. Additional technical specifications are given in Table II.

Figure 5. The C-100 fluxgate compass engine. (Reproduced from ref. 18 with permission from KVH Industries, Inc.)

Table II. Technical specifications for the KVH C-100 fluxgate compass (courtesy of KVH Industries18).

Parameter | Value | Units
Resolution | ±0.1 | °
Accuracy | ±0.5 | °
Repeatability | ±0.2 | °
Size | 46 × 110 | mm
 | 1.8 × 4.5 | in
Weight (total) | 62 | g
 | 2.25 | oz
Power: current drain | 0.04 | A
Power: supply voltage | 8-18 or 18-28 | V

2.4. Active Beacons

Active beacon navigation systems are the most common navigation aids on ships and airplanes, as well as on commercial mobile robot systems. Active beacons can be detected reliably and provide accurate positioning information with minimal processing. As a result, this approach allows high sampling rates and yields high reliability, but it does also incur high cost in installation and maintenance. Accurate mounting of beacons is required for accurate positioning. Two different types of active beacon systems can be distinguished: trilateration and triangulation.

2.4.1. Trilateration

Trilateration is the determination of a vehicle's position based on distance measurements to known beacon sources. In trilateration navigation systems there are usually three or more transmitters mounted at known locations in the environment, and one receiver on board the robot. Conversely, there may be one transmitter on board, and the receivers are mounted on the walls. Using time-of-flight information, the system computes the distance between the stationary transmitters and the onboard receiver. Global Positioning Systems (GPSs), discussed in section 2.5, are an example of trilateration.
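In two dimensions the trilateration computation reduces to a small linear system once one range equation is subtracted from the others. The sketch below assumes three stationary transmitters at known coordinates and noise-free ranges; practical systems (including GPS) use more measurements and a least-squares or filtered solution.

```python
def trilaterate_2d(beacons, ranges):
    """Position (x, y) from distances to three beacons at known positions.

    beacons = [(x1, y1), (x2, y2), (x3, y3)], ranges = [r1, r2, r3].
    Subtracting the first range equation from the other two removes the
    quadratic terms, leaving a 2x2 linear system (solved by Cramer's rule).
    """
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = ranges

    a11, a12 = 2.0 * (x2 - x1), 2.0 * (y2 - y1)
    a21, a22 = 2.0 * (x3 - x1), 2.0 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2

    det = a11 * a22 - a12 * a21          # zero if the beacons are collinear
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y

# Robot at (2, 3) with beacons at three corners of a 10 m square:
print(trilaterate_2d([(0, 0), (10, 0), (0, 10)],
                     [(2**2 + 3**2) ** 0.5, (8**2 + 3**2) ** 0.5, (2**2 + 7**2) ** 0.5]))
```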
2.4.2. Triangulation

In this configuration there are three or more active transmitters mounted at known locations, as shown in Figure 6. A rotating sensor on board the robot registers the angles λ1, λ2, and λ3 at which it "sees" the transmitter beacons relative to the vehicle's longitudinal axis. From these three measurements the unknown x- and y-coordinates and the unknown vehicle orientation can be computed. One problem with this configuration is that in order to be seen at distances of, say, 20 m or more, the active beacons must be focused within a cone-shaped propagation pattern. As a result, beacons are not visible in many areas, a problem that is particularly grave because at least three beacons must be visible for triangulation.

Figure 6. The basic triangulation problem: a rotating sensor head measures the three angles λ1, λ2, and λ3 between the vehicle's longitudinal axis and the three sources S1, S2, and S3.

Cohen and Koss19 performed a detailed analysis on three-point triangulation algorithms and ran computer simulations to verify the performance of different algorithms. The results are summarized as follows:

- The Geometric Triangulation method works consistently only when the robot is within the triangle formed by the three beacons. There are areas outside the beacon triangle where the geometric approach works, but these areas are difficult to determine and are highly dependent on how the angles are defined.
- The Geometric Circle Intersection method has large errors when the three beacons and the robot all lie on, or close to, the same circle.
- The Newton-Raphson method fails when the initial guess of the robot's position and orientation is beyond a certain bound.
- The heading of at least two of the beacons was required to be greater than 90 degrees. The angular separation between any pair of beacons was required to be greater than 45 degrees.

In summary, it appears that none of the above methods alone is always suitable, but an intelligent combination of two or more methods helps overcome the individual weaknesses.
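To make the Newton-Raphson variant mentioned above concrete, the following sketch iteratively refines an initial pose guess so that the predicted bearings to the three beacons match the measured angles λi. It is a generic illustration of the idea analyzed in ref. 19, not the authors' implementation, and, as noted above, it can diverge if the initial guess is too far off.

```python
import numpy as np

def triangulate_newton(beacons, lambdas, pose0, iterations=20):
    """Bearings-only triangulation by Newton-Raphson iteration.

    beacons : three known beacon positions [(x, y), ...]
    lambdas : measured angles [rad] to each beacon, relative to the
              vehicle's longitudinal axis
    pose0   : initial guess (x, y, theta)
    """
    pose = np.array(pose0, dtype=float)
    for _ in range(iterations):
        residuals, jacobian = [], []
        for (bx, by), lam in zip(beacons, lambdas):
            dx, dy = bx - pose[0], by - pose[1]
            d2 = dx * dx + dy * dy
            predicted = np.arctan2(dy, dx) - pose[2]
            # Wrap the bearing error into (-pi, pi].
            err = np.arctan2(np.sin(predicted - lam), np.cos(predicted - lam))
            residuals.append(err)
            jacobian.append([dy / d2, -dx / d2, -1.0])   # d(err)/d(x, y, theta)
        step = np.linalg.solve(np.array(jacobian), -np.array(residuals))
        pose += step
        if np.max(np.abs(step)) < 1e-10:
            break
    return pose
```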

2.4.3. Specific Triangulation Systems

Because of their technical maturity and commercial availability, optical triangulation systems are widely used in mobile robotics applications. Typically these systems involve some type of scanning mechanism operating in conjunction with fixed-location references strategically placed at predefined locations within the operating environment. A number of variations on this theme are seen in practice3: (a) scanning detectors with fixed active beacon emitters, (b) scanning emitter/detectors with passive retroreflective targets, (c) scanning emitter/detectors with active transponder targets, and (d) rotating emitters with fixed detector targets.

Example: MTI Research CONAC™

A similar type system using a predefined network of fixed-location detectors is made by MTI Research, Inc., Chelmsford, MA.19a MTI's Computerized Opto-electronic Navigation and Control (CONAC™) is a navigational referencing system employing a vehicle-mounted laser unit called STRuctured Opto-electronic Acquisition Beacon (STROAB), as shown in Figure 7. The scanning laser beam is spread vertically to eliminate critical alignment, allowing the receivers, called Networked Opto-electronic Acquisition Datums (NOADs) (see Fig. 8), to be mounted at arbitrary heights as illustrated in Figure 9. Detection of incident illumination by a NOAD triggers a response over the network to a host PC, which in turn calculates the implied angles α1 and α2. An index sensor built into the STROAB generates a rotation reference pulse to facilitate heading measurement. Indoor accuracy is on the order of centimeters or millimeters, and better than 0.1° for heading.

Figure 7. A single STROAB beams a vertically spread laser signal while rotating at 3,000 rpm (courtesy of MTI Research Inc.19a).

Figure 8. Stationary NOADs are located at known positions; at least two NOADs are networked and connected to a PC (courtesy of MTI Research Inc.19a).

Figure 9. The CONAC™ system employs an onboard, rapidly rotating and vertically spread laser beam, which sequentially contacts the networked detectors (courtesy of MTI Research Inc.19a).

The reference NOADs are installed at known locations throughout the area of interest. STROAB acquisition range is sufficient to allow three NOADs to cover an area of 33,000 m² if no interfering structures block the view. Additional NOADs may be employed to increase fault tolerance and minimize ambiguities when two or more robots are operating in close proximity. The optimal set of three NOADs is dynamically selected by the host PC, based on the current location of the robot and any predefined visual barriers. A short video clip showing the CONAC system in operation is included in ref. 20.

2.5. Global Positioning Systems

The Global Positioning System (GPS) is a revolutionary technology for outdoor navigation. GPS was developed as a Joint Services Program by the Department of Defense. The system comprises 24 satellites (including three spares) that transmit encoded RF signals. Using advanced trilateration methods, ground-based receivers can compute their position by measuring the travel time of the satellites' RF signals, which include information about the satellites' momentary location. Knowing the exact distance from the ground receiver to three satellites theoretically allows for calculation of receiver latitude, longitude, and altitude.

The US government deliberately applies small errors in timing and satellite position to prevent a hostile nation from using GPS in support of precision weapons delivery. This intentional degradation in positional accuracy to approximately 100 m (328 ft) worst case is termed selective availability (SA).21 Selective availability has been on continuously (with a few exceptions) since the end of Operation Desert Storm. It was turned off during the war from August 1990 until July 1991 to improve the accuracy of commercial hand-held GPS receivers used by coalition ground forces. At another occasion (October 1992) SA was also turned off for a brief period while the Air Force was conducting tests.

Figure 10. Typical GPS static position error with SA "On" (courtesy of Byrne, Sandia National Laboratories22).

Byrne22 conducted tests at that time to compare the accuracy of GPS with SA turned on and off. The static measurements of the GPS error as a function of time (shown in Fig. 10) were taken before the October 1992 test, i.e., with SA on (note the slowly varying error in Fig. 10, which is caused by SA). By contrast, Figure 11 shows measurements from the October 1992 period when SA was briefly off.

The effect of SA can be essentially eliminated through use of a practice known as differential GPS (DGPS). The concept is based on the premise that a second GPS receiver in fairly close proximity (i.e., within 10 km, which is 6.2 miles) to the first will experience basically the same error effects when viewing the same reference satellites. If this second receiver is fixed at a precisely surveyed location, its calculated solution can be compared to the known position to generate a composite error vector representative of prevailing conditions in that immediate locale. This differential correction can then be passed to the first receiver to null out the unwanted effects, effectively reducing position error for commercial systems.

Many commercial GPS receivers are available with differential capability. This, together with the service of some local radio stations that make differential corrections available to subscribers of the service,23 makes the use of DGPS possible for many applications. Typical DGPS accuracies are around 4 to 6 m (13 to 20 ft), with better performance seen as the distance between the mobile receivers and the fixed reference station is decreased. For example, the Coast Guard is in the process of implementing differential GPS in all major U.S. harbors, with an expected accuracy of around 1 m (3.3 ft).24
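The differential correction itself amounts to a vector subtraction: the reference receiver compares its computed fix with its surveyed position and broadcasts the difference, which nearby mobile receivers remove from their own fixes. A minimal sketch in a local east/north frame (illustrative names only, not any vendor's protocol):

```python
def dgps_correct(mobile_fix, reference_fix, reference_surveyed):
    """Apply a differential GPS correction in a local (east, north) frame [m].

    reference_fix      : position computed by the fixed reference receiver
    reference_surveyed : its precisely known, surveyed position
    mobile_fix         : raw position computed by the nearby mobile receiver
    """
    error_east = reference_fix[0] - reference_surveyed[0]
    error_north = reference_fix[1] - reference_surveyed[1]
    # The mobile receiver experiences essentially the same error, so subtract it.
    return (mobile_fix[0] - error_east, mobile_fix[1] - error_north)

# Reference receiver reads 12 m east / -8 m north of its true position,
# so the same offset is removed from the mobile receiver's fix:
print(dgps_correct((1520.0, 3740.0), (12.0, -8.0), (0.0, 0.0)))
```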

Figure 11. Typical GPS static position error with SA "Off" (courtesy of Byrne, Sandia National Laboratories22).

A differential GPS system already in operation at O'Hare International Airport in Chicago has demonstrated that aircraft and service vehicles can be located to 1 m (3.3 ft) in real-time, while moving. Surveyors use differential GPS to achieve centimeter accuracy, but this practice requires significant postprocessing of the collected data.22

In 1992 and 1993 Raymond H. Byrne22 at the Advanced Vehicle Development Department, Sandia National Laboratories, Albuquerque, New Mexico conducted a series of in-depth comparison tests with five different GPS receivers. Testing focused on receiver sensitivity, static accuracy, dynamic accuracy, number of satellites tracked, and time-to-first-fix. The more important parameters evaluated in this test, the static and dynamic accuracy, are summarized below for the Magnavox GPS Engine, a representative of the five receivers tested.

Position Accuracy

Static position accuracy was measured by placing the GPS receivers at a surveyed location and taking data for approximately 24 hours. The plots of the static position error for the Magnavox GPS Engine were shown in Figure 10. The mean and standard deviation (σ) of the position error in this test was 22 m (72 ft) and 16 m (53 ft), respectively.

Fractional Availability of Signals

The dynamic test data was obtained by driving an instrumented van over different types of terrain. The various routes were chosen so that the GPS receivers would be subjected to a wide variety of obstructions. These include buildings, underpasses, signs, and foliage for the city driving. Rock cliffs and foliage were typical for the mountain and canyon driving. Large trucks, underpasses, highway signs, buildings, foliage, and small canyons were found on the interstate and rural highway driving routes.

Figure 12. Summary of dynamic environment performance for the Magnavox GPS Engine (courtesy of Byrne, Sandia National Laboratories22).

The results of the dynamic testing are shown in Figure 12; the percentages have the following meaning:

No Navigation: Not enough satellites were in sight to permit positioning.
2-D Navigation: Enough satellites were in sight to determine the x- and y-coordinates of the vehicle.
3-D Navigation: Optimal data available. System could determine x-, y-, and z-coordinates of the vehicle.

In summary one can conclude that GPS is a tremendously powerful tool for many outdoor navigation tasks. The problems associated with using GPS for mobile robot navigation are: (a) periodic signal blockage due to foliage and hilly terrain, (b) multi-path interference, and (c) insufficient position accuracy for primary (stand-alone) navigation systems.

2.6. Landmark Navigation

Landmarks are distinct features that a robot can recognize from its sensory input. Landmarks can be geometric shapes (e.g., rectangles, lines, circles), and they may include additional information (e.g., in the form of bar-codes). In general, landmarks have a fixed and known position, relative to which a robot can localize itself. Landmarks are carefully chosen to be easy to identify; for example, there must be sufficient contrast relative to the background. Before a robot can use landmarks for navigation, the characteristics of the landmarks must be known and stored in the robot's memory. The main task in localization is then to recognize the landmarks reliably and to calculate the robot's position.

To simplify the problem of landmark acquisition it is often assumed that the current robot position and orientation are known approximately, so that the robot only needs to look for landmarks in a limited area. For this reason good odometry accuracy is a prerequisite for successful landmark detection.

Some approaches fall between landmark and map-based positioning (see section 2.7). They use sensors to sense the environment, and then extract distinct structures that serve as landmarks for navigation in the future.

Our discussion in this section addresses two types of landmarks: "artificial" and "natural" landmarks. It is important to bear in mind that "natural" landmarks work best in highly structured environments such as corridors, manufacturing floors, or hospitals. Indeed, one may argue that "natural" landmarks work best when they are actually man-made (as is the case in highly structured environments). For this reason, we shall define the terms "natural landmarks" and "artificial landmarks" as follows: natural landmarks are those objects or features that are already in the environment and have a function other than robot navigation; artificial landmarks are specially designed objects or markers that need to be placed in the environment with the sole purpose of enabling robot navigation.

Figure 13. The ARK's natural landmark navigation system uses a CCD camera and a time-of-flight laser rangefinder to identify landmarks and to measure the distance between landmark and robot (courtesy of Atomic Energy of Canada Ltd.).

2.6.1. Natural Landmarks

The main problem in natural landmark navigation is to detect and match characteristic features from sensory inputs. The sensor of choice for this task is computer vision. Most computer vision-based natural landmarks are long vertical edges, such as doors, wall junctions, and ceiling lights (see TRC video clip in ref. 20).

When range sensors are used for natural landmark navigation, distinct signatures, such as those of a corner or an edge, or of long straight walls, are good feature candidates. The selection of features is important since it will determine the complexity in feature description, detection, and matching. Proper selection of features will also reduce the chances for ambiguity and increase positioning accuracy.

Example: AECL's ARK Project

One system that uses natural landmarks was developed jointly by the Atomic Energy of Canada Ltd. (AECL) and Ontario Hydro Technologies with support from the University of Toronto and York University.25 This project aimed at developing a sophisticated robot system called the Autonomous Robot for a Known Environment (ARK).

The navigation module of the ARK robot is shown in Figure 13. The module consists of a custom-made pan-and-tilt table, a CCD camera, and an eye-safe IR spot laser rangefinder. Two VME-based cards, a single-board computer, and a microcontroller provide processing power. The navigation module is used to periodically correct the robot's accumulating odometry errors. The system uses natural landmarks such as alphanumeric signs, semi-permanent structures, or doorways. The only criterion used is that the landmark be distinguishable from the background scene by color or contrast.

The ARK navigation module uses an interesting hybrid approach: the system stores (learns) landmarks by generating a three-dimensional "gray-level surface" from a single training image obtained from the CCD camera. A coarse, registered range scan of the same field of view is performed by the laser rangefinder, giving depths for each pixel in the gray-level surface. Both procedures are performed from a known robot position. Later, during operation, when the robot is at an approximately known (from odometry) position within a couple of meters of the training position, the vision system searches for those landmarks that are expected to be visible from the robot's momentary position. Once a suitable landmark is found, the projected appearance of the landmark is computed. This expected appearance is then used in a coarse-to-fine normalized correlation-based matching algorithm that yields the robot's relative distance and bearing with regard to that landmark. With this procedure the ARK can identify different natural landmarks and measure its position relative to the landmarks. A video clip showing the ARK system in operation is included in ref. 20.

2.6.2. Artificial Landmarks

Detection is much easier with artificial landmarks,26 which are designed for optimal contrast. In addition, the exact size and shape of artificial landmarks are known in advance. Size and shape can yield a wealth of geometric information when transformed under the perspective projection.

Researchers have used different kinds of patterns or marks, and the geometry of the method and the associated techniques for position estimation vary accordingly.27 Many artificial landmark positioning systems are based on computer vision. We will not discuss these systems in detail, but will mention some of the typical landmarks used with computer vision.

Fukui28 used a diamond-shaped landmark and applied a least-squares method to find line segments in the image plane. Other systems use reflective material patterns and strobed light to ease the segmentation and parameter extraction.29,30 There are also systems that use active (i.e., LED) patterns to achieve the same effect.31

The accuracy achieved by the above methods depends on the accuracy with which the geometric parameters of the landmark images are extracted from the image plane, which in turn depends on the relative position and angle between the robot and the landmark. In general, the accuracy decreases with the increase in relative distance. Normally there is a range of relative angles in which good accuracy can be achieved, while accuracy drops significantly once the relative angle moves out of the "good" region.

There is also a variety of landmarks used in conjunction with non-vision sensors. Most often used are bar-coded reflectors for laser scanners. For example, work on the Mobile Detection Assessment and Response System (MDARS)3,32,33 uses retro-reflectors, and so does the commercially available system from Caterpillar on their Self-Guided Vehicle.5,34 The shape of these landmarks is usually unimportant. By contrast, a unique approach taken by Feng et al.35 used a circular landmark and applied an optical Hough transform to extract the parameters of the ellipse on the image plane in real time.

We summarize the characteristics of landmark-based navigation as follows:

- Natural landmarks offer flexibility and require no modifications to the environment.
- Artificial landmarks are inexpensive and can have additional information encoded as patterns or shapes.
- The maximal effective distance between robot and landmark is substantially shorter than in active beacon systems.
- The positioning accuracy depends on the distance and angle between the robot and the landmark. Landmark navigation is rather inaccurate when the robot is further away from the landmark. A higher degree of accuracy is obtained only when the robot is near a landmark.
- Substantially more processing is necessary than with active beacon systems. In many cases onboard computers cannot process natural landmark algorithms quickly enough for real-time motion.
- Ambient conditions, such as lighting, can be problematic; in marginal visibility landmarks may not be recognized at all, or other objects in the environment with similar features can be mistaken for a legitimate landmark. This is a serious problem because it may result in a completely erroneous determination of the robot's position.
- Landmarks must be available in the work environment around the robot.
- Landmark-based navigation requires an approximate starting location so that the robot knows where to look for landmarks. If the starting position is not known, the robot has to conduct a time-consuming search process. This search process may go wrong and may yield an erroneous interpretation of the objects in the scene.
- A database of landmarks and their location in the environment must be maintained.
- There is only limited commercial support for natural landmark-based techniques.

2.7. Map-Based Positioning

Map-based positioning, also known as "map matching," is a technique in which the robot uses its sensors to create a map of its local environment. This local map is then compared to a global map previously stored in memory. If a match is found, then the robot can compute its actual position and orientation in the environment. The pre-stored map can be a CAD model of the environment, or it can be constructed from prior sensor data. Map-based positioning is advantageous because it uses the naturally occurring structure of typical indoor environments to derive position information without modifying the environment. Also, with some of the algorithms being developed, map-based positioning allows a robot to learn a new environment and to improve positioning accuracy through exploration. Disadvantages of map-based positioning are the stringent requirements for accuracy of the sensor map, and the requirement that there be enough stationary, easily distinguishable features that can be used for matching. Because of these challenging requirements, most work in map-based positioning is currently limited to laboratory settings and to relatively simple environments.

2.7.1. Map Building

There are two fundamentally different starting points for the map-based positioning process. Either there is a pre-existing map, or the robot has to build its own environment map. Rencken36 defined the map-building problem as the following: "Given the robot's position and a set of measurements, what are the sensors seeing?" Obviously, the map-building ability of a robot is closely related to its sensing capacity.

A problem related to map-building is "autonomous exploration."37 To build a map, the robot must explore its environment to map uncharted areas. Typically it is assumed that the robot begins its exploration without having any knowledge of the environment. Then, a certain motion strategy is followed that aims at maximizing the amount of charted area in the least amount of time. Such a motion strategy is called "exploration strategy," and it depends strongly on the kind of sensors used. One example for a simple exploration strategy based on a lidar sensor is given by Edlinger and Puttkamer.38

Many researchers believe that no single sensor modality alone can adequately capture all relevant features of a real environment. To overcome this problem, it is necessary to combine data from different sensor modalities, a process known as sensor fusion. For example, Buchberger et al.39 and Jörg40,41 developed a mechanism that utilizes heterogeneous information obtained from a laser-radar and a sonar system to construct reliable and complete world models. Sensor fusion is an active research area, and the literature is replete with techniques that combine various types of sensor data.

2.7.2. Map Matching

One of the most important and challenging aspects of map-based navigation is map matching, i.e., establishing the correspondence between a current local map and a stored global map.42 Work on map matching in the computer vision community is often focused on the general problem of matching an image of arbitrary position and orientation relative to a model (e.g., ref. 27). In general, matching is achieved by first extracting features, followed by determination of the correct correspondence between image and model features, usually by some form of constrained search.4 A discussion of two different classes of matching algorithms, "icon-based" and "feature-based," is given in ref. 43.

Example: University of Kaiserslautern's Angle Histogram

A simple but apparently very effective method for map-building was developed by Hinkel and Knieriemen44 from the University of Kaiserslautern, Germany. This method, called the "Angle Histogram," used an in-house developed lidar. A typical scan from this lidar is shown in Figure 14.

Figure 14. A typical scan of a room, produced by the University of Kaiserslautern's in-house developed lidar system (courtesy of the University of Kaiserslautern).

The angle histogram method works as follows. First, a 360-degree scan of the room is taken with the lidar, and the resulting "hits" are recorded in a map. Then the algorithm measures the relative angle δ between any two adjacent hits (see Fig. 15). After compensating for noise in the readings (caused by the inaccuracies in position between adjacent hits), the angle histogram shown in Figure 16(a) can be built. The uniform direction of the main walls is clearly visible as peaks in the angle histogram. Computing the histogram modulo π results in only two main peaks: one for each pair of parallel walls. This algorithm is very robust with regard to openings in the walls, such as doors and windows, or even cabinets lining the walls.

After computing the angle histogram, all angles of the hits can be normalized, resulting in the representation shown in Figure 16(b). After this transformation, two additional histograms, one for the x- and one for the y-direction, can be constructed. This time, peaks show the distance to the walls in the x- and y-direction.

Figure 15. Calculating angles for the angle histogram (courtesy of Weiss45).
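In outline, the angle histogram computation takes only a few steps: measure the direction between adjacent scan points, histogram those directions modulo π to find the dominant wall orientation, then rotate the scan and histogram the x and y coordinates to obtain the distances to the walls. The sketch below is our paraphrase of the published description, not the University of Kaiserslautern code.

```python
import math

def angle_histogram_pose(scan, angle_bins=180, xy_bin=0.05):
    """Estimate the dominant wall direction and wall distances from a 2-D scan.

    scan: list of (x, y) hits in the robot frame [m], ordered by bearing.
    Returns (dominant wall direction [rad], x-histogram peak, y-histogram peak).
    """
    # 1. Angle between each pair of adjacent hits, folded modulo pi.
    hist = [0] * angle_bins
    for (x0, y0), (x1, y1) in zip(scan, scan[1:]):
        delta = math.atan2(y1 - y0, x1 - x0) % math.pi
        hist[int(delta / math.pi * angle_bins) % angle_bins] += 1
    main_dir = (hist.index(max(hist)) + 0.5) * math.pi / angle_bins

    # 2. Rotate the scan so the main walls are axis-aligned,
    #    then histogram x and y to find the distances to the walls.
    c, s = math.cos(-main_dir), math.sin(-main_dir)
    xs = [c * x - s * y for x, y in scan]
    ys = [s * x + c * y for x, y in scan]

    def peak(values):
        counts = {}
        for v in values:
            counts[round(v / xy_bin)] = counts.get(round(v / xy_bin), 0) + 1
        return max(counts, key=counts.get) * xy_bin

    return main_dir, peak(xs), peak(ys)
```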

Figure 16. Readings from a rotating laser scanner generate the contours of a room. (a) The angle histogram allows the robot to determine its orientation relative to the walls; (b) after normalizing the orientation of the room relative to the robot, an x-y histogram can be built from the same data points (adapted from Hinkel and Knieriemen44 with permission).

Hinkel and Knieriemen's original algorithms have been further refined over the past years (e.g., Weiss et al.45), and the Angle Histogram method is now said to yield a reliable accuracy of 0.5°.

Example 2: Siemens' Roamer

Rencken36,37 at the Siemens Corporate Research and Development Center in Munich, Germany, has made substantial contributions toward solving the bootstrap problem resulting from the uncertainty in position and environment. This problem exists when a robot must move around in an unknown environment, with uncertainty in its odometry-derived position. For example, when building a map of the environment, all measurements are necessarily relative to the carrier of the sensors (i.e., the mobile robot). Yet, the position of the robot itself is not known exactly, because of the errors accumulating in odometry.

Rencken addresses the problem as follows: to represent features "seen" by its 24 ultrasonic sensors, the robot constructs hypotheses about these features. To account for the typically unreliable information from ultrasonic sensors, features can be classified as hypothetical, tentative, or confirmed. Once a feature is confirmed, it is used for constructing the map. Before the map can be updated, however, every new data point must be associated with either a plane, a corner, or an edge (or some variations of these features). Rencken devises a "hypothesis tree," which is a data structure that allows tracking of different hypotheses until a sufficient amount of data has been accumulated to make a final decision.

3. CONCLUSIONS

This article presented an overview of existing sensors and techniques for mobile robot positioning. We defined seven categories for these sensors and techniques, but obviously other ways for organizing the subject are possible. The foremost conclusion we could draw from reviewing the vast body of literature was that for indoor mobile robot navigation no single, elegant solution exists. For outdoor navigation GPS is promising to become the universal navigation solution for almost all automated vehicle systems.

Unfortunately, an indoor equivalent to GPS is difficult to realize because none of the currently existing RF-based trilateration systems work reliably indoors. If line-of-sight between stationary and onboard components can be maintained, then RF-based solutions can work indoors as well. However, in that case optical components using triangulation are usually less expensive. The market seems to have adopted this thought some time ago, as can be seen in the relatively large number of commercially available navigation systems that are based on optical triangulation (as discussed in section 2.4.3).

Despite the variety of powerful existing systems and techniques, we believe that mobile robotics is still in need of a particularly elegant and universal indoor navigation method. Such a method will likely bring scientific recognition and commercial success to its inventor.

Table A-1. Tabular comparison of positioning systems.

Odometry on TRC LabMate, after UMBmark calibration. Wheel-encoder resolution: 0.012 mm linear travel per pulse.
  Accuracy, position [mm]: 4 × 4 m square path; smooth floor: 30 mm; 10 bumps: 500 mm
  Accuracy, orientation [°]: smooth floor: 1-2°; with 10 bumps: 8°
  Effective range: unlimited. Ref.: 8

CLAPPER and OmniMate: dual-drive robot with internal correction of odometry. Made from two TRC LabMates, connected by compliant linkage. Uses 2 absolute rotary encoders, 1 linear encoder.
  Accuracy, position [mm]: 4 × 4 m square path; smooth floor: ~20 mm; 10 bumps: ~40 mm
  Accuracy, orientation [°]: smooth floor: <1°; 10 bumps: <1°
  Effective range: unlimited. Ref.: 9

Complete inertial navigation system including ENV-O5S Gyrostar solid state rate gyro, START solid state gyro, triaxial linear accelerometer, and 2 inclinometers.
  Accuracy, position: drift rate 1-8 cm/s, depending on frequency of acceleration change
  Accuracy, orientation: drift 0.25°/s; after compensation, drift 0.0125°/s
  Effective range: unlimited. Ref.: 14, 15

Andrew Autogyro and Autogyro Navigator. Quoted minimum detectable rotation rate: ±0.02°/s. Actual minimum detectable rate limited by deadband after A/D conversion: 0.0625°/s. Cost: $1,000.
  Accuracy, position: not applicable
  Accuracy, orientation: drift 0.005°/s
  Effective range: unlimited. Ref.: 17

KVH Fluxgate Compass. Includes microprocessor-controlled fluxgate sensor subsystem. Cost: <$700.
  Accuracy, position: not applicable
  Accuracy, orientation: resolution ±0.5°; accuracy ±0.5°; repeatability ±0.2°
  Effective range: unlimited. Ref.: 18

CONAC™ (computerized opto-electronic navigation and control). Measures both angle and distance to target. Cost: $6,000.
  Accuracy, position: indoor ±1.3 mm; outdoor ±5 mm
  Accuracy, orientation: indoor and outdoor ±0.05°
  Effective range: >100 m. Ref.: 19a

Global Positioning Systems (GPS). Cost: $1,000-$5,000.
  Accuracy, position: order of 20 m during motion; order of centimeters when standing for minutes
  Accuracy, orientation: not applicable
  Effective range: unlimited. Ref.: various vendors

Landmark Navigation.
  Accuracy, position: <5 cm
  Accuracy, orientation: <1 deg
  Effective range: ~10 m. Ref.: various research projects

Model Matching (map-based positioning).
  Accuracy, position: order of 1-10 cm
  Accuracy, orientation: order of 1-3 deg
  Effective range: ~10 m. Ref.: various research projects

Parts of this research were funded by a U.S. Department of Energy Grant DE-FG02-86NE37969. Parts of the text were adapted from refs. 3, 7, 20, and 22.

REFERENCES

1. J. L. Farrell, Integrated Aircraft Navigation, Academic Press, New York, 1976.
2. R. H. Battin, An Introduction to the Mathematics and Methods of Astrodynamics, AIAA Education Series, New York, 1987.
3. H. R. Everett, Sensors for Mobile Robots: Theory and Application, A. K. Peters, Ltd., Wellesley, MA, 1995.
4. I. J. Cox, "Blanche: An experiment in guidance and navigation of an autonomous mobile robot," IEEE Trans. Rob. Autom., 7(3), 193-204, 1991.
5. R. H. Byrne, P. R. Klarer, and J. B. Pletta, "Techniques for autonomous navigation," Sandia Report SAND92-0457, Sandia National Laboratories, Albuquerque, NM, 1992.
6. F. Chenavier and J. Crowley, "Position estimation for a mobile robot using vision and odometry," Proc. IEEE Int. Conf. Rob. Autom., Nice, France, 1992, pp. 2588-2593.
7. J. Borenstein, B. Everett, and L. Feng, Navigating Mobile Robots: Systems and Techniques, A. K. Peters, Ltd., Wellesley, MA, 1996.
8. J. Borenstein and L. Feng, "Measurement and correction of systematic odometry errors in mobile robots," IEEE J. Rob. Autom., 12(6), 869-880, 1996.
9. J. Borenstein and J. Evans, "The OmniMate mobile robot: Design, implementation, and experimental results," Proc. 1997 IEEE Int. Conf. Rob. Autom., Albuquerque, NM, Apr. 21-27, 1997.
10. J. Borenstein and L. Feng, "UMBmark: A method for measuring, comparing, and correcting dead-reckoning errors in mobile robots," Technical Report UM-MEAM-94-22, The University of Michigan, 1994.
11. J. Borenstein, "Internal correction of dead-reckoning errors with the compliant linkage vehicle," J. Rob. Syst., 12(4), 257-273, 1995.
12. J. Borenstein, "The CLAPPER: A dual-drive mobile robot with internal correction of dead-reckoning errors," Proc. IEEE Int. Conf. Rob. Autom. (Video Proceedings), Nagoya, Japan, 1995.
13. TRC: Transitions Research Corp. (now under new name: HelpMate Robotics Inc., HRI), Shelter Rock Lane, Danbury, CT 06810.
14. B. Barshan and H. F. Durrant-Whyte, "An inertial navigation system for a mobile robot," Proc. IEEE/RSJ Int. Conf. Intell. Rob. Syst., Yokohama, Japan, 1993, pp. 2243-2248.
15. B. Barshan and H. F. Durrant-Whyte, "Inertial navigation systems for mobile robots," IEEE Trans. Rob. Autom., 11(3), 328-342, 1995.
16. T. Dahlin and D. Krantz, "Low-cost, medium-accuracy land navigation system," Sensors, Feb., 26-34, 1988.
17. Andrew Corporation, 10500 W. 153rd Street, Orland Park, IL 60462.
18. KVH: KVH Industries, C100 Compass Engine Product Literature, 110 Enterprise Center, Middletown, RI 02840.
19. C. Cohen and F. Koss, "A comprehensive study of three object triangulation," Proc. SPIE Conf. Mobile Robots, Boston, MA, 1992, pp. 95-106.
19a. MTI: MTI Research, Inc., Chelmsford, MA.
20. J. Borenstein, B. Everett, and L. Feng, Navigating Mobile Robots: Systems and Techniques (CD-ROM Edition), A. K. Peters, Ltd., Wellesley, MA, 1996.
21. B. M. Gothard, R. D. Etersky, and R. E. Ewing, "Lessons learned on a low-cost global navigation system for the surrogate semi-autonomous vehicle," SPIE Proc., Vol. 2058, pp. 258-269, 1993.
22. R. H. Byrne, "Global positioning system receiver evaluation results," Sandia Report SAND93-0827, Sandia National Laboratories, Albuquerque, NM, 1993.
23. GPS Report, Phillips Business Information, Potomac, MD, 1992.
24. I. A. Getting, "The global positioning system," IEEE Spectrum, December, 36-47, 1993.
25. M. Jenkin, et al., "Global navigation for ARK," Proc. IEEE/RSJ Int. Conf. Intell. Rob. Syst., Yokohama, Japan, 1993, pp. 2165-2171.
26. S. Atiya and G. Hager, "Real-time vision-based robot localization," IEEE Trans. Rob. Autom., 9(6), 785-800, 1993.
27. R. Talluri and J. Aggarwal, "Position estimation techniques for an autonomous mobile robot: A review," in Handbook of Pattern Recognition and Computer Vision, World Scientific, Singapore, 1993, Chapter 4.4, pp. 769-801.
28. I. Fukui, "TV image processing to determine the position of a robot vehicle," Pattern Recognit., 14, 101-109, 1981.
29. B. Lapin, "Adaptive position estimation for an automated guided vehicle," Proc. SPIE Conf. Mobile Rob., Boston, MA, Nov. 18-20, pp. 82-94.
30. Y. Mesaki and I. Masuda, "A new mobile robot guidance system using optical reflectors," Proc. IEEE/RSJ Int. Conf. Intell. Rob. Syst., Raleigh, NC, 1992, pp. 628-635.
31. S. Fleury and T. Baron, "Absolute external mobile robot localization using a single image," Proc. SPIE Conf. Mobile Rob., Boston, MA, 1992, pp. 131-143.
32. H. R. Everett, et al., "Real-world issues in warehouse navigation," Proc. SPIE Conf. Mobile Rob., Boston, MA, 1994, Vol. 2352.
33. C. DeCorte, "Robots train for security surveillance," Access Control, June, 37-38, 1994.
34. L. Gould, "Is off-wire guidance alive or dead?" Managing Autom., May, 38-40, 1990.
35. L. Feng, Y. Fainman, and Y. Koren, "Estimate of absolute position of mobile systems by optoelectronic processor," IEEE Trans. Man Mach. Cybern., 22(5), 954-963, 1992.
36. W. D. Rencken, "Concurrent localization and map building for mobile robots using ultrasonic sensors," Proc. IEEE/RSJ Int. Conf. Intell. Rob. Syst., Yokohama, Japan, 1993, pp. 2192-2197.
37. W. D. Rencken, "Autonomous sonar navigation in indoor, unknown, and unstructured environments," Proc. IEEE/RSJ Int. Conf. Intell. Rob. Syst., Munich, Germany, 1994, pp. 127-134.
38. T. Edlinger and E. Puttkamer, "Exploration of an indoor environment by an autonomous mobile robot," Proc. IEEE/RSJ Int. Conf. Intell. Rob. Syst., Munich, Germany, 1994, pp. 1278-1284.

39. M. Buchberger, K. Jörg, and E. Puttkamer, "Laser radar and sonar based world modeling and motion control for fast obstacle avoidance of the autonomous mobile robot MOBOT-IV," Proc. IEEE Int. Conf. Rob. Autom., Atlanta, GA, 1993, pp. 534-540.
40. K. W. Jörg, Echtzeitfähige Multisensorintegration für autonome mobile Roboter, B. I. Wissenschaftsverlag, Mannheim, Leipzig, Wien, Zürich, 1994.
41. K. W. Jörg, "World modeling for an autonomous mobile robot using heterogenous sensor information," Rob. Auton. Syst., 14, 159-170, 1995.
42. A. Kak, et al., "Hierarchical evidence accumulation in the PSEIKI system and experiments in model-driven mobile robot navigation," in Uncertainty in Artificial Intelligence, Vol. 5, Elsevier Science Publishers B.V., North-Holland, 1990, pp. 353-369.
43. G. Schaffer, J. Gonzalez, and A. Stentz, "Comparison of two range-based pose estimators for a mobile robot," Proc. SPIE Conf. Mobile Rob., Boston, MA, 1992, pp. 661-667.
44. R. Hinkel and T. Knieriemen, "Environment perception with a laser radar in a fast moving robot," Symp. Rob. Control, Karlsruhe, Germany, 1988, pp. 68.1-68.7.
45. G. Weiss, C. Wetzler, and E. Puttkamer, "Keeping track of position and orientation of moving indoor systems by correlation of range-finder scans," Proc. 1994 Int. Conf. Intell. Rob. Syst., Munich, Germany, 1994, pp. 595-601.
