Mobile Robot Positioning: Sensors and Techniques
J. Borenstein*
The University of Michigan
Advanced Technologies Lab
1101 Beal Avenue
Ann Arbor, MI 48109-2110
e-mail: johannb@umich.edu
H. R. Everett
Naval Command, Control, and Ocean Surveillance
Center
RDT&E Division 5303
271 Catalina Boulevard
San Diego, CA 92152-5001
L. Feng
The University of Michigan
Advanced Technologies Lab
1101 Beal Avenue
Ann Arbor, MI 48109-2110
D. Wehe
The University of Michigan
Dept. of Nuclear Engineering and Radiological
Sciences
239 Cooley Bldg.
Ann Arbor, MI 48109
I: Relative position measurements (also called dead-reckoning)

1. Odometry
2. Inertial navigation

II: Absolute position measurements (reference-based systems)

3. Magnetic compasses
4. Active beacons
5. Global positioning systems
6. Landmark navigation
7. Model matching

2. REVIEW OF SENSORS AND TECHNIQUES

In this section we review some of the sensors and techniques used in mobile robot positioning. Examples of commercially available systems or well-documented research results will also be given.

2.1. Odometry

Odometry is the most widely used navigation method for mobile robot positioning; it provides good short-term accuracy, is inexpensive, and allows very high sampling rates. However, the fundamental idea of odometry is the integration of incremental motion information over time, which leads inevitably to the unbounded accumulation of errors. Specifically, orientation errors will cause large lateral position errors, which increase proportionally with the distance traveled by the robot. Despite these limitations, most researchers agree that odometry is an important part of a robot navigation system and that navigation tasks will be simplified if odometric accuracy can be improved. For example, Cox,4 Byrne et al.,5 and Chenavier and Crowley6 propose methods for fusing odometric data with absolute position measurements to obtain more reliable position estimation.

Odometry is based on simple equations,7 which hold true when wheel revolutions can be translated accurately into linear displacement relative to the floor. However, in the case of wheel slippage and some other more subtle causes, wheel rotations may not translate proportionally into linear motion. The resulting errors can be categorized into one of two groups: systematic errors and non-systematic errors.8 Systematic errors are those resulting from kinematic imperfections of the robot, for example, unequal wheel diameters or uncertainty about the exact wheelbase. Non-systematic errors are those that result from the interaction of the floor with the wheels, e.g., wheel slippage or bumps and cracks. Typically, when a mobile robot system is installed with a hybrid odometry/landmark navigation system, the density with which the landmarks must be placed in the environment is determined empirically and is based on the worst-case systematic errors. Such systems are likely to fail when one or more large non-systematic errors occur.

2.1.1. Measurement of Odometry Errors

One important but rarely addressed difficulty in mobile robotics is the quantitative measurement of odometry errors. The lack of well-defined measuring procedures for the quantification of odometry errors results in poor calibration of mobile platforms and incomparable reports on odometric accuracy in scientific communications. To overcome this problem, Borenstein and Feng8 developed a method for quantitatively measuring systematic odometry errors and, to a limited degree, non-systematic odometry errors. This method, called the University of Michigan Benchmark (UMBmark), requires that the mobile robot be programmed to follow a preprogrammed square path of 4 × 4 m side length with four on-the-spot 90-degree turns. This run is to be performed five times in the clockwise (cw) and five times in the counter-clockwise (ccw) direction.

When the return position of the robot as computed by odometry is compared to the actual return position, an error plot similar to the one shown in Figure 1 will result. The results of Figure 1 can be interpreted as follows:

- The stopping positions after cw and ccw runs are clustered in two distinct areas.
- The distribution within the cw and ccw clusters is the result of non-systematic errors. However, Figure 1 shows that for an uncalibrated vehicle traveling over a reasonably smooth concrete floor, the contribution of systematic errors to the total odometry error can be notably larger than the contribution of non-systematic errors.

The asymmetry of the centers of gravity in cw and ccw runs results from the dominance of two types of systematic errors, collectively called Type A and Type B.8 Type A errors are defined as orientation errors that reduce (or increase) the amount of rotation of the robot during the square path experiment in both
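The "simple equations" of odometry referred to in section 2.1 can be illustrated with a short dead-reckoning sketch for a differential-drive vehicle. This is a generic textbook formulation rather than the specific implementation of ref. 7, and the wheelbase, per-step wheel travel, and the small left/right mismatch below are hypothetical values chosen only to show how a systematic error source such as unequal wheel diameters slowly bends a nominally straight path and accumulates a heading error.

```python
import math

def odometry_update(pose, d_left, d_right, wheelbase):
    """One dead-reckoning step for a differential-drive robot.

    pose is (x, y, theta); d_left and d_right are the distances covered by
    the left and right wheels since the last update (encoder counts times
    the linear travel per pulse).  The wheelbase is one of the kinematic
    parameters whose uncertainty produces systematic odometry errors.
    """
    x, y, theta = pose
    d_center = 0.5 * (d_right + d_left)        # travel of the robot's midpoint
    d_theta = (d_right - d_left) / wheelbase   # change of orientation
    x += d_center * math.cos(theta + 0.5 * d_theta)
    y += d_center * math.sin(theta + 0.5 * d_theta)
    return (x, y, (theta + d_theta) % (2.0 * math.pi))

# Hypothetical example: 400 steps of about 10 mm each.  A persistent 1%
# mismatch between the two wheels (as caused by unequal wheel diameters)
# turns a nominally straight 4 m run into a curved pose estimate with
# roughly 6.7 degrees of accumulated heading change.
pose = (0.0, 0.0, 0.0)
for _ in range(400):
    pose = odometry_update(pose, d_left=0.0100, d_right=0.0101, wheelbase=0.34)
print(pose)
```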
clip entitled CLAPPER showing this system in operation is included in refs. 12 and 20. A commercial version of this robot, shown in Figure 3, is now available under the name OmniMate.13 Because of its internal odometry error correction, the OmniMate is almost completely insensitive to bumps, cracks, or other irregularities on the floor.9

2.2. Inertial Navigation

Inertial navigation uses gyroscopes and accelerometers to measure rate of rotation and acceleration, respectively. Measurements are integrated once (or twice, for accelerometers) to yield position. Inertial navigation systems have the advantage that they are self-contained, that is, they don't need external references. However, inertial sensor data drift with time because of the need to integrate rate data to yield position; any small constant error increases without bound after integration. Inertial sensors are thus mostly unsuitable for accurate positioning over an extended period of time.

2.2.1. Accelerometers

Test results from the use of accelerometers for mobile robot navigation have been generally poor. In an informal study at the University of Michigan it was found that there is a very poor signal-to-noise ratio at lower accelerations (i.e., during low-speed turns). Accelerometers also suffer from extensive drift, and they are sensitive to uneven ground, because any disturbance from a perfectly horizontal position will cause the sensor to detect a component of the gravitational acceleration g. One low-cost inertial navigation system aimed at overcoming the latter problem included a tilt sensor.14,15 The tilt information provided by the tilt sensor was supplied to the accelerometer to cancel the gravity component projecting on each axis of the accelerometer. Nonetheless, the results obtained from the tilt-compensated system indicate a position drift rate of 1 to 8 cm/s (0.4 to 3.1 in/s), depending on the frequency of acceleration changes. This is an unacceptable error rate for most mobile robot applications.

2.2.2. Gyroscopes

Gyroscopes (also known as rate gyros or just gyros) are of particular importance to mobile robot positioning because they can help compensate for the foremost weakness of odometry: in an odometry-based positioning method, any small momentary orientation error will cause a constantly growing lateral position error. For this reason it would be of great benefit if orientation errors could be detected and corrected immediately.

Until recently, highly accurate gyros were too expensive for mobile robot applications. For example, a high-quality inertial navigation system (INS) such as those found in a commercial airliner would have a typical drift of about 1850 meters (1 nautical mile) per hour of operation, and cost between $50K and $70K.5 High-end INS packages used in ground applications have shown performance of better than 0.1% of distance traveled, but cost in the neighborhood of
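As a concrete illustration of why gyro drift matters, the sketch below integrates rate-gyro readings into a heading estimate: a constant bias grows linearly into a heading error, which odometry then converts into a steadily growing lateral position error. The sampling interval and run time are arbitrary assumptions, and the 0.005 deg/s bias is simply borrowed from the drift figure quoted later for the Andrew Autogyro; this is not code from any of the cited systems.

```python
def integrate_gyro(rates_dps, dt, bias_dps=0.0):
    """Integrate rate-gyro readings (deg/s) into a heading estimate (deg).

    rates_dps are the measured rotation rates; bias_dps models a constant
    sensor bias that is integrated along with the signal and therefore
    grows without bound in the heading estimate.
    """
    heading = 0.0
    for rate in rates_dps:
        heading += (rate + bias_dps) * dt
    return heading

# Hypothetical example: the robot actually drives straight (true rate 0)
# for 10 minutes, sampled at 10 Hz.  A bias of only 0.005 deg/s accumulates
# to a 3-degree heading error, enough to matter for dead reckoning.
dt = 0.1
samples = int(10 * 60 / dt)
print(integrate_gyro([0.0] * samples, dt, bias_dps=0.005))  # approx. 3.0
```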
Resolution: ±0.1°
Accuracy: ±0.5°
Repeatability: ±0.2°
Size: 46 × 110 mm (1.8 × 4.5 in)
Weight (total): 62 g (2.25 oz)
Power: current drain 0.04 A; supply voltage 8-18 V or 18-28 V
COMPASS ENGINE shown in Figure 5 is a versatile, low-cost (less than $700) developer's kit that includes a microprocessor-controlled stand-alone fluxgate sensor subsystem based on a two-axis toroidal ring-core sensor.

Two different sensor options are offered with the C100: (1) the SE-25 sensor, recommended for applications with a tilt range of ±16 degrees, and (2) the SE-10 sensor, for applications anticipating a tilt angle of up to ±45 degrees.

The SE-25 sensor provides internal gimballing by floating the sensor coil in an inert fluid inside the lexan housing. The SE-10 sensor provides a 2 degree-of-freedom (DOF) pendulous gimbal in addition to the internal fluid suspension. The SE-25 sensor mounts on top of the sensor PC board, while the SE-10 is suspended beneath it. The sensor PC board can be separated as much as 122 cm (48 in) from the detachable electronics PC board with an optional cable. Additional technical specifications are given in Table II.

2.4. Active Beacons

Active beacon navigation systems are the most common navigation aids on ships and airplanes, as well as on commercial mobile robot systems. Active beacons can be detected reliably and provide accurate positioning information with minimal processing. As a result, this approach allows high sampling rates and yields high reliability, but it does also incur high cost in installation and maintenance. Accurate

2.4.1. Trilateration

Trilateration is the determination of a vehicle's position based on distance measurements to known beacon sources. In trilateration navigation systems there are usually three or more transmitters mounted at known locations in the environment, and one receiver on board the robot. Conversely, there may be one transmitter on board, and the receivers are mounted on the walls. Using time-of-flight information, the system computes the distance between the stationary transmitters and the onboard receiver. Global Positioning Systems (GPSs), discussed in section 2.5, are an example of trilateration.

2.4.2. Triangulation

In this configuration there are three or more active transmitters mounted at known locations, as shown in Figure 6. A rotating sensor on board the robot registers the angles λ1, λ2, and λ3 at which it sees the transmitter beacons relative to the vehicle's longitudinal axis. From these three measurements the unknown x- and y-coordinates and the unknown vehicle orientation can be computed. One problem with this configuration is that in order to be seen at distances of, say, 20 m or more, the active beacons must be focused within a cone-shaped propagation pattern. As a result, beacons are not visible in many areas, a problem that is particularly grave because at least three beacons must be visible for triangulation.

Cohen and Koss19 performed a detailed analysis on three-point triangulation algorithms and ran computer simulations to verify the performance of different
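The triangulation computation described in section 2.4.2 can be sketched numerically. The snippet below recovers the vehicle pose (x, y, θ) from the bearings λ1, λ2, λ3 to three beacons at known positions by Gauss-Newton iteration on the bearing residuals; it is only a minimal illustration, not one of the specific three-point algorithms analyzed by Cohen and Koss,19 and the beacon coordinates, true pose, and initial guess are made-up test values.

```python
import numpy as np

def wrap(angle):
    """Wrap an angle to the interval [-pi, pi)."""
    return (angle + np.pi) % (2.0 * np.pi) - np.pi

def triangulate(beacons, bearings, guess=(0.0, 0.0, 0.0), iterations=20):
    """Estimate the pose (x, y, theta) from bearings to three known beacons.

    bearings[i] is the angle at which beacon i is seen relative to the
    vehicle's longitudinal axis (the lambda_i of Fig. 6).  Solved by
    Gauss-Newton iteration on the three bearing residuals.
    """
    x, y, theta = guess
    for _ in range(iterations):
        residuals = np.zeros(3)
        jacobian = np.zeros((3, 3))
        for i, ((bx, by), lam) in enumerate(zip(beacons, bearings)):
            dx, dy = bx - x, by - y
            d2 = dx * dx + dy * dy
            residuals[i] = wrap(np.arctan2(dy, dx) - theta - lam)
            jacobian[i] = [dy / d2, -dx / d2, -1.0]  # d(residual)/d(x, y, theta)
        step = np.linalg.solve(jacobian, -residuals)
        x, y, theta = x + step[0], y + step[1], wrap(theta + step[2])
    return x, y, theta

# Synthetic check with invented numbers: robot at (2, 3) with 30 deg heading.
beacons = [(0.0, 0.0), (10.0, 0.0), (5.0, 8.0)]
tx, ty, tth = 2.0, 3.0, np.radians(30.0)
bearings = [wrap(np.arctan2(by - ty, bx - tx) - tth) for bx, by in beacons]
print(triangulate(beacons, bearings, guess=(1.0, 1.0, 0.0)))
```

The well-known degenerate case, where the vehicle lies on the circle through the three beacons, makes the Jacobian singular; practical implementations must detect and handle it.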
Figure 9. The CONAC™ system employs an onboard, rapidly rotating and vertically spread laser beam, which sequentially contacts the networked detectors (courtesy of MTI Research Inc.19).
Figure 10. Typical GPS static position error with SA On (courtesy of Byrne, Sandia
National Laboratories22).
compare the accuracy of GPS with SA turned on and off. The static measurements of the GPS error as a function of time (shown in Fig. 10) were taken before the October 1992 test, i.e., with SA on (note the slowly varying error in Fig. 10, which is caused by SA). By contrast, Figure 11 shows measurements from the October 1992 period when SA was briefly off.

The effect of SA can be essentially eliminated through use of a practice known as differential GPS (DGPS). The concept is based on the premise that a second GPS receiver in fairly close proximity (i.e., within 10 km, which is 6.2 miles) to the first will experience basically the same error effects when viewing the same reference satellites. If this second receiver is fixed at a precisely surveyed location, its calculated solution can be compared to the known position to generate a composite error vector representative of prevailing conditions in that immediate locale. This differential correction can then be passed to the first receiver to null out the unwanted effects, effectively reducing position error for commercial systems.

Many commercial GPS receivers are available with differential capability. This, together with the service of some local radio stations that make differential corrections available to subscribers of the service,23 makes the use of DGPS possible for many applications. Typical DGPS accuracies are around 4 to 6 m (13 to 20 ft), with better performance seen as the distance between the mobile receivers and the fixed reference station is decreased. For example, the Coast Guard is in the process of implementing differential GPS in all major U.S. harbors, with an expected accuracy of around 1 m (3.3 ft).24 A differential GPS system already in operation at O'Hare International Airport in Chicago has demonstrated that aircraft and service vehicles can be located to 1 m (3.3 ft) in real-time, while moving.
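A minimal sketch of the differential-correction idea follows. It simply subtracts the error vector observed at the surveyed reference station from the mobile receiver's fix; the coordinates are invented, and real DGPS services correct the individual satellite pseudoranges rather than the final position solution, so this is only meant to make the premise above concrete.

```python
def dgps_correct(reference_fix, reference_truth, rover_fix):
    """Apply a position-domain differential GPS correction.

    reference_fix is the position computed by the receiver at the surveyed
    reference station, reference_truth is its precisely known location, and
    rover_fix is the mobile receiver's computed position.  Both receivers
    are assumed to see the same satellites and hence roughly the same error.
    """
    error_x = reference_fix[0] - reference_truth[0]
    error_y = reference_fix[1] - reference_truth[1]
    return (rover_fix[0] - error_x, rover_fix[1] - error_y)

# Invented numbers: a common 30 m / 20 m offset (e.g., caused by SA) shifts
# both receivers; subtracting the reference station's error removes it.
print(dgps_correct(reference_fix=(1030.0, 2020.0),
                   reference_truth=(1000.0, 2000.0),
                   rover_fix=(5030.0, 7020.0)))  # -> (5000.0, 7000.0)
```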
Figure 11. Typical GPS static position error with SA Off (courtesy of Byrne, Sandia
National Laboratories22).
Surveyors use differential GPS to achieve centimeter accuracy, but this practice requires significant postprocessing of the collected data.22

In 1992 and 1993 Raymond H. Byrne22 at the Advanced Vehicle Development Department, Sandia National Laboratories, Albuquerque, New Mexico, conducted a series of in-depth comparison tests with five different GPS receivers. Testing focused on receiver sensitivity, static accuracy, dynamic accuracy, number of satellites tracked, and time-to-first-fix. The more important parameters evaluated in this test, the static and dynamic accuracy, are summarized below for the Magnavox GPS Engine, a representative of the five receivers tested.

Position Accuracy

Static position accuracy was measured by placing the GPS receivers at a surveyed location and taking data for approximately 24 hours. The plots of the static position error for the Magnavox GPS Engine were shown in Figure 10. The mean and standard deviation (σ) of the position error in this test were 22 m (72 ft) and 16 m (53 ft), respectively.

Fractional Availability of Signals

The dynamic test data was obtained by driving an instrumented van over different types of terrain. The various routes were chosen so that the GPS receivers would be subjected to a wide variety of obstructions. These include buildings, underpasses, signs, and foliage for the city driving. Rock cliffs and foliage were typical for the mountain and canyon driving. Large trucks, underpasses, highway signs, buildings, foliage, and small canyons were found on the interstate and rural highway driving routes.

The results of the dynamic testing are shown in
Figure 12. Summary of dynamic environment performance for the Magnavox GPS Engine
(courtesy of Byrne, Sandia National Laboratories22).
Figure 12; the percentages have the following meaning:

No Navigation: Not enough satellites were in sight to permit positioning.
2-D Navigation: Enough satellites were in sight to determine the x- and y-coordinates of the vehicle.
3-D Navigation: Optimal data available. The system could determine the x-, y-, and z-coordinates of the vehicle.

In summary one can conclude that GPS is a tremendously powerful tool for many outdoor navigation tasks. The problems associated with using GPS for mobile robot navigation are: (a) periodic signal blockage due to foliage and hilly terrain, (b) multipath interference, and (c) insufficient position accuracy for primary (stand-alone) navigation systems.

2.6. Landmark Navigation

Landmarks are distinct features that a robot can recognize from its sensory input. Landmarks can be geometric shapes (e.g., rectangles, lines, circles), and they may include additional information (e.g., in the form of bar-codes). In general, landmarks have a fixed and known position, relative to which a robot can localize itself. Landmarks are carefully chosen to be easy to identify; for example, there must be sufficient contrast relative to the background. Before a robot can use landmarks for navigation, the characteristics of the landmarks must be known and stored in the robot's memory. The main task in localization is then to recognize the landmarks reliably and to calculate the robot's position.

To simplify the problem of landmark acquisition it is often assumed that the current robot position and orientation are known approximately, so that the robot only needs to look for landmarks in a limited area. For this reason good odometry accuracy is a prerequisite for successful landmark detection.

Some approaches fall between landmark and map-based positioning (see section 2.7). They use sensors to sense the environment and then extract distinct structures that serve as landmarks for navigation in the future.

Our discussion in this section addresses two types of landmarks: artificial and natural landmarks. It is important to bear in mind that natural landmarks work best in highly structured environments such as corridors, manufacturing floors, or hospitals. Indeed, one may argue that natural landmarks work best when they are actually man-made (as is the case in highly structured environments). For this reason, we shall define the terms natural landmarks and artificial landmarks as follows: natural landmarks are those objects or features that are already in the environment and have a function other than robot navigation; artificial landmarks are specially designed objects or markers that need to be placed in the environment with the sole purpose of enabling robot navigation.
and parameter extraction.29,30 There are also systems that use active (i.e., LED) patterns to achieve the same effect.31

The accuracy achieved by the above methods depends on the accuracy with which the geometric parameters of the landmark images are extracted from the image plane, which in turn depends on the relative position and angle between the robot and the landmark. In general, the accuracy decreases with increasing relative distance. Normally there is a range of relative angles in which good accuracy can be achieved, while accuracy drops significantly once the relative angle moves out of the good region.

There is also a variety of landmarks used in conjunction with non-vision sensors. Most often used are bar-coded reflectors for laser scanners. For example, work on the Mobile Detection Assessment and Response System (MDARS)3,32,33 uses retro-reflectors, and so does the commercially available system from Caterpillar on their Self-Guided Vehicle.5,34 The shape of these landmarks is usually unimportant. By contrast, a unique approach taken by Feng et al.35 used a circular landmark and applied an optical Hough transform to extract the parameters of the ellipse on the image plane in real time.

We summarize the characteristics of landmark-based navigation as follows:

- Natural landmarks offer flexibility and require no modifications to the environment.
- Artificial landmarks are inexpensive and can have additional information encoded as patterns or shapes.
- The maximal effective distance between robot and landmark is substantially shorter than in active beacon systems.
- The positioning accuracy depends on the distance and angle between the robot and the landmark. Landmark navigation is rather inaccurate when the robot is farther away from the landmark; a higher degree of accuracy is obtained only when the robot is near a landmark.
- Substantially more processing is necessary than with active beacon systems. In many cases onboard computers cannot process natural landmark algorithms quickly enough for real-time motion.
- Ambient conditions, such as lighting, can be problematic; in marginal visibility landmarks may not be recognized at all, or other objects in the environment with similar features can be mistaken for a legitimate landmark. This is a serious problem because it may result in a completely erroneous determination of the robot's position.
- Landmarks must be available in the work environment around the robot.
- Landmark-based navigation requires an approximate starting location so that the robot knows where to look for landmarks. If the starting position is not known, the robot has to conduct a time-consuming search process. This search process may go wrong and may yield an erroneous interpretation of the objects in the scene.
- A database of landmarks and their locations in the environment must be maintained.
- There is only limited commercial support for natural landmark-based techniques.

2.7. Map-Based Positioning

Map-based positioning, also known as map matching, is a technique in which the robot uses its sensors to create a map of its local environment. This local map is then compared to a global map previously stored in memory. If a match is found, then the robot can compute its actual position and orientation in the environment. The pre-stored map can be a CAD model of the environment, or it can be constructed from prior sensor data. Map-based positioning is advantageous because it uses the naturally occurring structure of typical indoor environments to derive position information without modifying the environment. Also, with some of the algorithms being developed, map-based positioning allows a robot to learn a new environment and to improve positioning accuracy through exploration. Disadvantages of map-based positioning are the stringent requirements for accuracy of the sensor map, and the requirement that there be enough stationary, easily distinguishable features that can be used for matching. Because of these challenging requirements, most work in map-based positioning is currently limited to laboratory settings and to relatively simple environments.

2.7.1. Map Building

There are two fundamentally different starting points for the map-based positioning process. Either there is a pre-existing map, or the robot has to build its own environment map. Rencken36 defined the map-building problem as the following: Given the robot's position and a set of measurements, what are the sensors seeing? Obviously, the map-building ability of a robot is closely related to its sensing capacity.
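As a concrete (and deliberately simplified) picture of the map-building step, the sketch below inserts a single range reading into an occupancy grid: given the robot's pose and a measurement, cells along the beam are marked free and the cell at the measured range is marked occupied. The grid representation, cell size, and update rule are generic illustrative choices, not Rencken's algorithm.36

```python
import math

def insert_range_reading(grid, pose, sensor_bearing, measured_range, cell=0.1):
    """Insert one range measurement into a simple occupancy grid.

    grid maps (i, j) cell indices to 'free' or 'occupied'; pose is the
    robot's (x, y, theta) from odometry; sensor_bearing is the beam angle
    relative to the robot; measured_range is in meters.
    """
    x, y, theta = pose
    steps = int(measured_range / cell)
    for k in range(steps + 1):
        d = k * cell
        cx = x + d * math.cos(theta + sensor_bearing)
        cy = y + d * math.sin(theta + sensor_bearing)
        index = (int(round(cx / cell)), int(round(cy / cell)))
        grid[index] = 'occupied' if k == steps else 'free'
    return grid

# Invented example: one forward-looking reading of 1.0 m from the origin
# marks ten cells as free and the cell at the measured range as occupied.
grid = {}
insert_range_reading(grid, pose=(0.0, 0.0, 0.0), sensor_bearing=0.0,
                     measured_range=1.0)
print(sorted(grid.items())[-3:])
```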
Figure 16. Readings from a rotating laser scanner generate the contours of a room. (a)
the angle histogram allows the robot to determine its orientation relative to the walls; (b)
after normalizing the orientation of the room relative to the robot, an x-y histogram can
be built from the same data points (adapted from Hinkel and Knieriemen44 with permission).
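The angle-histogram idea behind Figure 16 can be sketched as follows: the directions of the edges between consecutive scan points are histogrammed, and in a rectilinear room the histogram peaks at the wall orientation, which gives the robot's orientation relative to the walls. The point set, bin width, and the use of undirected angles below are illustrative assumptions; the method of Hinkel and Knieriemen44 works on the raw laser-radar data and also builds the x-y histograms of Figure 16b after rotating the scan.

```python
import math
from collections import Counter

def angle_histogram(points, bin_deg=5):
    """Histogram of edge directions between consecutive scan points (degrees).

    Angles are folded into [0, 180) so that opposite wall segments fall into
    the same bin; the dominant bin indicates the wall direction as seen from
    the robot.
    """
    histogram = Counter()
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        angle = math.degrees(math.atan2(y1 - y0, x1 - x0)) % 180.0
        histogram[int(angle // bin_deg) * bin_deg] += 1
    return histogram

# Invented scan: two perpendicular walls whose direction is 17 degrees in
# the robot's coordinate frame.
rot = math.radians(17.0)
wall1 = [(d * math.cos(rot), d * math.sin(rot))
         for d in [i * 0.05 for i in range(50)]]
corner = wall1[-1]
wall2 = [(corner[0] - d * math.sin(rot), corner[1] + d * math.cos(rot))
         for d in [i * 0.05 for i in range(1, 41)]]
histogram = angle_histogram(wall1 + wall2)
peak_bin = max(histogram, key=histogram.get)
print('dominant direction bin:', peak_bin, 'deg')  # 15, i.e. the 17 deg wall
# direction recovered to within the 5 deg bin width
```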
System & description: Odometry on TRC LabMate, after UMBmark calibration. Wheel-encoder resolution: 0.012 mm linear travel per pulse.
Features: 4 × 4 m square path.
Accuracy, position [mm]: smooth floor: 30 mm; with 10 bumps: 500 mm.
Accuracy, orientation [°]: smooth floor: 1-2°; with 10 bumps: 8°.
Effective range: unlimited. Ref. no.: 8.

System & description: CLAPPER and OmniMate: dual-drive robot with internal correction of odometry. Made from two TRC LabMates, connected by a compliant linkage. Uses 2 absolute rotary encoders, 1 linear encoder.
Features: 4 × 4 m square path.
Accuracy, position [mm]: smooth floor: ≈20 mm; with 10 bumps: ≈40 mm.
Accuracy, orientation [°]: smooth floor: <1°; with 10 bumps: <1°.
Effective range: unlimited. Ref. no.: 9.

System & description: Complete inertial navigation system including ENV-05S Gyrostar solid-state rate gyro, START solid-state gyro, triaxial linear accelerometer, and 2 inclinometers.
Accuracy, position [mm]: drift rate 1-8 cm/s, depending on frequency of acceleration change.
Accuracy, orientation [°]: drift ≈0.25°/s; after compensation, drift 0.0125°/s.
Effective range: unlimited. Ref. no.: 14, 15.

System & description: Andrew Autogyro and Autogyro Navigator. Quoted minimum detectable rotation rate: ±0.02°/s. Actual minimum detectable rate limited by deadband after A/D conversion: 0.0625°/s. Cost: $1,000.
Accuracy, position [mm]: not applicable.
Accuracy, orientation [°]: drift 0.005°/s.
Effective range: unlimited. Ref. no.: 17.

System & description: KVH Fluxgate Compass. Includes microprocessor-controlled fluxgate sensor subsystem. Cost: <$700.
Accuracy, position [mm]: not applicable.
Accuracy, orientation [°]: resolution ±0.5°; accuracy ±0.5°; repeatability ±0.2°.
Effective range: unlimited. Ref. no.: 18.

System & description: CONAC™ (computerized opto-electronic navigation and control). Cost: $6,000.
Features: measures both angle and distance to target.
Accuracy, position [mm]: indoor ±1.3 mm; outdoor ±5 mm.
Accuracy, orientation [°]: indoor and outdoor ±0.05°.
Effective range: >100 m. Ref. no.: 19a.
ACKNOWLEDGMENTS

Parts of this research were funded by a U.S. Department of Energy Grant DE-FG02-86NE37969. Parts of the text were adapted from refs. 3, 7, 20, and 22.

REFERENCES

1. J. L. Farrell, Integrated Aircraft Navigation, Academic Press, New York, 1976.
2. R. H. Battin, An Introduction to the Mathematics and Methods of Astrodynamics, AIAA Education Series, New York, 1987.
3. H. R. Everett, Sensors for Mobile Robots: Theory and Application, A. K. Peters, Ltd., Wellesley, MA, 1995.
4. I. J. Cox, Blanche: An experiment in guidance and navigation of an autonomous mobile robot, IEEE Trans. Rob. Autom., 7(3), 193-204, 1991.
5. R. H. Byrne, P. R. Klarer, and J. B. Pletta, Techniques for autonomous navigation, Sandia Report SAND92-0457, Sandia National Laboratories, Albuquerque, NM, 1992.
6. F. Chenavier and J. Crowley, Position estimation for a mobile robot using vision and odometry, Proc. IEEE Int. Conf. Rob. Autom., Nice, France, 1992, pp. 2588-2593.
7. J. Borenstein, B. Everett, and L. Feng, Navigating Mobile Robots: Systems and Techniques, A. K. Peters, Ltd., Wellesley, MA, 1996.
8. J. Borenstein and L. Feng, Measurement and correction of systematic odometry errors in mobile robots, IEEE J. Rob. Autom., 12(6), 1996, pp. 869-880.
9. J. Borenstein and J. Evans, The OmniMate mobile robot: Design, implementation, and experimental results, 1997 IEEE Int. Conf. on Robotics and Automation, Albuquerque, NM, Apr. 21-27, 1997.
10. J. Borenstein and L. Feng, UMBmark: A method for measuring, comparing, and correcting dead-reckoning errors in mobile robots, Technical Report UM-MEAM-94-22, The University of Michigan, 1994.
11. J. Borenstein, Internal correction of dead-reckoning errors with the compliant linkage vehicle, J. Rob. Syst., 12(4), 1995, pp. 257-273.
12. J. Borenstein, The CLAPPER: A dual-drive mobile robot with internal correction of dead-reckoning errors, Proc. IEEE Int. Conf. Rob. Autom. (Video Proceedings), Nagoya, Japan, 1995.
13. TRC, Transitions Research Corp. (now under new name: HelpMate Robotics Inc., HRI), Shelter Rock Lane, Danbury, CT 06810.
14. B. Barshan and H. F. Durrant-Whyte, An inertial navigation system for a mobile robot, Proc. IEEE/RSJ Int. Conf. Intell. Rob. Syst., Yokohama, Japan, 1993, pp. 2243-2248.
15. B. Barshan and H. F. Durrant-Whyte, Inertial navigation systems for mobile robots, IEEE Trans. Rob. Autom., 11(3), 1995, pp. 328-342.
16. T. Dahlin and D. Krantz, Low-cost, medium-accuracy land navigation system, Sensors, Feb., 26-34, 1988.
17. Andrew Corporation, 10500 W. 153rd Street, Orland Park, IL 60462.
18. KVH, KVH Industries, C100 Compass Engine Product Literature, 110 Enterprise Center, Middletown, RI 02840.
19. C. Cohen and F. Koss, A comprehensive study of three object triangulation, Proc. SPIE Conf. Mobile Robots, Boston, MA, 1992, pp. 95-106.
19a. MTI, MTI Research, Inc., Chelmsford, MA.
20. J. Borenstein, B. Everett, and L. Feng, Navigating Mobile Robots: Systems and Techniques (CD-ROM Edition), A. K. Peters, Ltd., Wellesley, MA, 1996.
21. B. M. Gothard, R. D. Etersky, and R. E. Ewing, Lessons learned on a low-cost global navigation system for the surrogate semi-autonomous vehicle, SPIE Proc., Vol. 2058, pp. 258-269, 1993.
22. R. H. Byrne, Global positioning system receiver evaluation results, Sandia Report SAND93-0827, Sandia National Laboratories, Albuquerque, NM, 1993.
23. GPS Report, Phillips Business Information, Potomac, MD, 1992.
24. I. A. Getting, The global positioning system, IEEE Spectrum, December, 36-47, 1993.
25. M. Jenkin et al., Global navigation for ARK, Proc. IEEE/RSJ Int. Conf. Intell. Rob. Syst., Yokohama, Japan, 1993, pp. 2165-2171.
26. S. Atiya and G. Hager, Real-time vision-based robot localization, IEEE Trans. Rob. Autom., 9(6), 785-800, 1993.
27. R. Talluri and J. Aggarwal, Position estimation techniques for an autonomous mobile robot: A review, in Handbook of Pattern Recognition and Computer Vision, World Scientific, Singapore, 1993, Chapter 4.4, pp. 769-801.
28. I. Fukui, TV image processing to determine the position of a robot vehicle, Pattern Recognit., 14, 101-109, 1981.
29. B. Lapin, Adaptive position estimation for an automated guided vehicle, Proc. SPIE Conf. Mobile Rob., Boston, MA, Nov. 18-20, pp. 82-94.
30. Y. Mesaki and I. Masuda, A new mobile robot guidance system using optical reflectors, Proc. IEEE/RSJ Int. Conf. Intell. Rob. Syst., Raleigh, NC, 1992, pp. 628-635.
31. S. Fleury and T. Baron, Absolute external mobile robot localization using a single image, Proc. SPIE Conf. Mobile Rob., Boston, MA, 1992, pp. 131-143.
32. H. R. Everett et al., Real-world issues in warehouse navigation, Proc. SPIE Conf. Mobile Rob., Boston, MA, 1994, Vol. 2352.
33. C. DeCorte, Robots train for security surveillance, Access Control, June, 37-38, 1994.
34. L. Gould, Is off-wire guidance alive or dead? Managing Autom., May, 38-40, 1990.
35. L. Feng, Y. Fainman, and Y. Koren, Estimate of absolute position of mobile systems by optoelectronic processor, IEEE Trans. Man Mach. Cybern., 22(5), 954-963, 1992.
36. W. D. Rencken, Concurrent localization and map building for mobile robots using ultrasonic sensors, Proc. IEEE/RSJ Int. Conf. Intell. Rob. Syst., Yokohama, Japan, 1993, pp. 2192-2197.
37. W. D. Rencken, Autonomous sonar navigation in indoor, unknown, and unstructured environments, Proc. IEEE/RSJ Int. Conf. Intell. Rob. Syst., Munich, Germany, 1994, pp. 127-134.
38. T. Edlinger and E. Puttkamer, Exploration of an indoor environment by an autonomous mobile robot, Proc. IEEE/RSJ Int. Conf. Intell. Rob. Syst., Munich, Germany, 1994, pp. 1278-1284.
39. M. Buchberger, K. Jorg, and E. Puttkamer, Laser radar and sonar based world modeling and motion control for fast obstacle avoidance of the autonomous mobile robot MOBOT-IV, Proc. IEEE Int. Conf. Rob. Autom., Atlanta, GA, 1993, pp. 534-540.
40. K. W. Jorg, Echtzeitfähige Multisensorintegration für autonome mobile Roboter, B. I. Wissenschaftsverlag, Mannheim, Leipzig, Wien, Zürich, 1994.
41. K. W. Jorg, World modeling for an autonomous mobile robot using heterogeneous sensor information, Rob. Auton. Syst., 14, 159-170, 1995.
42. A. Kak et al., Hierarchical evidence accumulation in the PSEIKI system and experiments in model-driven mobile robot navigation, Uncertainty in Artificial Intelligence, Elsevier Science Publishers B. V., North-Holland, 1990, Vol. 5, pp. 353-369.
43. G. Schaffer, J. Gonzalez, and A. Stentz, Comparison of two range-based pose estimators for a mobile robot, Proc. SPIE Conf. Mobile Rob., Boston, MA, 1992, pp. 661-667.
44. R. Hinkel and T. Knieriemen, Environment perception with a laser radar in a fast moving robot, Symp. Rob. Control, Karlsruhe, Germany, 1988, pp. 68.1-68.7.
45. G. Weiss, C. Wetzler, and E. Puttkamer, Keeping track of position and orientation of moving indoor systems by correlation of range-finder scans, 1994 Int. Conf. Intell. Rob. Syst., Munich, Germany, 1994, pp. 595-601.