
Research Article

International Journal of Advanced Robotic Systems
March-April 2021: 1–11
© The Author(s) 2021
Article reuse guidelines: sagepub.com/journals-permissions
DOI: 10.1177/1729881421999923
journals.sagepub.com/home/arx

Novel indoor positioning algorithm based on Lidar/inertial measurement unit integrated system

Ping Jiang1, Liang Chen1, Hang Guo1, Min Yu2 and Jian Xiong1

Abstract
As an important research field of mobile robotics, simultaneous localization and mapping (SLAM) is the core technology for realizing intelligent autonomous mobile robots. Aiming at the low positioning accuracy of Lidar (light detection and ranging) SLAM under nonlinear and non-Gaussian noise, this article presents a mobile robot SLAM method that combines Lidar and an inertial measurement unit (IMU) into a multi-sensor integrated system and uses a rank Kalman filter to estimate the robot motion trajectory from IMU and Lidar observations. Rank Kalman filtering is similar in structure to Gaussian deterministic point-sampling filtering algorithms, but it does not require the Gaussian distribution assumption: it computes the sampling points and their weights entirely from the correlation principle of rank statistics, so it is suitable for nonlinear and non-Gaussian systems. In multiple experimental tests on small-scale arc trajectories, compared with the Lidar-alone SLAM algorithm, the new algorithm reduces the mean error of the indoor mobile robot in the X direction from 0.0928 m to 0.0451 m, an accuracy improvement of 46.39%, and the mean error in the Y direction from 0.0772 m to 0.0405 m, an improvement of 48.40%. Compared with the extended Kalman filter fusion algorithm, the new algorithm reduces the mean error in the X direction from 0.0597 m to 0.0451 m, an improvement of 24.46%, and the mean error in the Y direction from 0.0537 m to 0.0405 m, an improvement of 24.58%. Finally, we also tested on a large-scale rectangular trajectory: compared with the extended Kalman filter algorithm, rank Kalman filtering improves the accuracy by 23.84% and 25.26% in the X and Y directions, respectively, verifying that the accuracy of the algorithm proposed in this article is improved.

Keywords
Mobile robot, Lidar SLAM, IMU, multi-sensor fusion, rank Kalman filter

Date received: 31 January 2021; accepted: 14 February 2021

Topic Area: Robot Sensors and Sensor Networks


Topic Editor: Nak-Young Chong
Associate Editor: Euntai Kim

1 Information Engineering School, Nanchang University, Nanchang, China
2 College of Software, Jiangxi Normal University, Nanchang, China

Corresponding authors:
Hang Guo, Information Engineering School, Nanchang University, 999 Xuefu Avenue, Nanchang 330031, China. Email: hguo@ncu.edu.cn
Min Yu, College of Software, Jiangxi Normal University, 99 Zi Yang Avenue, Nanchang 330022, China. Email: myu@jxnu.edu.cn

Introduction

Nowadays, navigation and positioning have become an indispensable part of people's lives and have various applications in the field of unmanned system navigation and positioning, such as indoor autonomous sweeping robots and driverless cars. These products are constantly developing and expanding and have brought a profound impact on the technological progress of the entire

Creative Commons CC BY: This article is distributed under the terms of the Creative Commons Attribution 4.0 License
(https://creativecommons.org/licenses/by/4.0/) which permits any use, reproduction and distribution of the work without
further permission provided the original work is attributed as specified on the SAGE and Open Access pages (https://us.sagepub.com/en-us/nam/
open-access-at-sage).
society. Simultaneous localization and mapping (SLAM) is a research focus in the field of unmanned system navigation and positioning, and in recent years many investigators have conducted in-depth research on SLAM technology.1,2 The most important significance of SLAM technology is that it not only solves the problem of autonomous positioning of mobile robots in unmanned systems but also enables mobile robots to build environmental maps based on their own positioning in an unknown environment and then to further modify the positioning information according to the maps built, so that both the positioning effect and the mapping effect can be improved. SLAM technology is an advantageous and substitutional module for indoor mapping and for outdoor cases of Global Navigation Satellite System failure. Compared with indoor pedestrian dead reckoning positioning, Wi-Fi positioning, fingerprint library recognition positioning, and other positioning methods,3–5 SLAM technology offers high automation and real-time operation, performing mapping and positioning of indoor and outdoor environments in real time.

Smith and Cheesman6 first proposed the SLAM problem, combining both positioning and mapping around a robot. In 1987, they presented a SLAM algorithm based on Kalman filtering (KF), but this state estimation method is only suitable for linear systems with Gaussian-distributed noise. Generally speaking, state estimation problems with nonlinear and non-Gaussian phenomena are universal, and the extended Kalman filter (EKF) was the earliest state processing algorithm proposed for nonlinear systems under Gaussian noise assumptions. Subsequently, Liu et al. proposed a central difference Kalman filter, which solved the problem of low accuracy of multi-source data fusion in SLAM applications.7 Martinez-Cantin and Castellanos proposed an unscented Kalman filter8 to effectively solve the problem of filter divergence caused by strong nonlinear characteristics of the system. In addition, the quadrature Kalman filter9 and the cubature Kalman filter10 are also used to solve the state estimation problem of nonlinear systems. In the algorithms mentioned above, the sampling points are essentially extracted from the prior distribution of the state, the sampling points are then iterated linearly or nonlinearly, and finally fusion is performed to obtain an accurate state estimate.

The rank Kalman filter (RKF) does not need to make Gaussian assumptions on the system noise in the calculation process; it calculates the sampling points and their weights based on the correlation principle of rank statistics.11,12 Because of this characteristic, it is widely used in nonlinear and non-Gaussian noise systems, and compared with other filtering algorithms the RKF algorithm has a wider application range.13

Although Lidar can provide high-precision position and heading results, it is easily affected by the external environment, and its positioning accuracy is greatly reduced under nonlinear and non-Gaussian noise. Compared with Lidar positioning technology, inertial navigation technology is not affected by the external environment and has the ability of independent navigation. The 100 Hz high-frequency outputs of an IMU device can provide short-term, highly accurate navigation parameters (position, velocity, and attitude). However, the navigation and positioning errors of an IMU gradually accumulate over time, so inertial navigation needs the aid of other sensors to obtain high-precision navigation results. Visual information may also be used for navigation and positioning, but it has the limitation of being easily affected by the environment: in cases of violent motion and dark areas, blurred images, insufficient image feature extraction, and obscure textures will occur.

Given the performance limitations of a single sensor, multi-sensor information fusion has become the current mainstream method of navigation and positioning. Combining several different navigation systems can make use of multiple information sources that complement each other, forming a multidimensional, multifunctional, and high-accuracy navigation system. At present, the mainstream integration methods are Lidar/IMU,14,15 Lidar/Visual,16 Visual/IMU,17 Lidar/Visual/IMU,18 and so on.

In this article, an RKF algorithm in the framework of a loosely integrated Lidar/IMU system is used to estimate the position and attitude information of the mobile robot; it improves indoor mobile robot positioning accuracy and increases system robustness.

SLAM problem and mathematical description

The problem of SLAM for a mobile robot can be summarized as follows: in an unknown environment, the robot calculates its current position and attitude from sensor data, as needed for position estimation during the movement process, and simultaneously builds and updates the environment map. The essence of this problem is to use the data obtained by the sensors and the known environmental information to estimate the posture of the mobile robot and describe the environmental information. The SLAM problem can therefore be described as shown in Figure 1.

Generally speaking, the sensors collect data at a fixed interval. When calculating the posture of the mobile robot and updating the map, the motion of the robot over a continuous period of time can be discretized into the posture and the map at times $t = 1, \ldots, K$. During two-dimensional motion, the position and orientation of the mobile robot can be represented by a three-dimensional vector $X_k = (x_k, y_k, \theta_k)^T$, with $x_k, y_k$ the plane coordinates and $\theta_k$ the orientations of the segments of the trajectory line. The map is composed of multiple features. If there are N feature points, the map can be described as

Figure 1. SLAM problem description: robot poses $x_{t-1}, x_t, x_{t+1}$ with control inputs $u_t, u_{t+1}$; landmarks $y_i, y_j, y_k$ observed through $z_{t-1,i}, z_{t,i}, z_{t,j}, z_{t+1,k}$; estimated versus actual position. SLAM: simultaneous localization and mapping.

Figure 2. Structure diagram of the multi-sensor fusion algorithm: LiDAR data pass through feature detection and matching, coordinate transformation, and position and attitude calculation; IMU angular rate and acceleration feed the navigation solutions; the rank Kalman filter performs the error estimation between the two position and attitude results.

$y = (y_0, y_1, \ldots, y_N)$. The observed feature at the posture vector $j$ is $y_i$. In general, the mobile robot is equipped with sensors to measure its own motion information, and the motion process can be expressed by the following equation:

$$X_t = f(X_{t-1}, u_t, W_t) \quad (1)$$

Here $u_t$ is the data (the position, velocity, and angular rate of the robot itself) of the sensors (coded wheels or inertial sensors) at time $t$, and $W_t$ is the system noise. This equation is also called the motion (state-transition) equation, which describes the movement of the robot from $t-1$ to $t$. The observation $z_{t,j}$ describes the measurement collected when the sensors on the robot measure the feature $y_j$ at the position $x_t$. The measurement process is expressed by the equation

$$z_{t,j} = h\left(y_j, x_t, v_{t,j}\right) \quad (2)$$

where $v_{t,j}$ is the noise observed at time $t$.

RKF-based multi-sensor fusion algorithm

Multi-sensor fusion algorithm

The state equation is established through the error model of the strapdown inertial navigation system, and the observation equation is established through the posture information calculated by the separate Lidar and IMU algorithms. The position, velocity, attitude, and inertial device errors of the integrated navigation system are then estimated. The method is shown in Figure 2.

RKF design based on Lidar/IMU

The RKF algorithm is used as a fusion model to process data from both the IMU and the Lidar to reduce the nonlinear and non-Gaussian effects of the system; the IMU in this method is used to correct the posture error caused by Lidar-alone scan matching, so that the final posture information is more accurate. First, a nonlinear system model is introduced as follows.
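Generically, the motion equation (1) and the observation equation (2) are just two functions. A toy planar-robot instance may make the notation concrete; the unicycle dynamics, the fixed 0.1 s step, and the range/bearing observation below are illustrative assumptions, not the article's strapdown error model:

```python
import numpy as np


def motion_model(x_prev: np.ndarray, u: np.ndarray, w: np.ndarray) -> np.ndarray:
    """Equation (1): X_t = f(X_{t-1}, u_t, W_t) for a planar pose (x, y, theta).

    u = (forward speed, turn rate), integrated over an assumed 0.1 s step.
    """
    dt = 0.1
    x, y, theta = x_prev
    v, omega = u
    return np.array([x + v * dt * np.cos(theta),
                     y + v * dt * np.sin(theta),
                     theta + omega * dt]) + w


def observation_model(landmark: np.ndarray, pose: np.ndarray, v_noise: np.ndarray) -> np.ndarray:
    """Equation (2): z_{t,j} = h(y_j, x_t, v_{t,j}), here a range and bearing
    measurement from the robot pose x_t to the landmark y_j.
    """
    dx, dy = landmark - pose[:2]
    rng = np.hypot(dx, dy)
    bearing = np.arctan2(dy, dx) - pose[2]
    return np.array([rng, bearing]) + v_noise
```

For example, a robot at the origin observing a landmark at (3, 4) measures a range of 5 m.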

Nonlinear system model. The navigation coordinate system selected here is the geographic coordinate system of East, North, and Up. The robot error angle model of the strapdown inertial navigation system19–21 is

$$\dot{\Phi} = C_\omega^{-1}\left[\left(I - C_t^{t'}\right)\hat{\omega}_{it}^{t} + C_t^{t'}\,\delta\omega_{it}^{t} - C_b^{t'}\,\delta\omega_{ib}^{b}\right] \quad (3)$$

The velocity error model is

$$\delta\dot{V} = \left[I - \left(C_t^{t'}\right)^T\right] C_b^{t'}\,\hat{f}^{b} + C_t^{t'} C_b^{t}\,\delta f^{b} - \left(2\hat{\omega}_{ie}^{t} + \hat{\omega}_{et}^{t}\right)\times\delta V - \left(2\delta\omega_{ie}^{t} + \delta\omega_{et}^{t}\right)\times V + \delta g \quad (4)$$

The position error model is

$$\begin{cases} \delta\dot{B} = \dfrac{\delta v_y}{R_m + h} - \dfrac{v_y}{(R_m + h)^2}\,\delta h \\[4pt] \delta\dot{\lambda} = \dfrac{\delta v_x \sec B}{R_n + h} + \dfrac{v_x \tan B \sec B}{R_m + h}\,\delta B - \dfrac{v_x \sec B}{(R_n + h)^2}\,\delta h \\[4pt] \delta\dot{h} = \delta v_z \end{cases} \quad (5)$$

where $C_\omega^{-1}$ describes the relationship between the angular rate and the robot error angle (refer to ref. 21); $C_t^{t'}$ is the transformation matrix from the geographic coordinates to the calculation coordinates; $C_b^{t}$ is the attitude matrix; $C_b^{t'}$ is the transformation matrix from the IMU coordinates to the calculation coordinates; $\omega_{ie}^{t}$ is the earth rotation rate; $\hat{\omega}_{ie}^{t} = \omega_{ie}^{t} + \delta\omega_{ie}^{t}$ is the value of $\omega_{ie}^{t}$ in the calculation coordinates; $\omega_{et}^{t}$ is the angular rate vector generated by the robot motion; $\hat{\omega}_{et}^{t} = \omega_{et}^{t} + \delta\omega_{et}^{t}$ is the value of $\omega_{et}^{t}$ in the calculation coordinates; $\hat{\omega}_{it}^{t}$ is the value of the sum of $\omega_{ie}^{t}$ and $\omega_{et}^{t}$ in the calculation geographic coordinates; $\delta\omega_{it}^{t}, \delta\omega_{ie}^{t}, \delta\omega_{et}^{t}$ are the calculation errors of $\hat{\omega}_{it}^{t}, \hat{\omega}_{ie}^{t}, \hat{\omega}_{et}^{t}$; $\hat{f}^{b} = f^{b} + \delta f^{b}$ is the specific force; $\delta f^{b}$ is the accelerometer error; $\delta g$ is the gravity error; $B$ is the latitude; $\lambda$ is the longitude; $h$ is the height; $R_m$ is the radius of the meridian, and $R_n$ is the radius of the prime vertical circle.

Nonlinear system model state equation and observation equation. Selecting a 15-dimensional state vector, the state equation is

$$X_k = f(X_{k-1}) + w_k \quad (6)$$

where $X_k = [\,\delta\Phi_k\; \delta\omega_k\; \delta L_k\; \delta V_k\; \delta a_k\,]^T$ is the state vector; $\delta\Phi_k = [\,\phi_x\; \phi_y\; \phi_z\,]$ represents the east, north, and azimuth misalignment angles at time $k$; $\delta\omega_k = [\,\omega_x\; \omega_y\; \omega_z\,]$ is the three-axis gyroscope constant drift error at time $k$; $\delta L_k = [\,\delta x\; \delta y\; \delta z\,]$ is the position error at time $k$; $\delta V_k = [\,\delta v_x\; \delta v_y\; \delta v_z\,]$ is the velocity error at time $k$; and $\delta a_k = [\,\delta a_x\; \delta a_y\; \delta a_z\,]$ is the constant error of the accelerometer at time $k$. Taking the velocity difference and position difference calculated by the separate IMU and Lidar algorithms as the measurements, the observation equation is

$$Z_k = H X_k + n_k \quad (7)$$

where the measurement vector at time $k$ is $Z_k = [\,\delta v_x^{DN}\; \delta v_y^{DN}\; \delta v_z^{DN}\; \delta L^{DN}\; \delta\lambda^{DN}\; \delta h^{DN}\,]^T$, the observation matrix is $H = [\,0_{6\times 3}\; I_{6\times 6}\; 0_{6\times 6}\,]$, and $n_k$ is the observation noise vector. $DN$ denotes the velocity and position difference between the IMU and Lidar solutions.

Calculation process of RKF

This section introduces the RKF calculation process.

System model analysis

Assume the following nonlinear discrete system:

$$\begin{cases} X_k = f(X_{k-1}) + w_k \\ Z_k = h(X_k) + v_k \end{cases}, \quad k = 0, 1, \ldots, N-1 \quad (8)$$

where $X_k$ represents the state vector of the system at time $k$; $Z_k$ is the measurement vector of the system at time $k$; $f(\cdot)$ is the nonlinear state equation; $h(\cdot)$ is the nonlinear observation equation; $w_k$ and $v_k$ are the system noise vector and the measurement noise vector at time $k$, respectively.

Principle of rank sampling

The RKF algorithm calculates the sampling points and their weights according to the correlation principle of rank statistics; therefore, there is no requirement for the system to follow the Gaussian distribution assumption.12,13 The idea of rank statistics is incorporated into the filtering algorithm that determines the sampling type, and the quantiles are used as sampling points to construct sets of sampling points corresponding to different distributions, that is,

$$\chi_{k-1}^{i} = \hat{X}_{k-1} + r_j\, l_{p_j}\left(\sqrt{P_{k-1}}\right)_{l}, \quad l = 1, 2, \ldots, n, \quad \begin{cases} i = (j-1)n + l, & j = 1, 2, \ldots, r \\ i = (j-2)n + l, & j = r+2, r+3, \ldots, 2r+1 \end{cases} \quad (9)$$

In this formula, $\chi_{k-1}^{i}$ $(1 \le i \le 2rn)$ represents sampling point $i$ of $X_{k-1}$; $n$ is the dimension of the state vector; $r$ is the total number of layers sampled; $2rn$ is the total number of sampling points; $P_{k-1}$ is the covariance matrix of the state; $l_{p_j}$ represents the quantile of the one-dimensional Gaussian distribution probability $p_j$ of the state estimation error $e$; $p_j$ is the probability corresponding to the sampling points of layer $j$, with typical values $(j - 0.3)/(2r + 1.4)$ or $j/(2r + 2)$; and $r_j$ is the correction coefficient corresponding to the sampling points of layer $j$.
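A sketch of the layered sampling in equation (9), under simplifying assumptions: the correction coefficients r_j are taken as 1, the layer probabilities as p_j = j/(2r + 2), and the matrix square root as a Cholesky factor; the function name is mine:

```python
from statistics import NormalDist

import numpy as np


def rank_sampling_points(x_hat: np.ndarray, P: np.ndarray, r: int) -> np.ndarray:
    """Construct the 2*r*n sampling points of equation (9).

    Layers run j = 1..2r+1; the middle layer j = r+1 is the median
    (quantile 0) and is skipped, leaving 2r layers. Each kept layer
    shifts x_hat along every column l of sqrt(P) by the Gaussian
    quantile l_{p_j}, with correction coefficients r_j = 1.
    """
    n = x_hat.size
    S = np.linalg.cholesky(P)          # one choice of matrix square root of P
    points = []
    for j in range(1, 2 * r + 2):
        if j == r + 1:                 # median layer: skipped
            continue
        quantile = NormalDist().inv_cdf(j / (2 * r + 2))
        for l in range(n):
            points.append(x_hat + quantile * S[:, l])
    return np.asarray(points)          # shape (2*r*n, n)
```

Because the kept quantiles are symmetric about zero, the points average back to x_hat, which is what keeps the state prediction of equation (13) unbiased for a linear state equation.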

These correction coefficients satisfy

$$\sum_{j=1,\, j \ne r+1}^{2r+1} r_j\, l_{p_j} = 0 \quad (10)$$

RKF calculation process

The specific steps of the RKF algorithm are as follows.

First step: time update process.

(1) According to the rank sampling mechanism shown in section "Principle of rank sampling," the sampling point set $\{\chi_{k-1|k-1}^{i}\}$ is obtained:

$$\chi_{k-1|k-1}^{i} = \hat{X}_{k-1|k-1} + r_j\, l_{p_j}\left(\sqrt{P_{k-1|k-1}}\right)_{l}, \quad l = 1, 2, \ldots, n, \quad \begin{cases} i = (j-1)n + l, & j = 1, 2, \ldots, r \\ i = (j-2)n + l, & j = r+2, r+3, \ldots, 2r+1 \end{cases} \quad (11)$$

(2) Calculate the set of sampling points iterated through the nonlinear state equation, that is, $\{X_{k|k-1}^{i*}\}$:

$$X_{k|k-1}^{i*} = f\left(\chi_{k-1|k-1}^{i}, w_{k-1}\right), \quad i = 1, 2, \ldots, 2rn \quad (12)$$

(3) Predict the state:

$$\hat{X}_{k|k-1} = \frac{1}{2rn}\sum_{i=1}^{2rn} X_{k|k-1}^{i*} \quad (13)$$

(4) Calculate the covariance matrix of the prediction error:

$$P_{k|k-1} = \frac{r_i}{\omega}\sum_{i=1}^{2rn}\left(X_{k|k-1}^{i*} - \hat{X}_{k|k-1}\right)\left(X_{k|k-1}^{i*} - \hat{X}_{k|k-1}\right)^T + Q_{k-1} \quad (14)$$

In the above formula, $r_i$ represents the covariance correction coefficient corresponding to sampling point $i$ (for convenience, it is taken as 1 in this article); $\omega$ represents the covariance weighting coefficient; and $Q$ is the system noise covariance matrix. The relevant calculation formula is

$$\omega = \sum_{j=1,\, j \ne r+1}^{2r+1} r_j^2\, l_{p_j}^2 \quad (15)$$

Second step: measurement update process.

(1) Re-rank sampling to get the sampling point set $\{\chi_{k|k-1}^{i}\}$:

$$\chi_{k|k-1}^{i} = \hat{X}_{k|k-1} + r_j\, l_{p_j}\left(\sqrt{P_{k|k-1}}\right)_{l}, \quad l = 1, 2, \ldots, n, \quad \begin{cases} i = (j-1)n + l, & j = 1, 2, \ldots, r \\ i = (j-2)n + l, & j = r+2, r+3, \ldots, 2r+1 \end{cases} \quad (16)$$

(2) Calculate the measurement values iterated through the nonlinear observation equation, that is, $\{Z_{k|k-1}^{i}\}$:

$$Z_{k|k-1}^{i} = h\left(\chi_{k|k-1}^{i}, v_k\right), \quad i = 1, 2, \ldots, 2rn \quad (17)$$

(3) Estimate the measurement value:

$$\hat{Z}_{k|k-1} = \frac{1}{2rn}\sum_{i=1}^{2rn} Z_{k|k-1}^{i} \quad (18)$$

(4) Measurement update covariance matrix:

$$P_{ZZ,k|k-1} = \frac{r_i}{\omega}\sum_{i=1}^{2rn}\left(Z_{k|k-1}^{i} - \hat{Z}_{k|k-1}\right)\left(Z_{k|k-1}^{i} - \hat{Z}_{k|k-1}\right)^T + R_k \quad (19)$$

where $R$ is the measurement covariance matrix.

(5) Measurement cross-covariance matrix:

$$P_{XZ,k|k-1} = \frac{r_i}{\omega}\sum_{i=1}^{2rn}\left(\chi_{k|k-1}^{i} - \hat{X}_{k|k-1}\right)\left(Z_{k|k-1}^{i} - \hat{Z}_{k|k-1}\right)^T \quad (20)$$

(6) Calculate the gain matrix:

$$K_k = P_{XZ,k|k-1}\, P_{ZZ,k|k-1}^{-1} \quad (21)$$

(7) State estimation:

$$\hat{X}_{k|k} = \hat{X}_{k|k-1} + K_k\left(Z_k - \hat{Z}_{k|k-1}\right) \quad (22)$$

(8) Covariance matrix of the state estimation:

$$P_{k|k} = P_{k|k-1} - K_k\, P_{ZZ,k|k-1}\, K_k^T \quad (23)$$

Experimental analysis

Experiment environment for indoor positioning analysis

The Lidar used in this experiment is the R-Fans 16-line. This product is based on the time-of-flight ranging principle to obtain point cloud data with three-dimensional scanning. Its positioning principle is based on relative motion estimation from point cloud matching, which decomposes the motion in three-dimensional space into a three-dimensional rotation matrix $R$ and a three-dimensional translation vector $t$. The performance index parameters are listed in Table 1. The IMU is an Xsens MTi. It consists of a three-axis gyroscope, a three-axis accelerometer, and a three-axis magnetometer; its technical parameters are listed in Table 2. The experimental equipment is shown in Figure 3. The ground truth of the mobile robot in the experiment is provided by the optical space positioning system Realis Tracking System, which can be applied to the navigation trajectories of sensors, robots, or drones.
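Steps (11)–(23) can be condensed into one predict/update cycle. The sketch below assumes r_j = r_i = 1, layer probabilities j/(2r + 2), a Cholesky square root, and additive noise handled through Q and R rather than passed into f and h; all names are mine:

```python
from statistics import NormalDist

import numpy as np


def rank_points(x_hat, P, r):
    """Rank sampling of equations (11)/(16) with correction coefficients 1."""
    n = x_hat.size
    S = np.linalg.cholesky(P)
    qs = [NormalDist().inv_cdf(j / (2 * r + 2))
          for j in range(1, 2 * r + 2) if j != r + 1]
    return np.array([x_hat + q * S[:, l] for q in qs for l in range(n)])


def rkf_step(x_hat, P, z, f, h, Q, R, r=2):
    """One RKF cycle: time update (11)-(15), then measurement update (16)-(23)."""
    qs = [NormalDist().inv_cdf(j / (2 * r + 2))
          for j in range(1, 2 * r + 2) if j != r + 1]
    omega = sum(q * q for q in qs)                    # eq (15) with r_j = 1

    # Time update.
    chi = rank_points(x_hat, P, r)                    # eq (11)
    X_prop = np.array([f(c) for c in chi])            # eq (12)
    x_pred = X_prop.mean(axis=0)                      # eq (13)
    d = X_prop - x_pred
    P_pred = d.T @ d / omega + Q                      # eq (14)

    # Measurement update.
    chi2 = rank_points(x_pred, P_pred, r)             # eq (16)
    Z_prop = np.array([h(c) for c in chi2])           # eq (17)
    z_pred = Z_prop.mean(axis=0)                      # eq (18)
    dz = Z_prop - z_pred
    P_zz = dz.T @ dz / omega + R                      # eq (19)
    P_xz = (chi2 - x_pred).T @ dz / omega             # eq (20)
    K = P_xz @ np.linalg.inv(P_zz)                    # eq (21)
    x_new = x_pred + K @ (z - z_pred)                 # eq (22)
    P_new = P_pred - K @ P_zz @ K.T                   # eq (23)
    return x_new, P_new
```

For a linear f and h this reduces to the ordinary Kalman update, which is a convenient sanity check on the sampling weights.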

Table 1. R-Fans-16 parameter table.

Parameter                      Numerical specification
Maximum detection distance     200 m
Minimum detection distance     <0.1 m
Horizontal angle of view       360°
Distance resolution            3 mm
Frame measurement rate         5–20 Hz
Measurement accuracy           <3 cm

Table 2. IMU parameter table.

Parameter                                  Numerical specification
Zero bias stability of the gyroscope       1°/s
Gyroscope range                            ±300°/s
Zero bias stability of the accelerometer   0.02 m/s²
Accelerometer range                        ±50 m/s²
Zero bias stability of the magnetometer    0.1 × 10⁻⁷ Tesla
Magnetometer range                         ±7.5 × 10⁻⁵ Tesla

IMU: inertial measurement unit.

Figure 3. Mobile robot data acquisition system.

Figure 4. TRACKER software operation interface.

This optical system is currently the only high-precision positioning system that can meet the submillimeter level of indoor positioning. The trajectory information is provided by the corresponding TRACKER software, as shown in Figure 4. We conducted experiments on two trajectories: a small-scale arc trajectory environment and a large-scale rectangular trajectory environment.

Small-scale arc trajectories environment test analysis

To verify the positioning accuracy of the algorithm, we conducted a navigation experiment in an indoor environment. The experimental site is the China BeiDou unmanned system comprehensive experimental innovation base at Shanghai Jiao Tong University, China, as shown in Figure 5. In the small-scale arc trajectory environment, the robot starts from a selected point and finally returns to the same point; the walking path is a closed loop. Each experiment includes Lidar-alone and Lidar/IMU integrated positioning tests and analyses. A KF algorithm for the Lidar-alone sensor, an EKF fusion algorithm for the Lidar/IMU integrated system, and an RKF fusion algorithm for the Lidar/IMU integrated system are used to estimate the location, and the positioning errors estimated by the three algorithms are compared and analyzed. Since the IMU-alone test accuracy for this loop path was at the meter level in tests done by our group several years ago,22 and is not at the centimeter accuracy level discussed here, we did not compare with the IMU-alone test results in this article.

During the positioning experiment, the trajectory is 16.28 m long. The robot starts from the starting point A (7.5057, 2.9011), walks clockwise around the polygon at a speed not exceeding 20 cm per second, and finally returns to point A. There are four turning points in the walking trajectory, namely A, B, C, and D. Among them, the three turning points A, B, and D have relatively large angles, while the turning angle at C is relatively small. The nonlinearity and non-Gaussian characteristics of the system are more obvious at the turning points, which is useful for evaluating the performance of the algorithm in solving nonlinear and non-Gaussian problems. The trajectory solved by the Lidar-alone algorithm, the trajectory solved by the Lidar/IMU integrated system EKF fusion algorithm, and the trajectory solved by the Lidar/IMU integrated system RKF fusion algorithm are compared with the ground truth. The comparison result is shown in Figure 6: the black line is the ground truth, the green line is the trajectory solved by the Lidar-alone algorithm, the blue line is the trajectory solved by the Lidar/IMU EKF fusion algorithm, and the red line is the trajectory solved by the Lidar/IMU RKF fusion algorithm. (a) is the

Figure 5. Experimental environment.

trajectory solved by the Lidar-alone algorithm compared with the ground truth, (b) is the trajectory solved by the Lidar/IMU integrated system EKF fusion algorithm compared with the ground truth, (c) is the trajectory solved by the Lidar/IMU integrated system RKF fusion algorithm compared with the ground truth, and (d) is the comparison of all trajectories.

First, the Lidar-alone positioning test is analyzed. In section AB, the walking path of the robot is a straight line; during this time, due to the linear characteristics of the system, no large positioning error occurs. When the robot turns at points B, C, D, and A, the system exhibits nonlinear and non-Gaussian characteristics caused by the differential rotation of the two wheels, which makes the mobile robot vibrate. At these times the nonlinearity and non-Gaussian characteristics of the system are enhanced, and the Lidar is prone to scan matching errors; as a result, the trajectory fluctuates greatly and trajectory drift occurs. Over the whole process, the maximum error in the X direction is 0.1489 m, and the maximum error in the Y direction is 0.1456 m. As time goes on, due to the limited ability of the Lidar loop closing function, the cumulative error is difficult to eliminate and increases significantly. As can be seen from Figure 6(a), the mobile robot did not eventually return to the end point but returned to the point (7.4533, 2.9492).

Second, the Lidar/IMU EKF integrated system positioning test is analyzed. When the robot turns at B, the trajectory still fluctuates and there is a little trajectory drift. This is because the robustness of the EKF algorithm in processing nonlinear systems and non-Gaussian noise is not very strong, so there are certain limitations. However, compared with the Lidar-alone algorithm, the trajectory solved by the Lidar/IMU EKF fusion algorithm was closer to the ground truth over the whole process.

Finally, the Lidar/IMU RKF integrated system positioning test is analyzed. During the whole trajectory, the positioning error of the mobile robot is very small, staying basically close to the ground truth. Over the whole process, the maximum error in the X direction is 0.0753 m, and the maximum error in the Y direction is 0.0479 m. It can be seen that the RKF algorithm has a stronger ability to restrain the error divergence caused by nonlinear systems and non-Gaussian noise. Although IMU position drift error exists, the attitude deviation of the Xsens MTi is relatively small due to its better attitude constraints, and the attitude information collected by the MTi is used through the RKF to compensate for the posture error caused by Lidar-alone scan matching, which improves the positioning accuracy of the direction angle and also improves the stability of the integrated system. In addition, by adding IMU sensing information, the cumulative errors14,18 in Lidar data processing are compensated and updated in real time, so the fusion algorithm handles cumulative errors better; it can be seen from Figure 6(c) that the last position (7.5032, 2.9110) to which the robot returns (the end point) is very close to the starting point A (7.5057, 2.9011). Further quantitative analysis is shown in Table 3.

Table 3 shows the error statistics in the X and Y directions for the three algorithms for indoor positioning, where STD is the standard deviation, MAX is the maximum error value, and mean is the average error.
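The three statistics in Table 3 can be reproduced from per-epoch errors along one axis. A minimal sketch (the article does not state whether STD is taken over signed or absolute errors, so absolute errors are assumed here, and the sample values are placeholders, not the experimental data):

```python
import numpy as np


def error_statistics(errors) -> dict:
    """STD (standard deviation), MAX (maximum error value), and mean
    (average error) of the absolute per-epoch errors along one axis."""
    abs_err = np.abs(np.asarray(errors, dtype=float))
    return {"STD": float(abs_err.std()),
            "MAX": float(abs_err.max()),
            "Mean": float(abs_err.mean())}
```

Applying this to the X-direction and Y-direction error series of each algorithm yields one row pair of Table 3 per method.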

Compared with the results of the Lidar-alone KF algorithm and the Lidar/IMU integrated system with EKF, the Lidar/IMU integrated system with RKF obtains better results. It can be seen from Table 3 that nonlinearity and non-Gaussian properties have a greater impact on Lidar-alone positioning with KF and on the Lidar/IMU integrated system with EKF, but the Lidar/IMU integrated system with the RKF fusion algorithm deals with this situation very well and has better accuracy than the Lidar-alone positioning algorithm.

To further verify the positioning accuracy of the algorithm through multiple experiments, we did five tests around the polygon and obtained five sets of data for analysis. The comparison between the Lidar-alone algorithm and the RKF algorithm is shown in Table 4, and the comparison between the EKF algorithm and the RKF algorithm is shown in Table 5. As can be seen, the mean X-direction error of the Lidar-alone algorithm is 0.0928 m and the mean Y-direction error is 0.0772 m; the mean X-direction error of the Lidar/IMU integrated system with EKF is 0.0597 m and the mean Y-direction error is 0.0537 m; while the mean X-direction error of the Lidar/IMU integrated system with RKF is 0.0451 m and the mean Y-direction error is 0.0405 m. As a result, compared with the Lidar-alone algorithm, the new algorithm improves the mean accuracy rate by 46.39% in the X direction and 48.40% in the Y direction. Compared with the EKF algorithm, the new algorithm improves the mean accuracy rate by 24.46% in the X direction and 24.58% in the Y direction.
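As an arithmetic check, the EKF-versus-RKF improvement rates quoted above follow directly from the mean errors as (old − new)/old:

```python
def improvement_rate(old_err: float, new_err: float) -> float:
    """Percentage reduction of the mean error: (old - new) / old * 100."""
    return (old_err - new_err) / old_err * 100.0


# Mean errors over the five tests, EKF versus RKF (from Table 5).
x_rate = improvement_rate(0.0597, 0.0451)  # X direction
y_rate = improvement_rate(0.0537, 0.0405)  # Y direction
print(round(x_rate, 2), round(y_rate, 2))  # 24.46 24.58
```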

Large-scale rectangular trajectories environment test analysis
To ensure the reliability of the results, we repeated the experiment several times in a large-scale rectangular trajectory environment. The trajectory is 54.7 m long. The trajectory solved by the Lidar/IMU integrated system EKF fusion algorithm and the trajectory solved by the Lidar/IMU integrated system RKF fusion algorithm are compared with the ground truth; the comparison result is shown in Figure 7. The black line is the ground truth, the blue line is the trajectory solved by the Lidar/IMU EKF fusion algorithm, and the red line is the trajectory solved by the Lidar/IMU RKF fusion algorithm. The black solid point is the starting point, the black hollow point is the end point of the RKF algorithm, and the green hollow point is the end point of the EKF algorithm. Table 6 shows the error statistics in the X and Y directions for the two algorithms for indoor positioning. With the EKF algorithm, the maximum error in the X direction is 0.3472 m and the maximum error in the Y direction is 0.4181 m; with the RKF algorithm, the maximum error in the X direction is 0.2321 m and the maximum error in the Y direction is 0.2516 m.
Figure 6. The trajectories of the small-scale arc environment. (a) The trajectory solved by the Lidar-alone algorithm and the ground truth comparison figure; (b) the trajectory solved by the Lidar/IMU integrated system EKF fusion algorithm and the ground truth comparison figure; (c) the trajectory solved by the Lidar/IMU integrated system RKF fusion algorithm and the ground truth comparison figure; (d) all trajectories comparison figure.

To ensure the reliability of the results, we repeated the experiment several times in the large-scale rectangular trajectory. The comparison between the EKF algorithm and the RKF algorithm is shown in Table 7: the mean X-direction error with the EKF algorithm is 0.1147 m and the mean Y-direction error is 0.1052 m, while the mean

Table 3. Results of error statistics of three positioning methods.

Axis direction  Technical index  Lidar (m)  EKF Lidar/IMU (m)  RKF Lidar/IMU (m)
X direction     STD              0.0281     0.0214             0.0159
                MAX              0.1489     0.1120             0.0753
                Mean             0.0936     0.0667             0.0420
Y direction     STD              0.0222     0.0141             0.0068
                MAX              0.1456     0.0935             0.0479
                Mean             0.0630     0.0467             0.0305

Table 4. Lidar-only algorithm and RKF algorithm comparison.

Experiments  Lidar X  Lidar Y  RKF Lidar/IMU  RKF Lidar/IMU  Increased accuracy  Increased accuracy
             mean/m   mean/m   X mean/m       Y mean/m       rate X (%)          rate Y (%)
1            0.0856   0.0930   0.0490         0.0427         57.24               45.91
2            0.0936   0.0630   0.0420         0.0305         44.87               48.41
3            0.0674   0.1041   0.0372         0.0613         40.36               38.89
4            0.0989   0.0612   0.0456         0.0384         46.11               62.75
5            0.1187   0.0647   0.0515         0.0298         43.39               46.02
Mean         0.0928   0.0772   0.0451         0.0405         46.39               48.40

RKF: rank Kalman filtering; IMU: inertial measurement unit.

Table 5. EKF algorithm and RKF algorithm comparison.

Experiments  EKF Lidar/IMU  EKF Lidar/IMU  RKF Lidar/IMU  RKF Lidar/IMU  Increased accuracy  Increased accuracy
             X mean/m       Y mean/m       X mean/m       Y mean/m       rate X (%)          rate Y (%)
1            0.0664         0.0547         0.0490         0.0427         26.20               21.94
2            0.0519         0.0555         0.0420         0.0305         19.08               45.05
3            0.0478         0.0706         0.0372         0.0613         22.18               13.17
4            0.0625         0.0412         0.0456         0.0384         27.04               6.80
5            0.0701         0.0465         0.0515         0.0298         26.53               35.91
Mean         0.0597         0.0537         0.0451         0.0405         24.46               24.58

EKF: extended Kalman filter; RKF: rank Kalman filtering; IMU: inertial measurement unit.
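The "increased accuracy rate" columns of Table 5 are consistent with the relative reduction of the mean error, (EKF − RKF)/EKF × 100 (Table 4 appears to follow a different convention). A one-line sketch reproducing two Table 5 entries:

```python
def improvement_pct(baseline_err, fused_err):
    """Relative error reduction in percent: (baseline - fused) / baseline * 100."""
    return (baseline_err - fused_err) / baseline_err * 100.0

# X-direction mean errors for experiments 1 and 2 of Table 5 (EKF vs. RKF)
print(round(improvement_pct(0.0664, 0.0490), 2))  # 26.2
print(round(improvement_pct(0.0519, 0.0420), 2))  # 19.08
```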

Figure 7. The trajectories of large-scale rectangular environment.


Table 6. Results of error statistics of two positioning methods.

Axis direction  Technical index  EKF Lidar/IMU (m)  RKF Lidar/IMU (m)
X direction     MAX              0.3472             0.2321
                Mean             0.1067             0.0718
Y direction     MAX              0.4181             0.2516
                Mean             0.0984             0.0707

EKF: extended Kalman filter; RKF: rank Kalman filtering; IMU: inertial measurement unit.

Table 7. EKF algorithm and RKF algorithm comparison.

Experiments  EKF Lidar/IMU  EKF Lidar/IMU  RKF Lidar/IMU  RKF Lidar/IMU  Increased accuracy  Increased accuracy
             X mean/m       Y mean/m       X mean/m       Y mean/m       rate X (%)          rate Y (%)
1            0.1124         0.1136         0.0981         0.0721         12.69               36.54
2            0.1280         0.0899         0.0902         0.0709         29.56               21.17
3            0.1222         0.0950         0.0938         0.0719         23.25               24.36
4            0.0972         0.1257         0.0730         0.1019         26.12               18.91
5            0.1135         0.1019         0.0822         0.0761         27.58               25.32
Mean         0.1147         0.1052         0.0875         0.0786         23.84               25.26

EKF: extended Kalman filter; RKF: rank Kalman filtering; IMU: inertial measurement unit.
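The mean improvement rates in the last row of Table 7 are the averages of the five per-experiment rates; a quick arithmetic check:

```python
rates_x = [12.69, 29.56, 23.25, 26.12, 27.58]  # Table 7, X direction
rates_y = [36.54, 21.17, 24.36, 18.91, 25.32]  # Table 7, Y direction

mean_rate_x = sum(rates_x) / len(rates_x)
mean_rate_y = sum(rates_y) / len(rates_y)
print(round(mean_rate_x, 2), round(mean_rate_y, 2))  # 23.84 25.26
```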

value of the X direction error with the RKF algorithm is
0.0875 m, and the mean value of the Y direction error is
0.0786 m.

Conclusions

Aiming at the problems of low positioning accuracy of
Lidar SLAM with nonlinear and non-Gaussian noise
characteristics, this article uses a Lidar and IMU integrated
navigation system and proposes a fusion RKF algorithm to
realize positioning estimation of an indoor mobile robot.
The RKF algorithm is the core of the fusion method. With
its unique sampling mechanism, it can improve the
positioning accuracy of the indoor mobile robot under a
nonlinear model and non-Gaussian noise model. Finally,
multiple experimental results and their analysis verify that
the proposed algorithm improves the accuracy and
robustness of the positioning system compared with the
Lidar-only positioning algorithm. It has higher positioning
accuracy, better system stability, and stronger robustness,
and is of great value in trajectory calculation.

Declaration of conflicting interests

The author(s) declared no potential conflicts of interest with
respect to the research, authorship, and/or publication of this
article.

Funding

The author(s) disclosed receipt of the following financial
support for the research, authorship, and/or publication of this
article: This work is supported by the projects of the National
Natural Science Foundation of China (Grant No. 41764002) and
the national key technologies research and development
program (No. 2016YFB0502002).

ORCID iD

Hang Guo https://orcid.org/0000-0002-5444-0285
