


Lidar-IMU and Wheel Odometer Based Autonomous Vehicle Localization System

Shaojiang Zhang1, Yanning Guo1, Qiang Zhu2, Zhiyuan Liu1

1. Dept. of Control Science and Engineering, Harbin Institute of Technology, Harbin 150001, China
E-mail: [email protected]
2. Tianjin CATARC Data Ltd, Tianjin 300300, China
E-mail: [email protected]

Abstract: Localization is a critical issue in autonomous navigation and path planning. The traditional localization method for automatic driving is to use GPS, but in many cases, such as near high buildings, under viaducts, in basements or in tunnels, the GPS signal disappears or is weakened, resulting in localization failure. This paper proposes a precise localization method based on Lidar, IMU and wheel odometer combined with point cloud map matching. Depth point cloud information collected by the Lidar is matched against the pre-built point cloud map, and a Point-to-Plane Iterative Closest Point (PP-ICP) algorithm is introduced. In order to avoid mismatches and localization failure, the starting point of the autonomous vehicle is set to the initial coordinates of the point cloud map. Then, a predicted state equation is established from the throttle and steering wheel controls of the autonomous vehicle. The first observation equation consists of the acceleration and angular velocity measured by the IMU. The second observation equation consists of the position and attitude measured by Lidar point cloud matching, along with the speed and distance measured by the wheel odometer. Finally, an extended Kalman filter fuses the information from the three sensors to update and correct the localization. Multi-sensor fusion experiments were carried out in underground garages, and the results show that the robustness and positioning accuracy meet engineering requirements.
Key Words: Autonomous Vehicle, Multi-Sensor Fusion, EKF, PP-ICP, Point-Cloud Map

1 INTRODUCTION

The autonomous vehicle, that is, the driverless smart car, relies on its onboard control system and intelligent algorithms, without human intervention, to command the vehicle's underlying protocol through the fusion of data from a variety of onboard sensors and thus complete the normal driving function. The automatic driving system is a comprehensive integrated system. As shown in Figure 1, its framework can be divided into three parts: the perception layer, the decision layer and the control layer.

Fig 1. Technical architecture of an autonomous vehicle: the perception layer (camera, Lidar, radar, IMU and GPS feeding object recognition, localization and the HD map), the decision layer (set destination, path planning and decision instructions) and the control layer (control instructions for variable speed, throttle and steering wheel to follow, turn and brake).

The perception layer covers multi-sensor data acquisition, processing and fusion, accurate and comprehensive sensing of the surrounding environment information, and precise positioning within the environment. The decision layer and the control layer then enable the vehicle to drive safely and reliably on the road.

The key technologies for autonomous driving are vehicle control and precise vehicle positioning [1]. Traditional automatic driving positioning relies on GPS [2], but under real driving circumstances, such as near high buildings, in underground garages, in tunnels and so on, GPS signals become unstable or are lost, which can cause the vehicle to get out of control. In Ref. [3], Marin-Plaza et al. studied an autonomous positioning method using stereo vision. It offers an accurate mathematical model for solving the pose, but the computation is relatively heavy, and when the texture is pale or the light is dark, feature extraction or matching on the captured image fails; from the perspective of real-time performance and robustness, this approach cannot meet the needs of autonomous driving. The track estimation method proposed by Eling et al. [4] is used to calculate the pose; the algorithm is simple, but the IMU has a large integration error and does not meet the centimeter-level localization accuracy required for autonomous vehicles. In Ref. [5], Liu et al. studied adding RFID tags to the external environment, and Hu et al. [6] use two-dimensional codes with an adaptive Monte Carlo localization algorithm to calculate position and attitude. In Ref. [7], Babin et al. use a local WiFi network module for localization. Externally adding markers is simple and accurate, but in large-scale scenarios or under extreme circumstances the external conditions are restricted and markers cannot be placed.

This paper first analyzes the shortcomings of using a single sensor, and then uses an IMU, Lidar and wheel odometer multi-sensor fusion method to achieve precise positioning of the autonomous vehicle. A point-to-plane ICP matching algorithm is proposed. After the vehicle is calibrated at the initial position of the point cloud data map, the extended Kalman filter algorithm is used to predict and update the state of the autonomous vehicle. Combined with the other localization solutions, a real-time, accurate and robust autonomous vehicle localization system is achieved that meets engineering needs.

(This work is supported by the National Natural Science Foundation of China, Grant Nos. 61673135, 61603114 and 61876050.)
2 LOCALIZATION TYPE COMPARISON

The sensors used in this paper fall into two types: the track speculation (dead-reckoning) type and the feature matching type.

2.1 Track Speculation Type

The principle of the track estimation algorithm is to infer the current position and attitude from the position and attitude at the previous moment. Position, attitude and speed are obtained from the IMU, which is composed of an accelerometer and a gyroscope. The basic working principle of the IMU is Newton's laws of mechanics: the velocity is obtained by integrating the acceleration, and the displacement is obtained by integrating the velocity. Angular velocity is measured by the gyroscope. The advantage of the IMU is its very high output frequency and its high precision over short periods; the disadvantage is that its error accumulates with time, and the influence of motion acceleration on the calculation needs to be eliminated.

The wheel odometer obtains the wheel speed from an encoder installed on the wheel axle, converts the speed into travel distance through the wheel radius, and superimposes this onto the vehicle's position at the previous moment to calculate the current position. The biggest advantage of the wheel odometer is stability: it depends only on the ground surface and is unaffected by other changes in the scene. The disadvantage is that a slipping wheel causes large deviations, and only two-dimensional information can be provided.
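To illustrate the dead-reckoning principle, the following is a minimal planar sketch, assuming a wheel-encoder speed and a gyroscope yaw rate as inputs (the function and variable names are ours, not the paper's):

import numpy as np

def dead_reckon(pose, v_wheel, yaw_rate, dt):
    """One planar dead-reckoning step. pose = (x, y, yaw);
    v_wheel comes from the wheel encoder, yaw_rate from the gyroscope.
    Errors accumulate over time, which is why map matching is needed."""
    x, y, yaw = pose
    yaw += yaw_rate * dt             # integrate the gyro angular rate
    x += v_wheel * dt * np.cos(yaw)  # project the travel onto the heading
    y += v_wheel * dt * np.sin(yaw)
    return (x, y, yaw)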
2.2 Feature Matching Type

The feature matching type is based on matching the Lidar point cloud against a point cloud map of the surrounding scene. Commonly used matching localization algorithms include the Iterative Closest Point (ICP) algorithm and probability histogram filter matching [8]. This paper uses the computationally faster PP-ICP algorithm.

The Iterative Closest Point (ICP) algorithm registers one frame of point cloud data against a point cloud data set. Figure 2 shows a simple 2D point cloud registration diagram and Figure 3 a 3D one. PR (red point cloud) and PB (blue point cloud) represent the Lidar at two different positions, the reference and the observed point cloud respectively. The algorithm translates and rotates PB so that PB and PR overlap as much as possible and the error reaches a minimum, and the corresponding transformation matrix is solved for pose correction. The ICP algorithm does not require a particular form of Lidar point cloud, and the relative motion change can be calculated from it. This algorithm is used in this paper to correct the position and attitude of the autonomous vehicle in the point cloud map and to optimize the yaw angle (\theta in Figure 2).

Fig 2. 2D point cloud registration. Fig 3. 3D point cloud registration.

Let the point cloud data collected by the first frame of Lidar data be P = \{p_1, p_2, ..., p_n\}, and let the point cloud data in the point cloud map corresponding to the pose state be P' = \{p'_1, p'_2, ..., p'_n\}. Then initialize a Euclidean transform such that

\forall i,\; p_i = R p'_i + t \qquad (1)

By matching a standard set of point clouds, the error term for the i-th point is defined as

e_i = p_i - (R p'_i + t) \qquad (2)

Then construct a least-squares objective function and solve it so that the sum of squared errors reaches a minimum:

\min_{R,t} J = \frac{1}{2} \sum_{i=1}^{n} \left\| p_i - (R p'_i + t) \right\|_2^2 \qquad (3)

In order to improve the real-time performance and robustness of the algorithm, the Point-to-Plane ICP (PP-ICP) algorithm is used: it optimizes the distance error along the normal vector direction of the matched point cloud's tangent plane, which reduces the impact of erroneous points and mismatches on the calculation result. The optimized objective function is

\min_{R,t} J = \frac{1}{2} \sum_{i=1}^{n} \left\| \left( p_i - (R p'_i + t) \right) \cdot n_i \right\|_2^2 \qquad (4)

The SVD algebraic method or a nonlinear optimization method can then solve for the relative pose of the autonomous vehicle in the point cloud map and correct it in real time to accurately locate the vehicle.
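To make Eq. (4) concrete, here is a single Gauss-Newton iteration of point-to-plane ICP under a small-angle linearization of R; a sketch under assumed pre-computed correspondences and normals (the full algorithm iterates this with nearest-neighbor correspondence search, and all names are ours):

import numpy as np

def pp_icp_step(src, dst, normals):
    """One point-to-plane update. src: map points p' (N,3); dst: Lidar
    points p (N,3); normals: plane normals n at dst (N,3). Minimizes
    sum(((p - (R p' + t)) . n)^2), cf. Eq. (4), for R ~ I + skew(theta).
    Assumes N >= 6 well-distributed correspondences (J^T J invertible)."""
    # Jacobian rows: [ (p' x n)^T , n^T ] for the unknowns [theta, t]
    J = np.hstack([np.cross(src, normals), normals])       # (N,6)
    # residual at theta = 0, t = 0:  n . (p' - p)
    r = np.einsum('ij,ij->i', normals, src - dst)          # (N,)
    x = np.linalg.solve(J.T @ J, -J.T @ r)                 # [theta, t]
    theta, t = x[:3], x[3:]
    K = np.array([[0.0, -theta[2], theta[1]],
                  [theta[2], 0.0, -theta[0]],
                  [-theta[1], theta[0], 0.0]])
    R = np.eye(3) + K                                      # first-order rotation
    return R, t

In practice this step is iterated: after each solve, the source points are transformed with (R, t), correspondences are re-established, and the loop stops when the update becomes small.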
The advantage of Lidar localization is that it works without GPS and its robustness is very good; the disadvantages are that maps must be collected, built and maintained in advance, that bad weather degrades it, and that the sensor is for now expensive.

3 MULTI-SENSOR FUSION ALGORITHM

Multi-sensor data fusion is a technology that combines multiple input information sources into a unified data output, mainly to enhance the reliability and availability of the data. Based on the advantages and disadvantages of the above-mentioned sensors, this paper uses multi-sensor fusion for the autonomous vehicle localization solution. First, point cloud map data is collected in advance with the Lidar. In order to prevent mismatching and positioning failure, the autonomous vehicle is calibrated at the initial position of the point cloud map. The control of the vehicle includes throttle/brake and steering wheel angle, and the state of the vehicle at the next moment is predicted from these control variables. The IMU sensor provides the first observation equation of the autonomous vehicle. The wheel odometer and the Lidar provide the localization update values as the second observation equation. Then, the extended Kalman filter algorithm completes the multi-sensor fusion for precise localization.

3.1 Architecture of LINS on Wheel

LINS on Wheel means a Lidar inertial navigation system based on a wheel odometer. The sensor fusion architecture diagram is shown in Figure 4.

Fig 4. The architecture of LINS on Wheel: the gyroscope's angular rate and the accelerometer's acceleration are integrated (with a gravity model and the last navigation result), the wheel odometer supplies velocity, and the Lidar supplies the position and attitude initialization (R, t) against the point cloud map; the EKF fuses these into position, attitude and velocity.
The inputs of the point cloud localization are the predicted position and attitude provided by the fused localization, the point cloud data of the Lidar, and the pre-recorded point cloud map (with its starting coordinates). Point cloud localization mainly includes two steps, yaw heading angle optimization and transformation matrix calculation, corresponding to IMU track resolution and point cloud map matching alignment. Point cloud localization outputs R, t, yaw and z with respect to the map.

First, the yaw angle is optimized, because the fused position easily develops angle errors. The processing method uses the point-to-plane Iterative Closest Point algorithm (PP-ICP) to align and correct the current Lidar frame against the point cloud map frame. Then the difference and the Sum of Squared Differences (SSD) are computed: the reflection value of each Lidar point is subtracted from the corresponding value of the localization map, and the squares of the differences are summed; the smaller this measure, the better the position match. For the reflection values collected by the Lidar, histogram distributions are calculated separately and the weights of the histogram distributions are determined. An adaptive convergence algorithm is used to make the matching more accurate, as in the sketch below.
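A minimal sketch of this reflectance-matching measure; the histogram weighting shown is our assumed overlap heuristic (the paper does not give its exact form), and reflectances are assumed normalized to [0, 1]:

import numpy as np

def reflectance_ssd(scan_refl, map_refl):
    """Sum of Squared Differences between the reflection values of the
    matched Lidar points and the corresponding localization-map values;
    the smaller the score, the better the candidate pose."""
    d = scan_refl - map_refl
    return float(np.sum(d * d))

def histogram_weight(scan_refl, map_refl, bins=32):
    """Overlap of the two reflectance histograms, used here as an
    assumed weighting heuristic for the adaptive convergence step."""
    h1, edges = np.histogram(scan_refl, bins=bins, range=(0.0, 1.0), density=True)
    h2, _ = np.histogram(map_refl, bins=edges, density=True)
    return float(np.minimum(h1, h2).sum() * (edges[1] - edges[0]))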
3.2 Precise Localization Initialization

The initial precise positioning is critical when the autonomous vehicle enters a GPS-denied environment. If the initial position is not accurately located, there is a high probability of a missing or incorrect position in the subsequent localization, which causes a path offset during driving. This situation is very dangerous. In the simulation part of the experiments, runs without initialization indeed showed lost positions and track drift of the autonomous vehicle.

So in the initialization process, this paper uses the known precise initial coordinates of the point cloud data, combined with the initial coordinates of the point cloud map, and the PP-ICP algorithm corrects the exact initialization position value. With this, real-vehicle tests in the environment can be positioned accurately and stably.

3.3 Extended Kalman Filter (EKF) Fusion

After the autonomous vehicle is successfully initialized, the IMU is used to determine the acceleration, angular velocity and attitude traveled on the point cloud map. Through integration, the first observation equation yields the vehicle's speed, yaw angle and travel distance, but the attitude value accumulates error over time. The wheel odometer accurately measures driving speed and distance under the condition that the wheel does not slip. The point cloud data collected by the Lidar is matched against the known point cloud map as the detection update value. The second observation equation therefore consists of the detected values of the wheel odometer and the Lidar. Data fusion is performed by an extended Kalman filter (EKF) algorithm.

The equation of the motion state of the autonomous vehicle can be expressed as

x_k = A_k x_{k-1} + u_k + \epsilon_k \qquad (5)

Expanded, the equation of motion is

\begin{bmatrix} p_k \\ \varphi_k \\ v_k \end{bmatrix} = \begin{bmatrix} 1 & 0 & \Delta t \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} p_{k-1} \\ \varphi_{k-1} \\ v_{k-1} \end{bmatrix} + \begin{bmatrix} \frac{1}{2} a \Delta t^2 \\ w \Delta t \\ a \Delta t \end{bmatrix} + \epsilon_k \qquad (6)

where k = 0, 1, ..., N indexes the time steps and x_k, x_{k-1} are the states at moments k and k-1. The state x includes the position p, the yaw angle \varphi and the vehicle speed v; A_k is the state transition matrix, u_k is the control input, a is the acceleration caused by throttle or brake, w is the angular speed controlled by the steering wheel, and \epsilon_k is Gaussian noise.

The observation equation is

z_k = C_k x_k + v_k \qquad (7)

Expanded, the IMU's observation equation is

\begin{bmatrix} p_{imu} \\ \varphi_{imu} \\ v_{imu} \end{bmatrix} = C_k \begin{bmatrix} p_k \\ \varphi_k \\ v_k \end{bmatrix} + v_k \qquad (8)

and the Lidar and wheel odometer's observation equation is

\begin{bmatrix} p_{wheel} \\ \varphi_{Lidar} \\ v_{wheel} \end{bmatrix} = C_k \begin{bmatrix} p_k \\ \varphi_k \\ v_k \end{bmatrix} + v_k \qquad (9)

where z_k represents the observation of the state x_k and C_k is the observation coefficient matrix. The observation of the wheel odometer provides the position and speed in the world coordinate system; the observation of the Lidar provides the yaw angle and the transformation matrix. \epsilon_k and v_k represent the noise of the motion process and of the observation process respectively; both conform to Gaussian distributions, \epsilon_k \sim N(0, R) and v_k \sim N(0, Q), and the process noise enters randomly.
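As a concrete reading of the prediction model, here is a minimal sketch of Eqs. (5)-(6) with the noise term omitted (the names are ours):

import numpy as np

def predict_state(x, a, w, dt):
    """Propagate the state x = [p, phi, v] with the motion model of
    Eqs. (5)-(6): A is the state transition matrix and u the control
    input built from the throttle/brake acceleration a and the
    steering angular speed w."""
    A = np.array([[1.0, 0.0, dt],
                  [0.0, 1.0, 0.0],
                  [0.0, 0.0, 1.0]])
    u = np.array([0.5 * a * dt**2, w * dt, a * dt])
    return A @ x + u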
However, in practice, the vehicle's motion equation and the observation equation are both nonlinear functions. Therefore, the nonlinear model is linearized using a first-order Taylor expansion; after the linearization of the system, the Gaussian distribution is still satisfied:

x_k = f(\hat{x}_{k-1}, u_k) \approx f(\hat{x}_{k-1}, u_k) + \left. \frac{\partial f}{\partial x_{k-1}} \right|_{\hat{x}_{k-1}} (x_{k-1} - \hat{x}_{k-1}) + \epsilon_k \qquad (10)

z_k = h(x_k) \approx h(\bar{x}_k) + \left. \frac{\partial h}{\partial x_k} \right|_{\bar{x}_k} (x_k - \bar{x}_k) + v_k \qquad (11)

The partial derivatives are F = \left. \frac{\partial f}{\partial x_{k-1}} \right|_{\hat{x}_{k-1}} and H = \left. \frac{\partial h}{\partial x_k} \right|_{\bar{x}_k}. The posterior \hat{x}_{k-1} represents the probability distribution at moment k-1, as distinguished from the prior distribution \bar{x}_k of the state at moment k.

The classical Kalman filter applies to linear systems, where it forms the maximum a posteriori probability estimate and achieves the optimal unbiased estimate. This is the advantage of Kalman filtering, but in a nonlinear system the Taylor expansion is only an approximation, and nonlinear error appears in higher-order nonlinear systems; this paper therefore uses the improved extended Kalman filter (EKF) for data fusion [8].

The Bayesian probability estimation formula for the prediction step is obtained from the equation of motion:

P(x_k \mid x_0, u_{1:k}, z_{0:k-1}) = N\left( f(\hat{x}_{k-1}, u_k),\; F \hat{P}_{k-1} F^T + R_k \right) \qquad (12)

The mean and covariance of the prior distribution are

\bar{x}_k = f(\hat{x}_{k-1}, u_k) \qquad (13)

\bar{P}_k = F \hat{P}_{k-1} F^T + R_k \qquad (14)

The physical meaning of Eq. (13) is that the acceleration and angular velocity commanded by the throttle and steering wheel of the autonomous vehicle at the current time give the predicted pose value at the next time. Equation (14) represents the predicted covariance matrix, i.e., the transfer relationship to the next moment.

Then, based on the Bayesian probability estimation expansion, the observation equation is derived:

P(z_k \mid x_k) = N\left( h(\bar{x}_k) + H (x_k - \bar{x}_k),\; Q_k \right) \qquad (15)

The extended Kalman gain is defined as

K_k = \bar{P}_k H^T \left( H \bar{P}_k H^T + Q_k \right)^{-1} \qquad (16)

and the posterior probability distribution is calculated through the extended Kalman gain:

\hat{x}_k = \bar{x}_k + K_k \left( z_k - h(\bar{x}_k) \right) \qquad (17)

\hat{P}_k = (I - K_k H) \bar{P}_k \qquad (18)

The physical meaning of Eq. (17) is that the observations of the Lidar and wheel odometer at the current moment update the distance, yaw angle and speed states, reaching a more precise positioning. Equation (18) represents the a posteriori covariance update at that time. The Markov assumption is used in the motion and observation models of the nonlinear system [7]. The extended Kalman filter predicts the state of the moment through the motion state estimation equation, then corrects and updates it through the multi-sensor fusion observation equation to complete precise positioning. The framework for achieving data fusion is shown in Figure 5.

Fig 5. Sensor fusion with EKF architecture: PP-ICP Lidar data and the strapdown IMU/wheel integral calculation (with bias estimation) feed the improved EKF fusion algorithm.
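A generic sketch of one predict/update cycle following Eqs. (12)-(18); the function and variable names are ours, not the authors' implementation. In the system described here the update would run twice per cycle, once with the IMU observation of Eq. (8) and once with the wheel odometer/Lidar observation of Eq. (9):

import numpy as np

def ekf_step(x_hat, P_hat, u, z, f, h, F, H, R, Q):
    """f, h: (possibly nonlinear) motion and observation models;
    F, H: their Jacobians at the current estimate; R, Q: motion and
    observation noise covariances, as in the text."""
    # Prediction, Eqs. (13)-(14): prior mean and covariance
    x_bar = f(x_hat, u)
    P_bar = F @ P_hat @ F.T + R
    # Extended Kalman gain, Eq. (16)
    K = P_bar @ H.T @ np.linalg.inv(H @ P_bar @ H.T + Q)
    # Posterior update, Eqs. (17)-(18)
    x_new = x_bar + K @ (z - h(x_bar))
    P_new = (np.eye(len(x_new)) - K @ H) @ P_bar
    return x_new, P_new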
Before using the extended Kalman filter, all the sensor nodes are randomly distributed in a certain area. After calibration succeeds, they self-organize into a sensor network, and the sensor data are gathered at the central processing unit. The multi-sensor data pre-processing fusion network is shown in Figure 6.

Fig 6. Multi-sensor fusion network: Lidar, IMU and wheel odometer data pass through a wavelet-domain transform and wavelet denoising, the signal-to-noise ratio is calculated, and the data are fused by the EKF.

The specific multi-sensor network fusion process steps are as follows (a sketch of steps 1)-3) follows the list):
1) Use wavelet technology to remove Gaussian white noise from the data collected by each sensor, including data cleaning, data transformation and numerical specification;
2) Compress the data using wavelet technology;
3) Use the wavelet technique to layer the preprocessed data, and perform extended Kalman filtering on the layered similar high-frequency coefficients and the detailed low-frequency coefficients;
4) Calculate the signal-to-noise ratio of the sensor data, and fuse the approximation coefficients of the data by the signal-to-noise ratio [9];
5) Reconstruct the data using wavelet technology to obtain the fusion result.
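Steps 1)-3) can be sketched with the PyWavelets library; the choice of library, of the db4 wavelet and of the universal soft-threshold rule are our assumptions, since the paper does not specify them:

import numpy as np
import pywt  # PyWavelets

def wavelet_denoise(signal, wavelet="db4", level=3):
    """Decompose, soft-threshold the detail coefficients, reconstruct.
    Assumes a 1-D signal long enough for the requested level."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # universal threshold estimated from the finest detail band
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(len(signal)))
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]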
3.4 Multi-sensor fusion program framework

Firstly, the point cloud map data is loaded and the load is confirmed. Then the multiple sensors are initialized. The yaw angle of the autonomous vehicle is optimized and the attitude corrected using the PP-ICP algorithm. The autonomous vehicle starts running according to the prediction of the state equation; however, the movement accumulates errors with time, so the IMU is used as the first observation equation to correct it. The observation equation of the wheel odometer compensates the error of the position information and determines the precise position of the autonomous vehicle in the basement point cloud map. Then, through the observation equation of the Lidar and PP-ICP attitude compensation, the precise pose of the autonomous vehicle is obtained, and finally the desired navigation and localization information is output. The software flowchart is shown in Figure 7.

Fig 7. Multi-sensor fusion program flow chart: load the point map and check readiness, initialize the sensors, calibrate the vehicle's initial pose, then loop over state-equation prediction, the IMU observation-equation update, and the Lidar and wheel odometer observation-equation update via the EKF, outputting position and attitude until the destination is reached.
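A purely illustrative, runnable condensation of the Figure 7 flow, with fabricated sensor readings and a scalar-gain stand-in for the EKF; none of the numbers or names come from the paper's implementation:

import numpy as np

def run_localization(steps=200, dt=0.5):
    x = np.zeros(3)                              # [p, phi, v] after initial calibration
    for _ in range(steps):
        a, w = 0.1, 0.01                         # throttle and steering controls
        A = np.array([[1.0, 0.0, dt], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
        u = np.array([0.5 * a * dt**2, w * dt, a * dt])
        x = A @ x + u                            # state-equation prediction
        z_imu = x + np.random.normal(0.0, 0.05, 3)   # simulated IMU observation
        x = x + 0.5 * (z_imu - x)                # first observation-equation update
        z_fix = x + np.random.normal(0.0, 0.02, 3)   # simulated wheel + PP-ICP fix
        x = x + 0.8 * (z_fix - x)                # second observation-equation update
    return x                                     # output position, attitude, speed

print(run_localization())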
4 EXPERIMENT AND SIMULATION

Combining the wheel odometer with IMU and Lidar data, the LINS on Wheels algorithm is proposed to provide the IMU and Lidar with observability constraints through the planar motion hypothesis of the wheel odometer, thus improving the accuracy and stability of the system. At the same time, the autonomous vehicle initializes its own positioning, so that the initial positioning can accurately determine the localization in the entire point cloud data map.

The HDL-32 Velodyne Lidar used on the autonomous vehicle detects the surrounding environment; it integrates 32 laser emitters and 32 laser receivers, rotates through 360 degrees at up to 20 Hz, and outputs 800,000 coordinate points per second, which allows the 32E Lidar to obtain sufficient environmental information for collecting and matching point clouds. The inertial measurement unit is a MEMSIC400, which integrates a three-axis gyroscope, a three-axis accelerometer and a three-axis magnetometer. The wheel odometer model is XSCLD12, and its collected pulse data is accurate. Figure 8 and Figure 9 show the tests of the above sensors in the underground garage environment.

Fig 8. Real vehicle test verification. Fig 9. Test verification.

As shown in Figure 10, the point cloud data map of the entire underground garage, prepared by the Lidar for the autonomous vehicle, is collected and scanned, and is displayed in the RVIZ visual interface in the ROS environment.

Fig 10. Basement map with RVIZ display.

To organize the storage and reading of the point cloud data, the point cloud map is converted into an Octo-map form, as shown in Figure 11; a simplified voxelization sketch follows below. Then, when the vehicle enters the map, it is initialized at the position of the red frame in Figure 12.

Fig 11. Octo-map. Fig 12. Positioning initialization.
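A simplified stand-in for that conversion, assuming the map is an (N, 3) NumPy array; OctoMap proper stores a probabilistic octree, whereas this sketch keeps only the occupied voxel keys for fast storage and lookup:

import numpy as np

def voxelize(points, voxel=0.2):
    """Reduce a point cloud map to a set of occupied voxel keys."""
    keys = np.floor(points / voxel).astype(np.int64)
    return {tuple(k) for k in keys}

def is_occupied(occupied, p, voxel=0.2):
    """Constant-time occupancy query used when the vehicle enters the map."""
    return tuple(np.floor(np.asarray(p) / voxel).astype(np.int64)) in occupied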
After the autonomous vehicle completes the initialization at the starting position, it predicts its state through the equation of state motion. The acceleration and angular velocity calculated from the IMU are used to update the position at the next moment, and the wheel odometer and the Lidar are used as observation data to update and correct the position and attitude of the current moment. As shown in Figure 13(a), the red line indicates the trajectory of the autonomous vehicle in the loaded point cloud map under multi-sensor fusion; (b) and (c) show the observation data obtained by the Lidar of the autonomous vehicle during traveling.
Fig 13. (a) Multi-sensor fusion trajectories of the autonomous vehicle; (b) and (c) observation data obtained by Lidar.

As shown in Figure 14(a), when the localization of the autonomous vehicle is not initialized at the initial position, the point cloud data collected by the Lidar deviates, and positioning fails in the subsequent navigation and positioning process.

As shown in Figure 14(b), a Lidar-only localization scheme and the multi-sensor fusion positioning scheme are compared in simulation. The sampling period is 0.5 s with 200 samples, the driving speed in the underground garage is 2 m/s, and the unit of the horizontal and vertical coordinates is the centimeter. The solid black line indicates the real car trajectory, the blue line the result using only the observation equation, and the red line the localization track after multi-sensor fusion. Figure 14(c) shows the error comparison between the two localization schemes. The green points are the localization errors produced by the Lidar alone at each sampling time, which reach up to 37 cm and are unstable; the red points indicate the errors of the multi-sensor fusion method, which stay within 10 cm. It can be seen that the localization accuracy and stability of multi-sensor fusion are high.

Fig 14. (a) Without pose initialization the autonomous vehicle suffers a positioning offset; (b) comparison of the observation-equation trajectory and the multi-sensor fusion trajectory; (c) corresponding localization errors of the observation-equation trajectory and multi-sensor fusion.

5 SUMMARY AND OUTLOOK

Autonomous vehicle technology is the future direction, but the kernel issue is localization. This paper adopts IMU, Lidar and wheel odometer technology with the EKF algorithm for multi-sensor data fusion. Combined with the point cloud data map, it considers the localization initialization problem of autonomous vehicles and finally achieves more accurate and stable localization. However, at the hardware level, a sufficient number of different types of sensor devices is required to ensure sufficient and redundant information acquisition; at the software level, the algorithms used must be sufficiently optimized. The fusion algorithm provides a large number of prior models to achieve the real-time performance and accuracy of the final localization. With the rapid development of modern technology and the continuous improvement of fusion algorithms, multi-sensor fusion will be more widely used in various fields including autonomous vehicles. Thus the technology of autonomous driving will become a reliable, effective and practical information processing tool that promotes the progress of human society.
REFERENCES

[1] Blayvas I, Fridental R, Da S. Systems and methods for autonomous vehicle navigation: U.S. Patent Application 10/002,471[P]. 2018-6-19.
[2] Sukkarieh S, Nebot E M, Durrant-Whyte H F. A high integrity IMU/GPS navigation loop for autonomous land vehicle applications[J]. IEEE Transactions on Robotics and Automation, 1999, 15(3): 572-578.
[3] Marin-Plaza P, Beltrán J, Hussein A, et al. Stereo vision-based local occupancy grid map for autonomous navigation in ROS[C]. Imaging and Computer Graphics Theory and Applications, 2016.
[4] Eling C, Klingbeil L, Kuhlmann H. Real-time single-frequency GPS/MEMS-IMU attitude determination of lightweight UAVs[J]. Sensors, 2015, 15(10): 26212-26235.
[5] Liu G, Geng Y, Pahlavan K. Effects of calibration RFID tags on performance of inertial navigation in indoor environment[C]. 2015 International Conference on Computing, Networking and Communications (ICNC). IEEE, 2015: 945-949.
[6] Hu Z F, Zeng L. An adaptive Monte Carlo positioning algorithm incorporating two-dimensional code information[J]. Computer Application, 2018.
[7] Babin T S, Hani M S B, Mathew T, et al. Method enabling indoor local positioning and movement tracking in wifi capable mobile terminals: U.S. Patent Application 11/617,772[P]. 2008-7-3.
[8] Kumar R V, Shen S, Michael N, et al. Multi-sensor fusion for robust autonomous flight in indoor and outdoor environments with a rotorcraft micro-aerial vehicle (MAV): U.S. Patent Application 15/165,846[P]. 2017-7-27.
[9] Liu Y, Fan X, Lv C, et al. An innovative information fusion method with adaptive Kalman filter for integrated INS/GPS navigation of autonomous vehicles[J]. Mechanical Systems and Signal Processing, 2018, 100: 605-.
