Precise, robust, and consistent localization is an important subject in many areas of science and engineering, such as vision-based control, path planning, and simultaneous localization and mapping (SLAM). To estimate the pose of a platform, sensors such as inertial measurement units (IMUs), global positioning system (GPS) receivers, and cameras are commonly employed. Each of these sensors has its own strengths and weaknesses. Sensor fusion is a well-established approach that combines the data measured by different sensors to achieve a more accurate or complete pose estimate and to cope with sensor outages. In this paper, a three-dimensional (3D) pose estimation algorithm is presented for an unmanned aerial vehicle (UAV) in an unknown, GPS-denied environment. A UAV can be fully localized by three position coordinates and three orientation angles. The proposed algorithm fuses the data from an IMU, a camera, and a two-dimensional (2D) light detection and ranging (LiDAR) sensor using an extended Kalman filter (EKF) to achieve accurate localization. Among the employed sensors, the LiDAR has received comparatively little attention in this context, mostly because a 2D LiDAR can provide pose estimates only in its scanning plane and therefore cannot, by itself, deliver a full pose estimate in a 3D environment. A novel method is introduced in this paper that employs a 2D LiDAR to improve the full 3D pose estimate obtained from an IMU and a camera, and it is shown that this method significantly improves the precision of the localization algorithm. The proposed approach is evaluated and justified by simulation and real-world experiments.
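
As a rough illustration of this kind of fusion scheme, the sketch below shows a minimal EKF skeleton in which an IMU-driven motion model propagates the state and pose measurements from a camera or a 2D LiDAR correct it. The class name, state layout, and function arguments are assumptions made for illustration only; they do not reflect the authors' actual state vector, measurement models, or implementation.

```python
import numpy as np

class PoseEKF:
    """Minimal illustrative EKF skeleton: IMU-driven prediction, camera/LiDAR corrections.
    The state layout (e.g. position, orientation, velocity) is assumed for illustration."""

    def __init__(self, x0, P0, Q):
        self.x = np.asarray(x0, dtype=float)   # state estimate
        self.P = np.asarray(P0, dtype=float)   # state covariance
        self.Q = np.asarray(Q, dtype=float)    # process-noise covariance

    def predict(self, f, F, u):
        """Propagate with a (nonlinear) IMU motion model f and its Jacobian F at the current state."""
        self.x = f(self.x, u)
        self.P = F @ self.P @ F.T + self.Q

    def update(self, z, h, H, R):
        """Correct with a measurement z, measurement model h, its Jacobian H, and noise covariance R."""
        y = z - h(self.x)                      # innovation
        S = H @ self.P @ H.T + R               # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)    # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(self.x.size) - K @ H) @ self.P
```

In such a scheme, a camera-based pose measurement could correct the full six-degree-of-freedom state, while a 2D LiDAR result would supply only the components lying in its scanning plane, so its measurement Jacobian H would have rows for those states alone; this is one way a planar sensor can still tighten part of a 3D pose estimate.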
