A Robust Indoor/Outdoor Navigation Filter Fusing Data from Vision and Magneto-Inertial Measurement Unit
<p><b>Figure 1.</b> Reference coordinate frames at play in the problem, with associated typical measurements.</p>
<p><b>Figure 2.</b> Schematic view of on-board sensors. In addition to accelerometers and gyrometers, the MIMU includes several magnetometers: a central one and at least three peripheral ones in order to compute the full 3 × 3 matrix of magnetic field gradients. The camera is rigidly attached to the MIMU sensor.</p>
<p><b>Figure 3.</b> Illustration of state augmentation and marginalization across time as described in <a href="#sec4dot2-sensors-17-02795" class="html-sec">Section 4.2</a>.</p>
<p><b>Figure 4.</b> The sensor setup used in this work. The white box on the left side contains the MIMU sensor; the camera is on the right side. Both sensors are rigidly attached through a non-magnetic, non-conductive material (wood).</p>
<p><b>Figure 5.</b> Image processing pipeline. (<b>Left</b>): raw input images. (<b>Right</b>): after rectification, intensity normalization, and corner detection. In the second example, the normalization reveals a faint signal, but it is too noisy for corner detection to work.</p>
<p><b>Figure 6.</b> (<b>a</b>) Overview of trajectory Traj2 as reconstructed by the three filters. (<b>b</b>) Visualisation of dark areas and low-gradient areas over the entire trajectory, superimposed on the MI-MSCKF estimate. (<b>c</b>) Height profile of the three estimators on this trajectory.</p>
<p><b>Figure 7.</b> Details of estimation results on Traj2 showing a different behavior between the MSCKF and MI-MSCKF filters. <b>Left and middle plots</b>: while transitioning from a dark area to a lit environment, strong filter corrections occur for the MSCKF and lead to discontinuities in the position estimate. In the same areas, MI-MSCKF stays smoother. <b>Right plot</b>: here the device is laid on the ground at the end of the trajectory. A large drift of the MSCKF occurs, as visual information does not provide any feedback on position. Here again, the MI-MSCKF appears more stable.</p>
<p><b>Figure 9.</b> Summary of trajectories on the remaining sequences of the dataset. (<b>Left</b>): estimates of the three configurations of our filter. (<b>Right</b>): color-coded MI-MSCKF trajectory showing areas of weak gradient and weak illumination. (<b>a</b>) <b>Traj1</b> Length: ∼530 m; (<b>b</b>) <b>Traj3</b> Length: ∼368 m; (<b>c</b>) <b>Traj4</b> Length: ∼180 m.</p>
Abstract
1. Introduction
1.1. Motivation
1.2. State of the Art and Contribution
1.3. Paper Organization
2. Notations
2.1. General Conventions
2.2. Reserved Symbols
2.3. Rotation Parametrization
3. On-Board Sensors and Evolution Model
3.1. Sensing Hardware
3.2. Evolution Model
- Flat-earth approximation. We assume that the East-North-Up (ENU) earth frame at filter initialization is an inertial frame.
- Stationary magnetic field in the world frame—although possibly spatially non-uniform, leading to the spatial gradient in (7).
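Under the stationary-field assumption, the magnetic field seen in the body frame evolves with the body's rotation and with its motion through the spatial gradient. The following is a minimal sketch of one propagation step of this relation; the function and variable names are illustrative and not taken from the paper, and the explicit-Euler step stands in for whatever discretization the filter actually uses.

```python
import numpy as np

def propagate_body_field(B, grad_B, omega, v, dt):
    """One explicit-Euler step of the body-frame field evolution
    dB/dt = -omega x B + grad_B @ v, where grad_B is the 3x3 matrix
    of spatial gradients measured by the peripheral magnetometers,
    omega the angular rate, and v the body-frame velocity.
    (Names and discretization are illustrative assumptions.)"""
    dB = -np.cross(omega, B) + grad_B @ v
    return B + dt * dB
```

With a zero gradient and zero angular rate the field is constant in the body frame, which matches the stationary, spatially uniform case.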
3.3. Model Discretization
3.4. Sensors Error Model
4. Tight Fusion Filter
4.1. State and Error State
- the keyframe poses state space , whose elements are the poses of a set of N past frames at time indexes not necessarily temporally successive but close in time. With poses written , this part of the state has the following form:
- the current MIMU state space , whose elements have the following form:
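As an illustration of this two-part state, a minimal Python container might look like the following. The field names and layout are assumptions for readability; the paper's actual parametrization (and the associated error state) is not reproduced here.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Pose:
    q: np.ndarray  # orientation (unit quaternion)
    p: np.ndarray  # position in the world frame

@dataclass
class FilterState:
    """Stacked filter state: a sliding window of N past keyframe
    poses plus the current MIMU state."""
    keyframes: list      # N past Pose objects, close in time
    q: np.ndarray        # current orientation
    v: np.ndarray        # current velocity
    p: np.ndarray        # current position
    b_g: np.ndarray      # gyrometer bias
    b_a: np.ndarray      # accelerometer bias
    B: np.ndarray        # body-frame magnetic field
```

Augmentation appends a copy of the current pose to `keyframes`; marginalization removes the oldest entry, as described in Section 4.2.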
4.2. Propagation/Augmentation/Marginalization
4.2.1. Propagation
4.2.2. State Augmentation
4.2.3. Marginalization of Old State
4.3. Measurement Update
4.3.1. Magnetic Measurement Update
4.3.2. Opportunistic Feature Tracks Measurement Update
4.4. Filter Initialization
5. Experimental Study
5.1. Hardware Prototype Description and Data Syncing
5.2. Filter Parameters Tuning
5.3. Visual Processing Implementation
- they go out of the field of view;
- the tracking fails;
- they are classified as outliers;
- the frame where they were first detected is to be marginalized at the next propagation step.
- it spans at least three poses;
- its initial triangulation did not exhibit any degeneracy;
- its re-projection error is below a threshold.
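The two lists above (when a track is dropped, and when a dropped track is accepted for the update) can be sketched as simple predicates. The `Track` fields, the threshold value, and the function names are illustrative assumptions, not the paper's implementation.

```python
from dataclasses import dataclass

@dataclass
class Track:
    first_frame: int            # frame index of first detection
    in_view: bool = True        # still inside the field of view?
    tracking_ok: bool = True    # tracking still succeeding?
    inlier: bool = True         # passed outlier classification?
    n_poses: int = 0            # number of poses the track spans
    degenerate: bool = False    # triangulation degeneracy flag
    reproj_error: float = 0.0   # re-projection error (pixels)

def should_drop(t: Track, next_marginalized_frame: int) -> bool:
    """A track is dropped as soon as any of the four conditions holds."""
    return (not t.in_view or not t.tracking_ok or not t.inlier
            or t.first_frame == next_marginalized_frame)

def usable_for_update(t: Track, max_reproj_error: float = 2.0) -> bool:
    """A dropped track feeds the measurement update only if all three
    acceptance criteria hold (the pixel threshold is an assumption)."""
    return (t.n_poses >= 3 and not t.degenerate
            and t.reproj_error < max_reproj_error)
```

Splitting the logic into a drop predicate and an acceptance predicate mirrors the text: dropping decides when a track's life ends, acceptance decides whether its observations are trustworthy enough to constrain the filter.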
5.4. Trajectory Evaluation
5.4.1. Dataset Presentation
5.4.2. Overall Comparison
- the estimated trajectories are superimposed on a georeferenced orthoimage in which one pixel represents precisely 0.5 m. We compute an alignment of the trajectory when the checkerboard is detected for the first time in the sequence. This alignment results from manually setting the position and heading of the checkerboard frame relative to the coordinate system of the satellite image. Note that no manual scale alignment has been made, hence this visualization allows a rough evaluation of the correctness of the global scale of the estimate, for instance in Figure 6a;
- the z profile is globally known, as the pedestrian walks along flat corridors, except when taking stairs to change levels;
- a translational error is computed each time the system comes back to its initial position, thanks to a static checkerboard placed at the starting point. This criterion can be visualized in Figure 6c for Traj2, where it is clear that the MI-DR estimate is less stable vertically over the entire trajectory.
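The third criterion can be sketched in one line, assuming the checkerboard detection provides the device's true position at the revisit time (names are illustrative):

```python
import numpy as np

def return_to_start_error(p_est_at_return, p_checkerboard):
    """Translational error at loop closure: distance between the
    estimated position when the system revisits its starting point
    and the position given by the static checkerboard."""
    return float(np.linalg.norm(np.asarray(p_est_at_return)
                                - np.asarray(p_checkerboard)))
```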
5.4.3. The Fused Estimate Improves MI-DR in Outdoor Trajectories
5.4.4. Data Fusion Improves Local Consistency
5.4.5. Comparison with a State of the Art Filter
6. Conclusions
Supplementary Materials
Acknowledgments
Author Contributions
Conflicts of Interest
References
- Zhang, J.; Kaess, M.; Singh, S. Real-Time Depth Enhanced Monocular Odometry. In Proceedings of the 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2014), Chicago, IL, USA, 14–18 September 2014; pp. 4973–4980.
- Guo, C.; Roumeliotis, S.I. IMU-RGBD Camera Navigation Using Point and Plane Features. In Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Tokyo, Japan, 3–7 November 2013; pp. 3164–3171.
- Konolige, K.; Agrawal, M.; Sola, J. Large-Scale Visual Odometry for Rough Terrain. In Robotics Research; Springer: Berlin, Germany, 2011; pp. 201–212.
- Leutenegger, S.; Lynen, S.; Bosse, M.; Siegwart, R.; Furgale, P. Keyframe-Based Visual-Inertial Odometry Using Nonlinear Optimization. Int. J. Robot. Res. 2015, 34, 314–334.
- Usenko, V.; Engel, J.; Stückler, J.; Cremers, D. Direct Visual-Inertial Odometry with Stereo Cameras. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 1885–1892.
- Li, M.; Mourikis, A.I. High-Precision, Consistent EKF-Based Visual-Inertial Odometry. Int. J. Robot. Res. 2013, 32, 690–711.
- Hernandez, J.; Tsotsos, K.; Soatto, S. Observability, Identifiability and Sensitivity of Vision-Aided Inertial Navigation. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015; pp. 2319–2325.
- Dorveaux, E.; Boudot, T.; Hillion, M.; Petit, N. Combining Inertial Measurements and Distributed Magnetometry for Motion Estimation. In Proceedings of the American Control Conference (ACC), San Francisco, CA, USA, 29 June–1 July 2011; pp. 4249–4256.
- Chesneau, C.I.; Hillion, M.; Prieur, C. Motion Estimation of a Rigid Body with an EKF Using Magneto-Inertial Measurements. In Proceedings of the 2016 International Conference on Indoor Positioning and Indoor Navigation (IPIN), Alcala de Henares, Spain, 4–7 October 2016; pp. 1–6.
- Chesneau, C.I.; Hillion, M.; Hullo, J.F.; Thibault, G.; Prieur, C. Improving Magneto-Inertial Attitude and Position Estimation by Means of Magnetic Heading Observer. In Proceedings of the 2017 International Conference on Indoor Positioning and Indoor Navigation (IPIN), Sapporo, Japan, 18–21 September 2017.
- Caruso, D.; Sanfourche, M.; Le Besnerais, G.; Vissiere, D. Infrastructureless Indoor Navigation with an Hybrid Magneto-Inertial and Depth Sensor System. In Proceedings of the 2016 International Conference on Indoor Positioning and Indoor Navigation (IPIN), Alcala de Henares, Spain, 4–7 October 2016.
- Caruso, D.; Eudes, A.; Sanfourche, M.; Vissiere, D.; Le Besnerais, G. Robust Indoor/Outdoor Navigation through Magneto-Visual-Inertial Optimization-Based Estimation. In Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, 24–28 September 2017.
- Caruso, D.; Eudes, A.; Sanfourche, M.; Vissiere, D.; Le Besnerais, G. An Inverse Square-Root Filter for Robust Indoor/Outdoor Magneto-Visual-Inertial Odometry. In Proceedings of the 2017 International Conference on Indoor Positioning and Indoor Navigation (IPIN), Sapporo, Japan, 18–21 September 2017.
- Wu, K.; Ahmed, A.; Georgiou, G.A.; Roumeliotis, S.I. A Square Root Inverse Filter for Efficient Vision-Aided Inertial Navigation on Mobile Devices. In Proceedings of the 2015 Robotics: Science and Systems Conference, Rome, Italy, 13–17 July 2015.
- Anderson, B.; Moore, J. Optimal Filtering; Prentice-Hall: Englewood Cliffs, NJ, USA, 1979.
- Forster, C.; Carlone, L.; Dellaert, F.; Scaramuzza, D. On-Manifold Preintegration Theory for Fast and Accurate Visual-Inertial Navigation. arXiv 2015, arXiv:1512.02363.
- Dorveaux, E.; Vissiere, D.; Martin, A.P.; Petit, N. Iterative Calibration Method for Inertial and Magnetic Sensors. In Proceedings of the 48th IEEE Conference on Decision and Control, held jointly with the 28th Chinese Control Conference (CDC/CCC 2009), Shanghai, China, 15–18 December 2009; pp. 8296–8303.
- Hesch, J.A.; Kottas, D.G.; Bowman, S.L.; Roumeliotis, S.I. Camera-IMU-Based Localization: Observability Analysis and Consistency Improvement. Int. J. Robot. Res. 2014, 33, 182–201.
- Simon, D. Kalman Filtering with State Constraints: A Survey of Linear and Nonlinear Algorithms. IET Control Theory Appl. 2010, 4, 1303–1318.
- Mourikis, A.I.; Roumeliotis, S.I. A Multi-State Constraint Kalman Filter for Vision-Aided Inertial Navigation. In Proceedings of the 2007 IEEE International Conference on Robotics and Automation, Roma, Italy, 10–14 April 2007; pp. 3565–3572.
- Yang, Z.; Shen, S. Monocular Visual-Inertial State Estimation With Online Initialization and Camera-IMU Extrinsic Calibration. IEEE Trans. Autom. Sci. Eng. 2017, 14, 39–51.
- Dong-Si, T.C.; Mourikis, A. Estimator Initialization in Vision-Aided Inertial Navigation with Unknown Camera-IMU Calibration. In Proceedings of the 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vilamoura, Portugal, 7–12 October 2012; pp. 1064–1071.
- Furgale, P.; Rehder, J.; Siegwart, R. Unified Temporal and Spatial Calibration for Multi-Sensor Systems. In Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Tokyo, Japan, 3–7 November 2013; pp. 1280–1286.
- Paul, M.K.; Wu, K.; Hesch, J.A.; Nerurkar, E.D.; Roumeliotis, S.I. A Comparative Analysis of Tightly-Coupled Monocular, Binocular, and Stereo VINS. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 165–172.
| | Traj1 | Traj2 | Traj3 | Traj4 | Traj5 |
|---|---|---|---|---|---|
| MI-DR | 1.11 | 1.98 | 1.81 | 1.54 | 2.87 |
| MSCKF (VINS) | 0.33 | 0.63 | 0.59 | 1.05 | 0.21 |
| MI-MSCKF | 0.20 | 0.31 | 0.49 | 0.71 | 0.15 |
| State-of-the-art VINS [24] | 0.26 | 0.52 | 0.79 | 0.62 | 0.20 |
© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Caruso, D.; Eudes, A.; Sanfourche, M.; Vissière, D.; Le Besnerais, G. A Robust Indoor/Outdoor Navigation Filter Fusing Data from Vision and Magneto-Inertial Measurement Unit. Sensors 2017, 17, 2795. https://doi.org/10.3390/s17122795