An Enhanced Hybrid Visual–Inertial Odometry System for Indoor Mobile Robot
Figure 1. The system framework.
Figure 2. The Gazebo simulation environment.
Figure 3. Comparison of the proposed method versus OpenVINS: (a) OpenVINS; (c) OpenVINS + IMU pre-integration; (e) OpenVINS + IMU pre-integration + Odom; (b,d,f) are the corresponding errors.
Figure 4. Comparison of the proposed method versus OpenVINS: (a) OpenVINS; (c) OpenVINS + IMU pre-integration; (e) OpenVINS + IMU pre-integration + Odom; (b,d,f) are the corresponding errors.
Figure 5. Comparison of the estimated trajectories of OpenVINS and the proposed method for the (a) MH-04-difficult and (b) MH-05-difficult sequences.
Figure 6. (a) The experiment platform; (b) localization of the robot on the established environment map; (c) camera view (the blue points are SLAM tracking points); (d) estimated trajectory of the robot (red points indicate SLAM feature points).
Figure 7. Comparison of the proposed method versus OpenVINS: (a) OpenVINS; (c) OpenVINS + IMU pre-integration; (e) OpenVINS + IMU pre-integration + Odom; (b,d,f) are the corresponding errors.
Abstract
1. Introduction
- (1)
- Cosidering the insufficient use of the IMU information in the traditional MSCKF VIO system, the IMU pre-integration method is used to constrain and update the state of the sliding window to improve the positioning accuracy of the system. In order to select the appropriate weight between the covariance of the visual feature point reprojection and the IMU pre-integration, this paper uses the Helmert variance component estimation method in the sliding window update process to select the maximum posterior weights between the visual reprojection and the IMU pre-integration.
- (2)
- For indoor mobile robots in the process of positioning using the MSCKF-based VIO system, there are observable changes (such as loss of scale) during start–stop and uniform motion, resulting in a decrease in positioning accuracy. The speed information provided by the wheel odometer is used for zero-velocity detection, wheel odometer pre-integration, and corresponding status updates to improve the positioning accuracy of the mobile robot system.
- (3)
- Tests and experiments were carried out in the Gazebo simulation environment, public dataset EuRoc [25], and actual environment. The results of tests and experiments were compared with related mainstream algorithms S-MSCKF [10], VINS-Fusion [5,26], and OpenVINS [14]. Simulations and experiments show that the algorithm proposed in this paper can not only ensure real-time performance but also improve the positioning accuracy significantly.
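The reweighting in contribution (1) works as follows: each Helmert iteration re-estimates a variance component per measurement group from its weighted residuals and redundancy, then rescales the group weights until the components agree. Below is a minimal numerical sketch in Python; the function name `helmert_reweight`, the dictionary layout, and the simplified redundancy formula are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def helmert_reweight(groups, iters=10, tol=1e-3):
    """Approximate Helmert variance component estimation for several
    observation groups (e.g., visual reprojection vs. IMU pre-integration).

    Each group is a dict with a design matrix "A" (n_i x m), a residual
    vector "v" (n_i,), and a weight matrix "P" (n_i x n_i).
    Illustrative sketch only; layout and defaults are assumptions.
    """
    for _ in range(iters):
        # Per-group normal-matrix contributions N_i = A_i^T P_i A_i
        Ns = [g["A"].T @ g["P"] @ g["A"] for g in groups]
        N_inv = np.linalg.inv(sum(Ns))
        sig2 = []
        for g, Ni in zip(groups, Ns):
            # Simplified redundancy r_i = n_i - trace(N^-1 N_i)
            r = g["A"].shape[0] - np.trace(N_inv @ Ni)
            # Variance component sigma_i^2 = v_i^T P_i v_i / r_i
            sig2.append(float(g["v"] @ g["P"] @ g["v"] / r))
        ref = sig2[0]  # take the first group's component as reference
        if all(abs(s / ref - 1.0) < tol for s in sig2):
            break  # components agree: weights are balanced
        for g, s in zip(groups, sig2):
            g["P"] = g["P"] * (ref / s)  # down-weight the noisier group
    return [g["P"] for g in groups]
```

In a filter update, the converged weights play the role of inverse measurement covariances, so the noisier source (visual or inertial) automatically receives less influence on the state.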
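For contribution (2), zero-velocity detection only needs to flag intervals in which the wheel odometer reports (near) zero motion; a zero-velocity pseudo-measurement can then update the filter state. A minimal sketch follows, where the thresholds `v_eps` and `w_eps` are illustrative values, not the paper's tuned parameters.

```python
def is_stationary(wheel_readings, v_eps=0.02, w_eps=0.01):
    """Return True if every (v, w) pair in a short window of wheel odometer
    readings is below the linear [m/s] and angular [rad/s] thresholds.
    Illustrative sketch; the thresholds are assumptions."""
    return all(abs(v) < v_eps and abs(w) < w_eps for v, w in wheel_readings)

# Example: a robot at rest with small encoder noise is flagged as stationary.
window = [(0.004, -0.002), (0.001, 0.000), (-0.003, 0.001)]
assert is_stationary(window)
```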
2. System Overview and Methodology
2.1. System Overview
2.2. State Representation
2.3. IMU Dynamic Model and Pre-Integration
2.4. Measurement Updates
2.4.1. Point Feature Measurement Update
2.4.2. Pre-Integration Measurement Update
2.4.3. Wheel Odometer Measurement Update
2.5. Helmert Variance Component Estimation
3. Simulations
3.1. Simulation Environment Settings
3.2. Analysis of Localization Accuracy
4. Experiments
4.1. Public Dataset Test
4.2. Real-World Test
5. Discussion
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Appendix A
References
- Mur-Artal, R.; Montiel, J.M.M.; Tardos, J.D. ORB-SLAM: A Versatile and Accurate Monocular SLAM System. IEEE Trans. Robot. 2015, 31, 1147–1163.
- Engel, J.; Koltun, V.; Cremers, D. Direct Sparse Odometry. IEEE Trans. Pattern Anal. Mach. Intell. 2018, 40, 611–625.
- Forster, C.; Pizzoli, M.; Scaramuzza, D. SVO: Fast Semi-Direct Monocular Visual Odometry. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–7 June 2014.
- Leutenegger, S.; Lynen, S.; Bosse, M.; Siegwart, R.; Furgale, P. Keyframe-based visual-inertial odometry using nonlinear optimization. Int. J. Robot. Res. 2015, 34, 314–334.
- Qin, T.; Li, P.; Shen, S. VINS-Mono: A Robust and Versatile Monocular Visual-Inertial State Estimator. IEEE Trans. Robot. 2018, 34, 1004–1020.
- Campos, C.; Elvira, R.; Rodriguez, J.J.G.; Montiel, J.M.M.; Tardos, J.D. ORB-SLAM3: An Accurate Open-Source Library for Visual, Visual-Inertial, and Multimap SLAM. IEEE Trans. Robot. 2021, 37, 1874–1890.
- Delmerico, J.; Scaramuzza, D. A Benchmark Comparison of Monocular Visual-Inertial Odometry Algorithms for Flying Robots. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, QLD, Australia, 21–25 May 2018; pp. 2502–2509.
- Gui, J.; Gu, D.; Wang, S.; Hu, H. A review of visual inertial odometry from filtering and optimisation perspectives. Adv. Robot. 2015, 29, 1289–1301.
- Mourikis, A.I.; Roumeliotis, S.I. A Multi-State Constraint Kalman Filter for Vision-aided Inertial Navigation. In Proceedings of the 2007 IEEE International Conference on Robotics and Automation (ICRA), Rome, Italy, 10–14 April 2007; pp. 3565–3572.
- Sun, K.; Mohta, K.; Pfrommer, B.; Watterson, M.; Liu, S.; Mulgaonkar, Y.; Taylor, C.J.; Kumar, V. Robust Stereo Visual Inertial Odometry for Fast Autonomous Flight. IEEE Robot. Autom. Lett. 2018, 3, 965–972.
- Bloesch, M.; Omari, S.; Hutter, M.; Siegwart, R. Robust Visual Inertial Odometry Using a Direct EKF-Based Approach. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–2 October 2015.
- Li, M.; Mourikis, A.I. High-Precision, Consistent EKF-based Visual-Inertial Odometry. Int. J. Robot. Res. 2013, 32, 690–711.
- Li, M.; Mourikis, A.I. Improving the accuracy of EKF-based visual-inertial odometry. In Proceedings of the 2012 IEEE International Conference on Robotics and Automation (ICRA), Saint Paul, MN, USA, 14–18 May 2012; pp. 828–835.
- Geneva, P.; Eckenhoff, K.; Lee, W.; Yang, Y.; Huang, G. OpenVINS: A Research Platform for Visual-Inertial Estimation. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; pp. 4666–4672.
- Heo, S.; Cha, J.; Park, C.G. EKF-Based Visual Inertial Navigation Using Sliding Window Nonlinear Optimization. IEEE Trans. Intell. Transp. Syst. 2019, 20, 2470–2479.
- Lupton, T.; Sukkarieh, S. Visual-inertial-aided navigation for high-dynamic motion in built environments without initial conditions. IEEE Trans. Robot. 2012, 28, 61–76.
- Forster, C.; Carlone, L.; Dellaert, F.; Scaramuzza, D. On-Manifold Preintegration for Real-Time Visual-Inertial Odometry. IEEE Trans. Robot. 2017, 33, 1–21.
- Yu, Z.C. A universal formula of maximum likelihood estimation of variance-covariance components. J. Geod. 1996, 70, 233–240.
- Li, M.; Nie, W. Helmert Variance Component Estimation for Multi-GNSS Relative Positioning. Sensors 2020, 20, 669.
- Gao, Z.; Shen, W.; Zhang, H.; Ge, M.; Niu, X. Application of Helmert variance component based adaptive Kalman filter in multi-GNSS PPP/INS tightly coupled integration. Remote Sens. 2016, 8, 553.
- Xu, B.; Chen, Y.; Zhang, S.; Wang, J. Improved point-line visual-inertial odometry system using Helmert variance component estimation. Remote Sens. 2020, 12, 2901.
- Hesch, J.A.; Kottas, D.G.; Bowman, S.L.; Roumeliotis, S.I. Observability-Constrained Vision-Aided Inertial Navigation; Tech. Rep.; MARS Lab, Dept. of Computer Science and Engineering, University of Minnesota: Minneapolis, MN, USA, 2012.
- Wu, K.J.; Guo, C.X.; Georgiou, G.; Roumeliotis, S.I. VINS on wheels. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 5155–5162.
- Quan, M.; Piao, S.; Tan, M.; Huang, S.S. Tightly-Coupled Monocular Visual-Odometric SLAM Using Wheels and a MEMS Gyroscope. IEEE Access 2019, 7, 97374–97389.
- Burri, M.; Nikolic, J.; Gohl, P.; Schneider, T.; Rehder, J.; Omari, S.; Achtelik, M.W.; Siegwart, R. The EuRoC micro aerial vehicle datasets. Int. J. Robot. Res. 2016, 35, 1157–1163.
- Qin, T.; Cao, S.; Pan, J.; Shen, S. A General Optimization-based Framework for Global Pose Estimation with Multiple Sensors. arXiv 2019, arXiv:1901.03642.
- Siegwart, R.; Nourbakhsh, I.R. Introduction to Autonomous Mobile Robots; MIT Press: Cambridge, MA, USA, 2004.
- ROS-Academy-for-Beginners. Available online: https://github.com/DroidAITech/ROS-Academy-for-Beginners (accessed on 2 September 2021).
- Evo. Available online: https://github.com/MichaelGrupp/evo (accessed on 6 December 2021).
- Hess, W.; Kohler, D.; Rapp, H.; Andor, D. Real-time loop closure in 2D LIDAR SLAM. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 1271–1278.
| Evaluation | OpenVINS | | OpenVINS + IMU + HVCE | | OpenVINS + IMU + Odom + HVCE | |
|---|---|---|---|---|---|---|
| RMSE | Trans (m) | Rot (°) | Trans (m) | Rot (°) | Trans (m) | Rot (°) |
| | 0.1826 | 2.1098 | 0.1368 | 1.6815 | 0.1197 | 0.7797 |
| Improvement | – | – | 25.08% | 20.30% | 34.45% | 63.04% |
| Evaluation | OpenVINS | | OpenVINS + IMU + HVCE | | OpenVINS + IMU + Odom + HVCE | |
|---|---|---|---|---|---|---|
| RMSE | Trans (m) | Rot (°) | Trans (m) | Rot (°) | Trans (m) | Rot (°) |
| | 0.1011 | 0.7536 | 0.0787 | 0.5475 | 0.0698 | 0.3743 |
| Improvement | – | – | 22.15% | 27.34% | 30.95% | 50.33% |
| Seq | S-MSCKF | | VINS-Fusion | | OpenVINS | | The Proposed | |
|---|---|---|---|---|---|---|---|---|
| | Trans (m) | Rot (°) | Trans (m) | Rot (°) | Trans (m) | Rot (°) | Trans (m) | Rot (°) |
| V1_02_medium | 0.1082 | 2.4125 | × | × | 0.0542 | 1.8723 | 0.0480 | 1.8564 |
| V1_03_difficult | 0.1654 | 4.1323 | 0.1076 | 6.8387 | 0.0516 | 2.5557 | 0.0512 | 2.3123 |
| V2_02_medium | 0.1174 | 1.7794 | 0.1167 | 2.8392 | 0.0462 | 1.4552 | 0.0469 | 1.3057 |
| V2_03_difficult | × | × | × | × | 0.0708 | 0.9819 | 0.0601 | 0.8100 |
| MH_03_medium | 0.2889 | 2.0835 | 0.2856 | 1.4097 | 0.1079 | 1.3833 | 0.0980 | 1.3748 |
| MH_04_difficult | 0.2804 | 1.1874 | 0.4241 | 2.3703 | 0.1625 | 1.2023 | 0.1162 | 1.0628 |
| MH_05_difficult | 0.4001 | 1.1348 | 0.3081 | 1.7703 | 0.1518 | 1.2390 | 0.1031 | 0.9418 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).