Real-Time Onboard 3D State Estimation of an Unmanned Aerial Vehicle in Multi-Environments Using Multi-Sensor Data Fusion
Figure 1. Hybrid fusion mode. The system consists of primary local nodes and a secondary fusion node.
Figure 2. The modules of the system communicate in real time through the Controller Area Network (CAN) bus and the Universal Asynchronous Receiver/Transmitter (UART) bus.
Figure 3. Comparison of multi-sensor data fusion (MSDF)-based estimation and high-precision sensors in an outdoor environment. (a) Contrast between the position estimates of the real-time kinematics (RTK) system and the fusion-based system; (b) contrast between the velocity estimates of the RTK system and the fusion-based system; (c) contrast between the attitude estimates of the IMU (SBG-N) and the fusion-based system. The red line represents the fusion data, and the black line represents the ground truth, which was provided by the RTK system or the high-precision inertial measurement unit (IMU).
Figure 4. Comparison of the estimated pose and velocity of the unmanned aerial vehicle (UAV) in an indoor environment. (a) Contrast between the position estimates of the VICON system and the fusion-based system; (b) contrast between the velocity estimates of the VICON system and the fusion-based system; (c) contrast between the attitude estimates of the VICON system and the fusion-based system. The red line represents the fusion-based system's estimates, and the black line represents the ground truth, which was provided by the VICON system.
Figure 5. Comparison of the state estimates of the RTK system and the fusion-based system in an indoor-to-outdoor transition area. (a) The number of satellites and the comparison curve for position; (b) the number of satellites and the comparison curve for velocity. The blue solid line represents the number of satellites, the red solid line represents the fusion data, and the black solid line represents the ground truth, which was provided by the RTK system.
Figure 6. Our 8.5 kg drone platform (including propellers and batteries), which is equipped with three-dimensional (3D) Light Detection and Ranging (LiDAR), an RGB-D camera, an inertial measurement unit (IMU), an optical flow sensor (OFS), a barometer, and global navigation satellite system (GNSS) receivers (BDS/GPS, RTK).
Figure 7. Images of the UAV in flight in different environments. (a) The drone flying in the woods; (b) the drone flying adjacent to high-rise buildings; (c) the drone flying in an indoor environment.
Figure 8. The flight trajectory of the UAV near high-rise buildings. (a–c) A comparison of the traces displayed in MATLAB; the red dashed line represents data from the fusion-based system, and the black line represents the ground truth, which was provided by the RTK system. (d) A comparison of the trajectories of the UAV as displayed by the ground station; here, the blue line represents the data from the fusion-based system, and the red line represents the ground truth provided by the RTK system.
Abstract
1. Introduction
- (1) The 3D state is estimated in real time by fusing data from multiple homogeneous or heterogeneous sensors, and the method can be applied to UAV navigation in multiple environments (indoor, outdoor, and outdoor GNSS-denied environments);
- (2) A hybrid fusion architecture is adopted. First, the primary local nodes each fuse a subset of the sensor data to obtain state information: primary local node 1 fuses data from the IMU, magnetometer, GNSS, and OFS, while primary local node 2 is based on 3D LiDAR SLAM and visual SLAM (vSLAM). Then, the secondary fusion node uses an Extended Kalman Filter (EKF) algorithm to estimate the final state. Figure 1 shows the hybrid fusion architecture. In addition, we use a Controller Area Network (CAN) bus [8] interface to output the UAV status information. The CAN bus provides priority-based arbitration, and multiple modules are linked to it through CAN controllers, which makes it easy to add or remove modules.
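The role of the secondary fusion node in the hybrid architecture described above can be illustrated with a minimal numerical sketch: the node runs a Kalman predict step on a motion model and then sequentially corrects the state with the estimates delivered by each primary local node. The sketch below assumes a constant-velocity process model and assumes both local nodes report a 3D position with its own covariance; the class name, matrices, and noise values are illustrative and are not the authors' implementation.

```python
import numpy as np

# Illustrative secondary fusion node: a linear-Gaussian special case of the
# EKF that sequentially fuses position estimates from two primary local nodes.
class SecondaryFusionNode:
    def __init__(self, dt=0.01):
        # State: [x, y, z, vx, vy, vz]; constant-velocity process model (assumed).
        self.x = np.zeros(6)
        self.P = np.eye(6)
        self.F = np.eye(6)
        self.F[:3, 3:] = dt * np.eye(3)      # position integrates velocity
        self.Q = 1e-3 * np.eye(6)            # process noise (illustrative value)
        # Both local nodes are assumed to deliver a 3D position measurement.
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])

    def predict(self):
        # Propagate state and covariance through the motion model.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z, R):
        # Standard Kalman correction with one local node's position estimate z.
        y = z - self.H @ self.x                      # innovation
        S = self.H @ self.P @ self.H.T + R           # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)     # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P

fusion = SecondaryFusionNode()
fusion.predict()
# Node 1 (IMU/magnetometer/GNSS/OFS) and node 2 (3D LiDAR SLAM / vSLAM)
# each contribute a position estimate; the more confident one (smaller R)
# pulls the fused state harder.
fusion.update(np.array([1.0, 2.0, 3.0]), R=0.10 * np.eye(3))
fusion.update(np.array([1.1, 2.1, 2.9]), R=0.05 * np.eye(3))
```

After the two updates the fused position lies between the two node estimates, weighted toward the lower-covariance node, and the position covariance shrinks below either measurement noise alone, which is the behavior the hybrid mode relies on.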
2. Related Work
3. System Composition
4. Multi-Sensor Fusion Algorithm
4.1. MSDF System Model
4.1.1. The State Equations of the MSDF System
4.1.2. Relative Measurement Model
4.1.3. Extended Kalman Filter Algorithm
4.1.4. Absolute Measurement
5. Simulation and Experiment
5.1. Simulation
5.2. Field Experiment
5.2.1. Experimental Platform
5.2.2. Introduction to the Calculation of Power
5.2.3. Experimental Results
6. Conclusions
Author Contributions
Funding
Conflicts of Interest
Appendix A
References
- Barfoot, T.D. State Estimation for Robotics; Cambridge University Press: Cambridge, UK, 2018; pp. 1–3.
- Raol, J.R. Multi-Sensor Data Fusion with MATLAB; CRC Press: Boca Raton, FL, USA, 2010; pp. 4–23.
- Hanieh, D.; Timothy, C.H.; Joshua, M. Heterogeneous Multisensor Fusion for Mobile Platform Three-Dimensional Pose Estimation. J. Dyn. Syst. Meas. Control 2017, 139, 8.
- Shen, S.; Mulgaonkar, Y.; Michael, N.; Kumar, V. Multi-Sensor Fusion for Robust Autonomous Flight in Indoor and Outdoor Environments with a Rotorcraft MAV. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–7 June 2014; pp. 4974–4981.
- Zheng, W.; Yan, B.; Wang, Z. Multi-sensor fusion based pose estimation for unmanned aerial vehicles on ships. In Proceedings of the IEEE International Conference on Information and Automation, Ningbo, China, 1–3 August 2016; pp. 648–653.
- Song, Y.; Nuske, S.; Scherer, S. A Multi-Sensor Fusion MAV State Estimation from Long-Range Stereo, IMU, GPS and Barometric Sensors. Sensors 2017, 17, 11.
- Gao, Y.; Liu, S.; Atia, M.M.; Noureldin, A. INS/GPS/LiDAR Integrated Navigation System for Urban and Indoor Environments Using Hybrid Scan Matching Algorithm. Sensors 2015, 15, 23286–23302.
- Paret, D. Multiplexed Networks for Embedded Systems: CAN, LIN, FlexRay, Safe-by-Wire; Wiley: Hoboken, NJ, USA, 2007; pp. 1–57.
- Kendoul, F. Survey of Advances in Guidance, Navigation and Control of Unmanned Rotorcraft Systems. J. Field Robot. 2012, 29, 315–378.
- Yao, Y.; Xu, X.; Zhu, C.; Chan, C.Y. A hybrid fusion algorithm for GPS/INS integration during GPS outages. Measurement 2017, 103, 42–51.
- Hinüber, E.L.; Reimer, C.; Schneider, T.; Stock, M. INS/GNSS Integration for Aerobatic Flight Applications and Aircraft Motion Surveying. Sensors 2017, 17, 941.
- Cadena, C.; Carlone, L.; Carrillo, H.; Latif, Y. Past, Present, and Future of Simultaneous Localization and Mapping: Toward the Robust-Perception Age. IEEE Trans. Robot. 2016, 32, 1309–1332.
- Kumar, G.A.; Patil, A.K.; Patil, R.; Park, S.S.; Chai, Y.H. A LiDAR and IMU Integrated Indoor Navigation System for UAVs and Its Application in Real-Time Pipeline Classification. Sensors 2017, 17, 1268.
- Li, R.; Liu, J.; Zhang, L.; Hang, Y. LiDAR/MEMS IMU integrated navigation (SLAM) method for a small UAV in indoor environments. In Proceedings of the 2014 DGON Inertial Sensors and Systems Symposium, Karlsruhe, Germany, 16–17 September 2014; pp. 18–32.
- Opromolla, R.; Fasano, G.; Rufino, G.; Grassi, M. LiDAR-Inertial Integration for UAV Localization and Mapping in Complex Environments. In Proceedings of the International Conference on Unmanned Aircraft Systems (ICUAS), Arlington, VA, USA, 7–10 June 2016; pp. 649–656.
- Bachrach, A.; He, R.; Roy, N. Autonomous Flight in Unstructured and Unknown Indoor Environments. 2010; pp. 1–8. Available online: https://dspace.mit.edu/handle/1721.1/54222 (accessed on 9 February 2020).
- Shen, S.; Michael, N.; Kumar, V. Tightly-coupled monocular visual-inertial fusion for autonomous flight of rotorcraft MAVs. In Proceedings of the IEEE International Conference on Robotics and Automation, Seattle, WA, USA, 26–30 May 2015; pp. 5303–5310.
- Qin, T.; Li, P.; Shen, S. VINS-Mono: A Robust and Versatile Monocular Visual-Inertial State Estimator. IEEE Trans. Robot. 2018, 34, 1004–1020.
- Huang, A.S.; Bachrach, A.; Henry, P.; Krainin, M.; Maturana, D.; Fox, D.; Roy, N. Visual Odometry and Mapping for Autonomous Flight Using an RGB-D Camera; Springer: Cham, Switzerland, 2017; pp. 1–17.
- Nieuwenhuisen, M.; Droeschel, D.; Beul, M.; Behnke, S. Autonomous Navigation for Micro Aerial Vehicles in Complex GNSS-denied Environments. J. Intell. Robot. Syst. 2016, 84, 199–216.
- Tsai, G.J.; Chiang, K.W.; Chu, C.H.; Chen, Y.L.; Habib, A. The Performance Analysis of an Indoor Mobile Mapping System with RGB-D Sensor. In Proceedings of the International Conference on Unmanned Aerial Vehicles in Geomatics, Toronto, ON, Canada, 30 August–2 September 2015; pp. 183–188.
- Honegger, D.; Meier, L.; Tanskanen, P.; Pollefeys, M. An open source and open hardware embedded metric optical flow CMOS camera for indoor and outdoor applications. In Proceedings of the IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013; pp. 1736–1741.
- Hess, W.; Kohler, D.; Rapp, H.; Andor, D. Real-time loop closure in 2D LIDAR SLAM. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation, Stockholm, Sweden, 16–21 May 2016; pp. 1271–1278.
- Zeng, Q.; Chen, W.; Liu, J.; Wang, H. An Improved Multi-Sensor Fusion Navigation Algorithm Based on the Factor Graph. Sensors 2017, 17, 641.
- Schmitz, G.; Alves, T.; Henriques, R.; Freitas, E.; Youssef, E.E. A simplified approach to motion estimation in a UAV using two filters. IFAC-PapersOnLine 2016, 49, 325–330.
- Valenti, R.G. State Estimation and Multi-Sensor Data Fusion for Micro Aerial Vehicles Navigation. Ph.D. Thesis, The City University of New York, New York, NY, USA, 2016.
- Samadzadegan, F.; Abdi, G. Autonomous navigation of Unmanned Aerial Vehicles based on multi-sensor data fusion. In Proceedings of the 20th Iranian Conference on Electrical Engineering (ICEE2012), Tehran, Iran, 15–17 May 2012; pp. 868–873.
- Khaleghi, B.; Khamis, A.; Karray, F.O.; Razavi, S.N. Multisensor data fusion: A review of the state-of-the-art. Inf. Fusion 2013, 14, 28–44.
- Kalman, R.E. A New Approach to Linear Filtering and Prediction Problems. Trans. ASME J. Basic Eng. 1960, 82, 35–45.
- Solà, J. Quaternion Kinematics for the Error-State Kalman Filter. Available online: https://hal.archives-ouvertes.fr/hal-01122406v5 (accessed on 9 February 2020).
- SBG Systems. Available online: https://www.sbg-systems.com/products/ellipse-2-series/#ellipse2-n-miniature-ins-gnss (accessed on 21 January 2020).
- Quan, Q. Introduction to Multicopter Design and Control; Springer: Singapore, 2017; pp. 150–160.
| Sensor | State | Accuracy |
|---|---|---|
| IMU (ELLIPSE-N) | Roll/Pitch | 0.1° |
| IMU (ELLIPSE-N) | Heading | 0.5° |
| GNSS (RTK) | Horizontal position | 1 cm + 1 ppm |
| GNSS (RTK) | Vertical position | 2 cm + 1 ppm |
| GNSS (RTK) | Velocity | <0.03 m/s |
| VICON | Position | <0.5 mm |
| Type | Specs |
|---|---|
| Weight (with 12,000 mAh TATTU batteries) | 8.5 kg |
| Diagonal wheelbase | 1000 mm |
| Max takeoff weight | 12 kg |
| Hovering accuracy (RTK) | Vertical: ±10 cm; Horizontal: ±10 cm |
| Max speed | 43 km/h (no wind) |
| Max wind resistance | 10 m/s |
| Hovering time | No payload: 25 min; 3 kg payload: 10 min |
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Du, H.; Wang, W.; Xu, C.; Xiao, R.; Sun, C. Real-Time Onboard 3D State Estimation of an Unmanned Aerial Vehicle in Multi-Environments Using Multi-Sensor Data Fusion. Sensors 2020, 20, 919. https://doi.org/10.3390/s20030919