Artificial Marker and MEMS IMU-Based Pose Estimation Method to Meet Multirotor UAV Landing Requirements
Figure 1. Examples of the proposed marker and binary encoding matrix.
Figure 2. Flow chart of pose estimation by detecting the marker.
Figure 3. Image processing results. (a) Original image; (b) binary image after Otsu binarization; (c) polygon contours extracted from (b); (d) quadrilateral contours and the matched marker; (e) corner points extracted on the marker; (f) corner points extracted on the central marker when the unmanned aerial vehicle (UAV) is very close to the ground.
Figure 4. Definition of coordinate systems and the camera projection model. O_c is the optical center of the camera; O_c-X_cY_cZ_c denotes the c-frame, O_p-uv denotes the p-frame, and O_w-X_wY_wZ_w denotes the w-frame.
Figure 5. System overview of the camera/INS (inertial navigation system) integrated navigation. The vehicle pose is estimated by the two sensors independently and then fused by an error-state extended Kalman filter (EKF), i.e., a loosely coupled architecture.
Figure 6. Installation relationship between the camera and the inertial measurement unit (IMU).
Figure 7. Experimental equipment.
Figure 8. Trajectory of the test platform in test 05.
Figure 9. Attitude variation of the test platform in test 05.
Figure 10. Experimental environment.
Figure 11. Position errors of the visual solution in test 03.
Figure 12. Attitude errors of the visual solution in test 03.
Figure 13. Corner point reprojection errors on the horizontal plane in test 01.
Figure 14. Comparison of roll and pitch errors of the visual solution and the visual/inertial solution in test 04.
Figure 15. Comparison of roll and pitch errors of the attitude and heading reference system (AHRS) solution and the visual/inertial solution.
Figure 16. Position errors of the visual/inertial solution in the vision outage simulation test.
Figure 17. Reprojection error in the vision outage simulation test.
Figure 18. Edge extraction and marker detection results in challenging situations. (a) Detected marker at a distance of 8 m with no tilt; (b) detected marker at a distance of 8 m with a tilt of 40 degrees.
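The captions of Figures 3 and 4 outline the visual pipeline: Otsu binarization, polygon and quadrilateral contour extraction, corner refinement on the marker, and pose recovery through the camera projection model linking the w-, c-, and p-frames. The snippet below is a minimal sketch of such a pipeline written against the OpenCV API; the camera intrinsics, marker side length, area threshold, and the omission of marker ID decoding are illustrative assumptions, not details taken from the paper.

```python
import cv2
import numpy as np

# Illustrative camera intrinsics and marker size (not values from the paper).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)            # assume negligible lens distortion
MARKER_SIDE = 0.40            # marker side length in metres (assumed)

# 3D corner coordinates of the marker in the w-frame (marker plane, Z_w = 0).
object_pts = np.array([[0.0, 0.0, 0.0],
                       [MARKER_SIDE, 0.0, 0.0],
                       [MARKER_SIDE, MARKER_SIDE, 0.0],
                       [0.0, MARKER_SIDE, 0.0]])

def estimate_pose(gray):
    """Detect one quadrilateral marker candidate and solve for the camera pose."""
    # Otsu binarization, as in Figure 3b.
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # Contour extraction and polygon approximation, as in Figure 3c-d.
    contours, _ = cv2.findContours(binary, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
    for cnt in contours:
        approx = cv2.approxPolyDP(cnt, 0.03 * cv2.arcLength(cnt, True), True)
        if len(approx) != 4 or cv2.contourArea(approx) < 400:
            continue          # keep only sufficiently large quadrilaterals

        # Sub-pixel corner refinement, as in Figure 3e.
        criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.01)
        corners = cv2.cornerSubPix(gray, approx.astype(np.float32),
                                   (5, 5), (-1, -1), criteria)

        # PnP: pose of the w-frame (marker) relative to the c-frame (camera).
        # NOTE: a real detector would decode the marker ID and order the
        # corners consistently with object_pts before this step.
        ok, rvec, tvec = cv2.solvePnP(object_pts, corners.reshape(-1, 2), K, dist)
        if ok:
            R_cw, _ = cv2.Rodrigues(rvec)   # rotation from w-frame to c-frame
            return R_cw, tvec
    return None
```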
Abstract
1. Introduction
2. Methods
2.1. Marker Detection
2.2. Pose Estimation by Marker
2.3. Visual/Inertial Fusion
2.3.1. INS Mechanization and System Model
- Integrate the angular rate to update the vehicle attitude;
- Transform the acceleration to the n-frame by the updated attitude, then integrate it to obtain the vehicle velocity;
- Integrate the velocity to calculate the vehicle position.
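A minimal numerical sketch of these three steps is given below, assuming a local-level NED n-frame and a flat-earth approximation (earth rotation and transport rate ignored); the function and variable names are illustrative and not taken from the paper.

```python
import numpy as np

GRAVITY_N = np.array([0.0, 0.0, 9.80665])   # gravity in the n-frame (NED, assumed)

def skew(v):
    """Skew-symmetric matrix of a 3-vector."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def rotvec_to_dcm(phi):
    """Direction cosine matrix of a rotation vector (Rodrigues formula)."""
    angle = np.linalg.norm(phi)
    if angle < 1e-12:
        return np.eye(3) + skew(phi)
    axis = phi / angle
    A = skew(axis)
    return np.eye(3) + np.sin(angle) * A + (1.0 - np.cos(angle)) * (A @ A)

def mechanize(C_bn, v_n, p_n, gyro, accel, dt):
    """One simplified INS mechanization step.

    C_bn : 3x3 DCM from b-frame to n-frame    gyro  : angular rate in b-frame (rad/s)
    v_n  : velocity in n-frame (m/s)          accel : specific force in b-frame (m/s^2)
    p_n  : position in n-frame (m)            dt    : IMU sampling interval (s)
    """
    # 1) Integrate the angular rate to update the attitude.
    C_bn = C_bn @ rotvec_to_dcm(gyro * dt)
    # 2) Transform the specific force to the n-frame, add gravity, integrate to velocity.
    v_n = v_n + (C_bn @ accel + GRAVITY_N) * dt
    # 3) Integrate the velocity to update the position.
    p_n = p_n + v_n * dt
    return C_bn, v_n, p_n
```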
2.3.2. Linearized Measurement Model
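As the Figure 5 caption notes, the camera and INS solutions are fused in a loosely coupled error-state EKF, with the vision-derived pose entering through a linearized measurement model. The fragment below sketches one such update using only the position part of the measurement; the 15-state error ordering, matrix names, and noise handling are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

# Illustrative error-state ordering: [position error (3), velocity error (3),
# attitude error (3), gyro bias (3), accelerometer bias (3)] -> 15 states.
N_STATES = 15

def ekf_position_update(x, P, p_ins, p_vis, R_vis):
    """Loosely coupled EKF update with a vision-derived position measurement.

    x     : 15x1 error-state vector        p_ins : INS-predicted position (3,)
    P     : 15x15 error covariance         p_vis : vision-derived position (3,)
    R_vis : 3x3 measurement noise covariance of the visual position
    """
    # Measurement: difference between the INS prediction and the visual observation.
    z = (p_ins - p_vis).reshape(3, 1)

    # Linearized measurement model z = H x + noise; H selects the position error.
    H = np.zeros((3, N_STATES))
    H[:, 0:3] = np.eye(3)

    # Standard Kalman update of the error state and its covariance.
    S = H @ P @ H.T + R_vis
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(N_STATES) - K @ H) @ P
    return x, P
```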
3. Experimental Results and Discussion
3.1. Experiment Description
3.2. Results and Discussion
4. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
Parameters | MEMS IMU (System under Test) | POS320 (Reference System) |
---|---|---|
Gyro bias stability | 200 | 0.5 |
Angle random walk | 0.24 | 0.05 |
Accelerometer bias stability | 0.01 | 0.00025 |
Velocity random walk | 3 | 0.1 |
Error | Statistics | Test 01 | Test 02 | Test 03 | Test 04 | Test 05
---|---|---|---|---|---|---
North Error (m) | RMSE | 0.007 | 0.006 | 0.008 | 0.006 | 0.006
 | MAX | 0.024 | 0.029 | 0.025 | 0.021 | 0.031
 | MEAN | 0.006 | 0.004 | 0.007 | 0.005 | 0.004
East Error (m) | RMSE | 0.008 | 0.009 | 0.010 | 0.008 | 0.007
 | MAX | 0.025 | 0.025 | 0.024 | 0.034 | 0.019
 | MEAN | 0.007 | 0.007 | 0.008 | 0.007 | 0.006
Down Error (m) | RMSE | 0.003 | 0.003 | 0.004 | 0.004 | 0.003
 | MAX | 0.013 | 0.010 | 0.013 | 0.012 | 0.014
 | MEAN | 0.003 | 0.003 | 0.003 | 0.003 | 0.002
Error | Statistics | Test 01 | Test 02 | Test 03 | Test 04 | Test 05
---|---|---|---|---|---|---
Roll Error (°) | RMSE | 0.45 | 0.53 | 0.54 | 0.38 | 0.34
 | MAX | 1.45 | 1.58 | 1.75 | 1.47 | 1.14
 | MEAN | 0.37 | 0.42 | 0.41 | 0.31 | 0.27
Pitch Error (°) | RMSE | 0.37 | 0.49 | 0.55 | 0.39 | 0.35
 | MAX | 1.12 | 1.33 | 1.76 | 1.52 | 1.05
 | MEAN | 0.29 | 0.41 | 0.45 | 0.30 | 0.28
Yaw Error (°) | RMSE | 0.05 | 0.08 | 0.09 | 0.05 | 0.05
 | MAX | 0.15 | 0.28 | 0.29 | 0.13 | 0.20
 | MEAN | 0.04 | 0.05 | 0.06 | 0.04 | 0.04
Error | Statistics | Test 01 | Test 02 | Test 03 | Test 04 | Test 05
---|---|---|---|---|---|---
North Error (m) | RMSE | 0.007 | 0.006 | 0.009 | 0.007 | 0.006
 | MAX | 0.025 | 0.180 | 0.029 | 0.022 | 0.034
 | MEAN | 0.006 | 0.005 | 0.007 | 0.005 | 0.005
East Error (m) | RMSE | 0.009 | 0.009 | 0.010 | 0.008 | 0.008
 | MAX | 0.033 | 0.031 | 0.025 | 0.034 | 0.023
 | MEAN | 0.007 | 0.007 | 0.008 | 0.007 | 0.006
Down Error (m) | RMSE | 0.004 | 0.004 | 0.005 | 0.004 | 0.004
 | MAX | 0.013 | 0.010 | 0.015 | 0.010 | 0.012
 | MEAN | 0.003 | 0.003 | 0.003 | 0.004 | 0.003
Error | Statistics | Test 01 | Test 02 | Test 03 | Test 04 | Test 05
---|---|---|---|---|---|---
Roll Error (°) | RMSE | 0.08 | 0.44 | 0.34 | 0.21 | 0.43
 | MAX | 0.24 | 1.66 | 1.50 | 0.88 | 1.86
 | MEAN | 0.06 | 0.24 | 0.20 | 0.17 | 0.28
Pitch Error (°) | RMSE | 0.17 | 0.14 | 0.24 | 0.27 | 0.17
 | MAX | 0.83 | 0.58 | 0.80 | 1.03 | 0.70
 | MEAN | 0.11 | 0.08 | 0.19 | 0.21 | 0.12
Yaw Error (°) | RMSE | 0.06 | 0.13 | 0.07 | 0.05 | 0.03
 | MAX | 0.13 | 0.36 | 0.32 | 0.12 | 0.15
 | MEAN | 0.05 | 0.10 | 0.05 | 0.04 | 0.02
Statistics | Test 01 | Test 02 | Test 03 | Test 04 | Test 05
---|---|---|---|---|---
RMSE (m) | 0.022 | 0.023 | 0.021 | 0.020 | 0.024
MAX (m) | 0.042 | 0.049 | 0.049 | 0.033 | 0.044
MEAN (m) | 0.020 | 0.020 | 0.018 | 0.019 | 0.023
Module | MAX (ms) | MEAN (ms) |
---|---|---|
Marker detection | 36.15 | 25.21 |
Pose estimation by marker | 9.01 | 1.71 |
Filtering update | 1.21 | 0.25 |
Total | 70.34 | 32.26 |
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).