
Fiducial Markers for Pose Estimation

Overview, Applications and Experimental Comparison of the ARTag, AprilTag, ArUco and STag Markers

  • Regular Paper
  • Published in: Journal of Intelligent & Robotic Systems

Abstract

Robust localization is critical for the navigation and control of mobile robots. Global Navigation Satellite Systems (GNSS), Visual-Inertial Odometry (VIO), and Simultaneous Localization and Mapping (SLAM) offer different methods for achieving this goal. In some cases, however, these methods may be unavailable or may not provide sufficient accuracy, and they can be augmented or replaced with fiducial marker pose estimation. Fiducial markers increase the accuracy and robustness of a localization system by providing an easily recognizable visual feature with embedded fault detection. This paper presents an overview of fiducial markers developed in recent years and an experimental comparison of the four markers (ARTag, AprilTag, ArUco, and STag) that represent the state of the art and the most widely used packages. The markers are evaluated on their accuracy, detection rate, and computational cost in several scenarios, including simulated noise from shadows and motion blur. Different marker configurations are also considered: single markers, planar and non-planar bundles, and multi-sized marker bundles.
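The "embedded fault detection" mentioned in the abstract comes from the markers' coded payloads: a detector accepts a candidate square only if its decoded bit pattern lies within a small Hamming distance of a dictionary codeword. A minimal pure-Python sketch of this decoding step (the 8-bit dictionary and the distance threshold here are illustrative, not taken from any of the compared marker systems):

```python
def hamming(a, b):
    """Number of differing bits between two binary payloads."""
    return bin(a ^ b).count("1")

def decode(bits, dictionary, max_dist=1):
    """Match a detected bit pattern against a marker dictionary.

    Returns (marker_id, distance) for the closest codeword, or None
    if no codeword is within max_dist (a likely false detection).
    """
    marker_id, codeword = min(dictionary.items(),
                              key=lambda kv: hamming(bits, kv[1]))
    d = hamming(bits, codeword)
    return (marker_id, d) if d <= max_dist else None

# Illustrative 8-bit dictionary with large pairwise Hamming distance
DICT = {0: 0b00000000, 1: 0b11110000, 2: 0b00001111, 3: 0b11111111}

print(decode(0b11110001, DICT))  # one flipped bit -> corrected to id 1
print(decode(0b10101010, DICT))  # far from every codeword -> rejected (None)
```

AprilTag and ArUco apply the same principle with longer codewords and dictionaries generated to maximize the minimum pairwise Hamming distance, which lets a detector both reject spurious quads and correct a small number of bit errors.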


References

  1. Kalaitzakis, M., Carroll, S., Ambrosi, A., Whitehead, C., Vitzilaios, N.: Experimental comparison of fiducial markers for pose estimation. In: 2020 International Conference on Unmanned Aircraft Systems (ICUAS). https://doi.org/10.1109/icuas48674.2020.9213977, pp 781–789. Athens, Greece (2020)

  2. Leonard, J.J., Durrant-Whyte, H.F.: Mobile robot localization by tracking geometric beacons. IEEE Trans. Robot. Autom. 7(3), 376–382 (1991)

  3. Grupen, R.A., Henderson, T.C., McCammon, I.D.: A survey of general-purpose manipulation. Int. J. Robot. Res. 8(1), 38–62 (1989). https://doi.org/10.1177/027836498900800103

  4. Delmerico, J., Scaramuzza, D.: A benchmark comparison of monocular visual-inertial Odometry algorithms for flying robots. In: 2018 IEEE International Conference on Robotics and Automation (ICRA). https://doi.org/10.1109/icra.2018.8460664. IEEE (2018)

  5. Scaramuzza, D., Fraundorfer, F.: Visual odometry [tutorial]. IEEE Robot. Automat. Magazine 18(4), 80–92 (2011). https://doi.org/10.1109/mra.2011.943233

  6. Fraundorfer, F., Scaramuzza, D.: Visual odometry: Part II: Matching, robustness, optimization, and applications. IEEE Robot. Autom. Magazine 19(2), 78–90 (2012). https://doi.org/10.1109/mra.2012.2182810

  7. Fuentes-Pacheco, J., Ruiz-Ascencio, J., Rendón-Mancha, J.M.: Visual simultaneous localization and mapping: A survey. Artif. Intell. Rev. 43(1), 55–81 (2012). https://doi.org/10.1007/s10462-012-9365-8

  8. Cadena, C., Carlone, L., Carrillo, H., Latif, Y., Scaramuzza, D., Neira, J., Reid, I., Leonard, J.J.: Past, present, and future of simultaneous localization and mapping: Toward the robust-perception age. IEEE Trans. Robot. 32(6), 1309–1332 (2016). https://doi.org/10.1109/tro.2016.2624754

  9. Chahine, G., Pradalier, C.: Survey of monocular SLAM algorithms in natural environments. In: 2018 15Th Conference on Computer and Robot Vision (CRV). https://doi.org/10.1109/crv.2018.00055. IEEE (2018)

  10. Joshi, B., Rahman, S., Kalaitzakis, M., Cain, B., Johnson, J., Xanthidis, M., Karapetyan, N., Hernandez, A., Li, A.Q., Vitzilaios, N., Rekleitis, I.: Experimental comparison of open source visual-inertial-based state estimation algorithms in the underwater domain. In: 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). https://doi.org/10.1109/iros40897.2019.8968049. IEEE (2019)

  11. Yu, G., Liu, Y., Han, X., Zhang, C.: Objects grasping of robotic arm with compliant grasper based on vision. In: Proceedings of the 2019 4th International Conference on Automation, Control and Robotics Engineering. https://doi.org/10.1145/3351917.3351958. ACM (2019)

  12. Quigley, M., Gerkey, B.P., Conley, K., Faust, J., Foote, T., Leibs, J., Berger, E., Wheeler, R., Ng, A.Y.: ROS: An open-source robot operating system. Technical report (2009)

  13. Shabalina, K., Sagitov, A., Sabirova, L., Li, H., Magid, E.: ARTag, AprilTag and CALTag fiducial systems comparison in a presence of partial rotation: Manual and automated approaches. In: Informatics in Control, Automation and Robotics. https://doi.org/10.1007/978-3-030-11292-9_27, pp 536–558. Springer International Publishing (2019)

  14. dos Santos Cesar, D.B., Gaudig, C., Fritsche, M., dos Reis, M.A., Kirchner, F.: An evaluation of artificial fiducial markers in underwater environments. In: OCEANS 2015 - Genova. https://doi.org/10.1109/oceans-genova.2015.7271491. IEEE (2015)

  15. Babinec, A., Jurišica, L., Hubinský, P., Duchoň, F.: Visual localization of mobile robot using artificial markers. Procedia Eng. 96, 1–9 (2014). https://doi.org/10.1016/j.proeng.2014.12.091

  16. Olson, E.: AprilTag: A robust and flexible visual fiducial system. In: 2011 IEEE International Conference on Robotics and Automation. https://doi.org/10.1109/icra.2011.5979561. IEEE (2011)

  17. Wang, J., Olson, E.: AprilTag 2: Efficient and robust fiducial detection. In: 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). https://doi.org/10.1109/iros.2016.7759617. IEEE (2016)

  18. Fiala, M.: ARTag, a Fiducial marker system using digital techniques. In: 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR’05). https://doi.org/10.1109/cvpr.2005.74. IEEE (2005)

  19. Fiala, M.: Designing highly reliable fiducial markers. IEEE Trans. Pattern Anal. Mach. Intell. 32(7), 1317–1324 (2010). https://doi.org/10.1109/tpami.2009.146

  20. Kato, H., Billinghurst, M.: Marker tracking and HMD calibration for a video-based augmented reality conferencing system. In: Proceedings 2nd IEEE and ACM International Workshop on Augmented Reality (IWAR’99). https://doi.org/10.1109/iwar.1999.803809. IEEE Comput. Soc (1999)

  21. Garrido-Jurado, S., Muñoz-Salinas, R., Madrid-Cuevas, F., Marín-Jiménez, M.: Automatic generation and detection of highly reliable fiducial markers under occlusion. Pattern Recogn. 47(6), 2280–2292 (2014). https://doi.org/10.1016/j.patcog.2014.01.005

  22. Flohr, D., Fischer, J.: A lightweight ID-based extension for marker tracking systems. In: Eurographics Symposium on Virtual Environments (EGVE) Short Paper Proceedings, pp 59–64 (2007)

  23. Klokmose, C.N., Kristensen, J.B., Bagge, R., Halskov, K.: Bullseye: High-precision fiducial tracking for table-based tangible interaction. In: Proceedings of the Ninth ACM International Conference on Interactive Tabletops and Surfaces - ITS. https://doi.org/10.1145/2669485.2669503. ACM Press (2014)

  24. Atcheson, B., Heide, F., Heidrich, W.: CALTag: High precision fiducial markers for camera calibration. In: Vision, Modeling and Visualization (2010). https://doi.org/10.2312/PE/VMV/VMV10/041-048

  25. Gatrell, L.B., Hoff, W.A., Sklair, C.W.: Robust image features: Concentric contrasting circles and their image extraction. In: Stoney, W.E. (ed.) Cooperative Intelligent Robotics in Space II. https://doi.org/10.1117/12.56761. SPIE (1992)

  26. Calvet, L., Gurdjos, P., Griwodz, C., Gasparini, S.: Detection and accurate localization of circular Fiducials under highly challenging conditions. In: 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). https://doi.org/10.1109/cvpr.2016.67. IEEE (2016)

  27. DeGol, J., Bretl, T., Hoiem, D.: Chromatag: A colored marker and fast detection algorithm. In: 2017 IEEE International Conference on Computer Vision (ICCV). https://doi.org/10.1109/iccv.2017.164. IEEE (2017)

  28. Sattar, J., Bourque, E., Giguere, P., Dudek, G.: Fourier Tags: Smoothly degradable Fiducial markers for use in human-robot interaction. In: Fourth Canadian Conference on Computer and Robot Vision (CRV ’07). https://doi.org/10.1109/crv.2007.34. IEEE (2007)

  29. Xu, A., Dudek, G.: Fourier Tag: A smoothly degradable Fiducial marker system with configurable payload capacity. In: 2011 Canadian Conference on Computer and Robot Vision. https://doi.org/10.1109/crv.2011.13. IEEE (2011)

  30. Naimark, L., Foxlin, E.: Circular data matrix Fiducial system and robust image processing for a Wearable Vision-Inertial Self-Tracker. In: Proceedings. International Symposium on Mixed and Augmented Reality. https://doi.org/10.1109/ismar.2002.1115065. IEEE Comput. Soc (2002)

  31. Cho, Y., Lee, J., Neumann, U.: A multi-ring color fiducial system and an intensity-invariant detection method for scalable Fiducial-Tracking augmented reality. In: IWAR, pp 1–15 (1998)

  32. Bergamasco, F., Albarelli, A., Torsello, A.: Pi-tag: A fast image-space marker design based on projective invariants. Mach. Vis. Appl. 24(6), 1295–1310 (2012). https://doi.org/10.1007/s00138-012-0469-6

  33. Bergamasco, F., Albarelli, A., Rodola, E., Torsello, A.: RUNE-Tag: A high accuracy Fiducial marker with strong occlusion resilience. In: CVPR 2011. https://doi.org/10.1109/cvpr.2011.5995544. IEEE (2011)

  34. Bergamasco, F., Albarelli, A., Cosmo, L., Rodola, E., Torsello, A.: An accurate and robust artificial marker based on cyclic codes. IEEE Trans. Pattern Anal. Mach. Intell. 38(12), 2359–2373 (2016). https://doi.org/10.1109/tpami.2016.2519024

  35. Schweiger, F., Zeisl, B., Georgel, P., Schroth, G., Steinbach, E., Navab, N.: Maximum detector Response markers for SIFT and SURF. In: VMV 2009 - Proceedings of the Vision Modeling, and Visualization Workshop, vol. 2009, pp 145–154 (2009)

  36. Benligiray, B., Topal, C., Akinlar, C.: STag: A stable fiducial marker system. Image Vis. Comput. 89, 158–169 (2019). https://doi.org/10.1016/j.imavis.2019.06.007

  37. Yu, G., Hu, Y., Dai, J.: TopoTag: A robust and scalable topological fiducial marker system. IEEE Trans. Vis. Comput. Graph. (2020). https://doi.org/10.1109/tvcg.2020.2988466

  38. de Ipiña, D.L., Mendonça, P.R., Hopper, A.: TRIP: A low-cost vision-based location system for ubiquitous computing. Pers. Ubiquit. Comput. 6(3), 206–219 (2002). https://doi.org/10.1007/s007790200020

  39. Rohs, M.: Real-world interaction with camera phones. In: Ubiquitous Computing Systems, pp 74–89. Springer Berlin Heidelberg (2005). https://doi.org/10.1007/11526858_7

  40. Lightbody, P., Krajník, T., Hanheide, M.: An efficient visual fiducial localisation system. ACM SIGAPP Appl. Comput. Rev. 17(3), 28–37 (2017). https://doi.org/10.1145/3161534.3161537

  41. Abbas, S.M., Aslam, S., Berns, K., Muhammad, A.: Analysis and improvements in AprilTag based state estimation. Sensors 19(24), 5480 (2019). https://doi.org/10.3390/s19245480

  42. Krajník, T., Nitsche, M., Faigl, J., Vaněk, P., Saska, M., Přeučil, L., Duckett, T., Mejail, M.: A practical multirobot localization system. J. Intell. Robot. Syst. 76(3-4), 539–562 (2014). https://doi.org/10.1007/s10846-014-0041-x

  43. Tian, Z., Carver, C.J., Shao, Q., Roznere, M., Li, A.Q., Zhou, X.: PolarTag: Invisible data with light polarization. In: Proceedings of the 21st International Workshop on Mobile Computing Systems and Applications. https://doi.org/10.1145/3376897.3377854. ACM (2020)

  44. Jayatilleke, L., Zhang, N.: Landmark-based localization for unmanned aerial vehicles. In: 2013 IEEE International Systems Conference (SysCon). https://doi.org/10.1109/syscon.2013.6549921. IEEE (2013)

  45. Vanegas, F., Gonzalez, F.: Enabling UAV navigation with sensor and environmental uncertainty in cluttered and GPS-denied environments. Sensors 16(5), 666 (2016). https://doi.org/10.3390/s16050666

  46. Zhenglong, G., Qiang, F., Quan, Q.: Pose estimation for multicopters based on monocular vision and AprilTag. In: 2018 37th Chinese Control Conference (CCC). https://doi.org/10.23919/chicc.2018.8483685. IEEE (2018)

  47. Pickem, D., Glotfelter, P., Wang, L., Mote, M., Ames, A., Feron, E., Egerstedt, M.: The Robotarium: A remotely accessible swarm robotics research Testbed. In: 2017 IEEE International Conference on Robotics and Automation (ICRA). https://doi.org/10.1109/icra.2017.7989200. IEEE (2017)

  48. Tsoukalas, A., Tzes, A., Khorrami, F.: Relative pose estimation of unmanned aerial systems. In: 2018 26th Mediterranean Conference on Control and Automation (MED). https://doi.org/10.1109/med.2018.8442959. IEEE (2018)

  49. Heshmati-Alamdari, S., Bechlioulis, C.P., Karras, G.C., Nikou, A., Dimarogonas, D.V., Kyriakopoulos, K.J.: A robust interaction control approach for underwater vehicle manipulator systems. Annu. Rev. Control. 46, 315–325 (2018). https://doi.org/10.1016/j.arcontrol.2018.10.003

  50. Bormann, R., Hampp, J., Hagele, M.: New brooms sweep clean - An autonomous robotic cleaning assistant for professional office cleaning. In: 2015 IEEE International Conference on Robotics and Automation (ICRA). https://doi.org/10.1109/icra.2015.7139818. IEEE (2015)

  51. Cantieri, A.R., Wehrmeister, M.A., Oliveira, A.S., Lima, J., Ferraz, M., Szekir, G.: Proposal of an augmented reality tag UAV positioning system for power line tower inspection. In: Advances in Intelligent Systems and Computing, pp 87–98. Springer International Publishing (2019). https://doi.org/10.1007/978-3-030-35990-4_8

  52. Kalaitzakis, M., Kattil, S.R., Vitzilaios, N., Rizos, D., Sutton, M.: Dynamic structural health monitoring using a DIC-Enabled Drone. In: 2019 International Conference on Unmanned Aircraft Systems (ICUAS). Atlanta, GA, USA. https://doi.org/10.1109/icuas.2019.8798270, pp 321–327 (2019)

  53. Carroll, S.: Autonomous Drone-based sensor package deployment to the underside of structures. Master’s thesis, University of South Carolina (2020)

  54. Lim, H., Lee, Y.S.: Real-Time Single Camera SLAM Using Fiducial Markers. In: 2009 ICCAS-SICE. IEEE, Fukuoka, Japan, pp 177–182 (2009)

  55. Pfrommer, B., Daniilidis, K.: TagSLAM: Robust SLAM with fiducial markers. arXiv:1910.00679 (2019)

  56. Bacik, J., Durovsky, F., Fedor, P., Perdukova, D.: Autonomous flying with quadrocopter using fuzzy control and ArUco markers. Intell. Serv. Robot. 10(3), 185–194 (2017). https://doi.org/10.1007/s11370-017-0219-8

  57. Muñoz-Salinas, R., Marín-jimenez, M.J., Yeguas-Bolivar, E., Medina-Carnicer, R.: Mapping and localization from planar markers. Pattern Recognit. 73, 158–171 (2018). https://doi.org/10.1016/j.patcog.2017.08.010

  58. VTT Technical Research Centre of Finland Ltd: A library for virtual and augmented reality. http://virtual.vtt.fi/virtual/proj2/multimedia/alvar/ (2019)

  59. Wubben, J., Fabra, F., Calafate, C.T., Krzeszowski, T., Marquez-Barja, J.M., Cano, J.C., Manzoni, P.: Accurate landing of unmanned aerial vehicles using ground pattern recognition. Electronics 8(12), 1532 (2019). https://doi.org/10.3390/electronics8121532

  60. Zhang, X., Jiang, J., Fang, Y., Zhang, X., Chen, X.: Enhanced Fiducial marker based precise landing for Quadrotors. In: 2019 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM). https://doi.org/10.1109/aim.2019.8868532. IEEE (2019)

  61. Sani, M.F., Karimian, G.: Automatic navigation and landing of an indoor AR. Drone Quadrotor using ArUco marker and inertial sensors. In: 2017 International Conference on Computer and Drone Applications (IConDA). https://doi.org/10.1109/iconda.2017.8270408. IEEE (2017)

  62. Chaves, S., Wolcott, R., Eustice, R.: NEEC Research: Toward GPS-denied landing of unmanned aerial vehicles on ships at sea. Nav. Eng. J. 127(1), 23–35 (2015)

  63. Li, Z., Chen, Y., Lu, H., Wu, H., Cheng, L.: UAV autonomous landing technology based on AprilTags vision positioning algorithm. In: 2019 Chinese Control Conference (CCC). https://doi.org/10.23919/chicc.2019.8865757. IEEE (2019)

  64. Araar, O., Aouf, N., Vitanov, I.: Vision based autonomous landing of multirotor UAV on moving platform. J. Intell. Robot. Syst. 85(2), 369–384 (2016). https://doi.org/10.1007/s10846-016-0399-z

  65. Vlantis, P., Marantos, P., Bechlioulis, C.P., Kyriakopoulos, K.J.: Quadrotor landing on an inclined platform of a moving ground vehicle. In: 2015 IEEE International Conference on Robotics and Automation (ICRA). https://doi.org/10.1109/icra.2015.7139490. IEEE (2015)

  66. Kyristsis, S., Antonopoulos, A., Chanialakis, T., Stefanakis, E., Linardos, C., Tripolitsiotis, A., Partsinevelos, P.: Towards autonomous modular UAV missions: the detection, geo-location and landing paradigm. Sensors 16(11), 1844 (2016). https://doi.org/10.3390/s16111844

  67. Borowczyk, A., Nguyen, D.T., Nguyen, A.P.V., Nguyen, D.Q., Saussié, D., Ny, J.L.: Autonomous landing of a multirotor micro air vehicle on a high velocity ground vehicle. IFAC-PapersOnLine 50(1), 10488–10494 (2017). https://doi.org/10.1016/j.ifacol.2017.08.1980

  68. Liang, X., Chen, G., Zhao, S., Xiu, Y.: Moving target tracking method for unmanned aerial vehicle/unmanned ground vehicle heterogeneous system based on AprilTags. Measurement Control 53 (3-4), 427–440 (2020). https://doi.org/10.1177/0020294019889074

  69. Chen, J., Liu, T., Shen, S.: Tracking a moving target in cluttered environments using a Quadrotor. In: 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). https://doi.org/10.1109/iros.2016.7759092. IEEE (2016)

  70. Hood, S., Benson, K., Hamod, P., Madison, D., O’Kane, J.M., Rekleitis, I.: Bird’s Eye View: Cooperative exploration by UGV and UAV. In: 2017 International Conference on Unmanned Aircraft Systems (ICUAS). https://doi.org/10.1109/icuas.2017.7991513. IEEE (2017)

  71. Tweddle, B.E., Saenz-Otero, A.: Relative computer vision-based navigation for small inspection spacecraft. J. Guidance Control Dynam. 38(5), 969–978 (2015). https://doi.org/10.2514/1.g000687

  72. Sattar, J., Dudek, G.: A vision-based control and interaction framework for a legged underwater robot. In: 2009 Canadian Conference on Computer and Robot Vision. https://doi.org/10.1109/crv.2009.18. IEEE (2009)

  73. Deeds, J., Engstrom, Z., Gill, C., Wood, Z., Wang, J., Ahn, I.S., Lu, Y.: Autonomous vision-based target detection using unmanned aerial vehicle. In: 2018 IEEE 61st International Midwest Symposium on Circuits and Systems (MWSCAS). https://doi.org/10.1109/mwscas.2018.8623940. IEEE (2018)

  74. Falanga, D., Zanchettin, A., Simovic, A., Delmerico, J., Scaramuzza, D.: Vision-based autonomous Quadrotor landing on a moving platform. In: 2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR). https://doi.org/10.1109/ssrr.2017.8088164. IEEE (2017)

  75. Kalaitzakis, M., Cain, B., Vitzilaios, N., Rekleitis, I., Moulton, J.: A marsupial robotic system for surveying and inspection of freshwater ecosystems. J. Field Robot. 38(1), 121–138 (2021). https://doi.org/10.1002/rob.21957

Author information

Contributions

The authors confirm contribution to the paper as follows: study conception and design: M. Kalaitzakis and N. Vitzilaios; data collection: M. Kalaitzakis, B. Cain, S. Carroll, A. Ambrosi and C. Whitehead; analysis and interpretation of results: M. Kalaitzakis, B. Cain, S. Carroll and N. Vitzilaios; draft manuscript preparation: M. Kalaitzakis, B. Cain, S. Carroll, A. Ambrosi and C. Whitehead; manuscript revision: M. Kalaitzakis and N. Vitzilaios. All authors reviewed the results and approved the final version of the manuscript.

Corresponding author

Correspondence to Nikolaos Vitzilaios.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

A preliminary version of this paper was presented at the 2020 International Conference on Unmanned Aircraft Systems (ICUAS 2020) and published in [1].

About this article

Cite this article

Kalaitzakis, M., Cain, B., Carroll, S. et al. Fiducial Markers for Pose Estimation. J Intell Robot Syst 101, 71 (2021). https://doi.org/10.1007/s10846-020-01307-9

Keywords

Navigation