
Point-line feature fusion based field real-time RGB-D SLAM

Published: 01 October 2022

Abstract

3D reconstruction of crops is important for studying their biological properties, canopy light distribution, and robotic harvesting. However, the complexity of the field environment makes real-time 3D reconstruction of crops difficult. Because field scenes are low-textured, existing single-feature SLAM methods struggle to extract enough effective features to build accurate, real-time 3D maps of the field. In this paper, we propose a novel RGB-D SLAM based on point-line feature fusion for real-time 3D reconstruction of field scenes. We first build a 3D scene map of the field on a point-line feature structure by jointly optimizing camera poses over point and line features. Then, a joint point cloud filtering method is designed based on keyframe optimization of the point-line features. Finally, we obtain a globally consistent, high-quality dense map. The overall pose estimation and reconstruction performance is evaluated on public benchmarks and improves on state-of-the-art methods. Qualitative experiments on field scenes show that our method enables robust, real-time 3D reconstruction of crops.
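The joint pose optimization described above minimizes reprojection errors for both point and line features. As a minimal sketch of the two residual types (the function names and the normalized 2D line parameterization (a, b, c) are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def project(K, T, p_world):
    """Project a 3D world point into the image with pose T (4x4) and intrinsics K."""
    p_cam = T[:3, :3] @ p_world + T[:3, 3]
    uv = K @ p_cam
    return uv[:2] / uv[2]

def point_error(K, T, p_world, uv_obs):
    """Point residual: observed pixel minus projected 3D point."""
    return uv_obs - project(K, T, p_world)

def line_error(K, T, p_world, q_world, line_obs):
    """Line residual: signed distances of the projected 3D segment endpoints
    to the observed 2D line (a, b, c), with a^2 + b^2 = 1."""
    a, b, c = line_obs
    d = []
    for x in (p_world, q_world):
        u, v = project(K, T, x)
        d.append(a * u + b * v + c)
    return np.array(d)
```

A solver (e.g. Gauss-Newton over stacked point and line residuals) would then refine T; both residuals vanish when the pose, landmarks, and observations are consistent.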

Highlights

A point-line feature fusion optimization method is proposed to improve the accuracy of pose estimation.
A keyframe optimization strategy and a dense map optimization scheme are proposed to achieve a globally consistent dense map.
A real-time RGB-D SLAM method based on point-line feature fusion is presented for 3D reconstruction of field scenes.
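Building a globally consistent dense map from keyframes typically involves filtering the per-keyframe point clouds before fusion. As one common ingredient of such filtering, a voxel-grid downsample can be sketched in numpy (the function name and voxel size are illustrative assumptions; the paper's joint filtering method is not specified here):

```python
import numpy as np

def voxel_filter(points, voxel=0.05):
    """Downsample an (N, 3) point cloud by averaging the points in each voxel."""
    keys = np.floor(points / voxel).astype(np.int64)
    _, inv = np.unique(keys, axis=0, return_inverse=True)
    inv = inv.reshape(-1)
    counts = np.bincount(inv)
    out = np.zeros((inv.max() + 1, 3))
    for k in range(3):  # per-axis mean of the points falling in each voxel
        out[:, k] = np.bincount(inv, weights=points[:, k]) / counts
    return out
```

Averaging within voxels both bounds the map's memory footprint and suppresses depth noise, which matters for keeping dense mapping real-time on field-scale scenes.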




Published In

Computers and Graphics, Volume 107, Issue C, October 2022, 340 pages

Publisher

Pergamon Press, Inc., United States

Author Tags

1. Visual SLAM
2. RGB-D camera
3. Point-line feature
4. Field scene reconstruction

Qualifiers

• Research-article
