

Towards autonomous mapping in agriculture: A review of supportive technologies for ground robotics

Published: 01 November 2023

Abstract

This paper surveys the supportive technologies currently available for ground mobile robots used for autonomous mapping in agriculture. Unlike previous reviews, we describe state-of-the-art approaches and technologies aimed at extracting information from agricultural environments, not only for navigation but especially for mapping and monitoring. State-of-the-art platforms and sensors, modern localization techniques, navigation and path planning approaches, and the potential of artificial intelligence for autonomous mapping in agriculture are analyzed. According to the findings of this review, many recent mobile robots already provide full navigation and autonomous mapping capabilities, and significant resources are currently devoted to this research area to further improve mobile robot performance in this complex and challenging field.

Highlights

We review hardware technologies for autonomous robotic mapping in agriculture.
Modern localization and mapping approaches for mobile robotics are discussed.
Path planning strategies for autonomous navigation in agriculture are analyzed.
The role of artificial intelligence for robotic mapping and monitoring is explored.
Future trends and challenges for robotic mapping in agriculture are highlighted.

[162]
Pettorelli N., Vik J.O., Mysterud A., Gaillard J.-M., Tucker C.J., Stenseth N.C., Using the satellite-derived NDVI to assess ecological responses to environmental change, Trends Ecol. Evol. 20 (9) (2005) 503–510.
[163]
Thompson C.N., Guo W., Sharma B., Ritchie G.L., Using normalized difference red edge index to assess maturity in cotton, Crop Sci. 59 (5) (2019) 2167–2177.
[164]
Pôças I., Calera A., Campos I., Cunha M., Remote sensing for estimating and mapping single and basal crop coefficients: A review on spectral vegetation indices approaches, Agricult. Water Manag. 233 (2020).
[165]
Dong T., Liu J., Shang J., Qian B., Ma B., Kovacs J.M., Walters D., Jiao X., Geng X., Shi Y., Assessment of red-edge vegetation indices for crop leaf area index estimation, Remote Sens. Environ. 222 (2019) 133–143.
[166]
M. Bietresato, G. Carabin, D. D’Auria, R. Gallo, G. Ristorto, F. Mazzetto, R. Vidoni, A. Gasparetto, L. Scalera, A tracked mobile robotic lab for monitoring the plants volume and health, in: 2016 12th IEEE/ASME International Conference on Mechatronic and Embedded Systems and Applications, MESA, 2016, pp. 1–6.
[167]
Khanal S., Fulton J., Shearer S., An overview of current and potential applications of thermal remote sensing in precision agriculture, Comput. Electron. Agric. 139 (2017) 22–32.
[168]
Peng C., Fei Z., Vougioukas S.G., Depth camera based row end detection and headland maneuvering in orchard navigation without GNSS, in: 30th Mediterranean Conference on Control and Automation, Athens, Greece, IEEE, 2022, pp. 538–544.
[169]
Chen J., Qiang H., Wu J., Xu G., Wang Z., Navigation path extraction for greenhouse cucumber-picking robots using the prediction-point Hough transform, Comput. Electron. Agric. 180 (2021).
[170]
Colucci G., Botta A., Tagliavini L., Cavallone P., Baglieri L., Quaglia G., Kinematic modeling and motion planning of the mobile manipulator AgriQ for precision agriculture, Machines 10 (5) (2022) 321.
[171]
Fei Z., Vougioukas S., Row-sensing templates: A generic 3D sensor-based approach to robot localization with respect to orchard row centerlines, J. Field Robotics 39 (2022) 712–738.
[172]
Kragh M., Jørgensen R.N., Pedersen H., Object detection and terrain classification in agricultural fields using 3D LiDAR data, in: International Conference on Computer Vision Systems, Springer, 2015, pp. 188–197.
[173]
Li Q., Xu Y., Minimum-time row transition control of a vision-guided agricultural robot, J. Field Robotics 39 (2022) 335–354.
[174]
Narváez F.Y., Gregorio E., Escolà A., Rosell-Polo J.R., Torres-Torriti M., Cheein F.A., Terrain classification using ToF sensors for the enhancement of agricultural machinery traversability, J. Terramech. 76 (2018) 1–13.
[175]
Chen M., Tang Y., Zou X., Huang Z., Zhou H., Chen S., 3D global mapping of large-scale unstructured orchard integrating eye-in-hand stereo vision and SLAM, Comput. Electron. Agric. 187 (2021).
[176]
Skoczeń M., Ochman M., Spyra K., Nikodem M., Krata D., Panek M., Pawłowski A., Obstacle detection system for agricultural mobile robot application using RGB-D cameras, Sensors 21 (16) (2021) 5292.
[177]
Yandun F., Parhar T., Silwal A., Clifford D., Yuan Z., Levine G., Yaroshenko S., Kantor G., Reaching pruning locations in a vine using a deep reinforcement learning policy, in: 2021 IEEE International Conference on Robotics and Automation, ICRA, IEEE, 2021, pp. 2400–2406.
[178]
Zhang X., Guo Y., Yang J., Li D., Wang Y., Zhao R., Many-objective evolutionary algorithm based agricultural mobile robot route planning, Comput. Electron. Agric. 200 (2022).
[179]
Reiser D., Martín-López J.M., Memic E., Vázquez-Arellano M., Brandner S., Griepentrog H.W., 3D imaging with a sonar sensor and an automated 3-axes frame for selective spraying in controlled conditions, J. Imaging 3 (1) (2017) 9.
[180]
J.P. Simões, P.D. Gaspar, E. Assunção, R. Mesquita, M.P. Simões, Navigation system of autonomous multitask robotic rover for agricultural activities on peach orchards based on computer vision through tree trunk detection, in: X International Peach Symposium 1352, 2022, pp. 593–600.
[181]
Adhikari S.P., Kim G., Kim H., Deep neural network-based system for autonomous navigation in paddy field, IEEE Access 8 (2020) 71272–71278.
[182]
Aguiar A.S., Monteiro N.N., dos Santos F.N., Solteiro Pires E.J., Silva D., Sousa A.J., Boaventura Cunha J., Bringing semantics to the vineyard: An approach on deep learning-based vine trunk detection, Agriculture 11 (2021) 131.
[183]
Mazzia V., Khaliq A., Salvetti F., Chiaberge M., Real-time apple detection system using embedded systems with hardware accelerators: An edge AI application, IEEE Access 8 (2020) 9102–9114.
[184]
Panigrahi P.K., Bisoy S.K., Localization strategies for autonomous mobile robots: A review, J. King Saud Univ. - Comput. Inf. Sci. 34 (8, Part B) (2022) 6019–6039.
[185]
Cerrato S., Aghi D., Mazzia V., Salvetti F., Chiaberge M., An adaptive row crops path generator with deep learning synergy, in: 2021 6th Asia-Pacific Conference on Intelligent Robot Systems, ACIRS, IEEE, 2021, pp. 6–12.
[186]
Gao P., Lee H., Jeon C.-W., Yun C., Kim H.-J., Wang W., Liang G., Chen Y., Zhang Z., Han X., Improved position estimation algorithm of agricultural mobile robots based on multisensor fusion and autoencoder neural network, Sensors 22 (4) (2022) 1522.
[187]
Debeunne C., Vivet D., A review of visual-LiDAR fusion based simultaneous localization and mapping, Sensors 20 (7) (2020) 2068.
[188]
Se S., Lowe D., Little J., Vision-based mobile robot localization and mapping using scale-invariant features, IEEE International Conference on Robotics and Automation, vol. 2, IEEE, 2001, pp. 2051–2058.
[189]
Dong J., Burnham J.G., Boots B., Rains G., Dellaert F., 4D crop monitoring: Spatio-temporal reconstruction for agriculture, in: 2017 IEEE International Conference on Robotics and Automation, ICRA, IEEE, 2017, pp. 3878–3885.
[190]
Mur-Artal R., Montiel J.M.M., Tardos J.D., ORB-SLAM: A versatile and accurate monocular SLAM system, IEEE Trans. Robot. 31 (5) (2015) 1147–1163.
[191]
Mur-Artal R., Tardos J.D., ORB-SLAM2: An open-source SLAM system for monocular, stereo and RGB-D cameras, IEEE Trans. Robot. 33 (5) (2017) 1255–1262.
[192]
F. Shu, P. Lesur, Y. Xie, A. Pagani, D. Stricker, SLAM in the field: An evaluation of monocular mapping and localization on challenging dynamic agricultural environment, in: IEEE/CVF Winter Conference on Applications of Computer Vision, 2021, pp. 1761–1771.
[193]
Zhao W., Wang X., Qi B., Runge T., Ground-level mapping and navigating for agriculture based on IoT and computer vision, IEEE Access 8 (2020) 221975–221985.
[194]
Campos C., Elvira R., Rodríguez J.J.G., Montiel J.M., Tardós J.D., ORB-SLAM3: An accurate open-source library for visual, visual–inertial, and multimap SLAM, IEEE Trans. Robot. 37 (6) (2021) 1874–1890.
[195]
Cremona J., Comelli R., Pire T., Experimental evaluation of visual-inertial odometry systems for arable farming, J. Field Robotics 39 (7) (2022) 1121–1135.
[196]
Labbé M., Michaud F., RTAB-Map as an open-source LiDAR and visual simultaneous localization and mapping library for large-scale and long-term online operation, J. Field Robotics 36 (2) (2019) 416–446.
[197]
Comelli R., Pire T., Kofman E., Evaluation of visual SLAM algorithms on agricultural dataset, Reunión de Trabajo en Procesamiento de la Información y Control (2019) 1–6.
[198]
Thrun S., Probabilistic robotics, Commun. ACM 45 (3) (2002) 52–57.
[199]
G. Grisetti, C. Stachniss, W. Burgard, Improving grid-based SLAM with Rao-Blackwellized particle filters by adaptive proposals and selective resampling, in: Proceedings of the 2005 IEEE International Conference on Robotics and Automation, 2005, pp. 2432–2437.
[200]
Grisetti G., Stachniss C., Burgard W., Improved techniques for grid mapping with Rao-Blackwellized particle filters, IEEE Trans. Robot. 23 (1) (2007) 34–46.
[201]
Marquardt D.W., An algorithm for least-squares estimation of nonlinear parameters, J. Soc. Ind. Appl. Math. 11 (2) (1963) 431–441.
[202]
Besl P.J., McKay N.D., A method for registration of 3-D shapes, Sensor Fusion IV: Control Paradigms and Data Structures, vol. 1611, SPIE, 1992, pp. 586–606.
[203]
Dellaert F., Factor Graphs and GTSAM: A Hands-On Introduction, Georgia Institute of Technology, 2012.
[204]
Zhang J., Singh S., LOAM: Lidar odometry and mapping in real-time, Robotics: Science and Systems, vol. 2, Berkeley, CA, 2014, pp. 1–9.
[205]
T. Shan, B. Englot, LeGO-LOAM: Lightweight and ground-optimized LiDAR odometry and mapping on variable terrain, in: 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS, 2018, pp. 4758–4765.
[206]
T. Shan, B. Englot, D. Meyers, W. Wang, C. Ratti, D. Rus, LIO-SAM: Tightly-coupled LiDAR inertial odometry via smoothing and mapping, in: 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS, 2020, pp. 5135–5142.
[208]
Ye H., Chen Y., Liu M., Tightly coupled 3D LiDAR inertial odometry and mapping, in: 2019 IEEE International Conference on Robotics and Automation, ICRA, IEEE, 2019, pp. 3144–3150.
[209]
P. Biber, W. Straßer, The normal distributions transform: A new approach to laser scan matching, in: IEEE/RSJ International Conference on Intelligent Robots and Systems, vol. 3, IROS 2003, 2003, pp. 2743–2748.
[210]
Koide K., Miura J., Menegatti E., A portable three-dimensional LiDAR-based system for long-term and wide-area people behavior measurement, Int. J. Adv. Robot. Syst. 16 (2) (2019).
[211]
Radcliffe J., Cox J., Bulanon D.M., Machine vision for orchard navigation, Comput. Ind. 98 (2018) 165–171.
[212]
Mammarella M., Comba L., Biglia A., Dabbene F., Gay P., Cooperation of unmanned systems for agricultural applications: A case study in a vineyard, Biosyst. Eng. 223 (2022) 81–102.
[213]
Ahmed S.N.A., Zeng Y., UWB positioning accuracy and enhancements, in: TENCON 2017-2017 IEEE Region 10 Conference, IEEE, 2017, pp. 634–638.
[214]
Yao L., Hu D., Zhao C., Yang Z., Zhang Z., Wireless positioning and path tracking for a mobile platform in greenhouse, Int. J. Agric. Biol. Eng. 14 (1) (2021) 216–223.
[215]
Fung M.L., Chen M.Z., Chen Y.H., Sensor fusion: A review of methods and applications, in: 2017 29th Chinese Control and Decision Conference, CCDC, IEEE, 2017, pp. 3853–3860.
[216]
Hansen S., Bayramoglu E., Andersen J.C., Ravn O., Andersen N., Poulsen N.K., Orchard navigation using derivative free Kalman filtering, in: Proceedings of the 2011 American Control Conference, IEEE, 2011, pp. 4679–4684.
[217]
Iqbal J., Xu R., Halloran H., Li C., Development of a multi-purpose autonomous differential drive mobile robot for plant phenotyping and soil sensing, Electronics 9 (2020) 1550.
[218]
Kschischang F.R., Frey B.J., Loeliger H.-A., Factor graphs and the sum-product algorithm, IEEE Trans. Inform. Theory 47 (2) (2001) 498–519.
[219]
Loeliger H.-A., An introduction to factor graphs, IEEE Signal Process. Mag. 21 (1) (2004) 28–41.
[220]
Indelman V., Williams S., Kaess M., Dellaert F., Factor graph based incremental smoothing in inertial navigation systems, in: 15th International Conference on Information Fusion, IEEE, 2012, pp. 2154–2161.
[221]
Tiozzo Fasiolo D., Scalera L., Maset E., Comparing LiDAR and IMU-based SLAM approaches for 3D robotic mapping, Robotica (2023) 1–17.
[222]
Narvaez F.Y., Reina G., Torres-Torriti M., Kantor G., Cheein F.A., A survey of ranging and imaging techniques for precision agriculture phenotyping, IEEE/ASME Trans. Mechatronics 22 (6) (2017) 2428–2439.
[223]
Fischler M.A., Bolles R.C., Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography, Commun. ACM 24 (6) (1981) 381–395.
[224]
Zhang W., Qi J., Wan P., Wang H., Xie D., Wang X., Yan G., An easy-to-use airborne LiDAR data filtering method based on cloth simulation, Remote Sens. 8 (6) (2016) 501.
[225]
Ihuoma S.O., Madramootoo C.A., Recent advances in crop water stress detection, Comput. Electron. Agric. 141 (2017) 267–275.
[226]
Watson D.J., Comparative physiological studies on the growth of field crops: I. Variation in net assimilation rate and leaf area between species and varieties, and within and between years, Ann. Botany 11 (41) (1947) 41–76.
[227]
Xue J., Zhang S., et al., Navigation of an agricultural robot based on laser radar, Trans. Chin. Soc. Agric. Mach. 45 (9) (2014) 55–60.
[228]
Siciliano B., Sciavicco L., Villani L., Oriolo G., Motion planning, in: Robotics: Modelling, Planning and Control, Springer, 2009, pp. 523–559.
[229]
Phillips M., SBPL_Lattice_Planner, 2018, http://wiki.ros.org/sbpl_lattice_planner.
[230]
Mahmud M.S.A., Abidin M.S.Z., Mohamed Z., Abd Rahman M.K.I., Iida M., Multi-objective path planner for an agricultural mobile robot in a virtual greenhouse environment, Comput. Electron. Agric. 157 (2019) 488–499.
[231]
Jeon C.-W., Kim H.-J., Yun C., Gang M., Han X., An entry-exit path planner for an autonomous tractor in a paddy field, Comput. Electron. Agric. 191 (2021).
[232]
Fox D., Burgard W., Thrun S., The dynamic window approach to collision avoidance, IEEE Robot. Autom. Mag. 4 (1) (1997) 23–33.
[233]
S. Quinlan, O. Khatib, Elastic bands: Connecting path planning and control, in: Proceedings IEEE International Conference on Robotics and Automation, 1993, pp. 802–807.
[234]
Reeds J., Shepp L., Optimal paths for a car that goes both forwards and backwards, Pacific J. Math. 145 (2) (1990) 367–393.
[235]
Dang N.T., Luy N.T., et al., LiDAR-based online navigation algorithm for an autonomous agricultural robot, J. Control Eng. Appl. Inf. 24 (2022) 90–100.
[236]
Isack H., Boykov Y., Energy-based geometric multi-model fitting, Int. J. Comput. Vis. 97 (2) (2012) 123–147.
[237]
Danton A., Roux J.-C., Dance B., Cariou C., Lenain R., Development of a spraying robot for precision agriculture: An edge following approach, in: 2020 IEEE Conference on Control Technology and Applications, CCTA, IEEE, 2020, pp. 267–272.
[238]
Iberraken D., Gaurier F., Roux J.-C., Chaballier C., Lenain R., Autonomous vineyard tracking using a four-wheel-steering mobile robot and a 2D LiDAR, AgriEngineering 4 (4) (2022) 826–846.
[239]
Duda R.O., Hart P.E., Use of the Hough transformation to detect lines and curves in pictures, Commun. ACM 15 (1) (1972) 11–15.
[240]
Canny J., A computational approach to edge detection, IEEE Trans. Pattern Anal. Mach. Intell. (1986) 679–698.
[241]
Sharifi M., Chen X., A novel vision based row guidance approach for navigation of agricultural mobile robots in orchards, in: 2015 6th International Conference on Automation, Robotics and Applications, ICARA, IEEE, 2015, pp. 251–255.
[242]
Liang X., Chen B., Wei C., Zhang X., Inter-row navigation line detection for cotton with broken rows, Plant Methods 18 (2022) 1–12.
[243]
Ruangurai P., Dailey M.N., Ekpanyapong M., Soni P., Optimal vision-based guidance row locating for autonomous agricultural machines, Precis. Agric. (2022) 1–21.
[244]
Hu Y., Huang H., Extraction method for centerlines of crop row based on improved lightweight YOLOv4, in: 2021 6th International Symposium on Computer and Information Processing Technology, ISCIPT, IEEE, 2021, pp. 127–132.
[245]
Bajcsy R., Active perception, Proc. IEEE 76 (1988) 966–1005.
[246]
Lluvia I., Lazkano E., Ansuategi A., Active mapping and robot exploration: A survey, Sensors 21 (2021) 2445.
[247]
Yamauchi B., A frontier-based approach for autonomous exploration, in: Proceedings 1997 IEEE International Symposium on Computational Intelligence in Robotics and Automation, CIRA, IEEE, 1997, pp. 146–151.
[248]
Yamauchi B., Schultz A., Adams W., Mobile robot exploration and map-building with continuous localization, Proceedings 1998 IEEE International Conference on Robotics and Automation, vol. 4, IEEE, 1998, pp. 3715–3720.
[249]
Keidar M., Kaminka G.A., Efficient frontier detection for robot exploration, Int. J. Robot. Res. 33 (2014) 215–236.
[250]
Dornhege C., Kleiner A., A frontier-void-based approach for autonomous exploration in 3D, Adv. Robot. 27 (2013) 459–468.
[251]
P. Senarathne, D. Wang, Towards autonomous 3D exploration using surface frontiers, in: 2016 IEEE International Symposium on Safety, Security, and Rescue Robotics, SSRR, 2016, pp. 34–41.
[252]
A. Bircher, M. Kamel, K. Alexis, H. Oleynikova, R. Siegwart, Receding horizon “next-best-view” planner for 3D exploration, in: 2016 IEEE International Conference on Robotics and Automation, ICRA, 2016, pp. 1462–1468.
[253]
Polvara R., Del Duchetto F., Neumann G., Hanheide M., Navigate-and-seek: A robotics framework for people localization in agricultural environments, IEEE Robot. Autom. Lett. 6 (2021) 6577–6584.
[254]
Holz D., Basilico N., Amigoni F., Behnke S., Evaluating the efficiency of frontier-based exploration strategies, in: ISR 2010 (41st International Symposium on Robotics) and ROBOTIK 2010 (6th German Conference on Robotics), VDE, 2010, pp. 1–8.
[255]
Stachniss C., Hahnel D., Burgard W., Exploration with active loop-closing for FastSLAM, in: 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems, vol. 2, IROS, 2004, pp. 1505–1510.
[256]
Metre V.A., Sawarkar S.D., Reviewing important aspects of plant leaf disease detection and classification, in: 2022 International Conference for Advancement in Technology, ICONAT, IEEE, 2022, pp. 1–8.
[257]
Ahmadi A., Halstead M., McCool C., Virtual temporal samples for recurrent neural networks: Applied to semantic segmentation in agriculture, DAGM German Conference on Pattern Recognition, vol. 13024, Springer, 2021, pp. 574–588.
[258]
Maset E., Padova B., Fusiello A., Efficient large-scale airborne LiDAR data classification via fully convolutional network, Int. Arch. Photogram., Remote Sens. Spat. Inf. Sci. 43 (2020) 527–532.
[259]
Girshick R., Radosavovic I., Gkioxari G., Dollár P., He K., Detectron, 2018, https://github.com/facebookresearch/detectron.
[260]
K. He, G. Gkioxari, P. Dollár, R. Girshick, Mask R-CNN, in: Proceedings of the IEEE International Conference on Computer Vision, 2017, pp. 2961–2969.
[261]
J. Redmon, A. Farhadi, YOLO9000: Better, faster, stronger, in: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2017, pp. 7263–7271.
[262]
Redmon J., Farhadi A., YOLOv3: An incremental improvement, 2018, arXiv preprint arXiv:1804.02767.
[263]
Tan M., Le Q., EfficientNet: Rethinking model scaling for convolutional neural networks, in: International Conference on Machine Learning, PMLR, 2019, pp. 6105–6114.
[264]
Fu L., Feng Y., Wu J., Liu Z., Gao F., Majeed Y., Al-Mallahi A., Zhang Q., Li R., Cui Y., Fast and accurate detection of kiwifruit in orchard using improved YOLOv3-tiny model, Precis. Agric. 22 (3) (2021) 754–776.
[265]
Xia X., Xu C., Nan B., Inception-v3 for flower classification, in: 2017 2nd International Conference on Image, Vision and Computing, ICIVC, IEEE, 2017, pp. 783–787.
[266]
L.-C. Chen, Y. Zhu, G. Papandreou, F. Schroff, H. Adam, Encoder-decoder with atrous separable convolution for semantic image segmentation, in: Proceedings of the European Conference on Computer Vision, ECCV, 2018, pp. 801–818.
[267]
D. Bolya, C. Zhou, F. Xiao, Y.J. Lee, YOLACT: Real-time instance segmentation, in: IEEE/CVF International Conference on Computer Vision, 2019, pp. 9157–9166.
[268]
Ronneberger O., Fischer P., Brox T., U-Net: Convolutional networks for biomedical image segmentation, in: International Conference on Medical Image Computing and Computer-Assisted Intervention, Springer, 2015, pp. 234–241.
[269]
A. Howard, M. Sandler, G. Chu, L.-C. Chen, B. Chen, M. Tan, W. Wang, Y. Zhu, R. Pang, V. Vasudevan, et al., Searching for MobileNetV3, in: Proceedings of the IEEE/CVF International Conference on Computer Vision, 2019, pp. 1314–1324.
[270]
Santos L.C., Aguiar A.S., Santos F.N., Valente A., Petry M., Occupancy grid and topological maps extraction from satellite images for path planning in agricultural robots, Robotics 9 (2020) 77.
[271]
Matsuzaki S., Miura J., Masuzawa H., Multi-source pseudo-label learning of semantic segmentation for the scene recognition of agricultural mobile robots, Adv. Robot. (2022) 1–19.
[272]
Zaenker T., Smitt C., McCool C., Bennewitz M., Viewpoint planning for fruit size and position estimation, in: 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS, IEEE, 2021, pp. 3271–3277.
[273]
Sipola T., Alatalo J., Kokkonen T., Rantonen M., Artificial intelligence in the IoT era: A review of edge AI hardware and software, in: 2022 31st Conference of Open Innovations Association, FRUCT, IEEE, 2022, pp. 320–331.
[274]
Gonzalez-de Soto M., Emmi L., Gonzalez-de Santos P., Hybrid-powered autonomous robots for reducing both fuel consumption and pollution in precision agriculture tasks, in: Agricultural Robots - Fundamentals and Applications, 2018.
[275]
Dharmasena T., de Silva R., Abhayasingha N., Abeygunawardhana P., Autonomous cloud robotic system for smart agriculture, in: 2019 Moratuwa Engineering Research Conference, MERCon, IEEE, 2019, pp. 388–393.
[276]
Albiero D., Garcia A.P., Umezu C.K., de Paulo R.L., Swarm robots in mechanized agricultural operations: A review about challenges for research, Comput. Electron. Agric. 193 (2022).
[277]
Chen K., Lopez B.T., Agha-mohammadi A.-a., Mehta A., Direct LiDAR odometry: Fast localization with dense point clouds, IEEE Robot. Autom. Lett. 7 (2) (2022) 2000–2007.
[278]
Newcombe R.A., Lovegrove S.J., Davison A.J., DTAM: Dense tracking and mapping in real-time, in: 2011 International Conference on Computer Vision, IEEE, 2011, pp. 2320–2327.
[279]
Wang W., Liu J., Wang C., Luo B., Zhang C., DV-LOAM: Direct visual LiDAR odometry and mapping, Remote Sens. 13 (16) (2021) 3340.
[280]
H. Wang, C. Wang, C.-L. Chen, L. Xie, F-LOAM: Fast LiDAR odometry and mapping, in: 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS, 2021, pp. 4390–4396.
[281]
Garcia-Fidalgo E., Company-Corcoles J.P., Bonnin-Pascual F., Ortiz A., LiODOM: Adaptive local mapping for robust LiDAR-only odometry, Robot. Auton. Syst. 156 (2022).
[282]
T. Shan, B. Englot, C. Ratti, D. Rus, LVI-SAM: Tightly-coupled LiDAR-visual-inertial odometry via smoothing and mapping, in: 2021 IEEE International Conference on Robotics and Automation, ICRA, 2021, pp. 5692–5698.
[283]
Engel J., Schöps T., Cremers D., LSD-SLAM: Large-scale direct monocular SLAM, in: European Conference on Computer Vision, Springer, 2014, pp. 834–849.
[284]
He Y., Zhao J., Guo Y., He W., Yuan K., PL-VIO: Tightly-coupled monocular visual–inertial odometry using point and line features, Sensors 18 (4) (2018) 1159.
[285]
R. Wang, M. Schworer, D. Cremers, Stereo DSO: Large-scale direct sparse visual odometry with stereo cameras, in: Proceedings of the IEEE International Conference on Computer Vision, 2017, pp. 3903–3911.
[286]
Forster C., Pizzoli M., Scaramuzza D., SVO: Fast semi-direct monocular visual odometry, in: 2014 IEEE International Conference on Robotics and Automation, ICRA, IEEE, 2014, pp. 15–22.
[287]
Vaswani A., Shazeer N., Parmar N., Uszkoreit J., Jones L., Gomez A.N., Kaiser L., Polosukhin I., Attention is all you need, Adv. Neural Inf. Process. Syst. 30 (2017).
Cited By

  • Technological and Research Challenges in Data Engineering for Sustainable Agriculture, in: Proceedings of the International Workshop on Big Data in Emergent Distributed Environments, 2024, pp. 1–6, https://doi.org/10.1145/3663741.3664786
  • A comprehensive review of recent approaches and Hardware-Software technologies for digitalisation and intellectualisation of Open-Field crop Production, Comput. Electron. Agric. 225 (2024), https://doi.org/10.1016/j.compag.2024.109326

Published In

Robotics and Autonomous Systems, Volume 169, Issue C, Nov 2023, 232 pages.
Publisher: North-Holland Publishing Co., Netherlands.
Published: 01 November 2023.

Author Tags

  1. Mobile robotics
  2. Agriculture
  3. Localization
  4. Mapping
  5. Path planning
  6. Artificial intelligence

Qualifiers

  • Research-article
