Unmanned Ground Vehicles for Continuous Crop Monitoring in Agriculture: Assessing the Readiness of Current ICT Technology
Figure 1: On the left, the bibliographic dataset we collected, categorized by topic (DT stands for ‘digital twin’). The sum of works across categories exceeds the total number of entries, as resources may belong to multiple groups. On the right, the distribution of publications over time.
Figure 2: On the left: a typical robot development platform for agricultural monitoring. On the right: a simplified diagram of the main monitoring and navigation components (dimensions in mm). Sensors are typically mounted on an external frame atop the UGV, offering a ‘human-like’ perspective and easy access to the devices. In many cases, two or more cameras are installed, facing the left and right crops. The actual number and placement of devices on the UGV may vary; the diagram is for conceptual purposes only. The image and diagram (modified by the authors) were taken from [24] with prior authorization.
Figure 3: Dataflow architecture of UGVs for agricultural monitoring. The boxes on the right (‘Field Operations Controllers’ and ‘Actuators’) represent the potential integration of field treatment functionalities, an aspect beyond the scope of this review.
Figure 4: Prevalence and distribution of hardware ICT components for monitoring UGVs, measured as the number of scientific publications from the last 10 years indexed by Google Scholar. On the left: sensors. On the right: computational devices (CPUs are ubiquitous and therefore not included in the statistics).
Abstract
1. Introduction
2. Background and Related Literature
3. Methodology
4. Architecture and Data Workflow of UGVs
5. Sensors
5.1. RGB-D Cameras
5.2. RGB Cameras
5.3. Multispectral Cameras
5.4. LiDARs
5.5. GNSSs and IMUs
6. Governing Software
6.1. Operating Systems
- Communication Architecture:
  – ROS1 uses a centralized communication model built around a single master node (the ROS Master), which facilitates node connections but is a potential bottleneck and a single point of failure.
  – ROS2 adopts a decentralized architecture without a master node, relying on the DDS (Data Distribution Service) middleware, which improves system scalability and resilience.
- Real-time Performance:
  – ROS1 does not provide native support for real-time operations.
  – ROS2 supports real-time capabilities, allowing for more precise timing of operations, which is crucial for applications such as motor control and other critical robotic functions.
- Multiplatform Support:
  – ROS1 is primarily developed for Unix-based operating systems, mainly Ubuntu.
  – ROS2 is more portable and supports a variety of OSs, including Windows, macOS, and various Linux distributions, making it more flexible across development environments.
- Security:
  – ROS1 has limited security options, a significant concern for commercial applications given the importance of data privacy in agriculture.
  – ROS2 introduces advanced security features such as communication encryption, node authentication, and authorization, improving security in broader and critical scenarios.
- Communication Management:
  – ROS1 uses a simple TCP/UDP-based communication model, which may not be efficient in complex or large networks.
  – ROS2 uses DDS for communication management, which is highly scalable and efficient for large-scale distributed systems.
- Usability and Documentation:
  – ROS1 has a large user base and extensive documentation accumulated over many years of development, making it accessible to new users.
  – ROS2 is still expanding its documentation and examples but is quickly gaining ground thanks to its modern architecture and new features; a minimal ROS2 node is sketched after this list.
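To ground the comparison, the following minimal sketch shows what a ROS2 node looks like in practice, written with the standard rclpy client library. It periodically publishes a status string on a topic, with node discovery and transport handled transparently by DDS; the node and topic names are illustrative choices, not taken from any specific system discussed in this review.

```python
import rclpy
from rclpy.node import Node
from std_msgs.msg import String


class CropMonitorNode(Node):
    """Minimal ROS2 publisher: DDS handles discovery, no ROS Master needed."""

    def __init__(self):
        super().__init__('crop_monitor')
        # Topic name 'crop_status' is a placeholder, not from the paper.
        self.publisher_ = self.create_publisher(String, 'crop_status', 10)
        self.timer = self.create_timer(1.0, self.publish_status)  # 1 Hz

    def publish_status(self):
        msg = String()
        msg.data = 'row 3: canopy scan complete'
        self.publisher_.publish(msg)


def main():
    rclpy.init()
    node = CropMonitorNode()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()


if __name__ == '__main__':
    main()
```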
6.2. Simulation Environments
- Gazebo allows for the simulation of robots in both indoor and outdoor settings, including sensors and actuators, with a high degree of realism thanks to accurate physics models. This makes it useful for testing navigation and monitoring in complex agricultural environments.
- CoppeliaSim (formerly V-REP) offers detailed control over simulations and supports several physics engines, enabling it to handle complex interactions between rovers and various types of terrain or crops.
- Webots is a development environment for the simulation of autonomous mobile robots, known for its ease of use and extensive library of robot models. It is used to prototype and test navigation and terrain-mapping algorithms; a minimal controller sketch follows this list.
- Nvidia’s Isaac Sim is an advanced simulation environment that exploits GPU acceleration for realistic and detailed simulations, optimized for robotics and AI. This makes it particularly useful for simulating agricultural rovers that use AI for the visual analysis of crops.
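As an indicative example of how such simulators are scripted, the sketch below is a minimal Webots robot controller in Python that drives a generic differential-drive UGV down a crop row at constant speed. The device names are assumptions that depend on the robot model defined in the world file.

```python
from controller import Robot  # Webots Python controller API

robot = Robot()
timestep = int(robot.getBasicTimeStep())  # simulation step in ms

# Device names ('left wheel motor', ...) are assumptions for a generic
# differential-drive UGV; they must match the robot's world-file definition.
left = robot.getDevice('left wheel motor')
right = robot.getDevice('right wheel motor')
for motor in (left, right):
    motor.setPosition(float('inf'))  # switch to velocity-control mode
    motor.setVelocity(0.0)

while robot.step(timestep) != -1:
    # Drive straight along a crop row at a constant wheel speed (rad/s).
    left.setVelocity(2.0)
    right.setVelocity(2.0)
```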
6.3. Programming Languages and Frameworks
7. On-Board Hardware
8. Navigation
8.1. Mapping
8.2. Localization
8.3. Path Planning
- A* Algorithm, a popular pathfinding algorithm that searches for the shortest path from a start point to an endpoint by considering both the cost already incurred to reach a point and the estimated cost to the destination. It is widely used in agricultural UGVs for its efficiency and accuracy in path planning [100,101]; a compact grid-based implementation is sketched after this list.
- Dijkstra’s Algorithm, which calculates the shortest paths from a single source node to all other nodes in a weighted graph. Its application in UGVs involves determining the optimal routes for tasks such as soil sampling and weed management.
- Rapidly exploring Random Trees (RRT), an algorithm designed for exploring high-dimensional and cluttered environments, often used to find feasible paths in real-time scenarios [102]. It incrementally builds a tree of random samples from the UGV’s starting position towards the goal, avoiding obstacles while randomly exploring the space. Its speed and low computational cost mean that it is adopted in many real-world scenarios. Several variations have been developed, such as RRT*, which ensures asymptotic convergence to the optimal path, a property not guaranteed by the original algorithm [103].
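The following is a compact, illustrative implementation of the A* algorithm on a 2D occupancy grid, the kind of discretized field map a UGV’s mapping module might produce (see Section 8.1). The grid, start, and goal are toy values; a real planner would also account for vehicle kinematics and row-following constraints.

```python
import heapq


def astar(grid, start, goal):
    """A* on a 2D occupancy grid (0 = free cell, 1 = obstacle).

    Uses 4-connected moves and a Manhattan-distance heuristic, which is
    admissible for unit step costs, so the returned path is optimal.
    """
    rows, cols = len(grid), len(grid[0])

    def h(cell):  # heuristic: Manhattan distance to the goal
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_set = [(h(start), 0, start)]  # entries are (f = g + h, g, cell)
    came_from = {}
    best_g = {start: 0}

    while open_set:
        _, g, current = heapq.heappop(open_set)
        if current == goal:
            path = [current]           # reconstruct the path backwards
            while current in came_from:
                current = came_from[current]
                path.append(current)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nb = (current[0] + dr, current[1] + dc)
            if 0 <= nb[0] < rows and 0 <= nb[1] < cols and grid[nb[0]][nb[1]] == 0:
                ng = g + 1
                if ng < best_g.get(nb, float('inf')):
                    best_g[nb] = ng
                    came_from[nb] = current
                    heapq.heappush(open_set, (ng + h(nb), ng, nb))
    return None  # no feasible path exists


# Toy field map: the 1s mark a crop row the rover must drive around.
field = [[0, 0, 0, 0],
         [0, 1, 1, 0],
         [0, 0, 0, 0]]
print(astar(field, (0, 0), (2, 3)))
```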
9. Algorithms, AI Models and Digital Twins
9.1. Algorithms in Agricultural UGVs
- Edge Detection algorithms, such as Canny and Sobel, are used to identify the boundaries of objects within images. This is crucial for tasks like crop row detection [96], diagnosing plant health issues, and discriminating between crops, weeds, and the soil background.
- Image segmentation algorithms assign each pixel to a class in order to partition the image content at a fine-grained level and simplify analysis. Techniques like k-means clustering and watershed segmentation help to isolate plants from the background for precise monitoring.
- Feature Extraction techniques identify specific characteristics within an image, such as texture, shape, and color. These features are then used to diagnose plant health, monitor growth, detect pest infestations, or recognize weeds. The most common are the Scale Invariant Feature Transform (SIFT) algorithm [104], the Hough transform, and the Otsu method [105]. When multispectral images are acquired, the Normalized Difference Vegetation Index (NDVI) and its variants can be used to estimate plant growth and vegetation phase by combining the near-infrared and red channels acquired by the sensors [106]. Other authors, as in [107], use data generated by LiDAR sensors to compute the Leaf Area Index (LAI) of detected plants. In SLAM navigation, environment features are detected with methods such as FAST and ORB, or within visual–inertial systems such as VINS and BASALT, to estimate orientation and movement [108]. A short sketch combining edge detection, Otsu thresholding, and NDVI is given after this list.
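As a brief illustration of the techniques above, the snippet below applies Canny edge detection and Otsu thresholding with OpenCV (see Section 6.3) and computes the NDVI as (NIR − Red)/(NIR + Red), following [106]. The file names and band handling are placeholder assumptions; on a UGV, these frames would come directly from the RGB and multispectral cameras of Section 5.

```python
import cv2
import numpy as np

# File names are placeholders; real inputs come from the UGV's cameras.
rgb = cv2.imread('row_rgb.png')                         # RGB camera frame
nir = cv2.imread('row_nir.png', cv2.IMREAD_GRAYSCALE)   # near-infrared band

# 1. Canny edge detection, e.g. as a first step for crop-row detection.
gray = cv2.cvtColor(rgb, cv2.COLOR_BGR2GRAY)
edges = cv2.Canny(gray, 100, 200)

# 2. Otsu thresholding to separate vegetation from the soil background.
_, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# 3. NDVI = (NIR - Red) / (NIR + Red); OpenCV stores channels as BGR,
#    so the red channel is index 2. The epsilon avoids division by zero.
red = rgb[:, :, 2].astype(np.float32)
nir_f = nir.astype(np.float32)
ndvi = (nir_f - red) / (nir_f + red + 1e-6)
print('mean NDVI:', float(ndvi.mean()))
```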
- Proportional–Integral–Derivative (PID) controllers, which are widely used in UGVs for maintaining the desired speed, direction, and stability. They adjust the control inputs to minimize the error between the desired and actual states [112]; a minimal discrete implementation is sketched after this list.
- Fuzzy logic controllers, which handle uncertainties and imprecisions in UGV operations by using fuzzy set theory. They are applied in scenarios where traditional control methods may struggle, such as variable terrain and crop conditions, or for collision avoidance [113].
- Adaptive Fuzzy PID (AFPID) controllers: these systems modify their parameters in real-time to cope with changing environmental conditions and uncertainties [114]. Such adaptability is crucial for UGVs operating in dynamic agricultural environments.
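To make the control law concrete, the sketch below implements a textbook discrete PID controller, u = Kp·e + Ki·Σe·Δt + Kd·Δe/Δt, applied here to a heading error. The gains and the simulated error decay are illustrative values, not tuned parameters from the cited works.

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*sum(e)*dt + Kd*de/dt."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


# Example: correct a heading error (rad) at 10 Hz; gains are illustrative.
pid = PID(kp=1.2, ki=0.1, kd=0.05)
heading_error = 0.3
for _ in range(5):
    steering = pid.update(heading_error, dt=0.1)
    heading_error *= 0.6  # pretend the rover responds and the error shrinks
    print(f'steering command: {steering:+.3f}')
```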
9.2. AI in Agricultural UGVs
- Convolutional Neural Networks (CNNs) are widely used in UGVs for classification, detection, and semantic segmentation [133,134,135,136,137,138,139,140,141]. These models can analyze images of crops to identify specific plant health issues, such as nutrient deficiencies or pest infestations, and can also be used for UGV navigation, detecting obstacles, and recognizing viable pathways, such as tree rows. The most relevant DL models can be grouped into three categories: (a) Simple classification models: these detect the presence of specific objects in an image without localizing them, by identifying patterns in the extracted features. Examples include AlexNet, VGGNet, MobileNet, and ResNet, which are often used as encoders in more complex deep learning networks. (b) Single-stage methods, based on regression, which directly predict object locations and classifications in a single pass through the network. YOLO (You Only Look Once) and SSD (Single Shot MultiBox Detector) are the most well-known examples. (c) Two-stage methods, which first generate candidate regions and then classify them. These methods are generally more accurate but more computationally demanding; examples include R-CNN and its improved versions, such as Faster R-CNN and Mask R-CNN. Traditionally, due to the limited computational power of UGVs and the need for fast detection, single-stage models were predominantly used, despite their lower accuracy. Today, advances in hardware performance (see Section 7) have made some two-stage methods feasible as well. An illustrative inference sketch with a pretrained classification model follows this list.
- Recurrent Neural Networks (RNNs) and Long Short-Term Memory (LSTM) networks are used for analyzing sequential data, such as predicting weather conditions or tracking crop growth over time [142,143]. These deep learning models excel at preserving temporal relationships within data sequences, effectively modeling patterns and dependencies that evolve over time. This ability stems from their recurrent architecture: connections feed the hidden state from one time step back into the network at the next, allowing information to persist, so the hidden state accumulates information from previous time steps.
- Reinforcement Learning (RL) models optimize the decision-making processes of UGVs [54,144]. In this approach, UGVs learn to make decisions by receiving feedback from their actions in the form of rewards or penalties, and model training can be achieved in unseen environments. These models are used for path-planning and rover exploration in unstructured environments, where autonomous navigation, efficient resource management, and adaptation to dynamic field conditions are essential. Additionally, RL models are employed to coordinate multiple unmanned vehicles without explicitly defining and coding their behavior [145,146]. By continually learning from their interactions with the environment, these models help UGVs improve their performance over time, leading to more effective and efficient agricultural operations.
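As a hedged example of category (a) above, the following sketch runs a pretrained classification CNN with PyTorch and torchvision (Section 6.3). The ImageNet weights and the input file name are placeholders; a deployed UGV would use a model fine-tuned on crop imagery and typically exported to an edge runtime such as those discussed in Section 7.

```python
import torch
from torchvision import models
from PIL import Image

# ResNet18 with generic ImageNet weights as a stand-in; real deployments
# fine-tune on crop/weed/disease datasets before exporting to the edge.
weights = models.ResNet18_Weights.DEFAULT
model = models.resnet18(weights=weights).eval()

# The resize/crop/normalize pipeline the pretrained weights expect.
preprocess = weights.transforms()

image = Image.open('leaf.jpg').convert('RGB')  # placeholder file name
batch = preprocess(image).unsqueeze(0)         # add a batch dimension

with torch.no_grad():                          # inference only, no gradients
    probs = model(batch).softmax(dim=1)

top = probs.topk(3)                            # three most likely classes
for p, idx in zip(top.values[0], top.indices[0]):
    print(f'{weights.meta["categories"][int(idx)]}: {p.item():.2%}')
```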
9.3. Digital Twins in Agricultural UGVs
10. Discussion and Conclusions
Funding
Institutional Review Board Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Alatise, M.B.; Hancke, G.P. A Review on Challenges of Autonomous Mobile Robot and Sensor Fusion Methods. IEEE Access 2020, 8, 39830–39846. [Google Scholar] [CrossRef]
- Fernandes, H.R.; Polania, E.C.M.; Garcia, A.P.; Mendonza, O.B.; Albiero, D. Agricultural unmanned ground vehicles: A review from the stability point of view. Rev. Cienc. Agron. 2020, 51, e20207761. [Google Scholar] [CrossRef]
- Gonzalez-De-Santos, P.; Fernández, R.; Sepúlveda, D.; Navas, E.; Armada, M. Unmanned Ground Vehicles for Smart Farms. In Agronomy; Amanullah, K., Ed.; IntechOpen: Rijeka, Croatia, 2020; Chapter 6. [Google Scholar] [CrossRef]
- Wang, Y.; Li, X.; Zhang, J.; Li, S.; Xu, Z.; Zhou, X. Review of wheeled mobile robot collision avoidance under unknown environment. Sci. Prog. 2021, 104, 003685042110377. [Google Scholar] [CrossRef] [PubMed]
- Mahmud, M.S.A.; Abidin, M.S.Z.; Emmanuel, A.A.; Hasan, H.S. Robotics and automation in agriculture: Present and future applications. Appl. Model. Simul. 2020, 4, 130–140. [Google Scholar]
- Chandra, R.; Collis, S. Digital agriculture for small-scale producers: Challenges and opportunities. Commun. ACM 2021, 64, 75–84. [Google Scholar] [CrossRef]
- Fasiolo, D.T.; Scalera, L.; Maset, E.; Gasparetto, A. Towards autonomous mapping in agriculture: A review of supportive technologies for ground robotics. Robot. Auton. Syst. 2023, 169, 104514. [Google Scholar] [CrossRef]
- Colucci, F.; Maggio, F.; Pintus, M. Recent Advances in AIoT for Image Classification and Continuous Monitoring in Agriculture. IoT, 2024; paper under submission. [Google Scholar]
- Yépez-Ponce, D.F.; Salcedo, J.V.; Rosero-Montalvo, P.D.; Sanchis, J. Mobile robotics in smart farming: Current trends and applications. Front. Artif. Intell. 2023, 6, 1213330. [Google Scholar] [CrossRef]
- Etezadi, H.; Eshkabilov, S. A Comprehensive Overview of Control Algorithms, Sensors, Actuators, and Communication Tools of Autonomous All-Terrain Vehicles in Agriculture. Agriculture 2024, 14, 163. [Google Scholar] [CrossRef]
- Bazargani, K.; Deemyad, T. Automation’s Impact on Agriculture: Opportunities, Challenges, and Economic Effects. Robotics 2024, 13, 33. [Google Scholar] [CrossRef]
- Scopus Document Search. Available online: https://www.scopus.com (accessed on 9 October 2024).
- Google Scholar. Available online: https://scholar.google.com/ (accessed on 9 October 2024).
- arXiv. Available online: https://arxiv.org (accessed on 9 October 2024).
- IEEE Xplore. Available online: https://ieeexplore.ieee.org/ (accessed on 9 October 2024).
- NIH National Library of Medicine. Available online: https://pubmed.ncbi.nlm.nih.gov (accessed on 9 October 2024).
- Science.gov. Available online: https://www.science.gov (accessed on 9 October 2024).
- ScienceDirect. Available online: https://www.sciencedirect.com (accessed on 9 October 2024).
- Semantic Scholar. Available online: https://www.semanticscholar.org (accessed on 9 October 2024).
- World Wide Science. Available online: https://worldwidescience.org (accessed on 30 September 2024).
- Botta, A.; Quaglia, G. Performance analysis of low-cost tracking system for mobile robots. Machines 2020, 8, 29. [Google Scholar] [CrossRef]
- Botta, A.; Cavallone, P.; Tagliavini, L.; Colucci, G.; Carbonari, L.; Quaglia, G. Modelling and simulation of articulated mobile robots. Int. J. Mech. Control 2021, 22, 15–25. [Google Scholar]
- Botta, A.; Cavallone, P.; Carbonari, L.; Tagliavini, L.; Quaglia, G. Modelling and Experimental Validation of Articulated Mobile Robots with Hybrid Locomotion System. Mech. Mach. Sci. 2021, 91, 758–767. [Google Scholar] [CrossRef]
- Robodyne. Available online: https://www.robo-dyne.com/ (accessed on 9 October 2024).
- Dhanush, G.; Khatri, N.; Kumar, S.; Shukla, P.K. A comprehensive review of machine vision systems and artificial intelligence algorithms for the detection and harvesting of agricultural produce. Sci. Afr. 2023, 21, e01798. [Google Scholar] [CrossRef]
- Quaglia, G.; Visconte, C.; Scimmi, L.S.; Melchiorre, M.; Cavallone, P.; Pastorelli, S. Design of a UGV powered by solar energy for precision agriculture. Robotics 2020, 9, 13. [Google Scholar] [CrossRef]
- Ulrich, L.; Vezzetti, E.; Moos, S.; Marcolin, F. Analysis of RGB-D camera technologies for supporting different facial usage scenarios. Multimed. Tools Appl. 2020, 79, 29375–29398. [Google Scholar] [CrossRef]
- Tychola, K.A.; Tsimperidis, I.; Papakostas, G.A. On 3D Reconstruction Using RGB-D Cameras. Digital 2022, 2, 401–421. [Google Scholar] [CrossRef]
- Kurtser, P.; Lowry, S. RGB-D datasets for robotic perception in site-specific agricultural operations—A survey. Comput. Electron. Agric. 2023, 212, 108035. [Google Scholar] [CrossRef]
- Ram, B.G.; Oduor, P.; Igathinathane, C.; Howatt, K.; Sun, X. A systematic review of hyperspectral imaging in precision agriculture: Analysis of its current state and future prospects. Comput. Electron. Agric. 2024, 222, 109037. [Google Scholar] [CrossRef]
- Ishimwe, R.; Abutaleb, K.; Ahmed, F. Applications of Thermal Imaging in Agriculture—A Review. Adv. Remote Sens. 2014, 3, 128–140. [Google Scholar] [CrossRef]
- Thomson, S.J.; Ouellet-Plamondon, C.M.; DeFauw, S.L.; Huang, Y.; Fisher, D.K.; English, P.J. Potential and Challenges in Use of Thermal Imaging for Humid Region Irrigation System Management. J. Agric. Sci. 2012, 4, 103–115. [Google Scholar] [CrossRef]
- Xiang, L.; Wang, D. A review of three-dimensional vision techniques in food and agriculture applications. Smart Agric. Technol. 2023, 5, 100259. [Google Scholar] [CrossRef]
- Rivera, G.; Porras, R.; Florencia, R.; Sánchez-Solís, J.P. LiDAR applications in precision agriculture for cultivating crops: A review of recent advances. Comput. Electron. Agric. 2023, 207, 107737. [Google Scholar] [CrossRef]
- Radočaj, D.; Plaščak, I.; Jurišić, M. Global Navigation Satellite Systems as State-of-the-Art Solutions in Precision Agriculture: A Review of Studies Indexed in the Web of Science. Agriculture 2023, 13, 1417. [Google Scholar] [CrossRef]
- Upadhyay, A.; Zhang, Y.; Koparan, C.; Rai, N.; Howatt, K.; Bajwa, S.; Sun, X. Advances in ground robotic technologies for site-specific weed management in precision agriculture: A review. Comput. Electron. Agric. 2024, 225, 109363. [Google Scholar] [CrossRef]
- GNSS Products. Available online: https://drfasching.com/products/raspignss/ (accessed on 9 October 2024).
- Centimeter Precision GPS/GNSS—RTK Explained. Available online: https://www.ardusimple.com/rtk-explained/ (accessed on 9 October 2024).
- Yuanyuan, Z.; Bin, Z.; Cheng, S.; Haolu, L.; Jicheng, H.; Kunpeng, T.; Zhong, T. Review of the field environmental sensing methods based on multi-sensor information fusion technology. Int. J. Agric. Biol. Eng. 2024, 17, 1–13. [Google Scholar] [CrossRef]
- Liu, C.; Nguyen, B.K. Low-Cost Real-Time Localisation for Agricultural Robots in Unstructured Farm Environments. Machines 2024, 12, 612. [Google Scholar] [CrossRef]
- NVIDIA Isaac ROS. Available online: https://developer.nvidia.com/isaac/ros (accessed on 9 October 2024).
- ROS—Robot Operating System. Available online: https://dev.intelrealsense.com/docs/ros-wrapper (accessed on 9 October 2024).
- Cheng, C.; Fu, J.; Su, H.; Ren, L. Recent Advancements in Agriculture Robots: Benefits and Challenges. Machines 2023, 11, 48. [Google Scholar] [CrossRef]
- Yuan, S.; Wang, H.; Xie, L. Survey on Localization Systems and Algorithms for Unmanned Systems; World Scientific Pub Co Pte Ltd.: Singapore, 2023; pp. 145–179. [Google Scholar] [CrossRef]
- Emmi, L.; Fernández, R.; Gonzalez-de Santos, P.; Francia, M.; Golfarelli, M.; Vitali, G.; Sandmann, H.; Hustedt, M.; Wollweber, M. Exploiting the Internet Resources for Autonomous Robots in Agriculture. Agriculture 2023, 13, 1005. [Google Scholar] [CrossRef]
- Kitić, G.; Krklješ, D.; Panić, M.; Petes, C.; Birgermajer, S.; Crnojević, V. Agrobot Lala—An Autonomous Robotic System for Real-Time, In-Field Soil Sampling, and Analysis of Nitrates. Sensors 2022, 22, 4207. [Google Scholar] [CrossRef]
- Vulpi, F.; Marani, R.; Petitti, A.; Reina, G.; Milella, A. An RGB-D multi-view perspective for autonomous agricultural robots. Comput. Electron. Agric. 2022, 202, 107419. [Google Scholar] [CrossRef]
- Mohammadi, H.; Jiang, Z.; Nguyen, L. A Programmable Hybrid Simulation Environment for Coordination of Autonomous Vehicles. In Proceedings of the NAECON 2023-IEEE National Aerospace and Electronics Conference, Fairborn, OH, USA, 28–31 August 2023; pp. 36–41. [Google Scholar]
- Macenski, S.; Moore, T.; Lu, D.V.; Merzlyakov, A.; Ferguson, M. From the desks of ROS maintainers: A survey of modern and capable mobile robotics algorithms in the robot operating system 2. Robot. Auton. Syst. 2023, 168. [Google Scholar] [CrossRef]
- Sperti, M.; Ambrosio, M.; Martini, M.; Navone, A.; Ostuni, A.; Chiaberge, M. Non-linear Model Predictive Control for Multi-task GPS-free Autonomous Navigation in Vineyards. arXiv 2024, arXiv:2404.05343. [Google Scholar]
- Svyatov, K.; Rubtcov, I.; Ponomarev, A. Virtual testing ground for the development of control systems for unmanned vehicles in agriculture. E3S Web Conf. 2023, 458, 08018. [Google Scholar] [CrossRef]
- Mansur, H.; Welch, S.; Dempsey, L.; Flippo, D. Importance of Photo-Realistic and Dedicated Simulator in Agricultural Robotics. Engineering 2023, 15, 318–327. [Google Scholar] [CrossRef]
- Iqbal, J.; Xu, R.; Sun, S.; Li, C. Simulation of an Autonomous Mobile Robot for LiDAR-Based In-Field Phenotyping and Navigation. Robotics 2020, 9, 46. [Google Scholar] [CrossRef]
- Martini, M.; Cerrato, S.; Salvetti, F.; Angarano, S.; Chiaberge, M. Position-Agnostic Autonomous Navigation in Vineyards with Deep Reinforcement Learning. In Proceedings of the 2022 IEEE 18th International Conference on Automation Science and Engineering (CASE), Mexico City, Mexico, 20–24 August 2022. [Google Scholar] [CrossRef]
- Chatziparaschis, D.; Scudiero, E.; Karydis, K. Robot-assisted soil apparent electrical conductivity measurements in orchards. arXiv 2023, arXiv:2309.05128. [Google Scholar]
- Ramin Shamshiri, R.; Hameed, I.A.; Pitonakova, L.; Weltzien, C.; Balasundram, S.K.; Yule, I.J.; Grift, T.E.; Chowdhary, G. Simulation software and virtual environments for acceleration of agricultural robotics: Features highlights and performance comparison. Int. J. Agric. Biol. Eng. 2018, 11, 12–20. [Google Scholar] [CrossRef]
- Ribeiro, J.P.L.; Gaspar, P.D.; Soares, V.N.G.J.; Caldeira, J.M.L.P. Computational Simulation of an Agricultural Robotic Rover for Weed Control and Fallen Fruit Collection-Algorithms for Image Detection and Recognition and Systems Control, Regulation, and Command. Electronics 2022, 11, 790. [Google Scholar] [CrossRef]
- Berger, G.S.; Teixeira, M.; Cantieri, A.; Lima, J.; Pereira, A.I.; Valente, A.; Castro, G.G.R.d.; Pinto, M.F. Cooperative Heterogeneous Robots for Autonomous Insects Trap Monitoring System in a Precision Agriculture Scenario. Agriculture 2023, 13, 239. [Google Scholar] [CrossRef]
- Zhang, J.; Du, X.; Dong, Q.; Xin, B. Distributed Collaborative Complete Coverage Path Planning Based on Hybrid Strategy. J. Syst. Eng. Electron. 2024, 35, 463–472. [Google Scholar] [CrossRef]
- PyTorch. Available online: https://pytorch.org/ (accessed on 9 October 2024).
- TensorFlow. Available online: https://www.tensorflow.org/ (accessed on 9 October 2024).
- Nvidia CUDA. Available online: https://developer.nvidia.com/cuda-toolkit (accessed on 9 October 2024).
- Intel oneAPI. Available online: https://www.intel.com/content/www/us/en/developer/tools/oneapi/toolkits.html (accessed on 9 October 2024).
- AMD ROCm. Available online: https://www.amd.com/en/products/software/rocm.html (accessed on 9 October 2024).
- OpenCV—Open Computer Vision Library. Available online: https://opencv.org/ (accessed on 9 October 2024).
- Bah, M.D.; Hafiane, A.; Canals, R. CRowNet: Deep Network for Crop Row Detection in UAV Images. IEEE Access 2020, 8, 5189–5200. [Google Scholar] [CrossRef]
- Zhou, M.; Xia, J.; Yang, F.; Zheng, K.; Hu, M.; Li, D.; Zhang, S. Design and experiment of visual navigated UGV for orchard based on Hough matrix and RANSAC. Int. J. Agric. Biol. Eng. 2021, 14, 176–184. [Google Scholar] [CrossRef]
- Gehan, M.A.; Fahlgren, N.; Abbasi, A.; Berry, J.C.; Callen, S.T.; Chavez, L.; Doust, A.N.; Feldman, M.J.; Gilbert, K.B.; Hodge, J.G.; et al. PlantCV v2: Image analysis software for high-throughput plant phenotyping. PeerJ 2017, 5, e4088. [Google Scholar] [CrossRef]
- OpenMP. Available online: https://www.openmp.org/ (accessed on 9 October 2024).
- Nvidia GeForce RTX 40 Series Graphics Cards. Available online: https://www.nvidia.com/en-us/geforce/graphics-cards/40-series/ (accessed on 9 October 2024).
- NVIDIA Jetson for Next-Generation Robotics. Available online: https://www.nvidia.com/en-us/autonomous-machines/embedded-systems (accessed on 9 October 2024).
- Coral USB Accelerator. Available online: https://coral.ai/products/accelerator (accessed on 9 October 2024).
- AMD Versal AI Edge Series VEK280 Evaluation Kit. Available online: https://www.xilinx.com/products/boards-and-kits/vek280.html (accessed on 9 October 2024).
- Vasconcelos, G.J.Q.; Costa, G.S.R.; Spina, T.V.; Pedrini, H. Low-Cost Robot for Agricultural Image Data Acquisition. Agriculture 2023, 13, 413. [Google Scholar] [CrossRef]
- Aguilera, C.A.; Figueroa-Flores, C.; Aguilera, C.; Navarrete, C. Comprehensive Analysis of Model Errors in Blueberry Detection and Maturity Classification: Identifying Limitations and Proposing Future Improvements in Agricultural Monitoring. Agriculture 2024, 14, 18. [Google Scholar] [CrossRef]
- Alibabaei, K.; Assunção, E.; Gaspar, P.D.; Soares, V.N.G.J.; Caldeira, J.M.L.P. Real-Time Detection of Vine Trunk for Robot Localization Using Deep Learning Models Developed for Edge TPU Devices. Future Internet 2022, 14, 199. [Google Scholar] [CrossRef]
- Budiyanta, N.E.; Sereati, C.O.; Manalu, F.R.G. Processing time increasement of non-rice object detection based on YOLOv3-tiny using Movidius NCS 2 on Raspberry Pi. Bull. Electr. Eng. Inform. 2022, 11, 1056–1061. [Google Scholar] [CrossRef]
- Routis, G.; Michailidis, M.; Roussaki, I. Plant Disease Identification Using Machine Learning Algorithms on Single-Board Computers in IoT Environments. Electronics 2024, 13, 1010. [Google Scholar] [CrossRef]
- Shende, K.; Sharda, A.; Hitzler, P. Hardware Design and Architecture of Multiagent Wireless Data Communication for Precision Agriculture Applications. SSRN 2023. [Google Scholar] [CrossRef]
- Mwitta, C.; Rains, G.C. The integration of GPS and visual navigation for autonomous navigation of an Ackerman steering mobile robot in cotton fields. Front. Robot. AI 2024, 11, 1359887. [Google Scholar] [CrossRef]
- Lyu, Z.; Lu, A.; Ma, Y. Improved YOLOv8-Seg Based on Multiscale Feature Fusion and Deformable Convolution for Weed Precision Segmentation. Appl. Sci. 2024, 14, 5002. [Google Scholar] [CrossRef]
- Edge TPU Performance Benchmarks. Available online: https://coral.ai/docs/edgetpu/benchmarks/ (accessed on 9 October 2024).
- Bringing Generative AI to Life with NVIDIA Jetson. 2023. Available online: https://developer.nvidia.com/blog/bringing-generative-ai-to-life-with-jetson/ (accessed on 9 October 2024).
- Lei, T.; Luo, C.; Jan, G.E.; Bi, Z. Deep Learning-Based Complete Coverage Path Planning With Re-Joint and Obstacle Fusion Paradigm. Front. Robot. AI 2022, 9, 843816. [Google Scholar] [CrossRef] [PubMed]
- Botta, A.; Moreno, E.; Baglieri, L.; Colucci, G.; Tagliavini, L.; Quaglia, G. Autonomous Driving System for Reversing an Articulated Rover for Precision Agriculture. Mech. Mach. Sci. 2022, 120 MMS, 412–419. [Google Scholar] [CrossRef]
- Tagarakis, A.C.; Filippou, E.; Kalaitzidis, D.; Benos, L.; Busato, P.; Bochtis, D. Proposing UGV and UAV Systems for 3D Mapping of Orchard Environments. Sensors 2022, 22, 1571. [Google Scholar] [CrossRef] [PubMed]
- Matsuzaki, S.; Miura, J.; Masuzawa, H. Multi-source pseudo-label learning of semantic segmentation for the scene recognition of agricultural mobile robots. Adv. Robot. 2022, 36, 1011–1029. [Google Scholar] [CrossRef]
- Matsuzaki, S.; Masuzawa, H.; Miura, J. Image-Based Scene Recognition for Robot Navigation Considering Traversable Plants and Its Manual Annotation-Free Training. IEEE Access 2022, 10, 5115–5128. [Google Scholar] [CrossRef]
- Ding, H.; Zhang, B.; Zhou, J.; Yan, Y.; Tian, G.; Gu, B. Recent developments and applications of simultaneous localization and mapping in agriculture. J. Field Robot. 2022, 39, 956–983. [Google Scholar] [CrossRef]
- Jiang, S.; Qi, P.; Han, L.; Liu, L.; Li, Y.; Huang, Z.; Liu, Y.; He, X. Navigation system for orchard spraying robot based on 3D LiDAR SLAM with NDT_ICP point cloud registration. Comput. Electron. Agric. 2024, 220, 108870. [Google Scholar] [CrossRef]
- Yan, Y.; Zhang, B.; Zhou, J.; Zhang, Y.; Liu, X. Real-Time Localization and Mapping Utilizing Multi-Sensor Fusion and Visual–IMU–Wheel Odometry for Agricultural Robots in Unstructured, Dynamic and GPS-Denied Greenhouse Environments. Agronomy 2022, 12, 1740. [Google Scholar] [CrossRef]
- Polvara, R.; Del Duchetto, F.; Neumann, G.; Hanheide, M. Navigate-and-Seek: A Robotics Framework for People Localization in Agricultural Environments. IEEE Robot. Autom. Lett. 2021, 6, 6577–6584. [Google Scholar] [CrossRef]
- Xu, R.; Li, C. A Review of High-Throughput Field Phenotyping Systems: Focusing on Ground Robots. Plant Phenomics 2022, 2022, 9760269. [Google Scholar] [CrossRef] [PubMed]
- Li, L.; Liang, H.; Wang, J.; Yang, J.; Li, Y. Online Routing for Autonomous Vehicle Cruise Systems with Fuel Constraints. J. Intell. Robot. Syst. Theory Appl. 2022, 104, 68. [Google Scholar] [CrossRef]
- Liu, L.; Wang, X.; Yang, X.; Liu, H.; Li, J.; Wang, P. Path planning techniques for mobile robots: Review and prospect. Expert Syst. Appl. 2023, 227, 120254. [Google Scholar] [CrossRef]
- Shi, C.; Xiong, Z.; Chen, M.; Wang, R.; Xiong, J. Cooperative Navigation for Heterogeneous Air-Ground Vehicles Based on Interoperation Strategy. Remote Sens. 2023, 15, 2006. [Google Scholar] [CrossRef]
- Sevastopoulos, C.; Konstantopoulos, S. A Survey of Traversability Estimation for Mobile Robots. IEEE Access 2022, 10, 96331–96347. [Google Scholar] [CrossRef]
- Gonzalez, D.; Perez, J.; Milanes, V.; Nashashibi, F. A Review of Motion Planning Techniques for Automated Vehicles. IEEE Trans. Intell. Transp. Syst. 2016, 17, 1135–1145. [Google Scholar] [CrossRef]
- Pak, J.; Kim, J.; Park, Y.; Son, H.I. Field Evaluation of Path-Planning Algorithms for Autonomous Mobile Robot in Smart Farms. IEEE Access 2022, 10, 60253–60266. [Google Scholar] [CrossRef]
- Jakubczyk, K.; Siemiątkowska, B.; Więckowski, R.; Rapcewicz, J. Hyperspectral Imaging for Mobile Robot Navigation. Sensors 2023, 23, 383. [Google Scholar] [CrossRef]
- Chakraborty, S.; Elangovan, D.; Govindarajan, P.L.; ELnaggar, M.F.; Alrashed, M.M.; Kamel, S. A Comprehensive Review of Path Planning for Agricultural Ground Robots. Sustainability 2022, 14, 9156. [Google Scholar] [CrossRef]
- Wang, H.; Li, G.; Hou, J.; Chen, L.; Hu, N. A Path Planning Method for Underground Intelligent Vehicles Based on an Improved RRT* Algorithm. Electronics 2022, 11, 294. [Google Scholar] [CrossRef]
- Noreen, I.; Khan, A.; Habib, Z. Optimal path planning using RRT* based approaches: A survey and future directions. Int. J. Adv. Comput. Sci. Appl. 2016, 7. Available online: https://thesai.org/Publications/ViewPaper?Volume=7&Issue=11&Code=IJACSA&SerialNo=14 (accessed on 9 October 2024). [CrossRef]
- Zadeh, N.; Hashimoto, H.; Raper, D.; Tanuyan, E.; Bruca, M. Autonomous smart farming system using FLANN-based feature matcher with robotic arm. Proc. Aip Conf. Proc. 2022, 2502, 040004. [Google Scholar] [CrossRef]
- Darwin, B.; Dharmaraj, P.; Prince, S.; Popescu, D.E.; Hemanth, D.J. Recognition of Bloom/Yield in Crop Images Using Deep Learning Models for Smart Agriculture: A Review. Agronomy 2021, 11, 646. [Google Scholar] [CrossRef]
- Subeesh, A.; Mehta, C. Automation and digitization of agriculture using artificial intelligence and internet of things. Artif. Intell. Agric. 2021, 5, 278–291. [Google Scholar] [CrossRef]
- Lowe, T.; Moghadam, P.; Edwards, E.; Williams, J. Canopy density estimation in perennial horticulture crops using 3D spinning lidar SLAM. J. Field Robot. 2021, 38, 598–618. [Google Scholar] [CrossRef]
- Giubilato, R.; Sturzl, W.; Wedler, A.; Triebel, R. Challenges of SLAM in Extremely Unstructured Environments: The DLR Planetary Stereo, Solid-State LiDAR, Inertial Dataset. IEEE Robot. Autom. Lett. 2022, 7, 8721–8728. [Google Scholar] [CrossRef]
- Galati, R.; Mantriota, G.; Reina, G. RoboNav: An Affordable Yet Highly Accurate Navigation System for Autonomous Agricultural Robots. Robotics 2022, 11, 99. [Google Scholar] [CrossRef]
- Abdelaziz, S.I.K.; Elghamrawy, H.Y.; Noureldin, A.M.; Fotopoulos, G. Body-Centered Dynamically-Tuned Error-State Extended Kalman Filter for Visual Inertial Odometry in GNSS-Denied Environments. IEEE Access 2024, 12, 15997–16008. [Google Scholar] [CrossRef]
- Zhao, Z.; Zhang, Y.; Shi, J.; Long, L.; Lu, Z. Robust Lidar-Inertial Odometry with Ground Condition Perception and Optimization Algorithm for UGV. Sensors 2022, 22, 7424. [Google Scholar] [CrossRef]
- Rosero-Montalvo, P.D.; Gordillo-Gordillo, C.A.; Hernandez, W. Smart Farming Robot for Detecting Environmental Conditions in a Greenhouse. IEEE Access 2023, 11, 57843–57853. [Google Scholar] [CrossRef]
- Kamil, F.; Gburi, F.H.; Kadhom, M.A.; Kalaf, B.A. Fuzzy Logic-Based Control for Intelligent Vehicles: A Survey. Proc. Aip Conf. Proc. 2024, 3092. [Google Scholar] [CrossRef]
- Wang, T.; Wang, H.; Hu, H.; Lu, X.; Zhao, S. An adaptive fuzzy PID controller for speed control of brushless direct current motor. SN Appl. Sci. 2022, 4, 71. [Google Scholar] [CrossRef]
- Kägo, R.; Vellak, P.; Karofeld, E.; Noorma, M.; Olt, J. Assessment of using state of the art unmanned ground vehicles for operations on peat fields. Mires Peat 2021, 27, 11. [Google Scholar] [CrossRef]
- Simmons, A.; Chappell, S. Artificial intelligence-definition and practice. IEEE J. Ocean. Eng. 1988, 13, 14–42. [Google Scholar] [CrossRef]
- Kok, J.N.; Boers, E.J.; Kosters, W.A.; Van der Putten, P.; Poel, M. Artificial intelligence: Definition, trends, techniques, and cases. Artif. Intell. 2009, 1, 51. [Google Scholar]
- Nie, J.; Wang, Y.; Li, Y.; Chao, X. Artificial intelligence and digital twins in sustainable agriculture and forestry: A survey. Turk. J. Agric. For. 2022, 46, 642–661. [Google Scholar] [CrossRef]
- Ojo, M.O.; Zahid, A. Deep Learning in Controlled Environment Agriculture: A Review of Recent Advancements, Challenges and Prospects. Sensors 2022, 22, 7965. [Google Scholar] [CrossRef]
- Tian, H.; Wang, T.; Liu, Y.; Qiao, X.; Li, Y. Computer vision technology in agricultural automation—A review. Inf. Process. Agric. 2020, 7, 1–19. [Google Scholar] [CrossRef]
- Linaza, M.T.; Posada, J.; Bund, J.; Eisert, P.; Quartulli, M.; Döllner, J.; Pagani, A.; Olaizola, I.G.; Barriguinha, A.; Moysiadis, T.; et al. Data-Driven Artificial Intelligence Applications for Sustainable Precision Agriculture. Agronomy 2021, 11, 1227. [Google Scholar] [CrossRef]
- Bini, D.; Pamela, D.; Prince, S. Machine Vision and Machine Learning for Intelligent Agrobots: A review. In Proceedings of the 2020 5th International Conference on Devices, Circuits and Systems (ICDCS), Coimbatore, India, 5–6 March 2020. [Google Scholar] [CrossRef]
- Wang, H.; Gu, J.; Wang, M. A review on the application of computer vision and machine learning in the tea industry. Front. Sustain. Food Syst. 2023, 7, 1172543. [Google Scholar] [CrossRef]
- Mohimont, L.; Alin, F.; Rondeau, M.; Gaveau, N.; Steffenel, L.A. Computer Vision and Deep Learning for Precision Viticulture. Agronomy 2022, 12, 2463. [Google Scholar] [CrossRef]
- Kumar, G.K.; Bangare, M.L.; Bangare, P.M.; Kumar, C.R.; Raj, R.; Arias-Gonzáles, J.L.; Omarov, B.; Mia, M.S. Internet of things sensors and support vector machine integrated intelligent irrigation system for agriculture industry. Discov. Sustain. 2024, 5, 6. [Google Scholar] [CrossRef]
- Bishnoi, S.; Hooda, B.K. Decision Tree Algorithms and their Applicability in Agriculture for Classification. J. Exp. Agric. Int. 2022, 44, 20–27. [Google Scholar] [CrossRef]
- Sapkal, K.G.; Kadam, A.B. Random Forest Classifier For Crop Prediction Based On Soil Data. J. Adv. Zool. 2024, 45, 113–117. [Google Scholar] [CrossRef]
- Bhanu Koduri, S.; Gunisetti, L.; Raja Ramesh, C.; Mutyalu, K.V.; Ganesh, D. Prediction of crop production using adaboost regression method. J. Phys. Conf. Ser. 2019, 1228, 012005. [Google Scholar] [CrossRef]
- Liu, R.; Yandun, F.; Kantor, G. LiDAR-Based Crop Row Detection Algorithm for Over-Canopy Autonomous Navigation in Agriculture Fields. arXiv 2024, arXiv:2403.17774. [Google Scholar]
- Mokssit, S.; Licea, D.B.; Guermah, B.; Ghogho, M. Deep Learning Techniques for Visual SLAM: A Survey. IEEE Access 2023, 11, 20026–20050. [Google Scholar] [CrossRef]
- Wang, K.; Ma, S.; Chen, J.; Ren, F.; Lu, J. Approaches, Challenges, and Applications for Deep Visual Odometry: Toward Complicated and Emerging Areas. IEEE Trans. Cogn. Dev. Syst. 2022, 14, 35–49. [Google Scholar] [CrossRef]
- Pathan, M.; Patel, N.; Yagnik, H.; Shah, M. Artificial cognition for applications in smart agriculture: A comprehensive review. Artif. Intell. Agric. 2020, 4, 81–95. [Google Scholar] [CrossRef]
- Sujatha, K.; Reddy, T.K.; Bhavani, N.; Ponmagal, R.; Srividhya, V.; Janaki, N. UGVs for Agri Spray with AI assisted Paddy Crop disease Identification. Proc. Procedia Comput. Sci. 2023, 230, 70–81. [Google Scholar] [CrossRef]
- Wang, K.; Kooistra, L.; Pan, R.; Wang, W.; Valente, J. UAV-based simultaneous localization and mapping in outdoor environments: A systematic scoping review. J. Field Robot. 2024, 41, 1617–1642. [Google Scholar] [CrossRef]
- Gharakhani, H.; Thomasson, J.A. Evaluating object detection and stereoscopic localization of a robotic cotton harvester under real field conditions. In SPIE—The International Society for Optical Engineering; SPIE Defense + Commercial Sensing: Orlando, FL, USA, 2023; Volume 12539. [Google Scholar] [CrossRef]
- Akter, R.; Islam, M.S.; Sohan, K.; Ahmed, M.I. Insect Recognition and Classification Using Optimized Densely Connected Convolutional Neural Network. Lect. Notes Netw. Syst. 2023, 624, 251–264. [Google Scholar] [CrossRef]
- Stefanović, D.; Antić, A.; Otlokan, M.; Ivošević, B.; Marko, O.; Crnojević, V.; Panić, M. Blueberry Row Detection Based on UAV Images for Inferring the Allowed UGV Path in the Field. Lect. Notes Netw. Syst. 2023, 590, 401–411. [Google Scholar] [CrossRef]
- Thapa, S.; Rains, G.C.; Porter, W.M.; Lu, G.; Wang, X.; Mwitta, C.; Virk, S.S. Robotic Multi-Boll Cotton Harvester System Integration and Performance Evaluation. AgriEngineering 2024, 6, 803–822. [Google Scholar] [CrossRef]
- Park, Y.H.; Choi, S.H.; Kwon, Y.J.; Kwon, S.W.; Kang, Y.J.; Jun, T.H. Detection of Soybean Insect Pest and a Forecasting Platform Using Deep Learning with Unmanned Ground Vehicles. Agronomy 2023, 13, 477. [Google Scholar] [CrossRef]
- Huang, P.; Huang, P.; Wang, Z.; Wu, X.; Liu, J.; Zhu, L. Deep-Learning-Based Trunk Perception with Depth Estimation and DWA for Robust Navigation of Robotics in Orchards. Agronomy 2023, 13, 1084. [Google Scholar] [CrossRef]
- Lacotte, V.; NGuyen, T.; Sempere, J.D.; Novales, V.; Dufour, V.; Moreau, R.; Pham, M.T.; Rabenorosoa, K.; Peignier, S.; Feugier, F.G.; et al. Pesticide-Free Robotic Control of Aphids as Crop Pests. AgriEngineering 2022, 4, 903–921. [Google Scholar] [CrossRef]
- Khan, M.S.A.; Hussian, D.; Ali, Y.; Rehman, F.U.; Aqeel, A.B.; Khan, U.S. Multi-Sensor SLAM for efficient Navigation of a Mobile Robot. In Proceedings of the 2021 IEEE 4th International Conference on Computing and Information Sciences, ICCIS 2021, Karachi, Pakistan, 29–30 November 2021. [Google Scholar] [CrossRef]
- Nourizadeh, P.; Stevens McFadden, F.J.; Browne, W.N. In situ slip estimation for mobile robots in outdoor environments. J. Field Robot. 2023, 40, 467–482. [Google Scholar] [CrossRef]
- Liu, C.; Zhao, J.; Sun, N. A Review of Collaborative Air-Ground Robots Research. J. Intell. Robot. Syst. Theory Appl. 2022, 106, 60. [Google Scholar] [CrossRef]
- Wang, C.; Wang, J.; Wei, C.; Zhu, Y.; Yin, D.; Li, J. Vision-Based Deep Reinforcement Learning of UAV-UGV Collaborative Landing Policy Using Automatic Curriculum. Drones 2023, 7, 676. [Google Scholar] [CrossRef]
- Blais, M.A.; Akhloufi, M.A. Reinforcement learning for swarm robotics: An overview of applications, algorithms and simulators. Cogn. Robot. 2023, 3, 226–256. [Google Scholar] [CrossRef]
- Xiao, Q.; Li, Y.; Luo, F.; Liu, H. Analysis and assessment of risks to public safety from unmanned aerial vehicles using fault tree analysis and Bayesian network. Technol. Soc. 2023, 73, 102229. [Google Scholar] [CrossRef]
- Altalak, M.; Ammad uddin, M.; Alajmi, A.; Rizg, A. Smart Agriculture Applications Using Deep Learning Technologies: A Survey. Appl. Sci. 2022, 12, 5919. [Google Scholar] [CrossRef]
- Farjon, G.; Huijun, L.; Edan, Y. Deep-Learning-based Counting Methods, Datasets, and Applications in Agriculture—A Review. arXiv 2023, arXiv:2303.02632. [Google Scholar]
- Lu, Y.; Young, S. A survey of public datasets for computer vision tasks in precision agriculture. Comput. Electron. Agric. 2020, 178, 105760. [Google Scholar] [CrossRef]
- Chen, J.; Liu, H.; Zhang, Y.; Zhang, D.; Ouyang, H.; Chen, X. A Multiscale Lightweight and Efficient Model Based on YOLOv7: Applied to Citrus Orchard. Plants 2022, 11, 3260. [Google Scholar] [CrossRef]
- Samson Adekunle, T.; Oladayo Lawrence, M.; Omotayo Alabi, O.; Afolorunso, A.A.; Nse Ebong, G.; Abiola Oladipupo, M. Deep learning technique for plant disease detection. Comput. Sci. Inf. Technol. 2024, 5, 55–62. [Google Scholar] [CrossRef]
- Yu, F.; Wang, M.; Xiao, J.; Zhang, Q.; Zhang, J.; Liu, X.; Ping, Y.; Luan, R. Advancements in Utilizing Image-Analysis Technology for Crop-Yield Estimation. Remote Sens. 2024, 16, 1003. [Google Scholar] [CrossRef]
- Singh, M.; Fuenmayor, E.; Hinchy, E.; Qiao, Y.; Murray, N.; Devine, D. Digital Twin: Origin to Future. Appl. Syst. Innov. 2021, 4, 36. [Google Scholar] [CrossRef]
- Verdouw, C.; Tekinerdogan, B.; Beulens, A.; Wolfert, S. Digital twins in smart farming. Agric. Syst. 2021, 189, 103046. [Google Scholar] [CrossRef]
- Tomczyk, M.; van der Valk, H. Digital Twin Paradigm Shift: The Journey of the Digital Twin Definition. Proc. ICEIS 2022, 2, 90–97. [Google Scholar]
- Agrawal, A.; Fischer, M.; Singh, V. Digital Twin: From Concept to Practice. arXiv 2022, arXiv:2201.06912. [Google Scholar]
- Skobelev, P.; Laryukhin, V.; Simonova, E.; Goryanin, O.; Yalovenko, V.; Yalovenko, O. Multi-agent approach for developing a digital twin of wheat. In Proceedings of the 2020 IEEE International Conference on Smart Computing (SMARTCOMP), Bologna, Italy, 14–17 September 2020. [Google Scholar] [CrossRef]
- Alves, R.G.; Souza, G.; Maia, R.F.; Tran, A.L.H.; Kamienski, C.; Soininen, J.P.; Aquino, P.T.; Lima, F. A digital twin for smart farming. In Proceedings of the 2019 IEEE Global Humanitarian Technology Conference (GHTC), Seattle, WA, USA, 17–20 October 2019. [Google Scholar] [CrossRef]
- Han, J.B.; Kim, S.S.; Song, H.J. Development of real-time digital twin model of autonomous field robot for prediction of vehicle stability. J. Inst. Control Robot. Syst. 2021, 27, 190–196. [Google Scholar] [CrossRef]
- Cesco, S.; Sambo, P.; Borin, M.; Basso, B.; Orzes, G.; Mazzetto, F. Smart agriculture and digital twins: Applications and challenges in a vision of sustainability. Eur. J. Agron. 2023, 146, 126809. [Google Scholar] [CrossRef]
- Malik, P.; Sneha; Garg, D.; Bedi, H.; Gehlot, A.; Malik, P.K. An Improved Agriculture Farming Through the Role of Digital Twin. In Proceedings of the 2023 4th International Conference on Electronics and Sustainable Communication Systems (ICESC), Coimbatore, India, 6–8 July 2023. [Google Scholar]
- Nair, M.; Dede, O.L.; De, S.; Fernandez, R.E. Digital Twin for Bruise Detection in Precision Agriculture. In Proceedings of the 2024 IEEE 21st Consumer Communications & Networking Conference (CCNC), Las Vegas, NV, USA, 6–9 January 2024. [Google Scholar]
- Weckesser, F.; Beck, M.; Hülsbergen, K.J.; Peisl, S. A Digital Advisor Twin for Crop Nitrogen Management. Agriculture 2022, 12, 302. [Google Scholar] [CrossRef]
- Banić, M.; Simonović, M.; Stojanović, L.; Rangelov, D.; Miltenović, A.; Perić, M. Digital twin based lightweighting of robot unmanned ground vehicles. Facta Univ. Ser. Autom. Control Robot. 2022, 1, 187–199. [Google Scholar] [CrossRef]
- Tsolakis, N.; Bechtsis, D.; Bochtis, D. Agros: A robot operating system based emulation tool for agricultural robotics. Agronomy 2019, 9, 403. [Google Scholar] [CrossRef]
Manufacturer | Model | Tech | Shutter | Max Res | Range (m) | FoV | Interface | Protection |
---|---|---|---|---|---|---|---|---|
Orbbec | Gemini 336 | SV | Rolling | @30 fps | 0.1–10 | 90° × 65° | USB3.0 Type C | IP5X |
Orbbec | Gemini 336L | SV | Global | @30 fps | 0.17–10 | 94° × 68° | USB3.0 Type C | IP65 |
Orbbec | Femto Mega I | ToF | Rolling | @10 fps | 0.25–3.86 | 80° × 51° | Ethernet | IP65 |
Orbbec | Femto Bolt | ToF | Rolling | @15 fps | 0.25–5.46 | 80° × 51° | USB3.2 Type C | n/a |
Orbbec | Astra 2 | SL | Rolling | @30 fps | 0.6–8 | 75° × 36° | USB3.0 Type C | n/a |
Intel | RS ZR300 | SV | Rolling | @30 fps | 0.55–2.8 | 59° × 46° | USB3.0 | n/a |
Sick | V3S146 | SV | Global | @30 fps | 0.28–16 / 0.65–37 | 130° × 105° / 90° × 60° | Ethernet | n/a |
Vzense | NYX660 | ToF | Global | @15 fps | 0.3–4.5 | 70° × 50° | Ethernet, RS485 | IP67 |
Basler | blaze-102 | ToF | Global | @30 fps | 0.3–10 | 67° × 51° | Ethernet | IP67 |
StereoLabs | ZED 2i | SV | Rolling | @15 fps | 0.3–20 @f/2 / 1.5–35 @f/1.8 | 120° | USB3 Type C | IP66 |
Sipeed | MaixSense A075 | ToF | Rolling | @30 fps | 0.15–1.5 | 120° | USB2.0 Type C | n/a |
eCon | Tara Stereo Vision | SV | Global | @60 fps (monochrome) | 0.5–3 | n/a | USB3 | n/a |
Lucid | Helios2 Ray Outdoor | ToF | Global | @30 fps | 0.3–8.33 | n/a | Ethernet | IP67 |
Manufacturer | Model | Horiz. Res. (deg) | Vert. Res. (deg) | Scan Rate (Hz) | Max Range (m) | FoV (deg) | Accuracy (cm) | Interface | Protection |
---|---|---|---|---|---|---|---|---|---|
Velodyne | Puck VLP-16 | 0.1–0.4 | 2 | 5–20 | 100 | 30 × 360 | 3 | Ethernet | IP67 |
Velodyne | Ultra Puck VLP-32c | 0.1–0.4 | 0.33 | 5–20 | 200 | 40 × 360 | 3 | Ethernet | IP67 |
Velodyne | HDL-32 | 0.08–0.33 | 1.33 | 5–20 | 100 | 41 × 360 | 2 | Ethernet | IP67 |
Velodyne | Alpha Prime | 0.1–0.4 | 0.11 | 5–20 | 245 | 40 × 360 | 3 | Ethernet | IP67 |
Innoviz | One | 0.1 | 0.1 | 10–15 | 250 | 25 × 115 | 3 | MIPI CSI-2 | IP6K9K |
Innoviz | Two | 0.05 | 0.05 | 20 | 300 | 43 × 120 | 1 | MIPI CSI-2 | IP6K9K |
LeddarTech | Pixell | n/a | n/a | 20 | 56 | 16 × 178 | 3 | Ethernet | IP67 |
Quanergy Systems | M8-Prime Ultra | 0.033 | 0.033 | 5–20 | 200 | 20 × 360 | 3 | Ethernet | IP69K |
RIEGL | VUX-1HA | 0.001 | n/a | 250 | 150 | 360 | 0.5 | Ethernet, USB | IP64 |
Sick AG | LMS511 | 0.17 | n/a | 25–100 | 80 | 190 | 0.24 | Ethernet | IP67 |
Simulation Environment | Robot Types | Sensors | OS | Programming Languages and Environments | Ref. |
---|---|---|---|---|---|
Gazebo | UAV, UGV, cars, boats | RGB, IR, LiDAR, IMU, GNSS, Force, Distance, Altimeter, Depth, RFID | Linux, Mac OS, Windows | Python, C++ and Java through ROS | [47,49,50,53,54,55,56] |
CoppeliaSim | Various industrial robots, ground vehicles | LiDAR, GNSS, IMU, RGB, Infrared, Proximity | Linux, Mac OS, Windows | C++, Python, ROS | [56,57,58] |
Webots | UAV, UGV, cars, humanoids | LiDAR, GNSS, IMU, RGB | Linux, Mac OS, Windows | C++, Python, Java, MATLAB, ROS | [51,56] |
Isaac Sim | Industrial robots, Ground robots | LiDAR, GNSS, IMU, Infrared, RGB-D | Linux, Windows | C, C++, Python, Java, Lua, MATLAB, Octave | [59] |
Class | Device | AI (TOPS) | TDP (W) | RAM (GB) | Cost (€) |
---|---|---|---|---|---|
GPU | Nvidia 4060 | 353 | 160 | 16 (GDDR6) | 430 |
GPU | Nvidia 4070 | 706 | 285 | 16 (GDDR6X) | 800 |
GPU | Nvidia 4080 | 836 | 320 | 16 (GDDR6X) | 1050 |
GPU | Nvidia 4090 | 1321 | 450 | 24 (GDDR6X) | 1800 |
SoM/SoC | Nvidia Jetson Nano | 0.5 | 10 | 4 (LPDDR4) | 240 |
SoM/SoC | Nvidia Jetson TX2 | 1.3 | 20 | 8 (LPDDR4) | 420 |
SoM/SoC | Nvidia Jetson Xavier | 21 | 20 | 16 (LPDDR4X) | 720 |
SoM/SoC | Nvidia Jetson AGX Orin | 275 | 60 | 64 (LPDDR5) | 2250 |
SoM/SoC | Nvidia Jetson Orin Nano | 40 | 15 | 8 (LPDDR5) | 550 |
TPU | Coral TPU | 4 | 2 | - | 80 |
FPGA | AMD Versal AI Edge VEK280 | 114 (a) | 75 | 12 | 6900 |
SoM | New CPUs Q4 2024 | up to 120 | 17–37 | 16/32 | - |