Cooperative Heterogeneous Robots for Autonomous Insects Trap Monitoring System in a Precision Agriculture Scenario
Figure 1. Arrangement of olive groves in the cultivation zone (left), yellow chromotropic trap (centre), and positioning of the trap carried out by the operator on the ground (right).
Figure 2. Image capture of the trap at three different distances: (a) 3.5 m with 5× digital zoom in detail; (b) 1.5 m with 2× digital zoom in detail; (c) 0.30 m with no digital zoom.
Figure 3. Overview of the proposed methodology.
Figure 4. Flowchart of the proposed methodology steps.
Figure 5. YOLO providing the object detection. Note that the code extracts the object's center and its extremities from the object's occupancy in the image (a minimal sketch of this computation follows the figure list).
Figure 6. Flowchart of the UGV behavior.
Figure 7. Splits in the image made for the detection of trap_pos_UGV. This value is sent to the UAS to start the trap search process.
Figure 8. Overview of the UAS control strategy.
Figure 9. UAS performing take-off from the UGV and maintaining a minimum distance. After the z_linear adjustment, the UAS performs a z_angular adjustment according to the trap's position.
Figure 10. Inputs and output of the fuzzy controller.
Figure 11. Input fuzzification.
Figure 12. Output fuzzification.
Figure 13. System's surface behavior: (a) centering of the robot's angle with respect to the trap; (b,c) the vehicle's linear speed along the x_axis.
Figure 14. Image of the landing base and the AR-tag with the Tello drone moving toward it.
Figure 15. Simulated environment developed to validate the proposed strategy.
Figure 16. UGV model used in this work.
Figure 17. UGV validation strategy.
Figure 18. Euclidean distance between the UGV and the trap for the UGV validation: (a) box plot per experiment; (b) single graph with data from all experiments.
Figure 19. Trap positions for the UAS validation.
Figure 20. Euclidean distance between the UAS and the trap for the UAS validation: (a) box plot per experiment; (b) single graph with data from all experiments.
Figure 21. (a) UAS view when reaching the objective. (b) UAS position relative to the trap.
Figure 22. Trap positions in relation to the UAS and UGV initial positions: (1) trap at 70 degrees left; (2) trap at 30 degrees left; (3) trap at center; (4) trap at 30 degrees right; (5) trap at 70 degrees right.
Figure 23. Example of a manual landing position error measurement.
Figure 24. Landing error measurements for the 10-round landing experiments.
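As a complement to the Figure 5 and Figure 7 captions above, the following Python sketch illustrates one way the trap center and the trap_pos_UGV image split could be derived from a YOLO bounding box. The function names, the three-way split, and the pixel values are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: derive the trap center from a YOLO bounding box and map it
# to a left/center/right image split used as trap_pos_UGV (names are illustrative).

def bbox_center(x_min, y_min, x_max, y_max):
    """Return the center of a detected object together with its extremities."""
    cx = (x_min + x_max) / 2.0
    cy = (y_min + y_max) / 2.0
    return cx, cy, (x_min, y_min), (x_max, y_max)

def trap_split(cx, image_width, n_splits=3):
    """Map the horizontal center to one of n vertical image splits (0 = leftmost)."""
    split_width = image_width / n_splits
    return min(int(cx // split_width), n_splits - 1)

if __name__ == "__main__":
    # Example detection (pixel coordinates) on a 640-px-wide frame.
    cx, cy, top_left, bottom_right = bbox_center(410, 120, 520, 300)
    print("center:", (cx, cy), "split index:", trap_split(cx, image_width=640))
```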
Abstract
1. Introduction
Main Contributions
- Implementation of an artificial-neural-network-based algorithm to identify the chromotropic yellow traps fixed in a group of trees and provide position and orientation data to the UGV and UAS navigation algorithms to execute their missions.
- Evaluation of the proposed architecture's operation in a simulated environment using small-sized vehicles integrated through ROS, as a first step toward building a fully operational autonomous cooperative trap capture system.
- Proposition and experimental evaluation of a control strategy combined with a fiducial marker for vision-based UAS landing on the UGV roof, considering the specific application environment and operational conditions (a minimal landing-control sketch follows this list).
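As referenced in the last contribution above, the following hedged ROS sketch outlines how an AR-tag pose published by the ar_track_alvar package could be turned into a simple proportional landing command. The topic names, gains, frame conventions, and descent threshold are assumptions, not the controller used in the paper; the actual drone driver (e.g., a Tello ROS bridge) may expose different interfaces.

```python
#!/usr/bin/env python
# Hedged sketch: center the UAS above an AR-tag reported by ar_track_alvar and
# descend with a proportional law. Gains and topic names are illustrative.
import rospy
from geometry_msgs.msg import Twist
from ar_track_alvar_msgs.msg import AlvarMarkers

K_XY, K_Z = 0.5, 0.3  # illustrative proportional gains
cmd_pub = None

def marker_callback(msg):
    if not msg.markers:
        return  # no tag in view: keep the current search behavior
    pos = msg.markers[0].pose.pose.position  # tag position in the camera frame
    cmd = Twist()
    cmd.linear.x = -K_XY * pos.x                     # reduce horizontal offset
    cmd.linear.y = -K_XY * pos.y
    cmd.linear.z = -K_Z * max(pos.z - 0.3, 0.0)      # descend until ~0.3 m above the tag
    cmd_pub.publish(cmd)

if __name__ == "__main__":
    rospy.init_node("ar_tag_landing_sketch")
    cmd_pub = rospy.Publisher("cmd_vel", Twist, queue_size=1)
    rospy.Subscriber("ar_pose_marker", AlvarMarkers, marker_callback)
    rospy.spin()
```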
2. Background and Related Works
2.1. Agricultural Robots
2.2. UAS Landing Strategies
3. Proposed Methodology
3.1. Problem Definition
3.2. Image Capture Parameters Definition
- In this experiment, a yellow chromotropic trap was positioned at a height of approximately 1.70 m, with insects (Bactrocera oleae) trapped on it.
- A measuring tape was used to guide the distance between the camera (from a smartphone) and the trap.
- Five images were captured at every 30 cm increment of distance between the camera and the trap, up to the maximum distance of 5 m (a short sketch of this schedule follows the list).
- The test took place in a controlled environment with exposure to diffused sunlight.
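A minimal sketch of the capture schedule described above, assuming the distances start at 0.30 m and advance in uniform 0.30 m steps (whether a final capture is taken exactly at 5 m is an assumption):

```python
# Illustrative reconstruction of the capture schedule: five images at every 30 cm
# step between the camera and the trap, out to the 5 m maximum distance.
STEP_M, MAX_DIST_M, IMAGES_PER_STEP = 0.30, 5.0, 5

distances = [round(STEP_M * k, 2) for k in range(1, int(MAX_DIST_M / STEP_M) + 1)]
total_images = len(distances) * IMAGES_PER_STEP

print(distances)      # 0.3, 0.6, ..., 4.8 m
print(total_images)   # 16 capture distances x 5 images = 80 images
```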
3.3. Overview of the Robotic Architecture Strategy
3.4. Architecture Description
3.5. ROS Packages
3.6. Trap Detection Algorithm
3.6.1. UGV Control
3.6.2. UAS Control
3.6.3. Predefined Movements
3.6.4. Fuzzy Controller
3.7. UAS Base Search and Landing Algorithm
4. Results and Discussion
4.1. Experiment Description
4.1.1. UGV Validation
4.1.2. UAS Validation
4.2. UAS Base Search and Landing Algorithm Experimental Evaluation
4.2.1. AR-Tag Absolute Error Measurements
4.2.2. Base Search and Autonomous Landing Using the AR-Tag Position Readings
4.3. Discussion and Challenges
5. Conclusions and Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Acknowledgments
Conflicts of Interest
Abbreviations
| Abbreviation | Definition |
|---|---|
| IMU | Inertial Measurement Unit |
| FAO | Food and Agriculture Organization |
| FCU | Flight Controller Unit |
| MDPI | Multidisciplinary Digital Publishing Institute |
| PID | Proportional Integral Derivative |
| UAS | Unmanned Aerial System |
| UGV | Unmanned Ground Vehicle |
| UN | United Nations |
| ROS | Robot Operating System |
| SITL | Software In The Loop |
| SMC | Sliding Mode Control |
| FCKF | Fuzzy Complementary Kalman Filter |
| YOLO | You Only Look Once |
References
| Starting Point X; Y (m) | X Mean | X Std | Y Mean | Y Std |
|---|---|---|---|---|
| 0.0; 2.0 | 1.51 | 0.15 | 1.28 | 0.32 |
| 1.0; 3.0 | 2.33 | 0.33 | 1.90 | 0.12 |
| 2.0; 2.0 | 2.62 | 0.42 | 2.21 | 0.44 |
| 3.0; 2.0 | 3.10 | 0.20 | 1.87 | 0.37 |
| 3.0; −3.0 | 3.45 | 0.42 | 2.85 | 0.34 |
| 95% confidence interval | 2.60 ± 0.82 | | 2.02 ± 0.85 | |
Berger, G.S.; Teixeira, M.; Cantieri, A.; Lima, J.; Pereira, A.I.; Valente, A.; Castro, G.G.R.d.; Pinto, M.F. Cooperative Heterogeneous Robots for Autonomous Insects Trap Monitoring System in a Precision Agriculture Scenario. Agriculture 2023, 13, 239. https://doi.org/10.3390/agriculture13020239