A Synergy of Innovative Technologies towards Implementing an Autonomous DIY Electric Vehicle for Harvester-Assisting Purposes
Figure 1. A typical arrangement of all the components necessary for the fruit transport process.
Figure 2. The basic electromechanical layout of the harvester-assisting vehicle: (a) initial design; (b) implementation outcome.
Figure 3. The geometry of the vehicle implies a specific set of calculations in order to feed the low-level control components with the information necessary for turning at a given radius and speed.
Figure 4. (a) The plan of a field consisting of five parallel plantation lines; (b) the grid modeling this field, containing 6 lanes, 2 sides and 12 waypoints. The vehicle is allowed to move only on lanes and sides, and any path between two points reduces to a sequence of intermediate waypoints.
Figure 5. A typical arrangement of the high-level control components inside an IP55 plastic box, including a Navio2 unit, a Raspberry Pi model 3B, an Intel NCS2 module, and a USB camera. The connection with indicative low-level components (here, an Arduino Uno unit, on the left) is also shown.
Figure 6. An indicative arrangement allowing for voice command functionality and GPS positioning, using a Raspberry Pi 3 model A+, a Dragino LoRa radio module, and the ASUS AI noise-canceling stick plus a headset. This system variant is tailored to accompany the operator (i.e., the farmer).
Figure 7. A typical interoperability diagram highlighting the roles of the entities used to support the autonomous vehicle’s functionality.
Figure 8. The robotic vehicle in action: the high-level control parts, inside the IP55 box, are clearly distinguishable, as is the plastic pallet bin carrying test cargo of approximately 10–15 kg, on smooth (left) and comparatively rough (right) terrain.
Figure 9. The experimental deployment necessary for testing the path-following accuracy of the proposed robotic vehicle on both smooth and comparatively rough terrain.
Figure 10. The activity traces of the robotic vehicle on smooth (left) and rough (right) terrain, as visualized using the QGIS software package.
Figure 11. The hardware arrangements of the robotic vehicle providing RTK-level accuracy (left) and efficient energy measurement (right). The key components (i.e., the RTK GPS and the INA219 power meter) are enclosed in dashed circles.
Figure 12. Detailed recordings of the dynamic behavior of the robotic vehicle, from the low-level control unit perspective, in two cases: on smooth terrain while loaded (left) and on comparatively rough terrain while unloaded (right).
Figure 13. Detailed recordings of the dynamic behavior of the robotic vehicle, from the low-level control unit perspective, on comparatively rough terrain and loaded, in two cases: turning (left) and moving straight (right).
Abstract
1. Introduction
2. Functional Requirements, Methods and Materials
- To start from a specific location and reach the location of a specific harvester in the field, between the plants.
- To follow a specific harvester during its work, so that the latter can put the collected fruits into it.
- To recognize simple predefined voice commands, in order to stop at a specific location, to start/restart following the harvester, or to go to the truck when full.
- To repeat the whole process, i.e., to go back from the truck’s location to the harvester’s location and so on, by following the narrow paths surrounding the field or moving between the plantation lines.
- To provide health status indication messages (e.g., a low-battery-voltage alarm) and perform the necessary emergency actions (e.g., return to the truck or start beeping).
- To continuously report its location for testing purposes and provide efficient manual override alternatives, e.g., when a sensor malfunction is detected.
- To operate on a local (off-line) basis without heavy dependencies on cloud-based elements (i.e., elements accessible via the Internet); the latter option can be implemented for debugging/emergency cases only.
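The behavioral requirements above can be sketched as a simple state machine. The mode names, events, and transitions below are illustrative stand-ins, not the paper’s actual control software:

```python
from enum import Enum, auto

class Mode(Enum):
    IDLE = auto()
    GO_TO_HARVESTER = auto()
    FOLLOW_HARVESTER = auto()
    GO_TO_TRUCK = auto()
    EMERGENCY = auto()

# Hypothetical transition table keyed by (current mode, event);
# events such as "bin_full" would come from sensors or voice commands.
TRANSITIONS = {
    (Mode.IDLE, "start"): Mode.GO_TO_HARVESTER,
    (Mode.GO_TO_HARVESTER, "harvester_reached"): Mode.FOLLOW_HARVESTER,
    (Mode.FOLLOW_HARVESTER, "voice_stop"): Mode.IDLE,
    (Mode.FOLLOW_HARVESTER, "bin_full"): Mode.GO_TO_TRUCK,
    (Mode.GO_TO_TRUCK, "unloaded"): Mode.GO_TO_HARVESTER,  # repeat the cycle
}

def step(mode, event):
    """Advance the state machine; a battery-low alarm preempts everything."""
    if event == "battery_low":
        return Mode.EMERGENCY
    return TRANSITIONS.get((mode, event), mode)  # unknown events are ignored
```

The cyclic harvester–truck–harvester behavior of the requirements list corresponds to the `GO_TO_TRUCK → GO_TO_HARVESTER` transition on the `"unloaded"` event.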
3. Design and Implementation Details and Challenging Trade-Offs
3.1. Electromechanical Layout
3.2. Low-Level Control Equipment and Functions
3.3. Overview of the High-Level Control Equipment and Functions
3.4. High-Level Control Equipment and Functions (Path Planning)
- Point A, if not exactly on the grid, is almost on the grid, i.e., its distance from a side or lane is negligible.
- Point B may be far from any side or lane, but another point B’ is used instead of B: the point on the grid closest to B. A path from A to B’ is then calculated.
- There are no obstacles along the path from A to B’.
- closestPointOnLane(P), which takes a point P and returns three values: the point closest to P lying on some lane, the index of that lane, and the distance from P.
- closestPointOnSide(P), which takes a point P and returns three values: the point closest to P lying on some side, the index of that side, and the distance from P.
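A minimal sketch of how such a helper might work, treating each lane as a 2-D line segment. The segment representation and the function names are assumptions; the paper’s implementation presumably works on geodetic coordinates (e.g., via PyGeodesy) rather than the planar tuples used here:

```python
def closest_point_on_segment(p, a, b):
    """Project point p onto segment a-b; all points are (x, y) tuples."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    seg_len2 = dx * dx + dy * dy
    if seg_len2 == 0.0:          # degenerate segment: a single point
        return a
    # Clamp the projection parameter to [0, 1] so the result stays on the segment.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len2))
    return (ax + t * dx, ay + t * dy)

def closest_point_on_lane(p, lanes):
    """Return (closest point, lane index, distance), mirroring the three
    return values described above; lanes is a list of (start, end) segments."""
    best = None
    for i, (a, b) in enumerate(lanes):
        q = closest_point_on_segment(p, a, b)
        d = ((q[0] - p[0]) ** 2 + (q[1] - p[1]) ** 2) ** 0.5
        if best is None or d < best[2]:
            best = (q, i, d)
    return best
```

closestPointOnSide(P) would be identical, iterating over the two sides instead of the lanes.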
1. A, B’ are on the same side or on the same lane.
2. A, B’ are on opposite sides.
3. A is on a lane and B’ is on a side.
4. A is on a side and B’ is on a lane.
5. A, B’ are on different lanes.
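The case analysis above can be expressed as a small dispatch function. The index-based encoding (each point carries the lane or side it lies on, or `None`) is an illustrative assumption about how the grid information might be represented:

```python
def classify(a_lane, a_side, b_lane, b_side):
    """Return which of the five path-planning cases applies to A and B'.
    Each argument is the lane/side index the point lies on, or None.
    (The numbering convention is illustrative, not taken from the paper.)"""
    if a_lane is not None and a_lane == b_lane:
        return 1  # same lane
    if a_side is not None and a_side == b_side:
        return 1  # same side
    if a_side is not None and b_side is not None:
        return 2  # opposite sides (the grid has only two sides)
    if a_lane is not None and b_side is not None:
        return 3  # lane -> side
    if a_side is not None and b_lane is not None:
        return 4  # side -> lane
    return 5      # different lanes
```

Once the case is known, the path reduces to the sequence of intermediate waypoints joining A’s lane or side to B’’s, as described for the grid model.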
3.5. High-Level Control Equipment and Functions (Machine Vision)
3.6. High-Level Control Equipment and Functions (Voice Recognition)
3.7. Interoperability and Networking Issues (Summary of Features)
- GPS (and IMU)-assisted continuous localization of the robot, distant target finding (i.e., the tractor or the farmer), and actions for reaching it, according to the lane grid constraints.
- When close to the target, the visual navigation methods, based on machine learning techniques, take control, for a more accurate response.
- Voice command functionality facilitates the guidance of the robot by allowing simple instruction execution.
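The handover between the two navigation modalities can be expressed as a simple distance threshold. The threshold value and the function below are assumptions for illustration; the paper does not state the exact switching criterion in this excerpt:

```python
VISION_RANGE_M = 5.0  # assumed distance at which the camera-based detector takes over

def navigation_source(distance_to_target_m, target_visible):
    """Select the guidance modality, per the handover policy described above."""
    if distance_to_target_m <= VISION_RANGE_M and target_visible:
        return "vision"   # machine-learning detector steers the final approach
    return "gps_imu"      # coarse waypoint navigation on the lane grid
```

Requiring `target_visible` as well as proximity means the robot falls back to GPS/IMU guidance whenever the detector loses the target, rather than stopping.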
4. Evaluation Results and Discussion
4.1. Testing Details and Results
4.2. Discussion
5. Conclusions and Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
ri | vi | θ
---|---|---
0.01 | 0.02 | 57.0
0.10 | 0.17 | 48.8
0.50 | 0.50 | 28.1
1.00 | 0.67 | 17.7
2.00 | 0.80 | 10.1
4.00 | 0.89 | 5.4
8.00 | 0.94 | 2.8
10.00 | 0.95 | 2.2
15.00 | 0.97 | 1.5
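The tabulated values are mutually consistent with a simple turning-geometry model, matching the radius/speed calculations mentioned for the low-level control. The sketch below reproduces every row under the assumptions that ri is the inner-side turning radius in meters, the track width is 0.5 m, and the wheelbase is 0.4 m; these parameters are inferred from the numbers, not stated in this excerpt:

```python
import math

TRACK = 0.5      # assumed distance between left and right wheels (m)
WHEELBASE = 0.4  # assumed distance between front and rear axles (m)

def turn_parameters(r_inner):
    """For a turn whose inner-side radius is r_inner (m), return the
    inner/outer wheel speed ratio v and the equivalent steering angle
    theta (degrees), taken at the vehicle's longitudinal centerline."""
    v = r_inner / (r_inner + TRACK)  # outer wheels sweep a larger arc
    theta = math.degrees(math.atan(WHEELBASE / (r_inner + TRACK / 2)))
    return v, theta
```

For instance, `turn_parameters(1.0)` yields approximately (0.67, 17.7), matching the fourth row of the table: a tight turn demands a large speed differential and steering angle, while at 15 m radius both wheels run at nearly the same speed.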
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Loukatos, D.; Petrongonas, E.; Manes, K.; Kyrtopoulos, I.-V.; Dimou, V.; Arvanitis, K.G. A Synergy of Innovative Technologies towards Implementing an Autonomous DIY Electric Vehicle for Harvester-Assisting Purposes. Machines 2021, 9, 82. https://doi.org/10.3390/machines9040082