Robotic Complex for Harvesting Apple Crops
Figure 1. Columnar apple trees.
Figure 2. Automated fruit-collection system: (a) side view; (b) top view.
Figure 3. Graphic model of the automated fruit-collection system.
Figure 4. Industrial robot KUKA KR 30-3.
Figure 5. Workspace of the KUKA KR 30-3 robot.
Figure 6. Graphic model of the filling system: 1—catching funnel; 2—flexible sleeve; 3—container.
Figure 7. Flexible bagging filling system: 1—catching funnel; 2—flexible sleeve.
Figure 8. Work cycle time (human).
Figure 9. Work cycle time (robot).
Figure 10. (a) Vacuum gripper (suction cup) Festo ESS-30-GT-G1/8; (b) vacuum generator Festo VAD-1/8.
Figure 11. Stereo sensor RC Visard 160.
Figure 12. Scheme of the stereo vision principle: the more distant object (black) shows a smaller disparity, d2, than the nearer object (gray), d1.
Figure 13. Selecting the TCP and the orientation of the gripper coordinate system.
Figure 14. Calibration of the vision system: scheme (left), process (right).
Figure 15. Testing the RC Visard 160 sensor.
Figure 16. Example images from the setup shown above: image (top left), true image (bottom left), and depth image (right).
Figure 17. The image of the objects (left) and the calculated capture points (right).
Figure 18. 1—robotic arm; 2—SmartPAD programming console; 3—connection cable; 4—KUKA KR C4 controller; 5—data cable; 6—motor control cable; 7—Ethernet cable; 8—3D sensor RC Visard 160.
Abstract
1. Introduction
2. Concept
3. Methods
- searching for the next fruit in the robot’s workspace by changing the orientation of the recognition device;
- measuring the distance to the observed objects;
- automatic adjustment of the video sensor depending on the illumination of the working area;
- comparison of objects with the “model” in memory according to the specified criteria.
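The distance-measurement function above relies on the stereo principle: a nearer object produces a larger disparity between the left and right images than a more distant one. A minimal sketch of the disparity-to-depth conversion, Z = f·b/d, where the focal length and baseline values are purely illustrative placeholders, not the RC Visard 160's actual calibration parameters:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Convert a stereo disparity (pixels) to depth (metres): Z = f * b / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Illustrative values only: 1060 px focal length, 0.16 m stereo baseline.
near = depth_from_disparity(disparity_px=40.0, focal_px=1060.0, baseline_m=0.16)
far = depth_from_disparity(disparity_px=10.0, focal_px=1060.0, baseline_m=0.16)

# The larger disparity corresponds to the nearer object.
assert near < far
```

The inverse relation is why disparity resolution limits depth accuracy at long range: a one-pixel disparity error costs far more depth error at 10 px than at 40 px.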
- a dedicated page in the RC Visard web interface for easy setup and testing;
- definition of capture areas to select the appropriate planes in the scene;
- a work-area detection function for debris-collection applications to ensure that items are captured only within that area;
- definition of compartments within containers so that items are placed in a specific order only;
- definition of the surface quality for each captured object;
- sorting objects according to their location, so that the items at the top of the pile are grabbed first.
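The last feature, picking the topmost items first, amounts to ordering grasp candidates by height before execution. A hypothetical sketch of that ordering step, where the (x, y, z) tuple format is an assumption for illustration, not the sensor's actual output schema:

```python
# Hypothetical grasp candidates as (x, y, z) points in metres,
# with the z axis pointing up in the robot base frame.
candidates = [(0.40, 0.10, 0.25), (0.35, 0.12, 0.41), (0.38, 0.08, 0.33)]

# Sort descending by height so the item on top of the pile is grabbed first.
pick_order = sorted(candidates, key=lambda p: p[2], reverse=True)

assert pick_order[0] == (0.35, 0.12, 0.41)  # topmost candidate leads the queue
```

Picking from the top down avoids collapsing the pile and keeps lower items reachable on later cycles.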
- perform calibration;
- set up object recognition;
- define the area in which the robot should work;
- perform a recognition test and optimize the parameters;
- adjust the enclosed sample programs.
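The calibration step above yields a transform between the camera frame and the robot base frame; every recognized capture point must be mapped through it before a motion command makes sense. A minimal homogeneous-transform sketch, where the matrix entries are illustrative stand-ins rather than measured calibration data:

```python
import numpy as np

# Illustrative camera-to-base transform: a 90-degree rotation about z
# plus a translation. Real values come from the calibration procedure.
T_base_cam = np.array([
    [0.0, -1.0, 0.0, 0.50],
    [1.0,  0.0, 0.0, 0.20],
    [0.0,  0.0, 1.0, 1.10],
    [0.0,  0.0, 0.0, 1.00],
])

def to_base_frame(point_cam, T):
    """Map a 3-D point from the camera frame into the robot base frame."""
    p = np.append(np.asarray(point_cam, dtype=float), 1.0)  # homogeneous coords
    return (T @ p)[:3]

# A capture point 0.8 m in front of the camera, expressed in the base frame.
grasp_base = to_base_frame([0.10, 0.00, 0.80], T_base_cam)
```

The recognition test in the checklist is where errors in this transform show up: a consistent offset between commanded and actual grasp positions usually means the calibration needs to be redone.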
4. Conclusions and Discussion
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Conflicts of Interest
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Cite as: Krakhmalev, O.; Gataullin, S.; Boltachev, E.; Korchagin, S.; Blagoveshchensky, I.; Liang, K. Robotic Complex for Harvesting Apple Crops. Robotics 2022, 11, 77. https://doi.org/10.3390/robotics11040077