Design and Evaluation of Capacitive Smart Transducer for a Forestry Crane Gripper
Figure 1. Hydraulic forestry crane for autonomous log manipulation. (a) Real crane: a mobile forestry crane during a delivery at a sawmill. (b) Scaled crane model: a 1:5 scale model with hydraulics designed to match the scale and manual controls equivalent to those of the real crane. The scaled model has an arm length of 2.5 m and is surrounded by a protective cage, providing a workspace volume of approximately 20 m³.
Figure 2. Single-ended capacitance measurement principle. The capacitance between the transmitter electrode and the environment is affected by the presence of nearby materials and their dielectric properties.
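As a minimal first-order model of this effect (an illustrative assumption, not an equation taken from the paper), the measured single-ended capacitance can be viewed as a free-space baseline plus a material-dependent increment:

```latex
% Illustrative first-order model (assumed, not from the paper): an object with
% relative permittivity \varepsilon_r > 1, e.g., wood, entering the fringing
% field raises the measured capacitance above its free-space baseline C_0.
C_\mathrm{meas} \approx C_0 + \Delta C,
\qquad
\Delta C \propto \varepsilon_0 \left( \varepsilon_r - 1 \right) \frac{A_\mathrm{eff}}{d}
```

Here C_0 denotes the free-space baseline capacitance, A_eff an effective coupling area and d the electrode-to-object distance; these symbols are introduced here for illustration only.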
Figure 3. Design of the gripper claw and sensor placement. Each sensor array consists of three transmitter electrodes (top, middle and bottom), as illustrated in the expanded view. The electrodes are placed over an active shield layer and separated from it by an insulating layer; the electrode/shield stack is then fully enclosed in an insulating enclosure. One sensor array is mounted on each of the left and right fingers of the outer gripper claw.
Figure 4. High-level data acquisition and processing framework. The crane carries various sensors for monitoring and a hydraulic system for actuating the position control. It also has an embedded controller that drives the crane based on external commands from the high-level controller. The designed grasp sensing system (STIM) is mounted on the crane in a protective cavity. The STIM communicates with the Network Capable Application Processor (NCAP) framework on the host computer via BLE; the acquired data is then processed further and passed to a program using the Robot Operating System (ROS).
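As an illustrative sketch of this data path (an assumption about one possible implementation, not code from the paper), the NCAP side could subscribe to BLE notifications from the STIM and republish the decoded raw capacitance values on a ROS topic. The device address, characteristic UUID, packet layout and topic name below are placeholders:

```python
# Illustrative NCAP-side bridge: BLE notifications -> ROS topic.
# Assumptions (not from the paper): device address, characteristic UUID,
# little-endian uint16 packing of six electrode values per notification.
import asyncio
import struct

import rospy
from std_msgs.msg import Float32MultiArray
from bleak import BleakClient

STIM_ADDRESS = "AA:BB:CC:DD:EE:FF"                       # placeholder BLE address
CAP_CHAR_UUID = "00002a58-0000-1000-8000-00805f9b34fb"   # placeholder characteristic UUID

def make_handler(pub):
    def handle_notification(_sender, data: bytearray):
        # Assumed packet layout: 6 x uint16 raw CDC counts (3 electrodes per claw).
        counts = struct.unpack("<6H", data[:12])
        pub.publish(Float32MultiArray(data=[float(c) for c in counts]))
    return handle_notification

async def run():
    rospy.init_node("stim_ncap_bridge")
    pub = rospy.Publisher("/gripper/capacitance_raw", Float32MultiArray, queue_size=10)
    async with BleakClient(STIM_ADDRESS) as client:
        await client.start_notify(CAP_CHAR_UUID, make_handler(pub))
        while not rospy.is_shutdown():
            await asyncio.sleep(0.1)

if __name__ == "__main__":
    asyncio.run(run())
```

In the actual system, the payload interpretation would follow the Transducer Electronic Data Sheet (TEDS) information required by IEEE 1451.0 rather than the fixed layout assumed here.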
Figure 5. Sensor data acquisition system. (a) Design of the STIM. The power management unit harvests energy using solar panels and uses a coin cell as battery backup. The nRF52 configures and acquires data from the CDC; the data is then transmitted via BLE, in compliance with the IEEE 1451.0 specification, to the crane controller. (b) The fully assembled PCB of the STIM, highlighting the main components for capacitance measurement, energy harvesting and wireless transmission.
Figure 6. Experimental setup. (a) Placement of the active shield layer on the gripper claw. One of the cavities for holding the STIM is clearly visible on the inner gripper claw; the STIM itself is placed in a cavity on the right finger of the outer gripper. (b) Construction of the sensor electrodes.
Figure 7. Logs of different diameters used in the grasping experiments. The big log (top) has a diameter of 13.5 cm, the medium log (middle) a diameter of 7.5 cm and the small log (bottom) a diameter of 2.5 cm.
Figure 8. Experimental results for grasping the big log (13.5 cm). Solid lines represent the electrodes on the right claw and dashed lines the electrodes on the left claw. The output of the detection algorithm is plotted in red on top of each plot. (a) Sensor output during a proper firm grasp; (b) sensor output during a corner grasp; (c) sensor output when grasping the log at an angle, with the angled grasping scenario pictured in (d); (e) sensor output when moving the log under an incomplete grasp; and (f) output of the open and close hydraulic pressure sensors (right y-axis, yellow background) together with the capacitive sensors (left y-axis) during a fingertip grasp.
Figure 9. Experimental results for grasping the medium log (7.5 cm). Solid lines represent the electrodes on the right claw and dashed lines the electrodes on the left claw. The output of the detection algorithm is plotted in red on top of each plot. (a) Sensor output during a proper firm grasp; (b) sensor output during a corner grasp; (c) sensor output when moving the log under an incomplete grasp; (d) a picture of the incomplete grasping procedure; and (e) grasp detection results for the small log.
Abstract
1. Introduction
1.1. Motivation and Related Work
1.2. Contribution
2. System Description
2.1. Sensing Principle
2.2. Gripper Design and Sensor Placement
2.3. System Overview
2.3.1. Design and Realization of STIM
2.3.2. Wireless Network
2.3.3. Grasp Detection Algorithm
Algorithm 1: Grasp Detection Algorithm
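The steps of Algorithm 1 are not reproduced here. As a hedged illustration of how a simple threshold-based detection over the two three-electrode arrays could be structured (an assumption for readability, not the paper's Algorithm 1; function names, thresholds and values are placeholders):

```python
# Illustrative threshold-based grasp detection over two 3-electrode arrays.
# This is a sketch under assumed names and thresholds, not the paper's Algorithm 1.
from dataclasses import dataclass
from typing import List, Sequence

@dataclass
class GraspState:
    contact_right: List[bool]  # per-electrode contact flags (top, middle, bottom)
    contact_left: List[bool]
    grasp_detected: bool

def detect_grasp(right: Sequence[float], left: Sequence[float],
                 baseline_right: Sequence[float], baseline_left: Sequence[float],
                 threshold: float = 50.0) -> GraspState:
    """Flag per-electrode contact when the capacitance rise above the stored
    baseline exceeds an empirically chosen threshold (placeholder value)."""
    contact_r = [(r - b) > threshold for r, b in zip(right, baseline_right)]
    contact_l = [(l - b) > threshold for l, b in zip(left, baseline_left)]
    # Assumed criterion: a grasp is declared once at least one electrode on
    # each claw reports contact with the log.
    detected = any(contact_r) and any(contact_l)
    return GraspState(contact_r, contact_l, detected)

# Example usage with made-up raw CDC counts:
state = detect_grasp(right=[1260.0, 1275.0, 1190.0], left=[1245.0, 1180.0, 1175.0],
                     baseline_right=[1180.0] * 3, baseline_left=[1170.0] * 3)
print(state.grasp_detected)  # True
```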
3. Experimental Results
3.1. Grasping Experiments
- Proper grasp: The log is firmly grasped by the claws, the grippers are fully closed and the sensors are in tight contact with the log. The log does not shift from its grasped position while the crane is moving.
- Corner grasp: The log is grasped near its end and is therefore only partially enclosed by the claw.
- Incomplete grasp: The claws are not closed to the maximum possible extent, so the log is not held firmly by the claw. (A hedged sketch mapping these categories to electrode contact patterns follows this list.)
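As a rough illustration only, and under the assumption that each claw exposes per-electrode contact flags (as in the detection sketch above), the three grasp categories could be mapped from the contact pattern as follows; the rules, names and example values are placeholders, not the procedure used in the paper:

```python
# Illustrative mapping from per-electrode contact flags to grasp categories.
# The rules, names and example patterns are assumptions for illustration only.
from typing import Sequence

def classify_grasp(contact_right: Sequence[bool], contact_left: Sequence[bool]) -> str:
    """contact_* hold (top, middle, bottom) contact flags for one claw each."""
    n_right, n_left = sum(contact_right), sum(contact_left)
    if n_right == 0 and n_left == 0:
        return "no grasp"
    if n_right == len(contact_right) and n_left == len(contact_left):
        return "proper grasp"      # all electrodes in tight contact with the log
    if n_right == 0 or n_left == 0:
        return "incomplete grasp"  # the log rests against only one claw
    return "corner grasp"          # partial contact on both claws (log held near its end)

# Example usage with made-up contact patterns:
print(classify_grasp([True, True, True], [True, True, True]))     # proper grasp
print(classify_grasp([True, True, False], [True, False, False]))  # corner grasp
```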
3.2. Grasp Detection Results
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
Anandan, N.; Arronde Pérez, D.; Mitterer, T.; Zangl, H. Design and Evaluation of Capacitive Smart Transducer for a Forestry Crane Gripper. Sensors 2023, 23, 2747. https://doi.org/10.3390/s23052747