From CySkin to ProxySKIN: Design, Implementation and Testing of a Multi-Modal Robotic Skin for Human–Robot Interaction
Figure 1. Excerpt of the CAD model of a UR10e robot partially covered with proximity and tactile sensors placed on ad hoc designed covers. The hole pattern on the tactile sensor supports is required to lodge electronic components on the rear side of each module.

Figure 2. Rear/front view of a CySkin patch with highlighted core components.

Figure 3. Tactile information representations obtained from CySkin data processing: (a) three-dimensional visualization of the tactile map projected on the mesh of a cylinder covered with CySkin modules; (b) tactile image obtained by a full-hand grasp of the cylinder.

Figure 4. The shared-payload task implemented as the use-case scenario for the CoLLaboratE project: (a) KUKA KR QUANTEC robot equipped with a vacuum gripper approaching the car windshield to pick it up; (b) collaborative phase: the operator grasps the handles covered with CySkin to effortlessly move the payload. Safety is guaranteed by the detection of a hand shape in the stream of tactile images, which smoothly enables the admittance controller [34].

Figure 5. ProxySKIN technology in an integration-ready arrangement.

Figure 6. Volumetric analysis of the coverage of circular arrays of proximity sensors: (a) qualitative results showing the reference volume V_ref, the projection of the field of view (FoV) of n arrays with m ToF sensors, and the blind volume V_tot obtained by subtracting the projection of the ToF sensors' FoV from V_ref; (b) plot of the blind volume V_tot as a function of the number of arrays and the number of ToF sensors per array.

Figure 7. Oscilloscope screen capture: data transmission analysis from an array of 10 ToF sensors to its associated microcontroller (IHB) and from the IHB to the EtherCAT slave device. Yellow and cyan lines: SPI signals (MISO and CLOCK) from the array of proximity sensors to the IHB. Magenta and green lines: SPI signals (MISO and CLOCK) from the IHB to the EtherCAT slave.

Figure 8. (a) The ToF connector on the top layer picks up supplies and signals from the straight bottom tracks; (b) the 20-pin end connector groups the small pins into wider tracks to sustain signal integrity and current requirements.

Figure 9. PCB routing of the ToF sensor board. It has a 4-layer stack (ground plane not visible in the figure for clarity). All the SMD parts are placed on the top layer, while only the connector to the flex strip is on the bottom layer, to avoid extra height between the PCB and the flex that could prevent proper mating between the male plug and the female socket.

Figure 10. System-level integration. Two separate EtherCAT backbones are used for data acquisition, one for CySkin and one for the arrays of proximity sensors.

Figure 11. Excerpt of the CAD model of a UR10e robot with the projection of the FoV of two arrays of proximity sensors mounted on the first and second links, respectively. In this robot configuration, the FoVs of ToF sensors belonging to different arrays clearly intersect.

Figure 12. Interference tests among proximity sensors belonging to the same array: (a) Test 1: the target plane, parallel to the linear array of ToF sensors, is placed at various distances; (b) picture of the real setup; (c) Test 2: the target plane is set up at different angles with respect to the linear array of proximity sensors.

Figure 13. Test 1: plot of 150 distance measurements as a function of time acquired by ToF3 in different operational conditions, with a parallel target plane placed at 50 cm from the linear array of ToF sensors.

Figure 14. Test 2: 2D and 3D plots of the reconstructed point cloud acquired by eight ToF sensors with the target plane set up at different angles. ToF sensor origins are represented with red dots; ToF measurements with blue dots.

Figure 15. Interference tests between proximity sensors belonging to different arrays: (a) picture of the real setup; (b,c) 2D and 3D reconstructions of the distance measurements collected by eight ToF sensors of array 1, with data acquisition from array 2 simultaneously enabled.

Figure 16. Plot of 150 distance measurements as a function of time acquired by ToF5 of the first array in two different operational conditions: second array disabled (blue graph) and enabled (red graph).
Abstract
1. Introduction
2. CySkin Technology
3. CoLLaboratE Project
4. ProxySKIN: Next Generation of CySkin
4.1. Time-of-Flight Sensors
- ToF Field of View (FoV). The FoV of a single proximity sensor is 45°. Therefore, assuming the linear arrays are shaped into a circular pattern with the sensing elements facing outwards, at least eight ToFs are necessary to cover the surrounding 360°. However, a volumetric analysis of the ToF coverage shows that with 10 ToFs per array the ratio between blind and visible regions is significantly reduced. The analysis considers a constant cylindrical reference volume V_ref, consistent with the space monitored by the proximity sensors on a link of the robot manipulator, and a varying number of arrays (from one to four) and ToFs per array (from 8 to 16). By intersecting and subtracting the projection of the ToF FoVs from the reference volume V_ref, we obtain the blind volume V_tot that is not covered by the proximity fields. The results are reported in Figure 6 and Table 1, showing that three arrays of 10 ToFs represent a good compromise between volume coverage and number of ToFs per array.
- ToF Power Consumption. The typical power consumption of a single proximity sensor is about 215 mW, leading to roughly 2150 mW for a 10-sensor array.
- ToF Data Payload and Transmission Frequency. Besides the target distance measurement, the VL53L8CX proximity sensor can provide several other data, including the ambient light noise, the variance and validity of the measured target distance, and the estimated reflectance of the target. We consider three fields of the message containing relevant information for robotic applications: from a single sensor reading in its 64-zone configuration, we extract the target distance, the status, and the variance of the measurement, for a payload of 320 bytes. With 10 sensors, the full data payload amounts to 3220 bytes, which must be transmitted to the microcontroller within 66.7 ms, assuming operation at the maximum frequency of 15 Hz. Each array of ToF sensors is driven by a microcontroller (the IHB) that sequentially reads the range data through an SPI channel operating at 3 MHz. Data are then forwarded to an EtherCAT slave through another SPI channel. This configuration represents a trade-off between system complexity (a larger number of ToF sensors per array) and the time lag introduced to transmit the ToF data to be processed by the robot controller. A temporal analysis of the SPI signals, reported in Figure 7, shows a bandwidth consumption with 10 sensors that leaves no room for integrating additional ToFs into the array.
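The geometric argument for the minimum array size can be cross-checked with a simple 2D sketch (function names are ours, not from the paper; the real analysis above is volumetric, whereas this only tiles the horizontal 360° at the array radius):

```python
import math

def min_sensors_full_circle(fov_deg: float) -> int:
    """Minimum number of equally spaced sensors whose horizontal FoV
    tiles the full 360 degrees around a circular array (2D simplification)."""
    return math.ceil(360.0 / fov_deg)

def uncovered_angle_deg(n_sensors: int, fov_deg: float) -> float:
    """Total angle left uncovered when n sensors with the given FoV are
    spaced evenly; any overlap beyond 360 degrees simply clips to zero."""
    return max(0.0, 360.0 - n_sensors * fov_deg)

# A 45-degree FoV needs at least eight ToFs to close the ring, as stated above.
print(min_sensors_full_circle(45.0))   # 8
print(uncovered_angle_deg(8, 45.0))    # 0.0
```

Note that a zero uncovered angle at the array radius does not remove the blind wedges close to the robot surface, which is why the volumetric analysis still favors 10 ToFs per array.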
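A back-of-envelope timing check (variable names ours) shows why the raw payload fits the 15 Hz budget while real traces such as those in Figure 7 still leave little headroom — the estimate below ignores sequential per-sensor addressing, driver overhead, and inter-byte gaps, all of which consume additional bus time:

```python
def spi_transfer_time_ms(payload_bytes: int, clock_hz: float) -> float:
    """Raw shift-out time of a payload over SPI, ignoring protocol
    overhead, register setup, and inter-byte gaps."""
    return payload_bytes * 8 / clock_hz * 1e3

PAYLOAD_BYTES = 3220       # full payload for a 10-ToF array (from the text)
CLOCK_HZ = 3_000_000       # 3 MHz SPI clock
BUDGET_MS = 1000 / 15      # 66.7 ms frame period at 15 Hz

hop_ms = spi_transfer_time_ms(PAYLOAD_BYTES, CLOCK_HZ)  # one SPI hop
total_ms = 2 * hop_ms      # sensors -> IHB, then IHB -> EtherCAT slave
print(f"{hop_ms:.2f} ms per hop, {total_ms:.2f} ms total "
      f"vs {BUDGET_MS:.1f} ms budget")
```

The ideal transfer takes about 8.6 ms per hop, so the remaining budget is spent on the per-sensor ranging and protocol overhead visible on the oscilloscope traces.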
4.2. Hardware Design of the Proximity Sensor Arrays
4.3. Communication Protocols and System Level Description
5. Proximity Sensors Characterization
- The fields of view (FoV) of adjacent ToF sensors in an array partially overlap to locally minimize the blind volume not monitored by the proximity sensors, as explained in Section 4.1;
- The FoVs of proximity sensors placed on different links of the robot might intersect depending on the joint configuration, as shown in Figure 11.
- Interference test among ToF sensors belonging to the same array;
- Interference test between distinct arrays of ToF sensors.
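Whether two sensors can interfere depends on whether one sensor's emissions land inside the other's viewing cone. A minimal 2D helper for flagging such configurations might look as follows (a sketch under simplifying assumptions; the 4 m maximum range and the function name are ours, not from the paper):

```python
import math

def in_fov(origin, axis, point, fov_deg=45.0, max_range_m=4.0):
    """True if `point` lies inside the sensor's viewing cone, with apex
    at `origin` and centred on the unit vector `axis` (2D simplification)."""
    vx, vy = point[0] - origin[0], point[1] - origin[1]
    d = math.hypot(vx, vy)
    if d == 0 or d > max_range_m:
        return False
    # cosine of the angle between the ray to the point and the optical axis
    cos_ang = (vx * axis[0] + vy * axis[1]) / d
    return cos_ang >= math.cos(math.radians(fov_deg / 2))

# Does a point on another link fall inside this sensor's 45-degree cone?
print(in_fov((0.0, 0.0), (1.0, 0.0), (0.5, 0.1)))  # True  (11 deg off-axis)
print(in_fov((0.0, 0.0), (1.0, 0.0), (0.1, 0.5)))  # False (79 deg off-axis)
```

Running such a check over the sensor poses given by the robot's forward kinematics would identify, for each joint configuration, which sensor pairs are candidates for cross-interference.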
5.1. Interference Test among ToF Sensors Belonging to the Same Array
5.2. Interference Test between Distinct Arrays of ToF Sensors
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Zanchettin, A.M.; Casalino, A.; Piroddi, L.; Rocco, P. Prediction of Human Activity Patterns for Human–Robot Collaborative Assembly Tasks. IEEE Trans. Ind. Inform. 2019, 15, 3934–3942. [Google Scholar] [CrossRef]
- Landi, C.T.; Cheng, Y.; Ferraguti, F.; Bonfè, M.; Secchi, C.; Tomizuka, M. Prediction of Human Arm Target for Robot Reaching Movements. In Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 3–8 November 2019; pp. 5950–5957. [Google Scholar] [CrossRef]
- Bascetta, L.; Ferretti, G.; Rocco, P.; Ardo, H.; Bruyninckx, H.; Demeester, E.; Di Lello, E. Towards Safe Human-Robot Interaction in Robotic Cells: An Approach Based on Visual Tracking and Intention Estimation. In Proceedings of the 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems, San Francisco, CA, USA, 25–30 September 2011; pp. 2971–2978. [Google Scholar] [CrossRef]
- Roitberg, A.; Perzylo, A.; Somani, N.; Giuliani, M.; Rickert, M.; Knoll, A. Human activity recognition in the context of industrial human-robot interaction. In Proceedings of the Signal and Information Processing Association Annual Summit and Conference (APSIPA), 2014 Asia-Pacific, Siem Reap, Cambodia, 9–12 December 2014; pp. 1–10. [Google Scholar] [CrossRef]
- Tellaeche, A.; Maurtua, I.; Ibarguren, A. Human Robot Interaction in Industrial Robotics. Examples from Research Centers to Industry. In Proceedings of the 2015 IEEE 20th Conference on Emerging Technologies & Factory Automation (ETFA), Luxembourg, 8–11 September 2015; pp. 1–6. [Google Scholar] [CrossRef]
- Du, G.; Zhang, P. A Markerless Human–Robot Interface Using Particle Filter and Kalman Filter for Dual Robots. IEEE Trans. Ind. Electron. 2015, 62, 2257–2264. [Google Scholar] [CrossRef]
- Wang, Y.; Wu, X.; Mei, D.; Zhu, L.; Chen, J. Flexible Tactile Sensor Array for Distributed Tactile Sensing and Slip Detection in Robotic Hand Grasping. Sens. Actuators A Phys. 2019, 297, 111512. [Google Scholar] [CrossRef]
- Maggiali, M.; Cannata, G.; Maiolino, P.; Metta, G.; Randazzo, M.; Sandini, G. Embedded Distributed Capacitive Tactile Sensor. In Proceedings of the Mechatronics 2008: 11th Mechatronics Forum Biennial International Conference, Institution of Mechanical Engineers, Limerick, Ireland, 23–25 June 2008. [Google Scholar]
- Yu, P.; Liu, W.; Gu, C.; Cheng, X.; Fu, X. Flexible Piezoelectric Tactile Sensor Array for Dynamic Three-Axis Force Measurement. Sensors 2016, 16, 819. [Google Scholar] [CrossRef] [PubMed]
- Krishna, G.; Rajanna, K. Tactile Sensor Based on Piezoelectric Resonance. IEEE Sensors J. 2004, 4, 691–697. [Google Scholar] [CrossRef]
- Dahiya, R.S.; Metta, G.; Valle, M.; Adami, A.; Lorenzelli, L. Piezoelectric Oxide Semiconductor Field Effect Transistor Touch Sensing Devices. Appl. Phys. Lett. 2009, 95, 034105. [Google Scholar] [CrossRef]
- Li, W.; Konstantinova, J.; Noh, Y.; Ma, Z.; Alomainy, A.; Althoefer, K. An Elastomer-based Flexible Optical Force and Tactile Sensor. In Proceedings of the 2019 2nd IEEE International Conference on Soft Robotics (RoboSoft), Seoul, Republic of Korea, 14–18 April 2019; pp. 361–366. [Google Scholar] [CrossRef]
- Yuan, W.; Dong, S.; Adelson, E. GelSight: High-Resolution Robot Tactile Sensors for Estimating Geometry and Force. Sensors 2017, 17, 2762. [Google Scholar] [CrossRef] [PubMed]
- Mittendorfer, P.; Cheng, G. Humanoid Multimodal Tactile-Sensing Modules. IEEE Trans. Robot. 2011, 27, 401–410. [Google Scholar] [CrossRef]
- Grella, F.; Baldini, G.; Canale, R.; Sagar, K.; Wang, S.A.; Albini, A.; Jilich, M.; Cannata, G.; Zoppi, M. A Tactile Sensor-Based Architecture for Collaborative Assembly Tasks with Heavy-Duty Robots. In Proceedings of the 2021 20th International Conference on Advanced Robotics (ICAR), Ljubljana, Slovenia, 6–10 December 2021; pp. 1030–1035. [Google Scholar] [CrossRef]
- Hughes, D.; Lammie, J.; Correll, N. A Robotic Skin for Collision Avoidance and Affective Touch Recognition. IEEE Robot. Autom. Lett. 2018, 3, 1386–1393. [Google Scholar] [CrossRef]
- Ding, Y.; Zhang, H.; Thomas, U. Capacitive Proximity Sensor Skin for Contactless Material Detection. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 7179–7184. [Google Scholar] [CrossRef]
- Ding, Y.; Wilhelm, F.; Faulhammer, L.; Thomas, U. With Proximity Servoing towards Safe Human-Robot-Interaction. In Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 3–8 November 2019; pp. 4907–4912. [Google Scholar] [CrossRef]
- Tsuji, S.; Kohama, T. Proximity Skin Sensor Using Time-of-Flight Sensor for Human Collaborative Robot. IEEE Sensors J. 2019, 19, 5859–5864. [Google Scholar] [CrossRef]
- Cheung, E.; Lumelsky, V. A sensitive skin system for motion control of robot arm manipulators. Robot. Auton. Syst. 1992, 10, 9–32. [Google Scholar] [CrossRef]
- Lumelsky, V.; Cheung, E. Real-Time Collision Avoidance in Teleoperated Whole-Sensitive Robot Arm Manipulators. IEEE Trans. Syst. Man, Cybern. 1993, 23, 194–203. [Google Scholar] [CrossRef]
- Mittendorfer, P.; Yoshida, E.; Cheng, G. Realizing whole-body tactile interactions with a self-organizing, multi-modal artificial skin on a humanoid robot. Adv. Robot. 2015, 29, 51–67. [Google Scholar] [CrossRef]
- Mittendorfer, P.; Yoshida, E.; Moulard, T.; Cheng, G. A general tactile approach for grasping unknown objects with a humanoid robot. In Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013; pp. 4747–4752. [Google Scholar] [CrossRef]
- Cheng, G.; Dean-Leon, E.; Bergner, F.; Rogelio Guadarrama Olvera, J.; Leboutet, Q.; Mittendorfer, P. A Comprehensive Realization of Robot Skin: Sensors, Sensing, Control, and Applications. Proc. IEEE 2019, 107, 2034–2051. [Google Scholar] [CrossRef]
- Bergner, F.; Dean-Leon, E.; Cheng, G. Event-based signaling for large-scale artificial robotic skin - realization and performance evaluation. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Republic of Korea, 9–14 October 2016; pp. 4918–4924. [Google Scholar] [CrossRef]
- Zhou, Y.; Zhao, J.; Lu, P.; Wang, Z.; He, B. TacSuit: A Wearable Large-Area, Bioinspired Multimodal Tactile Skin for Collaborative Robots. IEEE Trans. Ind. Electron. 2024, 71, 1708–1717. [Google Scholar] [CrossRef]
- Abah, C.; Orekhov, A.L.; Johnston, G.L.; Yin, P.; Choset, H.; Simaan, N. A Multi-modal Sensor Array for Safe Human-Robot Interaction and Mapping. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; pp. 3768–3774. [Google Scholar] [CrossRef]
- Abah, C.; Orekhov, A.L.; Johnston, G.L.H.; Simaan, N. A Multi-Modal Sensor Array for Human–Robot Interaction and Confined Spaces Exploration Using Continuum Robots. IEEE Sensors J. 2022, 22, 3585–3594. [Google Scholar] [CrossRef] [PubMed]
- Collaborate Project. 2018. Available online: https://collaborate-project.eu/ (accessed on 5 December 2023).
- CySkin Project. 2015. Available online: https://www.cyskin.com/ (accessed on 5 December 2023).
- Maiolino, P.; Maggiali, M.; Cannata, G.; Metta, G.; Natale, L. A Flexible and Robust Large Scale Capacitive Tactile System for Robots. IEEE Sens. J. 2013, 13, 3910–3917. [Google Scholar] [CrossRef]
- Albini, A.; Grella, F.; Maiolino, P.; Cannata, G. Exploiting Distributed Tactile Sensors to Drive a Robot Arm Through Obstacles. IEEE Robot. Autom. Lett. 2021, 6, 4361–4368. [Google Scholar] [CrossRef]
- Albini, A.; Cannata, G. Pressure Distribution Classification and Segmentation of Human Hands in Contact with the Robot Body. Int. J. Robot. Res. 2020, 39, 668–687. [Google Scholar] [CrossRef]
- Sidiropoulos, A.; Dimeas, F.; Papageorgiou, D.; Prapavesis Semetzidis, T.; Doulgeri, Z.; Zanella, A.; Grella, F.; Sagar, K.; Jilich, M.; Albini, A.; et al. Safe and Effective Collaboration With a High-Payload Robot: A Framework Integrating Novel Hardware and Software Modules. IEEE Robot. Autom. Mag. 2023, 2–11. [Google Scholar] [CrossRef]
- Sestosenso Project. 2022. Available online: https://sestosenso.eu/ (accessed on 5 December 2023).
- VL53L8CX—Low-Power High-Performance 8 × 8 Multizone Time-of-Flight Sensor (ToF). 2023. Available online: https://www.st.com/en/imaging-and-photonics-solutions/vl53l8cx.html (accessed on 5 December 2023).
- Ding, Y.; Thomas, U. Collision Avoidance with Proximity Servoing for Redundant Serial Robot Manipulator. In Proceedings of the 2020 International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; pp. 10249–10255. [Google Scholar] [CrossRef]
- Albini, A.; Denei, S.; Cannata, G. Enabling Natural Human-Robot Physical Interaction Using a Robotic Skin Feedback and a Prioritized Tasks Robot Control Architecture. In Proceedings of the 2017 IEEE-RAS 17th International Conference on Humanoid Robotics (Humanoids), Birmingham, UK, 15–17 November 2017; pp. 99–106. [Google Scholar] [CrossRef]
- Datasheet VL53L8CX—Low-Power High-Performance 8 × 8 Multizone Time-of-Flight Sensor (ToF). 2023. Available online: https://www.st.com/resource/en/datasheet/vl53l8cx.pdf (accessed on 5 December 2023).
Table 1. Blind volume V_tot for different numbers of arrays and ToF sensors per array.

| # ToFs | 1 array | 2 arrays (aligned) | 2 arrays (misaligned) | 3 arrays (aligned) | 3 arrays (misaligned) | 4 arrays (aligned) | 4 arrays (misaligned) |
|---|---|---|---|---|---|---|---|
| 8 | 2.421 | 1.456 | 1.417 | 1.379 | 1.231 | 1.365 | 1.159 |
| 10 | 2.253 | 1.162 | 1.173 | 1.067 | 1.067 | 1.049 | 1.047 |
| 12 | 2.240 | 1.143 | 1.142 | 1.044 | 1.044 | 1.024 | 1.024 |
| 14 | 2.234 | 1.135 | 1.135 | 1.036 | 1.036 | 1.015 | 1.015 |
| 16 | 2.230 | 1.130 | 1.130 | 1.032 | 1.032 | 1.010 | 1.010 |
Distance statistics measured in four zones (cells 0, 18, 36, 54) with the target plane at 50, 500, and 1000 mm, with only ToF3 enabled (Config 1) and with all ToFs enabled (Config 2). Each cell reports two values, as in the original data.

| Target | Configuration | cell 0 | cell 18 | cell 36 | cell 54 |
|---|---|---|---|---|---|
| 50 mm | Config 1 (ToF3 enabled) | 55.583, 1.300 | 51.565, 1.1555 | 54.360, 1.273 | 53.866, 1.442 |
| 50 mm | Config 2 (all ToFs enabled) | 55.583, 1.299 | 51.565, 1.155 | 54.358, 1.272 | 53.866, 1.442 |
| 500 mm | Config 1 (ToF3 enabled) | 514.252, 23.42 | 500.40, 22.59 | 493.40, 22.23 | 485.01, 21.90 |
| 500 mm | Config 2 (all ToFs enabled) | 515.89, 23.58 | 509.85, 23.25 | 505.84, 23.01 | 487.09, 22.04 |
| 1000 mm | Config 1 (ToF3 enabled) | 988.70, 47.24 | 1010.63, 45.82 | 995.92, 44.88 | 980.68, 44.42 |
| 1000 mm | Config 2 (all ToFs enabled) | 994.30, 48.88 | 1012.78, 46.11 | 999.11, 45.09 | 982.95, 44.65 |
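The per-cell bias introduced by enabling the whole array can be read directly off the data above; as a quick sketch (variable names ours), the 500 mm rows give a worst-case shift of about 12 mm, roughly 2.5% of the target distance:

```python
# Mean distances (mm) per cell at the 500 mm target, from the table above
tof3_only = [514.252, 500.40, 493.40, 485.01]  # Config 1: only ToF3 enabled
all_tofs  = [515.89, 509.85, 505.84, 487.09]   # Config 2: all ToFs enabled

# Shift of each cell's reading caused by enabling the rest of the array
shifts = [b - a for a, b in zip(tof3_only, all_tofs)]
print([round(s, 2) for s in shifts])  # [1.64, 9.45, 12.44, 2.08]
print(round(max(shifts), 2))          # 12.44
```

The same computation applied to the 50 mm rows yields shifts well below the measurement dispersion, confirming that intra-array interference is negligible at short range.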
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Giovinazzo, F.; Grella, F.; Sartore, M.; Adami, M.; Galletti, R.; Cannata, G. From CySkin to ProxySKIN: Design, Implementation and Testing of a Multi-Modal Robotic Skin for Human–Robot Interaction. Sensors 2024, 24, 1334. https://doi.org/10.3390/s24041334