DOI: 10.1109/I2MTC43012.2020.9128751
Gesture-based Contactless Control of Mobile Manipulators using Capacitive Sensing

Published: 25 May 2020

Abstract

Contactless control of serial and mobile manipulators is of interest in highly sensitive environments such as clean rooms and operating rooms, where contamination of surrounding materials must be avoided, and in collaborative robotics, where operation in shared workspaces must be safe and intuitive. We present a contactless control scheme based on capacitive sensing that enables intuitive control of robot manipulators. In contrast to optical and vision-based systems, the capacitive sensor is robust against mechanical impact and dirt, and does not suffer from occlusions or poor lighting conditions. The sensor can be realized on a flexible substrate, which offers a variety of placement options, e.g. directly on a robot arm or integrated into the surface of a table or workplace. A comparatively simple model-based approach is used to detect gestures, avoiding the need for large training sets and allowing easy adaptation to various geometric constraints. The capabilities of the proposed system are demonstrated by controlling the end-effector velocity of a mobile manipulator in 3D task space, combined with a visualization of the system as feedback for the operator.
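The abstract's model-based idea can be illustrated with a minimal sketch: capacitance rises as a hand approaches an electrode, so a capacitance-weighted centroid over a small electrode array gives a lateral hand position, and the total capacitance change gives a proximity estimate, which can then be mapped to an end-effector velocity command. Note that this is not the authors' implementation; the 2×2 electrode layout, the gains, and the signal model below are all assumptions made for illustration only.

```python
import numpy as np

# Hypothetical 2x2 electrode layout (lateral positions in a normalized
# [-1, 1]^2 plane); the paper's actual sensor geometry is not reproduced here.
ELECTRODE_XY = np.array([[-1.0, -1.0], [1.0, -1.0], [-1.0, 1.0], [1.0, 1.0]])

def hand_state(caps, baseline):
    """Estimate lateral hand position and proximity from capacitance readings.

    Simple physical model: the capacitance increase over each electrode's
    baseline grows as the hand approaches, so the capacitance-weighted
    centroid approximates the lateral position and the summed increase
    approximates proximity. Returns None when no hand is detected.
    """
    delta = np.maximum(np.asarray(caps) - np.asarray(baseline), 0.0)
    total = delta.sum()
    if total < 1e-9:                      # no significant change: no hand
        return None
    xy = ELECTRODE_XY.T @ delta / total   # weighted centroid in [-1, 1]^2
    return xy, total

def velocity_command(state, gain_xy=0.05, gain_z=0.02, z_ref=1.0):
    """Map the estimated hand state to an end-effector velocity (vx, vy, vz)."""
    if state is None:
        return np.zeros(3)                # hold position when no hand present
    xy, proximity = state
    vz = gain_z * (z_ref - proximity)     # approach/retract toward a reference
    return np.array([gain_xy * xy[0], gain_xy * xy[1], vz])
```

For example, `velocity_command(hand_state([2.0, 1.0, 1.0, 1.0], [1.0] * 4))` steers the end-effector toward the electrode with the strongest response; a real system would add filtering, baseline tracking, and a gesture vocabulary on top of this position estimate.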

References

[1]
S. Mühlbacher-Karrer, M. Brandstötter, D. Schett, and H. Zangl, “Contactless control of a kinematically redundant serial manipulator using tomographic sensors,” IEEE Robotics and Automation Letters, vol. 2, no. 2, pp. 562-569, April 2017.
[2]
T. Ende, S. Haddadin, S. Parusel, T. Wüsthoff, M. Hassenzahl, and A. Albu-Schäffer, “A Human-Centered Approach to Robot Gesture Based Communication Within Collaborative Working Processes,” in Proc. of the 2011 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, 2011.
[3]
T. Schlegl, T. Kröger, A. Gaschler, O. Khatib, and H. Zangl, “Virtual whiskers — highly responsive robot collision avoidance,” in 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, Nov 2013, pp. 5373-5379.
[4]
B. Mayton, L. LeGrand, and J. Smith, “An electric field pretouch system for grasping and co-manipulation,” in Robotics and Automation (ICRA), 2010 IEEE International Conference on, May 2010, pp. 831-838. [Online]. Available: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=5509658
[5]
S. Mühlbacher-Karrer, A. Gaschler, and H. Zangl, “Responsive fingers — capacitive sensing during object manipulation,” in 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Sep. 2015, pp. 4394-4401.
[6]
S. E. Navarro, S. Koch, and B. Hein, “3d contour following for a cylindrical end-effector using capacitive proximity sensors,” in IEEE/RSJ Internat. Conf. on Intell. Robots and Systems (IROS) 2016, 2016.
[7]
H. Alagi, A. Heiligl, S. E. Navarro, T. Kröger, and B. Hein, “Material recognition using a capacitive proximity sensor with flexible spatial resolution,” in 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Oct 2018, pp. 6284-6290.
[8]
Y. Ding, H. Zhang, and U. Thomas, “Capacitive proximity sensor skin for contactless material detection,” in 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Oct 2018, pp. 7179-7184.
[9]
S. E. Navarro, B. Hein, and H. Wörn, “Capacitive tactile proximity sensing: from signal processing to applications in manipulation and safe human-robot interaction,” in Soft Robotics. Springer, 2015, pp. 54-65.
[10]
A. Hoffmann, A. Poeppel, A. Schierl, and W. Reif, “Environment-aware proximity detection with capacitive sensors for human-robot-interaction,” in 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Oct 2016, pp. 145-150.
[11]
Z. Erickson, M. Collier, A. Kapusta, and C. C. Kemp, “Tracking human pose during robot-assisted dressing using single-axis capacitive proximity sensing,” IEEE Robotics and Automation Letters, vol. 3, no. 3, pp. 2245-2252, July 2018.
[12]
R. Stiefelhagen, C. Fügen, P. Gieselmann, H. Holzapfel, K. Nickel, and A. Waibel, “Natural Human-Robot Interaction using Speech, Head Pose and Gestures,” in Proc. of the 2004 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, 2004.
[13]
M. Sigalas, H. Baltzakis, and P. Trahanias, “Gesture recognition based on arm tracking for human-robot interaction,” in Proc. of the 2010 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, 2010.
[14]
M. Van den Bergh, D. Carton, R. De Nijs, N. Mitsou, C. Landsiedel, K. Kuehnlenz, D. Wollherr, L. Van Gool, and M. Buss, “Real-Time 3D Hand Gesture Interaction With a Robot for Understanding Directions from Humans,” in 2011 RO-MAN, 2011.
[15]
P. Kondaxakis, J. Pajarinen, and V. Kyrki, “Real-Time Recognition of Pointing Gestures for Robot to Robot Interaction,” in Proc. of the 2014 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS 2014), 2014.
[16]
A. Malima, E. Ozgur, and M. Cetin, “A fast algorithm for vision-based hand gesture recognition for robot control,” 2006 IEEE 14th Signal Processing and Communications Applications, pp. 1-4, 2006.
[17]
E. N. K. Kollorz, J. Penne, J. Hornegger, and A. Barke, “Gesture Recognition with a Time-Of-Flight Camera,” International Journal of Intelligent Systems Technologies and Applications, vol. 5, no. 3, pp. 334-343, 2008.
[18]
K. Ishii, S. Zhao, M. Inami, T. Igarashi, and M. Imai, “Designing laser gesture interface for robot control,” in Human-Computer Interaction – INTERACT 2009, T. Gross, J. Gulliksen, P. Kotzé, L. Oestreicher, P. Palanque, R. O. Prates, and M. Winckler, Eds. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009, pp. 479-492.
[19]
R. Wen, W.-L. Tay, B. P. Nguyen, C.-B. Chng, and C.-K. Chui, “Hand gesture guided robot-assisted surgery based on a direct augmented reality interface,” Computer Methods and Programs in Biomedicine, vol. 116, no. 2, pp. 68-80, 2014, special issue on new methods of human-robot interaction in medical practice. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0169260713004082
[20]
D. Xu, X. Wu, and Y. -L. Chen, “Online Dynamic Gesture Recognition for Human Robot Interaction,” Journal of Intelligent & Robotic Systems, vol. 77, pp. 583-596, 2015.
[21]
X. -H. Wu, M. -C. Su, and P. -C. Wang, “A Hand-Gesture-Based Control Interface for a Car-Robot,” in Proc. of the 2010 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, 2010.
[22]
X. Zhang, X. Chen, W. -h. Wang, J. -h. Yang, V. Lantz, and K. -q. Wang, “Hand gesture recognition and virtual game control based on 3D accelerometer and EMG sensors,” in Proceedings of the 14th international conference on Intelligent user interfaces, 2009.
[23]
C. Zhu and W. Sheng, “Wearable Sensor-Based Hand Gesture and Daily Activity Recognition for Robot-Assisted Living,” IEEE Trans. on Systems, Man and Cybernetics - Part A: Systems and Humans, vol. 41, no. 3, 2011.
[24]
M. T. Wolf, C. Assad, M. T. Vernacchia, J. Fromm, and H. L. Jethani, “Gesture-based robot control with variable autonomy from the JPL BioSleeve,” in 2013 IEEE International Conference on Robotics and Automation, May 2013, pp. 1160-1165.
[25]
E. Coronado, J. Villalobos, B. Bruno, and F. Mastrogiovanni, “Gesture-based robot control: Design challenges and evaluation with humans,” in 2017 IEEE International Conference on Robotics and Automation (ICRA), May 2017, pp. 2761-2767.
[26]
A. Mujibiya, X. Cao, D. S. Tan, D. Morris, S. N. Patel, and J. Rekimoto, “The Sound of Touch: On-Body Touch and Gesture Sensing Based on Transdermal Ultrasound Propagation,” in Proc. of the 2013 ACM international conference on Interactive tabletops and surfaces, 2013.
[27]
V. Villani, L. Sabattini, G. Riggio, C. Secchi, M. Minelli, and C. Fantuzzi, “A natural infrastructure-less human–robot interaction system,” IEEE Robotics and Automation Letters, vol. 2, no. 3, pp. 1640-1647, 2017.
[28]
A. Asokan, A. J. Pothen, and R. K. Vijayaraj, “ARMatron - A Wearable Gesture Recognition Glove: For Control of Robotic Devices in Disaster Management and Human Rehabilitation,” in Proc. of the 2016 Int. Conf. on Robotics and Automation for Humanitarian Applications (RAHA), 2016.
[29]
X. Pu, H. Guo, Q. Tang, J. Chen, L. Feng, G. Liu, X. Wang, Y. Xi, C. Hu, and Z. L. Wang, “Rotation Sensing and Gesture Control of a Robot Joint via Triboelectric Quantization Sensor,” Nano Energy, vol. 54, pp. 453-460, 2018.
[30]
J. Lien, N. Gillian, M. E. Karagozler, P. Amihood, C. Schwesig, E. Olson, H. Raja, and I. Poupyrev, “Soli: Ubiquitous gesture sensing with millimeter wave radar,” ACM Trans. Graph., vol. 35, no. 4, pp. 142:1-142:19, Jul. 2016. [Online]. Available: http://doi.acm.org/10.1145/2897824.2925953
[31]
L. -M. Faller, S. Mühlbacher-Karrer, and H. Zangl, “Inkjet-printing rapid prototyping of a robust and flexible capacitive touch panel,” in IEEE Sensors 2016, 2016.
[32]
S. Mühlbacher-Karrer, L. -M. Faller, H. Zangl, T. Schlegl, and M. Moser, “Short range capacitive proximity sensing,” in 2nd Workshop on Alternative Sensing for Robot Perception Beyond Laser and Vision, Hamburg, Germany, October 2015.
[33]
L. -M. Faller, J. P. Leitzke, and H. Zangl, “Design of a Fast, High-Resolution Sensor Evaluation Platform applied to a Capacitive Position Sensor for a Micromirror,” in Proc. of the IEEE I2MTC 2017, 2017.
[34]
M. Weyrer, M. Brandstötter, and M. Husty, “Singularity avoidance control of a non-holonomic mobile manipulator for intuitive hand guidance,” Robotics, vol. 8, no. 1, 2019. [Online]. Available: http://www.mdpi.com/2218-6581/8/1/14

Cited By

View all
  • (2022) TangibleTouch: A Toolkit for Designing Surface-based Gestures for Tangible Interfaces. Proceedings of the Sixteenth International Conference on Tangible, Embedded, and Embodied Interaction, doi: 10.1145/3490149.3502263, pp. 1-14. Online publication date: 13-Feb-2022.



Published In

2020 IEEE International Instrumentation and Measurement Technology Conference (I2MTC)
May 2020
2205 pages

Publisher

IEEE Press


Qualifiers

  • Research-article
