
Touching the Void -- Introducing CoST: Corpus of Social Touch

Published: 12 November 2014

Abstract

Touch behavior is of great importance during social interaction. To transfer the tactile modality from interpersonal interaction to other areas, such as Human-Robot Interaction (HRI) and remote communication, automatic recognition of social touch is necessary. This paper introduces CoST: Corpus of Social Touch, a collection containing 7805 instances of 14 different social touch gestures. The gestures were performed in three variations (gentle, normal and rough) on a sensor grid wrapped around a mannequin arm. Recognition of the rough variations of these 14 gesture classes using Bayesian classifiers and Support Vector Machines (SVMs) resulted in overall accuracies of 54% and 53%, respectively. Furthermore, this paper provides more insight into the challenges of automatic recognition of social touch gestures, including which gestures are easier and which are harder to recognize.
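To illustrate the kind of classification task the abstract describes, the sketch below trains an SVM on per-sample features extracted from sequences of pressure-grid frames. This is not the authors' pipeline: the data is a synthetic stand-in (the CoST corpus itself is not reproduced here), and the grid size, feature set and class structure are invented for illustration only.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for touch data: each sample is a sequence of
# 8x8 pressure-grid frames; the three toy "gesture" classes differ
# only in mean pressure (real social touch gestures are far subtler).
def make_sample(label):
    frames = rng.normal(loc=0.2 + 0.3 * label, scale=0.05, size=(30, 8, 8))
    return frames.clip(0.0, 1.0)

X, y = [], []
for label in range(3):
    for _ in range(100):
        frames = make_sample(label)
        # Simple per-sample summary features: mean, max and std of pressure.
        X.append([frames.mean(), frames.max(), frames.std()])
        y.append(label)

X_train, X_test, y_train, y_test = train_test_split(
    np.array(X), np.array(y), test_size=0.25, random_state=0, stratify=y)

clf = SVC(kernel="rbf").fit(X_train, y_train)
acc = clf.score(X_test, y_test)
print(f"accuracy: {acc:.2f}")
```

Because the synthetic classes are well separated, accuracy here is near perfect; on real touch data with 14 gesture classes and subtle inter-class differences, accuracies in the 50-60% range (as reported in the abstract) are a much harder target.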





Published In

ICMI '14: Proceedings of the 16th International Conference on Multimodal Interaction
November 2014
558 pages
ISBN:9781450328852
DOI:10.1145/2663204
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].


Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. social touch
  2. touch corpus
  3. touch gesture recognition

Qualifiers

  • Poster

Funding Sources

  • Dutch national program COMMIT

Conference

ICMI '14

Acceptance Rates

ICMI '14 paper acceptance rate: 51 of 127 submissions (40%).
Overall acceptance rate: 453 of 1,080 submissions (42%).



Cited By

  • FEELing (key)Pressed: Implicit Touch Pressure Bests Brain Activity for Modeling Emotion Dynamics in the Space Between Stressed & Relaxed. IEEE Transactions on Haptics, 17(3):310-318, Jul 2024. DOI: 10.1109/TOH.2023.3308059
  • Advancements in Tactile Hand Gesture Recognition for Enhanced Human-Machine Interaction. 2024 IEEE International Symposium on Robotic and Sensors Environments (ROSE), pages 1-8, Jun 2024. DOI: 10.1109/ROSE62198.2024.10590799
  • The tactile dimension: a method for physicalizing touch behaviors. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, pages 1-15, Apr 2023. DOI: 10.1145/3544548.3581137
  • Mediated Social Touch With Mobile Devices: A Review of Designs and Evaluations. IEEE Transactions on Haptics, 16(4):785-804, Oct 2023. DOI: 10.1109/TOH.2023.3327506
  • Discerning Affect From Touch and Gaze During Interaction With a Robot Pet. IEEE Transactions on Affective Computing, 14(2):1598-1612, Apr 2023. DOI: 10.1109/TAFFC.2021.3094894
  • Recognizing Social Touch Gestures using Optimized Class-weighted CNN-LSTM Networks. 2023 32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), pages 2024-2029, Aug 2023. DOI: 10.1109/RO-MAN57019.2023.10309595
  • Touch Technology in Affective Human-, Robot-, and Virtual-Human Interactions: A Survey. Proceedings of the IEEE, 111(10):1333-1354, Oct 2023. DOI: 10.1109/JPROC.2023.3272780
  • Clustering Social Touch Gestures for Human-Robot Interaction. Social Robotics, pages 53-67, Dec 2023. DOI: 10.1007/978-981-99-8715-3_6
  • Touch Gesture and Emotion Recognition Using Decomposed Spatiotemporal Convolutions. IEEE Transactions on Instrumentation and Measurement, 71:1-9, 2022. DOI: 10.1109/TIM.2022.3147338
  • Multitask Touch Gesture and Emotion Recognition Using Multiscale Spatiotemporal Convolutions With Attention Mechanism. IEEE Sensors Journal, 22(16):16190-16201, Aug 2022. DOI: 10.1109/JSEN.2022.3187776
