DOI: 10.1145/3472307.3484162
Research article

Expression of Robot’s Emotion and Intention Utilizing Physical Positioning in Conversation

Published: 09 November 2021

Abstract

Robots increasingly provide services that involve everyday conversations with people. A robot's movement and positioning are effective means of enriching its expression, and both its distance from the human and its orientation have been reported as important parameters. In multi-party dialogues, robots are expected to convey their impressions, emotions, and thoughts to other robots by appropriately adjusting their movement and positioning. We investigated how a robot's movement during a conversation affects the subsequent conversation, and what the moving robot expresses. We found that certain intentions and emotions of the robot could be expressed by moving forward and backward, respectively. As a next step, we plan to conduct interactions between subjects using a system that incorporates these forward and backward movements.


Cited By

  • (2024) What Kinds of Facial Self-Touches Strengthen Expressed Emotions? 2024 33rd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), 446-452. DOI: 10.1109/RO-MAN60168.2024.10731424. Online publication date: 26-Aug-2024.
  • (2023) Co-Designing with a Social Robot Facilitator: Effects of Robot Mood Expression on Human Group Dynamics. Proceedings of the 11th International Conference on Human-Agent Interaction, 22-29. DOI: 10.1145/3623809.3623820. Online publication date: 4-Dec-2023.


    Published In

    HAI '21: Proceedings of the 9th International Conference on Human-Agent Interaction
    November 2021
    447 pages
    ISBN:9781450386203
    DOI:10.1145/3472307
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. emotion expression
    2. human-robot-interaction
    3. nonverbal communication
    4. robot avatar

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Funding Sources

    • JST Moonshot R&D
    • Innovation Platform for Society 5.0
    • JSPS KAKENHI

    Conference

    HAI '21: International Conference on Human-Agent Interaction
    November 9-11, 2021
    Virtual Event, Japan

    Acceptance Rates

    Overall Acceptance Rate 121 of 404 submissions, 30%

