DOI: 10.1145/3661790.3661791

EMiRAs-Empathic Mixed Reality Agents

Published: 26 June 2024

Abstract

In recent years, there has been a growing body of research at the intersection of Mixed Reality (MR), Empathic Computing (EC), and agent technologies. Despite this trend, a unified theoretical framework to guide such research remains elusive. This paper introduces the concept of Empathic Mixed Reality Agents (EMiRAs), emerging from the convergence of Empathic Agent (EA), Mixed Reality Agent (MiRA), and Empathic Mixed Reality (EMR). We present the Corporeal Presence and Interactive Capacity (CPIC) matrix as a tool for examining EMiRAs-related studies, enabling systematic exploration of agents’ embodiment and environmental interaction capabilities. By conducting literature reviews organized within the CPIC matrix, we investigate the current landscape of EMiRAs research. Additionally, we discuss the challenges and opportunities inherent in developing EMiRAs. This work contributes to laying the groundwork for future advancements in the field by providing a comprehensive framework and analysis of EMiRAs-related research endeavors.
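
The CPIC matrix itself is not reproduced on this page, but its two axes echo the earlier MiRA taxonomy of Holz et al. (2011), which characterizes mixed reality agents by corporeal presence (how the agent is embodied) and interactive capacity (which environments, virtual or physical, it can sense and act on). As a minimal sketch of how such a matrix could organize a literature review, the Python below groups studies into matrix cells; the axis values, the Study record, and the sample codings are illustrative assumptions, not the paper's actual coding scheme.

    from collections import defaultdict
    from dataclasses import dataclass
    from enum import Enum, auto

    class CorporealPresence(Enum):
        """Assumed axis: how the agent is embodied."""
        VIRTUAL = auto()   # rendered virtual character
        PHYSICAL = auto()  # physical device or robot body
        HYBRID = auto()    # combined virtual and physical embodiment

    class InteractiveCapacity(Enum):
        """Assumed axis: which environments the agent can sense and act on."""
        VIRTUAL_ONLY = auto()
        PHYSICAL_ONLY = auto()
        BOTH = auto()

    @dataclass(frozen=True)
    class Study:
        citation: str
        presence: CorporealPresence
        capacity: InteractiveCapacity

    def build_matrix(corpus):
        """Group studies into CPIC cells keyed by (presence, capacity)."""
        matrix = defaultdict(list)
        for study in corpus:
            matrix[(study.presence, study.capacity)].append(study.citation)
        return matrix

    # Hypothetical codings, for illustration only.
    corpus = [
        Study("virtual poster presenter", CorporealPresence.VIRTUAL,
              InteractiveCapacity.VIRTUAL_ONLY),
        Study("robot with AR overlays", CorporealPresence.HYBRID,
              InteractiveCapacity.BOTH),
    ]

    for cell, citations in build_matrix(corpus).items():
        print(cell, "->", citations)

Grouping studies by cell this way makes sparsely populated regions of the matrix immediately visible, which is plausibly how the abstract's "systematic exploration" of embodiment and environmental interaction capabilities proceeds.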

Published In

EmpathiCH '24: Proceedings of the 3rd Empathy-Centric Design Workshop: Scrutinizing Empathy Beyond the Individual
May 2024
75 pages
ISBN: 9798400717888
DOI: 10.1145/3661790

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. Agent
  2. Empathy
  3. Mixed Reality

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

EmpathiCH 2024
