Abstract
This paper investigates whether social distances arise when people interact with human-like or animal-like 3D models displayed stereoscopically without wearable devices. Social robots with human-like bodies offer advantages in physical communication through gestures, but they can also pose a risk of harm to humans, making it difficult to maintain appropriate social distances. To address this challenge, the authors propose using a 3D display to present virtual robots, allowing observers to perceive their movements without any physical action. The study explores participants’ impressions of virtual heads, categorized as either human-like or animal-like, through a survey of 50 Japanese participants recruited via crowdsourcing. Results indicate that human-like heads were perceived as more physically present, whereas animal-like heads were perceived as more familiar. These findings inform the design of comfortable, effective long-term human-robot interactions, underscoring the importance of appearance and approachability in social robot design. By leveraging non-verbal cues and alternative communication methods such as virtual displays, researchers aim to enhance user experiences and optimize social interactions with robots in settings such as homes and offices.
Notes
- 1.
“Mamehinata” was designed and developed by Kameyama and Mochiyama-kingyo, https://mukumi.booth.pm.
Acknowledgements
This work was supported by JSPS KAKENHI Grant Numbers JP21H03569 and JP23H03896, and JST SPRING Grant Number JPMJSP2124.
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Minegishi, T., Osawa, H. (2024). Exploring Changes in Social Distance and Participant Discomfort with Virtual Robot Head and Visual Familiarity. In: Kurosu, M., Hashizume, A. (eds) Human-Computer Interaction. HCII 2024. Lecture Notes in Computer Science, vol 14685. Springer, Cham. https://doi.org/10.1007/978-3-031-60412-6_11
Print ISBN: 978-3-031-60411-9
Online ISBN: 978-3-031-60412-6