DOI: 10.1145/3613905.3650827
Work in Progress

Gender Differences and Social Design in Human-AI Collaboration: Insights from Virtual Cobot Interactions Under Varying Task Loads

Published: 11 May 2024

Abstract

This work explores the effects of users’ gender and the social design features of AI on human-like attributions, social impact, work performance, perceived workload, user experience, and further measures in Human-AI Interaction (HAII) under different task load conditions. Users executed sorting and dispatch tasks in collaboration with a virtual cobot. The degree of social gestalt of the cobot was varied via its ability to make small talk (i.e., talkative vs. non-talkative cobot), and task load was increased by adding a secondary task (i.e., high vs. low task load condition). Overall, the talkative cobot led to a more positive perception of the cobot and to increased social qualities, such as a sense of meaning and team membership, compared to the non-talkative cobot. A gender effect was particularly noteworthy: under high task load, the talkative cobot had a buffering effect for women but a distraction-conflict effect for men. When interacting with the talkative cobot, women found the high task load condition less stressful, whereas the talkative cobot was distracting for men. Our results highlight that social design choices and interindividual differences shape successful collaboration between humans and AI. The work also demonstrates the added value of systematic XR simulations for investigating and designing human-centered HAII (the eXtended AI approach).

Supplemental Material

MP4 File: Talk Video

      Published In

      CHI EA '24: Extended Abstracts of the 2024 CHI Conference on Human Factors in Computing Systems
      May 2024
      4761 pages
ISBN: 9798400703317
DOI: 10.1145/3613905
      Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Author Tags

      1. Artificial Intelligence
      2. Collaborative Robots
      3. Gender
      4. Human Agent Interaction
      5. Human-AI Collaboration
      6. Interindividual Differences

      Qualifiers

      • Work in progress
      • Research
      • Refereed limited

      Funding Sources

      • Bavarian State Ministry For Digital Affairs in the Project XR Hub
      • German Federal Ministry of Labour and Social Affairs

      Conference

      CHI '24

      Acceptance Rates

Overall Acceptance Rate: 5,779 of 22,566 submissions, 26%

