
DOI: 10.1145/2207676.2207777

Designing effective gaze mechanisms for virtual agents

Published: 05 May 2012

Abstract

Virtual agents hold great promise in human-computer interaction with their ability to afford embodied interaction using nonverbal human communicative cues. Gaze cues are particularly important to achieve significant high-level outcomes such as improved learning and feelings of rapport. Our goal is to explore how agents might achieve such outcomes through seemingly subtle changes in gaze behavior and what design variables for gaze might lead to such positive outcomes. Drawing on research in human physiology, we developed a model of gaze behavior to capture these key design variables. In a user study, we investigated how manipulations in these variables might improve affiliation with the agent and learning. The results showed that an agent using affiliative gaze elicited more positive feelings of connection, while an agent using referential gaze improved participants' learning. Our model and findings offer guidelines for the design of effective gaze behaviors for virtual agents.
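
The abstract describes a gaze model organized around a small set of design variables (for example, where the agent directs its gaze and how strongly the head, rather than the eyes alone, participates in each gaze shift), with affiliative and referential gaze as the two manipulations studied. The sketch below is only a hypothetical illustration of how such design variables might be parameterized; the class, function names, and numeric values are assumptions for illustration, not the authors' published model.

```python
from dataclasses import dataclass

# Hypothetical sketch only: the parameter names and values below are
# illustrative assumptions, not the model published in the paper.

@dataclass
class GazeShift:
    target: str             # what the agent looks at, e.g. "partner_face" or an object id
    head_alignment: float   # 0.0 = eyes-only shift, 1.0 = full head reorientation
    hold_seconds: float     # how long the gaze is held before shifting away

def affiliative_gaze(partner: str) -> GazeShift:
    """Gaze directed at the partner's face with strong head alignment,
    intended to signal attention and build feelings of connection."""
    return GazeShift(target=f"{partner}_face", head_alignment=0.9, hold_seconds=2.5)

def referential_gaze(object_id: str) -> GazeShift:
    """Gaze directed at an object being referenced in speech, cueing the
    listener's visual attention toward it to support learning."""
    return GazeShift(target=object_id, head_alignment=0.6, hold_seconds=1.2)

if __name__ == "__main__":
    print(affiliative_gaze("participant"))
    print(referential_gaze("diagram_region"))
```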




    Published In

    CHI '12: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
    May 2012
    3276 pages
    ISBN: 9781450310154
    DOI: 10.1145/2207676
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. affiliation
    2. gaze
    3. learning
    4. nonverbal behavior
    5. virtual agents

    Qualifiers

    • Research-article

    Conference

    CHI '12

    Acceptance Rates

    Overall Acceptance Rate 6,199 of 26,314 submissions, 24%


