Talking Through the Eyes: User Experience Design for Eye Gaze Redirection in Live Video Conferencing

  • Conference paper
Human-Computer Interaction. Interaction Techniques and Novel Applications (HCII 2021)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 12763)


Abstract

In the post-COVID-19 era, more institutions rely on video conferencing (VC). In VC conversations, however, we found that people have difficulty concentrating on the conversation. The cause is the physical arrangement of the camera and the display: because the webcam records a person who is looking at the monitor rather than into the camera, participants appear to be gazing elsewhere while they talk. In other words, VC overlooks eye contact, a key non-verbal element of face-to-face conversation. Most prior work has approached this problem from a purely technical standpoint. This study instead presents eye gaze redirection (ER) guidelines from a user experience perspective: a function for selecting the direction of the face, a function for selecting a 3D avatar face generated in four morphing stages, and a guideline for intentionally gazing at the camera with the help of a teleprompter function.
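As a rough illustration of the staged avatar morphing named in the abstract, the following is a minimal sketch, assuming Python with OpenCV. The whole-frame alpha blending, the avatar_face.png asset, and the morph_stage helper are illustrative assumptions, not the authors' implementation (which would morph facial features rather than blend entire frames).

# Illustrative sketch (assumption, not the paper's method): a four-stage blend
# between the live webcam frame and a pre-rendered 3D avatar face, mimicking the
# "select a 3D avatar face in four morphing stages" guideline.
import cv2

STAGES = [0.0, 1 / 3, 2 / 3, 1.0]  # stage 0 = real face ... stage 3 = full avatar

def morph_stage(webcam_frame, avatar_frame, stage_index):
    """Blend the live frame toward the avatar face for the selected stage."""
    alpha = STAGES[stage_index]
    # cv2.addWeighted computes alpha*avatar + (1 - alpha)*webcam, pixel-wise.
    return cv2.addWeighted(avatar_frame, alpha, webcam_frame, 1.0 - alpha, 0)

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)                # default webcam
    avatar = cv2.imread("avatar_face.png")   # hypothetical pre-rendered avatar image
    ok, frame = cap.read()
    if ok and avatar is not None:
        avatar = cv2.resize(avatar, (frame.shape[1], frame.shape[0]))
        blended = morph_stage(frame, avatar, stage_index=2)  # third of four stages
        cv2.imwrite("morphed_stage2.png", blended)
    cap.release()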



Acknowledgments

This research was partially supported by SK Telecom.

Author information


Corresponding authors

Correspondence to Jeongyun Heo or Jiyoon Lee.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Park, W., Heo, J., Lee, J. (2021). Talking Through the Eyes: User Experience Design for Eye Gaze Redirection in Live Video Conferencing. In: Kurosu, M. (eds) Human-Computer Interaction. Interaction Techniques and Novel Applications. HCII 2021. Lecture Notes in Computer Science, vol 12763. Springer, Cham. https://doi.org/10.1007/978-3-030-78465-2_7

  • DOI: https://doi.org/10.1007/978-3-030-78465-2_7

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-78464-5

  • Online ISBN: 978-3-030-78465-2

  • eBook Packages: Computer Science, Computer Science (R0)
