
DOI: 10.1145/2822013.2822014

Eye movement synthesis with 1/f pink noise

Published: 16 November 2015

Abstract

Eye movements are an essential part of non-verbal behavior. Non-player characters (NPCs), as they occur in many games, communicate with the player through dialogue and non-verbal behavior and can strongly influence the player experience or even gameplay. In this paper we propose a procedural model to synthesize the subtleties of eye motion. More specifically, our model adds microsaccadic jitter and pupil unrest, both modeled by 1/f or pink noise, to the standard main sequence. In a perceptual two-alternative forced-choice (2AFC) experiment we explore the perceived naturalness of different pink-noise parameters by comparing synthesized motions to rendered motion from recorded eye movements at extreme close shot and close shot distances. Our results show that, on average, data-driven motion is perceived as most natural, followed by parameterized pink noise, with motion lacking microsaccadic jitter consistently selected as the least natural in appearance.
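
The abstract describes the model only at a high level, and this page does not include its implementation. As a rough illustration of the kind of signal involved, the sketch below generates approximately 1/f (pink) noise with a Voss-type row-update scheme and adds it to a fixation as microsaccadic jitter and pupil unrest. The frame rate, amplitudes, fixation coordinates, and function names are illustrative assumptions, not the paper's parameters or code.

```python
import numpy as np

def pink_noise(n_samples, n_rows=16, rng=None):
    """Approximate 1/f (pink) noise via a Voss-type row-update scheme.

    Row k is redrawn from a white-noise source every 2**k samples;
    summing the rows yields a signal whose power spectrum falls off
    roughly as 1/f over about n_rows octaves.
    """
    rng = np.random.default_rng() if rng is None else rng
    rows = rng.standard_normal(n_rows)          # current value of each row
    out = np.empty(n_samples)
    for i in range(n_samples):
        for k in range(n_rows):
            if i % (1 << k) == 0:               # row k updates every 2**k samples
                rows[k] = rng.standard_normal()
        out[i] = rows.sum()
    return (out - out.mean()) / out.std()       # zero mean, unit variance

# Hypothetical use: perturb a fixation with microsaccadic jitter and add
# pupil unrest; the rates and amplitudes below are assumptions.
fs = 60.0                                       # animation frame rate (Hz)
n = int(2 * fs)                                 # a 2-second fixation
jitter_deg = 0.05                               # jitter amplitude (degrees)
gaze_x = 10.0 + jitter_deg * pink_noise(n)      # fixation at (10, 5) degrees
gaze_y = 5.0 + jitter_deg * pink_noise(n)
pupil_mm = 4.0 + 0.1 * pink_noise(n)            # pupil unrest around 4 mm diameter
```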

Supplementary Material

MP4 File (p47-duchowski.mp4)





    Published In

    MIG '15: Proceedings of the 8th ACM SIGGRAPH Conference on Motion in Games
    November 2015
    247 pages
    ISBN: 9781450339919
    DOI: 10.1145/2822013
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Author Tags

    1. character animation
    2. eye movements
    3. gaze synthesis
    4. microsaccades
    5. pupil unrest

    Qualifiers

    • Research-article

    Conference

    MIG '15: Motion in Games
    November 16 - 18, 2015
    Paris, France

    Cited By

    • (2024) S3: Speech, Script and Scene driven Head and Eye Animation. ACM Transactions on Graphics 43(4), 1-12. DOI: 10.1145/3658172. Online publication date: 19-Jul-2024
    • (2023) Real-Time Conversational Gaze Synthesis for Avatars. Proceedings of the 16th ACM SIGGRAPH Conference on Motion, Interaction and Games, 1-7. DOI: 10.1145/3623264.3624446. Online publication date: 15-Nov-2023
    • (2023) SP-EyeGAN: Generating Synthetic Eye Movement Data with Generative Adversarial Networks. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, 1-9. DOI: 10.1145/3588015.3588410. Online publication date: 30-May-2023
    • (2022) HPCGen: Hierarchical K-Means Clustering and Level Based Principal Components for Scan Path Genaration. 2022 Symposium on Eye Tracking Research and Applications, 1-7. DOI: 10.1145/3517031.3529625. Online publication date: 8-Jun-2022
    • (2022) EyeSyn: Psychology-inspired Eye Movement Synthesis for Gaze-based Activity Recognition. 2022 21st ACM/IEEE International Conference on Information Processing in Sensor Networks (IPSN), 233-246. DOI: 10.1109/IPSN54338.2022.00026. Online publication date: May-2022
    • (2021) Metadata-driven eye tracking for real-time applications. Proceedings of the 21st ACM Symposium on Document Engineering, 1-4. DOI: 10.1145/3469096.3474935. Online publication date: 16-Aug-2021
    • (2021) Fully Convolutional Neural Networks for Raw Eye Tracking Data Segmentation, Generation, and Reconstruction. 2020 25th International Conference on Pattern Recognition (ICPR), 142-149. DOI: 10.1109/ICPR48806.2021.9413268. Online publication date: 10-Jan-2021
    • (2020) MLGaze: Machine Learning-Based Analysis of Gaze Error Patterns in Consumer Eye Tracking Systems. Vision 4(2), 25. DOI: 10.3390/vision4020025. Online publication date: 7-May-2020
    • (2019) Perceptual Comparison of Procedural and Data-Driven Eye Motion Jitter. ACM Symposium on Applied Perception 2019, 1-5. DOI: 10.1145/3343036.3343130. Online publication date: 19-Sep-2019
    • (2019) Guiding gaze. Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications, 1-9. DOI: 10.1145/3314111.3319848. Online publication date: 25-Jun-2019
