

Ah-Aloud Method to Comprehend Time-Series Emotion Observation During Gameplay: An Initial Investigation with Japanese Speakers

Published: 15 October 2024

Abstract

Emotional dimensions strongly shape user experience (UX) and are essential when assessing interactions with entertainment systems. Relying solely on subjective post-event measures, however, fails to capture the dynamic emotional changes that occur during the experience itself. Kawashima and Watanabe introduced the "ah-aloud" method, which enables real-time emotion observation by having participants vocalize "ah" while experiencing a system. The method nonetheless remains conceptual and preliminary, lacking specific experimental procedures, analysis methods, and guidelines for use. Our study delineates the requirements for applying the "ah-aloud" method in practice and validates its potential for game experience assessment. The results indicate that the method can capture time-series emotional changes during gameplay. Based on these findings, we discuss its utility for system evaluation and its reliability as an evaluation methodology.
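The paper does not publish its analysis pipeline, but the idea of turning "ah" vocalizations into a time series can be sketched. The following is a minimal, hypothetical illustration (all function names are ours, not the authors'): given already-segmented "ah" bursts with timestamps, it extracts two acoustic features classically linked to vocal emotion, fundamental frequency (via autocorrelation peak picking) and RMS intensity, yielding one (time, F0, intensity) point per vocalization.

```python
# Illustrative sketch only; this is NOT the authors' published method.
# Assumes "ah" events are already detected as (timestamp, samples) pairs.
import math

def rms(samples):
    """Root-mean-square amplitude, a rough proxy for vocal intensity."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def estimate_f0(samples, sr, fmin=80.0, fmax=400.0):
    """Estimate fundamental frequency by picking the autocorrelation peak
    within a plausible F0 range for an adult "ah" vocalization."""
    lo, hi = int(sr / fmax), int(sr / fmin)
    best_lag, best_corr = lo, float("-inf")
    for lag in range(lo, min(hi, len(samples) - 1) + 1):
        corr = sum(samples[i] * samples[i + lag]
                   for i in range(len(samples) - lag))
        if corr > best_corr:
            best_corr, best_lag = corr, lag
    return sr / best_lag

def ah_time_series(events, sr):
    """events: list of (timestamp_sec, samples) per detected "ah" burst.
    Returns a time series of (timestamp, f0_hz, intensity) tuples."""
    return [(t, estimate_f0(x, sr), rms(x)) for t, x in events]

# Usage with a synthetic 220 Hz "ah" burst at t = 3.2 s:
sr = 16000
burst = [math.sin(2 * math.pi * 220 * n / sr) for n in range(1600)]
series = ah_time_series([(3.2, burst)], sr)
```

Plotting such a series against gameplay events would be one plausible way to inspect time-series emotional changes; real use would of course need voice-activity detection and per-speaker calibration.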

Supplemental Material

MP4 File
Supplemental video

References

[1]
Antarika Sen, Derek Isaacowitz, and Annett Schirmer. 2018. Age differences in vocal emotion perception: on the role of speaker age and listener sex. Cognition and Emotion 32, 6 (2018), 1189--1204. https://doi.org/10.1080/02699931.2017.1393399
[2]
Panagiotis D Bamidis, Christos Papadelis, Chrysoula Kourtidou-Papadeli, Costas Pappas, and Ana B. Vivas. 2004. Affective computing in the era of contemporary neurophysiology and health informatics. Interacting with Computers 16, 4 (2004), 715--721.
[3]
Rainer Banse and Klaus R Scherer. 1996. Acoustic profiles in vocal emotion expression. Journal of personality and social psychology 70, 3 (1996), 614.
[4]
Javier A Bargas-Avila and Kasper Hornbæk. 2011. Old wine in new bottles or novel challenges: a critical analysis of empirical studies of user experience. In Proceedings of the SIGCHI conference on human factors in computing systems. 2689--2698.
[5]
Pascal Belin, Sarah Fillion-Bilodeau, and Frédéric Gosselin. 2008. The Montreal Affective Voices: A validated set of nonverbal affect bursts for research on auditory affective processing. Behavior research methods 40, 2 (2008), 531--539.
[6]
Terri L. Bonebright, Jeri Thompson, and Daniel W. Leger. 1996. Gender stereotypes in the expression and perception of vocal affect. Sex Roles 34 (1996), 429--445. https://api.semanticscholar.org/CorpusID:145425283
[7]
Wolfram Boucsein. 2013. Electrodermal activity: Second edition. 1--618 pages. https://doi.org/10.1007/978-1-4614-1126-0
[8]
Anders Bruun and Simon Ahm. 2015. Mind the Gap! Comparing Retrospective and Concurrent Ratings of Emotion in User Experience Evaluation, Vol. 9296. https://doi.org/10.1007/978-3-319-22701-6_17
[9]
Anders Bruun, Effie Lai-Chong Law, Matthias Heintz, and Poul Svante Eriksen. 2016. Asserting real-time emotions through cued-recall: is it valid?. In Proceedings of the 9th Nordic Conference on Human-Computer Interaction. 1--10.
[10]
Florian Brühlmann and Gian-Marco Schmid. 2015. How to Measure the Game Experience? Analysis of the Factor Structure of Two Questionnaires. https://doi.org/10.1145/2702613.2732831
[11]
Antonio R Damasio. 1999. The feeling of what happens: Body and emotion in the making of consciousness. Houghton Mifflin Harcourt.
[12]
Charles Darwin. 1965. The Expression of the Emotions in Man and Animals. University of Chicago Press, Chicago, IL. Original work published 1872.
[13]
Jose M. R. Delgado. 1973. Emotions: Introduction to General Psychology: A Self-Selection Textbook (2nd ed.). Brown.
[14]
Damien Dupré, Anna Tcherkassof, and Dubois Michel. 2015. Emotions Triggered by Innovative Products A Multicomponential Approach of Emotions for User eXperience Tools. https://doi.org/10.1109/ACII.2015.7344657
[15]
Paul Ekman and Wallace V Friesen. 1978. Facial action coding system. Environmental Psychology & Nonverbal Behavior (1978).
[16]
Paul Ekman, Wallace V Friesen, Maureen O'sullivan, Anthony Chan, Irene Diacoyanni-Tarlatzis, Karl Heider, Rainer Krause, William Ayhan LeCompte, Tom Pitcairn, Pio E Ricci-Bitti, et al. 1987. Universals and cultural differences in the judgments of facial expressions of emotion. Journal of personality and social psychology 53, 4 (1987), 712.
[17]
K. A. Ericsson and H. A. Simon. 1993. Protocol analysis: Verbal reports as data (Revised Edition). (1993).
[18]
Joseph L. Fleiss and Jacob Cohen. 1973. The Equivalence of Weighted Kappa and the Intraclass Correlation Coefficient as Measures of Reliability. Educational and Psychological Measurement 33, 3 (1973), 613--619. https://doi.org/10.1177/001316447303300309
[19]
Jodi Forlizzi and Katja Battarbee. 2004. Understanding experience in interactive systems (DIS '04). Association for Computing Machinery, New York, NY, USA, 261--268. https://doi.org/10.1145/1013115.1013152
[20]
Tehmina Hafeez, Sanay Muhammad Umar Saeed, Aamir Arsalan, Syed Muhammad Anwar, Muhammad Usman Ashraf, and Khalid Alsubhi. 2021. EEG in game user analysis: A framework for expertise classification during gameplay. Plos one 16, 6 (2021), e0246913.
[21]
John Heritage. 1984. A Change-of-State Token and Aspects of its Sequential Placement. In J. Maxwell Atkinson and John Heritage (eds.), Structures of Social Action (1984).
[22]
Ernest R. Hilgard, Richard C. Atkinson, and Rita L. Atkinson. 1979. Introduction to psychology (7th ed.). Harcourt Brace Jovanovich, New York.
[23]
Toru Hishinuma. 2005. [Functions of the Chinese interjection "aiya"] Kandōshi "aiya" no kinō: Rōsha wageki ni okeru yōhō. Sodai China Review (Mar 2005), 1--12. Issue 8.
[24]
Kenneth Holmqvist and Richard Andersson. 2017. Eye-tracking: A comprehensive guide to methods, paradigms and measures.
[25]
S. Asif Hussain and Ahlam Al Balushi. 2020. A real time face emotion classification and recognition using deep learning model. Journal of Physics: Conference Series 1432 (01 2020), 012087. https://doi.org/10.1088/1742-6596/1432/1/012087
[26]
George L Huttar. 1968. Relations between prosodic variables and emotions in normal American English utterances. Journal of Speech and Hearing Research 11, 3 (1968), 481--487.
[27]
Carroll E. Izard. 1971. The Face of Emotion. Appleton-Century-Crofts, New York.
[28]
Carroll Ellis Izard and Peter B Read. 1982. Measuring emotions in infants and children: based on seminars sponsored by the Committee on Social and Affective Development During Childhood of the Social Science Research Council. Vol. 1. Cambridge University Press.
[29]
Daniel Johnson, M. John Gardner, and Ryan Perry. 2018. Validation of two game experience scales: the player experience of need satisfaction (PENS) and game experience questionnaire (GEQ). International Journal of Human-Computer Studies 118 (2018), 38--46.
[30]
Daniel Johnson, Lennart E Nacke, and Peta Wyeth. 2015. All about that base: differing player experiences in video game genres and the unique case of moba games. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. 2265--2274.
[31]
Daniel Johnson, Christopher Watling, John Gardner, and Lennart E Nacke. 2014. The edge of glory: the relationship between metacritic scores and player experience. In Proceedings of the first ACM SIGCHI annual symposium on Computer-human interaction in play. 141--150.
[32]
Vladimir Jovanovic et al. 2004. The form, position and meaning of interjections in English. Facta Universitatis, Linguistics and Literature 3, 1 (2004), 17--28.
[33]
Daniel Kahneman. 2000. Experienced utility and objective happiness: A moment-based approach. In Choices, Values, and Frames, D. Kahneman and A. Tversky (Eds.). 673--692.
[34]
Takuma Kakami, Hideaki Kuzuoka, Etsuko Harada, Shinnosuke Tanaka, et al. 2019. [Examination of a Method for Estimating User Confusion from Voice Information During Device Operation] Yūza no kikisousazi no onseijōhō wo motiita tomadoisuiteishuhō no kentō. Research Report Human-Computer Interaction (HCI) 2019, 10 (2019), 1--7.
[35]
Seth Kaplan, Reeshad Dalal, and Joseph Luchman. 2013. Measurement of emotions. 61--75.
[36]
Evangelos Karapanos, John Zimmerman, Jodi Forlizzi, and Jean-Bernard Martens. 2009. User experience over time: An initial framework. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '09). 729--738. https://doi.org/10.1145/1518701.1518814
[37]
Takashi Kato. 1986. What 'question-asking protocols' can say about the user interface. International Journal of Man-Machine Studies 25, 6 (1986), 659--673.
[38]
Takuya Kawashima and Keita Watanabe. 2022. "Ah-aloud": Method for Evaluating Cognitive Processes Occurring During Tasks from Vocal Information. In 2022 8th International HCI and UX Conference in Indonesia (CHIuXiD), Vol. 1. 42--46. https://doi.org/10.1109/CHIuXiD57244.2022.10009797
[39]
Takuya Kawashima and Keita Watanabe. 2022. [Ah-aloud method: Proposal and validation of a method to evaluate psychological processes during an experience using the phonetic information "ah"] A araudo hō: taiken chū no shinri purosesu wo 'a' no onsei jōhō de hyōka suru shuhō no teian to kenshō. Proceedings of Entertainment Computing Symposium 2022 (Aug 2022), 178--183.
[40]
Takuya Kawashima and Keita Watanabe. 2023. ["Ah-aloud": Proposal and Basic Study of a Method for Observing Real-time Emotions During an Experience] A araudo hō: taiken chū no riarutaimu na kanjō no kansoku shuhō to sono kiso kentō. IPSJ SIG Technical Report (2023).
[41]
Paul R Kleinginna Jr and Anne M Kleinginna. 1981. A categorized list of emotion definitions, with suggestions for a consensual definition. Motivation and emotion 5, 4 (1981), 345--379.
[42]
Takeshi Kohno. 2019. [Emotives and Intonation Orchestrated in Japanese: Affective Relevance Modality] Nihongo ni okeru kandōshi to intonēshon no kōkyō: kanrensei modariti no fuzei. Otsuma Women's University annual report. Humanities and social sciences 51 (2019), 226--207.
[43]
Peter Lang. 1980. Behavioral treatment and bio-behavioral assessment: Computer applications. Technology in mental health care delivery systems (1980), 119--137.
[44]
Peter J Lang. 1995. The emotion probe: Studies of motivation and attention. American psychologist 50, 5 (1995), 372.
[45]
Petri Laukka and Hillary Anger Elfenbein. 2021. Cross-Cultural Emotion Recognition and In-Group Advantage in Vocal Expression: A Meta-Analysis. Emotion Review 13, 1 (2021), 3--11. https://doi.org/10.1177/1754073919897295
[46]
Adi Lausen and Anne Schacht. 2018. Gender Differences in the Recognition of Vocal Emotions. Frontiers in Psychology 9 (06 2018), 882. https://doi.org/10.3389/fpsyg.2018.00882
[47]
Richard S. Lazarus. 1975. A cognitively oriented psychologist looks at biofeedback. American Psychologist 30, 5 (1975), 553--561. https://api.semanticscholar.org/CorpusID:39145218
[48]
Irene Lopatovska and Ioannis Arapakis. 2011. Theories, methods and current research on emotions in library and information science, information retrieval and human-computer interaction. Inf. Process. Manage. 47 (07 2011), 575--592. https://doi.org/10.1016/j.ipm.2010.09.001
[50]
Murugappan M. and Mutawa A. 2021. Facial geometric feature extraction based emotional expression classification using machine learning algorithms. PLOS ONE 16, 2 (02 2021), 1--20. https://doi.org/10.1371/journal.pone.0247131
[51]
Regan L Mandryk, Kori M Inkpen, and Thomas W Calvert. 2006. Using psychophysiological techniques to measure user experience with entertainment technologies. Behaviour & information technology 25, 2 (2006), 141--158.
[52]
Anmin Mao. 2020. A Comparative Study of Interjections in Chinese and English. Open Journal of Modern Linguistics 10 (01 2020), 315--320. https://doi.org/10.4236/ojml.2020.104018
[53]
Albert Mehrabian. 1995. Framework for a comprehensive description and measurement of emotional states. Genetic, Social, and General Psychology Monographs 121, 3 (1995), 339--361. https://api.semanticscholar.org/CorpusID:27427769
[54]
Elisa D Mekler, Julia Ayumi Bopp, Alexandre N Tuch, and Klaus Opwis. 2014. A systematic review of quantitative studies on the enjoyment of digital entertainment games. In Proceedings of the SIGCHI conference on human factors in computing systems. 927--936.
[55]
John R Millenson and Julian C Leslie. 1967. Principles of behavioral analysis. Macmillan New York.
[56]
Richard E Nisbett and Timothy D Wilson. 1977. The halo effect: Evidence for unconscious alteration of judgments. Journal of personality and social psychology 35, 4 (1977), 250.
[57]
Don Norman. 2007. Emotional design: Why we love (or hate) everyday things. Basic books.
[58]
Donald A. Norman. 2009. The way I see it: Memory is more important than actuality. Interactions 16, 2 (mar 2009), 24--26. https://doi.org/10.1145/1487632.1487638
[59]
Kent L Norman. 2013. Geq (game engagement/experience questionnaire): a review of two papers. Interacting with computers 25, 4 (2013), 278--283.
[60]
Takuya Oka, Takuya Kawashima, Daichi Hayashi, and Keita Watanabe. 2021. [Design and development of video games for ease of use and standardization of research] Kenkyū riyō shi yasuku hyōjun sei wo mezashita bideo gēmu no sekkei to kaihatsu. Proceedings of Entertainment Computing Symposium 2021 (Aug 2021), 181--186.
[61]
Marco Pasch and Monica Landoni. 2011. Recognizing Bodily Expression of Affect in User Tests. 264--271. https://doi.org/10.1007/978-3-642-24571-8_29
[62]
Ingrid Pettersson, Florian Lachner, Anna-Katharina Frison, Andreas Riener, and Andreas Butz. 2018. A Bermuda triangle? A Review of method application and triangulation in user experience evaluation. In Proceedings of the 2018 CHI conference on human factors in computing systems. 1--16.
[63]
R Plutchik. 1962. The emotions: Facts, theories and a new model. University Press of America.
[64]
Karolien Poels, Yvonne A. W. de Kort, and Wijnand A. IJsselsteijn. 2007. D3.3: Game Experience Questionnaire: development of a self-report measure to assess the psychological impact of digital games. (2007).
[65]
Kathrin Pollmann, Victoria Sinram, Nora Fronemann, and Mathias Vukelic. 2018. Can We Distinguish Pragmatic from Hedonic User Experience Qualities with Implicit Measures?. In Design, User Experience, and Usability: Theory and Practice: 7th International Conference, DUXU 2018, Held as Part of HCI International 2018, Las Vegas, NV, USA, July 15--20, 2018, Proceedings, Part I 7. Springer, 509--527.
[66]
Andrew Przybylski, Richard Ryan, and C Rigby. 2009. The Motivating Role of Violence in Video Games. Personality & social psychology bulletin 35 (03 2009), 243--59. https://doi.org/10.1177/0146167208327216
[67]
Regan L. Mandryk, Kori M. Inkpen, and Thomas W. Calvert. 2006. Using psychophysiological techniques to measure user experience with entertainment technologies. Behaviour & Information Technology 25, 2 (2006), 141--158. https://doi.org/10.1080/01449290500331156
[68]
Virpi Roto, Effie Lai-Chong Law, Arnold P.O.S. Vermeeren, and Jettie Hoonhout. 2011. User Experience White Paper -- Bringing clarity to the concept of user experience. https://api.semanticscholar.org/CorpusID:9315964
[69]
James A. Russell. 1994. Is there universal recognition of emotion from facial expression? A review of the cross-cultural studies. Psychological Bulletin 115, 1 (January 1994), 102--141. https://doi.org/10.1037/0033-2909.115.1.102
[70]
James A Russell. 1980. A circumplex model of affect. Journal of personality and social psychology 39, 6 (1980), 1161.
[71]
James A Russell, Anna Weiss, and Gerald A Mendelsohn. 1989. Affect grid: a single-item scale of pleasure and arousal. Journal of personality and social psychology 57, 3 (1989), 493.
[72]
Richard M. Ryan, C. Scott Rigby, and Andrew Przybylski. 2006. The motivational pull of video games: A self-determination theory approach. Motivation and Emotion 30 (2006), 344--360.
[73]
Pertti Saariluoma and Jussi Jokinen. 2014. Emotional Dimensions of User Experience: A User Psychological Analysis. International Journal of Human-Computer Interaction 30 (04 2014). https://doi.org/10.1080/10447318.2013.858460
[74]
Wataru Sato, Sylwia Hyniewska, Kazusa Minemoto, and Sakiko Yoshikawa. 2019. Facial expressions of basic emotions in Japanese laypeople. Frontiers in psychology 10 (2019), 259.
[75]
Disa Sauter, Frank Eisner, Andrew Calder, and Sophie Scott. 2010. Perceptual Cues in Nonverbal Vocal Expressions of Emotion. Quarterly Journal of Experimental Psychology 63 (04 2010), 2251--72. https://doi.org/10.1080/17470211003721642
[76]
Disa A. Sauter, Frank Eisner, Paul Ekman, and Sophie K. Scott. 2010. Cross-cultural recognition of basic emotions through nonverbal emotional vocalizations. Proceedings of the National Academy of Sciences 107, 6 (2010), 2408--2412. https://doi.org/10.1073/pnas.0908239106
[77]
Klaus R Scherer. 2003. Vocal communication of emotion: A review of research paradigms. Speech communication 40, 1--2 (2003), 227--256.
[78]
Klaus R Scherer. 2005. What are emotions? And how can they be measured? Social science information 44, 4 (2005), 695--729.
[79]
Mike Schmierbach, Qian Xu, Anne Oeldorf-Hirsch, and Frank E Dardis. 2012. Electronic friend or virtual foe: Exploring the role of competitive and cooperative multiplayer video game modes in fostering enjoyment. Media Psychology 15, 3 (2012), 356--371.
[80]
R.E. Smith, I.G. Sarason, and B.R. Sarason. 1982. Psychology: The Frontiers of Behavior. Harper & Row. https://books.google.co.jp/books?id=TTN5QgAACAAJ
[81]
Christina Sobin and Murray Alpert. 1999. Emotion in speech: The acoustic attributes of fear, anger, sadness, and joy. Journal of psycholinguistic research 28 (1999), 347--365.
[82]
Samaneh Soleimani and Effie Lai-Chong Law. 2017. What Can Self-Reports and Acoustic Data Analyses on Emotions Tell Us? 489--501. https://doi.org/10.1145/3064663.3064770
[83]
Eri Takayama, Takashi Nomaru, Yuki Yasunaka, Takeru Yamagishi, and Keita Watanabe. 2023. [Improvement of Emotional Observation Method and Analysis Method in the "Ah-aloud" Method] A araudo hō ni okeru kannjou kansoku syuhō no kaizen to bunseki syuhō no kentō. In Proceedings of Entertainment Computing Symposium 2023, Vol. 2023. 52--61.
[84]
Manfred Thüring and Sascha Mahlke. 2007. Usability, aesthetics and emotions in human--technology interaction. International journal of psychology 42, 4 (2007), 253--264.
[85]
Marieke Van Camp, Muriel De Boeck, Stijn Verwulgen, and Guido De Bruyne. 2019. EEG technology for UX evaluation: a multisensory perspective. In Advances in Neuroergonomics and Cognitive Engineering: Proceedings of the AHFE 2018 International Conference on Neuroergonomics and Cognitive Engineering, July 21--25, 2018, Loews Sapphire Falls Resort at Universal Studios, Orlando, Florida USA 9. Springer, 337--343.
[86]
David Watson, Lee Anna Clark, and Auke Tellegen. 1988. Development and validation of brief measures of positive and negative affect: the PANAS scales. Journal of personality and social psychology 54, 6 (1988), 1063.
[87]
Eric N Wiebe, Allison Lamb, Megan Hardy, and David Sharek. 2014. Measuring engagement in video game-based environments: Investigation of the User Engagement Scale. Computers in Human Behavior 32 (2014), 123--132.
[88]
Hanting Xie, Sam Devlin, Daniel Kudenko, and Peter Cowling. 2015. Predicting player disengagement and first purchase with event-frequency based data representation. 230--237. https://doi.org/10.1109/CIG.2015.7317919
[89]
Yaoye Yao. 2021. [On the recognition of words in the "ah" system of interjections] "Ah" kei kandōshi ni okeru go no nintei ni tsuite. Bulletin of the Graduate Division of Letters, Arts and Sciences of Waseda University 66 (2021), 209--220.
[90]
Thedy Yogasara, Vesna Popovic, Ben Kraal, and Marianella Chamorro-Koc. 2011. General characteristics of anticipated user experience (AUX) with interactive products. (10 2011).
[91]
Katherine S Young, Christine E Parsons, Richard T LeBeau, Benjamin A Tabak, Amy R Sewart, Alan Stein, Morten L Kringelbach, and Michelle G Craske. 2017. Sensing Emotion in Voices: Negativity Bias and Gender Differences in a Validation Study of the Oxford Vocal ('OxVoc') Sounds Database. Psychological Assessment 29, 8 (2017), 967--977. https://doi.org/10.1037/pas0000382 (c) 2016 APA, all rights reserved).
[92]
Tianyi Zhang, Abdallah El Ali, Chen Wang, Alan Hanjalic, and Pablo Cesar. 2020. Rcea: Real-time, continuous emotion annotation for collecting precise mobile video ground truth labels. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. 1--15.



    Published In

    Proceedings of the ACM on Human-Computer Interaction, Volume 8, Issue CHI PLAY
    October 2024, 1726 pages
    EISSN: 2573-0142
    DOI: 10.1145/3700823
    Editor: Jeff Nichols

    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. ah-aloud
    2. entertainment
    3. evaluating emotion
    4. experimental method
    5. think-aloud
    6. user experience

    Qualifiers

    • Research-article

    Funding Sources

    • Japan Society for the Promotion of Science (JSPS)
