DOI: 10.1145/3524273.3532895

PEM360: a dataset of 360° videos with continuous physiological measurements, subjective emotional ratings and motion traces

Published: 05 August 2022

Abstract

From a user perspective, immersive content can elicit more intense emotions than flat-screen presentations. From a system perspective, efficient storage and distribution remain challenging and must account for user attention. Understanding the connection between user attention, user emotions and immersive content is therefore key. In this article, we present PEM360, a new dataset of user head movements and gaze recordings in 360° videos, along with self-reported emotional ratings of valence and arousal and continuous physiological measurements of electrodermal activity and heart rate. The stimuli are selected to enable spatiotemporal analysis of the connection between content, user motion and emotion. We describe and provide a set of software tools to process the various data modalities, and introduce a joint instantaneous visualization of user attention and emotion which we name Emotional maps. We exemplify the new types of analyses the PEM360 dataset enables. The entire dataset and code are made available in a reproducible framework.




      Published In

      MMSys '22: Proceedings of the 13th ACM Multimedia Systems Conference
      June 2022
      432 pages
      ISBN:9781450392839
      DOI:10.1145/3524273
      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. 360° videos
      2. emotions
      3. physiological data
      4. user experiment

      Qualifiers

      • Research-article

      Funding Sources

      • AI4Media
      • EUR DS4H
      • UCA JEDI

      Conference

      MMSys '22: 13th ACM Multimedia Systems Conference
      June 14 - 17, 2022
      Athlone, Ireland

      Acceptance Rates

      Overall Acceptance Rate 176 of 530 submissions, 33%

      Article Metrics

      • Downloads (last 12 months): 77
      • Downloads (last 6 weeks): 7
      Reflects downloads up to 19 Nov 2024

      Cited By
      • (2024) AMD Journee: A Patient Co-designed VR Experience to Raise Awareness Towards the Impact of AMD on Social Interactions. Proceedings of the 2024 ACM International Conference on Interactive Media Experiences, 17--29. DOI: 10.1145/3639701.3656314 (7 June 2024)
      • (2024) Task-based methodology to characterise immersive user experience with multivariate data. 2024 IEEE Conference Virtual Reality and 3D User Interfaces (VR), 722--731. DOI: 10.1109/VR58804.2024.00092 (16 March 2024)
      • (2024) PLUME: Record, Replay, Analyze and Share User Behavior in 6DoF XR Experiences. IEEE Transactions on Visualization and Computer Graphics 30, 5, 2087--2097. DOI: 10.1109/TVCG.2024.3372107 (4 March 2024)
      • (2024) Comparing Continuous and Retrospective Emotion Ratings in Remote VR Study. 2024 16th International Conference on Quality of Multimedia Experience (QoMEX), 139--145. DOI: 10.1109/QoMEX61742.2024.10598301 (18 June 2024)
      • (2024) AVDOS-VR: Affective Video Database with Physiological Signals and Continuous Ratings Collected Remotely in VR. Scientific Data 11, 1. DOI: 10.1038/s41597-024-02953-6 (25 January 2024)
      • (2023) 360TripleView: 360-Degree Video View Management System Driven by Convergence Value of Viewing Preferences. 2023 IEEE International Symposium on Multimedia (ISM), 28--35. DOI: 10.1109/ISM59092.2023.00010 (11 December 2023)
      • (2022) Effects of emotions on head motion predictability in 360° videos. Proceedings of the 14th International Workshop on Immersive Mixed and Virtual Environment Systems, 37--43. DOI: 10.1145/3534086.3534335 (14 June 2022)
      • (2022) Machine learning-based strategies for streaming and experiencing 3DoF virtual reality. Proceedings of the 13th ACM Multimedia Systems Conference, 398--402. DOI: 10.1145/3524273.3533934 (14 June 2022)
