Olfactory-Enhanced VR: What's the Difference in Brain Activation Compared to Traditional VR for Emotion Induction?

Research article · Published: 29 November 2023

Abstract

Olfactory-enhanced virtual reality (OVR) creates a complex and rich emotional experience, thus promoting a new generation of human-computer interaction experiences in real-world scenarios. However, despite the rise of virtual reality (VR) as a mood induction procedure (MIP), few studies have incorporated olfactory stimuli into emotion induction in three-dimensional (3D) environments. Given the differences in electroencephalography (EEG) dynamics across sensory stimuli, previous two-dimensional (2D) and 3D emotion studies, which used only the visual and auditory senses, have been less effective at reproducing real-world experience. To overcome these limitations, we developed a novel EEG signal dataset based on OVR and systematically analyzed, from a neurophysiological perspective, the influence of olfactory stimuli on emotion induction in a VR environment. Specifically, synchronous EEG signals were collected from 65 participants as they watched positive and negative videos in traditional VR and OVR. Power spectral densities (PSDs) were then calculated to compare brain activation between the VR and OVR modes during the induction of positive and negative emotions, and brain states were classified after feature selection. The results showed that olfactory stimuli enhanced EEG responses for positive emotions, whereas the opposite was true for negative emotions. In addition, the recognition rate of emotional brain states exceeded 90% under both positive and negative emotions, and the high-frequency β and γ bands could effectively distinguish the VR and OVR modes. This study introduces olfaction into the field of human-computer interaction, which could promote research on emotion induction and recognition in real-world environments.
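The analysis pipeline sketched in the abstract (per-channel PSD estimation, band-power features, and classification of VR versus OVR brain states) can be illustrated as follows. This is a minimal sketch, not the authors' code: the sampling rate, channel count, band edges, Welch parameters, and the use of a random-forest classifier are all assumptions made for the example.

```python
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

FS = 250  # assumed EEG sampling rate in Hz (not given in the abstract)
# Conventional EEG band edges in Hz; exact cutoffs vary between studies.
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_power_features(eeg):
    """eeg: (n_channels, n_samples) -> one power value per (band, channel) pair."""
    freqs, psd = welch(eeg, fs=FS, nperseg=2 * FS, axis=-1)  # Welch PSD per channel
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        # Integrate the PSD over the band to get each channel's band power.
        feats.append(trapezoid(psd[:, mask], freqs[mask], axis=-1))
    return np.concatenate(feats)

# Toy stand-in for real recordings: 40 epochs of 32-channel, 10-second EEG.
rng = np.random.default_rng(0)
X = np.stack([band_power_features(rng.standard_normal((32, 10 * FS)))
              for _ in range(40)])
y = np.tile([0, 1], 20)  # hypothetical labels: 0 = VR trial, 1 = OVR trial

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())  # ~chance on noise
```

The abstract does not specify the feature-selection method; one common choice would be to rank the band-power features by random-forest importances (clf.feature_importances_) and keep only the top-ranked subset before classification.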


Published In

IEEE Transactions on Affective Computing, Volume 15, Issue 3
July-Sept. 2024
1087 pages

Publisher

IEEE Computer Society Press

Washington, DC, United States
