DOI: 10.1145/2993352.2993362
Barrier-free affective communication in MOOC study by analyzing pupil diameter variation

Published: 28 November 2016

Abstract

MOOCs (Massive Open Online Courses) shorten the distance between students and educators and transcend time and space, but they also create barriers to genuine emotional interaction during study. This research demonstrates the feasibility of using pupil diameter variation as an indicator of implicit affective state on the valence and arousal scales, aiming to reveal students' true emotions. In the experiment, affective music clips served as stimuli, and participants' pupillary responses, together with their valence-arousal ratings, were used to build emotion recognition models. Multilayer Perceptron, KStar, and SVM algorithms were compared; SVM achieved the best recognition rate on both the valence and arousal dimensions, indicating that affective states can be recognized from pupil diameter variation. On the basis of this recognition model, an application named "EMOOC" was developed to improve communication in MOOC study: it visualizes students' emotional states as feedback to the MOOC teacher, bridging the emotion gap caused by the online communication barrier.
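The pipeline the abstract describes (summary features extracted from each participant's pupil-diameter trace, paired with that participant's valence-arousal rating, then fed to classifiers such as MLP, KStar, or SVM) can be sketched roughly as below. The abstract does not specify the authors' actual features or baseline handling, so `pupil_features`, the baseline correction, and the toy trace are illustrative assumptions, not the paper's implementation.

```python
import statistics

def pupil_features(diameters, baseline):
    """Hypothetical summary features of one pupil-diameter trace (mm),
    baseline-corrected so dilation rather than absolute size is measured."""
    rel = [d - baseline for d in diameters]
    return {
        "mean": statistics.mean(rel),      # average dilation over the clip
        "std": statistics.pstdev(rel),     # variability of the response
        "max_dilation": max(rel),          # peak dilation
        "range": max(rel) - min(rel),      # dynamic range of the trace
    }

# Toy trace recorded while a participant listens to an affective music clip.
trace = [3.1, 3.2, 3.5, 3.8, 3.7, 3.6]
feats = pupil_features(trace, baseline=3.0)
```

A vector such as `list(feats.values())`, labeled with the participant's self-reported valence-arousal score, would form one training sample for the classifier comparison described above.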


Cited By

  • (2023) Survey on Emotion Sensing Using Mobile Devices. IEEE Transactions on Affective Computing 14(4), 2678-2696. DOI: 10.1109/TAFFC.2022.3220484. Online: 1 Oct 2023
  • (2022) Shared User Interfaces of Physiological Data: Systematic Review of Social Biofeedback Systems and Contexts in HCI. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1-16. DOI: 10.1145/3491102.3517495. Online: 29 Apr 2022
  • (2022) UX evaluation of open MOOC platforms: a comparative study between Moodle and Open edX combining user interaction metrics and wearable biosensors. Interactive Learning Environments 31(10), 6841-6855. DOI: 10.1080/10494820.2022.2048674. Online: 9 Mar 2022
  • (2022) Trends in the use of affective computing in e-learning environments. Education and Information Technologies 27(3), 3867-3889. DOI: 10.1007/s10639-021-10769-9. Online: 1 Apr 2022


    Published In

    SA '16: SIGGRAPH ASIA 2016 Symposium on Education
    November 2016
    95 pages
    ISBN:9781450345446
    DOI:10.1145/2993352
    • Conference Chairs:
    • Miho Aoki,
    • Zhigeng Pan
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. MOOC
    2. SVM
    3. emotion recognition
    4. pupil diameter variation
    5. wavelet transform

    Qualifiers

    • Research-article

    Conference

    SA '16
    Sponsor:
    SA '16: SIGGRAPH Asia 2016
    December 5 - 8, 2016
    Macau

    Acceptance Rates

    Overall Acceptance Rate 178 of 869 submissions, 20%
