DOI: 10.1145/3463946.3469240 · Research article

Color-Grayscale-Pair Image Sentiment Dataset and Its Application to Sentiment-Driven Image Color Conversion

Published: 27 August 2021

Abstract

In this study, we focused on the fact that an image's color information strongly affects the emotion it evokes, and we created a dataset of color and grayscale image pairs annotated with both discrete and continuous emotion labels, a pairing not present in existing emotion image datasets. Analysis of the continuous-valued emotion labels showed that the color images evoked a wider range of both positive and negative emotions than their grayscale counterparts. We also conducted experiments that take an image and a target emotion label as input and convert the image's colors accordingly, and we succeeded in changing the color of the entire image to match the input emotion value. In a further experiment on converting the emotion category, we confirmed that the generated image evoked a different emotion than the original.
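The paper's actual conversion model is learning-based and is not reproduced here. Purely as an illustration of the core idea, conditioning a color transform on a continuous emotion value, the toy sketch below warms and saturates an image for positive valence and cools and desaturates it for negative valence. The function name and the specific color mapping are hypothetical assumptions for illustration, not the authors' method:

```python
import numpy as np

def sentiment_color_shift(img, valence):
    """Toy sentiment-conditioned color conversion (illustrative only).

    img: float RGB array in [0, 1], shape (H, W, 3).
    valence: target emotion value in [-1, 1]; positive pushes the image
    toward warmer, more saturated colors, negative toward cooler,
    desaturated ones.
    """
    img = np.clip(np.asarray(img, dtype=float), 0.0, 1.0)
    gray = img.mean(axis=2, keepdims=True)      # simple luminance proxy
    # Positive valence boosts saturation; negative pulls toward grayscale.
    sat = 1.0 + 0.5 * valence
    out = gray + sat * (img - gray)
    # Warm/cool bias: raise red and lower blue for positive valence.
    bias = 0.1 * valence
    out[..., 0] += bias
    out[..., 2] -= bias
    return np.clip(out, 0.0, 1.0)
```

A learned model would replace this hand-crafted mapping with a generator trained on the emotion-labeled image pairs, but the interface is the same: an image plus a target emotion value in, a recolored image out.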


Cited By

  • (2023) "Detection of Emotions in Artworks Using a Convolutional Neural Network Trained on Non-Artistic Images: A Methodology to Reduce the Cross-Depiction Problem." Empirical Studies of the Arts 42(1), 38–64. DOI: 10.1177/02762374231163481. Online publication date: 16 March 2023.
  • (2022) "Analysis of the Use of Color and Its Emotional Relationship in Visual Creations Based on Experiences during the Context of the COVID-19 Pandemic." Sustainability 14(20), 12989. DOI: 10.3390/su142012989. Online publication date: 11 October 2022.

Published In

MMArt-ACM '21: Proceedings of the 2021 International Joint Workshop on Multimedia Artworks Analysis and Attractiveness Computing in Multimedia 2021
August 2021, 23 pages
ISBN: 9781450385312
DOI: 10.1145/3463946

Publisher

Association for Computing Machinery, New York, NY, United States

Author Tags

  1. affective computing
  2. color
  3. datasets

Conference

ICMR '21
