Abstract
We introduce EmoSnaps, a mobile application that unobtrusively captures pictures of one's facial expressions throughout the day and uses them for the later recall of one's momentary emotions. We describe two field studies that employ EmoSnaps to investigate whether and how individuals and their relevant others infer emotions from self-face and familiar-face pictures, respectively. Study 1 contrasted users' recalled emotions, as inferred from EmoSnaps' self-face pictures, with ground truth data derived from Experience Sampling. Contrary to our expectations, we found that people are better able to infer their past emotions from a self-face picture the longer the time that has elapsed since capture. Study 2 assessed EmoSnaps' ability to capture users' experiences while interacting with different mobile apps. The study revealed systematic variations in users' emotions while interacting with different categories of mobile apps (such as productivity and entertainment), social networking services, and direct social communications through phone calls and instant messaging, as well as diurnal and weekly patterns of happiness as inferred from EmoSnaps' self-face pictures. All in all, the results of both studies gave us confidence in the validity of self-face pictures captured through EmoSnaps as memory cues for emotion recall, and in the effectiveness of EmoSnaps as a tool for measuring users' momentary experiences.
Notes
Z-transformation was applied to normalize the distance Δ between ESM and reconstruction ratings.
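For readers less familiar with the procedure, a minimal sketch of a standard z-transformation is given below; the notation (Δ_i for an individual ESM–reconstruction distance, with the sample mean and standard deviation of Δ) is ours and not taken from the original analysis:

$$ z_i = \frac{\Delta_i - \bar{\Delta}}{s_{\Delta}} $$

This rescales the distances to zero mean and unit variance; whether the transformation was applied per participant or over the pooled sample is not specified here.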
Acknowledgments
This work was conducted in the frame of the Logica Service Design Lab, with support from the Knowledge + incentive system in Madeira, Portugal. The authors also acknowledge the financial support of the Future and Emerging Technologies (FET) programme within the 7th Framework Programme for Research of the European Commission, under FET grant number 612933.
About this article
Cite this article
Niforatos, E., Karapanos, E. EmoSnaps: a mobile application for emotion recall from facial expressions. Pers Ubiquit Comput 19, 425–444 (2015). https://doi.org/10.1007/s00779-014-0777-0