
Empathy by Design: The Influence of Trembling AI Voices on Prosocial Behavior

Published: 14 November 2023

Abstract

Recent advances in artificial speech synthesis and machine learning equip AI-powered conversational agents, from voice assistants to social robots, with the ability to mimic human emotional expression during their interactions with users. One unexplored development is the ability to design machine-generated voices that induce varying levels of “shakiness” (i.e., trembling) in the agents’ voices. In the current work, we examine how the trembling voice of a conversational AI impacts users’ perceptions, affective experiences, and subsequent behavior. Across three studies, we demonstrate that a trembling voice enhances the perceived psychological vulnerability of the agent, which in turn heightens empathic concern and ultimately increases people's willingness to donate in a prosocial charity context. We provide further evidence from a large-scale field experiment that conversational agents with a trembling voice lead to increased click-through rates and decreased cost per impression in an online charity advertising setting. These findings deepen our understanding of the nuanced impact of intentionally designed voices of conversational AI agents on humans and highlight the ethical and societal challenges that arise.
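The abstract describes inducing controllable “shakiness” in a synthesized voice. The paper's own manipulation pipeline is not reproduced here; as a minimal illustrative sketch only, one common way to approximate vocal tremor is to impose a slow (roughly 4–8 Hz) sinusoidal amplitude modulation on the waveform. The function name and parameters below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def add_tremor(samples: np.ndarray, sr: int,
               rate_hz: float = 6.0, depth: float = 0.3) -> np.ndarray:
    """Impose a slow sinusoidal amplitude modulation ("tremor") on a
    mono waveform. rate_hz (~4-8 Hz) sets the tremor speed and depth
    (0-1) its strength; depth=0 returns the signal unchanged."""
    t = np.arange(len(samples)) / sr
    # Modulator oscillates between (1 - depth) and 1, so the output
    # envelope periodically dips below the original amplitude.
    modulator = 1.0 - depth * 0.5 * (1.0 + np.sin(2 * np.pi * rate_hz * t))
    return samples * modulator

# Demo on a synthetic 220 Hz tone standing in for a voice recording.
sr = 16_000
t = np.arange(sr) / sr                      # one second of audio
tone = np.sin(2 * np.pi * 220 * t)
shaky = add_tremor(tone, sr, rate_hz=6.0, depth=0.4)
```

Real tremor also involves frequency (pitch) modulation and jitter/shimmer perturbations, so a production manipulation would typically operate on the pitch contour as well rather than on amplitude alone.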



Published In

IEEE Transactions on Affective Computing, Volume 15, Issue 3
July–Sept. 2024
1087 pages

Publisher

IEEE Computer Society Press

Washington, DC, United States


Qualifiers

  • Research-article
