3 Ways AI Is Getting More Emotional

In January 2018, Annette Zimmermann, vice president of research at Gartner, proclaimed: “By 2022, your personal device will know more about your emotional state than your own family.” Just two months later, a landmark study from Ohio State University claimed that its algorithm was now better at detecting emotions than people are.
Emotion AI systems and devices will soon recognize, interpret, process, and simulate human affects. A combination of facial analysis, voice pattern analysis, and deep learning can already decode human emotions, and some research suggests that algorithms now outperform people at the task (a rough sketch of how such multimodal systems combine signals appears at the end of this section).

Emotional inputs will create a shift from data-driven, IQ-heavy interactions to deep, EQ-guided experiences, giving brands the opportunity to connect with customers on a much deeper, more personal level.

But reading people’s emotions is a delicate business. Emotions are highly personal, and users will not agree to let brands look into their souls unless the benefit to them outweighs the fear of privacy invasion and manipulation. A series of collectively agreed-upon experiments will need to guide designers and brands toward the appropriate level of intimacy, and a series of failures will determine the rules for maintaining trust, privacy, and emotional boundaries. The biggest hurdle to finding the right balance may not be building more effective forms of Emotion AI, but finding emotionally intelligent humans to build them.
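To make the multimodal combination described above concrete, here is a minimal, illustrative sketch of one common approach, late fusion, in Python. Everything in it is an assumption for illustration: the emotion labels, the stubbed predict_face and predict_voice functions (standing in for real facial-expression and voice-pattern classifiers), and the fixed fusion weight are hypothetical, not any particular vendor’s API.

```python
# Illustrative sketch only: late fusion of two emotion classifiers.
# The upstream models are stubbed; a real system would run a trained
# facial-expression model and a vocal-prosody model in their place.

from typing import Dict

EMOTIONS = ["anger", "fear", "joy", "sadness", "surprise"]

def predict_face(frame) -> Dict[str, float]:
    # Stub for a facial-analysis classifier (hypothetical output).
    return {"anger": 0.05, "fear": 0.10, "joy": 0.60, "sadness": 0.05, "surprise": 0.20}

def predict_voice(audio) -> Dict[str, float]:
    # Stub for a voice-pattern classifier (hypothetical output).
    return {"anger": 0.10, "fear": 0.05, "joy": 0.50, "sadness": 0.15, "surprise": 0.20}

def fuse(face: Dict[str, float], voice: Dict[str, float], w_face: float = 0.6) -> Dict[str, float]:
    # Weighted late fusion: blend per-label probabilities, then renormalize.
    combined = {e: w_face * face[e] + (1 - w_face) * voice[e] for e in EMOTIONS}
    total = sum(combined.values())
    return {e: p / total for e, p in combined.items()}

if __name__ == "__main__":
    scores = fuse(predict_face(frame=None), predict_voice(audio=None))
    top = max(scores, key=scores.get)
    print(f"Predicted emotion: {top} ({scores[top]:.0%})")
```

Late fusion keeps each channel’s model independent, so either can be swapped or retrained without touching the other; production systems typically learn the fusion weights from data rather than fixing them by hand, as this sketch does.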