Technology has transformed the way we communicate and do business. Phone calls, chat, email, voice messages, and video calls have replaced face-to-face conversations. Traditional retail shopping has lost ground as online shopping has grown into a multibillion-dollar industry, and chatbots have become one of the most popular channels for customer service inquiries. But this form of communication is not always beneficial: it is often slow and distorted, and requests and questions get misinterpreted, leading to customer frustration.
Further, the COVID-19 pandemic has reshaped consumer behavior, pushing people to shop online in greater numbers and with greater frequency. According to IBM’s US Retail Index, the pandemic accelerated the shift from physical stores to digital shopping by roughly five years. Department stores, as a result, are seeing significant declines: in the first quarter of 2020, sales at department stores and other “non-essential” retailers fell by 25%, and the decline grew to 75% in the second quarter.
Yet as much as technology can hinder communication, it is also a boon when it facilitates effective, frictionless conversation. Futurologists and trend researchers are hopeful that emotional Artificial Intelligence (AI) can help bridge the gap.
AI and emotions
Emotion AI, also referred to as affective computing, is an evolving field that organizations are rapidly adopting to detect human emotions. Machines with this kind of emotional intelligence can not only get tasks done but also understand both the cognitive and the emotive channels of human communication. This enables them to detect, interpret, and respond appropriately to verbal and nonverbal signals alike.
Emotion AI is a subset of AI that can measure, understand, simulate, and react to human emotions. The field came into the limelight in 1995, when MIT Media Lab professor Rosalind Picard published “Affective Computing.”
According to Javier Hernandez, a research scientist with the Affective Computing Group at the MIT Media Lab, “emotion AI can be explained as a tool to bring about a more natural interaction between humans and machines.”
“Think of the way you interact with other human beings; you look at their faces, you look at their body, and you change your interaction accordingly,” Hernandez commented. “How can (a machine) effectively communicate information if it doesn’t know your emotional state, if it doesn’t know how you’re feeling, if it doesn’t know how you’re going to respond to specific content?”
Humans may still hold the upper hand in understanding, reading, and interpreting emotions, but machines are steadily gaining ground by playing to their strengths. As MIT Sloan professor Erik Brynjolfsson explains, machines are very good at analyzing large amounts of data: they can listen to voice inflections and start to recognize when those inflections correlate with stress or anger, and they can analyze images and pick up subtleties in micro-expressions on human faces that may happen too fast for a person to recognize.
“We have a lot of neurons in our brain for social interactions. We’re born with some of those skills, and then we learn more. It makes sense to use technology to connect to our social brains, not just our analytical brains,” Brynjolfsson said. “Just like we can understand speech and machines can communicate in speech, we also understand and communicate with humor and other emotions. And machines that can speak that language — the language of emotions — will have better, more effective interactions with us. It’s great that we’ve made some progress; it’s just something that wasn’t an option 20 or 30 years ago, and now it’s on the table.”
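To make this concrete, the sketch below is a toy illustration of the voice-analysis idea Brynjolfsson describes. It fabricates “calm” and “stressed” audio clips as noisy sine waves (all data here is synthetic and purely illustrative), extracts two crude prosodic features (loudness and a pitch proxy), and fits a simple classifier. A real system would train on labeled human speech with far richer acoustic features.

```python
# Toy voice-stress classifier: synthetic clips, two prosodic features,
# logistic regression. Purely illustrative; no real speech data is used.
import numpy as np
from sklearn.linear_model import LogisticRegression

SR = 16_000  # sample rate in Hz

def synth_clip(pitch_hz: float, amplitude: float, seconds: float = 1.0) -> np.ndarray:
    """Fake a voice clip as a noisy sine wave at a given pitch and loudness."""
    t = np.linspace(0.0, seconds, int(SR * seconds), endpoint=False)
    noise = 0.05 * np.random.default_rng(int(pitch_hz)).standard_normal(t.size)
    return amplitude * np.sin(2 * np.pi * pitch_hz * t) + noise

def features(clip: np.ndarray) -> list:
    """Two crude prosodic features: RMS energy (loudness) and
    zero-crossing rate (a rough proxy for pitch)."""
    energy = float(np.sqrt(np.mean(clip ** 2)))
    zcr = float(np.mean(np.abs(np.diff(np.sign(clip))) > 0))
    return [energy, zcr]

# "Calm" clips: lower pitch and quieter. "Stressed" clips: higher pitch, louder.
calm = [synth_clip(pitch_hz=120 + 5 * i, amplitude=0.3) for i in range(20)]
stressed = [synth_clip(pitch_hz=250 + 5 * i, amplitude=0.8) for i in range(20)]

X = np.array([features(c) for c in calm + stressed])
y = np.array([0] * len(calm) + [1] * len(stressed))  # 0 = calm, 1 = stressed

model = LogisticRegression(max_iter=1000).fit(X, y)
unseen = synth_clip(pitch_hz=280, amplitude=0.9)  # a new, unlabeled clip
print("stressed" if model.predict([features(unseen)])[0] == 1 else "calm")
```

In practice the features would be spectral (MFCC coefficients, pitch contours, speaking rate) and the labels would come from annotated recordings, but the shape of the pipeline, extract acoustic features and then classify, is the same.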
Use cases of emotion AI
According to Gartner, about 10% of personal devices will have emotion AI capabilities by the end of 2022, either on the device itself or through cloud services, up from less than 1% in 2018.
The two most important areas where emotion AI can support businesses are customer experience and cost-cutting.
Emotion AI vendors have recently ventured into new areas and industries, helping enterprises create better customer experiences. The following examples show how different companies are leveraging the technology:
- Video games – Using computer vision, a gaming console or game can recognize a player’s emotions from facial expressions during play and adapt to them.
For example, Nevermind is a psychological thriller game that uses Affectiva’s Affdex SDK to sense a player’s facial expressions for signs of emotional distress.
- Medical diagnosis – Software such as Electronic Health Record (EHR) systems and medical databases can help doctors diagnose conditions such as depression and dementia by using voice analysis.
- Education industry – Learning software prototypes have been designed to adapt to children’s emotions. For example, if a child is frustrated because they find a task too difficult or too simple, the program adapts the task to make it less or more challenging.
- There are also special programs designed for autistic children that guide them in performing study-related activities. For example, ‘See.Touch.Learn,’ a language development app from Brain Parade, was designed for children with autism, language disorders, and other special needs.
- Employee safety – Emotion AI can effectively detect employees’ stress and anxiety levels. A recent Gartner survey revealed a tremendous increase in the development of such employee safety solutions.
- Patient care – A ‘nurse bot’ can remind older patients on long-term medical programs to take their medication and converse with them daily to monitor their overall well-being.
- Automotive safety – Automotive vendors can use computer vision to track the driver’s emotional state and send alerts when it detects extreme emotion or drowsiness (a sketch of one common drowsiness signal follows this list).
- Detecting fraud – Insurance companies often use voice analysis to check whether a customer is telling the truth when submitting a claim. Independent surveys have suggested that up to 30% of customers lie to their car insurance company in order to claim coverage.
- Recruitment – Companies use software such as Manatal and Breeze HR during the job interview process to gauge a candidate’s credibility.
- Call centers – Emotion AI can help flag an angry customer from the start of a call, so the call can be re-routed to a well-trained agent who can monitor the flow of the conversation in real time.
- Public services – The relationship between emotion AI vendors and surveillance-camera providers has moved ahead. The United Arab Emirates’ Ministry of Happiness has initiated a unique project: cameras installed in public places capture people’s facial expressions to gauge the general mood of the population.
- Retail – Installing computer vision-based emotion AI in stores enables retailers to capture demographic information and analyze visitors’ moods and reactions.
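As promised above, here is a minimal sketch of the eye aspect ratio (EAR), a widely used drowsiness signal in driver-monitoring systems: the ratio of the eye’s vertical opening to its horizontal width drops sharply when the eye closes. The landmark coordinates below are hypothetical; a real system would obtain six landmarks per eye, per video frame, from a face-landmark detector such as dlib or MediaPipe.

```python
# Eye aspect ratio (EAR), a common drowsiness signal: the eye's vertical
# openings divided by its horizontal width drop sharply when the eye
# closes. Landmarks here are hypothetical, for illustration only.
import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """eye: six (x, y) landmarks ordered around the eye, as in the
    68-point convention: [left corner, top-left, top-right,
    right corner, bottom-right, bottom-left]."""
    v1 = np.linalg.norm(eye[1] - eye[5])  # first vertical distance
    v2 = np.linalg.norm(eye[2] - eye[4])  # second vertical distance
    h = np.linalg.norm(eye[0] - eye[3])   # horizontal width
    return (v1 + v2) / (2.0 * h)

EAR_THRESHOLD = 0.21  # typical cutoff; tuned per camera and driver

# Hypothetical landmarks for an open eye and a nearly closed eye.
open_eye = np.array([[0, 2], [2, 4], [4, 4], [6, 2], [4, 0], [2, 0]], dtype=float)
closed_eye = np.array([[0, 2], [2, 2.4], [4, 2.4], [6, 2], [4, 1.6], [2, 1.6]], dtype=float)

for label, eye in [("open", open_eye), ("closed", closed_eye)]:
    ear = eye_aspect_ratio(eye)
    drowsy = ear < EAR_THRESHOLD
    print(f"{label}: EAR={ear:.2f} -> {'ALERT: possible drowsiness' if drowsy else 'ok'}")
```

To avoid firing on normal blinks, production systems typically raise an alert only after the EAR stays below the threshold for a sustained run of consecutive frames.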
Takeaway
The scenarios above show how different industries are adopting emotion AI to their benefit. Many companies are moving toward emotional intelligence to capture more attention and expand their reach, growing the emotional quotient of their culture and structure to keep up with demand and fulfill their commitments with purpose.
Emotions are now central to everyday experiences, and emotion AI makes the user experience more holistic: it makes human interactions with technology more intuitive and technology more responsive to users’ needs.
Marketers should therefore start implementing emotion AI by focusing on data that reflects customers’ emotions. Organizations that want to succeed with this technology are better off deploying it in the real world early.