Imagine messaging an artificial intelligence (AI) chatbot about a missing package and getting a response that it would be “delighted” to help. Once the bot creates the new order, it says it is “happy” to resolve the issue. Afterward, you receive a survey about the interaction. Would you rate it as positive or negative?
This scenario isn’t that far from reality, as AI chatbots are already taking over online commerce. By 2025, 95% of companies will have an AI chatbot, according to Finance Digest. AI might not be sentient yet, but it can be programmed to express emotions.
Positive emotional displays by human employees have long been known to improve customer service interactions, but researchers at the Georgia Institute of Technology’s Scheller College of Business wanted to see whether the same holds for AI. They conducted experimental studies to determine whether positive emotional displays improved customer service and found that emotive AI is appreciated only when the customer expects it, and that it may not be the best avenue for companies to invest in.
“It is commonly believed and repeatedly shown that human employees can express positive emotion to improve customers’ service evaluations,” said Han Zhang, the Steven A. Denning Professor in Technology & Management. “Our findings suggest that the likelihood of AI’s expression of positive emotion to benefit or hurt service evaluations depends on the type of relationship that customers expect from the service agent.”
The researchers presented their findings in the paper “Bots With Feelings: Should AI Agents Express Positive Emotion in Customer Service?” in Information Systems Research in December.
Studying AI Emotion
The researchers conducted three studies to expand the understanding of emotional AI in customer service transactions. Although the participants and scenario changed in each study, the chatbots imbued with emotion consistently used positive emotional adjectives, such as “excited,” “delighted,” “happy,” or “glad.” They also deployed more exclamation points.
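To make that manipulation concrete, here is a minimal, hypothetical sketch of how an emotion-neutral reply and an emotion-positive reply might be templated. The function names and exact wording are illustrative assumptions, not the researchers’ actual chatbot scripts.

```python
# Hypothetical illustration of the emotion manipulation described above.
# The phrasing and structure are assumptions for clarity, not the
# researchers' actual experimental materials.

EMOTION_WORDS = ["excited", "delighted", "happy", "glad"]

def neutral_reply(action: str) -> str:
    """Emotion-neutral condition: state the outcome plainly."""
    return f"I have {action}."

def positive_reply(action: str, emotion: str = "delighted") -> str:
    """Emotion-positive condition: add a positive emotional adjective
    and extra exclamation points."""
    assert emotion in EMOTION_WORDS
    return f"I'm {emotion} to help! I have {action}!"

if __name__ == "__main__":
    action = "created a replacement order for your missing package"
    print(neutral_reply(action))   # I have created a replacement order...
    print(positive_reply(action))  # I'm delighted to help! I have created...!
```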
The first study focused on whether customers responded more favorably to positive emotion depending on whether they knew the service agent was a bot or a person. The 155 participants, told they were seeking help with a missing item in a retail order, were randomly assigned to one of four scenarios: a human agent with neutral emotion, a human agent with positive emotion, a bot with neutral emotion, or a bot with positive emotion. The researchers then asked participants about service quality and overall satisfaction. The results indicated that positive emotion was beneficial when human agents exhibited it but had no effect when bots exhibited it.
The second study examined whether customers’ personal expectations shaped their reaction to the bot. In this scenario, the 88 participants imagined returning a textbook and were randomly assigned to either an emotion-positive or an emotion-neutral bot. After chatting with the bot, they rated on a scale whether they were communally (socially) oriented or exchange (transaction) oriented. Communally oriented participants were more likely to appreciate the emotionally positive bot, but participants who viewed the exchange as merely transactional had a worse experience with it.
“Our work enables businesses to understand the expectations of customers exposed to AI-provided services before they haphazardly equip AIs with emotion-expressing capabilities,” Zhang said.
The final study explored why a bot’s positive emotion influences customers’ emotions, following 177 undergraduate students randomly assigned to emotive or non-emotive bots. The results help explain why positive bots have less of an effect than anticipated: because customers do not expect machines to have emotions, they can react negatively when a bot expresses them.
The results across the studies show that using positive emotion in chatbots is challenging because businesses don’t know a customer’s biases and expectations going into the interaction. A happy chatbot could lead to an unhappy customer.
“Our findings suggest that the positive effect of expressing positive emotion on service evaluations may not materialize when the source of the emotion is not a human,” Zhang said. “Practitioners should be cautious about the promises of equipping AI agents with emotion-expressing capabilities.”
CITATION: Han, Elizabeth, Dezhi Yin, and Han Zhang (2022). “Bots With Feelings: Should AI Agents Express Positive Emotion in Customer Service?” Information Systems Research. Published online in Articles in Advance, Dec. 2, 2022.
DOI: https://doi.org/10.1287/isre.2022.1179
######
The Georgia Institute of Technology, or Georgia Tech, is one of the top public research universities in the U.S., developing leaders who advance technology and improve the human condition. The Institute offers business, computing, design, engineering, liberal arts, and sciences degrees. Its more than 46,000 students, representing 50 states and more than 150 countries, study at the main campus in Atlanta, at campuses in France and China, and through distance and online learning. As a leading technological university, Georgia Tech is an engine of economic development for Georgia, the Southeast, and the nation, conducting more than $1 billion in research annually for government, industry, and society.
Journal: Information Systems Research
Method of Research: Experimental study
Subject of Research: People
Article Title: Bots With Feelings: Should AI Agents Express Positive Emotion in Customer Service?
Article Publication Date: Dec. 2, 2022