Emotion AI and eCommerce explained

 

Emotions rule us as human beings, shaping our decision-making, how we treat those around us, and so much more, yet we know surprisingly little about them. Some academics estimate that 95% of purchase decisions are made subconsciously, which helps explain why the emotion detection and recognition market could be worth US$43.3 billion by 2027.

 

Emotion artificial intelligence (AI) has been around since the mid-1990s, but it is now seen as an emerging technology in the world of eCommerce, as advances have started to make it a more viable proposition.

 


But what is emotion AI, how is it being used in eCommerce, and should you explore it for your business?

 

Emotion AI defined 

 

“Your intellect may be confused, but your emotions will never lie to you” – Roger Ebert.

 

As with all kinds of cutting-edge technology, a good place to start looking for a definition is at the Massachusetts Institute of Technology (MIT). Specifically, this definition comes from the MIT Sloan School of Management:

 

“Emotion AI is a subset of artificial intelligence (the broad term for machines replicating the way humans think) that measures, understands, simulates, and reacts to human emotions. It’s also known as affective computing or artificial emotional intelligence.”

 

If the boffins’ explanation is still a bit intimidating, perhaps there’s a more straightforward way to think about emotion AI. Simply put, it’s about teaching machines to better recognize human emotions and adjust their communications with humans appropriately. Basically, it’s an attempt to give computers emotional intelligence, which may strike some as a lofty goal seeing as plenty of full-fledged humans are sorely lacking in that department!

 

The machines’ biggest advantage is the vast amount of data they can process, far more than any human ever could. This makes feeding a machine data about certain emotions and training it to distinguish between different facial expressions or tones of voice a tantalizing prospect for researchers. On the other hand, humans have millions of years of evolution, social development and the complexities of the human mind to call upon in the arena of emotional intelligence.

 

Of course, positioning emotion AI and human emotional intelligence as competitors would be a tad simplistic, and a ripoff of the 2004 sci-fi thriller I, Robot. Instead, emotion AI can be thought of as a technology that augments human emotional labor. Just as chatbots take some routine tasks off the plates of human customer support agents, emotion AI can do something similar, only for a far harder task: reading and responding to how customers feel.
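
 

To make that augmentation idea a little more concrete, here is a minimal, hypothetical sketch (in Python) of how an emotion-aware layer might sit in front of a support chatbot. The detect_emotion function is a stand-in for whatever emotion AI service you might plug in; the labels, keywords and thresholds are purely illustrative and not drawn from any particular product.

```python
# Hypothetical sketch: routing a support reply based on detected emotion.
# detect_emotion() stands in for a real emotion AI service; here it just
# returns a label with a made-up confidence score.

def detect_emotion(message: str) -> tuple[str, float]:
    """Placeholder for an emotion AI call (e.g. a hosted API or local model)."""
    angry_words = {"furious", "unacceptable", "refund", "terrible"}
    hits = sum(word in message.lower() for word in angry_words)
    return ("frustrated", 0.9) if hits else ("neutral", 0.6)

def compose_reply(message: str) -> str:
    emotion, confidence = detect_emotion(message)
    if emotion == "frustrated" and confidence > 0.8:
        # Soften the tone and escalate when the customer seems upset.
        return ("I'm sorry this has been frustrating. I'm flagging your case "
                "to a human agent right now so we can sort it out quickly.")
    return "Thanks for getting in touch! Here's what I found for you..."

print(compose_reply("This is unacceptable, I want a refund."))
```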

 

3 ways emotion AI is being used in eCommerce

 

Definitions and theories are all well and good, but what about real-life, practical use cases? Read on for some examples.

 

Customer insights

 

Knowing your customer inside and out is half the battle of selling to them. A Salesforce survey found that 66% of customers expect companies to understand their needs and expectations, so it’s no surprise that emotion AI has been deployed on this crucial front.

 

Revuze, a US-based startup, is one company that uses emotion AI to give digital commerce firms insights into their customers. Its value proposition is simple but effective. When a company launches a new product, getting feedback from customers typically means hiring an outside surveying company and can take up to six months. Revuze promises to deliver the same kind of insight in a much shorter timeframe by using its emotion AI technology to gather and analyze online text discussions about the product.
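
 

For a rough sense of what that kind of text analysis involves, here is a minimal sketch that scores review snippets with an off-the-shelf emotion classification model via the Hugging Face transformers library. To be clear, this is not Revuze’s technology; the model named below is just a publicly available example, and the reviews are invented.

```python
# Minimal sketch: tallying the dominant emotion across customer review
# snippets using an off-the-shelf emotion classification model.
from collections import Counter

from transformers import pipeline

# Example model; any text classification model trained on emotion labels would do.
classifier = pipeline(
    "text-classification",
    model="j-hartmann/emotion-english-distilroberta-base",
)

reviews = [
    "Setup took five minutes and it just works. Love it.",
    "The battery died after two days, really disappointed.",
    "Arrived late but customer service sorted it out quickly.",
]

# Count the top predicted emotion for each review to get a rough picture
# of how customers feel about the product.
tally = Counter(classifier(review)[0]["label"] for review in reviews)
print(tally)  # e.g. Counter({'joy': 2, 'sadness': 1})
```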

 

Customer insights like these can help you make smarter decisions about what you sell and how you sell it.

 

Testing

 

To paraphrase a well-known expression, when the going gets tough, the tough get testing. Whether it’s testing your website’s load times or A/B testing your marketing efforts, we all know how valuable performing tests can be for all aspects of an eCommerce business. This is another area where emotion AI is being used.

 

Affect UX is a product from Entropik Tech that, the company claims, can provide clients with a 52% reduction in development time, a 16% increase in online conversions, and 63% savings in customer acquisition costs. It does this using “high-precision eye tracking and facial coding” to provide insights into how a survey panel experiences a website or app, which the client company can then use to optimize the experience.

 

Targeting

 

Serving the right message to the right customer at the right time is the holy grail of advertising and promotion, but it’s easier said than done. Serve someone the wrong message at the wrong time and you could turn them off your brand completely. Emotion AI can be used to help make sure that doesn’t happen.

 

More than 1 million people use the Yellow Line of the São Paulo Metro in Brazil every day, and when they stand in front of the subway doors, emotion AI is used to optimize the adverts they see. Using the security camera feeds, AdMobilize’s emotion AI analytics technology classifies passengers’ expressions into happiness, surprise, neutrality, and dissatisfaction. The adverts then change to suit whatever will best target that passenger.
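
 

The ad-selection step itself can be surprisingly simple once the hard part, the facial-coding model, has produced per-emotion scores. Here is a hypothetical sketch: the emotion labels mirror the four classes mentioned above, but the ad names, threshold and mapping are invented for illustration and have nothing to do with AdMobilize’s actual system.

```python
# Hypothetical sketch of the ad-selection step: given per-emotion scores
# from a facial-coding model (out of scope here), pick the creative most
# likely to resonate with the passenger in front of the screen.

ADS_BY_EMOTION = {
    "happiness": "upbeat_holiday_promo",
    "surprise": "new_product_teaser",
    "neutrality": "brand_awareness_spot",
    "dissatisfaction": "customer_service_reassurance",
}

def pick_ad(emotion_scores: dict[str, float], min_confidence: float = 0.5) -> str:
    """Return the ad slot for the dominant emotion, or a default if unsure."""
    emotion, score = max(emotion_scores.items(), key=lambda kv: kv[1])
    if score < min_confidence:
        # Fall back to a generic creative when the model isn't confident.
        return ADS_BY_EMOTION["neutrality"]
    return ADS_BY_EMOTION[emotion]

# Example scores as a facial-coding model might emit for one passenger.
print(pick_ad({"happiness": 0.7, "surprise": 0.1,
               "neutrality": 0.15, "dissatisfaction": 0.05}))
```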

 

2 cons of emotion AI

 

It would be remiss to talk about emotion AI without acknowledging the serious flaws its many critics have pointed out, both in the technology itself and in how it’s being used. So you get the full picture, here are the two major cons emotion AI faces right now.

 

It’s biased

 

The 2020 documentary Coded Bias brought the problem of biases within artificial intelligence into the mainstream, but experts have been sounding the alarm on this issue for years. How can a computer be biased? The answer’s pretty simple: the computer was built by a human. To be more specific, humans, with all their biases, select the datasets used to train emotion AI to do everything it can do, from reading people’s facial expressions to analyzing their tone of voice.

 

A study by academic Lauren Rhue, Assistant Professor of Information Systems and Analytics at Wake Forest University in the US, found that “emotional analysis technology assigns more negative emotions to black men’s faces than white men’s faces”. By feeding photos of 400 basketball players into two emotion AI systems, Rhue found that the AI consistently rated the black players as being more angry and contemptuous, as well as less happy. This happened even when the AI could see that the black player was smiling – it still ascribed more negative emotions to him.

 

This is, obviously, a big problem for emotion AI. If the technology consistently gives you bad data about a certain demographic of customers, you could start making poor decisions about how you serve those customers. And that’s without even considering the possibility of profiling people by race, gender or other characteristics.

 

It’s wildly inaccurate

 

Humans find it tough to consistently and accurately read each other’s emotions, so it’s probably no great surprise that machines also struggle. In 2019, five scientists reviewed more than 1,000 scientific papers on emotion research, spanning most of the last century. The results? “There is no scientific support for the common assumption ‘that a person’s emotional state can be readily inferred from his or her facial movements.’”

 

The scientists determined that facial expressions couldn’t be used to accurately determine a person’s emotional state for several reasons:

 

“The link between facial expressions and emotions is not reliable (i.e., the same emotions are not always expressed in the same way), specific (the same facial expressions do not reliably indicate the same emotions), or generalizable (the effects of different cultures and contexts has not been sufficiently documented).”

 

This seems a lot like common sense when one really thinks about it. We’ve all smiled when we’ve been feeling far from happy or pulled a mock surprise face in conversation with a friend. While the study mentioned above focused on facial recognition in emotion AI, the same problems hold true for text and for voices. To offer just one example, sarcasm is notoriously difficult for even humans to recognize in writing and in speech, so what hope do machines have?

 

Emotion AI evangelists insist the technology is improving (and it likely is), but if the base assumptions it’s built on are flawed, that may be tough to overcome.

 

Next steps

 

So is emotion AI capability the next must-have piece of your tech stack? The major problems the technology faces mean mainstream, widespread adoption is probably still some way off in the world of eCommerce. However, that doesn’t mean it couldn’t provide you with some valuable insights; just don’t let your entire strategy hinge on a computer’s ability (or inability) to read a smile.

 

For all your eCommerce needs, you can also arrange a call with one of our experts today. We promise you’ll speak to a human, not AI.