Should We Outsource Emotion to Robots?

Posted on 2nd May 2017


Welcome to the Business Empathy Forum and thank you for your visit. In this post I highlight a 2016 Slate article by Christine Rosen that explores ‘emotional labor’ and the risks of outsourcing such comforting behavior to machines. Given the breakneck development of artificial intelligence in 2017, it might be worthwhile to pause and reflect on this.

Artificial Intelligence and ‘Emotional Labor’

Rosen begins with familiar examples that illustrate what emotional labor can look like: the ever-smiling, ever-cheerful employees at Walt Disney World; the rote greetings of serving staff at many casual restaurants; the cool courtesy of many flight attendants; or the friendly ‘Have a nice day!’ that rings out as we leave the dentist’s office. These workplace behaviors represent what sociologist Arlie Russell Hochschild has defined as ‘emotional labor’ – “the performance of feelings that service workers must provide to their customers.”

I find that phrase striking, and haunting: the “performance of feelings that service workers must provide”. It is a reminder that empathy – even simply being courteous – can require energy and ‘work’. Given the effort that empathy can entail, and the fact that we humans are fond of delegating ‘menial’ tasks to machines – think of how we use our computers, apps that count calories, and smartphones that keep appointments and give directions – it makes sense that we are now leveraging AI technology to create empathic, therapeutic robots.

But this can get confusing… what are we really getting here? “These robots are explicitly marketed to the public as nurturing companions, not merely machines. We are supposed to view them like emotionally hyper responsive pets.” We seem to be crossing a line these days, from machines that perform ‘tasks’ to machines that become our ‘nurturing companions’. Acknowledging for the moment the practical benefits and the commercial potential of companion robots, what are the risks? One is the long-term risk of emotional ‘de-skilling’: as we grow ever more accustomed to interacting with programmed machines and their predictable, algorithm-driven responses, we slowly lose our nuanced feel for human communication.

Yet our ability to communicate with one another is surely one of the crown jewels of humanity: are we really willing to give that up? As I read Rosen’s article I found myself circling back to a fundamental question from my own book: what is the right balance between technology and empathy, between ‘tech’ and ‘touch’? The answer is not an easy one, but it is important.

So in the end, how much of our emotional labor should we outsource to machines? Here is Rosen’s final thought: “In our age of ersatz intimacy, perhaps sophisticated emotional mimicry is enough for us. But we should at least acknowledge that we are setting a low bar for our emotional lives. Emotions are like weather – only partially understandable and predictable. But that is precisely what makes them a bug and a feature of being human.”

Good luck, and until next time…