Most of us think that improving the quality or capability of an AI system makes it incrementally (or at least monotonically) better for users.
However, just as in computer graphics and robotics, AI systems can have an uncanny valley.
As artificial faces become more realistic, people generally like them more, but there is an uncanny valley where a face gets close to realistic but not quite there. These faces look “eerie” or “creepy”: we know something is off, but we can’t say exactly why.
Why is this? Our brains are wired to recognize human faces. When something is clearly a cartoon or an abstraction, the logical part of our brain processes it just fine. But when an image is realistic enough to activate our fine-grained emotion detectors (the ones that spot true versus fake smiles, or micro-expressions), those emotional judgments kick in, and a nearly-real face fails their checks in ways we sense but cannot articulate.
This has important implications for people developing “life-like” robots. Until quality improves enough to satisfy those emotional recognizers and get past the uncanny response, further improvements add little value. Their value can even be negative, despite the higher investment required to achieve them.
Recent CES attendees got a close look at two robots on the edge of the uncanny valley. Joshua Melvin describes them in his article “Creepy meets cool in humanoid robots at CES tech show”, noting how Pedia-Roid from the Japanese firm tmsuk made people uncomfortable. This very human-looking doll is designed to train healthcare workers who deal with children who squirm, scream, and try to avoid being examined or treated by a doctor. Attendees found its eyes especially creepy.
On the other hand, people loved Ameca from the British company Engineered Arts. Ameca is a chatty, genderless, raceless humanoid robot with a very expressive gray face and hands. It is designed to talk and interact with humans, to move like a human, and to show facial expressions, but not to be quite human enough to trigger the uncanny response. Ameca’s metal body and a few non-human facial features (like the seams in its plastic) make it clearly a cool robot, not a scary fake human.
Joel Pinney describes a typical research participant’s response: “Participants said they wanted a robot that resembled humans with a face, a mouth and eyes but – crucially – not an identical representation of human features. In other words, they still wanted them to look like a robot, not some unsettling cyborg hybrid.”
Beyond faces to language
The subtleties that make the difference between excellent and spooky extend beyond faces to other AI applications like natural language processing (NLP). John Kucera of Salesforce names “mirror how we talk” as one of the “4 Trends in Scaling AI for the Coming Year”. He cautions, “In a world where there are at least 175 billion nuances to speech, how we speak—not just what we say—becomes all the more important. When dealing with robotics and natural language processing, these nuances make the difference between success and entry into the uncanny valley.”
Keeping NLP systems likeable and trustworthy is essential because AI customer service agents and chatbots are increasingly the public faces of businesses. Juniper Research projects that “by 2022, 75-90% of customer queries will be handled entirely by chatbots.” Similarly, Alexey Aylarov reports that “90% of large companies are already using AI solutions to enhance the customer experience” and that NLP systems face three challenges: 1) the risk of the uncanny valley; 2) understanding spoken language, including verbal “body language”; and 3) knowing when the bot is over its head and using what it has learned from the conversation to route the call to the most appropriate human agent, solving the customer’s problem without making the system overly sophisticated or human-like.

Aylarov’s solution is “chatbots or voicebots that use fairly simple questions, leaving customers satisfied throughout the conversation.” And, as with physical robots, avoiding over-humanization and making it clear that the customer is interacting with an AI is key to staying out of the uncanny valley.
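The escalation pattern Aylarov describes is easy to picture in code. The sketch below is a minimal, hypothetical illustration (not his actual system; names like SimpleBot, Turn, and confidence_threshold are invented for this example): the bot asks simple, structured questions, tracks how confident it is that it understood each answer, and hands the conversation to a human agent, along with everything it has learned so far, as soon as it gets over its head.

```python
# Minimal sketch of a "simple questions, then escalate" voicebot/chatbot flow.
# All names here are hypothetical and for illustration only.
from dataclasses import dataclass, field

@dataclass
class Turn:
    question: str
    answer: str
    confidence: float  # the bot's estimate (0..1) that it understood the answer

@dataclass
class SimpleBot:
    confidence_threshold: float = 0.6
    history: list[Turn] = field(default_factory=list)

    def record(self, question: str, answer: str, confidence: float) -> None:
        # Keep every question/answer pair so nothing is lost at handoff.
        self.history.append(Turn(question, answer, confidence))

    def should_escalate(self) -> bool:
        # Escalate as soon as any answer falls below the threshold,
        # rather than letting the bot bluff its way into uncanny territory.
        return any(t.confidence < self.confidence_threshold for t in self.history)

    def handoff_summary(self) -> str:
        # Summarize what was learned so the human agent doesn't start from zero.
        lines = [f"Q: {t.question} / A: {t.answer}" for t in self.history]
        return "Context for human agent:\n" + "\n".join(lines)


if __name__ == "__main__":
    bot = SimpleBot()
    bot.record("What product are you calling about?", "The mobile app", 0.95)
    bot.record("What seems to be the problem?", "It crashes when I upload a photo", 0.40)

    if bot.should_escalate():
        # Route the call to the most appropriate human agent with full context.
        print(bot.handoff_summary())
```

Keeping the bot this simple is the point of the design: it never pretends to understand more than it does or to be more human than it is, which is exactly what keeps the interaction on the friendly side of the uncanny valley.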