Article by Katie Kenny as featured on Stuff | January 28, 2019
Kiwi company Soul Machines is creating digital humans powered by biologically inspired models of the brain.
We were trying to get in touch with our internet service provider. I can’t remember the reason. But we contacted the company through its website chat system.
My partner was typing, and I noticed his language was unusually clipped, devoid of the words “please” or “thank you”.
“You don’t have to be so rude,” I said.
“It’s just a bot,” he replied, shrugging.
I took a closer look at the conversation. “No, that’s not a bot. You’re talking to a real person.”
The speed and humanity of the responses were beyond the capabilities of common virtual assistants, I thought.
So we asked the woman at the other end about the weather, and what she did at the weekend. Her replies confirmed my suspicion. She sounded like a person used to getting hassled in this online role, wanting to end the conversation quickly.
Our uncertainty felt like a very 2019 thing. Voice-controlled artificial intelligence systems, and even robots, have become more common in our everyday lives: from Siri, Apple’s “intelligent personal assistant”, to WoeBot, the chatbot therapist, to Travelmate, the suitcase that uses GPS to stay close to your connected smartphone.
As they proliferate, how should we properly address, and relate to, these virtual beings?
IBM distinguished designer Adam Cutler, at AI-Day in Auckland last year, said society is shifting from “a transactional age of computing, to a relationship age”.
Eventually, people will want to date their AI operating systems, he said, alluding to Spike Jonze’s 2013 film, Her, about a man who falls in love with his operating system.
“Why? Pathetic fallacy. We, as humans, want to attribute human feelings to inanimate objects. We want to form relationships.”
In his TED talk, Cutler adds: “For the past 72 years, we’ve been communicating with computers on their terms. All of the user interfaces we’re surrounded by are nothing more than elaborate workarounds for us to share our intent with a computer.
“Today, we’re right on the cusp of an evolution in our relationships with humans and machines. These machines aren’t programmed, they’re taught. This means a machine can understand, reason, learn and interact, and these are the very building blocks of what a machine needs to form and maintain a relationship with a human.”
One way to foster that relationship is for the AI to look, well, human, says Greg Cross, chief business officer at Soul Machines.
“With the technology that’s being developed, we’re going to spend more time interacting with machines. At Soul Machines we’ve got a simple vision: aren’t machines going to be more helpful to us, if they’re more like us?”
Greg Cross, Soul Machines’ chief business officer, believes that by adding human-like faces to AI systems, such as “Rachel”, behind him, consumers will more readily interact with them.
The Auckland-based company is known around the world for its creation of “digital humans” — autonomous, animated individuals that look and sound like real people, powered by virtual central nervous systems.
These digital humans have been employed at banks, airlines, education and healthcare services.
“We believe by adding a face to AI, we’re actually allowing large organisations to provide a much more personalised customer experience,” Cross says. Pilots with digital humans at NatWest branches in the United Kingdom and at Air New Zealand showed consumers were “quite happy” to interact and even form emotional relationships with them.
However, he adds, the aim of digital humans isn’t to replace traditional customer service staff. “The simple reality is there will always be customers who have problems which are very complex, and having resources available to provide real human interaction will be required as well.”