Greg Cross talks to CBC News about putting a face on machines

Virtual infant BabyX prompts question: how do we feel about AI that looks so much like us?

New Zealand-based research group wants to create artificial intelligence emulating human gestures, functions

Excerpt from story by Ramona Pringle – Technology Columnist – CBC News – 13 November 2017

The eyes are the windows to the soul. That’s what we say about humans — but what about with robots or humanoid simulations? Is a realistic complexion or an earnest gaze the key to seeing artificial beings as more than, well, artificial?

That’s the premise behind BabyX, the lifelike virtual infant from the New Zealand-based research group Soul Machines™, whose goal is to humanize artificial intelligence (AI). The group’s work is in many ways unprecedented as it develops robots that emulate not only human gestures but also actual human functioning.

But it also raises a question: Is human likeness something that we want from our machine counterparts? Or, conversely, does it make humans slightly nervous when artificial beings look too much like us?

Meet BabyX

BabyX is a hyper-realistic screen-based simulation of an infant, with rosy cheeks and wide, sparkling eyes. Its lifelike appearance is a result of both art and engineering.

Soul Machines™’ founder, Mark Sagar, is an award-winning special effects artist who has worked in digital character creation for blockbuster films like Avatar and King Kong. He has developed a unique appreciation for the minutiae of human expression. In that way, the computer-generated “people” he creates are in a league of their own, with appearances and movements that are remarkably close to those of humans.

And that is no small feat. After all, it’s one thing to look human, but it’s a whole other thing to move realistically, explains Michael Walters, a senior lecturer in the School of Computer Science at the University of Hertfordshire.

“It’s very difficult to get a robot to not just look right but move right, as well,” says Walters, who is also a researcher with the university’s multidisciplinary Adaptive Systems Research Group. 

“We’ve seen various humanoid robots, but we aren’t fooled by them for very long. They’re close but not quite right.”

Sagar’s team at Soul Machines™ is working to make virtual beings that are persuasively lifelike — not just in how they look but in how they move and react to stimuli. That’s due in large part to the way they’re approaching this 21st-century challenge: they’re endeavouring to build a simulated brain.

Finding the human connection

An interdisciplinary team that includes neuroscientists and physiologists “is now building biologically inspired models of the human brain,” using the concepts of neural networks and machine learning to build a virtual nervous system, says Greg Cross, Soul Machines™’ chief business officer.

Their goal, he says, is to understand how humans work, and “figure out how we learn to interact with others, and how we learn to create.” When their autonomous virtual infant smiles, it’s not because of a line of code directing it to do so following certain prompts or inputs — it’s in reaction to virtual dopamine and endorphins, the release of which is triggered by real-world stimuli and interactions. In other words, the same things that make humans smile.

“By putting a face on machines they become more human-like,” says Cross. “The most powerful instrument we have to show our emotions is the human face.”

Soul Machines™’ team of developers is striving to reach the benchmarks of “emotional intelligence, understanding and responding to emotion,” he adds.

In this way, the research group is differentiating its creations from the current wave of consumer robots on the market. Cross sees its AI as the inevitable evolution of faceless virtual assistants like Siri and Alexa, which are now in millions of homes and businesses all over the world.

“Humanoid robots are to virtual assistants what television was to radio,” he says.