SOUL MACHINES™ BLOG: The Healing Machine

Digital Humans – Counsellors of Tomorrow

“What we achieve inwardly will change outer reality.” Plutarch – Greek Philosopher (45 AD – 120 AD)

The mind is a rich and complex puzzle. A perpetually intriguing mass of grey matter that sparks and fires in endless activity, challenging even the greatest of neuroscientists. We are only just beginning to glimpse what really goes on underneath. One of the questions that remains unanswered is: how do we stem the global rise of mental health conditions?

According to the World Health Organisation, ‘20% of the world’s children and adolescents have mental health disorders or problems.’ Meanwhile, war and disaster are, understandably, having a massive impact on the psychosocial well-being of vast swathes of adults.

The dark stigma associated with seeking out mental health support is a fundamental barrier for many, like a black cloud of discrimination hanging over them. Judgement comes all too easily when someone states they’re ‘seeing a shrink!’ It’s not something you often hear in conversation at the water-cooler. And what about those caught in the midst of conflict or economic crisis – where can they find solace and support when the very fabric of their society is crumbling? The WHO Mental Health fact file (2014) highlights inadequate human resourcing ‘for mental health especially in low and middle income countries.’ Meanwhile, some military personnel return from action unharmed in the physical sense, yet wounded nonetheless. For these soldiers, post-traumatic stress is a war they never imagined they’d have to fight.

With mental health conditions affecting so many people across the world, Digital Humans may well be the answer to helping manage the growth of this issue. A Digital Human as a Mental Healthcare Assistant is a financially viable and accessible option for regions struggling with meagre human resources. And for those wrestling with revealing their innermost anguish, a Digital Human can unlock doors of the mind previously closed to humans, by the mere fact that they aren’t real people! In studies exploring depression at USC’s Institute for Creative Technologies (2014), patients showed a greater willingness to disclose secrets to virtual assistants than with a human present. There was no feeling of judgement or pity, so study subjects felt more at ease and better able to show signs of sadness.

AI technology has continued to advance since then, and with the rise of Digital Humans like those Soul Machines™ creates – with their empathetic listening and ability to read facial expressions – mental health support could be raised even higher. A sufferer could build a strong rapport with a Digital Human who looks and reacts like a real person, yet feel secure in the knowledge that their vulnerabilities will not be judged.

AI technology can revolutionise medicine. Digital Humans will not only aid mental health sufferers but also act as an important resource on the front line, delivering invaluable support to health specialists. Medical staff could rely on these virtual healthcare assistants as intermediaries in diagnosis, drawing on invaluable data from their revealing conversations with mental health patients.

And as healthcare systems around the world drown under the weight of maintaining hospital records, Digital Humans can take the strain by managing and analysing patient data far more efficiently than real people. At Mount Sinai Hospital in New York (2015), a research group applied deep learning to patient records and discovered hidden patterns in the data. The project, aptly named Deep Patient, was surprisingly good at predicting the onset of psychiatric disorders like schizophrenia.

Digital Humans won’t replace the human side of mental health treatment, but they will provide a new and vital component of it. As the world opens its doors to AI technology, the Digital Humans of today could well be the counsellors of tomorrow!

SOUL MACHINES™ BLOG: Face-to-Face with a Digital Human

“The real voyage of discovery consists not in seeking new landscapes, but in having new eyes.”

Marcel Proust

Our latest Digital Human


We’re now well into the 21st century – who’d have thought? – and the words ‘artificial intelligence’ are buzzing like swarms of bees around the globe. It’s a time when we imagine and fear the unknown playing field of futurism, and yearn for the next user experience that will win the match.

Going beyond straightforward chatbots to the more complex voice-based generalists – the Alexas, Siris and Cortanas of this world – we are now in a realm where serving up information is what we all need and want. But is interconnectivity of devices and the ability to answer endless questions where the AI juggernaut will park itself? Humans shouting demands at inanimate objects (albeit ones product-designed to the max) seems an isolating place to be. So hold that thought and consider this: how about the more natural experience of engaging with a computer that features an intelligent, human-like personality with a face that not only hears but sees you? A face that registers and responds to your own facial reactions. Gone is just the talking to; now you’re talking with a Soul Machines™ digital human. Surely this is a far better rapport than just bashing out questions on a keyboard or shouting at a device, especially if the digital human shows a sense of humour by throwing the odd curveball into the conversation!

“The face is a picture of the mind with eyes as its interpreter”

Marcus Tullius Cicero

What’s in a face?

Face-to-face communication is an intrinsic part of what makes us human. It establishes trust. If someone doesn’t look you in the eye when they talk to you, it’s unnerving, irritating even, and the conversation quickly loses value and meaning. Visual cues to what a person truly means are painted across the face, adding the clarity of non-verbal communication to the conversation. Soul Machines™ digital humans have the ability to read whether someone is confused, happy or unsure through their expressions, and can then direct the conversation accordingly.

Face-to-face dialogue is vital to humans. In truth, we have “dedicated brain regions” for recognising faces and facial expressions. Face-to-face interaction also releases ‘the cuddle chemical’, oxytocin, which plays a key role in strong social bonding (especially between a mother and child). It also has the power to regulate fear and anxiety. Put your hand up if you’ve ever felt that surge of panic when you’ve spent an eon filling out a crucial form online, only to hit the ‘next’ button and see the screen sucked into a cyberspace black hole. Imagine instead you’re talking to a digital human who not only projects a sense of calmness with their dulcet tones and facial engagement, but who can also fill out the form for you at the same time. How satisfying would that make your user experience?

Technology is now deeply embedded in our world, but social media and smartphones, with all their tweets, texts and transcripts, have built a wall against face-to-face interaction, brick by brick. If a more natural engagement can be sparked through human-like interaction with smart tech, then surely the walls of faceless conversation will come tumbling down. For yes: with the advent of digital humans, the art of face-to-face conversation is back.

Viva la Revolution!


Further Interesting Reading: Neural synchronisation during face-to-face communication


Capturing the attention of Newshub

In the news today:

Kiwi startup Soul Machines reveals latest artificial intelligence creation, Rachel

by Simon Shepherd at Newshub, July 9, 2017


A Kiwi company developing artificial intelligence has delivered its latest digital human, called Rachel.

Rachel can see, hear and respond to you.

She is an avatar created by two-time Oscar winner Mark Sagar, who worked on the blockbuster movie of the same name.

Mr Sagar, of Auckland-based company Soul Machines™, says his aim is to make man socialise with machine by putting a human face on artificial intelligence.

“So what we are doing with Soul Machines™ is trying to build the central nervous system for humanising this kind of computer,” he says.

A favourite theme of Hollywood, the interaction between human and computer is already here in much simpler forms, from Siri on your iPhone to virtual assistants in your home.

China’s third-largest technology company Baidu has just announced artificial intelligence is its major focus, including driverless cars.

Soul Machines™’ goal is just as complex – emotions. The startup’s prototype was Baby X, which gets upset and needs reassurance when Mr Sagar hides, and can also recognise pictures.

The technology is advancing so quickly that a later version already helps people in Australia with disabilities.

And the version after that is so detailed it has a warning on its YouTube video – this is not real.



A Digital Brain for Digital Humans – Mark Sagar at Cannes Lions Festival

In the news at Beet TV:

Virtual Brain Models put a Face on Big Data: AI Guru Sagar

by Robert Andrews, June 25, 2017

CANNES — At this point in the early development of artificial intelligence, many people probably assume that typical AI applications revolve around textual deployments.

But what if you could use AI to create lifelike digital brains that, implanted in 3D facial models, could give life and character to virtual avatars?

As far-fetched as it may seem, that is exactly what Mark Sagar has done. The CEO of Soul Machines™ says his company has used IBM’s Watson cognitive computing service to inject emotion into computer-generated movie characters – but the tech is not going to stop at Hollywood.

“How do we make characters that have their own digital life?” asks Auckland-based Sagar, during this panel interview at Cannes Lions. “You almost have to give it a nervous system, a digital brain, so it can think for itself.”

Sagar, who first pioneered the technology whilst working on the movie King Kong and who later built upon his work for Avatar, may be used to working with scripted characters – but these AI creations don’t necessarily have to follow the paths laid out for them.

“We have biologically-constrained cognitive architectures – these are brain models,” he says. “You don’t know how it’s going to act; it will have memory and so forth.

“The models can sense the environment, they can react, they can learn in real-time and we can connect those to Watson – you (can) have a conversation with it.”

Why is Sagar in Cannes, where the world’s advertisers and creative agencies are out in force to hear about what’s new and what’s next?

Because AI-driven facial models could help brands and enterprises create avatars that interact with customers in lifelike ways, tapping into the vast databases behind them and conveying that data with emotional mannerisms.

“If you’re a company and have big data that you want to go through, we can put a living face on it,” Sagar adds.

This interview panel was chaired by The Weather Company CMO Jordan Bitterman. The Weather Company was acquired by IBM in 2015 and, together, the pair are leveraging IBM’s Watson to work on a range of AI-powered initiatives.

This video is part of Beet.TV’s AI Series from Cannes Lions 2017, presented by The Weather Company, an IBM Business. For more from the series, please visit this page.

Mark Sagar at Cannes Lions 2017 speaking about bringing brands to life

In the news today on Beet TV:

‘We Could Replicate Anybody’: Sagar Brings Brand Bots To Life

by Robert Andrews, July 4, 2017, Beet TV

CANNES — As Cannes Lions played host to plenty of discussions about the role of artificial intelligence in advertising and marketing, many might have wondered how far off some of the technologies may be.

Certainly, tools like 3D brand avatars imbued with lifelike emotions and empathy may seem far-fetched. But they are real, here and now, said one pioneer pitching the tech to advertisers today.

“This technology is available today,” said Mark Sagar, CEO of Soul Machines™, the company responsible for the systems. “We could basically replicate any person.”

By that, Sagar means that the 3D technology his company built, originally for movie studios to bring characters to life, has moved beyond animation itself – those 3D character models are now being injected with algorithmic profiles that mimic human emotions, tics and responses, even to viewers they can see through their own digital cameras.

And Sagar thinks brands could use the same tech to bring brand avatars to life.

“It could be a celebrity, it could be a spokesperson or whatever,” he said. “If you think about how you approach representing a brand when you get a celebrity … you’re embodying their traits in that. We can bring them to life and have them interact.”

This is a world away from optimising display ads’ click-through engagement. In this future world, an army of technologies would be deployed to mimic human characters employed to interact in lifelike conversations with customers and prospects. It’s a closing of the gap that Sagar thinks will yield results.

“When you interact, you invest,” he added. “You start personalising things. You can have a spokesperson or representative of a particular brand have a relationship with you – it will remember you, your preferences, adapt its behaviour.

“You start forming a stronger relationship with the brand in that way. By adding life to things, we can’t ignore it.”
