Building machines that help everyone, everywhere.
 

Humanizing Computing
 

Who we are

Soul Machines is a ground-breaking high-tech company of AI researchers, neuroscientists, psychologists, artists, and innovative thinkers, re-imagining what is possible in Human Computing.

We bring technology to life by creating incredibly life-like, emotionally responsive Digital Humans with personality and character that allow machines to talk to us literally face-to-face!

Our vision is to humanize computing to better humanity.


What we do

We use Neural Networks that combine biologically inspired models of the human brain and key sensory networks to create a virtual central nervous system that we call our Human Computing Engine.

When you 'plug' our engaging and interactive Digital Humans into our cloud-based Human Computing Engine, we can transform modern life for the better by revolutionizing the way AI, robots and machines interact with people.


OUR LATEST NEWS AND BLOGS


Bloomberg Feature: Mark Sagar Made a Baby in His Lab. Now It Plays the Piano

The AI genius, who has built out his virtual BabyX from a laughing, crying head, sees a symbiotic relationship between humans and machines.


Idealog Feature: Human after all: The rise (and rise) of Soul Machines

New Zealand company Soul Machines is on a mission to reverse engineer the brain and humanise AI interactions. And it’s making very good progress. Jihee Junn explores the rise of – and potential uses for – its ‘digital humans’. 

 
 

Emotional Intelligence

 
 

When we, as human beings, interact face-to-face, it's on the basis of both intellectual and emotional engagement. It's in our DNA. It's something we do naturally. What if machines were able to do this with us as well?

With their unprecedented level of intelligence and natural expression, our life-like Digital Humans can connect with us in a much more human way. By analyzing reactions and learning in real time, they not only recognize emotional expressions but also respond appropriately and interactively.

Our emotionally intelligent Digital Humans are opening the doors to a new era of human-style customer experience across a wide range of industries leading the way in this era of digital disruption - including Automotive, Financial Services & Banking, Healthcare, Media, Software, and Technology.

Air New Zealand and Soul Machines show the potential of Digital Humans in customer service.

 
 

The 3D Faces

 
 

The 3D Faces we create are as close to the real thing as we can make them. The face is the most important instrument of emotional expression and engagement between people. We model it in detail, from the way the facial muscles create complex expressions all the way through to the eyes that reflect what they see. We are developing full bodies for our Digital Humans with the same physiological control systems, and our Digital Humans are perfect for AR and VR.

Personality. Every one of our Digital Humans comes with its own personality. We create the character behind the face based entirely on the role the Digital Human will play in the "real" world. If, for example, the Digital Human will be a Virtual Customer Agent, we will incorporate a range of emotional responses, expressions, and behaviors consistent with the role and the core values of the organization it will be representing.

 
 

 

 

Neural Network Models

 
 

Our Neural Network Models are at the core of our Embodied Cognition Platform. They were originally developed in our BabyX research program to give BabyX the ability to learn in real time, express itself, speak, and recognize words and objects. Processing information from sensory inputs to generate behavior, these neural system models enable our Digital Humans to express themselves based on the people they interact with.

We have developed biologically inspired models of the brain that are responsible for some of the key capabilities of our Digital Humans. These are controllable by virtual neurotransmitters and hormones like dopamine, serotonin, and oxytocin. Together they influence virtual physiological states that guide learning and behavior, modulating the emotions our Digital Humans "feel" and express.
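To make the idea concrete, here is a minimal sketch of how virtual neurotransmitter levels might modulate an emotional state. The class, weightings, and valence/arousal mapping are illustrative assumptions, not Soul Machines' actual model.

```python
from dataclasses import dataclass

@dataclass
class VirtualNeurochemistry:
    """Illustrative virtual neurotransmitter/hormone levels in [0, 1]."""
    dopamine: float = 0.5   # reward / motivation signal
    serotonin: float = 0.5  # mood stability
    oxytocin: float = 0.5   # social bonding / trust

def emotional_state(chem: VirtualNeurochemistry) -> dict:
    """Map neurochemistry to a simple valence/arousal emotional state.

    The weights below are arbitrary placeholders chosen for illustration.
    """
    valence = 0.5 * chem.dopamine + 0.3 * chem.serotonin + 0.2 * chem.oxytocin
    arousal = 0.7 * chem.dopamine + 0.3 * (1.0 - chem.serotonin)
    return {"valence": valence, "arousal": arousal}

# A smile from the user might nudge dopamine and oxytocin upward,
# shifting the Digital Human toward a warmer, more animated expression.
chem = VirtualNeurochemistry(dopamine=0.8, oxytocin=0.7)
print(emotional_state(chem))
```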

Intelligent Sensors provide our Digital Humans with the ability to see via a webcam and hear via the microphone in the device. These senses are just the beginning of the digital nervous system that controls many of the physiological inputs that help bring our Digital Humans to life.
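As a rough sketch of the sensor side, the webcam and microphone feeds might be captured like this before being handed to the perception systems. The OpenCV and sounddevice choices are assumptions for illustration, not a description of the production stack.

```python
import cv2                # webcam capture
import sounddevice as sd  # microphone capture

def capture_senses(duration_s: float = 0.5, sample_rate: int = 16000):
    """Grab one video frame and a short audio chunk as sensory inputs.

    Everything downstream (emotion detection, NLP, behavior) is out of
    scope here; this only shows the raw sensor capture.
    """
    camera = cv2.VideoCapture(0)   # default webcam
    ok, frame = camera.read()      # one BGR frame; ok is False on failure
    camera.release()

    audio = sd.rec(int(duration_s * sample_rate),
                   samplerate=sample_rate, channels=1)
    sd.wait()                      # block until the recording finishes

    return (frame if ok else None), audio
```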

 
 
 

Visual & Auditory Systems

 
 

Visual and Auditory systems provide the data that feeds our identification, emotion detection, and analysis systems. Our Auditory systems are also responsible for providing the captured voice stream to the Natural Language Processing (NLP) engine, which in turn asks questions of the AI platform.
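In rough terms, that auditory path is a voice-to-answer pipeline like the sketch below. The stub functions are hypothetical stand-ins for whatever speech-recognition, NLP, and AI services back the real system.

```python
def speech_to_text(audio_chunk: bytes) -> str:
    """Stub: a real system would call a speech-recognition service here."""
    return "what is my account balance"

def extract_intent(transcript: str) -> str:
    """Stub: a real NLP engine would classify the user's request here."""
    return "account_balance_query"

def query_ai_platform(intent: str) -> str:
    """Stub: a real deployment would ask the backing AI platform here."""
    return "Your account balance is available in the app."

def handle_utterance(audio_chunk: bytes) -> str:
    """Illustrative auditory path: voice stream -> NLP -> AI platform -> reply."""
    transcript = speech_to_text(audio_chunk)  # captured voice stream to text
    intent = extract_intent(transcript)       # NLP engine parses the request
    return query_ai_platform(intent)          # AI platform answers the question

print(handle_utterance(b"raw-audio-bytes"))
```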

Voice and speech are created specifically for a Digital Human depending on the language and/or accent required. To ensure the most life-like facial expressions while talking, we train the muscles and lip movement to match the voice. We have even given people who are deaf the ability to lip-read.
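One common way to drive lip movement from a voice track is a phoneme-to-viseme mapping, sketched generically below. This is an illustration of that general technique, not Soul Machines' trained pipeline, and the mapping table is a simplified assumption; real tables cover far more mouth shapes and are typically learned.

```python
# Simplified phoneme-to-viseme table; each viseme names a mouth shape
# that the facial muscle rig can pose while the voice plays.
PHONEME_TO_VISEME = {
    "AA": "open_jaw",      # as in "father"
    "IY": "wide_smile",    # as in "see"
    "UW": "rounded_lips",  # as in "you"
    "M":  "closed_lips",   # as in "mom"
    "F":  "lip_to_teeth",  # as in "fun"
}

def visemes_for(phonemes: list[str]) -> list[str]:
    """Map a timed phoneme sequence to mouth shapes for the facial rig."""
    return [PHONEME_TO_VISEME.get(p, "neutral") for p in phonemes]

print(visemes_for(["M", "AA", "M"]))  # ['closed_lips', 'open_jaw', 'closed_lips']
```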