Building machines that help everyone, everywhere.




Who we are

Soul Machines is a ground-breaking high-tech company of AI researchers, neuroscientists, psychologists, artists and innovative thinkers, re-imagining what is possible in Human Computing.

We bring technology to life by creating incredibly life-like, emotionally responsive Digital Humans with personality and character that allow machines to talk to us literally face-to-face!

Our vision is to humanize computing to better humanity.

What we do

We use Neural Networks that combine biologically inspired models of the human brain and key sensory networks to create a virtual central nervous system that we call our Human Computing Engine.

When you 'plug' our engaging and interactive Digital Humans into our cloud-based Human Computing Engine, we transform modern life for the better by revolutionizing the way AI, robots and machines interact with people.


Announced at Mobile World Congress 2018: Soul Machines has entered into a new partnership with Daimler Financial Services to create Sarah, a digital human designed to help customers with personalized assistance for the company's services, including car financing, leasing and insurance. Watch the video showcasing this exciting new partnership.

Sarah combines artificial and emotional intelligence for a completely new experience that redefines the link between humans and machines. Check out the latest press coverage on Sarah.

Soul Machines is changing the landscape of banking with Cora, a digital human created for UK bank NatWest, a subsidiary of banking giant the Royal Bank of Scotland. Running as a pilot program, Cora gives customers a dynamic way to get quick, accurate answers to everyday banking questions. Read more about this banking first.

Kevin Hanley, Director of Innovation at the Royal Bank of Scotland Group, talks about the role of Cora.


Rachael Rekart of Autodesk, live on the Cheddar news site, discussing how Autodesk has partnered with Soul Machines to create AVA.


Cora is causing quite a stir across global media.

Check out the press coverage and take a look at the BBC video.

What makes you get up in the morning? This is the question posed to Mark Sagar and Greg Cross of Soul Machines. Take a look at what they said.





When we, as human beings, interact face-to-face, it's on the basis of both intellectual and emotional engagement. It's in our DNA. It's something we do naturally. What if machines were able to do this with us as well?

With their unprecedented level of intelligence and natural expressions, our life-like Digital Humans can connect with us in a much more human way. By analyzing reactions and learning in real time, they not only recognize emotional expressions but also respond appropriately and interactively.

Our emotionally intelligent Digital Humans are opening the doors to a new era of human-style customer experience across a wide range of industries leading the way in this era of digital disruption, including Automotive, Financial Services & Banking, Healthcare, Media, Software and Technology.


The 3D Face



The 3D Faces we create are as close to the real thing as we can make them. They are the most important instrument of emotional expression and engagement between people. We model the face in detail, from the way the facial muscles create complex expressions all the way through to the eyes that reflect what they see. We are developing full bodies for our Digital Humans with the same physiological control systems. Our Digital Humans are perfect for AR and VR.

Personality

Every one of our Digital Humans comes with its own personality. We create the character behind the face based entirely on the role the Digital Human will have in the "real" world. If, for example, the Digital Human will be a Virtual Customer Agent, we will incorporate a range of emotional responses, expressions, and behaviors that are consistent with the role and the core values of the organization they will be representing.








Our Neural Network Models are at the core of our Embodied Cognition Platform. They were originally developed in our BabyX research program to give BabyX the ability to learn in real time, express itself, speak, and recognize words and objects. Processing information from sensory inputs to generate behavior, these neural system models enable our Digital Humans to express themselves based on the people they interact with.

We have developed biologically inspired models of the brain that are responsible for some of the key capabilities of our Digital Humans. These are controllable by virtual neurotransmitters and hormones such as dopamine, serotonin, and oxytocin. Together they influence virtual physiological states which guide learning and behavior, modulating the emotions that our Digital Humans "feel" and express.
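
To make the idea concrete, here is a deliberately simple toy model of such neuromodulation. It is only an illustrative sketch: the class name, the update rules, and the thresholds are all our own assumptions for explanation, not Soul Machines' actual implementation.

    # Illustrative toy model only: a hypothetical sketch of how virtual
    # neurotransmitter and hormone levels might modulate a Digital Human's
    # expressed emotion. Names, dynamics, and thresholds are invented for
    # illustration; they are not Soul Machines' actual system.
    from dataclasses import dataclass


    def clamp(x: float, lo: float = 0.0, hi: float = 1.0) -> float:
        return max(lo, min(hi, x))


    @dataclass
    class VirtualNeurochemistry:
        dopamine: float = 0.5   # reward / motivation signal, range 0..1
        serotonin: float = 0.5  # mood-stability signal, range 0..1
        oxytocin: float = 0.5   # social-bonding signal, range 0..1

        def apply_event(self, reward: float, social_warmth: float) -> None:
            """Nudge the virtual chemistry in response to an interaction event."""
            self.dopamine = clamp(self.dopamine + 0.3 * reward)
            self.oxytocin = clamp(self.oxytocin + 0.3 * social_warmth)
            # Serotonin drifts slowly toward the average of the other two.
            self.serotonin = clamp(
                0.9 * self.serotonin + 0.1 * (self.dopamine + self.oxytocin) / 2
            )

        def expressed_emotion(self) -> str:
            """Map the current chemical state to a coarse emotional label."""
            if self.dopamine > 0.7 and self.oxytocin > 0.6:
                return "joyful"
            if self.serotonin < 0.3:
                return "subdued"
            return "neutral"


    if __name__ == "__main__":
        chem = VirtualNeurochemistry()
        chem.apply_event(reward=0.8, social_warmth=0.9)  # e.g. a warm greeting
        print(chem.expressed_emotion())  # -> joyful

In a real system of this kind, the chemical state would presumably feed continuous facial-animation parameters rather than a coarse label; the label simply keeps the sketch readable.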

Intelligent Sensors provide our Digital Humans with the ability to see via a webcam and hear via the device's microphone. These sensors are just the beginning of the digital nervous system that controls many of the physiological inputs that help bring our Digital Humans to life.


Visual & Auditory Systems




Visual and Auditory systems provide the data that feeds our identification, emotion detection, and analysis systems. Our Auditory systems are also responsible for providing the captured voice stream to the Natural Language Processing (NLP) engine which in turn asks questions of the AI platform.
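
The flow just described (captured voice, NLP engine, AI platform, answer) can be pictured as a short pipeline. The sketch below is a hypothetical outline of that flow; every function name in it is a placeholder we invented, not part of Soul Machines' API.

    # Hypothetical sketch of the sensory pipeline described above: captured
    # audio is transcribed, the utterance is sent to an NLP engine that asks
    # questions of the AI platform, and the answer comes back to drive the
    # Digital Human's spoken, lip-synced response. All names are placeholders.


    def transcribe(audio_frames: bytes) -> str:
        """Stand-in for speech-to-text on the captured voice stream."""
        return "What interest rate do you offer on car leases?"


    def query_ai_platform(utterance: str) -> str:
        """Stand-in for the NLP engine querying the AI platform."""
        return f"Here is what I found about {utterance!r}."


    def respond(audio_frames: bytes) -> str:
        """End-to-end pass: hear -> understand -> answer."""
        utterance = transcribe(audio_frames)   # auditory system output
        answer = query_ai_platform(utterance)  # NLP -> AI platform round trip
        return answer  # in a real system this would drive voice and lip-sync


    if __name__ == "__main__":
        print(respond(audio_frames=b"\x00\x01"))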

Voice and speech are created specifically for each Digital Human, depending on the language and/or accent required. To ensure the most life-like facial expressions while talking, we train the muscles and lip movement to match the voice. We have even given people with deafness the ability to lip-read.