Soul Machines™' latest project with Air New Zealand shows the potential of Digital Humans in customer service

26 September 2017

Air New Zealand has teamed up with Soul Machines to showcase a digital human powered by artificial intelligence at a recent event in Los Angeles, with a view to exploring how this fast-evolving technology can help travelers.

Sophie, the digital human, impressed guests at the North American launch of Air New Zealand’s new global marketing campaign, A Better Way to Fly, with her advanced emotional intelligence and responsiveness as she answered questions about New Zealand as a tourist destination and the airline’s products and services.

The Soul Machines technology behind Sophie uses neural networks and brain models to bring its digital humans to life via the company's cloud-based human computing engine, which sits on top of an artificial intelligence platform powered by IBM Watson. Sophie underwent specific training prior to the LA launch, including teaching her about New Zealand and Air New Zealand, tweaking her Kiwi accent and perfecting her facial expressions.

See the video of Sophie in action below.

Working with Sophie underscores Air New Zealand’s commitment to harnessing technology to improve customer experience. While there’s no current plan to employ Sophie on a permanent basis, experimenting with digital human technology is just one of the airline’s many forays into the innovation space.

Air New Zealand General Manager of Global Brand and Content Marketing Jodi Williams says, “We’re always looking for new ways to improve the travel experience and solve pain points with digital innovation. We’re excited to have had the opportunity to partner with Soul Machines to bring Sophie to life and explore new ways to approach customer experience.”

Greg Cross, Chief Business Officer at Soul Machines, says, “We’re creating some of the world’s first emotionally responsive and interactive digital humans, and we were thrilled to partner with Air New Zealand to showcase Sophie at the Los Angeles event. Sophie learns from every new human interaction, so the experience was invaluable in creating a more tailored and personalized encounter for those who interacted with Sophie.”

IDEALOG FEATURE: Human after all: The rise (and rise) of Soul Machines™

story by Jihee Junn for Idealog | 8 September 2017

New Zealand company Soul Machines is on a mission to reverse-engineer the brain and humanise AI interactions. And it’s making very good progress. Jihee Junn explores the rise of – and potential uses for – its ‘digital humans’.


Extract from the article:

As the name suggests, Soul Machines™ creates emotionally intelligent, lifelike avatars (or, as it prefers to call them, ‘digital humans’) that act as a visual interface for customer service chatbots, virtual assistants and a host of other practical uses.

While artificial intelligence (AI) has become a term even the most technologically inept among us have become familiar with, emotional intelligence (EI) – the capacity to identify and manage one’s own emotions and the emotions of others – has been a term applied more commonly among psychologists than in computer programming circles. But as robotics and automation become increasingly ingrained into the workings of society, experts have realised that to extend the possibilities of AI, they must equip these technologies with the capability to form engaging interactions with humans. In fact, the inclusion of EI is what distinguishes Soul Machines™ from the rest of the pack: its avatars can recognise emotions by analysing an individual’s facial and vocal expressions in real time, while reciprocating these reactions with an unprecedented level of human-like response. Like AI, EI develops through experience – the more it interacts with you, the more emotionally sentient it gets.

These lifelike interactions can most notably be seen in several demonstrations of BabyX run by Soul Machines™ CEO and co-founder Dr. Mark Sagar. With a past career as Weta Digital’s special projects supervisor for blockbusters like Avatar, King Kong and Rise of the Planet of the Apes, Dr. Sagar joined the University of Auckland’s Laboratory for Animate Technologies in 2012, where he began to develop the BabyX technology that now underpins Soul Machines™. BabyX, an interactive virtual infant prototype, appears on screen as a rosy-cheeked, strawberry-blonde, doe-eyed toddler. Just like a real child, BabyX whimpers and cries when it’s insulted or ignored, and smiles and coos when it’s encouraged or entertained.

While the technology behind Soul Machines™ has been a project several years in the making, it’s still a newcomer to the commercial realm, having only formally launched in 2016 after receiving a $7.5 million investment from Hong Kong-based Horizons Ventures. From the start, the company has attracted a huge amount of attention. Elon Musk’s biographer Ashlee Vance visited Sagar as part of his technology show Hello World; Bill Reichert, entrepreneur and managing director of Garage Technology Ventures, listed Soul Machines™ as one of the startups that impressed him the most during a recent visit to New Zealand; and in PwC’s 2017 Commercialising Innovation Report, Soul Machines™ was again cited as a prime example of “leading the way in the AI space”.


BLOOMBERG FEATURE: Mark Sagar Made a Baby in His Lab. Now It Plays the Piano

The AI genius, who has built out his virtual BabyX from a laughing, crying head, sees a symbiotic relationship between humans and machines.

By Ashlee Vance for Bloomberg Businessweek

BabyX, the virtual creation of Mark Sagar and his researchers, looks impossibly real. The child, a 3D digital rendering based on images of Sagar’s daughter at 18 months, has rosy cheeks, warm eyes, a full head of blond hair, and a soft, sweet voice. When I visited the computer scientist’s lab last year, BabyX was stuck inside a computer but could still see me sitting in front of the screen with her “father.” To get her attention, we’d call out, “Hi, baby. Look at me, baby,” and wave our hands. When her gaze locked onto our faces, we’d hold up a book filled with words (such as “apple” or “ball”) and pictures (sheep, clocks), then ask BabyX to read the words and identify the objects. When she got an answer right, we praised her, and she smiled with confidence. When she got one wrong, chiding her would turn her teary and sullen.

If it sounds odd to encounter a virtual child that can read words from a book, it’s far more disorienting to feel a sense of fatherly pride after she nails a bunch in a row and lights up with what appears to be authentic joy. BabyX and I seemed to be having a moment, learning from each other while trading expressions and subtle cues so familiar to the human experience. That’s the feeling Sagar is after with his research and his new company, Soul Machines™ Ltd.