CMO: How virtual humans could transform the brand experience

The rise of artificial intelligence is bringing with it the advent of a new age of virtual humans. We look at their impact on service and engagement

Excerpt from story by Brad Howarth (CMO) | 28 November 2017

For years, marketers have talked about brands as having personalities. Now they have the tools to bring those brands to life – virtually at least.

Rapid developments in artificial intelligence (AI) are being combined with Academy Award-winning animation skills to create virtual humans that are the closest yet to flesh and blood. And for brands, that offers the opportunity to put a very human-looking face on a corporate body.

One of the latest iterations of these virtual humans comes from Auckland-based company Soul Machines™, whose co-founder and CEO, Mark Sagar, won consecutive Oscars for his ground-breaking work on computer-generated faces in the films King Kong and Avatar. Now Soul Machines™ is applying those skills to the commercial world, creating virtual humans as the new face of brands.

According to Soul Machines™’ chief business officer, Greg Cross, virtual humans give consumer-facing organisations an opportunity to completely change the economics of highly personalised service.

“When you interact with one of our virtual humans as a virtual concierge or a virtual customer service agent, it is going to be a highly personalised experience,” Cross says.

“These virtual employees are going to remember you every time you interact with them. They will have built up a profile of your personality, of what products and services you love, and which ones you don’t. They will have knowledge of problems you have had and whether they have been resolved to your satisfaction.

“What that means is we now have the ability to deliver highly personalised service on a very, very large scale using these virtual employees.”

The technology is being trialled at numerous organisations in banking, technology, healthcare, education, and transport. It has also been used to create Sophie, a pilot customer interaction project for Air New Zealand.

“The early customers we are working with really see their first virtual employee as an extension of their brand,” Cross says. “They go through a process of designing their first virtual employee in the same way they would select a celebrity to represent their brand in a television advertisement.”

While the quality of the animations developed by Soul Machines™ is incredibly lifelike, it is the AI model behind its virtual humans that brings them to life. Cross says this is based on mimicking human thought processes and models of the brain.

“Mark [Sagar] asked ‘what if I built biologically-inspired models of different parts of the human brain, and wired all this together into a virtual nervous system?’,” Cross says. “Could we create emotionally responsive, human-like, engaging characters? And that is what we have done.

“As best we know we are the only company in the world that can bring these digital characters to life using a virtual nervous system and neural networks that continuously learn.”

For the full story click here

NEWSHUB: Soul Machines™ develops a Digital Human

Newshub feature by Simon Shepherd | Newshub | 24 November 2017

To watch the video click here

Breaking into America is a milestone for any Kiwi start-up and the latest has done it with a digital human.

You may not have heard of this company, but its showreel has everything from buildings to special effects, all designed on its software.

Soul Machines™ makes artificial intelligence ‘human-like’ and one of its creations will be the new face of a global software company.

Autodesk turns over $2 billion, and it’s come to downtown Auckland and Kiwi company Soul Machines™ for the latest in artificial intelligence customer service.

“[It’s] our first big US customer, so that’s always a big milestone for a Kiwi start-up,” says Soul Machines™ chief business officer Greg Cross.

Autodesk’s current customer service chatbot will be replaced with Ava.

“Ava is a virtual customer service agent, to bring a whole new level of personalisation and brand experience to that customer experience on a day-to-day basis,” Mr Cross says.

Soul Machines™’ digital humans are avatars with a central nervous system that can be mapped to show how they respond.

Soul Machines™ says these machines will be more useful to us and more natural to interact with.

New Zealand’s tech sector is now our third biggest export industry – and Soul Machines™ is now an important part of that growth.

Newshub.

FAST COMPANY: Are you ready for bots to read your face?

Story by Sean Captain | Robot Revolution | Fast Company | 11.15.17

[Screenshot: Autodesk/Soul Machines™]

Would you turn on your webcam so that a customer service robot can get to know you better?

Soul Machines™, a New Zealand startup, thinks so. It builds a customer service bot with an amazingly human face and a simulated nervous system that interprets how customers feel and reacts accordingly—in part by watching them over a webcam. As I reported today, design software maker Autodesk will be the first big client to try out the technology next year—what Soul Machines™ calls a “digital human”—with a remake of its AVA customer service bot. (Below, a video about the creation of AVA.)

While many companies boast about personalized service but can’t afford to hire enough people (and while most existing customer service bots haven’t exactly managed to fill in the gap), Soul Machines™ sees an opportunity for new CGI and AI-driven techniques. AVA’s photorealistic appearance—based on scans and recordings of actress Shushila Takao—is an outgrowth of the work the company’s founder, Mark Sagar, has done as a CGI engineer for Hollywood films like Avatar; her “emotional intelligence,” guiding how she responds to human cues, comes from his AI research simulating a human nervous system in software.

AVA, says Sagar, can read cues in people’s expressions, such as a smile or a furrowed brow, to get a sense of a customer’s disposition. Are they happy and open for pleasant chat, or impatient and wanting to get to the point? Or are they just freaked out by a robot staring at them, reading their expressions and possibly classifying their emotions?

“I think that is going to be the biggest challenge of it,” Gregg Spratto, Autodesk’s VP of operations, told me about the concerns that Autodesk’s privacy-conscious users might have about a face-scanning chatbot. “And I think a lot of our success will be dependent on how we convince people to give that a try.”

AVA also provides a fallback: she’s designed to pick up nuances in people’s voices too. So even with the webcam off, she can still get a better feel, so to speak, for the human she’s talking to.

NATIONAL BUSINESS REVIEW: Soul Machines™ partners with Autodesk to create digital employee

Excerpt from story by Rebecca Howard | NBR | November 16, 2017

Soul Machines™, the Auckland-based developer of intelligent, emotionally responsive avatars, has joined forces with 3D design, engineering and entertainment software developer Autodesk to create the US company’s first digital employee.

AVA (Autodesk virtual agent), a virtual customer service agent, will interact with customers 24/7 to resolve any issues. However, unlike Autodesk’s current virtual assistant that is text-based, its virtual nervous system will give AVA the ability to see, hear and respond emotionally, according to Soul Machines™.

“This is where our technology is particularly unique. These digital characters do have a virtual nervous system, using AI technology and models of different parts of the human brain. That’s how we bring them to life in a very human-like way,” said Greg Cross, chief business officer for Soul Machines™ in an interview.

To read the full story click here.

COMPUTERWORLD: Soul Machines™ gives Autodesk chatbot a human face

Soul Machines™ says the humanised AVA will enable customers to get answers to questions, be directed to content and complete transactions

Story by Stuart Corner (Computerworld New Zealand) 16 November 2017

Soul Machines™, a spinout from the University of Auckland Bioengineering Institute, is developing a digital human interface to Autodesk’s customer assistance chatbot, the Autodesk Virtual Agent (AVA).

Soul Machines™ says the humanised AVA will enable customers to get answers to questions, be directed to content and complete transactions.

“Soul Machines™ is advancing AVA’s capabilities, with a digital human face and persona that literally brings AVA to life using [our] world-leading Human Computing Engine (HCE),” the company said.

However, to take full advantage of the humanoid AVA, Autodesk customers will need to turn on the video on their phone or computer so AVA can see them.

The HCE is described as a virtual nervous system that combines neural networks and biologically inspired models of the human brain, and will give AVA “the ability to see and hear as well as sensory systems that enable it to recognise and respond emotionally in an incredibly human-like way.”

Soul Machines™’ CBO Greg Cross said: “Talking to one of our digital humans means you will get the same sort of social responses and non-verbal communication cues as if you were sitting face to face across a table from a real person. It means our customers can deliver highly personalised brand accretive experiences in a way they have not been able to afford to do up till now.”

Soul Machines™ CEO Dr Mark Sagar said: “[Ava] has a virtual nervous system and all kinds of sensory capabilities so she can respond to the user’s behaviour in real time to facilitate the communication.”

For the full story click here

HOT OFF THE PRESS: Soul Machines™ Partners with Autodesk to launch AVA at Autodesk University 2017

Soul Machines™ today announced it has partnered with Autodesk, a builder of software that helps people imagine, design and create a better world, to create a digital human version of Autodesk’s Virtual Agent, AVA. Announced at Autodesk University, Autodesk’s annual conference in Las Vegas, AVA will provide customers access to a service agent 24/7—answering questions, directing them to content and completing transactions.

Soul Machines™ is advancing AVA’s capabilities with a digital human face and persona that literally brings AVA “to life” using its world-leading Human Computing Engine™ (HCE). The Soul Machines™ HCE is a Virtual Nervous System™ that combines neural networks and biologically inspired models of the human brain. AVA’s Virtual Nervous System gives her the ability to see and hear, as well as sensory systems that enable her to recognize and respond emotionally in an incredibly human-like way.

“The future of highly personalized customer engagement is a blend of digital and human support, and AVA, our first digital employee, is just that. She will understand human consciousness and interactions by reading signals such as body language and facial reactions—in turn learning more about customers to better serve them. The addition of emotional intelligence to AVA takes our customer service beyond purely transactional to relational.” – Rachael Rekart, senior manager for machine-assisted service engagement at Autodesk

By putting a face on artificial intelligence and extending the potential for customer interaction beyond text-based chatbots and voice assistants, we enable large corporations and organizations all over the world to enter a whole new era of personalized service and democratized knowledge transfer.

“As one of the world’s leading global software companies, we are excited that Autodesk will be among the first to deploy our technology on top of their IBM Watson cognitive platform. If the future of software is AI and 3D graphics, this is an extremely exciting partnership for both Autodesk and Soul Machines™.” – Greg Cross, Chief Business Officer, Soul Machines™

As we work with Autodesk to launch their first digital employee, AVA, we are very excited to be exploring the future of human-machine interaction at the very beginning of the AI and robot era. Learning how we can make these “machines” more human-like and more useful to each and every one of us as individuals will define the future of creating compelling customer experiences.


FAST COMPANY: Autodesk’s AVA is built to be a font of empathy

This Chatbot Is Trying Hard To Look And Feel Like Us

Autodesk’s AVA created by Soul Machines™

Modeled on a real person and equipped with a virtual “nervous system,” Autodesk’s AVA is built to be a font of empathy, no matter how mean a customer gets.

For the full story click here

Excerpt from story by Sean Captain | Fast Company | 11.15.17


Among the attributes credited for Apple’s famous customer loyalty is a network of stores where curious or frustrated consumers can meet the company face-to-face.

The 3D design software maker Autodesk is trying to achieve something similar online with a help service that allows people to interact with what sure looks like an actual human. The company says that next year it will introduce a new version of its Autodesk Virtual Agent (AVA) avatar, with an exceedingly lifelike face, voice, and set of “emotions” provided by a New Zealand AI and effects startup called Soul Machines™. Born in February as a roughly sketched avatar on a chat interface, AVA’s CGI makeover will turn her into a hyper-detailed, 3D-rendered character–what Soul Machines™ calls a digital human.

The Autodesk deal is Soul Machines™’ first major gig, following a pilot project with Australia’s National Disability Insurance Agency from February to September 2016, and some proof-of-concept demos, like a recent one with Air New Zealand. Greg Cross, Soul Machines™’ chief business officer, says the company is working on eight other projects with “major brand names” but declined to name them.

The Original AVA

In demonstration videos, the new AVA’s face reacts with all the physiological subtlety of the best Hollywood CGI characters. No wonder, since she’s born of the same technologies. Before founding Soul Machines™ in 2016, engineer Mark Sagar was a movie special effects guru at outfits like Sony Pictures and Peter Jackson’s Weta Digital. He won two Oscars, in 2010 and 2011, for his work creating lifelike CGI facial animation in films including Jackson’s King Kong and James Cameron’s Avatar. Among his achievements was co-developing a system called Light Stage that scans a human face and body in extreme detail, with lighting from multiple angles, in order to create lifelike 3D models.

Hollywood productions may take the scans as a starting point, such as morphing impressions by actor Zoe Saldana into the blue-faced Neytiri in Avatar or of Mark Ruffalo into the green-faced Hulk in The Avengers. But Soul Machines™, using additional software that Sagar developed at the University of Auckland, keeps true to the original, retaining even the pink splotches, clogged pores, and errant eyebrow hairs of its human models.

That embracing of flaws may allow digital humans to avoid the “uncanny valley” of creepiness, when a robot looks oh-so-close-to, but not quite, human. Soul Machines™’ imperfect beauty has to be more than skin deep for expressions to be believable, though, with computer modeling of bone structure, muscle twitches, and other subtleties.

Sagar wants to go even deeper. Since leaving cinema, he’s been trying to recreate the human nervous system in software. Analyzing facial expressions to discern a smile, even a subtle one, and analyzing voice to pick up a pleasant tone, Soul Machines™’ software provides a hit of virtual dopamine to AVA’s nervous system. As in a human, this triggers a relaxed demeanor in AVA. Don’t be mistaken, though: This is very far from artificial sentience, Sagar cautions. “Everything is radically simplified from the real thing, and even how the real thing works is not understood,” he says. “This is just sort of current thinking on how some of these models work.”

To read the rest of the story click here


Greg Cross talks to CBC News about putting a face on machines

Virtual infant BabyX prompts question: how do we feel about AI that looks so much like us?

New Zealand-based research group wants to create artificial intelligence emulating human gestures, functions

Excerpt from story by Ramona Pringle – Technology Columnist – CBC News – 13 November 2017 [Click here to read the full story]

The eyes are the windows to the soul. That’s what we say about humans — but what about with robots or humanoid simulations? Is a realistic complexion or an earnest gaze the key to seeing artificial beings as more than, well, artificial?

That’s the premise behind BabyX, the lifelike virtual infant from the New Zealand-based research group Soul Machines™, whose goal is to humanize artificial intelligence (AI). The group’s work is in many ways unprecedented as they develop robots that emulate not only human gestures but also actual human functioning.

But it also raises a question: Is human likeness something that we want from our machine counterparts? Or, conversely, does it make humans slightly nervous when artificial beings look too much like us?

Meet BabyX

BabyX is a hyper-realistic screen-based simulation of an infant, with rosy cheeks and wide, sparkling eyes. Its lifelike appearance is a result of both art and engineering.

Soul Machines™’ founder, Mark Sagar, is an award-winning special effects artist who has worked in digital character creation for blockbuster films like Avatar and King Kong. He has developed a unique appreciation for the minutiae of human expression. In that way, the computer-generated “people” he creates are in a league of their own, with appearances and movements that are remarkably close to those of humans.

And that is no small feat. After all, it’s one thing to look human, but it’s a whole other thing to move realistically, explains Michael Walters, a senior lecturer in the School of Computer Science at the University of Hertfordshire.

“It’s very difficult to get a robot to not just look right but move right, as well,” says Walters, who is also a researcher with the university’s multidisciplinary Adaptive Systems Research Group. 

“We’ve seen various humanoid robots, but we aren’t fooled by them for very long. They’re close but not quite right.”

Sagar’s team at Soul Machines™ is working to make virtual beings that are persuasively lifelike — not just in how they look but in how they move and react to stimuli. That’s due in large part to the way they’re approaching this 21st-century challenge: they’re endeavouring to build a simulated brain.

Finding the human connection

An interdisciplinary team that includes neuroscientists and physiologists “is now building biologically inspired models of the human brain,” using the concepts of neural networks and machine learning to build a virtual nervous system, says Greg Cross, Soul Machines™’ chief business officer.

Their goal, he says, is to understand how humans work, and “figure out how we learn to interact with others, and how we learn to create.” When their autonomous virtual infant smiles, it’s not because of a line of code directing it to do so following certain prompts or inputs — it’s in reaction to virtual dopamine and endorphins, the release of which is triggered by real-world stimuli and interactions. In other words, the same things that make humans smile.

“By putting a face on machines they become more human-like,” says Cross. “The most powerful instrument we have to show our emotions is the human face.”

Soul Machines™’ team of developers is striving to reach the benchmarks of “emotional intelligence, understanding and responding to emotion,” he added.

In this way, the research group is differentiating their creations from the current wave of consumer robots on the market. Cross sees their AI as the inevitable evolution of the faceless virtual assistants like Siri and Alexa that are now in millions of homes and businesses all over the world.

“Humanoid robots are to virtual assistants what television was to radio,” he says.

The full story can be read here.

Bringing AI to Life: IBM Watson and Soul Machines™

IBM Watson and Soul Machines™ are rethinking how businesses are interacting with customers.  Read the blog by Shantenu Agarwal – IBM Watson – as he writes about the emotionally aware avatar that puts a human face on AI interactions.

“Emotions and expressions can differentiate us as individuals, allowing a deeper connection with each other.” Excerpt from blog by Shantenu Agarwal, IBM Watson


EVENT: Greg Cross as Innovation Night Speaker for 2017 IBM Forum in Taipei

When virtual technology meets the real world

Everything from cloud computing and big data analysis to artificial intelligence and smart technology is rewriting the traditional rules of business.


Greg Cross (CBO, Soul Machines™) sits down with Wang Jie (Business Week Editor of Digital Content) and Rob High (IBM Watson Chief Technology Officer) to discuss what happens when virtual technology meets the real world, live at the 2017 IBM Forum in Taipei.

Excerpt from the discussion panel, as Greg Cross shares the ideology behind Soul Machines™:

“Literally what we’re doing is putting a face on IBM Watson and taking that computer interface beyond a simple chatbot. What we think is really important is that we’re heading into an era where, as humans, we’re going to spend so much more of our time interacting with AI systems, with robots, with different types of machines. And the thing that we believe is incredibly important is that these machines and these AI systems will be so much more useful to us as humans, as people, if they’re more human-like.”

Watch Greg Cross in discussion at the IBM Forum in Taipei.