FAST COMPANY: Autodesk’s AVA is built to be a font of empathy

This Chatbot Is Trying Hard To Look And Feel Like Us

Autodesk’s AVA, created by Soul Machines™

Modeled on a real person and equipped with a virtual “nervous system,” Autodesk’s AVA is built to be a font of empathy, no matter how mean a customer gets.


Excerpt from story by Sean Captain | Fast Company | 11.15.17

 

Among the attributes credited for Apple’s famous customer loyalty is a network of stores where curious or frustrated consumers can meet the company face-to-face.

The 3D design software maker Autodesk is trying to achieve something similar online with a help service that allows people to interact with what sure looks like an actual human. The company says that next year it will introduce a new version of its Autodesk Virtual Agent (AVA) avatar, with an exceedingly lifelike face, voice, and set of “emotions” provided by a New Zealand AI and effects startup called Soul Machines™. Born in February as a roughly sketched avatar on a chat interface, AVA will get a CGI makeover that turns her into a hyper-detailed, 3D-rendered character: what Soul Machines™ calls a digital human.

The Autodesk deal is Soul Machines™’ first major gig, following a pilot project with the Australian National Disability Insurance Agency from February to September 2016, and some proof-of-concept demos, like a recent one with Air New Zealand. Greg Cross, Soul Machines™’ chief business officer, says the company is working on eight other projects with “major brand names” but declined to name them.

The Original AVA

In demonstration videos, the new AVA’s face reacts with all the physiological subtlety of the best Hollywood CGI characters. No wonder, since she’s born of the same technologies. Before founding Soul Machines™ in 2016, engineer Mark Sagar was a movie special effects guru at outfits like Sony Pictures and Peter Jackson’s Weta Digital. He won two Oscars, in 2010 and 2011, for his work creating lifelike CGI facial animation in films including Jackson’s King Kong and James Cameron’s Avatar. Among his achievements was co-developing a system called Light Stage that scans a human face and body in extreme detail, with lighting from multiple angles, in order to create lifelike 3D models.

Hollywood productions may take the scans as a starting point, such as morphing actor Zoe Saldana’s performance into the blue-faced Neytiri in Avatar, or Mark Ruffalo’s into the green-faced Hulk in The Avengers. But Soul Machines™, using additional software that Sagar developed at the University of Auckland, keeps true to the original, retaining even the pink splotches, clogged pores, and errant eyebrow hairs of its human models.

That embracing of flaws may allow digital humans to avoid the “uncanny valley” of creepiness, when a robot looks oh-so-close-to, but not quite, human. Soul Machines™’ imperfect beauty has to be more than skin deep for expressions to be believable, though, with computer modeling of bone structure, muscle twitches, and other subtleties.

Sagar wants to go even deeper. Since leaving cinema, he’s been trying to recreate the human nervous system in software. Analyzing facial expressions to discern a smile, even a subtle one, and analyzing voice to pick up a pleasant tone, Soul Machines™’ software provides a hit of virtual dopamine to AVA’s nervous system. As in a human, this triggers a relaxed demeanor in AVA. Don’t be mistaken, though: This is very far from artificial sentience, Sagar cautions. “Everything is radically simplified from the real thing, and even how the real thing works is not understood,” he says. “This is just sort of current thinking on how some of these models work.”


 

HOT OFF THE PRESS: Soul Machines™ Partners with Autodesk to launch AVA at Autodesk University 2017

Soul Machines™ today announced it has partnered with Autodesk, a builder of software that helps people imagine, design, and create a better world, to create a digital human version of Autodesk’s Virtual Agent, AVA. Announced at Autodesk University, Autodesk’s annual conference in Las Vegas, AVA will provide customers access to a service agent 24/7—answering questions, directing them to content, and completing transactions.

Soul Machines™ is advancing AVA’s capabilities with a digital human face and persona that literally brings AVA “to life” using its world-leading Human Computing Engine™ (HCE). The Soul Machines™ HCE is a Virtual Nervous System™ that combines neural networks and biologically inspired models of the human brain. AVA’s Virtual Nervous System gives her the ability to see and hear, as well as sensory systems that enable her to recognize and respond emotionally in an incredibly humanlike way.

“The future of highly personalized customer engagement is a blend of digital and human support, and AVA, our first digital employee, is just that. She will understand human consciousness and interactions by reading signals such as body language and facial reactions—in turn learning more about customers to better serve them. The addition of emotional intelligence to AVA takes our customer service beyond purely transactional to relational.” – Rachael Rekart, senior manager for machine-assisted service engagement at Autodesk

By putting a face on Artificial Intelligence and extending the potential for customer interaction beyond text-based chatbots and voice assistants, we enable large corporations and organizations all over the world to enter a whole new era of personalized service and democratized knowledge transfer.

“As one of the world’s leading global software companies, Autodesk will be among the first to deploy our technology on top of their IBM Watson cognitive platform, and we are excited about that. If the future of software is AI and 3D graphics, this is an extremely exciting partnership for both Autodesk and Soul Machines.” – Greg Cross, Chief Business Officer, Soul Machines

As we work with Autodesk to launch their first digital employee, AVA, we are very excited to be exploring the future of human-machine interaction at the very beginning of the AI and robot era. Learning how we can make these “machines” more humanlike and more useful to each and every one of us as individuals will define the future of creating compelling customer experiences.