FAST COMPANY: Autodesk's AVA is built to be a font of empathy

This Chatbot Is Trying Hard To Look And Feel Like Us

Autodesk's AVA created by Soul Machines

Modeled on a real person and equipped with a virtual “nervous system,” Autodesk’s AVA is built to be a font of empathy, no matter how mean a customer gets.

Excerpt from story by Sean Captain | Fast Company | 11.15.17

 

Among the attributes credited for Apple’s famous customer loyalty is a network of stores where curious or frustrated consumers can meet the company face-to-face.

The 3D design software maker Autodesk is trying to achieve something similar online with a help service that allows people to interact with what sure looks like an actual human. The company says that next year it will introduce a new version of its Autodesk Virtual Agent (AVA) avatar, with an exceedingly lifelike face, voice, and set of “emotions” provided by a New Zealand AI and effects startup called Soul Machines. Born in February as a roughly sketched avatar on a chat interface, AVA will get a CGI makeover that turns her into a hyper-detailed, 3D-rendered character, what Soul Machines calls a digital human.

The Autodesk deal is Soul Machines’ first major gig, following a pilot project with Australia’s National Disability Insurance Agency from February to September 2016, and some proof-of-concept demos, like a recent one with Air New Zealand. Greg Cross, Soul Machines’ chief business officer, says the company is working on eight other projects with “major brand names” but declined to name them.

The Original AVA

In demonstration videos, the new AVA’s face reacts with all the physiological subtlety of the best Hollywood CGI characters. No wonder, since she’s born of the same technologies. Before founding Soul Machines in 2016, engineer Mark Sagar was a movie special effects guru at outfits like Sony Pictures and Peter Jackson’s Weta Digital. He won two Oscars, in 2010 and 2011, for his work creating lifelike CGI facial animation in films including Jackson’s King Kong and James Cameron’s Avatar. Among his achievements was co-developing a system called Light Stage that scans a human face and body in extreme detail, with lighting from multiple angles, in order to create lifelike 3D models.

Hollywood productions may take such scans only as a starting point, morphing the likeness of actor Zoe Saldana into the blue-faced Neytiri in Avatar, or that of Mark Ruffalo into the green-faced Hulk in The Avengers. But Soul Machines, using additional software that Sagar developed at the University of Auckland, stays true to the original, retaining even the pink splotches, clogged pores, and errant eyebrow hairs of its human models.

That embrace of flaws may help digital humans avoid the “uncanny valley” of creepiness, when a robot looks oh so close to, but not quite, human. For its expressions to be believable, though, Soul Machines’ imperfect beauty has to be more than skin deep, built on computer models of bone structure, muscle twitches, and other subtleties.

Sagar wants to go even deeper. Since leaving cinema, he’s been trying to recreate the human nervous system in software. Analyzing facial expressions to discern a smile, even a subtle one, and analyzing voice to pick up a pleasant tone, Soul Machines’ software provides a hit of virtual dopamine to AVA’s nervous system. As in a human, this triggers a relaxed demeanor in AVA. Don’t be mistaken, though: This is very far from artificial sentience, Sagar cautions. “Everything is radically simplified from the real thing, and even how the real thing works is not understood,” he says. “This is just sort of current thinking on how some of these models work.”
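For readers curious what such a loop might look like in code, here is a minimal, purely hypothetical sketch of the feedback described above: positive cues from the user release a virtual “dopamine” signal that nudges the avatar toward a relaxed demeanor. Every name, number, and threshold below is invented for illustration; it is not Soul Machines’ code or design.

# Illustrative sketch only: a toy "virtual nervous system" loosely inspired by
# the loop described in the article. All names and values are hypothetical.

from dataclasses import dataclass


@dataclass
class AffectSignal:
    smile_intensity: float     # 0.0 (no smile) to 1.0 (broad smile), from a face analyzer
    voice_pleasantness: float  # 0.0 (hostile tone) to 1.0 (warm tone), from a voice analyzer


class VirtualNervousSystem:
    """Tracks a single 'relaxation' state driven by a dopamine-like reward signal."""

    def __init__(self, smoothing: float = 0.5):
        self.relaxation = 0.5       # start at a neutral demeanor
        self.smoothing = smoothing  # how strongly past state outweighs new input

    def step(self, affect: AffectSignal) -> str:
        # Positive user cues release a small "hit" of virtual dopamine.
        dopamine = 0.5 * affect.smile_intensity + 0.5 * affect.voice_pleasantness

        # Exponential smoothing: the state drifts toward the current dopamine level.
        self.relaxation = self.smoothing * self.relaxation + (1 - self.smoothing) * dopamine

        # Map the internal state onto a coarse facial demeanor.
        if self.relaxation > 0.6:
            return "relaxed, warm expression"
        if self.relaxation < 0.4:
            return "attentive, conciliatory expression"
        return "neutral expression"


if __name__ == "__main__":
    ava = VirtualNervousSystem()
    # A customer who starts out annoyed and gradually softens.
    session = [AffectSignal(0.0, 0.1), AffectSignal(0.1, 0.3), AffectSignal(0.6, 0.8)]
    for frame in session:
        print(ava.step(frame))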

TO READ THE REST OF THE STORY CLICK HERE

 

Kirrily Denny