The World’s First Autonomously Animated Digital Influencer

Today at the Cannes Lions International Festival of Creativity we announced with P&G’s SK-II, a global prestige skincare brand, the world’s first autonomously animated Digital Influencer: YUMI.

YUMI marks the birth of the first fully autonomous digital influencer capable of interacting as a human would but with the control brands need and expect.

YUMI is an integral part of SK-II’s ongoing transformation journey to connect with a new generation of consumers who are yearning for more meaningful experiences with the brands they know and trust. YUMI will not only provide beauty advice but also help consumers better understand their skin and select SK-II products that are suitable for them.

Much has been written about Digital Influencers, but this is very different: the result of our more than eight-year investment in hard science and development, which has produced a number of patents and innovations. One of our goals is to contribute to the progression of artificial general intelligence (AGI) using the human as a model system, enhancing human-machine collaboration.

The difference between YUMI and what are being touted as digital influencers is significant:

  1. Autonomy: YUMI can operate at scale, independently of human intervention. She will express emotions and information through what is effectively a Digital Brain™. Harnessing advanced artificial intelligence (AI), she can respond and interact just as a human would.
  2. Human: Through our Digital DNA™ Studio she has been developed to be life-like but with a unique personality. Unlike current approaches, which depend on hand-tooling and significant animation, YUMI was built rapidly and can change dynamically to reflect the unique personality of the brand. We’ve learnt through our research that creating a human experience has to couple hyper-real imagery with hyper-real expressions, reactions and conversation. To look good isn’t enough – the experience has to be relatable and feel great as well. The research we’ve seen elsewhere reinforces this point: “The photorealistic avatar was rated more trustworthy, and people had more affinity with it and preferred it over the cartoon agent… a cartoon character caused extra cognitive load which hindered learning particularly for male participants, compared to the realistic character” (Yuan, Dennis, & Riemer, 2019).
  3. Responsive: Current digital puppets, “hot cartoons” and bots follow pre-scripted animation paths and are fixed in the way they are designed. YUMI, by contrast, will adapt and respond based on the consumer in front of her.
  4. Integrated: The current generation of digital influencers delivers little utility to consumers. By integrating YUMI with information, she can deliver what consumers are looking for – help, advice, tips and tricks – all based on what they need.
  5. AI-powered: Until now, Digital Influencers have followed animated scripts. The point of YUMI is that she is animated by AI – constantly learning and improving on her own.

YUMI is a Digital Influencer with a difference. She engages and responds as a human would. She won’t know everything but will know what to do when she doesn’t. She represents a new way for brands to be more human in engaging humans in a highly scalable way.

“YUMI is more than a digital influencer, she is a digital human capable of interacting and engaging in ways technology hasn’t been able to do until now,” said Sandeep Seth, Chief Executive Officer, Global SK-II. “YUMI personifies our goal to combine technology and creativity to benefit customers. She provides the warmth and connection of human touch in the form of a digital experience to make the overall skincare experience at home and in store more enjoyable and compelling. SK-II is the perfect brand to introduce YUMI to, and we’re looking forward to customers being able to turn to her for skincare and beauty questions at any time of the day or night.”

That is what Brands and Consumers are demanding.

Feature: MIT Technology Review

In the news today on MIT Technology Review:
“Customer Service Chatbots Are About to Become Frighteningly Realistic”
By: Tom Simonite // March 22, 2017

From the article:

“A startup gives chatbots and virtual assistants realistic facial expressions and the ability to read yours.”

“Would your banking experience be more satisfying if you could gaze into the eyes of the bank’s customer service chatbot and know it sees you frowning at your overdraft fees? Professor and entrepreneur Mark Sagar thinks so.

Sagar won two Academy Awards for novel digital animation techniques for faces used on movies including Avatar and King Kong. He’s now an associate professor at the University of Auckland, in New Zealand, and CEO of a startup called Soul Machines, which is developing expressive digital faces for customer service chatbots.

He says that will make them more useful and powerful, in the same way that meeting someone in person allows for richer communication than chatting via text. “It’s much easier to interact with a complex system in a face-to-face conversation,” says Sagar.

The movements of Soul Machines’s digital faces are produced by simulating the anatomy and mechanics of muscles and other tissues of the human face. The avatars can read the facial expressions of a person talking to them, using a device’s front-facing camera. Sagar says people talking to something that looks human are more likely to be open about their thoughts and be expressive with their own face, allowing a company to pick up information about what vexes or confuses customers.

The company’s avatars can also be programmed to react to a person’s facial expressions with their own simulated facial movements, in an attempt to create the illusion of empathy.
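The mirroring loop the article describes, detecting the user’s expression and then driving the avatar’s simulated facial muscles in response, can be sketched in a few lines. This is a minimal illustrative sketch only: the emotion labels, muscle-parameter names, and mapping values are all assumptions for the example, not Soul Machines’ actual API.

```python
# Illustrative sketch (hypothetical names and values, not a real API):
# map a detected user expression to simulated facial-muscle activations
# for the avatar, so the avatar appears to react empathetically.

# Hypothetical mapping from a detected user emotion to the avatar's
# simulated muscle activations (all values are made up).
RESPONSE_MAP = {
    "frown":   {"brow_lower": 0.3, "lip_corner_up": 0.5},   # mild concern plus warmth
    "smile":   {"lip_corner_up": 0.8, "eye_crinkle": 0.6},  # mirror the smile
    "neutral": {"lip_corner_up": 0.2},                      # stay softly engaged
}

def avatar_response(detected_emotion: str) -> dict:
    """Return simulated muscle activations for the avatar,
    falling back to 'neutral' for unrecognized emotions."""
    return RESPONSE_MAP.get(detected_emotion, RESPONSE_MAP["neutral"])
```

In a real system the detected emotion would come from a vision model reading the device’s front-facing camera, and the activations would drive a physically based simulation of facial anatomy rather than a lookup table.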

Other companies have tried detecting people’s emotions by analyzing a person’s voice, words, or expressions. Amazon is exploring the idea as a way to improve its Alexa voice-operated assistant.”


Bloomberg / Ashlee Vance – This Freaky Baby Could Be the Future of AI. Watch It in Action.

“Mark Sagar started his career by building medical simulations of body parts. He took those skills and went into CGI, most famously for movies including Avatar, King Kong, and others. Now he’s combining his skills and building an entire brain and responsive face on a computer in order to map human consciousness. Watch the full episode of ‘Hello World.’”

Scoop News: AI and avatar company Soul Machines™ raises Series A investment led by Horizons Ventures


AUCKLAND, NZ, November 23, 2016 – Soul Machines™, a developer of intelligent, emotionally responsive avatars, today announced it has raised $7.5 million USD in a Series A financing round led by Horizons Ventures with Iconiq Capital. This investment will allow Soul Machines™ to deliver on its vision of humanizing technology to create intelligent and emotionally responsive, human-like avatars that augment and enrich the user experience for customers and markets adopting Artificial Intelligence-based platforms.

Soul Machines™, which formally launches as a result of this investment, is built on the technology behind Baby X, the first avatar created by the company’s founder and CEO Dr. Mark Sagar – a two-time Oscar-winning scientist – and his engineering research team at the Laboratory for Animate Technologies based in the Auckland Bioengineering Institute (ABI), University of Auckland.

“Mark’s work on Baby X is leading the way in the development of a completely new interaction model between humans and machines,” says Phil Chen of Horizons Ventures. “With the rapid acceleration of intelligent assistants and productivity applications using deep learning techniques, Mark and his team provide an emotional and social reasoning platform to existing and developing intelligence in the AI industry.”

Previously the originator of the Vive, Phil Chen of Horizons Ventures joins Soul Machines™ as executive chairman. With a proven track record of early investments in disruptive AI technology like Apple’s Siri, Waze and Spotify, Chen and Horizons Ventures understand the fast-moving AI market and will be key advisors in driving mass adoption of emotionally responsive avatars across markets.

“Mark and his research team have wowed leading technology influencers around the world with Baby X,” said Dr. Andy Shenk, CEO of Auckland UniServices, the Technology Commercialization Company of the University of Auckland. “Horizons Ventures was introduced to the technology on a recent tour of the University and was so impressed the team made the decision to invest almost immediately. Horizons Ventures makes for the perfect investment partner with their track record in Artificial Intelligence and AR/VR.”

Experienced New Zealand-based technology entrepreneur Greg Cross, with a 20-year track record in building technology companies in Asia Pacific and North America, has also joined Soul Machines™ as part of the transaction to launch the new company and accelerate commercialisation. 

Dr. Sagar says, “It’s a really exciting time for the Soul Machines™ team with both the investment led by Horizons Ventures and commercial leadership with Greg Cross in place. Now, our engineering team can focus on building core technology that will bring human life to technology that is intelligent, emotive and adaptive. Our goal is to define the user experience for AI systems and platforms.”

As a result of the investment, Dr. Sagar and his research team now make up the newly formed Soul Machines™ brand, and Auckland UniServices has reassigned ownership of all Intellectual Property and associated research contracts to Soul Machines™ in return for a shareholding in the new company. 


About Soul Machines

Soul Machines™ is a developer of intelligent, emotionally responsive avatars that augment and enrich the user experience for Artificial Intelligence (AI) platforms. The University of Auckland spinout company was built on the Baby X technology created by Dr. Mark Sagar and his engineering research team at the University’s Laboratory for Animate Technologies based in the Auckland Bioengineering Institute. The company is venture backed, with an investment led by Hong Kong-based Horizons Ventures, a leading artificial intelligence and virtual reality investor.