Capturing the attention of Newshub

In the news today:

Kiwi startup Soul Machines reveals latest artificial intelligence creation, Rachel

by Simon Shepherd at Newshub, July 9, 2017

A Kiwi company developing artificial intelligence has delivered its latest digital human, called Rachel.

Rachel can see, hear and respond to you.

She is an avatar created by two-time Oscar winner Mark Sagar, who worked on the blockbuster movie of the same name.

Mr Sagar, of Auckland-based company Soul Machines™, says his aim is to get humans socialising with machines by putting a human face on artificial intelligence.

“So what we are doing with Soul Machines™ is trying to build the central nervous system for humanising this kind of computer,” he says.

A favourite theme of Hollywood, the interaction between human and computer is already here in much simpler forms, from Siri on your iPhone to virtual assistants in your home.

China’s third-largest technology company Baidu has just announced artificial intelligence is its major focus, including driverless cars.

Soul Machines™’ goal is just as complex – emotions. The startup’s prototype was Baby X, which gets upset and needs reassurance when Mr Sagar hides, and can also recognise pictures.

The technology is advancing so quickly that a later version is already helping people with disabilities in Australia.

And the version after that is so detailed it carries a warning on its YouTube video – this is not real.



A Digital Brain for Digital Humans – Mark Sagar at Cannes Lions Festival

In the news at Beet TV:

Virtual Brain Models put a Face on Big Data: AI Guru Sagar

by Robert Andrews, June 25, 2017

CANNES — At this point in the early development of artificial intelligence, many people probably assume that typical AI applications revolve around textual deployments.

But what if you could use AI to create lifelike digital brains that, implanted in 3D facial models, could give life and character to virtual avatars?

As far-fetched as it may seem, that is exactly what Mark Sagar has done. The CEO of Soul Machines™ says his company has used IBM’s Watson cognitive computing service to inject emotion into computer-generated movie characters – but the tech is not going to stop at Hollywood.

“How do we make characters that have their own digital life?” asks Auckland-based Sagar during this panel interview at Cannes Lions. “You almost have to give it a nervous system, a digital brain, so it can think for itself.”

Sagar, who first pioneered the technology whilst working on the movie King Kong and who later built upon his work for Avatar, may be used to working with scripted characters – but these AI creations don’t necessarily have to follow the paths laid out for them.

“We have biologically-constrained cognitive architectures – these are brain models,” he says. “You don’t know how it’s going to act; it will have memory and so forth.

“The models can sense the environment, they can react, they can learn in real-time and we can connect those to Watson – you (can) have a conversation with it.”

Why is Sagar in Cannes, where the world’s advertisers and creative agencies are out in force to hear about what’s new and what’s next?

Because AI-driven facial models could help brands and enterprises create avatars that interact with customers in lifelike ways, tapping into the vast databases behind them and conveying that data through emotional mannerisms.

“If you’re a company and have big data that you want to go through, we can put a living face on it,” Sagar adds.

This interview panel was chaired by The Weather Company CMO Jordan Bitterman. The Weather Company was acquired by IBM in 2015 and, together, the pair are leveraging IBM’s Watson to work on a range of AI-powered initiatives.

This video is part of Beet.TV’s AI Series from Cannes Lions 2017, presented by The Weather Company, an IBM Business. For more from the series, please visit this page.

Mark Sagar at Cannes Lions 2017 speaking about bringing brands to life

In the news today on Beet TV:

‘We Could Replicate Anybody’: Sagar Brings Brand Bots To Life

by Robert Andrews, July 4, 2017, Beet TV

CANNES — As Cannes Lions played host to plenty of discussions about the role of artificial intelligence in advertising and marketing, many might have wondered how far off some of the technologies may be.

Certainly, tools like 3D brand avatars imbued with lifelike emotions and empathy may seem far-fetched. But they are real, here and now, said one pioneer pitching the tech to advertisers today.

“This technology is available today,” said Mark Sagar, CEO of Soul Machines™, the company responsible for the systems. “We could basically replicate any person.”

By that, Sagar means the 3D technology his company built – originally for movie studios, to bring characters to life – has moved beyond animation itself. Now those 3D character models are being injected with algorithmic profiles that mimic human emotions, tics and responses, even reacting to viewers they can see through devices’ digital cameras.

And Sagar thinks brands could use the same tech to bring brand avatars to life.

“It could be a celebrity, it could be a spokesperson or whatever,” he said. “If you think about how you approach representing a brand when you get a celebrity … you’re embodying their traits in that. We can bring them to life and have them interact.”

This is a world away from optimising display ads’ click-through engagement. In this future world, an army of technologies would be deployed to mimic human characters employed to interact in lifelike conversations with customers and prospects. It’s a closing of the gap that Sagar thinks will yield results.

“When you interact, you invest,” he added. “You start personalising things. You can have a spokesperson or representative of a particular brand have a relationship with you – it will remember you, your preferences, adapt its behaviour.

“You start forming a stronger relationship with the brand in that way. By adding life to things, we can’t ignore it.”

This video is part of Beet.TV’s AI Series from Cannes Lions 2017, presented by The Weather Company, an IBM Business.

SOUL MACHINES™ BLOG: How an Owl is Taking Flight at Soul Machines™

Every day at Soul Machines™ is about breaking new ground in the AI fast lane, and this particular day would be no different. The brief – to create an autonomous animal character that could be used across a diverse range of applications, from digital companion to gaming sidekick, one that is appealing and cute but that people would also easily relate to – was no easy task.

Stepping up to the challenge, Goran – 3D artist at Soul Machines™ – plunged straight in. But which animal? Well, the one thing people connect with is the eyes, and owls are known for their big, bright, watchful eyes.

Why an Owl?

Across history and mythology, people have viewed owls with fascination and awe – whether as Hedwig, Harry Potter’s trusty companion; as the Greek goddess Athene’s favourite creature; or among the Kwakiutl of North America, who believe owls are the souls of people.

The Owl from Auckland Museum preparing for its first 3D scan. This owl is a Morepork: a small, dark-brown, forest-dwelling owl known for its distinctive ‘morepork’ call, native to New Zealand.

With their beguiling gaze, solemn presence and dexterity as silent hunters (by virtue of their unique feathers), owls carry great resonance with humans around the world. As a symbol of wisdom and mystery they hold an almost hypnotic attraction for us.

Goran immersed himself in the subject matter. Researching relentlessly, he collected a range of references, from anatomy to habitat, to ensure his owl would be as accurate as possible. Sifting through piles of imagery, the team settled on a form that was rounded in shape with big eyes and fluffy feathers. Mark Sagar – CEO of Soul Machines™ – was sold on the idea. It had the appeal needed.

Goran set to work producing initial sketches that led to the first stages of building up a model. But in order to capture the owl truly precisely, a real-life specimen was required.

“We get better clarity from proper specimens presented in a beautiful way, which gives us the edge, pushing for the final 10% to get that extra realism.”
Goran, 3D Artist – Soul Machines™

Enter Auckland Museum.

In a unique collaboration between Soul Machines™ and Auckland Museum, a selection of taxidermied Morepork were brought to the Laboratory for Animate Technologies, where the specimens were light-captured in a 3D scanner specially created by Soul Machines™.

“Seeing it from a completely different perspective is amazing.”
Ruby Moore,  Collections Manager for Entomology & Land Vertebrates – Auckland Museum

These highly-valued exhibits were handled by the Natural Sciences Collections Managers who fully embraced the detailed process of image capture and the opportunities this presents for future collaborations.

“Artificial Intelligence is quite hot at the moment and it’s the next big thing. And so it’s great to see how our collections can be used to help companies such as this.”
Dhahara Ranatunga, Collections Manager, Natural Sciences – Auckland Museum

Armed with this new information, Goran is now bringing the owl to life by producing a selection of anatomical changes. He’s extracting textures, re-building elements and adding super-realistic surface details to existing models, with a special focus on the feathers, in order to capture the real-life nuances that cannot be seen in standard photographs. One detail Goran is currently exploring is the owl’s distinctive movement: since its eyes are fixed facing forward, the characteristic neck-bobbing an owl uses to judge depth and accurately fix on its prey will be translated into the final puppet’s articulation. This will give a person the sense they’re communicating with an actual owl.

Once this accuracy has been reached, reality takes a side-step into make-believe, as this unique little bird will be given the ability to talk! The beak will replicate the movements of a person’s mouth as the owl communicates with users in real time, creating a dynamic connection between this autonomous character and the people who interact with it.

So step aside Dr Dolittle, soon you won’t be the only person who can talk to animals!

In the Top Ten of New Zealand Businesses

In the news on Idealog:

Bill Reichert’s New Zealand innovation report, part 2: The top ten (and a bit) New Zealand businesses  

By: Bill Reichert // June 23, 2017

Excerpt from the article:

“Garage Technology Ventures’ managing director Bill Reichert recently spent four weeks in New Zealand as entrepreneur in residence at AUT University and travelled the country meeting some of our most promising and passionate startups, innovators, educators and regulators. In the second and final part of a feature, he tells us about the New Zealand companies that impressed him most. 

Finally, I’d like to share some of my excitement around many companies I have met that are shining examples of world-class talent and innovation. My personal Top Ten List of New Zealand startups includes: 

• Soul Machines™: Emotionally intelligent avatars for personalised online service and support. I had a chance to visit the new office in the Ferry Building and see behind the scenes what Mark Sagar and Greg Cross are doing. Mind boggling.”

Close of article:  “These companies and these entrepreneurs are the crown jewels of New Zealand’s future. And they are only the tip of the iceberg. I met several other impressive entrepreneurs, but I couldn’t fit them all into the Top Ten.”

Soul Machines™ sparks interest at Cannes

In the news on AdExchanger: 

AI Had a Modest Showing At Cannes, But Here Are Some Notable Developments

By: Ryan Joe // June 22, 2017

Excerpt from the article:

IBM Watson/Weather Channel And Soul Machines

“If you’ve seen “Avatar” or “Rise of the Planet of the Apes,” you’ve seen Dr. Mark Sagar’s work.

For those films, Sagar won two Oscars for his facial motion capture work. But at a Cannes event hosted by MEC, Sagar was repping his startup Soul Machines™, which creates avatars – or in his preferred parlance, “digital humans” – to be used as customer service representatives. Watson, of course, provides the AI.

Despite the viability of video conferencing, contact centers still rely on voice calls. But the problem with video conferencing is that it presumes the service rep is well-groomed and camera-ready, and – let’s be honest – that’s just not everyone’s forte.

Sagar insists his digital humans aren’t meant to replace service reps. Rather, like automated contact centers, they can relieve human employees of more menial tasks.

Digital humans, however, are lifelike and are designed to mimic emotion to establish a human-like connection. Are you calling because your credit card was stolen? The digital human will look sad. Are you ordering flowers to celebrate your 50th wedding anniversary? The digital human will duly look happy.

Sagar said his digital humans are already in a handful of pilots and that the solution is scalable, not particularly cost-prohibitive and highly customizable.

But actual non-digital humans are still involved, at least in the testing phase.

Of course, another test will be to see whether digitized faces, despite recent advancements, have fully crossed the uncanny valley – at least enough for most consumers to accept.”

Feature: MIT Technology Review

In the news today on MIT Technology Review:
“Customer Service Chatbots Are About to Become Frighteningly Realistic”
By: Tom Simonite // March 22, 2017

From the article:

“A startup gives chatbots and virtual assistants realistic facial expressions and the ability to read yours.”

“Would your banking experience be more satisfying if you could gaze into the eyes of the bank’s customer service chatbot and know it sees you frowning at your overdraft fees? Professor and entrepreneur Mark Sagar thinks so.

Sagar won two Academy Awards for novel digital animation techniques for faces used on movies including Avatar and King Kong. He’s now an associate professor at the University of Auckland, in New Zealand, and CEO of a startup called Soul Machines, which is developing expressive digital faces for customer service chatbots.

He says that will make them more useful and powerful, in the same way that meeting someone in person allows for richer communication than chatting via text. “It’s much easier to interact with a complex system in a face-to-face conversation,” says Sagar.

The movements of Soul Machines’s digital faces are produced by simulating the anatomy and mechanics of muscles and other tissues of the human face. The avatars can read the facial expressions of a person talking to them, using a device’s front-facing camera. Sagar says people talking to something that looks human are more likely to be open about their thoughts and be expressive with their own face, allowing a company to pick up information about what vexes or confuses customers.

The company’s avatars can also be programmed to react to a person’s facial expressions with their own simulated facial movements, in an attempt to create the illusion of empathy.

Other companies have tried detecting people’s emotions by analyzing a person’s voice, words, or expressions. Amazon is exploring the idea as a way to improve its Alexa voice-operated assistant.”


Bloomberg / Ashlee Vance – This Freaky Baby Could Be the Future of AI. Watch It in Action.

“Mark Sagar started his career by building medical simulations of body parts. He took those skills and went into CGI, most famously for movies including Avatar and King Kong. Now he’s combining his skills and building an entire brain and responsive face on a computer in order to map human consciousness. Watch the full episode of ‘Hello World’.”

Scoop News: AI and avatar company Soul Machines™ raises Series A investment led by Horizons Ventures


AUCKLAND, NZ, November 23, 2016 – Soul Machines™, a developer of intelligent, emotionally responsive avatars, today announced it has raised $7.5 million USD in a Series A financing round led by Horizons Ventures with Iconiq Capital. This investment will allow Soul Machines™ to deliver on its vision of humanizing technology to create intelligent and emotionally responsive, human-like avatars that augment and enrich the user experience for customers and markets adopting Artificial Intelligence-based platforms.

Soul Machines™, which formally launches as a result of this investment, is built on the technology behind Baby X, the first avatar created by the company’s founder and CEO Dr. Mark Sagar – a two-time Oscar-winning scientist – and his engineering research team at the Laboratory for Animate Technologies based in the Auckland Bioengineering Institute (ABI), University of Auckland.

“Mark’s work on Baby X is leading the way in the development of a completely new interaction model between humans and machines,” says Phil Chen of Horizons Ventures. “With the rapid acceleration of intelligent assistants and productivity applications using deep learning techniques, Mark and his team provide an emotional and social reasoning platform to existing and developing intelligence in the AI industry.”

Previously the originator of the Vive, Phil Chen of Horizons Ventures joins Soul Machines™ as executive chairman. With a proven track record of early investments in disruptive AI technology like Apple’s Siri, Waze and Spotify, Chen and Horizons Ventures understand the fast-moving AI market and will be key advisors in driving mass adoption of emotionally responsive avatars across markets.

“Mark and his research team have wowed leading technology influencers around the world with Baby X,” said Dr. Andy Shenk, CEO of Auckland UniServices, the Technology Commercialization Company of the University of Auckland. “Horizons Ventures was introduced to the technology on a recent tour of the University and was so impressed the team made the decision to invest almost immediately. Horizons Ventures makes for the perfect investment partner with their track record in Artificial Intelligence and AR/VR.”

Experienced New Zealand-based technology entrepreneur Greg Cross, with a 20-year track record in building technology companies in Asia Pacific and North America, has also joined Soul Machines™ as part of the transaction to launch the new company and accelerate commercialisation. 

Dr. Sagar says, “It’s a really exciting time for the Soul Machines™ team with both the investment led by Horizons Ventures and commercial leadership with Greg Cross in place. Now, our engineering team can focus on building core technology that will bring human life to technology that is intelligent, emotive and adaptive. Our goal is to define the user experience for AI systems and platforms.”

As a result of the investment, Dr. Sagar and his research team now make up the newly formed Soul Machines™ brand, and Auckland UniServices has reassigned ownership of all Intellectual Property and associated research contracts to Soul Machines™ in return for a shareholding in the new company. 


About Soul Machines

Soul Machines™ is a developer of intelligent, emotionally responsive avatars that augment and enrich the user experience for Artificial Intelligence (AI) platforms. The University of Auckland spinout company was built on the Baby X technology created by Dr. Mark Sagar and his engineering research team at the University’s Laboratory for Animate Technologies based in the Auckland Bioengineering Institute. The company is venture backed, with an investment led by Hong Kong-based Horizons Ventures, a leading artificial intelligence and virtual reality investor.