PRESS: Is AI’s Next Evolution to Digital Humans?

by PYMNTS | February 27, 2018

What would you do if you met a digital human?

It is not an idle question, nor are we asking how you would handle yourself if you suddenly landed in an alternative universe surrounded by robots and avatars.

As it turns out, digital humans are already among us.

Autodesk users have been interacting with them since the end of last year, when calling into customer support. Travelers on Air New Zealand have been utilizing the services of its digital travel concierge for a little more than six months.

Australians with disabilities are now able to work with a digital human named Nadia, designed to help users better navigate the National Disability Insurance Scheme (NDIS) and find the information they need. Nadia can read users’ emotions by “watching” their faces – not to mention give them the experience of talking to a celebrity, sort of: Nadia’s voice is provided by Academy Award-winning actress Cate Blanchett.

Very soon, the banking customers of NatWest will meet Cora, their new personal banking assistant, who will look them in the eye as she talks to them and helps them along their financial journeys.

So, what do all of these digital humans have in common?

A company called Soul Machines™, which is building what it believes the next generation of interactive, conversational artificial intelligence (AI) will look like.

Because, according to Soul Machines™’ chief business officer, Greg Cross, they will actually have to look like something. AI, he said, will only become truly useful when the machines feel familiar to the people who use them.

That will mean they will have to do better than just sound like us – they will need to look like us, too, he said.

To that end, the company has developed what it refers to as the world’s first Virtual Nervous System, from which it painstakingly renders the visually responsive, three-dimensional “virtual humans” – human-like avatars – that can interact “face-to-face with customers.”

“We actually believe that, in time, all assistants will need to have a human face, because as humans, we are programmed at a DNA level to want to be able to look at someone when they are talking to us,” Cross said.

Facing the Conversation

The problem today with automated communication, Cross noted, is that it tends to feel a bit stiff and, well, robotic. The consumer’s experience is not only not enhanced – in some cases, it can even be actively worse than it was before.

And that, said Cross, is a big problem.

As he pointed out, anyone who wants to use a conversational interface to connect a human being and a smart machine basically has to solve two issues. First, they have to build a “highly personal and customized interaction for the customer.” Then, they have to make sure that interactions can keep expanding – because the AI is learning, and thus interacting more efficiently.

The Power of “Face-to-Face” Interactions

The intent, Cross noted, is not to trick the user into believing the avatar they are speaking to is a real person. The firms they work with across a variety of verticals, he said, make no effort to disguise the fact that customers are talking to a virtual human. Air New Zealand customers know they are talking to an AI avatar when they are dealing with the concierge service.

As Cross maintained, there is no need to hide it, because putting a literal face on the technology only makes the interaction that much better. Customers actually like talking to a visual avatar – after all, Cross noted, they already tend to think of Alexa as a “she” instead of an “it.”

“We believe it is a much more personalized customer experience,” he said. “From here, we get to a position where customers can really have a more intimate experience, because we are better able to create and convey the complex range of emotions the human face can convey.”

Moreover, he noted, the inclusion of a facial focal point makes some of the behaviors of the AI more palatable for human consumers. For example, customers often don’t like the idea of a device using its camera to “scan” their face to read data about their mood – they tend to find it “creepy.” But the same activity doesn’t read as off-putting when done by a virtual human – instead, it reads as the AI looking at the customer.

Which is why, Cross pointed out, the firm is building so many virtual humans for so many partners.

Putting a Face to a Name

As of this week, NatWest has announced that it will be taking the digital human concept for a ride, helping cater to its customers’ needs when it comes to getting answers to basic banking queries.

At the start of 2017, the bank deployed a text-based chatbot named “Cora,” which can already handle 200 basic banking queries and now holds 100,000 conversations a month. The goal of the partnership is two-fold, Cross noted. First, he said, they are hoping to help Cora transfer all of those basic banking skills into face-to-face personal interactions. That entails more than a direct port of the text-based conversational platform, because the translation is not exactly one-to-one.

“People talking face-to-face use very different language than when they are texting,” Cross observed.
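Neither Cross nor the bank spells out how those text replies get reworded for spoken, face-to-face delivery. As a purely illustrative sketch, with invented intent names and wording rather than anything NatWest or Soul Machines™ actually uses, the difference might look like this:

```python
# Hypothetical example of why porting a text chatbot to a face-to-face avatar
# is not a one-to-one translation. The intent names and replies are invented.

replies = {
    "check_balance": {
        "text": "Your balance is {amount}.",
        "spoken": "Sure, let me check that for you. Right now you have {amount} in your account.",
    },
    "report_lost_card": {
        "text": "Your card has been blocked. A replacement will arrive in 3-5 days.",
        "spoken": "I'm sorry to hear that. I've blocked the card for you, and a new one "
                  "should reach you in three to five days.",
    },
}

def respond(intent: str, channel: str, **details: str) -> str:
    """Pick the wording appropriate to the channel, then fill in the details."""
    return replies[intent][channel].format(**details)

print(respond("check_balance", "text", amount="$1,250.00"))
print(respond("check_balance", "spoken", amount="$1,250.00"))
```

The point of the sketch is simply that the spoken variant carries conversational padding and pronounceable numbers, which is why the skills cannot just be copied across verbatim.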

But beyond merely adopting Cora’s current skill set, he noted, the bigger goal and vision is to expand the universe of what she can do in cooperation with the customer as “they get to know each other better.”

“The end goal is to get Cora to a point where she can be a consumer’s personal banker – that is the hope for this digital human, that the more one uses it, the more helpful it is going to get,” Cross said. “That is the promise for the future in the interaction.” 

What’s Next

Cross and the team at Soul Machines™ know a thing or two about avatars. The amazing – if complicated – thing, Cross said, is that they are not developing this technology in any single direction so much as they are building emotional content for these avatar-based smart bots, which can then spread those benefits across a range of use cases.

One of their avatars, Baby X, now has a body and can realistically move his arms and legs, meaning the world of gestural communication is opening up to the digital humans being built in the very near future.

The grander vision, Cross noted, is to develop a series of tools so that everyone can have the custom-built digital assistant they want or need, offered up freely to third parties.

But they are building it out to do some really different things. For example, Cross noted, they are currently using their Virtual Nervous System to construct a virtual human for someone who is no longer alive: specifically, an art grandmaster who has been dead for over 100 years.

“What we are looking at is creating a digital grandmaster artist,” he said. “So that someone at their favorite art gallery or museum can be standing in front of one of the great paintings of the world, having a digital version of that artist explaining the work.”

So, what would you do if you met a digital person? You might learn something, organize your finances, book a trip, get tech support or even meet the digital ghost of a great mind from generations past.

If Soul Machines™ has its way, that’s just scratching the surface.

PRESS: NZ can’t afford to fall behind in the AI revolution

by Kip Brook | Make Lemonade | March 1, 2018

Auckland – The head of New Zealand’s leading artificial intelligence (AI) company, Soul Machines™, has issued a plea to New Zealand corporates not to fall behind in the global development of AI, the latest tech industrial revolution.

Greg Cross, chief business officer for Soul Machines™, says jumping on the AI bandwagon is a big challenge and a big opportunity for New Zealand companies.

“It will be fundamental to the competitiveness of our big industries going forward and currently there is not a lot of evidence that our corporates are experimenting and innovating at this point,” he says.

Cross is one of 20 top speakers at AI Day, the biggest artificial intelligence (AI) event ever to be held in New Zealand, taking place in Auckland on March 28. Other speakers include Microsoft’s Steve Guggenheimer, IBM’s Adam Cutler and Amazon’s Alayna Van Dervort.

The conference has been organised by NewZealand.AI and the AI Forum NZ, which is part of the NZTech Alliance, bringing together 14 tech communities, more than 560 organisations and more than 100,000 employees to help create a more prosperous New Zealand underpinned by technology.

Cross’s Auckland-based company Soul Machines™ makes emotionally responsive, artificially intelligent human avatars. It has built eight digital humans and will build about 20 more over the next 12 months.

“These avatars are bringing a whole new level to online customer service,” he says.

“Our digital humans are avatars with a central nervous system that can be mapped to show how they respond. The machines will be more useful to us and more natural to interact with.

“Kiwis are going to be spending more and more time interacting with these digital human-like creations. An enormous amount of detail goes into making all aspects of these avatars and we really focused on making a difference to the way we live our lives.

“We are going to spend more of our time interacting with AI systems, robots and machines such as self-driving cars. To be more like us, these machines will need to be emotionally engaging, so that we are capable of forming a relationship with them.

“The core theory behind our technology is our faces are the mirror image of our brain. You can’t create a realistic face without creating models of the human brain as well,” Cross says.

“Our avatars also have a breathing model because when we speak as human beings, we have to manage our sentences based on when we have to take a breath, so our tone of voice or expression on our face may change slightly.”
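Cross does not go into the mechanics, but the idea of pacing speech around breath points can be sketched in a few lines. The word budget and the pause marker below are assumptions made for illustration, not Soul Machines™’ actual breathing model:

```python
# Hypothetical illustration of a "breathing model" for speech pacing.
# The word budget and the pause marker are invented; this is only a sketch of
# how breath points could shape delivery, not Soul Machines' implementation.

WORDS_PER_BREATH = 8  # assumed lung capacity, expressed as a word budget

def breath_groups(sentence: str, budget: int = WORDS_PER_BREATH) -> list:
    """Split a sentence into chunks the avatar can say on a single breath."""
    words = sentence.split()
    return [" ".join(words[i:i + budget]) for i in range(0, len(words), budget)]

sentence = ("When we speak as human beings we have to manage our sentences "
            "based on when we have to take a breath")
for group in breath_groups(sentence):
    # Each chunk would be spoken, then followed by a brief inhalation that can
    # subtly change the tone of voice or the expression on the face.
    print(group + " ...[breath]")
```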

Artificial intelligence has reached a tipping point, and business leaders are not aware of the changes it will bring to the economy and society, Cross says.

“These human avatar assistants are reinventing the way organisations serve customers online. Some businesses really realise the size and the scale of the impact this is going to have.”

“AI is the next industrial revolution and Kiwi businesses have to act quickly to survive it. Companies at the leading edge of artificial intelligence are few and far between in New Zealand,” Cross says.

Meanwhile, AI Forum executive director Ben Reid says that soon after the AI Day event, the AI Forum will release a major research report on the impact of AI in New Zealand, identifying the opportunities and challenges for our country.

Note: See a recent presentation Cross made in Taipei.

PRESS: Would you pay to immortalise yourself in a digital forever?

Madison Reidy | SUNDAY STAR TIMES | February 18, 2018

Soul Machines™ has created eight virtual twins so far and digital Rachel is one of them.

The digital human is a facsimile of company employee Rachel Love, though she was renamed and used by Air New Zealand as an ambassador last year.

Soul Machines™ employee Rachel Love is one of only eight people in the world who have a virtual version of themselves. Photo: Peter Meecham/Stuff

The pair are strikingly similar, even if their eyes are different colours: Love has blue eyes, digital Rachel’s are brown. 

Digital Rachel reacts like any human might. Smile, and she smiles back. Clap abruptly in her face and she is startled.  

But she is not human. She is software. She has a face, and is programmed to be “emotionally responsive”, but she lives within a computer.

Digital Rachel, created by the New Zealand-based artificial intelligence company, represents the latest step in our journey to achieving immortality.

Soul Machines™ wants to make life after death a reality by allowing us to exchange our blood and tissue for pixels and hardware to create digital clones of ourselves.

It aims to make digital humans a mainstream option within a decade.

A DIGITAL BIRTH

As the pair spent time together, digital Rachel learned Love’s mannerisms, personality and politeness, downloading it all into her virtual nervous system – a digital brain built to work like a human brain.

It is a technological revolution Soul Machines™’ engineers, neuroscientists and psychologists have spent years developing.

When she sees a person, Rachel is programmed to recognise facial expressions and speech and respond appropriately.

The system’s intricacy includes a simulated flow of dopamine through Rachel’s brain when someone smiles at her, prompting her to smile back. Movement tracking allows her eyes to follow a person as they move past the screen she lives behind.

She is artificially intelligent, so she requires no manual control. Through machine learning, she gets smarter with every interaction.
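The company has not published how that loop is implemented. As a rough, purely hypothetical sketch of the behaviour described above, with invented names, thresholds and decay rates rather than anything from the actual Virtual Nervous System, the smile-and-gaze response might be modelled like this:

```python
# Hypothetical sketch of the response loop described above. The variable names,
# the dopamine threshold and the decay rate are invented simplifications; this
# is not Soul Machines' Virtual Nervous System.

from dataclasses import dataclass

@dataclass
class AvatarState:
    dopamine: float = 0.0  # simulated neurochemical level
    gaze_x: float = 0.0    # where the avatar's eyes point, in screen coordinates

SMILE_THRESHOLD = 0.6      # assumed level at which the avatar smiles back
DECAY = 0.9                # assumed per-frame decay of the dopamine signal

def update(state: AvatarState, user_smiling: bool, user_x: float) -> str:
    """Advance the avatar one frame, given what the camera reports."""
    # A detected smile raises the simulated dopamine level; otherwise it decays.
    if user_smiling:
        state.dopamine = min(1.0, state.dopamine + 0.3)
    else:
        state.dopamine *= DECAY

    # Movement tracking: ease the eyes toward wherever the person currently is.
    state.gaze_x += 0.2 * (user_x - state.gaze_x)

    # The expression follows the internal signal rather than the raw camera input.
    return "smile" if state.dopamine > SMILE_THRESHOLD else "neutral"

# Example: four frames in which the person walks past, smiling for the first three.
state = AvatarState()
for smiling, position in [(True, -0.5), (True, 0.0), (True, 0.3), (False, 0.5)]:
    print(update(state, smiling, position), round(state.dopamine, 2), round(state.gaze_x, 2))
```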

Soul Machines™ chief business officer Greg Cross says after creating the likeness of a person, they must then create “the personality and the knowledge base that drives the digital human behind it”.

“Our virtual nervous system will be what brings it to life in an incredibly accurate version of yourself.”

The company has spent years creating freakishly realistic digital avatars that react and respond like humans.

For now, they sell them to companies to hire as digital employees and ambassadors. But the business case for digital humans extends far beyond the corporate world. 

Cross says they are already working with a handful of celebrities, whose identities are confidential, who want a digital double to preach their philanthropic message and keep their legacy alive forever.

“Imagine being able to build a digital version of that person for the purpose of continuing the story of that philanthropic foundation.

“Or you could take an iconic entrepreneur like Richard Branson and say what would it be like to start building a Branson now, so he can continue to have an influence and his stories and his journeys can continue to be told for generations to come.”

Cross says they are already working on a project to bring back to life “a very, very famous person” who lived in an era before photography and video. He is not, however, revealing who the returnee will be.

Soul Machines™’ latest technological advancement is a motor control system in their Baby X 5.0 infant digital human.

Adult digital humans such as Rachel are little more than talking heads. Baby X 5.0 has limbs it can move, too, to make hand gestures. It has a pumping heart and lungs.

Soul Machines™ is not the only player in the digital immortality game.

Start-up eterni.me wants to make people “virtually immortal” by amalgamating their online photos, videos and conversations after they die so family and friends can continue to speak to them. 

“Think of it like a library that has people instead of books. An invaluable treasure for humanity,” eterni.me’s website says. 

Some 40,000 people have already signed up to one day become virtually immortal. 

Cross says having a ‘humanised’ digital copy of yourself living in the cloud will be mainstream in five to 10 years. 

“We see a world in the future where there will be populations, millions, of these digital humans. We believe that will create a pool of what we call digital DNA™ to recreate just about any face in the world.”

In his opinion, everyone should have the option to exist forever in the virtual world. It will empower the general population, he says.

ETHICAL DILEMMA

But this use-case for technology raises as many ethical and legal questions as it does eyebrows. 

Who will own the information your digital-self holds about you and your life after you die? 

Technology law firm Hudson Gavin Martin partner Edwin Lim says that depends on the service contract agreement between the digital human creator and the person they create it for.

Cross says the protection and privacy of the information a digital human holds is paramount. 

Unlike with most modern social media platforms, people interacting with a digital human can choose to tell it as much or as little information as they want.

“We have become the opt-in society. We agree to give data away without really knowing what we are doing. It should be you deciding how much of that information, and how much of you, you want to share.”

What is of more concern, though, is identity theft. Humans aren’t hackable; digital humans are.

Cross says that is already a contentious conversation worldwide. But he is optimistic the technology won’t fall into the wrong hands. 

“I can honestly say I have not met a single person who is not doing this because they are doing something amazing for mankind, for human society.”

Lim says the legal realm is grappling to come to terms with legislating for digital humans, because they, like any artificially intelligent machine, do not meet the definition of a legal entity. That in itself is a danger.

“New Zealand law has not adequately caught up with the rapid development of technology, and artificial intelligence in particular. This is because of the lack of or move away from human input, which is a fundamental assumption of most contracts and legislation.”

Until the problems with such technology come to fruition and make it to court, it is hard to legislate for their responsibilities, he says.

Cross says before digital humans become mainstream ghosts, they will become commonplace in the workforce.

He expects all large companies will soon start hiring digital humans to do mundane tasks in industries where skills are short. 

At the moment, they are single-faceted; they are not autonomous and do not have minds of their own.

“The digital humans we are creating, like Rachel, only have one specific task, to deliver a service or be able to sell a product. They are programmed. They are limited.”

He says the rate of change happening today is mind-blowing and its impact on society is inevitable.

The day when a funeral fund pays for a digital clone to exist forever seems far-fetched. But so did an iPhone, 15 years ago. 

“We adapt to change in a way that nothing else does, that is part of who we are.”

 – Read the full story at Sunday Star Times