HOT OFF THE PRESS: Emotional intelligence: Daimler Financial Services invests in Soul Machines

“It will be a game changer,” says CIO Udo Neumann

By Sara Castellanos and Kim S. Nash | The Wall Street Journal | Mar 1, 2018

Udo Neumann, global chief information officer for Daimler Financial Services, standing next to Sarah, the ‘digital human,’ at 2018’s Mobile World Congress in Barcelona. Photo: Daimler AG

Digital assistants on the market now can help customers with tasks like finding the right pair of jeans and making payments to credit cards, all while being polite, helpful and sometimes witty.

Udo Neumann, global chief information officer for Daimler Financial Services, is exploring how digital assistance could go even further.

An assistant with a human-like “face,” with instant access to helpful data and programmed to detect how people are feeling and respond accordingly, could help gain customer and employee trust, Mr. Neumann said. “It’s clearly the next step in the development of an evolving technology, (where) emotions come into play.”

Daimler Financial’s Sarah can react to spoken and typed words as well as non-verbal cues. Photo: Daimler AG

Daimler Financial Services, a division of Daimler AG, announced this week it’s partnering with New Zealand startup Soul Machines on a proof-of-concept project to see how a digital assistant with a face and a name could give personalized help to employees and customers.

The companies, which have worked together for several months, are developing a “digital human” built with AI software from IBM Watson that can be programmed to answer questions related to car financing, leasing and insurance, with the capability to recognize non-verbal cues using face-recognition technology.

“It will be a game changer. I think we humans love to have interactions on an emotional basis,” Mr. Neumann said.

Neural networking and machine learning tools let an early version, named Sarah, react to spoken and typed words as well as non-verbal cues such as a loud noise or a head nodding in agreement.

Sarah can be programmed with highly specialized knowledge about, for example, the latest Mercedes models and information about leasing options, said Greg Cross, chief business officer at Soul Machines.

“They learn to recognize you, learn about your personality type and respond and create conversational content that matches you,” Mr. Cross said.

The project is in the early stages of development, with no date set for when Sarah could be deployed to employees or customers, Mr. Neumann said. “We want to combine artificial intelligence and emotional intelligence and see how these capabilities come together,” he said.

The digital human could eventually act as a “companion” for employees at a call center or training center, he said. For customers, talking to such an avatar might increase purchases among those who feel intimidated by high-pressure sales staff, said Mr. Cross.

“Some people don’t feel comfortable in a sales room. They will have a conversation in their living room,” he said.

Then, when a customer visits a dealership for a test drive, the customer and the salesperson could converse with the avatar at a kiosk, sharing information, Mr. Cross said.

Unlike humans, Soul Machines’ digital assistant can be programmed only with traits that help it perform a job, he said. Anger and frustration, for example, will not exist. “The digital sales person simply will not have these traits,” he said.

The goal isn’t to replace a human salesperson, though. “It becomes another way a customer can interact with the company,” he said.

PRESS: The insider’s guide to the making of a digital assistant

Liz Maguire, head of digital and transformation at ANZ bank, says Jamie, the avatar, provides vital lessons on digital inclusion and innovation.

Article by Divina Paredes (CIO New Zealand) | 09 October, 2018

 


 

The digital assistant is not going to be for everyone, but this is entirely about customer choice.

Liz Maguire, ANZ

Since Jamie started to work over two months ago, she has had more than 10,000 conversations with customers, and 60 per cent of customers say she was able to answer their queries.

Jamie is a trainee at ANZ Bank and works 24×7. But if you ask her to go out for coffee, she will politely turn you down. “I don’t need coffee to stay awake, but thanks for asking,” she replies. After all, Jamie cannot simply step out of her workplace. Jamie is an avatar, created to answer frequent customer queries.

“Jamie is a work in progress,” explains Liz Maguire, head of digital and transformation at ANZ Bank.

“As human beings, we have been talking for a lot longer than we have been using small screens,” says Maguire, on how the project came about.

“We worked from the hypothesis that talking to a digital assistant was better than pressing a button on a screen.”

Maguire talked about creating Jamie as part of her presentation on the ‘secrets of effective change leaders: inclusion and innovation’ at the recent CIO and Computerworld forum ‘Digital Now and Digital Savvy’ held in conjunction with Zoho.

Maguire explains that at the moment, Jamie can answer the top 30 frequently asked questions or most frequently searched-for topics on the help section of anz.co.nz. She shares that they are receiving demands to include more topics that Jamie can answer.

Jamie was made in partnership with Soul Machines™, whose CEO and co-founder, Mark Sagar, has won awards for his groundbreaking facial technology in King Kong and Avatar.

“If you are happy, she looks happy; if you are sad, she looks genuinely concerned,” says Maguire.

Maguire explains Jamie sits on a “big AI stack” and is in what she calls “moderated learning mode.”

“We are looking at all of the conversations she is having with the customers,” says Maguire. “She gets a lot of abuse, which is kind of disturbing, and which is one of the reasons why we did not turn on the learning mode.”


 

 

Maguire did not elaborate on the nature of the abuse, other than saying they were “clearly inappropriate things.” She discloses that the team spent a lot of time working on privacy issues that go with the deployment of a digital human.

She says a customer accepts a disclosure agreement that they will get written extracts of the conversation and the emotional tags. Jamie can only answer general questions and does not need personal customer data.

“We don’t want people talking about specific information with her,” explains Maguire.

Digital inclusion, digital options

She adds that Jamie complements the digital options and channels of the bank. She says two-thirds of their customers use digital channels regularly. The digital assistant is not going to be for everyone, “but this is entirely about customer choice.”

The ages of those who use the digital assistant are fairly evenly split between those in their early 20s and those in their mid-60s, says Maguire. There is slightly less use of the technology by those over 65 and those under 21. “That is a reflection that those under 21 have less complex banking needs,” she states.

“Jamie is smart, capable, and intelligent, so why could she not be female?” Liz Maguire, ANZ

According to Maguire, she is often asked, “Why is Jamie female?” Her answer: “Jamie is smart, capable, and intelligent, so why could she not be female?”

The truth, she explains, is that Jamie is a stock avatar from Soul Machines™. Jamie is also the same model used by Air New Zealand’s avatar Sophie.

“We are the first bank in the world to have a publicly available digital human on the system,” declares Maguire. So what can other organisations learn from their pioneer work on creating a digital assistant? Maguire ‘crowdsourced’ the answers from her team and distils their responses into four areas:

First is the importance of conversations.

Maguire says first, they had to develop a personality for Jamie, who she describes as “quite geeky.” “The team spent a lot of time saying, what would Jamie say in this particular situation?”

ANZ brought in a former movie director to the team. “She has a whole bunch of skills in making characters believable,” says Maguire. She adds that, “The conversation, without question, is absolutely the biggest piece of work in creating Jamie.”

She reveals the team worked on the digital human for about a year before they were confident for it to go live.

When Jamie went live in July, they learned their second biggest lesson: that people are willing to give time to try new ideas.

“We [were] gobsmacked about how willing our customers and staff have been to try new things,” she relates. Maguire shares that the staff approached customers queuing at the ANZ flagship branch on Queen Street in Auckland. “People were receptive [to the invite], and they were way more receptive to Jamie once they had a go at asking her questions.” There were also unexpected benefits.

“We did not think that potentially Jamie might be a tool to help migrants,” says Maguire. “We had customers with English as a second language and they said to us, ‘actually, I feel at ease. Sometimes I am worried with my English when I am speaking to someone in your branch. I don’t have that problem when I am dealing with Jamie’.”

She says the bank also tested Jamie with more than 150 customers and staff, to “give us reassurance we aren’t going to bungle up our brand.” This, she says, is a lesson for heritage companies working on innovation projects. “If you are doing something that is quite different from what you do, [consider] what impact that is going to have on your brand.”

She then shares the third lesson: how Jamie has highlighted to them that customers have surprisingly high expectations of innovation.

Since the launch, customers have felt that Jamie can answer anything. So much so that when the bank had an outage a few weeks ago, everybody who went to Jamie thought she would know what was going on. “Once you start down the path, you have to move pretty darn quickly to be able to keep up with customer expectations and hopefully get ahead as well,” she says.

Jamie is responding, talking to the customer in real time, and we are building a bunch of functionalities to make use of that real-time capability, Maguire further expounds.

She then segues to the fourth lesson shared by her team: There’s never enough time.

“When you are working on new innovation, you are always pressed for time,” states Maguire. She says when they were working on Jamie, the year felt like an “excruciatingly long time”.

Since Jamie was a pilot project, they had to work on a lot of sign-offs, and spent a lot of time testing “to get the project right”. “What we have found is the time is super slow when you are bringing it to market. The second you bring it to market, time is up really, really quickly,” she says. Thus, she advises the audience at the CIO and Computerworld forum, “Give yourself a lot more time upfront and a lot more time than you think you need for your pilot. Before you know it, your pilot will be finished.”

“We are pretty excited over Jamie,” says Maguire.  She sees “interesting use cases” for Jamie within and outside the bank.

In the future, Jamie may be able to tell a customer about interest rates in real time.

“I would like to see her at GoMoney,” says Maguire.

GoMoney, developed by her team, is the most popular mobile banking app in New Zealand, having been downloaded by more than half a million Kiwis.

“We expect over time we’ll have our own branch avatars, as well.”

PRESS: Greg Cross wants your next employee to be an AI-powered digital human

Article by Jordan Teicher for INDUSTRIOUS issue 3 – IBM’s Quarterly magazine about the latest trends in the industry

“Humans can communicate in lots of ways,” said Greg Cross. “But when we actually want to have important conversations we always do those face to face.”

Cross, the CBO of Soul Machines™, practices what he preaches. Though he lives in New Zealand, he took time out of a brief business trip in New York to meet me in person at IBM’s office near Union Square. We gathered to talk about his company, whose mission is to make face-to-face conversations like ours part of the most common interactions we have today—namely, the interactions we have with intelligent machines.

“We’re heading into a world where we’re going to spend a lot more of our time interacting with machines. We have a fundamental belief that these machines can be more helpful to us if they’re more like us,” he said.

To do that, Soul Machines™’ team of AI researchers, neuroscientists, psychologists and artists are creating “digital humans”—fully autonomous, animated individuals that look and sound like real people. The key to their intelligence is a cloud-based virtual central nervous system called the Human Computing Engine, which sits atop IBM Watson and uses Watson Assistant.

When connected to that system, Soul Machines™’ digital humans are amazingly lifelike. They hear and see the people with whom they interact, and their conversations with those people are made emotive through nuanced facial expressions. For businesses, Cross said, digital humans can revolutionize the economics of customer service, giving them the ability to provide personalized and consistent care at scale.

A face, Cross said, is a “reflection of the heart and mind of an individual,” and it can be key to successful digital interactions with customers. In the years to come, he bets, businesses across industries will agree and make digital humans an integral part of their workforce.

“The question we wanted to explore was: What happens when you create a digital face? Will people engage with it? Will they find that digital face more engaging than a chatbot or a voice assistant? Our view is that, yes, of course they will. That’s ultimately the market and business development we’ve been going on,” Cross said.

“It completely captured my imagination”

Cross has been a technology entrepreneur nearly his entire career. At 18, he dropped out of business school at the University of Waikato and began an internship at the high-tech manufacturer Trigon Packaging. Since then, he’s worked at technology startups in different industries all over the world. In 2007, he co-founded PowerbyProxi, a spin-out of the University of Auckland’s wireless power department, which developed high-efficiency and high-density wireless power products. The company sold to Apple last year.

“For me, there’s nothing more fun than taking on some sort of core technology or core idea, wrapping a team of people behind it, and exploring how you build a company around it. That’s still what gets me out of bed with a smile on my face,” he said.

Two years ago, Cross found his most recent opportunity to do just that when he met Dr. Mark Sagar, an Academy Award-winning animator who was then the director of the Laboratory for Animate Technologies at The University of Auckland. Cross had, in the past, seen Sagar present his work—a virtual animated baby called BabyX that learns and reacts like a real human infant. But when Sagar sat down with him one-on-one to show him the technology underlying his creation, Cross knew he had to get involved.

“It completely captured my imagination,” Cross said.

First steps

When Cross and Sagar first started thinking about how to turn the technology into a business, they drew up a list of half a dozen industries they knew were facing “quite significant disruption,” and began imagining how digital humans could help. They then started talking about digital humans at technology and industry conferences. Soon, business leaders eager to drive change in their industries wanted to talk with them.

“It’s like any new technology; it’s well understood that there’s an adoption curve. There are the early adopters and then there are those who never want to be first. We’re always very careful about making sure we’re speaking to the right people,” he said.

So far, it seems, Cross has found those people. This year, Soul Machines debuted its first crop of digital employees at Autodesk, Daimler Financial Services and NatWest. It’s still early days, Cross said, but the employees—Cora, Sarah, and Ava—are paving the way for a future in which digital humans will be an integral part of the way people interact with businesses.

“I like to think in five years we’ll create a very large population of digital humans who will be interacting with people and having hundreds of millions of conversations every day.”

Imagining the future

Where might digital humans pop up next? Cross couldn’t talk about some of Soul Machines™’ upcoming projects. But the appetite for next-generation customer service solutions, he said, is strong across a number of industries, including retail and telecommunications. Digital humans could find a productive place in all of them.

In a fast-paced, digitally-driven landscape, customers have little patience for endless call center queues and customer service departments with limited hours. Increasingly, they expect quick, seamless interactions at any time of the day or night with representatives that understand and remember their preferences and history.

At the moment, Soul Machines™’ digital humans are making their mark in customer service. But Cross is already investigating a wide range of future applications for his company’s technology. He imagines digital humans one day teaching classes or providing medical care. Celebrities, he said, could enlist their own digital twin to perform tasks they can’t fit into their schedule. The possibilities, Cross said, are endless—and he’s exploring as many of them as possible.

“One day I can be sitting in a board room doing a presentation for a CEO of one of the largest banks or the largest tech companies in the world. Another day I can be sitting down with the biggest celebrities in the world,” he said. “It’s a huge amount of fun.”

 

HOT OFF THE PRESS: New Zealand startup Soul Machines™ puts human face on AI

Company founded by ‘Avatar’ animator will bring its digital humans to Asia next year

Article by Akane Okutsu | Nikkei Asian Review | October 02, 2018

 

 Soul Machines’ digital humanoid Lia shows lifelike expressions that would be hard for a physical robot to match Soul Machines™’ digital humanoid Lia shows lifelike expressions that would be hard for a physical robot to match

 

TOKYO — “Do you need me to tell everyone your life story?” asks Lia as she appears on a screen, offering to introduce the speaker. Her wrinkles and moving eyes make her look like a real person, but she is a digital humanoid.

New Zealand-based startup Soul Machines™ has so far created 15 such humanoids — disembodied screen presences — employed mainly as customer service assistants. They work in seven countries for companies including Royal Bank of Scotland and Australia and New Zealand Banking Group.

The company looks to produce thousands of digital humanoids in the next three years, with plans to expand into Asia for the first time in the next 12 months.

In Asia, Soul Machines™ will launch projects with companies in Japan and China in the first half of next year. It also wants to expand to Southeast Asian countries like Singapore.

“We tend to look for industries where we know they are going to go through substantial change, such as banking, autos, health care and education,” Chief Business Officer Greg Cross told the Nikkei Asian Review in an interview on the sidelines of FIN/SUM 2018, an annual financial technology summit in Tokyo sponsored by Nikkei. Cross declined to disclose the partners’ names or their industries.

 

 

“The Chinese market is hugely exciting,” Cross said, as “there are opportunities to leapfrog industry infrastructure” that is relatively underdeveloped, lacking sufficient access to health or financial services.

“Over the next few years we would expect our team to grow to as many as 200 to 300 people to support the business,” Cross said.

Entry into Asia means making humanoids that look Asian, and adjusting their social behaviors to fit the host cultures, Cross said.

One of these humanoids can be created and implemented for “less than half a million dollars,” according to Cross. The company charges annual subscription fees based on factors such as the number of personalities and languages, as well as transaction fees that vary with the number of conversations.

The startup was co-founded two years ago by Mark Sagar, who won sci-tech Academy Awards for the films “King Kong” and “Avatar.” It has attracted investors including Hong Kong-based Horizon Ventures, as well as founders of Facebook and Google-owned AI company DeepMind, the developer of the AlphaGo program.

Unlike competitors that feed prerecorded content into a chat box, “we are actually autonomously animating these digital characters using brain models to synthesize human behavior in real time,” Cross said in his speech at FIN/SUM.

Soul Machines™’ digital assistants use existing artificial intelligence engines such as IBM’s Watson, which supports several languages. The company also trains its AIs to understand different English accents.

Going beyond software, the Soul Machines™ team includes neuroscientists to build artificial brains and nervous systems. Artificial digital versions of hormones like adrenaline and oxytocin run through these systems, making the humanoids act human.

“There are some jobs that digital humans and AI can do better than real humans,” Cross told Nikkei in an interview, mentioning customer support. Humanoids become practical for providing personalized services to a large number of users, collecting information and learning in the process. They are better than humans at providing specialized information and analysis based on vast quantities of data, he added.

 

 Greg Cross, Soul Machines’ chief business officer, says humanoids beat humans at some tasks (Photo by Takuya Fujisawa) Greg Cross, Soul Machines™’ chief business officer, says humanoids beat humans at some tasks (Photo by Takuya Fujisawa)

 

Soul Machines™ continues research and development on humanoids that are even more lifelike. Cross said the company is “making our virtual nervous system smarter, teaching it how to learn, teaching it how to cooperate, teaching it things like social learning patterns.”

The idea is that the more humanoids resemble humans, the better they can interact with and be trusted by them. “If you are on a self-driving car, how do you trust that machine?” asked Cross, suggesting that an artificial chauffeur may help.

Humanoids would also benefit from technological developments by other companies, such as improvements in the accuracy of natural-language processing. Even physical humanoid robots are possible if other companies develop technology that imitates the movements of human facial muscles, Cross said.

Read the full article here

 

 

PRESS: World’s first digital teacher starts work teaching kids about energy

Article as featured on Fanatical Futurist | September 2018

WHY THIS MATTERS IN BRIEF

AI, avatars and bots play an increasingly central role in the future of education, and Will is the first teacher of his kind

It’s back to school time for millions of children around the world and you know what that means – it’s time to fire up the tablet that teaches you. At least that’s what primary school students in New Zealand are doing after 125,000 of them have become the first students in the world to learn from an incredibly lifelike Artificially Intelligent (AI) digital avatar.

A few months ago Auckland energy company Vector teamed up with New Zealand AI company Soul Machines™, whose complex deep-learning-based avatars I’ve discussed at length before, to create Will, a convincing and engaging digital teacher. And let me remind you, when you watch the video below, that Will is the first of his kind, so you can expect him to improve very quickly as the technology advances. Will is now part of Vector’s “Be Sustainable with Energy” program, which it offers free of charge to the schools it sells electricity to, and he, or maybe it, who knows… will be teaching the children all about sensible energy use.

 

 

The students will be fully able to interact with him on the device of their choice. Thanks to some impressive AI chops from Soul Machines™, who specialise in “Human to AI” interfaces, it won’t be long until the students’ interactions with Will, from the way he speaks and his responses to his mannerisms, sound and feel “real.”

As you’d expect, Will’s main skillsets at the moment center around different forms of renewable energy, such as solar and wind, and he can also ask the students questions about what they’ve learned to make sure his “lessons” stick. According to Vector’s Chief Digital Officer, Nikhil Ravishankar, students seem particularly taken by Will, and when you see the video above it’s probably no surprise.

“What was fascinating to me was the reaction of the children to Will. The way they look at the world is so creative and different, and Will really captured their attention,” he said in a news release.

He went on to add, “Using a digital human is a very compelling method to deliver new information to people, and I have a lot of hope in this technology as a means to deliver cost effective, rich, educational experiences into the future.”

Ravishankar isn’t the only person who thinks bots, in this case in the form of AI software programs, will play an increasingly central role in education, many experts do too, as I highlight in my Future of Education 2020 to 2070 report.

It’s a well-documented fact that many nations, particularly in the developing world, don’t have nearly enough teachers, so bots like Will could one day help fill that gap. Compared to the cost of paying a human teacher, these systems are also far cheaper, they can scale to millions of students per avatar, and they can adjust to each student’s individual learning style, known as “adaptive learning,” to help them reach their full potential.

While AI teachers could provide a host of benefits, suffice it to say they still aren’t as advanced as they need to be. Will, for example, is only well versed on one topic, renewable energy, while quality teachers are typically far more well rounded. However, as we see advances in Artificial General Intelligence, like the one we saw recently, where AIs become experts in multiple domains, over time this will become less of an issue. But social interaction between teachers and students is also critical to a quality education, and AI teachers most certainly lag behind their human counterparts in this realm. In fact, let’s face it: that’s an area where they simply can’t compete and won’t be able to for a very long time, even with the use of Augmented Reality and Virtual Reality to help them.

All that said, though, while Will might be the first digital teacher to hit the classroom, he almost certainly won’t be the last. There’s a revolution coming.

PRESS: Robots will probably help care for you when you’re old

 

Article by Corinne Purtill | Excerpt from QUARTZ | 12 SEPTEMBER 2018

Among the symptoms of dementia is a phenomenon called “sundowners syndrome”: an increase in agitation, confusion, and anxiety as late afternoon transitions to evening. Its cause isn’t well understood; circadian rhythm disruptions precipitated by the change in light, anxiety over end-of-day activity, and hormonal fluctuations have all been floated as theories. Whatever the trigger, sundowners can make otherwise amiable people combative and even violent, a frightening and unsettling experience for patients and caregivers alike.

Staff in hospitals and nursing homes typically treat the symptoms with sedative drugs. But in recent years, facilities from Japan to the US have turned instead to a specialist: a robot baby seal named Paro.

Paro spent a decade in development at Japan’s National Institute of Advanced Industrial Science and Technology. The robot seal came to market in 2004 and is now in use in many parts of Asia, Europe, and North America to offer the psychological benefits of pet therapy in situations where a real animal isn’t practical…


To a person in normal cognitive health, Paro is unmistakably a machine. A soft mechanical sound accompanies its motions; up close, you can see its whiskers have tiny sensors on the ends. Given the comfort it brings to people suffering a dreadful disease, insisting that patients recognize its artificiality seems cold and beside the point.

But you don’t have to peer very far into the future to see the possibility of interactions in which it will be difficult even for a person with their full cognitive faculties to tell the difference between robots and reality.

The Auckland, New Zealand-based tech company Soul Machines™ creates AI interfaces that look uncannily like high-definition video chats with a real human being. It doesn’t quite pass the Turing Test, but it’s easy to imagine a situation in which someone with limited eyesight or cognitive disabilities believes they’re having a human conversation when talking to a robot like “Ava.”

 

 

Or “Sarah.”

 

 

Or this baby.

 

 

Soul Machines™ licenses its user interface technology to businesses and institutions. Its technology has powered digital assistants for banks, airlines, and software companies, as well as a prototype virtual assistant, voiced by the actor Cate Blanchett, that was designed to help people with disabilities navigate Australia’s public benefits system. (That program was shelved, not long after the Australian government’s disastrous introduction of an automated system to detect welfare fraud drew public outcry.) Soul Machines™ has discussed services for the elderly with prospective clients but has not announced any partnerships on that subject to date, says chief business officer Greg Cross.

Soul Machines™ envisions a future in which digital instructors educate students without access to quality human teachers, and in which famous deceased artists are digitally resurrected to discuss their works in museums. Robot companions for the infirm, then, are not too far a leap. Nor is the prospect of a future in which a family converses with the lively AI recreation of a person suffering from dementia, while a caregiver—robot or human—tends to their ailing body in another room.

 

HOT OFF THE PRESS: What We Have to Gain From Making Machines More Human

ARTICLE BY MARC PROSSER | SINGULARITY HUB | 11 SEPTEMBER 2018

The borders between the real world and the digital world keep crumbling, and the latter’s importance in both our personal and professional lives keeps growing. Some describe the melding of virtual and real worlds as part of the fourth industrial revolution. Said revolution’s full impact on us as individuals, our companies, communities, and societies is still unknown.

Greg Cross, chief business officer of New Zealand-based AI company Soul Machines™, thinks one inescapable consequence of these crumbling borders is people spending more and more time interacting with technology. In a presentation at Singularity University’s Global Summit in San Francisco last month, Cross unveiled Soul Machines™’ latest work and shared his views on the current state of human-like AI and where the technology may go in the near future.

Humanizing Technology Interaction

Cross started by introducing Rachel, one of Soul Machines™’ “emotionally responsive digital humans.” The company has built 15 different digital humans of various sexes, groups, and ethnicities. Rachel, along with her “sisters” and “brothers,” has a virtual nervous system based on neural networks and biological models of different paths in the human brain. The system is controlled by virtual neurotransmitters and hormones akin to dopamine, serotonin, and oxytocin, which influence learning and behavior.

As a result, each digital human can have its own unique set of “feelings” and responses to interactions. People interact with them via visual and audio sensors, and the machines respond in real time.
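To make the idea of hormone-like variables shaping behavior more concrete, here is a purely illustrative toy model in Python. It is not Soul Machines™’ actual implementation: the level names are borrowed from the article, but the dynamics, numbers, and the `mood()` readout are assumptions invented for the sketch. It only shows how a few “virtual neurotransmitter” levels could be nudged by a stimulus, decay back toward a baseline, and feed a value that a response generator might consult.

```python
from dataclasses import dataclass, field

@dataclass
class AffectState:
    # Hypothetical hormone-like levels; names from the article, dynamics invented.
    levels: dict = field(default_factory=lambda: {
        "dopamine": 0.5, "serotonin": 0.5, "oxytocin": 0.5})
    baseline: float = 0.5
    decay: float = 0.1  # fraction of the gap to baseline closed per tick

    def tick(self) -> None:
        # With no stimulus, every level relaxes back toward its baseline.
        for name, level in self.levels.items():
            self.levels[name] = level + self.decay * (self.baseline - level)

    def stimulus(self, deltas: dict) -> None:
        # A perceived event (say, a smiling user) nudges selected levels.
        for name, delta in deltas.items():
            self.levels[name] = min(1.0, max(0.0, self.levels[name] + delta))

    def mood(self) -> float:
        # A crude scalar "mood" that could bias word choice and expression.
        return sum(self.levels.values()) / len(self.levels)

state = AffectState()
state.stimulus({"dopamine": 0.3, "serotonin": 0.2})   # user smiles
print(round(state.mood(), 2))                         # elevated right after the stimulus
for _ in range(10):
    state.tick()
print(round(state.mood(), 2))                         # drifting back toward baseline
```

The point of the toy is simply that a persistent internal state, rather than a one-off rule, is what lets each character respond to the same input differently depending on what has happened recently.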

“Over the last 20 or 30 years, the way we think about machines and the way we interact with machines has changed,” Cross said. “We’ve always had this view that they should actually be more human-like.”

The realism of the digital humans’ graphic representations comes thanks to the work of Soul Machines™’ other co-founder, Dr. Mark Sagar, who has won two Academy Awards for his work on some computer-generated movies, including James Cameron’s Avatar.

Cross pointed out, for example, that rather than being unrealistically flawless and clear, Rachel’s skin has blemishes and sun spots, just like real human skin would.

The Next Human-Machine Frontier

When people interact with each other face to face, emotional and intellectual engagement both heavily influence the interaction. What would it look like for machines to bring those same emotional and intellectual capacities to our interactions with them, and how would this type of interaction affect the way we use, relate to, and feel about AI?

Cross and his colleagues believe that humanizing artificial intelligence will make the technology more useful to humanity, and prompt people to use AI in more beneficial ways.

“What we think is a very important view as we move forward is that these machines can be more helpful to us. They can be more useful to us. They can be more interesting to us if they’re actually more like us,” Cross said.

It is an approach that seems to resonate with companies and organizations. In the UK, for example, NatWest Bank is testing out Cora as a digital employee to help answer customer queries. In Germany, Daimler Financial Services plans to employ Sarah as something “similar to a personal concierge” for its customers. According to Cross, Daimler is looking at other ways it could deploy digital humans across the organization, from digital service people to digital salespeople and, maybe in the future, digital chauffeurs.

Soul Machines™’ latest creation is Will, a digital teacher that can interact with children through a desktop, tablet, or mobile device and help them learn about renewable energy. Cross sees other social uses for digital humans, including potentially serving as doctors to rural communities.

Our Digital Friends—and Twins

Soul Machines™ is not alone in its quest to humanize technology. It is a direction many technology companies, including the likes of Amazon, also seem to be pursuing. Amazon is working on building a home robot that, according to Bloomberg, “could be a sort of mobile Alexa.”

Finding a more human form for technology seems like a particularly pervasive pursuit in Japan. Not just when it comes to its many, many robots, but also virtual assistants like Gatebox.

The Japanese approach was perhaps best summed up by famous android researcher Dr. Hiroshi Ishiguro, who I interviewed last year: “The human brain is set up to recognize and interact with humans. So, it makes sense to focus on developing the body for the AI mind, as well as the AI. I believe that the final goal for both Japanese and other companies and scientists is to create human-like interaction.”

During Cross’s presentation, Rob Nail, CEO and associate founder of Singularity University, joined him on the stage, extending an invitation to Rachel to be SU’s first fully digital faculty member. Rachel accepted, and though she’s the only digital faculty member right now, she predicted this won’t be the case for long.

“In 10 years, all of you will have digital versions of yourself, just like me, to take on specific tasks and make your life a whole lot easier,” she said. “This is great news for me. I’ll have millions of digital friends.”

AI – are humans obsolete?

 

Article by Andrew Cornell | Excerpt from ANZ bluenotes | 13 SEPTEMBER 2018

Dystopia or Utopia? Will robots be our – willing – slaves or overlords? Actually, what is ‘artificial intelligence’? We may not realise how pervasive it is already.

We spoke to an AI scientist, a digital human creator, a venture capitalist, an ex-Google banker and an economist about their experience with and expectations for AI in the future.

For further discussion points, see the video below.


Greg Cross – Chief Business Officer & Co-Founder, Soul Machines

At Soul Machines™, we’ve gone beyond what Hollywood is doing with the quality of the digital characters they are producing. What’s really interesting is the way we bring them to life by taking away actors and cameras which have traditionally been used to bring computer generated characters to life.

We’ve automated these characters by giving them a brain and creating a system which enables them to respond, interact and engage with us in exactly the same way we engage with each other.

But at what point does this digital character become engaging to the humans that they’re interacting with? At what point can we relate to them? At what point can we learn to trust them?

In very simple terms by putting a face on artificial intelligence we’re trying to create a platform where people can develop trust in machines.

Building trust between humans and machines is going to be a really critical part of the way we use our systems in the next 10, 20 or 30 years.


 

PRESS: Nigel Latta launches ‘A Curious Mind’ series featuring BabyX

Original Title: “A Curious Mind: Nigel Latta on making popular TV without dumbing it down”

BY HAIMONA GRAY | THE SPINOFF | 25 AUGUST 2018

Television icon Nigel Latta returns to TV, and this time he’s focused on the one thing that governs us all: the brain. Haimona Gray talks to the man himself.

“Our belief has always been that people are interested in interesting things. Sometimes TV patronises the audience, it has a belief that people won’t stay if it’s real content – but they do.” – Nigel Latta.

The general school of thought across the business of entertainment is the broader the appeal of your show, the bigger the potential audience and the lower the risk of failure. 

In the film industry, they call it a ‘four-quadrant movie’: one with broad enough appeal to catch the four major demographic ‘quadrants’ of the movie-going audience: male, female, and both over- and under-25s. 

When failure could lose you your job, having as many ‘quadrants’ on your side as possible seems like a no-brainer. The only safer bet when your job is funding TV shows is having talent, in front of and behind the camera, and a good concept.

This is where Nigel Latta and his production company, Ruckus Media, have separated themselves from their competition. They have brought together talent and good concepts – ones that also happen to have broad appeal – at a prodigious rate. 

Since 2016, Ruckus has produced: two seasons of the Mind Over Money series; the lauded feature length documentary on the health journey of Stan Walker; What’s Next, Latta and John Campbell’s co-hosted series; and his latest, The Curious Mind.

The Curious Mind introduces a new co-host for Latta: BabyX, the literal brainchild of Academy Award-winning artificial intelligence engineer, Professor Mark Sagar. 

I asked Latta what an animated baby has to do with neuroscience, and how Sagar came to be involved.

“Several years ago I was at a science conference in Auckland where Mark Sagar was presenting BabyX. My first thought was ‘wow, that’s a great animation, it looks exactly like a baby!’, but it was when he kept turning to the baby on the screen to soothe BabyX that I realised it wasn’t an animation; it was a virtual human reacting to Mark and getting upset when he ignored them.

“I have always been interested in [making a] neuroscience show, but it seemed like BabyX was a really good vehicle to show people what is happening inside a brain. So afterwards I went up to Mark to pick his brain, he explained to me that he had always seen BabyX as a learning tool to help teach neuroscience. 

“With The Curious Mind, when we sat down to try and develop a show about the brain we realised there would be much we couldn’t cover. So we tried to look at the really big things which are central to everyone’s lives: how we are wired to connect with other people, how we learn and remember.”

Nigel Latta and Baby X.

BLOG: How Interactive learning could transform education with digital teachers

27 August 2018

It’s a world crisis: teacher shortage. The statistics are unsettling, raising huge concerns for the future of education on a global scale. According to the World Economic Forum: “Currently Nigeria faces the biggest shortages – the West African nation needs an additional 380,000 teachers. India also faces shortages in excess of 350,000, while Indonesia needs nearly 190,000 more teachers.”

The challenges faced by young people living in remote communities in gaining access to education are troubling. One in three children in Africa still don’t even make it into the classroom, lacking even the most basic literacy and numeracy skills. Combine this with the worsening global teacher shortage and the future looks bleak for a huge percentage of the youth of today.

But there is some light on the horizon and it comes from the glow of a computer screen!

Interactive learning in the form of a digital teacher is the catalyst in transforming the floundering education sector by tapping into the Gen Y love of digital media. A digital teacher available anytime on your computer, your tablet or even mobile who looks like a human and behaves like a human. A digital teacher who is pushing the boundaries of education by interacting with students through engaging face-to-face learning. A digital teacher who can be programmed to deliver content in a way that draws in students who previously struggled with more academically focused teaching techniques. And the best part of all is, this is not limited to simply the richer nations of the world.

Digital learning has the potential to bridge the abyss by reaching out to those previously excluded from schooling. Now, more and more classrooms have access to computers as market competition brings prices down, and with the cost of internet access cheaper than ever before, this has opened up opportunities for technology to reach underprivileged classrooms. Even the challenges of providing tech to the most remote communities on the planet are being overcome with the rise of robust routers (such as the SupaBRCK), which is not only waterproof but offers “solar-powered Wi-Fi that operates as a 3G hotspot and off-grid server.” [TechCrunch]

“…there’s no denying that the use of devices and systems that promote engagement and collaboration bring tremendous value to the learning environment.”  [Education Technology]

It may seem like a futuristic dream, but digital teaching is not sequestered in the future: it is currently being trialled in the form of Will, a digital educator created by Soul Machines™ for Vector, an energy company in New Zealand. Will interacts with young people in a lively and friendly manner as he shares interesting information and fascinating facts around how renewable energy will shape our future world. At the end of each future energy topic, as selected by the user, Will asks a pertinent question which the young students answer by picking from a multiple-choice selection. The engagement factor of this form of learning is high, with interactions that stimulate problem-solving and critical thinking which can awaken young and curious minds.

So, if digital educators can help with the world’s teacher shortage, if they can enhance learning through dynamic interactions and if they can reach remote communities, then they can effect change on a global scale. Every child has the right to an education, and if a digital teacher can help open doors to a future that was previously shut to so many, then a digital teacher could play a vital part in transforming education as we know it.



 

Meet Will – Vector’s new renewable energy educator in schools

Media Release  | 22 August 2018

In a first for education, Vector is exploring the use of “digital human” technology in its energy education programmes in primary schools.

In conjunction with New Zealand’s leading AI company Soul Machines™, Vector has created Will, a “digital teacher” being trialed in its award-winning ‘Be Sustainable with Energy’ schools programme, which is offered free of charge to schools within Vector’s Auckland electricity network. The schools programme was launched in 2005 and has since educated more than 125,000 children about energy.

Will can interact with children from a desktop, tablet, or mobile, and helps them to learn about renewable energy such as geothermal, solar, and wind. Will challenges kids on their renewable energy knowledge by asking questions such as “how tall is a wind turbine?”; “how long does sunlight take to reach the earth?”; and “when we burn biomass, what do you think is let off as the main by-product?”

Vector’s Chief Digital Officer, Nikhil Ravishankar, says it’s critical the company uses new and emerging technologies which will allow Vector to have better conversations with its customers – including its future generation.

“Our work with Soul Machines is a very effective use case as an education tool for kids around renewable energy and creating a new energy future.”

“What was fascinating to me was the reaction of the children to Will. The way they look at the world is so creative and different, and Will really captured their attention.”

“Using a digital human is a very compelling method to deliver new information to people, and I have a lot of hope in this technology as a means to deliver cost-effective, rich, educational experiences into the future.”

Will uses Soul Machines™’ world-leading Artificial Nervous System – an autonomous animation platform that is modelled on the way the human brain and nervous system work – to bring his digital human face and persona to life in a very human-like way.

Soul Machines™ Chief Business Officer, Greg Cross, says education is going to be one of the breakthrough applications for Soul Machines™’ technology, as digital teachers have the potential to democratize the delivery of education to students everywhere (particularly those in remote communities) and help address the growing teacher shortages on a global scale.

“Creating one of the world’s first digital teachers has been one of the company’s most exciting assignments. The opportunity to see digital interactions with children in the classroom has been a fantastic part of this project with the next generation of Vector’s users.

Working with such an innovative company with a vision to create a new energy future, we’ve been able to not only demonstrate the power of digital humans in education, but also show how our technology can play an important role in helping companies reinvent themselves.”


About Vector

Vector is New Zealand’s largest distributor of electricity and gas, owning and operating networks which span the Auckland region. We’re leading the transformation of the energy sector to create a new energy future with sustainable energy technology, which includes solar power, energy storage, EV charging stations, and smart meters, and we’re constantly identifying and developing options that will provide value, choice, and service for our customers.

About Soul Machines

Soul Machines™ is a ground-breaking high-tech company of AI researchers, neuroscientists, psychologists, artists and innovative thinkers; re-imagining how we connect with machines. It brings technology to life by creating lifelike, emotionally responsive artificial humans with personality and character that allow machines to talk to humans literally face-to-face. The company’s vision is to humanize artificial intelligence to better humanity. Soul Machines™ is now deploying the world’s first digital humans with some of the biggest corporate brands in the world in Banking and Finance, Software and Technology, Automotive, Healthcare, Energy and Education industries.


Don’t miss your chance to chat to an artificial human

Are you keen to help Soul Machines out on a study investigating how people respond to an artificial human in Auckland, New Zealand?

Then we would love your participation in our research.

You will be offered a $10 Westfield voucher for a 30-minute session.

What the session is

The session will include interacting with an artificial human who will talk to you, show you some images and ask what you think and feel during your conversation.

Where?

University of Auckland, Newmarket campus (Free visitor parking)

Participants must be over the age of 18 and fluent in English.

Find out more

For further information or to sign up straight away:

This study is funded by Soul Machines Ltd. Approved by the University of Auckland Human Participants Ethics Committee on 14 June 2018 for three years. Reference Number 021447

CASE-STUDY: Bringing a human face to customer-facing AI with IBM Watson

Case-Study by IBM  |  July 2018  | Full article here

Business challenge

AI-powered chatbots are already transforming customer service, but customers still value the emotional connection of interacting with human agents. Soul Machines™ saw an opportunity to give AI a human face.

Transformation

Soul Machines™ is working with IBM Watson technologies to create “artificial humans”—realistic computer-generated characters that can react intelligently, empathetically and efficiently to customer needs.

Results

Over 40%  of customer inquiries can be answered with zero human intervention

Continuous learning means that results will improve over time

8-12 weeks to create and roll out a customized artificial human for new clients

Business challenge story

Building empathetic AI

As artificial intelligence (AI) continues its move into the mainstream, many businesses are looking for areas of their business where techniques such as machine learning and deep learning can provide a competitive advantage.

One of the most promising use cases is customer service automation; over the past few years, hundreds of companies have built chatbots that can help them reduce the pressure on their customer service teams and communicate with customers at scale.

However, many people are wary of AI, and it can often be difficult to get users comfortable with the idea of connecting with new technologies—especially if they feel like they are being pushed to interact with an impersonal application interface, when they would prefer to speak directly to a human being.

But what if your interface had a human face? Soul Machines™, a tech company based in New Zealand, specializes in creating “artificial humans”—lifelike computer-generated characters with natural voices and realistic facial expressions, which can communicate and interact with users just like real people.

The company arose from a research team at the University of Auckland, led by Dr. Mark Sagar, the Oscar-winning animation specialist behind computer-generated faces in Avatar and King Kong. Today, it works with some of the world’s largest banks and software companies to create artificial humans that not only possess deep domain knowledge, but also embody the values and personalities of their brands.

The realism of these artificial humans relies on a virtual nervous system, a core technology modeled on the human brain and central nervous system. Greg Cross, Chief Business Officer for Soul Machines™, explains: “What’s unique about our artificial humans is that they are created using brain models that are similar to human brain chemistry.

“For example, imagine you are interacting with one of our artificial humans online, and you smile into your webcam. Our system uses visual recognition technology to recognize that this is an image of someone smiling. The virtual nervous system then interprets this as a positive situation, and creates the virtual equivalent of dopamine and serotonin. This in turn triggers the artificial human to “feel happy” and smile back at you.”

Soul Machines™ supplies the interactive, AI-powered face and voice that registers emotional cues from the customer and modulates voice and expression in response. This helps customers feel much more comfortable interacting with an AI-powered customer service system.
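Cross’s walkthrough describes a closed perception-to-expression loop: the system sees you smile, its internal state shifts, and the character smiles back. The sketch below approximates that loop with an off-the-shelf OpenCV smile detector feeding a simple affect variable. The Haar cascade files are standard OpenCV assets, but the thresholds, the `happiness` variable, and the expression names are invented for illustration only; they say nothing about how Soul Machines™’ virtual nervous system actually works.

```python
import cv2  # pip install opencv-python

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
smile_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_smile.xml")

def choose_expression(smile_seen: bool, happiness: float):
    """Nudge a crude happiness level and pick an expression for the avatar."""
    happiness = min(1.0, happiness + 0.2) if smile_seen else max(0.0, happiness - 0.05)
    return ("broad_smile" if happiness > 0.6 else "attentive_neutral"), happiness

happiness = 0.3
cap = cv2.VideoCapture(0)            # default webcam; the loop simply exits if absent
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    smile_seen = False
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        roi = gray[y:y + h, x:x + w]                     # look for a smile inside the face
        if len(smile_cascade.detectMultiScale(roi, 1.7, 22)) > 0:
            smile_seen = True
    expression, happiness = choose_expression(smile_seen, happiness)
    print(expression, round(happiness, 2))               # stand-in for driving the render
    if cv2.waitKey(30) & 0xFF == 27:                     # Esc to stop
        break
cap.release()
```

Even at this toy scale, the design choice the article describes is visible: the response is driven by an evolving internal state rather than a direct if-smile-then-smile rule, which is what makes the reaction feel continuous rather than mechanical.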

However, while Soul Machines™’ technology makes these interactions feel natural, the other half of the problem is to ensure that the artificial humans can answer customer questions correctly and accurately, with a deep understanding of the company’s business domain. That’s where IBM Watson™ Assistant comes in.


“The ease of integration with Watson Assistant means that the time to market for our clients is very short. We can get an artificial human up and running in just eight to 12 weeks.”

— Greg Cross, Chief Business Officer, Soul Machines

Transformation story

Getting to the right answer

Watson Assistant is an enterprise-level AI assistant—customizable to any business—that delivers proactive, personalized services. It interprets natural language (in multiple languages) and can be continuously trained on domain-specific data to deliver appropriate responses to clients’ queries.

As the number of channels which customers use to communicate with companies continues to increase, Watson Assistant offers a more scalable alternative to human customer service agents. Companies can deploy Watson to handle thousands of everyday queries automatically, leaving human agents to focus their expertise on addressing more specific and complex issues.

“Many of our clients were already working with IBM, using Watson-based chatbots to answer customer questions through text-based interfaces,” Greg Cross says. “We just needed to integrate our artificial humans with Watson Assistant to provide a new user experience layer that would deliver Watson’s answers as if you were talking to someone live and in-person.”

The integration between the Soul Machines™ platform and IBM Watson Assistant is relatively straightforward. Watson Assistant runs in the IBM Cloud™, so there is no infrastructure to manage, and it provides a simple API that makes it easy to connect with other applications.

As a customer speaks to an artificial human, Soul Machines™ sends the audio stream of the customer’s voice to the Watson Assistant API. Watson converts the audio into text, then searches the company’s corpus of knowledge for relevant answers to the customer’s question, ranks the results, and returns the top-ranked answer to the Soul Machines™ solution.

Meanwhile, the Soul Machines™ platform is analyzing the audiovisual input for emotional cues from the customer’s tone of voice and their facial micro-expressions. It then converts the answer into modulated, emotionally inflected speech for the artificial human to deliver, matched with appropriately generated facial expressions.
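The case study does not publish integration code, but the flow it describes (audio in, speech-to-text, a Watson Assistant lookup, and an answer handed back for emotional delivery) can be sketched against the publicly documented ibm-watson Python SDK. Treat the following as a hedged illustration under stated assumptions: the API keys, service URLs, assistant ID, and audio file are placeholders, and `render_emotionally` is a made-up stand-in for Soul Machines™’ proprietary emotional-delivery layer, which this sketch does not attempt to reproduce.

```python
# pip install ibm-watson
from ibm_watson import SpeechToTextV1, AssistantV2
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

# Placeholder credentials and endpoints (region URLs vary by account).
stt = SpeechToTextV1(authenticator=IAMAuthenticator("STT_API_KEY"))
stt.set_service_url("https://api.us-south.speech-to-text.watson.cloud.ibm.com")

assistant = AssistantV2(version="2021-06-14",
                        authenticator=IAMAuthenticator("ASSISTANT_API_KEY"))
assistant.set_service_url("https://api.us-south.assistant.watson.cloud.ibm.com")

# 1. Convert the customer's spoken question to text.
with open("customer_question.wav", "rb") as audio:       # hypothetical recording
    stt_result = stt.recognize(audio=audio, content_type="audio/wav").get_result()
question = stt_result["results"][0]["alternatives"][0]["transcript"]

# 2. Ask Watson Assistant for the top-ranked answer from the client's corpus.
session = assistant.create_session(assistant_id="ASSISTANT_ID").get_result()
reply = assistant.message(
    assistant_id="ASSISTANT_ID",
    session_id=session["session_id"],
    input={"message_type": "text", "text": question},
).get_result()
answer = reply["output"]["generic"][0]["text"]

# 3. Hand the answer to a (hypothetical) delivery layer, which in the real
#    platform would modulate voice and facial expression to match the
#    customer's detected emotional cues.
def render_emotionally(text: str, detected_emotion: str) -> None:
    print(f"[{detected_emotion}] {text}")

render_emotionally(answer, detected_emotion="calm")
```

In a production deployment the audio would be streamed rather than read from a file, but the division of labour matches the article: Watson supplies the answer, and the Soul Machines™ layer decides how that answer looks and sounds.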

“The ease of integration with Watson Assistant means that the time to market for our clients is very short,” says Greg Cross. “If a company has a Watson chatbot set up already, we can get an artificial human up and running in just eight to 12 weeks.”


“One of our banking clients has found that it can already satisfy 40 percent of queries without any kind of human intervention, and we expect that number to climb even higher as the solution continues to learn.”

— Greg Cross, Chief Business Officer, Soul Machines

Results story

Winning customers’ trust

“We’ve had great experiences working with the IBM team, both technically and commercially,” comments Greg Cross. “As a technology company, partnering with IBM gives us an increased reach because of the scale of their global consulting services team. IBM consultants can help clients implement our solutions, and the deep relationships that IBM has with its customers give us a strong foundation to build upon.”

As more businesses make the move from simple chatbots to Soul Machines™’ more sophisticated customer service solutions, the number of API calls to Watson Assistant is increasing substantially.

Greg Cross remarks: “So far, our clients’ customers are very happy using the technology. They love the immediacy of the customer experience interacting with our artificial humans. As the Watson solution keeps learning every day, it can answer more and more customer questions, and the answers get better over time.

“One of our joint banking clients has found that it can already satisfy 40 percent of queries without any kind of human intervention, and we expect that number to climb even higher as the solution continues to learn.”

As the world moves into the AI era, the pressing question is: how can businesses reinvent customer experience to retain and add value to customer relationships? As more and more companies take up AI solutions, the race will be on to add value in a way that keeps customers away from competitors.

Greg Cross concludes: “AI solutions keep learning and developing from the day they are implemented, so anyone late to the game will find it very difficult to catch up with the continued advances made by early adopters. Together with IBM we are proud to be helping businesses take an early lead in the AI race—potentially giving them an insuperable advantage in the years to come.”


Solution components

  • IBM Watson Assistant

Take the next step

To learn more about IBM Watson Assistant, please contact your IBM representative or IBM Business Partner, or visit the following website: ibm.com/watson/ai-assistant

To learn more about the IBM Watson platform, please contact your IBM representative or IBM Business Partner.

HOT OFF THE PRESS: Introducing Jamie – ANZ’s new digital assistant

Media Release | Tuesday 10th July 2018

Introducing Jamie

Meet Jamie, ANZ’s new digital assistant, who starts work tomorrow helping customers with some of their general banking queries.

Jamie has a human face, voice and expressions and can have a two-way verbal conversation with people via a computer screen, tablet or mobile phone – thanks to advances in neuroscience, psychology, computing power and artificial intelligence.

She has been programmed to answer questions on 30 of the most frequently searched-for topics on the ANZ Help section of anz.co.nz.

ANZ will be trialling Jamie at https://help.anz.co.nz where she will answer questions such as: “What do I do if I’ve lost my card?” and “How long does it take for a payment to process?” 

All of the questions she can answer are general in nature and do not require any specific customer information.

“Through the trial, we want to see if Jamie will appeal to those who might not be as comfortable using our other digital channels,” said Liz Maguire, Head of Digital & Transformation at ANZ Bank.

“While we know many of our customers love connecting through our existing digital channels, we have been talking face to face a lot longer than we’ve been using small screens.”

ANZ partnered with New Zealand tech company Soul Machines™ to develop Jamie. Soul Machines™ co-founder and CEO Mark Sagar has won awards for his ground-breaking facial technology in King Kong and Avatar.

Jamie has a digital human face and persona, and is ‘brought to life’ using Soul Machines™’ world-leading Human Computing Engine™ (HCE). The Soul Machines™ HCE is a Virtual Nervous System™ that combines neural networks and biologically inspired models of the human brain, allowing her to express personality and character in an incredibly human-like way.

Initial feedback from staff and customers has been positive. Around 90 percent of customers who have spoken to Jamie think it is a good idea for ANZ to introduce the technology.

“One of the things that is really exciting about this project is that we are starting to understand some of the benefits we can deliver for ANZ’s customers. The fact they can talk to somebody immediately,” Greg Cross, Chief Business Officer at Soul Machines™ said. “It’s a personal interaction – it is a face-to-face interaction.”

“How we move forward will be guided by what our customers and staff tell us they want,” Ms. Maguire said. “We’re excited to show Jamie to more of our customers and get their feedback.”

Soul Machines™ is making a name for itself globally for humanizing artificial intelligence. The company has just been named by the World Economic Forum as one of the world’s 61 most innovative start-ups developing “world changing” technology.


About Jamie

  • Jamie is ANZ’s digital banking assistant, developed by New Zealand tech company Soul Machines.
  • Customers can start talking with Jamie at https://help.anz.co.nz.
  • She has been optimized for desktop, tablet and mobile. 
  • All of the questions Jamie can answer are general in nature and do not require any specific customer information.
  • Jamie’s favourite colour is ANZ blue, and her favourite TV show is NZ’s Country Calendar.

BLOG: This man wants your next employee to be an AI-powered digital human

June 28, 2018 | Written by: Jordan Teicher  | IBM Industries

Categorized: Artificial Intelligence | Media and Entertainment | Retail and Consumer Products

This story is part of Big Thinkers, a series of profiles on business leaders transforming industries with bold ideas.

“Humans can communicate in lots of ways,” said Greg Cross. “But when we actually want to have important conversations we always do those face to face.”

Cross, the CBO of Soul Machines™, practices what he preaches. Though he lives in New Zealand, he took time out of a brief business trip in New York to meet me in person at IBM’s office near Union Square. We gathered to talk about his company, whose mission is to make face-to-face conversations like ours part of the most common interactions we have today—namely, the interactions we have with intelligent machines.

“We’re heading into a world where we’re going to spend a lot more of our time interacting with machines. We have a fundamental belief that these machines can be more helpful to us if they’re more like us,” he said.

To do that, Soul Machines™’ team of AI researchers, neuroscientists, psychologists and artists are creating “digital humans”—fully autonomous, animated individuals that look and sound like real people. The key to their intelligence is a cloud-based virtual central nervous system called the Human Computing Engine, which sits atop IBM Watson and uses Watson Assistant.

When connected to that system, Soul Machines™’ digital humans are amazingly life-like. They hear and see the people with whom they interact, and their conversations with those people are made emotive through nuanced facial expressions. For businesses, Cross said, digital humans can revolutionize the economics of customer service, giving them the ability to provide personalized and consistent care at scale.

A face, Cross said, is a “reflection of the heart and mind of an individual,” and it can be key to successful digital interactions with customers. In the years to come, he bets, businesses across industries will agree and make digital humans an integral part of their workforce.

“The question we wanted to explore was: What happens when you create a digital face? Will people engage with it? Will they find that digital face more engaging than a chatbot or a voice assistant? Our view is that, yes, of course they will. That’s ultimately the market and business development we’ve been going on,” Cross said.

Rachel, a digital human created by Soul Machines™.

“It completely captured my imagination”

Cross has been a technology entrepreneur nearly his entire career. At 18, he dropped out of business school at the University of Waikato and began an internship at the high-tech manufacturer Trigon Packaging. Since then, he’s worked at technology startups in different industries all over the world. In 2007, he co-founded PowerbyProxi, a spin-out of the University of Auckland’s wireless power research, which developed high-efficiency, high-density wireless power products. The company was sold to Apple last year.

“For me, there’s nothing more fun than taking on some sort of core technology or core idea, wrapping a team of people behind it, and exploring how you build a company around it. That’s still what gets me out of bed with a smile on my face,” he said.

Two years ago, Cross found his most recent opportunity to do just that when he met Dr. Mark Sagar, an Academy Award-winning animator who was then the director of the Laboratory for Animate Technologies at the University of Auckland. Cross had, in the past, seen Sagar present his work—a virtual animated baby called BabyX that learns and reacts like a real human infant. But when Sagar sat down with him one-on-one to show him the technology underlying his creation, Cross knew he had to get involved.

“It completely captured my imagination,” Cross said.

First steps

When Cross and Sagar first started thinking about how to turn the technology into a business, they drew up a list of half a dozen industries they knew were facing “quite significant disruption,” and began imagining how digital humans could help. They then started talking about digital humans at technology and industry conferences. Soon, business leaders eager to drive change in their industries wanted to talk with them.

“It’s like any new technology; it’s well understood that there’s an adoption curve. There are the early adopters and then there are those who never want to be first. We’re always very careful about making sure we’re speaking to the right people,” he said.

So far, it seems, Cross has found those people. This year, Soul Machines debuted its first crop of digital employees at Autodesk, Daimler Financial Services and NatWest. It’s still early days, Cross said, but the employees—Cora, Sarah, and Ava—are paving the way for a future in which digital humans will be an integral part of the way people interact with businesses.

“I like to think in five years we’ll create a very large population of digital humans who will be interacting with people and having hundreds of millions of conversations every day,” Cross said.

Imagining the future

Where might digital humans pop up next? Cross couldn’t talk about some of Soul Machines’ upcoming projects. But the appetite for next-generation customer service solutions, he said, is strong across a number of industries, including retail and telecommunications, and digital humans could find a productive place in all of them.

In a fast-paced, digitally-driven landscape, customers have little patience for endless call center queues and customer service departments with limited hours. Increasingly, they expect quick, seamless interactions at any time of the day or night with representatives that understand and remember their preferences and history.

“As real human beings, our memory has limits. If you’re dealing with 100 people a day, you’re not going to remember every single interaction. A digital person will,” Cross said.

At the moment, Soul Machines’ digital humans are making their mark in customer service. But Cross is already investigating a wide range of future applications for his company’s technology. He imagines digital humans one day teaching classes or providing medical care. Celebrities, he said, could enlist their own digital twin to perform tasks they can’t fit into their schedule. The possibilities, Cross said, are endless—and he’s exploring as many of them as possible.

“One day I can be sitting in a board room doing a presentation for a CEO of one of the largest banks or the largest tech companies in the world. Another day I can be sitting down with the biggest celebrities in the world,” he said. “It’s a huge amount of fun.”

NEWS: We are one of the 2018 Tech Pioneers!

We are excited to announce that Soul Machines™ has been selected as a 2018 Tech Pioneer by the World Economic Forum

“The Technology Pioneers cohort of 2018 brings together 61 early-stage companies from around the world that are pioneering new technologies and innovations ranging from the use of artificial intelligence in drug discovery, the development of autonomous vehicles, advancing cybersecurity and reducing food waste, to applying blockchain to a decentralized engagement platform.

In joining this community and the two-year journey where they become part of the Forum’s initiatives, activities and events, they bring cutting-edge insights and novel perspectives to world-critical discussions.”

— World Economic Forum

BLOG: High-tech AI helps make digital humans for social good

June 19, 2018 | Written by: Greg Cross

Published by IBM.  Categorized: AI/Watson | Computer Services

Artificial intelligence has received an extraordinary amount of hype from the media. Hollywood in particular has pushed a dystopian view—it’s science fiction that has an awful lot of fiction in the science.

We are heading into an era where we’ll be increasingly interfacing with artificial intelligence—as robots, machines, drones, self-driving cars, chatbots, etc. And because of the media’s AI hype, many people fear this future.

At Soul Machines™, we think that people will be more comfortable interacting with AI if these systems are actually more like us—if we can interact with these systems in natural ways, and over time, learn to trust and comfortably engage with them.

That’s why we’ve focused on creating digital humans.

Bringing AI systems to life with virtual neural networks

Soul Machines™ is a ground-breaking high-tech company of AI researchers, neuroscientists, psychologists, artists and innovative thinkers. Our mission is to reimagine how people connect with machines by creating incredibly life-like, emotionally responsive artificial humans with personality and character.

My business partner, Dr. Mark Sagar, has spent his life studying the human face. He won two Academy Awards for the incredible work he did on movies like Avatar, with James Cameron, and King Kong with Peter Jackson.

Building on his expertise, we’ve added AI to create a virtual nervous system that is made up of neural networks and machine learning—in effect, we’ve developed biologically-inspired models of different parts of the human brain.

To date, we’ve used this core technology to create AI systems—like Baby X and Rachel—that serve as a user interface to an AI platform. Our core technology, or Human Computing Engine, sits on top of IBM Watson and uses Watson Assistant (formerly Watson Conversation) as a key part of our dialog interface.

Addressing critical social needs with digital humans

The AI systems we create can be used on flat screens—like your smartphone or laptop—but can also be directly inserted into augmented reality (AR) or virtual reality (VR) environments. Since we build these systems in 3D, they can be projected as holograms without any further work.

The applications for these humanized AI systems are endless. For example, we’ve got a worldwide shortage of science teachers in our high schools. We can create digital tutors that can help kids get specialized knowledge about physics and chemistry as they do their homework. Doctors congregate in bigger cities where there are more people who can keep their practices functioning. As a result, rural and remote areas can find it difficult to provide adequate healthcare services. With our technology, we can create virtual doctors to ensure that these communities get the healthcare they need.

And that’s just the beginning. We’re about to start a project that will bring somebody who’s been dead for a couple hundred years back to life, so that they can talk about their work and their inspiration. With humanized AI systems, we can tell their life stories in a way that hasn’t been possible before.

That’s the wonderful thing about what I get to do for a living… it’s an incredibly creative world, limited only by our imagination.

 

VIDEO: Bloomberg Technology – Meet Ava, Autodesk’s New Virtual Assistant

Rachael Rekart – Director of Machine Assistance at Autodesk – discusses AVA, Autodesk’s virtual assistant, with Bloomberg’s Emily Chang.

(Source: Bloomberg)

“As we evolved AVA, we partnered with Soul Machines™, a New Zealand tech company that is essentially studying human consciousness through AI, and they have brought Ava to life, so to speak, through the development of a digital avatar. This digital avatar can actually voice and video chat with our customers and recognise and respond to emotional cues.”

— Rachael Rekart, Director of Machine Assistance at Autodesk