PRESS: Robots will probably help care for you when you’re old

 

Article by Corinne Purtill | Excerpt from QUARTZ | 12 SEPTEMBER 2018

Among the symptoms of dementia is a phenomenon called “sundowner’s syndrome”: an increase in agitation, confusion, and anxiety as late afternoon transitions to evening. Its cause isn’t well understood; circadian rhythm disruptions precipitated by the change in light, anxiety over end-of-day activity, and hormonal fluctuations have all been floated as theories. Whatever the trigger, sundowning can make otherwise amiable people combative and even violent, a frightening and unsettling experience for patients and caregivers alike.

Staff in hospitals and nursing homes typically treat the symptoms with sedative drugs. But in recent years, facilities from Japan to the US have turned instead to a specialist: a robot baby seal named Paro.

Paro spent a decade in development at Japan’s National Institute of Advanced Industrial Science and Technology. The robot seal came to market in 2004 and is now in use in many parts of Asia, Europe, and North America to offer the psychological benefits of pet therapy in situations where a real animal isn’t practical…


To a person in normal cognitive health, Paro is unmistakably a machine. A soft mechanical sound accompanies its motions; up close, you can see its whiskers have tiny sensors on the ends. Given the comfort it brings to people suffering a dreadful disease, insisting that patients recognize its artificiality seems cold and beside the point.

But you don’t have to peer very far into the future to see the possibility of interactions in which it will be difficult even for a person with their full cognitive faculties to tell the difference between robots and reality.

The Auckland, New Zealand-based tech company Soul Machines™ creates AI interfaces that look uncannily like high-definition video chats with a real human being. It doesn’t quite pass the Turing Test, but it’s easy to imagine a situation in which someone with limited eyesight or cognitive disabilities believes they’re having a human conversation when talking to a robot like “Ava.”

 

 

Or “Sarah.”

 

 

Or this baby.

 

 

Soul Machines™ licenses its user interface technology to businesses and institutions. Its technology has powered digital assistants for banks, airlines, and software companies, as well as a prototype virtual assistant, voiced by the actor Cate Blanchett, that was designed to help people with disabilities navigate Australia’s public benefits system. (That program was shelved not long after the Australian government’s disastrous introduction of an automated system to detect welfare fraud drew public outcry.) Soul Machines™ has discussed services for the elderly with prospective clients but has not announced any partnerships on that subject to date, says chief business officer Greg Cross.

Soul Machines™ envisions a future in which digital instructors educate students without access to quality human teachers, and in which famous deceased artists are digitally resurrected to discuss their works in museums. Robot companions for the infirm, then, are not too far a leap. Nor is the prospect of a future in which a family converses with the lively AI recreation of a person suffering from dementia, while a caregiver—robot or human—tends to their ailing body in another room.

 

HOT OFF THE PRESS: What We Have to Gain From Making Machines More Human

Article by Marc Prosser | SINGULARITY HUB | 11 SEPTEMBER 2018

The borders between the real world and the digital world keep crumbling, and the latter’s importance in both our personal and professional lives keeps growing. Some describe the melding of virtual and real worlds as part of the fourth industrial revolution. The full impact of this revolution on us as individuals, our companies, communities, and societies is still unknown.

Greg Cross, chief business officer of New Zealand-based AI company Soul Machines™, thinks one inescapable consequence of these crumbling borders is people spending more and more time interacting with technology. In a presentation at Singularity University’s Global Summit in San Francisco last month, Cross unveiled Soul Machines™’ latest work and shared his views on the current state of human-like AI and where the technology may go in the near future.

Humanizing Technology Interaction

Cross started by introducing Rachel, one of Soul Machines™’ “emotionally responsive digital humans.” The company has built 15 different digital humans of various sexes, age groups, and ethnicities. Rachel, along with her “sisters” and “brothers,” has a virtual nervous system based on neural networks and biological models of different pathways in the human brain. The system is controlled by virtual neurotransmitters and hormones akin to dopamine, serotonin, and oxytocin, which influence learning and behavior.

As a result, each digital human can have its own unique set of “feelings” and responses to interactions. People interact with them via visual and audio sensors, and the machines respond in real time.
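
To make that idea concrete, the short sketch below (in Python) shows how a handful of neuromodulator-like variables could bias an agent’s responses. Everything in it, the class names, the smile and raised-voice inputs, and the thresholds, is a hypothetical illustration of the general concept, not Soul Machines’ actual architecture or code.

# Purely illustrative sketch: neuromodulator-like scalars biasing an agent's behaviour.
# Names, inputs, and thresholds are hypothetical; this is not Soul Machines' system.

from dataclasses import dataclass, field


@dataclass
class Neuromodulators:
    # Scalar "hormone" levels in [0, 1] that bias responses.
    dopamine: float = 0.5   # reward / engagement
    serotonin: float = 0.5  # calmness / patience
    oxytocin: float = 0.5   # warmth / social bonding


@dataclass
class ToyDigitalHuman:
    name: str
    mood: Neuromodulators = field(default_factory=Neuromodulators)

    def perceive(self, user_smiling: bool, user_voice_raised: bool) -> None:
        # Update internal state from hypothetical camera/microphone features.
        if user_smiling:
            self.mood.oxytocin = min(1.0, self.mood.oxytocin + 0.10)
            self.mood.dopamine = min(1.0, self.mood.dopamine + 0.05)
        if user_voice_raised:
            self.mood.serotonin = max(0.0, self.mood.serotonin - 0.10)

    def respond(self, utterance: str) -> str:
        # Pick a response style based on the current neuromodulator levels.
        if self.mood.serotonin < 0.3:
            tone = "calming"
        elif self.mood.oxytocin > 0.7:
            tone = "warm"
        else:
            tone = "neutral"
        return f"[{self.name}, {tone}] I heard: {utterance!r}"


if __name__ == "__main__":
    rachel = ToyDigitalHuman("Rachel")
    rachel.perceive(user_smiling=True, user_voice_raised=False)
    print(rachel.respond("Tell me about renewable energy."))

A real system would replace these hand-set thresholds with learned models of perception and dialogue; the point is only that a few slowly varying internal variables can give otherwise identical agents distinct “feelings” and responses.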

“Over the last 20 or 30 years, the way we think about machines and the way we interact with machines has changed,” Cross said. “We’ve always had this view that they should actually be more human-like.”

The realism of the digital humans’ graphic representations comes thanks to the work of Soul Machines™’ other co-founder, Dr. Mark Sagar, who has won two Academy Awards for his work on computer-generated characters in movies, including James Cameron’s Avatar.

Cross pointed out, for example, that rather than being unrealistically flawless and clear, Rachel’s skin has blemishes and sun spots, just like real human skin would.

The Next Human-Machine Frontier

When people interact with each other face to face, emotional and intellectual engagement both heavily influence the interaction. What would it look like for machines to bring those same emotional and intellectual capacities to our interactions with them, and how would this type of interaction affect the way we use, relate to, and feel about AI?

Cross and his colleagues believe that humanizing artificial intelligence will make the technology more useful to humanity, and prompt people to use AI in more beneficial ways.

“What we think is a very important view as we move forward is that these machines can be more helpful to us. They can be more useful to us. They can be more interesting to us if they’re actually more like us,” Cross said.

It is an approach that seems to resonate with companies and organizations. In the UK, for example, NatWest Bank is testing out Cora as a digital employee to help answer customer queries. In Germany, Daimler Financial Group plans to employ Sarah as something “similar to a personal concierge” for its customers. According to Cross, Daimler is looking at other ways it could deploy digital humans across the organization, from digital service people and digital salespeople to, perhaps in the future, digital chauffeurs.

Soul Machines™’ latest creation is Will, a digital teacher that can interact with children through a desktop, tablet, or mobile device and help them learn about renewable energy. Cross sees other social uses for digital humans, including potentially serving as doctors to rural communities.

Our Digital Friends—and Twins

Soul Machines™ is not alone in its quest to humanize technology. It is a direction many technology companies, including the likes of Amazon, also seem to be pursuing. Amazon is working on building a home robot that, according to Bloomberg, “could be a sort of mobile Alexa.”

Finding a more human form for technology seems like a particularly pervasive pursuit in Japan, not just when it comes to its many, many robots, but also with virtual assistants like Gatebox.

The Japanese approach was perhaps best summed up by famous android researcher Dr. Hiroshi Ishiguro, who I interviewed last year: “The human brain is set up to recognize and interact with humans. So, it makes sense to focus on developing the body for the AI mind, as well as the AI. I believe that the final goal for both Japanese and other companies and scientists is to create human-like interaction.”

During Cross’s presentation, Rob Nail, CEO and associate founder of Singularity University, joined him on the stage and extended an invitation to Rachel to become SU’s first fully digital faculty member. Rachel accepted, and though she’s the only digital faculty member right now, she predicted this won’t be the case for long.

“In 10 years, all of you will have digital versions of yourself, just like me, to take on specific tasks and make your life a whole lot easier,” she said. “This is great news for me. I’ll have millions of digital friends.”

AI – are humans obsolete?

 

Article by Andrew Cornell | Excerpt from ANZ bluenotes | 13 SEPTEMBER 2018

Dystopia or Utopia? Will robots be our – willing – slaves or overlords? Actually, what is ‘artificial intelligence’? We may not realise how pervasive it is already.

We spoke to an AI scientist, a digital human creator, a venture capitalist, an ex-Google banker and an economist about their experience with and expectations for AI in the future.



Greg Cross – Chief Business Officer & Co-Founder, Soul Machines

At Soul Machines™, we’ve gone beyond what Hollywood is doing with the quality of the digital characters they are producing. What’s really interesting is the way we bring them to life by taking away the actors and cameras which have traditionally been used to bring computer-generated characters to life.

We’ve automated these characters by giving them a brain and creating a system which enables them to respond, interact and engage with us in exactly the same way we engage with each other.

But at what point does this digital character become engaging to the humans that they’re interacting with? At what point can we relate to them? At what point can we learn to trust them?

In very simple terms, by putting a face on artificial intelligence we’re trying to create a platform where people can develop trust in machines.

Building trust between humans and machines is going to be a really critical part of the way we use our systems in the next 10, 20 or 30 years.