Could a Robot Be a Decent Therapist?

[Image: Ellie, a therapist bot, presented at a Talkspace conference]

We’ve seen advanced artificial intelligence capable of holding a conversation, such as Siri, IBM’s Watson and Microsoft’s Cortana. But could any robot — even one designed solely for psychotherapy and more advanced than the aforementioned AIs — be a decent therapist?

People have used bots for therapy, but it’s not clear whether they could surpass human therapists in our lifetime. Looking at the past, present and possible future of robots as therapists will give you an idea of how effective they could be.

An Early Example of a Robot Trying to be a Therapist

ELIZA is one of the earliest and most well-known programs that attempted to act as a therapist. MIT Professor and Computer Scientist Joseph Weizenbaum designed ELIZA to provide Rogerian psychotherapy.

Because Rogerian psychotherapy primarily encourages clients to talk more rather than engaging in a discussion, ELIZA’s limited conversational capabilities were enough to convince some people it was human and help them feel a little better about their problems. Nonetheless, ELIZA could not provide anything comparable to therapy with a human therapist.

To demonstrate ELIZA’s limitations, I generated the following transcript using a modern ELIZA simulator:

ELIZA: Hello, I am Eliza.
Talkspace: Hi Eliza, I have been feeling lonely and depressed lately, but I’m not sure why. Can you help me figure this out and feel better?
ELIZA: Perhaps you would be like to be able to help you figure this out and feel better?

Notice how ELIZA takes my text and rephrases it, putting the focus back on me and encouraging me to talk more rather than conversing. This approach works for some clients but quickly becomes frustrating and useless for others.
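This rephrasing trick can be sketched in a few lines of Python. The patterns, templates, and pronoun "reflections" below are invented for illustration and are far simpler than Weizenbaum's original script, but they show the core technique: match a pattern, capture a fragment, swap first- and second-person words, and hand the client's own statement back as a question.

```python
import re

# Pronoun "reflections" used to turn the client's words back on them.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my",
}

# A few illustrative patterns; a real ELIZA script has many more.
PATTERNS = [
    (re.compile(r"i have been feeling (.+)", re.I),
     "Why do you think you have been feeling {0}?"),
    (re.compile(r"can you help me (.+)", re.I),
     "Perhaps you would like to be able to {0}?"),
    (re.compile(r"i am (.+)", re.I),
     "How long have you been {0}?"),
]

def reflect(fragment: str) -> str:
    """Swap first- and second-person words in the captured fragment."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def eliza_respond(text: str) -> str:
    """Return a canned rephrasing of the first pattern that matches."""
    for pattern, template in PATTERNS:
        match = pattern.search(text)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please tell me more."  # fallback when nothing matches

print(eliza_respond("I have been feeling lonely and depressed"))
# → Why do you think you have been feeling lonely and depressed?
```

Because there is no understanding behind the templates, any input that falls outside the pattern list gets only a generic prompt to keep talking — which is exactly the limitation the transcript above exposes.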

As AI and chatbots advanced, they became capable of more convincing conversations. Whether or not the programmers designed them for therapy, the improvements in technology allowed them to provide much better therapy than ELIZA. Still, modern bots are generally not intelligent enough to rival a human therapist.

Modern Bots that Can Provide Basic Therapy

By creating larger databases of responses to choose from and developing programs that could talk about themselves, programmers created robots that could more easily pass as therapists.

During the Talkspace clinical conference, The Future of Therapy, USC Professor and Virtual Humans Research Director Jonathan Gratch demonstrated how modern forms of artificial intelligence can provide clinical screening and basic therapy. Gratch introduced Ellie, an AI he used to “elicit indicators of depression” during a study.

[Image: Jonathan Gratch (right) demonstrating Ellie at the Future of Therapy conference]

Here is some of the dialogue between a participant in Gratch’s study and Ellie:

Ellie: So, how are you doing today?
Participant: I’m doing well.
Ellie: That’s good. Where are you from originally?
Participant: I’m from Los Angeles.
Ellie: Oh, I’m from LA myself.

Unlike ELIZA, Ellie can talk about herself and sustain a conversation rather than only rephrasing the responses she receives. According to Gratch, the program recognizes facial expressions and analyzes audio and posture, then uses this analysis to formulate its response and adjust its tone.
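To make the idea concrete, here is a toy sketch of how nonverbal cues might be fused into a tone choice. This is not Ellie's actual implementation — the cue names, weights, and thresholds are all invented assumptions for illustration:

```python
# Hypothetical sketch: blend simple nonverbal-cue scores into a response tone.
# The cues, weights, and thresholds below are invented; real systems like
# Ellie use far more sophisticated perception and dialogue models.
from dataclasses import dataclass

@dataclass
class NonverbalCues:
    smile_intensity: float   # 0.0-1.0, from facial-expression recognition
    vocal_energy: float      # 0.0-1.0, from audio analysis
    posture_openness: float  # 0.0-1.0, from posture tracking

def choose_tone(cues: NonverbalCues) -> str:
    """Map a weighted blend of cues to a coarse response tone."""
    mood = (0.5 * cues.smile_intensity
            + 0.3 * cues.vocal_energy
            + 0.2 * cues.posture_openness)
    if mood < 0.3:
        return "gentle"   # subdued client: soften the follow-up
    if mood < 0.7:
        return "neutral"
    return "upbeat"

# A client who barely smiles, speaks quietly, and sits closed-off:
print(choose_tone(NonverbalCues(0.1, 0.2, 0.1)))  # → gentle
```

Even this crude version shows why such systems can screen for indicators like depression: the signals being scored are behavioral, not semantic — which is also why, as Gratch notes below, the program still has no deep understanding of what the client is saying.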

Nonetheless, Gratch admitted Ellie was far from being able to provide the kind of understanding a human therapist could.

“It doesn’t understand deeply what you’re talking about,” Gratch said. “I don’t see us able with technology in the near-term to be able to fundamentally understand a person’s goals, motives and needs.”

Ellie was implemented in a safe way. There can, however, be negative consequences when a bot fails to understand emotions and filter information during sensitive, high-stakes conversations such as those between therapists and clients.

One example of this was Tay, a bot Microsoft designed to interact with people and learn from them, constantly modifying its responses and behavior based on the text it received. Tay’s creators had the horrible idea of unleashing it on Twitter, where it could interact with anyone on the Internet.

It began innocently, with Tay using fun slang and making jokes. But because Twitter is full of trolls, it ended with Tay spouting racist comments and denying the Holocaust.

[Image: Tay's Twitter page]
Microsoft shut down Tay and protected its tweets after the incident.

Therapist bots are built to be more sensitive, but there is still the risk of mishandling a volatile response from a client and exacerbating the situation rather than offering comfort.

Will Robots Surpass Human Therapists in the Near Future?

Robots are many decades away from matching the creative thinking and empathic abilities of therapists, and they may never be able to compete with them. They are a useful clinical tool, but only a flesh-and-blood therapist can create the therapeutic relationship necessary for a client to make significant progress.
