“I understand that you’re experiencing a relationship problem, is that right?”
Yes.
“Okay, I’m delighted that you want to work on this, K. I’ve got good and bad news for you… Which do you want to hear first?”
Good news.
“The good news is, I have very powerful skills that I can teach you! These skills have really helped a lot of people build better relationships.”
And the bad news?
“The bad news is that you would have to be willing to accept 100% of the responsibility for changing the relationship.”
Why?
“Because you’re only ever going to be able to have control over your part of it.”
This conversation may sound like a fairly typical therapist’s session, but in this case it was conducted by Woebot, a therapeutic chatbot, with the psychiatrist’s couch swapped for a smartphone screen.
The app presents itself as an automated therapist at a time when finding a real one can feel like a logistical and financial impossibility. Meanwhile, the need for therapists is only growing.
During the pandemic, about four in 10 adults in the United States reported symptoms of anxiety or depression, according to the Kaiser Family Foundation. At the same time, the federal government warns of a critical shortage of therapists and psychiatrists. According to the advocacy group Mental Health America, almost 60 percent of people with mental illness did not get treatment last year.
Woebot Health says the pandemic has driven up demand for its services. The number of its daily users doubled and is now in the tens of thousands, said Alison Darcy, a psychologist and the founder and president of the company.
Digital mental health has become a multibillion-dollar industry and includes more than 10,000 apps, according to an estimate by the American Psychiatric Association. The apps range from guided meditation (Headspace) and mood tracking (MoodKit) to text therapy by licensed counselors (Talkspace, BetterHelp).
But Woebot, which was introduced in 2017, is one of only a handful of apps that use artificial intelligence to deploy the principles of cognitive behavioral therapy, a common technique used to treat anxiety and depression. Woebot aims to use natural language processing and learned responses to mimic conversation, remember past sessions and deliver advice around sleep, worry and stress.
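Woebot has not published its code, so the following is only a minimal sketch of the general technique the article describes: match a user’s message against simple patterns and reply from a bank of scripted, C.B.T.-flavored responses. Every pattern, reply and function name here is a hypothetical stand-in, not Woebot’s own.

```python
import re

# A bank of scripted replies keyed to simple patterns. Real systems layer
# statistical natural language processing on top of scripts like this;
# these patterns and replies are invented for illustration.
SCRIPT = [
    (re.compile(r"\b(relationship|partner|boss)\b", re.I),
     "I understand that you're experiencing a relationship problem. "
     "Remember: you only ever have control over your part of it."),
    (re.compile(r"\b(sleep|insomnia|tired)\b", re.I),
     "Sleep trouble is hard. Want to try a wind-down exercise?"),
    (re.compile(r"\b(anxious|anxiety|worry|worried)\b", re.I),
     "That sounds stressful. What thought went through your mind "
     "just before the worry started?"),
]

# When nothing matches, fall back to a generic prompt.
FALLBACK = "That sounds difficult. Can you tell me more about that feeling?"

def respond(message: str) -> str:
    """Return the first scripted reply whose pattern matches, else the fallback."""
    for pattern, reply in SCRIPT:
        if pattern.search(message):
            return reply
    return FALLBACK

print(respond("My boss doesn't appreciate the work I do"))
```

Note the fallback branch: when no pattern fits, a scripted bot can only offer a generic prompt, a failure mode that surfaces later in this article.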
“If we can deliver some of the things that the human can deliver,” Dr. Darcy said, “then we actually can create something that’s truly scalable, that has the capability to reduce the incidence of suffering in the population.”
Almost all psychologists and academics agree with Dr. Darcy on the problem: There is not enough affordable mental health care for everyone who needs it. But they are divided on her solution: Some say bot therapy can work under the right conditions, while others consider the very concept paradoxical and ineffective.
At issue is the nature of therapy itself. Can therapy by bot make people understand themselves better? Can it change long-held patterns of behavior through a series of probing questions and reflective exercises? Or is human connection essential to that endeavor?
Hannah Zeavin is the author of the forthcoming book “The Distance Cure: A History of Teletherapy.” The health care system is so broken, she says, that “it makes sense that there’s space for disruption.”
But, she added, not all disruption is equal. She calls automated therapy a “fantasy” that is more focused on accessibility and fun than actually helping people get better over the long term.
“We are an extraordinarily confessing animal; we will confess to a bot,” she said. “But is confession the equivalent of mental health care?”
Eli Turns to Woebot
Eli Spector seemed like the perfect client for A.I. therapy.
In 2019, Mr. Spector was a 24-year-old college graduate, working in a neuroscience lab in Philadelphia. Having grown up with an academic father who specialized in artificial intelligence, he considered himself something of a technologist.
But Mr. Spector’s job was isolating and tedious, and after four stimulating years in academia, he felt bored and lonely. He couldn’t sleep well and found that his moods were consistently dark.
“I was just having a really hard time adjusting and I didn’t have any co-workers I liked,” he said. “It was just a tough period for me.”
But he wasn’t sure he wanted to bare his soul to a real person; he didn’t want to worry about anyone’s judgment or try to fit around someone else’s schedule.
Besides, he didn’t think he could find a therapist he could afford on his parents’ insurance plan; a session could run from $100 to $200. And Woebot was free and already on his phone.
“Woebot seemed like this very low-friction way to see, you know, if this could help.”
Therapy by Algorithm
Woebot’s use of cognitive behavioral therapy has a philosophical and practical logic to it. Unlike forms of psychotherapy that probe the root causes of psychological problems, often going back to childhood, C.B.T. seeks to help people identify their distorted ways of thinking and understand how that affects their behavior in negative ways. By changing these self-defeating patterns, therapists hope to improve symptoms of depression and anxiety.
Because cognitive behavioral therapy is structured and skill-oriented, many mental health experts think it can be employed, at least in part, by algorithm.
“You can deliver it pretty readily in a digital framework, help people grasp these concepts and practice the exercises that help them think in a more rational manner,” said Jesse Wright, a psychiatrist who studies digital forms of C.B.T. and is the director of the University of Louisville Depression Center. “Whereas trying to put something like psychoanalysis into a digital format would seem pretty formidable.”
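To see why a “digital framework” suits C.B.T., consider the thought record, one of the technique’s standard exercises. Because it is a fixed sequence of prompts, a program can administer it step by step. The sketch below follows generic C.B.T. worksheets; the wording is illustrative and comes from no particular app.

```python
# The standard cognitive distortions a thought record asks users to spot.
DISTORTIONS = [
    "all-or-nothing thinking",
    "catastrophizing",
    "mind reading",
    "overgeneralization",
    "discounting the positive",
]

# The exercise is just an ordered list of prompts, which is what makes it
# straightforward to automate.
PROMPTS = [
    "What situation triggered the feeling?",
    "What automatic thought went through your mind?",
    "Which thinking pattern might this be? Options: " + ", ".join(DISTORTIONS),
    "What evidence supports the thought? What evidence doesn't?",
    "What is a more balanced way to see the situation?",
]

def thought_record() -> dict:
    """Walk the user through the exercise and return their answers."""
    return {prompt: input(prompt + "\n> ") for prompt in PROMPTS}

if __name__ == "__main__":
    answers = thought_record()
    print("\nSession summary:")
    for prompt, answer in answers.items():
        print(f"- {prompt}\n  {answer}")
```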
Dr. Wright said several dozen studies had shown that computer algorithms could take someone through a standard C.B.T. process, step by step, and get results similar to in-person therapy. Those programs generally follow a set length and number of sessions and require some guidance from a human clinician.
But most smartphone apps don’t work that way, he said. People tend to use therapy apps in short, fragmented spurts, without clinician oversight. Outside of limited company-sponsored research, Dr. Wright said he knew of no rigorous studies of that model.
And some automated conversations can be clunky and frustrating when the bot fails to pick up on the user’s exact meaning. Dr. Wright said A.I. is not advanced enough to reliably duplicate a natural conversation.
“The chances of a bot being as wise, sympathetic, empathic, knowing, creative and being able to say the right thing at the right time as a human therapist is pretty slim,” he said. “There’s a limit to what they can do, a real limit.”
John Torous, director of digital psychiatry for Beth Israel Deaconess Medical Center in Boston, said therapeutic bots might be promising, but he’s worried they are being rolled out too soon, before the technology has caught up to the psychiatry.
“If you deliver C.B.T. in these bite-size parts, how much exposure to bite-size parts equals the original?” he said. “We don’t have a good way to predict who’s going to respond to them or not — or who it’s good or bad for.”
These new apps, Dr. Torous said, risk setting back other advances in digital mental health: “Do we in part end up losing trust and credibility because we’re promising what is not yet possible by any machine or any program today?”
Other mental health professionals say that therapy should simply not be delivered by machine. Effective treatment involves more than just cognitive skill-building, they say; it needs a human-to-human connection. Therapists need to hear nuances, see gestures and recognize the gap between what is said and unsaid.
“These apps really shortchange the essential ingredient that — mounds of evidence show — is what helps in therapy, which is the therapeutic relationship,” said Linda Michaels, a Chicago-based therapist who is co-chair of the Psychotherapy Action Network, a professional group.
Dr. Darcy of Woebot says a well-designed bot can form an empathetic, therapeutic bond with its users, and in fact her company recently published a study making that claim. Thirty-six thousand Woebot users responded to statements like, “I believe Woebot likes me,” “Woebot and I respect each other” and “I feel that Woebot appreciates me.”
The study’s authors — all with financial ties to the company — concluded that a significant percentage of participants perceived a “working alliance” with Woebot, a term that means the therapist and patient have formed a cooperative rapport. The study did not measure whether there actually was a working alliance.
Sherry Turkle, a clinical psychologist at the Massachusetts Institute of Technology who writes about technology and relationships, is not swayed by such evidence. For therapy to heal, she said, the therapist must have a lived experience and the ability to empathize with a patient’s pain. An app cannot do that.
“We will humanize whatever seems capable of communicating with us,” Dr. Turkle said. “You’re creating the illusion of intimacy, without the demands of a relationship. You have created a bond with something that doesn’t know it is bonding with you. It doesn’t understand a thing.”
Eli Pours Out His Problems
Eli Spector started with Woebot in the summer of 2019.
He liked that he could open the app whenever he felt like it and pour out his distress on his own schedule, even for just a few minutes at a time. Most of what came out had to do with how unhappy he felt at his job.
He also took advantage of Woebot’s other features, including tracking his mood and writing in an online journal. It helped him realize how depressed he really was.
But he had doubts about the algorithm. The bot’s advice often felt generic, like a collection of “mindfulness buzzwords,” he said. “Like, ‘Can you think more about that feeling, and what you could do differently?’”
And worse, the advice could be nonsensical.
“I would type in, like, ‘My boss doesn’t appreciate the work I do’ and ‘I can’t seem to get her approval,’” Mr. Spector said. “And Woebot would be like: ‘That sounds difficult. Does this happen more in the morning or at night?’”
“It felt sort of silly,” he said.
Is It Really Therapy?
Much of the debate over therapeutic bots comes down to expectations. Do patients and clinicians understand the limitations of chatbots? Or are they expecting more than even the companies say they deliver?
On its website, Woebot promises to “automate both the process and content of therapy,” but Dr. Darcy is careful not to call Woebot medical treatment or even formal therapy.
Instead, she says, the bot delivers “digital therapeutics.” Woebot’s terms of service call it a “pure self-help” program that is not meant for emergencies. If a severe crisis does arise, the company says, the bot is programmed to recognize suicidal language and urge users to seek out a human alternative.
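Neither Woebot nor the article spells out how that recognition works, but the general pattern is easy to sketch: screen every message for crisis language before any scripted reply, and hand off to humans on a match. The phrase list and handoff text below are invented placeholders; real systems typically rely on trained classifiers rather than keyword lists.

```python
# Illustrative sketch only: screen each message for crisis language before
# any scripted reply, and hand off to humans on a match. This is not
# Woebot's actual screen, and the phrase list is a placeholder.
CRISIS_PHRASES = ("kill myself", "end my life", "suicide", "hurt myself")

HANDOFF = (
    "It sounds like you may be in crisis. I'm a self-help program, not an "
    "emergency service. Please reach out to a human now: a crisis hotline, "
    "a trusted person nearby, or local emergency services."
)

def screen(message: str) -> str | None:
    """Return a handoff message if crisis language is detected, else None."""
    lowered = message.lower()
    if any(phrase in lowered for phrase in CRISIS_PHRASES):
        return HANDOFF
    return None

# The screen runs first, so a match short-circuits the normal script.
reply = screen("I want to end my life") or "...continue normal session..."
print(reply)
```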
In the regulatory sense, Woebot is not true therapy. Like many mental health apps, the current, free version is classified as a “general wellness” product, which is subject only to F.D.A. guidance rather than strict oversight.
But Woebot is striving for something more. With $22 million of venture capital in hand, Woebot is seeking clearance from the F.D.A. to develop its algorithm to help treat two psychiatric diagnoses, postpartum depression and adolescent depression, and then sell the program to health systems.
And it is here that Woebot hopes to make money, using its practical advantage over any human therapist: scale.
While other virtual therapy companies, like BetterHelp or Talkspace, must keep recruiting therapists to join their platforms, A.I. apps can take on new users without paying for extra labor. And while therapists can vary in skills and approach, a bot is consistent and doesn’t get stressed out by back-to-back sessions.
“The assumption is always that, because it’s digital, it’ll always be limited,” Dr. Darcy of Woebot said. “There’s actually some opportunities that are created by the technology itself that are really challenging for us to do in traditional treatment.”
One advantage of an artificial therapist — or, as Dr. Darcy calls it, a “relational agent” — is 24-hour-a-day access. Very few human therapists answer their phone during a 2 a.m. panic attack, as Dr. Darcy pointed out. “I think people have probably underestimated the power of being able to engage in a therapeutic technique in the moment that you need to,” she said.
But whether Woebot can be involved in medical diagnosis or treatment is up to the F.D.A., which is supposed to make sure the app can back up its claims and not cause harm, an agency spokesperson said.
One possible harm, the spokesperson said, is a “missed opportunity” where someone with mental illness fails to get more effective treatment or delays treatment. “And what the consequences of those delays would look like — that’s something we’d worry about,” the spokesperson said.
Artificial intelligence can be problematic in other ways. For instance, Dr. Zeavin worries that racial and gender bias or privacy breaches could simply get translated into bots.
“Therapy has enough problems on its own,” Dr. Zeavin said. “And now they’ve brought all of the problems of algorithmic technology to bear.”
But even some skeptics of chatbot therapy believe it has the potential to complement the human-guided mental health system, as long as it comes with serious research.
“As the market gets saturated, the bar for evidence will get higher and higher and that’s how people will compete,” Dr. Torous said. “So maybe we’re just in such early stages and we don’t want to punish people for being innovative and kind of trying something.”
The idea, Dr. Darcy says, is not to replace human therapists with bots; she thinks it’s important to have both. “It’s like saying if every time you’re hungry, you must go to a Michelin star restaurant, when actually a sandwich is going to be OK,” she said. “Woebot is a sandwich. A very good sandwich.”
Eli Breaks Up With Woebot
After about a month, Eli Spector deleted Woebot from his phone.
He was unimpressed by the bot’s advice for beating back loneliness and despair, but he is not entirely sorry that he tried it out.
The mere act of typing out his problems was helpful. And through the process, he pinpointed what he actually needed to feel better.
“So maybe this was just evidence that I needed to, like, actually address this,” he said. “It was enough to inspire me to just take the plunge and find a flesh-and-blood therapist.”
Now, Mr. Spector pays a human psychotherapist in Philadelphia $110 a session.
They’ve been meeting on Zoom since the pandemic began, so the flesh-and-blood part is somewhat theoretical. But it’s close enough.