“Schizophrenia” Still Carries a Stigma. Will Changing the Name Help?

Many people with or connected to the mental illness approve of updating the name, a new survey shows. But some experts are not convinced it’s the answer.

For decades, Linda Larson has been trying to distance herself from the diagnosis she was given as a teenager: schizophrenia. She accepts that she has the mental disorder but deeply resents the term’s stigma. People hear it and think, “violent, amoral, unhygienic,” she said.

Ms. Larson, 74, is part of a group trying to remove that association — by changing the name of the illness. The idea is that replacing the term “schizophrenia” with something less frightening and more descriptive will change not only how the public perceives people with the diagnosis, but also how those people see themselves.

Ms. Larson is a member of the Consumer Advisory Board of the Massachusetts Mental Health Center, which is associated with Beth Israel Deaconess Medical Center in Boston. The group has been working with psychiatrists at Harvard to build momentum for a name change, most recently through a national survey published in the journal Schizophrenia Research.

“That term over time has become so associated with hopelessness, with dangerousness, with volatile and erratic behavior, that doctors are afraid to use that term with people and their family members,” said Dr. Raquelle Mesholam-Gately, a Harvard psychologist and the lead author of the new paper. “And people who have the condition don’t want to be associated with that name.”

As a result, she said, clinicians often avoid making such a devastating diagnosis, and many patients and their families don’t seek treatment until after the illness has wreaked considerable damage.

Dr. Mesholam-Gately and her team asked about 1,200 people connected to schizophrenia — including those with the disorder, their family members, mental health providers, researchers and government officials — whether it should be called something else.

The survey proposed nine alternative names, based partly on the experiences of people diagnosed with schizophrenia. Among them: altered perception disorder, attunement disorder, disconnectivity syndrome, integration disorder and psychosis spectrum disorder.

Although none of the options won overwhelming approval, 74 percent of respondents favored a new name in principle. But the path to an official change remains steep, because researchers and advocates are divided on whether a new name would actually reduce stigma and improve the lives of people with the disorder.

“We have to take this on in a systematic way,” said Dr. Matcheri Keshavan, the academic head of psychiatry at Beth Israel Deaconess and a co-author of the study. “Any change has to be gradual. Sudden changes, nobody will accept.”

In the United States, the decision is up to the American Psychiatric Association, which would make the change in its official diagnostic manual (the Diagnostic and Statistical Manual of Mental Disorders, or D.S.M.) after reaching consensus among its scientific advisers. (The World Health Organization also oversees an international classification of diseases.)

The term “schizophrenia,” which derives from Greek words for “split mind,” was coined in 1908 by Dr. Eugen Bleuler. He argued that the disorder, previously considered a type of dementia, was characterized by a “splitting of psychological functions” in which “the personality loses its unity.”

But the term has often been misunderstood and wrongly applied over the last century, many psychologists and researchers say.
It is often confused with dissociative identity disorder, previously known as multiple personality disorder. “Schizophrenic” has also been co-opted by colloquial language, often as an insult.

Part of the problem is that schizophrenia has long been misunderstood as an untreatable disease, Ms. Larson said. That is what she and her family assumed in the 1960s when, at 15, she started having delusions and psychotic episodes.

“For a while, I thought silver cars were C.I.A., green cars were Army, blue cars were Air Force, black cars were Secret Service,” she said.

By her twenties, she had recovered sufficiently to start working on a doctorate in literature at the University of Mississippi, but then she had another psychotic break.

She stood outside a gas station and decided to blow it up, she said: “I had a Bic lighter in my hand and I stood there. And for some reason I didn’t.”

A doctor diagnosed Ms. Larson with schizophrenia and suggested that she abandon her Ph.D. program.

She went through 20 years of sporadic hospitalizations and several suicide attempts until the 1990s, when she was prescribed the antipsychotic drug clozapine. Although clozapine can have serious side effects, Ms. Larson found it transformative; she said she has not had a psychotic break since. She has published four books of poetry and was married for 32 years, until her husband’s death in 2020.

“The term schizophrenia hasn’t evolved with the treatment,” Ms. Larson said.

But Dr. Mesholam-Gately said that not all survey respondents supported a name change. Some worried that an unfamiliar name would make it harder for patients to apply for disability or insurance coverage. Others said that if the new name were too broad, doctors might overdiagnose patients. And some considered the term simply too ingrained in the culture.

Dr. William Carpenter, a psychiatrist at the University of Maryland School of Medicine and the editor of Schizophrenia Bulletin, said he had seen these semantic debates play out for decades.

“A rose by any other name would smell the same,” said Dr. Carpenter, who was not involved in the survey. “And if you make the change, how long until the stigma catches up with it?”

Dr. Carpenter agreed that stigma surrounding the term “schizophrenia” may in fact delay critical treatment after a first psychotic episode. (The average gap between diagnosis and treatment is two to three years, he noted.) But he was not convinced that changing the name would close that gap.

For example, he said, suppose a teenage patient goes to the doctor with telltale symptoms, such as hearing voices. If the doctor uses a new name for the diagnosis, Dr. Carpenter said, “you can almost hear the parents saying, ‘Didn’t that used to be called schizophrenia?’”

This may also be the wrong moment to tinker with the name, Dr. Carpenter added. Scientists are reworking the clinical definition of schizophrenia, focusing more on brain mechanisms rather than psychological symptoms alone, and viewing it more as a syndrome than as a single disease.
These changes could be reflected in future revisions of the D.S.M., and it may not make sense to rename the disorder before that happens.

Even some mental health professionals who work to counter schizophrenia’s stigma are skeptical of the renaming effort.

“We absolutely agree that language is extremely important,” said Lisa Dailey, the director of the Treatment Advocacy Center, which supports people with severe mental illness. But pushing for a name change, she added, is not an effective use of limited resources.

The best way to destigmatize schizophrenia, Ms. Dailey said, “is to develop better medications that work for more people.”

While other countries, including Japan and South Korea, have recently adopted new names for schizophrenia, Dr. Mesholam-Gately and Dr. Keshavan acknowledged that they need more of a consensus among scientists and clinicians in the United States.

There is precedent for rethinking mental health terminology, they noted. The illness once known as manic depression was relabeled bipolar disorder in 1980. “Mental retardation” became “intellectual disability” in 2013. And the diagnostic categories for autism were changed in the most recent version of the psychiatric manual, after years of advocacy.

Even if the Consumer Advisory Board convinces the authors of the next diagnostic manual to change the name, that alone “is not going to be enough to reduce stigma and discrimination,” Dr. Mesholam-Gately said. “There also needs to be public education campaigns that go along with that, to really explain what the condition is and the treatments that are available for it.”


Something Bothering You? Tell It to Woebot.

“I understand that you’re experiencing a relationship problem, is that right?”

Yes.

“Okay, I’m delighted that you want to work on this K. I’ve good and bad news for you… Which do you want to hear first?”

Good news.

“The good news is, I have very powerful skills that I can teach you! These skills have really helped a lot of people build better relationships.”

And the bad news?

“The bad news is that you would have to be willing to accept 100% of the responsibility for changing the relationship.”

Why?

“Because you’re only ever going to be able to have control over your part of it.”

This conversation may sound like a fairly typical therapist’s session, but in this case it was directed by Woebot, a therapeutic chatbot, with the psychiatrist’s couch swapped for a smartphone screen.

The app presents itself as an automated therapist at a time when finding a real one can feel like a logistical and financial impossibility. And the need for therapists is only growing.

During the pandemic, about four in 10 adults in the United States reported symptoms of anxiety or depression, according to the Kaiser Family Foundation. Meanwhile, the federal government warns of a critical shortage of therapists and psychiatrists. According to the advocacy group Mental Health America, almost 60 percent of those with mental illness last year did not get treatment.

Woebot Health says the pandemic has driven up demand for its services. The number of its daily users doubled and is now in the tens of thousands, said Alison Darcy, a psychologist and the founder and president of the company.

Digital mental health has become a multibillion-dollar industry that includes more than 10,000 apps, according to an estimate by the American Psychiatric Association. The apps range from guided meditation (Headspace) and mood tracking (MoodKit) to text therapy with licensed counselors (Talkspace, BetterHelp).

But Woebot, which was introduced in 2017, is one of only a handful of apps that use artificial intelligence to deploy the principles of cognitive behavioral therapy, a common technique for treating anxiety and depression. Woebot aims to use natural language processing and learned responses to mimic conversation, remember past sessions and deliver advice about sleep, worry and stress.

“If we can deliver some of the things that the human can deliver,” Dr. Darcy said, “then we actually can create something that’s truly scalable, that has the capability to reduce the incidence of suffering in the population.”

Almost all psychologists and academics agree with Dr. Darcy on the problem: There is not enough affordable mental health care for everyone who needs it. But they are divided on her solution. Some say bot therapy can work under the right conditions; others consider the very concept paradoxical and ineffective.

At issue is the nature of therapy itself. Can therapy by bot help people understand themselves better? Can it change long-held patterns of behavior through a series of probing questions and reflective exercises? Or is human connection essential to that endeavor?

Hannah Zeavin is the author of the forthcoming book “The Distance Cure: A History of Teletherapy.” The health care system is so broken, she says, that “it makes sense that there’s space for disruption.”

But, she added, not all disruption is equal.
She calls automated therapy a “fantasy” that is more focused on accessibility and fun than on actually helping people get better over the long term.

“We are an extraordinarily confessing animal; we will confess to a bot,” she said. “But is confession the equivalent of mental health care?”

Alison Darcy, a psychologist, is the founder and president of Woebot Health.

Eli Turns to Woebot

Eli Spector seemed like the perfect client for A.I. therapy.

In 2019, Mr. Spector was a 24-year-old college graduate working in a neuroscience lab in Philadelphia. Having grown up with an academic father who specialized in artificial intelligence, he considered himself something of a technologist.

But his job was isolating and tedious, and after four stimulating years in academia, he felt bored and lonely. He couldn’t sleep well and found that his moods were consistently dark.

“I was just having a really hard time adjusting, and I didn’t have any co-workers I liked,” he said. “It was just a tough period for me.”

But he wasn’t sure he wanted to bare his soul to a real person; he didn’t want to worry about anyone’s judgment or try to fit around someone else’s schedule.

Besides, he didn’t think he could find a therapist his parents’ insurance plan would cover, and paying out of pocket could run from $100 to $200 a session. Woebot was free and on his phone.

“Woebot seemed like this very low-friction way to see, you know, if this could help.”

Therapy by Algorithm

Woebot’s use of cognitive behavioral therapy has a philosophical and practical logic to it. Unlike forms of psychotherapy that probe the root causes of psychological problems, often going back to childhood, C.B.T. seeks to help people identify their distorted ways of thinking and understand how those thoughts affect their behavior in negative ways. By changing these self-defeating patterns, therapists hope to improve symptoms of depression and anxiety.

Because cognitive behavioral therapy is structured and skill-oriented, many mental health experts think it can be delivered, at least in part, by algorithm.

“You can deliver it pretty readily in a digital framework, help people grasp these concepts and practice the exercises that help them think in a more rational manner,” said Jesse Wright, a psychiatrist who studies digital forms of C.B.T. and is the director of the University of Louisville Depression Center. “Whereas trying to put something like psychoanalysis into a digital format would seem pretty formidable.”

Dr. Wright said several dozen studies had shown that computer algorithms could take someone through a standard C.B.T. process, step by step, and get results similar to in-person therapy. Those programs generally follow a set length and number of sessions and require some guidance from a human clinician.
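To make the “step by step” idea concrete, here is a minimal sketch in Python of one standard C.B.T. exercise, the thought record, delivered as a fixed script. It is purely illustrative, an assumption about the simplest possible design rather than Woebot’s actual code, which layers natural language processing on top of structure like this.

```python
# A hypothetical sketch of a scripted C.B.T. "thought record," a standard
# exercise that walks a user from a troubling situation toward a more
# balanced restatement of the thought behind it. Illustrative only; this
# is not Woebot's code.

PROMPTS = [
    "What situation is bothering you?",
    "What thought went through your mind?",
    "What evidence supports that thought?",
    "What evidence goes against it?",
    "Can you restate the thought in a more balanced way?",
]

def run_thought_record() -> list[tuple[str, str]]:
    """Walk the user through each prompt in order and return the transcript."""
    transcript = []
    for step, prompt in enumerate(PROMPTS, start=1):
        print(f"Step {step} of {len(PROMPTS)}: {prompt}")
        # The reply is stored, not understood: the script advances
        # regardless of what the user actually says.
        transcript.append((prompt, input("> ")))
    return transcript

if __name__ == "__main__":
    run_thought_record()
```

Even this toy version shows why the approach scales so cheaply, and where it is brittle: the script collects replies but never interprets them, and bridging that gap with natural language processing is exactly where such bots tend to stumble.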
But most smartphone apps don’t work that way, he said. People tend to use therapy apps in short, fragmented spurts, without clinician oversight. Outside of limited company-sponsored research, Dr. Wright said, he knew of no rigorous studies of that model.

And some automated conversations can be clunky and frustrating when the bot fails to pick up on the user’s exact meaning. Dr. Wright said A.I. is not advanced enough to reliably duplicate a natural conversation.

“The chances of a bot being as wise, sympathetic, empathic, knowing, creative and being able to say the right thing at the right time as a human therapist is pretty slim,” he said. “There’s a limit to what they can do, a real limit.”

John Torous, the director of digital psychiatry for Beth Israel Deaconess Medical Center in Boston, said therapeutic bots might be promising, but he worries that they are being rolled out too soon, before the technology has caught up with the psychiatry.

“If you deliver C.B.T. in these bite-size parts, how much exposure to bite-size parts equals the original?” he said. “We don’t have a good way to predict who’s going to respond to them or not — or who it’s good or bad for.”

These new apps, Dr. Torous said, risk setting back other advances in digital mental health: “Do we in part end up losing trust and credibility because we’re promising what is not yet possible by any machine or any program today?”

Other mental health professionals say that therapy should simply not be delivered by machine. Effective treatment involves more than cognitive skill-building, they say. It needs a human-to-human connection. Therapists need to hear nuances, see gestures, recognize the gap between what is said and unsaid.

“These apps really shortchange the essential ingredient that — mounds of evidence show — is what helps in therapy, which is the therapeutic relationship,” said Linda Michaels, a Chicago-based therapist who is co-chair of the Psychotherapy Action Network, a professional group.

Dr. Darcy of Woebot says a well-designed bot can form an empathetic, therapeutic bond with its users, and her company recently published a study making that claim. Thirty-six thousand Woebot users responded to statements like “I believe Woebot likes me,” “Woebot and I respect each other” and “I feel that Woebot appreciates me.”

Eli Spector tried Woebot when he was reluctant to bare his soul to a human therapist.

The study’s authors — all with financial ties to the company — concluded that a significant percentage of participants perceived a “working alliance” with Woebot, a term meaning that therapist and patient have formed a cooperative rapport. The study did not measure whether there actually was a working alliance.

Sherry Turkle, a clinical psychologist at the Massachusetts Institute of Technology who writes about technology and relationships, is not swayed by such evidence. For therapy to heal, she said, the therapist must have lived experience and the ability to empathize with a patient’s pain. An app cannot do that.

“We will humanize whatever seems capable of communicating with us,” Dr. Turkle said. “You’re creating the illusion of intimacy, without the demands of a relationship. You have created a bond with something that doesn’t know it is bonding with you. It doesn’t understand a thing.”

Eli Pours Out His Problems

Eli Spector started with Woebot in the summer of 2019.

He liked that he could open the app whenever he felt like it and pour out his distress on his own schedule, even for just a few minutes at a time. Most of what he typed had to do with how unhappy he felt at his job.

He also took advantage of Woebot’s other features, including tracking his mood and writing in an online journal. It helped him realize how depressed he really was.

But he had doubts about the algorithm.
The bot’s advice often felt generic, like a collection of “mindfulness buzzwords,” he said. “Like, ‘Can you think more about that feeling, and what you could do differently?’”

And worse, the advice could be nonsensical.

“I would type in, like, ‘My boss doesn’t appreciate the work I do’ and ‘I can’t seem to get her approval,’” Mr. Spector said. “And Woebot would be like: ‘That sounds difficult. Does this happen more in the morning or at night?’”

“It felt sort of silly,” he said.

Is It Really Therapy?

Much of the debate over therapeutic bots comes down to expectations. Do patients and clinicians understand the limitations of chatbots? Or are they expecting more than even the companies say they can deliver?

On its website, Woebot promises to “automate both the process and content of therapy,” but Dr. Darcy is careful not to call Woebot medical treatment, or even formal therapy.

Instead, she says, the bot delivers “digital therapeutics.” Woebot’s terms of service call it a “pure self-help” program that is not meant for emergencies. In the event of a severe crisis, the company says, the bot is programmed to recognize suicidal language and urge users to seek out a human alternative.
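How might a program recognize suicidal language? Woebot has not published its method, so the following Python sketch is only an assumption, showing the simplest conceivable approach: screening each incoming message against a keyword list before the normal conversation flow continues. Production systems more likely rely on trained classifiers with far broader coverage.

```python
# A hypothetical sketch of crisis-language screening by keyword matching.
# This is an assumption for illustration, not Woebot's actual logic.

CRISIS_TERMS = ("suicide", "kill myself", "end my life", "hurt myself")

ESCALATION_MESSAGE = (
    "It sounds like you may be in crisis. I'm not able to help with that. "
    "Please reach out to a crisis line or a human professional right now."
)

def screen_message(text: str) -> str | None:
    """Return an escalation message if the text contains crisis language,
    or None so the normal conversation flow can continue."""
    lowered = text.lower()
    if any(term in lowered for term in CRISIS_TERMS):
        return ESCALATION_MESSAGE
    return None
```

The design tension is easy to see even in this toy version: a list this narrow misses indirect phrasing, while a broader net interrupts ordinary venting.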
In that way, Woebot does not approach true therapy. Like many mental health apps, the current, free version of Woebot is not subject to strict oversight from the Food and Drug Administration, because it falls under the category of “general wellness” products, which receive only F.D.A. guidance.

But Woebot is striving for something more. With $22 million of venture capital in hand, the company is seeking clearance from the F.D.A. to develop its algorithm to help treat two psychiatric diagnoses, postpartum depression and adolescent depression, and then sell the program to health systems.

And it is here that Woebot hopes to make money, using its practical advantage over any human therapist: scale.

While other virtual therapy companies, like BetterHelp or Talkspace, must keep recruiting therapists to join their platforms, A.I. apps can take on new users without paying for extra labor. And while therapists vary in skill and approach, a bot is consistent and doesn’t get stressed out by back-to-back sessions.

“The assumption is always that, because it’s digital, it’ll always be limited,” Dr. Darcy said. “There’s actually some opportunities that are created by the technology itself that are really challenging for us to do in traditional treatment.”

One advantage of an artificial therapist — or, as Dr. Darcy calls it, a “relational agent” — is 24-hour access. Very few human therapists answer their phones during a 2 a.m. panic attack, as Dr. Darcy pointed out. “I think people have probably underestimated the power of being able to engage in a therapeutic technique in the moment that you need to,” she said.

But whether Woebot can be involved in medical diagnosis or treatment is up to the F.D.A., which is supposed to make sure the app can back up its claims and will not cause harm, an agency spokesperson said.

One possible harm, the spokesperson said, is a “missed opportunity,” in which someone with mental illness fails to get more effective treatment, or delays it. “And what the consequences of those delays would look like — that’s something we’d worry about,” the spokesperson said.

Artificial intelligence can be problematic in other ways. Dr. Zeavin, for instance, worries that racial and gender bias, as well as privacy breaches, could simply be translated into bots.

“Therapy has enough problems on its own,” Dr. Zeavin said. “And now they’ve brought all of the problems of algorithmic technology to bear.”

But even some skeptics of chatbot therapy believe it has the potential to complement the human-guided mental health system, as long as it comes with serious research.

“As the market gets saturated, the bar for evidence will get higher and higher, and that’s how people will compete,” Dr. Torous said. “So maybe we’re just in such early stages and we don’t want to punish people for being innovative and kind of trying something.”

The idea, Dr. Darcy says, is not to replace human therapists with bots; she thinks it’s important to have both. “It’s like saying if every time you’re hungry, you must go to a Michelin-star restaurant, when actually a sandwich is going to be OK,” she said. “Woebot is a sandwich. A very good sandwich.”

Eli Breaks Up With Woebot

After about a month, Eli Spector deleted Woebot from his phone.

He was unimpressed by the bot’s advice for beating back loneliness and despair, but he is not entirely sorry that he tried it.

The mere act of typing out his problems was helpful. And through the process, he pinpointed what he actually needed to feel better.

“So maybe this was just evidence that I needed to, like, actually address this,” he said. “It was enough to inspire me to just take the plunge and find a flesh-and-blood therapist.”

Now Mr. Spector pays a human psychotherapist in Philadelphia $110 a session.

They’ve been meeting on Zoom since the pandemic began, so the flesh-and-blood part is somewhat theoretical. But it’s close enough.
