Arizona Public Radio | Your Source for NPR News

Revisiting the idea of whether AI might help those dealing with isolation, depression

STEVE INSKEEP, HOST:

So many people need advice on their mental health in this country that there are not enough professionals to meet their needs. So what if a computer could help? Some people seek answers from an app on their phones. Artificial intelligence might address isolation or depression, although it also raises new ethical questions. NPR's Yuki Noguchi reports.

YUKI NOGUCHI, BYLINE: Chukurah Ali overcame a traumatic childhood and, several years ago, opened Coco's Desserts in St. Louis, Mo. Her ornate cakes looked fit for baking shows. But those aren't even her favorite.

CHUKURAH ALI: Chocolate chip cookies (laughter). So simple (laughter). Those are my favorite. My grandma used to make them.

NOGUCHI: But last February, things fell apart. A car accident left Ali, a single mom who also cares for her mother, hobbled by injuries from head to knee.

ALI: I could barely talk. I could barely move. I felt like I was worthless because I could barely provide for my family at that moment. And now I lost my car. I can't even take care of my daughter.

NOGUCHI: Darkness and depression engulfed Ali.

ALI: The pain, my emotions, migraines.

NOGUCHI: Her orthopedist urged her to find a therapist, but none were available. Plus, Ali could no longer afford health insurance. She had to close the bakery.

ALI: That's stressful, too. That was my second baby.

NOGUCHI: So her doctor suggested a mental health app called Wysa. Its chatbot-only service is free, though it also offers teletherapy services with a human. The chatbot asks questions like, how are you feeling, or what's bothering you? It analyzes answers but doesn't generate its own responses. Instead, it draws from a database of psychologist-approved messages that deliver support or advice about managing chronic pain, say, or grief. That is how Ali found herself on the frontier of technology and mental health. Initially, she felt silly opening up to a robot.

ALI: I thought it was weird at first 'cause I'm like, OK, I'm talking to a bot. It's not going to do nothing. I want to talk to a therapist (laughter). But that bot helped.

NOGUCHI: Confined to her bed, she could text it at 3 a.m.

ALI: I would just start chatting with it. How are you feeling today? I'm not feeling it. Then it would give me these little options that I could do.

NOGUCHI: Like a simple exercise or deep breathing or listening to soothing music. It focused Ali on things other than pain, and it reminded her of the in-person therapy she did years ago.

ALI: What I noticed it was doing - CBT therapy, the cognitive behavioral therapy. It's not a person, but it makes you feel like it's a person because it's asking you all the right questions.

PAOLA PEDRELLI: And that is really what a therapist does.

NOGUCHI: Paola Pedrelli is a psychologist and professor at Harvard researching uses of AI to monitor mental health.

PEDRELLI: Reflect back what you're saying, naming and labeling your emotion. And now chatbots are able to do that.

NOGUCHI: Companies and researchers like Pedrelli are looking at various ways technology might improve therapy. Motion sensors, online activity and apps might help flag a patient's worsening mood. AI might also alert therapists when patients skip medications, or keep more detailed notes about a patient's tone or behavior during sessions. Other forms of AI interact directly with patients like Chukurah Ali, serving up suggestions based on known therapeutic methods.

Skeptics warn there hasn't been enough research or regulatory review and point to dangers of a chatbot misunderstanding or responding inappropriately. Many people may not be receptive to it. But research also shows some people prefer machines. There's no stigma with no human at the other end. Ali says as odd as it might sound, she relies on her chatbot.

ALI: I think the most I talked to that bot was, like, seven times a day (laughter).

NOGUCHI: She says mostly, it helps her help herself.

ALI: Or I will go to my physical therapist appointment, when before I'm like, no, I can do it today. I'm going to have to reschedule it.

NOGUCHI: That's precisely why Ali's doctor, orthopedist Abby Cheng, suggested she use the app. Cheng treats physical ailments but says mental health challenges almost always accompany them.

ABBY CHENG: Sometimes, if we can't address the mental health aspect of things, we feel stuck.

NOGUCHI: And patients, in turn, get stuck because of a lack of therapists, transportation, insurance, time or money.

CHENG: In order to address this huge mental health crisis we have in our nation and even globally, I think digital treatments and AI can play a role in that and at least fill some of that gap in the shortage of providers and resources that people have.

NOGUCHI: But getting to that future also requires figuring out thorny issues like health privacy and legal liability. And even AI's proponents argue, computers aren't ready to replace human therapists, especially for handling people in crisis.

Cindy Jordan is CEO of Pyx Health, a company that uses AI as part of its service to help people who feel chronically lonely. She worries, for example, about a chatbot responding to a suicidal person.

CINDY JORDAN: Oh, I'm sorry to hear that or, worse, I don't understand you. That makes me nervous. You know, we have not reached a point where - in an affordable, scalable way - where AI can understand every sort of response that a human might give, particularly those in crisis.

NOGUCHI: So as a backup, Pyx staffs a call center with people who call users when the system identifies them as potentially in crisis. But for more routine support, Chukurah Ali says she believes technology could help many more people, and she recommends the app to all her friends. She constantly finds herself passing along mental health advice she learns from it.

ALI: I wasn't like this before, but now it's like, so what you going to do today to make you feel better? How about you try this today (laughter)?

NOGUCHI: It isn't just a technology trying to act human, she laughs. She's now mimicking the technology.

Yuki Noguchi, NPR News.

INSKEEP: Hey, listen, I'm telling you, person to person, if you or someone you know may be considering suicide or in crisis, call or text the 988 Suicide & Crisis Lifeline. Just three digits - 988. Transcript provided by NPR, Copyright NPR.

Yuki Noguchi is a correspondent on the Science Desk based out of NPR's headquarters in Washington, D.C. She started covering consumer health in the midst of the pandemic, reporting on everything from vaccination and racial inequities in access to health, to cancer care, obesity and mental health.