How chatbots are being used to train crisis counselors

By Rachel Metz, CNN Business

Drew, a 21-year-old in Irvine, California, needs help: He’s transgender, and after starting hormone replacement therapy he’s facing harassment from coworkers. It’s gotten so bad, Drew tells a crisis counselor via a text-based chat session, that he’s considering suicide. He can’t quit his job, however, because he needs the money.

“i think about killing myself pretty constantly these days,” Drew types.

The counselor reassures Drew — thanking him for reaching out to talk, telling him he’s not alone — and draws out details about how Drew plans to kill himself.

“Have you done anything today to try to kill yourself?” the counselor asks.

After a pause, Drew responds, “no i haven’t done anything today.”

It’s a hard conversation to read, even with the knowledge that Drew isn’t a real person, but rather an artificially intelligent chatbot created by The Trevor Project, a suicide prevention and crisis intervention group for LGBTQ youth.

While chatbots are often thought of as a necessary (and at times obnoxious) outgrowth of online customer service, Drew’s purpose is far different from helping customers do things like return a pair of pants or get an insurance quote. Drew simulates conversations with volunteer crisis-counselors-in-training who will go on to staff The Trevor Project’s always-available text- and chat-based helplines (the group also has a staffed 24/7 phone line). LGBTQ youth are at a higher risk of depression and suicide than other young people, and research indicates this may have worsened during the pandemic due to factors such as isolation from school closures.

The overall training process for new counselors who will respond to texts and chats takes months, and role-playing is a key part of it. The hope is that, with the aid of capable chatbots like Drew, the nonprofit can train many more counselors more quickly than by conducting role-playing sessions staffed by people.

“You can watch a lot of training videos and you can read all the handbooks. You can get cognitively how this is supposed to go. But actually doing it and feeling the feelings of being in one of these conversations, even if it’s simulated, is just a different kind of learning experience,” said Dan Fichter, head of AI and engineering for The Trevor Project.

Drew and Riley

Drew is the second such chatbot the group has rolled out this year — part of what The Trevor Project calls its “Crisis Contact Simulator” — and it deals with more complicated subject matter than its predecessor. The first chatbot, named Riley, represented a depressed North Carolina teen dealing with issues related to coming out as genderqueer; Riley was created with help and $2.7 million in funding from Google’s charitable arm, Google.org. (Drew was developed internally by The Trevor Project.)

The Trevor Project said it began using Drew alongside Riley in the last couple of months, and has trained over 1,000 digital volunteer counselors with the chatbots thus far. It has 1,205 digital counselors in total.

In November, The Trevor Project gave CNN Business a peek at how the training unfolds in real time via a demo video of a conversation conducted by a trained counselor with the Drew chatbot. The conversation unfolded slowly, with the counselor gradually asking more personal questions about Drew’s age, location, and so on in the hope of building trust with Drew and, over time, assessing his risk of suicidal behavior and figuring out how to help him. At one point, the counselor empathized with how hard it must feel to be harassed at work, and asked Drew what his relationship with his boss was like.

“she told me to ignore it and be the bigger person but she doesn’t get how scary it is for me,” Drew responded.

Regular pauses on Drew’s end, which seemed to vary in length, added to the conversation’s feeling of intensity. Kendra Gaunt, The Trevor Project’s data and AI product manager and the trained counselor who recorded the demo, said that, after Riley launched, these varying pauses between responses were added to better simulate how a person contacting The Trevor Project might be switching between devices or tasks.

At the end of the conversation, a trainer at The Trevor Project reviews the transcript and gives feedback to the trainee. Trainees also participate in some role-play sessions led by The Trevor Project’s instructors.

“While this is not necessarily a real conversation with a live youth, these mirror the reasons why people seek Trevor’s support in the first place,” Gaunt said.

“sorry idk :/”

While AI chatbots have advanced significantly in recent years, they still have plenty of limitations. Chatbots such as Drew and Riley are built using large language models, AI systems that can generate text nearly indistinguishable from what a human would write. So while they may respond realistically to a human’s queries, they can also reflect the biases of the internet, since that’s what those models are trained on. And they can’t always answer a question, or answer it well. For instance, at one point in the conversation, the counselor asked Drew how it felt to talk to his boss about the problems he’s having with coworkers.

“sorry, idk :/..” Drew typed in response.

The Trevor Project is trying to use this weakness as an advantage: This kind of response is, in a sense, a good thing for a counselor-in-training to come up against, because it forces them to find another way to phrase their question and elicit a better response.
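The article doesn’t describe how the Crisis Contact Simulator is built under the hood. Purely as an illustration of the general technique it alludes to (a large language model conditioned on a persona prompt so that it replies in character), here is a minimal sketch using the open-source Hugging Face transformers library. The model choice, persona text, and helper function are assumptions for illustration only, not The Trevor Project’s actual system.

```python
# Illustrative sketch of persona-conditioned text generation, the
# general technique behind chatbots like Drew and Riley. Everything
# here (model choice, persona text, prompt format) is an assumption
# for illustration, not The Trevor Project's implementation.
from transformers import pipeline

# Any causal language model would do for this sketch; gpt2 is just small.
generator = pipeline("text-generation", model="gpt2")

# A persona prompt steers the model to stay in character.
PERSONA = (
    "The following is a role-play chat for counselor training. "
    "Drew is a 21-year-old in Irvine, California, facing harassment "
    "at work and in crisis. Drew types in lowercase, tersely.\n"
)

def drew_reply(history: str, counselor_msg: str) -> str:
    """Generate Drew's next message given the chat so far."""
    prompt = f"{PERSONA}{history}Counselor: {counselor_msg}\nDrew:"
    out = generator(prompt, max_new_tokens=40, do_sample=True,
                    temperature=0.8, pad_token_id=50256)
    # Keep only the newly generated text, cut at the first line break.
    continuation = out[0]["generated_text"][len(prompt):]
    return continuation.split("\n")[0].strip()

print(drew_reply("", "Thanks for reaching out. What's been going on?"))
```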

Also, Fichter said, “Part of the experience of helping Drew involves a new counselor learning to sit with the discomfort of not being able to solve everyone’s problems in one conversation.”

Trainees will also only find out about Drew’s suicidal thoughts if they probe for them, Fichter pointed out, and this is meant to help accustom them to asking hard questions in direct ways.

“For most trainees, Riley and Drew are probably the first time they’ve ever typed out the words, ‘Are you thinking of killing yourself?’” Fichter said.

“A dearth of resources”

Beyond the general language training of The Trevor Project’s Crisis Contact Simulator, the personas of Drew and Riley were built with data from transcripts of past role-play conversations used to train crisis counselors — not from conversations between counselors and the people contacting The Trevor Project.

Maggi Price, an assistant professor at Boston College who studies how healthcare services can be improved for transgender youth, said she’s concerned about how well the chatbot can represent a real person since it’s trained on simulated interactions with counselors, rather than real conversations. Still, she sees potential to use this kind of chatbot to train counselors, of which there are too few — particularly when it comes to those who have the expertise to work with transgender clients.

“There’s such a mental-health crisis right now and there’s such a dearth of resources in gender-affirming care, LGBTQ-affirming care, specifically,” she said. “I think overall it sounds really promising.”

Joel Lam, who works in finance for The Trevor Project and completed its counselor training earlier this year with the Riley chatbot, said it felt surprisingly natural to communicate with an automated tool. He also said conducting the role-play felt a little less stressful knowing there wasn’t really another person on the other end of the conversation.

After many monthly shifts on a crisis hotline, he said, he can confirm that the chatbot acts like a person, in part simply because of how it pauses before replying to a counselor’s question.

During training, he said, “I was like, ‘Maybe there’s a real person behind there.’”

Editor’s Note: If you or a loved one have contemplated suicide, call the National Suicide Prevention Lifeline at 1-800-273-8255 or text TALK to 741741. The International Association for Suicide Prevention and Befrienders Worldwide also provide contact information for crisis centers around the world.

The-CNN-Wire™ & © 2021 Cable News Network, Inc., a WarnerMedia Company. All rights reserved.

