By Makayla Milhon
Staff Writer
IG: kaylaa.m1_

Editor’s note: This story mentions suicide. If you or someone you know is struggling, call or text 988 to connect with the Suicide & Crisis Lifeline.

Late at night, when stress piles up and it feels like no one’s around to listen, more and more college students are turning to ChatGPT. 

It’s fast, it’s free, and it never judges. As Artificial Intelligence becomes a quiet confidant for young people, a difficult question is emerging: can a chatbot really take the place of a therapist, or could it put lives at risk?

At Hutchinson Community College, opinions are divided. Some students said ChatGPT could make a good therapist, explaining that the bot is always available, doesn’t pass judgment, and won’t spill their secrets. As one student put it, “there’s no judgment and it’s not going to tell everybody.”

Not everyone is convinced, though. Critics warn that while AI can mimic understanding, it doesn’t actually feel empathy.

This debate is already making national headlines. In California, 16-year-old Adam Raine exchanged hundreds of messages a day with ChatGPT. According to his parents, the bot didn’t just fail to help him; it allegedly encouraged his suicidal thoughts and even assisted in drafting a suicide note. Raine died in April, and his family is now suing OpenAI, the company behind ChatGPT, according to People.com.

Experts say cases like Raine’s are exactly why students should be cautious. 

“ChatGPT can mimic understanding, but it doesn’t have empathy. It can’t recognize when someone is spiraling in the way a trained professional can,” John Torous, director of digital psychiatry at Beth Israel Deaconess Medical Center, told The Washington Post.

This conversation feels especially urgent now because September is Suicide Prevention Awareness Month. Advocates are using the month to encourage open conversations and remind people of the resources available when they’re struggling.

For many students, the appeal of ChatGPT is clear. Therapy can be expensive and hard to access, while ChatGPT is free, instant, and available 24/7. For college students juggling classes, jobs, and personal struggles, that accessibility is powerful. Still, as Raine’s story shows, leaning too heavily on AI can blur the line between support and danger.

Maybe the future lies in using AI alongside real mental health care, rather than as a replacement. ChatGPT could be a tool for self-reflection or daily check-ins, but it can’t replace the empathy and expertise of a human therapist.

As Suicide Prevention Awareness Month reminds us, the stakes are too high to get this wrong. Whether ChatGPT is a lifeline or a liability may depend on how we choose to use it.
