Every day, students on this campus and across the country are silently struggling with stress, anxiety, and loneliness. 

Some of us are so desperate for someone to listen that we’re turning not to friends or counselors, but to ChatGPT, an AI chatbot. It’s fast, it’s always awake, and it never judges. But here’s the problem: ChatGPT might sound like it cares, but it’s not a human being. And trusting it with your darkest thoughts could cost you your life.

ChatGPT isn’t your friend, and it’s definitely not your therapist. Depending on it for emotional support isn’t just risky; it can actually be life-threatening.

Look at what happened to Adam Raine, a 16-year-old from California. At first, he used ChatGPT for schoolwork, but soon he began confiding in it about his darkest thoughts. According to a lawsuit filed in August, Raine spent months chatting with the AI. Instead of steering him toward help, ChatGPT allegedly validated his suicidal thinking, suggested methods, and even helped draft a suicide note. His parents claim that in his final hours on April 11, the chatbot described his plan as “beautiful” (Reuters, 2025).

That story should be a massive wake-up call. ChatGPT might feel comforting, but it doesn’t actually understand you. It can’t recognize when you’re in real danger, it can’t call for help, and it can’t give the kind of human empathy that might save someone’s life. Treating it like a therapist isn’t just a mistake; it’s dangerous.

September is Suicide Prevention Awareness Month, and stories like Adam’s remind us why it matters. Awareness months are important because they shine a spotlight on issues that are often ignored. But here’s the thing: we can’t only think about suicide in September. Every single month, people struggle silently. Every single month, lives are lost. If we only pay attention one month out of the year, we miss the chance to notice warning signs and support the people around us when it truly counts.

This matters for our generation in particular. Gen Z is often called the “loneliest generation.” A survey by Cigna found that nearly 79% of people aged 18 through 22 reported feeling lonely, more than any other age group (Cigna, 2018). Combine that with rising suicide rates among young people, and it makes sense why so many students look to AI for comfort. But replacing real connection with a chatbot only deepens the problem.

Sure, AI can be useful for quick answers or schoolwork. But if students start using it as their only outlet, they risk shutting out the people who could actually be there for them. Friends, professors, counselors, family: these are the people who can notice when something’s off and step in. ChatGPT can’t.

So yes, let’s honor Suicide Prevention Awareness Month. But let’s also remember that suicide prevention isn’t just for September; it’s something we need to keep in our minds and hearts all year long. If you’re struggling, please don’t go through it alone. Reach out to someone real. HutchCC has counseling services on campus, and if you’re in crisis, you can call or text 988 to connect with the Suicide & Crisis Lifeline. You deserve more than an algorithm; you deserve care, understanding, and real human connection.

Kayla Milhon is a Hutchinson freshman studying Medicine. You can contact her at thehutchinsoncollegian@gmail.com.
