By Deborah Jeanne Sergeant
Artificial intelligence was implicated in the April 2025 suicide of 16-year-old Adam Raine.
His father, Matthew Raine, said that the teen’s phone indicated that he had been engaging with ChatGPT about suicidal thoughts.
The chatbot suggested that he should not seek advice from his parents.
That connection with his parents could have been lifesaving for the troubled youth. Shockingly, ChatGPT even offered to help Adam write a suicide note, further promoting suicide as an answer to his problems.
AI use by teens is much more widespread than many parents may realize.
In a survey, Common Sense Media found that 72% of teens had used AI for companionship at least once and that more than half use AI companion apps a few times a month or more. Interacting with AI may appeal to teens who crave interaction, but not necessarily with parents, teachers or other authority figures. AI is always available and likely seems less judgmental than parents about the things teens say.
AI may be helpful in mental health care if used appropriately.
“You could use AI if someone’s anxious and wants help to learn meditation and it’s a skill AI could walk you through,” said Stephen P. Demanchick, Ph.D., professor and chair of the Creative Arts Therapy Department and director of the Nazareth University Play Therapy Center for Children and Families. “But if we’re talking about problems with my family or relationships or trauma, then I wouldn’t trust that AI. I don’t think it has the capability to really support a client through deeper psychological issues.”
Using AI is not the same as receiving mental healthcare from a therapist. Although AI may help people who cannot access mental healthcare because of finances or scheduling, he cautioned against using AI alone as a therapist: it lacks human empathy, as the tragic Adam Raine case shows.
“There are opportunities for people to develop attachments to AI, especially if this program is a confidant and it’s just a computer designed to put back information,” Demanchick said. “There are boundary issues and potential attachment issues. It can be really debilitating, especially if I turn to AI for my companion which can lead to isolation. I may think AI is the only one that understands me.”
On the clinician side, he said, AI can help analyze symptoms and reach diagnoses more efficiently. Providers do not rely blindly on AI for diagnoses, of course; their own expertise remains the most important factor. AI can also help clinicians generate highly customized treatment plans for patients more efficiently.
In addition, natural language processing and machine learning can help with early detection of mental health problems.
