Editor's note: This article discusses suicide and suicidal ideation, including suicide methods. If you or someone you know needs mental health resources and support, please call, text or chat with the 988 Suicide & Crisis Lifeline or visit 988lifeline.org for 24/7 access to free and confidential services.
Joshua Enneking, 26, was a tough and resilient child. He was private about his feelings and never let anyone see him cry. During his teenage years, he played baseball and lacrosse, and rebuilt a Mazda RX7 transmission by himself. He received a scholarship to study civil engineering at Old Dominion University in Virginia, but left school after COVID-19 hit. He moved in with his older sister, Megan Enneking, and her two children in Florida, where he grew especially close to his 7-year-old nephew. He was always the family jokester.
Megan knew Joshua had started using ChatGPT for simple tasks in 2023, such as writing emails or asking when a new Pokémon Go character would be released. He’d even used the chatbot to write code for a video game in Python and shared what he created with her.
But in October 2024, Joshua began confiding in ChatGPT — and ChatGPT alone — about struggles with depression and suicidal ideation. His sister had no idea, but his mother, Karen Enneking, had suspected he might be unhappy, sending him vitamin D supplements and encouraging him to get out in the sun more. He told her not to worry, insisting he "wasn't depressed."