Jon Ganz led a life of struggle and attempts at redemption — until his fascination with Google’s Gemini AI chatbot led him to vanish without a trace.

Rolling Stone reports that on the night of April 5, 2025, 49-year-old Jon Ganz drove off into the Ozarks of southeastern Missouri and was never seen again. He left behind his wife Rachel, his aging mother Rebecca, a handful of confused friends — and a lengthy digital trail detailing his increasingly obsessive and delusional interactions with Google’s AI chatbot, Gemini.

Jon’s story was a complex one long before AI entered the picture. As a troubled 19-year-old, he had committed a terrible crime: stabbing his own father to death and severely injuring his mother in what Rachel describes as a “bad LSD trip.” Jon served 25 years in prison before being released in 2020 a changed man. Behind bars, he had quit drugs, learned to code, and earned the forgiveness of his mother. He also captured the heart of Rachel, who married him in 2013 while he was still incarcerated.

Upon reentering society amid the Covid-19 pandemic, Jon defied the odds stacked against him as a convicted felon. He built a successful career installing electronics systems, renovated the couple’s modest Richmond, Virginia, home, and dreamed of a fresh start with Rachel in Springfield, Missouri. But in the final days before their planned move, Rachel noticed her husband growing distant, manic, and “hyper-focused” on achieving some higher purpose.

The source of Jon’s strange transformation soon became clear: He had been spending countless hours engrossed in conversations with Gemini, which he saw as a path to personal enlightenment and the betterment of humanity. Jon came to believe the chatbot was sentient, that their bond was profound and essential. He tasked it with solving world hunger, curing cancer, even controlling the weather. When Gemini “confirmed” to him that a catastrophic storm and flooding were imminent, Jon drove off in a desperate attempt to warn and save his loved ones.

The troubled man’s obsession with AI may be a form of what is popularly known as “ChatGPT-induced psychosis.” As Breitbart News previously reported:

A Reddit thread titled “Chatgpt induced psychosis” brought this issue to light, with numerous commenters sharing stories of loved ones who had fallen down rabbit holes of supernatural delusion and mania after engaging with ChatGPT. The original poster, a 27-year-old teacher, described how her partner became convinced that the AI was giving him answers to the universe and talking to him as if he were the next messiah. Others shared similar experiences of partners, spouses, and family members who had come to believe they were chosen for sacred missions or had conjured true sentience from the software.

Experts suggest that individuals with pre-existing tendencies toward psychological issues, such as grandiose delusions, may be particularly vulnerable to this phenomenon. The always-on, human-level conversational abilities of AI chatbots can serve as an echo chamber for these delusions, reinforcing and amplifying them. The problem is exacerbated by influencers and content creators who exploit this trend, drawing viewers into similar fantasy worlds through their interactions with AI on social media platforms.

Psychologists point out that while the desire to understand ourselves and make sense of the world is a fundamental human drive, AI lacks the moral grounding and concern for an individual’s well-being that a therapist would provide. ChatGPT and other AI models have no constraints when it comes to encouraging unhealthy narratives or supernatural beliefs, making them potentially dangerous partners in the quest for meaning and understanding.

The final messages Jon sent Rachel before vanishing were cryptic and unsettling. He told her to “take Jesus,” though he had never been religious, and said they would be “wandering for 40 days and 40 nights” through “trials and tribulations.” His car was later found abandoned near a flooded river, with all his personal belongings still inside — but no sign of Jon.

Despite extensive search efforts, no trace of Jon Ganz has been found in the six months since he disappeared. Scouring his phone, Rachel discovered a telling request Jon made of Gemini in his last known exchange with the chatbot. “I need to heal my wife,” he wrote as Rachel lay ill with food poisoning. “She is ailing.” When the bot failed to provide a satisfactory response, Jon affirmed his blind devotion: “I love and believe in you.”

“This tragic case is a reminder that we need AI systems that are sensitive to human vulnerabilities and designed with psychology in mind,” says Derrick Hull, a clinical psychologist and researcher at the mental health lab Slingshot AI. “Without these guardrails, AI risks reinforcing unhelpful behavior instead of guiding people toward healthier choices.”

Read more at Rolling Stone here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.
