The man accused of perpetrating the deadly shooting at Florida State University in April 2025 was in “constant contact” with ChatGPT while planning the attack, according to attorneys representing one victim’s family who plan to sue OpenAI.

The New York Post reports that attorneys for the family of Robert Morales, one of two people killed in the mass shooting at Florida State University (FSU), claim the alleged shooter used the AI chatbot ChatGPT to help plan the attack. The law firm Brooks, LeBoeuf, Foster, Gwartney and Hobbs announced plans to file a lawsuit against OpenAI seeking to hold the company accountable for their client’s death.

Morales, a 57-year-old Aramark worker and father from Tallahassee, was fatally shot when Phoenix Ikner allegedly opened fire on the campus on April 17, 2025. The shooting also claimed the life of Tiru Chabba, a 45-year-old Aramark vendor from Greenville, South Carolina, and left six students wounded.

According to a statement provided to WCTV by the law firm, they “have been advised that the shooter was in constant communication with ChatGPT leading up to the shooting.” The attorneys further stated they “have reason to believe that ChatGPT may have advised the shooter how to commit these heinous crimes.”

Court records reveal that more than 270 images of ChatGPT conversations are listed as exhibits in the case, though the specific content of these messages has not been publicly disclosed.

OpenAI responded to the allegations by confirming it “identified a ChatGPT account believed to be associated with the suspect” shortly after the shooting occurred. The company stated it “proactively shared this information with law enforcement and cooperated with authorities.” An OpenAI spokesperson told the outlet that the company “built ChatGPT to understand people’s intent and respond in a safe and appropriate way, and we continue improving our technology.”

The shooting began outside FSU’s student union just before noon on April 17, 2025. Police say Ikner, who was enrolled at the public university at the time, used a service pistol belonging to his stepmother, Jessica Ikner, a deputy with the Leon County Sheriff’s Office. He also carried a shotgun but does not appear to have used it during the attack.

Police quickly responded to the incident and shot Ikner, leaving his face severely disfigured, before taking him into custody. He now faces charges including first-degree murder, attempted murder, and other related offenses. Investigators have indicated the motive for the shooting remains unclear, and Ikner did not appear to have any prior connections to his victims.

In addition to planning legal action against OpenAI, Morales’ attorneys have also suggested the Leon County Sheriff’s Office could bear some liability. In a September letter to LCSO, the law firm noted that Ikner had participated in the department’s Youth Advisory Council, where he “was allegedly taught about firearms and displayed behavior that should’ve raised concerns.”

The attorneys wrote that “Mr. Ikner was not mentally stable and should not be around guns, much less taught how to use them.” They argued that “The Leon County Sheriff’s Office’s handling of Mr. Ikner, as described more fully herein, was at least part of the cause of the murder of Mr. Robert Morales.”

Breitbart News previously reported that Ikner had been prescribed medication for “emotional dysregulation.”

OpenAI also faces a separate lawsuit over a Canadian school shooting earlier this year, which alleges the company knew the shooter was engaging in threatening behavior but failed to alert authorities:

According to the court documents, shooter Jesse Van Rootselaar, who was 17 years old at the time, opened a ChatGPT account during the summer of 2025. The lawsuit claims that Van Rootselaar described various scenarios involving gun violence over the course of several days through interactions with the AI platform.

The legal filing alleges that 12 OpenAI staff members monitoring ChatGPT identified Van Rootselaar’s inquiries as indicating an imminent risk of serious harm to others. These employees reportedly recommended that Canadian law enforcement be notified of the concerning activity and escalated the matter to company leadership. However, the lawsuit claims that company executives subsequently rebuffed their employees’ request to contact authorities.

Breitbart News social media director and author Wynton Hall explains in his instant bestseller, Code Red: The Left, the Right, China, and the Race to Control AI, that conservatives must develop a plan to deal with the dark side of AI, whether it is used to indoctrinate students in the classroom, to sexualize and groom them, or to cause a mentally ill person to spiral into a dangerous condition.

Senator Marsha Blackburn (R-TN), who was named one of TIME’s 100 Most Influential People in AI, praised Code Red as a “must-read.” She added: “Few understand our conservative fight against Big Tech as Hall does,” making him “uniquely qualified to examine how we can best utilize AI’s enormous potential, while ensuring it does not exploit kids, creators, and conservatives.” Award-winning investigative journalist and Public founder Michael Shellenberger calls Code Red “illuminating,” “alarming,” and describes the book as “an essential conversation-starter for those hoping to subvert Big Tech’s autocratic plans before it’s too late.”

Read more at the New York Post here.

Lucas Nolan is a reporter for Breitbart News covering issues of AI, free speech, and online censorship.
