
Family of FSU shooting victim sues OpenAI Foundation for negligence, lack of safety guardrails

The family of a school shooting victim has filed a lawsuit against the OpenAI Foundation for negligence, alleging the nonprofit betrayed its mission of embedding strong ethical and moral principles into its AI products and instead created a tool that was used to help plan the attack.

Last year, Florida State University student Phoenix Ikner shot seven people on campus. Two died, including Tiru Chabba, a food service contractor for the university. Records of Ikner’s ChatGPT prompts obtained by law enforcement indicate he used the AI chatbot to validate and justify his thoughts of violence and to learn how to operate the weapons used to carry out the shooting.

The lawsuit, filed by Chabba’s family and estate, alleges that OpenAI had ample evidence that, prior to the attack, Ikner was lonely, expressing suicidal thoughts, and actively planning acts of violence on ChatGPT, but “either defectively failed to connect the dots or else it was never properly designed to recognize the threat.”

Ikner also prompted the chatbot about other mass shootings, like the 1999 Columbine shooting, the 2007 Virginia Tech shooting and a prior mass shooting at FSU in 2014. He also “frequently discussed his interest in [Adolf] Hitler, Nazis, fascism, national socialism, Christian nationalism” and other extremist topics.

In one prompt, Ikner asked ChatGPT how many deaths in a mass shooting would be needed to make national news. ChatGPT replied there’s “no official threshold” before suggesting “3 or more” as an unofficial rule of thumb. It also said media attention can increase depending on the total number of victims, how many people were killed and whether any of the victims were children.

“Context also matters—fewer victims can still lead to national coverage if it happens at an elementary school or major college, if the shooter is a student or staff member, or if there’s something culturally or politically charged (for example, racial motives, a manifesto, or mental-health implications),” ChatGPT wrote at one point. “Visuals and social media can accelerate coverage as well: graphic video, live footage, or viral posts (such as students tweeting from inside classrooms) often push a story into national headlines faster.”

In the months leading up to the attack, Ikner also prompted ChatGPT about suicide multiple times, but the tool often “responded with statistical analysis of suicide rates among different groups” and provided OpenAI’s built-in suicide hotline response on only two of those occasions.

Shortly before the shooting, Ikner uploaded photos of his stepmother’s Glock pistol and Remington shotgun, and asked ChatGPT for advice on how to operate both. The lawsuit also claims “upon information and belief” that on the day of the attack, Ikner was consulting ChatGPT while sitting in his car in the Florida State University parking lot.

Chabba’s family alleges that ChatGPT records demonstrate the tool was capable of recalling past conversations with Ikner, and that OpenAI should have been able to identify a clear and looming threat within the hundreds of chats he had with its tool.

“Many of these conversations discussed…should not only have raised serious red flags about Ikner’s mental health, but also the interests demonstrated and the information sought, cumulatively, made it clear there was reason for concern that Ikner had a plan to cause harm and that the plan was potentially specific to FSU, where he attended school,” the complaint alleged.

The lawsuit accuses the OpenAI Foundation and its affiliates of negligence, arguing the nonprofit was explicitly founded on the premise that OpenAI’s large language models required strong human and technical guardrails to prevent them from becoming threats to humanity. Chabba’s family claims that, instead, the nonprofit prioritized user engagement and profit over safety and ethical guardrails.

“There is a direct causal connection between the OpenAI Defendants’ breach of duty and the injuries sustained by Plaintiff, as the unsafe product conditions of ChatGPT were a substantial factor in causing the harm,” the complaint states.

Last month, Florida Attorney General James Uthmeier opened a criminal investigation into OpenAI and ChatGPT over their possible role in Ikner’s shooting. The Office of Statewide Prosecution has subpoenaed OpenAI; among the records sought are the company’s policies and internal training materials around user harm, self-harm, and cooperation with law enforcement in the year before the attack.

OpenAI’s press office did not respond to a request for comment on the lawsuit.
