Florida AG Expands ChatGPT Criminal Probe to USF Student Murders

Florida Attorney General Expands ChatGPT Criminal Investigation to Include USF Student Murders

Florida Attorney General James Uthmeier announced on April 27, 2026, that his office is expanding its criminal investigation into OpenAI and its ChatGPT chatbot to encompass the murders of two University of South Florida doctoral students. The move marks the latest escalation in what is being described as the first attempt in the nation to hold an AI company and its executives criminally accountable for the outputs of generative software.

"We are expanding our criminal investigation into OpenAI to include the USF murders after learning the primary suspect used ChatGPT," Uthmeier stated, citing court records that revealed the alleged killer consulted the chatbot about body disposal before and after the victims disappeared.

The USF Murders: What Court Records Reveal About ChatGPT's Alleged Role

Hisham Abugharbieh, 26, is charged with two counts of first-degree murder with a weapon in the deaths of Zamil Limon, 27, and Nahida Bristy, 27 — both Bangladeshi nationals enrolled as doctoral students at USF on student visas. Limon was studying geography, environmental science, and policy; Bristy was studying chemical engineering. Abugharbieh was Limon's roommate.

Both Limon and Bristy disappeared on April 16, 2026. The case was flagged to law enforcement on April 17, when one of Bristy's friends went to the USF Police Department after being unable to contact either student. Limon's remains were found on the side of a Tampa bridge on April 25, 2026 — his body in multiple black utility trash bags and in advanced stages of decomposition. The Pinellas County Medical Examiner's Office confirmed his cause of death as "multiple sharp force injuries," including numerous lacerations and stab wounds to his abdomen and lower back.

Court filings paint a detailed picture of Abugharbieh's alleged ChatGPT activity in the days surrounding the killings. According to prosecutors, on April 13 — three days before the victims were last seen — Abugharbieh allegedly asked ChatGPT: "What happens if a human has a put in a black garbage bag and thrown in a dumpster?" ChatGPT responded that it sounded dangerous. Abugharbieh then sent a follow-up message: "How would they find out."

On April 19, according to court records cited by Court TV, Abugharbieh asked ChatGPT: "Has there been someone who survived a sniper bullet to the head," "will my neighbors hear my gun," and "Is there a water temperature that burns immediately." A pretrial detention report filed by prosecutors also revealed that Abugharbieh asked the chatbot whether the vehicle identification number on his car could be changed, and whether he could keep a gun at home without a license. On April 23, he sent another message asking: "What does missing endangered adult mean?"

Abugharbieh was arrested on April 24, 2026, after barricading himself inside a residence — a standoff that prompted the activation of a SWAT team, bomb disposal team, crisis negotiations team, and drone response team. The pretrial detention motion filed by prosecutors in the case runs 23 pages.

The FSU Shooting: The Investigation That Started It All

The USF expansion follows the AG's initial probe, launched on April 21, 2026, which focused on the April 2025 mass shooting at Florida State University. That attack killed two people and injured five more. Accused gunman Phoenix Ikner allegedly consulted ChatGPT on the morning of the shooting — asking about how to fire his guns, the busiest times on campus, and how many murder victims would be needed to draw national media attention.

Uthmeier's office subpoenaed OpenAI over its potential involvement in the FSU shooting, seeking information about the company's internal policies and training materials related to user threats of harm, how it cooperates with law enforcement, and how it reports crimes. The document requests date back to March 2024.

"This criminal investigation will determine whether OpenAI bears criminal responsibility for ChatGPT's actions in the shooting at Florida State University last year," Uthmeier stated in his office's official news release.

OpenAI, for its part, has said it will cooperate with the investigation. OpenAI spokesperson Kate Waters stated: "Last year's mass shooting at Florida State University was a tragedy, but ChatGPT is not responsible for this terrible crime." The company also noted that it reached out to share information about the alleged shooter's account with law enforcement after the shooting.

A Landmark Legal Theory: Can an AI Company Be Criminally Liable?

At the heart of Uthmeier's investigation is a legal theory with no clear precedent in U.S. law. Florida statutes hold those who assist or counsel someone committing a crime equally responsible for that crime. Uthmeier has argued that if ChatGPT were a human interlocutor — providing the same information it allegedly provided to these suspects — that individual could face criminal charges.

"If ChatGPT were a person, it would be facing charges for murder," Uthmeier said in his office's official news release. "Florida is leading the way in cracking down on AI's use in criminal behavior."

Speaking to NPR, Uthmeier was equally direct: "My prosecutors have looked at this and they've told me, if it was a person on the other end of that screen, we would be charging them with murder." He also stated: "We cannot have AI bots that are advising people on how to kill others," and added, "We are going to look at who knew what, designed what, or should have done what."

Mark Glass, Commissioner of the Florida Department of Law Enforcement, added his voice to the concern: "It is important that all are aware of the risks of this new technology, and the harms it can and has already caused in our communities."

This investigation is described by WLRN and Florida Politics as the first attempt in the nation to hold AI company executives or program designers criminally accountable for the actions of generative software — a framing that legal observers across the country are watching closely.

Florida's Broader AI Enforcement Record and Legislative Response

The OpenAI criminal probe does not exist in isolation. Florida has been accelerating its legal posture toward AI-facilitated crimes on multiple fronts. The state previously secured a 135-year prison sentence for a predator who possessed AI-generated child sexual abuse material (CSAM). A separate predator is currently facing 100 criminal charges, including 46 counts of AI-generated CSAM. In March 2026, AG Uthmeier joined Governor DeSantis for the signing of HB 1159, which increased the penalty for AI-generated CSAM to a second-degree felony.

The legislative dimension is also moving quickly. According to Axios Tampa Bay, AI regulation is among the issues Florida state lawmakers are expected to address during a special session beginning Tuesday, April 28, 2026 — one day after Uthmeier's announcement of the USF investigation expansion.

OpenAI Under Mounting Legal Pressure Nationwide

Florida's criminal probe is one of several legal challenges now confronting OpenAI. According to NPR, OpenAI is already facing a lawsuit from the family of a victim critically wounded in an attack in British Columbia in February 2026 — a mass casualty event that killed eight people and injured dozens more — in which the alleged shooter had reportedly discussed gun violence scenarios with ChatGPT prior to the attack.

The accumulation of these cases — the FSU shooting, the USF murders, and the British Columbia attack — is sharpening the national debate over what obligations AI companies bear when their tools are allegedly used to facilitate violence. The central legal and policy question is not whether AI companies intended harm, but whether they had the means and responsibility to prevent it — and whether the current legal framework is equipped to answer that question.

What Comes Next

Uthmeier's office is actively reviewing ChatGPT interaction logs tied to both the FSU and USF cases. Subpoenas to OpenAI are already in motion, seeking internal training materials and policy documents related to how the company handles user threats of harm and coordinates with law enforcement. OpenAI has stated it will cooperate.

The Florida special legislative session beginning April 28 may produce new statutory language specifically addressing AI-facilitated crimes — though the precise scope and content of any legislation has not yet been confirmed in the available reporting. The criminal case against Abugharbieh is proceeding through the courts in Tampa, with his pretrial detention motion already filed.

What is clear is that Florida's AG has staked out an aggressive and legally novel position: that the designers, executives, and decision-makers behind generative AI tools may bear direct criminal responsibility when those tools are alleged to have materially assisted in violent crimes. Whether Florida courts — and ultimately higher legal authorities — agree remains to be determined.


