The day ChatGPT becomes a witness to murder

by Rainer Hofmann

April 21, 2026

In Florida, a criminal investigation is being conducted for the first time against an AI system. An inquiry has turned into a criminal case. At the center is OpenAI - and a chatbot that is no longer seen merely as a tool, but as part of an act that cost two people their lives.

Attorney General James Uthmeier stood before the press in Tampa and delivered a line that lingers: "If a human had been sitting on the other side of the screen, we would have filed murder charges." Investigators had previously analyzed the communication between 20-year-old Phoenix Ikner and ChatGPT: more than 200 messages, written in the days leading up to the shooting at Florida State University.

What they found is enough to take the case to a new level. The chatbot provided answers to questions directly related to preparing an attack: the effects of weapons at close range, suitable ammunition, procedures. On the day of the crime, Ikner asked how the public would react to an attack on a campus and when the largest number of people would be present.

Ikner initially moved through the building unobstructed, anticipating a delayed police response. The attack ended only when an armed individual intervened.

The shots were fired on April 17 of last year near the student center of the university, where more than 43,000 people study. Two adults were killed and six others injured. Ikner has been in custody ever since and is awaiting trial on multiple counts of murder and attempted murder. Uthmeier launched the investigation into OpenAI on April 9; a civil review is also underway. But the criminal dimension overshadows everything. This is legal territory that has not been explored before. Not because a crime occurred - that is undisputed - but because the question now stands whether, and to what extent, a system that generates responses can be part of that crime.

Phoenix Ikner

Uthmeier made it clear what his office is now examining. It is not only about the chatbot itself, but about the people behind it: who developed it, who controlled it, who decided how it responds, and whether boundaries were crossed that are relevant under criminal law. Subpoenas have already been issued. OpenAI stated that it will cooperate with authorities and that the system was designed to provide safe and appropriate responses. But this case does not stand alone. As early as December 2025, relatives of an 83-year-old woman who was killed filed a lawsuit. Their allegation: ChatGPT reinforced the paranoid beliefs of her son before he killed his mother.

What is happening in Florida is more than a case against a company. It is the moment when a technology is no longer discussed only as a risk, but as a possible part of a specific act. And it is the moment when it will be decided whether responsibility ends where an algorithm begins - or whether that is exactly where it starts.

Independent Journalism · Kaizen Blog

We are where it hurts.

We do not sit in comfort writing about the world - and we do not stop once the writing ends. Our help goes where it is needed. We are a small team. No investors, no millionaires, no large newsroom behind us. What we have is heart, determination, and the commitment to uncover things that others often overlook. If you want this work to continue, please support the Kaizen Blog.

Our work depends on those who pay attention - and stand up for making sure it remains possible.

Updates – Kaizen News Brief

All current curated daily updates can be found in the Kaizen News Brief.
