WeSearch

Florida Murder Suspect Reportedly Asked ChatGPT What Happens If You Put Someone in a Dumpster

Mike Pearl · 3 min read
#chatgpt#florida#openai#crime#artificial intelligence
⚡ TL;DR · AI summary

Florida's attorney general has expanded a criminal investigation into OpenAI and ChatGPT to include two recent murders at the University of South Florida, following reports that a suspect used the AI chatbot to ask about disposing of bodies and tracking stolen phones. The probe initially stemmed from a 2025 shooting at Florida State University, where the suspect allegedly had extensive interactions with ChatGPT. Published court documents suggest the USF suspect asked ChatGPT about putting someone in a dumpster and about tracking iPhone users, though it is unclear how influential the AI was in the crimes. OpenAI expressed condolences and stated that it is cooperating with law enforcement.

Original article
Gizmodo · Mike Pearl
Read full at Gizmodo →
Full article excerpt

Florida’s attorney general announced that his office would begin a criminal probe into ChatGPT’s potential role in homicides committed in his state, and that investigation appears to be deepening as more tragic deaths with supposed connections to ChatGPT occur. In addition to the state eyeing ChatGPT over a crime from more than a year ago, two horrific deaths earlier this month at the University of South Florida now have a potential ChatGPT connection as well, and some partial interactions between a suspect and the chatbot have been published, including one about what happens when people are thrown in dumpsters. Attorney General James Uthmeier announced about a week ago that his office would investigate OpenAI for possible liability related to crimes in the state, particularly an April 17, 2025 shooting at a different school, Florida State University, in which two people died and six were injured.
An attorney for one of the victims said that the suspect was in “constant communication” with OpenAI’s chatbot, and claimed the software “may have advised the shooter how to commit these heinous crimes.” The two separate incidents are now parts of the same criminal probe into ChatGPT, according to Uthmeier, who posted on X Monday morning, “We are expanding our criminal investigation into OpenAI to include the USF murders after learning the primary suspect used ChatGPT.”

Details were initially light on exactly how ChatGPT is alleged to have misbehaved enough to merit a criminal probe. But Axios, which has reviewed court documents from the prosecution, now has some detail and context about the actual alleged interactions between a USF suspect, Hisham Abugharbieh, and the chatbot. The students were reported missing on April 16. On April 13, Abugharbieh allegedly prompted ChatGPT with a question about what happens if a person gets “put in a black garbage bag and thrown in a dumpster.” On April 19, he allegedly asked, “Will Apple know who is the new iPhone user after the previous user[?]”

I asked the free version of ChatGPT the dumpster question while logged out, and its answer focused on the health of the ostensibly alive person thrown in the dumpster. “A person sealed in a garbage bag can’t get enough air, so suffocation can happen quickly,” it said. It gave a technical answer to the iPhone question, seemingly under the assumption that I was someone with privacy concerns who had recently bought a used iPhone. The answer to the question about the term “missing endangered adult” more or less just rephrased it with more bold text: “a term used by law enforcement to describe a missing person who is 18 or older and believed to be at higher risk of harm.” These tests should just give you a flavor of ChatGPT’s behavior generally.
It’s not clear what the suspect’s other ChatGPT use may have been, or how much information he shared with the chatbot. For what it’s worth, I included the three different prompts in the same ChatGPT session, and…

This excerpt is published under fair use for community discussion. Read the full article at Gizmodo.


