Ex-NFL Player Darron Lee Used AI Bot Before Reporting Girlfriend's Death
Tennessee prosecutors allege former Jets linebacker Darron Lee consulted an AI chatbot to cover up his girlfriend's killing before calling 911.
A former NFL linebacker asked an artificial intelligence chatbot what to do about a woman’s injuries and searched for signs consistent with a fall before calling 911 to report his girlfriend’s death, a Tennessee prosecutor alleged in court this week.
Darron Lee, a first-round pick by the New York Jets in the 2016 NFL Draft, faces a murder charge in the death of his girlfriend at their Nashville-area home. Prosecutors laid out the AI consultation as part of their case against him, arguing the searches reveal a calculated attempt to construct a cover story before law enforcement arrived.
According to prosecutors, Lee queried the AI bot for information about injury symptoms and what physical signs a fall typically produces. The timing of those searches, authorities contend, points to premeditation rather than the panicked response of someone who had just witnessed a tragic accident.
The case puts a sharp focus on how law enforcement and prosecutors are now mining AI chat histories alongside browser searches and text messages as standard investigative tools. Digital forensics have long been central to criminal cases, but the emergence of conversational AI platforms gives investigators a new category of evidence. When a person types a question into a chatbot, that query can be preserved, timestamped, and retrieved, creating a detailed record of what someone was thinking and seeking in the minutes or hours surrounding a crime.
Defense attorneys have not yet made their full case publicly, and Lee has not been convicted of any crime. The legal proceedings are ongoing.
The facts alleged by the prosecution, if proven at trial, would represent one of the more explicit documented examples of a defendant turning to generative AI in the immediate aftermath of a violent incident. Legal observers have noted that while people have long used search engines to look up legally sensitive information, AI chatbots produce a more conversational and specific exchange, potentially providing prosecutors with richer evidence about intent.
Lee was selected 20th overall by the Jets out of Ohio State, where he was a standout defender. He played several seasons in the NFL before his career ended. He had been living in Tennessee at the time of the incident.
Prosecutors in Nashville are arguing that the sequence of events, including the AI queries, the gap before the 911 call, and the nature of the victim’s injuries, paints a picture inconsistent with an accidental death. Medical examiner findings have not been fully detailed in public court filings reviewed for this report.
The case will likely draw attention from legal scholars and civil liberties advocates who have raised questions about how AI companies store user data, under what circumstances they respond to law enforcement subpoenas, and what privacy expectations users reasonably hold when they type sensitive queries into a chatbot. Most major AI platforms include data retention policies in their terms of service, but many users remain unaware of how that data can be accessed by third parties, including government investigators with a valid legal order.
For prosecutors, the AI query record functions much like a search history, but one that is potentially more damning. A Google search for “signs of a fall” might suggest innocent curiosity. The same question typed into a chatbot moments after a woman is found dead, followed by questions about injury management, tells a more pointed story, at least in the prosecution’s framing.
Lee’s case is scheduled to move forward in Tennessee courts. No trial date has been publicly confirmed as of this writing.
The outcome will be decided by the evidence presented at trial and the judgment of a jury. What the case has already demonstrated is that anyone who reaches for their phone or computer in a moment of crisis, whether out of panic, grief, or something more sinister, may be creating a record that follows them into a courtroom. AI bots are not confessionals. They are software platforms that log what users ask, and those logs are increasingly ending up in the hands of investigators.