ChatGPT cited 'bogus' cases for a New York federal court filing. The attorneys involved may face sanctions.


OpenAI’s popular chatbot had “hallucinated” — a term for when AI systems simply invent false information — and spat out cases and arguments that were entirely fiction.
The Thurgood Marshall U.S. Courthouse in New York on Jan. 31, 2023. Roy Rochlin / Getty Images file

Roberto Mata’s lawsuit against Avianca Airlines wasn’t so different from many other personal injury suits filed in New York federal court. Mata and his attorney, Peter LoDuca, alleged that Avianca caused Mata personal injuries when he was “struck by a metal serving cart” on board a 2019 flight bound for New York.

Avianca moved to dismiss the case. Mata’s lawyers predictably opposed the motion and cited a variety of legal decisions, as is typical in courtroom spats. Then everything fell apart.

Avianca’s attorneys told the court that they couldn’t find numerous legal cases that LoDuca had cited in his response. Federal Judge P. Kevin Castel ordered LoDuca to provide copies of nine judicial decisions he had apparently relied on.

In response, LoDuca filed the full text of eight cases in federal court. But that only deepened the problem, the judge said in a filing: the texts themselves were fictitious, amounting to what appeared to be “bogus judicial decisions with bogus quotes and bogus internal citations.”

The culprit, it would ultimately emerge, was ChatGPT, which had hallucinated the cases and arguments out of whole cloth. It appeared that LoDuca and another attorney, Steven Schwartz, had used the chatbot to generate the motions and the subsequent legal text.

Schwartz, an associate at the law firm of Levidow, Levidow & Oberman, told the court that he had been the one tooling around on ChatGPT, and that LoDuca had “no role in performing the research in question,” nor “any knowledge of how said research was conducted.”

Opposing counsel and the judge had flagged early on that the cases could not be found, giving the attorneys involved an opportunity to admit the error.

Instead, LoDuca and his firm seemed to double down on ChatGPT, using it not just for the initial problematic filing but also to generate the false legal decisions when asked to produce them. Now LoDuca and Schwartz may face judicial sanctions, which could even lead to disbarment.

The filing opposing Avianca’s motion was “replete with citations to non-existent cases,” according to a court order.

“The Court is presented with an unprecedented circumstance,” Castel said, and set a June 8 hearing at which both LoDuca and Schwartz will be called to explain themselves. Neither attorney responded to a request for comment.

