

When AI Lies, the Legal System Pays the Price

2025-10-29 · Tufan Neupane · 5 minute read
Artificial Intelligence
Legal Tech
AI Ethics

The Case of the Non-Existent Citations

To the untrained eye, the citations in an appeal for a denied Social Security claim seemed perfectly legitimate: Brown v. Colvin, Wofford v. Berryhill, and Hobbs v. Commissioner of Social Security Administration. Each one included a case number and the initials of a real federal judge from the District of Arizona. The only problem? While the judges were real, the cases were complete fabrications.

In an August 14 ruling, U.S. District Judge Alison Bachus revealed that 12 of the 19 cases cited in the legal brief were “fabricated, misleading, or unsupported.” She sanctioned the lawyer responsible for the filing, which was “replete with citation-related deficiencies, including those consistent with artificial intelligence generated hallucinations.”

A Growing Epidemic in the Courthouse

This incident is not an isolated one. As artificial intelligence becomes more integrated into workplaces and schools, both its presence and its flaws are increasingly showing up in the legal system. According to the AI Hallucination Cases database, maintained by researcher Damien Charlotin, there have been a half-dozen federal court filings in Arizona since September 2024 containing fabricated material from AI tools like ChatGPT. That makes Arizona second only to the Southern District of Florida among U.S. jurisdictions for such filings.

Worldwide, the database has tracked 486 cases, with 324 occurring in U.S. courts. While many of these filings are from individuals representing themselves, a startling number are from legal professionals. Hallucinatory filings in the U.S. have been attributed to 128 lawyers and even two judges.

“There are now hundreds of reported instances of lawyers (including those from top law firms), expert witnesses, and even judges filing documents in court that contain hallucinated citations and other content,” noted legal researcher Matthew Dahl. In a 2024 paper, he found that when AI systems are used for legal research, “they hallucinate at a level far higher than would be acceptable for responsible legal practice.”

[Image: Screenshot of a ChatGPT conversation about AI hallucinations showing up more often in legal filings.]

ChatGPT's Warning: Verify Everything

When asked about its own role in generating fake legal cases, ChatGPT-5 acknowledged the problem. “It’s definitely important and concerning from an ethical and practical standpoint,” the AI responded. “Cases like these highlight how misuse or overreliance on AI tools without verification can cause real harm — especially in domains like law, medicine or journalism where accuracy is critical.”

The AI model went on to state that such incidents “underscore why transparency, verification and human oversight are non-negotiable.”

[Image: ChatGPT-5's advice on how to spot fake case citations.]

Legal experts warn that this trend poses a serious threat to a justice system built on precedent and case law. “When lawyers cite hallucinated case opinions, those citations can mislead judges and clients,” said Christina Frohock, a University of Miami law professor. “The hallucinations might then appear in a court order and sway an actual dispute between actual parties. ... If fake cases become prevalent and effective, they will undermine the integrity of the legal system and erode trust in judicial orders.”
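
None of this requires exotic tooling. As a rough illustration of the kind of automated screening a filer could run before submission, the sketch below looks up each cited case name in CourtListener's free case-law search and flags zero-hit citations for manual review. It is a sketch only: the endpoint URL, the q and type query parameters, and the count field in the response are assumptions based on CourtListener's documented REST API and should be checked against the current docs, and a hit does not prove a citation is sound any more than a miss proves fabrication.

    # Minimal citation-screening sketch (Python). Assumes CourtListener's
    # public search API; verify the endpoint and fields before relying on it.
    import requests

    COURTLISTENER_SEARCH = "https://www.courtlistener.com/api/rest/v4/search/"

    def case_appears_to_exist(case_name: str) -> bool:
        """Return True if a search for the case name yields at least one opinion.

        A zero-hit result is a red flag, not proof of fabrication: coverage
        gaps, alternate captions, and unpublished opinions all cause misses.
        """
        resp = requests.get(
            COURTLISTENER_SEARCH,
            params={"q": f'"{case_name}"', "type": "o"},  # "o" = case-law opinions
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json().get("count", 0) > 0

    # Screen the three citations from the Arizona filing described above.
    for name in ["Brown v. Colvin", "Wofford v. Berryhill",
                 "Hobbs v. Commissioner of Social Security Administration"]:
        flag = "" if case_appears_to_exist(name) else "  <-- no hits; verify by hand"
        print(name + flag)

The point of a screen like this is triage, not verification: anything it flags still has to be pulled and read by a human, which is exactly the oversight courts are now sanctioning lawyers for skipping.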

Facing the Consequences: Sanctions and Public Shaming

Courts are responding to this misconduct with significant penalties. In 2023, a New York federal judge fined a lawyer $5,000 for a ChatGPT-generated brief filled with fake cases. In Colorado, an attorney who lied about using AI accepted a 90-day suspension. Other consequences have included bar referrals, mandatory continuing education, and case dismissals.

In the Arizona case, Judge Bachus ordered attorney Maren Ann-Miller Bam to inform the three judges whose names were attached to the fictitious opinions. Bam must also provide a copy of the sanctions order to the judge in any future case she is involved in. In a court filing, she acknowledged the AI-generated cases and blamed another attorney for preparing the brief, assuring the court she did not intend to mislead.

A Call for Caution from the Top

Even judges are not immune. A federal judge in Mississippi admitted an order he issued contained false information from the AI tool Perplexity, blaming a clerk for the error. The State Bar of Arizona has issued guidance reminding lawyers they are responsible for verifying all AI-generated work.

This sentiment was echoed by Chief Justice John Roberts in his 2023 year-end report on the federal judiciary, where he warned that “any use of AI requires caution and humility.” He bluntly stated that citing non-existent cases is “always a bad idea.” One test of ChatGPT's legal knowledge conducted by SCOTUSblog found that it answered only 21 out of 50 questions correctly.

Despite the risks, the frequency of these incidents is rising. “Before this spring in 2025, we maybe had two cases per week,” said researcher Charlotin. “Now we’re at two cases per day or three cases per day.” He added a silver lining: “What’s good about this hallucination thing is that it casts a spotlight on sloppy, bad lawyers.”
