AI-Hallucinated Case Law
Appellate court to trial judge: You know these cases are made up, right?
by Ahmore Burger-Smidt, Director and Head of Regulatory
Research is Difficult and Time-Consuming
Over the past year, we have read more than once about lawyers citing non-existent cases before courts in various countries. Lawyers have apologised, and judges have been outspoken.
To date, the legal system has largely succeeded in catching such incorrect authorities before our courts have acted on them. Diligent judges have ensured that fake cases are identified before they result in real harm.
But what if the system fails to identify and flag Beavis v. Butthead, and a busy or apathetic judge rubber-stamps one side’s proposed order without thoroughly examining the case law relied upon? The potential consequences of such oversight are significant and should serve as a stark reminder of the need for vigilance.[1]
In Georgia (the United States), the Georgia Court of Appeals was confronted with exactly this situation: a trial judge had issued an order based on fake cases. While the appellate court put a stop to the matter, the fact that it got that far should terrify the legal fraternity.
The case, Shahid v. Esaam[2], called upon the Georgia Court of Appeals to step in where a final judgment and decree of divorce had been issued. In response to the wife’s objection to the judgment due to improper service, the husband’s legal team included two fabricated cases in their court submissions. The trial judge accepted the husband’s argument, issuing an order based in part on the fake cases.
Interestingly, the Appellate Court stopped short of definitively blaming AI for the fake cases, instead laying out its theory of the case as follows –
“As noted above, the irregularities in these filings suggest that they were drafted using generative AI. In his 2023 Year-End Report on the Federal Judiciary, Chief Justice John Roberts warned that ‘any use of AI requires caution and humility.’ Roberts specifically noted that commonly used AI applications can be prone to ‘hallucinations,’ which caused lawyers using those programs to submit briefs with cites to non-existent cases.”[3]
In a 2023 opinion, Mata v. Avianca, Inc.[4], a federal district court noted that –
“there is nothing inherently improper about using a reliable artificial intelligence tool for assistance. But existing rules impose a gatekeeping role on attorneys to ensure the accuracy of their filings.”
The following is a list of fictitious cases included in the Shahid v. Esaam matter –
- In the Interest of J. M. B., 296 Ga. 786 (2015)
- Miller v. Miller, 288 Ga. 274 (702 SE2d 888) (2010)
- Brown v. Brown, 264 Ga. 48 (1994)
- Walker v. Georgia, 309 Ga. 749 (2021)
- Ramos v. Ramos, 279 Ga. 487 (2005)
- McRae v. McRae, 263 Ga. 303 (1993)
- Johnson v. Johnson, 285 Ga. 408 (2009)
But the AI hallucinations went further, also producing four citations to real cases that have nothing to do with the propositions for which they were cited –
- Blasingame v. Blasingame, 249 Ga. 791 (294 SE2d 519) (1982)
- Wilson v. Wilson, 282 Ga. 728 (2007)
- Brown v. Tomlinson, 246 Ga. 513 (1980)
- Jones v. State, 277 Ga. 36 (2003)
South African courts have encountered the same problem. In the matter of Mavundla v MEC: Department of Co-Operative Government and Traditional Affairs KwaZulu-Natal and Others[5], a supplementary notice of application for leave to appeal was submitted to the court. This notice included several case law references in support of the grounds of appeal.
During the preparation of her ruling, it became apparent to Bezuidenhout J that no such cases existed in the South African Law Reports, the All South African Law Reports, or on SAFLII. The legal references cited in the supplementary notice were fictitious. And the problem ran deeper: the judge tasked two court law researchers with verifying all the cited cases. Of the nine cases referenced, only two were found to exist, and one of those had an incorrect citation.
In Northbound Processing (Pty) Ltd v the South African Diamond and Precious Metals Regulator (Case Number: 2025-072038), the Gauteng High Court delivered a clear warning that neither good intentions nor an apology will excuse lawyers who include non-existent cases in submissions to the court, even where those cases are ultimately not relied upon.
So be warned: AI sometimes hallucinates. But what is an AI hallucination? According to IBM, it can be defined as follows –
“AI hallucination is a phenomenon wherein a large language model (LLM) – often a generative AI chatbot or computer vision tool – perceives patterns or objects that are nonexistent or imperceptible to human observers, creating outputs that are nonsensical or altogether inaccurate.”[6]
AI offers a powerful set of tools for improving efficiency, but its use must be approached with diligence. Over-trusting AI’s capabilities without human oversight is a problem. The responsibility lies with us, the legal professionals, to ensure that AI is used as a tool, not a replacement for human judgment.
The Core Message
Human, check the health of your AI model. To what extent is it prone to hallucination? AI can help, but it won’t replace the legal mind, at least not for now. Remember, your judgment is irreplaceable in the legal process.
[1] Trial Court Decides Case Based On AI-Hallucinated Caselaw, Above the Law (accessed 25 July 2025).
[2] Shahid v. Esaam, Court of Appeals of Georgia, First Division, Case No. A25A0196 (decided 30 June 2025).
[3] Shahid v. Esaam, at 1.
[4] Mata v. Avianca, Inc., 678 FSupp3d 443, 448 (SDNY 2023).
[5] Mavundla v MEC: Department of Co-Operative Government and Traditional Affairs KwaZulu-Natal and Others [2025] ZAKZPHC 2.
[6] IBM, https://www.ibm.com/think/topics/ai-hallucinations (accessed 9 July 2025).