How One Lawyer Got Sanctioned for a Fake Case Citation He Never Wrote
When AI Makes a Mistake, Lawyers Pay the Price
Artificial intelligence tools like ChatGPT have made their way into nearly every profession, and the legal field is no exception. Lawyers are using these tools to speed up research, draft documents, and find case citations. But what happens when the AI gets it wrong? One attorney found out the hard way — and the consequences were serious.
This is a story about how a lawyer ended up facing professional sanctions for citing a court case that never existed. The twist? He didn’t write the fake citation himself. His AI tool did. And that distinction, it turns out, didn’t matter much to the court.
The Case That Wasn’t There
The incident came to light when opposing counsel tried to look up a case that had been cited in a legal brief. The case couldn’t be found — not in legal databases, not in court records, nowhere. That’s because it was completely fabricated. ChatGPT had generated a realistic-sounding but entirely fictional case citation, complete with a made-up case name, docket number, and supposed legal ruling.
This kind of AI error has a name: a hallucination. It’s when an AI system produces information that sounds confident and believable but is simply not true. These hallucinations are a known problem with large language models, and legal experts have been warning about them for years.
The attorney who submitted the brief said he had relied on the AI-generated research without independently verifying the citations. When the court discovered the error, the attorney faced formal sanctions — meaning official punishment from the court for submitting false information.
What Are Attorney Sanctions?
Attorney sanctions are penalties that courts or bar associations can impose on lawyers who violate professional rules. They can take several forms, including:
- Monetary fines — the attorney may be required to pay money as a penalty
- Written reprimands — a formal statement of wrongdoing placed in the lawyer’s record
- Suspension — temporary removal of the attorney’s license to practice law
- Disbarment — permanent loss of the ability to practice law
In this case, the attorney received a combination of financial penalties and a formal reprimand. While he was not disbarred, the sanctions still carried significant professional and financial consequences. His reputation also took a very public hit.
Why “I Didn’t Write It” Wasn’t a Valid Defense
One of the most important lessons from this story is that courts hold attorneys personally responsible for everything they submit. It doesn’t matter who — or what — produced the content. When a lawyer puts their name on a document and files it with a court, they are certifying that its contents are accurate and presented in good faith.
In federal court, this obligation comes from Rule 11 of the Federal Rules of Civil Procedure, which requires attorneys to certify that the legal contentions in their filings are warranted by existing law and that factual contentions have evidentiary support. Similar rules exist in state courts as well. Using AI doesn’t create an exception to these rules. The court’s position was straightforward: the attorney should have checked the citations before filing.
Judges across the country have been increasingly direct about this issue. Some courts have even started requiring attorneys to disclose when they use AI tools and to certify that any AI-generated content has been independently reviewed and verified.
How ChatGPT Hallucinations Happen
To understand why this happened, it helps to know a little about how AI language models work. Tools like ChatGPT are trained on massive amounts of text from the internet and other sources. They learn to predict what words and sentences should come next based on patterns in that data.
The problem is that these models don’t actually “know” things the way humans do. They don’t check a database of real court cases when asked for legal citations. Instead, they generate text that sounds like a legal citation based on patterns they’ve seen. If the model has seen enough legal writing, it can produce something that looks completely legitimate — but is totally made up.
This is especially dangerous in law, where a case citation needs to be exactly right. There’s no room for “close enough.” Either the case exists and says what you claim it says, or it doesn’t.
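To make the point concrete, here is a deliberately crude toy sketch (not a real language model, and not how ChatGPT is actually built) of a generator that has learned only the *surface pattern* of a federal case citation. Every output is perfectly formatted; whether any of them corresponds to a real case is left entirely to chance — which is essentially the failure mode behind a hallucinated citation. All party names, reporters, and courts below are arbitrary placeholders.

```python
import random

# Toy illustration: a "model" that knows only what a citation looks like.
# It has no database of real cases to check against, so every output is
# citation-shaped but unverified, exactly like a hallucinated citation.
PARTIES = ["Smith", "Varghese", "Martinez", "Holbrook", "Acme Corp."]
REPORTERS = ["F.3d", "F. Supp. 2d", "F.4th"]
COURTS = ["2d Cir.", "9th Cir.", "S.D.N.Y.", "11th Cir."]

def plausible_citation() -> str:
    """Assemble a citation-shaped string from surface patterns alone."""
    return (f"{random.choice(PARTIES)} v. {random.choice(PARTIES)}, "
            f"{random.randint(100, 999)} {random.choice(REPORTERS)} "
            f"{random.randint(1, 1500)} ({random.choice(COURTS)} "
            f"{random.randint(1990, 2023)})")

print(plausible_citation())
# e.g. "Holbrook v. Martinez, 512 F.3d 884 (9th Cir. 2011)" in form,
# but no such case need exist anywhere.
```

A real large language model is vastly more sophisticated, but the core limitation the sketch illustrates is the same: generating text that matches a pattern is not the same as retrieving a verified fact.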
This Wasn’t an Isolated Incident
This attorney’s story made headlines, but he is far from alone. There have been multiple high-profile cases in recent years where lawyers have been caught submitting AI-generated fake citations. Some of the most notable examples include:
- A New York case where two attorneys submitted a brief with at least six fake case citations and were fined $5,000 each
- A Colorado attorney who cited a non-existent case and was required to take additional ethics training
- Several cases where judges ordered attorneys to appear in person to explain AI-related errors in their filings
These cases have pushed courts to take a harder look at how AI is being used in legal practice. Many judges are now more alert to citations that seem unusual or are hard to verify.
The Legal Ethics Problem at the Center of All This
Legal ethics rules exist to protect clients, courts, and the public. They require lawyers to be honest, competent, and diligent. When AI tools introduce errors into legal work, those rules don’t bend to accommodate new technology.
The American Bar Association and various state bar associations have started issuing guidance on the use of AI in legal work. The general message is consistent: AI can be a useful tool, but it cannot replace a lawyer’s duty to verify facts, research cases properly, and act in their client’s best interest.
Some key ethical concerns raised by this situation include:
- Competence — lawyers are required to stay current with the tools they use, including understanding the limitations of AI
- Candor to the tribunal — lawyers must be honest with courts and cannot submit false information, even unintentionally
- Supervision — attorneys are responsible for the work done under their name, whether by a paralegal, an associate, or an AI tool
Could This Lead to a Malpractice Claim?
Beyond court sanctions, attorneys who use AI carelessly may also face legal malpractice claims from their clients. Malpractice happens when a lawyer fails to provide competent representation and that failure causes harm to the client.
If a case is dismissed or a client loses because their attorney submitted fake citations, the client has a strong argument that they were harmed by their lawyer’s negligence. Malpractice claims can result in significant financial damages paid to the client, in addition to any sanctions from the court or bar association.
Legal malpractice insurance may cover some of these costs, but insurers are already beginning to look more closely at AI-related claims. Some insurance policies may not cover losses that result from a failure to verify AI-generated content.
What Lawyers Should Be Doing Instead
AI isn’t going away, and most legal experts agree it doesn’t need to. Used correctly, it can save time and improve efficiency. The problem isn’t the technology itself — it’s using it without proper checks in place. Here’s what responsible use looks like:
- Always verify citations independently — use Westlaw, LexisNexis, or another trusted legal database to confirm every case cited in a document
- Treat AI output as a starting point, not a final answer — use it to brainstorm or find leads, then do the real research yourself
- Disclose AI use when required by the court — some jurisdictions now require this, and it’s becoming more common
- Train your staff — anyone using AI tools in your office should understand both their capabilities and their limitations
- Stay updated on ethics guidance — bar associations are regularly issuing new guidance on AI use, and attorneys have an obligation to follow it
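The verification step above can even be given a mechanical first pass. The sketch below (an illustration, not legal software) pulls citation-shaped strings out of a draft so that each one can be checked by hand in a real database such as Westlaw or LexisNexis. The regular expression is a simplified assumption that covers only a couple of common federal formats and will miss many others — it flags candidates for human review; it does not and cannot confirm that a case exists.

```python
import re

# Simplified pattern for party names: capitalized words, optionally
# several in a row (e.g. "Acme Corp."). A rough assumption, not Bluebook.
NAME = r"[A-Z][A-Za-z.'&\-]*(?: [A-Z][A-Za-z.'&\-]*)*"

# Matches e.g. "Smith v. Jones, 123 F.3d 456" or "Doe v. Roe, 45 U.S. 678".
CITATION_RE = re.compile(
    NAME + r" v\. " + NAME +
    r", \d+ (?:F\.(?: Supp\.)? ?(?:2d|3d|4th)?|U\.S\.) \d+"
)

def citation_checklist(brief_text: str) -> list[str]:
    """Return every citation-shaped string found, for manual verification."""
    return CITATION_RE.findall(brief_text)

draft = ("As held in Smith v. Jones, 123 F.3d 456, sanctions apply. "
         "See also Doe v. Roe, 45 U.S. 678.")
for cite in citation_checklist(draft):
    print("VERIFY:", cite)
```

The design choice matters: the script’s only job is to produce a checklist, keeping the lawyer — not the tool — as the final verifier of every citation.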
What This Means for Clients
If you’re a client working with an attorney, this story is a good reminder to ask questions. You have a right to know how your lawyer is handling your case and what tools they’re using. You don’t need to be anti-technology, but you do deserve to know that your representation is careful, accurate, and grounded in real law.
If you’re ever concerned about the quality of your legal representation, you can contact your state’s bar association to ask about filing a complaint or seeking a second opinion from another attorney.
A Wake-Up Call for the Legal Profession
The story of this attorney and his AI-generated fake citation is more than just a cautionary tale about one person’s mistake. It’s a signal that the legal profession is in the middle of a major shift, and the rules haven’t fully caught up yet.
Courts are paying attention. Bar associations are issuing guidance. And lawyers across the country are being forced to think carefully about how they use these powerful new tools. The good news is that awareness is growing. The bad news is that some attorneys are still learning this lesson the hard way.
At the end of the day, the law is built on trust — trust that the facts are real, that the cases exist, and that the attorneys presenting them have done their homework. No AI tool, no matter how advanced, can take the place of that fundamental responsibility.