Legal Q&A
How to Detect Fake AI Legal Citations
Fake AI citations usually look plausible before they fail source verification. A safe workflow treats every citation-shaped string as a candidate until the lawyer opens the source, confirms the proposition, and saves the verification trail.
A lawyer reviewing AI-assisted text or a draft brief needs a practical way to find fabricated cases or statutes.
Do not trust reporter-shaped text until it resolves to a source.
Verify both existence and proposition support.
Keep blocked citations visible in the audit trail.
Start with source existence
The first pass is mechanical: open the case, statute, regulation, or rule in a public or official source. If no path to the source can be found, the citation should not move forward.
- Check party names, reporter, court, and year.
- Do not rely on search snippets alone.
- Preserve the source URL with the work product.
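For teams building tooling around this first pass, the check can be modeled as a record that only advances once every field matches and a source URL is preserved. This is a minimal sketch under stated assumptions: the `Citation` class and its fields are illustrative, not part of any real citator or research API.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Citation:
    """A citation-shaped string found in a draft; untrusted by default."""
    text: str                            # e.g. "Smith v. Jones, 123 F.3d 456 (9th Cir. 1999)"
    source_url: Optional[str] = None     # filled in only after the source is opened
    checks: dict = field(default_factory=dict)

    def record_existence_check(self, party_names: bool, reporter: bool,
                               court: bool, year: bool,
                               source_url: Optional[str]) -> None:
        """Record the mechanical first pass; every field must match to advance."""
        self.checks["existence"] = all([party_names, reporter, court, year,
                                        source_url is not None])
        self.source_url = source_url     # preserve the URL with the work product

    @property
    def may_advance(self) -> bool:
        return self.checks.get("existence", False)

# A citation that never resolves to a source stays blocked.
c = Citation("Smith v. Jones, 123 F.3d 456 (9th Cir. 1999)")
c.record_existence_check(party_names=True, reporter=True, court=True,
                         year=True, source_url=None)
print(c.may_advance)  # False: no source URL, so the citation does not move forward
```

Note that a search-snippet match would not set `source_url`; only opening the source does, which is how the sketch encodes the "do not rely on snippets" rule.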
Then test proposition support
A real citation can still be wrong. Read the cited passage and ask whether it supports the sentence in the draft, whether the jurisdiction fits, and whether subsequent history (reversal, overruling, a superseding statute) undermines reliance on it.
- Match each legal proposition to a passage.
- Separate binding authority from persuasive support.
- Escalate weak support before export.
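The second pass can be sketched the same way, assuming a hypothetical `SupportCheck` record; the axes below (passage match, jurisdiction fit, negative history, binding vs. persuasive) mirror the list above and are illustrative, not a real library's schema.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Authority(Enum):
    BINDING = "binding"        # controlling court in the relevant jurisdiction
    PERSUASIVE = "persuasive"  # real authority, but not controlling

@dataclass
class SupportCheck:
    proposition: str                 # the sentence in the draft
    cited_passage: Optional[str]     # the passage the lawyer actually read
    authority: Authority             # kept separate from the weakness flags
    jurisdiction_fits: bool
    negative_history: bool           # reversed, overruled, superseded, etc.

    def needs_escalation(self) -> bool:
        """Escalate before export if support is weak on any axis."""
        return (self.cited_passage is None
                or not self.jurisdiction_fits
                or self.negative_history)

check = SupportCheck(
    proposition="Notice must be served within 30 days.",
    cited_passage="...service shall be made within thirty days...",
    authority=Authority.PERSUASIVE,
    jurisdiction_fits=True,
    negative_history=True,
)
print(check.needs_escalation())  # True: negative subsequent history gets a second look
```

Keeping `authority` as a separate field, rather than folding it into `needs_escalation`, matches the list above: persuasive support is not automatically weak, but it must be labeled so the lawyer can weigh it.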
Use a blocked-citation workflow
The safest AI workflow does not hide uncertainty. It records unsupported citations, explains why they were blocked, and routes the lawyer to replacement research.
- Keep blocked items visible.
- Replace unsupported authorities with verified sources.
- Save the final reviewed version separately.
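Under the same assumptions, the blocked-citation workflow might be logged like this: unsupported items stay in the record with a reason rather than being silently deleted, and replacement research is routed back into the same trail. The `AuditTrail` and `BlockedEntry` names are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class BlockedEntry:
    citation: str
    reason: str                          # why verification failed
    replacement: Optional[str] = None    # verified replacement authority, once found

@dataclass
class AuditTrail:
    entries: list = field(default_factory=list)

    def block(self, citation: str, reason: str) -> None:
        """Record an unsupported citation instead of silently dropping it."""
        self.entries.append(BlockedEntry(citation, reason))

    def replace(self, citation: str, verified_authority: str) -> None:
        """Route the lawyer's replacement research back into the trail."""
        for entry in self.entries:
            if entry.citation == citation:
                entry.replacement = verified_authority

    def still_blocked(self) -> list:
        """Items that must not reach the exported draft."""
        return [e for e in self.entries if e.replacement is None]

trail = AuditTrail()
trail.block("Doe v. Roe, 999 U.S. 1 (2031)", "no source found in any reporter")
print(len(trail.still_blocked()))  # 1: visible until a verified replacement exists
```

Because blocked entries are never deleted, the trail itself becomes the audit record: it shows what was caught, why, and what verified authority replaced it.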
Frequently Asked Questions
Is a citation safe if the case exists?
No. Existence is only the first step. The source must also support the proposition for the relevant jurisdiction and procedural posture.
What should happen to a citation that cannot be found?
It should be blocked or removed until a verified replacement authority is found.