Arizona court sanctions lawyer for AI-generated false citations: Judge revokes attorney's pro hac status and imposes multiple sanctions after majority of legal citations were fabricated by AI. (ppc.land)
from Pro@programming.dev to technology@lemmy.world on 16 Aug 07:45
https://programming.dev/post/35780753

cross-posted from: programming.dev/post/35742052

An Arizona federal court issued extensive sanctions against attorney Maren Bam on August 14, 2025, after finding that her brief contained multiple artificial intelligence-generated citations to non-existent cases. The sanctions include revocation of pro hac vice status, striking the brief, and mandatory notification to state bar authorities.

#technology


vk6flab@lemmy.radio on 16 Aug 07:50 next collapse

It’s happening in Australia too. The headline in this article needs some work.

I propose:

“Soon-to-be-disbarred King’s Counsel used Assumed Intelligence instead of Actual Intelligence to argue a murder case”

www.abc.net.au/news/2025-08-15/…/105661208

raman_klogius@ani.social on 16 Aug 08:23 next collapse

Lawyers took the techbros’ blue pill

Doomsider@lemmy.world on 16 Aug 16:48 collapse

Why should you have to learn law if you’re a lawyer?

PattyMcB@lemmy.world on 16 Aug 17:31 collapse

Most of them don’t

skisnow@lemmy.ca on 16 Aug 08:29 next collapse

How naïve of me to think that the New York Avianca case in 2023 was high profile enough for lawyers to have learnt their lesson, but nope, it’s getting worse with each and every month that goes by:

www.damiencharlotin.com/hallucinations/

It doesn’t help that the most common outcomes there are “Warning” or a fine in the low thousands. If a legal practice can save $500,000 a year by avoiding doing its own research, and the worst that’s likely to happen is a warning or a $2,000 fine, then why would they not?

Decq@lemmy.world on 16 Aug 08:43 collapse

How are they not immediately disbarred for this? Surely fabricating documents and citations gets you disbarred right?

ToastedRavioli@midwest.social on 16 Aug 09:32 collapse

It doesn’t, but it should. It’s malpractice of the highest degree and shows clear disregard for properly representing a client.

dan@upvote.au on 16 Aug 08:32 next collapse

I’m amazed that these lawyers are using things like ChatGPT, when better solutions exist for the legal industry. The big legal databases (like LexisNexis) have their own AI tools that will give you actual useful results, since they’re trained on caselaw from the database rather than just using a generic model, and link to the relevant cases so you can verify them yourself.

FenrirIII@lemmy.world on 16 Aug 17:16 collapse

So is it free?

dan@upvote.au on 16 Aug 17:21 collapse

No, but law firms generally subscribe to these databases.

At least where I live, lawyers can also go to the local law library to use LexisNexis for free.

9488fcea02a9@sh.itjust.works on 16 Aug 14:23 next collapse

All the prior cases from these ChatGPT lawyers should be reviewed. What other shortcuts were they taking before? Did an innocent person end up in jail because of some prior negligence?

druidjaidan@lemmy.world on 16 Aug 17:34 collapse

A very small minority of lawyers work criminal cases.

This lawyer in particular only works on Social Security Disability claims.

PattyMcB@lemmy.world on 16 Aug 17:29 collapse

I don’t trust lawyers at all