Relying on ChatGPT’s False Cases Leads to Trouble

by Digital Brainiacs

A legal professional’s decision to employ ChatGPT for legal research backfired when he cited several non-existent cases in a court filing.


The lawyer, representing Roberto Mata in his lawsuit against Avianca Airlines, submitted a 10-page brief that cited fictional court decisions and fabricated quotations. Among the purported cases were Martinez v. Delta Air Lines, Zicherman v. Korean Air Lines, and Varghese v. China Southern Airlines, which supposedly addressed federal law and the “tolling effect of the automatic stay on a statute of limitations.”

Assured by ChatGPT that these cases were legitimate, the lawyer proceeded with the submission, unaware that the AI had conjured up entirely fictional content.

Unsurprisingly, neither the airline’s attorneys nor the judge could locate any trace of the cited decisions or the quotations referenced in the brief.

Consequently, the airline’s legal team notified the judge of their inability to locate the cited decisions.

