Two U.S. judges recently retracted their rulings after lawyers alerted them to filings containing significant inaccuracies, including fabricated case details and misattributed quotes. The errors highlight potential problems with the growing use of artificial intelligence in legal research and court submissions.
In New Jersey, Judge Julien Neals withdrew his denial of a motion to dismiss a securities fraud case after learning that the decision rested on filings with "pervasive and material inaccuracies," including numerous fabricated quotes and misstated lawsuit outcomes.
Similarly, in Mississippi, Judge Henry Wingate replaced his original temporary restraining order concerning a state law on diversity, equity, and inclusion programs. Lawyers informed the judge that the initial decision relied on the purported testimony of individuals whose declarations were not present in the case record.
In both cases, attorneys quickly identified the errors, prompting the judges to revise or retract their orders. The incidents coincide with the rapid spread of generative AI across professions, particularly among younger workers, and the inaccuracies resemble errors typical of AI tools, such as "hallucinated" quotes and incorrect case citations.
For attorneys, the accuracy of court filings is paramount. The American Bar Association emphasizes that lawyers are responsible for the veracity of everything in their submissions, including AI-generated content. In recent cases, courts have sanctioned law firms and attorneys for submitting AI-generated filings that contained fabricated quotes and incorrect legal citations.
7 Comments
Coccinella
These incidents show a disregard for thoroughness in the legal field. The consequences of AI errors could be catastrophic!
Bermudez
Kudos to the attorneys who caught these inaccuracies! It shows the value of experienced legal professionals even in the age of AI.
Coccinella
This is a failure not just of the technology, but of the legal professionals who chose to use it without proper verification. They should be held accountable!
Matzomaster
With the rise of technology, it’s essential for the legal community to stay vigilant. These situations encourage better practices moving forward.
Karamba
AI should not be used in an industry where accuracy is crucial. The legal system is no place for these “hallucinations” mentioned in the article!
Rotfront
It's important to scrutinize all information, including AI-generated content. This incident highlights the need for thorough fact-checking!
Loubianka
It’s good to see judges taking accountability. Acknowledging errors is part of ensuring justice is served!