From hallucinations to High Court: Can AI deliver justice?


It began as a relatively routine family law case involving divorce proceedings. It may end up a watershed moment for Australia's legal system. A Victorian solicitor submitted what appeared to be convincing citations, only to discover they were entirely fabricated by an AI chatbot.

The fallout was severe.

Justice Amanda Humphreys and her associates could find no trace of the cited cases. The solicitor admitted the list was generated by an AI tool, used without verification. He offered an unconditional apology and committed to “take the lessons learned to heart”. Nonetheless, the matter was referred to the Victorian Legal Services Board.

New South Wales Chief Justice Andrew Bell.Credit: Sam Mooy

By August 19, regulators had intervened. The lawyer lost his ability to practise as a principal, was banned from managing trust funds, and must now work under supervision – with quarterly reporting – for two years.

It was the first instance of a lawyer in Australia facing professional sanctions for using artificial intelligence in a court case.


Law Council of Australia president Juliana Warner voiced the profession’s growing concern. “Where these tools are utilised by lawyers, this must be done with extreme care,” she said, warning that a blanket ban on AI was neither practical nor proportionate.

In NSW, Chief Justice Andrew Bell has banned AI use in key evidence documents. Lawyers must now declare no AI was used in affidavits, witness statements, or character references. AI is limited to secondary legal research, and only under strict scrutiny.

Bell has flagged generative AI as one of the biggest challenges facing the justice system.

“The problem of hallucinations has not been solved,” he said in an address to the Australian Bar Association in late August. “Fabricated case names, misquoted legislation and false principles are being put before courts.

“Lawyers have been disciplined in most jurisdictions, including Australia, for relying on hallucinated AI content.

‘Without transparency, accountability and oversight, these systems risk perpetuating the very injustices they aim to resolve.’

Matt McMillan, Lander & Rogers technology lawyer

“AI hallucinations remind us that accurate, reliable and critical legal analysis currently remains a solely human capability.”

And, just weeks ago, another high-profile case reverberated through the legal system. Submissions filed on behalf of a defendant in a Supreme Court of Victoria murder case contained fabricated quotes and non-existent judgments generated by AI. The defendant's barrister told Justice James Elliott he was "deeply sorry and embarrassed". The blunder delayed proceedings and led to a third round of corrected submissions.

“It is not acceptable for AI to be used unless the product of that use is independently and thoroughly verified,” Elliott told the Supreme Court in Melbourne.

Judge Ellen Skinner.

There have now been more than 20 reported cases in Australian courts of documents containing false citations.

Whether we’re ready or not, generative AI has rapidly swept through Australia’s workplaces and classrooms, and it’s also entered our courtrooms.

It raises the question: given the risk of hallucinations, to what extent should our justice system allow AI at all? And should AI eventually wield the gavel?

The prospect of an AI judge is no longer the stuff of science fiction. Estonia has already trialled an AI system to resolve small claims disputes under €7000 ($12,000). Singapore’s courts are experimenting with AI-assisted arbitration to unclog backlogs.

At next month’s SXSW Sydney, law firm Lander & Rogers will stage a provocative experiment: a fictional high-stakes embezzlement case set in 2035, in which an AI judge will weigh evidence, assess public sentiment using algorithms and deliver a sentence. The session will be led by Lander & Rogers executive Courtney Blackman, who last year ran a mock trial involving AI lawyers.

At last year’s SXSW a mock trial pitted AI against human lawyers. From left: Jeanette Merjane, LawTech hub director Courtney Blackman, Ken Leung, Landers chief innovation officer Michelle Bey and UTS law professor David Lindsay.Credit: Louie Douvis

Ellen Skinner is a sitting judge and president of the Children’s Court of New South Wales. She will join an Oxford-trained ethicist and a technology lawyer in debating the AI’s verdict at the end of the scenario. She says it is time to talk about what role human judges will play in the future, and how the judiciary can maintain public trust if decisions are made by machines.

“As artificial intelligence begins to influence every corner of society, it is essential we debate its role in our justice system,” she says.

Lander & Rogers technology lawyer Matt McMillan is also sitting on the panel. He says AI systems have significant potential to enable greater access to justice, make legal processes more efficient, and reduce human bias.

Lander & Rogers technology lawyer Matt McMillan.

With proper human oversight, AI can support a more equitable legal system, he says.

“That human oversight is, however, essential to the issue of trust. Many current systems, for example, are trained on historical data which may reflect systemic biases. So, without transparency, accountability and oversight, these systems risk perpetuating the very injustices they aim to resolve.

“Legal reasoning, which often involves context and moral judgment, is complex, human, and cannot be fully captured by machines, no matter how intelligent they are. Although AI is smart, it remains, at least for now, an approximation of human thinking.”

For all the question marks around hallucinations, ethics and accuracy, AI tools are already in wide use across Australia’s top-tier law firms and among their clients.

Silicon Valley legal AI start-up Harvey, co-founded by former Californian litigator Winston Weinberg, has signed up a third of Australia’s top-tier firms, including King & Wood Mallesons, Ashurst, Gilbert + Tobin and Arnold Bloch Leibler, to its AI solution that provides instant analysis and data processing for lawyers. It has just opened a Sydney office, where it plans to hire 15 people.

Weinberg says that the real potential for AI justice lies in low-stakes cases, at least for now. He thinks such a system could work in Australia.

Harvey co-founders Gabe Pereyra and Winston Weinberg.

“Imagine two parties agreeing to submit a small claim to an AI arbitrator,” he says. “Both sides put in their argument, the AI produces a ruling, and if either disagrees, they can appeal. That can speed up and unclog a lot of the legal system, but I think the reality is that for very high-stakes things you want a jury of your peers.”

It’s a pragmatic vision: not robot judges in Supreme Court murder trials, but software resolving car park fender benders or landlord-tenant spats.

There have now been more than 20 reported cases in Australian courts of documents containing false citations.Credit: iStock

England and Wales recently gave judges limited approval to use AI, though only for writing opinions, not for research or decision-making. Under that system, AI may help with the paperwork but it won’t decide guilt or innocence.

The Australian Institute for Judicial Administration, in a recent report, stressed that AI tools must meet the profession’s highest standards of transparency and accountability. That means knowing how an algorithm reached its conclusion, something that remains a black box for most large language models.

Proponents suggest AI could one day bring consistency to the bench. A University of Chicago study replicated a 2015 experiment involving 31 US federal judges, this time comparing them with OpenAI’s GPT-4o. The judges were swayed by emotional portrayals of defendants 65 per cent of the time, even when precedent dictated otherwise. The AI, by contrast, remained unswayed and was driven purely by precedent. Researchers dubbed the AI a “formalist judge”, noting its approach was more akin to that of law students than of seasoned jurists.

Despite the risks, momentum is building and feels undeniable, at least for research and low-level tasks.

Anthony Curtin, managing partner at Merton Lawyers, says that every time he interviews a graduate, lawyer or partner, he expects them to show a curiosity and willingness to adopt AI.

“It’s going to revolutionise our industry, and if you’re defensive on its application then you’ll lose,” he says.

“Every lawyer and law student should be testing these platforms. They’re not perfect yet, but their impact is already significant and accelerating. Without understanding this shift, you risk being behind the eight ball in recruitment and career progression.


“As an industry we need to embrace and enjoy this. We are at a turning point in life, technology and how we operate. One day, we will look back with nostalgia and pride at being at the front row of this important shift.”

Harvey claims its technology saves lawyers between 13 and 25 hours a month on research and drafting. For larger firms that integrate AI into custom workflows, the figure can rise to more than 80 hours.

At a time when governments are desperate to lift national productivity, those numbers are hard to ignore.

Junior lawyers may spend less of their careers trawling through case law, and more time in court or advising clients, Weinberg says.

For Judge Ellen Skinner, AI is already shaping the sector, and there’s no turning back.

“Exploring whether machines can or should be making sentencing decisions forces us to confront the values we want in future legal frameworks,” she says.

“It’s a conversation we must have now, before AI is overly embedded in our courtrooms.”
