AI/Machine Learning

‘It’s the constitution, it’s Mabo, it’s the vibe…’: Australian lawyer’s false ChatGPT legal citations come a cropper

February 6, 2025
Sometimes human instinct is better than all the Nvidia chips in the world. Image: Legally Blonde
AI can be useful, but it’s a long way from being as useful as a paralegal or associate when it comes to case law, as a NSW immigration lawyer found out recently, to his detriment.

The lawyer, who cannot be named, has been referred to the complaints watchdog, the Office of the NSW Legal Services Commissioner (OLSC), after he used ChatGPT to prepare submissions for a case and the AI bot “hallucinated”, as it’s euphemistically called (Startup Daily imagines an Nvidia chip on mushrooms as it scans its LLM training, but maybe that’s just us), inventing citations to previous “cases” that never existed.

The issue is especially pertinent this week because the NSW Supreme Court introduced rules restricting the use of generative AI by lawyers, which took effect on Monday, February 3.

“Gen AI must not be used in generating the content of affidavits, witness statements, character references or other material that is intended to reflect the deponent or witness’ evidence and/or opinion, or other material tendered in evidence or used in cross examination,” it says.

It can be used in the preparatory drafting of an affidavit or other document setting out the evidence of a witness, but “affidavits, witness statements, character references should contain and reflect a person’s own knowledge, not AI-generated content”.

Josh Taylor from the Guardian broke the news that the lawyer was in trouble with the bench, after Federal Circuit and Family Court Justice Rania Skaros referred him to the OLSC last Friday.

“Both documents contained citations to cases and alleged quotes from the tribunal’s decision which were nonexistent,” Justice Skaros said.

“The court expressed its concern about the [lawyer]’s conduct and his failure to check the accuracy of what had been filed with the court, noting that a considerable amount of time had been spent by the court and my associates checking the citations and attempting to find the purported authorities.”

The incident occurred during an appeal on an immigration matter before the Administrative Appeals Tribunal (AAT) last October. The following month, during a hearing, the lawyer admitted he’d used ChatGPT because of health issues and time pressure. He’d asked the AI to find Australian cases. They didn’t exist: ChatGPT made them up, including quotes it claimed were made in another AAT case.

A similar thing happened to a Victorian lawyer last year.

While the NSW lawyer expressed remorse, the immigration minister’s team was unimpressed, arguing he should be referred to the OLSC for misusing AI so the problem could be ‘nipped in the bud’. Justice Skaros agreed.

It’s worth remembering that in the middle of all this, there’s a bloke in a fight with Australia’s immigration minister who’s probably hoping Lawrence Hammill QC, Bud Tingwell’s character from The Castle, turns up to represent him pro bono, given his initial lawyer seems more at the Dennis Denuto end of the legal spectrum.

The Guardian story is here.