New Zealand courts have approved guidelines for the use of generative AI tools, such as ChatGPT.

These guidelines apply to all courts and tribunals and vary slightly for lawyers and non-lawyers.  

Law Society Civil Litigation and Tribunals Committee member Felicity Monteiro, a partner at Wilson Harle, said the tools were powerful and useful for self-representing litigants.

“Because I think it must be daunting when you’re up against a lawyer and so having a tool where you can present your arguments or evidence in a well-written way, must feel really like you’ve got some backup. So I can see why it would be a really useful tool.” 

However, she does have concerns.

“People that use it need to know actually what it is and what it’s doing so that they can ameliorate those issues. And those are obviously because these generative AI tools, they aren’t designed to give you correct answers, they’re designed to give you answers that look like the correct answer.  

“So they’re really good at putting material together to make it look convincing but they don’t know when they’ve made a mistake and they can’t tell the difference between facts and opinion.” 

For example, in the US earlier this year the lawyer for a man suing an airline in a routine personal injury suit used ChatGPT to prepare a filing, but the bot delivered fake cases that the attorney then presented to the court. 

The guidelines make this clear: “GenAI chatbots cannot give you reliable legal advice that is tailored to a specific case.”  

“GenAI chatbots are not search engines. They do not provide answers from authoritative sources. Rather, they put words together based on what you tell them and information they have previously been given. This means the output generated by a GenAI chatbot is what it predicts to be the most likely combination of words, not necessarily the most correct or accurate answer.” 

They also advise against entering private, confidential, suppressed or legally privileged information into the chatbot, and make clear the person submitting the information is responsible for its accuracy.

People will not need to disclose that they’ve used generative AI though, unless explicitly asked.  

The number of self-represented litigants is increasing, and self-representation is very common in civil District Court cases.

For the year ended June, 7,232 cases involved self-represented parties – 72 percent of all civil cases through the District Court.

Five years ago 52 percent of cases involved a self-represented party. 

Monteiro said the increase was having an impact on court operations. 

“It’s one thing to know the substantive law that your matter relates to, but it’s also all of the procedural rules and regulations. 

“And so, when people are acting for themselves they’re sort of strangers to both of those things and what that means for registry staff is there’s often a lot of back and forth before it’s even filed.” 

She said there was a lot of court registry staff turnover at the moment as well, which made things even harder.  

“What that means is we’ve got inexperienced or brand new litigants dealing with inexperienced registry staff and what we’re seeing as lawyers, acting for defendants, is that we’re getting documents that are impossible to actually respond to and that previously, we probably would have not actually seen because they would have been rejected.  

“Then you’ve got to work out what their actual claim is, so you’ve sort of got to do the work twice.” 

Another upshot of more claims coming in from self-represented parties is an increase in strike-out applications.

“If we’re seeing more proceedings come through that don’t actually disclose a cause of action, then we’re going to be seeing more applications for strike out of those proceedings and that will increase the burden on not just the registry staff but also on judges because of course they need to decide those applications.”

She said generative AI tools could help self-represented litigants prepare submissions in the correct formats, but they should not rely on the tools to do the legal work.

“It’s a great tool for people who aren’t used to writing large amounts of text, like lawyers are, but it needs to be used really carefully. So I would hugely encourage people who are representing themselves to actually go and check out the legal authorities themselves – don’t rely on the AI to generate your arguments or your case or your pleadings. 

“You need to do the legal legwork first, and then use it as a writing tool.” 
