More than 20 Government agencies have committed to being transparent about when they use algorithms and how those algorithms operate, Marc Daalder reports

New Zealand is the first country to develop standards governing the use of algorithms by the public sector, after Statistics Minister James Shaw announced the creation of an Algorithm Charter on Tuesday.

The charter, which has already been signed by 21 ministries and agencies, requires signatories to be transparent about when they use algorithms and how those algorithms function.

“Most New Zealanders recognise the important role algorithms play in supporting government decision-making and policy delivery, however they also want to know that these systems are being used safely and responsibly. The Charter will give people that confidence,” Shaw said.

The charter

A wide range of government bodies use algorithms to automate some functions or perform complex calculations. These range from the mundane (IRD uses an algorithm to calculate and issue simpler tax refunds) to the serious (the Department of Corrections uses an algorithm to estimate the risk of reoffending).

“We live in a data-rich world where algorithms play a crucial role in helping us to make connections, and identify relationships and patterns across vast quantities of information. This helps to improve decision-making and leads to benefits such as the faster delivery of targeted public services,” Shaw said.

How these tools produce outputs is not immediately clear to those without a computer science degree – and even those with the appropriate qualifications might not have access to the inner workings of the software.

That’s where the charter comes in. Agencies that sign up – including IRD and Corrections – pledge to “maintain transparency by clearly explaining how decisions are informed by algorithms”.

This could involve creating plain English documentation of the algorithm itself, proactively providing information about it, and explaining how the data it uses is collected and securely stored.

“Using algorithms to analyse data and inform decisions does not come without its risks. It is important, therefore, that people have confidence that these algorithms are being used in a fair, ethical, and transparent way,” Shaw said.

In addition to transparency commitments, the charter requires signatories to address bias in their data sets, embed a Te Ao Māori perspective in the development of algorithms and implement a host of human oversight measures. These include regular peer reviews of algorithms to check for unintended consequences, a point of contact for public questions about algorithms and a way to appeal decisions arrived at or influenced by algorithms.

Not all algorithms will have to be reported under the charter, however.

“The Algorithm Assessment Report found that advanced analytics and data use are an essential part of delivering public services. Applying the Charter to every business rule and process would be impossible for agencies to comply with and not achieve the intended benefits of the Charter,” the document states.

“However, where algorithms are being employed by government agencies in a way that can significantly impact on the wellbeing of people, or there is a high likelihood many people will suffer an unintended adverse impact, it is appropriate to apply the Charter.”

InternetNZ chief executive Jordan Carter said this was a concerning point for him.

“The triaging by risk ratings as to whether [the charter] applies, it’s kind of a pragmatic way to start but we’d like to see agencies picking it up and using it pretty broadly,” he said.

What it means for you

Shaw says the charter will give New Zealanders a window into the inner workings of Government agencies, at least on the algorithm side of things.

“As it rolls out, what you will see more of is agencies publishing lists of the algorithms that they use in their decision-making processes, starting to publish the sets of assumptions that are underlying those and in some cases even the code, so that people who know about those things can go and test them,” he said.

Andrew Chen, an expert in technology and society and a research fellow at the University of Auckland’s Koi Tū – the Centre for Informed Futures, was more doubtful that the average Kiwi would feel the impact of the charter.

“Initially this is more of a values statement, of the things that the Government thinks are important to consider around the use of algorithms,” he said.

“We will have to wait and see how the agencies apply this to see if it is actually effective, if it is more than just a values statement, if it is actually going to result in changes that affect the everyday New Zealander.

“The key things for most people will be, one, that algorithms should be more transparent if they’re being used and how they work should be more easily accessible. Two, if an algorithm is going to be used, then the affected people should be consulted and so they might see more of that.

“Three is acknowledging the role of people and having human oversight in these sorts of systems and understanding where the algorithm goes and when people interact with the system and when people make decisions or when algorithms make decisions.”

“It is the commitment there in that first part to making those decisions clearly explained in plain English documents” that will affect New Zealanders’ everyday lives, Carter said. Fostering broader public discussion and awareness around algorithms and Government use of data “is the key benefit of the approach that they’ve sketched out”.

Questions around Māori consultation and inclusion

The picture isn’t fully rosy, however. Karaitiana Taiuru, a longtime tech industry expert and the author of the world’s first set of indigenous guidelines on AI, algorithms, data and the Internet of Things, said he was shocked at how the charter didn’t seriously account for Māori concerns.

“From a Māori perspective, it feels very tokenistic and basically applies some terms that don’t mean anything and offers no protection at all to Māori or any Te Tiriti rights. I was shocked when I read it,” he said.

The document notes that Māori Data Sovereignty is an “important consideration” but is too “complex” to address within the one-page charter. The other reference to Māoridom also rankles Taiuru.

The charter requires signatories to “deliver clear public benefit through Treaty commitments by embedding a Te Ao Māori perspective in the development and use of algorithms consistent with the principles of the Treaty of Waitangi”.

“There’s not really such a thing as [a Te Ao Māori perspective]. It’s like saying ‘the Kiwi perspective’,” he said.

Taiuru and other submitters also pushed for the charter to reference the principles of Te Tiriti o Waitangi – the Māori-language text of the Treaty, which affords Māori more rights than the English version. That change wasn’t incorporated.

“They seem to have ignored all the Māori submissions on it,” he said.

Shaw pushed back on the idea that Māori concerns weren’t dealt with in the charter.

“There is a line in the charter which specifically says that agencies need to be mindful of Te Ao Māori and Treaty obligations as they employ these algorithms. It’s not a strong area of competence in Government so far, the use of algorithms, and so we do hope to be able to lift the game on that too,” he said.

Carter said he understood why Māori Data Sovereignty wasn’t dealt with in the charter, given the lack of space, but said Māori perspectives needed to be consulted.

“Considering Māori perspectives is going to be an important part of the broader conversation as well,” he said.

“I get why they couldn’t put more of a concrete statement into the charter, but essentially saying that it’s too hard to include in the charter also wasn’t ideal, and I hope that the [Government] will do more to engage with Māori on [Māori Data Sovereignty],” Chen agreed.

“We’ve seen around the world how data discriminates against minorities at the moment,” Taiuru said. “I don’t see how the charter will actually stop discriminating against Māori.”

The only other line that might address Taiuru’s concerns is similarly broad: “Ensure that privacy, ethics and human rights are safeguarded by regularly peer reviewing algorithms to assess for unintended consequences and act on this information”.

That concern about data bias is grounded in fact, according to Alistair Knott, an associate professor at the University of Otago’s Department of Computer Science who helped author a paper on the use of algorithms by the public sector in 2019.

“We talk a lot about the use of algorithms in criminal justice and in areas, for instance, to do with risk assessment for children in families, family violence, and stuff like that. In those kinds of areas, there is a real danger that the data that algorithms are using to make decisions or to make suggestions is biased through years of institutional racism,” he said.

Knott pointed to the United States, where algorithms that predict the risk of reoffending regularly rate black people as more dangerous than white people in similar circumstances. That’s because the base reoffending rate recorded for black people is higher than for white people. If the data is biased, the output can be too, he said.
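The mechanism Knott describes can be illustrated with a toy sketch. The numbers below are entirely hypothetical and the "model" is deliberately naive: it scores every person at their group's recorded reoffending rate, so two people in identical circumstances receive different risk scores purely because of the group label in the historical data.

```python
# Hypothetical historical records as (group, reoffended) pairs.
# Suppose over-policing means group "A" has an inflated recorded rate.
records = [("A", 1)] * 60 + [("A", 0)] * 40 + [("B", 1)] * 30 + [("B", 0)] * 70

def group_rate(records, group):
    """Recorded reoffending rate for a group -- the 'risk score' a
    naive model would assign to every member of that group."""
    outcomes = [reoffended for g, reoffended in records if g == group]
    return sum(outcomes) / len(outcomes)

# Two otherwise-identical people get different scores from biased data.
score_a = group_rate(records, "A")  # 0.6
score_b = group_rate(records, "B")  # 0.3
```

Real risk-assessment tools are far more complex, but the same dynamic applies: if the training data encodes biased enforcement, the model reproduces that bias.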

Leading by example?

Knott and his co-authors had called for a more robust system for overseeing algorithms, beginning with an independent body that would review and publish – in plain English – all algorithms used in the public sector.

“We were asking for more in our report. What we wanted was something that would have penetrated slightly more into the public consciousness,” he said.

By centralising and proactively publishing the algorithm information, the independent body would make that data even more accessible.

Carter, likewise, sees scope for expanding the charter in the future, particularly if agencies don’t abide by its intentions. In that scenario, he might want enforcement mechanisms introduced. The same transparency-first approach should also be taken to other areas of Government data stewardship, Carter believes.

“This is just one piece of a broader discussion about how Government is using data about people. Algorithms are an element of that, but they’re not the whole picture,” he said.

That said, Knott is still happy with the charter as an initial step.

“There’s a lot of good things about the charter. I think they’re right in saying this is a world-first. It’s got very broad scope. The idea is that all branches of the New Zealand Government can sign up to this and in that sense, it’s quite unique and it’s something we should be quite pleased with,” he said.

As a world-first, the charter also provides an opportunity for the Government to lead by example on algorithmic transparency. That could pull other governments and big tech companies like Facebook and Twitter into line as well. Algorithmic transparency was a key tenet of the Christchurch Call, which noted that algorithms had driven some people towards more and more extreme content.

“What is encouraging is that the Government is putting these sorts of values statements out and setting some sort of example for other governments to do the same thing. As you have more pressure building up in that way, then that might encourage private corporations to also adopt similar sorts of charters or statements,” Chen said.

But that will only happen if the charter itself proves effective, he said.

“The difficulty will be in seeing how those values are being applied. The charter doesn’t necessarily give specific standards to say what counts as good transparency, for example. What is sufficiently good transparency?” Chen said.

“It’s better that we have this than not, but the devil is going to be in how we operationalise the details.”


Marc Daalder is a senior political reporter based in Wellington who covers climate change, health, energy and violent extremism. Twitter/Bluesky: @marcdaalder
