When Netsafe announced it had finalised an independent code of practice for tech companies to reduce online harm, a trio of organisations put out a statement criticising it as an attempt to sidestep regulation.

“This code looks to us like a Meta-led effort to subvert a New Zealand institution so that they can claim legitimacy without having done the work to earn it,” the joint statement from Internet New Zealand, Tohatoha NZ and the Inclusive Aotearoa Collective Tāhono read.

Now, Newsroom understands, the Government harbours similar concerns that the code of practice doesn’t require big tech companies to do anything differently, wasn’t consulted on appropriately and looks a lot like a fig leaf to ward off more serious regulation.

A document released to Newsroom under the Official Information Act shows the Department of Internal Affairs, which is currently preparing its own consultation on an overhaul of the country’s media content regulatory system, had concerns with the code. The specific concerns were redacted from the document, but people close to the department say they align closely with civil society’s worries.

The code of practice was mostly developed over 2021 in private between Netsafe and a handful of major tech companies. When Netsafe revealed it to a narrow range of independent voices in the tech sector last year, those groups said it should undergo public consultation.

Tom Barraclough, co-founder of the Brainbox Institute, told Netsafe in a submission that the code needed to be developed transparently or it risked losing its legitimacy.

Other civil society organisations expressed similar concerns during the initial, targeted consultation process, leading Netsafe to release a public draft late last year.

The final version of the code was revealed in July. Its signatories include Meta (which owns Facebook and Instagram), Google (which owns YouTube), Twitter, TikTok and Twitch. There were relatively few changes from the draft version, other than a new formal administrator for the code after submitters argued Netsafe shouldn’t oversee both the code of practice and disputes under the Harmful Digital Communications Act.

Brent Carey, the chief executive of Netsafe, said the changes showed the organisation was listening to criticism.

“Overall the public submissions contained a broad range of opinions and viewpoints. It is not possible to incorporate every standpoint in the code, but everything raised was considered in the development,” he said.

Concerns about the consultation also feed into worries about the origin and intent of the code of practice, both for civil society and for the Government.

Carey didn’t directly answer questions about whether Meta instigated the creation of the code or whether an initial draft was written by the tech giant.

“Netsafe engages in ongoing conversations with various entities including government, civil society groups and the technology industry to advance online safety outcomes. Through these discussions the idea of a new code of practice came about,” he said.

“Most of the digital platforms who were involved in the development of the code are signatories to or members of [similar codes overseas], making them invaluable contributors to the initial draft. They have also been contributors to its development from the initial drafts to the version that was released earlier this year.”

Nick McDonnell, the head of policy for Meta’s New Zealand and Pacific branch, also didn’t directly answer questions about Meta’s role in the early stages of the code’s development.

“We’ve been clear as a company that we welcome regulations that address today’s toughest challenges on the Internet, while balancing people’s ability to freely and safely express themselves online. That’s why we’ve joined other technology organisations in New Zealand to sign up to a framework to address these needs, while also calling on the government to create updated internet regulations that address harmful content,” he said.

That’s a far cry from the way that civil society groups see the code.

“In our view, this is a weak attempt to preempt regulation – in New Zealand and overseas – by promoting an industry-led model that avoids the real change and real accountability needed to protect communities, individuals and the health of our democracy, which is being subjected to enormous amounts of disinformation designed to increase hate and destroy social cohesion,” the joint statement read.

While it hasn’t publicly endorsed this view, the Government is also understood to have similar concerns. These include a worry that the code doesn’t actually require companies to do anything more than they already are to prevent online harm and that it is therefore more of a facade to repel regulation than any genuine attempt to change the status quo.

The Government’s scepticism does have limits. It views Netsafe, for example, as genuinely pursuing online safety.

In a statement to Newsroom on the code, Internal Affairs Minister Jan Tinetti said she appreciated Netsafe’s effort and called the code “a move in the right direction”. She emphasised, however, that it would not deflect regulation.

“The code’s themes are also areas of focus for the Content Regulatory System Review (the Review) and may provide some useful insights for the Department of Internal Affairs. However it has not affected the approach to the Review or the need for change in this area,” she said.

“I am taking the work of the tech industry to develop codes as a positive sign they will support further change that will come from the content regulatory review process.”

Carey said he couldn’t address any concerns the Government held unless they were raised directly with Netsafe.

“However, the code is not about setting the details of policies for each signatory – it is about ensuring signatories have policies and are enforcing them as well as transparently being held to account against those policies,” he said.

“The code is not intended to replace or address obligations from existing law or other voluntary regulatory frameworks. It aims to support the broader policy and legislative framework and recognise that companies act in line with their own policies, which may go beyond what’s required by national law.”

Marc Daalder is a senior political reporter based in Wellington who covers climate change, health, energy and violent extremism. Twitter/Bluesky: @marcdaalder
