More than six months after the March 15 terror attack in Christchurch, the Department of Internal Affairs unit that monitors compliance with the Chief Censor’s rulings still has no employees who investigate extremist content full-time. That is now set to change, with the Censorship Compliance Unit planning to double its roster of investigators after receiving a $17 million funding boost.

The new funds are earmarked entirely for fighting violent extremism online and will go towards hiring around 17 full-time employees, improving the speed at which the Chief Censor classifies material, designing new laws to respond to terrorist content and reviewing, in the longer term, the regulatory framework around social media companies.

Department of Internal Affairs officials say the expanded unit will draw on its success in the other area in which it specialises: child sexual abuse material.

Process still in early stages

Officials at a background briefing on Monday cautioned that the new endeavour is still in the early stages of development. However, they said it will have three streams: increasing operational capacity, in part with the hire of more investigators; quick-fix legislative changes to shore up the unit’s powers; and a longer-term investigation into the role of online companies in New Zealand’s censorship regime.

“Our laws in relation to censorship are reasonably outdated but we’re proposing some changes in those areas,” an official said. “We intend to engage with civil society and communities around those types of changes. The aim there is to get some more immediate changes in relatively quickly.”

Practically speaking, the department expects the new funding will increase its ability to prevent violent extremist content from proliferating and give it greater potential for investigating and prosecuting those who disseminate it.

Officials stressed that they are not currently examining, altering or expanding the types of content that could be classified as objectionable.

While the DIA has always been empowered to tackle all forms of content deemed objectionable by the Chief Censor – or that its own employees believe to be objectionable – its focus has traditionally been on child sexual abuse content.

“Our focus has been in the child exploitation space, which is an objectionable medium,” an official said. “The existence of violent extremism or terrorism-type publications can also be objectionable and is in existence. We’re looking to build capability based on the existing skills and expertise we have in child exploitation.”

“Like any department or any regulator, we have to think about where the greatest risks are,” a second official at the briefing said.

“For the relatively small capability we’ve had in the censorship compliance area, we have focused that capability on countering child sexual abuse images and identifying and prosecuting those who trade in those images.

“This additional funding helps us to deal with another area of harm in New Zealand.”

A tall order for the Government

Nonetheless, that official admitted, “fundamentally, we can’t eliminate this content from the internet. What we can do is try and protect New Zealanders as much as possible from harmful content that they may come across. But you cannot remove it [all] from the internet.”

The Government has a limited set of tools for dealing with online content, as this Newsroom explainer shows, almost all of which depend on the goodwill of the platforms hosting it. One of its primary instruments is take-down notices. When sent to a website based overseas, these notices are voluntary. “They are just convention,” an official admitted.

A second is communication with Internet Service Providers and international partners. On March 15, ISPs voluntarily blocked access to websites hosting the livestream or manifesto of the alleged shooter. As with take-down notices, this method relies on voluntary participation.

The strongest tool available to the Government is an internet filter. The Government currently operates the Digital Child Exploitation Filtering System (DCEFS), a filter that most ISPs have voluntarily signed up to implement. When a user attempts to access a website that the Government has identified as hosting objectionable content, they are instead directed to a webpage informing them that the site has been blocked and linking to mental health services.
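In outline, a filter of this kind intercepts a request, checks the destination against a list of blocked sites, and redirects matches to a notice page. The sketch below is purely illustrative of that redirect logic, not the actual DCEFS implementation; the domain names and notice URL are hypothetical.

```python
# Illustrative sketch of ISP-side filter logic (NOT the real DCEFS):
# requests for listed domains are diverted to a block-notice page.

BLOCK_PAGE = "https://blocked.example.nz/notice"  # hypothetical notice page


def resolve_request(hostname: str, blocklist: set[str]) -> str:
    """Return the URL the user actually reaches: the original site,
    or the block-notice page if the domain is on the filter list."""
    if hostname.lower() in blocklist:
        return BLOCK_PAGE
    return f"https://{hostname}/"


# Hypothetical filter list with one blocked domain.
blocklist = {"objectionable.example"}
print(resolve_request("objectionable.example", blocklist))  # diverted to notice
print(resolve_request("news.example", blocklist))           # passes through
```

In practice the diversion happens inside ISP infrastructure (for example at the routing or DNS layer), which is why participation by each ISP is voluntary.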

Officials declined to comment on whether any work was being done to add violent extremism to DCEFS, or create a new filter for such content, beyond confirming that it would be possible. “It is an area where we would need to provide some advice to ministers,” an official said.

However, at a post-Cabinet press conference on Monday, Prime Minister Jacinda Ardern said the Government was looking into a new filter. “We do need to look at options like voluntary filters,” she said.

“That is something we use already for child exploitation and we are now exploring for the issue of terrorist and violent extremist content.”

The Government can also prosecute people in New Zealand who create, share or possess objectionable material. Since March 15, police have charged at least 35 people with possession of objectionable material related to the Christchurch terror attack. However, an official said the Government would likely only pursue people who intentionally share violent extremist material, not those who happen to come across it.

Marc Daalder is a senior political reporter based in Wellington who covers climate change, health, energy and violent extremism. Twitter/Bluesky: @marcdaalder
