After widespread opposition forced the Government to back down on legislation that would have empowered officials to filter the internet, a new effort to set up an opt-in filter is under way.
A briefing to Internal Affairs Minister Jan Tinetti, released to Newsroom under the Official Information Act, shows officials are developing another proposal for filtering terrorist and violent extremist content.
Broadband companies have long asked the Government to step in on this issue.
In the immediate aftermath of the March 15 terror attack, they closed access to the terrorist’s footage and manifesto on an ad hoc basis, responding to pleas from the Department of Internal Affairs, which wanted the content blocked but lacked the statutory authority to demand it. The list of URLs to be blocked was hosted on a Google spreadsheet, and on at least one occasion an email full of website addresses for censoring was deleted by a spam filter.
An earlier attempt by the Government to set up a framework for new internet filters as part of legislation to tackle online extremism failed when Tinetti axed the filtering provisions. Every party in Parliament other than Labour opposed that effort.
At the time, Tinetti flagged that work would continue on a system to filter extremist content that built in more human rights safeguards.
The new system would be opt-in for Internet Service Providers, like the only existing government-run filter, the Digital Child Exploitation Filtering System for child sexual abuse material.
Internal Affairs officials said they would consult with a narrow range of stakeholders, including ISPs and civil society, while developing the proposal, and then open it up for wider consultation. The two options currently being considered are a content warning page, which would link users to support services but still allow them to proceed if they wanted, or wholesale blocking of the content.
“It would be the intention of a filter based on either option below to only apply to domains that are primarily dedicated to promoting or supporting terrorism or violent extremism,” officials added. “It would not be appropriate to permanently include domains which serve a different primary purpose (for example, email services, mainstream social media or search engines).”
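Neither the briefing nor the ISPs have published technical detail, but the two options can be sketched in rough terms. The following Python snippet is purely illustrative and is not the department’s design: the blocklist entries, domain names and function are all invented, and a real deployment would sit at the DNS or network layer rather than in application code.

```python
# Illustrative sketch only: an opt-in ISP filter checks each requested
# domain against a list of domains already deemed objectionable, then
# applies one of the two responses described in the briefing.

BLOCKLIST = {"extremist-content.example"}   # hypothetical entry
WARNING_PAGE = "warning.filter.example"     # hypothetical interstitial host

def resolve(domain: str, mode: str) -> str:
    """Return where the filter sends a request for `domain`.

    mode="warn"  -> redirect to a warning page linking to support
                    services, while letting the user proceed (option one).
    mode="block" -> refuse to resolve the domain at all (option two).
    """
    if domain not in BLOCKLIST:
        return domain                       # unlisted domains pass through
    if mode == "warn":
        return WARNING_PAGE                 # interstitial warning page
    return "NXDOMAIN"                       # blocked outright

print(resolve("news.example", "block"))
print(resolve("extremist-content.example", "warn"))
```

Either way, the officials’ stated intent is that only domains “primarily dedicated” to terrorism or violent extremism would ever appear on the list, so mainstream services would pass through unfiltered.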
An independent reference group would oversee the operation of the filter in line with a purpose-built code of practice. The filter would only target terrorist and violent extremist content that has already been deemed objectionable by the Chief Censor, like the March 15 footage and manifesto.
Andrew Chen, a research fellow at Koi Tū: The Centre for Informed Futures and an expert on technology and society, said the filter should only target a narrow range of indisputably harmful content.
“It’s a really difficult debate to have. For example, a manifesto? That would be an example of one that it would be really hard to say a manifesto should be blocked. Whereas if it’s a video of people being murdered, absolutely that should be blocked.”
He said a filter was justified for the worst of the worst content.
“It is less about stopping the people who really want to have access to get access and more about stopping people from accidentally coming across it. Given that we accept the premise that this has met a very high threshold of being harmful content that we don’t want people to see, that we don’t want to be distributed, that is likely to generate harm just by viewing it, I would prefer that we just blocked it rather than giving people an option to still see it.”
The interim chief executive of InternetNZ, Andrew Cushen, said he opposed any attempt to filter the internet for this content. But if filtering were to go ahead, he said, it should be done in consultation with affected communities to minimise unintended consequences.
“Filters just really don’t work that well. They don’t work well because it’s against the fundamental design of the internet to be able to impose a filter,” he said.
“Yes, it can stop the accidental browsing and exposure of your average user but it doesn’t stop anybody who is clearly a recidivist or determined to go and find that content. Filtering is against the DNA of the internet.”
The Department of Internal Affairs also recognises filters won’t stop the determined from accessing terrorist content.
“While web-filtering can be effective in preventing casual and inadvertent access to objectionable material, technical constraints with current filtering technology [do] not prevent those who are determined to access objectionable material,” officials wrote in the briefing obtained by Newsroom.
Telecommunications companies told Newsroom they wanted to hear more details about the proposal, but the Government needed to lead on any internet filtering.
“While the industry is happy to step up in the midst of a crisis, it’s really the job of government to decide what New Zealanders can and can’t look at online and we’d be happier if they took on that role,” the Telecommunications Forum chief executive Paul Brislen said.
“We haven’t engaged directly with the Department of Internal Affairs on this proposal, but would welcome the opportunity to feed back and to work with government, industry and relevant groups on this important issue,” Quentin Reade, the head of communications for 2degrees, said. “We would like to see more detail around the proposed options, and seek to understand the technical details around how these proposals would work in a real-world environment.”
Rich Llewellyn, Vodafone’s head of corporate affairs, went further, saying that blocking access to domains entirely could introduce “complexity and unintended consequences”.
“It is unclear what role ISPs, if any, would be expected to have in blocking domains and the mechanism under which they would undertake domain-blocking activity if asked. We would need further detail before confirming the nature and extent of Vodafone’s support for either option. This would include a better understanding of any role that ISPs would be expected to play in enabling either option.”
A Spark spokesperson said the company supports a filter similar to the existing one for child sexual abuse material.
“We would need to understand the details of the proposals and how they would operate in practice before commenting further on our preferences. But what we can say is we have been clear in our view that a centralised filter, tightly focused in scope, and technically similar to the existing digital child exploitation filter, provides the best protection for New Zealanders from terrorist and violent extremist online content and is the best way to ensure this content can be blocked on all New Zealand service providers’ networks,” the spokesperson said.
“We do recognise that filters are not a comprehensive solution and can be bypassed, but they send an important social signal about what is acceptable and they prevent inadvertent online access to objectionable content by ordinary New Zealanders and their children.”