Civil society and tech industry stakeholders are split over a provision in new legislation that would grant the Government the ability to filter the internet
Internet Service Providers (ISPs) and civil society groups have returned to battle lines first drawn in February over a controversial question: whether the Government should be able to filter the internet.
A bill introduced to Parliament on Tuesday would grant the Government the ability to create bespoke internet filters for content ruled objectionable by the Chief Censor. Such a filter – one ISPs join voluntarily – already exists for child sexual abuse material, and the Minister for Internal Affairs, Tracey Martin, said in a Cabinet paper that she is actively investigating the possibility of a filter for terrorist and violent extremist content.
The measures come out of widespread concerns over the ad hoc response to the March 15 terror attack, in which Department of Internal Affairs (DIA) officials had to rely on ISP participation to block websites hosting the shooting video and manifesto.
ISPs butted heads with DIA officials who wanted content blocked but lacked the statutory authority to demand it. The list of URLs to be blocked was hosted on a Google spreadsheet, and on at least one occasion an email listing website addresses for blocking was deleted by a spam filter.
The bill clarifies the respective obligations and responsibilities of ISPs and the Government, and allows the latter to order certain content blocked through the creation of an internet filter.
Civil society concerned
Some civil society stakeholders, however, are concerned such measures could amount to a violation of freedom of expression.
“The adjustments in the bill broadly are pragmatic and we definitely accept the good faith of the officials and the ministers who pulled this package together,” InternetNZ chief executive Jordan Carter told Newsroom.
“They are rightfully horrified about this content and want to do the things that they can to deal with it. And that’s right – we have to respond to what happened in Christchurch. But the broad legislative basis for filtering, that could be compulsory, just isn’t the right way to go. It’s too big an approach. It isn’t justified by the problem that we’re trying to solve here.
“We support a free, open and secure internet and the Government has said that it does as well. Introducing a pretty broad power for statutory filtering of the internet isn’t consistent with that idea. We support those principles and we think that the problems of content are best done by dealing with the content and dealing with the people who publish it, not the intermediaries.”
Carter said the bill differed from the existing Digital Child Exploitation Filtering Service (DCEFS) because it would allow the Government to create a mandatory filter with no independent oversight, whereas the DCEFS is opt-in for ISPs and has an independent oversight board.
“There isn’t a problem with filtering per se. We always said with the [DCEFS] that there are a couple of things about it that mean we aren’t complaining about it. One of them is that it’s opt-in. Another is that it’s a service the DIA is providing to ISPs, it isn’t mandated under statute. And it’s got independent oversight that can check the items that have been entered into the filtering service and audit its performance,” he said.
“So it’s quite a different thing to say we’re going to introduce a system that is in legislation, that it isn’t even defined in the legislation whether ISPs would have to use it or not, there’s no independent oversight of it. The DCEFS is not the same thing as what this legislation is empowering the Government to do.”
Tech expert Isla Stewart told Newsroom she was concerned the new bill amounted to government overreach.
“I’m quite worried about this internet censorship bill,” she said. “It’s on the backdrop of some very worrying processes: armed police trials, facial recognition, cementing the without-warrant raids. I don’t see the value in censoring the internet. It feels dangerous and it feels ineffective. It just doesn’t seem to meet that balance of risk to reward.”
Such a reaction didn’t surprise Martin, who said this was just part of living in a democracy.
“I think civil society will probably come to the select committee and argue that this is not the sort of thing that they want government to do,” she told Newsroom.
“A percentage of civil society will do that. And that’s the beauty of living in a democracy, obviously. So I don’t expect everybody to just say, ‘Yep, hip hurrah, go and filter our internet’. But we have had an awful lot of conversation up to this point, so I’m pretty confident we know what’s going to come to select committee, and if they’ve got improvements then we need to improve it.”
Efficacy also doubted
Stewart and Carter also raised concerns about the efficacy of filters.
“It’s extremely easy to evade. There’s free public software that will get around almost any filter. You can buy a VPN for a couple bucks a month,” Stewart said.
“The websites that glorify in hosting this kind of thing are going to just be doing it anyway. The kind of people who use those websites are okay with all that technology [needed to circumvent filters],” Carter said.
Martin said that just because some people would break the law didn’t mean an act shouldn’t be outlawed.
“We have laws for lots of things and there’s always the odd individual who decides that they’re going to break it. That’s in the real world and I don’t know that we should expect anything more from the digital world,” she said.
“So, child sexual exploitation, for example, I don’t care if they want to see it. It’s illegal, it should not be produced and nobody should be able to see it. The [child sexual abuse advocacy group] Phoenix 11, they put it beautifully: For every online image, there is an offline crime.
“This is about how much can we actually filter, how much can we protect, knowing that you can’t stop everybody going to extreme lengths to get around something. But then of course, if you minimise the people who are breaking the law, then you’ve got fewer people to look for.”
Martin Cocker, CEO of Netsafe, told Newsroom that while filters could be circumvented by bad actors, they were still useful in situations where viral objectionable content could be inadvertently seen by everyday Kiwis.
“It makes it more difficult to access objectionable material. It filters out some of the easy ways to access that content and certainly that makes it more difficult for people to go and find it. But it doesn’t stop people – there are other options for them,” he said.
“But filters help during those periods of mass communication of an event or mass sharing of harmful content because what happens in those occasions is that people get drawn down from the main platforms into second-tier platforms that are hosting the content. People are putting links into Twitter, they’re putting links into Facebook, they’re putting links into every platform they can.
“When you hit the link, you drop down to the site that hosts it. Now, we can work with Facebook, we can work with Google, we can work with Twitter to remove the content from their platforms. But it’s much harder to stop those second-tier platforms from hosting it. Those are the entities that said to New Zealand back in March last year, ‘We don’t care about your law, we’re going to host this content, we think we should be able to’. And those are the ones that therefore qualify for the filter.”
ISPs broadly supportive
Cocker said he welcomed the legislation.
“There’s nothing in the bill that is problematic for us. The legislation puts in place a framework around the kind of responses we were forced to undertake, as a country, post the Christchurch attacks. At the time, people were reacting, they were rapidly classifying content, they were working with their international content partners to have content taken down. What this legislation has done is it puts in place a framework where that could happen again, perhaps more quickly and more cleanly than it did the first time around,” he said.
For their part, the country’s two major ISPs have told Newsroom they support the new measures and want to see them put in place as quickly as possible.
“We have always been very clear that we believe it is Government’s role to classify and determine what content should be blocked, rather than these decisions being left to individual service providers like Spark. We are very keen to see Government introduce and administer a filter system for terrorist and violent extremist content, similar to what is already in place for child exploitation content, and we hope that the final legislation will enable this to be put in place quickly,” a Spark NZ spokesperson said.
“Vodafone already subscribes to the voluntary Digital Child Exploitation Filtering Service (DCEFS) and we believe a new filter is a relatively effective way of approaching what is a multi-layered and challenging problem. There are ways to circumnavigate filters, meaning it will never be able to fully prevent online harm, but broadly we support the move to help protect everyday New Zealanders from inadvertently being presented with abhorrent material, such as via social media channels, which also have a large role to play in this ecosystem,” Vodafone spokesperson Nicky Preston told Newsroom.
“This filter should apply to all service providers equally (the bill is currently silent on whether the filter will be mandatory or voluntary), and any implementation costs should be imposed equally across internet providers.”
Preston also said Vodafone supports the other provisions in the bill.
“The government-led response proposed by this bill is appropriate in providing ISPs like Vodafone a framework to respond to terrorist or violent extremist content. We support the need for takedown notices, and enabling the Chief Censor to urgently decide if content is objectionable, as well as methods to legally punish non-compliant websites or those that host extremist content that is deemed illegal or objectionable.
“We have long argued that we shouldn’t decide what the public can and cannot view, no matter how unimaginable the content is, and have been supporting the DIA’s work to develop this bill. While this legislation helps clarify our respective responsibilities, we hope it will be given priority as the country urgently needs a framework to address what is currently a grey area.”