A report from the Helen Clark Foundation has proposed ways to regulate social media. Photo: Lynn Grieveson.

Social media companies should no longer be left to self-regulate, but should have their work supervised by an independent regulatory body, recommends a new paper following the Christchurch terror attack.

A report from the Helen Clark Foundation calls for an independent regulatory body to be created to oversee social media companies.

The report comes in the wake of the Christchurch terror attack, which was live-streamed on Facebook and re-uploaded millions of times across the internet. 

The report’s co-author Katherine Errington told Newsroom that the consensus around regulating tech companies had shifted from not wanting to regulate at all, to making sure that the regulation was done well.

She said some countries like Singapore, which had taken a hard-line approach to social media, had created an environment where Governments were in the position of deciding “what is true”.

Singapore’s fake news laws allow the Government to unilaterally remove content which it deems to be untrue. Critics argue this allows the Government to stifle free speech. 

Errington said that, by contrast, she wanted to propose “a democratic model”.

Her paper proposes an independent regulatory body along the lines of the New Zealand Media Council and Broadcasting Standards Authority. This would replace the current regime in which social media companies are largely left to self-regulate how they monitor and remove harmful content. 

She said the regulator should comprise a range of voices, including those of Government and public interest groups.

However, while the regulator could have Government appointees, it would also need to be independent.

“That regulator needs to be independent of Government, the regulator stands on its own and the Government can’t step in and overrule it,” she said. 

The current regime not only relies too heavily on self-regulation; the legislation that does exist largely predates social media, meaning some issues fall through the cracks and it is not always clear whether social media is covered by existing law.

For example, although the Chief Censor ruled the video made by the alleged gunman ‘objectionable’ under the Films, Videos, and Publications Classification Act 1993, it is unclear whether the Act applies to social media companies.

Even legislation designed for the age of social media, like the Harmful Digital Communications Act, which was drawn up in response to online bullying, specifically exempts social media companies from liability.

Errington is calling for the Law Commission to review regulation of social media to close some of the holes in the legislation.

Another issue is how to appropriately sanction social media companies that break the rules.

Errington said sanctions tend to fall into three broad categories: financial penalties, individual liability for executives, and disruption of the companies’ business.

In all cases, the trend overseas has been toward increasingly severe sanctions on social media companies. These regimes have also highlighted the need for companies to act quickly to take harmful content off their platforms, recognising that the longer content stays online, the harder it becomes to remove.

The European Commission is developing an approach that will oblige platforms to remove terrorist content or disable access within one hour of receiving a removal order from authorities. Failure to comply would result in a maximum fine of 4 percent of global turnover for the previous year. 

Australia swiftly introduced legislation in the wake of the Christchurch attack that criminalised “abhorrent violent material”, with employees facing up to three years in prison if they failed to comply.

The other sanction, business disruption, had a trial of sorts following the attack, when telcos took it upon themselves to unilaterally block access to sites hosting objectionable material. A regulator could be given similar powers to temporarily block access to sites hosting harmful material.

In any case Errington sees the fact that the debate has moved from whether action should be taken to what form that action should take as a positive thing. New Zealanders will get their first look at the Government’s response this week when the “Christchurch Call” is unveiled in Paris. 
