For Election 2020, Bernard Hickey finishes a three-part series on the digital policies needed to bolster Aotearoa’s resilience, protect our sovereignty and reduce inequality in the wake of the Christchurch attacks, foreign interference in elections, the Covid-19 pandemic, America’s tech war with China, and state-sponsored cyber-attacks.
There have been three moments in the last 18 months when it dawned on Aotearoa/New Zealand that it had lost sovereignty over large parts of our national life in ways we could never have imagined in the days when sovereignty was all about wars, invasions and peace treaties.
Now sovereignty is all about which platform you get your information from, and run your business or school from, where it is hosted, what is written into the algorithms it uses and who is hacking into it. We have to negotiate with tech CEOs and foreign parliaments, rather than kings and dictators, to keep our data safe and private, to fix the holes in our tax base, and to communicate with our own Government.
The first moment was the afternoon of the March 15 attacks in Christchurch when Facebook and Youtube were weaponised to amplify a hate crime. See more in our first article. The second moment was in November 2018 when Huawei was locked out of Spark’s 5G rollout. See more in our second article.
Moment 3: August 16, 2020 – Chris Hipkins begs people to stop spreading ‘vile slurs’ online
It was the moment our open slather approach to barely regulating social media came home to roost, and at the worst possible time.
New Zealand’s ‘Team of Five Million’ response to Covid-19 since March has been remarkable for the high levels of public support for (and compliance with) the toughest lockdowns in the world. That meant more than 90 percent approved of the level four lockdown, even though it was announced when there were just 102 cases overall, just two proven cases of community transmission and no deaths.
This was a level of social cohesion that was the envy of the world. It was partly due to the Prime Minister's popularity and communication skills, but it also reflected the relatively widely available news and information from established and cooperative local media sources. One News, Newshub, RNZ, Stuff and NZME regularly carried the full 1pm news conferences live on broadcast television and via Youtube and Facebook, and ensured those public health messages were read and heard throughout every day.
Useful, timely, easily accessible and accurate information gave the public confidence about the hard lockdowns. But all through this period misinformation, rumours and conspiracy theories were bubbling up from the nether regions of the Internet, and became more and more widely distributed via Facebook and Instagram as the crisis wore on.
The shock in mid August of finding new cases of community transmission and going back into level three lockdown in Auckland and level two elsewhere then lit the blue touchpaper. As explained in this piece from The Detail about ‘The anatomy of a rumour’, it began as a piece of very-amateur specu-sleuthing on Reddit that morphed into a full-on conspiracy theory via an anonymous Facebook group notorious for spreading racist comments.
I was one of many journalists repeatedly sent the post by credulous emailers. It alleged the Government was covering up all manner of nefariousness and the media was complicit. It was completely unfounded and initially unreported because none of the details checked out. Yet it continued to circulate at a speed and a volume that eventually forced Health Minister Chris Hipkins to take the extraordinary step of rubbishing the post’s ‘vile slurs’ at the start of the widely-watched 1pm Covid-19 briefing from the Beehive Theatrette.
“Not only was it harmful and dangerous, it was totally and utterly wrong. I want to say this again – it did not happen. There have always [been] and will always be rumours, but this one smacked of orchestration, of being a deliberate act of misinformation,” Hipkins said.
“This sort of behaviour is deliberately designed to create panic, fear and confusion, and it is completely unacceptable,” he said.
However, Hipkins’ implication was that Facebook’s users were to blame for sharing the post, rather than Facebook itself, which has fine-tuned its algorithms to put more and more extreme content into users’ news feeds because it is more likely to trigger engagement such as ‘likes’ and ‘shares’. Youtube uses similar algorithms to keep viewers watching and spiralling into rabbit holes of ever-more extreme user-generated videos. I was struck one day when I left Youtube running after watching a 1pm briefing via an RNZ livestream and it defaulted to a video of an American conspiracy theorist questioning the lethality of Covid-19 and promoting hydroxychloroquine as a cure.
The Labour Minister was skirting around an elephant in the room: just over three-quarters of New Zealanders are active users of Facebook and Youtube, and both the Labour Party and Government have become reliant on these platforms to reach voters and the public more generally. Labour spent over $370,000 on Facebook and Youtube ads and ‘boosts’ in the 2017 election campaign, almost double what it spent on newspaper advertising. Labour has already spent almost $100,000 on Facebook in this now-extended campaign for re-election, knowing that New Zealanders look at their Facebook feeds an average of 14 times a day for a total of 50 minutes a day.
Ardern reluctant to act
Prime Minister Jacinda Ardern has also rejected calls for Labour and the Government to stop advertising or using Facebook, as many global corporate advertisers have. She said in July after Stuff withdrew from actively using Facebook that the Government needed to reach New Zealanders where they were. She does impromptu 10-12 minute Facebook Live video sessions to her 1.7 million followers every two or three days.
“Some media outlets — Stuff for example — have an ability to share information and engage with the public on their own platform. Politicians don’t have some of those same options,” Ardern said.
“We see part of our critical job is talking with voters, making sure that people have a way to contact us, but I still see there is an argument for those platforms to change, so I will continue to hold them to account,” she said.
Stuff CEO and owner Sinead Boucher questioned that stance in the wake of Hipkins’ plea.
“Given the NZ govt’s strong warning against Covid-19 social media disinformation today, perhaps it could reconsider the extent to which it funds and enables it by spending its ad dollars on platforms which facilitate fake news vs news media that invests in journalism,” Boucher tweeted.
Chief Censor David Shanks is also particularly aware of the risks, given he was forced to ban the videos of the Christchurch attacks and the shooter’s manifesto. “We’re not immune to the pandemic, and we’re not immune to the infodemic,” Shanks said.
One particular issue is the relatively higher use of Facebook and Youtube by Māori, Pacific and Asian communities, which is reflected in the increased prominence of conspiracy theories and misinformation in the communities most vulnerable to Covid-19. An NZ on Air survey in 2018 found Youtube and Facebook had the deepest reach into these communities of any media, exceeding the use of local television and radio by two to three times.
This imbalance reinforces the risk that the increasing internationalisation of services, delivered digitally from remote servers and built on overseas-designed software, will amplify the inequalities already present between these communities, which have less access to smartphones and fast connectivity, particularly in the poorest regional and rural areas.
Catalyst IT CEO Don Christie sees the increasing use of artificial intelligence, big data and algorithms designed elsewhere as amplifying the biases of the software engineers who built them and deepening inequalities.
“What we’re seeing is a growing realisation that these larger and larger datasets are used to make decisions about our lives. So if your data set is very homogenised and effectively reflects an American society, the chances are the decisions being made using those datasets are not necessarily the best ones for New Zealand,” Christie says.
“Think about Māori data and look at the health and social outcomes of minorities within the minority. Then you’re in an even worse position. And this is why, for Māori, data sovereignty, the use of health data, the type of knowledge system, are all becoming a really hot button issue,” he says.
“There’s this growing realisation that these data sets are driving decisions about how minorities are treated, categorised and viewed, amplifying generations of unfair and unjust outcomes.”
So what could be done?
Andrew Chen is a Research Fellow at Koi Tū, the Centre for Informed Futures at the University of Auckland, and has just edited a book about these issues — Shouting Zeros and Ones: Digital Technology, Ethics and Policy in New Zealand.
Chen says the platforms have tried to build systems to react faster in the event of another Christchurch-style attack and to remove content clearly and quickly identified as hateful, but they could do much more.
“There is still significant scope for an increase in platform accountability, both legal and moral. Some of this may come by progressively moulding the limits of the ‘safe harbour’ provisions that insulate platforms like Facebook, Twitter and Reddit from accountability as publishers,” Chen writes, adding some have moved faster than others.
“Recent examples include Twitter’s introduction of a feature that asks users whether they still want to share an article that they have not yet read and a feature that fact-checks unreliable material – such as topical conspiracy theories – and marks them as unreliable.”
Facebook is the biggest laggard, given it still allows livestreaming and is the slowest to remove objectionable and incorrect material. Youtube makes most of its revenue from advertising around user-generated content, including the likes of the Plandemic conspiracy theory video, which was viewed more than one million times before it was taken down. Over 1.7 million people saw it on Facebook video before it was removed.
Australia is taking a much more robust approach to Facebook and Youtube in particular. It is legislating to force them to pay up to 10 percent of their A$10 billion advertising revenues to news companies to compensate for free use of news clips in searches and news feeds. Australia has also moved faster to force these tech giants, along with Amazon, Microsoft and Apple, to pay more in tax.
Europe has also threatened the search and social media platforms with multi-billion euro fines if they don’t pay more tax, control hate speech, protect data privacy, stop bullying and squashing competitors, and compensate news providers. So far the tech giants have refused to concede much, but the pressure is growing, particularly as politicians on both sides of the Atlantic question the network monopoly power they are using to colonise adjacent markets and squeeze out rivals.
New Zealand has taken a more softly, softly stance on all of these issues, preferring to engage with the tech firms and nudge them in the right direction, along with piggybacking on the efforts of supranational organisations such as the World Health Organisation, the OECD and the European Union.
Corroding public trust and democracy
The Workshop’s Marianne Elliott led the report produced last year for The Law Foundation on ‘Digital Threats to Democracy,’ which was based on a quantitative survey of 1,000 voting age residents by UMR.
“Our research shows that it is critical that the Prime Minister and her advisors look beyond immediate concerns about violent extremism and content moderation, to consider the wider context in which digital media is having a growing, and increasingly negative, impact on our democracy,” Elliott wrote.
“Misinformation, disinformation and mal-information are undermining not only informed debate, but also public trust in all forms of information. Distraction and information overload are eroding citizens’ capacity to focus on important and complex issues, and their capacity to make the ‘important moral judgements’ required in a healthy democracy.”
The report recommended regulating the big technology platforms in the same way as other industries, which are subject to copyright, anti-trust laws, tax laws, defamation and fair trial laws. It also proposed fostering other more local digital media outfits through public funding for public interest media, bolstering data privacy laws and strengthening civics education.
But ultimately, the debate is about sovereignty, and what people in Aotearoa New Zealand are able to build and control.
Catalyst’s Christie sees an opportunity for the Government to lead the way with the hundreds of millions of dollars spent each year through its IT procurement policies to reduce Aotearoa New Zealand’s dependence on the big tech giants in a way that improves our resilience for the next pandemic or global cyber-attack.
“The argument is about how do you build enough capability within New Zealand so that for critical infrastructure, critical data sets, that are considered tāonga, for example, that we are able to be stewards of our own sovereignty and look after our own destiny,” Christie says.
“It’s important to New Zealand’s future that we have some degree of autonomy in this space because it’s increasingly omnipresent around us – the use of big data, the way algorithms are used to make decisions and the tools we use every day to work and learn and play and be healthy.”
—
The following is a video panel discussion, recorded on August 25, to go with this series of articles. Bernard Hickey moderated the panel, which included the following members:
Marianne Elliott is a co-director of The Workshop, which is an independent thinktank and consultancy group based in Wellington. She led the creation of the ‘Digital Threats to Democracy’ report for the Law Foundation last year and The Workshop has also produced reports on Digital Divides and Online hate and Offline harm.
Andrew Chen is a research fellow at Koi Tū – The Centre for Informed Futures at the University of Auckland. He edited the book just published through Bridget Williams Books called Shouting Zeros and Ones: Digital Technology, Ethics and Policy in New Zealand
Anjum Rahman is a project leader for the Inclusive Aotearoa Collective Tāhono. She is a Muslim community leader and human rights activist based in Hamilton. She co-wrote a chapter on reducing online harm in Shouting Zeros and Ones and is a chartered accountant in her spare time.
Caleb Moses is a Māori data scientist working for Dragonfly Data Science from Auckland. He is focused on using AI and machine learning, including building the first speech-to-text algorithm in Te Reo Māori.
Don Christie is the co-founder and CEO of Wellington-based IT group Catalyst, which works using open source software and technologies on projects throughout Governments and the economy.