The 2020 election will not just be your usual political slugfest – it will also, potentially, be accompanied by two controversial referendums: one on recreational cannabis and one on voluntary euthanasia.

Justice Minister Andrew Little has launched a plan to combat fake news and other forms of misinformation surrounding the referendum topics. A team within the Ministry of Justice will direct questioners to reliable sources of information and keep an eye out for deliberately spread falsehoods.

While technology and communication experts say this is a step in the right direction, there are broader challenges facing anyone who truly wants to solve the problem of deliberately disseminated disinformation masquerading as news.

How do we learn what is true and what is false?

Jess Berentson-Shaw researches the science of communication and is the author of A Matter of Fact: Talking Truth in a Post-Truth World. “We have a whole lot of in-built assumptions around truth-telling and that people have an ability to innately recognise good information if you provide it to them enough,” she said.

“That turns out to be completely incorrect. That’s not how people assimilate information – it’s based far more on cognitive psychology, cultural linguistics and cultural models.”

That means merely responding to falsehoods with the truth isn’t necessarily enough to correct misconceptions.

“If how we respond to information is much more sophisticated than our common assumptions, how and what are the strategies that actually work in order to get people to both hear and respond to what we know is good information?” she said.

For young people, “there’s emerging evidence that you can train people or put them through an education process on how they might recognise false information. It’s about teaching children quite early on how to identify what is potentially misinformation.”

Another method is “pre-bunking, which is where, before people are exposed to bad information, you can get in there first and let them know that they might hear this information and it will be false,” Berentson-Shaw said. “That really relies on the ability to get in there first.”

A demographic challenge

Berentson-Shaw said that pre-bunking is more effective than debunking after the fact. It’s also more likely to succeed when working with older people, who are not the usual targets of digital literacy programmes.

In September, the Office for Seniors offered $600,000 in funding for proposals to deliver digital literacy training to seniors. The agency is currently reviewing the proposals it received.

The internet poses real challenges for people who didn’t grow up with it, according to Dr Catherine Strong, a senior journalism lecturer at Massey University who studies social media and fake news.

“For senior citizens, it’s not just like with students, to whom you say, ‘Here’s a new concept: media literacy’. For them, it’s unlearning decades of media literacy,” she said.

“For decades, they had the newspaper delivered to them on their doorstep and they didn’t have to wade through a whole lot of fake newspapers on their doorstep. They would turn on the national programme on the radio every day, it would just come in. There wasn’t a choice. They had loyalty and they could rely on that.”

Andrew Cushen, the engagement director at InternetNZ, concurred. “People spent a large part of their adult life not having to worry about whether or not they had to critically read the news. The context of what news is and the vast increase in the plethora of sources is a rendering of the digital immigrant challenge,” he said.

Berentson-Shaw agreed. “I think it’s a reasonable hypothesis to have an older person looking on social media and not really having that sense that things are being manipulated,” she said.

The link between age and falling for fake news is, of course, a generalisation – there are plenty of seniors who are tech-savvy and plenty of young people who struggle with the internet. That said, a study published in Science Advances in January observed a correlation between age and propensity for sharing fake news.

Respondents voluntarily gave the study’s researchers access to their Facebook profiles, meaning the researchers didn’t have to rely on self-reported data about sharing habits. The study found that 11 percent of people over the age of 65 shared at least one fake news article over the span of about a month in 2016, compared to just 3 percent of users between the ages of 18 and 29.

At the same time, the study noted that overall rates of sharing fake news were low – more than 90 percent of respondents didn’t do so, even though a majority shared more than 100 links to websites. Moreover, the study did not have access to users’ News Feeds. It’s possible that older people were exposed to more fake news links and that is why they shared them more.

A chart from the Science Advances study.

“People don’t know what they don’t know”

Organisations like Netsafe and the Electoral Commission are also thinking about digital literacy for people who didn’t grow up with the internet.

“Netsafe has been a public educator on online safety for twenty years,” said Martin Cocker, the CEO of Netsafe. “The reality is, when we’ve been talking about digital literacy in the past, it’s only really been recently that the reality of deepfakes and cheapfakes has been something that we’ve had to factor in.”

“Certainly, the broader concepts of media literacy and digital literacy, we’ve been trying to promote for as long as the organisation’s been around.”

Nonetheless, outreach can be tough – particularly for people who aren’t in schools or universities.

“They’re not in a school environment, they’re not all in one place. It’s not easy to do outreach,” Strong said. Moreover, even if the resources – such as digital literacy training – are out there, people might not sign up for them.

“In a lot of cases, people don’t know what they don’t know.”

Many people are aware of the basics of fake news but don’t realise how pervasive misinformation can be, Berentson-Shaw said.

“There’s a really interesting study that was done in the US around conservative Christians who were aware that there might not be truth-telling by certain politicians but they would use the internet as an inquiry process,” Berentson-Shaw said.

“They would check out whether something that was said by a certain politician was in fact accurate or not, similar to a Bible inquiry, where they were really seeking whether this was true and looking for multiple sources.”

“They knew the information could be manipulated but what they didn’t realise was how far that manipulation reached – into YouTube or Google algorithms. Their understanding of the manipulation of information was a relatively explicit one and they didn’t understand how things like algorithms work or how algorithms are so individualised and tailored on social media.”

Impact of fake news

Not only are experts unsure of exactly how to address digital illiteracy, but many admit that the effect of fake news itself is equally unclear.

Much of the coverage of fake news, in New Zealand and abroad, has been influenced by the 2016 presidential election in the United States, in which fake news was overwhelmingly pro-Trump in orientation. Some of the fake news articles circulating during that election were created by Russian-paid trolls.

But swaying an election doesn’t seem to be the primary concern of experts in New Zealand. “What is my concern for 2020 and the election in New Zealand? It’s maintenance in public confidence, rather than being concerned that someone is going to genuinely change the outcome of our election,” Cocker said.

“It undermines people’s confidence in election results and election integrity. We don’t know that there’s a major concern about the ability to use technology to alter the outcome of an election, but if there’s enough noise about it, then people lose confidence in the outcome of the election.”

The Electoral Commission is also keeping an eye out for fake news this election cycle. “The Electoral Commission is aware of the potential for the spread of false information in elections. False information has always been an issue in elections, but technology is creating new and different ways to spread it, particularly online and on social media, so digital literacy is important,” a commission spokesperson told Newsroom.

“In the lead up to the election, we’ll be reminding voters to check the source of any election information they see. Where has it come from? Does it say who’s behind it? Is it a credible source? If those things aren’t obvious, be sceptical.”

Even as the Electoral Commission, the Ministry of Justice and other agencies and organisations gear up to deal with the impact of online fake news, Berentson-Shaw and others caution that this isn’t just a problem of technology, and that digital literacy won’t solve it all.

“Fake news has always been here. The nature of the digital media has allowed, specifically, this further and further reach of it but also this manipulation based on individuals which nobody else can see. There’s this opaqueness that didn’t exist before,” Berentson-Shaw said.

This is something that InternetNZ emphasises as well. In a paper on misinformation, it writes, “Anyone can tell you, misinformation is not new. Propaganda has been used since the beginning of humankind. But we have never seen the speed or the ease in which propaganda can be shared.”

“What’s happening with fake news is something that has happened for a long time. We’ve seen the manipulation of narratives and research by people with dodgy motives for decades now. That in itself is not new,” Berentson-Shaw said.

“Until people look a little bit deeper and get a little more sophisticated in understanding the ways in which information is shaped and especially how people come to believe the things they do, the tools [of digital literacy] are never going to fix it for us.”

Marc Daalder is a senior political reporter based in Wellington who covers climate change, health, energy and violent extremism. Twitter/Bluesky: @marcdaalder
