Analysis: A new book suggests we can understand and combat fascism online as if it were an infectious disease, Marc Daalder reports
The sentencing of the March 15 terrorist last week was a momentous occasion, but it didn’t close the book on New Zealand’s experience with online fascism.
As Newsroom has previously reported, New Zealand-based white supremacists have interacted with members of overseas far-right groups and discussed the launch of domestic terror cells. At least two Defence Force-linked people have been tied to the far right – one, a soldier, was arrested and his case is in the military justice system, while the second, an army reservist, was a member of the white supremacist group Action Zealandia.
Now, twin essays by coder and tech-and-society expert Serena Chen offer a model for combatting online fascism drawn from a discipline that has become all too familiar over the past eight months: epidemiology.
Understanding online fascism
Chen is quick to caution, both in person and in her essays, that she developed this framework after the March 15 terror attacks and well before Covid-19 emerged into the global consciousness. The idea sprang from a podcast she had been listening to about infectious diseases as she struggled to feel like she – or anyone – had any agency or ability to combat the amorphous spread of far-right ideology online.
“When I started to do a little bit of research into how these groups work, how they get influenced, what the purpose of some of these groups are and how it’s starting to take over online spheres, I got kind of dizzy. Just from the vast amount of what was going on,” Chen told Newsroom.
“I found it especially difficult to process and to understand it to a point where I felt like I could have some clear actions or just a direction to go to either counter it, or avoid it, or help friends and family not fall down this rabbit hole. What was missing for me was some kind of clear, solid framework that I could hang all of these very disparate and seemingly unrelated ideas onto in a way that allowed me to view what was going on as a whole.”
While listening to a podcast hosted by two epidemiologists, the similarities just clicked, Chen said.
“In epidemiology, they have to deal with phenomena where you’re basically just seeing all of the symptoms and you have almost zero knowledge about what is happening underneath – especially for new diseases and new pathogens. They have really great frameworks to conceptualise seemingly disparate symptoms and phenomena and to bring that together and let you see it not as a straight, cause-effect model, but more as a stochastic, probabilistic [model].”
This reflected the way Chen had observed online fascism operating. Not everyone follows a straight path from (usually) lonely, young, white man to mass shooter. The online far-right is more a twisted network than a line from Stefan Molyneux fans to outright neo-Nazis.
“This analogy provides us with helpful ways of recognising this highly connected, internet brand of terrorism,” Chen writes in her first essay.
“For instance, we can look at what makes people ‘susceptible’ and ‘immune’ for ideas on how to guard against extremism. We can recognise that these ideas do not live in a vacuum, and consider the spread of the idea in regards to the people and the environments in which it lives. We can recognise that this model is dynamic and constantly evolving.
“New ‘strains’ (offshoot ideologies), ‘hosts’ (people who make and spread the propaganda), ‘vectors’ (propaganda techniques) and ‘environments’ (websites, platforms, but also new socio-economic and political shifts) can emerge. Instead of looking for direct chains of causation, we can see stochastic feedback loops that make certain behaviours more or less likely.”
Under the epidemiological framework, we can understand the recruitment and radicalisation of new people into the far-right as an act of transmission – one which can occur via different vectors. Chen outlines a number of these vectors: the framing of racist and sexist “jokes” as apolitical, the exploitation of real socio-economic and mental health issues to promote far-right ideals as a supposed solution, and the incremental process by which fascist ideology can be mainstreamed into political discourse, as has happened in the United States with the backlash to protests against racial injustice, or in New Zealand with the UN migrant compact.
Certain people are also more susceptible to being “infected” by these ideologies, Chen writes. The demographics are familiar: generally young white men, often isolated from others or disillusioned with their lives and opportunities. But just as Covid-19 can severely affect even people outside the most vulnerable demographics, no one is immune to online fascism by virtue of their skin colour, age or gender.
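The analogy Chen draws maps neatly onto the simplest model in epidemiology, the SIR (susceptible–infected–recovered) model. A minimal stochastic sketch of that model is below; all numbers are illustrative placeholders chosen for this example, not estimates from Chen’s essays – “infection” stands in for radicalisation, “recovery” for disengagement, and `beta` and `gamma` for per-contact transmission and recovery probabilities.

```python
import random

def simulate(population=1000, initial_infected=5, beta=0.3, gamma=0.1,
             steps=100, seed=42):
    """Stochastic SIR-style simulation: returns a list of (S, I, R) counts."""
    random.seed(seed)
    s = population - initial_infected  # susceptible
    i = initial_infected               # infected (here: radicalised)
    r = 0                              # recovered (here: disengaged/inoculated)
    history = [(s, i, r)]
    for _ in range(steps):
        # Each susceptible person has a chance of infection proportional
        # to the currently infected share of the population.
        new_infections = sum(
            1 for _ in range(s) if random.random() < beta * i / population
        )
        # Each infected person independently has a chance of recovering.
        new_recoveries = sum(1 for _ in range(i) if random.random() < gamma)
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

history = simulate()
```

The interventions discussed later in the piece correspond to the model’s parameters: deplatforming and “flooding the vectors” lower `beta` (transmission per contact), while inoculation and “pre-bunking” shrink the susceptible pool before the outbreak reaches it.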
Cutting the chains of transmission
In her second essay, Chen asks the crucial question: If we can understand fascism like we understand pathogens, can we fight it like one too?
“In the case of the common flu, [it can be fought through] frequent handwashing (neutralising the pathogen), vaccination (reducing the susceptible population) or social/physical distancing (removing the vectors of transmission). We can take a similar approach to online fascism: What are the different ways of dismantling each pillar to protect each other from further harm?”
To start with, we can cut off the vectors of transmission. With Covid-19, this is done through wearing masks and staying distant from one another. For online fascism, Chen suggests, we can refuse to offer far-right actors a platform, or overwhelm hateful messages with kind ones – or both.
The worst actors can be deplatformed. While this may raise the risk of driving dangerous behaviour underground where it can no longer be monitored, a more nuanced understanding of online fascism and stochastic terrorism can help us arrive at the right decision.
The value of silencing those actively encouraging strangers to acts of violence outweighs the value of being able to monitor their operations in the public sphere. Besides, the most dangerous organisations – far-right terror cells like The Base or Atomwaffen Division – won’t plan their own attacks on public forums anyway. That behaviour is confined to encrypted chats on apps like Telegram, while mainstream platforms like Twitter or Facebook are used solely to propagandise and incite violence.
For others, the calculus is less clear. What about Jordan Peterson or Stefan Molyneux, whose content has been demonstrated to be a part of the pathway to radicalisation? While deplatforming them could reduce the number of people they might be able to reach, it could also risk inciting a backlash and further radicalising their supporters.
Another short-term option is what Chen calls “flooding the vectors”. This involves posting positive or deradicalising content with hashtags often used to spread hate. When bad actors attempted to spread images of the murder of 17-year-old Bianca Devins on social media in July 2019, teenagers uploaded “positive spam” to those same hashtags.
“In the online environment, speech is action. And when faced with techniques such as Google bombing (creating false or misleading webpages to which specific search terms lead) and filling data voids (creating false or misleading content for niche search terms that don’t have much prior content, so that your content is the only voice that shows up), often the best way of responding is to use the same techniques but with better content,” Chen writes.
Alongside cutting chains of transmission, protecting the vulnerable is another tried and true strategy in epidemiology. For Covid-19, this looked like keeping rest homes and those with relevant preexisting conditions in Level 4-like environments even while the rest of the country moved down the alert levels. Eventually, it will be accomplished through vaccination.
For online fascism, this can be done through educational campaigns and, in the long-term, society-wide shifts in how we think about race, religion and masculinity. Those who are vulnerable to radicalisation because they feel isolated can be protected through reinvigorating community.
“If the pipeline of radicalisation preys on the lonely and the isolated, and if it depends on the continued isolation of these people from their local communities, then fighting against fascist recruitment means strengthening communal bonds. This means doing the uncomfortable and difficult work of talking to friends and family members who are vulnerable to radicalisation, or in the midst of being radicalised,” Chen writes.
Inoculating against online fascism
The equivalent of vaccination is inoculation or “pre-bunking”, a concept spoken about by Jess Berentson-Shaw, who researches the science of communication and is the author of A Matter of Fact: Talking Truth in a Post-Truth World.
“There’s no really effective treatment once people have been exposed to misinformation. That’s what the research shows us – it’s really hard to remove misinformation once people have been exposed to it a few times,” she told Newsroom in August.
“Inoculation is a much better approach. In Finland, they’ve started to do specific inoculation in schools. That’s where you’re actually telling children they are likely to be exposed to misinformation. You’re exposing them to the fallacy of it, the motivations behind misinformation, the intent behind it, the strategies and tactics which they will be exposed to.”
While Berentson-Shaw has mostly focused on anti-vaccination sentiment in New Zealand, Chen believes the same principles can be applied to the far-right.
“The reason why so many are susceptible to the talking points of the alt-right is similar to why otherwise rational people might fall for outlandish conspiracy theories (or why a naïve population is susceptible to a new virus): they’ve never encountered these arguments before, nor do they know any counter-arguments,” she writes.
“Pre-empt people’s exposure to the common talking points of the alt-right, and you can arm them with counter-arguments before they are exposed in the wild. Much like how we can vaccinate against some pathogens by exposing people to a weaker version, exposing people to extremist talking points in a safe way can teach them how to respond when they eventually encounter it in the wild.”
This is also the sort of work being done by Caleb Cain, a former “race realist” who now works with American University to develop content that could inoculate vulnerable people – especially students – against far-right talking points.
Of course, not everyone has the resources of a university at their fingertips. Chen notes that, just as fighting Covid-19 required a range of different actions from a range of different people and institutions (we washed our hands while the police ensured lockdown rules were followed), combatting online fascism requires individuals, civil society, the tech sector and government to get on board.
“Everyone has an important role to play. Even if you are not an expert in the field, even if you don’t go online. No matter how embedded you are, how much power and influence you have or even how much you care, everyone is important in this ecosystem and has a role to play,” she told Newsroom.
“What I wanted to get across was that there are a large variety of actions that we can take. The other dimension to think about it is, who are you in this ecosystem and what actions are you taking as an individual, what actions are you taking as a community, what actions are you taking as a part of your local legislative body? Are you in a company, are you in government?
“It does us less good to concentrate solely on a specific action to take. In fact, we have to recognise that everyone has a role to play, much like in eradicating a real disease.”
Chen’s essays – “The Spread of Online Fascism / Te Horapa o te Mana Whakamatua Kotahi i te Ao Tuihono” and “A Framework For Response / He Anga Urupare” – can be found in Shouting Zeros and Ones: Digital Technology, Ethics and Policy in New Zealand, edited by Andrew Chen and published by Bridget Williams Books.