Analysis: A Facebook executive has penned a lengthy blog post defending the company against claims its algorithms are responsible for upticks in polarisation and extremism, but do his arguments hold up? Marc Daalder reports
Facebook is pushing back against criticisms of its algorithms and claims that the site is responsible for increased political polarisation and extremism.
In a lengthy blog post, Vice-President for Global Affairs and Communications Nick Clegg – a former Deputy Prime Minister of the UK and former leader of the Liberal Democrats – defended the company from accusations made in the Netflix documentary The Social Dilemma, as well as by high-profile commentators in magazines and books.
The defence coincides with an announcement from Facebook that users will now have more control over how algorithms structure their News Feed. This includes making the option to turn off the algorithm and view posts chronologically easier to find, as well as a new algorithmically curated “favourites” News Feed that users can add pages, groups and people to, and remove them from.
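As a rough illustration of the difference between those options, the sketch below contrasts an engagement-ranked feed, a purely chronological feed and a user-curated “favourites” feed. It is a minimal sketch only, not Facebook’s actual code, and every name and figure in it is hypothetical.

```python
# Illustrative sketch only: three ways a feed could be ordered, loosely
# mirroring the options described above. Nothing here is Facebook's code.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    created_at: datetime
    predicted_engagement: float  # stand-in for whatever signals a ranker uses

def ranked_feed(posts):
    # Default behaviour: order by a predicted-engagement score.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

def chronological_feed(posts):
    # The "turn off the algorithm" option: newest first, no scoring at all.
    return sorted(posts, key=lambda p: p.created_at, reverse=True)

def favourites_feed(posts, favourites):
    # Only posts from the pages, groups and people the user has added themselves.
    return ranked_feed([p for p in posts if p.author in favourites])
```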
‘You and the Algorithm’
In the title of his post on Medium, Clegg argued that “it takes two to tango”. In other words, people who have experienced adverse outcomes from Facebook’s algorithm, including radicalisation and increased negative feelings about political opponents, are partly to blame because they clicked on some of this content in the first place.
The algorithm, Clegg writes, is designed to reflect what we show to Facebook, giving us more of what we have indicated we like and find meaningful.
However, Clegg also said the algorithms don’t solely put up content that people are more likely to engage with. “Facebook’s systems are not designed to reward provocative content. In fact, key parts of those systems are designed to do just the opposite,” he wrote.
That involves deprioritising misleading, sensationalist and clickbait posts. Instead, more meaningful content – like photos and life updates from family and close friends – is prioritised.
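One way to picture that claim is the hypothetical scoring sketch below, which demotes posts flagged as misleading, sensationalist or clickbait and boosts posts from close friends and family. The weights are invented for illustration and are not Facebook’s.

```python
# Hypothetical scoring sketch of the prioritisation described above.
# The weights are invented for illustration; this is not Facebook's ranking model.
def score_post(post: dict, close_contacts: set) -> float:
    score = post["predicted_engagement"]

    # Demote misleading, sensationalist and clickbait posts rather than
    # rewarding provocation.
    if post.get("is_clickbait") or post.get("is_sensationalist") or post.get("is_misleading"):
        score *= 0.2  # illustrative penalty, not a real figure

    # Boost "meaningful" content such as updates from family and close friends.
    if post["author"] in close_contacts:
        score *= 2.0  # illustrative boost, not a real figure

    return score

# Example: a parent's photo outranks a higher-engagement clickbait Page post.
friend_photo = {"author": "mum", "predicted_engagement": 1.0}
bait_post = {"author": "SomePage", "predicted_engagement": 3.0, "is_clickbait": True}
assert score_post(friend_photo, {"mum"}) > score_post(bait_post, {"mum"})
```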
When CEO Mark Zuckerberg announced this change in 2018, Clegg wrote, “he recognised explicitly that this shift would lead to people spending less time on Facebook because Pages — where media entities, sports teams, politicians, and celebrities among others tend to have a presence — generally post more engaging though less meaningful content than, say, your mum or dad. The prediction proved correct, as the change led to a decrease of 50 million hours’ worth of time spent on Facebook per day, and prompted a loss of billions of dollars in the company’s market cap.”
“The company’s long-term growth will be best served if people continue to use its products for years to come. If it prioritised keeping you online an extra 10 or 20 minutes, but in doing so made you less likely to return in the future, it would be self-defeating.”
Content moderation, Clegg said, shouldn’t be left solely to big tech companies.
“It is entirely reasonable to argue that private companies shouldn’t be making so many big decisions about what content is acceptable on their own. It would clearly be better if these decisions were made according to frameworks agreed by democratically accountable lawmakers. But in the absence of such laws, there are decisions that need to be made in real time,” he wrote.
Experts welcomed the changes announced by Facebook, but criticised Clegg’s accompanying post.
“Admittedly, it is a refreshing departure from statements that have come from the company suggesting that connecting people and Facebook dishing out the content is the way forward and there is nothing wrong with it. What we are finding more and more, which is a good thing, is that the company is taking cognisance of the manner in which the platform delivers news, how it is consumed and offline impacts of what is on the platform,” Sanjana Hattotuwa, a special advisor at ICT4Peace who studies the interaction between extremism and the internet at the University of Otago, told Newsroom.
However, he was more critical of Clegg’s arguments.
“It’s very strategic in not having a nuanced approach. It’s very crafty. It’s a really well-crafted document that has some good in it, but it’s also really deflecting a lot of the criticisms. He tries to suggest that all of us who are criticising [Facebook] are without merit and rabble-rousing,” he said.
That was echoed by InternetNZ senior policy advisor Nicola Brown, who said Clegg was being “wilfully naive”. And Kate Hannah, who works on online disinformation and extremism at the University of Auckland, said the post was “an impressive piece of drawing on several different strands of ways in which one might deflect arguments around the responsibilities of the platforms, and in particular Facebook as the oldest platform and the most ubiquitous platform for most users”.
The funhouse mirror
Andrew Chen, a research fellow at the University of Auckland’s Koi Tū – Centre for Informed Futures and an expert on technology and society, said Clegg’s post offered up a valuable but flawed idea: Facebook’s algorithms as a mirror of our wants and values.
“It is an interesting psychological argument around, ‘Well, the primary input into the algorithm is the user’s choices and selections and therefore the algorithm is a reflection of that user’. And I think that is true to some degree,” he said.
“Where it falls over is that it is not a 100 percent true and accurate reflection of ourselves. It distorts who we are. If you treat the algorithm as a mirror, well, I think it’s actually a funhouse mirror where it will over-exaggerate certain parts of you and your values and your personality because you are only giving the algorithm certain parts. That’s kind of why we see people being pushed down radicalisation rabbit holes, because those are the parts that get exaggerated.
“The problem with the mirror is that it’s not simply the reflection – it is also a nudge. It shifts our personality and values in the direction that the mirror is showing us. Over time, it gets more and more distorted. So I think that argument makes some sense but it has a limitation.”
In particular, Chen identified three ways in which the algorithm’s interpretation of a user is not a perfect reflection. First, there are some things that we never show to the algorithm – values, thoughts and interests that we don’t engage with via the internet, or at least not on Facebook.
Second, the algorithm has its own weights for what we show to it. For example, your interest in your family members may be weighted as more “meaningful” than your interest in particular cooking videos, and may therefore be correspondingly prioritised.
“The first thing you’re talking about is what you show to the mirror, what part of your body you put in front of the mirror, and then the second part is the shape of the [funhouse] mirror,” Chen said.
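Chen’s second point, that the platform applies its own weights to whatever users show it, can be pictured with the hypothetical sketch below. The interaction categories and numbers are invented for illustration; they are not Facebook’s.

```python
# Hypothetical illustration of Chen's point that the platform rescales user
# behaviour by its own weights. The categories and numbers are invented.
INTERACTION_WEIGHTS = {
    "family_member_post": 3.0,   # treated as more "meaningful"
    "close_friend_post": 2.5,
    "cooking_video": 1.0,
    "page_clickbait": 0.3,
}

def weighted_interest(interaction_counts: dict) -> dict:
    # Raw behaviour is rescaled by the platform's own priorities, so the
    # "reflection" is already distorted before anything else happens.
    return {
        kind: count * INTERACTION_WEIGHTS.get(kind, 1.0)
        for kind, count in interaction_counts.items()
    }

# A user who watches twice as many cooking videos as they interact with
# family can still end up with family content weighted higher.
print(weighted_interest({"cooking_video": 10, "family_member_post": 5}))
# -> {'cooking_video': 10.0, 'family_member_post': 15.0}
```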
Third, there are third parties who take advantage of Facebook and its algorithms to reflect certain parts of our identity back to us.
“I don’t think [Clegg] touches on that at all. He centres the conversation between the user and Facebook but doesn’t talk about the role of other users who may be leveraging the Facebook platform to get to users,” Chen said.
“In the worst case we can point to Cambridge Analytica. But there are plenty of others, from political parties to terrorist groups, who use Facebook as a platform to get to people. In the framing that Clegg sets up, it becomes Facebook’s responsibility to protect users from those sorts of entities, but we know that those entities use Facebook’s algorithm against users to get their content out there. That’s influencing the design or shape of the mirror.”
Hattotuwa agreed, saying Clegg even seems to contradict himself on these issues.
“Clegg takes some pains to suggest that there are no filter bubbles, there is no algorithmic harm, that polarisation is not a problem and it’s certainly not a problem that Facebook contributes to. These are all very, very loaded things,” Hattotuwa said.
“But then he also says, and I quote, ‘There is no editor dictating the front page headline millions will read on Facebook. Instead, there are billions of front pages, each personalised to our individual tastes and preferences, and each reflecting our unique network of friends, Pages, and Groups.’ Now, if that’s not a filter bubble, I don’t know what is. Moreover, in that example alone, it is very clear that Facebook is the editor.”
Informed consent
Chen said the idea of the algorithm as a mirror also implies that people choose to interact with it and display parts of themselves to it in full knowledge of what will happen with that data. That, he said, is generally not the case.
Not only do people not understand how the algorithm might weight their information, but they also may not know when they’re surrendering information to Facebook in the first place.
“Facebook’s, for want of a better term, surveillance extends well beyond the platform. They’ve got tracking pixels in a lot of websites that are embedding Facebook things, there are also apps like Instagram and WhatsApp that are part of the Facebook family that people may not necessarily consider to be interacting with the Facebook platform. They’re all feeding data back into the system,” he said.
“When he says you are interacting with the Facebook algorithm, you are doing that in a lot of places and times when you may not be aware that you’re doing that. So that removes agency. I just have real problems with Clegg implying: ‘Oh, well, if you don’t like white supremacy content, don’t click on it. Don’t tell Facebook that you have an interest in it and we won’t show you any of it’.”
Brown agreed, pointing to the wide reach of Facebook and Google on websites and apps well beyond Facebook.com and Google.com.
“Between Google and Facebook, almost every website you ever visit has a little piece of code embedded on the back that lets them know that you’ve been there. They do try and create some transparency around that but I wouldn’t call it meaningful consent by any stretch of the definition,” she said.
“You can’t just look at Facebook. About 60 percent of New Zealanders access Facebook once a day, but if you add Instagram and WhatsApp to that, between these three Facebook platforms, New Zealanders are spending so much of their time. Which I think is something we need to take really seriously – and Facebook needs to take seriously.
“If they know that they’ve got that percentage of New Zealand’s attention, they have to come to the table and talk about how these things can work for us. We need to keep demanding more from these social media companies. They’re asking us to be our public square without giving us any agency in return.”
Indeed, InternetNZ’s research found that Facebook was the most-used social media platform or communication technology in New Zealand. Four of the top five platforms and communication methods were owned by Facebook: Facebook itself, Facebook Messenger, Instagram and WhatsApp.
Facebook as a supermarket
Clegg also outlines his view of Facebook’s role in determining the shape of the mirror via an analogy: Facebook as a partner cooking dinner from ingredients the user has chosen.
“Imagine you’re on your way home when you get a call from your partner. They tell you the fridge is empty and ask you to pick some things up on the way home. If you choose the ingredients, they’ll cook dinner. So you swing by the supermarket and fill a basket with a dozen items,” Clegg writes.
“Of course, you only choose things you’d be happy to eat — maybe you choose pasta but not rice, tomatoes but not mushrooms. When you get home, you unpack the bag in the kitchen and your partner gets on with the cooking — deciding what meal to make, which of the ingredients to use, and in what amounts. When you sit down at the table, the dinner in front of you is the product of a joint effort, your decisions at the grocery store and your partner’s in the kitchen.
“The relationship between internet users and the algorithms that present them with personalised content is surprisingly similar. Of course, no analogy is perfect and it shouldn’t be taken literally. There are other people who do everything from producing the food to designing the packaging and arranging the supermarket shelves, all of whose actions impact the final meal.”
Experts said that analogy falls short for a number of reasons – not least that there are certain things you can’t purchase at a supermarket (like cyanide) but the social equivalent of those items remains available on Facebook.
“Sure, you can buy the food you know you like at the supermarket, but there are people [who] paid a lot more money to get their products on those end stands than anywhere else on the shelf,” Brown said.
“And they would have put cyanide in that product until that was illegal. You’ve got to ask how many people, when they started using Facebook, were asking for the cyanide? Versus how many people ask for it over a long period of time? There’s no way to know that answer unless you’re working inside of Facebook.”
“You don’t have to go as extreme as cyanide. It’s also just junk food and unhealthy food,” Hattotuwa said. “But my point is that, at the end of the day, if Facebook is a supermarket, it crafts and curates and contains and controls and indeed even censors that which you can buy from. It shapes the shelves. Without getting too philosophical around free will and what not, it literally shapes the way that you shop.”
Hannah said the analogy implies Facebook’s algorithm is an equal partner with its users.
“The analogy he gives is the analogy of one partner. One partner has an innate, in-depth knowledge of your likes and dislikes. But they also have an innate, in-depth knowledge, hopefully, of what you ate the day before, and what you might be going to eat the next day and what kind of mood you’re in,” she said.
“A better analogy would be, if you were doing an online shop and rang Countdown or New World and said, ‘Can you put in some ingredients for me to have dinner tonight? Just surprise me.’ Facebook isn’t our friend or our partner who will always make decisions based on what’s good for us at that time. They’re a stranger.”
Chen said this is where regulation of big tech companies comes in.
“I think this is where you hit the intersection with regulation. The supermarket is making decisions about what products it stocks on the shelf and if the owners of the supermarket were ethical, it probably would choose not to stock cyanide – but at the same time, there are regulations that prevent the supermarket from stocking those products,” he said.
Regulating Facebook
Although Clegg himself called for regulation of tech companies “by democratically elected institutions”, Chen saw that as a cynical move.
“In an ideal world, I don’t think tech companies should be the ones making those decisions. Where my view kind of differs is that that pushes the onus back on government and I have relatively little faith that the Government will be able to act in a way that effectively addresses these problems. They’ve had plenty of time to act and they haven’t done so yet. The Christchurch Call is the closest thing we have to international cooperation on these sorts of issues. They’re trying, but it’s slow and it’s not going to be able to respond at the pace that technology is continuing to develop and proliferate, and all the while, harm is being accrued,” Chen said.
“If governments are not going to act, then the onus does fall back on the big tech companies to at least do something. I’m pointing to things like the Donald Trump ban, that was an example. The government’s not going to deplatform Donald Trump, but he was clearly causing harm and the tech companies had to do something about it.
“I also don’t buy the overall argument for a different reason, which is that big tech companies have been making these decisions for 20 years, because government has not. And that is what has allowed the big tech companies to get to where they are in the first place. To then say, ‘We want to wipe our hands clean and we don’t want to have this sort of power that we have given ourselves,’ well, sorry, if you are going to build and wield this massive amount of power, then you probably have to accept the responsibilities that come along with it.”
Hannah was also critical of Clegg’s apparent desire for regulation.
“He does also mention that, ‘in the absence of such laws we’re doing this out of the goodness of our heart because there are no laws, currently, that make us do things. So aren’t we nice, aren’t we good?’ But we all know that Facebook is complicit in censoring speech where governments ask them to censor speech,” she said.
“It is completely a front to say that they were just waiting for governments to regulate them. What they’re saying is: ‘We think it’s going to take forever and so we can get away with making these good noises at this time’.”
An ahistorical perspective
Hattotuwa, who has studied the role of social media in exacerbating and inciting anti-Muslim violence in Sri Lanka in 2018 and 2019, took issue with what he called Clegg’s “ahistorical perspective”.
“He suggests that if you take away Facebook, polarisation wouldn’t decrease. In a country like Sri Lanka or India or many other countries that have protracted conflict and deeply divided societies unlike in New Zealand, you can’t take an ahistorical perspective,” he said.
“You can’t suggest that just because you take Facebook out and just because you give users more control, that the contributions of well over a decade done by Facebook and made by Facebook to exacerbate those divisions [are erased]. You can’t take away the net effects and the net harm and then now say that, actually, taking away Facebook doesn’t reduce polarisation.”
Brown agreed, saying the effects of Facebook’s long history of misusing user data and inciting harm can still be felt today.
“To be fair, this happened a long time ago but it’s still part of Facebook’s DNA: Facebook is the company who in 2010 performed an experiment on its users to see if it could affect voter turnout in the United States. They may be really committed to building meaningful online experiences for you today, in 2021, but the choices they’ve made through their entire existence still influence the feeds that you end up with today,” she said.
“The groups that might have been promoted to you pre-2018, before they decided to stop promoting certain kinds of content, you might still be in those groups. They can’t remove themselves from that historical context and it would be dangerous to do so.”
“It’s a bizarre thing, right? You sell crack for 10 years and then you say, okay, I’m going to stop selling you crack so you are less addicted to it,” Hattotuwa said.
In his blog post, Clegg pushes back on arguments that Facebook is responsible for historical polarisation and violence. He writes that the “goal is to make sure you see what you find most meaningful – not to keep you glued to your smartphone for hours on end”. Moreover, Clegg argues, Facebook isn’t engineered to prioritise provocative content.
However, New York Times tech columnist Kevin Roose tracks the 10 top-performing links posted to Facebook each day. Consistently, eight or nine of the top 10 come from Fox News commentators, far-right trolls and ultraconservative meme pages. The remainder are often sourced from equally sensationalist pages at the opposite end of the political spectrum.
More significantly, a United Nations investigation found in 2018 that Facebook had played a “determining role” in the ethnic cleansing of Rohingya Muslims in Myanmar.
“It has … substantively contributed to the level of acrimony and dissension and conflict, if you will, within the public. As far as the Myanmar situation is concerned, social media is Facebook, and Facebook is social media,” the chair of the fact-finding mission reported.
“I’m afraid that Facebook has now turned into a beast, and not what it originally intended,” a UN investigator added at the time.
Hattotuwa’s own work has come to similar conclusions.
Clegg’s assertion that Facebook isn’t designed to keep people online for longer “is completely contradicted by my doctoral research,” he said.
“I can name specific incidents of communal violence of a significant magnitude in the country where the content most shared and engaged with contributed to not just offline harm, but contributed to users engaging with the content on Facebook for a much longer period of time. I’m not talking about five or 10 minutes more. I’m literally talking about, at the height of offline violence, users in the country engaging with the most incendiary content from about 8 o’clock in the morning to about 11.30 at night.
“It’s toxicity and pollution for well over a decade, and where is the contrition or acknowledgement?”