Facebook is finally being forced to open up its algorithm

Facebook CEO Zuckerberg testifies remotely at Senate Judiciary Committee hearing about Facebook and Twitter's content moderation decisions in Washington

Credit: Reuters

In the days following the US election, something curious happened on Facebook. As the slow trickle of early results swung decisively in favour of Joe Biden, a New York Times league table of the most liked and commented news posts suddenly appeared to shift to the left.

Instead of the conservative stalwarts who usually dominate the list – Fox News, Breitbart News, and commentators such as Ben Shapiro and Dan Bongino – the top ten posts for November 7 and November 8 came from more liberal sources such as CNN, National Public Radio, the left-wing news anchor Rachel Maddow and the New York Times itself.
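League tables of this kind are built from raw engagement counts: each day, posts are ranked by their total interactions. A minimal sketch of that sort of ranking, assuming a simple list of post records (the field names and figures here are illustrative, not the schema of any real analytics tool):

```python
from operator import itemgetter

def top_posts(posts, n=10):
    """Rank post records by total interactions (likes + comments + shares)."""
    for post in posts:
        post["interactions"] = post["likes"] + post["comments"] + post["shares"]
    return sorted(posts, key=itemgetter("interactions"), reverse=True)[:n]

# Invented example figures, purely to show the mechanics of the ranking.
posts = [
    {"page": "CNN", "likes": 200_000, "comments": 55_000, "shares": 30_000},
    {"page": "Fox News", "likes": 120_000, "comments": 40_000, "shares": 15_000},
    {"page": "NPR", "likes": 90_000, "comments": 10_000, "shares": 8_000},
]

for post in top_posts(posts, n=2):
    print(post["page"], post["interactions"])
```

The point the episode turns on is that a table like this measures only who engaged, not how many people saw a post – which is why a short-lived surge of likes from one side can reshuffle the top ten without the underlying algorithm changing at all.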

Immediately questions arose: had Facebook, long accused of both bias and appeasement towards conservative media, tweaked the algorithms that order its news feed to curry favour with the next president? 

Alternatively, might its much-discussed emergency sharing restrictions, intended to cool civil unrest by reducing the prominence of political content, have starved partisans of their audience?

Neither appeared to be the case. In a rare disclosure of new data, Facebook said that the impact of its election curbs had been dwarfed by a surge in users interacting with non-conservative media. "Americans applying heart reactions on political content were off the charts, while angry reactions were closer to baseline," said one data scientist.

This is the second straight day that Facebook’s top posts are mostly from large news outlets versus right-wing Facebook pages.

It could reflect something about the election. But given how sharp and sudden the change has been, I bet Facebook changed its algorithm. https://t.co/Tw95W7w5cK

— Jack Nicas (@jacknicas) November 8, 2020

That suggested the real explanation might lie with a wave of Biden-friendly users who do not normally follow politics closely, but who had temporarily paid far more attention – while Trump supporters were already near maximum engagement.

Sure enough, the top ten has since returned to normal, even though Facebook’s curbs remain in place. Data from the analytics company NewsWhip shows little lasting change, while the media vetting app NewsGuard says it has seen no evidence that the emergency measures meaningfully stymied pro-Trump fake news outlets.

It was a perfect example of how social media companies’ long-held secrecy about their algorithms is finally breaking apart. Despite their huge power over what we read, almost nobody on the outside knows whether and how such systems might be changing society. 

Now they are being dragged uncomfortably out into the light, in no small part because of controversies like the Affaire du Bongino. 

"Facebook’s algorithms are a wildly influential force," says Ashley Boyd, vice president of advocacy at the pro-transparency Mozilla Foundation. "They determine what billions of people see, read, and ultimately believe… it’s difficult to quantify the exact impact these algorithms have. But often, we can’t even try."

She points out that the lack of information also makes it impossible to independently measure the effect of any emergency changes that social networks make to their algorithms, such as Facebook’s decision to temporarily switch off its automatic recommendation system for all political groups.

Dan Bongino speaks onstage during Politicon 2018 in Los Angeles

Credit: Phillip Faraone/Getty Images North America 

Even Bongino, a former Secret Service agent who now runs a partisan media company devoted to "owning the libs", claims to have no idea why he is so favoured by the algorithm gods – though he does note that some of his writers have made a "cottage industry" of charting their mysteries. 

Such secrecy is not without reason. Good algorithms can make or break a tech company, as in the case of TikTok, whose uncanny "For You" technology almost sank negotiations to save the app from a US ban after China refused to let the system be sold off.

The companies also fear that revealing too much about their algorithms will open them up to abuse. There is already an army of consultants and analysts devoted to reading their entrails, and foreign spies and extremist movements as well as spam marketers will always try to exploit any loopholes they can find.

Today, however, formerly technophobic politicians have algorithms squarely in their sights. Testifying before the US Senate in Washington DC on Tuesday, Facebook boss Mark Zuckerberg and Twitter’s Jack Dorsey were interrogated about the impact of their automated systems and whether their design might be influenced by their overwhelmingly progressive workforce.

Many of Facebook’s biggest scandals this year have come not from its privacy problems but from its automated news feed ranking and group recommendation systems, which have been repeatedly implicated in the spread of conspiracy theories and the growth of extremist movements.

Accordingly, both companies are now promising new levels of transparency. In the Senate, Mark Zuckerberg touted Facebook’s new outside research project, which will share masses of internal data with independent academics to assess the service’s impact on the election. 

Facebook’s chief marketing officer Alex Schultz has also said it may share its own monthly league tables in future, arguing that the New York Times lists only capture a specific type of activity on the service.

Typically, it was Dorsey – well-known in tech circles for his utopian suggestions about the future of Twitter, as well as his days-long meditation retreats – who proposed something truly unusual.

Building on previous testimony given last month, he described how Twitter hopes to let users choose their own news feed algorithms, plugging in software created by other people or companies according to their preferences.

We’re inspired by the market approach suggested by Dr. Stephen Wolfram before this committee in June 2019. Enabling people to choose algorithms created by third parties to rank and filter their content is an incredibly energizing idea that’s in reach. https://t.co/Oavx4xVskC

— jack (@jack) October 28, 2020

"Being able to turn off ranking algorithms, being able to choose different ranking algorithms that are written by third-party developers in somewhat of an algorithmic marketplace – I think [that’s] important, and a future that would excite and energise us," he told senators.

That attracted great interest from experts such as Eli Pariser, an activist and tech theorist whose 2011 book The Filter Bubble sounded an early alarm about social media’s ability to automatically sort people into political echo chambers by responding to their preferences.

"The degree to which we live in totally separate imagined nations is more dramatic than ever," he says. "To the extent that this is a concession to the notion that there is no one algorithm for human society and life, I think that’s a really important corner to turn."

In theory, opening social networks up to third-party algorithms could create a new developer ecosystem letting users not only control what they see but understand why they are seeing it. They could also try out unfamiliar algorithms to see the world through different eyes, or switch between them to suit their moods.
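One way to picture such a marketplace is as a plug-in interface: the platform supplies the candidate posts, and any third-party module that implements a scoring function can order them. A hypothetical sketch – none of these names correspond to a real Twitter or Facebook API:

```python
from typing import Callable, Dict, List

Post = Dict[str, int]
# A ranking plug-in is any function that assigns a score to a post.
RankingAlgorithm = Callable[[Post], float]

def chronological(post: Post) -> float:
    """The 'turn ranking off' option Dorsey describes: newest first."""
    return float(post["timestamp"])

def engagement(post: Post) -> float:
    """A simple engagement-driven ranker of the kind users might switch away from."""
    return post["likes"] + 2 * post["comments"]

def build_feed(posts: List[Post], algorithm: RankingAlgorithm) -> List[Post]:
    """The platform ranks the same candidate pool with whichever plug-in the user chose."""
    return sorted(posts, key=algorithm, reverse=True)

posts = [
    {"id": 1, "timestamp": 100, "likes": 50, "comments": 5},
    {"id": 2, "timestamp": 200, "likes": 10, "comments": 1},
]

print([p["id"] for p in build_feed(posts, chronological)])  # → [2, 1]
print([p["id"] for p in build_feed(posts, engagement)])     # → [1, 2]
```

The same two posts produce two different feeds depending on the chosen plug-in – which is exactly what makes the idea appealing to users, and what raises the auditing questions below.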

Yet pitfalls are easy to imagine. Die-hard partisans could choose to shrink even further into their bubbles. Enterprising conspiracy merchants might offer to cleanse followers’ feeds of the lies of the fake globe media in favour of content that affirms the truth that the world is flat. 

In that instance, would social media companies have to audit third-party algorithms? Would they sue extremist algorithm-builders for violating their terms of service? In 2024, would Facebook be under fire for shutting down the Donald J Trump Real News Feed, or failing to shut down the Vaccine Truth Bombs plug-in?

"Introducing more recommendation algorithms isn’t necessarily a silver bullet. The vast majority of consumers won’t be building these algorithms themselves, nor capable of understanding how they work," says Ashley Boyd.

"As a result, we could end up with more decentralised but equally opaque AI… if consumers can verify that an algorithm is created by a trusted party, and if those algorithms are open for third-party audits, it’s possible that social media experiences would become less divisive and more trustworthy."

Further data may shed more light on the impact of election changes. Benedict Nicholson, head of research at NewsWhip, said that social media activity about Trump’s claims of a stolen election had gone "way down" since Biden was declared the victor.

For whatever it's worth, engagement to articles about a stolen election have gone way down since election day pic.twitter.com/uXhGvJ5bnl

— Benedict Nicholson (@BenNicholsonNW) November 17, 2020

"There is nothing to indicate that these measures stopped the spread of election-process and election-counting hoaxes," says Steven Brill, co-chief executive of NewsGuard. "The number of sites publishing them has, in fact, accelerated since."

He says that the number of likes for so-called misinformation "super-spreaders", who frequently publish such claims, had also soared since the end of October.

For now, third-party statistics and sporadic corporate blog posts – plus the occasional leak – are the best available clues to what prospers on Facebook. Researchers are sceptical that the company will follow through on its promises of transparency.

"Mark Zuckerberg always promises transparency during his congressional hearings, but he rarely delivers," says Boyd. Pariser agrees: "I’d like to believe that, but I’ll believe it when I see it. Folks like me have been burned on this a lot."
