Algorithms and echo chambers —

Did Facebook fuel political polarization during the 2020 election? It’s complicated.

There's strong ideological segregation, but proposed interventions didn't change attitudes.

Over the last several years, there have been growing concerns about the influence of social media on fostering political polarization in the US, with critical implications for democracy. But it's unclear whether our online "echo chambers" are the driving factor behind that polarization or whether social media merely reflects (and arguably amplifies) divisions that already exist. Several intervention strategies have been proposed to reduce polarization and the spread of misinformation on social media, but it's equally unclear how effective they would be at addressing the problem.

The US 2020 Facebook and Instagram Election Study is a joint collaboration between a group of independent external academics from several institutions and Meta, the parent company of Facebook and Instagram. The project is designed to explore these and other relevant questions about the role of social media in democracy within the context of the 2020 US election. It's also a first in terms of the degree of transparency and independence that Meta has granted to academic researchers. Now we have the first results from this unusual collaboration, detailed in four separate papers—the first round of over a dozen studies stemming from the project.

Three of the papers were published in a special issue of the journal Science. The first paper investigated how exposure to political news content on Facebook was segregated ideologically. The second paper delved into the effects of a reverse chronological feed as opposed to an algorithmic one. The third paper examined the effects of exposure to reshared content on Facebook. And the fourth paper, published in Nature, explored the extent to which social media "echo chambers" contribute to increased polarization and hostility.
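To make the difference between the two feed designs in that second paper concrete, here is a minimal sketch in Python. Everything in it is a hypothetical stand-in: the `Post` fields, the engagement weights, and the scoring function bear no relation to Facebook's actual ranking system, which is proprietary and far more complex. A reverse chronological feed simply sorts by timestamp instead of by a ranking score.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    created_at: datetime
    likes: int = 0        # hypothetical engagement signals a ranking
    comments: int = 0     # model might weigh; real systems use many
    reshares: int = 0     # more inputs than these three

def chronological_feed(posts: list[Post]) -> list[Post]:
    """Reverse chronological: newest posts first, no ranking model."""
    return sorted(posts, key=lambda p: p.created_at, reverse=True)

def ranked_feed(posts: list[Post]) -> list[Post]:
    """Toy 'algorithmic' feed: order posts by a weighted engagement
    score. The weights here are invented purely for illustration."""
    def score(p: Post) -> float:
        return 1.0 * p.likes + 2.0 * p.comments + 3.0 * p.reshares
    return sorted(posts, key=score, reverse=True)
```

The point of the intervention is visible in the contrast: the chronological version ignores engagement entirely, so a heavily reshared post gets no placement advantage over a fresh, unengaged one.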

"We find that algorithms are extremely influential in people's on-platform experiences, and there is significant ideological segregation in political news exposure," Natalie Jomini Stroud of the University of Texas at Austin—co-academic research lead for the project, along with New York University's Joshua Tucker—said during a press briefing. "We also find that popular proposals to change social media algorithms did not sway political attitudes."

Ideological segregation

Let's start with the question of whether Facebook enables more ideological segregation in users' consumption of political news. Sandra Gonzalez-Bailon of the University of Pennsylvania and her co-authors looked at the behavior of 208 million Facebook users between September 2020 and February 2021. For privacy reasons, per Gonzalez-Bailon, they did not look at individual-level data, focusing only on aggregated measures of audience behavior and audience composition. Accordingly, the analysis was limited to URLs that had been posted by users more than 100 times.

The results: Conservatives and liberals do indeed see and engage with different sets of political news, reflecting strong ideological separation. That segregation is even more pronounced when political news is posted by pages or groups rather than by individuals. "In other words, pages and groups contribute much more to segregation than users," said Gonzalez-Bailon. Furthermore, politically conservative users are much more segregated on Facebook than liberal users; there were far more political news URLs seen exclusively by conservatives than seen exclusively by liberals.
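To get a rough feel for how segregation can be measured from aggregate data alone, consider the following Python sketch. The exposure counts, the lean score, and the application of the 100-post cutoff to view totals are all simplifications invented here; the paper's actual ideology classification and isolation metrics are considerably more involved.

```python
# Hypothetical aggregated exposure data: per-URL view counts broken out
# by the (estimated) ideology of the audience. No individual-level data
# is needed, which mirrors the privacy constraint on the real study.
exposures = {
    "example.com/story-a": {"conservative": 9_500, "liberal": 300},
    "example.com/story-b": {"conservative": 4_200, "liberal": 4_100},
    "example.com/story-c": {"conservative": 150, "liberal": 8_800},
}

MIN_VOLUME = 100  # echoes the study's 100+ posts-per-URL threshold
                  # (applied here to view counts, for illustration only)

def audience_lean(views: dict[str, int]) -> float:
    """Lean of a URL's audience: -1.0 means all liberal viewers,
    +1.0 means all conservative viewers, 0.0 means an even split."""
    c, l = views["conservative"], views["liberal"]
    return (c - l) / (c + l)

for url, views in exposures.items():
    if sum(views.values()) < MIN_VOLUME:
        continue  # drop low-volume URLs, as the researchers did
    print(f"{url}: lean = {audience_lean(views):+.2f}")
```

On these made-up numbers, story-a's audience is overwhelmingly conservative and story-c's overwhelmingly liberal, while story-b sits near the middle; the pattern the paper found put far more URLs at the conservative extreme than at the liberal one.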

Finally, the vast majority of the political news that Meta's third-party fact-checking program rated as false was viewed by conservatives rather than liberals. That said, the content rated false amounted to a mere 0.2 percent, on average, of the full volume of content on Facebook. And political news in general accounts for just 3 percent of all posts shared on Facebook, so it's not even remotely the most popular type of content. "This segregation is the result of a complex interaction between algorithmic forms of curation and social forms of curation, and these feedback loops are very difficult to disentangle with observational data," said Gonzalez-Bailon of the study's findings.
