
Over the last several years, there have been growing concerns about the role of social media in fostering political polarization in the US, with critical implications for democracy. But it’s unclear whether our online “echo chambers” are the driving factor behind that polarization or whether social media merely reflects (and arguably amplifies) divisions that already exist. Several intervention strategies have been proposed to reduce polarization and the spread of misinformation on social media, but it’s equally unclear how effective they would be at addressing the problem.

The US 2020 Facebook and Instagram Election Study is a collaboration between a group of independent external academics from several institutions and Meta, the parent company of Facebook and Instagram. The project is designed to explore these and other relevant questions about the role of social media in democracy within the context of the 2020 US election. It’s also a first in terms of the degree of transparency and independence that Meta has granted to academic researchers. Now we have the first results from this unusual collaboration, detailed in four separate papers: the first round of more than a dozen studies stemming from the project.

Three of the papers were published in a special issue of the journal Science. The first investigated the degree of ideological segregation in exposure to political news content on Facebook. The second delved into the effects of a reverse-chronological feed as opposed to an algorithmic one. The third examined the effects of exposure to reshared content on Facebook. And the fourth paper, published in Nature, explored the extent to which social media “echo chambers” contribute to increased polarization and hostility.
