
What Meta’s New Studies Do—and Don’t—Reveal About Social Media and Polarization

Last week, the first papers from a collaboration between Meta's Facebook and a team of external researchers studying the 2020 election were finally published. Two of these studies asked: Are we trapped in filter bubbles, and are they tearing us apart? The results suggest that filter bubbles are at least somewhat real, but countering them algorithmically doesn’t seem to bring us any closer together.

Some are interpreting these results as proof that Facebook divides us. Others are claiming these experiments are a vindication of social media. It’s neither.

The first study tried to figure out whether we’re really in informational echo chambers, and if so, why. Unsurprisingly, the segregation in our information diets starts with who we follow. This mirrors offline life, where most people’s in-person social networks are highly segregated.

But what we actually see in our Feed is more politically homogeneous than what is posted by those we follow, suggesting that the Feed algorithm really does amplify the ideological leanings of our social networks.

There are even larger partisan differences in what we engage with, and Facebook, like pretty much every platform, tries to give people more of what they click, like, comment on, or share. In this case, it looks like the algorithm is sort of meeting human behavior halfway. The difference in our information diets is partly due to what we’ve chosen, and partly the result of using computers to guess—often correctly—what buttons we’ll click.
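The "give people more of what they click" logic can be made concrete. Here is a minimal, hypothetical sketch of engagement-based ranking: posts are ordered by a weighted sum of predicted engagement signals. The field names and weights are invented for illustration; real feed-ranking models are far more complex.

```python
# Hypothetical engagement-based ranking: order posts by a weighted sum of
# predicted engagement probabilities. Weights are illustrative only.

def engagement_score(post):
    """Combine predicted engagement signals into one ranking score."""
    return (1.0 * post["p_click"]
            + 2.0 * post["p_like"]
            + 4.0 * post["p_comment"]
            + 8.0 * post["p_share"])

def rank_feed(posts):
    """Return posts ordered by predicted engagement, highest first."""
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "a", "p_click": 0.30, "p_like": 0.10, "p_comment": 0.01, "p_share": 0.005},
    {"id": "b", "p_click": 0.10, "p_like": 0.05, "p_comment": 0.08, "p_share": 0.040},
]
print([p["id"] for p in rank_feed(posts)])  # post "b" wins on comments and shares
```

The point of the sketch: a system like this doesn't need to know anything about ideology to produce ideologically skewed feeds. If partisan content reliably draws more clicks and shares, the ranking amplifies it as a side effect.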

This raises the question of how ideologically similar people’s news should be. You can read the computed values of the “isolation index” in the paper, but it’s not clear what numbers we should be aiming for. Also, this study is strictly concerned with “news and civic content.” This might be democratically important, but it makes up only a few percent of impressions on Facebook. It’s possible that positive interactions with people who are politically different change us the most, even if it’s just reading their posts on unrelated topics.

The second study directly tested whether increasing the political diversity of people and publishers in your feed has an effect on polarization. For about 20,000 consenting participants, researchers reduced the amount of content from like-minded sources by about a third. Because the amount of time participants spent on Facebook didn’t change, this increased their consumption from both neutral and cross-cutting sources.

Of the eight polarization variables measured—including affective polarization, extreme ideological views, and respect for election norms—none changed in a statistically significant way. This is pretty good evidence against the most straightforward version of the “algorithmic filter bubbles cause polarization” thesis.

But this is not the end of the story, because filter bubbles aren’t the only way of thinking about the relationship between media, algorithms, and democracy. A review of hundreds of studies has found a positive correlation between general “digital media” use and polarization worldwide, as well as positive correlations with political knowledge and participation. Social media use has many effects, both good and bad. For example, there’s evidence that engagement-based algorithms amplify divisive content, and that tools built to reach targeted audiences can also be used for propaganda or harassment.


We need to ask not just how to prevent harm, but what part platforms should play in helping to make societal conflicts healthier. It’s a deep question, and scholars have explored how different theories of democracy might call for different types of recommender algorithms. We don’t want to eliminate all political conflict or enforce conformity, but there’s no denying that the way Americans are fighting each other now, sometimes called pernicious polarization, is destructive, escalatory, and unhealthy.

Meta’s results notwithstanding, we know that content can have effects on polarization—because of the Strengthening Democracy Challenge, a series of experiments that tried to change how people approach political conflict. It’s also possible to algorithmically identify political content that garners agreement across societal divides, a strategy known as bridging-based ranking, and prioritizing such content is thought to reduce polarization. Such a ranking system is already in use to select Twitter’s community notes. There have even been experiments showing that a carefully designed AI chatbot can help mediate divisive conversations.
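To make the idea of bridging-based ranking concrete, here is a minimal, hypothetical sketch: instead of rewarding raw engagement, score each item by how evenly it is approved across two opposed groups. The scoring rule and names here are illustrative assumptions, not Twitter’s actual Community Notes algorithm (which fits a more sophisticated matrix-factorization model to rater data).

```python
# Hypothetical bridging-based ranking: score content by the *lesser* of its
# approval rates in two opposed groups, so only content approved on both
# sides of a divide scores highly. Illustrative sketch, not a real system.

def bridging_score(approvals_a, approvals_b):
    """approvals_a / approvals_b: lists of 0/1 votes from two groups.
    Returns the minimum of the two groups' approval rates."""
    rate_a = sum(approvals_a) / len(approvals_a)
    rate_b = sum(approvals_b) / len(approvals_b)
    return min(rate_a, rate_b)

# Item 1: loved by group A, rejected by group B -- divisive.
divisive = bridging_score([1, 1, 1, 1], [0, 0, 0, 1])   # min(1.0, 0.25)
# Item 2: moderately liked by both groups -- bridging.
bridging = bridging_score([1, 1, 0, 1], [1, 0, 1, 1])   # min(0.75, 0.75)
print(divisive, bridging)
```

Under this rule the divisive item scores 0.25 and the bridging item 0.75, so the feed would surface the content that both sides can live with. That inversion of engagement-style incentives is the core design choice behind bridging approaches.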

There is, in short, a lot to try.

Many people will be looking to the current batch of experiments to either crucify or exonerate Facebook. That’s not what they do; this is bigger than Facebook, and these studies are early results in a new field. Meta should be commended for undertaking open research on these significant topics. Yet this is the culmination of work announced three years ago. In the face of layoffs and criticism, the appetite for open science on hard questions may be waning across the industry. I’m aware of at least one large research project Meta recently canceled, and the company said it “does not have plans to allow” another wave of election research in 2024. Many in the research community support a bill called PATA, which would give the National Science Foundation authority to vet and prioritize research projects that platforms would be obligated to support.

Simultaneously, the AI era is dawning, and our information ecosystem is about to get a lot weirder. We’re going to need a lot more open science on the frontiers of media, machines, and conflict.
