
Meta’s Election Research Opens More Questions Than It Answers

In the lead-up to the 2020 presidential election, Meta set out to conduct a series of ambitious studies on the effects its platforms—Facebook and Instagram—have on the political beliefs of US-based users. Independent researchers from several universities were given unprecedented access to Meta’s data, and the power to change the feeds of tens of thousands of people in order to observe their behavior.

The researchers weren’t paid by Meta, but the company seemed pleased with the results, which were released today in four papers in Nature and Science. Nick Clegg, Meta’s president of global affairs, said in a statement that “the experimental findings add to a growing body of research showing there is little evidence that key features of Meta’s platforms alone cause harmful ‘affective’ polarization” or have “meaningful effects on” political views and behavior.

It’s a sweeping conclusion. But the studies are actually much narrower. Even though researchers were given more insight into Meta’s platforms than ever before—for many years, Meta considered such data too sensitive to make public—the studies released today leave open as many questions as they answer.

The studies focused on the three months leading up to the 2020 presidential election. And while Andrew Guess, assistant professor of politics and public affairs at Princeton and one of the researchers whose findings appear in Science, noted that this is a longer window than most researchers get, it’s not long enough to be representative of a user’s full experience on the platform.

“We don’t know what would have happened had we been able to do these studies over a period of a year or two years,” Guess said at a press briefing earlier this week. More importantly, he said, there is no accounting for the fact that many users have had Facebook and Instagram accounts for upwards of a decade now. “This finding cannot tell us what the world would have been like if we hadn’t had social media around for the last 15 or 20 years.”

There’s also the issue of the specific time frame the researchers were able to study—the run-up to an election in an atmosphere of intense political polarization.


“I think there are unanswered questions about whether these effects would hold outside of the election environment, whether they would hold in an election where Donald Trump wasn’t one of the candidates,” says Michael Wagner, a professor of journalism and communication at the University of Wisconsin–Madison, who helped oversee Meta’s 2020 election project.

Meta’s Clegg also said that the research challenges “the now commonplace assertion that the ability to reshare content on social media drives polarization.”

The researchers weren’t quite so unequivocal. One of the studies published in Science found that resharing elevates “content from untrustworthy sources.” The same study showed that most of the misinformation caught by the platform’s third-party fact-checkers was concentrated among, and almost exclusively consumed by, conservative users—a pattern with no equivalent on the opposite side of the political aisle, according to an analysis of about 208 million users.

Another study found that while participants whose feeds excluded reshared content did end up consuming less partisan news, they also ended up less well informed in general. “We often see that polarization and knowledge kind of move together,” says Guess. “So you can make people more knowledgeable about politics, but then you’ll see an increase in polarization among the same set of people.”

“I don’t think the findings suggest that Facebook isn’t contributing to polarization,” says Wagner. “I think that the findings demonstrate that in 2020, Facebook wasn’t the only or dominant cause of polarization, but people were polarized long before they logged on to Facebook in 2020.”

The studies released today represent just the first tranche of research. Thirteen more papers are expected over the coming months, covering topics that include the impact of political advertisements and attitudes toward political violence around the January 6 insurrection at the Capitol.

Meta spokesperson Corey Chambliss told WIRED that the company does not have plans to allow similar research in 2024. When asked whether Meta would fund further research, Chambliss pointed to the company’s newly announced research tools, particularly the Meta Content Library and API. “The library includes data from public posts, pages, groups, and events on Facebook,” he says. “For Instagram, it will include public posts and data from creator and business accounts. Data from the library can be searched, explored, and filtered on a graphical user interface or through a programmatic API.”

Notably, the newly published studies did not investigate ways to specifically depolarize users. As a result, researchers say that while there are reasons to be concerned about social media’s impact on politics, it’s not any clearer what policy solutions could address the issue.

“It would have been nice for the public, for lawmakers, for regulators, and for social science to have a better idea of what kind of interventions might make things better,” says Wagner.
