Meta Platforms’ claims that Facebook doesn’t polarize Americans came under new doubt as the journal Science raised questions about a prominent research paper the tech giant has cited to support its position.
In an editorial Thursday, Science said that Meta’s emergency efforts to calm its platforms in the wake of the 2020 election may have swayed the conclusions of the paper, which the journal published in July 2023. The editorial, titled “Context matters in social media,” was prompted by a letter that Science also published presenting new criticism of the paper.
Because the study of Facebook’s algorithms relied on data provided by Meta while the company was making extraordinary efforts to restrain incendiary political content, the letter’s authors argue that the paper may have overstated the case that social media algorithms didn’t contribute to political polarization. Such criticisms of peer-reviewed research often appear below papers in academic journals, but Science’s editors felt an editorial was needed to flag the original paper’s conclusions more prominently, said Holden Thorp, Science’s editor in chief.
“It was incumbent on us to come up with a way somehow that people who would come to the paper would know of these concerns,” Thorp said in an interview. While no correction was warranted, he said, “There’s an election coming up, and we care about people citing this paper.” Meta said it had been transparent with researchers about its actions during the time of the study, and the company and its research partners say Meta had no control over the Science paper’s conclusions.
Meta described debates of the sort aired on Thursday as part of the research process. “Questions about the role