Facebook has handed US users control over how fact-checked content appears in their feeds, a potentially significant move that the platform says gives them more power over its algorithm but that some analysts warn could benefit purveyors of misinformation. For years, Facebook's algorithm automatically moved posts lower in the feed if they were flagged by one of the platform's third-party fact-checking partners, including AFP, reducing the visibility of false or misleading content.
Under a new "content reduced by fact-checking" option that now appears in Facebook's settings, users can choose to make debunked posts appear higher or lower in the feed, or leave the status quo in place. Fact-checked posts can be made less visible with an option called "reduce more." That, according to the platform's settings, means the posts "may be moved even lower in feed so you may not see them at all." Another option, labeled "don't reduce," has the opposite effect, moving more of this content higher in the feed and making it more likely to be seen.
"We're giving people on Facebook even more power to control the algorithm that ranks posts in their feed," a Meta spokesman told AFP. "We're doing this in response to users telling us that they want a greater ability to decide what they see on our apps." Meta rolled out the fact-checking option in May, leaving many users to discover it for themselves in the settings.
It comes amid a hyperpolarized political climate in the United States that has made content moderation on social media platforms a hot-button issue. Conservative US advocates allege that the government has pressured or colluded with platforms such as Facebook and Twitter to censor or suppress right-leaning content under the guise of fact-checking.