Meta said that it will set up an India-specific Elections Operations Centre to identify potential threats and implement appropriate measures in real time across the company's various applications. The centre will bring together experts from across the company, including data science, engineering, research, operations, content policy and legal teams.
Additionally, the social media giant noted that it is working closely with the Election Commission of India through a voluntary code of ethics signed in 2019, which provides the ECI with a "high priority" channel for flagging unlawful content. Meta says that it already removes the most serious types of misinformation, such as content that could suppress voting or lead to imminent violence or harm.
The company will also remove content making false claims that a person of one religion is physically harming or harassing people of a different religion, based on input from its "local partners". The Facebook parent also plans to expand its network of independent fact-checkers, while making it easier for them to find and rate election-related content.
Meta said it will use keyword detection to make it easier for fact-checkers to rate misinformation. Fact-checkers will be able to flag content as 'altered', meaning that the audio, video or photo has either been 'fabricated, manipulated or transformed'.
Once flagged as altered, the content will appear lower in the Facebook feed, while it will be filtered out on Instagram.