The government has asked social media and internet intermediaries to “align” the terms of service on their platforms within the next seven days to alert users about the consequences of creating, uploading and sharing prohibited information, including deep fake content and child sexual abuse material (CSAM), a senior government official said on Friday.
Intermediaries have to “explicitly” warn users that they cannot “host, display, upload, modify, publish, transmit, store, update or share” any content that belongs to someone else, is defamatory, obscene, pornographic, paedophilic or invasive of another user’s privacy.
Further, social media users will have to be immediately notified about these dangers and illegalities when they log onto a platform.
“Just telling the user in a general sense that illegal content cannot be created, uploaded, or shared will not help,” the minister of state for electronics and information technology Rajeev Chandrasekhar said.
“If I am the user of a platform and I am not told when I log in that I cannot use this platform for CSAM, deep fakes, or misinformation, that is not great awareness,” he added.
Pointing out that the provisions of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 are sufficient to tackle deep fake content, Chandrasekhar said his ministry will push for complete