Instagram's parent company Meta announced on Tuesday that it is adding safety measures to teen users' accounts to protect them from age-inappropriate content in search results and on the Explore pages of Facebook and Instagram. The company will also prompt teens to update their privacy settings. The announcement comes amid lawsuits from states over child safety and ahead of a Senate hearing.
The company will remove content about suicide, eating disorders, and self-harm from teen users' feeds on Facebook and Instagram, even when it is shared by someone they follow. Meta added that while self-harm content can help reduce the stigma around these issues, it is a complex topic and not suitable for all young people.
In addition, when teen users search for terms related to eating disorders, suicide, or self-harm, Meta will hide the results and instead direct them to a helpline or other resources where they can get assistance.
Teen users on Facebook and Instagram will automatically be placed under the most restrictive content-control settings, making it harder for their accounts to encounter potentially age-inappropriate accounts or content.
Meta will also begin notifying teen users to update their privacy settings. If a teen opts in to the recommended settings, their account will restrict who can tag, mention, or repost them or include their content in Reels remixes, and will block users who do not follow them from sending them messages.