According to data shared by YouTube, 96% of these videos were flagged through ‘automated flagging’, meaning they were detected by a machine rather than a human reviewer. Around 3 lakh videos were flagged by individual users, around 52,000 by organisations, and only four by government agencies.
The company also revealed that around 51.15% of the removed videos had zero views, 26.43% had between one and 10 views, and only 1.25% had more than 10,000 views. The streaming giant further broke down the reasons for removal: 39.4% of the videos were found to be dangerous or harmful, 32.4% were removed over child safety concerns, and 7.5% were found to be violent or pornographic.
Other reasons for removal included nudity or sexual content, harassment and bullying, and promotion of violence and violent extremism, among others. Elaborating on its review process, the streaming giant stated in a blog post, “YouTube relies on teams around the world to review flagged videos and remove content that violates our Community Guidelines; restrict videos (e.g., age-restrict content that may not be appropriate for all audiences); or leave the content live when it doesn’t violate our guidelines.”
“These removal reasons correspond to YouTube’s Community Guidelines. Reviewers evaluate flagged videos against all of our Community Guidelines and policies, regardless of the reason the video was originally flagged,” the company added.