Twitter, TikTok and Google will be forced to answer questions about how they tackle child sexual abuse and blackmail attempts on their platforms after the Australian eSafety commissioner issued legal notices to the companies.
The tech companies, as well as gaming platforms Twitch and Discord, will have 35 days to respond to the commissioner’s questions or risk fines of up to $687,000 a day.
The legal demands come six months after similar notices were issued to Apple, Meta, Microsoft, Snap and Omegle, which revealed some tech platforms were not using well-known safety measures to detect abusive content and protect users.
The eSafety commissioner, Julie Inman Grant, said she was particularly concerned about the handling of illegal material on Twitter following extensive job cuts to its Australian and safety teams.
“Back in November, Twitter boss Elon Musk tweeted that addressing child exploitation was priority No 1 but we have not seen detail on how Twitter is delivering on that commitment,” Inman Grant said.
“We’ve also seen extensive job cuts to key trust and safety personnel across the company – the very people whose job it is to protect children – and we want to know how Twitter will tackle this problem going forward.”
The tech platforms must answer questions about how they detect and remove child sexual abuse content from their platforms, including live streams, how their algorithms could amplify its reach, and how the companies deal with sexual extortion attempts against children.
These attempts typically involve tricking underage users into providing intimate images and later blackmailing them.
“The creation, dissemination