SEOUL—The victims were reportedly teachers, military officers, undergraduates and elementary-school students. Across a labyrinth of Telegram group chats, anonymous users submitted photos of South Korean girls and women, taken without their permission, that were manipulated into sexually explicit images and videos viewed by hundreds of thousands. South Korean authorities on Wednesday began an investigation into the faked pornographic images after a massive network was uncovered, involving hundreds of victims, many of them minors.
The revelation reflected the scale of the problem facing South Korea, which according to some researchers is the source of roughly half of so-called “deepfake” porn videos spread globally. Many countries, including the U.S., are confronting a rise in AI-generated fake nudes targeting young women and girls. Current protections in South Korea haven’t kept pace with the threats posed by AI-generated sexualized content, much of which is created by teenagers or younger children, local officials said.
The country’s education ministry is reviewing the maximum punishment for middle-school-aged perpetrators, those as young as 10 years old. Lawmakers are looking to close legal loopholes, such as widening current punishments beyond those who intentionally disseminate the illicit content. “Deepfake videos may be dismissed as mere pranks, but they are clearly criminal acts that exploit technology under the shield of anonymity,” said South Korean President Yoon Suk Yeol, a former prosecutor, at a Tuesday cabinet meeting.