Sentencing those who watch it to 3 years in prison

South Korea is taking strong action to combat the issue of deepfake porn, which has become a significant social problem. In response to an increase in complaints about the use of artificial intelligence to create fake sexual images and videos, South Korean authorities have decided to tighten the country’s laws on the matter.

Lawmakers are working on a new bill that will punish those who create and distribute sexual deepfakes. It also goes after those who intentionally purchase, store, or view this type of content. The proposed penalties include heavy fines and even prison sentences of several years.

What happened. South Korea wants to strengthen its laws to combat deepfake pornography. Currently, individuals who create such content for distribution face a prison sentence of up to five years and fines of up to ₩50 million (approximately $38,000). However, authorities aim to expand the scope of punishment to those who possess or view this type of content.

To achieve this, the new bill imposes fines and prison sentences on individuals who knowingly possess pornographic content containing fake images created with AI. A parliamentary committee approved the revision of the law on Wednesday, and lawmakers passed the bill on Thursday, Reuters reported.


3 years in prison. Those who knowingly store or view sexually explicit deepfake content will face serious consequences. According to The Korea Times, the revised law allows penalties of up to three years in prison or fines of up to ₩30 million (approximately $22,900). The aim is to discourage anyone from purchasing, storing, or viewing deepfake material in order to reduce its spread.

Political debate. This issue has received a lot of attention in the country. As such, South Korean authorities have approved a new regulation to combat illegal deepfake content and provide support to victims. In addition, the parliamentary committee agreed to revise the law to impose harsher penalties on those who use sexual material to blackmail children or adolescents.

Looking at the official figures. South Korea’s actions are not surprising. In late August, President Yoon Suk Yeol ordered measures to address the impact of deepfake pornography. According to Yoon, it is an exploitation of technology that relies on the protection of anonymity, which is “a clear criminal act.”

South Korea’s National Police has released some figures that help understand the extent to which this type of content has become a major problem in the country. So far in 2024, alleged victims have reported 812 deepfake-related sex crimes, leading to the arrest of 387 suspects.

Even more alarming is the fact that almost half of these reports (367) were filed last month, after authorities launched a special campaign to prosecute these types of crimes. Moreover, 83.7% of those arrested were teenagers, and 66 of them were under the age of 14, meaning they were legally exempt from criminal prosecution. Another 13% were in their twenties.


“An epidemic.” South Korea is not the only country facing the significant challenge of deepfakes. Cases have also been reported in the U.S. In South Korea, however, the issue is so widespread that Human Rights Watch has publicly acknowledged its concerns.

“South Korea is facing an epidemic of digital sex crimes, with hundreds of women and girls targeted by deepfake sexual images shared online,” Heather Barr, deputy director of the Women’s Rights Division at Human Rights Watch, said in a statement. Sungshin Bae, a researcher and official at the South Korean Supreme Court, also recently spoke of a “crisis” in The Conversation.

“I was terrified.” In her article, Bae wrote that “AI is fueling a deepfake porn crisis in South Korea.” She also reported that the startup Security Heroes recently analyzed nearly 96,000 AI-generated sex videos from various sources and found that just over half of the material, about 53%, featured South Korean singers and actresses.

The issue also affects women who work outside the spotlight, such as Heejin, the pseudonym of a university student. She recently told the BBC how she felt when she discovered a pornographic image of herself circulating in a chat room. The image was AI-generated and clearly sexual in nature. “I was terrified, I felt so alone,” she said.

In August, The Guardian reported that a Telegram chat room with 220,000 members created and shared manipulated images of women, including young girls, college students, teachers, and military personnel. Members used photos taken from social media platforms like Instagram to create these images.

Image | 卡晨

Related | New tool for making deepfakes for free reignites the debate about the dangers it entails
