In South Korea, Misogyny Has a New Weapon: Deepfake Sex Videos

In 2020, as South Korean authorities were investigating a blackmail ring that forced young women to make sexually explicit videos for paying viewers, they discovered something else in the dark recesses of social media: pornographic images onto which other people’s faces had been grafted.

They didn’t know what to do with these early attempts at deepfake pornography. Eventually, the National Assembly passed a vaguely worded law against those who created and distributed it. But that didn’t stop a crime wave, powered by AI technology, that has taken the country’s misogynistic online culture to new depths.

Over the past two weeks, South Koreans have been shocked to discover that a growing number of young men and teenage boys have taken hundreds of social media images of classmates, teachers and military colleagues — almost all of them young women and girls, including minors — and used them to create sexually exploitative images and video clips using deepfake apps.

They spread the material through chat rooms on the encrypted messaging service Telegram, some of which have as many as 220,000 members. The deepfakes typically combine a victim’s face with a body in a sexually explicit pose taken from pornography. The technology is so advanced that ordinary people often cannot tell the images are fake, researchers say. As the country races to tackle the threat, experts have noted that in South Korea, enthusiasm for new technologies sometimes outweighs concerns about their ethical implications.

But for many women, these deepfakes are just the latest online manifestation of a deep-seated misogyny in their country – a culture that has now spawned young men who enjoy sharing sexually degrading images of women online.

“Korean society doesn’t treat women as fellow human beings,” said Lee Yu-jin, a student whose university is among hundreds of high schools, colleges and universities where students have been victimized. She wondered why the government hadn’t done more “before it became a digital culture to steal photos of friends and use them for sexual humiliation.”

Online sexual violence is a growing problem worldwide, but South Korea is at the leading edge of it. Whether, and how, the country manages to rein in deepfake pornography will be watched by policymakers, school officials and law enforcement elsewhere.

The country has an underbelly of sex crime that occasionally surfaces. A South Korean was convicted of running one of the world’s largest child sex abuse image sites. A K-pop artist was found guilty of facilitating prostitution through a nightclub. For years, police have fought against spycam porn. And the mastermind behind a blackmail ring investigated in 2020 was sentenced to 40 years in prison for luring young women, including teenagers, to make the videos he sold online through Telegram chat rooms.

The rise of easy-to-use deepfake technology has added an insidious dimension to such forms of sexual violence: victims often do not know they have been targeted until they receive an anonymous message or a call from the police.

‘Slave’, ‘Toilet’, ‘Rag’

For a 30-year-old deepfake victim, whose name is being withheld to protect her privacy, the attack began in 2021 with an anonymous message on Telegram that read, “Hello!”

In the hours that followed came a flood of obscenities and deepfake images and video clips in which her face, taken from family trip photos she had posted on social media, had been grafted onto nude bodies. Words like “slave,” “toilet” and “rag” were written on the bodies.

In April, she learned from police that two of her former classmates from Seoul National University were among those detained. Male graduates of the prestigious university, along with accomplices, had attacked dozens of women, including a dozen former Seoul National students, with deepfake pornography. One of the detained men was sentenced to five years in prison last month.

“I can’t think of any reason why they treated me like that, other than because I was a woman,” she said. “The fact that there were people like that around me made me lose my trust in my fellow man.”

She said she has struggled with trauma since the attack; her heart races whenever a message notification or an anonymous call comes in on her smartphone.

South Korea, whose pop culture is exported worldwide, has become the country most targeted by deepfake pornography, with its singers and actresses making up 53 percent of the individuals featured in such material, according to “2023 State of Deepfakes,” a study published by the U.S. cybersecurity firm Security Hero. Leading K-pop agencies have declared war on deepfakes, saying they are gathering evidence and threatening lawsuits against their creators and distributors.

Still, the problem is growing. South Korean police reported 297 cases of deepfake sex crimes between January and July, compared to 156 for all of 2021, when such data was first collected.

It was only last month, when local news media exposed the widespread traffic in deepfakes on Telegram, that President Yoon Suk Yeol ordered his government to “eradicate” them. Critics of Mr. Yoon noted that during his 2022 presidential campaign, he had denied that there was structural gender discrimination in South Korea and had promised to abolish the Ministry of Gender Equality.

News reports about the rise in deepfakes this year have sparked panic among young women, many of whom have deleted selfies and other personal images from their social media accounts, fearing they could be used in deepfakes. Chung Jin-kwon, who was a high school principal before taking a position at the Seoul Metropolitan Office of Education last month, said his former school had debated whether to remove photos of students from yearbooks.

“Some teachers had already refused to put their photos there and replaced them with caricatures,” Mr. Chung said.

Young people in South Korea, one of the world’s most wired countries, become tech-savvy at an early age. But critics say the school system is so focused on preparing students for high-stakes college entrance exams that they do not learn how to use new technology ethically.

“We produce exam-problem-solving machines,” Mr. Chung said. “They don’t teach values.”

A push for stricter laws

Kim Ji-hyun, a Seoul city official whose team has counseled 200 teenagers involved in digital sexual exploitation since 2019, said some boys had used deepfakes to get revenge on ex-girlfriends — and that in some cases girls had used them to ostracize classmates. But many young people were initially drawn to deepfakes out of curiosity, Ms. Kim said.

Chat room operators lured them with incentives, including Starbucks coupons, and asked them to provide photos and personal information of women they knew. Some Telegram channels, called “rape and humiliation rooms,” targeted individuals or women from specific schools, said Park Seong-hye, a team leader at the government-funded Women’s Human Rights Institute of Korea who has investigated cyber sex crimes and helped victims.

Under the law passed in 2020, people convicted of creating sexually explicit or offensive deepfakes with the intent to distribute them could face up to five years in prison. Those who seek to profit financially from distributing such content could face up to seven years. But there is no law prohibiting the purchase, storage or viewing of deepfakes.

Investigators must obtain court approval to go undercover and access deepfake chat rooms, and they can only do so to investigate reports that minors have been sexually abused. The process can also be slow.

“You find a chat room on a holiday, but by the time you get court approval, it’s gone,” said Hahm Young-ok, a senior investigator of online crime at the National Police Agency.

The government has promised to push for tougher laws against buying or viewing sexually exploitative deepfakes. This month, police investigating the latest wave of deepfakes said they had arrested seven male suspects, six of them teenagers.

Pornography is censored on the internet in South Korea, but people can circumvent controls using virtual private networks, and the ban is difficult to enforce on social media channels. Police have indicated they may investigate whether Telegram has abetted deepfake sex crimes. Last month, Telegram founder Pavel Durov was arrested in France and charged with a range of offenses, including facilitating the distribution of child sexual abuse material.

Telegram said in a statement that it “has actively removed content from Korea that violates the Terms of Service and will continue to do so.”

Meanwhile, the government is under pressure to force online platforms to do more to filter content such as deepfake pornography.

“It is time to choose between protecting the platforms and protecting our children and adolescents,” said Lee Soo-jung, a professor of forensic psychology at Kyonggi University. “What we see happening now in 2024 was predicted in 2020, but we did nothing about it in the meantime.”
