Korean Police Announce 7-Month Strict Crackdown on AI Deep Fake Sex Images

Deepfake sex images are the biggest news story in South Korea right now, and the problem will be very difficult to stop given the proliferation of AI technology that makes creating them so easy:

Police are to crack down on deepfake sex images as a spate of recent crimes fuels fears that any woman could become a victim, officials said Tuesday.

During the seven-month crackdown beginning on Wednesday, police will aggressively hunt down those who create and distribute such images, especially those of children and teenagers, the National Police Agency said.

According to the police department, 297 cases of deepfake sexual exploitation crimes were reported nationwide from January to July. Of the 178 defendants, 73.6 percent, or 131 individuals, were identified as teenagers.

Political parties and human rights organizations also called for severe punishment and an active investigation.

“The fear is growing, with an estimated nearly 220,000 members having participated in these deepfake porn chat rooms on Telegram,” said Son Sol, a co-chair of the task force of the small opposition Progressive Party dealing with the issue.


You can read more at the link, but AI deepfakes have also been linked to the ROK military, with a chatroom that specializes in passing around images of female soldiers. I wonder how long before this becomes a problem in the US military as well?
