1 in 10 minors report peers using AI to create nude photos of their classmates and share them online


It seems that the world of artificial intelligence (AI) is causing more concern than it is doing good.

From taking over human tasks and offering easy ways to plagiarize content, to far more disturbing uses like deepfakes, morphed images, and fabricated audio and video content, AI is being used for all of this, to the point where it’s hard to tell what’s real and what’s fake.

A recent study has found that young school children may be using AI to create nude images of their classmates, sourcing the photos from social media and other digital platforms.

What did the research yield?

Thorn, a non-profit organization focused on protecting children from sexual exploitation, conducted a survey among 1,040 minors aged 9 to 17 between November 3 and December 1, 2023.

The US-based organisation published its findings in a study called Youth Perspectives on Online Safety, 2023, asking children from different backgrounds about their experiences with child sexual abuse material (CSAM), “harmful online experiences” and more.

The survey noted, “Underage participants were recruited through existing youth panels or directly through caregivers at the time of this study,” and that “Caregiver consent was required for minors to participate in youth panels, as well as for minors recruited directly into the study.”

Regarding the non-consensual use of AI, the report stated: “In 2023, Thorn’s annual monitoring survey also asked about generative AI (GAI) being used by peers to create CSAM, including ‘deepfake’ nude images of real children that can then be shared without their consent.

The results showed that most minors do not believe that their peers use this technology to create explicit images of other children.

However, about 1 in 10 minors (11%) indicated that they knew of cases where their peers had done this, while another 10% indicated that they preferred not to answer.”

The report also said that “1 in 8 minors, aged 9 to 12, reported having seen non-consensually shared SG-CSAM,” or self-generated child sexual abuse material.

The study also found that one in seven minors admitted to having shared their own SG-CSAM. While this may be consensual, it is still considered risky online behavior, and the consequences can be quite serious.

A report by 404 Media also highlighted that while these findings are certainly concerning, the alarmist framing by such “anti-trafficking” organisations is also not ideal. Taking the finding that one in seven minors “have shared their own SG-CSAM” as an example, it wrote: “While these images are technically illegal and against the policies of all major internet platforms, the terrifying term also encompasses instances where, for example, a 17-year-old sends a consensual nude photo to his partner.”

The report also said: “While the motivation behind these events is likely driven by adolescents misbehaving rather than an intent to commit sexual abuse, the resulting harms to victims are real and should not be minimized in attempts to shift responsibility.”

Julie Cordua, CEO of Thorn, also commented on the findings: “The fact that 1 in 10 minors report that their peers are using AI to generate nude photos of other children is alarming and highlights how quickly online risks are evolving.”

She also said, “This emerging form of abuse poses a significant threat to the safety and wellbeing of children. We must act quickly to develop safeguards, educate young people about the dangers of deepfakes, and empower parents to have open conversations with their children about these risks. We can better protect children in an ever-changing digital landscape by staying ahead of these technological changes.”


Read More: Google Pay, Paytm, PhonePe UPI Transactions Could Put Women at Risk


This brings to light the unlawful use of AI and the alarming proliferation of disturbing ‘nudify’ apps and other tools that use AI to turn an otherwise ordinary photo of a person into a nude image.

In April 2023, TikTok user Rachel (@rache.lzh) posted a video describing how an anonymous user on Instagram had sent her photos of herself that had been edited using AI to make her appear naked.

The images the TikToker had posted to her account showed her fully clothed, but someone had edited them into non-consensual pornographic images.

In an April 27 video, she said through tears: “They were pictures of me that I had posted fully clothed, fully clothed. And they had run them through some kind of AI editing program to edit me to be naked,” and that “They basically photoshopped me naked.”

She went on to describe the ordeal: “And what’s even worse is that the next day when I woke up I got dozens of DMs of these images, but without the watermarks. So this person paid to have the watermark removed and started spreading it as if it was real.” She added that the person had also put tattoos on her body and altered her figure using AI.

In December 2023, two Florida boys, aged 13 and 14, were arrested for allegedly using AI to create deepfake nude photos of their classmates. They were charged with third-degree felonies, according to a report by Wired.

This horrific practice is not limited to the United States; it is happening worldwide.

In July of this year, a Spanish juvenile court sentenced 15 minors to a one-year suspended sentence after they distributed AI-generated nude images of their female classmates via WhatsApp groups.

According to a press release from the Badajoz Juvenile Court, the minors will also have to attend classes on the “responsible use of information and communication technologies” as well as on gender and equality, to understand the consequences of their actions.

According to reports, the teens took photos from the girls’ social media profiles and then used an AI app to superimpose the underage girls’ faces onto “other naked female bodies.”

A 2023 study by the UK-based Internet Watch Foundation (IWF) found that 20,254 AI-generated images had been posted on a single CSAM forum on the dark web in a one-month period, and that at present, “Most AI CSAM found is now realistic enough to be treated as ‘real’ CSAM.”

Even in South Korea, a massive deepfake scandal is currently unfolding, with men reportedly using Telegram chat rooms to share sexually explicit deepfake content of women created without their consent. The victims reportedly range from young girls and students to military personnel and even the perpetrators’ own female family members.

Perhaps most alarming is the ease of access to such deepfake platforms and AI apps, which promise to “undress every photo” or produce a nude image of any person within a few seconds, as long as a photo of them is available.


Image Credits: Google Images

Main image designed by Saudamini Seth

Sources: Firstpost, Thorn Org, The Hindu

Find the blogger: @chirali_08

This post is tagged under: AI, AI abuse, deepfakes, deepfake nudes, AI generated content, Thorn, Thorn organization, child protection, sexual exploitation, sexual exploitation AI, sexual exploitation artificial intelligence, AI technologies, deepfake scandal

Disclaimer: We hold no rights or copyright over the images used; they are taken from Google. For credits or removal, the owner can kindly mail us.


Other recommendations:

Men create AI girlfriends to verbally abuse them
