Kids are using AI to harm their peers – and Microsoft’s GitHub is the cause

Jessica*, a 15-year-old from New Jersey, returned to her history class one day to find a group of girls whispering to each other. She walked over to ask what they were talking about. What she heard next was horrifying:

Boys at her high school had taken images of their female classmates’ faces and edited them onto AI-generated nude bodies.

The girls decided to take the issue up with school officials. Unsure whether she herself had been victimized by this sexually exploitative behavior, Jessica went about her day as usual, focusing on her schoolwork and extracurricular activities. As time went on, she was surprised to discover that she was one of the girls whose faces her male classmates had used to create AI-generated nude images.

As she left the front office after her meeting with the administration, she saw a few of the girls whose faces were in the photos crying. But what really made her sick was the group of boys laughing at them from a few feet away.

“I didn’t think my classmates could do this to me,” she said.

*Survivor’s name has been changed

The Alarming Prevalence of AI-Generated Sexual Abuse Images

In an era where major technological advancements are happening almost daily, many of us are forced to be careful about the information we make public.

But now it seems no one is safe anymore.

Thanks to the abuse of newly developed AI technology, it doesn’t matter if you’ve never taken a nude photo in your life: you can still be a victim of image-based sexual abuse (IBSA) or be depicted in child sexual abuse material (CSAM).

This traumatic experience is becoming far too common for women and young girls. Recent research from Thorn, a leading child safety organization, found that 1 in 10 minors say their friends or classmates have used AI to generate nude photos of other children.

Microsoft’s GitHub is at the root of the problem

There is one company at the heart of all this sexually exploitative AI technology: Microsoft’s GitHub.

Microsoft’s GitHub is perhaps the world’s most prolific space for developing artificial intelligence. Described as a mix between Google Docs and a social media app for programmers, GitHub is a platform where individuals collaborate on, manage, and share code for developing software.

One disturbing fact that Microsoft refuses to acknowledge, however, is that GitHub has been a major driver of AI-generated CSAM and IBSA.

GitHub is a hotbed of sexual deepfake repositories (where the code is hosted) and forums (where people chat) dedicated to the creation and commercialization of synthetic media technologies. These technologies have fueled the development of “nudifying apps,” where users can digitally ‘strip’ women of their clothing in images. Currently, this type of nudifying technology only works on images of women, underscoring the highly gendered nature of sexual violence.

The platform also hosts code and datasets used to create AI-generated CSAM. Unfortunately, this phenomenon has become dramatically more common in the past year. In December 2023, the Stanford Internet Observatory discovered over 3,200 images of suspected CSAM in LAION-5B, a training dataset behind the popular generative AI model Stable Diffusion. This dataset was available on GitHub.

Whether it’s teenage boys generating “nude photos” of their female peers or adult predators indulging their pedophilic urges, the problem of AI-generated child sexual abuse is clearly exploding.

GitHub failed to implement requested security changes

In 2023, GitHub was named to NCOSE’s Dirty Dozen List, an annual campaign that calls out 12 mainstream contributors to sexual abuse and exploitation. The Dirty Dozen List mobilizes concerned citizens (people like YOU!) to contact these 12 entities and urge them to make much-needed safety changes. As a result, many of the Dirty Dozen List targets have made significant improvements to their policies and practices.

Yet GitHub has done nothing.

Even after being placed on the Dirty Dozen List for the second time in 2024, GitHub failed to acknowledge the campaign, let alone implement the requested security changes.

GitHub is marketed to children, fueling the rise of AI-generated CSAM

Disturbingly, GitHub is marketed to users as young as 13. Given this, is it any wonder that 1 in 10 minors now say their friends or classmates have used AI to generate nude images of other children?

While GitHub offers opportunities for children to educate themselves about coding and software development, the lack of safety standards and the high level of abuse on the platform make it highly unsuitable for users this age. Furthermore, GitHub has no age verification feature, requiring only a valid email address to create an account. This means that users even younger than 13 can gain access to this extremely dangerous platform.

Teenage girls are being sexually exploited by their peers, and their privacy and mental health are being torn to shreds. Women are losing their jobs and reputations because of sexually explicit images that are completely fabricated. Countless lives are being destroyed by the technologies created on Microsoft’s GitHub.

Microsoft is one of the richest companies in the world; it has the resources to solve this. How long will it turn a blind eye to the way its platform is destroying the lives of women and children?

ACTION: Call on Microsoft’s GitHub to stop enabling sexual exploitation!

The abuses being fueled by Microsoft’s GitHub are relatively uncharted territory. With AI technology evolving so rapidly, it is imperative that we act NOW before the problem becomes much worse. Take 30 SECONDS to take the quick action below!

Learn more about how Microsoft’s GitHub contributes to sexual exploitation here.
