VICTORY! Google improves protection against deepfake/AI-generated pornography

Google has taken a significant step in the fight against image-based sexual abuse (IBSA), announcing major updates to its policies and processes to protect people from sexually explicit deepfake and AI-generated content. These changes, made possible thanks to feedback from experts and survivor advocates, represent a monumental victory in our ongoing fight against IBSA.

Understanding the Impact of Deepfake and AI-Generated Pornography

Computer-generated IBSA, often referred to as “deepfake pornography” or “AI-generated pornography,” is becoming increasingly prevalent and poses a serious threat to personal privacy and security. With ever greater speed and ease, this technology can create highly realistic explicit content, often targeting individuals without their knowledge or consent. The distress and harm caused by these images is profound, with the potential to seriously damage reputations, careers, and mental health.

And it can happen to anyone. It can happen to you and all the people you love. If there is a picture of your face online, you are at risk.

Google’s Major Updates

Google has recognized the urgent need to address these issues and has therefore implemented several important updates to its search platform to make it easier for individuals to remove IBSA, including computer-generated IBSA, and to prevent such content from appearing prominently in search results. Here’s what’s new:

  • Explicit result filtering: When someone successfully requests the removal of an explicit deepfake/AI-generated image, Google will now also filter all explicit results for similar searches about that person. This helps prevent harmful content from reappearing in related searches.
  • Deduplication: Google’s systems automatically scan for and remove duplicates of sexually explicit images that have already been successfully removed. This reduces the chance of re-traumatization for victims, who previously had to repeatedly request removal of the same images.
  • Ranking Updates: Google is updating its ranking algorithms to reduce the visibility of deepfake/AI-generated pornography. By promoting high-quality, non-explicit content, Google hopes to reduce the appearance of harmful material at the top of search results.
  • Demotions: Websites with a high volume of takedown requests are demoted in search results. This discourages sites from hosting deepfake/AI-generated pornography and helps protect individuals from repeated exposure to such material.

Thank Google with us!

Please take a moment to sign the form below and thank Google for listening to survivors and creating a safer internet!

Listening to survivors: a crucial element

One of the most commendable aspects of Google’s update is its grounding in the experiences and needs of survivors. By actively seeking and incorporating feedback from those directly affected by IBSA, Google has demonstrated its commitment to creating solutions that truly address the complexity and impact of this form of abuse.

NCOSE facilitated meetings between Google and survivors, and we are very pleased that the company listened to their critical insights in developing these new features. We sincerely thank these brave survivors for lending their voices to make the world safer for others.

We also thank you for your show of support that fueled this victory! Over the years, you have joined us in numerous campaigns targeting Google, such as the Dirty Dozen List, where Google Search and other Google entities have been featured many times. This victory is also YOUR legacy!

A step forward, but still much work to do

While these changes represent a significant victory, the fight against IBSA is far from over. Continued vigilance, innovation, and collaboration from technology companies, policymakers, and advocacy groups are essential to creating a safer online environment. We must continue to push for more robust measures and support systems for those affected by image-based sexual abuse.

ACTION: Call on Microsoft’s GitHub to stop facilitating IBSA!

Google was far from the only company facilitating computer-generated IBSA. In fact, there is one corporate entity that is at the root of almost all of this abuse: Microsoft’s GitHub.

Microsoft’s GitHub is the global hub for creating sexually exploitative AI technology. The vast majority of deepfake and other computer-generated IBSA is created with tools developed and hosted on this platform, owned by the world’s richest company.

It’s time for Microsoft’s GitHub to stop fueling this problem and start to fight it instead!

Please take 30 SECONDS to sign the quick action form below, which calls on Microsoft’s GitHub to combat deepfake and AI-generated pornography.

ACTION: Urge your Senator to support the TAKE IT DOWN Act!

We also urgently need better legislation to combat IBSA. As it stands, there is NO federal criminal penalty for those who distribute, or threaten to distribute, sexually explicit images of a person without that person’s consent.

The TAKE IT DOWN Act attempts to fill this glaring gap in the law.

The TAKE IT DOWN Act has already been unanimously passed by the Committee. Join us in pushing it through the next steps!

Take action now and ask your Senator to support this important bill.

We encourage everyone to stay informed about IBSA, support survivors, and advocate for stronger protections and accountability from tech companies. Together, we can create a safer, more respectful digital world.

For more information and resources on combating image-based sexual abuse, please visit our webpage here.
