AI-generated child pornography legal under current law; lawmakers working to close loopholes

TAMPA, Fla. (WFLA) — Artificial intelligence technology is already changing lives, and not always for the better.

Some are using AI to create pornographic images of people without their consent, and sometimes the victims are children. This is already happening here in Florida, including in the Bay Area.

You may be shocked to learn that in the state of Florida, and at the federal level, there is no law to protect victims.

8 On Your Side asked lawmakers and law enforcement officials what they are doing to combat the spread of AI imagery that takes the faces of real children and teenagers and superimposes them onto artificially generated explicit images.

A third-grade science teacher at Beacon Christian Academy in New Port Richey was arrested on charges of possessing child pornography. The Pasco County Sheriff’s Office said those are the charges that landed Steven Houser in jail.

“There were other things that were generated by AI, the erotica. There’s no crime against that at the federal level, and there’s no crime against that at the state level,” said Pasco County Sheriff Chris Nocco.

Nocco is even more concerned about Houser’s admission that he used three students’ yearbook photos to generate pornography using AI.

“It’s not just that they have this, but ultimately they act on it and that’s what we’re trying to prevent,” he said.

“The way the law is framed, in this particular incident it is not a crime. But I believe it is, and we have a duty to protect our children,” said U.S. Rep. Gus Bilirakis, R-Fla., who represents Florida’s 12th Congressional District.

A few states — including South Dakota, Louisiana, and Washington — have passed legislation specifically aimed at combating sexually explicit AI images of minors. Bilirakis and Nocco believe that something needs to be done at the federal level.

“As legislators, we have a duty to close this loophole,” Bilirakis said.

8 On Your Side researcher Brittany Muller spoke with the congressman to bring instances of AI-generated pornography across the state to his attention.

“It’s just horrible, a teacher, a third grade teacher,” Bilirakis said.

He said he has legislation that will hold big tech companies accountable while protecting minors.

Male students at a private school in South Florida were suspended for similar use of this technology.

“My photo and my face were used without my consent,” said a female student who was victimized.

According to Miami-Dade police, two boys from Pinecrest Cove Academy in Miami used artificial intelligence technology to create nude images of their classmates, including one girl who asked to remain anonymous after what happened to her.

“I feel violated by this,” she said. “I feel abused and used.”

The parents of another victim said they believe the boys responsible should receive more than just a 10-day suspension.

“We want these boys expelled from school,” said Nadia Khan-Roberts, the victim’s mother. “Our daughters don’t feel comfortable walking down the same corridors with these boys who are using their images in very inappropriate ways in a nude AI generator app.”

The boys will not face any criminal charges because what they did is not illegal under current Florida state law, and the photos are not considered real images.

“I think this is something we see a lot of times, where technology is evolving so quickly across all sectors of technology, that the laws and regulations are constantly having to keep up with the times,” said Fallon McNulty, director of the CyberTipline at the National Center for Missing & Exploited Children (NCMEC).

NCMEC is at the forefront of protecting children from online sexual exploitation and abuse. Last year, its CyberTipline received 4,700 reports of child sexual abuse material related to generative AI.

When asked about the students, Bilirakis said the boys need to realize the gravity of their actions.

“They need to be held accountable,” he said. “I have the KOSA bill, the Kids Online Safety Act, that’s going to move through the House and the Senate pretty quickly, and I think it’s going to become law to protect these kids.”

The U.S. Senate passed its version of the bill last week with broad, bipartisan support, and President Biden has said he will sign it if it passes both chambers. Now it’s up to the House of Representatives to vote on KOSA.

“We feel like we can get something done by the end of the year,” he said.

The FBI has even issued a warning about malicious actors manipulating photos and videos using AI technology.

“There are a lot of good people out there trying to protect children,” Nocco said. “Unfortunately, there’s a large percentage of bad people who will prey on these children. They find the loopholes, they find different ways, and that’s why the use of AI is something that people should be afraid of.”

Victims fear that someone could take a photo of them posted online and use it to generate an inappropriate and dangerous image of them.

“We cannot allow this to become normalized,” Nocco said.
