SACRAMENTO, CA — California Governor Gavin Newsom on Sunday signed a pair of proposals aimed at protecting minors from the increasingly widespread misuse of artificial intelligence tools that generate harmful sexual images of children.
The move is part of California’s concerted effort to tighten regulations on a major industry that increasingly affects Americans’ daily lives but remains largely unregulated in the United States.
Earlier this month, Newsom also signed some of the toughest legislation in the nation to combat election deepfakes, which is being challenged in court. California is widely seen as a potential leader in regulating the AI industry in the US.
The new law, which received overwhelming bipartisan support, closes legal loopholes surrounding AI-generated child sexual abuse images and makes clear that child pornography is illegal even if it is generated by AI.
Advocates say current law does not allow district attorneys to prosecute people who possess or distribute AI-generated images of child sexual abuse unless they can prove the material depicts a real person. Under the new law, such offenses would be considered felonies.
Democratic Rep. Mark Berman, who authored one of the bills, said in a statement: “Whether the images are AI-generated or of actual children, child sexual abuse material must be illegal to create, possess, or distribute in California. The AI used to create these awful images is trained on thousands of images of real children being abused, re-victimizing those children all over again.”
Earlier this month, Newsom also signed two other bills to strengthen revenge porn laws, aimed at protecting more women, teenage girls and others from sexual exploitation and harassment enabled by AI tools. Under state law, it is illegal for adults to create or share AI-generated sexually explicit deepfakes of individuals without their consent. Social media platforms are also required to allow users to report such material for removal.
But some of the laws don’t go far enough, said Los Angeles County District Attorney George Gascón, whose office sponsored some of the proposals. Gascón said the new penalties for sharing AI-generated revenge porn should have included people under 18; state lawmakers scaled the measure back last month to apply only to adults.
“There have to be consequences. You don’t get a free pass just because you’re under 18,” Gascón said in a recent interview.
The law was established in response to San Francisco filing the nation’s first lawsuit against more than a dozen websites that use AI tools promising to “undress any photo uploaded to them within seconds.”
The problem with deepfakes is not new, but experts say it is getting worse as the technology to create them becomes more accessible and easier to use. Over the past two years, researchers have been alarmed by the proliferation of AI-generated child sexual abuse material using depictions of real victims and virtual characters.
In March, the Beverly Hills School District expelled five middle school students for creating and sharing fake nudes of their classmates.
The issue prompted swift bipartisan action in nearly 30 states to address the prevalence of AI-generated sexual abuse content. Some of those measures include protections for everyone, while others outlaw only material depicting minors.
Newsom touted California as both an early adopter and regulator of AI technology, even as the state considers new rules against AI discrimination in employment practices. He said generative AI tools could soon be introduced to tackle highway congestion and provide tax guidance.