New York (CNN) —
“All you need to be a victim is a human appearance,” says attorney Carrie Goldberg, explaining the risks of deepfake porn in the age of artificial intelligence.
Revenge porn, or the non-consensual sharing of sexual images, has been around for almost as long as the internet, but with the proliferation of AI tools, anyone can be targeted by this form of harassment, even if they never took or sent a nude photo. Artificial intelligence tools can now superimpose a person’s face onto a nude body or manipulate existing photos to make it appear as if the person is not wearing any clothes.
Terms of Service with Clare Duffy | Deepfake revenge porn is on the rise: what can you do?
A new kind of deepfake revenge porn is taking the internet by storm. Using artificial intelligence, bad actors can do things like superimpose your face onto a nude body to create convincing and harmful images. Tech companies and lawmakers are trying to play catch-up, but the reality is that these tools remain easily accessible. So how can you keep yourself and your loved ones safe? Digital harassment and sexual assault attorney Carrie Goldberg has some answers.
November 12, 2024 • 28 minutes
…
Over the past year, non-consensual pornographic images generated by AI have targeted everyone from high-profile women like Taylor Swift and Rep. Alexandria Ocasio-Cortez to high school girls.
For people who find out that they or their children have been the subject of deepfake porn, the experience is usually frightening and overwhelming, said Goldberg, who runs C.A. Goldberg, a New York-based firm that represents victims of sexual crimes and online harassment. “Especially when they’re young and don’t know how to cope and the internet is such a big, nebulous place,” she said.
But there are steps targets of this type of harassment can take to protect themselves, and places they can turn to for help, Goldberg said in an interview on CNN’s new technology podcast “Terms of Service with Clare Duffy.”
Terms of Service aims to demystify new technologies that listeners encounter in their daily lives. (You can listen to the full conversation with Goldberg here.)
Goldberg said the first step for people targeted by AI-generated sexual images, however counterintuitive it may feel, should be to screenshot them.
“The natural reaction is to get this off the internet as quickly as possible,” Goldberg said. “But if you want to have the option of filing a criminal complaint, you need evidence.”
Next, look for the forms that platforms like Google, Meta and Snapchat provide to request the removal of explicit images. Nonprofits like StopNCII.org and Take It Down can also help facilitate the removal of such images across multiple platforms at once, although not all sites cooperate with these groups.
A bipartisan group of senators sent an open letter in August urging nearly a dozen tech companies, including X and Discord, to join those programs.
The fight against non-consensual explicit images and deepfakes has gained rare bipartisan support. After a group of teens and parents affected by AI-generated pornography testified at a hearing on Capitol Hill, Republican Sen. Ted Cruz, Democratic Sen. Amy Klobuchar and others backed a bill that would criminalize publishing such images and require social media platforms to remove them upon notification from victims.
But for now, victims must navigate a patchwork of state laws. In some jurisdictions, there is no criminal law prohibiting the creation or sharing of explicit deepfakes of adults. (AI-generated sexual images of children typically fall under child sexual abuse material laws.)
“My advice to would-be offenders is to stop being the scum of the earth and trying to steal people’s images and use them to humiliate them,” Goldberg said. “There’s not much victims can do to prevent this…We’ll never be completely safe in a digital society, but it’s up to all of us not to behave like complete outlaws.”