Francesca Mani was 14 years old when her name was called over the loudspeaker at Westfield High School in New Jersey. She headed to the principal’s office, where she learned that her photo had been turned into a nude image by artificial intelligence.
Mani had never heard of a “nudify” website or app before. Leaving the principal’s office, she said, she saw a group of girls crying and a group of boys laughing.
“That’s when I realized I should stop crying and get angry, because this is unacceptable,” Mani said.
What happened to Francesca Mani?
Last October, while sitting in her history class, Mani heard rumors that some male students had naked photos of female classmates. She soon learned that she and several other girls at Westfield High School had been targeted.
According to a lawsuit filed by one of the other victims through her parents, a male student at the school had uploaded photos from Instagram to a site called Clothoff, one of the most popular “nudify” websites. 60 Minutes decided to name the site to raise awareness of its potential dangers. According to the social network analysis company Graphika, Clothoff received more than 3 million visitors in the last month alone. Although the website offers to “nudify” both men and women, female nudes are far more popular.
Visitors to the website can upload a photo and receive a free demonstration: they are shown an image of a clothed woman, then, seconds later, the same image rendered naked. The results are strikingly realistic.
Clothoff users must attest they are over 18 to enter the site and are told they cannot use someone else’s photo without permission. The website claims it is “unable to process minors,” but no one at the company responded when 60 Minutes emailed asking for proof of that, among many other questions.
Although Mani never saw what was done to her photos, the same lawsuit alleges that at least one student’s AI nude was shared on Snapchat and seen by several students at the school.
Mani said the situation was made worse by the way she found out about her photos. She recalled how she and other girls were called by name to the principal’s office over the school’s public address system.
“I feel like that was a gross violation of our privacy, while the boys responsible, say, were pulled out of class privately,” she said.
That afternoon, Westfield’s principal sent an email informing parents that “some students were using artificial intelligence to create pornographic images from original photos.” The principal also said the school was investigating and that “at this time we believe the images created have been removed and are not being distributed.”
Fake images, real damage
Mani’s mother, Dorota, who is also an educator, was not convinced. She worries that what is shared online is never really deleted.
“Who printed it? Who took the screenshot? Who downloaded it? You can’t completely erase it,” she said.
The school district declined to give 60 Minutes details about the photos, the students involved, or any disciplinary action taken. The superintendent said in a statement that the district had revised its harassment, intimidation and bullying policy to incorporate AI; Mani and others had spent months urging school officials to do so.
Francesca Mani feels the girls who were targeted have paid a higher price than the boys who made the images.
“Because they have to live with the knowledge that their images may be floating around on the internet,” she said. “And they just have to deal with what the boys did.”
Dorota Mani said she filed a police report, but no charges have been brought.
Yiota Souras is the chief legal officer of the National Center for Missing and Exploited Children. Her organization works with technology companies to flag inappropriate content on their sites. She says that while the images created by AI “undressing” sites are fake, the harm they cause victims is real.
“They will suffer mental health distress and reputational damage,” Souras said. “In a school setting, the problem is amplified because the image was created by one of your peers, which erodes self-confidence and trust.”
Fight for change
60 Minutes found nearly 30 similar cases in U.S. schools over the past 20 months, with more reported around the world.
In at least three of those cases, Snapchat was reportedly used to circulate the AI nudes. One parent told 60 Minutes it took more than eight months to get the account that shared the images taken down. Souras said this lack of responsiveness to victims is a problem the National Center for Missing and Exploited Children sees repeatedly across technology companies.
“That’s not how it should be, right? Parents whose children have exploitative images or child pornography circulating on the internet shouldn’t have to rely on third parties to get tech companies to respond. The companies should take responsibility for removing that content immediately,” she said.
60 Minutes asked Snapchat about the parent who said the company did not respond for eight months. A Snapchat spokesperson said the company could not locate the parent’s request and said in part that Snapchat has “efficient mechanisms for reporting this type of content.” The spokesperson added that Snapchat has a “zero-tolerance policy against such content” and “takes swift action when reported.”
The Justice Department has said that AI-generated nude images of minors are illegal under federal child pornography law if they depict what is defined as “sexually explicit conduct.” But Souras is concerned that some images created by “nudify” sites may not meet that definition.
In the year since Francesca Mani learned she had been targeted, she and her mother Dorota have been urging schools to adopt AI policies. They have also worked with members of Congress on several federal bills. One of them, the Take It Down Act, co-sponsored by Sens. Ted Cruz and Amy Klobuchar, passed the Senate earlier this month and is awaiting a vote in the House. The bill would create criminal penalties for sharing AI nudes and require social media companies to remove the images within 48 hours of receiving a request.