The first government lawsuit against websites and apps creating deepfake AI nude images of people is coming from right here in San Francisco, as City Attorney David Chiu is suing the top 16 purveyors of fake nude images created without the subjects' consent.
The issue got a great deal of press when fake nude images of Taylor Swift flooded social media platforms in January. But it got less press when the New York Times reported that dozens of high school sophomore girls were victimized by deepfake nudes made of them in Westfield, New Jersey this spring, or when 16 eighth-grade students in Beverly Hills endured the same.
So the New York Times broke the story this morning that SF City Attorney David Chiu was suing the 16 biggest websites and apps that produce deepfake porn. Chiu then formally announced that lawsuit in the Thursday morning press conference seen below.
“These websites allow users to upload photos of real, clothed individuals,” Chiu said at the conference. “AI technology will quote ‘undress’ these persons in the photo, creating pornographic images. Images are generated without the consent of the persons depicted, and are virtually indistinguishable from real photographs.”
“Some sites create images of adults, while others allow users to create child pornography,” he added. According to a press release from Chiu’s office, these 16 sites have generated more than 200 million views during just the first six months of 2024.
X's downward spiral continues through the bottom of the barrel as it begins to show advertisements for nonconsensual AI-undressing apps: "Did you hack me? Delete this photo immediately!" the ad says https://t.co/L3y79ajvsk
— Jason Koebler (@jason_koebler) December 15, 2023
The Chronicle notes that “The companies allegedly creating the AI-generated content are Sol Ecom Inc., Briver LLC, ITAI Tech Ltd., Defirex OÜ, Itai OÜ, Augustin Gribinets and 50 unnamed individuals.” The names of the websites are redacted from Chiu’s public lawsuit filings so as not to drive traffic to them.
But there were some big names mentioned at the press conference. Deputy city attorney Karun Tilak noted that the AI model Stable Diffusion was used to create several of these deepfake platforms, though more recent versions of that product have added safeguards to prevent its use in creating nonconsensual porn. Still, the older models that did allow this remain very much publicly available.
And as we reported in May, the current industry darling OpenAI is thinking about getting into the porn game, which could be a watershed moment for unscrupulous porn hucksters.
Chiu’s lawsuit seems to be casting a wide net. His press release notes that “Any person who has been the victim of nonconsensual deepfake pornography or has relevant information, is welcome and encouraged to contact the City Attorney’s Office through its consumer complaint web portal or hotline at (415) 554-3977.”
Related: Sure Enough, OpenAI Is Considering Allowing Its Tools to Generate Porn [SFist]
Image: SF City Attorney