‘Spicy’ AI image generator left nearly 2 million photos and videos exposed


A platform that promises “spicy AI chatting” left nearly two million images and videos, many of them showing private citizens, exposed to the public, 404 Media reported.

Secret Desires, an erotic chatbot and AI image generator, left cloud storage containers holding photos, women’s names, and other personal information, such as workplaces and universities, publicly accessible, according to 404 Media.

This “massive leak” is the latest case of people using generative AI tools to turn innocent photos into nonconsensual explicit deepfakes.


Some of the photos and videos were taken from real influencers, public figures, and women who aren’t famous; the latter category includes Snapchat screenshots and at least one yearbook photo. Other exposed media were user-generated AI images, such as those created with a “faceswap” feature that Secret Desires removed earlier this year.


Like Character.AI or Replika, Secret Desires lets users create AI personas and chat with them. But while pornographic content isn’t allowed on Character.AI (and is only allowed for certain Replika users), Secret Desires’ Quick Start Guide says the platform “provides limitless intimacy and connection.”

As 404 Media found, the AI-generated media in the vulnerable storage containers was mostly explicit. Some of the file names included terms like “17-year-old.”

The company didn’t respond to 404 Media’s request for comment, but the files became inaccessible around an hour after the publication reached out.

For years, women and girls have been victims of explicit deepfakes: AI-generated media, often a woman’s likeness “faceswapped” onto pornographic video. Victims range from celebrities like Taylor Swift to women who are not famous. When the targets are girls, the results constitute child sexual abuse material.

This year, Congress passed the Take It Down Act to combat nonconsensual explicit images, including deepfakes. The law proved controversial, as several free speech and advocacy groups argued that it could be weaponized against consensual explicit material or political speech.


