The Dark Side of Open Source AI Image Generators

Whether it’s the frowning, high-definition face of a chimpanzee or a psychedelic, pink-and-red-hued doppelganger of himself, Reuven Cohen uses AI-generated images to catch people’s attention. “I’ve always been interested in art and design and video and enjoy pushing boundaries,” he says—but the Toronto-based consultant, who helps companies develop AI tools, also hopes to raise awareness of the technology’s darker uses.

“It can also be specifically trained to be quite gruesome and bad in a whole variety of ways,” Cohen says. He’s a fan of the freewheeling experimentation that has been unleashed by open source image-generation technology. But that same freedom enables the creation of explicit images of women used for harassment.

After nonconsensual images of Taylor Swift recently spread on X, Microsoft added new controls to its image generator. Open source models can be commandeered by just about anyone and generally come without guardrails. Despite the efforts of some hopeful community members to deter exploitative uses, the open source free-for-all is near-impossible to control, experts say.

“Open source has powered fake image abuse and nonconsensual pornography. That’s impossible to sugarcoat or qualify,” says Henry Ajder, who has spent years researching dangerous use of generative AI.

Ajder says that even as it becomes a favorite of researchers, academics, and creatives like Cohen, open source image generation software has become the bedrock of deepfake porn. Some tools based on open source algorithms are purpose-built for salacious or harassing uses, such as “nudifying” apps that digitally remove women’s clothes in images.

But many tools can serve both legitimate and harassing use cases. One popular open source face-swapping program is used by people in the entertainment industry and as the “tool of choice for bad actors” making nonconsensual deepfakes, Ajder says. The high-resolution image generator Stable Diffusion, developed by startup Stability AI, is said to have more than 10 million users and comes with guardrails installed to prevent explicit image creation, along with policies barring malicious use. But the company also open sourced a customizable version of the image generator in 2022, and online guides explain how to bypass its built-in limitations.
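For a sense of what such a guardrail looks like in practice, here is a minimal sketch using Hugging Face’s diffusers library, which distributes the open source Stable Diffusion weights: the standard pipeline runs every output through a safety checker that flags and blacks out images it classifies as explicit. The model ID below is the public v1.5 checkpoint; the article doesn’t specify which release or tooling any given user runs, so treat this as illustrative rather than a description of Stability AI’s exact setup.

```python
# Minimal sketch: the stock diffusers Stable Diffusion pipeline includes a
# safety checker, one example of the guardrails described above. Assumes a
# CUDA GPU and the public v1.5 checkpoint.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

result = pipe("a portrait photo of an astronaut")

# nsfw_content_detected holds one boolean per generated image; any output
# the checker flags is replaced by the pipeline with a solid black frame.
print(result.nsfw_content_detected)
result.images[0].save("astronaut.png")
```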

Meanwhile, smaller AI models known as LoRAs make it easy to tune a Stable Diffusion model to output images with a particular style, concept, or pose—such as a celebrity’s likeness or certain sexual acts. They are widely available on AI model marketplaces such as Civitai, a community-based site where users share and download models. There, one creator of a Taylor Swift plug-in has urged others not to use it “for NSFW images.” However, once downloaded, its use is out of its creator’s control. “The way that open source works means it’s going to be pretty hard to stop someone from potentially hijacking that,” says Ajder.
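To make the mechanism concrete, here is a minimal sketch of how a LoRA plug-in is layered onto a Stable Diffusion base model with the diffusers library. The adapter name “some-user/watercolor-lora” and its weight file are hypothetical stand-ins for the kind of style adapter shared on sites like Civitai; load_lora_weights itself is a real diffusers API.

```python
# Minimal sketch: attaching a LoRA adapter to a Stable Diffusion base model.
# The LoRA repo and file names below are hypothetical placeholders.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

# A LoRA is a small set of low-rank weight updates trained on a narrow style
# or concept; loading it biases the frozen base model toward that target.
pipe.load_lora_weights(
    "some-user/watercolor-lora",           # hypothetical adapter repo
    weight_name="watercolor.safetensors",  # hypothetical weight file
)

image = pipe("a city street at dusk, watercolor style").images[0]
image.save("street.png")
```

Because the adapter is tiny relative to the base model, it is trivial to share, download, and reload, which is what makes marketplaces like Civitai possible and what makes misuse so hard to contain once a file is in circulation.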

4chan, the image-based message board site with a reputation for chaotic moderation, is home to pages devoted to nonconsensual deepfake porn made with openly available programs and AI models dedicated solely to sexual images, WIRED found. Message boards for adult images are littered with AI-generated nonconsensual nudes of real women, from porn performers to actresses like Cate Blanchett. WIRED also observed 4chan users sharing workarounds for NSFW images using OpenAI’s Dall-E 3.

That sort of activity has inspired some users in communities devoted to AI image-making, including on Reddit and Discord, to try to push back against the sea of pornographic and malicious images. Creators also express worry about the software gaining a reputation for NSFW images, encouraging others to report images depicting minors on Reddit and model-hosting sites.