Nightshade, a newly released, free tool that helps artists prevent the unlicensed use of their artwork for training AI models, was downloaded 250,000 times within five days of its release.
Created by computer science researchers at the University of Chicago, Nightshade has struck a chord with artists worldwide who want to keep their creations from being scraped for AI training without consent.
Ben Zhao, a professor of computer science and lead of the Nightshade project, was astonished by the response, stating, “I expected it to be extremely high enthusiasm. But I still underestimated it. The response is simply beyond anything we imagined.”
With over 2.67 million artists in the U.S. alone, Nightshade’s popularity signals a broad interest among creatives in protecting their intellectual property. The tool works by intentionally altering artworks at the pixel level before they are posted online, a process called “shading,” in order to confuse generative AI image models trained on them.
By introducing these alterations, the tool aims to make training on unlicensed data more costly, nudging AI companies toward licensing images from their creators instead.
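To illustrate the general idea of a bounded pixel-level perturbation, here is a minimal sketch in Python with NumPy. This is emphatically not the Nightshade algorithm, which optimizes its changes against the feature extractors of image-generation models; this toy version only adds small, clipped random noise to show what "altering an image at a pixel level while keeping it visually similar" means. The function name `shade` and the `epsilon` bound are illustrative assumptions.

```python
import numpy as np

def shade(image: np.ndarray, epsilon: float = 8.0, seed: int = 0) -> np.ndarray:
    """Apply a small, bounded perturbation to an 8-bit RGB image array.

    Illustrative only -- NOT Nightshade's method. Each pixel channel
    moves by at most `epsilon` intensity levels, so the image stays
    visually close to the original.
    """
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-epsilon, epsilon, size=image.shape)
    # Work in float, add noise, then clip back to the valid 0-255 range.
    shaded = np.clip(image.astype(np.float64) + noise, 0, 255)
    return shaded.astype(np.uint8)

# A tiny 2x2 gray "image": every channel shifts by at most epsilon levels.
img = np.full((2, 2, 3), 128, dtype=np.uint8)
out = shade(img)
print(np.abs(out.astype(int) - img.astype(int)).max() <= 8)  # True
```

A real data-poisoning perturbation would be chosen adversarially (for example, by gradient-based optimization against a model's image encoder) rather than at random, but the constraint is the same: small, bounded changes that are hard for humans to notice.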
The surge in downloads strained the University of Chicago’s web servers; in response, the team added mirror links so users can download Nightshade from alternative locations in the cloud.
The team behind Nightshade previously released Glaze, a tool that prevents AI models from learning an artist’s signature style by subtly altering pixels. Since its release in April 2023, Glaze has garnered 2.2 million downloads. The high demand for both tools underscores a pressing need among creators for ways to retain control over their work as AI and machine learning evolve.
Although Nightshade can be combined with Glaze for more comprehensive protection, researchers from The Glaze Project have advised caution and thorough testing before publishing images treated with both tools. The developers recommend that artists apply Glaze first, to protect their style, and then Nightshade, to disrupt AI model training.
Interestingly, the researchers behind Nightshade and Glaze have had no direct communication with the developers of AI image-generating technologies, including prominent names such as OpenAI (DALL-E 3), Midjourney, and Stability AI (Stable Diffusion). These models are widely used, including by VentureBeat to create article imagery.