Artists unleash a new weapon to ward off AI theft

Nightshade's creators claim they are not aiming to annihilate AI, but to advocate for artists' rights.

New Ally

Artists have a new ally in their fight against AI exploitation: Nightshade, a free software tool that can "poison" AI models that feed on their work. The tool was first reported by VentureBeat.

Nightshade uses PyTorch, an open-source machine learning framework, to scan an image and tweak its pixels in a way that fools AI programs into seeing something else.
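The general idea behind this kind of pixel-level trickery can be sketched in PyTorch. The snippet below is a simplified, hypothetical illustration of an adversarial perturbation: it nudges an image's pixels, within a tiny per-pixel budget, so that a classifier leans toward a wrong label. Nightshade's actual optimization is more sophisticated and targets generative text-to-image models, and the toy CNN, `poison` function, and all parameters here are assumptions for demonstration only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Toy classifier standing in for the image model an attacker targets.
# (Illustrative only: Nightshade's real method does not attack a tiny CNN.)
model = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(8, 10),
)
model.eval()

def poison(image, target_class, eps=0.03, steps=40, lr=0.005):
    """Nudge pixels so the model leans toward `target_class`,
    keeping each pixel change within +/-eps (near-invisible)."""
    delta = torch.zeros_like(image, requires_grad=True)
    target = torch.tensor([target_class])
    for _ in range(steps):
        loss = F.cross_entropy(model(image + delta), target)
        loss.backward()
        with torch.no_grad():
            delta -= lr * delta.grad.sign()  # step toward the target label
            delta.clamp_(-eps, eps)          # enforce the invisibility budget
            delta.grad.zero_()
    return (image + delta).clamp(0, 1).detach()

image = torch.rand(1, 3, 32, 32)        # stand-in for an artwork
poisoned = poison(image, target_class=3)
```

To a human the poisoned image looks essentially unchanged (every pixel moved by at most 0.03 on a 0-to-1 scale), but a model training on many such images learns a corrupted association between the visual content and its label.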

This means that any AI model that trains on "poisoned" images will start to confuse objects, corrupting its output for everyone who uses it. The team behind Nightshade has also created Glaze, another tool that helps artists hide their personal style from AI scrapers.

The Glaze Project urges artists to use both tools together in a two-pronged strategy: Nightshade as "offense" and Glaze as "defense". The team says the goal is not to ruin AI models but to raise the cost of training on unlicensed images, so that licensing them from their creators becomes the better option.

Facts About Nightshade

Nightshade, part of the Glaze Project, is an initiative to protect artists from generative AI models that use their work without permission or compensation. The tool lets artists add invisible changes to the pixels of their art before uploading it online; if the altered image is scraped into an AI training set, it can cause the resulting model to break in chaotic and unpredictable ways. For example, dogs become cats, cars become cows, and so forth.

The project is developed by a team of researchers at the University of Chicago, led by Professor Ben Zhao. It is based on the idea of data poisoning, a technique that manipulates the data used to train machine learning models in order to degrade their performance or cause them to behave in unexpected ways. The project also includes Glaze, a tool that allows artists to mask their personal style to prevent it from being scraped by AI companies.

The project aims to empower artists and give them more control over their intellectual property. It also hopes to raise awareness of the ethical and legal issues surrounding generative AI models and the data they rely on. The tools are free and available for artists to use.
