Nightshade: The Revolutionary Tool for Protecting Designs from Artificial Intelligence Models | University of Chicago

2024-01-23 17:58:19

Researchers at the University of Chicago have launched a new tool called Nightshade to help designers and artists protect their work from being used to train artificial intelligence models.

According to the tool’s website, Nightshade works by adding subtle digital perturbations to every element of an image. These perturbations confuse artificial intelligence models that attempt to train on the images: when a user later prompts such a model to generate a design, the output comes out completely different from what the user intended.

The researchers who developed the tool, which is part of The Glaze Project, said that the digital modification Nightshade applies cannot be removed or circumvented in any way and remains embedded in the composition of the image, even when it is screenshotted, cropped, redesigned, or has its elements modified.
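The article does not describe Nightshade’s internals, but the general idea of adding small, hard-to-notice pixel-level changes to an image can be illustrated with a toy sketch. The snippet below is an assumption-laden illustration, not Nightshade’s actual algorithm: the real tool computes optimized, model-aware perturbations designed to mislead training, while this sketch only adds bounded random noise; the file names and the `epsilon` bound are hypothetical placeholders.

```python
import numpy as np
from PIL import Image


def add_small_perturbation(input_path: str, output_path: str, epsilon: int = 4) -> None:
    # Load the image as a signed integer array so the noise can go negative.
    image = np.asarray(Image.open(input_path).convert("RGB"), dtype=np.int16)

    # Bounded random noise in [-epsilon, +epsilon] per channel. Nightshade instead
    # optimizes the perturbation against image-generation models so that training
    # on the image teaches the model the wrong concept; random noise only shows
    # the "small, hard-to-see change" part of the idea.
    noise = np.random.randint(-epsilon, epsilon + 1, size=image.shape, dtype=np.int16)

    # Keep pixel values in the valid 0-255 range and save the result.
    perturbed = np.clip(image + noise, 0, 255).astype(np.uint8)
    Image.fromarray(perturbed).save(output_path)


if __name__ == "__main__":
    # Hypothetical file names, for illustration only.
    add_small_perturbation("artwork.png", "artwork_shaded.png")
```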

Unprecedented demand

The new tool was widely welcomed by artists, who praised it as a powerful offensive measure against artificial intelligence companies, one that could limit their ability to exploit artworks for free model training while reaping huge profits without acknowledging artists’ efforts.

Immediately after launch, demand for the tool exceeded the developers’ expectations; they confirmed that the University of Chicago servers hosting it were experiencing unusual slowness due to the surge of installation requests from users.

Demand was so high that some artists waited up to 8 hours to install the new tool, which is available for Mac computers with Apple M1, M2, and M3 processors, as well as Windows 10 and Windows 11 computers.

When installing the tool, users must agree not to alter or modify its source code, or to profit from its distribution or sale.

Since artificial intelligence spread widely at the end of 2022, anger has grown among content creators, from news sites producing text to designers and photographers producing visual work, because companies developing artificial intelligence models used that publicly available content to train their models without the creators’ permission.

