With the rapid progress of artificial intelligence, artists face a new dilemma: sharing their artwork online exposes it to being scraped and used by AI models, while keeping it offline limits their professional growth. A new AI tool, however, may soon help artists stop AI companies from using their work without permission.
Nightshade supports artists’ rights
Researchers at the University of Chicago have recently developed a new AI tool called Nightshade. According to MIT Technology Review, Nightshade "poisons" an artist's work by making subtle changes to its pixels, so that other AI models can no longer correctly recognize and use the image.
These changes are too small for the human eye to detect; their only purpose is to make machine-learning models mistake the image for something other than what it actually depicts, so it cannot serve its intended artistic or training purpose. And because image-generation models depend on accurate training data, the poisoning essentially renders the image useless to them.
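Nightshade's exact algorithm is not described here, but the general idea of imperceptible pixel perturbations can be sketched with a standard adversarial-example technique. The snippet below is a minimal illustration, not Nightshade itself: it assumes PyTorch and torchvision, a pretrained ResNet-18 classifier, and a hypothetical `poison` helper, and it nudges an image toward a wrong class with per-pixel changes far too small for a viewer to notice.

```python
# Illustrative sketch only (NOT Nightshade's actual method): a single targeted
# FGSM-style step shows how tiny pixel changes, invisible to people, can shift
# what an image classifier "sees". Model choice and class labels are arbitrary.
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),  # pixel values scaled to [0, 1]
])

def poison(image_path: str, target_class: int, epsilon: float = 2 / 255) -> torch.Tensor:
    """Return a copy of the image nudged toward `target_class`,
    changing each pixel by at most `epsilon`."""
    x = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    x.requires_grad_(True)

    # Loss toward the *wrong* target class; its gradient tells us which
    # direction each pixel should move to fool the classifier.
    loss = F.cross_entropy(model(x), torch.tensor([target_class]))
    loss.backward()

    # One signed-gradient step against the loss, clamped to valid pixel range.
    return (x - epsilon * x.grad.sign()).clamp(0, 1).detach()

# Example: nudge a dog photo toward an unrelated ImageNet class.
# poisoned = poison("dog.jpg", target_class=281)  # 281 = "tabby cat"
```

The point of the sketch is only to show that a perturbation bounded by a few intensity levels per pixel, which is visually indistinguishable from the original, can still change a model's interpretation of the image.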
Another notable aspect of Nightshade is that if another AI model scrapes a large number of poisoned images into its training data, the performance of that model itself can be disrupted, leaving it unable to generate accurate images.
For example, the researchers fed Stable Diffusion, an image-generation model, 50 poisoned images of dogs and then asked it to generate new dog images. The output was badly degraded: distorted animals with too many limbs or cartoonish faces that only vaguely resembled dogs and bore little visual relation to real photographs.
The test showed clearly that AI models cannot make proper use of images poisoned by Nightshade, and that training on such images degrades their results.
If Nightshade completes its development, it could be a promising tool for artists around the world, helping to protect their copyright and control over how their works are used.
Conclusion
Researchers at the University of Chicago are developing a new AI tool called Nightshade, which makes small changes to the pixels of an artist's image so that other AI models can no longer recognize it correctly. The changes are too small for the human eye to see and do not affect image quality. Nightshade has performed well in its initial tests and could soon significantly change how image-generation models are trained.
