Nightshade - A new data poisoning tool lets artists fight back against generative AI (lemmy.world)
The tool, called Nightshade, corrupts training data in ways that could cause serious damage to image-generating AI models. It is intended as a way to fight back against AI companies that use artists' work to train their models without the creators' permission....