Nightshade - A new data poisoning tool lets artists fight back against generative AI (lemmy.world)
The tool, called Nightshade, messes up training data in ways that could cause serious damage to image-generating AI models. It is intended as a way to fight back against AI companies that use artists’ work to train their models without the creators’ permission....