
ricecake,

Did you know that they actually do check? It’s true! There’s a big difference between what happened, which is that CSAM was found in the foundation dataset, and that CSAM actually being used for training.

Stability AI on Wednesday said it only hosts filtered versions of Stable Diffusion and that “since taking over the exclusive development of Stable Diffusion, Stability AI has taken proactive steps to mitigate the risk of misuse.” “Those filters remove unsafe content from reaching the models,” the company said in a prepared statement. “By removing that content before it ever reaches the model, we can help to prevent the model from generating unsafe content.”
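To illustrate the distinction they’re drawing, here’s a minimal Python sketch of the idea of filtering data out before training, assuming a hypothetical upstream safety classifier has already flagged each sample. This is not Stability AI’s or LAION’s actual pipeline, just the general shape of “remove it before the model ever sees it”:

```python
from dataclasses import dataclass


@dataclass
class Sample:
    url: str
    caption: str
    flagged_unsafe: bool  # assume a hypothetical upstream safety check set this


def filter_training_data(samples: list[Sample]) -> list[Sample]:
    """Keep only samples the safety check did not flag."""
    return [s for s in samples if not s.flagged_unsafe]


raw = [
    Sample("http://example.com/a.png", "a cat", flagged_unsafe=False),
    Sample("http://example.com/b.png", "something flagged", flagged_unsafe=True),
]

train_set = filter_training_data(raw)
# Flagged material never reaches the training set, even though it
# existed in the raw foundation data.
assert all(not s.flagged_unsafe for s in train_set)
```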

Also, the people who maintain the foundational dataset do run checks, which was mentioned by the people who reported the issue. Their critique was that the checks had flaws, not that they didn’t exist.

So if your only issue is that they didn’t check, well… You’re wrong.
