Multiple accounts have been created with the sole purpose of posting advertisements, or replies containing unsolicited advertising.

Accounts which solely post advertisements, or which persistently post them, may be terminated.

d3Xt3r (edited)

Sorry, but I disagree. Note that I don’t disagree with the idea or the technology itself (or the concept of the Fediverse); the problem is the current state of development. Saying that it’s the moderators’ job doesn’t absolve the software of responsibility when the software, in its current state, doesn’t really provide any decent tools for moderation and user access controls.

CSAM was never a problem on well-configured traditional forums built on software such as Invision, vBulletin, etc. To elaborate: traditional forums give you a LOT of controls for filtering out the kind of users who post such content. For instance, most forums won’t even let you post until you complete an interactive tutorial first (reading the rules and replying to a bot to indicate you’ve understood them, etc). On top of that, you can have various levels of restrictions, e.g. someone with fewer than 100 posts, or an account less than a month old, may not be able to post any links or images. Some forums also have a trust system, where a mod can mark your account as trusted or verified, granting you further rights. You can even require manual moderator approval before image posting rights are granted; in that case, a mod would review your posting history and ensure that your posts genuinely contributed to the community and that you’re unlikely to be a troll/karma-farmer account (see the sketch after this paragraph).
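To make the tiered-access idea concrete, here’s a minimal sketch of what such checks could look like. The field names and thresholds (100 posts, 30 days) are hypothetical illustrations of the general approach, not the actual API or defaults of Invision, vBulletin, or Lemmy:

```python
# Hypothetical sketch of tiered posting permissions, as described above.
# Names and thresholds are illustrative only, not any real forum's API.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Account:
    created_at: datetime
    post_count: int
    completed_intro_tutorial: bool   # finished the rules/bot walkthrough
    trusted: bool                    # set manually by a moderator
    image_rights_approved: bool      # granted after a mod reviews post history

def can_post(account: Account) -> bool:
    # Brand-new accounts can't post at all until the interactive tutorial is done.
    return account.completed_intro_tutorial

def can_post_links(account: Account, now: datetime) -> bool:
    # Hypothetical thresholds: 100 posts and a 30-day-old account,
    # unless a moderator has already marked the account as trusted.
    old_enough = now - account.created_at >= timedelta(days=30)
    established = account.post_count >= 100
    return account.trusted or (old_enough and established)

def can_post_images(account: Account, now: datetime) -> bool:
    # Image posting additionally requires explicit moderator approval.
    return can_post_links(account, now) and account.image_rights_approved
```

The point isn’t these exact numbers, it’s that the software enforces graduated trust automatically, so throwaway accounts never get image rights in the first place.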

So, short of accounts getting compromised/hacked, it’s very difficult to have this sort of stuff happen on a well-configured traditional forum.

I used to be a mod on a couple of popular forums back in the day, and I even ran my own server for a few years (using Invision Power Board), and never once have I had to deal with such content.

The fact is that Lemmy, in its present state, is woefully inadequate for dealing with such content. Dealing with CSAM should never be a volunteer mod’s job - that stuff can scar you for life, or trigger PTSD/bad memories for those who might’ve suffered abuse in their past. If people have to be involved, it should be a job for professionals who’re trained to deal with this stuff.

Once again, I don’t disagree with the general idea or the concept of Lemmy; it’s just unfortunate that the Reddit exodus happened while the software was essentially an alpha.
