There have been multiple accounts created with the sole purpose of posting advertisement posts or replies containing unsolicited advertising.

Accounts which solely post advertisements, or persistently post them may be terminated.

Glass0448 ,

OMG. Every other post is saying they’re disgusted about the images part, but that’s a grey area; he’s definitely in trouble for contacting a minor.

Cartoon CSAM is illegal in the United States. AI images of CSAM fall into that category. It was illegal for him to make the images in the first place BEFORE he started sending them to a minor.

thefederalcriminalattorneys.com/possession-of-lol…

en.wikipedia.org/wiki/PROTECT_Act_of_2003

Madison420 ,

Yeah, that’s toothless. The courts decided there is no particular way to age a cartoon; the characters could be from another planet and simply seem younger while being, in actuality, older.

It’s bunk. Let them draw or generate whatever they want; totally fictional events and people are fair game, and quite honestly I’d rather they stay active doing that than get active actually abusing children.

Outlaw shibari and I guarantee you’d have multiple serial killers BTK-ing some unlucky souls.

sugar_in_your_tea ,

Exactly. If you can’t name a victim, it shouldn’t be illegal.

RGB3x3 ,

The problem with AI CSAM generation is that the AI has to be trained on something first. It has to somehow know what a naked minor looks like. And to do that, well… You need to feed it CSAM.

So is it right to be using images of real children to train these AI? You’d be hard-pressed to find someone who thinks that’s okay.

Eezyville ,

You make the assumption that the person generating the images also trained the AI model. You also make assumptions about how the AI was trained without knowing anything about the model.

RGB3x3 ,

Are there any guarantees that harmful images weren’t used in these AI models? Based on how image generation works now, it’s very likely that harmful images were part of the training data.

And if a person is using a model based on harmful training data, they should be held responsible.

However, the AI owner/trainer has even more responsibility in perpetuating harm to children and should be prosecuted appropriately.

Eezyville ,

And if a person is using a model based on harmful training data, they should be held responsible.

I will have to disagree with you for several reasons.

  • You are still making assumptions about a system you know absolutely nothing about.
  • By your logic, if anything is born from something that caused suffering to others (in this example, AI trained on CSAM), then the users of that product should be held responsible for the crime committed to create that product.
    • Does that apply to every product/result created from human suffering or just the things you don’t like?
    • Will you apply that logic to the prosperity of Western nations built on the suffering of indigenous and enslaved people? Should everyone who benefits from Western prosperity be held responsible for the crimes committed against those people?
    • What about medicine? Two examples are the Tuskegee Syphilis Study and the cancer cells of Henrietta Lacks. Medicine benefited greatly from these two examples, but crimes were committed against the people involved. Should every patient in a cancer program that benefited from Ms. Lacks’ cancer cells be required to pay compensation to her family? The doctors that used her cells without permission didn’t.
    • Should we also talk about the advances in medicine found by Nazis who experimented on Jews and others during WW2? We used that data in our manned space program, paving the way to all the benefits we get from space technology.

gardylou ,

LOL, that’s a lot of bullshit misdirection to defend AI child porn. Christ, can there be one social-media-like platform that just has normal fucking people?

Cryophilia ,

If everywhere you go, everyone is abnormal, I have news for you

gardylou ,

If everywhere you go, everyone you know thinks AI-generated child sex stuff is normal, well buddy, I think I’ve got some news for you.

aceshigh ,

The topic that you’re choosing to focus on is really interesting. What are your values?

Eezyville ,

My values are none of your business. Try attacking my arguments instead of looking for something about me to attack.

aceshigh , (edited )

At the root of it, beliefs aren’t based on logic; they’re based on your value system. So why dance around the actual topic?

PotatoKat ,

The difference between the things you’re listing and CSAM is that those other things have actual utility outside of getting off. Were our phones made with human suffering? Probably, but phones have many more uses than making someone cum. Are all those things wrong? Yeah, but at least good came out of them beyond just giving people sexual gratification directly from the harm of others.

aesthelete ,

Are there any guarantees that harmful images weren’t used in these AI models?

Lol, highly doubt it. These AI assholes pretend that all the training data randomly fell into the model (off the back of a truck) and that they cannot possibly be held responsible for that or know anything about it because they were too busy innovating.

There’s no guarantee that most regular porn sites don’t contain csam or other exploitative imagery and video (sex trafficking victims). There’s absolutely zero chance that there’s any kind of guarantee.

sugar_in_your_tea ,

If the images were generated from CSAM, then there’s a victim. If they weren’t, there’s no victim.

this_1_is_mine ,

I hate the no victim argument.

sugar_in_your_tea ,

Why? Can you elaborate?

PotatoKat ,

The images were created using photos of real children, even if said photos weren’t CSAM (which can’t be guaranteed). So the victims are the children whose photos were used to generate the CSAM.

dev_null ,

Sure, but isn’t the perpetrator the company that trained the model without their permission? If a doctor saves someone’s life using knowledge based on Nazi medical experiments, then surely the doctor isn’t responsible for those crimes?

PotatoKat ,

So is the car manufacturer responsible if someone drives their car into the sidewalk to kill some people?

Your analogy doesn’t match the premise. (Again, assuming there is no CSAM in the training data, which is unlikely.) The training data is not the problem; it is how the data is used. Using those same pictures to generate photos of medieval kids eating ice cream with their family is fine. Using them to make CSAM is not.

It would be more like the doctor using the Nazi experiments to do some other fucked-up experiments.

(Also you posted your response like 5 times)

dev_null ,

Sorry, my app glitched out and posted my comment multiple times, and got me banned for spamming… Now that I got unbanned I can reply.

So is the car manufacturer responsible if someone drives their car into the sidewalk to kill some people?

In this scenario no, because the crime was in how someone used the car, not in the creation of the car. The guy in this story did commit a crime, but for other reasons. I’m just saying that if you are claiming that children in the training data are victims of some crime, then that crime was committed when training the model. They obviously didn’t agree to their photos being used that way, and most likely didn’t agree to their photos being used for AI training at all. So by the time this guy came around, they were already victims, and would still be victims even if he hadn’t.

PotatoKat ,

I would argue that the person using the model for that purpose is further victimizing the children. Kinda like how with revenge porn the worst perpetrator is the person who uploaded the content, but every person viewing it from there is furthering the victimization. It is mentally damaging for the victim of revenge porn to know that their intimate videos are being seen/sought out.

sugar_in_your_tea ,

Let’s do a thought experiment, and I’d like you to tell me at what point a victim was introduced:

  1. I legally acquire pictures of a child, fully clothed and everything
  2. I draw a picture based on those legal pictures, but the subject is nude or doing sexually explicit things
  3. I keep the picture for my own personal use and don’t distribute it

Or with AI:

  1. I legally acquire pictures of children, fully clothed and everything
  2. I legally acquire pictures of nude adults, some doing sexually explicit things
  3. I train an AI on a mix of 1&2
  4. I generate images of nude children, some of them doing sexually explicit things
  5. I keep the pictures for my own personal use and don’t distribute any of them
  6. I distribute my model, using the right to distribute from the legal acquisition of those images

At what point did my actions victimize someone?

If I distributed those images and those images resemble a real person, then that real person is potentially a victim.

I will say someone who does this is creepy, and I don’t want them anywhere near children (especially mine, and yes, I have kids), but I don’t think it should be illegal, provided the source material is legal. As soon as I distribute it, though, there absolutely could be a victim. Being creepy shouldn’t be a crime.

PotatoKat ,

I think it should be illegal to make porn of a person without their permission, regardless of whether it is shared. Imagine the person it is based on finds out someone is doing that. That causes mental strain on the person, just like how revenge porn doesn’t actively harm a person but causes mental strain (both the initial upload and the continued use of it). For scenario 1 it would be at step 2, when the porn of the person is made. For scenario 2 it would be a mix between steps 3 and 4.

sugar_in_your_tea ,

Thanks for sharing! I’m going to disagree with pretty much everything, so please stop reading here if you’re not interested.

Imagine the person it is based off of finds out someone is doing that. That causes mental strain on the person…

Sure, and there are plenty of things that can cause mental strain, but that doesn’t make those things illegal. For example:

  • public display of affection - could cause mental strain for people who recently broke up or haven’t found love
  • drug use - recovering addicts could experience mental strain
  • finding out someone is masturbating to a picture of you

And so on. Those things aren’t illegal, but someone could experience mental strain from them. Experiencing that doesn’t make you a victim, it just means you experience it.

revenge porn doesn’t actively harm a person but causes mental strain

Revenge porn damages someone’s reputation, at the very least, which is a large part of why it’s illegal.

Someone keeping those images for private use doesn’t cause harm, therefore it shouldn’t be illegal.

Someone doing something creepy for their own use should never be illegal.

PotatoKat , (edited )

Thanks for sharing! I’m going to disagree with pretty much everything, so please stop reading here if you’re not interested.

I’m not one to stop because of disagreement. You’re arguing in good faith and that’s all that matters imo.

Revenge porn damages someone’s reputation, at the very least, which is a large part of why it’s illegal.

Someone keeping those images for private use doesn’t cause harm, therefore it shouldn’t be illegal.

I believe consent is a larger factor. The person who made it consented to have their photos/videos seen by that person but did not consent to them sharing it.

That’s why it’s not illegal to call someone a slut (even though that also damages reputation)

Someone doing something creepy for their own use should never be illegal.

What if the recording was made without the person’s consent? Say someone records their one-night stand without the other person’s knowledge but doesn’t share it with anyone. Should that be illegal?

sugar_in_your_tea ,

Consent is certainly important, but they don’t need your consent if the image was obtained legally and thus subject to fair use, or if you gave them permission in the past.

That’s why it’s not illegal to call someone a slut (even though that also damages reputation)

It can be, if that constitutes defamation or libel. A passing statement wouldn’t, but a post on a popular website absolutely could. It all comes down to the damages that (false) statement caused.

What if the recording was made without the person’s consent? Say someone records their one-night stand without the other person’s knowledge but doesn’t share it with anyone. Should that be illegal?

That depends on whether there was a reasonable expectation of privacy. If it’s in public, there’s no reasonable expectation of privacy.

In general, I’d say intimacy likely occurs somewhere with a reasonable expectation of privacy, at which point it would come down to consent (whether implied or explicit).

PotatoKat , (edited )

It can be, if that constitutes defamation or libel. A passing statement wouldn’t, but a post on a popular website absolutely could. It all comes down to the damages that (false) statement caused.

If the person is a slut it wouldn’t be libel, but it would still damage their reputation. The person being a slut is true, but calling them one still damages their reputation. And if you release a home-made video of a porn star, it would still be illegal even though it’s not something that would damage their reputation.

The reason for the illegality is the lack of consent not the reputation damage.

That depends on whether there was a reasonable expectation of privacy. If it’s in public, there’s no reasonable expectation of privacy.

Even in a one-party consent state, recording someone while you are having intercourse with them is illegal without their consent, because we make exceptions for especially sensitive subjects such as sex.

To go along with that I also believe that people who uploaded photos of themselves/their children did not consent to having their photos used to make sexual content. If they did it would be another matter to me entirely.

Edit: I also would like to say (and I really am sorry for bringing them into this), but from what you said, you think it would be okay (not socially acceptable, but okay/fine) for someone to take pictures of your kids while they’re at the park and use those to make porn. Really think about that. Is that something you think should be allowed? Imagine someone taking pictures of them at Walmart, and you ask what they’re doing, and they straight up tell you “I like how they look, I’m going to add them to my training data to make porn, don’t worry though, I’m not sharing it with anyone,” and you could do jack shit about it without facing legal consequences yourself. You think that is okay?

sugar_in_your_tea ,

If the person is a slut it wouldn’t be libel, but it would still damage their reputation

Sure, in which case the person wouldn’t legally be a victim. It’s completely legal to tell the truth.

But that strays a bit from the point. Making fake porn of someone falsely represents that person’s character, and is thus illegal, but only if it actually causes damage to their reputation (i.e. you distribute it). Or at least that’s the line of argumentation I think someone would use in states where “revenge porn” isn’t explicitly illegal.

Even if the person is a porn star, the damage is that the porn is coming from somewhere other than the approved channels. Or maybe it’s lost sales. Regardless, there are actual, articulable damages.

The reason for the illegality is the lack of consent not the reputation damage.

Maybe in states where it’s expressly illegal. I’m talking more from a theoretical standpoint where there isn’t an explicit law against it.

If there’s no explicit law, the standard is defamation/libel or violation of a reasonable expectation of privacy.

we make exceptions for especially sensitive subjects such as sex.

That’s the reasonable expectation of privacy standard (that applies inside houses when in bedrooms, bathrooms, etc, even if it’s not your house). If you’re doing it in public, there’s no reasonable expectation of privacy, so I think a court would consider filming in that context to be legal.

Then again, this could certainly vary by jurisdiction.

I also believe that people who uploaded photos of themselves/their children did not consent to having their photos used to make sexual content

They don’t need to consent to every use. If it’s made available for personal use, then any individual can use it for personal use, even if that use is sexual content. As long as they don’t distribute it, they’re fine to use it as they please.

If you want control over how content is used, don’t make it available for personal use.

but from what you said you think it would be okay

Yes. I certainly don’t want them to do that, but I really don’t want to live in a society with the surveillance necessary to prosecute such a law. Someone being creepy with pictures of my kids is disgusting, but it honestly doesn’t hurt me or my kids in any way, provided they don’t share those images with anyone.

So yes, I think it’s a necessary evil to have the kinds of privacy protections I think are valuable to have in a free society. Freedom means letting people do creepy things that don’t hurt anyone else.

PotatoKat ,

Even if the person is a porn star, the damage is that the porn is coming from somewhere other than the approved channels

The damages would be the mental harm done to the victim. Most porn stars have content available for free, so that wouldn’t be a reason for damages.

That’s the reasonable expectation of privacy standard (that applies inside houses when in bedrooms, bathrooms, etc, even if it’s not your house). If you’re doing it in public, there’s no reasonable expectation of privacy, so I think a court would consider filming in that context to be legal.

The expectation of privacy doesn’t apply in one-party consent states, but you still can’t record someone’s sexual activities without their consent.

If you want control over how content is used, don’t make it available for personal use.

I don’t think people who uploaded pictures to Facebook consider that making them available for personal use.

I really don’t want to live in a society with the surveillance necessary to prosecute such a law.

Did I say anything about surveillance? Just because something is made illegal doesn’t mean it’s actively pursued; it just means that if someone gets caught doing it, or gets reported doing it, they can be stopped. Like, you’d be able to stop the person doing that to your children. Or if someone gets their house raided for something else, they can be charged for it. Not every person who has real CSAM creates it or shares it; many times they just get caught on another charge and then it gets found. Or the Geek Squad worker sees it on their computer and reports them.

It would give people avenues to stop others from using photos of their children in such a way. You wouldn’t need any extra surveillance.

Freedom means letting people do creepy things that don’t hurt anyone else.

Do you think it’s okay for someone to have real CSAM? Let’s say the person who made it was properly prosecuted and the person who has the images/videos doesn’t share them, they just have them to use. Do you think that’s okay?

sugar_in_your_tea ,

I don’t think people who uploaded pictures to Facebook consider that making them available for personal use.

Then they shouldn’t have uploaded it to Facebook and made it publicly accessible.

Just because something is made illegal doesn’t mean it’s actively pursued; it just means that if someone gets caught doing it, or gets reported doing it, they can be stopped.

It’s the next logical step for the pearl clutchers and amounts to “thought crime.”

These people aren’t doing anything to my children, they’re making their own images from images they have a right to use. It’s super creepy and I’d probably pick a fight with them if I found out, but I don’t think it should be illegal if there’s no victim.

The Geek Squad worker could still report these people, and it would be the prosecution’s job to prove that the images were acquired or created in an illegal way.

Do you think it’s okay for someone to have real csam?

No, because that increases demand for child abuse. Those pictures are created by abusing children, and getting access to them encourages further child abuse to produce more content.

Possession itself isn’t the problem, the problem is how they’re produced.

I feel similarly about recreational drugs. Buying from dealers is bad because it encourages smuggling and everything related to it. I have no problem with weed or whatever; I have problems with the cartels. At least with drugs there’s a simple solution: legalize it. I likewise want a legal avenue for these people, who would otherwise participate in child abuse, not to abuse children. Them looking at creepy AI content generated from pictures of my child doesn’t hurt my child; just don’t share those images or otherwise let me know about it.

PotatoKat ,

It’s the next logical step for the pearl clutchers and amounts to “thought crime.”

I seriously doubt they would create any more surveillance for that than there already is for real CSAM.

The Geek Squad worker could still report these people, and it would be the prosecution’s job to prove that the images were acquired or created in an illegal way.

That would just make it harder to prosecute people for CSAM since they will all claim their material was just ai. That would just end up helping child abusers get away with it.

Possession itself isn’t the problem, the problem is how they’re produced.

I think the production of generated CSAM is unethical because it still involves photos of children without their consent

No, because that increases demand for child abuse. Those pictures are created by abusing children, and getting access to them encourages further child abuse to produce more content.

There is evidence to suggest that viewing CSAM increases child-seeking behavior. So viewing generated CSAM would most likely have the same, or at least a similar, result. That would mean that even just having access to the material would increase the likelihood of child abuse.

theguardian.com/…/online-sexual-abuse-viewers-con…

The survey was self-reported, so the reality is probably higher than the 42% cited in the study.

I likewise want a legal avenue for these people who would otherwise participate in child abuse to not abuse children.

The best legal avenue for non-offending pedophiles to take is for them to find a psychologist that can help them work through their desires. Not to give them a thing that will make them want to offend even more.

sugar_in_your_tea ,

That would just make it harder to prosecute people for CSAM

That’s true, and an unfortunate part of preserving freedoms. That said, if someone is actually abusing children on the regular, police have a way of tracking that individual to catch them: investigations.

I wish police had to do them more often instead of leaving that job to the prosecution. If that means we need to pull officers away from other important duties like arresting black men for possessing a joint or pulling people over for speeding on an empty highway, I guess that’s what we have to do.

it still involves photos of children without their consent

It involves legally acquired images and is protected under “fair use” laws. You don’t need my permission to exercise your fair use rights, even if I think your use is disgusting. It’s not my business. But if you make it my business (i.e. you tell me), I may choose to assault you and hope the courts will side with me that they constitute “fighting words.”

Just because something is disgusting doesn’t make it illegal.

As for that article:

“This is really significant. We now have a peer-reviewed study to prove that watching [CSAM] can increase the risk of contact.”

It doesn’t prove anything; what it does is draw a correlation between people who search for CSAM on the dark web and are willing to answer a survey (a pretty niche group) and self-reported inclination to contact children. Correlation isn’t proof, it’s correlation.

That said, I don’t know if a better study could or should be conducted. Maybe survey people caught contacting children (sting operations) and those caught just distributing CSAM without child contact. We need to know the difference between those who progress to contact and those who don’t, and I don’t think this survey provides that.

find a psychologist that can help them work through their desire

I agree, and I think that should be widely accessible.

That said, I don’t think giving people a criminal record helps. If they need to be locked up to protect the public (i.e. there are actual victims), then let’s lock them up. But otherwise, we absolutely shouldn’t. Let’s make help available and push people toward getting that help.

deathbird ,

the AI has to be trained on something first. It has to somehow know what a naked minor looks like. And to do that, well… You need to feed it CSAM.

First of all, not every image of a naked child is CSAM. This has actually been kind of a problem, with automated CSAM detection systems triggering false positives on non-sexual images and getting innocent people into trouble.

But also, AI systems can blend multiple elements together. They don’t need CSAM training material to create CSAM, just the individual elements crafted into a prompt sufficient to create the image while avoiding any safeguards.

PotatoKat ,

You ignored the second part of their post. Even if it didn’t use any CSAM, is it right to use pictures of real children to generate CSAM? I really don’t think it is.

deathbird ,

There are probably safeguards in place to prevent the creation of CSAM, just like there are for other illegal and offensive things, but determined people work around them.

ICastFist ,

It has to somehow know what a naked minor looks like.

Not necessarily

You need to feed it CSAM

You don’t. You just need lists of other things, properly tagged. If you feed an AI a bunch of clothed adults and a bunch of naked adults, it will, in theory, “understand” the difference between being clothed and naked and create any of its clothed adults, naked.

With that initial set above, you feed it a bunch of clothed children. When you ask for a naked child, it will either produce a child head with naked adult body, or a “weird” naked child. It “understands” that adult and child are different things, that clothed and naked are different things, and tries to infer what “naked child” looks like from what it “knows”.

So is it right to be using images of real children to train these AI?

This is the real question, and one I don’t know the answer to, because it boils down to consent to being part of a training model, whether your own as an adult or a parent’s on behalf of a child, much like how it works for stock photos and videos.

“I consent to having my likeness used for AI training models, except for any use that involves NSFW content” - Fair enough. Good luck enforcing that.

MDKAOD ,

I think the challenge with generative AI CSAM is the question of where the training data originated. There has to be some questionable data there.

scoobford ,

That would mean you need to enforce the law for whoever built the model. If the original creator has 100TB of cheese pizza, then they should be the one who gets arrested.

Otherwise you’re busting random customers at a pizza shop for possession of the meth the cook smoked before his shift.

erwan ,

There is also the issue of determining whether a given image is real or AI. If AI imagery were legal, prosecution would need to prove images are real and not AI-generated, with the risk of letting real offenders go.

The need to ban AI CSAM is even clearer than cartoon CSAM.

Madison420 ,

And in the process force non-abusers to seek their thrill with actual abuse. Good job, I’m sure the next generation of children will appreciate your prudish, factually inept effort. We’ve tried this with so much shit; prohibition doesn’t stop anything, it just creates a black market and an abusive power system to go with it.

ZILtoid1991 ,

My main issue with generation is the ability of making it close enough to reality. Even with the more realistic art stuff, some outright referenced or even traced CSAM. The other issue is the lack of easy differentiation between reality and fiction, and it muddies the water. “I swear officer, I thought it was AI” would become the new “I swear officer, she said she was 18”.

Madison420 ,

That is not an end user issue, that’s a dev issue. You can’t train on CSAM if it isn’t available, and having it available is a tacit admission of actual possession.

surewhynotlem ,

Would Lisa Simpson be 8 years old, or 43 because the Simpsons started in 1989?

zbyte64 ,

Big brain PDF tells the judge it is okay because the person in the picture is now an adult.

arefx ,

You can say pedophile… that “pdf file” stuff is so corny and childish. Hey guys, let’s talk about a serious topic by calling it things like “pdf files” and “graping.” Jfc

RGB3x3 ,

Why do people say “graping?” I’ve never heard that.

Please tell me it doesn’t have to do with “The Grapist” video that came out on early YouTube.

okiloki ,

To evade censorship filters on social media; same with “PDF files.”

ICastFist ,

TikTok and Instagram are the main culprits; they’ll shadowban, or outright delist, any content that uses no-no words: sex, rape, assault, drugs, die, suicide. It’s a rather big list.
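
To illustrate why those euphemisms work, here is a minimal sketch of a naive keyword filter; it is purely hypothetical, not TikTok’s or Instagram’s actual system:

```python
# Hypothetical sketch of a naive keyword-based moderation filter.
# It flags posts containing exact banned words, which is why
# homophone swaps like "grape" or "pdf file" slip straight through.

BANNED = {"sex", "rape", "assault", "drugs", "die", "suicide"}

def is_flagged(post: str) -> bool:
    # Lowercase and strip punctuation so "Rape!" still matches.
    words = {w.strip(".,!?") for w in post.lower().split()}
    return not BANNED.isdisjoint(words)

print(is_flagged("a documentary about rape statistics"))   # True: flagged despite context
print(is_flagged("a documentary about grape statistics"))  # False: euphemism evades it
```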

surewhynotlem ,

That’s the issue though. As far as I know, it hasn’t been tested in court, and it’s quite possible the law is useless and has no teeth.

With AI porn you can point to real victims whose pictures were used without consent to train the models, and say that’s abuse. But when it’s just a drawing, who is the victim? Is it just a thought crime? Can we prosecute those?

gardylou ,

Yikes at the responses ITT. This shit should definitely be illegal, and the people that want it probably want to abuse real children too. All of you parsing arguments to make goddamn representations of sexual child abuse legal should take a long hard look in the mirror and consider whether or not you yourself need therapy.

dev_null ,

The discussion will never be resolved in your favour, if you shut down the discussion.

ICastFist ,
@ICastFist@programming.dev avatar

the fuck was that spam supposed to do?

Maggoty ,

Sure, and then some judge starts making subjective decisions on drawn/painted art that didn’t hurt anyone and suddenly people are getting hurt.

The justice system is supposed to protect society, not hurt people you don’t like.

ZILtoid1991 ,

While I do think realistic stuff should be illegal, no question, with the loli/shota/whatever stuff you’re opening a can of worms whose logic could be applied to other things too, and some already have.

Regulators used the very same “normalizing certain sexual acts” argument to try to censor more extreme forms of porn and/or the sexual acts themselves, and partly succeeded in the UK. Sure, scat is gross; many like it exactly because of that. One could even talk about the health risks too. Same with fisting, which is too extreme for many and assumed to be extremely painful, because many people’s only exposure to it was from Requiem for a Dream, and it has some associated health risks. However, a lot of that is misrepresentation of the truth: scat isn’t that big of a health risk if you have a good immune system (the rest can be mitigated with precautions and moderation), and fisting isn’t inherently painful (source: me).

And the same is true about loli/shota. The terms aren’t just applied to actual underage characters, but also to the “short adults” common within the VTubing scene, many of whom are also shorter in real life (obligatory “of course not all”). Some of those characters are adults that have an exaggerated, almost child-like physique. Most of it, however, is still just a depiction of children, and I can understand why some want to abstain even from the “adult loli/shota” stuff. I remember when pubic hair removal was becoming mainstream, and many, like radical feminists, feared it would normalize pedophilia; I even got called a pedo by a pubic hair connoisseur for not really liking it. I also don’t want to talk over victims of CSA, many of whom want it banned, and many of whom want it legal.

As for normalizing: the greatest normalization is done by pedos getting into the fandom to recruit others and entertain the idea of a lower age of consent. For a long time, we threw these motherfuckers out of our community. But then 4chan happened, and suddenly these very same people just started screaming “it’s just an edgy joke bro,” so at one point people trying to keep these creeps out of the anime community in general became vilified, and with Gamergate and the culture wars hitting the scene, “gatekeeping the normies” became the priority. These sick fucks became a feature, which created in the anime community

  • a nazi/pedo/weird gatekeeping free space,
  • and a space that doesn’t moralize about loli/shota.

I’ve had a lot of connections to victims of CSA. Most of them were teens, and none were groomed with loli/shota (everyone’s mileage will vary; it’s likely different in the age of the internet), but rather with some non-pornographic work featuring a teen girl and an older man (usually in a historical setting), or just by the perpetrator likening a 25+ year old guy (they often lied about being way younger) going out with a 14 year old girl to her parents’ age gap (I’m in Hungary, where that’s technically legal 🤮). Usually a simple “that big an age gap isn’t okay at your age” talk did wonders, unless the only way for the girl to eat that day was to go out with that guy.

Clbull ,

I thought cartoons/illustrations of that nature were only illegal in the UK (Coroners and Justice Act 2009) and Switzerland. TIL about the PROTECT Act.

ICastFist ,

Several countries prohibit any fictional depictions of child porn, whether drawn, written or otherwise. Wikipedia has an interesting list on that - en.wikipedia.org/…/Legality_of_child_pornography

Rayspekt ,

I wonder if there is significant migration happening into those countries where CSAM is legal.

ICastFist ,

Unlikely. Tourism, on the other hand…

ZILtoid1991 ,

Most people instead take a trip to a place where underage sex workers are common; one can just have an external hard drive and/or a USB stick for that material, which they hide. “An”caps are actively trying to form their own countries, partly to legalize “recordings of crimes,” as they like to call them, if not outright to legalize child rape and child sex trafficking.

ZILtoid1991 ,

The thing about the PROTECT Act is that it relies on the Miller test, which has obvious holes and largely depends on who is reviewing the material. I have heard even the UK law has holes that can be exploited.

mightyfoolish ,

Does this mean the AI was trained on CP material? How else would it know how to do this?

deathbird ,

It would not need to be trained on CP. It would just need to know what human bodies can look like and what sex is.

AIs usually try not to allow certain content to be produced, but it seems people are always finding ways to work around those safeguards.

deraceituno ,

Training is how it knows it…

fidodo ,

You can ask it to make an image of a man made of pizza. That doesn’t mean it was trained on images of that.
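
As a benign illustration of that kind of concept blending, here is a minimal sketch using the open-source diffusers library (the checkpoint name and prompt are illustrative assumptions, not anything mentioned in this thread):

```python
# Minimal sketch: a text-to-image model composing two concepts ("man" and
# "pizza") that it almost certainly never saw combined in its training data.
# Assumes the Hugging Face diffusers library, PyTorch, and a CUDA GPU.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # example checkpoint, an assumption
    torch_dtype=torch.float16,
).to("cuda")

# The model blends what it learned about "man" and "pizza" separately.
image = pipe("a photo of a man made entirely of pizza").images[0]
image.save("pizza_man.png")
```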

dustyData ,

But it means that it was trained on people and on pizza. If it can produce CSAM, it means it had access to pictures of naked minors. Even if it wasn’t in a sexual context.

bitwaba ,

Minors are people. It knows what clothed people of all ages look like. It also knows what naked adults look like. The whole point of AI is that it can fill in the gaps and create something it wasn’t trained on. Naked + child is just a simple equation for it to solve

herrvogel ,

The whole point of those generative models is that they are very good at blending different styles and concepts together to create coherent images. They’re also really good at editing images to add or remove entire objects.

MeanEYE ,

You can always tell when someone has no clue about AI but has read online about it.

mightyfoolish ,

I think what @deathbird meant was that the AI could be trained on what sex is and what children are as separate concepts. Then a user request could put those two concepts together.

But as the replies I got show, there were multiple ways this could have been accomplished. All I know is AI needs to go to jail.

desktop_user ,

AIs usually try not to allow certain content to be produced, but it seems people are always finding ways to work around those safeguards.

Local model go brrrrrr

joel_feila ,

Well, some LLMs have been caught with CP in their training data.

ZILtoid1991 ,

Likely yes, and even commercial models have an issue with CSAM leaking into their datasets. The scummiest of all of them likely get an offline model, then add their collection of CSAM to it.

TheObviousSolution ,

He then allegedly communicated with a 15-year-old boy, describing his process for creating the images, and sent him several of the AI generated images of minors through Instagram direct messages. In some of the messages, Anderegg told Instagram users that he uses Telegram to distribute AI-generated CSAM. “He actively cultivated an online community of like-minded offenders—through Instagram and Telegram—in which he could show off his obscene depictions of minors and discuss with these other offenders their shared sexual interest in children,” the court records allege. “Put differently, he used these GenAI images to attract other offenders who could normalize and validate his sexual interest in children while simultaneously fueling these offenders’ interest—and his own—in seeing minors being sexually abused.”

I think the fact that he was promoting child sexual abuse and was communicating with children and creating communities with them to distribute the content is the most damning thing, regardless of people’s take on the matter.

Umm … That AI generated hentai on the page of the same article, though … Do the editors have any self-awareness? Reminds me of the time an admin decided the best course of action to call out CSAM was to directly link to the source.

Maggoty ,

Wait do you think all Hentai is CSAM?

And sending the images to a 15 year old crosses the line no matter how he got the images.

BangCrash ,

Hentai is obviously not CSAM. But having a hentai image on an article about CSAM and child grooming is pretty poorly thought out

Maggoty ,

You know, looking at it again, I think it’s an ad.

BangCrash ,

It’s an ad for another article on that site.

Saledovil ,

Umm … That AI generated hentai on the page of the same article, though … Do the editors have any self-awareness? Reminds me of the time an admin decided the best course of action to call out CSAM was to directly link to the source.

The image depicts mature women, not children.

BangCrash ,

Correct. And OP’s not saying it is.

But to place that sort of image on an article about CSAM is very poorly thought out

StaySquared ,

I wonder if cartoonized animals in a CSAM theme are also illegal… guess I can contact my local FBI office and provide them with the web addresses of such content. Let them decide what is best.

Kedly , (edited )

Ah yes, more bait articles rising to the top of Lemmy. The guy was arrested for grooming; he was sending these images to a minor. Outside of Digg, does anyone have any suggestions for an alternative to Lemmy and Reddit? Lemmy’s moderation quality is shit, and I think I’m starting to figure out where I land on the success of my experimental stay with Lemmy.

Edit: Oh god, I actually checked Digg out after posting this, and the site design makes it look like you’re scrolling through all of the ads at the bottom of a bullshit clickbait article.

far_university1990 ,

Go to an instance that moderates the way you like.

FiniteBanjo , (edited )

Lemmy as a whole does not have moderation. Moderators on Lemmy.today cannot moderate Lemmy.world or lemmy.ml; they can only remove problematic posts as they come, as they see fit, or block entire instances, which is rare.

If you want stricter content rules than any of the available federated instances then you’ll have to either:

  1. Use a centralized platform like Reddit but they’re going to sell you out for data profits and you’ll still have to occasionally deal with shit like “The Donald.”
  2. Start your own instance with a self-hosted server, create your own code of conduct, and hire moderators to enforce it.

Kedly ,

Yeah, I know; that’s why I’m finding Lemmy isn’t for me. This new rage bait every week is tiring and not adding anything to my life except stress, and once I started looking at who the moderators were whenever Lemmy found a new thing to rave about, I found that often there were only 1-3 actual moderators, which, fuck that. With Reddit, the shit subs were the exception; here it feels like they ALL (FEEL being a key word here) have a tendency to dive face first into rage bait.

Edit: Most of the Reddit migration happened because Reddit fucked over their moderators. A lot of us were happy with well-moderated discussions, and if we didn’t care to have moderators, we could have just stayed with Reddit after the moderators were pushed away.

moon ,

You can go to an instance that follows your views more closely and start blocking instances that post low-quality content to you. Lemmy is a protocol, not a single community, so the moderation and post quality are determined by the instance you’re on and the community you’re with.

ArmokGoB ,

This is throwing a blanket over the problem. When the mods of a news community allow bait articles to stay up because they (presumably) further their views, it should be called out as a problem.

nutsack ,

you are too optimistic about the internet

Kedly , (edited )

I fail to see what part of my comment is optimistic? xD

BugKilla ,

That anywhere else is better.
