Man Arrested for Creating Child Porn Using AI

A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger posed by increasingly ubiquitous generative AI being put to nefarious uses.

Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage as law enforcement led McCorkle, still in his work uniform, away from the theater in handcuffs.

ReallyActuallyFrankenstein , (edited )

It’s hard to have a nuanced discussion because the article is so vague. It’s not clear what he’s specifically been charged with (beyond “obscenity,” not a specific child abuse statute?), because any simulated CSAM laws have been, to my knowledge, all struck down when challenged.

I completely get the “lock them all up and throw away the key” visceral reaction - I feel that too, for sure - but this is a much more difficult question. There are porn actors over 18 who look younger; do the laws bar them from work that would be legal for others who merely look older? If an AI was trained exclusively on those over-18 people, would its outputs then not be CSAM, even if the images produced have features that look under 18?

I’m at least all for a “fruit of the poisoned tree” theory: if an AI model’s training data includes actual CSAM, then its outputs can and should be made illegal. Intentionally deepfaking real people under 18 is also not black and white (looking again to the harm factor), but I think it can be justifiably prohibited. I also think distribution of completely fake CSAM can arguably be outlawed (the situation here), since it will soon be impossible to tell AI imagery from real imagery, and allowing it would undermine enforcement of vital anti-real-CSAM laws.

The real hard case is producing and retaining images of fully fake people, with no real CSAM in the training data, solely locally (possession crimes). That’s really tough, because not only does its creation not directly hurt anyone, there’s a possible benefit: it might diminish the market for real CSAM (potentially saving unrelated children from the abuse flowing from that demand), and could divert the producer’s impulse away from preying on children around them out of unfulfilled desire.

Could, because I don’t think there are studies that answer whether those things are true.

mpa92643 ,

I mostly agree with you, but a counterpoint:

Downloading and possession of CSAM seems to be a common first step in a person initiating communication with a minor with the intent to meet up and abuse them. I’ve read many articles over the years about men getting arrested for trying to meet up with minors, and one thing that shows up pretty often in these articles is the perpetrator admitting to downloading CSAM for years until deciding the fantasy wasn’t enough anymore. They become comfortable enough with it that it loses its taboo and they feel emboldened to take the next step.

CSAM possession is illegal because possession directly supports creation, and creation is inherently abusive and exploitative of real people. Generating it from a model trained on non-abusive content probably isn’t exploitative, but there’s a legitimate question as to whether we as a society decide it’s associated closely enough with real-world harms that it should be banned.

Not an easy question for sure, and it’s one that deserves to be answered using empirical data, but I imagine the vast majority of Americans would flatly reject a nuanced view on this issue.

MagicShel , (edited )

The problem is that empirical data cannot be morally or ethically gathered. You can’t show a bunch of people child porn and then make a statistical observation of whether they are more likely to assault children. So we have to go forward without that data.

I will anecdotally observe that anal sex, oral sex, and facials have gone up between partners as their prevalence in porn has gone up. That suggests, but does not prove, a direct statistical harm caused by even “ethically produced CSAM.”

mpa92643 ,

True, it wouldn’t be ethical to conduct an experiment, but we can (and probably do) collect lots of observational data that can provide meaningful insight. People are arrested at all stages of CSAM-related offenses, from mere possession through distribution and solicitation to active abuse.

While observational data and correlations are inherently weaker than experimental data, they can at least provide some insight. For example: “what percentage of those in possession of only artificially generated CSAM for at least one year go on to solicit minors,” versus the same figure for “real” CSAM.

If it seems that artificial CSAM is associated with a lower rate of solicitation, or if it ends up decreasing overall demand for “real” CSAM, then keeping it legal might provide a real net benefit to society and its most vulnerable even if it’s pretty icky.

That said, I have a nagging suspicion that the thing many abusers like most about CSAM is that it’s a real person and that the artificial stuff won’t do it for them at all. There’s also the risk that artificial CSAM reduces the taboo of CSAM and can be an on-ramp to more harmful materials for those with pedophilic tendencies that they otherwise are able to suppress. But it’s still way too early to know either way.

HelixDab2 ,

the thing many abusers like most about CSAM is that it’s a real person and that the artificial stuff won’t do it for them at all.

Perhaps. But what about when they can’t tell the difference between real and virtual? It seems like the allure of all pornography is the fantasy, rather than the reality. That is, you may enjoy extreme BDSM pornography, and enjoy seeing a person flogged until they’re bleeding, or seeing needles slowly forced through their penis, but do you really care that it’s a real person who’s going to end the scene, take a shower, and go watch a few episodes of “The Good Place” with their dog before bed? Or is it about the power fantasy that you’re constructing in your head about that scene? How important is the reality of the scene, versus being able to suspend your disbelief long enough to get sexual gratification from it? If the whole scene was done with really good practical effects and CG, would your experience as a user–even if you were aware–be different?

usualsuspect191 ,

I will anecdotally observe that anal sex, oral sex, and facials have gone up between partners as their prevalence in porn has gone up. That suggests, but does not prove, a direct statistical harm caused by even “ethically produced CSAM.”

Can we look at trends between consenting adults (who are likely watching porn of real people, by the way) as an indicator of what pedophiles will do? I’m not so sure. It’s not like step-sibling sex is suddenly through the roof now that it’s the “trend” in porn.

Looking specifically at fake rape porn, and seeing whether it increases rates of rape in the real world, might be a better indicator.

MagicShel , (edited )

That’s fair. I tried to make clear that my interpretation is not in any way scientific or authoritative. Better correlations are probably possible.

ETA on further thought: I wonder if the prevalence of incest porn has had an effect on actual incest rates. That might be a much closer correlation due to the similar social taboo. But I’m not sure we have good data on that, either.

HelixDab2 ,

CSAM possession is illegal because possession directly supports creation

To expound on this: prior to this point, the creation of CSAM required that children be sexually exploited. You could not have CSAM without children being harmed. But what about when no direct harm has occurred? Is lolicon hentai ‘obscene’? Well, according to the law and case law, yes, but it’s not usually enforced. If we agree that drawings of children engaged in sexual acts aren’t causing direct harm–that is, children are not being sexually abused in order to create the drawings–then how different is a computer-generated image that isn’t based on any specific person or event? It seems to me that whether or not a pedophile might eventually decide they want more than AI-generated images is not relevant. Treating a future possibility as a foregone conclusion is exactly the rationale behind Reefer Madness and the idea of ‘gateway’ drugs.

Allow me to float a second possibility that will certainly be less popular.

Start with two premises: first, pedophilia is a characteristic that appears to be an orientation. That is, a true pedophile–a person exclusively sexually attracted to pre-pubescent children–does not choose to be a pedophile, any more than a person chooses to be gay. (My understanding is that very few pedophiles are exclusively pedophilic, though, and that many child molesters are opportunistic sexual predators rather than pedophiles.) Secondly, rates of sexual assault appear to have declined as pornography has become more widely available. So the question I would ask is: would wide availability of AI-generated CSAM–CSAM that didn’t cause any real, direct harm to children–actually decrease rates of child sexual assault?

2xsaiko , (edited )

Hard to say. I generally agree with what you’ve said though. Also, lots of people have other fantasies that they would never enact in real life for various reasons (e.g. it’s unsafe, illegal, or both; edit: I should also absolutely list non-consensual here). I feel like pedophilia isn’t necessarily different.

However, part of the reason loli/whatever is also illegal to distribute (it is, right? I assume it is at least somewhere) is that it otherwise helps people facilitate/organize the distribution of real CSAM, which increases demand for it. That’s what I’ve heard, at least, and it makes sense to me. And I feel like that would apply to AI-generated material as well.

snooggums ,

Even worse, you don’t need CSAM to start with. If a learning model has regular porn and nude reference photography of people under 18 (the kind used for drawing anatomy), it has enough information to combine the two. Hell, it probably doesn’t even need the people under 18 to actually be nude.

Hell, society tends to assume any nudity under 18 to be CSAM anyway, because someone could see it that way.

HelixDab2 ,

Because any simulated CSAM laws have been, to my knowledge, all struck down when challenged.

To the best of my knowledge, calling drawn works obscene has been upheld in courts, most often because the artist(s) lack the financial ability to fight the charges effectively. The artist for the underground comic “Boiled Angel” had his conviction for obscenity upheld–most CSAM work falls under obscenity laws–and ended up giving up the fight to clear his name.

ReallyActuallyFrankenstein ,

Oh, for sure. I’m talking about laws specifically targeted at minors. “Obscenity” is a well-established catch-all, but if you are trying to protect children from abuse, it’s a very blunt instrument and not as effective as targeted abuse and trafficking statutes. The statutory schemes used to outlaw virtual CSAM have, to my knowledge, failed.

For example: en.wikipedia.org/…/Ashcroft_v._Free_Speech_Coalit…

That case was statutorily superseded in part by the PROTECT Act, which attempted to differentiate itself by…relying on an obscenity standard. So it’s a bit illusory that it does anything new.

HelixDab2 ,

The PROTECT Act has been, so far, found to be constitutional, since it relies on the obscenity standard in regards to lolicon hentai. Which is quite worrisome. It seems like it’s a circular argument/tautology; it’s obscene for drawn art to depict child sexual abuse because drawings of child sexual abuse are obscene.

damnedfurry ,

I don’t know if it’s still a thing, but I’m reminded of some law or regulation that was passed a while back in Australia, iirc, that barred women with A-cup busts from working in porn, the “reasoning” being that their flatter chests made them look too similar to prepubescent girls, lol…

Not only stupid but also quite insulting to women, imo.

DmMacniel ,

I don’t see how children were abused in this case? It’s just AI imagery.

It’s the same as saying that people get killed when you play first person shooter games.

Or that you commit crimes when you play GTA.

Samvega ,

It’s just AI imagery.

Fantasising about sexual contact with children indicates that this person might groom children for real, because they have a sexual interest in doing so. As someone who was sexually assaulted as a child, it’s really not something that needs to happen.

HelixDab2 ,

indicates that this person might groom children for real

But unless they have already done it, that’s not a crime. People are prosecuted for actions they commit, not their thoughts.

Chozo ,

I agree, this line of thinking quickly spirals into Minority Report territory.

CeruleanRuin ,

It will always be a gray area, and should be, but there are practical and pragmatic reasons to ban this imagery no matter its source.

HubertManne ,

Seems like, then, fantasizing about shooting people or carjacking or such indicates that person might do that activity for real too. There are a lot of carjackings nowadays, and you know GTA is real popular. Mmmm. /s But seriously, I’m not sure your first statement has merit, especially when you look at where to draw the line: anime, manga, oil paintings, books, thoughts in one’s head.

Samvega ,

If you want to keep people who fantasise about sexually exploiting children around your family, be my guest. My family tried that, and I was raped. I didn’t like that, and I have drawn my own conclusions.

HubertManne ,

Yeah, and the same goes if you want to keep around people who fantasize about murdering folk. You can’t say one thing is a thing without saying the other is. I’m sorry you were raped, but I doubt it would have been stopped by banning lolita.

Samvega ,

I don’t recall Nabokov’s novel Lolita saying that sexualising minors was an acceptable act.

Thanks for the strawman, though, I’ll save it to burn in the colder months.

HubertManne ,

You can call it a strawman, but whether the evil in question is killing folks or raping folks, the treatment should be the same when discussing the non-actual versus the actual. You can say this thing is a special case, but when it comes to freedom of speech, which covers anything not based in actual events (writing, speaking, thinking, art), special circumstances become a real slippery slope. (Which can also be brought up as a fallacy, though like all “fallacies” it depends a lot on what backs it up and how it’s presented.)

CeruleanRuin ,

If you’re asking whether anime, manga, oil paintings, and books glorifying the sexualization of children should also be banned, well, yes.

This is not comparable to glorifying violence, because real children are victimized in order to create some of these images, and the fact that it’s impossible to tell which is which makes it even more imperative that all such imagery be banned, because the existence of fakes makes it even harder to identify real victims.

It’s like you know there’s an armed bomb on a street, but somebody else filled the street with fake bombs, because they get off on it or whatever. Maybe you’d say making fake bombs shouldn’t be illegal because they can’t harm anyone. But now suddenly they have made the job of law enforcement exponentially more difficult.

KillerTofu ,

How was the model trained? Probably on existing CSAM images. Those children are victims. Making derivative images of “imaginary” children doesn’t negate the exploitation of children all the way down.

So no, you are making false equivalence with your video game metaphors.

grue ,

But the AI companies insist the outputs of these models aren’t derivative works in any other circumstances!

Dkarma ,

Cuz they’re not

fernlike3923 ,

A generative AI model doesn’t require the exact thing it creates to be in its training data. It most likely just combined regular nudity with a picture of a child.

finley ,

In that case, the images of children were still used without their permission to create the child porn in question

fernlike3923 ,

That’s a whole other thing than the AI model being trained on CSAM. I’m currently neutral on this topic, so I’d recommend you reply to the main thread.

finley ,

How is it different?

fernlike3923 , (edited )

It’s not CSAM in the training dataset; it’s just pictures of children/people that are already publicly available. This goes to the copyright side of AI, not to illegal training material.

finley ,

It’s images of children used to make CSAM. No amount of mental gymnastics can change that, nor the fact that those children’s consent was not obtained.

Why are you trying so hard to rationalize the creation of CSAM? Do you actually believe there is a context in which CSAM is OK? Are you that sick and perverted?

Because it really sounds like that’s what you’re trying to say, using copyright law as an excuse.

fernlike3923 ,

I am not trying to rationalize it, I literally just said I was neutral.

nomous ,

It’s every time with you people: you can’t have a discussion without accusing someone of being a pedo. If that’s your go-to, it says a lot about how weak your argument is, or what your motivations are.

Dkarma ,

Lol, you don’t understand that the faces AI generates are not real. In any way.

MagicShel ,

That’s not really a nuanced take on what is going on. A bunch of images of children are studied so that the AI can learn how to draw children in general. The more children in the dataset, the less any one of them influences or resembles the output.

Ironically, you might have to train an AI specifically on CSAM in order for it to identify the kinds of images it should not produce.

finley ,

Why does it need to be “nuanced” to be valid or correct?

TheRealKuni ,

Because the world we live in is complex, and rejecting complexity for a simple view of the world is dangerous.

See You Can’t Get Snakes from Chicken Eggs from the Alt-Right Playbook.

(Note I’m not accusing you of being alt-right. I’m saying we cannot ignore nuance in the world because the world is nuanced.)

Dkarma ,

Well, it doesn’t, but it’s not correct.

Dkarma ,

Wrong again.

CeruleanRuin ,

Good luck convincing the AI advocates of this. They have already decided that all imagery everywhere is theirs to use however they like.

Diplomjodler3 ,

While I wouldn’t put it past Meta & Co. to explicitly seek out CSAM to train their models on, I don’t think that is how this stuff works.

DmMacniel ,

Can you or anyone verify that the model was trained on CSAM?

Besides, an AI model doesn’t need to have explicit content to derive from in order to create a naked child.

KillerTofu ,

You’re defending the generation of CSAM pretty hard here, with some vague “but no child we know of was involved” as a defense.

DmMacniel ,

I just hope that the models aren’t trained on CSAM, which would make generating stuff they can fap to ““ethically reasonable,”” as no children would be involved. And I hope that those who have those tendencies can be helped, one way or another, in a way that doesn’t involve chemical castration or incarceration.

Dkarma ,

No they are not.

Dkarma ,

Wrong.

TallonMetroid ,

Well, the image generator had to be trained on something first in order to spit out child porn. While it may be that the training set was solely drawn/rendered images, we don’t know that, and even if the output were in that style, it might very well be photorealistic images generated from real child porn and run through a filter.

MagicShel ,

An AI that is trained on children and nude adults can infer what a nude child looks like without ever being trained specifically with those images.

Dkarma ,

Just say you don’t get how it works.

lunarul ,

we don’t know that

might

Unless you’re operating under “guilty until proven innocent”, those are not reasons to accuse someone.

timestatic ,

Then every artist creating loli porn would also have to be jailed for child pornography.

CeruleanRuin ,

Not a great comparison, because unlike with violent games or movies, you can’t say that there is no danger to anyone in allowing these images to be created or distributed. If they are indistinguishable from the real thing, it then becomes impossible to identify actual human victims.

There’s also a strong argument that the availability of imagery like this only encourages behavioral escalation in people who suffer from the affliction of being a sick fucking pervert pedophile. It’s not methadone for them, as some would argue. It’s just fueling their addiction, not replacing it.

leraje ,

The difference is intent. When you’re playing an FPS, the intent is to play a game. When you play GTA, the intent is to play a game.

The intent with AI generated CSAM is to watch kids being abused.

datavoid ,

Who’s to say there aren’t people playing games to watch people die?

Stern ,

Lolicon fans in absolute shambles.

Rai ,

CANNED IN BANADA

Nollij ,

This creates a significant legal issue - AI generated images have no age, nor is there consent.

The difference in appearance between age 16 and 18 is minimal, but the legal difference is immense. This is based entirely on a concept that cannot apply.

How do you define what’s depicting a fictional child? Especially without including real adults? I’ve met people who believe that preferring a shaved pubic area is pedophilia. This is even though the vast majority of adult women do so. On the flip side, teenagers from the 70s and 80s would be mistaken for 40+ today.

Even the extremes aren’t clear. Adult star “Little Lupe”, who was 18+ in every single appearance, lacked most secondary sex characteristics. Experts testified in court that she could not possibly be an adult. Except she was, and there’s full documentation to prove it. Would AI trained exclusively on her work be producing CSAM?

CeruleanRuin ,

To paraphrase someone smarter than me, “I’ll know it when I see it.”

But naturally I don’t want to see it. One of the things I miss least about reddit is the constant image posts of anime characters, who may be whatever age they say they are, but who are clearly representative of very young girls with big tiddies bolted on. It’s gross, but it is also a problem that’s more widespread and nebulous than most people are willing to admit.

Xatolos ,

“I’ll know it when I see it.”

I can’t think of anything scarier than that when dealing with the legality of anything.

lightnsfw ,

I’m nearly 40 and still regularly get carded while other people out with me do not, so it’s not just “we card everyone.” People are bad at judging age.

DragonTypeWyvern ,

en.m.wikipedia.org/wiki/I_know_it_when_I_see_it

They really downplayed the criticism of the phrase in the article; it’s actually criticised quite often for being so subjective.

BakerBagel ,

Sometimes something can’t have a perfect definition. What’s the difference between a gulf, a bay, and a channel? Where does the shoreline become a beach? When does an arid prairie become a desert? How big does a town have to grow before it becomes a city? At what point does a cult become a religion? When does a murder become premeditated vs. a crime of passion? When does a person become too drunk to give active consent? Human behavior is a million shades of gray, just like everything else we do, and the things that don’t fit into our clear definitions are where the law needs to be subjective.

Nollij ,

Just when trying to guess someone’s age (we’ll assume completely family-friendly and above board), think back to high school. How old did you and your peers look? Now go take a look at high schoolers today. They probably seem a lot younger than you did. The longer it’s been (i.e. the older you are), the younger they look. Which means, “when I see it” depends entirely on the age of the viewer.

This isn’t even just about perception and memory - modern style is based on and heavily influenced by youth, and it continues to move in that direction. This is why actors in their 30s - with carefully managed hair, skin, makeup, and wardrobe - have been able to convincingly portray high schoolers. This means that it’s not just you - teens really are looking younger each year. But they’re still the same age.

Drewelite ,

Well, also this is nothing new, unfortunately. See Lolis. Or maybe don’t…

beepnoise ,

Honestly, I don't care if it is AI and not real; I'm glad that the man was arrested. He needs some serious help for sexualising kids.

DmMacniel ,

And that guy gets that help in a prison, riiight.

Samvega ,

Chemical castration offers the best success rates; see my comment under OP citing research.

DmMacniel ,

I don’t think that’ll actually help him.

Samvega ,

You don’t think that reducing testosterone and therefore sex drive will change offending rates? That is contrary to research which has reliably found that this is the best therapy, in terms of effectiveness on recidivism.

DmMacniel ,

That guy didn’t even commit anything; he just had AI imagery depicting children.

That guy has a mental problem that you can’t treat by chemical castration alone. He needs more than that.

Samvega ,

That does not change the fact that chemical castration is the most successful treatment we have to stop CSA recidivism at present.

That guy didn’t even commit anything; he just had AI imagery depicting children.

Possessing and distributing images that sexually objectify children may be a crime, even if generated by AI.

abfarid , (edited )

Cutting off their testicles and straight up executing them would also reduce the offending rates. Even more effectively than chemical castration, I’m sure. But we wouldn’t be calling that helping the offender, would we? And the comment above was specifically talking about helping them.
What we have now is more of a middle ground between the amount of damage caused to the patient and safety guarantees for society. We obviously prioritize safety for society, but we should be striving for less damage to the patient, too.

Samvega ,

…we should be striving for less damage to the patient, too.

Can you make someone just not sexually interested in something they find arousing? As far as I know, conversion therapy for non-heterosexual people doesn’t have good success rates. Also, those therapies tended to involve some form of harm, from what I’ve heard.

abfarid ,

Can you make someone just not sexually interested in something they find arousing?

No, I can’t. Doesn’t mean that we (as a society) shouldn’t be working on finding ways to do it or finding alternative solutions. And it’s necessary to acknowledge that what we have now is not good enough.

those therapies tended to involve some form of harm

They probably did. But nobody here is claiming those were good or helping the patients either.

treefrog ,

Depending on the state, yes actually.

I did time in a medium security facility that also did sex offender treatment (I was there on drug charges). I still have friends that went through that program.

The men who were actually invested in getting better, got better. The ones invested in staying well, are still well.

Cosmonauticus ,

You and I both know he’s not going to get it. I have a kind of sympathy for people attracted to kids **but who refuse to act on it.** They clearly know it’s not normal and recognize the absolute life-destroying damage they can cause if they act on it. That being said, there aren’t many places you can go to seek treatment. Any institution that advertised treatment would have people outside with pitchforks and torches.

Before anyone tries to claim I’m pro-pedo, you can fuck right off. I just wish it were possible for people who are attracted to kids and not out touching them to get some kind of therapy and medication to make them normal (or destroy their sex drive) before something terrible happens.

Samvega ,

to get some kind of therapy and medication to make them normal

Hi, Psychologist here. Does society have strong evidence that therapeutic interventions are reducing rates of, say, the most common disorders of anxiety and depression? Considering that the rates of these are going up, I don’t think we can assume there’s a hugely successful therapy to help those attracted to CSA images to change. Psychology is not a very good science principally because it offers few extremely effective answers in the real world.

In terms of medication androgen antagonists are generally used. This is because lowering testosterone generally leads to a lower sex drive. Here is an article about those drugs, including an offender who asked for them: theguardian.com/…/what-should-we-do-about-paedoph…

TW: the article contains discussion of whether offenders are even psychologically disordered, when set within a historical cultural context of child-marriage. This paragraph is two above the illustration of people trapped within concentric circular walls, and starts “In the 2013 edition …”.

Collis began to research the treatment and decided that it was essential to his rehabilitation. He believes he was born a paedophile, and that his attraction to children is unchangeable. “I did NOT wake up one morning and decide my sexual preference. I am sexually attracted to little girls and have absolutely no interest in sex with adults. I’ve only ever done stuff with adults in order to fit in with what’s ‘normal’.” For Collis, therefore, it became a question of how to control this desire and render himself incapable of reoffending.

[…]

Many experts support Aaron Collis’s self-assessment, that paedophilia is an unchangeable sexual preference. In a 2012 paper, Seto examined three criteria – age of onset, sexual and romantic behaviour, and stability over time. In a number of studies, a significant proportion of paedophiles admitted to first experiencing attraction towards children before they had reached adulthood themselves. Many described their feelings for children as being driven by emotional need as well as sexual desire. As for stability over time, most clinicians agreed that paedophilia had “a lifelong course”: a true paedophile will always be attracted to children. “I am certainly of the view,” Seto told me, “that paedophilia can be thought of as a sexual orientation.”

Brain-imaging studies have supported this idea. James Cantor, a psychiatry professor at the University of Toronto, has examined hundreds of MRI scans of the brains of paedophiles, and found that they are statistically more likely to be left-handed, shorter than average, and have a significantly lower density of white matter, the brain’s connective tissue. “The point that’s important for society is that paedophilia is in the brain at all, and that the person didn’t choose it,” Cantor told me. “As far as we can tell, they were born with it.” (Not that this, he emphasised, should excuse their crimes.)

[…]

Clinical reality is a little more complicated. “There’s no pretence that the treatment is somehow going to cure them of paedophilia,” Grubin told me. “I think there is an acceptance now that you are not going to be able to change very easily the direction of someone’s arousal.” Grubin estimates that medication is only suitable for about 5% of sex offenders – those who are sexually preoccupied to the extent that they cannot think about anything else, and are not able to control their sexual urges. As Sarah Skett from the NHS put it: “The meds only take you so far. The evidence is clear that the best treatment for sex offending is psychologically based. What the medication does is help people have a little bit of control, which then allows them to access that treatment.”

Some research on success rates:

Prematurely terminating treatment was a strong indicator of committing a new sexual offense. Of interest was the general improvement of success rates over each successive 5-year period for many types of offenders. Unfortunately, failure rates remained comparatively high for rapists (20%) and homosexual pedophiles (16%), regardless of when they were treated over the 25-year period. [pubmed.ncbi.nlm.nih.gov/11961909/]

Within the observation period, the general recidivism and sexual recidivism rates were 33.1% and 16.5%, respectively, and the sexual contact recidivism rate was 4.7%. [journals.sagepub.com/doi/abs/…/0306624X231165416 - this paper says that suppressing the sex drive with medication was the most successful treatment]

Men with deviant sexual behavior, or paraphilia, are usually treated with psychotherapy, antidepressant drugs, progestins, and antiandrogens, but these treatments are often ineffective. Selective inhibition of pituitary–gonadal function with a long-acting agonist analogue of gonadotropin-releasing hormone may abolish the deviant sexual behavior by reducing testosterone secretion. [www.nejm.org/doi/full/…/nejm199802123380702 - this paper supports that lowering testosterone works best]

z3rOR0ne ,

Thank you for such a well-laid-out response and the research to back it up. I rarely see people approaching the subjects of pedophilia, and how best to treat pedophiles, rationally and analytically.

It’s understandable, considering the harm they can cause to society, that most can only ever view them as nothing more or less than monsters. And indeed, those who are incapable of comprehending the harm they cause, or of empathizing with those they have harmed or could harm, are IMHO some of the more loathsome individuals.

That said, I think people are too often willing to paint others whose proclivities are so alien and antithetical to our own not only as monsters, but as monsters that aren’t worth understanding with any degree of nuance, and that we ultimately do ourselves and future generations a disservice by not at least attempting to address the issue at hand, in the hopes that the most harmful parts of our collective psyche can be treated, palliatively if need be, to the best of our ability.

Your annotated sources indicate that the path forward is not nearly as clear as detractors in the “pedophiles are simply monsters and there’s no reason to look into their motives further” camp would like to believe, while the very existence of the attempted treatments points out that there is more work to be done toward finding a more lasting and successful treatment.

Like many of the psychological ailments plaguing societies today, you cannot simply kill and imprison the problem away. That is always a short-term (albeit at times temporarily effective) solution. Greatly reducing the occurrence of pedophilia will ultimately require more of this kind of research, analysis, and study.

Again, I thank you for your nuanced post, and commend you for taking your nuanced stance as well.

FlyingSquid ,

Isn’t it more about power than sex many times anyway?

Rai ,

For people actually abusing? Spot on, most of the time.

For non-offending paedos? Nah… a horrible affliction.

treefrog ,

You don’t actually know this. See my comment here.

lemm.ee/post/40672622/14329948

Chozo ,

Do you think he's going to get help in prison?

treefrog ,

Possibly yes. See my comment here.

lemm.ee/post/40672622/14329948

MataVatnik ,

Pretty sure the training data sets are CSAM

Phen ,

I would imagine that AI having been trained on both pictures of kids and on adult sexual content would be somewhat enough to mix the two. Even if the output might end up uncanny.

MataVatnik ,

That’s the most likely case. Now my question is: was he using somebody else’s generator, or did he train this one himself?

ContrarianTrail ,

One doesn’t need to browse AI-generated images for longer than 5 seconds to realize a model can generate a ton of stuff that you can know with absolute certainty wasn’t in the training data. I don’t get why people insist on the narrative that it can only output copies of what it has already seen. What’s generative about that?

MediaBiasFactChecker Bot ,

Futurism - News Source Context
Information for Futurism:
> MBFC: Pro-Science - Credibility: High - Factual Reporting: Mostly Factual - United States of America
> Wikipedia about this source

Search topics on Ground.News: https://futurism.com/the-byte/man-arrested-csam-ai

Media Bias Fact Check | bot support

subignition ,

Florida Man strikes again...

aesthelete ,

Good. I do not think society owes pedos a legal means to create CSAM.

garpujol ,

Images of crimes should be illegal. No one should be able to draw a murder.

B312 ,

It’s always the Florida men.

BonesOfTheMoon ,

Could this be considered a harm reduction strategy?

Not that I think CSAM is good in any way, but if it saves a child, would it be worthwhile? Like, if these pedos were to use AI images instead of actual CSAM, would that be any better?

I’ve read that CSAM sites on the dark web number into the hundreds of thousands. I just wonder if it would be a less harmful thing since it’s such a problem.
