Teen girls are being victimized by deepfake nudes. One family is pushing for more protections

A mother and her 14-year-old daughter are advocating for better protections for victims after AI-generated nude images of the teen and other female classmates were circulated at a high school in New Jersey.

Meanwhile, on the other side of the country, officials are investigating an incident involving a teenage boy who allegedly used artificial intelligence to create and distribute similar images of other students – also teen girls – who attend a high school in suburban Seattle, Washington.

The disturbing cases have put a spotlight yet again on explicit AI-generated material that overwhelmingly harms women and children and is booming online at an unprecedented rate. According to an analysis by independent researcher Genevieve Oh that was shared with The Associated Press, more than 143,000 new deepfake videos were posted online this year, which surpasses every other year combined.

PhantomAudio ,

Gee, here is a novel idea: don’t let children have access to social media. That would solve a lot of other problems also.

Sandbag ,

While I agree with that in principle, we shouldn’t block people, even young people, from accessing a lot of information.

Twitter, while now a cesspool, still has a lot of academics on it that share new ideas and discoveries.

Reddit, while shit, also has the value of helping people find niche hobbies and communities.

YouTube, while turning into shit, allows people access to video tutorials and explanations. Hell, while I was in school, half the time teachers assigned homework that required watching a YouTube video.

While blocking the youth from accessing social media is an idea, I think the drawbacks are too great.

TheEighthDoctor ,

What’s the fundamental difference between a deep fake and a good Photoshop and why do we need more laws to regulate that?

LeadSoldier ,

Fear.

UlrikHD ,

Lower skill ceiling. One option can be done by pretty much anyone at a high volume of output; the other would require a lot of training and isn’t available to your average basement dweller.

Good luck trying to regulate it though. Pandora’s box is open, and you won’t be able to stop the FOSS community from working on the tech.

funkless_eck ,

Photoshop is almost entirely automated now, and there are phone apps that take it even further.

UlrikHD ,

Photoshop (if it does?) and any other believable face-swap apps use some sort of neural network, which is exactly the problematic tech we are talking about.

gandalf_der_12te ,

Honest opinion:

We should normalize nudity.

That’s the only healthy relationship that we can have with our bodies in the long term.

ParsnipWitch ,

For this to happen, people would probably need to stop judging each other on their bodies; I am pretty sure there is a connection there. With how extremely superficial media and many relationships are, and with how we value women in particular, this would require a lot of change in people and society.

I also think it would be a good thing, but we still have to do something about it until we reach that point.

SuddenDownpour ,

There’s a pretty big fucking difference between normalizing nudity and putting the faces of 14-year-olds into porn videos through deepfakes.

Lemming6969 ,

Good luck both policing it and having a society with a healthy relationship with our biology and AI technology without some sort of societal perspective change.

zbyte64 ,

Healthy is doing a lot of heavy lifting in the context of PDFs.

Basil ,

This isn’t even the problem going on, though? Sure, normalize nudity, whatever, that doesn’t fix deep faked porn of literal children.

adrian783 ,

outlaw children

boatsnhos931 ,

Hear, hear!!

Daft_ish ,

Hey, that’s the name of the Outlaw Star x Wolf Children crossover fanfic.

Socsa ,

Fucking finally

Basil ,

thank you biden

davysnavy , (edited )

It’s the idea that people may be less sexually depraved if we normalize people being naked. Duh. In my opinion the whole world just needs to normalize sexuality across the board. I think it would solve many of society’s problems, frankly

Thief_of_Crows ,

Why is that a problem though? You’re allowed to draw a picture of a specific child naked, so why is it suddenly a crime if you use a computer to do it really well?

Basil ,

I shouldn’t have to answer this

GiddyGap , (edited )

Having spent many years in both the US and multiple European countries, I can confidently say that the US has the weirdest, most unnatural, and most unhealthy relationship with nudity.

Aceticon ,

There might be an upside to all this, though maybe not for these girls: with enough of this, people will eventually just stop believing that any “leaked” nude pictures are real. That would be a great thing for people who had real nude pictures leaked (which, once on the Internet, are pretty hard to stop spreading), because other people will just presume they’re deepfakes.

Mind you, it would be a lot better if people in general culturally evolved beyond being preachy monkeys who pass judgment on others for being photographed in their birthday suit, but that’s clearly asking too much. So I guess people assuming all such things are deepfakes until proven otherwise is at least better than the status quo.

yamanii ,

Photoshop is a decades-old tool and people still believe almost every picture.

Aceticon ,

Fair point.

drislands ,

Yes, but good Photoshop has a high skill ceiling. Generative AI does not.

virock ,

I studied Computer Science so I know that the only way to teach an AI agent to stop drawing naked girls is to… give it pictures of naked girls so it can learn what not to draw :(

rustydomino ,

hmmm - I wonder if it makes sense to use generative AI to create negative training data for things like CP. That would essentially be a victimless way to train the AIs. Of course, that creates the conundrum of who actually verifies the AI-generated training data…

gohixo9650 ,

This doesn’t work. The AI still needs to know what CP is in order to create CP for negative training, so you would need to feed it CP first. A recent example is how OpenAI was labelling “bad text”:

The premise was simple: feed an AI with labeled examples of violence, hate speech, and sexual abuse, and that tool could learn to detect those forms of toxicity in the wild. That detector would be built into ChatGPT to check whether it was echoing the toxicity of its training data, and filter it out before it ever reached the user. It could also help scrub toxic text from the training datasets of future AI models.

To get those labels, OpenAI sent tens of thousands of snippets of text to an outsourcing firm in Kenya, beginning in November 2021. Much of that text appeared to have been pulled from the darkest recesses of the internet. Some of it described situations in graphic detail like child sexual abuse, bestiality, murder, suicide, torture, self harm, and incest.

source: time.com/6247678/openai-chatgpt-kenya-workers/
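To make that pipeline concrete, here’s a minimal sketch of the labeled-data approach the quoted article describes: train a binary classifier on human-labeled snippets, then use it to screen generated text before it reaches the user. The tiny dataset, model choice, and threshold below are illustrative placeholders only (real systems use far larger corpora and neural models). Note that this is exactly why the labeled “bad” examples are unavoidable: the detector has to see them to learn them.

```python
# Minimal illustrative sketch: learn a toxicity detector from
# human-labeled text, then use it to filter generated output.
# The two-example dataset and the 0.5 threshold are hypothetical
# placeholders, not OpenAI's actual setup.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Human-labeled snippets: 1 = toxic, 0 = benign (stand-ins only).
texts = ["<redacted toxic snippet>", "here is a harmless sentence"]
labels = [1, 0]

# TF-IDF features feeding a simple linear classifier.
detector = make_pipeline(TfidfVectorizer(), LogisticRegression())
detector.fit(texts, labels)

def screen(candidate: str):
    """Return the text only if the detector rates it benign."""
    p_toxic = detector.predict_proba([candidate])[0][1]
    return candidate if p_toxic < 0.5 else None
```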

lntl ,

AI bad

Chakravanti ,

Addendum: closed-source AI.

Olgratin_Magmatoe ,

Open source AI would be even worse.

Chakravanti ,

According to what logic? Like I’m ever going to trust some lying asshole to hide his instructions for fucking anything that’s MINE. News Alert: “Your” computer ain’t yours.

Olgratin_Magmatoe ,

People have been trying to circumvent ChatGPT’s filters; they’ll do the exact same with open-source AI. But it’ll be worse because it’s open source, so any built-in feature to prevent abuse could just get removed and the model recompiled by whoever.

And that’s all even assuming there ever ends up being open-source AI.

Chakravanti ,

Your logic is bass-ackwards. Knowing the source publicly means the shit gets fixed faster. Closed source just doesn’t get fixed 99% of the time, because there’s only one motherfucker to do the fixing, and usually they just don’t do it.

Olgratin_Magmatoe ,

You can’t fix it with open source. All it takes is one guy making a fork and removing the safeguards because they believe in free speech or something. You can’t have safeguards against misuse of a tool in an open source environment.

I agree that closed source AI is bad. But open source doesn’t magically solve the problem.

Chakravanti ,

Forks are productive. You’re just wrong about it. I’ll take FOSS over closed source. I’ll trust the masses reviewing FOSS over the one asshole doing, or rather not doing, exactly that.

Olgratin_Magmatoe , (edited )

The masses can review said asshole all they like, but it doesn’t mean anything, because nobody can stop them from removing the safeguards.

Then all of a sudden you have an AI that anybody can use that will happily generate even the most morally bankrupt things with ease and speed.

I love FOSS, but AI is inherently unethical. At best it steals people’s work. At worst it makes CP/nonconsensual porn.

Chakravanti ,

The user can. You don’t understand anything, and I ain’t on a laptop to school you. Gimme a few hours till I get home and I’ll teach you the best I can, if you care to learn.

AI may be unethical, but Terminator’s AS was the best depiction of a FOSS AI.

Olgratin_Magmatoe ,

You don’t understand anything, and I ain’t on a laptop to school you. Gimme a few hours till I get home and I’ll teach you the best I can, if you care to learn

Thanks for being civil. Coming in with condescension really shows how smart you are.

If you care about the truth, quit being a dick.

Chakravanti ,

Well, if you’re so fucking sensitive that I can’t be frank with you, then piss off with your ignorance. Sure, I’m a jackass for offering to help with that.

Olgratin_Magmatoe ,

offering to help

Calling me stupid does not help.

Chakravanti ,

Ignorance is not an insult. Stupid means lack of intelligence; ignorance means lack of knowledge. Taking offense means that you do not learn, and THAT is stupid.

badbytes ,

AI legal matters are going to be crazy

Hadriscus ,

she should do a couple of him rawdogging it and pass it around

HiddenLayer5 , (edited )

Maybe I’m just naive of how many protections we’re actually granted but shouldn’t this already fall under CP/CSAM legislation in nearly every country?

Karyoplasma ,

In Germany, it would.

HiddenLayer5 ,

Not a lawyer, but 99% sure it’s the same here in Canada as well.

legios ,

Australia too. Hentai showing underage people is illegal here. From my understanding it’s all a little grey depending on the state and whether the laws are enforced, but if it’s about victimisation the law will be pretty clear.

Fal ,

Absolutely absurd. Criminalizing drawings is the stupidest thing in the world.

This case should already be illegal under harassment or similar laws. There’s no reason to make drawings illegal

Metz ,

In Germany, even a written story about it is illegal; it is then considered “textual CSAM”.

Wilibus ,

Nah dude, I am perfectly cool with animated depictions of child sexual exploitation being in the same category as regular child exploitation, regardless of the fact that she’s actually a 10,000-year-old midget elf or whatever paper-thin explanation they provide not to be considered paedos.

Fal ,

Well that’s just absurd and you should rethink your position using logic rather than emotion.

zbyte64 ,

“that’s just absurd”

Well that’s an emotional response that includes no specifics or appeals to logic.

“rethink your position using logic rather than emotion”

Lol.

Fal ,

Well that’s an emotional response that includes no specifics or appeals to logic.

You clearly have no logic so I’m not going to appeal to it. Just a general comment.

zbyte64 ,

I am shocked to my core that you would write off an observation as having no logic. Shocked I tell you!

Fal ,

I am perfectly cool with animated depictions of child sexual exploitation being in the same category as regular child exploitation regardless

Not an observation. You’re saying you don’t care about actual victims, children actually being abused. Real, live victims. That’s not worse than someone drawing some pictures you don’t like?

zbyte64 ,

Wow, you got me. That’s totally what I’m saying, which is why you had to fabricate a quote that represents the argument you want to have.

Fal ,

Fabricate a quote? I literally quoted the start of the thread.

zbyte64 , (edited )

Misattributed, then? My only position was that your rebuttal was not logical while asking for logic, which I thought was funny and “absurd”, as you put it.

Wilibus ,

First of all, I didn’t say any of the things you’re inferring from the quote you posted.

But yes, I think sexually exploitative imagery of children is just as vile and disgusting as behavior that directly harms children and very indicative of someone who may attempt to harm a child in the future.

atzanteol ,

Would it? How do they prove the age of an AI generated image?

DogMuffins ,

By… checking the age of the person depicted in the image?

AVincentInSpace ,

…who by definition is AI generated and does not, in fact, exist?

Basil ,

What? But they literally do exist, and they’re hurting from it. Did you even read the post?

Nyanix ,

While you’re correct, many of these generators retain the source image and only generate the masked sections, so the person in the image is still themselves with effectively photoshopped nudity, which would still qualify as child pornography. That is an interesting point you make, though.

DogMuffins ,

Of course they exist. If the AI generated image “depicts” a person, a victim in this case, that person “by definition” exists.

Your argument evaporates when you consider that all digital images are interpreted and encoded by complex mathematical algorithms. All digital images are “fake” by that definition, and therefore the people depicted do not exist. Try explaining that to your 9-year-old daughter.

AVincentInSpace ,

Go to this website and tell me who is depicted in the photo, please?

DogMuffins ,

Are you daft? I assume that the person depicted in the photo at thispersondoesnotexist.com does not exist.

AVincentInSpace ,

That image was generated by AI.

So do people in images that are purely AI generated exist, or not?

DogMuffins ,

This is so tedious. If you have a point, then make it. Stop asking inane questions.

So do people in images that are purely AI generated exist, or not?

This question is based on a false premise, as though the technology used to create an image is relevant to what it depicts.

  • If Michelangelo paints the likeness of a model, does the model in the image exist?
  • If a child draws a stick-figure likeness of their dad, does the dad in the image exist?
  • If you take a photo on your phone, and it uses complex mathematical algorithms to compress and later render the image, do the people in those images exist?
  • If you run a filter over that image on your phone, does that person still exist?

Of course, in all cases, for all intents and purposes the depicted person exists. You can argue that a painting is just an arrangement of pigments on canvas, and you would be correct, but to everyone else it’s still a picture of a specific person.

If you use a computer to generate an image that “looks like” a schoolmate doing some thing, then the argument that the person in the picture does not exist because the image was generated by AI is moot, because for all intents and purposes it’s a “picture of” that schoolmate doing that thing.

AVincentInSpace ,

Suppose that instead of generating photos of faces, thispersondoesnotexist.com generated porn. Who would be harmed then?

DogMuffins ,

For the love of everything holy. This is not how grown-ups discuss things. Make your point and stop asking dumb questions.

As you well know, no one is directly harmed by the simple act of someone viewing AI-generated porn which does not depict a real person.

That said, the law in my jurisdiction does not discern between real and not. If it’s an image (even hentai) depicting sexual abuse of a minor, then it’s CSAM. How do you know if the depicted person is a minor? That’s a question for a jury. I’m sure there are arguments against this position, but its merits are obvious: you don’t need to quibble over whether an image depicts a real person or not; if it’s CSAM, then it’s illegal.

AVincentInSpace ,

Then why did you say that there was no difference realism-wise between an image generated by AI and an image generated by a camera?

DogMuffins ,

Because it’s true.

AVincentInSpace ,

How can that be true if it is also true that people in AI images are fake and no one is harmed by them?

DogMuffins ,

People in AI images of real people are real.

People in AI images of fake people are fake.

TacoNissan ,

You fucking dunce. You did not read the article. People have been taking real pictures of real children and using AI to remove their clothes. The real person is still in the image.

drislands ,

The article is about real children being used as the basis for AI-generated porn. This isn’t about entirely fabricated images.

Lemming6969 ,

Ship of Theseus

DogMuffins ,

IDK why this dumb thought experiment makes me so grumpy every time someone invokes it, but you’re going to have to explain how it’s relevant here.

Lemming6969 , (edited )

How many pieces do you have to change before it’s not closely enough related? If every piece is modified, is it the same base image? If it’s not the same image, when does it cease to represent the original and must be reassessed? If it’s no longer the image of a real person, given the extreme variety in both real and imagined people, how can an AI image ever be illegal? If you morph between an image of a horse and an illegal image, at what exact point does it become illegal? What about a person and an illegal image? What about an ai generated borderline image and an illegal image? At some point, a legal image changes into an illegal image, and that point is nearly impossible to define. Likewise, the transition between a real and imagined person is the same, or the likeness between two similar looking real, but different, or imagined people.

DogMuffins ,

that point is nearly impossible to define

As with any law, there will undoubtedly be cases in which it is difficult to discern whether or not a law has been broken, but courts decide on innocence or guilt in such cases every day. A jury would be asked to decide whether a reasonable third party is likely to conclude, on the balance of probabilities, that the image depicts a person who is under 18.

Whether or not the depicted person is real or imagined is not relevant in many / most jurisdictions.

atzanteol ,

You mean the real person being depicted? So this wouldn’t apply to fake people?

DogMuffins ,

Which fake people?

atzanteol ,

If the porn were of non-real people.

You can’t ask questions on lemmy - people assume you have lots of subtext that isn’t there.

Wilibus ,

Just ask ChatGPT to cut them in half and count the rings.

rchive ,

If you make a picture today of someone based on how they looked 10 years ago, we say it’s depicting that person as the age they were 10 years ago. How is what age they are today relevant?

GeneralVincent ,

I’m unsure of the point you’re trying to make?

It’s relevant in this case because the age they are today is underage. A picture of them 10 years ago is underage. And a picture of anyone made by AI to deepfake them nude is unethical regardless of age, but it’s especially concerning when the goal is to depict underage girls as nude. The age thing specifically could get a little complicated in certain situations, I guess, but the intent is obvious most of the time.

rchive ,

I’m obviously not advocating or defending any particular behavior.

Legally speaking, why is what age they are today relevant rather than the age they are depicted as in the picture? Like, imagine we have a picture 20 years from now of someone at age 37. It’s legally fine until it’s revealed it was generated in 2023 when the person in question was 17? If the exact same picture was generated a year later it’s fine again?

DogMuffins ,

Basically, yes.

Is the person underage at the time the image was generated? And is the image sexual in nature?

If yes, then generating or possessing such an image ought to be a crime.

Fal ,

Won’t somebody think of the make believe computer generated cartoon children?!

boatsnhos931 ,

Someone has to pay… this image is only 2 hours old…TWO HOURS OLD, YOU ANIMALS

yamanii ,

It was done by another underage boy, how would the law act in this case?

mxcory ,

Yes, underage people can be charged.

www.tribtoday.com/…/teen-sexting-is-child-porn/

Edit: Of course, this concerns actual pictures, not generated ones.

boatsnhos931 ,

So basically the link you posted has nothing to do with OP LOL

calypsopub ,

So as a grown woman, I’m not getting why teenage girls should give any of this oxygen. Some idiot takes my head and pastes it on porn. So what? That’s more embarrassing for HIM than for me. How pathetic that these incels are so unable to have a relationship with an actual girl. Whatever, dudes. Any boy who does this should be laughed off campus. Girls need to take their power and use it collectively to shame and humiliate these guys.

I do think anyone who spreads these images should be prosecuted as a child pornographer and listed as a sex offender. Make an example out of a few and the rest won’t dare to share it outside their sick incels club.

WoahWoah ,

That’s fine and well. Except they are videos, and it is very difficult to prove they aren’t you. And the internet is forever.

This isn’t like high school when you went to high school.

Agreed on your last paragraph.

MargotRobbie ,

Then nude leak scandals will quickly become a thing of the past, because every nude video or picture can now be assumed to be AI-generated, always fake until proven otherwise.

That’s the silver lining of this entire ordeal.

Again, this is a content-distribution problem more than an AI problem; the liability should be on those who willingly host this deepfake content rather than on the AI image generators.

finestnothing ,

That would be great in a perfect world, but unfortunately public perception is significantly more important than facts when it comes to stuff like this. People accused of heinous crimes can and do lose friends and jobs, and have their lives ruined, even if they prove that they are completely innocent.

Plus, something I’ve already seen happen is that someone says a nude is fake and is then told they have to prove that it’s fake to get people to believe them… which is very hard without sharing an actual nude that shows something unique about their body.

derpgon ,

The rest of the human body has more unique traits than the nude parts: freckles, birthmarks, scars, tattoos. Those are traits that are not possible to replicate unless the person specifically knows about them.

Now that I think about it, we all probably need a tattoo. That should clear anyone instantly.

Llewellyn ,

You can ask an AI to draw a blurred version of the tattoo. Or to mask the tattooed area with, I don’t know, a piece of clothing or something.

WoahWoah ,

Yes I’m sure a hiring manager is going to involve themselves that deeply in the pornographic video your face pops up in.

HR probably wouldn’t even allow a conversation about it. That person just never gets called back.

And then the worse part is the jobs that DO hire you. Now you have to question why they are hiring you. Did they not see the fake porn video? Or did they see it?

The entire thing is damaging and ugly.

derpgon ,

If you are already an employee, then they will want to keep you and look into the matter.

If you are not an employee yet - is HR really looking up porn of everyone?

WoahWoah ,

Yes, HR Googles your name. 🙄

derpgon ,

I am pretty sure people who do porn use pseudonyms anyway. If HR thinks the people use their real name and spread their porn on the internet, they are dumb for not realizing it’s fake. HR being HR as always.

zbyte64 ,

Seems we’re partially applying market dynamics of supply and demand. Simply assuming the “surplus” supply of deep fakes will decrease their value ignores the fact that the demand is still there. Instead what we get is new value opportunities in the arms race of validating and distributing deep fakes.

calypsopub ,

Why should they have to expend any energy proving it’s not them?

toonicycle ,

I mean, they obviously shouldn’t have to, but if nude photos of you got leaked in your community, people would start judging you negatively, especially if you’re a young woman. Also, in these cases where they aren’t adults, it would be considered CP.

foo ,

What if the deepfake was so real it was hard to tell? What if the deepfake was highly invasive and humiliating? Can you see the problem?

calypsopub ,

Not really. The more extreme it is, the more easily people will believe you when you say it’s a deep fake. Everyone who matters (friends and family) will know it’s not you. The more this sort of thing becomes commonplace, the more people will simply shake their heads and move on.

mrsgreenpotato ,

People kill themselves over much more mundane things than this. I think you overestimate teenagers unfortunately, not everyone can handle it as lightly as you would. Telling people to just “shake it off” will simply not work most of the time.

calypsopub ,

Sadly, you have a point. Somebody with good support at home and a circle of friends can weather this sort of thing, but others may feel helpless or hopeless. There needs to be an effective place to turn to for kids who are being bullied. Unfortunately that doesn’t seem to exist.

ParsnipWitch ,

That depends on a how a specific person is seen and treated by their surroundings.

A teenage girl who is already a victim of harassment or bullying, for example, will be treated very differently when humiliating images of her surface in her peer group, compared with someone who is well liked in school.

People who do this have to be judged much more harshly. This can’t become the next item on the list of common sexual harassment experiences every girl and woman “has to” go through.

DogMuffins ,

I think the point this comment is trying to make is that because it has become so easy to make these images, their existence is not very meaningful. All deepfakes are very realistic; you can’t tell fakes from originals.

Like as an adult, if I saw an “offensive” image of a co-worker, my first assumption would be that it’s probably AI generated, my first thought would be “which asshole made this image” rather than “I can’t believe my co-worker did [whatever thing]”.

atzanteol ,

You may not be representative of teenage girls.

Basil ,

So as a grown woman

Right? Literally not what’s being discussed. Obviously they’ll be more mature and reasonable about it. Teenagers won’t be

calypsopub ,

I wasn’t very representative even when I WAS a teenager. I was bullied quite a bit, though.

atzanteol ,

And can you imagine those bullies creating realistic porn of you and sharing it with everyone at school? You may have been strong enough to endure that - but it’s pretty unrealistic to expect everyone to be able to do so. And it’s not a moral failing if somebody is unable to. This is the sort of thing that leads to suicides.

ILikeBoobies ,

So they do it and share it around to slut shame you

You try to find a job and they find porn of you

It’s a lot worse than you’re making it out to be when it’s not you that gets to make that decision

DogMuffins ,

IMO the days of searching for porn of prospective employees are over. With the advent of AI generated porn, what would be the point of that?

Couldbealeotard ,

There are so many recent articles linked on Lemmy about people losing their jobs over making porn. People are losing jobs over porn now more than ever.

DogMuffins ,

Seriously? Maybe we don’t read the same stuff but that’s not something I’ve noticed.

I just can’t imagine how that’s possible. I wish someone would fire me over porn so I could sue them for unfair dismissal as well as defamation and/or libel.

ILikeBoobies ,

People are quicker to judge than they are to reason

ExLisper ,

I don’t think the problem is that the girls are ashamed of the fake porn. The problem is not even that other kids will believe it. The problem is that kids will use it to mock, bully, and ostracise them. It’s not being shared as “OMG, you’re so hot, I made a fake sex tape with you, marry me”. It’s being shared as “you’re a slut that does porn, everyone thinks you’re a bitch, go kill yourself”.

calypsopub ,

I see your point. In that way it’s just like any other bullying, though more personal. Unfortunately, society hasn’t done a good job of coming up with workable solutions for bullying. In this case, dragging the culprit behind the bleachers and letting the girls take turns kicking him in the nuts would be my go-to, but you can’t do that sort of thing anymore.

zbyte64 ,

Your response highlights how victims need the power of community to respond appropriately, and how society excuses some forms of violence (involuntary porn) and not others (women getting retribution).

Treczoks ,

The problem is how to actually prevent this. What could one do? Make AI systems illegal? Make graphics tools illegal? Make the Internet illegal? Make computers illegal?

Jimmyeatsausage ,

Make “producing real or simulated CSAM” illegal?

CAVOK , (edited )

It is where I’m at. Draw Lisa Simpson nude and you get a visit from the law. Dunno what the punishment is though. A fine? Jail? Can’t say.

Edit: Apparently I was wrong, it has to be a realistic drawing. See here: 2010/0064/COD doc.nr 10335/1/10 REV 1

SaakoPaahtaa ,

Lisa Simpson is in shambles reading this

Rodeo ,

What about making depictions of other crimes? Should depictions of theft be illegal? Depictions of murder?

Why should depictions of one crime be made illegal, but depictions of other heinous crimes remain legal?

Jimmyeatsausage ,

Because a picture of someone robbing my house doesn’t revictimize me. Even if it’s simulated, every time the victim runs into some rando who recognizes them, or every time a potential employer runs a background/social media check, it impacts the victim again.

Fal ,

A picture of a cartoon child having sex doesn’t victimize you either, the same way a drawing of a robbery doesn’t victimize you

Ataraxia ,

You mean being raped. What it does is let pedos feel like it’s OK to be pedos.

Fal ,

Lol just like violent video games makes people think it’s ok to be violent in real life?

Rodeo ,

Who is being victimized with a drawing of Lisa Simpson?

Treczoks ,

Isn’t it already? Has it provided any sort of protection? Many things in this world are illegal, and nobody cares.

Jimmyeatsausage ,

Yes, I would argue that if CSAM was legal, there would be more of it…meaning it being illegal provides a level of protection.

yamanii ,

I wonder why you are being downvoted; something being illegal puts fear in most people not to do it.

31337 ,

I’ve been wondering about this lately, but I’m not sure how much of an effect this has. There are millions of people in prison, and many of those will go on to offend again. Making things illegal can be seen as agreement to a social contract (in a democracy), can drive the activity underground (probably a good thing in many cases), and can prevent businesses (legal entities) from engaging in the activity; but I’m not sure how well it works at the individual level of deterrence. Like, if there were no laws, I can’t really think of a law I would break that I wouldn’t already break regardless. I guess I’d just be more open about it.

Though people who cause harm to others should be removed from society and, ideally, quickly rehabilitated and released back into society as productive members.

afraid_of_zombies ,

Require consent to take a person’s picture and hold them liable for whatever comes from them putting it on a computer.

Treczoks ,

You already need consent to take a person’s picture. Did it help in this case? I don’t think so.

afraid_of_zombies ,

Really? Please show me the signed and notarized letter with the girl’s name on it that says she agrees to have her image used for AI porn. Also, since she is a minor, one from her legal guardians.

CommanderCloon ,

How would you possibly enforce that, or prevent people from just copying publicly available pictures for nefarious usage?

afraid_of_zombies ,

It would have to be enforced after getting caught. As an add on charge. Like if an area has a rule against picking locks to commit a crime. You can never be charged with it alone but you can add that on to existing charges.

Bbbbbbbbbbb ,

Very rarely do you need consent to take people’s pictures.

JonEFive ,

*in the US.

In the US, the thought is that if you are in a public place, you have no presumption of privacy. If you’re walking down the street, or shopping in a grocery store or whatever else, anyone can snap a picture of you.

Other countries have different values and laws such that you may need a person’s permission to photograph them even if they are in a public place.

afraid_of_zombies ,

That thought is a pile of bullcrap. If you really think you have zero presumption of privacy, then I have the right to follow right behind you with a sign that says “idiot ahead”. Laws like this are written for the drug war and for big media, not for us.

JonEFive ,

Not saying I agree with it, that’s just the way the laws are written.

A good example of how crappy this law works out is paparazzi. They harass celebrities just to get any halfway decent photo. Then they can sell the photo, the celebrity has no say in the matter. And to make things even worse, if the celebrity happens to use the photo of themselves in any way, the photographer can demand payment because they own the copyright.

afraid_of_zombies ,

And this is exactly what I was talking about. We need rules that say you own your own image.

JonEFive ,

That much I can agree with. If someone takes a picture of me, I should have some say in how that image is used, even if the default assumption is that a person in public is plainly visible to everyone including photographers.

But there’s a lot of nuance here. Maybe a celebrity, or any person really, doesn’t want an unflattering image used. Fair enough I suppose, but to what extent is that actually enforceable?

Or maybe the subject wants to use the image of themselves for their own purposes. Does the photographer deserve compensation for their role in creating the image?

What about unflattering images of politicians or government employees? What about criminals? There’s a line to be walked here as well. We already have this sort of concept in slander laws. Public figures have a higher bar to prove damages resulting from statements that might otherwise be considered slanderous or libelous. There are also free speech and freedom of the press issues associated with government entities.

Yes, you should have a right to decide how your image is used, and yes, you should probably have some shared ownership of images of yourself unless you agree otherwise. But the reality isn’t so clear cut.

Admittedly, I haven’t looked into how other parts of the world that don’t default to lack of privacy in public handle this. Some of these questions must have already been hashed out.

Treczoks ,

Sorry, I forgot that the US is decades behind the rest of the world in privacy laws.

Well, maybe you could start with this aspect.

jimbo ,

That’s a whole fucking can of worms we don’t need to open. Just make faking porn a crime similar to publishing revenge porn.

afraid_of_zombies ,

Nah. Use my image and pay me what I want. If I can’t make a Mickey Mouse movie, they shouldn’t be able to make a porn starring me. Does a corporation have more rights to an image than I have to my own image?

jimbo ,

That really depends on what you consider “using my image”. Are you going to demand that people pay you because you were wandering around in the background of their family photo or their YouTube video? Will you ask to be compensated when people post group photos that include you on their social media? Does mom owe you money for all those pictures she took of you as a kid?

afraid_of_zombies ,

If I can be identified and it is on a computer attached to the Internet then pay me.

ParsnipWitch ,

By dishing out punishment that really hurts.

Llewellyn ,

Severity of punishment works poorly. Inevitability, on the other hand…

ParsnipWitch ,

I think in this case a harsher punishment would send the appropriate signal that this isn’t just a little joke or a small misdemeanor.

There are still way too many people who believe sexual harassment etc. aren’t that huge of a deal. And I believe the fact that perpetrators so easily get away with it plays into this.

(I am not sure how it is in the US, in my country the consequence of crimes against bodily autonomy are laughable.)

yetAnotherUser ,

Average American be like

Wilibus ,

Make anime illegal

renrenPDX ,

This is treading on some dangerous waters. Kids need to realize this is way too close to basically creating underage pornography/trafficking.

gandalf_der_12te ,

Also, while we’re at it, we should judge playing shooter games in a similar way to actual murder. /s

ParsnipWitch ,

I would rather have them realize that other people are to be treated with respect.

Modern_medicine_isnt ,

In the end you can’t stop it any more than you can stop teen boys from wanking. Eventually there will just be fake nudes of everyone, so they will have no meaning. It sucks, but it is how it is. Maybe people should get out in front of it by generating their own deepfakes of themselves, but embellish them some so they have an obvious fakeness, and age them up to legal age or something.

lolcatnip ,

Does it suck? A future where people have gotten over feeling ashamed of having bodies sounds pretty cool.

calypsopub ,

Exactly.

ParsnipWitch ,

Not if it comes with normalising the behaviours these boys are showing.

lolcatnip ,

If nudity wasn’t a big deal, it wouldn’t even occur to them to harass girls with fake nudes, and nobody would care if they tried.

ParsnipWitch ,

They could still do it for self-gratification. And the problem in that is objectifying other people.

Regardless of whether or not they would still do it when nudity was something humans didn’t have emotions over, it would still be wrongdoing against another person. That’s the problem that has to be tackled.

I don’t think it’s less realistic than removing emotions about nudity in people.

lolcatnip ,

I’m saying it would be like distributing photos of their hands. Just not a big deal to anyone.

And there are certainly examples of cultures where nudity isn’t considered a big deal at all, so it’s not like I’m suggesting something farfetched or contrary to human nature. The ancient Greeks for one example, or Northern Europeans any time they go to a bath house or sauna. In ancient Egypt children under 6 didn’t wear clothes at all in warm weather. I recall seeing a documentary as a kid about an Amazon tribe where nobody wore clothes.

SnotFlickerman ,

Maybe it is just me, but it’s why I think this is a bigger issue than just Hollywood.

The rights to famous people’s “images” are bought and sold all the time.

I would argue that the entire concept should be made illegal. Others can only use your image with your explicit permission and your image cannot be “owned” by anyone but yourself.

The fact that making a law like this isn’t a priority means this will get worse because we already have a society and laws that don’t respect our rights to control of our own image.

A law like this would also remove all the questions about youth and sex and instead make it a case of misuse of someone else’s image. In this case it could even be considered defamation for altering the image to make it seem like it was real. They defamed her by making it seem like she took nude photos of herself to spread around.

Dark_Arc ,

There are genuine reasons not to give people sole authority over their image though. “Oh that’s a picture of me genuinely doing something bad, you can’t publish that!”

Like, we still need to be able to have a public conversation about (especially political) public figures and their actions as photographed

SnotFlickerman ,

Yeah I’m not stipulating a law where you can’t be held accountable for actions. Any actions you take as an individual are things you do that impact your image, of which you are in control. People using photographic evidence to prove you have done them is not a misuse of your image.

Making fake images whole cloth is.

The question of whether this technology will make such evidence untrustworthy is another conversation that sadly I don’t have enough time for right this moment.

Zachariah ,

Seems like a typical copyright issue. The copyright owner has a monopoly on the intellectual property, but there are (for genuine reasons) fair-use exceptions (journalism, satire, academic use, backup, etc.).

lolcatnip ,

Reminder that the stated reason for copyrights to exist at all, per the US Constitution, is “To promote the progress of science and useful arts, by securing for limited times to authors and inventors the exclusive right to their respective writings and discoveries.”

Anything that occurs naturally falls outside the original rationale. We’ve experienced a huge expansion of the concept of intellectual property since then, but as far as I can tell there has never been a consensus on what purpose intellectual property rights are supposed to serve beyond the original conception.

afraid_of_zombies ,

Makes sense. If I do something worth taking a picture of that means I have zero rights to it since that is “natural”, but the person who took the photo has all the rights to it.

Tell me this crap wasn’t written for and by the worst garbage publishers out there.

afraid_of_zombies ,

If you have a picture of someone doing something bad, you really should be talking to law enforcement, not Faceboot. If it isn’t so bad that it’s criminal, I wonder why it is your concern?

Dark_Arc ,

It’s not just “taking it to law enforcement”, it’s a freedom of the press issue.

afraid_of_zombies ,

Can you address what I brought up?

Dark_Arc ,

I don’t have time for this…

CommanderCloon ,

Public outrage more often drives justice for public figures than what law enforcement does on its own. The level of control you’re asking for would simply nuke the press.

afraid_of_zombies ,

No. Public figures are not private figures.

SuddenDownpour ,

My experience with the police is that most of them will systematically ignore reports up until the issue has already grown out of control. Outside of that, there are things that are unethical but not illegal, which you might want to denounce publicly anyway.

afraid_of_zombies ,

OK, so your plan, if you see someone do something illegal, is to depend on Faceboot?

SuddenDownpour ,

If you complain that people don’t address your point, and then someone addresses it in good faith, strawmanning them afterwards only makes you look like an asshole and encourages everyone else to not address you at all.

afraid_of_zombies ,

Not seeing the good faith.

nonailsleft ,

Wait you thought this was a problem for Hollywood?

MargotRobbie ,

It is for actors, since you would be handing over the right to your likeness to studios for AI to reproduce for eternity.

It was one of the main issues for the SAG-AFTRA strike.

APassenger ,

Or their identity?

nonailsleft ,

Well at least they’re getting paid for it. But someone could copy your likeness for free

MargotRobbie ,

They could be impersonating me as we speak!

afraid_of_zombies ,

Many years ago I mentioned this on reddit. Complaining how photographers can just take pictures of you or your property and do what they want with it. Of course the group mind attacked me.

Problem just seems to get worse by the year.

lolcatnip ,

That’s because your proposal would make photography de facto illegal, because getting the rights to everyone and everything that appears in a photograph would be virtually impossible. Hell, most other kinds of visual art would be essentially illegal as well. There would be hardly anything but abstract art.

afraid_of_zombies ,

Bullshit.

Taking a photo of yourself or your family at a public landmark? Legal.

Taking a photo of yourself or your family at a celebration? Legal.

Zooming in on the local Catholic school to get a shot of some 12 year olds and putting it on the internet? Illegal.

We need to stop pretending that, just because photography is a thing, there is zero expectation of privacy whenever someone is able to violate it. This is the crap we see with police using infrared cameras to get around the need for warrants, and the crap we see with people using drones to stalk. You have the right to be left the fuck alone, and if someone wants to creep on teens, well, sorry, you are out of luck.

lolcatnip ,

There are literally already cases where taking a photo of yourself in front of a public landmark is illegal because of copyright issues.

afraid_of_zombies ,

And?

Zetta ,

That sounds pretty dystopian to me. Wouldn’t that make filming in public basically illegal?

ParsnipWitch ,

In Germany it is illegal to take photos or videos of people who are identifiable (faces visible or close-ups) without asking for permission first, with an exception for public events, as long as you do not focus on individuals. It doesn’t feel dystopian at all, to be honest. I’d rather have it that way than end up on someone’s stupid vlog or whatever.

CleoTheWizard ,

The tools used to make these images can largely be ignored, as can the vast majority of what AI creates of people. Fake nudes and photos have been possible for a long time now. The biggest way we deal with them is to go after large distributors of that content.

When it comes to younger people, the penalty should be pretty heavy for doing this. But it’s the same as distributing real images of people. Photos that you don’t own. I don’t see how this is any different or how we treat it any differently than that.

I agree with your defamation point. People in general and even young people should be able to go after bullies or these image distributors for damages.

I think this is a giant mess that is going to upturn a lot of what we think about society but the answer isn’t to ban the tools or to make it illegal to use the tools however you want. The solution is the same as the ones we’ve created, just with more sensitivity.
