
erranto ,

Maybe we should shift our thinking to assume that everything posted on the internet is fake. That’s the only solution to counter the proliferation of AI. The genie is out of the bottle and can’t be forced back in.

Only believe information from official sources that is cryptographically signed.

stevedidwhat_infosec ,

This is a really, really oversimplified solution, and I’m gonna argue it’s not at all effective. The cat is out of the bag, just like you said. You can’t undo that.

Nothing on the internet is real, okay let’s start from there.

So now how do we relay scientific findings for example? Rely on the media? Pray to some god and hope our reasoning and interpretations are correct enough?

Should we trust video recordings? Pictures?

Should we trust word of mouth? Each other? Ourselves?

erranto ,

That’s where things are headed anyway. AI is only getting better, and very soon everyone will be suing everyone else for libel and stuff.

OpenAI has recognized that it doesn’t have the means to distinguish between human- and AI-generated essays, and soon generated pics and videos will be just as hard to verify. I say we should only trust what we have the means to verify, or go back to old tech, like public broadcasting channels for official news and announcements.

stevedidwhat_infosec ,

(Didn’t realize we had simultaneous convos going lmao hi again)

Can you clarify a bit on this one? What’s headed where specifically? What’s your claim?

Again though, like you said, the cat is out of the bag. You can slap all the laws you want on AI. The technology and the concept have already been created and fleshed out. You can’t destroy information and concepts. So to ban it, or somehow convince everyone to abandon it, is doomed to fail immediately and intrinsically. The powerful people in the world (as in my other comment) will still be able to utilize this very powerful technology, which puts everyone else at a significant and arguably irreparable disadvantage.

I think AI is no different from any other emerging technology, look at nukes! With this great power comes great responsibility that we need to flesh out and figure out.

For example, AI-generated image or not, deepfake nudes are still sexual harassment (not physical, of course, but no different from revenge porn). They take away someone’s autonomy, and so we treat the offenders no differently. Yes, the threat is there, but we acknowledge it and we have a way to punish for it.

Less trying to make traps to send the cat back into the bag, and more hunters and techniques to catch all the kittens that the cat will create. We are very intelligent beings, all things considered; we can handle this stuff. We just have to learn from our past and make meaningful changes the best we can, and we will continue to thrive as we always have. (Assuming we can fix climate change, or prepare accordingly and adequately.)

Fraylor ,

Thanks to deepfakes basically none of it is trustworthy anymore.

Vqhm ,

Eh, most dashcams have metadata with GPS, timestamps, etc.

GPS locations, time, that’s just math tho. But we could put a private key on every camera and digitally sign every photo/video to prove where it came from.

If it gets bad enough you’ll just carry around a film camera and snap a photo of an accident to prove your dashcam was “real.”

Official news sources, such as the AP, aren’t going to just start faking shit. So look towards media that has a reputation. Yellow journalism has always been around and always will be. But one advancement in faking whatever does not mean countermeasures stop advancing.
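The private-key idea above can be sketched in a few lines. This is a simplified stand-in: it uses an HMAC with a hypothetical per-device secret instead of the asymmetric signatures (e.g. Ed25519) a real provenance scheme would use, and the key and frame bytes are made up for illustration.

```python
import hashlib
import hmac

# Hypothetical per-device secret. A real camera would keep a private key
# in secure hardware and publish a matching public key for verification.
DEVICE_KEY = b"example-dashcam-secret"

def sign_frame(frame: bytes, key: bytes) -> str:
    """Bind this exact sequence of bytes to the device key."""
    return hmac.new(key, frame, hashlib.sha256).hexdigest()

def verify_frame(frame: bytes, tag: str, key: bytes) -> bool:
    """Recompute the tag; changing a single byte changes the result."""
    return hmac.compare_digest(sign_frame(frame, key), tag)

frame = b"raw image bytes from the sensor"
tag = sign_frame(frame, DEVICE_KEY)

print(verify_frame(frame, tag, DEVICE_KEY))         # True: untouched frame
print(verify_frame(frame + b"!", tag, DEVICE_KEY))  # False: edited frame
```

The same check can cover the GPS and timestamp metadata by signing it together with the pixels, so the claimed time and place can’t be swapped out after the fact.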

atx_aquarian ,

But we could put a private key on every camera and digitally sign every photo/video to prove where it came from.

Unfortunately, the movie Freeze Frame is becoming increasingly relevant.

stevedidwhat_infosec ,

I’m going to push back mainly on your last point: “News agencies aren’t going to just start faking shit.”

Why not? What happened to CNN? What’s to stop anyone from buying out a news agency and steering the ship elsewhere? Obviously they wouldn’t want to be obvious about it, but with a little change here and there you’d be surprised what you can convince people of.

As a society we are incredibly susceptible to social influence and it’s only going to get easier. I can’t rely on the past to predict the future accurately when the medium is so easily changeable.

xc2215x ,

That is very messed up.

MonkderZweite , (edited )

This violates the right to your own image. You are not allowed to upload images of a classmate to an AI cloud service without asking, nor to pass the generated images around.

MartianSands ,

Is that an actual legal right? If you’ve described it accurately, then Facebook and Instagram would be completely illegal

MonkderZweite , (edited )

No, a human right. And yeah, they mostly are. But it’s not Facebook offending here but each of the teens, so nobody can really enforce it. Same as with phone numbers, except that those are actually protected by law in most countries.

rentar42 ,

It depends on your location, different countries have very different laws.

For example, in most countries it's perfectly acceptable to have someone in a picture that you're taking in public (for example, you're taking a picture of a building and someone happens to walk by). A notable exception to this is France, where apparently the right to one's own image is quite strong, which technically makes most pictures of the Eiffel Tower illegal (as long as any one person is identifiable in them).

Taking (and distributing) a picture of a specific person who's just doing random stuff in public is less uniform and varies more. There's often some protection to basically say "no, you can't make fun of some random person for having the wrong t-shirt; they have a right to privacy". A notable exception to that is usually "public figures" (which mostly means people in political, religious or commercial leadership positions): they mostly just have to accept being pictured wherever.

Protection for pictures taken in a private setting is usually the strongest (so yes, if you post a picture of your 3 best friends at a small party in your home, you might have to ask them for permission!).

How all of this applies to pictures that "aren't real" but look disturbingly so is probably going to be fought over in court for a good while.

Kalkaline ,

deleted_by_author

MonkderZweite ,

Kalkaline ,

deleted_by_author

MonkderZweite ,

What worldwide, it’s Spain.

Kalkaline ,

deleted_by_author

qdJzXuisAndVQb2 ,

Dude, dial back the snarkiness.

    CrayonRosary , (edited )

    AI generates novel images, though. They are merely trained to produce your likeness. None of the pixels are from any source images.

    In this case, I’m mistaken. They used a clothing remover app on normal photos and did not train an AI.

    stevedidwhat_infosec ,

Jesus Christ, that’s not even close to AI. They literally stitched shit together like Photoshop.

By the way, where do you see that Clothoff (the app mentioned in the article) doesn’t use trained AI models? I’m refraining from visiting their site to check myself, as I don’t really want to give them that traffic, so I figured I’d ask you directly instead, since I assume you already went there to verify.

    CrayonRosary ,

I didn’t say the app doesn’t use trained models. I said the students didn’t themselves train an embedding or LoRA against the other students’ faces in order to generate entirely new pics.

    stevedidwhat_infosec ,

    Ah, good point I missed that detail. Thanks for clarifying

    CarbonatedPastaSauce ,

    Inevitable. Our technology outpaced our evolution a long time ago. We’re spiraling.

    rentar42 ,

    I think that happened at least 10k years ago ... it's just that the spiral is getting faster and faster ...

    PsychedSy ,

    That’s always the case. Evolution, both biological and societal, happens after the environment changes.

    tallwookie ,

    setting a new low for a scam.

    flubba86 , (edited )

    Kinda weird that it details how badly this affected the girls’ mothers. The girls don’t get a say, but won’t someone please think of the mothers?!

    NotAPenguin ,

    It is pretty weird, like "The reaction was one of massive support for all the affected mothers"..?

    ParsnipWitch ,

How do the girls not get a say? They asked their mothers for help, and the mothers organised to find others who are affected.

    EnderMB ,

    I imagine it leans into the idea of some people being “too young” to form a grown-up opinion.

    Really fucking weird, given the context is around their likeness being used for the purpose of porn.

    JackGreenEarth ,

You can’t stop them being made; they’re just the same deepfakes people have been making before. It’s important to note that they’re not photos of people, they’re guesses made by an algorithm.

    strider ,

    While you’re completely right, that’s hardly a consolation for those affected. The damage is done, even if it’s not actually real, because it will be convincing enough for at least some.

    PunnyName ,

    While I understand your point, what consolation can be provided?

    SaakoPaahtaa ,

    Saying man that sucks doe, and nothing else really

    Voyager ,

The EU is assessing the state of deepfakes and is planning to create a legal framework to keep them under control; a “combination of measures will likely be necessary to limit the risks of deepfakes, while harnessing their potential.”

    ParsnipWitch ,

I think the people who made the pictures have to suffer consequences. Otherwise this sends the message that it’s fair game to behave that way.

    InternetTubes ,

    If governments can go after child porn, then they can go after the websites generating it and people distributing it.

I’m sort of sick of services that can generate whatever bullshit people ask of them with zero oversight and control, especially when it involves deepfakes. When deepfakes become real enough, societies will just become a race towards distributing the deepfakes that serve whatever passes for the prejudices of the times, and people will eat it up.

It already happens in societies without deepfakes; even people who disagree with the mainstream still shape their perception of things around the prejudices present in their society’s media, prejudices they don’t really become aware of until they try living outside of it for a while.

    Deepfakes will become like steroids for creating bubbles of ideology once it is able to cross the uncanny valley territory.

    Fal ,

    It’s almost like people are the problem and will use any excuse to do what they want. So yeah, let’s ban technology, even though as you said people find ways to be shitty anyway, because after all, won’t somebody think of the children?

    rentar42 ,

    Yeah, people are the root cause of almost all problems that people have to deal with.

    And we've been dealing with them for a long time and one way to deal with them is to develop norms and rules as a society (which at some point we decided to enshrine into laws).

    So no, it's not that we need to "ban technology". But a good first step is to say "hey, if you generate porn of someone in your class and distribute it to others in your school then that's a pretty shitty thing to do". Another good step is probably to try to get some consensus on that statement. And if enough people agree with this, then we can start thinking of putting some actual rules behind it.

    Societies have been able to handle these kinds of nuances for many different topics for a very long time. So stop pretending that it's all just "oh, you all just want to ban the new stuff". It might take a while to get it all worked out and some steps along the way will almost certainly be missteps, but it's not like this always ends badly.

    InternetTubes ,

    DRM and IP laws are basically bans on technology. There would not be any system of law if your logic was taken to the extreme. Things can be done.

    Fal ,

Of course they can be; that wasn’t the point. DRM and IP law are not examples to be held up as things to imitate, though.

    Aux ,

    Let’s ban knives because people stab each other.

    Gutless2615 ,

    Ban photoshop!

    Touching_Grass ,

Before you can operate any AI, you will need a license and to inform the government what you intend to develop with it.

    SwampYankee ,

    British moment.

    Aux ,

    I’m from Britain, I have shit loads of knives.

    InternetTubes , (edited )

Pretty sure some places do ban carrying weapons like swords and machetes in public. Another thing I’m sick of is people who act like there’s no middle ground, precedent, or possibility of nuance, making gross caricatures of things.

    Cethin ,

This stuff can be run locally. It’s not something that can be stopped by just going after some service providing it. That may make it slightly less convenient to access, but if anyone wants it, it’ll be available. Pandora’s box has been opened and it can’t be closed.

    Touching_Grass ,

The goal isn’t to stop deepfakes of random people. It’s to limit AI access for regular people so it can be hoarded by select groups. Using threats against children to stir up the masses is the oldest play in history. The upper crust needs to make laws against how the rest of us use these tools.

    InternetTubes ,

    You can also have your hard drive loaded with child porn locally, and it wouldn’t be any less illegal.

    Cethin ,

Sure, it’s illegal. They can’t do anything about it unless you do something else wrong, though. I wish they could just magically detect where that content is, but they need a search warrant to find it. Talking about stopping this software will lead to nothing; sharing this content (real or generated) is where attention should be focused.

    Rayspekt ,

    Exactly, the technology is out there and will not cease to exist. Maybe we'll digitally sign our photos in the future so that deepfakes can be sorted out by that.
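As a sketch of how that sorting could work: if a content hash of every genuine capture were published by whoever signs the photos, anything outside that record is simply unverifiable. The registry and the byte strings here are made-up placeholders, not any real scheme.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Identical bytes always yield the identical fingerprint;
    # any edit, however small, yields a different one.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical record of fingerprints published for genuine captures.
published = {fingerprint(b"original capture from the camera")}

print(fingerprint(b"original capture from the camera") in published)  # True
print(fingerprint(b"a convincing deepfake") in published)             # False
```

The catch, of course, is that this only proves provenance for images whose source bothers to publish fingerprints; it can’t prove a fake is fake, only that it’s unvouched-for.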

    Redditiscancer789 ,

    Omg it’s NFTs time to shine!!!

    /S

    maegul ,

To push back on your attempt to minimise what’s going on here …

    Yes, they’re not actually photos of the girls. But, nor is a photo of a naked person actually the same as that person standing in front of you naked.

    If being seen naked is unwanted and embarrassing etc, why should a photo of you naked be embarrassing, and, to make my point, what difference would it make if the photo is more or less realistic? An actual photo can be processed or taken under certain lighting or with a certain lens or have been taken some time in the past … all factors that lessen how close it is to the current naked appearance of the subject. How unrealistic can a photo be before it’s no longer embarrassing?

    Psychologically, I’d say it’s pretty obvious that the embarrassment of a naked image is that someone else now has a relatively concrete image in their minds of what the subject looks like naked. It is a way of being seen naked by proxy. A drawn or painted image could probably have the same effect.

    There’s probably some range of realism within which there’s an embarrassing effect, and I’d bet AI is very capable of getting in that range pretty easily these days.

    While the technology is out there now … it doesn’t mean that our behaviours with it are automatically acceptable. Society adapts to the uses and abuses new technology has and it seems pretty obvious that we’re yet to culturally curb the abuses of this technology.

    HorseRabbit ,

    Tldr

    AdlachGyfiawn ,

Is there a reason you didn’t have time to read, but did have time to comment and make yourself look like an asshole?

    n0m4n , (edited )

The faces are not generated, and that is where the damage comes from. It targets the girls for humiliation by implying that they allowed the nudes to be taken of them. Depending upon the location and circumstances, this could get the girls murdered. Think of “honor killings” by fundamentalists. It makes them targets for further sexual abuse, too. Anyone distributing the photos is at fault, as well as the people who made them.

    The problem goes deeper, though. We can never trust a photo as proof of anything, again. Let that sink in, what it means to society.
