
snake_case ,

It’s good that the Pixel 6 came with Magic Eraser then, I guess

dreadedsemi ,

The cat is out. You can’t stop it. It’s open source. I think more fakes out there might desensitize people and make them less inclined to believe anything. Though my experience on Facebook isn’t encouraging.

ayyndrew ,
@ayyndrew@lemmy.world avatar

Yeah, I think people* will come to understand that not all images are real, just like they came to understand that not all headlines are real

*some people

photonic_sorcerer ,
@photonic_sorcerer@lemmy.dbzer0.com avatar

We need something to verify that media is what it claims to be. Will CCTV footage be admissible in court in the future? Maybe we should be looking into NFTs or the like to verify real recordings.

excel ,
@excel@lemmy.megumin.org avatar

This sounds like a job for digital signatures using traditional certificate authority infrastructure, which has been around for decades. Think digitally signed emails or software installers… Seems like NFTs made society collectively forget about every other form of cryptography.

Also keep in mind that you can only ever verify the source, not the truthfulness of the data (e.g. you can prove that the image was created by New York Times and hasn’t been altered since then, but you will never be able to prove that NYT themselves didn’t fake it in some way). Verifying the source is obviously still useful though, so it seems dumb that digital signatures aren’t more ubiquitous.

My guess is that they’ve been resisted for privacy reasons.
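
For a concrete picture of the “verify the source, not the truthfulness” distinction, here is a minimal sketch of publisher-side signing and reader-side verification with Ed25519 from Python’s cryptography package. The key pair, the sign_image/verify_image helpers, and the newsroom framing are illustrative assumptions, not any outlet’s actual workflow.

    # Minimal sketch: a publisher signs image bytes, a reader verifies them.
    # Key distribution, certificates, and metadata are omitted on purpose.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # The publisher generates a key pair once; in practice the public key
    # would be distributed inside a CA-issued certificate.
    private_key = Ed25519PrivateKey.generate()
    public_key = private_key.public_key()

    def sign_image(image_bytes: bytes) -> bytes:
        """Sign the raw image bytes; the signature travels with the file."""
        return private_key.sign(image_bytes)

    def verify_image(image_bytes: bytes, signature: bytes) -> bool:
        """True only if the bytes are unchanged since they were signed."""
        try:
            public_key.verify(signature, image_bytes)
            return True
        except InvalidSignature:
            return False

    image = b"...raw image bytes..."
    sig = sign_image(image)
    print(verify_image(image, sig))            # True: untouched since signing
    print(verify_image(image + b"x", sig))     # False: altered after signing

A passing check only proves the file came from whoever holds that key and hasn’t changed since; it says nothing about whether the scene it shows is real.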

masonlee ,

Here, some big names are working on a standard for chaining digital signatures on media files: c2pa.org.

Their idea is that the first signature would come from the camera sensor itself, and every further modification adds to the signature chain.
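
As a rough illustration of the chaining idea (a toy sketch only, not the actual C2PA manifest format; the record layout and every name below are made up), each step signs a hash of the current file bytes together with the previous signature, so an unsigned edit anywhere downstream makes verification fail:

    # Toy signature chain: the camera signs the raw capture, each tool that
    # edits the file appends its own signed link, and a verifier replays the
    # chain against the file versions it was given.
    import hashlib
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    def record_for(file_bytes: bytes, prev_sig: bytes, tool: str) -> bytes:
        return hashlib.sha256(file_bytes + prev_sig + tool.encode()).digest()

    def add_link(chain, file_bytes, key, tool):
        prev_sig = chain[-1]["signature"] if chain else b""
        sig = key.sign(record_for(file_bytes, prev_sig, tool))
        chain.append({"tool": tool, "signature": sig, "public_key": key.public_key()})
        return chain

    def verify_chain(chain, file_versions):
        """Recompute each link's record from the matching file version and check it."""
        prev_sig = b""
        for link, file_bytes in zip(chain, file_versions):
            try:
                link["public_key"].verify(
                    link["signature"], record_for(file_bytes, prev_sig, link["tool"])
                )
            except InvalidSignature:
                return False
            prev_sig = link["signature"]
        return True

    camera_key = Ed25519PrivateKey.generate()   # stands in for the sensor's key
    editor_key = Ed25519PrivateKey.generate()   # stands in for an editing tool's key

    raw, edited = b"raw sensor data", b"cropped image"
    chain = add_link([], raw, camera_key, "camera-sensor")
    chain = add_link(chain, edited, editor_key, "photo-editor")

    print(verify_chain(chain, [raw, edited]))        # True: every link checks out
    print(verify_chain(chain, [raw, b"deepfake"]))   # False: chain is broken

In the real standard, the signing keys themselves would presumably be vouched for by certificates, which is what would let a viewer trust that the first link really came from camera hardware.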

AccidentalLemming ,

deleted_by_author

NeoNachtwaechter ,

Ban open source software that can do the stripping?

Not the software itself, but using it, obviously.

ApeNo1 ,

What may be more interesting is people trying to pass off real images as fake by adding the watermark, to mess with any techniques that attempt to determine what is fake or AI generated.

NeoNachtwaechter ,

It isn’t as good news as it seems at first glance.

So the biggest money-grabbing companies are against deepfakes.

But the smaller ones and the shadier ones aren’t. The governments and their ‘services’. And the companies who sell to them, and so on…

lasagna ,
@lasagna@programming.dev avatar

This brings us back to the regulatory issues. If we over-regulate Western AI, AI from countries like China will gain an edge. And China gives 0 fucks about our laws.

That’s not to say I’m against regulation. I think letting them loose could lead to disaster. Coming up with a solution for this is something I think our governments are incapable of, and I’m glad it’s not my job.

30mag ,

“Biden-Harris Administration Secures Voluntary Commitments from Leading Artificial Intelligence Companies to Manage the Risks Posed by AI”

Are there any consequences for violating a voluntary commitment?

Miqo ,

Better back up that agreement with a binding pinky-promise!
