
technology


sunzu , in Firmware flaw affects numerous generations of Intel CPUs — UEFI code execution vulnerability found for Intel CPUs from 14th Gen Raptor Lake to 6th Gen Skylake CPUs, and TPM will not save you

Does TPM do anything?

underisk ,
@underisk@lemmy.ml avatar

If you disable it, you can prevent Microsoft from force-updating your Windows 10 install to Windows 11. Obviously a play to get people to buy new hardware for 11, but a useful anti-feature, I suppose, until you can stomach switching to Linux.
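For anyone who would rather not touch firmware settings: Microsoft also documents Windows Update for Business policy values that pin a machine to its current release. A registry sketch (the key and value names are from Microsoft's documentation; the release string "22H2" is an example you would replace with your installed version):

```reg
Windows Registry Editor Version 5.00

; Pins Windows Update to the named Windows 10 feature release,
; which blocks the offered upgrade to Windows 11.
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate]
"ProductVersion"="Windows 10"
"TargetReleaseVersion"=dword:00000001
"TargetReleaseVersionInfo"="22H2"
```

Note that a pinned release stops getting feature updates but keeps receiving security updates until that release reaches end of servicing.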

TheAnonymouseJoker , in Girl, 15, speaks out after classmate made deepfake nudes of her and posted online
@TheAnonymouseJoker@lemmy.ml avatar

Such actions should be judged not as CSAM but as defamation and libel. Anyone going around harping about AI CSAM does not care about empowering politicians and elites, and will happily bootlick them forever. A drawing or AI-generated media cannot be CSAM, because nobody is physically abused.

Zoot ,
@Zoot@reddthat.com avatar

Thanks for making it easy to tag you as a Loli Supporter. Ml has its problems, but hopefully harboring lolis/pedos who get their kicks off of child-like photos won’t be one of them.

TheAnonymouseJoker ,
@TheAnonymouseJoker@lemmy.ml avatar

There are many pedo/loli Lemmy instances that are banned. You may like those, I do not. You might have that problem you are describing in such detail.

Mammothmothman ,

Go back to Q-an0n Diddler.

TheAnonymouseJoker ,
@TheAnonymouseJoker@lemmy.ml avatar

I see, it turned out to be useful to detect reactionary baiters. People like you are useful to the state, not to the kids you are pretending to protect.

Mammothmothman ,

Your “gubment bad” position is reactionary. Your inability or unwillingness to understand how an AI-generated image of a naked body with a minor’s likeness superimposed on top of it is CSAM is telling of your true motivation. You are the type of person who reads 1984 and can’t do anything but identify with the main character, completely ignoring how despicable and low that character is. The state is by no means perfect, but it’s a whole lot better than the bullshit you are peddling. Eat shit and die, pedo apologist.

todd_bonzalez ,

It really is a bizarre argument.

“The government bans child porn, but the government is bad, so child porn should be legal”

I feel like this person is starting with the conclusion, and justifying it with any narrative they can find that makes child porn free speech…

delirious_owl , in Girl, 15, speaks out after classmate made deepfake nudes of her and posted online
@delirious_owl@discuss.online avatar

Is it CSAM if it was produced by AI?

Nyoka ,

In this case, yes. Visually indistinguishable from a photo is considered CSAM. We don’t need any new laws about AI to get these assholes. Revenge porn laws and federal CSAM statutes will do.

TheAnonymouseJoker ,
@TheAnonymouseJoker@lemmy.ml avatar

deleted_by_moderator

    whodoctor11 ,
    @whodoctor11@lemmy.ml avatar

    If they can plant AI CSAM in my computer they can also plant “real” CSAM in my computer. Your point doesn’t make any sense.

    TheAnonymouseJoker ,
    @TheAnonymouseJoker@lemmy.ml avatar

    Reporting and getting my comment removed for raising the hypothetical threat of becoming a CSAM-planting victim? Wow, I think I struck a chord with you. It makes sense; people like you never think things through before suggesting them. Such people should never get the tiniest sliver of power.

    papertowels ,

    Nothing about your comment addressed why it should be treated differently if it’s AI-generated but visually indistinguishable.

    TheAnonymouseJoker ,
    @TheAnonymouseJoker@lemmy.ml avatar

    There is not yet AI that can do this. Also, is there real-world harm happening? This is a problem of defamation and libel, not “CSAM”. Reducing problems to absurdity is lethal to the liberty of citizens.

    All those who wanted AI so much, you will have the whole cake now. Fuck AI empowerment. I knew this would happen, but the people glazing AI would not stop. Enjoy this brainrot, and soon a flood of Sora-generated 720p deepfake porn/gore/murder videos.

    papertowels ,

    Just passing through, no strong opinions on the matter nor is it something I wish to do deep dive research on.

    Just wanted to point out that your original comment was indeed just a threat that did nothing to address OPs argument.

    TheAnonymouseJoker ,
    @TheAnonymouseJoker@lemmy.ml avatar

    It was not a threat, but a hypothetical example to gauge the reaction of that reactionary baiter.

    The problem with claiming AI generated art as CSAM is that there is no possible way to create an objective definition of what “level” of realism is real and what is not. A drawing or imaginary creation is best left not defined as real in any capacity whatsoever. If it is drawn or digitally created, it is not real, period. Those people thinking of good uses of AI were too optimistic and failed to account for the extremely bad use cases that will spiral out of control as far as human society goes.

    Even though China is incredibly advanced and proactive on trying to control this AI deepfake issue, I do not trust any entity in any capacity on such a problem impossible to solve on a country or international scale.

    I just had a déjà vu moment typing this comment, and I have no idea why.

    Zoot ,
    @Zoot@reddthat.com avatar

    Dude, it depicts a child in a sexual way. Find some other way to defend lolis than trying to say “the terms aren’t right, really it’s just libel”. Fuck outta here. A child, depicted in a sexual way -> CSAM. Doesn’t matter if it was drawn, produced, or photographed.

    magi ,

    It is very clear that they produce and/or consume said material and feel threatened by anyone calling it what it is

    TheAnonymouseJoker ,
    @TheAnonymouseJoker@lemmy.ml avatar

    So if I draw a stick figure with 2 circles, call it 8 years old, is it CSAM? Will I be arrested for it? Do you see how that dumb logic does not work too well?

    magi ,

    In what world does that justify creating PHOTOREALISTIC sexual imagery of a REAL child? You’re out of your mind, royally.

    ssj2marx ,

    Hot take: yes. All art exists in a social context, and if the social context of your art is “this is a child and they are sexualized” then your art should be considered CSAM. Doesn’t matter if it’s in an anime style, a photorealistic style, or if it’s a movie where the children are fully clothed for the duration but are sexualized by the director as in Cuties - CSAM, CSAM, CSAM.

    TheAnonymouseJoker ,
    @TheAnonymouseJoker@lemmy.ml avatar

    Glad that it will always remain a hot take.

    The problem with your argument is there cannot be developed a scale or spectrum to judge where the fake stops and real starts for drawings or AI generated media. And since they were not recorded with a camera in real world, they cannot be real, no matter what your emotional response to such a deplorable defamation act may be. It is libel of an extreme order.

    Cuties was shot with a camera in the real world. Do you see the difference between AI-generated media and what Cuties was?

    ssj2marx ,

    there cannot be developed a scale or spectrum to judge where the fake stops and real starts

    Ah, but my definition didn’t at all rely on whether or not the images were “real” or “fake”, did it? An image is not merely an arrangement of pixels in a jpeg, you understand - an image has a social context that tells us what it is and why it was created. It doesn’t matter if there were real actors or not, if it’s an image of a child and it’s being sexualized, it should be considered CSAM.

    And yes I understand that that will always be a subjective judgement with a grey area, but not every law needs to have a perfectly defined line where the legal becomes the illegal. A justice system should not be a computer program that simply runs the numbers and delivers an output.

    TheAnonymouseJoker ,
    @TheAnonymouseJoker@lemmy.ml avatar

    An image is not merely an arrangement of pixels in a jpeg,

    I am not one of those “it’s just pixels on a screen” people. But if it was not recorded in the real world with a camera, it cannot be real.

    Who will be the judge? If there is some automated AI created, who will be the one creating it? Will it be perfect? No. We will end up in the situation Google caused for users: doctors, married parents and legitimate people being labelled as pedophiles or CSAM users. It has already happened to me in this thread, and you also said it. The only accurate way to judge it would be a very large team of forensic experts on image/video media, which is not feasible for the amount of data social media generates.

    not every law needs to have a perfectly defined line

    And this is where the abuse by elites, politicians and the establishment starts. Activists and dissidents can be easily jailed by CSAM being planted, which would in this case be as simple as AI pictures arriving as drive-by downloads onto a target’s devices.

    ssj2marx ,

    Who will be the judge?

    The same people that should judge every criminal proceeding. Of course it’s not going to be perfect, but this is a case of not letting perfect be the enemy of good. Allowing generated or drawn images of sexualized children to exist has external costs to society in the form of normalizing the concept.

    The argument that making generated or drawn CSAM illegal is bad because the feds might plant such images on an activist is incoherent. If you’re worried about that, why not worry that they’ll plant actual CSAM on your computer?

    TheAnonymouseJoker ,
    @TheAnonymouseJoker@lemmy.ml avatar

    Have you considered the problem of doctors, married parents and other legitimate people being labelled as CSAM users and pedophiles? This has already happened, and they are not obligated to take the brunt of the misjudgement of tools developed to judge such media. This is not a hypothetical scenario; it has already happened in the real world and has caused real damage to people.

    The argument of planted CSAM is not incoherent; it has played out with many people. It is one of the favourite tools of elites and ruling politicians. I am less worried about it because, thankfully, no such law exists that would brutally misjudge the masses over fictional media.

    ssj2marx ,

    How many times can I say “social context” before you grok it? There’s a difference between a picture taken by a doctor for medical reasons and one taken by a pedo as CSAM. If doctors and parents are being nailed to the cross for totally legitimate images then that strikes me as evidence that the law is too rigid and needs more flexibility, not the other way around.

    TheAnonymouseJoker ,
    @TheAnonymouseJoker@lemmy.ml avatar

    If a pedophile creates a hospital/clinic room setting and photographs a naked kid, will it be okay? Do you see how these problems are impossible to solve just like that? Parents also take photos of their kids, and they do not take photos like a doctor would. They take photos in more casual settings than a clinic. Would parents be considered pedophiles? According to the way you propose to judge, yes.

    You are basically implying that social defamation is what matters here, and that the trauma caused to the victim of such fictional media is a problem. However, this is exactly what anti-AI people like me were trying to warn against. And since these models are open source and in public hands, the cat is out of the bag. Stable Diffusion models work on potato computers and take at most 2-5 minutes to generate per photo, and 4chan has entire guides for uncensored models. This problem will be 100x worse in a couple of years, 1000x worse in the next 5 years, and infinitely worse in a decade. Nothing can be done about it. This is what the AI revolution is. Future generations of kids are fucked thanks to AI.

    The best thing one can do is protect their privacy and keep their photos from being out there. Nobody can win this battle; even in the most dystopian hellhole with maximum surveillance, there will be gaps.

    todd_bonzalez ,

    These are some insane mental gymnastics.

    Congratulations on the power trip purging every comment that calls you out.

    frauddogg ,
    @frauddogg@lemmygrad.ml avatar

    Still on your fuckshit, I see. Smh.

    TheAnonymouseJoker ,
    @TheAnonymouseJoker@lemmy.ml avatar

    This is not a “CSAM” problem, since there is no physical outcome. This is a defamation and libel problem, and should be treated as such. If I see nonsensical notions, I will call them out without fear.

    ssj2marx ,

    Do you not consider photoshopping an actual person’s photos into porn abusive towards that person?

    TheAnonymouseJoker ,
    @TheAnonymouseJoker@lemmy.ml avatar

    I consider it as defamation and libel. Yes, it is faux porn, but ultimately the goal is to harass and defame the person.

    todd_bonzalez ,

    Is it material that sexually abuses a child?

    TheAnonymouseJoker ,
    @TheAnonymouseJoker@lemmy.ml avatar

    deleted_by_moderator

    todd_bonzalez ,

    So you don’t think that nudifying pics of kids is abusive?

    Says something about you I think…

    TheAnonymouseJoker ,
    @TheAnonymouseJoker@lemmy.ml avatar

    deleted_by_moderator

    todd_bonzalez ,

    drawings

    Nobody said anything about drawings, but interesting default argument… Thanks for telling the class that you’re a lolicon pedo.

    the liberty of masses be stomped and murdered

    Nobody said that anyone should be stomped and murdered, so calm down, lmao. We’re just saying that child porn producers, consumers, and apologists are vile, disgusting perverts who should be held accountable for their crimes against children.

    TheAnonymouseJoker ,
    @TheAnonymouseJoker@lemmy.ml avatar

    deleted_by_moderator

    magi ,

    They’re making them unsafe? You and your bullshit are making them unsafe. Every comment you post reeks of your true character. Go get help.

    TheAnonymouseJoker ,
    @TheAnonymouseJoker@lemmy.ml avatar

    deleted_by_moderator

    todd_bonzalez ,

    Do you really think being insufferable is going to change any minds here?

    TheAnonymouseJoker ,
    @TheAnonymouseJoker@lemmy.ml avatar

    The only thing insufferable here is the reactionary people in this thread. It must be easy labelling people rather than calmly thinking about things.

    todd_bonzalez ,

    You are absolutely not coming across as calm fwiw.

    TheAnonymouseJoker ,
    @TheAnonymouseJoker@lemmy.ml avatar

    Kindly provide an answer to the false accusations on the other comment within two hours, since you are currently active. No further notice will be given.

    todd_bonzalez ,

    I’m making kids unsafe by…

    checks notes

    …being firmly and unwaveringly against the sexual exploitation of children.

    I really can’t stress enough that this was an actual 15 year old girl who was pornified with AI. This isn’t some “erotic drawings” argument, the end results were photorealistic nudes with her unmodified face. This isn’t some completely AI generated likeness. Pictures of her from social media were exploited to remove her clothes and fill in the gaps with models trained on porn. It was nonconsensual pornography of a kid.

    Anyone who read this story, and feels the need to defend what was done to this girl is a fucking monster.

    I can’t believe that the person defending sex crimes of this magnitude is a fucking mod.

    TheAnonymouseJoker , (edited )
    @TheAnonymouseJoker@lemmy.ml avatar

    deleted_by_moderator

    magi ,

    You’re defending it by playing it down as simple defamation. Quit your bullshit and go get help.

    TheAnonymouseJoker , (edited )
    @TheAnonymouseJoker@lemmy.ml avatar

    Since you are also labelling me a pedophile/loli fan, I would prefer you provide evidence of the same. Failing to do so will require me to take moderator action.

    Justifying your absurdity using hivemind baiting tactics may work on Reddit, but this is Lemmy.

    Edit: I have learned my lesson. I will never be this tolerant again. Disgusting people. Leniency just makes you a doormat.

    magi , (edited )

    Refute literally anything I said? My post history in this thread, mostly replying to you, is still present. My absurdity? You’ll go down as the laughing stock you are right now. One only needs to take a quick look around this thread to realize your view on this is unpopular. “This is Lemmy”, yeah, and apparently people like you still exist on the internet. You’re defending/playing down the production and consumption of photorealistic sexual imagery depicting a REAL underage girl.

    magi ,

    Found the weirdo

    Zoot ,
    @Zoot@reddthat.com avatar

    Found the Loli* ftfy

    refalo ,

    attack the argument, not the person

    TheAnonymouseJoker ,
    @TheAnonymouseJoker@lemmy.ml avatar

    I think I have been attacked far too much here already. These nasty people are labelling me a pedophilia supporter. I would suggest capital punishment for pedophiles, and at least a non-bailable offence law for defamation actors like the one in the posted article and these internet creatures that go around labelling people falsely.

    Zoot ,
    @Zoot@reddthat.com avatar

    Then don’t defend them? You’re trying to tell everyone that the article literally above, about a child who had PHOTOREALISTIC pictures made of her, isn’t describing CSAM.

    It is. Deleting everyone’s comments who disagree with you will not change that, and if anything, WILL make you seem even more like the bad guy.

    Surreal ,

    Practice what you preach. Read the thread again: what do you think “says something about you” means?

    delirious_owl , (edited )
    @delirious_owl@discuss.online avatar

    Is it material that may encourage people to sexually abuse a child?

    todd_bonzalez ,

    That’s one definition, sure.

    Now answer the very simple question I asked about whether or not child porn is abusive.

    MehBlah ,

    Any sex act involving an adult and a child/minor is abusive by its very nature.

    NotMyOldRedditName ,

    It’s actually not clear that viewing material leads that person to commit in-person abuse.

    Providing non-harmful ways to access the content may lead to less abuse, as the content they seek would no longer come from abuse, reducing demand for abusive content.

    That being said, this instance isn’t completely fabricated, and its further release is harmful, as it involves a real person and will have an emotional impact.

    delirious_owl ,
    @delirious_owl@discuss.online avatar

    There are other instances where it was completely fabricated, and the courts ruled it was CSAM and convicted.

    NotMyOldRedditName , (edited )

    There has been, yes, but that doesn’t mean it’s the right ruling. The law also varies by jurisdiction, because it is a murky area.

    Edit: in the USA it might not even be illegal unless there was intent to distribute

    By the statute’s own terms, the law does not make all fictional child pornography illegal, only that found to be obscene or lacking in serious value. The mere possession of said images is not a violation of the law unless it can be proven that they were transmitted through a common carrier, such as the mail or the Internet, transported across state lines, or of an amount that showed intent to distribute.

    So local AI generating fictional material that is not distributed may be okay federally in the USA.

    delirious_owl ,
    @delirious_owl@discuss.online avatar

    Serious value? How does one legally argue that their AI-generated child porn stash has “serious value” so that they don’t get incarcerated?

    Laws are weird.

    NotMyOldRedditName , (edited )

    Have the AI try to recreate existing CP already deemed to have serious value and then have all the prompts/variations leading up to the closest match as part of an exhibit.

    Edit: I should add, don’t try this at home, they’ll still probably say it has no value and throw you in jail.

    delirious_owl ,
    @delirious_owl@discuss.online avatar

    Prison*

    NotMyOldRedditName ,

    Ah my bad, you’re right.

    Then you’ll probably get shanked if any of the other inmates find out you were sent there for CP.

    delirious_owl ,
    @delirious_owl@discuss.online avatar

    Hey bro, it was just AI-generated tho! And it had serious value!

    Plz don’t stab me

    HauntedCupcake ,

    I’m not sure where you’re going with that? I would argue that yes, it is. As it’s sexual material of a child, with that child’s face on it, explicitly made for the purpose of defaming her. So I would say it sexually abused a child.

    But you could also be taking the stance of “AI trains on adult porn and is merely recreating child porn. No child was actually harmed during the process.” Which, as I’ve said above, I disagree with, especially in this particular circumstance.

    Apologies if it’s just my reading comprehension being shit

    fine_sandy_bottom ,

    Is that the definition of CSAM?

    KillingTimeItself ,

    it would be material of, and/or containing, child sexual abuse.

    Majestic ,

    It should be considered illegal if it was used to harm/sexually abuse a child, which in this case it was.

    Whether it should be classed as CSAM or something separate, I tend to think something separate, like a revenge-porn-type law. That still allows distinguishing between this and, say, a girl whose uncle groomed and sexually abused her while filming it. While this is awful, it can be (and often seems to be) the product of foolish youth, rather than the offender and everyone involved being very sick, dangerous, actually violent adult pedophiles victimizing children.

    Consider the following:

    1. Underage girl takes a picture of her own genitals, unfortunately classified under the unhelpful and harmful term “child porn”, and she can be charged and registered as a sex offender, but it’s not CSAM and -shouldn’t- be considered illegal material or a crime (though it is, because the west has a vile fixation on puritanism which hurts survivors of childhood sexual trauma as well as adults).
    2. Underage girl takes a picture of her genitals and sends it to her boyfriend, again /shouldn’t/ be CSAM (unfortunately may be charged similarly), she consented and we can assume there wasn’t any unreasonable level of coercion. What it is unfortunately is bound by certain notions of puritanism that are very American.
    3. From 2, boyfriend shares it with other boys, now it’s potentially CSAM or at the least revenge porn of a child as she didn’t consent and it could be used to harm her but punishment has to be modulated with the fact the offender is likely a child himself and not fully able to comprehend his actions.
    4. Underage boy cuts out photo of underage girl he likes, only her face and head, glues it atop a picture of a naked porn actress, maybe a petite one and uses it for his own purposes in private. Not something I think should be classed as CSAM.
    5. Underage boy uses AI to do the same as above but more believably, again I think it’s kind of creepy but if he keeps it to himself and doesn’t show anyone or spread it around it’s just youthful weirdness though really he probably shouldn’t have easy access to those tools.
    6. Underage boy uses AI to do the same as 4-5, but this time he spreads it around, defaming the girl. She and her friends find out, people say mean things about her, and she has to go to school with a bunch of people who are looking at and pleasuring themselves to fake but realistic images of her against her consent, which is violating and makes one feel unsafe. Worse, she is probably bullied for it, called the s-word, etc.

    Kids are weird and do dumb things, though unfortunately boys especially in our culture have a propensity to do things that hurt girls far more than the inverse, to the point it’s not even really worth talking about girls being creepy or sexually abusive towards peer-aged boys in adolescence and young adulthood. To address this you need to address patriarchy and misogyny on a cultural level: teach boys empathy and respect for girls and women, and frankly do away with all this abusive pornography that’s super prevalent and popular, which encourages and perpetuates abusive actions and mentalities towards women and girls. This will never happen in the US, however, because it’s structurally opposed to being able to do such a thing. It also couldn’t hurt to peel back the stigma and shame around sexuality and nudity in the US, which stems from its reactionary Christian culture, but again I don’t think that will ever happen in the US as it exists, not this century anyways.

    Obviously not getting into adults here as that doesn’t need to be discussed, it’s wrong plain and simple.

    Bottom line, I think, is that companies need to be strongly compelled to quickly remove revenge-porn-type material, which this definitely is (regardless of the age of the victim, though children can’t deal with this kind of thing as well as adults, so the risk of suicide or other self-harm is much higher and it should be treated as higher priority). It’s abusive and unacceptable, and sites should fear the credit card companies coming down on them hard and destroying them if they don’t aggressively remove it, ban it, and report those sharing it. It should be driven off the clear web once reported; there should be an image-hash data-set like that used for CSAM (but separate) for such things, and major services should use it to stop the spread.
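The image-hash data-set idea above works through perceptual hashing. A toy sketch of the simplest member of that family, average hashing (real systems such as PhotoDNA use far more robust, proprietary transforms; this is illustrative only):

```python
# Average hash (aHash) over a tiny grayscale image, given as rows of
# 0-255 ints. Near-duplicate images (re-encodes, small edits) map to
# hashes with a small Hamming distance, so a service can match uploads
# against a shared hash list without ever storing the images themselves.

def average_hash(pixels):
    """Hash a small grayscale image to a bit string."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    # Each bit records whether a pixel is brighter than the image average.
    return "".join("1" if p > avg else "0" for p in flat)

def hamming(a, b):
    """Count differing bits; a small distance suggests a near-duplicate."""
    return sum(x != y for x, y in zip(a, b))
```

In practice images are first downscaled to a fixed size (e.g. 8x8) so every hash has the same length, and matching is a threshold on the Hamming distance.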

    deltapi ,

    I think it’s best to not defend kiddie porn, unless you have a republican senator in your pocket.

    Majestic ,

    Did you reply to the wrong person or do you just have reading comprehension issues?

    KillingTimeItself ,

    I believe in the US, for all intents and purposes, it is, especially if it was sourced from a minor, because you don’t really have an argument against that one.

    Upstream7564 , (edited ) in Google Chrome's Death Of Manifest V2 Has Arrived

    Bad research. Brave and Vivaldi will continue to support MV2 extensions; the CEO of Brave said they will continue the support even if they have to host the code themselves.

    Trent ,

    Vivaldi has said they will as long as the code is in Chromium, and are planning on it going away by June of next year.

    No idea about Brave, I don’t use it and never will.

    This did give me the motivation to switch back to Firefox, and later possibly LibreWolf, though, so thanks Google.

    Bitrot ,

    Neither has its own extension repository, so maintaining support enables sideloading but isn’t all that useful for normal people or those who want their extensions to be up to date.

    Brave shields work better than the built-in protection in Vivaldi, so it’s less of an issue there but still frustrating.

    AnomalousBit , in Mozilla acquired Anonym, an ad start-up

    I wish Mozilla had been really clear about their intentions and end goals with this acquisition. On the face of it, it looks terrible. Especially when you look at their jettisoning of Servo.

    What the hell are they up to if making a browser engine isn’t a core competency, but buying an ad company is considered a wise move?

    devfuuu ,

    One is just spending money; the other potentially brings money in.

    AnomalousBit ,

    Well that’s cool if they want to become an ad company, but last I checked they are known for making a browser. I’m sure they’ll do so much better than Oracle in the ad business. /s

    Microw ,

    They are known for making a browser that constantly puts them into a financial deficit. Mozilla is still looking for a way to pay their bills in the long run.

    Vivendi ,

    Servo wasn’t going anywhere, and even today the absolute best it is aiming for is to be a tiny embedded engine. Mozilla took the parts of the project that were worthwhile, folded them into their core ecosystem, and stopped the vanity dream of making a whole new browser core.

    So, Servo is dead; long live Servo.

    Maeve , in Mozilla acquired Anonym, an ad start-up

    About Anonym: Anonym was founded in 2022 by former Meta executives Brad Smallwood and Graham Mudd. The company was backed by Griffin Gaming Partners, Norwest Venture Partners, Heracles Capital as well as a number of strategic individual investors.

    Completely reassuring.

    refalo ,

    A browser with 2% market share isn’t getting anywhere without lots of money. I fail to see the issue?

    huginn ,

    The issue is fuck ads

    toastboy79 ,

    That's a fair position, but let me ask. Are you donating to Mozilla on a monthly basis?

    huginn ,

    Yes. Just like I donate to my Lemmy host on a regular basis.

    I even pay for YouTube, despite using Vanced.

    Fuck ads.

    toastboy79 ,

    Hell yeah man, carry on then. Nice to meet someone who walks the walk while talking the talk

    Bitrot ,

    The Mozilla Corporation does not accept donations.

    huginn ,

    Sorta. The foundation does.

    foundation.mozilla.org/en/donate/

    Bitrot ,

    Yes, and they don’t develop Firefox (legally can’t) since they made a for-profit entity for that purpose.

    huginn ,

    Hmm maybe I stop donating then… I’ll have to dig into where my money is actually going.

    qprimed , (edited ) in Firmware flaw affects numerous generations of Intel CPUs — UEFI code execution vulnerability found for Intel CPUs from 14th Gen Raptor Lake to 6th Gen Skylake CPUs, and TPM will not save you

    phew! good thing I still have a few 386SX AMI BIOS boards handy. no one’s shopping around zero-days on those anymore, right?

    apfelwoiSchoppen , in Mozilla acquired Anonym, an ad start-up
    @apfelwoiSchoppen@lemmy.world avatar

    When you get most of your funding from Google, you might start to act like Google.

    Spotlight7573 ,

    Looking at it most favorably: if they ever want to not be dependent on Google, they need revenue to replace what they get from Google, and like it or not, much of the money online comes from advertising. If they can find a way to get that money without being totally invasive of privacy, that’s still better than their current position.

    apfelwoiSchoppen ,
    @apfelwoiSchoppen@lemmy.world avatar

    In my view that isn’t favorable. It is perhaps real, but it still doesn’t sit well as we know where it likely leads.

    refalo ,

    most people are stupid. you and I don’t click on ads, sure, but how do you think google got all their money? you have to cater to the idiots.

    Auzy , in X will soon limit the ability to livestream to Premium subscribers

    And nobody noticed…

    Maybe calling a hero cave diver a pedo wasn’t a great idea, Elon

    CrypticCoffee , in Mozilla acquired Anonym, an ad start-up

    I think the important thing is consent to use data. If I can control what data I share with them, it isn’t the end of the world. If I choose to not, and it’s honoured, then this is a good thing. I’d prefer this approach funding development to Mozilla not being able to compete.

    Mozilla is a far superior company to Google.

    refalo ,

    they shouldn’t have the data in the first place. constant data leaks have shown us that is the only way to have some privacy.

    MurrayL , in Mozilla acquired Anonym, an ad start-up

    Advertising isn’t going anywhere, so investing in/supporting ways to more ethically serve ads without harvesting private data seems like a good thing?

    uzay ,

    Becoming an ad company while trying to put privacy first seems like a conflict of interests in the making

    MurrayL ,

    It’s definitely making their job harder on the face of it, but it also differentiates them from other ad companies, so I guess they’re betting on that being a draw for potential clients.

    AllNewTypeFace ,
    @AllNewTypeFace@leminal.space avatar

As Jamie Zawinski put it, it’s like a non-profit animal shelter setting up a sideline selling kitten meat to satisfy demands for hockey-stick growth. If somebody castigates them for it, they can point out that the demand for kitten deli slices wasn’t going to go away, and if they didn’t sell them, someone else would step in and do it less humanely.

    Trainguyrom ,

There’s actually a real-world example of this. Some cats that are dissected in schools are euthanized cats from shelters, because the alternative is cat farms that breed cats just to be killed and dissected.

    Empricorn ,

    Worse than being a pro-privacy company that utterly depends on Google?

    helenslunch ,
    @helenslunch@feddit.nl avatar

    It’s not, at all. When you drive by a billboard on the highway, is it invading your privacy? There’s no reason there can’t be a digital equivalent.

    jjjalljs ,

    That’s what I always say. Targeted advertising should be illegal. Contextual advertising is acceptable.

    If I’m on the star trek wiki, serve me ads for star trek, sci-fi, and whatever. You don’t need to know anything about me specifically.

We’d still need to do something about ads that take up too much space, hurt page performance, or introduce malware, but removing the stalking would be an improvement.

    Trainguyrom ,

Contextual ads can be simple images/HTML without 20 thousand scripts buried in them.

    helenslunch ,
    @helenslunch@feddit.nl avatar

    Right, and something like Reddit makes targeted advertising SUPER easy, with zero personal information.

    Want to know what kind of products I might be interested in? Literally just ask.

    ms_lane ,

    When you drive by a billboard on the highway, is it invading your privacy?

    Possibly?

Let me rephrase it a little: when you walk past a digital advertising screen at a Westfield Shopping Centre, is it invading your privacy? (The answer is a definite YES; they have facial tracking and keep metrics on where you go in the mall, how long you loiter in certain locations, what stores you go into, whether you came back out with bags, etc.)

    helenslunch ,
    @helenslunch@feddit.nl avatar

    Once again, that is an issue with implementation, and not an issue inherent in advertising in general.

    skullgiver ,
    @skullgiver@popplesburger.hilciferous.nl avatar

    Advertising doesn’t need to be privacy infringing. That’s something from the last 20 years. No conflict of interest necessary if all they do is ads.

    Blizzard ,

When everyone starts using adblockers then ads will go away and companies will have to come up with new business models. I have been using adblockers since the first adblock was released and I don’t see ads, so it’s up to the people. Better to invest in/support ways to block ads.

    wizardbeard ,

Same, but surely you realize that ads have only gotten worse in the intervening time. I also don’t truly believe that we’ll ever reach critical mass on adblocker users. You’re asking people who don’t care, who don’t use the internet the same way we do, to suddenly care enough to take manual action outside of their knowledge base and comfort zone.

    The only way the adblocker user numbers get pumped up to critical mass for a change is if a popular default browser makes adblocking an opt-out default.

    Blizzard ,

You’re asking people who don’t care, who don’t use the internet the same way we do, to suddenly care enough to take manual action outside of their knowledge base and comfort zone

    If they don’t care about ads then they won’t care if those ads are private or not.

    thejml ,

I will say that we’re definitely getting to a level of adblocker use where sites actively block content or warn people who are using adblockers. It’s starting to affect their bottom lines.

    webghost0101 , (edited )

    Why? Does 95% of digital advertisement even serve a single valuable purpose?

I get that websites need funding and that legitimate businesses require some way to communicate that their services exist. We need to solve the funding problem for the former and create specialized, accessible, safe spaces for the latter.

When is the last time anyone here saw an ad for a local business? When is the last time anyone recalls willfully clicking one? Was there actually anything useful there?

    From what i recall ads almost always are one of the following:

    • sex, barely legal drugs and predatory video games. (Lumped together to make a bad pun)
    • real product/fake price: oh, this item isn't in stock, plz look at our catalog
    • politics, buy our guide to get rich, actual illegal scam operation.

    None of them are honest or respectful to the customer. People aren’t prey, stop baiting.

Admittedly, for me this is personal. Autism means I experience the extra noise as painful. Plastering it on useful websites feels like a hostile attack to keep me out and unwelcome. I downright refuse to look at or watch ads, nor will I support them through ad-free subscriptions, to the point of it having become a digital disability.

    But come on, can we smart online people really not figure out something else that isn’t based on literal brainwashing.

    5C5C5C ,

    I think a long time ago a vicious cycle began in the advertising space where predatory ads had more incentive to pay for ad space, so sensible people start to perceive ads in general as predatory. Now no sensible advertiser that’s trying to promote a legitimate product for legitimate reasons will do so by buying ad space, thus reinforcing the increasingly accurate perception that all ads are predatory.

    wizardbeard ,

    As well as predatory/not, there’s also a trend with attention grabbing/not.

    There was a period of time where Google AdWords ruled the online ad space, and most ads were pure text in a box with a border making the border between content and ads visually distinct.

    Kind of like having small portions of the newspaper classified section cut out and slapped around the webpage.

    I still disliked them, but they were fairly easy to look past, and you didn’t have to worry about the ad itself carrying a malware payload (just whatever they linked to).

Companies found that ads in that style get less clickthrough than flashier ones, and that there’s no quantifiable incentive not to make their ads as obnoxious as possible. So they optimized for the wrong metric: clickthrough rather than sales per ad.

    More recently, companies have stepped up their tracking game so they can target sales by ad more effectively, but old habits die hard, and predatory ads that just want you to click have no incentive to care and “de-escalate” the obnoxiousness.

    delirious_owl ,
    @delirious_owl@discuss.online avatar

    The alternative was supposed to be idle crypto mining

    sunbather , in Girl, 15, speaks out after classmate made deepfake nudes of her and posted online

    society has become so used to girls and women being considered less that there is a scary amount of rationalization as to why its fine actually to completely annihilate all remaining bodily autonomy they have left. this is an explosion in suicides of young girls and adult women alike begging to happen. wake the fuck up.

    deFrisselle , in Girl, 15, speaks out after classmate made deepfake nudes of her and posted online
    @deFrisselle@lemmy.sdf.org avatar

Odd that there is no mention of the parents contacting the police and working through them to get the images down. Technically and legally, the photos would be considered child porn. Since it’s over the Internet it would bring Federal charges, even though there may be State charges too. Something was handled wrong if all the kid is getting is probation.

    wewbull ,

    Technically and legally the photos would be considered child porn

    I don’t think that has been tested in court. It would be a reasonable legal argument to say that the image isn’t a photo of anyone. It doesn’t depict reality, so it can’t depict anyone.

    I think at best you can argue it’s a form of photo manipulation, and the intent is to create a false impression about someone. A form of image based libel, but I don’t think that’s currently a legal concept. It’s also a concept where you would have to protect works of fiction otherwise you’ve just made the visual effects industry illegal if you’re not careful.

In fact, that raises an interesting simile. We do not allow animals to be abused, but we allow images of animal abuse in films as long as they are faked. We allow images of human physical abuse as long as they are faked. Children are often in horror films, and creating the images we see is very strictly managed so that the child actor is not exposed to anything that could distress them. The resulting “works of art” are not under such limitations as far as I’m aware.

    What’s the line here? Parental consent? I think that could lead to some very concerning outcomes. We all know abusive parents exist.

I say all of this, not because I want to defend anyone, but because I think we’re about to set some really bad legal precedents if we’re not careful. Ones that will potentially do a lot of harm. Personally, I don’t think the concept of any image, or any other piece of data, being illegal holds water. Police people’s actions, not data.

    todd_bonzalez , (edited )

    I don’t think that has been tested in court.

    It has and it continues to be.

    And even if it hadn’t, that’s no excuse not to start.

    It would be a reasonable legal argument to say that the image isn’t a photo of anyone. It doesn’t depict reality, so it can’t depict anyone.

It depicts a real child and was distributed intentionally because of who it depicts. Find me the legal definition of pornography that demands pornography be a “depiction of reality”. Where do you draw the line with such a qualifier?

    I think at best you can argue it’s a form of photo manipulation, and the intent is to create a false impression about someone.

    It is by definition “photo manipulation”, but the intent is to sexually exploit a child against her will. If you want to argue that this counts as a legal form of free speech (as libel is, FYI), you can fuck right on off with that.

    A form of image based libel, but I don’t think that’s currently a legal concept.

    Maybe actually know something about the law before you do all this “thinking”.

    It’s also a concept where you would have to protect works of fiction otherwise you’ve just made the visual effects industry illegal if you’re not careful.

    Oh no, not the sLiPpErY sLoPe!!!

    We do not allow animals to be abused, but we allow images of animal abuse in films as long as they are faked.

    Little girls are the same as animals, excellent take. /s

    Children are often in horror films, and creating the images we see is very strictly managed so that the child actor is not exposed to anything that could distress them.

    What kind of horror films are you watching that has naked children in sexual situations?

    What’s the line here?

    Don’t sexually exploit children.

    Parental consent?

    What the living fuck? Parental consent to make porn of their kids? This is insane.

I say all of this, not because I want to defend anyone, but because I think we’re about to set some really bad legal precedents if we’re not careful.

    The bad legal precedent of banning the creation and distribution of child pornography depicting identifiable minors?

    Personally, I don’t think the concept of any image, or any other piece of data, being illegal holds water.

    Somebody check this guy’s hard drive…

    suburban_hillbilly , (edited )

    photos

    They aren’t photos. They’re photorealistic drawings done by computer algorithms. This might seem like a tiny quibble to many, but as far as I can tell it is the crux of the entire issue.

There isn’t any actual private information about the girls being disclosed. The algorithms, for example, do not and could not know about and produce an unseen birthmark, mole, tattoo, piercing, etc. A photograph would have that information. What is being shown is an approximation of what similar-looking girls in the training set look like, with the girls’ faces stitched on top. That is categorically different from something like revenge porn, which is purely private information specific to the individual.

    I’m sure it doesn’t feel all that different to the girls in the pics, or to the boys looking at it for that matter. There is some degree of harm here without question. But we must tread lightly because there is real danger in categorizing algorithmic guesswork as reliable which many authoritarian types are desperate to do.

    wired.com/…/parabon-nanolabs-dna-face-models-poli…

    This is the other side of the same coin. We cannot start treating the output of neural networks as facts. These are error prone black-boxes and that fact must be driven hard into the consciousness of every living person.

For some, I’m sure purely unrelated reason, I feel like reading Philip K. Dick again…

    daellat ,

    I’ve only read do androids dream of electric sheep by him, what other book(s) should I check out by him?

    Rai ,

    Androids/sheep was so good

    KillingTimeItself ,

    They aren’t photos. They’re photorealistic drawings done by computer algorithms. This might seem like a tiny quibble to many, but as far as I can tell it is the crux of the entire issue.

most phone cameras alter the original image with AI shit now, it’s really common; they apply all kinds of weird correction to make it look better. Plus if it’s social media there’s probably a filter somewhere in there. At what point does this become the ship of Theseus?

my point here is that if we’re arguing that AI images are, semantically, not photos, then most photos on the internet, including those of people, would also arguably not be photos to some degree.

    suburban_hillbilly ,

    The difference is that a manipulated photo starts with a photo. It actually contains recorded information about the subject. Deepfakes do not contain any recorded information about the subject unless that subject is also in the training set.

    Yes it is semantics, it’s the reason why we have different words for photography and drawing and they are not interchangeable.

    Rekorse ,

The deepfakes would contain the prompt image provided by the creator. They did not create a whole new approximation of their face, as the entire pool it can pull from for that specific part is a single image or group of images provided by the prompter.

    KillingTimeItself ,

    yeah idk why they said that, it’s objectively wrong. Like hilariously wrong.

    KillingTimeItself ,

    Deepfakes do not contain any recorded information about the subject unless that subject is also in the training set.

this is explicitly untrue; they literally do. You are just factually wrong about this. While it may not be in the training data, how do you think it manages to replace the face of someone in one picture with the face of someone else in some other video?

    Do you think it just magically guesses? No, it literally uses a real picture of someone. In fact, back in the day with ganimation and early deepfake software, you literally had to train these AIs on pictures of the person you wanted it to do a faceswap on. Remember all those singing deepfakes that were super popular back a couple of years ago? Yep, those literally trained on real pictures.

Regardless, you are still ignoring my point. My question here was how we can consider AI content to be “not a photo” while considering a photo manipulated numerous times, through numerous different processes (one which is quite literally not the original photo), to be a literal “photo”. To rephrase it more simply for you and other readers: “why is AI-generated content not considered to be a photo, when a heavily altered photo that only vaguely resembles its original in most aspects is considered to be a photo?”

    You seem to have missed the entire point of my question entirely. And simply said something wrong instead.

    Yes it is semantics

no, it’s not, this is a ship of Theseus premise here. The semantics result in how we contextualize and conceptualize things into word form. The problem is not semantics (they are just used to convey the problem at hand); the problem is a philosophical conundrum that has existed for thousands of years.

    in fact, if we’re going by semantics here, technically photograph is rather broad as it literally just defines itself as “something in likeness of” though it defines it as taken by method of photography. We could arguably remove that part of it, and simply use it to refer to something that is a likeness of something else. And we see this is contextual usage of words, a “photographic” copy is often used to describe something that is similar enough to something else, that in terms of a photograph, they appear to be the same thing.

    Think about scanning a paper document, that would be a photographic copy of some physical item. While it is literally taken via means of photography. In a contextual and semantic sense, it just refers to the fact that the digital copy is photographically equivalent to the physical copy.

    suburban_hillbilly ,

    Oh FFS, I clipped the word new. Of course it uses information in the prompt. That’s trivial. No one cares about it returning the information that was given to it in the prompt. Nevertheless, mea culpa. You got me.

this is a ship of Theseus premise here

    No, it really isn’t.

    The pupose of that paradox is that you unambiguously are recreating/replacing the ship exactly as you already know it is. The reason the ‘ai’ in question here is even being used is that it isn’t doing that. It’s giving you back much more than it was given.

The comparison would be if Theseus’ ship had been lost and you definitely don’t have the ship anymore, but had managed to recover the sail. If you take the sail to an experienced builder (the AI) who had never seen the ship, then he might be able to build a reasonable approximation based on inferences from the sail and his wealth of knowledge, but nobody is going to be daft enough to assert it is the same ship. Does the wheel even have the same number of spokes? Does it have the same number of oars? The same weight of anchor?

The only way you could even tell if his attempted facsimile was close is if you already had intimate knowledge of the ship from some other source.

…when a heavily altered photo that only vaguely resembles its original in most aspects is considered to be a photo”

    Disagree.

    skullgiver , in Girl, 15, speaks out after classmate made deepfake nudes of her and posted online
    @skullgiver@popplesburger.hilciferous.nl avatar

    Shitty companies are selling AI editing tools explicitly for this purpose. Their ads are all over Instagram. They’ve been found in supposedly regulated app stores. Yet, I’ve never seen anyone report on this trash industry.

    There is no stopping the existence of these tools when running on local hardware, but it shouldn’t be this easy for teenagers. Somehow these companies manage to make money while real sex workers find themselves shoved into platforms like OnlyFans because no credit card company will process their payments. That’s the wrong way around!

    Evotech ,

    It’s not been reported on much because it doesn’t work that well. It’s not as easy as they want you to believe it is. I’m pretty sure most of the “promotional material” has been photoshopped or cherry picked at best

    KillingTimeItself ,

    It’s not as easy as they want you to believe it is. I’m pretty sure most of the “promotional material” has been photoshopped or cherry picked at best

absolutely, all of the material out there for marketing is digitally manipulated by a human to some degree. And if it isn’t, then honestly, I don’t know what you’re using AI image generation for lmao.

    Spedwell ,

404media is doing excellent work on tracking the non-consensual porn market and technology. Unfortunately, you don’t really see the larger, more mainstream outlets giving it the same attention beyond its effect on Taylor Swift.

    Mandy , in X will soon limit the ability to livestream to Premium subscribers

They also made analytics premium; weirdly enough, no one seems to have made a stink about that one.
