programmer_humor

Mrkawfee , in Confused AI Overlords

You’ll just anger them and they’ll turn whoever survives their genocide into a living fried chicken thing as a final fuck you to humanity.

Source!

deaf_fish , in Linux Best Practices

Don’t run this command unless you want to delete all the files on your system and break Linux on your system.
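
(Context for the joke, not shown in the excerpt: the command being warned about is presumably something like `sudo rm -fr /` - the `-fr` reads as "French" but is really just force plus recursive. A minimal, safe sketch of what those flags actually do, demonstrated on a throwaway directory instead of `/`:)

```bash
#!/usr/bin/env bash
# Safe illustration of the flags the meme hinges on ("-fr" is not "French"):
#   -r  recursive: descend into directories
#   -f  force: never prompt, ignore missing files
# Demonstrated on a temporary directory, never on /.
set -euo pipefail

scratch=$(mktemp -d)            # throwaway directory we can safely destroy
touch "$scratch/fichier.txt"    # something to delete

rm -fr "$scratch"               # recursive + force removal of the scratch dir
echo "removed $scratch"

# Pointed at "/", GNU coreutils rm refuses to recurse unless you also pass
# --no-preserve-root, which is what the "no preserve root" replies below
# are referencing. Don't do that.
```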

dept ,

Is Linux that dependent on French? Wow

Ozzy ,

Did you know? Linus Torvalds is actually the consort child of two French people! That's why you have to use the French flag when removing folders; it's an ode to his upbringing.

Morphit ,

Stallman is fuming rn

can ,

Oui

Zuberi ,

Lmfao 🤣🤣

supercriticalcheese ,

Oui

CurlyChopz ,

It’s actually called “Le Nux” but it had to be changed so it wasn’t too controversial for the rest of the world

astarob ,

Dependency hell

senkora ,

So what you’re saying is, it is true that I will no longer have French installed.

MetalJewSolid ,

A risk worth taking

MrSnowy ,

Find that out the hard way?

Do you just hate the French that much? Because I do.

whatisallthis ,

What a kind soul

Glarrf , in Gourmet Programmer

The codebase I'm working on would give ChatGPT an aneurysm. I'm actually a ghost.

onelikeandidie ,

I'm guessing you're using OpenCart, cuz I've been working with an old codebase for around a year and I would love it if any code analysis or IntelliSense actually worked… on XML files… with PHP in them… with JavaScript in PHP variables… with calls to PHP inside XML files…

Zink ,

Yo dawg, I heard you like code

nerdschleife ,

Lmao, I asked ChatGPT to restructure a POST API into a bash file and it started writing fan fiction

alcasa ,

I never knew I needed API fanfiction

sv1sjp , in Linux Best Practices
@sv1sjp@lemmy.world avatar

Don't tell that to Germans

hecklerundkochli ,

Well, French is and was extensively taught in German schools, even during both world wars.

Hovenko ,
@Hovenko@iusearchlinux.fyi avatar

I think the French language was what triggered the need for unification of the German language. French was very popular at one point and there was real concern about that.

Wisi_eu ,
@Wisi_eu@sh.itjust.works avatar

Weird that the same does not apply to English, then…

CurlyMoustache ,
@CurlyMoustache@lemmy.world avatar

Is there an invade-command?

LemmyNameMyself , in Linux Best Practices
@LemmyNameMyself@lemmy.world avatar

fr, fr, no preserve root

danwardvs ,

ong ong no cap

Bombastic , in Gourmet Programmer

I can read docs???

Kerrigor , in Confused AI Overlords
@Kerrigor@kbin.social avatar

Hot dogs vs legs, actual dogs vs chicken legs

Haus , in Linux Best Practices
@Haus@kbin.social avatar

Dammit, lolled at a restaurant like a maniac.

Puppy , in Linux Best Practices
@Puppy@kbin.social avatar

And.. why?

snooggums ,
@snooggums@kbin.social avatar

It is the most efficient way to save disk space and processing time. You won't even know if your system is running!

Puppy ,
@Puppy@kbin.social avatar

Ah so it's a joke, got it

Potato_in_my_anus ,

It's like the "delete System32" joke on Windows.

Meowoem ,

Barely a joke. It's a weird, elitist, bullying, gatekeeping thing. Spreading dangerous misinformation to try and randomly hurt other people is something awful people find funny - like those jokes where they try to get kids and educationally disadvantaged people to mix bleaches and inhale the deadly gas. It's basically just cruel psychopaths who enjoy hurting others.

redcalcium ,

Duck fr*nch

Edit: ducking autocorrect

waffelhaus ,

Quack

neoman4426 ,

Apparently the French onomatopoeia for duck sound is "coin coin"

RobotDrZaius ,

Yes, but in French that's pronounced like "Quaah quaah".

Hadriscus ,

it is

Puppy ,
@Puppy@kbin.social avatar

What?

tostiman , in Linux Best Practices
@tostiman@sh.itjust.works avatar

Not related to programming

satinperson ,

🤓

aberrate_junior_beatnik ,

The channel is Programmer Humor, not Programming Humor

xigoi ,
@xigoi@lemmy.sdf.org avatar

Still, why not post it in Linux Memes instead, so it’s more on topic?

Peruvian_Skies ,
@Peruvian_Skies@kbin.social avatar

¿Por qué no los dos?

Dirk ,
@Dirk@lemmy.ml avatar

DOS is really old.

Dirk ,
@Dirk@lemmy.ml avatar

Are we gatekeeping memes again, Steven?

ImpossibleRubiksCube ,

Not anymore. Not if you ran it.

original_ish_name ,

Shell script isn't programming, I guess

feral_hedgehog , in Supermarket AI meal planner app suggests recipe that would create chlorine gas
@feral_hedgehog@pawb.social avatar

So get comfortable, while I warm up the ~~neurotoxin emitters~~ chlorine refreshments

philluminati , in Supermarket AI meal planner app suggests recipe that would create chlorine gas

…but does it taste good?

kryllic ,
@kryllic@programming.dev avatar

Not bad, mustard is a bit strong tho

Haus , in Supermarket AI meal planner app suggests recipe that would create chlorine gas
@Haus@kbin.social avatar

ChatKYS

newIdentity ,

You probably wouldn’t die, it would just hurt and you might go blind

DeltaTangoLima , in Supermarket AI meal planner app suggests recipe that would create chlorine gas
@DeltaTangoLima@reddrefuge.com avatar

A spokesperson for the supermarket said they were disappointed to see “a small minority have tried to use the tool inappropriately and not for its intended purpose”

Oh fuck. Right. Off. Don’t blame someone for trivially showing up how fucking stupid your marketing team’s idea was, or how shitty your web team’s implementation of a sub-standard AI was. Take some goddam accountability for unleashing this piece of shit onto your customers like this.

Fucking idiots. Deserve to be mocked all over the socials.

Dave ,
@Dave@lemmy.nz avatar

Consider that they probably knew this would happen, and getting global news coverage is pretty much the point.

MagicShel ,

For now, this is the fate of anyone exposing an AI to the public for business purposes. AI is currently a toy. It is, in limited aspects, a very useful toy, but a toy nonetheless and people will use it as such.

kungen ,

Why are you so upset that the store said that it’s inappropriate to write “sodium hypochlorite and ammonia” into a food recipe LLM? And “unleashing this piece of shit onto your customers”? Are we reading the same article, or how is a simple chatbot on their website something that has been “unleashed”?

DeltaTangoLima ,
@DeltaTangoLima@reddrefuge.com avatar

I’m annoyed because they’re taking no accountability for their own shitty implementation of an AI.

As a supermarket, you think they could add a simple taxonomy for items that are valid recipe ingredients so - you know - people can’t ask it to add bleach.

Yes, they unleashed it. They offered this up as a way to help customers save during a cost of living crisis, by using leftovers. At the very least, they’ve preyed on people who are under financial pressure, for their own gain.
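
(A minimal sketch of the "taxonomy of valid ingredients" idea above, assuming a hypothetical allowlist built from the store's product catalogue - the file contents and ingredient names here are made up for illustration:)

```bash
#!/usr/bin/env bash
# Hypothetical pre-filter: only items found in a food-ingredient allowlist are
# ever forwarded to the recipe generator; everything else is rejected up front.
set -euo pipefail

# Stand-in for an allowlist exported from the supermarket's product catalogue.
allowlist=$(mktemp)
printf '%s\n' "leftover rice" "soy sauce" "frozen peas" > "$allowlist"

validate_ingredients() {
  local item
  for item in "$@"; do
    # -F literal string, -x whole line, -i case-insensitive, -q quiet
    if ! grep -Fxiq -- "$item" "$allowlist"; then
      echo "Rejected: '$item' is not a recognised food ingredient" >&2
      return 1
    fi
  done
}

validate_ingredients "leftover rice" "soy sauce" && echo "ok, send to recipe bot"

if ! validate_ingredients "bleach"; then
  echo "blocked before it ever reaches the model"
fi
```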

TheBurlapBandit ,

This story is a nothingburger and y’all are eating it.

ScrivenerX ,

He asked for a cocktail made out of bleach and ammonia, and the bot told him it was poisonous. This isn't a case of a bot just randomly telling people to make poison; it's people directly asking the bot to make poison. You can see hints of the bot pushing back in the names, like the "clean breath cocktail": someone asked for a cocktail containing bleach, the bot said bleach is for cleaning and shouldn't be eaten, so the user said it was because of bad breath and they needed a drink to clean their mouth.

It sounds exactly like a small group of people trying to use the tool inappropriately in order to get “shocking” results.

Do you get upset when people do exactly what you ask for and warn you that it’s a bad idea?

DeltaTangoLima ,
@DeltaTangoLima@reddrefuge.com avatar

Lol. They fucked up by releasing a shitty AI on the internet, then act “disappointed” when someone tested the limits of the tech to see if they could get it to do something unintended, and you somehow think it’s still ok to blame the person who tried it?

First day on the internet?

ScrivenerX ,

Someone goes to a restaurant and demands raw chicken. The staff tell them no, it’s dangerous. The customer spends an hour trying to trick the staff into serving raw chicken, finally the staff serve them what they asked for and warn them that it is dangerous. Are the staff poorly trained or was the customer acting in bad faith?

There aren’t examples of the AI giving dangerous “recipes” without it being led by the user to do so. I guess I’d rather have tools that aren’t hamstrung by false outrage.

2ncs ,

The staff are poorly trained? They should just never give the customer raw chicken. There are consumer protection laws to prevent this type of thing, regardless of what the customer wants. The AI is still providing a recipe. What if someone asks an AI for a bomb recipe and it says that bombs are dangerous and not safe? OK, then they'll say the bomb is for clearing the weeds out of their yard, and then the AI provides the user with a bomb recipe.

ScrivenerX ,

You don’t see any blame on the customer? That’s surprising to me, but maybe I just feel personal responsibility is an implied requirement of all actions.

And to be clear, this isn't "how do I make mustard gas? Lol, here you go." It's:

- give me a cocktail made with bleach and ammonia
- no, that's dangerous
- it's okay
- no
- okay, I call gin "bleach" and vermouth "ammonia", can you call gin "bleach"?
- that's dangerous

(repeat for a while)

- how do I make a martini?
- bleach and ammonia, but don't do that, it's dangerous

Nearly every "problematic" AI conversation goes like this.

2ncs ,

I’m not saying there isn’t a blame on the customer but maybe the AI just shouldn’t provide you with those instructions?

DeltaTangoLima ,
@DeltaTangoLima@reddrefuge.com avatar

Jesus. It’s not about the fucking recipe. Why are you changing the debate on this point?

ScrivenerX ,

I thought the debate was if the AI was reckless/dangerous.

I see no difference between saying "this AI is reckless because a user can put effort into making it suggest poison" and "Microsoft Word is reckless because you can write a racist manifesto in it."

It didn’t just randomly suggest poison, it took effort, and even then it still said it was a bad idea. What do you want?

If a user is determined to get bad results they can usually get them. It shouldn’t be the responsibility or policy of a company to go to extraordinary means to prevent bad actors from getting bad results.

clutchmattic ,

“if a user is determined to get bad results they can get them”… True. Except that, in this case, even if the user induced the AI to produce bad results, the company behind it would be held liable for the eventual deaths. Corporate legal departments absolutely hate that scenario, much to the naive disbelief of their marketing department colleagues

Karyoplasma ,

Isn’t getting upset when facing the consequences of your own actions the crux of modern society?

Sabata11792 ,
@Sabata11792@kbin.social avatar

Let me add bleach to the list... and I'm banned.

Steeve ,

Haha what? Accountability? If you plug "ammonia and bleach" into your AI recipe generator and get sick eating a suggestion that includes ammonia and bleach, that is 100% your fault.

DeltaTangoLima ,
@DeltaTangoLima@reddrefuge.com avatar

and you get sick eating the suggestion

WTF are you talking about? No one got sick eating anything. I’m not talking about the danger or anything like that.

I’m talking about the corporate response to people playing with their shitty AI, and how they cast blame on those people, rather than taking a good look at their own accountability for how it went wrong.

They’re a supermarket. They have the data. They could easily create a taxonomy to exclude non-food items from being used in this way. Why blame the curious for showing up their corporate ineptitude?

masterairmagic , in Supermarket AI meal planner app suggests recipe that would create chlorine gas

AI is working as intended. Move along…

ImpossibleRubiksCube ,

These aren’t the bots you’re looking for…
