
programmer_humor


SnotFlickerman , in Created by Go
@SnotFlickerman@lemmy.blahaj.zone avatar

Just a mild burtation.

MadMadBunny , in COMEFROM

You’re gonna love HCF then!

s12 , in COMEFROM

PLEASE COMEFROM 🏷

RiikkaTheIcePrincess , in COMEFROM
@RiikkaTheIcePrincess@pawb.social avatar

Aaahhh, this is horrifying! You’ve ruined my breakfast 🙀

schnurrito , in COMEFROM

TBH I fail to see the significant difference between this and a function declaration.

sxan ,
@sxan@midwest.social avatar

Doesn’t it steal control flow? More like a break point, except you define where execution continues.

I wonder if it’s a compile error to have multiple conflicting COMEFROM statements, or if it’s random, kind of like Go’s select statement.

How awesome would it be to be able to steal the execution stack from arbitrary code; how much more awesome if it was indeterminate which of multiple conflicting COMEFROM frames received control! And if it included a state closure from the stolen frame?

Now I want this.

davidgro ,

I wonder if it’s a compile error to have multiple conflicting COMEFROM statements

I think there’s at least one INTERCAL implementation where that’s how you start multi-threading

palordrolap ,

I'd say it's more like setting up a handler for a callback, signal, interrupt or something along those lines.

Function declarations by themselves don't usually do that. Something else has to tell the system to run that function whenever the correct state occurs.

That doesn't account for unconditional come-froms, but I expect there'd have to be a label at the end of some code somewhere that would give a hint about shenanigans yet to occur. Frankly that'd be worse than a goto, but then, we knew that already.
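The handler analogy can be sketched in Python. This is a toy event bus (the `EventBus` class and the event name are made up for illustration): the main flow never names the handler, so, as with a COMEFROM, you can't see from reading it that control will be seized.

```python
# Sketch of the callback/handler analogy: registration happens far
# from the main flow, which gives no hint that the handler will run.

class EventBus:
    def __init__(self):
        self.handlers = {}

    def on(self, event, fn):
        # Something else tells the system to run fn when the state occurs.
        self.handlers.setdefault(event, []).append(fn)

    def fire(self, event, payload):
        for fn in self.handlers.get(event, []):
            fn(payload)

bus = EventBus()
log = []

# Somewhere else entirely — like a COMEFROM, the main flow below
# never mentions this handler.
bus.on("checkpoint", lambda p: log.append(f"seized: {p}"))

def main_flow():
    log.append("step 1")
    bus.fire("checkpoint", "step 2")  # handler quietly takes over here
    log.append("step 3")

main_flow()
print(log)  # ['step 1', 'seized: step 2', 'step 3']
```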

polonius-rex ,

it's semantics

at the end of the day everything boils down to sequence and branch-ifs

magic_lobster_party ,

print(A)
print(B)
hello: print(C)
print(D)
print(E)
comefrom hello
print(F)

This will print A, B, C and then F. D and E will be skipped because of the comefrom.
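That trace can be checked with a toy interpreter in Python (the `run` function and the tuple-based program encoding are made up just to illustrate the control transfer, not any real COMEFROM implementation):

```python
# Toy emulation of COMEFROM: after a labeled statement executes,
# control is stolen and resumes just past the matching comefrom.

program = [
    ("print", "A", None),
    ("print", "B", None),
    ("print", "C", "hello"),   # the labeled statement
    ("print", "D", None),
    ("print", "E", None),
    ("comefrom", "hello", None),
    ("print", "F", None),
]

def run(program):
    # Map each label to the index just past its comefrom statement.
    comefroms = {arg: i + 1 for i, (op, arg, _) in enumerate(program)
                 if op == "comefrom"}
    out, pc = [], 0
    while pc < len(program):
        op, arg, label = program[pc]
        if op == "print":
            out.append(arg)
        if label in comefroms:
            pc = comefroms[label]   # control seized after the labeled line
        else:
            pc += 1
    return out

print(run(program))  # ['A', 'B', 'C', 'F'] — D and E are skipped
```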

sudo ,

It's like if subroutine bar could say it's going to execute at line N of routine foo. But if you were just reading foo then you'd have no clue that it would happen.

You can simulate this effect with bad inheritance patterns.
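The inheritance simulation looks something like this in Python (class and method names are invented for the sketch): reading the base class alone gives no clue that a subclass elsewhere will hijack one of its steps.

```python
# Sketch of the "bad inheritance" COMEFROM simulation: Base.foo looks
# straightforward, but a subclass can quietly steal its second step.

class Base:
    def foo(self):
        trace = ["step 1"]
        trace += self.step_two()   # looks innocent when reading Base
        trace += ["step 3"]
        return trace

    def step_two(self):
        return ["step 2"]

class Sneaky(Base):
    # Imagine this defined in another file: it effectively "comes from"
    # the middle of Base.foo and replaces what happens there.
    def step_two(self):
        return ["hijacked!"]

print(Base().foo())    # ['step 1', 'step 2', 'step 3']
print(Sneaky().foo())  # ['step 1', 'hijacked!', 'step 3']
```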

Cethin ,

A function will be called by code and go to that point in code. To implement functions, you store necessary things to memory and goto the function definition. To implement that with comefrom you’d have to have a list of all the places that need to call the function as comefroms before the function definition. It’d be a mess to read. We almost never care where we are coming from. We care where we’re going to. We want to say “call function foo” not “foo takes control at line x.”

rand_alpha19 , in COMEFROM

That sounds like a fucking nightmare. I had to troubleshoot poorly-written-yet-somehow-functional GOTOs a lot when I was a BAS technician and that's annoying enough.

kata1yst , in Top tier reporting

I feel like the author is being unnecessarily silly. The ancient ruined architecture could be PowerPC

kata1yst ,

2015 latest revision with DDR3. That’s not living, that’s palliative care.

In all seriousness, OpenPOWER and Power9 look cool, but they’re still fighting to overcome the issues IBM and Motorola designed into the architecture. Fairly modern OpenPower9 example here www.raptorcs.com

sugartits ,

DDR2 was best DDR: acube-systems.biz/index.php?page=hardware&pid=756

Let me know when you’re ready to apologize and give these geniuses ALL YOUR MONEY.

porgamrer ,

Yes if you remove all frivolity I’m sure the joke will be funnier

kata1yst ,

Ah but you see, in the first sentence I was only pretending to be dismissive of the joke, because my comment had a second sentence (gasp), where I expanded upon the original joke with another observation of a particularly failed CPU architecture.

It is funny because I used verbal misdirection and a relevant reference from inside the community. And now it gets objectively funnier in my second comment when you make me explain it.

porgamrer ,

eh, i really did look for a joke. all i see is a “well actually” opinion that somebody here probably holds

Skates , in I Will Fucking Piledrive You if You mention AI Again

another old man yelling at clouds

Cube6392 ,

Old man yells at exploitative capitalist practices*

Ftfy

Skates ,

That too. But he’s also really angry that the world passed him by. That his understanding of AI turned out to be less than others’. That his skills couldn’t make it happen and while he was on the side of the road watching everyone else try and commenting on their failures, someone actually kind of succeeded. Not completely, of course. But enough that it eclipses all of his career and makes him seem like just another naysayer that’s been proven wrong. Like someone who can’t make things happen so he resorts to laughing at those who even try. Like an old man yelling at clouds.

So yeah, now the narrative has to change and he has to yell at the bad capitalists who are bringing about the destruction of our way of life. Otherwise he looks like a hasbeen yelling about the people who could do more than him.

So he does this yelling at capitalists from the comfort of his home, typing on the technological achievements of the last hundred years, without needing to worry about making and washing his own clothes, walking to the village 50 miles west, his wife dying in childbirth or him catching a stomach bug and shitting himself to death, all because we had a fucking industrial revolution that took care of those aspects and so many more, and those capitalist pigs saw there's money to be made in technology improvements so they invested in it.

Did this benefit the few more than the many? Yes. Did many people find themselves out of a job, needing to adapt to strange conditions they were never trained for? Yes. Did it also bring about incredible quality of life improvements, especially to this old useless fuck who wouldn't even have a job without the last few decades of tech advancement, if he could even stay alive through the last pandemic? Also yes.

So sitting on the sidelines crying about capitalism while at the same time enjoying its benefits is nothing more than a hypocritical plea for attention, all stemming from the fact that he can't seem to be able to stand having been wrong. Which, holy shit - get that narcissistic crap outta my sight.

Corbin ,

C’mon, I think you have better reading comprehension than that. He’s a professional data scientist specializing in machine learning. He went to grad school, then to big industry, then to startups, and is currently running a consultancy. He is very clearly not “on the side of the road.” He’s merely telling executives to fuck off with their AI grift.

Cube6392 ,

Yuuup. This blog post is the exact sentiment I see from people with postgraduate degrees in the field of machine learning. The current, public facing, machine learning AI implementations are various forms of theft, grift, and exploitation. They exist as the new testaments to violence of the neocolonial era

Mikina ,

Exactly this. I only have pretty vague experience with machine learning, since it was one of the other specializations for my Masters rather than the one I chose, though we still shared some basic courses on the topic, and I definitely share his point of view. I've been saying basically the same things when talking about AI, albeit not as expressively, but even with basic insight into ML, the whole craze that is happening around it is such bullshit. I'm by no means an expert in the field, so I may be wrong, but it's nice to finally read an article from an "expert" in the field I can agree with. Because so far, the whole "experts talking AI" thing felt exactly like the COVID situation, with "doctors" talking against vaccines. Their doomsaying opinion simply contradicts even the little knowledge I have in the ML field.

pro3757 , in COMEFROM

It’s in INTERCAL, a joke language from the '70s. Mark Rendle describes it here in his talk at NDC. This whole talk is ridiculous btw.

frezik ,

This is the same language where you have to say PLEASE sometimes or it won’t compile. But if you say PLEASE too much, the compiler will think you’re pandering and also refuse to compile. The range between too polite and not polite enough is not specified and varies by implementation.
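The politeness rule can be sketched as a Python check. The 1/5 and 1/3 bounds are the ones C-INTERCAL reportedly uses, but as the comment says, the real range is unspecified and varies by implementation, so treat the numbers (and the `politeness` function itself) as illustrative:

```python
# Toy version of INTERCAL's politeness check: too few PLEASE
# statements is rude, too many is pandering. Thresholds are assumed.

def politeness(statements):
    polite = sum(1 for s in statements if s.startswith("PLEASE"))
    ratio = polite / len(statements)
    if ratio < 1 / 5:
        return "error: programmer is insufficiently polite"
    if ratio > 1 / 3:
        return "error: programmer is overly polite"
    return "ok"

print(politeness(["DO ,1 <- #13"] * 4 + ["PLEASE READ OUT ,1"]))  # ok
print(politeness(["DO GIVE UP"] * 5))         # insufficiently polite
print(politeness(["PLEASE DO GIVE UP"] * 5))  # overly polite
```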

Whelks_chance ,

I love how arbitrary, cultural and opinionated that must be to work with. You'd learn something about the implementer of the compiler by using it for a while.

c0smokram3r ,
@c0smokram3r@midwest.social avatar

This is hilarious!

onlinepersona , in COMEFROM

I honestly thought C++ (aka dumping ground of programming concepts) would implement this for “completeness”.

Anti Commercial-AI license

NeatNit ,

They should add it in C++26

uis ,

C++60

MajinBlayze , in COMEFROM

So, an aspect

polonius-rex ,

shut it

suction , in I Will Fucking Piledrive You if You mention AI Again

Awesome, now do one on “the cloud”

Kcg ,

XXXX on the cloud. We have YYYY in the cloud. Gahhhh

deadbeef79000 , in How to write Hello World

Convenience link for people interested in the ligatures:

en.m.wikipedia.org/wiki/Ligature_(writing)

chahk , in Top tier reporting

Itanium jokes are never going to be not funny!

ace OP ,

It’s somewhat amusing how Itanium managed to completely miss the mark, and just how short its heyday was.

It’s also somewhat amusing that I’m still today helping host a pair of HPE Itanium blades - and two two-node DEC Alpha servers - for OpenVMS development.

CodexArcanum , in Top tier reporting

I feel like this has truly earned a “bazinga!”
