Doesn’t it steal control flow? More like a breakpoint, except you define where execution continues.
I wonder if it’s a compile error to have multiple conflicting COMEFROM statements, or if it’s random, kind of like Go’s select statement.
How awesome would it be to be able to steal the execution stack from arbitrary code; how much more awesome if it was indeterminate which of multiple conflicting COMEFROM frames received control! And if it included a state closure from the stolen frame?
I'd say it's more like setting up a handler for a callback, signal, interrupt or something along those lines.
Function declarations by themselves don't usually do that. Something else has to tell the system to run that function whenever the correct state occurs.
That doesn't account for unconditional come-froms, but I expect there'd have to be a label at the end of some code somewhere that would give a hint about shenanigans yet to occur. Frankly that'd be worse than a goto, but then, we knew that already.
It's like if subroutine bar could say it's going to execute at line N of routine foo. But if you were just reading foo then you'd have no clue that it would happen.
You can simulate this effect with bad inheritance patterns.
A function will be called by code and go to that point in code. To implement functions, you store necessary things to memory and goto the function definition. To implement that with comefrom you’d have to have a list of all the places that need to call the function as comefroms before the function definition. It’d be a mess to read. We almost never care where we are coming from. We care where we’re going to. We want to say “call function foo” not “foo takes control at line x.”
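To make the inversion concrete, here's a toy Python sketch of a COMEFROM-style hijack. Everything here (the label names, the `comefrom`/`checkpoint` helpers, the exception-based control steal) is invented for illustration; real INTERCAL's COME FROM is a language construct, not a runtime registry. The point is just that `bar` declares it takes over after a labeled point in `foo`, while `foo` itself gives the reader no clue.

```python
class ControlStolen(Exception):
    """Raised to abandon the rest of the hijacked function."""

comefrom_handlers = {}  # label -> function that seizes control there

def comefrom(label):
    """Register the decorated function to steal control after `label`."""
    def register(fn):
        comefrom_handlers[label] = fn
        return fn
    return register

def checkpoint(label, **state):
    """Called after a labeled line; hands control away if hijacked."""
    handler = comefrom_handlers.get(label)
    if handler is not None:
        handler(state)              # control continues in the handler...
        raise ControlStolen(label)  # ...and never returns to the caller

log = []

def foo():
    x = 41
    checkpoint("foo:after-x", x=x)  # nothing here mentions bar at all
    log.append("rest of foo")       # skipped when hijacked

@comefrom("foo:after-x")
def bar(state):
    log.append(f"bar stole control, x={state['x']}")

try:
    foo()
except ControlStolen:
    pass

print(log)  # ['bar stole control, x=41']
```

Note how the unreadability falls out naturally: to find out why "rest of foo" never runs, you'd have to grep the whole program for whoever registered against that label.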
That sounds like a fucking nightmare. I had to troubleshoot poorly-written-yet-somehow-functional GOTOs a lot when I was a BAS technician and that's annoying enough.
2015 latest revision with DDR3. That’s not living, that’s palliative care.
In all seriousness, OpenPOWER and POWER9 look cool, but they're still fighting to overcome the issues IBM and Motorola designed into the architecture. A fairly modern OpenPOWER POWER9 example: www.raptorcs.com
Ah but you see, in the first sentence I was only pretending to be dismissive of the joke, because my comment had a second sentence (gasp), where I expanded upon the original joke with another observation of a particularly failed CPU architecture.
It is funny because I used verbal misdirection and a relevant reference from inside the community. And now it gets objectively funnier in my second comment when you make me explain it.
That too. But he’s also really angry that the world passed him by. That his understanding of AI turned out to be less than others’. That his skills couldn’t make it happen and while he was on the side of the road watching everyone else try and commenting on their failures, someone actually kind of succeeded. Not completely, of course. But enough that it eclipses all of his career and makes him seem like just another naysayer that’s been proven wrong. Like someone who can’t make things happen so he resorts to laughing at those who even try. Like an old man yelling at clouds.
So yeah, now the narrative has to change and he has to yell at the bad capitalists who are bringing about the destruction of our way of life. Otherwise he looks like a has-been yelling about the people who could do more than him. So he does this yelling at capitalists from the comfort of his home, typing on the technological achievements of the last hundred years, without needing to worry about making and washing his own clothes, walking to the village 50 miles west, his wife dying in childbirth, or catching a stomach bug and shitting himself to death, all because we had a fucking industrial revolution that took care of those aspects and so many more, and those capitalist pigs saw there's money to be made in technology improvements so they invested in it.

Did this benefit the few more than the many? Yes. Did many people find themselves out of a job, needing to adapt to strange conditions they were never trained for? Yes. Did it also bring about incredible quality-of-life improvements, especially to this old useless fuck who wouldn't even have a job without the last few decades of tech advancement, if he could even stay alive through the last pandemic? Also yes.

So sitting on the sidelines crying about capitalism while at the same time enjoying its benefits is nothing more than a hypocritical plea for attention, all stemming from the fact that he can't seem to stand having been wrong. Which, holy shit - get that narcissistic crap outta my sight.
C’mon, I think you have better reading comprehension than that. He’s a professional data scientist specializing in machine learning. He went to grad school, then to big industry, then to startups, and is currently running a consultancy. He is very clearly not “on the side of the road.” He’s merely telling executives to fuck off with their AI grift.
Yuuup. This blog post is the exact sentiment I see from people with postgraduate degrees in the field of machine learning. The current public-facing machine learning AI implementations are various forms of theft, grift, and exploitation. They exist as the new testaments to the violence of the neocolonial era.
Exactly this. I only have fairly vague experience with machine learning, since it was a different specialization in my Masters than the one I chose, though we still shared some basic courses on the topic, and I definitely share his point of view. I've been saying basically the same things when talking about AI, albeit not as expressively, but even with basic insight into ML, the whole craze happening around it is such bullshit. I'm by no means an expert in the field, so I may be wrong, but it's nice to finally read an article from an "expert" in the field I can agree with. Because so far, the whole "experts talking AI" thing felt exactly like the COVID situation, with "doctors" speaking out against vaccines: their opinions simply contradict even the little knowledge I have in the ML field.
This is the same language where you have to say PLEASE sometimes or it won’t compile. But if you say PLEASE too much, the compiler will think you’re pandering and also refuse to compile. The range between too polite and not polite enough is not specified and varies by implementation.
I love how arbitrary, cultural, and opinionated that must be to work with. You'd learn something about the implementer of the compiler by using it for a while.
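For the curious, the politeness check described above can be sketched in a few lines of Python. The thresholds here are made up (as noted, the acceptable range is unspecified and varies by INTERCAL implementation), and `politeness_check` is a hypothetical name, not anything from a real compiler.

```python
# Toy sketch of an INTERCAL-style politeness check.
# Thresholds are invented; real implementations choose their own.

def politeness_check(statements, low=0.2, high=0.34):
    """Reject programs that are too rude or too obsequious."""
    polite = sum(1 for s in statements
                 if s.strip().upper().startswith("PLEASE"))
    ratio = polite / len(statements)
    if ratio < low:
        return "error: program is insufficiently polite"
    if ratio > high:
        return "error: program is excessively polite"
    return "ok"

program = [
    "DO ,1 <- #13",
    "DO ,1 SUB #1 <- #238",
    "DO READ OUT ,1",
    "PLEASE GIVE UP",
]
print(politeness_check(program))  # ok (1 of 4 statements is polite)
```

Strip the one PLEASE and the sketch rejects the program as insufficiently polite; prefix every line with it and the program is rejected as excessively polite.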
It’s somewhat amusing how Itanium managed to completely miss the mark, and just how short its heyday was.
It’s also somewhat amusing that I’m still today helping host a pair of HPE Itanium blades - and two two-node DEC Alpha servers - for OpenVMS development.