Doesn’t mean the statement is any less true. The enshittification of Google is a symptom; the disease is the internet as a whole. Google and LLMs screwing the web, M$ screwing Windows, Apple’s existence by itself, Meta monopolizing and screwing social media, and don’t get me started on streaming platforms and other media industries: they’re all symptoms.
Considering all of that, yes, internet enshittification is very real.
But anyway, the cool thing about the internet is that you can find your nice cozy niche and stay there.
That’s how the 90s internet was. If the megacorps want to be in here, fine. I’ll just stay in Lemmy. And when Lemmy starts sucking, I’ll move to somewhere else.
It’s not even an issue with Java. Apps ran fine on the original Android devices with single-core CPUs and half a gig of RAM or less. It’s just that developers get lazier as more powerful hardware becomes available. Nobody cares about writing well-optimized code anymore.
If Google and Apple required all apps to run smoothly on low-end hardware from 5 years ago, we would be using our phones until they wear out (assuming the batteries are replaceable) rather than having to upgrade every couple of years.
Android has actually employed a hybrid JIT/AOT compilation model for a long time.
The application bytecode is only interpreted on first run, and afterwards only if there’s no cached JIT compilation for it. The runtime AOT-compiles well-known methods, then profiles the application to identify targets for asynchronous JIT compilation when the device is idle and charging (so no excess battery drain): source.android.com/docs/core/runtime/configure#ho…
Compiling on the device allows the use of profile-guided optimizations (PGO), as well as the use of any non-baseline CPU features the device has, like instruction set extensions or later revisions (e.g. ARMv8.5-A vs ARMv8).
If apps had to be distributed entirely as compiled object code, you’d either have to pre-compile artifacts for every different architecture and revision you plan to support, or choose a baseline to compile against and then use feature detection at runtime, which adds branches to potentially hot code paths.
It would also require the developer to manually gather profiling data if they wanted to utilize PGO, which may limit them to just the devices they have on-hand, or paying through the nose for a cloud testing service like that offered by Firebase.
This is not to mention the massive improvement to the developer experience from not having to wait several minutes for your app to compile to test out each change. Call it laziness all you want, but it’s risky to launch a platform when no one wants to develop apps for it.
Any experienced Android dev will tell you it does kinda suck anyway, but it’d suck way worse if it were all C++ instead. I’d take Android development over iOS development any day of the week, though. Xcode is one of the worst software products ever conceived, and you’re forced to use it to build anything for iOS.
I know about all this; I actually began implementing my own JVM language a few days ago. I know Android uses Dalvik, btw. But I guess a lot of people can use this info; an infodump is always good. I do that myself.
btw I actually have messed around with libgccjit and I think, at least on x86, it makes zero difference. I once did a test:
– Find /e/ with mawk → 0.9s
– Find /e/ with JAWK → 50s
No shit! It’s seriously slow.
Now compare this with go-awk: 19s.
Go has reference counting and heap etc, basically a ‘compiled VM’. I think if you want fast code, ditch runtime.
Actually, Android doesn’t really use Dalvik anymore. They still use the bytecode format, but built a new runtime. The architecture of that runtime is detailed on the page I linked. IIRC, Dalvik didn’t cache JIT compilation results and had to redo it every time the application was run.
FWIW, I’ve heard libgccjit doesn’t generate particularly high-quality code. If the AOT-compiled code was compiled with aggressive optimizations and a specific CPU in mind, of course it’ll be faster. JIT-compiled code can meet or exceed native performance, but it depends on a lot of variables.
As for mawk vs JAWK vs go-awk, a JIT is not going to fix bad code. If it were a true apples to apples comparison, I’d expect a difference of maybe 30-50%, not ~2 orders of magnitude. A performance gap that wide suggests fundamental differences between the different implementations, maybe bad cache locality or inefficient use of syscalls in the latter two.
On top of that, you’re not really comparing the languages or runtimes so much as their regular expression engines. Java’s isn’t particularly fast, and neither is Go’s. Compare that to JavaScript and Perl, both languages with heavyweight runtimes, but which perform extraordinarily well on this benchmark thanks to their heavily optimized regex engines.
It looks like mawk uses its own bespoke regex engine, which is honestly quite impressive in that it performs that well. However, it only supports POSIX regular expressions, and doesn’t even implement braces, at least in the latest release listed on the site: github.com/ThomasDickey/mawk-20140914
(The author creates a new GitHub repo to mirror each release, which shows just how much they refuse to learn to use Git. That’s a respectable level of contempt right there.)
Meanwhile, Java’s regex engine is a lot more complex with more features, such as lookahead/behind and backreferences, but that complexity comes at a cost. Similarly, if go-awk is using Go’s https://pkg.go.dev/regexp, it’s using a much more complex regex engine than is strictly necessary. And Go’s own FAQ admits that it’s not nearly as optimized as other engines like PCRE.
Thus, it’s really not an apples to apples comparison. I suspect that’s where most of the performance difference arises.
Go has reference counting and heap etc, basically a ‘compiled VM’.
This statement is completely wrong. Like, to a baffling degree. It kinda makes me wonder if you’re trolling.
Go doesn’t use any kind of VM, and has never used reference counting for memory management as far as I can tell. It compiles directly to native machine code which is executed directly by the processor, but the binary comes with a runtime baked in. This runtime includes a tracing garbage collector and manages the execution of goroutines and related things like non-blocking sockets.
Additionally, heap management is a core function of any program compiled for a modern operating system. Programs written in C and C++ use heap allocations constantly unless they’re specifically written to avoid them. And depending on what you’re doing and what you need, a C or C++ program could end up with a more heavyweight collective of runtime dependencies than the JVM itself.
At the end of the day, trying to write the fastest code possible isn’t usually the most productive approach. When you have a job to do, you’re going to welcome any tool that makes that job easier.
This statement is completely wrong. Like, to a baffling degree. It kinda makes me wonder if you’re trolling.
No, I just struggle to get my meaning across, plus this stuff is new to me. What I meant was ‘Go does memory management LIKE a VM does’, like ‘baking in the GC’. Does that make sense? Or am I still wrong?
Issue is incentive. Developers use what they are told by more senior developers and most rewrites and tech debt work is deemed unprofitable and dropped.
They use shit like Electron to write things once. It’s always the worst experience, but on paper it looks like a huge win to management.
You know there’s nothing stopping you from buying a server rack and loading that bad boy out with as much processing power as your heart desires, right?
Well, except money I guess, but according to this 1969 price list referenced on Wikipedia, a base model PDP-11 with cabinet would run you around $11,500. Adjusted for inflation, that’s about 95 grand. You could put together one hell of a home server for that kind of money.
What? How does her being weirded out about the words “sacrifice child” mean she ignored anything? It doesn’t matter what triggered the error; she is questioning why the code has dark word combinations.
You have so much to learn about people who feed into the Satanic panic. Cherry-picking is by definition how they get there. One of Alex Jones’s biggest bogeymen for years was a subsection of a law that allowed medical testing on troops, and he always ignores the very next section, which states that it all requires informed consent. Then he lies and acts like people would have no idea.
During COVID he found an exercise that gamed out four different future scenarios that might come into play, ignored the positive-leaning and neutral-ish ones, and went straight for the heavily authoritarian one because it used a possible pandemic as a background setting, then claimed it was all planned out and proof COVID was a bioweapon attack.
People like this willfully ignore things that give context, and will often repackage it without the context anytime they can.
So “sacrifice child” is a common term used in what language? I don’t believe in religion but I also don’t know a whole lot about computer science. So I would believe you if you said it meant something.
But seeing the words “sacrifice child” would rightfully startle anybody. It’s nothing to do with cherry picking or satanic panic. It’s everything to do with those two very specific words being right next to each other. Nothing else.
Part of the whole panic and cherry picking thing is also an important next step: refusal to do proper research. A simple web search would correctly show you that it’s harmless. One might also find sources that claim it’s actually satanic, but they’d find those in blogs, social media, or message boards, while legitimate and official sites would show the correct info.
It’s up to the person to determine which one is correct. Most logical people would go with the simplest and least sensational definition being the correct one, while those with a conspiratorial mind view would ignore such common sense and choose to panic.
It’s still very jarring. Attributing it solely to satanic panic is wild though. It’s just someone’s first reaction to seeing something. Not everyone does research before having a natural human reaction.
Well, we don’t really care about a natural emotional reaction in your head. Once you start spreading it around and claiming something about it, then it’s a problem. If you just spread it as a “look at this weird thing I found, isn’t it funny?” that’s also fine. However, if you start spreading it like “can you believe this?” without checking into it, then you’re either gullible to the point of the internet being dangerous for you, or you’re complicit.
Not everybody has to be an expert on a subject to say “look at this, it’s crazy, right?”. It’s up to experts to explain why it’s not actually crazy. It’s still crazy that the term has to be “sacrifice child”, whether it’s common or not.
When you aren’t an expert, you try to find answers by looking it up, as I explained. It isn’t hard, and this one in particular is a common joke. On some subjects a simple search won’t work as well, I’ll grant you that. However, you seem hellbent on defending people jumping to conclusions without some due diligence. That’s on the person. Misinformation spreads because lazy people want to go off gut reactions and not even make sure the stuff they spread is true or a misunderstanding.
Why are you so invested in not even trying to fact check? Apologies if that isn’t your point, because it sure feels like it.
It’s not my point. No need to apologize, though. I just think a lady online being startled by those words and posting about it saying “have you seen this?” is not at all the same as satanic panic. Who knows, maybe that’s exactly what she was doing, but I doubt it. It was probably just a startled mom or auntie.
Yeah, they probably were. I might be a bit more sensitive because I’ve seen people ruined by simple stuff like this, and by algorithms that encourage going further down the shock, anger, and fear pipeline. I’m pretty adamant that people fact-check instead of being shocked, as those moms and aunties might become future Ashli Babbitts. That of course could be just me paying more attention to that side of indoctrination, because I worry what harm it could cause.
Still, I was mostly engaging in a discussion about how cherry-picked stuff is dangerous even if people don’t think it is. Plenty of people have been radicalized starting with jokes and minor misunderstandings that never got corrected. I try to at least steer people towards looking into things, even occasionally on joke posts. It may be overreacting, but I remember a time when I was dumb enough to accept “Nice guys finish last” as a somewhat true joke.
When sacrificing the child, use a dagger made from obsidian. Cut upward from below the sternum, then force the rib cage apart. Push the lungs aside with your hands, then cut out the heart with your ritual dagger. Hold the heart up to the cheering crowd, and then place it in an earthen vessel in honor of the gods. Kick the body down the steps of the temple pyramid.
Actually, no. Transsternal access to the heart is impossible with stone tools, even obsidian. Mesoamerican ritual sacrifices were performed transphrenically, through the diaphragm: they had fewer problems with the complications of that access since they didn’t intend their victim to survive, in contrast to (most) modern surgeons.
Look, we already got rid of “Master/Slave” in favor of things like “Parent/Child”, “Active/Standby”, or “Primary/Secondary”. We’re not making more changes because right-wingers are afraid of everything.
tbh i think “master” terminology is only bad if paired with “slave”. the word itself has kinda just lost its original meaning
but I don’t really care about git’s change. I’m only using master out of habit
Oh yeah, that was a shitshow. I made a point to keep “master” in my repos and configurations because it’s the other meaning of master, one of the many others. Words are allowed to mean different things, ya know? If I’m drinking some Coke, I’m certainly not drugging myself (…I hope).
After all, the command to attach to a master is not “git slave”, it’s “git pull”.
I smell a crime thriller where the serial killer is a programmer who hid their actual child-killing searches by masking them as the programmer-endorsed kind of child killing.
```haskell
-- |Removes the given object from its current parent, if any, and then
-- adds it as a child of the other given object.
kidnap :: ChildBearing c p
       => p      -- ^The kidnapper.
       -> c      -- ^The child to kidnap.
       -> IO ()
```