programmer_humor

jnk , in I expect normies to use words like 'algorithm' to refer to 'AI', which is in reality, a mathematical optimization PAC model --- but is this guy not supposed to be epitome of tech meritocracy?

He’s just an entrepreneur who really wants to play the Tony Stark role but doesn’t actually know much more than the average tech hobbyist, so now he’s doing a bad-to-mediocre job at both.

AVincentInSpace , in I expect normies to use words like 'algorithm' to refer to 'AI', which is in reality, a mathematical optimization PAC model --- but is this guy not supposed to be epitome of tech meritocracy?

the people who do that probably don’t even know that the ‘AI LE BAD’ is the same technology that is giving them precious views.

This fact is at best useless and at worst deliberately misleading.

The microcontroller inside my microwave oven that counts down seconds and shows me the time is a computer. It can execute instructions, do math, and emulate a universal Turing machine (or would be able to if it was connected to infinite memory). The Pleiades supercomputer at NASA is a computer. It, too, can execute instructions, do math, and emulate a universal Turing machine, albeit much more quickly. These two computers do such vastly different jobs and differ by so much in computational ability that referring to them using the same term is almost meaningless. Just because predicting what a user will click on and generating an image from a text prompt are both done using convolutional neural networks does not mean that the people who rally against the latter while relying on the former for an income stream are hypocrites. Good grief, you might as well accuse someone watching cat videos on their PC of warmongering because computers are also used for missile guidance.

DacoTaco , in Probably the wrong meme format
@DacoTaco@lemmy.world avatar

The problem with Java is the language itself and how it works, not the bytecode idea.
I say that because a few things do the same thing: .NET, Java and WASM are the first that jump to mind.
Hell, technically any programming language that isn’t asm does that :')

My problem is Java itself, not its bytecode. WASM’s advantage, imo, is that it’s not stuck to a single language like Java is. .NET Blazor can build to WASM, but you could also use C++ to compile WASM applications :)

onlinepersona OP ,

I’m not sure why the WASM creator(s) didn’t take advantage of the existing Java bytecode and JVM instead of reimplementing them. Might’ve sped up adoption. Doesn’t matter to me though, as long as JS stops being #1 in web development.

This meme isn’t to be taken seriously

RonSijm ,
@RonSijm@programming.dev avatar

Why would they? WASM is WebAssembly: assembly is a lower-level language than Java.

You can use C# with WASM through Blazor, and Java with WASM through JWebAssembly. WASM at its core is supposed to be language agnostic. So if you want a JVM in WASM, you can build one on top of it.

onlinepersona OP ,

Java makes bytecode which is run in a JVM; WASM is bytecode which runs in a VM.

RonSijm ,
@RonSijm@programming.dev avatar

Java makes things run in a VM

Docker makes things run in a VM

Virtualbox makes things run in a VM

Why did we need any of those things? Should have just put everything in Java instead, right 🙃

onlinepersona OP ,

Now that’s meme-worthy!

WhiskyTangoFoxtrot ,

All software should be made to run in SCUMMVM.

xigoi ,
@xigoi@lemmy.sdf.org avatar

Java bytecode is designed specifically for class-oriented languages like Java and works terribly with anything else.

AVincentInSpace , in I expect normies to use words like 'algorithm' to refer to 'AI', which is in reality, a mathematical optimization PAC model --- but is this guy not supposed to be epitome of tech meritocracy?

Why am I not surprised that someone who unironically quotes Ben Shapiro thinks of Elon “My Dad Owns An Emerald Mine” Musk as a shining example of capitalist meritocracy

xmunk , in Probably the wrong meme format

I’ll be chilling over in PHP. Let me know when you’re ready for a real language.

onlinepersona OP ,

The language of “as long as it works” champions 💪

pftbest , in Probably the wrong meme format

You can’t compile C to Java bytecode; they are fundamentally incompatible. But you can compile C to WASM, which is what you want from a good universal bytecode. Java is shit.
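
For what it’s worth, a minimal sketch of the C-to-WASM path, assuming the Emscripten toolchain is installed (hello.c is a placeholder file name):

    # compiles hello.c to hello.wasm plus the JS/HTML glue needed to load it in a browser
    emcc hello.c -O2 -o hello.html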

fl42v ,
  1. Compile jvm into wasm
  2. ???
  3. Be universally hated
onlinepersona OP ,

Join the player haters ball

onlinepersona OP ,

May I introduce you to github.com/davidar/lljvm

The C to JVM bytecode compilation provided by LLJVM involves several steps

pftbest ,

Have you seen what it outputs? We can compile C to Brainfuck the same way; that doesn’t mean it’s good or useful.

onlinepersona OP ,
  • “compiling C to Java bytecode isn’t possible”
  • link to project that does exactly that
  • “it’s not good or useful”
  • WASM exists
  • “that’s useful”

… OK

pftbest ,

I can’t quite understand what your point is. Are you arguing that both the JVM and WASM are bad? With this I agree: they both have terrible performance, and in an ideal world we wouldn’t use either of them.

Are you arguing that JVM bytecode is better than WASM? That’s objectively not true. One example is a function pointer in C. To compile it to JVM bytecode you would need to convert it to a virtual call in some very roundabout way. But WASM has native support for function pointers, which gives much more flexibility when compiling other languages.

onlinepersona OP ,

There is no point. It’s a stupid meme, not to be taken seriously in a “programmerhumor” community. It’s about as serious as demanding that Biden be the arbiter of good and bad programming languages, or saying JS should be used to punish convicts.

Look at the other people saying stuff like “JAVA: Just Another Virtual Assembler” or something. They aren’t taking it seriously.

Rikj000 , in Probably the wrong meme format
@Rikj000@discuss.tchncs.de avatar

WASM = WebAssembly,
this has nothing to do with Java,
but with JS (JavaScript).

JS works with JIT (Just In Time) compilation, meaning every user that requests a web page will also request the JS, and the browser will compile that JS on the fly as it is requested.

WASM on the other hand is pre-compiled once, by the developer, when he/she is making the code. So when a user requests a WASM binary, they don’t have to wait for JIT compilation, since it was already pre-compiled by the developer.

They only have to wait for a tiny piece of JS,
which is still JIT compiled,
whose only job is to load in the WASM binary.

This saves the user from waiting on JIT compilation and thus speeds up requesting web pages.

WASM also increases security,
since binaries are harder to reverse engineer than plain text JS.

Due to those reasons,
I believe WASM will be the future for Web development.

No clue why people are hating on WASM,
but I guess they just don’t grasp all of the above yet.

onlinepersona OP ,

Wasm code (binary code, i.e. bytecode) is intended to be run on a portable virtual stack machine (VM)

WASM wikipedia

Java bytecode is the instruction set of the Java virtual machine (JVM), crucial for executing programs written in the Java language and other JVM-compatible languages

java bytecode

Need I say more?

hglman ,

Yes

onlinepersona OP ,

OK

dohpaz42 ,
@dohpaz42@lemmy.world avatar

Please do say more because I don’t think the argument you’re trying to make is coming across clearly. Obviously your intelligence is at a level far higher than us low-iq plebs, and we need your brilliant mastery of these topics to be poetically spelled out for us. For we are not worthy otherwise.

onlinepersona OP ,

What are you not getting?

dohpaz42 ,
@dohpaz42@lemmy.world avatar

In all seriousness, everything. You post two out-of-context quotes from two different websites, but put no actual effort into tying them together and explaining your position on either side, and expect us to read your mind and agree with you. It’s a bad faith comment that exudes arrogance.

Sekoia ,

Having read a significant portion of the base WASM spec, it’s really quite a beautiful format. It’s well designed, clear, and very agnostic.

I particularly like how sectioned it is, which allows different functions to be preloaded/parsed/whatever independently.

It’s not perfect by any means; I personally find it has too many instructions, and the block-based control flow is… strange. But it fills a great niche as a standard low-level isolated programming layer.

lil ,
@lil@lemy.lol avatar

I want webpages to be open source, not compiled. That’s why I dislike WASM

Rikj000 ,
@Rikj000@discuss.tchncs.de avatar

WASM projects can be open source,
just like Android apps can be.

However, in both instances the compiled versions are not easily readable.

Also you can validate binaries against a shasum to ensure no tampering has happened with them.
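
For example, a minimal sketch of one way to do that check (app.wasm and the published checksum are placeholders):

    # print the SHA-256 digest of the served binary and compare it to the one the developer published
    sha256sum app.wasm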

madkarlsson ,

WASM is great and as it becomes more accessible it will likely take over more and more

OP’s meme is just a sign of someone not understanding the softer parts around development. The meme also seems to forget that we tried Java in the browser for two decades and it was just… horrible from all perspectives, in all layers.

anton , in Probably the wrong meme format

The main difference is that WASM is an agnostic bytecode without a GC, while the JVM is opinionated in a Java way: it has a GC, a focus on dynamic dispatch, and knowledge of concepts like exceptions, classes and visibility.

All this leaking of abstractions means languages like Java and Kotlin are well suited, Scala has hit problems, and C couldn’t be compiled to Java bytecode.

onlinepersona OP ,

C can be compiled to java bytecode though…

arendjr ,

Of course, technically you can compile anything to almost anything. But I don’t think linking to a project that has been unmaintained for 15 years really helps your argument.

onlinepersona OP ,

What is my argument exactly?

arendjr ,

Good question! 😂 Maybe I’m overthinking it, but you seem to be making the point that it’s silly for people to like WASM, based on the argument that the JVM already exists and people are not fond of it/Java. If that’s not the point, why did you make the meme at all?

Iapar ,

Addiction.

Tja , in Hilarious

Reason 5: she doesn’t exist and you’re bad at cropping

Croquette , in Rebase Supremacy

I know this is a meme post, but can someone succinctly explain rebase vs merge?

I am an amateur trying to learn my tool.

letsgo ,

Merge gives an accurate view of the history but tends to be “cluttered” with multiple lines and merge commits. Rebase cleans that up and gives you a simple A->B->C view.

Personally I prefer merge because when I’m tracking down a bug and narrow it down to a specific commit, I get to see what change was made in what context. With rebase commits that change is in there, but it’s out of context and cluttered up with zillions of other changes from the inherent merges and squashes that are included in that commit, making it harder to see what was changed and why. The same cluttered history is still in there but it’s included in the commits instead of existing separately outside the commits.

I honestly can’t see the point of a rebased A->B->C history because (a) it’s inaccurate and (b) it makes debugging harder. Maybe I’m missing some major benefit? I’m willing to learn.

reflectedodds ,

I feel the opposite, but with similar logic: merge is the one that is cluttered up with other merges.

With rebase you get A->B->C for the main branch, and D->E->F for the patch branch, and when submitting to main you get a nice A->B->C->D->E->F and you can find your faulty commit in the D->E->F section.

For merge you end up with this nonsense of mixed commits and merge commits like A->D->B->B’->E->F->C->C’ where the ones with the apostrophe are merge commits. And worse, in a git log there is no clear “D E F” so you don’t actually know if A, D or B came from the feature branch, you just know a branch was merged at commit B’. You’d have to try to demangle it by looking at authors and dates.

The final code ought to look the same, but now if you’re debugging you can’t separate the feature patch from the main path code to see which part was at fault. I always rebase because it’s equivalent to checking out the latest changes and re-branching so I’m never behind and the patch is always a unique set of commits.

Atemu ,
@Atemu@lemmy.ml avatar

For merge you end up with this nonsense of mixed commits and merge commits like A->D->B->B’->E->F->C->C’ where the ones with the apostrophe are merge commits.

Your notation does not make sense. You’re representing a multi-dimensional thing in one dimension. Of course it’s a mess if you do that.

Your example is also missing a crucial fact required when reasoning about merges: the merge base.
Typically a branch is “branched off” from some commit M. D’s and A’s parent would be M (though there could be any number of commits between A and M). Since A is “on the main branch”, you can conclude that D is part of a “patch branch”. It’s quite clear if you don’t omit this fact.

I also don’t understand why your example would have multiple merges.

Here’s my example of a main branch with a patch branch; in 2D because merges can’t properly be represented in one dimension:


M - A - B - C - C'
             /
    D - E - F

The final code ought to look the same, but now if you’re debugging you can’t separate the feature patch from the main path code to see which part was at fault.

If you use a feature-branch workflow and your main branch is only ever merged into, you typically want to use first-parent bisects. They’re much faster too.
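
A minimal sketch of a first-parent bisect (the good/bad refs are placeholders):

    # only test commits on main itself, skipping the commits inside merged feature branches
    git bisect start --first-parent
    git bisect bad HEAD
    git bisect good v1.0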

reflectedodds ,

You’re right, I’m not representing the merge correctly. I was thinking of having multiple merges because for a long running patch branch you might merge main into the patch branch several times before merging the patch branch into main.

I’m so used to rebasing I forgot there are tools that correctly show all the branching and merges and things.

Idk, I just like rebase’s behavior over merge.

Atemu ,
@Atemu@lemmy.ml avatar

The thing is, you can have your cake and eat it too. Rebase your feature branches while in development and then merge them into the main branch when they’re done.
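
A minimal sketch of that combined workflow (feature/foo is a placeholder branch name):

    git switch feature/foo
    git rebase main                  # keep the in-progress branch linear and up to date
    # ...when the feature is finished:
    git switch main
    git merge --no-ff feature/foo    # one merge commit records where the feature landed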

jaemo ,

👏 Super duper this is the way. No notes!

ActionHank ,

I would advocate for using each tool, where it makes sense, to achieve a more intelligible graph. This is what I’ve been moving towards on my personal projects (am solo). I imagine with any moderately complex group project it becomes very difficult to keep things neat.

In order of expected usage frequency:

  1. Rebase: everything that’s not 2 or 3. Keep main and feature lines clean.
  2. Merge: ideally, merge should only be used to bring feature branches into main at stable sequence points.
  3. Squash: only use squash to remove history that truly is useless (e.g. creating a bug on a feature branch and then fixing it two commits later, prior to merge).

History should be viewable from git log --all --decorate --oneline --graph, not buried in squash commits.

JackbyDev ,

Folks should make sure the final series of commits in a pull request contains atomic changes and that each individual commit builds and works on its own. Things like fixup commits with autosquash rebase help here. That way you can still narrow a bug down to one commit regardless of the approach.
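
A minimal sketch of the fixup/autosquash flow (abc1234 is a placeholder commit hash):

    git commit --fixup=abc1234        # records a commit titled "fixup! <subject of abc1234>"
    git rebase -i --autosquash main   # reorders the fixup next to abc1234 and squashes it in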

jaemo ,

Merge keeps the original timeline. Your commits go in along with anything else that happened relative to the branch you based your work off (probably main). This generates a merge commit.

Rebase will replay your commits on top of everything that happened while you were doing your work, putting yours at the HEAD so that they are the most recent commits. As your commits are replayed you have to resolve any conflicts that touch the same files, if any arise; they are resolved the same way any merge conflict is. There is no frivolous merge commit in this scenario.

TL;DR: End result, everything that happened to the branch minus your work happens first, then your stuff happens after. Much tidy and clean.
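
In command form, a minimal sketch of that replay while on your feature branch (the file path is a placeholder):

    git rebase main              # your commits are re-applied on top of main’s latest commits
    # if a replayed commit conflicts: fix the files, then
    git add path/to/fixed-file
    git rebase --continue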

Croquette ,

Thanks for the explanation. It makes sense. To my untrained eyes, it feels like both merge and rebase have their use. I will try to keep that in mind.

bitcrafter ,

Yes. My rule of thumb is that generally rebasing is the better approach, in part because if your commit history is relatively clean then it is easier to merge in changes one commit at a time than all at once. However, sometimes so much has changed that replaying your commits puts you in the position of having to solve so many problems that it is more trouble than it is worth, in which case you should feel no qualms about aborting the rebase (git rebase --abort) and using a merge instead.
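
A minimal sketch of that fallback, run from the feature branch:

    git rebase main        # try for the clean, linear history first
    git rebase --abort     # too many conflicts? return to the pre-rebase state
    git merge main         # and take the upstream changes as a single merge instead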

Croquette ,

I have a bad habit of leaving checkpoint commits everywhere because of squash merges, and I’m trying to fix that. I think forcing myself to rebase would help get rid of that habit. And the good thing is that I am the sole FW dev at work, so I can do whatever I want with the repos.

JackbyDev ,

Yes. They do. A lot of people will use vacuous terms like “clean history” when arguing for one over the other. In my opinion, most repositories have larger problems than rebase versus merge. Like commit messages.

Also, remember, even if your team/repository prefers merges over rebases for getting changes into the main branch, that doesn’t mean you shouldn’t be using rebase locally for various things.

Croquette ,

How would rebasing my own branch work? Do I rebase the main into my branch, or make a copy of the main branch and then rebase? I have trouble grasping how that would work.

jaemo ,

Here’s an example

Say I work on authentication under feature/auth on Monday and get some done. On Tuesday an urgent feature request for some logging work comes in; I complete it on feature/logging and merge it cleanly to main. To make sure all my code from Monday will still work, I then switch to feature/auth and run git pull --rebase origin main. Now my auth commits start after the merge commit from the logging PR.

Croquette ,

Thanks for the example. Rebase use is clearer now.

JackbyDev ,

You’re still rebasing your branch onto main (or whatever you originally branched it off of), but you aren’t then doing a fast forward merge of main to your branch.

The terminology gets weird. When people say “merge versus rebase” they really mean it in the context of bringing changes into main. You (or the remote repository) cannot do this without a merge. People usually mean “merge commit versus rebase with fast-forward merge”.
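
A minimal sketch of the two styles being compared (feature is a placeholder branch name):

    # “merge”: integrate with an explicit merge commit
    git switch main
    git merge --no-ff feature

    # “rebase”: rewrite the feature commits onto main, then fast-forward
    git switch feature
    git rebase main
    git switch main
    git merge --ff-only feature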

Croquette ,

Yeah, I was confused because you are right: “merge” usually refers to the git merge and then git commit.

It makes sense. Thanks for the clarification

jaemo ,

You nailed it with the critique of commit messages. We use gitmoji to convey the at-a-glance topic of a commit and otherwise adhere to Tim Pope’s school of getting to the point.

JackbyDev ,

Gitmoji?

jaemo ,

gitmoji.dev

Quasi-parallel reply to your other post: this kind of echoes the desire for a capital letter at the start of the commit message. The icon indicates the overall topic of the commit.

Let’s say I am adding a database migration and my commit is the migration file and the schema. My commit message might be:


     🗃️ Add notes to Users table

So anyone looking at the eventual PR will see the icon and know that this bunch of work affects the DB, without all that tedious “reading the code” part of the review; it also helps team members who didn’t participate in the review.

I was initially hesitant to adopt it, but I have very reasonable, younger teammates for whom emojis are part of the standard vocabulary. I gradually came to appreciate and value the ability to convey more context in my commits this way. I’m still guilty of occasionally overusing:


   ♻️ Fix the thing

type messages when I’m lazy; it doesn’t fix that bad habit, but I’m generally much happier reading my own or someone else’s PR commit summary with this extra bit of context added.

Deebster ,
@Deebster@programming.dev avatar

I looked at it and there’s a lot of them!

I see things like adding dependencies, but I would add the dependency along with the code that’s using it so I have that context. Is the Gitmoji way to break your commits up so that each one matches a single category?

jaemo ,

Yes, that is another benefit once you start getting muscle memory with the library. You start to parcel things by context a bit more. It’s upped my habit of committing discrete hunks, which also serves as a nice self-review of the work.

Deebster , (edited )
@Deebster@programming.dev avatar

I don’t see that as a benefit tbh - if I have a dependency, I want to see why it’s there as part of the commit. I’m imagining running blame on Cargo.toml and seeing “Add feature x” vs “Add dependency”. I guess the idea is it’s “➕ Add dep y for feature x” but I’d still rather be able to see the related code in the same commit instead of having to find the useful commit in the log.

I suppose you could squash them together later, but then why bother splitting it out in the first place?

I see that some use a subset of Gitmoji and that does make sense to me - after all, you wouldn’t use all of them in every project anyway, e.g. 🏷️ types is only relevant for a few languages.

JackbyDev ,

I must have read that blog post in the past because that’s exactly the style I use. Much of it is standard though.

One MAJOR pet peeve of mine (and I admit it is just an opinion either way) is when people use lower case letters for the first line of the commit message. They typically argue that it is a sentence fragment so shouldn’t be capitalized. My counter is that the start of sentences, even fragmented ones, should be capitalized. Also, and more relevant, is that I view the first line of the commit more like the title of something than a sentence. So I use the Wikipedia style of capitalizing.

jaemo ,

100% they do. Rebase is an everyday thing, merge is for PRs (for me anyway). Or merges are for regular branches if you roll that way. The only wrong answer is the one that causes you to lose commits and have to use reflog, cos…well, then you done messed up now son… (but even then hope lives on!)

aubeynarf ,

Never use rebase for any branch that has left your machine (been pushed) and which another entity may have a local copy of (especially if that entity may have committed edits to it).

testfactor , in Probably the wrong meme format

Do you really think the reason people hate Java is because it uses an intermediate bytecode? There’s plenty of reasons to hate Java, but that’s not one of them.

.NET languages use intermediate bytecode and everyone’s fine with it.

Any complaints about Java being an intermediate language are due to the fact that the JVM is a poorly implemented dumpster fire. It’s had more major vulnerabilities than effing Adobe Flash, and runs like molasses while chewing up more memory than effing Chrome. It’s not what they did, it’s that they did it badly.

And WASM will absolutely never replace normal JS in the browser. It’s a completely different use case. It’s awesome and has a great niche, but it’s not really intended for normal web page management use cases.

onlinepersona OP ,

Do you really think the reason people hate Java is because it uses an intermediate bytecode? There’s plenty of reasons to hate Java, but that’s not one of them.

No, I do not. It’s a meme.

PoliticalAgitator ,

Which is functionally identical to the last one you posted. Are you just doing the same joke in every format you can? Fuck the internet sucks now.

fruitycoder ,

but it’s not really intended for normal webpages

But I really want to make full stack web apps with rust!

www.arewewebyet.org/topics/frameworks/#pkg-actix-

masterspace ,

And WASM will absolutely never replace normal JS in the browser. It’s a completely different use case. It’s awesome and has a great niche, but it’s not really intended for normal web page management use cases.

While I overall agree that JS / TS isn’t likely to be replaced, Microsoft’s Blazor project is interesting conceptually … Write C# webpages and have them compile down to WASM for more performance than JS could offer.
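
A minimal sketch of trying that out, assuming the .NET SDK’s Blazor WebAssembly template (MyApp is a placeholder name):

    dotnet new blazorwasm -o MyApp   # scaffold a Blazor WebAssembly project
    cd MyApp
    dotnet run                       # build it and serve it locally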

lambda ,
@lambda@programming.dev avatar

What, you can write a website in C# and have it output as a website using WASM? I have never touched WASM. That might be an interesting way to try it though.

Heavybell ,
@Heavybell@lemmy.world avatar

The problem with Blazor, as I understand it, is that no, it does not compile your C# into WASM. Instead, it compiles into a standard .NET module – with as much excising of unused code as possible – and distributes it with a CLR that is compiled to WASM. So effectively you’re running the .NET VM inside the WASM VM, at least if you do client-side Blazor, which is not even MS’s push anymore because they stand to make more money if you write server-side Blazor and deploy it to Azure.

Do look it up yourself tho. I could have a totally wrong understanding. I haven’t looked into it in some time because I’ve not been in a position to start a new frontend project from scratch. I would love to do my frontend stuff in C# though, don’t get me wrong.

Cratermaker ,

Interesting, yeah. I inherited a Blazor project though and have nothing positive to say about it really. Some of it is probably implementation, but it’s a shining example of how much better it is to choose the right tool for the job, rather than reinventing the wheel. For a while I was joking about setting the whole project “ablazor” until we finally decided to go back to a React/C# ASP.NET stack. If you’re thinking of using Blazor still, though, I think two fun things to look into are “linting issues with Blazor” and “Blazor slow”. I’ve heard people praise it, but they tend to be those who consider themselves backend devs that occasionally get stuck making frontends.

magic_lobster_party , in Should it just be called JASM?

JAVA: Just Another Version of Assembly

AWittyUsername , in Hilarious
  1. Using Java
pulaskiwasright , in New language

My favorite is “Java is slow” said by someone advocating for a language that’s at least 10 times slower.

humbletightband ,

Those who say such things are straight ignorant

pulaskiwasright ,

They’re basically fashion victims.

humbletightband ,

I wouldn’t say so. They are inexperienced. They don’t know where the bottleneck of most modern software is (it’s I/O in 80-90% of cases) or how to optimize software without rewriting it in C++.

SorryQuick ,

How are they ignorant? It’s a known fact that Java is slow, or at least slower than some others. Sure, it’s still fast enough for 95% of use cases, but most code will run faster if written in, say, C. It will have 10x the amount of code and twice as many bugs though.

xor ,

the jvm brings enough bugs to outweigh any benefits there…
it is relatively fast, but it’s slow in that it takes up a bunch of resources that could be doing other things…

humbletightband ,

the jvm brings enough bugs to outweigh any benefits there…

Please name a few

xor ,

i decline your invite to debate the merits of java and jvm… i will instead walk my dog through this beautiful park here…

but, it’s all been said on top level comments on this post.
it’s trash, and honestly, even if it was perfect, sun microsystems has ruined any potential benefits.

humbletightband ,

Have a good walk at least

humbletightband ,

takes up a bunch of resources that could be doing other things…

You cannot get rid of the garbage collector, but you can always compile your Java into a native binary to reduce the memory footprint.
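
Presumably something like GraalVM’s native-image is meant here; a minimal sketch (Hello is a placeholder class):

    javac Hello.java
    native-image Hello    # produces a standalone native executable, no JVM process at run time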

xor ,

sea lion

kaffiene ,

Bullshit.

humbletightband , (edited )

Java is indeed slower than C and Rust, and in some cases than Go.

But that doesn’t mean that

code will run faster if written in, say, C

Again, like 80-90% of production code is bound by disk/network I/O operations. You will only gain performance from using C in embedded systems and in heavy calculations (games, trading, simulations).

SorryQuick ,

Which is exactly what I said, that it’s fast enough for most use cases.

In theory though, you will “gain performance” by rewriting it (well) in C for literally anything. Even if it’s disk I/O bound, the actual time spent in your code will be lower, while the time spent in kernel mode will be just as long.

For example, say you are running a server which reads files and returns data based on said files. The act of reading the file won’t be much faster, but if written in C, your parsers and the actual logic behind what to do with the file will be.

But as you said, this tiny performance gain isn’t worth the development/resource cost most of the time.

kaffiene ,

My favourite is “all the boilerplate”, and then they come up with Go’s error checking, where you repeat the same three lines after every function call so that 60% of your code is the same lines of error checking over and over.

xtapa ,

When you handle all your errs the same way, I’d say you’re doing something wrong. You can build a pretty strong error trace by wrapping errs. I also think it’s more readable than the average try/catch block.

kaffiene ,

You still need to add error handling to every call to every function that might raise an error

pulaskiwasright ,

And god help you if you forget those 3 lines somewhere and you silently have database failures or something else.

kaffiene ,

Yeah, that’s the other thing - it does become easier to accidentally fail to deal with errors and the go adherents say they do all of that verbose BS to make error handling more robust. I actually like go, but there’s so much BS with ignoring the pain points in the language.

steeznson ,

Things like lombok make the boilerplate less of an issue in modern Java too

redempt , in New language

we should use only the best language for everything.

Maalus ,

Codebase is now 30 languages

redempt ,

there aren’t 30 best languages, that’s not how “best” works. we use only the best language. for everything.

Maalus ,

That’s exactly how “best” works. Everyone thinks their language is the greatest and shits on everything else. If they all chose “the best” there would be 50 of them. Opinions are like assholes. Everyone has one.

redempt ,

too bad everyone else is wrong

ieatpillowtags ,

Nah it’s you.

Empricorn ,

Lol Principal Skinner meme

KairuByte ,
@KairuByte@lemmy.dbzer0.com avatar

If only it was intentional…

redempt ,

yeah if only I was joking. wouldn’t that be funny

Iapar ,

Your joke is like my asshole. No one gets it.

ieatpillowtags ,

Oh really? There’s one language that’s “best” in all contexts and for every single use case? Care to enlighten the rest of us oh wise one?

KindaABigDyl ,
@KindaABigDyl@programming.dev avatar

C

ieatpillowtags ,

I love building websites in C!

KairuByte ,
@KairuByte@lemmy.dbzer0.com avatar

Hey now, CSS is just C with some extra S’s. Just rip off those S’s and you’re styling in C.

nickwitha_k ,

People used to do it.

ieatpillowtags ,

Not saying you can’t, but is it the “best”?

nickwitha_k ,

There is no objective answer to that. But, in my opinion, no, no it is not.

redempt ,

scratch

nickwitha_k ,

HolyC

KindaABigDyl ,
@KindaABigDyl@programming.dev avatar

And?
