programmer_humor

HairHeel , in OC: Me since Bun 1.0.0
@HairHeel@programming.dev avatar

My favorite pastime is arguing about which JavaScript runtime is faster while I wait for my app to finish running O(n^n) table scans of my database.

kamen , in Every Single Freaking Time

I’ve almost gotten into the habit of hitting Ctrl+Shift+C when I want to copy something because of that.

vpklotar ,

I do that all the time. Opens up developer tools on firefox if you do it.

fernandu00 ,

I open developer tools every day doing that!

rinze ,
@rinze@infosec.pub avatar

Yes. And on Microsoft Teams that triggers a chat call.

scottywh ,

😂

KyuubiNoKitsune ,

Burn it with fire!

phoenixz ,

That solution is the worst. Ctrl-shift-c does a shitload of different things in different programs, and in browsers it does different things per page.

Ctrl-ins, shift-ins, shift-del for the win, but THEN some programs simply refuse to support those.

I have like 4 different copy-paste shortcuts because of this and it sucks

kamen ,

I’m not saying it’s great, but at least in my use I haven’t seen it being destructive/disruptive like Ctrl+C is.

phoenixz ,

Ctrl-c for copying is a windows thing and it’s annoying.

millie ,

Do you at least have 4 clipboards to go with them? Because I don’t think I could ever go back to a single clipboard.

phoenixz ,

I use standard Linux dual clipboard (Ctrl ins and just select, middle click) but most extra clipboards I’ve seen require a lot of extra clicking to get the work done. I want something simple stupid fast.

millie ,

I’m running windows for my daily, but I’ve got Ditto and it works great. I have like 3 clipboards set up, could set up more. It just needs a different hotkey combination. It’s really simple.

MentalEdge , in The birth of JS
@MentalEdge@sopuli.xyz avatar

Ah. The shork did the math. This explains a lot.

words_number , in libmem_cpy-strnrrn-std-clib_Cmvaeffc_ld-TWA_nif.aarch64(32bit)2-0.13.2-23.2.so.7(3).1.1.gz.conf

Seriously though, why? Are there historic reasons for that? Did they have to pay extra for more letters back in the day?

scurry ,

Yes. Memory and storage were at a very high premium until the 1990s, and when C was first being developed, it wasn’t uncommon for computers to output to printers (that’s why print() and co are named what they are), so every character was at a premium. In the latter case, you were literally paying in ink and paper by the character. These contributed to this convention that we’re still stuck with today in C.

words_number ,

Thanks for the insight! I think this kind of convention, one that once made some sense but is now exclusively harmful yet still followed meticulously, is often called “tradition”, and it’s one of the high-speed engines that let humanity drive towards extinction.

scurry ,

I agree, and these conventions are being followed less over time. Since the 1990s, the Windows world, Objective-C, and C++ have been migrating away (to mixed results), and even most embedded projects have been too. The main problem is that the standard library is already like that, and one of C’s biggest selling points is that you can still use source written >40 years ago and interact with it. If you changed all of that, at that point you might as well just use Go or something. I also want to say, shoutout to GNU for being so obstinate about changing nothing except the parts of the style they make worse. Gotta be one of my top 5 ‘why can’t you just be good leaders, GNU?’ moments.

words_number ,

at that point just use Go or something

*Rust (obviously!)

QuazarOmega ,

Wait, but they didn’t print out the source code right? Or did they use teletypes to develop?

xchgeaxeax ,

or did they use teletypes to develop

Basically yeah. ed the editor was designed with that in mind

QuazarOmega ,

Oh, that makes a lot of sense then.

After all, it is the standard text editor

spoiler: uff, doesn’t feel right if it isn’t KasaneTeto saying this :/

zqwzzle ,

IIRC older DOS versions were also limited to 8.3 filenames, so even filenames had a max of 8 characters + 3 for the extension. Maybe it was a limitation of the file system, can’t quite remember.

scurry ,

At one point it was both. DOS internally added support for longer file names at some point, and then a later version of the filesystem also started supporting them. I think that on DOS and Windows (iirc even today), they never actually solved it, and paths on Windows and NTFS can only be 256 characters long in total or something (I don’t remember what the exact limit was/is).

isVeryLoud ,

It’s 256, unless you enable something in the registry. NTFS supports paths longer than 256, funnily enough.

NateSwift ,

I’ve heard arguments that back in ye old days each row only had 80 characters and variable names were shortened so you didn’t have to scroll the page back and forth

Knusper ,

I’ve already felt like I should choose shorter names in a (shitty) project where the customer asked us to use an auto-formatter and a max line-width of 120 characters.

Because ultimately, I choose expressive variable names for readability. But an auto-formatter gladly fucks up your readability, breaking your line at some random ass point, unless your line does not need to be broken up.

And so you start negotiating whether you really need certain information in a variable name for the price of badly broken lines.

bleistift2 ,

120 characters is quite a lot, though.

Knusper ,

Yeah, I meant it as an example, where I was still granted relatively luxurious conditions, but even those already caused me to compromise on variable names.

I’d say, 95% of my lines of code do fit into 120 characters easily. It’s those 5% that pained me.

mustardman , (edited )

I worked with a compiler that would only compare the first 8 characters of a name and treat anything identical up to that point as the same.

Compiler copyright was around 1990.

Edit: This was for function names in C

SubArcticTundra ,
@SubArcticTundra@lemmy.ml avatar

Damn that must have been such a headache

AngryCommieKender ,

We were limited to a certain number of characters for filenames, way back in the Apple ]|[ days. IIRC it was 8

nothacking ,

They did. With core memory you could be paying many dollars per bit. They also often used teletypes, where you would pay in ink and time for every character.

SubArcticTundra ,
@SubArcticTundra@lemmy.ml avatar

Unix didn’t run on core though, did it? I thought core was before its time

nothacking ,

Early models of the PDP-11, the computer Unix was developed on, did use core.

UrbonMaximus , in The lengths we have to go to

Let me introduce you to YAML, you’ll love it!

dansity ,

I tore my hair out until I got my Home Assistant set up. I f*cking hate it, it’s stupid

lhamil64 ,

One of these days I’ll actually look up how YAML indentation works. Every time I use it it’s trial and error until I stop getting errors.

merc ,

That’s a super risky way to do it. It might stop giving you errors because you finally got the indentation right, or because you got the indentation “right” but not how you meant to organize the objects.

CoderKat ,

Ugh, there’s some parts of YAML I love, but ultimately it’s a terrible format. It’s just too easy to confuse people. At least it has comments though. It’s so dumb that JSON doesn’t officially have comments. I’ve often parsed “JSON” as YAML entirely for comments, without using a single other YAML feature.

YAML also supports not quoting your strings. Seems great at first, but it gets weird if you want a string that looks like a different type. IIRC, there’s even a major version difference in the handling of this case! I can’t remember the details, but I once had a bug happen because of this.

Performance wise, both YAML and JSON suck. They’re fine for a config file that you just read on startup, but if you’re doing a ton of processing, it will quickly show the performance hit. Binary formats work far better (for a generic one, protobuffers has good tooling and library support while being blazing fast).

vrighter ,

JSON5 does support comments. Alternatively, YAML is a superset of JSON: any valid JSON is also valid YAML, but YAML also supports comments. So you can write JSON with comments and use a YAML parser on it instead of a standard JSON parser.

sonnenzeit ,

It’s so dumb that JSON doesn’t officially have comments.

So much this.

Used to work at a company where I sometimes had to manually edit the configuration of devices which were written and read in JSON. Super inconvenient if you have to document all changes externally. As a “hack” I would sometimes add extra objects to store strings (the comments). But that’s super dicey as you don’t know if it somehow breaks the parsing. You’re also not guaranteed the order of objects so if the configuration gets read, edited and rewritten your comment might no longer be above/below the change you made.

Always found it baffling that such a basic feature is missing from a spec that is supposed to cover a broad range of use cases.

Bipta , in Isn't it ironic, don't you think?

Before I do anything "risky" with forms I copy the text AND paste it somewhere else to confirm I really copied it. Only then do I take the next action, and still I get burned all the time by crap like this one way or another.

docAvid ,

I usually just start by typing it up in emacs, then copy-paste it into the fussy little form. For anything over six words, it probably saves me time, even if nothing was going to go wrong. And then… Just as you said.

TheMadIrishman , in The legend

That’s the Norse god of throughput.

devious , in Error 502:

If my router drops packets on the rug, should I rub their WAN port in it?

CurlyChopz ,

That’s deemed inhumane nowadays, just turn them off and back on and if they do it again, speak to an IT specialist

Onions ,
@Onions@lemmy.world avatar

Just get some more deditated WAM for it

joshcodes ,
@joshcodes@programming.dev avatar

I did that last Christmas but the very next day, it gave it away. This year, to keep me from tears I’ll deditate it to something special.

yum13241 , in Golang be like

This makes me not want to use Golang at all.

AstridWipenaugh ,

I assure you, the feeling is mutual.

SouthernCanadian ,

You have to manually provide a seed every time you want a random number. Gophers will defend this with their dying breath.

flame3244 ,

This is not needed anymore since version 1.20

SouthernCanadian ,

That’s good to hear, but I wouldn’t choose to use a language that took 5 years to get that right, when most languages have it from day 1.

flame3244 ,

So what language would you choose then? Java, PHP, JavaScript? None of the big languages were perfect from day one, and it doesn’t really matter, since day one is over already.

SouthernCanadian ,

Yes, but I’d prefer incompetence rather than being deliberately anti-developer.

flame3244 ,

Okayyyy

GandarfDeGrape , in History repeats itself

OK. Query.

Rebase or merge into current?

I personally never rebase. It always seems to have some problem. I’m sure there’s a place and time for rebasing, but I’ve never seen it in action, I guess.

NatoBoram ,

It only matters if you want to be able to use the commit tree and actually find something. Otherwise, there’s no harm in using merges.

Blamemeta ,

What you do is create a third branch off master, cherry pick the commits from the feature branch, and merge in the third branch. So much easier.

BabaYaga ,

I’ve definitely done this before…

JDubbleu ,

This is actually genius. Gonna start using this at work.

GigglyBobble ,

If your cherry-pick doesn't run into conflicts why would your merge? You don't need to merge to master until you're done but you should merge from master to your feature branch regularly to keep it updated.

Blamemeta ,

Git is weird sometimes.

yogo ,

That’s called rebasing

fiah ,
@fiah@discuss.tchncs.de avatar

for some reason it’s easier than normal rebasing though

yogo ,

Have you tried interactive rebase (rebase -i)? I find it very useful

Blamemeta ,

Yeah, but then you deal with merge conflicts

gedhrel ,

rerere is a lifesaver here.

(I’m also a fan of rebasing; but I also like to land commits that perform a logical and separable chunk of work, because I like history to have decent narrative flow.)

dukk ,

You can get merge conflicts in cherry picks too, it’s the same process.

atyaz ,

That is absolutely not what rebasing does. Rebasing rewrites the commit history, cherry picking commits then doing a normal merge does not rewrite any history.

yogo , (edited )

I’m sorry but that’s incorrect. “Rewriting the commit history” is not possible in git, since commits are immutable. What rebase actually does is reapply each commit between upstream and head on top of upstream, and then reset the current branch to the last commit applied (This is by default, assuming no interactive rebase and other advanced uses). But don’t take my word for it, just read the manual. git-scm.com/docs/git-rebase
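That behavior can be seen in a throwaway repo; this is only a sketch (branch names "main" and "feature" are illustrative, and `git init -b` assumes git ≥ 2.28): after the rebase the feature commit has a new hash, because it was re-applied onto a new parent, but the content it introduced is unchanged.

```shell
# Sketch: rebase re-applies commits rather than mutating them. The rebased
# commit gets a new hash (new parent), but its file content is identical.
set -eu
repo=$(mktemp -d) && cd "$repo"
git init -q -b main                   # requires git >= 2.28
git config user.email demo@example.com
git config user.name demo
echo base > base.txt && git add base.txt && git commit -qm base
git checkout -qb feature
echo feat > feat.txt && git add feat.txt && git commit -qm feat
old=$(git rev-parse feature)          # hash before the rebase
git checkout -q main
echo more > more.txt && git add more.txt && git commit -qm more
git checkout -q feature
git rebase -q main                    # re-applies "feat" on top of "more"
new=$(git rev-parse feature)          # a different commit object...
[ "$old" != "$new" ]
git diff --quiet "$old" "$new" -- feat.txt   # ...but the same feat.txt
echo "re-applied with new hash, content unchanged"
```

The old commit object still exists (reachable via the reflog) until garbage collection, which is why "rewriting" is arguably the wrong word even though the branch history changes.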

atyaz ,

“Reapply” is rewriting it on the other branch. The branch you are rebasing onto now has one or more commits that do not represent real history. Only the very last commit on the branch is actually what the user rebasing has on their computer.

yogo ,

Cherry picking also rewrites the commits. This is equivalent to rebasing:


git branch -f orig_head
git reset target
git cherry-pick ..orig_head
dukk ,

Merge commits suck.

My biggest issue with GitHub is that it always squashes and merges. It’s really annoying, as it not only takes away from commit history, but also puts the fork out of sync with the main branch, and I’ll often realize this after having implemented another feature, forcing me to cherry-pick just to fix it. Luckily LazyGit makes this process pretty painless, but still.

Seriously people, use FF-merge where you can.

Then again, if my feature branch has simply gone behind upstream, I usually pull and rebase. If you’ve got good commits, it’s a really simple process and saves me a lot of future headaches.

There are obviously places not to use rebase (like when multiple people are working on a branch), but I consider it good practice to always rebase before merging. This way, we can always just FF-merge and avoid screwing with the Git history. We do this at my company and honestly, as long as you follow good practices, it never really gets too out of hand.
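The rebase-then-fast-forward flow can be sketched in a throwaway repo (branch names are illustrative, `git init -b` assumes git ≥ 2.28): because main only moves its pointer onto the rebased branch, the resulting history is linear with zero merge commits.

```shell
# Sketch: rebase the feature branch first, then fast-forward merge.
# --ff-only refuses to create a merge commit, keeping history linear.
set -eu
repo=$(mktemp -d) && cd "$repo"
git init -q -b main
git config user.email demo@example.com
git config user.name demo
echo a > a.txt && git add a.txt && git commit -qm "initial"
git checkout -qb feature
echo b > b.txt && git add b.txt && git commit -qm "feature work"
git checkout -q main
echo c > c.txt && git add c.txt && git commit -qm "mainline work"
git checkout -q feature
git rebase -q main                  # bring feature up to date first
git checkout -q main
git merge --ff-only -q feature      # fails instead of making a merge commit
merges=$(git rev-list --merges --count HEAD)
[ "$merges" -eq 0 ] && echo "linear history, no merge commits"
```

If the rebase step is skipped and the branches have diverged, `--ff-only` aborts, which is exactly the guard rail you want when enforcing this policy.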

GandarfDeGrape ,

Sounds like I just gotta get better with rebasing. But generally I do my merges clean from local changes. I’ll commit and push, then merge in, push. Then keep working. Not too hard to track but I’ve found it’s the diff at MR time that people really pay attention to. So individual commits haven’t been too crucial.

GigglyBobble ,

Merge commits suck.

My biggest issue with GitHub is that it always squashes and merges.

You are aware you're talking about two different pieces of software?

dukk ,

Yeah, I am. However GitHub, being the biggest Git hosting provider and all that, makes you use merge commits. FF-merges must be done manually from the command line. While this definitely isn’t a problem for me, many people out there just don’t care and merge without a second thought (which, as I said in my comment, results in having to create a new branch and cherry picking the commits onto there).

themusicman ,

You should check out the repo options on GitHub. It most definitely supports rebase merges, and you can disable other merge types if desired.

risottinopazzesco ,

Rebase. Merge into current leaves merge commits in the dev branches.

atyaz ,

Always merge when you’re not sure. Rebasing rewrites your commit history, and merging with the squash flag discards history. In either case, you will not have a real log of what happened during development.

Why do you want that? Because it allows you to go back in time and search. For example, you could be looking for the exact commit that created a specific issue using git bisect. Rebasing all the commits in a feature branch makes it impossible to be sure they will even work, since they represent snapshots that never existed.

I’ll never understand why people suggest you should default to rebasing. When prompted about why, it’s usually some story about how it went wrong and it was just easier to do it the wrong way.

I’m not saying never squash or rebase. It depends on the situation but if you had to pick a default, it should be to simply merge.

h14h ,

I try to structure my commits in a way that minimizes their blast radius, which usually means trying to reduce the number of files I touch per commit.

For example, my commit history would look like this:

  • Add new method to service class
  • Use new service class method in worker

And then as I continue working, all changes get git commit --fixup’ed to one of those two commits’ hashes, depending on where they occur.

And when it’s time to rebase in full, I can do a git rebase master --interactive --autosquash.
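That whole flow can be sketched end-to-end in a throwaway repo (file and commit names are illustrative; `GIT_SEQUENCE_EDITOR=:` just accepts the generated todo list so the interactive rebase runs unattended):

```shell
# Sketch of the --fixup / --autosquash flow: a later fix is recorded as a
# "fixup! <subject>" commit and folded into its target during the rebase.
set -eu
repo=$(mktemp -d) && cd "$repo"
git init -q -b master               # requires git >= 2.28
git config user.email demo@example.com
git config user.name demo
echo svc > service.txt && git add service.txt && git commit -qm "initial"
git checkout -qb feature
echo m1 > method.txt && git add method.txt
git commit -qm "Add new method to service class"
echo w1 > worker.txt && git add worker.txt
git commit -qm "Use new service class method in worker"
target=$(git rev-parse HEAD~1)      # hash of the "Add new method" commit
echo m2 >> method.txt && git add method.txt
git commit -q --fixup "$target"     # subject becomes "fixup! Add new method..."
GIT_SEQUENCE_EDITOR=: git rebase -i --autosquash master
git log --format=%s master..feature # the fixup commit has been folded away
```

After the rebase, `master..feature` contains only the two descriptive commits, with the fix merged into the commit it belongs to.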

dukk ,

This is the way! Small commits with descriptive commit names, then just fixup into a few feature commits. Makes rebase a breeze.

rookeh ,

I’ve always merged. Rebase simplifies the history graph, but realistically I can’t think of a time where that has been important to me, or any of the teams I’ve worked with.

Maybe on some projects with a huge number of concurrent branches it becomes more important, probably less so for smaller teams.

maniacal_gaff , in Java

It would be kinda dumb to force everyone to keep casting back to a double, no? If the output were positive, should it have returned an unsigned integer as well?

larvyde ,

int coerces to double automatically, without explicit casting

snake_baitman ,

The CPU has to do real work to convert between float and int types. Returning an int type would just be giving the CPU extra work to do for no reason

notnotmike ,
@notnotmike@programming.dev avatar

I’m learning so much from this thread and I don’t even use Java

baseless_discourse ,

I think one of the main reasons to use floor/ceiling is to predictably cast a double into an int. This type signature kind of defeats that important purpose.

I don’t know the historical context of Java, but possibly at the time people saw types more as a burden than as a way to guarantee correctness? (Which is kind of still the case for many programmers, unfortunately.)

BorgDrone ,

You wouldn’t need floor/ceil for that. Casting a double to an int is already predictable as the java language spec explicitly says how to do it, so any JVM will do this the exact same way.

The floor/ceil functions are simply primitive math operations and they are meant to be used when doing floating point math.

All math functions return the same type as their input parameters, which makes sense. The only exception are those that are explicitly meant for converting between types.

baseless_discourse , (edited )

“predictable” in the sense that people know how it works regardless of which language they know.

I guess I mean “no surprise for the reader”, which is more “readability” than “predictability”

BorgDrone ,

Is there any language that doesn’t just truncate when casting from a float to an int?

baseless_discourse ,

As far as I know, Haskell does not allow coercion of Float to Integer without specifying a method (floor, ceiling, round, etc.): https://hoogle.haskell.org/?hoogle=Float±%3E+Integer&scope=set%3Astackage

Agda seems to do the same: agda.github.io/agda-stdlib/Data.Float.Base.html

sundray , in Help me stepbro, I fell and now there's AI stuck in my sextoys. Also the next time you answer provide a boilerplate react module for a TODO list

The sudden change in <style> makes my <head> feel funny 🥴

lennivelkant , in No common rube

I work in our service department myself (not as a support tech though), and obviously all tickets are supposed to go through 1st level. I didn’t wanna be the dick skipping the queue, so I did just that the one time I had an issue.

There’s a unique feeling of satisfaction to submitting a ticket with basically all the 1st level troubleshooting in the notes, allowing the tech to immediately escalate it to a 2nd level team. One quick call, one check I didn’t know about, already prepared the escalation notes while it ran. Never have I heard our support sound so cheerful.

Sabata11792 ,

Still riding the high of RMAing my Index. Included all the steps I did and the reply was essentially, “Thanks for troubleshooting, confirm your address and we’ll ship your replacement.”

suction ,

My “Index” you say…🤔

Sabata11792 ,

You know, the fancy hat you wear so you don’t have to see the same reality that work takes place in.

suction ,
Sabata11792 ,
suction ,

Thanks that would have bothered me for weeks

Sabata11792 ,

I had to mess with you a little bit.

Xanis ,

My favorite little story was while working short-term at a company. Had some issues, did my normal troubleshooting steps and Google searches, identified what I felt the issue was and knew I wouldn’t have enough access to fix it. Reached out and got a response “Blah blah blaaah schedule blah blah Remote-In.”

Later on he sent me a message and remotes into my computer. I take control quick, open up notepad, and type out “Hi!”

To this day I swear that little show earned me more difficult fake phishing attempts. Which I mention because he specifically told me one day that he had experience in the information security sector. Lo and behold!

Olhonestjim ,

One should never skip dicks in the queue. It’s rude and they’ve been waiting.

Ziglin ,

If someone sends a bug report with minimal effort and expects me to fix it, I’ll skip their report unless I have nothing better to do.

sneezycat , (edited ) in No common rube
@sneezycat@sopuli.xyz avatar

Restarting can be a pain too.

Recently, I decided to install arch linux on an old laptop my sibling gave to me. I’m not new to Linux, I’ve been running a debian server for a year now and I have tried several VMs with different systems. But this was my first time installing arch without a script, and on bare metal.

Installing arch itself wasn’t that much of an issue, but there was a bigger problem: the PC didn’t recognize the pendrive for boot in UEFI mode. It seemed to work in the regular boot mode, but I didn’t want to use that. I made sure to deactivate safe mode and all the jazz. Sure enough, I could get UEFI boot working.

I install arch, works fine, I reboot. Oops! I didn’t install dhcpcd and I don’t know how to use network manager! No internet, great!

In my infinite wisdom, instead of trying to get NM to work, I decided to chroot back into the system and install dhcpcd. But imagine my surprise when… the boot menu didn’t recognize the USB again. I tried switching between UEFI and normal boot modes in the BIOS and trying again; after all, it appeared last time after changing it, right?

“Oh it doesn’t appear… Wait, what’s this? No boot partition found? Oh crap…”

Turns out, by changing the setting in the BIOS I probably cleared the NVRAM and with it the boot entries or whatever they’re called. I deleted GRUB.

Alas, as if to repent for my sins, God gave me a nugget of inspiration. I swap the USB drive from the 3.0 port to one of the 2.0 ports on the other side and… It works, first try. The 3.0 port was just old and the connection bad. And I just deleted GRUB for no reason.

Usually, I would’ve installed everything from scratch again, but with newfound confidence, I managed to chroot into the system and regenerate the boot table or whatever (and install dhcpcd). And it worked! I had a working, bootable system, and an internet connection to download more packages.

I don’t know what the moral of the story is I just wanted to share it :)

s12 ,

I like to imagine an IT person telling someone that story to see whether they understand it or have a stroke, as a way to check whether they were telling the truth about being good with computers and having tried everything.

akakunai ,

respecc

suction ,

Check if dckpcd is up

puchaczyk , in Please stop

I just want a Debian-based distro with KDE that’s not poisoned by Canonical’s nonsense

boonhet ,

KDE Neon?

puchaczyk ,

It’s Ubuntu based

Successful_Try543 , (edited )

But AFAIK without ‘Canonical’s nonsense’, e.g. the snap Firefox.

WormFood ,

updating packages in kde neon is like playing russian roulette, it’s worse than pop os in my experience

Successful_Try543 ,

During usual updates? Or during the major release jump of KDE Plasma from 5.x to 6.x?

fourwd ,
@fourwd@programming.dev avatar

Have you tried Debian?

puchaczyk ,

I’m currently on debian. I wrote this comment as a response to the Debian slander in the meme.

pelya ,

Just do a quick simple sudo apt-get install task-kde-desktop

nexussapphire ,

Man, Nvidia users are going to be stoked when they get explicit sync in their desktop environments in two years. 😂 There have been so many small improvements in the Nvidia drivers leading up to that point; I hope they actually update the Nvidia drivers on Debian. I understand some of those improvements aren’t going to work because of the kernel and desktop versions.

ik5pvx ,

You can choose KDE as desktop environment during Debian installation, or replace whatever DE you installed at any time.

anzo ,

I’m using the MX Linux “AHS” version; it comes with KDE from their “AHS” repos, which support the latest hardware and graphics cards. You may also check the non-AHS version; there might be a meta-package for KDE Plasma and that’s it…

aaaaace ,

MX ?

Matriks404 ,

Why do you need it to be Debian-based?

dan ,
@dan@upvote.au avatar

I think you’re looking for Debian. If you want newer packages, run testing instead of stable.

yessikg ,
@yessikg@lemmy.blahaj.zone avatar

Same, that’s why I’m using Q4OS
