
programmerhumor


BorgDrone , in Programming languages like PHP and JavaScript get a lot of valid criticism, but I feel that we do have to consider the fact that their ease of use got A LOT of people into programming.

Yes, this is the problem with PHP. It gets a lot of people programming who shouldn’t be. I still have nightmares about the PHP code one of my managers at a previous job wrote.

simonced ,

Exactly, programming shouldn’t be easily accessible. Anybody following a tutorial can get a simple page working. And then they think they know programming, get hired by people who don’t know any better, and here we are, debugging stupid shit instead of doing nice things.

digdilem ,

I’ve heard that a lot, but I think it’s an outdated view.

Programming should be easy, or at least easier. That’s a view shared by everyone who writes and contributes to documentation for any language, and by those who develop the languages themselves (with varying success).

Every damned one of us was a shit coder when we started; that’s part of the process - not least for those of us who are self-taught. Yet some go on to do great things and become wonderful coders (including yourself, no doubt).

You had a bad experience, fair enough, but it’s a big brush to tar everyone with. I think everyone should be a programmer. If nothing else it teaches them a little how software actually works and that’s a good thing.

BorgDrone ,

I disagree completely. Sure, there is a learning curve and you’re not going to be a great programmer on day one; that’s what college and junior programmer positions are for. But the idea that programming can be easy is bullshit.

Programming is inherently difficult, and there is no way to reduce this. Read ‘No Silver Bullet’ by Fred Brooks; it’s as true today as when it was written back in 1986. Not everyone should be a programmer, just like not everyone should be a doctor, or a painter, or a Formula 1 driver. People have unique talents, and the idea that this is something everyone should be able to do is frankly ridiculous.

digdilem ,

I disagree completely.

Great! It would be a boring world if we all thought alike.

Programming is inherently difficult,

That’s where we differ. I don’t think it is - and I’m not saying that because I think I’m good; it’s because programming is just a different way of thinking. That’s why there are books like “Zen and the Art of Computer Programming” and “The Tao of Programming”. (I haven’t read “No Silver Bullet” but I’ll keep an eye open. I was actually writing code back in 1986, so it might be interesting to compare, because I think programming has changed a huge amount in that time.)

Not all programming is easy, just as not all of it is hard. The range of this subject is massive, and blanket statements, pro or anti, just don’t cut it when you dig into it.

BorgDrone , (edited )

You can easily find ‘No Silver Bullet’ online: worrydream.com/refs/Brooks-NoSilverBullet.pdf

He basically splits the complexity of programming into two categories: accidental complexity and essential complexity. The accidental complexity you can fix; it’s the difficulty caused by tooling, programming languages, etc. The essential complexity, that is, the complexity inherent in the problem your program is trying to solve, cannot be fixed. To quote the man:

The essence of a software entity is a construct of interlocking concepts: data sets, relationships among data items, algorithms, and invocations of functions. This essence is abstract, in that the conceptual construct is the same under many different representations. It is nonetheless highly precise and richly detailed. I believe the hard part of building software to be the specification, design, and testing of this conceptual construct, not the labor of representing it and testing the fidelity of the representation. We still make syntax errors, to be sure; but they are fuzz compared to the conceptual errors in most systems.

I don’t think it is - and I’m not saying that because I think I’m good, it’s because programming is just a different way of thinking

This different way of thinking is something that some people simply will never get, or not at the right level. I’m certainly not a great programmer, but I know from experience I’m above average. I know a lot of programmers who simply cannot grasp things above a certain level of abstraction - college-educated programmers with years of experience. The easiest way to tell: bad programmers turn simple problems into complicated code, and good programmers turn complicated problems into simple code.

digdilem ,

Nice quote - but I don’t think it holds up as truly as it did in the 80s. There is an unimaginable wealth of systems and design tools available now that were not around then. Even something we take for granted like a GUI schema designer - hell, even SQL itself wouldn’t be around until almost a decade later, and that was partly designed to simplify database queries. Every step like that has simplified what we do today.

Debugging tools are light years ahead of when I was writing C in the early 90s. Debugging then was pretty much “try to compile it and then fix the errors”. Now there are linters, memory profilers, automatic pipelines and all the rest. Much of that is offset by the fact that we do far more complicated things than we did, and that those very tools mean there’s a lot more to learn and master beyond the mere language.

I do concede and agree with your last paragraph. Design is more important than implementation, and elegance of code and concept is a timeless beauty. One of the hardest things I’ve had to learn is that thinking about coding is often far more productive than actually coding; too many times I’ve been a busy fool, rewriting and starting over because I later found out a better way.

BorgDrone ,

All the tools you mention fall under the accidental-complexity header. There have been many advances in that field, but none of those tools reduce the essential complexity. SQL doesn’t mean you don’t have to think about how you organize your data; you still need to think about things like normalization. Even an ORM doesn’t free you from this.

Same goes for debuggers: sure, it’s easier to inspect code at runtime, but that doesn’t help you design good code.

You can reduce the accidental complexity, but in the end there is always the core of essential complexity. The difference from past decades is that for a simple program the accidental complexity used to be a huge part of the total, so in that regard you’re right: it has become a lot easier to write trivial programs where the essential complexity is very low.

This may apply to a lot of hobby-level / beginner projects, but in the end it doesn’t have as much of an impact on what we do as professionals. As you said, I spend a lot more time thinking about code than actually writing it, especially as I’ve gotten older and more experienced. As a senior developer I write a fraction of the code I did as a junior, but I’m working on more complicated problems as well.

vd1n , in open source in progress...

I finally got a proper USB 3.012542.

crusty ,

It’s bad when the actual naming scheme is more ridiculous. See USB 3.2 Gen 2x1. Hopefully USB 4 simplifies things, but let’s be honest, it probably won’t…

Chais , in This interface name my previous manager wrote
@Chais@sh.itjust.works

Suggest a replacement: IMasterBase2

peopleproblems , in thought you stood a chance?

Ok, so, I thought I was crazy when I said “debug it” and my coworkers were like “you can read that shit?”

SpaceNoodle ,

My colleague once literally said to me, “I can’t read bleep bloops”.

catshit_dogfart ,

Heck, I remember when you had to read “bleep bloops”. POST codes came in beeps, and that’s how you knew why the computer wouldn’t start.

Sometimes I miss ’em; I wish boards still gave those in addition to the modern indicators. Then I could just tell without even looking.

SpaceNoodle ,

Do modern PCs even support IBM-style speakers any more?

janNatan ,

Yes. My latest mobo has the pins for a POST speaker, but didn’t actually come with one. I installed one, though, and it works.

SpaceNoodle ,

I didn’t even bother checking on the last box I built. I know I’ve got at least one proper speaker and one piezo in a crate somewhere.

catshit_dogfart ,

Now I’m wondering if my board supports one. I think it’d be cool if my big fancy custom cooling loop gaming build sounded like it’s from the early 90s.

gliide ,

Most boards still do. I have a couple of X570 boards lying around that do, and my friend just got an X670E board that has one. They all have a BIOS setting for the POST beep as well, and you can set it up in your OS to beep at a specific frequency/duration for notifications.

catshit_dogfart ,

That’s exactly what I have! An X570-A-PRO. We’ll see, haven’t gotten around to messing with this.

notthebees ,

My B350 board supported that style of speaker. Newer machines apparently use the start-button LED in some cases.

dan ,
@dan@upvote.au

Some motherboards have a tiny piezoelectric speaker soldered on, replacing the larger separate speakers that used to be standard. It’s becoming less and less common though.

peopleproblems ,

My current mobo has LED POST codes.

I hate it with a passion. The manual doesn’t mention that the CPU LED and MEM LED light up when the +5V rail is pulled too low by the load from USB devices.

though thinking about it I should probably figure out WHY that’s a problem

dan ,
@dan@upvote.au

Yeah, the LEDs can be confusing. I like the fancy systems that have a seven-segment LED display (those basic ones that can show numbers and some letters) showing an error code. My work PC (a Lenovo ThinkStation) has the error-code display on the front, so you don’t even have to open it to determine the issue.

Poob , in What a great start !

Anyone who thinks chatbots are going to make life better for worker schlubs hasn’t been paying attention for the last 200 years.

Kichae , (edited )

Something something Luddites, something something sabot, something something hence the word “sabotage”.

ReakDuck , in open source in progress...

Yeah, that’s the positive thing.

I don’t want a single application that just works for a few cases but fails in tons of others. In such a world we’d have a Windows OS where you need Face ID for everything, which sometimes just crashes, or blocks something simple and stupid because you don’t have the magical Windows Battle Pass or something.

tun , in Easy

Programmers never steal. We just copy and paste what we need.

LukeChriswalker , in What a great start !

It will! For people who already don’t do work because they implemented the AI strategy for developers. Clearly, if productivity rises, the managers did good!

pizzaiolo , in open source in progress...

This of course ignores all the other times a new common standard succeeded

Vithar ,

Doesn’t that almost always require significant government intervention/regulation?

volodymyr ,

Phone and laptop chargers converged from numerous standards to just a few all on their own, I think, no?

fristislurper ,
@fristislurper@feddit.nl

Yes, but under threat of lawmakers mandating a single standard. And the EU has now forced a single standard anyway on smartphones, tablets, etc.

Although I agree that there are quite a few examples of “naturally emerging” single standards without lawmaker intervention, this is not really one of them…

volodymyr ,

I thought this was an example where standards partly converged naturally. But I agree that regulation was a fundamental part of the process.

avidamoeba , in The Future We Deserve
@avidamoeba@lemmy.ca

I once ran an informal survey at a previous workplace. I was shocked by how many Mac users who also used Docker had no idea that there was a Linux VM in between. 🥲

Chais ,
@Chais@sh.itjust.works

What did you expect? Mac users aren’t exactly famous for their technical inclination.

avidamoeba ,
@avidamoeba@lemmy.ca

Those were software developers using Docker on Mac. I expected a tad more from them. 🤭

Chais ,
@Chais@sh.itjust.works

I’ve made that mistake before.

Otome-chan , in 10 Worst Programming Languages
@Otome-chan@kbin.social

I'm just gonna go out on a limb and ruffle some feathers and say it's Node.js

pkulak ,

Well, that’s a runtime. But yes, JavaScript.

h_a_r_u_k_i , in open source in progress...
@h_a_r_u_k_i@programming.dev

Python packaging in a nutshell.

pazukaza ,

In how many ways can you package a python?

Just tie a knot with it and throw it in a bucket.

darcy , in address

“No, where do you live?!”

“On the Internet.”

Korne127 , in open source in progress...
@Korne127@lemmy.world

Yeah, in such a situation you should always try to compare the standards and look at the userbases; suddenly there are only very few that actually make sense. If everyone just does this, one standard will eventually crystallise as the one to use (or at least one per situation). Character encoding is an interesting example, because nowadays (almost) everything just uses UTF-8, despite there once having been many encodings.

xthexder ,
@xthexder@l.sw0.com

UTF-8 is absolutely magical in how it’s backwards compatible with ASCII. Windows still uses UTF-16, which makes supporting Unicode filenames and such a huge pain compared to Linux. At least pretty much the entire web is UTF-8.
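
That backwards compatibility is easy to demonstrate in a quick sketch (Python here, purely for illustration):

```python
# Any pure-ASCII string encodes to the exact same bytes in ASCII and UTF-8,
# which is why old ASCII files are already valid UTF-8.
text = "hello"
assert text.encode("ascii") == text.encode("utf-8")

# Non-ASCII characters become multi-byte sequences in which every byte has
# the high bit set, so they can never be mistaken for ASCII bytes.
snowman = "☃"  # U+2603
assert snowman.encode("utf-8") == b"\xe2\x98\x83"
assert all(byte >= 0x80 for byte in snowman.encode("utf-8"))
```

That high-bit property is also what makes UTF-8 self-synchronizing: a parser can always tell whether a byte starts a character or continues one.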

Cowabunghole , in why not a,b or x,y?

It’s my understanding that i, j are conventionally used in mathematics, which carried over into programming. Specifically, it comes from Fortran, in which all variables starting with “I” through “N” are integers, based on said mathematical convention.

LuckyFeathers ,

Yep, this is the answer. In Fortran, all variables are assumed to be floats unless the name starts with I, J, K, L, M or N. I’m sure they had a good reason, but it sounds so bizarre today!
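
The implicit typing rule is simple enough to sketch (Python here just to illustrate the convention; the variable names are made up, and this is of course not how a Fortran compiler actually works):

```python
def implicit_fortran_type(name: str) -> str:
    """Return the type old Fortran would assume for an undeclared variable."""
    return "INTEGER" if name[0].upper() in "IJKLMN" else "REAL"

# i, j, k default to integers, which is why they became the loop counters.
assert implicit_fortran_type("i") == "INTEGER"
assert implicit_fortran_type("n") == "INTEGER"
# Everything else defaults to a float, regardless of what the name suggests.
assert implicit_fortran_type("x") == "REAL"
assert implicit_fortran_type("count") == "REAL"
```

In real Fortran the defaults can be overridden with explicit declarations, or switched off entirely with `IMPLICIT NONE`.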

galilette ,

In fact this goes all the way back to Hamilton, who used i, j, k as basis vectors (generalizations of the imaginary i) when he invented quaternions. Later, Gibbs dropped the scalar component and gave us the modern vector.
