Hell yeah bro same. I’ve been amazed at how much better Linux is in just about every way, except for native software availability, but it’ll get there. I feel like Microsoft is approaching the tipping point for shit people will put up with, and desktop Linux is so good now that non-technical people can move over to it.
I already planned on my next computer being Linux Mint, but the switch is looking more and more appealing as time goes on.
I was playing Elden Ring when it began stuttering; it turned out Windows Defender was constantly reading the disk (I still have a hard drive). I finally turned off its maximum-priority (and seemingly random) scans in Task Scheduler, only for the stuttering to come back. This time it was Windows Compatibility Telemetry taking up 50% of the disk, until I finally found a way to turn that off too.
It’d be so nice to have an OS that doesn’t run random unnecessary things without your permission.
I shifted all my important data to an external disk, wiped the main ssd, slapped Debian on there, then moved the data back. Great way to spend an afternoon.
TBF you’d probably get even more benefit from de-bloating that PC then. Free up some processing power for the tasks you actually want, instead of doing Microsoft’s bidding in the background all the time.
But we’ve all got different plans/priorities/timelines. Best of luck to you m8!
I just put Mint on a 2015 Dell shit laptop that barely functioned with Windows. Now it’s a perfectly fine computer. I don’t do much besides use the internet, but it struggled with even that before.
That’s still newer than any of my daily-use laptops that are all running full-featured Linux distros just fine. I got 'em all cheap secondhand, and just pumped up the RAM (12-16GB) and installed SSDs.
As a gamer, I was anxious about switching to Linux as my daily driver, but I needed to fully immerse myself to improve at Linux, and I’ve been pleasantly surprised by how few gaming-related problems I’ve had.
I also have to ask myself how a PyPI Admin & Director of Infrastructure can get so many basic coding and security practices wrong:
Hardcoding credentials instead of using dedicated secret files, environment variables, or other secret stores (see the sketch after this list)
For any source that you compile, you have to assume that, in one way or another, it ends up in the final artifact. Apparently this was not fully understood (“.pyc files containing the compiled bytecode weren’t considered”)
Not using an isolated build process, e.g. a CI pipeline with an isolated VM or a container (also sketched below). This will inevitably lead to “works on my machine” scenarios
Needing the built artifact (container image) only locally, yet pushing it to a publicly available registry
Using an access token that has full admin permissions for everything, despite only requiring it to bypass rate limits
Apparently using a single access token for everything
When you use Git locally and want to push to GitHub, you need an access token. The fact that the article says “the one and only GitHub access token related to my account” likely indicates that this token was also used for that
One of the takeaways in the article is “set aggressive expiration dates for API tokens”. This won’t help much if you don’t handle them properly in the first place: an attacker can still use them before they expire, or simply extract updated tokens from newer artifacts.
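On the secrets point, here’s a minimal sketch of loading a token from the environment instead of hardcoding it. The variable name GITHUB_TOKEN and the error handling are my own assumptions, not from the article:

```python
import os

# Read the token from the environment at runtime instead of hardcoding it.
# A hardcoded string literal would survive compilation: .pyc files keep
# string constants, so stripping the source from the image isn't enough.
token = os.environ.get("GITHUB_TOKEN")  # hypothetical variable name
if token is None:
    raise RuntimeError("GITHUB_TOKEN is not set; refusing to run without it")
```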
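And on the isolated build: a minimal sketch of building inside a container, assuming a standard pyproject.toml-based Python project (the base image and build command are illustrative, not from the article):

```dockerfile
# Build in a clean, throwaway environment instead of on a developer machine,
# so nothing from local state can leak into the artifact.
FROM python:3.11-slim
WORKDIR /app
COPY . .
RUN pip install --no-cache-dir build && python -m build
```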
On the other hand, here’s what went well:
When this was reported, it was acted upon within a few minutes
Some of my points of criticism above now appear to have been taken into account (see the article’s “Takeaways”)
Yes kids, the only stuff in ANY repo (public or otherwise) should be source code.
If it is compiled, built, or otherwise modified by any process outside of you the developer typing in your source code editor, it needs to be excluded/ignored from being committed. No excuses. None. Nope, not even that one.
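As a sketch of what that looks like in practice for a Python project, a .gitignore along these lines keeps generated files out of the repo (the exact entries depend on your toolchain):

```
# Generated by the build or runtime, never hand-written: keep it out of the repo
__pycache__/
*.pyc
build/
dist/
*.egg-info/
```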
Two choices, then: either the production software isn’t in the exact state the repo was in when the software was built, or I can’t get build timestamps into the software.
I think the idea is that average people have no clue what color they are. So they’d be forced to take it out to check and thus have to restart their PC. It’s a trick!
Altho, maybe I’m misunderstanding something because all the pins of all the electrical cords I’ve ever seen have been silver?
I’d make up some BS about an old version of the product using brass or copper, and newer versions using aluminum or iron, so knowing the color will help me know how to fix it
I worked with a guy that would tell people that coax needed to be “released to ground” occasionally, by unhooking the cable and putting your thumb over the end. That’s how he made sure people were disconnecting and reconnecting the cable from the back of the box. He also told someone that “data might be trapped in the Ethernet cord” and advised they unplug it from both ends and swing it around their head in a circle to “loosen the stuck bits and clear the line”…
C was originally created as a “high-level” language, being more abstract (aka high-level) than the other languages at the time. But now it’s considered only slightly more abstract than machine code compared to the much higher-level languages we have today.
Other way around, actually; C was one of several languages proposed to model UNIX without having to write assembly on every line, and has steadily increased in abstraction. Today, C is specified relative to a high-level abstract machine and doesn’t really resemble any modern processing units’ capabilities.
Incidentally, coming to understand this is precisely what the OP meme is about.
To add on to @azdle’s comment: “high level” in terms of programming languages means further away from how the computer processes things, and “low level” means very similar to how machines process things. For example, binary and hexadecimal (base 16) machine code, such as “assembly language”, are both low level.
Imagine if program interpreters were building blocks; then six layers of abstraction would make a very tall stack, i.e. higher level.
This is pedantic, but assembly languages get “assembled” to machine code. This is somewhat similar to higher-level languages being “compiled”: the output eventually becomes assembly, which gets assembled. The major reason these are different is that a compiler changes the structure of the code. Assembly is a direct mapping to instructions. It just converts the text into machine code directly, which is why it’s easy to go from machine code to assembly, but decompiling doesn’t give you results identical to the original source code.
Also, binary and hexadecimal are just different ways to view the same binary data, not different things. There is only “machine code”, which is a type of binary data, but you can view binary data in any arbitrary base, though obviously powers of 2 work better.
I don’t think I said assembly is abstracted. It’s pretty much just a translation.
Hexadecimal isn’t binary. They’re both just ways to represent numbers. A number displayed in hexadecimal and in binary is the same number even though it looks different: FF (base 16) = 1111 1111 (base 2) = 255 (base 10). They’re all identical.
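A trivial Python demonstration of that (my own example):

```python
# One value, three renderings; the underlying number is identical.
n = 0xFF                        # written in hex
assert n == 0b11111111 == 255   # same number in binary and decimal
print(hex(n), bin(n), n)        # prints: 0xff 0b11111111 255
```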
“Assembly is a direct mapping to instructions. It just converts the text into machine code directly”
Kinda… yes and no? At least with x86 there are still things like encoding selection going on; there’s not a 1:1 mapping between assembly syntax and opcodes.
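For instance (my own example, with byte values as given in the x86 manuals), the same mnemonic has several valid encodings and the assembler has to pick one:

```python
# Three valid x86 encodings of the very same instruction, `add eax, 1`,
# which is why mnemonics and machine code aren't a strict 1:1 mapping.
enc_imm8  = bytes([0x83, 0xC0, 0x01])                    # 83 /0 ib: sign-extended imm8
enc_imm32 = bytes([0x81, 0xC0, 0x01, 0x00, 0x00, 0x00])  # 81 /0 id: full imm32
enc_eax   = bytes([0x05, 0x01, 0x00, 0x00, 0x00])        # 05 id: accumulator short form
```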
Also, assemblers, at least those meant for human consumption (mostly nasm nowadays), tend to have powerful macro systems. That’s not assembly as such, of course.
But I think your “a compiler changes the structure of the code” point is spot-on: an assembler will not reorder instructions and it won’t do dead code elimination. Though I don’t think it’s really out of scope for an assembler to be able to do those things; compilers weren’t doing them for the longest time, either.
I think a clearer division would be that compilers deal with two sets of semantics: that of the source language, and that of the CPU. The CPU semantics don’t say things like “the result after overflow is undefined”; that’s C speaking, and compilers can use those differences to do all kinds of shenanigans. With assemblers there’s no such translation between different language semantics; it’s always the CPU semantics.
Your team needs to have a coding-standards meeting where you can lay out the pros and cons of each approach. You guys shouldn’t be rehashing the same argument during PR reviews; when that happens to me, it just feels like such a waste of time.