This is kind of intriguing. I like FreeBSD’s userland tools a lot better. Have you tried running it? If not, I might see what it’s all about. The GNU toolchain is a mixed bag: some of it is really well documented, some of it is average, and some is just a dog’s breakfast.
I only learnt about it today, so I couldn’t check it. I have this project of building my own distro using musl and a non-GNU userland, and it is a very annoying process, so I felt like I should share this one.
I have been running it for a while. It is mostly awesome.
A non-trivial amount of software assumes glibc though, so you will have the odd hiccup because of musl. I think one of the goals of Chimera is to improve that situation.
I have one old laptop where I installed Gentoo with musl+llvm profile. It’s fun to tinker with. If I need to run any game binaries, I guess I’d need to run some containers…
Also, if it came pre-installed, one would assume all the hardware was properly supported. A big pain point with Linux is that sometimes things just don’t work right, and there’s nobody to turn to for help except Google. It’s been a while since I attempted to run Linux on a laptop, but when I did I struggled a lot getting good battery life, good trackpad support, and a sleep mode that worked correctly.
Reputations live on for decades after they are earned. Perhaps all of my laptop problems are ancient history, but I have no way to know without trying, and it’s too much effort.
I have an example: a little while ago I put Arch on my 2-in-1 laptop just because I prefer the open-source philosophy, and although a lot of things worked out of the box, my biggest problem was the actual 2-in-1 function. I knew that, like on Windows, I’d have to do a little digging to get it working (except Windows would involve drivers, while Linux required settings), and I got a makeshift solution going: KDE has its own screen-rotating feature, and I made two shell shortcuts on the desktop that, when pressed, disable/enable the keyboard/trackpad. Turns out it only works on Xorg, and Wayland requires a way more complicated setup, so I just gave up using Wayland on it. Something to do with udev rules or something.
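For anyone curious, the Xorg half of that makeshift setup can be sketched with `xinput`. This is a sketch, not the commenter’s actual script, and the device names below are assumptions — check yours with `xinput list`:

```shell
# Toggle the internal keyboard and touchpad under Xorg with xinput.
# The device names are assumptions -- check yours with `xinput list`.
KEYBOARD="AT Translated Set 2 keyboard"
TOUCHPAD="SynPS/2 Synaptics TouchPad"

toggle_input() {
  # $1 is "enable" (laptop mode) or "disable" (tablet mode)
  for dev in "$KEYBOARD" "$TOUCHPAD"; do
    if [ -n "$DRY_RUN" ]; then
      echo "xinput $1 $dev"    # print instead of touching real devices
    else
      xinput "$1" "$dev"
    fi
  done
}

# Tablet mode: drop the keyboard/touchpad; laptop mode: bring them back.
# (DRY_RUN=1 here so the sketch is safe to run without an X session.)
DRY_RUN=1 toggle_input disable
```

Wiring two desktop shortcuts to `toggle_input disable` and `toggle_input enable` reproduces the setup described above; on Wayland, `xinput` has no effect, which matches the commenter’s experience.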
Understanding what different distros offer and being able to make an educated decision about it. I looked around for a week or so until I found an Arch-based distro that worked, took away the manual installation process for a complete noob, and wasn't all red flags straight away (for example, a lot of people advised against Manjaro). I ended up with Garuda (which some people aren't fans of because of Chaotic-AUR, but we have to start somewhere, don't we?), which works fine for now, until I'm confident enough to do a complete base Arch installation next time.
The last time I tried to make a USB dual-boot Linux on a laptop I ended up breaking the laptop. It would turn on but show nothing but a black screen. Makes me really hesitant to try again on an old laptop that I would still like to be able to use if I fuck it up.
Anything using the terminal... I once tried to do something on Linux because a friend told me it was great. I gave it another go when it came up on my Chromebook and tried to teach myself. I just don't get it.
I'm not a programmer at all, so anything that involves typing commands is going to baffle me!
One thing I had to learn, when I started to understand how big tech really works and what that implies (see Chat Control) and got passionate about free software, free operating systems, and freedom of customization, is that freedom itself almost always requires work. The question is: is that work you’re willing to do? For me the answer is a strong YES.
But I was just giving my perspective as an outsider who stumbled across this post, because messing about with the terminal had the opposite effect on me, as someone who appreciates the concept of Linux but doesn't really have the level of passion to learn programming for it.
YES! I fucking hate it. I shouldn’t have to enter code in order to install a program. I want to go to a website and download the software, click install, and have it actually just work.
I’ve used Mint for several years now, but it will never be the primary OS for my household because it’s such a hassle to work with.
Yeah I keep seeing people mention having to use the terminal to install software, and I wonder what distro they’re using and what software they’re trying to install.
Most distros use Flatpak, so when opening GNOME Software or Discover you can install Discord, Spotify, web browsers, text editors, Steam, etc. all through it. And even Ubuntu, which doesn’t use Flatpak by default, has all of those apps in its Snap store as well. Hell, Ubuntu even manages hardware drivers through a GUI in one central place, which is very nice.
I’m on Fedora Kinoite right now, which really encourages you to use Flatpaks, and the only software I’ve installed through the CLI is dev tooling; it would be disingenuous to claim that stops casual Windows users, since they are very unlikely to need Rust, Neovim, various C/C++ libs, etc.
On many popular distros there are graphical apps preinstalled for that. The distribution maintainers have repositories with common packages to make it so that you can open an app store and install programs from one place rather than going to different websites and downloading installers.
Honestly, I’d rather use the terminal to install software. Most of the time, it’s actually far fewer steps than clicking through several screens, on top of having to find the application installer you downloaded.
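For comparison, the terminal path really is short. A sketch, assuming Flathub is already configured (the app ID is Spotify’s real Flathub ID, but what’s actually available depends on your distro):

```shell
# Install Spotify from Flathub in one command -- no installer hunting,
# no clicking through setup screens:
flatpak install -y flathub com.spotify.Client

# Or install from the distro repos on an apt-based system:
sudo apt install vlc
```

Both commands also pull in dependencies and register the app with the desktop menu, which is the part the download-an-installer workflow makes you do by hand.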
IMO one of the main problems is letting go of the workflow of older commercial operating systems and having to build new habits on a new system. There are various Linux-based distributions that manage to give the user everything they need without having to resort to the terminal.
Creating a new habit after spending years developing one for an old system, for me, is the main problem that leads many users to leave it.
Most folks have been sold a story that every new technology they start using is supposed to be “intuitive”; and that if it is not “intuitive” then it must be defective or willfully perverse.
For example, novice programmers often stumble when learning their second or third language, because it differs from their first. Maybe it uses indentation instead of curly braces; maybe type declarations are written in a different order; maybe it doesn’t put $ on its variables; maybe capitalization of identifiers is syntactically significant.
And so they declare that Python is not “intuitive” because it doesn’t look like C; or Go is not “intuitive” because it doesn’t feel like PHP.
It should be obvious that this has nothing to do with intuition, and everything to do with familiarity and comfort-level.
Commercial, consumer-oriented technology has leaned heavily into the “intuitive” illusion. On an iPhone or Windows, Android or Mac, you’re supposed to be able to just guess how to do things without ever having to confront unfamiliarity. You might use a search engine to find a how-to document with screenshots — but you’re not supposed to have to learn new concepts or anything. That would be hard.
That’s not how to learn, though. To learn, you need to get into unfamiliar things, recognize that they are unfamiliar, and then become familiar with them.
Comfort-level is also important. It sucks to be doing experimental risky things on the computer that’s storing your only copy of your master’s thesis research. If you want to try installing a new OS, it sure helps if you can experiment with it in a way that doesn’t put any of your “real work” at risk. That can be on a spare computer, or booting from a USB drive, or just having all your “real work” backed up on Dropbox or Google Drive or somewhere that your experimentation can’t possibly break it.
It should be obvious that this has nothing to do with intuition, and everything to do with familiarity and comfort-level.
Not to be petty, but I think “intuitive” is not that different from “familiar”.
I mean, the problem is in using the word “intuitive” when selling something in the first place. User interaction involves a ton of things, large and small, and the intuitive parts are rarely noticed. Such a promise is likely going to lead to disappointment.
Adapting to these small differences is a skill in itself.
I’m particularly amused by the pro-NVIDIA “it just works” comments. Compared to what exactly? With AMD, the 3D acceleration driver is bundled directly into Mesa, so it’s already working before the first boot of almost all desktop distros. That’s how drivers are supposed to work on Linux, and it has taken NVIDIA 10+ years (and counting…) to get with the basic program.
I applaud the long overdue decision to move their proprietary firmware directly onto the card and to make the rest of the kernel driver open source, but I’ll remind you folks of a few things:
The open-source driver is still in alpha with no timeline for a stable release
NVIDIA has so far elected to control their own driver releases instead of incorporating 3D acceleration support into Mesa
NVIDIA had to be dragged kicking and screaming to go this far, and they’re still not up to scratch. There’s still plenty of fuel left in the “Fuck NVIDIA” gas tank.
I’m messing with shitvidia now on a new AAA laptop after people said it just works. I just spent all day trying to set up EFI keys for Secure Boot because shitvidia doesn’t sign their kernel driver modules. Plus their drivers are outdated and their documentation is terrible. I failed today because Gigabyte is another shit company that has a proprietary (theft) bootloader set up so that no one can lock UEFI Secure Boot with any PK except theirs. I can boot with Fedora’s Microsoft-signed key and Secure Boot enabled, but then I can’t use the goddamn GPU I bought the piece of shit for in the first place. Shitvidia will always be shitvidia. This proprietary bullshit is straight-up theft. It should be illegal to sell anything with digital restrictions of any kind. Dealing with all this, I think Stallman was being conservative, if anything. Fuck these criminals.
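For the record, the usual workaround for unsigned out-of-tree modules under Secure Boot is enrolling your own machine-owner key (MOK) and signing the module yourself. A sketch only — the header path shown is the Fedora layout, and the module lookup assumes the driver is installed as `nvidia`; adjust both for your distro:

```shell
# 1. Generate a signing key pair (private key + DER-encoded cert).
openssl req -new -x509 -newkey rsa:2048 -nodes -days 36500 \
    -keyout MOK.priv -outform DER -out MOK.der \
    -subj "/CN=Local module signing/"

# 2. Enroll the cert. mokutil asks for a one-time password, and you
#    confirm the enrollment in the blue MokManager screen on next reboot.
sudo mokutil --import MOK.der

# 3. Sign the out-of-tree module. The kernel headers ship sign-file;
#    this path is Fedora's layout (Debian/Ubuntu use /usr/src/linux-headers-*).
sudo /usr/src/kernels/"$(uname -r)"/scripts/sign-file sha256 \
    MOK.priv MOK.der "$(modinfo -n nvidia)"
```

None of this helps, of course, when the firmware refuses to let you enroll a PK in the first place, which is the Gigabyte complaint above.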
Your other option would be to use an AMD iGPU, because good luck finding an AMD discrete GPU in a notebook these days. And even then, you’d just be “messing with shit-amd”.
The problem is that Nvidia’s software stack is much more advanced. For machine learning acceleration, for example, CUDA is miles better than ROCm and far more widely supported. I wish AMD were more serious about GPUs and made greater strides, but they overslept and let Nvidia become a de facto monopolist with their anti-competitive, anti-consumer strategies and closed-source stack.
I’m particularly amused by the pro-NVIDIA “it just works” comments. Compared to what exactly?
Compared to nothing. I have used Nvidia graphics cards under Linux for many years. The last one was a GTX 1070. In order for the cards to work, I had to install the driver once with the command pacman -S nvidia-dkms. So the effort was very small.
By the way, I am currently using a 6800 XT from AMD. I therefore don’t want to defend Nvidia graphics cards across the board.
Unfortunately, when it comes to Nvidia, many people do not judge objectively. Torvalds’ “fuck you”, for example, referred to what he saw as Nvidia’s lack of cooperation with the kernel developers. And I think he was right. But it was never about how well or badly the graphics cards actually work under Linux, which, unfortunately, is what many Linux users claim, be it out of ignorance or on purpose.
Since then, some things have changed and Nvidia has contributed code to several projects like Plasma or Mesa to improve the situation regarding Wayland.
Compared to nothing. I have used Nvidia graphics cards under Linux for many years. The last one was a GTX 1070. In order for the cards to work, I had to install the driver once with the command pacman -S nvidia-dkms. So the effort was very small.
Kernel modules work until they don’t. I’m genuinely glad that you’ve had a good experience and – despite appearances – I’m not interested in provoking a vendor flamewar… but the fact remains that among the three major patterns (builtin, userland, module), modules are the most fragile and least flexible. I’ll cite this response to my parent comment as an example.
Unfortunately, when it comes to Nvidia, many people do not judge objectively. Torvalds’ “fuck you”, for example, referred to what he saw as Nvidia’s lack of cooperation with the kernel developers. And I think he was right. But it was never about how well or badly the graphics cards actually work under Linux, which, unfortunately, is what many Linux users claim, be it out of ignorance or on purpose.
That’s a fair point, but to a certain extent I think this overlooks the importance of developer sentiment on a project like Linux. Take (Intel) Macbooks as an extreme example: kernel developers liked the hardware enough to support it despite utter vendor indifference. It’s clearly a case of hypocrisy compared to NVIDIA who (at the very least) participates, but at the end of the day people will show love for the things that they love. NVIDIA remains unloved and I do feel that this bleeds through to the user experience a fair amount.
In any case, you’re right to say that legitimate criticisms are often blown out of proportion. Developer problems aren’t necessarily user problems, even if we sometimes romanticize otherwise.
You basically have two options: suffer on Nvidia, because some features may never get developed, or suffer on AMD, because the developed features just straight up don’t work.