Ignore everyone here saying fix Ubuntu and try Fedora Kinoite (or Silverblue). Bazzite is probably great too if you are gaming but I haven’t tried it.
I finally tried Fedora Kinoite after years of Ubuntu (and related distros) and I genuinely wish I had tried it sooner. Everything just works. I cannot recommend it enough. It’s what I always wanted Linux to be.
it’s kinda the fire-and-forget of OSes. you just press the update/upgrade button when unattended upgrades didn’t catch everything, and it just works, for free, forever.
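For anyone curious what that looks like on the Debian/Ubuntu side, the unattended-upgrade mechanism mentioned here is usually switched on like this (a command sketch, not a full guide; these are the stock Debian package and file names):

```shell
# Enable automatic background upgrades on a Debian/Ubuntu-family system.
sudo apt install unattended-upgrades
sudo dpkg-reconfigure -plow unattended-upgrades  # writes 20auto-upgrades

# The resulting flags that turn the periodic job on:
cat /etc/apt/apt.conf.d/20auto-upgrades
# APT::Periodic::Update-Package-Lists "1";
# APT::Periodic::Unattended-Upgrade "1";
```

Fedora Atomic spins handle this differently (rpm-ostree staging updates in the background), but the end-user effect is the same: updates arrive without babysitting.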
So it has auto updates enabled? Windows, macOS and a ton of other Linux distros do that as well.
I think it’s more that Ubuntu is (one of the) most used desktop Linux OSes, so a lot of corporations and individuals who like to play it safe just go with that
From my perspective, if used for work, automatic security updates should be mandatory. Linux is damn impressive with live patch. With thousands or even tens of thousands of endpoints, it’s negligent to not patch.
Features? Don’t care. But security updates are essential in a large organisation.
The worst part of the Linux fan base is the users who hate forced updates and also don’t believe in AV. Okay, on your home network that’s not very risky. But compare that to a corp network with a million students’ and staff members’ personal information, often with BYO devices only a network segment away, and APT groups targeting you because they know your reputation is worth something to ransom.
They probably have been using it for years. I’ve been using Ubuntu as my main Linux distribution for more than a decade now, since I have work to do and I’ll get to doing work faster in Ubuntu than in any other distribution.
Why did I start with Ubuntu? 10+ years ago Ubuntu was light-years ahead in community support for issues. Again, I had work to do; I wasn’t a hobbyist playing “fuck windows”.
In fact, look at things like ROS, where you can get going with “apt install ros-noetic-desktop” and build your robotics stuff instantly. Every dependency to start, and all the other tooling, is there too. Sure, a bunch of people would now say “use nix”, but my autonomous robotics project doesn’t care; I am trying to get lidar, cameras, motors, and SLAM algorithms to work. I don’t want to care or think about compiling ROS for some Arch distribution.
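For context, the quick start being described really is this short (assuming the ROS package repository is already configured per the ROS wiki; Noetic targets Ubuntu 20.04):

```shell
# Install the full desktop variant of ROS Noetic and start working.
sudo apt install ros-noetic-desktop
source /opt/ros/noetic/setup.bash   # add to ~/.bashrc to get it in every shell
roscore &                           # ROS master is up; nodes can now connect
```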
I won’t say I don’t dabble with other distributions but if I’ve got work to do, I’m going to use the tools I already know better than the back of my hand. And at the time, when selecting these tools, Ubuntu had it answered and is stable enough to have been unchanging for basically a decade.
Oh and if I needed to, I could pay and get support so the CEO can hear that risk is gone too (despite almost every other vendor we pay never actually resolving an issue before we find and fix it… Though I do like also being able to say “we have raised a ticket with vendor x and are waiting on a reply”).
I think your first point is the main thing Ubuntu has to thank for its popularity; 10-15 years ago it was (one of) the best desktop Linux OSes, and people used to its workflow will keep using it as there’s no imminent reason to switch to whatever new thing just came out
Inertia is just a sign of maturity. It’s fine. Nothing wrong with it. Especially when the new stuff is happening alongside it. In 10 years there may be people asking why you’re using arch or nix, when whatever new thing is superior. But it’ll just be proof that nix can run in production for 10+ years.
Most organizations care about maintaining document compatibility, especially formatting, and that usually means Office365. Microsoft is notorious for publishing a standard and then ignoring their own standard, making it exceedingly difficult to use other office suites.
I’ve heard OnlyOffice does the best at maintaining compatibility.
Sorry to clarify: updates come as security or as feature updates. If I’ve already got a standard operating environment (SOE) with all the features I/staff need to do work, I don’t need new features.
I then have to watch cves with my cve trackers to know when software updates are needed and all devices with those software get updated and the SOE is updated.
I could go on a rant about how the Linux kernel has recently made my life harder: someone’s policy now is that any kernel bug might be a security vulnerability, so I have infinite noise in my CVE feed, which in turn makes deciding how to mitigate security issues hard. But that is beyond this discussion.
So in short, I’m only talking about updating just the security fixes, not the software and features. Live patching security vulnerabilities is pretty much free: low effort, low impact, and in my personal opinion absolutely critical. But feature patching can be disruptive, offers little to be gained, and should really only be driven by a request for that feature, at which point it would also include an update to the SOE.
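That security-only split maps directly onto the stock unattended-upgrades config on Debian/Ubuntu: the shipped 50unattended-upgrades file lets you allow only the security pocket while feature updates stay commented out (excerpt of the stock file):

```
// /etc/apt/apt.conf.d/50unattended-upgrades (excerpt)
Unattended-Upgrade::Allowed-Origins {
        "${distro_id}:${distro_codename}-security";
//      "${distro_id}:${distro_codename}-updates";  // feature updates stay manual
};
```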
It was one of the first polished desktop Linux systems, and even though it’s been enshittified recently, it holds its popularity due to its long-standing status as “THE Linux desktop”
Windows, macOS and a ton of other Linux distros do that as well.
First of all, Windows and macOS are not free. They cost extra money, sometimes hidden in the PC’s cost when pre-installed. When they do a major update, like Win10 to 11, you are at their mercy as to whether your license can be used to upgrade. Often it can, but sometimes your PC is not “Windows 11 ready” or whatever, and then you get updates for your old system for a few more years until they drop you like a hot potato and throw you to the malware wolves.
Additionally, in Windows the automatic updates are just for the OS itself and some apps from its store. A few apps like Chrome and FF install their own extra update service on top. A lot of other programs check for updates individually or some not at all and often you have to download and run their installer for every update. Idk how it is in macOS tho. Haven’t used it in years.
Yes, a ton of other Linux distros also have background unattended-upgrade or similar. However, the people who choose Ubuntu over those are usually looking for a quick solution that almost always just installs without problems. They usually don’t have time or patience for any complications, however small. So they choose the fire-and-forget Linux and additionally have a greater chance of finding a fix or help in the super rare case it doesn’t work, because the bigger user base increases the likelihood someone else is familiar with or has info on that exotic issue.
macOS is mostly the same as Windows in terms of updating Applications.
The App Store is more prevalent than Microsoft Store, but you can still download an executable for most programs from the browser. Installing is a bit different since you drop the file into the app folder instead of actually having an installation executable.
Then there is Homebrew, an unofficial package manager, which I use for everything when it’s available (which is almost all the time)
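The day-to-day Homebrew workflow is just this (a sketch; jq is only an example formula):

```shell
brew install jq        # install a formula from the default tap
brew upgrade           # upgrade everything Homebrew manages
brew list --versions   # see what's installed, and at which versions
```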
If your bash script gets longer than 200 lines (including argument handling), use Python. I have to support bash APPLICATIONS at work and it’s a fucking nightmare to maintain.
I would then assume those scripts weren’t written properly to begin with.
But yes, shell scripts should (normally) be used to automate some simple tasks (file copying, backups…) or as a wrapper to exec some other program. I’ve written several shell scripts to automate things on my personal machines.
However, a shell script can be a complex program while at the same time being (somewhat) easy to maintain:
functions, use functions, a lot
comment every function and describe what it expects on stdin or as arguments
also comment what it outputs or sets
This way at least I don’t break my scripts when I need to modify a function or extend my scripts in some way. Keep the UNIX philosophy inside shell scripts: let one function do one thing well.
And of course: YMMV. People have vastly different coding standards when it comes to personal little(?) projects.
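As a tiny illustration of the conventions above (the function name is made up), here’s the kind of documented, single-purpose function that keeps a script maintainable:

```bash
#!/bin/bash

# slugify: reads a title on stdin, writes a lowercase dash-separated
# slug on stdout. Sets no globals, takes no arguments.
slugify() {
    tr '[:upper:]' '[:lower:]' | tr -cs 'a-z0-9' '-' | sed 's/^-//; s/-$//'
}

slugify <<< "Hello, World"   # prints: hello-world
```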
I had several tests at the beginning of the script. These tests define the “low-level” functions based on the capabilities of the shell. To test new features I “simply” ran all the necessary commands on the test environments (bash, busybox, toybox+mksh).
The script would error out if some necessary capability was missing from the host system. It also had a feature to switch shell if it found a better one (preferring busybox and its internal tools).
Yeah… It was a tedious process. It was one of those “I’ll write a simple script. So simple that it’ll work on almost every POSIXy shell.”… The rest is history.
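The probing described above might look something like this (a sketch; the function names are invented). Each capability is tested by actually running it, and the “low-level” function is then defined accordingly:

```sh
#!/bin/sh
# have_cmd: succeeds if a command exists in PATH.
have_cmd() { command -v "$1" >/dev/null 2>&1; }

# Probe a shell feature by running it in a subshell:
if ( eval 'f() { local x=1; }; f' ) 2>/dev/null; then
    HAVE_LOCAL=yes
else
    HAVE_LOCAL=no
fi

# Define the low-level function based on what the host actually has,
# and error out if no implementation is possible.
if have_cmd sha256sum; then
    hash_file() { sha256sum "$1" | cut -d' ' -f1; }
elif have_cmd shasum; then
    hash_file() { shasum -a 256 "$1" | cut -d' ' -f1; }
else
    echo "error: no sha256 tool found" >&2
    exit 1
fi
```

Switching shells is the same idea one level up: if `busybox` is found and you’re not already running under it, `exec busybox sh "$0" "$@"` (with a guard variable so you only re-exec once).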
It's probably like the US military and their missile silos still using floppy disks. Better to keep a time-tested and very familiar system running a critical operation than a new one with a bunch of unknowns. Or like when you go to the bank, and the screen the teller is looking at is just a front end going through a dozen different layers with COBOL code written by long dead or retired people on a mainframe at the other end.
Us end users with very low risk can afford to continuously live on the bleeding edge.
Just FYI, that is not Fedora Workstation, that’s a Fedora Atomic spin, which is an immutable OS. Installing packages and updating work a bit differently than on a normal distro.
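A rough sketch of what “a bit differently” means in practice on the Atomic spins (these are the standard rpm-ostree subcommands; htop is just an example package):

```shell
rpm-ostree install htop    # layers the RPM onto the image; applies after reboot
rpm-ostree upgrade         # pulls and stages a new base image atomically
rpm-ostree rollback        # boot back into the previous deployment
flatpak install flathub org.mozilla.firefox  # GUI apps usually come via Flatpak
```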
I tried that extension but it is greyed out on my installation. Besides, it handles accompanying PDF files, not the site URL directly, if I understand it correctly
When you find an interesting article through Google Scholar, the arXiv or journal websites, this browser extension allows you to add those references to JabRef. Even links to accompanying PDFs are sent to JabRef, where those documents can easily be downloaded, renamed and placed in the correct folder.
I use jabref and this extension quite heavily. I can assure you that it does send the URL to jabref; it gets added as a Misc reference with the site URL in the optional fields. On my firefox / windows system it does show as greyed out in the plugins menu like you say; however, it adds a jabref logo in the address bar which can be clicked (or alt+shift+j) to send to jabref.
I just tried it on my linux system though, and it doesn’t work for me, either. Suspect some sandboxing weirdness because I have jabref as a flatpak but firefox running natively. I’m just coming back to linux from a few years hiatus so I’m hoping someone better than me at this can check in.
Jabref does have some troubleshooting steps for their extension that might be worth trying though, depending on your install.
Thank you! I’ll check it out later again. I’ll try using distrobox or nix
edit: I installed firefox and jabref with nix and it works out of the box. I didn’t have to adjust anything, though the extension sometimes takes very long to load. Sometimes it can’t find anything.
I’m not familiar with jabref, so I probably shouldn’t stick my oar in here. But I will anyway. I’ve written my own that I use for lists like Web bookmarks. It’s a poor man’s database manager, so you can add attributes to bookmark entries and sort and search on those attributes.
Tbh, I wonder why I don’t simply use a CSV or WYSIWYG markdown table. In the end jabref provides a table view of a list, which could be converted. I’ll either use this or yours, so thank you very much in advance!
I use wallabag. There is paid hosted version, but you can install it on your server. You can tag, star and mark read/unread your bookmarks. There is a webapp, browser extensions, mobile apps for all platforms, and apps for ebook readers.
Recently switched to bluefin from workstation. I was initially a bit held back by all of the GNOME customisations, but they’re pretty straightforward to revert back to default. While I like the idea of automatic updates, it would be nice if it integrated with GNOME Software to make it easier to control. Otherwise, if you’re looking for an immutable/atomic desktop and want it to pretty much work out of the box, I would highly recommend it
Yeah I use silverblue on another computer and previously on this one, but the killer feature of bluefin is that NVIDIA drivers and codecs are built right into the image (as with the other ublue images) meaning that you don’t need to layer them and risk a bad upgrade. I’m planning on bringing the other computer over as well even though it’s AMD, at least I’ll get ROCm and the codecs.
I don’t get how people manage to spend so much time keeping arch running. I used it on my laptop for a few years and it just worked?? It was like the easiest to maintain distribution I’ve used other than immutable ones. The only real problem I ever had was accidentally interrupting pacman during a kernel update and ending up without a kernel, but that was always like a 2-minute fix
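For the record, that 2-minute fix for an interrupted kernel update is roughly this, from an Arch live USB (the device names are examples for a typical layout):

```shell
mount /dev/sda2 /mnt        # root partition
mount /dev/sda1 /mnt/boot   # boot/ESP, if separate
arch-chroot /mnt
pacman -S linux             # reinstall the kernel; hooks regenerate the initramfs
exit
reboot
```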