I initially got a Z80-MBC2, a Z80-based SBC that runs CP/M and other operating systems, because I had developed an Intel 8080 cross-assembler and wanted to run the code it assembled on actual hardware. It was so much fun that I got a V20-MBC, an SBC by the same maker that features a NEC V20 (an 8088 with an 8080 emulation mode) and can run CP/M-86.
Both SBCs led me down a fascinating retrocomputing rabbit hole.
This sounds like an Ubuntu problem, sadly. Ubuntu is, in my experience, a mess of a distribution. Debian works almost flawlessly, and I think you’ll have fewer issues with a properly run distro.
The only difference between those two versions of linux is that the new one was built with a newer version of gcc. That doesn’t really narrow the problem down, though. As far as I’m aware, emergency mode is caused by either a kernel panic or a failure to mount a needed filesystem. I’m leaning towards a corrupted kernel, since it doesn’t sound like you changed your fstab or had any problem mounting /. I would run fsck -f on your boot partition, then try to re-download and reinstall the new package.
If that doesn’t work, you can add IgnorePkg = linux linux-headers to pacman.conf so you can update without installing the broken package until you resolve the underlying issue. Or you can install a different kernel altogether.
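For reference, a minimal sketch of what that pacman.conf change might look like (the package names assume the stock kernel; adjust them if you run a different one — the IgnorePkg line belongs in the existing [options] section):

```ini
# /etc/pacman.conf
[options]
# Hold back the broken kernel packages until the underlying issue is fixed
IgnorePkg = linux linux-headers
```

Remember to remove the line again once a fixed kernel package lands, or you’ll silently stay on the old version forever.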
As for preventing problems in the future, there’s only so much you can do. Check archlinux.org before updating to see if anything requires manual intervention, and pay close attention while running pacman in case something goes wrong. You already seem to know the most important part, which is to keep a set of packages that are certain to work, so you can easily downgrade if a crash does happen.
Check archlinux.org before updating to see if anything requires manual intervention
informant in the AUR can also be used to halt updates and automatically show you messages from the site. Then you can just run the update again and it will go through.
Ran sudo pacman -Syu; sudo pacman -Syy like I do every few days
Syy forces a refresh of the package databases even if they are already up to date.
In my opinion, this makes no sense, especially right after you have already run pacman -Syu. All it does is generate additional, unnecessary traffic on the mirror you are using. pacman -Syu is normally all you need.
The journal was really long so I moved past it
The systemd journal can be easily filtered. For example, journalctl -p err -b -1 shows all entries from the previous boot at priority error or higher (error, critical, alert, and emergency).
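A few variations on that command, in case they help when digging through a failed boot (these query the live journal on a systemd machine, so they only make sense run there):

```shell
# Errors and worse from the previous boot:
journalctl -p err -b -1

# The same for the current boot, or follow new messages live:
journalctl -p err -b
journalctl -p err -f

# List all recorded boots so you can pick the right -b offset:
journalctl --list-boots
```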
Has anyone else ran into this issue when updating?
Not me, but other users have. Some of them are even on distributions other than Arch (or distributions based on it). Looking at the reports, the current kernel seems to be quite a minefield as far as problems are concerned.
Any advice for preventing future crashes or issues like this so I don’t fear updating?
As other users have already recommended, you could additionally install the LTS kernel. And if you use Btrfs as your file system, create snapshots before an update (wiki.archlinux.org/title/snapper#Wrapping_pacman_…).
And it should be obvious that important data should be backed up on a regular basis.
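As a hedged sketch of the snapshot idea: one common way to wrap pacman with snapper is the snap-pac package, which installs pacman hooks that take pre/post snapshots around every transaction (this assumes Btrfs and a snapper configuration named "root" are already set up):

```shell
# Install the pacman hooks that snapshot before/after every transaction
pacman -S snap-pac

# After a bad update, inspect the snapshots and revert the changes between
# the pre- and post-update pair (the numbers 123..124 are placeholders):
snapper -c root list
snapper -c root undochange 123..124
```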
I guess that’s a bit better than the original command in question. But from what I understand it’s still unnecessary and there is simply no need to force the refresh. A regular pacman -Syu is all you need and will refresh all databases that need it.
Maybe rebuilding the ramdisk failed during the original upgrade?
One of the post-install steps after upgrading the kernel is rebuilding the initramfs, a tiny environment for bootstrapping the main OS.
If you trigger it manually with mkinitcpio --allpresets, you’ll notice it has fancy colorful output with clearly visible warnings/errors. However, when invoked as part of an upgrade, this coloring is removed, making errors difficult to spot.
I’ve had this stage randomly fail a few times, resulting in an unbootable system like the one you described; the solution was to just trigger a manual rebuild, or reinstall the kernel with pacman -U.
It’s possible that this is what actually fixed things when you downgraded the kernel.
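In concrete terms, the manual rebuild and the cached-package reinstall described above might look like this (the exact package filename under /var/cache/pacman/pkg is an example — adjust it to whatever version is actually in your cache):

```shell
# Rebuild the initramfs for all installed kernel presets
# (-P is the short form of --allpresets; watch for warnings/errors)
mkinitcpio -P

# Or reinstall the kernel, which re-runs the initramfs hook:
pacman -S linux
# ...or, from the local package cache, if the mirror is the problem:
pacman -U /var/cache/pacman/pkg/linux-6.9.7.arch1-1-x86_64.pkg.tar.zst
```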
I think so. In some cases the flatpaks are prepared by the developers themselves. This isn’t in itself a sign of trustworthiness, but if a dev were to sneak malicious code in somewhere and it were found out… Well, the internet is the courtroom, and the public the jury, right?
But, it is a piece of software, and you never know what one little dependency can do. Same can be said about repos.
I think it’s just a general fear of the command line. I have a friend who has always owned a Mac and started using it for his programming course. While assisting him with compiling programs or using something like git from the Mac’s zsh terminal, I could tell it was a stressful experience for him, even though all I told him to enter were simple commands like ls, mkdir, g++, etc.
I have a machine that runs Fedora with no trouble at all. I never needed to debug anything; multiple monitors and sound outputs all work. But every once in a while, something happens that can only be solved through the command line, because Linux simply does not have a settings utility as extensive as the Windows Control Panel. It’s fine for me, but telling that friend to bring up the terminal and enter a cryptic line will probably freak him out.
Simple tasks can take way more time than they should. For example, I have an old laptop running BunsenLabs (based on Debian, with Openbox). The other day I wanted to connect a secondary monitor, and I wasn’t expecting the nightmare it took to set the thing up. The layout was totally off, with a dead space between the two screens where the cursor disappeared, and ArandR was very rough to use. I ended up editing a text file, if I remember correctly.
I absolutely love Linux, but this kind of thing happens quite regularly, to be honest.
Needlessly reductionist, but also wrong. If your code is proven to work (as in, machine-verified), and you use a compiler that is also verified to generate correct code, then that code is secure.
Have you tried Manjaro Sway? Felt much more polished than i3. Sadly, barrier isn’t working and that was a dealbreaker for me. But you should give it a try.
I did actually, on a bootable USB, but all the preinstalled tools, like the network manager in the task bar, were text-based, which really put me off. I got the sense it needed a ton of configuration to become fully featured.
I’m a DevOps engineer, so I understand Linux well. I actually used Linux exclusively all throughout university.
Linux works just as well as Windows for 98% of my use cases. And for the 2% where it doesn’t, I can probably figure out how to get it to work, or find an alternative.
But honestly, I usually just don’t want to anymore. After working 8 hours, I’m very seldom in the mood for more debugging, so I switch to Windows more and more frequently.
If this is my experience as someone who understands it, most normies will just fuck off the moment the first program they want to run doesn’t.
That’s part of why I don’t use Linux, outside of my Steam Deck, which I rarely take out of game mode, so it doesn’t even count. I just want my shit to work and not worry about compatibility or “figuring it out”. I feel like if I had used it at a younger age I’d be more fine with it, but I just can’t be bothered, tbh.
I work in devops as well and while Windows is easier and more convenient for many things, some processing-heavy tasks are better left to Linux. Doing generative AI stuff, for example, I don’t want to be loading a bulky OS on top of the task at hand.
I thought about dual booting, but it would make multitasking nearly impossible. So, instead, I’m using Linux whenever possible and I have a Windows VM I can enter at a moment’s notice or hibernate if I need the resources. And then there’s the MacBook, but we don’t talk about the MacBook.
My experience is the opposite, but the same. I have been a sysadmin for 15 years, mostly in Windows- and Microsoft-only environments. All my work tools are on Windows.
I actually boot into Linux when I’m not supposed to be working, since otherwise I get anxiety or dread and end up opening Teams, Outlook, N-central, PRTG…
Also why I enjoy my switch. Can’t really do projects on it like I can on Linux, but I also am switched off from work.
I’ve been doing Linux since the early days, when Slackware fit on a “few” floppy disks and you had to configure the low-level CRT display timings in a text file to get X Windows to work. Throughout my career I have used Linux abundantly, at one point even designing distributed high-performance software systems on top of it for investment banks.
Nowadays I just don’t have the patience to solve stupid problems that are only there because some moron decided that THEY, after two bloody decades of it working fine, truly have the perfect way to do something that’s been done well enough for years, and decided to reinvent it. (It’s the kind of Dunning-Kruger-level software design “expertise” that is oh so common at the mid-point of one’s software development career, and that regularly produces, among other things, “yet another programming language where all the old lessons about the efficiency of the whole software development cycle and maintainability have been forgotten”.) So now, instead of one well-integrated, well-documented solution, there are these pieces of masturbatory “brilliance” barely fitting together, with documentation that doesn’t even match them properly.
Just recently I got a ton of headaches merely getting language and keyboard configuration working properly in Armbian. There was zero explanation of the choices and their implications; thousands of combinations of keyboard configurations (99.99% of which are never used, or don’t even exist) were ordered alphabetically by almost-arbitrary names across two tables, with no emphasis on the commonly used ones (clearly every user is supposed to be an expert on the ins and outs of keyboard layouts). There were multiple tools, most of which didn’t work (some failing immediately due to missing directories, others failing after a couple of minutes, others only affecting X), and whatever documentation existed (online and offline) didn’t actually match any of it.
(It’s funny how the “genius” never seems to extend to writing good documentation or even doing proper integration testing.)
Don’t get me wrong: I see Software Architecture-level rookie mistakes all the time in the design of software systems, frameworks, and libraries in the for-profit sector (“Hi, Google!!!”), but they actually seem to be more frequent in Open Source, because the barrier to entry for reinventing the wheel (again!) is often just convincing a handful of people with an equally low level of expertise.
I think I am more than happy with the OS. The bummer is that many of the alternative applications do not have feature parity. The more you try to mimic the Windows workflow, the more you’ll burn out with minimal results. I’ve come to terms with it and just run a VM in GNOME Boxes for MS Office, Tableau, and other stuff. That said, if I want something that could be done programmatically, I’ll definitely try a CLI solution first, though that can’t be the same pro for everyone.
My first experience with Linux was Ubuntu. Sue me: it was listed in most “most user-friendly distro” listicles back when I wasn’t smart enough to realize those were mostly marketing.
It worked fine for my purposes, though it took getting used to, but it would wake itself up from sleep after a few minutes. I would have to shut it off at night so that I wouldn’t wake up in a panic as an eerie light emanated through the room from my closed laptop. I did my best searching for the problem, but could never find a solution that worked; in retrospect, I probably just didn’t have the language to adequately describe the problem.
Nothing about the GUI was well-documented to the degree that CLI apps were. If I needed to make any changes, there would be like one grainy video on youtube that showed what apps to open and buttons to click and failed to solve my problem, but a dozen Stack Exchange articles telling me exactly what to do via the terminal.
I remember going off on some friends online when they tried to convince me Linux and the terminal were superior. I ranted about how this stupid sleep issue was indicative of larger, more annoying problems that drive potential users away. I raged about how hostile this esoteric, nerds-only UX is to users. I cried about how Windows could be better for everyone if the most computer-adept people would stop jumping ship for mediocre OSes.
Within a year of that hissy fit I met another friend who used Arch (btw), and she fixed my laptop within minutes. Using a CLI app, no less. I grumbled angrily to myself.
A few years later and everyone’s home all the time for some reason, and I get the wild idea that I’m going to be a(n ethical) hacker for whatever reason. I then proceeded to install Kali on a VM and the rest is history.
The point being that some people labor under the misguided belief that technology should conform to the users, and because we were mostly raised on Windows or Mac, we develop the misconception that those interfaces are “intuitive” (solely because we learned them during the best time in our lives to pick up new skills). Then you try to move to Linux for whatever reason, everything works differently, and the process is jarring and noticeably requires the user to conform to the technology, i.e. changing bad habits learned from other OSes to fit the new one. The lucky few of us go on to learn many other OSes and start to see past the specifics to the abstract ideas common to all of them. After that, it doesn’t matter if you have to work with iOS or TempleOS: you understand the basics of how it all fits together.
TL;DR Category theorists must be the least frustrated people alive
Category Theory is an attempt to understand all of math (including computer science) as simply different instances of abstract concepts called categories. The way I’ve managed to understand OSes as abstract systems, rather than entirely unique beasts, is how I imagine category theorists must see all of computer science.
It’s a freeing paradigm shift once you realize that your understanding is broad enough to transfer your knowledge from one OS to another. The joke, then, is that since category theorists have the broadest knowledge of all, they must deal with the least frustration when learning a new system.