I’m so confused, why does there need to be a daemon that creates /home? Can’t you just make it at install time and assume it’s always there? Is this made for ramdisk / immutable distros or something?
Things like this are why I don’t put systemd on my machines. It’s too complicated for me. Too many things going on. I’ve moved away from mac os to linux specifically to avoid weird over-engineered solutions, I want to be able to understand my system, not just use it!
EDIT:
SystemD/Linux
We’re not there yet with systemd, but I would argue that Alpine Linux qualifies as “busybox/Linux” lol. It’s literally just the kernel, busybox, openrc, and a package manager stapled together. It’s so minimalist that it barely even exists! I love that distro so much!
Honestly, the only reason I’m not using a non-SystemD distro is that this is my first time actually going all in, and the larger communities help with issues, plus I’m trying to force myself to learn it since it seems like it’s not going away
But yeah, I’m not a fan.
Working through a networking issue right now, and the layers of obfuscation SystemD adds, especially with JournalD, leave me not really sure where to even look
It is tempting to say screw it and load up Gentoo on my desktop though
Gentoo seems fun, I wanna try it some day. I would also recommend Void if you’re looking for a distro with a boring old binary package manager (it’s what I use on my laptop). Although the package list in Void is rather barren, I would recommend installing Flatpak to help fill in the gaps for some of the missing packages. There’s also Alpine if you wanna go balls deep into the minimalism rabbithole. What makes Alpine so difficult is that it’s a musl libc distro, so anything that needs glibc (i.e. any “serious” gui application) needs to run through a compatibility layer like gcompat or flatpak. Void is available in both glibc and musl libc flavours.
The community aspect can definitely be a big hurdle. Most of the time if you search for something like “&lt;description of your issue&gt; ubuntu”, you can more or less blindly copy-paste the commands from the first result and it will work. With niche distros, you have to be able to interpolate instructions aimed at other distros and actually understand what you’re doing. That’s why I would never recommend a non-systemd distro to someone who’s new to linux.
By the way, what’s your network issue? I’m no expert, but maybe I can try to help?
It seems to be an issue with using a 5.8 gigahertz WiFi endpoint, which had worked fine up until a couple days ago when it started dropping packets going outside my local network. I could watch a continuous ping start failing for a couple minutes while using Synergy to control my laptop, which was connected to my work VPN without issue. So it only seemed to be a problem routing outside my network, which is really weird. Switching to the 2.4 gigahertz channels seems to have fixed it entirely.
What I need to do is look up the JournalD commands to be able to read the logs properly and find what I’m after… Might also spin up a VM to see if it drops out at the same time; it would be interesting if the VM still works while the host is dropping packets…
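If it helps, the journalctl incantations for this kind of digging are pretty short. A sketch (the driver name and unit name here are guesses, swap in whatever your system actually uses):

```shell
# Kernel messages from the wifi driver since this boot
# ('iwlwifi' is a guess; check lspci/lsmod for your actual driver)
journalctl -b -k --grep 'iwlwifi|wlan' --no-pager || true

# Logs from whatever manages your network (NetworkManager assumed here)
journalctl -b -u NetworkManager --since '1 hour ago' --no-pager || true

# Everything at warning level or worse since boot, most recent last
journalctl -b -p warning --no-pager | tail -n 50
```

`-b` limits output to the current boot, which cuts the noise way down when you’re chasing something that started recently.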
So switching to a slower wifi AP causes packets destined for outside of your network to not be dropped? That sounds like one of those cursed issues that’s a complete nightmare to track down lol. Maybe the faster speed of the 5.8ghz network is causing your router to get overwhelmed or something? Does the same issue happen if you connect via ethernet? I don’t really know what else can cause this, I hope you can get it fixed!
Since my other systems were unaffected, I’m pretty sure it’s something on my PC, possibly an update for the Wi-Fi drivers introduced a bug that affects the 5.8 channels
It’s been stable since switching so it’s more academic at this point, I have no burning need to be connected to the 5ghz channels
They changed systemd-tmpfiles to create things other than temp files a while back, but for whatever reason they never renamed it to better describe what it does.
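For context on the /home question upthread: systemd ships a tmpfiles.d snippet that (re)creates those directories if they’re missing, which is mainly relevant on stateless/immutable setups where /home might not survive. Roughly what it looks like (reproduced from memory, check your own /usr/lib/tmpfiles.d/):

```
# /usr/lib/tmpfiles.d/home.conf (shipped by systemd)
# Type  Path   Mode  UID  GID  Age  Argument
Q       /home  0755  -    -    -    -
q       /srv   0755  -    -    -    -
```

The `Q`/`q` types create a btrfs subvolume where possible, falling back to a plain directory; on a normal install it’s a no-op because the path already exists.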
Systemd was actually a “clone” of Apple’s launchd. Similarities with Windows arise from the fact that it makes sense to manage services in certain ways on modern OSs. Also, services on Windows are completely different from Linux and macOS; they’re managed through the Service Control Manager and have to implement a special service entry point rather than just running as a normal exe.
I know lol. It was a joke, although I do think that in theory leaner systems like Runit are better. But I cannot dismiss some of the innovation/work done in systemd
Meanwhile everyone is still using NetworkManager instead of systemd-networkd on desktop systems because there’s no GUI yet. It works great in containers and WSL.
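For anyone curious, a basic networkd setup really is just one file plus enabling the service. A minimal sketch (the filename and interface glob are placeholders, match them to your NIC names):

```
# /etc/systemd/network/20-dhcp.network
[Match]
Name=en*

[Network]
DHCP=yes
```

Then `systemctl enable --now systemd-networkd`. The lack of a GUI is the whole problem for laptops though; there’s no nice way to pick a wifi network.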
The trick is to split the code into smaller parts.
This is how I code using ChatGPT:
1. Have it analyze how to structure the program, then give me the code for the outline with not-yet-implemented methods and functions.
2. Have it implement the methods and functions one by one, with tests for each.
3. Copy the code and test each method and function before moving on to the next one, so that I always have working code.
4. Despair because my code is working and I have no idea how it works, and I have become a machine that just copies code without an original thought of my own.
This works pretty well for me as long as I don’t work with obscure frameworks or in large codebases.
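To make the loop concrete, here’s a toy example of what steps 1–3 look like in practice (the function and names are made up for illustration, not from any actual session):

```python
import re

# Step 1: the outline — ask for stubs first, no implementations.
#
# def slugify(title: str) -> str:
#     """Turn 'Hello, World!' into 'hello-world'."""
#     raise NotImplementedError

# Step 2: have it fill in one function at a time.
def slugify(title: str) -> str:
    """Turn 'Hello, World!' into 'hello-world'."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(words)

# Step 3: run the test it wrote before moving on to the next stub.
assert slugify("Hello, World!") == "hello-world"
```

Keeping each unit this small is what stops the model from losing the plot: it only ever has to reason about one function and its test at a time.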
This is exactly how I use it. Just like with conversations, ChatGPT tends to lose the plot after a while. It starts to “forget” the start of the conversation, and has trouble parsing things. It’s great for the first few paragraphs then begins to drift. So only use it for a few “paragraphs” worth of code at a time.
And as always, you need to make sure that it’s not just pretending to know. It will confidently feed you incorrect information, so you need to double check it occasionally.
So my job (electrical engineering) has been pretty stagnant recently (just launched a product, no V2 on the horizon yet), so I’ve taken my free time to brush up on my skills.
I asked my friend (an EE at Apple) what are some skills that I should acquire to stay relevant. He suggested three things: FPGAs, machine learning, and cloud computing. So far, I’ve made some inroads on FPGAs.
But I keep hearing about people unironically using ChatGPT in professional/productive environments. In your opinion, is it a fun tool for the lazy, or a tool that will be necessary in the future? Will employers in the future be expecting fluency with it?
That’s like asking in the early 90’s if knowing how to use a search engine will be a required skill.
Without a doubt. Just don’t rely on it for your own professional knowledge; use it to get the busywork done and automate where you can. I have virtually replaced my search engine needs with Bing AI when troubleshooting at work because it can find PDF manuals for obscure network hardware faster than I can sift through the first five pages of a Google search. It’s also one of those things where the skill of the operator can change the output from garbage to gold. If you can’t describe your problem or articulate what you want the solution to look like, then your AI is going to be just as clueless.
I don’t know what the future will hold and how much of our white collar workforce will be replaced by AI in the coming decades, but our cloud and automation engineers are not only leveraging LLMs but actively programming and training in-house models on company data. Bottom-rung data entry is going the way of the dodo in the next ten years for sure. Programmers will likely see the same change that translators did after translation software was developed: they moved from doing the job themselves to QA’ing the software’s output.
Times are changing but getting onboard with using AI as well as learning how to integrate it will be the next big thing in the IT world. It’s not going to replace us anytime soon but it will reduce the workforce as the years go by.
Right now it’s a good but limited tool if you know how to use it. But it can’t really do anything a professional in a given field can’t do already. Although it may be a bit quicker at certain tasks, there is always a risk of errors sneaking in that can become a headache later.
So right now I don’t think it’s a necessary tool. In the future I think it will become necessary, but at that point I don’t think it will require much skill to use anymore, as it will be much better at both understanding and actually accomplishing what you want. Right now the skill in using GPT4 is mostly in being able to work around its limitations.
Speculation time!
I don’t think the point where it will be both necessary and easy to use is far off tbh. I’m not talking about AGI or anything close to that, but I think all that’s needed to reach that point is a version of GPT4 that is consistent over long code generation, and is able to better plan out its work and then follow that plan for a long time.