I’m not too familiar with C, but I was under the impression that C++ was developed as a superset of C, and was capable of everything C could do. Is that not the case?
Every programming language is an abstraction layer between the programmer and the machine that will run the code. But abstraction isn’t free. Generally speaking, the higher the abstraction, the less efficient the program.
C++ optionally provides a much higher level of abstraction than pure C, which makes C++ much nicer to work with. But the trade-off is that the program will struggle to run in resource-constrained environments where a program written in C would run just fine.
And to be clear, when I say “low-end hardware”, I’m not talking about the Atom-based netbook from 2008 you picked up for $15 at a yard sale. It will run C++ based programs just fine. I’m talking about 8- or 16-bit microcontrollers running at <100 MHz with a couple of hundred kB of RAM. Such machines are still common in many embedded applications, and they do not handle C++ applications gracefully.
And compile speed too. A small program using only C features can compile around 5x faster with a C compiler than with a C++ one. (GCC will use C++ mode on a .cpp file, so make sure your file ends in .c.)
That’s true, but they’re working on an ABI implementation. It’s no mean feat with a language like Rust. A quick search around the Internet found various possible candidates, though many of the discussion threads have petered out.
Its cross-platform support (not just for using it but also for building it) is not there yet, and it is quite huge and unstandardized, with only one full implementation. I’d agree the last part will change with age, but given the frequent large changes and feature additions, I’m afraid that will only get harder; it is simply too complex and fast-moving for many low-level applications. It is closer to C++ than to C in my eyes. I’d be happy to see it replace C++, though, for its memory safety benefits!
I wouldn’t say “need”, but there are possible improvements to ergonomics and safety that wouldn’t make the language itself more complex or high level. I think it does its job quite well though and will be here for decades to come.
Ada has been around since 1983 and is objectively superior. Yes I will die on that hill.
It’s too bad programmers are all such egotards they think they can write bugfree programs in C, while whining about how “restrictive” a safe language like Ada is.
Wi-Fi drivers are notoriously complicated on Linux in general, though things have been improving. But yeah, if `iwctl device list` comes up empty when you plan to use Wi-Fi to install Arch, especially if Ethernet isn’t a viable temporary alternative because your device doesn’t have an Ethernet port, you’re in for a tough time.
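For anyone who hasn’t done it, the usual iwd flow from the Arch ISO looks roughly like this (`wlan0` and `MyNetwork` are placeholders; if `device list` shows nothing, the driver isn’t loaded and none of the rest will help):

```shell
# Connecting to Wi-Fi from the Arch install environment with iwctl.
iwctl device list                      # should show your wireless adapter
iwctl station wlan0 scan               # scan for networks
iwctl station wlan0 get-networks       # list what the scan found
iwctl station wlan0 connect MyNetwork  # prompts for the passphrase
ping -c 3 archlinux.org                # confirm connectivity
```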
Oh yeah, that’s a great alternate option too if your mobile plan includes tethering. I’ve successfully used both Android and iOS tethering in the past and it was pretty seamless each time.
Linux is their bread and butter when it comes to servers and machine learning, but that’s a specialized environment and they don’t really care about general desktop use on arbitrary distros. They care about big businesses with big support contracts. Nobody’s running Wayland on their supercomputer clusters.
I cannot wait until architecture-agnostic ML libraries are dominant and I can kiss CUDA goodbye for good. I swear, 90% of my tech problems over the past 5 years have boiled down to “Nvidia sucks”. I’ve changed distros three times hoping it would make things easier, and it never really does; it just creates exciting new problems to play whack-a-mole with. I currently have Ubuntu LTS working, and I’m hoping I never need to breathe on it again.
That said, there’s honestly some grass-is-greener syndrome going on here, because you know what sucks almost as much as using Nvidia on Linux? Using Nvidia on Windows.
Yeah, they don't hate Linux, they just have their own priorities. That said, I'm running Nvidia+Wayland happily. For desktop they've done a lot more work on Wayland this year, the upcoming driver fixes a bunch of things, and my distro handles driver installation and updates, so I never have to think about it.
I cannot wait until architecture-agnostic ML libraries are dominant and I can kiss CUDA goodbye for good
I really hope this happens. After being on Nvidia for over a decade (a 960 for 5 years and similar midrange cards before that), I finally went AMD at the end of last year. Then of course AI burst onto the scene this year, and I’ve not yet managed to get Stable Diffusion running, to the point it’s made me wonder if I might have made a bad choice.
It’s possible to run Stable Diffusion on AMD cards, it’s just a bit more tedious and a lot slower. I managed to get it working on my RX 6700 under Arch Linux just fine. Now that I’m on Fedora, it doesn’t really want to work for some reason, but I’m sure that it can be fixed as well, I just didn’t spend enough time on it.
It just makes no sense to me though, how is it sustainable for nvidia to not have great Linux kernel support? Like, let the kernel maintainers do their job and reap the benefits. I’m guessing that nvidia sees enterprise support contracts as an essential revenue stream, but eventually even enterprises are going to go with hardware that Linus isn’t giving the finger to right? Am I crazy?
That’s not true. Some companies contribute. AMD does a great job fostering open source software. This is an Nvidia issue. They are a plague and I hope they one day lose market share for it.
Cats use a “direct register” walking style, which means they put their hind foot down in exactly the same spot their forefoot on the same side just was. This cat is walking forward and lifting up their left front paw so we would expect the left rear paw to be reaching forward for that spot and out of view from this angle.
I’m gonna be that person… I rarely, if ever, have issues with Nvidia on Linux. Used several 30xx series cards for gaming over the last couple of years and it’s been a great experience.
Is it my distro (Void)? Is it because I’m happy staying on X11? Is it just luck? Interested to hear people’s gripes.
I use Ubuntu and an Nvidia 3080, and the only issue I have had was when Steam updated their Big Picture Mode. I was using Wayland and it broke with the new Big Picture Mode. I had to switch back to X11 and it works well with that. I do hope Nvidia and Steam fix the Wayland issue. I'd rather use Wayland.
I have been using my Linux gaming PC for a couple of years now. I jumped ship from the ad-riddled Windows 11, and I have been very happy with Steam/Proton gaming.
Well, you are not alone. While I too would prefer not to use proprietary drivers, I have had no problems on any of my Nvidia machines. Ironically, despite the open source drivers, getting a 7900 XTX card up and running was an issue for me for months until distros caught up (with newer kernels and Mesa libs), while my 4090 installation was a breeze even on the day it was released.
A lot of problems people have with Nvidia GPUs seem to be installation related. I think that is because the installation tends to be distro-specific, and people do not necessarily follow the correct procedure for their distro, or they try installing the drivers directly from the Nvidia site as they would on Windows. For example, Fedora requires you to add RPM Fusion, Debian needs non-free added to sources, Linux Mint lets you install the proprietary drivers but only after the first boot, and so on. Pop!_OS probably makes the process the easiest with their Nvidia-specific ISO.
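To illustrate how distro-specific it is, the Fedora route (following RPM Fusion’s documented pattern; the release version is detected with `rpm -E %fedora`) looks something like this:

```shell
# Sketch: proprietary Nvidia driver on Fedora via RPM Fusion.
# Needs network access and root; URLs follow RPM Fusion's documented scheme.
sudo dnf install \
  https://mirrors.rpmfusion.org/free/fedora/rpmfusion-free-release-$(rpm -E %fedora).noarch.rpm \
  https://mirrors.rpmfusion.org/nonfree/fedora/rpmfusion-nonfree-release-$(rpm -E %fedora).noarch.rpm
sudo dnf install akmod-nvidia   # builds the kernel module for your kernel
```

On Debian you’d instead enable non-free in your sources and install the distro’s packaged driver; same idea, completely different commands.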
I have a 3080 and it runs fine with openSUSE Tumbleweed. On first boot you do need to add the Nvidia repo and then install the driver, which I guess could be problematic for new Linux users, but it's literally pasting one line into the terminal and then clicking the driver in YaST. Echoing what others have said, I'd prefer if Nvidia was a little less hostile to open source, but frankly the driver just works, and works well. The only thing I've used besides openSUSE lately is Pop!_OS, and I believe the Nvidia driver was installed automatically. If someone is having trouble getting the driver installed, that seems to be a failure of the distro, not the user. You should be able to depend on your distro's packaging to take care of this stuff.
There is definitely some substance behind the complaints, but I think they are overblown, or it's the typical Linux user parroting something they heard other people say.
On Pop!_OS my 3070 Ti was always stable. I ran into occasional stuttering in the DE, but the biggest thing was that I had to manually compile shaders using some guy's GitHub repo to play Apex Legends without it being a stuttery mess. But like you said, Pop is on X11, so maybe that makes a difference?
I bought into the “if you are going to use Linux, especially for gaming, you need an AMD GPU” idea, so I bought a 6900 XT. I’ve had just as many issues with my 6900 XT; they are just different types of issues. Nothing insurmountable, but it’s not like it’s some panacea.
I think this is a big part of it. I have no issues with Nvidia + X11; however, if I try to use Wayland with my 2080 I get numerous issues that have me running back to Xorg very quickly.
This kind of meme is disheartening to see on Lemmy, which is supposed to be full of open-minded people. And I write that as someone who saw a real UFO in 1990 in Greece, together with others. The tech in 1990 was not there for the US to create a silent, gravity-defying vehicle that appeared and disappeared in front of our eyes. So this meme, AND the comments from the other Lemmy posts in the last 2 days about the recent UFO events, are rather insulting to me. The phenomenon is real.
Edit: Also, I don’t understand the downvotes! Do you downvote anything that goes against your grain? My experience is my experience, and it’s just as valid!