Linux is Nvidia's bread and butter when it comes to servers and machine learning, but that's a specialized environment, and they don't really care about general desktop use on arbitrary distros. They care about big businesses with big support contracts. Nobody's running Wayland on their supercomputer clusters.
I cannot wait until architecture-agnostic ML libraries are dominant so I can kiss CUDA goodbye for good. I swear, 90% of my tech problems over the past five years have boiled down to "Nvidia sucks." I've changed distros three times hoping it would make things easier, and it never really does; it just creates exciting new problems to play whack-a-mole with. I currently have Ubuntu LTS working, and I'm hoping I never need to breathe on it again.
That said, there’s honestly some grass-is-greener syndrome going on here, because you know what sucks almost as much as using Nvidia on Linux? Using Nvidia on Windows.