Both of them have their own issues, but historically, Nvidia has been considerably worse: they not only required a proprietary driver, but also adamantly refused to support certain features, crippling the functionality of a lot of compositors.
Today, I’m having zero issues with an Nvidia card on KWin/Wayland. Everything that runs in native Wayland runs flawlessly. Games through Xwayland run great too, now that explicit sync is actually there.
Worth noting that I don’t have a VRR display, and don’t have a card that supports frame generation. The latter just isn’t implemented at all, and I’ve heard there are issues with the former.
That is why I can’t use AMD. My primary display is a TV. Yeah, there are large monitors with DisplayPort, but they didn’t exist until recently, and most if not all are inferior to a high-quality TV in picture quality.
OK, don’t lynch me, but has anyone tried this on Docker for Windows? Since this is Linux-first, it looks like a lot of the environment paths are Linux-only. It solves a need for me, but I haven’t dived into switching fully to Linux gaming.
Others have asked for it and I guess some of it might even work given that WSL2 has GPU support. I’ll keep this issue updated as I get some progress on this front!
The only open source client I know of that worked with Steam is one for bridging Steam’s chats and notifications with IRC. It’s called Bitlbee.
I stopped using it a few years ago because, IIRC, it was a pain to keep the authentication working with Steam. Steam thought hackers were accessing my account or something like that, and I kept having to disable security “features” just to stay logged in.
I assume an open source client not endorsed by Steam would have the same issue.
I’m pretty sure I ran the right exe file; the game is Still Wakes the Deep. I’m not sure why it happened, but the issue also happened on Kubuntu. I did find vcrun, and the issue disappeared, though for DirectX I don’t know which DLL to choose.
No way! I had the exact same problem with the same game (from Fitgirl)! You’re trying to launch the wrong exe, friend. You need to select the Stillwakesthedeep.exe from the Habitat/Binaries/Win64 folder.
I don’t remember the name, but there was one being developed whose UI was made with Qt. I don’t know if it used its own method to download games or was just a GUI for steamcmd.
Because desktop Steam is technically a browser with a specific website opened.
The Steam client does use the Chromium Embedded Framework for its interface, but there’s a lot more to it than just that: for example, Steam Input, the Steam Overlay, Steamworks, and Steam Play (aka Proton), which is itself a collection of nontrivial components.
If you just want an alternative launcher, there is Lutris, but there is no stand-alone Steam client alternative as far as I know.
It might be interesting to see how much functionality could be replicated in an open-source client. Some components, like DXVK and a web engine, are readily available. Others, like Steamworks, are not. SteamDB shows that it’s possible to inspect Steam’s game repositories, but actually downloading from them without Steam (or steamcmd) might be challenging. Goldberg Emulator shows that it’s possible to fool some games into thinking Steam is running, but that’s not enough to run games that include DRM.
Anyone attempting this would have to weigh the time they spend reverse engineering and re-implementing against the fact that Steam can always change its internal services, rendering all that time and effort wasted. And, of course, there would always be a risk that anyone using it (without explicit permission from Valve) might have their account banned.
The interface “running” is one thing, but does it know to run games in wine/proton? Does it know to grab the Linux versions of games if available? Mono doesn’t make that automatic.
Lutris is another one like Playnite that is designed specifically for Linux, using GTK. Like Playnite, it collates all (most?) of the major game clients like Steam, EA, and Epic. It works pretty well in my experience.
I assume not, but there is a command-line client, steamcmd. I used it the other day when I wasn’t near a machine with Steam installed: I used it over ssh to install FTL on a machine that did have Steam, so that I could copy it to a remote laptop which didn’t have Steam installed (FTL is one of the games that doesn’t use Steam for DRM).
I’d imagine that one could theoretically slap an open-source frontend on that.
EDIT: Also, graphical frontends aside, it doesn’t even have readline/editline support, so running it via:
…is already throwing a minimal open-source frontend on it that rather improves the experience.
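Presumably the wrapper in question was something like rlwrap (an assumption on my part, since the command itself isn’t shown), e.g.:

```shell
# rlwrap adds readline line editing and history to steamcmd's plain prompt
rlwrap steamcmd
```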
EDIT2: I don’t care that much about most of Steam, but I do wish that the downloader portion of Steam were open source so that I could push a patch to let one cap the number of concurrent TCP connections open. Normally, a saturated network connection will tend to allocate bandwidth evenly on a per-TCP-connection basis, and because Steam opens a ton of concurrent connections, it gets the lion’s share of the connection…for Steam downloads, which are very much low priority, and which I don’t want trying to eat up all the bandwidth.
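The patch being wished for is conceptually just a bound on in-flight transfers. A hypothetical sketch in Python (names like `fetch_chunk` and `MAX_CONNECTIONS` are made up for illustration; this is not Steam’s actual downloader code):

```python
import asyncio

MAX_CONNECTIONS = 4  # hypothetical: the cap the patch would make configurable

async def fetch_chunk(chunk_id: int) -> str:
    # Stand-in for one HTTP range request over one TCP connection.
    await asyncio.sleep(0.001)
    return f"chunk-{chunk_id}"

async def download_all(num_chunks: int) -> list:
    sem = asyncio.Semaphore(MAX_CONNECTIONS)

    async def bounded(i: int) -> str:
        async with sem:  # at most MAX_CONNECTIONS transfers in flight
            return await fetch_chunk(i)

    # gather preserves order, so chunks come back in sequence
    return await asyncio.gather(*(bounded(i) for i in range(num_chunks)))

results = asyncio.run(download_all(10))
```

With the cap at 4, TCP fair-sharing on a saturated link would give the downloader roughly four connections’ worth of bandwidth instead of dozens.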
EDIT3: There’s apparently some Linux ncurses-based client that uses library injection to take over the graphical client, but that repo last saw a change 6 years ago, and I’d bet that it hasn’t worked in a long time. Looking at steam_injector.c, it looks like it prevents XMapWindow() from running, so it should keep the graphical client from actually doing anything graphical.
I understand that it’s an absolute brightness standard, not like the relative levels in SDR.
The standard is actually relative brightness as well, though displays (luckily) don’t implement it that way.
Why does it end up washing out colors unless I amplify them in KWin? Is it just the brightness that’s absolute in nits, but not the color?
It depends. You might:
- have a driver bug; right now only AMD has correct color space communication with the display, and it doesn’t work correctly on Intel and Nvidia yet
- have a display that does a terrible job of mapping the rec.2020 color space to the panel
- just be used to the oversaturated colors you get with the display in SDR mode
Why does my screen block the brightness control in HDR mode but not contrast?
Because displays are stupid; don’t assume there’s always a logical reason behind what display manufacturers do. Mine only blocks the brightness setting through DDC/CI, but not through the monitor OSD…
Why is my average emission capped at 270 nits? That seems ridiculously low, even compared to normal SDR screens.
OLED simply gets very hot when you make it bright over the whole area; the display technology is inherently limited when it comes to high brightness on big displays.
Hey there, thanks for the comprehensive reply, I learned a lot. Also, your blog is fantastic, I’m always happy when there’s a new post =)
Question about the last point: I feel like in SDR mode, the OLED is pushing brighter images. I almost feel like it’s underselling the capabilities at 270, but does so to give pixels a rest every now and then, in the hope that the bright spots don’t stay stationary on the screen. It’s a wild guess, I have no idea.
Also, your blog is fantastic, I’m always happy when there’s a new post =)
Thank you, I’m glad you like it!
I feel like in SDR mode, the OLED is pushing brighter images. I almost feel like it’s underselling the capabilities at 270, but does so to give pixels a rest every now and then, in the hope that the bright spots don’t stay stationary on the screen. It’s a wild guess, I have no idea.
It’s certainly possible; displays do wacky stuff sometimes. For example, if the maximum brightness in the HDR metadata matches exactly what the display says would be ideal to use, my (LCD!) HDR monitor dims down a lot, making everything far, far less bright than it actually should be.
KWin has a workaround for that, but it might be that your display does the same thing with the reported average brightness.
HDR content looks washed out on my HDR TV and my work Mac. At this point I'm pretty sure "washed out" is just the HDR look. I just turn it off in anything I can now.
But why does it end up washing out colors unless I amplify them in KWin? Is it just the brightness that’s absolute in nits, but not the color?
The desktop runs in SDR, and the color space differs between SDR and HDR, meaning you will end up with washed-out colors when you display SDR on HDR as-is.
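Concretely: SDR content uses the BT.709/sRGB primaries, which cover only part of the BT.2020 container used in HDR mode, so the compositor has to matrix SDR pixels into BT.2020. A sketch using the linear-light conversion matrix from ITU-R BT.2087:

```python
# BT.709 -> BT.2020 linear-light conversion matrix, values from ITU-R BT.2087.
M = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def bt709_to_bt2020(rgb):
    """Matrix a linear BT.709 RGB triple into BT.2020 coordinates."""
    return tuple(sum(M[r][c] * rgb[c] for c in range(3)) for r in range(3))

red_2020 = bt709_to_bt2020((1.0, 0.0, 0.0))    # 709 red lands inside the 2020 gamut
white_2020 = bt709_to_bt2020((1.0, 1.0, 1.0))  # white maps to white
```

Pure 709 red ends up well inside the 2020 gamut, which is why correctly converted SDR content can look muted next to a display that was previously stretching it across its full native gamut.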
When you increase the slider in KDE, you change the tone mapping, but no tone mapping is perfect, so you might want to leave it at the default 0% and use HDR mode only for HDR content. In KDE, for example, colors are blown out when you put the color intensity at 100%.
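As a toy illustration of why no tone mapping is perfect, here is the classic Reinhard operator (just an example; not what KWin actually uses):

```python
def reinhard(luminance: float) -> float:
    """Classic Reinhard tone mapping: compresses [0, inf) into [0, 1)."""
    return luminance / (1.0 + luminance)

# Midtones survive (1.0 -> 0.5), but highlights get crushed together:
# 100 and 1000 both land above 0.99, so bright detail is inevitably lost.
```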
Why does my screen block the brightness control in HDR mode but not contrast? And why does the contrast increase the brightness of highlights, instead of just splitting midtones towards brighter and darker shades?
In SDR, your display is not sent an absolute value, meaning you can pick what 100% is; that’s your usual brightness slider.
In HDR, your display is sent absolute values. If the content you’re displaying requests a pixel with 1000 nits your display should display exactly 1000 nits if it can.
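Those absolute values come from the SMPTE ST 2084 “PQ” transfer function used by HDR10: a signal value in [0, 1] decodes to an absolute luminance in nits, with 1.0 meaning 10,000 nits. A sketch:

```python
# SMPTE ST 2084 (PQ) EOTF: maps a nonlinear signal value in [0, 1]
# to an absolute luminance in cd/m^2 (nits), up to 10,000 nits.
M1 = 2610 / 16384       # 0.1593017578125
M2 = 2523 / 4096 * 128  # 78.84375
C1 = 3424 / 4096        # 0.8359375
C2 = 2413 / 4096 * 32   # 18.8515625
C3 = 2392 / 4096 * 32   # 18.6875

def pq_eotf(signal: float) -> float:
    e = signal ** (1.0 / M2)
    num = max(e - C1, 0.0)
    den = C2 - C3 * e
    return 10000.0 * (num / den) ** (1.0 / M1)

# A signal of 1.0 means exactly 10,000 nits, regardless of any display
# knob, which is why the brightness control gets locked out in HDR mode.
```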
Not sure about the contrast slider, I never really use it.
Why is truehdr400 supposed to be better in dark rooms than peak1000 mode?
Because 1000 nits is absurdly bright, almost painful to watch in the dark. I still usually use the 1000 mode and turn on a light in the room to compensate.
Why is my average emission capped at 270 nits? That seems ridiculously low, even compared to normal SDR screens.
Display technology limitations. OLED screens can only display full brightness over a certain area (e.g. 10% for 400 nits and 1% for 1000 nits) before having to dim the screen. That makes HDR mode mostly unusable for desktop usage, since your screen will dim/brighten when moving large white or black areas around the screen.
OLED screens simply can’t deliver the brightness of other display technologies but their benefits easily make it worth it.
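A toy constant-power model (using the numbers from this thread; the real ABL curve is more complicated) shows the trade-off between window size and brightness:

```python
# Toy model of OLED automatic brightness limiting (ABL). Assumption: the
# panel can spend a fixed power budget, equal to driving the full screen
# at 270 nits, and no pixel can exceed the emitters' 1000 nit peak.
FULL_SCREEN_NITS = 270.0
PANEL_PEAK_NITS = 1000.0

def max_nits(window_fraction: float) -> float:
    """Max sustained brightness for a bright window covering this screen fraction."""
    budget_limited = FULL_SCREEN_NITS / window_fraction
    return min(PANEL_PEAK_NITS, budget_limited)

# Full screen: stuck at 270 nits. At a 10% window the budget would allow
# 2700 nits, so the per-pixel 1000 nit cap becomes the limit instead.
```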
Ah cool, I didn’t know that there are layers of capabilities for different requested brightnesses. Thanks for your in-depth reply! I’m also a 1000-nits enjoyer, but I don’t switch on any lights; I like when my eyeballs get blasted with colors. 😂
For the washed-out colors, are you using an Nvidia, Intel, or AMD GPU? If you’re using AMD you need to run kernel 6.8 or later I believe, if you’re using DisplayPort.
I’m not sure why your display lets you adjust contrast in HDR mode, I would just leave it at the default imo.
I’m using all of them sometimes. ^^ Washed-out colors are no longer an issue on AMD, as you said, but on Nvidia I can’t seem to fix it. I wonder if this is happening to absolutely everyone, as the Arch wiki makes it sound like Nvidia 545+ has been reported working…
About the contrast: I wish I could, but I found that the factory default was 70%, and it often seemed to cause noticeable dimming because the image was too bright for the max average luminance. It felt weird, and I think it’s because Alienware, like many manufacturers, just can’t resist blasting the consumer with overtuned contrast to get a purchase out of it.
Well, if everything’s working correctly, you’d want the desktop itself to stay close to the SDR values but have applications that are HDR-capable make use of it. Otherwise you’re limited to full-screen apps making use of it.
My monitor (Acer XV275K P3) has a better MiniLED local dimming algorithm in HDR mode than in SDR, so even SDR content looks better that way. Also it’s annoying having to switch it back and forth, it’s way easier to just leave it in HDR mode and not worry about it.