Then I shouldn’t have had to share that link with you, the one that will solve all your issues, right? I’m sure you know all the Linux things, judging by your sarcasm, and it’s all the hardware’s fault.
What crawled in your coffee and died this morning? It’s attitudes like this that stop people from wanting to switch to Linux. Someone considering it will scroll through, see you being a dickhead, and go: Well… if this is what the community is like, idk if I want to try Linux…
Check yourself pal. You’ve got a shit attitude, and you’re doing no one any favours.
I was just pointing out the state of things on an up-to-date distro like Fedora, since a newer kernel often fixes stuff like this and no one bothers to update old reviews. I was already aware of the link you provided (it’s literally pinned to the top of the blog post I linked in my main post), but it’s irrelevant when I’m talking about the out-of-the-box experience. I only tried the input-remapper fix because someone pointed it out and I wanted to confirm it worked for me.
I didn’t make this post to complain about issues or ask for solutions; I’m here looking for interesting ideas and questions about this super cool hardware. This thing’s fucking awesome and I wanted to share.
“Bricking”, though, normally means turning a software problem into a hardware problem. You just have a software problem, which is infinitely easier to fix.
As a fellow RISC-V supporter, I think the rise of ARM is going to help RISC-V software support and, eventually, adoption. They’re not compatible, but right now developers everywhere are working to ensure their applications are portable and not tied to x86. I imagine, too, that when it comes to emulation, emulating ARM is going to be a lot easier than x86, possibly even statically recompilable.
This is what concerns me. ARM could dominate the market because almost everyone would develop apps supporting it and leave RISC-V behind. It could become like Itanium vs AMD64 all over again.
Well, right now most people develop apps supporting x86 and leave everything else behind. If they’re supporting x86 + ARM, maybe adding RISC-V as a third option would be a smaller step than adding a second architecture.
Porting Windows exclusive games to Linux is a small step as well, but most developers don’t do it because they cannot justify the additional QA and debugging time required to port them over. Especially since Linux’s market share is small.
The reason Itanium failed was that the architecture was too different from x86: porting x86 applications over required significant effort and was error-prone.
For RISC-V to get any serious attention from developers, I think it needs roughly 40–50% market share with OEMs alongside ARM. Otherwise, RISC-V will be seen as a niche architecture and developers will avoid porting their applications to it.
My point is that “porting” is not such a big deal if it is just a recompile. If you already target Linux with a portable code base ( to support both ARM and amd64 for example ), then the burden of RISC-V is pretty low. Most of the support will be the same between RISC-V and ARM if they target the same Linux distros.
The Linux distros themselves are just a recompile as well and so the entire Open Source ecosystem will be available to RISC-V right away.
It is a very different world from x86 vs Itanium with amd64 added to the mix.
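To make the “just a recompile” point concrete, here’s a rough sketch using a riscv64 cross compiler. The package name (gcc-riscv64-linux-gnu on Debian/Ubuntu) and the compile step are guarded so it is a no-op if the toolchain isn’t installed:

```shell
# A "port" that is just a recompile: build the same portable C source
# for riscv64 with a cross compiler, no source changes needed.
cat > hello.c <<'EOF'
#include <stdio.h>
int main(void) { printf("hello from a portable code base\n"); return 0; }
EOF
if command -v riscv64-linux-gnu-gcc >/dev/null 2>&1; then
  riscv64-linux-gnu-gcc -O2 hello.c -o hello-riscv64
  file hello-riscv64               # should report a RISC-V ELF executable
  # qemu-riscv64 ./hello-riscv64   # optionally run it under user-mode emulation
fi
```

The same source would build for amd64 or ARM with the matching toolchain, which is the whole argument.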
Look at Apple Silicon. Fedora already has a full distribution targeting Apple Silicon Macs. The biggest challenges have been drivers, not the ISA. The more complete the Linux ecosystem is on ARM, the easier it will be to create distros for RISC-V as well.
Porting Windows games to Linux is not a small step. It is massive and introduces a huge support burden. That is much different than just recompiling your already portable and already Linux hosted applications to a new arch.
With games, I actually hope the Win32 API becomes the standard on Linux as well because it is more stable and reduces the support burden on game studios. It may even be ok if they stay x86-64. Games leverage the GPU more than the CPU and so are not as greatly impacted running the CPU under emulation.
That is a risk on the Windows side for sure. Also, once an ISA becomes popular ( like Apple Silicon ) it will be hard to displace.
Repurposing Linux software for RISC-V should be easy though, and I would expect even proprietary software that targets Linux to support it ( if they support anything beyond x86-64 ).
Itanium was a weird architecture and you either bet on it or you did not. RISC-V and ARM are not so different.
The other factor is that there is a lot less assembly language being used and, if you port away from x64, you are probably going to get rid of any that remains as part of that ( making the app more portable ).
Once a chip architecture gets popular on Windows, it will be hard to displace. ARM has already become popular on macOS ( via Apple Silicon ) so we know that is not going anywhere. If ARM becomes popular on Windows ( perhaps via X Elite ), it will be hard to displace as the more popular option. That makes RISC-V on Windows a more difficult proposition.
I do not think that RISC-V on Linux has the same obstacles other than that most hardware will be manufactured for Windows or Mac and will use the chips popular with those operating systems.
I wrote it several times and I will write it again: Linux on a tablet is at best average. However, after the recent release of KDE Plasma 6, Plasma Mobile got really good. In tablet mode it feels almost like the real thing. I’ve been using it for some time now and I like the experience.
It is usable, but I used an iPad for years before trying Linux on a tablet, and it’s way behind iPadOS in terms of UX and ease of use. The latest Plasma Mobile makes it more tablety, but it still feels like a desktop with touch support. Having said that, I’m pretty happy with Plasma Mobile and can’t wait for further improvements.
That’s to be expected. Linux distros are barely just getting their feet wet in the tablet/mobile world.
I have no use for tablets, but if I did, I’d certainly go the Linux way and deal with whatever I have to before ever thinking to use Apple, Microsoft or any Google OS.
> Linux distros are barely just getting their feet wet in the tablet/mobile world.

I would say “barely just getting their toes wet” :)
Getting back to the point: I loved the way the iPad integrated with the stylus (Apple Pencil). My use case for a tablet back then was to write/draw stuff “remotely”, export all my, let’s call them drawings, to my Mac, and work on them there. Today’s example: I was planning a garden layout. It took me way too much time to get the stylus working the way I expected, and when it did, I had more issues trying to export the drawings to a usable format. I would have been better off with good old pen and paper.
I’m not a pro Linux user, so there’s a good chance I missed something.
True, but it is also completely different use cases and they have different goals.
Windows on a 2-in-1 is also not as good as an iPad. They are desktop OSes with tablet functionality as a nice-to-have. They will never be as smooth an experience as a mobile-first OS.
The trade-off is 100x better compatibility with many apps, especially FOSS: Inkscape, Krita, KiCad, FreeCAD, coding IDEs, MATLAB/SciPy, games, etc. They are all available out of the box, without a mediocre mobile port.
The flexibility to functionally use it as a full-blown computer (and not reliant on a monopolized, centralized app store) is the reason you get it and not an iPad. Of course it won’t be as good as a tablet because it wasn’t made for that.
You can also say “the iPad will never be as good of a drawing experience as a dedicated high-end drawing tablet.” Like of course. That isn’t its function and goal.
Yeah, honestly, if they could do a massive overhaul of performance and UX for the OSK (on-screen keyboard), that’d solve the main complaint I’ve had with touch interfaces on Linux.
I am super tempted to switch to KDE on this thing. KDE has always looked cool, but I’m too happy with GNOME on my main desktop to justify fully switching. This is seeming like a perfect opportunity for some variety…
You removed Windows. Not sure why Ubuntu is slow, but that may be because of snaps.
The internet issue may also just be due to missing drivers.
Please test whether everything works from a live USB (or SD card, I guess; I’ve never used one) before installing.
And yes, the Windows installer is notorious for removing Linux. So install Windows again, then inside Windows use its shitty partition manager (Disk Management) to shrink the big storage partition, and then install Linux into the freed space.
Note that I don’t recommend Ubuntu, as they’ve gotten pretty shitty. They theme the GNOME desktop environment a lot, and everybody hates their Snap package system. Instead I highly recommend Fedora, which is a less opinionated distro.
I also don’t recommend dual booting with Windows: its updater often removes the Linux bootloader, so people end up never updating Windows again, which is a security risk, or repeatedly having to unbreak their boot.
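For what it’s worth, when Windows does clobber the boot entry, recovery on a UEFI system usually goes through efibootmgr. This is only a sketch: the disk, partition, label, and loader path below are assumptions, so check yours with lsblk and your distro’s docs first.

```shell
# Hedged sketch: inspecting and recreating a lost Linux UEFI boot entry.
# Guarded so it is a no-op on systems without efibootmgr or UEFI firmware.
if command -v efibootmgr >/dev/null 2>&1 && [ -d /sys/firmware/efi ]; then
  efibootmgr                          # list existing UEFI boot entries
  # Example only: disk/partition numbers and the .efi path are assumptions.
  # efibootmgr --create --disk /dev/nvme0n1 --part 1 \
  #   --label "Fedora" --loader '\EFI\fedora\shimx64.efi'
else
  echo "efibootmgr unavailable or not a UEFI boot"
fi
```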
I will try another distro if I can. I restored Windows too. Are there other, safer methods to do this if I only have one SSD? Also, if I do reinstall, how do I tell the installer specifically which partition to use?
According to the video, open source software is not necessarily as important for servers and IT in the modern world as it was in the past. This is because software updates are now delivered electronically over the internet, making it less important to have access to the source code. Additionally, companies typically pay for service agreements with software vendors, which means that the vendor will fix bugs and update the software for them. Even if a company has access to the source code, they may not have the expertise to fix the software themselves.
This is a perfect example of what I believe is called the Dunning-Kruger effect. In the same way that I’m glad amateurs are given a platform, I rue the fact that amateurs are given a platform: a little bit of research would’ve prevented them from making themselves look so stupid. “Servers don’t need open source” with 97% of the top 100 servers running Linux looks like an odd position to take. Maybe they’re trolling.
Drawing a connection between open source and update delivery is either really stupid or really sus; open source isn’t about that at all. I’d say that person is 99% an impostor or a troll. In any case, just vote them out.
Just to clarify as OP: I didn’t post this because I support it. I posted it because it’s stupid and just a taste of the very poor takes out there.
I wouldn’t recommend watching it, but the central argument of the video is about software support. They argue that “open source” was more relevant before the internet (on servers?) because of the long turnaround time in getting a software vendor (in this video, IBM) to fix a bug: with access to the source code, support could tell the server maintainer what changes to make without them having to ship the tape to IBM for debugging. (Apparently that was a real practice, though people in the video comments dispute it, hinting that the YouTuber has no actual experience in this area.) They then argue that thanks to high-speed internet, support can release fixes much faster, so access to the source code isn’t useful, and paying for support contracts is a better option for businesses than having people who understand the software they’re running. Apparently that is the only reason open source is useful. They go on to claim that Linux is only popular on servers because Red Hat’s support contracts are cheaper than Microsoft’s, which I doubt; it probably has more to do with the kernel and OS being easy to modify and control, allowing them to be extended to a large variety of use cases instead of writing a new system from scratch.
There are lots of issues with their argument, and some have claimed it is trolling, but I reckon that would be giving them too much credit. More likely they are just an idiot fanboying for their favourite companies, desperately trying to justify their irrational biases.
Exactly. What the video fails to mention is the eventuality that the software ceases to be supported. Then what? You’ve built your entire business around this piece of software, and migrating to something else would cost more than having someone who understands the code, or perhaps someone doing it for free on the internet. With server software especially, I wouldn’t be surprised if some of this proprietary stuff ends up going SaaS-only, ripping off any companies that self-host.
Not sure what the title was before you changed it, but if I saw a post in my feed that looked like this (without the “very bad take” part), I wouldn’t even want to open it to see the description. I’m glad you clarified by editing the title; without making your stance clear in the title from the very beginning, it was bound to receive a barrage of downvotes.
The title was the same but without the “[Very bad take]” bit. I probably wouldn’t have read the description and would have just jumped to the comments. I don’t really care about votes, though; I find comments much more interesting. If I post content, I take the votes as a review of the content. If I’ve commented my honest opinion and put some thought into it, only to get downvotes and no comments explaining why, then I’m a bit disappointed.
Oh dang, I got excited about a new update. Then I started getting déjà vu. And finally I checked the date: this is from early May. Still a good read, unless you already read it before :P
I’ve now added the date to the title to make it clearer that the article is from two months ago. The article is a good read and hadn’t been posted on here, so I thought it was still worth sharing.
This doesn’t mention the part where, if you enable HDR, it sets the color profile to EDID with no option to change it, which on my monitor makes everything very desaturated, even compared to sRGB mode (with no color profile).
That has pretty much nothing to do with the color profile. When colors look very desaturated on HDR screens, that’s the driver messing up the colorspace signaling.
What GPU do you have? Both Intel and NVidia still have major problems with this.
Many displays (but not all, which is why it’s not exposed in the GUI) also support doing HDR without additional colorspace signaling; you could try enabling only HDR and disabling WCG with kscreen-doctor. IMO the color part is the most noticeable benefit of HDR, but you’d at least have functional HDR until your GPU driver is fixed.
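For reference, a sketch of what that looks like with kscreen-doctor on Plasma 6. “DP-1” is a placeholder output name (list yours first), and the commands are guarded so they are a no-op outside a Plasma session:

```shell
# Hedged sketch: enable HDR signaling while keeping wide color gamut off.
if command -v kscreen-doctor >/dev/null 2>&1; then
  kscreen-doctor -o                        # list outputs and current settings
  kscreen-doctor output.DP-1.hdr.enable    # HDR on
  kscreen-doctor output.DP-1.wcg.disable   # colorspace/WCG signaling off
else
  echo "kscreen-doctor not found (not in a Plasma session)"
fi
```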
I had thought it was about the color profile because, with HDR disabled in System Settings, enabling the built-in color profile desaturates colors quite a bit and does some kind of perceived-brightness-to-luminosity mapping that desaturates bright/dark HDR content even more. Although I don’t think that’s the cause of my problems anymore.
I think there must be something wrong with my screen, since HDR reduces saturation more than anything else. Anyways, thanks for the good work.
Edit: Tried this with an AMD GPU. HDR+WCG works as expected, without muted colors. HDR without WCG still significantly desaturates colors, so I guess that’s a monitor bug. Now to figure out GPU passthrough… (Edit 2: It seems to just work??)
Side note: when I turn off HDR only from kscreen-doctor, the display stays in HDR mode until it turns off and on again; that didn’t happen with NVIDIA.
Edit 3: Found something weirder… HDR colors are muted on the NVIDIA GPU and seem vibrant with the AMD iGPU. If I plug the monitor into the motherboard (AMD), enable HDR, then unplug it and plug it into the NVIDIA GPU, the colors are still vibrant??? I can disable and enable HDR again and again and they aren’t affected. They’re even fine when HDR is enabled without WCG??? But if I fully turn the monitor off and back on, they once again become muted with HDR. Weird ass behavior.
> enabling the built-in color profile desaturates colors quite a bit and does some kind of perceived brightness to luminosity mapping that desaturates bright / dark HDR content even more

It maps the colors to be more correct, and it does use the brightness info from the EDID for HDR content, so that checks out.

> I think there must be something wrong with my screen since HDR reduces saturation more than anything else

It might enable some sort of gamut mapping on the display side… HDR on monitors is really weird sometimes.

> Side note, when I turn off HDR only from kscreen-doctor the display stays in HDR mode until it turns off and on again, that didn’t happen with NVIDIA

I think that’s a bug in amdgpu. It should force a modeset on HDR change, but it doesn’t.