If you really want a true dumb TV, you should look into commercial TVs.
Personally, I just get any TV and don’t connect it to the internet. I disable the popup interfaces and home menus as much as I can, so the TV just turns on, goes to HDMI1, and that is all its interface needs to do.
I also disable a lot of the picture-altering features. My LG TV has some TruMotion crap that made everything feel a little bit off.
For the most part, the handful of TVs I have tried just work.
This is what I do. My Samsung TV can be set to turn on to a specific input, the “smart” features never get in the way, and it stays permanently disconnected from my home network.
I’ve heard of them searching for open WiFi networks and using them. If I had one and cared, I’d bet the WiFi card would be removable like in a laptop, and I’d open it up and remove it.
But I own a dumb old CCFL TV that I got for free, and I’m going to use it until I can’t any more.
I recall that the first-gen TVs with integrated Amazon Fire TV had the baseband chip directly on the mainboard, like other SMPs. I also do not recall any service that would connect to open networks.
However, that’s a single data point from a sketchy memory of something I worked on over six years ago.
I have heard of this as well, but I don’t believe I have seen any reliable source actually confirming it. I vaguely remember someone posting about it on Reddit years ago, saying their Samsung TV would do it. That said, it is clearly possible for a TV to do this if it’s programmed to, so it is good to keep the possibility in mind. Might be worth running a few tests if you’re worried about it.
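On the “running a few tests” point, one low-effort check is to sniff for the TV’s MAC address while the TV is powered on. A minimal sketch, assuming scapy and a spare adapter already in monitor mode; the MAC and interface name are placeholders, not real values:

```python
# Watch for 802.11 probe requests from the TV's MAC address; probe
# requests are how clients actively scan for networks to join.
from scapy.all import sniff
from scapy.layers.dot11 import Dot11, Dot11ProbeReq, Dot11Elt

TV_MAC = "aa:bb:cc:dd:ee:ff"  # hypothetical; read the real MAC off the TV's menu

def check_probe(pkt):
    if pkt.haslayer(Dot11ProbeReq) and pkt[Dot11].addr2 == TV_MAC:
        elt = pkt.getlayer(Dot11Elt)  # element ID 0 carries the SSID
        ssid = elt.info.decode(errors="replace") if elt and elt.ID == 0 else ""
        print(f"TV probed for SSID: {ssid or '<broadcast>'}")

# "wlan1mon" must already be in monitor mode (e.g. via airmon-ng).
sniff(iface="wlan1mon", prn=check_probe, store=False)
```

If the TV never probes or associates while idle, that is at least weak evidence it isn’t hunting for open networks.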
The good news is that open WiFi hotspots are very rare, so unless you live next to a cafe or something else that provides free WiFi, you don’t really have much to worry about.
I have never seen a removable WiFi card on a TV board. They are always integrated directly into the main board itself, along with the CPU, RAM, and other components. Just look up “TV main board” on eBay and see if you can find any WiFi cards in the photos.
Direct3D is a subset of DirectX, which also includes a bunch of other Windows APIs named Direct[foo] for things like sound. DXVK is a reimplementation of Direct3D on top of Vulkan, used with Wine (Wine’s own built-in implementation would previously have used OpenGL, I presume). Since DXVK is a first cut at moving to a new library, I would bet the internals are an incompletely documented mess, which presumably is exactly what they mean to clean up.
If you’re going to do AI stuff, you have to go with Nvidia. AMD is quite bad at it, and in some cases things like Stable Diffusion won’t run at all.
I’d recommend a 3070 at least. You’ll need the VRAM.
Uhm, I don’t think you will have much luck with an AMD laptop GPU and Stable Diffusion. Their ROCm support for desktop consumer GPUs is already atrocious.
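For what it’s worth, the ROCm builds of PyTorch reuse the torch.cuda namespace, so a quick sanity check of what your stack actually sees works the same on AMD and Nvidia. A sketch, assuming PyTorch is installed:

```python
# Report which GPU the ML stack actually sees and how much VRAM it has;
# ROCm builds of PyTorch expose AMD cards through torch.cuda as well.
import torch

if not torch.cuda.is_available():
    print("no usable GPU backend found")
else:
    props = torch.cuda.get_device_properties(0)
    print(f"device: {props.name}")
    print(f"VRAM:   {props.total_memory / 2**30:.1f} GiB")
```

If this prints nothing useful on a given laptop, no amount of fiddling with SD front-ends will help.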
Maybe get a cheaper laptop that allows connecting an eGPU enclosure? No idea if that works better, but I think the chances are a lot better.
I’ve seen people say this kind of thing. That’s why I went to the data. 176 out of 699 users are running AMD just fine. Around 15 of those look to be laptops, but I can’t tell for sure.
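A tally like that can be reproduced with something like the sketch below; the filename and column name are hypothetical stand-ins, not the telemetry dump’s real schema:

```python
# Hypothetical sketch of the kind of vendor tally described above.
import pandas as pd

df = pd.read_csv("sd_benchmark_dump.csv")   # hypothetical filename
amd = df["gpu_name"].str.contains("AMD|Radeon", case=False, na=False)
print(f"AMD users: {amd.sum()} / {len(df)}")
```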
IDK 🤷‍♂️ But it also looks like the laptop GPU you propose maxes out at 12 GB of VRAM, which is already quite low for the older image models and definitely not enough for most language models.
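Back-of-the-envelope on why 12 GB runs out fast for language models (weights only, fp16; activations and KV cache come on top, and the ~1.1B figure for SD 1.5 is approximate):

```python
# Weights-only VRAM estimate: parameter count x bytes per parameter.
# Activations and KV cache add to this floor at runtime.
def weights_gib(params_billion: float, bytes_per_param: int = 2) -> float:
    return params_billion * 1e9 * bytes_per_param / 2**30

print(f"SD 1.5 (~1.1B params, fp16): {weights_gib(1.1):.1f} GiB")  # ~2 GiB, fits easily
print(f"7B LLM (fp16):               {weights_gib(7):.1f} GiB")    # ~13 GiB, over 12 already
```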
I’m mostly concerned with what potential proprietary garbage is locked in the firmware of a laptop. Current PC motherboards are even worse for this; even System76’s firmware for their desktops is proprietary. The work on HIP to bridge CUDA and ROCm is active and open source. I’ll deal with some limitations to avoid Nvidia treating me like garbage as a customer. There is a good bit of banter about how AMD is currently operating at around half its potential and that this is about to change. With openSIL and the effort AMD is putting into open source, it seems like the better option. I want to do some more kernel-hacking experiments with the CPU scheduler and process isolation, which is far easier when I don’t have to deal with asymmetric cores and complicated management.
Regardless, the linked telemetry data shows plenty of people running AMD GPUs just fine. I was expecting to see custom kernels used with the Radeon stuff, but that is not the case: a few people are on the bleeding edge, but most are on old generic LTS kernels. It looks like it just works. The dataset includes the parameters and iteration time for each user running SD. It is a little slower than Nvidia, but it still works fine. There are a lot of people running SD on 8 GB and smaller GPUs.
This is for very low resolutions only, and AI upscaling then takes a long time on top. Yes, SD can work with 8 GB of VRAM, and 12 is nicer, but the upcoming SDXL will probably require 16 GB to work well enough.
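For reference, the usual way people squeeze SD into 8 GB is with the memory-saving switches in the Hugging Face diffusers library. A sketch, with the model ID and options as illustrations rather than a recommendation:

```python
# Common VRAM-saving knobs for running SD via diffusers.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # illustrative model ID
    torch_dtype=torch.float16,          # halves weight memory vs fp32
)
pipe.enable_attention_slicing()    # lowers peak VRAM at some speed cost
pipe.enable_model_cpu_offload()    # keeps submodules in RAM until needed
image = pipe("a lighthouse at dusk", height=512, width=512).images[0]
image.save("out.png")
```

The trade-off matches the complaint above: these switches keep small cards working, but only at modest resolutions and slower iteration times.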
I agree that Nvidia is crap and would love to recommend AMD, but their software for AI stuff is just bad right now, and their business decision to only support the newest data-center GPUs with it is even worse.
I have an all-AMD Linux system, and it works great for gaming and VR, but I have given up on trying to get SD to work on it despite already spending a lot of time on that. Maybe it would be better with a newer card, but I think the risk is just too high to spend a lot of money on an officially unsupported card that AMD can break at any minute, and has done so in the past.
This is the talking-sense that got through to me. Thanks. It’s why I made the post before pulling the trigger.
I really hate shopping, and now I’m back to zero. I probably need to focus on an external graphics card solution, but that looks like a messy space to navigate too. There seems to be a good bit of negative feedback on the ASUS ROG external GPU laptop setup. I have no idea what is or is not possible. I think I saw a headline in passing about USB4 support just getting merged into the kernel, so that doesn’t bode well for support of existing hardware. I’m not sure how much bandwidth SD really needs between the GPU and the CPU.
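On the bandwidth question, some rough arithmetic (assumed link figures, not measurements) suggests the link mostly matters for the one-time weight upload:

```python
# Rough eGPU bandwidth arithmetic. Thunderbolt 3/4 carries ~32 Gbit/s
# of PCIe traffic, call it ~4 GB/s usable -- an assumption, not a spec sheet.
link_gb_per_s = 32 / 8            # ~4 GB/s
weights_gb = 2.0                  # fp16 SD 1.5 weights, roughly
print(f"one-time weight upload: ~{weights_gb / link_gb_per_s:.1f} s")
# After loading, inference stays on the card; per image only the prompt
# embeddings and the finished image cross the link, a few MB at most.
```

So for SD-style inference the enclosure’s link speed is probably not the bottleneck; kernel and driver support is the riskier part.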
Thanks again for the minor disappointment to avoid a major one later.