
meathorse ,

Bro, just add it to the pile of rubbish over there next to the 3D movies and curved TVs

CaptKoala ,

Predictable outcome, common tech company L.

KomfortablesKissen ,

The other 16% do not know what AI is or try to sell it. A combination of both is possible. And likely.

Phegan ,

Poll shows 84% of PC users are suckers.

Rin ,

You like having to pay more for AI?

TopRamenBinLaden ,

I feel like the sarcasm was pretty obvious in that comment, but maybe I’m missing something.

AVincentInSpace ,

I’m willing to pay extra for software that isn’t

UnderpantsWeevil ,

Okay, but hear me out. What if the OS got way worse, and then I told you that paying me for the AI feature would restore it to a near-baseline level of original performance? What then, eh?

Snowpix ,

One word. Linux.

Tattorack ,

I already moved to Linux. Windows is basically doing this already.

bouldering_barista ,

Who in the heck are the 16%?

Honytawk ,

  • The ones who have investments in AI
  • The ones who listen to the marketing
  • The ones who are big Weird Al fans
  • The ones who didn’t understand the question

Glytch ,

I would pay for Weird-Al enhanced PC hardware.

quicksand ,

Those Weird Al fans will be very disappointed

desktop_user ,

  • The nerds that care about privacy but want chatbots or better autocomplete

x0x7 ,

Maybe people doing AI development who want the option of running local models.

But baking AI into all hardware is dumb. Very few want it. SaaS AI is a thing. To the degree SaaS AI doesn’t offer the privacy of local AI, networked local AI on devices you don’t fully control offers even less. So it makes no sense for people who value convenience, and it offers no value to people who want privacy. It only offers value to people doing software development who need more playground options, and I can go buy a graphics card myself, thank you very much.

barfplanet ,

I’m interested in hardware that can better run local models. Right now the best bet is a GPU, but I’d be interested in a laptop with dedicated chips for AI that would work with pytorch. I’m a novice but I know it takes forever on my current laptop.

Not interested in running copilot better though.

31337 ,

I would if the hardware was powerful enough to do interesting or useful things, and there was software that did interesting or useful things. Like, I’d rather run an AI model to remove backgrounds from images or upscale locally, than to send images to Adobe servers (this is just an example, I don’t use Adobe products and don’t know if this is what Adobe does). I’d also rather do OCR locally and quickly than send it to a server. Same with translations. There are a lot of use-cases for “AI” models.

ZILtoid1991 ,

A big letdown for me is that, except in some rare cases, those extra AI features are useless outside of AI. Some NPUs are straight-up DSPs that could easily run OpenCL code; others are designed to handle only the floating-point formats used for machine learning, not normal ones; and some are just CPU extensions that are even bigger vector multipliers for select datatypes (AMX).
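To illustrate those ML-only number formats: bfloat16, common on NPUs, is just fp32 with the low 16 mantissa bits dropped, so it keeps fp32's range but only about 3 significant decimal digits. A minimal stdlib sketch (simple truncation toward zero; real hardware usually rounds to nearest):

```python
import struct

def to_bf16(x: float) -> float:
    """Truncate an fp32 value to bfloat16 precision by zeroing the low 16 bits."""
    bits = struct.unpack(">I", struct.pack(">f", x))[0]
    return struct.unpack(">f", struct.pack(">I", bits & 0xFFFF0000))[0]

print(to_bf16(3.14159265))  # 3.140625 -- only ~3 significant digits survive
```

Fine for neural-net weights, useless as a general-purpose float, which is the point of the comment above.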

rainynight65 ,

I am generally unwilling to pay extra for features I don’t need and didn’t ask for.

Appoxo ,

Raytracing is something I’d pay for even if unasked, assuming it meaningfully impacts the quality and doesn’t demand outlandish prices.
And they’d need to put it in unasked and cooperate with devs, else it won’t catch on quickly enough.
Remember Nvidia Ansel?

Xenny ,

As with any proprietary hardware on a GPU, it all comes down to third-party software support; classically, if the market isn’t there, it’s not supported.

Appoxo ,

Assuming there’s no catch-on after 3-4 cycles, I’d say the tech is either not mature enough, too expensive for too little result, or (as you said) there’s generally no interest in it.

Maybe it needs a bit of maturing and a re-introduction at a later point.

chicken ,

I can’t tell how good any of this stuff is because none of the language they’re using to describe performance makes sense in comparison with running AI models on a GPU. How big a model can this stuff run, how does it compare to the graphics cards people use for AI now?
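The GPU comparison boils down to memory more than anything: the weights have to fit. A rough back-of-the-envelope sketch (my own illustration, assuming weights dominate memory use and ignoring activations and context cache):

```python
def model_memory_gb(params_billion: float, bytes_per_param: float) -> float:
    """Approximate memory needed just to hold a model's weights, in GB."""
    return params_billion * 1e9 * bytes_per_param / 1e9

# A 7B-parameter model at fp16 (2 bytes/param) needs ~14 GB of memory,
# more than most consumer GPUs have; 4-bit quantization (0.5 bytes/param)
# brings it down to ~3.5 GB, which is why it's popular for local inference.
print(model_memory_gb(7, 2))    # 14.0
print(model_memory_gb(7, 0.5))  # 3.5
```

NPU marketing quotes TOPS instead, which says nothing about how big a model fits.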

ssm ,

AI-en{hanced,shittified}
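(That's shell brace-expansion syntax; expanded, it yields both readings:)

```shell
# Brace expansion generates every combination of the listed alternatives
echo AI-en{hanced,shittified}
# prints: AI-enhanced AI-enshittified
```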

mlg ,

Even DLSS only works great for some types of games.

Although there have been some clever uses of it, lots of games could gain a lot more from a properly efficient game engine.

War Thunder runs like total crap on even the highest-end hardware, yet World of Warships has much more detailed ships and textures and runs fine off an HDD and graphics cards older than the GTX 7XX series.

Meanwhile on Linux, Compiz still runs crazy window effects and the 3D cube desktop much better and faster than KDE. It’s so good I even recommend it for old devices with any kind of GPU, because the hardware acceleration will make your desktop fast and responsive compared to even the lightest window managers like Openbox.

TF2 went from 32-bit to 64-bit and had immediate performance gains upwards of 50%, almost entirely removing the game’s stuttering issues.

Batman Arkham Knight ran on a heavily modified version of Unreal 3 which was insane for the time.

Most modern games and applications really don’t need the latest and greatest hardware, they just need to be efficiently programmed which is sometimes almost an art itself. Slapping on “AI” to reduce the work is sort of a lazy solution that will have side effects because you’re effectively predicting the output.

RememberTheApollo_ ,

When a decent GPU alone is ~$1k and someone wants you to pay more for a feature that offers no tangible benefit, why the hell would you want it? I haven’t bought a pre-built PC in over 25 years; I build my own, and for family and friends. I’m building another next week for family, and AI isn’t even on the radar. If anything, this one is going to be anti-AI and get a Linux dual-boot as well as sticking with Win10; no way am I subjecting family to that Win11 adware.

GalacticTaterTot ,

The only reason I have any enthusiasm about CoPilot+ PCs (AI PCs or whatever new name they get in 6 months) is because of ARM and battery life.

Heck, I’ll trade them all the AI features for no ads.
