Survey shows most people wouldn't pay extra for AI-enhanced hardware | 84% of people said no

Companies are going all-in on artificial intelligence right now, investing millions or even billions into the area while slapping the AI initialism on their products, even when doing so seems strange and pointless.

Heavy investment and increasingly powerful hardware tend to mean more expensive products. To discover if people would be willing to pay extra for hardware with AI capabilities, the question was asked on the TechPowerUp forums.

The results show that over 22,000 people, a massive 84% of the overall vote, said no, they would not pay more. More than 2,200 participants said they didn’t know, while just under 2,000 voters said yes.

OhmsLawn ,

I honestly have no idea what AI does to a processor, and would therefore not pay extra for the badge.

If it provided a significant speed improvement or something, then yeah, sure. Nobody has really communicated to me what the benefit is. It all seems like hand waving.

originalucifer ,
@originalucifer@moist.catsweat.com avatar

what they mean is that they are putting in dedicated processors or other hardware just to run an LLM. it doesn't speed up anything other than the faux-AI tool they are implementing.

LLMs require a ton of math that is better suited to video processors than the general-purpose CPU on most machines.
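
A minimal sketch of what that math looks like, with made-up dimensions: each layer of an LLM is dominated by matrix multiplies in which every output value is an independent dot product, which is exactly the embarrassingly parallel workload GPUs and NPUs are built for.

```python
# Toy illustration (not any real model): one LLM-style layer step.
import numpy as np

hidden = 4096  # hypothetical model width
x = np.random.randn(1, hidden).astype(np.float32)       # one token's activations
W = np.random.randn(hidden, hidden).astype(np.float32)  # one weight matrix

# Each of the 4096 outputs is an independent dot product, so parallel
# hardware can compute them all at once; a CPU mostly works through
# them a handful at a time.
y = x @ W
print(y.shape)  # (1, 4096)
```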

tal ,
@tal@lemmy.today avatar

I honestly have no idea what AI does to a processor

Parallel processing capability. CPUs historically worked with mostly-non-massively-parallelizable tasks; maybe you’d use a GPU if you wanted that.

I mean, that’s not necessarily “AI” as such, but LLMs are a neat application that uses them.

On-CPU video acceleration does parallel processing too.

Software’s going to have to parallelize if it wants to get much by way of performance improvements, anyway. We haven’t been seeing rapid exponential growth in serial computation speed since the early 2000s. But we can get more parallel compute capacity.
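
To make that concrete, here's a toy sketch (mine, not anything from the thread) of the same CPU-bound job run serially and then fanned out across cores; on a multi-core machine the second timing should drop roughly with the core count.

```python
# Toy demo of serial vs. parallel CPU-bound work.
from concurrent.futures import ProcessPoolExecutor
import time

def busy(n: int) -> int:
    # Deliberately serial arithmetic to occupy one core.
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    chunks = [2_000_000] * 8

    t0 = time.perf_counter()
    serial = [busy(n) for n in chunks]
    t1 = time.perf_counter()

    with ProcessPoolExecutor() as pool:  # one worker per core by default
        parallel = list(pool.map(busy, chunks))
    t2 = time.perf_counter()

    assert serial == parallel
    print(f"serial: {t1 - t0:.2f}s  parallel: {t2 - t1:.2f}s")
```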

shonn ,

I wouldn’t even pay less.

catloaf ,

I would pay less, and then either use it for dumb stuff or just not use it at all.

Lost_My_Mind ,

84% said no.

16% punched the person asking them for suggesting such a practice. So they also said no. With their fist.

cyborganism ,

I don’t mind the hardware. It can be useful.

What I do mind is the software running on my PC sending all my personal information, screenshots, and keystrokes to a corporation that will use all of it for profit to build a user profile and serve targeted advertising, and that data could potentially be used against me.

independantiste ,
@independantiste@sh.itjust.works avatar

Personally I would choose a processor with AI capabilities over a processor without, but I would not pay more for it

Godort , (edited )

This is one of those weird things that venture capital does sometimes.

VC is injecting cash into tech right now at obscene levels because they think that AI is going to be hugely profitable in the near future.

The tech industry is happily taking that money and using it to develop what they can, but it turns out the majority of the public don’t really want the tool if it means they have to pay extra for it. Especially in its current state, where the information it spits out is far from reliable.

cheese_greater ,

I don’t want it outside of heavily sandboxed and limited-scope applications. I don’t get why people want an agent of chaos fucking with all their files and the systems they’ve cobbled together.

Fiivemacs ,

NDAs also legally prevent you from using this forced garbage. Companies are going to get screwed over by other companies; capitalism is gonna implode, hopefully.

Tenthrow ,
@Tenthrow@lemmy.world avatar

I have to endure a meeting at my company next week to come up with ideas on how we can wedge AI into our products because the dumbass venture capitalist firm that owns our company wants it. I have been opting not to turn on video because I don’t think I can control the cringe responses on my face.

TipRing ,

Back in the 90s in college I took a Technology course, which discussed how technology has historically developed, why some things are adopted and other seemingly good ideas don’t make it.

One of the things that is required for a technology to succeed is public acceptance. That is why AI is doomed.

SkyeStarfall ,

AI is not doomed. LLMs, or consumer AI products, might be.

In industry, AI is and will continue to be used (though probably not LLMs, except in a few niche use cases).

TipRing ,

Yeah, I mean the AI being shoveled at us by techbros. Actual ML stuff is currently and will continue to be useful for all sorts of not-sexy but vital research and production tasks. I do task automation for my job and I use things like transcription models and OCR; my company uses smart sorting with rapid image recognition and other really cool uses for computers to do things that humans are bad at. It’s things like LLMs that just aren’t there - yet. I have seen very early research on AI that is trained to actually understand language and learns by context. It’s years away, but eventually we might see AI that really can do what the current AI companies are claiming.
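
As a hypothetical sketch of that kind of unglamorous automation (the filename and workflow are made up, but pytesseract and its image_to_string call are real), an OCR step in a pipeline can be this small:

```python
# Hypothetical OCR step in a task-automation pipeline.
# Requires the Tesseract binary plus: pip install pytesseract pillow
from PIL import Image
import pytesseract

# Pull the text out of a scanned document so later steps can parse it.
text = pytesseract.image_to_string(Image.open("invoice.png"))
print(text)
```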

_haha_oh_wow_ ,
@_haha_oh_wow_@sh.itjust.works avatar

“enhanced”

Zatore ,

Most people won’t pay for it because a lot of AI stuff is done cloud-side. Even stuff that could be done locally is often done in the cloud. If that weren’t possible, more people would probably want the hardware. It makes more sense for corporations to invest in hardware.

helenslunch ,
@helenslunch@feddit.nl avatar

a lot of AI stuff is done cloud-side.

If it’s done in the cloud, then there’s no need for them to buy “AI-accelerated hardware”.

BlackLaZoR ,
@BlackLaZoR@kbin.run avatar

There's really no point unless you work in specific fields that benefit from AI.

Meanwhile every large corpo tries to shove AI into every possible place they can. They'd introduce ChatGPT to your toilet seat if they could

x4740N ,

Imagining a chatgpt toilet seat made me feel uncomfortable

Davel23 ,
Lost_My_Mind ,

Aw maaaaan. I thought you were going to link that youtube sketch I can’t find anymore. Hide and go poop.

BlackLaZoR ,
@BlackLaZoR@kbin.run avatar

Don't worry, if Apple does it, it will sell like fresh cookies worldwide.

SeaJ ,
Arbiter ,

Idk, they can’t even sell VR.

Odo ,
br3d ,

“Shits are frequently classified into three basic types…” and then gives 5 paragraphs of bland guff

Krackalot ,

With how much scraping of reddit they do, there’s no way it doesn’t try ordering a poop knife off of Amazon for you.

catloaf ,

It’s seven types, actually, and it’s called the Bristol scale, after the Bristol Royal Infirmary where it was developed.

Lost_My_Mind ,

Which would be appropriate, because with AI, there’s nothing but shit in it.

fuckwit_mcbumcrumble ,

Someone did a demo recently of AI acceleration for 3D upscaling (think DLSS or AMD’s equivalent) and it showed a nice boost in performance. It could be useful in the future.

I think it’s kind of like ray tracing. We don’t have a real use for it now, but eventually someone will figure out something that it’s actually good for and use it.

nekusoul ,
@nekusoul@lemmy.nekusoul.de avatar

AI acceleration for 3d upscaling

Isn’t that not only similar to, but exactly what DLSS already is? A neural network that upscales games?

fuckwit_mcbumcrumble ,

But instead of relying on the GPU to power it, the dedicated AI chip did the work. It had its own distinct chip on the graphics card that would handle the upscaling.

I forget who demoed it, and searching for anything related to “AI” and “upscaling” gets buried under what they’re already doing.

barsoap ,

That’s already the nvidia approach; upscaling runs on the tensor cores.

And no, it’s not something magical, it’s just matrix math. AI workloads are lots of convolutions on gigantic, low-precision, floating-point matrices. Low-precision because neural networks are robust against random perturbation, and more rounding is exactly that, random perturbation; there’s no point in spending electricity and heat on high precision if it doesn’t make the output any better.
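
A small numpy sketch of the rounding point (the dimensions are made up, and since numpy has no 8-bit float type this rounds weights to float16 instead): the output of a big matrix multiply barely moves.

```python
# Simulate low-precision weights by rounding float32 -> float16.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((1, 1024)).astype(np.float32)
W = rng.standard_normal((1024, 1024)).astype(np.float32)

y_full = x @ W
y_low = x @ W.astype(np.float16).astype(np.float32)  # rounded weights

# The rounding shows up as only a tiny perturbation of the output.
rel_err = np.abs(y_full - y_low).max() / np.abs(y_full).max()
print(f"max relative error: {rel_err:.2e}")
```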

The kicker? Those tensor cores are less complicated than ordinary GPU cores. For general-purpose hardware, and that also includes consumer-grade GPUs, it’s way more sensible to make sure the ALUs can deal with 8-bit floats and leave everything else the same. That stuff is going to be standard by the next generation of even potatoes: every SoC with an included GPU has enough oomph to sensibly run reasonable inference loads. And by “reasonable” I mean actually quite big; as far as I’m aware, e.g. Firefox’s inbuilt translation runs on the CPU, the models are small enough.

Nvidia OTOH is very much in the market for AI accelerators and figured it could corner the upscaling market and sell another new generation of cards by making their software rely on those cores even though it could run on the other cores. As AMD demonstrated, their stuff also runs on nvidia hardware.

What’s actually special sauce in that area are the RT cores, that is, accelerators for ray casting through BSP trees. That’s indeed specialised hardware, but those things are nowhere near fast enough to compute enough rays for even remotely tolerable output, which is where all that upscaling/denoising comes into play.
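
For a sense of the primitive being accelerated, here is a toy, non-authoritative Python version of the classic slab test for a ray against an axis-aligned box, the kind of check that tree traversal hammers millions of times per frame (real RT cores also handle ray/triangle tests and the traversal itself in fixed function):

```python
# Slab method: does a ray starting at `origin` along `direction`
# hit the axis-aligned box [box_min, box_max]?
def ray_hits_aabb(origin, direction, box_min, box_max):
    tmin, tmax = 0.0, float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-12:
            # Ray is parallel to this slab; miss if it starts outside.
            if o < lo or o > hi:
                return False
        else:
            t1, t2 = (lo - o) / d, (hi - o) / d
            tmin = max(tmin, min(t1, t2))
            tmax = min(tmax, max(t1, t2))
            if tmin > tmax:
                return False
    return True

print(ray_hits_aabb((0, 0, 0), (1, 1, 1), (2, 2, 2), (3, 3, 3)))  # True
print(ray_hits_aabb((0, 0, 0), (1, 0, 0), (2, 2, 2), (3, 3, 3)))  # False
```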

fuckwit_mcbumcrumble ,

Nvidia’s tensor cores are inside the GPU; this was outside the GPU, but on the same card (the PCB looked like an abomination). If I remember right, in total it used slightly less power but performed about 30% faster than normal DLSS.

fuckwit_mcbumcrumble ,

Found it.

neowin.net/…/powercolor-uses-npus-to-lower-gpu-po…

I can’t find a picture of the PCB though; that might have been a pre-reveal leak, and now that it’s officially revealed, good luck finding it.

Telorand ,

…just under 2,000 voters said “yes.”

And those people probably work in some area related to LLMs.

It’s practically a meme at this point:

Nobody:

Chip makers: People want us to add AI to our chips!

ozymandias117 ,

The even crazier part to me is that some chip makers we were working with pulled out of guaranteed projects with reasonably decent revenue to chase AI instead.

We had to redesign our boards and they paid us the penalties in our contract for not delivering so they could put more of their fab time towards AI

nickwitha_k ,

That’s absolutely crazy. Taking the Chicago School MBA philosophy to something as time-consuming and expensive to set up as silicon production.
