Survey shows most people wouldn't pay extra for AI-enhanced hardware | 84% of people said no

Companies are going all-in on artificial intelligence right now, investing millions or even billions into the area while slapping the AI initialism on their products, even when doing so seems strange and pointless.

Heavy investment and increasingly powerful hardware tend to mean more expensive products. To discover if people would be willing to pay extra for hardware with AI capabilities, the question was asked on the TechPowerUp forums.

The results show that over 22,000 people, a massive 84% of the overall vote, said no, they would not pay more. More than 2,200 participants said they didn’t know, while just under 2,000 voters said yes.

Telorand ,

…just under 2,000 voters said “yes.”

And those people probably work in some area related to LLMs.

It’s practically a meme at this point:

Nobody:

Chip makers: People want us to add AI to our chips!

ozymandias117 ,

The even crazier part to me is some chip makers we were working with pulled out of guaranteed projects with reasonably decent revenue to chase AI instead

We had to redesign our boards and they paid us the penalties in our contract for not delivering so they could put more of their fab time towards AI

nickwitha_k ,

That’s absolutely crazy. Taking the Chicago School MBA philosophy to things as time-consuming and expensive to set up as silicon production.

BlackLaZoR ,

There's really no point unless you work in specific fields that benefit from AI.

Meanwhile every large corpo tries to shove AI into every possible place they can. They'd introduce ChatGPT to your toilet seat if they could

x4740N ,

Imagining a chatgpt toilet seat made me feel uncomfortable

Davel23 ,
Lost_My_Mind ,

Aw maaaaan. I thought you were going to link that youtube sketch I can’t find anymore. Hide and go poop.

BlackLaZoR ,

Don't worry, if Apple does it, it will sell like fresh cookies worldwide

SeaJ ,
Arbiter ,

Idk, they can’t even sell VR.

Odo ,
br3d ,

“Shits are frequently classified into three basic types…” and then gives 5 paragraphs of bland guff

Krackalot ,

With how much scraping of reddit they do, there’s no way it doesn’t try ordering a poop knife off of Amazon for you.

catloaf ,

It’s seven types, actually, and it’s called the Bristol scale, after the Bristol Royal Infirmary where it was developed.

br3d ,

I know. But I was satirising GPT’s bland writing style, not providing facts

Lost_My_Mind ,

Which would be appropriate, because with AI, there's nothing but shit in it.

fuckwit_mcbumcrumble ,

Someone did a demo recently of AI acceleration for 3D upscaling (think DLSS/AMD's equivalent) and it showed a nice boost in performance. It could be useful in the future.

I think it’s kind of like ray tracing. We don’t have a real use for it now, but eventually someone will figure out something that it’s actually good for and use it.

nekusoul ,

AI acceleration for 3d upscaling

Isn’t that not only similar to, but exactly what DLSS already is? A neural network that upscales games?

fuckwit_mcbumcrumble ,

But instead of relying on the GPU to power it, the dedicated AI chip did the work. Like it had its own distinct chip on the graphics card that would handle the upscaling.

I forget who demoed it, and searching for anything related to “AI” and “upscaling” gets buried with just what they’re already doing.

barsoap ,

That’s already the nvidia approach, upscaling runs on the tensor cores.

And no, it’s not something magical, it’s just matrix math. AI workloads are lots of convolutions on gigantic, low-precision, floating-point matrices. Low precision because neural networks are robust against random perturbation, and more rounding is exactly that: random perturbation. There’s no point in spending electricity and heat on high precision if it doesn’t make the output any better.

The kicker? Those tensor cores are less complicated than ordinary GPU cores. For general-purpose hardware, and that also includes consumer-grade GPUs, it’s way more sensible to make sure the ALUs can deal with 8-bit floats and leave everything else the same. That stuff is going to be standard by the next generation of even potatoes: every SoC with an integrated GPU has enough oomph to sensibly run reasonable inference loads. And by “reasonable” I mean actually quite big; as far as I’m aware, e.g. Firefox’s built-in translation runs on the CPU, and those models are small enough.
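A toy sketch of that robustness (my own illustration with made-up numbers, not anything from a vendor): round a layer’s weights onto a coarse 8-bit-style grid and the matrix product barely moves.

```python
# Toy illustration: quantizing weights to an int8-style grid is just a small
# random perturbation, and the output of the matrix math barely changes.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((1, 512))        # an input activation vector
w = rng.standard_normal((512, 512))      # a dense layer's weights

scale = np.abs(w).max() / 127.0          # map the weight range onto ~int8 steps
w_q = np.round(w / scale) * scale        # round weights onto that grid

full = x @ w                             # full-precision result
quant = x @ w_q                          # "low-precision" result

rel_err = np.linalg.norm(full - quant) / np.linalg.norm(full)
print(f"relative error from quantized weights: {rel_err:.2%}")  # on the order of a percent
```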

Nvidia OTOH is very much in the market for AI accelerators and figured it could corner the upscaling market and sell another new generation of cards by making their software rely on those cores even though it could run on the other cores. As AMD demonstrated, their stuff also runs on nvidia hardware.

What’s actually special sauce in that area are the RT cores, that is, accelerators for ray casting through BVH trees. That’s indeed specialised hardware, but those things are nowhere near fast enough to compute enough rays for even remotely tolerable outputs, which is where all that upscaling/denoising comes into play.

fuckwit_mcbumcrumble ,

Nvidia’s tensor cores are inside the GPU; this was outside the GPU, but on the same card (the PCB looked like an abomination). If I remember right, in total it used slightly less power but performed about 30% faster than normal DLSS.

AdrianTheFrog ,

from the articles I’ve found it sounds like they’re comparing it to native…

fuckwit_mcbumcrumble ,

Found it.

neowin.net/…/powercolor-uses-npus-to-lower-gpu-po…

I can’t find a picture of the PCB though; that might have been a leak pre-reveal, and now that it’s revealed, good luck finding it.

AdrianTheFrog ,

Having to send full frames off of the GPU for extra processing has got to come with some extra latency/problems compared to just doing it on the GPU… and I’d be shocked if they have motion vectors and the other engine data DLSS uses, which would require games to be specifically modified for this adaptation. IDK, but I don’t think we have enough details about this to really judge whether it’s useful or not, although I’m leaning on the side of ‘not’ for this particular implementation. They never showed any actual comparisons to DLSS either.

As a side note, I found this other article on the same topic where they obviously didn’t know what they were talking about and mixed up frame rates and power consumption; it’s very entertaining to read:

The NPU was able to lower the frame rate in Cyberpunk from 263.2 to 205.3, saving 22% on power consumption, and probably making fan noise less noticeable. In Final Fantasy, frame rates dropped from 338.6 to 262.9, resulting in a power saving of 22.4% according to PowerColor’s display. Power consumption also dropped considerably, as it shows Final Fantasy consuming 338W without the NPU, and 261W with it enabled.
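(A quick check of those numbers, my own arithmetic rather than anything from the article: the quoted “frame rate” drops are almost exactly the claimed power savings, so they read like the wattage figures relabeled as fps.)

```python
# Sanity check of the quoted figures: each "frame rate" drop matches the
# claimed power saving, which is how you can tell they mixed the two up.
cyberpunk = 1 - 205.3 / 263.2      # ~22.0%, the claimed Cyberpunk power saving
final_fantasy = 1 - 262.9 / 338.6  # ~22.4%, the claimed Final Fantasy power saving
watts = 1 - 261 / 338              # ~22.8%, the separately quoted wattage drop
print(f"{cyberpunk:.1%}, {final_fantasy:.1%}, {watts:.1%}")
```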

AdrianTheFrog ,

We have plenty of real uses for ray tracing right now, from Blender to whatever that Avatar game was doing, to Lumen, to partial RT, to full path tracing; you just can’t do real-time GI with any semblance of fine detail without RT from what I’ve seen (although the Lumen SDF mode gets pretty close)

although the RT cores themselves are more debatably useful, they still give a decent performance boost most of the time over “software” RT

Zatore ,

Most people won’t pay for it because a lot of AI stuff is done cloud-side. Even stuff that could be done locally is done in the cloud a lot. If that weren’t possible, more people would probably want the hardware. It makes more sense for corporations to invest in hardware.

helenslunch ,

a lot of AI stuff is done cloud side.

If it’s done in the cloud then there’s no need for them to buy “AI-accelerated hardware”

_haha_oh_wow_ ,

“enhanced”

Godort , (edited )

This is one of those weird things that venture capital does sometimes.

VC is injecting cash into tech right now at obscene levels because they think that AI is going to be hugely profitable in the near future.

The tech industry is happily taking that money and using it to develop what they can, but it turns out the majority of the public don’t really want the tool if it means they have to pay extra for it. Especially in its current state, where the information it spits out is far from reliable.

cheese_greater ,

I don’t want it outside of heavily sandboxed and limited-scope applications. I don’t get why people want an agent of chaos fucking with all the files and systems they’ve cobbled together

Fiivemacs ,

NDAs also legally prevent you from using this forced garbage. Companies are going to get screwed over by other companies; capitalism is gonna implode, hopefully

Tenthrow ,

I have to endure a meeting at my company next week to come up with ideas on how we can wedge AI into our products because the dumbass venture capitalist firm that owns our company wants it. I have been opting not to turn on video because I don’t think I can control the cringe responses on my face.

TipRing ,

Back in the 90s in college I took a Technology course, which discussed how technology has historically developed, why some things are adopted and other seemingly good ideas don’t make it.

One of the things that is required for a technology to succeed is public acceptance. That is why AI is doomed.

SkyeStarfall ,

AI is not doomed; LLMs, or consumer AI products, might be

In industry, AI is and will be used (though probably not LLMs, except in a few niche use cases)

TipRing ,

Yeah, I mean the AI being shoveled at us by techbros. Actual ML stuff is currently and will continue to be useful for all sorts of not-sexy but vital research and production tasks. I do task automation for my job and I use things like transcription models and OCR, and my company uses smart sorting with rapid image recognition and other really cool ways of getting computers to do things that humans are bad at. It’s things like LLMs that just aren’t there - yet. I have seen very early research on AI that is trained to actually understand language and learn by context; it’s years away, but eventually we might see AI that really can do what the current AI companies are claiming.

independantiste ,

Personally I would choose a processor with AI capabilities over a processor without, but I would not pay more for it

cyborganism ,

I don’t mind the hardware. It can be useful.

What I do mind is the software running on my PC sending all my personal information, screenshots, and keystrokes to a corporation that will use it all for profit to build a user profile, send targeted advertising, and potentially use it against me.

Lost_My_Mind ,

84% said no.

16% punched the person asking them for suggesting such a practice. So they also said no. With their fist.

shonn ,

I wouldn’t even pay less.

catloaf ,

I would pay less, and then either use it for dumb stuff or just not use it at all.

OhmsLawn ,

I honestly have no idea what AI does to a processor, and would therefore not pay extra for the badge.

If it provided a significant speed improvement or something, then yeah, sure. Nobody has really communicated to me what the benefit is. It all seems like hand waving.

originalucifer ,

What they mean is that they are putting in dedicated processors or other hardware just to run an LLM. It doesn’t speed up anything other than the faux-AI tool they are implementing.

LLMs require a ton of math that is better suited to video processors than the general-purpose CPU on most machines.

tal ,

I honestly have no Idea what AI does to a processor

Parallel processing capability. CPUs historically worked with mostly-non-massively-parallelizable tasks; maybe you’d use a GPU if you wanted that.

I mean, that’s not necessarily “AI” as such, but LLMs are a neat application that uses it.

On-CPU video acceleration does parallel processing too.

Software’s going to have to parallelize if it wants to get much by way of performance improvements, anyway. We haven’t been seeing rapid exponential growth in serial computation speed since the early 2000s. But we can get more parallel compute capacity.
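A rough feel for the difference (a toy sketch in Python, nothing specific to any chip): the same dot product written as a one-at-a-time loop versus a single vectorized call that SIMD/GPU-style hardware can fan out across many ALUs.

```python
# Toy sketch: serial vs. parallel-friendly computation of the same dot product.
import time
import numpy as np

a = np.random.rand(1_000_000)
b = np.random.rand(1_000_000)

t0 = time.perf_counter()
acc = 0.0
for i in range(len(a)):            # serial: one multiply-add at a time
    acc += a[i] * b[i]
t_serial = time.perf_counter() - t0

t0 = time.perf_counter()
acc_vec = a @ b                    # vectorized: one call the hardware can parallelize
t_vec = time.perf_counter() - t0

print(f"serial: {t_serial:.3f}s, vectorized: {t_vec:.5f}s, "
      f"same answer: {np.isclose(acc, acc_vec)}")
```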

TheEntity ,

And what do the companies take away from this? “Cool, we just won’t leave you any other options.”

Wooki ,

Plenty of companies offering sane normal solutions and make bank in the process

catloaf ,

History has shown that not to be the case.

Kraiden ,

someone tried to sell me a fucking AI fridge the other day. Why the fuck would I want my fridge to "learn my habits?" I don't even like my phone "learning my habits!"

Ragnarok314159 ,

And it would improve your life zero. That is what is absurd about LLMs in their current iteration: they provide almost no benefit to the vast majority of people.

All a learning model would do for a fridge is send you advertisements for whatever garbage food is on sale. Could it make recipes based on what you have? Tell it you want to slowly get healthier and have it assist with grocery selection?

Nah, fuck you and buy stuff.

Zron ,

Why does a fridge need to know your habits?

It has to keep the food cold all the time. The light has to come on when you open the door.

What could it possibly be learning

1995ToyotaCorolla ,

Hi Zron, you seem to really enjoy eating shredded cheese at 2:00am! For your convenience, we’ve placed an order for 50lbs of shredded cheese based on your rate of consumption. Thanks!

variants ,

We also took the liberty of canceling your health insurance to help protect the shareholders from your abhorrent health expenses in the far future

rottingleaf ,

If your fridge spies on you, certain people can have better insight into how healthy your food habits are, how organized you are, how often things go bad and get thrown out, what medicine (the kind that needs to be kept cold) you put in there, and how often you use it.

That will then affect your insurance, your credit rating, and possibly many other ratings other people are interested in.

njm1314 ,

So it can see what you like to eat, then it can tell your grocery store, and then your grocery store can raise the prices on those items. That’s the point. It’s the same thing with those memberships and coupon apps. That’s the end goal.

rottingleaf ,

They can see what you like to eat by what you’re buying, LOL. No, not this.

A fridge can give them information on how you eat.

JackbyDev ,
  1. Know when you’re about to put groceries in so it makes the fridge colder so the added heat doesn’t make things go bad.
  2. Know when you don’t use it and let it get a tiny bit warmer to save a teeny bit of power. (The vast majority of power is cooling new items, not keeping things cold though.)
  3. Tell you where things are?
  4. Ummm… Maybe give you an optimized layout of how to store things?
  5. Be an attack vector on your home’s wifi
  6. Wait, no, uh,
  7. Push notifications
  8. Do you not have phones?
upside431 ,

To remind you when you should go buy groceries haha

jballs ,
explodicle ,
trollblox_ ,

always xkcd

AdrianTheFrog ,

it doesn’t seem all that hard to make, as long as you don’t mind the severely reduced flexibility in capacity and glass bottles shattering against each other at the bottom

jubilationtcornpone ,

I’m still pissed about the fact that I can’t buy a reasonably priced TV that doesn’t have WiFi. I should never have left my old LG Plasma bolted to the wall of my previous house when I sold it. That thing had a fantastic picture and doubled as a space heater in the winter.

fruitycoder ,

I want AI in my fridge for sure. Grocery shopping sucks. Forgetting how old something is sucks. Letting all the cold out while crawling around to see what I have sucks.

I want my fridge to be like the Sims: just get deliveries or pick up the order, fill it up, and get told what ingredients I have. Bonus points if it can just tell me what recipes I can cook right now; even better if I can ask by time frame.

That would be sick!

Still not going to give ecorp all of my data or put some half-baked Internet of Things device on my WiFi for it. But it would be cool.

tal ,

That’s kind of abstract. Like, nobody pays purely for hardware. They pay for the ability to run software.

The real question is, would you pay $N to run software package X?

Like, go back to 2000. If I say “would you pay $N for a parallel matrix math processing card”, most people are going to say “no”. If I say “would you pay $N to play Quake 2 at resolution X and fps Y and with nice smooth textures,” then it’s another story.

I paid $1k for a fast GPU so that I could run Stable Diffusion quickly. If you asked me “would you pay $1k for an AI-processing card” and I had no idea what software would use it, I’d probably say “no” too.

Grimy ,

Yup, the answer is going to change real fast when the next Oblivion with NPCs you can talk to needs this kind of hardware to run.

tal , (edited )

I’m still not sold that dynamic text generation is going to be the major near-term application for LLMs, much less in games. Like, don’t get me wrong, it’s impressive what they’ve done. But I’ve also found it to be the least-practically-useful of the LLM model categories. Like, you can make real, honest-to-God solid usable graphics with Stable Diffusion. You can do pretty impressive speech generation in TortoiseTTS. I imagine that someone will make a locally-runnable music LLM model and software at some point if they haven’t yet; I’m pretty impressed with what the online services do there. I think that there are a lot of neat applications for image recognition; the other day I wanted to identify a tree and seedpod. Someone hasn’t built software to do that yet (that I’m aware of), but I’m sure that they will; the ability to map images back to text is pretty impressive. I’m also amazed by the AI image upscaling that Stable Diffusion can do, and I suspect that there’s still room for a lot of improvement there, as that’s not the main goal of Stable Diffusion. And once someone has done a good job of building a bunch of annotated 3d models, I think that there’s a whole new world of 3d.

I will bet that before we see that becoming the norm in games, we’ll see LLMs regularly used for either pre-generated speech synth or in-game speech synthesis, so that characters say text which might be procedurally generated and isn’t just static pre-recorded samples, but isn’t necessarily generated by an LLM. Like, it’s not practical to have a human voice actor cover every possible phrase one might want an in-game character to speak with static recorded speech.

Grimy ,

I think it’s coming pretty fast. There’s already a mod for Skyrim that lets you talk to your companion. People are spending hours talking to LLMs and roleplaying; the first triple-A game to incorporate it is going to be a massive hit IMO. I’m actually surprised no one’s been coming out with visual novels using them, it seems like a perfect use case.

It’s definitely going to be used first for making the content of the game like you said though.

AdrianTheFrog ,

there are some local gen-AI music models, although I don’t know how good they are yet, as I haven’t tried any myself (Stable Audio is one, but I’m sure there are others)

also, minor linguistic nitpick, but LLM stands for “large language model” (you could maybe get away with it for PixArt and SD3, as they use T5 for prompt encoding, which is an LLM; I’m sure some audio models with lyrics use them too); the term you’re looking for is probably “generative”

peopleproblems ,

AI in Movies: “The only Logical solution, is the complete control/eradication of humanity.”

AI in Real Life: “Dave, I see you only have beer, soda, and cheese in your fridge. I am concerned for your health. I can write you a reminder to purchase better food choices.” Dave: “THESE AI ARE EVIL, I WILL NEVER SUBMIT TO YOUR POWER!”

JDPoZ ,

AI in Real Life: “Dave, I see you only have beer, soda, and cheese in your fridge. I am concerned for your health. I can write you a reminder to purchase better food choices.” Dave: “THESE AI ARE EVIL, I WILL NEVER SUBMIT TO YOUR POWER!” More like:

“Dave, I see you have beer, soda, and cheese in your fridge. Have you thought about ordering PRIME energy drink? There’s a sale.”

No.

“36 count case of PRIME energy drink ordered!”

I said no.

“changed PRIME energy drink 36 count case shipping to next-day air for $150.79!”

GODDAMNIT!

peopleproblems ,

Yeah this is probably more likely. It’s just so depressing

Zorque ,

Please drink verification can.

fruitycoder ,

I think South Park’s vision is still the worst: AI so human you can fall in love with it, but it still tries to manipulate you into buying things.

metaStatic ,

this goes to show just how far the current grift has gone.

AI-enhanced hardware? Jesus fuck, take all my money, that's amazing.

Dedicated LLM chatbot hardware? Die in a fire for even suggesting this is AI.
