capital ,

My old-ass GTX 1060 runs some of the open-source language models. I imagine the more recent cards would handle them easily.
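
Something like this is all it takes, as a minimal sketch, assuming the llama-cpp-python bindings and a quantized GGUF model you've downloaded yourself (the filename below is just a placeholder):

```python
# Minimal local-LLM sketch: assumes `pip install llama-cpp-python`
# built with CUDA support, and a quantized GGUF model on disk.
from llama_cpp import Llama

llm = Llama(
    model_path="./mistral-7b-instruct.Q4_K_M.gguf",  # placeholder filename
    n_gpu_layers=20,  # offload what fits into the 1060's 6 GB of VRAM
)

out = llm("Q: What is an NPU for? A:", max_tokens=64)
print(out["choices"][0]["text"])
```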

What’s the “AI” hardware supposed to do that any gamer with recent hardware can’t?

Appoxo ,
@Appoxo@lemmy.dbzer0.com avatar

Run it faster.
A CPU can also compute graphics, but you wait significantly longer than with dedicated, hardware-accelerated graphics hardware.
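
To make that concrete, here's a rough illustration: the same matrix multiply timed on CPU and GPU (a sketch assuming PyTorch with a CUDA build; exact numbers vary wildly by machine):

```python
import time
import torch

x = torch.randn(4096, 4096)

t0 = time.time()
y = x @ x  # plain CPU matrix multiply
print(f"CPU: {time.time() - t0:.3f} s")

if torch.cuda.is_available():
    xg = x.to("cuda")
    torch.cuda.synchronize()  # wait for the copy to finish
    t0 = time.time()
    yg = xg @ xg              # same multiply on the GPU
    torch.cuda.synchronize()  # wait for the kernel to finish
    print(f"GPU: {time.time() - t0:.3f} s")
```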

Xenny ,

As with any proprietary hardware on a GPU, it all comes down to third-party software support, and classically, if the market isn’t there, then it’s not supported.

Appoxo ,
@Appoxo@lemmy.dbzer0.com avatar

Assuming there’s no catch-on after 3-4 product cycles, I’d say the tech is either not mature enough, too expensive for too little result, or (as you said) there’s generally no interest in it.

Maybe it needs a bit of maturing and a re-introduction at a later point.

rainynight65 ,

I am generally unwilling to pay extra for features I don’t need and didn’t ask for.

Appoxo ,
@Appoxo@lemmy.dbzer0.com avatar

Raytracing is something I’d pay for even if unasked, assuming it meaningfully impacts quality and doesn’t demand outlandish prices.
And they’d need to put it in unasked and cooperate with devs, or else it won’t catch on quickly enough.
Remember Nvidia Ansel?

roguetrick ,

Just need the right name for it. Sound Blasters are still being produced, aren’t they? There’s always a market.

GoodEye8 ,

Well yeah, because dedicated DACs have the tangible benefit of better audio. If you want better audio you need to buy a quality DAC and quality cans.

I also used to think it was dumb, because who cares as long as you can hear? But then I built a new PC, and I don’t know if it was a faulty mobo or just an unlucky setup, but the internal DAC started picking up static. So I got an external DAC, and what I noticed was that the audio sounded clearer; I could hear things in the sound that I couldn’t hear before. It was magical, like someone had added new layers to my favorite songs. I had taken the audio crack.

I pretty quickly gave away my DAC along with my Audio-Technicas because I could feel the urge. I needed another hit. I needed more. I got this gnawing itch, and I knew I had to get out before the addiction completely took over. Now I live in static because I do not dare touch the sun again.

Sound Blasters may be shit, but the hardware they’re supposed to sell is legit; it has a tangible benefit to whoever can tell the difference. But with AI, what is the tangible benefit that you couldn’t get from a better GPU?

FMT99 ,

Show the actual use case in a convincing way and people will line up around the block. Generating some funny pictures or making generic suggestions about your calendar won’t cut it.

overload ,

I completely agree. There are some killer AI apps, but why should AI run on my OS? Recall is a complete disaster of a product and I hope it doesn’t see the light of day, but I’ve no doubt that there’s a place for AI on the PC.

Whatever application there is for AI at the OS level, it needs to be a trustless system that the user has complete control of. I’d be all for an open-source AI running at that level, but Microsoft is not going to do that, because they want to ensure that they control your OS data.

PriorityMotif ,
@PriorityMotif@lemmy.world avatar

Machine learning in the OS is a great value-add for medium to large companies, as it will allow them to track the real productivity of office workers and easily replace them. Say goodbye to middle management.

overload , (edited )

I think it could definitely automate some roles where you aren’t necessarily thinking and every decision is made from information already available on the PC. Those roles certainly exist, but some decisions need human input, and I’m not sure how you automate those roles away just because the software sees what happens on the PC every day.

If anything, I think this feature will be used to spy on users at work and flag when keystrokes fall below a certain level each day, but I’m sure that’s already possible for companies to do (they just don’t).

chicken ,

I can’t tell how good any of this stuff is, because none of the language they’re using to describe performance makes sense in comparison with running AI models on a GPU. How big a model can this stuff run, and how does it compare to the graphics cards people use for AI now?
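
For a rough yardstick (my own back-of-the-envelope math, not anything from the marketing), weight memory scales as parameter count times bytes per weight:

```python
# Rule of thumb: weight memory ~ parameters x bytes per weight,
# plus overhead for activations and the KV cache.
params = 7e9  # a typical "7B" open-weight model
for bits in (16, 8, 4):
    gb = params * bits / 8 / 1e9
    print(f"{bits}-bit weights: ~{gb:.1f} GB")
# -> ~14.0 GB, ~7.0 GB, ~3.5 GB: a 4-bit 7B model fits in consumer
#    GPU VRAM, which is the comparison NPU marketing never states.
```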

ssm ,
@ssm@lemmy.sdf.org avatar

AI-en{hanced,shittified}

mlg ,
@mlg@lemmy.world avatar

Even DLSS only works great for some types of games.

Although there have been some clever uses of it, lots of games could gain a lot from proper efficiency of the game engine.

War Thunder runs like total crap on even the highest-end hardware, yet World of Warships has much more detailed ships and textures and runs fine off an HDD with pre-GTX 700 series graphics.

Meanwhile on Linux, Compiz still runs crazy window effects and the 3D cube desktop much better and faster than KDE. It’s so good I even recommend it for old devices with any kind of GPU, because the hardware acceleration will make your desktop fast and responsive compared to even the lightest window managers like Openbox.

TF2 went from 32-bit to 64-bit and saw immediate performance gains of upwards of 50%, almost entirely removing the game’s stuttering issues.

Batman Arkham Knight ran on a heavily modified version of Unreal 3 which was insane for the time.

Most modern games and applications really don’t need the latest and greatest hardware; they just need to be efficiently programmed, which is sometimes almost an art in itself. Slapping on “AI” to reduce the work is sort of a lazy solution that will have side effects, because you’re effectively predicting the output.

RememberTheApollo_ ,

When a decent GPU alone is ~$1k and someone wants you to pay even more for a feature that offers no tangible benefit, why the hell would anyone want it? I haven’t bought a prebuilt PC in over 25 years; I build my own, and for family and friends. I’m building another next week for family, and AI isn’t even on the radar. If anything, this one is going to be anti-AI: it’s getting a Linux dual-boot and sticking with Win10, because there’s no way I’m subjecting family to that Win11 adware.

GalacticTaterTot ,

The only reason I have any enthusiasm about CoPilot+ PCs (AI PCs or whatever new name they get in 6 months) is because of ARM and battery life.

Heck, I’ll trade them all the AI features for no ads.

NounsAndWords ,

I would pay for AI-enhanced hardware…but I haven’t yet seen anything that AI is enhancing, just an emerging product being tacked on to everything they can for an added premium.

lmaydev ,

I use it heavily at work nowadays. It would be nice to run it locally.

baggins ,

You don’t need AI-enhanced hardware for that, just normal-ass hardware, and you run AI software on it.

lmaydev ,

But you can run more complex networks faster. Which is what I want.

baggins ,

Maybe I’m just not understanding what AI-enabled hardware is even supposed to mean.

lmaydev ,

It’s hardware specifically designed for running AI tasks, like neural networks.

An NPU, or Neural Processing Unit, is a dedicated processor or processing unit on a larger SoC designed specifically for accelerating neural network operations and AI tasks. Unlike general-purpose CPUs and GPUs, NPUs are optimized for data-driven parallel computing, making them highly efficient at processing massive multimedia data like videos and images and at processing data for neural networks.
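
In practice, software reaches that hardware through frameworks that abstract the accelerator away. A sketch of what that looks like with ONNX Runtime, assuming the DirectML execution provider is present ("model.onnx" is a placeholder path):

```python
import onnxruntime as ort

# Ask for the DirectML accelerator first, fall back to plain CPU.
session = ort.InferenceSession(
    "model.onnx",  # placeholder path
    providers=["DmlExecutionProvider", "CPUExecutionProvider"],
)
print(session.get_providers())  # shows which backend was actually picked
```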

ILikeBoobies ,

github.com/huggingface/candle

You can look into this; however, it’s not what this discussion is about.

lmaydev ,

An NPU, or Neural Processing Unit, is a dedicated processor or processing unit on a larger SoC designed specifically for accelerating neural network operations and AI tasks.

Exactly what we are talking about.

ILikeBoobies ,

Stick to the discussion of paying a premium for the hardware, not the software.

lmaydev ,

Not sure what you mean? The hardware runs the software tasks more efficiently.

ILikeBoobies ,

The discussion is whether people should or would pay extra for hardware designed around AI vs. just getting better hardware.

Nachorella ,

I’m curious what you use it for at work.

lmaydev ,

I’m a programmer, so when learning a new framework or library I use it as interactive docs that allow follow-up questions.

I also use it to generate things like regex and SQL queries.

It’s also really good at refactoring code and other repetitive tasks like that.
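
For example, the kind of regex I’d ask it for instead of hand-writing (this particular pattern is just an illustration, not something I shipped):

```python
import re

# Matches ISO-style dates such as 2024-06-18 (illustrative, not exhaustive).
iso_date = re.compile(r"\b\d{4}-(0[1-9]|1[0-2])-(0[1-9]|[12]\d|3[01])\b")
print(bool(iso_date.search("released on 2024-06-18")))  # True
```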

Nachorella ,

It does seem like a good translator for the less human-readable stuff like regex. I’ve dabbled with it a bit, but I’m a technical artist and haven’t found much use for it in the things I do.

the_crotch ,

Not the guy you were asking, but it’s great for writing PowerShell scripts.

DerisionConsulting ,

In the 2010s, it was cramming a phone app and WiFi into things to try to justify the higher price, while also spying on users in new ways. The device might even have a screen for basically no reason.
In the 2020s, it’s those same useless features, now with a bit of software with a flashy name that removes even more control from the user and lets the manufacturer spy on them even further.

ryathal ,

Anything actually AI-enhanced would be advertising the enhancement, not the AI part.

Fermion ,

It’s like RGB all over again.

At least RGB didn’t make a giant stock market bubble…

hsr ,
@hsr@lemmy.dbzer0.com avatar

DLSS and XeSS (XMX) are AI, and they’re noticeably better than the non-hardware-accelerated alternatives.

PriorityMotif , (edited )
@PriorityMotif@lemmy.world avatar

Already had that Google thingy for years now. The USB/NVMe device for image recognition. Can’t remember what it’s called now. Cost like $30.

Edit: Google Coral TPU

lauha ,

My Samsung A71 has had devil AI since day one. You know that feature where you can mostly use fingerprint unlock, but once a day or so it asks for the actual passcode for added security? My A71’s AI has a 100% success rate of picking the most inconvenient time to ask for the passcode instead of letting me do my thing.

kemsat ,

What does AI-enhanced hardware mean? Because I bought an Nvidia RTX card pretty much just for the AI-enhanced DLSS, and I’d do it again.

WhyDoYouPersist ,

When they start calling everything AI, soon enough it loses all meaning. They’re gonna have to start marketing things as AI-z, AI 2, iAI, AIA, AI 360, AyyyAye, etc. Got their work cut out for em, that’s for sure.

werefreeatlast ,

Instead of Nvidia knowing some of your habits, they will know most of your habits. $$$.

kemsat ,

Just saying, I’d welcome some competition from other players in the industry. AI-boosted upscaling is a great use of the hardware, as long as it happens on your own hardware only.

JokeDeity ,

The other 26% were bots answering.

some_guy ,

16%

Buelldozer ,
@Buelldozer@lemmy.today avatar

I’m fine with NPUs / TPUs (AI-enhancing hardware) being included with systems because it’s useful for more than just OS shenanigans and commercial generative AI. Do I want Microsoft CoPilot Recall running on that hardware? No.

However, I’ve bought TPUs for things like Frigate servers and various ML projects. For gamers there are some really cool use cases out there for using local LLMs to generate NPC responses in RPGs. For “Smart Home” enthusiasts, things like Home Assistant will be rolling out support for local LLMs later this year to make voice commands more context-aware.
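
As a toy sketch of the NPC idea, assuming llama-cpp-python and a small quantized model (the filename and persona here are made up):

```python
from llama_cpp import Llama

llm = Llama(model_path="./npc-model.Q4_K_M.gguf")  # made-up filename

def npc_reply(persona: str, player_line: str) -> str:
    # Frame the exchange as a short dialogue and stop before the
    # model starts speaking for the player.
    prompt = f"{persona}\nPlayer: {player_line}\nNPC:"
    out = llm(prompt, max_tokens=48, stop=["Player:"])
    return out["choices"][0]["text"].strip()

print(npc_reply("You are a grumpy blacksmith.", "Can you repair my sword?"))
```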

So do I want that hardware in there so I can use it MYSELF for other things? Yes, yes I do. You probably will eventually too.

Codilingus ,

I wish someone would make software that utilizes things like an M.2 Coral TPU to enhance gameplay with frame generation or upscaling for games and videos. Some GPUs are even starting to put M.2 slots on the GPU itself, in case the latency from motherboard M.2 to the PCIe GPU is too high.

blazeknave ,

Not even on my phone
