
UltraGiGaGigantic ,

We’re not gonna make it, are we? People, I mean. https://lemm.ee/pictrs/image/4dc62d50-b38e-4a89-97b0-b849cdc0fcf8.jpeg

alessandro ,

Didn’t John Connor befriend the second AI he found?

Marin_Rider ,

yeah but it didn’t try to lock him into a subscription plan or software ecosystem

Appoxo ,

It locked him into the world of the terminators? Imo a mighty subscription

/j

alessandro ,

yeah but it didn’t try to lock him into a subscription plan or software ecosystem

Not the AI’s fault: the first one (the one that got destroyed) was remotely controlled by the product of a big corp (Skynet); the other one was local and offline.

Moral of the story: there’s a difference between the AI that runs locally on your GPU and the one that runs on Elon’s remote servers… and that difference may be life or death.

roguetrick ,

Just need the right name for it. Sound Blasters are still being produced, aren’t they? There’s always a market.

GoodEye8 ,

Well yeah, because dedicated DACs have a tangible benefit of better audio. If you want better audio you need to buy a quality DAC and quality cans.

I also used to think it’s dumb, because who cares as long as you can hear. But then I built a new PC, and I don’t know if it was a faulty mobo or just an unlucky setup, but the internal DAC started picking up static. So I got an external DAC, and what I noticed was that the audio sounded clearer and I could hear things in the sound that I couldn’t hear before. It was magical; it was like someone had added new layers to my favorite songs. I had taken the audio crack.

I pretty quickly gave away my DAC along with my Audio-Technicas because I could feel the urge. I needed another hit. I needed more. I got this gnawing itch and I knew I had to get out before the addiction completely took over. Now I live in static because I do not dare to touch the sun again.

Sound Blasters may be shit, but the hardware they’re supposed to sell is legit; it has a tangible benefit to whoever can tell the difference. But with AI, what is the tangible benefit that you couldn’t get by getting a better GPU?

ArchRecord , (edited )

And when traditional AI programs can be run on much lower-end hardware with the same speed and quality, those chips will have no use. (Spoiler alert: it’s happening right now.)

Corporations, for some reason, can’t fathom why people wouldn’t want to pay hundreds of dollars more just for a chip that can run AI models they won’t need most of the time.

If I want to use an AI model, I will, but if you keep developing shitty features that nobody wants using it, just because “AI = new & innovative,” then I have no incentive to use it. LLMs are useful to me sometimes, but an LLM that tries to summarize the activity on my computer isn’t that useful to me, so I’m not going to pay extra for a chip that I won’t even use for that purpose.

Natanael ,

You borked your link

OfficerBribe ,
Natanael ,

That still needs an FPGA. While they certainly seem to be able to use smaller ones, adding an FPGA chip will still add cost.

ArchRecord ,

Whoops, no clue how that happened, fixed!

JokeDeity ,

The other 26% were bots answering.

some_guy ,

16%

capital ,

My old ass GTX 1060 runs some of the open source language models. I imagine the more recent cards would handle them easily.

What’s the “AI” hardware supposed to do that any gamer with recent hardware can’t?
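
For a sense of scale, here’s a minimal sketch of what “running an open model on an older card” can look like, using the llama-cpp-python bindings. The model file name, layer count, and context size are placeholders, not recommendations; a 6 GB card like a 1060 can typically only offload part of a quantized 7B model:

```python
# Minimal sketch: run a small quantized open model locally with
# llama-cpp-python (pip install llama-cpp-python). The model path is
# a placeholder; n_gpu_layers controls how many layers are offloaded
# to the GPU, with the remainder running on the CPU.
from llama_cpp import Llama

llm = Llama(
    model_path="./mistral-7b-instruct.Q4_K_M.gguf",  # hypothetical local file
    n_gpu_layers=20,  # partial offload; a 6 GB card can't hold every layer
    n_ctx=2048,       # modest context window to keep memory use down
)

out = llm("Q: What is DLSS? A:", max_tokens=64)
print(out["choices"][0]["text"])
```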

Appoxo ,

Run it faster.
A CPU can also compute graphics, but you’d wait significantly longer than with hardware-accelerated graphics.
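
To make the “run it faster” point concrete, a minimal sketch (PyTorch; the matrix size is arbitrary and absolute timings vary wildly by machine) that times the same matrix multiply on the CPU and, if present, on the GPU:

```python
# Minimal sketch: the same matrix multiply on CPU vs. GPU.
# The point is the relative gap, not the absolute numbers.
import time

import torch

x = torch.randn(4096, 4096)

t0 = time.perf_counter()
_ = x @ x  # runs on the CPU
print(f"CPU: {time.perf_counter() - t0:.3f}s")

if torch.cuda.is_available():
    xg = x.cuda()
    torch.cuda.synchronize()  # wait for the host-to-device copy
    t0 = time.perf_counter()
    _ = xg @ xg  # same multiply, hardware-accelerated
    torch.cuda.synchronize()  # wait for the kernel to finish
    print(f"GPU: {time.perf_counter() - t0:.3f}s")
```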

UltraMagnus0001 ,

Fuck, they won’t upgrade to TPM for Windows 11

lost_faith ,

Still haven’t turned mine on; don’t want no surprises after a long day at work

blazeknave ,

Not even on my phone

kemsat ,

What does AI enhanced hardware mean? Because I bought an Nvidia RTX card pretty much just for the AI enhanced DLSS, and I’d do it again.

WhyDoYouPersist ,

When they start calling everything AI, soon enough it loses all meaning. They’re gonna have to start marketing things as AI-z, AI 2, iAI, AIA, AI 360, AyyyAye, etc. Got their work cut out for em, that’s for sure.

werefreeatlast ,

Instead of Nvidia knowing some of your habits, they will know most of your habits. $$$.

kemsat ,

Just saying, I’d welcome some competition from other players in the industry. AI-boosted upscaling is a great use of the hardware, as long as it happens on your own hardware only.

Snapz ,

And the other 16% would also pay you $230 to hit them in the face with a shovel

PenisWenisGenius , (edited )

I’m generally opposed to anything that involves buying new hardware. This isn’t the 1980s. Computers are powerful as fuck. Stop making software that barely runs on them. If they can’t make AI more efficient, then fuck it. If they can’t make game graphics good without a minimum of a $1000 GPU that produces as much heat as a space heater, maybe we need to go back to 2000s-era 3D. There is absolutely no point in making graphics more photorealistic than maybe Skyrim. The route they’re going is not sustainable.

reev ,

The point of software like DLSS is to run stuff better on computers with worse specs than what you’d normally need to run a game at that quality. There’s plenty of AI tech that can actually improve experiences, and saying that Skyrim graphics are the absolute max we as humanity “need” or “should want” is a weird take ¯\_(ツ)_/¯

warm ,

The quality of games has dropped a lot; they make them fast, and as long as it can just about reach 60fps at 720p, they release it. Hardware is insane these days, yet the games mostly look the same as they did 10 years ago (Skyrim never looked amazing for 2011; BF3, Crysis 2, Forza, Arkham City etc. came out then too), but their performance has dropped significantly.

I don't want DLSS and I refuse to buy a game that relies on upscaling to have any meaningful performance. Everything should be over 120fps at this point, way over. But people accept the shit and buy the games up anyway, so nothing is going to change.

The point is, we would rather have games looking like Skyrim with great performance vs '4K RTX real time raytracing ultra AI realistic graphics wow!' at 60fps.

nekusoul , (edited )

The quality of games has dropped a lot; they make them fast

Isn’t the public opinion that games take way too long to make nowadays? They certainly don’t make them fast anymore.

As for the rest, I also can’t really agree. IMO, graphics have taken a huge jump in recent years, even outside of RT. Lighting, texture quality, shaders, as well as object density and variety have been getting a noticeable bump. Other than the occasional dud and the awful shader compilation stutter that has plagued many PC games over the last few years (but is getting more awareness now), I’d argue that performance is pretty good for most games right now.

That’s why I see techniques like DLSS/FSR/XeSS/TSR not as a crutch, but just as one of the dozen other rendering shortcuts game engines have accumulated over the years. That said, it’s not often we see a new technique deliver such a big performance boost while having almost no visual impact.

Also, who decided that ‘we’ would rather have games looking like Skyrim? While I do like high FPS very much, I also like shiny graphics with all the bells and whistles. A game like ‘The Talos Principle 2’, for example, does hammer the GPU quite a bit on its highest settings, but it certainly delivers in the graphics department. So much so that I’ve probably spent as much time admiring the highly detailed environments as I did actually solving the puzzles.

warm , (edited )

Isn't the public opinion that games take way too long to make nowadays? They certainly don't make them fast anymore.

I think the problem here is that they announce them way too early, so people are waiting like 2-3 years for it. It's better if they are developed behind the scenes and 'surprise' announced a few months prior to launch.

Graphics have advanced of course, but it's become a case of diminishing returns, and now a lot of games have resorted to spamming post-processing effects and implementing as much foliage and fog as possible to try and make the games look better. I always bring Destiny 2 up in this conversation, because the game looks great, runs great and the graphical fidelity is amazing - no blur but no rough edges. Versus like any UE game, which has terrible TAA; if you disable it, then everything is jagged and aliased.

DLSS etc are defo a crutch and they are designed as one (originally for real-time raytracing), hence the better versions requiring new hardware. Games shouldn't be relying on them and their trade-offs are not worth it if you have average modern hardware where the games should just run well natively.

It's not so much that we specifically want Skyrim, maybe that one guy, it was just an extreme example to put the point across. It's obviously all subjective; making things shiny obviously attracts people's eyes during marketing.

nekusoul ,

I see. That I can mostly agree with. I really don’t like the temporal artifacts that come with TAA either, though it’s not a deal-breaker for me if the game hides it well.

A few tidbits I’d like to note though:

they announce them way too early, so people are waiting like 2-3 years for it.

Agree. It’s kind of insane how early some games are being announced in advance. That said, 2-3 years back then was the time it took for a game to get a sequel. Nowadays you often have to wait an entire console cycle for a sequel to come out instead of getting a trilogy of games during one.

Games shouldn’t be relying on them and their trade-offs are not worth it

Which trade-offs are you alluding to? Assuming a halfway decent implementation, DLSS 2+ in particular often yields a better image quality than even native resolution with no visible artifacts, so I turn it on even if my GPU can handle a game just fine, even if just to save a few watts.

warm ,

Which trade-offs are you alluding to? Assuming a halfway decent implementation, DLSS 2+ in particular often yields a better image quality than even native resolution with no visible artifacts, so I turn it on even if my GPU can handle a game just fine, even if just to save a few watts.

Trade-offs being the artifacts; while not that noticeable to most, I did try it and anything in fast motion does suffer. Another being the hardware requirement. I don't mind it existing, I just don't think mid-to-high-end setups should ever have to enable it for a good experience (well, what I personally consider a good experience :D).

UnderpantsWeevil ,

We should have stopped with Mario 64. Everything else has been an abomination.

crazyminner ,

I was recently looking for a new laptop and I actively avoided laptops with AI features.

lamabop ,

Look, me too, but the average punter on the street just looks at the new AI features and goes “OK, sure, give it to me.” Tell them about the dodgy shit that goes with AI and you’ll probably get a shrug at most.

alessandro ,

I don’t think the poll question was well made… “would you like to part with your money for…” vaguely shakes hand in air “…AI?”

People were already paying for “AI” even before ChatGPT came out to popularize things: DLSS.

Buelldozer ,

I’m fine with NPUs / TPUs (AI-enhancing hardware) being included with systems because it’s useful for more than just OS shenanigans and commercial generative AI. Do I want Microsoft CoPilot Recall running on that hardware? No.

However, I’ve bought TPUs for things like Frigate servers and various ML projects. For gamers there are some really cool use cases out there for using local LLMs to generate NPC responses in RPGs. For “Smart Home” enthusiasts, things like Home Assistant will be rolling out support for local LLMs later this year to make voice commands more context aware.

So do I want that hardware in there so I can use it MYSELF for other things? Yes, yes I do. You probably will eventually too.
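
As an illustration of the non-generative uses mentioned above, here’s a minimal sketch of image classification on a Coral Edge TPU with Google’s pycoral library; the model and image file names are placeholders (compiled *_edgetpu.tflite models come from coral.ai or the Edge TPU compiler):

```python
# Minimal sketch: classify one image on a Coral Edge TPU with pycoral.
# File names below are placeholders; a Coral device must be attached.
from PIL import Image
from pycoral.adapters import classify, common
from pycoral.utils.edgetpu import make_interpreter

interpreter = make_interpreter("mobilenet_v2_edgetpu.tflite")  # hypothetical model
interpreter.allocate_tensors()

# Resize the input to whatever dimensions the model expects.
image = Image.open("frame.jpg").resize(common.input_size(interpreter))
common.set_input(interpreter, image)
interpreter.invoke()  # inference runs on the TPU, not the CPU/GPU

for klass in classify.get_classes(interpreter, top_k=3):
    print(klass.id, klass.score)
```

This is roughly the kind of workload Frigate offloads to a TPU, which is how a low-power box can run object detection on several camera feeds at once.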

Codilingus ,

I wish someone would make software that utilizes things like an M.2 Coral TPU to enhance gameplay, like frame gen or upscaling for games and videos. Some GPUs are even starting to put M.2 slots on the GPU itself, in case the latency from a motherboard M.2 slot to the GPU over PCIe would be too high.

the_post_of_tom_joad ,

Only 7% say they would pay more, which to my mind is the percentage of respondents who have no idea what “AI” in its current bullshit context even is

taiyang ,

Or they know a guy named Al and got confused. ;)

jballs ,

Maybe I’m in the minority here, but I’d gladly pay more for Weird Al enhanced hardware.

lost_faith ,

Hardware breaks into a parody of whatever you are doing

Me - laughing and vibing

NigelFrobisher ,

A man walks down the street
He says why am I short of attention
Got a short little span of attention
And woe my nights are so long

flicker ,

I figure they’re those “early adopters” who buy the New Thing! as soon as it comes out, whether they need it or not, whether it’s garbage or not, because they want to be seen as on the cutting edge of technology.

n3m37h ,

Let me put it in lamens terms… FUCK AI… Thanks, have a great day

iAmTheTot ,

FYI the term is “layman’s”, as if you were using the language of a layman, or someone who is not specifically experienced in the topic.

krashmo ,

Sounds like something a lameman would say

Grabthar ,

Well, when life hands you lémons…
