
floofloof ,

Shed a tear, if you wish, for Nvidia founder and Chief Executive Jensen Huang, whose fortune (on paper) fell by almost $10 billion that day.

Thanks, but I think I’ll pass.

brbposting ,

I’m sure he won’t mind. Worrying about that doesn’t sound like working.

I work from the moment I wake up to the moment I go to bed. I work seven days a week. When I’m not working, I’m thinking about working, and when I’m working, I’m working. I sit through movies, but I don’t remember them because I’m thinking about work.

  • Huang on his 14-hour workdays

It is one way to live.

MelodiousFunk , (edited )

That sounds like mental illness.

ETA: Replace “work” in that quote with practically any other activity/subject, whether outlandish or banal.

I sit through movies but I don’t remember them because I’m thinking about baking cakes.

I sit through movies but I don’t remember them because I’m thinking about traffic patterns.

I sit through movies but I don’t remember them because I’m thinking about cannibalism.

I sit through movies but I don’t remember them because I’m thinking about shitposting.

Obsessed with something? At best, you’re “quirky” (depending on what you’re obsessed with). Unless it’s money. Being obsessed with that is somehow virtuous.

brbposting ,

Valid argument for sure

It would be sad if therapists kept telling him that but he could never remember

MelodiousFunk ,

“Sorry doc, was thinking about work. Did you say something about line go up?”

lauha ,

Some would not call that living

msage ,

Yeah ok sure buddy, but what do you DO actually?

PhlubbaDubba ,

I’m just praying people will fucking quit it with the worries that we’re about to get SKYNET or HAL when binary computing would inherently be incapable of recreating the fast pattern recognition required to replicate or outpace human intelligence.

Moore’s law is about scaling computing power, which is a measure of hardware performance, not of the software you can run on it.

utopiah ,

Unfortunately it’s part of the marketing. Thanks, OpenAI, for that “Oh no… we can’t share GPT-2, too dangerous”… and then, here it is. Definitely interesting at the time, but not world-shattering. Same for GPT-3, but through an exclusive partnership with Microsoft, all closed; rinse and repeat for GPT-4. It’s a scare tactic to lock down what was initially open, both directly and by closing the door behind them through regulation, or at least trying to.

CosmoNova ,

Welp, it was ‘fun’ while it lasted. Time for everyone to adjust their expectations to much more humble levels than what was promised and move on to the next scheme. After Metaverse, NFTs and ‘Don’t become a programmer, AI will steal your job literally next week!11’, I’m eager to see what they come up with next. And by eager I mean I’m tired. I’m really tired and hope the economy just takes a damn break from breaking things.

dustyData ,

But if it doesn’t disrupt it isn’t worth it!

/s

Fetus ,

I just hope I can buy a graphics card without having to sell organs some time in the next two years.

catloaf ,

My RX 580 has been working just fine since I bought it used. I’ve not been able to justify buying a new (used) one. If you have one that works, why not just stick with it until the market gets flooded with used ones?

macrocephalic ,

Don’t count on it. It turns out that the sort of stuff graphics cards do is good for lots of things: it was crypto, then AI, and I’m sure whatever the next fad is will require a GPU to run huge calculations.

utopiah ,

I’m sure whatever the next fad is will require a GPU to run huge calculations.

I also bet it will; cf. my earlier comment on render farms and looking for what “recycles” old GPUs (lemmy.world/comment/12221218), namely that it makes sense to prepare for it now and look for what comes next BASED on the current most popular architecture. It might not be the most efficient, but it will probably be the most economical.

Zorsith ,
@Zorsith@lemmy.blahaj.zone avatar

I’d love an upgrade for my 2080 Ti; I really wish Nvidia hadn’t pissed off EVGA into leaving the GPU business…

sheogorath ,

If there is even a GPU being sold. It’s much more profitable for Nvidia to just make compute-focused chips than to upgrade their gaming lineup. GeForce will just get the compute-chip rejects and laptop GPUs for the lower-end parts. After the AI bubble bursts, maybe they’ll get back to their gaming roots.

utopiah ,

move on to the next […] eager to see what they come up with next.

That’s a point I’ve been making in a lot of conversations lately: IMHO the bubble hasn’t popped BECAUSE capital doesn’t know where to go next. Despite reports from big banks that there is a LOT of investment for not a lot of actual returns, people are still waiting to see where to put that money next. Until there is such a place, they believe it’s still more beneficial to keep the bet going.

masterspace ,

Thank fucking god.

I got sick of the overhyped tech bros pumping AI into everything with no understanding of it…

But then I got way more sick of everyone else thinking they’re clowning on AI when in reality they’re just demonstrating an equally sized misunderstanding of the technology in a snarky, pessimistic format.

sentient_loom ,
@sentient_loom@sh.itjust.works avatar

As I job-hunt, every job listed over the past year has been “AI-driven [something]” and I’m really hoping that trend subsides.

AdamEatsAss ,

“This is a mid-level position requiring at least 7 years of experience developing LLMs.” -Every software engineer job out there.

figjam ,

That was cloud 7 years ago and blockchain 4 years ago.

macrocephalic ,

Yeah, I’m a data engineer and I get that there’s a lot of potential in analytics with AI, but you don’t need to hire a data engineer with LLM experience for aggregating payroll data.

utopiah ,

there’s a lot of potential in analytics with AI

I’d argue there is a lot of potential in any domain with basic numeracy. In pretty much any business or institution somebody with a spreadsheet might help a lot. That doesn’t necessarily require any Big Data or AI though.
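
A toy sketch of that point, just for scale (the column names and figures here are made up): the kind of payroll aggregation mentioned above is a one-liner with a spreadsheet or a basic script, no LLM anywhere.

```python
import pandas as pd

# Made-up payroll data; in practice this would come from a CSV export.
payroll = pd.DataFrame({
    "department": ["eng", "eng", "sales", "sales"],
    "month": ["2024-07", "2024-08", "2024-07", "2024-08"],
    "gross_pay": [42000, 43500, 31000, 29500],
})

# Total and average pay per department: a plain groupby, no AI required.
summary = payroll.groupby("department")["gross_pay"].agg(["sum", "mean"])
print(summary)
```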

EldritchFeminity ,

Reminds me of when I read about a programmer getting turned down for a job because they didn’t have 5 years of experience with a language that they themselves had created 1 to 2 years prior.

simplejack ,
@simplejack@lemmy.world avatar

I’m more annoyed that Nvidia is looked at like some sort of brilliant strategist. It’s a GPU company that was lucky enough to be around when two new massive industries found an alternative use for graphics hardware.

They happened to be making pick axes in California right before some prospectors found gold.

And they don’t even really make pick axes, TSMC does. They just design them.

Zarxrax ,

They didn’t just “happen to be around”. They created the entire ecosystem around machine learning while AMD just twiddled their thumbs. There is a reason why no one is buying AMD cards to run AI workloads.

masterspace ,

Go ahead and design a better pickaxe than them, we’ll wait…

mycodesucks ,
@mycodesucks@lemmy.world avatar

Go ahead and design a better pickaxe than them, we’ll wait…

Same argument:

“He didn’t earn his wealth. He just won the lottery.”

“If it’s so easy, YOU go ahead and win the lottery then.”

masterspace ,

My fucking god.

“Buying a lottery ticket, and designing the best GPUs”: totally the same thing, amirite guys???

mycodesucks , (edited )
@mycodesucks@lemmy.world avatar

In the sense that it’s a matter of being in the right place at the right time, yes. Exactly the same thing. Opportunities aren’t equal: they disproportionately benefit those who happen to be positioned to take advantage of them. If I’m giving away a free car right now to whoever comes by, and you’re not nearby, you’re shit out of luck.

If AI didn’t HAPPEN to use massively multi-threaded computing, Nvidia would still be artificial-scarcity-ing themselves into price gouging CoD players. The fact you don’t see it for whatever reason doesn’t make it wrong. NOBODY at Nvidia was there 5 years ago saying “Man, when this new technology hits, we’re going to be rolling in it.” They stumbled into it by luck. They don’t get credit for foreseeing some future use case. They got lucky. That luck got them first-mover advantage. Intel had that too. Look how well it’s doing for them.

Nvidia’s position over AMD in this space could be due to any number of factors: production capacity, driver flexibility, faster performance on a particular vector operation, power efficiency… hell, even the relationship between the CEO of THEIR company and OpenAI. Maybe they just had their salespeople call first. Their market dominance likely has absolutely NOTHING to do with their GPUs having better graphics performance, and to the extent it does, it’s by chance: they did NOT predict generative AI, and their graphics cards just HAPPEN to be better situated for SOME reason.

MigratingtoLemmy ,

His engineers built it; he didn’t do anything there.

utopiah ,

They just design them.

It’s not trivial, though. They also managed to lock developers in with CUDA.

That being said, I don’t think they were “just” lucky; I think they built their luck through practices the DoJ is currently investigating as potential abuse of monopoly power.

LemmyBe ,

Whether we like it or not, AI is here to stay, and in 20-30 years it’ll be as embedded in our lives as computers and smartphones are now.

shalafi ,

Is there a “young man yells at clouds meme” here?

“Yes, you’re very clever calling out the hype train. Oooh, what a smart boy you are!” Until the dust settles…

Lemmy sounds like my grandma in 1998: “Pushah. This ‘internet’ is just a fad.”

FlorianSimon ,

The difference is that the Internet is actually useful.

BakerBagel ,

Yeah, the early Internet didn’t require 5 tons of coal to be burned just to give you a made-up answer to your query. This bubble is Pets.com, only it’s also murdering the rainforest while still being completely useless.

utopiah ,

Right, it did have an AI winter a few decades ago. It’s indeed here to stay; that doesn’t mean any of the current companies marketing it right now will be, though.

AI as a research field will stay, everything else maybe not.

TropicalDingdong ,

It’s like the least popular opinion I have here on Lemmy, but I assure you, this is the beginning.

Yes, we’ll see a dotcom-style bust. But it’s not like the world today wasn’t literally invented in that time. Do you remember where image generation was 3 years ago? It was a complete joke compared to a year ago, and today, fuck, no one here would know.

When code generation goes through that same cycle, you’ll be able to put out an idea in plain language and get back code that just “does” it.

I have no idea what that means for the future of my humanity.

RegalPotoo ,
@RegalPotoo@lemmy.world avatar

Personally I can’t wait for a few good bankruptcies so I can pick up a couple of high end data centre GPUs for cents on the dollar

bruhduh ,
@bruhduh@lemmy.world avatar

Search “Nvidia P40 24GB” on eBay: about $200 each and surprisingly good for self-hosted LLMs. If you plan to build an array of GPUs, search for the P100 16GB instead; same price, but unlike the P40, the P100 supports NVLink, and its 16GB is HBM2 memory on a 4096-bit bus, so it’s still competitive for LLM work. The P40’s strong point is the amount of memory for the money, but it’s GDDR5, so it’s rather slow compared to the P100 and doesn’t support NVLink.
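
For anyone wondering what “self-hosted LLM on old datacenter cards” looks like in practice, here’s a rough sketch (not a tested recipe) using Hugging Face transformers. `device_map="auto"` lets the accelerate library shard layers across whatever GPUs it sees, e.g. a pair of P40s; the model name is just an example, and on P40s you may prefer fp32 or a quantized build since their fp16 throughput is poor.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-Instruct-v0.2"  # example model; pick one that fits your VRAM

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # P40s are slow at fp16; fp32 or a quantized GGUF via llama.cpp may suit them better
    device_map="auto",          # shard layers across all visible GPUs
)

prompt = "Summarize why NVLink matters for multi-GPU inference."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```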

Gormadt ,
@Gormadt@lemmy.blahaj.zone avatar

Personally I don’t care much for the LLM stuff; I’m more curious how they perform in Blender.

utopiah ,

Interesting. I did try a bit of remote rendering with Blender (just to learn how to use it via the CLI), so that makes me wonder who is indeed scraping the bottom of the barrel of “old” hardware and what they are using it for. Maybe somebody is renting old GPUs for render farms, maybe for other tasks. Any pointers to such a trend?
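
In case it helps anyone poking at the same thing: a headless Blender render is just a CLI invocation, so farming jobs out to “recycled” GPUs is mostly scripting around it. A hedged sketch (the scene file and paths are placeholders; assumes a `blender` binary on PATH and a Cycles-configured scene):

```python
import subprocess

# Render frame 1 of a scene in the background (no GUI); output goes to /tmp.
cmd = [
    "blender",
    "-b", "scene.blend",      # -b: background / headless mode
    "-E", "CYCLES",           # render engine
    "-o", "/tmp/frame_####",  # output path pattern (#### = zero-padded frame number)
    "-f", "1",                # render this frame
]
subprocess.run(cmd, check=True)
```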

PersnickityPenguin ,

Can it run Crysis?

bruhduh ,
@bruhduh@lemmy.world avatar

How about cyberpunk?

helenslunch ,
@helenslunch@feddit.nl avatar

The stock market is not based on income. It’s based entirely on speculation.

Since then, shares of the maker of the high-grade computer chips that AI laboratories use to power the development of their chatbots and other products have come down by more than 22%.

  • June 18th: $136
  • August 4th: $100
  • August 18th: $130 again
  • Now: $103 (still above 8/4)

It’s almost like hype generates volatility. I don’t think any of this is indicative of a “leaking” bubble. Just tech journalists conjuring up clicks.

Also bubbles don’t “leak”.

Bishma ,
@Bishma@discuss.tchncs.de avatar

My only real hope out of this is that that copilot button on keyboards becomes the 486 turbo button of our time.

BlackLaZoR ,
@BlackLaZoR@fedia.io avatar

Meaning you unpress it, and the computer gets 2x faster?

Bishma ,
@Bishma@discuss.tchncs.de avatar

I was thinking pressing it turns everything to shit, but that works too. I’d also accept: completely misunderstood by future generations.

yokonzo ,

Well now I wanna hear more about the history of this mystical shit button

macrocephalic ,

Back in those early days many applications didn’t have proper timing; they basically just ran as fast as they could. That was fine on an 8 MHz CPU, since you probably just wanted stuff to run as fast as it could (we weren’t listening to music or watching videos back then). When CPUs got faster (or started running at a multiple of the base clock speed), stuff was suddenly happening TOO fast. The turbo button was a way to slow the clock speed down by some amount so legacy applications would run the way they were supposed to.
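
A toy illustration of that difference (Python rather than period-accurate code): the first loop’s speed scales directly with CPU speed, which is what the turbo button compensated for, while the second is tied to wall-clock time and behaves the same on any machine.

```python
import time

def move_sprite_cpu_bound(iterations):
    """One pixel per loop iteration: double the CPU speed, double the game speed."""
    x = 0
    for _ in range(iterations):
        x += 1
    return x

def move_sprite_time_based(duration_s, pixels_per_second=60.0):
    """Movement scaled by elapsed wall-clock time: same speed on any CPU."""
    x = 0.0
    start = last = time.monotonic()
    while time.monotonic() - start < duration_s:
        now = time.monotonic()
        x += pixels_per_second * (now - last)
        last = now
    return x
```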

macrocephalic ,

Actually you pressed it and everything got 2x slower. Turbo was a stupid label for it.

somethingsnappy ,

That’s… the same thing.

Whoops, I thought you were responding to the first child comment.

Regrettable_incident ,
@Regrettable_incident@lemmy.world avatar

I could be misremembering but I seem to recall the digits on the front of my 486 case changing from 25 to 33 when I pressed the button. That was the only difference I noticed though. Was the beige bastard lying to me?

PenisDuckCuck9001 ,

I just want computer parts to stop being so expensive. Remember when gaming was cheap? Pepperidge Farm remembers. You used to be able to build a relatively high-end PC for less than the average dogshit Walmart laptop.

filister ,

To be honest, right now is a relatively good time to build a PC, except for the GPU, which is heavily overpriced. If you’re content with last-gen AMD, even that can be brought down to somewhat acceptable levels.

Ilovethebomb ,

I’ve noticed people have been talking less and less about AI lately, particularly online and in the media, and absolutely nobody has been talking about it in real life.

The novelty has well and truly worn off, and most people are sick of hearing about it.

ultranaut ,

The hype is still percolating, at least among the people I work with and at the companies of people I know. Microsoft pushing Copilot everywhere makes it inescapable to some extent in many environments, and there are people out there who have somehow only vaguely heard of ChatGPT and are now encountering LLMs for the first time at work, starting the hype cycle fresh.

gravitas_deficiency ,

It’s like 3D TVs, for a lot of consumer applications tbh

marx2k ,

Oh fuck that’s right, that was a thing.

Goddamn

Cryophilia ,

3D has been a thing every 15 years or so

4vgj0e ,

I find it insane when “tech bros” and AI researchers at major tech companies try to justify the wasting of resources (like water and electricity) in order to achieve “AGI” or whatever the fuck that means in their wildest fantasies.

These companies have no accountability for the shit that they do and consistently ignore all the consequences their actions will cause for years down the road.

catloaf ,

It’s research. Most of it never pans out, so a lot of it is “wasteful”. But if we didn’t experiment, we wouldn’t find the things that do work.

4vgj0e ,

I agree, but these researchers/scientists should be more mindful about the resources they use up in order to generate the computational power necessary to carry out their experiments. AI is good when it gets utilized to achieve a specific task, but funneling a lot of money and research towards general purpose AI just seems wasteful.

AssaultPepper ,

I mean, general-purpose AI doesn’t cap out at human intelligence, which you could utilize to come up with ideas for better resource management.

Could also be a huge waste but the potential is there… potentially.

Cryophilia ,

Most of the entire AI economy isn’t even research. It’s just grift. Slapping a label on ChatGPT and saying you’re an AI company. It’s hustlers trying to make a quick buck from easy venture capital money.

billbennett ,

I've spent time with an AI laptop over the past couple of weeks, and 'overinflated' seems a generous description of where end-user AI is today.

nobleshift ,
@nobleshift@lemmy.world avatar

Can we please get rid of the tech bros too?
