
floofloof ,

Shed a tear, if you wish, for Nvidia founder and Chief Executive Jensen Huang, whose fortune (on paper) fell by almost $10 billion that day.

Thanks, but I think I’ll pass.

brbposting ,

I’m sure he won’t mind. Worrying about that doesn’t sound like working.

I work from the moment I wake up to the moment I go to bed. I work seven days a week. When I’m not working, I’m thinking about working, and when I’m working, I’m working. I sit through movies, but I don’t remember them because I’m thinking about work.

  • Huang on his 14-hour workdays

It is one way to live.

LemmyBe ,

Whether we like it or not, AI is here to stay, and in 20-30 years it’ll be as embedded in our lives as computers and smartphones are now.

shalafi ,

Is there a “young man yells at clouds” meme here?

“Yes, you’re very clever calling out the hype train. Oooh, what a smart boy you are!” Until the dust settles…

Lemmy sounds like my grandma in 1998, “Pushah. This ‘internet’ is just a fad.”

FlorianSimon ,

The difference is that the Internet is actually useful.

billbennett ,

I've spent time with an AI laptop over the past couple of weeks, and 'overinflated' seems a generous description of where end-user AI is today.

nobleshift ,

Can we please get rid of the tech bros too?

4vgj0e ,

I find it insane when “tech bros” and AI researchers at major tech companies try to justify the wasting of resources (like water and electricity) in order to achieve “AGI” or whatever the fuck that means in their wildest fantasies.

These companies have no accountability for the shit that they do and consistently ignore all the consequences their actions will cause for years down the road.

catloaf ,

It’s research. Most of it never pans out, so a lot of it is “wasteful”. But if we didn’t experiment, we wouldn’t find the things that do work.

4vgj0e ,

I agree, but these researchers/scientists should be more mindful about the resources they use up in order to generate the computational power necessary to carry out their experiments. AI is good when it gets utilized to achieve a specific task, but funneling a lot of money and research towards general purpose AI just seems wasteful.

AssaultPepper ,

I mean, general-purpose AI doesn’t cap out at human intelligence, which you could utilize to come up with ideas for better resource management.

Could also be a huge waste but the potential is there… potentially.

PenisDuckCuck9001 ,

I just want computer parts to stop being so expensive. Remember when gaming was cheap? Pepperidge Farm remembers. You used to be able to build a relatively high-end PC for less than the average dogshit Walmart laptop.

arran4 ,

The fact that this is from the LA Times shows that it’s still significant, though.

Bishma ,

My only real hope out of this is that that copilot button on keyboards becomes the 486 turbo button of our time.

BlackLaZoR ,

Meaning you unpress it, and the computer gets 2x faster?

Bishma ,

I was thinking pressing it turns everything to shit, but that works too. I’d also accept: completely misunderstood by future generations.

yokonzo ,

Well now I wanna hear more about the history of this mystical shit button

macrocephalic ,

Back in those early days, many applications didn’t have proper timing; they basically just ran as fast as they could. That was fine on an 8 MHz CPU, since you probably just wanted stuff to run as fast as it could (we weren’t listening to music or watching videos back then). When CPUs got faster (or started running at a multiple of the base clock speed), stuff was suddenly happening TOO fast. The turbo button was a way to slow down the clock speed by some amount to make legacy applications run how they were supposed to run.
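
To make that concrete, here’s a minimal, hypothetical sketch in C of the two approaches: a loop with no real timing just spins as fast as the CPU allows, so a faster CPU runs the game faster, while a fixed-timestep loop checks elapsed time and only updates at a set rate. The update_game/render_frame functions are made-up stand-ins, not code from any actual DOS-era program.

```c
#include <stdio.h>
#include <time.h>

/* Hypothetical stand-ins for a game's update and draw steps. */
static long updates = 0;
static void update_game(void)  { updates++; }
static void render_frame(void) { /* drawing would go here */ }

/* Era-typical loop: game speed depends entirely on how fast the CPU can spin,
   so a CPU that is twice as fast runs the game twice as fast. */
static void naive_loop(double seconds) {
    clock_t start = clock();
    while ((double)(clock() - start) / CLOCKS_PER_SEC < seconds) {
        update_game();
        render_frame();
    }
}

/* Fixed-timestep loop: only update ~18 times per second of elapsed time
   (the classic PC timer tick rate), so a faster CPU just idles more. */
static void timed_loop(double seconds) {
    const double step = 1.0 / 18.2;
    clock_t start = clock();
    clock_t last = start;
    while ((double)(clock() - start) / CLOCKS_PER_SEC < seconds) {
        if ((double)(clock() - last) / CLOCKS_PER_SEC >= step) {
            update_game();
            last = clock();
        }
        render_frame();
    }
}

int main(void) {
    naive_loop(1.0);
    printf("naive loop: %ld updates in ~1s (scales with CPU speed)\n", updates);

    updates = 0;
    timed_loop(1.0);
    printf("timed loop: %ld updates in ~1s (~18 on any CPU)\n", updates);
    return 0;
}
```

Unpressing “turbo” on a 486 effectively forced the first kind of program back toward the clock speed it was written for.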

macrocephalic ,

Actually you pressed it and everything got 2x slower. Turbo was a stupid label for it.

somethingsnappy ,

That’s… the same thing.

Whoops, I thought you were responding to the first child comment.

PhlubbaDubba ,

I’m just praying people will fucking quit it with the worries that we’re about to get SKYNET or HAL when binary computing would inherently be incapable of recreating the fast pattern recognition required to replicate or outpace human intelligence.

Moore’s law is about raw computing power, which is a measure of hardware performance, not of the software you can run on it.

CosmoNova ,

Welp, it was ‘fun’ while it lasted. Time for everyone to adjust their expectations to much more humble levels than what was promised and move on to the next scheme. After Metaverse, NFTs and ‘Don’t become a programmer, AI will steal your job literally next week!11’, I’m eager to see what they come up with next. And by eager I mean I’m tired. I’m really tired and hope the economy just takes a damn break from breaking things.

dustyData ,

But if it doesn’t disrupt it isn’t worth it!

/s

Fetus ,

I just hope I can buy a graphics card without having to sell organs some time in the next two years.

catloaf ,

My RX 580 has been working just fine since I bought it used. I’ve not been able to justify buying a new (used) one. If you have one that works, why not just stick with it until the market gets flooded with used ones?

macrocephalic ,

Don’t count on it. It turns out that the sort of stuff graphics cards do is good for lots of things: first it was crypto, then AI, and I’m sure whatever the next fad is will require a GPU to run huge calculations.

Zorsith ,

I’d love an upgrade for my 2080 TI; really wish Nvidia hadn’t pissed EVGA off into leaving the GPU business…

sheogorath ,

If there is even a GPU being sold. It’s much more profitable for Nvidia to just make compute-focused chips than to upgrade their gaming lineup. GeForce will just get the compute-chip rejects and laptop GPUs for the lower-end parts. After the AI bubble bursts, maybe they’ll get back to their gaming roots.

DogPeePoo ,

Wall Street has already milked “the pump”; now they short it and put out articles like this.

Ilovethebomb ,

I’ve noticed people have been talking less and less about AI lately, particularly online and in the media, and absolutely nobody has been talking about it in real life.

The novelty has well and truly worn off, and most people are sick of hearing about it.

ultranaut ,

The hype is still percolating, at least among the people I work with and at the companies of people I know. Microsoft pushing Copilot everywhere makes it inescapable to some extent in many environments, and there are people out there who have somehow only vaguely heard of ChatGPT and are now encountering LLMs for the first time at work, starting the hype cycle fresh.

gravitas_deficiency ,

It’s like 3D TVs, for a lot of consumer applications tbh

marx2k ,

Oh fuck that’s right, that was a thing.

Goddamn

vk6flab ,

A.I., Assumed Intelligence

dinckelman ,

More like PISS, a Plagiarized Information Synthesis System

givesomefucks ,

Well, they also kept telling investors all they needed to simulate a human brain was to simulate the number of neurons in a human brain…

The stupidly rich loved that, because they want computer backups for “immortality”, and they’d dump billions of dollars into making that happen.

About two months ago tho, we found out that the brain uses microtubules to put tryptophan into superposition, and it can maintain that for a crazy amount of time, like longer than we can do in a lab.

The only argument against a quantum component of human consciousness was that people thought there was no way to get even regular quantum entanglement in a human brain.

We’ll be lucky to be able to simulate that stuff in 50 years, but it’s probably going to be even longer.

Every billionaire who wanted to “live forever” this way just got aged out. So they’ll throw their money somewhere else now.

half_built_pyramids ,

I used to follow the Penrose stuff and was pretty excited about QM as an explanation of consciousness. If this is the kind of work they’re reaching for, though, it’s pretty sad. It’s not even anything. Sometimes you need to go with your gut, and my gut is telling me that if this is all the QM people have, consciousness is probably best explained by complexity.

ask.metafilter.com/…/Is-this-paper-on-quantum-pro…

Completely off topic from ai, but got me curious about brain quantum and found this discussion. Either way, AI still sucks shit and is just a shortcut for stealing.

givesomefucks ,

That’s a social media comment from some Ask Yahoo knockoff…

Like, this isn’t something no one is talking about; you don’t have to learn about it solely from unpopular social media sites (including my comment).

I don’t usually like linking videos, but I’m feeling like that might work better here

www.youtube.com/watch?v=xa2Kpkksf3k

But that PBS video gives a really good background and then talks about the recent discovery.

Jordan117 ,

some Ask Yahoo knockoff…

AskMeFi predated Yahoo Answers by several years (and is several orders of magnitude better than it ever was).

givesomefucks ,

And that linked account’s last comment was advocating for Biden to stage a pre-emptive coup before this election…

www.metafilter.com/activity/306302/…/mefi/

It doesn’t matter whether it was created before Ask Yahoo or how much older it is.

It’s random people making random social media comments; sometimes stupid people make the rare comment that sounds like they know what they’re talking about. And I already agreed no one had to take my word on it either.

But that PBS video does a really fucking good job explaining it.

Cuz if I can’t explain to you why a random social media comment isn’t a good source, I’m sure as shit not going to be able to explain anything like Penrose’s theory on consciousness to you.

Jordan117 , (edited )

It doesn’t matter if it was created before Ask Yahoo or if it’s older.

It does if you’re calling it a “knockoff” of a lower-quality site that was created years later, which was what I was responding to.

edit: btw, you’ve linked to the profile of the asker of that question, not the answer to it that /u/half_built_pyramids quoted.

givesomefucks ,

Great.

So the social media site is older than I thought, and the person who made the comment on that site is a lot stupider than it seemed.

Like, Facebook’s been around for about 20 years. Would you take a link to a Facebook comment over PBS?

Jordan117 ,

My man, I said nothing about the science or the validity of that comment, just that it’s wrong to call Ask MetaFilter “some Ask Yahoo knockoff”. If you want to get het up about an argument I never made, you do you.

masterspace ,

Thank fucking god.

I got sick of the overhyped tech bros pumping AI into everything with no understanding of it…

But then I got way more sick of everyone else thinking they’re clowning on AI when in reality they’re just demonstrating an equally sized misunderstanding of the technology in a snarky, pessimistic format.

sentient_loom ,

As I job-hunt, every job listed over the past year has been “AI-driven [something]” and I’m really hoping that trend subsides.

AdamEatsAss ,

“This is a mid-level position requiring at least 7 years of experience developing LLMs.” -Every software engineer job out there.

figjam ,

That was cloud 7 years ago and blockchain 4

macrocephalic ,

Yeah, I’m a data engineer and I get that there’s a lot of potential in analytics with AI, but you don’t need to hire a data engineer with LLM experience for aggregating payroll data.

EldritchFeminity ,

Reminds me of when I read about a programmer getting turned down for a job because they didn’t have 5 years of experience with a language that they themselves had created 1 to 2 years prior.

simplejack ,

I’m more annoyed that Nvidia is looked at like some sort of brilliant strategist. It’s a GPU company that was lucky enough to be around when two new massive industries found an alternative use for graphics hardware.

They happened to be making pick axes in California right before some prospectors found gold.

And they don’t even really make pick axes, TSMC does. They just design them.

Zarxrax ,

They didn’t just “happen to be around”. They created the entire ecosystem around machine learning while AMD just twiddled their thumbs. There is a reason why no one is buying AMD cards to run AI workloads.

masterspace ,

Go ahead and design a better pickaxe than them, we’ll wait…

mycodesucks ,

Go ahead and design a better pickaxe than them, we’ll wait…

Same argument:

“He didn’t earn his wealth. He just won the lottery.”

“If it’s so easy, YOU go ahead and win the lottery then.”

masterspace ,

My fucking god.

“Buying a lottery ticket and designing the best GPUs,” totally the same thing, amirite guys???
