
iAvicenna , (edited )
@iAvicenna@lemmy.world avatar

FOMO is the best explanation of this psychosis, and then of course denial by people who became heavily invested in it. Stuff like LLMs or ConvNets (and the like) can already be used to do some pretty amazing things that we could not do a decade ago; there is really no need to shit rainbows and puke glitter all over it. I am also not against exploring and pushing the boundaries, but when you explore a boundary while pretending you have already crossed it, that is how you get bubbles. And this all again boils down to appeasing some cancerous billionaire shareholders so they funnel some money down into your pockets.

hellothere ,

Pop pop!

fogelmensch ,

Magnitude!

TropicalDingdong ,

It’s like the least popular opinion I have here on Lemmy, but I assure you, this is the beginning.

Yes, we’ll see a dotcom-style bust. But it’s not like the world today wasn’t literally invented in that era. Do you remember where image generation was 3 years ago? It was a complete joke compared to a year ago, and today? Fuck, no one here would even recognize it.

When code generation goes through that same cycle, you can put out an idea in plain language, and get back code that just “does” it.

I have no idea what that means for the future of my humanity.

rottingleaf ,

you can put out an idea in plain language, and get back code that just “does” it

No you can’t. Simplifying it grossly:

They can’t do the most low-level, dumbest-detail, hair-splitting, “there’s no spoon”, “this is just correct no matter how much you blabber in the opposite direction, and this is just wrong no matter how much you blabber to support it” kind of solutions.

And that happens to be the main requirement that makes a task worth a software developer’s time.

We need software developers to write computer programs, because “a general idea” even in a formalized language is not sufficient, you need to address details of actual reality. That is the bottleneck.

That technology widens the passage in the places which were not the bottleneck in the first place.

TropicalDingdong ,

I think you live in a nonsense world. I literally use it every day and yes, sometimes it’s shit and it’s bad at anything that requires even a modicum of creativity. But 90% of shit doesn’t require a modicum of creativity. And my point isn’t about where we’re at, it’s about how far the same tech progressed on another domain-adjacent task in three years.

Lemmy has a “dismiss AI” fetish, and indulges it at its own peril.

rottingleaf ,

Are you a software developer? Or a hardware engineer? EDIT: Or anyone credible in evaluating my nonsense world against yours?

TropicalDingdong ,

Machine learning scientist.

rottingleaf ,

So close, but not there.

OK, you’ll know that I’m right when you somewhat expand your expertise to neighboring areas. Should happen naturally.

hark ,
@hark@lemmy.world avatar

That explains your optimism. Code generation is at a stage where it slaps together Stack Overflow answers and code ripped off from GitHub for you. While that is quite effective at letting even a crappy programmer cobble together something that barely works, it is a far cry from having just anyone put out an idea in plain language and getting back code that just does it. A programmer is still needed in the loop.

I’m sure I don’t have to explain to you that AI development over the decades has often reached plateaus where the approach needed to change significantly for progress to continue, so it could certainly be the case that LLMs (at least as they are developed now) aren’t enough to accomplish what you describe.

rottingleaf ,

It’s not about stages. It’s about the Achilles and tortoise problem.

There’s extrapolation inside the same level of abstraction as the data given and there’s extrapolation of new levels of abstraction.

But frankly far smarter people than me are working on all that. Maybe they’ll deliver.

AlexanderESmith ,

This is the most hilarious possible answer you could have given xD

The bias on display here is strong enough to tilt the Earth off its axis 🤣

Jesus_666 ,

And I wouldn’t know where to start using it. My problems are often of the “integrate two badly documented company-internal APIs” variety. LLMs can’t do shit about that; they weren’t trained for it.

They’re nice for basic rote work but that’s often not what you deal with in a mature codebase.

barsoap ,

And my point isn’t about where we’re at, it’s about how far the same tech progressed on another domain adjacent task in three years.

First off, are you extrapolating the middle part of the sigmoid thinking it’s an exponential? Secondly, link.springer.com/…/s11633-017-1093-8.pdf
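
To illustrate the point, here’s a minimal numpy sketch (the numbers are made up and model nothing real): the middle stretch of a logistic curve tracks an exponential almost perfectly, right up until it doesn’t.

```python
# Toy sketch: the middle of a sigmoid looks exponential until it saturates.
# Purely illustrative; none of these numbers model AI progress.
import numpy as np

t = np.linspace(-6, 6, 13)
sigmoid = 1 / (1 + np.exp(-t))                   # logistic curve, saturates at 1
exp_fit = sigmoid[0] * np.exp(0.9 * (t - t[0]))  # exponential matched to the early points

for x, s, e in zip(t, sigmoid, exp_fit):
    print(f"t={x:+.0f}  sigmoid={s:.3f}  exponential={e:.3f}")
# The two track closely early on; past the midpoint the sigmoid flattens
# while the exponential keeps exploding.
```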

tetris11 ,
@tetris11@lemmy.ml avatar

They’re pretty good, and the faults they have are improving steadily. I don’t think we’re hitting a ceiling yet, and I shudder to think where they’ll be in 5 years.

helenslunch ,
@helenslunch@feddit.nl avatar

The stock market is not based on income. It’s based entirely on speculation.

Since then, shares of the maker of the high-grade computer chips that AI laboratories use to power the development of their chatbots and other products have come down by more than 22%.

June 18th: $136
August 4th: $100
August 18th: $130 again
Now: $103 (still above 8/4)

It’s almost like hype generates volatility. I don’t think any of this is indicative of a “leaking” bubble. Just tech journalists conjuring up clicks.

Also bubbles don’t “leak”.

SapphironZA ,

It’s all vibes and manipulation.

SturgiesYrFase ,
@SturgiesYrFase@lemmy.ml avatar

Also bubbles don’t “leak”.

I mean, sometimes they kinda do? They either pop or slowly deflate, I’d say slow deflation could be argued to be caused by a leak.

iopq ,

The broader market did the same thing

finance.yahoo.com/quote/SPY/

$560 to $510 to $560 to $540

So why did $NVDA have larger swings? It has to do with a concept called beta. High-beta stocks go up faster when the market is up and fall harder when the market is down. Basically high-variance, risky investments.

Why did the market have these swings? Because of uncertainty about future interest rates. Interest rates not only matter for business loans but also set the risk-free rate for investors.

When investors invest in the stock market, they want to get back the risk-free rate (how much they get from treasuries) + the risk premium (how much stocks outperform bonds long term).

If the risks of the stock market are the same, but the payoff of treasuries changes, then you need a higher return from stocks. To get a higher return you can only accept a lower price.

This is why stocks are down; NVDA is still making plenty of money from AI.
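
A rough back-of-the-envelope sketch of that logic, with every number invented for illustration (not actual NVDA or treasury figures):

```python
# Toy CAPM-style pricing example; every number here is made up.
risk_free = 0.04        # yield on treasuries
risk_premium = 0.05     # long-term stock outperformance over bonds
beta = 1.8              # high-beta stock amplifies market moves

required_return = risk_free + beta * risk_premium    # what investors demand
expected_payoff = 10.0                               # expected value per share next period

# Simple one-period pricing: price = expected payoff / (1 + required return)
print(f"implied price: {expected_payoff / (1 + required_return):.2f}")

# If treasuries pay more while the stock's risk is unchanged, the required
# return rises, and the only way to get it is to pay a lower price today.
print(f"after a rate rise: {expected_payoff / (1 + 0.05 + beta * risk_premium):.2f}")
```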

RegalPotoo ,
@RegalPotoo@lemmy.world avatar

Personally I can’t wait for a few good bankruptcies so I can pick up a couple of high end data centre GPUs for cents on the dollar

bruhduh ,
@bruhduh@lemmy.world avatar

Search Nvidia P40 24GB on eBay: about $200 each and surprisingly good for self-hosted LLMs. If you plan to build an array of GPUs, search for the P100 16GB instead. Same price, but unlike the P40 the P100 supports NVLink, and its 16GB is HBM2 memory on a 4096-bit bus, so it’s still competitive in the LLM field. The P40’s selling point is the amount of memory for the money, but it’s GDDR5, so it’s rather slow compared to the P100, and it doesn’t support NVLink.
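
If it helps, a minimal sketch of what splitting a model across a pair of those 16GB cards can look like with Hugging Face transformers (the model name and memory caps below are placeholders, not specific recommendations):

```python
# Rough sketch: sharding a model across two older 16 GB cards with transformers + accelerate.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "some-7b-model"  # placeholder; pick something that fits your VRAM
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,           # fp16 fits 16 GB far better than fp32
    device_map="auto",                    # let accelerate spread layers across both GPUs
    max_memory={0: "15GiB", 1: "15GiB"},  # leave a little headroom on each card
)

inputs = tokenizer("Hello from a pair of Pascal cards:", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=32)[0]))
```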

Gormadt ,
@Gormadt@lemmy.blahaj.zone avatar

Personally I don’t care much for the LLM stuff; I’m more curious how they perform in Blender.

utopiah ,

Interesting. I did try a bit of remote rendering in Blender (just to learn how to use it via the CLI), so that makes me wonder who is indeed scraping the bottom of the barrel of “old” hardware and what they are using it for. Maybe somebody is renting old GPUs for render farms, maybe other tasks. Any pointers to such a trend?

PersnickityPenguin ,

Can it run crysis?

bruhduh ,
@bruhduh@lemmy.world avatar

How about cyberpunk?

iAvicenna ,
@iAvicenna@lemmy.world avatar

I ran doom on a GPU!

Scipitie ,

Lowest price on eBay for me is 290 euro :/ The P100s are 200 each though.

Do you happen to know if I could mix a 3700 with a p100?

And thanks for the tips!

bruhduh ,
@bruhduh@lemmy.world avatar

Ryzen 3700? Or RTX 3070? Please elaborate.

RegalPotoo ,
@RegalPotoo@lemmy.world avatar

Thanks for the tips! I’m looking for something multi-purpose for LLM/stable diffusion messing about + transcoder for jellyfin - I’m guessing that there isn’t really a sweet spot for those 3. I don’t really have room or power budget for 2 cards, so I guess a P40 is probably the best bet?

bruhduh , (edited )
@bruhduh@lemmy.world avatar

Try the Ryzen 8700G’s integrated GPU for transcoding, since it supports AV1, and these P-series GPUs for LLM/Stable Diffusion; that would be a good mix, I think. Or, if you don’t have the budget for a new build, buy an Intel Arc A380 GPU for transcoding; you can attach it like a mining GPU through a PCIe riser. Linus Tech Tips tested that GPU for transcoding, as I remember.
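
For the transcoding side, a minimal sketch of what a QSV hardware transcode looks like when driven from a script (this assumes an ffmpeg build with QSV support and a GPU that exposes it; the file names are placeholders, and Jellyfin normally builds this command for you):

```python
# Sketch of a hardware transcode on a QSV-capable Intel GPU via ffmpeg.
# Assumes ffmpeg was built with QSV support; paths are placeholders.
import subprocess

cmd = [
    "ffmpeg",
    "-hwaccel", "qsv",        # hardware-accelerated decode
    "-i", "input_av1.mkv",    # e.g. an AV1 source the Arc can decode in hardware
    "-c:v", "h264_qsv",       # hardware H.264 encode for broad client support
    "-global_quality", "23",  # quality target
    "-c:a", "copy",           # pass audio through untouched
    "output_h264.mkv",
]
subprocess.run(cmd, check=True)
```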

RegalPotoo ,
@RegalPotoo@lemmy.world avatar

8700g

Hah, I’ve pretty recently picked up an Epyc 7452, so not really looking for a new platform right now.

The Arc cards are interesting, will keep those in mind

floofloof ,

Shed a tear, if you wish, for Nvidia founder and Chief Executive Jensen Huang, whose fortune (on paper) fell by almost $10 billion that day.

Thanks, but I think I’ll pass.

brbposting ,

I’m sure he won’t mind. Worrying about that doesn’t sound like working.

I work from the moment I wake up to the moment I go to bed. I work seven days a week. When I’m not working, I’m thinking about working, and when I’m working, I’m working. I sit through movies, but I don’t remember them because I’m thinking about work.

  • Huang on his 14 hour workdays

It is one way to live.

MelodiousFunk , (edited )

That sounds like mental illness.

ETA: Replace “work” in that quote with practically any other activity/subject, whether outlandish or banal.

I sit through movies but I don’t remember them because I’m thinking about baking cakes.

I sit through movies but I don’t remember them because I’m thinking about traffic patterns.

I sit through movies but I don’t remember them because I’m thinking about cannibalism.

I sit through movies but I don’t remember them because I’m thinking about shitposting.

Obsessed with something? At best, you’re “quirky” (depending on what you’re obsessed with). Unless it’s money. Being obsessed with that is somehow virtuous.

brbposting ,

Valid argument for sure

It would be sad if therapists kept telling him that but he could never remember

MelodiousFunk ,

“Sorry doc, was thinking about work. Did you say something about line go up?”

UndercoverUlrikHD ,

I don’t think you become the best tech CEO in the world by having a healthy approach to work. He’s just wired differently; some people are just all about work.

lauha ,

Some would not call that living

msage ,

Yeah ok sure buddy, but what do you DO actually?

rottingleaf ,

He knows what this hype is, so I don’t think he’d be upset. Still filthy rich when the bubble bursts, and that won’t be soon.

LemmyBe ,

Whether we like it or not, AI is here to stay, and in 20-30 years it’ll be as embedded in our lives as computers and smartphones are now.

shalafi ,

Is there a “young man yells at clouds meme” here?

“Yes, you’re very clever calling out the hype train. Oooh, what a smart boy you are!” Until the dust settles…

Lemmy sounds like my grandma in 1998: “Pushah. This ‘internet’ is just a fad.”

FlorianSimon ,

The difference is that the Internet is actually useful.

BakerBagel ,

Yeah, the early Internet didn’t require 5 tons of coal to be burned just to give you a made-up answer to your query. This bubble is Pets.com, except it’s also murdering the rainforest while still being completely useless.

Womble ,

Estimates for ChatGPT usage per query are on the order of 20-50 Wh, which is about the same as playing a demanding game on a gaming PC for a few minutes. Local models are significantly less.
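
That comparison is easy to sanity-check, taking the 20-50 Wh figure above at face value and assuming a roughly 500 W gaming PC under load:

```python
# Back-of-the-envelope check of the Wh comparison above.
query_wh_low, query_wh_high = 20, 50   # the per-query estimate quoted above
gaming_pc_watts = 500                  # assumed draw of a demanding game on a gaming PC

minutes_low = query_wh_low / gaming_pc_watts * 60    # ≈ 2.4 minutes
minutes_high = query_wh_high / gaming_pc_watts * 60  # ≈ 6 minutes
print(f"equivalent gaming time: {minutes_low:.1f}-{minutes_high:.1f} minutes")
```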

utopiah ,

Right, it did have an AI winter a few decades ago. It’s indeed here to stay; that doesn’t mean any of the companies currently marketing it will be, though.

AI as a research field will stay, everything else maybe not.

billbennett ,

I've spent time with an AI laptop the past couple of weeks and 'overinflated' seems a generous description of where end user AI is today.

nobleshift ,
@nobleshift@lemmy.world avatar

Can we please get rid of the tech bros too?

4vgj0e ,

I find it insane when “tech bros” and AI researchers at major tech companies try to justify the wasting of resources (like water and electricity) in order to achieve “AGI” or whatever the fuck that means in their wildest fantasies.

These companies have no accountability for the shit that they do and consistently ignore all the consequences their actions will cause for years down the road.

catloaf ,

It’s research. Most of it never pans out, so a lot of it is “wasteful”. But if we didn’t experiment, we wouldn’t find the things that do work.

4vgj0e ,

I agree, but these researchers/scientists should be more mindful about the resources they use up in order to generate the computational power necessary to carry out their experiments. AI is good when it gets utilized to achieve a specific task, but funneling a lot of money and research towards general purpose AI just seems wasteful.

AssaultPepper ,

I mean, general-purpose AI doesn’t cap out at human intelligence, which you could utilize to come up with ideas for better resource management.

Could also be a huge waste but the potential is there… potentially.

Cryophilia ,

Most of the entire AI economy isn’t even research. It’s just grift. Slapping a label on ChatGPT and saying you’re an AI company. It’s hustlers trying to make a quick buck from easy venture capital money.

EnderMB ,

You can probably say the same about all fields, even those that have formal protections and regulations. That doesn’t mean there aren’t people who have PhDs in the field and are trying to improve it for the better.

Cryophilia ,

Sure, but typically that’s a small part of the field. With AI it’s the majority; that’s the difference.

rottingleaf ,

I don’t think I’ve heard of a lot of actual research in the AI area that isn’t connected to machine learning (which may be just one component, and not really a necessary one at that).

PenisDuckCuck9001 ,

I just want computer parts to stop being so expensive. Remember when gaming was cheap? Pepperidge Farm remembers. You used to be able to build a relatively high-end PC for less than the average dogshit Walmart laptop.

filister ,

To be honest, right now is a relatively good time to build a PC, except for the GPU, which is heavily overpriced. If you are content with last-gen AMD, I think even that can be brought down to somewhat acceptable levels.

arran4 ,

The fact that it is from the LA Times shows that it’s still significant, though.

Bishma ,
@Bishma@discuss.tchncs.de avatar

My only real hope out of this is that that copilot button on keyboards becomes the 486 turbo button of our time.

BlackLaZoR ,
@BlackLaZoR@fedia.io avatar

Meaning you unpress it, and computer gets 2x faster?

Bishma ,
@Bishma@discuss.tchncs.de avatar

I was thinking pressing it turns everything to shit, but that works too. I’d also accept, completely misunderstood by future generations.

yokonzo ,

Well now I wanna hear more about the history of this mystical shit button

macrocephalic ,

Back in those early days many applications didn’t have proper timing; they basically just ran as fast as they could. That was fine on an 8 MHz CPU, as you probably just wanted stuff to run as fast as it could (we weren’t listening to music or watching videos back then). When CPUs got faster (or started running at a multiple of the base clock speed), stuff was suddenly happening TOO fast. The turbo button was a way to slow the clock speed down by some amount to make legacy applications run the way they were supposed to run.
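
The “proper timing” fix is the same idea as a modern fixed-timestep game loop; a tiny sketch of the difference (Python purely for illustration, the DOS originals were obviously not written like this):

```python
# Difference between "run as fast as the CPU allows" and a fixed-tick loop.
import time

def untimed_loop(steps):
    for _ in range(steps):
        pass  # game logic here; a faster CPU means a faster game

def timed_loop(seconds, ticks_per_second=60):
    tick = 1.0 / ticks_per_second
    next_tick = time.monotonic()
    end = next_tick + seconds
    while next_tick < end:
        # game logic for one tick goes here
        next_tick += tick
        time.sleep(max(0.0, next_tick - time.monotonic()))

timed_loop(1.0)  # advances ~60 ticks in one wall-clock second on any CPU
```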

barsoap ,

Most turbo buttons never worked for that purpose, though; they were still way too fast. Like, even ignoring other advances such as better IPC (or rather CPI back in those days), you don’t get to an 8 MHz 8086 by halving the clock speed of a 50 MHz 486. You get to 25 MHz. And practically all games past that 8086 stuff were written with proper timing code, because devs knew perfectly well they were writing for more than one CPU. Also, there’s software to do the same job but more precisely and flexibly.

It probably worked fine for the original PC-AT or something when running PC-XT programs (how would I know, our first family box was a 386), but after that it was pointless. Then it hung on for years, then it vanished.

macrocephalic ,

Actually you pressed it and everything got 2x slower. Turbo was a stupid label for it.

somethingsnappy ,

That’s… the same thing.

Whoops, I thought you were responding to the first child comment.

Regrettable_incident ,
@Regrettable_incident@lemmy.world avatar

I could be misremembering but I seem to recall the digits on the front of my 486 case changing from 25 to 33 when I pressed the button. That was the only difference I noticed though. Was the beige bastard lying to me?

frezik ,

Lying through its teeth.

There was a bunch of DOS software that ran too fast to be usable on later processors. Like a Rogue-like game where you fly across the map too fast to control. The turbo button would bring it down to 8086 speeds so that stuff was usable.

PhlubbaDubba ,

I’m just praying people will fucking quit it with the worries that we’re about to get SKYNET or HAL when binary computing would inherently be incapable of recreating the fast pattern recognition required to replicate or outpace human intelligence.

Moore’s law is about raw computing power, which is a measure of hardware performance, not of the software you can run on it.

utopiah ,

Unfortunately it’s part of the marketing. Thanks, OpenAI, for that “Oh no… we can’t share GPT2, too dangerous,” and then… here it is. Definitely interesting then, but not world-shattering. Same for GPT3… but through an exclusive partnership with Microsoft, all closed; rinse and repeat for GPT4. It’s a scare tactic to lock up what was initially open, both directly and by closing the door behind them through regulation, or at least trying to.
