
deegeese ,

Guy who buys programmers and sells AI thinks he can sell more AI and stop buying programmers.

This is up there with Uber pretending self driving cars will make them rich.

MeekerThanBeaker ,

I mean… self driving cars probably will. Just not as soon as they think. My guess, at least another decade.

NOT_RICK ,
@NOT_RICK@lemmy.world avatar

Maybe, or maybe like harnessing fusion it will always be “just a few more years away!”

IphtashuFitz ,

Not until a self driving car can safely handle all manner of edge cases thrown at it, and I don’t see that happening any time soon. The cars would need to be able to recognize situations that may not be explicitly programmed into them and figure out a safe way to deal with them.

massive_bereavement ,

As someone said on this thread: as soon as they can convince legislators, even if they are murder machines, capital will go for it.

Borrowing from my favorite movie: "it's just a glitch".

meco03211 ,

Alternatively, measures could be put in place to eliminate certain edge cases. You can see similar concepts in places with separate infrastructure for things like buses or HOV lanes: places where you could still ostensibly allow “regular” vehicles to travel but limit or eliminate pedestrians and merging.

yildolw ,

Abolish snowfall

IphtashuFitz ,

I doubt it. The liability would be far too great. Ambulance chasing lawyers would salivate at the chance to represent the families of pedestrians struck and killed by buggy self driving cars. Those capitalists don’t want endless years of class action cases tying up their profits.

Nommer ,

When was the last time a corporation got anything other than a slap on the wrist and a small donation to the government just so they could keep doing what they’re doing?

atrielienz , (edited )

Like Boeing. As much as I hate people saying dumb shit about a company they don’t know much of anything about, Boeing is the epitome of what you said: a company getting a small slap on the wrist for gross negligence in the name of profit. Especially because of all the goodies they develop for the US Federal Government. And since they are a worldwide company, our government isn’t the only one. They know they sit in a position of power because they fill a hole in an industry that basically has to be filled. And people want to try to bankrupt them with some weird ideas about voting with their dollar. But that’s nonsense.

People don’t understand how they build planes not to sell but to lease. These leases keep their customers paying out the nose for an asset they don’t own, and responsible for the maintenance of that asset until it’s time to upgrade. They cornered the market on enshittification long before the likes of Microsoft and Google, and they have mastered the art of it.

Tesla or Uber or whoever wish they could do what Boeing has been doing for decades. People have this rose-tinted view of what Boeing “used to be” when it was “run by engineers,” etc. That’s hilarious to me. Back in the day they hedged their bets in a race to the bottom, developing a twin-engine plane that wouldn’t catastrophically fall out of the sky, so they could skirt regulations around the world that required planes to have more than two engines; those extra engines added to upkeep and fuel costs, making the planes untenable and air travel incredibly expensive. Their engineers managed it, so the long game paid off: planes that were more fuel efficient and cheaper to maintain, meaning their customers could afford to buy more of them and offer air travel to more people.

You know what we got from that? Shittier seating arrangements, poorly manufactured planes, and baggage fees out the wazoo, on top of ever-rising ticket prices for air travel.

cogitase ,

Their accident rate continues to decrease and things like quorum sensing and platooning are going to push them to be better than humans. You’re never going to have a perfect system that never has accidents, but if you’re substantially better than humans in accidents per mile driven and you’re dramatically improving throughput and reducing traffic through V2X, it’s going to make sense to fully transition.
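
For what it’s worth, “better than humans per mile driven” is easy to state precisely. A toy calculation with invented figures, just to show the shape of the comparison:

```python
def crashes_per_million_miles(crashes: int, miles: float) -> float:
    """Normalize crash counts so fleets of different sizes are comparable."""
    return crashes / (miles / 1_000_000)

# Invented figures purely for illustration; real fleet data would go here.
human = crashes_per_million_miles(crashes=4_800, miles=1_200_000_000)
robot = crashes_per_million_miles(crashes=150, miles=60_000_000)

print(f"human drivers: {human:.2f} crashes per million miles")
print(f"autonomous fleet: {robot:.2f} crashes per million miles")
print("autonomous is safer per mile" if robot < human else "humans are safer per mile")
```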

I imagine some east Asian countries will be the first to transition and then the rest of the world will begrudgingly accept it once the advantages become clear and the traditional car driving zealots die off.

AtomicTacoSauce ,
@AtomicTacoSauce@lemmy.world avatar

The robot taxi from Total Recall came to mind while reading your reply. Our future is almost assuredly dystopian.

KevonLooney ,

Plus, as soon as the cars can drive themselves people will stop needing Uber in many cases.

No parking? Just tell your car to go park on a street 10 blocks away.

Drunk? Car drives itself while you sleep.

Going to the airport? Car drops you off and returns home. Car also picks you up when you are back.

This is combined with the fact that people will do more disgusting things in an Uber without the driver there. If you have ever driven for Uber, you know that 10% of people are trying to eat or drink in the car. They are going to spill and it’s going to end up like the back of a bus.

yildolw ,

Not sure if we’re agreeing and saying exactly the same thing here, but Uber’s business model is to get suckers who are bad at math to own the cars. Uber’s business model does not work if they have to own their own cars. Self-driving Uber doesn’t work because Uber would have to own the cars and therefore has to cover vehicle insurance, vehicle depreciation, and so on out of its own margin.

Num10ck ,

There will be a massive building, in like India, with many thousands of atrociously paid workers donning VR goggles, spending their long hours constantly Quantum Leap-ing into traumatizing last-second emergency situations that the AI gives up on. Instantly they slam on the brakes as hard as they can. They drink tea. There’s suicide netting everywhere. They were the lowest bidder this quarter.

pbbananaman ,

Just like all humans can do right now, right?

I never see any humans on the road staring at their phone and driving like shit.

Wilzax ,

The problem with self-driving cars isn’t that it’s worse than human drivers on average, it’s that it’s SO INCREDIBLY BAD when it’s wrong that no company would ever assume the liability for the worst of its mistakes.

pbbananaman ,

But if the average is better, then we will clearly win by using it. I’m not following the logic of tracking the worst-case scenarios as opposed to the average.

Wilzax ,

Average is better means fewer incidents overall. But when there are incidents, the damages for those incidents tend to be much worse. This means the victims are more likely to lawyer up and go after the company responsible for the AI that was driving, and that means that the company who makes the self-driving software better be prepared to pay for those worst case scenarios, which will now be 100% their fault.

Uber can avoid liability for crashes caused by their human drivers. They won’t be able to do the same when their fleet is AI. And when that happens, AI sensibilities will be measured by human metrics, because courts are run by humans. The mistakes they make will be VERY expensive ones, because a minor glitch can turn an autonomous vehicle from the safest driving experience possible into a rogue machine with zero sense of self-preservation. That liability is not worth the cost savings of getting rid of human drivers yet, and it won’t be for a very long time.

Nollij ,

“Handle” is doing a lot of heavy lifting there. The signs are already there that all of these edge cases will just be programmed as “safely pull over and stop until conditions change or a human takes control.” That isn’t a small task in itself, but it’s a lot easier than figuring out how to continue on, e.g., ice.
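
Something like this toy state machine, with hypothetical mode names and a made-up confidence threshold, is roughly what that fallback amounts to:

```python
from enum import Enum, auto

class Mode(Enum):
    NORMAL = auto()
    MINIMAL_RISK_STOP = auto()  # pulled over, hazards on, waiting
    HUMAN_CONTROL = auto()

def next_mode(mode, planner_confidence, conditions_ok, human_takeover,
              min_confidence=0.85):
    """Toy policy: drive while confident, otherwise pull over and wait."""
    if human_takeover:
        return Mode.HUMAN_CONTROL
    if mode is Mode.NORMAL and planner_confidence < min_confidence:
        return Mode.MINIMAL_RISK_STOP      # edge case: degrade to a safe stop
    if mode is Mode.MINIMAL_RISK_STOP and conditions_ok and planner_confidence >= min_confidence:
        return Mode.NORMAL                 # conditions changed, resume driving
    return mode

# e.g. black ice drops the planner's confidence, so the car parks itself
print(next_mode(Mode.NORMAL, planner_confidence=0.4, conditions_ok=False, human_takeover=False))
```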

deegeese ,

Self driving taxis are definitely happening, but the people getting rich in a gold rush are the people selling shovels.

Uber has no structural advantage because their unique value proposition is the army of cheap drivers.

yildolw ,

We’re a century away from self-driving cars that can handle snowfall

Just this year, farmers with self-driving tractors got screwed because a solar flare made GPS inaccurate, and the tractors went wild because they were programmed on the assumption that GPS is 100% reliable and accurate, with no way to override.
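
That failure mode is basically a missing sanity check plus a missing override path. A minimal sketch of the kind of guard you’d want, with hypothetical field names and thresholds:

```python
import math
from dataclasses import dataclass

@dataclass
class GpsFix:
    x: float      # local easting in metres (hypothetical local frame)
    y: float      # local northing in metres
    hdop: float   # horizontal dilution of precision reported by the receiver

def distance_m(a: GpsFix, b: GpsFix) -> float:
    return math.hypot(a.x - b.x, a.y - b.y)

def gps_trustworthy(fix, last_fix=None, max_hdop=2.0, max_jump_m=5.0) -> bool:
    """Reject fixes that are degraded (e.g. during a solar storm) or jump implausibly."""
    if fix is None or fix.hdop > max_hdop:
        return False
    if last_fix is not None and distance_m(fix, last_fix) > max_jump_m:
        return False
    return True

def steering_command(fix, last_fix, operator_override: bool) -> str:
    if operator_override:                     # always let the farmer take over
        return "manual"
    if not gps_trustworthy(fix, last_fix):
        return "stop_and_alert"               # fail safe instead of ploughing on
    return "follow_planned_path"

print(steering_command(GpsFix(0.0, 0.0, hdop=9.7), GpsFix(0.1, 0.0, hdop=1.1), operator_override=False))
```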

trolololol ,

I hope this helps people understand that you don’t get to be CEO by being smart or working hard. It’s all influence and gossip all the way up.

dylanmorgan ,

In fact, being stupid is probably a benefit.

trolololol ,

Yep, if I had that kind of money and were surrounded by like-minded people, I’d agree. Unfortunately I’m cursed with a rational mind 🙃🙃🙃

Hackworth ,

“Coding” was never the source of value, and people shouldn’t get overly attached to it. Problem solving is the core skill. The discipline and precision demanded by traditional programming will remain valuable transferable attributes, but they won’t be a barrier to entry. - John Carmack

ASDraptor ,

This right here.

The problem is not coding. Anybody can learn that with a couple of well-focused courses.

I’d love to see an AI find the cause of a catastrophic crash of a machine that isn’t caused by a software bug.

Hackworth ,

Catching up on what Carmack’s been up to for the last decade has revived the fan in me. I love that 2 years after leaving Oculus to focus on AGI, this is all the hype he’s willing to put out there.

gibmiser ,

Fucking lol.

Boozilla ,
@Boozilla@lemmy.world avatar

They’ve been saying this kind of bullshit since the early 90s. Employers hate programmers because they are expensive employees with ideas of their own. The half-dozen elite lizard people running the world really don’t like that kind of thing.

Unfortunately, I don’t think any job is truly safe forever. For myriad reasons. Of course there will always be a need for programmers, engineers, designers, testers, and many other human-performed jobs. However, that will be a rapidly changing landscape and the number of positions will be reduced as much as the owning class can get away with. We currently have large teams of people creating digital content, websites, apps, etc. Those teams will get smaller and smaller as AI can do more and more of the tedious / repetitive / well-solved stuff.

SlopppyEngineer ,

And by that time, processors and open source AI are good enough that any noob can ask his phone to generate a new app from scratch. You’d only need big corpo for cloud storage and then only when distributed systems written by AI don’t work.

L0rdMathias ,

“Guy who was fed a pay-to-win degree at a nepotism practicing school with a silver spoon shares fantasy, to his fan base that own large publications, about replacing hard working and intelligent employees with machines he is unable to comprehend the most basic features of”

UnsavoryMollusk ,

You did a great summary honestly

umbrella ,
@umbrella@lemmy.ml avatar

when will ai replace ceos?

A_A ,
@A_A@lemmy.world avatar

Mark Zuckerberg is not a robot ?

umbrella ,
@umbrella@lemmy.ml avatar

good point

RamblingPanda ,

Is he fully functional? I have some standards.

Whitebrow ,

Lizardman. Easy to confuse the two as they’re both cold to the touch by default.

lemmeBe ,

This. ⬆️ 😆

_____ ,

No!!! They’re useful because uhhmm uuhhhh uhmm uhhhbbh dndusfjduehrhrh

BallsandBayonets ,

Wouldn’t even need AI, a coin flip would do just as well.

homesweethomeMrL ,

Yeah hows that goin’?

aviation_hydrated ,

It can write really buggy Python code, so… Yeah, seems promising

ripcord ,
@ripcord@lemmy.world avatar

It does a frequently shitty job of writing docstrings for simple functions, too!
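
For anyone who hasn’t hit it yet, this is the flavour of thing people mean: a hypothetical, tidy-looking function with a confident docstring and a classic Python footgun:

```python
def add_tag(tag, tags=[]):
    """Return the list of tags with the new tag appended."""
    # Bug: the mutable default list is shared across calls, so tags from one
    # call leak into the next. The confident docstring also doesn't mention
    # that the list is mutated in place.
    tags.append(tag)
    return tags

print(add_tag("urgent"))    # ['urgent']
print(add_tag("billing"))   # ['urgent', 'billing']  <- surprise
```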

Beetschnapps ,

Almost like dealing with real engineers…

yesman ,

I just want to remind everyone that capital won’t wait until AI is “as good” as humans, just when it’s minimally viable.

They didn’t wait for self-checkout to be as good as a cashier; they didn’t wait for chat-bots to be as good as human support; and they won’t wait for AI to be as good as programmers.

xtr0n ,

And then we should all charge outrageous hourly rates to fix the AI generated code.

SlopppyEngineer , (edited )

They’ll try the opposite. It’s what the movie producers did to the writers. They gave them AI generated junk and told them to fix it. It was basically rewriting the whole thing but because now it was “just touching up an existing script” it was half price.

eager_eagle ,
@eager_eagle@lemmy.world avatar

Yeah they’ll try. Surely that can’t cascade into a snowball of issues. Good luck for them 😎

SlopppyEngineer ,

A strike by tech workers would be something else. Curious what would happen if the people maintaining the servers for entertainment, the stock market, or factories just walked out. On the other hand, tech doesn’t have unions.

peopleproblems ,

You better fucking believe it.

AIs are going to be the new outsourcing, only cheaper than outsourcing and probably less confusing for us to fix

AmbiguousProps ,

They won’t, and they’ll suffer because of it and want to immediately hire back programmers (who can actually do problem solving for difficult issues). We’ve already seen this happen with customer service reps - some companies have resumed hiring customer service reps because they realized AI isn’t able to do their jobs.

SlopppyEngineer ,

And because of all the theft and malfunctions, the nearby supermarkets replaced the self-checkouts with normal cashiers again.

If it’s AI doing all the work, the responsibility goes to the remaining humans. There’ll be interesting lawsuits when the inevitable bug shows up that the AI itself can’t figure out.

deegeese ,

Unexpected item in bagging area? I think you meant free item in bagging area.

atrielienz ,

We saw this happen with Amazon’s cashier-less stores. They were actively trying to use a computer-based AI system, but it didn’t work without thousands of man-hours from real humans, which is why those stores are going away. Companies will try this repeatedly till they get something that does work or they run out of money. The problem is, some companies have cash to burn.

I doubt the vast majority of tech workers will be replaced by AI any time soon. But they’ll probably keep trying because they really really don’t want to pay human beings a liveable wage.

painfulasterisk1 ,

It’s really funny how AI “will perform X job in the near future,” but you barely, if ever, see articles saying that AI will replace CEOs in the near future.

Dearth ,

Somewhere there is a dev team secretly programming an AI to take over bureaucratic and managerial jobs, while disguising it to their CTO and CEO as a code-writing AI.

Hexagon ,

Can AI do proper debugging and troubleshooting? That’s when I’ll start to get worried

hoot ,

Uh huh.

werefreeatlast ,

AI is terrible at solving real problems thru programming. As soon as the problem is not technical in nature and needs a decision to be made based on experience, it falls flat on its face.

ipkpjersi ,

It will never understand context and business rules and things of that nature to the same extent that actual devs do.

Melvin_Ferd ,

I don’t get why the story isn’t that AI would help programmers build way better things. If it can actually replace a programmer, I think it’s probably just as capable of replacing a CEO. I bet replacing the CEO is the better use case.

collapse_already ,

You can hire a lot of programmers for the cost of one CEO.

Semi_Hemi_Demigod ,
@Semi_Hemi_Demigod@lemmy.world avatar

Until an AI can get clear, reasonable requirements out of a client/stakeholder our jobs are safe.

Beetschnapps ,

So never right?

If the assumption is that a PM holds all the keys…

SparrowRanjitScaur ,

Extremely misleading title. He didn’t say programmers would be a thing of the past, he said they’ll be doing higher level design and not writing code.

Tyfud ,

Even so, he’s wrong. This is the kind of stupid thing someone without any first hand experience programming would say.

SparrowRanjitScaur ,

Not really, it’s doable with ChatGPT right now for programs that have a relatively small scope. If you set very clear requirements and decompose the problem well, it can generate fairly high-quality solutions.
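
To be fair about what “small scope with clear requirements” means in practice, it’s roughly tasks like this hypothetical one, where the whole spec fits in a couple of lines:

```python
# Hypothetical spec, small enough to state completely in two lines:
# "Given a list of (name, score) pairs, return the names of the top n scorers,
#  highest score first, ties broken alphabetically."
def top_scorers(results, n):
    ranked = sorted(results, key=lambda r: (-r[1], r[0]))
    return [name for name, _ in ranked[:n]]

assert top_scorers([("ana", 9), ("bo", 9), ("cy", 7)], 2) == ["ana", "bo"]
```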

OmnislashIsACloudApp ,

right now not a chance. it’s okay ish at simple scripts. it’s alright as an assistant to get a buggy draft for anything even vaguely complex.

ai doing any actual programming is a long ways off.

Tyfud ,

This is incorrect. And I’m in the industry, in this specific field. Nobody in my industry, in my field, at my level seriously considers this effective enough to replace their day-to-day coding beyond generating some boilerplate ELT/ETL-type scripts that it is semi-effective at. It still contains multiple errors 9 times out of 10.

I cannot be more clear. The people who are claiming this is possible are not tenured or effective coders, much less 10x devs in any capacity.

People who think it generates quality enough code to be effective are hobbyists, people who dabble with coding, who understand some rudimentary coding patterns/practices, but are not career devs, or not serious career devs.

If you don’t know what you’re doing, LLMs can get you close, some of the time. But there’s no way it generates anything close to quality enough code for me to use without the effort of rewriting, simplifying, and verifying.

Why would I want to voluntarily spend my day trying to decipher someone else’s code? I don’t need ChatGPT to solve a coding problem. I can do it, and I will. My code will always be more readable to me than someone else’s. This is true by orders of magnitude for AI-generated code today.

So I don’t consider anyone who treats LLM code gen as a viable path forward to be a serious person in the engineering field.

Eyck_of_denesle ,

I heard a lot of programmers say it

Tyfud ,

They’re falling for a hype train then.

I work in the industry, with several thousand of my peers who also code every day. I lead a team of extremely talented, tenured engineers across the company to take on some of the most difficult challenges it can offer us. I’ve been coding and working in tech for over 25 years.

The people who say this are people who either do not understand how AI (LLMs in this case) work, or do not understand programming, or are easily plied by the hype train.

We’re so far off from this existing with the current tech, that it’s not worth seriously discussing.

There are scripts and snippets of code that VS Code’s LLM or VS2022’s LLM plugin can help with or bring up. But 9 times out of 10 there are multiple bugs in them.

If you’re doing anything semi-complex it’s a crapshoot if it gets close at all.

It’s not bad for generating pseudo-code or templates, but it’s designed to generate code that looks right, not code that is right; and there’s a huge difference.
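
A hypothetical example of “looks right, not right”: clean, plausible, and wrong at the boundary:

```python
def paginate(items, page, page_size=10):
    """Return the given 1-indexed page of items."""
    # Reads fine in review, but treats page as 0-indexed: page 1 silently
    # returns the *second* page. Exactly the "looks right, isn't right" trap.
    start = page * page_size
    return items[start:start + page_size]

print(paginate(list(range(25)), page=1))  # expected 0..9, actually 10..19
```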

AI-generated code is exceedingly buggy, and if you don’t understand what it’s trying to do, it’s impossible to debug, because what it generates is trash-tier code quality.

The tech may get there eventually, but there’s no way I trust it, or anyone I work with trusts it, or considers it a serious threat or even resource beyond the novelty.

It’s useful for non-engineers to get an idea of what they’re trying to do, but it can just as easily send them down a bad path.
