
key ,

“software developer says ai will not replace software developers” feels very John Henry

AmbiguousProps ,

Nope

tal ,

In the long run, sure.

In the near term? No, not by a long shot.

There are some tasks we can automate, and that will happen. That’s been a very long-running trend, though; it’s nothing new. People generally don’t write machine language by physically flipping switches these days; many decades of automation have happened since then.

I also don’t think that a slightly-tweaked latent diffusion model, of the present “generative AI” form, will get all that far, either. The fundamental problem: taking an incomplete specification in human language and translating it to a precise set of rules in machine language making use of knowledge of the real world, isn’t something that I expect you can do very effectively by training on a existing corpus.

The existing generative AIs work well on tasks where you have a large training corpus that maps from something like human language to an image. The resulting images don’t have a lot by way of hard constraints on their precision; you can illustrate that by generating a batch of ten images for a given prompt: they might all look different, but a fair number will look decent enough.

I think that some of that is because humans typically process images and language in a way that is pretty permissive of errors; we rely heavily on context and our past knowledge of the real world to come up with the correct meaning. An image just needs to “cue” our memories and understanding of the world. We can see images that are distorted or stylized, or see pixel art, and recognize it for what it is.

But…that’s not what a CPU does. Machine language is not very tolerant of errors.

So I’d expect a generative AI to be decent at putting out content intended to be consumed by humans – and we have, in fact, had a number of impressive examples of that working. But I’d expect it to be less-good at putting out content intended to be consumed by a CPU.

I think that lack of tolerance for error, plus the need to pull in information from the real world, is going to make translating human language to machine language less of a good match than translating human language to human language, or human language to a human-consumable image.

Jestzer ,

The rule of any article asking a question in its title is that the answer is always no.

flamingo_pinyata ,

AI is actually great at typing the code quickly, once you know exactly what you want. But it’s already the case that if your engineers spend most of their time typing code, you’re doing something wrong. AI or no AI.

hendrik ,

I don't think so. I've had success letting it write boilerplate code, and simple stuff that I could have copied from Stack Overflow or a beginner's programming book. With every task from my real life it failed miserably. I'm not sure if I did anything wrong. And it's been half a year since I last tried. Maybe things have changed substantially in the last few months. But I don't think so.

The last thing I tried was some hobby microcontroller code to do some robotics calculations. And ChatGPT didn't really get what it was supposed to do. Additionally, instead of doing the maths, it would just invent some library functions, call them with some input values, and imagine the maths to be miraculously done in the background by that nonexistent library.
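The original task isn't specified in the thread, so as a hypothetical stand-in: the kind of robotics maths meant here is often just a few lines of plain trigonometry, which is exactly what the model should have written out instead of inventing a library. A minimal sketch (function name, link lengths, and the two-link arm itself are all my assumptions, not from the post):

```python
import math

def forward_kinematics_2link(theta1, theta2, l1=1.0, l2=1.0):
    """End-effector (x, y) of a planar two-link arm; angles in radians.

    No external library needed: the position is the sum of the two
    link vectors, each rotated by the accumulated joint angle.
    """
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# With both joints at zero the arm lies stretched along the x-axis:
# forward_kinematics_2link(0.0, 0.0) -> (2.0, 0.0)
```

On a microcontroller you'd typically do the same thing in C with `cosf`/`sinf`, but the point stands: the maths is small enough to write directly.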

flamingo_pinyata ,

Yes actually, I can imagine it getting microcontroller code wrong. My niche is general backend services. I’ve been using GitHub Copilot a lot and it has served me well for generating unit tests: write a test description and it pops out the code with ~80% accuracy.
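To illustrate the workflow described (the function and test here are my own hypothetical example, not from the thread): you write a descriptive test name or comment, and a Copilot-style completion fills in the body, which you then review.

```python
import re

def slugify(text):
    """Lowercase, trim, and replace runs of non-alphanumerics with hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

# The test name doubles as the "description" the assistant completes from:
def test_slugify_collapses_punctuation_and_spaces():
    assert slugify("  Hello, World!  ") == "hello-world"

def test_slugify_leaves_clean_slugs_alone():
    assert slugify("already-a-slug") == "already-a-slug"
```

The ~80% figure matches this shape of task well: the intent is fully captured by the name, so there's little real-world knowledge for the model to get wrong, and a bad completion fails visibly when the test runs.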

hendrik , (edited )

Sure. There are lots of tedious tasks in a programmer's life that don't require a great amount of intelligence. I suppose writing some comments, docstrings, unit tests, "glue" and boilerplate code that connects things, and probably several other things that now escape my mind, are good tasks for an AI to assist a proper programmer with and make them more effective and get things done faster.

I just wouldn't call that programming software. I think assisting with some narrow tasks is more exact.

Maybe I should try doing some backend stuff. Or give it an API definition and see what it does 😅 Maybe I was a bit blinded by ChatGPT having read Wikipedia and claiming it understands robotics concepts. But it really doesn't seem to have any proper knowledge. The same probably applies to engineering and other neighboring fields that might need software.

flamingo_pinyata ,

It might also have to do with specialized vs. general models. Copilot is good at generating code, but ask it to write prose text and it fails completely. In contrast, ChatGPT is awful at code but handles human-readable text decently.

A_A ,

… it will take many years … and designs will change considerably before we are there.

Telorand ,

Not until it’s better at QA than I am. Good luck teaching a machine how stupid end-users can be.

dan1101 ,

No

Olap ,

Also, obviously no

hendrik , (edited )

Tl;Dr: Not anytime soon. It fails even at simple tasks.

Technus ,

Even if it didn’t, any middle manager who decides to replace their dev team with AI is going to realize pretty quickly that actually writing code is only a small part of the job.

Won’t stop 'em from trying, of course. But when the laid-off devs get frantic calls from management asking them to come back and fix everything, they’ll be in a good position to negotiate a raise.

hendrik ,

If anything, AI could be used to replace managers 😆 I mean, lots of management seems to be just pushing paper to me. Ideal to be handled by AI. But I think we still need people to do the real work for quite some time to come. Especially software architecture and coding (complex) stuff ain't easy. Neither is project management. So I guess even some managers can stay.

Technus ,

Don’t even need an AI. Just teach a parrot to say “let’s circle back on this” and “how many story points is that?”

Disregard3145 ,

“Its easy, right. Just …”

conciselyverbose ,

Good management is almost all people skills. It needs to be influenced by domain knowledge for sure, but it’s almost all about people.

You can probably match trash managers, but you won’t replace remotely competent ones.
