It's really funny how AI "will perform X job in the near future," but you barely, if ever, see articles saying that AI will replace CEOs in the near future.
C-suites are like Russian elites.
The latter are thieves who inherited a state from the Soviet leadership. They have a layman's idea of what a state and a country are, of what history itself is, plus whatever a taxi driver might say. For the last 20 years they've been trying to apply that weird idea to reality, as if playing Hearts of Iron, because they want to be great and to be in charge of everything that happens.
The former heard in school that there were industrial revolutions and such, and they too want to be great, so they believe every stupid hype about someone being replaced by great new technology, and of course they want to be in charge of that process.
While in actuality, with today's P2P technologies, CEOs are the ones most likely to be replaced, if we use our common sense. And without "AI", of course: just by decentralized systems allowing much bigger, more powerful, and more competitive cooperatives than before, ones that form and disband very easily.
Until an AI can get clear, reasonable requirements out of a client or stakeholder, our jobs are safe.
So never, right?
If the assumption is that a PM holds all the keys…
But human QAs... human QAs everywhere!
Just confirming what we already knew.
AI is terrible at solving real problems through programming. As soon as the problem is not technical in nature and needs a decision made based on experience, it falls flat on its face.
It will never understand context and business rules and things of that nature to the same extent that actual devs do.
Lol sure, and AI made human staff at grocery stores a thing of the....oops, oh yeah....y'all tried that for a while and it failed horribly....
So tired of the bullshit "AI" hype train. I can't wait for the market to crash hard once everybody realizes it's a bubble and AI won't magically make programmers obsolete.
Remember when everything was using machine learning and blockchain technology? Pepperidge Farm remembers...
Yeah, I member.
We are now X+14 months away from AI replacing your job in X months.
Current AI is good at compressing knowledge.
Best job role: information assistant or virtual secretary.
It's a good thing. After all, I don't care when Amazon goes down.
I wish.
Extremely misleading title. He didn't say programmers would be a thing of the past, he said they'll be doing higher level design and not writing code.
So they would be doing engineering and not programming? To me that sounds like programmers would be a thing of the past.
Even so, he's wrong. This is the kind of stupid thing someone without any first-hand programming experience would say.
I've heard a lot of programmers say it.
They're falling for a hype train then.
I work in the industry, alongside several thousand peers who also code every day. I lead a team of extremely talented, tenured engineers across the company to take on some of the most difficult challenges it can offer us. I've been coding and working in tech for over 25 years.
The people who say this either do not understand how AI (LLMs in this case) works, or do not understand programming, or are easily swayed by the hype train.
We're so far off from this existing with the current tech, that it's not worth seriously discussing.
There are scripts and snippets of code that VS Code's LLM or VS2022's LLM plugin can help with or bring up. But nine times out of ten there are multiple bugs in them.
If you're doing anything semi-complex, it's a crapshoot whether it gets close at all.
It's not bad for generating pseudo-code or templates, but it's designed to generate code that looks right, not code that is right, and there's a huge difference (see the sketch below).
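To make that distinction concrete, a hypothetical example of my own (not from the comment): both functions below read plausibly, but the first has a classic boundary bug.

```python
def last_n_wrong(items: list, n: int) -> list:
    """Looks right: a clean one-liner. But items[-0:] slices from
    index 0, so n == 0 returns the whole list instead of nothing."""
    return items[-n:]

def last_n(items: list, n: int) -> list:
    """Is right: the n == 0 edge case is handled explicitly."""
    return items[-n:] if n else []

print(last_n_wrong([1, 2, 3], 0))  # [1, 2, 3] -- plausible, wrong
print(last_n([1, 2, 3], 0))        # []        -- correct
```

Code like the first version sails through a casual read, which is what "looks right, not right" means in practice.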
AI-generated code is exceedingly buggy, and if you don't understand what it's trying to do, it's impossible to debug, because what it generates is trash-tier code quality.
The tech may get there eventually, but there's no way I trust it, nobody I work with trusts it, and none of us consider it a serious threat, or even a serious resource beyond the novelty.
It's useful for non-engineers to get an idea of what they're trying to do, but it can just as easily send them down a bad path.
Not really; it's doable with ChatGPT right now for programs that have a relatively small scope. If you set very clear requirements and decompose the problem well, it can generate fairly high-quality solutions.
Right now, not a chance. It's okay-ish at simple scripts, and alright as an assistant for getting a buggy draft of anything even vaguely complex.
AI doing any actual programming is a long way off.
This is incorrect. And I'm in the industry, in this specific field. Nobody in my industry, in my field, at my level seriously considers this effective enough to replace their day-to-day coding, beyond generating some boilerplate ELT/ETL-type scripts that it is semi-effective at (roughly the shape sketched below). Even those still contain multiple errors nine times out of ten.
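For anyone outside the field, a minimal Python sketch of the kind of boilerplate ETL script being described; the file names and the "email" field are hypothetical placeholders of mine, not anything from the comment above:

```python
import csv
import json

# Hypothetical extract-transform-load boilerplate; "input.csv",
# "output.json", and the "email" field are illustrative only.

def extract(path: str) -> list[dict]:
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows: list[dict]) -> list[dict]:
    # Normalize one field and drop rows that are missing it.
    return [
        {**row, "email": row["email"].strip().lower()}
        for row in rows
        if row.get("email")
    ]

def load(rows: list[dict], path: str) -> None:
    with open(path, "w") as f:
        json.dump(rows, f, indent=2)

if __name__ == "__main__":
    load(transform(extract("input.csv")), "output.json")
```

Glue code like this is exactly the well-trodden, low-context pattern an LLM can usually reproduce; the claim above is that it struggles with anything less routine.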
I cannot be more clear: the people who are claiming this is possible are not tenured or effective coders, much less 10x devs in any capacity.
People who think it generates code of high enough quality to be effective are hobbyists, people who dabble in coding and understand some rudimentary coding patterns and practices but are not career devs, or at least not serious ones.
If you don't know what you're doing, LLMs can get you close, some of the time. But there's no way they generate code anywhere near good enough for me to use without the effort of rewriting, simplifying, and verifying it.
Why would I want to voluntarily spend my day trying to decipher someone else's code? I don't need ChatGPT to solve a coding problem. I can do it, and I will. My code will always be more readable to me than someone else's, and that's true by orders of magnitude for AI code gen today.
So I don't consider anyone who thinks LLM code gen is a viable path forward to be a serious person in the engineering field.
It's just a tool like any other. An experienced developer knows you can't apply every tool to every situation, just as you should know the difference between threads and coroutines and when to apply each (see the sketch below), or which design pattern is relevant to a given situation. It's a tool, and a useful one if you know how to use it.
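To make the threads-versus-coroutines point concrete, a minimal Python sketch (my illustration, not the commenter's): threads fit blocking calls you can't change, while coroutines fit many concurrent I/O waits on a single thread.

```python
import asyncio
import threading
import time

# Thread: suited to blocking work you can't rewrite as async.
def blocking_task(name: str) -> None:
    time.sleep(1)  # stands in for a blocking library call
    print(f"thread {name} done")

# Coroutine: suited to many concurrent I/O waits in one thread.
async def io_task(name: str) -> None:
    await asyncio.sleep(1)  # stands in for an awaitable I/O call
    print(f"coroutine {name} done")

async def main() -> None:
    await asyncio.gather(*(io_task(str(i)) for i in range(3)))

if __name__ == "__main__":
    threads = [threading.Thread(target=blocking_task, args=(str(i),))
               for i in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    asyncio.run(main())
```

Picking the wrong one, say, a thread per connection where coroutines would do, works until it doesn't, which is the commenter's point about knowing your tools.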
This is like using a tambourine made of optical discs as a storage solution. Actually a bit better, because punctured discs are no good.
A full description of what a program does is the program itself; have you heard that? (Except for UB, libraries, ..., but an LLM is no better than a human at those, either.)
Yeah, there are people who can imagine "in general" how this will happen, but programming is 99% not about "in general"; it's about specific "dumb" conflicts with objective reality.
People think that what they generally imagine as the task is the most important part, and since they don't actually do programming, or anything else that requires dealing with those small details, they just plainly ignore them, because those conversations and opinions exist in a subjective, bendable reality.
But objective reality doesn't bend. Their general ideas, without every little bloody detail, simply won't work.
Until you ask it to do something never done before and it has a meltdown.
I've seen what Amazon produces internally for software; I think the LLMs could probably do a better job.
It's worth noting that the new CEO is one of the few people at Amazon to have worked their way up from PM and sales to CEO.
With that in mind, while it's a hilariously stupid comment to make, he's in the business of selling AWS and its role in AI. Take it with the same level of credibility as that crypto scammer you know telling you that Bitcoin is the future of banking.
As a wage slave with no Bitcoin or crypto: the technology has been hijacked by these types, when it could otherwise have been useful.
I managed to get an AI to build Pong in assembly. AIs are pretty cool things, but not sci-fi level just yet, and I didn't just say "build Pong in assembly": I had to hand-hold it a little bit. You need to be a programmer to understand how to guide the AI through the task.
That was something very simple; I doubt you could get it to do more complex tasks without a lot more back and forth.
To give you an example, I had a hard time getting it to understand that the ball needed to bounce off at an angle if intercepted at an angle; it just kept snapping the bounce to 90° increments. I couldn't fix it myself, because I don't know assembly well enough to get into the weeds, so I was stuck until I was finally able to get the AI to do what I wanted. I sort of understood what the problem was: there was a number somewhere in the system, and it needed to make that number negative, but it just kept setting the number to a fixed value (see the sketch below). A non-programmer wouldn't understand that that was the problem, so they wouldn't be able to explain to the AI how to fix it.
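For anyone curious what that fix looks like, here's a minimal Python sketch of the bounce logic being described; the function and variable names are mine, not from the original assembly:

```python
# The ball's velocity has a horizontal and a vertical component.
# Bouncing off a vertical paddle should negate vx and keep vy.

def bounce_correct(vx: int, vy: int) -> tuple[int, int]:
    """Reflect off the paddle: negate the horizontal component and
    keep the vertical one, so the incoming angle is preserved."""
    return -vx, vy

def bounce_wrong(vx: int, vy: int) -> tuple[int, int]:
    """What the AI kept generating: overwrite the velocity with
    fixed values, snapping every bounce onto a 90° axis."""
    return -3, 0

print(bounce_correct(3, 2))  # (-3, 2): angle preserved
print(bounce_wrong(3, 2))    # (-3, 0): flattened to horizontal
```

The entire bug is "negate the number" versus "overwrite the number", which is exactly the kind of detail that's obvious to a programmer and invisible to anyone else.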
I believe AI is going to become an unimaginably useful tool in the future, and we probably don't yet understand just how useful it's going to be. But unless they actually make AGI, it isn't going to replace programmers.
If they do make AGI, all bets are off; it will probably go off and build a Dyson sphere or something at that point, and we will have no way of understanding what it's doing.
Yeah, I don't see AI replacing any developers working on an existing, moderately complex codebase. It can help speed up some tasks, but it's far from being able to take a requirement and turn it into code that edits the right places and doesn't break everything.