this post was submitted on 21 Sep 2024
Technology
Please remove it if unallowed

I see a lot of people here who get mad at AI-generated code, and I am wondering why. I wrote a couple of bash scripts with the help of ChatGPT, and if anything, I think it's great.

Now, I obviously didn't tell it to write the entire script by itself. That would be a horrible idea. Instead, I asked it questions along the way and tested its output before putting it in my scripts.

I am fairly competent at writing programs. I know how and when to use arrays, loops, functions, conditionals, etc. I just don't know anything about Bash's syntax. I could have used any other language I knew, but I chose Bash because it made the most sense: it ships with most Linux distros out of the box, so nobody has to install another interpreter or compiler. I don't like Bash because of its, dare I say, weird syntax, but it made the most sense for my purpose, so I chose it. I had also never written anything of this complexity in Bash before, just batches of commands saved in a file so I wouldn't have to type them one after another. This project, though, required many rather advanced features. I was not motivated to learn Bash; I just wanted to put my idea into action.

I did start with an internet search, but the guides I found were lacking. I could not easily find how to pass values into a function and return a result, remove a trailing slash from a directory path, loop over an array, catch an error from the previous command, or separate the letters and numbers in a string.
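For reference, each of those pieces turns out to be only a line or two of Bash. A minimal sketch (the paths, values, and function names here are made-up examples, not the actual scripts):

```shell
#!/usr/bin/env bash
set -euo pipefail  # exit on any failed command: errors can't slip past silently

# Pass values in as positional arguments; "return" a value by echoing it.
strip_trailing_slash() {
    local path="$1"
    echo "${path%/}"   # parameter expansion drops one trailing slash
}

dir=$(strip_trailing_slash "/home/user/docs/")
echo "$dir"

# Loop over an array.
files=("a.txt" "b.txt" "c.txt")
for f in "${files[@]}"; do
    echo "$f"
done

# Catch an error from the previous command explicitly.
if ! grep -q "needle" /dev/null; then
    echo "grep found nothing" >&2
fi

# Separate the letters and the digits in a string like "abc123".
s="abc123"
letters="${s//[0-9]/}"   # delete digits, keeping "abc"
digits="${s//[^0-9]/}"   # delete non-digits, keeping "123"
```

All of these rely only on parameter expansion and exit statuses, which are standard Bash features.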

That is where ChatGPT helped greatly. I would ask it to write these pieces of code whenever I needed them, then test the code with various inputs to see whether it worked as expected. If not, I would tell it which case failed, and it would revise the code before I put it in my scripts.

Thanks to ChatGPT, someone with zero knowledge of Bash can quickly write fairly advanced Bash. I don't think I could have written what I wrote this quickly the old-fashioned way; I would have gotten there eventually, but it would have taken far too long. Instead I could just write it all quickly and move on. If I ever want to learn Bash and am motivated, I will certainly take the time to learn it properly.

What do you think? What negative experiences have you had with AI chatbots that made you hate them?

[–] obbeel@lemmy.eco.br 0 points 2 months ago* (last edited 2 months ago)

I have worked with somewhat large codebases before using LLMs. You can ask the LLM to point out a specific problem if you give it the context. I honestly don't see myself as capable without an LLM. And it is a good teacher; I learn a lot from using LLMs. No free advertisement for any of the suppliers here, but they are just useful.

You get access to information you can't find anywhere else on the web. There is a large structural backlash against it, but it is useful.

(Edit) Also, I would like to add that the people who said questions won't be asked anymore have seemingly never tried getting answers in an online discussion forum: people are viciously ill-tempered when answering.

With an LLM, you can just bother it endlessly and learn more about the world while you do it.

[–] Numuruzero@lemmy.dbzer0.com 0 points 2 months ago (1 children)

I have a coworker who is essentially building a custom program in Sheets using Apps Script, and has been using ChatGPT/Gemini the whole way.

While this person has a basic grasp of the fundamentals, there's a lot of missing information that gets filled in by the bots. Ultimately, after enough fiddling, it will spit out usable code that works the way it's supposed to, but honestly it ends up taking significantly longer to guide the bot to just the right solution for a given problem. Not to mention the code is a mess: even though it works, there's no real consistency, since it's built across prompts.

I'm confident that in this case, and likely in plenty of cases like it, the total time it takes to learn how to ask the bot the right questions would be better spent just reading the documentation for whatever language is being used. At that point, it might still be worth using the bot to spit out simple code that can be easily debugged.

Ultimately, it just feels like you're offloading complexity from one layer to the next, and in so doing quickly acquiring tech debt.

Exactly my experience as well. Using AI takes about the same amount of time as just doing it myself, but at least I'll understand the code at the end if I write it myself. Even if AI were a little faster at producing working code, writing it yourself pays off in debugging later.

And honestly, I enjoy writing code more than chatting with a bot. So if the time spent is going to be similar, I'm going to lean toward DIY every time.

[–] Soup@lemmy.cafe 0 points 2 months ago

Because despite how easy it is to dupe people into thinking your methods are altruistic, AI exists to save money by eradicating jobs.

AI is the enemy. No matter how you frame it.

[–] sugar_in_your_tea@sh.itjust.works 0 points 2 months ago* (last edited 2 months ago) (1 children)

Two reasons:

  1. my company doesn't allow it - my boss is worried about our IP getting leaked
  2. I find them more work than they're worth - I'm a senior dev, and it would take longer for me to write the prompt than just write the code

I just don't know anything about Bash's syntax

That probably won't be the last time you write Bash, so do you really want to go through AI every time you need a Bash script? Bash syntax is pretty simple, especially once you understand the basic concept that everything is a command (i.e. the syntax is <command> [arguments...]; so in "if <condition>", the <condition> is itself a command, which can be [ <test expression> ] or the Bash-specific [[ <extended test> ]]), which explains some of the weird corners of the syntax.
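To sketch that "everything is a command" point: all three of the conditions below are just commands, and "if" branches purely on their exit status (the variable names and values are illustrative):

```shell
#!/usr/bin/env bash
# "[" is itself a command (an alias for "test"), which is why the
# spaces around it are mandatory.
x=5
if [ "$x" -gt 3 ]; then
    echo "old-style test passes"
fi

# "[[ ... ]]" is Bash syntax rather than a command: it supports glob
# patterns and doesn't word-split unquoted variables.
name="hello world"
if [[ $name == hello* ]]; then
    echo "extended test passes"
fi

# Any command at all works as a condition, since "if" only looks at
# the exit status.
if command -v ls >/dev/null 2>&1; then
    echo "ls is on PATH"
fi
```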

AI sucks for anything that needs to be maintained. If it's a one-off, sure, use AI. But if you're writing a script others on your team will use, it's worth taking the time to actually understand what it's doing (instead of just skimming the output). You never know whether it'll fail on another machine with a different set of dependencies.
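One cheap way to make that kind of failure explicit is to check for required tools up front instead of dying mid-run on a coworker's machine. A sketch (the "require" helper and the tool list are hypothetical, not from any script discussed here):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Fail fast with a clear message if a required external tool is missing,
# rather than failing halfway through on a machine that lacks it.
require() {
    local cmd
    for cmd in "$@"; do
        if ! command -v "$cmd" >/dev/null 2>&1; then
            echo "error: required command '$cmd' not found" >&2
            return 1
        fi
    done
}

require bash sed   # list whatever the script actually depends on
echo "all dependencies present"
```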

What negative experience do you have with AI chatbots that made you hate them?

I just find dealing with them to take more time than just doing the work myself. I've done a lot of Bash in my career (>10 years), so I can generally get 90% of the way there by just brain-dumping what I want to do and maybe looking up 1-2 commands. As such, I think it's worth it for any dev to take the time to learn their tools properly so the next time will be that much faster. If you rely on AI too much, it'll become a crutch and you'll be functionally useless w/o it.

I did an interview with a candidate who asked if they could use AI, and we allowed it. They ended up making (and missing) the same mistake twice in the same interview because they didn't seem to actually understand what the AI output. I've messed around with code chatbots, and my experience is that I generally have to spend quite a bit of time to get what I want, and then I still need to modify and debug it. Why would I do that when I can spend the same amount of time and just write the code myself? I'd understand the code better if I did it myself, which would make debugging way easier.

Anyway, I just don't find it actually helpful. It can feel helpful because it gets you from 0 to a bunch of code really quickly, but that code will probably need quite a bit of modification anyway. I'd rather just DIY and not faff about with AI.

[–] ikidd@lemmy.world 0 points 2 months ago (1 children)

Your boss should be more worried about license poisoning when you incorporate code that's been copied from copyleft projects and presented as "generated".

Perhaps, but our userbase is so small that it's very unlikely anyone would notice. We are essentially B2B with a few hundred active users. We do vet our dependencies religiously, but in all actuality, we could probably get away with pulling in some copyleft code.

[–] bruhduh@lemmy.world 0 points 2 months ago* (last edited 2 months ago) (3 children)

That is the general reason. I use LLMs to help myself with everything, including coding, even though I know why it's bad.

[–] xavier666@lemm.ee 0 points 2 months ago

Based Linus strikes again

[–] ikidd@lemmy.world 0 points 2 months ago

I'm fairly sure Linus would disapprove of my "rip everything off of Stack Overflow and ship it" programming style.

[–] dezmd@lemmy.world 0 points 2 months ago

This is a good quote, but it lives within a context of professional code development.

Everyone in the modern era starts coding by copying functions without understanding what they do, and people go entire careers in all sorts of jobs and industries copying what came before because it "worked", without really understanding the underlying mechanisms.

What's important is having a willingness to learn and putting in the effort. AI code snippets are super useful for learning, even when the model hallucinates, if you test the code and make backups first. All of this requires responsible IT practices to do safely in a production environment, and that's where corporate management eyeing labor-cost reduction loses the plot, thinking AI as it currently stands is a wholesale replacement for a competent human.

[–] Banked3-Visa9@fedia.io 0 points 2 months ago

People are not "mad" at or "hateful" of AI; it's more like "concerned."

[–] SergeantSushi@lemmy.world 0 points 2 months ago (4 children)

I agree AI is a godsend for non coders and amateur programmers who need a quick and dirty script. As a professional, though, the quality of the code is oftentimes 💩, and I can write it myself in less time than it takes to describe the problem to an AI.

[–] NeoNachtwaechter@lemmy.world 0 points 2 months ago

AI is a godsend for non coders and amateur programmers who need a quick and dirty script.

Why?

I mean, it is such a cruel thing to say.

50% of these poor non coders and amateur programmers would end up with a non-functioning script. I find it so unfair!

You have not even tried to decide who deserves and gets the working solution and who gets the garbage script. You are soo evil...

[–] small44@lemmy.world 0 points 2 months ago (3 children)

Many lazy programmers may just copy-paste without thinking much about the quality of the generated code. The other group that opposes it is those who think it will kill programming jobs.
