this post was submitted on 03 Oct 2024
150 points (92.1% liked)

Technology

top 40 comments
[–] Landless2029@lemmy.world 4 points 1 hour ago

Everyone keeps talking about autocomplete, but I've used it successfully for comments and documentation.

You can use VS Code extensions to generate and update README and changelog files.

Then, if you follow documentation-as-code, you can update your Confluence/whatever by copy-pasting.

[–] TheEighthDoctor@lemmy.world 2 points 1 hour ago (3 children)

I'm a penetration tester and it increases my productivity a lot

[–] Gonzako@lemmy.world 2 points 7 minutes ago

So it's an attack vector?

[–] yikerman@lemmy.world 1 points 21 minutes ago

I mainly use AI for learning new things. It’s amazing at trivial tasks.

[–] GreenKnight23@lemmy.world 1 points 1 hour ago

As a dental assistant, I can also confirm that AI has increased my productivity, *checks notes*, by a lot.

[–] VonReposti@feddit.dk 14 points 4 hours ago

While I am not fond of AI, we do have access to it at work and I must admit that it saves some time in some cases. I'm not a developer with decades of experience in a single language, so one thing I use AI for is asking, "Is it possible to do a one-liner in language X that does Y?" It works very well and the code is rarely unusable, but it is still up to my judgement whether the AI came up with a clever use of functions I didn't know about or whether it just crammed stuff into a single unreadable line.
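
To illustrate (a made-up example, not anything the AI actually produced for me), the difference between a clever one-liner and a crammed one might look like this in Python:

```python
from itertools import chain

nested = [[1, 2], [3], [4, 5]]

# A clever use of a stdlib function I might not have known about:
flat = list(chain.from_iterable(nested))  # [1, 2, 3, 4, 5]

# A "one-liner" that technically works but just crams logic into one unreadable line:
flat_too = (lambda ls: [x for xs in ls for x in xs])(nested)
```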

[–] Greg@lemmy.ca 27 points 7 hours ago

Generative AI is great for loads of programming tasks, like helping create regular expressions or syntax conversions between languages. The main issue I've seen in codebases that rely heavily on generative AI is that the "solutions" often fix today's bug while making future debugging more difficult. Generative AI makes it easy to go fast in the wrong direction. Used right, it's a useful tool.
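
As a hypothetical illustration of the regex use case (the pattern and test strings are made up, not from a real codebase):

```python
import re

# The kind of pattern an assistant is handy for drafting: ISO 8601 dates (YYYY-MM-DD)
iso_date = re.compile(r"^\d{4}-(0[1-9]|1[0-2])-(0[1-9]|[12]\d|3[01])$")

print(bool(iso_date.match("2024-10-03")))  # True
print(bool(iso_date.match("2024-13-03")))  # False, month out of range
```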

[–] eager_eagle@lemmy.world 10 points 7 hours ago* (last edited 7 hours ago)

lol, Uplevel's """full report""" saying devs using Copilot create 41% more bugs is 2 pages long and reads like promotional material.

you can download it with a 10-minute email if you really want to see for yourself.

just some meaningless numbers.

[–] Grandwolf319@sh.itjust.works 8 points 7 hours ago (1 children)

Yep, by definition generative AI gets worse the more specific you get. If you need common templates though, it’s almost as good as today’s google.

[–] mint_tamas@lemmy.world 3 points 3 hours ago

… which is not a high bar.

[–] Blue_Morpho@lemmy.world 46 points 11 hours ago (2 children)

Good devs gain little.

I gain a lot.

[–] Warl0k3@lemmy.world 20 points 10 hours ago

It's basically a template generator, which is really helpful when you're generating boilerplate. It doesn't save me much, if any, time to refactor/fill in that template, but it does save some mental fatigue that I can then spend on much more interesting problems.

It's a niche tool, but occasionally quite handy. Without major technical leaps forward, though, it's never going to become more than that.
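
A minimal sketch of the boilerplate I mean (a hypothetical example, in Python):

```python
# Typical CLI boilerplate a completion tool can fill in almost instantly;
# the interesting logic still has to be written and reviewed by hand.
import argparse


def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(description="Example CLI skeleton")
    parser.add_argument("input", help="path to the input file")
    parser.add_argument("-o", "--output", default="out.txt", help="where to write results")
    parser.add_argument("-v", "--verbose", action="store_true", help="enable verbose logging")
    return parser


if __name__ == "__main__":
    args = build_parser().parse_args()
    if args.verbose:
        print(f"reading {args.input}, writing to {args.output}")
    # ... the actual work goes here ...
```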

[–] rocci@lemmy.ml 3 points 10 hours ago

Feel the same way!

[–] einkorn@feddit.org 59 points 12 hours ago (1 children)

For me, it is a glorified auto-complete function. Could definitely live without it.

[–] CatsGoMOW@lemmy.world 35 points 11 hours ago (1 children)

Same for me, but that glorified auto complete helps a lot.

[–] MeatsOfRage@lemmy.world 12 points 7 hours ago

Hell yea. Our unit test coverage went way up because you can blow through test creation in seconds. I had a large, complicated migration from one data set to another, with specific mutations based on weird rules, and GPT got me 80% of the way there; with a little nudging it basically got it perfect. Code that would've taken a few hours took about 6 prompts. If I'm curious about a new library, I can get a working example right away to see how everything fits together. When these articles say there's no benefit, I feel like people aren't using these tools or don't know how to use them effectively.
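
As a rough, made-up example of the kind of repetitive test an assistant can draft in seconds (the function and cases are invented purely for illustration):

```python
import unittest


def normalize_email(value: str) -> str:
    """Trim whitespace and lowercase an email address."""
    return value.strip().lower()


class NormalizeEmailTests(unittest.TestCase):
    def test_strips_surrounding_whitespace(self):
        self.assertEqual(normalize_email("  user@example.com "), "user@example.com")

    def test_lowercases_mixed_case(self):
        self.assertEqual(normalize_email("User@Example.COM"), "user@example.com")

    def test_already_normalized_is_unchanged(self):
        self.assertEqual(normalize_email("user@example.com"), "user@example.com")


if __name__ == "__main__":
    unittest.main()
```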

[–] 9point6@lemmy.world 18 points 11 hours ago

My main use is skipping the blank-page problem when writing a new suite of tests, which after about 10 minutes of refactoring is often a good starting point.

[–] TrickDacy@lemmy.world 3 points 7 hours ago (4 children)

I truly don't understand the tendency of people to hate these kinds of tools. Honestly seems like an ego thing to me.

[–] GreenKnight23@lemmy.world 1 points 1 hour ago

I sent a PR back to a dev five times before I gave the work to someone else.

They used AI to generate everything.

Surprise: there were so many problems it broke the whole stack.

This is a routine thing with this one dev, too. Every PR has to be tossed back at least once. I'm not expecting perfection, but I do expect it not to break the whole app.

[–] leftzero@lemmynsfw.com 2 points 5 hours ago (1 children)

Having to deal with pull requests defecated by "developers" who blindly copy code from ChatGPT is a particularly annoying and depressing waste of time.

At least back when they blindly copied code from Stack Overflow they had to read through the answers and comments, try to figure out which one fit their use case better and why, and maybe learn something... now they just assume the LLM is right (despite having asked the wrong question, and even if they had asked the right one it would have given the wrong answer) and call it a day; no brain activity or learning whatsoever.

[–] TrickDacy@lemmy.world 6 points 4 hours ago (1 children)

That's not a problem with the AI software; that's a problem with hiring morons who have zero experience.

[–] leftzero@lemmynsfw.com 3 points 4 hours ago (1 children)

No. LLMs are very good at scamming people into believing they're giving correct answers. It's practically the only thing they're any good at.

Don't blame the victims, blame the scammers selling LLMs as anything other than fancy but useless toys.

[–] jungle@lemmy.world 1 points 18 minutes ago

Did you get scammed by the LLM? If not, what's the difference between you and the dev you mentioned?

[–] FlorianSimon@sh.itjust.works 7 points 7 hours ago (1 children)

Carbon footprint. Techbro arrogance. Not sure what's hard to understand about it.

[–] TrickDacy@lemmy.world -1 points 7 hours ago (1 children)

Yeah, I'm sure you're concerned about its carbon footprint and that some dude you talked to once was arrogant about this technology.

[–] FlorianSimon@sh.itjust.works 7 points 5 hours ago (1 children)

Of course you know me better than myself.

I guess you wanted an answer but decided upfront you weren't gonna like it no matter what? Not much I can do about that.

[–] TrickDacy@lemmy.world 3 points 4 hours ago

You probably don't remember previously admitting to me that you had never used Copilot but talked shit about it anyway. So it's funny that I clocked you perfectly as an anti-LLM zealot, with you being one of the few people to respond here hatefully once again.

[–] gaael@lemmy.world -1 points 6 hours ago (2 children)

Also, when a tool increases your productivity but your salary and paid time off don't increase, it's a tool that only benefits the overlords and as such deserves to be hated.

[–] stephen01king@lemmy.zip 2 points 4 hours ago

Oh, so do you use a 13-year-old PC because a newer one would increase your productivity without increasing your salary and paid time off?

[–] TrickDacy@lemmy.world 0 points 5 hours ago (1 children)

Some people feel proud that their work got done quicker, and some also aren't micromanaged, so if they choose, yes, they actually can have more time for their personal lives. Not everyone's job is purely a transaction in which they do the absolute minimum they can without being fired.

I hope you feel better soon, because you're clearly bitter and lashing out at whatever you can lash at.

[–] LunarLoony@lemmy.sdf.org 1 points 3 hours ago

I'm glad you live in this fantasy world where more productivity = more personal time, but it doesn't always work like that, especially in salaried positions. More productivity generally means more responsibility coming your way, which rarely results in an increased salary.

[–] ShunkW@lemmy.world 12 points 11 hours ago

And yet, higher ups continue to lay off more devs because AI "is the future".

[–] tonytins@pawb.social 10 points 12 hours ago (1 children)

*Places GPT-based "AI" next to flying cars*

[–] pennomi@lemmy.world 2 points 11 hours ago (1 children)

Flying cars exist, they're just not cost-effective. AFAICT there's no GPT that is proficient at coding yet.

[–] otp@sh.itjust.works 5 points 10 hours ago (1 children)

It's a lot easier to access ChatGPT than it is to access a flying car

[–] sepi@piefed.social 4 points 8 hours ago* (last edited 8 hours ago)

The more people use ChatGPT to generate low-quality code they don't understand, the more job security and the higher salary I get.

[–] tdawg@lemmy.world 5 points 10 hours ago

I honestly stopped using it after a week

[–] eager_eagle@lemmy.world 3 points 11 hours ago

I like to use suggestions to feel superior when trash talking the generated code

[–] tkw8@lemm.ee 4 points 12 hours ago* (last edited 12 hours ago)

I’m shocked. There must be an error in this analysis. /s

Maybe engage an AI coding assistant to massage the data analysis lol

[–] EleventhHour@lemmy.world 1 points 11 hours ago* (last edited 11 hours ago)

Devs who are punching above their weight, however, probably get great benefit from it. I would think it's also an OK learning tool, except for how inaccurate it can be sometimes.