this post was submitted on 14 Jan 2025
-32 points (25.0% liked)

Programming

[–] Kissaki@programming.dev 20 points 2 weeks ago (1 children)

One table of percent increases and decreases written out as SEVEN wordy paragraphs. That's how you add bloat and lose overview and comparability.

The percentage numbers aren't telling, either. The article never explains the methodology of how interest was measured, which would have added value beyond just writing out the numbers. The huge figures of several hundred percent suggest to me that the numbers are worthless.

The title is bullshit too. They say interest in C and C# was up, contradicting their claim that traditional programming language interest is declining. Clickbait non-content.


The note on Google's CEO claiming 25% of their internal code is now AI-generated was interesting to me. I don't know whether to find it surprising, shocking, or implausible (I suspect the CEO misunderstands or misattributes what is happening; code sourced from AI is not the same as code actually applied).

[–] MagicShel@lemmy.zip 19 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

My guess is 25% of their developers use AI coding assist. Because as a developer who uses AI almost every day, I can promise you only the most pedestrian code can be written by AI. As autocomplete, it saves me some time typing. But actually writing code from scratch, no way.

Yesterday, I asked it to write some particular code for me to do with multi-threading, and it constantly got things wrong, like initializing a shared database client with the user from a single request, which would mean every user gets the first user's access instead of their own.
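A minimal sketch of that bug pattern (hypothetical class and method names, not the actual code under review): a client cached once with the first caller's credentials and then handed to every subsequent request.

```java
// Hypothetical sketch of the shared-client bug: the first request's
// user is baked into a cached client that everyone then reuses.
class DbClient {
    final String user;
    DbClient(String user) { this.user = user; }
    String query() { return "results visible to " + user; }
}

class BuggyClientProvider {
    private static DbClient shared; // cached across requests

    // BUG: only the first caller's user is ever used to build the client
    static DbClient forUser(String requestUser) {
        if (shared == null) {
            shared = new DbClient(requestUser);
        }
        return shared; // later users inherit the first user's access
    }
}

public class Main {
    public static void main(String[] args) {
        String alice = BuggyClientProvider.forUser("alice").query();
        String bob = BuggyClientProvider.forUser("bob").query();
        System.out.println(alice); // results visible to alice
        System.out.println(bob);   // results visible to alice -- bob sees alice's data
    }
}
```

The fix is to build (or scope) the client per request rather than caching it in a static field, but the buggy version compiles, runs, and looks perfectly reasonable at a glance.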

I reviewed some code earlier this week that did the same thing with the GlobalExceptionHandler that I suspect was also written by AI. These are sort of insidious in that when you write tests to make sure the code works, the tests will pass just fine.
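The GlobalExceptionHandler variant of that bug can be sketched the same way (hypothetical, heavily simplified): a singleton handler that stashes per-request state in an instance field, so one request's error can be reported against another request's user.

```java
// Hypothetical, simplified sketch: a singleton exception handler that
// keeps per-request state in a field shared by all in-flight requests.
public class GlobalExceptionHandler {
    private String currentUser; // BUG: shared mutable state

    void beginRequest(String user) { this.currentUser = user; }

    String handle(Exception e) {
        // Reports whichever user touched the field last, not the
        // user whose request actually failed.
        return "error for " + currentUser + ": " + e.getMessage();
    }

    public static void main(String[] args) {
        GlobalExceptionHandler handler = new GlobalExceptionHandler();
        handler.beginRequest("alice");
        handler.beginRequest("bob"); // bob's request interleaves before alice's error
        String msg = handler.handle(new RuntimeException("timeout"));
        System.out.println(msg); // error for bob: timeout -- alice's failure, bob's name
    }
}
```

A single-user test never interleaves requests, so it passes cleanly — which is exactly why this kind of bug survives unless someone reads the code closely.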

It takes a skilled developer to identify those issues, because the code looks good and just about any test an individual developer will throw at it will pass. That bug would have gone to production if I hadn't caught it. And that's on top of code that simply uses the wrong class or confuses common methods from two completely different classes.

And I couldn't even get a job at Google when they interviewed me, twice. So you can't tell me 25% of their code is AI-generated. It's useful and a time saver, but it's not capable of generating reliable code on its own. Certainly not yet, and I believe not ever in this form of AI (maybe with AGI, if it ever comes about).

[–] Kissaki@programming.dev 4 points 2 weeks ago

That's the kind of thing I suspect as well. Thank you for sharing your insight/experience. It's always interesting and valuable to hear others' experiences.