[–] Voyajer@lemmy.world 19 points 10 hours ago* (last edited 10 hours ago) (1 children)

This, but actually. Don't use an LLM for things LLMs are known to be bad at. Companies would do well to list out specifically what their tools are bad at, so you don't need background knowledge before even using them. Not unlike needing to somehow know that one corner of those old iPhones was an antenna, and that you shouldn't bridge it.

[–] sugar_in_your_tea@sh.itjust.works 3 points 10 hours ago (1 children)

Yup, the problem with that iPhone (4?) wasn't that it sucked, but that it had limitations. You could just put a case on it and the problem went away.

LLMs are pretty good at a number of tasks, and pretty bad at a number of others. They're pretty good at summarizing, but don't trust the summary to be accurate; treat it as a decent idea of what something is about. They're pretty good at generating code, just don't trust the code to be perfect.

You wouldn't use a chainsaw to build a table, but it's great at making big things into small things; do the rough work with it and clean up the details with a more refined tool afterward.

[–] spankmonkey@lemmy.world 0 points 9 hours ago (1 children)

> They're pretty good at summarizing, but don't trust the summary to be accurate; treat it as a decent idea of what something is about.

That is called being terrible at summarizing.

[–] sugar_in_your_tea@sh.itjust.works

That depends on how you use it. If you need the information from an article but don't want to read it, I agree, an LLM is probably the wrong tool. If you have several articles and want to decide which one has the information you need, an LLM is a pretty good option.
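
For that last use case, here's a rough sketch of what I mean using the OpenAI Python client (the model name, prompt wording, and the `pick_article` helper are all just placeholders; adapt to whatever stack you actually use):

```python
# Rough sketch: ask an LLM which of several articles is most likely to
# contain the information you're after, then go read that one yourself.
# Assumes the `openai` package and an OPENAI_API_KEY in the environment;
# the model name and prompt wording are placeholders.
from openai import OpenAI

client = OpenAI()

def pick_article(question: str, articles: dict[str, str]) -> str:
    """Return the title the model thinks best answers the question."""
    listing = "\n\n".join(
        f"TITLE: {title}\n{text[:2000]}"  # truncate so the prompt stays small
        for title, text in articles.items()
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; use whatever model you have
        messages=[
            {
                "role": "system",
                "content": (
                    "You will be given a question and several articles. "
                    "Reply with only the TITLE of the article most likely "
                    "to contain the answer."
                ),
            },
            {"role": "user", "content": f"Question: {question}\n\n{listing}"},
        ],
    )
    return response.choices[0].message.content.strip()
```

The point is that you still read the one article it picks, so a hallucinated summary can't burn you; the worst case is it points you at the wrong article and you pick again.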