this post was submitted on 23 Nov 2024
202 points (97.2% liked)


Writing a 100-word email using ChatGPT (GPT-4, latest model) consumes 1 × 500 ml bottle of water. It uses 140 Wh of energy, enough for 7 full charges of an iPhone Pro Max.
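As a quick sanity check of the headline's arithmetic: the claimed 140 Wh works out to 7 full charges only if an iPhone Pro Max battery holds about 20 Wh, which is a plausible but assumed figure (it is not stated in the article).

```python
# Back-of-envelope check of the headline figures.
# IPHONE_BATTERY_WH is an assumption (~20 Wh), not from the article.
EMAIL_ENERGY_WH = 140.0    # claimed energy for one 100-word email
IPHONE_BATTERY_WH = 20.0   # approximate iPhone Pro Max battery capacity

full_charges = EMAIL_ENERGY_WH / IPHONE_BATTERY_WH
print(f"{full_charges:.0f} full charges")  # 140 Wh / 20 Wh = 7 charges
```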

[–] teh7077@lemmy.today 9 points 3 hours ago (1 children)

That's what I always thought when reading this and other articles about the estimated power consumption of GPT-4. Run a decent 7B LLM on consumer hardware like the Steam Deck and you get your e-mail in a minute with the fans barely spinning up.

Then I read that GPT-4 is supposedly a 1760B-parameter model (https://en.m.wikipedia.org/wiki/GPT-4#Background). I don't know exactly how energy usage scales with model size, but I'd consider it plausible that we are talking orders of magnitude above a typical local LLM.

Considering that the email from the local LLM will be good enough 99% of the time, GPT-4 may just be horribly inefficient, trading efficiency for higher scores on synthetic benchmarks.
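The "orders of magnitude" guess above can be made concrete with a naive sketch: if inference energy scaled linearly with parameter count (a big simplification that ignores mixture-of-experts routing, batching, and hardware differences), the rumored 1760B figure versus a 7B local model gives:

```python
# Rough sketch, assuming inference energy scales linearly with
# parameter count. This is a simplification, not a measurement.
LOCAL_PARAMS_B = 7       # typical local model size (7B)
GPT4_PARAMS_B = 1760     # rumored GPT-4 size per the Wikipedia article

scale = GPT4_PARAMS_B / LOCAL_PARAMS_B
print(f"~{scale:.0f}x the compute per token")  # ~251x
```

That is two to three orders of magnitude, consistent with the hunch, though real deployments batch many requests per GPU, which changes the per-email cost.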

[–] douglasg14b@lemmy.world 4 points 40 minutes ago

Computational demands scale aggressively with model size.

And if you want a response back in a reasonable amount of time, you're burning a ton of power to do so. These models are not fast at all.
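To sketch why size drives both latency and power: a dense transformer needs roughly 2 × N FLOPs per generated token (N = parameter count), so generation time and energy on a fixed accelerator grow with N. All hardware numbers below are illustrative assumptions, not measurements of any real deployment.

```python
# Minimal sketch of compute scaling for autoregressive generation.
# Rule of thumb: ~2 * N FLOPs per generated token for a dense model.
PARAMS = 1.76e12     # rumored GPT-4 parameter count (assumption)
TOKENS = 150         # roughly a 100-word email
GPU_FLOPS = 3e14     # ~300 TFLOP/s effective throughput (assumed)
GPU_POWER_W = 700    # accelerator board power (assumed)

flops_total = 2 * PARAMS * TOKENS
seconds = flops_total / GPU_FLOPS
energy_wh = GPU_POWER_W * seconds / 3600
print(f"{seconds:.2f} s, {energy_wh:.3f} Wh on one accelerator")
```

Under these assumptions the raw compute cost per email is small; serving overhead, redundancy, and cooling (where the water figure comes from) are what inflate real-world estimates, so the per-query numbers reported in articles vary widely.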