
Modern AI data centers consume enormous amounts of power, and it looks like they will get even more power-hungry in the coming years as companies like Google, Microsoft, Meta, and OpenAI strive toward artificial general intelligence (AGI). Oracle has already outlined plans to use nuclear power plants for its 1-gigawatt data centers. Microsoft appears set to do the same, as it just inked a deal to restart a nuclear power plant to feed its data centers, reports Bloomberg.

[–] finitebanjo@lemmy.world 0 points 2 months ago (1 children)

Earlier this year, the International Energy Agency released its energy usage report and forecast, predicting that the total global electricity consumption of data centers is set to top 1 PWh (petawatt-hour) in 2026. That more than doubles its 2022 value and, as the report states, “is equivalent to the electricity consumption of Japan.” SOURCE
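
For scale, here is a quick back-of-envelope check of those figures (a sketch assuming the IEA's published 2022 baseline of roughly 460 TWh; exact numbers vary between report editions):

```python
# Growth implied by the IEA figures quoted above.
# Assumptions: ~460 TWh consumed by data centers in 2022 (IEA estimate),
# ~1,000 TWh (1 PWh) projected for 2026.
twh_2022 = 460
twh_2026 = 1000

growth = twh_2026 / twh_2022                 # ~2.2x over four years
cagr = (twh_2026 / twh_2022) ** (1 / 4) - 1  # implied annual growth rate

print(f"total growth 2022->2026: {growth:.2f}x")
print(f"implied annual growth:   {cagr:.1%}")  # ~21% per year
```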

It does fuck all for me except make art and customer service worse on average, but yes, it will certainly result in countless avoidable deaths if we don't heavily curb its usage soon, as it is projected to quintuple its power draw by 2029.

[–] areyouevenreal@lemm.ee 0 points 2 months ago (1 children)

I am not talking about things like ChatGPT, which rely more on raw compute and scaling than other approaches and are hosted in massive data centers; I actually find that approach wasteful as well. I am talking about open-weight models that use a fraction of the resources for similar output quality. According to some industry experts, that will be the way forward anyway, as purely making models bigger has limits and is hella expensive.
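
As a sketch of what I mean, here is roughly what running a small quantized open-weight model locally looks like with the llama-cpp-python bindings (the model file and settings below are illustrative assumptions, not a recommendation of any specific model):

```python
# A minimal local-inference sketch using llama-cpp-python.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/example-8b-instruct.Q4_K_M.gguf",  # hypothetical 4-bit quantized weights
    n_ctx=2048,    # modest context window keeps memory use low
    n_threads=4,   # runs on an ordinary laptop CPU, no data-center GPU
)

out = llm("In one sentence, why does quantized local inference draw less power?",
          max_tokens=64)
print(out["choices"][0]["text"])
```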

Another thing to bear in mind is that training a model is far more resource-intensive than using it, though that is also being worked on.
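
A common rule of thumb from the scaling-law literature illustrates the gap (the parameter and token counts below are illustrative assumptions, not figures for any specific model):

```python
# Back-of-envelope comparison using the standard scaling-law heuristics:
# training costs ~6*N*D FLOPs, generating one token costs ~2*N FLOPs.
N = 8e9   # an 8-billion-parameter model (assumption)
D = 2e12  # 2 trillion training tokens (assumption)

train_flops = 6 * N * D  # ~9.6e22 FLOPs for the full training run
flops_per_token = 2 * N  # ~1.6e10 FLOPs per generated token

print(f"training run:        {train_flops:.1e} FLOPs")
print(f"one generated token: {flops_per_token:.1e} FLOPs")
print(f"tokens you could generate for one training-run budget: "
      f"{train_flops / flops_per_token:.1e}")
```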

[–] finitebanjo@lemmy.world 0 points 2 months ago* (last edited 2 months ago) (1 children)

You put power in and you get worthless garbage out. Do the world a favor and just mine crypto instead; try FoldingCoin out.

[–] areyouevenreal@lemm.ee 0 points 2 months ago (1 children)

I've seen teachers use this stuff and get actually decent results. I've also seen papers where people use LLMs to hack into a computer, which is a damn sophisticated task. So you are either badly informed or just lying. While LLMs aren't perfect and aren't a replacement for humans, they are still very much useful. To believe otherwise is folly and shows your personal bias.

[–] finitebanjo@lemmy.world 0 points 2 months ago

Anybody who uses a bullshit generator in any step of the education process is unqualified.