this post was submitted on 13 Sep 2023
29 points (100.0% liked)

Technology


cross-posted from !microsoft@lemdro.id

top 4 comments
[–] greyscale@lemmy.sdf.org 9 points 1 year ago (1 children)

Are they using total loss water cooling or something? What in the actual fuck are they doing.

[–] petey@aussie.zone 10 points 1 year ago (1 children)

I’m thinking evaporative cooling (paired with refrigerative cooling)
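Evaporative cooling consumes water because the heat is rejected by evaporating it rather than just circulating it. A back-of-envelope sketch of why the volumes get large, using an assumed 10 MW heat load (the load and the all-evaporative assumption are illustrative, not figures from the article):

```python
# Rough water consumption of purely evaporative cooling.
# Assumptions (illustrative): 10 MW of IT heat rejected entirely by
# evaporation; latent heat of vaporization ~2.26 MJ/kg (value at
# boiling; somewhat higher at cooling-tower temperatures, but fine
# for an order-of-magnitude sketch).

HEAT_LOAD_W = 10e6             # assumed 10 MW heat load
LATENT_HEAT_J_PER_KG = 2.26e6  # J to evaporate 1 kg of water

kg_per_second = HEAT_LOAD_W / LATENT_HEAT_J_PER_KG
liters_per_day = kg_per_second * 86_400  # 1 kg of water ≈ 1 L

print(f"{kg_per_second:.2f} kg/s  ->  {liters_per_day:,.0f} L/day")
```

Even under these rough numbers the result is on the order of a few hundred thousand liters per day, which is why pairing evaporative towers with refrigerative (closed-loop) cooling trades water for electricity.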

[–] greyscale@lemmy.sdf.org 7 points 1 year ago

Well ain't that the dumbest shit ever. Some of the world's cleanest water being fed to a machine to hallucinate code that doesn't work, while water scarcity starts to come home to roost.

[–] autotldr@lemmings.world 3 points 1 year ago

🤖 I'm a bot that provides automatic summaries for articles:

But one thing Microsoft-backed OpenAI needed for its technology was plenty of water, pulled from the watershed of the Raccoon and Des Moines rivers in central Iowa to cool a powerful supercomputer as it helped teach its AI systems how to mimic human writing.

Few people in Iowa knew about its status as a birthplace of OpenAI’s most advanced large language model, GPT-4, before a top Microsoft executive said in a speech it “was literally made next to cornfields west of Des Moines.”

In response to questions from The Associated Press, Microsoft said in a statement this week that it is investing in research to measure AI’s energy and carbon footprint “while working on ways to make large systems more efficient, in both training and application.”

Microsoft first said it was developing one of the world’s most powerful supercomputers for OpenAI in 2020, declining to reveal its location to AP at the time but describing it as a “single system” with more than 285,000 cores of conventional semiconductors, and 10,000 graphics processors — a kind of chip that’s become crucial to AI workloads.

It wasn’t until late May that Microsoft’s president, Brad Smith, disclosed that it had built its “advanced AI supercomputing data center” in Iowa, exclusively to enable OpenAI to train what has become its fourth-generation model, GPT-4.

In some ways, West Des Moines is a relatively efficient place to train a powerful AI system, especially compared to Microsoft’s data centers in Arizona that consume far more water for the same computing demand.


Saved 79% of original text.