this post was submitted on 02 Feb 2025
24 points (76.1% liked)

Asklemmy


A loosely moderated place to ask open-ended questions

top 9 comments
[โ€“] TherapyGary@lemmy.blahaj.zone 17 points 3 days ago (1 children)
[โ€“] chirospasm@lemmy.ml 7 points 3 days ago

Ooh, neat! This feels like Folding@Home for AI tasks.

[โ€“] turbowafflz@lemmy.world 10 points 3 days ago (2 children)

Probably, but why? Who is going to spend money running a server just to provide compute to random strangers so they can generate Garfield fanfics?

[โ€“] aoidenpa@lemmy.world 2 points 3 days ago

Around 2018, people collaboratively trained the Go-playing AI LeelaZero in a distributed way, since Google's AlphaGo was not publicly available.

[โ€“] sunzu2@thebrainbin.org 1 points 3 days ago

Same reason people mined crypto... you create economic incentives around renting out the compute.

[โ€“] Tabitha@hexbear.net 2 points 2 days ago
[โ€“] jeena@piefed.jeena.net 4 points 3 days ago

That is a good question, actually. I was looking into buying a graphics card to run some LLMs, but it seems you need a huge amount of VRAM, which makes the cards extremely expensive.

I read that you can buy several cheaper cards and connect them so their combined VRAM is enough, but that was all within one computer.

It would be cool if several people could pool their GPUs, similar to SETI@home. Even better if it were done in a federated way.
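To make the "pool several GPUs so the combined VRAM is enough" idea concrete, here is a toy sketch (my own illustration, not any existing project) of one common approach, tensor parallelism: the layer's weight matrix is split column-wise so each worker only holds, and multiplies against, its own shard.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "model": a single linear layer, y = x @ W.
x = rng.standard_normal((1, 8))   # one input activation vector
W = rng.standard_normal((8, 6))   # full weight matrix

# Split W column-wise across two workers, as if each shard
# lived in a different machine's VRAM.
W_a, W_b = np.hsplit(W, 2)

def worker(x, W_shard):
    # Each worker only ever touches its own shard of the weights.
    return x @ W_shard

# Each worker computes a partial output; concatenating the partials
# reproduces the full layer's output exactly.
y = np.concatenate([worker(x, W_a), worker(x, W_b)], axis=1)

assert np.allclose(y, x @ W)
```

In a real multi-machine setup, the two `worker` calls would run on different hosts and the concatenation would be a network gather after every layer, which is why inter-node bandwidth and latency dominate the cost of this kind of sharing.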

[โ€“] A_Very_Big_Fan@lemmy.world 2 points 3 days ago (1 children)

I'm not even sure what you're asking, let alone if it's possible.

I mean, you could probably divide its tasks/processes between different servers or computers, but I don't see the point of that: it would just make things significantly slower and more costly.

I downvoted OP as if this is StackOverflow.

What is the use case for OP for using AI?

What does OP not like about the current setup?

What goal is OP trying to achieve?

What drawback and extra cost is OP willing to take?

Without providing any of that information, asking such a question and slapping buzzwords like "federated" into the title just wastes everyone's time. A few days ago there was even a question asking for a "federated browser".

Despite all this, some pals still provided insightful answers. You guys are amazing.