this post was submitted on 18 Oct 2024
1093 points (97.8% liked)
Technology
59566 readers
4890 users here now
The Grok models are a laughing stock in the LLM space. They aren't good over APIs, and they're even less useful by the time Elon "open sources" them, long after release. Qwen 72B, and heck, even Qwen 32B, is already better than Grok 2, which is probably hundreds of billions of parameters. Qwen is runnable locally right now, Apache 2.0, and was released day one. Grok 1 is... well, I dunno, no one has even bothered to host it for anything.
I dunno what Twitter is doing with all those H100s Elon hoarded, but it seems like a big waste so far. It's certainly doing nothing to help the open source/self-hosting space, or to "decensor" and "democratize" LLMs like Elon fans seem to think.
Duh. Mining crypto.