this post was submitted on 27 Feb 2025
Technology
That's just not true.
As someone who runs local inference all the time, I think centralized online models have no place anywhere near consumers: partly because the features they offer are trivial and offload critical skills, partly because they require insane amounts of energy, and partly because they are privacy nightmares. All of that runs against Moz's stated mission, and yet here we are.
This is the only decent comment I've gotten on this entire thread.
And you seem to be right; I agree with your sentiment. If the features are unneeded, then this is the wrong step to take. The best option would be for Mozilla (who are behind Ollama, I think?) to offer a locally run open model; any potato capable of running a modern browser should be able to handle 1B-7B models.
Moz is behind llamafiles, but Ollama is a separate entity.
Also, chat models are just not that useful. I'm all for their local translation models and the like, but chat is just a toy.
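For anyone curious what "locally run" means in practice here, this is a minimal sketch of querying a local Ollama server from Python. It assumes Ollama is installed and serving on its default port 11434, and uses `llama3.2:1b` purely as an example of a small model in the 1B range; swap in whatever model you have pulled. Nothing leaves your machine.

```python
import json
import urllib.request

# Default address of a locally running Ollama server (an assumption:
# adjust if you changed Ollama's host/port configuration).
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(prompt: str, model: str = "llama3.2:1b") -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint.

    stream=False asks for a single JSON response instead of a
    stream of partial chunks.
    """
    return {"model": model, "prompt": prompt, "stream": False}


def generate(prompt: str, model: str = "llama3.2:1b") -> str:
    """Send the prompt to the local server and return the model's reply."""
    payload = json.dumps(build_request(prompt, model)).encode()
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Example usage (requires a running Ollama server):
# print(generate("Summarize this page in one sentence."))
```

A llamafile works similarly: it bundles the weights and a server into one executable, and by default exposes an OpenAI-compatible HTTP endpoint on localhost, so the same "talk to a local port" pattern applies.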