
Summary

Alibaba has launched Qwen 2.5-Max, an AI model it claims outperforms DeepSeek-V3, OpenAI’s GPT-4o, and Meta’s Llama-3.1-405B.

The release, coinciding with Lunar New Year, reflects mounting competition in China’s AI sector after DeepSeek’s rapid rise.

DeepSeek’s recent advancements have pressured Chinese rivals like ByteDance and Baidu to upgrade their models and cut prices.

DeepSeek’s founder downplays price wars, focusing on artificial general intelligence (AGI). The company’s lean, research-focused structure contrasts with China’s tech giants, which face challenges in AI innovation.

[–] Gsus4@mander.xyz 9 points 6 days ago* (last edited 6 days ago) (1 children)

But I could use it as a starting point for training and build on it with my own data. I could fork it. I couldn't fork Llama; I don't have the weights.
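
Concretely, "forking" here just means loading the released weights and continuing training on your own data. A minimal sketch with the Hugging Face stack; the model ID and data path are placeholders, not anything specific to this release:

```python
# Load an open-weight checkpoint and continue training on your own data.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_id = "deepseek-ai/deepseek-llm-7b-base"  # placeholder: any open-weight checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Your own data: a plain-text file, one training example per line (hypothetical path).
dataset = load_dataset("text", data_files={"train": "my_data.txt"})["train"]
dataset = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True,
    remove_columns=["text"],
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="my-fork", num_train_epochs=1),
    train_dataset=dataset,
    # mlm=False -> causal LM: labels are the input tokens, padding masked out
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()                # continue from the released weights
trainer.save_model("my-fork")  # this checkpoint is your "fork"
```

None of that is possible without the weights, which is the whole point.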

[–] trevor@lemmy.blahaj.zone 10 points 6 days ago

You can also fork proprietary code that is source-available (depending on the terms of the particular license), but that doesn't make it open source.

Fair point about Llama not having open weights, though. So this is not as proprietary as Llama. It still shouldn't be called open source if the training data it needs to function is proprietary.