Just this week they published a serious foundational library for LLMs: https://github.com/deepseek-ai/TileKernels
Others worth mentioning:
https://github.com/deepseek-ai/DeepGEMM a competitive foundational library
https://github.com/deepseek-ai/Engram
https://github.com/deepseek-ai/DeepSeek-V3
https://github.com/deepseek-ai/DeepSeek-R1
https://github.com/deepseek-ai/DeepSeek-OCR-2
They have 33 repos and counting: https://github.com/orgs/deepseek-ai/repositories?type=all
And DeepSeek often introduces very cool new approaches to AI that others then copy, including labs with 10x or 100x the GPU training budget, which is supposedly their moat for staying competitive.
The models from Chinese Big Tech, and some of the smaller labs, are open weights only (and allegedly benchmaxxed; see https://xcancel.com/N8Programs/status/2044408755790508113). Not the same thing.
> Open weight!
They were clearly implying it's not open source.
So you can’t see what facts were pruned out or what biases were applied. Even more importantly, you can’t build a slightly improved version.
This model is about as open source as a Windows XP installation ISO.
Did you even read my comment?