There is an easy fix already in widespread use: "open weights".

It is already a valuable thing in its own right; no need to taint it with a false promise.

Though I disagree that it wouldn't be used if it were indeed open source: I might not run it in my home lab today, but at least Qwen and DeepSeek would use and build on what e.g. Facebook was doing with Llama, and they might be pushing the open-weights model frontier forward faster.

reply
> There is an easy fix already in widespread use: "open weights"

They're both correct given how the terms are actually used. We just have to deduce what's meant from context.

There was a moment, around when Llama was first being released, when the semantics hadn't yet settled. The nutter wing of the FOSS community, to my memory, put forward a hard-line, unworkable definition of open source and seemed to reject open weights, too. So the definition got punted to the closest thing at hand, which was open weights with limited (unfortunately, not zero) use restrictions. At this point, it's a personal preference that's at most polite to respect if you know your audience has one.

reply
The point is that "open source" by now has an established and widespread definition, and the word "source" implies that what is open is the thing the artifact is built from.

Is this really a debate we still need to be having today? It sounds like grumpiness with the Open Source Initiative defining this ~25 years ago, when the term was rarely used in this sense.

If we do not accept a well-defined term and instead treat it as a personal preference, we could say the same about any word in a natural language.

reply
Yeah, "open weights" is really good, especially when the base model weights (not just the instruction-tuned ones) are released, as here.
reply
Nvidia did with Nemo.
reply
It's not a gotcha, just people using words in ways others don't like.
reply
It's not about likes; it's a flat-out lie.
reply