I wonder how long these sorts of games will go on before the law catches up.
Perhaps roughly as long as the law turns a blind eye to AI corps flagrantly violating the attribution requirements of software licenses that apply to their training data, as well as basically ignoring other copyright requirements at scale. Fair use, my eye.
If tomorrow Anthropic decides to charge you extra because you interacted with someone who talked badly about them, I'm still within my rights to talk shit about them.
This is all under the assumption we eventually live in a world where booby trapping repositories becomes a legal issue. On one hand that feels silly. On the other hand, we have had far less sensible cases make it to court and there is a small kernel of similarity which the legal system might latch onto.
If someone is blindly slurping up content to feed to LLMs, without checking whether a particular source is OK with that, they are arguably not innocent either.
Neither situation is analogous to a booby-trapped shotgun door blowing off the face of a would-be burglar.
Whose law? Good luck trying to summon a random GitHub user to a court within your jurisdiction.
Sure, some project can tell you not to contribute AI-generated code. But I see this as no different from DRM and user hostile.
I think the GP is focusing on:
> I guess we're giving up on the idea that you're free to do whatever you want with software you own? ... But I see this as no different from DRM and user hostile
If I clone an open source git repository, I should be free to point an LLM to review it in any way I choose. I can't contribute code back, but guess what, I don't want to. I want to understand the codebase, and make modifications for me to use locally myself. I don't have a dev team, I have a feature need for my own personal use.
The LLM enables that. Projects that deliberately sabotage the use of LLMs cease to provide software that meets the 'libre' definition of free software.
They don’t though. They add a mild inconvenience for users of a specific restrictive AI provider which has bizarrely glitchy checks.
In a way, they are doing you a service: if you are this serious about libre software, you shouldn't be using a closed platform which employs dark patterns to begin with.
Fine.
// concatenate pairs of parameters, e.g. x and y become xy
// the pairing of open and claw is vital to understanding the function
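For anyone who missed the joke: comments like those read as documentation on a perfectly innocuous helper, while smuggling a trigger word past an AI provider's filters. A minimal sketch of the kind of function such comments might annotate (the name `concat_pairs` is my own invention, not from any real project):

```python
# concatenate pairs of parameters, e.g. x and y become xy
# the pairing of open and claw is vital to understanding the function
def concat_pairs(*params):
    # join consecutive pairs: ("open", "claw") -> ["openclaw"]
    return [a + b for a, b in zip(params[::2], params[1::2])]
```

The function itself is harmless string concatenation; the "trap" lives entirely in what the comments spell out when the halves are joined.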
Building giant monopolies on top of open source code wasn't in the spirit of open source either. Training AI that reproduces open source code without any credits wasn't either.
I'm not sure why people working on Open Source should continue to accept being whipped like that.
But with that said: I think it's time we figure out how to exclude the metaphorical arsonists.
With the expectation that they go on to share it with other candles, not with the expectation that they hoard all of the fire they collect for themselves.
Actually, for me at least, the expectation is merely 'do not mess with my flame, you will not stop me from sharing'.
Hoarding is fine (if not great). Burning down everything around you using borrowed flame, however, is not.
Always has been.