upvote
This is satire, but the very notion of open source license obligations is meaningless in context. FLOSS licenses do not require you to publish your purely internal changes to the code; any publication happens by your choice, and given that AI can now supposedly engineer a clean-room reimplementation of any published program whatsoever, publishing your software under a proprietary copyright isn't exactly going to save you either.
reply
No, no, some open source licenses do require you to publish internal changes. E.g. some are explicitly written so that you have to publish even when you 'only' run the changed code on your own servers. (Not having to publish in that case was seen as a loophole for cloud companies to exploit.)
reply
Those clauses exclude those licenses from some very important definitions of free/open-source software. For example they would fail the Desert Island Test for the Debian Free Software Guidelines.
reply
The Debian project guidelines are not the ultimate arbiter of what is and isn't free software, they are just some of many useful guidelines to consider. Another useful guideline is that the user shall have freedom.
reply
You are either talking about a license nobody is using (at least I've never heard of it) or misconstruing what the AGPL obligates you to do.

I am going to assume it's the latter.

If you in your house take an AGPL program, host it for yourself, and use it yourself, nothing in the AGPL obligates you to publish the source changes.

In fact, even if you take AGPL software, modify it, and put it behind a paywall, the only people to whom the license obligates you to provide the source code are the people paying.

The AGPL is basically the GPL with the definition of "user" broadened to include people interacting with the software over the network.

And the GPL, again, only requires you to provide the source code, upon request, to users. If you only distribute GPL software behind a paywall, you personally only need to give the source to people paying.

Although in both these cases, nothing stops the person receiving that source code from publishing it under the license's own terms.

reply
The point he's making is that who is going to actually enforce that? If I take something that has that license and make changes to it, who is going to know? That's the underlying premise here.
reply
The courts?

Google “examples of GPL enforced in court” for a few

Yes, enforcement requires someone finding out, but then, how do you prove a whistleblower broke their NDA?

reply
Your point is circular, so let me bring it all around. Suppose I make a 'clean-room' implementation, using an LLM, of a piece of software that has a GPL license. How does a court establish that my black box didn't use the original software in any way if there's no way to know? Does having that software in its training corpus automatically make all output GPL-enforceable? This is essentially the question some courts are attempting to answer right now.
reply
"given that AI can now supposedly engineer a clean-room reimplementation of any published program whatsoever"

I'm missing something there; that's precisely what I'm arguing against. How can it do a clean-room reimplementation when the open source code is most likely in the training data? That only works if you train on everything BUT the implementation you want. It's certainly feasible, but wouldn't that be prohibitively expensive for most, if not all, projects?

reply
If I hired a human to write a clone of GNU grep to be released under an MIT license, and he wrote one that performed exactly the same as GNU grep, it would be impossible for me to prove that the guy I hired didn't look at the GNU code.

But we'd be able to look at his clone code and see it's different, with different algorithms, etc. We could do a compare and see if there are any parts that were copied. It's certainly possible to clone GNU grep without copying any code and I don't think it would fail any copyright claims just because the GNU grep code is in the wild.

If that were the case, the moment any code was written under the GPL, it could never be reimplemented under a different license.

So instead of a human cloner, I use AI. Sure, the AI has access to the GPL code - every intelligence on the planet does. But does that mean that it's impossible to reimplement an idea? I don't think so.

reply
What you argue is a non sequitur, and regardless of case law it really makes no sense when the spirit of the action is to replicate something. Reasonable people would say that replicating and disseminating code with the express purpose of avoiding copyright is a violation of copyright, and preventing exactly that is why copyright exists in the first place.

Just because something is trivial enough to copy does not mean it was trivial to conceive of and codify. Mens rea really does matter when we are talking about defrauding intellectual property holders and stealing their opportunity.

reply
"Reasonable people would say that replicating and disseminating code with the express purpose of avoiding copyright is a violation of copyright and why it exists in the first place."

But then how can the FSF reimplement AT&T utilities? The FSF didn't invent grep. They wrote a new version of it from scratch under a different license.

reply
Civil War Hospital Clean Room equivalent
reply
Am I right in thinking that is not even "clean room" in the way people usually think of it, e.g. Compaq?

The "clean room" aspect for that came in the way that the people writing the new implementation had no knowledge of the original source material, they were just given a specification to implement (see also Oracle v. Google).

If you're feeding an LLM GPL'd code and it "creates" something "new" from it, that's not "clean room", right?

At the end of the day the supposed reimplementation that the LLM generates isn't copyrightable either so maybe this is all moot.

reply
> If you're feeding an LLM GPL'd code and it "creates" something "new" from it, that's not "clean room", right?

I didn’t RTFA but I suppose that by clean room here they mean you feed the code to ”one” LLM and tell it to write a specification. Then you give the specification to ”another” LLM and tell it to implement the specification.

reply
It's satire. The authors presented it at FOSDEM. They are people who previously worked in FOSS communities.
reply
Satire is too dangerous to be presented outside of its community. This honestly should've been left within FOSDEM.

It's great within the context of people who understand it, enlightening even. It sparks conversations and debates. But outside of that context, ignorance wields it like a bludgeon, and it becomes dangerous to everyone around it. Look at all the satirical media around fascism: if you recognized the criticism you could laugh, but for fascists it was a call to arms.

reply
No one who understands the first thing about this topic could possibly have read that web page and not realized that it was satire.

"Those maintainers worked for free—why should they get credit?"

"Your shareholders didn't invest in your company so you could help strangers."

"For the first time, a way to avoid giving that pesky credit to maintainers."

"Full legal indemnification [...] through our offshore subsidiary in a jurisdiction that doesn't recognize software copyright"

reply
Maybe I’m missing something but big corps do this, right? I legitimately expect folks like Musk and Zuckerberg to say these things. I get why that’s exactly the reason it’s satire but it’s a little too close to the truth for me to chuckle about it.
reply
This is because you're already in that mindset.

Try to take the stance of someone who doesn't know much about open source other than that it's a nuisance to use. To them, this is a great idea! "I wanted to use this tool that corporate said we couldn't touch, but now I can!"

reply
If people lack a sense of humor or an understanding of satire, even if pathologically, well, too bad for them. Why should the rest of us be denied that satire? It's not harming anyone at all.
reply
Unfortunately it's not just too bad for them; it's too bad for everyone around them. They aren't the ones who lose out when we start dismantling open source communities.
reply
The parent poster's point is that 2025-2026 is exactly the result of satire being weaponized to cause real harm, because people pretend it's truth.
reply
That wasn’t people weaponizing satire, that was people just making weapons
reply
There is an overlay of smeared poop on one of the license files… is that something you are seeing on typical tech company landing pages?

The company is literally named “bad/evil.”

reply