Isn't the "corpo moat" bigger now?

They can wash the copyright by AI training, but the AIs don't get trained on closed source.

"corpo" also has a ton of patents, which still can't be AI-washed.

The only licenses that will become unenforceable are Open Source licenses, so how does that make it a "level field"?

reply
Because AI is also proving to be very good at reverse engineering proprietary binaries or just straight up cloning software from test suites or user interfaces. Cuts both ways.
reply
Oh sure, AI is a fantastic protection against copyright law. You do realize that if you're not going to be able to prove that you wrote something, you're wide open to claims of copyright infringement, especially if your argument is going to be 'it wasn't me that did the RE, it was the AI, the same AI that wrote the code'.

It's going to be very interesting to see 'cleanroom' kind of development in the AI age, but I suspect it's not going to be such a walk in the park as some seem to think it will be. There are just too many vested interests. But: it would be nice to see someone do a release of, say, the Oracle source code as rewritten by AI through this process, just to see how fast the IP hammer comes down on this kind of trick.

reply
Reverse engineering is illegal in many jurisdictions, and especially in the USA thanks to the DMCA.

If the argument is just "They won't catch me", then yes you are correct.

But some of us are still forced to follow the law, whatever it might be.

Also: They still have patents on it.

reply
So the argument is just "AI is magic and any kind of software can be rewritten for free"? Not really sure I buy it...
reply
Have you ever seen what obfuscation looks like when somebody puts the effort in?

Not to mention companies will try to mandate hardware decryption keys so the binary is encrypted and your AI never even gets to analyze the code which actually runs.

It's not sci-fi, it's a natural extension of DRM.

reply
Companies have been encrypting code to HSMs for decades. That never stopped humans from reverse engineering, so it certainly will not stop AI aided by humans who can connect a Bus Pirate to the right board traces. Anything that executes on the CPU can be dumped with enough effort, and once dumped it can be decompiled.
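To make the "anything that executes can be dumped" point concrete, here's a minimal Linux-only sketch (the function names are mine, not from any real tool): once code is decrypted into RAM so the CPU can run it, it's readable as plain bytes. This process dumps its own executable mappings via /proc; a debugger does the same to another process via ptrace.

```python
# Sketch: once code is mapped executable, it can be read back out as data.
def executable_regions(maps_path="/proc/self/maps"):
    """Parse /proc/<pid>/maps and return (start, end) address pairs for
    every mapping whose permission bits include 'x' (executable)."""
    regions = []
    with open(maps_path) as maps:
        for line in maps:
            addrs, perms = line.split()[:2]
            if "x" in perms:
                start, end = (int(a, 16) for a in addrs.split("-"))
                regions.append((start, end))
    return regions

def dump_region(start, length, mem_path="/proc/self/mem"):
    """Read raw bytes of mapped memory -- the (decrypted) machine code a
    decompiler would then work on."""
    with open(mem_path, "rb") as mem:
        mem.seek(start)
        return mem.read(length)

regions = executable_regions()
sample = dump_region(regions[0][0], 16)
print(len(regions), len(sample))
```

Doing this to another process takes ptrace privileges and, against serious DRM, hardware access, which is exactly where the effort and cost arguments below come in.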
reply
You are agreeing with me, you just don't know it yet.

1) The financial aspect: As you say, more and more advanced DRM requires more and more advanced tools. Even assuming advanced AI can guide any human to do the physical part, that still means you have to pay for the hardware. And the hardware has to be available (companies have been known to harass people into giving up perfectly moral and legal projects).

2) The legal aspect: Possession of burglary tools is illegal in some places. How about possession of hacking tools? Right now it's not a priority for company lobbying, but what about when that's the only way to decompile? Even today, reverse engineering is a legal minefield. Did you know that in some countries reverse engineering is technically legal, but only under certain conditions, such as having a disability that necessitates it and using the result only for personal use?[0]

3) The TOS aspect: What makes you think AI will help you? If the company that owns the AI says otherwise, you're on your own.

---

You need to understand 2 things:

- Just because something is possible doesn't mean somebody is gonna do it. Effort, cost and risk play huge roles. And that assumes no active hostile interference.

- History is a constant struggle between groups with various goals and incentives. Some people just want to live a happy life, have fun and build things in their free time. Other people want to become billionaires, dream about private islands, desire to control other people's lives and so on. People are good at what they focus on. There's perhaps more of the first group but the second group is really good at using their money and connections to create more money and connections which they in turn use to progress towards their primary objectives, usually at the expense of other people. People died[1] over their right to unionize. This can happen again.

Somebody might believe historical people were dumb or uncivilized and it can't happen today because we've advanced so much. That's bullshit. People have had largely the same wetware for hundreds of thousands of years. The tools have evolved but their users have not.

[0]: https://pluralistic.net/2026/03/16/whittle-a-webserver/ - "... aren't tools exemptions, they're use exemptions ... You have that right. Your mechanic does not have that right."

[1]: https://en.wikipedia.org/wiki/Pinkerton_(detective_agency)

reply
I spent a fun week during Christmas figuring out some really obfuscated binary code with anti-debugging and anti-tampering tricks in a cryptographic context. I didn't use Ghidra or IDA or anything beyond gdb with DeepSeek chat in a browser. That low effort got me what I needed to get.
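For context on what "anti-debugging" means here, a toy version of one common check (Linux-only sketch; the function name is mine, not from any real protection scheme): a traced process shows a nonzero TracerPid in /proc/self/status, and protected binaries often refuse to run when they see one.

```python
# Toy anti-debugging check: on Linux, /proc/self/status reports the PID of
# any tracer (gdb, strace) attached to this process; 0 means no debugger.
def tracer_pid(status_path="/proc/self/status"):
    with open(status_path) as f:
        for line in f:
            if line.startswith("TracerPid:"):
                return int(line.split(":", 1)[1])
    return 0

if tracer_pid() != 0:
    print("debugger detected")
else:
    print("no tracer attached")
```

Real schemes repeat checks like this in loops, mixed with timing checks and self-modifying code, which is what makes the manual part of the analysis tedious even with an LLM suggesting next steps.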
reply
[dead]
reply
Exactly.

AI proponents completely ignore the disparity of resources available to an individual and a corporation. If I and a company of 1000 people create the same product and compete for customers, the company's version will win. Every single time. Or maybe at least 1000:1 if you're an optimist.

They have access to more money for advertising, they have an already established network of existing customers, they have legal and marketing experts on payroll. Or just look at Microsoft, they don't even need advertising, they just install their product by default and nobody will even hear about mine.

Not to mention, as you said, the training advantage only flows from open source to closed source, not the other way around.

AI proponents who talk about "democratization" are nuts, it would be laughable if it wasn't so sad.

reply
>If I and a company of 1000 people create the same product and compete for customers, the company's version will win. Every single time.

As a person who works for a company with 25k people, I would disagree. You, a single person, will often get to the basic product that a lot of people will want much faster than a company with 1k, 5k or 25k people.

Bigger companies are constrained by internal processes, piles of existing stuff, an inability to hire at the scale they need, and the larger context required. Also regulation and all that. Bigger companies are also really slow to adapt, so they would rather let you build the product and then buy out your company along with your product and the people who built it. They are at a temporary disadvantage every time the landscape shifts.

reply
The point wasn't about the number of people; the point was that a company which employs that many people has enough money to convert into leverage against you.

Besides that, your whole argument hinges on large companies being inflexible, inefficient and poorly run. Isn't that exactly the kind of problem AI promises to solve? Complete AI surveillance of every employee, tasks and instructions tailored to each individual, and superhuman planning. Of course, at that point the only employees will be manual workers, because actual AI will be much better and cheaper at everything than every human, except where it needs to interact with the physical world. Even contract negotiations with both employees and customers will be done by AI instead of humans; the human will only sign off for legal requirements, just like today you technically enter a contract with a representative of the company who is not even present when you talk to a negotiator.

reply
Large companies are often inflexible and inefficient as a matter of deliberate strategy. I've found myself in scenarios where we have a complete software artifact that a smaller company would launch and find successful, but we can't launch it, because we have to satisfy some expectation we've set or do a complex integration with some important other system of ours.
reply
A lesson from gamedev is that players will deliberately restrict themselves - sometimes to make the game more fun or challenging, sometimes to appeal to their aesthetic principles.

If/when superhuman AI is achieved, those limitations will all go away. An owner will just give it money and control and tell it to optimize for more money or political power or whatever he wants.

That's a much scarier future than a paperclip maximizer because it's much closer and it doesn't require a complete takeover first. It'll be just business as usual, except somehow more sociopathic.

reply
deleted
reply
> If something does not exist as MIT, an LLM will create it.

Nitpicking on the license here, but please don't use MIT, it has no patent grant protections.

And those are never covered in any AI-washing anyway.

There are equivalent licenses with patent grant protection, like 'Apache2+LLVM exception' or 'Mozilla Public License 2' and others...

reply
The corporate moat is the army of lawyers they have. It doesn’t matter whether they win or not if you can’t afford endless litigation. It’s the same for patents.
reply
Funny, their army of lawyers seems incapable of stopping me from easily downloading pirated software, or from coding an open alternative to their closed-source software with AI if I wanted to...

You cannot keep a purely legally-enforced moat in the face of advancing technology.

reply
I would caution against using this argument.

In the USA, the DMCA can make it illegal to even own and use tools meant to bypass the weakest of protections.

This law has already been used to ruin lives.

"They might catch the individual but not us all" is nice and fine until it is your turn, so check your legislation.

reply
The music industry has an army of lawyers too, and it did not make a damn bit of difference once bittorrent was popularized.

IP law means nothing once tens of millions of people are openly violating it.

The software industry is about to learn this lesson too.

reply
So is music free now? Does the record industry no longer exist, or is it no longer ridiculously profitable? Are artists finally earning a fair share?
reply
Music is free, because music piracy is unenforceable so the law is irrelevant. Now, I personally buy most of my music on vinyl because I want to support artists, but absolutely nothing forces me to do that as all the music is available for free.
reply
As far as I can see, the vast majority of people don’t pirate music these days (unlike 20 years ago). Most people wouldn’t even know where and how to pirate music. They just have Spotify or another streaming service.
reply
> So is music free now?

Uhm... yes? The cost of downloading pirated music is essentially zero. The only reason people use services like Spotify is that they're extremely cheap while being a bit more convenient. But jack up the price and the masses will move to sail the high seas again.

reply
The cost of stealing has always been essentially zero. Same argument can be made for streaming, and yet Netflix is neither cheap nor struggling for subscribers.
reply
> The cost of stealing has always been essentially zero.

That is not necessarily true, depending on the level of enforcement and the availability of opportunities to steal.

> Same argument can be made for streaming, and yet Netflix is neither cheap nor struggling for subscribers.

Netflix is still pretty cheap for the convenience it provides. Again, jack up the price and see the masses move to torrent movies/shows again.

reply
In the sense that artists cannot expect to get any money for their work, yeah, music's free. Becoming a meme or a celebrity on the strength of personality is still fair game, to the extent that AI is not impersonating people effectively at scale yet.

Yet.

A whole bunch of people I watch on youtube (politics, analysts, a weatherman) are already seeing AI impersonation videos, sometimes misrepresenting their positions and identities. This will grow.

So, you can't create art because that's extruded at scale in such a way that it's just turning on the tap to fill a specified need, and you can't be a person because that can also be extruded at scale pretty soon, either to co-opt whatever you do that's distinct, or to contradict whatever you're trying to say, as you.

As far as being a person able to exist and survive by exchanging anything you are or anything you do for recompense, I'm not sure that's in the cards. Which seems weird for a technology in the guise of aiding people.

reply
This means all copyleft effectively becomes MIT, but it doesn't change the closed source stuff... So once again it benefits corpo more than most.
reply
Generating software still costs tokens; generating something like ms-word will still cost a significant amount and take a lot of human effort to prompt and validate. Having a proven solution still has value.
reply
You can already generate surprisingly complex software with an LLM on a Raspberry Pi now, including live voice assistance, all offline. People's hardware can self-write software pretty readily now. The cost of tokens is a race to zero.
reply
Ironically, I actually suspect the exact opposite. Linux has no real choice in this matter because most of the code is written by Google, Red Hat, Cisco, and Amazon at this point, and these big cos are all going to mandate their developers have to use AI coding agents. Refuse to accept these contributions and we're just going to end up with 20 Linuxes instead of one, and the original still under the control of Linus will be relegated to desktop usage and wither and die.
reply