Because AI is also proving to be very good at reverse engineering proprietary binaries or just straight up cloning software from test suites or user interfaces. Cuts both ways.
reply
Oh sure, AI is a fantastic protection against copyright law. You do realize that if you're not able to prove that you wrote something, you're wide open to claims of copyright infringement, especially if your argument is going to be 'it wasn't me that did the RE, it was the AI, the same AI that wrote the code'.

It's going to be very interesting to see 'cleanroom' kinds of development in the AI age, but I suspect it's not going to be such a walk in the park as some seem to think it will be. There are just too many vested interests. But: it would be nice to see someone do a release of, say, the Oracle source code as rewritten by AI through this process, just to see how fast the IP hammer will come down on this kind of trick.

reply
Reverse engineering is illegal in many jurisdictions, and especially in the USA thanks to the DMCA.

If the argument is just "They won't catch me", then yes you are correct.

But some of us are still forced to follow the law, whatever it might be.

Also: They still have patents on it.

reply
So the argument is just "AI is magic and any kind of software can be rewritten for free"? Not really sure I buy it...
reply
Have you ever seen what obfuscation looks like when somebody puts the effort in?

Not to mention companies will try to mandate hardware decryption keys so the binary is encrypted and your AI never even gets to analyze the code which actually runs.

It's not sci-fi, it's a natural extension of DRM.

reply
Companies have been encrypting code to HSMs for decades. Never stopped humans from reverse engineering so it certainly will not stop AI aided by humans able to connect a Bus Pirate on the right board traces. Anything that executes on the CPU can be dumped with enough effort, and once dumped it can be decompiled.
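A toy illustration of that last point (not a real DRM scheme, just XOR standing in for whatever encryption is used at rest): code that has to run in the clear can always be captured at the point of execution, so encryption at rest only raises the effort, it doesn't close the door.

```python
# Toy model: the shipped binary is "encrypted", but the CPU needs the
# plaintext to execute it, so a debugger or memory dump at the right
# moment recovers it exactly.
KEY = 0x5A

def xor_transform(data: bytes) -> bytes:
    # XOR with a fixed key; applying it twice returns the original
    return bytes(b ^ KEY for b in data)

payload = b"print('hello from protected code')"
shipped = xor_transform(payload)   # what sits on disk: unreadable
assert shipped != payload

# At runtime the loader must undo the transform before execution;
# dumping memory at that point yields the plaintext.
recovered = xor_transform(shipped)
assert recovered == payload
print(recovered.decode())
```

The point generalizes: however strong the real cipher, the decrypted instructions exist somewhere (RAM, cache, bus) while they run, which is exactly where tools like a debugger or a logic analyzer intercept them.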
reply
You are agreeing with me, you just don't know it yet.

1) The financial aspect: As you say, more and more advanced DRM requires more and more advanced tools. Even assuming advanced AI can guide any human to do the physical part, that still means you have to pay for the hardware. And the hardware has to be available (companies have been known to harass people into giving up perfectly moral and legal projects).

2) The legal aspect: Possession of burglary tools is illegal in some places. How about possession of hacking tools? Right now it's not a priority for company lobbying, but what about when that's the only way to decompile? Even today, reverse engineering is a legal minefield. Did you know that in some countries reverse engineering is technically legal, but only under conditions such as having a disability that necessitates it, and only for personal use?[0]

3) The TOS aspect: What makes you think AI will help you? If the company owning the AI says so, you're on your own.

---

You need to understand 2 things:

- Just because something is possible doesn't mean somebody is gonna do it. Effort, cost and risk play huge roles. And that assumes no active hostile interference.

- History is a constant struggle between groups with various goals and incentives. Some people just want to live a happy life, have fun and build things in their free time. Other people want to become billionaires, dream about private islands, desire to control other people's lives and so on. People are good at what they focus on. There's perhaps more of the first group but the second group is really good at using their money and connections to create more money and connections which they in turn use to progress towards their primary objectives, usually at the expense of other people. People died[1] over their right to unionize. This can happen again.

Somebody might believe historical people were dumb or uncivilized and it can't happen today because we've advanced so much. That's bullshit. People have had largely the same wetware for hundreds of thousands of years. The tools have evolved but their users have not.

[0]: https://pluralistic.net/2026/03/16/whittle-a-webserver/ - "... aren't tools exemptions, they're use exemptions ... You have that right. Your mechanic does not have that right."

[1]: https://en.wikipedia.org/wiki/Pinkerton_(detective_agency)

reply
I spent a fun week during Christmas figuring out some really obfuscated binary code with anti-debugging and anti-tampering tricks in a cryptographic context. I didn't use Ghidra or IDA or anything beyond gdb with DeepSeek chat in a browser. That low effort got me what I needed.
reply
Exactly.

AI proponents completely ignore the disparity of resources available to an individual and a corporation. If I and a company of 1000 people create the same product and compete for customers, the company's version will win. Every single time. Or maybe at least 1000:1 if you're an optimist.

They have access to more money for advertising, they have an already established network of existing customers, they have legal and marketing experts on payroll. Or just look at Microsoft, they don't even need advertising, they just install their product by default and nobody will even hear about mine.

Not to mention, as you said, the training advantage only goes from open source to closed source, not the other way around.

AI proponents who talk about "democratization" are nuts, it would be laughable if it wasn't so sad.

reply
>If I and a company of 1000 people create the same product and compete for customers, the company's version will win. Every single time.

As a person who works for a company with 25k people, I would disagree. You, a single person, will often get to the basic product that a lot of people will want much faster than a company with 1k, 5k or 25k people.

Bigger companies are constrained by internal processes, piles of existing stuff, the inability to hire at the scale they need, and the larger context required. Also regulation and all that. Bigger companies are also really slow to adapt, so they would rather let you build the product and then buy out your company along with your product and the people who built it. They are at a temporary disadvantage every time the landscape shifts.

reply
The point wasn't about the number of people; the point was that a company which employs that number of people has enough money which can be converted to leverage against you.

Besides that, your whole argument hinges on large companies being inflexible, inefficient and poorly run. Isn't that exactly the kind of problem AI promises to solve? Complete AI surveillance of every employee, tasks and instructions tailored to each individual, and superhuman planning. Of course at that point the only employees will be manual workers, because actual AI will be much better and cheaper than every human at everything except those things where it needs to interact with the physical world. Even contract negotiations with both employees and customers will be done by AI instead of humans; the human will only sign off on the result to meet legal requirements, just like today you technically enter a contract with a representative of the company who isn't even there when you talk to a negotiator.

reply
Large companies are often inflexible and inefficient as a matter of deliberate strategy. I've found myself in scenarios where we have a complete software artifact that a smaller company would launch and find successful, but we can't launch it, because we have to satisfy some expectation we've set or do a complex integration with some important other system of ours.
reply
A lesson from gamedev is that players will deliberately restrict themselves - sometimes to make the game more fun or challenging, sometimes to appeal to their aesthetic principles.

If/when superhuman AI is achieved, those limitations will all go away. An owner will just give it money and control and tell it to optimize for more money or political power or whatever he wants.

That's a much scarier future than a paperclip maximizer, because it's much closer and it doesn't require complete takeover first. It'll be just business as usual, except somehow more sociopathic.

reply