> How is one supposed to ensure license compliance while using LLMs, which do not (and cannot) attribute the sources that contributed to a specific response?

Additionally, there seems to be a general problem with LLM output and copyright[1], at least in Germany: LLM output cannot be copyrighted, and the whole legal field seems under-explored.

> This immediately raises the question of who is the author of this work and who owns the rights to it. Various solutions are possible here. It could be the user of the AI alone, or it could be a joint work between the user and the AI programmer. This question will certainly keep copyright experts in the various legal systems busy for some time to come.

It seems that in the long run the kernel license might become unenforceable if LLM output is used?!

[1] https://kpmg-law.de/en/ai-and-copyright-what-is-permitted-wh...

Either you allow LLM-generated + human-reviewed code, or people start hiding their AI use.

...and then people start going "that's AI" on every single piece of code, seeing AI-generated code left and right - the same way people now claim every other picture, video, or piece of text is "AI".

IMO it's a lot better to let people openly say "this code was generated with AI assistance" but still sign off on it, because "Your job is to deliver code you have proven to work": https://simonwillison.net/2025/Dec/18/code-proven-to-work/
