Human beings can create copyrightable code.

As per the US Copyright Office, LLMs can never create copyrightable code.

Humans can create copyrightable code from LLM output if they use their human creativity to significantly modify the output.

reply
a human can still be held accountable though; github copilot running amok, less so
reply
If you pay for Copilot Business/Enterprise, they actually offer IP indemnification and support in court, if needed, which is more accountability than you would get from human contributors.

https://resources.github.com/learn/pathways/copilot/essentia...

reply
9 lines of code came close to costing Google $8.8 billion

how much use do you think these indemnification clauses will be if training ends up being ruled not fair use?

reply
Are you concerned that this will bankrupt Microsoft?
reply
I think they're afraid they will have to sue Microsoft to get them to abide by the promise to come to their defense in another suit.
reply
would be nice, wouldn't it?

poetic justice for a company founded on the idea of not stealing software

reply
I think the fact that they felt the need to offer such a service says everything: it's basically an admission that LLMs can plagiarize and violate licenses.
reply
Does that cover any random contribution that's merely claimed to be AI-generated?
reply
Their docs say:

> If any suggestion made by GitHub Copilot is challenged as infringing on third-party intellectual property (IP) rights, our contractual terms are designed to shield you.

I'm not actually aware of a situation where this was needed, but I assume that MS might have some tools to check whether a given suggestion was, or is likely to have been, generated by Copilot, rather than some other AI.

reply