Per the US Copyright Office, code generated entirely by an LLM, with no human authorship, is not copyrightable.
Humans can create copyrightable code from LLM output if they apply their own creativity to significantly modify it.
https://resources.github.com/learn/pathways/copilot/essentia...
how much use do you think these indemnification clauses will be if training ends up being ruled not to be fair use?
poetic justice for a company founded on the idea of not stealing software
> If any suggestion made by GitHub Copilot is challenged as infringing on third-party intellectual property (IP) rights, our contractual terms are designed to shield you.
I'm not aware of a situation where this has actually been needed, but I assume MS has some tools to check whether a given suggestion was, or was likely to have been, generated by Copilot rather than by some other AI.