There is an act of copying, and there is whether or not that copying was permitted under copyright law. If the author of the code said you can copy, then you can. If the original author didn't, but the author of a derivative work, who wasn't allowed to create a derivative work, told you you could copy it, then it's complicated.
And none of it's enforced except in lawsuits. If your work was copied without permission, you have to sue the person who did that, or else nothing happens to them.
(IANAL)
but also probably not fully right
as far as I understand they avoid the decision of whether an AI can produce creative work by saying that neither the AI nor its owner/operator can claim copyright (which makes the output de-facto public domain)
this wouldn't change anything wrt. derived work still carrying the original author's copyright
but it could change things wrt. parts of the derived work which by themselves are not derived
if it is a completely new implementation with completely different internals then it could still be non-LGPL even if produced by a person with in-depth knowledge of the original. Copyright only cares whether you "copied" something, not whether you had "knowledge" of it or whether it "behaves the same". So as long as it's distinct enough it can still be legally fine. The "full clean room" requirement is about "what is guaranteed to hold up in front of a court", not "what might pass as non-derivative but with legal risk".