AI tools are decent at helping with code because they're editing language in a context. AI tools are terrible at helping with art because they are operating on the entirely wrong abstraction layer (in this case, waveforms) instead of the languages humans use to create art, and it's just supremely difficult to add to the context without destroying it.
(Most of it isn't.)
Art is on a sliding scale from "Fun study and experiment for the sake of it" to "Expresses something personal" to "Expresses something collective" to "A cultural landmark that invents a completely new expressive language, emotionally and technically."
All of those options are creatively worthwhile. Or maybe none of them are.
Take your pick.
Because it is a human making it, expressing something is always worthwhile to the individual on a personal level. Even if it's not "artistically worthwhile," the process is rewarding to the participant at the very least, which is why a lot of people find enjoyment in creating art even if it's not commercially successful.
But in this case, the criteria change for the final product (the music being produced). It is not artistically worthwhile to anyone, not even the creator.
So no, a person with no talent (by their own account) using an LLM to create art is, by default, far less worthwhile than a person of any level of talent creating art on their own.
The medium was using the "wrong" tool for the job, something creative musicians do on a regular basis. And the output was so cool that it felt like a relic from a different era even though it's hyper-modern.