Training currently requires Nvidia's latest and greatest for the best models (they also use Google TPUs now, which are also technically the latest and greatest? Though TPUs are more dual-purpose than anything, afaik, so the assessment would still hold in that case)
Inference can run on a hot potato if you really put your mind to it
I am not saying this would be a great use of their compute, but idle is far from the only alternative. (Unless electricity is the binding constraint?)
Huh, what? You know you can turn off unused equipment, right? And at least my Nvidia GPU can draw more or fewer watts even while powered on.
Or does Anthropic have a flat-rate deal for electricity and cooling?
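For what it's worth, the variable-draw claim is easy to check yourself with `nvidia-smi`: it reports per-GPU power draw against the enforced power limit, and the limit itself can be lowered. A minimal sketch (assumes the NVIDIA driver is installed; degrades gracefully if it isn't):

```shell
# Check per-GPU power draw vs. the enforced power limit.
if command -v nvidia-smi >/dev/null 2>&1; then
    # Columns: GPU index, current draw (W), enforced limit (W)
    nvidia-smi --query-gpu=index,power.draw,power.limit --format=csv
    # Lowering the cap (e.g. to 150 W on GPU 0) needs root and a
    # hardware-specific valid range:
    #   sudo nvidia-smi -i 0 -pl 150
else
    echo "nvidia-smi not found; no NVIDIA GPU to query"
fi
```

An idle GPU typically sits far below its limit (tens of watts vs. hundreds under load), which is the point: power draw tracks utilization, it isn't a flat line.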