I believe training currently costs significantly more than inference for all the current vendors, so I'd be surprised if it didn't also use more power.

And by the look of it, that'll be the norm pretty much forever - unless something fundamental changes about how models can be trained or updated, an "older" model loses value as its knowledge becomes out of date, even if we no longer get improvements from other sources or techniques.

But other things likely change based on "lifetimes" and usage patterns too - e.g. a large battery for an electric car may have a higher upfront energy cost in manufacturing than a small ICE engine + fuel tank, but presumably there's a mileage at which the improved per-mile efficiency overcomes that, and every additional mile after it widens the gap.
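That break-even point is just simple amortization; here's a sketch with purely illustrative numbers (none of these figures are real vehicle or battery data):

```python
# Hypothetical embodied-energy and per-mile figures, in kWh.
# These numbers are made up for illustration only.
EV_UPFRONT_KWH = 20_000   # assumed manufacturing energy for EV + battery
ICE_UPFRONT_KWH = 8_000   # assumed manufacturing energy for ICE car
EV_PER_MILE_KWH = 0.30    # assumed EV energy use per mile
ICE_PER_MILE_KWH = 1.00   # assumed ICE energy use per mile (incl. fuel)

def break_even_miles(ev_up, ice_up, ev_mi, ice_mi):
    """Mileage at which the EV's total energy cost equals the ICE's."""
    # ev_up + ev_mi * x == ice_up + ice_mi * x  =>  solve for x
    return (ev_up - ice_up) / (ice_mi - ev_mi)

miles = break_even_miles(EV_UPFRONT_KWH, ICE_UPFRONT_KWH,
                         EV_PER_MILE_KWH, ICE_PER_MILE_KWH)
print(round(miles))
```

With these toy numbers the EV pulls ahead after roughly 17,000 miles; past that point every mile adds to its advantage, which is the asymmetry the comment is pointing at.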
