> Unless Nvidia can launch a new chip every 2-3 years with massively improved performance-per-watt at a lower price no one is going to rush to recycle the old one.

That's exactly the point.

Performance per watt is increasing so much gen-to-gen that it no longer makes sense to run older hardware.

Not my words, Jensen's.

reply
Are you saying that the person selling shovels thinks you should buy a new shovel? I guess they must be the expert.
reply
So you're saying that the CEO of the company that builds the chips thinks it makes sense to replace them every generation?
reply
When the warbirds are on the wing, sell anti-aircraft systems to both sides.
reply
You can absolutely run, e.g., a datacenter-level A100 at home; there are adapters from the SXM socket to PCIe. I haven't seen people running SXM versions of H100s this way, but that could be down to price alone.
reply
Well, by the time they become obsolete you'll be able to run that computation on a Mac with no special cooling, so I really doubt they will be of any use. Maybe in some parts of the world where electricity is cheap. If someone really wants to find out, watching what happened to crypto ASICs could help.
reply
Technically true, but I would wager that the home lab is going to require increasingly distinct and unusual adaptations to retrofit this hardware for home use.

New stuff is all liquid cooled by default and that's a paradigm shift for your average home lab.

I'm less aware of exactly what's happening on the power side of things, but I think some of the architectures are now moving to relatively high-voltage DC distribution throughout, down-converting to low voltage right before the point of use. So it's not exactly plug-and-play with your average NEMA 5-15 outlet.
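That said, an older air-cooled card on a standard outlet roughly pencils out. A back-of-the-envelope sketch in Python; the 90% PSU efficiency, ~200 W for the rest of the box, and the NEC 80% continuous-load derating are my assumptions, and the 600-700 W GPU figure is the one cited elsewhere in the thread:

    # Can a used datacenter GPU run off one standard US household circuit?
    # Assumptions (mine): 90%-efficient PSU, ~200 W for the rest of the box,
    # and the NEC 80% derating for continuous loads.

    NEMA_5_15_VOLTS = 120        # standard US 15 A outlet
    NEMA_5_15_AMPS = 15
    CONTINUOUS_DERATE = 0.80     # NEC 80% rule for continuous loads

    circuit_watts = NEMA_5_15_VOLTS * NEMA_5_15_AMPS * CONTINUOUS_DERATE  # 1440 W

    gpu_watts = 700              # upper end of the 600-700 W figure in the thread
    system_watts = 200           # CPU, fans, drives: a rough guess
    psu_efficiency = 0.90

    wall_draw = (gpu_watts + system_watts) / psu_efficiency  # ~1000 W
    print(f"Wall draw ~{wall_draw:.0f} W of {circuit_watts:.0f} W usable")
    print("Fits on one circuit" if wall_draw <= circuit_watts
          else "Needs a dedicated or 240 V circuit")

So a single 600-700 W card fits; it's stacking several, or the liquid-cooled SXM boards, that pushes you past what a regular circuit gives you.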

reply
> I doubt anyone has the setup to run a H200 in their home rig.

There are PCIe versions of these, right? And another comment says there are SXM-to-PCIe adapters too. It "only" requires 600 to 700 W. That's not out of reach for everyone.

If the used regular-server market is any indication, after a few years you can find a lot of enterprise gear at steep discounts. A CPU that cost $4K brand new going for $100: stuff like that.
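Run the numbers and that drop is steep however you slice it. A quick sketch of the annual value loss it implies; the thread only says "a few years," so the horizons below are my assumption:

    # Implied annual depreciation for a $4K CPU that resells for $100.
    # The 3-5 year horizons are assumptions; "a few years" isn't pinned down.

    new_price = 4_000
    used_price = 100

    for years in (3, 4, 5):
        retention = (used_price / new_price) ** (1 / years)
        print(f"{years} years: ~{100 * (1 - retention):.0f}% of value lost per year")

Even on the generous five-year horizon that's roughly half the remaining value gone every year.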

A friend of mine has a 42U rack, and so do some homelabbers. People have been running GPU farms mining cryptocurrencies or doing "transcoding" (for money).

It's not just CPUs at 1/40th of their brand-new price: network gear too, and ECC RAM (before the recent RAM craze).

I'm pretty sure that if H200s begin to flood the used market, people will adapt quickly.

> Unless Nvidia can launch a new chip every 2-3 years with massively improved performance-per-watt at a lower price no one is going to rush to recycle the old one.

I agree with that. But if old H200s get resold, people are resourceful and will find a way to run them.
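And once a card is in a home rig, nvidia-smi (which ships with the NVIDIA driver) reports power draw and temperature, so you can tell quickly whether your cooling and circuit are keeping up. A minimal polling sketch; the interval, power ceiling, and alert threshold are illustrative choices of mine:

    # Poll power draw and temperature for every visible NVIDIA GPU.
    # nvidia-smi and its query flags are real; the loop parameters are mine.

    import subprocess
    import time

    POWER_LIMIT_W = 700.0   # rough ceiling for an H200-class card
    POLL_SECONDS = 5

    while True:
        out = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=power.draw,temperature.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
        for i, row in enumerate(out.strip().splitlines()):
            power, temp = (float(x) for x in row.split(","))
            note = "  <- near limit" if power > 0.9 * POWER_LIMIT_W else ""
            print(f"GPU {i}: {power:.0f} W, {temp:.0f} C{note}")
        time.sleep(POLL_SECONDS)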

reply
Would it even require a particularly high level of resourcefulness? Purchase the GPU along with the mobo that slots it. It's not as though companies typically swap out the CPU and GPU while keeping the rest of the box.
reply
Where do you find such deals?
reply
Start on eBay, learn who the off-lease resellers are, and then watch them directly.
reply