Something that makes it expensive to initiate a connection and cheap (relatively) to accept or reject would probably help. I think that’s a hard problem though.
I do 95% of my web browsing via Tor Browser and it is very tolerable; most circuits are fast enough for 1080p video (YouTube, Twitch livestreams, etc.) without any buffering.
Here is a speedtest I ran just moments ago; I would hardly consider this "painfully slow": https://www.speedtest.net/result/19172283165.png
Of course this is a single Tor circuit with an exit node, so speeds are slower when going directly to .onion sites, but the only real slowness comes from latency, not throughput.
I’m not talking about the network itself but the servers on the other end.
I guess my point is that while Google is definitely malicious, I don't think every site using recaptcha is, and if we expect them not to use that tool, there should probably be an alternative.
I think SV was asking what onion services, which can't really use recaptcha, do to prevent the DDoS storm.
And I would imagine the answer is obscurity, since the dark web isn't nearly as well mapped as the public web. That, plus Anubis or some other PoW, would probably go far.
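The PoW idea mentioned here (and the "expensive to initiate, cheap to accept or reject" property upthread) can be sketched as a hashcash-style challenge: the server hands out a random nonce, the client burns CPU finding a counter whose hash meets a difficulty target, and the server verifies with a single hash. This is a minimal illustration, not how Anubis specifically implements it; the difficulty value and function names are my own.

```python
import hashlib
import secrets

DIFFICULTY_BITS = 12  # hypothetical; real deployments tune this per threat level

def issue_challenge() -> bytes:
    # Server side: generating a challenge is cheap (just random bytes).
    return secrets.token_bytes(16)

def leading_zero_bits(digest: bytes) -> int:
    # Count leading zero bits of a hash digest.
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
            continue
        bits += 8 - byte.bit_length()
        break
    return bits

def solve(challenge: bytes, difficulty: int = DIFFICULTY_BITS) -> int:
    # Client side: brute-force a counter until the hash meets the target.
    # Expected cost is roughly 2**difficulty hash evaluations -- this is
    # the "expensive to initiate" half of the asymmetry.
    counter = 0
    while True:
        digest = hashlib.sha256(challenge + counter.to_bytes(8, "big")).digest()
        if leading_zero_bits(digest) >= difficulty:
            return counter
        counter += 1

def verify(challenge: bytes, counter: int, difficulty: int = DIFFICULTY_BITS) -> bool:
    # Server side: one hash to accept or reject, no matter how hard
    # the client worked -- the "cheap to accept or reject" half.
    digest = hashlib.sha256(challenge + counter.to_bytes(8, "big")).digest()
    return leading_zero_bits(digest) >= difficulty

challenge = issue_challenge()
proof = solve(challenge)
print(verify(challenge, proof))
```

The asymmetry only holds per request, which is why the next comment's question is a fair one: something still has to serve the challenge itself.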
If I’m hosting at some IP, I still need Anubis or something to serve up the challenge, so doesn’t that become the attack point?