It seems inevitable, but in the meantime, that's vastly more expensive than running curl in a loop. In fact, it may be expensive enough that it cuts bot traffic down to a level I no longer care about defending against. GoogleBot, for example, has been crawling my stuff for years without breaking the site. If every bot were like that, I wouldn't care.
Even that functions as a sort of proof of work, requiring a commitment of compute resources that is table stakes for individual users but multiplies the cost of making millions of requests.
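For illustration, here's a minimal sketch of that idea in Python, assuming a hashcash-style scheme: the server hands out a random nonce, and the client must brute-force a counter until the hash clears a difficulty threshold. The function names and the difficulty value are my own, not from any particular implementation; the point is just that verification is one hash while solving takes roughly 2^20 attempts, which is nothing for one visitor and real money across millions of requests.

```python
import hashlib
import secrets

DIFFICULTY_BITS = 20  # ~1M hash attempts on average; trivial once, costly a million times over

def leading_zero_bits(digest: bytes) -> int:
    """Count the leading zero bits of a hash digest."""
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
        else:
            bits += 8 - byte.bit_length()
            break
    return bits

def make_challenge() -> str:
    """Server side: issue a random nonce to the client."""
    return secrets.token_hex(16)

def solve(challenge: str) -> int:
    """Client side: grind counters until the hash has enough leading zero bits."""
    counter = 0
    while True:
        digest = hashlib.sha256(f"{challenge}:{counter}".encode()).digest()
        if leading_zero_bits(digest) >= DIFFICULTY_BITS:
            return counter
        counter += 1

def verify(challenge: str, counter: int) -> bool:
    """Server side: a single hash checks what cost the client ~2^DIFFICULTY_BITS tries."""
    digest = hashlib.sha256(f"{challenge}:{counter}".encode()).digest()
    return leading_zero_bits(digest) >= DIFFICULTY_BITS

if __name__ == "__main__":
    c = make_challenge()
    print(verify(c, solve(c)))  # True
```

The asymmetry is the whole trick: the server's cost per request stays constant, while the crawler's cost scales linearly with request volume.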
"Turns out careless, unethical vibecoders aren't very competent." well, they rely on AI, don't they? and AI is trained with already existing bad code, so why should the outcome be different?