I realized this is the crux of our moment, because a variant of Amdahl's law applies to AI code gen.
{time gained} = {time saved via gen AI} - {time spent in human review}
There's no way that comes out positive with 100% human review coverage — carefully reviewing unfamiliar generated code is at least as slow as writing it yourself — which means human review coverage is headed below 100% (and the pressure is to push it as low as possible).
The question is whether humans can sensibly judge the break-even point and not generate faster than that. It's very easy to get lost in the woods and suddenly be sitting on a pile of generated code you no longer grok.
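A back-of-envelope sketch of that break-even arithmetic. Every number here is a made-up illustrative assumption (hand-writing at 50 LOC/hour, careful review at 40 LOC/hour), not a measurement; the point is only the shape of the trade-off:

```python
# Toy model of: time gained = time saved via gen AI - time spent in human review.
# All rates are hypothetical assumptions for illustration.

def time_gained(loc_generated, hand_rate, review_rate, review_coverage):
    """Net hours gained from generating loc_generated lines of code.

    hand_rate:       LOC/hour a human would have written unassisted
    review_rate:     LOC/hour of careful human review
    review_coverage: fraction of generated code humans actually review
    """
    hours_saved = loc_generated / hand_rate
    hours_reviewing = (loc_generated * review_coverage) / review_rate
    return hours_saved - hours_reviewing

# If review is slower than writing (40 vs 50 LOC/hour), full review is a net loss,
# and the break-even coverage is review_rate / hand_rate = 0.8.
for coverage in (1.0, 0.8, 0.5):
    print(coverage, time_gained(10_000, 50, 40, coverage))
```

Under these assumptions, 100% coverage loses 50 hours, 80% breaks even, and 50% gains 75 hours — which is exactly why the economics push coverage downward.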
As we know from driving: sensible drivers stick to the speed limit most of the time, but a good percentage of knuckle draggers just love speeding, some people drive drunk, and some drive the wrong way down the highway entirely. Either way, it's usually the sensible people who end up suffering.