I just read Vernor Vinge's "A Deepness in the Sky", and the way he modeled their compute systems felt depressingly believable: they have thousands of years of libraries floating around, loosely tacked together, and specialist programmer-archaeologists are the ones who dig deep and try to understand the system.
Interestingly, most long-running codebases are like that, no?
It's just that producing that amount of code (including reviewing, testing, and all the rest, even AI-assisted) in a significantly shorter period of time makes the discrepancy much more visible to us.
Boiling frog
This seems like a path to eventual LLM lock-in once the codebase gets messy enough. These tools could end up being like 0% interest credit cards for technical debt. It all depends on how token usage scales over time; my guess is it will grow faster than linearly.