Yes, I know every millisecond a company like Google can shave off is multiplied across billions of transactions a day and can save real money on infrastructure. But even at a second-tier company like Salesforce, it probably doesn’t matter.
Over the past decade, part of my job has been to design systems, talk to “stakeholders”, delegate some of the work, and do some of it myself. I’m neither a web developer nor a mobile developer.
I don’t look at a line of code for those types of implementations; I just make sure they work. From my perspective, the people I delegated to might as well be “human LLMs”.
But even with C, it’s still not completely deterministic: out-of-order execution, branch prediction, cache hits vs. misses, etc. Didn’t exactly this cause some of the worst processor-level security issues we had seen in years?
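To make the cache point concrete, here is a minimal sketch (array size and timing method are my own illustrative choices, not a proper benchmark): the two loops below execute the same loads and adds over the same data, yet the stride-heavy order is typically several times slower purely because of cache misses. That same kind of timing difference is the side channel that speculative-execution attacks like Spectre and Meltdown exploited.

```c
#include <stdio.h>
#include <time.h>

#define N 4096                      /* 4096x4096 ints ~ 64 MB, larger than typical caches */

static int a[N][N];

static double now_sec(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec / 1e9;
}

int main(void) {
    long sum = 0;
    double t;

    /* Row-major: consecutive addresses, mostly cache hits after each line fill */
    t = now_sec();
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            sum += a[i][j];
    printf("row-major:    %.3f s\n", now_sec() - t);

    /* Column-major: same instruction count, but strided accesses miss constantly */
    t = now_sec();
    for (int j = 0; j < N; j++)
        for (int i = 0; i < N; i++)
            sum += a[i][j];
    printf("column-major: %.3f s\n", now_sec() - t);

    return (int)(sum & 1);          /* keep the compiler from eliding the loops */
}
```

Same C, same source-level "work", wildly different wall-clock time depending on microarchitectural state you never wrote down.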