RAM and GPUs are getting more expensive, but mostly for applications that need a lot of them, like AI. The hardware cost for regular applications has not increased much (especially when factoring in inflation). Spending 2x the development time on optimization is often not worth it (or only pays off with large deployments).

UI development is an even more special case here: the customer, not the company, buys the machine that runs the code. So sadly "good enough" is the standard.

One example for me here is the "switch product option" button on Amazon listings (e.g. switching from the green to the blue color, or from a smaller to a larger model). On my phone this sometimes takes >5 seconds to load properly. Horribly optimised.

Oh of course, that's the current standard, but I doubt it will be considered acceptable for much longer.

It’s not even close to being at an end. Hardware would need to increase in cost by hundreds or even thousands of times to materially change that calculation.

Just as an example, the cost of one week of engineering time corresponds to tens of thousands of vCPU-hours, which is many years of CPU time.
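
A rough back-of-the-envelope version of that comparison (both dollar figures below are assumed round numbers, not quoted prices):

    # One engineer-week versus on-demand vCPU time, using assumed prices.
    engineer_week_usd = 4_000    # assumed fully loaded cost of one engineer-week
    vcpu_hour_usd = 0.05         # assumed on-demand price per vCPU-hour

    vcpu_hours = engineer_week_usd / vcpu_hour_usd
    cpu_years = vcpu_hours / (24 * 365)

    print(f"{vcpu_hours:,.0f} vCPU-hours")  # 80,000 vCPU-hours
    print(f"{cpu_years:.1f} CPU-years")     # ~9.1 CPU-years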

As such, it only ever makes business sense to optimize code either when it has bottlenecks that can’t be fixed by throwing hardware at it, or when it’s so inefficient that it can be sped up by several orders of magnitude.
