> Modern OSes with virtual memory and multitasking and user isolation are a lot more tolerant of shit code, so we are getting more of it.

It's not just the glut of compute resources; we've already accepted bloat in modern software. The new crutch is treating every device as "always online," paired with the mantra of "ship now! push fixes later." It's easier to set up a big, complex CI pipeline that you push fixes into so it OTA patches the user's system. That way you can justify shipping broken, unfinished products to beat competitors who are doing the same.

reply
I think you're just recalling the few software products that were actually good. There was plenty of crap software that would crash and lose your work in the old days.
reply
Lol right!

Remember when OS uptime was super duper important? Now it's a given that you can go basically forever without restarting your computer and be fine.

reply
I always found it funny how Word on Windows 3.1/95 would have a daydream moment and just completely lock up, usually when you were about to save the document.

I still save stuff every few minutes out of habits formed in the 90s.

Old DOS stuff could be either a total nightmare or some of the most brilliant code you had ever seen. That's just the way having no guard rails goes.

reply
Another factor at work is the use of rolling updates to fix things that should have been caught with rigorous testing before release. Before the days of 'always on' internet it was far too costly to fix something shipped on physical media. Not that everything was always perfect, but on the whole it was pretty well stress-tested before shipping.

The sad truth is that now, because it's so easy to push a fix to everything, requiring little more from users than that their machines be more or less permanently connected to a network, even an OS is treated as casually as an application or a game.

reply