Then of course there was the huge "replace everything with SSDs ASAP" performance bump, but from the later Core chips up until the M1, everything felt incremental. Nothing like the "Wolfenstein 3D to Quake with Glide in 5 years" era.
Holy shit, it was only 5 years - and the M1 was released 6 years ago!
By comparison, from the first Core Duo computers to today... we should have been roaming around, at street level, in fully 3D-rendered cities in Street View long ago, at least with Half-Life 2 quality graphics, plus some high-res textures and lighting bolted on to fake the most complex effects.
And yet what we have is the exact same rendering that PCs and PowerPC Macs did with virtual-tour multimedia CDs (and Encarta) in the late '90s, just at a higher resolution. Literally: a 2D image rendered inside a sphere.
That's it - something you can do in software today without much effort. Take a panoramic image and a dedicated viewer for it, such as Panini, with no advanced 3D support at all, and what you are doing is literally the same thing JS and WebGL do with Street View still images, except on a Core Duo or a high-end Pentium 4.
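To be concrete about how little "3D" is involved: the whole trick is mapping a view direction to a pixel in a flat equirectangular image. A minimal sketch of that mapping (function name and angle conventions are my own, not from any particular viewer):

```python
import math

def equirect_uv(yaw, pitch):
    """Map a view direction (radians) to texture coords in an
    equirectangular panorama, with u and v normalized to [0, 1).
    yaw 0 = image center horizontally; pitch +pi/2 = straight up.
    No geometry, no lighting - just a 2D lookup, same as the '90s."""
    u = (yaw / (2 * math.pi)) % 1.0   # longitude -> horizontal axis
    v = 0.5 - (pitch / math.pi)       # latitude  -> vertical axis
    return u, v

# Looking straight ahead lands in the middle of the image:
print(equirect_uv(0.0, 0.0))  # (0.0, 0.5)
```

Per pixel of the viewport, that lookup is all a panorama viewer needs; a Pentium 4 could (and did) do it in software.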
Which is a technology closer in concept to Myst than to Unreal Engine. There's been no progress like what we lived through in the '90s. Back then, years passed as huge steps, the same way a 10-year-old kid had almost nothing in common, taste-wise, with a 14-year-old - and the technology in our lives moved the same way. From Walkmans to CDs, from crappy MS-DOS computers in mid-elementary to Windows 98 by nearly high school.
I feel we haven't really had a different paradigm for decades, and I'm not sure AI even really is one.