SSDs provided a huge performance bump to each individual computer, but they trickled into market saturation over a generation or two of computers, so you'd effectively be running the same software in a much more responsive environment.
When SSDs became mainstream, yes, I agree they had a bigger impact than any CPU speed increases at that particular time.
But back in the double-digit MHz days of CPU speeds, upgrading your CPU was king when it came to better performance, and I'd argue that effect was more pronounced than the HDD-to-SSD transition was. It's hard to convey what huge jumps CPUs were making during that time period, and how big a difference it made.
I also remember a time, somewhere in the middle of that, when adding more RAM could be a bigger boost than a CPU upgrade. But back in the 80s and 90s (and prior, but I have no personal experience with that), there was only so much RAM you could add, and the CPU was still often what was holding you back.
But CPUs just haven't been the bottleneck for most home user workloads for a long time now. These days when I buy a new laptop, I certainly want the best CPU I can get, but I'm more concerned about how much RAM I can put in it, and the iGPU's specs. (SSDs are a given, so I don't need to think much about it.)
SSDs booted faster and launched programs faster and were a very nice change, but they weren't that same sort of night-and-day 80s/90s era change.
The software, in those days, was similarly making much bigger leaps every few years. 256 colors to millions, resolution, capabilities (real time spellcheck! a miracle at the time.) A chat app isn't a great comparison. Games are the most extreme example - Sim City to Sim City 2000; Doom to Quake; Unreal Tournament to Battlefield 1942 - but consider also a 1995 web browser vs a 1999 one.
The drag down of swapping became almost a non-issue with the SSD changeover.
I suppose going from a //e to a IIgs was that kind of leap, but that was more about the whole computer than the CPU.
Now I have to say, swapping to an SSD on my Windows machines at work was far less impressive than going to SSD on my Macs. I sort of wrote that off as all the antivirus crap that was running. It was very disappointing compared to the transformation on Mac. On my Macs it was like I suddenly heard the hallelujah chorus when I powered on.
Also, going from Sim City to Sim City 2000 was pre-bloat. Over the course of five years, the new version was significantly better than the original, but they both target the same 486 processor generation, which was brand new when the original SimCity was released, but rather old by the time SimCity 2000 was released. Another five years later, Sim City 3000 added minimal functionality, but required not just a Pentium processor, but a fast one.
I guess what I'm getting at is that a faster CPU means programs released after it will run better, but faster storage means that all programs, old and new, will run better.
These days, we value developer productivity over performance optimization, so we have stuff like Electron apps. The reason behind it is that CPUs (and RAM quantity, for the most part) are so far ahead of regular desktop applications that it doesn't matter. In the 80s and 90s, the hardware could barely keep up with decently-optimized software that wanted to do anything interesting.
I think there's a difference between bloat and actually useful features or performance.
For example, I started making music with computers in the early 90s. They were only powerful enough to control external equipment like synthesizers.
Nowadays, I can do everything I could do with all that equipment on an iPad! I would not call that bloat.
On the other hand, comparing MS Teams to say ICQ, yeah, a lot of that is bloat.
Tell that to ScreamTracker!
And we were mostly ripping those samples from records on cassettes and CDs, or other mods.
https://www.c64-wiki.de/images/f/f1/rockmon3.png
Or also at https://www.youtube.com/watch?v=roBkg-iPrbw&t=400s in the video already linked below. And yes, I had to type in that listing.
For me they were.
I still remember the first PC I put together for someone with a SSD.
I had a quite beefy machine at the time and it would take 30 seconds or more to boot Windows, and around 45s to fully load Photoshop.
Built this machine for someone with entirely low-end (think "i3", not "Celeron") components, but it was more than enough for what they wanted it for. It would hit the desktop in around 10 seconds, and Photoshop was ready to go in about 2 seconds.
(Or thereabouts--I did time it, but I'm remembering numbers from like a decade and a half ago.)
For a _lot_ of operations, the SSD made an order of magnitude difference. Blew my mind at the time.
So it was the only way to get the kind of visceral improvement in user experience that CPU and platform upgrades delivered in the mid 90s to very early 00s.
The experience of just slapping a new SSD into a 3-year-old machine gave a different generation of computer nerds a similar thrill.
Nothing could really match the night and day difference of an entire machine being double to triple the performance in a single upgrade though. Not even the upgrade from spinning disks to SSD. You'd go from a game being unplayable on your old PC to it being smooth as butter overnight. Not these 20% incremental improvements. Sure, load times didn't get too much better - but those started to matter more when the CPU upgrades were no longer a defining experience.
Would you take the SSD and a 500MHz processor, or a 2GHz dual-core with a 7200 RPM or 10,000 RPM HDD? "Some operations are faster" vs. "every single thing is wildly faster" from the every-few-years quadrupling-plus of CPU performance, memory amounts, disk space, etc.
(45sec to load Photoshop also isn't tracking with my memory, though 30s-1min boot certainly is, but I'm not invested enough to go try to dig up my G4 PowerBook and test it out... :) )
Never witnessed anything before or after with that jump in specs.
I'd say software never really "caught up" to the general slowness that we had to endure in the HDD era either. Even my 14 year old desktop starts Word in a few seconds compared to upwards of 60s in the 90s.
The closest I've seen is the shitty low-end Samsung Android tablet we got for our kids. It's soooo slow and laggy. I suspect it's the storage. And that was actually an upgrade over the Amazon Fire tablet we used to have, which was so slow it was literally unusable. Again, I suspect slow storage is the culprit.
The only thing more impressive than hardware engineers delivering continuous massive performance improvements for the past several decades is software engineers' ability to completely erase them with more and more bloated programs that do essentially the same thing.
One of the co-signers of the Agile Manifesto had previously stated that "The best way to get the right answer on the Internet is not to ask a question; it's to post the wrong answer." (https://en.wikipedia.org/w/index.php?title=Ward_Cunningham#L...) I'm convinced that the Agile Manifesto was an attempt to make an internet post of the most-wrong way to manage a software project, in hopes someone would correct it with the right answer, but instead it was adopted as-is.
The mangling of JavaScript to fit through every hole seems to be the biggest mistake made in modern programming, and I'm not sure what even keeps it going aside from momentum. At first it regained ground because Flash was going EOL, but now?
I feel this. Humanity has peaked.
Nowadays, you really don't get these magical moments when you upgrade, not on the device itself. The upgrade from Windows 10 to Windows 11 was basically just more ads. Games released today look about as good as games released 5-10 years ago. The music-making or photo-editing program you installed back then is still good. Your email works the same as before. In fact, I'm not sure I have a single program on my desktop that feels more capable or more responsive than it did in 2016.
There's some magic with AI, but that's all in the cloud.
Windows 11, Discord: 4GB of RAM is not enough to run them well.
FYI, Kopete allowed inline LaTeX, YouTube videos (low res, ok, 480p maybe, but it worked), emoticons, animations, videoconferencing, themes, maybe basic HTML tags and whatnot. And it ran fast.
"Bananas" core counts gave me the same experience. Some years ago I moved to a Ryzen Threadripper and experienced similar "Wow, compiling this project is now 4x faster" or "processing these TBs of data is now 8x faster" moments, but of course it's very specific to workloads where concurrency and parallelism are thought of from the ground up, not a general 2x speedup in everything.
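That "specific workloads" caveat is basically Amdahl's law: the serial fraction of a program caps what any core count can buy you. A minimal sketch (the fractions and core count here are illustrative assumptions, not measurements):

```python
# Amdahl's law: speedup from n cores when a fraction p of the work
# is parallelizable. The serial remainder (1 - p) limits the speedup.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# A build where 95% of the work parallelizes, on 32 cores:
print(round(amdahl_speedup(0.95, 32), 1))  # ~12.5x, not 32x

# A typical desktop app where only 30% parallelizes:
print(round(amdahl_speedup(0.30, 32), 1))  # ~1.4x
```

Which is why Threadripper-class core counts feel transformative for compiles and data crunching but barely move the needle for everyday apps.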
About a week ago, completely out of the blue, YouTube recommended this old gem to me: https://www.youtube.com/watch?v=z0jQZxH7NgM
A Pentium 4, overclocked to 5GHz with liquid nitrogen cooling.
Watching this was such an amazing throwback. I remember clearly the last time I saw it, which was when an excited friend showed it to me on a PC at our schools library. A year or so before YouTube even existed.
By 2005, my Pentium 4 Prescott at home had some 3.6GHz without overclocking, 4GHz models for the consumer market were already announced (but plagued by delays), but surely 10GHz was "just a few more years away".
But with longer pipelines comes larger penalties when the pipeline needs to be flushed, so the P4 eventually hit a wall and Intel returned to the late Pentium 3 Tualatin core, refining it into the Pentium M which later evolved into the first Core CPUs.
https://www.tomshardware.com/pc-components/cpus/core-i9-1490...
The CPU and heatsink were fully integrated into what looked like an NES cart, with an integrated fan and everything. It was not really possible to separate the CPU and the heatsink, as the locking mechanism keeping the cart in place on the motherboard interfaced with the heatsink assembly.
So I'm a little dubious of that no-heatsink claim.
But that was several years after the book cited by the GP was published (1994, shortly after the release of the original Pentium).
It took a long time before I felt a need to improve my PC's performance again after that.
I remember loading up Doom, plugging in my shitty earbuds with their barely-long-enough cable, and hearing the "real" shotgun sound for the first time. Oo-wee
Between IPC improvements (~50- to 100-fold) and clock speed increases (1000-fold alone), I estimated that single-thread performance has increased on the order of 50,000x to 100,000x since the 4.77 MHz 8088.
In human terms this is like one minute compared to one month!
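For anyone who wants to sanity-check the minute/month comparison, here's the back-of-the-envelope arithmetic (the 50,000x-100,000x range is the estimate above, not a measurement):

```python
# One minute of modern single-thread work, scaled back to how long
# the same work would take at the estimated 8088-era speed.
minutes_per_day = 60 * 24

for speedup in (50_000, 100_000):
    days = speedup / minutes_per_day  # one modern minute -> this many 8088 days
    print(f"{speedup:>7,}x -> ~{days:.0f} days")
```

The low end works out to roughly 35 days, so "one minute vs. one month" is right on for 50,000x; at 100,000x it's more like two months.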
I didn't feel any huge speed boosts like that until the M1 MacBook in 2020.
I can see why you wouldn’t consider it as impactful if you weren’t into gaming at the time.
Up until the 486, the CPU clock and bus speed were basically the same and topped out at about 33MHz (IIRC). The 486 started the practice of running the CPU at a multiple of the bus speed, e.g. the 486DX2/66 (66MHz CPU on a 33MHz bus) and the 486DX4/100 (100MHz CPU on a 33MHz bus; a 3x multiplier despite the name). And that's continued to this day (kind of).
But the point is the CPU became a lot faster than the I/O speed, including memory. So those "overdrive" CPUs were faster overall, but not 2-4x faster.
Also, in terms of impact, yeah, there was a massive increase in performance through the 1990s, but let's not forget the first consumer GPUs, namely the 3dfx Voodoo and later NVidia and ATI. Oh, Matrox Millennium, anyone?
It's actually kind of wild that NVidia is now a trillion dollar company. It listed in 1998 for $12/share and adjusted for splits, Google is telling me it's ~3700x now.
Apple Silicon finally allowed the chassis to house an appropriate cooling solution, too. The machines are much quieter than the equivalent Intel laptops when dissipating the same power levels.
Apple’s power efficiency was a great bump forward, but the performance claims were a little exaggerated. I love my Apple Silicon devices but I still switch over to a desktop for GPU work because it’s so much faster, for example.
Apple had that famously misleading chart at launch showing their M1 GPU keeping pace with a flagship nVidia card. In practice they're not even close to flagship desktop accelerators, unfortunately.
They have excellent idle power consumption though. Great for a laptop.