Not just that, you always have to be doing less for more gains. Real work is bad work. Shrinkflation is good. I don't know what that is if not a pure scammer mindset.
Few care whether you offer a lifetime warranty, excellent service, or replacement parts if the majority will upgrade in a few years anyway! Mature technologies increasingly become cheaply available as services, e.g. laundry, food, transportation. That further reduces demand on production, as many can get by with the bare minimum and don't need the highest-quality, longest-lasting appliances. Software is even more ephemeral and specialized.
Developing education and training pipelines is a waste of money if the skills you need are constantly changing! There is plenty of "slack" in the workforce, so this works just fine in most cases - somebody will learn what they need to get paid. There are very few fields where a shortage of qualified workers is a real problem.
R&D can be outsourced, or bought, or subsidized by the government via universities, so why do everything yourself? Open source software has muddied the waters even further. Applications have only a limited lifetime before being replicated and becoming free products (a trend only intensified by the introduction of AI), so companies develop services instead.
As technology and knowledge deepen and rapidly grow more specialized, the monolithic corporation becomes much less practical, so companies also need to specialize in order to compete effectively. Going too far in the name of efficiency can destroy core competencies, but moving away from the old model was necessary and rational.
Because some of the problems that companies in very specialized industries work on are so niche that almost no one outside the industry has even heard of them.
Additionally, many problems companies have where research would make sense are not the kind of problems that are a good fit for universities.
Bell Labs' greatest work came out while AT&T was a monopoly. Once the company was broken up in 1984, they started feeling the pain.
When the Lucent spinoff took place, the new entities had no monopoly money to fund unconstrained research, while management's behaviour never changed.
I don't know how BL fared under Alcatel and now Nokia, but haven't heard of anything interesting for years.
Per Wikipedia:
IBM employees have garnered six Nobel Prizes, seven Turing Awards,
20 inductees into the U.S. National Inventors Hall of Fame, 19 National Medals of Technology,
five National Medals of Science and three Kavli Prizes. As of 2018,
the company had generated more patents than any other business in each of 25 consecutive years.

A couple of things about those patents, from a former IBMer who earned quite a few in his time there.
First, not all patents are created equal. Most of those IBM patents are software-related, and for pretty trivial stuff.
Second, most of those patents are generated by the rank and file employees, not research scientists. The IBM patent process is a well-oiled machine but they ain't exactly patenting transistor-level breakthroughs thousands of times a year.
I started at the tail end of one research group's mass exodus. It was like a bomb had gone off; the people left behind were trying to pick up the pieces. In essence, this group had developed a sophisticated new technique, which the company urged them to commercialize. Pivoting to commercialization was a big effort, and not naturally within the group's expertise, but they did it, largely at the expense of their own research productivity, for several years. They even hired programmers (i.e., not people who are primarily computer scientists) and got it done. But just before launch, IBM pulled the plug.
This infuriated the researchers in the group. Keep in mind that career advancement in research is largely predicated on producing new research. In effect, IBM asked people to take a time out and then punished them for agreeing to do it. The whole group was extremely demoralized. Google was the largest beneficiary of this misstep.
I also had a similar, frustrating experience working for Microsoft, so it’s not just IBM, but the same dynamics were at work: bean counters asking researchers to commercialize something and then axing a project as it becomes deliverable.
If AI replaces any role in the company of the future, please let it be the managerial class.
We did that at Meta and Amazon too (for polycarbonate puzzle pieces, with no monetary award at all!). Every now and then, something meaningful came out of it.
Patents do, but in most cases it's trivial patents or patents for a "mutually assured destruction" portfolio (aka, you keep them in hand should someone ever decide to sue you).
That's a fundamental problem with how the Western sphere prioritizes and funds R&D. Either it has direct and massive ROI promises (that's how most pharma R&D works), some sort of government backing (that's how we got mRNA vaccines - pharma corps weren't interested - and how we got the Internet, lasers, radar, and microwaves), or some uber-wealthy billionaire (that's how we got Tesla and SpaceX, although government aid certainly helped).
While we cut back government R&D funding in the pursuit of "austerity", China just floods the system with money. And they are winning the war.
A Nobel in 2026 doesn't carry the same weight as a Nobel in 1955.
This is only fair, because they themselves are firing at 100% all the time IYKWIM ;)
The beancounters have cut every corner on physical products that they could find. Now even design and manufacturing are outsourced to the lowest bidder, a bunch of monkeys paid peanuts to do a job they're woefully unqualified for.
And the end result is just a market for lemons. Nobody trusts products to be good anymore, so they just buy the cheapest garbage.
Which, inevitably, is the stuff sold directly by Chinese manufacturers. And so the beancounters are hoisted by their own petard.
We've seen it happen to small electronics and general goods.
We're seeing it happen right now to cars: manufacturers clinging to combustion engines and cutting corners. Why spend twice the money on a Western brand when its quality is rapidly declining to meet BYD models at half the price?
---
And we're seeing it happen to software. It was already kind of happening before AI; so much of software was enshittifying rapidly. But AI is just taking a sledgehammer to quality. (Setting aside whether this is an AI problem or a "beancounters push everyone into vibecoding" problem.)
E.g. Desktop Linux has always been kind of a joke. It hasn't gotten better, the problems are all still there. Windows is just going down in flames. People are jumping ship now.
SaaS is quickly going that way as well. If it's all garbage, why pay for it? Either stop using it or just slop something together yourself.
---
And in the background of all this, something ominous: companies can't just pivot back to higher quality after they've destroyed all their in-house knowledge. So much manufacturing knowledge is just gone; starting a new manufacturing firm in the West is a staffing nightmare. Same story with cars: China has the EV knowledge. And software's going the same way. These beancounters are all champing at the bit to fire all their devs and replace them with teenagers in the developing world spitting out prompts. They can't move back upmarket after that's done.
Even when the knowledge still lives on, when the people with the required skills have simply moved to other industries and jobs, who's going to come back? Why leave your established job for your former field, when all it takes is the manager or executive in charge being replaced by another dipshit beancounter for everyone to be laid off again?
The knowledge isn't the problem. It can be quickly regained, and the progress of science and technology often offers new paths to even better quality, which limits the need to recover the details of old processes.
The actual problem is that there is no market to move up to anymore. Once everyone is used to garbage being the only thing on offer, and adjusts to cope with it, you cannot compete on quality anymore. Customers won't be able to tell whether you're honest or just trying to charge suckers for the same garbage with a nicer finish, like every other brand that promises quality. It would take years of effort and low sales to convince customers you're the real deal, which (as beancounters will happily tell you) you cannot afford. And even if you could, how are you going to convince people you won't start cutting corners again a few years down the line? In fact, how do you convince yourself? If it happened once, if it keeps happening everywhere across the whole economy, it's bound to happen to your business too.
Desktop Linux has gotten better, though much of the improvement happened decades ago. I believe the first person to prematurely declare "the year of Linux on the desktop" was Dirk Hohndel in 1999: https://www.linux.com/news/23-years-terrible-linux-predictio...
And speaking as someone who was running desktop Linux in 1999, I remember just how bad it was. Xfce, XFree86 config files, and endless messing around with everything. The most impressive Linux video game of 2000 was Tux Racer.
But over the next 10 years, Gnome and KDE matured, X learned how to auto-detect most hardware, and more and more installs started working out of the box.
By the mid-2010s, I could go to Dell's Ubuntu Linux page and buy a Linux laptop that Just Worked, and that came with next day on-site support. I went through a couple of those machines, and they were nearly hassle free over their entire operational life. (I think one needed an afternoon of work after an Ubuntu LTS upgrade.)
The big recent improvement has been largely thanks to Valve, and especially the Steam Deck. Valve has been pushing Proton, and they're encouraging Steam Deck support. So the big change in recent years is that more and more new game releases Just Work on Linux.
Is it perfect? No. Desktop Linux is still kind of shit. For example, Chrome sometimes loses the ability to use hardware acceleration for WebGPU-style features. But I also have a Mac sitting on my desk, and that Mac also has plenty of weird interactions with Chrome, ones where audio or video just stops working. The Mac is slightly less shit, but not magically so.
And yet I run it every day, and it's by FAR the most enjoyable platform and tooling to use (for me).
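(Aside: the WebGPU dropout mentioned a couple of paragraphs up can at least be detected from page script. Here's a minimal sketch; the `nav` parameter shape and the returned strings are my own invention for illustration - the real API surface is just `navigator.gpu` and `requestAdapter()`, which resolves to null when no usable adapter is available.)

```typescript
// Minimal sketch of probing WebGPU availability.
// `nav` stands in for the browser's `navigator` object so the function can
// also be exercised outside a browser; its shape is a simplified assumption
// modeled on the real `navigator.gpu.requestAdapter()` API.
type GpuLike = { requestAdapter: () => Promise<object | null> };

async function describeGpuSupport(nav: { gpu?: GpuLike }): Promise<string> {
  if (!nav.gpu) {
    return "webgpu-not-exposed"; // API surface missing entirely
  }
  // requestAdapter() resolving to null means no usable adapter, which is how
  // "acceleration silently gone" shows up even when navigator.gpu exists.
  const adapter = await nav.gpu.requestAdapter();
  return adapter ? "adapter-available" : "no-adapter";
}
```

In a real page you'd call `describeGpuSupport(navigator)`; checking for a null adapter (not just for `navigator.gpu`) is the part people tend to miss.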