I think consumers have little power here. Our economic system fundamentally chooses to reward such behaviour. Until we change that, the power will always be with these kinds of companies.
Perhaps governments could levy punitive fines in such situations. But that seems like a band-aid (and ripe for corruption). Ideally we'd have structural change that prevents this behaviour in the first place. Perhaps worker representation on company boards. Or progressive corporate taxation that more strongly encourages smaller companies and more competition.
Most consumers are unwilling to take an option that they perceive as inconveniencing them more than getting screwed by the company inconveniences them.
The reality is that companies know they can get away with crap because they all get away with crap. And because they all do it, consumers are powerless.
This is why regulation isn’t the bad thing that many HNers seem to recoil at. The real problem with regulation is when it’s defined by lobbyists rather than consumer groups. But even then, it’s really no different to the status quo where businesses are never held accountable.
The TV I have has never had an antenna cable plugged in, or an internet connection. From day one, it’s been a large HDMI monitor for an Apple TV, a Nintendo Switch and a C64 Maxi, with some other devices plugged in from time to time.
It IS possible to ignore the TV’s software most of the time (mine, luckily, isn’t intrusive at all) and use it as a conduit for a much cheaper and easily replaced (or hacked) device.
I remember how surprised the engineers at [manufacturer redacted] were when I told them they had everything needed to turn their TVs into thin clients and meeting room monitors right in their Linux firmware, just a compile away. I’d have totally loved a 35” X terminal in 2008, with Ethernet and a couple of USB ports.
From personal experience, it really really is barely even an inconvenience. Especially in a world where YouTube exists and is accessible for free from a desktop computer. There's barely been anything good on TV for decades, and the older stuff probably only seemed good because of the difficulty of publishing any competition.
But I also know a hell of a lot of people who still massively prefer watching content the traditional way. As in, not just TV shows, but on a TV too. And I have no more right to tell them how to consume video content than they do to tell me how I should consume the stuff I want to read.
I sometimes suggest they’d do themselves a favor if they stopped watching Fox News and reality TV, that life is much better without that.
A disturbing proportion of my family spend more than half of their free time watching television (typically while doom scrolling tiktok). They don't "need" TVs - they need to find interests.
Besides, it’s not like TVs are the only industry where consumer choice is an illusion. You see the same problem in a lot of sports (I used to fence and there was a great deal of pressure to buy equipment from one specific manufacturer which charged literally 4x the price for their gear).
And it’s not just hobbies either. I need a car for family duties and there are plenty of parts on it that can only be replaced by an authorised dealer.
Nobody dictates that. What we do is suggest there might be more rewarding things to do with their time off than watching TV between the dopamine hits from TikTok.
Are there some things I would struggle with if suddenly there were issues? Sure. I had to significantly increase my internet spend because of the (much) cheaper option going to complete shit. I require the internet for my career but unless the entire world collapses I doubt I'll run into any true blocker that would prevent me from using it for work.
Most people are just afraid to change their lives substantively. I am too, but I'm also willing to do it for causes I believe in.
My point is that your list is one list which you are making, but someone else could look at your life and make a different list. Your argument only goes as far as you can extend it into your own life. If you really cared about something's place in your life, you wouldn't classify it as a convenience, so you are conveniently applying your own classifications to other people's lives, which you don't have a right to do.
This is why we have democratic institutions and authority -- to make these limits about what is tolerable and intolerable -- not what people's conveniences are.
Another instance where companies can have more leverage than consumers is gaming. Console exclusives are a thing because they work; not giving consumers the option to play Pokemon on anything but the Nintendo Switch drives Switch sales. Microsoft is better off working with other gaming companies to ensure Windows keeps being dominant, than building an OS to gamers' preferences.
I think time has proven many times that consumers aren't always good regulators for the market. The market is best regulated by organized entities.
Sure, but I also think that a lot of the issues with Windows 11 don't really matter much if it's just used as a work OS. For example, I refuse to upgrade my home PC to 11, because I don't want Microsoft to spy on me; however, when I am using my work computer, I know that I am already being spied upon, so that's not a concern for me.
There is a whole ecosystem that needs to move before they can move.
When you insist that the people comprising the system have no agency, you're the one perpetuating it
Analyzed well here: https://yalelawjournal.org/pdf/e.710.Khan.805_zuvfyyeh.pdf
They aren't a majority in any other market segment.
Too many markets are utterly dominated by one or two big players. I know it’s a tricky problem because market share is hard to define (Does Amazon have 80% share of e-commerce? Or 30% share of all retail?) but I think we would be better off if there were a more aggressive set of rules about anti-competitive behavior that automatically applied to these huge firms, which didn’t rely so much on subjective judgment.
Don't buy their products, and tell your friends
“Boy, I hate operating systems from evil gigantic corporations that constantly spy on us. I know the solution, let’s use a Google product!”
I'll give you five guesses which OS I never booted into.
I used to do a lot of document and Office work. If you had told me that 20 years in the future MS would still be around, automagic piracy enabled coding bots were a thing, and people were having problems because the buttons in Office don’t work, I would’ve flagged the third as unbelievable.
The only way that stops is by having enough people leave that they change their behavior, and it's not sufficient to switch to the competition that is operating under the same perverted incentives under the same system with the same failure modes. No Windows, no Mac, no Chromebooks, no enshittified corporate quagmire of awfulness and despair.
The solution is simple - use Linux. Set your family up with Linux.
It's the year of the Linux desktop; it's never been easier or better, and it's never been more important to make the leap.
The family computer is set up to boot into Ubuntu; booting into Windows 11 is the exception (games, iTunes).
Consumers have the final say; our economic system fundamentally runs on consumer spending. (Ok, save for the most recent year(s) of Mag7 AI buildout. But generally that's the case for the USA economy.)
We have to stop taking out our wallet and just accepting things like sheep. (nearly) Every one of the "scrapped" computers could have run a *nix OS and been a middle finger to microsoft.
Nearly 1 billion PCs have stayed on Windows 10, 42% of the global desktop marketshare is still on 10 despite EOL. Linux has been showing consistent growth on the steam hardware survey as well, and time will tell but I have a feeling the MacBook Neo is going to put another nail in Microsoft's consumer coffin.
The problem for us is that's such a tiny margin of Microsoft's customer base. They aren't a consumer company anymore. For Microsoft to feel the pain, we need the big legacy enterprises to start ripping out Windows (and by extension, rip out Windows Server, Azure, M365).
We here on HN are in a unique position to help, with many of us having influence on or even the authority to make technical decisions for the companies we work for. It's not enough to stop buying Microsoft at home, we all need to stop buying Microsoft at work.
Microsoft has largely stopped asking consumers for money. The last paid upgrade was Windows 8, IIRC. Since then, Microsoft wants consumers to upgrade, so it's free, with full screen prompts at login, and sometimes the 'no thanks' button just does it anyway.
Microsoft sells consumer OSes to OEMs. I haven't been looking, but I assume they don't allow OEMs to install Windows 10 Home anymore; and maybe not even Windows 10 Pro. So when consumers buy a new PC, they're getting Windows 11. The only Linux option at most stores is Chrome OS, which Google is shutting down, and is just a browser for most users (it's a useful product! a lot of users just need a browser; but it's not a platform of empowerment)
The present: Nobody got fired for buying Microslop.
Only if consumers have viable alternatives to choose from. If they don't then what are they supposed to do?
I agree it's not as easy as pre-installed, but it definitely is viable.
Individual consumer action does not a monopoly break.
It’s similar to that idea that floats around that $0 lost to fraud is not the optimal amount. If you over-index on removing fraud from your system, eventually you will spend more on monitoring and removing it than you save on the fraud itself. That is time you could have spent building more things that make money.
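That tradeoff can be sketched with a toy model. The numbers and the exponential-decay assumption below are made up purely for illustration; the point is only that the cost-minimizing prevention budget sits strictly between "spend nothing" and "eliminate all fraud":

```python
import math

def total_cost(spend, base_fraud=100.0, efficiency=20.0):
    """Total cost = prevention spend + remaining fraud losses.
    Toy assumption: losses decay exponentially with spend."""
    return spend + base_fraud * math.exp(-spend / efficiency)

# Scan candidate budgets from 0.0 to 200.0; the minimum is at neither extreme.
budgets = [i * 0.5 for i in range(401)]
best = min(budgets, key=total_cost)
print(best, round(total_cost(best), 2))
```

With these made-up parameters the optimum lands at a moderate, nonzero budget: spending nothing costs the full fraud loss, and spending enough to drive fraud to zero costs far more than the fraud ever did.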
The ecosystem is a beast of its own, and the optimal state of the ecosystem is /not/ the optimal state for any actor within it.
As it is now, buying a laptop in a store is a "pick your poison" situation.
Recently, I decided to start making music again after a decade of hiatus. I got a nice audio interface and some hardware which can do nifty things. The catch?
None of the supporting software for my hardware runs on Linux. I either need to run a VM to configure these things, or use the macOS versions of the software. I chose the latter, because it's not practical to pass all the devices through just to change some parameters and then give them back to Linux. I also don't use Wine; I don't want to install something that big on my daily driver.
While Linux is great for many, many things, there are some things still sorely lacking in the ecosystem. Why can't I adjust monitoring/routing on a class-compliant audio device? Why isn't my effect processors' USB protocol open, so I could play with their parameters from Linux?
We still have a long way to go in some areas.
For photography and graphic arts, Linux can handle many if not most of the work (I use Digikam and Darktable with great success, for example), yet when it comes to audio for example, it falls short due to a thousand papercuts.
You don't have to be everything to everyone. You just have to satisfy a need.
Yet, Darktable allows me to process my RAWs to a point which I like. Similarly, my audio equipment allows me to create some music which I like, too.
I didn't push Darktable to professional levels, but I believe it can match bigger tools for what I want to do with it. I don't do photo manipulation, for example. Just process RAWs. I expect the same from my audio equipment for my music endeavors.
https://www.reddit.com/r/linux_gaming/comments/1qdgd73/i_mad...
https://support.focusrite.com/hc/en-gb/articles/208530735-Is...
I haven't actually tested it, but it seems like it works for people, and it's solid enough to have the kernel component in the kernel. I found it while researching a possible move with my Vocaster One.
If it's one of those and class compliant, you might be able to access all of it through alsamixer or one of the many frontends (maybe too many, maybe one for you): https://en.wikipedia.org/wiki/Alsamixer
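A class-compliant device should also be visible to a script, since ALSA exposes the card list through procfs; here's a minimal sketch (the helper name is made up, and the actual mixer adjustment would still go through alsamixer/amixer):

```python
def list_alsa_cards(path="/proc/asound/cards"):
    """Return (index, name) pairs for ALSA sound cards, or [] if
    the procfs interface isn't available (e.g. on non-Linux systems)."""
    cards = []
    try:
        with open(path) as f:
            for line in f:
                line = line.strip()
                # Card lines look like: "0 [Vocaster ]: USB-Audio - Vocaster One"
                if line and line[0].isdigit():
                    idx, rest = line.split(" ", 1)
                    cards.append((int(idx), rest.strip()))
    except FileNotFoundError:
        pass
    return cards

# A class-compliant USB interface shows up here with no vendor driver at all.
print(list_alsa_cards())
```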
The Audient situation appears to be a proper nightmare realm with non-class compliant stuff, but there is a tool with a list of caveats longer than you might want to deal with: https://github.com/TheOnlyJoey/MixiD
It's more of an escape hatch for the best-case scenario than a solved problem, but it's something.
I didn't expect Audient to work, actually.
The problem is that I can't get one in a store. It's a product that is only available to those in the know.
In the ideal situation a lay-person would be in a store, and there would be two versions of the same machine, one with ads on the lock screen, one without.
I made a decision I didn't want to make: I bought the Macbook Pro. If I was retired or completely cashflow positive in my endeavors, I'd pick the machine I want.
That being said, there were so many ecosystem, hardware, power management, GPU throughput and compatibility advantages with the Macbook Pro at the moment, and given that I'm firmly in founder/launch mode, I went with the safety option. My biggest risk is Apple making another anti-consumer choice. I don't see the ads they've started pumping into their product, but I do miss GNOME.
I made a work decision, not a technology decision. That said, Windows never entered the equation.
My “nice” mechanical keyboard is sitting on my old desktop, which is now a container store. It’s easier to not go back and forth.
And that doesn't even get into gaming.
You may have to spend extra work to get things running; but once it's done, it runs forever without a hitch.
I know, I use Slackware. It's regarded as a very technical distribution and some manual configuration is expected but once it's done, it's done. I have configs from > 20 years ago that I still use without a hiccup.
Uptimes of half a year are not uncommon, the record so far is 400+ days. I just don't shut it down unless there's a serious kernel or hardware upgrade.
It just works: non-kernel updates, stuff being plugged/unplugged. A couple of times I swapped SATA HDDs without turning off the power (which is simple, since they're hotplug by design; just don't drop the screws onto the motherboard, and don't forget to unmount and detach first).
Now, when I used to build and test some cross-builds for Windows (in the Win7-Win10 era), I had another dedicated Windows machine for that. And even though I tried to make it as stable as possible, it was a brittle piece of junk in comparison.
So in my experience, yes, Linux is a fundamentally different usage philosophy: you don't need to think about what crap Microsoft will break your workflow with next Tuesday.
To solve the chicken/egg problem, the GNU/Linux distributions should offer a very stable (in particular, binary-stable) interface for writing applications (including GUI applications) on GNU/Linux - like WinAPI on Windows. By "stable" I mean "stable for at least 20-25 years". This interface must, of course, work on all widespread GNU/Linux distributions.
"Build musl libc statically, set up a toolchain to use it, build libc++ for that toolchain, get libwayland, link that statically (which their build scripts don't support, roll your own), get xcb,libxau,libxwhatever and build those statically as well, and implement TWO platform layers, dynamically checking for wayland support. There's like 5 different ways to set your window icon. Yes, you need to implement all of them. Now for loading the graphics API......."
On Windows it's a call to RegisterClassW followed by CreateWindowW.
An operating system is a style of thinking about your work. WINE is a way to get Windows applications to run (by now run decently) under GNU/Linux. These Windows applications are nevertheless foreign bodies in the whole kind of thinking which GNU/Linux is built around.
It's sad because it's true.
I guess you want a Mac. That's fine.
I value freedom and things not mysteriously breaking and functionality not disappearing, and am quite happy investing the time and knowledge upfront, so I use Linux.
And then there are people who want to have a system which works out of the box initially and who don't want to learn anything and don't mind it breaking later, and they choose Windows.
To each their own.
I'm interested in where that estimate and number are coming from. And I'd like to point out that I don't see nearly as many people pushing back against, say, macOS for "not being Windows", despite the fact that the same issue would be there. I wonder why Linux gets special treatment in that regard, when modern distros make usage very accessible.
> And that doesn't even get into gaming.
Gaming on Linux works very well. And if something doesn't, it's usually by choice (e.g. BattleEye customers not enabling it on Linux) or by sheer incompetence / malevolence (e.g. EA Games and their shitty EA App that breaks often even on Windows, and even worse on Linux in a Wine environment).
Mac users paid money for their choice, so ironically they are more forgiving of its inability to run some Office VBA macros, work with that random MST dual-display dongle, or whatever. They rationalize their expensive purchase as a good decision: it's good enough, and the issues they encounter are solvable, like spending 5 times as much on a Thunderbolt dock to do what the $30 MST dongle did, or learning some entirely new $10 app to do what they did on Windows with something else.
Just as nobody is pushing back against Linux when it comes to server software, or pushing back against PlayStation when it comes to games.
Hard disagree. Not that it has to be FOSS, but you have a product that is predatory towards you and you refuse to change your ways.
Leaving an abusive relationship is hard, but sometimes you have to do it.
And honestly it seems like you refuse to learn even the smallest bit about human nature.
Very, very few people want to "learn" how to use their computer. Walk into a room of 100 graphic designers who have spent the last 20 years using Photoshop exclusively and put GIMP in front of them, and at least 98 of them are going to say what the hell is wrong with you, they have work to do, take this uncanny valley garbage and get out of here.
I'm typing this on a System76 laptop right now but I understand expecting people to use Linux writ large is ridiculous.
I see this point being missed over and over again in this thread. To people like you and I the computer is often the entire point. To normal people it's a tool. It exists to get the job done so they can move onto something else.
The solution that requires the least effort is objectively the best solution. Most of the time that still means Windows, and it won't change until the required level of effort changes.
They aren't looking and they aren't interested in looking. At this point they have no one else to blame.
The tin foil hat interpretation of this is that it is all by design, by whatever cabal runs everything, to subjugate the masses and control them directly or indirectly. The generous interpretation is closer to an extreme version of Sturgeon's Law[0] where this is just a natural, even inevitable, byproduct of most things being garbage. Like most things the truth is almost certainly somewhere in the middle.
[0] "90% of everything is [crud/crap/shit]" https://en.wikipedia.org/wiki/Sturgeon's_law
It is a solution. Once you do it, your problem is solved, that makes it the solution. If you aren't willing to go with that, you can stay with Windows and just accept the constant abuse.
As for gaming, I've been on Linux for two years now and I haven't had a single game not work.
Perhaps ironically, Wine may be the best stable API on Linux. I'd like to see a concerted and well-funded effort to make Wine run certain Windows applications well. We might not be able to replace the Adobe Suite short-term by a FOSS alternative for most of its users, but we might be able to get Wine to run the Adobe Suite, Affinity Suite, and whatnot well enough to make it possible to switch and keep running these applications.
It actually is. It may not be the best solution, but it absolutely is one of the available solutions. Not being able (or willing) to learn (and adjust) as needed is part of the reason we are here.
I am not being nitpicky here. Reasonable people don't hope things will change; instead, they change things they can.
I suspect that most people don't run much software at all outside of their web browser and wouldn't notice any difference between using chrome in windows and using chrome in linux. Gaming is not the barrier it used to be either.
If they want to edit a photo, and they're used to Photoshop, then Photoshop will be lower effort than a competitor just as Photoshop is lower effort than darkroom editing film. Competitors have to be lower effort or offer significantly better features than incumbents. Product cost is a part of the effort needed to use that product, but far from the entire thing.
It's not 2016 anymore, you don't have to switch to LibreOffice if you need an office suite of apps.
That obviously would be preferable, but if you're an avid Microsoft ecosystem user, just use WinApps. It's simple enough to the point that a child could use it.
Linux is an important operating system, but anyone under the delusion that it is desktop ready right now needs to actually watch someone use it. I say this not because I hate linux, but because I love it. I want someone to make it usable for a desktop, and people claiming that it is usable right now are not helping that.
I strongly disagree with this; I believe that an OS should be whatever the user needs it to be. In my case, I am a power user that loves the command line, and while I agree that I may not represent the majority of users, I do not care for your assertion that my way of doing things is somehow invalid.
If we had a giant influx of computing illiterate people, the platform would enshittify. They would move towards android-type lock downs and user hostile stuff. More and more binary-only proprietary software, they might fork systemd etc and make sure that the proprietary binaries only run under certain unmodified setups etc. Of course there would be escape routes to various other, nonpopular distros, so the skilled people would be fine again, but there would be a barrier again.
I think this is fundamental. Once the general public starts entering an arena, it won't stay the same. Eternal September etc.
It's hard to figure out what the proper level of abstraction for this is. And while I strongly resisted it originally, I am becoming more open to the argument that many people don't need to "know" what a file is to benefit from their computers - that as long as they can "save" their work, and "send" it from one app to another, they'd be able to get all the productivity that they are looking for.
Without the helpful abstraction of files and folders, all we'd have are bytes stored at various addresses or sectors of the hardware.
I agree with most everything else you said, but would slightly push back on that. I actually quite like the idea of non-hierarchical blob storage searchable via arbitrary indexed metadata, as well as the idea of content-addressable storage (e.g. with magnet links). While folders are an elegant abstraction, I really feel that we shouldn't be beholden to it.
On that note, I remember how absolutely ecstatic I was when I first set up Sublime Text and discovered that unsaved editor tabs always reliably survive restarts; it essentially flips the script, whereby I've lost multiple saved files by accidentally deleting them, but I've never accidentally lost work in unsaved tabs, and I've never actually had any interest in figuring out where and how these tabs get persisted - it just works.
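Sublime's actual persistence mechanism isn't documented here, but the idea is simple enough to sketch: write the buffer to a session file on every edit, atomically, so a crash or restart never loses unsaved work (all names below are hypothetical):

```python
import json
import os
import tempfile

class SessionBuffer:
    """Toy model of an editor buffer that persists every edit to disk,
    so unsaved work survives crashes and restarts."""

    def __init__(self, session_path):
        self.session_path = session_path
        self.text = ""
        # On startup, recover whatever the last edit left behind.
        if os.path.exists(session_path):
            with open(session_path) as f:
                self.text = json.load(f)["text"]

    def edit(self, new_text):
        self.text = new_text
        # Write to a temp file, then atomically swap it in, so the
        # session file is never left half-written mid-crash.
        tmp = self.session_path + ".tmp"
        with open(tmp, "w") as f:
            json.dump({"text": self.text}, f)
        os.replace(tmp, self.session_path)

# Simulate an unclean exit: a fresh instance recovers the unsaved text.
path = os.path.join(tempfile.mkdtemp(), "session.json")
SessionBuffer(path).edit("unsaved draft")
recovered = SessionBuffer(path)
```

The atomic-replace step is what makes it "just work": at any instant the session file is either the old complete state or the new complete state, never garbage.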
The people doing the former use computers for ‘real work’. They are using a computer as an end in itself, care about operating systems and have strong opinions about systemd. The people doing the latter couldn’t give two shits about any of that and just want to get their presentation finished on time.
Problem is, both sets of people have to use the same machines. It’s also why software like GIMP will never become widely adopted in professional environments because it’s designed for a completely different userbase.
Your critique should be channelled in a productive direction: ask the maintainers why this is not packaged yet. https://repology.org/projects/?search=winapps https://pkgs.org/search/?q=winapps
Why is that argument always applied against Linux, and never against for instance macOS, which also can't run Windows software?
There's a certain type of technical user that gets joy from coming up with arguments, good, bad, or just pulled out of their butt, explaining why people can't use Linux. I'm not going to spend my day trying to understand people's unusual preferences.
It is getting tiring. I don't say Linux is perfect, but KDE has been better than Windows for years, Linux doesn't bit-rot like an average Windows install, and Linux is in practice surprisingly more stable. But no-no-no, it can't be Linux this time, again. Quick... ehm, "there is a piece of software that only works on Windows". Have you ever thought the reverse holds too, but times 1000?
If you call yourself an IT professional, you only run spyware.exe in a VM or on a box with all networking gear ripped out, and you don't make stupid excuses.
All of these issues go away with Mac and Windows. I'm not giving up on Linux, I'm just a realist.
Also quite a few inaccuracies - what the heck is 'bit rot' on Windows? I had the same Windows 10 install running on my desktop for 8 years as my primary personal PC and installed tons of software and games, both official and... some other types. 0 issues.
On laptops, a whole lifetime on the original install is the default for everybody I know; for me, 6-7 years (simply the length of ownership). We don't talk about the Windows 95 or ME era here, where frequent reinstalls were basically mandatory and a well-practiced chore.
In the past a good "registry cleaner" would help - but those are no longer reliable with newer versions of Windows - there are many virtual entries that get cleaned-up by overly aggressive utilities.
I actually have a desktop still running an install that started as a Windows 7 launch-party host "Steve Ballmer edition" and has just been upgraded as time has gone on. Very much a Ship of Theseus machine, but technically I've only ever migrated the OS image around, never reinstalled. That's 17 years of a Windows install so far, and it's perfectly fine. That one install has made it through multiple motherboards and OS upgrades. It'll end up dying and being replaced once I get too uncomfortable with 10 EoL; this board is still useful to me, but it doesn't have a TPM, so Windows is dead to this machine.
Gaming on Linux is a mostly solved issue for anyone who doesn't do competitive multiplayer gaming. If a game isn't using some rootkit-level anti-cheat or copy protection, it is going to run just fine. Same with running most other software.
The only part where Linux sucks is certain creative fields. If you need Adobe products, you are out of luck. For video editing, you use DaVinci Resolve or free software. There are some good DAWs, but no Ableton.
Yes, you have to compromise but Linux is definitely getting there. Not everything runs on Mac either and people cope just fine.
Especially Affinity imho. A lot of the people studying graphics design in the last 3 years or so, that I know of, saw the benefit of not paying for an Adobe subscription.
Turns out, a lot of people do exactly that. Hundreds of millions of people play CoD, Fortnite, Battlefield, Apex and many many other games which won't work on Linux at all.
I think the state of gaming on Linux is absolutely incredible - what used to be a very esoteric and "roll of the dice" process 20 years ago is now extremely simple and mostly just works. But when I play games with friends every week, it's almost never a game that would work on Linux.
I do agree with your larger point though. It’s the same reason everybody doesn’t change the oil in their car on their own or cook their food every night over ordering out. Only it goes even further because by this point most people expect a computer to just do what it’s supposed to do (or they think it’s supposed to do) the first time they try. I can’t imagine asking my parents to start inputting terminal commands. Even just the process of something like running etcher and prepping a usb drive to install linux is a whole thing.
Or accessibility, which the Linux desktop has been notoriously bad at for, what, 20 years now. The constant push to rewrite things typically forgets to make accessibility a priority, for the sake of "progress".
Both installing Windows and installing Linux can be difficult for most people. I have done both professionally, and when installing Windows I have frequently encountered more serious problems, which required much more time to solve than the problems encountered when installing Linux.
For those who have someone else to install and configure Linux, it is at least as easy to use as Windows.
My parents, more than 80 years old, have used Linux for many years without any problems, and they have no idea what Linux is; they just know the applications that they are using for viewing and editing documents, e-mail, Internet browsing, listening to music, watching movies, watching or recording TV (with a TV tuner) and so on.
Would they have bought such a configuration in a random computer store?
Most people also don't buy laptops from some online store that only HN readers know about.
Oh, and laptops are nasty. They are put together in ways that can easily confound you when you have plenty of experience. Lots of it revolves around little pieces of plastic that are marginal when new and that just want to break by the time the device needs service. It's a conspiracy!
Anyway, at least you know it can be done. The conditional still holds.
Look at the mobile YouTube client. The bottom navigation bar has the "+" create button stuffed right in the middle of it, larger than any other button. What % of users create YouTube content? Probably <1%. What proportion of those do it in the mobile YouTube client? Probably 0.1%. Yet the button is there, with no way to disable it.
In general, why don't apps have a "creator" toggle, off by default, that optimizes the entire UI for viewing/consuming? Just like how apps such as Uber have either an entire separate app for 'partners', or a toggle.
I know the reason this happens is because we aren't the real customers of an app. Nor are the creators / partners. The real customers are the shareholders. And YouTube has no competitor, so they can go buckwild with anything that synthetically increases KPIs.
I think the only app in recent memory that I have seen right the ship is Spotify. The past year they have introduced a lot of toggles for things like the shuffle algorithm, the dumb looping album-art videos, audio loudness normalization being split out into normalization and compression ('volume'), etc. About the only thing that's missing is a toggle to disable podcasts, just like YouTube needs a toggle to completely disable shorts.
Any PMs reading this, be our hero. Fight the good fight.
A while ago, they introduced the Home page with algorithmic recommendations; okay, it sucks that you can't choose whether Home or Subscriptions is the default, but at least you can choose between the algorithmic recommendations and the chronological subscriptions feed.
Then they introduced Shorts. These are algorithmically recommended TikToks which you can't disable; they always litter both the Subscriptions page and the Home page. This sucks.
Then, recently, they added algorithmic recommendations to Subscriptions. So if you're on Home you see only algorithmic recommendations, and if you're on Subscriptions, a lot of your screen is still taken up by algorithmically recommended videos from channels you subscribe to.
Every one of these steps is in the direction of making sure you watch what YouTube wants you to watch instead of what you want to watch.
We recently hit an all-time record.
We get a 2-row by 3-column grid now. The upper left is an ad, and the lower row is clipped in half to coach scrolling, bringing the total to two full thumbnails.
I feel like a junkie whose dealer tripled their prices and cut the drugs with 80% filler; sobriety via cartoonish consumer exploitation.
TV has it. Only TV program production companies can create shows. That literally undermines ... a lot of things. We don't need that.
Exactly.
I am in an engineering design software developer organization that was bought by an investor from the founders, who were approaching retirement (they worked 3 decades on this software, helping construction engineers design better homes). Ever since the lead-up to the sale - when changes were tuned to lure in investors, to the liking of investors - our organization has been focusing on maximising revenue. Fast. That is THE focus. New marketing strategy, sales strategy, licensing strategy changes, reshaping the organization to have more informed decision making in sales (i.e. collecting and processing much more data on an increasing number of contacts). Company meetings are about EBITDA, sales goals vs. actual, streamlining the organization. Lunchbreak discussions revolve around how to license existing features differently so they would trigger/force up-sells/cross-sells.
What is not on the agenda for maximising revenue: features and engineering. We are a "sales oriented organization", says our new CEO proudly - brought in during the sale. Addressing user needs and becoming more popular for the eventual income boost takes longer than the sales cycle of less than 5 years (the investor wants to sell the company in 5 years' time). Engineering is in the way; the accounting books need to look much, much better much sooner for the eventual profit. Only sales tactics work here.
I see a related pattern elsewhere, in tools I have the misfortune to use (SaaS and other subscription-based products). Shameless self-promotions (cross-sells) distract your focus all the time, 'features' good for the assumed 'cutting-edge' image of the organization, privacy-offensive practices (data for running sales campaigns), 'offerings' that help you with the ideas they force on you for some sizeable extra cost.
It will not end well. It takes a long time to fail, but without valuable features and engineering there will eventually be no value left for the users to buy. No user wants top-notch marketing, licensing, and sales strategy for the benefit of the organization.
Yes, Apple has a 'walled garden' to an extent, but I've never once worried about macOS serving me an ad from a third party, and their privacy controls are top notch and seem to get better as advertisers' attack methods get more sophisticated.
I can count on one hand the number of times I've had to jump through a few hoops to get an unsigned app installed, and each time it's been relatively painless.
This is, in general, how the market for pretty much everything works (sometimes 'users' are replaced by 'the regulator', but it doesn't matter too much).
The lesson in there is usually 'the majority of users don't care nearly as much as you think'.
This is capitalism's biggest flaw: it's based on the assumption that there will be competition, but competition eventually leads to winners that then consolidate their positions and we end up with no real choices.
You're telling me people would pick a worse OS because they don't care even if they had real options? I don't believe that for a second.
We very much do have options. I haven't had Windows on a personal machine since 2011.
The fact that governments allow Microsoft to abuse its position to force OEMs to install Windows is the biggest problem. This would never happen in a market where regulation ensures healthy competition.
Unfortunately this also allowed the USA to have companies so large that they basically control the government; changing this now will require massive political will and a political body untethered from corporate interests. I really don't see that happening in the USA; it's been thoroughly captured after so many years driving down that path.
Google hasn't enticed the big entrenched MS orgs to move over to Workspace, so if Google can't, how can a smaller startup ever hope to accomplish that in the face of these behemoths, which can just outlast them in a race to the bottom until they are insolvent or get bought by said behemoths?
Microsoft doesn't just sell an OS, or some services, they sell "IT in a box"
Take an industry with healthy competition like restaurants. You can compete in price, quality, format, service and probably a lot more.
Now tell me how that competition enshittified eating at restaurants?
For me, nothing stands out. If a restaurant charges nonsense fees, under-staffs to increase profits, reduces portions while charging the same, etc., I can simply go to another one. Restaurants that enshittify will almost inevitably close.
But if we look at a closely related industry like the food delivery apps, we see the exact same signs of enshittification we see in the tech world due to monopolies (or oligopolies, to be more exact), like:

- Increased/hidden fees
- Increased delivery times
- Crappy apps with ads everywhere
- Ineffective review systems
- Pay-to-win search
- Dynamic pricing
They can get away with it because realistically, you don't have any other options. The cost of entry might not be that high, but the network effect all but prohibits competition.
Yes, and you correctly point out: On the average restaurant visit, nothing stands out. A good restaurant only needs to provide not-terrible food and not-terrible service to be almost indistinguishable from all others. Quality of a restaurant visit is hard to measure and compare. Price is easy to measure. Thus, the rational consumer will prefer the cheaper option (and even at the same price, a restaurant with lower costs will be more profitable, thus expand more easily).
The same thing happens on Amazon and other marketplaces: when it is difficult to compare quality, price always wins out. Some products are interchangeable with well-defined specs; a 16GB RAM stick is obviously twice as good as 8GB, so it can be twice as expensive and still sell. But when I'm looking for a new light for my bicycle, there are no standardized specs to compare. All the product descriptions and pictures are exaggerated. I have no reliable information to tell if the lamp that is twice as expensive is really twice as good (and from personal experience: they never are), so I'm buying the cheapest one because I expect all of the products to be equally crappy no matter the price.
It's not Amazon's fault. This happens everywhere.
I think the desktop Linux ecosystem is an example of something healthier, but it goes too far in the other direction. There are so many options to choose from that it's hard to find the one for your needs.
A lot of Windows UI design decisions are pretty good. They mess it up now and then, like the Windows 8 (tablet design mess) disaster, but especially now with WSL 2, it delivers everything I need.
Do I still hate it? Yes, for the reasons explained in this article and for other stupidly designed features like the search index, Windows Defender, the mix of legacy and new dialogs, the shitty design of PowerShell, and then the mess of mixed shells, terminals, etc.
The list goes on, but comparatively I'll pick the Windows desktop over anything out there at the moment. It's a personal choice, but I assume the majority of Windows users feel this way (or cannot afford macOS :))
How can this be your takeaway when there is no channel for communication with the users? There is no signal at all, so you assume what is convenient for you. But this has no bearing on actual user sentiment; it's just convenient for you.
Part and parcel of the “problem” with tech people is they assume they can just fill in the gaps with their preferences and pretend they’re actually user preferences. In the rest of life this is called “bullshit”.
A framework of just and fair laws and regulations should support this, backed up by open enforcement.
but, yeah.
This isn't some nefarious plot to screw over users. Taste is not prioritized because nobody has it and thus can't recognize it. You can't value something you don't even recognize. This is orthogonal to talent, btw. There are lots of people there who are insanely good at what they do, who produce the most hideous API specs you've ever seen, as one example.
A much more mundane (and almost certainly true) explanation is that people who put all that crap in legitimately thought it's a good idea. Taste is its own thing and it's just not in Microsoft's DNA.
It's quite common for megacorps, FAANG and friends, NASDAQ bigwigs.
It's rare for small companies, and extremely rare for independent developers.
This is not general. This is true only in markets that are saturated in terms of available customers, with no foreseeable growth.
What we can see in IT over the past 10-15 years (especially after around 2015) is the slow progress towards this state from a rich and competitive (and, personally I think, a way more fun) one.
I worked for dying companies (e.g. Ericsson), for slowly moving ones (e.g. Santander), and for several now-dead startups, and what happened with Google, Microsoft, etc. is that they are slowly moving from the "startup" market - where there are still unconquered market segments - to the dying, slowly moving one - where there are a few large players, and it's not possible to grow in any meaningful way with your own skills. The only difference now compared to the decades up until the 90s is that antitrust checks and balances are dead, so they can artificially inflate their own power, which hasn't happened at this scale for at least 100 years. It caused world-shattering problems back then, and it will now too.
I would leave this field happily, even though I'm exceptionally good at it, because it's more and more disgusting. If only there were any good alternatives that wouldn't require me to lose at least a decade of my life. But unfortunately, the balance is way too fucked up to easily change my lifestyle at this point. And it will only get worse.
OneDrive managers on the other hand are one step away from inventing some way of adding a gacha mechanic.
I think you miss the more common reasoning though. This starts with "can we build a Windows app?" The answer to that was "no" for many more people until relatively recently. The .NET Framework wasn't available by default until the second half of the 2000s, which caused some Windows app devs to hold off, beyond the performance reasons and WinForms vs. WPF. Electron and React go hand in hand here, as they made a (crappy) Windows app easy.
What I feel popularized this was the webview approach on mobile. In 2010, there were a ton of frameworks popping up for hybrid mobile development. This was carried forward to desktop although some of us had been embedding IE webviews much earlier. This let people say "yes" and it went from one thing to the next with diversions into React Native.
I ditched Windows long ago so I'm mentioning this only in the interest of accuracy.
"Infecting with screwdrivers" - now see how dumb that sounds?
So no, React is a (poor) solution, not the problem. The problem is that Windows can't nail down a solid SDK for its platforms.
As a user, however, I find that the Start menu has become more sluggish than it used to be, and that's pretty annoying. What about that?
lol what a weird response.
So React, the most popular front-end library, used by hundreds of thousands of successful apps, is the ridiculous electric screwdriver? See how weird that sounds? It makes it obvious you guys can't give an honest assessment.
Pretty much everything in there still works, if I type it in and compile it with the current Visual C/C++ toolchain. I might need to get rid of a few MOVEABLE and DISCARDABLE tags here and there, I suppose.
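For context, MOVEABLE and DISCARDABLE are 16-bit-era memory attributes from Windows resource scripts, which is why deleting them is about all the porting that's needed. A hypothetical example (the resource name and file are made up):

```
// 16-bit Windows resource statement: MOVEABLE and DISCARDABLE were hints
// to the segmented-memory manager about how a resource could be relocated
// or purged. (MYICON / app.ico are made-up names for illustration.)
MYICON ICON MOVEABLE DISCARDABLE "app.ico"

// Win32 equivalent: the memory attributes are meaningless under flat
// 32-bit memory, so you just drop them.
MYICON ICON "app.ico"
```

Modern resource compilers generally ignore these attributes rather than reject them, so the cleanup is largely cosmetic.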
Idk, and I'm not saying it's not a good question, but it's irrelevant to the comparison in OP's comment.
which on one hand, good -- fuck microsoft and the monolith; on the other hand we get react start menus when we have to use microsoft.
What's the issue?
Other apps are successful despite being slow and bloated, since performance isn't a primary concern of users. In contrast, it's critical for OS internals like the Start menu, so a JavaScript runtime and framework is just the wrong tool for the job.
React only makes sense as a layer on top of the browser DOM, because the DOM itself cannot be fixed without rewriting it from scratch, so making it usable for non-trivial UI needs to happen in the 'framework layer'.
But without the DOM as the thing that needs fixing, and without the restrictions of the single-threaded browser event loop, the React programming model simply doesn't make a lot of sense. Using the "React paradigm" outside the browser (e.g. SwiftUI, React Native) is pure cargo-culting; it only makes sense for onboarding web devs who are already familiar with React, but makes it harder to create UIs for anybody else.
The actual problem in the context of Win11 is of course that Microsoft doesn't have any sort of long-term strategy for Windows system APIs (not just UI frameworks). The only long-term-stable API is Win32.
React is the symptom here, not the cause.
Public infrastructure should be built on open-source, period.
Developer delusion: devs who barely use their own apps, who don't understand the day-to-day user experience.
Most standard users simply don't have an option. Mac Neo brought Apple into a lower price range, but requires a new device. Linux is there (and frankly fantastic at this point), but good luck getting the average person through the setup process.
An enterprising hardware manufacturer could take on the mantle and be the trailblazer with a no-setup machine that just works.
Personally, I would imagine something like the Framework laptop and the Steam Machine are the best candidates.
AI is part of the problem with what MS has shoved into things, but it may also be part of what can help with the underlying issue of this corporate behavior.
The average user increasingly will not need to be walked through things in certain ways; they'll only have to be aware that something, some way, is possible. Because most of us are the average, meaning outsiders to knowledge and understanding of how things function on a computer. I can strip out tired Windows behavior to some extent and certainly stand up a Linux desktop. But I didn't know how to easily manage retrieval of data from an old disc image that refused to mount. I knew it was there and not impossible, though, so I asked Claude. A one-shot prompt, and a few minutes later Claude was reading raw bytes in some way and finding the location of a few files I needed.
So there is potential for AI to fill some gaps in this way and make some things easier and more within reach of average users. It's potential only, though, so continuing to work to ensure open models remain a thing is important. Just like the Internet enabled a lot of things previously out of reach of people. And yeah, that was not an unmixed blessing either, so all the more reason to move forward thoughtfully.
Yes, when there isn’t real competition. And that’s in part due to a long history of anti competitive practices but also simply because Microsoft is too big and should be broken up.
It's called "enshittification": https://pluralistic.net/2025/02/26/ursula-franklin/
I get the impression that many companies are working through this with AI-assisted coding. How bad can the product get before the revenue loss is greater than money saved by firing programmers and deploying AI slop? For products like Windows and Office, the subscription model and enterprise account revenue provides a huge cushion for decreasing quality before they even have to apologize and roll back.