They probably won't, but those willing to drop $3-10k will be, if consumer and data-center computing diverge at the architectural level. It's the classic hollowing-out of the middle: most of the offerings end up in a race to the bottom chasing the volume of price-sensitive customers, the quality options lose economies of scale and disappear, and the high end becomes increasingly bespoke and pricey, or splits off into a distinct market with an entirely different type of customer (here: DC vs. individuals).
https://us.ugreen.com/collections/usb-c-hubs - these docks only require a single USB port to connect to. That could be an SBC working as a handheld. These docks could end up being the largest cost component in the new era of all-in-ones. UGreen could be the next Apple as screens and processors snap on to these hubs, in addition to their own range of power banks and SSD enclosures. Their quality is high, too.
In fact, I would go so far as to say we are entering a tinkering culture, and free-energy technologies are upon us as a response to oppressive economic times. Sort of like how the largest leaps in religious and esoteric thought have occurred in the most oppressive of circumstances.
People will reject their crappy thin clients, start tinkering and build their own networks. Knowledge and currency will stay private and concentrated - at least at first.
But indeed, once you have USB-C support on your device, you can connect all kinds of peripherals through it, from keyboards to 4K screens. Standardized device classes obviate the need for most drivers.
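For the curious, here is a minimal sketch of what that looks like in practice (assuming pyusb and a libusb backend are installed; the class-name mapping is illustrative, not exhaustive): enumerate whatever is plugged into the hub and print the standardized class code that a generic class driver keys off of.

    # Minimal sketch: list attached USB devices and their USB-IF device class codes.
    # Assumes pyusb (pip install pyusb) plus a libusb backend; not from the thread.
    import usb.core

    # A few well-known class codes from the USB-IF spec, for readability.
    CLASS_NAMES = {
        0x00: "defined at interface level",
        0x01: "audio",
        0x03: "HID (keyboard/mouse)",
        0x08: "mass storage",
        0x09: "hub",
        0x0E: "video",
    }

    for dev in usb.core.find(find_all=True):
        cls = CLASS_NAMES.get(dev.bDeviceClass, f"class 0x{dev.bDeviceClass:02x}")
        print(f"{dev.idVendor:04x}:{dev.idProduct:04x} -> {cls}")

Many devices report 0x00 here and declare the real class (HID, mass storage, audio) per interface instead, but the point stands: the OS ships one generic driver per class, which is why nothing extra needs installing.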
This will likely extend further and further, more into the "normie" territory. MS Windows is, of course, the thing that keeps many people pinned to the x64 realm, but, as Chromebooks and the Steam Deck show us, Windows is not always a hard requirement to reach a large enough market segment.
The large PC builders (Dell, HP, Lenovo) will continue down the road of cost reduction and proprietary parts. For the vast majority of people pre-packaged machines from the "big 3" are good enough. (Obviously, Apple will continue to Apple, too.)
I think bespoke commodity PCs will go the route, pricing-wise, of machines like Raptor's Talos workstations.
Edit: For a lot of people the fully customized bespoke PC experience is preferred. I used to be that person.
I also get why that doesn't seem like a big deal. I've been a "Dell laptop as a daily driver" user for >20 years now. My two home servers are just Dell server machines, too. I got tired of screwing around with hardware and the specs Dell provided were close enough to what I wanted.
I'm very excited about the Steam Machine for the reasons you mention - I want to buy a system, not a loose collection of parts that kind-of-sort-of implement some standard to the point that they probably work together.
There's nothing wrong with ATX or having interchangeable components. An established standard means that small companies can start manufacturing components more easily and provide more competition. If you turn PCs into prepackaged proprietary monoliths, expect even fewer players on the market than we have now, in addition to a complete lack of repairability and upgradability. When you can't pick and choose the parts, you let the manufacturer dictate what you're allowed to buy in what bundles, what spare parts they may sell to you (if any) and what prices you will pay for any of these things. Even if you're not building custom PCs yourself, the availability of all these individual components is putting an intrinsic check on what all-in-one manufacturers can reasonably charge you.
There are systems like the NUC but if I want a super-high-end 5090 and top-end CPU, all of the options to cool them feel like... well, something kludged together from whatever parts I can find, not something that's designed as a total system. Maybe we'll get some interesting designs out of this.
I'm really fearful that PCs are going down the road of locked bootloaders, running the user-facing OSs inside bare-metal hypervisors that "protect" the hardware from the owner, etc.
I'll accept that I'm likely under the influence of a bit of paranoia, too.
I'm strongly of the opinion several unaffiliated factions (oligarchs, cultural authoritarians, "intellectual property" maximalists, software-as-a-service providers, and intelligence agencies, to name a few) see unregulated general purpose computers in the hands of the public as dangerous.
I don't think there's an overt conspiracy to remove computing from the hands of the public. The process is happening because of a confluence of unrelated goals.
I don't see anybody even remotely comparable in lobbying power standing up for owners' rights, either.
But I can imagine that it would become less prevalent on personal machines, maybe even rare eventually.
Prior to the crunch, you could have anything from 48-64 cores and a good chunk of RAM (128GB+). If you were inordinately lucky, 56 cores and 64GB of onboard HBM2e was doable for 900-1500 USD.
They’re not Threadrippers or EPYCs, but something in between: server chips that can also make a stout workstation.
I also want housing as cheap as it was a couple of years ago.
Many people need only a basic device for Netflix, YouTube, Google Docs, email, or searching for/buying flight tickets. That will be amazing.
Many have a job-supplied laptop/desktop for great performance (made rubbish by AV scanners, but that's a different issue).
I was looking up an old video game homepage the other day for some visual design guidance. It was archived on the Wayback Machine, but with Flash gone, so was the site. Ruffle can't account for every edge case.
Flash was good. It was the bedrock of a massive chunk of the Old Net. The only awful thing was the people who pushed and cheered for its demise just so that Apple could justify their walled garden for the few years before webdev caught up. Burning the British Museum to run a steam engine.
I remember pulling up crash logs for people, showing them that Flash was in every one of the Safari crashes they wanted me to fix. I told them it was out of my hands.
We're out here with amazing performance in $600 laptops that last all day on battery and half of this comment section is acting like personal computing is over.
Raspberry Pi is way cheaper than those things, and I'm sure you could hook one up with an all-day battery for $100-200. Doesn't mean it's "better".
They’re not ideal for all use cases, of course. I’m happy to still have my big Linux workstation under my desk. But they seem to me like personal computers in all the ways that matter.
This is what I'm afraid of. As more stuff moves to the cloud helped in part by the current prices of HW, the demand for consumer hardware will drop. This will keep turning the vicious cycle of rising consumer HW prices and more moves to the cloud.
I can already see Nvidia rubbing their hands together in expectation of the massive influx of customers to their cloud gaming platform. If a GPU is this expensive, you move to a rental model, and the subsequent drop in demand will make GPUs even more expensive. They're far from the only ones with dollar signs in their eyes, given the money and the total control over customers this future could bring.
Being entirely reliant on someone else's software and hardware is a bleak thought for a person used to some degree of independence and self-sufficiency in the tech world.
Roblox is not popular because of its graphics. Younger gamers care more about having fun than having an immersive experience.
The problem I describe is companies pushing towards the "rent" model vs. "buy to own". Nvidia was just an example by virtue of their size. Microsoft could be another; they're also eyeing the game streaming market. Once enough buyers become renters, the buying market shrinks and becomes untenable for the rest, pushing more people to rent.
GPUs are so expensive now that many gamers were eyeing GeForce Now as a viable long-term solution for gaming. Just recently there was a discussion on HN about GeForce Now where a lot of comments were "I can pay for 10 years of GeForce Now with the price of a 5090, and that's before counting electricity". All upsides, right?
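The back-of-the-envelope math does land in that ballpark, with both numbers assumed for illustration (a ~$2000 street-price 5090 and a ~$20/month cloud-gaming tier, neither quoted from the thread):

    # Rough comparison: months of cloud gaming you could buy for the price of one GPU.
    # Both prices are assumptions for illustration, not quotes.
    gpu_price = 2000        # USD, assumed 5090 street price
    cloud_monthly = 20      # USD/month, assumed GeForce Now-style tier

    months = gpu_price / cloud_monthly
    print(f"{months:.0f} months, about {months / 12:.1f} years")  # 100 months, ~8.3 years

Which is roughly where the "10 years, before counting electricity" figure comes from.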
In parallel, Nvidia probably sees more money in the data-center market, so it would rather focus the available production capacity there. Once enough gamers move away from local compute, the demand is unlikely to come back, so future generations of GPUs would get more and more expensive to cater to an ever-shrinking market. This is the vicious cycle: expensive GPUs + cheap cloud gaming -> shrinking GPU market and higher GPU prices -> more of step 1.
Roblox is one example of a game; there are many popular games that aren't graphics-intensive or don't rely on eye candy. But what about all the other games that require a beefy GPU to run? Gamers will want to play them, and Nvidia, like most other companies, sees more value in recurring revenue than in one-time sales. A GPU you own won't bring Nvidia money later; a subscription keeps doing that.
The price hikes come only after there's no real alternative to renting. Look at the video streaming industry.
Also, if gamers demand infinitely improving graphics so much that they would rather pay for cloud gaming than relax their expectations and be happy with, say, current gen graphics, then that is more a claim about modern self-pwned gamer behavior than megacorp conspiracy.
But I don't buy that either. The biggest games on Steam Charts and Twitch aren't AAA RTX 5090 games.
Riddle me this: does anyone pursue a self-pwn intentionally?
"Conspiracy theory" is just dehumanizer talk for falling prey to business as usual.
It's also a nightmare from any sort of privacy perspective, in a world that's already becoming too much like a panopticon.
This is what always happens in capitalism. Scarcity is almost always followed by glut.
Memory makers, for example, have sold out their inventory for several years ahead, but instead of investing to manufacture more, they're shutting down their consumer divisions. They're just reallocating their consumer supply to their B2B (read: AI) customers instead.
That's likely because they don't expect this demand to last past a few years.
Many users will not want to risk their privacy, data, and workflow on someone else's rapidly-enshittifying AI cloud model. Right now we don't have much choice, but there are signs of progress.
Many new games cannot run at max settings, 4K, 120Hz on any modern GPU. We probably need to hit 8K before we max out on the returns higher resolution can provide. Not to mention most game devs are targeting an install base of $500, 6-year-old consumer hardware, in a world where the 5090 exists.