When you work on a project from scratch long enough, you end up wishing you had started differently, and you refactor pieces of it. While using a framework I sometimes have moments where I suddenly get the underlying reasons for and advantages of doing things a certain way, but that comes once you become more of a power user, not at the start, and only if you put in the effort to question things. And other times the framework is just bad and you have to switch...
But ya, I hate when people say they don't like "magic." It's not magic, it's programming.
Yes, it's not magic as in Merlin or Penn and Teller. But it is magic in the aforementioned sense, which is also what people complain about.
Sorry for the snark but why is this such a problem?
https://pomb.us/build-your-own-react/
Certain frameworks were so useful they arguably caused an explosion in productivity. Rails seems like one. React might be too.
const element = document.createElement("h1");
element.innerHTML = "Hello";
element.setAttribute("title", "foo");
const container = document.getElementById("root");
container.appendChild(element);
I now have even less interest in ever touching a React codebase, and will henceforth consider the usage of React a code smell at best.

Maybe nobody needs React, I'm not a fan. But a trivial stateless injection of DOM content is no argument at all.
<h1 title="foo">Hello</h1>
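For completeness, here is the same element expressed through React's API, which is roughly what JSX like the line above compiles to under the classic transform. A minimal sketch, assuming React and ReactDOM are loaded as in the linked tutorial (ReactDOM.render is the pre-React-18 entry point):

// Build a description of the element, then let ReactDOM perform
// the same DOM operations shown in the vanilla snippet above.
const element = React.createElement("h1", { title: "foo" }, "Hello");
ReactDOM.render(element, document.getElementById("root"));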
I have even less interest in touching any of your codebases!

Don't get me wrong, I don't think you need or should need a degree to program, but if your standard of what abstractions you should trust is "all of them, it's perfectly fine to use a bunch of random stuff from anywhere that you haven't the first clue how it works or who made it," then I don't trust you to build stuff for me.
The big thing here is that the transformations maintain the clearly and rigorously defined semantics such that even if an engineer can't say precisely what code is being emitted, they can say with total confidence what the output of that code will be.
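A small JavaScript analogy of that point, with the possible transpiler output shown only in a comment since the exact emitted code varies by tool and configuration:

// What you write:
const double = (x) => x * 2;

// What a transpiler targeting older engines might emit (exact text varies,
// but the defined semantics are preserved):
//   var double = function (x) { return x * 2; };

console.log(double(21)); // 42 either way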
Sure, obviously, we will not understand every single little thing down to the tiniest atoms of our universe. There are philosophical assumptions underlying everything and you can question them (quite validly!) if you so please.
However, there are plenty of intermediate mental models (or explicit contracts, like assembly, ELF, etc.) to open up, both in "engineering" land and "theory" land, if you so choose.
Part of good engineering, as well, is deciding exactly where the boundary between "don't cares" and "cares" lies, and how you allow people to easily navigate the abstraction hierarchy.
That is my impression of what people mean when they don't like "magic".
In other words, why is one particular abstraction (e.g. JavaScript, or the web browser) OK, but another abstraction (e.g. React) not? This attitude doesn't make sense to me.
As far as x86, the 8086 (1978) through the Pentium (1993) used microcode. The Pentium Pro (1995) introduced an out-of-order, speculative architecture with micro-ops instead of microcode. Micro-ops are kind of like microcode, but different. With microcode, the CPU executes an instruction by sequentially running a microcode routine, made up of strange micro-instructions. With micro-ops, an instruction is broken up into "RISC-like" micro-ops, which are tossed into the out-of-order engine, which runs the micro-ops in whatever order it wants, sorting things out at the end so you get the right answer. Thus, micro-ops provide a whole new layer of abstraction, since you don't know what the processor is doing.
My personal view is that if you're running C code on a non-superscalar processor, the abstractions are fairly transparent; the CPU is doing what you tell it to. But once you get to C++ or a processor with speculative execution, one loses sight of what's really going on under the abstractions.
Yeah, JavaScript is an illusion (to be exact, a concept). But it’s the one that we accept as fundamental. People need fundamentals to rely upon.
Sure you can, why can't you? Even if it's deprecated in 20 years, you can still run it and use it, even fork it to expand upon it, because it's still JS at the end of the day, which, based on your earlier statement, you can code for life with.
If this is true, why have more than one abstraction?
But it is not quite the case. The hand coded solution may be quicker than AI at reaching the business goal.
If there is an elegantly crafted solution that stays in prod for 10 years and just works, it is better than an initially quicker AI-coded solution that needs more maintenance and demands a team to maintain it.
If AI (and especially bad operators of AI) codes you a city tower when you need a shed, the tower works and looks great, but now you have 500k/y in maintenance costs.
Anything that can be automated can be automated poorly, but we accept that trained operators can use looms effectively.
Programming is famously non-linear. Small teams build billion-dollar companies thanks to tech choices that avoid needing to scale up headcount.
Yes, you need marketing, strategy, investment, sales, etc. But on the engineering side, good choices mean big savings and scalability with few people.
The loom doesn't have these choices. There is no "make a billion t-shirts a day" setting for a well-configured loom.
Now AI might end up on either side of this. It may be too sloppy to compete with very smart engineers, or it may become so good that, like chess, no one can beat it. At that point, let it do everything and run the company.
Electricity is magic. TCP is magic. Browsers are hall-of-mirrors magic. You’ll never understand 1% of what Chromium does, and yet we all ship code on top of it every day without reading the source.
Drawing the line at React or LLMs feels arbitrary. The world keeps moving up the abstraction ladder because that's how progress works; we stand on layers we don't fully understand so we can build the next ones. And yes, LLM outputs are probabilistic, but that's how random CSS rendering bugs felt to me before React took care of them.
The cost isn’t magic; the cost is using magic you don’t document or operationalize.
If you've only been in a world with React & co, you will probably have a more difficult time understanding the point they're contrasting against.
(I'm not even saying that they're right)
Autovectorization is not a programming model. This still rings true day after day.
LLMs are vastly more complicated and unlike compilers we didn't get a long, slow ramp-up in complexity, but it seems possible we'll eventually develop better intuition and rules of thumb to separate appropriate usage from inappropriate.
React, which is just functions to make DOM trees and render them, is a framework? There is a reason hundreds of actual frameworks exist to add structure around using these functions.
At this point, he should stop using any high-level language! Java/Python are just big frameworks running his bytecode, what magical frameworks!
Granted, there are limits to how deep one should need to go in understanding their ecosystem of abstractions to produce meaningful work on a viable timescale. What effect does it have on the trade to, on the other hand, have no limit to the upward growth of the stack of tomes of magical frameworks and abstractions?
Simple: if it's magic, you don't have to do the hard work of understanding how it works in order to use it. Just use the right incantation and you're done. Sounds great as long as you don't think about the fact that not understanding how it works is actually a bug, not a feature.
That's such a wrong way of thinking. There is simply a limit on how much a single person can know and understand. You have to specialize otherwise you won't make any progress. Not having to understand how everything works is a feature, not a bug.
You not having to know the chemical structure of gasoline in order to drive to work in the morning is a good thing.
I've never found this to be a particular problem. Most ORMs are actually quite predictable. I've seen how my ORM constructs queries for my database and it's pretty ugly, but it's actually also totally good. I've never really gained any insight that way.
But the sheer amount of time and effort I've saved by using an ORM to basically do the same boring load/save pattern over and over is immeasurable. I can't even imagine going back and doing that manually -- what a waste of time, effort, and experience that would be.
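For contrast, here is that boring load/save pattern written by hand -- roughly the repetitive code an ORM generates for you. This is only a sketch: `db.query` is a hypothetical query helper, and the table and column names are made up for illustration.

// Load one row and map it to a plain object by hand.
async function loadUser(db, id) {
  const rows = await db.query(
    "SELECT id, name, email FROM users WHERE id = ?",
    [id]
  );
  if (rows.length === 0) return null;
  const [row] = rows;
  return { id: row.id, name: row.name, email: row.email };
}

// Write the object's fields back, column by column.
async function saveUser(db, user) {
  await db.query(
    "UPDATE users SET name = ?, email = ? WHERE id = ?",
    [user.name, user.email, user.id]
  );
}

Multiply that by every table in the schema and the time savings add up quickly.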
It's about layers of abstraction, the need to understand them, modify them, know what is leaking etc.
I think people sometimes substitute "magic" when they mean "I suddenly need to learn a lower layer I assumed was much less complex." I don't think anyone is calling the Linux kernel magic. Everyone assumes it's complex.
Another use of "magic" is when you find yourself debugging a lower layer because the abstraction breaks in some way. If it's highly abstracted and the inner loop gives you few starting points ( while (???) pickupWorkFromAnyWhere() ), it can feel Kafkaesque.
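A toy sketch of that inner loop, with hypothetical names not taken from any particular framework: because every piece of work funnels through the same dispatch point, a breakpoint there tells you almost nothing about where the work came from.

const queue = [];

function schedule(task) {
  queue.push(task); // the scheduling site is long gone by the time the task runs
}

function runLoop() {
  while (queue.length > 0) {
    const task = queue.shift();
    task(); // every breakpoint lands here, whatever the task actually is
  }
}

schedule(() => console.log("where did this come from?"));
runLoop();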
I sleep just fine not knowing how much software I use exactly works. It's the layers closest to application code that I wish were more friendly to the casual debugger.
It seems common with regard to dependency injection frameworks. Do you need them for your code to be testable? No, even if it helps. Do you need them for your code to be modular? You don't, and do you really need modularity in your project? Reusability? Loose coupling?
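For what it's worth, plain constructor injection already gets you testability without a container. A minimal sketch with hypothetical names (UserService, EmailClient, FakeEmailClient):

// The dependency is passed in, so a test can hand the class a stub
// instead of the real service -- no framework required.
class UserService {
  constructor(emailClient) {
    this.emailClient = emailClient;
  }
  async welcome(user) {
    await this.emailClient.send(user.email, "Welcome!");
  }
}

// In production:  new UserService(new EmailClient());
// In a test:      new UserService(new FakeEmailClient());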
A couple of megabytes of JavaScript is not the "big bloated" application in 2026 that it was in 1990.
Most of us have phones in our pockets capable of 500Mbps.
The payload of a single-page app is trivial compared to the bandwidth available to our devices.
I'd much rather optimise for engineer ergonomics than shave a couple of milliseconds off the initial page load.
The idea that React is inherently slow is totally ignorant. I'm sympathetic to the argument that many apps built with React are slow (though I've not seen data to back this up), or that you as a developer don't enjoy writing React, but it's a perfectly fine choice for writing performant web UI if you're even remotely competent at frontend development.
But it does seem that a culture of complexity is more pervasive lately. Things that could have been a simple gist or a config change are now whole programs that pull in tens of dependencies from who knows who.