While there is certainly some amount of unnecessary junk code out there, your claim that it could be reduced by an order of magnitude isn't even close to correct. In general, the only way to write less code is to use higher-level abstractions. The problem, of course, is that those abstractions are always leaky, and using them tends to make certain required features too slow, or even impossible to build at all. There is no free lunch.
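A small, standard illustration of the kind of leak meant here (my example, not the parent's): Python floats present themselves as ordinary decimal numbers, but the IEEE 754 binary representation underneath can't store 0.1 exactly, and that detail surfaces in plain arithmetic.

```python
import math

# Floats look like real numbers, but the binary representation
# cannot store 0.1 exactly; the abstraction leaks immediately.
a = 0.1 + 0.2

print(a == 0.3)   # False: the representation error surfaces
print(a)          # 0.30000000000000004

# To paper over the leak you have to acknowledge what's underneath,
# e.g. compare with an explicit tolerance instead of equality.
print(math.isclose(a, 0.3))  # True
```

The point being: the "number" abstraction works fine until the exact moment you need something it quietly doesn't provide.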
reply
As programmers we like to use jargon like "leaky abstraction", but we rarely bother to understand it beyond the PL paradigms we use. There's no formal definition, which makes these terms easy to abuse and throw into conversations to score points.

Why are the abstractions leaky? Are all abstractions leaky? We simply accept the situation without spending any real effort on these questions.

"There's no free lunch" - this is representative of the level of argument in software circles entirely. But WTF does that mean? If the lunch is not free, how cheap or expensive can it get and why?

This is why, as engineers, we tend to brush off the Dijkstras as arrogant, while at the same time ignoring both our arrogance and ignorance.

reply
A leaky abstraction is like obscenity: I know it when I see it. It's impossible to define the concept in a rigorous way, and yet it impacts everything that we do.

You're simply wrong to claim that we accept the situation without spending any real effort. In reality the more experienced developers who build abstraction layers tend to spend a lot of time trying to prevent leaks, but they can't have perfect foresight to predict what capabilities others will need. Software abstractions often last through multiple major generations of hardware technology with wildly different capabilities: you can't prevent those changes from leaking through to higher levels and it would be foolhardy to even try.
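One concrete instance of the machine leaking through a long-lived abstraction, sketched in Python (my example, using the standard `ctypes` module): the language's own integers are arbitrary-precision, but the moment code touches a fixed-width representation at the C boundary, the hardware's 32-bit two's-complement arithmetic shows through.

```python
import ctypes

# Python's own ints never overflow...
big = 2**31 - 1 + 1
print(big)                       # 2147483648

# ...but a fixed-width int32, as exposed by ctypes, wraps around:
x = ctypes.c_int32(2**31 - 1)    # largest positive 32-bit value
x.value += 1                     # two's-complement wraparound
print(x.value)                   # -2147483648
```

The "integer" abstraction held up fine for decades, and then the underlying representation leaked the moment someone needed interop with the layer below.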

reply
I understand your position and I think it's the norm. Yet I find it difficult to comprehend how it's not self-evidently absurd.

Do you feel like software transcends physics, mathematics, and logic? Because that's what the statement translates to.

The only reason it's impossible is that nobody tries, because trying would interfere with the deliverables of the next sprint. The software industry has painted itself into a corner.

reply
"Software people are not alone in facing complexity. Physics deals with terribly complex objects even at the "fundamental" particle level. The physicist labors on, however, in a firm faith that there are unifying principles to be found, whether in quarks or in unified field theories. Einstein repeatedly argued that there must be simplified explanations of nature, because God is not capricious or arbitrary.

No such faith comforts the software engineer. Much of the complexity he must master is arbitrary complexity, forced without rhyme or reason by the many human institutions and systems to which his interfaces must conform. These differ from interface to interface, and from time to time, not because of necessity but only because they were designed by different people, rather than by God."

- Fred Brooks, No Silver Bullet

reply
Physics is full of leaky abstractions. Solid? Leaky abstraction (melting). Ideal gas? Leaky abstraction (van der Waals). Molecule? Leaky abstraction (chemical reactions). Atom? Leaky abstraction (ionization, fusion, fission, alpha and beta decay). Proton? Leaky abstraction (sometimes you have to care about the quarks).
reply
Check out Urs Schreiber if you want to get over it
reply
>"There's no free lunch" - this is representative of the level of argument in software circles entirely. But WTF does that mean?

You cannot have a thing without doing the work to build it. You don't get the better abstraction without implementing it first. Your proof in theory is just that until it's exercised, and only then is the divergence between the ideal and the real world realized. I can teach a programmer all manner of linguistic trickery to exploit all sorts of mathematical abuse of notation. None of that makes a cotton-picking, salt-licking bit of difference if, at the end of the day, your symbolic proof isn't translatable to machine code that runs and maps successfully onto the operational space of an implementation of a computing device. If you give me a program written in the form of a Shakespearean sonnet (an example of a focus on radical novelty in encoding a program without regard to analogy), I still need a bloody compiler that will turn it into something capable of running within the constraints of the machine, plus the other primitives to make it work. That's TANSTAAFL. If you break from what exists, you still have to reroot and establish a parallel basis of operation that covers the primitive operations you're familiar with.

Dijkstra might be right: there's something liberating about staying in the realm of the formal and mathematical. His detractors were also right: he was so damn far above everyone else that everybody in the room had trouble understanding just what he was going on about. At the end of the day, teach what the greatest number of people can firmly mentally grip, and pass that on. The geniuses like Dijkstra will quickly outgrow it and excel; they don't need the help. Everyone else, on the other hand, does.

I wouldn't be opposed to trying Dijkstra's approach myself: shattering my current understanding of the practice of programming and working more from a formal-methods POV. But that comes after a career which has been fruitful, and which was rooted in the old way that worked quite well for many others educated at the same time I was. I already know I can do it; his method just changes the emphasis. Though I will note with alarm that his reticence to test is disturbing. If he assumes everything is provable from the get-go, then I suppose you don't need tests; but that's hardly the way anything in the world actually bloody works. That's math in a vacuum, with spherical cows. Not writing code and then realizing, "Shit, the processor in the machine I'm writing for doesn't support that primitive, or has a glitchy implementation of it."

Software engineering isn't programming for people who can't; it's a set of practices and know-how for navigating a niche field, battle-hardened and tested through time to guarantee some semblance of a chance of success in a field whose development moves so fast that the logic of six months ago seems antiquated. Back when Moore's Law was in full swing, yeah, radical novelty might have been justifiable; but ultimately it didn't survive the test of time. It can be as clever a hack as you can imagine, but if no one else can follow it, you haven't condensed it to a teachable form.

reply