DLLs got their start when early windowing systems didn't quite fit on the workstations of the era in the late 80s / early 90s.
In about 4 minutes both Microsoft and GNU were like, "let me get this straight, it will never work on another system and I can silently change it whenever I want?" Debian went along because it gives distro maintainers degrees of freedom they like and don't bear the costs of.
Fast forward 30 years and Docker is too profitable a problem to fix by the simple expedient of calling a stable kernel ABI on anything, and don't even get me started on how penetrated everything but libressl and libsodium are. Protip: TLS is popular with the establishment because even Wireshark requires special settings and privileges for a user to see their own traffic; security patches, my ass. eBPF is easier.
Dynamic linking moves control from users to vendors and governments at ruinous cost in performance, props up bloated industries like the cloud compute and Docker industrial complex, and should die in a fire.
Don't take my word for it, swing by cat-v.org sometime and see what the authors of Unix have to say about it.
I'll save the rant about how rustc somehow manages to be slower than clang++ and clang-tidy combined for another day.
…
Dynamic linking moves control from users to vendors and governments at ruinous cost in performance, props up bloated industries...
This is ridiculous. Not everything is a conspiracy!
In fact, if there was anything remotely controversial about a bunch of extremely specific, extremely falsifiable claims I made, one imagines your rebuttal would have mentioned at least one.
I said inflammatory things (Docker is both arsonist and fireman at ruinous cost), but they're fucking true. That Alpine in the Docker jank? It links musl!
But people should make an informed choice, and there isn't any noble or high-minded or well-meaning reason to try to shout that information down.
Don't confidently assert falsehoods unless you're prepared to have them refuted. You're entitled to peddle memes and I'm entitled to reply with corrections.
    for i in 0..10 {}

translates to roughly

    let mut iter = Range { start: 0, end: 10 }.into_iter();
    while let Some(i) = iter.next() {}

This has tradeoffs: increased ABI stability at the cost of longer compile times.
Nah. Slow type checking in Swift is primarily caused by the fact that functions and operators can be overloaded on type.
Separately-compiled generics don't introduce any algorithmic complexity and are actually good for compile time, because the generic body only gets type checked once instead of once per template expansion.
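For illustration (in Rust rather than Swift, just because that's the language the earlier example used), a minimal sketch of definition-site checking: the generic body is checked once against its bound, and each call site only has to prove the bound rather than re-check the body the way a C++ template expansion would.

    use std::ops::Add;

    // Type checked once, at the definition, against the `Add` bound.
    fn sum_pair<T: Add<Output = T>>(a: T, b: T) -> T {
        a + b
    }

    fn main() {
        println!("{}", sum_pair(1i64, 2i64));     // call site only proves the bound
        println!("{}", sum_pair(1.5f64, 2.5f64)); // no re-check of the body
    }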
I’d like to see tooling for this to pinpoint bottlenecks - it’s not always obvious what’s making builds slow.
I second this enthusiastically.
If it improves compile time, that sounds like a bug in the compiler or the design of the language itself.
Even this can lead to unworkable compile times, to the point that code is rewritten.
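To make that concrete, a hedged sketch of the kind of rewrite that happens in Rust when monomorphised generics blow up build times: swap the generic for trait objects so only one copy of the body gets compiled. The function names are made up for the example.

    use std::fmt::Display;

    // Monomorphized: the compiler emits one copy per concrete T used.
    fn print_all_generic<T: Display>(items: &[T]) {
        for item in items {
            println!("{item}");
        }
    }

    // Type-erased rewrite: a single compiled body, dispatched through a vtable.
    fn print_all_dyn(items: &[&dyn Display]) {
        for item in items {
            println!("{item}");
        }
    }

    fn main() {
        print_all_generic(&[1, 2, 3]); // instantiates print_all_generic::<i32>
        let mixed: [&dyn Display; 3] = [&1, &"two", &3.0];
        print_all_dyn(&mixed);         // one compiled body handles them all
    }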
Could you expand on that, please? Every time you run a dynamically linked program, it is linked at runtime (unless it explicitly avoids linking unnecessary stuff by dlopening things lazily, which pretty much never happens). If it is fine to link on every program launch, linking at build time should not be a problem at all.
If you want to have link time optimization, that's another story. But you absolutely don't have to do that if you care about build speed.
I think lazy linking is the default even if you don't use dlopen, i.e. every symbol gets linked upon first use. Of course that has the drawback that the program can crash due to missing/incompatible libraries in the middle of work.
Anyway, while what you said is theoretically half-true, a fairly large number of libraries are not designed/encapsulated well. This means almost all of their symbols are exported dynamically, so the idea that there are only a "few public exported symbols" is unfortunately false.
However, something almost no one ever mentions is that ELF was actually designed to allow dynamic libraries to be fairly performant. It isn't something I would recommend, as it breaks many assumptions on Unices, but (while you still don't get the benefits of LTO) you can achieve code generation almost equivalent to static linking by using something like "-fno-semantic-interposition -Wl,-Bsymbolic,-z,now". MaskRay has a good explanation of it: https://maskray.me/blog/2021-05-16-elf-interposition-and-bsy...
Wouldn't you say a lot of that comes from the macros and (by way of monomorphisation) the type system?
I suspect this leaks into both compile-time and run-time costs.
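A small, purely illustrative sketch of the monomorphisation side of that: every distinct concrete type stamps out its own copy of a generic function, so both the codegen work and the binary size scale with the number of instantiations.

    // Type checked once, then a separate specialized copy is generated
    // for every concrete T the program actually uses.
    fn largest<T: PartialOrd + Copy>(items: &[T]) -> T {
        let mut best = items[0];
        for &x in items {
            if x > best {
                best = x;
            }
        }
        best
    }

    fn main() {
        // Three instantiations to compile: largest::<i32>, largest::<u64>, largest::<f64>.
        println!("{}", largest(&[3i32, 1, 2]));
        println!("{}", largest(&[3u64, 1, 2]));
        println!("{}", largest(&[3.0f64, 1.0, 2.0]));
    }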
Go's compile times became famous because, for a decade, a new generation educated in scripting languages and used to badly configured C and C++ projects took for innovation what was actually a return to old values in compiler development.