What is a problem is library quality. Which is downstream of nobody getting paid for it, combined with an optimistic but unrealistic "all packages are equal" philosophy.
> High quality C libraries
> OpenSSL
OpenSSL is one where a ground-up rewrite is happening because the code quality is so terrible despite being security-critical.
On the other end, JavaScript is uniquely bad because of the deployment model and the difficulty of adding things to the standard library, so everything is littered with polyfills.
Absolute nonsense. What does "automated world" even mean? Even if one could infer it reasonably, it's no justification. Appealing to "the real world" in lieu of any further consideration is exactly the kind of mindlessness that has led to the present state of affairs.
Automation of dependency versions was never something we needed; it was always a convenience, and even that's a stretch given that dependency hell is abundant in all of these systems, and now we have supply chain attacks on top. While everyone is welcome to do as they please, I'm going to stick to vendoring my dependencies, statically compiling, and not blindly trusting code I haven't seen before.
How do you handle updating dependencies then?
People are trying to automate the act of programming itself, with AI, let alone all the bits and pieces of build processes and maintenance.
1. Packages should carry a manifest that declares what they do at build time, just like Chrome extensions do. This manifest would then be used to configure its build environment.
2. Publishers to official registries should be forced to use 2FA. I proposed this a decade ago for crates.io and people lost their minds, like I was suggesting we drag developers to a shed to be shot.
3. Every package registry should produce a detailed audit log that contains a "who, what, when". Every build command should produce audit logs that can be collected by endpoint agents too.
4. Every package registry should support TUF.
5. Typosquatting defenses should be standard.
etc etc etc. Some of this is hard, some of this is not hard. All of this is possible. No one has done it, so it's way too early to say "package managers can't be made safe" when no one has tried.
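Point 1 doesn't need to be elaborate. A build-time capability manifest could be a small declarative allowlist that the registry enforces before running anything, in the spirit of a Chrome extension's permission block. A minimal sketch in Python; the manifest fields and the `violations` helper are hypothetical, not any existing registry's schema:

```python
# Hypothetical build-capability manifest. Field names are illustrative.
MANIFEST = {
    "name": "example-pkg",
    "build": {
        "network": False,           # no outbound connections during build
        "run_scripts": False,       # no postinstall hooks
        "fs_write": ["./target"],   # only these paths may be written
    },
}

def violations(manifest: dict, observed: dict) -> list[str]:
    """Compare what a sandboxed build actually did against what it declared."""
    declared = manifest["build"]
    out = []
    if observed.get("network") and not declared["network"]:
        out.append("undeclared network access")
    if observed.get("ran_scripts") and not declared["run_scripts"]:
        out.append("undeclared install script")
    for path in observed.get("writes", []):
        if not any(path.startswith(p) for p in declared["fs_write"]):
            out.append(f"undeclared write to {path}")
    return out
```

A sandboxed builder would record `observed` and refuse to publish on any violation.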
It's also shockingly controversial to suggest typosquatting defenses. I made this suggestion ages ago for cargo, demonstrated that basic distance checks would have affected <1% of crates over all time, and people still didn't want it.
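The "basic distance check" can literally be a few lines: reject a new name if it's within edit distance 1 of an existing package. A sketch using a plain Levenshtein implementation; the threshold and the helper names are illustrative:

```python
def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def looks_like_typosquat(new_name: str, existing: set[str]) -> bool:
    # Flag names one edit away from an existing package; exact matches
    # are handled by the usual "name already taken" check instead.
    return any(new_name != p and edit_distance(new_name, p) <= 1
               for p in existing)
```

A real registry would only check against popular packages (to avoid blocking legitimate similar names) and could offer a manual review path instead of a hard reject.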
How is this enforced when it's pushed via a pipeline?
Publishing should be handled via something like Trusted Publishing, which would leverage short lived tokens and can integrate with cryptographic logs for publish information (ie: "Published from the main branch of this repo at this time").
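Registry-side, that check amounts to verifying the claims in a short-lived OIDC token against a publisher policy the maintainer registered up front. A sketch with hypothetical claim names (real systems like PyPI's Trusted Publishing verify a signed JWT; signature and expiry checks are elided here):

```python
# Policy registered by the maintainer ahead of time: only this repo,
# branch, and workflow may publish. Field names are illustrative,
# not any registry's exact schema.
POLICY = {
    "repository": "example-org/example-pkg",
    "ref": "refs/heads/main",
    "workflow": "release.yml",
}

def may_publish(claims: dict, policy: dict) -> bool:
    """Accept a publish only if every policy field matches the token's
    claims. Assumes the token's signature and expiry were already verified."""
    return all(claims.get(k) == v for k, v in policy.items())
```

The same claims can then be written to a transparency log, which is what gives you the "published from the main branch of this repo at this time" audit trail.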
I'm not sure why you believe this is more secure than a package manager. At least with a package manager there is an opportunity for vetting. And it's not true that it didn't increase your executable's size: if your executable depends on it, it increases its effective size.
This is what happens when there is no barrier to entry: everyone, including people who have no idea what they are doing, ends up in charge of the NPM ecosystem.
When you see a single package with 25+ dependencies, that is bad practice, and it increases the risk of supply chain attacks.
Most of them don't even pin their dependencies, and I called this out just yesterday on OneCLI. [0]
It just happens that NPM is the worst of all these ecosystems due to the above.
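The dependency count is easy to check yourself: npm's lockfile (v2/v3 format) lists every installed package, direct and transitive, under a top-level `packages` key. A sketch over a parsed lockfile (the example lock contents are made up):

```python
# Minimal made-up excerpt of a package-lock.json (lockfileVersion 3).
EXAMPLE_LOCK = {
    "name": "my-app",
    "lockfileVersion": 3,
    "packages": {
        "": {"name": "my-app"},                       # the root project
        "node_modules/left-pad": {"version": "1.3.0"},
        "node_modules/ansi-styles": {"version": "4.3.0"},
    },
}

def count_installed(lock: dict) -> int:
    """Count direct + transitive packages in a parsed package-lock.json
    (v2/v3). The "" key is the root project itself, so it is excluded."""
    return sum(1 for k in lock.get("packages", {}) if k != "")
```

Run that over a typical frontend project's lockfile and the number is usually in the hundreds, not 25.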
If no one checks their dependencies, the solution is to centralize this responsibility at the package repository. Something like left-pad should simply not be admitted to npm. Enforce a set of stricter rules which only allow non-trivial packages maintained by someone who is clearly accountable.
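For scale: the entirety of left-pad's useful behavior is a stdlib one-liner in most languages, which is exactly the kind of package a stricter admission policy would reject. In Python terms:

```python
def left_pad(s: str, width: int, fill: str = " ") -> str:
    # The whole package: pad on the left to the given width.
    # Python's stdlib already does this via str.rjust; strings longer
    # than the width are returned unchanged.
    return s.rjust(width, fill)
```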
Another change one could make is develop bigger standard libraries with all the utilities which are useful. For example in Rust there are a few de facto standard packages one needs very often, which then also force you to pull in a bunch of transitive dependencies. Those could also be part of the standard library.
This all amounts to increasing the minimal scope of useful functionality a package has to have to be admitted and increasing accountability of the people maintaining them. This obviously comes with more effort on the maintainers part, but hey maybe we could even pay them for their labor.
But maybe that's not the right fit either. The model where package registries are open to whatever anyone uploads needs to die. It's no longer safe.
That model effectively becomes your ring 1. Ring 0 is the stdlib and the package manager itself, and - because you would always need to be able to step outside the distribution for either freshness or "that's not been picked up by the distro yet" reasons - the ecosystem package repositories are the wild west ring 2.
In the language ecosystems I'm only aware of Quicklisp/Ultralisp and Haskell's Stackage that work like this. Everything else is effectively a rolling distro that hasn't realised that's what it is yet.
Rust projects tend to take their project and split it into many smaller packages, for ease of development, faster compiles through parallelization, ensuring proper separation of concerns, and allowing code reuse by others. But the packages are equivalent to a single big package: the people who write them are the same, and they get developed in tandem and published at the same time. Take a look at the dependency tree for ripgrep; the split of different parts of that app allows me to reuse the regex engine without dealing with APIs that only make sense in the context of a CLI app, or pulling in code I won't ever use (which might be hiding an exploit too).
Counting 100 crates of 100 lines each, all by the same authors, as inherently more dangerous than one 10,000-line crate makes no sense to me.
You are just swapping a package manager with security by obscurity by copy pasting code into your project. It is arguably a much worse way of handling supply chain security, as now there is no way to audit your dependencies.
> If you get rid of transitive dependencies, you get rid of the need of a package manager
This argument makes no sense. Obviously reducing the amount of transitive dependencies is almost always a good thing, but it doesn't change the fundamental benefits of a package manager.
> There's so many C libraries like this
The language with the most fundamental and dangerous ways of handling memory, the language that is constantly in the news for numerous security problems even in massively popular libraries such as OpenSSL? Yes, definitely copy-paste that code in, surely nothing can go wrong.
> They also have bindings for every language under the sun. Rust libraries are very rarely used outside of Rust
This is a WILD assumption; doing C-style bindings is actually quite common. You will of course then also be exposing a memory-unsafe interface, as that is what you get with C.
What exactly is your argument here? It feels like what you are trying to say is that we should just stop doing JS and instead all write C programs that copy-paste massive libraries, because that is somehow 'high quality'.
This seems like a massively uninformed, one-sided and frankly ridiculous take.
You should try writing code and not relying on libraries for everything; it may change how you look at programming and actually ground your opinions in reality. I'm staring at my company's vendor/ folder right now. It has ~15 libraries, all but one of which operate on trusted input (game assets).
> fundamental benefits of a package manager.
I literally told you why they don't matter if you write code in a sane way.
> doing C-style bindings is actually quite common
I know bindings for Rust libraries exist. Read the literal words you quoted. "Rust libraries are very rarely used outside of Rust". Got some counterexamples?
https://github.com/dora-rs/dora
It is VERY common in existing codebases that are migrating from C++/C to make heavy use of FFI and existing C libraries.
Gamedev is its own weird thing, and isn't a model you want to generalize to other industries. It has to optimize for things a lot of software does not, and that skews development.
Vendoring libraries is almost always a terrible idea because it immediately starts to bitrot and become a footgun.
Sometimes it's necessary, but it's not desirable, and you almost always just want to pin your dependencies instead.
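Pinning means an exact version, not a range: a caret range re-resolves to whatever the registry serves tomorrow, while a pin plus a lockfile gives you the same bytes on every install. A toy resolver showing the difference (semver handling deliberately simplified to major-version carets; real resolvers do far more):

```python
def satisfies(version: str, spec: str) -> bool:
    """Toy semver check: "1.2.3" is an exact pin; "^1.2.3" accepts any
    later version with the same major number."""
    if spec.startswith("^"):
        base = tuple(map(int, spec[1:].split(".")))
        v = tuple(map(int, version.split(".")))
        return v[0] == base[0] and v >= base
    return version == spec
```

With `"^1.2.3"`, a compromised `1.9.0` published tomorrow gets pulled in automatically; with the pin `"1.2.3"`, it doesn't.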
Something to reflect upon too.
You got a project with 1-2 dependencies? Sure. But if you need to bring in 100 different libs (because you bring in 10 libs, each of which in turn brings in 10 libs), good luck.
So don’t?
With manual dependency management, everyone soon gravitates to a core set of deps, and library developers tend to reduce their dependency needs. That's why you see most C libraries dealing with file formats, protocols, and broad concerns. Smaller algorithms can be shared via gists and blog articles.
You just invented a worse Stack Overflow.
Using libraries is good, actually.
Rewriting the world to protect against a specific kind of threat is insane.
Some package managers are badly maintained (pip/NPM), while others are curated enough.
Again, if you have GNU/Linux installed, install Guix, read the Info manual on 'guix import', and just create a shell/container with 'guix shell --container' (and a manifest created via 'guix import'), then use any crap you need from NPM in a reproducible and isolated way. Your $HOME will be safe, for sure.