There are several issues with "Batteries Included" ecosystems (like Python, C#/.NET, and Java):

1. They are not going to include everything. This includes things like new file formats.

2. They are going to be out of date whenever a standard changes (HTML, etc.), an application changes (e.g. SQLite/PostgreSQL bindings for SQL/ORMs), or an API changes (DirectX, Vulkan, etc.).

3. Things like data structures, graphics APIs, etc. will have performance characteristics that may not match your use case.

4. They can't cover all niche use cases, such as the different libraries and frameworks for creating games of different genres.

For example, Python's XML DOM implementation only implements a subset of XPath and doesn't support parsing HTML.

The fact that Python, Java, and .NET have large third-party library ecosystems proves that even with a "Batteries Included" approach there will always be other things to add.

reply
"Batteries included" means "ossification is guaranteed", yah. "stdlib is where code goes to die" is a fairly common phrase for a reason.

There's clearly merit to both sides, but personally I think a major underlying cause is that libraries are trusted. Obviously that doesn't match reality. We desperately need a permission system for libraries, it's far harder to sneak stuff in when doing so requires an "adds dangerous permission" change approval.

reply
[dead]
reply
The goal is not to cover everything, the goal is to cover 90% of the use cases.

For C#, I think they achieved that.

reply
> They are going to be out of date whenever a standard changes (HTML, etc.)

You might want to elaborate on the "etc.", since HTML updates are glacial.

reply
The HTML "Living Standard" is constantly updated [1-6].

The PNG spec [7] has been updated several times: in 1996, 1998, 1999, and 2025.

The XPath spec [8] has multiple versions: 1.0 (1999), 2.0 (2007), 3.0 (2014), and 3.1 (2017), with 4.0 in development.

The RDF spec [9] has multiple versions: 1.0 (2004), and 1.1 (2014). Plus the related specs and their associated versions.

The schema.org metadata standard [10] is under active development and is currently on version 30.

[1] https://developer.mozilla.org/en-US/docs/Web/HTML/Reference/... (New)

[2] https://web.dev/baseline/2025 -- popover API, plain text content editable, etc.

[3] https://web.dev/baseline/2024 -- exclusive accordions, declarative shadow root DOM

[4] https://web.dev/baseline/2023 -- inert attribute, lazy loading iframes

[5] https://developer.mozilla.org/en-US/docs/Web/HTML/Reference/... (Baseline 2023)

[6] https://developer.mozilla.org/en-US/docs/Web/HTML/Reference/... (2020)

[7] https://en.wikipedia.org/wiki/PNG

[8] https://en.wikipedia.org/wiki/XPath

[9] https://en.wikipedia.org/wiki/Resource_Description_Framework

[10] https://schema.org/

reply
please! nobody uses XPath (because JSON killed XML), or RDF (the semantic web never happened, and one version every 10 years is not fast), or schema.org (again, nobody cares). PNG: no change in the last 26 years, not fast. And the HTML "living standard" :D is completely optional, and hence not a standard but a definition.
reply
deleted
reply
glaciers change faster than HTML
reply
Oof, I honestly hadn't considered that.
reply
Why would they be out of date? The ecosystems themselves (for example .NET) receive regular updates.

Yes, they cannot include everything, but they include enough that you do not _need_ third-party packages.

reply
deleted
reply
Python, .NET, and Java are not examples of batteries included.

Django and Spring are.

reply
And in fact wasn't a popular Python library just compromised very recently? See https://news.ycombinator.com/item?id=47501426.

So Python's clearly not "batteries included" enough to avoid this kind of risk.

reply
That's my point. You can have a large standard library like those languages I mentioned, but it isn't going to include everything or cover every use case, so you'll have external libraries (via PyPI for Python, NuGet for .NET, and Maven for Java/JVM).
reply
Compared to Node, .NET is batteries included: built-in LINQ vs. the external lodash package, built-in decimal vs. the decimal.js package, built-in model validation vs. the class-validator and class-transformer packages, built-in CSRF/XSRF protection vs. the csrf-csrf package. I can go on for a while...
reply
Python's standard library is definitely much more batteries-included than JavaScript's.
reply
Batteries-included systems are still susceptible to supply chain attacks; they just move slower, so they're not as attractive a target.

I think packages of a certain size need to be held to higher standards by the repositories. Multiple users should have to approve changes. Maybe enforced scans (though with Trivy's recent compromise, that won't be likely any time soon).

Basically anything besides a lone developer being able to decide on a whim to send something out that will run on millions of machines.

reply
While technically true, it's so much slower that it's essentially a different thing. Third-party packages being attacked is a near-daily occurrence. First-party attacks happen on a timescale and frequency of decades.

It's like the difference between protecting your home from burglars and from foreign soldiers. Both are technically invaders of your home, but the scope is different, and the solutions are different.

reply
> they just move slower so it’s not as attractive of a target.

Well, there are other things. Maven doesn’t allow you to declare “version >= x.y.z” and doesn’t run arbitrary scripts upon pulling dependencies, for one thing. The Java classpath doesn’t make it possible to have multiple versions of the same library at the same time. That helps a lot too.

NPM and the way node does dependency management just isn’t great. Never has been.

reply
> In practice you will tend to have a few, but you won't be vendoring out critical things like HTTP, TCP, JSON, string sanitation, cryptography

Unless you are Python, where the standard library includes multiple HTTP libraries and everyone installs the requests package anyways.

Few languages have good models for evolving their standard library, so you end up with lots of bad designs sticking around forever. Libraries are much easier to evolve, giving them the advantage in terms of developer UX and performance.

reply
What type of developer chooses UX and performance over security? So reckless.

I removed the locks from all the doors, now entering/exiting is 87% faster! After removing all the safety equipment, our vehicles have significantly improved in mileage, acceleration and top speed!

reply
>What type of developer chooses UX and performance over security? So reckless.

Initially I assumed this was sarcastic, but apparently not. UX and performance are what programmers are paid for! Making sure the UX is good is one of the most important parts of a programmer's job.

Security, meanwhile, is a moving target, a goal, something that can never be perfect, just "good enough" (if the NSA wants to hack you, they will). You make it sound like installing third-party packages is basically equivalent to a security hole, while in practice the risk is low, especially if you don't overdo it.

Wild to read extreme security views like that, while at the same time there are people here that run unconstrained AI agents with --dangerous-skip-confirm flags and see nothing wrong with it.

reply
Installing 3rd party packages the way Node and Python devs do regularly _is_ a security hole.
reply
We definitely agree on that. Fortunately some of the 600+ comments here include suggestions of what to do about it.
reply
Even more wild to read that sarcasm about "removing locks from doors for 87% speedup" is considered extreme...

And yes, we agree that running unconstrained AI agents with --dangerous-skip-confirm flags and seeing nothing wrong with it is insane. Kind of like just advertising for burglars to come open your doors for you before you get home - yeah, it's lots faster to get in (and to move about the house with all your stuff gone).

reply
Better developer UX can directly lead to better safety. "You are holding it wrong" is a frequent source of security bugs, and better UX reduces the ways you can hold it wrong, or at least makes you more likely to hold it the right way.
reply
> Better developer UX can directly lead to better safety.

Depends. If you had to add to a Makefile for your dependencies, you sure as hell aren't going to add 5k dependencies manually just to get a function that does $FOO; you'd write it yourself.

Now, with AI in the mix, there's fewer and fewer reasons to use so many dependencies.

reply
Friction is helpful. Putting seatbelts on takes more time than just driving, but it’s way safer for the driver. Current dev practices increase speed, not safety.
reply
"Security" is often more about corporate CYA than improving my actual security as a user, and sometimes in opposition, and there is often blatant disregard for any UX concession at all. The most secure system is fully encrypted with all copies of the encryption key erased.
reply
requests should be in the Python standard library. Hard choices need to be made.
reply
I'm pretty sure it's really one HTTP library: urllib.request is built on top of http.client. But the very Java-inspired API for the former is awful.
reply
> Unless you are Python, where the standard library includes multiple HTTP libraries and everyone installs the requests package anyways.

The amount of time spent defining the same data structures over and over again vs. `pip install requests` with its well-defined data structures.

reply

> Few languages have good models for evolving their standard library

Can you name some examples?
reply
Personally, I've heard Odin [1] does a decent job with this, at least from what I've superficially learned about its stdlib and included modules as an "outsider" (not a regular user). It appears to have things like support for image file formats built in, and new things are somewhat liberally added to core if they prove practically useful, since there isn't a package manager in the traditional sense. Here's a blog post by the language author literally named "Package Managers are Evil" [2]

(Please do correct me if this is wrong, again, I don't have the experience myself.)

[1] https://pkg.odin-lang.org/

[2] https://www.gingerbill.org/article/2025/09/08/package-manage...

reply
Irony is that Node has no need for Axios, native fetch support has been there for years, so in terms of network requests it is batteries included.
reply
It doesn't matter. We pulled axios out of our codebase, but it still ends up in there as a child or peer dependency of 40 other dependencies. Many from major vendors like Datadog, Slack, Twilio, Nx (in the gcs-cache extension), etc...
reply
People use axios or ky because with fetch you inevitably end up writing a small wrapper on top of it anyway.
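For illustration, here's roughly what that small wrapper tends to look like. This is a sketch; `makeApiClient`, its option names, and the example URL below are invented for this comment, not any real library's API.

```javascript
// A sketch of the small wrapper people end up writing over fetch instead of
// pulling in axios: base-URL handling, JSON encoding/decoding, and an error
// on non-2xx responses. The names (makeApiClient, apiFetch) are made up.
function makeApiClient(baseUrl) {
  return async function apiFetch(path, { method = "GET", body, headers = {} } = {}) {
    const init = { method, headers: { ...headers } };
    if (body !== undefined) {
      // Encode request bodies as JSON, the common REST case.
      init.headers["content-type"] = "application/json";
      init.body = JSON.stringify(body);
    }
    const res = await fetch(baseUrl + path, init);
    if (!res.ok) throw new Error(`HTTP ${res.status} for ${path}`);
    return res.json();
  };
}
```

Usage would be something like `const api = makeApiClient("https://api.example.com"); const user = await api("/users/1");` (placeholder URL and endpoint). Twenty lines, and most projects never need more than this.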
reply
Fetch has also lacked support for features that xhr has had for over a decade now, for example upload progress. It's slowly catching up though; upload progress is the only thing I'd still choose xhr for.
reply
You can pipe through a TransformStream that counts how many bytes you've uploaded, right?
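Something like this, as a sketch. Note the caveat: the counter reflects bytes the HTTP stack has consumed from your stream, which is not necessarily what xhr's progress events would report.

```javascript
// A TransformStream that counts bytes as they pass through, for
// approximating upload progress with fetch.
function countingStream(onProgress) {
  let sent = 0;
  return new TransformStream({
    transform(chunk, controller) {
      sent += chunk.byteLength;
      onProgress(sent); // bytes handed off so far, not bytes ACKed on the wire
      controller.enqueue(chunk);
    },
  });
}
```

Hooked up (with `url`, `file`, and `updateBar` as placeholders) it would look like `fetch(url, { method: "POST", body: file.stream().pipeThrough(countingStream(updateBar)), duplex: "half" })`.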
reply
That is a way to approximate it, though I'd be curious to know the semantics compared to xhr - would they both show the same value at the same network lifecycle of a given byte?
reply
Some might say the tradeoff of writing a small wrapper is worth it given what’s been demonstrated here.
reply
Yeah but what about other deps like db drivers?
reply
In my experience people feel the need to wrap axios too.
reply
These are the kind of people I hope AI replaces
reply
AI was trained on Axios wrappers, so it's just going to be wrappers all the way down. Look inside any company "API Client" and it's just a branded wrapper around Axios.
reply
Speak for yourself, Claude works fine with fetch on my system.
reply
Node fetch is relatively new. Wasn't marked stable until 2023, though I've used it since like 2018.
reply
I'm not sure fetch is a good server-side API. The typical fetch-based code snippet `fetch(API_URL).then(r => r.json())` has no response body size limit and can potentially bring down a server due to memory exhaustion if the endpoint at API_URL malfunctions for some reason. Fine in the browser but to me it should be a no-no on the server.
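One way to hedge against that on the server is to read the body through the stream reader with a byte budget instead of calling `r.json()` directly. A Node-flavored sketch (`Buffer` is Node-specific), not a drop-in library:

```javascript
// Read a fetch Response body but abort once more than maxBytes have
// arrived, instead of buffering an unbounded payload in memory.
async function readBodyWithLimit(response, maxBytes) {
  const reader = response.body.getReader();
  const chunks = [];
  let received = 0;
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    received += value.byteLength;
    if (received > maxBytes) {
      await reader.cancel(); // stop pulling from the network
      throw new Error(`response body exceeded ${maxBytes} bytes`);
    }
    chunks.push(value);
  }
  return Buffer.concat(chunks).toString("utf8");
}
```

Then `JSON.parse(await readBodyWithLimit(await fetch(API_URL), 1_000_000))` fails fast on a runaway endpoint rather than exhausting memory.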
reply
> I'm not sure fetch is a good server-side API. The typical fetch-based code snippet `fetch(API_URL).then(r => r.json())` has no response body size limit and can potentially bring down a server due to memory exhaustion if the endpoint at API_URL malfunctions for some reason. Fine in the browser but to me it should be a no-no on the server.

Nor is fetch a good client-side API either; you want progress indicators, on both upload and download. Fetch is a poor API all-round.

reply
Hm, I don't think axios would do much better here. `fetch` is the official replacement for axios. If both are flawed that's another topic
reply
Axios has maxContentLength and maxBodyLength options. I would probably go with undici nowadays though (it also has maxResponseSize).
reply
> `fetch` is the official replacement for axios.

No. Axios is still maintained. They have not deprecated the project in favor of fetch.

reply
I'm not saying that axios is unmaintained, I'm saying that if you want something like axios from the standard lib, fetch is the closest thing you get to official
reply
Sure, but Axios determines what the official replacement for Axios is.
reply
It's not deprecated, it's obsoleted.
reply
deleted
reply
deleted
reply
It doesn't have a need _now_. Axios is more than 10 years old, and even before axios, other libraries served the same purpose of making requests easier.
reply
Browsers too.

It’s not needed anymore.

reply
The other thing that keeps coming up is the github-code-is-fine-but-the-release-artifact-is-a-trojan issue. It really makes me question if "packages" should even exist in JavaScript, or if we could just be importing standard plain source code from a git repo.

I understand why this doesn't work well with legacy projects, but it's something that the language could strive towards.

reply
This might make things worse not better.

Yes - the postinstall hook attack vector goes away. You can do SHA pinning since Git's content addressing means that SHA is the hash of the content. But then your "lockfile" equivalent is just... a list of commit SHAs scattered across import statements in your source? Managing that across a real dependency tree becomes a nightmare.

This is basically what Deno's import maps tried to solve, and what they ended up with looked a lot like a package registry again.

At least npm packages have checksums and a registry that can yank things.

reply
You can just git submodule in the dependencies. Super easy. Also makes it straightforward to develop patches to send upstream from within your project. Or to replace a dependency with a private fork.

In my experience, this works great for libraries internal to an organization (UI components, custom file formats, API type definitions, etc.). I don't see why it wouldn't also work for managing public dependencies.

Plus it's ecosystem-agnostic. Git submodules work just as well for JS as they do for Go, sample data/binary assets, or whatever other dependencies you need to manage.

reply
> But then your "lockfile" equivalent is just... a list of commit SHAs scattered across import statements in your source? Managing that across a real dependency tree becomes a nightmare.

The irony is that this is actually the current best practice to defend against supply chain attacks in the github actions layer. Pin all actions versions to a hash. There's an entire secondary set of dev tools for converting GHA version numbers to hashes

reply
> I understand why this doesn't work well with legacy projects, but it's something that the language could strive towards.

Why wouldn't that work well with legacy projects? In fact, the projects I was a part of that I'd call legacy nowadays were built by copy-and-pasting .js libraries into a "vendor/" directory, and that's how we shipped them as well. This was in the days before Bower (which was the npm of frontend development back then), when vendoring JS libs was standard practice, before package managers became common in frontend development too.

Not sure why it wouldn't work. JavaScript is a very moldable language; you can make most things work one way or another :)

reply
Or you don't use a package manager where anyone can just publish a package (i.e. use your system package manager). There is still some risk, but it is much smaller. If xz had been distributed via PyPI or NPM, everyone would have been pwned, but instead it was (barely) found.

It's true that system repos don't include everything, but you can create your own repositories if you really need to for a few things. In practice, Fedora/EPEL are basically sufficient for my needs. Right now I'm deploying something with Yocto, which is a bit more limited in selection, but it's pretty easy to add my own packages, and it at least has hashes so things don't get replaced without me noticing (to be fair, I don't know if the security practices of OpenEmbedded recipes are as strong as Fedora's...).

reply
It's muddying what a package is. A package, or a distro, represents the people who slave and labor over packaging, reviewing, deciding on versions to ship, having policies in place, security mailing lists, release schedules, etc.

Just shipping crap from npm is essentially the equivalent of running your production code base against Arch AUR PKGBUILDs.

reply
I agree that dependencies are a liability, but sadly, "batteries included" didn't work out for Python in practice (i.e. how do I even live without numpy? No, the stdlib array module isn't enough).
reply
To the extent that Python is indeed "batteries included," that seems true. But just how "batteries included" is it? I'd argue that its batteries are pretty limited. Exhibit A: everybody uses the third-party requests instead of the stdlib urllib. Exhibit B: http.server isn't a production-ready webserver, so people use Flask or something beefier.

I'd contrast Python with Go, which has an amazing stdlib for the domains that Go targets. This last part is key--Go has a more focused scope than Python, and that makes it easier for its stdlib to succeed.

reply
We could have different Python package bundles: Python base. Python webdev. Python desktop.
reply
Fully agree with this! I think today .NET is probably the most batteries included platform you can get. This means that even if you use third-party libraries, these typically depend only on first-party dependencies, making it much less likely for something shady to sneak in.
reply
Kinda.

With Bun I use fewer dependencies from npm than I used from NuGet with .NET to build minimal APIs. For example, the pg driver.

reply
Why is .NET more "batteries included" than Java?
reply
With the notable exception of cross-platform audio.
reply
and cross-platform UI
reply
Not really notable, aiui the only mainstream language with anything like that is JS in the browser

And for good reason. There are enough platform differences that you have to write your own code on top anyway.

reply
I really like Go's batteries-included platform. I'm not sure about .NET though.
reply
C#'s LINQ (code as data, like Lisp) wins over Go for any kind of data access: strongly typed, language-native queries. Go has its own advantages though.
reply
EF is amazing
reply
And now with NativeAOT, you can use C# like go - you don't need to ship the CLR.
reply
This is a rather superlative, tunnel-vision, "everything is a nail because I'm a hammer" approach. The truth is this is an exceedingly difficult problem that nobody has adequately solved yet.
reply
I think the AI tooling is, if not completely solving sandboxing, at least making the default much better by asking you every time they want to do something and providing files to auto-approve certain actions.

Package managers should do the same thing

reply
Another benefit of AI tooling is that the cost of spinning up your own version of some libraries is lowered, and the result can be made hyper-specific to your needs rather than pulling in a whole library with features you'll never use.
reply
> Another benefit of AI tooling is that the cost of spinning up your own version of some libraries is lowered, and the result can be made hyper-specific to your needs rather than pulling in a whole library with features you'll never use.

Tell me about it. Using AI chatbots (not even agents), I got an MVP of a packaging system [1] to my liking (to create packages for a proprietary ERP system) and an endpoint-API-testing tool, neither of which requires a venv or similar to run.

------------------------------

[1] Okay, all it does now is create, sign, verify and unpack packages. There's a roadmap file for package distribution, which is a different problem.

reply
> at least making the default much better by asking you every time they want to do something

Really? I thought 'asking you every time they want to do something' was called 'security fatigue' and generally considered to be a bad thing. Yes you can concatenate files in the current project, Claude.

reply
Yes it has to be combined with a robust way to allowlist actions you trust
reply
Oddly, since I wrote that, Claude's 'auto' mode landed and I built something with it (instead of 'dangerously skip') and it's working.
reply
So, you're on Microsoft then; judging by ScottPlot, you write .NET desktop apps. If you use Dapper, you probably use Microsoft.Data.SqlClient, which is... distributed over NuGet and vulnerable to supply chain attack. You may not need many deps as a desktop dev. Modern-day line-of-business apps require a lot more deps: CsvHelper, ClosedXML, AutoMapper, WebOptimizer, NetEscapades.AspNetCore.SecurityHeaders.

Yes, the fewer deps people need the better, but it doesn't fix the core problem. Sharing and distributing code is a key tenet of writing modern code.

reply
Different programmers have very different ideas about what is "all the functionality you typically need."
reply
What are some examples of batteries-included languages that folk around here really feel productive in and/or love? What makes them so great, in your opinion?

(Leaving aside thoughts on language syntax, compile times, tooling etc - just interested in people's experiences with / thoughts on healthy stdlibs)

reply
I work in a NIS2 compliance sector, and we basically use Go and Python for everything. Go is awesome; Python, not so much. Go didn't always come with the awesome stdlib that it does today, which is likely partly why a lot of people still use things like Gin for web frameworks rather than simply using the standard library. Having worked with a lot of web frameworks, the one Go comes with is nice and easy enough to extend. Python is terrible, but on the plus side it's relatively easy to write your own libraries with Python, using C/Zig to do so if you need it. The biggest challenge for us is that we aren't going to write a better MSSQL driver than Microsoft, so we use quite a few dependencies from them, since we are married to Azure. These live in a little more isolation than you might expect, so they aren't updated quite as often as many places would update them. Still, it's a relatively low risk factor that we can accept.

Our React projects are the contrast. They live in total and complete isolation, both in development and in production. You're not going to work on React on a computer that will be connected to any sort of internal resources. We've also had to write a novel's worth of legal bullshit explaining how we can't realistically review every line of code from React dependencies for compliance.

Anyway, I don't think JS/TS is that bad. It has a lot of issues, but then, you could always have written your own wrapper on top of Node's fetch instead of using axios. Which I guess is where working in the NIS2 compliance sector makes things a little different, because we'd always choose to write the wrapper instead of using one others made, with the few exceptions for Microsoft products that I mentioned earlier.

reply
Go is well known for its large and high quality std lib
reply
Go didn't even have versioning for dependencies for ages, so CVE reporting was a disaster.

And there's plenty of libraries you'll have to pull to get a viable product.

reply
These are the big ones I use, specifically because of the standard libraries:

Python (decent standard library) - It's pretty much everywhere. There's so many hidden gems in that standard library (difflib, argparse, shlex, subprocess, cmd)

C#/F# (.NET)

C# feels so productive because of how much is available in .NET Core, and F# gets to tag along and get it all for free too. With C# you can compile executables that bundle the runtime, then strip them down so they're in the 15 MiB range. If you have dotnet installed, you can run F# as scripts.

reply
These are definitely some good thoughts, thanks!

Do you worry at all about the future of F#? I've been told it's feeling more and more like a second-class citizen on .NET, but I don't have much personal experience.

reply
I used to, but the knowledge of .NET seems mostly transferable to C#. It's super useful to do `dotnet fsi` and then work out the appropriate .NET calls in the F# REPL.
reply
yep!

This is exactly the world I'm working towards: packaging tooling with a virtual machine, i.e. like Electron but with virtual machines instead, so the isolation comes by default.

reply
deleted
reply
> "Batteries included" ecosystems are the only persistent solution to the package manager problem.

The irony in this case is that axios is not really needed now given that fetch is part of the JS std lib.

reply
For a lot of code, I switched to generating code rather than using 3rd party libraries. Things like PEG parsers, path finding algorithms, string sanitizers, data type conversion, etc are very conveniently generated by LLMs. It's fast, reduces dependencies, and feels safer to me.
reply
Ah, so you've traded the possibility of bad dependencies for certainty.
reply
Remember, our objective function here is “feels safe.”
reply
How can you come to that conclusion, given the specific examples I have given, which are tedious to write, but easy to proof-read and test?
reply
deleted
reply
deleted
reply
Because AI threatens the identity of many programmers.
reply
deleted
reply
deleted
reply
Or find the best third-party library and copy into your source tree the code from a widely used version that has been out long enough to be well tested.

The problem is not third-party libraries. It is updating third-party libraries when the version you have still works fine for your needs.

reply
Don't do this. Use a package manager that lets you pin against a specific version. Vendoring sidesteps most automated tooling that can warn you about vulnerabilities. Vendoring is a signal that your tooling is insufficient, 99% of the time.
reply
Vendoring means you don't have to fetch from the internet for every build, you can work offline, you're not at the mercy of oh-so-close-to-99.999% availability, it will keep on working in 10 years, and probably other advantages.

If your tooling can pull a dependency from the internet, it can certainly check whether a more recent version than the vendored one is available.

reply
Is there any package manager incapable of working offline?
reply
> Is there any package manager incapable of working offline?

I think you've identified the problem here: package management and package distribution are two different problems. Both tools have possibilities for exploits, but if they are separate tools then the surface area is smaller.

I'm thinking that the package distribution tool maintains a local system cache of packages, using keys/webrings/whatever to verify provenance, while the package management tool allows pinning, minver/maxver, etc.

reply
Honestly, you can get pretty far with just Bun and a very small number of dependencies. It’s what I love most about Bun. But, I do agree with you generally. .NET is about as good as I’ve ever seen for being batteries included. I just hate the enterprisey culture that always seems to pervade .NET shops.
reply
I agree about the culture. If I take my eye off the dev team for too long, I'll come back to find we're using Entity Framework and a 20-page document about configuring code cleanup rules in Visual Studio.
reply
Entity framework is pretty good.
reply
I agree. I got downvoted a lot the other day for proposing that Node should solve fundamental needs.
reply
But JavaScript is batteries included in this case: you can use XMLHttpRequest or fetch.
reply
What kind of apps do you build / industry etc?
reply
Language churn makes this problem worse.

Frankly, inventing a new language is irresponsible these days unless you build on top of an existing ecosystem, because you need to solve all these problems.

reply
> "Batteries included" ecosystems are the only persistent solution

Or write your own stuff. Yes, that's right, I said it. Even HTTP. Even cryptography. Just because somebody else messed it up once doesn't mean nobody should ever do it. Professional quality software _should_ be customized. Professional developers absolutely can and should do this and get it right. When you use a third-party HTTP implementation (for example), you're invariably importing more functionality than you need anyway. If you're just querying a REST service, you don't need MIME encoding, but it's part of the HTTP library anyway because some clients do need it. That library (that imports all of its own libraries) is just unnecessary bloat, and this stuff really isn't that hard to get right.

reply
> When you use a third-party HTTP implementation (for example), you're invariably importing more functionality than you need anyway. If you're just querying a REST service, you don't need MIME encoding, but it's part of the HTTP library anyway because some clients do need it. That library (that imports all of its own libraries) is just unnecessary bloat, and this stuff really isn't that hard to get right.

This post is modded down (I think because of the "roll your own crypto" vibe, which I disagree with), but it is actually spot on the money for HTTP.

The surface area for HTTP is quite large, and your little API, which never needed range requests, basic auth, multipart form upload, etc., suddenly gets owned because of a vulnerability in one of those things you not only never used but also never knew existed!

"Surface area" is a problem, reducing it is one way to mitigate.

reply
> the "roll your own crypto vibe", which I disagree with

Again, you run into attack surface area here. Think about the Heartbleed vulnerability. It was a bug in OpenSSL's implementation of the heartbeat extension (introduced for DTLS), but it affected every single user, including the 99% who weren't using it.

Experienced developers can, and should, be able to avoid things like side-channel attacks and the other gotchas that scare folks off of rolling their own crypto. The right solution here is better-defined, well-understood acceptance criteria and test cases, not blindly trusting something you downloaded from the internet.

reply
The reason I disagree about crypto is because:

1. It's really really hard to verify that you have not left a vulnerability in (for a good time, try figuring out all the different "standards" needed in x509), but, more importantly,

2. You already have options for a reduced attack surface. You don't need to use OpenSSL just for TLS; you can use wolfSSL (I'm very happy with it, actually). You don't need wolfSSL just for public/private-key signing and encryption; use libsodium. You don't need libsodium just for bcrypt password hashing; there's already a single function to do that.

With crypto, you have some options to reduce your attack surface. With HTTP you have few to none; all the HTTP libs take great care to implement as much of the specification as possible.

reply
> "standards" needed in x509

That's actually not really crypto, though - that's writing a parser (for a container that includes a lot of crypto-related data). And again... if you import a 3rd-party x.509 parser and you only need DER but not BER, you've got unnecessary bloat yet again.

reply
> Even cryptography

Good luck

reply