So, what's your counterproposal?

Each of these tools provides real value.

* Bundlers drastically improve runtime performance, but it's tricky to figure out what to bundle where and how.

* Linting tools and type-safety checkers detect bugs before they happen, but they can be arbitrarily complex, and they benefit from type annotations. (TypeScript won the type-annotation war in the marketplace against competitors including Meta's Flow and Google's Closure Compiler.)

* Code formatters automatically ensure consistent formatting.

* Package installers are really important and a hugely complex problem in a performance-sensitive and security-sensitive area. (Managing dependency conflicts/diamonds, caching, platform-specific builds… the diamond case is sketched below.)
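
To make the diamond point concrete, here's a toy illustration (all package names hypothetical): the installer has to reconcile two packages that agree on a dependency's name but not its version.

    // Toy dependency "diamond": app needs A and B, but A and B
    // disagree about which major version of C they can accept.
    const graph = {
      app: { deps: { A: "^1.0.0", B: "^1.0.0" } },
      A:   { deps: { C: "^1.0.0" } },
      B:   { deps: { C: "^2.0.0" } }, // no single C satisfies both ranges
    };
    // An installer must either install C twice (npm-style nesting)
    // or fail/deduplicate according to its resolution strategy.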

As long as developers benefit from using bundlers, linters, type checkers, code formatters, and package installers, and as long as it's possible to make these tools faster and/or better, someone's going to try.

And here you are, incredulous that anyone thinks this is OK…? Because we should just … not use these tools? Not make them faster? Not improve their DX? Standardize on one and then staunchly refuse to improve it…?

reply
I'm being a little coy because I do have a very detailed proposal.

I want the JS toolchain to stay written in JS, but I want to unify the design and architecture of all the tools you mentioned so that they share a common syntax-tree format and can exchange data, e.g. between the linter and the formatter, or between the bundler and the type checker.
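
To make that less abstract, here's a minimal sketch of the architecture I mean, with every name hypothetical: one parser produces one tree, and each tool is just a pass over it.

    // Hypothetical sketch (none of these are real APIs): one parser,
    // one tree, and every tool is a pass over that same tree.
    interface AstNode {
      kind: string;       // e.g. "CallExpression"
      start: number;      // offsets into sourceText
      end: number;
      children: AstNode[];
    }

    interface SyntaxTree {
      root: AstNode;
      sourceText: string;
    }

    // Linter, formatter, bundler, and type checker all implement this,
    // so their results can cross-reference the same nodes.
    interface ToolPass<Result> {
      run(tree: SyntaxTree): Result;
    }

    declare function parseOnce(source: string): SyntaxTree;

    function runToolchain(source: string, passes: ToolPass<unknown>[]): unknown[] {
      const tree = parseOnce(source);        // parse exactly once
      return passes.map((p) => p.run(tree)); // every tool shares the tree
    }

The point is that the linter's findings, the formatter's layout decisions, and the bundler's graph would all reference the same nodes instead of three divergent ASTs.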

reply
Yeah, it's a shame that few people realize that running 3 (or more) different programs, each with its own parser and AST, is the bigger problem.
reply
Not just because of perf (though the perf aspect is annoying), but because of how often the three will get out of sync and produce bizarre results.
reply
Hasn't that already been tried (10+ years ago) with projects like https://github.com/jquery/esprima, which have since seen their usage dramatically reduced for performance reasons?
reply
Yeah, you are correct. But that means I have the benefit of ten years of development in the web platform, as well as hindsight on the earlier effort.

I would say the reason the perf costs feel bad there is that the abstraction was unsuccessful. Throughput isn't that big a deal for a parser if you only need to parse the parts of the code that have actually changed.
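
To make that concrete, here's a toy sketch of change-only parsing (hypothetical names, not any real tool's API): cache the tree per file and only invoke the parser when the content hash changes.

    // Toy sketch of change-only parsing (hypothetical names throughout).
    import { createHash } from "node:crypto";

    type Tree = unknown; // whatever shape the shared parser produces
    declare function parseOnce(source: string): Tree;

    const cache = new Map<string, { hash: string; tree: Tree }>();

    function parseIfChanged(path: string, source: string): Tree {
      const hash = createHash("sha256").update(source).digest("hex");
      const hit = cache.get(path);
      if (hit && hit.hash === hash) return hit.tree; // unchanged file: skip the parser
      const tree = parseOnce(source);                // changed file: pay the parse cost once
      cache.set(path, { hash, tree });
      return tree;
    }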

reply
You can rip fast builds from my cold, dead hands. I’m not looking back to JS-only tooling, and I was there since the gulp days.
reply
All I can say for sure is that the old tools weren't slow because it's impossible to build fast tools on a JS runtime.

And anyway, these new tools tend to have a "perf cliff": you get all the speed of the new tool as long as you stay away from the JS integration API used to support the "long tail" of use cases. Once you fall off the cliff, though, you're back in the old slow-JS cost regime...
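
Sketched roughly (illustrative names only, not any real tool's API), the cliff looks like this: the plugin-free path never leaves the compiled core, while a single JS plugin forces a serialize-and-call round trip per file.

    // Rough shape of the "perf cliff" (illustrative names, not any real tool's API).
    type FileResult = { diagnostics: string[] };
    type JsPlugin = (serializedAst: string) => string[]; // user-supplied JS hook

    declare function nativeCheck(path: string): FileResult; // stays in the compiled core
    declare function serializeAstFor(path: string): string; // boundary cost, paid per file

    function checkFile(path: string, jsPlugins: JsPlugin[]): FileResult {
      const result = nativeCheck(path); // fast path: never leaves native code
      if (jsPlugins.length === 0) return result;

      // The cliff: a single JS plugin forces a native<->JS round trip per file,
      // paying serialization on the way out and result marshalling on the way back.
      const ast = serializeAstFor(path);
      for (const plugin of jsPlugins) {
        result.diagnostics.push(...plugin(ast));
      }
      return result;
    }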

reply
I look at it and don't really have an issue with it. I have been using tsc, vite, eslint, and prettier for years. I am in the process of switching my projects to tsgo (which will soon be tsc anyway), oxlint, and oxfmt. It's not a big deal and it's well worth the 10x speed increase. It would be nice if there was one toolchain to rule them all, but that is just not the world we live in.
reply
How do you plan to track CVEs flagged on tsgo's native dependencies?
reply
I only use it for typechecking locally and in CI. I don’t have it generating code. Of course, what is generating my code is esbuild and soon Rolldown, so same issue maybe. If CVEs in tsgo’s deps are a big risk to run locally, I would say I have much bigger problems than that — a hundred programs I run on my machine have this problem.
reply
The good part is that the new tools do replace the old ones, while being compatible. The pattern is:

* Rolldown is compatible with Rollup's API and can use most Rollup plugins

* Oxlint supports JS plugins and is ESLint compatible (can run ESLint rules easily)

* Oxfmt plans to support Prettier plugins, in turn tapping into the power of the existing ecosystem

* and so on...

So you get better performance and can still work with your favorite plugins and extend tools "as before".

Regarding the "mix of technology" or tooling fatigue: I get that. We have to install a lot of tools, even for a simple application. This is where Vite+[0] will shine, bringing the modern and powerful tools together, making them even easier to adopt and reducing the divide in the ecosystem.

[0] https://viteplus.dev/

reply
e: ahhh frick this is just stupid AI spam for this dude’s project.

Supports… some ESLint rules. It is not "easy" to add support to Oxlint for the rules it does not support.

The projects at my work that "switched" to it now use both ESLint and Oxlint. It sucks, but at least a subset of errors is caught much faster.

reply
Vite+ is not “this dude’s project”, it’s made by the team that makes all the tools discussed in this article.
reply
Yeah, no. Real human here.

Oxlint does support core rules out of the box, but it also has support for JS plugins[0], as mentioned. If you don't rely on a custom parser (for Svelte or Vue components, for example), things just work. Even React Compiler rules[1].

[0] https://oxc.rs/docs/guide/usage/linter/js-plugins.html

[1] https://github.com/TheAlexLichter/oxlint-react-compiler-rule...
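
For a sense of how small the plugin surface is, here's a minimal rule in the ESLint-compatible shape those docs describe; the rule's message and the debugOnly() target are invented for illustration.

    // Minimal ESLint-style rule (message and target function invented).
    // If Oxlint runs ESLint rules as described above, a rule shaped like
    // this should load as-is.
    module.exports = {
      meta: {
        type: "problem",
        docs: { description: "disallow calls to the hypothetical debugOnly()" },
        messages: { noDebugOnly: "Remove debugOnly() before shipping." },
        schema: [],
      },
      create(context) {
        return {
          // Visitor keyed by AST node type -- the contract every linter
          // needs the parser to agree on.
          CallExpression(node) {
            if (node.callee.type === "Identifier" && node.callee.name === "debugOnly") {
              context.report({ node, messageId: "noDebugOnly" });
            }
          },
        };
      },
    };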

reply
> We have to maintain everything, old and new, because it's all still critical, engineers have to learn everything, old and new, because it's all still critical.

I completely agree, but maintenance is a maintainer problem, not a problem for the consumer or user of the package, at least according to the average user of open source nowadays. One of two things will come out of this: either the wheels start falling off once the community can no longer maintain this fractured tooling, as you point out, or companies are going to pick up the slack and start stewarding it (likely looking for opportunities to capture tooling and profit along the way).

Neither outcome looks particularly appealing.

reply
Yes, this just sounds like the run-of-the-mill specialization issue that is affecting every industry (and has been affecting every industry since before AI). Web devs learn JavaScript/TypeScript/frameworks, "middleware" developers learn Rust/Go/C++/etc. to build the web development frameworks, lower-level devs build that, etc. There shouldn't be a strict need for someone who wants to make websites or web technology to learn Rust or Go unless they want to break into web framework development or WASM stuff. But again, this is just over-specialization that has been happening since forever (or at least since the Industrial Revolution).
reply
It's definitely an explosion of complexity but also something that AI can help manage. So :shrug: ...

Based on current trends, I don't think people care about knowing how all the parts work (even before these powerful LLMs came along) as long as the job gets done and things get shipped and it mostly works.

reply
> All I really see is an explosion of complexity.

I thought this was the point of all development in the JavaScript/web ecosystem?

reply
In retrospect, the tolerance for excess complexity in the JS/npm/yarn/web framework ecosystem was an important precursor to the wanton overconsumption of today's LLM ecosystem.
reply