Each of these tools provides real value.
* Bundlers drastically improve runtime performance, but it's tricky to figure out what to bundle where and how.
* Linting tools and type-safety checkers detect bugs before they happen, but they can be arbitrarily complex, and benefit from type annotations. (TypeScript won the type-annotation war in the marketplace against other competing type annotations, including Meta's Flow and Google's Closure Compiler.)
* Code formatters automatically ensure consistent formatting.
* Package installers are really important and a hugely complex problem in a performance-sensitive and security-sensitive area. (Managing dependency conflicts/diamonds, where e.g. your app needs B@1 while a dependency needs B@2; caching; platform-specific builds…)
As long as developers benefit from using bundlers, linters, type checkers, code formatters, and package installers, and as long as it's possible to make these tools faster and/or better, someone's going to try.
And here you are, incredulous that anyone thinks this is OK…? Because we should just … not use these tools? Not make them faster? Not improve their DX? Standardize on one and then staunchly refuse to improve it…?
I want the JS toolchain to stay written in JS, but I want to unify the design and architecture of all the tools you mentioned so that they share a common syntax-tree format and can exchange data, e.g. between the linter and the formatter, or the bundler and the type checker.
I would say the reason the perf costs feel bad there is that the abstraction was unsuccessful. Throughput isn't that big a deal for a parser if you only need to parse the parts of the code that have actually changed.
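A minimal sketch of that idea (names and structure are mine, not any tool's actual implementation): key cached ASTs by content hash, so an edit to one file only reparses that file.

```ts
import { createHash } from 'node:crypto';

type Ast = unknown; // stand-in for a real syntax-tree type

const astCache = new Map<string, { hash: string; ast: Ast }>();

function parseModule(source: string): Ast {
  // stand-in for a real (expensive) parser call
  return { source };
}

function parseIncremental(path: string, source: string): Ast {
  const hash = createHash('sha256').update(source).digest('hex');
  const cached = astCache.get(path);
  if (cached && cached.hash === hash) {
    return cached.ast; // file unchanged since last run: skip the parse entirely
  }
  const ast = parseModule(source);
  astCache.set(path, { hash, ast });
  return ast;
}
```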
And anyway, these new tools tend to have a "perf cliff": you get all the speed of the new tool as long as you stay away from the JS integration API used to support the "long tail" of use cases. Once you fall off the cliff, though, you're back to the old slow-JS cost regime...
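A toy illustration of the cliff (entirely hypothetical, not any real tool's code): as soon as one JS hook is registered, every module has to cross the native/JS boundary, so the whole pipeline drops back to JS speed.

```ts
type JsHook = (code: string) => string;

// Hypothetical stand-in for the fast native (e.g. Rust) transform path.
function nativeTransform(code: string): string {
  return code; // imagine fast native work here
}

function transformAll(modules: string[], jsHooks: JsHook[]): string[] {
  if (jsHooks.length === 0) {
    // fast path: the whole pipeline stays native
    return modules.map(nativeTransform);
  }
  // the perf cliff: each module is serialized across the boundary
  // and run through every JS hook, at old-JS-tool speeds
  return modules.map((code) =>
    jsHooks.reduce((acc, hook) => hook(acc), nativeTransform(code)),
  );
}
```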
* Rolldown is compatible with Rollup's API and can use most Rollup plugins
* Oxlint supports JS plugins and is ESLint compatible (it can run ESLint rules easily)
* Oxfmt plans to support Prettier plugins, in turn tapping into the power of the ecosystem
* and so on...
So you get better performance and can still work with your favorite plugins and extend tools "as before".
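For instance, here's a sketch of what that compatibility looks like in practice, assuming Rolldown's Rollup-style config (check the Rolldown docs for the exact API):

```ts
// rolldown.config.ts — an ordinary Rollup plugin dropped into Rolldown
import { defineConfig } from 'rolldown';
import json from '@rollup/plugin-json'; // unmodified Rollup plugin

export default defineConfig({
  input: 'src/index.ts',
  output: { dir: 'dist', format: 'esm' },
  // works because Rolldown implements Rollup's plugin hook interface
  plugins: [json()],
});
```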
Regarding the "mix of technology" or tooling fatigue: I get that. We have to install a lot of tools, even for a simple application. This is where Vite+[0] will shine, bringing the modern and powerful tools together, making them even easier to adopt and reducing the divide in the ecosystem.
Supports… some ESLint rules. It is not “easy” to add support to Oxlint for the rules it lacks.
The projects at my work that “switched” to it now use both ESLint and Oxlint. It sucks, but at least a subset of errors is caught much faster.
Oxlint does support the core rules out of the box and has support for JS plugins[0], as mentioned. If you don't rely on a custom parser (for Svelte or Vue components, for example), things just work. Even React Compiler rules[1]. (See the rule sketch after the links.)
[0] https://oxc.rs/docs/guide/usage/linter/js-plugins.html
[1] https://github.com/TheAlexLichter/oxlint-react-compiler-rule...
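To illustrate, here's a minimal ESLint-shaped rule of the kind the JS-plugin docs[0] describe. The plugin and rule names are made up, and the exact registration with Oxlint may differ from this sketch:

```ts
// example-plugin.ts — hypothetical plugin exposing one ESLint-style rule
const plugin = {
  meta: { name: 'example-plugin' }, // hypothetical plugin name
  rules: {
    'no-debugger-statement': {
      create(context: any) {
        return {
          // visitor keyed by AST node type, exactly as in ESLint
          DebuggerStatement(node: any) {
            context.report({ node, message: 'Remove debugger statements.' });
          },
        };
      },
    },
  },
};

export default plugin;
```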
I completely agree, but maintenance is a maintainer's problem, not the package consumer's, at least according to the average open-source user nowadays. One of two things will come out of this: either the wheels start falling off once the community can no longer maintain this fractured tooling, as you point out, or companies pick up the slack and start stewarding it (likely looking for opportunities to capture tooling and profit along the way).
Neither outcome looks particularly appealing.
Based on current trends, I don't think people care about knowing how all the parts work (even before these powerful LLMs came along) as long as the job gets done, things get shipped, and it mostly works.
I thought this was the point of all development in the JavaScript/web ecosystem?