Forking the Web

(dillo-browser.org)

> The specification must contain a non-ambiguous formal grammar that can be parsed easily. A page can then be tested against the standard and reject or accept as compliant. Pages that don't conform with the specification won't be rendered. It is explicitly forbidden for clients to accept any page that doesn't conform with the specification.

This is what XHTML was, and it was a complete disaster. There's a reason almost nobody serves XHTML with the application/xhtml+xml MIME type, and that reason is that getting a “parser error” (this is what browsers still do! try it!) is always worse than getting a page that 99% works.[0] I strongly believe that rejecting the robustness principle is a fatal mistake for a web-replacement project. The fact that horribly broken old sites can stay online and stay readable is a huge part of the web's value. Without that, it's not really “the web”, spiritually or otherwise.

[0] It's particularly “cool” how they simply do not work in the Internet Archive's Wayback Machine. The page can be retrieved, but nobody can read it.
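The failure mode above can be sketched with stdlib parsers standing in for the browser engines: the same markup run through a strict XML parser (roughly what application/xhtml+xml triggers) and a forgiving HTML parser (roughly what text/html triggers). The variable names are mine, not from any spec.

```python
# Sketch of why draconian parsing fails in practice: one unclosed tag
# makes the whole page unreadable in strict mode, while lenient mode
# recovers and still renders something.
import xml.etree.ElementTree as ET
from html.parser import HTMLParser

broken = "<p>Hello<br>world</p>"  # <br> never closed: invalid XML, routine HTML

# Strict XML mode (the XHTML "parser error" experience): hard failure.
try:
    ET.fromstring(broken)
    xml_accepted = True
except ET.ParseError:
    xml_accepted = False

# Lenient HTML mode (the robustness principle): no exception, page renders.
HTMLParser().feed(broken)

print(xml_accepted)  # False
```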

reply
Agreed. There are situations where I might want to ensure 100% correctness: life-or-death scenarios, say (which should probably use a different protocol anyway). But checking the sports score or looking at cat memes isn't one of them.
reply
XHTML failed in an era when writers (even normies) were writing some HTML of their own and they couldn't be trusted to close their tags properly. XHTML also assumed writers would be personally invested in semantic markup, like distinguishing e.g. the italics of book titles from the italics of emphasis.

Today, when writers are using visual editors (or Markdown), few are writing their own HTML any more. A web standard requiring compliance would work differently today.

reply
Markdown sux and so do visual editors. I think visual editors were just invented to make it so cut-and-paste never quite works right. There's been some conceptual problem with the whole idea ever since MS Word, and the industry has never dealt with it.
reply
No scripting is a tell: it's about wanting other people to accommodate their concerns about running a complex browser, not about solving a real problem.

If it did somehow happen that a good deal of interesting content was published using the standard, the most popular client would probably be nonconforming, ignoring the rule to not render ambiguous content.

reply
History explains why HTML is now a living standard: https://whatwg.org/faq (Ctrl+F Living and keep reading).

> A published version of the standard NEVER, EVER, EVER, EVER changes.

WhatWG does have per-commit snapshots of the standard. They're just not semantically versioned because it is a living standard.

I think what the author wants is something like Gemini instead of HTML, but that has its own set of problems. My plea for Dillo would be to instead just support a text/markdown MIME type natively, and we can try for adoption in more browsers.
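The serving side of that plea is the easy half; only the browser rendering support is missing. A minimal sketch with Python's stdlib (MarkdownHandler is a hypothetical name of mine; text/markdown is the media type registered in RFC 7763):

```python
# Serve .md files with the text/markdown media type instead of the
# default application/octet-stream, so a hypothetical Markdown-aware
# browser could render them natively.
from http.server import SimpleHTTPRequestHandler

class MarkdownHandler(SimpleHTTPRequestHandler):
    # guess_type() consults extensions_map first, so adding ".md"
    # here overrides the mimetypes fallback.
    extensions_map = {
        **SimpleHTTPRequestHandler.extensions_map,
        ".md": "text/markdown",
    }

# Usage: HTTPServer(("", 8000), MarkdownHandler).serve_forever()
print(MarkdownHandler.extensions_map[".md"])  # text/markdown
```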

reply
I mostly agree with the article - I believe the distinction should be between documents and applications.

While HTML serves its purpose, especially for documents, the modern web is a giant mess of that legacy, combined with unfriendly ergonomics and glue/hacks built on top just so we as developers can have better DX for creating complex software on top of it.

Building a browser means having to deal with all that legacy, whether we like it or not, so most of the browser market got captured by the big players who have enough manpower to cover all those edge cases. That also means we have to deal with whatever technical choices or bloat they make, causing an infinite stream of issues, from memory usage, to size, to limitations that don't make sense in 2026 but are still there because someone 20 years ago decided to write them like that. As I deal with mobile webviews a lot in my daily work, I unfortunately had to get familiar with quite a few gotchas and edge cases, and some are just... absurd in this day and age.

I believe we need a separation between the application layer and the document layer, and especially between the UI language and the actual application code - script tags serve their purpose, but again, they are a hacky solution with their own bag of tricks, and those tricks impact all of the software built upon it.

Now, a bit of a shameless plug: I've been working on something to fill that gap, at least for myself and hopefully for others who encounter the same issue. It's called Hypen (https://hypen.space) and it's a DSL for building apps that work natively on all platforms, with strict separation of code/UI/state, and support for as many languages and platforms as I can maintain, not "just javascript". While currently it's focused on streaming UI, it's built with Rust and WASM at its core and will soon allow fully compilable apps.

While it may not be the future of software, once you get into building something like that, it becomes obvious that the way we are building now is at best wrong, and at worst Kafkaesque.

reply
Developers would rather fork the Web than admit Chrome is the new IE6 and stop targeting it.
reply
I think at least part of the reason for this is acknowledging that the web isn't much of a web any longer. You've got three or four vendors that serve the vast majority of all internet traffic. And it's not happenstance that those same vendors now control something which was originally meant to be democratic.

Most of this document reads to me like that's the problem they're trying to solve, not just Chrome's huge market share, so simply not targeting it doesn't serve their purpose.

reply
how is the web not democratic?

in real democracies the populists (facebook, tiktok, chrome) always win. because that's what the masses want

reply
If I could, I would post an Amen gif.
reply
> A page can then be tested against the standard and reject or accept as compliant. Pages that don't conform with the specification won't be rendered. It is explicitly forbidden for clients to accept any page that doesn't conform with the specification.

it's as if nothing was learned from the XHTML debacle

reply
[dead]
reply
>Adding scripting capabilities was a mistake, so we can avoid it now

Gemini protocol?

reply
Can't say I hate the HTML5 spec. It resolves the ambiguities that made previous HTML specs insufficient for building a working web browser.

The standards that make my life miserable at times are the secondary standards like GDPR and WCAG as well as the de facto "standard" systems we are forced to participate in such as Cloudflare, the advertising economy, etc.

It's easy to say "WebUSB is bloat" and I'd certainly say PWA is something that could only come out of the mind that brought us Kubernetes, but lately I've been building biosignals applications and what should my choice be: write fragile GUI applications for the desktop that look like they came out of a lab and crash from memory leaks or spend 1/5 the time to make web applications that look like they belong in the cockpit of a Gundam and "just work"?

reply
>I'd certainly say PWA is something that could only come out of the mind that brought us Kubernetes

How so? PWAs are awesome! Democratizing for users. Democratizing for developers. They work well for the right class of apps. They would go much further if there weren't forces actively resisting them. Think of all the Electron-type apps out there. Now imagine if the average Joe could just install them from the web with 2 clicks.

(Regular ole bookmarks get you a decent percentage of the way, but clearly something more than that was needed.)

reply
I feel like that's not solving any of the problems I think of the Web as having.

You can certainly make something with it, but I can't imagine most people finding a use for it.

reply
I think original web standards were solving a completely different problem: sharing information.

The modern Internet is 45% appearances and 50% search traffic optimization. For better or worse, we lost all usable registries of websites; we lost websites free of appearance and traffic considerations. The information-focused Web is pretty much dead.

Maybe these ideas did not scale or monetize that well, but we will never really know what an information-focused version of the Internet would have looked like, because evolution took it elsewhere. Unless we try building another one with different principles and limitations at the core.

reply
I agree. Even where blogging and sharing information is still around, it is strongly linked with brand-building, monetization, and engagement-maxxing. Look at all the old WordPress bloggers who switch to Substack in order to get some eyeballs on their posts, and then inevitably begin conforming to its ethos, willingly or unwillingly.

For me, the information-sharing part of the internet now is the shadow libraries. I can get access to all (well, still not quite all) journals and university-press publications from the last century? Awesome. Vastly more informative than some blogger who nowadays is trying to monetize my attention.

reply
At this point we need a fork of not just the web but the entire internet, one built for privacy.
reply
I support forking the web, into the simple information web services that the web started with. This is a magnificent idea.
reply
Seems like somebody isn't accepting that every successful project grows and becomes unwieldy like this. It's all the legacy backward compatibility of every iterated idea, which you now have to support.
reply
Ah yes, another "If I Were King" blog post. For an example of how it will turn out, look at how many JavaScript frameworks have been built to replace an overly complicated, unwieldy previous one.

oh and also https://xkcd.com/927/

reply