upvote
Agreed. There may be some situations where I want to ensure 100% correctness. I'm thinking life-or-death scenarios (which, if so, should maybe use a different protocol). However, checking the sports score or looking at cat memes isn't that.
reply
To be fair, HTML5 also has a defined parsing algorithm. It just happens to always produce a webpage from any input.
reply
Yes, this is what you'd want. It doesn't have to be as complicated as the HTML5 algorithm, either. That one is complicated because it was a harmonization of at least three browsers' multi-decade heuristics and untold terabytes of existing HTML practice. An algorithm unconcerned with backwards compatibility could be much simpler while still clearly defining error behavior, which is much easier to use than "scream and die".

And it's still unambiguous. You can cringe at what some people do, but it would be strictly a taste issue rather than a technical one, as the parse would still be unambiguous. And if you think you can fix taste issues with technical specification, well, you've already lost anyhow.
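To make the contrast concrete, here's a minimal sketch using Python's stdlib, purely illustrative: a strict XML-style parser aborts the whole document on one unclosed tag ("scream and die"), while a lenient parser with defined error behavior still yields a usable parse.

```python
import xml.etree.ElementTree as ET
from html.parser import HTMLParser

bad = "<p>unclosed <i>italic</p>"

# Strict parsing: one error and the entire document is rejected.
try:
    ET.fromstring(bad)
    strict_ok = True
except ET.ParseError:
    strict_ok = False  # "scream and die"

# Lenient parsing with defined behavior: events still come out.
class Collector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.tags = []

    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

c = Collector()
c.feed(bad)
print(strict_ok, c.tags)  # False ['p', 'i']
```

The point isn't that lenient is better; it's that both behaviors are fully specified, so any two conforming parsers agree on the result.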

reply
I think the GP has an issue not with the specification part, but with the part where it's forbidden for clients to render a noncompliant page.
reply
No scripting is a tell: it's about wanting other people to accommodate the author's concerns about running a complex browser, not about solving a real problem.

If it did somehow happen that a good deal of interesting content was published using the standard, the most popular client would probably be nonconforming, ignoring the rule to not render ambiguous content.

reply
Every modern alternative web protocol is about accommodating the author's concerns and pet peeves about the modern web (and usually gatekeeping it from capitalists and normies).

Protocols used to be limited by technology, now they're defined by ideology.

reply
XHTML failed in an era when writers (even normies) were writing some HTML of their own, and they couldn't be trusted to close their tags properly. XHTML also assumed writers would be personally invested in semantic markup, like distinguishing e.g. the italics of book titles from the italics of emphasis.

Today, when writers are using visual editors (or Markdown), few are writing their own HTML any more. A web standard requiring compliance would work differently today.

reply
Markdown sux and so do visual editors. I think visual editors were just invented to make it so cut-and-paste never quite works right. There's been some conceptual problem with the whole idea ever since MS Word, and the industry has never dealt with it.
reply
> XHTML failed in an era when writers (even normies) were writing some HTML of their own

I'd say it was a minority of writers that were handcrafting XHTML. And everyone, whether handcrafting or using tools, could validate their compliance using a browser, which made it very easy to adjust your tools or your handcrafted code. We are now in a situation where there is no schema for HTML.

I, for one, am very much in favor of forking the web with a document format with a schema. It really seems like a small and simple change to me.

reply
deleted
reply
Note that when I say "writing their own HTML", I don't mean handcrafting a whole webpage. I mean that people were writing i or b tags in their Wordpress editors or in online comment boxes, because back then such text fields did not have visual editors and would accept raw tags. Under XHTML, if the writer did not close tags properly, such input would have broken the whole page, so obviously back then such a standard was DOA.
reply
Those cases were easy to fix by running e.g. HTML Tidy on the UGC.

Honestly, I don't think it was killed by any one thing in particular. No platform really cared; it wasn't a win for anyone, and it was occasionally a loss.

reply