upvote
Unfortunately you've now made an incredibly niche browser, and the lack of those metrics is a good fingerprint by itself. How browsers render SVGs can be used for fingerprinting (even the underlying OS affects this, and I assume you'll still want to see those). Combine that with your ISP from your IP address, and unless there are hundreds of users in every city, you're now pretty easily trackable.
reply
There's no problem with having a unique fingerprint. The problem is having a consistent one. Randomize the fingerprint every time and you're fine. The IP address problem applies to everyone, including anyone using Tor Browser. The only solution to that is not using your own IP address (VPN/proxy). If I were going to make a secure, privacy-focused browser, it wouldn't allow things like rendering SVGs (which have introduced vulnerabilities beyond tracking), wouldn't allow much (if any) JS, and would only allow a sane subset of CSS.
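To make the "unique is fine, consistent is the problem" point concrete, here's a toy sketch (names and attribute values are invented for illustration): a fingerprint is just a hash over observable attributes, so re-randomizing even one attribute per session breaks linkability across visits.

```python
import hashlib
import random

def fingerprint(attrs: dict) -> str:
    """Hash a set of observable browser attributes into a fingerprint string."""
    canonical = "|".join(f"{k}={v}" for k, v in sorted(attrs.items()))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# A consistent browser yields the same fingerprint on every visit — trackable.
stable = {"ua": "NicheBrowser/1.0", "render_hash": "a1b2c3", "fonts": "Arial,Courier"}
assert fingerprint(stable) == fingerprint(stable)

def randomized(attrs: dict) -> dict:
    """Inject per-session noise into one attribute (toy version of randomization)."""
    noisy = dict(attrs)
    noisy["render_hash"] = format(random.getrandbits(32), "08x")
    return noisy

# Two sessions now hash differently (with overwhelming probability),
# so the fingerprint is still unique but no longer consistent.
print(fingerprint(randomized(stable)))
print(fingerprint(randomized(stable)))
```

This is why real-world defenses (e.g. canvas noise in some privacy browsers) perturb outputs rather than remove them: a randomized answer is less linkable than a conspicuously missing one.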
reply
> Unfortunately you've now made an incredibly niche browser, and the lack of those metrics is a good fingerprint by itself.

If 100 people are using that browser, how will they know which one is me?

> How browsers render SVGs can be used for fingerprinting (even the underlying OS affects this, and I assume you'll want to see those)

Can you provide details on this? And how will they know which OS I'm using (through SVG rendering...)? The UserAgent definitely should not send the OS.

> combine with ISP from IP address

That's already provided whether I use Private mode or not, correct? I can always use a VPN.

reply
You're the only one out of 100 who visits HN, or whose usage matches a particular timezone, or whose usage pattern [anti-]correlates with your work pattern, or ...
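The intersection attack described above can be sketched in a few lines (the population and attributes are made up for illustration): each observable trait filters the anonymity set, and two or three traits are often enough to shrink 100 users down to one.

```python
# Hypothetical population of 100 users of the same niche browser.
users = [
    {
        "id": i,
        "visits_hn": (i % 50 == 0),            # only 2 of 100 visit HN
        "timezone": "UTC-5" if i < 10 else "UTC+1",
    }
    for i in range(100)
]

def anonymity_set(population, **traits):
    """Return the users matching every observed trait."""
    return [u for u in population if all(u[k] == v for k, v in traits.items())]

print(len(anonymity_set(users, visits_hn=True)))                     # 2
print(len(anonymity_set(users, visits_hn=True, timezone="UTC-5")))   # 1 — identified
```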
reply
I can't edit, but I forgot to add:

No support for forms. The browser is meant for content consumption. Not for interaction/creation.

One could argue that any JS capabilities to do network requests (including dynamically rendering content) would be disallowed.

Yes, I know, this is going pre-Web 2.0.

Yes, of course, most current sites won't work in that model. But I'll also say: Most current content sites don't need these capabilities. They have them because they know the browser supports them.

Again - a fantasy. I know only a few people will use it. I know that won't be enough to change web behavior. It would be nice, though, if sites carried a badge to indicate they conform to all of the above.

reply
Just use Tor Browser? You can turn the Tor part off if you need the speed.

What you want exists; have at it.

reply
As the submission shows, Tor browser isn't enough. My hypothetical browser would never have an IndexedDB API. Why should it?

"Web applications use it for offline support, caching, session state, and other local storage needs"

This use case is completely orthogonal to what my browser is meant to do. My browser would not have a concept of local storage.

The premise of starting with a modern browser and stripping away features to get privacy is flawed - it's always vulnerable to these types of things. I'm going the opposite route: Only add features if they cannot be exploited for monitoring.
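The strip-away vs. build-up distinction can be illustrated with a toy sketch (the feature names are arbitrary): a denylist only excludes risks you already know about, while an allowlist excludes anything unvetted by default.

```python
# Denylist (strip-away): start with everything, remove known-risky features.
ALL_FEATURES = {"html", "css", "js", "svg", "indexeddb", "webgl", "canvas"}
denylist = {"indexeddb", "webgl"}
stripped_browser = ALL_FEATURES - denylist
# An unanticipated tracking vector (e.g. canvas) slips through:
assert "canvas" in stripped_browser

# Allowlist (build-up): start from nothing, admit only vetted features.
allowlist_browser = {"html", "css_subset"}
# Anything not explicitly vetted is simply absent:
assert "canvas" not in allowlist_browser
assert "indexeddb" not in allowlist_browser
```

The asymmetry is the whole argument: a denylist browser is always one newly discovered API abuse behind, while an allowlist browser fails closed.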

reply
I've had the same thought for 20 years, and unfortunately it's less likely than ever to happen now, given how many sites require JavaScript and put up Cloudflare pages before even loading (I get several a day).

Thankfully, I think traditional web surfing is probably going to die out in the next 10 years, and decline significantly much sooner than that, as people start to interact with AI rather than browsers (or any software, for that matter).

My feed of Hacker News is going to be my AI agent giving it to me in plain text very soon, and soon after that I will probably never visit the internet again, because it will be impossible to know what's real and fake.

As a millennial, it will be interesting to experience the full cycle: born when nothing was online, then everything online, then entirely offline again by the time I'm older.

reply
> my feed of hackernews is going to be my AI agent giving it to me in plain text very soon

Wait for the advent of local agents running on local models (for privacy), followed by techniques to fingerprint agents, followed by techniques to infer query parameters from agent behavior. I wish I were joking, but it seems all too plausible.
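One plausible flavor of agent fingerprinting is behavioral: a scripted agent's request cadence is far more regular than a human's. Here's a toy sketch (the delay values are invented) of how a server-side observer could separate the two from inter-request timing alone.

```python
from statistics import mean, pstdev

def timing_signature(inter_request_delays):
    """Summarize request cadence as (mean delay, jitter).

    Scripted agents polling on a loop show near-zero jitter;
    humans reading pages do not.
    """
    return (mean(inter_request_delays), pstdev(inter_request_delays))

human_delays = [1.2, 4.7, 0.9, 12.3, 2.2]    # irregular: reading, tab-switching
agent_delays = [2.0, 2.01, 1.99, 2.0, 2.0]   # looped fetch every ~2 seconds

print(timing_signature(human_delays))
print(timing_signature(agent_delays))
```

Once an observer can cluster agents this way, correlating what each cluster fetches starts to leak the queries behind it, which is the second half of the comment's worry.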

reply