> Prefer privacy-preserving decoupled age-verification services, where the service validates minimum age and presents a cryptographic token to the entity requiring age validation.

This is the wrong implementation.

You require sites hosting adult content to send a header indicating what kind of content it is. Then the device can do what it wants with that information. A parent can then configure their child's device not to display it, without needing anybody to have an ID or expecting every government and lowest bidder to be able to implement the associated security correctly.
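A minimal sketch of what the device-side check could look like. The header name "Content-Rating" and the label values here are assumptions for illustration (the real-world RTA self-label, "RTA-5042-1996-1400-1577-RTA", works on a similar principle); the point is that the decision stays entirely on the device.

```python
# Hypothetical device-side filter honoring a self-labeling response
# header. "Content-Rating" is an assumed header name; the RTA string
# is the existing real-world self-label, included for comparison.

ADULT_LABELS = {"adult", "RTA-5042-1996-1400-1577-RTA"}

def should_block(headers: dict, child_mode: bool) -> bool:
    """Return True if a child-configured device should suppress this response."""
    label = headers.get("Content-Rating", "").strip()
    return child_mode and label in ADULT_LABELS

# A parent flips child_mode in device settings; no ID ever leaves the device.
print(should_block({"Content-Rating": "adult"}, child_mode=True))   # True
print(should_block({"Content-Rating": "adult"}, child_mode=False))  # False
```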

It doesn't matter what kind of cryptography you invent. They either won't use it to begin with or will shamelessly and with no accountability violate the invariants taken as hard requirements in your theoretical proof. If you have to show your ID to the lowest bidder, you're pwned, so use the system that doesn't have that.

reply
This solves some problems, such as children accessing porn sites (oh the horror). But it doesn't solve other problems, such as predators accessing children's spaces. YouTube Kids is purportedly a safe, limited place for kids - and yet, numerous disturbing videos get past the automated censors. Pedophiles stalk places like Roblox.
reply
Then you do forensics and catch the predator instead of this age-verification nonsense
reply
Your proposed architecture also achieves the goal of discouraging content-distributing entities from holding hard identification data, so it sounds good to me.
reply