upvote
> App and website developers shouldn't be burdened with extra costly liability

Why not? Physical businesses have liability if they provide age restricted items to children. As far as I know, strip clubs are liable for who enters. Selling alcohol to a child carries personal criminal liability for store clerks. Assuming society decides to restrict something from children, why should online businesses be exempt?

On who should be responsible, parents or businesses, historically the answer has been both. Parents have decision making authority. Businesses must not undermine that by providing service to minors.

reply
> Why not?

This implies the creation of an infrastructure for the total surveillance of citizens, unlike age verification by physical businesses.

reply
Spell it out: how do ID checks for specific services (where the laws I've read all require no records be retained with generally steep penalties) create an infrastructure for total surveillance? Can't sites just not keep records like they do in person and like the law mandates? Can't in-person businesses keep records and share that with whomever you're worried about?

How do you reconcile porn sites as a line in the sand with things like banking or online real estate transactions or applying for an apartment already performing ID checks? The verification infrastructure is already in place. It's mundane. In fact the apartment one is probably more offensive because they'll likely make you do their online thing even if you could just walk in and show ID.

reply
>create an infrastructure for total surveillance

I mean, we're talking about age verification in the OS itself in some of these laws, so tell me how it doesn't.

Quantity has a quality all its own. We're not just seeing it for porn; it's moving to social media in general. Politicians are already talking about it for all sites that allow posts, which would include this site.

So you tell me.

reply
App and website developers having liability is an alternative to OS controls. Mandatory OS controls are OS/device manufacturers having liability. I agree that's a poor idea, and I actually said as much about a year ago, pointing out that this California bill was the awful alternative when people were against bills like the one from Texas. It's targeting the wrong party and creates burdens on everyone, even if you don't care about porn or social media.
reply
No, in the CA law OS controls are part and parcel of app and website developer liability.
reply
They're separate concepts. Clearly, obviously, mandating OS controls is creating liability for OS providers, not service operators. Other states do liability for providers without mandating some other party get involved.

California is also stupid for creating liability for service/app providers that don't even deal in age restricted apps, like calculators or maps. It's playing right into the "this affects the whole Internet/all of computing" narrative when in fact it's really a small set of businesses that are causing issues and should be subject to regulation.

reply
Knowing if the user's over 18 doesn't imply total surveillance, it only implies a user profile setting that says if they're over 18.
reply
It implies that the user has access to the technical infrastructure that supports age verification. Sucks to be you if you can't afford a recent Apple or Android device to run the AgeVerification app.

There is also the problem of mission creep. Once the infrastructure is in place to control access to age-restricted content, other services might be put out of reach. In particular, anonymous usage of online forums might no longer be possible.

reply
That technical infrastructure: a drop-down menu in the user's account settings.
reply
The EU Digital Wallet requires hardware attestation, so only locked-down, government-approved OSes work.
reply
Do you know what the word "infrastructure" means?
reply
Do you know what "total surveillance" means? It doesn't mean a checkbox for over 18
reply
I can't tell if this is a troll or not.

OS-level ability to verify the age of the person using it absolutely provides infrastructure for the OS to verify all sorts of other things. Citizenship, identity, you name it. When it's at the OS level there's no way to do anything privately on that machine ever again.

reply
I agree that a checkbox for if the user is over 18 opens the door to a checkbox for if the user is a citizen and even a textbox for the user's full name (which already exists on Linux so you better boycott Debian now!). I don't see how such input fields are "total surveillance".
reply
> Physical businesses have liability if they provide age restricted items to children.

Ok, suppose the strip club is the website, and the club's door is the OS.

Would you fine the door's manufacturer for teens getting into the strip club?

reply
Dueling physical analogies is never a productive way to resolve a conversation like this. It just diverts all useful energy into arguing about which analogy is more accurate but it doesn't matter because the people pushing this law don't care about any of them and aren't going to stop even if the entire internet manages to agree about an analogy. This needs to be fought directly.
reply
>This needs to be fought directly.

How do we fight? It seems like agree or disagree, this isn't going to stop. There's so much money behind it in a time where the have nots can barely survive as is.

reply
The OS is not the club's door. The OS is unrelated. The strip club needs to hire someone to work their door and check ID, not point at an unrelated third party. They should have liability to do so as the service provider.
reply
> Physical businesses have liability if they provide age restricted items to children.

These are often clear cut. They're physical, controlled items: tobacco, alcohol, guns, physical porn, and sometimes things like spray paint.

The internet is not. There are people who believe discussions about human sexuality (ie "how do I know if I'm gay?") should be age restricted. There are people who believe any discussion about the human form should be age restricted. What about discussions of other forms of government? Plenty would prefer their children not be able to learn about communism from anywhere other than the Victims of Communism Memorial Foundation.

The landscape of age restricting information is infinitely more complex than age restricting physical items. This complexity enables certain actors to censor wide swaths of information due to a provider's fear of liability.

This is closer to a law that says "if a store sells an item that is used to damage property whatsoever, they are liable", so now the store owner must fear that a full can of soda could be used to break a window.

reply
That's not a problem of age verification. That's a problem of what qualifies for liability and what is protected speech, and the same questions do exist in physical space (e.g. Barnes and Noble carrying books with adult themes/language).

So again, assuming we have decided to restrict something (and there are clear lines online too like commercial porn sites, or sites that sell alcohol (which already comes with an ID check!)), why isn't liability for online providers the obvious conclusion?

reply
> That's a problem of what qualifies for liability and what is protected speech

The crux is we cannot decide what is protected speech, and even things that are protected speech are still considered adult content.

> why isn't liability for online providers the obvious conclusion?

We tried. The providers with power and money (Meta) are funding these bills. They want to avoid all liability while continuing to design platforms that degrade society.

This may be a little tin-foil hat of me, but I don't think these bills are about porn at all. They're about how the last few years people were able to see all the gory details of the conflict in Gaza.

The US stopped letting a majority of journalists embed with the military. In the last few decades it's been easier for journalists to embed with the Taliban than with the US military.

The US government learned from Vietnam that showing people what it's doing cuts domestic support. I've seen people suggesting it's bad for Bellingcat to report on the US strike on the girls' school because it would hurt morale at home.

The end goal is labeling content covering wars/conflicts as "adult content". Removing any teenagers from the material reality of international affairs, while also creating a barrier for adults to see this content. Those who pass the barrier will then be more accurately tracked via these measures.

reply
However there are also parts of the internet that are clear cut, like porn.
reply
What about nude paintings/photography that aren't made with erotic intent?

Anatomical reference material for artists with real nude models?

What about sexual education materials? Medical textbooks?

Women baring their breasts in NYC where it's legal?

Where is the clear cut line of Pornography? At what point do we say any depiction of a human body is pornographic?

reply
>Plenty would prefer their children not be able to learn about communism

Plenty of people would prefer that children not learn about scientology from pro-scientology cultists too. It's not that they can't know about scientology (they probably should, in fact, because knowledge can have an immunizing effect against cults)...

And it's not that they can't know about communism (they probably should, in fact, because knowledge can have an immunizing effect against cults)...

reply
Would you also be against learning about capitalism from the Heritage Foundation?

This is a comment section about large corporations lobbying against our ability to freely use computers, and you break out the '80s Cold War propaganda edition of a complicated economic system, one that intertwines with methods of historical analysis and has been implemented in various ways at the governmental level.

You're either a mark or trying to find a mark.

reply
> Physical businesses

Physical businesses nominally aren't selling their items to people across state or country borders.

Of course, we threw that out when we decided people could buy things online. How'd that tax loophole turn out?

reply
But when they do, federal law requires age verification (at least with e.g. alcohol).

It turned out we pretty much closed the tax loophole. I don't remember an online purchase with no sales tax since the mid 00s.

reply
For one thing, it's fairly uncommon for children to purchase operating systems. As long as there is one major operating system with age verification, parents (or teachers) who want software restrictions on their children can simply provide that one. The existence of operating systems without age verification does not actually create a problem as long as the parents are at least somewhat aware of what is installed at device level on their child's computer, which is an awful lot easier than policing every single webpage the kid visits.
reply
So I agree that operating systems and device developers should not be liable. That puts the burden on an unrelated party and is a bad solution that could lead to locked-down computing. I meant that liability should lie with service providers, e.g. porn distributors: the people actually dealing in the restricted item. As a rule of thumb, we shouldn't make their externalities other people's problems (assuming we agree that their product being given to children is a problem externality).
reply
What if all the useful apps refuse to run on the childproof operating system?
reply
I think the market is pretty good at situations like that.
reply
Then ditch proprietary software completely and join the free-as-in-freedom OSes.
reply
App and website developers shouldn't be burdened with extra costly liability to make sure someone's kids don't read a curse word; parents can use the plethora of parental controls on the market if they're that worried.

App and website operators should add one static header. [1] That's it, nothing more. Site operators could do this in their sleep.

User-agents must look for said header [1] and activate parental controls if they were enabled on the device by a parent. That's it, nothing more. No signalling to a website, no leaking data, no tracking, no identifying. A junior developer could do this in their sleep.

None of this will happen of course as bribery (lobbying) is involved.

[1] - https://news.ycombinator.com/item?id=46152074
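A minimal sketch of what that could look like, assuming a hypothetical `Content-Rating: adult` header name (the real header name and vocabulary would have to be standardized, per [1]):

```python
# Server side: the site operator's entire obligation is one static header.
# "Content-Rating" and the value "adult" are placeholder names for
# illustration, not an actual standard.
def tag_response(headers: dict) -> dict:
    """Add the static rating header to an outgoing response."""
    tagged = dict(headers)
    tagged["Content-Rating"] = "adult"
    return tagged

# Client side: the user-agent checks the header against a local setting
# that a parent enabled on the device. Nothing is signalled back to the
# site: no identity, no tracking, no age proof.
def should_block(headers: dict, parental_controls_enabled: bool) -> bool:
    return parental_controls_enabled and headers.get("Content-Rating") == "adult"
```

The key design point is that the decision happens entirely on the device; the site never learns whether parental controls are active.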

reply
Practically, instead of requiring that sites verify age, require that they serve adult content with standardized headers. Devices can then be marketed as "child-safe" which refuse to display content with such headers.
reply
ZKP methods are just as draconian because they rely on locking down end-user devices with remote attestation, which is why they're being pushed by Google (SafetyNet, WEI, etc.).

The real answer to the problem is for websites/appstores to publish tags that are legally binding assertions of age appropriateness, and then browsers/systems can be configured to use those tags to only show appropriate content to their intended user.

This also gives parents the ability to additionally decide other types of websites are not suitable for their children, rather than trusting websites themselves to make that decision within the context of their regulatory capture. For example imagine a Facebook4Kidz website that vets posts as being age appropriate, but does nothing to alleviate the dopamine drip mechanics.

There has been a market failure here, so it wouldn't be unreasonable for legislation to dictate that large websites (over a certain number of users) must implement these tags, and that popular mobile operating systems and browsers implement the parental-controls functionality. But there would be no need to cover all websites and operating systems: untagged websites simply fail as unavailable in the kid-appropriate browsers, and parents would only give devices with parental controls enabled to their kids.

reply
> The real answer to the problem is for websites/appstores to publish tags that are legally binding assertions of age appropriateness, and then browsers/systems can be configured to use those tags to only show appropriate content to their intended user.

Agreed, recycling a comment: on reasons for it to be that way:

___________

1. Most of the dollar costs of making it all happen will be paid by the people who actually need/use the feature.

2. No toxic Orwellian panopticon.

3. Key enforcement falls into a realm non-technical parents can actually observe and act upon: What device is little Timmy holding?

4. Every site in the world will not need a monthly update to handle Elbonia's rite of manhood on the 17th lunar year to make it permitted to see bare ankles. Instead, parents of that region/religion can download their own damn plugin.

reply
Good list of more reasons! I focused on what I consider the two most important.

To expand on your #3, it also gives parents a way to have different policies on different devices for the same child. Perhaps absolutely no social media on their phone (which is always drawing them, and can be used in private when they're supposed to be doing something else), but allowing it on a desktop computer in an observable area (ie accountability).

The way the proposed legislation is written, once companies have cleared the hurdle of what the law requires, parents are left at the mercy of whatever the companies deem appropriate for their kids. Which isn't terribly surprising for regulatory capture legislation! But since it's branded as protecting kids and helping parents, we need to be shouting about all the ways it actually undermines those goals.

reply
> There's always another option: don't implement age verification laws at all.

Where do you go to vote for this option?

reply
The concern is the ubiquitous, all-pervasive surveillance, control, and manipulation by algorithmic social media, and its objective consequences for child development and well-being. Not "kids reading a bad word". Disagree all you want, but don't twist the premise.

Surely you can find a rationalwiki article for your fallacy too.

reply
If you want to avoid all pervasive surveillance, it might be wise to not mandate all pervasive surveillance in the OS by law.

In fact, I suspect adults, and not just children, would also appreciate it if the pervasive surveillance was simply banned, instead of trying to age gate it. Why should bad actors be allowed to prey on adults?

reply
Luckily some of these laws, which we're rallying against, make it illegal to pervasively surveil.
reply
I must have missed that. Which of them prevent pervasive data collection on all ages?
reply
>Disagree all you want, but don't twist the premise.

The $2 billion is what's twisting it.

reply
You mean the same social media companies that want this legislation and wrote it themselves? The same legislation that introduces more surveillance and tracking for everyone, including kids?

Also, I heard the same thing about video games, TV shows, D&D, texting and even youth novels. It's yet another moral panic.

From the Guardian[1]:

> Social media time does not increase teenagers’ mental health problems – study

> Research finds no evidence heavier social media use or more gaming increases symptoms of anxiety or depression

> Screen time spent gaming or on social media does not cause mental health problems in teenagers, according to a large-scale study.

> With ministers in the UK considering whether to follow Australia’s example by banning social media use for under-16s, the findings challenge concerns that long periods spent gaming or scrolling TikTok or Instagram are driving an increase in teenagers’ depression, anxiety and other mental health conditions.

> Researchers at the University of Manchester followed 25,000 11- to 14-year-olds over three school years, tracking their self-reported social media habits, gaming frequency and emotional difficulties to find out whether technology use genuinely predicted later mental health difficulties.

From Nature[2]:

> Time spent on social media among the least influential factors in adolescent mental health

From the Atlantic[3] with citations in the article:

> The Panic Over Smartphones Doesn’t Help Teens, It may only make things worse.

> I am a developmental psychologist[4], and for the past 20 years, I have worked to identify how children develop mental illnesses. Since 2008, I have studied 10-to-15-year-olds using their mobile phones, with the goal of testing how a wide range of their daily experiences, including their digital-technology use, influences their mental health. My colleagues and I have repeatedly failed to find[5] compelling support for the claim that digital-technology use is a major contributor to adolescent depression and other mental-health symptoms.

> Many other researchers have found the same[6]. In fact, a recent[6] study and a review of research[7] on social media and depression concluded that social media is one of the least influential factors in predicting adolescents’ mental health. The most influential factors include a family history of mental disorder; early exposure to adversity, such as violence and discrimination; and school- and family-related stressors, among others. At the end of last year, the National Academies of Sciences, Engineering, and Medicine released a report[8] concluding, “Available research that links social media to health shows small effects and weak associations, which may be influenced by a combination of good and bad experiences. Contrary to the current cultural narrative that social media is universally harmful to adolescents, the reality is more complicated.”

[1] https://www.theguardian.com/media/2026/jan/14/social-media-t...

[2] https://www.nature.com/articles/s44220-023-00063-7

[3] https://www.theatlantic.com/technology/archive/2024/05/candi...

[4] https://adaptlab.org/

[5] https://pubmed.ncbi.nlm.nih.gov/31929951/

[6] https://www.nature.com/articles/s44220-023-00063-7#:~:text=G...

[7] https://pubmed.ncbi.nlm.nih.gov/32734903/

[8] https://nap.nationalacademies.org/resource/27396/Highlights_...

reply