Contractual agreement? Nobody reads things like EULAs or terms of service. It's probably in there already.
reply
I should have been a bit more clear. We should ban retention for any purposes where it is not explicitly required for the intended function and clearly agreed to by all parties. Think of something like Strava or asset tracking. You know it stores GPS data, and why.
reply
There is no such thing as "clearly agreed to by all parties" when it comes to end users. Companies provide a one-sided, "take it or leave it" EULA, and if you don't agree to everything in it, you don't use the product. There is no meeting of the minds, there is no negotiation, and there is no actual agreement. It's a rule book dictated by one side.
reply
Then it's not a valid contract and therefore does not absolve them of criminal liability for stalking you.
reply
Contracts of adhesion can be valid contracts. The ability to negotiate or equal bargaining power is not a required element of a contract.

Furthermore, you cannot contract away criminal liability if any exists.

reply
Even attempting to use a contract of adhesion to justify selling GPS location data to a third party should be a criminal act.
reply
Yes, the US is in desperate need of better privacy laws.
reply
You click on “accept terms and conditions”, which means you agree to the contract.
reply
You can't just bury literally anything in an EULA. There's a fair amount of case law establishing that EULA clauses that are surprising or illegal aren't enforceable.
reply
That fact does not change the point made by the person you replied to. Regardless of whether the clauses in the EULA are 100% legal, some mixture, or 100% illegal, the entire EULA is a "one-sided rule book dictated completely by one side". You, the person held to the EULA's rules, do not get to negotiate on individual points. You simply have a "take it or go away" set of options.
reply
You're talking about contracts of adhesion and they are overwhelmingly common for B2C agreements. Most red-lining of contracts only happens in high-value B2B transactions where the sums of money involved are enough that it makes sense to bring lawyers into the loop.
reply
If the product has any serious audience / traction, it becomes profitable to scan its EULA for illegal clauses, and sue the company for damages (and maybe extra punishment for breaking the law).

The fact that 100% of its users, except the litigant, skimmed through the EULA and did not notice anything does not relieve the company from the responsibility.

reply
When you have already paid for the device and a contract, then, surprise: now that you have skin and flesh in the game, you HAVE TO agree to this EULA, or your property is a brick and we keep your money.

That is defined as extortion, but labeled as onboarding.

reply
Courts do look poorly upon this -- for a contract of adhesion to be valid, some degree of advance notice and the ability to reject it are required.
reply
There is the GDPR.
reply
If it were up to me, I'd require a hand-signed contract that explicitly, up front, and in plain English gives permission and is not transferable to any “partners”.
reply
Instead of “I accept”, you’re given a quiz
reply
Right, privacy terms are written to be vague and permissive. Even if you read them you can’t usually understand how the data will be used or opt out.
reply
Companies don't buy anymore, they just take it. Think about Claude: every time you use it, you are literally giving away your code, your data, and even your personal information.
reply
I think we should make this type of tracking opt-out by default. We should also ban the sale of its use to third parties and its use for purposes other than the specific functionality which required it to be enabled in the first place.
reply
>I think we should make this type of tracking opt-out by default

That's opt-in, not opt-out.

https://en.wiktionary.org/wiki/opt-out

reply
GP states correctly that they believe the default 'choice' of a user should be 'opting-out' of location tracking.
reply
default "no" is called "opt-in"

pre-checked "yes" is called "opt-out"

reply
This is utterly confusing the use of the terms. Opting is making a choice. The default isn't a choice. Opting by default makes no sense.
reply
Yeah it is. What I mean is the default is you have not opted in. You must choose to opt in for this type of tracking. It should be a choice that doesn’t preclude you from using a service if you don’t allow tracking.
reply
deleted
reply
Every EULA already covers this basically. The real problems are: people agree to it, and the government can do an end-run around the constitution by simply purchasing data or hiring contractors.
reply
> IMO we should ban gathering this data without

GDPR tried. And the narrative around GDPR was deliberately completely derailed by adtech.

Lack of enforcement didn't help either

reply
GDPR like all EU regulation is needlessly complicated and aimed at a compliance model that seems designed for SAP.
reply
The compliance model is very simple. Do not collect data. Problem solved. If you need to collect data (e.g. because you are a webshop), only collect the minimum necessary.

The problem is not the GDPR, the problem is the surveillance industry that wants to grab as much data as possible and try to do as much malicious compliance as possible.

reply
> compliance model is very simple. Do not collect data. Problem solved

In a perfect world, yes. In the real world, there is an entire industry of lawyers who will smother your competitors with bogus requests because GDPR requires you spend time and resources to investigate and respond to each and every complaint regardless of merit.

reply
Designing around GDPR compliance shows up all over the place in industrial data collection. It doesn't only affect surveillance webslop.

The costs are often worse on industrial side because the data is so much larger and faster than web or mobile data.

reply
What do you mean by "industrial" in this case?
reply
Telemetry from machines and data from environmental sensors that is collected for operational purposes (safety, efficiency, reliability) in industrial applications. Old school engineering systems that in modern times have expansive network-connected sensors that may even have onboard classifiers to reduce the quantity of data.

The trouble started when lawyers correctly noticed that these are incidentally capable surveillance systems even though that isn't how we use them or what they were designed for.

reply
Interesting. What are your obligations under GDPR in that case? It's not like a packing machine can request data deletion.
reply
No one has been able to provide a satisfactory answer to this question. I've seen the lawyers try to figure this out at a few companies.

GDPR frames everything in the context of a person's data. There is no "person_id" or similar field in these data models. That isn't the purpose of the data; it would be expensive to extract, and it would create obvious liability under GDPR. This makes finding a person's data expensive -- a brute-force search over huge data volumes.

Compounding this, these data systems are often operational and some of the data may be in situ at the edge because it is too large to move all of it. The power and compute budget may not exist to find a person using brute force.

AFAICT, current best practice is to maintain a polite fiction that people aren't being tracked because that is not the intent. No one thinks that would stand up to serious legal scrutiny though. If the regulators come after you then plead best effort based on the technical infeasibility of doing anything else.

reply
Let's say I'm an industrial monitoring SaaS company with a bunch of analytics products and a new AI intelligence product.

Forget tracking workers' movements and stuff like that because that's even more complicated (the data is tied to a person, but only in their capacity as an employee and not as a private individual).

Focus on a case like a cluster of sensors attached to various equipment powered by electric motors, or using RFID to detect when a pallet enters the warehouse. Let's say that all goes to a cloud platform and I store it, I build a bunch of derived analytics stuff from it, and I send it up to Anthropic (with no-train-on-me-pls contract clause) for my cool new AI insights engine.

Does GDPR apply at all to that? I would have assumed it doesn't have any relevance whatsoever, but you're implying that it does. Or are you specifically talking about the case when individual employees are the data collection subjects, like a fleet management platform with a telematics component?

reply
https://gdpr-info.eu/art-25-gdpr/

--- start quote ---

Taking into account the state of the art, the cost of implementation and the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for rights and freedoms of natural persons posed by the processing, the controller shall, both at the time of the determination of the means for processing and at the time of the processing itself, implement appropriate technical and organisational measures, such as pseudonymisation, which are designed to implement data-protection principles, such as data minimisation, in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of this Regulation and protect the rights of data subjects.

The controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed. That obligation applies to the amount of personal data collected, the extent of their processing, the period of their storage and their accessibility. In particular, such measures shall ensure that by default personal data are not made accessible without the individual’s intervention to an indefinite number of natural persons.

An approved certification mechanism pursuant to Article 42 may be used as an element to demonstrate compliance with the requirements set out in paragraphs 1 and 2 of this Article.

--- end quote ---

IANAL, but this basically covers all your bases, together with https://gdpr-info.eu/art-32-gdpr/

Unless, of course, your industrial-scale data collection actually collects significantly more data than you let on, and extraction of personal data is not as hard as you make it sound
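
The pseudonymisation the article names can be as simple as replacing direct identifiers with keyed hashes before the data is stored. A minimal sketch, assuming HMAC-SHA256 with a secret key held in a separate, access-controlled store (the record fields and key handling here are hypothetical):

```python
import hashlib
import hmac

# Hypothetical: the key lives apart from the data; without it, nobody
# downstream can link the pseudonym back to the person or device.
SECRET_KEY = b"stored-separately-from-the-data"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier (badge ID, device MAC, ...) with a keyed hash."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"sensor": "dock-7", "badge_id": "EMP-0042", "event": "gate_open"}
record["badge_id"] = pseudonymise(record["badge_id"])
# The pseudonym is stable (same input, same output), so analytics still work,
# but the raw identifier never reaches storage.
```

The same identifier always maps to the same pseudonym, so operational analytics keep working, while re-identification requires access to the separately held key.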

reply
Eh?

The GDPR is there to protect your personal/sensitive data, or data that can personally identify you. It has nothing whatsoever to do with data capture from industrial machinery.

I remain astounded at how ignorant some people are of the basic GDPR principle: protecting your _personal_ data.

reply
Industrial data capture can produce detailed traces of your travel no different than tracking your mobile phone. Some can capture personal details that adtech often can't because the sensor suites are more diverse and operate in different environments. We just don't use it for that.

How is this not your personal data?

Exploitation of these types of data sources has been demonstrated for 15+ years at this point. Abuse is often impractical for technical reasons, but GDPR doesn't give you a free pass on collecting personal data just because you aren't using it like personal data.

reply
> The trouble started when lawyers correctly noticed that these are incidentally capable surveillance systems even though that isn't how we use them or what they were designed for.

Many systems were not explicitly designed for surveillance, and yet they are surveillance systems, because many systems collect too much data to begin with.

Hence the problem: people who collect too much data claim that GDPR is complicated, complex, convoluted, impossible to comply with... instead of changing what data they collect, and how.

Additionally, people confuse the complexity of human endeavours with the complexity of the law. GDPR itself is neither complex nor complicated. It doesn't try to carve out exceptions, rules, and regulations for every possible activity humans may attempt. Then it would become impossible to understand or comply with.

As is, it has enough carveouts for industries which require more data than strictly necessary, called "legitimate interest" (which still doesn't allow you to just use this data willy-nilly). E.g. banks collect significantly more data about customers than strictly necessary (because of KYC, fraud, security etc.), and store that data for significantly longer than allowed by privacy-related laws (because they are governed by the banking laws of their respective countries). It doesn't mean they can sell that data or spy on users.

Same here. It's not on the law to tell you exactly how to operate your "industrial-scale operation". It's on you to fix your shit, stop collecting more data than necessary, have data protection in place, delete data after a reasonable time, anonymize data etc.

reply
Have you read it? It's not that bad, unless you're thinking like an adtech programmer trying to find the exact edge case for the maximal amount of tracking you're allowed to do, because such a bright line does not exist and that fact infuriates adtech professionals. It is vague because reality is vague and complex; each specific case of alleged violation has to be interpreted by multiple humans; there is no algorithm.
reply
The law mandates a data protection officer with specific duties. It also establishes a board that "issue guidelines, recommendations, and best practices" which is where administrative complication and nonsense always creeps in.
reply
It is regulation that imagines companies are a government bureaucracy.

I have read GDPR and don't work in adtech. It is vague and it is pretty easy to find pathological scenarios that don't make much sense or impose an unusually high burden for no benefit. Every European law firm seems to agree with this assessment despite what proponents assert. Consequently, it forces a lot of expensive defensive activity in practice.

To some extent, it was just a failure of imagination on the part of GDPR's authors. Many things are not nearly as simple as it seems to assume and it bleeds into data models that have nothing to do with people.

It is what it is but no one should pretend it is not a burden for companies that have nothing to do with adtech or even data about people.

reply
You can literally read the entire "complicated" regulation in a single afternoon sitting. There's nothing complex or complicated about it.

Congrats on gullibly believing the ad tech narrative.

reply
The "GDPR is complicated" meme has been circulating among software developers since probably before it was even written. It's so wild that HN dunks on it so much: Here we have a societal problem in computing we've been complaining about for decades, someone offers an incremental but imperfect regulation to start taking steps to correct it, and everyone hates it!
reply
> It's so wild that HN dunks on it so much: Here we have a societal problem in computing we've been complaining about for decades, someone offers an incremental but imperfect regulation to start taking steps to correct it, and everyone hates it!

YOUR collection of users' data is an overreach and a breach of privacy. MY collection of data is absolutely necessary to grow my scrappy small business and provide value. I am a good person with good intentions, so it's OK. You are a bad person doing bad things, so it's not OK.

reply
The GDPR is vague and unworkable as written. It fundamentally restricts all data processing with a few, vague exceptions.

What is data processing essential for the services being provided? Many publishers assumed that getting paid was an essential part of providing a service, and it was not until 3 months before the implementation deadline that the committee clarified that getting paid is not included when you are being paid by a third party.

How are you to know whether or not the user is an EU citizen (and thus subject to the GDPR)? Is making that determination a service essential for providing your service? The answers apparently were "You don't" and "No", which would effectively make companies assume that the GDPR applies to everyone on the planet.

The GDPR is also fundamentally opposed to how things currently work on the internet, making almost all advertising on the web illegal overnight. It was too big a change to happen at once, so it is effectively only loosely enforced in practice.

I like the idea of the GDPR, but the implementation sucks.

reply
> The GDPR is vague and unworkable as written. It fundamentally restricts all data processing with a few, vague exceptions.

What utter utter FUD

You are free to collect as much personal data as you want, PROVIDED you have my explicit opt-in informed consent to do so.

What about this is difficult to understand?

> How are you to know whether or not the user is an EU citizen (and thus subject to the GDPR)?

The GDPR provides _basic_ data safety and consumer protection. If you aren't protecting users' private data, regardless of where they live, in line with GDPR principles (such as collecting it fairly and not selling it to randoms), then you are playing fast and loose with your users' private, sensitive data. In which case you need to _seriously_ consider whether what you are doing is ethical.

> The GDPR also is fundamentally opposed to how things currently work in the internet, making almost all advertising on the web illegal overnight.

Utter Bullshit!

You are free to advertise as much as you like! But if you want to track me with your advertising (hello scummy adtech industry) then you need my explicit informed consent to do so. And so you should!

Again, what about this is difficult to understand?

reply
> If you aren't protecting users' private data, regardless of where they live, in line with GDPR principles (such as collecting it fairly and not selling it to randoms), then you are playing fast and loose with your users' private, sensitive data.

It's interesting and revealing when someone responds to a law that says "You're not allowed to abuse users in countries X, Y, and Z" with "How can I figure out who's in the other countries, so I can abuse them?" instead of "I'll just stop abusing everyone, and then I don't even need to worry about where anyone is."

Whenever you find yourself asking "how do I toe as close to the 'illegal' line as I can without technically going over it?" I think it's time to ask yourself some pretty hard questions.

reply
Your entire reply is both a non sequitur, and doesn't even attempt to understand what people tell you
reply
Same with the California age input box.
reply
The problem with the age input box is that we don't have the GDPR. We're mandating that people give accurate age information to advertisers, and it's legal for advertisers to sell detailed dossiers on people, including their age, and to target advertising using the age. This is why Meta wrote the age input box legislation: they want to make everyone legally required to provide Meta with their age.
reply
Being able to read something in one sitting doesn't make it simple or obvious. The law establishes a board that gets to set new requirements.
reply
What new requirements can be set by the board? As far as I understand, the EDPB can only issue guidelines, recommendations, and best practices. All of these are just guidance on how to interpret the GDPR. Courts are the ones who ultimately decide if you are complying with the GDPR. Your local DPA likely won't harshly punish you for following the EDPB's recommendations even if they end up getting overturned by a court.

DPA won't punish you for not following EDPB's recommendations, they will punish you for breaking GDPR. You are free to ignore EDPB if you think your legal position is strong, but you carry the risk if you are wrong.

reply
As someone who has to implement it, it's really not bad at all: Ask the user for consent to use their data, and don't be misleading about it. That's it.

The rest of the "It'S So LaRgE AnD UndErSpEciFieD" is just FUD. The regulators don't just slap fines, they work with you to get you to comply, and they just want to see that you're putting in the effort instead of messing them about.

I have literally never been surprised by the GDPR. Whenever I thought "surely this is allowed" it was, whenever I thought "this can't be allowed", it wasn't. For everything in the middle, nobody will punish you for an honest mistake.

reply
Also, "Be able to track a user's data and delete it on a request."

This is not too hard if you do proper engineering work ahead of time and are purposeful about how you move and manage data (step 1 is just not collecting it unless it's vital). But the industry encourages us to be very bad about that, because we gotta "move fast and break things or you're not gonna make it."

reply
> for everything in the middle, nobody will punish you for an honest mistake.

How do you know that? Again, the law establishes a rule-making body that can at any time change or add rules, and as far as I can tell there's no public review process.

reply
> Again, the law establishes a rule-making body that can at any time change or add rules

Please quote the exact text of the law that you claim does that. And since the law has been in force for 10 years, perhaps you can point at the website of said body.

If you say "DPAs", then...erm... perhaps learn something about the world around you? Who do you think monitors compliance, say, for food, or for construction? It just appears out of nowhere? Same here

reply
Which body is this? The EDPB?
reply
Anti GDPR people: "it's so complicated not being able to walk into someone's house and take their things! Which things can I not take? How about this? And now I need a lawyer if I take someone's things? Ridiculous!"

Just don't spy on people.

reply
Yeah that's pretty much what it feels like, or sometimes it's "what if someone's stuff is lying on the street? Can I take it then?" and the regulator is kind of like "look around and ask if it belongs to anyone, and if not, sure".
reply