So they can't sell the fact that you're at Target at 8:00 p.m. on Thursday to anybody... Nor build profiles to sell to advertisers... And if that's the case, that's very similar to cloud storage vendors.
If I access Hacker News, and the record of my visit is stored in an AWS S3 bucket, I can't ask AWS to delete my visitor record. Even though the servers, network cards, wires, and storage media are AWS property, it was Hacker News's website that generated that record, and it's their responsibility to handle my request to delete it. AWS's stance would rightly be "talk to the website operator for CCPA requests."
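A toy sketch of why the request has to go through the website rather than the storage vendor (all class and key names here are invented for illustration, not anything AWS or any website actually uses): the vendor stores opaque blobs under customer-chosen keys and has no idea which blob is "my visit record"; only the website, as controller, holds that mapping.

```python
class StorageVendor:
    """Stands in for S3: stores opaque bytes under customer-chosen keys."""
    def __init__(self):
        self._blobs = {}

    def put(self, key, blob):
        self._blobs[key] = blob

    def delete(self, key):
        self._blobs.pop(key, None)


class Website:
    """The controller: it alone knows which key maps to which visitor."""
    def __init__(self, vendor):
        self.vendor = vendor
        self._visitor_keys = {}

    def record_visit(self, visitor_id):
        key = f"visits/{visitor_id}"
        self._visitor_keys[visitor_id] = key
        self.vendor.put(key, b"opaque serialized record")

    def handle_deletion_request(self, visitor_id):
        # The vendor can't honor "delete alice's record" directly: it only
        # sees opaque keys. The controller translates the request.
        key = self._visitor_keys.pop(visitor_id, None)
        if key:
            self.vendor.delete(key)


vendor = StorageVendor()
site = Website(vendor)
site.record_visit("alice")
site.handle_deletion_request("alice")
print(len(vendor._blobs))  # 0: the record is gone, via the controller
```

The point of the sketch is structural: a "delete my data" request addressed to the vendor dead-ends, because the mapping from person to blob lives only with the customer.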
Flock operates a federated network. If you drive past an unmarked camera, you have absolutely no way of knowing which specific HOA or town leased it, so how are you realistically supposed to know who the "data controller" is to send your CCPA or deletion request to?
Me being me, I submitted a FOIA request for the dashcam footage of the five cop cars and the dispatch logs.
Instead of pulling over the easily identifiable car, they pulled over some random guy. They had been behind him the whole time, yet five cop cars pulled in behind him thinking he had fired a gun a few minutes earlier.
He was let go without a citation, but the official reason for the stop, despite being paired with the dispatch call about the firecracker, was a broken headlamp.
It's crazy how cops rush to very specific conclusions. Someone likely said they heard gunshots, and then they scrambled to find the shooter.
All attorneys represent their clients; your attorney does not have to share your opinion of the law or of public policy to interpret what the law means for you.
If you are afraid your attorney might be biased (they are human), you may get better advice from the "misaligned" POV: the flaws and holes in a privacy law found by a pro-business conservative attorney are more likely to find sympathy in the courts, from conservative and progressive judges alike.
It's not hard to see how this enables an institution to gate itself from criticism.
It should absolutely be Flock’s responsibility to remove my data and we should absolutely require it by law. Full stop.
Apple holds the data in iCloud, and Apple (or a phone carrier) may be leasing me the phone. That sounds pretty similar to the Flock situation.
I guess the difference is that Flock might be sharing the data from a customer's camera with other customers. Then they are definitely controlling it.
I think the bigger problem with Flock is that their cybersecurity is so laughably bad that non-customers can easily access the data.
For example, would you want to be able to tell Public Storage (or some other storage unit place) to remove any naked photos of you stored anywhere in their storage units?
For them to actually be able to do that would require them to have nigh-omniscience about everything stored by or for everyone in every one of their storage units, even inside closed boxes.
Now, it's not the same thing of course - but hopefully you understand what I'm referring to?
I was enumerating the likely defense, not that it's valid.
"Sorry, FBI, the tenant using my warehouse to manufacture cocaine is not my responsibility. I won't do anything about it. You deal with them."
Nope, that's a failure of a duty to act, and aiding and abetting criminal activity if you have constructive knowledge.
The fact of the matter is that Flock is playing two-step with the concept of "ownership" of data. They disclaim ownership as a way to leave local agencies holding the bag for liabilities, but they fight tenaciously to retain complete and unfettered access to that data.
(After organizing a community group that won Flock contract cancellations in multiple jurisdictions in Oregon, I went on to coauthor state legislation regulating ALPRs. I am well familiar with all the dirty ball they play.)
Also, Flock's cameras collect more data than is provided to police agencies. Who owns that data, I wonder?
Does Flock do some kind of P2P dance to avoid the data transiting their systems?
Presumably the California data brokerage statutes were written specifically to prevent the kind of nerd-lawyering happening on this thread.
This is what I mean by the fruitlessness of these kinds of legal discussions on HN. What do you want me to argue, that you're wrong to want the law to work that way?
I'm not saying that's what's happening, but that's what I thought was happening before reading this thread, and now I have to go and run through their policies.
Either way, ALPRs and AI facial scanners in public are a huge violation of privacy and I loathe them, but I hope it's correct that Flock customers cannot easily share information with one another.
Are you saying Flock itself does not have access to any of the data, and that the data they store on behalf of local governments is not fed into any central datalake? That every organization's data is completely, unalterably separate from everyone else's?
If so, that makes the panopticon slightly less powerful.
If the DSLR uploaded them to Rent-A-Center owned/leased servers it would in fact require Rent-A-Center to take the necessary steps.
As Rent-A-Center would be the only group with proper access to the data storage, they would have inserted themselves into the chain of custody, and thereby have an obligation to ensure others' data is wiped from systems they control.
But you knew that.
They're contractually forbidden from "selling their access to it" to arbitrary parties; they can share data only with the consent of their customers, almost all of whom actively want that data shared --- this is a very rare case of a data collection product where that's actually the case.
As such, even if they can structure the contract so that they are not legally responsible for such use, they are very much knowingly facilitating it. If this were physical goods rather than data, they would probably be held as responsible as their customers.
This is the same situation as a web hosting provider: if it is communicated to them that one of their customers uses their service to host illegal content, then it becomes the web hosting provider's responsibility to remove that content.
Reasonable technical feasibility for the service provider is key here, but it can be argued to exist, since the data can apparently be shared in ways that identify the OP.
Probably not how the law currently works (I don't know, not a lawyer), but I think it should; otherwise it allows creating a platform that shares abusively retained data without any reasonable recourse for the subjects of that data to have it removed.
Californians would have standing under the law but need expensive lawyers to litigate.
AWS has employed expensive lawyers to argue the semantics: they host OS VMs and databases, which provides them legal cover for what AWS customers store.
Amazon the retailer stores customer data. A non-customer would have standing under California law to litigate removal of PII should they decide to hire lawyers.
Your reductionism is to law what a Linux beige box on a routable IP, with no firewall, hosting a production health database with credentials set to admin/pwd1234, is to software engineering.
Coincidentally 1234 happens to be the code to my luggage.
If Flock were just an opaque cloud storage service for law enforcement to back up their mass surveillance to, then sure, your argument would have merit. It's not; it's a giant database of photos, locations, times, license plate information, and likely a lot more. They're not selling cloud storage, they're selling (leasing?) surveillance devices and tools.
My experience on HN is that these kinds of discussions almost immediately devolve into debates about what people want the law to be, as opposed to what it actually is.
However, I suspect that is not the case. AWS is agnostic as to the type of data stored on S3, and deletion of PII stored on S3 is the sole responsibility of the AWS customer that chooses to store it.
Flock's cameras aren't in bathrooms. However, they're still recording people who haven't opted into it. ("But you have no expectation of privacy in a public place!" "You have the expectation that someone might inadvertently overhear you. You don't have the expectation that someone is actively recording you at all times.")