You're going to get a lot of cheerleading and support about this in venues like HN and Reddit, because you're narrowcasting to an audience already primed to be hyperconcerned about surveillance technology (I am too). I think you're going to find those attitudes do not in fact generalize to the public at large, and especially not to the legal system.
Best of luck either way. It'll be an interesting experience to write up, and I'm happy to read about the outcome, even if I do think it's highly predictable.
fyi, flock owns the cameras.
"We operate using a lease model. What does that mean? Since we own the hardware, we own the problems that occur."
[0]: https://www.scotusblog.com/cases/case-files/chatrie-v-united...
Obviously, the idea is not to disallow someone capturing you as an incidental background figure while they take a front-and-center photo of their family, but to disallow making you the main subject without your knowledge, especially when you explicitly object.
On the other hand, a photographer still owns the copyright to a photo, so a subject (including in a portrait) cannot claim it or distribute it without permission even if they can potentially stop the photographer from distributing that photo.
IANAL, but you are not by default allowed to use anyone's "likeness" for your individual profit.
This is not the case in the United States. There is no presumption of privacy in public. In fact, there is a whole genre known as "street photography" that involves taking pictures in public without explicit consent of the subjects.
[0]: https://www.eff.org/deeplinks/2025/10/flock-safety-and-texas...
I think what's happening here is that people are trying to colloquially define "selling access to data" to fit the camera data sharing that Flock enables, and then saying that because you have to pay to be a Flock customer to get access to that data, they're effectively selling it. I don't think that's how data brokerage laws work. Flock doesn't own the data they're providing access to, and they provide that sharing access with the (avid!) consent of their customers.
If https://legalclarity.org/can-you-post-someones-picture-witho... is to be trusted, though, you do at least get protection against your likeness being used for commercial purposes, though that protection seems more limited than I'd expect.
Even your full legal name and birth date cannot be guaranteed to refer only to you specifically (as there could be someone else with an identical name and birth date), but it's obviously still PII because it helps narrow the field immensely if you can combine it with other information - for example, your IP address.
So yeah, "anyone could have been driving my car", but if you also know that the car drove from your home to your work then that narrows down the list of likely individuals immensely.
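The narrowing effect can be sketched as a set intersection. This is a toy illustration with entirely invented names and observations; the point is only that each weak identifier alone is ambiguous, but combining them shrinks the candidate set fast:

```python
# Toy example: each observation alone matches several people,
# but intersecting the observations narrows the field immensely.
candidates = {"alice", "bob", "carol", "dave", "erin"}

same_name_and_dob = {"alice", "dave"}   # people sharing your legal name + birth date
drove_home_to_work = {"alice", "bob"}   # people whose plate travels your commute

likely = candidates & same_name_and_dob & drove_home_to_work
print(likely)  # → {'alice'}
```

That is why "anyone could have been driving my car" doesn't help much in practice: the location trace itself acts as another identifier to intersect with.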
Conversely, if your license plate was spotted parked near an anti-ICE rally, then they can be pretty confident that you or someone you know was near an anti-ICE rally, which means they can harass you about it, follow you around, shoot you in the street, etc.
> “Personal information” does not include [...] Information that a business has a reasonable basis to believe is lawfully made available to the general public by the consumer
> (v) (1) “Personal information” means information that identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household. Personal information includes, but is not limited to, the following [...]:
> (E) Biometric information.
> (H) Audio, electronic, visual, thermal, olfactory, or similar information.
https://leginfo.legislature.ca.gov/faces/codes_displaySectio...
To your point, the intent would presumably still matter for exceptions to when deletion requests must be honored (say for journalism), but a photo of someone walking down a public street would still logically be considered the subject's personal information, by the above definition.