It used to be the default belief, across virtually all of humanity, that greed is bad and dangerous; yet for the last 100 years you'd think the complete opposite was the norm.
> when they are only incentivized to lie, cheat, and steal
The fact that they are allowed to do this is beyond me. The fact that they do this is destructive to innovation, and I'm not sure why we pretend it enables innovation. There are thousands of multi-million-dollar companies that I'm confident most users here could implement, but the major reason many don't is that doing it properly is far harder than what those companies actually build. It takes people who understand that an unlisted link is not an actual security measure, and that things need to actually be under lock and key.
I'm not saying we should go so far as to make mistakes so punishable that no one can do anything, but there needs to be some bar. There's so much gross incompetence that we're not even talking about ordinary incompetence, let alone honest mistakes by competent people.
We are filtering out those with basic ethics. That's not a system we should be encouraging.
The best fix we can work on now in America is repealing the 17th Amendment to strengthen the federal system as a check on populist impulses, which can easily be manipulated by liars.
Even if the CEO believes it right now, what if the team responsible for automatic deletion merely did a soft delete instead of a hard delete, "just in case we want to use it for something else one day"?
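For anyone who hasn't run into the distinction: a minimal sketch of soft vs. hard deletion (the `Record`/`store` names and the in-memory dict are hypothetical stand-ins for whatever storage layer a vendor actually runs):

```python
from dataclasses import dataclass

@dataclass
class Record:
    data: bytes
    deleted: bool = False  # soft-delete marker

store: dict[int, Record] = {}  # hypothetical stand-in for a real database

def soft_delete(record_id: int) -> None:
    # Flips a flag but keeps the bytes around -- the
    # "just in case we want it later" retention described above.
    store[record_id].deleted = True

def hard_delete(record_id: int) -> None:
    # Actually destroys the data; nothing left to repurpose or leak.
    del store[record_id]
```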
If you let your legal team use such broad CYA language, it is usually because either you are not sure what's going on and want cover, or you actually want to keep the door open for broader use under those broader, more permissive legal terms. On the other hand, if you are sure that you will preserve users' privacy, as you are stating in your marketing materials, then you should put it in legal writing explicitly.
> - All biometric personal data is deleted immediately after processing.
The implication is that biometric data leaves the device. Is that even a requirement? Shouldn't it be processed on device, in memory, with only some hash + salt leaving? Isn't this how passwords work? I'm not a security expert, so please correct me. Or, if I'm on the right track, please add more nuance, because I'd like to know more and I'm sure others are interested.
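To make the password analogy concrete: this is not how any particular vendor does it, just a minimal sketch of the textbook salted, one-way pattern the question alludes to, using Python's stdlib `hashlib.pbkdf2_hmac` (the iteration count and salt size are illustrative, not a recommendation):

```python
import hashlib, hmac, os

def enroll(secret: bytes) -> tuple[bytes, bytes]:
    # Hash on-device: only (salt, digest) ever leave; the secret never does.
    salt = os.urandom(16)  # random per-user salt
    digest = hashlib.pbkdf2_hmac("sha256", secret, salt, 600_000)
    return salt, digest

def verify(secret: bytes, salt: bytes, digest: bytes) -> bool:
    # Recompute with the stored salt and compare in constant time.
    candidate = hashlib.pbkdf2_hmac("sha256", secret, salt, 600_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = enroll(b"correct horse battery staple")
assert verify(b"correct horse battery staple", salt, digest)
assert not verify(b"wrong guess", salt, digest)
```

Whether this pattern transfers to noisy biometric readings is a separate question, which the replies below get into.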
Btw, hashes aren't unique. I really do mean that an output doesn't correspond to a unique input: if f(x) = y, then (since the domain is larger than the range) there is some z ≠ x such that f(z) = y.
Remember, a hash is a "one-way function". It isn't invertible (that would defeat the purpose!). It is not injective: many inputs map to the same output, so "reversing" it doesn't yield a unique answer. In the hash style you're thinking of, you make the output range so large that the likelihood of a collision is negligible (a salt making it even harder), but in a perceptual hash you want collisions, just only from certain subsets of the input.
In a typical hash, a colliding input should sit at a random location: knowing x doesn't inform us about z, so knowledge of one input shouldn't give you knowledge of a valid collision. But in a perceptual hash you want collisions to be predictable, to exist in a localized region of the input space (all z near x; perturbations of x).
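A minimal sketch of the contrast, with a toy "average hash" standing in for a real perceptual hash (real ones over images are more involved, but the collision behavior is the same in spirit):

```python
import hashlib

def crypto_hash(data: bytes) -> str:
    # Cryptographic hash: any perturbation of the input
    # produces an unrelated digest.
    return hashlib.sha256(data).hexdigest()

def toy_perceptual_hash(pixels: list[int]) -> str:
    # Toy "average hash": each bit records whether a value sits
    # above the mean, so nearby inputs collide by design.
    mean = sum(pixels) / len(pixels)
    bits = "".join("1" if p > mean else "0" for p in pixels)
    return f"{int(bits, 2):02x}"

original  = [10, 200, 30, 180, 90, 250, 5, 120]
perturbed = [12, 198, 31, 182, 88, 252, 6, 118]  # slightly noisy copy

print(crypto_hash(bytes(original)) == crypto_hash(bytes(perturbed)))    # False
print(toy_perceptual_hash(original) == toy_perceptual_hash(perturbed))  # True
```

This is the "localized region" property described above: perturbations of x hash to the same value, while SHA-256 scatters them.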
Surely you mean: "claiming that".
In the words of Mandy Rice-Davies [1], "well he would, wouldn't he?" In particular, his claim that the data isn't used for training by companies that are publicly known to have illegally acquired data to train their models doesn't look very serious.
[1]: https://en.wikipedia.org/wiki/Well_he_would,_wouldn%27t_he%3...
Thus it is impossible to take his words at face value.
Because KYC is evil in itself, and if the linked article does not explain to you why that is, then I certainly cannot.
> KYC provider would want to protect their reputation more than the average company
False. It is exactly the opposite. See, there are no repercussions for leaking customers' data, while properly securing said data is expensive and creates operational friction. Thus there are NO incentives to protect data, while there ARE incentives to care as little as possible.
Bear in mind that KYC is a service that no one wants: all customers are forced into it, and everybody hates it: customers, users, companies.
- someone finally reading the T&Cs
- legal drafting the T&Cs as broadly as possible
- the actual systems running at the time matching what’s in the T&Cs when legal last checked in
Maybe this is a point to make to the Persona CEO. If he wants to avoid public issues like this, then some engineering effort and investment in this direction would be in his best interest.
- Infrastructure: AWS and Google Cloud Platform
- Database: MongoDB
- ETL/ELT: Confluent and DBT
- Data Warehouse and Reporting: Sigma Computing and Snowflake
Trust needs to be earned. It hasn't been.
The big stick doesn't really exist.
A system that is already too centralized is being made even more centralized here.