What’s so hard about adding a feature that effectively makes a single-user device multi-user? Which needs the ability to have plausible deniability for the existence of those other users? Which means that significant amounts of otherwise usable space need to be inaccessibly set aside for those other users on every device—to retain plausible deniability—despite an insignificant fraction of customers using such a feature?
What could be hard about that?
Isn't that the exact same argument against Lockdown Mode? The point isn't that the number of users is small; it's that the feature can significantly help that small set of users, something Apple clearly does care about.
Where CAs are concerned, not having the phone image 'cracked' still does not make it safe to use.
The way it would work is not active destruction of data, just a different view of the data that doesn't include any metadata encrypted under the second profile.
Data would get overwritten only if you actually started using the fallback profile and populating the "free" space, because to that profile all of those data blocks are simply unreserved and look like random data.
The profiles basically overlap on the device. Using them concurrently would be catastrophic, but that's intended: you know not to use the fallback profile, and that knowledge exists only in your head, so it can't be discovered by forensic analysis of the device.
Your main profile knows to avoid overwriting the fallback profile’s data but not the other way around.
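To make the asymmetry concrete, here's a toy sketch of the overlapping-allocation idea (entirely hypothetical; no real OS or filesystem works exactly like this). Every block on the device looks like random bytes, the main profile's allocator treats the fallback profile's blocks as reserved, and the fallback profile's view treats them as ordinary free space:

```python
import os

DEVICE_BLOCKS = 16  # toy block device

# Every block holds random-looking bytes, so encrypted data and genuinely
# free (wiped) space are indistinguishable on inspection.
device = [os.urandom(4) for _ in range(DEVICE_BLOCKS)]

class ProfileView:
    """One profile's view of the device: which blocks it considers in use."""
    def __init__(self, own_blocks, also_reserved=frozenset()):
        self.own = set(own_blocks)
        # Blocks this profile quietly avoids overwriting.
        # The fallback profile has no such set -- that's the point.
        self.reserved = set(also_reserved)

    def free_blocks(self):
        return [b for b in range(DEVICE_BLOCKS)
                if b not in self.own and b not in self.reserved]

    def write(self, data):
        free = self.free_blocks()
        if not free:
            raise IOError("device full (from this profile's view)")
        block = free[0]
        device[block] = data  # may clobber another profile's data!
        self.own.add(block)
        return block

hidden = {10, 11, 12}                      # hidden profile's data lives here
main = ProfileView(own_blocks={0, 1, 2}, also_reserved=hidden)
fallback = ProfileView(own_blocks=set())   # knows nothing about `hidden`

# The main profile never allocates blocks 10-12...
assert all(b not in hidden for b in main.free_blocks())
# ...but the fallback profile considers them free and would eventually
# overwrite them through perfectly normal use.
assert any(b in hidden for b in fallback.free_blocks())
```

This is why actually using the fallback profile is destructive: its allocator carries no record of the hidden blocks, so ordinary writes eventually land on them, and nothing on the device records that the reservation ever existed.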
But the point is also that you can actually log in to the duress profile and use it normally, and it wouldn't look like destruction of evidence, which is what GrapheneOS's current duress PIN amounts to.
Also, I'd recommend the book The Mastermind by Evan Ratliff.
Whether he was involved in the organization and participated in it is certainly up for debate, but it's not like he would admit it.
This could even be a developer feature accidentally left enabled.
Android has supported multiple users per device for years now.
It's actually annoying because every site wants to "remember" the browser information, and so I end up with hundreds of browsers "logged in". Or maybe my account was hacked and that's why there's hundreds of browsers logged in.
Multi-user that plausibly looks like single-user to three letter agencies?
Not even close.
Never ever use your personal phone for work things, and vice versa. It's bad for you and bad for the company you work for in dozens of ways.
Even when I owned my own company, I had separate phones. There's just too much legal liability and chances for things to go wrong when you do that. I'm surprised any company with more than five employees would even allow it.
While plausible deniability may be hard to develop, it’s not some particularly arcane thing. The primary reason against it is the political balancing act Apple has to perform (remember San Bernardino and the trouble the US government tried to create for Apple?). The secondary reason is cost to develop vs. addressable market, but they did introduce Lockdown Mode, so it’s not unprecedented for Apple to improve security for those particularly sensitive to such issues.
This seems hard to justify. They share a lot of code, yes, but many, many things are different (meaningfully so, from the perspective of both app developers and users).
> What’s so hard to make 2-3 pins and each to access different logged in apps and files.
Besides the technical challenges, I think there's a pretty killer human challenge: it's going to be really hard for the user to create an alternate account that looks real to someone who's paying attention. Sure, you can probably fool some bored agent in customs line who knows nothing about you, but not a trained investigator who's focused on you and knows a lot about you.
No. Think about it for a second: you're a journalist being investigated to find your sources, and your phone says you mainly check sports scores and send innocuous emails to "grandma" in LLM-speak? It's not going to fool someone who's actually thinking.
For as long as law enforcement treats protection of privacy as implicit guilt, the best a phone can really do is lock down and hope for the best.
Even if a phone existed that perfectly protected your privacy and was impossible to crack, or made spoofing content easy, law enforcement would just move the goalposts of guilt so that owning the phone itself is incriminating.
Edit: I wanna be clear that I'm not saying any phone based privacy protections are a waste of time. They're important. I'm saying that there is no perfect solution with the existing policy being enforced, which is "guilty until proven dead"
A detective can have a warrant to search someone's home or car, but that doesn't mean the owner needs to give them the key as far as I know.
So do not have biometrics as device unlock if you are a journalist protecting sources.
It's not really that useful for a safe, since safes aren't _that_ difficult to open, and if you haven't committed a crime it's probably better to open your safe for them than to have them destroy it so you need a new one. For a mathematically impossible-to-break cipher, though, very useful.
Deceiving investigators by using an alternate password, or destroying evidence by using a duress code on the other hand is almost always a felony. It's a very bad idea for a journalist to do that, as long as the rule of law is intact.
https://reason.com/2017/05/31/florida-man-jailed-180-days-fo...
>Doe vs. U.S. That case centered around whether the feds could force a suspect to sign consent forms permitting foreign banks to produce any account records that he may have. In Doe, the justices ruled that the government did have that power, since the forms did not require the defendant to confirm or deny the presence of the records.
Well, what if the defendant was innocent of that charge but guilty of or involved in an unrelated matter for which there was evidence in the account records?
Better for the foreseeable future to have separate devices and separate accounts (i.e. not in the same iCloud family for instance)
No, you did something fake to avoid doing what you were asked to do.
> If there is no way for the government to prove that you entered a decoy password that shows decoy contents, you are in the clear.
But there are very effective ways to find hidden encrypted volumes on devices. And then you’ll be asked to decrypt those too, and then what?
This sort of thing is already table stakes for CSAM prosecutions, for example. Law enforcement can read the same blog posts and know as much about technology as you do. Especially if we are hypothesizing an advertised feature of a commercial OS!
Yes, that is what plausible deniability is.
>But there are very effective ways to find hidden encrypted volumes on devices. And then you’ll be asked to decrypt those too, and then what?
I emphasized "done right". If existence of hidden encryption can be proven, then you don't have plausible deniability. Something has gone wrong.
My point was: OP claimed plausible deniability does not apply in legal cases, which is a weird take. If you can have plausible deniability, then it can save you legally. This does not only apply to tech, of course, but encryption was the subject here. In all cases, though, if your situation is not "plausible" (due to broken tech, backdoors, poor OPSEC, and/or damning evidence in other cases), then you don't have plausible deniability by definition.
Having ways of definitively detecting hidden encrypted volumes might be the norm today, might be impossible tomorrow. Then you will have plausible deniability and it will work legally as far as that piece of "evidence" is concerned.
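As a rough illustration of what such detection hinges on (a toy demonstration, not a forensic tool): well-encrypted data and wiped free space both look like uniformly random bytes, so a simple byte-frequency entropy scan can flag high-entropy regions but cannot, by itself, prove they're a hidden volume rather than random fill. A standard-library-only sketch:

```python
import math
import os
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte (maximum 8.0)."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# How wiped "free" space (or good ciphertext) looks: near 8.0 bits/byte.
random_free_space = os.urandom(64 * 1024)

# Plaintext has far lower entropy, which is what simple scans detect.
english_text = b"the quick brown fox jumps over the lazy dog " * 1500

print(f"random: {byte_entropy(random_free_space):.2f} bits/byte")
print(f"text:   {byte_entropy(english_text):.2f} bits/byte")
```

The deniability argument rests on exactly this indistinguishability; in practice it's usually side channels that break it (filesystem metadata, wear-leveling patterns, timestamps), not the ciphertext itself.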
That's a whole lot more to lose than your money and time.
Francis Rawls spent 4 years in jail despite pleading the Fifth the whole time.
Biometric unlock doesn’t require revealing the password.
And good luck depending on the US constitution.
There is a separate border search exception at the point a person actually enters the country which does allow searches of electronic devices. US citizens entering the country may refuse to provide access without consequences beyond seizure of the device; non-citizens could face adverse immigration actions.
To be clear, I do think all detentions and searches without individualized suspicion should be considered violations of the 4th amendment, but the phrase "constitution-free zone" is so broad as to be misleading.
It's one thing to allow police to search a phone. Another to compel someone to unlock the device.
We live in a world of grays and nuance and an "all or nothing" outlook on security discourages people from taking meaningful steps to protect themselves.
I've been advocating for this under-duress-PIN feature for years, as evidenced by this HN comment I made about 9 years ago: https://news.ycombinator.com/item?id=13631653
Maybe someday.