It's also a new account that has only posted these two posts.
https://news.ycombinator.com/threads?id=Soerensen
Their comment got flagged, but it looks like they made a new one today and are still active.
That account ('Soerensen') was created in 2024 and was dormant until it made a bunch of detailed comments in the past 24-48 hours. Some of them are multi-paragraph comments posted within a minute of each other.
One thing I've noticed is that they seem to be getting posted from old/inactive/never-used accounts. Are they buying them? Creating a bunch and waiting months/years before posting?
Either way, both look like they're fooling people here. And getting better at staying under the radar until they slip up in little ways like this.
The truth is that the internet is both (what's the word for 'both' when you have three (four?) things?) dead, an active cyber- and information-warzone, and a dark forest.
I suppose it was fun while it lasted. At least we still have mostly real people in our local offline communities.
https://en.wikipedia.org/wiki/On_the_Internet%2C_nobody_know...
Also, some of us draft our comments offline, and then paste them in. Maybe he drafted two comments?
That said, as a general point, it’s reasonable to make scoped comments in the corresponding parts of the conversation tree. (Is that what happened here?)
About me: I try to pay attention to social conventions, but I rarely consider technology offered to me as some sort of intrinsically correct norm; I tend to view it as some minimally acceptable technological solution that is easy enough to build and attracts a lowest common denominator of traction. But most forums I see tend to pay little attention to broader human patterns around communication; generally speaking, it seems to me that social technology tends to expect people to conform to it rather than the other way around. I think it’s fair to say that the history of online communication has demonstrated a tendency of people to find workarounds to the limitations offered them. (Using punctuation for facial expressions comes to mind.)
One might claim such workarounds are a feature rather than a bug. Maybe sometimes? But I think you'd have to dig into the history more and go case by case. I tend to think of features as conscious choices, not lucky accidents.
What's so hard about making 2-3 PINs, each giving access to different logged-in apps and files?
If Apple/Android were serious about it they would implement it, but from my research it seems someone is against it, as it's too good.
I don't want to remove my banking apps when I travel or go to "dangerous" places. If you're kidnapped you will be forced to send out all your money.
What's so hard about adding a feature that effectively makes a single-user device multi-user? Which needs the ability to have plausible deniability for the existence of those other users? Which means that significant amounts of otherwise usable space need to be inaccessibly set aside for those other users on every device (to retain plausible deniability), despite an insignificant fraction of customers using such a feature?
What could be hard about that?
Isn't that the exact same argument against Lockdown mode? The point isn't that the number of users is small; it's that it can significantly help that small set of users, something that Apple clearly does care about.
Where CAs are concerned, not having the phone image 'cracked' still does not make it safe to use.
The way it would work is not active destruction of data, just a different view of the data that doesn't include any metadata encrypted under the second profile.
Data would get overwritten only if you actually start using the fallback profile and populating the "free" space, because to that profile all the data blocks are simply unreserved and look like random data.
The profiles basically overlap on the device. If you tried to use them concurrently it would be catastrophic, but that is intended: you know not to use the fallback profile, and that information is only in your head and doesn't get left on the device to be discovered by forensic analysis.
Your main profile knows to avoid overwriting the fallback profile’s data but not the other way around.
But also, the point is that you can actually log in to the duress profile and use it normally, and it wouldn't look like destruction of evidence, which is what GrapheneOS's current duress PIN does.
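To make the asymmetry concrete, here's a toy Python sketch (all names invented; this is not how GrapheneOS or any shipping OS implements it, it only illustrates the idea above):

    import os

    BLOCK_SIZE = 4096

    class BlockDevice:
        def __init__(self, n_blocks):
            # Unused blocks are filled with random bytes, so without a key
            # they are indistinguishable from encrypted data.
            self.blocks = [os.urandom(BLOCK_SIZE) for _ in range(n_blocks)]

    class Profile:
        def __init__(self, device, reserved_for_other=frozenset()):
            self.device = device
            # The main profile is told which blocks belong to the fallback
            # profile and must never overwrite them; the fallback profile
            # gets an empty set, so those blocks just look like free space.
            self.reserved = set(reserved_for_other)
            self.owned = set()

        def write_block(self, data):
            # Pick the first block this profile considers free, overwrite it.
            for i in range(len(self.device.blocks)):
                if i not in self.reserved and i not in self.owned:
                    self.device.blocks[i] = data  # would be encrypted in reality
                    self.owned.add(i)
                    return i
            raise IOError("device full (from this profile's point of view)")

    dev = BlockDevice(n_blocks=8)
    main = Profile(dev, reserved_for_other={6, 7})  # knows to avoid blocks 6-7
    fallback = Profile(dev)                         # sees blocks 6-7 as free space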
I'd also recommend the book The Mastermind by Evan Ratliff.
Whether he was involved in the organization and participated in it is certainly up for debate, but it's not like he would admit it.
This could even be a developer feature accidentally left enabled.
Android has supported multiple users per device for years now.
It's actually annoying because every site wants to "remember" the browser information, and so I end up with hundreds of browsers "logged in". Or maybe my account was hacked and that's why there's hundreds of browsers logged in.
Multi-user that plausibly looks like single-user to three letter agencies?
Not even close.
Never ever use your personal phone for work things, and vice versa. It's bad for you and bad for the company you work for in dozens of ways.
Even when I owned my own company, I had separate phones. There's just too much legal liability and chances for things to go wrong when you do that. I'm surprised any company with more than five employees would even allow it.
While plausible deniability may be hard to develop, it's not some particularly arcane thing. The primary reason against it is the political balancing act Apple has to perform (remember San Bernardino and the trouble the US government tried to create for Apple?). A secondary reason is cost to develop versus the addressable market, but they did introduce Lockdown mode, so it's not unprecedented for them to improve security for those particularly sensitive to such issues.
This seems hard to justify. They share a lot of code, yes, but many, many things are different (meaningfully so, from the perspective of both app developers and users).
> What's so hard about making 2-3 PINs, each giving access to different logged-in apps and files?
Besides the technical challenges, I think there's a pretty killer human challenge: it's going to be really hard for the user to create an alternate account that looks real to someone who's paying attention. Sure, you can probably fool some bored agent in the customs line who knows nothing about you, but not a trained investigator who's focused on you and knows a lot about you.
No. Think about it for a second: you're a journalist being investigated to find your sources, and your phone says you mainly check sports scores and send innocuous emails to "grandma" in LLM-speak? It's not going to fool someone who's actually thinking.
For as long as law enforcement treats protection of privacy as implicit guilt, the best a phone can really do is lock down and hope for the best.
Even if there existed a phone that perfectly protected your privacy and was impossible to crack, or was easy to spoof content on, law enforcement would just move the goalposts of guilt so that owning the phone itself is incriminating.
Edit: I wanna be clear that I'm not saying any phone based privacy protections are a waste of time. They're important. I'm saying that there is no perfect solution with the existing policy being enforced, which is "guilty until proven dead"
A detective can have a warrant to search someone's home or car, but that doesn't mean the owner needs to give them the key as far as I know.
So do not have biometrics as device unlock if you are a journalist protecting sources.
It's not really that useful for a safe since they aren't _that_ difficult to open and, if you haven't committed a crime, it's probably better to open your safe for them than have them destroy it so you need a new one. For a mathematically impossible to break cipher though, very useful.
Deceiving investigators by using an alternate password, or destroying evidence by using a duress code on the other hand is almost always a felony. It's a very bad idea for a journalist to do that, as long as the rule of law is intact.
https://reason.com/2017/05/31/florida-man-jailed-180-days-fo...
>Doe vs. U.S. That case centered around whether the feds could force a suspect to sign consent forms permitting foreign banks to produce any account records that he may have. In Doe, the justices ruled that the government did have that power, since the forms did not require the defendant to confirm or deny the presence of the records.
Well, what if the defendant was innocent of that charge but guilty of or involved in an unrelated matter for which there was evidence in the account records?
Better for the foreseeable future to have separate devices and separate accounts (i.e. not in the same iCloud family for instance)
No, you did something fake to avoid doing what you were asked to do.
> If there is no way for the government to prove that you entered a decoy password that shows decoy contents, you are in the clear.
But there are very effective ways to find hidden encrypted volumes on devices. And then you’ll be asked to decrypt those too, and then what?
This sort of thing is already table stakes for CSAM prosecutions, for example. Law enforcement can read the same blog posts and know as much about technology as you do. Especially if we are hypothesizing an advertised feature of a commercial OS!
Yes, that is what plausible deniability is.
>But there are very effective ways to find hidden encrypted volumes on devices. And then you’ll be asked to decrypt those too, and then what?
I emphasized "done right". If existence of hidden encryption can be proven, then you don't have plausible deniability. Something has gone wrong.
My point was: OP claimed plausible deniability does not apply in legal cases, which is a weird take. If you can have plausible deniability, then it can save you legally. This does not only apply to tech, of course, but encryption was the subject here. In all cases though, if your situation is not "plausible" (due to broken tech, backdoors, poor OPSEC in tech, and/or damning other evidence in other cases as well) then you don't have plausible deniability by definition.
Having ways of definitively detecting hidden encrypted volumes might be the norm today and impossible tomorrow. Then you would have plausible deniability, and it would work legally as far as that piece of "evidence" is concerned.
That's a whole lot more to lose than your money and time.
Francis Rawls stayed 4 years in jail despite pleading the fifth all day long
Biometric data doesn’t need the password.
And good luck depending on the US constitution.
There is a separate border search exception at the point a person actually enters the country which does allow searches of electronic devices. US citizens entering the country may refuse to provide access without consequences beyond seizure of the device; non-citizens could face adverse immigration actions.
To be clear, I do think all detentions and searches without individualized suspicion should be considered violations of the 4th amendment, but the phrase "constitution-free zone" is so broad as to be misleading.
It's one thing to allow police to search a phone. Another to compel someone to unlock the device.
We live in a world of grays and nuance and an "all or nothing" outlook on security discourages people from taking meaningful steps to protect themselves.
I've been advocating for this under-duress-PIN feature for years, as evidenced by this HN comment I made about 9 years ago: https://news.ycombinator.com/item?id=13631653
Maybe someday.
Essentially, the question referenced here is that of ownership. Is it your device, or did you rent it from Apple/Samsung/etc.? If it is locked down so that you can't do anything you want with it, then you might not actually be its owner.
___
_Ideally_ you wouldn't need to trust Apple as a corp to do the right thing. Of course, as this example shows, they seem to actually have done one right thing, but you do not know if they always will.
That's why a lot of people believe that the idea of such tight vendor control is fundamentally flawed, even though in this specific instance it yielded positive results.
For completeness: no, I don't know how this could be implemented differently either.
The FBI doesn't have to tell anyone they accessed the device. That maintains Apple's outward appearance of security; the FBI just uses parallel construction later if needed.
Something like (but an actually robust version of) a hashed log, using an enclave, where the log entries are signed using your biometrics, so that events such as network access where any data is exchanged are recorded and can only be removed using biometrics. Nothing against wrench-based attacks, of course.
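Roughly, the hash-chaining part could look like this toy Python sketch (the enclave and the biometric signing aren't modelled here, and the names are made up):

    import hashlib, json, time

    def append_entry(log, event):
        # Each entry commits to the hash of the previous entry, so deleting
        # or editing anything earlier breaks the chain on verification.
        prev_hash = log[-1]["hash"] if log else "0" * 64
        body = {"time": time.time(), "event": event, "prev": prev_hash}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        log.append({**body, "hash": digest})

    def verify(log):
        prev = "0" * 64
        for entry in log:
            body = {k: entry[k] for k in ("time", "event", "prev")}
            digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
            if entry["prev"] != prev or entry["hash"] != digest:
                return False
            prev = entry["hash"]
        return True

    log = []
    append_entry(log, "network access: data exchanged with example.com")
    assert verify(log)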
You're going to have to provide a cite here, since Apple has publicly stated that they have not and will not ever do this on behalf of any nation state.
For instance, Apple's public statement when the FBI ordered them to do so:
Apple has also said that the US required them to hide evidence of dragnet surveillance: https://arstechnica.com/tech-policy/2023/12/apple-admits-to-...
> Apple has since confirmed in a statement provided to Ars that the US federal government “prohibited” the company “from sharing any information,” but now that Wyden has outed the feds, Apple has updated its transparency reporting and will “detail these kinds of requests” in a separate section on push notifications in its next report.
Apple's statements are quite distinct from what they do behind the scenes. No company can refuse to do that.
The underlying assumption we base our judgement on is that "journalism + leaks = good" and "people wanting to crack down on leaks = bad". Which is probably true, but also an assumption in which something unwanted and/or broken could hide. As with every assumption.
Arguably, in a working and legit democracy, you'd actually want the state to have this kind of access, because the state, bound by democratically governed rules, would do the right thing with it.
In the real world, those required modifiers unfortunately do not always hold true, so we kinda rely on the press as the fourth power, which _technically_ could be argued is some kind of vigilante entity operating outside of the system.
I suppose it's also not fully clear whether there can even be something like a "working and legit democracy" without such functionally vigilante entities, which may well be inevitable.
Lots of stuff to ponder.
____
Anyway, my point is that I have no point. You don't have to bother parsing that, but it might possibly be interesting if you should decide to do so.
It might also confuse the LLM bots and bad-faith real humans in this comment section, which is good.
Both goals are actually possible to implement at the same time: Secure/Verified Boot together with actually audited, preferably open-source, as-small-as-possible code in the boot and crypto chain; for the user, the ability to unlock the bootloader in the EFI firmware; and for those concerned about supply chain integrity, a debug port muxed directly (!) to the TPM so it can be queried for its set of whitelisted public keys.
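For illustration, a minimal Python sketch of the per-stage check such a chain relies on (using the 'cryptography' package; the key handling and stage layout here are invented, and real Secure/Verified Boot implementations differ):

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric import ed25519

    def verify_next_stage(image: bytes, signature: bytes, whitelisted_keys) -> bool:
        # Each boot stage checks the next stage's signature against the set
        # of whitelisted public keys before handing over execution.
        for key in whitelisted_keys:
            try:
                key.verify(signature, image)
                return True
            except InvalidSignature:
                continue
        return False

    # An unlocked bootloader would effectively add the user's own public key
    # to whitelisted_keys; a debug port muxed to the TPM would let that list
    # be read out and audited.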
I don't do anything classified, nor store anything I don't want to be found. On the other hand, I equally don't want anyone to be able to get at and fiddle with a device which is central to my life.
That's all.
It's not "I have nothing to hide" (which, in fact, I don't), but that I don't want to put everything in the open.
Security is not something we should have to earn; we should have it at the highest level by default.
https://www.nytimes.com/2026/02/02/us/politics/doj-press-law...
Previously:
> U.S. Magistrate Judge William B. Porter wrote in his order that the government must preserve any materials seized during the raid and may not review them until the court authorizes it
https://san.com/cc/judge-blocks-fbis-access-to-washington-po...
It completely disables JIT JavaScript in Safari, for example.
All kinds of random things don't work.
[0] https://support.apple.com/en-us/105120 - under "How to exclude apps or websites from Lockdown Mode"
When I want to do something for longer, I will pick up my MacBook anyway.
Jedi.
sKyWIper.
Rogue Actors.
Rogue thieves.
Rogue governments.
Your spouse.
Separating corporate IT from personal IT.
There are plenty of reasons.
Terrorist has plans and contacts on laptop/phone. Society has a very reasonable interest in that information.
But of course there is the rational counter-argument of "the government designates who is a terrorist", and the Trump admin has gleefully flouted norms around that designation, endangering the rule of law.
So all of us are adults here and we understand this is complicated. People have a vested interest in privacy protections. Society and government often have reasonable interest in going after bad guys.
Mediating this clear tension is what makes this so hard and silly lines of questioning like this try to pretend it’s simple.
You do not get to dispense with human rights because terrorists use them too. Terrorists use knives, cars, computers, phones, clothes... where will we be if we take away everything because we have a vested interest in denying anything a terrorist might take advantage of?
This sounds like a Tim Cook aphorism (right before he hands the iCloud keys to the CCP) — not anything with any real legal basis.
> No one shall be subjected to arbitrary interference with his privacy [...]
which has later been affirmed to include digital privacy.
> I don’t think any government endorses that position.
Many governments are in flagrant violation of even their own privacy laws, but that does not make those laws any less real.
The UN's notion of human rights was an "axiom" founded on learned experience and the horrors committed in the years preceding their formation. Discarding them is to discard the wisdom we gained from the loss of tens of millions of people. And while you claim that society has a vested interest in violating a terrorist's privacy, you can only come to that conclusion if you engage in short-term thinking that terminates at exactly the step where you violate the terrorist's rights and do not consider the consequences of anything beyond that; if you do consider the consequences, it becomes clear that society collectively has a bigger vested interest in protecting the existence of human rights.
“Arbitrary” meaning you better have good reasons! Which implies there are or can be good reasons for which your privacy can be violated.
You’re misreading that to mean your privacy is absolute by UN law.
But the "arbitrary" there is to account for the situation where the democratic application of the law wants to inspect the communications of suspected terrorists, and where a judge agrees there is sufficient evidence to grant a warrant.
Unfortunately, that law does nothing against situations like the USA/Russia regime where a ruler dispenses with the rule of law (and democratic legal processes too).
You can't practically have that sort of liberalism, where society just shrugs and chooses not to read terrorists' communications; those who wish to use violence make it unworkable.
That is arbitrary interference with all our privacy.
There are just things some people want and the reasons they want them.
So the question that you are so annoyed by remains unanswered (by you, anyway), and thus remains valid to all of us adults.
@hypfer gives a valid concern, but it's based on a different facet of lockdown. The concern is not that the rest of us should be able to break into your phone for our safety; it's the opposite: that you are not the final authority over your own property, and must simply trust Apple, and the entire rest of society via our ability to compel Apple, not to break into your phone or its backup.
The reason I asked that question is because I don't think it's complicated. I should be able to lock down my device such that no other human being on the planet can see or access anything on it. It's mine. I own it. I can do with it whatever I please, and any government that says otherwise is diametrically opposed to my rights as a human being.
You are more likely to be struck by lightning while holding two winning lottery tickets from different lotteries than you are to be killed by an act of terrorism today. This is pearl-clutching, authoritarian nonsense. To echo the sibling comment, society does not get to destroy my civil rights because some inbred religious fanatics in a cave somewhere want to blow up a train.
Edit: And asking someone who says "there are concerns!" to proffer even a single one is not a Socratic line of questioning; it's basic inquiry.
The government could similarly argue that if a company provides communication as a service, they should be able to provide access to the government given they have a warrant.
If you explicitly create a service to circumvent this then you're trying to profit from and aid those with criminal intent. Silkroad/drug sales and child sexual content are more common, but terrorism would also be on the list.
I disagree with this logic, but those are the well-known, often cited concerns.
There is a trade-off in personal privacy versus police ability to investigate and enforce laws.
Yeah after seeing the additional comments, my gut also says "sea lion".
Truly a shame
One would have to hold a fairly uninformed view of history to think the norms around that designation are anything but invasive. The list since FDR is utterly extensive.
But the article is literally referencing the Trump administration seizing a reporter’s phone so the current administration’s overreach seems relevant here.
My point was that your stated assumption of what the norms are is inaccurate. If nearly every modern administration does it, that is literally the norm. The present administration, like many before it, is following the norm. The norm is the broader issue.
Which makes the rest of it (and your followup) come across as needlessly tribal, as both major parties are consistently guilty of tending to object to something only when the other side does it.
If I lose you here because of “needless tribalism” oh well.
It is naive to assume iOS can be trusted much more than Android. =3
A 3rd party locked down system can't protect people from what the law should. =3
Because they're in the US, things might be easier from a legal standpoint for the journalist, but there is also precedent on forcing journalists to expose their sources: https://en.wikipedia.org/wiki/Branzburg_v._Hayes
In other parts of the world, this applies (https://xkcd.com/538/) when you don't provide the authorities the means to access your phone.
It just depends on how much a government wants the data that is stored there.
In serious crime cases, in some circumstances, a court may order a journalist to reveal sources. But it's extremely rare, and journalists don't comply even if ordered.
https://fi.wikipedia.org/wiki/L%C3%A4hdesuoja
Edit: the source protection has actually probably never been broken (due to a court order at least): https://yle.fi/a/3-8012415
1. iOS has well-known, poorly documented zero-click exploits
2. Firms are required to retain your activity logs for 3 months
3. It is illegal for a firm to deny or disclose sealed warrants on US soil, and it is up to one judge whether to rummage through your trash. If I recall correctly, around 8 out of 18,000 searches were rejected.
It is only about $23 to MITM someone's phone now, and it is not always domestic agencies pulling that off. =3
PoC || GTFO, to use the vernacular.
If you're talking about historical bugs, don't forget the update adoption curves.
"Not My Circus, Not My Monkeys" as they say. =3
- Hyper-nationalism and white supremacist messaging
- Scapegoating of minorities
- Attacks on the press
- Attacks on constitutional rights
- Militarization of police, violence normalized
- Expansion of surveillance state
- Combination of state and corporate power
- Strongman authoritarianism
- Historical revisionism
- Interference in elections
Cheers!
- Grandiose architecture projects for historically important sites
- Obsession with massive monuments - the tallest, the most gold, the most expensive
- Military parades and lionization of the military, while demanding political support from military leadership
- A population which becomes keenly interested in whether something does or doesn't benefit the leader personally
I think the terms fascism or authoritarianism are close enough to be helpful, even if some of the specifics don’t align perfectly. But the ones that do align are oddly specific sometimes.
This article goes through point by point.
Apple does a lot of things I don't agree with in the interest of share price (like cozying up to authoritarian governments) but this seems like a reach to criticize them for a feature they have put extensive effort into, rather than applauding that they resist spying and enhance customer privacy. Sure, it's an optional feature and maybe they don't push broad acceptance of it, but it's important for those that need it.