There's no such thing as "legitimacy of the bootloader, OS" that can be verified by someone who isn't the device's user. The bootloader that booted the phone I type this on is patched by me, which makes it more "legitimate" than any other bootloader that could be placed there.
reply
The reason (or, depending on your inclinations, the excuse) for trusted computing to exist is not to guarantee that I didn’t patch the bootloader of the phone on which I type my comment; it’s to guarantee I didn’t patch the bootloader of the phone on which your grandma logs in to her bank without her knowledge.
reply
No, the reason is to let application providers decide which platforms you can run their software on. The reasons why they need that are diverse: DRM, preventing reverse engineering, shifting liability, "cheating" prevention - to name a few, but ultimately they're all about asserting control over the user, just motivated differently in various use cases. "Think of the grandmas".
reply
What's the problem with the current status quo, or the status quo of 5 or 10 years ago? 20 years ago there was basically no cheat prevention, but nobody cared; we just didn't play with cheaters. There are still cheaters in every game. No matter what kind of DRM streaming platforms use, their movies show up on torrents immediately. The only difference compared to 5-20 years ago is that the user experience is worse: I need to install a lot of intrusive bullshit, and I can't watch movies at a proper resolution. For literally nothing.
reply
It's not just that "user experience is worse", it's an existential threat to Free Software.

In the past, when there was a proprietary tool you needed to use to do something, people could analyze and reimplement it. The reasons varied - someone needed "muh freedomz", someone else wanted to do the thing on an unsupported platform, someone else wanted to change something about how the tool worked (perhaps annoyed by paper jams)... Eventually you could end up with an interoperable FLOSS reimplementation. This has happened with lots of things - IMs, network service clients, appliance drivers, even operating systems - and this is how people like me could switch away from Windows and have their computers (and later phones) remain fully functional in the society around us, perhaps with minor annoyances, but without real showstoppers.

Remote attestation changes this dynamic drastically. Gaim (later Pidgin) and Kadu couldn't have been made if service providers like AIM, ICQ, or Gadu-Gadu could determine whether you're using the Official App™ from the Official Store™ on the Official OS™ and simply refuse to handle requests from your reimplementation. They could still try to be hostile to you without it, and often did, but it wasn't an uneven fight. Currently we're still in the early days and you can still get by in society by defaulting to services on the Web, using a plastic card instead of a phone for payments, etc., but this is already changing. And it's not just a matter of networked services either - I bet we're going to see peripheral devices refusing to be driven by non-attested implementations too.

Secure boot chains have some value and are worth having, but not when they don't let the user be in charge (or delegate that to someone else), and not when they prioritize the security of "apps" over that of users. The ability for us as users to lie to apps is actually essential to preserving our agency. Without it we're screwed, because then, to connect ourselves to the fabric of society, we'll need to find and exploit vulnerabilities that will be patched as soon as they become public.

reply
> The ability for us as users to lie to apps is actually essential to preserving our agency. Without it we're screwed, because then, to connect ourselves to the fabric of society, we'll need to find and exploit vulnerabilities that will be patched as soon as they become public.

The same freedom is being abused by malicious actors, even on Windows (BlackLotus, for example), but also on pre-infected phones emptying people's bank accounts. This is an incredibly unfortunate outcome, but what's the solution?

I see no other potential outcome than that free computing and trusted computing are going to be totally separate. Possibly even on the same device, but not in a way that lets anyone tamper with it.

reply
A lot of other freedoms are being abused and always have been, but somehow we don't go and ban kitchen knives, as having them around is valuable. This is a false dichotomy. Systems can be secure and trusted by the user without having to cede control, and some risks are just not worth eliminating.

Most importantly - it's the user who needs to know whether their system has been tampered with, not apps.

reply
> somehow we don't go and ban kitchen knives

False analogy. You can’t have your kitchen knife exploited by a hacker team in North Korea, who shotgun attacks half of the public Internet infrastructure and uses the proceeds to fund the national nuclear program, can you? (I somewhat exaggerate, but you get the idea.)

> Systems can be secure and trusted by the user without having to cede control

In an ideal world where users have infinite information and infinite capability to process and internalize it to become an infosec expert, sure. I don’t know about you, but most of us don’t live in that world.

I agree it’s not perfect. Having to use Liquid Glass and being unable to install custom watch faces is ridiculous. There’s probably an opportunity for a hardened OS that can be trusted by interested parties not to be maliciously altered, and that doesn’t force so many constraints onto users like current walled gardens do. But a fully open OS, plus an ordinary user who has no time or willingness to casually become a tptacek on the side, in addition to a completely unrelated full-time job that’s getting more competitive due to LLMs and whatnot, seems more like a disaster than a utopia.

reply
> You can’t have your kitchen knife exploited by a hacker team in North Korea, who shotgun attacks half of the public Internet infrastructure and uses the proceeds to fund the national nuclear program, can you? (I somewhat exaggerate, but you get the idea.)

Isn’t the status quo that you need to intentionally choose to allow this?

reply
Yes (well, kinda - attested systems can be and are vulnerable too), and remote attestation is completely orthogonal to that threat anyway. Securing the boot chain does not involve letting apps verify the environment they run in, it's an extra (anti-)feature that's built on top of secure boot chains.

It's also really incredible how people can see "user being in control" and immediately jump to "user having to be an infosec expert", as if one implied the other. You can't really discuss things in good faith in such a climate :(

reply
> but somehow we don't go and ban kitchen knives, as having them around is valuable

Some countries do :) Though I think physical analogies are misleading in a lot of ways here.

> Systems can be secure and trusted by the user without having to cede control, and some risks are just not worth eliminating.

Secure, yes, trustworthy to a random developer looking at your device, no. They're entirely separate concepts.

> Most importantly - it's the user who needs to know whether their system has been tampered with, not apps.

Expecting users to know things does a lot of heavy lifting here.

reply
I never mentioned users having to know things (what you quoted was about the user being informed whether their system has been tampered with, which is the job of a secure boot chain). The user being in control means that the user can decide whom to trust. The user may end up choosing Google, Apple, Microsoft, etc., and that's fine as long as they have a choice. Most users won't even bother to choose, and that's fine too; but with remote attestation, it's not the user who decides, even if they want to. And we don't need random developers looking at our devices to consider them trustworthy; it's none of their business, and it's a big mistake to let them.
reply
How large is this preinfected phones problem? Is it large enough to sacrifice freedom?
reply
There has been a major discovery of pre-installed malware every year for the past decade. Seems like a fairly big problem.
reply
And how exactly did attestation help there?

Securing apps from the user does not secure the user from malware.

reply
[dead]
reply
You can bicker about the words all day long. Legitimacy, or perhaps better: authenticity, in this context, would be a bootloader or OS that doesn't allow tampering with the execution of an app.
reply
Any bootloader or OS that doesn't allow the user to tamper with it or the other tools they're using on it is obviously illegitimate malware.
reply
It's a funny comment, because actual malware very much loves to tamper with the bootloader and OS.

That was the motivation for cryptographically attesting the boot process and OS, and it in part paved the way for app attestation.

There are alternatives though: the Android Hardware Attestation API enables attestation on custom ROMs, but the attestation verifier needs a list of hashes for all "acceptable" ROMs. GrapheneOS publishes theirs, but to my knowledge nobody is maintaining a community list.
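
To make the idea concrete, here's a minimal sketch of what such a community verifier could do once the attestation certificate chain has been validated and the verified-boot key extracted. Everything below (the key bytes, the allowlist) is a hypothetical placeholder, not real GrapheneOS data:

```python
import hashlib

def is_acceptable_rom(verified_boot_key: bytes, allowlist: set[bytes]) -> bool:
    """Compare the SHA-256 of the attested verified-boot key against an
    allowlist of hashes for "acceptable" ROMs. In a real verifier the key
    would be parsed out of the attestation certificate's key-description
    extension; here it is plain bytes for illustration."""
    return hashlib.sha256(verified_boot_key).digest() in allowlist

# Hypothetical community-maintained allowlist (placeholder key, not a real one).
rom_signing_key = b"example-custom-rom-verified-boot-key"
allowlist = {hashlib.sha256(rom_signing_key).digest()}

print(is_acceptable_rom(rom_signing_key, allowlist))     # True
print(is_acceptable_rom(b"unknown-rom-key", allowlist))  # False
```

The hard part isn't the hash comparison, of course; it's curating and maintaining that allowlist, which is exactly what no one is doing yet.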

reply
Nothing funny in it, I'm afraid. Socially accepted malware is still malware. Caffeine is a stimulant, alcohol is a drug, and a piece of software that works against the user is malware.

Cryptographic attestation is not a problem in itself, the problem is exactly what you already somewhat hinted at: it's who and how decides who to trust and who gets to make (or delegate) the choices. You can make a secure system that lets the user be in charge, but these systems we're discussing here don't (and that's by design; they're made to protect "apps", not users).

reply
Sorry, but this is nonsense: most users - even Linux-toting power users - don't have the time, ability, or knowledge to verify the contents of their OS in a way that would catch the issues prevented by attestation.

The problem of modified phones containing malware is very real, and unless you want a full-on Apple "you're not allowed to touch the OS" model, you need some kind of audited OS verification that you as a user, or security-sensitive software, can depend on.

reply
No, what you're saying is nonsense. I can burn a key into the efuses of this phone to make it boot only things signed by me, make the whole boot path verified, the OS image immutable, etc., and all of this can provide me some value; but it's absolutely not in my interest to let applications be picky about what can or can't happen in the OS (even if they would accept my key being there rather than Google's, which they won't). The only thing that achieves is preventing me from using the device the way I want or need it to be used.
reply
I agree about the part where apps shouldn't be able to see whether the OS is trusted.

But to remove that incentive, you first need to stop holding app companies legally responsible for compromised user OSes.

Are you willing to absolve Google, Apple, and Deutsche Bank of responsibility for damage that happens on compromised user OSes?

reply
The attested systems have vulnerabilities too, so how do they deal with that responsibility?
reply
There's also a problem with unmodified phones containing malware, namely an operating system made by an advertising company, which is designed to collect as much information about you as possible.

And this malware is largely based on open source code (Linux) that was originally developed on open, documented hardware, where the firmware boot loader did nothing more than load the first 512 bytes of your hard disk to address 0x7c00 and transfer complete control to it.
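
That old convention is simple enough to check in a few lines: a legacy MBR boot sector is exactly 512 bytes ending in the 0x55 0xAA signature, and the BIOS would load it to 0x7c00 and jump in without asking anyone's permission. A sketch (the sector bytes below are a synthetic placeholder, not a real disk image):

```python
BOOT_SECTOR_SIZE = 512
BOOT_SIGNATURE = b"\x55\xaa"  # bytes 510-511 of a legacy MBR

def is_bootable_mbr(sector: bytes) -> bool:
    """Return True if a legacy BIOS would consider this sector bootable:
    it loads these 512 bytes to address 0x7c00 and transfers control to
    them if the last two bytes carry the boot signature."""
    return len(sector) == BOOT_SECTOR_SIZE and sector[510:512] == BOOT_SIGNATURE

# Synthetic example: 510 bytes of code/padding plus the signature.
sector = bytes(510) + BOOT_SIGNATURE
print(is_bootable_mbr(sector))      # True
print(is_bootable_mbr(bytes(512)))  # False (no signature)
```

That two-byte signature was the entire "attestation" the firmware performed; anyone's code was welcome.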

Yes, there were viruses that exploited this openness, but imagine if Linus Torvalds had needed a cryptographic certificate from IBM or Microsoft to be allowed to run his own code! This is basically the situation we have today, and if you don't see how dystopian it is, I don't know what more to say.

I will never understand why such an overwhelming majority of people seem to just accept this. When frigging barcodes were introduced, there were widespread conspiracy theories about them being the Mark of the Beast - ridiculous, of course, but look where we are now: in some places you literally can't buy or sell without carrying around a device that is hostile to your interests. And soon it will be mandated by the state for everyone.

Google must be destroyed.

reply
Yeah, randomly calling software you don't like "malware" isn't making the strong case you think it is. Nor does it help this discussion.
reply
It's doing things that are against the interest of the user. But obviously, that's no longer an acceptable definition! According to our benevolent overlords, Android is definitely not malware, while yt-dlp is </s>
reply
> App attestation does not require an Apple account nor a google account. For Android, it does limit the ROMs to Google certified ones and requires GMS to be installed.

To me, there is no difference between your sentences. You require the blessing of an American company to be able to use eIDAS. Google has the power to disable eIDAS at a national scale by making the attestation services treat all devices as uncertified.

There should be NO reliance whatsoever on a private company not under the control (direct or indirect) of the government, let alone on a foreign private company.

Edit: I just noticed your username and the fact that your account is very new. Are you astroturfing?

reply
I made an account because I'm qualified to talk about this topic :-) I've spent considerable time testing every corner case of the UX and DX of an app-attested service.

App attestation can fail on simulators, GrapheneOS, dev builds - I've seen it all. There is one check you can do to see if an app was sideloaded, so indirectly it can require a Google account.

Title is still misleading though, as it explicitly mentions accounts.

reply
Come September, there will be no sideloaded apps on Android.
reply
You're behind on your news!

Google details new 24-hour process to sideload unverified Android apps (1196 points, 16 days ago, 1262 comments) https://news.ycombinator.com/item?id=47442690

reply
Functionally, it's doubtful this won't cause further issues. Developer tools cause some security checks to fail; it's not yet known whether the "unknown apps" setting will do the same.
reply
I agree: there is still a reliance on the tech giants that produce the phones, who are the ones embedding the cryptographic keys, to make this end-to-end attestation work.

But in pure technical & UX terms, you don't need to be logged in.

reply
[flagged]
reply
Your whole point is orthogonal to what I said too.

I said the title is misleading, which it is.

Your argument that app attestation should be avoided because big tech company can withhold it is garbage. It holds no water. They can cut off access to the app in general by removing it from the app stores and the devices that have it installed.

American big tech has Europe in a stranglehold, I agree with your sentiment there.

eIDAS can be used with the ID reader even on Linux; there's no lockout. They want to offer a convenient alternative for the normies, in a secure manner, and I don't mind.

Edit: my 70-year-old mother even authenticates with eIDAS (not Germany, another EU country) on Linux Mint. There's no argument for lockout from my anecdotal perspective.

reply
How are you expecting someone here to complete a captcha in the comments?
reply
deleted
reply