> Is that a rule?

No, it's commonly followed practice: https://en.wikipedia.org/wiki/Coordinated_vulnerability_disc...

I'm all for lighting a fire under the developer's ass, but we live in an imperfect world, and the biggest problem we have is end users. We may have applied the mitigation on day 0 and updated as soon as the fixed kernel landed in our distro - and if some of us didn't, then even savvy users are in that "don't update fast enough" group (which is fine, which is human, but is said imperfection).

Major distros should at least have gotten a few days of notice for something this catastrophic. It doesn't help that the kernel is fixed if "normies" can't get the fix on day 0. For reference, the standard is 30 days for the developer to fix and 90 days for the fix to land on machines. Even 30+7 would have been a substantial improvement.

Ethical security research involves ethics, and maybe they aren't taught in university/college any more - but here's what I was taught: https://www.acm.org/code-of-ethics .

> 1.1 Contribute to society and to human well-being, acknowledging that all people are stakeholders in computing.

> [...] Computing professionals should consider whether the results of their efforts will [...] and will be broadly accessible.

> 1.2 Avoid harm.

> (Honestly, all of it)

> 2.3 Know and respect existing rules pertaining to professional work.

> 3.1 Ensure that the public good is the central concern during all professional computing work.

> People—including users, customers, colleagues, and others affected directly or indirectly—should always be the central concern in computing.

Maybe other codes of ethics for CS exist; I'd like to know which one these ethical researchers were following.

reply
You're trying to extrapolate to this specific scenario from Wikipedia pages. Have you done any of this work? What have you done when you've reported a vulnerability to an upstream with dozens of downstreams? When your teammates have? You keep talking about "protocols" and "commonly followed practice" and "codes of ethics". Tell us more about the codes, protocols, and practices in your shop.

Nobody, for what it's worth, is arguing that major distros shouldn't have gotten some kind of notice. The problem is that the entity responsible for doing that isn't the vulnerability research lab. In fact, as a general procedural point, researchers can't go contacting downstreams. They might be able to do so in the specific case of Linux, but you've tried to spin that possibility into a binding obligation derived from established practices, which: no. That's not a real thing.

reply
It’s a commonly followed practice for some people. Notably, it’s what was done here: they coordinated disclosure with the Linux kernel devs. And now folks are angry that they didn’t also coordinate with yet more downstream projects.

> For reference, the standard is 30 for the developer to fix and 90 for it to land on machines.

I’ve never seen that as a standard anywhere.

Are you thinking of this? https://projectzero.google/vulnerability-disclosure-policy.h...

reply
You are strongly implying that keeping the vulnerability secret follows from what you quoted. But that’s the rub: many of us think the opposite. Not disclosing this would have been the violation.
reply