> It disappoints me to see hardware compensate for the failures of software. We should have done better.

I disagree. From a user's point of view, hardware-assisted memory safety is always beneficial. As a user of any software, you cannot verify that you are running a program that is free of memory access errors. This is true even when the software is written in Rust or an automatic memory-managed language.

I hope that one day I will be able to enable memory integrity enforcement for all processes running on my computers and servers, even those that were not designed for it. I would rather see a crash than expose my machine to possible security vulnerabilities due to memory access bugs.

reply
I'm skeptical that you can even fully prevent exploitation of human error in software design. This just narrows one class of errors.
reply
How could we have done better without first knowing better?
reply
We have known better for decades; that is why Multics earned a higher security rating than UNIX, and C's flaws versus PL/I's are noted in a DoD report.
reply
It also helps that nobody uses Multics, so nobody has bothered to exploit it.
reply
I can give other, more recent examples to prove the C community's blindness to security issues.

From which decade since C came to be would you like the example?

reply
I'm certainly not defending C. I'm just saying multics is a horrible example.
reply
It is one example out of many since 1958, starting with JOVIAL, of how the industry has been aware of the security flaws that C allows for, which WG14 has shown very little interest in fixing, including turning down Dennis Ritchie's proposal for fat pointers in 1990.

Note that C's authors were aware of many of its flaws, which is why in 1979 they designed lint, which C programmers were supposed to use as part of their workflow, and why, as mentioned above, fat pointers were proposed.

Also note that C's authors eventually moved on, first creating Alef (granted, a failed experiment), then Limbo on Inferno, finally arriving at Go.

Rust's ideas also build on Cyclone, AT&T Research's work on how to replace C.

It took the tipping point of the amount of money spent fixing CVEs and dealing with ransomware for companies and governments to start deciding this is no longer tolerable.

reply
Rust isn't going to fix security vulnerabilities, either, though.

My point is that focusing on the language misses the real issue, which is simply incorrect code.

reply
I agree. The underlying hardware should be as simple as possible, and thus cheap and low-power. Fixing bad software practices (like using an unsafe language) via hardware hacks is a terrible mistake.
reply
> Fixing bad software practices (like using an unsafe language) via hardware hacks is a terrible mistake.

It's like saying airbags, seat belts (and other safety features) in cars are a terrible mistake because they just fix bad driving practices.

reply
On the contrary, fixing pervasive and increasingly costly ecosystem issues in hardware is exactly the kind of innovation we need.
reply