But it wasn't always this way, so I don't think it has to be. People just need to start paying attention to this.
The impact of a lot of those vulnerabilities would be mitigated if the affected programs didn't connect to the network in the first place.
As for updates in general, I really like the model adopted by Linux package managers and BSD ports systems. The entire repository metadata is downloaded from a mirror and cached locally, so your search terms never leave your machine. Downloads come from the nearest mirror; there's no "standard" mirror software (unless rsync and Apache count?), so mirrors don't report who downloaded what back to any central system, and you can always host your own. Everything is verified via GPG. And most importantly, nothing happens on its own; you're expected to run `apt update` or `dnf update` yourself. It won't randomly eat your bandwidth on a metered connection or reveal your OS details to a public hotspot.
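The "nothing happens on its own" part is itself just configuration. On Debian-family systems, for instance, the periodic jobs can be pinned off explicitly; a sketch (the file name `99no-auto` is my own convention, any file under `/etc/apt/apt.conf.d/` is read):

```
// /etc/apt/apt.conf.d/99no-auto (hypothetical file name)
APT::Periodic::Update-Package-Lists "0";   // never refresh metadata automatically
APT::Periodic::Unattended-Upgrade "0";     // never install upgrades automatically
```

With that in place, metadata only changes when you run `apt update` by hand, and searches (`apt-cache search ...`) keep hitting the local cache.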
Simple, non-invasive, transparent, (almost) all-encompassing, and centrally configurable.