upvote
> If the size of the new type is larger than the size of the last-written type, the contents of the excess bytes are unspecified (and may be a trap representation). Before C99 TC3 (DR 283) this behavior was undefined, but commonly implemented this way.

https://en.cppreference.com/w/c/language/union

> When initializing a union, the initializer list must have only one member, which initializes the first member of the union unless a designated initializer is used(since C99).

https://en.cppreference.com/w/c/language/struct_initializati...

→ `= {0}` initializes the first union member, and bytes outside of that first member are unspecified. Seems like GCC 15.1 follows the 26-year-old standard correctly. (Not sure how much has changed from C89 here.)

reply
Programming languages are products; that is like saying you want to keep using vi 1.0.

Maybe C should have stopped at K&R C from UNIX V6; at least that would have spared the world from having it adopted outside UNIX.

reply
I liked the idea I heard: internet audiences demand progress, but internet audiences hate change.
reply
If C++ had never been invented, that might have been the case.
reply
C++ was invented precisely because Bjarne Stroustrup vowed never again to repeat the downgrade of his development experience from Simula to BCPL.

When faced with writing a distributed-systems application at Bell Labs, and having to deal with C, his very first step was to create C with Classes.

Also, had C++ not been invented, or had C become a history footnote, so what? There would have been other programming languages to choose from.

Let's not put programming languages into some kind of worshipping sanctuary.

reply
I don't think C would have become a footnote if not for C++, given UNIX.
reply
Most likely C++ would not have happened, while at the same time C and UNIX adoption would never have gotten big enough to be relevant outside Bell Labs.

Which, then again, isn't that big of a deal; industry would have steered toward other programming languages and operating systems.

Overall that would be a much preferable alternative timeline, assuming security would be taken more seriously. It has taken 45 years since C.A.R. Hoare's Turing Award speech and the Morris worm, and only after companies and governments started to feel the monetary pain of their decisions.

reply
I think there are very good reasons why C and UNIX were successful and are still around as foundational technologies. Nor do I think the C or UNIX legacy is the real problem we have with security. Instead, complexity is the problem.
reply
Starting with being available for free, with source code tapes and a commented source code book.

History would certainly have taken a different path had AT&T been allowed to profit from Bell Labs' work, as their later attempts to regain control of UNIX prove.

Unfortunately that seems to be the majority opinion on WG14, only changed thanks to government and industry pressure.

reply
Being free was important and history could have taken many paths, but this does not explain why it is still important today and has not been replaced despite many alternatives. WG14 consists mostly of industry representatives.
reply
It is important today just like COBOL and Fortran are, with ongoing ISO updates: sunk cost. No one is getting more money out of rewriting their systems just because, unless there are external factors like government regulations.

Then we have the free-beer UNIX clones as well.

Those industry members of WG14 don't seem to have done much in the way of security-related language improvements during the last 50 years.

reply
I think this is far from the truth.
reply
deleted
reply
I suspect this change was motivated by standards conformance.
reply
The wording from the GCC maintainer was "the standard doesn't require it" when they informed the Linux kernel mailing list.

https://lore.kernel.org/linux-toolchains/Z0hRrrNU3Q+ro2T7@tu...

reply
Reminds me of strict aliasing. Same attitude...

https://www.yodaiken.com/2018/06/07/torvalds-on-aliasing/

reply
> I feel like once a language is standardized (or reaches 1.0), that's it. You're done. No more changes. You wanna make improvements? Try out some new ideas? Fine, do that in a new language.

Thank goodness this is not how the software world works overall. I'm not sure you understand the implications of what you ask for.

> if they aren't cheekily mutating over the years

You're complaining about languages mutating, then mention C++, which has added stuff but maintained backwards compatibility over the course of many standards (aside from a few hiccups like auto_ptr, which was also short-lived), with a high aversion to modifying existing stuff.

reply
Perl 6 and Python 3 have entered the chat
reply
It's careless development. Why think things through in advance when you can fix them later? It works so well for Microsoft, Google, and lately Apple. /s

The release cycle of a piece of software says a lot about its quality. "Move fast, break things" has become the new development process.

reply
That does not make sense for anything that exists over decades.

Do you want to still be using Windows NT, the pre-2004 C++ standard, or Python 2.0?

We learn more and need to add to things. Some things we designed 30 years ago were a mistake; should we stick with them?

You can't design everything before release for much software. For games you can, or for bespoke business software, since you can define up front what it does; but then the business changes.

reply