Steinmetz contributed heavily to AC systems theory, which helped engineers understand and expand transmission, while Scott contributed a lot to transformer theory and design (I have to find his transformer book).
In addition to the limits of human planning and intellect, I'd also add incentives:
as cynical as it sounds, you won't get rewarded for building a safer, more robust and reliable machine or system until it's agreed that the risks or problems you address actually occur, and that the cost of prevention actually pays off.
For example, there would be no insurance without laws and governments, because no person or company would ever pay into a promise that has never been kept.
It's not even limited to modern technology. If you talk to certain grievance-driven individuals from tribal backgrounds (for lack of a better term) who have produced nothing for the last 10,000 years, they will levy similar accusations against the very institutions that are providing them with healthcare their ancestors could only have dreamed of. In some areas, even agriculture is seen as suspect. It's ridiculous.
It's scary to me how both sides of the American political aisle have suddenly turned anti-tech and are buying into the same arguments. Gross.