This is an absolute classic of engineering literature. The last sentence, perhaps deservedly, gets most of the glory, but the whole piece should be at every engineer's and every manager's fingertips.
I constantly see the dynamic observed in the first paragraph, and it would seem that the question "What is the cause of management's fantastic faith in the machinery?" is eternal.
I think management's big problem is that they often confuse what they have with what they need. Of course, I've spent a lot of my career working for venture-capital-backed startups, where this problem might naturally be more prevalent.
I would phrase it slightly differently: the problem is that when human beings know that they need something, and also know that they will not be allowed to get it, they are remarkably successful at convincing themselves that they don't really need it after all.
Take the infamous O-rings from the Shuttle's solid rocket boosters, for instance. As Feynman notes in this appendix, the appearance of erosion on the O-rings was an indication that their design was fundamentally flawed. So why did NASA's engineers twist themselves into pretzels to argue that everything was fine? Because they knew there was no chance those boosters were going to be redesigned. The political will for that, in NASA and in Washington, just was not there. Even if the engineers had thrown down their tools and refused to launch any more Shuttles on the grounds that they were unsafe, the Powers That Be would simply have come down on them like a ton of bricks and forced them either to shut up and get back to work, or to get out of NASA and wreck their careers.
When people are forced to choose between bad options, the easiest thing to do is usually nothing. So that's what NASA's engineers did. And since "I know this thing I work on is likely to kill people, and I'm not going to do anything about it" is the kind of self-knowledge that leads to cognitive dissonance (http://en.wikipedia.org/wiki/Cognitive_dissonance), they put together ad-hoc rationalizations to help them live with it. Rationalizations like the misuse of the term "safety factor" that Feynman flagged: if you can twist that term until it fits the facts in front of you, you can convince yourself that the Shuttle isn't really unsafe. Which takes care of the cognitive dissonance.
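To make the "safety factor" point concrete (the numbers here are roughly as I remember them from the appendix, so treat this as a sketch rather than a quotation): on one flight the ring had eroded about a third of the way through, and a cutting experiment had shown the ring only failed when cut through a full radius, so the argument went something like

  claimed safety factor = erosion depth at failure / observed erosion depth = r / (r/3) = 3

But the joint was designed for zero erosion, so any erosion at all meant the design was already failing; the observed depth was luck, not margin. Feynman's bridge analogy makes the same point: a bridge designed to carry three times its load that cracks a third of the way through a beam under the normal load doesn't have a safety factor of three, it has no safety factor at all.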
You can see this cycle all the time in software development, too. How many systems have you worked on that weren't really secure, where you knew management didn't have the stomach to accept the tradeoffs -- the extra cost, the reduced convenience -- that it would take to make them secure? What happens in those cases? In the ones I've seen, the people involved just convince themselves that the system isn't insecure after all. "We've never been hacked; if the system were insecure, we would have been hacked; therefore, the system is not insecure." It's not great logic, but when you desperately want to believe something, it doesn't have to be great logic to convince you. It just has to tell you what you already want to hear.
The engineers knew why there was blow-by on the O-ring -- joint rotation. The obvious solution was to redesign the field joint to resist this rotation, so that the O-ring would never be exposed to hot gases. All they needed was time and money.
Then Challenger happened. Now, no Shuttle could ever launch again until the joint was redesigned and proven to work. Time and money were no object.
Management's main problem is that they believe the problem is one of people, so they design counter-productive incentive systems (raises, firings, carrot-and-stick motivation) to try to get people to do better work, rather than, fundamentally, designing systems for quality.
This may apply less to NASA than to the business world, but the result is the same: failures, defects, and poor quality are primarily the result of incomplete systematic control over the end product.
So in that sense, I saw this more as an astute diatribe on the management issues of 20th-century America than as an engineering piece. The engineering here is well understood; any engineer can explain it thoroughly, as Feynman has done.
The part that flies in the face of reason is how we manage people.