How do we assess data? What are we biased to see? How many times do we miss the obvious?
From Jack Uldrich’s blog:
“Consider the case of Abraham Wald. During World War Two, he and a team of researchers were charged with protecting Allied bombers from German guns. As part of their work the researchers diligently recorded where on the body of the plane each returning bomber was struck by gunfire. The most common areas were the wings and the tails.
“In response, the researchers advised the military command to reinforce those bullet-struck areas. Everyone, that is, except Wald, who suggested that those areas of the plane not struck by gunfire – largely the fuselage – be reinforced. His recommendation was initially met with incredulity by his peers and superiors.
“Eventually, Wald convinced them of the wisdom of his logic. The mistake his peers made was that they were only observing those planes which safely returned. What they were not seeing were those planes that didn’t return. Wald reasoned correctly that if a plane could safely return with bullet-ridden wings and tailfins then those areas didn’t need reinforcement and, counter-intuitively, the parts of the plane without bullet holes were the areas requiring additional armor.”
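Wald’s insight is, at heart, a correction for selection bias: the data were conditioned on survival. A minimal simulation sketches the effect (the sections and hit-lethality probabilities here are hypothetical, chosen only to illustrate the mechanism, not taken from Wald’s actual analysis):

```python
import random

random.seed(42)

# Hypothetical model: hits land uniformly across the plane, but a
# fuselage hit is far more likely to down the plane than a hit on the
# wings or tail.
SECTIONS = ["wings", "tail", "fuselage"]
DOWN_PROBABILITY = {"wings": 0.1, "tail": 0.1, "fuselage": 0.8}

def fly_mission():
    """Return (hit_section, survived) for one plane taking one hit."""
    section = random.choice(SECTIONS)
    survived = random.random() > DOWN_PROBABILITY[section]
    return section, survived

missions = [fly_mission() for _ in range(100_000)]

all_hits = {s: 0 for s in SECTIONS}
survivor_hits = {s: 0 for s in SECTIONS}
for section, survived in missions:
    all_hits[section] += 1
    if survived:
        survivor_hits[section] += 1

# Across all planes the hits are spread evenly, but among the planes
# that return, fuselage hits look rare -- exactly the pattern Wald's
# peers observed on the tarmac.
print("hits on all planes:      ", all_hits)
print("hits on returning planes:", survivor_hits)
```

Counting only the returning planes makes the most dangerous section look like the safest one, which is why Wald argued for armoring the places the survivors were *not* hit.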