My oldest son is prone to getting really (REALLY) into something, for a brief period of time, and then moving quickly on. As parents, we try to keep up, encouraging his inquiry and trying not to reel too much when he abandons one topic for another.
For a while this winter, it was cholera.
As in, specifically, the cholera outbreak in Victorian London, and its contributions to the study of epidemiology and the development of modern sanitation.
He made a ‘ghost map’ showing how a cold outbreak could travel through his school, modeled after the map that Dr. John Snow made to finally prove that cholera resulted from contaminated water and not from bad smells.
And I, to make sure I could understand what he was talking about (and because most of his interests are, actually, quite interesting), read The Ghost Map myself.
And one part that stuck with me was how absolutely certain the best minds of the day were that the deadly diseases they confronted must come from the smells of the sewers and of the decay with which they were surrounded. It made so much sense. London smelled really bad, according to almost all contemporary sources, and people were frequently ill, so it seemed obvious that the two must be related. They kept on believing this even when houses with worse sanitation suffered lower death rates than richer houses that happened to be downstream. They believed it because it seemed so right, even when the data suggested it wasn’t. At all. They believed it even when believing meant studiously ignoring countervailing facts, and even when believing one way led to behaviors significantly more likely to result in their own deaths. They took drastic action based on these flawed beliefs, never apologizing for them or even seeming to doubt convictions that rested on no sound science at all.
The author asks, and we must ask ourselves, “How could so many intelligent people be so grievously wrong for such an extended period of time? How could they ignore so much overwhelming evidence that contradicted their most basic theories? These questions, too, deserve their own discipline–the sociology of error” (15).
Because, of course, this wasn’t the first time in history when powerful beliefs that defy truth have led to grave errors. During one outbreak of plague, a belief that the disease was spread by dogs and cats led to mass extermination which, of course, increased the plague, since it was actually spread by rats formerly kept in check by the dogs and cats (120).
And it wasn’t the last.
We have, with a greater or lesser degree of consensus, believed that interning Japanese-Americans would keep us safer; that cigarettes have no ill health effects; that people with mental illnesses belong in institutions; that nuclear power is infallibly safe…
We console ourselves that that was then, before we knew, because we don’t want to contemplate the very finite limits of the knowledge we have today.
And that’s our blind spot: the possibility that we could be just as wrong now, about something else, as the people we judge in hindsight. We could be ignoring just as many warning signs, about what’s wrong with our economic structure, or what it will take to really make schools work, or what supports young families need to thrive.
We could be just as wrong. And the consequences could be just as tragic.
If we don’t keep asking, why? And wondering, maybe?