


Ignaz Semmelweis (1818-1865) is hardly a household name, but he should be.
In 1846, the young Hungarian physician was appointed to an obstetrics hospital in Vienna, where puerperal fever, known today as postpartum infection, was rampant among mothers who had delivered babies. In one clinic, Semmelweis noticed that the death rate among these mothers was around 20%, while in another it was about one third of that number. The only difference between the two clinics was that the first was staffed by medical students, who often came straight from the dissection table, while the second clinic was a training ground for midwives.
Deducing that the medical students were somehow carrying infection to the mothers, Semmelweis hit on the idea of having them wash their hands in a solution of chlorinated lime, and the death rate quickly plummeted. Though he spent years afterward promoting this practice, and several hospitals that adopted it saw stunning results, the medical authorities in general scoffed at his findings and rejected his teachings. Authority and tradition crushed both his discovery and Semmelweis himself, who died after a short stay in an insane asylum.
From his tragic history comes our modern term, the Semmelweis Reflex: the tendency to reject evidence demonstrating that something we believe is wrong. Whether unconsciously or deliberately, when we turn our backs on facts and data proving that we are in error, we have surrendered to this bias.
This is not an uncommon occurrence. In the 1920s, for example, General Billy Mitchell offered proof that aircraft had rendered battleships obsolete in naval warfare. His criticism of the upper echelons of command for ignoring this evidence led to a court-martial conviction for insubordination, and he resigned from the service. By the end of World War II, however, the battleship had indeed become obsolete, and aircraft had become the key to victory in naval engagements.
We’ve seen similar examples of the Semmelweis Reflex in our own times. During the Covid pandemic, for example, some scientists and physicians who had devised effective ways to fight the virus were silenced by the establishment and banned from social media. Authority trumped valid innovation.
In our own lives, this bias can also come into play. We become accustomed to a certain way of thinking or doing things, and we are loath to change. A supervisor assigned to make a shipping department more efficient meets resistance from long-time employees. Two friends who are polar opposites in politics discuss some hot topic of the day, and though one of them brings a wealth of facts and data to support his point, the other refuses to concede. A financially strapped couple sits down to discuss their summer budget, but the husband adamantly resists giving up their week renting a house with friends at the Outer Banks. “We’ve always done it this way,” he says. “What would they think of us?”
Avoiding this rut requires imitating the procedure Ignaz Semmelweis employed. When we notice that some long-accepted way of thinking or acting may be wrongheaded, we should examine the facts and data, and then experiment to discern whether the new way does indeed work better. If the experiment succeeds, we make the change. If it fails, we return to our time-tested path.
The writer W. Somerset Maugham once noted, “Tradition is a guide and not a jailer.” With tradition as our guide, but with common sense and logic as our companions, we can avoid the Semmelweis Reflex.