
Hand Washing Deniers

We do monthly audits of how compliant our hospital personnel are with hand hygiene. Last month, our hospital hit 97%; the month before, 98%. The policy is that everyone (doctors, nurses, physical therapists, housekeeping, etc.) has to sanitize their hands when entering a patient room and again when exiting a patient room. No exceptions. Our audits are done by incognito auditors who walk around the hospital watching to see if anyone goes in or out of a room without sanitizing their hands.

In addition to our own internal compliance audits, the Ohio Hospital Association sends "secret shoppers," who are nursing students, out to Ohio hospitals to do additional audits of hand hygiene compliance. This year, hospitals in Ohio are at 84% compliance, which doesn't sound all that great until you compare it to the national average of 68%.

Until about 15 years ago, hand hygiene meant using soap and water. This was a problem for people like me – in a 19-bed ICU, I would wash my hands more than 100 times a day. Consequently, especially in the winter, my hands were constantly chapped, cracked, and bleeding. Not only was this a deterrent to regular hand washing, but it was disturbing to patients to be examined by a physician with crusty, bleeding hands. Now, we use alcohol hand sanitizer mounted on the wall outside every patient room, which is far less damaging to the skin and helps promote compliance.

Hand washing in medicine seems like such a no-brainer. But it wasn’t always that way.

The history of hand washing dates to 1847 in Austria. At that time, Louis Pasteur was still working on his thesis in chemistry and had not yet developed the germ theory of disease. A Hungarian physician named Ignaz Philipp Semmelweis was working in the maternity department of the Vienna Lying-in Hospital. Semmelweis observed that the number of cases of peripartum fever and the mortality rate were higher in one hospital ward than in another. When he looked closer, he determined that the key difference was that the ward with the high death rate was staffed by medical students, whereas the ward with the lower death rate was staffed by midwife students. It turned out that the medical students were coming directly from lessons in the autopsy room to the delivery room, whereas the midwife students did not attend autopsy lessons. That same year, his close friend, Jakob Kolletschka, died after being accidentally poked by a medical student's scalpel while performing an autopsy. Kolletschka's autopsy showed the same findings as the women who were dying of post-partum fever in the maternity ward.

Semmelweis then found that the number of cases of fever could be reduced if medical students washed their hands before contact with pregnant women. He proposed that some type of "cadaveric material" brought from the autopsy room was causing the fevers and deaths. When he lectured about his discovery, he met with considerable hostility from his peers, so much so that he was ostracized by the Viennese medical community and his ability to practice obstetrics was severely restricted. He spent the next 14 years developing his theory about hand washing and ultimately wrote a book in 1861. Unfortunately, his book received very poor reviews from a medical community that was strongly opposed to his theory, and he suffered a nervous breakdown that led to his being committed to an insane asylum, where he soon died after being beaten by attendants.

We've really come a long way, and now no one is going to commit you to an insane asylum for washing your hands before taking care of patients. But the story of Dr. Semmelweis does illustrate just how hard it can be to change practitioners' beliefs about measures to improve quality of care.

Deniers exist in every corner of medicine and science. In 1492, people were convinced that Christopher Columbus was going to sail off the end of the world, because, of course, the world was flat and only an imbecile would think that it was round. In the 16th century, Copernicus's heliocentric theory was derided as "absurd" and the Pope banned publication of his books. In the 17th century, when Galileo championed heliocentrism, he was placed under house arrest. In 1925, substitute teacher John Scopes made the mistake of teaching human evolution in a public school and was famously found guilty and fined. In my own lifetime, in the town of Lancaster, just south of Columbus, all of the children get cavities; that is because the town's leaders were convinced that fluoridation of the water did not protect against dental caries and, moreover, that it would cause cancer – so in 1969 they banned fluoridation of city water, and in 2004 they voted to continue the ban.

In my first month of medical school, a professor told me that 50% of everything I was about to learn was false. In hindsight, most of what I learned still holds true (the aortic valve still has 3 leaflets and there are still 5 toes on people's feet), but a lot of the dogma of 1980 turned out to be totally wrong: to prevent SIDS, babies should sleep on their stomachs; beta blockers are contraindicated after a myocardial infarction; amyl nitrite causes AIDS; etc.

We now look back on the hand-washing deniers of 1847, who emphatically stated that Semmelweis's recommendations were ludicrous, as ignorant deniers of what seems to us to be obvious. But it does make me wonder: how many of the things that I currently think are ludicrous will turn out to be correct after all? When you are a human, you have to work hard to keep from being a denier; it seems to be in our nature.

December 27, 2016

By James Allen, MD

I am a Professor Emeritus of Internal Medicine at the Ohio State University and former Medical Director of Ohio State University East Hospital