Busy ERs are one of many factors
that could distort a hospital's rankings

Busy ER? Expect to Drop in the Hospital Rankings, Says Study

September 16, 2009
by Brendon Nafziger, DOTmed News Associate Editor
Indicators used to rank hospitals are skewed against ones that treat the sickest, neediest patients, according to an article published in the Journal of Neurosurgery.

Doctors at Loyola University Hospital in Maywood, IL, just outside Chicago, say several factors -- such as a busy emergency room, a trauma center that treats difficult, critical injuries, or a large share of patients on Medicaid -- could make hospitals seem riskier than they really are by significantly influencing one of the main indicators of hospital quality: the mortality index.

"One of the straightforward [hospital] quality indicators is: Did you die or not?" Thomas Origitano, M.D., Ph.D., lead author of the study and a neurosurgeon at Loyola, tells DOTmed News.

A mortality index above 1.0 means the hospital has more deaths than predicted for its patients in that specialty; an index below 1.0 means it has fewer deaths than expected.
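In practice, the index is a ratio of observed deaths to the deaths a risk-adjustment model predicts. A rough, hypothetical illustration (the numbers below are invented, not from the study):

    # Hypothetical figures -- the mortality index is observed deaths / expected deaths
    observed_deaths = 30        # deaths actually recorded in the specialty
    expected_deaths = 25        # deaths the risk-adjustment model predicts
    mortality_index = observed_deaths / expected_deaths
    print(mortality_index)      # 1.2 -- more deaths than the model expected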

Dr. Origitano says he noticed that some hospitals with a low mortality index would often transfer out patients suffering dire illnesses or injuries they were unequipped to treat. Because these patients were more likely to die, he suspected the practice could artificially deflate the transferring hospital's mortality score.
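A hypothetical sketch of that effect (again, invented numbers): if the risk model under-predicts how likely the sickest patients are to die, a hospital that sends them elsewhere sheds more observed deaths than expected ones, and its index drops:

    # Invented numbers: the model expects 19 deaths among routine patients and credits
    # 5 critical patients with 1 expected death, but 3 of those 5 actually die
    index_if_kept        = (18 + 3) / (19 + 1.0)   # 21 / 20 = 1.05, worse than predicted
    index_if_transferred = 18 / 19.0               # ~0.95, "safer" after sending them away
    print(index_if_kept, index_if_transferred)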

The study was conducted in two parts. In the first leg, Dr. Origitano and his colleagues looked at the neurosurgery mortality rates of his own hospital, Loyola. Among the 3,650 operations performed over a three-year period, patients were six times more likely to die if they had been transferred in from another hospital that couldn't treat them or if they arrived in the emergency room needing immediate care.

In the second leg of the study, he and the other researchers combed through the University Healthsystem Consortium's database, drawing on records from 103 medical centers over a two-year period.

What Dr. Origitano found is that "if you have a hospital with a very busy emergency room, that has a trauma center, and has a high Medicaid population, you're going to have a higher mortality index," he says.

Half of all hospitals with the worst mortality scores were designated Level 1 trauma centers, while only a quarter of hospitals with the best death ratings were, Dr. Origitano says. Level 1 status, a certification granted in most states by the American College of Surgeons, indicates that a hospital provides specialized treatment for tricky cases like complex head injuries.

Equally telling, half of all hospitals with the highest death rates had more than one in ten patients on Medicaid, whereas only a third of those hospitals with the lowest rates did.

This is important, because poverty is linked with worse health outcomes. "Patients who don't have the same type of access to health care...are sicker," Dr. Origitano says.

And although the finding wasn't conclusive, the study also suggested that hospitals performing more elective surgeries had lower mortality rates -- something Dr. Origitano believes is easy to predict, since patients undergoing optional, non-urgent procedures are bound to be healthier than those who need a mandatory or emergency operation.

Dr. Origitano also takes issue with what he considers to be overly subjective markers of quality used by many ranking guides, such as reputation. "If you ask my mother," he says, "I have a great reputation. But how does it get carried forward?"

Avery Comarow, editor of U.S. News and World Report Best Hospitals, calls Dr. Origitano's study "provocative" but defends his publication's methods, saying he and his editors have already put into practice some of the things the study recommends, such as excluding transfer patients and those on Medicaid from the scores used in the rankings, and taking into account whether a hospital is a certified stroke center.

"We're well aware of the problems with severity adjustment," he says, adding that though there is no "gold standard" for evaluating hospital quality; his publication gets "closer than many," and uses the tools relied on by the government and academia. He also adds that the reputation score given to hospitals, while admittedly somewhat subjective, is useful to consumers, and is computed from carefully weighted surveys sent only to board-certified doctors about departments in their area of expertise, so it amounts to a kind of "peer review" of hospitals.

Dr. Origitano still believes more work should be done. "We don't really know all the parameters of how to judge [hospital] quality," he says. "We need to get some parameters down that are durable and really reflect the hospital's ability to manage different populations of people."

Source: Journal of Neurosurgery, July 31, 2009