Idea One: Eliminate the Denominator
Most “Quality Dashboards” contain data on rates of hospital-acquired infections, adverse drug events, falls, and other harm events, e.g., “central line infections per 1,000 line hours” or “falls per 1,000 bed days.” Typically, these rates are shown alongside some sort of benchmark rate for that indicator, usually established by analyzing the rates for comparable hospitals and then displayed as the 50th, 75th, or 90th percentile. It’s not uncommon for the dashboard to display any rate better than the 50th or 75th percentile as “Green.” Expressing data as rates, with benchmarks, allows the quality staff and executive team to answer a question commonly asked by Boards: “How are we doing compared to other hospitals like ours?” Knowing how you’re doing compared to other hospitals isn’t a bad thing.
But some innovative hospitals have started to ask a different set of questions, and to use a different sort of performance indicator to answer them. Instead of asking “How are we doing compared to the competition?” they’re asking “How are we doing compared to the theoretical ideal?” (The theoretical ideal is often either 100% or zero.) And to track the answer to that question, they’re eliminating the denominator. (For example, they simply track “total number of central line infections each month” and “total number of falls each month.”)
There are five reasons why eliminating the denominator is a good idea.

1. Neither your basic patient population nor your types of service change that dramatically from month to month (with some notable exceptions for seasonal conditions such as allergies, and for institutions with large seasonal influxes of “snowbirds”). So a raw count of the number of people who fall in your hospital, or get infected, or have adverse drug events, is a fairly accurate indicator of the burden of harm over time.

2. Any time we make a measurement more complex (e.g., by making it a ratio between two measurements), we add measurement error. How accurately are we measuring things like “ventilator days”?

3. If a measurement is not adding value (many denominators fall into this category), it is simply adding measurement waste. Somebody has to keep track of “line hours.” Is this value-added activity, or not?

4. In order to get benchmarks, deciles, and other indicators of comparative performance, we usually send our denominator-based measurements off to some national or regional data compiler (e.g., Premier, VHA, State Hospital Association…) so that they can send us back our percentile ranking and position. This inevitably introduces delay. How old are the data you show your Board? Six months? Nine months? This isn’t a timely way to oversee and steer improvement.

5. Finally, and most important, many of these denominator-based measurements lull hospital leaders into complacency, in two ways. First, the ratios make the data fairly abstract, e.g., “4.9 infections per 1,000 line hours.” Compare this to what that abstraction really means: “14 people doubled their risk of dying in our care last month, because of a line infection that we gave them.” (A quick back-of-the-envelope conversion follows this list.) If we want our Board members to understand our data, and to oversee its improvement with urgency, they need to understand it viscerally. Eliminating the denominators helps. The second way in which denominators cause complacency is when leaders look at their dashboards and say, “Hey, we must be pretty good. All our indicators are Green.” To which I say, “And what, exactly, does it mean to be Green?” Being better than the 50th percentile for hospital-acquired infections, in a health care system where 200,000 people incur serious harm every year from these infections, is not “Green.”
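To make that conversion concrete, here is a quick back-of-the-envelope sketch; the monthly line-hour volume is an assumed, illustrative figure, not data from any particular hospital. At 4.9 infections per 1,000 line hours, a hospital logging roughly 2,900 central-line hours in a month would expect about 4.9 × (2,900 ÷ 1,000) ≈ 14 infections. The rate and the count describe the same reality, but the count names fourteen actual patients rather than an abstract ratio.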
So what do I recommend? Try eliminating the denominator for many of your performance indicators. Track the number of patients who are harmed, or who receive the care they should receive, every month, against the theoretical ideal: either 100% or zero. Your data will be more accurate, more timely, and more viscerally meaningful. And that will give you a jumpstart on improvement.
Note: From time to time, you might still have to answer
the question “But how are we doing compared to others?” For
this you will need denominators. But if you’ve been
working with the theoretical ideal in mind, you just might
find something interesting when you check your performance
against the competition: you’ve blown right past the
benchmark!