© Copyright IBM Corporation 2018. All rights reserved. IBM, the IBM logo, ibm.com,
Watson Health, and 100 Top Hospitals are trademarks of International Business
Machines Corp., registered in many jurisdictions worldwide. Other product and service
names might be trademarks of IBM or other companies.
The information contained in this publication is intended to serve as a guide for general
comparisons and evaluations, but not as the sole basis upon which any specific
conduct is to be recommended or undertaken.
The reader bears sole risk and responsibility for any analysis, interpretation, or
conclusion based on the information contained in this publication, and IBM shall not
be responsible for any errors, misstatements, inaccuracies, or omissions contained
herein. No part of this publication may be reproduced or transmitted in any form or
by any means, electronic or mechanical, including photocopying, recording, or by
any information storage and retrieval system, without permission in writing from
IBM Watson Health.
ISBN: 978-1-57372-475-3
Contents
Introduction
2018 15 Top Health Systems award winners
Findings
Methodology
Appendix A: Health system winners and their member hospitals
Appendix B: The top quintile: Highest-performing health systems
Appendix C: Methodology details
Appendix D: All health systems in study

Introduction

Welcome to the 10th edition of the Watson Health 15 Top Health Systems study.

This 2018 study from IBM Watson Health™ marks another milestone in the 100 Top Hospitals® program’s rich history: a full decade of publishing an annual quantitative study designed to shine a light on the nation’s highest-performing health systems.

Our research of US health system performance began with the same goal that has driven each study since the beginning of the 100 Top Hospitals program: to identify top performers, and also deliver insights that may help healthcare systems better focus their improvement initiatives on achieving consistent, balanced, and sustainable high performance.

Truven Health Analytics® was acquired by IBM in 2016 to help form a new business, Watson Health.

Health systems do not apply for our 15 Top Health Systems selection process, and winners do not pay to market their honor.
The Watson Health 15 Top Health Systems scorecard results are divided into two separate sections that graphically illustrate:
––A health system’s performance and improvement versus peer health systems
––Cross-system performance alignment of member hospitals

By finding ways to take balanced performance to the next level, the winners of our 15 Top Health Systems award are identifying opportunities to deliver healthcare value to patients, communities, and payers. The performance levels achieved by these systems may motivate their peers to use data, analytics, and benchmarks to close their performance gaps.

Delivering a robust and transparent assessment
We have designed this study to provide a view
of health system performance across multiple
dimensions: how they stand compared to peers
and high performers (whole-system performance),
where they stand in the evolution of their own
cultures of performance improvement (relative
long-term improvement and rate of improvement),
and the achievement of cross-system performance
alignment (member hospital performance).
* In this study, measures of financial health are reported for information-only purposes because public, audited financial statements are not available for all US health systems.
* 30-day mortality was slightly higher for winning health systems in this year’s study.
Understanding the similarities and differences between high and low performers can help provide benchmarks for the industry. The findings we assemble for this study provide examples of excellence, as evidenced in several additional published studies2-24.

The analysis on the previous page is based on applying the difference between study winners and nonwinners to Medicare patient counts. If the same standards were applied to all inpatients, the impact would be even greater.

For more details about this study’s findings and the achievements of the 15 Top Health Systems, please see the Findings section of this document.
2018 15 Top Health Systems award winners

The Watson Health 100 Top Hospitals® program is pleased to present the 2018 Watson Health 15 Top Health Systems.

Note that the order of health systems in the following tables does not reflect performance rating. Systems are ordered alphabetically.

For full details on these peer groups and the process we used to select the winning benchmark health systems*, see the Methodology section of this document.
* To see a full list of Winners Through the Years, visit truvenhealth.com/Products/100-Top-Hospitals/Program-Info/15-Top-Health-Systems/Winners-Through-the-Years.
Findings

The Watson Health 15 Top Health Systems study profiles the top-performing health systems* in the country. According to publicly available data and our transparent methodologies, these industry leaders appear to be successfully addressing the challenge of deploying innovative clinical and operational approaches to multiple hospital sites to achieve consistent top performance.

For 10 years, the 15 Top Health Systems study has followed the results achieved by leading health systems and published numerous examples of the benchmark systems’ clinical and operational excellence. The study is more than a list of accomplishments; it is a tool US health system leaders can use to help guide their own performance improvement initiatives. By highlighting what the highest-performing leaders around the country are doing well, we create aspirational benchmarks for the rest of the industry.

How the winning systems compared to their peers

In this section, we show how the 15 Top Health Systems performed within their comparison groups (large, medium, and small systems), compared to nonwinning peers. In addition, we identify some key findings among comparison groups. For performance measure details and definitions of each comparison group, see the Methodology section of this document.

Note: In Tables 1 through 4, data for the 15 Top Health Systems award winners is labeled “Benchmark,” and data for all health systems, excluding award winners, is labeled “Peer group.” In columns labeled “Benchmark compared with peer group,” we calculated the actual and percentage difference between the benchmark hospital scores and the peer group scores.

15 Top Health Systems had better survival rates**
––The winners had 14.6% fewer in-hospital deaths than their nonwinning peers, considering patient severity (Table 1)
––Mortality results for medium health systems showed the greatest difference between winners and nonwinners, with 15.9% fewer deaths among benchmark health systems (Tables 2-4)

15 Top Health Systems had fewer patient complications**
––Patients treated at the winning systems’ member hospitals had significantly fewer complications, with rates 17.3% lower than at nonwinning system hospitals, considering patient severity (Table 1)
––Large health systems had the greatest difference between winners and nonwinners, with 20.9% fewer complications (Tables 2-4)
* To be defined as a health system in this study, an organization must have at least two short-term, general, acute care hospitals with separate Medicare provider identification
numbers. Systems with multiple hospital facilities reporting under one provider ID are profiled as a single hospital in the Watson Health 100 Top Hospitals® study.
** Mortality and complications index values cannot be compared among the three different comparison groups because they are normalized by comparison group.
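The “Benchmark compared with peer group” columns described above are simple arithmetic on the two medians. A minimal sketch of that calculation (the function name and input values are illustrative, not the study’s actual medians):

```python
def compare(benchmark, peer):
    """Actual and percent difference between a benchmark median and a peer-group median."""
    diff = benchmark - peer
    pct = diff / peer * 100  # percent difference is taken relative to the peer group
    return diff, pct

# Illustrative medians only
diff, pct = compare(0.8, 1.0)
print(round(diff, 2), round(pct, 1))  # -0.2 -20.0
```

As the study’s table notes caution, published medians are rounded for reporting, so differences recomputed from the printed values may appear slightly off.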
15 Top Health Systems had fewer healthcare-associated infections*

A new ranked measure in the 2018 study, healthcare-associated infections (HAIs)**, captures information about the quality of inpatient care. Based on nationwide data availability, we built a composite measure of HAI performance at the system level, considering up to six HAIs, depending on assigned comparison group. The six reported HAIs are: methicillin-resistant Staphylococcus aureus (MRSA-bloodstream), central line-associated bloodstream infections, catheter-associated urinary tract infections, Clostridium difficile (C. diff), surgical site infections (SSIs) following colon surgery, and SSIs following an abdominal hysterectomy.

––Among all types of systems, winners had a more favorable composite median HAI index value than nonwinner peers, at 0.77 versus 0.92, respectively; this reflects 16.2% fewer infections occurring at the 15 Top Health Systems compared to other peer systems (Table 1)
––Small health system winners and nonwinners showed the most dramatic difference on HAI performance: winners had a median HAI composite index value of 0.64, which was 26.5% lower than the median HAI index score at nonwinning systems (0.87) (Tables 2-4)

15 Top Health Systems had mixed results on longer-term outcomes

Several patient groups are included in the 30-day mortality and readmission extended care composite metrics. The mean 30-day mortality rate includes heart attack (AMI), heart failure (HF), pneumonia, chronic obstructive pulmonary disease (COPD), and stroke patient groups. The mean 30-day readmission rate includes AMI, HF, pneumonia, total hip arthroplasty and/or total knee arthroplasty (THA/TKA), COPD, and stroke patient groups.

30-day mortality results
––In this year’s study, the winning systems had a higher mean 30-day mortality rate than the nonwinning systems, due to slightly higher rates among large and small systems (Tables 1-4)
––Small health systems displayed the largest gap between winners and nonwinning peers on 30-day mortality (13.4% versus 12.9%), while medium health system winners had performance the same as nonwinning peers, at 12.7%, which was also the lowest median 30-day mortality value observed in this year’s study (Tables 1, 3, and 4)

30-day readmission results
––Winning health systems had lower 30-day readmission rates than their nonwinning peers nationally (0.6 percentage points lower) and for all comparison groups (Table 1)
––Small winning systems had the best mean 30-day readmission rate (14.4%) among all comparison groups and outperformed their nonwinning peers by the greatest margin, 0.9 percentage points (Tables 2-4)
* HAI index values cannot be compared among the three different comparison groups because they are normalized by comparison group.
** As developed by a unit of the Centers for Disease Control and Prevention, the National Healthcare Safety Network, and reported by the Centers for Medicare & Medicaid Services
(CMS) in the public Hospital Compare data set.
* ALOS cannot be compared among the three different comparison groups because values are normalized by comparison group.
Table 1. National health system performance comparisons (all systems)

Performance measure | Winning benchmark health systems (median) | Nonwinning peer group of US health systems (median) | Difference | Percent difference | Comments

1. Mortality, complications, and ALOS based on Present on Admission (POA)-enabled risk models applied to MEDPAR 2015 and 2016 data (ALOS 2016 only).
2. HAI, ED measure, MSPB, and HCAHPS data from CMS Hospital Compare Jan. 1, 2016 - Dec. 31, 2016 data set.
3. 30-day rates from CMS Hospital Compare July 1, 2013 - June 30, 2016 data set.
4. We do not calculate percent difference for this measure because it is already a percent value.
Note: Measure values are rounded for reporting, which may cause calculated differences to appear off.
Appendix A.
Note: Mortality, complications, HAI and ALOS measures cannot be compared across comparison groups because they are normalized by comparison group.
Top and bottom quintile results

We divided all the health systems in this study into performance quintiles, by comparison group, based on their performance on the study’s measures. In Table 6, we have highlighted differences between the highest- and lowest-performing quintiles by providing their median scores on the study performance measures. (See Appendix B for a list of the health systems included in the top-performance quintile and Appendix D for all systems included in the study.)

The top quintile systems outperformed their lowest quintile peers in the following ways:
––They had much better patient outcomes: 19.7% lower mortality and 5.7% lower complications
––They had fewer occurrences of HAIs in their facilities: a 25.8% smaller median HAI index value (0.8) in the top performance quintile
––They had somewhat lower mean 30-day mortality rates (0.4 percentage points lower; includes AMI, HF, pneumonia, COPD, and stroke patients)
––They had lower mean 30-day readmission rates (14.8% versus 15.5%; includes AMI, HF, pneumonia, THA/TKA, COPD, and stroke patients)
––They had dramatically lower mean ED wait times, with an average difference of 62 minutes per ED patient (24.1% less than the bottom quintile)
––They were more efficient, releasing patients almost one day (0.9) sooner than the lowest performers and at a 5.3% lower MSPB index
––They scored 11.3 points higher on the HCAHPS overall patient rating of care
Table 6. Comparison of health systems in the top and bottom quintiles of performance1

Performance measure | Top quintile median | Bottom quintile median | Difference | Percent difference | Top versus bottom quintile
Inpatient mortality index2 | 0.89 | 1.11 | -0.22 | -19.7% | Lower mortality
Complications index2 | 0.99 | 1.05 | -0.06 | -5.7% | Fewer complications
HAI index3 | 0.8 | 1.0 | -0.3 | -25.8% | Fewer infections
30-day mortality rate (%)4 | 12.6 | 12.9 | -0.4 | n/a5 | Lower 30-day mortality
30-day readmission rate (%)4 | 14.8 | 15.5 | -0.8 | n/a5 | Fewer 30-day readmissions
ALOS2 | 4.5 | 5.4 | -0.9 | -16.1% | Shorter stays
ED measure mean minutes3 | 195.0 | 256.9 | -62.0 | -24.1% | Less time to service
MSPB index3 | 0.96 | 1.01 | -0.05 | -5.3% | Lower episode cost
HCAHPS score3 | 269.0 | 257.7 | 11.3 | 4.4% | Better patient experience
1. Top and bottom performance quintiles were determined by comparison group and aggregated to calculate medians.
2. Mortality, complications, and ALOS based on POA-enabled risk models applied to MEDPAR 2015 and 2016 data (ALOS 2016 only).
3. HAI, ED measure, MSPB, and HCAHPS data from CMS Hospital Compare Jan. 1, 2016 - Dec. 31, 2016 data set.
4. 30-day rates from CMS Hospital Compare July 1, 2013 - June 30, 2016 data set.
5. We do not calculate percent difference for this measure because it is already a percent value.
Note: Measure values are rounded for reporting, which may cause calculated differences to appear off.
Test metrics: Reported for information only

Every year, we evaluate the 15 Top Health Systems study and explore whether new measures would enhance the value of the analysis we provide. For this 2018 study, we are testing several new performance measures that update basic standards of inpatient care and expand the balanced scorecard across the continuum of care. These metrics were not used in the ranking and selection of winning health systems.

If you would like to provide feedback on the following proposed measures, email 100tophospitals@us.ibm.com.

Excess days in acute care measures

The newest set of measures available from CMS in the Hospital Compare data set are the excess days in acute care (EDAC) measures for AMI and HF. CMS defines “excess days” as the difference between a hospital’s average days in acute care and expected days, based on an average hospital nationally. Days in acute care include days spent in an ED, a hospital observation unit, or a hospital inpatient unit for 30 days following a hospitalization. The data period in our study for these measures is the same as for the other 30-day metrics for specific patient conditions: three years, combined (July 1, 2013 - June 30, 2016).
Performance measure | Benchmark median | Peer group median | Difference | Percent difference | Comments
30-day, hospital-wide readmission rate1 | 14.8 | 15.4 | -0.63 | n/a3 | Fewer 30-day readmissions
AMI 30-day episode payment1 | $22,798 | $23,204 | -$406 | -1.8% | Lower episode cost
HF 30-day episode payment1 | $16,023 | $16,376 | -$353 | -2.2% | Lower episode cost
Pneumonia 30-day episode payment1 | $17,117 | $17,318 | -$201 | -1.2% | Lower episode cost
AMI 30-day excess days in acute care1 | -5.2 | 7.6 | -12.81 | n/a3 | Fewer excess days
HF 30-day excess days in acute care1 | -8.5 | 8.1 | -16.62 | n/a3 | Fewer excess days
THA/TKA* 90-day episode payment2 | $20,653 | $21,931 | -$1,278 | -5.8% | Lower episode cost
THA/TKA* 90-day complications rate2 | 2.4 | 2.7 | -0.30 | n/a3 | Fewer complications
1. 30-day measures from CMS Hospital Compare July 1, 2013 - June 30, 2016 data set.
2. 90-day measures from CMS Hospital Compare April 1, 2013 - March 31, 2016 data set.
3. We do not calculate percent difference for these measures because it can be a negative number or is already a percent value.
* Primary, elective total hip arthroplasty and total knee arthroplasty.
Financial metrics

We continue to publish the financial measures each year for information only, as audited financial statements are not available for all systems included in the study*. These measures are not included in the ranking and selection of benchmark health systems. Results for included systems are found in Table 8 below.

––Overall, benchmark health system performance is better than nonwinning peers, both on operating margin (1.2 percentage points higher among winners) and long-term debt-to-capitalization ratio (LTD/cap) (0.1 lower ratio among winners)
––Notably, medium health system winners had a much higher operating profit margin than nonwinners (8.7% versus 2.4%) and showed a greater difference on the LTD/cap (0.2 versus 0.4)
Note: Data sourced from audited 2016 financial reports via dacbond.com, emma.msrb.org, yahoo.brand.edgar-online.com, and sec.gov.
* A total of 84.2% of parent and independent systems published audited financial statements for 2016. Subsystems that are members of a larger “parent” health system generally
do not have separate audited financial statements. This translated into 65.1% of all in-study health systems with available financial reports.
The health system scorecard is based on the 100 Top Hospitals® national balanced scorecard methodologies and focuses on four performance domains: inpatient outcomes, extended outcomes, operational efficiency, and patient experience.

The following section is intended to be an overview of these steps. To request more detailed information on any of the study methodologies outlined here, email us at 100tophospitals@us.ibm.com or call 800-525-9083.
* The MEDPAR data years quoted in 100 Top Hospitals research are based on an FFY, a year that begins on October 1 of each calendar year and ends on September 30 of the
following calendar year. FFYs are identified by the year in which they end (for example, FFY 2016 begins October 1, 2015, and ends September 30, 2016). Data for all CMS Hospital
Compare measures is provided in calendar years, except the 30-day rates. CMS publishes the 30-day rates as three-year combined data values. We label these data points based
on the end date of each data set. For example, July 1, 2013 - June 30, 2016, is named “2016.”
We used the CMS Hospital Compare data set published in the third quarter of 2017 for the healthcare-associated infection (HAI) measures, 30-day mortality rates, 30-day readmission rates, Medicare spending per beneficiary (MSPB) index, and Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) patient perception-of-care data26.

We also used the 2016 Medicare cost reports, published in the federal Hospital Cost Report Information System (HCRIS) third-quarter 2017 data set, to create our proprietary database for determining system membership based on “home office” or “related organization” relationships reported by hospitals. The cost reports were also used to aggregate member hospital total operating expense to the system level. This data was used to classify health systems into three comparison groups.

Risk- and severity-adjustment models

The IBM Watson Health™ proprietary risk- and severity-adjustment models for inpatient mortality, complications, and LOS have been recalibrated for this study release using FFY 2015 data available in the all-payer Watson Health Projected Inpatient Database (PIDB). The PIDB is one of the largest US inpatient, all-payer databases of its kind, containing approximately 23 million inpatient discharges annually, obtained from approximately 5,000 hospitals, which comprise more than 65% of the nonfederal US market. Watson Health risk- and severity-adjustment models take advantage of available present-on-admission (POA) coding that is reported in all-payer data. Only patient conditions that are present on admission are used to determine the probability of death, complications, or the expected LOS.

The recalibrated models were used in producing the risk-adjusted inpatient mortality and complications indexes, based on two years of MEDPAR data (2015 and 2016). The severity-adjusted LOS was produced based on MEDPAR 2016 data.

Present-on-admission coding adjustments

From 2010 through 2016, we have observed a significant rise in the number of principal diagnosis and secondary diagnosis codes that do not have a valid POA indicator code in the MEDPAR data files. Since 2011, an invalid code of “0” has been appearing. This phenomenon has led to an artificial rise in the number of complications that appear to be occurring during the hospital stay. See Appendix C for details.

To correct for this bias, we adjust MEDPAR record processing through our inpatient mortality and complications risk models and LOS severity-adjustment model as follows:
1. Original, valid (Y, N, U, W, or 1) POA codes assigned to diagnoses were retained
2. Where a POA code of “0” appeared, we took the next four steps:
   a. We treated all diagnosis codes on the CMS exempt list as “exempt,” regardless of POA coding
   b. We treated all principal diagnoses as “present on admission”
   c. We treated secondary diagnoses where the POA code “Y” or “W” appeared more than 50% of the time in the Watson Health all-payer database as “present on admission”
   d. All others were treated as “not present”
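The POA-correction steps above can be sketched as a small decision function. This is a minimal illustration, not the production implementation; the parameter names (the exempt-list flag and the all-payer “Y”/“W” frequency flag) are assumptions standing in for lookups the study performs against CMS and Watson Health reference data:

```python
VALID_POA = {"Y", "N", "U", "W", "1"}

def adjust_poa(poa_code, is_principal, on_cms_exempt_list, yw_majority_in_all_payer):
    """Return the POA status used by the risk models for one diagnosis code.

    yw_majority_in_all_payer: True when "Y" or "W" was coded for this diagnosis
    more than 50% of the time in the all-payer reference database (assumed lookup).
    """
    if poa_code in VALID_POA:        # step 1: valid codes are retained as-is
        return poa_code
    # step 2: invalid code ("0") falls through to rules 2a-2d
    if on_cms_exempt_list:           # 2a: CMS-exempt diagnoses stay exempt
        return "exempt"
    if is_principal:                 # 2b: principal diagnoses -> present on admission
        return "Y"
    if yw_majority_in_all_payer:     # 2c: usually-POA secondary diagnoses -> present
        return "Y"
    return "N"                       # 2d: all others treated as not present

print(adjust_poa("0", is_principal=True, on_cms_exempt_list=False,
                 yw_majority_in_all_payer=False))  # Y
```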
* For composite measures (HAI, 30-day mortality, 30-day readmissions), the exclusion is applied ONLY if all individual measures comprising the composite are missing.
- For HAI, different numbers of individual measures were required depending on the comparison group (five for large and medium systems; three for small systems). A system not
meeting the minimum was excluded. See Appendix C for details.
- In systems where one or more individual 30-day mortality or 30-day readmission rates were missing, BUT NOT ALL, we calculated a median value for each, by comparison
group, and substituted the median for the missing value.
** This rule was not applied to the HAI composite, which followed different exclusion logic. See Appendix C for details.
Identifying health systems

To be included in the study, a health system must have at least two short-term, general, acute care hospitals with separate Medicare provider identification numbers. The minimum of two hospitals must be met after hospital exclusions have been applied. In addition, we include any cardiac, orthopedic, and women’s hospitals, and CAHs, that passed the hospital exclusion rules cited on the previous page. For the 2018 study, we identified the “parent” system by finding the “home office” or “related organization,” as reported on the hospitals’ 2016 (or 2015) Medicare cost report.

To analyze health system performance, we aggregate data from all of a system’s included hospitals. In the methodology summary tables in this section, we provide specific details about the calculations used for each performance measure and how these measures are aggregated to determine system performance.

After all exclusions were applied and parent systems identified, the final 2018 study group included 338 health systems with the profiles outlined in Table 9.
Note: A hospital can be a member of both a parent system and a subsystem of that parent. They will be included in both parent and subsystem member hospital counts.
The total unduplicated hospital count in this study was 2,422 hospitals.
Table 11. Summary of measure data sources and data periods

Performance measure | Current performance (15 Top Health Systems award selection) | Five-year trend performance
Risk-adjusted inpatient mortality index | MEDPAR federal fiscal year (FFY) 2015 and 2016* | MEDPAR FFY 2011 - 2016*
Risk-adjusted complications index | MEDPAR FFY 2015 and 2016* | MEDPAR FFY 2011 - 2016*
Mean HAI index | CMS Hospital Compare calendar year (CY) 2016 | Trend not available
Mean 30-day mortality rate | CMS Hospital Compare July 1, 2013 - June 30, 2016 | CMS Hospital Compare: three-year data sets ending June 30 in 2013, 2014, 2015, 2016
Mean 30-day readmission rate (AMI, HF, pneumonia, THA/TKA**, COPD, stroke) | CMS Hospital Compare July 1, 2013 - June 30, 2016 | CMS Hospital Compare: three-year data sets ending June 30 in 2013, 2014, 2015, 2016
Severity-adjusted ALOS | MEDPAR FFY 2016 | MEDPAR FFY 2012 - 2016
Mean ED throughput measure | CMS Hospital Compare CY 2016 | CMS Hospital Compare 2012 - 2016
MSPB index | CMS Hospital Compare CY 2016 | CMS Hospital Compare 2012 - 2016
HCAHPS score (overall hospital rating) | CMS Hospital Compare CY 2016 | CMS Hospital Compare 2012 - 2016

* Two years of data are combined for each study year data point.
** Primary, elective total hip arthroplasty and total knee arthroplasty.
Performance measure details
Mean HAI index

Why we include this element: Because there is a public interest in tracking and preventing HAIs, we now use the HAI data reported by CMS to analyze hospital performance and provide national benchmarks in this area.

Calculation: Measure data was obtained from the CMS Hospital Compare data set. Hospitals complete the required surveillance and report HAI occurrences, and the count of patient days or procedures associated with each HAI metric, through the US Centers for Disease Control and Prevention’s National Healthcare Safety Network (NHSN), which in turn reports data to CMS.

Comments: We rank systems on the mean normalized HAI z-score, by comparison group. For reporting, we also calculate a system-level observed-to-expected ratio (not normalized) for each HAI. We then calculate the unweighted mean of the observed-to-expected values for the HAIs included in each comparison group, as the reported HAI measure.

Favorable values are: Lower
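The reported (non-normalized) system HAI measure described above is an unweighted mean of per-HAI observed-to-expected ratios. A minimal sketch with hypothetical counts (the function name and inputs are illustrative):

```python
def hai_composite(observed, expected):
    """Unweighted mean of per-HAI observed/expected ratios for one system.

    observed, expected: parallel lists of system-level counts for the HAIs
    included in the system's comparison group.
    """
    ratios = [o / e for o, e in zip(observed, expected) if e > 0]
    return sum(ratios) / len(ratios)

# Hypothetical counts for three HAIs (e.g., a small-system comparison group)
print(round(hai_composite([2, 1, 3], [4, 2, 3]), 2))  # 0.67
```

A value below 1.0 means fewer infections occurred than expected, which is why lower values are favorable.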
Mean 30-day risk-adjusted mortality rate (AMI, HF, pneumonia, COPD, and stroke patients)

Why we include this element: 30-day mortality rates are a widely accepted measure of the effectiveness of hospital care. They allow us to look beyond immediate inpatient outcomes and understand how the care the health system provided to inpatients with these particular conditions may have contributed to their longer-term survival. In addition, tracking these measures may help health systems identify patients at risk for post-discharge problems and target improvements in discharge planning and after-care processes. Health systems that score well may be better prepared for a pay-for-performance structure.

Calculation: Data is from the CMS Hospital Compare data set. CMS calculates a 30-day mortality rate for each patient condition using three years of MEDPAR data, combined. We aggregate this data to produce a rate for each 30-day measure for each system. This is done by multiplying the hospital-level reported patient count (eligible patients) by the reported hospital rate to determine the number of patients who died within 30 days of admission. We sum the calculated deaths and divide by the sum of eligible patients for member hospitals of each system. This value is multiplied by 100 to produce the system-level 30-day mortality rate for each measure, expressed as a percent. CMS does not calculate rates for hospitals where the number of cases is too small (less than 25). In these cases, we substitute the comparison group-specific median rate for the affected 30-day mortality measure.

Comments: We rank systems by comparison group, based on the mean rate for included 30-day mortality measures (AMI, HF, pneumonia, COPD, and stroke). The CMS Hospital Compare data for 30-day mortality is based on Medicare fee-for-service claims only.

Favorable values are: Lower
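The hospital-to-system aggregation in the calculation above (rate times eligible patients, summed, then rescaled) can be sketched as follows. This is a simplified illustration with hypothetical inputs, not the study’s production code:

```python
def system_30day_rate(hospitals, group_median):
    """System-level 30-day mortality rate (%) for one measure.

    hospitals: list of (eligible_patients, rate_pct) per member hospital;
    rate_pct is None where CMS suppressed the rate (fewer than 25 cases),
    in which case the comparison-group median rate is substituted.
    """
    deaths = 0.0
    eligible = 0
    for n, rate in hospitals:
        r = group_median if rate is None else rate
        deaths += n * r / 100.0   # reported rate x eligible patients = estimated deaths
        eligible += n
    return deaths / eligible * 100.0

# Hypothetical two-hospital system: (eligible patients, reported rate %)
print(round(system_30day_rate([(200, 12.0), (100, None)], group_median=13.0), 2))  # 12.33
```

Note that this is a patient-weighted aggregation: larger member hospitals move the system rate more than smaller ones.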
Mean ED throughput measure

Why we include this element: The hospital ED is an important access point to healthcare for many people. A key factor in evaluating ED performance is process “throughput,” or measures of the timeliness with which patients receive treatment and either are admitted or discharged. Timely ED processes impact both care quality and the quality of the patient experience.

Calculation: Data is from the CMS Hospital Compare data set. CMS reports the median minutes for each ED throughput measure. We include two of the available ED measures in calculating an unweighted system aggregate measure. For each ED measure, we sum the median minutes for system member hospitals and divide by the number of member hospitals to produce the system-level minutes for each measure. We calculate the arithmetic mean of the two included ED measures to produce the ranked composite ED measure.

Comments: We rank systems on the mean ED throughput measure in minutes. The two included measures define important ED processes: median time from ED arrival to ED departure for admitted patients, and median time from ED arrival to ED departure for non-admitted patients.

Favorable values are: Lower
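The two-stage averaging described above (unweighted mean across member hospitals, then arithmetic mean of the two measures) can be sketched as follows, with hypothetical per-hospital medians:

```python
def ed_composite(member_medians_admitted, member_medians_discharged):
    """Composite ED throughput (minutes) for a system from two CMS measures.

    Each argument: per-hospital median minutes from ED arrival to ED departure,
    for admitted and for non-admitted patients respectively.
    """
    def system_mean(values):
        # unweighted mean across member hospitals
        return sum(values) / len(values)

    admitted = system_mean(member_medians_admitted)
    discharged = system_mean(member_medians_discharged)
    return (admitted + discharged) / 2  # arithmetic mean of the two measures

# Hypothetical three-hospital system
print(ed_composite([300, 280, 320], [160, 140, 150]))  # 225.0
```

Because the hospital-level means are unweighted, a small member hospital influences the composite as much as a large one.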
MSPB index

Why we include this element
MSPB helps determine how efficiently a hospital coordinates the care for its patients across continuum-of-care sites. Lower values indicate lower costs relative to national medians and thus greater efficiency.

Calculation
Data is from the CMS Hospital Compare data set. CMS calculates the cost of care for each admitted patient, including Medicare Part A and Part B costs. CMS aggregates costs associated with the index admission from three days preadmission, through the inpatient stay, and 30 days post-admission. This cost is divided by the median national cost. CMS applies both numerator and denominator adjustments. We calculate the system-level measure by weighting each member hospital index by the hospital's MEDPAR discharges for the most current year in the study. We sum the weighted values and divide by the sum of the MEDPAR discharges of all member hospitals. This produces a weighted average MSPB index for each system.

Comments
We rank systems on the weighted average MSPB index. CMS calculates the cost of care for each admitted patient, including both Medicare Part A and Part B costs.

Favorable values are: Lower
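The discharge-weighted system index described above reduces to a weighted mean. The hospital indexes and MEDPAR discharge counts here are illustrative:

```python
# Sketch of the system-level MSPB index: each member hospital's index is
# weighted by its MEDPAR discharges, then normalized by total discharges.
members = [
    {"mspb_index": 0.95, "medpar_discharges": 4000},
    {"mspb_index": 1.10, "medpar_discharges": 1000},
]

def system_mspb(members):
    weighted = sum(m["mspb_index"] * m["medpar_discharges"] for m in members)
    total_discharges = sum(m["medpar_discharges"] for m in members)
    return weighted / total_discharges

print(round(system_mspb(members), 3))  # 0.98
```

Weighting by discharges keeps a small member hospital with an unusual index from dominating the system value.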
Determining the 15 Top Health Systems

Ranking
We rank health systems based on their performance on each of the included measures relative to the other in-study systems, by comparison group. We sum the ranks, giving all measures equal weight, and re-rank overall to arrive at a final rank for the system. The top five health systems with the best final rank in each of the three comparison groups are selected as the winners (15 total winners). The ranked performance measures are listed in Table 12.

Table 12. Ranked performance measures and weights
Ranked measure / Weight in overall ranking
Risk-adjusted inpatient mortality 1
Risk-adjusted complications 1
Mean HAI index 1
Mean 30-day mortality rate 1
Mean 30-day readmission rate 1
Severity-adjusted average LOS 1
Mean ED throughput 1
MSPB index 1
HCAHPS score (overall rating question) 1

Winner exclusions
For mortality and complications, which have observed and expected values, we identify systems with performance that is statistically worse than expected. Systems with performance that is worse than expected are excluded from consideration when selecting the study winners. This is done because we do not want systems that have poor clinical outcomes to be declared winners.

A system is winner-excluded if both of the following conditions apply:
1. The observed value is higher than expected and the difference is statistically significant, with 99% confidence.
2. We calculate the 75th percentile index value for mortality and complications, including data only for systems that meet condition number 1 above. These values are used as the high trim points for those health systems. Systems with mortality or complications index values above the respective trim points are winner-excluded.

If MSPB is missing, the system is winner-excluded.
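The equal-weight rank-sum step described above can be sketched as follows. The systems and measure values are illustrative (both measures here are index values where lower is favorable):

```python
# Sketch of the rank-sum ranking: rank systems on each measure (weight 1),
# sum the per-measure ranks, and re-rank on the sum to get the final rank.
systems = {
    "A": {"mortality_idx": 0.90, "mspb_idx": 0.95},
    "B": {"mortality_idx": 1.05, "mspb_idx": 1.02},
    "C": {"mortality_idx": 0.98, "mspb_idx": 0.99},
}

def final_ranks(systems, measures):
    rank_sums = {name: 0 for name in systems}
    for m in measures:
        # Lower values are favorable for both illustrative measures
        ordered = sorted(systems, key=lambda s: systems[s][m])
        for rank, name in enumerate(ordered, start=1):
            rank_sums[name] += rank  # each measure carries weight 1
    # Re-rank on the summed ranks to produce the final rank
    final_order = sorted(rank_sums, key=rank_sums.get)
    return {name: i + 1 for i, name in enumerate(final_order)}

print(final_ranks(systems, ["mortality_idx", "mspb_idx"]))
# {'A': 1, 'C': 2, 'B': 3}
```

In the actual study, all nine measures in Table 12 enter the sum, and ranking is done within each comparison group.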
[Figure: Performance matrix scatter plot. X-axis: 2016 performance (0-100); y-axis: 2012-2016 rate of improvement (0-100). Points show system member hospitals 1 through 7, the system centroid, and the "perfect point" at the top-right corner (100, 100).]
Policy on revocation of a
15 Top Health Systems award
To preserve the integrity of the study, it is the
policy of the Watson Health 100 Top Hospitals
program to revoke a 15 Top Health Systems award
if a system is found to have submitted inaccurate
or misleading data to any data source used in
the study.
Note: Winning systems are listed alphabetically by name. Member hospitals are listed alphabetically by state, then alphabetically by name.
Health system/hospital name Location Hospital Medicare ID
Mayo Foundation Rochester, MN
Mayo Clinic Hospital Phoenix, AZ 030103
Mayo Clinic Hospital Jacksonville, FL 100151
Mayo Clinic Health System in Waycross Waycross, GA 110003
Mayo Clinic Hospital Rochester Rochester, MN 240010
Mayo Clinic Health System Albert Lea Albert Lea, MN 240043
Mayo Clinic Health System Cannon Falls Cannon Falls, MN 241346
Mayo Clinic Health System Fairmont Fairmont, MN 240166
Mayo Clinic Health System in Red Wing Red Wing, MN 240018
Mayo Clinic Health System Lake City Lake City, MN 241338
Mayo Clinic Health System Mankato Mankato, MN 240093
Mayo Clinic Health System New Prague New Prague, MN 241361
Mayo Clinic Health System Springfield Springfield, MN 241352
Mayo Clinic Health System St. James St. James, MN 241333
Mayo Clinic Health System Waseca Waseca, MN 241345
Mayo Clinic Franciscan Healthcare La Crosse, WI 520004
Mayo Clinic Health System Franciscan Sparta, WI 521305
Mayo Clinic Health System Chippewa Valley Bloomer, WI 521314
Mayo Clinic Health System Eau Claire Eau Claire, WI 520070
Mayo Clinic Health System Northland Barron, WI 521315
Mayo Clinic Health System Oakridge Osseo, WI 521302
Mayo Clinic Health System Red Cedar Menomonie, WI 521340
Mercy Chesterfield, MO
Mercy Hospital Berryville Berryville, AR 041329
Mercy Hospital Booneville Booneville, AR 041318
Mercy Hospital Fort Smith Fort Smith, AR 040062
Mercy Hospital Ozark Ozark, AR 041303
Mercy Hospital Rogers Rogers, AR 040010
Mercy Hospital Waldron Waldron, AR 041305
North Logan Mercy Hospital Paris, AR 041300
Mercy Hospital Columbus Columbus, KS 171308
Mercy Hospital Fort Scott Fort Scott, KS 170058
Mercy Hospital Aurora Aurora, MO 261316
Mercy Hospital Carthage Carthage, MO 261338
Mercy Hospital Cassville Cassville, MO 261317
Mercy Hospital Jefferson Festus, MO 260023
Health system/hospital name Location Hospital Medicare ID
Sentara Healthcare Norfolk, VA
Sentara Albemarle Medical Center Elizabeth City, NC 340109
Martha Jefferson Hospital Charlottesville, VA 490077
Sentara Careplex Hospital Hampton, VA 490093
Sentara Halifax Regional Hospital South Boston, VA 490013
Sentara Leigh Hospital Norfolk, VA 490046
Sentara Norfolk General Hospital Norfolk, VA 490007
Sentara Northern Virginia Medical Center Woodbridge, VA 490113
Sentara Obici Hospital Suffolk, VA 490044
Sentara Princess Anne Hospital Virginia Beach, VA 490119
Sentara RMH Medical Center Harrisonburg, VA 490004
Sentara Virginia Beach General Hospital Virginia Beach, VA 490057
Sentara Williamsburg Regional Medical Center Williamsburg, VA 490066
St. Luke's Health System Boise, ID
St. Luke's Boise Medical Center Boise, ID 130006
St. Luke's Elmore Medical Center Mountain Home, ID 131311
St. Luke's Jerome Jerome, ID 131310
St. Luke's Magic Valley RMC Twin Falls, ID 130002
St. Luke's McCall McCall, ID 131312
St. Luke's Wood River Medical Center Ketchum, ID 131323
UPMC Susquehanna Health System Williamsport, PA
Muncy Valley Hospital Muncy, PA 391301
Soldiers and Sailors Memorial Hospital Wellsboro, PA 390043
Williamsport Regional Medical Center Williamsport, PA 390045
TriHealth Cincinnati, OH
Bethesda North Hospital Cincinnati, OH 360179
Good Samaritan Hospital Cincinnati, OH 360134
McCullough Hyde Memorial Hospital Oxford, OH 360046
TriHealth Evendale Hospital Cincinnati, OH 360362
UCHealth Aurora, CO
UCHealth Medical Center of the Rockies Loveland, CO 060119
UCHealth Poudre Valley Hospital Fort Collins, CO 060010
UCHealth University of Colorado Hospital Aurora, CO 060024
University of Colorado Health Memorial Hospital Colorado Springs, CO 060022
Note: Health systems are ordered alphabetically. This year’s 15 Top Health Systems (2018) are in bold, blue text.
Small health systems
System name Location
Asante Medford, OR
Baptist Health Care (FL) Pensacola, FL
Cape Cod Healthcare Hyannis, MA
Centegra Health System Crystal Lake, IL
Centra Health Lynchburg, VA
CHI St. Joseph Health Bryan, TX
Genesis Health System Davenport, IA
Guthrie Healthcare System Sayre, PA
John D. Archbold Memorial Hospital Thomasville, GA
Mary Washington Healthcare Fredericksburg, VA
Maury Regional Health Columbia, TN
MidMichigan Health Midland, MI
Northern Arizona Healthcare Flagstaff, AZ
Penn Highlands Healthcare DuBois, PA
PIH Health Whittier, CA
ProHealth Care Waukesha, WI
Roper St. Francis Healthcare Charleston, SC
Saint Alphonsus Health System Boise, ID
Saint Joseph Regional Health System Mishawaka, IN
St. Charles Health System Bend, OR
St. Mary's Health Care System Athens, GA
UPMC Susquehanna Health System Williamsport, PA
In addition to considering the POA indicator codes in calibration of our risk- and severity-adjustment models, we have adjusted for missing/invalid POA coding found in the Medicare Provider Analysis and Review (MEDPAR) data files. After 2010, we have observed a significantly higher percentage of principal diagnosis and secondary diagnosis codes that do not have a valid POA indicator code in the MEDPAR data files. Since 2011, an invalid code of "0" has been appearing. This phenomenon has led to an artificial rise in the number of conditions that appear to be occurring during the hospital stay, as invalid POA codes are treated as "not present" by POA-enabled risk models.

To correct for this bias, we adjusted MEDPAR record processing through our mortality, complications, and LOS models as follows:

1. Original, valid (Y, N, U, W, or 1) POA codes assigned to diagnoses were retained.

2. Where a POA code of "0" appeared, we took the next four steps:
a. We treated all diagnosis codes on the CMS exempt list as "exempt," regardless of POA coding.
b. We treated all principal diagnoses as "present on admission."
c. We treated secondary diagnoses where the POA code "Y" or "W" appeared more than 50% of the time in the Watson Health all-payer database as "present on admission."
d. All others were treated as "not present."
Percentage of diagnosis codes with POA indicator code of “0” by MEDPAR year
2010 2011 2012 2013 2014 2015 2016
Principal diagnosis 0.00% 4.26% 4.68% 4.37% 3.40% 4.99% 2.45%
Secondary diagnosis 0.00% 15.05% 19.74% 22.10% 21.58% 23.36% 21.64%
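The POA correction steps above amount to a per-diagnosis recode rule. In this sketch, the CMS exempt list and the all-payer "Y or W" frequencies are placeholder stand-ins for the real reference data:

```python
# Sketch of the POA recode: keep valid codes; for invalid "0" codes, apply
# the exempt-list, principal-diagnosis, and all-payer-frequency rules.
VALID_POA = {"Y", "N", "U", "W", "1"}
CMS_EXEMPT = {"Z38.00"}                           # placeholder exempt codes
POA_Y_OR_W_RATE = {"I10": 0.92, "J96.00": 0.35}   # placeholder frequencies

def recode_poa(dx_code, poa_code, is_principal):
    if poa_code in VALID_POA:
        return poa_code                  # step 1: retain valid codes
    # step 2: invalid POA code ("0")
    if dx_code in CMS_EXEMPT:
        return "exempt"                  # 2a: CMS exempt list
    if is_principal:
        return "Y"                       # 2b: principal dx -> present
    if POA_Y_OR_W_RATE.get(dx_code, 0.0) > 0.5:
        return "Y"                       # 2c: usually present on admission
    return "N"                           # 2d: all others -> not present

print(recode_poa("I10", "0", False))     # "Y" (Y/W rate > 50%)
print(recode_poa("J96.00", "0", False))  # "N" (Y/W rate <= 50%)
```

The effect is that an invalid "0" no longer automatically inflates the count of conditions apparently acquired during the stay.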
Excluding records that are DNR status at admission is supported by the literature. A recent peer-reviewed publication stated: "Inclusion of DNR patients within mortality studies likely skews those analyses, falsely indicating failed resuscitative efforts rather than humane decisions to limit care after injury"36.

Our rationale is straightforward: If a patient is admitted DNR (POA), then typically no heroic efforts would be made to save that patient if they began to fail. Without the POA DNR exclusion, if a given hospital has a higher proportion of POA DNR patients that it is not attempting to save from death compared to an otherwise similar hospital that is not admitting as high a proportion of such patients, the first hospital would look lower-performing compared to the second through no fault of its own. The difference would be driven by the proportion of POA DNR patients.

A standard logistic regression model is used to estimate the risk of mortality for each patient. This is done by weighting the patient records of the hospital by the logistic regression coefficients associated with the corresponding terms in the model and the intercept term. This produces the expected probability of an outcome for each eligible patient (numerator) based on the experience of the norm for patients with similar characteristics (for example, age, clinical grouping, and severity of illness)30-34. This model accounts for only patient conditions that are present on admission when calculating risk. Additionally, in response to the transition to ICD-10-CM, diagnosis and procedure codes, and the interactions among them, have been mapped to AHRQ CCS categories for assignment of risk instead of using the individual diagnosis, procedure, and interaction effects. See the discussion under the methods for identifying patient severity above.

Staff physicians at Watson Health suggested clinical patient characteristics that were incorporated into the proprietary models. After assigning the predicted probability of the outcome for each patient, the patient-level data can then be aggregated across a variety of groupings, including health system, hospital, service line, or MS-DRG classification.

Expected complications rate index models
Watson Health has developed a complications risk model that can be applied to coded patient claims data to estimate the expected probability of a complication occurring, given various patient-related factors. We exclude long-term care, psychiatric, substance abuse, rehabilitation, and federally owned or controlled facilities. In addition, we exclude certain patient records from the data set: psychiatric; substance abuse; unclassified cases (MS-DRGs 945, 946, and 999); cases in which patient age was less than 65 years; and cases in which a patient transferred to another short-term, acute care hospital. Palliative care patients (Z51.5; V66.7) are included in the complications risk model, which is calibrated to estimate the probability of complications for these patients.

Note: We are no longer able to exclude all rehabilitation patients as we have done in the past. This is because the ICD-10-CM coding system does not identify rehabilitation patients. We can only exclude those patients coded as being in a PPS-exempt hospital rehabilitation unit (provtype = R or T).

Risk-adjusted complications refer to outcomes that may be of concern when they occur at a greater-than-expected rate among groups of patients, possibly reflecting systemic quality-of-care issues. The Watson Health complications model uses clinical qualifiers to identify complications that have occurred in the inpatient setting. The complications used in the model are listed on the following page.
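The coefficient-weighting step behind both the mortality and complications models can be sketched as a standard logistic calculation. The intercept, risk factors, and coefficients below are illustrative, not the proprietary model's values:

```python
import math

# Sketch of how a fitted logistic model yields an expected probability:
# intercept plus coefficient-weighted risk-factor terms, passed through
# the logistic function. Expected events for a group are the sum of the
# patient-level probabilities.
INTERCEPT = -4.0
COEFS = {"age_80_plus": 1.2, "high_severity": 1.8}  # illustrative terms

def expected_probability(risk_factors):
    logit = INTERCEPT + sum(COEFS[f] for f in risk_factors)
    return 1.0 / (1.0 + math.exp(-logit))

patients = [
    ["age_80_plus"],
    ["age_80_plus", "high_severity"],
    [],  # no modeled risk factors present on admission
]
expected_events = sum(expected_probability(p) for p in patients)
print(round(expected_events, 3))
```

Only conditions present on admission would contribute risk-factor terms, matching the POA restriction described above.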
A standard regression model is used to estimate the risk of experiencing a complication for each patient. This is done by weighting the patient records of the hospital by the regression coefficients associated with the corresponding terms in the prediction models and the intercept term. This method produces the expected probability of a complication for each patient based on the experience of the norm for patients with similar characteristics. After assigning the predicted probability of a complication for each patient in each risk group, it is then possible to aggregate the patient-level data across a variety of groupings37-40, including health system, hospital, service line, or MS-DRG classification. This model accounts for only patient conditions that are present on admission when calculating risk. Additionally, in response to the transition to ICD-10-CM, diagnosis and procedure codes, and the interactions among them, have been mapped to AHRQ CCS categories for assignment of risk instead of using the individual diagnosis, procedure, and interaction effects.

Index interpretation
An outcome index is a ratio of an observed number of outcomes to an expected number of outcomes in a population. This index is used to make normative comparisons and is standardized in that the expected number of events is based on the occurrence of the event in a normative population. The normative population used to calculate expected numbers of events is selected to be similar to the comparison population with respect to relevant characteristics, including age, sex, region, and case mix.

The index is the number of observed events divided by the number of expected events and can be calculated for outcomes that involve counts of occurrences (for example, deaths or complications). Interpretation of the index relates the experience of the comparison population, relative to a specified event, to the expected experience based on the normative population.

Examples:

10 events observed ÷ 10 events expected = 1.0: The observed number of events is equal to the expected number of events based on the normative experience.

10 events observed ÷ 5 events expected = 2.0: The observed number of events is twice the expected number of events based on the normative experience.

10 events observed ÷ 25 events expected = 0.4: The observed number of events is 60% lower than the expected number of events based on the normative experience.

Therefore, an index value of 1.0 indicates no difference between observed and expected outcome occurrence. An index value greater than 1.0 indicates an excess in the observed number of events relative to the expected based on the normative experience. An index value of less than 1.0 indicates fewer events observed than would be expected based on the normative experience. An additional interpretation is that the difference between 1.0 and the index is the percentage difference in the number of events relative to the norm. In other words, an index of 1.05 indicates 5% more outcomes, and an index of 0.90 indicates 10% fewer outcomes than expected based on the experience of the norm. The index can be calculated across a variety of groupings (for example, hospital or service line).
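The worked examples above reduce to a single observed-over-expected ratio:

```python
# Observed/expected outcome index: 1.0 means observed equals expected;
# above 1.0 means excess events; below 1.0 means fewer events than expected.
def outcome_index(observed, expected):
    return observed / expected

print(outcome_index(10, 10))  # 1.0
print(outcome_index(10, 5))   # 2.0
print(outcome_index(10, 25))  # 0.4 (60% fewer events than expected)
```

Because the expected count comes from the normative population, the index is directly comparable across hospitals and systems with different case mixes.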
Data note relating to the July 2016 Hospital Compare performance period (July 1, 2012 - June 30, 2015): The pneumonia measure cohort was expanded to include principal discharge codes for sepsis and aspiration pneumonia. This resulted in a significant increase in pneumonia 30-day mortality rates nationally, beginning with the 2015 data year.

A system is excluded from the study if it does not have the HAIs required for its comparison group. See the table below.

HAIs by compare group
Compare group / Required HAIs
Large systems HAI-1, HAI-2, HAI-3, HAI-5, HAI-6
Medium systems HAI-1, HAI-2, HAI-3, HAI-5, HAI-6
Small systems HAI-1, HAI-2, HAI-6

Note: The required HAIs were selected based on an analysis of the completeness of data available for each HAI in each system comparison group.
POA coding allows us to estimate appropriate adjustments to LOS weights based on pre-existing conditions. Complications that occurred during the hospital stay are not considered in the model. We calculate expected values from model coefficients that are normalized to the clinical group and transformed from log scale.

ED throughput measures
ED-1b Median time from ED arrival to ED departure for admitted ED patients
OP-18b Median time from ED arrival to ED departure for discharged ED patients

Emergency department throughput measure
We have included two ED throughput measures from the CMS Hospital Compare data set. The hospital ED is an access point to healthcare for many people. A key factor in evaluating ED performance is process "throughput," measures of timeliness with which patients are seen by a provider, receive treatment, and either are admitted or discharged. Timely ED processes may impact both care quality and the quality of the patient experience. We chose to include measures that define two ED processes: median time from ED arrival to ED departure for admitted patients, and median time from ED arrival to ED departure for non-admitted patients.

For this study's measure, we used 2016 data from CMS Hospital Compare. Hospitals are required to have reported both ED measures or they are excluded from the study. Our ranked metric is the calculated mean of the two included measures.

Hospitals participating in the CMS Inpatient Quality Reporting and Outpatient Quality Reporting Programs report data for any eligible adult ED patients, including Medicare patients, Medicare managed care patients, and non-Medicare patients. Submitted data can be for all eligible patients or a sample of patients, following CMS sampling rules.

Medicare spend per beneficiary index
The Medicare spend per beneficiary (MSPB) index is included as a proxy for episode-of-care cost efficiency for hospitalized patients. CMS develops and publishes this risk-adjusted index in the public Hospital Compare data sets and, in FFY 2015, began to include it in the Hospital Value-Based Purchasing program. The CMS-stated reason for including this measure is "… to reward hospitals that can provide efficient care at a lower cost to Medicare"43.

The MSPB index evaluates hospitals' efficiency relative to the efficiency of the median hospital, nationally. Specifically, the MSPB index assesses the cost to Medicare of services performed by hospitals and other healthcare providers during an MSPB episode, which comprises the period three days prior to, during, and 30 days following a patient's hospital stay. Payments made by Medicare and the beneficiary (that is, allowed charges) are counted in the MSPB episode as long as the start of the claim falls within the episode window. IPPS outlier payments (and outlier payments in other provider settings) are also included in the calculation of the MSPB index. The index is available for Medicare beneficiaries enrolled in Medicare Parts A and B who were discharged from short-term, acute care hospitals during the period of performance. Medicare Advantage enrollees are not included. This measure excludes patients who died during the episode.
The HCAHPS data is adjusted by CMS for both survey mode (phone, web, or mail survey) and the patient mix at the discharging facility, since respondents randomized to the phone mode tend to provide more positive evaluations about their care experience than those randomized to the mail survey mode. Details on this adjustment's parameters are available for all facilities with each quarterly update, at hcahpsonline.org.

Although we report health system performance on all HCAHPS questions, only performance on the overall hospital rating question, "How do patients rate the hospital, overall?" is used to rank system performance.

At the hospital level, patient responses fall into three categories, and the number of patients in each category is reported as a percent:
–– Patients who gave a rating of 6 or lower (low)
–– Patients who gave a rating of 7 or 8 (medium)
–– Patients who gave a rating of 9 or 10 (high)

For each answer category, we assign a weight as follows: 3 equals high or good performance, 2 equals medium or average performance, and 1 equals low or poor performance. We then calculate a weighted score for each hospital by multiplying the HCAHPS answer percent by the category weight. For each hospital, we sum the weighted percent values for the three answer categories.

To calculate the HCAHPS score for each system, we multiply each member hospital's HCAHPS score by the hospital's MEDPAR discharges for the most current year included in the study. This produces each hospital's weighted HCAHPS score.

To calculate the HCAHPS score for each health system, we sum the member hospital weighted HCAHPS scores, sum the member hospital MEDPAR discharges, then divide the sum of the weighted HCAHPS scores by the sum of the discharges. This produces the health system mean weighted HCAHPS score, which is the ranked measure in the study.

We apply this same methodology to each individual HCAHPS question to produce mean weighted HCAHPS scores for the systems. These values are reported for information only in the study.

Methodology for financial performance measures

Data sources
The financial measures included in this study are sourced from the annual audited, consolidated financial statements of in-study health systems, when they are publicly available. Consolidated balance sheets and consolidated statements of operations are used. 65.1% of all in-study health systems* had publicly available audited financial statements for 2016. This included data for 84.2% of parent and other independent health systems, while audited financials were generally not available for subsystems (1.3%). Audited financial statements are obtained from the following sites:
–– Electronic Municipal Market Access
–– DAC Bond
–– US Securities and Exchange Commission

* We include subsystems in our study, as well as their parent systems, and independent systems with no subsystems. Subsystems generally are included in the parent organization statements.
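The hospital-level weighting and system-level roll-up described above can be sketched as follows. The answer-category percents and discharge counts are illustrative:

```python
# Sketch of the HCAHPS overall-rating score: weight each answer-category
# percent (high=3, medium=2, low=1), sum per hospital, then take the
# discharge-weighted mean across member hospitals.
WEIGHTS = {"high": 3, "medium": 2, "low": 1}

def hospital_hcahps(pct):
    # pct maps category -> percent of respondents (values sum to 100)
    return sum(pct[cat] * WEIGHTS[cat] for cat in WEIGHTS)

def system_hcahps(members):
    weighted = sum(hospital_hcahps(m["pct"]) * m["discharges"] for m in members)
    return weighted / sum(m["discharges"] for m in members)

members = [
    {"pct": {"high": 70, "medium": 20, "low": 10}, "discharges": 3000},
    {"pct": {"high": 60, "medium": 30, "low": 10}, "discharges": 1000},
]
print(system_hcahps(members))  # 257.5
```

A hospital where every respondent answered "high" would score 300 (100 × 3), so the scale runs from 100 to 300.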
In-study system counts
There are fewer in-study systems in the trend profile than the current profile because some systems do not have enough data points for one or more measures to calculate trend, so they are excluded. Three data points are required to calculate the t-statistic of the regression line, which is the ranked metric.

–– Additional impact on average LOS calculation: The observed/normalized expected LOS index for each system is converted into an average LOS in days by multiplying it by the mean average LOS for all in-study systems (sum of observed LOS/in-study system count). The grand mean average LOS will be different in the current and trend profiles when there are different numbers of in-study systems.

Both the current and trend profiles are internally consistent. They each provide relevant comparisons of a profiled health system's performance versus peers and national benchmarks.

Why we have not calculated percent change in specific instances
Percent change is a meaningless statistic when the underlying quantity can be positive, negative, or zero. The actual change may mean something, but dividing it by a number that may be zero or of the opposite sign does not convey any meaningful information, because the amount of change is not proportional to its previous value.

We also do not report percent change when the metrics are already percentages. In these cases, we report the simple difference between the two percentage values.

Protecting patient privacy
We do not report any individual measure data that is based on 11 or fewer patients, as required by CMS. This is applicable to the following measures:
–– Risk-adjusted inpatient mortality index
–– Risk-adjusted complications index
–– 30-day mortality rates for AMI, HF, pneumonia, COPD, and stroke (CMS does not report a rate when the count is less than 25)
–– 30-day readmission rates for AMI, HF, pneumonia, THA/TKA, COPD, and stroke (CMS does not report a rate when the count is less than 25)
–– Average LOS
All health systems in study

Health system name Location
Abrazo Community Health Network Phoenix, AZ
Adventist Florida Hospital Orlando, FL
Adventist Health Central Valley Network Hanford, CA
Adventist Health System Altamonte Springs, FL
Adventist Health West Roseville, CA
Adventist Healthcare Rockville, MD
Advocate Health Care Downers Grove, IL
AHMC Healthcare Alhambra, CA
Alameda Health System Alameda, CA
Alecto Healthcare Services Irvine, CA
Alegent Creighton Health Omaha, NE
Alexian Brothers Health System Arlington Heights, IL
Allegheny Health Network Pittsburgh, PA
Allina Health System Minneapolis, MN
Alta Hospitals System Los Angeles, CA
Anderson Regional Health System Meridian, MS
Appalachian Regional Healthcare (ARH) Lexington, KY
Ardent Health Services Nashville, TN
Asante Medford, OR
Ascension Health St. Louis, MO
Aspirus Wausau, WI
Atlantic Health System Morristown, NJ
Aurora Health Care Milwaukee, WI
Avanti Hospitals El Segundo, CA
Avera Health Sioux Falls, SD
Banner Health Phoenix, AZ
Baptist Health Montgomery, AL
Baptist Health Little Rock, AR
Baptist Health Care Pensacola, FL
Baptist Health of Northeast Florida Jacksonville, FL
BayCare Health System Clearwater, FL
Bayhealth Dover, DE
Baylor Scott & White Health Dallas, TX
Baystate Health Springfield, MA
Beacon Health System South Bend, IN
Beaumont Health Royal Oak, MI
BJC HealthCare Saint Louis, MO
Bon Secours Health System Marriottsville, MD
Bronson Healthcare Group Kalamazoo, MI
Brookwood Baptist Health Birmingham, AL
Broward Health Fort Lauderdale, FL
Cape Cod Healthcare Hyannis, MA
Cape Fear Valley Health System Fayetteville, NC
Capella Healthcare Franklin, TN
Capital Health System Trenton, NJ
Care New England Health System Providence, RI
CareGroup Healthcare System Boston, MA
CarePoint Health Bayonne, NJ
Carilion Clinic Roanoke, VA
Carolinas HealthCare System Charlotte, NC
Carondelet Health Network Tucson, AZ
Catholic Health Buffalo, NY
Catholic Health Initiatives Denver, CO
Catholic Health Services of Long Island Rockville Centre, NY
Centegra Health System Crystal Lake, IL
Centra Health Lynchburg, VA
Central Florida Health Leesburg, FL
Centura Health Englewood, CO
CHI Franciscan Health Tacoma, WA
CHI St. Joseph Health Bryan, TX
CHI St. Luke's Health Houston, TX
CHI St. Vincent Little Rock, AR
Christus Health Irving, TX
Citrus Valley Health Partners Covina, CA
Note: This year’s 15 Top Health Systems (2018) are in bold, blue text.
Health system name Location
Cleveland Clinic Cleveland, OH
Columbia Health System Milwaukee, WI
Community Foundation of Northwest Indiana Munster, IN
Community Health Network Indianapolis, IN
Community Health Systems Franklin, TN
Community Hospital Corp Plano, TX
Community Medical Centers Fresno, CA
Conemaugh Health System Johnstown, PA
Covenant Health Knoxville, TN
Covenant Health Systems (Northeast) Syracuse, NY
CoxHealth Springfield, MO
Crozer-Keystone Health System Springfield, PA
Dartmouth Hitchcock Health Lebanon, NH
DCH Health System Tuscaloosa, AL
DeKalb Regional Healthcare System Decatur, GA
Detroit Medical Center Detroit, MI
Dignity Health San Francisco, CA
Dimensions Health Corporation Cheverly, MD
Duke LifePoint Durham, NC
Duke University Health System Durham, NC
East Texas Medical Center Regional Healthcare System Tyler, TX
Eastern Connecticut Health Network Manchester, CT
Eastern Maine Healthcare Systems Brewer, ME
Edward Elmhurst Health Naperville, IL
Einstein Healthcare Network Philadelphia, PA
Emory Healthcare Atlanta, GA
Essentia Health Duluth, MN
Excela Health Greensburg, PA
Fairview Health Services Minneapolis, MN
Forrest Health Hattiesburg, MS
Franciscan Health Mishawaka, IN
Franciscan Missionaries of Our Lady Health System Baton Rouge, LA
Franciscan Sisters of Christian Charity Manitowoc, WI
Froedtert & the Medical College of Wisconsin Milwaukee, WI
Good Shepherd Health System Longview, TX
Greater Hudson Valley Health System Middletown, NY
Greenville Health System Greenville, SC
Guthrie Healthcare System Sayre, PA
Hartford HealthCare Hartford, CT
Hawaii Health Systems Corporation Honolulu, HI
Hawaii Pacific Health Honolulu, HI
HCA Capital Division Richmond, VA
HCA Central and West Texas Division Austin, TX
HCA Continental Division Denver, CO
HCA East Florida Division Ft. Lauderdale, FL
HCA Far West Division Las Vegas, NV
HCA Gulf Coast Division Houston, TX
HCA Healthcare Nashville, TN
HCA MidAmerica (North) Kansas City, MO
HCA MidAmerica (South) Kansas City, MO
HCA Mountain Division Salt Lake City, UT
HCA North Florida Division Tallahassee, FL
HCA North Texas Division Dallas, TX
HCA San Antonio Division San Antonio, TX
HCA South Atlantic Division Charleston, SC
HCA Tristar Division Nashville, TN
HCA West Florida Division Tampa, FL
Health First Rockledge, FL
Health Group of Alabama Huntsville, AL
Health Quest System Poughkeepsie, NY
HealthEast Care System Saint Paul, MN
HealthPartners Bloomington, MN
Henry Ford Health System Detroit, MI
Heritage Valley Health System Beaver, PA
HighPoint Health System Gallatin, TN
Hillcrest HealthCare System Tulsa, OK
HonorHealth Scottsdale, AZ
Hospital Sisters Health System Springfield, IL
Houston Healthcare Warner Robins, GA
Health system name Location
OhioHealth Columbus, OH
Orlando Health Orlando, FL
OSF Healthcare System Peoria, IL
Palmetto Health Columbia, SC
Palomar Health Escondido, CA
Parkview Health Fort Wayne, IN
Partners HealthCare Boston, MA
PeaceHealth Vancouver, WA
Penn Highlands Healthcare DuBois, PA
Penn Medicine Philadelphia, PA
Phoebe Putney Health System Albany, GA
Physicians for Healthy Hospitals Hemet, CA
Piedmont Healthcare Atlanta, GA
PIH Health Whittier, CA
Premier Health Dayton, OH
Presbyterian Healthcare Services Albuquerque, NM
Presence Health Chicago, IL
Prime Healthcare Services Ontario, CA
ProHealth Care Waukesha, WI
ProMedica Health System Toledo, OH
Providence Health & Services Renton, WA
Queens Health System Honolulu, HI
Quorum Health Corporation Brentwood, TN
RCCH HealthCare Partners Brentwood, TN
Regional Health Rapid City, SD
Renown Health Reno, NV
Riverside Health System Newport News, VA
Rochester Regional Health Rochester, NY
Roper St. Francis Healthcare Charleston, SC
RWJBarnabas Health West Orange, NJ
Sacred Heart Health System Pensacola, FL
Saint Alphonsus Health System Boise, ID
Saint Francis Health System Tulsa, OK
Saint Joseph Mercy Health System Ann Arbor, MI
Saint Joseph Regional Health System Mishawaka, IN
Saint Luke’s Health System Kansas City, MO
Saint Thomas Health Nashville, TN
Samaritan Health Services Corvallis, OR
Sanford Health Sioux Falls, SD
SCL Denver Region Denver, CO
SCL Health Denver, CO
Scripps Health San Diego, CA
Sentara Healthcare Norfolk, VA
Seton Healthcare Family Austin, TX
Sharp HealthCare San Diego, CA
Sinai Health System Chicago, IL
Sisters of Charity Health System Cleveland, OH
Skagit Regional Health Mount Vernon, WA
Southeast Georgia Health System Brunswick, GA
SoutheastHEALTH Cape Girardeau, MO
Southern Illinois Healthcare Carbondale, IL
Sparrow Health System Lansing, MI
Spartanburg Regional Healthcare System Spartanburg, SC
Spectrum Health Grand Rapids, MI
SSM Health Saint Louis, MO
St. Charles Health System Bend, OR
St. Elizabeth Healthcare Fort Thomas, KY
St. John Health System Tulsa, OK
St. John Providence Health Detroit, MI
St. Joseph Health System Irvine, CA
St. Joseph/Candler Health System Savannah, GA
St. Luke's Health System Boise, ID
St. Mary's Health Care System Athens, GA
St. Peters Health Partners Albany, NY
St. Vincent’s Health System Birmingham, AL
St. Vincent's Healthcare Jacksonville, FL
Steward Health Care System Boston, MA
Success Health Boca Raton, FL
Sutter Health Sacramento, CA
Sutter Health Bay Area Sacramento, CA
Sutter Health Valley Area Sacramento, CA
Swedish Seattle, WA
Tanner Health System Carrollton, GA
Temple University Health System Philadelphia, PA
Note: This year’s 15 Top Health Systems (2018) are in bold, blue text.
Note: This year’s 15 Top Health Systems (2018) are in bold, blue text.
Footnotes

1. Kaplan RS, Norton DP. The Balanced Scorecard: Measures That Drive Performance. Harvard Bus Rev, Jan–Feb 1992.

2. Shook J, Chenoweth J. 100 Top Hospitals CEO Insights: Adoption Rates of Select Baldrige Award Practices and Processes. Ann Arbor, MI: Truven Health Analytics Center for Healthcare Improvement. October 2012.

3. Foster DA. Hospital System Membership and Performance. Ann Arbor, MI: Truven Health Analytics Center for Healthcare Improvement. May 2012.

4. HIMSS Analytics, Truven Health Analytics. 2012 HIMSS Analytics Report: Quality and Safety Linked to Advanced Information Technology Enabled Processes. Chicago, IL: HIMSS Analytics. April 2012.

5. Foster DA, Chenoweth J. Comparison of Baldrige Award Applicants and Recipients With Peer Hospitals on a National Balanced Scorecard. Ann Arbor, MI: Truven Health Analytics Center for Healthcare Improvement. October 2011.

6. Young J. Outpatient Care Standards Followed More Closely at Top-Performing Hospitals. Ann Arbor, MI: Truven Health Analytics Center for Healthcare Improvement. March 2011.

7. Young J. Hospitals Increase Cardiovascular Core Measure Compliance. Ann Arbor, MI: Truven Health Analytics Center for Healthcare Improvement. November 2010.

8. Foster DA. Top Cardiovascular Care Means Greater Clinical and Financial Value. Ann Arbor, MI: Truven Health Analytics Center for Healthcare Improvement. November 2009.

9. Lee DW, Foster DA. The association between hospital outcomes and diagnostic imaging: early findings. J Am Coll Radiol. 2009 Nov; 6(11):780-785.

10. Bonis PA, Pickens GT, Rind DM, Foster DA. Association of a clinical knowledge support system with improved patient safety, reduced complications and shorter length of stay among Medicare beneficiaries in acute care hospitals in the United States. Int J Med Inform. 2008 Nov; 77(11):745-753. Epub June 2008.

11. Foster DA. HCAHPS 2008: Comparison Results for 100 Top Hospitals Winners Versus Non-Winners. Ann Arbor, MI: Truven Health Analytics Center for Healthcare Improvement. August 2008.

12. Foster DA. Risk-Adjusted Mortality Index Methodology. Ann Arbor, MI: Truven Health Analytics Center for Healthcare Improvement. July 2008.

13. Foster DA. Trends in Patient Safety Adverse Outcomes and 100 Top Hospitals Performance, 2000-2005. Ann Arbor, MI: Truven Health Analytics Center for Healthcare Improvement. March 2008.

14. Shook J, Young J. Inpatient and Outpatient Growth by Service Line: 2006 Truven Health 100 Top Hospitals: Performance Improvement Leaders Versus Peer Hospitals. Ann Arbor, MI: Truven Health Analytics Center for Healthcare Improvement. August 2007.

15. Chenoweth J, Safavi K. Leadership Strategies for Reaching Top Performance Faster. J Healthc Tech. January 2007. HCT Project Volume 4.

16. Griffith JR, Alexander JA, Foster DA. Is Anybody Managing the Store? National Trends in Hospital Performance. Healthc Manag. 2006 Nov–Dec; 51(6):392-405; discussion 405-406.

17. McDonagh KJ. Hospital Governing Boards: A Study of Their Effectiveness in Relation to Organizational Performance. Healthc Manag. 2006 Nov–Dec; 51(6).

18. Bass K, Foster DA, Chenoweth J. Study Results — Proving Measurable Leadership and Engagement Impact on Quality. CMS Invitational Conference on Leadership and Quality. Sept 28, 2006.

19. Chenoweth J, Foster DA, Waibel BC. Best Practices in Board Oversight of Quality. The Governance Institute. June 2006.

20. Kroch E, Vaughn T, Koepke M, Roman S, Foster DA, Sinha S, Levey S. Hospital Boards and Quality Dashboards. J Patient Safety. 2(1):10-19, March 2006.

21. Cejka Search and Solucient, LLC. 2005 Hospital CEO Leadership Survey.

22. Health Research and Educational Trust and Prybil, L. Governance in High-Performing Organizations: A Comparative Study of Governing Boards in Not-For-Profit Hospitals. Chicago: HRET in Partnership with AHA. 2005.

23. Griffith JR, Alexander JA, Jelinek RC. Measuring Comparative Hospital Performance. Healthc Manag. 2002 Jan–Feb; 47(1).

24. Griffith JR, Knutzen SR, Alexander JA. Structural Versus Outcomes Measures in Hospitals: A Comparison of Joint Commission and Medicare Outcomes Scores in Hospitals. Qual Manag Health Care. 2002; 10(2):29-38.

25. See the CMS website at cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/OutcomeMeasures.html.

26. See the CMS Hospital Compare website at hospitalcompare.hhs.gov.

27. See the CMS website at cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HospitalQualityInits/OutcomeMeasures.html.

28. Iezzoni L, Ash A, Shwartz M, Daley J, Hughes J, Mackiernan Y. Judging Hospitals by Severity-Adjusted Mortality Rates: The Influence of the Severity-Adjusted Method. Am J Public Health. 1996; 86(10):1379-1387.
About IBM Watson Health™

Each day, professionals make powerful progress toward a healthier future. At IBM Watson Health, we help remove obstacles, optimize their efforts, and reveal powerful new insights so they can transform health for the people they serve. Working across the landscape, from payers and providers to government and life sciences, we bring together deep health expertise, proven innovation, and the power of artificial intelligence to enable our clients to uncover, connect, and act on the insights that advance their work — and change the world.

For more information

Visit 100tophospitals.com, call 800-525-9083 option 4, or send an email to 100tophospitals@us.ibm.com.

© Copyright IBM Corporation 2018
IBM Corporation
Software Group
Route 100
Somers, NY 10589
ibm.com/watsonhealth
800-525-9083

Produced in the United States of America, April 2018

IBM, the IBM logo, ibm.com, and Watson Health are trademarks of International Business Machines Corp., registered in many jurisdictions worldwide. Other product and service names might be trademarks of IBM or other companies. A current list of IBM trademarks is available on the web at “Copyright and trademark information” at: ibm.com/legal/copytrade.shtml.

This document is current as of the initial date of publication and may be changed by IBM at any time. Not all offerings are available in every country in which IBM operates.

The information in this document is provided “as is” without any warranty, express or implied, including without any warranties of merchantability, fitness for a particular purpose and any warranty or condition of non-infringement.

IBM products are warranted according to the terms and conditions of the agreements under which they are provided.