15 Top Health Systems Study, 2016
8th Edition | Published April 25, 2016
Truven Health Analytics, an IBM Company
100 Phoenix Drive
Ann Arbor, MI 48108 USA
1.800.525.9083
truvenhealth.com
Truven Health 15 Top Health Systems Study, 8th Edition
100 Top Hospitals is a registered trademark of Truven Health Analytics.
©2016 Truven Health Analytics Inc. All rights reserved. Printed and bound in the United States of America.
The information contained in this publication is intended to serve as a guide for general comparisons and
evaluations, but not as the sole basis upon which any specific conduct is to be recommended or undertaken.
The reader bears sole risk and responsibility for any analysis, interpretation, or conclusion based on the information
contained in this publication, and Truven Health Analytics shall not be responsible for any errors, misstatements,
inaccuracies, or omissions contained herein. No part of this publication may be reproduced or transmitted in any
form or by any means, electronic or mechanical, including photocopying, recording, or by any information storage
and retrieval system, without permission in writing from Truven Health Analytics.
ISBN: 978-57372-469-2
Table of Contents

Introduction
  Innovation in System Measurement, Including Alignment
  The 2016 15 Top Health Systems: Seizing Opportunities. Delivering Value.
  Key Differences Between Award Winners and Their Peers
  New Performance Measures
  The Multi-Faceted 100 Top Hospitals Program
  About Truven Health Analytics, an IBM® Company
Findings
  Winner-Versus-Peer Results
  Winning Health System Results
  Top and Bottom Quintile Results
  Performance Measures Under Consideration
  New Measure of “Systemness” — The Performance-Weighted Alignment Score
Methodology
  Building the Database of Health Systems
  Measures and Methods Changes
  Identifying Health Systems
  Classifying Health Systems Into Comparison Groups
  Scoring Health Systems on Weighted Performance Measures
  Performance Measure Details
  Determining the 15 Top Health Systems
Appendix A: Health System Winners and Their Member Hospitals
Appendix B: The Top Quintile: Best-Performing Health Systems
Appendix C: Methodology Details
  Methods for Identifying Complications of Care
  Risk-Adjusted Mortality Index Models
  Expected Complications Rate Index Models
  Core Measures
  30-Day Risk-Adjusted Mortality Rates and 30-Day Risk-Adjusted Readmission Rates
  Average Length of Stay
  Emergency Department Throughput Measure
  Medicare Spending per Beneficiary Index
  HCAHPS Overall Hospital Rating
  Methodology for Financial Performance Measures
  Performance Measure Normalization
  Why We Have Not Calculated Percent Change in Specific Instances
  Protecting Patient Privacy
Appendix D: All Health Systems in Study
References
Introduction
The 100 Top Hospitals Program's
15 Top Health Systems Study:
The Objective Path to Evidence-Based Management
The Truven Health Analytics™ 15 Top Health Systems study, in its eighth year, is the
newest analysis in the 100 Top Hospitals® program series. The 15 Top Health Systems
study is uniquely focused on objectively measuring the impact of managerial excellence,
and the results are highly correlated with the Malcolm Baldrige National Quality Award®
winner improvement rates and performance levels.
Our goal is simple: To inform U.S. health system leaders of relative long-term
improvement rates, resultant performance, and the achievement of "systemness" of
the top-performing organizations versus national peers. This analysis provides valuable
guidance to health system boards and executives who use these critical, quantitative
performance insights to adjust continuous improvement targets, ensure the collaboration
of member hospitals, and achieve systemwide alignment on common performance goals.
Just as evidence-based medicine is key to exceptional patient care, evidence-based
management is key to the health and success of health systems.
The Truven Health 15 Top Health Systems study is an ongoing research project that is
adjusted as changes occur in the healthcare environment, new public data and metrics
become available, and managerial practices evolve. The current study measures relative
balanced performance across a range of organizational key performance indicators
— reflecting care quality, use of evidence-based medicine, post-discharge outcomes,
operational efficiency, and customer perception of care.
To maintain the study’s high level of integrity, only impartial, public data sources are
used for calculating study metrics. This eliminates bias, ensures inclusion of a maximum
number of health systems and facilities, and guarantees uniformity of definitions and
data. And like all Truven Health 100 Top Hospitals studies, our 15 Top Health Systems
research is performed by an experienced research team that includes epidemiologists,
statisticians, physicians, and former hospital executives.
The 15 Top Health Systems study is highly focused on the research and is not designed
as a marketing tool. Because we use only public data, there is no application or fee
for inclusion in the study. Winners do not pay to receive their results and do not pay
promotional fees. Additionally, neither Truven Health nor award winners release specific
15 Top Health Systems balanced scorecard results to consumers.
Innovation in System Measurement, Including Alignment
At the heart of the 15 Top Health Systems research is the methodology used for the
Truven Health 100 Top Hospitals national balanced scorecard.1 This proven scorecard and
its peer-reviewed, risk-adjusted methodologies are the foundation for the comparison of
health system-to-peer rate of improvement and performance.

The Truven Health 15 Top Health Systems scorecard results are divided into two separate
sections that graphically illustrate:
• A health system's performance and improvement versus peer health systems
• Alignment of improvement and performance of member hospitals
The 15 Top Health Systems scorecard also goes beyond these insights by adding a third
measurement dimension — alignment. The alignment factor is particularly useful to health
system leaders as they work to empirically assess the degree of consistency achieved
across system facilities and develop action plans to strengthen it.
The end result is distinctive: A comprehensive analysis of the improvement, resultant
performance, and alignment of America’s health systems.
The 2016 15 Top Health Systems:
Seizing Opportunities. Delivering Value.
To survive in an industry challenged by increased competition and a new set of rules
imposed by reform, healthcare providers must deliver ever-higher quality and become
more efficient — doing more with potentially lower reimbursements.
Undoubtedly, our award winners are finding ways to succeed — defining best practices
and continuing to demonstrate what can be accomplished in healthcare today.
This year’s 15 Top Health Systems, placed into size categories by total operating
expense, are:
Large Health Systems (> $1.75 billion)
• Mayo Foundation, Rochester, MN
• Mercy, Chesterfield, MO
• Spectrum Health, Grand Rapids, MI
• Sutter Health, Sacramento, CA
• Sutter Health Valley Division, Sacramento, CA

Medium Health Systems ($750 million – $1.75 billion)
• Kettering Health Network, Dayton, OH
• Scripps Health, San Diego, CA
• St. Luke's Health System, Boise, ID
• St. Vincent Health, Indianapolis, IN
• TriHealth, Cincinnati, OH

Small Health Systems (< $750 million)
• Asante, Medford, OR
• Lovelace Health System, Albuquerque, NM
• MidMichigan Health, Midland, MI
• Roper St. Francis Healthcare, Charleston, SC
• Tanner Health System, Carrollton, GA
Key Differences Between Award Winners and Their Peers
The 2016 winners of the Truven Health 15 Top Health Systems award outperformed their
peers in a number of ways.
They:
• Saved more lives and caused fewer patient complications
• Followed industry-recommended standards of care more closely
• Released patients from the hospital a half-day sooner
• Readmitted patients less frequently and experienced fewer deaths within 30 days of admission
• Had over 12-percent shorter wait times in their emergency departments
• Had nearly 5-percent lower Medicare beneficiary cost per 30-day episode of care
• Scored over 7 points higher on patient overall rating of care

The 2016 study analyzed 338 health systems and 2,912 hospitals that are members of
health systems.
In addition, we found that hospitals that are health system members were more likely to
be Truven Health 100 Top Hospitals study winners than hospitals that are not a part of a
system. In fact, 69 percent of our 2016 winners belonged to systems, whereas 66 percent
of all in-study hospitals belonged to systems.
Understanding the similarities and differences between high and low system performers
provides benchmarks for the entire industry. Each year, the relevant benchmarks and
findings we assemble for this study provide many examples of excellence, as evidenced in
a number of additional published studies.2-24
New Performance Measures
Our studies continue to adapt as the healthcare market changes. We are currently
analyzing several new performance measures for information only; they are not yet being
used to rank health system performance on our balanced scorecard. These measures
include important information about hospital-acquired infections and continue our
move outside the inpatient acute care setting.
In these new areas of performance, 2016 winners of the 15 Top Health Systems award
outperformed their peers in a number of ways, as well, even though these measures were
not included in the selection of winners. Our benchmark health systems had:
• 28 percent fewer patients with methicillin-resistant staphylococcus aureus (MRSA) infections and 23 percent fewer patients with catheter-associated urinary tract infections (CAUTI) — a truly notable difference in patient care
• Coronary artery bypass grafting (CABG) 30-day mortality and readmission rates that were 0.5 and 0.7 percentage points lower
• 30-day episode payment costs from 2 to 5 percent lower
• Operating margins that were nearly 1 percent higher
The Multi-Faceted 100 Top Hospitals Program
The 15 Top Health Systems research is just one of the studies of the Truven Health
100 Top Hospitals program. To increase understanding of trends in specific areas of the
healthcare industry, the program includes a range of studies and reports, including:
• 100 Top Hospitals and Everest Award studies: Highly anticipated research that annually recognizes the best hospitals in the nation based on overall organizational performance, as well as long-term rates of improvement
• 50 Top Cardiovascular Hospitals study: An annual study identifying hospitals that demonstrate the highest performance in hospital cardiovascular services
• 15 Top Health Systems study: A groundbreaking study that provides an objective measure of health system performance as a sum of its parts
• 100 Top Hospitals Performance Matrix: A two-dimensional analysis — available for nearly all U.S. hospitals — that provides a clear view of how long-term improvement and current performance overlap and compare with national peers
• Custom benchmark reports: A variety of reports designed to help executives understand how their performance compares with their peers within health systems, states, and markets
You can read more about these studies and see lists of all study winners by visiting
100tophospitals.com.
Truven Health Analytics, an IBM® Company
Truven Health Analytics, an IBM Company, delivers the answers that clients need to
improve healthcare quality and access while reducing costs. We provide market-leading
performance improvement solutions built on data integrity, advanced analytics,
and domain expertise. For more than 40 years, our insights and solutions have been
providing hospitals and clinicians, employers and health plans, state and federal
government agencies, life sciences companies, and policymakers the facts they need to
make confident decisions that directly affect the health and well-being of people and
organizations in the U.S. and around the world.
Truven Health Analytics owns some of the most trusted brands in healthcare, such as
MarketScan,® 100 Top Hospitals,® Advantage Suite,® Micromedex,® Simpler,® ActionOI,® and
JWA. Truven Health has its principal offices in Ann Arbor, Mich.; Chicago, Ill.; and Denver,
Colo. For more information, please visit truvenhealth.com.
Findings
Health system leaders focused on strategic performance
improvement in today’s healthcare environment must
determine how the process fits into their mission — and
then create a plan to drive consistent progress across the
entire system.
The 2016 Truven Health 15 Top Health Systems are leading the way in accomplishing
those objectives, outperforming their peers with balanced excellence across all of the
hospitals in their organization. That’s why there’s simply no better way to see how the
nation’s health and the industry’s bottom lines could be improved than by aggregating
the winner-versus-nonwinner data from our 2016 study.
Winner-Versus-Peer Results
By providing detailed performance measurement data, we show what the top
health system performers have accomplished and offer concrete goals for the entire
industry. To develop more actionable benchmarks for like systems, we divided health
systems into three comparison groups based on the total operating expense of
their member hospitals. (For more details on the comparison groups, please see the
Methodology section.)
In this section, Tables 1 through 4 detail how the winning health systems in these groups
scored on the study’s performance measures and how their performance compared with
nonwinning peers.
Below, we highlight several important differences between the winners and their peers,
and between the different health system comparison groups.
The Top Health Systems Had Better Survival Rates
• The winners had nearly 15 percent fewer in-hospital deaths than their nonwinning peers, considering patient severity (Table 1).
• Mortality results for medium-sized health systems showed the greatest difference between winners and nonwinners, with almost 24 percent fewer deaths among benchmark health systems. In addition, the medium systems had the lowest mortality index (0.77) of the three comparison groups (Tables 2–4).
The Top Health Systems Had Fewer Patient Complications
• Patients treated at the winning systems’ member hospitals had significantly fewer complications. Their rates were over 15 percent lower than at nonwinning system hospitals, considering patient severity (Table 1).
• Medium health systems had the lowest complications index (0.83) of the three comparison groups and outperformed their peers by the widest margin. Hospitals in these winning systems had 16.4 percent fewer complications than their peers (Tables 2–4).
The Top Health Systems Followed Accepted Care Protocols Somewhat
More Consistently
This year, we replaced the retired acute myocardial infarction (AMI), heart failure (HF),
pneumonia, and Surgical Care Improvement Project (SCIP) core measures with two
new groups: stroke and blood clot prevention core measures. The introduction of these
measures presents healthcare organizations with a new challenge in meeting basic standards
of care, as is evident in the somewhat lower performance on these new measures.
• Overall, the winning systems’ core measures mean percentage of 95.3 was more than 2 percentage points better than that of nonwinning peers (Table 1).
• Small winning health systems showed both the best overall core measures performance (96.5 percent) and the greatest difference between winners and nonwinners (4 percentage points) (Tables 2–4).
• Large health system winners lagged behind their peers in core measures compliance (93.3 versus 93.7 percent).
Top Health Systems Had Mixed Results on Longer-Term Outcomes
This year, we added two new patient groups to the 30-day mortality and 30-day
readmission rates. The mean 30-day mortality rate now includes AMI, HF, pneumonia,
chronic obstructive pulmonary disease (COPD), and stroke. The mean 30-day
readmission rate now includes AMI, HF, pneumonia, hip/knee arthroplasty, COPD, and
stroke. Instead of ranking and reporting the individual rates, we are now using the mean
30-day mortality and mean 30-day readmission rates.
• The mean 30-day mortality rate for winning systems, overall, was lower than for nonwinners (a 0.14 percentage point difference). The same percentage point difference can be found between large and medium health system winners and nonwinners (Table 1).
• Large system winners had the best performance on 30-day mortality (11.7 percent) (Table 2).
• Small winning systems had the best mean 30-day readmission rate across all comparison groups and also outperformed their nonwinner peers by the greatest margin — 1.32 percentage points (Tables 2–4).
Patients Treated at Hospitals in the Winning Health Systems Returned
Home Sooner
• Winning systems had a median average length of stay (LOS) of 4.5 days, more than half a day shorter than their peers’ median of 5 days (Table 1).
• The average LOS difference between winners and nonwinners was consistent across all comparison groups, with benchmark systems discharging patients one-half day sooner (Tables 2–4).
Patients Spent Less Time in the Emergency Department at Winning
Health Systems
In this year’s study, we added a new measure of process efficiency — mean emergency
department (ED) throughput (a wait time composite metric). The mean of three reported
wait time measures was used: median minutes to admission, to discharge from the ED,
and for receipt of pain medication for broken bones.
• Overall, winning systems had a 12.3 percent shorter ED wait time than nonwinners (Table 1).
• The most dramatic difference between winning systems and their peers was in the large health system comparison group. Large system winners averaged nearly 26 minutes less wait time than nonwinners (Tables 2–4).
Medicare Beneficiary Episode-of-Care Costs Were Lower for Patients Discharged
From Winning Health Systems
• Overall, winning systems had a 4.9 percent lower Medicare spending per beneficiary (MSPB) index than nonwinners (Table 1).
• Large health system winners had the best performance, with an MSPB index of 0.93, and outperformed their nonwinner peers by the widest margin — 6.0 percent.
More research needs to be done to determine which community factors and hospital
practices have the greatest influence on episode costs.
Research is also needed on how inpatient, outpatient, and home-based care impact
longer-term patient outcomes in order to define best practices for population health
management.
Patients Treated by Members of the Top Health Systems Reported a Better
Overall Hospital Experience Than Those Treated in Nonwinning Peer Hospitals
• The winners’ 2.7 percent higher median Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) score tells us that patients treated by members of the top health systems reported a better overall hospital experience than those treated in nonwinning peer hospitals (Table 1).
• Small winning systems had the best HCAHPS score (269.8) and also had the biggest lead over nonwinning peers, with an HCAHPS score that was 3.5 percent higher (Tables 2–4).
Table 1: National Health System Performance Comparisons (All Systems)

Performance Measure | Winning Benchmark Health Systems (Median) | Nonwinning Peer Group of U.S. Health Systems (Median) | Difference | Percent Difference | Comments
Mortality Index¹ | 0.86 | 1.01 | -0.15 | -14.7% | Lower Mortality
Complications Index¹ | 0.84 | 0.99 | -0.15 | -15.1% | Lower Complications
Core Measures Mean Percent² | 95.3 | 93.2 | 2.14 | n/a⁵ | Greater Care Compliance
30-Day Mortality Rate (%)³ | 11.8 | 12.0 | -0.14 | n/a⁵ | Lower 30-Day Mortality
30-Day Readmission Rate (%)³ | 15.0 | 15.7 | -0.78 | n/a⁵ | Fewer 30-Day Readmissions
Average Length of Stay (LOS) (days)¹ | 4.5 | 5.0 | -0.51 | -10.3% | Shorter Stays
Emergency Department (ED) Measure Mean Minutes⁴ | 146.0 | 166.4 | -20.41 | -12.3% | Less Time-to-Service
Medicare Spend per Beneficiary (MSPB) Index⁴ | 0.94 | 0.99 | -0.05 | -4.9% | Lower Episode Cost
Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) Score⁴ | 269.5 | 262.4 | 7.09 | 2.7% | Better Patient Experience

1. Mortality, complications, and average LOS based on present-on-admission (POA)-enabled risk models applied to MEDPAR 2013 and 2014 data (average LOS 2014 only).
2. Core measures data from CMS Hospital Compare Oct. 1, 2013–Sept. 30, 2014, dataset.
3. 30-day rates from CMS Hospital Compare July 1, 2011–June 30, 2014, dataset.
4. ED measure, MSPB, and HCAHPS data from CMS Hospital Compare Jan. 1, 2014–Dec. 31, 2014, dataset.
5. We do not calculate percent difference for this measure because it is already a percent value.
Note: Measure values are rounded for reporting, which may cause calculated differences to appear off.
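As a rough illustration of how the Difference and Percent Difference columns relate to the two medians (assuming the percent difference is taken relative to the peer-group median, which the report does not state explicitly), the Table 1 mortality row can be recomputed from the rounded values:

```python
# Illustrative only: recomputing Table 1's mortality row from the rounded medians.
# The published figures use unrounded medians, so the results differ slightly
# (the report shows -0.15 and -14.7%).
winner_median = 0.86   # winning benchmark systems, mortality index
peer_median = 1.01     # nonwinning peer group, mortality index

difference = winner_median - peer_median             # -0.15
percent_difference = 100 * difference / peer_median  # about -14.9%

print(f"difference = {difference:.2f}, percent difference = {percent_difference:.1f}%")
```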
Table 2: Large Health Systems Performance Comparisons

Performance Measure | Winning Benchmark Health Systems (Median) | Nonwinning Peer Group of U.S. Health Systems (Median) | Difference | Percent Difference | Comments
Mortality Index¹ | 0.92 | 1.00 | -0.07 | -7.5% | Lower Mortality
Complications Index¹ | 0.89 | 1.01 | -0.11 | -11.3% | Lower Complications
Core Measures Mean Percent² | 93.3 | 93.7 | -0.43 | n/a⁵ | Less Care Compliance
30-Day Mortality Rate (%)³ | 11.7 | 11.8 | -0.14 | n/a⁵ | Lower 30-Day Mortality
30-Day Readmission Rate (%)³ | 15.1 | 15.7 | -0.64 | n/a⁵ | Fewer 30-Day Readmissions
Average LOS (days)¹ | 4.5 | 5.0 | -0.50 | -10.1% | Shorter Stays
ED Measure Mean Minutes⁴ | 140.4 | 166.2 | -25.78 | -15.5% | Less Time-to-Service
MSPB Index⁴ | 0.93 | 0.99 | -0.06 | -6.0% | Lower Episode Cost
HCAHPS Score⁴ | 267.3 | 262.7 | 4.59 | 1.7% | Better Patient Experience

1. Mortality, complications, and average LOS based on present-on-admission (POA)-enabled risk models applied to MEDPAR 2013 and 2014 data (average LOS 2014 only).
2. Core measures data from CMS Hospital Compare Oct. 1, 2013–Sept. 30, 2014, dataset.
3. 30-day rates from CMS Hospital Compare July 1, 2011–June 30, 2014, dataset.
4. ED measure, MSPB, and HCAHPS data from CMS Hospital Compare Jan. 1, 2014–Dec. 31, 2014, dataset.
5. We do not calculate percent difference for this measure because it is already a percent value.
Note: Measure values are rounded for reporting, which may cause calculated differences to appear off.
Table 3: Medium Health Systems Performance Comparisons

Performance Measure | Winning Benchmark Health Systems (Median) | Nonwinning Peer Group of U.S. Health Systems (Median) | Difference | Percent Difference | Comments
Mortality Index¹ | 0.77 | 1.01 | -0.24 | -23.6% | Lower Mortality
Complications Index¹ | 0.83 | 0.99 | -0.16 | -16.4% | Lower Complications
Core Measures Mean Percent² | 95.3 | 93.5 | 1.83 | n/a⁵ | Greater Care Compliance
30-Day Mortality Rate (%)³ | 11.8 | 12.0 | -0.14 | n/a⁵ | Lower 30-Day Mortality
30-Day Readmission Rate (%)³ | 15.0 | 15.6 | -0.61 | n/a⁵ | Fewer 30-Day Readmissions
Average LOS (days)¹ | 4.5 | 5.0 | -0.47 | -9.4% | Shorter Stays
ED Measure Mean Minutes⁴ | 146.0 | 171.2 | -25.21 | -14.7% | Less Time-to-Service
MSPB Index⁴ | 0.98 | 0.99 | -0.01 | -1.4% | Lower Episode Cost
HCAHPS Score⁴ | 269.5 | 263.5 | 6.00 | 2.3% | Better Patient Experience

1. Mortality, complications, and average LOS based on present-on-admission (POA)-enabled risk models applied to MEDPAR 2013 and 2014 data (average LOS 2014 only).
2. Core measures data from CMS Hospital Compare Oct. 1, 2013–Sept. 30, 2014, dataset.
3. 30-day rates from CMS Hospital Compare July 1, 2011–June 30, 2014, dataset.
4. ED measure, MSPB, and HCAHPS data from CMS Hospital Compare Jan. 1, 2014–Dec. 31, 2014, dataset.
5. We do not calculate percent difference for this measure because it is already a percent value.
Note: Measure values are rounded for reporting, which may cause calculated differences to appear off.
Table 4: Small Health Systems Performance Comparisons

Performance Measure | Winning Benchmark Health Systems (Median) | Nonwinning Peer Group of U.S. Health Systems (Median) | Difference | Percent Difference | Comments
Mortality Index¹ | 0.88 | 1.00 | -0.13 | -12.6% | Lower Mortality
Complications Index¹ | 0.85 | 1.00 | -0.15 | -14.6% | Lower Complications
Core Measures Mean Percent² | 96.5 | 92.4 | 4.07 | n/a⁵ | Greater Care Compliance
30-Day Mortality Rate (%)³ | 12.1 | 12.1 | -0.03 | n/a⁵ | No Difference in 30-Day Mortality
30-Day Readmission Rate (%)³ | 14.5 | 15.8 | -1.32 | n/a⁵ | Fewer 30-Day Readmissions
Average LOS (days)¹ | 4.6 | 5.0 | -0.41 | -8.2% | Shorter Stays
ED Measure Mean Minutes⁴ | 153.3 | 158.8 | -5.54 | -3.5% | Less Time-to-Service
MSPB Index⁴ | 0.94 | 0.98 | -0.04 | -3.8% | Lower Episode Cost
HCAHPS Score⁴ | 269.8 | 260.7 | 9.06 | 3.5% | Better Patient Experience

1. Mortality, complications, and average LOS based on present-on-admission (POA)-enabled risk models applied to MEDPAR 2013 and 2014 data (average LOS 2014 only).
2. Core measures data from CMS Hospital Compare Oct. 1, 2013–Sept. 30, 2014, dataset.
3. 30-day rates from CMS Hospital Compare July 1, 2011–June 30, 2014, dataset.
4. ED measure, MSPB, and HCAHPS data from CMS Hospital Compare Jan. 1, 2014–Dec. 31, 2014, dataset.
5. We do not calculate percent difference for this measure because it is already a percent value.
Note: Measure values are rounded for reporting, which may cause calculated differences to appear off.
Winning Health System Results

In Table 5, we’ve provided the 15 top health systems’ scores for each of the study’s performance measures. For comparative purposes, we also repeated the group medians for all winners and nonwinners in this table. For a list of all hospitals included in each winning health system, see Appendix A.

Table 5: Winning Health Systems Performance Measure Results
(Measures reported for each winning system: Mortality Index¹, Complications Index¹, Core Measures Mean Percent², 30-Day Mortality Rate (%)³, 30-Day Readmission Rate (%)³, Average LOS¹, ED Measure Mean Minutes⁴, MSPB Index⁴, and HCAHPS Score⁴.)

Large Health Systems: Mayo Foundation, Mercy, Spectrum Health, Sutter Health, Sutter Health Valley Division
Medium Health Systems: Kettering Health Network, Scripps Health, St. Luke's Health System, St. Vincent Health, TriHealth
Small Health Systems: Asante, Lovelace Health System, MidMichigan Health, Roper St. Francis Healthcare, Tanner Health System

1. Mortality, complications, and average LOS based on present-on-admission (POA)-enabled risk models applied to MEDPAR 2013 and 2014 data (average LOS 2014 only).
2. Core measures data from CMS Hospital Compare Oct. 1, 2013–Sept. 30, 2014, dataset.
3. 30-day rates from CMS Hospital Compare July 1, 2011–June 30, 2014, dataset.
4. ED measure, MSPB, and HCAHPS data from CMS Hospital Compare Jan. 1, 2014–Dec. 31, 2014, dataset.
Note: Based on national norms, ratings greater than 1.0 indicate more adverse events than expected; ratings less than 1.0 indicate fewer.
Top and Bottom Quintile Results
To provide more significant comparisons, we divided all of the health systems in this
study into performance quintiles, by comparison group, based on their performance on
the study’s measures. In Table 6, we’ve highlighted differences between the highest- and
lowest-performing quintiles by providing their median scores on study performance
measures. See Appendix B for a list of the health systems included in the top-performance quintile and Appendix D for all systems included in the study.
Following are several highlights of how the top quintile systems outperformed their
lowest quintile peers:
• They had much better patient outcomes — lower mortality and complications rates.
• They followed accepted care protocols for core measures more closely.
• They had lower mean 30-day mortality rates (includes AMI, HF, pneumonia, COPD, and stroke patients).
• They had lower mean 30-day readmission rates (includes AMI, HF, pneumonia, hip/knee arthroplasty, COPD, and stroke patients).
• They had much lower mean ED wait times, with an average difference of almost 45 minutes.
• They were more efficient, releasing patients more than three-quarters of a day sooner than the lowest performers and at a 6.6-percent lower episode cost (MSPB).
• They scored over 12 points higher on the HCAHPS overall patient rating of care.
Table 6: Comparison of Health Systems in the Top and Bottom Quintiles of Performance¹

Performance Measure | Top Quintile Median | Bottom Quintile Median | Difference | Percent Difference | Top Vs. Bottom Quintile
Mortality Index² | 0.89 | 1.09 | -0.19 | 17.9% | Lower Mortality
Complications Index² | 0.92 | 1.06 | -0.14 | 13.1% | Lower Complications
Core Measures Mean Percent³ | 94.55 | 90.60 | 3.95 | n/a⁶ | Greater Care Compliance
30-Day Mortality Rate (%)⁴ | 12.04 | 12.18 | -0.14 | n/a⁶ | Lower 30-Day Mortality
30-Day Readmission Rate (%)⁴ | 15.08 | 16.01 | -0.92 | n/a⁶ | Fewer 30-Day Readmissions
Average LOS (days)² | 4.53 | 5.34 | -0.80 | 15.1% | Shorter Stays
ED Measure Mean Minutes⁵ | 145.95 | 188.94 | -42.98 | 22.7% | Less Time-to-Service
MSPB Index⁵ | 0.95 | 1.01 | -0.07 | 6.6% | Lower Episode Cost
HCAHPS Score⁵ | 268.24 | 255.87 | 12.36 | 4.8% | Better Patient Experience

1. Top and bottom performance quintiles were determined by comparison group and aggregated to calculate medians.
2. Mortality, complications, and average LOS based on present-on-admission (POA)-enabled risk models applied to MEDPAR 2013 and 2014 data (average LOS 2014 only).
3. Core measures data from CMS Hospital Compare Oct. 1, 2013–Sept. 30, 2014, dataset.
4. 30-day rates from CMS Hospital Compare July 1, 2011–June 30, 2014, dataset.
5. ED measure, MSPB, and HCAHPS data from CMS Hospital Compare Jan. 1, 2014–Dec. 31, 2014, dataset.
6. We do not calculate percent difference for this measure because it is already a percent value.
Note: Measure values are rounded for reporting, which may cause calculated differences to appear off.
Performance Measures Under Consideration
As the healthcare industry changes, our study methods continue to evolve. Currently,
we are analyzing several new performance measures for information only; they are not
yet being used to rank health system performance on our Truven Health 15 Top Health
Systems balanced scorecard. We are now profiling performance on the new Healthcare-Associated Infection (HAI) measures, as well as on additional measures that move outside
the inpatient acute care setting, to look at new, extended outcomes metrics, 30-day
episode payment metrics, and the financial health of the system.
The new performance measures under consideration are:
• Six HAI measures: central line-associated bloodstream infection (CLABSI), catheter-associated urinary tract infection (CAUTI), surgical site infection (SSI) for colon surgery and abdominal hysterectomy, methicillin-resistant staphylococcus aureus (MRSA-bloodstream), and clostridium difficile (C. diff-intestinal)
• Coronary artery bypass graft (CABG) 30-day mortality and 30-day readmission rates
• Hospital-wide, all-cause 30-day readmission rate
• Two measures of system financial performance: operating margin and long-term debt-to-capitalization ratio (LTD/Cap)
HAI Measures
The HAIs reported by the Centers for Medicare & Medicaid Services (CMS) in the public
Hospital Compare dataset capture important new information about the quality of
inpatient care. Data are reported for calendar year 2014. Tracking and intervening to
reduce these infection rates is becoming an important focus in hospitals. New public
data will allow the development of national benchmarks for use by hospital leadership to
effect change. (See Table 7.)
New 30-Day Mortality and Readmission Measures
We are also profiling hospitals on the new 30-day mortality and readmission rates for
CABG surgical patients, as well as the hospital-wide, all-cause 30-day readmission rate.
These rates are publicly reported by CMS in the Hospital Compare dataset. The data
period for the CABG rates is the same as for the other 30-day metrics — July 1, 2011,
through June 30, 2014. The data period for the hospital-wide readmission measure is
July 1, 2013, through June 30, 2014 (Table 7).
New 30-Day Episode-of-Care Payment Measures
Risk-standardized payments associated with 30-day episode-of-care measures for
three patient groups have recently been published by CMS in the Hospital Compare
dataset. These new measures capture differences in services and supply costs provided
to patients who have been diagnosed with AMI, HF, or pneumonia. According to the
CMS definition of these new measures, they are the sum of payments made for care and
supplies beginning the day the patient enters the hospital and for the next 30 days.
The data period for these measures is the same as for the other 30-day metrics for
specific patient conditions — July 1, 2011, through June 30, 2014 (Table 7).
Table 7: Information-Only Measures — Health System Performance Comparisons (All Classes)

Performance Measure | Benchmark Health Systems (Median) | Peer Group of U.S. Health Systems (Median) | Difference | Percent Difference | Comments
30-Day Coronary Artery Bypass Graft (CABG) Mortality Rate¹ | 2.6 | 3.1 | -0.5 | n/a⁴ | Lower 30-Day Mortality
30-Day CABG Readmission Rate¹ | 14.1 | 14.8 | -0.7 | n/a⁴ | Fewer 30-Day Readmissions
30-Day Hospital-Wide Readmission Rate² | 14.7 | 15.3 | -0.6 | n/a⁴ | Fewer 30-Day Readmissions
Central Line-Associated Blood Stream Infection (CLABSI)³ | 0.37 | 0.42 | -0.1 | -12.3% | Fewer Infections
Catheter-Associated Urinary Tract Infection (CAUTI)³ | 0.89 | 1.15 | -0.3 | -22.5% | Fewer Infections
Surgical Site Infection (SSI) From Colon Surgery³ | 1.01 | 0.94 | 0.1 | 7.4% | More Infections
SSI From Abdominal Hysterectomy³ | 0.89 | 0.84 | 0.1 | 6.3% | More Infections
Methicillin-Resistant Staphylococcus Aureus (MRSA-Bloodstream)³ | 0.61 | 0.84 | -0.2 | -27.6% | Fewer Infections
Clostridium Difficile (C. diff-Intestinal)³ | 0.87 | 0.88 | 0.0 | -1.0% | Fewer Infections
Acute Myocardial Infarction (AMI) 30-Day Episode Payment¹ | $21,890 | $22,369 | -479.4 | -2.1% | Lower Episode Cost
Heart Failure (HF) 30-Day Episode Payment¹ | $15,394 | $15,789 | -395.0 | -2.5% | Lower Episode Cost
Pneumonia 30-Day Episode Payment¹ | $13,896 | $14,622 | -726.7 | -5.0% | Lower Episode Cost

1. 30-day CABG mortality, 30-day CABG readmission, and 30-day episode payment metrics from CMS Hospital Compare July 1, 2011–June 30, 2014, dataset.
2. 30-day hospital-wide readmission data from CMS Hospital Compare July 1, 2013–June 30, 2014, dataset.
3. HAI measures from CMS Hospital Compare Oct. 1, 2013–Sept. 30, 2014, dataset.
4. We do not calculate percent difference for this measure because it is already a percent value.
Financial Metrics
We are again publishing health system financial measures this year, but they continue to
be for information only, as audited financial statements are not available for all systems
included in the study.* These measures were not included in the ranking and selection
of benchmark health systems. However, the available data show that benchmark health system
performance on operating margin is better than that of nonwinning peers across all comparison
groups. Results for LTD/Cap do not show this difference in performance. (See Table 8.)
Table 8: Information-Only Measures — Financial Performance

Performance Measure | Health System Comparison Group | Benchmark Health Systems (Median) | Peer Group of U.S. Health Systems (Median) | Difference (Benchmark Compared With Peer Group)
Operating Margin (Percentage) | All Systems | 3.8 | 3.3 | 0.6
Operating Margin (Percentage) | Large | 3.8 | 3.7 | 0.1
Operating Margin (Percentage) | Medium | 6.0 | 3.4 | 2.6
Operating Margin (Percentage) | Small | 3.6 | 2.3 | 1.3
Long-Term Debt (LTD)-to-Capitalization (Cap) Ratio | All Systems | 0.3 | 0.3 | -0.1
Long-Term Debt (LTD)-to-Capitalization (Cap) Ratio | Large | 0.3 | 0.3 | 0.0
Long-Term Debt (LTD)-to-Capitalization (Cap) Ratio | Medium | 0.4 | 0.3 | 0.0
Long-Term Debt (LTD)-to-Capitalization (Cap) Ratio | Small | 0.2 | 0.4 | -0.2

Note: Data sourced from audited financial reports, 2014 (dacbond.com; emma.msrb.org; http://yahoo.brand.edgar-online.com; sec.gov).
* 62 percent of all systems, and 80 percent of parent and independent systems, had audited financial statements available. Most subsystems that are members of larger “parent” health systems do not have separate audited financial statements.
New Measure of “Systemness” — The Performance-Weighted Alignment Score
For several years, we have reported a health system alignment score that measured
how tightly member hospitals were grouped on rate of improvement and performance
achievement. The measure did not identify whether the performance was good or
bad. This year, we have enhanced the methodology to take the level of performance
into account.
This new methodology allows health systems, for the first time, to determine whether
a system “brand” is consistently delivering the same level of value in each community
served. This new measure was not used in the selection of the winning systems. However,
it can bring focus to leadership goal-setting and contribute to the development of more
closely aligned and better-performing systems.
Methodology
Each system’s performance-weighted alignment score is the average of the distance of
each member hospital from the system’s central point (the centroid — a measure of alignment)
and the distance of each of those hospitals from the 100th percentile point (the perfect
point — measure of performance), weighted by the distance from the perfect point. (See
Figure 1.) A score is calculated for overall performance and for each individual measure.
The system performance-weighted alignment scores are ranked by comparison group
and reported as rank percentiles. Higher percentiles mean better performance.
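A minimal sketch of this calculation, assuming each member hospital is represented by its (performance, improvement) rank-percentile pair on the 0–100 scale shown in Figure 1; the exact weighting formula is not published, so this is only one plausible reading of the description above:

```python
import math

def performance_weighted_alignment(hospitals):
    """Sketch of a performance-weighted alignment score for one health system.

    `hospitals` is a list of (performance, improvement) percentile pairs, one per
    member hospital, on the 0-100 scale of Figure 1. The production formula is
    proprietary; this simply averages each hospital's distance from the system
    centroid (alignment) and from the perfect point (performance), weighting
    hospitals by their distance from the perfect point.
    """
    perfect_x, perfect_y = 100.0, 100.0
    cx = sum(p for p, _ in hospitals) / len(hospitals)   # centroid, performance axis
    cy = sum(i for _, i in hospitals) / len(hospitals)   # centroid, improvement axis

    weighted_sum, weight_total = 0.0, 0.0
    for p, i in hospitals:
        d_centroid = math.hypot(p - cx, i - cy)               # how far from the pack
        d_perfect = math.hypot(perfect_x - p, perfect_y - i)  # how far from perfect
        weight = d_perfect
        weighted_sum += weight * (d_centroid + d_perfect) / 2.0
        weight_total += weight
    return weighted_sum / weight_total if weight_total else 0.0

# Lower raw scores mean member hospitals are both tightly clustered and close to the
# perfect point; systems would then be ranked within their comparison group and
# reported as rank percentiles, where higher percentiles are better.
print(round(performance_weighted_alignment([(80, 70), (85, 75), (60, 90)]), 1))
```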
The profiled system performance is compared to the median alignment scores for
the systems that were in the top quintile on both Performance and Improvement (Top
P & I Group). This group of systems was selected using the study ranked metrics, not
member hospital alignment. We find that high alignment has not yet been achieved
uniformly across all measures, even in this high-performing group.
Figure 1: Performance-Weighted Alignment Scoring. Member hospitals are plotted by 2014 Performance (x-axis, 0–100) against 2010–2014 Rate of Improvement (y-axis, 0–100), with the system centroid and the perfect point (100, 100) marked.
Methodology
The Truven Health 15 Top Health Systems study
is the most recent addition to the Truven Health
100 Top Hospitals® program initiatives. It is a
quantitative study that identifies 15 U.S. health systems
with the highest overall achievement on a balanced
scorecard. The scorecard is based on the 100 Top
Hospitals national balanced scorecard methodologies
and focuses on five performance domains: inpatient
outcomes, process of care, extended outcomes,
efficiency, and patient experience.
This 2016 health systems study includes nine measures that provide a valid comparison
of health system performance using publicly available data. The health systems with the
highest achievement are those with the highest ranking on a composite score based on
these nine measures. To analyze health system performance, we include data for short-term, acute care, nonfederal U.S. hospitals, as well as cardiac, orthopedic, women’s, and
critical access hospitals (CAHs) that are members of the health systems.
The main steps we take in selecting the top 15 health systems are:
1. Building the database of health systems, including special selection and
exclusion criteria
2. Identifying which hospitals are members of health systems
3. Aggregating the patient-level data from member hospitals and calculating a set of
performance measures at the system level
4. Classifying health systems into comparison groups based on total operating expense
5. Ranking systems on each of the performance measures by comparison group
6. Determining the 15 top performers — five in each comparison group — from the health
systems’ overall rankings based on their aggregate performance (sum of individual
weighted measure ranks)
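The selection logic in steps 4 through 6 can be sketched as follows; the measure names, directions, and equal weights here are illustrative placeholders rather than the study's actual configuration:

```python
from collections import defaultdict

# Placeholder configuration: which direction is better and each measure's weight.
LOWER_IS_BETTER = {"mortality_index": True, "complications_index": True, "hcahps_score": False}
WEIGHTS = {"mortality_index": 1.0, "complications_index": 1.0, "hcahps_score": 1.0}

def pick_winners(systems, winners_per_group=5):
    """`systems` is a list of dicts, each with a 'group' key and one key per measure."""
    by_group = defaultdict(list)
    for s in systems:
        by_group[s["group"]].append(s)

    winners = []
    for group_systems in by_group.values():
        # Step 5: rank each measure within the comparison group (rank 1 = best).
        for measure, lower_better in LOWER_IS_BETTER.items():
            ordered = sorted(group_systems, key=lambda s: s[measure], reverse=not lower_better)
            for rank, s in enumerate(ordered, start=1):
                s[measure + "_rank"] = rank
        # Step 6: aggregate performance = sum of weighted measure ranks; lowest sum wins.
        for s in group_systems:
            s["score"] = sum(s[m + "_rank"] * w for m, w in WEIGHTS.items())
        winners.extend(sorted(group_systems, key=lambda s: s["score"])[:winners_per_group])
    return winners
```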
The following section is intended to be an overview of these steps. To request more
detailed information on any of the study methodologies outlined here, please email us at
100tophospitals@truvenhealth.com or call 1.800.366.7526.
Building the Database of Health Systems
Like all the 100 Top Hospitals studies, the 15 Top Health Systems study uses only publicly
available data. The data for this study primarily come from:
• The Medicare Provider Analysis and Review (MEDPAR) dataset
• The Centers for Medicare & Medicaid Services (CMS) Hospital Compare dataset
We used MEDPAR patient-level demographic, diagnosis, and procedure information to
calculate mortality, complications, and length of stay (LOS) by aggregating member
hospital data to the health system level. The MEDPAR dataset contains information on
the approximately 15 million Medicare patients discharged annually from U.S. acute
care hospitals. In this year’s study, we used the most recent two federal fiscal years of
MEDPAR data available (2013 and 2014), which include Medicare health maintenance
organization (HMO) encounters.25
We used the CMS Hospital Compare dataset published in the second and third quarters
of 2015 for core measures, 30-day mortality rates, 30-day readmission rates, the
Medicare spending per beneficiary (MSPB) index, and Hospital Consumer Assessment of
Healthcare Providers and Systems (HCAHPS) patient perception of care data.26
We also used the 2014 Medicare cost reports, published in the federal Hospital Cost
Report Information System (HCRIS) third quarter 2015 dataset, to create our proprietary
database for determining system membership based on “home office” or “related
organization” relationships reported by hospitals. The cost reports were also used to
aggregate member hospital total operating expense to the system level. This data was
used to classify health systems into three comparison groups.
We, and many others in the healthcare industry, have used these public data sources
for many years. We believe them to be accurate and reliable sources for the types of
analyses performed in this study. Performance based on Medicare data has been found to
be highly representative of all-payer data.
Severity-Adjustment Models and Present-on-Admission Data
Truven Health proprietary risk- and severity-adjustment models for mortality,
complications, and LOS have been recalibrated in this release using federal fiscal year
(FFY) 2013 data available in the Truven Health all-payer Projected Inpatient Data Base
(PIDB). The PIDB is one of the largest U.S. inpatient, all-payer databases of its kind,
containing approximately 23 million inpatient discharges annually, obtained from
approximately 3,700 hospitals, which comprise more than 65 percent of the nonfederal
U.S. market.
Truven Health risk- and severity-adjustment models take advantage of available present-on-admission (POA) coding that is reported in all-payer data. Only patient conditions that
are present on admission are used to determine the probability of death, complications,
or the expected LOS.
The recalibrated severity-adjustment models were used in producing the risk-adjusted
mortality and complications indexes, based on two years of MEDPAR data (2013 and
2014). The severity-adjusted LOS was based on MEDPAR 2014 data.
Hospital Exclusions
After building the database, we excluded a number of hospitals that would have skewed
the study results. Excluded from the study were:
• Certain specialty hospitals (e.g., children’s, psychiatric, substance abuse, rehabilitation, cancer, and long-term care)
• Federally owned hospitals
• Non-U.S. hospitals (such as those in Puerto Rico, Guam, and the U.S. Virgin Islands)
• Hospitals with Medicare average LOS longer than 30 days in FFY 2014
• Hospitals with no reported Medicare patient deaths in FFY 2014
• Hospitals that had fewer than 60 percent of patient records with valid POA codes
Cardiac, orthopedic, women’s hospitals, and CAHs were included in the study, as long
as they were not excluded for any other criteria listed above, with the exception of no
reported Medicare deaths.
In addition, specific patient records were also excluded:
• Patients who were discharged to another short-term facility (this is done to avoid double counting)
• Patients who were not at least 65 years old
• Rehabilitation, psychiatric, and substance abuse patients
• Patients with stays shorter than one day
After all exclusions were applied, 2,912 individual hospitals were included in the 2016 study.
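A minimal sketch of the patient-record exclusions listed above; the record field names are assumptions for illustration and do not match the actual MEDPAR layout:

```python
EXCLUDED_SERVICE_CATEGORIES = {"rehabilitation", "psychiatric", "substance_abuse"}

def keep_patient_record(rec: dict) -> bool:
    """Return True if the discharge record should remain in the analysis."""
    if rec["discharged_to_short_term_facility"]:
        return False    # avoid double counting transferred patients
    if rec["age"] < 65:
        return False    # study population is Medicare patients 65 and older
    if rec["service_category"] in EXCLUDED_SERVICE_CATEGORIES:
        return False
    if rec["length_of_stay_days"] < 1:
        return False    # stays shorter than one day are excluded
    return True
```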
Health System Exclusions
Health systems were excluded if one or more measures were missing, with the exception
of the 30-day mortality rates, 30-day readmission rates, and individual core measures.
A health system was excluded only if all 30-day mortality rates, all 30-day readmission
rates, or all individual core measures were missing. In systems where one or more
individual 30-day rates or core measures were missing, we calculated a median health
system value for each, by comparison group, and substituted the median in any case
where a health system had no data for that measure.
In addition, since CMS does not publish MSPB index values for hospitals in Maryland due
to a separate payment agreement, we also substituted the median value for this measure.
This allowed us to keep health systems in the study that were unavoidably missing these
data due to low eligible patient counts or no CMS published data. Systems with missing
MSPB index values were not eligible to be winners.
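A sketch of the median substitution described above, assuming each system is a dict keyed by measure name; the names here are illustrative:

```python
from statistics import median

def fill_missing_with_group_median(systems, measure):
    """Substitute the comparison-group median for systems missing `measure`."""
    values_by_group = {}
    for s in systems:
        if s.get(measure) is not None:
            values_by_group.setdefault(s["group"], []).append(s[measure])
    group_medians = {group: median(values) for group, values in values_by_group.items()}

    for s in systems:
        if s.get(measure) is None:
            s[measure] = group_medians[s["group"]]
```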
Measures and Methods Changes
POA Coding Adjustments
From 2009 through 2014, we have observed a significant rise in the number of principal
diagnosis (PDX) and secondary diagnosis (SDX) codes that do not have valid POA codes
in the MEDPAR data files. Since 2011, a code of “0” has been appearing. The percentage of
diagnosis codes with POA code 0 is displayed in Table 9 below. This phenomenon has
led to an artificial rise in the number of conditions that appear to be occurring during the
hospital stay.
Table 9: Trend in Diagnosis Codes With POA Code 0

Diagnosis Type | 2010 | 2011 | 2012 | 2013 | 2014
PDX | 0.00% | 4.26% | 4.68% | 4.37% | 3.40%
SDX | 0.00% | 15.05% | 19.74% | 22.10% | 21.58%
To correct for this bias, we adjusted MEDPAR record processing through our mortality,
complications, and LOS risk models as follows:
1. We treated all diagnosis codes on the CMS exempt list as “exempt,” regardless of
POA coding.
2. We treated all principal diagnoses as present on admission.
3. We treated secondary diagnoses, where POA code Y or W appeared more than 50
percent of the time in the Truven Health all-payer database, as present on admission.
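One reading of these three adjustments, sketched as a decision function; the exempt-code list and the all-payer Y/W frequency table are assumed inputs, and the handling of explicitly coded values is an interpretation rather than the documented rule:

```python
def treat_as_present_on_admission(dx_code, poa_code, is_principal,
                                  exempt_codes, yw_share_by_code):
    """Decide whether a diagnosis should be handled as present on admission."""
    if dx_code in exempt_codes:
        return True     # adjustment 1: CMS POA-exempt codes, regardless of POA coding
    if is_principal:
        return True     # adjustment 2: all principal diagnoses treated as POA
    if poa_code in ("Y", "W"):
        return True     # valid coding already marks the condition as POA
    if poa_code in ("0", "", None):
        # adjustment 3: secondary diagnoses lacking a valid POA code are treated as
        # POA when the code is reported as Y or W more than half the time in the
        # all-payer reference data.
        return yw_share_by_code.get(dx_code, 0.0) > 0.5
    return False        # explicitly coded as not present on admission
```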
Patient Safety Indicator Removed From Study
This year, we removed the Agency for Healthcare Research and Quality (AHRQ) patient
safety metrics from the 15 Top Health Systems study for further evaluation due to
concerns that many of the metrics reflect inaccurate coding and documentation rather
than adverse patient safety incidents. The concerns were first raised by the Institute of
Medicine and more recently amplified in peer-reviewed journals in regard to the Patient
Safety Composite (Patient Safety Indicator [PSI] 90) and some of the included PSIs.
Because of the importance of patient safety, we will complete a thorough analysis of the
patient safety metrics to ensure that national comparisons are accurate and actionable.
The results will be published in our study next year.
Do Not Resuscitate Exclusion Added
The Truven Health mortality risk model now excludes records with “do not resuscitate”
(DNR, v49.86) orders that are coded as POA. Excluding records that have DNR status
at admission removes these high-probability-of-death patients from the analysis and
allows hospitals to concentrate more fully on events that could lead to deaths during the
hospitalization.
Note: Truven Health asked AHRQ to clarify how the PSI models treated POA coding. The response included the
following statement: “A condition is considered to be POA only if the diagnosis is listed with a POA code of Y or W.
A code of N,U,E,1,X or Missing is considered not POA.”
Identifying Health Systems
To be included in the study, a health system must contain at least two short-term, general,
acute care hospitals, as identified using the 100 Top Hospitals specialty algorithm,
after hospital exclusions have been applied. In addition, we also included any cardiac,
orthopedic, women’s hospitals, and CAHs that passed the exclusion rules cited above.
We identified the parent system by finding the home office or related organization, as
reported on the hospitals’ 2014 (or 2013) Medicare cost report.
We identified health systems that have subsystems with their own reported home
offices or related organization relationships. Both the parent system and any identified
subsystems were treated as “health systems” for purposes of this study and were
independently profiled. Hospitals that belong to a parent health system and a subsystem
were included in both for analysis.
To analyze health system performance, we aggregated data from all of a system’s
included hospitals. Below, we provide specific details about the calculations used for
each performance measure and how these measures are aggregated to determine
system performance.
After all exclusions were applied and parent systems identified, the final 2016 study group
included 338 health systems with the following profile (Table 10).
Table 10: 2016 Health Systems Study Group

System Category | Systems | Member Hospitals | Medicare Patient Discharges, 2014 | Average Hospitals per System | Average Patient Discharges per System
Benchmark Systems | 15 | 147 | 345,951 | 9.8 | 23,063
Peer Systems | 323 | 2,765 | 8,714,167 | 8.6 | 26,979
Total Systems | 338 | 2,912 | 9,060,118 | 8.6 | 26,805
Classifying Health Systems Into Comparison Groups
Health System Comparison Groups
We refined the analysis of health systems by dividing them into three comparison groups
based on the total operating expense of the member hospitals. This was done to develop
more actionable benchmarks for like systems. The three comparison groups we used are
shown in Table 11.
Table 11: Three Health System Comparison Groups, Defined

Health System Comparison Group | Total Operating Expense | Number of Systems in Study | Number of Winners
Large | > $1.75 billion | 101 | 5
Medium | $750 million – $1.75 billion | 114 | 5
Small | < $750 million | 123 | 5
Total Systems | n/a | 338 | 15
Scoring Health Systems on Weighted Performance Measures
Evolution of Performance Measures
We used a balanced scorecard approach, based on public data, to select the measures
most useful for boards and CEOs in the current healthcare operating environment. We
gather feedback from industry leaders, hospital and health system executives, academic
leaders, and internal experts; review trends in the healthcare industry; and survey
hospitals in demanding marketplaces to learn what measures are valid and reflective of
top performance.
As the industry has changed, our methods have evolved. Our current measures are
centered on five main components of hospital performance: inpatient outcomes, process
of care, extended outcomes, efficiency, and patient experience.
The 2016 Health Systems Study Performance Measures
The measures ranked in the 2016 study are:
Inpatient Outcomes
1. Risk-adjusted inpatient mortality index
2. Risk-adjusted complications index
Process of Care
3. Core measures mean percent (stroke care and blood clot prevention)
Extended Outcomes
4. Mean 30-day risk-adjusted mortality rate (acute myocardial infarction [AMI], heart
failure [HF], pneumonia, chronic obstructive pulmonary disease [COPD], and stroke)
5. Mean 30-day risk-adjusted readmission rate (AMI, HF, pneumonia, hip/knee
arthroplasty, COPD, and stroke)
Efficiency
6. Severity-adjusted average length of stay (LOS)
7. Mean emergency department (ED) throughput (wait time minutes)
8. Medicare spend per beneficiary (MSPB) index
Patient Experience
9. Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) score
(patient rating of overall hospital performance)
The data sources for these measures are:
Table 12: Summary of Measure Data Sources and Data Periods

Performance Measure | Data Source/Data Period
Risk-Adjusted Mortality Index (Inpatient) | MEDPAR FFY 2013 and 2014
Risk-Adjusted Complications Index | MEDPAR FFY 2013 and 2014
Core Measures Mean Percent (Stroke Care, Blood Clot Prevention) | CMS Hospital Compare Oct. 1, 2013–Sept. 30, 2014
Mean 30-Day Mortality Rate (AMI, HF, Pneumonia, COPD, Stroke) | CMS Hospital Compare July 1, 2011–June 30, 2014
Mean 30-Day Readmission Rate (AMI, HF, Pneumonia, Hip/Knee Arthroplasty, COPD, Stroke) | CMS Hospital Compare July 1, 2011–June 30, 2014
Severity-Adjusted Average Length of Stay | MEDPAR FFY 2014
Mean Emergency Department (ED) Throughput Measure | CMS Hospital Compare Calendar Year 2014
Medicare Spend per Beneficiary (MSPB) Index | CMS Hospital Compare Calendar Year 2014
HCAHPS Score (Patient Overall Hospital Rating) | CMS Hospital Compare Calendar Year 2014
Below, we provide a rationale for the selection of our balanced scorecard categories and
the measures used for each.
Inpatient Outcomes
We included two measures of inpatient outcomes — risk-adjusted mortality index and
risk-adjusted complications index. These measures show us how the health system is
performing on the most basic and essential care standards — survival and error-free care
— while treating patients in the hospital.
Process of Care
We included two groups of core measures — stroke care and blood clot prevention. These
measures were developed by The Joint Commission (TJC) and CMS, and endorsed by
the National Quality Forum (NQF), as basic, minimum process-of-care standards. These
measures have included specific guidelines for a wide variety of patient conditions, and
as compliance has grown, CMS has retired many and replaced them with new ones. Our
core measures score was based on the stroke care and blood clot prevention measures,
using Hospital Compare data reported on the CMS website.26 In this study, we included
core measures that CMS mandated for reporting in 2014. See Appendix C for a list.
Extended Outcomes
The extended outcomes measures — 30-day mortality rates for AMI, HF, pneumonia,
COPD, and stroke patients and 30-day readmission rates for AMI, HF, pneumonia, hip/
knee arthroplasty, COPD, and stroke patients — help us understand how the health
system’s patients are faring over a longer period. These measures are part of the CMS
Value-Based Purchasing Program and are watched closely in the industry. Health systems
with lower values appear to be providing or coordinating a continuum of care with better
medium-term results for these conditions.
Efficiency
The efficiency domain included severity-adjusted average LOS, the mean ED throughput
measure, and the MSPB index. Average LOS serves as a proxy for clinical efficiency in an
inpatient setting, while the ED throughput measures focus on process efficiency in one of
the most important access points to hospital care. For ED throughput, we used the mean
of the reported median minutes for three critical processes: time from door to admission,
time from door to discharge for non-admitted patients, and time-to-receipt of pain
medications for broken bones.
Average LOS requires adjustment to increase the validity of comparisons across the hospital industry. We used a Truven
Health proprietary, severity-adjustment model to determine expected LOS at the patient level. Patient-level observed
and expected LOS values were used to calculate the system-level, severity-adjusted, average LOS.
The MSPB index is a new proxy for continuum-of-care performance recently added to the study. This measure, as
defined and calculated by CMS, is the ratio of Medicare spending per beneficiary treated in a specific hospital and the
median Medicare spending per beneficiary, nationally. It includes Medicare Part A and Part B payments three days prior
to the hospital stay, during the stay, and 30 days post-discharge. We believe this indicator can be a beginning point for
understanding hospital and local area cost performance relative to hospital peer markets.
Patient Experience
We believe that a measure of patient perception of care — the patient experience — is crucial to the balanced scorecard
concept. Understanding how patients perceive the care a health system provides within its member hospitals, and how
that perception compares and contrasts with perceptions of its peers, is important if a health system is to improve
performance. For this reason, we calculated an HCAHPS score, based on patient perception of care data from the
HCAHPS patient survey. In this study, the HCAHPS score is based on the HCAHPS overall hospital rating question only.
Through the use of the combined measures described above, we provide a balanced picture of health system
performance. Full details about each of these performance measures are included on the following pages.
Performance Measure Details
Risk-Adjusted Mortality Index (Inpatient)
Why We Include This Element: Patient survival is a universally accepted measure of hospital quality. The lower the mortality index, the greater the survival of the patients in the system's hospitals, considering what would be expected based on patient characteristics. While all hospitals have patient deaths, this measure can show where deaths did not occur but were expected, or the reverse, given the patient's condition.

Calculation: We calculate a mortality index value based on the aggregate number of actual in-hospital deaths for all member hospitals in each system, divided by the number of normalized expected deaths, given the risk of death for each patient. Expected deaths are derived by processing MEDPAR patient record data through our proprietary mortality risk model, which is designed to predict the likelihood of a patient's death based on patient-level characteristics (age, sex, presence of complicating diagnoses, and other characteristics). We normalize the expected values using the observed-to-expected ratio for in-study health systems, by comparison group. We rank systems, by comparison group, on the difference between observed and expected deaths, expressed in normalized standard deviation units (z-score).27, 28 Health systems with the fewest deaths, relative to the number expected, after accounting for standard binomial variability, receive the most favorable scores. We use two years of MEDPAR data to reduce the influence of chance fluctuation.

Comment: The mortality risk model takes into account present-on-admission (POA) coding in determining expected deaths. Palliative care patients (v66.7) are included in the risk model; DNR patients (v49.86) are excluded. The reference value for this index is 1.00; a value of 1.15 indicates 15 percent more deaths occurred than were predicted, and a value of 0.85 indicates 15 percent fewer deaths than predicted. The MEDPAR dataset includes both Medicare fee-for-service claims and Medicare Advantage (HMO) encounter records. Systems with observed values statistically worse than expected (99-percent confidence), and whose values are above the high trim point, are not eligible to be named benchmark health systems.

Favorable Values Are: Lower
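As a concrete illustration of the aggregation just described, the following minimal sketch computes a normalized system-level mortality index and a z-score from hospital-level observed and expected deaths. It is not the Truven Health risk model: the hospital counts are invented, and the normalization factor and the square-root-of-expected z-score formula are simplifying assumptions.

```python
# Illustrative sketch only: aggregate hospital-level observed/expected deaths into a
# system-level, normalized mortality index and a z-score ranking statistic.
# The expected counts would come from a risk model; all values here are invented.

import math

def mortality_index_and_z(hospitals, group_norm_factor=1.0):
    """hospitals: list of (observed_deaths, expected_deaths) for one system.

    group_norm_factor: observed-to-expected ratio across all in-study systems in the
    comparison group, used to normalize expected deaths (assumed form).
    """
    observed = sum(obs for obs, _ in hospitals)
    expected = group_norm_factor * sum(exp for _, exp in hospitals)
    index = observed / expected                      # 1.00 = as many deaths as expected
    # Approximate binomial variability of the expected count (simplifying assumption).
    z = (observed - expected) / math.sqrt(expected)
    return index, z

# Example: a three-hospital system with slightly fewer deaths than expected.
system = [(40, 45.2), (12, 14.8), (55, 57.0)]
idx, z = mortality_index_and_z(system, group_norm_factor=0.98)
print(f"mortality index = {idx:.2f}, z-score = {z:.2f}")  # index < 1.0 is favorable
```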
Risk-Adjusted Complications Index
Why We Include This Element: Keeping patients free from potentially avoidable complications is an important goal for all healthcare providers. A lower complications index indicates fewer patients with complications, considering what would be expected based on patient characteristics. Like the mortality index, this measure can show where complications did not occur but were expected, or the reverse, given the patient's condition.

Calculation: We calculate a complications index value based on the aggregate number of cases with observed complications for all member hospitals in each system, divided by the number of normalized expected complications, given the risk of complications for each patient. Expected complications are derived by processing MEDPAR patient record data through our proprietary complications risk model, which is designed to predict the likelihood of complications during hospitalization. This model accounts for patient-level characteristics (age, sex, principal diagnosis, comorbid conditions, and other characteristics). We normalize the expected values using the observed-to-expected ratio for in-study health systems, by comparison group. Complication rates are calculated from normative data for two patient risk groups: medical and surgical. We rank systems on the difference between the observed and expected number of patients with complications, expressed in normalized standard deviation units (z-score). We use two years of MEDPAR data to reduce the influence of chance fluctuation.

Comment: The complications risk model takes into account POA coding in determining expected complications; conditions that are present on admission are not counted as observed complications. The reference value for this index is 1.00; a value of 1.15 indicates 15 percent more complications occurred than were predicted, and a value of 0.85 indicates 15 percent fewer complications than predicted. The MEDPAR dataset includes both Medicare fee-for-service claims and Medicare Advantage (HMO) encounter records. Systems with observed values statistically worse than expected (99-percent confidence), and whose values are above the high trim point, are not eligible to be named benchmark health systems.

Favorable Values Are: Lower
Core Measures Mean Percent
Why We Include This Element: To be truly balanced, a scorecard must include various measures of quality. Core measures were developed by TJC and endorsed by the NQF as basic, minimum standards. They are a widely accepted method for measuring patient care quality. The reported core measure percent values reflect the percentage of eligible patients who received the expected standard of patient care.

Calculation: CMS reports the percentage of eligible patients who received each core measure, as well as the eligible patient count. For each included core measure, we calculate an aggregate core measure percent for each system. This is done by multiplying each member hospital's eligible patient count by the reported percent who received the measure, which yields the recipient count for that hospital. We sum the recipient patient counts and divide by the sum of eligible patients for member hospitals of each system. This value is multiplied by 100 to produce the system-level core measure percentage for the individual core measure. We rank systems by comparison group, based on the mean core measure percent value for the included core measures (stroke care, blood clot prevention). Because of low reporting, we exclude a number of core measures for small community hospitals and medium community hospitals. For a list of the measures used and those excluded, please see Appendix C.

Comment: We calculate the arithmetic mean of the system-level core measure percent values for the included measures (stroke care, blood clot prevention) to produce the ranked composite measure. Note: We reverse the percent value for VTE-6, where lower percentages mean better performance, in order to include it in the composite.

Favorable Values Are: Higher. Exception: VTE-6 — patients who developed a blood clot while in the hospital who did not get treatment that could have prevented it (lower percentages are better).
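The short sketch below illustrates, with invented numbers, how the eligible-patient weighting and the VTE-6 reversal described above could be combined into a system-level composite. Field names and values are assumptions for illustration only, not the study's actual code.

```python
# Illustrative sketch of the system-level core measure aggregation described above.
# Hospital percents and eligible counts are invented.

def system_core_measure_percent(hospital_data):
    """hospital_data: list of (eligible_patients, percent_received) for one core
    measure across a system's member hospitals. Returns the system-level percent."""
    recipients = sum(n * pct / 100.0 for n, pct in hospital_data)
    eligible = sum(n for n, _ in hospital_data)
    return 100.0 * recipients / eligible

# One stroke measure (STK-4) and the VTE-6 measure for a two-hospital system.
stk4 = system_core_measure_percent([(120, 92.0), (80, 88.0)])
vte6 = system_core_measure_percent([(40, 6.0), (25, 4.0)])

# VTE-6 is reverse-scored (lower is better), so it is flipped before averaging.
composite = (stk4 + (100.0 - vte6)) / 2.0
print(f"STK-4 = {stk4:.1f}%, VTE-6 reversed = {100 - vte6:.1f}%, mean = {composite:.1f}%")
```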
Mean 30-Day Risk-Adjusted Mortality Rate (AMI, HF, Pneumonia, COPD, and Stroke Patients)
Why We Include This Element: 30-day mortality rates are a widely accepted measure of the effectiveness of hospital care. They allow us to look beyond immediate inpatient outcomes and understand how the care the health system provided to inpatients with these particular conditions may have contributed to their longer-term survival. In addition, tracking these measures may help health systems identify patients at risk for post-discharge problems and target improvements in discharge planning and after-care processes. Health systems that score well may be better prepared for a pay-for-performance structure.

Calculation: Data are from the CMS Hospital Compare dataset. CMS calculates a 30-day mortality rate for each patient condition using three years of MEDPAR data, combined. We aggregate these data to produce a rate for each 30-day measure for each system. This is done by multiplying the hospital-level reported patient count (eligible patients) by the reported hospital rate to determine the number of patients who died within 30 days of admission. We sum the calculated deaths and divide by the sum of eligible patients for member hospitals of each system. This value is multiplied by 100 to produce the system-level 30-day mortality rate for each measure, expressed as a percentage. CMS does not calculate rates for hospitals where the number of cases is too small (less than 25). In these cases, we substitute the comparison group-specific median rate for the affected 30-day mortality measure. We rank systems by comparison group, based on the mean rate for the included 30-day mortality measures (AMI, HF, pneumonia, COPD, stroke).

Comment: The CMS Hospital Compare data for 30-day mortality are based on Medicare fee-for-service claims only. We calculate the arithmetic mean of the system-level included 30-day mortality rates (AMI, HF, pneumonia, COPD, stroke) to produce the ranked composite measure.

Favorable Values Are: Lower
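A minimal sketch of the system-level 30-day rate aggregation described above follows, including the substitution of a comparison-group median when CMS suppresses a hospital's rate. All counts, rates, and the median value are invented for illustration.

```python
# Illustrative sketch of the system-level 30-day mortality (or readmission) rate.
# Hospitals with suppressed CMS rates (fewer than 25 cases) fall back to the
# comparison-group median for that measure. All numbers are invented.

def system_30day_rate(hospitals, group_median_rate):
    """hospitals: list of (eligible_patients, reported_rate_percent_or_None)."""
    events = 0.0
    eligible = 0
    for n, rate in hospitals:
        if rate is None:            # CMS suppressed the rate; substitute the median
            rate = group_median_rate
        events += n * rate / 100.0
        eligible += n
    return 100.0 * events / eligible

ami_rate = system_30day_rate([(210, 13.8), (95, 15.1), (20, None)], group_median_rate=14.2)
print(f"system 30-day AMI mortality = {ami_rate:.1f}%")
# The ranked composite would be the arithmetic mean of the five condition rates.
```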
Mean 30-Day Risk-Adjusted Readmission Rate (AMI, HF, Pneumonia, Hip/Knee Arthroplasty, COPD, and Stroke Patients)
Why We Include This Element: 30-day readmission rates are a widely accepted measure of the effectiveness of hospital care. They allow us to understand how the care a health system provided to inpatients with these particular conditions may have contributed to issues with their post-discharge medical stability and recovery. Because these measures are part of the CMS Value-Based Purchasing Program, they are being watched closely in the industry. Tracking these measures may help health systems identify patients at risk for post-discharge problems if discharged too soon, as well as target improvements in discharge planning and after-care processes. Health systems that score well may be better prepared for a pay-for-performance structure.

Calculation: Data are from the CMS Hospital Compare dataset. CMS calculates a 30-day readmission rate for each patient condition using three years of MEDPAR data, combined. We aggregate these data to produce a rate for each 30-day measure for each system. This is done by multiplying the hospital-level reported patient count (eligible patients) by the reported hospital rate to determine the number of patients who were readmitted within 30 days of discharge. We sum the calculated readmissions and divide by the sum of eligible patients for member hospitals of each system. This value is multiplied by 100 to produce the system-level 30-day readmission rate for each measure, expressed as a percentage. CMS does not calculate rates for hospitals where the number of cases is too small (less than 25). In these cases, we substitute the comparison group-specific median rate for the affected 30-day readmission measure. We rank systems by comparison group, based on the mean rate for the included 30-day readmission measures (AMI, HF, pneumonia, hip/knee arthroplasty, COPD, stroke).

Comment: The CMS Hospital Compare data for 30-day readmissions are based on Medicare fee-for-service claims only. We calculate the arithmetic mean of the system-level included 30-day readmission rates (AMI, HF, pneumonia, hip/knee arthroplasty, COPD, stroke) to produce the ranked composite measure.

Favorable Values Are: Lower
Severity-Adjusted Average Length of Stay
Why We Include This Element: A lower severity-adjusted average length of stay (LOS) generally indicates more efficient consumption of health system resources and reduced risk to patients.

Calculation: We calculate an LOS index value for each health system by dividing the sum of the actual LOS of member hospitals by the sum of the normalized expected LOS for the hospitals in the system. Expected LOS adjusts for differences in severity of illness using a linear regression model. We normalize the expected values using the observed-to-expected ratio for in-study health systems, by comparison group. The LOS risk model takes into account POA coding in determining expected LOS. We rank systems on their severity-adjusted average LOS.

Comment: The MEDPAR dataset includes both Medicare fee-for-service claims and Medicare Advantage (HMO) encounter records. We convert the LOS index into days by multiplying each system's LOS index by the grand mean LOS for all in-study health systems. We calculate the grand mean LOS by summing in-study health system LOS and dividing by the number of health systems.

Favorable Values Are: Lower
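The sketch below illustrates, with invented totals, how an LOS index could be formed from actual and normalized expected LOS and then converted to days using a grand mean, as described above. The normalization factor and grand mean value shown are assumptions, not study results.

```python
# Illustrative sketch of the severity-adjusted LOS calculation described above.
# Expected LOS totals would come from the severity-adjustment model; numbers here
# are invented.

def system_los_index(hospitals, group_norm_factor=1.0):
    """hospitals: list of (actual_los_days_total, expected_los_days_total)."""
    actual = sum(a for a, _ in hospitals)
    expected = group_norm_factor * sum(e for _, e in hospitals)
    return actual / expected

index = system_los_index([(5200.0, 5450.0), (2100.0, 2180.0)], group_norm_factor=0.99)
grand_mean_los = 4.6   # assumed grand mean LOS (days) across all in-study systems
print(f"LOS index = {index:.2f}, severity-adjusted average LOS = {index * grand_mean_los:.2f} days")
```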
Mean Emergency Department Throughput Measure
Why We Include This Element: The health system emergency department (ED) is an important access point to healthcare for many people. A key factor in evaluating ED performance is process "throughput" — measures of the timeliness with which patients receive treatment, and either are admitted or discharged. Timely ED processes impact both care quality and the quality of the patient experience.

Calculation: Data are from the CMS Hospital Compare dataset. CMS reports the median minutes for each ED throughput measure, as well as the total number of ED visits, including admitted patients. We include three of the available ED measures in calculating a system aggregate measure. For each ED measure, we calculate the weighted median minutes by multiplying the median minutes by the ED visits. We sum the weighted minutes for system member hospitals and divide by the sum of ED visits for the hospitals to produce the system-level weighted minutes for each measure. We calculate the arithmetic mean of the three included ED measures to produce the ranked composite ED measure.

Comment: We chose to include three measures that define three important ED processes: time from door to admission, time from door to discharge for non-admitted patients, and time to receipt of pain medications for broken bones.

Favorable Values Are: Lower
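A minimal sketch of the visit-weighted ED throughput aggregation described above follows, using invented visit counts and median minutes.

```python
# Illustrative sketch: weight each hospital's reported median minutes by its ED visit
# volume, aggregate to the system level, then average the three measure results.
# All values are invented.

def system_ed_measure(hospitals):
    """hospitals: list of (ed_visits, median_minutes) for one ED measure."""
    weighted = sum(visits * minutes for visits, minutes in hospitals)
    visits = sum(v for v, _ in hospitals)
    return weighted / visits

door_to_admit     = system_ed_measure([(30000, 265), (12000, 240)])
door_to_discharge = system_ed_measure([(30000, 150), (12000, 135)])
pain_med_wait     = system_ed_measure([(30000, 52), (12000, 47)])

composite = (door_to_admit + door_to_discharge + pain_med_wait) / 3.0
print(f"mean ED throughput = {composite:.1f} minutes (lower is better)")
```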
Medicare Spend per Beneficiary (MSPB) Index

Why We Include This Element: Medicare spend per beneficiary (MSPB) helps to determine how efficiently a health system coordinates the care for its patients across continuum-of-care sites. Lower values indicate lower costs relative to national medians and thus greater efficiency.

Calculation: Data are from the CMS Hospital Compare dataset. CMS calculates the cost of care for each admitted patient, including Medicare Part A and Part B costs. CMS aggregates costs associated with the index admission from three days preadmission, through the inpatient stay, and 30 days post-discharge. This cost is divided by the median national cost. CMS applies both numerator and denominator adjustments. We calculate the system-level measure by weighting each member hospital index by the hospital's MEDPAR discharges for the most current year in the study. We sum the weighted values and divide by the sum of the MEDPAR discharges of all member hospitals. This produces a weighted MSPB index for each system. We rank health systems on the MSPB index.

Comment: An index value above 1.0 means higher-than-national median cost per beneficiary; an index value below 1.0 means lower-than-national median cost per beneficiary.

Favorable Values Are: Lower
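The following sketch shows the discharge-weighted system MSPB aggregation described above, with invented hospital index values and discharge counts.

```python
# Illustrative sketch of the discharge-weighted system MSPB index described above.
# Hospital MSPB indexes come from CMS Hospital Compare; discharge counts would come
# from MEDPAR. All values are invented.

def system_mspb_index(hospitals):
    """hospitals: list of (mspb_index, medpar_discharges) for member hospitals."""
    weighted = sum(idx * n for idx, n in hospitals)
    discharges = sum(n for _, n in hospitals)
    return weighted / discharges

print(f"system MSPB index = {system_mspb_index([(0.97, 8200), (1.04, 2100)]):.3f}")
# Values below 1.0 indicate lower-than-national-median spending per episode.
```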
Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) Score
(Patient Rating of Overall Hospital Performance)
Why We Include This Element: We believe that including a measure of patient assessment/perception of care is crucial to the balanced scorecard concept. How patients perceive the care a health system provides has a direct effect on its ability to remain competitive in the marketplace.

Calculation: Data are from the CMS Hospital Compare dataset. We used the data published by CMS for the HCAHPS survey instrument question, "How do patients rate the hospital overall?" to score member hospitals. Patient responses fall into three categories, and the number of patients in each category is reported as a percent by CMS:
- Patients who gave a rating of 6 or lower (low)
- Patients who gave a rating of 7 or 8 (medium)
- Patients who gave a rating of 9 or 10 (high)
We rank systems based on the weighted average health system HCAHPS score. The highest possible HCAHPS score is 300 (100 percent of patients rate the health system hospitals high). The lowest HCAHPS score is 100 (100 percent of patients rate the hospitals low).

Comment: HCAHPS data are survey data, based on either a sample of hospital inpatients or all inpatients. The dataset contains the question scoring of survey respondents. For each answer category, we assign a weight as follows: 3 equals high or good performance, 2 equals medium or average performance, and 1 equals low or poor performance. We then calculate a weighted score for each hospital by multiplying the HCAHPS answer percent by the category weight. For each hospital, we sum the weighted percent values for the three answer categories. The result is the hospital HCAHPS score. To calculate the aggregate system score, we multiply each member hospital HCAHPS score by its MEDPAR discharges, sum the weighted scores, and divide by the sum of the member hospital discharges.

Favorable Values Are: Higher
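A minimal sketch of the 3/2/1 weighting and discharge-weighted system rollup described above follows, with invented answer-category percentages and discharge counts.

```python
# Illustrative sketch of HCAHPS scoring: the three answer-category percentages are
# weighted 3/2/1 to give a hospital score between 100 and 300, then hospital scores
# are discharge-weighted to the system level. All data are invented.

def hcahps_hospital_score(pct_high, pct_medium, pct_low):
    return 3 * pct_high + 2 * pct_medium + 1 * pct_low

def system_hcahps_score(hospitals):
    """hospitals: list of ((pct_high, pct_medium, pct_low), medpar_discharges)."""
    weighted = sum(hcahps_hospital_score(*pcts) * n for pcts, n in hospitals)
    return weighted / sum(n for _, n in hospitals)

score = system_hcahps_score([((74, 20, 6), 9500), ((68, 24, 8), 3100)])
print(f"system HCAHPS score = {score:.1f} (best possible = 300)")
```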
Determining the 15 Top Health Systems
Ranking
We ranked health systems on the basis of their performance on each of the included
measures relative to the other in-study systems, by comparison group. We summed the
ranks — giving all measures equal weight — and re-ranked overall to arrive at a final rank
for the system. The top five health systems with the best final rank in each of the three
comparison groups were selected as the winners (15 total winners).
Table 13: Ranked Performance Measures and Weights

Ranked Measure | Weight in Overall Ranking
Risk-Adjusted Mortality (Inpatient) | 1
Risk-Adjusted Complications | 1
Core Measures Mean Percent | 1
Mean 30-Day Mortality Rate | 1
Mean 30-Day Readmission Rate | 1
Severity-Adjusted Average LOS | 1
Mean ED Throughput | 1
MSPB Index | 1
HCAHPS Score (Overall Rating Question) | 1
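To illustrate the equal-weight rank-sum selection described above (every measure in Table 13 carries a weight of 1), the sketch below ranks a few invented systems on two measures and sums the ranks. The system names, measure values, and two-measure scope are illustrative assumptions; the study ranks all nine measures within each comparison group.

```python
# Illustrative sketch of equal-weight rank-sum selection: rank each measure within a
# comparison group, sum the ranks, re-rank, and take the top systems. Data invented.

def pick_winners(systems, higher_is_better, n_winners=5):
    """systems: dict of name -> dict of measure -> value; all measures weight 1."""
    rank_sums = {name: 0 for name in systems}
    for m, better_high in higher_is_better.items():
        ordered = sorted(systems, key=lambda s: systems[s][m], reverse=better_high)
        for rank, name in enumerate(ordered, start=1):
            rank_sums[name] += rank            # equal weight for every measure
    return sorted(rank_sums, key=rank_sums.get)[:n_winners]

demo = {
    "System A": {"mortality_index": 0.91, "hcahps": 265},
    "System B": {"mortality_index": 1.02, "hcahps": 258},
    "System C": {"mortality_index": 0.95, "hcahps": 271},
}
print(pick_winners(demo, {"mortality_index": False, "hcahps": True}, n_winners=2))
```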
Winner Exclusions
For mortality and complications, which have observed and expected values, we identified
health systems with performance that was statistically worse than expected. Systems
that were worse than expected were excluded from consideration when selecting the
study winners. This was done because we do not want systems that have poor clinical
outcomes to be declared winners.
A system was winner-excluded if both of the following conditions applied:
1. Its observed value was higher than expected, and the difference was statistically
significant with 99-percent confidence.
2. Its mortality or complications index value was above the high trim point, defined
as the 75th percentile index value calculated using data only for systems that met
the first condition.
Finally, systems with no reported MSPB were excluded from winner eligibility. See
Appendix C for details on outlier methodology.
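A minimal sketch of the two-condition winner exclusion described above follows: flag systems statistically worse than expected, take the 75th percentile of their index values as the trim point, and exclude any flagged system above it. The z-score cutoff, the quantile method, and all index values are illustrative assumptions.

```python
# Illustrative sketch of the winner-exclusion trim point. The 99-percent one-sided
# z cutoff (2.33) and the quantile method are assumptions; all values are invented.

import statistics

def winner_excluded(systems, z_cutoff=2.33):
    """systems: dict of name -> (index_value, z_score); returns excluded names."""
    worse = {s: idx for s, (idx, z) in systems.items() if idx > 1.0 and z > z_cutoff}
    if not worse:
        return set()
    trim_point = statistics.quantiles(list(worse.values()), n=4)[2]  # 75th percentile
    return {s for s, idx in worse.items() if idx > trim_point}

demo = {"Sys1": (1.22, 3.1), "Sys2": (1.08, 2.6), "Sys3": (0.94, -1.2),
        "Sys4": (1.35, 4.0), "Sys5": (1.50, 5.2)}
print(winner_excluded(demo))  # only systems failing both conditions are excluded
```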
Truven Health Policy on Revocation of a 15 Top Health Systems Award
To preserve the integrity of the study, it is the policy of Truven Health Analytics to revoke
a 15 Top Health Systems award if a system is found to have submitted inaccurate or
misleading data to any data source used in the study.
At the sole discretion of Truven Health, the circumstances under which a 15 Top Health
Systems award could be revoked include, but are not limited to, the following:
- Discovery by Truven Health staff, through statistical analysis or other means, that a health system has submitted inaccurate data
- Discovery of media or Internet reports of governmental or accrediting agency investigations or sanctions for actions by a health system that could have an adverse impact on the integrity of the 15 Top Health Systems studies or award winner selection
Appendix A: Winners and Their Member Hospitals

Health System/Hospital Name
Location
Hospital Medicare ID
Asante
Medford, OR
Asante Ashland Community Hospital
Ashland, OR
380005
Asante Rogue Regional Medical Center
Medford, OR
380018
Asante Three Rivers Medical Center
Grants Pass, OR
380002
Kettering Health Network
Dayton, OH
Fort Hamilton Hughes Memorial Hospital
Hamilton, OH
360132
Grandview Medical Center
Dayton, OH
360133
Greene Memorial Hospital
Xenia, OH
360026
Kettering Medical Center
Kettering, OH
360079
Soin Medical Center
Beaver Creek, OH
360360
Sycamore Medical Center
Miamisburg, OH
360239
Lovelace Health System
Albuquerque, NM
Lovelace Medical Center
Albuquerque, NM
320009
Lovelace Regional Hospital — Roswell
Roswell, NM
320086
Lovelace Westside Hospital
Albuquerque, NM
320074
Lovelace Women's Hospital
Albuquerque, NM
320017
Mayo Foundation
Rochester, MN
Crossing Rivers Health Medical Center
Prairie du Chien, WI
521330
Mayo Clinic
Jacksonville, FL
100151
Mayo Clinic Health System in Albert Lea
Albert Lea, MN
240043
Mayo Clinic Health System in Cannon Falls
Cannon Falls, MN
241346
Mayo Clinic Health System in Eau Claire
Eau Claire, WI
520070
Mayo Clinic Health System in Fairmont
Fairmont, MN
240166
Mayo Clinic Health System in Lake City
Lake City, MN
241338
Mayo Clinic Health System in Mankato
Mankato, MN
240093
Mayo Clinic Health System in New Prague
New Prague, MN
241361
Mayo Clinic Health System in Red Cedar
Menomonee, WI
521340
Mayo Clinic Health System in Red Wing
Red Wing, MN
240018
Mayo Clinic Health System in Springfield
Springfield, MN
241352
Mayo Clinic Health System in St. James
St. James, MN
241333
Mayo Clinic Health System in Waseca
Waseca, MN
241345
Mayo Clinic Health System in Waycross
Waycross, GA
110003
Mayo Clinic Health System — Chippewa Valley
Bloomer, WI
521314
Mayo Clinic Health System — Franciscan La Crosse
La Crosse, WI
520004
Mayo Clinic Health System — Oakridge
Osseo, WI
521302
Mayo Clinic Health System — Franciscan Sparta
Sparta, WI
521305
Mayo Clinic Health System — Northland
Barron, WI
521315
Mayo Clinic Methodist Hospital
Rochester, MN
240061
Mayo Clinic — Saint Marys Hospital
Rochester, MN
240010
Mayo Regional Hospital
Dover-Foxcroft, ME
201309
Tomah Memorial Hospital
Tomah, WI
521320
Veterans Memorial Hospital
Waukon, IA
161318
Mercy
Chesterfield, MO
Lincoln County Medical Center
Troy, MO
261319
Mercy Health Love County
Marietta, OK
371306
Mercy Hospital Ada
Ada, OK
370020
Mercy Hospital Ardmore
Ardmore, OK
370047
Mercy Hospital Aurora
Aurora, MO
261316
Mercy Hospital Berryville
Berryville, AR
041329
Mercy Hospital Cassville
Cassville, MO
261317
Mercy Hospital Columbus
Columbus, KS
171308
Mercy Hospital El Reno
El Reno, OK
370011
Mercy Hospital Fort Scott
Fort Scott, KS
170058
Mercy Hospital Fort Smith
Fort Smith, AR
040062
Mercy Hospital Healdton
Healdton, OK
371310
Mercy Hospital Independence
Independence, KS
170010
Mercy Hospital Jefferson
Crystal City, MO
260023
Mercy Hospital Logan County
Guthrie, OK
371317
Mercy Hospital Joplin
Joplin, MO
260001
Mercy Hospital Kingfisher
Kingfisher, OK
371313
Mercy Hospital Lebanon
Lebanon, MO
260059
Mercy Hospital Northwest Arkansas
Rogers, AR
040010
Mercy Hospital Oklahoma City
Oklahoma City, OK
370013
Mercy Hospital Springfield
Springfield, MO
260065
Mercy Hospital St. Louis
St. Louis, MO
260020
Mercy Hospital Tishomingo
Tishomingo, OK
371304
Mercy Hospital Turner Memorial
Ozark, AR
041303
Mercy Hospital Washington
Washington, MO
260052
Mercy Hospital Watonga
Watonga, OK
371302
Mercy McCune Brooks Hospital
Carthage, MO
260228
Mercy St. Francis Hospital
Mountain View, MO
261335
North Logan Mercy Hospital
Paris, AR
041300
St. Edward Health Facilities
Waldron, AR
041305
MidMichigan Health
Midland, MI
MidMichigan Medical Center — Clare
Clare, MI
230180
MidMichigan Medical Center — Gladwin
Gladwin, MI
231325
MidMichigan Medical Center — Gratiot
Alma, MI
230030
MidMichigan Medical Center — Midland
Midland, MI
230222
Roper St. Francis Healthcare
Charleston, SC
Bon Secours St. Francis Hospital
Charleston, SC
420065
Roper Hospital
Charleston, SC
420087
Roper St. Francis Mount Pleasant Hospital
Mount Pleasant, SC
420104
Scripps Health
San Diego, CA
Scripps Green Hospital
La Jolla, CA
050424
Scripps Memorial Hospital Encinitas
Encinitas, CA
050503
Scripps Memorial Hospital La Jolla
La Jolla, CA
050324
Scripps Mercy Hospital
San Diego, CA
050077
Spectrum Health
Grand Rapids, MI
Memorial Medical Center of West Michigan
Ludington, MI
230110
Spectrum Health Big Rapids Hospital
Big Rapids, MI
230093
Spectrum Health Gerber Memorial
Fremont, MI
230106
Spectrum Health Kelsey Hospital
Lakeview, MI
231317
Spectrum Health Medical Center
Grand Rapids, MI
230038
Spectrum Health Reed City Hospital
Reed City, MI
231323
Spectrum Health United Hospital
Greenville, MI
230035
Zeeland Community Hospital
Zeeland, MI
230003
St. Luke's Health System
Boise, ID
St. Luke's Boise Medical Center
Boise, ID
130006
St. Luke's Elmore Medical Center
Mountain Home, ID
131311
St. Luke's Jerome Medical Center
Jerome, ID
131310
St. Luke's Magic Valley Medical Center
Twin Falls, ID
130002
St. Luke's McCall Medical Center
McCall, ID
131312
St. Luke's Wood River Medical Center
Ketchum, ID
131323
St. Vincent Health
Indianapolis, IN
St. Joseph Hospital & Health Center
Kokomo, IN
150010
St. Vincent Anderson Regional Hospital
Anderson, IN
150088
St. Vincent Carmel Hospital
Carmel, IN
150157
St. Vincent Clay Hospital
Brazil, IN
151309
St. Vincent Dunn Hospital
Bedford, IN
151335
St. Vincent Fishers Hospital
Fishers, IN
150181
St. Vincent Frankfort Hospital
Frankfort, IN
151316
St. Vincent Heart Center
Indianapolis, IN
150153
St. Vincent Indianapolis Hospital
Indianapolis, IN
150084
St. Vincent Jennings Hospital
North Vernon, IN
151303
St. Vincent Mercy Hospital
Elwood, IN
151308
St. Vincent Randolph Hospital
Winchester, IN
151301
St. Vincent Salem Hospital
Salem, IN
151314
St. Vincent Williamsport Hospital
Williamsport, IN
151307
Sutter Health
Sacramento, CA
Alta Bates Summit Medical Center
Berkeley, CA
050305
Alta Bates Summit Medical Center Summit
Oakland, CA
050043
California Pacific Medical Center
San Francisco, CA
050047
California Pacific Medical Center Davies
San Francisco, CA
050008
California Pacific Medical Center St. Luke's
San Francisco, CA
050055
Eden Medical Center
Castro Valley, CA
050488
Memorial Hospital Los Banos
Los Banos, CA
050528
Memorial Medical Center
Modesto, CA
050557
Mills Peninsula Medical Center
Burlingame, CA
050007
Novato Community Hospital
Novato, CA
050131
Sutter Amador Hospital
Jackson, CA
050014
Sutter Auburn Faith Hospital
Auburn, CA
050498
Sutter Coast Hospital
Crescent City, CA
050417
Sutter Davis Hospital
Davis, CA
050537
Sutter Delta Medical Center
Antioch, CA
050523
Sutter Lakeside Hospital
Lakeport, CA
051329
Sutter Maternity & Surgery Center of Santa Cruz
Santa Cruz, CA
050714
Sutter Medical Center, Sacramento
Sacramento, CA
050108
Sutter Roseville Medical Center
Roseville, CA
050309
Sutter Santa Rosa Regional Hospital
Santa Rosa, CA
050291
Sutter Solano Medical Center
Vallejo, CA
050101
Sutter Surgical Hospital North Valley
Yuba City, CA
050766
Sutter Tracy Community Hospital
Tracy, CA
050313
Sutter Health Valley Division
Sacramento, CA
Memorial Hospital Los Banos
Los Banos, CA
050528
Memorial Medical Center
Modesto, CA
050557
Sutter Amador Hospital
Jackson, CA
050014
Sutter Auburn Faith Hospital
Auburn, CA
050498
Sutter Davis Hospital
Davis, CA
050537
Sutter Medical Center, Sacramento
Sacramento, CA
050108
Sutter Roseville Medical Center
Roseville, CA
050309
Sutter Solano Medical Center
Vallejo, CA
050101
Sutter Surgical Hospital North Valley
Yuba City, CA
050766
Sutter Tracy Community Hospital
Tracy, CA
050313
Tanner Health System
Carrollton, GA
Higgins General Hospital
Bremen, GA
111320
Tanner Medical Center/Carrollton
Carrollton, GA
110011
Tanner Medical Center/Villa Rica
Villa Rica, GA
110015
TriHealth
Cincinnati, OH
Bethesda North Hospital
Cincinnati, OH
360179
Good Samaritan Hospital
Cincinnati, OH
360134
TriHealth Evendale Hospital
Cincinnati, OH
360362
Note: Winning systems are ordered alphabetically.
Appendix B: The Top Quintile: Best-Performing Health Systems

Large Health Systems
Health System Name
Location
Allina Health System
Minneapolis, MN
Avera Health
Sioux Falls, SD
Banner Health
Phoenix, AZ
Baylor Scott & White Health
Dallas, TX
Carolinas HealthCare System
Charlotte, NC
Centura Health
Englewood, CO
Houston Methodist
Houston, TX
Intermountain Health Care
Salt Lake City, UT
Mayo Foundation
Rochester, MN
Memorial Hermann Healthcare System
Houston, TX
Mercy
Chesterfield, MO
OSF Healthcare System
Peoria, IL
Prime Healthcare Services
Ontario, CA
Sanford Health
Sioux Falls, SD
SCL Health
Denver, CO
Spectrum Health
Grand Rapids, MI
SSM Health
St. Louis, MO
Sutter Health
Sacramento, CA
Sutter Health Valley Division
Sacramento, CA
Texas Health Resources
Arlington, TX
University of Colorado Health
Aurora, CO
Medium Health Systems
Health System Name
Location
Alegent Creighton Health
Omaha, NE
Ardent Health Services
Nashville, TN
Bronson Healthcare Group
Kalamazoo, MI
Community Health Network
Indianapolis, IN
HealthPartners
Bloomington, MN
HonorHealth
Scottsdale, AZ
John Muir Health
Walnut Creek, CA
Kettering Health Network
Dayton, OH
Legacy Health System
Portland, OR
Memorial Healthcare System
Hollywood, FL
Mercy Health
Muskegon, MI
Mercy Health Southwest Ohio Region
Cincinnati, OH
Ministry Health Care
Milwaukee, WI
Mission Health
Asheville, NC
Parkview Health
Fort Wayne, IN
Saint Joseph Mercy Health System
Ann Arbor, MI
SCL Denver Region
Denver, CO
Scripps Health
San Diego, CA
Sisters of Charity Health System
Cleveland, OH
St. Elizabeth Healthcare
Fort Thomas, KY
St. Luke's Health System
Boise, ID
St. Vincent Health
Indianapolis, IN
TriHealth
Cincinnati, OH
Small Health Systems
Health System Name
Location
Alexian Brothers Health System
Arlington Heights, IL
Asante
Medford, OR
Cape Cod Healthcare
Hyannis, MA
CHI St. Joseph Health
Bryan, TX
Franciscan Sisters of Christian Charity
Manitowoc, WI
Genesis Health System
Davenport, IA
Good Shepherd Health System
Longview, TX
InMed Group Inc.
Montgomery, AL
Lakeland HealthCare
St. Joseph, MI
Lovelace Health System
Albuquerque, NM
Maury Regional Healthcare System
Columbia, TN
Mercy Health Network
Des Moines, IA
MidMichigan Health
Midland, MI
Northern Arizona Healthcare
Flagstaff, AZ
Palomar Health
San Diego, CA
Penn Highlands Healthcare
DuBois, PA
ProHealth Care
Waukesha, WI
Roper St. Francis Healthcare
Charleston, SC
Saint Alphonsus Health System
Boise, ID
Saint Joseph Regional Health System
Mishawaka, IN
Southern Illinois Healthcare
Carbondale, IL
St. Charles Health System
Bend, OR
Tanner Health System
Carrollton, GA
ThedaCare
Appleton, WI
Wheaton Franciscan Healthcare
Waterloo, IA
Note: Health systems are ordered alphabetically. This year’s 15 Top Health Systems are in bold, blue text.
Appendix C: Methodology Details

Methods for Identifying Complications of Care
To make valid normative comparisons of health system outcomes, it is necessary to adjust raw data to accommodate differences that result from the variety and severity of admitted cases.
Truven Health Analytics is able to make valid normative comparisons of mortality and
complications rates by using patient-level data to control effectively for case-mix and
severity differences. We do this by evaluating ICD-9-CM diagnosis and procedure codes
to adjust for severity within clinical case-mix groupings. Conceptually, we group patients
with similar characteristics (i.e., age, sex, principal diagnosis, procedures performed,
admission type, and comorbid conditions) to produce expected, or normative,
comparisons. Through extensive testing, we have found that this methodology produces
valid normative comparisons using readily available administrative data, eliminating the
need for additional data collection.29
To support the transition from ICD-9-CM to ICD-10-CM, our risk- and severity-adjustment
models have been modified to use the Agency for Healthcare Research and Quality
(AHRQ) Clinical Classifications System (CCS)30 categories for risk assignment. CCS
categories are defined in both coding languages with the intent of being able to
accurately compare ICD-9 categories with ICD-10 categories. Calibrating our models
using CCS categories provides the flexibility to accept and process patient record data
in either ICD-9 or ICD-10 coding formats, and produces consistent results in risk and
severity adjustment.
The CCS-based approach applies to all Truven Health proprietary models that use code-based rate tables, which include the Risk-Adjustment Mortality Model (RAMI), Expected Complication Risk Index (ECRI), and Patient Financial Data/Expected Resource Demand (PFD/ERD) Length of Stay (LOS) models used in this study.
Normative Database Development
Truven Health constructed a normative database of case-level data from its Projected
Inpatient Data Base (PIDB), a national all-payer database containing more than
23 million all-payer discharges annually. These data are obtained from approximately
3,700 hospitals, representing more than half of all discharges from short-term, general,
nonfederal hospitals in the U.S. PIDB discharges are statistically weighted to represent
the universe of all short-term, general, non-federal hospitals in the U.S. Demographic
and clinical data are also included: age, sex, and LOS; clinical groupings (MS-DRGs);
ICD-9-CM principal and secondary diagnoses; ICD-9-CM principal and secondary
procedures; present-on-admission (POA) coding; admission source and type; and
discharge status. For this study, risk models were recalibrated using federal fiscal year
(FFY) 2013 all-payer data.
Use of POA Data
Under the Deficit Reduction Act of 2005, as of FFY 2008, hospitals receive reduced
payments for cases with certain conditions — such as falls, surgical site infections, and
pressure ulcers — that were not present on the patient’s admission, but occurred during
hospitalization. As a result, the Centers for Medicare & Medicaid Services (CMS) now
requires all Inpatient Prospective Payment System hospitals to document whether a
patient has these conditions, and other conditions, when admitted. The Truven Health
proprietary risk-adjustment models for mortality, complications, and LOS take into
account POA data reported in the all-payer data. Our risk models develop expected
values based only on conditions that were present on admission.
In addition to considering the POA coding data in the calibration of our risk- and severity-adjustment models, we have also adjusted for missing or invalid POA coding found in the MEDPAR data files. From 2009 through 2014, we observed a significant rise in the number of principal diagnosis (PDX) and secondary diagnosis (SDX) codes that do not have a valid POA code in the MEDPAR data files. Since 2011, a code of "0" has been appearing, apparently in place of missing POA codes. The percentages of diagnosis codes with POA code 0 are displayed in the table below. This phenomenon has led to an artificial rise in the number of conditions that appear to be occurring during the hospital stay.
Table 14: Trend in Diagnosis Codes With POA Code 0

Code Type | 2010 | 2011 | 2012 | 2013 | 2014
PDX | 0.00% | 4.26% | 4.68% | 4.37% | 3.40%
SDX | 0.00% | 15.05% | 19.74% | 22.10% | 21.58%
To correct for this bias, we adjusted MEDPAR record processing through our mortality,
complications, and LOS risk models as follows:
1. We treated all diagnosis codes on the CMS exempt list as “exempt,” regardless of
POA coding.
2. We treated all principal diagnoses as present on admission.
3. We treated secondary diagnoses, where POA code Y or W appeared more than 50
percent of the time in the Truven Health all-payer database, as present on admission.
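The sketch below applies the three corrections listed above to a single diagnosis code, under assumed inputs: the exempt-code set, the all-payer Y/W frequency lookup, the handling of validly coded secondary diagnoses, and the record layout are all assumptions for illustration, not the actual processing logic.

```python
# Illustrative sketch of the three POA corrections listed above, applied to one
# MEDPAR diagnosis before risk modeling. The exempt list and Y/W frequencies shown
# are invented placeholders.

CMS_POA_EXEMPT = {"V70.0", "E849.7"}                 # assumed subset of the exempt list
ALLPAYER_YW_RATE = {"428.0": 0.92, "998.59": 0.08}   # assumed all-payer Y/W frequencies

def corrected_poa(diagnosis, reported_poa, is_principal):
    if diagnosis in CMS_POA_EXEMPT:
        return "E"                 # rule 1: exempt regardless of reported POA coding
    if is_principal:
        return "Y"                 # rule 2: principal diagnosis treated as POA
    if reported_poa in ("Y", "W"):
        return reported_poa        # assumption: keep validly coded secondary diagnoses
    # rule 3: missing/invalid code -> treat as POA only if the all-payer database
    # reports Y or W more than 50 percent of the time for this diagnosis
    return "Y" if ALLPAYER_YW_RATE.get(diagnosis, 0.0) > 0.5 else "N"

print(corrected_poa("428.0", "0", is_principal=False))   # -> "Y"
print(corrected_poa("998.59", "0", is_principal=False))  # -> "N"
```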
Risk-Adjusted Mortality Index Models
Truven Health has developed an overall mortality risk model. From this model, we
excluded long-term care, psychiatric, substance abuse, rehabilitation, and federally owned
or controlled facilities. In addition, we excluded certain patient records from the dataset: psychiatric, substance abuse, rehabilitation, and unclassified cases (MS-DRGs 945, 946, and 999); cases where the patient age was less than 65 years; and cases where the patient was transferred to another short-term acute care hospital. Palliative care patients (v66.7) were included in the mortality risk model, which is calibrated to determine the probability of death for these patients. The Truven Health mortality risk model now excludes records with "do not resuscitate" (DNR) (v49.86) orders that are coded as present on admission. Excluding records that are DNR status at admission removes these high-probability-of-death patients from the analysis and allows hospitals to concentrate more fully on events that could lead to deaths during the hospitalization.
A standard logistic regression model was used to estimate the risk of mortality for
each patient. This was done by weighting the patient records of the hospital by the
logistic regression coefficients associated with the corresponding terms in the model
and intercept term. This produced the expected probability of an outcome for each
eligible patient (numerator) based on the experience of the norm for patients with
similar characteristics (age, clinical grouping, severity of illness, and so forth).31–35 This
model takes into account only patient conditions that are present on admission when
calculating risk. Additionally, in response to the upcoming transition to ICD-10-CM,
diagnosis and procedure codes, and the interactions among them, have been mapped
to AHRQ CCS groups for assignment of risk instead of using the individual diagnosis,
procedure, and interaction effects.
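As a simplified illustration of how a fitted logistic regression turns patient characteristics into an expected probability, which can then be summed into expected deaths for a group, the sketch below uses invented coefficients and indicator features; the actual model is proprietary and assigns risk through AHRQ CCS-based categories.

```python
# Illustrative sketch: apply fitted logistic regression coefficients to indicator
# features to get each patient's expected probability of death, then sum the
# probabilities to an expected count. Coefficients and features are invented.

import math

COEFFICIENTS = {"intercept": -4.1, "age_over_80": 0.9, "ccs_heart_failure": 1.3,
                "severe_comorbidity": 0.7}

def expected_death_probability(patient_features):
    """patient_features: dict of 0/1 indicator variables for one patient."""
    logit = COEFFICIENTS["intercept"] + sum(
        COEFFICIENTS[f] * v for f, v in patient_features.items() if f in COEFFICIENTS)
    return 1.0 / (1.0 + math.exp(-logit))

patients = [{"age_over_80": 1, "ccs_heart_failure": 1},
            {"severe_comorbidity": 1}]
expected_deaths = sum(expected_death_probability(p) for p in patients)
print(f"expected deaths for this group = {expected_deaths:.2f}")
```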
Staff physicians at Truven Health have suggested important clinical patient characteristics
that also were incorporated into the proprietary models. After assigning the predicted
probability of the outcome for each patient, the patient-level data can then be
aggregated across a variety of groupings, including health system, hospital, service line,
or MS-DRG classification system.
Expected Complications Rate Index Models
Risk-adjusted complications refer to outcomes that may be of concern when they occur
at a greater-than-expected rate among groups of patients, possibly reflecting systemic
quality-of-care issues. The Truven Health complications model uses clinical qualifiers to
identify complications that have occurred in the inpatient setting. The complications used
in the model are listed in Table 15.
Table 15: Complications Used in Model

Complication | Patient Group
Postoperative complications relating to urinary tract | Surgical only
Postoperative complications relating to respiratory system except pneumonia | Surgical only
Gastrointestinal (GI) complications following procedure | Surgical only
Infection following injection/infusion | All patients
Decubitus ulcer | All patients
Postoperative septicemia, abscess, and wound infection | Surgical, including cardiac
Aspiration pneumonia | Surgical only
Tracheostomy complications | All patients
Complications of cardiac devices | Surgical, including cardiac
Complications of vascular and hemodialysis devices | Surgical only
Nervous system complications from devices/complications of nervous system devices | Surgical only
Complications of genitourinary devices | Surgical only
Complications of orthopedic devices | Surgical only
Complications of other and unspecified devices, implants, and grafts | Surgical only
Other surgical complications | Surgical, including cardiac
Miscellaneous complications | All patients
Cardio-respiratory arrest, shock, or failure | Surgical only
Postoperative complications relating to nervous system | Surgical only
Postoperative acute myocardial infarction (AMI) | Surgical only
Postoperative cardiac abnormalities except AMI | Surgical only
Procedure-related perforation or laceration | All patients
Postoperative physiologic and metabolic derangements | Surgical, including cardiac
Postoperative coma or stupor | Surgical, including cardiac
Postoperative pneumonia | Surgical, including cardiac
Pulmonary embolism | All patients
Venous thrombosis | All patients
Hemorrhage, hematoma, or seroma complicating a procedure | All patients
Postprocedure complications of other body systems | All patients
Complications of transplanted organ (excludes skin and cornea) | Surgical only
Disruption of operative wound | Surgical only
Complications relating to anesthetic agents and central nervous system (CNS) depressants | Surgical, including cardiac
Complications relating to antibiotics | All patients
Complications relating to other anti-infective drugs | All patients
Complications relating to antineoplastic and immunosuppressive drugs | All patients
Complications relating to anticoagulants and drugs affecting clotting factors | All patients
Complications relating to blood products | All patients
Complications relating to narcotics and related analgesics | All patients
Complications relating to non-narcotic analgesics | All patients
Complications relating to anticonvulsants and antiparkinsonism drugs | All patients
Complications relating to sedatives and hypnotics | All patients
Complications relating to psychotropic agents | All patients
Complications relating to CNS stimulants and drugs affecting the autonomic nervous system | All patients
Complications relating to drugs affecting cardiac rhythm regulation | All patients
Complications relating to cardiotonic glycosides (digoxin) and drugs of similar action | All patients
Complications relating to other drugs affecting the cardiovascular system | All patients
Complications relating to antiasthmatic drugs | All patients
Complications relating to other medications (includes hormones, insulin, iron, and oxytocic agents) | All patients
A standard regression model was used to estimate the risk of experiencing a
complication for each patient. This was done by weighting the patient records of the
hospital by the regression coefficients associated with the corresponding terms in the
prediction models and intercept term. This method produced the expected probability
of a complication for each patient based on the experience of the norm for patients with
similar characteristics. After assigning the predicted probability of a complication for
each patient in each risk group, it was then possible to aggregate the patient-level data
across a variety of groupings,36–39 including health system, hospital, service line, or the
MS-DRG classification system. This model takes into account only patient conditions that
are present on admission when calculating risk. Additionally, in response to the upcoming
transition to ICD-10-CM, diagnosis and procedure codes, and the interactions among
them, have been mapped to AHRQ CCS for assignment of risk instead of using the
individual diagnosis, procedure, and interaction effects.
Index Interpretation
An outcome index is a ratio of an observed number of outcomes to an expected number
of outcomes in a particular population. This index is used to make normative comparisons
and is standardized in that the expected number of events is based on the occurrence
of the event in a normative population. The normative population used to calculate
expected numbers of events is selected to be similar to the comparison population with
respect to relevant characteristics, including age, sex, region, and case mix.
The index is simply the number of observed events divided by the number of expected
events and can be calculated for outcomes that involve counts of occurrences (e.g.,
deaths or complications). Interpretation of the index relates the experience of the
comparison population relative to a specified event to the expected experience based on
the normative population.
Examples:
10 events observed ÷ 10 events expected = 1.0: The observed number of events is equal to
the expected number of events based on the normative experience.
10 events observed ÷ 5 events expected = 2.0: The observed number of events is twice
the expected number of events based on the normative experience.
10 events observed ÷ 25 events expected = 0.4: The observed number of events is 60
percent lower than the expected number of events based on the normative experience.
Therefore, an index value of 1.0 indicates no difference between observed and expected
outcome occurrence. An index value greater than 1.0 indicates an excess in the observed
number of events relative to the expected based on the normative experience. An index
value less than 1.0 indicates fewer events observed than would be expected based on
the normative experience. An additional interpretation is that the difference between 1.0
and the index is the percentage difference in the number of events relative to the norm.
In other words, an index of 1.05 indicates 5 percent more outcomes, and an index of 0.90
indicates 10 percent fewer outcomes than expected based on the experience of the norm.
The index can be calculated across a variety of groupings (e.g., hospital, service).
Core Measures
Core measures were developed by The Joint Commission (TJC) and endorsed by the
National Quality Forum (NQF), the nonprofit public-private partnership organization
that endorses national healthcare performance measures, as minimum basic care
standards. They have been a widely accepted method for measuring quality of patient
care that includes specific guidelines for a wide variety of patient conditions. CMS no
longer requires reporting of the core measures formerly used in the study — heart attack
(acute myocardial infarction, or AMI), heart failure (HF), pneumonia, and surgical care
improvement project (SCIP) measures — so these have been dropped. In their place, we
are now including stroke care and blood clot prevention core measures in our composite
core measures mean percent metric. The data in this study are for Oct. 1, 2013, through
Sept. 30, 2014.
Stroke Care Core Measures
STK-1: Ischemic or hemorrhagic stroke patients who received treatment to keep blood clots from forming anywhere in the body within 2 days of arriving at the hospital
STK-4: Ischemic stroke patients who got medicine to break up a blood clot within 3 hours after symptoms started
STK-6: Ischemic stroke patients needing medicine to lower cholesterol, who were given a prescription for this medicine before discharge
STK-8: Ischemic or hemorrhagic stroke patients or caregivers who received written educational materials about stroke care and prevention during the hospital stay

Blood Clot Prevention and Treatment Core Measures
VTE-1: Patients who got treatment to prevent blood clots on the day of or day after hospital admission or surgery
VTE-2: Patients who got treatment to prevent blood clots on the day of or day after being admitted to the intensive care unit (ICU)
VTE-3: Patients with blood clots who got the recommended treatment, which includes using two different blood thinner medicines at the same time
VTE-5: Patients with blood clots who were discharged on a blood thinner medicine and received written instructions about that medicine
VTE-6: Patients who developed a blood clot while in the hospital who did not get treatment that could have prevented it
If a health system was missing one or more core measure values, the comparison group
median core measure value was substituted for each missing core measure when we
calculated the health system core measure mean percent. In addition, the median core
measure value was substituted if a health system had one or more core measures with
relative standard error greater than or equal to 0.30. This was done because the percent
values are statistically unreliable.
30-Day Risk-Adjusted Mortality Rates and 30-Day Risk-Adjusted
Readmission Rates
This study currently includes two extended outcome measures — 30-day mortality and
30-day readmission rates, as developed by CMS and published in the Hospital Compare
dataset. The longitudinal data period contained in this analysis is July 1, 2011, through
June 30, 2014. The Hospital Compare website and database were created by CMS, the
U.S. Department of Health and Human Services, and other members of the Hospital
Quality Alliance. The data on the website come from hospitals that have agreed to submit
quality information that will be made public. Both of the measures used in this study have
been endorsed by the NQF.
CMS calculates the 30-day mortality and 30-day readmission rates from Medicare
enrollment and claims records using sophisticated statistical modeling techniques that
adjust for patient-level risk factors and account for the clustering of patients within
hospitals. In this study, we used the 30-day mortality rates for AMI, HF, pneumonia,
chronic obstructive pulmonary disease (COPD), and stroke patients, and the 30-day
readmission rates for AMI, HF, pneumonia, elective total hip or knee arthroplasty, COPD,
and stroke.
The individual CMS mortality models estimate hospital-specific, risk-standardized, all-cause, 30-day mortality rates for patients hospitalized with a principal diagnosis of AMI,
HF, pneumonia, COPD, and stroke. All-cause mortality is defined as death from any cause
within 30 days after the admission date, regardless of whether the patient dies while still
in the hospital or after discharge.
The individual CMS readmission models estimate hospital-specific, risk-standardized, all-cause, 30-day readmission rates for patients discharged alive to a non-acute care setting
with a principal diagnosis of AMI, HF, pneumonia, elective total hip or knee arthroplasty,
COPD, and stroke. Patients may have been readmitted back to the same hospital or to
a different hospital. They may have been readmitted for the same condition as their
recent hospital stay or for a different reason (this is to discourage hospitals from coding
similar readmissions as different readmissions).27 All readmissions that occur 30 days after
discharge to an acute setting are included, with a few exceptions. CMS does not count
planned admissions (obstetrical delivery, transplant surgery, maintenance chemotherapy,
rehabilitation, and non-acute admissions for a procedure) as readmissions.
CMS does not calculate rates for hospitals where the number of cases is too small (fewer
than 25). If a health system had no available hospital rates for all 30-day mortality or
30-day readmission measures, we excluded the system. If one or two individual 30-day
mortality or 30-day readmission rates were missing, we substituted the comparison
group median rate for the affected measure.
The ranked measures in this study were the mean of the individual 30-day mortality rates
and the mean of the individual 30-day readmission rates. This was a change from prior
years, when we ranked each individual measure separately.
Average Length of Stay
We use the Truven Health proprietary severity-adjusted resource demand methodology
for the LOS performance measure. The LOS severity-adjustment model is calibrated using
our normative PIDB — a national, all-payer database containing more than 23 million allpayer discharges annually, described in more detail at the beginning of this appendix.
Our severity-adjusted resource demand model allows us to produce risk-adjusted
performance comparisons on LOS between or across virtually any subgroup of inpatients.
These patient groupings can be based on clinical groupings, hospitals, product lines,
geographic regions, physicians, etc. This regression model adjusts for differences in
diagnosis type and illness severity, based on ICD-9-CM coding. It also adjusts for patient
age, gender, and admission status. Its associated LOS weights allow group comparisons
on a national level and in a specific market area. In response to the upcoming transition
to ICD-10-CM, diagnosis, procedure, and interaction codes have been mapped to AHRQ
CCS for severity assignment instead of using the individual diagnosis, procedure, and
interaction effects.
Present-on-admission (POA) coding allowed us to determine appropriate adjustments to LOS weights based on pre-existing conditions versus complications that occurred during hospital care. We calculated expected values from model coefficients that were normalized to the clinical group and transformed from the log scale.
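The details of the proprietary resource demand model are not published, but the final back-transformation step can be illustrated with a minimal, purely hypothetical Python sketch. The coefficient names and values below are invented for illustration; the real model's covariates (CCS groupings, age, gender, admission status, POA-based adjustments) are far richer.

```python
import math

# Hypothetical coefficients on the log(LOS) scale; the actual Truven Health model
# and its covariates are proprietary and are not reproduced here.
COEFFICIENTS = {"intercept": 1.10, "ccs_pneumonia": 0.35, "age_75_plus": 0.08, "emergency_admit": 0.12}

def expected_los_days(patient_flags):
    """Sum the applicable log-scale coefficients and back-transform to expected days."""
    log_expected = COEFFICIENTS["intercept"] + sum(
        COEFFICIENTS[name] for name, present in patient_flags.items() if present
    )
    return math.exp(log_expected)

# Hypothetical elderly pneumonia patient admitted electively
print(round(expected_los_days({"ccs_pneumonia": True, "age_75_plus": True, "emergency_admit": False}), 1))  # 4.6
```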
Emergency Department Throughput Measure
New to our study this year, we have included a ranked metric based on three Emergency Department (ED) throughput measures from the CMS Hospital Compare dataset. The hospital ED is an important access point to healthcare for many people. A key factor in evaluating ED performance is process “throughput”: how quickly patients receive treatment and are either admitted or discharged. Timely ED processes affect both the quality of care and the quality of the patient experience. We chose to include measures that capture three important ED processes: time from door to admission, time from door to discharge for non-admitted patients, and time to receipt of pain medication for broken bones. See below for a complete description of each measure.
The measure data from CMS Hospital Compare are published in median minutes and are based on calendar year 2014 data. Our ranked metric is the mean of the three included measures. For the purpose of calculating this composite, the hospital’s comparison group median value was substituted for any missing measure. Hospitals missing all three included measures were excluded from the study.
ED Throughput Measures
ED-1b: Average time patients spent in the emergency department before they were admitted to the hospital as an inpatient
OP-18b: Average time patients spent in the emergency department before being sent home
OP-21: Average time patients who came to the emergency department with broken bones had to wait before receiving pain medication
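As a concrete illustration of the composite described above, the following Python sketch computes the mean of the three ED measures for one hospital, substituting the comparison-group median for a missing value. The measure IDs follow the table; the minute values and the function name are hypothetical.

```python
from statistics import mean

ED_MEASURES = ["ED-1b", "OP-18b", "OP-21"]  # CMS Hospital Compare ED throughput measures

def ed_throughput_composite(hospital_minutes, group_medians):
    """Mean of the three median-minute measures for one hospital.

    A missing measure is replaced with the hospital's comparison-group median;
    hospitals missing all three measures are excluded from the study.
    """
    if all(hospital_minutes.get(m) is None for m in ED_MEASURES):
        raise ValueError("All ED measures missing; exclude hospital")
    return mean(
        hospital_minutes[m] if hospital_minutes.get(m) is not None else group_medians[m]
        for m in ED_MEASURES
    )

# Hypothetical hospital missing the OP-21 measure
print(ed_throughput_composite(
    {"ED-1b": 260, "OP-18b": 135, "OP-21": None},
    {"ED-1b": 275, "OP-18b": 140, "OP-21": 55},
))  # 150
```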
Medicare Spending per Beneficiary Index
The Medicare spending per beneficiary (MSPB) index is included in the study as a proxy
for episode-of-care cost efficiency for hospitalized patients. CMS develops and publishes
this risk-adjusted index in the public Hospital Compare datasets, and beginning in FFY 2015, the index was incorporated into the CMS Value-Based Purchasing Program. CMS states that the measure is intended “…to reward hospitals that can provide efficient care at a lower cost to Medicare.” In this study, we used data for calendar year 2014.
The MSPB index evaluates hospitals’ efficiency relative to the efficiency of the median
hospital, nationally. Specifically, the MSPB index assesses the cost to Medicare of services
performed by hospitals and other healthcare providers during an MSPB episode, which
comprises the period three days prior to, during, and 30 days following a patient’s
hospital stay. Payments made by Medicare and the beneficiary (i.e., allowed charges) are
counted in the MSPB episode as long as the start of the claim falls within the episode
window. Inpatient Prospective Payment System (IPPS) outlier payments (and outlier
payments in other provider settings) are also included in the calculation of the MSPB
index. The index is available for Medicare beneficiaries enrolled in Medicare Parts A
and B who were discharged from short-term acute care hospitals during the period of
performance. Medicare Advantage enrollees are not included. This measure excludes
patients who died during the episode.
The MSPB index is calculated by dividing each hospital’s risk-adjusted average
episode cost by the national hospital median. The hospital’s MSPB amount is the sum
of standardized, risk-adjusted spending across all of a hospital’s eligible episodes
divided by the number of episodes for that hospital. This is divided by the median
MSPB amount across all episodes nationally. CMS adjusts spending amounts for area
price variation and also for various risk factors including case mix, age, and hierarchical
condition category indicators.
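A simplified Python sketch of the index arithmetic described above follows; the episode amounts are hypothetical, and CMS’s payment standardization and risk adjustment are assumed to have already been applied upstream.

```python
def mspb_index(standardized_episode_payments, national_median_episode_amount):
    """Hospital MSPB amount (average standardized, risk-adjusted spending per episode)
    divided by the median MSPB amount across all episodes nationally."""
    hospital_amount = sum(standardized_episode_payments) / len(standardized_episode_payments)
    return hospital_amount / national_median_episode_amount

# Hypothetical standardized episode payments (dollars) for one hospital
episodes = [21_500, 18_200, 24_900, 19_700]
print(round(mspb_index(episodes, national_median_episode_amount=22_000), 3))  # 0.958
```

An index below 1.0 indicates lower-than-median standardized episode spending; an index above 1.0 indicates higher spending.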
HCAHPS Overall Hospital Rating
To measure patient perception of care, our study used the Hospital Consumer
Assessment of Healthcare Providers and Systems (HCAHPS) patient survey results.
HCAHPS is a standardized survey instrument and data collection methodology for
measuring patients’ perspectives on their hospital care. HCAHPS is a core set of
questions that can be combined with customized, hospital-specific items to produce
information that complements the data hospitals currently collect to support internal
customer service and quality-related activities.
HCAHPS was developed through a partnership between CMS and AHRQ that had three
broad goals:
• Produce comparable data on patients’ perspectives of care that allow objective and meaningful comparisons among hospitals on topics that are important to consumers
• Encourage public reporting of the survey results to create incentives for hospitals to improve quality of care
• Enhance public accountability in healthcare by increasing the transparency of the quality of hospital care provided in return for the public investment
The HCAHPS survey has been endorsed by the NQF and the Hospital Quality Alliance.
The federal government’s Office of Management and Budget has approved the national
implementation of HCAHPS for public reporting purposes.
Voluntary collection of HCAHPS data for public reporting began in October 2006. The
first public reporting of HCAHPS results, which encompassed eligible discharges from
October 2006 through June 2007, occurred in March 2008. HCAHPS results are posted
on the Hospital Compare website, found at hospitalcompare.hhs.gov, or through a link
on medicare.gov. A downloadable version of HCAHPS results is available.40
For this study, we used Hospital Compare data for calendar year 2013. Although we are
reporting health system performance on all HCAHPS questions, only performance on the
overall hospital rating question, “How do patients rate the hospital, overall?” is used to
rank health system performance.
At the hospital level, patient responses fall into three categories,* and the number of
patients in each category is reported as a percent:
• Patients who gave a rating of 6 or lower (low)
• Patients who gave a rating of 7 or 8 (medium)
• Patients who gave a rating of 9 or 10 (high)
For each answer category, we assigned a weight as follows: 3 equals high or good
performance, 2 equals medium or average performance, and 1 equals low or poor
performance. We then calculated a weighted score for each hospital by multiplying the
HCAHPS answer percent by the category weight. For each hospital, we summed the
weighted percent values for the three answer categories. This produced the hospital
HCAHPS score. The highest possible HCAHPS score is 300 (100 percent of patients rate
the hospital high). The lowest possible HCAHPS score is 100 (100 percent of patients rate
the hospital low).
To calculate the system-level score, we multiplied the HCAHPS score for each hospital in the system by that hospital’s total 2014 MEDPAR discharges. This produced the hospital’s weighted HCAHPS score.
To calculate the HCAHPS score for each health system, we summed the member hospital
weighted HCAHPS scores, summed the member hospital 2014 MEDPAR discharges, then
divided the sum of the weighted HCAHPS scores by the sum of the discharges. This
produced the health system mean weighted HCAHPS score, which is the measure used in
the study.
We applied this same methodology to each individual HCAHPS question to produce
mean weighted HCAHPS scores for the systems. These values were reported for
information only in the study.
* Exception: The “would refer” question has only two response categories — “yes” and “no.”
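To make the weighting arithmetic above concrete, here is a minimal Python sketch of the hospital score and the discharge-weighted system score. The answer-category percentages and discharge counts are hypothetical.

```python
def hospital_hcahps_score(pct_low, pct_medium, pct_high):
    """Weighted sum of answer-category percents: low = 1, medium = 2, high = 3.
    Ranges from 100 (all patients rate low) to 300 (all patients rate high)."""
    return 1 * pct_low + 2 * pct_medium + 3 * pct_high

def system_hcahps_score(member_hospitals):
    """Mean weighted HCAHPS score for a system.

    `member_hospitals` is a list of (hcahps_score, medpar_discharges) pairs;
    each hospital's score is weighted by its 2014 MEDPAR discharges.
    """
    weighted_sum = sum(score * discharges for score, discharges in member_hospitals)
    total_discharges = sum(discharges for _, discharges in member_hospitals)
    return weighted_sum / total_discharges

# Hypothetical two-hospital system
hosp_a = (hospital_hcahps_score(8, 22, 70), 12_000)   # score 262
hosp_b = (hospital_hcahps_score(12, 30, 58), 4_000)   # score 246
print(round(system_hcahps_score([hosp_a, hosp_b]), 1))  # 258.0
```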
Methodology for Financial Performance Measures
Data Source
The financial measures included in this study were from the annual audited, consolidated
financial statements of in-study health systems, when they were available. Consolidated
balance sheets and consolidated statements of operations for the 2014 fiscal year were
used, and 62.4 percent of in-study health systems had audited financial statements
available. This included data for 80 percent of parent and other independent health
systems, while audited financials were generally not available for subsystems
(2.6 percent). Audited financial statements were obtained from the following sites:
• Electronic Municipal Market Access (EMMA®)
• DAC Bond®
• U.S. Securities and Exchange Commission (EDGAR)
Calculations
Operating Margin (Percentage):
((Total Operating Revenue - Total Operating Expense) / Total Operating Revenue) * 100
Long-Term Debt-to-Capitalization Ratio:
(Long-Term Debt) / (Long-Term Debt + Unrestricted Net Assets)
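Expressed as a short Python sketch, with hypothetical dollar figures, the two ratios above are:

```python
def operating_margin_pct(total_operating_revenue, total_operating_expense):
    """Operating margin as a percentage of total operating revenue."""
    return (total_operating_revenue - total_operating_expense) / total_operating_revenue * 100

def long_term_debt_to_capitalization(long_term_debt, unrestricted_net_assets):
    """Long-term debt divided by total capitalization (debt plus unrestricted net assets)."""
    return long_term_debt / (long_term_debt + unrestricted_net_assets)

# Hypothetical figures in thousands of dollars
print(round(operating_margin_pct(2_500_000, 2_410_000), 2))              # 3.6
print(round(long_term_debt_to_capitalization(900_000, 2_100_000), 3))    # 0.3
```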
Performance Measure Normalization
The mortality, complications, and average LOS measures were normalized, based on
the in-study population, by comparison group, to provide a more easily interpreted
comparison among health systems. We assigned each health system in the study
to one of three comparison groups based on the sum of member hospitals’ total
operating expense. (Detailed descriptions of the comparison groups can be found in the
Methodology section of this document.)
For the mortality and complications measures, we based our ranking on the difference
between observed and normalized expected events, expressed in standard deviation
units (z-scores). We normalized the individual health system expected values by
multiplying them by the ratio of the observed-to-expected values for their comparison
group. The normalized z-scores were then calculated using the observed value, the normalized expected value, and the record count.
For the average LOS measure, we based our ranking on the normalized, severity-adjusted
LOS index expressed in days. This index is the ratio of the observed and the normalized
expected values for each health system. We normalized the individual health system
expected value by multiplying it by the ratio of the observed-to-expected value for the
comparison group. The health system’s normalized index was then calculated by dividing
the health system’s observed value by its normalized expected value. We converted
this normalized index into days by multiplying by the average LOS of all in-study health
systems (grand mean LOS).
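The following Python sketch illustrates the normalization logic described above under explicit assumptions: the comparison-group totals and counts are hypothetical, and the binomial-style standard error used in the z-score is an assumption of this sketch (the study does not publish its exact variance formula).

```python
import math

def normalized_expected(expected, group_observed_total, group_expected_total):
    """Scale a system's expected value by its comparison group's observed-to-expected ratio."""
    return expected * (group_observed_total / group_expected_total)

def mortality_z_score(observed_events, norm_expected_events, record_count):
    """Observed minus normalized expected events, in standard-deviation units.
    The binomial standard error below is an illustrative assumption only."""
    p = norm_expected_events / record_count
    std_err = math.sqrt(record_count * p * (1 - p))
    return (observed_events - norm_expected_events) / std_err

def normalized_los_days(observed_los, norm_expected_los, grand_mean_los):
    """Normalized severity-adjusted LOS index, converted to days."""
    return (observed_los / norm_expected_los) * grand_mean_los

# Hypothetical system: 118 observed vs. 98 expected deaths, comparison-group O/E ratio of 1.02
exp_norm = normalized_expected(98, group_observed_total=10_200, group_expected_total=10_000)
print(round(mortality_z_score(118, exp_norm, record_count=10_000), 2))   # about 1.81
print(round(normalized_los_days(4.9, 5.2, grand_mean_los=5.0), 2))       # 4.71
```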
Why We Have Not Calculated Percent Change in Specific Instances
Percent change is a meaningless statistic when the underlying quantity can be positive,
negative, or zero. The actual change may mean something, but dividing it by a number
that may be zero or of the opposite sign does not convey any meaningful information
because the amount of change is not proportional to its previous value.41 We also did not
report percent change when the metrics were already percentages. In these cases, we
reported the simple difference between the two percentage values.
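As a hypothetical illustration, if a health system’s operating margin moved from -2.0 percent in the base year to 3.0 percent in the study year, we would report a 5.0 percentage-point improvement; a naive percent-change calculation would yield -250 percent, a figure that conveys nothing useful about the size or direction of the improvement.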
Protecting Patient Privacy
In accordance with patient privacy laws, we do not report any individual hospital data
that are based on 11 or fewer patients. This affects the following measures:
• Risk-adjusted mortality index
• Risk-adjusted complications index
• 30-day mortality rates for AMI, HF, pneumonia, COPD, and stroke (CMS does not report a rate when the count is less than 25)
• 30-day readmission rates for AMI, HF, pneumonia, hip/knee arthroplasty, COPD, and stroke (CMS does not report a rate when the count is less than 25)
• Average LOS
Appendix D: All Health Systems in Study
Health System Name
Location
Abrazo Health Care
Phoenix, AZ
Adventist Florida Hospital
Orlando, FL
Adventist Health Central Valley Network
Hanford, CA
Adventist Health System
Winter Park, FL
Adventist Health West
Roseville, CA
Adventist Healthcare
Rockville, MD
Advocate Health Care
Downers Grove, IL
AHMC Healthcare Inc
Alhambra, CA
Alameda Health System
Alameda, CA
Alecto Healthcare Services
Long Beach, CA
Alegent Creighton Health
Omaha, NE
Alexian Brothers Health System
Arlington Heights, IL
Allegheny Health Network
Pittsburgh, PA
Allegiance Health Management
Shreveport, LA
Allina Health System
Minneapolis, MN
Alta Hospitals System LLC
Los Angeles, CA
Anderson Regional Medical Center
Meridian, MS
Appalachian Regional Healthcare (ARH)
Lexington, KY
Ardent Health Services
Nashville, TN
Asante
Medford, OR
Ascension Health
St. Louis, MO
Atlantic Health System
Morristown, NJ
Aurora Health Care
Milwaukee, WI
Avanti Health System LLC
El Segundo, CA
Avera Health
Sioux Falls, SD
Banner Health
Phoenix, AZ
Baptist Health
Montgomery, AL
Baptist Health (AR)
Little Rock, AR
Baptist Health Care (FL)
Pensacola, FL
Baptist Health of Northeast Florida
Jacksonville, FL
Baptist Health South Florida
Coral Gables, FL
Baptist Health System (MS)
Jackson, MS
Baptist Health System Inc. (AL)
Birmingham, AL
Baptist Healthcare System (KY)
Louisville, KY
Baptist Memorial Health Care Corp
Memphis, TN
Barnabas Health
West Orange, NJ
BayCare Health System
Clearwater, FL
Bayhealth
Dover, DE
Baylor Scott & White Health
Dallas, TX
Baystate Health
Springfield, MA
Beacon Health System
South Bend, IN
Beaumont Health
Royal Oak, MI
BJC Health System
St. Louis, MO
Blue Mountain Health System
Lehighton, PA
Bon Secours Health System
Marriottsville, MD
Bronson Healthcare Group
Kalamazoo, MI
Broward Health
Ft. Lauderdale, FL
Cape Cod Healthcare
Hyannis, MA
Capella Healthcare
Franklin, TN
Capital Health System
Trenton, NJ
Care New England Health System
Providence, RI
CareGroup Healthcare System
Boston, MA
CarePoint Health
Bayonne, NJ
Carilion Health System
Roanoke, VA
Carolinas HealthCare System
Charlotte, NC
Carondelet Health Network
Tucson, AZ
Carondelet Health System
Kansas City, MO
Catholic Health
Buffalo, NY
Catholic Health Initiatives
Denver, CO
Catholic Health Services of Long Island
Rockville Centre, NY
Centegra Health System
Crystal Lake, IL
Centra Health
Lynchburg, VA
Central Florida Health Alliance
Leesburg, FL
Centura Health
Englewood, CO
CHI St. Joseph Health
Bryan, TX
CHI St. Luke's Health
Houston, TX
CHI St. Vincent
Little Rock, AR
Christus Health
Irving, TX
Citrus Valley Health Partners
Covina, CA
Cleveland Clinic
Cleveland, OH
Columbia Health System
Milwaukee, WI
Columbus Regional Healthcare System
Columbus, GA
Community Foundation of Northwest Indiana
Munster, IN
Community Health Network
Indianapolis, IN
Community Health Systems
Franklin, TN
Community Hospital Corporation
Plano, TX
Community Medical Centers
Fresno, CA
Community Memorial Health System
Ventura, CA
Conemaugh Health System
Johnstown, PA
Covenant Health
Knoxville, TN
Covenant Health Systems (Northeast)
Syracuse, NY
CoxHealth
Springfield, MO
Crozer-Keystone Health System
Springfield, PA
Dartmouth Hitchcock Health
Lebanon, NH
Daughters of Charity Health System
Los Altos Hills, CA
DCH Health System
Tuscaloosa, AL
DeKalb Regional Healthcare System
Decatur, GA
Detroit Medical Center
Detroit, MI
Dignity Health
San Francisco, CA
Dimensions Health Corporation
Cheverly, MD
Duke LifePoint
Durham, NC
Duke University Health System
Durham, NC
East Texas Medical Center Regional Healthcare System
Tyler, TX
Eastern Connecticut Health Network
Manchester, CT
Eastern Maine Healthcare Systems
Brewer, ME
Einstein Healthcare Network
Philadelphia, PA
Emory Healthcare
Atlanta, GA
Essentia Health
Duluth, MN
Excela Health
Greensburg, PA
Fairview Health Services
Minneapolis, MN
Forrest Health
Hattiesburg, MS
Franciscan Alliance
Mishawaka, IN
Franciscan Health System
Tacoma, WA
Franciscan Missionaries of Our Lady Health System
Baton Rouge, LA
Franciscan Sisters of Christian Charity
Manitowoc, WI
Froedtert & the Medical College of Wisconsin
Milwaukee, WI
Geisinger Health System
Danville, PA
Genesis Health System
Davenport, IA
Good Shepherd Health System
Longview, TX
Greater Hudson Valley Health System
Middletown, NY
Greenville Hospital System
Greenville, SC
Guthrie Healthcare System
Sayre, PA
Hartford HealthCare
Hartford, CT
Hawaii Health Systems Corporation
Honolulu, HI
Hawaii Pacific Health
Honolulu, HI
HCA
Nashville, TN
HCA Capital Division
Richmond, VA
HCA Central and West Texas Division
Austin, TX
HCA Continental Division
Denver, CO
HCA East Florida Division
Ft. Lauderdale, FL
HCA Far West Division
Las Vegas, NV
HCA Gulf Coast Division
Houston, TX
HCA MidAmerica (North)
Kansas City, MO
HCA MidAmerica (South)
Kansas City, MO
HCA Mountain Division
Salt Lake City, UT
HCA North Florida Division
Tallahassee, FL
HCA North Texas Division
Dallas, TX
HCA San Antonio Division
San Antonio, TX
HCA South Atlantic Division
Charleston, SC
HCA Tristar Division
Nashville, TN
HCA West Florida Division
Tampa, FL
Health Alliance of the Hudson Valley
Kingston, NY
Health First
Rockledge, FL
Health Group of Alabama
Huntsville, AL
Health Quest System
Poughkeepsie, NY
HealthEast Care System
Saint Paul, MN
HealthPartners
Bloomington, MN
Henry Ford Health System
Detroit, MI
Heritage Valley Health System
Beaver, PA
HighPoint Health System
Gallatin, TN
Hillcrest HealthCare System
Tulsa, OK
HonorHealth
Scottsdale, AZ
Hospital Sisters Health System
Springfield, IL
Houston Healthcare
Warner Robins, GA
Houston Methodist
Houston, TX
Humility of Mary Health Partners
Youngstown, OH
IASIS Healthcare
Franklin, TN
Indiana University Health
Indianapolis, IN
Infirmary Health Systems
Mobile, AL
InMed Group Inc.
Montgomery, AL
Inova Health System
Falls Church, VA
Inspira Health Network
Vineland, NJ
Integrated Healthcare Holding Inc.
Santa Ana, CA
Integris Health
Oklahoma City, OK
Intermountain Health Care
Salt Lake City, UT
John D. Archbold Memorial Hospital
Thomasville, GA
John Muir Health
Walnut Creek, CA
Johns Hopkins Health System
Baltimore, MD
KentuckyOne Health
Lexington, KY
Kettering Health Network
Dayton, OH
Lahey Health System
Burlington, MA
Lakeland HealthCare
St. Joseph, MI
Lee Memorial Health System
Fort Myers, FL
Legacy Health System
Portland, OR
Lehigh Valley Network
Allentown, PA
LifePoint Hospitals Inc.
Brentwood, TN
Lifespan Corporation
Providence, RI
Los Angeles County-Department of Health Services
Los Angeles, CA
Lourdes Health System
Camden, NJ
Lovelace Health System
Albuquerque, NM
Loyola University Health System
Maywood, IL
LSU Health System
Baton Rouge, LA
Main Line Health
Bryn Mawr, PA
MaineHealth
Portland, ME
Mary Washington Healthcare
Fredericksburg, VA
Maury Regional Healthcare System
Columbia, TN
Mayo Foundation
Rochester, MN
McLaren Health Care Corp
Flint, MI
McLeod Health
Florence, SC
MediSys Health Network
Jamaica, NY
MedStar Health
Columbia, MD
Memorial Healthcare System
Hollywood, FL
Memorial Hermann Healthcare System
Houston, TX
MemorialCare Health System
Fountain Valley, CA
Mercy
Chesterfield, MO
Mercy Health
Muskegon, MI
Mercy Health (OH)
Cincinnati, OH
Mercy Health Network
Des Moines, IA
Mercy Health Partners (Northern OH)
Toledo, OH
Mercy Health Southwest Ohio Region
Cincinnati, OH
Mercy Health System of Southeastern Pennsylvania
Philadelphia, PA
Meridian Health
Neptune, NJ
Methodist Health System (TX)
Dallas, TX
Methodist Healthcare
Memphis, TN
MidMichigan Health
Midland, MI
Ministry Health Care
Milwaukee, WI
Mission Health
Asheville, NC
Montefiore Health System
Bronx, NY
Mount Carmel Health System
Columbus, OH
Mount Sinai Health System
New York, NY
Mountain States Health Alliance
Johnson City, TN
Multicare Medical Center
Tacoma, WA
Munson Healthcare
Traverse City, MI
Nebraska Medicine
Omaha, NE
Nebraska Methodist Health System
Omaha, NE
New York City Health and Hospitals Corporation (HHC)
New York, NY
New York-Presbyterian Healthcare System
New York, NY
North Mississippi Health Services
Tupelo, MS
Northern Arizona Healthcare
Flagstaff, AZ
Northside Hospital System
Atlanta, GA
Northwell Health
Great Neck, NY
Northwestern Medicine
Chicago, IL
Novant Health
Winston-Salem, NC
Ochsner Health System
New Orleans, LA
Ohio Valley Health Services & Education Corp
Wheeling, WV
OhioHealth
Columbus, OH
Orlando Health
Orlando, FL
OSF Healthcare System
Peoria, IL
Pallottine Health Services
Huntington, WV
Palmetto Health
Columbia, SC
Palomar Health
San Diego, CA
Parkview Health
Fort Wayne, IN
Partners Healthcare
Boston, MA
PeaceHealth
Bellevue, WA
Penn Highlands Healthcare
DuBois, PA
Phoebe Putney Health System
Albany, GA
Piedmont Healthcare Inc.
Atlanta, GA
Premier Health Partners
Dayton, OH
Presbyterian Healthcare Services
Albuquerque, NM
Presence Health
Mokena, IL
Prime Healthcare Services
Ontario, CA
Progressive Acute Care LLC
Mandeville, LA
ProHealth Care
Waukesha, WI
ProMedica Health System
Toledo, OH
Providence Health & Services
Renton, WA
Regional Health
Rapid City, SD
RegionalCare Hospital Partners
Brentwood, TN
Renown Health
Reno, NV
Riverside Health System
Newport News, VA
Robert Wood Johnson Health Network
New Brunswick, NJ
Rochester Regional Health System
Rochester, NY
Roper St. Francis Healthcare
Charleston, SC
Sacred Heart Health System
Pensacola, FL
Saint Alphonsus Health System
Boise, ID
Saint Francis Health System
Tulsa, OK
Saint Joseph Mercy Health System
Ann Arbor, MI
Saint Joseph Regional Health System
Mishawaka, IN
Saint Luke's Health System
Kansas City, MO
Saint Thomas Health
Nashville, TN
Samaritan Health Services
Corvallis, OR
Sanford Health
Sioux Falls, SD
Schuylkill Health System
Pottsville, PA
SCL Denver Region
Denver, CO
SCL Health
Denver, CO
Scripps Health
San Diego, CA
Sentara Healthcare
Norfolk, VA
Seton Healthcare Network
Austin, TX
Shands HealthCare
Gainesville, FL
Sharp Healthcare Corporation
San Diego, CA
Sisters of Charity Health System
Cleveland, OH
Southeast Georgia Health System
Brunswick, GA
Southern Illinois Healthcare
Carbondale, IL
Southwest Healthcare Services
Scottsdale, AZ
Sparrow Health System
Lansing, MI
Spartanburg Regional Healthcare System
Spartanburg, SC
Spectrum Health
Grand Rapids, MI
SSM Health
St. Louis, MO
St. Charles Health System
Bend, OR
St. Elizabeth Healthcare
Fort Thomas, KY
St. John Health System
Tulsa, OK
St. John Providence Health (MI)
Detroit, MI
St. Joseph Health System
Orange, CA
St. Joseph/Candler Health System
Savannah, GA
St. Luke's Health System
Boise, ID
St. Peters Health Partners
Albany, NY
St. Vincent Health
Indianapolis, IN
St. Vincents Health System (AL)
Birmingham, AL
St. Vincent's Healthcare
Jacksonville, FL
Steward Health Care System
Boston, MA
Success Health
Boca Raton, FL
Summa Health System
Akron, OH
SunLink Health Systems
Atlanta, GA
Susquehanna Health System
Williamsport, PA
Sutter Bay Division
Sacramento, CA
Sutter Health
Sacramento, CA
Sutter Health Valley Division
Sacramento, CA
Swedish
Seattle, WA
Tanner Health System
Carrollton, GA
Temple University Health System
Philadelphia, PA
Tenet California
Anaheim, CA
Tenet Central
Dallas, TX
Tenet Florida
Ft. Lauderdale, FL
Tenet Healthcare Corporation
Dallas, TX
Tenet Southern
Atlanta, GA
Texas Health Resources
Arlington, TX
The Manatee Healthcare System
Bradenton, FL
The University of Vermont Health Network
Burlington, VT
The Valley Health System
Las Vegas, NV
ThedaCare
Appleton, WI
TriHealth
Cincinnati, OH
Trinity Health
Livonia, MI
Trinity Regional Health System
Moline, IL
Truman Medical Center Inc.
Kansas City, MO
UAB Health System
Birmingham, AL
UC Health
Cincinnati, OH
UMass Memorial Health Care
Worcester, MA
United Health Services
Binghamton, NY
UnityPoint Health
Des Moines, IA
Universal Health Services Inc.
King of Prussia, PA
University Hospitals Health System
Cleveland, OH
University of California Health System
Los Angeles, CA
University of Colorado Health
Aurora, CO
University of Maryland Medical System
Baltimore, MD
University of Mississippi Medical Center
Jackson, MS
University of New Mexico Hospitals
Albuquerque, NM
University of North Carolina Health
Chapel Hill, NC
University of Pennsylvania Health System
Philadelphia, PA
University of Rochester Medical Center
Rochester, NY
University of Texas System
Austin, TX
UPMC Health System
Pittsburgh, PA
UT Southwestern Medical Center
Dallas, TX
Valley Baptist Health System
Harlingen, TX
Valley Health System
Winchester, VA
Valley Health System (CA)
Hemet, CA
Via Christi Health
Wichita, KS
Vidant Health
Greenville, NC
Virtua Health
Marlton, NJ
WakeMed
Raleigh, NC
Wellmont Health System
Kingsport, TN
WellSpan Health
York, PA
WellStar Health System
Marietta, GA
West Tennessee Healthcare
Jackson, TN
West Virginia United Health System
Fairmont, WV
Wheaton Franciscan Southeast Wisconsin
Glendale, WI
Wheaton Franciscan Healthcare
Waterloo, IA
Wheeling Hospital
Wheeling, WV
Willis-Knighton Health Systems
Shreveport, LA
Yale New Haven Health Services
New Haven, CT
Note: Winning health systems are in bold, blue text.
References
1 Kaplan RS, Norton DP. The Balanced Scorecard: Measures that Drive Performance.
Harvard Bus Rev, Jan–Feb 1992.
2 Foster DA. HCAHPS 2008: Comparison Results for 100 Top Hospitals Winners
Versus Non-Winners. Ann Arbor, MI: Truven Health Analytics Center for Healthcare
Improvement. August 2008.
3 Foster DA. Risk-Adjusted Mortality Index Methodology. Ann Arbor, MI: Truven Health
Analytics Center for Healthcare Improvement. July 2008.
4 Griffith JR, Alexander JA, Foster DA. Is Anybody Managing the Store? National
Trends in Hospital Performance. Healthc Manag. 2006 Nov–Dec; 51(6):392-405;
discussion 405-6.
5 Griffith JR, Alexander JA, Jelinek RC. Measuring Comparative Hospital Performance.
Healthc Manag. 2002 Jan–Feb; 47(1).
6 Griffith JR, Knutzen SR, Alexander JA. Structural Versus Outcomes Measures in
Hospitals: A Comparison of Joint Commission and Medicare Outcomes Scores in
Hospitals. Qual Manag Health Care. 2002; 10(2): 29-38.
7 Shook J, Young J. Inpatient and Outpatient Growth by Service Line: 2006
Truven Health 100 Top Hospitals: Performance Improvement Leaders Versus Peer
Hospitals. Ann Arbor, MI: Truven Health Analytics Center for Healthcare Improvement.
August 2007.
8 Young J. Outpatient Care Standards Followed More Closely at Top-Performing
Hospitals. Ann Arbor, MI: Truven Health Analytics Center for Healthcare Improvement.
March 2011.
9 Young J. Hospitals Increase Cardiovascular Core Measure Compliance. Ann Arbor, MI:
Truven Health Analytics Center for Healthcare Improvement. November 2010.
10 Foster DA. Top Cardiovascular Care Means Greater Clinical and Financial Value.
Ann Arbor, MI: Truven Health Analytics Center for Healthcare Improvement.
November 2009.
11 Foster DA. Trends in Patient Safety Adverse Outcomes and 100 Top Hospitals
Performance, 2000–2005. Ann Arbor, MI: Truven Health Analytics Center for
Healthcare Improvement. March 2008.
12 Bonis PA, Pickens GT, Rind DM, Foster DA. Association of a clinical knowledge support
system with improved patient safety, reduced complications and shorter length of stay
among Medicare beneficiaries in acute care hospitals in the United States.
Int J Med Inform. 2008 Nov;77(11):745-53. Epub 2008 Jun 19.
13 Lee DW, Foster DA. The association between hospital outcomes and diagnostic
imaging: early findings. J Am Coll Radiol. 2009 Nov; 6(11):780-5.
14 Shook J, Chenoweth J. 100 Top Hospitals CEO Insights: Adoption Rates of Select
Baldrige Award Practices and Processes. Ann Arbor, MI: Truven Health Analytics
Center for Healthcare Improvement. October 2012.
15 Foster DA, Chenoweth J. Comparison of Baldrige Award Applicants and Recipients
With Peer Hospitals on a National Balanced Scorecard. Ann Arbor, MI: Truven Health
Analytics Center for Healthcare Improvement. October 2011.
16 Chenoweth J, Safavi K. Leadership Strategies for Reaching Top Performance Faster.
J Healthc Tech. January 2007. HCT Project Volume 4.
17 McDonagh KJ. Hospital Governing Boards: A Study of Their Effectiveness in Relation
to Organizational Performance. Healthc Manag. 2006 Nov–Dec; 51(6).
18 Bass K, Foster DA, Chenoweth J. Study Results — Proving Measurable Leadership
and Engagement Impact on Quality, CMS Invitational Conference on Leadership and
Quality. Sept 28, 2006.
19 Cejka Search and Solucient, LLC. 2005 Hospital CEO Leadership Survey.
20 HIMSS Analytics, Truven Health Analytics. 2012 HIMSS Analytics Report: Quality and
Safety Linked to Advanced Information Technology Enabled Processes. Chicago, IL:
HIMSS Analytics. April 2012.
21 Chenoweth J, Foster DA, Waibel BC. Best Practices in Board Oversight of Quality.
The Governance Institute. June 2006.
22 Kroch E, Vaughn T, Koepke M, Roman S, Foster DA, Sinha S, Levey S. Hospital Boards and Quality Dashboards. J Patient Safety. 2(1):10-19, March 2006.
23 Health Research and Educational Trust and Prybil, L. Governance in High-Performing Organizations: A Comparative Study of Governing Boards in Not-For-Profit Hospitals. Chicago: HRET in Partnership with AHA. 2005.
24 Foster DA. Hospital System Membership and Performance. Ann Arbor, MI: Truven Health Analytics Center for Healthcare Improvement. May 2012.
25 The MEDPAR data years quoted in 100 Top Hospitals studies are federal fiscal years (FFY) — a year that begins on October 1 of each calendar year and ends on September 30 of the following calendar year. FFYs are identified by the year in which they end (e.g., FFY 2012 begins in 2011 and ends in 2012). Datasets include patients discharged in the specified FFY.
26 See the CMS Hospital Compare website at hospitalcompare.hhs.gov.
27 Iezzoni L, Ash A, Shwartz M, Daley J, Hughes J, Mackiernan Y. Judging Hospitals by Severity-Adjusted Mortality Rates: The Influence of the Severity-Adjustment Method. Am J Public Health 1996; 86(10):1379-1387.
28 Iezzoni L, Shwartz M, Ash A, Hughes J, Daley J, Mackiernan Y. Using Severity-Adjusted Stroke Mortality Rates to Judge Hospitals. Int J Qual Health C. 1995;7(2):81-94.
29 Foster D. Model-Based Resource Demand Adjustment Methodology. Truven Health Analytics. July 2012.
30 Elixhauser A, Steiner C, Palmer L. Clinical Classifications Software (CCS), 2014. U.S. Agency for Healthcare Research and Quality. Available via hcup-us.ahrq.gov/toolssoftware/ccs/ccs.jsp.
31 DesHarnais SI, McMahon LF Jr, Wroblewski RT. Measuring Outcomes of Hospital Care
Using Multiple Risk-Adjusted Indexes. Health Services Research, 26, no. 4
(Oct 1991):425-445.
32 DesHarnais SI, et al. The Risk-Adjusted Mortality Index: A New Measure of Hospital Performance. Med Care. 26, no. 12 (Dec 1988):1129-1148.
33 DesHarnais SI, et al. Risk-Adjusted Quality Outcome Measures: Indexes for Benchmarking Rates of Mortality, Complications, and Readmissions. Qual Manag Health Care. 5 (Winter 1997):80-87.
34 DesHarnais SI, et al. Measuring Hospital Performance: The Development and Validation of Risk-Adjusted Indexes of Mortality, Readmissions, and Complications. Med Care. 28, no. 12 (Dec 1990):1127-1141.
35 Iezzoni LI, et al. Chronic Conditions and Risk of In-Hospital Death. Health Serv Res. 29, no. 4 (Oct 1994): 435-460.
36 Iezzoni LI, et al. Identifying Complications of Care Using Administrative Data. Med Care. 32, no. 7 (Jul 1994): 700-715.
37 Iezzoni LI, et al. Using Administrative Data to Screen Hospitals for High Complication Rates. Inquiry. 31, no. 1 (Spring 1994): 40-55.
38 Iezzoni LI. Assessing Quality Using Administrative Data. Ann Intern Med. 127, no. 8 (Oct 1997):666-674.
39 Weingart SN, et al. Use of Administrative Data to Find Substandard Care: Validation of the Complications Screening Program. Med Care. 38, no. 8 (Aug 2000):796-806.
40 See the CMS Hospital Compare website at www.hospitalcompare.hhs.gov.
41 The Wall Street Journal, New York, NY, Online Help: Digest of Earnings.
online.wsj.com/public/resources/documents/doe-help.htm.
For More Information
Visit 100tophospitals.com, call 800.525.9083 (option 4), or send an email to 100tophospitals@truvenhealth.com.
Truven Health Analytics, an IBM® Company
Truven Health Analytics, an IBM Company, delivers the answers that clients need to improve healthcare quality and access while reducing costs. We provide
market-leading performance improvement solutions built on data integrity, advanced analytics, and domain expertise. For more than 40 years, our insights
and solutions have been providing hospitals and clinicians, employers and health plans, state and federal government agencies, life sciences companies, and
policymakers the facts they need to make confident decisions that directly affect the health and well-being of people and organizations in the U.S. and around
the world.
Truven Health Analytics owns some of the most trusted brands in healthcare, such as MarketScan,® 100 Top Hospitals,® Advantage Suite,® Micromedex,® Simpler,®
ActionOI,® and JWA. Truven Health has its principal offices in Ann Arbor, Mich.; Chicago, Ill.; and Denver, Colo. For more information, please visit truvenhealth.com.
truvenhealth.com | 1.800.366.7526
©2016 Truven Health Analytics Inc. All rights reserved. All other product names used herein are trademarks of their respective owners. TOP 16644 0416