Rating the rankings: How ‘top hospital’ lists define quality

It's the mid-1980s. You're an accountant, not a doctor, and you've just been told that your mother has a rare type of cancer. You respect and trust her physician, but this is your mother, and she needs serious medicine. How do you find the very best possible treatment?

If you knew Mortimer Zuckerman, you might have asked him. Twenty years ago, Mr. Zuckerman, the owner and editor-in-chief of U.S. News & World Report, was on the board of Sloan-Kettering Hospital in New York. People often asked him to suggest the best hospital for loved ones with rare or advanced cancer, intractable heart failure and other grave ailments.

But which hospitals were the “best”? Mr. Zuckerman thought he might be able to use his magazine to help answer that question. He asked U.S. News assistant managing editor Avery Comarow, who was running the “News You Can Use” section, to figure out how to identify the best hospitals across the nation for patients with serious or complicated conditions.

“The idea, from the beginning, was to focus on a patient population that was most desperate, most in need of helpful information and guidance,” Mr. Comarow said.

In 1990, U.S. News & World Report launched America's Best Hospitals, listing hospitals alphabetically in 12 specialties based solely on reputation. Beginning in 1993, the magazine began ranking hospitals, using a methodology created by the National Opinion Research Center at the University of Chicago in Illinois (see sidebar “How U.S. News & World Report rates hospitals”). The rankings are currently developed in conjunction with RTI International, a research institute in Research Triangle Park, N.C., and appear annually in the magazine.

What's out there?

U.S. News & World Report's widely read consumer guide is now one of more than 50 print or Web-based publications rating hospitals. The trend has grown in tandem with the industry's move toward quality measurement and pay-for-performance, which has put pressure on hospitals to quantify how well they meet predetermined standards of safety and patient care.

It's part of a change in culture, said Eric Mazur, FACP, chair of the department of medicine at Norwalk Hospital in Norwalk, Conn. The process, he said, is “transforming medicine from an artisan-dominated industry to one that serves the public at large in an industrial, mass production model … in which payers and consumers are demanding transparency and measurable quality.”

Various organizations have stepped in to answer the demand for data. U.S. News & World Report's America's Best Hospitals and HealthGrades' America's 50 Best Hospitals (healthgrades.com) are among the efforts aimed at health care consumers and patients. Others, like Solucient Center for Healthcare Improvement's 100 Top Hospitals: National Benchmarks for Success (100tophospitals.com), target hospitals and providers.

“Our study isn't for consumers, it's for hospital leadership,” said Jean Chenoweth, senior vice president of Solucient. “The reports are designed to help hospitals see where they should improve and how they perform compared to other hospitals across the nation.”

Solucient, a proprietary information products company, uses data from Medicare Provider Analysis and Review (MedPAR) billing files and Medicare cost reports to examine cost, financial and market performance, and quality. Its list of top hospitals is available to the public on its Web site at no cost, while hospitals are charged a $1,100 fee for their scores and full report.

CMS's Hospital Compare Web site (hospitalcompare.hhs.gov) doesn't rank hospitals, but publishes data, based on patient hospital records, on 21 quality measures focusing on heart attack, heart failure, pneumonia and surgical care.

“A lot of these data are used by other organizations, and can be accessed by the public to compare hospitals and allow hospitals to monitor and improve their quality,” said Michael Rapp, MD, director of the quality measurement and health assessment group at CMS.

The Joint Commission, meanwhile, evaluates and sets standards on measures from patient safety to human resource issues and pain management. A directory of Joint Commission-accredited institutions, as well as hospital performance measures on heart attack, heart failure, pneumonia, pregnancy and surgical infection prevention, can be found on its Quality Check Web site (qualitycheck.org).

A powerful motivator

Ideally, rankings would help patients make informed decisions about which hospital to choose for specific types of care. But so far, hospitals may have been paying more attention to the data. Rick Wade, senior vice president of the American Hospital Association, said that although the first U.S. News list was useful to the public, “it was more useful to hospitals as a benchmark.” Rankings, he said, are a “powerful motivator for hospitals to look at quality improvement.”

“What is clear is that hospitals and doctors respond to these rankings,” said Robert Wachter, FACP, chief of medical service at University of California, San Francisco, Medical Center. “We're seeing very impressive improvements on quality measures.” A hospital that doesn't make the “best of” list for heart attacks, for example, might make a special effort to improve in that area.

Hospitalist programs also help improve quality performance, according to a study Solucient conducted this past summer.

“We found if a hospital did well on the 100 Top Hospital ratings, more likely than not it already adopted the use of hospitalists to treat inpatients, even in the smallest hospitals,” Ms. Chenoweth said.

The value of hospitalists lies in their ability to increase treatment standardization, according to Dr. Rapp. “Standardization can have a significant impact on quality by making things more efficient and reliable,” he said.

“The buzzword in the safety world is high reliability,” said Dr. Wachter, a founder of the hospitalist movement. “Most hospitalists have grown up in an era where we appreciate that improving quality of care doesn't depend on the brilliance of particular physicians, but on creating a good system. Hospitalists are well positioned to lead that kind of effort.”

Measuring the metrics

The strongest criticism of ratings lists is that they are so different from one another.

“It's confusing,” said Rachel M. Werner, ACP Member, assistant professor of medicine at the University of Pennsylvania School of Medicine in Philadelphia. And, she said, because a variety of methods are used to measure quality, comparing lists is like comparing apples and oranges.

Another problem, Dr. Werner said, is that some rating systems are “only measuring discrete aspects of care … they don't tell about the overall care a patient receives. It's important to understand this when looking at rankings.”

“We need to appreciate and adjust for how sick patients are when they come in the front door,” said Dr. Wachter. “That's part of the problem right now. We don't have the science to do that very effectively.”

Quality measurements, like those on CMS's Hospital Compare Web site, capture basic processes performed in a reliable, predictable way, Dr. Wachter said. “But they don't really answer the question, ‘Can you take care of a really complicated patient?’ Nothing in present measures captures that.”

Dr. Werner's research on performance measures has led her to one overall conclusion: “We need to measure more.”

But exactly what can, and should, be measured is another question. Experts disagree, for example, on the importance of a hospital's reputation.

“U.S. News uses [reputation] as a surrogate for process measures. Since no medical literature has demonstrated a link between reputation and actual quality, it would be better if performance vis-à-vis publicly reported measures were substituted,” said Thomas Balcezak, ACP Member, associate chief of staff at Yale-New Haven Hospital in New Haven, Conn.

Dr. Wachter, on the other hand, feels reputation is important because it “captures some element of quality that can't be captured by more quantitative measures.”

Experts caution that although top hospital lists and quality measures can provide useful information, no report is perfect.

“Performance and quality measurement in general are important, and the measures used in Hospital Compare are great first steps, but we haven't yet developed measures that truly capture differences in quality [among] hospitals,” Dr. Werner said. “We need to seek out measures that explain these differences.”

Dr. Mazur agreed. “We're just not there yet,” he said. “But just because it's imperfect doesn't mean we should not be participating. We should be part of the process to make it better.”

“In America, we love our rankings,” Mr. Comarow observed. But getting the best care, he cautioned, involves more than looking at a list. “Someone must do the homework … call top places and ask questions. You can't view rankings as the end, but the beginning.”