
June 03, 2014

Veterans Affairs Hospitals Vary Widely in Patient Care

The Phoenix facility at the heart of the crisis at the Department of Veterans Affairs is among a number of VA hospitals that show significantly higher rates of mortality and dangerous infections than the agency’s top-tier hospitals, internal records show.

The criticism that precipitated last week’s resignation of VA Secretary Eric Shinseki has focused largely on excessive wait times for appointments across the VA’s 150-hospital medical system.

But a detailed tabulation of outcomes at a dozen VA hospitals made available to The Wall Street Journal illustrates a deeper challenge: vastly disparate treatment results and what some VA doctors contend is the slippage of quality in recent years at some VA facilities.

Some of the discrepancies are stark, especially for an agency known for offering high-quality care in 50 states.

The rate of potentially lethal bloodstream infections from central-intravenous lines was more than 11 times as high among patients at the Phoenix facility as at top VA hospitals, data from the year ended March 31, 2014, show.

Those infections can lead to sepsis, which can quickly cause multiple organ failure and kill an otherwise relatively healthy patient within days or even hours. The data don’t show what percentage of patients died as a result.

Among patients admitted to the hospital for acute care, the Phoenix VA Health Care System had a 32% higher 30-day death rate than did the top-performing VA hospitals, a finding flagged as statistically significant by the agency’s medical analysts.

By contrast, Boston’s VA hospital, considered among the system’s best, had a central-IV-line bloodstream-infection rate 63% below the average of the top-performing hospitals. It also had a slightly better-than-average 30-day mortality rate for acute care.

Scott McRoberts, spokesman for the Phoenix VA Health Care System, said on Monday the database "is an internal measurement system to benchmark our improvement, and is not for public consumption."

Variations in the quality of health care exist outside the VA system as well, though they are difficult to measure because relatively few hospital groups report a range of medical outcomes. But some experts in medical-quality measurement say the VA discrepancies stand out.

"Wide variations are a problem at both the VA and private hospitals. But I would expect to see much smaller variations in a national, integrated delivery system like the VA," said Ashish Jha, a professor at the Harvard School of Public Health and a physician in the VA system.

In all, the data point to VA hospitals in Phoenix, Atlanta, Houston and Dublin, Ga., as among the system’s lower-rated facilities, while those in Boston, Cleveland and Minneapolis rank among the top performers, according to VA officials and internal documents.

The findings come from a nonpublic VA database called Strategic Analytics for Improvement and Learning, known as SAIL. SAIL tracks procedure outcomes and ranks VA hospitals on a scale of five stars, the best, to one star, the lowest.

The SAIL data tabulate hospital performance across a wide range of safety measures, such as acute-care deaths, deaths from congestive heart failure and pneumonia, and deaths from avoidable causes like urinary-tract infections and ventilator-associated pneumonia.

A VA spokesman said SAIL has emerged as a useful barometer for the agency, but is "still very much a work in progress" whose efficacy will increase as the agency "gains more experience with it and refines its development and use."

On Tuesday morning, the spokesman said the VA’s Veterans Health Administration targets "facilities that fail to demonstrate improvement" and subjects those hospitals "to increasing degrees of scrutiny and oversight by VHA leadership."

The VA spokesman didn’t identify specific hospital centers that might be subject to higher scrutiny.

The VA’s inspector general in recent months has publicly used the SAIL data to point out significant problems at individual hospitals, illustrating how valuable the information has become in identifying health-care problems.

VA hospitals in Atlanta, Houston and Dublin, Ga., declined to comment or referred questions to the national VA office. The VA spokesman wouldn’t address variations in care.

In other examples of variations in care, the Atlanta VA Medical Center, a two-star hospital for quality, had more than three times the central-IV infection rate of the average five-star VA hospital. Houston’s VA hospital, ranked two stars, had an acute-care mortality rate 47% higher than the five-star hospital rate.

The VA has disclosed a variety of health-care quality data for its hospitals, often including more information than the great majority of private hospitals make public.

It has built several Internet portals that allow the public to see information about infection rates, among other things, and how it compares with agency goals.

But the data the VA publishes are less detailed and offer less ability to compare hospitals than SAIL does.

For example, the Phoenix VA doesn’t appear to be an outlier on the VA’s main comparison website, www.hospitalcompare.va.gov.

By comparison, Pennsylvania has published hospital-specific medical outcomes from private and other hospitals for decades. They showed great variations initially, but a look at the current data set appears to show less variability among institutions.

The VA has long been on the cutting edge of medical advances, including in its gathering and disseminating of data.

It pioneered a national surgery quality-improvement program in recent decades that rigorously measured its own surgeons’ performance. By 2011, it had also taken the extraordinary step of beginning to publish some of its hospitals’ medical-complication and surgical-death rates, with an eye toward ratcheting up excellence.

At the same time, the wide variation in outcomes among facilities sparked an internal battle over how much detailed data should be made public, said current and former VA doctors.

William E. Duncan, who supervised publication of medical outcomes until his 2012 departure, said in an interview that he urged that more data be posted regardless of the impact.

But his superior, VA Under Secretary for Health Robert Petzel, argued that the more detailed outcomes should stay private at the VA, senior VA doctors say.

Efforts to reach Dr. Petzel, who was forced out of office recently, weren’t successful.

Amid the spat, Dr. Duncan was forced out of the agency in 2012, he said. The VA spokesman didn’t answer questions about Dr. Duncan’s departure.

Dr. Duncan, now living in the Maryland suburbs of Washington, said the VA system of measuring and publishing outcomes was designed with lofty aspirations: "The goal was not for hospitals to be average performers. The goal was to be in the top 10%."

He is upset by the recent complaints about the VA.

"Our patients have little recourse, and they rely on our staff to tell them the truth," he said. "We can’t forget that medical quality is not just access to care."

As for the relatively poor results from Phoenix, they were no secret within the department, he said. "It was in their own data."
