Nursing Homes: Public Reporting of Quality Indicators Has Merit, 
but National Implementation Is Premature (31-OCT-02, GAO-03-187).
                                                                 
GAO was asked to review the Centers for Medicare & Medicaid
Services (CMS) initiative to publicly report additional
information on its "Nursing Home Compare" Web site intended to
help consumers choose a nursing home. GAO examined CMS's
development of the new nursing home quality indicators and
efforts to verify the underlying data used to calculate them. GAO
also reviewed the assistance CMS offered the public in
interpreting and comparing indicators available in its six-state
pilot program, launched in April 2002, and its own evaluation of
the pilot. The new indicators are scheduled to be used nationally
beginning in November 2002.
-------------------------Indexing Terms-------------------------
REPORTNUM:   GAO-03-187
    ACCNO:   A05451
    TITLE:   Nursing Homes: Public Reporting of Quality Indicators Has
             Merit, but National Implementation Is Premature
     DATE:   10/31/2002
  SUBJECT:   Data collection
             Data integrity
             Health care programs
             Information disclosure
             Inspection
             Nursing homes
             Program evaluation
             Quality control
             Web sites

******************************************************************
** This file contains an ASCII representation of the text of a  **
** GAO Product.                                                 **
**                                                              **
** No attempt has been made to display graphic images, although **
** figure captions are reproduced.  Tables are included, but    **
** may not resemble those in the printed version.               **
**                                                              **
** Please see the PDF (Portable Document Format) file, when     **
** available, for a complete electronic file of the printed     **
** document's contents.                                         **
**                                                              **
******************************************************************
GAO-03-187

Report to Congressional Requesters

October 2002

NURSING HOMES

Public Reporting of Quality Indicators Has Merit, but National
Implementation Is Premature

GAO-03-187

Contents

Letter
   Results in Brief
   Background
   Appropriateness of Quality Indicators Proposed for Public Reporting
      Is Unresolved
   CMS Has Not Addressed Concerns About the Underlying Accuracy of MDS
      Data Used to Develop Quality Indicators
   The Public May Be Confused by Quality Data and CMS Is Not Prepared
      to Respond to Consumers' Questions
   Pilot Evaluation Is Limited and Will Not Be Completed Prior to
      National Reporting of Quality Indicators
   Conclusions
   Recommendations for Executive Action
   Comments from CMS and the NQF and Our Evaluation

Appendixes
   Appendix I: Comparison of Quality Indicators Proposed by NQF and
      CMS for National Rollout
   Appendix II: Comments from the Centers for Medicare & Medicaid
      Services
   Appendix III: Comments from the National Quality Forum
   Appendix IV: GAO Contact and Staff Acknowledgments

Tables
   Table 1: Error Rates in MDS Items Used to Develop Selected Quality
      Indicators
   Table 2: Percentage of Nursing Homes in the Six Pilot States with
      Missing Quality Indicator Scores
   Table 3: Comparison of Publicly Reported Nursing Home Quality
      Indicator Scores and Quality-of-Care Survey Deficiencies in Six
      Pilot States

Figure
   Figure 1: Error Rates in MDS Assessments in 30 Nursing Homes
      Reviewed by Abt

Abbreviations

CMS    Centers for Medicare & Medicaid Services
HCFA   Health Care Financing Administration
HHS    Department of Health and Human Services
MDS    minimum data set
NQF    National Quality Forum
OIG    Office of Inspector General
QIO    Quality Improvement Organization

Letter

October 31, 2002

The Honorable Charles E. Grassley
Ranking Minority Member
Committee on Finance
United States Senate

The Honorable Christopher S. Bond
United States Senate

Almost half of all Americans over the age of 65 will spend time in a
nursing home at some point in their lives. A series of congressional
hearings since 1998 has focused considerable attention on the unacceptably
high number of nursing homes with repeated, serious care problems that
harmed residents or placed them at risk of death or serious injury. Given
the large number of nursing home residents and the growing public concerns
over quality-of-care problems, the Centers for Medicare & Medicaid
Services (CMS) has shown a strong commitment to providing assistance to
individuals and their families in choosing a nursing home. CMS is the
agency within the Department of Health and Human Services (HHS) that
manages Medicare and Medicaid and oversees compliance with federal nursing
home quality standards. 1 In 1998, the agency launched a Web site,
"Nursing Home Compare," that has progressively expanded the availability
of public information on nursing homes and the quality of care provided. 2
Initially, it posted data on deficiencies identified during routine state
nursing home inspections, known as surveys. Data were later added on
resident characteristics, such as the percentage of residents with
pressure sores or physical restraints, nursing staff levels, and
deficiencies found during state investigations of complaints.

In November 2001, CMS announced a 12-month timeline for an initiative to
(1) augment existing public data on nursing home quality, and (2) provide
assistance to nursing homes to help improve their quality of care. In
addition to the valuable data already available on its Web site, CMS
proposed including newly developed quality indicators that permit a fairer
comparison across homes by adjusting for differences in residents' health
characteristics.

1 In June 2001, the agency's name was changed from the Health Care
Financing Administration (HCFA) to CMS. In this report, we continue to
refer to HCFA where our findings apply to the organizational structure and
operations associated with that name.

2 www.medicare.gov/NHCompare/home.asp.

Quality indicators are essentially numeric warning signs of problems, such
as more frequent than expected pressure sores among nursing home
residents. They are based on data from facility-reported assessments,
known as the minimum data set (MDS), conducted at established intervals
during each resident's nursing home stay. The initiative also envisioned a
new role for Medicare Quality Improvement Organizations (QIO): engaging in
partnership building and local promotional activities designed to put
quality information into the hands of consumers and working with nursing
homes on a voluntary basis to help improve quality of care. 3 In April
2002, CMS launched a six-state pilot to refine the initiative before
planned nationwide implementation in November 2002. The six pilot states
are Colorado, Florida, Maryland, Ohio, Rhode Island, and Washington.
Medicare QIOs are working with from 6 to 11 nursing homes in each pilot
state on projects including improving pain

management and preventing pressure sores.

In view of the importance of CMS's quality indicator initiative and the
relatively short pilot time frame prior to national implementation, you
asked us to review the (1) development of the new quality indicators for
public reporting, (2) status of CMS's efforts to ensure the accuracy of
the underlying data used to calculate the quality indicators, (3)
assistance offered by CMS to the public in understanding the new quality
indicator data, and (4) results of CMS's evaluation of the pilot. To do
so, we reviewed pertinent documents from CMS on the development of the new
quality indicators, the approaches identified to adjust for differences in
residents' characteristics in each facility, the validation of the new
indicators, the operation and evaluation of the pilot, and the role of the
QIOs. We discussed these areas with CMS officials and with researchers
from Abt Associates, Inc., the lead CMS contractor responsible for the
development and validation of the new quality indicators. We also
interviewed and reviewed materials provided by officials of the National
Quality Forum (NQF), a group CMS contracted with to review Abt's work and
provide recommendations on quality indicators for public reporting. 4 We
examined the consistency of both the quality indicator data available in
the six pilot

3 Under contract with CMS, 37 QIOs (formerly known as Peer Review
Organizations) are responsible for determining the quality, effectiveness,
and efficiency of health care services provided to Medicare beneficiaries
in all 50 states and the District of Columbia.

4 NQF is a nonprofit organization created to develop and implement a
national strategy for health care quality measurement and reporting. NQF
has broad participation from government and private entities as well as
all sectors of the healthcare industry.

states and the extent of agreement between such data and the results of
nursing home surveys. To determine how CMS is assisting consumers in
understanding the quality indicator data available in the six pilot
states, we posed questions about discrepancies we identified between the
indicators and survey deficiencies to staff who field public inquiries
received by the Medicare and QIO toll-free telephone numbers. We conducted
our work from March through September 2002 in accordance with generally
accepted government auditing standards.

Results in Brief

Overall, CMS's initiative to augment existing public data on nursing home
quality has considerable merit but its plan for nationwide implementation
in November 2002 is premature. Conceptually, CMS's plan encourages
consumers making a decision about a nursing home to consider those with
positive quality indicator scores, a use of market forces to encourage

poorly performing homes to improve quality of care or face the loss of
revenue. Such a plan hinges, in part, on appropriate quality indicators
that consistently distinguish between good and poor care provided by
nursing homes. However, CMS has not yet adequately resolved a number of
open issues regarding the appropriateness of the quality indicators
selected for public reporting and the accuracy of the underlying data.

CMS contracted with two expert organizations, Abt and NQF, to develop and
help select quality indicators appropriate for public reporting, but it
does not intend to await NQF input before proceeding. In August 2002, CMS
announced a set of quality indicators it intends to begin reporting
nationally in November 2002. Its selection was based on the results of
Abt's efforts to validate the nursing home quality indicators it had
developed for CMS. Although the full Abt validation report was not
available to us as of October 28, 2002, our review of the available
portions of the report raised serious questions about the basis for moving
forward with national reporting at this time. Moreover, Abt's finding that
the underlying MDS data are accurate is not convincing in light of the
results of earlier studies that identified widespread errors in the
accuracy of facility-specific assessments used to calculate some of the
quality indicators CMS has selected for November reporting. In 2001, CMS
also contracted with the NQF to review Abt's work and recommend a set of
quality indicators for national reporting but in June 2002 asked the NQF
to delay finalizing its recommendations until 2003. The delay will allow
NQF to consider both the full Abt validation report and the results of an
evaluation of the six-state pilot established to refine the initiative but
will not allow CMS to consider NQF's input before its planned nationwide
implementation.

In addition to concerns over the appropriateness of the quality
indicators, we found that CMS was not well prepared to respond to consumer
questions about the quality data and had not allotted sufficient time to
incorporate lessons learned from its six-state pilot. CMS's reporting of
quality indicators in the pilot states was neither consumer friendly nor
presented in a format consistent with the data's limitations. For example,
reporting homes' actual quality indicator scores rather than rankings
(whether a home was in the bottom, middle, or top range of homes on a
particular score) could make it difficult for consumers to interpret the
differences in homes' scores and could imply a precision that does not
currently exist. Our analysis of the data also demonstrated the potential
for public confusion over (1) contradictory information from the quality
indicators themselves (almost one-fifth of homes in pilot states had an
equal number of highly positive and highly negative quality indicator
scores) and (2) inconsistencies between quality indicator scores and data
on deficiencies identified during nursing home surveys (almost one-fifth
of homes with four or more highly positive quality indicator scores and no
highly negative scores had at least one serious quality-of-care deficiency
on a recent state survey). Moreover, our telephone calls to the Medicare
and QIO toll-free numbers revealed that CMS was not adequately prepared to
address questions raised by the public and we received erroneous or
misleading information in the majority of calls we placed. Finally, CMS's
evaluation of the pilot itself is limited and will not be fully completed
until sometime in 2003, months after the planned national implementation
of the initiative.

We are recommending that the Administrator of CMS delay the national
reporting of quality indicators until (1) there is greater assurance that
quality indicators are appropriate and based on accurate data and (2) a
more thorough evaluation of the pilot is completed. In commenting on a
draft of this report, CMS reiterated its commitment to continually improve
the quality indicators and to work to resolve the issues discussed in our
report. However, CMS does not intend to delay its initiative until the
issues raised have been resolved. CMS believes that its quality
measurement information is sufficiently reliable, valid, accurate, and
useful to move forward with national implementation in November 2002 as
planned.
Background

Since 1998, the results from state surveys of nursing homes have been the
principal source of public information on nursing home quality, which is
posted and routinely updated on CMS's Nursing Home Compare Web site. Under
contract with CMS, states are required to conduct periodic surveys that
focus on determining whether care and services meet the assessed needs of
the residents and whether homes are in compliance with federal quality
requirements, such as preventing avoidable pressure sores, weight loss, or
accidents. 5 During a survey, a team that includes registered nurses
spends several days at a home reviewing the quality of care provided to a
sample of residents. States are also required to investigate complaints
lodged against nursing homes by residents, families, and others. In
contrast to surveys, complaint investigations generally target a single
area in response to a complaint filed against a home. Any deficiencies
identified during routine surveys or complaint investigations are
classified according to the number of residents potentially or actually
affected (isolated, pattern, or widespread) and their severity (potential
for minimal harm, potential for more than minimal harm, actual harm, and
immediate jeopardy).

To improve the rigor of the survey process, HCFA contracted for the
development of quality indicators and required their use by state
surveyors beginning in 1999. 6 Quality indicators are derived from data
collected during nursing homes' assessments of residents, known as the
minimum data set (MDS). The MDS contains individual assessment items
covering 17 areas, such as mood and behavior, physical functioning, and
skin conditions. MDS assessments of each resident are conducted in the
first 14 days after admission and periodically thereafter and are used to
develop a resident's plan of care. 7 Facility-reported MDS data are used
by state surveyors to help identify quality problems at nursing homes and
by CMS to determine the level of nursing home payments for Medicare; some
states also use MDS data to calculate Medicaid nursing home payments.

5 Surveys must be conducted at each home on average every 12 months and no
less than once every 15 months.

6 The quality indicators used in nursing home surveys were developed by
the University of Wisconsin under a HCFA-funded contract. See Center for
Health Systems Research and Analysis, Facility Guide for the Nursing Home
Quality Indicators (Madison: University of Wisconsin-Madison: September
1999). Surveyors use the indicators to help select a preliminary sample of
residents and preview information on the care provided to a home's
residents prior to the on-site inspection. Prior to their introduction in
1999, selection of the sample relied on a listing of residents and their
conditions maintained at the nursing home and on observation of residents
made during a walkthrough of the facility. As a result of the quality
indicators, the sample selection is more systematic and surveyors are
better prepared to identify potential care problems. However, the quality
indicators used during surveys were not developed for public reporting
because they were viewed as providing an indication of a potential quality
problem that required validation through an on-site survey.

Because it also envisioned using indicators to communicate nursing home
quality to consumers, HCFA recognized that any publicly reported
indicators must pass a very rigorous standard for validity and
reliability. Valid quality indicators that distinguish between good and
poor care provided by nursing homes would be a useful adjunct to existing
quality data. Such indicators must also be reliable; that is, they must
consistently distinguish between good and bad care. HCFA contracted with
Abt to review existing quality indicators and determine if they were
suitable for public reporting. Abt catalogued and evaluated 143 existing
quality indicators, including those used by state surveyors. It also
identified the need for additional indicators both for individuals with
chronic conditions who are long-term residents of a facility and for
individuals who enter a nursing home for a short period, such as after a
hospitalization (a postacute stay). According to Abt, a main concern about
publicly reporting quality indicators was that the quality indicator
scores might be influenced by other factors, such as residents' health
status. Abt concluded that the specification of appropriate risk
adjustment models was a key requirement for the validity of any quality
indicators. Risk adjustment is important because it provides consumers
with an "apples-to-apples" comparison of nursing homes by taking into
consideration the characteristics of individual residents and adjusting
quality indicator scores accordingly. For example, a home with a
disproportionate number of residents who are bedfast or who present a
challenge for maintaining an adequate level of nutrition (factors that
contribute to the development of pressure sores) may have a higher
pressure sore score. Adjusting a home's quality indicator score to fairly
represent to what extent a home does, or does not, admit such residents is
important for consumers who may wish to compare one home to another.
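
Abt's actual risk-adjustment models are not described in this report, so
the following minimal Python sketch illustrates only the general idea
(indirect standardization): compare a home's observed rate to the rate
expected from its residents' admission characteristics. All weights,
rates, and resident data below are hypothetical.

    # Illustrative sketch of risk adjustment by indirect standardization.
    # This is NOT Abt's model; every number here is hypothetical.

    STATE_AVG_PRESSURE_SORE_RATE = 0.10  # hypothetical statewide rate

    def expected_rate(residents):
        """Mean predicted pressure sore risk, given admission characteristics."""
        def risk(resident):
            p = 0.05                    # hypothetical baseline risk
            if resident["bedfast"]:
                p += 0.15               # bedfast residents are at higher risk
            if resident["poor_nutrition"]:
                p += 0.10               # nutrition problems raise risk further
            return p
        return sum(risk(r) for r in residents) / len(residents)

    def adjusted_score(observed_rate, residents):
        """Observed-to-expected ratio scaled to the statewide average, so a
        home admitting sicker residents is not penalized for a higher raw rate."""
        return observed_rate / expected_rate(residents) * STATE_AVG_PRESSURE_SORE_RATE

    # A home whose raw rate (12 percent) exceeds the statewide average can
    # still score below it once its high-risk case mix is taken into account.
    residents = [{"bedfast": True, "poor_nutrition": True},
                 {"bedfast": True, "poor_nutrition": False},
                 {"bedfast": False, "poor_nutrition": False}]
    print(round(adjusted_score(0.12, residents), 3))  # 0.065
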
After several years of work, Abt recommended 39 risk-adjusted quality
indicators to CMS in October 2001. Twenty-two were based on existing
indicators and the remaining 17 were newly developed by Abt, including 9
indicators for nursing home residents with chronic conditions and 8
indicators for individuals who enter a nursing home for a short period.

7 MDS assessments are conducted for all nursing home residents within 14
days of admission and at quarterly and yearly intervals unless there is a
significant change in condition. Medicare beneficiaries in a Medicare-
covered stay are assessed on or before the 5th, 14th, 30th, 60th, and 90th
day of their stays to determine if their Medicare coverage should
continue.

In September 2001, CMS contracted with the NQF to review Abt's work with
the objective of (1) recommending a set of quality indicators for use in
its planned six-state pilot and (2) developing a core set of indicators
for national implementation of the initiative scheduled for late 2002. NQF
established a steering committee to accomplish these two tasks. 8 The
steering committee met in November 2001 and identified 11 indicators for
use in the pilot, 9 of which were selected by CMS. The committee made its
selection from among Abt's list of 39 indicators but it did not recommend
use of Abt's risk-adjustment approach. Moreover, the steering committee
indicated that it would not be limited to the same Abt list in developing
its recommended core set of indicators for national implementation. In
April 2002, NQF released a draft consensus report identifying the
indicators it had distributed to its members and the public for comment on
their potential inclusion in the national implementation. 9 Under its
contract, NQF was scheduled to make a final recommendation to CMS prior to
the national reporting of quality indicators.

Appropriateness of Quality Indicators Proposed for Public Reporting Is
Unresolved

CMS's initiative to augment existing public data on nursing home quality
has considerable merit but more time is needed to assure that the
indicators proposed by CMS for public reporting are appropriate in terms
of their validity and reliability. Based on work by Abt to validate the
indicators it developed for CMS, the agency selected quality indicators
for national reporting. The full Abt validation report, which is important
for a thorough analysis of the appropriateness of the quality indicators,
was still not available to us as of October 28, 2002. Our review of
available portions of the Abt report, however, raised serious questions
about whether testing and validation of the selected indicators has been
sufficient to move forward with national reporting at this time.

8 The steering committee consists of 12 stakeholders representing health
researchers, geriatricians, state survey agencies, state Medicaid
directors, health systems, and others.

9 The NQF relies on a consensus process led by a steering committee that
initially conducts an overall assessment in a particular area and gathers
input from NQF members, nonmembers, and expert advisory panels. The
steering committee then recommends a set of draft measures, indicators, or
practices for review. Next, the draft recommendations are distributed for
review and comment, first to NQF members and then to the general public.
Following this open review period, a revised product is distributed to NQF
members for a vote. The NQF Board of Directors must ultimately approve
matters under consideration before the consensus process is complete.

Moreover, CMS plans to initiate national reporting before it receives
recommendations from NQF, its contractor, on appropriate quality
indicators.

On August 9, 2002, CMS announced the 10 indicators selected for its
nationwide reporting of quality indicators, which it plans to launch in
mid-November 2002. CMS selected these indicators from those that Abt had
validated in its August 2, 2002, validation report. 10 Abt classified the
indicators it studied as to the degree of validity: top, middle, or not
valid. The indicators that CMS selected were in the top category with one
exception, residents in physical restraints, which was in the middle
category. The objective of Abt's validation study was to confirm that the
indicators reflect the actual quality of care that individual nursing
facilities provide, after taking into account resident and facility-level
characteristics. For example, a validation analysis could confirm that a
low percentage of pressure sores among residents was linked to a
facility's use of procedures to prevent their development. Successful
validation reduces the chance that publicly reported data could
misrepresent a high-quality facility as a low-quality facility, or vice
versa.

CMS's decision to implement national reporting in November 2002 is
troubling, given the issues raised by our review of the available portions
of Abt's validation report. Although we asked CMS for a copy of Abt's 11
technical appendixes, as of October 28, 2002, they were still undergoing
review and were not available to us. The technical appendixes are
essential to adequately understand and evaluate Abt's validation approach.
Our review of the available portions of the Abt report raised serious
questions about whether the effort to date has been sufficient to validate
the indicators. The validation study is based on a sample that is drawn
from six states; it is not representative of nursing homes nationwide and
may not be representative of facilities in these six states. Selected
facilities were allowed to decline participation and about 50 percent did
so. For those facilities in the validation study, Abt deemed most of the
indicators as valid; that is, better care processes were associated with
higher quality indicator scores, taking into account resident and
facility-level characteristics. However, we could not evaluate these
findings because Abt provided little information on the specific care
processes against which the indicators were validated.

10 Abt Associates, Inc., HRCA Research and Training Institute, and Brown
University, Validation of Long-Term and Post-Acute Care Quality
Indicators, final draft report prepared for CMS, Office of Clinical
Standards and Quality, Aug. 2, 2002.

Unresolved questions also exist about the risk adjustment of the quality
indicators. Risk adjustment is a particularly important element in
determining certain quality indicators because it may change the ranking
of individual facilities; a facility that is among the highest on a
particular quality indicator without risk adjustment may fall to the
middle or below after risk adjustment, and vice versa. Data released by
CMS in March 2002 demonstrated that Abt's risk adjustment approaches could
either lower or raise facility scores by 40 percent or more. Although such
changes in ranking may be appropriate, Abt did not provide detailed
information on how its risk adjustment approaches changed facility
rankings or a basis for assessing the appropriateness of the changes.

In addition to the questions raised by our review of the Abt validation
report, CMS is not planning to wait for the expert advice it sought on
quality indicators through its contract with the NQF. Under this contract,
the NQF steering committee issued a consensus draft in April 2002 with a
set of potential indicators for public reporting. The steering committee
had planned to complete its review of these indicators using its consensus
process by August 2002. 11 In late June, however, CMS asked NQF to delay
finalizing its recommendations until early 2003 to allow (1) consideration
of Abt's August 2002 report on the validity of its indicators and
risk-adjustment methods, including the technical appendixes, when they
become available and (2) a review of the pilot evaluation results expected
in October 2002. An NQF official told us that the organization agreed to
the delay because the proposed rapid implementation timeline had been a
concern since the initiative's inception. CMS's list of quality indicators
for the November 2002 national rollout did not include six indicators
under consideration by NQF: depression, incontinence, catheterization,
bedfast residents, weight loss, and rehospitalization (see app. I).
Instead, CMS intends to consider NQF's recommendations and revise the
indicators used in the mid-November national rollout sometime next year.

11 NQF indicated that it viewed its list as a starting point for a
stronger, more robust set of future indicators. Because nursing home care
includes both medical care and social services, NQF believes that a core
set of indicators should cover several other highly important areas in
addition to clinical quality of care, including resident quality of life;
measures of resident and family satisfaction; and the nursing home
environment, such as food quality and number of residents per room. A
March 2002 report prepared for CMS acknowledged that clinical indicators
are less important to the public than issues such as facility cleanliness
and a caring staff. See Barents Group of KPMG Consulting, Inc., Nursing
Home Consumer Choice Campaign Needs Assessment Report (McLean, Va.: Mar.
14, 2002).

CMS is also moving forward without a consensus on risk adjustment of
quality indicators. CMS is planning to report one indicator with
facility-level adjustment based on a profile of residents' status at
admission, and two indicators both with and without this Abt-developed
risk adjuster. 12 However, both Abt and NQF have concluded that adjusting
for the type of residents admitted to the nursing home required further
research to determine its validity. 13 We believe that reporting the same
indicator with and without facility-level risk adjustment could serve to
confuse rather than help consumers. Two of the three consultants hired by
NQF specifically recommended against the use of facility-level adjustments
in public reporting at this time. We also found that, as of October 1,
2002, CMS had not reached internal consensus on how to describe the
risk-adjustment methods used in each of the 10 indicators it plans to
begin reporting nationally in November 2002. Several agency officials
agreed with our assessment that the descriptions on its Web site were
inconsistent with Abt's own descriptions of the risk adjustment associated
with each indicator.

12 CMS explained that its decision to use facility-level adjustments was
influenced by "great stakeholder interest" in how this new risk-adjustment
methodology affected them.

13 NQF based its recommendation on the work of a Special Advisory Panel of
three independent consultants who were asked to assist in resolving
concerns about the technical complexity of Abt's risk adjustment
approaches, particularly its proposed facility-level adjustments.
Specifically, the April 2002 NQF consensus draft recommended priority
funding for (1) research regarding the selection of appropriate risk
factors; (2) comparisons of the different risk adjustment methodologies
for nursing home performance data, as applied to each quality indicator;
and (3) validation of different risk-adjustment methods.

CMS Has Not Addressed Concerns About the Underlying Accuracy of MDS Data
Used to Develop Quality Indicators

Two different Abt studies have presented CMS with conflicting messages
about the accuracy of MDS data. Abt's August 2002 quality indicator
validation report suggested that the underlying data used to calculate
most indicators were, in the aggregate, very reliable. However, our
analysis of more detailed facility-level data in a February 2001 Abt
report raised questions about the reliability of some of the same MDS
data. Because MDS data are used by CMS and some states to determine the
level of nursing home payments for Medicare and Medicaid and to calculate
quality indicators, ensuring their accuracy at the facility level is
critical both for determining appropriate payments and for public
reporting of the quality indicators. Recognizing the importance of
accurate MDS data, CMS is in the process of implementing a national MDS
accuracy review program expected to become fully operational in 2003,
after the nationwide reporting of quality indicators begins in November
2002.

We recently reported that CMS's review program is too limited in scope to
provide adequate confidence in the accuracy of MDS assessments in the vast
bulk of nursing homes nationwide. 14

Abt's August 2, 2002, validation report concluded that the reliability of
the underlying MDS data used to calculate 39 quality indicators ranged
from acceptable to superior, with the data for only 1 indicator proving
unacceptable. 15 Abt's findings were based on a comparison of assessments
conducted by its own nurses to assessments performed by the nursing home
staff in 209 sample facilities. For each quality indicator, Abt reported
the overall reliability for all of the facilities in its sample. 16
However, because quality indicators will be reported for each nursing
home, overall reliability is not a sufficient assurance that the
underlying MDS data are reliable for each nursing home. Although Abt did
not provide information on MDS reliability for individual facilities, it
noted that reliability varied considerably within and across states.

Earlier work by Abt and others calls into question the reliability of MDS
data. Abt's February 2001 report on MDS data accuracy identified
significant variation in the rate of MDS errors across the 30 facilities
sampled. 17 Differences between assessments conducted by Abt's nurses and
the nursing home staff were classified as errors by Abt. Error rates for
all MDS items averaged 11.7 percent but varied across facilities by a
factor of almost two, from 7.8 percent to 14.5 percent. As shown in figure
1, the majority of error rates were higher than 10.5 percent.

14 See U.S. General Accounting Office, Nursing Homes: Federal Efforts to
Monitor Resident Assessment Data Should Complement State Activities,
GAO-02-279 (Washington, D.C.: Feb. 15, 2002).

15 Validation analysis was incomplete for two additional indicators.

16 As noted earlier, Abt's sample may not be representative as only 50
percent of homes agreed to participate.

17 Abt Associates, Inc., Development and Testing of a Minimum Data Set
Accuracy Verification Protocol, final report prepared for HCFA, Feb. 27,
2001. The authors of this study computed the combined error rate for
individual facilities by weighting the error rates for Medicare and
non-Medicare assessments using the proportion of Medicare (.32) to
non-Medicare (.68) assessment items for the entire sample of 30
facilities. However, the proportion of Medicare to non-Medicare
assessments varied across facilities. For example, in one facility there
were more Medicare than non-Medicare assessments. We therefore recomputed
facility error rates using the proportion of Medicare to non-Medicare MDS
assessment items for each facility.

Furthermore, error rates for some of the individual MDS items used to
calculate the quality indicators were much higher than the average error
rate. 18 According to Abt, the least accurate sections of the MDS included
physical functioning and skin conditions. Abt also noted that there was a
tendency for facilities to underreport residents with pain. 19 MDS items
from these portions of the assessment are used to calculate several
quality indicators that CMS plans to report nationally in November 2002:
activities of daily living, pressure sores, and pain management. Table 1
shows that the error rate across the residents sampled ranged from 18
percent for pressure sores to 42 percent for pain intensity. 20 Abt's
February 2001 findings were consistent with areas that states have
identified as having a high potential for error, including activities of
daily living and skin conditions. 21 Moreover, a study by the HHS Office
of Inspector General (OIG), which identified differences between the MDS
assessment and the medical record, found that activities of daily living
was among the areas that provided the greatest source of differences. 22
In addition, the OIG report noted that 40 percent of the nursing home MDS
coordinators it surveyed identified the physical functioning section, used
to calculate the quality indicator on activities of daily living, as the
most difficult to complete. Some coordinators explained that facility
staff view a resident's capabilities differently and thus the assessments
tend to be subjective.

18 Abt did not report error rates for individual items at the facility
level.

19 More recently, state survey agency officials in three pilot states told
us that they are concerned that the public reporting of quality indicators
may lead to underreporting of certain problem areas, such as pain
management.

20 Abt did not provide error rates for individual items that are adjusted
to reflect the extent of the differences in assessments conducted by Abt
and the facility nurses.

21 GAO-02-279, pp. 16-18.

22 OIG used the term "differences" rather than errors because its
methodology did not permit a specific determination as to why the
differences occurred. See HHS Office of Inspector General, Nursing Home
Resident Assessment: Quality of Care, OEI-02-99-00040 (Washington, D.C.:
December 2000). In commenting on this report, CMS expressed reservations
about the OIG's methodology and interpretation of CMS documents used to
perform the study. The OIG had recommended that nursing homes be required
to establish an "audit trail" to document support for certain MDS
elements. CMS disagreed, noting that it did not expect all information in
the MDS to be duplicated elsewhere in the medical record. We concur with
the OIG's position that, given the use of MDS data in adjusting nursing
home payments and producing quality indicators, documenting the basis for
the MDS assessments in the medical record is critical to assessing their
accuracy. See GAO-02-279.

Figure 1: Error Rates in MDS Assessments in 30 Nursing Homes Reviewed by
Abt

[Bar chart not reproduced in this text version. The vertical axis shows
the number of nursing homes (0 to 8); the horizontal axis shows the
percent of assessments with errors, in ranges from 8.5 percent or less up
to 13.6-14.5 percent.]

Source: GAO analysis of data from Abt Associates, Inc., Minimum Data Set
Accuracy.

Table 1: Error Rates in MDS Items Used to Develop Selected Quality
Indicators

MDS item                                          Error rate (percent)
Physical functioning: used to calculate quality indicator on decline in
activities of daily living
  Bed mobility                                    39
  Transfer                                        34
  Eating                                          37
  Toilet use                                      35
Skin condition: used to calculate quality indicator on prevalence of
pressure sores
  Pressure sore                                   18
Health condition: used to calculate quality indicator on inadequate pain
management
  Pain frequency                                  39
  Pain intensity                                  42

Source: Abt Associates, Inc., Minimum Data Set Accuracy.

As part of CMS's efforts to improve MDS accuracy, its contractor is still
field-testing the on-site aspect of its approach, which is not expected to
be implemented until 2003. 23 Although Abt's February 2001 report found
widespread MDS errors, CMS intends to review roughly 1 percent of the MDS
assessments prepared over the course of a year, which numbered 14.7
million in 2001. Moreover, only 10 percent of the reviews will be
conducted on-site at nursing homes. In contrast, our prior work on MDS
found that 9 of the 10 states with MDS-based Medicaid payment systems that
examine MDS data's accuracy conduct periodic on-site reviews in all or a
significant portion of their nursing homes, generally examining from 10 to
40 percent of assessments. On-site reviews heighten facility staff
awareness of the importance of MDS data and can lead to the correction of
practices that contribute to MDS errors. We reported earlier that CMS's
approach may yield some broad sense of the accuracy of MDS assessments on
an aggregate level but is insufficient to provide confidence about the
accuracy of MDS assessments in the vast bulk of nursing homes nationwide.

23 On-site reviews focus on determining whether a resident's medical
record supports the MDS assessment completed by the facility. If the MDS
assessment is recent, the review may also include direct observation of
the resident and interviews with nursing home staff who have recently
evaluated or treated the resident. Off-site reviews of MDS data include
examining trends in assessments across facilities to identify aberrant or
inconsistent patterns.

The Public May Be Confused by Quality Data and CMS Is Not Prepared to
Respond to Consumers' Questions

While CMS is strongly committed to making more information available to
the public on nursing home quality and such an initiative has considerable
merit, the agency had not demonstrated a readiness to assist the public in
understanding and using those data. We found that CMS's reporting of
quality indicators in the six pilot states was neither consumer friendly
nor reported in a format consistent with the data's limitations, implying
a greater degree of precision than is currently warranted. Our analysis of
the data currently available in the six pilot states demonstrated the
potential for public confusion over both the quality indicators themselves
and inconsistencies with other available data on deficiencies identified
during nursing home surveys, which, to date, are the primary source of
public data on nursing home quality. Moreover, our phone calls to the
Medicare and QIO toll-free numbers revealed that CMS was not adequately
prepared to address consumers' questions raised by discrepancies between
conflicting sources of quality data.

Our review of the quality indicators on the CMS Web site found that the
presentation of the data was not consumer friendly and that the reporting
format implies a greater confidence in the data's precision than may be
warranted at this time. Quality indicators are reported as the percentage
of residents in a facility having the particular characteristics measured
by each indicator. The Web site explains that having a low percentage of
residents with pressure sores or pain is better than having a high
percentage. In the six-state pilot, the public can compare a nursing
home's score to the statewide and overall average for each quality
indicator. We believe that equating a high score with poor performance is
counterintuitive and could prove confusing to consumers. 24 Despite the
Web site's explanation of how to interpret the scores, the public might
well assume that a high score is a positive sign.

In addition, reporting actual quality indicator scores rather than the
range of scores a home falls into for an indicator (a low, medium, or high
score) can be confusing and implies a confidence in the precision of the
results that is currently a goal rather than a reality.

24 Stakeholders that commented on NQF's April 2002 draft set of indicators
suggested that the quality indicator scores be reported as the percentage
of residents not having the particular characteristic measured by each
indicator, e.g., reporting that 80 percent of residents were not
restrained rather than reporting that 20 percent of residents were
restrained. If quality indicators were reported this way, having a high
score would be better than having a low score. CMS indicated that it had
received similar comments but will not make any changes prior to the
national rollout in November 2002.

Consumers will find it difficult to assess a home with a score that is 5
to 10 percentage points from the state average. Such a home could be an
outlier, one of the best or the worst on that indicator; alternatively, it
could be that the home was close to the state average because the outliers
involved much larger differences. Concerns about the validity of the
indicators and the reliability of the underlying data make comparisons of
homes with similar scores questionable. Consumers may be misled if a
difference of several percentage points between two homes is perceived as
demonstrating that one is better or worse than the other. To partially
address these types of concerns, Maryland has reported quality indicator
data on its own Web site since August 2001 in ranges rather than
individual values. Thus, it indicates if a facility falls into the bottom
10 percent, the middle 70 percent, or the top 20 percent of facilities in
the state.

Consumers may also be confused about how to interpret missing information.
Although the CMS Web site explains that quality indicator scores are not
reported for nursing homes with too few residents, it does not acknowledge
the extent of such missing data. We found that 6 percent of all nursing
homes in the six pilot states have no score for any of the nine quality
indicators and that, for individual indicators, from 9 percent to 40
percent of facilities have missing scores (see table 2). 25 When data for
homes of potential interest to consumers are not reported, consumers may
need some assistance in how to incorporate such instances into their
decisionmaking.

25 Chronic-care quality indicator scores were not reported for nursing
homes with fewer than 30 residents after excluding some residents, e.g.,
those with certain clinical characteristics or those with missing data
necessary to calculate a score. Short-stay quality indicator scores were
not reported for nursing homes with fewer than 20 residents after
excluding some residents.
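
The suppression rule in footnote 25 reduces to two minimum-count
thresholds, sketched below. The function and variable names are ours;
"eligible residents" means the count remaining after the exclusions the
footnote describes.

    # Sketch of the score-suppression thresholds in footnote 25.
    CHRONIC_CARE_MIN = 30  # chronic-care indicators need >= 30 eligible residents
    SHORT_STAY_MIN = 20    # short-stay indicators need >= 20 eligible residents

    def score_is_reported(eligible_residents, short_stay):
        """True if the home has enough assessable residents to publish a score."""
        minimum = SHORT_STAY_MIN if short_stay else CHRONIC_CARE_MIN
        return eligible_residents >= minimum

    print(score_is_reported(25, short_stay=False))  # False: score withheld
    print(score_is_reported(25, short_stay=True))   # True: score reported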

Table 2: Percentage of Nursing Homes in the Six Pilot States with Missing
Quality Indicator Scores

                                             Percentage of nursing homes
Quality indicators                           with missing score
Chronic-care quality indicators
  Decline in activities of daily living      19
  Infections                                 16
  Inadequate pain management                 16
  Pressure sores                             16
  Physical restraints used daily              9
  Weight loss                                21
Short-stay quality indicators
  Failure to improve and manage delirium     36
  Inadequate pain management                 35
  Improvement in walking                     40

Source: GAO analysis of CMS quality indicator data available on its Web
site for the six pilot states.

Consumer confusion may also occur when quality indicator scores send
conflicting messages about the overall quality of care at a home. We found
that the Web site data for a significant number of facilities contained
such inconsistencies. Seventeen percent of nursing homes in the six pilot
states had an equal number of highly positive and highly negative quality
indicator scores. We defined highly positive scores as those indicating
that a facility was among the 25 percent of homes with the lowest
percentage of residents exhibiting poor outcomes, such as a decline in
their ability to walk or use the toilet. In contrast, facilities with a
highly negative score were among the top 25 percent of homes with poor
outcomes. We also found that 37 percent of nursing homes with four or more
highly positive quality indicator scores had two or more highly negative
scores. 26
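
Our "highly positive" and "highly negative" labels amount to quartile
cutoffs on each indicator's distribution of facility scores. The sketch
below restates that rule with hypothetical data; it is a simplification,
not the exact computation used in our analysis.

    # Quartile labeling sketch: lower scores mean fewer residents with the
    # poor outcome, so the lowest quartile is "highly positive".

    def quartile_cutoffs(scores):
        """Approximate 25th and 75th percentile cutoffs of statewide scores."""
        s = sorted(scores)
        return s[len(s) // 4], s[(3 * len(s)) // 4]

    def label(score, statewide_scores):
        low, high = quartile_cutoffs(statewide_scores)
        if score <= low:
            return "highly positive"  # among the homes with the fewest poor outcomes
        if score >= high:
            return "highly negative"  # among the homes with the most poor outcomes
        return "neither"

    statewide = [2, 4, 5, 6, 8, 9, 11, 14, 17, 20, 24, 31]  # hypothetical
    print(label(2, statewide), label(25, statewide))  # highly positive highly negative
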
In addition, our comparison of survey deficiency data available on the Web
site with quality indicator scores also revealed inconsistencies. For
example, 17 percent of nursing homes with four or more highly positive
quality indicator scores and no highly negative scores (seemingly "good"
nursing homes) had at least one serious quality-of-care deficiency on a
recent state survey. We have found that serious deficiencies cited by
state nursing home surveyors were generally warranted and indeed reflected
instances of documented actual harm to nursing home residents. 27
Moreover, 73 percent of nursing homes with four or more highly negative
quality indicator scores (seemingly "bad" facilities) had no serious
quality-of-care deficiencies on a recent survey (see table 3). The latter
situation is consistent with our past work showing that surveyors often
miss serious quality-of-care problems. 28 Nevertheless, consumers will
generally lack such insights on the reliability of state surveys that
would permit them to better assess the available data on quality of care.

26 The largest number of highly positive or highly negative scores that
any nursing home in the pilot states had was seven.
Table 3: Comparison of Publicly Reported Nursing Home Quality Indicator
Scores and Quality-of-Care Survey Deficiencies in Six Pilot States

                  Nursing homes with four or      Nursing homes with four
                  more highly positive quality    or more highly negative
                  indicator scores and no         quality indicator scores
                  highly negative quality         that had no serious
                  indicator scores that had at    quality-of-care survey
                  least one serious quality-of-   deficiency (percent)
State             care survey deficiency (percent)

Colorado                        0                            75
Florida                        11                            81
Maryland                       29                            78
Ohio                           27                            69
Rhode Island                    0                            92
Washington                     25                            47
Total                          17                            73

Note: A serious quality-of-care deficiency indicates that surveyors found
actual harm to residents.

Source: GAO analysis of quality indicator and survey deficiency data
available on CMS's Web site for the six pilot states.

With the apparent need for assistance to consumers in interpreting and
using this information, the important role of the Medicare and QIO
toll-free numbers is evident. We requested and reviewed copies of the
Medicare hotline and QIO scripts and found that they did not address the
issue of responding to questions about conflicting or confusing quality
data.

27 See U.S. General Accounting Office, Nursing Homes: Proposal to Enhance
Oversight of Poorly Performing Homes Has Merit, GAO/HEHS-99-157
(Washington, D.C.: June 30, 1999).

28 See U.S. General Accounting Office, California Nursing Homes: Care
Problems Persist Despite Federal and State Oversight, GAO/HEHS-98-202
(Washington, D.C.: July 27, 1998).

Furthermore, our calls to the Medicare hotline and to QIO toll-free
numbers in the six pilot states demonstrated that the staff were not
adequately prepared to handle basic questions about the quality data
available under the pilot. CMS officials had told us that Medicare hotline
callers with complicated questions would be seamlessly transferred to a
QIO without having to hang up and call another number. Although we asked
the Medicare hotline staff if another organization might be better able to
respond to our questions, no one offered to refer us to QIOs, even when we
specifically asked about them. In fact, one hotline staff member told us
that a QIO would not be an appropriate referral. Consequently, we
independently attempted to call the QIOs in the six pilot states. We found
that it was difficult to reach a QIO staff member qualified to answer
questions. Each QIO had a toll-free number but neither the automated
recordings at four QIOs nor operators at the remaining two indicated that
the caller had reached a QIO. 29 In addition, the automated recordings did
not contain a menu choice for questions about nursing home quality
indicators. 30 We were unable to contact one QIO because the hotline had
neither an operator nor a voice mail capability. On other calls, after
reaching a QIO staff person, it frequently took several referrals to
identify an appropriate contact point. One QIO took 5 working days for a
staff member to call us back. Four of the five QIOs we contacted explained
that their primary role was to work with nursing homes to improve quality
of care. In general, QIO staff were not prepared to respond to consumer
questions.

Staff at the Medicare hotline and the QIOs varied greatly in their basic
understanding of quality indicators and survey deficiencies. While two of
the nine staff we contacted were generally knowledgeable about different
types of quality data, others were unable to answer simple questions and
the majority provided erroneous or misleading data. One QIO staff member
told us that MDS data were not representative of all residents of a
nursing home but only presented a "little picture" based on a few
residents. However, assessments of all residents are taken into
consideration in calculating quality indicators. When we expressed concern
about a home identified on the Web site with a "level-3" deficiency, a
Medicare hotline staff member incorrectly told us that it was not a
serious deficiency because level 3 indicated potential harm. 31 CMS
designates actual harm deficiencies as "level-3" deficiencies. A QIO staff
member incorrectly told us that actual harm pressure sore deficiencies had
nothing to do with patient care and might be related to paperwork. Our
review of survey reports has shown that actual harm deficiencies generally
involved serious quality-of-care problems resulting in resident harm. 32
Generally, hotline staff did not express a preference for using either
nursing home surveys or quality indicators in choosing a nursing home. Two
QIO staff, however, stated that the nursing home survey information gave a
better picture of nursing home care than the quality indicators, which
they judged to be imprecise and subject to variability.

29 The hotline identified each of the six QIOs we called by its
proprietary name, not by the term QIO or Quality Improvement Organization.
For example, the QIO for Ohio is known as KePRO, while the QIO for Alaska,
Idaho, and Washington is called Qualis Health.

30 A few QIOs did have a menu option for calls about the "nursing home
project."
Pilot Evaluation Is Limited and Will Not Be Completed Prior to National
Reporting of Quality Indicators

CMS's evaluation of the pilot is limited and will not be completed prior
to national reporting of quality indicators because of the short period of
time between the launch of the pilot and the planned November 2002
national implementation. According to CMS officials, the pilot evaluation
was never intended to help decide whether the initiative should be
implemented nationally or to measure the impact on nursing home quality.
While CMS is interested in whether nursing home quality actually improves
as a result of the initiative, it will be some time before such a
determination can be made.

Thus, CMS focused the pilot evaluation on identifying improvements that
could be incorporated into the initiative's design prior to the scheduled
national implementation in November 2002. A CMS official told us that
initial pilot evaluation results were expected by early October 2002,
allowing just over a month to incorporate any lessons learned. In
commenting on a draft of this report, CMS stated that it was using
preliminary findings to steer national implementation. 33 The final
results of the pilot evaluation will not be completed until sometime in
2003.

31 CMS identifies nursing home deficiencies on its Nursing Home Compare
Web site using numbers, with 2 equivalent to potential for more than
minimal harm and 3 equivalent to actual harm.

32 U.S. General Accounting Office, Nursing Homes: Proposal to Enhance
Oversight of Poorly Performing Homes Has Merit, GAO/HEHS-99-157
(Washington, D.C.: June 30, 1999).

33 CMS also plans to incorporate information from a contractor's study
completed prior to the pilot to determine how it could better motivate
consumers to use nursing home quality information to make better informed
decisions. See Barents Group, Nursing Home Consumer Choice Campaign Needs
Assessment Report.

CMS's evaluation of the pilot is focused on identifying how to communicate
more effectively with consumers about the initiative and how to improve
QIO interaction with nursing homes. Specifically, CMS will assess whether
(1) the target audiences were reached; (2) the initiative increased
consumer use of nursing home quality information; 34 (3) consumers used
the new information to choose a nursing home; (4) QIO activities
influenced nursing home quality improvement activities; (5) nursing homes
found the assistance provided by QIOs useful; and (6) the initiative
influenced those who might assist consumers in selecting a nursing home,
such as hospital discharge planners and physicians. Information is being
collected by conducting consumer focus groups, tracking Web site "hits"
and toll-free telephone inquiries, administering a Web site satisfaction
survey, and surveying nursing homes, hospital discharge planners, and
physicians. As of late August 2002, CMS teams were also in the process of
completing site visits to stakeholders in the six pilot states, including
QIOs, nursing homes, ombudsmen, survey agencies, nursing home industry
representatives, and consumer advocacy groups. The teams' objective is to
obtain a first-hand perspective of how the initiative is working with the
goal of implementing necessary changes and better supporting the program
in the future.

34 In April and May 2002, the number of Web site "hits" for states in the
pilot increased substantially during the week the pilot was announced and
subsequently decreased, but they remained higher than before the launch of
the pilot.

Conclusions

Although CMS's initiative to publicly report nursing home quality
indicators is a commendable and worthwhile goal, we believe that it is
important for CMS to wait for and consider input from NQF and make
necessary adjustments to the initiative accordingly. We believe several
factors demonstrate that CMS's planned national reporting of quality
indicators in November 2002 is premature. Our review of the available
portions of Abt's validation report raised serious questions about whether
the effort to date has been sufficient to validate the quality indicators.
NQF was asked to delay recommending a set of indicators for national
reporting until 2003, in part, to provide sufficient time for it to review
Abt's report. Although limited in scope, CMS's planned MDS accuracy review
program will not begin on-site accuracy reviews of the data underlying
quality indicators until 2003. CMS's own evaluation of the pilot, designed
to help refine the initiative, was limited to fit CMS's timetable for the
initiative, and the preliminary findings were not available until October
2002, leaving little time to incorporate the results into the planned
national rollout. Other aspects of the evaluation will not be available
until early 2003. We also have serious concerns about the potential for
public confusion over quality data, highlighting the need for clear
descriptions of the data's limitations and easy access to informed experts
at both the Medicare and QIO hotlines. CMS has not yet demonstrated its
readiness to meet these consumer needs either directly or through the
QIOs.

Recommendations for Executive Action

To ensure that publicly reported quality indicator data accurately reflect the status of quality in nursing homes and fairly compare homes to one another, we recommend that the Administrator of CMS delay the implementation of nationwide reporting of quality indicators until

- there is greater assurance that the quality indicators are appropriate for public reporting (including the validity of the indicators selected and the use of an appropriate risk-adjustment methodology), based on input from the NQF and other experts and, if necessary, additional analysis and testing; and

- a more thorough evaluation of the pilot is completed to help improve the initiative's effectiveness, including an assessment of the presentation of information on the Web site and the resources needed to assist consumers' use of the information.

Comments from CMS and the NQF and Our Evaluation

CMS and the NQF reviewed and provided comments on a draft of this report. (See app. II and app. III, respectively.) CMS reiterated its commitment to continually improve the quality indicators and to work to resolve the issues discussed in our report. Although CMS stated it would use our report to help improve the initiative over time, it intends to move forward with national implementation in November 2002 as planned. It stated that "waiting for more reliability, more validity, more accuracy, and more usefulness will delay needed public accountability, and deprive consumers, clinicians, and providers of important information they can use now." The NQF commented that it unequivocally supports CMS's plans to publicly report quality indicators but indicated that the initiative would benefit from a short-term postponement of 3 to 4 months to achieve a consensus on a set of indicators and to provide additional time to prepare the public on how to use and interpret the data. We continue to support the concept of reporting quality indicators but remain concerned that a flawed implementation could seriously undercut support for, and the potential effectiveness of, this very worthwhile initiative. CMS's comments and our evaluation focused largely on two issues: (1) the selection and validity of quality indicators, and (2) lessons learned from CMS's evaluation of the pilot initiative.

Selection and Validity of Quality Indicators

CMS asserts that the quality indicators it plans to report nationally are reliable, valid, accurate, and useful and that it has received input from a number of sources in selecting the indicators for this initiative. However, CMS provided no new evidence addressing our findings regarding the appropriateness of the quality indicators selected for public reporting and the accuracy of the underlying data. We continue to believe that, prior to nationwide implementation, CMS should resolve these open issues.

CMS intends to move forward with nationwide implementation without a requested NQF assessment of the full Abt validation report and without NQF's final recommendations on quality indicators. CMS would not share the technical appendices to Abt's validation report with us because they were undergoing review and revision. The technical appendices are critical to assessing Abt's validation approach. CMS's comments did not address our specific findings on the available portions of Abt's validation report, including that (1) the validation results are not representative of nursing homes nationwide because of limitations in the selection of a sample of nursing homes to participate in the validation study, and (2) Abt provided little information on the specific care processes against which the indicators were validated or on how its risk adjustment approaches changed facility rankings and the appropriateness of those changes. Although both Abt and the NQF concluded that Abt's facility-level risk adjustment approach required further research to determine its validity, CMS plans to report two indicators with and without facility-level adjustments. CMS's comments indicated that it has chosen to report these measures both ways in order to evaluate their usefulness and to give facilities and consumers the additional information. We continue to believe that reporting data of uncertain validity is inappropriate and, as such, will likely not be useful to either facilities or consumers.

For quality indicators to be reliable, the underlying MDS data used to calculate the indicators must be accurate. CMS's comments did not specifically address the conflicting findings on MDS accuracy from Abt's August 2002 validation report and its February 2001 report to CMS. Abt's August 2002 validation report concluded that, in aggregate, the underlying MDS data were very reliable but that the reliability varied considerably within and across states. Aggregate reliability, however, is insufficient because quality indicators are reported separately for each facility. In its February 2001 report to CMS, Abt identified widespread errors in the accuracy of facility-specific assessments used to calculate some of the quality indicators that CMS has selected for reporting in November. CMS indicated that its efforts since 1999 have improved MDS accuracy, but because CMS does not plan to begin limited on-site MDS accuracy reviews until 2003, there is little evidence to support this assertion.
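The distinction between aggregate and facility-level reliability can be made concrete with a small sketch. The counts below are invented for illustration and are not drawn from Abt's reports; the point is only that a pooled error rate can look acceptable while individual facilities, whose scores are what consumers actually see, have much higher error rates.

    # Illustrative sketch only: invented audit counts, not data from Abt's work.
    # Each entry is (MDS items checked, items found to be in error).
    facilities = {
        "Facility A": (400, 8),
        "Facility B": (350, 7),
        "Facility C": (300, 6),
        "Facility D": (50, 15),   # small facility with poor coding accuracy
        "Facility E": (40, 14),   # another small facility with poor accuracy
    }

    # Pooled ("aggregate") error rate across all facilities.
    checked = sum(n for n, _ in facilities.values())
    errors = sum(e for _, e in facilities.values())
    print(f"Aggregate error rate: {errors / checked:.1%}")   # about 4.4 percent

    # Facility-level error rates, which drive each home's published indicators.
    for name, (n, e) in facilities.items():
        print(f"{name}: {e / n:.1%}")                        # D and E exceed 30 percent

In this invented example the pooled rate is under 5 percent, yet two of the five facilities miscode roughly a third of the items checked; quality indicators for those homes could be badly distorted even though the aggregate statistic looks reassuring.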

Lessons Learned from CMS's Evaluation of the Pilot Initiative

CMS commented that findings from a number of activities evaluating the six-state pilot were not available prior to the time we asked for comments on our draft report. While final reports are not yet available for some of these studies, CMS stated that the pilot allowed it to work through important issues and incorporate lessons learned before a national launch. We pointed out that the pilot evaluation was limited and incomplete, an additional reason to delay the initiative. CMS also did not evaluate a key implementation issue: the adequacy of assistance available to consumers through its toll-free telephone hotlines. Moreover, the lack of formal evaluation reports to help guide the development of a consensus about key issues, such as how quality indicators should be reported, is troubling.

In its comments, CMS stated that it was committed to working aggressively to help the public understand nursing home quality information using lessons learned from the pilot. However, CMS learned about the flaws in its hotline operations not from its pilot evaluation but from our attempts to use the Medicare and QIO toll-free phone numbers to obtain information on quality data. Acknowledging the weaknesses we identified, the agency laid out a series of actions intended to strengthen the hotlines' ability to respond to public inquiries, such as providing additional training to customer service representatives prior to the national launch of the initiative. CMS outlined other steps it plans to take, such as providing its customer service representatives with new scripts and with answers to the most frequently asked questions. At the outset of the pilot in April 2002, CMS described seamless transfers from the Medicare to the QIO hotlines for complicated consumer questions but now acknowledges that limitations in QIO telephone technology prevent such transfers. Instead of automatic transfers, CMS stated that, when referrals to QIOs are necessary, callers will be provided with a direct toll-free phone number. CMS also commented that consumers should be encouraged to consider multiple types of information on nursing home quality. While we agree, we believe it is critical that customer service representatives have a clear understanding of the strengths and limitations of different types of data to properly inform consumers when they inquire.

CMS commented that we offered no explanation of the analysis that led us to conclude that (1) consumers could be confused because scores on quality indicators can conflict with each other and with the results of routine nursing home surveys, and (2) the public may confuse a high quality indicator score with a positive result. Our draft clearly states that our findings were based on our analysis of the quality indicator data and survey results available in the six pilot states, a database that CMS provided at our request.

In its comments, CMS provided limited data to support its assertion that consumers are not confused by the quality indicators and are very satisfied with the current presentation on its Web site. According to CMS, over two-thirds of respondents to its August 2002 online satisfaction survey of randomly chosen users of Nursing Home Compare information said they were highly satisfied with the information, finding it, for example, clearly displayed, easy to understand, and valuable. It is not clear, however, that these responses were representative of all nursing home consumers accessing the Web site, as CMS implied. For example, CMS informed us that this survey was part of a larger survey of all Medicare Web site users, which had a low overall response rate of 29 percent. Moreover, of the 654 respondents to the Nursing Home Compare component of the survey, fewer than half (40 percent) were identified as Medicare beneficiaries, family members, or friends.

NQF feedback to CMS on its Web site presentation was consistent with our findings. In commenting on our draft report, NQF noted that it had offered informal guidance to CMS, such as using positive or neutral wording to describe indicators, exploring alternative ways of presenting information about differences among facilities, and ensuring that the presentation of the data reflects meaningful differences in topics important to consumers.

While justifying its current presentation of quality indicator data, CMS commented that it is seriously considering not reporting individual nursing home scores but rather grouping homes into ranges, such as the bottom 10 percent, middle 70 percent, and top 20 percent of facilities in a state. Such a change, however, would not come before the national rollout. We agree with CMS that, when grouping homes into ranges, homes on the margin (those close to the bottom 10 percent or top 20 percent cutoffs) may not be significantly different from one another. However, the same is true of reporting individual facility scores. Moreover, reporting ranges more clearly identifies for consumers those homes that are outliers.
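A short sketch makes the grouping approach concrete. The facility names and scores below are invented, and the cutoffs simply follow the bottom 10 percent / middle 70 percent / top 20 percent ranges CMS described; nothing here reflects CMS's actual implementation.

    # Illustrative sketch only: invented scores for ten homes in one state.
    scores = {
        "Home A": 4.1, "Home B": 7.9, "Home C": 12.5, "Home D": 6.3,
        "Home E": 9.8, "Home F": 3.2, "Home G": 15.0, "Home H": 8.4,
        "Home I": 5.7, "Home J": 11.2,
    }

    # Rank homes within the state and assign each to one of the ranges
    # CMS described: bottom 10 percent, middle 70 percent, top 20 percent.
    ranked = sorted(scores, key=scores.get)
    n = len(ranked)
    for rank, home in enumerate(ranked):
        share_below = rank / n   # fraction of homes scoring below this one
        if share_below < 0.10:
            group = "bottom 10 percent"
        elif share_below < 0.80:
            group = "middle 70 percent"
        else:
            group = "top 20 percent"
        print(f"{home}: score {scores[home]:.1f} -> {group}")

As noted above, homes whose scores fall just on either side of a cutoff may not differ meaningfully, so a grouped display would still need clear caveats about the limitations of the underlying data.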

Additional CMS Comments

CMS also commented on our characterization of the scope of the nursing home quality initiative. CMS stated that we had narrowly framed the initiative as one designed solely for consumers, ignoring the QIOs' quality improvement activities with individual nursing homes requesting assistance. Our report acknowledged and briefly outlined the quality improvement role of the QIOs. However, based on our requesters' concerns about the relatively short pilot timeframe prior to national implementation of public reporting of quality indicators, we focused our work on that key aspect of the initiative. CMS cited its Interim Report on Evaluation Activities for the Nursing Home Quality Initiative to support its conclusion that the initiative was successful in promoting quality improvement activities among nursing homes. The improvements cited in the Interim Report were self-reported by facilities, and CMS offered no insights on the nature of the quality improvement changes. The Interim Report was not available when we sent our draft report to CMS for comment.

CMS provided several technical comments, which we incorporated as appropriate.

As agreed with your offices, unless you publicly announce its contents earlier, we plan no further distribution of this report until 30 days after its issue date. At that time, we will send copies to the Administrator of CMS, appropriate congressional committees, and other interested parties. We will also make copies available to others upon request. In addition, the report will be available at no charge on the GAO Web site at http://www.gao.gov.

If you or your staff have any questions, please call me at (202) 512-7118 or Walter Ochinko at (202) 512-7157. GAO staff acknowledgments are listed in appendix IV.

Kathryn G. Allen
Director, Health Care: Medicaid and Private Health Insurance Issues

Appendixes

Appendix I: Comparison of Quality Indicators Proposed by NQF and CMS for National Rollout

                                               Draft NQF indicators    CMS indicators for
Indicator                                      for national reporting  the national rollout
-------------------------------------------------------------------------------------------
Chronic care (long-stay resident) quality indicators
  Failure to improve and manage delirium                 X
  Pressure ulcers, high and low risk                     X
  Weight loss                                            X
  Depression without therapy                             X
  Incontinence                                           X
  Catheterization                                        X
  Bedfast residents                                      X
  Decline in activities of daily living                  X                      X
  Pressure ulcers(a)                                                            X
  Pressure ulcers(a)                                                            X
  Inadequate pain management                             X                      X
  Physical restraints used daily                         X                      X
  Infections                                                                    X
Postacute (short-stay resident) quality indicators
  Rehospitalizations                                     X
  Failure to improve and manage delirium(a)              X                      X
  Failure to improve and manage delirium(a)                                     X
  Inadequate pain management                             X                      X
  Improvement in walking                                 X                      X

(a) This indicator is listed twice because CMS plans to report it with and without facility-level adjustment.

Source: NQF and CMS.
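To illustrate what reporting an indicator "with and without facility-level adjustment" can mean, the sketch below computes a raw prevalence rate and a simple observed-versus-expected adjusted rate for two invented homes. This generic indirect-standardization approach is offered only to show the idea; it is not Abt's actual methodology, which, as discussed above, was not fully available for review.

    # Illustrative sketch only: a generic observed-vs-expected adjustment,
    # not Abt's methodology. Each resident is (has_condition, expected
    # probability of the condition given that resident's risk profile).
    facilities = {
        # Home A: low-risk residents, so even one case is notable.
        "Home A": [(0, 0.05), (0, 0.10), (1, 0.15), (0, 0.10), (0, 0.10)],
        # Home B: high-risk residents, so two cases are roughly as expected.
        "Home B": [(1, 0.45), (0, 0.35), (1, 0.50), (0, 0.30), (0, 0.40)],
    }

    STATE_AVG = 0.20  # invented statewide prevalence of the condition

    for name, residents in facilities.items():
        observed = sum(flag for flag, _ in residents) / len(residents)
        expected = sum(p for _, p in residents) / len(residents)
        # Adjusted rate: scale the state average by the observed-to-expected
        # ratio so a home is not penalized for admitting sicker residents.
        adjusted = STATE_AVG * (observed / expected)
        print(f"{name}: unadjusted {observed:.0%}, adjusted {adjusted:.0%}")

In this invented example the two homes swap positions once resident mix is taken into account (Home A: 20 percent unadjusted but 40 percent adjusted; Home B: 40 percent unadjusted but 20 percent adjusted), which is precisely why the validity of the adjustment methodology matters before scores are published.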

Appendix II: Comments from the Centers for Medicare & Medicaid Services

Appendix III: Comments from the National Quality Forum

Appendix IV: GAO Contact and Staff Acknowledgments

GAO Contact: Walter Ochinko, (202) 512-7157

Acknowledgments: The following staff made important contributions to this report: Laura Sutton Elsberg, Patricia A. Jones, Dean Mohs, Dae Park, Jonathan Ratner, Peter Schmidt, Paul M. Thomas, and Phyllis Thorburn.

(290232)

GAO's Mission

The General Accounting Office, the investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony

The fastest and easiest way to obtain copies of GAO documents at no cost is through the Internet. GAO's Web site (www.gao.gov) contains abstracts and full-text files of current reports and testimony and an expanding archive of older products. The Web site features a search engine to help you locate documents using key words and phrases. You can print these documents in their entirety, including charts and other graphics.

Each day, GAO issues a list of newly released reports, testimony, and correspondence. GAO posts this list, known as "Today's Reports," on its Web site daily. The list contains links to the full-text document files. To have GAO e-mail this list to you every afternoon, go to www.gao.gov and select "Subscribe to GAO Mailing Lists" under the "Order GAO Products" heading.

Order by Mail or Phone

The first copy of each printed report is free. Additional copies are $2 each. A check or money order should be made out to the Superintendent of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or more copies mailed to a single address are discounted 25 percent. Orders should be sent to:

U.S. General Accounting Office
441 G Street NW, Room LM
Washington, D.C. 20548

To order by phone:
Voice: (202) 512-6000
TDD: (202) 512-2537
Fax: (202) 512-6061

To Report Fraud, Waste, and Abuse in Federal Programs

Contact:
Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: fraudnet@gao.gov
Automated answering system: (800) 424-5454 or (202) 512-7470

Public Affairs

Jeff Nelligan, Managing Director, NelliganJ@gao.gov, (202) 512-4800
U.S. General Accounting Office, 441 G Street NW, Room 7149
Washington, D.C. 20548


Highlights of GAO-03-187, a report to congressional requesters

October 2002

NURSING HOMES: Public Reporting of Quality Indicators Has Merit, but National Implementation Is Premature

www.gao.gov/cgi-bin/getrpt?GAO-03-187. To view the full report, including the scope and methodology, click on the link above. For more information, contact Kathryn G. Allen at (202) 512-7118.

CMS's initiative to augment existing public data on nursing home quality has considerable merit, but its planned November 2002 implementation does not allow sufficient time to ensure that the indicators it publishes are appropriate and useful to consumers. CMS's plan urges consumers to consider nursing homes with positive quality indicator scores, in effect attempting to use market forces to encourage nursing homes to improve the quality of care. However, CMS is moving forward without adequately resolving a number of important open issues on the appropriateness of the indicators chosen for national reporting or the accuracy of the underlying data.

To develop and help select the quality indicators, CMS hired two organizations with expertise in health care data: Abt Associates, Inc. and the National Quality Forum (NQF). Abt identified a list of potential quality indicators and tested them to verify that they represented the actual quality of care individual nursing homes provide. Although the full Abt report on validation of the indicators was not available as of October 28, 2002, GAO's review of the available portions of the report raised serious questions about the basis for moving forward with national reporting at this time. NQF, which was created to develop and implement a national strategy for measuring health care quality, was hired to review Abt's work and identify core indicators for national reporting. To allow sufficient time to review Abt's validation report, NQF agreed to delay its recommendations for national reporting until 2003. CMS limited its own evaluation of its six-state pilot program for the initiative so that the November 2002 implementation date could be met. Early results were expected in October 2002, leaving little time to incorporate them into the national rollout. Despite the lack of a final report from NQF and an incomplete pilot evaluation, CMS has announced a set of indicators it will begin reporting nationally in November 2002.

GAO has serious concerns about the potential for public confusion over the quality information published, especially if there are significant changes to the quality indicators as a result of the NQF's review. CMS's proposed reporting format implies a precision in the data that is lacking at this time. While acknowledging this problem, CMS said it prefers to wait until after the national rollout to modify the presentation of the data. GAO's analysis of data currently available from the pilot states demonstrated ample opportunity for the public to be confused, highlighting the need for clear descriptions of the data's limitations and easy access to impartial experts hired by CMS to operate consumer hotlines. CMS has not yet demonstrated its readiness to meet these consumer needs either directly or through the hotlines fielding public questions about confusing or conflicting quality data.

CMS acknowledged that further work is needed to refine its initiative but believes that its indicators are sufficiently valid, reliable, and accurate to move forward with national implementation in November 2002 as planned.



GAO is recommending that the Administrator of CMS delay the national reporting of quality indicators to allow sufficient time to resolve important issues regarding appropriate indicators for public reporting and to implement a program to review the accuracy of the data on which the indicators are based. During this time, CMS also should more thoroughly evaluate the results of its six-state pilot to assess how information is presented and to improve assistance available through consumer hotlines.

*** End of document. ***