Nursing Homes: Federal Efforts to Monitor Resident Assessment
Data Should Complement State Activities (15-FEB-02, GAO-02-279).
Nursing homes that participate in Medicare and Medicaid must
periodically assess the needs of residents in order to develop an
appropriate plan of care. Such resident assessments are known as
the minimum data set (MDS). According to officials in the 10
states with MDS accuracy review programs in operation as of
January 2001, these programs were established primarily because MDS data are
used to set Medicaid payments and to identify quality of care problems. Nine
of the 10
states conduct periodic on-site reviews in all or a significant
portion of their nursing homes to assess the accuracy of the MDS
data. These reviews sample a home's MDS assessments to determine
whether the basis for the assessments is adequately documented in
residents' medical records. In addition, these reviews often
include interviews of nursing home personnel familiar with
residents and observations of the residents themselves. States
with separate MDS review programs identified various approaches
to improving MDS accuracy. State officials highlighted the
on-site review process itself and provider education activities
as their primary approaches. State officials also reported
remedies such as requiring nursing homes to prepare a corrective
action plan or imposing financial penalties on nursing homes when
serious or extensive errors in MDS data are found. Following the
1998 implementation of Medicare's MDS-based payment system, the
Health Care Financing Administration began its own review program
to ensure the accuracy of MDS data.
-------------------------Indexing Terms-------------------------
REPORTNUM: GAO-02-279
ACCNO: A02769
TITLE: Nursing Homes: Federal Efforts to Monitor Resident
Assessment Data Should Complement State Activities
DATE: 02/15/2002
SUBJECT: Aid for the elderly
Federal/state relations
Health care programs
Reporting requirements
Data integrity
Nursing homes
Surveys
Medicaid Program
Medicare Program
United States General Accounting Office
Report to Congressional Requesters

February 2002

NURSING HOMES
Federal Efforts to Monitor Resident Assessment Data Should Complement State
Activities

GAO-02-279
Contents

Letter
  Results in Brief
  Background
  Only Eleven States Conduct Separate On-Site or Off-Site Reviews of MDS
    Accuracy
  States Attempt to Improve MDS Data Accuracy through On-Site Reviews,
    Training, and Other Remedies
  CMS's MDS Review Program Could Better Leverage Existing State and Federal
    Accuracy Activities
  Conclusions
  Recommendations for Executive Action
  Agency and State Comments and Our Evaluation

Appendix I:  Summary of State On-Site MDS Reviews As of January 2001
Appendix II: Comments from the Centers for Medicare and Medicaid Services

Tables
  Table 1: States with and without MDS Review Programs as of January 2001
  Table 2: MDS Assessments with Errors in Five States with On-Site MDS Review
    Programs
  Table 3: Implementation Schedule for CMS's MDS Accuracy Review Program

Figures
  Figure 1: MDS Elements Identified By Nine States As Having High Potential
    for MDS Errors

Abbreviations

ADL    activities of daily living
CMS    Centers for Medicare and Medicaid Services
DAVE   data assessment and verification
HCFA   Health Care Financing Administration
HHS    Health and Human Services
MDS    minimum data set
OIG    Office of Inspector General
OASIS  Outcome and Assessment Information Set
PPS    prospective payment system
SNF    skilled nursing facilities
February 15, 2002

The Honorable Charles E. Grassley
Ranking Minority Member
Committee on Finance
United States Senate

The Honorable Larry Craig
Ranking Minority Member
Special Committee on Aging
United States Senate
Nursing homes play an important role in the health care system of the United
States. More than 40 percent of elderly Americans will use a nursing home at
some time in their lives. Such facilities provide skilled nursing, therapy,
or supportive care to older individuals who do not need the intensive
medical care provided by hospitals, but for whom receiving care at home is
not feasible. Under the Medicare and Medicaid programs, nursing homes were
expected to receive $58 billion in 2001, with a federal share of
approximately $38 billion. Nursing homes that participate in these programs
are required to periodically assess the care needs of residents in order to
develop an appropriate plan of care. Such resident assessment data are known
as the minimum data set (MDS). 1 The federal government contracts with
states to periodically inspect or survey nursing homes, and state surveyors
use MDS data to help assess the quality of resident care. 2 Medicare and
some state Medicaid programs also use MDS data to adjust nursing home
payments to account for variation in resident care needs.
1 The Omnibus Budget Reconciliation Act of 1987 required the Secretary of
Health and Human Services to specify a minimum data set of core elements to
use in conducting comprehensive assessments of patient conditions and care
needs. See 42 U.S.C. sect. 1395i-3; 42 U.S.C. sect. 1396r. By mid-1991, the
requirement to assess and plan for resident care had been implemented in all
nursing homes that serve Medicare and Medicaid beneficiaries. MDS data are
collected for all residents in these facilities, including Medicare,
Medicaid, and private pay patients.
2 The federal government has responsibility for establishing requirements
that nursing homes must meet to participate in publicly funded programs.
Every nursing home that receives Medicare or Medicaid funding must undergo a
standard survey conducted on average every 12 months and no less than once
every 15 months. Under its contracts with states, the federal government
funds 100 percent of costs associated with certifying that nursing homes
meet Medicare requirements and 75 percent of the costs associated with
Medicaid standards.
Thus, the accuracy of MDS data has implications for the identification of
quality problems and the level of nursing home payments.
MDS accuracy is one of many areas that state surveyors are expected to
examine during periodic nursing home surveys. Federal guidance for state
surveyors regarding the accuracy of MDS assessments focuses on whether
appropriate personnel completed or coordinated the assessments and whether
there are any indications that the assessments were falsified. This guidance
also instructs surveyors to conduct a check of specific MDS items to ensure
that the resident's condition is appropriately characterized. Concerns
exist, however, that state surveyors already have too many tasks and that,
as a result, the survey process may not adequately address MDS accuracy. In
addition, our prior work on nursing home quality issues has identified
weaknesses in the survey process that raise questions about the thoroughness
and consistency of state surveys. 3
In response to your request, we assessed (1) how states monitor the accuracy
of MDS data compiled by nursing homes through review programs separate from
their standard nursing home survey process, (2) how states attempt to
improve the data?s accuracy where there are indications of problems, and (3)
how the federal government ensures the accuracy of MDS data. We surveyed the
50 states and the District of Columbia to determine whether states had a
separate MDS review program, distinct from any MDS oversight that might
occur during the periodic nursing home surveys performed by all states. We
then conducted structured interviews with officials in 10 of the 11 states
that indicated they had separate MDS review programs. 4 We also interviewed
staff from the Centers for Medicare and Medicaid Services (CMS), an agency
within the Department of Health and Human Services (HHS) that manages the
Medicare and Medicaid programs, who were responsible for developing
3 See Nursing Homes: Sustained Efforts Are Essential to Realize Potential of
the Quality Initiatives (GAO/HEHS-00-197, Sept. 28, 2000).

4 These 10 states are Iowa, Indiana, Maine, Mississippi, Ohio, Pennsylvania,
South Dakota, Vermont, Washington, and West Virginia. Due to the newness of
Virginia's MDS review program (implemented in April 2001), we focused on the
experience of the 10 states with longer standing programs. In addition, about
one-third of the states without separate MDS review programs
volunteered additional information regarding the ways in which the accuracy
of MDS data may be addressed through the nursing home survey process or
training programs offered by the state.
the agency's MDS review program. 5 In addition, we reviewed regulations,
literature, and other documents relating to MDS data. We performed our work
from December 2000 through January 2002 in accordance with generally
accepted government auditing standards.
Results in Brief

Eleven states have established separate MDS review programs, apart from
their standard nursing home survey process, to monitor the accuracy of
resident assessment data compiled by nursing homes. An additional seven
states reported that they plan to do so. According to officials in the 10
states with MDS accuracy review programs in operation as of January 2001,
these programs were established primarily because of the important role
played by MDS data in setting Medicaid payments and identifying quality of
care problems. While routine nursing home surveys provide an opportunity to
examine the accuracy of MDS data, officials in some of the 10 states with
separate MDS review programs told us that surveyors do not have sufficient
time to focus on the data's accuracy because of other survey tasks. To
assess the accuracy of the MDS data, 9 of the 10 states conduct periodic
on-site reviews in all or a significant portion of their nursing homes. These
reviews include checking a sample of a home's MDS assessments and
determining whether the basis for the assessments is adequately documented
in residents' medical records. In addition, these reviews often include
interviews of nursing home personnel familiar with residents and
observations of the residents themselves. Such corroborating evidence
provides reviewers increased assurance that an MDS assessment accurately
reflects the resident's condition. States with on-site review programs
reported that the discrepancies they identified between MDS assessments and
the supporting documentation, also called "MDS errors," typically resulted
from differences in clinical interpretation
or mistakes, such as a misunderstanding of MDS definitions. Two of the 10
states were able to tell us the amount of the recoupments they obtained from
nursing homes due to Medicaid overpayments based on inaccurate MDS
assessments. For example, West Virginia received $1 million from one nursing
home relating to MDS errors associated with physical therapy services.
5 On June 14, 2001, the Secretary of HHS changed the name of the Health Care
Financing Administration (HCFA) to CMS. In this report, we will continue to
refer to HCFA where our findings apply to the organizational structure and
operations associated with that name.
States with separate MDS review programs identified a variety of approaches
to improving MDS accuracy. State officials highlighted the on-site review
process itself and provider education activities as their primary
approaches. On-site reviews heighten facility staff awareness of the
importance of MDS data and can lead to the correction of practices that
contribute to MDS errors. Some officials said that on-site reviews provide
a valuable opportunity for informal training and coaching staff about
completing and documenting MDS assessments, which is important given the
types of MDS errors found and the high staff turnover in nursing homes.
Identifying areas of confusion among nursing home staff during on-site MDS
reviews is also useful in guiding the focus of formal training sessions
conducted outside of the nursing home. State officials reported that they
also have one or more remedies at their disposal to help improve accuracy,
such as requiring nursing homes to prepare a corrective action plan or
imposing financial penalties on nursing homes when serious or extensive
errors in MDS data are found. Indiana, for example, requires facilities to
submit a corrective action plan detailing how the facility will address
errors identified during an on-site review. In addition, Maine has
collected approximately $390,000 in financial penalties since late 1995 from
facilities with MDS errors. Finally, officials from five states told us that
their MDS review efforts have resulted in a notable decrease in MDS errors
across all facilities. For example, the average percentage of assessments
with MDS errors that resulted in a payment change since initiation of their
separate review programs has decreased from about 85 percent to 10 percent
of assessments in South Dakota and from 75 percent to 30 percent of
assessments in Indiana.
Following the 1998 implementation of Medicare's MDS-based payment system,
the Health Care Financing Administration (HCFA) began building the
foundation for its own separate review program, distinct from state efforts,
intended to ensure the accuracy of MDS data for all nursing home residents.
In the course of developing and testing various accuracy review approaches,
an agency contractor found widespread MDS errors that resulted in a change
in the Medicare payment level for two-thirds of the resident assessments
sampled. Its on-site visits proved to be a very effective method of
assessing accuracy. As a result, the contractor recommended that any MDS
reviews involve on-site visits, at least for the first few years of any
national review program, along with certain off-site analysis to help
target homes and areas for review. In September 2001, CMS awarded a new
contract to establish a national MDS accuracy review program. As currently
planned, CMS's MDS review activities are projected to involve roughly 1
percent of the estimated 14.7 million MDS assessments expected to be
completed in 2001, with on-site reviews in fewer than 200 of the nation's
17,000 nursing homes each year. In contrast,
states that conduct separate MDS reviews typically examine from 10 to 40
percent of assessments completed in all or a significant portion of their
nursing homes. The CMS contractor is required to coordinate its activities
with ongoing state and federal efforts. For example, to avoid unnecessary
overlap, the contractor is instructed to coordinate with states regarding
the selection of facilities and the timing of visits. However, the
contractor is not specifically tasked with assessing the adequacy of each
state's MDS accuracy activities. While CMS's approach may yield some broad
sense of the accuracy of MDS assessments on an aggregate level, it appears
to be insufficient to provide confidence about the accuracy of MDS
assessments in the vast bulk of nursing homes nationwide.
Given the substantial level of effort and resources already invested at the
state and federal levels to oversee nursing home quality of care, including
periodic inspections at each home nationwide, we believe that CMS should
reorient its MDS accuracy program so that it complements and leverages
existing state review activities and its own established nursing home
oversight efforts. Therefore, we are making recommendations to the
administrator of CMS that include determining the adequacy of each state's
efforts to ensure MDS accuracy and providing additional guidance and
technical assistance to individual states as needed; routinely monitoring
state review activities and progress as part of CMS's own ongoing federal
oversight of nursing home quality; and ensuring that states and nursing
homes have sufficient documentation to support the full MDS assessment.
In commenting on a draft of this report, CMS agreed with the importance of
assessing and monitoring the adequacy of state MDS accuracy efforts. CMS
recognized that the MDS affects reimbursement and care planning, and that it
is essential for the assessment data to reflect the resident's health status
so that residents receive appropriate, quality care and providers are
appropriately reimbursed. While CMS's comments suggested that
its current efforts may be sufficient to assess and improve state
performance, we do not believe they will result in the systematic assessment
and monitoring of each state's MDS accuracy that we recommended. CMS did not
agree with our recommendation on the need for sufficient documentation to
support the full MDS assessment, expressing concern about potential
duplicative effort and unnecessary burden for nursing homes. In our view,
documentation need not be duplicative, but it should demonstrate that the
higher-level summary judgment about a resident's condition and needs entered
on the MDS can be independently validated. Given the importance of MDS data
in adjusting nursing home payments and guiding resident care, ensuring their
integrity is critical to achieving their intended purposes.
Background

The nation's 17,000 nursing homes play an essential role in our health care
system, providing services to 1.6 million elderly and disabled persons who
are temporarily or permanently unable to care for themselves but who do not
require the level of care furnished in an acute care hospital. Depending on
the identified needs of each resident, as determined through MDS
assessments, nursing homes provide a variety of services, including nursing
and custodial care, physical, occupational, and speech therapy, and medical
social services. 6 The majority of nursing home residents have their care
paid for by Medicaid, a joint federal-state program for certain low-income
individuals. Almost all nursing homes serve Medicaid residents, while more
than 14,000 nursing homes are also Medicare-certified. Medicare, the federal
health care program for elderly and disabled Americans, pays for
posthospital nursing home stays if a beneficiary needs skilled nursing or
rehabilitative services. 7 Medicare-covered skilled nursing home days account
for approximately 9 percent of total nursing home days. Medicare
beneficiaries tend to have shorter nursing home stays and receive more
rehabilitation services than individuals covered by Medicaid.
MDS Used to Assess Nursing Home Residents

Since 1991, nursing homes have been required to develop a plan of care for
each resident based on the periodic collection of MDS data. The MDS contains
individual assessment items covering 17 areas, such as mood and behavior,
physical functioning, and skin conditions. MDS assessments of each resident
are conducted in the first 14 days after admission and are
6 For patients with an advanced illness, medical social services generally
help the patient and family cope with the logistics of daily life, including
financial and legal planning and mobilizing community resources that may be
available to the patient. Such services may also include counseling the
patient and family to address emotions and other issues related to the
advanced illness.
7 To qualify, a Medicare beneficiary must require daily skilled nursing or
rehabilitative therapy services, generally within 30 days of a hospital stay
of at least 3 days in length, and must be admitted to the nursing home for a
condition related to the hospitalization.
used to develop a care plan. 8 A range of professionals, including nurses,
attending physicians, social workers, activities professionals, and
occupational, speech, and physical therapists, complete designated parts of
the MDS. 9 Assessing a resident's condition in certain areas requires
observation, often over a period of days. For example, nursing staff must
assess the degree of resident assistance needed during the previous 7 days
(none, supervised, limited, extensive, or total dependence) to carry out the
activities of daily living (ADL), such as using a toilet, eating, or
dressing. To obtain this information, staff completing the MDS assessments
are required to communicate with direct care staff, such as nursing
assistants or activities aides, who have worked with the resident over
different time periods. These staff have first-hand knowledge of the
resident and will often be the primary and most reliable source of
information regarding resident performance of different activities. While a
registered nurse is required to verify that the MDS assessment is complete,
each professional staff member who contributed to the assessment must sign
and attest to the accuracy of his or her portion of the assessment.
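The assessment schedule just described (and detailed in footnote 8) can be
illustrated with a short, simplified sketch. The function below is
hypothetical and omits many real scheduling rules, such as significant-change
and quarterly assessments, but it shows how the Medicare-stay milestones fall
out.

    from datetime import date, timedelta

    # Simplified, hypothetical sketch of the Medicare-stay assessment schedule
    # described in the text: assessments on or before days 5, 14, and 30 of
    # the stay and every 30 days thereafter. Real rules have more detail.
    def medicare_assessment_days(stay_days: int) -> list[int]:
        """Return the stay days (1-based) by which assessments are due."""
        markers = [d for d in (5, 14, 30) if d <= stay_days]
        day = 60
        while day <= stay_days:
            markers.append(day)
            day += 30
        return markers

    def due_dates(admission: date, stay_days: int) -> list[date]:
        """Convert stay-day markers into calendar due dates."""
        return [admission + timedelta(days=d - 1)
                for d in medicare_assessment_days(stay_days)]

    print(medicare_assessment_days(100))        # [5, 14, 30, 60, 90]
    print(due_dates(date(2001, 3, 1), 100)[0])  # 2001-03-05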
MDS Used in Quality Oversight and as Basis for Payments

MDS data are also submitted by nursing homes to states and CMS for use in
the nursing home survey process and to serve as the basis for adjusting
payments. CMS contracts with states to periodically survey nursing homes to
review the quality of care and assure that the services delivered meet the
residents' assessed needs. In fiscal year 2001, the federal government
8 MDS assessments are conducted for all nursing home residents within 14
days of admission and at quarterly and yearly intervals unless there is a
significant change in condition. Accommodating their shorter nursing home
stays, Medicare beneficiaries in a Medicare-covered stay are assessed on or
before the 5th, 14th, and 30th day of their stays and every 30 days
thereafter.
9 In a recent study, the HHS Office of Inspector General (OIG) reported that
almost all of the facilities in its study had a position of MDS coordinator.
Eighty-one percent were registered nurses, and the remainder were either
licensed practical nurses, licensed vocational nurses, or social workers.
See HHS OIG, Nursing Home Resident Assessment: Quality of Care,
OEI-02-99-00040 (Washington, D.C.: HHS, Dec. 2000).
spent about $278 million on the nursing home survey process. 10 Effective
July 1999, the agency instructed states to begin using quality indicators
derived from MDS data to review the care provided to a nursing home's
residents before state surveyors actually visit the home to conduct a
survey. 11 Quality indicators are essentially numeric warning signs of
potential care problems, such as greater-than-expected instances of weight
loss, dehydration, or pressure sores among a nursing home's residents. They
are used to rank a facility in 24 areas compared with other nursing homes in
a state. In addition, by using the quality indicators before the on-site
visit to select a preliminary sample of residents to review, surveyors
should be better prepared to identify potential care problems.
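To illustrate how such an indicator works in general, the sketch below
computes one facility-level rate from simplified MDS records and ranks it
against in-state peers. The "weight loss" flag, the field names, and the
percentile ranking are hypothetical illustrations, not CMS's actual
quality-indicator specification.

    # Hypothetical sketch of a facility-level quality indicator: the share of
    # a facility's assessed residents flagged for weight loss, ranked against
    # other homes in the state. Field names and logic are illustrative only.
    def weight_loss_rate(assessments):
        flagged = sum(1 for a in assessments if a.get("weight_loss"))
        return flagged / len(assessments) if assessments else 0.0

    def rank_facilities(state_assessments):
        """Return each facility's rate and its percentile among in-state peers."""
        rates = {fac: weight_loss_rate(rows) for fac, rows in state_assessments.items()}
        ordered = sorted(rates.values())
        def percentile(r):
            return sum(1 for x in ordered if x <= r) / len(ordered)
        return {fac: (rate, percentile(rate)) for fac, rate in rates.items()}

    state_assessments = {
        "Facility A": [{"weight_loss": False}] * 45 + [{"weight_loss": True}] * 5,
        "Facility B": [{"weight_loss": False}] * 30 + [{"weight_loss": True}] * 20,
    }
    # Facility B's greater-than-expected rate would stand out before the survey.
    print(rank_facilities(state_assessments))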
In addition to quality oversight, some state Medicaid programs and Medicare
use MDS data to adjust nursing home payments to reflect the expected
resource needs of their residents. Such payment systems are commonly known
as ?case- mix? reimbursement systems. Because not all residents require the
same amount of care, the rate paid for each resident is adjusted using a
classification system that groups residents based on their expected costs of
care. Facilities use MDS data to assign residents to case- mix categories or
groups that are defined according to clinical condition, functional status,
and expected use of services. In Medicare, these case- mix groups are known
as resource utilization groups. Each case- mix group represents
beneficiaries who have similar nursing and therapy needs. As of January
2001, 18 states had introduced such payment systems for their Medicaid
programs. 12 As directed by the Congress, HCFA
10 To assess state survey agency performance in fulfilling contractual
obligations, CMS is required by statute to conduct federal oversight surveys
in at least 5 percent of the nursing homes in each state within 2 months of
the state's completion of its survey. CMS fulfills this requirement by
conducting a combination of (1) comparative surveys, in which a federal team
independently surveys a nursing home recently inspected by a state in order
to compare and contrast the results, and (2) observational surveys where
federal surveyors accompany a state survey team to a nursing home to watch
the conduct of the survey, provide immediate feedback, and later rate the
team's performance. Comparative surveys offer a more accurate picture of the
adequacy of state survey activities than do observational surveys, which
primarily are used to help identify training needs. HCFA surveyors found
deficiencies that were more serious than those identified by state surveyors
in about 70 percent of the 157 comparative surveys they conducted between
October 1998 and May 2000. See GAO/HEHS-00-197, Sept. 28, 2000.
11 Quality indicators were developed in a HCFA-funded project at the
University of Wisconsin. See Center for Health Systems Research and Analysis,
Facility Guide for the Nursing Home Quality Indicators (University of
Wisconsin-Madison: Sept. 1999).

12 We refer to these states as having "MDS-based payment systems."
in 1998 implemented a prospective payment system (PPS) for skilled nursing
facilities (SNF), nursing homes that are certified to serve Medicare
beneficiaries. The SNF PPS also uses MDS data to adjust nursing home
payments.
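The following sketch illustrates the general mechanics of such a case-mix
adjustment: a resident's MDS-derived group maps to a weight that scales a
base rate. The group labels, weights, and base rate are hypothetical, not
actual resource utilization groups or SNF PPS amounts.

    # Hypothetical case-mix adjustment: the weights and base rate are invented
    # for illustration and are not actual Medicare or Medicaid figures.
    CASE_MIX_WEIGHTS = {
        "rehabilitation_high": 1.45,        # heavy therapy needs
        "clinically_complex": 1.20,         # e.g., pneumonia, dehydration
        "reduced_physical_function": 0.85,
    }
    BASE_DAILY_RATE = 250.00

    def daily_payment(case_mix_group: str) -> float:
        """Scale the base rate by the resident's case-mix weight."""
        return BASE_DAILY_RATE * CASE_MIX_WEIGHTS[case_mix_group]

    print(daily_payment("rehabilitation_high"))        # 362.5
    print(daily_payment("reduced_physical_function"))  # 212.5

Because the MDS assessment drives the group assignment, an inaccurate
assessment translates directly into an inaccurate payment, which is why
states and CMS review the underlying data.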
MDS Review Activities Can Be On-Site or Off-Site

States and CMS use the term "accuracy reviews" to describe efforts that help
ensure MDS assessments accurately reflect residents' conditions. Review
activities can be performed on-site (that is, at the nursing home) or
off-site. On-site reviews generally consist of documentation reviews to
determine whether the resident's medical record supports the MDS assessment
completed by the facility. 13 If the MDS assessment is recent, the review
may also include direct observation of the resident and interviews with
nursing home staff who have recently evaluated or treated the resident.
While documentation reviews may also be conducted outside of the nursing
home, other off-site reviews of MDS data include examining trends across
facilities. 14 For example, off-site review activities could involve the
examination of monthly reports showing the distribution of residents'
case-mix categories across different facilities in a state. Similarly,
off-site reviews could also involve an examination of particular MDS
elements, such as the distribution of ADLs within and across nursing homes
to identify aberrant or inconsistent patterns that may indicate the need for
further investigation. Off-site and on-site reviews may also be combined as
a way of leveraging limited resources to conduct MDS accuracy activities.
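A minimal sketch of the documentation-review step might look like the
following: compare what the facility coded on the MDS against what a reviewer
could substantiate in the medical record, and flag unsupported items. The
item names, coded values, and the simple equality check standing in for
clinical judgment are all hypothetical.

    # Hypothetical documentation review: flag MDS items whose coded values the
    # medical record does not support. Item names and values are invented.
    def unsupported_items(mds_assessment, record_findings):
        return [
            item for item, coded_value in mds_assessment.items()
            if record_findings.get(item) != coded_value
        ]

    mds_assessment = {
        "adl_eating": "extensive assistance",
        "therapy_minutes_last_7_days": 325,
        "mood_symptoms_present": True,
    }
    record_findings = {
        "adl_eating": "limited assistance",   # nursing notes support less assistance
        "therapy_minutes_last_7_days": 325,   # therapy logs match
        "mood_symptoms_present": False,       # no documented observations
    }
    # Items flagged here would prompt staff interviews or resident observation.
    print(unsupported_items(mds_assessment, record_findings))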
13 Each nursing home resident has a medical record where information about
the resident is documented. In addition to the current plan of care,
examples of medical record documentation include: (1) recent physician
notes, (2) results of recent tests, and (3) documentation of services
provided. Nursing home staff use this documentation to complete each MDS
assessment. Maintaining an adequate level of documentation in the medical
record improves the ability of staff to complete the MDS accurately,
particularly for areas that require observation over a period of days. Some
states assert that determining the degree of assistance that a resident
requires with ADLs, such as bathing, dressing and toileting, requires
repeated observation over several days, thus increasing the need for
documentation.
14 CMS's current review of SNF PPS claims is an example of an off-site
documentation review. CMS contracts with fiscal intermediaries to process
Medicare claims and to conduct reviews that use medical records requested
from nursing homes to ensure that claims for Medicare payments are
adequately supported. For fiscal years 2000 and 2001, such contracts
required fiscal intermediaries to review 0.5 percent and 1 to 3 percent,
respectively, of total SNF PPS claims.
Only Eleven States Conduct Separate On-Site or Off-Site Reviews of MDS
Accuracy

Eleven states conduct separate MDS accuracy reviews apart from their
standard nursing home survey process. Ten of these states' reviews were in
operation as of January 2001. An additional 7 states reported that they
intend to initiate similar accuracy reviews. 15 All 18 of these states
either currently use an MDS-based Medicaid payment system or plan to
implement such a system. The remaining 33 states have no plans to implement
separate MDS review programs and currently rely on their periodic nursing
home surveys for MDS oversight. 16 In all but one of the states with
separate MDS review programs operating as of January 2001, accuracy reviews
entail periodic on- site visits to nursing homes. The reviews focus on
whether a sample of MDS assessments completed by the facility is supported
by residents? medical records. If the MDS assessments reviewed are recent
enough that residents are still in the facility and their health status has
not changed, the on- site review may also be supplemented with interviews of
nursing home staff familiar with the residents, as well as observations of
the residents themselves, to validate the record review. About half of these
states also conduct off-site data analyses in which reviewers look for
significant changes or outliers, such as facilities with unexplained large
shifts in the distribution of residents across case-mix categories over a
short period. Officials primarily attributed the errors found during their
on-site reviews to differences in clinical interpretation and mistakes,
such as a misunderstanding of MDS definitions. A few of these states have
been able to show some recoupments of Medicaid payments since the
implementation of their on-site review programs.
15 In January 2002, we learned that one of these states, Kentucky, had
implemented its MDS review program in October 2001. Our analysis, however,
is based on the 10 programs in operation as of January 2001.
16 The District of Columbia is included as one of the 33 states that has no
plans to implement a separate MDS review program. In this report, we
generally refer to the District of Columbia as a state.
Most States Do Not Have Separate MDS Review Programs

Of the 50 states and the District of Columbia, only 11 conduct accuracy
reviews of MDS data that are separate from the state's nursing home survey
process. 17 (See table 1.) These 11 states provide care to approximately 22
percent of the nation's nursing home residents, and all but one have an
MDS-based payment system (Virginia began conducting MDS accuracy reviews in
April 2001 in anticipation of adopting such a payment system in 2002). Seven
additional states plan to initiate separate MDS reviews: three currently
have an MDS-based payment system and four are planning to implement such a
payment system. Officials in the 10 states with separate, longer standing
MDS review programs said that the primary reason for implementing reviews
was to ensure the accuracy of the MDS data used in their payment systems.
Several of these states also indicated that the use of MDS data in
generating quality indicators was another important consideration. Vermont
officials, in particular, emphasized the link to quality of care, noting
that the state had created its own MDS- based quality indicators prior to
HCFA?s requirement to use quality indicators in nursing home surveys. A
state official told us it was critical that the MDS data be accurate because
Vermont was making this information available to the public as well as using
it internally as a normal part of the nursing home survey process.
17 Since separate MDS accuracy reviews are associated with states' Medicaid
programs, the costs can be considered administrative expenses. In general,
the federal government pays 75 percent of the cost for review activities
performed by skilled professional medical personnel, such as registered
nurses, and 50 percent for other personnel costs. States are responsible for
the remaining costs.
Table 1: States with and without MDS Review Programs as of January 2001

States with separate MDS review programs
  MDS-based payment system: Indiana, Iowa, Maine, Mississippi, Ohio,
    Pennsylvania, South Dakota, Vermont, Washington, West Virginia (10 states)
  Planning to adopt MDS-based payment system: Virginia (reviews began April
    2001) (1 state)

States planning separate MDS review programs
  MDS-based payment system: Idaho, Kentucky, New Hampshire (3 states)
  Planning to adopt MDS-based payment system: Georgia, Minnesota,(a) New
    Jersey, Utah (4 states)

  Subtotal: 18

States with no plans to establish separate MDS review programs
  MDS-based payment system: Colorado,(b) Florida, Kansas, Nebraska, North
    Dakota (5 states)
  No MDS-based payment system: Alaska, Alabama, Arkansas, Arizona, California,
    Connecticut, District of Columbia, Delaware, Hawaii, Illinois,(a)
    Louisiana, Massachusetts,(a) Maryland,(b) Michigan, Missouri, Montana,(a)
    North Carolina, New Mexico, Nevada, New York,(a) Oklahoma, Oregon, Rhode
    Island, South Carolina, Tennessee, Texas,(a) Wisconsin, Wyoming (28 states)

  Subtotal: 33

Total: 51

Note: States' decisions regarding whether to adopt an MDS-based payment
system and MDS review program may have changed since the time of our data
collection (January 2001). For example, a Kentucky official told us that it
implemented a separate MDS review program in October 2001, and Montana has
shifted to an MDS-based payment system.

(a) Although these states do not conduct a separate review of MDS data, they
do conduct separate reviews of data that are linked to their state's Medicaid
payment system. For example, Texas has a non-MDS-based case-mix payment
system called the Texas Index for Level of Effort that is based on a
recipient's condition, ADLs, and the level of staff intervention.

(b) Colorado and Maryland officials volunteered that they had conducted
one-time reviews of MDS data, but are not planning to regularly continue
these reviews. Colorado's state survey agency conducted an MDS review of 90
nursing homes (40 percent of homes) in the summer of 2000, and Maryland
officials participated in a HCFA-funded project to conduct on-site reviews
from May through July 2000 at 5 percent of its nursing homes.

Source: GAO survey of 50 states and the District of Columbia.
To varying degrees, three major factors influenced the decision of 33 states
not to establish separate MDS review programs. First, the
majority, 28 states, do not have MDS-based Medicaid payment systems.
Second, some states cited the cost of conducting separate reviews. Kansas,
for example, reported a lack of funding and staff resources as the reason
for halting a brief period of on-site visits in 1996 as a follow-up to
nursing home surveys. Arkansas as well reported insufficient staff for
conducting a separate review of MDS data. 18 Finally, officials in about
one-third of the states without separate MDS reviews volunteered that they
had some assurance of the accuracy of MDS data either because of training
programs for persons responsible for completing MDS assessments or because
of the nursing home survey process. 19 For example, Missouri operates a
state-funded quality improvement project in which nurses with MDS training
visit facilities to assist staff with the MDS process and use of quality
indicator reports. North Carolina also reported that its quarterly training
sessions provide MDS training to approximately 800 providers a year.
Regarding standard surveys, Connecticut and Maryland reported that their
nursing home survey teams reviewed MDS assessments to determine if they were
completed correctly and if the assessment data matched surveyor observations
of the resident. In Connecticut, surveyors may also review a sample of
facility MDS assessments for possible errors whenever they identify aberrant
or questionable data on the quality indicator reports.
Officials in the 10 states with separate, longer standing MDS review
programs generally said that the survey process itself does not detect MDS
18 A few of the 10 states that carry out separate MDS reviews have
structured their programs to reduce the costs of on-site reviews. For
example, Ohio uses off-site data analysis to target a subset of facilities
for further on-site review. However, West Virginia, which conducted on-site
reviews until 1998, cited a lack of staff as the major reason for switching
to an off-site-only review approach.
19 These 13 states include: Connecticut, Florida, Kansas, Maryland,
Michigan, Missouri, Montana, North Carolina, Nevada, Oregon, South Carolina,
Tennessee, and Wisconsin. Because states volunteered this information, there
may be other states that conduct similar activities that provide some
assurance of the accuracy of MDS data.
accuracy issues as effectively as separate MDS review programs. 20 Some
noted that nursing home surveyors do not have time to thoroughly review MDS
accuracy and often review a smaller sample size than MDS reviewers. The
surveyors? primary focus, they indicated, was on quality of care and
resident outcomes- not accuracy of MDS data. For example, surveyors would
look at whether the resident needed therapy and whether it was provided. In
contrast, the MDS reviewer would calculate the total number of occupational,
speech, and physical therapy minutes to ensure that the resident was placed
in the appropriate case-mix category. Officials in Iowa similarly noted
that surveyors do not usually cite MDS accuracy as a specific concern unless
there are egregious MDS errors, again, because the focus of the survey
process is on quality of care.
States with Separate MDS Review Programs Emphasize On-Site Oversight, but
Also Conduct Off-Site Monitoring

Nine of the 10 states with separate, longer standing MDS accuracy review
programs use on-site reviews to test the accuracy of MDS data, generally
visiting all or a significant portion of facilities in the state at least
annually, if not more frequently. (See app. I for a summary of state on-site
review programs.) Due to a lack of staff, one state, West Virginia, limits
its MDS reviews to off-site analysis of facility-specific monthly data. Most
of these states have been operating their MDS review programs for 7 years or
longer and developed them within a year of implementing an MDS-based payment
system. Three of the nine states arrive at the facility unannounced while
the other six provide advance notice ranging from 48 hours to 2 weeks.
The sample of facility MDS assessments reviewed by each state varies
considerably. Assessment sample sizes generally range from 10 to 40 percent
of a nursing home's total residents, but some states select a specific number
of residents, not a percentage, and a few specifically target residents in
particular case-mix categories. For example, Indiana selects a sample of 40
percent, or no less than 25 residents, across all
20 Two of the 10 states with MDS accuracy programs closely coordinate their
reviews with state nursing home surveys: Vermont and Washington. In Vermont,
12 registered nurses separately conduct both the MDS accuracy reviews and
nursing home surveys. Vermont officials told us that they had previously
tried combining these processes but decided to separate them because of the
heavy workload. In Washington, the nurses who conduct nursing home surveys
and MDS reviews are located in the same department, and therefore coordinate
closely by sharing reports and other information. The quality assurance
nurses who conduct the MDS reviews are surveyor trained and participate in
nursing home surveys about six times per year. Even so, Washington officials
cited the importance of having a separate MDS review process aside from the
nursing home surveys.
major case-mix categories, while Ohio's sample can be based on a particular
case-mix category, such as residents classified as "clinically complex." 21
Iowa officials told us that its reviewers select at least 25 percent of a
facility's residents, with a minimum of 5 residents, while Pennsylvania
chooses 15 residents from each facility, regardless of case-mix category or
facility size. Some states expand the resident sample when differences
between the MDS assessment and supporting documentation reach a certain
threshold. 22 For example, if the on-site review for the initial sample in
Iowa finds that 25 percent or more of the MDS assessments have errors, a
supplemental random sample is selected for review. While a few states limit
their sample to Medicaid residents only, most select assessments to review
from the entire nursing home's population.
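To make these sampling rules concrete, the sketch below sizes an initial
sample and checks whether a supplemental sample is warranted, loosely modeled
on the thresholds states described (such as 40 percent or no fewer than 25
residents, with a supplemental sample when 25 percent or more of the initial
sample has errors). The exact parameters are illustrative, not any state's
actual procedure.

    import math
    import random

    # Hypothetical sample-sizing rules loosely modeled on what states described.
    def initial_sample(resident_ids, fraction=0.40, floor=25):
        """Select the larger of a fixed fraction or a floor, capped at the census."""
        n = min(len(resident_ids), max(math.ceil(fraction * len(resident_ids)), floor))
        return random.sample(resident_ids, n)

    def needs_supplemental_sample(errors_found, sample_size, threshold=0.25):
        """Expand the review when the initial error rate crosses the threshold."""
        return sample_size > 0 and errors_found / sample_size >= threshold

    residents = list(range(1, 121))        # a 120-bed facility
    sample = initial_sample(residents)     # 48 assessments reviewed
    print(len(sample), needs_supplemental_sample(errors_found=14, sample_size=len(sample)))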
On-site reviews generally involve a comparison of the documentation in the
resident's medical record to the MDS assessment prepared by the facility. 23
Generally, the on-site process also allows reviewers to interview nursing
home staff and to directly observe residents, permitting a better
understanding of the documentation in a resident's medical record and
clarifying any discrepancies that may exist. Staff interviews and resident
observations can enhance the reviewer's understanding of the resident's
condition and allow a more thorough MDS review than one relying primarily on
documentation. However, as the interval between the facility's MDS
assessment and the on-site review increases, staff interviews and resident
observations become less reliable and more
21 Generally, patients classified as clinically complex may have conditions
such as burns, pneumonia, internal bleeding, or dehydration.

22 States with on-site reviews generally define MDS errors as an unsupported
MDS assessment, or they use a stricter standard of an unsupported MDS
assessment that results in a change in the resident's case-mix category.
None of the states identify whether an MDS error results in a quality
indicator change.

23 To strengthen the on-site review process, a few states (Iowa, South
Dakota, and Vermont) conduct interrater reliability checks, and one of these
states, South Dakota, also conducts independent assessments. During an
interrater reliability check, two reviewers examine the same MDS assessment
and medical record separately and compare their findings to determine if
they are correctly and consistently identifying MDS errors. For independent
assessments, reviewers complete a separate MDS assessment using all of the
available information at the facility and then compare it to the original
assessment completed by the facility. In two recent reports, the HHS OIG
also conducted independent assessments based on medical record documentation
for 640 residents. See HHS OIG, OEI-02-99-00040, Dec. 2000 and Nursing Home
Resident Assessment: Resource Utilization Groups, OEI-02-99-00041
(Washington, D.C.: HHS, Dec. 2000).
difficult to conduct. 24 For example, staff knowledge of a particular
patient may fade over time, the patient's health status may change, or the
patient may be discharged from the facility. Pennsylvania officials, who
reported reviewing assessments that were 6 to 12 months old, told us that
the state's MDS reviews tended to identify whether the nursing home had
adequate documentation. Reviewing such old assessments tends to focus the
review process on the adequacy of the documentation rather than on whether
the MDS assessment was accurate. 25 Four of the nine states review
assessments between 30 and 90 days old, a process that likely increases the
value of interviews and observation. The combination of interviews and
observations can be valuable, but limiting reviews to only recent MDS
assessments and providing homes advance notice may undermine the
effectiveness of on-site reviews. 26 Under such circumstances, facilities
have an opportunity to focus on the accuracy of their recent assessments,
particularly if the nursing home knows when its reviews will occur,
instead of adopting facility-wide practices that increase the accuracy of
all MDS assessments.
Based on their on-site reviews, officials in the nine states identified
seven areas as having a high potential for MDS errors, with two areas most
often identified as being among the highest potential for error: (1) mood
and behavior and (2) nursing rehabilitation and restorative care. 27 (See
fig. 1.) Assessments of resident mood and behavior are used to calculate
quality indicators and, along with nursing rehabilitation and restorative
care, are
24 The nine states with on-site reviews had different criteria regarding
when the assessment was too old to use interviews and observations as
corroborating evidence. For example, one state reported that interviews and
observations become less useful for an MDS assessment completed 14 days
prior to the state review, while another state cited 180 days.
25 Similarly, the HHS OIG acknowledged that its documentation review of MDS
assessments up to 11 months old did not permit a specific determination of
why differences occurred, only whether the MDS was consistent with the rest
of the medical record. See HHS OIG, OEI-02-99-00041 and OEI-02-99-00040,
Dec. 2000.
26 We have earlier reported that the timing of some nursing home surveys
makes them predictable, allowing facilities to mask certain deficiencies if
they chose to do so. See GAO/HEHS-00-197, p. 11.
27 Nursing rehabilitation and restorative care are interventions that assist
or promote the resident's ability to attain his or her maximum functional
potential. Some examples include passive or active range of motion
movements, amputation care, and splint or brace assistance.
often important in determining nursing home payments. 28 CMS indicated that
several of the MDS elements cited in figure 1 were also identified by a CMS
contractor as areas of concern. Officials in most states with separate
on-site review programs told us that errors discovered during their on-site
reviews often resulted from differences in clinical interpretation or
mistakes, such as a misunderstanding of MDS definitions by those responsible
for completing MDS assessments. Officials in only four of the nine states
were able to tell us whether the errors identified in their MDS reviews on
average resulted in a case-mix category that was too high or too low. Two
of these states reported roughly equal numbers of MDS errors that
inappropriately placed a resident in either a higher or lower case-mix
category; a third indicated that errors more often resulted in higher
payments; and a fourth found that errors typically resulted in payments that
were too low. None of the nine states track whether quality indicator data
were affected by MDS errors.
28 For example, 2 of the 24 quality indicators are based on behavior areas
assessed in the MDS, such as residents being verbally abusive, physically
abusive, or showing symptoms of depression.
Figure 1: MDS Elements Identified By Nine States As Having High Potential
for MDS Errors

[Bar chart not reproduced. The seven MDS areas shown, by number of states
citing them, are: mood and behavior; nursing rehabilitation and restorative
care; ADLs; therapy(a); physician visits or orders(b); toileting plans(c);
and skin conditions. The chart distinguishes areas identified as one of the
top three high-potential areas from areas identified as high potential.]

Note: We asked states to identify areas of the MDS assessment that have a
high potential for MDS errors. State responses were included in this figure
if two or more states identified an area as "high potential."

(a) Staff record the number of days and total minutes of therapy, such as
physical or occupational therapy, received by a resident in the last 7 days.

(b) Staff record the number of days during the last 14-day period in which a
physician has examined the resident or changed the care directions for the
resident. The latter is known as physician orders.

(c) Staff members record scheduled times each day that they perform any of
the following tasks: (1) take the resident to the bathroom, (2) give the
resident a urinal, or (3) remind the resident to go to the bathroom.

Source: Interviews with officials from nine states with separate on-site
review programs in operation as of January 2001.
Two of the 10 states with MDS review programs were able to tell us the
amount of Medicaid recoupments resulting from inaccurate MDS assessments.
From state fiscal years 1994 through 1997, South Dakota officials reported
that the state had recouped about $360,000 as a result of recalculating
nursing home payments after MDS reviews. West Virginia received $1 million
in 1999 related to MDS errors for physical therapy discovered during a 1995
on-site review at a nursing home. Officials in five additional states told
us that they recalculate nursing home payments when MDS errors are found,
but could not provide the amount recovered. 29
Of the 10 states with longer standing MDS review programs, four use off-site
analyses to supplement their on-site reviews, while one state relies on
off-site analyses exclusively. Both Maine and Washington examine MDS data
off-site to monitor changes by facility in the mix of residents across
case-mix categories. Such changes may help identify aberrant or inconsistent
patterns that may indicate the need for further investigation. Ohio, a state
with approximately 1,000 facilities (more than any other state that conducts
MDS reviews), analyzes data off-site to identify facilities with increased
Medicaid payments and changes in case-mix categories to select the
approximately 20 percent of facilities visited each year. 30 West Virginia
has eliminated its on-site reviews and now focuses solely on analyzing
monthly reports for its 141 facilities (for example, significant changes in
case-mix categories or ADLs across consecutive MDS assessments). In addition
to informally sharing results of off-site reviews with the state nursing
home surveyors, West Virginia is trying to formalize a process in which
off-site reviews could trigger additional on-site or off-site documentation
reviews.
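The off-site screens described above can be illustrated with a small sketch
that flags facilities whose case-mix distribution shifts sharply between
reporting periods. The 15-percentage-point threshold, category names, and
data layout are hypothetical, not any state's actual criteria.

    # Hypothetical off-site screen: flag a facility when any case-mix
    # category's share of residents moves more than a set threshold between
    # two reporting periods.
    def distribution(counts):
        total = sum(counts.values())
        return {group: n / total for group, n in counts.items()}

    def flag_large_shift(prior_counts, current_counts, threshold=0.15):
        prior, current = distribution(prior_counts), distribution(current_counts)
        return any(
            abs(current.get(g, 0.0) - prior.get(g, 0.0)) > threshold
            for g in set(prior) | set(current)
        )

    prior = {"rehabilitation": 20, "clinically_complex": 30, "reduced_function": 50}
    current = {"rehabilitation": 45, "clinically_complex": 25, "reduced_function": 30}
    print(flag_large_shift(prior, current))  # True: the rehabilitation share jumped 25 points

A facility flagged by such a screen would then be a candidate for an on-site
documentation review.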
29 At the time of our interviews, three states did not recalculate Medicaid
payments as a result of errors found during MDS reviews: Maine,
Pennsylvania, and Iowa.

30 Although Virginia had not begun its reviews at the time of our data
collection, state officials told us that they planned to use off-site data
analysis to target approximately 20 facilities (7 percent) per month for
on-site review.
States Attempt to Improve MDS Data Accuracy through On-Site Reviews,
Training, and Other Remedies

Officials in the nine states with on-site review programs consistently
cited three features of their review programs that strengthened the ability
of nursing home staff to complete accurate MDS assessments and thus decrease
errors: (1) the actual presence of reviewers, (2) provider education, and
(3) remedies that include corrective action plans and financial penalties.
On-site reviews, for example, underscore the state's interest in MDS
accuracy and provide an opportunity to train and coach those who are
responsible for completing MDS assessments. Similarly, the errors discovered
during on-site reviews guide the development of more formal training
sessions that are offered by the state outside of the nursing home.
Requiring nursing homes to prepare corrective action plans and imposing
financial penalties signal the importance of MDS accuracy to facilities and
are tools to improve the accuracy of the MDS data. As a result of these
efforts, some states have been able to show a notable decrease in their
overall error rates.
Most of the nine states view on-site visits and training as interrelated
elements that form the foundation of their MDS review programs. State
officials said that nursing homes pay more attention to properly documenting
and completing the MDS assessments because reviewers visit the facilities
regularly. On- site visits also allow reviewers to discuss MDS documentation
issues or requirements with staff, providing an opportunity for informal MDS
training. For example, Indiana officials told us that 2 to 3 hours of
education are a routine part of each facility?s MDS review. Noting the high
staff turnover rates in nursing homes, many states reported that frequent
training for the staff responsible for completing MDS assessments is
critical. 31 Officials in seven of the nine states with onsite reviews told
us that high staff turnover was one of the top three factors contributing to
MDS errors in their states. In addition, many of the
31 We recently testified on the problem of nurse and nurse aide retention in
a range of health care settings, including nursing homes. See Nursing
Workforce: Recruitment and Retention of Nurses and Nurse Aides Is a Growing
Concern (GAO-01-750T, May 17, 2001). In addition, the HHS OIG recently
reported that about 60 percent of MDS coordinators had worked 1 year or less
in that role at their current nursing home and over 65 percent had no prior
experience as an MDS coordinator. See HHS OIG, OEI-02-99-00040, Dec. 2000.
reasons cited for MDS errors, such as a misunderstanding of MDS definitions
and other mistakes, reinforce the need for training. 32

States with on-site reviews use the process to guide provider education
activities, both on-site and off-site. For example, during Pennsylvania's
annual MDS reviews of all nursing homes, state reviewers determine the types
of training needed. According to state officials, the state uses the results
of these reviews to shape and provide facility-specific training, if it is
needed, within a month of the review and subsequently conducts a follow-up
visit to see if the facility is improving in these areas. They indicated
that all 685 homes visited during 2000, the first year of this approach,
were provided with some type of training. To improve MDS accuracy, several
states also provide voluntary training opportunities outside of the nursing
home. Maine, Iowa, Indiana, and South Dakota, for example, provide MDS
training regularly throughout the state, rotating the location of the
training by region so that it is accessible to staff from all facilities.
While states generally emphasized on-site reviews and training as the
primary ways to improve the accuracy of the MDS data, some reported that
they have also instituted certain remedies, such as corrective action plans
and financial penalties. Indiana and Pennsylvania, for example, require
facilities to submit a corrective action plan detailing how the facility
will address errors identified during an on-site review. Two states, Maine
and Indiana, impose financial penalties. 33 Maine has instituted financial
penalties for recurring serious errors, collecting approximately $390,000
since late 1995. Maine also requires facilities with any MDS errors that
result in a case-mix category change to complete and
32 HCFA provided guidance in March and July 2001 to facilities regarding the
completion of MDS assessments. HCFA last published similar guidance in
August 1996. A few state officials noted the long lapse between the
publication of the two guides and told us that clearer and more timely
guidance on MDS definitions was needed. However, CMS's Long Term Care
Facility Resident Assessment Instrument User's Manual, which provides
guidance on completing MDS assessments, has not been updated since 1995.
33 Vermont and Washington also told us that financial penalties are an
available remedy, but had not imposed them as of early 2001.
submit a corrected MDS assessment for the resident. 34 While Indiana imposes
financial penalties, it does not view them as the primary tool for improving
MDS accuracy. 35 Rather, officials attributed a decrease in MDS errors to
the education of providers and the on-site presence of reviewers. Other
remedies cited by states include conducting more frequent on-site MDS
reviews and referring suspected cases of fraud to their state's Medicaid
Fraud Control Unit.
Five of the nine states that conduct on-site MDS reviews told us that their
efforts have resulted in a notable decrease in MDS errors across all
facilities since the implementation of their review programs. (See table 2.)
South Dakota officials, for example, reported that the percentage of
assessments with MDS errors across facilities had decreased from
approximately 85 percent to 10 percent since the implementation of the
state's MDS review program in 1993. Similarly, Indiana reported a decrease
in the statewide average error rate from 75 percent to 30 percent of
assessments in 1 year's time. The other four states could not provide these
data. In calculating these decreases, three of the five states (Indiana,
Maine, and South Dakota) define an MDS error as an unsupported MDS
assessment that caused the case-mix category to be inaccurate. 36 Iowa's
definition, however, includes MDS elements that are not supported by medical
record documentation, observation, or interviews, regardless of whether the
MDS error changed the case-mix category. Similarly, while Pennsylvania does
not limit errors to those that changed the case-mix category, the state
34 In Maine, facilities are instructed to follow CMS' correction policy
guidelines for MDS errors that do not result in a case-mix category change.
In commenting on a draft of this report, CMS noted the development and
implementation of its policy, which provided a new mechanism for facilities
to correct inaccurate information in the MDS database. This new policy has
significantly decreased the ability of facilities to submit certain types of
inaccurate MDS data, such as entering a "5" for a particular MDS element
when the only available choices are "1-4." Under this policy, CMS has seen
a reduction of approximately 66 percent in the proportion of records in the
database containing invalid data values.
35 Indiana imposes financial penalties if more than 35 percent of a
facility's MDS assessments have errors. State officials told us that very
few facilities (roughly 3 to 4 each quarter) have errors that are
significant enough to trigger financial penalties.
36 In Maine, only a subset of these case-mix category changes is used to
calculate an error rate.
defines errors as a subset of MDS elements that are not supported by the
medical record. 37
Table 2: MDS Assessments with Errors in Five States with On-Site MDS Review
Programs (in percent)

State          Initial MDS   Subsequent MDS   Time of initial and
               error rate    error rate       subsequent error rate
Indiana        75            30               1999, 2000
Iowa           32            22               July, December 2000
Maine (a)      21            10               1995, 2000
Pennsylvania   20            15               2000, 2001
South Dakota   85            10               1993, 1998

a Errors that result in changes for a subset of case-mix categories were
used to calculate these error rates.
Source: Data provided by Indiana, Iowa, Maine, Pennsylvania, and South Dakota.
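Because "error" is defined differently from state to state, the rates in
table 2 are not directly comparable. The short sketch below is purely
illustrative and is not drawn from any state or CMS system; the function
names, record layout, and sample data are hypothetical. It contrasts a
facility error rate computed under an element-level definition (Iowa's
approach) with one computed under a case-mix-change definition (the approach
used by Indiana, Maine, and South Dakota):

    # Illustrative only; all names and sample data are hypothetical.
    def element_level_error_rate(assessments):
        """Share of assessments with at least one unsupported MDS element."""
        flagged = sum(1 for a in assessments if a["unsupported_elements"] > 0)
        return flagged / len(assessments)

    def case_mix_error_rate(assessments):
        """Share of assessments whose unsupported elements changed the case-mix category."""
        flagged = sum(1 for a in assessments if a["case_mix_changed"])
        return flagged / len(assessments)

    # Hypothetical results for four reviewed assessments at one facility.
    sample = [
        {"unsupported_elements": 2, "case_mix_changed": True},
        {"unsupported_elements": 1, "case_mix_changed": False},
        {"unsupported_elements": 0, "case_mix_changed": False},
        {"unsupported_elements": 3, "case_mix_changed": True},
    ]
    print(f"element-level rate: {element_level_error_rate(sample):.0%}")  # 75%
    print(f"case-mix-change rate: {case_mix_error_rate(sample):.0%}")     # 50%

Because a case-mix change presupposes at least one unsupported element, the
case-mix-change rate for a given set of reviews can never exceed the
element-level rate.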
CMS' MDS Review Program Could Better Leverage Existing State and Federal
Accuracy Activities
Following implementation of Medicare's MDS-based payment system in 1998,
HCFA began building the foundation for its own separate review program,
distinct from state efforts, to help ensure the accuracy of MDS data. In the
course of developing and testing accuracy review approaches, its contractor
found widespread MDS errors that resulted in a change in Medicare payment
categories for 67 percent of the resident assessments sampled. In September
2001, CMS awarded a new contract to implement a nationwide MDS review
program over a 2- to 3-year period. 38 Despite the benefits of on-site
reviews, as demonstrated by states with separate review programs, the
current plan involves conducting on-site reviews in fewer than 200 of the
nation's 17,000 nursing homes each year. In addition, the contractor's
combined on-site and off-site reviews to evaluate MDS accuracy will involve
only about 1 percent of the approximately 14.7 million MDS assessments
expected to be prepared in 2001. In contrast, states that conduct separate
on-site MDS reviews typically visit all or a
37 Pennsylvania reviews only those MDS elements that have a positive
response. For example, if a facility responded "no" or left an MDS element
blank, that item would not be reviewed for accuracy, even if it could affect
the case-mix category for that particular resident.
38 CMS refers to the contractor responsible for this program as the data
assessment and verification (DAVE) contractor.
significant portion of their nursing homes and generally examine from 10 to
40 percent of assessments. While CMS' approach may yield some broad sense of
the accuracy of MDS assessments on an aggregate level, it may be
insufficient to help ensure the accuracy of MDS assessments in most of the
nation's nursing homes. At present, it does not appear that CMS plans to
leverage the considerable resources already devoted to state nursing home
surveys and states' separate MDS review programs that together entail a
routine on-site presence in all nursing homes nationwide. Nor does it plan
to more systematically evaluate the performance of state survey agencies
regarding MDS accuracy through its own federal comparative surveys. Finally,
CMS is not requiring nursing homes to provide documentation for the full MDS
assessment, which could undermine the efficacy of its MDS reviews.
Testing of MDS Accuracy Approaches Identified Widespread Accuracy Problems
In September 1998, HCFA contracted with Abt Associates to develop and test
various on-site and off-site approaches for verifying and improving the
accuracy of MDS data. Two of the approaches resembled state on-site MDS
reviews and the off-site documentation reviews performed by CMS contractors
that review Medicare claims. 39 Another approach used off-site data analysis
to target facilities for on-site review. 40 To determine the effectiveness
of the approaches tested in identifying MDS inaccuracies, Abt compared the
errors found under each approach to those found in its "reference standard":
independent assessments performed by MDS-trained nurses hired by Abt for
approximately 600 residents in 30 facilities
39 Similar to the separate MDS reviews conducted by the states, Abt
reviewed, at a sample of nursing homes, a subset of MDS items that met
certain criteria, e.g., they were important in determining case-mix
categories or calculating quality indicators or were suspected of being
underreported. Abt reviewers used information from medical records, as well
as interviews with, and observations of, staff and residents, to determine
whether the selected items on the MDS assessments were accurate.
40 One off-site approach tested relied on analyzing certain MDS "trigger"
items, such as pneumonia, that are likely to be in error when found in a
certain pattern on two consecutive MDS assessments for the same resident.
Off-site data analysis under this approach could be used to identify
facilities for on-site review that have a high proportion of residents
shown as having pneumonia (one potential trigger item) across two or more
MDS assessments.
in three states. 41 Abt found errors in every facility, with little
variation in the percentage of assessments with errors across facilities. On
average, the errors found affected case-mix categories in 67 percent of the
sampled Medicare assessments. Abt concluded that the errors did not result
in systematic overpayments or underpayments to facilities, even though more
errors placed residents in too high rather than too low a case-mix category.
Abt did not determine, however, the extent to which
errors affected quality indicators.
Due to the prevalence of errors, Abt recommended a review program that
included periodically visiting all facilities during the program's first
several years. Recognizing the expense of visiting every facility, however,
Abt also recommended eventually transitioning to the use of off-site
mechanisms to target facilities and specific assessments for on-site
review. Abt also made recommendations to address the underlying causes of
MDS errors: simplifying the MDS assessment tool, clarifying certain MDS
definitions (particularly for ADLs), and improving MDS training for
facilities. 42
The Federal MDS Review Program Is Too Limited to Evaluate State-Level
Accuracy Assurance Efforts
Building on the work of Abt Associates, in the summer of 2000 the agency
began formulating its own distinct nationwide review program to address
long-term MDS monitoring needs. The agency developed a request for proposal
for MDS data assessment and verification activities and sought proposals
from its 12 program safeguard contractors. 43 On September 28, 2001, CMS
awarded a 3-year contract for approximately $26 million to
41 The nurses conducted assessments over several days and shifts, using all
available documentation (medical record reviews, interviews, and
observations) to replicate as closely as possible the observation period the
facility used to make its assessments of those same residents. Because Abt
found too few assessments meeting its original criteria (completed by the
facility up to 14 days prior to the visits), it augmented its sample with
assessments that were up to 35 days old.
42 Similar to Abt, the HHS OIG concluded that differences found between MDS
assessments and the supporting documentation indicated confusion or
difficulties with the MDS assessment instrument and the need for enhanced
training. The HHS OIG found differences in 76 percent of the Medicare
assessments reviewed. ADLs and the number of minutes recorded for therapy,
specifically occupational and physical therapy, were the greatest sources
of differences.
43 Program safeguard contractors were authorized by the Health Insurance
Portability and Accountability Act of 1996, which allowed HCFA to contract
with specialized entities to identify program integrity concerns. See 42
U.S.C. sect. 1395ddd. In May 1999, HCFA selected a pool of 12 contractors
that can bid on proposed contracts covering these types of activities. See
Medicare: Opportunities and Challenges in Contracting for Program Safeguards
(GAO-01-616, May 18, 2001).
Computer Sciences Corporation. The contract calls for the initiation of
on-site and off-site reviews by late spring 2002, but the full scope of MDS
review activities will not be underway until the second year of the
contract. 44 (See table 3.)
Table 3: Implementation Schedule for CMS' MDS Accuracy Review Program

Developmental phase (October 2001 through May 2002)
- Test a combination of the most promising components from Abt's earlier
  assessment of various on-site and off-site approaches.
- Recommend the appropriate balance between on-site and off-site reviews.
- Identify and develop new approaches for monitoring MDS accuracy.
- Begin to identify communication and collaboration strategies for federal
  and state accuracy reviews, such as coordinating with states.

Initial implementation (April 2002 through September 2002)
- Begin conducting on-site and off-site accuracy reviews.
- Continue to evaluate the efficacy of the accuracy review approaches being
  implemented and identify areas of risk.
- Conduct ongoing data surveillance, such as monitoring and identifying
  trends in payments based on MDS data. (a)

Full implementation (October 2002 through September 2003)
- Perform ongoing data analysis and the full scope of data assessment and
  verification activities. (b)
- Implement training and education activities to ensure that those
  responsible for MDS data understand and accurately complete MDS
  assessments. This approach is expected to include a method for
  communicating how the contractor will continually refine and improve
  accuracy review processes.

Note: The contract covers 1 year with two additional 1-year options.
Currently, full implementation would occur in the second year of the
contract. The third year of the contract may also include on-site
enforcement surveys and special studies concerning the accuracy of reported
Medicare and Medicaid data.
44 Although the contractor will first focus on MDS accuracy activities, it
is also required to establish a review program for the Outcome and
Assessment Information Set (OASIS), the data used as the basis for home
health payments and quality measures.
a For example, one of the contractor's tasks is to analyze MDS data reported
by nursing homes that serve Medicare beneficiaries to determine whether
differences in case-mix categories relate to changes in the patient's health
status or changes in how providers are reporting MDS data.
b For example, while continuing on-site and off-site MDS reviews, the
contractor will also be required to calculate error rates for paid claims
for Medicare-covered services.
Source: DAVE contract statement of work for CMS' review program for MDS
accuracy.
Despite this broad approach, the contractor is not specifically tasked with
assessing the adequacy of each state's MDS reviews. Instead, it is required
to develop a strategy for coordinating its review activities with other
state and federal oversight, such as the selection of facilities and the
timing of visits, to avoid unnecessary overlap with routine nursing home
surveys or states' separate MDS review programs. This approach does not
appear to build on the benefits of on-site visits that are already
occurring as part of state review activities. Rather, the contract specifies
independent federal on-site and off-site reviews of roughly 1 percent of
the approximately 14.7 million MDS assessments expected to be prepared in
2001 (80,000 during the first contract year and 130,000 per year
thereafter). 45 The contractor, however, tentatively recommended that the
majority of reviews, about 90 percent, be conducted off-site. According to
CMS, these off-site reviews could include a range of activities, such as
the off-site targeting approaches developed by Abt or medical record
reviews similar to those conducted by CMS contractors for purposes of
reviewing Medicare claims. In addition, the contractor is expected to
conduct a range of off-site data analyses that could include a large number
of MDS assessments. The remaining 10 percent of MDS assessments,
representing fewer than 200 of the nation's 17,000 nursing homes, would be
reviewed on-site each year. This limited on-site presence is inconsistent
with Abt's earlier recommendation regarding the benefits of on-site reviews
in detecting accuracy problems, and with the view of almost all of the
states with separate MDS review programs that an on-site presence at a
significant number of their nursing homes is central to their review
efforts.
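Some simple arithmetic using only the figures cited above puts the planned
review volume in perspective. The sketch below is our illustration, not part
of the DAVE contract or its statement of work; the variable names are
invented, and the 130,000 figure is the post-first-year target noted above:

    # Rough arithmetic based on figures cited in this report (illustrative only).
    total_assessments = 14_700_000   # MDS assessments expected in 2001
    reviews_per_year = 130_000       # planned federal reviews after the first contract year
    on_site_share = 0.10             # contractor's tentative recommendation
    nursing_homes = 17_000           # approximate number of nursing homes nationwide

    share_reviewed = reviews_per_year / total_assessments
    on_site_reviews = reviews_per_year * on_site_share

    print(f"share of assessments reviewed: {share_reviewed:.1%}")   # about 0.9 percent
    print(f"on-site reviews per year: {on_site_reviews:,.0f}")      # about 13,000
    print(f"homes visited on site: fewer than 200 of {nursing_homes:,}")

By contrast, states with separate on-site programs generally examine 10 to
40 percent of assessments and visit all or a significant portion of their
homes.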
While CMS' approach may yield some broad sense of the accuracy of MDS
assessments on an aggregate level, it appears to be insufficient to provide
confidence about the accuracy of MDS assessments in the vast bulk of nursing
homes nationwide. Given the substantial resources invested in on-site
nursing home visits associated with standard surveys or states'
45 The reviews would encompass assessments from all payer sources. According
to CMS, the number of assessments to be reviewed is a target that is subject
to change.
separate MDS review programs, CMS' MDS review program could view states'
routine presence as the cornerstone of its program and instead focus its
efforts on ensuring the adequacy of state reviews. CMS could build on its
established federal monitoring survey process for nursing home oversight.
The agency is required by statute to annually resurvey at least 5 percent of
all nursing homes that participate in Medicare and Medicaid. One of the ways
CMS accomplishes this requirement is by conducting nursing home comparative
surveys to independently assess the states' performance in their nursing
home survey process. During a comparative survey, a federal team
independently surveys a nursing home recently inspected by a state in order
to compare and contrast the results. These federal comparative surveys have
been found to be most effective when they are completed in close proximity
to the state survey and involve the same sample of nursing home residents to
the maximum extent possible. Abt also attempted to review recently completed
MDS assessments.
Finally, a potential issue that could undermine the efficacy of the federal
MDS accuracy reviews involves the level of documentation required to support
an MDS assessment. CMS requires specific documentation for some MDS
elements, but officials said that the MDS itself, which can simply consist
of checking off boxes or selecting multiple-choice answers on the assessment
form, generally constitutes support for the assessment without any
additional documentation. CMS officials consider the MDS assessment form to
have equal weight with the other components of the medical record, such as
physician notes and documentation of services provided. As a result, CMS
asserts that the assessment must be consistent with, but need not duplicate,
the medical record. In contrast, most of the nine states with separate
on-site review programs require that support for each MDS element that they
review be independently documented in the medical record. State officials
told us that certain MDS elements, such as ADLs, are important to thoroughly
document because they require observation of many activities by different
nursing home staff over several days. As a result, some of these states
require the use of separate flow charts or tables to better document ADLs.
Similarly, some states require documentation for short-term memory loss
rather than accepting a nursing home's assertion that a resident has this
condition. CMS' training manual describes several appropriate tests for
identifying memory loss, such as having a resident describe a recent event.
In one of its December 2000 reports, the HHS OIG recommended that nursing
homes be required to establish an "audit trail" to support certain MDS
elements. HCFA disagreed, noting that it does not expect all information in
the MDS to be duplicated elsewhere in the medical record. However, given the
uses of MDS data, especially in adjusting nursing home payments and
producing
quality indicators, documenting the basis for the MDS assessments in the
medical record is critical to assessing their accuracy.
Conclusions
In complying with federal nursing home participation and quality
requirements, about 17,000 nursing homes were expected to produce almost 15
million MDS assessments during 2001 on behalf of their residents. This
substantial investment of nursing home staff time contributes to multiple
functions, including establishing patient care plans, assisting with quality
oversight, and setting nursing home payments that account for variation in
resident care needs. While some states, particularly those with MDS-based
Medicaid payment systems, stated that ensuring MDS accuracy requires
establishing a separate MDS review program, many others rely on standard
nursing home surveys to assess the data's accuracy. Flexibility in designing
accuracy review programs that fit specific state needs, however, should not
preclude achieving the important goal of ensuring accountability across
state programs. It is CMS' responsibility to consistently ensure that states
are fulfilling statutory requirements to accurately assess and provide for
the care needs of nursing home residents.
The level of federal financial support for state MDS accuracy activities is
already substantial. The federal government pays up to 75 percent of the
cost of separate state MDS review activities and in fiscal year 2001
contributed $278 million toward the cost of the state nursing home survey
process, which is intended in part to review MDS accuracy. Instead of
establishing a distinct but limited federal review program, CMS could
reorient the thrust of its review program to complement ongoing state MDS
accuracy efforts, which could prove to be a more efficient and effective
means to achieve its stated goals. Such a shift in focus should include (1)
taking full advantage of the periodic on-site visits already conducted at
every nursing home nationwide through the routine state survey process, (2)
ensuring that the federal MDS review process is designed and sufficient to
consistently assess the performance of all states' reviews for MDS accuracy,
and (3) providing additional guidance, training, and other technical
assistance to states as needed to facilitate their efforts. With its
established federal monitoring system for nursing home surveys, especially
the comparative survey process, CMS has a ready mechanism in place that it
can use to systematically assess state performance for this important task.
Finally, to help improve the effectiveness of MDS review activities, CMS
should take steps to ensure that each MDS assessment is adequately supported
in the medical record.
Recommendations for Executive Action
With the goal of complementing and leveraging the considerable federal and
state resources already devoted to nursing home surveys and to separate MDS
accuracy review programs, we recommend that the administrator of CMS
- review the adequacy of current state efforts to ensure the accuracy of MDS
  data, and provide, where necessary, additional guidance, training, and
  technical assistance;
- monitor the adequacy of state MDS accuracy activities on an ongoing basis,
  such as through the use of the established federal comparative survey
  process; and
- provide guidance to state agencies and nursing homes that sufficient
  evidentiary documentation to support the full MDS assessment be included
  in residents' medical records.
Agency and State Comments and Our Evaluation
We provided a draft of this report to CMS and the 10 states with separate
MDS accuracy programs for their review and comment. (See app. II for CMS'
comments.) CMS agreed with the importance of assessing and monitoring the
adequacy of state MDS accuracy efforts. CMS also recognized that the MDS
affects reimbursement and care planning and that it is essential that the
assessment data reflect the resident's health status so that the resident
receives appropriate, quality care and providers are appropriately
reimbursed. However, CMS' comments did not indicate that it planned to
implement our recommendations and reorient its MDS review program. 46
Rather, CMS' comments suggested that its current efforts provide adequate
oversight of state activities and complement state efforts.
While CMS stated that it currently evaluates, assesses, and monitors the
accuracy of the MDS through the nursing home survey process, it also
acknowledged the wide variation in the adequacy of current state accuracy
review efforts. Our work in the 10 states with separate MDS review programs
raised serious questions about the thoroughness and adequacy of the nursing
home survey process for reviewing MDS accuracy. Officials in many of these
states said that the survey process itself does not detect MDS accuracy
issues as effectively as separate MDS review programs. Surveyors, we were
told, do not have time to thoroughly
46 CMS refers to the contractor responsible for this program as the DAVE
contractor.
review MDS accuracy, and their focus is on quality of care and resident
outcomes, not on the accuracy of MDS data.
In response to our recommendations on assessing and monitoring the adequacy
of each state's MDS reviews, CMS commented that it would consider adding a
new standard to the state performance expectations that the agency initiated
in October 2000. CMS indicated that the state agency performance review
program would result in a more comprehensive assessment of state activities
related to MDS accuracy than could be obtained through the comparative
survey process. CMS also outlined planned analytic activities that it
believes will help to evaluate state performance, such as a review of
existing state and private sector MDS review methodologies and instruments,
ongoing communications with states to share the knowledge gained, and
comprehensive analyses of MDS data to identify systemic accuracy problems
within states as well as across states.
We agree that some of CMS' proposed analytic activities could provide useful
feedback to states on problem areas at the provider, state, region, and
national levels. Similarly, the addition of MDS accuracy activities to its
state performance standards for nursing home surveys, which CMS is
considering, has merit. While CMS plans to consider adding a new standard to
its state agency performance review program, the agency has a mechanism in
place (the comparative survey process) that it could readily use to
systematically assess state performance. However, CMS apparently does not
intend to do so. Based on our discussions with agency officials, it does not
appear that CMS' approach will yield a consistent evaluation of each state's
performance. We continue to believe that assessment and routine monitoring
of each state's efforts should be the cornerstone of CMS' review program. As
we previously noted, the agency's proposed on-site and off-site reviews of
MDS assessments are too limited to systematically assess MDS accuracy in
each state and would consume resources that could be devoted to
complementing and overseeing ongoing state activities. A comprehensive
review of the adequacy of state MDS accuracy activities, particularly in
those states without a separate review program, is essential to establish a
baseline and to allow CMS to more efficiently target additional guidance,
training, or technical assistance that it acknowledged is necessary.
CMS did not agree with our recommendation that it should provide guidance to
states regarding adequate documentation in the medical record for each MDS
assessment. CMS stated that requiring documentation of all MDS items places
an unnecessary burden on facilities. Skilled reviewers, it stated, should be
able to assess the
accuracy of completed MDS assessments through a combination of medical
record review, observation, and interviews. CMS further stated that
requiring duplicative documentation might result in documentation that is
manufactured and of questionable accuracy. Of course, the potential for
manufactured data could also be an issue with the MDS itself when supporting
documentation is absent or limited. Without adequate documentation, it is
unclear whether the nursing home staff sufficiently observed the resident to
determine his or her care needs or merely checked off a box on the
assessment form. We continue to believe, as do most of the states with
separate MDS review programs, that requiring documentation for the full MDS
assessment is necessary to ensure the accuracy of MDS data. In our view,
however, this documentation need not duplicate what is already in the
medical record but rather should demonstrate the basis for the higher-level
summary judgments about a resident's condition. Some states have already
developed tools to accomplish this, and in commenting on a draft of this
report, two states said that CMS should establish documentation requirements
for responses on the MDS. In addition, the discrepancies cited by the HHS
OIG in its studies stemmed from inconsistencies between MDS assessments and
documentation in residents' medical records. The OIG acknowledged that the
results of its analyses were limited by the information available in the
medical record; for example, when a facility MDS assessment was based on
resident observation, the facility may not have documented these
observations in the medical record. The importance of adequate documentation
is further reinforced by the fact that using interviews and observation to
validate MDS assessments may often not be possible, particularly for
residents who have been discharged from the nursing home before an MDS
accuracy review. Given the importance of MDS data in adjusting nursing home
payments and guiding resident care, documenting the basis for the MDS
assessment, in a way that can be independently validated, is critical to
achieving its intended purposes.
As agreed with your offices, unless you publicly announce the contents of
this report earlier, we will not distribute it until 30 days after its date.
At that time, we will send copies to the administrator of CMS; appropriate
congressional committees; and other interested parties. We will also make
copies available to others upon request.
If you or your staff have any questions, please call me at (202) 512-7114
or Walter Ochinko at (202) 512-7157. Major contributors to this report
include Carol Carter, Laura Sutton Elsberg, Leslie Gordon, and Sandra Gove.
Kathryn G. Allen
Director, Health Care--Medicaid and Private Insurance Issues
Appendix I: Summary of State On-Site MDS Reviews As of January 2001 (a)

IA (Iowa)
- Number of nursing homes (b): 465
- Year state began MDS-based payment system / MDS reviews: 2000 / 2000
- Review combined with nursing home surveys? No
- Survey findings used in planning MDS reviews? No
- Reviews done on site, off site, or both? Both
- Frequency of on-site reviews (c): Annually
- Number of MDS assessments reviewed at each facility: At least 25 percent,
  with a minimum of 5 residents
- Average time lapse between facility MDS and state review: 90 days
- Reported importance of interviews/observations versus medical record
  review (d): Less important than medical record review
- State definition of "error": MDS element not supported by record,
  observation, or interview
- Facility error rate calculated? Yes
- Examples of remedies and efforts to recoup Medicaid payments (e): Make
  referrals to state survey agency; conduct additional reviews; provide
  on-site education
- Accuracy and other trends: During the first 2 quarters of reviews, the
  error rate decreased from 32 percent to 22 percent
- Other features of on-site reviews: State provides voluntary training
  sessions on completing and submitting MDS assessments. State officials
  noted that provider education is a strong focus of their MDS review
  program.

IN (Indiana)
- Number of nursing homes (b): 562
- Year state began MDS-based payment system / MDS reviews: 1998 / 1998
- Review combined with nursing home surveys? No
- Survey findings used in planning MDS reviews? No
- Reviews done on site, off site, or both? On site only
- Frequency of on-site reviews (c): At least every 15 months
- Number of MDS assessments reviewed at each facility: 40 percent, or no
  less than 25 residents
- Average time lapse between facility MDS and state review: State reviews
  the most recent MDS assessment
- Reported importance of interviews/observations versus medical record
  review (d): Equally important as medical record review
- State definition of "error": Assessment caused resident to be placed in
  the wrong case-mix category (f)
- Facility error rate calculated? Yes
- Examples of remedies and efforts to recoup Medicaid payments (e): Impose
  financial penalties by reducing the administrative component of a
  facility's Medicaid payment; facility must submit plan and is subject to
  revisit; recalculate case-mix category and Medicaid rates
- Accuracy and other trends: State officials link decreases in MDS error
  rates to the presence of on-site reviewers and the education of providers
- Other features of on-site reviews: State publishes annual guidelines for
  providers on documentation needed to support MDS data.

ME (Maine)
- Number of nursing homes (b): 126
- Year state began MDS-based payment system / MDS reviews: 1993 / 1994
- Review combined with nursing home surveys? No
- Survey findings used in planning MDS reviews? No (g)
- Reviews done on site, off site, or both? Both
- Frequency of on-site reviews (c): Quarterly
- Number of MDS assessments reviewed at each facility: Minimum of 10
  assessments per facility
- Average time lapse between facility MDS and state review: 76 days
- Reported importance of interviews/observations versus medical record
  review (d): Equally important as medical record review
- State definition of "error": MDS element not supported by record,
  observation, or interview (h)
- Facility error rate calculated? Yes (h)
- Examples of remedies and efforts to recoup Medicaid payments (e): Conduct
  more frequent reviews; impose financial penalties (h); request MDS
  reassessment from facility
- Accuracy and other trends: While problems continue in some MDS elements,
  others show improvement, such as ADLs
- Other features of on-site reviews: Reviewers bring portable computers to
  facilities and, using state-designed software, review MDS data.

MS (Mississippi)
- Number of nursing homes (b): 191
- Year state began MDS-based payment system / MDS reviews: 1988 / 1992
- Review combined with nursing home surveys? No
- Survey findings used in planning MDS reviews? No
- Reviews done on site, off site, or both? On site only
- Frequency of on-site reviews (c): Annually
- Number of MDS assessments reviewed at each facility: At least 20 percent
  of residents in facility
- Average time lapse between facility MDS and state review: 45 days
- Reported importance of interviews/observations versus medical record
  review (d): Equally important as medical record review
- State definition of "error": Assessment caused resident to be placed in
  the wrong case-mix category
- Facility error rate calculated? No
- Examples of remedies and efforts to recoup Medicaid payments (e): Revisit
  facilities where problems have been identified; recalculate case-mix
  category and Medicaid rates
- Accuracy and other trends: Facilities with poor MDS reviews tend to
  receive many survey deficiencies
- Other features of on-site reviews: State published guidelines for
  providers on documentation needed to support MDS data.

OH (Ohio)
- Number of nursing homes (b): 1,009
- Year state began MDS-based payment system / MDS reviews: 1993 / 1994
- Review combined with nursing home surveys? No
- Survey findings used in planning MDS reviews? Not usually (i)
- Reviews done on site, off site, or both? Both
- Frequency of on-site reviews (c): Annually (j)
- Number of MDS assessments reviewed at each facility: Ranging from all to
  50 residents, based on facility size
- Average time lapse between facility MDS and state review: State reviews
  the most recent MDS assessment for the reporting quarter
- Reported importance of interviews/observations versus medical record
  review (d): Less important than medical record review
- State definition of "error": Assessment caused resident to be placed in
  the wrong case-mix category
- Facility error rate calculated? Yes
- Examples of remedies and efforts to recoup Medicaid payments (e): Revisit
  facilities where problems have been identified; recalculate case-mix
  category and Medicaid rates
- Accuracy and other trends: When recalculating the case-mix, the adjusted
  payments decreased about 99 percent of the time
- Other features of on-site reviews: State has done the following to address
  MDS errors: training; Web site; MDS newsletter; and providing results of
  MDS reviews.

PA (Pennsylvania)
- Number of nursing homes (b): 774
- Year state began MDS-based payment system / MDS reviews: 1996 / 1994
- Review combined with nursing home surveys? No
- Survey findings used in planning MDS reviews? No
- Reviews done on site, off site, or both? On site only
- Frequency of on-site reviews (c): Annually
- Number of MDS assessments reviewed at each facility: 15 randomly selected
  residents from assessments actually used in the rate-setting process
- Average time lapse between facility MDS and state review: 6-12 months
- Reported importance of interviews/observations versus medical record
  review (d): Less important than medical record review
- State definition of "error": Positive MDS element not supported by
  record (k)
- Facility error rate calculated? Yes
- Examples of remedies and efforts to recoup Medicaid payments (e): Conduct
  more frequent reviews; provide training within 1 month; require corrective
  action plan
- Accuracy and other trends: State officials expect that their new MDS
  review process will ultimately lead to a decrease in error rates
- Other features of on-site reviews: By restructuring the MDS review
  process, facilities are reviewed more frequently, issues are identified
  more quickly, and training is provided almost immediately to nursing
  facility staff.

SD (South Dakota)
- Number of nursing homes (b): 113
- Year state began MDS-based payment system / MDS reviews: 1993 / 1993
- Review combined with nursing home surveys? No
- Survey findings used in planning MDS reviews? Not usually (i)
- Reviews done on site, off site, or both? On site only
- Frequency of on-site reviews (c): Every 15 months
- Number of MDS assessments reviewed at each facility: At least 25 percent
  of residents in facility
- Average time lapse between facility MDS and state review: 14-30 days
- Reported importance of interviews/observations versus medical record
  review (d): Equally important as medical record review
- State definition of "error": Assessment caused resident to be placed in
  the wrong case-mix category
- Facility error rate calculated? Yes
- Examples of remedies and efforts to recoup Medicaid payments (e): Revisit
  facilities where problems have been identified; recalculate case-mix
  category and Medicaid rates
- Accuracy and other trends: Since the state has been reviewing MDS data,
  the error rate has decreased from about 85 percent to 10 percent
- Other features of on-site reviews: On-site reviews also include
  independent assessments and inter-rater reliability checks.

VT (Vermont)
- Number of nursing homes (b): 43
- Year state began MDS-based payment system / MDS reviews: 1992 / 1992
- Review combined with nursing home surveys? No, but the same staff conduct
  reviews and surveys
- Survey findings used in planning MDS reviews? Not usually (i)
- Reviews done on site, off site, or both? On site only
- Frequency of on-site reviews (c): At least annually
- Number of MDS assessments reviewed at each facility: 10 percent
  predetermined and/or random sample of all residents in all units
- Average time lapse between facility MDS and state review: MDS never older
  than 90 days
- Reported importance of interviews/observations versus medical record
  review (d): Equally important as medical record review
- State definition of "error": MDS element not supported by record,
  observation, or interview (effective 10/1/01)
- Facility error rate calculated? Yes (effective 10/1/01)
- Examples of remedies and efforts to recoup Medicaid payments (e): Impose
  financial penalties (none imposed to date); revisit facilities where
  problems have been identified; recalculate case-mix category and Medicaid
  rates
- Accuracy and other trends: State officials told us that Vermont facilities
  do not have serious MDS accuracy issues
- Other features of on-site reviews: Vermont tried to combine MDS reviews
  with nursing home surveys, but found that it detracted from the survey
  process.

WA (Washington)
- Number of nursing homes (b): 271
- Year state began MDS-based payment system / MDS reviews: 1998 / 1998
- Review combined with nursing home surveys? No, but staff participate in
  surveys about 6 times per year
- Survey findings used in planning MDS reviews? Yes
- Reviews done on site, off site, or both? Both
- Frequency of on-site reviews (c): Annually (staff also conduct quarterly
  quality review audits)
- Number of MDS assessments reviewed at each facility: Approximately 20
  percent, depending on facility size
- Average time lapse between facility MDS and state review: 45-60 days
- Reported importance of interviews/observations versus medical record
  review (d): Equally important as medical record review
- State definition of "error": Assessment caused resident to be placed in
  the wrong case-mix category
- Facility error rate calculated? Yes
- Examples of remedies and efforts to recoup Medicaid payments (e): Impose
  financial penalties (none imposed to date); revisit facilities where
  problems have been identified; recalculate case-mix category and Medicaid
  rates
- Accuracy and other trends: The types of MDS errors that commonly reoccur
  relate to misapplication of MDS definitions and may in large part be due
  to facility staff turnover. In commenting on a draft of this report,
  officials told us that these errors are consistent with those found in
  other states with MDS-based payment systems.
- Other features of on-site reviews: State plans to publish the results of
  MDS accuracy reviews on a Web page to prevent simple but recurring errors.

Notes:
a Virginia is not included because of the newness of its MDS review program
(it began operating in April 2001). We have included the nine other states
with longer-standing on-site review programs.
b Source: CMS Nursing Home Compare Web site,
http://www.medicare.gov/nhcompare/Search, printed 6/8/01.
c The frequency shown reflects initial reviews for each facility; reviews
cover all facilities unless otherwise noted. Some states conduct follow-up
reviews more frequently for facilities where problems have been identified.
d We asked states to select from the following categories: more important,
equally important, and less important.
e In addition, all nine states reported that they refer cases of suspected
fraud to their state's Medicaid Fraud Control Unit.
f Indiana officials added the following language to characterize MDS errors:
an error occurs when the audit findings are different from the facility's
transmitted MDS data and those differences result in a different case-mix
category.
g Survey findings may be used to plan MDS reviews, although this has not
occurred yet.
h Financial penalties and facility error rates, however, are based only on
errors that result in changes for a subset of case-mix categories.
i Survey findings are occasionally used in planning MDS reviews.
j Staff use risk analysis to select approximately 200 facilities per year
for on-site reviews.
k Pennsylvania reviews only those MDS elements that have a positive
response. For example, if a facility responded "no" or left an MDS element
blank, that item would not be reviewed for accuracy, even if it could affect
the case-mix category for that particular resident.
Appendix II: Comments from the Centers for Medicare and Medicaid Services

[CMS' comment letter, which appears on pages 39 through 43 of the printed
report, is not reproduced in this text version; please see the PDF file for
its contents.]

(290117)
GAO's Mission
The General Accounting Office, the investigative arm of Congress, exists to
support Congress in meeting its constitutional responsibilities and to help
improve the performance and accountability of the federal government for the
American people. GAO examines the use of public funds; evaluates federal
programs and policies; and provides analyses, recommendations, and other
assistance to help Congress make informed oversight, policy, and funding
decisions. GAO's commitment to good government is reflected in its core
values of accountability, integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony
The fastest and easiest way to obtain copies of GAO documents is through the
Internet. GAO's Web site (www.gao.gov) contains abstracts and full-text
files of current reports and testimony and an expanding archive of older
products. The Web site features a search engine to help you locate documents
using key words and phrases. You can print these documents in their
entirety, including charts and other graphics.
Each day, GAO issues a list of newly released reports, testimony, and
correspondence. GAO posts this list, known as "Today's Reports," on its Web
site daily. The list contains links to the full-text document files. To
have GAO e-mail this list to you every afternoon, go to www.gao.gov and
select "Subscribe to daily e-mail alert for newly released products" under
the GAO Reports heading.

Order by Mail or Phone
The first copy of each printed report is free. Additional copies are $2
each. A check or money order should be made out to the Superintendent of
Documents. GAO also accepts VISA and Mastercard. Orders for 100 or more
copies mailed to a single address are discounted 25 percent. Orders should
be sent to:
U.S. General Accounting Office, P.O. Box 37050, Washington, D.C. 20013
To order by phone: Voice: (202) 512-6000, TDD: (202) 512-2537,
Fax: (202) 512-6061

Visit GAO's Document Distribution Center
GAO Building, Room 1100, 700 4th Street, NW (corner of 4th and G Streets,
NW), Washington, D.C. 20013

To Report Fraud, Waste, and Abuse in Federal Programs
Contact: Web site: www.gao.gov/fraudnet/fraudnet.htm, E-mail:
fraudnet@gao.gov, or 1-800-424-5454 or (202) 512-7470 (automated answering
system).

Public Affairs
Jeff Nelligan, Managing Director, NelliganJ@gao.gov, (202) 512-4800,
U.S. General Accounting Office, 441 G Street NW, Room 7149,
Washington, D.C. 20548
*** End of document. ***