Tax Administration: IRS Is Implementing the National Research	 
Program as Planned (16-JUN-03, GAO-03-614).			 
                                                                 
The Internal Revenue Service (IRS) needs up-to-date information  
on voluntary compliance in order to assess and improve its	 
programs. IRS's last detailed study of voluntary compliance was  
done in the late 1980s, so the compliance information IRS is	 
using today is not current. IRS is now carrying out the National 
Research Program (NRP), through which IRS auditors are reviewing 
about 47,000 randomly selected tax year 2001 individual tax	 
returns. In June 2002, GAO reported that NRP was necessary, that 
its design was sound, and that it appeared to meet IRS's goals of
acquiring useful compliance data while minimizing burden on	 
taxpayers with returns in the sample. GAO was asked to review	 
IRS's implementation of NRP. GAO reviewed IRS's method of	 
gathering internal and third-party data (casebuilding) and IRS's 
process of reviewing casebuilding materials to determine if	 
audits are necessary (classification) and assessed IRS's plans to
ensure consistent data collection while minimizing burden on	 
taxpayers.							 
-------------------------Indexing Terms------------------------- 
REPORTNUM:   GAO-03-614 					        
    ACCNO:   A07193						        
  TITLE:     Tax Administration: IRS Is Implementing the National     
Research Program as Planned					 
     DATE:   06/16/2003 
  SUBJECT:   Federal taxes					 
	     Research programs					 
	     Strategic planning 				 
	     Tax administration 				 
	     Tax administration systems 			 
	     Taxpayers						 
	     Voluntary compliance				 
	     Program evaluation 				 
	     Tax law						 
	     Data collection					 
	     Data integrity					 
	     IRS National Research Program			 

******************************************************************
** This file contains an ASCII representation of the text of a  **
** GAO Product.                                                 **
**                                                              **
** No attempt has been made to display graphic images, although **
** figure captions are reproduced.  Tables are included, but    **
** may not resemble those in the printed version.               **
**                                                              **
** Please see the PDF (Portable Document Format) file, when     **
** available, for a complete electronic file of the printed     **
** document's contents.                                         **
**                                                              **
******************************************************************
GAO-03-614

Report to the Committee on Finance, U.S. Senate

United States General Accounting Office

GAO

June 2003

TAX ADMINISTRATION

IRS Is Implementing the National Research Program as Planned

GAO-03-614

IRS's NRP is being implemented as planned and consequently is on track to
meet the agency's objectives of obtaining quality research results while
minimizing the burden on the approximately 47,000 taxpayers with returns in
the NRP sample. IRS officials have completed the development and testing
of NRP processes and have selected and trained staff members to carry out
the program. Additionally, as the graphic illustrates, IRS is currently
nearing the completion of casebuilding and has made progress in
classifying NRP returns. Audits, when required, began in November 2002. As
of the end of March 2003, IRS had closed 3,651 NRP cases. In accordance
with IRS's plans to minimize burden on taxpayers with returns in the NRP
sample, some cases have been closed without any taxpayer contact or with
only limited audits.

The NRP plan recognized that the initial estimates for the overall NRP
sample size and the number of returns to be audited were uncertain because
they were based on aging data. The overall NRP sample size will be smaller
and IRS officials expect to conduct more face-to-face audits than
initially estimated. As IRS completes NRP casebuilding, classification,
and audits, it is implementing quality assurance steps, including efforts
to ensure that key audit steps are completed on all NRP audits before they
are formally closed with taxpayers. This is important since the data
collected from each NRP audit represent information from thousands of
similar taxpayers.

NRP Work Completed and Work Remaining as of the End of March 2003

The Internal Revenue Service (IRS) needs up-to-date information on
voluntary compliance in order to assess and improve its programs. IRS's
last detailed study of voluntary compliance was done in the late 1980s, so
the compliance information IRS is using today is not current. IRS is now
carrying out the National Research Program (NRP), through which IRS
auditors are reviewing about 47,000 randomly selected tax year 2001
individual tax returns. In June 2002, GAO reported that NRP was necessary,
that its design was sound, and that it appeared to meet IRS's goals of
acquiring useful compliance data while minimizing burden on taxpayers with
returns in the sample. GAO was asked to review IRS's implementation of
NRP. GAO reviewed IRS's method of gathering internal and third-party data
(casebuilding) and IRS's process of reviewing casebuilding materials to
determine if audits are necessary (classification) and assessed IRS's
plans to ensure consistent data collection while minimizing burden on
taxpayers.

GAO is not making recommendations in this report because IRS is following
through with its sound research plans in the ongoing implementation of
NRP. Furthermore, when GAO identified quality assurance steps that IRS
could add to NRP during the course of this review, IRS agreed with the
suggestions and included the steps.

www.gao.gov/cgi-bin/getrpt?GAO-03-614. To view the full product, including
the scope and methodology, click on the link above. For more information,
contact James White at (202) 512-5594 or WhiteJ@GAO.gov.

Highlights of GAO-03-614, a report to the Committee on Finance, U.S. Senate

June 2003

TAX ADMINISTRATION

IRS Is Implementing the National Research Program as Planned

Contents

Letter
    Results in Brief
    Background
    Scope and Methodology
    IRS Completed NRP Process Testing
    NRP Staff Selection and Training Is Complete
    NRP Reviews Are Under Way
    IRS Has Implemented NRP Quality Assurance Measures
    IRS Is Taking Steps to Minimize and Assess Taxpayer Burden
    Conclusions
    Agency Comments

Appendix I: Comments from the Internal Revenue Service

Figures
    Figure 1: NRP Process for Measuring Voluntary Reporting Compliance
    Figure 2: NRP Work Completed and Work Remaining as of the End of March 2003
    Figure 3: NRP Implementation Progress
    Figure 4: Estimated NRP Sample by Level of Taxpayer Contact

Abbreviations
    EQMS    Examination Quality Measurement System
    IRS     Internal Revenue Service
    NRP     National Research Program

This is a work of the U.S. Government and is not subject to copyright
protection in the United States. It may be reproduced and distributed in
its entirety without further permission from GAO. It may contain
copyrighted graphics, images or other materials. Permission from the
copyright holder may be necessary should you wish to reproduce copyrighted
materials separately from GAO's product.

United States General Accounting Office
Washington, DC 20548

June 16, 2003

The Honorable Charles E. Grassley
Chairman
The Honorable Max Baucus
Ranking Minority Member
Committee on Finance
United States Senate

Understanding taxpayers' compliance with the nation's tax laws is critical
to the ability of the Internal Revenue Service (IRS) to improve the
effectiveness of its programs to enforce and promote voluntary compliance.
While IRS strives to target enforcement audits to taxpayers who are not
complying with tax laws, this has become increasingly difficult for IRS to
do because the information it uses to identify likely noncompliant tax
returns is out of date. This means that a large and growing number of IRS
enforcement audits are directed at compliant taxpayers. IRS needs more
current data on compliance to be able to appropriately target audits to
problem returns. IRS last studied voluntary compliance with random audits
of tax year 1988 tax returns. These studies included intensive,
line-by-line audits and imposed a significant burden on taxpayers with
returns in the sample.

IRS is now carrying out a detailed study of taxpayer compliance, the
National Research Program (NRP). IRS has identified a random sample of
47,000 tax year 2001 returns and has begun NRP data gathering, including
limited audits where necessary to verify information reported by
taxpayers. NRP is intended to produce useful compliance data while
minimizing the burden on the taxpayers selected for the study. IRS
designed NRP to minimize taxpayer burden by gathering IRS and third-party
information, a process called casebuilding, in order to limit taxpayer
information requests to items that cannot be verified without such a
request. Specially trained IRS agents then review the approximately 47,000
returns in the NRP sample and their supporting casebuilding files and
identify lines on the returns that cannot be verified without auditing the
taxpayers, a process called classification. With this approach, some
taxpayers will not be contacted at all and most others will be asked to
verify only some of the line items on their returns.


In June 2002, we reported to you our assessment of IRS's NRP plans.1 At
that time we determined that NRP was necessary and that its design
addressed both the need for up-to-date information on taxpayer compliance
and IRS's goal of minimizing the burden NRP imposed on taxpayers. We also
reported that IRS had important casebuilding and classification procedures
testing to complete, as well as selection and training of IRS personnel to
carry out the program. You then asked us to assess whether IRS is
implementing NRP as planned. Accordingly, this report describes and
assesses (1) IRS's completion of NRP development and testing, (2) staff
selection and training, (3) implementation progress of the program, (4)
quality assurance steps IRS is taking to ensure consistent and accurate
data collection, and (5) steps the agency is taking to minimize burden on
taxpayers with returns in the NRP sample.

To monitor IRS's final development and implementation of NRP, we reviewed
agency documents concerning NRP design, testing, and implementation. We
also directly observed sessions of most aspects of NRP training and two
key tests of NRP procedures. We held frequent discussions with the IRS
personnel implementing the program and conducted field office visits to
several of the locations where NRP classification is taking place. During
our field visits, we met with managers and classifiers and discussed their
understanding of the program and any implementation issues that came up.
Our assessment of NRP implementation is based on IRS's NRP plans as
described in our June 2002 report.

Results in Brief

IRS is implementing key aspects of NRP according to the plans the agency
laid out in 2002 and the program is now fully under way. IRS has completed
NRP process testing and development and has identified and trained over
3,000 enforcement auditors to be NRP classifiers and auditors. IRS has
started carrying out NRP reviews. As of the end of March 2003, IRS had
selected a sample of about 47,000 tax year 2001 returns, completed
casebuilding files for over 90 percent of them, and classified over 70
percent of these returns. IRS had completed all NRP work on 3,651 NRP
cases.

1 U.S. General Accounting Office, Tax Administration: New Compliance
Research Effort Is on Track, but Important Work Remains, GAO-02-769
(Washington, D.C.: June 27, 2002).

As described in its plans, IRS has incorporated important quality
assurance measures into NRP. IRS has made data consistency checks, regular
communication with staff members carrying out NRP, and classification and
audit reviews part of NRP. IRS has also adopted suggestions we made in the
course of this study that it add additional quality checks to NRP in the
form of classification site visits and centralized evaluation of
classification review results.

Also, as IRS planned, NRP casebuilding and classification processes are
helping minimize burden on taxpayers with returns in the NRP sample by
limiting most audits to line items that cannot be verified without
directly contacting taxpayers. Additionally, while the NRP sample size is
now smaller than originally projected, the number of face-to-face NRP
audits that IRS officials expect to take place is now larger than
originally estimated. The original NRP sample size estimates were based on
now obsolete 1988 compliance study data, the only data available. The
number of expected face-to-face audits, now projected to be about 39,000,
is larger than originally estimated because the estimates were also based
on the obsolete 1988 data and because IRS modified its classification
guidelines to better match the training and skill level of the auditors
selected to conduct NRP correspondence and face-to-face audits with the
issues to be covered by those audits.

In commenting on a draft of this report, the Commissioner of Internal
Revenue concurred with our findings and conclusions about IRS's
implementation of NRP. He noted that as NRP continues, IRS will continue
to emphasize delivering quality results and minimizing taxpayer burden.
The commissioner's letter is reprinted in appendix I.

Background

IRS designed NRP to obtain new information about taxpayers' compliance
with the tax laws. While IRS is using NRP to measure voluntary filing,
reporting, and payment compliance, the majority of NRP efforts are devoted
to obtaining accurate voluntary reporting compliance data. In measuring
reporting compliance, IRS's two primary goals are to obtain accurate
information while minimizing the burden on the approximately 47,000
taxpayers with returns in the NRP sample. IRS plans to use NRP data to
update return selection formulas, allow IRS to design prefiling programs
that will help taxpayers comply with the tax law, and permit IRS to focus
its limited resources on the most significant areas of noncompliance.2

NRP's reporting compliance study consists of three major processes: (1)
casebuilding, creating information files on returns selected for the NRP
sample; (2) classification, using that information to classify the returns
according to what, if any, items on the returns cannot be verified without
additional information from the taxpayers; and (3) taxpayer audits limited
to those items that cannot be independently verified. We reported in June
2002 that NRP's design, if implemented as planned, is likely to yield the
sort of detailed information that IRS needs to measure overall compliance,
develop formulas to select likely noncompliant returns for audit, and
identify compliance problems for the agency to address. Figure 1 shows
NRP's main elements.

Figure 1: NRP Process for Measuring Voluntary Reporting Compliance

2 IRS uses compliance research to determine the characteristics of tax
returns that are likely to be noncompliant. Screening tax returns for
those characteristics is an early step in selecting returns for
enforcement audits.

IRS designed the casebuilding process to bring together available data to
allow the agency to establish the accuracy of information reported by
taxpayers on their returns. For each taxpayer with a return in the NRP
sample, IRS is compiling internal information, such as past years' returns
and information reported to IRS by third parties, such as employers and
banks, and information from outside databases, such as property listings,
address listings, and stock sale price data.

Classification is where IRS uses the casebuilding information to determine
whether an NRP audit is necessary and which items need to be verified
through an audit. Classifiers place NRP returns into one of four
categories: (1) accepted as filed, (2) accepted with adjustments, (3)
correspondence audit, and (4) face-to-face audit. If the casebuilding
material allows IRS to verify all of the information that a taxpayer
reported on his or her tax return, then the taxpayer will not be contacted
and the return will be classified as accepted as filed. On returns where
minor adjustments are necessary, the adjustments will be recorded for
research purposes, but the taxpayers will not be contacted. These returns
will be classified as accepted with adjustments. NRP returns that have one
or two items from a specified list requiring examination will be
classified for correspondence audits. All other NRP returns for which the
casebuilding material does not enable IRS to independently verify the
information reported on the returns will be classified for face-to-face
audits.

NRP audits will take place either through correspondence with the
taxpayers or through face-to-face audits. When classifiers determine that
an NRP return will be sent for a correspondence audit, IRS will request
that the taxpayer send documentation verifying the line items in question.
To ensure accurate and consistent data collection, NRP audits will address
all issues identified by classifiers and will not be focused only on
substantial issues or cases for which there is a reasonable likelihood of
collecting unpaid taxes, according to IRS officials. NRP auditors also may
expand the scope of the audits to cover items that were not classified
initially.
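
Read as a decision rule, the four classification outcomes described above
can be sketched in a few lines of code. The sketch below is illustrative
only: the data structures, the handling of "minor" adjustments, and the
correspondence-eligible list are assumptions made for this example, not
IRS's actual criteria or systems.

    # Illustrative sketch of the four NRP classification outcomes described above.
    # All names, data shapes, and thresholds are hypothetical.

    # Hypothetical stand-in for the "specified list" of items that may be handled
    # through a correspondence audit when they are the only open issues.
    CORRESPONDENCE_ELIGIBLE = {"interest_income", "dividend_income"}  # assumption

    def classify_return(line_items, casebuilding):
        """Classify one sampled return given which line items the casebuilding
        data can verify.

        line_items   -- dict: line-item name -> amount reported by the taxpayer
        casebuilding -- dict: line-item name -> amount verifiable from internal
                        and third-party data (a missing key means unverifiable)
        """
        unverified = []
        adjustments = []
        for item, reported in line_items.items():
            if item not in casebuilding:
                unverified.append(item)   # cannot be verified without the taxpayer
            elif casebuilding[item] != reported:
                # recorded for research without contacting the taxpayer; the
                # minor/major distinction in the text is omitted here
                adjustments.append(item)
        if not unverified:
            return "accepted with adjustments" if adjustments else "accepted as filed"
        if len(unverified) <= 2 and all(i in CORRESPONDENCE_ELIGIBLE for i in unverified):
            return "correspondence audit"
        return "face-to-face audit"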

IRS plans to conduct detailed, line-by-line audits on 1,683 of the
approximately 47,000 returns in the NRP sample in order to assess the
accuracy of NRP classification and, if necessary, to adjust NRP results, a
process called calibration. One-third of the returns in the calibration
sample will be returns that were classified accepted as filed (either with
or without adjustments), one-third from those classified for
correspondence audits, and one-third from those classified for
face-to-face audits. None of the taxpayers with returns in the calibration
sample will have been audited or otherwise contacted by IRS prior to the
start of these line-by-line audits.

Scope and Methodology

To describe IRS's implementation of NRP, we have conducted frequent
meetings with officials in IRS's NRP Office and other IRS officials as
they have implemented the program. We reviewed NRP training materials and
observed NRP classifier, correspondence examination, and field examination
training sessions. We also observed NRP process tests and conducted site
visits to IRS area offices in Baltimore, Maryland; Brooklyn, New York;
Oakland, California; Philadelphia, Pennsylvania; and St. Paul, Minnesota,
in order to observe and review NRP classification in field offices.

We considered whether NRP is being implemented in accordance with its
design. In our report issued on June 27, 2002, we found that NRP's design,
if implemented as planned, is likely to provide IRS with the type of
information it needs to measure overall compliance, update workload
selection formulas, and discover other compliance problems that the agency
needs to address. For this review, we also considered whether IRS was
maintaining a focus on meeting NRP's objectives of obtaining quality
research results while, at the same time, minimizing taxpayer burden. This
assessment was also based on IRS's NRP implementation plans.

As of the completion of our work, IRS had a significant amount of NRP
implementation to carry out. Our evaluation of IRS's efforts to implement
NRP, therefore, only provides an assessment of efforts that have taken
place through the time of our work. Additionally, we did not attempt to
assess IRS's efforts to measure filing compliance and payment compliance
through NRP. Our evaluation focuses only on IRS's efforts to obtain
voluntary reporting compliance information.3 A more detailed description
of NRP can also be found in our 2002 report.

3 There are three types of voluntary compliance measures: filing
compliance, which measures the percentage of taxpayers who file returns in
a timely manner; payment compliance, which measures the percentage of tax
payments that are paid in a timely manner; and reporting compliance, which
measures the percentage of actual tax liability that is reported
accurately on returns. Although IRS's NRP plans include reviews of all
three types of compliance, the majority of its efforts have been devoted
to development of reporting compliance measurement procedures. Reporting
compliance is also the only aspect of NRP that will include audits of
taxpayers.

We conducted our work from September 2002 through April 2003 in accordance
with generally accepted government auditing standards.

IRS Completed NRP Process Testing

In addition to the two tests described in our prior report on NRP, IRS
conducted two more tests of NRP processes prior to implementing the
program. IRS tested the casebuilding and classification processes in an
NRP simulation in July 2002, and conducted another classification process
test during the initial classification training session in September 2002.
IRS used the preliminary results of both of these tests to estimate NRP
classification outcomes and to evaluate the effectiveness of NRP training.
As we recommended in our June 2002 report, IRS substantially completed
this testing prior to full NRP implementation, though final reports from
the tests were not completed until later.

In July 2002, IRS used draft NRP training materials to train 16 auditors
from IRS field offices in the use of NRP casebuilding materials to carry
out the NRP classification process. The newly trained classifiers then
classified 506 tax year 2000 returns. NRP staff members reviewed the
classifiers' results and found that, overall, the results of this NRP
simulation were positive. They found that the classifiers understood the
NRP approach to classification but that there were instances where the
classifiers overlooked some of the issues indicated by the casebuilding
materials or made other errors.

In September 2002, IRS conducted another test of the NRP classification
process immediately following the initial training session using final
classification training materials. As we recommended in our June 2002
report, IRS had NRP classifiers classify previously audited tax returns in
order to compare classifiers' results with the results of actual audits.
Twenty-two newly trained classifiers classified 44 previously audited
returns, with each return classified by 5 different classifiers. All of
the earlier audits resulted in some changes. NRP staff members then
compared the classifiers' results with those of the other classifiers and
with the results of the earlier audits. NRP officials reported that the
test showed that about three-fourths of the time the trained NRP
classifiers were able to identify issues where noncompliance was found
through an audit.

IRS used preliminary results of these tests to identify and implement
improvements to NRP. For example, NRP staff members noticed early in the
course of the second test that NRP classifiers were failing to classify
some line items in accordance with NRP guidelines. Trainers reiterated the
importance of following the classification guidelines for these items. NRP
staff members also saw that the format of the form that classifiers were
to use to record their classification decisions made it easy to make
mistakes. They revised the form to make decision recording less
error-prone. IRS also used these tests to identify the need for more
stringent classification review guidelines than initially planned in order
to ensure that classifiers understand and follow the classification
guidelines.

IRS did not finish analysis and documentation of the NRP simulation and
assessment and the classification process test until after the beginning
of classification in IRS area offices. NRP classification began at IRS
area offices during November 2002, but IRS did not finalize its report on
the July 2002 NRP simulation until December 2002, and the report on the
September 2002 NRP process test was finalized in December 2002. According
to NRP officials, this did not create problems because they made changes
to NRP processes and training materials before the reports of these tests
were final. Though the final reports were not completed until later, these
tests and the NRP modifications they generated were complete before full
implementation of NRP.

NRP Staff Selection and Training Is Complete

IRS identified and trained staff to complete NRP classification and
audits. IRS selected NRP classifiers and auditors from field offices
across the country to handle NRP cases along with the non-NRP enforcement
cases and carried out plans for special training of the staff members
tasked with NRP responsibilities. IRS delayed the delivery of computer
software training to managers and clerks involved in NRP audits due to
technical problems with NRP software. This initially delayed the start of
NRP audits, but the training is now complete. The timing of NRP staff
selection and training fit the conclusion and recommendation in our June
2002 report that IRS should make sure that these key steps are carried out
in the appropriate sequence and not rushed to meet an earlier,
self-imposed deadline.

IRS Selected Auditors to Carry Out NRP

IRS selected over 3,000 auditors to handle NRP cases. Most of these
auditors are assigned to the Small Business/Self Employed operating
division.4 IRS selected 138 Small Business/Self Employed auditors to be
NRP classifiers and about 3,500 to handle NRP face-to-face audits.
According to NRP staff members, IRS offices across the country now have
one or more auditors trained to handle the NRP cases that come to those
offices. IRS area office managers determined how many auditors should
receive NRP training based on the projected distribution of NRP returns to
their areas.

4 IRS has four business operating divisions: Wage and Investment, Small
Business/Self Employed, Large and Mid-Sized Business, and Tax Exempt and
Government Entities.

Unlike face-to-face audits, NRP correspondence audits are being handled
out of a single office. IRS selected two groups of correspondence
auditors, 26 correspondence auditors in all, from the Wage and Investment
operating division's Kansas City office to handle NRP correspondence
audits.

IRS originally planned to select a cadre of auditors to work only on NRP
face-to-face audits. According to NRP officials, the geographic
distribution of NRP returns would have made it difficult to have a cadre
of auditors dedicated entirely to NRP examinations because they would have
had to travel extensively to carry out NRP audits. IRS officials said that
even though they did not implement the plan for a dedicated cadre of NRP
auditors, the number of full-time equivalent employees needed for NRP,
about 1,000 in fiscal year 2003, has not changed.

NRP Classifier Training Is Complete

In September 2002, IRS trained 138 auditors to perform NRP classification.
The classifiers learned how to apply the guidelines for NRP classification
and were shown how to use NRP casebuilding materials. Instructors stressed
the concept of "when in doubt, classify the item," meaning that, unless
the casebuilding materials explicitly verify the line item in question,
the classifier should classify the item as needing to be verified through
an audit. Instructors explained that with a random sample such as in NRP,
every return represents many others, so even small oversights on the part
of classifiers or auditors can have a substantial impact on data quality.

After the classification training, the classifiers remained at the
training location and began classifying NRP returns. Specially trained
classification reviewers reviewed most of the classified cases and
provided rapid feedback to the newly trained NRP classifiers. The intent
of this was to ensure that NRP classifiers understood and consistently
applied the NRP classification guidelines and received any needed
retraining before returning to their respective field offices and
participating in future NRP classification sessions.

NRP Auditor Training Is Complete

IRS delivered NRP correspondence and face-to-face auditor training during
late 2002 and early 2003. Instructors provided an overview of NRP goals
and objectives, reviewed the casebuilding materials that auditors would
have at their disposal, and explained the guidelines for NRP audits.

IRS trained about 3,500 auditors to conduct NRP face-to-face audits. This
training took place in IRS field offices across the country from October
2002 through February 2003. Each face-to-face NRP audit training session
lasted 3 days. The training consisted of an overview of NRP goals and
objectives, an explanation of how NRP audits differ from traditional
enforcement audits, and a description of how to apply NRP guidelines
during NRP audits. Trainers stressed that, for the purposes of consistent
and accurate data collection, NRP auditors should not focus solely on
significant issues or take into consideration the likelihood of collecting
unpaid taxes when conducting NRP audits, but should make sure that every
item identified by the classifier is carefully verified in the course of
the audit. Correspondence auditor training was similarly focused, and the
1-day training took place in September 2002. Staff members were trained
before they began to carry out NRP tasks.

Computer Software Training for Managers and Clerks Was Delayed

IRS needed to provide training to NRP auditors and to IRS managers and
clerks with NRP responsibilities in order for staff members to understand
how to use the computer program IRS developed to capture NRP information.
Because of some problems IRS encountered in installing the NRP software in
offices across the agency, IRS had to delay training some clerks and
managers. This led to delays in starting some NRP audits because managers
were unable to assign NRP cases to auditors and clerks were unable to
assist in loading NRP cases on NRP auditors' laptop computers. IRS
resolved these problems and finished delivering the majority of this
training by the end of January 2003.

NRP Reviews Are Under Way

IRS is nearly finished creating NRP casebuilding files, has classified
nearly three-fourths of the NRP returns, and has begun conducting NRP
audits. As of the end of March 2003, IRS completed NRP casebuilding for
about 94 percent of the approximately 47,000 returns in the NRP sample and
about 73 percent of NRP returns have been classified. Also, for 3,651 NRP
cases, IRS completed all necessary audit work. Some of these are cases
where correspondence or face-to-face audits are finished, but most of the
NRP cases closed so far, 2,709, are those that did not require audits.
Cases involving audits take longer to complete, so few have been closed
thus far. IRS made substantial progress in casebuilding and classification
starting in 2002, and the number of cases assigned to NRP auditors has
been increasing quickly since January 2003. Figure 2 shows the progress
IRS has made in casebuilding, classifying, and closing cases.

Figure 2: NRP Work Completed and Work Remaining as of the End of March 2003

The number of completed NRP casebuilding files began to grow during the
second half of 2002, as shown in figure 3. As figure 3 also illustrates,
NRP classification began in September 2002. These were the cases
classified during sessions held immediately after classifier training.
Over 9,000 NRP returns were classified by the end of October 2002. After
these sessions, classification became an area office function, with some
offices scheduling weeklong classification sessions on a somewhat regular
basis and others classifying returns as they come into the office.

Figure 3: NRP Implementation Progress

Note: Casebuilding and classification data from July through September
2002 are estimates because IRS did not keep monthly records on these
processes during this period.

IRS began conducting some NRP audits during November 2002, though these
audits began in earnest during the first quarter of 2003. By the end of
January 2003, IRS had assigned over 4,600 NRP cases to auditors to begin
conducting face-to-face and correspondence audits. By the end of March
2003, about 18,000 taxpayers had been contacted regarding NRP audits.

IRS Has Implemented NRP Quality Assurance Measures

IRS recognizes the need for accurate NRP data and, as planned, has built
into the program several measures to ensure the quality of NRP results.
IRS designed the NRP classification process to include quality assurance
reviews and has added additional quality assurance measures in response to
suggestions we made in the course of this engagement. The NRP audit
process also includes quality assurance measures that include both
in-process and completed case reviews, with all NRP audits reviewed before
they are formally closed with the taxpayer. IRS also built accuracy checks
into the data capture steps that take place throughout the NRP process.

IRS Is Conducting Classification Reviews on a Sample of Returns

IRS designed NRP classification to include regular reviews of classifiers'
decisions. We found that these reviews are generally taking place
according to NRP guidelines. We also found that additional measures could
further improve NRP classification accuracy, and IRS implemented our
suggestions.

NRP guidelines specify that NRP classification reviewers review all cases
for which returns are classified as needing either no audit at all or only
correspondence audits to confirm their accuracy. Additionally, reviewers
must initially review 25 percent of the cases classified by each
classifier that are selected for face-to-face audits until they are
satisfied with the quality of the classifier's work and its consistency
with NRP guidelines. After that standard has been met, the guidelines
specify that reviewers need only review approximately 10 percent of the
cases that each classifier selects for face-to-face audit.
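
The review-selection rule just described (100 percent review of accepted
and correspondence-audit classifications, a 25 percent sample of each
classifier's face-to-face classifications dropping to roughly 10 percent
once quality is established) can be sketched as follows. This is an
illustration only; the function, data shapes, and random-sampling
mechanics are assumptions, not IRS's actual procedures or systems.

    # Illustrative sketch of the classification-review selection rule described
    # above. Names and mechanics are hypothetical.
    import random

    def select_for_review(category, classifier_is_proven, rng=random):
        """Decide whether one classified NRP case should receive a review."""
        if category in ("accepted as filed", "accepted with adjustments",
                        "correspondence audit"):
            return True                       # guidelines call for 100 percent review
        review_rate = 0.10 if classifier_is_proven else 0.25
        return rng.random() < review_rate     # sampled face-to-face classifications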

We conducted site visits to five IRS area offices where NRP classification
was taking place and found that IRS's plans to implement the
classification steps of the program were generally well understood by the
classifiers carrying them out. Classifiers were knowledgeable about the
differences between the NRP classification process and the classification
process used in the enforcement audit environment and supported NRP goals
in general. However, we also found instances where NRP classifiers were
not consistently following NRP classification guidelines.

Another issue we identified involved the use of the classification review
sheets that reviewers fill out when they find problems with classifiers'
decisions. We learned that there was no provision for further review of
these forms. In some cases, we found that reviewers were not always
documenting classification errors on the forms. We discussed with NRP
officials the potential benefits of using NRP classification review sheets
for more than identifying issues at the area office level. Specifically,
we suggested that classification review sheets be forwarded from the area
offices to a central location in order to identify problems that may be
occurring in different locations around the country or other trends that
the NRP Office may need to address during the course of NRP
classification. The NRP Office agreed with our suggestion and added
centralized review of classification review sheets to its other
classification quality assurance measures.

The NRP Office adopted our suggestion that it conduct site visits to area
offices to identify NRP classification implementation issues. Similar to
the visits we conducted, NRP staff members visited area offices and met
with classifiers, reviewers, and managers to identify issues encountered
in carrying out NRP classification and possible areas where NRP guidelines
may have been misinterpreted. Among the issues they are asking about is
the usefulness of the various materials included in the casebuilding
files, information which may prove useful in the design of the
casebuilding portion of future iterations of NRP. NRP staff members are
also conducting separate reviews of completed classification cases.

NRP Audit Quality Assurance Measures Include a Mix of In-Process and
Completed Audit Reviews

IRS has designed NRP to include several steps to identify NRP audit
quality problems at both the individual auditor level and across the
program. Reviews include quality checks while cases are in progress and
after work is complete, and reviews by managers at different levels.
Importantly, IRS's plans call for every NRP audit to be reviewed at least
once at a point where it is still possible to return to the taxpayer and
complete additional audit steps, if necessary. These quality assurance
measures will serve to mitigate the risk of IRS including erroneous or
incomplete data in the NRP database.

NRP guidelines task group managers with reviewing one open NRP audit for
each auditor in the first 90 days of that auditor's NRP activity and
another in the first 180 days.5 NRP officials intend for these in-process
reviews to be extensive and timed early enough in the program to identify
individual auditors' misunderstandings of the program, correct them on the
audits under review, and prevent them on future NRP audits.

IRS has also created Quality Review Teams both to oversee individual audit
cases and identify problems at the area office level and systemically
across NRP. These teams are made up of IRS managers and are tasked with
checking for compliance with NRP-specific and overall IRS standards on 40
open cases and 20 closed cases for each of IRS's 15 area offices. These
reviews will be repeated in each area about once every 3 months throughout
the planned 18-month NRP audit period. The IRS standards applied by the
teams to the audits they review are the same standards employed by IRS's
Examination Quality Measurement System (EQMS).6

5 Most auditors work in groups headed by group managers. Groups are part
of territories, which are part of areas. There are 15 area offices
participating in NRP.

6 EQMS measures compliance with IRS standards in several areas, including
audit planning, timeliness, and depth and how well the auditor considered
large, unusual, or questionable items and income on the taxpayer's return.

Similar to the visits NRP officials made to area offices to review
classification activities, NRP officials are also visiting area offices to
review NRP audit activities. NRP officials said that any systemic issues
identified through Quality Review Team reviews will then be addressed
across NRP.

Another NRP audit quality assurance element calls for all face-to-face
audits to be checked by group managers after work is completed but before
the cases are formally closed with the taxpayers. This review will include
assessing technical correctness, mathematical accuracy, completeness, and
adherence to procedural requirements. IRS officials said that these
requirements include adherence to the NRP-specific requirement that audits
include verification of all items identified through the NRP
classification process. These reviews also include assessing adherence to
IRS standards in areas such as audit depth and reviewing large, unusual,
or questionable items on the audited return. We were initially concerned
that IRS planned for these reviews to take place after NRP audits were
completely closed, precluding IRS from reopening the cases or otherwise
obtaining additional information from the taxpayers even if the reviewers
found that the original NRP audits were incomplete.7

However, senior IRS officials informed us in March 2003 that these reviews
will take place after NRP auditors consider their audit work to be
complete but before the taxpayers are notified that the audits are over.
The officials explained that these reviews of all NRP cases will be timed
to provide an important means of ensuring that complete and accurate audit
results are entered into the NRP database. They also explained that the
importance of NRP audit reviews has been stressed throughout NRP
implementation and will be the subject of ongoing communication with
managers in the field.

It is very important that IRS conduct reviews of NRP audits before they
are closed because IRS data show that auditors do not always meet
enforcement audit quality standards. In fiscal year 2002, IRS's EQMS found
that auditors in the field did not meet the audit depth standard about 15
percent of the time on field audits; the standard for auditing taxpayer
income was not met about 25 percent of the time on field audits; and the
standard concerning audits of large, unusual, or questionable items was
not met 40 percent of the time on field audits. IRS officials said that
accurate audit results in these areas are critical to NRP's overall
accuracy.

7 After IRS notifies a taxpayer that an audit is closed, IRS may only
reopen the audit with the taxpayer in order to correct errors if special
circumstances, such as fraud, exist or if reopening the case will benefit
the taxpayer.

IRS officials pointed out that the error rate for NRP audits should be
lower than in the enforcement audit environment because NRP auditors
received special training and because the NRP classification process will
enhance NRP audit quality. For example, NRP guidelines call for
classifiers to identify large, unusual, or questionable items on returns
(the largest EQMS error category) and NRP auditors must address all
classified items. However, IRS did not implement its earlier plan of
having a selected cadre of auditors work only on NRP cases. While
NRP-specific training will serve to prevent many audit errors, NRP audits
are now being conducted by a cross section of auditors from IRS field
offices across the country who are more typical of the auditors who
generated the 2002 EQMS error rates. Because every return in the NRP
sample represents many returns in the whole population of 1040 filers,
even a small number of cases closed with incomplete information could
affect the accuracy of NRP data.

IRS officials also noted that their plan to conduct early reviews of NRP
cases will identify problems with auditors' understanding of NRP and help
to keep them from recurring on subsequent NRP audits. At least two of each
NRP auditor's early cases will have extensive manager involvement while
the cases are still in progress, and other managers will be looking at a
sample of both completed and open cases to identify problems. IRS
officials believe that these measures are sufficient to ensure NRP audit
quality.

NRP Includes Data Consistency Checks

IRS is including a series of data consistency checks in the NRP database
to verify that the information NRP auditors record in IRS's NRP reporting
system agrees with the information that IRS recorded from the tax returns
earlier in processing. NRP auditors must first record the results of NRP
audits in the report-generating software that was modified for NRP
purposes. Once auditors have recorded audit results, NRP coordinators must
use a data conversion program to transfer the data into a format that the
NRP database will accept. Following data conversion, IRS coordinators
transfer the audit data to the NRP database.

Once the data are transferred to the NRP database, a series of data
consistency checks take place to confirm that the data IRS originally
transcribed from the tax return are consistent, within specified
tolerances, with the data that NRP auditors recorded in the NRP reporting
software. If any of the consistency checks fail for a return in the NRP
sample, the NRP area coordinator will be notified and the mistake will
need to be corrected. According to IRS officials, they will impress upon
NRP auditors the importance of entering data into the NRP software
correctly the first time because it will be time-consuming to correct
errors. NRP officials have developed a case tracking system in order to
monitor which cases still need to pass all of the consistency tests and
which tests they need to pass.

IRS officials reported that, as of early April 2003, the NRP database and
related programs were running and that completed NRP cases were being
entered into the database. They said that they were still making some
enhancements, but that the programs were fully functional.
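
To make the tolerance-based comparison described above concrete, the
sketch below checks auditor-recorded amounts against the amounts
transcribed when the return was originally processed and reports any line
items that fall outside an allowed tolerance. The field names, tolerance
values, and data layout are assumptions made for this example, not a
description of IRS's actual NRP database programs.

    # Illustrative sketch of a tolerance-based data consistency check.
    # Field names and tolerance amounts are hypothetical.

    TOLERANCES = {"wages": 1.0, "interest_income": 1.0, "total_tax": 5.0}  # assumed, in dollars

    def consistency_failures(transcribed, audited, tolerances=TOLERANCES):
        """Return the line items whose audited amount differs from the transcribed
        amount by more than the allowed tolerance, so the area coordinator can be
        notified and the record corrected."""
        failures = []
        for item, tolerance in tolerances.items():
            if item in transcribed and item in audited:
                if abs(transcribed[item] - audited[item]) > tolerance:
                    failures.append(item)
        return failures

    # Example: a $250 difference in wages exceeds the $1 tolerance and is flagged.
    print(consistency_failures({"wages": 52000.0, "total_tax": 6400.0},
                               {"wages": 52250.0, "total_tax": 6400.0}))   # -> ['wages']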

IRS Is Taking Steps to Minimize and Assess Taxpayer Burden

As IRS planned, NRP casebuilding and classification processes are helping
minimize the burden on taxpayers with returns in the NRP sample. In
addition, the size of the NRP sample is now smaller than IRS expected it
to be. However, the number of taxpayers who will be subject to NRP audits
has increased. IRS plans to survey taxpayers who receive NRP audits to
assess their perceptions of the burden posed by those audits. IRS also
used input from tax practitioners to identify ways to improve interactions
with taxpayers subject to NRP audits.

IRS is following its plans to reduce burden on taxpayers selected as part
of the NRP sample by (1) compiling NRP casebuilding materials that allow
IRS to verify certain items on tax returns without requesting the
information from the taxpayer, (2) classifying returns according to items
that need to be verified through an audit, and (3) limiting most NRP
audits to items that cannot be verified without an audit. IRS officials
also intend to compare classification decisions with the results of NRP
audits to identify ways of improving the classification process for future
rounds of NRP. Moreover, IRS's intent in carrying out NRP is to reduce the
burden on taxpayers in general by developing better audit selection
formulas and reducing the number of audits of fully compliant taxpayers.

NRP Processes Are Helping to Reduce Taxpayer Burden

The NRP casebuilding and classification processes described earlier are
having their intended effect of reducing the burden NRP creates for
taxpayers with returns in the NRP sample. IRS has assembled IRS and
third-party data on most of the returns in the NRP sample and classifiers
have used these data to verify information on the returns, where possible,
without contacting taxpayers. The remaining casebuilding and
classification work was under way as of the end of March 2003. The
material in the casebuilding files has allowed IRS to fully verify about
10 percent of NRP returns without any audit. Classifiers were able to use
the casebuilding material to verify all but one or two items on another 5
percent of NRP returns, and these were sent for correspondence audits.

Classifiers identified line items needing verification through a
face-to-face audit on about 85 percent of NRP returns classified as of the
end of March 2003. Because of the casebuilding and classification
processes IRS developed for NRP, these audits will generally be limited to
line items that cannot be verified using the information in the
casebuilding files. This is a substantial change from earlier compliance
research efforts, in which all returns were subject to audits of every
line on the return. Only the 1,683 taxpayers with returns selected for NRP
calibration audits will be subject to complete audits of their returns.

IRS plans to use NRP results to improve future iterations of NRP. For
example, NRP officials plan to compare classification outcomes with NRP
audit results to help them to identify possible changes needed in
casebuilding materials and the NRP classification process. They have told
us that it may be possible to further reduce the number of accurately
reported line items that are subject to compliance research audits. On the
other hand, IRS may also find through NRP calibration audits that
classification missed many items that should have been audited, so more
line items should receive some form of audit in future rounds of NRP in
order for the research results to be useful. IRS also intends to apply
lessons learned in NRP classification to classification in the enforcement
audit environment.

As we noted in our prior report, NRP should also lead to reductions in
taxpayer burden in general. IRS plans to use NRP results to help identify
and reduce causes of noncompliance and to better target enforcement audits
to noncompliant taxpayers, reducing the number of audits of fully
compliant taxpayers. IRS projects that, without improved audit selection
formulas based on NRP results, the percentage of enforcement audits that
result in no tax change will be about 35 percent higher in 2005 than it
was in 1993, the first year that selection formulas from the 1988
compliance study were available. Taxpayer burden will decrease if
successful execution of NRP enables IRS to reduce the number of these
audits of compliant taxpayers.

NRP Sample Is Smaller Than IRS Initially Estimated

The NRP sample consists of 46,860 tax returns. We reported in June 2002
that the NRP sample would consist of 49,251 returns. The current number is
smaller than the initial estimate because IRS originally estimated the NRP
sample size based on the characteristics of the filing population that
existed during the 1988 reporting compliance study. According to IRS
officials, when they applied the NRP sampling plan to the 2001 filing
population, the number of returns necessary to satisfy the requirements
for some of the NRP strata declined because filing rates for those strata
were smaller than IRS officials had projected.8 The final NRP sample
consists of about 2,400 fewer returns than initially planned.
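
The arithmetic behind this decline can be illustrated with a
stratified-sample sketch: if each stratum's sample is driven by its
projected population, a smaller-than-projected filing population in some
strata yields a smaller total sample. The strata, rates, and population
counts below are made-up numbers chosen only to show the mechanism,
assuming a fixed sampling fraction per stratum; they are not IRS's actual
NRP sampling plan, which is described only at a high level in footnote 8.

    # Illustrative sketch: total stratified sample size under projected versus
    # actual stratum populations. All figures and strata are hypothetical.

    def total_sample(populations, sampling_rates):
        """Sum the per-stratum samples implied by each stratum's population and rate."""
        return sum(round(populations[s] * sampling_rates[s]) for s in sampling_rates)

    rates = {"wage_only": 0.0002, "schedule_C": 0.002, "high_income": 0.01}   # assumed

    projected_pop = {"wage_only": 90_000_000, "schedule_C": 15_000_000, "high_income": 1_000_000}
    actual_pop    = {"wage_only": 90_000_000, "schedule_C": 13_500_000, "high_income":   900_000}

    print(total_sample(projected_pop, rates))   # 58,000 returns implied by the projection
    print(total_sample(actual_pop, rates))      # 54,000 returns once the smaller strata are used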

IRS Estimates That NRP Classification Will Result in More Audits Than
Expected

IRS officials are currently finding that the NRP classification results
are different than initially planned. IRS now estimates that more
face-to-face audits will take place than initially projected because (1)
as the NRP plan recognized, IRS's initial estimates were uncertain and
based on aging data and (2) the final form of NRP classification
guidelines meant more face-to-face and fewer correspondence audits. IRS
initially estimated that out of an NRP sample of over 49,000 tax returns,
classification would result in about 30,000 face-to-face audits of
selected line items, about 9,000 correspondence audits covering no more
than two line items, and about 8,000 taxpayers who would not undergo any
audit because classifiers were able to either verify all of the items on
their returns or correct some line items without contacting the taxpayers.

The final NRP sample is 46,860 returns, and IRS now estimates that NRP
classification will result in face-to-face audits of about 39,000
taxpayers, with approximately an additional 2,300 receiving correspondence
audits and 3,800 subject to no audit at all. IRS also plans to conduct
1,683 line-by-line calibration audits, drawing 561 returns from each of
the three classification categories; these numbers have not changed.
Figure 4 shows IRS's current estimate of how the three NRP classification
categories will be distributed.

8 The NRP sample is divided into 30 categories, each of which is referred
to as a stratum. Each stratum contains a different type of taxpayer based
on the taxpayer's total positive income and the various schedules that the
taxpayer filed.

Figure 4: Estimated NRP Sample by Level of Taxpayer Contact

NRP officials explained that the number of face-to-face NRP audits is
higher than expected because they were relying on aging data and
preliminary classification guidelines. Our 2002 report on NRP also noted
the preliminary nature of these estimates. Initial classification
breakdown estimates were made using 14-year-old data from the 1988
Taxpayer Compliance Measurement Program study. NRP staff members said that
changes in the tax code and in the economic makeup of the filing
population since the 1988 study make the returns from that study an
unreliable tool for predicting NRP classification results, though that was
all they had to work with.

They also said that some of the change can be attributed to changes they
made in the final form of NRP classification guidelines. NRP staff members
said that they modified the NRP classification guidelines as a result of
discussions that took place between NRP staff members and representatives
from IRS's business operating divisions. They instituted the changes to
the classification guidelines in order to better match the training and
skills of the examiners selected to conduct NRP correspondence and
face-to-face audits with the types of issues to be covered by those
audits. One change is that discrepancies between the casebuilding files
and the tax returns for issues such as Individual Retirement Account
contributions and Social Security income were removed from the list of
issues that could be verified through a correspondence audit. Another
change is that the final guidelines call for virtually all business
returns to receive face-to-face audits; initial assumptions about the
classification process allowed for some business returns to be accepted as
filed or receive only correspondence audits.

IRS Will Survey NRP Taxpayers

IRS will survey taxpayers who are subject to NRP audits to assess overall
customer satisfaction and their perceptions of the burden audits created
for them. IRS will ask taxpayers to fill out the same survey it uses to
assess customer satisfaction in the enforcement audit environment and
compare the results for the two populations.

The surveys include issues related to taxpayer burden in the form of
questions about the amount of time taxpayers spent preparing for the
audits and the amount of time that they spent on the audits themselves.
The surveys also ask whether taxpayers receiving NRP audits believe the
information that they were asked to provide seemed reasonable and whether
they feel they received fair treatment from IRS.

After collecting the survey results, IRS will then develop a "score" for
each question on the survey that relates to burden. IRS will compare the
results from the NRP customer satisfaction survey to the results from
surveys completed after enforcement audits.

IRS Consulted with Practitioners

IRS consulted with outside stakeholders to enhance its efforts to minimize
the burden NRP created for taxpayers with returns in the sample. IRS
consulted with members of organizations that provide feedback to IRS on
matters concerning taxpayers, including the National Public Liaison, the
Information Reporting Program Advisory Committee, and the Internal Revenue
Service Advisory Council.9 According to IRS, practitioner input led to
wording changes on taxpayer notification letters and improvements to
training materials, which strengthened the emphasis on maintaining good
relations with NRP-selected taxpayers. Representatives of the National
Public Liaison also participated in the training for the staff members who
were selected to conduct NRP auditor training.

9 The National Public Liaison coordinates with tax practitioner
organizations, other government agencies, and IRS's formal advisory groups
to provide a forum for external feedback. The Information Reporting
Program Advisory Committee provides input to IRS on reporting issues, and
the Internal Revenue Service Advisory Council provides input to IRS on tax
administration issues.

Conclusions

IRS continues to be on track for meeting its NRP goal of obtaining
meaningful compliance data while minimizing the burden on taxpayers with
returns in the NRP sample. IRS has followed the key elements of the plans
it laid out last year and has responded to identified needs to modify the
program that have come from its own testing as well as from outside
stakeholders. Because of this, we are not making any recommendations in
this report.

We recognize that IRS efforts to gather information about NRP
implementation while the program is under way are very important to IRS's
continued success in carrying out NRP. Classification review results,
audit review results, and customer satisfaction surveys all provide the
means for IRS to make immediate adjustments to NRP now and to enhance the
design of future iterations of the program. Provisions for 100 percent
review of NRP audits before they are closed are particularly important
because even a small number of erroneous or incomplete cases will
negatively affect the quality of NRP data.

Agency Comments

On May 22, 2003, we received written comments on a draft of this report
from the Commissioner of Internal Revenue (see app. I). The commissioner
noted the importance of NRP and IRS's continued emphasis on minimizing
taxpayer burden and delivering quality results. We also received technical
comments from NRP staff members, which we have incorporated into this
report where appropriate.

As agreed with your offices, unless you publicly announce its contents
earlier, we plan no further distribution of this report until 30 days
after its date. At that time, we will send copies of this report to the
Secretary of the Treasury, the Commissioner of Internal Revenue, and other
interested parties. This report is also available at no charge on GAO's
Web site at http://www.gao.gov.

If you or your staffs have any questions, please contact Ralph Block at
(415) 904-2150, David Lewis at (202) 512-7176, or me at (202) 512-9110.
Thomas Gilbert was also a key contributor to this assignment.

James R. White
Director, Tax Issues
Strategic Issues Team

Appendix I: Comments from the Internal Revenue Service


(440149)

GAO's Mission

The General Accounting Office, the audit, evaluation and investigative arm
of Congress, exists to support Congress in meeting its constitutional
responsibilities and to help improve the performance and accountability of
the federal government for the American people. GAO examines the use of
public funds; evaluates federal programs and policies; and provides
analyses, recommendations, and other assistance to help Congress make
informed oversight, policy, and funding decisions. GAO's commitment to
good government is reflected in its core values of accountability,
integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony

The fastest and easiest way to obtain copies of GAO documents at no cost
is through the Internet. GAO's Web site (www.gao.gov) contains abstracts
and full-text files of current reports and testimony and an expanding
archive of older products. The Web site features a search engine to help
you locate documents using key words and phrases. You can print these
documents in their entirety, including charts and other graphics.

Each day, GAO issues a list of newly released reports, testimony, and
correspondence. GAO posts this list, known as "Today's Reports," on its
Web site daily. The list contains links to the full-text document files.
To have GAO e-mail this list to you every afternoon, go to www.gao.gov and
select "Subscribe to daily E-mail alert for newly released products" under
the GAO Reports heading.

Order by Mail or Phone

The first copy of each printed report is free. Additional copies are $2
each. A check or money order should be made out to the Superintendent of
Documents. GAO also accepts VISA and Mastercard. Orders for 100 or more
copies mailed to a single address are discounted 25 percent. Orders should
be sent to:

U.S. General Accounting Office
441 G Street NW, Room LM
Washington, D.C. 20548

To order by Phone:
Voice: (202) 512-6000
TDD: (202) 512-2537
Fax: (202) 512-6061

To Report Fraud, Waste, and Abuse in Federal Programs

Contact:
Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: fraudnet@gao.gov
Automated answering system: (800) 424-5454 or (202) 512-7470

Public Affairs

Jeff Nelligan, Managing Director, NelliganJ@gao.gov, (202) 512-4800
U.S. General Accounting Office, 441 G Street NW, Room 7149
Washington, D.C. 20548
*** End of document. ***