TREASURY INSPECTOR GENERAL FOR TAX ADMINISTRATION

 

 

 

An Appropriate Methodology Has Been Developed for Conducting the National Research Program Study to Measure the Voluntary Compliance of Individual Income Taxpayers

 

 

 

June 17, 2009

 

Reference Number:  2009-30-086

 

 

This report has cleared the Treasury Inspector General for Tax Administration disclosure review process and information determined to be restricted from public release has been redacted from this document.

 

Phone Number   |  202-622-6500

Email Address   |  inquiries@tigta.treas.gov

Web Site           |  http://www.tigta.gov

 

June 17, 2009

 

 

MEMORANDUM FOR COMMISSIONER, SMALL BUSINESS/SELF-EMPLOYED DIVISION

                                         DIRECTOR, OFFICE OF RESEARCH, ANALYSIS, AND STATISTICS

 

FROM:                            Michael R. Phillips /s/ Michael R. Phillips

                                         Deputy Inspector General for Audit

 

SUBJECT:                    Final Audit Report – An Appropriate Methodology Has Been Developed for Conducting the National Research Program Study to Measure the Voluntary Compliance of Individual Income Taxpayers (Audit # 200830047)

 

This report presents the results of our review to determine the adequacy of the methodology for conducting the National Research Program (NRP)[1] – Individual Reporting Compliance Study (hereafter referred to as the NRP study or the study).  This audit was conducted as part of the Treasury Inspector General for Tax Administration’s Fiscal Year 2009 Annual Audit Plan under the major management challenge of Tax Compliance Initiatives.

Impact on the Taxpayer

The Internal Revenue Service (IRS) is conducting a multiyear NRP study to measure the voluntary compliance of individual income taxpayers.  IRS management developed an appropriate methodology for conducting the study and took proper steps to ensure that the study provides quality and accurate results.  Because the NRP study results will be used to help identify tax returns that have a higher likelihood of noncompliance, the IRS can direct resources to where they provide the most value, which will reduce the burden on compliant taxpayers.

Synopsis

The IRS selects tax returns for audit based on data obtained from an NRP study of taxpayers that filed a Tax Year (TY) 2001 U.S. Individual Income Tax Return (Form 1040).  To update its compliance information, the IRS initiated a new NRP study of individual income taxpayers.  The new study was designed to use annual random samples[2] of about 13,200 returns.  Examinations began in October 2007 and consisted of TY 2006 returns, the first tax year of the study.  Other tax years will follow, and when data from the third year of the study (TY 2008) are ready for analysis, the resulting estimates should have a statistical precision comparable to that of the TY 2001 study.  After the first 3 years, the IRS plans to continue conducting examinations under the NRP study, which should allow for annual updates of compliance information.

We determined that the methodology of the NRP study sampling plan was appropriate.  Our review verified that the sample size was sufficient to measure the statistical reliability of results, the related calculations were accurate and consistent, and the formulas were appropriate.  We also determined that the sampling process was generally carried out as planned during the first year of the study.

To ensure the quality and accuracy of results, management instituted a multilayered quality review process, procedures for manual and systemic checks for data accuracy, and an effective process for training employees.  While there are requirements for quality reviews at the field office and Area Office[3] levels, there are no requirements for quality reviews to be conducted at the national level.  After NRP study examination procedures had been established, management determined that the quality review process would be improved by involving personnel at the national level.  As a result, national-level analysts conducted program visitations during the first year of the study and will participate in reviews with each of the Area Office NRP study quality review teams during the second year of the study.  However, without continued national involvement in the quality review process, the IRS’ ability to provide proper oversight and ensure consistency and quality could be negatively affected.

Recommendation

We recommended that the Director, Examination, Small Business/Self-Employed Division, include a provision in the Internal Revenue Manual requiring Small Business/Self-Employed Division Examination Headquarters and NRP staff to provide oversight of NRP reviews conducted by Area Offices.

Response

IRS management agreed with our recommendation.  The Internal Revenue Manual for NRP Individual Income Tax Reporting Compliance Studies is currently being updated and a provision requiring Small Business/Self-Employed Division Headquarters and NRP staff to provide oversight of NRP exam reviews is being added.  Management’s complete response to the draft report is included as Appendix IV.

Copies of this report are also being sent to the IRS managers affected by the report recommendation.  If you have questions, please contact me at (202) 622-6510 or Margaret E. Begg, Assistant Inspector General for Audit (Compliance and Enforcement Operations), at (202) 622-8510.

 

 

Table of Contents

 

Background

Results of Review

The National Research Program Study Sampling Process Was Appropriate

Management Implemented Several Procedures to Ensure That the Study Provided Quality and Accurate Results

Recommendation 1:

Appendices

Appendix I – Detailed Objective, Scope, and Methodology

Appendix II – Major Contributors to This Report

Appendix III – Report Distribution List

Appendix IV – Management’s Response to the Draft Report

 

 

Abbreviations

 

IRS – Internal Revenue Service

NRP – National Research Program

TY – Tax Year

 

 

Background

 

One of the Internal Revenue Service’s (IRS) strategic goals is to get taxpayers to meet their tax obligations with as little intervention as possible.  To achieve its goal, the IRS depends on voluntary compliance, which is a system of compliance that relies on individual citizens to report their income freely and voluntarily, calculate their tax liability correctly, and file a tax return on time.  There are three primary measures of voluntary compliance:

  • Filing compliance – the percentage of taxpayers with a filing requirement who file returns on time.
  • Payment compliance – the percentage of tax reported on timely filed returns that is paid on time.
  • Reporting compliance – the accuracy of the tax reported on timely filed returns.

To make decisions about its customer service and enforcement plans, the IRS needs information about voluntary tax compliance.  In Fiscal Year 2000, the IRS established the National Research Program (NRP)[4] Office and charged it with gathering voluntary compliance data to help guide the IRS’ strategic plans.  The NRP Office separately collects information on filing, payment, and reporting compliance.  Considered individually and collectively, these measures provide the IRS with statistical information that it uses to make decisions about its customer service and enforcement strategies.

The NRP Office completed its first reporting compliance study of taxpayers that file a U.S. Individual Income Tax Return (Form 1040) using information from Tax Year (TY) 2001.  The results of this study were used to help the IRS estimate the tax gap, which is the difference between the total taxes that taxpayers should have paid and the total taxes that were actually paid timely.  The IRS estimated the gross tax gap at approximately $345 billion.  Within the gross tax gap, late filing and nonfiling of tax returns accounted for about $27 billion and underpayment of reported taxes accounted for approximately $34 billion.  The remaining approximately $284 billion was due to improper reporting, and the vast majority of that improper reporting came from individual taxpayers.
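
The components of the gross tax gap estimate fit together as a simple residual calculation.  The following minimal sketch uses only the rounded figures cited above to show how the improper reporting portion falls out of the gross estimate:

    # Decomposition of the TY 2001 gross tax gap, using the rounded
    # figures cited in this report (amounts in billions of dollars).
    GROSS_TAX_GAP = 345     # total tax owed that was not paid timely
    NONFILING_GAP = 27      # late filing and nonfiling of returns
    UNDERPAYMENT_GAP = 34   # tax reported on time but not paid

    # Improper reporting is the residual after the other two components.
    reporting_gap = GROSS_TAX_GAP - NONFILING_GAP - UNDERPAYMENT_GAP
    print(f"Improper reporting: about ${reporting_gap} billion")  # -> $284 billion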

The IRS selects tax returns for audit based on the data obtained from the TY 2001 NRP study.  Although that study provided valuable information, the TY 2001 data are no longer current, and the areas of noncompliance identified in the study may have changed.  As these data become more outdated, there is an increased risk that the IRS does not have accurate information about taxpayer voluntary compliance and that its limited resources are not directed where they could be most productive.  The TY 2001 NRP study included a sample of about 46,000 tax returns and strained the IRS’ examination plan for that year.  However, the same limited examination resources are also needed to conduct the types of studies that collect voluntary compliance data.

Recognizing the need for more current information and balancing that need with performing its annual examination plan with minimal disruption, the NRP Office initiated a new Individual Reporting Compliance Study (hereafter referred to as the NRP study or the study), beginning on October 1, 2007.  This new study lessens the impact on the annual examination plan because it spreads out the returns sampled over multiple tax years, which reduces the annual burden compared to the TY 2001 study.  The new study also calls for collecting data every year for the foreseeable future, which should allow the information to be more current and updated regularly.

The current NRP study was designed to use annual random samples of about 13,200 U.S. Individual Income Tax Returns (Form 1040).  The samples are intended to be representative of the individual taxpayer populations.  The examinations began on October 1, 2007, and consisted of TY 2006 returns, the first tax year of the study.  Other tax years are to follow, and when data from the third year of the study (TY 2008) are ready for analysis, the resulting estimates should have a statistical precision comparable to that of the TY 2001 study.  The merging of multiple tax years will allow researchers to look at line-item detail not available in previous compliance studies.  A major benefit of this shift is the ability to update compliance estimates and workload identification models annually after collecting the TY 2008 data.

This review was performed at the Office of Research, Analysis, and Statistics in Washington, D.C., and the Small Business/Self-Employed Examination Division in New Carrollton, Maryland, during the period July 2008 through February 2009.  We conducted this performance audit in accordance with generally accepted government auditing standards.  Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objective.  We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objective.  Detailed information on our audit objective, scope, and methodology is presented in Appendix I.  Major contributors to the report are listed in Appendix II.

 

 

Results of Review

 

The National Research Program Study Sampling Process Was Appropriate

The NRP study sampling plan, which will be implemented over 3 tax years, is designed to achieve three objectives:

  • Provide an estimate of reporting compliance.
  • Support the updating of formulas for examination workload selection.
  • Produce annual estimates of reporting compliance for taxpayers that claim the Earned Income Tax Credit.[5]

IRS statisticians used a 95 percent confidence interval and a precision of ± one-half of 1 percent to calculate a sample size that would allow for an estimate of the voluntary reporting rate.[6]  While this calculation provided a sample size necessary to estimate the voluntary reporting rate, it did not provide a sufficient number of cases required to address the workload formulas objective.  To update the workload formulas, the IRS has learned from experience that the sample design must provide a sufficient number of returns that can be defined as profitable to audit.[7]  In allocating the sample related to the workload formulas objective, the IRS ensured that the sample also met the reporting compliance objective by applying a factor to reflect each substratum’s[8] contribution to the tax gap.  
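
For readers unfamiliar with how a confidence interval and precision target translate into a sample size, the following is a minimal sketch using the standard formula for estimating a proportion under simple random sampling, with the conservative assumption p = 0.5.  It is illustrative only; the IRS’ actual calculation rests on its stratified design and variance estimates that are not reproduced in this report.

    import math

    def sample_size_for_proportion(z: float, e: float, p: float = 0.5) -> int:
        """Standard sample size for estimating a proportion under simple
        random sampling: n = z^2 * p * (1 - p) / e^2, rounded up."""
        return math.ceil(z**2 * p * (1 - p) / e**2)

    # 95 percent confidence (z = 1.96) with +/- 0.5 percent precision,
    # using p = 0.5 as the most conservative variance assumption.
    print(sample_size_for_proportion(1.96, 0.005))  # -> 38416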

The IRS then calculated a separate sample of returns, also spread across 3 tax years, to produce estimates of Earned Income Tax Credit compliance by using a 95 percent confidence interval and a precision of ± 2 percent.  One-third of the sample size that met this requirement is also sufficient to produce annual estimates of Earned Income Tax Credit compliance based on a 90 percent confidence interval and a precision of ± 3 percent.
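
The arithmetic behind the one-third relationship can be checked with the same simplified formula: one year’s share of a sample sized for 95 percent confidence and ± 2 percent precision exceeds the sample needed for 90 percent confidence and ± 3 percent precision.  This sketch again assumes the conservative p = 0.5 and simple random sampling, so the exact counts are illustrative rather than the IRS’ figures.

    import math

    def n_required(z: float, e: float, p: float = 0.5) -> int:
        # n = z^2 * p * (1 - p) / e^2, rounded up
        return math.ceil(z**2 * p * (1 - p) / e**2)

    three_year_n = n_required(1.96, 0.02)   # 95% confidence, +/- 2% -> 2401
    annual_need = n_required(1.645, 0.03)   # 90% confidence, +/- 3% -> 752
    annual_share = three_year_n // 3        # one-third of the 3-year sample

    print(three_year_n, annual_need, annual_share)  # -> 2401 752 800
    assert annual_share >= annual_need  # one year's share meets the annual target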

The IRS calculated a total sample size of 39,442 returns necessary to address all three objectives of the sampling plan.  The sample will be spread uniformly, by substrata, across 3 tax years.  The sample was stratified into 58 strata: 11 primary strata, 10 of which were further divided into substrata.  The primary stratification was based on the examination classes used in examination workload selection.  Examination classes represent groupings of taxpayers based on various factors, such as taxpayer income.  Substratification was based on various additional factors, such as taxpayer filing status.
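
As a hypothetical illustration of the allocation mechanics described above, the sketch below spreads a fixed total sample across a few invented substrata and divides each allocation evenly over three tax years.  The substratum names and weights are made up; the IRS’ actual weights, which reflect each substratum’s contribution to the tax gap, are not published in this report.

    # Hypothetical allocation of a fixed total sample across substrata,
    # spread uniformly over three tax years.  Names and weights below
    # are invented for illustration only.
    TOTAL_SAMPLE = 39_442
    YEARS = 3

    substrata_weights = {            # assumed share of the total sample
        "exam_class_A/substratum_1": 0.40,
        "exam_class_A/substratum_2": 0.25,
        "exam_class_B/substratum_1": 0.35,
    }

    for name, weight in substrata_weights.items():
        total = round(TOTAL_SAMPLE * weight)
        per_year = total // YEARS    # uniform spread across the 3 years
        print(f"{name}: {total} returns total, about {per_year} per year")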

In order to validate the sampling methodology, the IRS contracted with an outside research firm to evaluate the sample design for the study.  The research firm concluded that the sample design would meet all the requirements it was created to satisfy, provided that the statistical assumptions on which the design is based remain viable.  It found that the choice and application of statistical formulas were appropriate and verified the accuracy of the calculations.  The firm also reported that there were no inherent problems with using a multiyear approach. 

We reviewed the IRS sampling plan and the research firm’s report and discussed the plan with the IRS statistician who prepared it to determine the adequacy of the methodology for conducting the study.  We confirmed the accuracy and consistency of the calculations of the sample size, both in total and at the substrata level.  We also determined that the sampling process was generally carried out as planned during the first year of the study.  To ensure the methodology of the IRS sampling plan was appropriate, we consulted with an independent statistician.  The statistician confirmed that the IRS sample design, including the formulas used, was appropriate for calculating the sample size and that the sample size was sufficient to measure the statistical reliability of results.  The statistician also confirmed that a multiyear strategy is an acceptable approach for performing the study and that any risks related to this approach could be addressed by making adjustments to the plan without affecting the validity of results.  After the third year of the study, based on the design of the sampling plan, the IRS should be able to combine the results from each year to address the objectives of the study.
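
A rough way to see why combining the three annual samples yields precision comparable to the single large TY 2001 sample: under simple random sampling assumptions, the margin of error shrinks with the square root of the pooled sample size.  The compliance rates below are invented for illustration; only the annual sample size of about 13,200 comes from this report.

    import math

    def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
        # half-width of a 95 percent confidence interval for a proportion
        return z * math.sqrt(p * (1 - p) / n)

    annual_n = 13_200
    annual_rates = [0.84, 0.83, 0.85]   # hypothetical annual estimates
    pooled_rate = sum(annual_rates) / len(annual_rates)  # equal-size years
    pooled_n = annual_n * len(annual_rates)

    print(f"one year : +/- {margin_of_error(annual_rates[0], annual_n):.4f}")
    print(f"3 pooled : +/- {margin_of_error(pooled_rate, pooled_n):.4f}")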

Management Implemented Several Procedures to Ensure That the Study Provided Quality and Accurate Results

The validity of the NRP study is dependent upon having reliable information.  We determined that management implemented effective processes for 1) measuring quality and providing oversight, 2) determining the accuracy of results, and 3) training employees.

Management developed a comprehensive quality review plan that could be improved with continued national level involvement

The quality review process is important because it provides a method to monitor, measure, and improve the quality of work.  In addition, quality review results enable management to identify trends, problem areas, training needs, and opportunities for process improvement.  For the first year of the NRP study, we noted that quality reviews were conducted at the field office, Area Office,[9] and national office levels as follows:

  • NRP study coordinators were responsible for determining the quality of case building by verifying that case files were complete.
  • NRP study coordinators were also responsible for reviewing all classified NRP study returns until they were satisfied with the quality and consistency of each classifier.  Thereafter, they reviewed a 10 percent sample of the cases for each classifier.
  • Group managers were required to complete an evaluative case review on the first NRP study case assigned to each examiner within 60 calendar days of assignment.
  • Group managers were also required to complete a nonevaluative case review on all NRP study closed cases, which considers various quality attributes and data-gathering items.
  • NRP study quality review teams from each Area Office are required to conduct nonevaluative quality reviews on a sample of cases during each year of the study.
  • Management from the Small Business/Self-Employed Division established a visitation team to conduct quality reviews on a sample of cases during the first year of the study.  The visitation team specifically looked at the quality process and the overall quality of the cases to ensure that guidelines were being met.

While the quality review process was appropriate, there are no requirements for quality reviews to be conducted at the national level.  After NRP study examination procedures had been established, management determined that national involvement in the quality review process would assist in the overall quality initiative.  During the second year of the study, the visitation team will participate with the Area Office quality review teams.  This effort became possible only after IRS management approved a proposal submitted by analysts from the visitation team.  We agree with the analysts’ opinion that involvement at the national level is necessary for providing proper oversight and ensuring consistency and quality.  However, the analysts indicated that continued direct involvement in the quality review process at the national level depends on the IRS budget for the NRP study.  Without continued national involvement, the IRS’ ability to provide proper oversight and ensure consistency and quality could be negatively affected.

Steps were taken to ensure results were accurately captured and tracked to address NRP objectives

Accurate reporting of NRP study examination results is essential for reaching proper decisions when analyzing each study’s output.  Management established the following procedures to ensure that examination results are accurately captured and tracked to properly address the objectives of the NRP study:

  • Examiners input results into an electronic case file.  Completed records are uploaded to the group manager, who reviews each closed case for data accuracy and forwards the records to the NRP study coordinator.
  • NRP study coordinators complete a procedural review of the case and transmit closed case information to the Detroit Computing Center,[10] where the data are run through a program that tests for consistency errors.  For tests that fail, NRP study coordinators must verify the accuracy of the questioned data and transmit any necessary corrections.  This process repeats until the data pass all consistency tests (a simplified illustration of this loop follows the list).
  • NRP study analysts conduct reviews of the data maintained at the Detroit Computing Center to identify anomalies and apply appropriate corrections.
  • Examination Automation Team analysts review a sample of closed case files for data consistency and accuracy.  The analysts coordinate their reviews with the Area Office NRP study quality review teams to coincide with the annual reviews conducted at each Area Office.
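
The verify-and-retransmit cycle described above is, in effect, a validation loop that runs until a closed case passes every consistency test.  The sketch below illustrates the general pattern; the field names and checks are invented, not the Detroit Computing Center’s actual test program.

    # Illustrative validation loop for closed-case records.  The checks
    # and field names are hypothetical; the actual consistency tests are
    # not described in this report.
    def consistency_errors(case: dict) -> list:
        errors = []
        if case.get("total_tax", 0) < 0:
            errors.append("total_tax is negative")
        if case.get("adjustments") and not case.get("examiner_id"):
            errors.append("adjustments recorded without an examiner_id")
        return errors

    def process_case(case: dict, get_correction) -> dict:
        # Mirror the coordinator's cycle: verify, correct, retransmit,
        # and repeat until the record passes all consistency tests.
        errors = consistency_errors(case)
        while errors:
            case = get_correction(case, errors)
            errors = consistency_errors(case)
        return case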

The process for training employees was effective

Training for employees working on the NRP study was provided in a manner that enabled them to complete the training just prior to beginning their assigned responsibilities.  Specifically:

  • Case builders were instructed on the use of a checksheet that identified all the items and information that must be added to each case file.  Due to the straightforward nature of case building, no formal classroom training was necessary.
  • Technically proficient and experienced revenue agents and tax compliance officers were selected by each Area Office to serve as NRP study classifiers.  Classification sessions, which were held in the same location where cases were built, were scheduled to coincide with the availability of inventory.  At each session, classifiers were required to complete classroom training before they were assigned an inventory.  For consistency purposes, classification work was quality reviewed during each session, and findings were shared with the classifiers.
  • Management required that cases be assigned only to examiners who had completed all applicable training modules.  Rather than have examiners attend classroom training and wait months to receive their first case, training was made available through a web-based application.  This enabled examiners to take the training just prior to receiving cases, which management believed would result in better quality.

Recommendation

Recommendation 1:  The Director, Examination, Small Business/Self-Employed Division, should include a provision in the Internal Revenue Manual requiring Small Business/Self-Employed Division Examination Headquarters and NRP staff to provide oversight of NRP reviews conducted by Area Offices.

Management’s Response:  IRS management agreed with this recommendation.  The Internal Revenue Manual for NRP Individual Income Tax Reporting Compliance Studies is currently being updated and a provision requiring Small Business/Self-Employed Division Headquarters and NRP staff to provide oversight of NRP exam reviews is being added.

 

Appendix I

 

Detailed Objective, Scope, and Methodology

 

The overall objective of this review was to determine the adequacy of the methodology for conducting the NRP – Individual Reporting Compliance Study (hereafter referred to as the study).  To accomplish the objective, we:

I. Evaluated the validity of the sampling process.

   A. Identified the criteria used to select the sample, as well as the basis for establishing the criteria.

   B. Determined how the sample size was calculated and whether the sample size was sufficient to measure the statistical reliability of results.

   C. Determined whether the sampling process was carried out as planned during the first year of the study.

   D. Identified the risks of implementing a sampling strategy over a multiyear period and determined how management accounted for the risks.

II. Evaluated the effectiveness of the quality review process.

   A. Identified and evaluated the process for reviewing the quality of case building.

   B. Identified and evaluated the process for reviewing the quality of classification.

   C. Identified and evaluated the process for reviewing the quality of examinations.

   D. Identified and evaluated the process for ensuring quality reviews were appropriately conducted.

III. Evaluated the appropriateness of the methodology for collecting and using results.

   A. Determined how results were being controlled and tracked over the 3-year period.

   B. Identified and evaluated the process for ensuring examination results were accurately captured.

   C. Identified the objectives of the study.

IV. Evaluated the effectiveness of the training process.

   A. Identified and evaluated the process for selecting classifiers.

   B. Identified and evaluated the process for training case builders, classifiers, and examiners.

   C. Identified and evaluated the process for assigning cases for case building, classification, and examination.

 

Appendix II

 

Major Contributors to This Report

 

Margaret E. Begg, Assistant Inspector General for Audit (Compliance and Enforcement Operations)

Carl L. Aley, Director

Amy L. Coleman, Audit Manager

Todd M. Anderson, Lead Auditor

Janis Zuika, Senior Auditor

Niurka M. Thomas, Auditor

 

Appendix III

 

Report Distribution List

 

Commissioner  C

Office of the Commissioner – Attn:  Chief of Staff  C

Deputy Commissioner for Services and Enforcement  SE

Deputy Commissioner, Small Business/Self-Employed Division  SE:S

Director, National Research Program  RAS:NRP

Director, Office of Program Evaluation and Risk Analysis  RAS:O

Director, Examination, Small Business/Self-Employed Division  SE:S:E

Chief Counsel  CC

National Taxpayer Advocate  TA

Director, Office of Legislative Affairs  CL:LA

Office of Internal Control  OS:CFO:CPIC:IC

Audit Liaisons:

Commissioner, Small Business/Self-Employed Division  SE:S

Director, Office of Research, Analysis, and Statistics  RAS

 

Appendix IV

 

Management’s Response to the Draft Report

 

The response was removed due to its size.  To see the response, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.



[1] The NRP is responsible for determining filing, payment, and reporting compliance by taxpayers for different types of taxes.

[2] The samples are intended to be representative of the individual taxpayer populations.

[3] An Area Office is a geographic organizational level used by IRS business units and offices to help their specific types of taxpayers understand and comply with tax laws and issues.

[4] The NRP is responsible for determining filing, payment, and reporting compliance by taxpayers for different types of taxes.

[5] The Earned Income Tax Credit is a tax credit for certain people who work and have income under established limits.

[6] The voluntary reporting rate is used for estimating reporting compliance and represents the total tax liability reported on timely filed returns in proportion to the total tax liability that should have been reported.

[7] A return is defined as being profitable to audit when the amount of the tax change exceeds the cost of conducting an examination at a level designated by the IRS for the purposes of the NRP study.

[8] Strata are derived from the division of the population into two or more segments.  Substrata are derived from further division of the strata.

[9] An Area Office is a geographic organizational level used by IRS business units and offices to help their specific types of taxpayers understand and comply with tax laws and issues.

[10] IRS Computing Centers support tax processing and information management through a data processing and telecommunications infrastructure.