Additional Efforts Could Further Improve the Execution of the National Research Program

 

January 2004

 

Reference Number: 2004-30-044

 

 

This report has cleared the Treasury Inspector General for Tax Administration disclosure review process and information determined to be restricted from public release has been redacted from this document.

 

January 29, 2004

 

 

 

DIRECTOR, OFFICE OF RESEARCH, ANALYSIS, AND STATISTICS

 

FROM: Gordon C. Milbourn III /s/ Gordon C. Milbourn III

Acting Deputy Inspector General for Audit

 

SUBJECT: Final Audit Report - Additional Efforts Could Further Improve the Execution of the National Research Program (Audit # 200330013)

 

This report presents the results of our review of the Internal Revenue Service's (IRS) National Research Program (NRP). The overall objective of this review was to determine whether the IRS Examination function effectively implemented the 2002 NRP initiative.

In October 2002, the IRS initiated the return examination phase of the NRP to gather the data it needs to measure taxpayer compliance and to support its strategic planning process. The goal of the study is to accurately measure reporting compliance while minimizing the burden on taxpayers during the process. The NRP is expected to provide essential information concerning taxpayer compliance that will allow the IRS to identify the tax returns with the highest compliance risks and reduce the burden on compliant taxpayers.

In summary, the NRP is important because in recent years the IRS has been selecting tax returns for examination using very dated information; the last time similar information was obtained was 1988. The prior compliance efforts were negatively perceived because the examinations were very intrusive: the examiner verified every item on the tax return. To overcome this problem, the IRS designed the NRP process to reduce the intrusiveness of the examinations by minimizing line-by-line verifications. Even with the design changes, the NRP process remains a very sensitive issue to the Congress and other external stakeholders such as the taxpayer representative community.

The NRP will be completed in three cycles and will cover both individual and business returns. The individual return cycle of the NRP includes face-to-face examinations of a sample of approximately 41,000 taxpayers. The IRS planned to complete the NRP cycle for individuals in time to update the return selection formulas for 2005. However, delays occurred in installing computer servers, upgrading computer software, and assigning cases. As a result, the 2002 NRP will not be able to provide data to update the IRS return selection formulas in 2005 as originally planned. Formulas for selecting tax returns for examination will now not be updated until 2006. In addition, as of September 30, 2003, many complex cases had not been started or were only recently started, which could further affect the timely completion of this Program.

Several operational issues could adversely affect the study results or the goal to minimize taxpayer burden. These issues include properly preparing the request for taxpayer information, performing the examination according to required procedures, and ensuring the examination quality review system can provide reliable information in the long term. The IRS has identified several of these issues and has already taken some corrective actions.

Our review of a sample of 81 cases was conducted during the third quarter of Fiscal Year (FY) 2003 and identified the need to reduce burden by improving the clarity of written information requests to taxpayers. For example, many of the information requests reviewed were overly general, used technical jargon, and/or requested items unnecessarily. Our review of the 81 sampled cases also showed that examination issues were not always properly addressed or information was not properly obtained. Specifically, a comprehensive evaluation of income was not consistently made and classified items were not always thoroughly verified. The IRS identified similar problems with communications and examination issues during its monitoring of NRP examination quality. Based on these results and the results we discussed with management, the Small Business/Self-Employed (SB/SE) Division has already initiated a series of corrective actions, including requiring each Area Office to prepare an action plan to improve case quality.

Additionally, the IRS' efforts to reduce the burden of NRP examinations have yielded some positive results. In the 81 cases reviewed, wages, interest, and dividends were generally validated before contact with the taxpayers, where applicable. However, on average, 83 percent of the total line items on Form 1040 Schedule A and 73 percent of the total line items on Form 1040 Schedule C still had to be validated during the face-to-face contact portion of the examination process. Sixty-nine (85 percent) of the 81 returns reviewed had a Schedule A and/or a Schedule C. Finally, although the IRS' monitoring of NRP examination quality has identified significant areas for improvement, the long-term reliability of this quality review system could be improved by incorporating random case selection techniques.

We recommended the Director, Office of Research, Analysis, and Statistics, perform a thorough post-evaluation of the 2002 NRP and ensure that similar problems are minimized for the next NRP cycle. We also recommended the Director, Compliance, SB/SE Division, revise classroom instruction regarding Information Document Request (IDR) preparation for future NRP cycles, perform visitations to selected areas to help ensure unstarted and recently started examinations are completed by the September 2004 deadline, and incorporate random sampling into the NRP examination quality review process. Finally, we recommended the Director, Office of Research, Analysis, and Statistics, develop interim milestones to help guide the examination phase of the next NRP cycle.

Management's Response: The IRS generally agreed with the recommendations presented and indicated that it has already implemented corrective actions to address some of the issues identified in our report. Specifically, management has already conducted evaluations of the NRP implementation and will incorporate lessons learned into planning for the next phase of the NRP. In addition, management will revise future NRP classroom training concerning IDR preparation and has already completed field visitations to four high-risk Area Offices. Finally, management will establish milestones to assist NRP and Operating Division management in allocating additional resources and providing assistance as needed on future NRPs. However, management did not agree with our recommendation to incorporate random sampling techniques into the NRP quality review process. Management noted that a subjective sample allowed them to ensure that some returns completed by each examiner are reviewed and to provide "real time" feedback to each examiner. Management also indicated that the Examination Quality Measurement System (EQMS) will be reviewing a random sample of NRP returns for quality and that a separate random sampling quality review process is unnecessary. Management's complete response to the draft report is included as Appendix IV.

Office of Audit Comment: We recognize the IRS' desire to maintain maximum flexibility in the NRP quality review process and provide feedback as quickly as possible to each examiner. We also concur that the EQMS, which uses random sampling, could be used to reliably evaluate NRP case quality nationwide, provided a sufficient sample of NRP cases is selected to meet this objective. Because this approach is consistent with the overall intent of our recommendation, we do not intend to elevate our disagreement to the Secretary of the Treasury. However, we will continue to closely monitor this issue in future reviews of the IRS' NRP activities.

Copies of this report are also being sent to IRS managers who are affected by the report recommendations. Please contact me at (202) 622-6510 if you have questions, or Richard Dagliolo, Acting Assistant Inspector General for Audit (Small Business and Corporate Programs), at (631) 654-6028.

 

Table of Contents

Background

The Quality of Examinations Is Being Actively Monitored

Delays Will Result in Examination Formulas Not Being Available Until January 2006

Recommendations 1 and 2:

Recommendation 3:

Taxpayers Were Sometimes Asked for Unnecessary Information

Recommendation 4:

Examination Issues Were Not Always Properly Addressed

Taxpayer Burden Reduction Efforts Have Yielded Some Positive Results

The Quality Review Process Did Not Use Random Selection Techniques

Recommendation 5:

Appendix I – Detailed Objective, Scope, and Methodology

Appendix II – Major Contributors to This Report

Appendix III – Report Distribution List

Appendix IV – Management's Response to the Draft Report

 

Background

The American tax system is based upon the premise that taxpayers voluntarily file their tax returns and properly report their income and expenses. The Internal Revenue Service (IRS) began comprehensive testing of voluntary taxpayer compliance of individuals in 1963 and continued this practice without interruption until 1988. On average, a sample of approximately 55,000 individual tax returns was examined about every 3 years to measure compliance. These periodic examinations, called Taxpayer Compliance Measurement Program (TCMP) surveys, provided the IRS with valuable information regarding national compliance trends. The IRS used this information for a variety of purposes, including the development of formulas used to select returns for examination.

Because TCMP surveys sought to comprehensively measure taxpayer compliance and identify potential tax law changes, the selected individuals were subjected to lengthy and detailed examinations. The burden created by these examinations eventually resulted in significant criticism of the TCMP process by outside stakeholders, such as the Congress and the taxpayer representative community. Although the IRS attempted to initiate a TCMP survey in 1994, it was eventually cancelled due, in part, to external concerns. It was not until May 2000 that the IRS began to again seriously plan for another TCMP-type review. These plans eventually called for the National Research Program (NRP) to be completed in three cycles and cover both individual and business taxpayers.

In October 2002, the IRS initiated the return examination phase of the NRP to resume the gathering of data it needs to effectively measure noncompliance and support its strategic planning process. The goal of the study is to gather and accurately measure reporting compliance while minimizing the burden on taxpayers during the process. The 2002 NRP will require the face-to-face examination of a sample of approximately 41,000 individual tax returns. Approximately 2,000 additional individual tax returns will be examined via correspondence.

In preparation for the NRP, the Small Business/Self-Employed (SB/SE) Division initially trained 3,598 Examination personnel. This training was completed by December 31, 2002. Another 492 Examination personnel were trained in 2003 to supplement the initial staffing as needed. As of the end of Fiscal Year (FY) 2003, there were 5,667 Internal Revenue Agents and 1,029 Tax Compliance Officers in SB/SE Division Compliance Area Offices. Overall, the SB/SE Division trained over half of its Examination personnel to be able to conduct NRP examinations.

This review was performed from March through September 2003 at the IRS National Headquarters in the Office of the Director, Research, Analysis, and Statistics, and in the Boston, Massachusetts; Detroit, Michigan; and Nashville, Tennessee, SB/SE Division Compliance Area Offices. The audit was conducted in accordance with Government Auditing Standards. Detailed information on our audit objective, scope, and methodology is presented in Appendix I. Major contributors to the report are listed in Appendix II.

The Quality of Examinations Is Being Actively Monitored

The SB/SE Division is actively monitoring NRP examinations. This monitoring is accomplished through periodic reviews of samples of NRP examination cases by teams of SB/SE Division managers. These reviews were performed at all 15 Area Offices nationwide in April 2003 and again in July 2003. The results of these reviews, along with recommendations for improvement, were shared with NRP examiners and managers; they were consolidated and evaluated nationwide for trends. Some of the trends identified nationally included the need for better communication with taxpayers and more comprehensive income probes during examinations.

In addition, oversight was provided by SB/SE Division Headquarters officials, who also visited all 15 Area Offices to provide guidance and assistance in implementing the NRP. Based on the results of these efforts and the results of our review, the SB/SE Division has already initiated corrective actions to address several identified problems in key areas of the NRP examination process, including requesting information from taxpayers and completing examinations as required.

Delays Will Result in Examination Formulas Not Being Available Until January 2006

The 2002 NRP will not be able to provide data to update the IRS examination return selection formulas in 2005 as originally planned. The 2002 NRP individual tax return study was initially slated to begin in October 2002 and be completed by the end of March 2004. The IRS had estimated this schedule would allow sufficient time to update the 2005 return selection formulas for examination.

Delays occurred in installing computer servers and upgrading software, and in assigning and starting cases. As a result, the IRS revised the NRP completion date to September 2004 and pushed back the update of the return selection formulas to January 2006. The delayed start of the NRP individual return examination cycle also required assigning alternative work to some examiners who were scheduled to start NRP cases early in FY 2003. NRP management informed us that the revised completion date was derived by factoring in the delayed initial rollout of inventory and the need to allocate resources for other priority inventory during the current NRP cycle.

Delays occurred in installing computer servers and upgrading software

A late start occurred in securing and installing the computer servers needed to support the control and examination of NRP cases. This delayed any significant rollout of NRP inventory of returns from approximately October 2002 to January 2003. NRP personnel informed us that the number of examiners actually assigned to the NRP was significantly higher than originally estimated, which required a mid-stream increase in server capacity. The IRS also allocated insufficient time between the development of the Report Generation Software (RGS) upgrades needed to support NRP examinations and the rollout of the NRP inventory. As a result, there was not enough time to train examiners in the use of the new software.

The next NRP cycle is being planned to examine income tax returns of partnerships and Subchapter S Corporations, for which a pilot study could be initiated as early as the first quarter of FY 2005. It is important that this next cycle is properly planned and computer requirements are in place to ensure schedules are met and compliance results are timely obtained.

Delays occurred in assigning and starting NRP cases

IRS records show that as of September 30, 2003, only 12,654 of the planned 41,046 face-to-face NRP examinations were either closed or in the process of being closed. Another 28,031 NRP cases had been assigned for examination; however, 7,439 (27 percent) had not yet been started. Of the cases not started, 5,068 had been waiting to be started for over 120 days. Over one-half of the 5,068 cases waiting to be started were located in 4 of the 15 national SB/SE Division Compliance Area Offices. The remaining 361 cases had not yet been assigned (see Figure 1).

Figure 1: NRP Face-to-Face Examination Inventory Levels

Figure 1 was removed due to its size. To see Figure 1, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.

Among the 5,068 cases assigned but waiting to be started for over 120 days, 3,057 (60 percent) contained returns with taxpayer income or gross receipts from business income of $100,000 or greater. Examinations of higher-income returns tend to be more complex and could take longer to complete than examinations of less-complex returns. Therefore, these examinations need to be started soon to meet the completion schedule.

Similarly, 8,781 (43 percent) of the 20,592 cases started as of September 30, 2003, had been in process less than 90 days. Therefore, while the SB/SE Division has made positive progress in processing NRP face-to-face examinations, a significant number of cases, many of them more complex, had yet to be started or had only recently been started at the time of our review. It is critical that progress on the remaining NRP examinations stay on track to ensure selection formulas can be timely developed and to avoid delaying the start of the next NRP cycle.
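
The inventory figures above can be reconciled with simple arithmetic. The following short sketch (written in Python purely for illustration; the variable names are ours, not the IRS's) shows how the counts reported as of September 30, 2003, fit together.

    # Illustrative reconciliation of the NRP face-to-face inventory figures
    # reported as of September 30, 2003. Variable names are hypothetical.
    planned_exams = 41046             # planned face-to-face NRP examinations
    closed_or_closing = 12654         # closed or in the process of being closed
    assigned = 28031                  # assigned for examination
    unassigned = 361                  # not yet assigned

    assert closed_or_closing + assigned + unassigned == planned_exams

    not_started = 7439                # assigned but not yet started
    started = assigned - not_started  # 20,592 cases started
    recent = 8781                     # started, but in process less than 90 days

    print(f"Assigned cases not started: {not_started / assigned:.0%}")  # about 27 percent
    print(f"Started cases under 90 days: {recent / started:.0%}")       # about 43 percent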

SB/SE Division Headquarters monitors NRP inventory levels through the analysis of monthly Audit Information Management System (AIMS) data reports. However, the IRS has not developed any interim inventory milestones to help it gauge its progress on the 18-month effort.

Delays occurred in working cases

Even when cases were assigned, examiners did not always provide priority treatment to them. In 17 (21 percent) of the 81 sample cases examined during the third quarter of FY 2003, we identified periods of inactivity of 30 days or more. In these cases, examiners were not actively working the cases, usually because they were working non-NRP inventory or had been assigned to other temporary work details. In addition, 8 of the 17 cases had delays in processing of 45 days or more. Gaps in case processing delay final case resolution and can result in increased taxpayer burden. The performance of NRP examinations is a major operational priority. The IRS is relying on the data from the NRP to timely update its examination selection process and reduce the burden on compliant taxpayers.

Recommendations

1. The Director, Office of Research, Analysis, and Statistics, should perform a thorough post-evaluation of the planning and scheduling of key deliverables of the 2002 NRP. Planning efforts for the next NRP should ensure that similar problems are minimized.

Management's Response: The NRP staff has conducted post-evaluative conferences and focus groups with NRP examiners and managers to discuss what went right and wrong with the reporting compliance study. The results from the lessons learned and focus groups are being incorporated into the planning for the next phase of the NRP, as appropriate.

2. The Director, Compliance, SB/SE Division, should perform field visitations focused on Area Offices most at risk of not meeting deadlines.

Management's Response: In October 2003, the SB/SE Division identified four Area Offices that had the highest percentage of unstarted NRP returns and completed two inventory visitations in November and two in December for these Area Offices. The SB/SE Division and the NRP staff will continue to monitor Area Office operations and provide assistance as necessary.

3. The Director, Office of Research, Analysis, and Statistics, in consultation with the appropriate Operating Divisions, should develop interim milestones to help guide the examination phase of the next NRP cycle.

Management's Response: The NRP staff, in close cooperation with the appropriate Operating Divisions, will include milestones in the NRP project plan for completing the examination phase of their next reporting compliance study. These milestones will be used only as guidelines to help the NRP and Operating Division staffs determine where additional resources and assistance may be needed. Any milestones developed will be used as an information tool for internal use only and not as a metric to assess Area Offices.

Office of Audit Comment: Management's corrective action is consistent with the intent of the recommendation.

Taxpayers Were Sometimes Asked for Unnecessary Information

Improving the clarity of the IRS' written information requests provided to taxpayers could reduce the burden imposed by the NRP process. Examiners use Information Document Requests (IDRs) to request documentation for expenses such as receipts and mileage logs. The IDRs in 32 (40 percent) of our 81 reviewed cases were overly general, used technical jargon, requested unnecessary items, or requested items that the IRS already had. For example, one IDR requested data related to interest and dividend income although the taxpayer's return and all of the IRS' information sources clearly indicated the taxpayer received no dividend or interest income. Clear and concise information requests are critical to minimizing taxpayer confusion and speeding the examination process.

The SB/SE Division's Reporting Compliance function identified similar problems with IDRs during its monitoring of NRP examination quality. Based on these results, in conjunction with the results we provided, the IRS has already initiated some corrective actions. On May 28, 2003, the SB/SE Division Reporting Compliance function issued the National Quality review results to examiners outlining the NRP quality areas most in need of improvement nationwide. The quality review results provided reinforcement of proper examination techniques, including proper IDR preparation. In addition, on June 12, 2003, the SB/SE Division distributed a self-study workshop regarding the preparation of IDRs to NRP Territory Office managers.

One possible cause for the IDR problems is that the 3-day NRP training class that all examiners were required to attend did not address the importance of ensuring each IDR is specifically tailored to the taxpayer under examination and avoids technical jargon. IDRs that request unnecessary items are counterproductive to the IRS' efforts to reduce the taxpayer burden of this study.

Recommendation

4. The Director, Compliance, SB/SE Division, should revise classroom instruction regarding IDR preparation for future NRPs.

Management's Response: Classroom instruction for the next phase of the NRP will include a specific module on preparing IDRs.

Examination Issues Were Not Always Properly Addressed

Our review of a sample of 81 NRP examinations identified 3 areas in which issues were not always addressed or information was not properly obtained. First, a comprehensive evaluation of income was not consistently made.

• A preliminary assessment of the validity of the income reported on a tax return, called a Cash T analysis, was not made in 24 (30 percent) of the 81 cases reviewed.

• Routine income evaluation steps, such as the analysis of bank statements for unreported income and queries to determine the disposition of assets from a business liquidated during the examination year, were not performed in 6 (14 percent) of the 42 examined cases involving taxpayers with business income.

NRP guidelines require a thorough evaluation of income from all sources, including an analysis of bank records for business taxpayers. Ensuring examinations include a thorough income evaluation has been a longstanding problem for the SB/SE Division and may be attributable to a number of factors, including a fear by examiners of violating the provisions of the IRS Restructuring and Reform Act of 1998 (RRA 98). The RRA 98 added a requirement that examiners determine there is a reasonable indication that unreported income exists before they can use financial status or economic reality examination techniques. In addition, taxpayers can file claims that an employee is harassing them. An employee violation could lead to reprimand, disciplinary action, or removal.

Second, classified items were not always thoroughly verified for accuracy. In 6 (7 percent) of the 81 cases, classified deductions and reported income were accepted with little or no documented substantiating evidence. In one case, all three of the expenses classified on a Schedule C business return were accepted despite the lack of any documentary evidence.

Finally, information gathered regarding independent contractor status was not accurately recorded on the NRP questionnaire in 7 (21 percent) of the 34 applicable cases. As part of the NRP process, the IRS is gathering detailed information on independent contractor status issues. This information is documented in the NRP questionnaire, which is completed as part of the processing of all NRP cases. Accurate verification of items and information gathering is critical to ensuring the reliability of the NRP results.

The SB/SE Division identified similar problems with income probes and classified item evaluation during its monitoring of NRP examination quality. Based on these results and the results we discussed with management officials, the SB/SE Division has already initiated the following corrective actions relative to income probes, classified issue evaluation, nonemployee compensation, and independent contractor status determinations:

• On May 28, 2003, and September 2, 2003, the SB/SE Division's Reporting Compliance function issued the National Quality review results to examiners outlining the NRP quality areas most in need of improvement nationwide. For example, these memoranda highlighted common errors related to income probes and provided reinforcement of the proper examination techniques. The results of these reviews were also discussed in conference calls held with SB/SE Division Area Office managers.

• On June 25, 2003, all Area NRP Territory Office manager Coordinators were reminded about the need to ensure information gathered about nonemployee compensation/independent contractor status is accurately recorded on the NRP questionnaire.

• On July 24, 2003, the Deputy Director, Compliance Policy, issued a memorandum addressing the quality of NRP examinations, including a discussion of the NRP questionnaire.

• On September 12, 2003, the Deputy Director, Compliance Policy, directed each Area Office to prepare an NRP case quality action plan. The plan is required to list additional actions that will be taken at the Area level to improve case quality and should be tailored to address concerns identified during the Area Office quality reviews.

We believe the SB/SE Division's actions should adequately address the concerns we identified in this area, so we are making no recommendations.

Taxpayer Burden Reduction Efforts Have Yielded Some Positive Results

The NRP process is a very sensitive issue to external stakeholders such as the Congress and the taxpayer representative community. Prior compliance efforts were negatively perceived because the examinations were very intrusive: the examiner verified every item on the tax return. When planning for the 2002 NRP, the IRS announced and publicized the new effort and explained that examiners would use data from various sources, rather than verify with the taxpayer each line item on the return.

The IRS' efforts to reduce the burden of NRP examinations have yielded some positive results. In the 81 cases reviewed, wages, interest, and dividends were generally validated before contact with the taxpayers, where applicable. However, on average, 83 percent of the total line items on Form 1040 Schedule A and 73 percent of the total line items on Form 1040 Schedule C still had to be validated during the face-to-face contact portion of the examination process. Sixty-nine (85 percent) of the 81 returns had a Schedule A and/or a Schedule C.

Although the IRS attempted to reduce the burden on taxpayers, in reality, numerous income and expense items still needed to be verified with the taxpayer because no other source was available to verify the item. Without such verification, the NRP's measurement of compliance would not be accurate.

The Quality Review Process Did Not Use Random Selection Techniques

The NRP process includes a quality review program that analyzes various examination standards to evaluate the quality of the examinations. While the SB/SE Division's quality review efforts have thus far identified a number of significant areas for improvement, the reliability of data gathered through this effort could be enhanced. Specifically, NRP examinations subject to quality review are not randomly selected, which could eventually affect the long-term reliability of the results. According to NRP Territory Office managers, they select the cases for quality review based on a number of criteria, including the desire to sample from as many different examiners and taxpayer income levels as possible.

NRP project management chose the present methodology to expedite the implementation of the quality review process. A random sample is one that seeks to represent, as closely as possible, the population from which it is drawn. Random sample selection techniques require that every item in the population have an equal chance of being selected. When random samples are not used, the reliability of the results may not be adequate because the results may not be representative of the program. Reliable management information is critical to effective decision making.
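
To make the distinction concrete, the short sketch below (written in Python purely for illustration; the case identifiers and sample size are hypothetical) contrasts random selection, in which every case has an equal chance of being chosen, with a judgmental pick of the kind described above.

    import random

    # Hypothetical inventory of NRP cases awaiting quality review.
    case_inventory = [f"NRP-{n:05d}" for n in range(1, 501)]

    # Random selection: each case has an equal chance of being selected,
    # so results from the sample can be projected to the full inventory.
    random_sample = random.sample(case_inventory, k=30)

    # Judgmental selection: a reviewer chooses cases to cover particular
    # examiners or income levels; the results describe only the cases
    # chosen and cannot be reliably projected to the population.
    judgmental_sample = case_inventory[:30]

    print(len(random_sample), len(judgmental_sample))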

Recommendation

5. The Director, Compliance, SB/SE Division, should develop procedures that require random sampling techniques for the quality review process.

Management's Response: Management did not agree with our recommendation to incorporate random sampling techniques into the NRP quality review process. Management noted that the NRP quality review process was designed to identify problems and provide feedback to each examiner. Using a subjective sample assures that some returns completed by each examiner will be reviewed. Management also indicated that the Examination Quality Measurement System (EQMS) would be reviewing a random sample of NRP returns for quality and that a separate random sampling quality review process is unnecessary.

Office of Audit Comment: We recognize the IRS' desire to maintain maximum flexibility in the NRP quality review process and provide feedback as quickly as possible to each examiner. We also concur that the EQMS, which uses random sampling, could be used to reliably evaluate NRP case quality nationwide, provided a sufficient sample of NRP cases is selected to meet this objective. This approach is consistent with the overall intent of our recommendation; however, we will continue to closely monitor this issue in future reviews of the IRS' NRP activities.

 

Appendix I

 

Detailed Objective, Scope, and Methodology

 

The objective of this review was to determine whether the Internal Revenue Service (IRS) Examination function effectively implemented the 2002 National Research Program (NRP) initiative. This review was accomplished by conducting interviews with responsible management officials and analyzing NRP examination case files. We also examined instructions issued pertaining to the examination of NRP cases and tools developed to assist examiners in conducting NRP examinations. Specifically, we:

I. Determined whether the resources of the Boston, Massachusetts, and Nashville, Tennessee, Area Offices were effectively organized to meet the goals of the NRP.

A. Examined field-level procedures for assigning NRP inventory to examiners.

B. Interviewed responsible managers to determine whether IRS goals relating to the completion of the NRP were reasonable and attainable.

II. Determined whether field-level controls were effective to ensure NRP examinations yield reliable data and minimize taxpayer burden.

A. Interviewed responsible management officials in the Boston, Massachusetts; Detroit, Michigan; and Nashville, Tennessee, Area Offices to identify field-level concerns regarding NRP examination processing. We selected the Boston and Nashville Area Offices randomly. The Detroit Area Office was selected to facilitate interviews of the NRP Champion (area-level lead executive) and his staff.

B. Examined a sample of 81 NRP cases selected from the Boston and Nashville Area Offices and determined whether NRP examiners performed required income probe procedures and gathered sufficient evidence to support their conclusions regarding the accuracy of filed returns. To accomplish our objective, we relied on a judgmental sampling methodology. We used this sampling methodology to allow for the on-site review of in-process NRP examinations within the constraints of our available staffing. The cases were chosen from the inventory of NRP examinations at the selected offices during the third quarter of Fiscal Year 2003. The individual return cycle of the NRP comprises approximately 41,000 face-to-face examinations.

C. Assessed the overall timeliness of casework for the sample of NRP cases.

III. Evaluated the extent of all levels of IRS managerial supervision and involvement in the NRP examination process.

A. Determined whether effective procedures were in place for the conduct of reviews of examiner case actions by Area Office-level review teams.

B. Determined whether controls were in place to incorporate managerial feedback into future case activity.

 

Appendix II

 

Major Contributors to This Report

 

Richard J. Dagliolo, Acting Assistant Inspector General for Audit (Small Business and Corporate Programs)

Parker F. Pearson, Director

Philip Shropshire, Director

Anthony J. Choma, Audit Manager

Joseph F. Cooney, Senior Auditor

Joseph P. Snyder, Senior Auditor

Mildred Rita Woody, Senior Auditor

Seth Siegel, Auditor

 

Appendix III

 

Report Distribution List

 

Commissioner  C

Office of the Commissioner – Attn: Chief of Staff  C

Deputy Commissioner for Services and Enforcement  SE

Commissioner, Small Business/Self-Employed Division  SE:S

Commissioner, Wage and Investment Division  SE:W

Director, National Research Program  RAS:NRP

Director, Compliance, Small Business/Self-Employed Division  SE:S:C

Chief Counsel  CC

National Taxpayer Advocate  TA

Director, Office of Legislative Affairs  CL:LA

Director, Office of Program Evaluation and Risk Analysis  RAS:O

Office of Management Controls  OS:CFO:AR:M

Audit Liaisons:

    Commissioner, Small Business/Self-Employed Division  SE:S

    Commissioner, Wage and Investment Division  SE:W

    Director, Office of Research, Analysis, and Statistics  RAS

 

Appendix IV

 

Managementís Response to the Draft Report

 

The response was removed due to its size. To see the response, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.