TREASURY INSPECTOR GENERAL FOR TAX ADMINISTRATION

 

 

The Automated Collection System Gap Case Test Initiative Was Not Effectively Conducted

 

 

 

August 13, 2008

 

Reference Number:  2008-30-150

 

 

This report has cleared the Treasury Inspector General for Tax Administration disclosure review process, and information determined to be restricted from public release has been redacted from this document.

 

Phone Number  |  202-622-6500

Email Address |  inquiries@tigta.treas.gov

Web Site      |  http://www.tigta.gov

 

August 13, 2008

 

 

MEMORANDUM FOR COMMISSIONER, SMALL BUSINESS/SELF-EMPLOYED DIVISION

 

FROM:                            Michael R. Phillips /s/ Michael R. Phillips

                                         Deputy Inspector General for Audit

 

SUBJECT:                    Final Audit Report – The Automated Collection System Gap Case Test Initiative Was Not Effectively Conducted (Audit # 200830004)

 

This report presents the results of our review of the Internal Revenue Service (IRS) Automated Collection System[1] (ACS) Gap Case Test (Gap Test) Initiative.  The overall objectives of this review were to determine whether the IRS effectively performed the ACS Gap Test and whether Gap Test results were used to implement or expand the Gap Test program.  This audit was included in the Treasury Inspector General for Tax Administration Fiscal Year 2008 Annual Audit Plan. 

Impact on the Taxpayer

The Gap Test Initiative was intended to assess the impact of providing selected ACS personnel with the authority to perform additional research on balance-due accounts above their normal authority level rather than placing these accounts in the Queue,[2] where a significant number might never be worked or their balances due never collected because of limited resources.  At the conclusion of the Gap Test, management decided not to establish a new program.  However, they did not provide sufficient oversight or collect sufficient information to accurately assess the effectiveness of the Gap Test program.  Failure to collect taxes that are due results in unfair treatment of taxpayers who file accurate tax returns and pay their taxes.

Synopsis

In October 2006, the Small Business/Self-Employed Division Campus Filing and Payment Compliance function began a 6-month Gap Test.  The Gap Test was intended to assess the impact of providing selected ACS personnel with the authority to perform additional research on balance-due accounts above their normal authority level that would otherwise have been sent to the Queue for possible follow-up by revenue officers.[3]  In Fiscal Years 2001 through 2007, the IRS removed from the Queue about 7.6 million Taxpayer Delinquent Accounts,[4] with balance-due amounts totaling almost $31.2 billion, because the cases were potentially less productive than other available inventory. 

At the conclusion of the Gap Test, management decided not to establish a separate program to resolve specific balance-due accounts because they believed that ACS personnel had spent too much time performing additional research.  However, management did not collect sufficient information from the Gap Test to accurately assess the effectiveness of the Gap Test.  For example, management did not capture the amount of funds collected during the Gap Test.  We reviewed a random sample of 100 completed Gap Test cases and identified approximately $315,000 collected from taxpayers as a result of the additional research performed during the Gap Test.  In addition, tax examiners issued 52 liens and/or levy notices for these balance-due accounts.  Because IRS management did not collect this kind of information during the Gap Test, they did not consider it when making their decision to not establish the Gap Test program. 

In addition, management did not assess the Gap Test to determine whether they could make changes to reduce the number of research steps or the time needed to complete them.  For example, they did not conduct any analysis to determine whether all of the research performed by the tax examiners was actually needed or identify steps that could eliminate or reduce the time needed to perform the research.  Also, management did not ensure that tax examiners had the correct skill levels to perform additional research to complete the Gap Test and did not ensure that examiners had all the tools needed to complete necessary research. 

Finally, management did not establish performance measures to evaluate the Gap Test results.  Without defining benchmarks, milestones, or specific performance goals, IRS management was not adequately prepared to 1) provide sufficient oversight during the testing, 2) make appropriate changes as the Gap Test progressed, or 3) evaluate the results of the Gap Test program.  The weaknesses in management oversight contributed to other problems, including unnecessarily long delays in returning the unworked cases to the Collection function inventory and inadequate documentation of the actions tax examiners took to resolve the cases.  These conditions further compromised the data on which management relied to make their decision. 

The Gap Test was an attempt to reduce the number of balance-due accounts that are sent to the Queue and was one of several initiatives that the Collection function considered to identify potential cases for the campuses.  One such initiative, developed by the Corporate Approach to Collection Inventory (CACI) group, recommended the creation of a “Hybrid Pilot Program” that uses ACS technology to test alternate inventory streams by using the combined collection skill sets of ACS employees and revenue officers.  The Hybrid Pilot Program was still in process during our review.

Recommendations

We recommended that the Director, Campus Compliance Services, ensure that 1) benchmarks, parameters, and other performance standards are established to effectively measure the success or failure of cases meeting the Gap Test criteria that have been included in the CACI Hybrid Pilot Program, 2) cases meeting the Gap Test criteria are measured in terms of benefits and results, such as revenue collected, as part of the CACI Hybrid Pilot Program, and 3) assessments of processing times are made for cases meeting Gap Test criteria that are included in the CACI Hybrid Pilot Program.  If improvements are identified, the Director should provide timely quality review feedback so the improvements can be implemented during the testing.

Based on the results of Recommendations 1 through 3, the Director, Campus Compliance Services, should determine whether the decision to not expand the Gap Test program was warranted or whether additional testing is necessary.  If more testing is needed, the Director should establish proper inventory control procedures to reduce the number of cases not being worked.  If ACS Support function personnel are not granted additional authority or other options do not arise from the Hybrid Pilot Program to work Gap Test criteria cases, the Director should consider elevating this issue to the appropriate Council overseeing IRS Private Debt Collection[5] activity to determine whether cases that meet Gap Test criteria can be sent to Private Debt Collection program contractors. 

Response

IRS management agreed with the findings and recommendations.  They plan to track cases meeting the Gap Test criteria as part of the CACI Hybrid Pilot Program.  Management also extended the Gap Test from 3 months to 6 months to gather more information, provide additional time for employees to learn the new processes, and allow the cases to progress to conclusion.  Further, management agreed to 1) measure Gap Test cases in terms of benefits and results to the extent that these cases are included in the CACI Hybrid Pilot Program, and 2) capture the assessment of processing time related to the Hybrid initiative.  Because the more extensive CACI Hybrid initiative already includes key objectives that were part of the Gap Test, further testing of Gap Test cases is not planned.  In addition, management agreed to consider elevating unresolved inventory for Hybrid initiative cases meeting Gap Test criteria to the Private Debt Collection Program office as another source of work for its consideration.  Management’s complete response to the draft report is included as Appendix V.

Copies of this report are also being sent to the IRS managers affected by the report recommendations.  If you have questions, please contact me at (202) 622-6510 or Margaret E. Begg, Acting Assistant Inspector General for Audit (Small Business and Corporate Programs), at (202) 622-8510. 

 

 

Table of Contents

 

Background

Results of Review

Management Did Not Fully Plan the Gap Test

Recommendation 1:

Management Did Not Provide Sufficient Oversight While the Gap Test Was Being Conducted

Recommendation 2:

Recommendation 3:

Management Did Not Have Sufficient Information to Make Decisions

Recommendation 4:

Appendices

Appendix I – Detailed Objectives, Scope, and Methodology

Appendix II – Major Contributors to This Report

Appendix III – Report Distribution List

Appendix IV – Outcome Measure

Appendix V – Management’s Response to the Draft Report

 

 

Abbreviations

 

ACS – Automated Collection System

CACI – Corporate Approach to Collection Inventory

IRS – Internal Revenue Service

 

 

Background

 

The Automated Collection System[6] (ACS) Gap Case Test (Gap Test) was initiated by the Small Business/Self-Employed Division Campus Filing and Payment Compliance function to determine the impact of having ACS Support function staff perform additional research steps to locate taxpayers and resolve balance-due accounts above their normal authority level.  These accounts had already gone through the normal notice routine and would have been sent to the Queue[7] for possible follow-up by revenue officers.[8]  According to Collection Activity Reports, the ACS Support function sent approximately 800,000 accounts of varying balance-due amounts (totaling $7.8 billion) to the Queue during Fiscal Year 2006.  Although numerous factors determine whether a case is assigned from the Queue, these cases are unlikely to be selected and worked by revenue officers. 

The Small Business/Self-Employed Division ACS Support function in Cincinnati, Ohio, was chosen to work the cases for the Gap Test.  The Gap Test team consisted of six tax examiners, a lead examiner, and a project coordinator and was conducted from October 2006 through March 2007.  It required an Internal Revenue Manual deviation that allowed the Cincinnati ACS Support function to resolve balance-due accounts above its normal authority level and to close cases as Currently Not Collectible-Unable to Locate or Currently Not Collectible-Unable to Contact after performing additional research steps.  The research to locate each Gap Test taxpayer consisted of searching the Internal Revenue Service (IRS) computer systems for taxpayer information, performing telephone checks, sending written communications to the taxpayer’s last known address, and conducting additional electronic research such as searches for credit report and real property data. 

The Campus Filing and Payment Compliance function 1) developed an action plan and procedures for working the cases, 2) ensured that original Gap Test team members received training, and 3) developed a checksheet on which the tax examiners would record specific actions taken.  Periodically, the Gap Test project coordinator tallied the information from the checksheets and forwarded the totals to the National Headquarters in New Carrollton, Maryland (Program Office).  At the end of the Gap Test, the recorded information was to be analyzed by Collection Policy and ACS Support Operations functions to determine whether a separate program should be established.

The Gap Test is one of several initiatives that the Collection function considered to identify potential cases for the campuses.  In 2006, the Collection Governance Council tasked the Corporate Approach to Collection Inventory (CACI) group with reviewing their case assignment practices from an enterprise-wide perspective.  The CACI team included members from Headquarters offices, campuses, and field offices.  The team used its combined experiences and expertise to develop multiple hypotheses.  During early 2007, the CACI team translated these hypotheses into detailed recommendations and presented them to the Collection Governance Council for consideration and approval.  Several of the recommendations dealt with creating a Hybrid Pilot Program that uses ACS technology to test alternate inventory streams by using combined collection skill sets of ACS employees and revenue officers to provide better corporate coverage and to help reduce the tax gap. 

The volume of overall test cases for the Hybrid Pilot Program will be relatively small.  The case assignment and routing of the inventories will use ACS or Entity systems to minimize manual case handling.  The testing period is planned to run for several months to ensure that valid and complete data are gathered.  Once the hybrid testing is completed, an assessment of the results will be made, and the routing of cases might change and could be programmed into the Inventory Delivery System for future case routing.  All future changes, including measures and details, will be communicated through multiple channels.

This review was performed at the ACS Support function in Cincinnati and the Program Office of the Campus Filing and Payment Compliance function in New Carrollton during the period August 2007 through February 2008.  We conducted this performance audit in accordance with generally accepted government auditing standards.  Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives.  We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.  Detailed information on our audit objectives, scope, and methodology is presented in Appendix I.  Major contributors to the report are listed in Appendix II.

 

 

Results of Review

 

The Gap Test was an attempt to reduce the number of balance-due accounts that are sent to the Queue for possible follow-up by revenue officers.  Although some of the cases in the Queue might be assigned to be worked, a significant number will never be worked and their balances due never collected.  In Fiscal Years 2001 through 2007, the IRS removed from the Queue about 7.6 million Taxpayer Delinquent Accounts,[9] with balance-due amounts totaling almost $31.2 billion.  These cases were removed from Collection Field function[10] inventory because they were potentially less productive than other available inventory.  The Collection Field function is unable to work all of the existing accounts in the Queue with current staffing, and the number of new Taxpayer Delinquent Accounts is outpacing closures.  The number of taxpayers with unpaid accounts in the Queue and the amounts owed on these accounts increased to 10-year highs during Fiscal Year 2007.

At the conclusion of the Gap Test, management decided not to establish a new program that would expand the number of cases worked by the ACS Support function staff.  However, our review showed that management did not fully plan, implement, or have sufficient information to assess the results of the Gap Test.  Specifically, management did not:

  • Establish performance measures or goals to measure success.
  • Ensure that personnel assigned to the Gap Test had the appropriate skills and training.
  • Provide assigned personnel timely access to credit check services needed to research taxpayers’ accounts.
  • Collect and consider information on the benefits and costs of Gap Test cases.
  • Ensure that assigned personnel were properly documenting the Gap Test case files.
  • Conduct interim quality reviews to determine whether improvements could be made to the processes used during the Gap Test.
  • Return the Gap Test cases that were not worked during the Gap Test to the Queue in a timely manner.

As a result, management’s decision to not expand the authority and research conducted by the ACS Support function might have been based on inaccurate and incomplete information.  When the IRS does not sufficiently evaluate new alternatives for collecting unpaid taxes, it might lose an opportunity to increase revenues.

Management Did Not Fully Plan the Gap Test

Proper project planning is important to ensure that a project meets its goals and that management obtains complete and reliable information to make decisions.  When planning a project, management should identify the project’s objectives and goals and ensure that sufficient resources are available. 

However, when planning for the Gap Test, management did not create benchmarks or goals by which to measure the Gap Test’s success, failure, or effectiveness.  By not defining the benchmarks, milestones, or specific performance goals, IRS management was not adequately prepared to 1) provide sufficient oversight during the testing, 2) make appropriate changes as the Gap Test progressed, or 3) evaluate the results of the Gap Test program.  Without these measurements, management did not have objective criteria with which to assess whether the resources used in conducting the Gap Test outweighed any benefits that might have been received.  For example, management did not create specific goals for reducing time expended, collecting revenues, or closing cases. 

As part of their research steps, the tax examiners needed to evaluate each Gap Test taxpayer’s financial condition.  To do so, the examiners needed access to credit check services similar to those used in private industry.  This requirement was known before the Gap Test started, and access to the services should have been provided to the tax examiners to ensure that they had all necessary tools.  However, management did not secure access to the credit check services until February 2007, 1 month before the end of the Gap Test period.  As a result of this delay, a large percentage of cases could not be completed within the 6-month Gap Test period. 

In addition, management did not adequately plan for potential employee reassignments or ensure that employees had the tools and skills necessary to do their work.  Initially, the tax examiners were trained to perform research and prepare documentation.  As the Gap Test progressed, some tax examiners were reassigned to work on other priorities, while other examiners were assigned to work on the CACI Hybrid Pilot Program as the Gap Test concluded.  The CACI team identified a need for separate treatment of business and individual taxpayers who voluntarily filed their returns.  Using less costly ACS resources would allow these cases to receive priority focus, and the unresolved cases would be more thoroughly developed to identify the most appropriate resolution.  In addition, if authority levels matched those of the Collection Field function, more accounts that would otherwise remain in the Queue could be resolved. 

In addition, because of training inadequacies, some Gap Test tax examiners spent more time performing research on the Gap Test cases than others, and some examiners performed too much research.  For example, some examiners were not adequately trained to use specific research tools, such as Accurint.[11]  It took tax examiners an average of 3.1 hours (ranging from 1 hour to 4 hours) to perform the research.  However, management never performed an analysis to determine why there was such a variance in the amount of time required to work a case.

Recommendation

Recommendation 1:  The Director, Campus Compliance Services, should ensure that benchmarks, parameters, and other performance standards are established to effectively measure the success or failure of cases meeting the Gap Test criteria that have been included in the CACI Hybrid Pilot Program.

Management’s Response:  IRS management agreed with the recommendation.  They plan to track cases meeting Gap Test criteria as part of their tracking of the CACI Hybrid project.  The IRS encountered and documented unforeseen challenges during the Gap Test, including systems access issues, the extended learning curve for additional processing, shifting resources, and case complexity.  Management recognized these issues and extended the Gap Test from 3 months to 6 months to gather more information, provide additional time for employees to learn the new processes, and allow the cases to progress to conclusion.

Management Did Not Provide Sufficient Oversight While the Gap Test Was Being Conducted

The Internal Revenue Manual requires Small Business/Self-Employed Division management to provide the necessary oversight, guidelines, and analysis of programs to identify trends for collection processes.  Our review showed that the Program Office did not provide sufficient guidance, oversight, and feedback to local management during the Gap Test to ensure that it was progressing efficiently and collecting useful information.  Several deficiencies that occurred during the Gap Test could have been corrected if management had provided sufficient oversight and quality reviews.  The Program Office reported that local management was very involved at the start of the Gap Test but that their involvement decreased as the Gap Test progressed.  The Program Office was unaware of the impact of the decrease in local management’s involvement until the conclusion of the Gap Test because it was not regularly communicating with local management on the progress and status of the Gap Test.

Collecting data

For the Gap Test to be effective, management should have ensured that the tax examiners collected information they could use to assess the performance of the Gap Test.  The tax examiners captured information related to the process of working the cases but did not collect sufficient information about the benefits of the Gap Test program.  For example, no information was obtained on revenues received during the Gap Test.  We reviewed a random sample of 100 completed Gap Test cases and identified approximately $315,000 collected from taxpayers as a result of the additional research performed during the Gap Test.  In addition, tax examiners issued 52 liens and/or levy notices for these 100 cases, which will protect the Federal Government’s interest.  Because this information was not collected during the Gap Test, the IRS did not consider it when it decided to not expand the authority for cases worked by ACS Support function staff. 

Documenting case files

The tax examiners were authorized to close a case as Currently Not Collectible if they received no response after performing additional research to locate the taxpayer.  This research included performing searches to locate telephone numbers of the taxpayer’s neighbors, sending written communications to the taxpayer’s last known address, and conducting additional research to find real property and credit report information.  The tax examiners were to place calls to new telephone numbers, send letters to new addresses, and initiate liens and/or issue levy notices when appropriate.  In addition, they were to document the case checksheet and the Desktop Integration system[12] with the results of their research, any actions taken, and how they were disposing of the case.

Tax examiners did not properly document research performed or take appropriate actions to resolve the balance-due accounts.  In our random sample of 100 completed Gap Test cases, 22 were not adequately documented, and 7 had an incorrect Currently Not Collectible closing code.  We also noted that the Gap Test project coordinator corrected another 16 cases that had been closed with incorrect closing codes.  Although the Gap Test tax examiners received training prior to conducting the Gap Test, differences between the closing codes for Currently Not Collectible-Unable to Locate and Currently Not Collectible-Unable to Contact were unclear to the examiners.

The tax examiners were required to record their actions in the Desktop Integration system.  However, they did not always input to this system the telephone numbers of the taxpayers and/or their neighbors and relatives or the mailing addresses to which correspondence had been sent.  Because this information was not always documented, we could not determine whether the tax examiners sent correspondence to the most current addresses or used the most recent telephone numbers.  If actions are not properly documented, work could be duplicated if the cases are reactivated, and processing issues and trends might not be readily identified. 

Conducting interim quality reviews

Although the Program Office had some communication with the Gap Test project coordinator throughout the Gap Test, it did not perform a quality review of the Gap Test until May 2007 (1 month after the Gap Test had ended), and it did not provide feedback to the project coordinator.  Further, it selected 25 cases to review but completed reviews of only 12.  Because the reviews were not conducted during the Gap Test, the Gap Test team did not have an opportunity to make changes based on the results.  For example, the Gap Test team could have tried to find ways to reduce the number of research steps, or the time needed to complete them, if the quality reviews had been performed during the Gap Test.

Returning cases to the Queue

For the Gap Test, the ACS call sites were instructed to place cases meeting Gap Test criteria into a special inventory category.  Normally, these cases would have been sent to the Queue for follow-up by revenue officers.  When the cases were placed in the Gap Test inventory, they became unavailable to revenue officers.  Only Gap Test personnel could work these cases.  The tax examiners accessed the Gap Test inventory by using the “next case” processing tool in the ACS to obtain the next case to work. 

Between October 2006 and March 2007, the ACS call sites placed 2,743 cases in the Gap Test inventory.  The Gap Test project coordinator rejected 106 of these cases because they did not meet the criteria, which left 2,637 cases available for the Gap Test team.  However, tax examiners worked only 607 cases (23 percent). 

ACS call sites continued to place cases in the Gap Test inventory, even though the inventory already contained a sufficient number of cases for the Gap Test.  These additional cases could not be worked.  Management did not provide clear procedures on when to discontinue placing cases in the Gap Test inventory. 

In addition, management did not establish procedures for placing the 2,030 unworked cases back in the normal Collection function workflow.  As a result, these 2,030 Gap Test cases remained in the Gap Test inventory until November 2007, when the Gap Test project coordinator returned them to the normal Collection function workflow.  Thus, these cases were held for more than 7 months after the Gap Test had been completed, without the possibility of being worked.

Recommendations

The Director, Campus Compliance Services, should:

Recommendation 2:  Ensure that cases meeting the Gap Test criteria are measured in terms of benefits and results, such as revenue collected, as part of the CACI Hybrid Pilot Program.

Management’s Response:  IRS management agreed with the recommendation.  They plan to measure Gap Test cases in terms of benefits and results to the extent that these cases are included in the CACI Hybrid Pilot Program.  The Hybrid initiative will not specifically focus on the Gap Test cases.  However, because they are included, IRS management agreed to measure the cases meeting Gap Test criteria in the same way they measure the benefits and results captured as part of the entire CACI Hybrid initiative.

Recommendation 3:  Ensure that assessments of processing times are made for cases meeting Gap Test criteria that are included in the CACI Hybrid Pilot Program.  If improvements are identified, the Director should provide timely quality review feedback so the improvements can be implemented during the testing.

Management’s Response:  IRS management agreed with the recommendation.  They plan to capture the assessment of processing time related to the CACI Hybrid initiative.

Management Did Not Have Sufficient Information to Make Decisions

At the conclusion of the Gap Test, management decided that the research was too time-consuming and, therefore, did not expand the Gap Test program.  This decision was based primarily on the average of 3.1 hours per case expended by the examiners to perform the additional research on the Gap Test cases.  Management must have complete and reliable information to make informed decisions.  However, management did not:

  • Perform any analysis to determine whether all of the additional research was necessary or whether some research steps could be eliminated.
  • Consider all of the benefits obtained during the Gap Test, such as increased revenues. 
  • Ensure that the Gap Test team had sufficient resources, adequate training, and necessary access to information to properly conduct the Gap Test.

These conditions further compromised the data on which management relied to make their decision.  Although management did not have sufficient information available, they decided not to establish a program that would expand the number of cases worked by ACS Support function personnel.  As a result, management might have missed an opportunity to implement a cost-effective process to reduce the number of cases in the Queue and increase revenues.

Recommendation

Recommendation 4:  Based on the results of Recommendations 1 through 3, the Director, Campus Compliance Services, should determine whether the decision to not expand the Gap Test program was warranted or whether additional testing is necessary.  If more testing is needed, the Director should establish proper inventory control procedures to reduce the number of cases not being worked.  If ACS Support function personnel are not granted additional authority or other options do not arise from the CACI Hybrid Pilot Program to work Gap Test criteria cases, the Director should consider elevating this issue to the appropriate Council overseeing IRS Private Debt Collection[13] activity to determine whether cases that meet Gap Test criteria can be sent to Private Debt Collection program contractors.

Management’s Response:  IRS management agreed with the recommendation.  Because the more extensive CACI Hybrid initiative already includes key objectives that were part of the Gap Test, further testing of Gap Test cases is not planned.  Management agreed to consider elevating unresolved inventory for Hybrid initiative cases meeting Gap Test criteria to the Private Debt Collection program office as another source of work for its consideration.

 

Appendix I

 

Detailed Objectives, Scope, and Methodology

 

The overall objectives of this review were to determine whether the IRS effectively performed the ACS[14] Gap Case Test (Gap Test) and whether Gap Test results were used to implement or expand the Gap Test program.  To accomplish these objectives, we:

I.                   Interviewed the project manager and other appropriate officials to determine the criteria and procedures for identifying, selecting, and reviewing cases for the Gap Test. 

II.                Determined whether controls were effective to ensure that the Gap Test team properly performed additional research and took appropriate actions to resolve or close the cases. 

A.    Selected a random sample of 100 cases from the 444 completed Gap Test cases[15] to determine whether the appropriate research was performed and documented by the ACS Support function Gap Test team.  We used a random sample to ensure that each item in the population had an equal opportunity to be selected.

1.  Reviewed supporting documentation to determine whether the additional research was performed. 

2.  Reviewed the Gap Test case files to determine whether the actions taken to resolve the cases were properly documented.

3.  Researched available resources (e.g., the Integrated Data Retrieval System,[16] ACS, and Desktop Integration system[17]) to determine whether the additional research steps were correctly performed and accurately documented in the Desktop Integration system comments.
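The equal-probability sampling described in Step II.A can be sketched as a simple random draw without replacement.  The sketch below is purely illustrative; the case identifiers and seed value are invented, not taken from IRS systems.

```python
import random

# Illustrative sketch (hypothetical case IDs): draw 100 of the 444
# completed Gap Test cases so every case has an equal chance of selection.
completed_cases = [f"GAP-{n:04d}" for n in range(1, 445)]  # 444 cases

rng = random.Random(2008)                   # fixed seed for a repeatable draw
sample = rng.sample(completed_cases, 100)   # sampling without replacement

assert len(sample) == 100                   # full sample drawn
assert len(set(sample)) == 100              # no case selected twice
```

Because `random.sample` selects without replacement and treats every k-element subset as equally likely, each case in the population has the same probability of appearing in the sample, matching the stated sampling objective.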

B.  Determined whether reviews were performed to ensure that Gap Test cases were properly worked and that Gap Test results were accurately captured and reported to the Program Office.

III.  Determined whether the National Headquarters in New Carrollton, Maryland (Program Office) provided adequate oversight to ensure that the Gap Test was properly conducted. 

A.  Determined whether reviews of the Gap Test results were performed and whether feedback was provided to the Gap Test team.

B.  Reviewed the final Gap Test data to determine whether the Gap Test results were accurately captured and included a breakdown showing the resolution of the Gap Test cases (e.g., Currently Not Collectible-Unable to Locate or Currently Not Collectible-Unable to Contact, transferred to a revenue officer,[18] sent to the Queue,[19] or fully paid).  We also determined whether the results were consistently calculated.  For example, we determined whether some actions performed in April, after the Gap Test period had ended, were counted toward the final Gap Test results (e.g., when a case was ultimately fully paid), while other actions performed after the Gap Test period were not counted (e.g., when a case was closed as Currently Not Collectible).

IV.  Evaluated the IRS’ plans for expanding or implementing the Gap Test procedures and the Gap Test’s impact on the ACS Support function workload.

A.  Determined whether any studies were performed or are planned to evaluate the impact that performing additional research had on the ACS Support function staff.

B.  Determined whether there are plans to expand the Gap Test to include other types of cases such as Business Master File[20] or different dollar values.

C.  Determined whether the IRS captured the time charged to the Gap Test.

D.  Determined whether the IRS established criteria to measure the Gap Test’s success.

 

Appendix II

 

Major Contributors to This Report

 

Margaret E. Begg, Acting Assistant Inspector General for Audit (Small Business and Corporate Programs)

Carl L. Aley, Director

Edward Gorman, Audit Manager

Denise M. Gladson, Lead Auditor

Janis Zuika, Senior Auditor

Stephen A. Elix, Auditor

Marcus D. Sloan, Auditor

 

Appendix III

 

Report Distribution List

 

Commissioner  C

Office of the Commissioner – Attn:  Chief of Staff  C

Deputy Commissioner for Services and Enforcement  SE

Deputy Commissioner, Small Business/Self-Employed Division  SE:S

Director, Campus Compliance Services, Small Business/Self-Employed Division  SE:S:CCS

Director, Campus Filing and Payment Compliance, Small Business/Self-Employed Division  SE:S:CCS

Director, Collection, Small Business/Self-Employed Division  SE:S:C

Director, Collection Policy, Small Business/Self-Employed Division  SE:S:C:CP

Chief Counsel  CC

National Taxpayer Advocate  TA

Director, Office of Legislative Affairs  CL:LA

Director, Office of Program Evaluation and Risk Analysis  RAS:O

Office of Internal Control  OS:CFO:CPIC:IC

Audit Liaison:  Commissioner, Small Business/Self-Employed Division  SE:S

 

Appendix IV

 

Outcome Measure

 

This appendix presents detailed information on the measurable impact that our recommended corrective actions will have on tax administration.  This benefit will be incorporated into our Semiannual Report to Congress.

Type and Value of Outcome Measure:

·  Reliability of Information – Actual; $315,000 collected from taxpayers as a result of performing the additional research (see page 5).

Methodology Used to Measure the Reported Benefit:

From a random sample of 100 ACS[21] Gap Case Test (Gap Test) cases, we identified 28 cases for which the Gap Test program collected $314,579.11 from taxpayers as a result of the additional research performed during the Gap Test.  However, the funds collected were not considered when assessing the results of the Gap Test. 

 

Appendix V

 

Management’s Response to the Draft Report

 

The response was removed due to its size.  To see the response, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.



[1] A telephone contact system through which telephone assistors collect unpaid taxes and secure tax returns from delinquent taxpayers who have not complied with previous notices.

[2] An automated holding file for unassigned inventory of delinquent cases for which the Collection function does not have enough resources to immediately assign for contact.

[3] Employees in the Collection Field function who attempt to contact taxpayers and collect delinquent taxes or secure unfiled returns on accounts that have not been resolved through notices sent by the IRS campuses or the ACS.  The campuses are the data processing arm of the IRS.  They process paper and electronic submissions, correct errors, and forward data to the Computing Centers for analysis and posting to taxpayer accounts. 

[4] A balance-due account of a taxpayer.  A separate Taxpayer Delinquent Account exists for each tax period.

[5] An IRS program permitting private collection agencies to help collect Federal tax debts.

[6] A telephone contact system through which telephone assistors collect unpaid taxes and secure tax returns from delinquent taxpayers who have not complied with previous notices.

[7] An automated holding file for unassigned inventory of delinquent cases for which the Collection function does not have enough resources to immediately assign for contact.

[8] Employees in the Collection Field function who attempt to contact taxpayers and collect delinquent taxes or secure unfiled returns on accounts that have not been resolved through notices sent by the IRS campuses or the ACS.  The campuses are the data processing arm of the IRS.  They process paper and electronic submissions, correct errors, and forward data to the Computing Centers for analysis and posting to taxpayer accounts.

[9] A balance-due account of a taxpayer.  A separate Taxpayer Delinquent Account exists for each tax period.

[10] The unit in the Area Offices consisting of revenue officers who handle personal contacts with taxpayers to collect delinquent accounts or secure unfiled returns.  An Area Office is a geographic organizational level used by IRS business units and offices to help their specific types of taxpayers understand and comply with tax laws and issues.

[11] Accurint is the IRS’ asset locator tool for front-line employees.

[12] The Desktop Integration system provides a common interface that allows users of multiple IRS systems to view history and comments from other systems.

[13] An IRS program permitting private collection agencies to help collect Federal tax debts.

[14] A telephone contact system through which telephone assistors collect unpaid taxes and secure tax returns from delinquent taxpayers who have not complied with previous notices.

[15] Of the 607 cases worked during the Test period, 444 cases were completed.

[16] IRS computer system capable of retrieving or updating stored information.  It works in conjunction with a taxpayer’s account records.

[17] The Desktop Integration system provides a common interface that allows users of multiple IRS systems to view history and comments from other systems.

[18] Employees in the Collection Field function who attempt to contact taxpayers and collect delinquent taxes or secure unfiled returns on accounts that have not been resolved through notices sent by the IRS campuses or the ACS.  The campuses are the data processing arm of the IRS.  They process paper and electronic submissions, correct errors, and forward data to the Computing Centers for analysis and posting to taxpayer accounts.

[19] An automated holding file for unassigned inventory of delinquent cases for which the Collection function does not have enough resources to immediately assign for contact.

[20] The IRS database that consists of Federal tax-related transactions and accounts for businesses.  These include employment taxes, income taxes on businesses, and excise taxes.

[21] A telephone contact system through which telephone assistors collect unpaid taxes and secure tax returns from delinquent taxpayers who have not complied with previous notices.