TREASURY INSPECTOR GENERAL FOR TAX ADMINISTRATION

 

 

Progress Has Been Made to Reengineer the Examination Program, but Additional Improvements Are Needed to Reduce Taxpayer Burden

 

 

 

February 18, 2011

 

Reference Number:  2011-30-016

 

 

This report has cleared the Treasury Inspector General for Tax Administration disclosure review process and information determined to be restricted from public release has been redacted from this document.

 

Redaction Legend:

1 = Tax Return/Return Information

 

Phone Number   |  202-622-6500

Email Address   |  TIGTACommunications@tigta.treas.gov

Web Site           |  http://www.tigta.gov

 

 

HIGHLIGHTS

 

PROGRESS HAS BEEN MADE TO REENGINEER THE EXAMINATION PROGRAM, BUT ADDITIONAL IMPROVEMENTS ARE NEEDED TO REDUCE TAXPAYER BURDEN

Highlights

Final Report issued on February 18, 2011

Highlights of Reference Number:  2011-30-016 to the Internal Revenue Service Commissioners for the Small Business/Self-Employed Division and the Wage and Investment Division.

IMPACT ON TAXPAYERS

The Correspondence and Discretionary Examination Program (hereafter referred to as the Program) conducts examinations exclusively by mail to reduce Internal Revenue Service (IRS) operational costs and minimize the burden on taxpayers.  However, taxpayers have expressed concerns with the length of the examination process, the lack of consideration given to information they sent to the IRS, and treatment by IRS employees.  While the IRS is reengineering the examination process, taxpayer burden continues to exist.

WHY TIGTA DID THE AUDIT

This audit was initiated at the IRS Oversight Board’s request for TIGTA to assess whether recent IRS efforts to identify and address weaknesses in its Program actually improved results.  Our overall objective was to determine whether the IRS’s reengineered Program resulted in a more responsive and less burdensome process for taxpayers.

WHAT TIGTA FOUND

Steps have been taken to reengineer the Program and improve employee compliance with Program guidelines, which could ultimately lessen taxpayer burden and increase taxpayer rights and entitlements.  TIGTA selected and reviewed two samples to evaluate employee performance before and after the Program implemented a new mail model process at one processing site.  Our results showed that after implementing the process, the Program reflected significant improvements in several attributes used to measure performance.  Despite the progress, results from our statistical sample of cases where taxpayers agreed to the additional tax assessments showed 28 of 62 cases contained errors.  The majority of these errors related to cases that were not closed timely.

Our analyses of another statistical and two judgmental samples of cases where the taxpayer did not agree with the additional assessment showed Program employees did not always consider the taxpayer’s correspondence before closing the case. 

WHAT TIGTA RECOMMENDED

TIGTA recommended that the IRS ensure all Program employees 1) follow mail processing guidelines until the mail model process is fully implemented in all sites and 2) follow guidelines for handling, responding to, and considering taxpayer correspondence.

Although the IRS agreed with our recommendations, it did not agree with our outcome measure.  Specifically, the IRS stated that many of the errors do not impact taxpayer rights and entitlements because the absence of a date stamp on a taxpayer’s correspondence would not constitute burden to the taxpayer.  In addition, the IRS expressed concerns with TIGTA’s use of the word “error” because it could lead readers to believe that an incorrect conclusion was reached during the examination.

Our audit findings and recommendations address results that showed IRS employees did not adhere to established procedures and/or guidelines when processing correspondence taxpayers submit for examinations of their tax returns.  For example, when date stamps, which are applied to assist employees’ control of documents received from taxpayers, are missing, the IRS cannot ensure a timely response to the taxpayer.  When employees do not adhere to IRS guidelines and procedures, it is simply an error.  TIGTA continues to believe that when these errors occur, taxpayers are at risk of not receiving their rights, entitlements, and protection of due process when they question the accuracy of tax liabilities resulting from Program examinations.

 

February 18, 2011

 

 

 

MEMORANDUM FOR COMMISSIONER, SMALL BUSINESS/SELF-EMPLOYED DIVISION

                                         COMMISSIONER, WAGE AND INVESTMENT DIVISION

 

FROM:                            Michael R. Phillips /s/ Michael R. Phillips

                                         Deputy Inspector General for Audit

 

SUBJECT:                    Final Audit Report – Progress Has Been Made to Reengineer the Examination Program, but Additional Improvements Are Needed to Reduce Taxpayer Burden (Audit # 201030034)

 

This report presents the results of our review to determine whether the Internal Revenue Service’s (IRS) reengineered Correspondence and Discretionary Examination Program (hereafter referred to as the Program) process resulted in a more responsive and less burdensome process for taxpayers.  The Program’s process includes mail processing, information document requests, and telephone access and service.  The IRS Oversight Board (hereafter referred to as the Board) and tax practitioners expressed concerns with the execution of Program examinations.  Specifically, they cited the excessive time it takes to reach final resolution of a taxpayer’s case, the inability of the taxpayer to contact the IRS to obtain definitive information on the questionable tax issue, and the fact that taxpayer inquiry calls are not being returned.  In February 2009, the IRS responded to the Board that it had taken a series of actions to identify and address key weaknesses in its Program processes that contributed to taxpayer dissatisfaction in the past.  This review was requested by the Board, was part of our Fiscal Year 2010 Annual Audit Plan, and addresses the major management challenge of Tax Compliance Initiatives.

Management’s complete response to the draft report is included as Appendix VIII.  Copies of this report are also being sent to the IRS managers affected by the report recommendations.  Please contact me at (202) 622-6510 if you have questions or Margaret E. Begg, Assistant Inspector General for Audit (Compliance and Enforcement Operations), at (202) 622-8510.

 

 

Table of Contents

 

Background

Results of Review

Examination Program Changes Were Successfully Piloted, but Challenges Still Exist

Recommendations 1 and 2:

Additional Steps Are Needed to Ensure Employees Follow Procedures to Improve Customer Satisfaction

Recommendation 3:

Appendices

Appendix I – Detailed Objective, Scope, and Methodology

Appendix II – Major Contributors to This Report

Appendix III – Report Distribution List

Appendix IV – Outcome Measure

Appendix V – Taxpayer Feedback From the Fiscal Year 2010 Customer Satisfaction Survey on Their Experience With the Examination Process

Appendix VI – Letter 3219 Notice of Deficiency

Appendix VII – Glossary of Terms

Appendix VIII – Management’s Response to the Draft Report

 

 

Abbreviations

 

IRS  |  Internal Revenue Service

SB/SE  |  Small Business/Self-Employed

W&I  |  Wage and Investment

 

 

Background

 

The Correspondence and Discretionary Examination Program (hereafter referred to as the Program)[1] plays a vital role in the Internal Revenue Service’s (IRS) mission of promoting voluntary compliance with the tax law.  Program examinations are conducted at 10 sites[2] in the Small Business/Self-Employed (SB/SE) Division as part of the Campus Compliance Services function and in the Wage and Investment (W&I) Division under the Reporting Compliance function.  Figure 1 reflects the volume of examinations and additional tax revenue generated from October 1, 2007, through March 10, 2010, for both Divisions.

Figure 1:  Volume of Examinations and Additional Tax Revenue

Fiscal Year | Number of Examinations | Tax Revenue (in billions)
2008 | 1,070,548 | $6.16
2009 | 1,094,996 | $7.58
2010 | 1,075,963 | $7.73
Totals | 3,241,507 | $21.47

Source:  Our analysis of the Audit Information Management System[3] for October 1, 2007, through March 10, 2010.

By conducting examinations, Program examiners are primarily responsible for determining the correct tax liabilities for taxpayers.  Examinations of individual taxpayers can range from reviewing their tax returns and resolving questionable items by corresponding with them through the mail to a detailed face-to-face examination of a taxpayer’s financial records at his or her place of business.  In contrast to the more labor-intensive, face-to-face examination, the correspondence examination process is less intrusive, more automated, and conducted by examiners who are trained to address and focus on less complex tax issues.  Importantly, correspondence examinations also enable the IRS to reach more taxpayers at a lower cost, minimize taxpayer burden, and release resources for face-to-face examinations focused on more complex noncompliance tax issues. 

Once a tax return is selected for examination, taxpayers are issued a letter requesting additional information to support the questionable tax items.  If the taxpayer does not respond, the IRS issues a second letter informing the taxpayer of the proposed tax assessment.  If the taxpayer does not respond to the second letter or if the response is insufficient to address the items in question, the IRS will issue a Notice of Deficiency (Letter 3219).[4]  This Letter gives the taxpayer 90 days[5] to pay the tax assessment or file a petition with the Tax Court.  If resolution has not occurred by the end of the 90-day period, the Program examination is closed and the assessed tax is posted to the taxpayer’s account.

In 2008, practitioners stated the IRS had increased the number of Program audits and expressed concern about the extraordinary amount of time required to reach a final resolution.

In the 2008 IRS Oversight Board (hereafter referred to as the Board) Annual Report to Congress, practitioners expressed concern about the increased number of Program examinations and the extraordinary amount of time required to reach a final resolution.  In addition, practitioners shared that the IRS did not designate, in the various letters, an employee who could be contacted to further define the issues or answer taxpayer questions.  The current process requires taxpayers to call the number listed on letters and leave a voice mail message.  Practitioners stated IRS employees are not responding to these calls and suggested a telephone help line for taxpayers to call when they have questions.  Finally, some practitioners stated their clients get repeat notices over several years for the same issue even after prior examinations were closed without a tax assessment (i.e., referred to by the Program as a no-change case).

In February 2009, the IRS briefed the Board on research it conducted, which confirmed that many taxpayers and practitioners find the process too lengthy and the IRS correspondence difficult to understand.  The IRS indicated that it had identified key weaknesses and had developed solutions that would focus on three areas for improvement:  mail processing, requests for taxpayer information documents, and telephone access and service.  The Board requested the Treasury Inspector General for Tax Administration assess whether the recent efforts the IRS made to reengineer its Program resulted in a more responsive and less burdensome process for taxpayers.

We identified two internal IRS studies of the correspondence examination process that responded to the Board’s concerns.  The purposes of the studies were to evaluate if Program employees were adhering to Program guidelines and make recommendations to reduce the burden on taxpayers.  The two studies and reported results were as follows:

·         In January 2009, the SB/SE Division Taxpayer Improvement Initiative study recommended solutions to improve Program employees’ communication with taxpayers and improve taxpayers’ ability to receive assistance when they call the IRS.  The SB/SE Division developed job aids to help Program employees communicate to taxpayers what documentation is required to substantiate questionable items on the tax return.  As of July 2010, the SB/SE Division completed implementation of the toll-free call routing system currently being used by the W&I Division.  Taxpayers will be able to speak directly with assistors when they have questions about their Program examination.

·         In February 2010, the W&I Division Lean Six Sigma study began piloting a centralized model for processing incoming mail at the Austin Compliance Site.  This model will centralize all mail processing and increase the flexibility in planning, staffing, and teamwork.  The W&I Division plans to implement the model in all its sites by June 2011.  The SB/SE Division plans to pilot the process in December 2010 at one of its five sites and implement the process in the remaining sites by April 2011. 

This review was performed at the W&I Division Headquarters and sites at the Atlanta Campus in Atlanta, Georgia; the Austin Campus in Austin, Texas; the Kansas City Campus in Kansas City, Missouri; the Memphis Campus in Memphis, Tennessee; and the SB/SE Division Headquarters in New Carrollton, Maryland, during the period January through August 2010.  We conducted this performance audit in accordance with generally accepted government auditing standards.  Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objective.  We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objective.  Detailed information on our audit objective, scope, and methodology is presented in Appendix I.  Major contributors to the report are listed in Appendix II.

 

 

Results of Review

 

Examination Program Changes Were Successfully Piloted, but Challenges Still Exist

For our review, we selected a statistical sample of 62 W&I Division and SB/SE Division closed agreed cases from a population of 251,215 for the period April through December 2009.[6]  Results showed 28 (48 percent)[7] contained errors where Program employees did not follow guidelines while completing the Program examination.  The majority of these errors related to cases that were not closed timely.  We also selected and reviewed a judgmental sample of 35 closed agreed cases from the Austin Compliance Site for the month of May 2010.  The Austin Compliance Site was selected because the IRS chose it to pilot the changes to the mail process recommended by the Lean Six Sigma study.  We compared the results from this sample to the results from our statistical sample for several Program attributes to determine if the implemented changes improved Program results.

The IRS implemented a new mail model process that has improved Program compliance and could ultimately reduce taxpayer burden

Our review showed measurable differences between the two samples, indicating the new mail model process has improved compliance with Program guidelines and could ultimately lessen taxpayer burden.  For example, prior to implementing changes at the Austin Compliance Site, it took the Program an average of 27 days to close an agreed case.  Based upon the sample results, after implementing the changes, 100 percent of the cases were closed within an average of 7 days.[8]  Figure 2 reflects a comparison of the attributes used by the Program before and after the changes were implemented at the Austin Compliance Site. 

Figure 2: Comparison of Pilot and Program Results
for Compliance With Program Guidelines

 

Program Attributes | Results From the Program Before Changes Were Implemented[9] | Results From Pilot Site After Changes Were Implemented
Cases were not stamped with Program received date. | 5% (3 of 62) | 0% (all stamped correctly)
Cases were not routed to Program employees to work within 1 to 3 days. | 5% (3 of 59); ranged from 4 to 14 days | 0% (all routed timely)
Cases were not closed within the required number of days from the Program received date.[10] | 37% (21 of 62) | 17% (6 of 35)
Case actions were not documented in automated history sheets. | ***1*** | ***1***

Source:  Our analyses of closed agreed cases from both Divisions.

Program management has not established guidelines for the number of days it should take examiners to close agreed cases.  However, when asked, the SB/SE Division considered 21 days as reasonable to close agreed cases and the W&I Division had an expectation of 7 days.  Government Accountability Office standards provide that management conduct reviews to compare actual performance to planned results and analyze significant differences.  Based on our analysis and for Program consistency, we believe Program management has the opportunity to strengthen its controls by revisiting what constitutes a reasonable expectation for the number of days it should take to close agreed cases.
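To illustrate the type of monitoring control discussed above, the following is a minimal sketch, in Python, of how elapsed calendar days from the Program received date to case closure could be compared against an expected standard.  The case records, field names, and the 7-day threshold are illustrative assumptions drawn from the expectations described in this report, not an IRS system or procedure.

```python
from datetime import date

# Hypothetical closed agreed cases: Program received date and closing date.
# The field names and records are illustrative; they are not IRS data.
closed_agreed_cases = [
    {"case_id": "A-001", "received": date(2009, 5, 4), "closed": date(2009, 5, 8)},
    {"case_id": "A-002", "received": date(2009, 5, 4), "closed": date(2009, 6, 12)},
]

EXPECTED_DAYS_TO_CLOSE = 7  # the W&I Division expectation cited above

def days_to_close(case):
    """Elapsed calendar days from the Program received date to closure."""
    return (case["closed"] - case["received"]).days

average_days = sum(days_to_close(c) for c in closed_agreed_cases) / len(closed_agreed_cases)
late_cases = [c for c in closed_agreed_cases
              if days_to_close(c) > EXPECTED_DAYS_TO_CLOSE]

print(f"Average days to close: {average_days:.1f}")
for case in late_cases:
    print(f"{case['case_id']} exceeded the {EXPECTED_DAYS_TO_CLOSE}-day expectation "
          f"({days_to_close(case)} days)")
```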

Overall, we believe the changes the Program has made to improve its processes are a step in the right direction.  However, until the Program can implement the planned changes at all sites, taxpayers will continue to experience increased burden and inconsistencies with the process.  Based on our results, we project 108,092[11] closed agreed cases from April through December 2009 contained an error that decreased the Program’s performance measures and increased the risk that taxpayers were not afforded the rights and entitlements to due process as protected by employees’ adherence to Program guidelines.  See Appendix IV for our projection details.

Recommendations

The Commissioners, SB/SE and W&I Divisions, should:

Recommendation 1:  Ensure that all Program employees follow current mail processing guidelines until the pilot mail model is fully implemented in all 10 sites. 

Management’s Response:  IRS management agreed with this recommendation.  They will continue to ensure adherence to their procedures in all aspects of their Program through their annual operational and program reviews.

Recommendation 2:  Strengthen controls over processing agreed cases and, for consistency between the two Business Divisions, establish a specific time constraint for Program employees to close agreed cases.

Management’s Response:  IRS management agreed with this recommendation.  They will establish a standard timeframe and incorporate this into their Internal Revenue Manual guidance.  They also agreed to implement a processing change to make it easier to identify agreed cases that are delayed more than 7 days.

Additional Steps Are Needed to Ensure Employees Follow Procedures to Improve Customer Satisfaction

Our audit results showed the Program process continues to be lengthy and, prior to closing the case, Program employees did not always consider the information taxpayers provided in response to IRS letters.  We completed analyses of three additional samples of default cases and cases that were closed but were reopened because taxpayers provided new information for audit reconsideration.  We selected these samples to determine whether employees adhered to Program guidelines.  We did not evaluate whether Program changes, made as a result of the IRS studies, improved the examination process.[12] 

Process improvements are needed to close cases when the taxpayer responds but disagrees with the proposed tax assessment

From a population of 137,294 cases, we selected and reviewed a statistical sample of 60 closed default cases for the period April through December 2009 for the W&I and SB/SE Divisions to determine if employees followed Program guidelines.  Our results showed 47 (83 percent) contained errors that indicate Program employees did not always follow procedures when closing cases.  For example, after responding to correspondence received from the Program, taxpayers experienced delays ranging from 32 to 137[13] days waiting for Program employees to take action on the correspondence they sent.  Guidelines require Program employees to evaluate the taxpayers’ correspondence and take the next action within 30 days from the IRS received date.  Figure 3 reflects the results for the Program attributes used to measure performance in both Divisions.

Figure 3: Analyses of 60 Closed Default Cases[14]

Program Attributes | Error Rates for Closed Cases
Cases were not stamped with Program received date. | 6% (4 of 60)
When warranted, taxpayers were not contacted by Program employees when additional information was needed. | 18% (5 of 34)
Program employees did not evaluate the taxpayers’ correspondence within 30 days from the IRS received date. | 72% (40 of 60); ranged from 32 to 137 days
Cases were not routed to Program employees within 5 business days. | 45% (14 of 41); ranged from 7 to 130 days
The computer system was not updated within 5 business days of the Program received date. | 63% (27 of 44); ranged from 8 to 71 days

Source:  Our analyses of the sampled closed default cases from April through December 2009.

The Program’s Fiscal Year 2009 operational reviews showed errors similar to those identified in our sample.  In response to the operational reviews, Program management stated that an adequate process to update its computer system with correspondence received from taxpayers was not in place to meet Program requirements.  We shared our sample results with Program management and were advised that the lack of a process to handle the high volume of correspondence was a critical challenge during Fiscal Year 2009.  Many of the delays IRS employees experienced working the cases could be attributed to correspondence and claims associated with the First-Time Homebuyer Credit.  For example, the volume of claims and original tax returns associated with the Credit steadily increased from 1,466 in April 2009 to 89,338 by September 2009.  The unexpected increase caused Program management to shift resources in an effort to keep up with the demand.  Based on our results, we project 101,787[15] closed default cases contained an error that decreased the Program’s performance measures and increased the risk that taxpayers were not afforded the rights and entitlements to due process as protected by employees’ adherence to Program guidelines.  See Appendix IV for our projection details.

Process improvements are needed to ensure correspondence received from the taxpayer is considered before the case is closed

We selected a judgmental sample of 24[16] default cases after receiving concerns that Program employees were not following procedures requiring them to consider taxpayer correspondence prior to closing the cases.  Our results showed that for 17 of the 24 cases, Program employees did not consider the taxpayers’ correspondence prior to closing the case.  In addition, for 10 (59 percent) of the 17 cases, the taxpayers’ correspondence was not input to the Integrated Data Retrieval System and/or the Correspondence Examination Automation Support System within the required time period to alert Program employees that correspondence had been received.  This situation was attributed to Program employees storing taxpayer correspondence on shelves instead of entering the information in the computer systems. 

When correspondence is not timely entered into the systems, IRS employees who work the Toll-Free telephone lines cannot advise taxpayers when they call that their correspondence has been received and is being considered.  In addition, Program employees are unable to consider the taxpayer’s correspondence because the computer systems have no record of the correspondence being received.  Our analyses showed that taxpayer correspondence was stored on the shelves from 13 to 822[17] days prior to closing.  Shelving the correspondence resulted in a backlog of work that employees could not process until Program resources were allocated to enter the correspondence in the computer systems. 

Guidelines state that, before deciding whether to assess additional taxes, Program employees should consider all correspondence received from the taxpayer up to 7 days after the case is closed.  Further, if correspondence is received and a decision to close the case cannot be made within 14 days from the date the taxpayer responds, the correspondence should be input into both computer systems.  Finally, if correspondence is received more than 7 days after the case is closed, the correspondence should be reviewed as an audit reconsideration.  Figure 4 shows the results for the Program attributes used to measure Program performance for the default cases closed.

Figure 4:  Analyses of Correspondence Received
Prior to Closing the Default Cases

Program Attributes | Error Rate
Decision to close the case was not made within the required 14 days of receipt of correspondence. | 96% (23 of 24)
Correspondence was not controlled within the required 14 days after correspondence was received. | 54% (13 of 24); ranged from 20 to 707 days
Cases were not worked within 30 days of receipt. | 92% (22 of 24); ranged from 68 to 822 days
Cases were not stamped with a Program received date. | *****1*****
Correspondence received either prior to closing or within 7 days after closing was not considered.[18] | 71% (17 of 24)
Correspondence received more than 7 days after the case was closed was not worked as an audit reconsideration case as required. | 29% (7 of 24); ranged from 9 to 357 days

 

Source:  Analyses of 24 judgmentally selected default cases from one Program site.
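The timing rules described earlier in this section (consideration of correspondence received up to 7 days after closure, the 14-day control requirement, and audit reconsideration for later correspondence) can be restated schematically.  The following Python sketch illustrates those rules under stated assumptions; the function, parameter names, and dates are hypothetical and do not represent an IRS system.

```python
from datetime import date, timedelta
from typing import List, Optional

def required_handling(correspondence_received: date,
                      case_closed: Optional[date],
                      decision_made: Optional[date]) -> List[str]:
    """Schematic restatement of the correspondence-handling guidelines described
    in this report.  Illustrative only; not an IRS system interface."""
    actions = []

    # Correspondence received before closure, or within 7 days after closure,
    # should be considered before additional tax is assessed.
    if case_closed is None or correspondence_received <= case_closed + timedelta(days=7):
        actions.append("consider before assessing additional tax")
    else:
        # Received more than 7 days after the case closed:
        # review as an audit reconsideration.
        actions.append("review as an audit reconsideration")

    # If a decision to close the case cannot be made within 14 days of the
    # taxpayer's response, input the correspondence to both computer systems.
    if decision_made is None or (decision_made - correspondence_received).days > 14:
        actions.append("input to both computer systems within 14 days")

    return actions

# Example: correspondence arrived 3 days after the case closed; no closing
# decision has been recorded yet.
print(required_handling(date(2009, 8, 10), date(2009, 8, 7), None))
```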

Since Program employees did not consider the correspondence prior to closing the case, 17 taxpayers were assessed $38,591 in additional taxes and experienced increased taxpayer burden.  Our analyses of the cases showed the taxpayers questioned the assessments and resubmitted information that was not considered during the original examination.  We did not evaluate if the information provided by the taxpayer would have substantiated the items in question on the tax returns.  However, after reviewing the information provided by the taxpayer, Program employees agreed to reduce the total assessments by 34 percent, or $13,287.

Program employees considered taxpayer correspondence after the case was closed

We selected and reviewed a judgmental sample of 14[19] cases after receiving concerns that taxpayer correspondence was not being considered before the case was closed.  Our results showed that for 11 (79 percent) of the cases, Program employees met expectations when completing audit reconsideration cases.  Program management does not have specific time constraints for Program employees to close audit reconsideration cases.  However, we were advised that Program employees are expected to review taxpayer correspondence and complete audit reconsideration cases within 120 days from the date the correspondence is received.  We determined that the average number of days to work and close all 11 cases was 104 days.  *********1****************while the remainder ranged from 13 to 123 days to close.  Further, after considering the taxpayers’ correspondence, Program employees reduced the additional tax assessments from $36,075 to $22,720, a reduction of 37 percent.

For the remaining 3 cases, Program employees classified and worked the cases as audit reconsiderations even though the taxpayers’ correspondence was actually received an average of 26 days prior to the case closing.  Because the taxpayers’ correspondence was not considered before the cases were closed, the Program assessed the taxpayers $11,282 in additional taxes.  After considering the correspondence during the audit reconsideration, Program employees abated the entire $11,282 in additional tax assessments. 

We believe Program management has, in most instances, established guidance for its employees for responding to and controlling taxpayer correspondence.  However, when Program employees do not follow procedures or do not consider taxpayers’ correspondence, the burden on taxpayers increases.  For example, taxpayers in our sample would have eventually received notices demanding payment for taxes that were not owed.  Taxpayers could have incurred additional expenses if they had to hire a certified public accountant or attorney to represent them before the IRS.  In addition, the IRS inefficiently used its resources because Program employees performed additional work abating taxes that would not have been assessed if the taxpayers’ correspondence had been considered when it was initially received.  Government Accountability Office standards provide that transactions are to be accurately and timely recorded to maintain their relevance and value to management in controlling operations and making decisions.  When transactions are recorded accurately and timely, management is able to achieve effective results both within its Program and with its customers.

Program improvements could increase taxpayers’ customer satisfaction 

As part of the IRS agency-wide initiative to monitor and improve taxpayer satisfaction, the IRS provides a Customer Satisfaction Survey to taxpayers whose tax returns were examined by Program employees.  Our review of the survey results from the SB/SE and W&I Divisions for the period October through December 2009[20] showed, on average, a 44 percent dissatisfaction rate when taxpayers were questioned about the length of the examination process, time spent on the examination, consideration given to information sent to the IRS, and fairness of treatment by Program employees.  These issues were rated by taxpayers as very important and are consistent with concerns expressed by the Board. 

The IRS’ contractor that analyzed the survey results suggested that “making improvements to areas in which customers are relatively dissatisfied and/or where the item is very important to them will improve overall customer satisfaction.”  See Appendix V for taxpayers’ responses to the Customer Satisfaction Survey on their examination experience.

Recommendation

Recommendation 3:  The Commissioners, SB/SE and W&I Divisions, should ensure employees follow all Program guidelines for handling, responding to, and considering taxpayer correspondence when working Program cases.

Management’s Response:  IRS management agreed with this recommendation.  They will continue to ensure adherence to their procedures in all aspects of their Program through their annual operational and program reviews.

Office of Audit Comment:  The IRS agreed with our recommendations, but did not agree with our reported outcome measure.  Specifically, the IRS stated many of the errors do not impact taxpayer rights and entitlements because the absence of a date stamp on a taxpayer’s correspondence would not constitute burden to the taxpayer.  In addition, the IRS expressed concerns with our use of the word “error” when referring to delayed processing because it could lead readers to believe that an incorrect conclusion was reached during the examination.

Our audit findings and recommendations address results that showed IRS employees did not adhere to established procedures and/or guidelines when processing correspondence taxpayers submit for examinations of their tax returns.  For example, when date stamps, which are applied to assist employees’ control of documents received from taxpayers, are missing, the IRS cannot ensure a timely response to the taxpayer.  We used the IRS’s criteria, which included some inconsistency between the W&I and SB/SE Divisions, for measuring timeliness to identify errors.  When employees do not adhere to IRS guidelines and procedures, it is simply an error.  We continue to believe that when these errors occur, taxpayers are at risk of not receiving their rights, entitlements, and protection of due process when they question the accuracy of tax liabilities resulting from Program examinations.  For example, we reported 17 instances where taxpayers were assessed additional taxes and interest totaling $38,591 because the IRS did not timely consider the information submitted by taxpayers.  After taxpayers questioned the assessments and resubmitted information, Program management agreed to reduce the assessments by $13,287. 

The overall objective of our review was to determine whether the IRS’s reengineered Program process resulted in a more responsive and less burdensome process for taxpayers.  We clearly state in the report that we did not evaluate whether the information provided by the taxpayer would have substantiated the items in question.  The IRS’s guidelines and procedures are designed to ensure fair and equitable treatment for all taxpayers and to control and monitor employee work.  We reported the IRS’s new mail process has improved Program results and could, once implemented at all sites, lessen taxpayer burden.  These results were based on valid statistical samples, which were shared with the IRS throughout this review.  It is unclear to us why the IRS disagrees with our outcome measure when it agreed to take corrective actions for all of our recommendations.

 

Appendix I

 

Detailed Objective, Scope, and Methodology

 

Our overall objective was to determine whether the IRS’s reengineered Program process (i.e., mail processing, information document requests, and telephone access and service) resulted in a more responsive and less burdensome process for taxpayers.  To accomplish this objective, we:

I.                   Determined what changes, if any, had been implemented to improve procedures in place for processing incoming mail.  This included following up to determine if any Taxpayer Improvement Initiative study team and Lean Six Sigma[21] Mail study recommendations were implemented and whether the Mail study met its April 2010 completion date.  In addition, we reviewed quarterly Customer Satisfaction Survey results for the satisfaction levels of taxpayers with the Program.

II.                Determined whether the Program effectively and efficiently followed procedures to process taxpayer correspondence. 

A.    Selected a statistical sample of 62 agreed and 60 default cases from the Audit Information Management System to determine if Program employees adhered to existing guidelines when completing Program examinations.  In addition, we assessed the reliability of the data in the Examination databases used during the audit by comparing selected fields from our sampled cases to the Integrated Data Retrieval System.  We did not identify any reportable differences.  We used attribute sampling and selected cases from the SB/SE and W&I Divisions. 

1.      For the agreed cases sample, we used an expected error rate of 90 percent, a 90 percent confidence interval, and a ±6.25 percent precision level.  We used the weighted average method to determine the number of cases selected from each Division.  (An illustrative computation of these sample sizes and allocations follows Step III of this appendix.)

                                                 Cases     Weighted Average     Sample Size
SB/SE population of cases                       94,931            38%                23
W&I population of cases                        156,284            62%                39
Total                                          251,215           100%                62

2.      For the default cases sample, we used an expected error rate of 80 percent, a 90 percent confidence interval, and a ±8.5 percent precision level.  We used the weighted average method to determine the number of cases selected from each Division.

                                                 Cases     Weighted Average     Sample Size
SB/SE population of cases                       51,588            38%                23
W&I population of cases                         85,706            62%                37
Total                                          137,294           100%                60

B.     We selected a judgmental sample of 24[22] cases in one Program site to determine whether Program employees adhered to guidelines for processing and closing default cases.  We used judgmental sampling because we could not define the population and did not plan to project our results. 

C.     We selected a judgmental sample of 14[23] cases in one Program site using the Treasury Inspector General for Tax Administration Data Center Warehouse Individual Return Transaction File and Audit Information Management System to determine whether Program employees adhered to guidelines for processing and closing default cases that were reopened to consider new information provided by the taxpayer.  We used judgmental sampling because we could not define the population and did not plan to project our results.

III.             Monitored whether the SB/SE Division toll-free call routing system was on target for the July 2010 rollout.
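For reference, the sample sizes and Division allocations described in Step II.A can be approximated with the standard attribute-sampling size formula, n = z^2 x p x (1 - p) / E^2, and a proportional (weighted average) allocation.  The following Python sketch is illustrative; the critical value z = 1.645 for a 90 percent confidence level and the rounding convention are assumptions, and the results approximately reproduce the sample sizes of 62 and 60 shown above.

```python
def attribute_sample_size(expected_error_rate: float, precision: float,
                          z: float = 1.645) -> int:
    """Attribute-sampling size: n = z^2 * p * (1 - p) / E^2.
    z = 1.645 is the critical value assumed here for a 90 percent confidence level."""
    p = expected_error_rate
    return round(z * z * p * (1.0 - p) / (precision ** 2))

def allocate_by_population(total_sample: int, populations: dict) -> dict:
    """Allocate a sample across strata in proportion to population size
    (the weighted average method described in this appendix)."""
    total_pop = sum(populations.values())
    return {name: round(total_sample * pop / total_pop)
            for name, pop in populations.items()}

# Agreed cases: expected error rate 90 percent, precision +/- 6.25 percent.
n_agreed = attribute_sample_size(0.90, 0.0625)    # approximately 62
# Default cases: expected error rate 80 percent, precision +/- 8.5 percent.
n_default = attribute_sample_size(0.80, 0.085)    # approximately 60

print(n_agreed, allocate_by_population(n_agreed, {"SB/SE": 94_931, "W&I": 156_284}))
print(n_default, allocate_by_population(n_default, {"SB/SE": 51_588, "W&I": 85_706}))
```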

Internal controls methodology

Internal controls relate to management’s plans, methods, and procedures used to meet its mission, goals, and objectives.  Internal controls include the processes and procedures for planning, organizing, directing, and controlling program operations.  They include the systems for measuring, reporting, and monitoring program performance.  We determined the following internal controls were relevant to our audit objective:  IRS guidelines for timely and effectively routing, handling, and closing Program cases.  We evaluated these controls through discussions with Program management and employees, and by selecting and reviewing closed agreed and default Program cases. 

Appendix II

 

Major Contributors to This Report

 

Margaret E. Begg, Assistant Inspector General for Audit (Compliance and Enforcement Operations)

Frank Jones, Director

Deborah Smallwood, Audit Manager

Sylvia Sloan-Copeland, Lead Auditor

Cindy Harris, Senior Auditor

Lynn Ross, Senior Auditor

Chanda Stratton, Auditor

Michele Strong, Auditor

 

Appendix III

 

Report Distribution List

 

Commissioner  C

Office of the Commissioner – Attn:  Chief of Staff  C

Deputy Commissioner for Services and Enforcement  SE

Deputy Commissioner, Small Business/Self-Employed Division  SE:S

Deputy Commissioner, Wage and Investment Division  SE:W

Director, Campus Compliance Services, Small Business/Self-Employed Division  SE:S:CCS

Director, Campus Reporting Compliance, Small Business/Self-Employed Division  SE:S:CCS:CRC

Director, Compliance, Wage and Investment Division  SE:W:CP

Director, Reporting Compliance, Wage and Investment Division  SE:W:CP:RC

Chief Counsel  CC

National Taxpayer Advocate  TA

Director, Office of Legislative Affairs  CL:LA

Director, Office of Program Evaluation and Risk Analysis  RAS:O

Office of Internal Control  OS:CFO:CPIC:IC

Audit Liaisons:

            Commissioner, Small Business/Self-Employed Division  SE:S

            Commissioner, Wage and Investment Division  SE:W

 

Appendix IV

 

Outcome Measure

 

This appendix presents detailed information on the measurable impact that our recommended corrective actions will have on tax administration.  This benefit will be incorporated into our Semiannual Report to Congress.

Type and Value of Outcome Measure:

·         Taxpayer Rights and Entitlements – Potential; taxpayers represented by 108,092 agreed[24] closed cases and 101,787[25] closed default cases that contained an error because Program employees did not adhere to Program guidelines (see pages 4 and 6).

Methodology Used to Measure the Reported Benefit:

Closed Agreed Cases

We selected a statistical sample of 62 W&I Division and SB/SE Division closed agreed cases from a population of 251,215 for the period April through December 2009.  Our results showed 28 (48 percent) contained errors where employees did not follow guidelines while completing the Program examination. 

  • The W&I Division had 24 of 39 closed agreed cases with errors and the SB/SE Division had 4 of 23, totaling 28 of 62.
  • To determine the combined error rate for both Divisions, we weighted each Division’s error rate by its share of the combined population.  Specifically,

o   Agreed W&I Division population + agreed SB/SE Division population = agreed total population:  (156,284 + 68,523[26] = 224,807).

o   Agreed W&I Division population percent:  69.5 percent (156,284/224,807).

o   Agreed SB/SE Division population percent:  30.5 percent (68,523/224,807).

o   Error rate for the W&I Division is 61.5 percent and the SB/SE Division is 17.4 percent.

o   Error rate for both Divisions is 0.695 x 0.615 + 0.305 x 0.174 = 0.4278 + 0.0530 = 0.4808 or 48 percent.

  • Projected number of Program cases with an error that represented taxpayers (0.4808 x 224,807) = 108,092.[27]

Closed Default Cases

We selected a statistical sample of 60 closed default cases from a population of 137,294 for the period April through December 2009 for the W&I and SB/SE Divisions to determine if employees followed Program guidelines.  Results showed 47 (83 percent) contained errors that indicate Program employees did not always follow procedures when closing cases. 

  • The W&I Division had all 37 closed default cases with errors and the SB/SE Division had 10 of 23, totaling 47 of 60.
  • To determine the combined error rate for both Divisions, we weighted each Division’s error rate by its share of the combined population.  Specifically,

o   Default W&I Division population + default SB/SE Division population = the default total population:  (85,706 + 37,018[28] = 122,724).

o   Default W&I Division population percent:  69.8 percent (85,706/122,724).

o   Default SB/SE Division population percent: 30.2 percent (37,018/122,724).

o   Error rate for the W&I Division is 100 percent and the SB/SE Division is 43.5 percent.

o   Error rate for both Divisions is 0.698 x 1.00 + 0.302 x 0.435 = 0.6980 + 0.1314 = 0.8294 or 83 percent.

  • Projected number of Program cases with an error that represented taxpayers (.8294 x 122,724)  = 101,787.
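For reference, the weighted error rate and projection arithmetic shown above for both the agreed and default cases can be restated in a short Python sketch.  The populations, sample sizes, and error counts are taken from this appendix; because of intermediate rounding (see the footnotes), the computed projections closely approximate, but do not exactly equal, the reported figures of 108,092 and 101,787.

```python
def combined_error_rate_and_projection(strata: dict):
    """strata maps each Division to (population, errors found, cases sampled).
    Each Division's sample error rate is weighted by its share of the combined
    population, and the combined rate is projected across that population."""
    total_pop = sum(pop for pop, _, _ in strata.values())
    combined_rate = sum((pop / total_pop) * (errors / sampled)
                        for pop, errors, sampled in strata.values())
    return combined_rate, combined_rate * total_pop

# Closed agreed cases: (population, errors found, cases sampled) per Division.
agreed = {"W&I": (156_284, 24, 39), "SB/SE": (68_523, 4, 23)}
rate, cases = combined_error_rate_and_projection(agreed)
print(f"Agreed:  {rate:.1%} combined error rate, roughly {cases:,.0f} cases")

# Closed default cases.
default = {"W&I": (85_706, 37, 37), "SB/SE": (37_018, 10, 23)}
rate, cases = combined_error_rate_and_projection(default)
print(f"Default: {rate:.1%} combined error rate, roughly {cases:,.0f} cases")
```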

 

Appendix V

 

Taxpayer Feedback From the Fiscal Year 2010 Customer
Satisfaction Survey on Their Experience With the Examination Process

 

“I had to use the Taxpayer Advocate due to no response to phone calls or fax requests; the examiner had terrible professional contact.”

“The frustrating part is that I made sixteen phone calls and left messages, and not one call was returned.”

“I left several messages with my case worker, and she would never get back to me.”

“It took seven months to resolve this issue.”

“The length of time during the audit can cause anxiety!”

“I feel like I have been singled out, picked on, and harassed by the IRS.”

“I am still waiting for my refund.  I feel this is very unfair.”

“I spent hours on the phone with the IRS going from one person to the next trying to get guidance as to what to do.”

“It takes a lot of time to get all of the paperwork together.  How do people with full-time jobs handle this?”

“The wait time to actually get my case assigned and my information in front of an agent was way too long.  It took approximately six to eight months.”

“Find a way to have these matters resolved quicker than one year.”

“I sent in a copy of my divorce agreement with my taxes but it was never looked at.”

“When a person sends court papers for proof in a case for the audit, they should consider them and not let them be ignored.”

“Original notice of adjustment to my return gave no explanation as to why changes were made.”

“I think the IRS needs to give a more clear explanation of their findings during the audit.”

 

Appendix VI

 

Letter 3219 Notice of Deficiency

 

Letter 3219 was removed due to its size.  To see the Letter, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.

 

Appendix VII

 

Glossary of Terms

 

Abating – The act of reducing or eliminating unpaid taxes.

Agreed – Cases are considered agreed when the taxpayer signs the proposed tax assessment report agreeing to changes made.

Assessment – An assessment is the statutorily required recording of the tax liability.  This generally happens when the IRS determines the taxpayer owed more taxes than reported on the tax return.

Audit Information Management System – An IRS computer system that provides inventory and activity control of active examinations.

Audit Reconsideration – The process the IRS uses to reevaluate the results of a prior examination where additional tax was assessed and remains unpaid or a tax credit was reversed.  If the taxpayer disagrees with the original determination, he or she must provide information that was not previously considered during the original examination.

Automated History Sheet – An electronic workpaper in IRS computer systems which captures the actions taken on a case (e.g., contacts with taxpayers and research conducted).

Campus – The data processing arm of the IRS.  The campuses process paper and electronic submissions, correct errors, and forward data to the Computing Centers for analysis and posting to taxpayer accounts. 

Correspondence Examination Automation Support System – The IRS suite of web-based applications developed to enhance the site examination process.  The system enables case assignment and transfer between examination groups and batch groups, facilitates a universal view of the campus examination case inventory, and allows the display of the client-generated tax reports and letters associated with the examination case.

Defaulted – Cases are defaulted when the taxpayer fails to sign the proposed tax assessment report, contacts the IRS to appeal the tax assessment, or petitions the Tax Court.

Individual Return Transaction File – One of the IRS data files stored at the Data Center Warehouse.  The Return Transaction File contains all edited, transcribed, and error-corrected data from the U.S. Individual Income Tax Return (Form 1040) series and related forms for the current processing year and 2 prior years.

Integrated Data Retrieval System – An IRS computer system with the capability to instantaneously retrieve or update stored taxpayer information.  The system tracks taxpayer status and allows for post transaction updates back to the Master File.

IRS Received Date – The date the IRS stamps on correspondence designating the date it was received at any IRS office or campus.

Lean Six Sigma – Lean is a time and value-based process improvement philosophy designed to eliminate waste and nonvalue-added activities.  Six Sigma is a business process improvement method that uses data and facts to produce bottom-line measurable results through reduction in process variation.

Master File – The IRS database that stores various types of taxpayer account information.  This database includes individual, business, and employee plans and exempt organizations data.

Program Received Date – The date the Discretionary Examination and Correspondence Examination Program receives and date stamps mail for processing.

Treasury Inspector General for Tax Administration Data Center Warehouse – A centralized storage and administration of files that provides data and data access services of IRS data.

 

Appendix VIII

 

Management’s Response to the Draft Report

 

DEPARTMENT OF THE TREASURY

INTERNAL REVENUE SERVICE

ATLANTA, GA 30308

 

               COMMISSIONER

WAGE AND INVESTMENT DIVISION

 

 

January 6, 2011

 

 

MEMORANDUM FOR MICHAEL R. PHILLIPS

    DEPUTY INSPECTOR GENERAL FOR AUDIT

 

FROM:                            Richard Byrd, Jr. /s/ Richard Byrd

    Commissioner, Wage and Investment Division

 

SUBJECT:                     Draft Audit Report – Progress Has Been Made to Reengineer the Examination Program, but Additional Improvements Are Needed to Reduce Taxpayer Burden (Audit #201030034)

 

We have reviewed the subject draft report and appreciate your recognition of steps we have taken to reengineer the Correspondence and Discretionary Examination Program in the Wage and Investment and Small Business/Self-Employed Divisions.  Specifically, your report acknowledges the implementation of the corporate routing of the telephone system and the pilot of the new mail model processing for Correspondence Examination. The implementation of the corporate routing of the telephone system enables taxpayers to speak directly to examiners to address concerns about their examinations. Corporate call routing has significantly increased the taxpayer’s ability to reach an examiner. Your report also acknowledged that the new mail model process has improved compliance with Program guidelines.

In recent years, we implemented other processing enhancements to improve our service in response to concerns from taxpayers and tax practitioners. These include: standardizing and extending the time periods for taxpayers to provide supporting documentation, granting additional time to respond to notices when requested by taxpayers, issuing acknowledgement letters to taxpayers upon receipt of their correspondence, providing additional training to examiners to help reduce taxpayer burden, and providing additional tools for our telephone assistors, which has enabled them to provide more complete and consistent answers to taxpayers.  These enhancements have enabled us to be more responsive to taxpayers. It is also noteworthy that we have implemented these program enhancements as a result of efficiencies in our program, without additional funding.

Your report addresses timeliness issues identified in the processing of sample cases. While we agree that some of the sample cases involved processing delays, incorrect audit determinations were not found in these cases, which is one of our significant goals in providing service to the taxpayers. Additionally, our existing review process and campus oversight adequately monitors these timeliness issues.

We are concerned with the terminology used in the report when referring to delayed processing as “errors.” The use of “error” in this context could lead to an incorrect conclusion that these examinations contained errors in reaching correct tax determinations, whereas your audit did not reveal any weaknesses to audit determinations.  Additionally, we are concerned about the small statistical samples used in your report to make population inferences. The small samples, based on assumed error rates, could result in inaccurate inferences to the overall population on certain measures and be misleading to the public.

We do not agree with the outcome measure in Appendix IV because many of the “errors” do not impact taxpayer rights and entitlements. For example, your office has not explained to our satisfaction how it was determined that the lack of an internal date stamp on the taxpayer’s correspondence constitutes a burden to the taxpayer. We also do not feel that the error rate for the agreed cases is reliable since inconsistent time frames were used to determine errors. Cases meeting the same time period were considered errors in some instances, but were not errors in other instances. Additionally, while we set an expectation to address the mail within 30 days, processing variables such as surges in mail receipts often make it difficult to meet this business expectation. We have business measures in place to monitor overage mail specifically for this reason. We also have a process in place (interim letters) to keep the taxpayer informed when case resolution is delayed.

Attached are our comments to your specific recommendations. If you have any questions, please contact me, or a member of your staff may contact Ray Johnson, Acting Director, Reporting Compliance, Wage and Investment Division, at (404) 338-8983.

 

Attachment

 

Attachment

 

The Commissioners, SB/SE and W&I Divisions, should:

 

RECOMMENDATION 1

Ensure that all Program employees follow current mail processing guidelines until the pilot mail model is fully implemented in all 10 sites.

CORRECTIVE ACTION

We will continue to ensure adherence to our procedures in all aspects of our program through our annual operational and program reviews. 

IMPLEMENTATION DATE

Implemented and Ongoing

RESPONSIBLE OFFICIALS

Director, Reporting Compliance, Wage and Investment Division

Director, Campus Reporting Compliance, Small Business/Self-Employed Division

CORRECTIVE ACTION MONITORING PLAN

We will continue to monitor this corrective action as part of our internal management control process.

RECOMMENDATION 2

Strengthen controls over processing agreed cases and for consistency between the two Business Divisions, establish a specific time constraint for Program employees to close agreed cases.

CORRECTIVE ACTION

We agree with this recommendation.  We will establish a standard timeframe and incorporate this into our Internal Revenue Manual guidance.  We also agree to implement a processing change to make it easier to identify agreed cases that are delayed more than seven days.

IMPLEMENTATION DATE

October 15, 2011

RESPONSIBLE OFFICIALS

Director, Reporting Compliance, Wage and Investment Division

Director, Campus Reporting Compliance, Small Business/Self-Employed Division

CORRECTIVE ACTION MONITORING PLAN

We will monitor this corrective action as part of our internal management control process.

RECOMMENDATION 3

The Commissioners, SB/SE and W&I Divisions, should ensure employees follow all program guidelines for handling, responding to, and considering taxpayer correspondence when working Program cases.

 

CORRECTIVE ACTION

We will continue to ensure adherence to our procedures in all aspects of our program through our annual operational and program reviews.

IMPLEMENTATION DATE

Implemented and Ongoing

RESPONSIBLE OFFICIALS

Director, Reporting Compliance, Wage and Investment Division

Director, Campus Reporting Compliance, Small Business/Self-Employed Division

CORRECTIVE ACTION MONITORING PLAN

We will continue to monitor this corrective action as part of our internal management control process.



[1] In the W&I Division, the Program is referred to as Discretionary Examination, and in the SB/SE Division, the Program is referred to as Correspondence Examination.

[2] The 10 sites are located at the Andover Campus in Andover, Massachusetts; Atlanta Campus in Atlanta, Georgia; Austin Campus in Austin, Texas; Brookhaven Campus in Holtsville, New York; Cincinnati Campus in Cincinnati, Ohio; Fresno Campus in Fresno, California; Kansas City Campus in Kansas City, Missouri; Memphis Campus in Memphis, Tennessee; Ogden Campus in Ogden, Utah; and Philadelphia Campus in Philadelphia, Pennsylvania.

[3] See Appendix VII for a glossary of terms.

[4] If the taxpayer’s response is not sufficient, the taxpayer is issued a Request for Consideration of Additional Findings letter (Letter 692) explaining that the response did not substantially verify the issues.  This is done prior to the issuance of the Notice of Deficiency.  See Appendix VI for a Notice of Deficiency (Letter 3219).

[5] Unless otherwise noted, all references to days throughout the report are calendar days.

[6] This is the period after the IRS made a presentation to the Board in February 2009.

[7] The error rates from the statistical samples throughout this report, including results in Figures 2 and 3, are weighted error rates based on the errors and population of closed cases for each Division.  As a result, the percentages cannot be determined based on the numbers presented.  See Appendix IV for the methodology used to determine the percentages.

[8] Figure 2 shows 6 of 35 cases were not closed timely; however, it took an average of 7 days to close all 35 cases.  Based on these results, the Austin Compliance Site met the W&I Division’s 7-day expectation.

[9] For those attributes that do not total 62, we could not evaluate them because information was not available for us to determine if an error occurred.

[10] To calculate the error rate for this Program attribute, we determined *************1************* and 19 W&I Division cases were closed beyond 7 days.

[11] We used a 90 percent confidence interval and a precision of ±6.25 percent to calculate a sample size to select cases we reviewed to identify error rates where employees did not always follow procedures when closing cases.  We used the results of our analysis to project across the population of closed agreed cases. 

[12] We could not compare these results to a judgmental sample from the Austin Compliance Site because there were no default cases available for us to review when we selected our sample.  Also, the Lean Six Sigma study did not evaluate the Program process for completing audit reconsideration cases.

[13] Only 2 of the 41 cases experienced delays greater than 100 days.  The delays for the remaining cases were all less than 94 days.

[14] For those attributes that do not total 60, we were unable to determine if an error occurred. 

[15] We used a 90 percent confidence interval and a precision of ± 8.5 percent to calculate a sample size to select cases we reviewed to identify error rates where employees did not always follow procedures when closing cases.  We used the results of our analysis to project across the population of closed default cases. 

[16] We randomly selected 27 cases; however, we removed 3 cases from our sample because *******************************1******************

[17] **********************1*******************************************************************  The remaining cases were less than 302 days.

[18] The remaining seven cases were not considered because the correspondence was received beyond the 7-day period. 

[19] We randomly selected 15 default cases; however, ****************************1******************************.

[20] We reviewed the latest Customer Satisfaction Survey results available for both Divisions.

[21] See Appendix VII for a glossary of terms.

[22] We randomly selected 27 cases; however, we removed 3 cases from our sample because 1 case was a duplicate and the remaining 2 did not meet the criteria for the closed cases reviewed.

[23] We randomly selected 15 default cases; however, in 1 case, there was no taxpayer correspondence so the case was eliminated from the sample.

[24] We used a 90 percent confidence interval and a precision of ±6.25 percent to calculate a sample size to select cases we reviewed to identify error rates where employees did not always follow procedures when closing cases.  We used the results of our analysis to project across the population of closed agreed cases.  See Appendix VII for a glossary of terms.

[25] We used a 90 percent confidence interval and a precision of ±8.5 percent to calculate a sample size to select cases we reviewed to identify error rates where employees did not always follow procedures when closing cases.  We used the results of our analysis to project across the population of closed default cases.

[26] We reduced our initial population by 26,408 because we did not receive cases for the Philadelphia Compliance Site.  Because we did not examine any cases from this site, we did not include its population in our projection.

[27] Because of rounding, the projected number of affected cases will not exactly equal the product of the error rate and population shown.

[28] We reduced our initial population by 14,570 because we did not receive cases for the Philadelphia Compliance Site.  Because we did not examine any cases from this site, we did not include its population in our projection.