TREASURY INSPECTOR GENERAL FOR TAX ADMINISTRATION

 

 

The Internal Revenue Service and Contractors Are Generally Following Procedures Established for the Private Debt Collection Program, but Improvements Are Needed

 

 

 

September 10, 2008

 

Reference Number:  2008-30-157

 

 

This report has cleared the Treasury Inspector General for Tax Administration disclosure review process and information determined to be restricted from public release has been redacted from this document.

 

Redaction Legend:

1 = Tax Return/Return Information

3(d) = Identifying Information - Other Identifying Information of an Individual or Individuals

Phone Number   |  202-622-6500

Email Address   |  inquiries@tigta.treas.gov

Web Site           |  http://www.tigta.gov

 

September 10, 2008

 

 

MEMORANDUM FOR COMMISSIONER, SMALL BUSINESS/SELF-EMPLOYED DIVISION

 

FROM:                            Michael R. Phillips /s/ Michael R. Phillips

                                         Deputy Inspector General for Audit

 

SUBJECT:                    Final Audit Report – The Internal Revenue Service and Contractors Are Generally Following Procedures Established for the Private Debt Collection Program, but Improvements Are Needed (Audit # 200730009)

 

This report presents the results of our review of the Internal Revenue Service’s (IRS) Private Debt Collection Program (the Program).  This audit was initiated because several parties, including members of Congress and the National Taxpayer Advocate, had expressed concerns regarding the risks involved in contracting out tax collection activity.  These risks include the potential for disclosure of taxpayer information and violation of taxpayer rights.  The overall objective of this review was to determine whether the IRS and contractors have been following required procedures in the Program since implementation on September 7, 2006.  This audit was conducted as part of the Treasury Inspector General for Tax Administration Fiscal Year 2007 Annual Audit Plan.

Impact on the Taxpayer

The Internal Revenue Code[1] authorizes the IRS to enter into contracts with private collection agencies (hereafter referred to as PCAs or contractors) to assist in the collection of delinquent Federal taxes.  Although many of the Program procedures were being followed, improvements can be made in how a taxpayer’s identity is authenticated, how contractors handle taxpayer requests to opt out of the Program, the quality control system, and case processing.  These improvements will help ensure that taxpayer rights are protected during the collection process.

Synopsis

As of September 30, 2007, the gross accounts receivable owed to the IRS totaled almost $290 billion.  On October 22, 2004, the President signed the American Jobs Creation Act of 2004,[2] which created new Internal Revenue Code Section 6306 (2004) to permit PCAs to help collect Federal tax debts.

This review did not address whether the Program has been successful or whether the policy to use PCAs is appropriate.  The IRS, at the recommendation of the Government Accountability Office, is in the process of conducting a cost-effectiveness study to determine the Program’s effectiveness and its impact on the overall collection of delinquent taxes.  This review focused on compliance with IRS and contractor procedures for administering the Program.

Overall, the IRS and contractors have generally taken actions consistent with the procedures developed for the Program.  For example, the IRS and the PCAs generally followed required procedures for recalled accounts,[3] took appropriate actions to obtain full payment from taxpayers, and took timely and appropriate actions to address reported complaints.  The IRS properly implemented the initial inventory selection criteria and adequately conducted the closeout review of the PCA whose contract ended on March 8, 2007.  In addition, the PCAs appropriately monitored installment agreements with taxpayers for default conditions and adequately met quality standards during telephone calls with taxpayers.

While the IRS and contractors appropriately handled several processes, we identified some issues that needed to be addressed:

  • The IRS and the PCAs were inconsistent about what issues they considered to be complaints.
  • The Complaint Panel’s role was not defined.
  • Contractors were not always able to verify the identity of the taxpayer over the telephone.  When a taxpayer cannot or will not provide the necessary information to verify his or her identity, the contractor cannot discuss the taxpayer’s account or request payment to resolve the tax delinquency.
  • Quality review skip intervals were improperly calculated and applied, which affected the reliability of the results.
  • The results of quality reviews were unreliable because quality review sampling methodologies were merged, results were not properly weighted, telephone monitoring and case action reviews were not conducted on a regularly scheduled basis, and the quarterly sampling requirements were not met for telephone monitoring and case action reviews.  Also, the semiannual meeting with the Statistics of Income Division staff to assess the results and modify the sampling plan was delayed by 6 months.
  • Taxpayer requests to opt out of the Program were inconsistently processed.
  • PCAs properly established installment agreements but did not always document certain actions.
  • The Taxpayer Advocate Service (TAS) did not always properly code cases on its computer system.  Also, TAS case advocates did not always meet requirements for notifying Contracting Officer’s Technical Representatives of TAS activity on cases.  In addition, the IRS and the PCAs did not properly track and process some TAS cases.

After we brought these issues to the attention of IRS management, they took corrective actions to address the conditions.  However, the following issue requires further management action:  the contractors administered the taxpayer satisfaction surveys to the taxpayers, which could influence the results and produce low participation rates.

Recommendations

We recommended that the Director, Collection, Small Business/Self-Employed Division:

  • Continue to monitor each contractor’s authentication process and continue to implement improvements as necessary to assist contractors in increasing the number of authenticated taxpayer contacts.
  • Ensure that the Quality Unit 1) continues to provide weighted estimates of quality for external reporting purposes and for determining sample sizes, 2) continues to conduct quality reviews on a regularly scheduled basis, 3) continues to certify that the quality analysts meet with the Statistics of Income Division staff semiannually, and 4) establishes a plan for backup quality analysts to conduct reviews as needed and continues oversight of the process.
  • Identify and evaluate factors producing the low taxpayer satisfaction survey response rate and identify how to improve the response rate; explore alternatives for obtaining participation from taxpayers who hung up, got disconnected, or were not solicited for participation; and require quality analysts to evaluate contractor calls with taxpayers for undue influence by the PCA representative when following the prescribed scripts for soliciting participation in the survey.

Response

The IRS agreed with our recommendations and has already taken the following corrective actions:  1) improved the taxpayer authentication process; 2) ensured the availability of weighted estimates of quality, ensured that daily telephone reviews are conducted as mandated by the sampling plan and that the sampling plan is confirmed with the Statistics of Income Division quarterly, and ensured that meetings are held with the National Quality Review System staff and the Statistics of Income Division staff to revalidate sample sizes and address any necessary adjustments to the sampling plan; 3) implemented a revised backup plan to ensure that continuous reviews are conducted and sampling plan requirements are met, and held and will continue to hold weekly conference calls between management and the quality analyst staff; and 4) revised the taxpayer satisfaction survey methodology and the quality analyst review to include whether the contract representatives solicit the survey in an unbiased manner, and the Policy and Procedures Guide to reflect changes.

Actions still being addressed include the following:  1) adding to the national webpage a link that will assist taxpayers in verifying whether the PCA calling them is an approved contractor, and revising the Policy and Procedures Guide for PCAs as it relates to the authentication process; 2) revising the Oversight Quality Handbook to capture changes made; and 3) forming a team to explore ways to improve the taxpayer satisfaction survey participation rate.  Management’s complete response to the draft report is included as Appendix VI.

Copies of this report are also being sent to the IRS managers affected by the report recommendations.  Please contact me at (202) 622-6510 if you have questions, or Margaret E. Begg, Acting Assistant Inspector General for Audit (Small Business and Corporate Programs), at (202) 622-8510.

 

 

Table of Contents

 

Background

Results of Review

The Internal Revenue Service and Private Collection Agencies Took Appropriate Actions Related to Many Program Processes

The Complaint and Concern Process Needs Clarification

Recommendation 1:

Quality Review Results Are Not Statistically Valid

Recommendation 2:

Recommendation 3:

The Process for Requesting Participation in the Taxpayer Satisfaction Survey Could Be Improved

Recommendation 4:

Improvement Is Needed for Processing Certain Case Actions

Appendices

Appendix I – Detailed Objective, Scope, and Methodology

Appendix II – Major Contributors to This Report

Appendix III – Report Distribution List

Appendix IV – Glossary of Terms

Appendix V – Details of Identified Complaints and Concerns

Appendix VI – Management’s Response to the Draft Report

 

 

Abbreviations

 

COTR                 Contracting Officer’s Technical Representative

FDCPA                Fair Debt Collection Practices Act

IRS                  Internal Revenue Service

PCA; contractor      Private Collection Agency

SOI                  Statistics of Income

TAS                  Taxpayer Advocate Service

 

 

Background

 

One objective of the Private Debt Collection Program is to use private collection agencies to help collect the $290 billion in taxes owed to the Federal Government.

On October 22, 2004, the President signed the American Jobs Creation Act of 2004,[4] which created new Internal Revenue Code Section (§) 6306 (2004) to permit private collection agencies (hereafter referred to as PCAs or contractors) to help collect Federal tax debts.  The law allows PCAs to locate and contact any taxpayer specified by the Internal Revenue Service (IRS), to request from such taxpayer full payment of the amount of Federal tax due, and to obtain financial information with respect to such taxpayer.  The law allows the IRS to pay an amount not in excess of 25 percent of the amount collected by each PCA for the cost of services performed under a contract.

The gross accounts receivable amount owed to the IRS has been very large for many years (it totaled almost $290 billion as of September 30, 2007).  To help address this tax debt inventory, the Department of the Treasury proposed that Congress pass legislation authorizing the IRS to use PCAs to help collect tax debts for simpler types of cases.  The IRS refers to this effort as the Private Debt Collection Program (the Program).

The IRS believes that cases have a higher probability of collection when taxpayers are contacted.  The PCAs are assigned cases that the IRS would otherwise be unable to work because of its limited resources.  The IRS established three main objectives for the Program:

  • Help significantly reduce the growing number of uncollected tax liabilities.
  • Help maintain taxpayer confidence in the fairness of the tax system by assisting the IRS in addressing more of its delinquent accounts.
  • Assist the IRS in its continued focus on dedicating existing collection and enforcement resources to more difficult cases and issues.

The legislation requires the provisions of the Fair Debt Collection Practices Act[5] (FDCPA) to be applied to the PCAs.  The law also prohibits PCAs from committing any act, or omitting any act, that IRS employees are prohibited from committing or omitting in the performance of similar services.  In addition, the legislation created Internal Revenue Code § 7433A (2004) to permit civil actions by taxpayers for unauthorized collection actions by employees of the PCAs.  The law also amended § 1203 of the IRS Restructuring and Reform Act of 1998[6] relating to termination of employment for misconduct to include employees of PCAs, if such individuals committed any act or omission described under subsection (b).

The IRS requires that contractors comply with all taxpayer protections and prohibits them from threatening or intimidating taxpayers, or otherwise suggesting that enforcement action will or might be taken, if a taxpayer does not pay the liability.  The contractors must also adhere to all security and privacy regulations for systems, data, personnel, and physical security, and all taxpayer rights and protections.

On March 9, 2006, the IRS awarded contracts to 3 firms from a field of 33 for the first phase of the Program.  On September 7, 2006, the IRS placed an initial inventory of 11,562 balance-due accounts with the 3 contractors.  While the contract for one of the PCAs ended in March 2007, the IRS has twice renewed the contracts for the other two PCAs.  As a result of the most recent renewal in March 2008, the Program will continue into March 2009.

This review did not address whether the Program has been successful or whether the policy to use PCAs is appropriate.  In May 2004, the Government Accountability Office recommended a comprehensive study of the Program to ensure that the IRS is making the most effective and cost-efficient use of total resources available.  In response, the IRS Commissioner agreed to complete a study to determine the Program’s effectiveness and its impact on the overall collection of delinquent taxes.  The Government Accountability Office agreed that the results of the cost-effectiveness analysis would be provided as soon as practical after full Program implementation, which began when the new contracts were awarded in March 2008.  The Cost-Effectiveness Study final report is scheduled to be issued in August 2008.

We previously reported[7] that the IRS needs to continue monitoring inventory levels to ensure that the volume of cases available for placement is sufficient.  The IRS agreed with our recommendation and reaffirmed its commitment to monitoring the Program and to make any changes necessary to maximize the effectiveness of its strategy.  Since then, the IRS has continued to explore options for other potential inventory by testing the placement of additional case types, including Taxpayer Delinquency Investigations[8] and various Automated Collection System cases.  The challenge of identifying additional sources of cases will continue as the inventory of current case types is exhausted.

In the same report, we also noted that the predicted collection rate used by the IRS is higher than the industry standard.  IRS research of Federal and State Government agencies reflected an average collection rate of less than 3 percent.  However, the IRS had predicted a collection rate of 10 percent to 15 percent once it and the PCAs reached optimum performance and productivity.  The IRS agreed with our recommendation to continue updating revenue projections to ensure that it appropriately accounts for the actual collection rate achieved.  As the IRS predicted, the collection rate increased early in the Program.  However, after 6 months, the collection rate steadily declined each month from a peak of 11.7 percent in January 2007 to 5.1 percent in February 2008.  Based on this trend, it is important that the IRS continue monitoring the actual collection rate and updating the revenue projections.

This review was performed in the IRS Small Business/Self-Employed Division in New Carrollton, Maryland, and Kansas City, Missouri, and in the contractor worksites of Pioneer Credit Recovery, Inc. in Perry, New York; Linebarger Goggan Blair & Sampson LLP in Austin and San Antonio, Texas; and The CBE Group Inc. in Waterloo, Iowa, during the period April 2007 through February 2008.  We conducted this performance audit in accordance with generally accepted government auditing standards.  Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objective.  We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objective.

Detailed information on our audit objective, scope, and methodology is presented in Appendix I.  Major contributors to the report are listed in Appendix II.  A Glossary of Terms is included in Appendix IV.

 

 

Results of Review

 

The Internal Revenue Service and Private Collection Agencies Took Appropriate Actions Related to Many Program Processes

Overall, the IRS and contractors have taken appropriate measures to address many of the Program processes.  Specifically:

  • The initial inventory selection criteria were properly implemented.  To qualify for the Program, a case had to involve an individual taxpayer with a balance due for 1 tax period, and the amount due had to be less than $25,000.  The IRS initially assigned 11,562 balance-due accounts to the PCAs on September 7, 2006.  From this population, we selected and reviewed a statistical sample of 73 balance-due accounts to determine whether account data matched the initial inventory selection criteria.  All of our sample cases met the selection criteria.
  • Procedures were generally followed when cases were recalled.  A total of 3,177 taxpayer accounts were recalled to the IRS from the Program on September 7, 2006, through March 30, 2007.  From this population, we selected and reviewed a random sample of 30 cases and determined that the IRS and the PCAs generally followed required procedures for recalled accounts.
  • Procedures were properly followed for fully paid cases.  A total of 3,356 taxpayer accounts were paid in full from September 7, 2006, through March 29, 2007.  From this population, we selected and reviewed a random sample of 30 taxpayers to determine whether the IRS and the PCAs followed required procedures.  IRS procedures include posting payments to taxpayer accounts and providing account update information to the PCAs.  PCA procedures include requesting full payment from the taxpayer and monitoring account update information to identify payments received.  The IRS and the PCAs properly followed procedures for the 30 fully paid accounts in our sample.
  • PCAs were appropriately monitoring defaulted installment agreements.  When PCAs establish installment agreements with taxpayers, they are required to monitor the cases for payment compliance.  Two conditions will cause a PCA-monitored installment agreement to default:  the failure of the taxpayer to make scheduled installment agreement payments or the presence of a new balance due related to another tax period.  We obtained default lists from each PCA and selected and reviewed a judgmental sample of 30 taxpayers who had defaulted.  We determined that the PCAs were appropriately monitoring these taxpayers.
  • Actions taken to address reported complaints were timely and appropriate.  Upon receipt of a complaint, the contractors are required to immediately stop collection action.  In addition, they must provide a copy of the collection activity record on the account and any other relevant information to the Contracting Officer’s Technical Representative (COTR) by the following business day.  Our review of the Complaint Panel meeting minutes and the contractors’ complaint logs identified 101 reported incidents[9] occurring between September 7, 2006, and August 1, 2007.  This represents less than 0.2 percent of the 58,155 accounts placed with the PCAs during roughly the same period.  The cases included instances when the taxpayers, the taxpayers’ representatives, or other third parties expressed concern regarding the authentication process, the calling practices, or other interactions with the contractors.  The cases also included inadvertent disclosure of taxpayer information and other contract concerns.  Details are provided in Appendix V.

The actions taken to address these incidents were both timely and appropriate.  In general, the contractors stopped collection actions on taxpayers’ accounts in a timely manner when required; the contractors provided timely notice to the COTRs when they became aware of complaints as required; and the contractors, COTRs, and Complaint Panel[10] evaluated the nature of the complaints, determined whether actions by the collectors and contractors were in accordance with procedures, and determined whether actions (if necessary) were appropriate to prevent future complaints.

Further, our review showed that in 98 of the 101 incidents, the contractors followed IRS-approved collection practices, the actions were inadvertent, or no information was identified to substantiate the incident.  In just three instances, the IRS considered the actions to be a significant contract concern and appropriately imposed a penalty as allowed under the contract.

  • The quality of contractor calls was adequate.  We monitored 50 telephone calls (25 for each of the 2 PCAs that remain under contract) received or placed by the contractors during our onsite visits in July and August 2007.[11]  Overall, the quality of the calls we monitored was adequate.  In our sample cases, the contractors’ representatives were clear, professional, and courteous.  The representatives appropriately conducted authentication actions to determine that they were speaking to the appropriate party, discussed payment alternatives when warranted, and documented the calls on the contractors’ systems.
  • The process for ending operations at a PCA whose contract expired was adequate.  The contract for one of the PCAs ended in March 2007.  The IRS conducted a closeout review of this PCA in May 2007.  The IRS ensured that the PCA developed and took appropriate measures relating to the disposal, destruction, return, and safeguarding of Federal tax information under the PCA’s control pursuant to the completion of its contract with the IRS.

While the IRS and contractors appropriately handled these processes, we identified some issues that need to be addressed.

The Complaint and Concern Process Needs Clarification

The IRS has a process for handling taxpayer complaints about its employees.  Such complaints are reported to several IRS functions for tracking on management information systems.  Similarly, contracts with the PCAs include a process for handling complaints, including provisions for the imposition of a payment deduction (i.e., a penalty) for a complaint that is validated.  As previously discussed, the contractors’ actions to address identified complaints were timely and appropriate.  However, the IRS and the contractors were inconsistent about what issues they considered to be complaints.  The IRS needs to clarify how contractors should classify complaints and the role of the Complaint Panel.  In addition, continued management attention is needed to evaluate solutions to taxpayers’ reservations about the authentication process.

The IRS and the PCAs were inconsistent about what issues they considered to be complaints

As a result of unclear guidance defining complaints, the IRS and the contractors were inconsistent about what they considered to be complaints.  As previously discussed, our review of the Complaint Panel meeting minutes and the contractors’ complaint logs identified 101 incidents occurring between September 7, 2006, and August 1, 2007.  However, the Program considered only 81 of the 101 incidents to be complaints.  The remaining 20 cases were included on the contractor’s complaint log but were categorized by the Program as authentication concerns rather than complaints and were not reviewed by the Complaint Panel.  Our review of these 20 cases showed:

  • 3 (15 percent) involved taxpayer representatives (authorized and not authorized) who expressed concerns, including a concern over the clarity of the contractor’s letter sent to authorized taxpayer representatives.  The IRS considered a similar issue to be a complaint for the other contractor.  These cases do not appear to involve authentication concerns.
  • 17 (85 percent) involved taxpayers who expressed concern about the authentication process.  Treatment of these 17 cases was inconsistent with that given to similar cases that were included as complaints.  However, the Program has expanded the definition of this issue to include all refusals to authenticate and has taken significant actions in this area.

In addition, the contractors have inconsistent interpretations of what represents a complaint.  At one contractor, calls in which the taxpayer sounds irate or indicates a desire to file a complaint are forwarded to the next-level supervisor and are classified as complaints.  At the other contractor, if such a call is favorably resolved by the next-level supervisor, the instance is not classified as a complaint.  However, the next section of this report discusses corrective action the IRS has taken that should improve the consistency of complaint classification and identification.

The role of the Complaint Panel has been defined

Program management identified the need to provide assistance and consistency to the COTRs in complaint validation, penalty determinations, and the overall execution of the contract.  The IRS established the Complaint Panel to perform that role, but a panel charter that outlined the role and responsibilities of the Complaint Panel was not completed at that time.  For example, the Complaint Panel needed to decide what types of complaints warrant its attention.  The IRS categorized 66 (81 percent) of the 81 complaints as Type One complaints.  These cases do not result in contract penalties unless they are elevated to a higher complaint level.  Without a clear vision of the purpose of the Panel, contract administration time might not be effectively used.

Management Action:  After we brought these issues to management’s attention, they took corrective action by issuing a Complaint Panel Charter in January 2008.  The Charter further defines complaints and the purpose of the Panel to include:

  • Ensuring the consistency of adjudication of complaints as contract violations.
  • Reviewing Type One complaints for trends and, when necessary, recommending improvement processes.
  • Evaluating Type Two and Type Three complaints for purposes of validation and, when necessary, recommending assessment of penalties.

Continued monitoring of the authentication process is needed

When a taxpayer cannot or will not provide the information necessary to verify his or her identity, the contractor cannot discuss the account or request payment to resolve the tax delinquency.  A majority of the complaint cases (52 of 101) involved the authentication process.

IRS management and the contractors identified the authentication process as a concern.  In June 2007, the IRS began tracking instances in an authentication concern log.  The log showed 1,385 occasions in which taxpayers refused to provide the contractors with information needed to authenticate their identity during the 3-month period from July through September 2007.  There were another 368 instances[12] in which taxpayers were unable to authenticate.  In comparison, our review showed 9,013 instances in which right-party contacts were made.  This represents a ratio of approximately 16 authentication concerns for every 100 taxpayers contacted.[13]
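
For illustration only, the ratio cited above can be approximated with simple arithmetic.  The sketch below assumes the denominator counts both the right-party contacts and the authentication concerns; the report’s footnoted methodology, not this sketch, is authoritative.

    # Illustrative sketch only; assumes "taxpayers contacted" includes both
    # right-party contacts and the authentication concerns described above.
    refused_to_authenticate = 1385   # July through September 2007
    unable_to_authenticate = 368
    right_party_contacts = 9013

    concerns = refused_to_authenticate + unable_to_authenticate
    per_100_contacted = 100 * concerns / (right_party_contacts + concerns)
    print(round(per_100_contacted))  # approximately 16 concerns per 100 contacted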

The IRS has taken a number of actions to improve the contractors’ abilities to locate and authenticate taxpayers, including:

  • Providing contractors with the taxpayer’s date of birth information that can be used as an alternative to the taxpayer’s complete address during the authentication process.  This helps to authenticate taxpayers who are unable to provide an exact match to the address of record.
  • Performing a pilot project on the use of postal tracers to assist contractors in locating the correct taxpayer.
  • Issuing a certified initial contact letter to taxpayers who refuse to authenticate and who indicate that they had not received the initial contact letter.
  • Issuing a letter to taxpayers who refuse to authenticate but who indicate that they have received the initial contact letter.
  • Providing the contractors with the taxpayer’s last known telephone number(s), when available, to assist contractors in locating the correct taxpayer.

In October 2007, IRS management also obtained an IRS Office of Chief Counsel opinion allowing the contractors to disclose that they are calling on behalf of the IRS during the authentication process in certain circumstances.  This should help to alleviate concerns for those taxpayers who might be unfamiliar with the business names of the contractors.  Program management informed us that they have developed measures to evaluate these improvements in the authentication process.  Because some of these actions are still in development, the IRS needs to closely monitor authentication matters to identify emerging issues and to make improvements.

Recommendation

Recommendation 1:  The Director, Collection, Small Business/Self-Employed Division, should continue to monitor the contractors’ authentication process and continue to implement improvements as necessary to assist contractors in increasing the number of authenticated taxpayer contacts.

Management’s Response:  The IRS agreed with the recommendation.  As noted in the report, the Director, Collection, Small Business/Self-Employed Division, has already taken several corrective actions to improve the taxpayer authentication process.  Actions still being addressed include adding to the national webpage a link that will assist taxpayers in verifying whether the PCA calling them is an approved contractor and revising the Policy and Procedures Guide for PCAs as it relates to the authentication process.  

Quality Review Results Are Not Statistically Valid

The quality review process provides a method to monitor, measure, and improve the quality of work.  Quality review results enable management to identify trends, problem areas, training needs, and opportunities for process improvement.  The Program Quality Unit administers quality control tests, compiles the results, and provides the quality review results to Program management and various external stakeholders, including the Tax Fairness Coalition and the Treasury Inspector General for Tax Administration.  During the period October 2006 through June 2007, the Quality Unit consistently reported quality levels above 90 percent from its telephone monitoring and case action reviews.  Results incorporate the factors of timeliness; professionalism; and customer, regulatory, and procedural accuracy.

The IRS Statistics of Income (SOI) Division staff developed sampling plans for quality analysts to use to monitor PCA telephone calls and conduct case action reviews.  The plans ensure that samples are valid at the PCA site level quarterly.  We evaluated data related to the quality analysts’ telephone monitoring and case action reviews of the two PCAs that remain under contract.  Several concerns surfaced about the validity of the reported quality results.  The IRS has taken action to address some of our concerns, but additional management action is needed to address others.

Management took action to correct the quality review skip intervals and the merging of sampling methodologies

Proper calculation and application of the skip interval is necessary to ensure that a sample is spread appropriately across the population and to provide estimates that are relatively unbiased.  We determined that ****3(d)**** However, this problem was corrected in February 2007 when the Quality Unit started obtaining its sample using a new selection method that was not previously available.
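
As general context, a skip interval for systematic sampling is typically derived from the population and sample sizes, as in the generic sketch below.  This is not the Quality Unit’s actual selection method, and the figures used are hypothetical.

    import random

    def systematic_sample(population, sample_size):
        # Generic systematic selection: derive the skip interval from the
        # population and sample sizes, choose a random start, then take
        # every k-th item so selections are spread across the population.
        k = max(len(population) // sample_size, 1)
        start = random.randrange(k)
        return population[start::k][:sample_size]

    # Hypothetical example: select 270 of 2,000 recorded case actions.
    case_actions = list(range(1, 2001))
    reviewed = systematic_sample(case_actions, 270)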

In addition, the Quality Unit implemented and combined 2 different sampling methods during the first 6 months of the Program.  According to the Internal Revenue Manual, different samples should never be merged because doing so could cause biased results.  Although the Quality Unit has been proactive in improving the quality case selection process, it did not consider when a new process should be implemented.  Because the Quality Unit plans to continue using the sampling methodology implemented in 2007, additional corrective action is not necessary.  However, if the Quality Unit identifies an improved method for case selection, it should coordinate with the SOI Division staff to ensure that implementation is executed appropriately.

Improvements are still needed in the Quality Review Program

While the IRS has taken some corrective actions, we identified concerns that require further management attention.  Specifically, the Quality Unit did not:

  • Consider the relative weight of each sample with respect to the entire population.
  • Conduct telephone monitoring and case action reviews on a regularly scheduled basis or select a sufficient sample size in each quarter of the fiscal year.
  • Regularly meet with the SOI Division to assess results and adjust sampling plans.

Weighting the samples.  According to the Internal Revenue Manual, weighting should be used to ensure that every sampled case has the appropriate amount of influence on the overall cumulative estimate of quality.  Estimates that combine more than one site are not considered statistically valid if they are not weighted.  However, management did not consider the issue of weighting when the Program was developed.  To provide weighted estimates of quality, at our suggestion, the IRS has required the quality analysts to input inventory volumes for case action reviews and telephone monitoring since October 1, 2007.  However, the Quality Unit continued to provide management with unweighted estimates.  After we brought this issue to management’s attention, the Quality Unit began providing management with both unweighted and weighted estimates in February 2008.
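
To illustrate the weighting concept in general terms, a weighted estimate gives each site’s results influence in proportion to that site’s share of the total inventory.  The figures below are hypothetical and are not drawn from the Program’s actual results.

    # Hypothetical illustration of weighting site-level quality estimates
    # by inventory volume; the figures are not the Program's actual results.
    sites = {
        "PCA A": {"inventory": 6000, "quality": 0.96},
        "PCA B": {"inventory": 3000, "quality": 0.90},
    }

    total_inventory = sum(s["inventory"] for s in sites.values())
    unweighted = sum(s["quality"] for s in sites.values()) / len(sites)
    weighted = sum(s["inventory"] * s["quality"] for s in sites.values()) / total_inventory

    print(f"Unweighted estimate: {unweighted:.1%}")  # 93.0%
    print(f"Weighted estimate:   {weighted:.1%}")    # 94.0%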

Telephone monitoring and case action reviews.  The quality analysts did not meet the requirement to monitor five telephone calls per day for either PCA, which could cause biased results.  Through June 2007, the goal was met on only 33 days for 1 PCA and on 47 days for the other during the approximately 180 workdays since Program inception.  The quality analysts indicated that it is sometimes difficult to meet the requirement due to the amount of time they spend performing other duties, and there are no backup quality analysts to fill in when needed.  Further, one of the PCAs provides secure dial-up capabilities for telephone monitoring reviews, but it does not record telephone calls.[14]  As a result, the quality analyst must wait for live calls to conduct reviews.  With our assistance, the quality analyst implemented a process to receive instant notification when the contractor’s collectors make or receive a call, but the process of selecting calls would be more efficient if telephone calls were recorded.

We also determined that quality analysts did not conduct an equal number of case action reviews each month.  For example, in 1 quarter, ****3(d)**** This occurred because quality analysts conducted their reviews as time permitted.  According to the SOI Division, quality analysts must consistently conduct the required number of case action reviews on a regularly scheduled basis (i.e., daily, weekly, or monthly).  Failure to meet this requirement does not allow for results to be weighted and could cause biased results.

In addition, we determined that ****3(d)**** 1 PCA.  For the first 3 quarters since Program inception, ****3(d)**** We believe that the primary cause for this condition was the amount of time ****3(d)**** spent trying to obtain calls to monitor.  Not meeting the quarterly sample requirements for telephone monitoring and case action reviews could cause results to be biased.

Management Action:  After we brought these issues to management’s attention, the Quality Unit significantly improved meeting the requirements for conducting telephone monitoring and case action reviews during the period October 2007 through March 2008.

Adjustments to the sampling plan.  Program management worked with the SOI Division to develop the sampling plan.  When the Program began, management requested a quarterly sample size sufficient to produce estimates of quality with 90 percent confidence and ±5 percent precision.  Quality analysts are required to meet with the SOI Division staff semiannually to recalculate the sample size required for each PCA.  As necessary, adjustments should be made to the sampling plan to ensure that an appropriate number of telephone calls and cases are selected to meet the goal of quality estimates with 90 percent confidence and ±5 percent precision.

Because there were no historical data available on this type of program, the SOI Division staff calculated a sample size of 270 based on a conservative estimate of 50 percent quality.  Using an estimate of 50 percent quality produces the largest sample size possible, so the SOI Division staff recommended that the sample size be adjusted after 2 quarters of historical data became available.  The 270-case sample size went into effect October 1, 2006, and new sample sizes should have been implemented in April 2007.

In August 2007, we informed IRS management that the quality analysts had not met with the SOI Division staff to recalculate the sample size.  Management indicated that they did not want to make changes to the requirements during the first year of the Program.  However, after the quality analysts met with the SOI Division staff, the quarterly sample size requirements for telephone monitoring and case action reviews were reduced from 270 to 120 and 98, respectively, beginning October 1, 2007.  If the quality analysts had met with the SOI Division staff when recommended, they could have reduced their required sample sizes earlier.  They might also have been able to implement changes necessary to lessen the effect of noncompliance with quarterly sample size requirements previously discussed in this report.

The Quality Unit provided the SOI Division staff with an unweighted quality rate of 95 percent to calculate the new sample size requirements.  Using this rate, the sample size requirements for telephone monitoring and case action reviews would have been 60 and 53, respectively.  However, because unweighted quality results are not considered statistically valid, the SOI Division staff had to use a more conservative quality rate of 90 percent in their sample size calculations, which generated higher sample size requirements of 120 and 98 for telephone monitoring and case action reviews, respectively.
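
For context, the widely used normal-approximation formula for estimating a proportion illustrates how the assumed quality rate drives the required sample size.  The sketch below is illustrative only; the Program’s actual figures also reflect site-level allocation and other SOI Division adjustments, so they differ somewhat from these results.

    import math

    def sample_size(p, z=1.645, precision=0.05):
        # Normal-approximation sample size for estimating a proportion p
        # with 90 percent confidence (z = 1.645) and ±5 percent precision.
        return math.ceil(z**2 * p * (1 - p) / precision**2)

    print(sample_size(0.50))  # 271 -- near the conservative 270 initially required
    print(sample_size(0.90))  # 98  -- near the revised case action review sample
    print(sample_size(0.95))  # 52  -- near the 53 implied by the 95 percent rate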

Further, as previously discussed, the quality review sampling methodologies were merged, monitoring was not conducted on a regularly scheduled basis, and quality analysts did not meet the quarterly sample size requirements for telephone monitoring and case action reviews.  However, an SOI Division representative advised us that the conservative estimate of 90 percent did not take these issues into account because the SOI Division staff were not made aware of them at the time by the quality analysts.  The SOI Division representative stated that they would have used an even more conservative estimate to generate the required sample sizes if they had known about these issues.

Management Action:  Since we brought these issues to IRS management’s attention, quality analysts have held quarterly and semiannual meetings with the SOI Division staff to review the quality results and sampling plans during the period September 2007 through March 2008.  While no additional adjustments to the sampling plan have been necessary, the Quality Unit should continue this process to assure that quality review results are statistically valid.

Recommendations

The Director, Collection, Small Business/Self-Employed Division, should ensure that the Quality Unit:

Recommendation 2:  Continues to provide statistically valid, weighted estimates of quality for external reporting purposes and for determining sample sizes, continues to conduct the required number of case action reviews on a regularly scheduled basis to permit weighting of quality estimates and prevent biased results, and continues to ensure that the quality analysts meet with the SOI Division staff semiannually to recalculate the sample size required for each PCA and make any necessary adjustments to the sampling plan.

Management’s Response:  The IRS agreed with the recommendation.  As noted in the report, the Director, Collection, Small Business/Self-Employed Division, has already taken actions to ensure 1) the availability of weighted estimates of quality, 2) that daily telephone reviews are conducted as mandated by the sampling plan, 3) that the sampling plan is confirmed with the SOI Division quarterly, and 4) that meetings are held with National Quality Review Staff and the SOI Division staff (monthly and quarterly, respectively) to revalidate sample sizes and address any necessary adjustments to the sampling plan.  The Director intends to revise the Oversight Quality Handbook to capture changes in the sampling methodology, schedule, and coordination with the SOI Division. 

Recommendation 3:  Establishes a procedure for backup quality analysts to conduct telephone monitoring and case action reviews, as needed, and continues oversight of these processes to ensure that sample size requirements are met.

Management’s Response:  The IRS agreed with the recommendation.  The Director, Collection, Small Business/Self-Employed Division, has implemented a revised backup plan to ensure that continuous reviews are conducted and sampling plan requirements are met.  Also, management now holds weekly conference calls with the quality analyst staff to 1) ensure that sample size requirements are met, 2) verify that the backup plan implemented is sufficient, and 3) ensure that reviews captured are on track to meet sampling plan requirements.  The quality analyst staff will submit a weekly report to management to show that daily targets for telephone reviews were met.  The Director intends to revise the Oversight Quality Handbook to capture these changes. 

The Process for Requesting Participation in the Taxpayer Satisfaction Survey Could Be Improved

The IRS contracted with a consulting company to perform a survey of taxpayers, or their representatives, to gauge their satisfaction with the contractors.  This measure is also used as part of the IRS methodology to evaluate the contractors’ performance.  Currently, the survey is to be offered to every right-party contact.  However, the contractors’ representatives who handle the calls are also responsible for offering the survey.  This creates the potential for the representatives to bias the results.  While we found no evidence that bias occurred, we did identify instances in which some right-party contacts did not have the opportunity to participate in the survey (e.g., the taxpayer hung up before being offered the survey or the contractor forgot to offer the survey).

Taxpayers who participated in the survey were satisfied with their interactions with the PCAs

In the survey, respondents rate their satisfaction on a scale of 1 to 5, with 1 being “very dissatisfied” and 5 being “very satisfied.”  Of the survey questions, 15 related to taxpayer satisfaction.  The other questions were used to gather more information about the respondents themselves.  During the period November 27, 2006, through September 30, 2007, 3,046 taxpayers completed the taxpayer satisfaction survey that was offered to taxpayers assigned to the 2 remaining contractors.

Taxpayers gave the two remaining contractors high marks in response to the survey questions.  The majority (96 percent) of respondents indicated that they were satisfied with the service they received, giving an overall satisfaction rating[15] of a 4 or 5 on the 5-point scale.  The respondents were most satisfied with Courtesy of Representative, Representative’s Willingness to Help With Your Issue, and Fairness of Treatment.  The average ratings for these items over the 10-month period were 4.90, 4.88, and 4.86, respectively.

The survey consultant’s report, and our analysis of the survey data, also showed that the respondents were least satisfied with Ease of Understanding Letters, Tone of Letters, and Keeping You Informed on the Status of Your Case.  The average ratings for these items over the 10-month period were 4.31, 4.31, and 4.50, respectively.  Our discussions with the PCAs indicated that they are evaluating improvement options for their letters.

Contractor representatives conducting collection calls are also responsible for soliciting participation in the survey

The contractors are responsible for offering every right-party contact who called, or was called by a contractor, an equal opportunity to participate in the survey.  To assure that the survey results are representative of the population, the selection process should ensure that each right-party contact has an equal chance of selection, and the selection process should be free from bias.

The Automated Collection System functions, in both the IRS Small Business/Self-Employed Division and Wage and Investment Division, conduct a similar taxpayer satisfaction survey.  In that process, an independent IRS employee monitors calls to match the sampling pattern and solicit taxpayer participation in the survey.  As the call approaches its end, the IRS monitor notifies the IRS representative that the call has been selected for inclusion in the survey.

The contractors were generally consistent in their overall approach for offering the survey to right-party contacts.  However, our comparison of the participation rates (i.e., those who were asked to take the survey and accepted the opportunity to do so) showed a significant difference between the rate achieved by the contractors and the rates achieved by both Automated Collection System functions.  For the period July 1, 2007, through September 30, 2007, the IRS Automated Collection System functions had participation rates of 42 percent and 40 percent, respectively.  In comparison, only 14 percent (1,165 of approximately 8,115) of the right-party contacts[16] offered the survey by the contractors participated in it during this period.

Our discussion with an independent statistician indicated that the survey selection process detracts from the credibility of the survey results.  This occurs because the contractors’ representatives who conduct the collection calls are also responsible for offering the survey, which allows the potential for discretion or discouragement in the selection process (i.e., biased results).  While our monitoring of a judgmental sample of 50 calls did not identify any calls in which taxpayers were discouraged from taking the survey, we did identify instances in which some right-party contacts did not have the opportunity to participate in the survey.

For comparative purposes, we reviewed the contractors’ survey tracking data for the period July 1, 2007, through September 30, 2007.  These data showed that approximately 8,115 (90 percent) of the 9,013 right-party contacts were offered the survey, and 1,165 (13 percent) right-party contacts accepted and completed the survey.

The contractors’ survey tracking data showed that approximately 898 (10 percent) of the 9,013 right-party contacts were not asked to participate in the survey.  This included:

  • Taxpayers who hung up or disconnected prior to being asked to participate in the survey (350 instances, 4 percent).
  • Taxpayers who had limited English-speaking proficiency[17] (189 instances, 2 percent).
  • Other instances such as taxpayers who indicated an inconvenient time to discuss the account issue, taxpayers who indicated that they were driving, or technical issues in the ability to transfer calls to the survey line (164 instances, 2 percent).
  • Contractor employees who forgot to ask (58 instances, 1 percent).
  • Instances for which one contractor’s tracking data did not provide enough detail to determine why the taxpayers were not asked to take the survey (129 instances, 1 percent).

The consulting company that handled the survey indicated that right-party contacts who are not asked to participate in the survey create an unknown bias in the survey because it is unknown whether those not asked to participate would respond in the same manner as those who do participate.  The IRS indicated that plans for the survey include offering it on a random basis to maintain the desired confidence within a tolerable error rate and distributing calls over the entire month.  In implementing this change in the sampling, the IRS could also change the selection process so that a third party who was not substantially involved in the contact is responsible for offering the survey.

The Quality Unit’s random telephone monitoring includes a review element to determine whether the survey was offered.  However, when the IRS begins offering the survey on a random basis as well, a new method will need to be developed to identify a sufficient population of calls.  This new method will need to evaluate whether the surveys are offered according to the survey sampling plan and whether contract representatives exert no undue influence on participation in the survey.

Recommendation

Recommendation 4:  The Director, Collection, Small Business/Self-Employed Division, should further identify and evaluate factors producing the low response rate and identify how to improve the response rate for the taxpayer satisfaction survey.  Alternatives should be researched to obtain participation from taxpayers who hung up, got disconnected, or were not solicited for participation.  As part of the Program’s review process, the Director should require that quality analysts routinely evaluate calls to ensure that there is no undue influence by the contract representative when following the prescribed scripts for soliciting participation in the survey.

Management’s Response:  The IRS agreed with the recommendation.  The Director, Collection, Small Business/Self-Employed Division, has revised the taxpayer satisfaction survey methodology to minimize the potential for bias by spreading the random sample out over each month.  Also, the quality analysts’ review of telephone recordings now includes evaluating whether the contract representatives solicit the survey in an unbiased manner.  The Policy and Procedures Guide has been updated to reflect these changes.  The Director intends to form a team to explore ways to improve the survey participation rate. 

Improvement Is Needed for Processing Certain Case Actions

We also evaluated the processing of taxpayer requests to opt out of the Program, the establishment of installment agreements, and the referral of cases to the Taxpayer Advocate Service (TAS).  Our review identified a need for improvements in each of these areas.

Taxpayer requests to opt out of the Program were inconsistently processed

When a taxpayer chooses not to work with a PCA, he or she must submit a written request to opt out of the Program.  Upon receipt of verbal notification, the PCA must suspend all activity on the account for 30 calendar days to allow the taxpayer sufficient time to make a written request.  When a written request is received, the PCA must ensure that no contact is made with the taxpayer.  If the PCA continues to contact the taxpayer in these circumstances, the subsequent contacts might represent violations of the FDCPA.  From September 7, 2006, through May 26, 2007, the IRS placed 43,805 accounts with the PCAs.  During this same period, 227 taxpayers opted out of the Program.  From this population, we selected and reviewed a random sample of 30 cases to determine whether the IRS and the PCAs followed required procedures.  The IRS did not follow procedures for 8 of the 30 cases.  Specifically:

  • In four cases, the Referral Unit did not notify the PCA of the taxpayer’s verbal request to opt out of the Program.  Therefore, no 30-day hold was placed on the related accounts to stop collection activity pending receipt of the written request.  No contact was made with the taxpayer in any of these cases.  However, ****1**** Although there were no violations for these 4 cases, the Referral Unit’s failure to notify the PCA to place a 30-day hold on these accounts could have resulted in violations of the FDCPA.
  • In three cases, the Referral Unit did not document, in the case history, the taxpayer’s verbal request to opt out.  However, the Referral Unit did notify the PCAs to place a hold on the related accounts.
  • ****1**** While the Referral Unit Policy and Procedures Guide does not cite a specific time period for processing requests, we believe that ****1****

We also determined that the PCAs did not follow procedures for ****1**** of the 30 cases.  For example:

  • ****1**** The PCA Policy and Procedures Guide requires the contractors to forward requests within 1 business day of receipt.
  • ****1**** While the PCA Policy and Procedures Guide requires the contractors to issue recall letters to taxpayers who opt out of the Program, it does not cite a specific time period as to when the letters should be issued.

Management Action:  After we brought these issues to Program management’s attention, they informed us that, in June 2008, they provided Referral Unit employees with refresher training.  This training included a lesson that reinforced the requirements to notify the PCA when they receive a verbal request from a taxpayer to opt out of the Program and to properly document the contact in the case history.  Program management also added a requirement to the Referral Unit Policy and Procedures Guide for employees to complete initial actions on an account within 5 business days of Referral Unit receipt of a written taxpayer request to opt out of the Program.  In addition, Program management added a requirement to the PCA Policy and Procedures Guide for PCAs to issue a recall letter to a taxpayer within 5 business days from the date the PCA receives systemic notification of the recall.

PCAs properly established installment agreements but did not always document certain actions

Through April 28, 2007, the Program had implemented 1,471 installment agreements with taxpayers.  From this population, we selected and reviewed a random sample of 30 cases to determine whether the PCAs followed required procedures.  Overall, we determined that when a PCA concluded that full payment by the taxpayer was not possible, it appropriately established terms with the taxpayer for a mutually agreed upon installment agreement to pay the tax liability and obtained the necessary approvals from the IRS.  However, our audit results showed two areas that need management attention:

  • In four cases, the PCA did not document in the case history whether the taxpayer was advised about compliance checks, as required in the PCA Policy and Procedures Guide.  In these cases, we believe that the PCA simply forgot to follow the documentation procedure.
  • In four cases, the PCA did not document in the case history whether the taxpayer was advised about the user fee.  The PCA Policy and Procedures Guide requires the PCAs to advise the taxpayer about the user fee, but there is no requirement in the Guide to document this in the case history.

To be granted an installment agreement, a taxpayer must be in full compliance for all tax periods other than those under consideration.  If the PCA does not conduct a compliance check, and the taxpayer is not in full compliance, the installment agreement request will be rejected.  In this event, the taxpayer will have to resolve the issue and perform additional work with the PCA to submit another installment agreement request, thus causing the taxpayer unnecessary burden.  If the PCA does not advise the taxpayer about the installment agreement user fee, assessment of the fee could confuse the taxpayer and require additional effort to resolve, resulting in unnecessary taxpayer burden.  Documenting the case history would help ensure that the taxpayer was advised.

Management Action:  After we brought these issues to Program management’s attention, they issued instructions to the contractors in January 2008 that directed contractors to remind their employees of the requirements to conduct full compliance checks and to document this action on the PCA systems.  They also informed us that contractors will be provided with refresher training in July 2008 that will include a lesson to reinforce these requirements.  In addition, Program management added a requirement to the PCA Policy and Procedures Guide for PCAs to document in their system that the taxpayer was notified of the installment agreement user fee.

Cases were not always properly coded on the TAS computer system

Since the Program was first proposed in 2002, the TAS’ role has been to ensure that taxpayer protections applicable to IRS collection employees apply with equal force to PCA personnel.

When Program-related TAS cases are not properly coded on the TAS computer system, case advocates are less likely to properly notify the COTRs of TAS case openings and closings.  This limits the COTRs’ ability to direct PCAs to stop and resume collection activity when an account is being worked by the TAS.  In July 2007, we reconciled the cases identified on the TAS computer system as Program cases with those on the IRS’ lists of Program-related TAS cases for each PCA, as well as those on one of the PCA’s fax logs.  Out of the 549 cases, we identified 95 from the IRS’ lists and the PCA’s fax logs that were on the TAS computer system but were not properly coded as Program cases.  We discussed this issue with the TAS Representative for the Program, who advised us that he or she was aware of the issue and was already planning to conduct a reconciliation.

In August 2007, we obtained lists of Program-related TAS cases from each of the PCAs and added them to our reconciliation.  In November 2007, we rechecked the TAS computer system and determined that the coding issue for the majority of the cases had been corrected.  Out of the 652 cases, we identified 13 that were on the TAS computer system but were not properly coded as Program cases.  In addition, we identified nine cases that should not have been coded as Program cases.  We believe that these issues were caused by the lack of case advocate experience in handling Program cases.  After bringing these 22 cases to the TAS Representative’s attention, we were informed that the coding had been corrected.

Management Action:  To prevent the coding problem from continuing, management reminded TAS employees of procedures for handling Program cases through several regularly scheduled weekly emails.  Also, in January 2008, all TAS employees were provided training that incorporated a session on working Program cases.  Additional actions the TAS intends to take include conducting a monthly match between the TAS Representative and the COTRs, updating and clarifying TAS procedures in the Internal Revenue Manual, and working with the Program team to reach agreement on improving the process.

Case advocates did not always meet requirements for notifying COTRs of TAS activity on cases

When a PCA becomes aware of TAS involvement on a particular case, the PCA is required to put a hold on all collection activity until the IRS notifies it that collection activity can be resumed.  If the case advocates do not properly notify the COTRs of openings and closings of Program-related TAS cases, the COTRs are unable to give the PCAs timely and appropriate notice to place and release holds on collection activity.

During our reconciliation of 652 Program-related TAS cases and an in-depth review of 50 of these cases, we identified:

  • 41 cases for which case advocates did not send Operation Assistance Requests to the COTRs to notify them of TAS involvement.
  • Seven cases for which case advocates did not send timely Operation Assistance Requests to notify COTRs of TAS involvement.
  • Six cases for which case advocates did not send notifications of closing action to the COTRs in a timely manner.

In addition, the PCAs did not place holds on six accounts while they were being worked by the TAS.  We determined the cause for each case as follows:

  • Four were caused by case advocates not properly notifying the COTRs of TAS case openings.  In ****1**** of the four cases, the PCA was a contributing factor because it did not recognize the need for TAS referral.
  • ****1****
  • ****1****

As with the coding issue, we believe that these issues were caused by the lack of case advocate experience in handling Program cases.  The TAS’ actions to correct the coding problem should also improve communication between the case advocates and the COTRs.

The IRS and the PCAs did not properly track and handle some TAS cases

When a taxpayer requests assistance from the TAS or describes circumstances meeting criteria for referral to the TAS, the PCA prepares and forwards a referral to the quality analyst.  The quality analyst scans the referral and forwards it via email to the COTR, who inputs the referral into a computer system, from which it is then retrieved by the TAS.  Taxpayers may also contact the TAS directly.  In these instances, the TAS will forward notification of TAS involvement to the COTRs.  The PCAs and COTRs maintain separate lists of all cases referred to or from the TAS.  We used these lists as part of our reconciliation process to identify 652 Program-related TAS cases.  During our reconciliation of the 652 cases and an in-depth review of 50 cases, we identified the following issues:

  • Five cases forwarded by the COTRs were not received by the TAS.  There are no procedures for COTRs to verify timely receipt of cases by the TAS.  As a result, the TAS was unable to address the taxpayers’ concerns.
  • Four cases forwarded by the PCAs were not received by the COTRs or the TAS.  There are no procedures for PCAs to verify timely receipt of cases by the COTRs.  As a result, the TAS was unable to address the taxpayers’ concerns.
  • Three cases that met TAS referral criteria were not identified by the PCAs or the Referral Unit as needing referral.  These cases were initiated when the taxpayers called the TAS directly for assistance.  They all occurred toward the beginning of the Program, when processes were first placed into action.  As a result, the TAS was unable to address the taxpayers’ concerns in a timely manner.  Because this condition has not continued to occur, we made no recommendation.
  • Seven cases were not forwarded in a timely manner by the COTRs, quality analysts, and/or PCAs.  The responsible parties were performing other required activities, and there were no procedures for someone to serve as a backup.  As a result, the TAS was unable to address the taxpayers’ concerns in a timely manner.
  • ****1, 3(d)****
  • Four cases were not forwarded by the PCAs until the next business day.  Procedures require PCAs to forward referrals the same day they are received or created.  Due to efforts by the COTRs and quality analysts, these referrals were still forwarded to the TAS in a timely manner.  However, delays by the PCAs limit the ability to meet this requirement.
  • Due to the manual nature of handling TAS cases, we identified the following errors:

·         Nine instances on the IRS lists of TAS cases with mistyped Social Security Numbers.

·         13 cases forwarded by the PCAs that were received by the TAS but were not on the IRS lists of TAS cases.

·         ****1****

Although we did not identify any occurrences, these tracking problems create a risk that the COTRs might not identify cases on which collection may resume after the TAS closes a case.

The Government Accountability Office Standards for Internal Control in the Federal Government requires information to be clearly documented and promptly, completely, and accurately recorded.  In addition to the results described above for each of the conditions, when TAS cases are not properly identified, documented, or forwarded, the related taxpayers are more likely to experience unnecessary burden.

Management Action:  Subsequent to our review, Program management designated one analyst to be responsible for initiating, monitoring, and reporting on Program-related TAS cases.  To ensure consistent handling of TAS cases, management designated a second analyst to serve as a backup.  On May 2, 2008, management also initiated a new process directing the designated analyst to conduct a weekly reconciliation of referrals received from the PCAs as well as those forwarded to the TAS.  We believe that these new procedures will help ensure 1) that all Program-related TAS cases are tracked and 2) that referrals are received timely from the PCAs by the quality analyst and forwarded timely from the analyst to the TAS.

 

Appendix I

 

Detailed Objective, Scope, and Methodology

 

The overall objective of this review was to determine whether the IRS and contractors have been following required procedures in the Private Debt Collection Program (the Program) since implementation on September 7, 2006.  To accomplish the objective, we:

I.                   Conducted various case reviews to determine whether the IRS was following procedures.

A.    Obtained a list from the Program staff that identified a population of 227 taxpayers who had opted out of the Program from September 7, 2006, through May 26, 2007.  We selected a random sample[18] of 30 of the 227 taxpayers and analyzed case activity to determine whether the IRS took appropriate actions.

B.     Obtained a list from the Program staff that identified a population of 3,177 taxpayers whose accounts were recalled[19] to the IRS from September 7, 2006, through May 26, 2007.  We selected a random sample of 30 of the 3,177 taxpayers and analyzed case activity to determine whether the IRS took appropriate actions.

C.     Obtained a list from the Program staff that identified a population of 11,562 taxpayers whose accounts were placed with PCAs on September 7, 2006.  We selected a statistically valid sample[20] of 73 of the 11,562 taxpayers and analyzed case information to determine whether the accounts met inventory selection criteria.

II.                Determined the TAS office’s status of reviewing Program referrals.

A.    Reconciled IRS, PCA, and TAS records to determine whether all referrals to the TAS were tracked and addressed.

B.     Selected a random sample of 50 Program referrals to the TAS[21] and determined whether referral criteria were met.

III.             Conducted various case reviews to determine whether the PCAs were following procedures.

A.    Obtained a list from the Program staff that identified a population of 3,356 taxpayers whose accounts were paid in full from September 7, 2006, through March 29, 2007.  We selected a random sample of 30 of the 3,356 taxpayers.

B.     Obtained a list from the Program staff that identified a population of 1,471 taxpayers who had received installment agreements from September 7, 2006, through April 28, 2007.  We selected a random sample of 30 of the 1,471 taxpayers.

C.     For the samples identified in Steps I.A. - I.C., II.B., III.A., and III.B., obtained case file information from the PCAs and analyzed case activity to determine PCA compliance with procedures and timeliness of actions.

D.    Selected a judgmental sample[22] of 30 taxpayers who had defaulted on their installment agreements.  We obtained case file information from the PCAs and analyzed case activity subsequent to the date of default, if any, to determine PCA compliance with procedures and timeliness of actions.

IV.             Evaluated the process for handling Program-related complaints and determined its effectiveness.

A.    Determined the IRS process for reviewing, validating, and resolving complaints, as well as the IRS process for preventing recurrence of complaints.

B.     Determined the Program Complaint Panel’s role in handling complaints.

C.     Identified and obtained all Program-related complaints occurring between September 7, 2006, and August 1, 2007.  We compared IRS and PCA records to determine whether all complaints were tracked and addressed.  We obtained case file information from the PCAs and analyzed case activity to determine PCA compliance with procedures and timeliness of actions.

D.    Evaluated the complaints identified in Step IV.C. and determined whether actions taken were appropriate.

E.     Reviewed the performance trends identified by the IRS and the actions taken to address them.

V.                Evaluated the Quality Assurance Review Process and determined its effectiveness.

A.    Determined how quality analysts conduct their reviews of the PCAs (e.g., systems access, login required).

B.     Interviewed IRS management about the quality plan to determine:

1.      The sample required for capturing the results for the Program.

2.      The numbers of telephone calls monitored and case action reviews conducted for each PCA and the results of these reviews.

3.      The limitations imposed by the PCAs, if any, during the quality review process.

C.     Determined the validity and reliability of the sampling results.

D.    For each PCA that remains under contract, selected a judgmental sample[23] of 25 telephone calls with taxpayers[24] and determined whether quality standards were met.

VI.             Evaluated the taxpayer satisfaction survey process and determined its effectiveness.

A.    Determined the process and procedures for administering the survey.

B.     Determined the process and procedures for capturing and reporting results of the survey.

C.     Determined the strategy and/or guidance developed to address survey results that are lower than expected.

VII.          Determined whether the IRS ensured that Linebarger Goggan Blair & Sampson LLP took appropriate actions to conclude its work on the IRS Program.

A.    Determined whether the IRS ensured that Linebarger Goggan Blair & Sampson LLP appropriately completed all the steps outlined in the Plan of Actions and Milestones document developed as part of the closeout process.

B.     Coordinated with the IRS Mission Assurance and Security Services organization to observe its onsite review of Linebarger Goggan Blair & Sampson LLP as part of the closeout process.  We determined whether the IRS verified that the PCA was in compliance with the IRS Federal Information Security Management Act[25] Contractor Review Sanitization and Disposal Checklist.

 

Appendix II

 

Major Contributors to This Report

 

Margaret E. Begg, Acting Assistant Inspector General for Audit (Small Business and Corporate Programs)

Carl L. Aley, Director

Amy L. Coleman, Audit Manager

Todd M. Anderson, Lead Auditor

Christina M. Dreyer, Senior Auditor

Darryl R. Roth, Senior Auditor

Marcus D. Sloan, Auditor

 

Appendix III

 

Report Distribution List

 

Commissioner  C

Office of the Commissioner – Attn:  Chief of Staff  C

Deputy Commissioner for Services and Enforcement  SE

Chief Counsel  CC

Deputy Commissioner, Small Business/Self-Employed Division  SE:S

Director, Collection, Small Business/Self-Employed Division  SE:S:C

Project Director, Filing and Payment Compliance Modernization, Small Business/Self-Employed Division  SE:S:C:FPCMO

National Taxpayer Advocate  TA

Director, Office of Legislative Affairs  CL:LA

Director, Office of Program Evaluation and Risk Analysis  RAS:O

Office of Internal Control  OS:CFO:CPIC:IC

Audit Liaison:  Commissioner, Small Business/Self-Employed Division  SE:S

 

Appendix IV

 

Glossary of Terms

 

Authentication Process – The contractor’s process of obtaining necessary information from the party called or calling to be reasonably sure that the contractor is discussing the tax information with the appropriate person.  During this process, the contractor verifies the caller’s name, Social Security Number, and address of record.

Automated Collection System – A telephone contact system through which telephone assistors collect unpaid taxes and secure tax returns from delinquent taxpayers who have not complied with previous notices.

Complaint Panel – A group consisting of Private Debt Collection Program managers, Small Business/Self-Employed Division program managers, contract specialists, and a TAS representative.  The Panel is responsible for ensuring the consistency of adjudication of complaints, reviewing Type One complaints for trends, and reviewing Type Two and Type Three complaints for validation purposes.

Contracting Officer’s Technical Representative – A Private Debt Collection Program employee responsible for managing the PCA contracts and ensuring compliance with requirements.

Full Compliance – A taxpayer is in full compliance when he or she has filed all required tax returns and is current with withholding or payment of estimated taxes.

Installment Agreement – A method for taxpayers to pay tax liabilities by making regular payments to the IRS over time rather than all at once.

Mission Assurance and Security Services Organization – The IRS organization that works to assure the security and resilience of critical agency functions and business processes by assisting in maintaining secure facilities, technology, and data.

Offer in Compromise – An agreement between the IRS and a taxpayer that settles a tax liability for payment of less than the full amount owed.

Operation Assistance Request – A process used by the TAS to refer cases to an IRS operating division or function to resolve a taxpayer’s problem when the TAS does not have the authority to take the required action(s).

Plan of Actions and Milestones – A management tool used to assist organizations in identifying, assessing, prioritizing, and monitoring the progress of corrective actions for security weaknesses found in programs and systems.

Recalled Accounts – Accounts recalled from the contractors for various reasons, including when an accepted installment agreement extends beyond 60 months or when the IRS receives a valid offer in compromise to pay less than the full balance due.

Referral Unit – A unit of the Private Debt Collection Program responsible for assigning cases to contractors; maintaining cases; recalling cases; responding to inquiries from taxpayers, contractors, and IRS staff; and handling taxpayer complaints.

Right-Party Contact – A taxpayer who confirmed his or her identity to the contractor over the telephone.

Skip Interval – The selection of every nth case in a population to obtain a random sample.

Statistics of Income Division – A Division of the IRS that collects, analyzes, and disseminates information on Federal taxation for the IRS in its administration of tax laws.  As part of its services, the Division assists in the quality measurement programs of the IRS.

Tax Fairness Coalition – A group composed of ACA International member companies, including PCAs under contract with the IRS, established for the purpose of promoting the continuation of the Program.  ACA International is an association of professional businesses and individuals involved in the credit and collection industry.

Taxpayer Advocate Service – An independent organization within the IRS whose employees assist taxpayers seeking help in resolving tax problems that have not been resolved through normal channels or who are experiencing significant hardships.

Taxpayer Delinquency Investigation – An unfiled tax return(s) for a taxpayer.  One investigation exists for all tax periods.

Type One, Type Two, and Type Three Complaint Code – A code assigned to each complaint, based on the severity of the allegation(s).  Rude behavior would be a Type One complaint, intimidation would be a Type Two complaint, and a violation of the FDCPA[26] would be a Type Three complaint.

 

Appendix V

 

Details of Identified Complaints and Concerns

 

We identified 101 incidents reported on the contractors’ complaint logs or related documentation for the Complaint Panel meetings.  Our review showed that in 98 of the 101 incidents, the contractors were working within the guidelines of IRS-approved collection practices, actions were inadvertent, or no information was identified to substantiate the incident.  In three instances, the IRS considered the incident to be a significant contract concern and imposed a penalty as allowed under the contract.  The issues identified for the 101 incidents include:

·         Authentication – Taxpayer expresses concern with or becomes upset over the authentication process.  Some taxpayers became frustrated when they did not recognize the contractor’s real company name and, due to the contractor’s and the IRS’ concerns about disclosure provisions, the name did not indicate the contractor’s type of business (52 instances).

·         Rude or unprofessional behavior – Taxpayer complains that the contractor’s representative was rude or unprofessional (seven instances).

·         Installment agreement-related – Taxpayer expresses concern that his or her call was not returned to set up an installment agreement, an agreement was not established, and/or no reminder notice(s) was received (five instances).

·         Contact number used – A third party or the taxpayer expresses concern about the number used to contact the taxpayer (e.g., use of cell phone number) (seven instances).

·         Frequency or unusual time of calls – Taxpayer expresses concern about the frequency of calls made or the time of the calls (six instances).  In ****1**** our review showed that the contractors’ case histories did not identify excessive calls or calls placed outside of the allowed time periods.  ****1****[27],[28]

·         Taxpayer representative issues – Taxpayer representative expresses concern about the clarity of IRS-approved letters sent to the representative, appropriateness of calls placed to an authorized representative, and/or lack of any information provided to an unauthorized representative (seven instances).

·         Inadvertent but unauthorized disclosure – Includes such situations as a third party misrepresenting himself or herself as a taxpayer of interest, misdirected mail inadvertently opened by contractor staff not assigned to the IRS contract, and mail machine[29] stuffing malfunction (12 instances).  The IRS considered the mail machine stuffing malfunction a Type Three validated complaint.

·         Other miscellaneous issues – Various taxpayer concerns related to the collection of the tax debt (e.g., adequacy of prior notice or the ability to secure employment due to tax debt) and contract concern relating to a contractor’s license status in one State (five instances).  The IRS considered the license status a Type Three validated complaint.

One violation of the Fair Debt Collection Practices Act was identified

As originally enacted, the FDCPA included provisions that restricted various collection abuses and harassment in the private sector.  These restrictions did not apply to Federal Government practices.  However, Congress requires the IRS to comply with applicable portions of the FDCPA and to be at least as considerate to taxpayers as private creditors are required to be with their customers.  The IRS Restructuring and Reform Act of 1998[30] required the IRS to follow Fair Tax Collection Practices[31] that are similar to the FDCPA provisions.  The Treasury Inspector General for Tax Administration is required to report Fair Tax Collection Practices violations when the action was taken by an IRS employee involved in some type of collection activity and the employee received some type of disciplinary action considered to be an administrative action.  Our review of civil actions related to violations of the Fair Tax Collection Practices for Calendar Year 2006 identified five cases involving a Fair Tax Collection Practices violation for which the IRS employee received administrative disciplinary action.[32]

Our review of the contractor complaints determined that 9 of the 101 cases could have been violations of the FDCPA, if substantiated or if collection calls continued.  These nine cases included calls made at an unusual time,[33] harassment due to frequent calls,[34] or other instances of taxpayers feeling harassed.  Of these cases, eight did not appear to be violations of the FDCPA because the complaint was not substantiated or the contractors stopped collection calls.  ****1****

 

Appendix VI

 

Management’s Response to the Draft Report

 

The response was removed due to its size.  To see the response, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.


[1] 26 U.S.C. Section 6306 (2004).

[2] Pub. L. No. 108-357, 118 Stat. 1418.

[3] See Appendix IV for a glossary of terms.

[4] Pub. L. No. 108-357, 118 Stat. 1418.

[5] 15 U.S.C. §§ 1601 note, 1692-1692o (2000).

[6] Pub. L. No. 105-206, 112 Stat. 685 (codified as amended in scattered sections of 2 U.S.C., 5 U.S.C. app., 16 U.S.C., 19 U.S.C., 22 U.S.C., 23 U.S.C., 26 U.S.C., 31 U.S.C., 38 U.S.C., and 49 U.S.C.).

[7] Management Needs to Continue Monitoring Some Case Selection Issues As the Private Debt Collection Program Is Implemented (Reference Number 2006-30-064, dated April 2006).

[8] See Appendix IV for a glossary of terms.

[9] Due to inconsistencies in the definition of a complaint and the self-reporting nature of complaints, we cannot provide assurance of the reliability of this count.

[10] As discussed later in the report, 20 of the 101 cases were included on the contractor’s complaint log but were categorized by the IRS as authentication concerns.  These cases were not discussed by the Complaint Panel but were reviewed by the COTR.

[11] The dates of the calls monitored were July 9, 2007, for one contractor, and August 7 and 8, 2007, for the other contractor.

[12] This count could include the same taxpayer(s) due to subsequent attempts to make contact.

[13] In our analysis, we include repeat contacts with taxpayers who refused to authenticate, taxpayers who were unable to authenticate, and right-party contacts.  This was done for consistency with the number of right-party contacts discussed elsewhere in this report.  The ratio was calculated as ((1,385+368)/(1,385+368+9,013)) X 100.  This ratio does not include other contact types such as a wrong number or when the taxpayer was currently unavailable.
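
Using the figures cited in this footnote, the computation works out as follows:

$$\frac{1{,}385 + 368}{1{,}385 + 368 + 9{,}013} \times 100 = \frac{1{,}753}{10{,}766} \times 100 \approx 16.3\ \text{percent}$$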

[14] At our recommendation, the IRS added a requirement in the Request for Quotation, as well as in the March 2008 contract renewals, for all contractors to record telephone calls with taxpayers to provide accessible replay for review purposes.  A Request for Quotation is a formal solicitation to sources outside of the Federal Government for offers to provide products or services.

[15] The overall satisfaction rating represents the answer to one question on the automated survey.  The question was “Everything considered, whether you agree or disagree with the final outcome, rate your overall satisfaction with the service you received during this call.”

[16] The contractors maintain logs documenting the offering of surveys to right-party contacts.  This process is prone to data input errors, which adversely affect the accuracy and reliability of the data.  While data accuracy is a concern, our review of a judgmental sample of 50 monitored calls found that the logs were generally sufficient for the purpose of our tests.

[17] The taxpayer satisfaction survey is provided in English only.  However, beginning in April 2008, the contractors will also offer the survey in Spanish.

[18] All random samples were selected because we did not intend to make projections based on results.

[19] See Appendix IV for a glossary of terms.

[20] We selected a statistically valid sample to project results of negative issues.  The sample was based on a confidence level of 95 percent, an expected error rate of 5 percent, and a precision of ±5 percent.  We did not identify any negative issues from this test.  Therefore, no projections were made.
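
For reference, a sample of 73 is consistent with the commonly used attribute-sampling formula under these parameters (a sketch assuming a normal approximation; the finite population correction is negligible for a population of 11,562):

$$n = \frac{z^{2}\,p\,(1-p)}{e^{2}} = \frac{(1.96)^{2}(0.05)(0.95)}{(0.05)^{2}} \approx 72.99 \approx 73$$

where $z = 1.96$ corresponds to the 95 percent confidence level, $p = 0.05$ is the expected error rate, and $e = 0.05$ is the desired precision.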

[21] We identified a total population of 652 Program-related TAS cases.  This total represents the combined populations identified from various sources from September 7, 2006, through different periods.  The TAS computer system had a population of 451 through July 16, 2007.  We determined the reliability of the data on the TAS computer system through the reconciliation we conducted under Step II.A.  Because we identified cases that should have had codes but did not, and cases that should not have had codes but did, the reliability of the data is in question.  See information in the report for more details.  The IRS lists for each PCA had an additional population of 96 through June 30, 2007.  The CBE Group Inc. fax log had an additional population of ****1**** through April 30, 2007.  The CBE Group Inc. list had an additional population of 69 through August 28, 2007.  The Pioneer Credit Recovery, Inc. list had an additional population of 34 through August 15, 2007.

[22] We used judgmental sampling because we were unable to identify the population at the time we selected our sample.  Through lists obtained from the PCAs and Program staff, we later identified a total population of 421 taxpayers who had defaulted on their installment agreements.  This total represents the combined populations identified at each contractor site over various periods.  At The CBE Group Inc., the population was 201 from September 7, 2006, through July 30, 2007.  At Pioneer Credit Recovery, Inc., the population was 182 from September 7, 2006, through August 8, 2007.  At Linebarger Goggan Blair & Sampson LLP, the population was 38 from September 7, 2006, through March 7, 2007.

[23] We selected a judgmental sample because we did not intend to make projections based on our results.

[24] Per lists obtained from the PCAs, the population of telephone calls between taxpayers and The CBE Group Inc. was 100 on July 9, 2007.  The population of telephone calls between taxpayers and Pioneer Credit Recovery, Inc. was 154 on August 7 and 8, 2007.

[25] Pub. L. No. 107-347, Title III, 116 Stat. 2946 (2002).  The Federal Information Security Management Act requires each Federal Government agency to develop, document, and implement an agency-wide information security program to provide security for the information and information systems that support the operations and assets of the agency.

[26] 15 U.S.C. §§ 1601 note, 1692-1692o (2000).

[27] 15 U.S.C. §§ 1601 note, 1692-1692o (2000).

[28] Our review of civil actions related to violations of Fair Tax Collection Practices (26 U.S.C. § 6304 (2004)) for Calendar Year 2006 identified five cases involving a Fair Tax Collection Practices violation for which the IRS employee received administrative disciplinary action.

[29] A machine used to place printed letters inside of mailing envelopes.

[30] Pub. L. No. 105-206, 112 Stat. 685 (codified as amended in scattered sections of 2 U.S.C., 5 U.S.C. app., 16 U.S.C., 19 U.S.C., 22 U.S.C., 23 U.S.C., 26 U.S.C., 31 U.S.C., 38 U.S.C., and 49 U.S.C.).

[31] 26 U.S.C. § 6304 (2004).

[32] Five Fair Tax Collection Practices Violations Resulted in Administrative Actions in Calendar Year 2006 (Reference Number 2007-10-188, dated September 21, 2007).

[33] FDCPA § 805(1) indicates that, absent knowledge to the contrary, the debt collector may assume that a convenient time is between the hours of 8 a.m. and 9 p.m.

[34] FDCPA § 806(5) addresses causing a telephone to ring or engaging any person in conversation repeatedly with intent to annoy, abuse, or harass.