TREASURY INSPECTOR GENERAL FOR TAX ADMINISTRATION

Office of Audit

Highlights

IMPROVEMENTS IN RETURN SCORING AND RESOURCE ALLOCATION AT THE STRATEGIC LEVEL
COULD ENHANCE EXAMINATION PRODUCTIVITY

Final Report issued on April 24, 2019

Highlights of Reference Number:  2019-30-024 to the Commissioner of Internal Revenue.

IMPACT ON TAXPAYERS

The IRS’s primary objective in selecting returns for examination is to promote the highest degree of voluntary compliance.  This requires the exercise of professional judgment in selecting returns to assure all taxpayers of equitable consideration and in making the most efficient use of staffing resources.  The IRS uses a variety of sources to select returns for audit and strives to select those returns for which its examiners are likely to find areas of noncompliance.

WHY TIGTA DID THE AUDIT

This audit was initiated to evaluate the Small Business/Self-Employed Division’s strategic priority selection methods used to identify individual income tax returns for examination by revenue agents and to assess the effectiveness of the strategic priorities with an emphasis on the Discriminant Function (DIF) selection method.

WHAT TIGTA FOUND

Between Fiscal Years 2014 and 2016, the Small Business/Self-Employed Division Examination function (Examination function) used 10 strategies to group similar examination work.  During each of these fiscal years, the Examination function established the planned number of examined returns closed (hereafter referred to as closures) at the strategic priority level.  However, the Examination function does not establish performance goals at the strategic level; rather, it uses cumulative direct examination staff years and cumulative closures, combined for revenue agents and tax compliance officers, to determine whether the examination plan was met.

TIGTA found the Examination function did not take corrective action when actual closures exceeded or fell behind planned closures for a specific strategic priority.  As a result, the IRS lost the opportunity to assess an additional $262.5 million on individual income tax return examinations by revenue agents for Fiscal Year 2016.
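
To illustrate the kind of strategic-level monitoring at issue, the short Python sketch below compares planned and actual closures for each strategic priority and flags deviations that might call for corrective action.  The priority names, closure counts, and tolerance are illustrative assumptions, not figures from the report or an IRS methodology.

# Illustrative sketch only: compares hypothetical planned vs. actual closures
# by strategic priority and flags deviations that might warrant corrective action.

PLAN_DEVIATION_TOLERANCE = 0.10  # assumed 10 percent tolerance, not an IRS figure

# Hypothetical planned and actual closure counts per strategic priority.
closures = {
    "Priority A": {"planned": 5000, "actual": 6200},
    "Priority B": {"planned": 8000, "actual": 6900},
    "Priority C": {"planned": 3000, "actual": 3050},
}

for priority, counts in closures.items():
    deviation = (counts["actual"] - counts["planned"]) / counts["planned"]
    if abs(deviation) > PLAN_DEVIATION_TOLERANCE:
        direction = "exceeded" if deviation > 0 else "fell behind"
        print(f"{priority}: actual closures {direction} plan by {abs(deviation):.0%}")
    else:
        print(f"{priority}: within plan tolerance ({deviation:+.0%})")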

TIGTA also reviewed the DIF scores for individual returns filed with the IRS during Calendar Year 2015 and examined and closed by revenue agents and tax compliance officers through December 2017.  Even though the DIF identifies returns with examination potential, TIGTA found that, for most examination classes, the individual returns with the highest DIF scores examined by revenue agents did not result in higher net tax assessments than returns with lower DIF scores.
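
The type of comparison described above can be sketched as follows.  The Python fragment below groups hypothetical closed-examination records by examination class and DIF score band and compares average net tax assessments; the class names, bands, and dollar amounts are illustrative assumptions, not the report's data or TIGTA's actual methodology.

from collections import defaultdict
from statistics import mean

# Hypothetical closed-examination records: (examination class, DIF band, net tax assessed).
# The "high"/"low" DIF bands and all dollar amounts are illustrative assumptions.
closed_exams = [
    ("Exam class A", "high", 4200), ("Exam class A", "low", 6800),
    ("Exam class B", "high", 9500), ("Exam class B", "low", 3100),
    ("Exam class C", "high", 2400), ("Exam class C", "low", 5900),
]

assessments = defaultdict(list)
for exam_class, dif_band, net_assessed in closed_exams:
    assessments[(exam_class, dif_band)].append(net_assessed)

# For each examination class, check whether high-DIF returns yielded higher average assessments.
for exam_class in sorted({c for c, _ in assessments}):
    high_avg = mean(assessments[(exam_class, "high")])
    low_avg = mean(assessments[(exam_class, "low")])
    outcome = "higher" if high_avg > low_avg else "not higher"
    print(f"{exam_class}: high-DIF average ${high_avg:,.0f} is {outcome} than low-DIF average ${low_avg:,.0f}")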

WHAT TIGTA RECOMMENDED

TIGTA recommended that the IRS 1) establish actionable performance goals, and monitor and take corrective action specific to staffing resources and closures for each strategic priority, to ensure balanced examination coverage; and 2) determine why the highest DIF scores for individual returns examined by revenue agents and tax compliance officers did not result in higher net tax assessments than lower DIF scores for most examination classes.

The IRS disagreed with the recommendations.  The management response stated that Examination does not control how its examination plan is worked and that National Research Program data supports its DIF models.  The IRS does not believe that the specific examination results in the report provide any actionable information relevant to the current DIF models.  TIGTA believes that the IRS does control how the plan is worked and should make adjustments as needed to meet the plan.  Moreover, the examination results should be used to improve the DIF.

READ THE FULL REPORT

To view the report, including the scope, methodology, and full IRS response, go to:

https://www.treasury.gov/tigta/auditreports/2019reports/201930024fr.pdf.

 

Phone Number / 202-622-6500

E-mail Address / TIGTACommunications@tigta.treas.gov

Website / https://www.treasury.gov/tigta