Performance Management in the Large and Mid-Size Business Division’s Industry Case Program Needs Strengthening

 

May 2005

 

Reference Number:  2005-30-084

 

This report has cleared the Treasury Inspector General for Tax Administration disclosure review process and information determined to be restricted from public release has been redacted from this document.

 

May 27, 2005

 

 

MEMORANDUM FOR COMMISSIONER, LARGE AND MID-SIZE BUSINESS DIVISION

 

FROM:     Pamela J. Gardiner /s/ Pamela J. Gardiner

                 Deputy Inspector General for Audit

 

SUBJECT:     Final Audit Report - Performance Management in the Large and Mid-Size Business Division’s Industry Case Program Needs Strengthening  (Audit # 200330028)

 

This report presents the results of our review of the Large and Mid-Size Business (LMSB) Division’s performance management system for examiners and team managers in its Industry Case (IC) Program.  The overall objective of this review was to determine whether the LMSB Division’s performance management system is effective in linking the Internal Revenue Service (IRS) mission, strategic goals, and balanced measures to team manager and examiner performance in the IC Program.

In summary, the LMSB Division’s performance management system provides direct links from the IRS mission and goals down to the appraisal processes for examiners and team managers in the IC Program.  Actions have also been taken to enhance the performance management system for team managers.  However, there are areas in the appraisal processes of IC examiners that could be strengthened to reinforce the importance of adhering to LMSB Quality Management System (LQMS) standards and completing examinations in a more timely manner. 

At the IRS, as in other Federal Government agencies, implementing a performance management system is a critically important endeavor.  Among other things, agency managers need to establish processes that promote teamwork and organizational success by integrating individual performance with agency goals.  Since the LMSB Division’s work primarily relates to enhancing enforcement of the tax law, the primary objective of the IC Program, and the responsibility of examiners and team managers, is to conduct timely, quality examinations of selected tax returns.  To assist examiners and team managers in meeting their responsibilities, the LMSB Division defines and measures quality against four standards that are set forth in the LQMS and uses cycle time to monitor the timeliness of examinations.

Since the LMSB Division’s stand-up nearly 5 years ago, overall performance ratings of “Exceeds Fully Successful” or higher on IC examiner critical job responsibilities have been the norm, and the use of the “Outstanding” overall rating has steadily increased.  While the performance ratings of IC examiners have been increasing, the IC Program as a whole has been challenged to meet LQMS quality standards and cycle time goals in examinations.  For example, statistics show approximately 20 percent of IC corporate examinations and between 40 and 60 percent of partnership examinations are closed without any adjustments.  While these “no-change” examinations can indicate a poor job of selecting returns for examination and/or a poor job of examining them, evidence suggests that it is the latter and not the former.  In addition, examinations are consuming more hours and taking longer to complete. 

Team managers are encouraged to conduct workload reviews over the work of IC examiners under their supervision.  For a number of reasons, these reviews can be a critically important component in the appraisal process for IC examiners.  Despite the importance of workload reviews, they are not being consistently used to monitor and evaluate the work of IC examiners.  We reviewed the Fiscal Years (FY) 2002 and 2003 annual appraisals for 30 IC examiners and found that for 7 (23 percent), the performance ratings in the annual appraisals were not supported by any documented workload reviews in at least 1 year.  Where workload reviews were documented, relatively few discussed the examiners’ critical job responsibilities, addressed adherence to LQMS standards, and/or identified opportunities to enhance performance.

To strengthen performance management for IC examiners, we recommended the Commissioner, LMSB Division, develop and implement plans requiring (1) team managers provide more specific written feedback to examiners on the quality and timeliness of examinations that relates to their critical job elements and can be used as support for midyear progress reports and annual appraisals and (2) territory managers monitor and assess the appraisal process of IC examiners during their operational reviews. 

Management’s Response:  The Commissioner, LMSB Division, agreed with both of our recommendations.  The Commissioner, LMSB Division, will issue a performance management reminder to the field to highlight the responsibility to conduct ongoing performance assessments, stress the importance of documentation for progress reviews and annual appraisals, and remind territory managers to include individual agent performance in the topics discussed as part of the operational review process.  A Team Manager Checksheet has already been issued to assist managers in conducting reviews and assessments.  Also, the LMSB Guide of Field Balanced Measures Priorities and Recommended Performance Commitments for FY 2005 was developed and issued to encourage team managers to follow a seven-step process for performance reviews and use performance data in developing commitments.  Finally, the Commissioner, LMSB Division, stated LMSB Division territory managers already include discussions of individual agent performance in their operational reviews.  Management’s complete response to the draft report is included as Appendix VII.

Office of Audit Comment:  With respect to the Commissioner, LMSB Division’s response to our second recommendation, we agree the LMSB Guide of Field Balanced Measures Priorities and Recommended Performance Commitments for FY 2005 provides valuable guidance for developing team manager commitments.  However, it does not specifically address the examiner appraisal process.  Because the LMSB Division has not established a formal process to monitor and assess whether team managers are providing meaningful performance feedback to examiners, we continue to recommend territory managers be required to monitor and assess the appraisal process of IC examiners during operational reviews and take steps to address any problems identified.  As we noted in the report, we reviewed a number of operational reviews of territory managers over team managers.  Of the 45 operational reviews evaluated, we found no documented evidence that territory managers assessed whether team managers were conducting workload reviews for IC examiners.

Copies of this report are also being sent to IRS managers affected by the report recommendations.  Please contact me at (202) 622-6510 if you have questions or Philip Shropshire, Acting Assistant Inspector General for Audit (Small Business and Corporate Programs), at (215) 516-2341.

 

Table of Contents

Background

Performance Management in the Large and Mid-Size Business Division Connects to and Supports Agency Goals

Actions Have Been Taken to Assist Team Managers in Constructing More Meaningful Commitments

Examiners Need More Meaningful and Constructive Performance Feedback

Recommendation 1:

Recommendation 2:

Appendix I – Detailed Objective, Scope, and Methodology

Appendix II – Major Contributors to This Report

Appendix III – Report Distribution List

Appendix IV – Examiner Annual Performance Appraisal Form

Appendix V – Team Manager Performance Agreement Form

Appendix VI – The Large and Mid-Size Business Division’s Quality Measurement System

Appendix VII – Management’s Response to the Draft Report

 

Background

At the Internal Revenue Service (IRS), as in other Federal Government agencies, implementing a performance management system is a critically important endeavor.  Among other things, agency managers need to establish processes that promote teamwork and organizational success by integrating individual performance with agency goals.  As reflected in its mission statement and goals, providing quality service to taxpayers is of paramount importance to the IRS.  However, concerns were raised in the 1990s that the IRS performance management system may be promoting revenue production over service to taxpayers.  The concerns focused primarily on employees in its operating divisions who were responsible for enforcing the tax laws, such as IRS examiners and collectors.  In January 1998, the IRS began developing a blueprint to guide the redesign of its performance management systems to better balance the needs of providing service to taxpayers and collecting revenue.

The Large and Mid-Size Business (LMSB) Division is dedicated to serving large businesses and is one of four operating divisions in the IRS.  While the LMSB Division programs are designed to support all IRS goals, the Division’s work primarily relates to enforcing the tax law.  To support this broader IRS goal, the primary objective of its Industry Case (IC) Program, and the responsibility of examiners and team managers, is to conduct timely, quality examinations of selected large business tax returns to determine if the businesses have paid the proper amount of tax.  To assist examiners and team managers in meeting their responsibilities, the LMSB Division defines and measures quality of work against four standards that are set forth in its LMSB Quality Management System (LQMS).  To monitor the timeliness of examinations, the LMSB Division uses a cycle time measure.

Team managers are responsible for managing the performance of examiners under their supervision.  As described in IRS documents, a key component in the appraisal process is the set of examiner critical job responsibilities that are provided to and discussed with each examiner.  Among other things, the critical job responsibilities describe in detail the knowledge, skills, and abilities that examiners are expected to demonstrate and are used as the basis for their annual performance appraisals.  Appendix IV contains a copy of the examiner appraisal form and critical job responsibilities.

The performance of team managers is evaluated in much the same way as that of examiners, although there are some differences.  One difference is that LMSB Division territory managers are responsible for managing and evaluating the performance of the team managers.  Another important difference is that team managers develop commitments at the beginning of the fiscal year that supplement the critical job responsibilities.  The commitments are to be specifically tailored to the developmental needs of the individual team manager.  Appendix V contains a copy of the manager performance agreement form and critical job responsibilities.

This review was performed at the LMSB Division Headquarters in Washington, D.C., and IRS offices in the Los Angeles, California; Houston, Texas; and New York, New York, metropolitan areas during the period October 2003 through June 2004.  The audit was conducted in accordance with Government Auditing Standards.  Detailed information on our audit objective, scope, and methodology is presented in Appendix I.  Major contributors to the report are listed in Appendix II.

Performance Management in the Large and Mid-Size Business Division Connects to and Supports Agency Goals

The LMSB Division is implementing the redesigned IRS performance system according to the blueprint developed by the IRS Performance Management Executive Council in 1999.  As called for in the blueprint, the LMSB Division develops an annual program plan identifying trends, issues, and problems in tax administration affecting large businesses.  Additionally, the annual program plan identifies the Division’s strategic goals and corresponding objectives that are intended to address the trends, issues, and problems as well as link to and support the broader IRS mission and strategic goals. 

To illustrate, LMSB Division surveys of large businesses have consistently shown dissatisfaction with the postfiling examination process.  Stated simply, large businesses believe the examination process is too long and consumes too much of their time.  To address this issue, the LMSB Division has a strategic goal of “developing and institutionalizing a comprehensive issue management strategy” and supporting operating objectives intended to accomplish that goal.  Figure 1 shows the links among the IRS goal of “improving taxpayer service”; the trend, issue, and problem with the examination process; and the LMSB Division goals and objectives. 

Figure 1:  Links Between the IRS and LMSB Division Goals for Improving Taxpayer Service

Figure 1 was removed due to its size.  To see Figure 1, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.

The LMSB Division annual program plan also describes the measures used to judge the Division’s performance.  The IRS balanced measures of customer satisfaction, business results, and employee satisfaction are used Division-wide and provide the links to the IRS goals of improving taxpayer service, enhancing tax law enforcement, and modernizing through people, processes, and technology. 

Just as the IRS balanced measures define and measure the performance of the LMSB Division, examiners and team managers are evaluated on critical job responsibilities that are identical to the balanced measures.  The match between the IRS balanced measures and critical job responsibilities provides direct links from the IRS mission and goals down to the performance appraisals of LMSB Division examiners and team managers.

The relationship between balanced measures and critical job responsibilities is also intended to create a balanced approach to tax law enforcement and taxpayer service.  This is important because a fundamental reason for the matching measures was to address criticisms that managers, examiners, and other frontline employees were more focused on enforcing the tax law than on improving service to taxpayers.

For example, the Government Accountability Office (GAO) found that, before the new critical job responsibilities were introduced, appraisals overemphasized enforcement at the expense of taxpayer service.  The GAO estimated in a 1999 report that two-thirds of written performance feedback in frontline employee appraisals related to enforcement and only one-third to taxpayer service. 

Using the GAO results as a baseline, we analyzed 61 annual appraisals containing the new critical job responsibilities that were issued over a 2-year period to a judgmental sample of 30 IC examiners.  We concluded that team managers were effectively balancing their narrative comments in performance appraisals between enforcement and taxpayer service.  As shown in Figure 2, about 38 percent of the team manager comments related to enforcement actions (Business Results) and 41 percent emphasized service, such as taking into account the taxpayer’s point of view (Customer Satisfaction). 


Figure 2:  Frequency of Taxpayer Service Comments and Business Results Comments in IC Examiner Appraisals Issued for Fiscal Years (FY) 2002 and 2003

                                          Frequency of Comments
Critical Job Element                      2002      2003      Total
Customer Satisfaction (Knowledge)          172        97        269
Customer Satisfaction (Application)        205       127        332
   Subtotals                               377       224        601
Business Results (Quality)                 174       112        286
Business Results (Efficiency)              178       100        278
   Subtotals                               352       212        564
Employee Satisfaction                      198       103        301
Totals                                     927       539      1,466

Source:  TIGTA analysis of IC examiner appraisals for FYs 2002 and 2003.

In addition to connecting to and supporting IRS goals, the LMSB Division has taken actions to enhance the performance management system for team managers.  These actions, as discussed below, can provide a strong foundation for holding managers accountable for their individual and team performance.

Actions Have Been Taken to Assist Team Managers in Constructing More Meaningful Commitments

The performance management system for team managers requires that at the beginning of each fiscal year they coordinate with their respective territory managers to set forth commitments in their individual performance plans.  The commitments are intended to provide the basis for linking team manager critical job responsibilities with the IRS balanced measures and strategic goals and holding them accountable for their individual and team performances.  To realize these benefits, the commitments are to be related to at least one critical job responsibility.  They should also, according to the IRS, specifically describe the actions to be taken, include a deadline, indicate an expected result, and include a numeric target or some other means of measurement.

We reviewed the FYs 2002 and 2003 performance agreements of 20 team managers and determined that they had developed a total of 367 commitments over the 2-year period.  We also found documentation that most territory managers had conducted annual operational reviews and midyear progress reviews over the team managers under their supervision.  The documentation from the reviews was incorporated into the team managers’ annual appraisals and provided evaluative feedback for each of the team managers’ critical job responsibilities.  As shown in Figure 3, each of the commitments related to at least one of the critical job responsibilities and thus provided links to the IRS balanced measures and strategic goals.

Figure 3:  Distribution of Team Manager Commitments Among the Critical Job Responsibilities for FYs 2002 and 2003

Critical Job Responsibility         Number     Percentage
Leadership                              46         13%
Employee Satisfaction                   63         17%
Customer Satisfaction                   66         18%
Business Results                       174         47%
Equal Employment Opportunity            18          5%
Totals                                 367        100%

Source:  TIGTA analysis of 20 team manager performance agreements for the 2-year period ending in FY 2003.

While team manager commitments met the IRS’ criteria for addressing critical job responsibilities, they did not always meet the other criteria for well-structured commitments.  We determined that 191 (52 percent) of the 367 commitments were stated in broad, general terms that did not clearly describe the action to be taken, indicate a deadline, identify an expected result, and/or include a way to measure whether the commitment was met.  As a result, territory managers could have difficulty monitoring the commitments and holding team managers responsible for meeting them.  We found, for example, commitments that stated:

  • I will take the necessary steps in a joint effort with Team Members to identify and implement corrective actions to improve the quality of workpapers.
  • I will work towards the assignment of cases in the Manufacturing, Construction and Transportation Industry in my group.  I will provide my agents training in the issues related to these industry cases.
  • I will assure the [sic] all team members throughout the entire examination cycle remain aware of the new acceptable auditing standards applicable to limited scope examinations of a cycle and also to the individual issues being developed by each team member.
  • I will pursue a working strategy that involves taxpayer/customer as a partner in IRS audit and case resolution process.

While the above commitments are worthwhile goals, they do not include any specific actions to be taken to achieve the commitment or a method to measure progress.  In contrast, some commitments were specific:

  • I will review cases selected by the territory manager using the LQMS reviewers checksheet.  Areas for improvement will be discussed with the exam teams.  A follow-up review will be performed at the 50% or 75% milestone to assess improvement.
  • I will schedule two work related training classes to be presented during group meetings.
  • I will conduct reviews of CIC and IC cases in process 48 months and 24 months, respectively, from file date to improve cycle time.
  • Develop and implement Managerial participation in all Opening and Closing Conferences to provide audit support and facilitate Case Resolution.

Our analysis additionally showed commitments could have been used to greater advantage to reinforce the importance of key directives issued by LMSB Division senior executives.  Figure 4 provides an overview of three directives issued to team managers and other field personnel by the LMSB Division Commissioner during the period covered by our review.

Figure 4:  Frequency Selected Directives Appeared as Commitments in Performance Agreements for 20 Team Managers in FYs 2002 and 2003

                                                                 Frequency with which the
                                                              directive appeared as a commitment
Directive and Overview                                              2002          2003
Issue mandatory request for abusive tax shelter information.
This is a starting point for determining if the corporation
was involved in certain abusive transactions.                          4            16

Discuss the purpose and use of prefiling agreements in the
early stages of the examination.  Prefiling agreements are a
critical part of an overall effort to shorten the postfiling
examination process and increase customer satisfaction.                8            18

Issue a mandatory request for transfer pricing documentation.
This is a key component in a compliance initiative to deal
with cross-border transactions that could be undermining the
U.S. tax base.                                                       N/A             0

N/A – directive not issued until FY 2003.

Source:  TIGTA analysis of 20 team manager performance agreements for FYs 2002 and 2003.

Although the directives supported the IRS strategic goals and could enhance the quality and/or timeliness of examinations, they were not consistently included as a commitment in the performance agreements of the 20 IC team managers we reviewed.  For example, Figure 4 shows the directive to issue requests for transfer pricing documentation during examinations was included as a commitment in none of the 20 team manager performance agreements for FY 2003.

The LMSB Division Office of Performance, Quality, and Audit Assistance issued the LMSB Guide of Field Balanced Measures Priorities and Recommended Performance Commitments for FY 2005 after we completed our audit work.  The LMSB Division guide was developed as the result of an analysis that provided top executives with information on how well the process for evaluating team managers was working in practice and whether changes were needed to make that process more meaningful.  Unlike the agency-wide reference guide for writing commitments published in 2001, which provided general directions for all IRS managers, the LMSB Division guide is uniquely tailored to team managers.  We believe the recommendations in the LMSB Division guide, if properly implemented and monitored, will address factors that contributed to the concern about vaguely worded team manager commitments. 

The LMSB Division guide, for example, recommends that team managers follow a seven-step process and use performance data in developing their commitments so they are clear, specific, easy to monitor, and results oriented.  It also provides specific examples of well-constructed commitments addressing the different operational priorities of the LMSB Division and how team managers are to incorporate the priorities into their commitments.  For example, abusive corporate tax shelters continue to receive top priority in FY 2005.  To address this priority, the guide recommends team managers commit to starting examinations on all corporate returns with an abusive tax shelter within 30 days of receiving the returns. 

With enhancements underway to the performance management system for team managers, we believe the next step is to strengthen the performance management system for IC examiners.  Available evidence suggests that there are actions that could be taken to better hold IC examiners accountable for the quality and timeliness of their examinations.

Examiners Need More Meaningful and Constructive Performance Feedback

Since the LMSB Division’s stand-up nearly 5 years ago, overall performance ratings of “Exceeds Fully Successful” or higher on examiner critical job responsibilities have been the norm.  In FY 2004, for example, nearly all (93 percent) IC examiners received an overall rating of “Exceeds Fully Successful” or higher.  Additionally, the use of the “Outstanding” overall rating has steadily increased to the point that 61 percent of examiners received the rating in FY 2004 compared to 36 percent in FY 2001.  While the performance ratings of IC examiners have been increasing, the IC Program has been challenged to meet LQMS quality standards and cycle time goals despite having reported successes in other areas of the Program.

For example, statistics show IC examiners over the last 4 years have closed approximately 20 percent of corporate examinations and between 40 and 60 percent of partnership examinations without recommending any adjustments.  While these “no-change” examinations can indicate a poor job of selecting returns for examination and/or a poor job of examining them, evidence suggests that it is the latter and not the former.  To illustrate, consider the following information regarding the quality of IC examinations.

Large businesses do not want to be examined if they have complied with the tax law.  If large businesses have not complied with the tax law, LMSB Division surveys indicate they want the examination to be over quickly and targeted only at questionable items.  As part of the LMSB Division’s strategy for dealing with this problem, its top executives have invested considerable effort in developing and implementing automated systems to identify those tax returns with a high compliance risk for examination.  The systems are designed to reduce the number of no-change examinations by ensuring only returns with the highest potential for adjustment are selected for examination.  However, the quality deficiencies shown in Figure 5 that continue to be identified by LQMS reviewers could hamper efforts to reduce the number of unproductive examinations. 

Figure 5:  LQMS Pass Rates* for Selected Quality Responsibilities in IC Examinations, by Fiscal Year

                                                             Percentage of Examinations Passing
                                                              Selected Quality Responsibilities
Quality Elements                                                  2003         2004
Identifies material issues.                                        38%          48%
Makes required referrals to specialists.                           50%          63%
Performs initial risk analysis appropriately.                      44%          40%
Uses appropriate examination procedures and techniques.            63%          61%

* The pass rate measurement computed the percentage of cases that showed the characteristics of the quality element.

Source:  LMSB Division data.

The processes underlying each of the items in Figure 5 are designed to ultimately determine the large, unusual, and questionable tax issues that will be examined.  Consequently, if the processes are not properly completed, tax issues can be overlooked, resulting in no-change examinations that otherwise may have been avoided.

In response to surveys indicating large businesses want examinations to be over quickly and targeted only at questionable items, promising new business practices have been introduced.  For example, a “fast track” process for resolving disputes that surface in examinations has been tested and converted to a permanent program.  The goal of the program is to expedite the entire postfiling issue resolution process by bringing in the IRS Office of Appeals to resolve disputes concurrently with an examination rather than subsequent to it.  A prefiling agreement process was also introduced through a test project and subsequently converted to a permanent program.  The program permits a taxpayer to resolve, before the filing of a return, the treatment of an issue that otherwise would likely be disputed in a postfiling examination. 

To target the questionable items on a tax return for examination, the LMSB Division is emphasizing a risk-based examination approach that is exemplified in the Limited Issue Focused Examination (LIFE) process.  Introduced in 2002, the LIFE process is intended, among other goals, to restrict examinations of large businesses to the few issues on their tax returns that pose the greatest compliance risk.  Despite the introduction of new business processes, reducing the time spent on and length of IC examinations remains a challenge.  As shown in Figure 6, the time spent on and the length of IC examinations are trending upward.

Figure 6:  Average Hours Spent on and Length of IC Corporate and Partnership Examinations in FYs 2001-2004

               Number of Returns     Length of Examination     Examiner Hours Spent
Fiscal Year        Examined*             in Months**              on Each Return
2001                11,267                  34.2                       142
2002                 9,601                  36.5                       183
2003                 8,002                  39.6                       202
2004                 9,044                  39.2                       212

*  Returns include corporations, partnerships, and subchapter S corporations.
** Length measures from the return filed date to the date the examination was closed.

Source:  LMSB Division data (FY 2004 data are preliminary final data).

There are several factors contributing to the concerns with IC examinations, some of which are beyond the control of the LMSB Division.  For example, abusive tax avoidance transactions proliferated in the 1990s and are affecting IC examinations.  Officials told us that the complexity of abusive tax avoidance transactions can, among other things, increase the length of and time spent on examinations.  However, we found that there are areas in the appraisal processes of examiners that could be strengthened to reinforce the importance of adhering to LQMS standards and completing examinations in a more timely manner.

In a 2003 report to the President and the Congress, the United States Merit Systems Protection Board (MSPB) reported that continually monitoring and providing feedback to employees is perhaps the most important component of performance management.  According to the MSPB:

This component, more than any other, can give employees a sense of how they are doing and can motivate them to be as effective as possible.  Ideally, through these ongoing interactions between employees and supervisors, employees learn how their work fits into the goals of the work unit and how it contributes to the larger mission of the agency.

Team managers are encouraged to conduct workload reviews over the work of each IC examiner under their supervision.  These reviews can be a critically important component in the appraisal process for IC examiners for a number of reasons.  The foremost reasons are that they provide team managers with opportunities to ensure examiners are adhering to LQMS standards, reinforce the importance of completing examinations in a timely manner, and pinpoint and address performance gaps.  They also provide the principal support for the ratings examiners receive in their critical job responsibilities that are reflected in their annual appraisals and midyear progress reports. 

Despite the importance of workload reviews, we determined team managers are not consistently using them to monitor and evaluate the work of IC examiners.  As a result, IC examiners are receiving ratings in their annual appraisals that are not well supported.  Further, team managers may be missing opportunities to better hold examiners accountable for improving the quality and timeliness of their examinations.

We reviewed 61 annual appraisals provided to 30 IC examiners for FYs 2002 and 2003 and determined the appraisals for 7 (23 percent) of the 30 IC examiners were not supported by any workload reviews in 1 or more years.  For the remaining 23 examiners for whom workload reviews were performed in both years, there were significant differences among team managers in the types and quality of feedback provided to examiners on their performance during workload reviews.  For example, one team manager developed a template to capture and record detailed narrative comments on each examiner’s critical job responsibilities.  Other workload reviews were documented on monthly time reports of IC examiners and contained one or two narrative comments, such as:

  • Claim was not processed properly and you assisted taxpayer to get resolved.
  • Thank you for keeping your cases moving and for closing cases with low hours.

Overall, relatively few of the workload reviews specifically discussed the examiners’ critical job responsibilities or identified opportunities to improve the timeliness of their examinations.  Of the 23 examiners who received workload reviews over the 2-year period, only 8 received comments specifically addressing all critical job elements and only 8 received comments identifying opportunities for improving the timeliness of their work. 

Our analysis also showed there are ample opportunities to use workload reviews for emphasizing the importance of adhering to LQMS standards.  We analyzed the 149 workload reviews given to the 23 examiners over the 2-year period and determined the LQMS standards were discussed in 40 (27 percent) of the 149 workload reviews.  Figure 7 provides summary information from our analysis of the reviews discussing LQMS standards and shows the percentage of the 149 reviews that addressed the deficiencies, discussed earlier, that LQMS reviewers continue to identify year after year.


Figure 7:  Frequency of Discussion of Selected LQMS Standards Elements in Workload Reviews

                                                  Number of     Percentage of     Number of
Quality Elements                                  Reviews*       All Reviews      Examiners
Identifies material issues.                           2               1%              2
Makes required referrals to specialists.             33              22%              9
Performs initial risk analysis appropriately.        12               8%              7
Uses appropriate procedures and techniques.           3               2%              2
All other quality elements.                           2               1%              2

* The documentation in some workload reviews discussed more than one LQMS standard.
Source:  TIGTA analysis of 149 workload reviews conducted by 24 team managers over the 2-year period ending in FY 2003.

In addition, our evaluation of 62 returns (47 cases) closed as no-change between FYs 1999 and 2002 by the 30 examiners in our review supports the concern with the quality of the feedback examiners are receiving in their workload reviews.  We determined that, in 54 (87 percent) of the 62 returns, at least 1 mandatory specialist referral was not made.  As noted in Figure 7, LQMS standards and elements require IC examiners to call upon specialists during their examinations.  These specialists, according to the LMSB Division, have the technical training needed to assist in the identification, selection, and examination of complex tax issues.  Besides raising questions about the adequacy of the feedback to examiners in workload reviews, our evaluation raises questions about whether the cases would have resulted in a no-change had the specialists been involved.  In prior TIGTA reports, we determined that significant potential tax adjustments were not considered because specialists had not been involved in examinations.

We identified two factors that affected the amount and quality of the feedback examiners received on their performance.  First, although the LMSB Division encourages team managers to conduct workload reviews, it does not specifically require that such reviews be conducted.  Instead, the Division allows a great deal of flexibility in how and when the reviews are conducted.  Moreover, guidelines do not specifically require that managers discuss either LQMS standards or examiner critical job responsibilities during their reviews.

Second, the LMSB Division has not established a process to monitor and assess whether team managers are conducting workload reviews and providing meaningful performance feedback to examiners.  Although territory managers had conducted operational reviews over the examination teams, their reviews did not include evaluating team managers’ workload reviews. 

Recommendations

To better hold IC examiners accountable for the quality and timeliness of their examinations, the Commissioner, LMSB Division, should develop and implement plans requiring that:

1.      Team managers provide more specific written feedback to examiners on the quality and timeliness of examinations that relates to their critical job elements and can be used as support for midyear progress reports and annual appraisals.

Management’s Response:  The Commissioner, LMSB Division, will issue a performance management reminder to the field.  The memorandum will highlight the responsibility to conduct ongoing performance assessments, stress the importance of documentation for progress reviews and annual appraisals, and remind territory managers to include individual agent performance in the topics discussed as part of the operational review process.

2.      Territory managers monitor and assess the appraisal process of IC examiners during operational reviews and take steps to address any problems identified.

Management’s Response:  The Commissioner, LMSB Division, responded that the LMSB Case Quality Improvement Council has developed and issued a Team Manager Checksheet to assist managers in conducting reviews and assessments.  Also, the LMSB Division’s Performance, Quality, and Audit Assistance Office developed and issued the LMSB Guide of Field Balanced Measures Priorities and Recommended Performance Commitments for FY 2005, which encourages team managers to follow a seven-step process for performance reviews and use performance data in developing commitments.  The Commissioner, LMSB Division, also stated that LMSB territory managers already include discussions of individual agent performance in their operational reviews. 

Office of Audit Comment:  While we agree that the LMSB Guide of Field Balanced Measures Priorities and Recommended Performance Commitments for FY 2005 provides valuable guidance for developing team manager commitments, it does not specifically address the examiner appraisal process.  Because the LMSB Division has not established a formal process to monitor and assess whether team managers are providing meaningful performance feedback to examiners, we continue to recommend territory managers be required to monitor and assess the appraisal process of IC examiners during operational reviews and take steps to address any problems identified.  As we noted in the report, we reviewed a number of operational reviews of territory managers over team managers.  Of the 45 operational reviews evaluated, we found no documented evidence that territory managers assessed whether team managers were conducting workload reviews for IC examiners.

 

Appendix I

 

Detailed Objective, Scope, and Methodology

 

The overall objective of this review was to determine whether the Large and Mid-Size Business (LMSB) Division’s performance management system is effective in linking the Internal Revenue Service (IRS) mission, strategic goals, and balanced measures to team manager and examiner performance in the LMSB Division Industry Case (IC) Program.  Work on this review was performed at the LMSB Division Headquarters in Washington, D.C., and IRS offices in the Los Angeles, California; Houston, Texas; and New York, New York, metropolitan areas.  We chose these three metropolitan areas primarily to achieve coverage in geographically dispersed offices.

To meet our objective, we relied on the IRS’ internal management reports and databases.  We did not establish the reliability of these data because extensive data validation tests were outside the scope of this audit and would have required a significant amount of time.  Additionally, we used judgmental sampling techniques, unless otherwise noted, to minimize time and travel costs.  To accomplish the objective, we:

             I.      Developed criteria for the review by studying best practices and standards on performance management contained in various publications issued by the United States (U.S.) Merit Systems Protection Board, Government Accountability Office, IRS, and U.S. Office of Personnel Management.

          II.      Analyzed the Treasury Integrated Management Information System to assess the performance ratings and awards received by IC examiners in Fiscal Years (FY) 2001-2004.

       III.      Evaluated the LMSB Division Quality Management System (LQMS) to identify trends in the quality of IC examinations and to determine whether problem areas were incorporated into IC examiner workload reviews, midyear progress reports, and annual appraisals.

       IV.      Analyzed FYs 2002 and 2003 workload reviews, midyear progress reports, and annual appraisals for a sample of 30 out of approximately 3,102 IC examiners to assess the types, quality, and amount of feedback examiners received on their performance.

          V.      Reviewed a sample of 62 returns (47 cases) out of 143 returns that were examined and closed with no adjustments during FYs 1999-2002 by the 30 examiners included in the review to evaluate selected LQMS elements. 

       VI.      Analyzed FYs 2002 and 2003 performance agreements and related commitments for a sample of 20 out of approximately 1,390 IC team managers to assess the types, quality, and amount of feedback team managers received on their performance.

 

Appendix II

 

Major Contributors to This Report

 

Philip Shropshire, Acting Assistant Inspector General for Audit (Small Business and Corporate Programs)

Frank Dunleavy, Audit Manager

Robert Jenness, Lead Auditor

Douglas Barneck, Senior Auditor

Stanley Pinkston, Senior Auditor

Lisa Stoy, Senior Auditor

William Tran, Senior Auditor

Debra Mason, Auditor

 

Appendix III

 

Report Distribution List

 

Commissioner  C

Office of the Commissioner – Attn:  Chief of Staff  C

Deputy Commissioner for Services and Enforcement  SE

Deputy Commissioner, Large and Mid-Size Business Division  SE:LM

Director, Performance, Quality, and Audit Assistance  SE:LM:Q

Chief Counsel  CC

National Taxpayer Advocate  TA

Director, Office of Legislative Affairs  CL:LA

Director, Office of Program Evaluation and Risk Analysis  RAS:O

Office of Management Controls  OS:CFO:AR:M

Audit Liaison:  Commissioner, Large and Mid-Size Business Division  SE:LM

 

Appendix IV

 

Examiner Annual Performance Appraisal Form

 

The following form is used to evaluate examiners in the Large and Mid-Size Business Division. 

 

The form was removed due to its size.  To see the form, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.

 

Appendix V

 

Team Manager Performance Agreement Form

 

The following form is used to evaluate managers in the Large and Mid-Size Business Division.

 

The form was removed due to its size.  To see the form, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.

 

Appendix VI

 

The Large and Mid-Size Business Division’s Quality Measurement System

 

The Office of Performance, Quality, and Audit Assistance, within the Large and Mid-Size Business (LMSB) Division, has responsibility for the LMSB Division Quality Measurement System.  Among other purposes, the LMSB Division uses the system to measure the quality of Industry Case examinations against four standards:  (1) Planning the Examination; (2) Inspection/Fact Finding; (3) Development, Proposal, and Resolution of Issues; and (4) Workpapers and Reports.  Each standard also has several key elements that elaborate on the overall standard.  Table 1 summarizes the standards and associated key elements.

 

Table 1:  Summary of the LMSB Division’s Quality Measurement System (as of September 2003)

Standard 1:  Planning the Examination

Key Elements:
·   Was appropriate information considered in the preplanning process?
·   Were material items identified?
·   Was an appropriate initial risk analysis performed?
·   Were timely referrals to specialists and requests for support made?
·   Were all required procedures followed for Form 1065 and 1120-S returns?
·   Did the audit plan adequately set forth the scope and depth of the examination?
·   Did the audit plan include a realistic estimated completion date and realistic time periods for development of issues/areas?
·   Were audit procedures documented during the planning process?
·   Did the planning process have adequate taxpayer involvement?

Overview:  The standard evaluates whether the audit plan identifies material issues; whether initial requests for information are clear, concise, and appropriate and address the potential issues selected; and whether all necessary steps are taken to set the groundwork for a complete examination.

Standard 2:  Inspection/Fact Finding

Key Elements:
·   Were appropriate audit procedures and examination techniques used?
·   Were requests for information clear and concise?
·   Were Computer Audit Specialist applications used in obtaining necessary information?
·   Was there communication with the taxpayer to reach an understanding of the facts regarding material issues?
·   Were mandatory Information Document Requests issued as appropriate?

Overview:  Appropriate audit procedures and examination techniques, including interviews, written requests, inspection, observation, and other fact finding techniques, should be used to gather sufficient, competent information to determine the correct tax liability.

Standard 3:  Development, Proposal, and Resolution of Issues

Key Elements:
·   Were the issues appropriately developed based upon the facts obtained?
·   Was the time commensurate with the complexity of the issues?
·   Was appropriate advice and assistance obtained from resources outside the team?
·   Was there timely and effective communication among all team members?
·   Did the case file reflect a reasonable interpretation, application, and explanation of the law based upon the facts and circumstances of the examination?
·   Were penalties considered and applied as warranted?
·   Was an appropriate midcycle risk analysis performed?
·   Were the Forms 5701 clear and concise?
·   Were proposed adjustments discussed with the taxpayer prior to issuance of Form 5701?
·   Did the team adequately consider responses to Forms 5701 provided by the taxpayer?
·   Were appropriate actions taken to resolve issues at the lowest level?
·   Was there meaningful managerial involvement to resolve issues at the lowest level?

Overview:  Due professional care should be exercised in the application of the tax law.  The taxpayer should be given an opportunity to participate in issue development.  Notices of proposed adjustment and attachments should be stated in terms understandable to the taxpayer; they should clearly state the issue, facts, law, Federal Government’s position, taxpayer’s position, and conclusions.

Standard 4:  Workpapers and Reports

Key Elements:
·   Were workpapers legible/organized?
·   Were examination activities properly documented by using agent activity records or quarterly narratives?
·   Did the workpapers adequately document the audit trail, techniques, and conclusions?
·   Were applicable report-writing procedures followed?
·   Did the team manager review the audit report prior to issuance?
·   Were factual and legal differences in the taxpayer’s protest addressed?

Overview:  Workpapers are the link between the examination work and the report.  They should contain the evidence to support the facts and conclusions contained in the report.  Written reports should communicate the findings and examination in a professional manner.

Source:  Large and Mid-Size Business Division Focus on Quality Examinations (LQMS) (Document 12076).

 

Appendix VII

 

Management’s Response to the Draft Report

 

The response was removed due to its size.  To see the response, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.