Consistent and Effective Manager Involvement Is Needed in Examinations of Large Businesses

 

February 2004

 

Reference Number:  2004-30-054

 

This report has cleared the Treasury Inspector General for Tax Administration disclosure review process, and information determined to be restricted from public release has been redacted from this document.

 

February 27, 2004

 

 

MEMORANDUM FOR COMMISSIONER, LARGE AND MID-SIZE BUSINESS DIVISION

 

FROM:     Gordon C. Milbourn III /s/ Gordon C. Milbourn III

                 Acting Deputy Inspector General for Audit

 

SUBJECT:     Final Audit Report - Consistent and Effective Manager Involvement Is Needed in Examinations of Large Businesses (Audit # 200230045)

 

This report presents the results of our review to determine whether team manager workload reviews are an effective tool in managing the outcomes of examinations in the Large and Mid-Size Business (LMSB) Division Industry Case (IC) Program.

In summary, compared to 1996, team managers have fewer open examinations to manage and are no longer responsible for managing all types of examinations, just those associated with the nation’s largest taxpayers.  However, our analysis of IC examinations, managerial practices, and the LMSB Division’s Quality Measurement System (LQMS) indicates team managers may be missing opportunities to more effectively control the timeliness and quality of IC examinations.  In Fiscal Year (FY) 2002, the LMSB Division closed examinations on 8,636 IC returns, 3,810 of which had been in status 12 (examination started) for more than a year.  This represents a 44 percent over-age inventory.  Despite the over-age inventory, we did not consistently find documentation of team manager involvement in the over-age IC examinations we reviewed.  When team managers were involved, we found very few instances in which they had documented action plans or target dates for closing the examinations, even though there were numerous periods of unexplained inactivity exceeding 45 days.  Finally, the LQMS has reported quality concerns with the IC examinations closed in FY 2003.  In the cases reviewed by the LQMS staff, only 54 percent adequately documented examiner audit trails, techniques, and conclusions in the examination working papers, and only 38 percent identified material tax issues during examination planning.

To better ensure team managers are controlling the timeliness and quality of examinations, we recommended the Commissioner, LMSB Division, require team managers to more consistently document their reviews in the working papers.  Although the LMSB Division requires managerial reviews of examination work, it gives managers discretion on the nature and frequency of their reviews.  Moreover, it does not specifically require documentation of all reviews in the working papers even though generally accepted governmental auditing standards require such documentation.

We also made two recommendations to the Commissioner, LMSB Division, that will strengthen management controls to ensure more consistent and effective managerial involvement in IC examinations.  First, LQMS procedures should be modified to include determining whether team managers are consistently and effectively involved in all IC examinations selected for LQMS review.  At present, the LQMS staff determines the sufficiency of managerial involvement in only those examinations that have disputed tax issues.  This resulted in determining the effectiveness of managerial involvement in just 71 (17 percent) of the 425 IC examinations reviewed in FY 2003.  Second, updated guidelines need to be developed and provided to team managers that standardize and describe in detail the review processes managers should use and follow in evaluating examination work.  For years, the Internal Revenue Manual (IRM) contained an Examination Group Manager’s Handbook that served this purpose.  However, it was eliminated when the IRM was revised to reflect changes associated with the Internal Revenue Service’s (IRS) modernization effort.

Management’s Response:  The Commissioner, LMSB Division, agreed with our assessment that managers need to consistently and effectively perform and document examination reviews.  To ensure this, the Commissioner, LMSB Division, is choosing to rely on a new risk analysis process that is scheduled for implementation in September 2004.

The Commissioner did not agree to modify the LQMS auditing standards, choosing instead to rely upon the new risk analysis process.  However, as part of the new risk analysis process, the LQMS Reviewers’ Guide is being revised to reflect the additional manager responsibilities for evaluation under existing LQMS auditing standards.  Further, the Commissioner stated a Risk Analysis Design Team is developing a new IRM section that will contain specific and standard guidelines for oversight and guidance for all LMSB Division examinations, including guidelines related to the new risk analysis process.  We believe if management follows through with implementing the new risk analysis process as described in the response, they should be able to determine whether team managers are consistently and effectively involved in IC examinations. 

In the LMSB Division response, the Commissioner also provided technical comments to clarify a specific section of the draft report that described the number of cases managers were responsible for controlling.  We incorporated these comments into the report where appropriate.  Management’s complete response to the draft report is included as Appendix VI.

Copies of this report are also being sent to the IRS managers who are affected by the report recommendations.  Please contact me at (202) 622-6510 if you have questions, or Richard Dagliolo, Acting Assistant Inspector General for Audit (Small Business and Corporate Programs), at (631) 654-6028.

 

Table of Contents

Background

Team Managers Are Better Positioned to Improve the Timeliness and Quality of Examinations

Team Managers Need to Document Their Involvement in Examinations More Consistently

Recommendation 1:

Management Controls Need Strengthening to Ensure Consistent and Effective Managerial Involvement in Examinations

Recommendations 2 and 3:

Appendix I – Detailed Objective, Scope, and Methodology

Appendix II – Major Contributors to This Report

Appendix III – Report Distribution List

Appendix IV – Survey of Industry Case Examination Team Managers

Appendix V – The Large and Mid-Size Business Division’s Quality Measurement System

Appendix VI – Management’s Response to the Draft Report

 

Background

To measure the timeliness of examinations, the Large and Mid-Size Business (LMSB) Division uses cycle time, which is defined as the average number of months from when a return is filed until the examination process is completed.  In Fiscal Years (FY) 2002 and 2003, examinations in the LMSB Division Industry Case (IC) Program were considered timely if, on average, they were completed within 31 and 35 months, respectively.
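To make the cycle time arithmetic concrete, the short Python sketch below computes an average cycle time from filing and closing dates.  This is our own illustration only: the dates are hypothetical and the average-month conversion is a simplifying assumption, not an IRS calculation method.

    from datetime import date

    DAYS_PER_MONTH = 365.25 / 12  # simplifying assumption: average month length

    def cycle_time_months(filed: date, closed: date) -> float:
        # Months from the date the return was filed until the
        # examination process was completed.
        return (closed - filed).days / DAYS_PER_MONTH

    # Hypothetical examinations (filing date, examination completion date).
    examinations = [
        (date(1999, 9, 15), date(2002, 1, 31)),
        (date(2000, 3, 15), date(2002, 8, 30)),
    ]

    average = sum(cycle_time_months(f, c) for f, c in examinations) / len(examinations)
    print(f"Average cycle time: {average:.1f} months")  # compare to the 31-month FY 2002 goal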

To define examination quality, the Division uses four quality standards:  (1) Planning the Examination; (2) Inspection/Fact Finding; (3) Development, Proposal, and Resolution of Issues; and (4) Workpapers and Reports.  Each standard also has several key elements that elaborate on the overall standard.  (See Appendix V for more details on the standards and their associated elements).

The primary tool used by the LMSB Division to control the timeliness and quality of examinations is the review of ongoing examination work.  This review is the responsibility of the LMSB Division’s team managers, who are responsible for ensuring the timeliness and quality of examinations done by the examiners on their team.  To meet this responsibility, team managers can use a variety of processes, such as ongoing observations and discussions with examiners, reviews of work during examinations and after they are closed, and monthly reviews of examiners’ time reports.  Through these reviews, team managers attempt to identify problems with the timeliness and quality of examinations so examiners can take prompt corrective actions.

After an IC examination is closed, the staff of the LMSB Division’s Quality Measurement System (LQMS) may review the case file to assess the degree to which the examiner complied with the quality standards.  The purpose of these reviews is to collect information about the examination process, communicate areas of concern to top management, identify potential training needs, and improve work processes.

We performed our audit in accordance with Government Auditing Standards at the LMSB Division field offices in the Los Angeles, California; Dallas, Texas; and New York, New York, metropolitan areas between August 2002 and February 2003.  Detailed information on our audit objective, scope, and methodology is presented in Appendix I.  Major contributors to the report are listed in Appendix II.

Team Managers Are Better Positioned to Improve the Timeliness and Quality of Examinations

In the past, administrative demands on managers’ time, combined with broad spans of control, hampered their ability to take a greater role in the examination process.  As reported by the General Accounting Office (GAO) in 1997, the lack of managerial involvement in IC examinations was a contributing factor to a trend showing examinations were taking longer to complete and generating less in additional recommended taxes.  The GAO found:

…managers were responsible for many revenue agents and other auditors who audit a range of tax entities, from individual returns through complex corporate returns that involve different tax rules and issues.  [M]anagers tended to focus attention on newer staff and administrative duties.

Since 1997, the Internal Revenue Service (IRS) has made significant progress in positioning team managers to take a greater role in the examination process.  The most significant step was establishing the LMSB Division to serve and ensure the compliance of the nation’s largest taxpayers.  With the grouping together of large taxpayers, there has been a noticeable impact on the number and types of examinations that must be managed by LMSB Division team managers.

Compared to 1996, for example, LMSB Division team managers have fewer open examinations to manage at any specific time and are no longer responsible for managing all types of examinations, just those associated with the nation’s largest taxpayers.  Also, their teams are composed of only the most experienced examiners.  Table 1 shows that, on average, in 2003 these team managers had 257 fewer examinations under their control than in 1996.

Table 1:  Average Inventory of Open Examinations Under Team Managers in 1996 and 2003

Type of Return            Average Number of Returns    Average Number of IC Returns    Difference Between
Under Examination         per Examination Manager      per LMSB Team Manager           1996 and 2003
                          in 1996                      in 2003

Individual                          197                            9                          188
Corporate                            43                           12                           31
Partnership                           4                            4                            0
Others                               45                            7                           38
Overall Avg.                        289                           32*                         257

* The Commissioner, LMSB Division, indicated in the response to the draft report that the overall average number of returns per team manager in 2003 was 52 if Coordinated Industry Case (CIC) returns were included in the analysis.

Source:  Treasury Inspector General for Tax Administration’s analysis of the IRS Audit Information Management System (a computer system used to control returns, input assessments and adjustments to the Master File, and provide management reports).  The Master File is the IRS database that stores various types of taxpayer account information.

A number of other actions associated with the modernization effort are perhaps less apparent but should also encourage greater management involvement in casework.  These actions include:

·        Creating a Managers’ Advisory Group to serve as an informal forum for managers to discuss issues of interest and elevate areas of concern to senior level management.

·        Implementing a new employee evaluation system designed to align performance expectations with the IRS’ three balanced measures of performance (customer satisfaction, employee satisfaction, and business results).

·        Establishing an online human resources system to allow users to initiate paperless personnel transactions, which can be approved by managers electronically.  This will reduce the administrative burden on managers because required information is entered into the system only once.

·        Developing computer-based training packages to deliver briefings to Revenue Agents on mandatory topics that were previously the team manager’s responsibility.

·        Realigning the LMSB Division’s Territory and team structure to reduce the number of industry groups represented at each post of duty, thus reducing the dispersion of some teams over wide geographic areas.  One of the stated purposes of this action was to “ease burden on managers by facilitating decision making and accountability.”

·        Introducing publications intended to standardize the timeliness and content of communications within the LMSB Division, as well as control the volume of communications by highlighting essential information.  These publications include Red Book communications, which provide information on LMSB Division-wide executive-level decisions, new initiatives, and other matters expected to directly affect frontline managers and their teams; and Managers’ News Briefs, which are short, concise, easy-to-read informational messages for managers consolidated into a single document.

As envisioned in the IRS’ 2000 Organization Blueprint, the modernization effort has better positioned team managers to increase their involvement in IC examinations.  However, our analysis indicates additional steps could be taken to reinforce the emphasis on increasing manager involvement in examinations.  Specifically, team manager involvement in the examination process needs to be documented more consistently, and management controls could be strengthened.

Team Managers Need to Document Their Involvement in Examinations More Consistently

Both within and outside the Federal Government, the primary control process for ensuring quality audits are completed timely is the supervisory review of auditors’ work.  As an example of a best practice, generally accepted governmental auditing standards require that supervisors review all auditor working papers and that evidence of these reviews be maintained in the working papers.  The American Institute of Certified Public Accountants has similar requirements for audits conducted in the private sector.

The LMSB Division requires managers to review examination work but gives them discretion on the frequency and nature of their reviews.  For example, our survey of all team managers conducting IC examinations found managers use a variety of managerial practices to provide oversight and involvement in IC examinations.  These practices include ongoing observations and discussions with examiners, reviews of work during examinations and after they are closed, and reviews of examiners’ monthly time reports.  However, the LMSB Division does not specifically require that all of these reviews be documented in the working papers, despite generally accepted governmental auditing standards requiring such documentation.  As a result, opportunities to better control the timeliness and quality of examinations may be missed.

In FY 2002, the LMSB Division closed examinations on 8,636 IC returns, 3,810 of which had been in status 12 (examination started) for more than a year.  This represents a 44 percent over-age inventory.  We reviewed 47 over-age cases (75 returns) that, on average, exceeded the LMSB Division’s cycle time goal by 3 months and found no documentation of team manager involvement in 21 (45 percent) of the 47 cases.  In reviewing the examinations, we considered evidence of managerial involvement to include any indication of team manager directions, comments, initials, or notations in the case files.
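For readers checking the arithmetic, the 44 percent figure is simply the share of closed IC returns that had been in status 12 for more than a year.  A minimal sketch (the variable names are ours, for illustration only):

    closed_ic_returns = 8636   # IC returns closed in FY 2002
    over_age_returns = 3810    # in status 12 (examination started) for more than a year

    over_age_rate = over_age_returns / closed_ic_returns
    print(f"Over-age inventory: {over_age_rate:.0%}")  # prints "Over-age inventory: 44%"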

Because reviews may have been conducted but not documented in the case files, we also visited 3 large metropolitan areas and reviewed another 21 cases (67 returns) from 9 groups; these cases were still open in FY 2003 and had been open for more than a year.  Although the percentage of cases without any documented managerial involvement was lower in this sample (19 percent, compared to 45 percent in the closed cases), there were 45 periods of unexplained inactivity ranging from 49 to 639 days.  Except for one team, we found very few instances in which managers developed action plans or target dates for closing the cases.  The IRS has traditionally considered action plans and target dates for completing specific actions to be effective techniques for managing and controlling examinations.

Although we did not review case files to evaluate the quality of examinations, the LQMS is finding concerns in areas of the four standards the LMSB Division uses to define examination quality.  As shown in Table 2, the LQMS reported examiner audit trails, techniques, and conclusions were adequately documented in only 54 percent of the examination working papers reviewed in FY 2003.  In addition, material tax issues were identified during the planning phase of only 38 percent of the examinations.

Table 2:  Fiscal Year 2003 Pass Rates for Selected Key Quality Elements in the LMSB Division’s Examination Standards

Audit Standard                                      Key Quality Element                                         FY 2003 Pass Rate

Planning the Examination                            Identifying material tax issues.                                        38%
Inspection/Fact Finding                             Using appropriate examination procedures and techniques.               63%
Development, Proposal, and Resolution of Issues     Considering and applying penalties.                                    49%
Workpapers and Reports                              Adequately documenting the audit trail, techniques, and                54%
                                                    conclusions in working papers.

Note:  The pass rate measurement computes the percentage of examinations that showed the characteristics of the key element.

Source:  LMSB Division data.

We believe the quality of the working papers is a particular source of concern because it raises questions about whether managers are controlling the timeliness and quality of examinations as intended.  Like others in the auditing community, the LMSB Division considers working papers an important aspect of the overall quality of an examination.  Working papers provide the principal support for the scope of the examination, procedures used, evidence examined, and conclusions reached.  They are especially important when a taxpayer does not agree with an examiner’s conclusion that additional taxes are owed.  In these instances, the working papers are used to resolve differences over how much, if any, additional tax is owed. 

Recommendation

1.      To better ensure team managers are controlling the timeliness and quality of examinations as intended, the Commissioner, LMSB Division, should require team managers to document their reviews in working papers more consistently.  The documentation should include brief summaries of discussions held and action plans developed.

Management’s Response:  The Commissioner, LMSB Division, agreed with our assessment that managers need to consistently and effectively perform and document examination reviews.  To ensure this, the Commissioner, LMSB Division, is choosing to rely upon a new risk analysis process that is scheduled for implementation in September 2004.

Management Controls Need Strengthening to Ensure Consistent and Effective Managerial Involvement in Examinations

According to the GAO Standards for Internal Control in the Federal Government, control activities are the policies, procedures, techniques, and mechanisms established to assist agencies in achieving their objectives.  To meet its objective of ensuring team manager involvement in examinations, the LMSB Division has several control components. 

At the top of the organization, there is a broad policy to resolve tax issues at the lowest level.  In practice, this requires team managers to contact taxpayers when disagreements surface in examinations.  The purpose of the contact is to resolve disputes or document the reasons for the disagreement in the working papers.  In addition, the LMSB Division uses the LQMS as a mechanism for measuring the sufficiency of managerial involvement in examinations and provides managers with ready access to official procedures governing the examination process.  However, as the LMSB Division moves forward there are steps that could be taken to strengthen the LQMS and the official procedures governing examinations.

The LQMS staff reviews a sample of closed IC examinations sufficient to assess whether team managers are resolving issues during examinations.  Among other things, these reviews are intended to communicate areas of concern, such as inadequate team manager involvement in examinations, to top management so corrective actions can be taken if needed.  However, the LQMS staff determines the sufficiency of managerial involvement in only those examinations that have disputed tax issues.  Consequently, top management may not be getting a complete picture of team manager involvement in examinations from the LQMS.  In FY 2003, the sufficiency of team manager involvement was determined in just 71 (17 percent) of the 425 IC examinations reviewed.

The Internal Revenue Manual (IRM) serves as the official compilation of procedures, instructions, and guidelines that govern the examination process in the IRS.  For years, the IRM contained an Examination Group Manager’s Handbook that was designed to assist frontline managers in meeting their responsibility to ensure quality examinations are completed timely.  Among other things, the Handbook standardized and described in detail the review processes, such as workload reviews and on-the-job visits, that managers should use in evaluating examination work. 

However, recent changes to the IRM associated with the IRS’ modernization effort replaced the Examination Group Manager’s Handbook with a new Small Business/Self-Employed (SB/SE) Compliance Field Examination Group Manager Guide.  Unlike the old Handbook that was designed for use agency-wide, the new Guide is uniquely focused on the frontline managers in the IRS SB/SE Division.  Consequently, at present, there are no specific and standard guidelines for team managers to use in reviewing examinations in the LMSB Division.

Recommendations

To strengthen management controls, the Commissioner, LMSB Division, should coordinate with the Director, Quality Assurance and Performance Management, in:

2.      Modifying the LQMS procedures to include determining whether team managers are consistently and effectively involved in all IC examinations selected for LQMS review.

Management’s Response:  The Commissioner, LMSB Division, did not agree to modify the LQMS auditing standards, choosing instead to rely upon a new risk analysis process that is being implemented.  As part of the new risk analysis process, the LQMS Reviewers’ Guide is being revised to reflect the additional manager responsibilities for evaluation under existing LQMS auditing standards.

Office of Audit Comment:  We believe if management follows through with implementing the new risk analysis process as described in the response, they should be able to determine whether team managers are consistently and effectively involved in IC examinations.

3.      Providing team managers with specific and standard guidelines to use in providing oversight and guidance to examinations.

Management’s Response:  The Commissioner, LMSB Division, stated that a Risk Analysis Design Team is developing a new IRM section that will contain specific and standard guidelines for oversight and guidance for all LMSB Division examinations, including guidelines related to the new risk analysis process. 

 

Appendix I

 

Detailed Objective, Scope, and Methodology

 

Our objective was to determine whether team manager workload reviews are an effective tool in managing the outcomes of Industry Case (IC) examinations within the Large and Mid-Size Business (LMSB) Division.  To meet our objective, we relied upon the Internal Revenue Service’s (IRS) internal management reports and databases.  We did not establish the reliability of these data because extensive data validation tests were outside the scope of this audit and would have required a significant amount of time.  Our tests included:

I.      Reviewing the IRS’ policies and procedures to determine the level of involvement expected of team managers in controlling IC examinations.

II.     Reviewing prior General Accounting Office (GAO) and Treasury Inspector General for Tax Administration reports to identify past concerns, if any, with IC examinations and the corrective actions taken in response to any concerns reported.

III.    Analyzing a judgmental sample of approximately 50 out of 1,735 corporate IC examinations that were closed in Fiscal Year (FY) 2002 and 21 out of the 6,541 corporate IC examinations that were open as of October 2002, to assess the sufficiency of managerial involvement in the examinations.  Judgmental sampling was used to minimize time and travel costs.

IV.     Evaluating the status and impact of initiatives to reduce the administrative burden on frontline managers, including recommendations made by the Taxpayer Treatment and Service Improvement Executive Steering Committee, the Professional Managers Association, and the LMSB Division Managers’ Advisory Group.

V.      Analyzing FYs 1995 through 2003 data from the Audit Information Management System (AIMS) to identify trends in IC examinations, including the number of returns examined, examination cycle time, and the average inventories of open examinations under team managers.

VI.     Using the GAO Standards for Internal Control in the Federal Government to assess the adequacy of controls established to ensure team managers are involved in the examinations.

VII.    Evaluating the LMSB Division’s Quality Measurement System to determine if managerial involvement was assessed during reviews and, if so, whether it had been identified as a problem.

VIII.   Surveying all LMSB Division team managers conducting IC examinations to determine how they became involved in examinations and whether potential barriers hampered their ability to become more involved in the examinations they control.

 

Appendix II

 

Major Contributors to This Report

 

Richard Dagliolo, Acting Assistant Inspector General for Audit (Small Business and Corporate Programs)

Philip Shropshire, Director

Frank Dunleavy, Audit Manager

Earl Charles Burney, Senior Auditor

Robert Jenness, Senior Auditor

Lawrence Smith, Senior Auditor

 

Appendix III

 

Report Distribution List

 

Commissioner  C

Office of the Commissioner – Attn:  Chief of Staff  C

Deputy Commissioner for Services and Enforcement  SE

Acting Deputy Commissioner, Large and Mid-Size Business Division  SE:LM

Director, Quality Assurance and Performance Management, Large and Mid-Size Business Division  SE:LM:Q

Chief Counsel  CC

National Taxpayer Advocate  TA

Director, Office of Legislative Affairs  CL:LA

Director, Office of Program Evaluation and Risk Analysis  RAS:O

Office of Management Controls  OS:CFO:AR:M

Audit Liaison:  Commissioner, Large and Mid-Size Business Division  SE:LM

 

Appendix IV

 

Survey of Industry Case Examination Team Managers

 

The following survey questions were contained in an online survey hosted on the Internal Revenue Service (IRS) Intranet at http://survey.web.irs.gov/ICmanager/default.asp.  Notification about the survey was e-mailed to all frontline managers conducting Industry Case examinations by the Large and Mid-Size Business Division Office of Communications and Liaison on February 6, 2003, asking for completion by February 14, 2003.

Welcome

Welcome to the IC Team Manager Survey. Since this is a step-by-step form, returning to incorrect answers may be troublesome, so please answer each of the following questions as thoroughly as possible and verify your answers before continuing. Click the "Next" button after answering each question.

This is an anonymous survey. However, at the end of the survey, you will be asked to enter your name into a form field. Please provide this information. It will not be linked with the answers you provided. It is only used to verify how many individuals completed the survey.

Contact Robert Jenness, TIGTA Senior Auditor, if you have any problems completing this survey or other questions regarding the review. He may be reached at (213) 894-4470, x119 or by E-mail at Robert.Jenness@tigta.treas.gov.

 

Part A:  General Questions

 

 

 

1.   Were you assigned as a manager of an examination team that performed at least “some” Industry Case (IC) examinations in FY 2002?  (i.e., not primarily a Coordinated Industry Case team.)

     ANSWERS                                           Number    Applic. %
     Yes (Go to next question.)                           190          74%
     No (Go to Question 18.)                               67          26%

2.   If you managed an examination team that performed at least some IC work in FY 2002, did you conduct any formal documented workload reviews (see IRM 114.1.3.8.6) for the Revenue Agents (RAs) assigned to your team during FY 2002?

     ANSWERS                                           Number    Applic. %
     Yes (Go to next question.)                           146          77%
     No (Go to Question 7.)                                44          23%

3.   If you performed documented workload reviews in FY 2002 for the RAs performing IC work assigned to your team, did you perform at least one for each RA?

     ANSWERS                                           Number    Applic. %
     Yes (Go to next question.)                           113          77%
     No (Go to Question 9.)                                33          23%

4.   If you performed at least one documented workload review for each RA performing IC work assigned to your team, approximately how many workload reviews did you conduct in FY 2002?  (When complete, go to next question.)

     ANSWERS                                           Number    Applic. %
     Approx. 1 per RA assigned                             64          57%
     Approx. 2 per RA assigned                             35          31%
     Approx. 3 or more per RA                              14          12%

5.   Were you able to conduct all the documented workload reviews you deemed necessary for your team in FY 2002?

     ANSWERS                                           Number    Applic. %
     Yes (Go to Question 12.)                             104          92%
     No (Go to next question.)                              9           8%

6.   What was the reason, or reasons, you were not able to conduct all the documented workload reviews you deemed necessary for your team in FY 2002?  (Multiple answers allowed.  When complete, go to Question 12.)

     ANSWERS                                                                             Number    Applic. %
     Administrative burdens (personnel issues, mandatory briefings/training, etc.)            8          89%
     Span of control (too many RAs and/or dispersed over large area)                          3          33%
     Lack of clerical support (managers performing clerical tasks due to lack of
       clerical support)                                                                      2          22%
     Communication problems (volume of e-mail, technical problems with computers
       and/or telecommunications, etc.)                                                       3          33%
     Other reason (describe in textbox)                                                       3          33%


Part B:  Managers Not Conducting Any Workload Reviews for RAs in FY 2002

7.   What was the reason, or reasons, that you did not conduct any documented workload reviews for the RAs performing IC work assigned to your team in FY 2002?  (Multiple answers allowed.  When complete, go to next question.)

     ANSWERS                                                                             Number    Applic. %
     Administrative burdens (personnel issues, mandatory briefings/training, etc.)           14          32%
     Span of control (too many RAs and/or dispersed over large area)                          5          11%
     Lack of clerical support (managers performing clerical tasks due to lack of
       clerical support)                                                                      5          11%
     Communication problems (volume of e-mail, technical problems with computers
       and/or telecommunications, etc.)                                                       5          11%
     Other reason (describe in textbox)                                                      36          82%

8.   Did you use another documented method, other than workload reviews, to manage your examiners’ IC inventories, such as those listed in IRM 114.1.3.8?  (Multiple answers allowed.  When complete, go to Question 16.)

     ANSWERS                                           Number    Applic. %
     In-process case review                                20          45%
     On-the-job visits                                     31          70%
     Other method (describe in textbox)                    16          36%


Part C:  Managers Not Conducting Workload Reviews for All RAs in FY 2002

9.   If you did not perform at least one documented workload review for each RA performing IC work assigned to your team in FY 2002, approximately what percentage of your RAs received documented workload reviews in FY 2002?  (When complete, go to next question.)

     ANSWERS                                           Number    Applic. %
     Approx. less than 25%                                  7          21%
     Approx. 25% to 50%                                    10          30%
     Approx. 51% to 75%                                    14          42%
     Approx. more than 75%                                  2           6%

10.  What was the reason, or reasons, that you did not conduct documented workload reviews for all RAs performing IC work assigned to your team in FY 2002?  (Multiple answers allowed.  When complete, go to next question.)

     ANSWERS                                                                             Number    Applic. %
     Administrative burdens (personnel issues, mandatory briefings/training, etc.)           11          33%
     Span of control (too many RAs and/or dispersed over large area)                          5          15%
     Lack of clerical support (managers performing clerical tasks due to lack of
       clerical support)                                                                      3           9%
     Communication problems (volume of e-mail, technical problems with computers
       and/or telecommunications, etc.)                                                       2           6%
     Other reason (describe in textbox)                                                      25          76%

11.  Did you use another documented method, other than workload reviews, to manage your examiners’ IC inventories, such as those listed in IRM 114.1.3.8?  (Multiple answers allowed.  When complete, go to next question.)

     ANSWERS                                           Number    Applic. %
     In-process case review                                21          64%
     On-the-job visits                                     28          85%
     Other method (describe in textbox)                     7          21%


Part D:  Managers Conducting at Least Some Workload Reviews in FY 2002

12.  For the documented workload reviews conducted in FY 2002, approximately how many hours were required for the average workload review?  (When complete, go to next question.)

     ANSWERS                                           Number    Applic. %
     Less than 4 hours                                     49          34%
     Approx. 4 to 8 hours                                  70          48%
     More than 8 hours                                     27          18%

13.  For the workload reviews you conducted in FY 2002, did you observe any of the following areas of concern in any examination (see IRM 114.1.3.6(7))?  (Multiple answers allowed.  If any concerns, go to next question; if no areas of concern, go to Question 16.)

     ANSWERS                                                                             Number    Applic. %
     Adequacy of inventory                                                                   75          51%
     Work problems and delays                                                               105          72%
     Awareness to fraud indicators                                                           14          10%
     Compatibility of work with grade                                                        18          12%
     Need for special advice and assistance                                                  82          56%
     Awareness of computer assisted audit program and utilization of computer
       audit specialist                                                                      40          27%
     Consider retention requirements                                                          8           5%
     Application of statistical sampling techniques                                          15          10%
     Use of computer report writing programs                                                 36          25%
     Awareness of the MSSP and ISP                                                           34          23%
     AIMS/ERCS controls                                                                      47          32%
     Other concern (describe in textbox)                                                     23          16%

14.  For the areas of concern noted in the question above, were you able to resolve the problem based on your workload review and subsequent follow-up?

     ANSWERS                                           Number    Applic. %
     Yes, all problems (Go to next question.)              60          44%
     Yes, some problems (Go to next question.)             61          45%
     No (Go to Question 16.)                               14          10%

15.  If you resolved any of the areas of concern listed in Question 13, indicate if you used any of the methods listed below (see IRM 114.1.3.6(11)).  (Multiple answers allowed.  When complete, go to next question.)

     ANSWERS                                                                             Number    Applic. %
     Suggest new approaches to your employee in scheduling appointments                      36          25%
     Point out valid objections by the taxpayer that the employee has failed
       to consider                                                                           37          25%
     Urge employees to reach decisions when there is enough information to do so             84          58%
     Increase/decrease in scope of examination                                               97          66%
     Provide additional resources (staff hours, travel funds, etc.)                          44          30%
     Other method (describe in textbox)                                                      16          11%


Part E:  Conclusion

16.  Are you aware of any IRS initiative(s) (planned or underway) to eliminate any of the barriers to conducting documented workload reviews cited in your responses?

     ANSWERS                                                      Number    Applic. %
     Yes (describe in textbox, then go to next question)              10           5%
     No (Go to Question 18.)                                         148          78%
     N/A—no barriers cited (Go to Question 18.)                       32          17%

17.  If you answered “yes” to the previous question, do you believe that the IRS initiative(s) you described will eliminate the barriers to conducting documented workload reviews that you cited in your responses?

     ANSWERS                                                                             Number    Applic. %
     Yes (Go to next question.)                                                               3          30%
     No (describe in textbox why initiative(s) will not be effective, then go
       to next question)                                                                      7          70%

18.  Do you have any suggestions to improve the workload review process or reduce the administrative burden on first-line managers?  (describe in textbox)


Part F: Review


Please review the following data before submitting. If a change is needed, you will need to use your browser's "Back" button to return to the question of concern and continue the form from there. This is because the survey is a step-by-step form.
Your name:  ______________________

[Submit]

NOTE: Your name will not be linked with the answers you provided. This is only to confirm that you have completed the survey.


 

 

Appendix V

 

The Large and Mid-Size Business Division’s Quality Measurement System

 

The Office of Quality Assurance and Performance Management, within the Large and Mid-Size Business (LMSB) Division, has responsibility for the LMSB Division’s Quality Measurement System.  The LMSB Division uses the System to, among other things, measure the quality of Industry Case examinations against four standards:  (1) Planning the Examination; (2) Inspection/Fact Finding; (3) Development, Proposal, and Resolution of Issues; and (4) Workpapers and Reports.  Each standard also has several key elements that elaborate on the overall standard.  Table 1 summarizes the standards and associated key elements.

 

Table 1:  Summary of the Large and Mid-Size Business Division’s Quality Measurement System (as of September 2003)

Standard 1:  Planning the Examination

Key Elements:

·   Was appropriate information considered in the preplanning process?

·   Were material items identified?

·   Was an appropriate initial risk analysis performed?

·   Were timely referrals to specialists and requests for support made?

·   Were all TEFRA procedures followed for Form 1065 and Form 1120-S returns?

·   Did the audit plan adequately set forth the scope and depth of the examination?

·   Did the audit plan include a realistic estimated completion date and realistic time frames for development of issues/areas?

·   Were audit procedures documented during the planning process?

·   Did the planning process have adequate taxpayer involvement?

Overview:  The standard evaluates whether the audit plan identifies material issues; whether initial requests for information are clear, concise, and appropriate and address the potential issues selected; and whether all necessary steps are taken to set the groundwork for a complete examination.

Standard 2:  Inspection/Fact Finding

Key Elements:

·   Were appropriate audit procedures and examination techniques used?

·   Were the Information Document Requests clear and concise?

·   Were Computer Audit Specialist applications used in obtaining necessary information?

·   Was there communication with the taxpayer to reach an understanding of the facts regarding material issues?

·   Were mandatory Information Document Requests issued as appropriate?

Overview:  Appropriate audit procedures and examination techniques, including interviews, written requests, inspection, observation, and other fact finding techniques, should be used to gather sufficient, competent information to determine the correct tax liability.

Standard 3:  Development, Proposal, and Resolution of Issues

Key Elements:

·   Were the issues appropriately developed based upon the facts obtained?

·   Was the time commensurate with the complexity of the issues?

·   Were appropriate advice and assistance obtained from resources outside the team?

·   Was there timely and effective communication among all team members?

·   Did the case file reflect a reasonable interpretation, application, and explanation of the law based upon the facts and circumstances of the examination?

·   Were penalties considered and applied as warranted?

·   Was an appropriate midcycle risk analysis performed?

·   Were the Forms 5701 clear and concise?

·   Were proposed adjustments discussed with the taxpayer prior to issuance of Form 5701?

·   Did the team adequately consider responses to Forms 5701 provided by the taxpayer?

·   Were appropriate actions taken to resolve issues at the lowest level?

·   Was there meaningful managerial involvement to resolve issues at the lowest level?

Overview:  Due professional care should be exercised in the application of the tax law.  The taxpayer should be given an opportunity to participate in issue development.  The notice of proposed adjustments and attachments should be stated in terms understandable to the taxpayer, and it should clearly state the issue, facts, law, Federal Government’s position, taxpayer’s position, and conclusions.

Standard 4:  Workpapers and Reports

Key Elements:

·   Were workpapers legible/organized?

·   Were examination activities properly documented by using agent activity records or quarterly narratives?

·   Did the workpapers adequately document the audit trail, techniques, and conclusions?

·   Were applicable report writing procedures followed?

·   Did the team manager review the audit report prior to issuance?

·   Were factual and legal differences in the taxpayer’s protest addressed?

Overview:  Workpapers are the link between the examination work and the report.  They should contain the evidence to support the facts and conclusions contained in the report.  Written reports should communicate the findings and examination results in a professional manner.

Source:  Internal Revenue Service Document 12076, “Focus on Quality Examinations.”

 

Appendix VI

 

Management’s Response to the Draft Report

 

The response was removed due to its size.  To see the response, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.