TREASURY INSPECTOR GENERAL FOR TAX ADMINISTRATION

 

 

The Internal Revenue Service Does Not Adequately Assess the Effectiveness

of Its Training

 

 

 

September 2005

 

Reference Number:  2005-10-149

 

 

This report has cleared the Treasury Inspector General for Tax Administration disclosure review process and information determined to be restricted from public release has been redacted from this document.

 

 

Web Site - http://www.tigta.gov

 

September 19, 2005

 

 

MEMORANDUM FOR DEPUTY COMMISSIONER FOR OPERATIONS SUPPORT

 

FROM:                            Pamela J. Gardiner /s/ Pamela J. Gardiner

                                         Deputy Inspector General for Audit

 

SUBJECT:                    Final Audit Report – The Internal Revenue Service Does Not Adequately Assess the Effectiveness of Its Training (Audit # 200410020)

 

This report presents the results of our review of the Internal Revenue Service (IRS) training programs.  The audit was conducted to address concerns raised by Congress and other stakeholders about the adequacy of the IRS’ training programs.  The overall objective of this review was to evaluate whether training for IRS employees in job series that deal directly with taxpayers and their representatives is periodically assessed to ensure its effectiveness.

Synopsis

The Federal Workforce Flexibility Act of 2004[1] requires agencies to regularly assess their training efforts to determine whether the training is contributing to the successful completion of the agencies’ missions.  The IRS has training assessment and development procedures; however, its operating divisions generally do not follow them.  Although each operating division advised us it follows its own processes, the divisions were generally unable to provide documentation to substantiate that assessments were performed.  For 24 (56 percent) of the 43 training courses we sampled, there was no evidence of an assessment supporting the changes or updates made to the courses.  This lack of documentation prevented us from verifying whether the IRS is taking steps to ensure employees are getting the right training to perform their jobs effectively.

The IRS incurs substantial expense to use the Integrated Training Evaluation and Measurement Services system, a tool to gather and analyze training data for the purpose of evaluating and improving training.  The IRS has paid approximately $4 million over a 7-year period to use this system; nonetheless, it uses the system only to a limited extent.  The system is designed to provide four levels of assessment, and IRS procedures require the use of all four levels.  The IRS recorded only the Level 1 assessments (employee and instructor class evaluations) to any significant extent, and there was little evidence this information was used to improve training.  A better strategy and method are needed to achieve the intended purpose of this system, or it should be discontinued and replaced with a more cost-effective system.

Employee satisfaction survey comments have been used to make improvements to training.  Each of the operating divisions has evaluated these comments and taken action to address concerns by changing training processes and content.  While the analysis of employee comments from the surveys has resulted in certain changes to training, there did not appear to be any analysis of the numerical rankings by division and workgroup to better identify problems or training gaps occurring in specific groups or functions.  About one-fourth of the workgroups we reviewed had scores indicating that employees did not believe they were getting the training they needed.  An analysis of the groups with lower scores might lead to the identification of problems that need to be addressed, while groups with high scores might point to best practices that other groups could emulate.  This type of analysis could be used to identify whether the workgroups in a particular job series or function share common concerns or whether there are significant differences among workgroups, and it could significantly assist the IRS in its training assessment and development process.

Recommendations

The Chief Human Capital Officer (CHCO) should require all business units to follow the assessment and documentation requirements to ensure employees have the knowledge and skills needed to successfully perform their jobs.  Additionally, the CHCO, in coordination with the Deputy Commissioners, should ensure all IRS components follow established procedures to evaluate training in order for the IRS to comply with training assessment requirements of the Federal Workforce Flexibility Act of 2004.  The use of the current training evaluation and measurement system should be required, or an alternate system should be developed that will allow the IRS to evaluate training effectiveness.  Lastly, we recommended the CHCO require the IRS operating divisions to use the numerical scores from the Employee Satisfaction Survey question on training to perform further analysis to identify problems or trends and use this information in the training assessment and development process.

Response

IRS management agreed to implement our recommendations.  The CHCO, through the Director, Leadership and Education Division, and the Learning and Education Policy Sub-council will:

  • Issue policy requiring training needs assessments for mission-critical occupations or occupational specialties to identify knowledge and skill gaps.  Policy statements will be revised to strengthen the requirement to conduct evaluation levels 1, 2, 3, and 4. 
  • Devise a plan for regular review of all training work processes, including those that support training evaluations to ensure compliance with the Federal Workforce Flexibility Act.
  • Issue a policy statement that will require the use of available data sources, both quantitative and qualitative, including analyses of numerical scores from the Employee Satisfaction Survey training question in addition to narrative comments.

Management’s complete response to the draft report is included as Appendix V.

Copies of this report are also being sent to the IRS managers affected by the report recommendations.  Please contact me at (202) 622-6510 if you have questions, or Daniel R. Devlin, Assistant Inspector General for Audit (Headquarters Operations and Exempt Organizations Programs), at (202) 622-8500.

 

 

Table of Contents

 

Background

Results of Review

Training Assessment and Development Procedures Are Generally Not Followed

Recommendation 1:

The System for Training Evaluation and Measurement Is Used Only to a Limited Extent

Recommendation 2:

Employee Satisfaction Survey Results Could Be Better Used to Assist in Training Assessment and Development

Recommendation 3:

Appendices

Appendix I – Detailed Objective, Scope, and Methodology

Appendix II – Major Contributors to This Report

Appendix III – Report Distribution List

Appendix IV – Fiscal Year 2003 Employee Satisfaction Rating of Training for the Job Series Reviewed Within Each Operating Division

Appendix V – Management’s Response to the Draft Report

 

 

Background

 

There have been significant concerns expressed by Congress and other Internal Revenue Service (IRS) stakeholders such as tax practitioners, the National Treasury Employees Union (NTEU), and the IRS Oversight Board as to the adequacy of the IRS’ training programs.  These concerns relate to whether employees have adequate knowledge of IRS procedures and the tax laws to effectively perform their duties and deal with the public and tax professionals.

In a public meeting held by the IRS Oversight Board in January 2004, practitioners raised several concerns about the experience levels and technical ability of IRS employees.  A representative of the National Society of Accountants reported its members have observed an “…expanding training gap at the IRS…” and that its members “…experience instances where IRS employees lack experience and skills to handle difficult problem cases and complex problems are shoved aside.”  A representative of the Tax Executive Institute noted that Large and Mid-Size Business (LMSB) Division specialists need training in accounting principles, the latest technology, and business practices and stated the LMSB Division must “…ensure that agents receive consistent and timely training….”

In a prior audit report, we reported that training deficiencies have caused some taxpayer assistance employees to inaccurately answer tax law questions.[2]  The accuracy of answers to tax law questions has also become a Congressional concern, and in the past year the Government Accountability Office (GAO) has been asked to review the quality of the training provided to employees working in taxpayer assistance centers.

The NTEU stated that, within the IRS, the amount of training employees receive is not usually an issue; rather, the problem is that the training is often of the wrong type.  NTEU officials also advised us that IRS instructors are not properly evaluated and expressed concern that IRS officials have not been receptive to, and have not acted on, the NTEU’s suggestions to improve training.

The IRS spends approximately $100 million per year on training for its employees.  The IRS has over 100,000 employees in more than 150 different job series throughout the nation.  Some of its staff are seasonal and work only during the January through May tax-filing season.  Because the turnover rate among seasonal staff is high, the IRS must hire and train a large number of inexperienced staff each year.  Furthermore, the tax laws the IRS must administer often change from year to year.

The IRS is structured to address the needs of specific types of taxpayers through its four main operating divisions:

  • LMSB Division.
  • Small Business/Self-Employed (SB/SE) Division.
  • Tax Exempt and Government Entities (TE/GE) Division.
  • Wage and Investment (W&I) Division.

The responsibility for training is decentralized in the IRS based on the precept that the operating divisions are in the best position to meet the training and educational needs of their employees.  The operating divisions are accountable for developing strategic training plans and goals and their own training budgets.

Due to concerns about the effectiveness of training Government-wide, Congress passed the Federal Workforce Flexibility Act of 2004.[3]  The Act requires agencies to regularly assess their training efforts to determine if their training is contributing to the successful completion of the agencies’ missions.

The IRS Chief Human Capital Officer (CHCO) provides tools to the IRS operating divisions and functional units to assess and evaluate the training they provide their employees.  These tools include the Training Development Quality Assurance System (TDQAS) and the Integrated Training Evaluation and Measurement Services (ITEMS) system.  Additionally, all operating divisions and functional units receive information from the Employee Satisfaction Survey related to training.

This audit of the IRS’ efforts to evaluate and improve training is the second in a series of audits of the IRS’ training programs.  Our previous audit evaluated the accuracy and completeness of IRS training information.[4]  The scope of this audit covered the training for the five specific job series that have the most contact with taxpayers and their representatives.  These five job series account for about one-half of the IRS’ total workforce.

·         Revenue Agent (Series 512) – Examines all types of Federal tax returns.

·         Tax Technician (Series 526) – This series includes Tax Compliance Officers in the SB/SE Division who perform IRS office examinations and related investigations of individual taxpayers.  It also includes Tax Resolution Representatives in the W&I Division who resolve tax law, collection, and customer account issues.

·         Tax Examiner (Series 592) – Researches, analyzes, and initiates actions on tax issues and adjusts taxpayer accounts.  Tax Examiners also secure payments for delinquent taxes and the filing of delinquent returns.

·         Customer Service Representative (Series 962) – Provides basic procedural and technical responses to taxpayer inquiries in person or by telephone.

·         Revenue Officer (Series 1169) – Collects delinquent accounts and secures delinquent tax returns.

This review was performed at the offices of the CHCO and the SB/SE Division National Headquarters in Washington, D.C.  We also made field office visits to LMSB, SB/SE, and W&I Division offices in Houston, Texas; Philadelphia, Pennsylvania; and Atlanta, Georgia, respectively.[5]  The data we analyzed ranged from Fiscal Years (FY) 2002 through 2004.  We performed this audit during the period May 2004 through June 2005.  The audit was conducted in accordance with Government Auditing Standards.  Detailed information on our audit objective, scope, and methodology is presented in Appendix I.  Major contributors to the report are listed in Appendix II.

 

 

Results of Review

 

Training Assessment and Development Procedures Are Generally Not Followed

 

To provide the extensive technical knowledge and skills required of IRS employees to perform their jobs effectively, the IRS has core training curricula in place for specific job series.  Each curriculum consists of a number of different courses.  For example, the Revenue Agent curriculum in the SB/SE Division consists of 12 different courses.  Within each job series, there can be a number of different categories of jobs.  Each job category requires the same basic qualifications and technical training as other jobs in the same series, but also requires some degree of specialization.

 

Table 1:  Core Curricula for Selected Job Series

 

                                            Number of Core Curricula
Position Title                    Series    LMSB Division   SB/SE Division   W&I Division
Revenue Agent                     512       5               1                N/A
Tax Technician                    526       N/A             2                2
Tax Examiner                      592       N/A             29               7
Customer Service Representative   962       N/A             4                30
Revenue Officer                   1169      N/A             1                N/A
TOTAL                                       5               37               39

Source:  LMSB, SB/SE, and W&I Learning and Education staff.

 

For example, some of the Revenue Agents in the LMSB Division specialize as International Examiners.[6]  In addition to fulfilling the Revenue Agent core curriculum requirements, they are required to take two additional training courses in international issues.  Likewise, some Revenue Agents are Computer Audit Specialists and must take computer audit training courses.  Table 1 shows the number of core curricula associated with the five job series in the three operating divisions we reviewed.

Because of the complexity of ensuring its curricula and associated courses are comprehensive and up-to-date, the IRS procedures require the IRS divisions and functions to follow a systematic approach to guide the process of training assessment, analysis, design, development, implementation, and evaluation.  For guidance on conducting the training assessment, the Internal Revenue Manual recommends course developers and managers refer to the TDQAS, which is intended to enable the operating divisions to comply with the requirement to document curricula and course content and provides a discipline to course development.  The assessment phase of the TDQAS lists the following:

1.      The actions associated with the assessment phase of [the] TDQAS are:

a. Identify the performance problem.

b. Determine the focus of training development efforts.

2.      A performance problem is the discrepancy between actual job performance and required performance.  Performance problems may arise as a result of inadequate performance by existing employees, changed job responsibilities, new technology, or newly selected employees who lack the necessary job skills.

3.      The second action in the assessment phase is to determine the focus of training development efforts.  Course development activities are prioritized to ensure that the most pressing training needs are given immediate attention.

Notwithstanding these guidelines, the IRS could not provide adequate documentation of actual assessments of its training to identify differences between actual job performance and required performance or the steps it takes to address these differences with training.

To determine whether we could trace the steps the IRS takes to ensure its training addresses any specific performance problems identified, we selected a judgmental sample of 43 training courses that had been developed and/or updated since FY 2002 from the LMSB, SB/SE, and W&I Divisions.  We reviewed the documentation associated with each course to determine the basis for the development of or the update to the curricula and whether the assessment phase requirements of the TDQAS were followed.  The IRS Learning and Education (L&E) Policy Handbook[7] requires all education components in the IRS to maintain documentation used to develop curricula and learning content.  This includes documentation on updates and other changes caused by tax law changes or information received from students or instructors.

Table 2:  Analysis of Developed and Updated Courses

Operating   Number of Courses   Number of Courses With No
Division    Reviewed            Information to Justify the Change
LMSB        14                  13
SB/SE       15                   7
W&I         14                   4
TOTAL       43                  24

Source:  Treasury Inspector General for Tax Administration analysis of course changes and updates.

For 24 (56 percent) of the 43 training courses sampled, there was no evidence that an assessment was performed to support the changes or updates made to the courses.  For the other 19 (44 percent), there was some evidence, such as employee and/or instructor feedback, demonstrating the basis for the changes or updates.  Table 2 shows a breakdown of the results.

Although each of the operating divisions stated they performed evaluations, they did not have documentation to confirm this in the instances we noted.  For example, of the training courses in our sample, 14 were from the LMSB Division.  The LMSB Division was able to provide evidence of a TDQAS evaluation for only one of these courses.  However, LMSB Division officials stated they followed the TDQAS process for all courses.  They also stated they assess which courses need to be updated in light of new initiatives, procedural changes, and tax law changes and this information is obtained through feedback from employees, Subject Matter Experts,[8] and program managers.

The SB/SE Division stated it uses a method called performance consulting,[9] which uses procedures similar to those contained in the TDQAS.  However, SB/SE Division officials indicated that, when time constraints become a factor, they abbreviate the process and may not document the steps taken to assess whether to update or modify a training course.  The W&I Division staff informed us they have their own Learning Product Development Policy which, like the TDQAS, requires maintaining documents to show why courses are being modified or updated.  However, 4 (29 percent) of the 14 courses we reviewed did not contain this type of information.

By not following the recommended procedures and not maintaining sufficient documentation to support the actions taken and the training products produced in the TDQAS assessment phase, the IRS operating divisions increase the chances that gaps between needed and actual skills or performance will go undetected and that training resources will not be used effectively.  Furthermore, the lack of documentation of the actions taken prevented us from verifying whether the IRS is taking steps to ensure employees are getting the right training to perform their jobs effectively.

 

Recommendation

 

Recommendation 1:  The CHCO should require all business units to follow the requirements to assess and develop training and to properly document this process.  Assessments should address the concerns raised by various stakeholders and the IRS’ own internal evaluation tools and ensure the employees have the knowledge and skills needed to successfully perform their jobs.  This process should be reviewed to ensure the training is appropriately evaluated, the reasons for modifying courses are valid, and post reviews assess the effectiveness of the changes.

 

Management’s Response:  The CHCO, through the Director, Leadership and Education Division, and the L&E Policy Sub-council,[10] will issue policy requiring training needs assessments for mission-critical occupations or occupational specialties to identify knowledge and skill gaps.  Revised policy statements will be issued to strengthen the requirement to conduct evaluation levels 1, 2, 3, and 4.  This will ensure training is appropriately evaluated and will create the necessary documentation to substantiate the validity of course modifications and the effectiveness of the changes.

 

The System for Training Evaluation and Measurement Is Used Only to a Limited Extent

 

The GAO stresses the importance of Federal Government agencies evaluating their own training and development programs to demonstrate how their training efforts help develop employees and improve the agencies’ performance.[11]  The GAO states:

Because the evaluation of training and development programs can aid decision makers in managing scarce resources, agencies need to develop evaluation processes that systematically track the cost and delivery of training and development efforts and assess the benefits of these efforts.  To the extent possible, Federal Government agencies need to ensure data consistency across the organization.  Variations in methods used to collect data can greatly affect the analysis of uniform, quality data on the cost and delivery of training and development programs.

As previously stated, with the passage of the Federal Workforce Flexibility Act of 2004, Congress now requires agencies to determine if their training is contributing to the effective completion of their mission.

The professional education community follows the four-level Kirkpatrick Model[12] with respect to training evaluation.  The IRS has adapted this model through implementing the ITEMS system which provides guidance, functionalities, and tools to gather and analyze training data for the purpose of improving training at the IRS.  The ITEMS system consists of four interrelated training evaluation processes which the IRS has defined as:

·         Level 1 – Reaction:  Trainees and instructors complete an end-of-course evaluation to provide, among other things, their perceptions of training quality and their satisfaction with the training.

·         Level 2 – Achievement:  Trainees are assessed at the end of training, usually through tests, to determine whether learning has occurred (i.e., changed attitudes, increased knowledge, and/or increased skills) as a result of attending a training program.

·         Level 3 – Effectiveness:  Trainees and their managers are surveyed after training to determine whether it has resulted in on-the-job proficiency in employees’ performance.

·         Level 4 – Organizational Results:  Post-training data are analyzed to determine the time to achieve intended capability and the impact of training on organizational performance.

The IRS L&E Policy Sub-Council completed the development of an ITEMS system policy in May 2003, which provides the following guidance as to when to perform each level of evaluation:

Evaluation Level   Required for the following types of courses
Level 1            All courses longer than 1 hour.
Level 2            All E-Learning[13] courses, all new courses, and any existing courses that previously had an associated Level 2 evaluation.
Level 3            Every course which requires the Level 2 evaluation.
Level 4            All mission-critical training programs.[14]
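
To illustrate how these rules combine, the following is a minimal sketch that expresses the policy above as a simple lookup.  It is illustrative only: the Course record and its fields (hours, is_e_learning, and so on) are hypothetical names, not fields of the ITEMS system or any IRS database.

    # Hypothetical sketch of the May 2003 ITEMS evaluation-level policy.
    # All field names are illustrative; they do not come from the ITEMS system.
    from dataclasses import dataclass

    @dataclass
    class Course:
        title: str
        hours: float                 # length of the course
        is_e_learning: bool          # delivered as E-Learning
        is_new: bool                 # newly developed course
        had_level2_before: bool      # existing course with a prior Level 2 evaluation
        is_mission_critical: bool    # part of a mission-critical training program

    def required_evaluation_levels(course: Course) -> set:
        """Return the set of evaluation levels the policy would require."""
        levels = set()
        if course.hours > 1:         # Level 1: all courses longer than 1 hour
            levels.add(1)
        if course.is_e_learning or course.is_new or course.had_level2_before:
            levels.update({2, 3})    # Level 3 applies to every course requiring Level 2
        if course.is_mission_critical:
            levels.add(4)            # Level 4: mission-critical training programs
        return levels                # Levels 2 and 3 must be completed before Level 4 is conducted

    # Example: a new multi-day course within a mission-critical program
    print(required_evaluation_levels(Course("Revenue Agent Unit 1", 24, False, True, False, True)))
    # -> {1, 2, 3, 4}

Under this reading of the policy, most classroom courses require at least a Level 1 evaluation, and any new or E-Learning course also requires Levels 2 and 3.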

 

Table 3:  Percentage of Level 1 Evaluations
Recorded in the ITEMS System

Fiscal Year   LMSB   SB/SE   W&I
2004          91%    93%     82%
2003          94%    63%     71%

Source:  ITEMS system Evaluation Proficiency Index Report.

In FY 2004, about 90 percent (38,471 of 42,933) of the Level 1 evaluations for classes that required Level 1 evaluations had been completed and recorded in the ITEMS system.  The LMSB, SB/SE, and W&I Divisions decided to place more emphasis on the Level 1 evaluations in FY 2004 and achieved significant increases in the number of evaluations conducted.  Table 3 shows the percentage of Level 1 evaluations recorded in the ITEMS system.

Twenty-six (59 percent) of the 44 courses we selected for review had Level 1 evaluations recorded in the ITEMS system.  None of the three operating divisions we reviewed had any Level 2 evaluations in the ITEMS system for FY 2004.  All three indicated they did perform some testing; however, it was not the type required by the ITEMS system.  The W&I Division often did not formally administer the tests.  Instead, employees completed the tests and the instructors reviewed the tests and answers with the students during class, but the tests were not graded and scores were not recorded.  LMSB and SB/SE Division officials stated they conduct Level 2 evaluations on new recruits.  In addition, LMSB Division officials stated they have never taken the time to ensure the Level 2 information was being entered into the ITEMS system.  SB/SE Division officials stated that the ITEMS system is limited to providing only an average score; therefore, they prefer to use a separate database that provides more detailed training information on their new recruits.

The information gathered from Level 1 and 2 assessments should be used to make changes to training courses and programs based on employee and instructor comments and observations.  While all three business units stated they used this type of data when updating or revising courses, our review of 43 courses changed in the past year does not support this contention.  Despite the significant time and effort spent to gather and record Level 1 evaluations in the ITEMS system, for only 3 of the 43 courses we reviewed was there evidence that Level 1 or 2 assessments recorded in the ITEMS system were used when updating the courses.  Consequently, for the courses in our sample, we were not able to verify whether the IRS is using the data it records in the system.

The operating divisions performed Level 3 evaluations on only 86 (4 percent) of the 2,366 courses that required these evaluations in FY 2004.  We asked officials from the operating divisions why so few Level 3 evaluations were actually performed.  SB/SE Division officials stated they do not fully utilize Level 3 because administering these evaluations is labor intensive.  W&I Division officials also believe the evaluations are too labor intensive.  Because most of the W&I Division’s training relates to Campus[15] employees, who do not have email or Internet access, the Division must administer paper-based assessments.  This increases the workload because of the time needed to administer, collect, and input results into the ITEMS system database.  LMSB Division officials stated that, instead of performing Level 3 evaluations, they interview new hires, training managers, and on-the-job instructors to gather feedback about training issues.  The feedback is summarized in a report that is submitted to the training managers to make improvements to the individual training classes.

Both SB/SE and W&I Division officials have stated they plan to perform more Level 3 evaluations.  However, neither indicated how many they planned to perform.  Both stated additional resources would be needed to develop, administer, collect, and input surveys into the ITEMS system.

None of the three divisions has developed or administered any Level 4 evaluations.  Before a Level 4 evaluation can be conducted, the prerequisite Level 2 and Level 3 evaluations must be completed.  Level 4 evaluations can be very resource intensive to develop and costly to administer.  An Office of Personnel Management official we interviewed was not aware of any Federal agencies that conduct Level 4 evaluations.  Few organizations, including private sector companies, have developed Level 4 evaluations.  According to a 2003 survey of private companies by the American Society for Training & Development, fewer than 10 percent of the companies surveyed conducted Level 4 evaluations.

The Level 1 assessment can provide measurement of participants’ reactions or attitudes toward specific components of the program or course.  Level 2 and the other higher level assessments are intended to determine what program or course participants actually learned from the training event as well as its effect on organizational performance.  As such, a comprehensive process to evaluate and improve training should make use of the Level 1 and higher level assessments.

Table 4:  Estimated Cost of the ITEMS System Incurred From FY 1998 Through FY 2004

Fiscal Year   Cost
1998          $172,000
1999          $549,000
2000          $480,000
2001          $612,000
2002          $605,000
2003          $803,000
2004          $800,000
TOTAL         $4,021,000

Source:  IRS Leadership and Education function.

Because the IRS uses the ITEMS system on only a limited basis, we believe it should evaluate the cost-benefit of continuing to support the system.  The IRS, which contracts for use of the system, has paid the contractor just over $4 million.  Table 4 shows the estimated costs incurred from FYs 1998 through 2004.  Future costs of the ITEMS system are expected to total about $2.3 million over the next 4 years (about $575,000 annually).  Because the contract is fixed price, the IRS must pay the full contract amount even though it is not using all of the functions and services provided; it cannot pay only for the functions and services it uses.

If the operating divisions do not use the ITEMS system consistently as required, the funds spent on the system could be put to better use.  In that case, the IRS would need to develop a defined, systematic way to evaluate training consistently among the operating divisions and functional units.  Otherwise, a better strategy and method are needed to achieve the intended purpose of the ITEMS system, or it should be discontinued and replaced with a more cost-effective system.  It is critical for the IRS to have a repeatable and effective method to evaluate training.

Recommendation

 

Recommendation 2:  The CHCO, in coordination with the Deputy Commissioners, should ensure all IRS components follow established procedures to evaluate training in order for the IRS to comply with training assessment requirements of the Federal Workforce Flexibility Act of 2004.  The use of the ITEMS system should be required or an alternate system should be developed that will allow the IRS to evaluate training effectiveness.  The results of the training evaluations should be incorporated into the TDQAS process.

Management’s Response:  The CHCO, through the Director, Leadership and Education Division, and the L&E Policy Sub-council, will devise a plan for regular review of the training process to ensure compliance with the Federal Workforce Flexibility Act.  The L&E Analysis Project (LEAP)[16] has been charged to review all training work processes, including those that support training evaluation.  Decisions about which business processes are approved and the eventual organization structure to support those processes will be made in December 2005.  The CHCO will issue guidance implementing the required evaluation process.

 

Employee Satisfaction Survey Results Could Be Better Used to Assist in Training Assessment and Development

 

The IRS conducts an all-employee survey each year to obtain feedback on employees’ overall job satisfaction and workplace issues, including training issues.[17]  To assess employees’ opinions about training, the survey asks employees to respond to the statement, “I receive the training I need to perform my job effectively,” using a scale that ranges from strongly agree (5 points) to strongly disagree (1 point).  Employees may also provide narrative comments.  All three divisions have a process in place to review narrative comments:  the LMSB Division has its Performance, Quality, and Audit Assistance section; the W&I Division has its Division Partnering Council;[18] and the SB/SE Division has established a review team of L&E function staff and field staff from each unit.

Many of the comments from employees in the SB/SE and LMSB Divisions related to the discontinuation of the seminar/classroom style Continuing Professional Education (CPE)[19] program.  Employees believed the CPEs should not be discontinued because the face-to-face interactions with instructors and colleagues were an important part of the CPE sessions.  In addition, employees requested that a wider range of technical courses be taught at the CPEs.  The LMSB Division received over 600 comments on training in FY 2003.  The SB/SE Division analyzed over 1,400 comments related to training.

As a result of these comments, the SB/SE and LMSB Divisions appointed redesign teams for their CPE programs.  However, seminar/classroom style CPEs will still be conducted selectively, when appropriate.  The SB/SE and LMSB Divisions have started to develop web-based training for their employees.[20]  Both divisions plan to allow employees to choose the sessions they want to attend after conferring with their managers and obtaining approval of their CPE plans.

W&I Division employees had concerns with the quality of CPE instructors as well as the quality of the training material presented, which they believed was outdated and/or did not relate to their day-to-day duties.  However, W&I Division officials did not provide us the number of comments related to these concerns.  One unit within the W&I Division is studying approaches to address these problems.  One result of the study is the selection of a cadre of full-time instructors[21] who will be provided additional training.  The instructors will report to Supervisory Training Coordinators, who are responsible for the quality of training and for evaluating the instructors.  In addition, the W&I Division plans to develop a Training and Quality Resource Site on the Internet to provide teaching tips and technical information about IRS processes, systems, and tax matters, which will allow the instructors and coordinators to share ideas and issues with others.

While the analysis of employee comments from the surveys has resulted in certain changes in training, there did not appear to be an analysis of the rankings by divisions and groups to better evaluate problems or training gaps that are occurring in specific groups or functions.  The following table shows the number of workgroups in the job series we reviewed in the LMSB, SB/SE, and W&I Divisions in four specific ranges of satisfaction with training for the FY 2003 survey.

Table 5:  FY 2003 Employee Satisfaction Rating of Training for the Five Job Series Reviewed

(By Operating Division)[22]

Operating   Groups with Scores   Groups with Scores   Groups with Scores   Groups with Scores   Total Number
Division    of 1.9 or Less       from 2.0 to 2.9      from 3.0 to 3.9      of 4 or More         of Groups
LMSB        16 (2.40%)           172 (25.83%)         348 (52.25%)         130 (19.52%)         666
SB/SE       61 (2.14%)           766 (26.85%)         1,627 (57.03%)       399 (13.99%)         2,853
W&I         13 (0.83%)           429 (27.48%)         921 (59.00%)         198 (12.68%)         1,561

Source:  IRS Employee Satisfaction Survey results.

About one-fourth of the workgroups we reviewed had group scores below 3.0, indicating that many employees do not believe they are getting the training they need.  Conversely, only 14 percent of the workgroups had a training question score of 4.0 or higher, indicating the employees in these groups were very satisfied with training.  A detailed analysis of scores within specific job series would provide more information about the training perceptions of specific groups of employees (see Appendix IV).  An analysis of the groups with lower scores might lead to the identification of problems that need to be addressed, while groups with high scores might indicate best practices that could be emulated by other groups.  This type of analysis could be used to identify whether the workgroups in a particular job series or function share common concerns or whether there are significant differences among workgroups, and it could significantly assist the IRS in its training assessment and development process.  Without considering this type of information, the IRS is missing an opportunity to make the best use of survey results to help further achieve the organization’s mission and goals.
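
To illustrate the type of analysis suggested here, the following is a minimal sketch that bins workgroup scores on the training question into the ranges used in Table 5 and flags low- and high-scoring groups for follow-up.  The data values are hypothetical; only the 5-point response scale and the score ranges come from the survey and Table 5.

    # Hypothetical workgroup records: (division, job series, group mean score
    # on the training question, 1.0-5.0 scale).  Values are illustrative only.
    from collections import Counter

    workgroups = [
        ("LMSB", "512", 3.4), ("SB/SE", "592", 2.6), ("SB/SE", "1169", 4.1),
        ("W&I", "962", 1.8), ("W&I", "592", 3.1), ("SB/SE", "526", 2.9),
    ]

    def score_bin(score):
        """Map a group mean score to the ranges used in Table 5."""
        if score <= 1.9:
            return "1.9 or less"
        if score <= 2.9:
            return "2.0 to 2.9"
        if score <= 3.9:
            return "3.0 to 3.9"
        return "4.0 or more"

    # Distribution of workgroups by division and score range, as in Table 5.
    distribution = Counter((division, score_bin(score)) for division, _, score in workgroups)
    for (division, score_range), count in sorted(distribution.items()):
        print(f"{division:6} {score_range:12} {count}")

    # Flag groups for follow-up: low scorers may signal training gaps to address;
    # high scorers may hold best practices worth emulating in other groups.
    low_scoring  = [g for g in workgroups if g[2] <= 2.9]
    high_scoring = [g for g in workgroups if g[2] >= 4.0]
    print("Review for possible training gaps:", low_scoring)
    print("Review for possible best practices:", high_scoring)

Running the same binning within each job series, rather than only by division, would support the comparison across job series shown in Appendix IV.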

 

Recommendation

 

Recommendation 3:  The CHCO should require the IRS operating divisions to use the numerical scores from the Employee Satisfaction Survey training question in addition to narrative comments.  The numerical training scores should be analyzed within each job series to identify any relevant trends.  This information should be used as part of the TDQAS process.

Management’s Response:  The CHCO, through the Director, Leadership and Education Division, and the L&E Policy Sub-council, will issue a policy statement in conjunction with its corrective action to Recommendation 1 of this report.  The policy will direct the use of available data sources, both quantitative and qualitative, when the data sources are judged to be reliable and representative.  This will include analyses of numerical scores from the Employee Satisfaction Survey training question in addition to narrative comments.

 

Appendix I

 

Detailed Objective, Scope, and Methodology

 

The overall objective of this review was to evaluate whether training for Internal Revenue Service (IRS) employees in job series that deal directly with taxpayers and their representatives is periodically assessed to ensure its effectiveness.  We limited our review to five job series that most often interact with taxpayers and their representatives (Revenue Agent [GS-512]; Tax Technician [GS-526]; Tax Examiner [GS-592]; Customer Service Representative [GS-962]; Revenue Officer [GS-1169]) and to the three business units that had the greatest number of these employees (Large and Mid-Size Business [LMSB], Small Business/Self-Employed [SB/SE], and Wage and Investment [W&I] Divisions).  To accomplish this objective, we:

I.                   Evaluated how the Learning and Education (L&E) function staff assigned to the Chief Human Capital Officer (CHCO) worked with the embedded L&E function staffs within the LMSB, SB/SE, and W&I Divisions to assess job performance gaps and possible training‑based solutions for the development and/or update of core curriculum.

A.    Determined what training guidance the CHCO provided to the divisions.

B.     Interviewed selected external stakeholders to gain their perspective on the status of training in the IRS and any suggestions for improvement.

C.     Interviewed the LMSB, SB/SE, and W&I Divisions embedded L&E function staff, division program managers, and project managers and determined the processes used to assess job performance gaps and possible training-based solutions for the development and update process of their core curriculum.

D.    Verified the W&I Division’s Core Curriculum for the three Job Series GS - 526, 592, and 962 and the positions that have the most contact with taxpayers.

E.     Verified the SB/SE Division’s Core Curriculum for the five Job Series GS - 512, 526, 592, 962, and 1169 and the positions that have the most contact with taxpayers.

F.      Verified the LMSB Division’s Core Curriculum for the Job Series GS - 512 and the positions that have the most contact with taxpayers.

G.    Selected a judgmental sample of LMSB, SB/SE, and W&I Divisions Core Curriculum courses to ensure that the five job series that most often interact with taxpayers and their representatives are evenly represented in our sample.  Also, we reviewed the documentation associated with the courses to determine whether the divisions were adequately assessing job performance gaps and possible training-based solutions when modifying courses.  Our judgmental sample of 43 courses was selected from a population of 796 training courses.  The judgmental sample was based on the following criteria:

·         Selecting a minimum of one core curriculum class per job position within the job series that directly interact with taxpayers.

·         Core courses that were developed or changed in the last 24 months.

·         Core courses that had several days of training.

II.                Evaluated how the CHCO’s L&E function staff and the divisions’ L&E function staff used the Employee Satisfaction Survey results to assess both training needs and effectiveness.

A.    Interviewed the CHCO’s L&E function staff and determined if they used the Employee Satisfaction Survey results and workgroup information regarding training to assess both training needs and effectiveness.

B.     Interviewed Survey Program Leaders in the LMSB, SB/SE, and W&I Divisions and determined how training issues are assessed and resolved based on the Employee Satisfaction Survey.

C.     Determined whether the embedded L&E function staff in the LMSB, SB/SE, and W&I Divisions received training issues developed by the Employee Satisfaction tracker workgroups.

III.             Evaluated how the CHCO’s L&E function staff and the divisions’ L&E function staff used the Integrated Training Evaluation and Measurement Services (ITEMS) system data to assess both training needs and effectiveness.

A.    During the opening conference with the CHCO, we determined what training guidance is provided to the divisions as a result of the ITEMS system.

B.     Interviewed members of the Human Capital Office Leadership & Education function staff and determined:

1.      Whether the ITEMS system reports are routinely distributed and if so, to whom and for what purposes.

2.      Who is responsible for testing the validity of data contained in the ITEMS system database.

3.      What training services the staff provides to the various divisional L&E function staffs, program managers, or project managers.

C.     Determined whether the required ITEMS system evaluations (Levels 1, 2, and 3) were performed by the divisions and recorded in the ITEMS system database.

D.    Evaluated the usefulness of the Level 1, 2, and 3 Evaluation Reports.  Interviewed staff in the LMSB, SB/SE, and W&I Divisions to determine how useful the data are.

E.     Summarized by assessment level the number of courses that require Level 1, 2, or 3 assessments.  Determined the reasons why the courses were subjected to different assessment levels.

F.  Determined if and how the IRS staff uses the ITEMS system reports to improve training.

 

Appendix II

 

Major Contributors to This Report

 

Daniel R. Devlin, Assistant Inspector General for Audit (Headquarters Operations and Exempt Organizations Programs)

Michael E. McKenney, Director

Kevin P. Riley, Audit Manager

Kenneth E. Henderson, Lead Auditor

Michael S. Laird, Senior Auditor

David P. Robben, Senior Auditor

Michael Della Ripa, Auditor

Stephen E. Holmes, Auditor

 

Appendix III

 

Report Distribution List

 

Commissioner  C

Office of the Commissioner – Attn:  Chief of Staff  C

Deputy Commissioner for Services and Enforcement  SE

Chief Human Capital Officer  OS:HC

Commissioner, Large and Mid-Size Business Division  SE:LM

Commissioner, Small Business/Self-Employed Division  SE:S

Commissioner, Wage and Investment Division  SE:W

Chief Counsel  CC

National Taxpayer Advocate  TA

Director, Office of Legislative Affairs  CL:LA

Director, Office of Program Evaluation and Risk Analysis  RAS:O

Office of Management Controls  OS:CFO:AR:M

Audit Liaisons:

            Commissioner, Large and Mid-Size Business Division  SE:LM

Commissioner, Small Business/Self-Employed Division  SE:S

Commissioner, Wage and Investment Division  SE:W

Chief Human Capital Officer  OS:CFO

 

Appendix IV

 

Fiscal Year 2003 Employee Satisfaction Rating of Training for the Job Series Reviewed Within Each Operating Division[23]

 

Job Series   Groups With Scores   Groups With Scores   Groups With Scores   Groups With Scores   Total Number
             Less Than 1.9        From 2.0 To 2.9      From 3.0 To 3.9      Of 4.0 or More       Of Groups

LARGE AND MID-SIZE BUSINESS DIVISION (666 groups)

512          16 (2.40%)           172 (25.83%)         348 (52.25%)         130 (19.52%)         666

SMALL BUSINESS AND SELF-EMPLOYED DIVISION (2,853 groups)

512          8 (1.25%)            155 (24.14%)         369 (57.48%)         110 (17.13%)         642
526          10 (3.18%)           95 (30.25%)          180 (57.32%)         29 (9.24%)           314
592          32 (3.47%)           251 (27.25%)         528 (57.33%)         110 (11.94%)         921
962          2 (0.58%)            84 (24.56%)          217 (63.45%)         39 (11.40%)          342
1169         9 (1.42%)            181 (28.55%)         333 (52.52%)         111 (17.51%)         634

WAGE & INVESTMENT DIVISION (1,561 groups)

526          0 (0.00%)            90 (30.82%)          176 (60.27%)         26 (8.90%)           292
592          3 (0.48%)            112 (18.01%)         386 (62.06%)         121 (19.45%)         622
962          10 (1.55%)           227 (35.09%)         359 (55.49%)         51 (7.88%)           647

Source:  Internal Revenue Service Employee Satisfaction Survey Results.

 

Appendix V

 

Management’s Response to the Draft Report

 

The response was removed due to its size.  To see the response, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.



[1] Public Law 108-411 [S. 129] (2004).

[2] Taxpayer Assistance Center Employees Improved the Accuracy of Answers to Tax Law Questions but Answered Some Questions Beyond Their Level of Training (Reference Number 2003-40-157, dated July 2003).

[3] Public Law 108-411 [S. 129] (2004).

[4] Information on Employee Training Is Not Adequate to Determine Training Cost or Effectiveness (Reference Number 2003-10-212, dated September 2003).

[5] The TE/GE Division training information was not covered within the scope of this audit.  The TE/GE Division represents approximately 3 percent of the number of employees that are in a job series that deal directly with the taxpaying public.

[6] International Examiners conduct the international aspects of a field examination of returns, which may contain unusually difficult and complex legal, financial, or valuation issues of major proportions.

[7] Internal Revenue Manual 6.410.1.3.1 (9).

[8] Subject Matter Experts are assigned to develop one or several training courses within a training program.

[9] Performance consulting is a problem identification process that looks at the difference between “what is” and “what should be” for a particular situation.  This process is documented in the SB/SE Division’s L&E function Core Operating Processes.

[10] The L&E Policy Sub-Council establishes IRS-wide policies, strategies, and initiatives to promote the delivery of effective learning and performance support to all IRS managers and employees.

[11] Human Capital:  A Guide for Assessing Strategic Training and Development Efforts in the Federal Government (GAO-04-546G, dated March 2004).

[12] The Kirkpatrick evaluation model is a system for assessing a specific training activity’s value to a specific business.

[13] E-learning is anything delivered, enabled, or mediated by electronic technology for the explicit purpose of learning. 

[14] Mission-critical training programs are those that relate to the IRS programs to process returns, examine returns, and collect money owed to the IRS.

[15] Campuses are the data processing arm of the IRS.  The campuses process paper and electronic submissions, correct errors, and forward data to the Computing Centers for analysis and posting to taxpayer accounts.  Many campus employees do not have computers assigned to them.

[16] The LEAP is charged with reorganizing the servicewide L&E organization to improve processes and procedures, with the goal of a more efficient organization and an overall reduction in required resources.  The project is to be completed by December 2005.

[17] The survey is contracted by the Chief Financial Officer and is administered by the Performance, Budgeting, Planning, and Performance Office.

[18] The Division Partnering Council’s purpose is to serve as a decision-making body that advances the mission of the IRS and the vision of the division.

[19] CPE is an educational program that is provided to employees to keep them informed on the latest changes or updates regarding subject matter related to their job position. 

[20] Web-based training refers to conducting training through the use of computers instead of a classroom environment.

[21] These positions are intended not to exceed 3 years.

[22] Percentages do not always total to 100 due to rounding.

[23] Percentages do not always total to 100 due to rounding.