TREASURY INSPECTOR GENERAL FOR TAX ADMINISTRATION

 

 

 

An Improved Project Management Process Is Needed to Measure the Impact of Research Efforts on Tax Administration

 

 

 

July 21, 2009

 

Reference Number:  2009-10-095

 

 

This report has cleared the Treasury Inspector General for Tax Administration disclosure review process and information determined to be restricted from public release has been redacted from this document.

 

Phone Number   |  202-622-6500

Email Address   |  inquiries@tigta.treas.gov

Web Site           |  http://www.tigta.gov

 

July 21, 2009

 

 

MEMORANDUM FOR DIRECTOR, OFFICE OF RESEARCH, ANALYSIS, AND STATISTICS

 

FROM:                            Michael R. Phillips /s/ Michael R. Phillips

                                         Deputy Inspector General for Audit

 

SUBJECT:                    Final Audit Report – An Improved Project Management Process Is Needed to Measure the Impact of Research Efforts on Tax Administration (Audit # 200810001)

 

This report presents the results of our review of the Internal Revenue Service’s (IRS) process to monitor the progress of projects, validate results, and measure the impact of its research efforts.  The overall objective of this review was to determine whether the structure and management of IRS research efforts can be improved when providing information to IRS executive management.  This review was part of the Treasury Inspector General for Tax Administration Fiscal Year 2008 Annual Audit Plan coverage under the major management challenge of Using Performance and Financial Information for Program and Budget Decisions.

Impact on the Taxpayer

IRS management relies on research programs to deliver information that is integral to improving performance on strategic goals and objectives.  However, IRS research management has not developed and implemented effective business measures and project management processes to provide pertinent data about whether IRS research efforts achieved established program objectives.  This is especially significant because the IRS spent more than $93.2 million on research in Fiscal Year 2008, but cannot effectively assess the impact its research efforts had on tax administration.

Synopsis

The IRS does not have a systemic process to measure research outcomes against intended results.  Specifically, the IRS should develop additional business measures to effectively assess the impact of its research projects on tax administration.  Currently, the IRS primarily assesses the impact of its research projects by measuring customer satisfaction.  However, customer satisfaction is not the most relevant and effective business measure since it does not directly show whether IRS research projects contributed to the research programs’ mission of providing information that supports data-driven decisions by IRS management.  As a result, IRS management cannot fully assess the impact of the research projects, and the associated $93.2 million spent, on tax administration.

In addition, IRS management should improve project management practices to ensure project files contain adequate information related to the planning, conducting, and reporting of research efforts.  The IRS research files in our judgmental sample did not always contain adequate information to determine the scope of the planned work, to explain why research projects were terminated, or to assess the efficiency and effectiveness of the research projects.  In addition, key information should be captured on an information system to track and monitor research efforts.  This will enable IRS management to more effectively monitor whether research is completed within established time/budget standards and assess its impact on improving taxpayer service, enhancing enforcement of the tax laws, and modernizing the IRS.

Effective business measures and improved project management practices will provide IRS management with pertinent data to enable them to determine whether their research efforts achieved established objectives, ensure IRS resources are focused on the issues that are most relevant to its programs’ missions, and assure stakeholders that the annual investment in research efforts provides value to tax administration.

Recommendations

We recommended the Director, Office of Research, Analysis, and Statistics (RAS):  1) coordinate with the Directors in each IRS research program to develop and implement business measures to better assess whether IRS research efforts achieve program objectives and to show their impact on improving tax administration; 2) establish research standards and practices that define what activities constitute a research project and specify the minimum required documentation that should be prepared and maintained for all research projects; and 3) develop guidance to ensure basic project information[1] is captured, tracked, and monitored to allow for the consistent and comparable reporting of IRS research efforts.

Response

IRS management agreed in principle with our recommendations, but offered alternative corrective actions for all three recommendations.  With respect to our recommendation regarding business measures, IRS management responded that the Servicewide Research Council will discuss and consider developing appropriate business measures.  As part of this consideration, the Servicewide Research Council will explore whether standardized measures are desirable.  Some IRS research units are in the process of developing measures that address workload capacity, productivity, and delivery cost objectives.  As for our recommendations pertaining to project management practices (defining activities constituting a research project, specifying project documentation requirements, developing guidance to capture, track, and monitor research projects), IRS management stated that the Servicewide Research Council will look for opportunities to develop consistent definitions of research projects and establish an appropriate set of documentation standards and practices.  Additionally, some of the IRS’ research program tracking systems are being redesigned to capture project documentation, time, and resource expenditures and to monitor research projects.  Management’s complete response to the draft report is included as Appendix VII.

Office of Audit Comment

The IRS’ planned corrective actions do not adequately address our recommendations.  While we recognize the value in having the Servicewide Research Council discuss the establishment of business measures, we are concerned that IRS management has not made a commitment to establishing business measures.  The IRS 2009-2013 Strategic Plan shows an expanded use of research efforts to strengthen the IRS’ decision-making processes by placing special emphasis on high-performance investments to improve taxpayer services and enforcement activities.  Strategies to accomplish this objective include developing research-driven methods and tools to detect and combat noncompliance and improve resource allocation, and maintaining an ongoing research program to determine evolving taxpayer and partner service needs, preferences, and behaviors.  However, the current business measures, and the measures under development, do not effectively show the impact of IRS research projects and activities on improving tax administration.  If the IRS had the ability to measure the impact of various research projects on key mission priorities, the IRS would be better able to ensure its resources were used on the highest priority projects.

In addition, we are concerned that the responsibility for implementing these corrective actions has been given to the Servicewide Research Council when this Council serves as a forum for exchanging ideas and coordinating crosscutting activities, not as a standards-making body.  The Internal Revenue Manual states the Office of RAS is responsible for providing functional leadership, guidance, and support to the IRS Research Community on research standards and practices to ensure consistency and comparability of performance between the various research programs.  In this environment of increased government accountability and transparency, the development of effective business measures is critical for IRS management to assess the impact of their research projects (and the associated Fiscal Year 2008 $93.2 million research investment) on tax administration.

Copies of this report are also being sent to the IRS managers affected by the report recommendations.  If you have questions, please contact me at (202) 622-6510 or Nancy A. Nakamura, Assistant Inspector General for Audit (Management Services and Exempt Organizations), at (202) 622-8500.

  

Table of Contents

 

Background

Results of Review

The Internal Revenue Service Needs Additional Business Measures to Assess the Impact of Its Research on Tax Administration

Recommendation 1:

Project Management Practices for Research Efforts Can Be Improved

Recommendation 2:

Recommendation 3:

Appendices

Appendix I – Detailed Objective, Scope, and Methodology

Appendix II – Major Contributors to This Report

Appendix III – Report Distribution List

Appendix IV – Outcome Measure

Appendix V – Description of the Office of Research, Analysis, and Statistics Sub-Offices

Appendix VI – Glossary of Terms

Appendix VII – Management’s Response to the Draft Report

  

Abbreviations

 

IRS     |  Internal Revenue Service

RAS   |  Research, Analysis, and Statistics

 

Background

 

IRS management relies on research programs to deliver information integral to its decision-making processes to improve performance on strategic goals and objectives.

Collection and analysis of tax administration data are essential to the planning, management, and evaluation of Internal Revenue Service (IRS) programs.  IRS management relies on research programs to deliver information that is integral to its decision-making processes to improve performance on strategic goals and objectives (e.g., are the IRS business divisions helping taxpayers understand and comply with applicable tax laws?).  The IRS Research Community consists of a decentralized management structure that includes the Office of Research, Analysis, and Statistics (RAS) and research programs in each of the four IRS business divisions.[2]  Each business division maintains control of its research program, while the Office of RAS is managed centrally in Washington, D.C.

The decentralized IRS research structure resulted from the IRS’ October 2000 reorganization into four business divisions.  As a result of the reorganization, a separate research program was embedded in each division in an attempt to create research operations focused on specific taxpayer groups.  The reorganization was also intended to address concerns, previously raised by the Treasury Inspector General for Tax Administration and the Government Accountability Office, over whether research was addressing the areas with the most impact on tax administration.

The Office of RAS comprises a Director’s office and five sub-offices.[3]  The Office of RAS is responsible for providing functional leadership and guidance to the IRS Research Community by supporting the research standards and practices to ensure consistency, comparability, and quality throughout the community.  In addition, the Office of RAS serves a wide variety of both internal and external customers, including providing the IRS Commissioner with a source for short-term and long-term research and analytical work, the American public with data on the Federal tax system,[4] and IRS employees with a suite of corporate research tools and services.

Research programs in the IRS business divisions are responsible for performing reviews and providing IRS management with information to develop and improve their programs and processes, solve problems, and make data-driven decisions.  One challenge under the reorganized IRS research structure is that the Office of RAS does not have line authority over the research programs in the IRS business divisions.  Each business division controls the budget and resources needed for its respective program.  Figure 1 shows the resources assigned to research efforts[5] in each research program. 

Figure 1: IRS Employees Assigned to Research Efforts As of December 2007

Operating Division/Function  |  Employees
Office of RAS, Office of Research  |  70
Office of RAS, Office of Program Evaluation and Risk Analysis  |  44
Large and Mid-Size Business Division  |  76
Small Business/Self-Employed Division  |  107
Tax Exempt and Government Entities Division  |  3
Wage and Investment Division  |  62
Total  |  362

Source: Office of RAS December 2007 Business Performance Report and IRS business divisions’ web sites.

Because the IRS Research Community consists of a complex set of networks, the Servicewide Research Council was created as a forum for sharing information, coordinating cross-functional actions, and resolving procedural issues that affect research and analysis across the IRS.  Decisions regarding the use of common assets across the IRS Research Community are reached through the Servicewide Research Council.  Permanent Council members include the Director, Office of RAS (as chair of the Council), Directors of Research from the five Office of RAS sub-offices, and Directors from each of the four IRS business divisions.

This review was performed at the IRS Office of RAS in Washington, D.C., and the IRS research programs at the Large and Mid-Size Business Division in Oakland, California; the Small Business/Self-Employed Division in New Carrollton, Maryland; the Tax Exempt and Government Entities Division in Washington, D.C.; and the Wage and Investment Division in Atlanta, Georgia, during the period of August 2008 through January 2009.  We conducted this performance audit in accordance with generally accepted government auditing standards.  Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objective.  We believe the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objective.  Detailed information on our audit objective, scope, and methodology is presented in Appendix I.  Major contributors to the report are listed in Appendix II.

  

Results of Review

 

In Fiscal Year 2008, the IRS spent approximately $93.2 million on research efforts.  However, we determined there is no standardized process to monitor the progress of projects, validate results, and measure the impact of research efforts.  As a result, IRS management cannot fully assess the impact that research has in improving taxpayer service, enhancing the enforcement of tax laws, and assisting in the modernization of the IRS.  This information is important when assigning and prioritizing Research Community resources.  While we recognize the value research efforts can have for IRS management, we determined the IRS needs additional business measures to effectively assess the impact its research has in improving IRS programs and processes.  In addition, IRS management should improve its project management practices related to planning, conducting, and reporting of research efforts to more effectively monitor and assess related work.

Effective business measures and improved project management practices will provide IRS management with pertinent data to enable them to determine whether their research efforts achieved established objectives, ensure IRS resources are focused on the issues that are most relevant to its programs’ missions, and assure stakeholders that the annual investment in research efforts provides value to tax administration. 

The Internal Revenue Service Needs Additional Business Measures to Assess the Impact of Its Research on Tax Administration

The IRS does not have a systemic process to measure research outcomes against intended results.  Currently, IRS research management primarily assesses the impact of their research projects by measuring customer satisfaction (i.e., percentage of customers who feel the product or service met their needs or the number of new and repeat customers).  However, customer satisfaction is not the most relevant and effective business measure since it does not directly show IRS management whether research projects are contributing to the research programs’ mission of providing information that supports data-driven decisions by IRS management.  Without effective business measures, IRS management cannot effectively assess the impact of their research projects (and the associated Fiscal Year 2008 $93.2 million research investment) on tax administration.

The Government Performance and Results Act of 1993[6] requires Federal agencies to become results-oriented.  This shifted the focus of Federal agencies from accountability for process (i.e., does the program spend the correct amount of money in a proper manner?) to accountability for results (i.e., what does the program actually accomplish with the money spent?).  In addition, the President’s Management Agenda relating to improving budget and performance integration requires Federal Government agencies to identify high-quality outcome measures and accurately monitor program performance.  Finally, business measures that assess the impact of research on tax administration would support the Department of the Treasury’s objective of effective IRS investments in high return-on-investment activities that generate improved compliance and fairness in the application of tax laws.

In prior reports,[7] we recommended IRS management develop a process to better measure actual research outcomes to enable management to monitor the effect that research programs have on improving tax administration.  In addition, in 2001 the Government Accountability Office identified the need for additional business measures to determine the effectiveness of the IRS’ research results.[8]  Given the difficulty across the Federal Government in measuring the long-term effects of research programs, especially in quantifiable terms, the IRS has not been able to develop business measures to assess the impact of its research.  However, in 1999 the Committee on Science, Engineering, and Public Policy addressed the challenges Federal agencies were having in developing meaningful ways to measure the effectiveness of research programs.  The Committee concluded that despite the difficulties Federal research programs were having in measuring performance, research “can be evaluated meaningfully on a regular basis” in accordance with the Government Performance and Results Act of 1993.  It recommended that research programs like those of the IRS measure the relevance of their research to the agency’s mission.  The Committee further stated that the “Government Performance and Results Act of 1993 is not an executive branch initiative but rather a Congressional mandate.”  As such, the IRS needs to develop effective business measures that directly link program results to its mission of improving tax administration.

Without effective business measures, the ability of IRS management and Congress to make informed business decisions related to IRS research programs is hampered.  Business measures that assess the progress of research toward the programs’ missions will allow the IRS to determine whether its research efforts achieved program objectives, accurately quantified problems, evaluated alternatives, and allocated resources so they are focused on subjects most relevant to its program.  This will also help stakeholders to assess the value IRS research programs provide in exchange for their $93.2 million Fiscal Year 2008 investment in research.

Recommendation

Recommendation 1:  The Director, Office of RAS, should coordinate with the Directors in each IRS research program to develop and implement effective business measures to better assess whether IRS research efforts achieve program objectives and show their impact on improving tax administration.

Management’s Response:  IRS management agreed in principle with this recommendation.  The Servicewide Research Council, chaired by the Director, Office of RAS, will discuss this issue and consider developing appropriate business measures.  As part of this consideration, the Servicewide Research Council will explore whether standardized measures are desirable.  Some of the research units are already in the process of developing a series of measures that address workload capacity, productivity, and delivery cost objectives.  As these are developed, the Servicewide Research Council will be the forum for discussing whether these approaches should be more widely adopted.

Office of Audit Comment:  While we recognize the value in having the Servicewide Research Council discuss the establishment of business measures, we are concerned that IRS management has not made a commitment to establishing business measures.  The development of appropriate business measures is a Congressional requirement mandated by the Government Performance and Results Act of 1993 and has been determined to be possible within the Federal research programs.  While the business measures currently being considered by some of the research units will be good measures for monitoring overall project/program performance, they will not provide IRS management with information to measure the progress of IRS research programs towards improving tax administration.  The latter type of measure will be important in enabling the IRS to achieve its strategic goals.  Specifically, the IRS 2009-2013 Strategic Plan shows an expanded use of research efforts to strengthen the IRS’ decision-making processes by placing special emphasis on high-performance investments to improve taxpayer services and enforcement activities.  Strategies to accomplish this objective include, “developing improved research-driven methods and tools to detect and combat noncompliance and improve resource allocation and maintaining an ongoing research program to determine evolving taxpayer and partner service needs, preferences, and behaviors.”  If the IRS had the ability to measure the impact of various research projects on key mission priorities, the IRS would be better able to ensure its resources were used on the highest priority projects.  Furthermore, in this environment of increased government accountability and transparency, we believe it is critical for IRS management to be able to report the benefits and impact of its research resources ($93.2 million spent in Fiscal Year 2008) beyond just internal IRS customer satisfaction. 

 

Project Management Practices for Research Efforts Can Be Improved

IRS management should improve project management practices to ensure project files contain adequate information related to the planning, conducting, and reporting of research efforts.  In addition, key information should be captured on an information system to track and monitor research efforts.  This will enable IRS management to more effectively monitor whether research is completed within established time/budget standards and assess its impact on improving taxpayer service, enhancing enforcement of the tax laws, and modernizing the IRS.   

Project documentation must be enhanced to ensure key information is consistently maintained to evaluate the results of IRS research

The IRS Research Community has not established consistent procedures to ensure research project files contain the necessary documentation to support key decisions made during the planning and conducting of research projects, as well as the reporting of research results.  Our review of 30 judgmentally selected IRS research projects showed that many of the documents needed to show the final results of the research and the actions management took to deliver the research project were not maintained in the research project files.  As a result, we could not always determine the scope of work planned, determine why research projects were terminated, or assess the efficiency and effectiveness of the sampled research projects.

We obtained lists showing the research projects conducted by each program during Fiscal Year 2008 and judgmentally sampled 30 projects.  After we reviewed case information for the 30 selected projects, we determined that the various IRS research programs differ in how they define what constitutes a “research project.”  Specifically, 16 (53 percent) of the 30 IRS research projects were “research activities” and not “research projects.”  All 16 activities were either closed in a short time period (e.g., 3 projects collectively required 4 days to complete) or required few resources to complete.  Examples of these Fiscal Year 2008 activities include providing assistance to other IRS programs and using the projects as an administrative tool to capture time.  Because these projects required little time or resources to complete, they did not require formal planning, monitoring, or tracking.  In addition, the application of business measures to these types of activities is not practical.

We determined the remaining 14 sampled cases were more extensive “research projects” and generally contained more detailed case information.  Project file documentation generally identified the customers and the requestors of the research and showed the technical details of the research project.  However, documentation was lacking to indicate why a research project was closed, whether a report was issued, and the results of any post-research assessment.  In addition, none of the 14 research project files showed project management details such as the use of milestone dates or the final results and impact of the research on tax administration.  Because IRS research project files do not contain complete documentation, IRS research program management cannot effectively analyze the reasons for terminating projects, identify problems in project management, or determine whether program resources are addressing the most relevant issues.

Among the well-established best practices in the research community is that research should be reproducible by both internal and external stakeholders so they can independently confirm the validity of research results.  Office of Management and Budget guidelines, for example, indicate that the capability of replicating research can enhance the credibility of results, increase opportunities for improvement, and avoid the perception of bias.  Documentation should be maintained in the project files to support the validation of research results, as well as the events that transpire over the course of completing a research project; this documentation is critical to ensure management has the information needed to make business decisions.

As a result of the IRS’ October 2000 reorganization, the decentralized IRS research programs were allowed to develop their own procedures and best practices.  In addition, the Office of RAS did not provide functional leadership and guidance to the IRS research programs to clarify what constitutes a research project, identify the project file requirements to show what documentation must be maintained, and require research programs to develop a process for evaluating results.  Established guidelines will enhance the credibility of project file information by providing greater assurance that information needed to manage and assess IRS research programs is consistently captured, tracked, monitored, and reported.   

Database information is not useful to track and monitor IRS research efforts

We determined the IRS research project databases do not contain the information needed by research program management to effectively ensure projects remain on track, on time, and within budget.  Similar to the IRS research project files, the research project databases did not capture useful and timely information.  Specifically, some IRS research programs do not maintain a database of research projects, while others maintain a database but do not use the information on the system to monitor the progress of their research.  For example, our review of the project databases that are maintained identified that basic project information is not always captured and that the requirements to capture, track, and monitor basic project information vary among the research programs.  Information not always captured in the project databases included project numbers; objectives of the research; customers and requestors of the research; and start, milestone, and completion dates.  As a result, IRS research management did not have complete information to fully and accurately assess the impact of its research on tax administration and determine whether its resources are focused on the issues most relevant to its program mission.

In addition, because the definition of what constitutes a research project varies among the research programs, the information captured in the research programs’ databases makes it difficult to ensure consistency and comparability when assessing productivity.  For example, research database information for the Tax Exempt and Government Entities Division shows its research program completed or worked on 218 research projects in Fiscal Year 2008 costing $1.1 million.  In comparison, the Wage and Investment Division research program’s database for the same period shows it completed or worked on 163 research projects costing $10.9 million.  We recognize various factors can cause differences in productivity between research programs.  These factors can include differences in the complexity of one program’s research projects or a greater share of long-term projects that can require several years to complete.  However, inconsistency in how each research program defines what constitutes a “research project” makes it difficult to accurately assess and compare performance across the IRS.
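
To illustrate why such comparisons can mislead, the following sketch (Python is used here purely for illustration) computes the average reported cost per “project” from the database figures cited above.  The roughly thirteen-fold difference reflects how differently the two programs define and count research projects, not necessarily a difference in productivity.

```python
# Illustrative arithmetic only; the project counts and costs are the
# Fiscal Year 2008 database figures cited in this report.
programs = {
    "Tax Exempt and Government Entities": {"projects": 218, "cost": 1_100_000},
    "Wage and Investment": {"projects": 163, "cost": 10_900_000},
}

for name, data in programs.items():
    average_cost = data["cost"] / data["projects"]
    print(f"{name}: ${average_cost:,.0f} average reported cost per project")

# Approximate output:
#   Tax Exempt and Government Entities: $5,046 average reported cost per project
#   Wage and Investment: $66,871 average reported cost per project
```

Without a common definition of a research project, a figure such as average cost per project cannot be used to compare the efficiency of the two programs.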

As outlined in the Standards for Internal Control in the Federal Government,[9] managers need access to reliable and timely operational data to meet their responsibility of ensuring the effective use of resources.  Tracking such information directly relates to the effective stewardship of resources by helping to answer questions such as, “How long are the different steps in the research process taking?” and “Are the time periods that have been set for the process being met?”  These techniques are designed to develop plans and control systems to ensure research is completed on time and within budget so needed information is available for executive decision-making purposes.  This information could also assist the Servicewide Research Council when sharing information across the IRS Research Community and identifying potential best practices in research activities.

The variations in the information contained in the IRS research project databases occurred because each IRS research program was allowed to develop its own database standards and best practices, with no central oversight to ensure that basic research information is properly captured and monitored.  While we recognize that some variation between the databases is inevitable because each research program serves different customers and has unique informational needs, basic research information that is common to all research programs must be captured in the IRS research project databases.  An improved project management process will provide standardized guidance that specifically defines what activities should be classified as a research project, what information is required to be maintained in the project file to support key decisions and the results of research, and what basic research information must be captured in research databases to enable management to measure progress in achieving strategic goals and objectives.
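
As a purely hypothetical illustration of the standardized guidance described above, the sketch below shows how the basic project information identified in this report (project number, objectives, customers and requestors of the research, and start, milestone, and completion dates) might be captured in a single record and used to flag missed milestones and to distinguish research projects from research activities.  The field names, the 30-staff-day threshold, and the example values are assumptions for illustration only; they are not IRS standards or an existing IRS system.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

# Hypothetical record structure; the field names and the 30-staff-day
# threshold are illustrative assumptions, not IRS standards.
@dataclass
class ResearchEffort:
    project_number: str
    objectives: str
    customer: str
    requestor: str
    start_date: date
    milestone_dates: List[date] = field(default_factory=list)
    completion_date: Optional[date] = None
    staff_days_charged: float = 0.0

    def is_research_project(self, minimum_staff_days: float = 30.0) -> bool:
        """Distinguish a research project from a short-duration research
        activity using an assumed minimum level of effort."""
        return self.staff_days_charged >= minimum_staff_days

    def overdue_milestones(self, as_of: date) -> List[date]:
        """Return milestone dates that have passed without the effort
        being reported as complete."""
        if self.completion_date is not None:
            return []
        return [m for m in self.milestone_dates if m < as_of]

# Example usage with hypothetical values:
effort = ResearchEffort(
    project_number="FY08-001",
    objectives="Estimate filing compliance for a taxpayer segment",
    customer="Wage and Investment Division",
    requestor="Director, Research",
    start_date=date(2007, 10, 1),
    milestone_dates=[date(2008, 3, 31), date(2008, 9, 30)],
    staff_days_charged=120.0,
)
print(effort.is_research_project())                 # -> True
print(effort.overdue_milestones(date(2008, 6, 1)))  # -> [datetime.date(2008, 3, 31)]
```

Capturing a record of this kind consistently across all research programs would give research program management and the Servicewide Research Council comparable data on timeliness, resource use, and project status.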

Recommendations

The Director, Office of RAS, should:

Recommendation 2:  Establish research standards and practices that define what activities constitute a research project and specify the required documentation that should be prepared and maintained for all research projects.  The standards and practices should include, at a minimum, who requested the research, objectives of the research, estimated resources and time periods necessary to complete the project, the final report documenting the results of the research and any potential actions taken by management as a result of the project, and the results of any post-research assessment.

Management’s Response:  IRS management agreed in principle with this recommendation.  The Servicewide Research Council, chaired by the Director, Office of RAS, will look for opportunities to develop consistent definitions of research projects and activities and to establish an appropriate set of documentation.  Results of ongoing work will be shared with the entire Servicewide Research Council.  

Office of Audit Comment:  We are concerned that the responsibility for implementing this recommendation has been given to the Servicewide Research Council when this Council serves as a forum for exchanging ideas and coordinating crosscutting activities, not as a standards-making body.  The Internal Revenue Manual states the Office of RAS is responsible for providing “functional leadership, guidance, and support to the IRS Research Community on research standards and practices to ensure consistency and comparability” of performance between research programs.  The standards, practices, and guidance referred to in this recommendation will provide IRS management with research information that is common to all projects, including non-research projects, for comparable and consistent reporting and evaluation of performance within the IRS Research Community.  This consistent information will be critical for the IRS to ensure program resources are addressing the most important tax administration issues.   

Recommendation 3:  Develop guidance to ensure that basic project information is captured, tracked, and monitored and allows for consistent and comparable reporting of IRS research efforts.  This guidance should also include a requirement for management to clearly distinguish research projects from research activities on their databases. 

Management’s Response:  IRS management agreed in principle with this recommendation.  Tracking systems currently exist in some research units.  Additionally, some of these tracking systems are being redesigned and will distinguish, where appropriate, between research “projects” and “activities,” capture project documentation and time and resource expenditures, and monitor research projects.  Results of ongoing work will be shared with the entire Servicewide Research Council. 

Office of Audit Comment:  We are concerned that the responsibility for implementing this recommendation has been given to the Servicewide Research Council when this Council serves as a forum for exchanging ideas and coordinating crosscutting activities, not as a standards-making body.  The Internal Revenue Manual states the Office of RAS is responsible for providing “functional leadership, guidance, and support to the IRS Research Community on research standards and practices to ensure consistency and comparability” of performance between research programs.  The standards, practices, and guidance referred to in this recommendation will provide IRS management with research information that is common to all projects, including non-research projects, for comparable and consistent reporting and evaluation of performance within the IRS Research Community.  This consistent information will be critical for the IRS to ensure program resources are addressing the most important tax administration issues. 

 

Appendix I

 

Detailed Objective, Scope, and Methodology

 The overall objective of this review was to determine whether the structure and management of IRS research efforts[10] can be improved when providing information to IRS executive management.  To accomplish our objective, we:

I. Determined how IRS research activities are structured and coordinated in the Office of RAS and how the research programs are structured in the IRS business divisions.

A. Identified the decentralized structure of the IRS Research Community in the Office of RAS and the embedded research programs in the business divisions.

B. Determined whether the structure of the Servicewide Research Council allows it to serve as a forum for sharing information, coordinating actions, and resolving procedural issues that impact the execution of crosscutting research across the IRS.

C. Evaluated how resources were committed to perform research in the Office of RAS and the research programs in the business divisions, including Full-Time Equivalents, employees involved in cross-functional research efforts, and funding.

D. Interviewed executives in the Office of RAS, the research programs in the business divisions, the IRS Oversight Board, and the Servicewide Research Council.

II. Determined whether the IRS Research Community is effectively managed when performing research.

A. Reviewed a judgmental sample of 30 projects from the 907 research projects[11] in the IRS Research Community’s Fiscal Year 2008 databases of completed or in-process research projects in the Office of RAS and the business divisions, and obtained the project file documents that support decisions made during each project.

B. Evaluated the management of research projects by the Office of RAS and the research programs in the business divisions.

C. Determined whether the Servicewide Research Council provides an effective forum to coordinate crosscutting actions and resolve procedural issues that affect research across the business divisions.

Internal Controls Methodology

Internal controls relate to management’s plans, methods, and procedures used to meet its mission, goals, and objectives.  Internal controls include the processes and procedures for planning, organizing, directing, and controlling program operations.  They include the systems for measuring, reporting, and monitoring program performance.  We determined the following internal controls were relevant to our audit objective:  the Government Performance and Results Act of 1993,[12] the Standards for Internal Control in the Federal Government,[13] the President’s Management Agenda, and Parts 1.1.18 and 1.7.4 of the Internal Revenue Manual.  These controls were reviewed through interviews with the Directors in the Office of RAS and research programs in the IRS business divisions and reviews of research projects, budget documents, mission statements, and charter documents.

 

Appendix II

 

Major Contributors to This Report

 

Nancy A. Nakamura, Assistant Inspector General for Audit (Management Services and Exempt Organizations)

Jeffrey M. Jones, Director

Diana M. Tengesdal, Acting Director

Joseph F. Cooney, Audit Manager

John W. Baxter, Lead Auditor

Lauren W. Bourg, Auditor

Angela Garner, Auditor

 

Appendix III

 

Report Distribution List

 

Commissioner  C

Office of the Commissioner – Attn:  Chief of Staff  C

Deputy Commissioner for Services and Enforcement  SE

Commissioner, Large and Mid-Size Business Division  SE:LM

Commissioner, Small Business/Self-Employed Division  SE:S

Commissioner, Tax Exempt and Government Entities Division  SE:T

Commissioner, Wage and Investment Division  SE:W

Chief Counsel  CC

National Taxpayer Advocate  TA

Director, Office of Legislative Affairs  CL:LA

Director, Office of Program Evaluation and Risk Analysis  RAS:O

Office of Internal Control  OS:CFO:CPIC:IC

Audit Liaison:  Director, Communications and Liaison, Office of Research, Analysis, and Statistics  RAS

 

Appendix IV

 

Outcome Measure

 

This appendix presents detailed information on the measurable impact that our recommended corrective actions will have on tax administration.  This benefit will be incorporated into our Semiannual Report to Congress.

Type and Value of Outcome Measure:

·        Reliability of Information – Actual; 16 research activities which were inaccurately reported as research projects (see page 7).

Methodology Used to Measure the Reported Benefit:

To determine whether the IRS reliably reported the number of research projects completed in Fiscal Year 2008, we reviewed case information for 30 selected projects and determined differences in the type of work considered as a “research project” by the various IRS research programs.  Specifically, we determined that 16 (53 percent) of the 30 IRS research projects were “research activities” and not “research projects.”  All 16 activities were either closed in a short time period (e.g., 3 projects collectively required 4 days to complete) or required few resources to complete.

 

Appendix V

 

Description of the Office of Research, Analysis, and Statistics Sub-Offices

 

1. Office of Research – Improves tax administration by providing information, analysis, and solutions from an agency-wide perspective and by advocating actions for decision makers by providing data; developing tools, techniques, methodologies, analysis, and models used to support organizational decisions; and developing strategies to achieve the IRS mission, vision, and goals.

2. National Research Program – Measures voluntary compliance including filing, payment, and reporting compliance to improve the IRS’ ability to detect noncompliance and develop appropriate cost-effective treatments for prevention and early intervention.

3. Office of Program Evaluation and Risk Analysis – Conducts short-term research projects to provide senior IRS management with accurate and timely analysis of ongoing and proposed IRS programs and investments to support quality, data-driven strategic thinking and decision making across the organization.

4. Statistics of Income Division – Collects, analyzes, and disseminates information on Federal taxation for the Department of the Treasury Office of Tax Analysis, Congressional Committees, IRS business units in their administration of the tax laws, other organizations engaged in economic and financial analysis, and the general public.

5. Office of Servicewide Policy Directives and Electronic Research – Designs and delivers core research tools and services that advance the customer service, compliance, and enforcement priorities of the IRS.

 

Appendix VI

 

Glossary of Terms

 

Basic Project Information – Includes information common to all projects, regardless of whether the project is related to research.  This includes project numbers; objectives of the research projects; customers and requestors of the research; and start, milestone, and completion dates.

Full-Time Equivalent – A measure of labor hours in which 1 Full-Time Equivalent is equal to 8 hours multiplied by the number of compensable days in a particular fiscal year.

Milestones – Milestones provide for “go/no-go” decision points in a project and are sometimes associated with funding approval to proceed.

Research Activity – A task that usually is short in duration or requires little time to complete and does not require the detailed planning and management of resources.  Research activities can be part of a research project or separate tasks not related to a research project, such as obtaining information or providing support services.

Research Efforts – The combined resources used to perform research projects and research activities, including activities specifically designed to support research.

Research Project – A process that is initiated by a person (or group) who realizes that a specific problem needs resolution; involves the development of well-defined goals and plans to determine project completion; and requires the organization, monitoring, and management of resources during project execution to deliver the project within scope, cost, and time constraints.  Defining what constitutes a research project helps in understanding the project management methodology and its effectiveness.

 

Appendix VII

 

Management’s Response to the Draft Report

 The response was removed due to its size.  To see the response, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.


[1] See Appendix VI for a glossary of terms.

[2] The research programs are in the Large and Mid-Size Business, Small Business/Self-Employed, Tax Exempt and Government Entities, and Wage and Investment Divisions.  The Taxpayer Advocate Service and the Criminal Investigation Division also have research programs, but due to their sizes were not included in the review.

[3] A detailed description of each sub-office in the Office of RAS is included in Appendix V.

[4] The Statistics of Income office, in the Office of RAS, collects, analyzes, and disseminates information on Federal taxation for the Department of the Treasury, Congressional Committees, the IRS, other organizations engaged in economic and financial analysis, and the general public.

[5] See Appendix VI for a glossary of terms.

[6] Pub. L. No. 103-62, 107 Stat. 285 (codified as amended in scattered sections of 5 U.S.C., 31 U.S.C., and 39 U.S.C.).

[7] The Internal Revenue Service Needs to Improve Control of Its Compliance Research Program (Reference Number 2000-40-068, dated May 12, 2000); Information Is Needed to Determine the Effect the Wage and Investment Division Research Program Has on Improving Customer Service and Voluntary Compliance (Reference Number  2004-40-088, dated April 14, 2004); and Important Progress Has Been Made in Using Research to Improve Programs for Large Businesses, but Challenges Remain (Reference Number 2004-30-130, dated August 12, 2004).

[8] Internal Revenue Service – Status of the Modernized Research Operations (Reference Number GAO-01-656R, dated April 2001).

[9] GAO/AIMD-00-21.3.1, dated November 1999.

[10] See Appendix VI for a glossary of terms.

[11] We selected a judgmental sample because we were not going to project our results.

[12] Pub. L. No. 103-62, 107 Stat. 285 (codified as amended in scattered sections of 5 U.S.C., 31 U.S.C., and  39 U.S.C.).

[13] GAO/AIMD-00-21.3.1, dated November 1999.