Many Aspects of the Field Examination Reengineering Pilot Were Effectively Implemented; However, Continued Monitoring Is Needed During the Nationwide Rollout

 

June 2004

 

Reference Number: 2004-30-116

 

 

This report has cleared the Treasury Inspector General for Tax Administration disclosure review process, and information determined to be restricted from public release has been redacted from this document.

 

June 30, 2004

 

 

MEMORANDUM FOR COMMISSIONER, SMALL BUSINESS/SELF-EMPLOYED DIVISION

 

FROM: Gordon C. Milbourn III /s/ Gordon C. Milbourn III

Acting Deputy Inspector General for Audit

 

SUBJECT: Final Audit Report - Many Aspects of the Field Examination Reengineering Pilot Were Effectively Implemented; However, Continued Monitoring Is Needed During the Nationwide Rollout (Audit # 200330024)

 

This report presents the results of our review of the Internal Revenue Service's (IRS) Field Examination Reengineering Pilot. The overall objective of this review was to determine whether the Field Examination Reengineering Pilot met its established goals. This review was part of our efforts to provide ongoing input during the Small Business/Self-Employed (SB/SE) Division's examination reengineering process.

In summary, the SB/SE Division initiated an in-depth review of the examination process to reengineer and/or redesign processes, products, and services. The primary objective was to align examination procedures with the objectives of the organization: customer satisfaction, employee satisfaction, and increased business results. As part of this review, the SB/SE Division conducted the Field Examination Reengineering Pilot, in which it tested new Field Examination function procedures in 15 groups located in the state of New York. The Pilot was conducted between October 2002 and March 2003. In November 2003, after evaluating Pilot results and modifying certain processes and procedures, the SB/SE Division began to roll out the reengineered process to Field Examination offices across the nation. The nationwide rollout is expected to be completed by the end of Fiscal Year 2004.

Many aspects of the Pilot were effectively implemented. The Field Examination Reengineering Project Team provided effective oversight of the Pilot implementation and ensured the new process was being followed and working as intended. The reengineered process provided enhanced uniformity in workpaper documentation. Also, Concurrence and Engagement Meetings were designed to promote greater collaboration and cooperation among group managers, revenue agents, and taxpayers.

However, results from the Pilot showed that three of the five goals were either not met or not fully measured. The Pilot cases did not result in decreased cycle time per case, and there was no quantitative analysis to determine whether employee and customer satisfaction measures were maintained or increased compared with the old Field Examination function process. The reengineered examination process achieved the goals of reduced examination time and increased examination quality measures. However, results originally reported by the Project Team for these two goals were overstated.

Two of the reengineered processes could be improved. The use of Engagement Agreements is currently optional; requiring them would help ensure taxpayers are properly informed about the examination process and about mutual expectations with the IRS. Also, controls could be strengthened to ensure Mutual Commitment Dates (MCD) are provided to the taxpayers and monitored by the group managers.

Although we do not disagree with the decision to roll out the Pilot nationwide, we do believe SB/SE Division management should carefully monitor the nationwide rollout to ensure all Examination function reengineering goals are met.

We recommended the Director, Compliance, SB/SE Division, continue to measure the impact of the new procedures on hours per return, cycle time, and examination quality.The Director should conduct formal surveys to measure employee satisfaction and customer satisfaction, in an effort to determine whether employees and customers think the new procedures have improved the examination process.

The Director should require that revenue agents prepare Engagement Agreements and provide them to the taxpayers or their representatives. In instances in which an Engagement Agreement is not necessary, the revenue agent should document in the case file the reason why one is not needed. The revenue agents should also be required to record the MCD on the Engagement Agreement. The Director should add the MCD as a field on the Examination Returns Control System (ERCS) and direct group managers to monitor the MCDs to identify potential timeliness and inventory management issues.

Management's Response: The Director, Compliance, SB/SE Division, agreed with three of our five recommendations and has taken, or plans to take, appropriate corrective actions. SB/SE Division management has developed a performance measures and monitoring methodology which will measure the impact of reengineered process changes on business results. This plan, which was put into place on May 20, 2004, includes a trend analysis of compliance area business results pre-implementation and post-implementation, a comparative case review process for pre-implementation and post-implementation closed examinations, and a compliance area site visit plan. The plan also includes site visits and employee and focus group surveys to quantitatively determine the level of employee satisfaction with the reengineered process. The IRS currently plans to administer the survey to employees 6 months after Field Examination Reengineering has been implemented in their Compliance Area. The performance measures and monitoring methodology also includes a plan to analyze data from the ongoing customer survey process to quantitatively determine the level of customer satisfaction with the reengineered process.

The Director, Compliance, SB/SE Division, did not agree at this time to require revenue agents to prepare Engagement Agreements and then provide them to the taxpayers or their representatives. During the early stages of the Field Examination Reengineering design and pilot, they considered requiring revenue agents to prepare, sign, and share Engagement Agreements with taxpayers or their representatives. However, they encountered strong negative feedback from representatives and examiners who focused on the signature process of the Agreements rather than the communication they were trying to foster. As a result, SB/SE Division management determined mandatory use of signed Engagement Agreements to be counterproductive to their redesign effort at that time.

Also, SB/SE Division management did not agree to require revenue agents to record the MCD on the Engagement Agreement, add the MCD as a field on the ERCS, and direct group managers to monitor MCDs to identify potential timeliness and inventory management issues. They believe current procedures are sufficient to ensure revenue agents establish and provide the MCDs to taxpayers or their representatives. They also believe requirements for revenue agents to update the Activity Record and notify the group manager and taxpayer of changes to the MCD of more than 30 calendar days are sufficient for group managers to monitor the MCD. Management's complete response to the draft report is included as Appendix IV.

Office of Audit Comment: We still believe revenue agents should be required to prepare Engagement Agreements and provide them to the taxpayers or their representatives. Engagement Agreements would help ensure revenue agents and taxpayers or their representatives clearly understand their responsibilities and expectations. We also continue to believe additional controls are necessary to ensure revenue agents discuss and provide MCDs to taxpayers or their representatives. Additional controls would help ensure group managers take a more proactive approach to monitoring MCDs for timeliness and would provide them with a systemic means by which to do this. While we still believe our recommendations are worthwhile, we do not intend to elevate our disagreement concerning them to the Department of the Treasury for resolution.

Copies of this report are also being sent to IRS managers affected by the report recommendations. If you have questions, please contact me at (202) 622-6510, or Philip Shropshire, Acting Assistant Inspector General for Audit (Small Business and Corporate Programs), at (215) 516-2341.

 

Table of Contents

Background

Many Aspects of the Field Examination Reengineering Pilot Were Effectively Implemented

Measured Results of the Field Examination Reengineering Pilot Process Met Expectations for Only Two of the Five Goals

Recommendations 1 through 3:

Two of the Field Examination Reengineering Processes Could Be Improved

Recommendation 4:

Recommendation 5:

Appendix I – Detailed Objective, Scope, and Methodology

Appendix II – Major Contributors to This Report

Appendix III – Report Distribution List

Appendix IV – Management's Response to the Draft Report

 

Background

The Small Business/Self-Employed (SB/SE) Division is organized into three primary functions: Taxpayer Education and Communication, Customer Account Services, and Compliance. The Compliance function directs all post-filing activities such as examination of returns. Under the Compliance function, revenue agents in the Field Examination function determine the correct income levels and corresponding tax liabilities for SB/SE Division taxpayers by conducting field examinations, primarily at the taxpayer's place of business.

In 2001, the SB/SE Division initiated an in-depth review of the examination process to reengineer and/or redesign processes, products, and services. The primary objective was to align examination procedures with the objectives of the organization: customer satisfaction, employee satisfaction, and increased business results.

The Examination Reengineering Project Team consisted of representatives from the Internal Revenue Service (IRS) and the consulting firms of Booz, Allen & Hamilton and Deloitte and Touche. The Project Team developed various recommendations to improve efficiency and effectiveness in the current examination processes. As part of this review, the SB/SE Division conducted the Field Examination Reengineering Pilot, in which it tested new Field Examination function procedures.

The new procedures and tools used during the Pilot included the following:

•        The Examination Workpapers Index (Form 4318) serves as the table of contents for the case file.

•        Activity Forecasts are used to estimate the time necessary to complete the examination.

The Field Examination Reengineering Pilot was conducted between October 2002 and March 2003 in 15 groups located in the state of New York. Revenue agents started and closed approximately 122 examinations during the Pilot. The project goals were to reduce hours per return, reduce cycle time, maintain or improve examination quality, maintain or increase employee satisfaction, and maintain or increase customer satisfaction.

In November 2003, after making revisions to some of the processes and procedures based on results from the Pilot, the SB/SE Division began to roll out the reengineered process to Field Examination offices across the nation. It is implementing the rollout on a staggered basis and expects to complete deployment by the end of Fiscal Year (FY) 2004.

This review was performed at the IRS National Headquarters in New Carrollton, Maryland, and in the New York Area Office during the period August 2003 through January 2004. The audit was conducted in accordance with Government Auditing Standards. Detailed information on our audit objective, scope, and methodology is presented in Appendix I. Major contributors to the report are listed in Appendix II.

Many Aspects of the Field Examination Reengineering Pilot Were Effectively Implemented

Many aspects of the Field Examination Reengineering Pilot were effectively implemented. The Pilot had effective management oversight; provided enhanced uniformity in workpaper documentation; and promoted greater collaboration and cooperation among group managers, revenue agents, and taxpayers.

Management oversight was effective

The Field Examination Reengineering Project Team provided effective oversight of the Pilot implementation. To ensure the reengineered process was being followed and working as intended, the Project Team conducted four site visits to each group during the Pilot. The agenda for the site visits included meeting with group managers and revenue agents to discuss process issues, clarify responsibilities, and obtain feedback. The Project Team also collected summary information on cases in process and conducted reviews on a sample of cases to identify and correct any problems. At the end of each visit, the Project Team provided feedback to the group to identify what was going well and what needed improvement. The Project Team then evaluated the results of the visits and made adjustments to the process as necessary.

Adjustments made to the reengineered process included revising the Risk Analysis Worksheet to eliminate its susceptibility to manipulation, providing flexibility in the timing of the Concurrence Meeting, and revising Lead Sheets to eliminate duplicate steps.

In addition to the site visits, the Project Team held weekly teleconferences with Territory and group managers. The Project Team provided results, gave positive reassurances, obtained feedback, addressed concerns, and advised of procedural changes.

Also, upon completion of the Pilot, the Project Team requested the Office of Performance Excellence to conduct focus group interviews with participants in the Field Examination Reengineering Pilot. The Office conducted 16 focus group sessions involving 77 revenue agents and 14 group managers. In general, it gathered feedback relating to the Pilot and provided the results to the Project Team to assist in decision making regarding the national rollout of the reengineered process.

The reengineered process provided enhanced uniformity in workpaper documentation

The Field Examination Reengineering Project Director indicated that, prior to development of the reengineered process, numerous packages were available to revenue agents to record examination activities. As a result, there were inconsistent workpaper documentation policies among the various offices.

Updating all of these packages would have required considerable time and effort. Also, when a case was transferred between offices, the receiving office had to interpret another office's method of file organization. To be more efficient and consistent, the Project Team selected one package to be used by all revenue agents. This created a common framework for documenting workpaper files.

As part of the common framework, the Project Team developed a series of workpapers that serve as a guide for the reengineered examination process. It redesigned Form 4318, created Lead Sheets for issues to be examined, developed a Risk Analysis Worksheet for modifying the examination scope, established an Initial Appointment Meeting Agenda, and directed revenue agents to work with taxpayers in determining an Estimated Completion Date.

We randomly sampled 42 closed Pilot cases and determined revenue agents and group managers generally used the Pilot procedures and tools. We noted that they did not always use the Risk Analysis Worksheet or Engagement Agreement. However, the Risk Analysis Worksheet had to be revised by the IRS to eliminate a manipulation weakness. Also, the Engagement Agreement was an optional tool under the Pilot.

Concurrence and Engagement Meetings were designed to promote collaboration and cooperation

Group managers are directed to conduct Concurrence Meetings with revenue agents to ensure proper planning has occurred for addressing issues selected for verification on a tax return. These Meetings should promote greater collaboration between group managers and revenue agents.

In turn, revenue agents are directed to conduct Engagement Meetings with taxpayers to explain the examination process, identify the issues to be examined, and establish a Mutual Commitment Date (MCD) for completing the examination. These Meetings should encourage cooperation between the revenue agent and taxpayer and/or Power of Attorney. Our review of the sample of 42 closed Pilot cases determined that Concurrence and Engagement Meetings were generally being held.

While many aspects of the Pilot were effectively implemented, measured results show the Pilot met only two of five goals. We also identified two processes that could be improved. Although we do not disagree with the decision to roll out the Pilot nationwide, we do believe SB/SE Division management should carefully monitor the nationwide rollout to ensure all Examination function reengineering goals are met.

Measured Results of the Field Examination Reengineering Pilot Process Met Expectations for Only Two of the Five Goals

The project goals were to reduce hours per return, reduce cycle time, maintain or improve examination quality, maintain or increase employee satisfaction, and maintain or increase customer satisfaction. However, we determined the Pilot produced measurable results that met only two of these goals.

The average number of hours spent on Pilot cases decreased, but cycle time increased

To measure business results, the Project Team compared the Examination Quality Measurement System (EQMS) results for Pilot and selected non-Pilot cases. The Project Team reported that the reengineered examination process generated a 7 percent savings in direct examination time and a 4 percent savings in cycle time. However, our analysis of EQMS data indicated the reengineered examination process generated only a 5 percent savings in direct examination time and incurred a 13 percent increase in cycle time.

We evaluated the data the Project Team used to determine the results of the Pilot and identified two causes for these differences. The main factor is that the Project Team included 9 cases that were not identified as Pilot cases on the EQMS database, and excluded 11 cases that were identified as Pilot cases on the EQMS database.

A secondary factor is that there were errors in the Project Team's calculation of results. Errors were caused by double counting the results of one case, including inaccurate data for another case, and using the wrong data field to calculate total cycle time.

In measuring the effectiveness of the reengineered examination process, it should also be noted that more complex examinations take revenue agents more time to complete. Although the Pilot lasted 6 months, there may not have been sufficient time to close such cases before the Pilot ended. As a result, the direct examination time and cycle time may not have been fully measured. We believe direct examination time and cycle time may increase further as these complex cases are closed. Revenue agents started and closed approximately 122 examinations during the Pilot; as of April 1, 2003, there were approximately 1,092 Pilot cases still in process.

Quality improved, but the Project Team did not properly measure the increase

To measure examination quality, the Project Team compared EQMS results for cases worked under the Pilot with those for cases worked by the Pilot groups the prior year. The Project Team reported that the reengineered examination process generated a 9 percent improvement in overall examination quality measures. However, our analysis of EQMS data indicated the reengineered examination process generated only a 6 percent improvement.

As noted above, we identified differences in the cases used by the Project Team to measure current and prior year results of the Pilot groups. As a result, quality measures were overstated.

Results did not identify whether employee satisfaction was maintained or increased

The Project Team did not conduct a formal survey process with revenue agents to measure employee satisfaction. However, it did use a variety of communication tools to capture employee recommendations for improvements and potential enhancements to the Pilot. It conducted weekly teleconference calls, made four site visits, and solicited comments through the Field Examination Reengineering web site. Group managers and revenue agents used the opportunities to provide constructive suggestions for improving the reengineered process.

In addition, the Project Team requested the Office of Performance Excellence to conduct post-Pilot focus group sessions. The purpose of the focus groups was to gather feedback from the Pilot participants that would help the Project Team evaluate the Pilot and make decisions regarding nationwide rollout of the reengineered process.

The Office of Performance Excellence stated, however, that it did not measure employee satisfaction as part of the focus group sessions. It might be possible to make inferences regarding employee satisfaction based on some of the responses; however, there was no direct comparison between the Pilot process and the prior Field Examination function process.

These efforts were successful in identifying employee attitudes toward the reengineered processes and what can be done to improve them. The Project Team believes that, based on all of these efforts, employees are satisfied with the new Field Examination function process. However, the Project Team did not conduct a quantitative analysis of employee satisfaction (such as a survey) that would serve as a better measurement system for determining whether employees are more satisfied than they were under the old Field Examination function process.

Customer satisfaction results could not be appropriately measured due to an insufficient number of responses to the customer surveys mailed

The Project Team collected customer feedback on the reengineered process through two communications methods: customer surveys and presentations by the Taxpayer Education and Communication office.

Customer satisfaction surveys were mailed to taxpayers, or their representatives, for each of the cases closed under the Pilot. The Project Team used the national survey on customer satisfaction to allow for comparison to prior results for taxpayers who were not part of the Pilot. However, only 14 customers returned the surveys, and the Project Team appropriately concluded this was an insufficient sample size for a valid analysis. No other actions were taken to obtain responses from the remaining customers of the Pilot cases.

Taxpayer Education and Communication office representatives conducted presentations throughout the area in which the Pilot was conducted. They met with numerous practitioners, Certified Public Accountants, and Enrolled Agents. This effort was successful in identifying practitioner attitudes toward the reengineered process.

However, customer satisfaction needs to be continually monitored throughout the nationwide rollout to determine if the goal of maintaining or improving customer satisfaction has truly been met.

Recommendations

The Director, Compliance, SB/SE Division, should:

1.      Continue to measure the impact of the new procedures on hours per return, cycle time, and examination quality as the procedures are implemented on a nationwide basis and ensure the measures are accurate.

Management's Response: SB/SE Division management has developed a performance measures and monitoring methodology which will measure the impact of reengineered process changes on business results. This plan, which was put into place on May 20, 2004, includes a trend analysis of compliance area business results pre-implementation and post-implementation, a comparative case review process for pre-implementation and post-implementation closed examinations, and a compliance area site visit plan.

2.      Conduct formal surveys to measure employee satisfaction and determine whether the measure was improved as a result of the reengineered process.

Management's Response: SB/SE Division management has developed a performance measures and monitoring methodology that includes site visits and employee and focus group surveys to quantitatively determine the level of employee satisfaction with the reengineered process. The employee satisfaction survey is being developed as part of the implementation monitoring responsibilities of the Field Examination Reengineering Advisory Committee. The IRS currently plans to administer the survey to employees 6 months after Field Examination Reengineering has been implemented in their Compliance Area.

3.      Continue to conduct surveys of taxpayers who undergo the reengineered examination process to measure customer satisfaction.

Management's Response: SB/SE Division management has developed a performance measures and monitoring methodology that includes a plan to analyze data from the ongoing customer survey process to quantitatively determine the level of customer satisfaction with the reengineered process.

Two of the Field Examination Reengineering Processes Could Be Improved

Many of the reengineered procedures and tools, as noted earlier, were generally used by the revenue agents and group managers and provided enhanced uniformity. However, controls could be strengthened by making Engagement Agreements mandatory and by ensuring MCDs are provided to taxpayers and monitored by group managers.

Engagement Agreements are not mandatory

An Engagement Agreement is a written summary of revenue agent and taxpayer expectations regarding an examination that were discussed during an Engagement Meeting. It identifies the issues to be examined, steps to be taken, expected time period for completion, and responsibilities of all parties in working toward completing the examination timely and thoroughly.

The major objectives of an Engagement Meeting are to encourage cooperation between the revenue agent and taxpayer, promote customer satisfaction, and help the taxpayer understand what will transpire during the examination.An Engagement Agreement is the culmination of this Meeting.

When implementing the procedures nationwide, the Project Team decided to make the Engagement Agreement optional rather than mandatory, due to resistance from the employees, the taxpayers, and the National Treasury Employees Union. The concern was that managers might be critical of employees if certain expectations of the Engagement Agreement were not met. However, the Engagement Agreement is not a contract. It is an unsigned document subject to change if there are changes in scope or unforeseen circumstances.

Our review of the sample of 42 closed cases from the Pilot determined that 34 could have had a formal Engagement Agreement prepared. However, only 3 of the 34 cases had a formal Engagement Agreement prepared and documented in the case file. Without an Engagement Agreement, the effectiveness of the Engagement Meeting could be reduced. Since no documentation of the expectations would exist, revenue agents and taxpayers may not clearly understand the expectations and may not be diligent in ensuring they each meet the expectations discussed. As a result, the objectives and goals of the Engagement Meeting may not be achieved.

Controls could be strengthened to ensure MCDs are provided to taxpayers and monitored by group managers

In a report on customer satisfaction, Examination function customers ranked the time spent on an examination and the length of the examination process as the third and fourth highest improvement priorities, respectively. During a February 20, 2002, discussion on the status of the Field Examination Reengineering Project, the Project Director stated that it was reasonable for a world-class tax organization to advise taxpayers how long an examination will last.

Under the reengineered process, revenue agents are directed to work with taxpayers in determining the amount of time it will take to complete an examination. In our review of 42 closed Pilot cases, we determined that revenue agents established MCDs and documented the dates in the case histories. However, there is no mechanism to ensure agents provide the MCDs to the taxpayers. As a result, we could not determine whether the agents informed the taxpayers of the MCDs in a majority of the cases.

During the Pilot, the Engagement Agreement template was designed to include a paragraph relating to the MCD. However, as previously discussed, of the 34 closed Pilot cases reviewed that could have had an Engagement Agreement prepared, only 3 did because these Agreements are optional.

In addition, managers are responsible for ensuring revenue agents are working cases timely and for managing their groups' inventories. To aid in performing these duties, managers could monitor each case's MCD. However, there is no requirement for managers to monitor this date; nor is there a mechanism in place to allow for such monitoring, since the date is not identified in any of the systems used by managers to control cases. Group managers identified this issue as a concern during the focus group sessions. They indicated there was no process to track MCDs unless they developed one for themselves.

All 42 cases we reviewed had an MCD established. Only 4 of the cases were not completed by the MCD, with late completion in these cases ranging from 4 to 34 days. While our limited review of Pilot cases showed the MCDs were generally met, there is no assurance that these dates will be met as the procedures are rolled out nationwide.

Unless employees are held accountable for establishing accurate MCDs and providing them to taxpayers, the benefits of establishing these dates may not be realized. As a result, customer satisfaction could decline, and the managers' ability to effectively manage inventories could be limited.

Recommendations

The Director, Compliance, SB/SE Division, should:

4.      Require revenue agents to prepare Engagement Agreements and then provide them to the taxpayers or their representatives. In instances in which an Engagement Agreement is not appropriate, the revenue agent should document the reason why an Agreement was not prepared.

Management's Response: SB/SE Division management did not agree at this time to implement our recommendation. During the early stages of the Field Examination Reengineering design and pilot, they considered requiring revenue agents to prepare, sign, and share Engagement Agreements with taxpayers or their representatives. However, they encountered strong negative feedback from representatives and examiners who focused on the signature process of the Agreements rather than the communication they were trying to foster. As a result, SB/SE Division management determined mandatory use of signed Engagement Agreements to be counterproductive to their redesign effort at that time.

Office of Audit Comment: We still believe revenue agents should be required to prepare Engagement Agreements and provide them to the taxpayers or their representatives. Engagement Agreements would help ensure revenue agents and taxpayers clearly understand their responsibilities and expectations.

5.      Require revenue agents to record the MCD on the Engagement Agreement. Also, the Director should add the MCD as a field on the Examination Returns Control System and direct group managers to monitor MCDs to identify potential timeliness and inventory management issues.

Management's Response: SB/SE Division management did not agree with our recommendation. They believe current procedures are sufficient to ensure revenue agents establish and provide the MCDs to taxpayers or their representatives. They also believe requirements for revenue agents to update the Activity Record and notify the group manager and taxpayer of changes to the MCD of more than 30 calendar days are sufficient for group managers to monitor the MCD.

Office of Audit Comment: We continue to believe additional controls are necessary to ensure revenue agents discuss and provide MCDs to taxpayers or their representatives. Additional controls would also help ensure group managers take a more proactive approach to monitoring MCDs for timeliness and would provide them with a systemic means by which to do this.

 

Appendix I

 

Detailed Objective, Scope, and Methodology

 

The overall objective of this review was to determine whether the Field Examination Reengineering Pilot met its established goals. To accomplish our objective, we:

I.      Evaluated the implementation of the Pilot.

A.     Determined whether new procedures were implemented.

1.      Interviewed the Field Examination Reengineering Project Team to determine which new procedures have been implemented.

2.      Interviewed group managers to determine how their responsibilities have changed under the reengineered audit process.

3.      Interviewed revenue agents to determine how their responsibilities have changed under the reengineered audit process.

4.      Reviewed files for a random sample of 42 closed cases, from an approximate population of 122, for evidence that new procedures were implemented. We selected a random sample to avoid bias.

B.     Determined whether new procedures were working as intended.

1.      Using the random sample of 42 closed cases in Step I.A.4., reviewed files for evidence that new procedures were followed.

II.     Evaluated the oversight process of the Pilot.

A.     Interviewed the Field Examination Reengineering Project Team to determine how oversight was provided.

1.      Reviewed results of monthly site visits conducted by the Project Team.

2.      Identified actions taken during the monthly site visits.

3.      Determined how the Project Team used the results of the site visits.

4.      Identified other methods used by the Project Team to provide oversight to the Pilot.

III.    Evaluated the measurement process of the Pilot results.

A.     Interviewed the Field Examination Reengineering Project Team to determine how results were captured.

B.     Interviewed the Field Examination Reengineering Project Team to determine how the Pilot's level of success was measured.

C.     Obtained a download of cases, closed in Fiscal Years 2002 and 2003, from the Examination Quality Measurement System (EQMS).

1.      Calculated results of the Pilot cases.

a)      Compared our results to the results identified by the Internal Revenue Service.

b)      Compared our results to the results of non-Pilot cases.

c)      Determined if the Pilot cases met the stated objectives of the Pilot, including a reduction in hours per return, a reduction in cycle time, the maintenance of or an improvement in examination quality, an increase in employee satisfaction, and an increase in customer satisfaction.

 

Appendix II

 

Major Contributors to This Report

 

Philip Shropshire, Acting Assistant Inspector General for Audit (Small Business and Corporate Programs)

Parker F. Pearson, Director

Amy L. Coleman, Audit Manager

Todd M. Anderson, Senior Auditor

Pillai Sittampalam, Senior Auditor

 

Appendix III

 

Report Distribution List

 

Commissioner  C

Office of the Commissioner – Attn: Chief of Staff  C

Deputy Commissioner for Services and Enforcement  SE

Acting Deputy Commissioner, Small Business/Self-Employed Division  SE:S

Acting Director, Compliance, Small Business/Self-Employed Division  SE:S:C

Chief Counsel  CC

National Taxpayer Advocate  TA

Director, Office of Legislative Affairs  CL:LA

Director, Office of Program Evaluation and Risk Analysis  RAS:O

Office of Management Controls  OS:CFO:AR:M

Audit Liaison: Commissioner, Small Business/Self-Employed Division  SE:S:C

 

Appendix IV

 

Management's Response to the Draft Report

 

The response was removed due to its size. To see the response, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.