TREASURY INSPECTOR GENERAL

FOR TAX ADMINISTRATION

GPRA: Weaknesses in the Service Center Correspondence Examination Process Reduce the Reliability of the Customer Satisfaction Survey

April 2001

Reference No. 2001-10-067

Executive Summary

This audit was performed as part of the Treasury Inspector General for Tax Administration's overall strategy to assess the reliability of the Internal Revenue Service's (IRS) customer satisfaction performance measures as they relate to the Government Performance and Results Act of 1993 (GPRA). The overall objective of our review was to assess the validity of the information used to measure customer satisfaction for the Service Center Correspondence Examination function.

The GPRA requires federal agencies to establish standards for measuring their performance and effectiveness. The law requires executive agencies to prepare multi-year strategic plans, annual performance plans, and performance reports on prior year accomplishments. The first annual performance reports were to be provided to the President and the Congress in March 2000. The Congress will use the GPRA measurement results to help evaluate the IRS' budget appropriation. Therefore, it is essential that the IRS accurately measure its success in meeting the performance goals.

The IRS prepared a strategic plan and an annual plan establishing goals for the agency. One of the IRS' three strategic goals is to provide quality service to each taxpayer. The IRS is measuring its success in achieving this goal through surveys conducted by a vendor. Taxpayers are being asked to complete a survey to rate the service they received. These survey results are summarized and used to evaluate the overall satisfaction with the service provided by the IRS.

Results

IRS management has not established an effective process to ensure that the survey appropriately measures customer satisfaction with interactions across all Correspondence Examination program areas. Consequently, the IRS needs to qualify the use of any data from the Correspondence Examination Customer Satisfaction Surveys.

Processing Errors Could Affect the Survey Results

Processing errors, such as assigning incorrect organization codes and the incorrect use of disposal and technique codes, caused some returns to be excluded from the survey. In one service center, the use of an incorrect organization code resulted in as much as 80 percent of its work being excluded from the survey. The use of incorrect disposal and technique codes also resulted in 16 percent of the tax returns in our sample either being improperly excluded from or incorrectly included in the survey.

Weaknesses in Approving Access, User Profiles, and Inventory Validations Could Affect the Survey Results

Some required procedures in the approval process for granting employees access to the computer system were not consistently followed. Some tax examining assistants and correspondence examination technicians had command codes in their user profiles that would allow them to make inappropriate changes and updates to the records on the database. In addition, inventory validations were not consistently conducted. When taken in the aggregate, these weaknesses increase the risk that the data, which are the basis of the survey, are not totally accurate.

Low Survey Response Rate Could Affect the Survey Results

The Internal Revenue Service Customer Satisfaction Survey National Report - Service Center Examination, dated March 2000, showed a response rate of 26.3 percent for the period July through September 1999. The response rate for all of 1998 was only 24.2 percent. Such low rates increase the risk that the opinions of the few who responded may not match the opinions of the many who did not.

Sample Selection Is Not Always Representative and Random

Surveys are issued to 100 taxpayers per month for each service center regardless of the number of tax returns closed by each service center that month. Because the volume of tax returns closed by each service center varies, this sampling technique does not result in a truly random selection process in which each taxpayer would have an equal chance of being included in the survey.
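The imbalance can be illustrated with a small sketch (the monthly closure volumes below are hypothetical, not figures from the report): when each service center mails a fixed 100 surveys regardless of workload, a taxpayer's probability of selection is 100 divided by that center's closures, so taxpayers at low-volume centers are far more likely to be surveyed.

```python
# Hypothetical illustration of the fixed-quota sampling weakness.
# Closure volumes are invented for the example, not taken from the report.
closures = {"Center A": 2_000, "Center B": 10_000}  # returns closed that month
SURVEYS_PER_CENTER = 100  # fixed quota mailed by every center

for center, closed in closures.items():
    probability = SURVEYS_PER_CENTER / closed
    print(f"{center}: {probability:.1%} chance of selection")
# → Center A: 5.0% chance of selection
# → Center B: 1.0% chance of selection
```

Under these assumed volumes, a taxpayer at the smaller center is five times as likely to be selected, so the combined national sample over-represents low-volume centers unless the results are reweighted.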

Summary of Recommendations

To address the issues involving the organization, disposal and technique codes, the Director, Compliance (Wage and Investment Division), should stress using the correct codes during training and require reviews to ensure the proper codes are used.

The Director, Compliance, should re-emphasize the importance of properly following all procedures when granting access to computer systems and re-evaluate the decisions to allow some technical employees to have command codes that allow them to adjust and delete information. If the employees must have adjustment capabilities, the Director should develop controls that will reduce the risk associated with the lack of separation of duties. Additionally, the Director should re-emphasize the need to follow inventory validation requirements and require each Examination unit to forward a validation certification to the Service Center Examination Branch Chief.

The Directors of Compliance and the Organizational Performance Division should disclose the current response rates and study what actions can be taken to increase them. Additionally, they should disclose the sampling limitations encountered and take care not to portray the taxpayers' opinions obtained as representative of all taxpayers across the nation.

Management's Response: Management agreed to all of our recommendations except the one involving the sample selection methodology. The Director, Compliance, will stress the use of the correct codes and proper procedures for allowing computer access, and will evaluate the need for employees to have adjustment capabilities. Also, the Director will emphasize inventory validations and require a validation certification. The Directors of Compliance and the Organizational Performance Division will disclose the current response rates and use telephone surveys in place of mail surveys. Management stated that the current use of a stratified sample with weighting factors for non-respondent and stratification imbalances is sufficient. Management's complete response is included in Appendix VII of this report.

Office of Audit Comment: We continue to believe that because of the large variation in the number of cases processed in each service center, the current sample methodology does not allow each taxpayer an equal opportunity to be included in the survey.