TREASURY INSPECTOR GENERAL FOR TAX ADMINISTRATION

 

 

Enhancements Made to the Modernized e-File System in Release 8 Should Improve System Performance for the 2013 Filing Season

 

 

 

April 22, 2013

 

Reference Number:  2013-20-039

 

 

This report has cleared the Treasury Inspector General for Tax Administration disclosure review process and information determined to be restricted from public release has been redacted from this document.

 

 

Phone Number  /  202-622-6500

E-mail Address  /  TIGTACommunications@tigta.treas.gov

Website  /  http://www.treasury.gov/tigta

 

 

HIGHLIGHTS

ENHANCEMENTS MADE TO THE MODERNIZED E-FILE SYSTEM IN RELEASE 8 SHOULD IMPROVE SYSTEM PERFORMANCE FOR THE 2013 FILING SEASON

Final Report issued on April 22, 2013

Highlights of Reference Number:  2013-20-039 to the Internal Revenue Service Chief Technology Officer.

IMPACT ON TAXPAYERS

The Modernized e-File (MeF) system is the IRS’s electronic filing system that enables real‑time processing of tax returns while improving error detection, standardizing business rules, and expediting acknowledgements to taxpayers.  It is a critical component in meeting the needs of taxpayers, reducing taxpayer burden, and broadening the use of electronic interactions.  The enhancements made to the MeF system in preparation for the 2013 Filing Season should help ensure the reliability of the MeF system and reduce delays in processing taxpayers’ tax returns.

WHY TIGTA DID THE AUDIT

This review is part of our Fiscal Year 2013 Annual Audit Plan and addresses the major management challenge of Modernization.  The overall audit objective was to determine whether the MeF system infrastructure changes were on track to deliver improvements in performance and reliability for the 2013 Filing Season.

WHAT TIGTA FOUND

The IRS reviewed the MeF system performance issues from the 2012 Filing Season and identified major categories of work to address performance and reliability issues.  To ensure that the IRS could process the anticipated volume of returns for the 2013 Filing Season, the IRS modified its testing strategy by incorporating significant enhancements, namely the use of a production-sized database and performance testing for a duration of 48 hours.  Results from the 48-hour test show that the MeF system timely generated files to the Electronic Fraud Detection System.  The IRS also increased the bandwidth of the portal that serves as the entry point for web-based access to IRS applications and data.  Because other IRS applications use the same portal as the MeF system, this increased bandwidth helps guard against a decrease in overall performance. 

To improve the IRS’s ability to monitor and measure the MeF system’s availability, the IRS is in the process of implementing new monitoring tools and Automated Ticket Generation.  Although monitoring activities do not directly improve performance of the MeF system, they can help alert the IRS to problems or potential problems so that IRS personnel can proactively act to mitigate the problems when they are identified.

Collectively, the various categories of work should provide the enhancements intended to correct the 2012 Filing Season issues and give the IRS the assurance of the MeF system’s readiness for the 2013 Filing Season.

WHAT TIGTA RECOMMENDED

TIGTA did not make any recommendations in this report.  However, a discussion draft of the report was provided to the IRS for review and comment.  The IRS had no comments on the report.  

 

April 22, 2013

 

 

MEMORANDUM FOR CHIEF TECHNOLOGY OFFICER

 

FROM:                       Michael E. McKenney /s/ Michael E. McKenney

Acting Deputy Inspector General for Audit

 

SUBJECT:                  Final Audit Report – Enhancements Made to the Modernized e-File System in Release 8 Should Improve System Performance for the 2013 Filing Season (Audit # 201320026)

 

This report presents the results of our review of the Modernized e-File system.  The overall objective was to determine whether the Modernized e-File system infrastructure changes were on track to deliver improvements in performance and reliability for the 2013 Filing Season.  This audit is included in the Treasury Inspector General for Tax Administration Fiscal Year 2013 Annual Audit Plan and addresses the major management challenge of Modernization.

We did not make any recommendations in this report.  However, a discussion draft of the report was provided to the Internal Revenue Service (IRS) for review and comment.  The IRS had no comments on the report.

Copies of this report are also being sent to the IRS managers affected by the report.  If you have any questions, please contact me or Alan R. Duncan, Assistant Inspector General for Audit (Security and Information Technology Services).

 

 

Table of Contents

 

Background

Results of Review

Necessary Steps Were Taken to Improve Modernized e-File System Performance

Monitoring Tools Are Still Under Development

A Contingency for the Modernized e-File System Exists to Minimize Disruptions in the Taxpayers’ Ability to File Tax Returns Electronically

Appendices

Appendix I – Detailed Objective, Scope, and Methodology

Appendix II – Major Contributors to This Report

Appendix III – Report Distribution List

Appendix IV – MeF Release 8 – 48-Hour Test Results

Appendix V – Forms Accepted by the Legacy e-File System for the 2013 Filing Season

Appendix VI – Glossary of Terms

 

 

Abbreviations

 

IRS       Internal Revenue Service

IT        Information Technology

MeF       Modernized e-File

ODS       On-Demand Services

PETE      Performance Evaluation Testing Environment

RUP       Registered User Portal

 

 

Background

 

During the 2012 Filing Season, MeF system processing was suspended at least twice to correct system performance issues.

The Modernized e-File (MeF) system is the Internal Revenue Service’s (IRS) electronic filing system that enables real-time processing of tax returns while improving error detection, standardizing business rules, and expediting acknowledgements to taxpayers.  It is a critical component in meeting the needs of taxpayers, reducing taxpayer burden, and broadening the use of electronic interactions.  The IRS deployed Release 7[1] for the 2012 Filing Season, which included the remaining individual tax return forms and schedules.[2]  During the 2012 Filing Season, the IRS required all transmitters that transmitted one million or more individual tax returns during the 2011 Filing Season to use the MeF system to electronically file tax returns.  On at least two occasions, the IRS had to suspend MeF system processing to correct system performance issues.  IRS management noted that the performance issues first experienced on January 17, 2012, might have been caused by the large volume of tax returns received by the MeF system during the first day of processing.  According to the IRS, the volume of returns received by the MeF system on January 17, 2012, was among the largest it had ever received in a single day.  The second incident of MeF system performance issues began in late January 2012.  These issues primarily included delays in sending files to downstream systems and in delivering acknowledgements, which delayed the processing of individual tax returns.

The IRS determined that performance enhancements would be needed to increase processing rates and improve the reliability of delivering submissions to downstream systems.  To ensure that it corrected the identified issues, the IRS approved a change request in early February 2012 to modify the scope of Release 8 in order to focus on performance, system reliability, and production support, thereby delaying implementation of new business taxpayer forms to Release 9.

This review was performed at the offices of the Information Technology (IT) Applications Development, Enterprise Operations, and Enterprise Services organizations located in Lanham, Maryland, during the period October 2012 through March 2013.  We conducted this performance audit in accordance with generally accepted government auditing standards.  Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objective.  We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objective.  Detailed information on our audit objective, scope, and methodology is presented in Appendix I.  Major contributors to the report are listed in Appendix II. 

 

 

Results of Review

 

Necessary Steps Were Taken to Improve Modernized e-File System Performance

The IRS reviewed the MeF system performance issues from the 2012 Filing Season to identify required enhancements for Release 8.  The following list illustrates the 2012 Filing Season issues to be addressed:

·       Insufficient volume of test data to conduct performance testing.

·       Insufficient length of performance testing.

·       Differences between the test and production environments.

·       Insufficient testing of timely delivery of files to downstream systems.

·       Undersized portal to meet increased usage.

Figure 1 shows where the performance issues occurred during the 2012 Filing Season.

Figure 1:  Location of MeF System Performance Issues
During the 2012 Filing Season

Figure 1 was removed due to its size.  To see Figure 1, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.

Source:  Our modification of a diagram provided by Enterprise Services organization management.

Based on the issues, IRS management identified three major categories of work (or work streams) to address performance and reliability issues.  The IRS also identified a fourth work stream to improve monitoring activities that help alert the IRS to problems or potential problems so personnel from the IT organization can proactively act to mitigate the problems when they are identified.  Figure 2 identifies each work stream’s objective, estimated cost, and the specific IT organization responsible for delivery.

Figure 2:  Four Major Work Streams

Work Stream | Lead IT Organization | Objective | Estimated Cost
Performance Test | Enterprise Services | Enhance performance testing environment. | $3,440,619
ODS | Applications Development | Validate timely delivery of files to downstream systems. | $404,000
Portal Readiness and Testing | Enterprise Services | Determine if the Registered User Portal (RUP) is sized sufficiently to support the 2013 Filing Season volume. | $1,264,780
Monitoring | Enterprise Operations | Implement improved monitoring capabilities. | $1,536,401

Source:  IRS MeF system briefings and IT organization management.

The IRS reported it invested about $43.07 million developing Release 8, approximately $6.65 million of which supported the work streams.  The performance test and portal readiness and testing work streams directly affect the timely submission of tax returns for processing.  The IRS modified its testing strategy by incorporating significant enhancements, namely the use of a production-sized database and performance testing for a duration of 48 hours, to ensure that the IRS could process the anticipated volume of returns for the 2013 Filing Season.  Additionally, the IRS increased the bandwidth of the portal that serves as the entry point for web-based access to IRS applications and data.  Because other IRS applications use the same portal as the MeF system, this increased bandwidth helps guard against a decrease in overall performance.

The ODS work stream helps track the flow of tax return information to downstream systems.  The monitoring work stream allows the IRS to proactively respond to events/incidents reported by the MeF system.

Enhanced Release 8 performance test strategy addressed issues identified during the 2012 Filing Season

One of the lessons learned from the 2012 Filing Season was that the IRS did not have a sufficient volume of test data to enable it to conduct longer, sustained performance testing.  As a result, the IRS increased the Release 8 test requirement for the Performance Evaluation Testing Environment (PETE) database by implementing a copy of the MeF system production database.  This production-sized database contained 23.1 million records, up from the 6.6 million records required for Release 7 testing.  By doing so, the IRS leveraged data from the production‑sized database to help it obtain the required amount of data needed to execute sustained performance testing.

The IRS also learned it needed to execute a performance test with a continuous rate for more than eight hours.  As a result, the IRS included a requirement to execute a 48-hour test in the Release 8 test plan.  The 48-hour test is significant for the following reasons:

·       It tests MeF system processing during peak and non-peak conditions.

·       It was the primary test of Release 8 ODS enhancements. 

The IRS uses receipt rate and acknowledgement rate measurements to evaluate MeF system performance.  For time periods with a low submission rate, the IRS expects the acknowledgement and receipt rates to match.  For high-volume periods where the submission rate is sufficient to test peak filing season volume, the IRS expects the MeF system to deliver an acknowledgement rate of 132 acknowledgements per second, with up to a two-hour delay.  Most of the IRS’s performance testing is geared toward validating whether the MeF system can perform during peak loads of steady submissions.  However, aside from a few sustained peak periods, notably the start of the filing season and the April deadline for individual returns, filing volume rises and falls throughout the season.  The 48-hour test simulates these peaks and valleys of processing. 
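To illustrate the two-hour acknowledgement allowance described above, the following sketch (our illustration, not an IRS tool; the 132 acknowledgements-per-second capacity and the two-hour window come from the report, while the hourly receipt rates are hypothetical) estimates the worst-case backlog, expressed as hours of work at capacity:

```python
# Illustrative sketch only: given hourly receipt rates, estimate whether an
# acknowledgement capacity of 132 acknowledgements per second keeps the
# backlog within the report's two-hour delay allowance.

ACK_CAPACITY = 132.0  # acknowledgements per second (peak requirement)

def max_delay_hours(hourly_receipt_rates, ack_capacity=ACK_CAPACITY):
    """Return the worst-case backlog, expressed in hours of work at capacity."""
    backlog = 0.0  # unacknowledged submissions, in per-second rate units
    worst = 0.0
    for rate in hourly_receipt_rates:
        # Net change this hour; the backlog can never be negative.
        backlog = max(0.0, backlog + (rate - ack_capacity))
        # Hours of work at full capacity needed to clear the backlog.
        worst = max(worst, backlog / ack_capacity)
    return worst

# Hypothetical peaks and valleys similar to the 48-hour test profile.
rates = [72, 104, 154, 101, 79, 86, 77, 77]
print(max_delay_hours(rates) < 2.0)  # within the two-hour allowance
```

Under this profile, the single one-hour spike above capacity clears well within the two-hour window, which mirrors the behavior the 48-hour test was designed to observe.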

The Internal Revenue Manual states that performance testing is required to determine whether the system undergoing testing can effectively process transactions under expected normal and peak workload conditions and within acceptable response times.  If performance and capacity tests are not successful, the risk that the system may fail to meet stated business performance requirements is increased.  Specific risks include the possibility that an application could become unavailable to process mission-critical transactions or the system could experience unexpected outages or degraded response times.

Figure 3 shows excerpts of hourly performance test results from the 48-hour test where volume increases to a point where the acknowledgement rate falls well behind the receipt rate, consistent with volume for peak filing season conditions.  Appendix IV provides the complete results of the 48-hour test. 

Figure 3:  Results From the 48-Hour Test in Release 8

Line | Date | Time | Receipts per second | Acknowledgements per second | Differential

December 11, 2012, Peak Period
1 | 12/11/2012 | 16:00 | 71.9 | 71.7 | -0.2
2 | 12/11/2012 | 17:00 | 104.2 | 100.8 | -3.4
3 | 12/11/2012 | 18:00 | 154.1 | 132.4 | -21.7
4 | 12/11/2012 | 19:00 | 100.6 | 124.9 | 24.3
5 | 12/11/2012 | 20:00 | 79.3 | 79.9 | 0.6
6 | 12/11/2012 | 21:00 | 85.9 | 86.5 | 0.6
7 | 12/11/2012 | 22:00 | 77.3 | 77.0 | -0.3
8 | 12/11/2012 | 23:00 | 77.3 | 76.8 | -0.5

December 12–13, 2012, Peak Period
9 | 12/12/2012 | 17:00 | 111.8 | 111.2 | -0.6
10 | 12/12/2012 | 18:00 | 166.5 | 124.7 | -41.8
11 | 12/12/2012 | 19:00 | 124.1 | 124.9 | 0.8
12 | 12/12/2012 | 20:00 | 148.7 | 122.7 | -26.0
13 | 12/12/2012 | 21:00 | 43.1 | 98.7 | 55.6
14 | 12/12/2012 | 22:00 | 90.2 | 98.0 | 7.8
15 | 12/12/2012 | 23:00 | 95.0 | 97.8 | 2.8
16 | 12/13/2012 | 0:00 | 89.3 | 90.3 | 1.0
17 | 12/13/2012 | 1:00 | 63.4 | 63.8 | 0.4

Source:  The Daily PETE report dated December 13, 2012, and our calculations.  We did not independently
assess the accuracy or reliability of the data presented in this figure.

For the first peak period (Figure 3, line 3), the MeF system experienced a one-hour volume spike to rates of 154.1 receipts per second and 132.4 acknowledgements per second at 18:00 (6:00 p.m.), a negative differential of 21.7 acknowledgements per second.  By 19:00 (7:00 p.m.) (Figure 3, line 4), the receipt rate dropped to 100.6 receipts per second and the acknowledgement rate was 124.9 acknowledgements per second, a positive differential of 24.3 acknowledgements per second, showing that the MeF system caught up on the prior hour’s backlog.  In the second peak period (Figure 3, line 10), volume again spiked at 18:00 (6:00 p.m.) to 166.5 receipts per second and 124.7 acknowledgements per second.  Unlike the first peak period, where the volume spike lasted only one hour before dropping to a steady lower rate, the volume of returns remained high for three hours in the second peak period, until 20:00 (8:00 p.m.), making the evaluation slightly more complicated.  Therefore, we calculated total receipts per second and acknowledgements per second from 18:00 (6:00 p.m.), the start of the spike, through 22:00 (10:00 p.m.), two hours after the volume spike ended.  By 22:00 (10:00 p.m.), the cumulative differential between receipts per second and acknowledgements per second was only a negative 3.6 acknowledgements per second (-41.8 + 0.8 - 26.0 + 55.6 + 7.8), indicating that acknowledgements had again largely caught up to receipts.  Based on the results of the 48-hour test, we believe that MeF Release 8 meets its stated acknowledgement rate performance requirement.
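The cumulative differential computed above can be verified with simple arithmetic; the receipt and acknowledgement rates below are taken directly from Figure 3, lines 10 through 14 (18:00 through 22:00 on December 12, 2012):

```python
# Reproduce the hourly and cumulative differentials from Figure 3,
# lines 10-14 (18:00 through 22:00 on December 12, 2012).
receipts = [166.5, 124.1, 148.7, 43.1, 90.2]
acks     = [124.7, 124.9, 122.7, 98.7, 98.0]

# Differential = acknowledgements per second minus receipts per second.
differentials = [round(a - r, 1) for a, r in zip(acks, receipts)]
print(differentials)                 # [-41.8, 0.8, -26.0, 55.6, 7.8]
print(round(sum(differentials), 1))  # cumulative differential: -3.6
```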

In addition to reviewing results from the 48-hour test, we reviewed test results from the final tests performed January 11–25, 2013.  The IRS will implement configurations from the final tests into production for the start of the filing season.  The final tests concluded with seven acknowledgement rate tests to validate that the MeF system would meet performance requirements for peak filing season.  For these tests, the MeF system averaged a receipt rate of 150.71 receipts per second with an acknowledgement rate of 127.71 acknowledgements per second.  While the tests do not directly show whether the MeF system delivered acknowledgements within two hours, the IRS conducted an analysis of the 2012 Filing Season to estimate hourly return submissions to the MeF system for the April 2013 Filing Season peak period.  The IRS subsequently estimated the delay in delivering acknowledgements at various potential acknowledgement rates.  Figure 4 displays projections for whether the MeF system will deliver acknowledgements within a two-hour period at different rates. 

Figure 4:  2013 Filing Season Projections of Acknowledgement Rates

Figure 4 was removed due to its size.  To see Figure 4, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.

Source:  Enterprise Services organization Solutions Engineering management. 
Acks/sec = acknowledgements/second.

The graph shows that if the MeF system delivers an acknowledgement rate of at least 113 acknowledgements per second, then it will meet performance requirements for Release 8.

Collectively, the upgrades to the Release 8 PETE environment and enhancements to Release 8 performance testing addressed weaknesses identified from Release 7 testing and provided the IRS with a higher degree of confidence that the performance results observed during Release 8 testing will mirror those observed during the 2013 Filing Season.

ODS enhances downstream processing of files

The IRS developed requirements for the ODS to deliver files to downstream systems.  For example, the ODS will deliver data to the Generalized Mainline Framework at specified times each day, whereas the ODS will deliver data to the Electronic Fraud Detection System every hour on the half-hour.  The ODS also creates summary data for reports showing the counts of the returns submitted downstream and removes data that have aged past their required retention period.  Figure 5 lists the high-level 2012 Filing Season ODS issues and the corresponding enhancements for Release 8.
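The fixed delivery schedule described above can be illustrated with a short sketch (the every-hour-on-the-half-hour timing for the Electronic Fraud Detection System is from the report; the function itself is our hypothetical illustration, not IRS code):

```python
# Illustrative sketch only: compute the next scheduled ODS delivery to the
# Electronic Fraud Detection System, which the report states occurs every
# hour on the half-hour.
from datetime import datetime, timedelta

def next_efds_delivery(now):
    """Return the next half-hour-past-the-hour delivery time after `now`."""
    candidate = now.replace(minute=30, second=0, microsecond=0)
    if candidate <= now:
        candidate += timedelta(hours=1)
    return candidate

print(next_efds_delivery(datetime(2013, 1, 30, 9, 45)))  # 2013-01-30 10:30:00
```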

Figure 5:  Comparison of ODS Issues and Corresponding Enhancements 

2012 Filing Season ODS Issue | MeF Release 8 ODS Enhancement
Files to downstream systems were not generated timely, and duplicate files containing accepted tax return data were delivered to downstream systems (File Generation). | The IRS implemented monitoring to ensure that files are delivered timely and separated the file creation step into independent units of work.
The tool used to create summary data for reports generated incorrect counts and took longer than expected (Aggregators). | The IRS updated programming code to fix the problems that caused incorrect counts and transferred work from the application to the database to assist with timely completion of tasks.
The Java Virtual Machine became unresponsive and had to be stopped because it could not process all its tasks. | The IRS updated the programming code to distribute the ODS tasks across multiple Java Virtual Machines.
Tasks for removing old or unnecessary data took longer than expected (Purge). | The IRS updated queries to allow purge tasks to complete sooner.

Source:  ODS Enhancements Management Briefing, dated September 24, 2012.

The IRS tested the Release 8 ODS enhancements during the 48-hour test.  The Daily PETE report dated January 11, 2013, noted that all Aggregator and Purge tasks completed, with the exception of known issues with a specific Aggregator and a specific Purge task that have no impact on performance.  Further, test results provided by IRS management for File Generation tasks show that, during the 48-hour test, the MeF system timely generated files to the Electronic Fraud Detection System, the primary system affected by the 2012 Filing Season ODS issues.  However, the test results also showed that the Electronic Tax Administration Research and Analysis System and the E-File Reports website did not meet file delivery requirements.  During a February 26, 2013, meeting, IRS management noted that the MeF system experienced some issues with timely delivery of files to various downstream systems during the 2013 Filing Season and stated that the issues have subsequently been fixed.  

Testing indicated the successful expansion of the portal capacity

The RUP is the IRS external portal that allows registered individuals, third-party users, and other individual taxpayers or their representatives to access the IRS for interaction with selected tax processing and other sensitive systems, applications, and data.  The RUPs are used by external users to access the MeF system applications, find the status of their refunds, obtain Preparer Tax Identification Numbers, and perform other functions.  There are two RUP-LAs[3] and two RUP‑SAs.[4]

During the 2012 Filing Season, the MeF system experienced a more than 4,000 percent increase in demand.  The IRS observed several issues with the portal, resulting in extraordinarily slow or no service for its customers.  Contributing factors included high central processing unit utilization on the session database server and MeF system backend processing.  Analyses were performed and several recommendations were made to improve the performance of the RUP-LA.  A major recommendation was to expand the network bandwidth from 45 Mbps (megabits per second) to 70 Mbps.  Because other systems besides the MeF system use the RUP, IT organization management stated that IRS executives decided to increase the bandwidth to 155 Mbps to ensure that the bandwidth will support all activities using this portal.  The additional bandwidth will guard against a decrease in portal performance.  Other recommendations for improving portal performance included, among other actions, replacing and upgrading the telecommunications technology.  A draft December 31, 2012, MeF system status briefing[5] reported that the IRS completed replacing the telecommunications equipment.  The IRS also completed testing of the portal bandwidth in December 2012.  The test results indicated that the bandwidth upgrade was successful and information is being sent through the circuits at a rate of 155 Mbps.

In addition, the IRS conducted a test in September 2012 to determine the readiness of the RUP‑SA to support estimated workloads for the 2013 Filing Season of 120 End User Transactions per second.  The test report concluded that the RUP-SA processed 124 End User Transactions per second, which exceeded the test objective of 120 End User Transactions per second.  The report also stated that testing determined that one portal could process the required peak workload in the event the IRS had to fail over to one portal.

Collectively, the various work streams should provide the enhancements intended to correct the 2012 Filing Season issues and give the IRS the assurance of the MeF system’s readiness for the 2013 Filing Season.

Monitoring Tools Are Still Under Development

The IRS created the Information Technology Business Plan[6] as a roadmap in its journey to world-class status.  The Plan documents the IT organization’s strategic direction for Fiscal Years 2011–2013 and includes a goal for delivering improved business capabilities and governance.  To achieve this goal, the IRS chartered an End-To-End Program in July 2010 to improve the IT organization’s ability to monitor, measure, manage, and improve information technology serviceability end to end.  World-class organizations monitor and measure the availability and performance of the services they provide to customers and use these data to quickly react to system outages/degradation and take proactive steps to prevent outages from occurring. 

Enterprise Operations organization management stated that they performed monitoring activities for the MeF system during the previous filing season.  However, for MeF Release 8, they worked on implementing new monitoring tools and Automated Ticket Generation[7] to improve the IRS’s ability to monitor and measure the MeF system’s availability as well as help address the system’s 89 technical monitoring requirements for the 2013 Filing Season.  The technical monitoring requirements were separated into six high-priority categories. 

Figure 6 provides the technical monitoring requirements categories and the number of requirements.  As of February 10, 2013, the IRS implemented 67 (75 percent) of the 89 technical monitoring requirements.  Unresolved issues caused a delay in implementing the remaining 22 requirements.  Some of the unresolved issues include problems with the functionalities of a monitoring application, monitoring solutions not yet developed, and events initiating the auto‑tickets not being generated or received.  The IRS planned to deploy the remaining technical monitoring requirements during the 2013 Filing Season; however, it has yet to identify the implementation dates.

Figure 6:  Technical Monitoring Requirements Categories

Category | Description | Total Requirements | Requirements Deployed by 02/10/2013 | Requirements to Be Deployed in the Future
Dashboard | Provide single monitoring dashboard. | 13 | 5 | 8
Database Monitoring | Monitor and ticket on status of Oracle databases. | 12 | 9 | 3
File Transfer | Monitor file creation status to downstream systems. | 1 | 1 | 0
Integration | Integrate Tivoli Event Console errors and auto-ticketing. | 23 | 15 | 8
ODS Monitoring | Correlate and ticket ODS events. | 7 | 7 | 0
Queue Monitoring | Monitor queues (Java Messaging Service and Messaging Queue). | 33 | 30 | 3
Total | | 89 | 67 | 22

Source:  Our analysis of data provided by Enterprise Operations organization management.

Although monitoring activities do not directly improve performance of the MeF system, they can help alert the IRS to problems or potential problems so that IRS personnel can proactively act to mitigate the problems when they are identified.  Enterprise Operations organization management stated that a delay in implementing these requirements would necessitate the continued use of tools or processes currently in place (e.g., manual monitoring).

The IRS also planned to implement Automated Ticket Generation during Release 8, a feature that did not exist during the 2012 Filing Season.  As part of monitoring, Automated Ticket Generation will automatically open incident tickets in the Knowledge Incident/Problem Service Asset Management system for 69 different conditions with a Priority 2 level.  Examples of conditions that will generate a ticket are when the acknowledgement rate goes below a certain threshold, a server is down, or the portal is unavailable.  The Automated Ticket Generation is more efficient than the manual process because it automatically routes the tickets to the appropriate party for resolution.
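The ticket-generating conditions named above can be sketched as a simple rule check (the three example conditions come from the report; the threshold value, field names, and logic here are our assumptions, not IRS specifications):

```python
# Hypothetical sketch of the kind of rule evaluation behind Automated
# Ticket Generation.  The three conditions are examples cited in the
# report; the threshold and data structure are illustrative assumptions.

ACK_RATE_THRESHOLD = 113.0  # acknowledgements/second; illustrative only

def conditions_to_ticket(metrics):
    """Return descriptions of triggered conditions that would open
    Priority 2 incident tickets."""
    tickets = []
    if metrics.get("ack_rate", float("inf")) < ACK_RATE_THRESHOLD:
        tickets.append("Acknowledgement rate below threshold")
    if not metrics.get("server_up", True):
        tickets.append("Server is down")
    if not metrics.get("portal_available", True):
        tickets.append("Portal is unavailable")
    return tickets

print(conditions_to_ticket({"ack_rate": 95.0, "server_up": True,
                            "portal_available": False}))
```

Each returned description would then be routed automatically to the appropriate party for resolution, which is the efficiency gain the report attributes to Automated Ticket Generation over the manual process.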

As of February 1, 2013, the IRS deployed 29 of 69 auto‑ticketing requirements into production and plans to deploy the remaining requirements during the 2013 Filing Season.  We encourage the IRS to continue to develop and test all of the monitoring requirements for the MeF system, and we will evaluate its progress during our review of MeF Release 9.  Once they are completely implemented, we believe that the monitoring requirements will improve the IRS’s ability to monitor the MeF system’s availability and performance, allowing personnel to efficiently and proactively identify and address system problems and positioning the IRS to become a world‑class IT organization.

A Contingency for the Modernized e-File System Exists to Minimize Disruptions in the Taxpayers’ Ability to File Tax Returns Electronically

When errors in the MeF system caused processing problems in February 2012, the IRS encouraged transmitters to use the Legacy e-File system instead of the MeF system.  The IRS temporarily, and later permanently, lifted the restrictions on the use of the Legacy e-File system for the remainder of the 2012 Filing Season.  IRS management originally planned to retire the Legacy e-File system in October 2012; however, we recommended that the IRS defer the retirement of the Legacy e-File system until the increased risk associated with retiring the system could be addressed.  The MeF Business System Requirements Report Release 8 System Development Phase Milestone 4B, dated November 13, 2012, included an assumption that the IRS will revert to the Legacy e-File system if the MeF system cannot be used.  

The IRS is committed to having all software providers, transmitters, and States use the MeF system as the only platform to transmit individual tax returns during the 2013 Filing Season.  At the same time, the IRS realized it needed a contingency or backup plan in the event the MeF system experienced significant performance issues.  As a risk management step, the IRS announced it will continue to maintain part of the Legacy e-File system as a backup for the MeF system.  Specifically, the IRS agreed it will accept only the 25 most frequently used individual forms and schedules.[8]  Further, the IRS indicated it would permit software developers and transmitters to use the Legacy e-File system only after they passed required testing.  A review of the IRS’s website on August 28, 2012, indicated that the IRS notified the public about its approach for the Legacy e-File system as early as July 31, 2012.

As of December 10, 2012, IRS management reported updating and testing the changes for the Legacy e-File system for the 2013 Filing Season.  The IRS planned to complete testing by January 14, 2013; however, testing was extended to January 17, 2013, due to late tax legislation enacted on January 2, 2013.  On January 8, 2013, the IRS announced plans to commence the 2013 Filing Season and begin processing most individual income tax returns on January 30, 2013.

Because the IRS prepared the Legacy e-File system to accept the designated returns and required transmitters to pass the Participant Acceptance Testing System test in order to file returns through the Legacy e-File system, we concluded that the IRS has taken adequate steps to ensure that the Legacy e-File system can be used during the 2013 Filing Season in the event that the MeF system is unable to process the entire workload.

 

Appendix I

 

Detailed Objective, Scope, and Methodology

 

The overall objective was to determine whether the MeF system infrastructure changes were on track to deliver improvements in performance and reliability for the 2013 Filing Season.  To accomplish our objective, we: 

I.    Determined if the IRS’s mitigation activities from the 2012 Filing Season will improve the MeF system’s performance and reliability to enable it to be the sole system supporting the 2013 Filing Season.

A.    Obtained an understanding of the differences in performance testing strategies between Releases 7 and 8.

B.    Evaluated how the four work streams will enhance MeF Release 8 performance and correct the performance problems that occurred during the 2012 Filing Season.

C.    Reviewed performance test results from tests performed December 11–13, 2012, to determine whether the work streams are improving performance to the IRS’s expectations.

II.   Evaluated the IRS’s plans to ensure continuity of processing returns during the filing season.

A.    Determined what the IRS has done to ensure that the Legacy e-File system can process returns during the 2013 Filing Season if the MeF system is unable to do so.

B.    Determined what guidance has been provided to the transmitters if it becomes necessary for them to use the Legacy e-File system in lieu of the MeF system.

Internal controls methodology

Internal controls relate to management’s plans, methods, and procedures used to meet their mission, goals, and objectives.  Internal controls include the processes and procedures for planning, organizing, directing, and controlling program operations.  They include the systems for measuring, reporting, and monitoring program performance.  We determined the following internal controls were relevant to our audit objective:  IRS testing guidance and the MeF Release 8 requirements.  We evaluated these controls by interviewing IRS IT management from the Applications Development, Enterprise Operations, and Enterprise Services organizations and reviewing relevant supporting documentation.

 

Appendix II

 

Major Contributors to This Report

 

Alan R. Duncan, Assistant Inspector General for Audit (Security and Information Technology Services)

Danny R. Verneuille, Director, Systems Operations

Diana M. Tengesdal, Audit Manager

Tina Wong, Senior Auditor

Michael T. Mohrman, Information Technology Specialist

Linda L. Nethery, Information Technology Specialist

 

Appendix III

 

Report Distribution List

 

Acting Commissioner  C

Office of the Commissioner – Attn:  Chief of Staff  C

Deputy Commissioner for Operations Support  OS

Deputy Commissioner for Services and Enforcement  SE

Commissioner, Wage and Investment Division  SE:W

Deputy Chief Information Officer for Operations  OS:CTO

Associate Chief Information Officer, Application Development  OS:CTO:AD

Associate Chief Information Officer, Enterprise Operations  OS:CTO:EO

Associate Chief Information Officer, Enterprise Services  OS:CTO:ES

Director, Customer Account Services, Wage and Investment Division  SE:W:CAS

Director, Enterprise Systems Testing  OS:CTO:AD:EST

Director, Submission Processing  OS:CTO:AD:SP

Director, Submission Processing, Wage and Investment Division  SE:W:CAS:SP

Director, Electronic Products and Services Support, Wage and Investment Division  SE:W:CAS:E:PSS

Chief Counsel  CC

National Taxpayer Advocate  TA

Director, Office of Legislative Affairs  CL:LA

Director, Office of Program Evaluation and Risk Analysis  RAS:O

Office of Internal Control  OS:CFO:CPIC:IC

Audit Liaison:  Director, Risk Management Division  OS:CTO:SP:RM

 

Appendix IV

 

MeF Release 8 – 48-Hour Test Results

 

Date         Time   Receipts     Acknowledgements
                    per second   per second
12/11/2012   9:00   21.7         21.9
12/11/2012   10:00  54.7         54.6
12/11/2012   11:00  57.5         57.6
12/11/2012   12:00  51.6         51.1
12/11/2012   13:00  94.1         94.5
12/11/2012   14:00  40.6         40.6
12/11/2012   15:00  48.7         48.7
12/11/2012   16:00  71.9         71.7
12/11/2012   17:00  104.2        100.8
12/11/2012   18:00  154.1        132.4
12/11/2012   19:00  100.6        124.9
12/11/2012   20:00  79.3         79.9
12/11/2012   21:00  85.9         86.5
12/11/2012   22:00  77.3         77.0
12/11/2012   23:00  77.3         76.8
12/12/2012   0:00   73.3         73.7
12/12/2012   1:00   70.1         70.3
12/12/2012   2:00   109.5        108.6
12/12/2012   3:00   62.8         63.6
12/12/2012   4:00   40.0         40.2
12/12/2012   5:00   27.6         27.6
12/12/2012   6:00   39.6         39.5
12/12/2012   7:00   34.5         34.5
12/12/2012   8:00   26.5         26.6
12/12/2012   9:00   31.6         31.5
12/12/2012   10:00  43.1         43.1
12/12/2012   11:00  87.7         87.4
12/12/2012   12:00  74.9         75.0
12/12/2012   13:00  91.3         91.3
12/12/2012   14:00  40.7         40.5
12/12/2012   15:00  108.3        108.6
12/12/2012   16:00  68.9         68.8
12/12/2012   17:00  111.8        111.2
12/12/2012   18:00  166.5        124.7
12/12/2012   19:00  124.1        124.9
12/12/2012   20:00  148.7        122.7
12/12/2012   21:00  43.1         98.7
12/12/2012   22:00  90.2         98.0
12/12/2012   23:00  95.0         97.8
12/13/2012   0:00   89.3         90.3
12/13/2012   1:00   63.4         63.8
12/13/2012   2:00   112.6        110.1
12/13/2012   3:00   53.1         55.6
12/13/2012   4:00   26.0         26.2
12/13/2012   5:00   26.0         25.9
12/13/2012   6:00   7.8          7.8
12/13/2012   7:00   10.4         10.4
12/13/2012   8:00   3.6          3.6

Source:  The Daily PETE report dated December 13, 2012.  We did not independently assess the accuracy or reliability of the data presented in this figure.
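The hourly throughput figures above can be summarized with a short script.  This is an illustrative sketch of our own, not an IRS tool; the `rates` list is transcribed from the figure as (receipts per second, acknowledgements per second) pairs, and all variable names are hypothetical.

```python
# Hourly throughput from the MeF Release 8 48-hour test
# (December 11-13, 2012), transcribed from the figure above as
# (receipts per second, acknowledgements per second) pairs.
rates = [
    (21.7, 21.9), (54.7, 54.6), (57.5, 57.6), (51.6, 51.1), (94.1, 94.5),
    (40.6, 40.6), (48.7, 48.7), (71.9, 71.7), (104.2, 100.8), (154.1, 132.4),
    (100.6, 124.9), (79.3, 79.9), (85.9, 86.5), (77.3, 77.0), (77.3, 76.8),
    (73.3, 73.7), (70.1, 70.3), (109.5, 108.6), (62.8, 63.6), (40.0, 40.2),
    (27.6, 27.6), (39.6, 39.5), (34.5, 34.5), (26.5, 26.6), (31.6, 31.5),
    (43.1, 43.1), (87.7, 87.4), (74.9, 75.0), (91.3, 91.3), (40.7, 40.5),
    (108.3, 108.6), (68.9, 68.8), (111.8, 111.2), (166.5, 124.7),
    (124.1, 124.9), (148.7, 122.7), (43.1, 98.7), (90.2, 98.0), (95.0, 97.8),
    (89.3, 90.3), (63.4, 63.8), (112.6, 110.1), (53.1, 55.6), (26.0, 26.2),
    (26.0, 25.9), (7.8, 7.8), (10.4, 10.4), (3.6, 3.6),
]

receipts = [r for r, _ in rates]
acks = [a for _, a in rates]

# Peak rates in the figure occur during the 18:00 load spikes.
peak_receipts = max(receipts)  # 166.5 receipts per second (12/12, 18:00)
peak_acks = max(acks)          # 132.4 acknowledgements per second (12/11, 18:00)

print(f"hours sampled: {len(rates)}")
print(f"peak receipts/sec: {peak_receipts}, peak acks/sec: {peak_acks}")
print(f"mean receipts/sec: {sum(receipts) / len(receipts):.1f}")
```

Running it confirms the peak rates shown in the figure, e.g., the 166.5 receipts per second reached at 18:00 on December 12, 2012.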

Appendix V

 

Forms Accepted by the Legacy e-File
System for the 2013 Filing Season

 

Form             Name
1040             U.S. Individual Income Tax Return
1040A            U.S. Individual Income Tax Return
1040EZ           Income Tax Return for Single and Joint Filers With No Dependents
1040 (Sch A)     Itemized Deductions
1040 (Sch B)     Interest and Ordinary Dividends
1040 (Sch C)     Profit or Loss From Business (Sole Proprietorship)
1040 (Sch D)     Capital Gains and Losses
1040 (Sch E)     Supplemental Income and Loss
1040 (Sch EIC)   Earned Income Credit
1040 (Sch SE)    Self-Employment Tax
1099R            Distributions From Pensions, Annuities, Retirement or Profit-Sharing Plans, IRAs, Insurance Contracts, etc.
2106             Employee Business Expenses
2106EZ           Unreimbursed Employee Business Expenses
2210             Underpayment of Estimated Tax by Individuals, Estates, and Trusts
2441             Child and Dependent Care Expenses
4562             Depreciation and Amortization (Including Information on Listed Property)
8283             Noncash Charitable Contributions
8812             Additional Child Tax Credit
8829             Expenses for Business Use of Your Home
8863             Education Credits (American Opportunity and Lifetime Learning Credits)
8867             Paid Preparer's Earned Income Credit Checklist
8880             Credit for Qualified Retirement Savings Contributions
8888             Allocation of Refund (Including Savings Bond Purchases)
W-2              Wage and Tax Statement (Info Copy Only)
8949             Sales and Other Dispositions of Capital Assets

Source:  IRS website at http://www.irs.gov/uac/e-file-requirements-for-the-2013-Filing-Season.

 

Appendix VI

 

Glossary of Terms

 

Acknowledgement Rate:  Captures the time from when the receipt of a tax return submission is created to the time the acknowledgement for that tax return submission is made available for retrieval.

Auto-Ticketing:  When a specific incident or problem occurs, an incident ticket is automatically generated and routed to the appropriate organization for resolution.

Capacity Test:  Test used to determine how many users and/or transactions a given system will support and still meet performance goals.

Central Processing Unit:  An internal component of the computer that performs arithmetic and logical operations, extracts instructions from memory, decodes instructions, and executes instructions.

Electronic Fraud Detection System:  An automated system used to maximize fraud detection at the time tax returns are filed to eliminate the issuance of questionable refunds.

Enterprise File Transfer Utility:  A utility that moves data in a controlled, structured, and secured environment through the organization.

Electronic Tax Administration Research and Analysis System:  A system that captures summary statistics used to generate management reports and to support research activities.

Filing Season:  The period from January through mid-April when most individual income tax returns are filed.

Generalized Mainline Framework:  The system that validates and perfects data from a variety of input sources (e.g., tax returns, remittances, information returns, and adjustments) and controls, validates, and corrects updated transactions.

Java:  A general purpose, concurrent, class-based, object-oriented language that is specifically designed to have as few implementation dependencies as possible.

Knowledge Incident/Problem Service Asset Management System:  An IRS application that maintains the complete inventory of information technology and non–information technology assets, computer hardware, and software.  It is also the reporting tool for problem management with all IRS-developed applications and shares information with the Enterprise Service Desk.

Legacy e-File System:  The current IRS electronic filing system that is being replaced by the MeF system.

Milestones:  Provide “go/no-go” decision points in a project and are sometimes associated with funding approval to proceed.

Portal:  A point of entry into a network system that includes a search engine or a collection of links to other sites, usually arranged by topic.

Receipt Rate:  Measured from the time the transmitter sends the return to the time the acknowledgement of receipt is made available for retrieval.

Release:  A specific edition of software.

Requirement:  A statement of capability or condition that a system, subsystem, or system component must have or meet to satisfy a contract, standard, or specification.

Risk:  A potential event that could have an unwanted impact on the cost, schedule, business, or technical performance of an information technology program, project, or organization.

Transmitter:  A firm, organization, or individual that receives returns and Personal Identification Number registrations electronically from clients (Electronic Return Originators, Reporting Agents, or taxpayers), reformats the data (if necessary), batches the data with data from the returns and electronic Personal Identification Number registrations of other clients, and then transmits the data to the IRS.

 



[1] See Appendix VI for a glossary of terms.

[2] In February 2010, the IRS deployed MeF Release 6.1 to begin electronically processing the Form 1040, U.S. Individual Income Tax Return, series along with 22 other forms and schedules.  In January 2011, the IRS deployed Release 6.2, which focused on improving system performance and capacity.

[3] This access requires registration and login authentication and is referred to as RUP-LA.

[4] This access requires self-authentication using shared secrets and is referred to as RUP-SA.

[5] MeF R8 Executive Status/Risk and Schedule Meeting dated December 31, 2012.

[6] Based on a draft copy of the IRS Information Technology Business Plan Fiscal Years 2011–2013.

[7] The End-To-End System can automatically open incident tickets in the Knowledge Incident/Problem Service Asset Management system.

[8] See Appendix V for a list of the forms and schedules.