System Requirements Were Not Adequately Managed During the Testing of the Custodial Accounting Project

 

December 2004

 

Reference Number:  2005-20-019

 

This report has cleared the Treasury Inspector General for Tax Administration disclosure review process, and information determined to be restricted from public release has been redacted from this document.

 

December 21, 2004

 

 

MEMORANDUM FOR CHIEF INFORMATION OFFICER

 

FROM:     Gordon C. Milbourn III /s/ Gordon C. Milbourn III

                 Assistant Inspector General for Audit

                 (Small Business and Corporate Programs)

 

SUBJECT:     Final Audit Report - System Requirements Were Not Adequately Managed During the Testing of the Custodial Accounting Project (Audit # 200420011)

 

This report presents the results of our review of the Custodial Accounting Project’s System Acceptance Testing.  The overall objective of this review was to determine the status of the Internal Revenue Service’s (IRS) and the contractors’ readiness to deliver the Custodial Accounting Project (CAP) Release 1.0.

In summary, the IRS is currently modernizing its computer systems, business processes, and practices.  This effort is known as Business Systems Modernization (BSM).  One of the BSM projects is the CAP, which will help to modernize the IRS financial systems and improve processes.  The CAP is designed to help correct longstanding weaknesses in the IRS financial management systems, which accounted for over $2 trillion in tax collections (about 95 percent of Federal Government receipts) and $300 billion in tax refunds in Fiscal Year 2003. 

Release 1.0 was planned to deliver the CAP’s primary functions and establish the CAP to operate on the modernized infrastructure.  The IRS and the CAP contractor have completed all major test activities for Release 1.0.  However, Release 1.0 was not deployed as originally planned but was, instead, incorporated into Release 1.1 and deployed in late summer 2004.

The BSM Office (BSMO), the CAP contractor, and the eventual user of the system have been working together to deploy the CAP.  Throughout our audit, the CAP team was making progress toward this goal.  We determined significant test phases were completed, corrective actions were taken to complete testing of deferred system integration test (SIT) procedures, testing practices were followed, and data anomalies were tracked and prioritized.

Despite progress toward a long-awaited goal, the IRS and the CAP contractor did not adequately manage system requirements during the Release 1.0 System Acceptability Test (SAT); consequently, the CAP Release 1.0 may not function as intended.  The IRS and the CAP contractor did not track Release 1.0 system requirements during the SAT; testing practices did not allow the testers to determine whether system requirements were successfully tested during the SAT; the IRS approved changes to the baseline system requirements without always knowing which system requirements were affected; and the IRS accepted the Release 1.0 SAT without knowing or reviewing how many requirements were successfully verified during testing.  As a result, critical system requirements were not tested, additions to the baseline system requirements were not tested, and discrepancies in the Release 1.0 test results may have affected Release 1.1 testing. 

We also determined the main system performance requirement would not be tested prior to deployment.  In addition, CAP improvement recommendations developed by the IRS’ internal reviews have not been fully implemented.

To ensure an accurate requirements baseline is developed and maintained for future releases, we recommended the Chief Information Officer (CIO) determine the system requirements that have been successfully deployed with the current CAP release and identify all open requirements.  For future CAP releases, the CIO should implement appropriate requirements management practices to adequately define, track, and report on system requirements.  To ensure the deployed CAP will function as intended, we recommended the CIO test the main system performance requirement as soon as possible.  To improve testing, data quality, and engineering aspects of the CAP, we recommended the CIO ensure the approved internal review recommendations are implemented as soon as possible.

Management’s Response:  The CIO agreed with our recommendations and has completed corrective actions on three of the four recommendations.  The CIO indicated the IRS has significantly modified its approach to requirements management for CAP Releases 1.1 and 1.2.

More specifically, the CIO stated the IRS continues to use and strengthen the requirements management practices as outlined in the Enterprise Life Cycle.  Also, the IRS has changed the Requirements Traceability Verification Matrix to improve the tracking of test scripts to requirements and has increased the participation of the Chief Financial Officer, Business Systems Development, and Business Systems Modernization organizations and the CAP Architecture Review Board in discussing change requests as well as how change requests are implemented.  Finally, IRS executives conduct weekly meetings with the contractor to review and reach agreement on what change requests will be implemented and which reports will be incorporated or produced in each release.  Management’s complete response to the draft report is included in Appendix VII.

Copies of this report are also being sent to the IRS managers affected by the report recommendations.  Please contact me at (202) 622-6510 if you have questions, or Margaret E. Begg, Assistant Inspector General for Audit (Information Systems Programs), at (202) 622-8510.

 

Table of Contents

Background

The Custodial Accounting Project Team Is Making Progress Toward Deployment

System Requirements Were Not Adequately Managed During Testing

Recommendation 1:

Recommendation 2:

The Main System Performance Requirement Will Not Be Tested Prior to Deployment

Recommendation 3:

Recommendations for Improvement Have Not Been Implemented

Recommendation 4:

Appendix I – Detailed Objective, Scope, and Methodology

Appendix II – Major Contributors to This Report

Appendix III – Report Distribution List

Appendix IV – Subreleases for the Custodial Accounting Project Release 1

Appendix V – Enterprise Life Cycle Overview

Appendix VI – Custodial Accounting Project System Requirements Changes

Appendix VII – Management’s Response to the Draft Report

 

Background

The Internal Revenue Service (IRS) is currently modernizing its computer systems, business processes, and practices.  This effort is known as Business Systems Modernization (BSM).  One of the BSM projects is the Custodial Accounting Project (CAP), which will help to modernize the IRS financial systems and improve business processes.

The CAP is designed to help correct longstanding weaknesses in the IRS financial management systems, which accounted for approximately $2 trillion in tax collections (about 95 percent of Federal Government receipts) and $300 billion in tax refunds in Fiscal Year 2003.  These weaknesses include:

·         Deficiencies in controls to properly manage unpaid assessments, resulting in both taxpayer burden and lost revenue to the Federal Government.

·         Deficiencies in controls over tax refunds, permitting the disbursement of improper refunds.

·         Inadequacies in the financial reporting process that prevent the IRS from having timely and reliable information for decision making. 

Due to these weaknesses, the IRS has to implement compensating processes and expend tremendous resources to prepare its financial statements.  Additionally, these weaknesses may adversely affect the decisions made by the IRS and/or the Congress when relying on the information obtained from the IRS custodial reporting systems.  Part of the solution for correcting these weaknesses includes developing and implementing the CAP.

The CAP will be the primary system for the IRS to store taxpayer data for analysis and financial reporting purposes.  Northrop Grumman is the contractor responsible for planning, developing, and deploying the CAP under the leadership and direction of the IRS BSM Office (BSMO).  The PRIME contractor is responsible for integrating the CAP with other modernization systems.

The IRS initiated two systems in 1997 and 1998 that evolved into the CAP.  The CAP will be developed and deployed in three separate phases, known as releases.  The three releases for the CAP are as follows.

·         Releases 1 and 2 will provide a single, integrated data repository of taxpayer account data, which includes detailed taxpayer account history and unpaid assessment information.  Release 1 will consist of data from the Individual Master File (IMF) and the Customer Account Data Engine (CADE).  Release 2 will consist of data from the Business Master File (BMF), the CADE, and other sources.  The IRS and the CAP contractor are currently working on CAP Release 1.  The IRS suspended work on CAP Release 2 in December 2003, due to delays and technical issues with Release 1.

·         Release 3 will provide a single, integrated data repository of payment and deposit information captured at the point of receipt and establish the Collections Subledger.  The IRS and the CAP contractor have not started work on Release 3.

The IRS and the CAP contractor initially planned to deploy the CAP Release 1 by May 2002.  Since then, the CAP deployment date has been significantly delayed, and Release 1 has been divided into six subreleases.  See Appendix IV for further explanation of the subreleases.

Our audit focused on CAP Release 1.0 (primary or “core” system functionality).  This audit is the third Treasury Inspector General for Tax Administration (TIGTA) review of the CAP.  Our first review of the CAP reported that processes to effectively manage the CAP development were improving.  However, efforts to design, develop, and deploy CAP Release 1 were significantly behind schedule and over budget.  Our second review reported the CAP team prepared test plans to help ensure the developed system would meet expectations.  However, we found the CAP contractor did not accurately report test results and did not always follow established test procedures.

This review was performed at the BSMO facility in New Carrollton, Maryland, and the CAP contractor offices in Merrifield, Virginia, during the period February through September 2004.  The audit was conducted in accordance with Government Auditing Standards.  Detailed information on our audit objective, scope, and methodology is presented in Appendix I.  Major contributors to the report are listed in Appendix II.

This audit was conducted while changes were being made at both the BSM program level and the CAP project level.  We provided significant issues and recommendations to the BSMO by April 2004 and completed the majority of our fieldwork in July 2004.  We provided further information and assistance to the BSMO during August 2004.  Any project changes that have occurred since we concluded our analyses are not reflected in this report.  As a result, this report may not reflect the most current status.

The Custodial Accounting Project Team Is Making Progress Toward Deployment

 

The BSMO, the CAP contractor, and the eventual user of the system have been working together to deploy the CAP, and it was, in fact, deployed in late summer 2004.  Throughout our audit, the CAP team was making progress toward this goal.

·         Significant test phases were completed – The System Acceptability Test (SAT) for CAP Releases 1.0 and 1.1, as well as the Release System Integration Test (RSIT) for CAP Release 1.0, were completed.

·         Corrective actions were taken to complete testing of deferred system integration test (SIT) procedures – In our prior audit, we found that certain SIT procedures were not completed.  The CAP contractor responded it would ensure these procedures were included as part of the Release 1.0 SAT.  We determined these procedures were scheduled for testing as part of the Release 1.0 SAT.

·         Testing practices were followed – Based on our review of a judgmental sample of test scripts for Release 1.0, we determined tests were designed to prove the CAP would function correctly.  Also, test scripts that failed initially were retested until final resolution.  While certain testing practices were followed, overall testing could be improved (see System Requirements Were Not Adequately Managed During Testing section in this report).

·         Data anomalies were tracked and prioritized – The CAP contractor was able to log, track, and prioritize data inaccuracies that could cause the CAP to produce unreliable reports or inaccurate financial statements.

Despite progress toward a long-awaited goal, requirements were not adequately managed during testing.  In addition, recommendations developed by the IRS’ internal reviews to improve the CAP have not been fully implemented.

System Requirements Were Not Adequately Managed During Testing

 

According to the Enterprise Life Cycle (ELC), requirements management is the process by which requirements of all types are defined, formalized, managed, controlled, and verified.  Effective requirements management is crucial for establishing and maintaining both program and user expectations and for providing a basis for acceptance of a system.  To ensure requirements are tracked and maintained, measurements should be gathered to determine the status of requirements.  The Software Engineering Institute (SEI) recommends measuring the status of each requirement along with the change activities affecting the baseline system requirements.
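
The SEI guidance can be made concrete with a small illustration.  The following Python sketch is ours, not an IRS or contractor artifact, and every identifier in it is hypothetical; it simply shows the kind of per-requirement status and change-activity measurement the guidance calls for.

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class Requirement:
    """One baseline system requirement; all field values below are hypothetical."""
    req_id: str                      # e.g., "CAP-R1-0001" (invented identifier)
    critical: bool                   # designated "of greatest priority" by the business owner
    status: str = "untested"         # untested | passed | failed | waived | deferred
    change_requests: list = field(default_factory=list)  # CR numbers affecting this requirement

def status_report(requirements):
    """Tally requirement status and change activity, as the SEI guidance recommends."""
    by_status = Counter(r.status for r in requirements)
    touched = sum(1 for r in requirements if r.change_requests)
    return {"total": len(requirements),
            "by_status": dict(by_status),
            "affected_by_change_requests": touched}

# Two hypothetical requirements, for illustration only.
baseline = [
    Requirement("CAP-R1-0001", critical=True, status="passed"),
    Requirement("CAP-R1-0002", critical=True, status="deferred", change_requests=["CR-017"]),
]
print(status_report(baseline))
# {'total': 2, 'by_status': {'passed': 1, 'deferred': 1}, 'affected_by_change_requests': 1}
```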

We determined the IRS and the CAP contractor did not adequately manage system requirements during the Release 1.0 SAT.  This occurred because an effective requirements management process was not implemented by the CAP contractor and the IRS did not adequately oversee the CAP contractor’s results.  As a result, the CAP Release 1.0 may not function as intended.  Specifically, we determined:

·         The IRS and the CAP contractor did not track Release 1.0 system requirements during the SAT.

·         During the SAT, testing practices did not allow the testers to determine whether system requirements were successfully tested.  As a result, the CAP contractor could not provide, and the IRS did not receive, reports on the completion status of system requirements.

·         The IRS approved changes to the baseline system requirements without always knowing which system requirements were affected.

·         The CAP contractor did not report the final status of the system requirements.  Therefore, the IRS accepted the Release 1.0 SAT without knowing or reviewing how many requirements were successfully verified during testing.

As a result of these practices, we performed a detailed analysis of the SAT results and determined:

·         Critical system requirements were not tested.

·         Additions to the baseline system requirements were not tested.

·         Discrepancies in Release 1.0 test results may have affected Release 1.1 testing.

On April 1, 2004, prior to the completion of the SAT, we communicated our concerns to the BSMO and recommended testers review and verify the CAP system requirements immediately after completion of the remaining test scripts.  We stressed the importance of requirements management and recommended the status of all requirements be tracked in an updated Requirements Traceability and Verification Matrix (RTVM) prior to completing the SAT.  The CAP contractor responded it was too late in Release 1.0 testing to revise the SAT Release 1.0 test procedures.  Therefore, the IRS and the CAP contractor did not take corrective actions to address our recommendations or concerns.

The IRS and the CAP contractor did not track Release 1.0 system requirements during the SAT

The primary goal during the SAT is to ensure system requirements are successfully tested and verified.  The RTVM establishes a thread tracing each requirement from the time of identification through changes, testing, and implementation.  The CAP contractor prepared an RTVM for CAP Release 1 in March 2003, but the matrix did not provide the ability to map a system requirement to its assigned test script(s).  Therefore, the IRS could not validate the results to verify whether a system requirement was tested.  We reported a similar problem in a previous BSM project audit and recommended the BSMO perform reviews to ensure documentation is received showing that project system requirements are traced to test procedures.
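
In essence, an RTVM reduces to a mapping from each requirement to the test script(s) assigned to verify it.  The brief sketch below is our hypothetical illustration (the identifiers are invented, and the actual matrix is a contractor document); it shows how such a mapping makes the gap described above, a requirement with no assigned test script, mechanically detectable.

```python
# A minimal RTVM sketch: each requirement maps to the test scripts assigned to verify it.
# All identifiers are hypothetical illustrations, not actual CAP requirement or script names.
rtvm = {
    "CAP-R1-0001": ["SAT-TS-101", "SAT-TS-102"],
    "CAP-R1-0002": ["SAT-TS-103"],
    "CAP-R1-0003": [],  # no assigned script: the traceability gap described above
}

# Requirements that cannot be traced to any test script.
untraceable = [req for req, scripts in rtvm.items() if not scripts]
print(untraceable)  # ['CAP-R1-0003']
```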

In February 2004, we requested the IRS provide us with the current RTVM documenting the test script in which each system requirement would be tested for Release 1.0.  However, this RTVM was not available because the IRS had never requested or required the CAP contractor to develop this type of information.  Based on our concerns, the contractor prepared an updated RTVM for our benefit but did not do so until after the SAT was completed.

Management Actions:  The CAP contractor created an RTVM for CAP Release 1.1 that maps system requirements to associated test scripts.  In addition, the IRS stated the CAP team changed the Release 1.1 testing process to allow for clearer requirements validation.

During the SAT, testing practices did not allow the testers to determine whether system requirements were successfully tested

While executing the test scripts, testers were not analyzing the results to validate or sign off on the system requirements.  The CAP contractor stated it was not part of its testing process to verify a system requirement immediately after each script was completed.  Instead, the verification process was performed after all the test scripts were completed.  Throughout the SAT, the CAP contractor provided periodic status reports documenting the progress of testing.  However, since verification of requirements was not timely performed, the CAP contractor could not provide the status of the system requirements.  The IRS did not know how many requirements passed or failed throughout the 9-month testing period.

The IRS approved changes to the baseline system requirements without always knowing which system requirements were affected

System requirements have been modified in, added to, and/or removed from the CAP baseline through the use of change requests (CRs).  We reviewed 63 approved CRs and determined that 16 (25 percent) did not adequately identify the affected system requirements to be tested.  Among the remaining 47 CRs that did identify affected requirements, we identified 11 individual system requirements that were added to the CAP Release 1.0 but were not tested during the SAT.  These untested requirements represent new business functionality the IRS and the CAP contractor have agreed to deploy with the CAP but have omitted from their testing processes.

The IRS accepted the Release 1.0 SAT without knowing or reviewing how many requirements were successfully verified during testing

During the audit, we requested the IRS provide us the status of tested requirements to determine which business functions had been successfully verified.  According to IRS officials, they did not have this information and would have to request it from the CAP contractor.  In fact, the IRS did not know how many requirements were to be tested during SAT Release 1.0 and, at the end, never knew how many had passed, failed, been waived, or been deferred to other CAP releases.  However, the IRS accepted the CAP Release 1.0 SAT results without this vital information.

At our request, the CAP contractor created a mapping of the tested requirements and we performed our own detailed review of the results.  We judgmentally selected a sample of 23 tested requirements for detailed analysis against the test script results.  Our analyses showed that 4 (17 percent) of the 23 sampled requirements contained the following inconclusive data:

·         Two requirements did not have results and were not tested.  We reviewed the SAT Release 1.1 RTVM to determine if those requirements were planned for Release 1.1 testing.  The Release 1.1 RTVM incorrectly reported that the two requirements had already been validated or tested in CAP Release 1.0.

·         Two other requirements were deferred to CAP Release 1.1.  However, the SAT Release 1.1 RTVM listed one requirement as having already been validated in CAP Release 1.0 and did not list the other requirement at all.

These four requirements are classified as critical requirements or “of greatest priority” by the IRS Chief Financial Officer but were not successfully validated during CAP Release 1.0 testing.  Based on the results from our initial sample, we reviewed seven additional requirements and found four other discrepancies:

·         One requirement failed during testing, and the SAT Release 1.1 RTVM stated the requirement had already been tested in CAP Release 1.0.

·         Three requirements were deferred to CAP Release 1.1.  However, the SAT Release 1.1 RTVM incorrectly reported the requirements as having already been validated in CAP Release 1.0.

Since these system requirements were not successfully tested in the SAT Release 1.0 and not properly tracked for testing in the RTVM for Release 1.1, they may not be tested prior to deployment of the CAP.

Management Action:  According to the IRS, it conducted a subsequent review of the Release 1.0 test results and the Release 1.1 RTVM and found that 500 (approximately 18 percent) out of almost 2,800 rows contained inconsistent results.  This is consistent with the results of our sample.

We also performed additional reviews to determine the final status of the tested system requirements for the CAP Release 1.0.  Based on our analysis, we determined only 269 (62 percent) of the 435 tested system requirements were successfully completed and passed.  Of the 166 system requirements that did not successfully pass the SAT Release 1.0, 147 (34 percent of the total 435) were either waived or deferred; for the remaining 19 (4 percent of the 435), the test results were inadequate to support a conclusion.  The main purpose for developing the CAP Release 1.0 was to ensure all core system functions were operating adequately.  Subsequent subreleases were to deliver maintenance upgrades and enhancements to the deployed CAP.  However, the test results show not all the core system functions were tested as part of the Release 1.0 SAT.  Much of this core functionality was deferred to Release 1.1, which was not part of the scope of our review.

Figure 1:  SAT Release 1.0 Results (435 Requirements)

 

Figure 1 was removed due to its size.  To see Figure 1, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.

 

Of the 166 requirements that did not pass during testing, 116 were classified as critical.  Therefore, 116 (27 percent) of the 435 requirements, representing the IRS’ highest priority functions required from the CAP, were not verified by the SAT.
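
As a check on the figures above, the short sketch below recomputes the reported percentages from the counts stated in this report; the counts are taken from the text, and the rounding matches the report’s whole-percent presentation.

```python
# Recompute the reported SAT Release 1.0 percentages from the counts stated in the report.
total = 435
counts = {
    "passed": 269,
    "waived or deferred": 147,
    "inconclusive results": 19,
    "critical requirements not verified": 116,
}
for label, n in counts.items():
    print(f"{label}: {n} of {total} = {n / total:.0%}")
# passed: 269 of 435 = 62%
# waived or deferred: 147 of 435 = 34%
# inconclusive results: 19 of 435 = 4%
# critical requirements not verified: 116 of 435 = 27%
```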

Recommendations

To ensure an accurate requirements baseline is developed and maintained for future releases, the Chief Information Officer (CIO) should:

1.      Determine the system requirements that have been successfully deployed with the current CAP release and identify all remaining open requirements so they can be tested in future releases of the CAP.

Management’s Response:  The CIO stated the corrective action has been completed with a strengthened requirements management process and the contractor’s enhancements to the automated requirements tool, REDCOAT.  In addition, the CIO stated the SAT testing process has been altered to perform requirements validation immediately following the execution of each test script.

2.      Ensure the IRS and the CAP contractor implement appropriate requirements management practices, as required by the ELC, to adequately define, track, and report on system requirements for testing and delivery of future CAP releases.

Management’s Response:  The CIO stated the corrective action has been completed with a strengthened requirements management process to ensure that system requirements are defined, tracked, and reported.

The Main System Performance Requirement Will Not Be Tested Prior to Deployment

 

Software and hardware testing ensures a system meets functional and performance requirements and can be effectively used in its intended operational environment.  The testing process is a key management control for ensuring IRS executives have valid, credible information upon which to base their decisions for project deployments.  The purpose of performance testing is to demonstrate and ensure the system can operate and run at specified levels prior to deployment. 

The CAP baseline requirements include three performance requirements, consisting of the amount of time it should take to load data weekly into the CAP (the main performance requirement), create reports, and query the system.  We determined the main performance requirement was not scheduled for testing prior to deployment.

The CAP contractor documented that the main performance requirement could not be tested due to a lack of IRS software in the test environment.  In addition, the IRS stated the main performance requirement may have to be revised because the CAP cannot currently achieve the 55-hour data load performance level within the testing environment due to computer capacity issues.  Without testing performance requirements prior to deployment, the IRS will not have objective evidence with which to predict how the CAP will perform in the production environment.

Recommendation

To ensure the deployed CAP will perform as intended, the CIO should:

3.      Test the main CAP performance requirement as soon as possible.

Management’s Response:  The CIO stated the testing and verification of performance requirements were and continue to be part of planned deployment activities.  The CIO also stated that, during the Life Cycle Stage Review for the CAP Release 1.2 held November 9, 2004, the contractor presented information showing the amount of time to process a peak cycle will be 48.9 hours, well within the system requirement of 55 hours.
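
Taking the figures in the response at face value, the cited margin is straightforward to verify; the trivial check below uses only the two numbers stated above.

```python
# Check the cited margin: 55-hour load requirement versus the 48.9-hour projected peak cycle.
requirement_hours = 55.0
projected_hours = 48.9
print(f"margin: {requirement_hours - projected_hours:.1f} hours "
      f"({projected_hours / requirement_hours:.0%} of the window used)")
# margin: 6.1 hours (89% of the window used)
```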

Recommendations for Improvement Have Not Been Implemented

 

In mid-2003, the IRS and the PRIME contractor initiated four studies to help identify the root causes of the problems hindering the BSM effort and to make recommendations for remedying the problems identified.  Key IRS executives and stakeholders reviewed the results of the four studies and created actions to address the study recommendations.  These actions collectively became known as the BSM Challenges Plan.

One of the actions from the BSM Challenges Plan that extended to the CAP was the need to implement short duration “Tiger Teams.”  The Tiger Teams were to establish a forum for escalating issues, facilitate the rapid escalation of issues, and gain commitments from managers to address escalated issues quickly.

The BSMO initiated three separate internal reviews, led by Tiger Teams, to conduct studies on the testing, data quality, and engineering aspects of the CAP.  The engineering team was initiated as a result of the BSM Challenges Plan, while the testing and data quality teams were initiated by the IRS and the CAP contractor for process improvements. 

The Tiger Teams made 24 recommendations for improvement (e.g., establishing Milestone 5 exit criteria and implementing data recovery procedures).  However, the BSMO and the CAP contractor had implemented only 2 of the 24 recommendations by the end of our audit work.  The BSMO and the CAP contractor plan to implement the remaining 22 recommendations as part of a new contract, which the IRS intends to award as a fixed-price contract.  Fixed-price contracts for development work can balance the financial risk between the Federal Government and the CAP contractor, and such a contract could reduce the risk of the substantial cost overruns that have occurred in the past.  The CIO has stated the IRS would ensure capped or fixed-price contracts are used for acquisition contracts.

The IRS and the CAP contractor began working toward a fixed-price contract in February 2004.  As of the end of our audit work, the new contract that will include the implementation of the approved internal review recommendations had not been negotiated.  One reason for the delay in completing the fixed-price contract is the CAP contractor’s concern that the IRS continues to change the scope of the project.  According to our analysis, 35 (56 percent) of the 63 CRs that have occurred throughout the development of the CAP were due to requirement errors.  See Appendix VI for details of our analysis. 

Delays in completing a new contract for the CAP will delay implementation of the approved internal review recommendations, and the BSMO could continue to experience problems developing and deploying future CAP releases.  Additionally, the IRS will not be able to gain value from the studies and achieve the goals of the BSM Challenges Plan (i.e., rapid escalation and resolution of issues hindering the BSM effort).

Recommendation

To improve testing, data quality, and engineering aspects of the CAP, the CIO should:

4.      Ensure approved internal review recommendations are implemented as soon as possible.

Management’s Response:  The CIO stated all but one of the approved recommendations have been completed or closed. 

 

Appendix I

 

Detailed Objective, Scope, and Methodology

 

The overall objective of this review was to determine the status of the Internal Revenue Service’s and the contractors’ readiness to deliver the Custodial Accounting Project (CAP) Release 1.0.  To achieve our objective, we:

I. Determined if the System Acceptability Test (SAT) results could be relied upon to provide assurance that the CAP would function correctly for the end users.

A. Traced CAP Release 1.0 system requirements to SAT test cases.  We captured changes to requirements by reviewing the total population of 63 change requests.

B. Determined the status of waived system integration test (SIT) cases.

C. Determined if the CAP system requirements were tested as planned by selecting and reviewing a judgmental sample of 23 CAP Release 1.0 requirements from a population of 435 requirements (approximately 5.3 percent of the total population) that were scheduled for testing during the Release 1.0 SAT.  We used a judgmental sample because we did not plan on projecting the results.

D. Expanded the CAP system requirements testing by selecting and reviewing a second judgmental sample of 7 CAP Release 1.0 requirements from a population of 435 (approximately 1.6 percent of the total population) that were scheduled for testing during the Release 1.0 SAT.  We used a judgmental sample because we did not plan on projecting the results.

E. Determined the status of Release 1.0 system requirements by reviewing the results of the total population of 435 tested system requirements.

II. Determined if the CAP contractor had adequate procedures to manage data loads.

A. Reviewed problems encountered during the data load process.

B. Documented the cause and effect of the problems identified in Step II.A.

C. Reviewed the data load process plans.

III. Reviewed “Tiger Team” reports for the likelihood of a successful CAP deployment.

A. Reviewed the findings and recommendations included in each report.

B. Evaluated justification for approving or rejecting the Tiger Team recommendations.

C. Determined the implementation status of the Tiger Team recommendations.

IV. Reviewed the Release System Integration Test (RSIT) plans and results.

A. Reviewed the RSIT plan and conducted interviews of the PRIME contractor.

B. Obtained and reviewed the results of all RSIT cases.

 

Appendix II

 

Major Contributors to This Report

 

Margaret E. Begg, Assistant Inspector General for Audit (Information Systems Programs)

Gary Hinkle, Director

Troy Paterson, Audit Manager

Phung Nguyen, Lead Auditor

James Douglas, Senior Auditor

Wallace Sims, Senior Auditor

Louis Zullo, Senior Auditor

 

Appendix III

 

Report Distribution List

 

Commissioner  C

Office of the Commissioner – Attn: Chief of Staff  C

Deputy Commissioner for Operations Support  OS

Associate Chief Information Officer, Business Systems Modernization  OS:CIO:B

Deputy Associate Chief Information Officer, Business Integration  OS:CIO:B:BI

Deputy Associate Chief Information Officer, Program Management  OS:CIO:B:PM

Deputy Associate Chief Information Officer, Systems Integration  OS:CIO:B:SI

Director, Stakeholder Management  OS:CIO:SM

Chief Counsel  CC

National Taxpayer Advocate  TA

Director, Office of Legislative Affairs  CL:LA

Director, Office of Program Evaluation and Risk Analysis  RAS:O

Office of Management Controls  OS:CFO:AR:M

Audit Liaisons: 

Associate Chief Information Officer, Business Systems Modernization  OS:CIO:B

Manager, Program Oversight Office  OS:CIO:SM:PO

 

Appendix IV

 

Subreleases for the Custodial Accounting Project Release 1

 

Releases 1 and 2 of the Custodial Accounting Project (CAP) will provide for a single, integrated data repository of taxpayer account data, which includes detailed taxpayer account history and unpaid assessment information.  Release 1 will consist of data from the Individual Master File (IMF) and the Customer Account Data Engine (CADE).  Release 1 has been divided into six subreleases. 

Release 1.0 – This release includes the CAP core functionality and an interface with the modernized infrastructure.  The Internal Revenue Service (IRS) and the CAP contractor have completed all major test activities for Release 1.0.  However, Release 1.0 was not deployed as originally planned but was, instead, incorporated into Release 1.1.

Release 1.1 – This release includes additional reporting capabilities, which were not included in Release 1.0, and changes to tax laws for 2004.  Release 1.1 was deployed in late summer 2004.

Release 1.2 – This release will include the CADE/IMF/CAP interface and the midyear 2004 tax changes.  CADE data will reach the CAP through the IMF; there will not be a direct interface between the CADE and the CAP in this release.  Release 1.2 should be completed in November 2004.

Release 1.3 – This release will include audit capabilities and 2005 tax year changes.

Release 1.4 – This release will include the midyear 2005 tax year changes.

Release 1.5 – This release will include the direct interface between the CADE and the CAP, as well as an interface with the Payment and Claims Enhancement Reconciliation system and 2006 tax year changes.

 

Appendix V

 

Enterprise Life Cycle Overview

 

The Enterprise Life Cycle (ELC) defines the processes, products, techniques, roles, responsibilities, policies, procedures, and standards associated with planning, executing, and managing business change.  It includes redesign of business processes; transformation of the organization; and development, integration, deployment, and maintenance of the related information technology applications and infrastructure.  Its immediate focus is the Internal Revenue Service (IRS) Business Systems Modernization (BSM) program.  Both the IRS and the PRIME contractor must follow the ELC in developing/acquiring business solutions for modernization projects.

The ELC framework is a flexible and adaptable structure within which one plans, executes, and integrates business change.  The ELC process layer was created principally from the Computer Sciences Corporation’s Catalyst® methodology.  It is intended to improve the acquisition, use, and management of information technology within the IRS; facilitate management of large-scale business change; and enhance the methods of decision making and information sharing.  Other components and extensions were added as needed to meet the specific needs of the IRS BSM program.

ELC Processes

A process is an ordered, interdependent set of activities established to accomplish a specific purpose.  Processes help to define what work needs to be performed.  The ELC methodology includes two major groups of processes:

Life-Cycle Processes, which are organized into phases and subphases and address all domains of business change.

Management Processes, which are organized into management areas and operate across the entire life cycle.

 

Enterprise Life-Cycle Processes

The chart was removed due to its size.  To see the chart, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.

Life-Cycle Processes

The life-cycle processes of the ELC are divided into six phases, as described below:

·         Vision and Strategy - This phase establishes the overall direction and priorities for business change for the enterprise.  It also identifies and prioritizes the business or system areas for further analysis.

·         Architecture - This phase establishes the concept/vision, requirements, and design for a particular business area or target system.  It also defines the releases for the business area or system.

·         Development - This phase includes the analysis, design, acquisition, modification, construction, and testing of the components of a business solution.  This phase also includes routine planned maintenance of applications.

·         Integration - This phase includes the integration, testing, piloting, and acceptance of a release.  In this phase, the integration team brings together individual work packages of solution components developed or acquired separately during the Development phase.  Application and technical infrastructure components are tested to determine whether they interact properly.  If appropriate, the team conducts a pilot to ensure all elements of the business solution work together.

·         Deployment - This phase includes preparation of a release for deployment and actual deployment of the release to the deployment sites.  During this phase, the deployment team puts the solution release into operation at target sites.

·         Operations and Support - This phase addresses the ongoing operations and support of the system.  It begins after the business processes and system(s) have been installed and have begun performing business functions.  It encompasses all of the operations and support processes necessary to deliver the services associated with managing all or part of a computing environment.

The Operations and Support phase includes the scheduled activities, such as planned maintenance, systems backup, and production output, as well as the nonscheduled activities, such as problem resolution and service request delivery, including emergency unplanned maintenance of applications.  It also includes the support processes required to keep the system up and running at the contractually specified level.

Management Processes

Besides the life-cycle processes, the ELC also addresses the various management areas at the process level.  The management areas include:

·         IRS Governance and Investment Decision Management - This area is responsible for managing the overall direction of the IRS, determining where to invest, and managing the investments over time.

·         Program Management and Project Management - This area is responsible for organizing, planning, directing, and controlling the activities within the program and its subordinate projects to achieve the objectives of the program and deliver the expected business results.

·         Architectural Engineering/Development Coordination - This area is responsible for managing the technical aspects of coordination across projects and disciplines, such as managing interfaces, controlling architectural changes, ensuring architectural compliance, maintaining standards, and resolving issues.

·         Management Support Processes - This area includes common management processes, such as quality management and configuration management, that operate across multiple levels of management.

Milestones

The ELC establishes a set of repeatable processes and a system of milestones, checkpoints, and reviews that reduce the risks of systems development, accelerate the delivery of business solutions, and ensure alignment with the overall business strategy.  The ELC defines a series of milestones in the life-cycle processes.  Milestones provide for “go/no-go” decision points in the project and are sometimes associated with funding approval to proceed.  They occur at natural breaks in the process where there is new information regarding costs, benefits, and risks and where executive authority is necessary for next phase expenditures.

There are five milestones during the project life cycle: 

·         Milestone 1 - Business Vision and Case for Action.  In the activities leading up to Milestone 1, executive leadership identifies the direction and priorities for IRS business change.  These guide which business areas and systems development projects are funded for further analysis.  The primary decision at Milestone 1 is to select BSM projects based on both the enterprise-level Vision and Strategy and the Enterprise Architecture.

·         Milestone 2 - Business Systems Concept and Preliminary Business Case.  The activities leading up to Milestone 2 establish the project concept, including requirements and design elements, as a solution for a specific business area or business system.  A preliminary business case is also produced.  The primary decision at Milestone 2 is to approve the solution/system concept and associated plans for a modernization initiative and to authorize funding for that solution.

·         Milestone 3 - Business Systems Design and Baseline Business Case.  In the activities leading up to Milestone 3, the major components of the business solution are analyzed and designed.  A baseline business case is also produced.  The primary decision at Milestone 3 is to accept the logical system design and associated plans and to authorize funding for development, test, and (if chosen) pilot of that solution.

·         Milestone 4 - Business Systems Development and Enterprise Deployment Decision.  In the activities leading up to Milestone 4, the business solution is built.  The system is integrated with other business systems and tested, piloted (usually), and prepared for deployment.  The primary decision at Milestone 4 is to authorize the release for enterprise-wide deployment and commit the necessary resources.

·         Milestone 5 - Business Systems Deployment and Postdeployment Evaluation.  In the activities leading up to Milestone 5, the business solution is fully deployed, including delivery of training on use and maintenance.  The primary decision at Milestone 5 is to authorize the release of performance-based compensation based on actual, measured performance of the business system.

 

Appendix VI

 

Custodial Accounting Project System Requirements Changes

 

A detailed analysis of the 63 Custodial Accounting Project change requests (CR) we were provided showed the majority of CRs were initiated to fix and adjust for requirement errors.  The remaining CRs were due to tax processing changes or miscellaneous errors.

Figure 1:  Reasons for the 63 CRs

 

Figure 1 was removed due to its size.  To see Figure 1, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.

 

Further analysis of the 63 CRs showed they made 2,234 individual changes to system requirements, broken down as follows.

 

Figure 2:  Breakdown of 2,234 changes from the 63 CRs

 

Figure 2 was removed due to its size.  To see Figure 2, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.

 

Appendix VII

 

Management’s Response to the Draft Report

 

The response was removed due to its size.  To see the response, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.