TREASURY INSPECTOR GENERAL FOR TAX ADMINISTRATION

 

Further Enhancements to the Guidance for Testing Practices Will Help Ensure the Quality of Modernization Projects

 

 

 

March 2006

 

Reference Number:  2006-20-051

 

 

This report has cleared the Treasury Inspector General for Tax Administration disclosure review process, and information determined to be restricted from public release has been redacted from this document.

 

Phone Number   |  202-927-7037

Email Address   |  Bonnie.Heald@tigta.treas.gov

Web Site           |  http://www.tigta.gov

 

March 17, 2006

 

 

MEMORANDUM FOR CHIEF INFORMATION OFFICER

                                        

FROM:                            Michael R. Phillips /s/ Michael R. Phillips

                                         Deputy Inspector General for Audit

 

SUBJECT:                    Final Audit Report – Further Enhancements to the Guidance for Testing Practices Will Help Ensure the Quality of Modernization Projects (Audit # 200520019)

 

This report presents the results of our review of the Internal Revenue Service’s (IRS) Business Systems Modernization (BSM) program testing processes.  The overall objective of this review was to determine whether the BSM program testing processes are conducted efficiently and effectively and are ensuring the testing of appropriate system requirements.  As part of this audit, we followed up on Treasury Inspector General for Tax Administration audit recommendations about the effectiveness of the BSM program testing activities reported since issuance of our report entitled Testing Practices for Business Systems Modernization Projects Need Improvement.[1]

Synopsis

The BSM program is a complex effort to modernize the IRS’ technology and business processes.  According to the IRS, this effort involves integrating thousands of hardware and software components over 15 years with PRIME contractor[2] costs of over $8 billion.  As the BSM projects progress through development and deployment, they undergo various testing processes to ensure they meet functional and performance specifications and can be effectively used in their intended operational environment.

The Business Systems Modernization Office (BSMO) and the PRIME contractor are primarily responsible for ensuring BSM projects have been adequately tested and the projects perform as expected.  The BSMO and the PRIME contractor developed a systems development methodology called the Enterprise Life Cycle (ELC)[3] that provides guidance and detailed processes to be followed by project teams working on BSM projects.  Included in the ELC is a process description which details processes for designing, developing, and testing BSM projects.

We commented on the effectiveness of the BSM program testing activities in a report entitled Testing Practices for Business Systems Modernization Projects Need Improvement.  Since the issuance of the September 2003 audit report, the BSMO has taken actions to provide specific guidance to ensure the testing of project requirements will meet business needs.  We analyzed the findings with testing concerns from our audit reports issued since the September 2003 report and determined the guidance updates issued between February and May 2005 adequately address those testing process concerns.

Although the BSMO has updated its testing guidance, it can further improve testing procedures and testing activities to help ensure development of quality modernization projects.  Our reviews of a sample of modernization project test results folders[4] found documentation of testing results and related management reviews needs improvement.  Our assessment of the current BSMO testing procedures found additions and clarification to testing guidance can help ensure the quality of modernization projects, revisions to the test readiness review procedure can help ensure its effectiveness, and clarification of participants’ roles and responsibilities can make testing activity performance more efficient.

While the Office of Release Management has made significant efforts to address and document the testing activities performed by the BSMO, it did not identify all the necessary guidance to document the testing responsibilities.  Including additional detail in the testing guidance will also allow for efficiency in conducting testing by reducing the need to interpret testing process requirements.  Adding specific guidance about documenting testing activities and results allows for a readily available historical reference for analyses that may be needed.  Additionally, without detailed procedures describing all of the roles and responsibilities for conducting the testing activities, the testing procedures will not be able to provide a consistent process for BSM employees to effectively perform, monitor, and control the testing activities within the BSMO.  This could affect the efficiency and effectiveness of testing activities.

Recommendations

To help ensure adequate documentation and review of testing results, we recommended the Chief Information Officer direct the Office of Release Management to establish guidance requiring the BSMO’s Acquisition Project Managers to certify test results folders are complete and reviewed as part of the issuance of related end of test reports.  This guidance should be specified in revisions/updates to the testing procedures.  To help ensure a consistent process to effectively monitor and control the testing activities within the BSMO, the Chief Information Officer should direct the Office of Release Management to provide additional detail to the testing activities within the BSMO, designate the use of the Test Readiness Checklist as mandatory and update the Checklist to include all appropriate questions, and amend the testing procedures to specify participants’ roles in testing and clarify their related scope of responsibility.  Appendix VII provides specific suggestions to provide additional guidance or clarification to existing guidance in the testing procedures.

Response

IRS management agreed with all of our recommendations.  To implement the corrective actions, the Technology Release Management organization will form a working group of stakeholders to review existing test guidance and the suggestions provided in Appendix VII of this report.  The recommendations of the working group will be vetted through the formal IRS document review processes, and revised guidance will be posted to the Process Asset Library after executive approval.  Management’s complete response to the draft report is included as Appendix VIII.

Copies of this report are also being sent to the IRS managers affected by the report recommendations.  Please contact me at (202) 622-6510 if you have questions, or Margaret E. Begg, Assistant Inspector General for Audit (Information Systems Programs), at (202) 622-8510.

 

 

Table of Contents

 

Background

Results of Review

The Business Systems Modernization Office Has Taken Significant Actions to Provide Project Testing Guidance

Documentation of Testing Results and Related Management Reviews Needs Improvement

Recommendation 1:

Additions and Clarification to Testing Guidance Can Help Ensure the Quality of Modernization Projects

Recommendation 2:

Revisions to the Test Readiness Review Procedure Can Help Ensure Its Effectiveness

Recommendation 3:

Clarifying Participants’ Roles and Responsibilities Can Make Testing Activity Performance More Efficient

Recommendation 4:

Appendices

Appendix I – Detailed Objective, Scope, and Methodology

Appendix II – Major Contributors to This Report

Appendix III – Report Distribution List

Appendix IV – Enterprise Life Cycle Overview

Appendix V – Business Systems Modernization Office Testing Procedures

Appendix VI – Analysis of Testing Procedures Issued by the Business Systems Modernization Office

Appendix VII – Suggested Additions/Clarification to the Business Systems Modernization Office Testing Procedures

Appendix VIII – Management’s Response to the Draft Report

 

 

Background

 

The Business Systems Modernization (BSM) program is a complex effort to modernize the Internal Revenue Service’s (IRS) technology and business processes.  According to the IRS, this effort involves integrating thousands of hardware and software components over 15 years with PRIME contractor[5] costs of over $8 billion.  As the BSM projects progress through development and deployment, they undergo various testing processes to ensure they meet functional and performance specifications and can be effectively used in their intended operational environment.  Testing of new hardware and software is often the last opportunity for IRS executives and project managers to ensure BSM projects meet requirements and expectations before they become operational.  These testing processes are a key management control for ensuring IRS executives have valid credible information upon which to base their decisions for modernization project investments.

The Business Systems Modernization Office (BSMO) and the PRIME contractor are primarily responsible for ensuring BSM projects have been adequately tested and the projects perform as expected.  The BSMO and the PRIME contractor developed a systems development methodology called the Enterprise Life Cycle (ELC)[6] that provides guidance and detailed processes to be followed by project teams working on BSM projects.  Included in the ELC is a process description which details processes for designing, developing, and testing BSM projects.

The testing process is designed to detect errors in both software and hardware before a system is made operational.

The testing process includes several subphases:

·         The Combined System Integration Test is conducted in the Enterprise Integration and Test Environment[7] and consists of activities designed to verify the quality of the integrated system (i.e., verify the system is integrated into other IRS systems properly and functions as required).  The Combined System Integration Test is performed by the PRIME contractor with participation by the IRS in a joint effort.

·         The Government Acceptance Test is conducted in the Enterprise Integration and Test Environment and consists of activities designed to independently verify aspects of the capabilities tested during the Combined System Integration Test.  The Government Acceptance Test provides the system owner with an independent assessment of the system to determine its fitness for implementation.  The Government Acceptance Test is performed by the IRS’ Product Assurance office when the project meets its priority criteria.

·         The Final Integration Test is conducted in the Enterprise Integration and Test Environment with a copy of production data and consists of activities designed to ensure the interoperability of systems.  This activity is the final step prior to the application release into production.  The Final Integration Test is performed by the Product Assurance office.

·         The Deployment Site Readiness Test is conducted in the IRS production environment and consists of activities designed to ensure the deployed application works in the production environment as intended.  The Deployment Site Readiness Test is performed by the PRIME contractor.

In addition to the above testing subphases, testing may also include the following test activities:

·         The Application Regression Test appears in many ELC test plans.  The purpose of regression testing is to ensure a modified system still works.  Whereas other testing focuses on testing whether the new and modified functionality works properly, regression testing focuses on testing the integrity of previously tested portions of the system (a minimal illustration follows this list).

·         Post Initial Operational Capability[8] Testing has been conducted on projects when testing activities were to be completed after the initial operational capability date.  For example, on one project, testing needed to be run by the customer during production.  On another project, the IRS decided to place the application into operation before completing all testing and to complete the remaining testing after the application was in operation.
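As a purely illustrative aside, a minimal regression test sketch (pytest style, written in Python) is shown below; the function, tests, and figures are hypothetical and are not drawn from any ELC test plan.  The pre-existing test is re-run unchanged after the code is modified, which is the essence of the regression testing described above, while the second test covers the newly added functionality.

# Minimal, hypothetical regression-test sketch (pytest style).
# The function and figures are invented for illustration only.

def apply_penalty(balance, rate=0.05):
    """Apply a late-payment penalty to an account balance."""
    return round(balance * (1 + rate), 2)

# Pre-existing test, re-executed unchanged after every modification
# (the regression test: confirms previously working behavior still works).
def test_apply_penalty_default_rate():
    assert apply_penalty(100.00) == 105.00

# New test covering the newly added optional rate (new-functionality test).
def test_apply_penalty_custom_rate():
    assert apply_penalty(100.00, rate=0.10) == 110.00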

The testing process description also includes guidance for performing the systems evaluation process.  Included are criteria for skills and abilities of the individuals performing the testing and task descriptions for test planning, performance, and closeout.

This review was performed at the BSMO facilities in New Carrollton, Maryland, during the period August through December 2005.  The audit was conducted in accordance with Government Auditing Standards.  Detailed information on our audit objective, scope, and methodology is presented in Appendix I.  Major contributors to the report are listed in Appendix II.

 

Results of Review

 

The Business Systems Modernization Office Has Taken Significant Actions to Provide Project Testing Guidance

We commented on the effectiveness of the BSM program testing activities in a report entitled Testing Practices for Business Systems Modernization Projects Need Improvement.[9]  The report included the following findings:

·         Insufficient testing plans and facilities contributed to project delays.  All necessary details or procedures were not developed and followed to properly conduct performance testing, and the capacity of the test lab was inadequate to support the BSM program.

·         Testing activities were not completed before putting systems into operation.  Testing was not always performed to assure the adequacy of project capabilities, hardware upgrades, and system performance.

·         Failed tests were not being properly resolved during project development.  Documentation was not adequate in defect reports[10] to describe actions to resolve failed tests, and guidance was insufficient to direct closure of defect reports for failed tests.

·         Security test documentation was not completed before systems were placed into operation.

Since the issuance of the September 2003 audit report, the BSMO has taken actions to provide specific guidance to ensure the testing of project requirements will meet business needs.  Further, we analyzed the findings with testing concerns from our audit reports issued since the September 2003 report and determined the guidance adequately addresses those concerns.

The testing guidance updates were issued between February and May 2005 and include procedures for:

·         Test planning and test design.

·         Requirements review.

·         Test readiness reviews.

·         Defect Review Board.

·         Test case deferral and waiver.

·         Test folders and records.

·         Test completion reporting.

Appendix V provides a synopsis of the guidance issued.  Appendix VI presents analyses of our audit report findings and the BSMO’s subsequently issued guidance.

The Filing and Payment Compliance Project’s Release 1.1 test case scenarios include all identified project requirements

In response to a request by the Associate Chief Information Officer, BSM, we reviewed the adequacy of test case documentation for the Filing and Payment Compliance Project’s Release 1.1.[11]  Our analysis showed identified project requirements were traced from the System Requirements Report[12] to the project’s Requirements Traceability Verification Matrix.[13]  Successful tracing of project requirements indicates the testing should include all identified project requirements.
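The tracing described above amounts to a completeness check between two artifacts: every requirement identified in the System Requirements Report must appear in the Requirements Traceability Verification Matrix and map to at least one test case.  The following Python sketch is purely illustrative; the requirement identifiers, data structures, and helper function are hypothetical and do not represent the project's actual tools or data.

# Illustrative sketch only: verify that every requirement from a requirements
# report appears in a traceability matrix and is mapped to at least one
# test case.  All identifiers and structures are hypothetical.

def find_untraced_requirements(report_requirements, traceability_matrix):
    """Return requirement IDs missing from the matrix or mapped to no test case."""
    untraced = []
    for req_id in report_requirements:
        test_cases = traceability_matrix.get(req_id)
        if not test_cases:          # absent, or present with an empty mapping
            untraced.append(req_id)
    return untraced

if __name__ == "__main__":
    report_requirements = ["REQ-001", "REQ-002", "REQ-003"]
    traceability_matrix = {
        "REQ-001": ["TC-010", "TC-011"],
        "REQ-002": [],              # traced but mapped to no test case
    }
    print(find_untraced_requirements(report_requirements, traceability_matrix))
    # ['REQ-002', 'REQ-003']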

Documentation of Testing Results and Related Management Reviews Needs Improvement

The BSMO Test Folders and Records Procedure defines the process for documenting and verifying test results folders.  This procedure applies to all BSMO projects in which a Combined Systems Integration Test, Government Acceptance Test, or Deployment Site Readiness Test is performed.  The BSMO’s Acquisition Project Manager is responsible for ensuring the activities to support the testing subphases are performed.

The test results folder is the documentation maintained by the testing organization that contains all the information pertinent to the test case executed.  It ensures access to completed test results for a historical reference.  The items for inclusion in the test results folder are detailed in the Test Folders and Records Procedure and include the test script,[14] Test Case Summary Certification Report (Form 6503), test artifacts (test result record), and related defect reports.
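Because the procedure enumerates the items each folder must contain, completeness can be checked mechanically.  The Python sketch below is a hypothetical illustration of such a check; the item names mirror the procedure's list, but the data structure and function are assumptions for illustration and are not part of the BSMO procedure.

# Hypothetical illustration: check that a test results folder contains every
# item required by the Test Folders and Records Procedure.  A folder is
# represented here simply as a mapping of item name -> present (True/False).

REQUIRED_ITEMS = (
    "test script",
    "Form 6503",        # Test Case Summary Certification Report
    "test artifact",    # test result record
)

def missing_items(folder, has_defects=False):
    """Return the required items absent from a test results folder.

    When the test case produced defects, the related defect reports must be
    included in the folder as well.
    """
    required = list(REQUIRED_ITEMS)
    if has_defects:
        required.append("defect reports")
    return [item for item in required if not folder.get(item)]

if __name__ == "__main__":
    folder = {"test script": True, "Form 6503": True, "test artifact": False}
    print(missing_items(folder, has_defects=True))
    # ['test artifact', 'defect reports']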

We reviewed a sample of test results folders for the e-Services,[15] Modernized e-File (MeF),[16] and Customer Account Data Engine (CADE)[17] Projects.  The MeF Project’s test results folders included the test script with test results for each step.  In addition, the MeF Project’s test results folders with defect reports included the defect report number on the initial artifact and the updated artifact to show the defect was fixed.

However, our reviews showed the e-Services and CADE Projects’ test results folders did not include all required documentation, and folder reviews were not always performed.  Table 1 summarizes deficiencies in the test results folder documentation.

Table 1:  Sample Review Results of Test Results Folders

Test Results Folders Include:              e-Services*    CADE**      MeF***

Test Script                                  0 of 29      17 of 17    20 of 20
Test Completion Date                        24 of 29       0 of 17    20 of 20
Review Date                                  0 of 29       0 of 17    20 of 20
Current Version of Form 6503                 0 of 29      17 of 17    20 of 20
Test Artifact                               29 of 29      15 of 17    20 of 20
Test Result on Form 6503                    29 of 29       8 of 17    20 of 20
Documentation of Test Variances             29 of 29       9 of 17    20 of 20
Tester’s Name                               29 of 29      15 of 17    20 of 20
Documentation of Review                     29 of 29      11 of 17    20 of 20
Information on Related Defect Reports        1 of 11       2 of 5      3 of 3

Source:  The e-Services Project Release 2.2 and CADE Project Release 1.3.1 Combined System Integration Tests, and the MeF Project Release 3.1 Deployment Site Readiness Test.

* We requested test results folders for 30 sampled test cases from the e-Services Project Release 2.2 Combined System Integration Test.  One test case was deferred and a test results folder was not maintained.

** We requested test results folders for 20 sampled test cases from the CADE Project Release 1.3.1 Combined System Integration Test.  Two test cases were deferred and test results folders were not maintained.  One folder did not include the Form 6503 for our analysis.

*** We requested and received all test results folders for 20 sampled test cases from the MeF Project Release 3.1 Deployment Site Readiness Test.

Although the project teams were generally able to provide the documentation required by the Test Folders and Records Procedure, the documentation did not always reside in the test results folder, or did not specifically meet the form described by the procedure.  For example, the e-Services Project relied on the PRIME contractor to maintain the record of the test case completion and test case review dates.  However, the records maintained by the PRIME contractor did not include the required information.

As detailed later in this report, additional detail and clarity in the Test Folders and Records Procedure will help to ensure adequate test results folder documentation is maintained.  The e-Services and CADE Projects’ Acquisition Project Managers did not ensure all appropriate testing documentation was maintained in these project files.  Currently, this procedure does not require the Acquisition Project Manager to verify all test results folders are complete and reviewed.

Without complete test results folder documentation, there may not be sufficient evidence showing the system being developed was adequately tested.  In addition, the project will not have a complete record to support the testing of the requirements of the system being developed or a historical reference for subsequent inquiries about test results.

Recommendation

Recommendation 1:  The Chief Information Officer (CIO) should direct the Office of Release Management to establish guidance requiring Acquisition Project Managers to certify test results folders are complete and reviewed as part of the issuance of related end of test reports.  This guidance should be specified in revisions/updates to the testing procedures.

Management’s Response:  The IRS agreed with this recommendation.  The Technology Release Management organization will form a working group of stakeholders to review existing test guidance, including the existing Test Folders and Records Procedure and the Test Completion Reporting Procedure.  The recommendations of the working group will be vetted through the formal IRS document review processes and revised guidance will be posted to the Process Asset Library after executive approval.

Additions and Clarification to Testing Guidance Can Help Ensure the Quality of Modernization Projects

Our reviews of the testing processes show the IRS monitors the contractor’s Applications Qualification Testing.[18]  The IRS activities generally begin subsequent to the successful completion of Applications Qualification Testing.  The IRS testing activities assess the adequacy of the application’s integration and performance in the IRS’ operating environment.  These assessments are accomplished through the Combined System Integration Test, Government Acceptance Test, Final Integration Test, and Deployment Site Readiness Test.

As discussed above, the BSMO has taken actions to provide specific guidance to ensure the testing of project requirements will meet the business’ needs.  Our analysis of the BSMO’s testing guidance shows further enhancements can be made in the following procedures to help ensure the quality of BSM projects.

  • Test Planning and Test Design Procedure – This procedure can be improved by providing specific references to standards to be used in test planning, defining specific procedures for problem reporting and escalation, and establishing procedures for a peer or project review of the test cases and test scripts to ensure their adequacy.
  • Test Folders and Records Procedure – This procedure can be improved by providing specific requirements for the scope, documentation, and completion of the technical review of test artifacts, technical verification of test results, and test results folder document verification.

Because different versions of the Form 6503 are in use, the procedure should specify the version of the Form to be used.

Although the Test Folders and Records Procedure requires test results folders be maintained and controlled until the end of test script execution and test results verification, there is no guidance for the retention location or duration for test results folders after test completion.

  • Test Completion Reporting Procedure – This procedure can be improved by specifying the required elements of the Final Test Completion Report and the actions necessary if the report is not issued within its 30-day requirement.

The Test Completion Reporting Procedure requires the disposition of all unresolved severity 3 and 4 defect reports[19] be reviewed for impact, with an action plan specifying the proposed resolution.  However, the procedure does not specify what elements should be included in the action plan, the steps to follow to resolve the defect reports, or the actions to take if a defect cannot be resolved.  Also, the procedure does not describe how to report a project’s post initial operating capability testing results.

  • Defect Review Board Procedure – This procedure includes steps to identify defect reports from previous tests, review current defect reports and agree on their severity levels, and establish target dates for fixing the defects.  However, the procedure can be improved by providing direction on the actions the Defect Review Board should take when prior defect reports are identified, how decisions about changes to a defect’s severity level should be documented, and what to do when defects are not resolved by the established target dates (a minimal illustration of such follow-up appears after this list).
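As a hypothetical illustration of the follow-up such direction could support (not part of the BSMO Defect Review Board Procedure), the Python sketch below flags unresolved defect reports whose target fix dates have passed; the record layout, severity values, and dates are assumptions made for the example.

# Illustrative sketch only: flag unresolved defect reports whose target fix
# dates have passed.  The record fields and dates are invented for the example.
from datetime import date

def overdue_defects(defect_reports, as_of=None):
    """Return unresolved defects whose target date is earlier than as_of."""
    as_of = as_of or date.today()
    return [d for d in defect_reports
            if not d["resolved"] and d["target_date"] < as_of]

if __name__ == "__main__":
    defects = [
        {"id": "DR-101", "severity": 3, "resolved": False,
         "target_date": date(2005, 11, 1)},
        {"id": "DR-102", "severity": 4, "resolved": True,
         "target_date": date(2005, 10, 15)},
    ]
    for d in overdue_defects(defects, as_of=date(2005, 12, 1)):
        print(f"{d['id']} (severity {d['severity']}) is past its target date")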

Including additional detail in the testing guidance will also allow for improved efficiency in conducting testing by reducing the need for each project to interpret testing requirements.  Adding specific guidance about documenting testing activities and results also ensures more readily available historical information for future analyses.

Recommendation

Recommendation 2:  The CIO should direct the Office of Release Management to provide additional detail in the testing procedures to help ensure a consistent process to effectively monitor and control the testing activities within the BSMO.  Appendix VII provides specific suggestions to provide additional guidance or clarification to existing guidance in the testing procedures.

Management’s Response:  The IRS agreed with this recommendation.  The Technology Release Management organization will form a working group of stakeholders to review existing test guidance and the suggestions provided in Appendix VII of this report and provide additional detail to the testing procedures within the Modernization and Information Technology Services organization.  The recommendations of the working group will be vetted through the formal IRS document review processes and revised guidance will be posted to the Process Asset Library after executive approval.

Revisions to the Test Readiness Review Procedure Can Help Ensure Its Effectiveness

A Test Readiness Review is a formal lifecycle review conducted to assess the readiness of the test team, test environment, development team, and supporting organizations to begin testing.  The decision to proceed with the next phase of testing is made based on a successful Test Readiness Review outcome.  The projects we reviewed are using the Test Readiness Review Checklist.  However, the projects are using different versions of the Checklist (one version was published in the Internal Revenue Manual and the other version was published in a BSMO procedure).  Further, in conducting the reviews the project teams have removed some of the questions from the Checklist and added other questions.

Examples of questions from the Checklist that were not included in some of the projects’ Test Readiness Reviews include:

·         Has each organization designated personnel to assist/support with the execution of the test procedures?

·         Has a complete list of documentation (with locations) been provided to the test team?

·         Have all defects for the prior test phase been resolved, verified, and either closed or approved for deferral/waiver?

Examples of questions that were added to the Checklist, but were not included in the Test Readiness Review procedure include:

·         Has all required data been created and deemed ready for use?

·         Have all limitations to, or constraints upon, planned testing activities been documented?

·         Has environment readiness been validated?

The use of the Test Readiness Review Checklist in the BSMO procedure is not mandated.  Each project team determines which questions to include in the Checklist.  The Product Assurance office uses the Internal Revenue Manual in its testing activities, which includes a different Test Readiness Review Checklist.

While the Office of Release Management has made significant efforts to address and document the testing activities performed by the BSMO, it did not identify all necessary guidance to perform and document the testing activities.  Without requiring the use of the Checklist provided in the Test Readiness Review Procedure and periodically updating it as modifications are identified, projects may not consider all appropriate questions in determining test readiness.  As a result, a testing phase could begin prematurely and testing delays could occur because all items were not addressed to help ensure efficient and effective testing activities.
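If completion of the Checklist were mandatory, readiness could be gated on every question receiving an answer or an explicit “not applicable” designation.  The Python sketch below illustrates such a gate; the questions are examples taken from this report, but the data structure and rule are assumptions for illustration rather than the actual Checklist.

# Hypothetical sketch: a Test Readiness Review passes only when every
# Checklist question is answered "yes" or explicitly marked "not applicable".

ALLOWED_ANSWERS = {"yes", "not applicable"}

def ready_to_test(checklist):
    """Return (ready, open_questions) for a {question: answer} checklist."""
    open_questions = [q for q, answer in checklist.items()
                      if (answer or "").strip().lower() not in ALLOWED_ANSWERS]
    return (not open_questions, open_questions)

if __name__ == "__main__":
    checklist = {
        "Has all required data been created and deemed ready for use?": "yes",
        "Have all defects for the prior test phase been resolved, verified, "
        "and either closed or approved for deferral/waiver?": "",
        "Has environment readiness been validated?": "not applicable",
    }
    ready, unanswered = ready_to_test(checklist)
    print("Proceed with testing:", ready)        # False
    print("Unanswered questions:", unanswered)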

Recommendation

Recommendation 3:  The CIO should direct the Office of Release Management to review and update the Test Readiness Review Checklist to include all appropriate questions.  The completion of the Checklist should be designated as a mandatory activity and all questions from the Checklist should be answered or noted as “not applicable.”

Management’s Response:  The IRS agreed with this recommendation.  The Technology Release Management organization will form a working group of stakeholders to review existing test guidance, including the existing Test Readiness Review Procedure.  The recommendations of the working group will be vetted through the formal IRS document review processes and revised guidance will be posted on the Process Asset Library after executive approval.

Clarifying Participants’ Roles and Responsibilities Can Make Testing Activity Performance More Efficient

The BSMO testing procedures do not always clearly define the roles and responsibilities of all the participants in the testing activities.  The Test Lead or Test Manager is usually identified in the testing procedures as the official responsible for performing the testing activities with the Acquisition Project Manager, Office of Release Management, business organization, contractor, or other Modernization and Information Technology Services organization either contributing or providing support to the testing activity.  However, these testing procedures did not always provide specific detail about what these support activities should include.

·         Test Planning and Test Design Procedure – The procedure is not clear as to which participants, including the Test Lead, are responsible for requirements gathering, development and analysis, and performing the Customer Technical Review.

·         Test Readiness Review Procedure – The procedure indicates the Test Team is responsible for performing this activity.  The procedure does not specify whether the Test Manager, Test Lead, or Acquisition Project Manager is responsible for initiating and planning this review.

·         Defect Review Board Procedure – The procedure states a Defect Review Board Lead is responsible for initiating this activity.  However, there is nothing in the procedure indicating who is designated as the Defect Review Board Lead or who is responsible for assigning this role.

·         Test Folders and Records Procedure – The procedure contains steps requiring a technical review of test artifacts, technical verification of test results, and test results folder document verification.  These procedural steps do not identify who is responsible for performing these reviews.

·         ELC Enterprise Integration and Test Supplement – The Combined System Integration Test procedure is based on the PRIME contractor being responsible for leading the testing activities.  The procedure considers the IRS as a contributor or reviewer.  The procedure does not provide a clear description of the responsibilities required of the IRS as the contributor.

While the Office of Release Management has made significant efforts to address and document the testing activities performed by the BSMO, it did not identify all the necessary guidance to document the testing responsibilities.  Without detailed procedures describing all the roles and responsibilities for conducting the testing activities, the testing procedures will not be able to provide a consistent process for BSM employees to effectively perform, monitor, and control the testing activities within the BSMO.  This could affect the efficiency and effectiveness of testing activities.

Recommendation

Recommendation 4:  The CIO should direct the Office of Release Management to amend the testing procedures to specify participants’ roles in testing and clarify their related scope of responsibility.

Management’s Response:  The IRS agreed with this recommendation.  The Technology Release Management organization will form a working group of stakeholders to review existing test guidance, including all 10 of the existing testing procedures, to clarify participants’ roles and responsibilities.  The recommendations of the working group will be vetted through the formal IRS document review processes and revised guidance will be posted on the IRS Process Asset Library after executive approval.

 

Appendix I

 

Detailed Objective, Scope, and Methodology

 

The overall objective of this review was to determine whether the Internal Revenue Service’s (IRS) Business Systems Modernization (BSM) program testing processes are conducted efficiently and effectively and are ensuring the testing of appropriate system requirements.  As part of this audit, we followed up on Treasury Inspector General for Tax Administration audit recommendations about the effectiveness of the BSM program testing activities reported since issuance of our report entitled Testing Practices for Business Systems Modernization Projects Need Improvement.[20] To accomplish this objective, we:

I.                   Determined whether the Business Systems Modernization Office (BSMO) provided adequate testing guidance to ensure project requirements testing provides assurance that systems and applications meet the IRS’ business needs.

A.    Reviewed findings from our audit reports issued since September 2003 to determine whether these issues have been addressed with updated guidance.

B.     Assessed the adequacy of updated BSMO testing procedures by comparing them to industry standards presented by the Software Engineering Institute’s Capability Maturity Model® Integration (CMMI®)[21] and the Institute of Electrical and Electronics Engineers.

C.     Interviewed and obtained documentation from the Modernization and Information Technology Services organization’s Office of Release Management, Integration, Test and Deployment staff, to determine the status of efforts to improve the testing processes.

II.                Determined whether the test case results completely and consistently documented the project requirements tested.

A.    Reviewed documentation to determine whether the Filing and Payment Compliance Project Release 1.1[22] project requirements were consistent between the System Requirements Report[23] and Requirements Traceability Verification Matrix.[24]

B.     Determined whether the procedures for electronic test results folders[25] reviews were adequate to provide sufficient review of the test case results.  We judgmentally sampled test results folders from the following Modernization project releases:  Customer Account Data Engine Project[26] Release 1.3.1 – 20 of 468 folders; e-Services Project[27] Release 2.2 – 30 of 300 folders; Modernized e-File Project[28] Release 3.1 – 20 of 178 folders.  Our samples were used to evaluate the process to document, review, and approve test case results.  We used judgmental samples because a precise projection of sample results over a population was not required.

 

Appendix II

 

Major Contributors to This Report

 

Margaret E. Begg, Assistant Inspector General for Audit (Information Systems Programs)

Gary Hinkle, Director

Edward A. Neuwirth, Audit Manager

Michael Garcia, Senior Auditor

Phung H. Nguyen, Senior Auditor

Beverly Tamanaha, Senior Auditor

 

Appendix III

 

Report Distribution List

 

Commissioner  C

Office of the Commissioner – Attn:  Chief of Staff  C

Deputy Commissioner for Operations Support  OS

Associate Chief Information Officer, Business Systems Development  OS:CIO:I:B

Associate Chief Information Officer, Business Systems Modernization  OS:CIO:B

Associate Chief Information Officer, Enterprise Services  OS:CIO:E

Deputy Associate Chief Information Officer, Business Integration  OS:CIO:E

Deputy Associate Chief Information Officer, Systems Integration  OS:CIO:E

Deputy Associate Chief Information Officer, Program Management  OS:CIO:B:PM

Director, Stakeholder Management  OS:CIO:SM

Director, Product Assurance  OS:CIO:I:B:P

Chief Counsel  CC

National Taxpayer Advocate  TA

Director, Office of Legislative Affairs  CL:LA

Director, Office of Program Evaluation and Risk Analysis  RAS:O

Office of Management Controls  OS:CFO:AR:M

Audit Liaisons:

Associate Chief Information Officer, Business Systems Development  OS:CIO:I:B

Associate Chief Information Officer, Business Systems Modernization  OS:CIO:B

Manager, Program Oversight Office  OS:CIO:SM:PO

 

Appendix IV

 

Enterprise Life Cycle Overview

 

The Enterprise Life Cycle defines the processes, products, techniques, roles, responsibilities, policies, procedures, and standards associated with planning, executing, and managing business change.  It includes redesign of business processes; transformation of the organization; and development, integration, deployment, and maintenance of the related information technology applications and infrastructure.  Its immediate focus is the Internal Revenue Service (IRS) Business Systems Modernization (BSM) program.  Both the IRS and the PRIME contractor1 must follow the Enterprise Life Cycle in developing/acquiring business solutions for modernization projects.

The Enterprise Life Cycle framework is a flexible and adaptable structure within which one plans, executes, and integrates business change.  The Enterprise Life Cycle process layer was created principally from the Computer Sciences Corporation’s Catalyst® methodology.2  It is intended to improve the acquisition, use, and management of information technology within the IRS; facilitate management of large-scale business change; and enhance the methods of decision making and information sharing.  Other components and extensions were added as needed to meet the specific needs of the IRS BSM program.

Enterprise Life Cycle Processes

A process is an ordered, interdependent set of activities established to accomplish a specific purpose.  Processes help to define what work needs to be performed.  The Enterprise Life Cycle methodology includes two major groups of processes:

Life-Cycle Processes, which are organized into phases and subphases and address all domains of business change.

Management Processes, which are organized into management areas and operate across the entire life cycle.

Enterprise Life-Cycle Processes

The chart was removed due to its size.  To see the chart, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.

 

Life-Cycle Processes

The life-cycle processes of the Enterprise Life Cycle are divided into six phases, as described below:

  • Vision and Strategy - This phase establishes the overall direction and priorities for business change for the enterprise.  It also identifies and prioritizes the business or system areas for further analysis.

  • Architecture - This phase establishes the concept/vision, requirements, and design for a particular business area or target system.  It also defines the releases for the business area or system.

  • Development - This phase includes the analysis, design, acquisition, modification, construction, and testing of the components of a business solution.  This phase also includes routine planned maintenance of applications.

  • Integration - This phase includes the integration, testing, piloting, and acceptance of a release.  In this phase, the integration team brings together individual work packages of solution components developed or acquired separately during the Development phase. Application and technical infrastructure components are tested to determine if they interact properly.  If appropriate, the team conducts a pilot to ensure all elements of the business solution work together.

  • Deployment - This phase includes preparation of a release for deployment and actual deployment of the release to the deployment sites.  During this phase, the deployment team puts the solution release into operation at target sites.

  • Operations and Support - This phase addresses the ongoing operations and support of the system.  It begins after the business processes and system(s) have been installed and have begun performing business functions.  It encompasses all of the operations and support processes necessary to deliver the services associated with managing all or part of a computing environment.

The Operations and Support phase includes the scheduled activities, such as planned maintenance, systems backup, and production output, as well as the nonscheduled activities, such as problem resolution and service request delivery, including emergency unplanned maintenance of applications.  It also includes the support processes required to keep the system up and running at the contractually specified level.

Management Processes

Besides the life-cycle processes, the Enterprise Life Cycle also addresses the various management areas at the process level.  The management areas include:

  • IRS Governance and Investment Decision Management - This area is responsible for managing the overall direction of the IRS, determining where to invest, and managing the investments over time.

  •  Program Management and Project Management - This area is responsible for organizing, planning, directing, and controlling the activities within the program and its subordinate projects to achieve the objectives of the program and deliver the expected business results.

  • Architectural Engineering/Development Coordination - This area is responsible for managing the technical aspects of coordination across projects and disciplines, such as managing interfaces, controlling architectural changes, ensuring architectural compliance, maintaining standards, and resolving issues.

  • Management Support Processes - This area includes common management processes, such as quality management and configuration management that operate across multiple levels of management.

Milestones

The Enterprise Life Cycle establishes a set of repeatable processes and a system of milestones, checkpoints, and reviews that reduce the risks of system development, accelerate the delivery of business solutions, and ensure alignment with the overall business strategy.  The Enterprise Life Cycle defines a series of milestones in the life-cycle processes.  Milestones provide for “go/no-go” decision points in the project and are sometimes associated with funding approval to proceed.  They occur at natural breaks in the process where there is new information regarding costs, benefits, and risks and where executive authority is necessary for next phase expenditures.

There are five milestones during the project life cycle:

  • Milestone 1 - Business Vision and Case for Action.  In the activities leading up to Milestone 1, executive leadership identifies the direction and priorities for IRS business change.  These guide which business areas and system development projects are funded for further analysis.  The primary decision at Milestone 1 is to select BSM projects based on both the enterprise-level Vision and Strategy and the enterprise architecture.

  • Milestone 2 - Business Systems Concept and Preliminary Business Case.  The activities leading up to Milestone 2 establish the project concept, including requirements and design elements, as a solution for a specific business area or business system.  A preliminary business case is also produced.  The primary decision at Milestone 2 is to approve the solution/system concept and associated plans for a modernization initiative and to authorize funding for that solution.

  • Milestone 3 - Business Systems Design and Baseline Business Case.  In the activities leading up to Milestone 3, the major components of the business solution are analyzed and designed.  A baseline business case is also produced.  The primary decision at Milestone 3 is to accept the logical system design and associated plans and to authorize funding for development, test, and (if chosen) pilot of that solution.

  • Milestone 4 - Business Systems Development and Enterprise Deployment Decision.  In the activities leading up to Milestone 4, the business solution is built.  The Milestone 4 activities are separated by two checkpoints.  Activities leading up to Milestone 4A involve further requirements definition, production of the system’s physical design, and determination of the applicability of fixed-price contracting to complete system development and deployment.  To achieve Milestone 4B, the system is integrated with other business systems and tested, piloted (usually), and prepared for deployment.  The primary decision at Milestone 4B is to authorize the release for enterprise-wide deployment and commit the necessary resources.

  • Milestone 5 - Business Systems Deployment and Postdeployment Evaluation.  In the activities leading up to Milestone 5, the business solution is fully deployed, including delivery of training on use and maintenance.  The primary decision at Milestone 5 is to authorize the release of performance-based compensation based on actual, measured performance of the business system.

 

Appendix V

 

Business Systems Modernization Office Testing Procedures

 

The Office of Release Management developed the following Business Systems Modernization Office (BSMO) testing procedures to clearly define and improve the testing activities and responsibilities.  These procedures were developed in response to the Treasury Inspector General for Tax Administration report entitled The Office of Release Management Can Improve Controls for Modernization Program Coordination.[29]  These procedures were initially issued between February and May 2005 and apply to all Modernization projects in which Combined Systems Integration Testing, Government Acceptance Testing, or Deployment Site Readiness Testing is performed.

BSMO Test Planning and Test Design Procedure (effective February 25, 2005; revised September 1, 2005)

The purpose of this procedure is to establish guidelines for Test Planning and Test Design.  Test planning begins in the Domain Architecture Phase and continues throughout development.  Test design begins after the requirements and design have been approved.  Test organizations having responsibility for systems evaluation activities participate in systems requirements and system design reviews to gain an understanding of the business needs and in requirements gathering, development, and analysis activities conducted during the design phase.

BSMO Requirements Review Procedure (effective May 31, 2005)

This procedure describes the activities performed to ensure requirements for Modernization projects are testable.

BSMO Test Readiness Review Procedure (effective February 25, 2005)

This procedure establishes guidelines for conducting Test Readiness Reviews for a Modernization release.  A Test Readiness Review is a formal lifecycle review that is conducted to assess the readiness of the test team, test environment, development team, and supporting organizations to begin testing.  The decision to proceed with the next phase of testing is made based on a successful Test Readiness Review outcome.

BSMO Defect Review Board Procedure (effective February 25, 2005)

This procedure establishes the guidelines for the activities performed to establish and implement a Defect Review Board for a Modernization release.  A Defect Review Board is a group discrepancy review activity to resolve issues surrounding final severity and/or priority levels for test defect reports.[30]  The Defect Review Board is also responsible for evaluating and accepting or rejecting defect reports deferred from previous releases or tests.

BSMO Test Case Deferral and Waiver Procedure (effective February 25, 2005)

This procedure defines the requirements for managing test case deferrals and waivers and for ensuring their identification, documentation, review, and disposition.

BSMO Test Folders and Records Procedure (effective May 31, 2005)

This procedure defines the process for documenting and verifying test results folders.  The test results folder is the documentation maintained by the testing organization(s) that contains all the information pertinent to one test case executed on the identified release.  It ensures access to complete test results for historical reference.

BSMO Test Completion Reporting Procedure (effective May 31, 2005)

This procedure defines the process and structure for end of test reporting and describes the activities that occur after test execution is complete, including the post-test documentation to be produced.  Test documentation includes work products that provide evidence of successful test completion (e.g., the test report and lessons learned documents).  A test report provides a detailed description and evaluation of testing activities.  It documents actual testing results and identifies applicable environmental, test approach, test design, test planning, and test execution variances from the original test plan.

 

Appendix VI

 

Analysis of Testing Procedures Issued by the Business Systems Modernization Office

 

Figure 1 presents testing practice findings from the prior Treasury Inspector General for Tax Administration (TIGTA) audit report, Testing Practices for Business Systems Modernization Projects Need Improvement,[31] and the subsequent Business Systems Modernization Office (BSMO) Office of Release Management procedures addressing the findings.

Figure 1:  Analysis of Test Practice Findings and Subsequent Business Systems Modernization Testing Guidance

Prior Test Practice Finding:  Insufficient testing plans and facilities have contributed to project delays.  All necessary details or procedures were not developed and followed to properly conduct performance testing, and the capacity of the test lab was inadequate to support the Business Systems Modernization program.

BSMO Release Management Procedures:

-         BSMO Test Planning and Test Design Procedure, effective February 25, 2005.

-         Enterprise Life Cycle (ELC)[32] Supplement: Enterprise Integration and Test Version 2.1, dated May 12, 2005.

Prior Test Practice Finding:  Testing activities were not completed before putting systems into operation.  Testing was not always performed to assure the adequacy of project capabilities, hardware upgrades, and system performance.

BSMO Release Management Procedures:

-         BSMO Test Planning and Test Design Procedure, effective February 25, 2005.

-         BSMO Requirements Review Procedure, effective May 31, 2005.

-         BSMO Test Readiness Review Procedure, effective February 25, 2005.

-         BSMO Test Case Deferral and Waiver Procedure, effective February 25, 2005.

-         ELC Supplement: Enterprise Integration and Test Version 2.1, dated May 12, 2005.

Prior Test Practice Finding:  Failed tests were not being properly resolved during project development.  Documentation was not adequate in defect reports to describe actions to resolve failed tests, and guidance was insufficient to appropriately and consistently direct closure of defect reports for failed tests.

BSMO Release Management Procedures:

-         BSMO Defect Review Board Procedure, effective February 25, 2005.

-         BSMO Test Readiness Review Procedure, effective February 25, 2005.

-         BSMO Test Completion Reporting Procedure, effective May 31, 2005.

Source:  TIGTA report Testing Practices for Business Systems Modernization Projects Need Improvement and the Office of Release Management procedures.

Figure 2 identifies testing practice findings from TIGTA audit reports issued between March and December 2004 and the subsequent BSMO Office of Release Management procedures addressing the findings.

Figure 2:  Analysis of Test Practice Findings from March to December 2004 and Subsequent Business Systems Modernization Testing Guidance

Finding

Corrective Action

BSMO Release Management Procedure

Additional focus is needed to monitor testing results since the contractor did not always complete the test procedures as written and test procedures were not updated with actual test results.  (The Custodial Accounting Project[33]  Team Is Making Progress; However, Further Actions Should Be Taken to Increase the Likelihood of a Successful Implementation, Reference Number 2004-20-061, dated March 2004).

An embedded testing team has been established to further enhance the Custodial Accounting Project (CAP) testing process.  The team has representatives from the CAP project as well as MITRE[34] support and will continue its activity through CAP
Release 1 concurrent with exit of Milestone 5 (Completed
November 5, 2003).

-         Test Planning and Test Design Procedure, effective February 25, 2005.

-         Test Completion Reporting Procedure, effective
May 31, 2005.

-         ELC Supplement: Enterprise Integration and Test
Version 2.1,
dated
May 12, 2005.

 


The ELC requires that all system requirements include a Verification and Validation method.  Systems requirements listed in the Requirements Traceability Verification Matrix (RTVM)[35] did not include a Verification and Validation method (The Integrated Financial System[36] Project Team Needs to Resolve Transition Planning and Testing Issues to Increase the Chances of a Successful Deployment, Reference Number 2004-20-147, dated
August 2004).

The RTVM will be updated and delivered with the final Systems Integration Testing Report scheduled for delivery in August 2004.  In addition, the new ELC framework with Milestone 4A will require the updates to the RTVM at Milestone 2 and updated at each successive milestone, as a condition of milestone exit.  The systems engineering reviews conducted within each ELC phase will include validation of bi-directional traceability of requirements (Completed August 31, 2004).

-         Test Planning and Test Design Procedure, effective February 25, 2005.

-         Requirements Review Procedure, effective
 May 31, 2005.

-         Test Readiness Review Procedure, effective February 25, 2005.

-         ELC Supplement: Enterprise Integration and Test,
Version 2.1, dated
May 12, 2005.

 

The ELC provides guidance on the signatures required for moving and deleting requirements from a release including the Internal Revenue Service (IRS) Internal Management Program Director.  The project team prepared deviation forms for a number of requirements but the deviation forms did not include a signature line for the Internal Management Program Director (The Integrated Financial System Project Team Needs to Resolve Transition Planning and Testing Issues to Increase the Chances of a Successful Deployment, Reference Number 2004-20-147, dated
August 2004).

 

The ELC requires the Program Office Director’s signature for test case waivers and deferrals. This signature line is included in all but the earliest waiver forms (Completed
May 31, 2004).

 

-         BSMO Test Case Deferral and Waiver Procedure, effective February 25, 2005.

 


Without an understanding of why so many defects have been encountered, project teams may continue to experience testing delays, and future modernization projects may not learn lessons based on the experience of a prior project team (The Integrated Financial System Project Team Needs to Resolve Transition Planning and Testing Issues to Increase the Chances of a Successful Deployment, Reference Number 2004-20-147, dated August 2004).

As the ELC calls for a lessons learned report after the exit
of each Milestone, the IRS conducted a lessons
learned meeting on
December 15, 2004, as part of its post Milestone 4. The number of defects found was within the project’s estimated expectations once the corrected defect estimates were applied. The original estimate did not account for code developed for interfaces, reports, and user exits, which resulted in a significant difference in estimates (Completed
January 1, 2005).

 

 

 

 

-         Defect Review Board Procedure, effective February 25, 2005.

-         Test Readiness Review Procedure, effective February 25, 2005.

-         Test Completion Reporting Procedure, effective
May 31, 2005.

 

A major upgrade to an application was installed and went through system integration testing and deployment site readiness testing, but it did not receive the level and volume of performance testing, simulating high volume processing a pilot test offers (To Ensure the Customer Account Data Engine’s[37] Success, Prescribed Management Practices Need to Be Followed, Reference Number 2005-20-005, dated November 2004).

 

The IRS is planning semiannual releases of the Customer Account Data Engine in July and January.  The July delivery will involve higher-risk, more complex functionality, and the January delivery will include filing season changes combined with additional changes as capacity permits.  Since the returns from earlier in the filing season will be available for testing, the IRS can conduct performance testing on the July release using the highest volume periods.  The IRS will determine whether to conduct additional performance testing on the January release based on the likelihood of the changes affecting performance (Open - Due August 1, 2005; Extended to December 1, 2005).

-         BSMO Test Planning and Test Design Procedure, effective February 25, 2005.

-         Requirements Review Procedure, effective May 31, 2005.

-         Test Readiness Review Procedure, effective February 25, 2005.

-         ELC Supplement: Enterprise Integration and Test, Version 2.1, dated May 12, 2005.

 

 

 

 

 

System requirements were not adequately managed during the Systems Acceptability Test (System Requirements Were Not Adequately Managed During the Testing of the Custodial Accounting Project, Reference Number 2005-20-019, dated December 2004).

For CAP Release 1.1, the contractor, Northrop Grumman, made enhancements to the automated requirements tool, REDCOAT.  Due to these enhancements, the IRS now traces system requirements to the test script[38] level.  The RTVM shows traceability to the test script level as well as requirements validation results.  There are no unmapped requirements.  Exception reports identify requirements that are not mapped to test cases and test scripts.  The reports also identify the disposition (i.e., added, changed, or deleted) of requirements due to approved change requests.  The IRS is storing deferral/waiver information (relative to test scripts and requirements) in REDCOAT to ensure consistency between the RTVM and the requirements reported as deferred/waived.  In addition, the IRS altered the System Acceptance Testing process to perform requirements validation immediately following the execution of each test script.  Therefore, when team testers sign the hard-copy printouts of requirements traced to test scripts, they are recording the requirements validation results.  These enhancements will be used for all subsequent releases of the CAP (Completed June 30, 2004).

-         BSMO Test Planning and Test Design Procedure, effective February 25, 2005.

-         Requirements Review Procedure, effective May 31, 2005.

-         Test Readiness Review Procedure, effective February 25, 2005.

-         ELC Supplement: Enterprise Integration and Test, Version 2.1, dated May 12, 2005.

Source:  TIGTA audit reports and the Office of Release Management procedures.

 

Appendix VII

 

Suggested Additions/Clarification to the Business Systems Modernization Office Testing Procedures

 

The following is our analysis of the Office of Release Management testing procedures.  The analysis presents our suggestions to provide additional guidance or clarification to existing guidance in the testing procedures.

Test Planning and Test Design Procedure

Perform Test Planning (section A1)

Step 2:  Participate in the requirements gathering, development of requirements, and the analysis of requirements in order to comprehend the logical design and how it relates to the requirements.

Suggestion:  Identify the participants responsible for performing this activity and the extent of their involvement.  Provide details on the documentation required.

Step 3:  Review the Interface Control Documents and testing related Memoranda of Agreements.

Suggestion:  Identify the participants responsible for performing this activity.  Provide detailed steps on what this review process should include and what it will accomplish.

Step 7:  Identify applicable standards as determined by the Project.

Suggestion:  Identify the participants responsible for performing this activity.  List or define the “applicable standards” that should be identified or considered by the project and the purpose these standards serve in a project’s testing activity.

Step 11:  Define procedures for problem reporting and escalation.

Suggestion:  Identify what these reporting and escalation procedures should include.  This should include references to established guidelines or database systems used for capturing and controlling problems identified during testing (ClearQuest and Information Technology Assets Management System).

Create Test Case Development Schedule (section A3)

The Test Case Development Schedule is created by each individual testing organization.

Suggestion:  Include the development of test scripts[39] in the schedule.

Develop Test Cases (section A4)

Step 1:  Obtain support from the projects as needed to facilitate test case development.

Step 2:  Develop test cases that adequately cover all aspects of the project or system requirement(s) identified to be in-scope for the current release.  This includes verification of valid and invalid conditions applicable to the designated requirement.

Suggestion:  Identify the procedure detailing a review of the test cases to ensure their adequacy.  The procedure should include how this review is performed and documented, and how identified problems are corrected in a timely manner.

Update Requirements Traceability Verification Matrix (section A5)

Step 3:  Perform a quality check to ensure all test cases in the planned baseline are accounted for and all attributes are populated.

Suggestion:  Identify the participants responsible for performing this activity.  The procedure should include how this review is performed and documented, and how identified problems are corrected in a timely manner.
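
For illustration only, a quality check of this kind lends itself to automation.  The following sketch is hypothetical; the attribute names and the shape of the RTVM data are assumptions and do not reflect the actual RTVM format.  It flags planned baseline test cases that are not accounted for in the RTVM and RTVM entries with unpopulated attributes.

```python
# Hypothetical attribute names; the actual RTVM columns are defined by the project.
REQUIRED_ATTRIBUTES = ["requirement_id", "test_case_id", "test_script_id", "status"]

def check_rtvm(rtvm_rows, baseline_test_case_ids):
    """Flag baseline test cases absent from the RTVM and rows with unpopulated attributes."""
    mapped_cases = set()
    incomplete_rows = []
    for row_number, row in enumerate(rtvm_rows, start=1):
        mapped_cases.add(row.get("test_case_id", "").strip())
        blanks = [a for a in REQUIRED_ATTRIBUTES if not row.get(a, "").strip()]
        if blanks:
            incomplete_rows.append((row_number, blanks))
    missing_cases = sorted(set(baseline_test_case_ids) - mapped_cases)
    return missing_cases, incomplete_rows

# Hypothetical example data.
rtvm = [
    {"requirement_id": "REQ-1", "test_case_id": "TC-001", "test_script_id": "TS-001", "status": "Pass"},
    {"requirement_id": "REQ-2", "test_case_id": "TC-002", "test_script_id": "", "status": "Planned"},
]
print(check_rtvm(rtvm, {"TC-001", "TC-002", "TC-003"}))
# -> (['TC-003'], [(2, ['test_script_id'])])
```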

Develop Test Scripts (section A7)

Step 1:  Obtain support from the project as needed to facilitate Test Script development.

Step 2:  Design scripts that have a relationship to the test case in accordance with the test design specifications.

Suggestion:  Identify the procedure detailing the review of the test scripts to ensure their adequacy.  The procedure should include how this review is performed and documented, and how identified problems are corrected in a timely manner.

Update Test Plan and Requirements Traceability Verification Matrix (section A9)

Step 2:  Conduct a Customer Technical Review and finalize the Test Plan.

Suggestion:  This procedure indicates the test lead is responsible for performing the Customer Technical Review.  The Requirements Review Procedure indicates the integrated project team or the Acquisition Project Manager is responsible for performing this review.  Identify the participants responsible for performing this activity and ensure consistency within the Business Systems Modernization Office procedures.  Include in the procedure references to the Requirements Review procedures or to established Customer Technical Review guidelines.

Test Readiness Review Procedure

The Test Readiness Review procedures show the test team is responsible for performing this activity, with support from the Acquisition Project Manager, Office of Release Management, the business operating divisions and functional organizations, Office of Configuration Management, and the contractor.  The procedure does not specify whether the test manager, test lead, or Acquisition Project Manager is responsible for initiating and planning this review.

Suggestion:  Identify the participant(s) responsible for initiating, planning, and executing the Test Readiness Review.  The procedure should include the extent of their responsibility and the responsibilities of the participants supporting this activity.  Provide details on any documentation required.

Identify Test Readiness Review Participants and Prepare Tailored Test Readiness Review Checklist (section A1)

Tailor the Checklist (Appendix A) to identify critical items and insert questions for project-specific concerns.

Suggestion:  A mandatory Test Readiness Review Checklist should be developed for use in all testing activities.  This revised checklist should incorporate all the necessary items from the Business Systems Modernization Office Test Readiness Review procedure checklist and the Internal Revenue Manual checklist.  The Test Readiness Review procedure should make the use of this checklist mandatory; however, a project should be able to add any project-specific questions.  The procedure should require all checklist questions be answered or otherwise noted as “not applicable.”  The procedures should include a checklist review process to ensure the checklists are appropriately completed and that there are no unresolved items before testing commences.  Additionally, questions added by projects should be considered for addition to the checklist template.
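
For illustration, the requirement that every checklist question be answered or marked “not applicable” could be verified mechanically before testing commences.  The sketch below is hypothetical; the checklist structure (question, answer, not-applicable flag) is an assumption, not part of the Test Readiness Review procedure.

```python
def unanswered_items(checklist):
    """Return checklist questions with no answer that are not marked 'not applicable'."""
    return [
        item["question"]
        for item in checklist
        if not item.get("answer", "").strip()
        and not item.get("not_applicable", False)
    ]

# Hypothetical tailored checklist combining template and project-specific questions.
checklist = [
    {"question": "Are all test cases baselined?", "answer": "Yes"},
    {"question": "Is the test environment configured?", "answer": ""},
    {"question": "Are legacy interfaces available?", "not_applicable": True},
]
print(unanswered_items(checklist))   # ['Is the test environment configured?']
```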

Defect Review Board Procedure

Conduct Defect Review Board Meetings (section A2)

This activity is performed as often as necessary as determined by the Defect Review Board lead.

Suggestion:  Identify the participants responsible for selecting the Defect Review Board Lead and for performing this activity, as well as the extent of their involvement or responsibilities.  Provide details on any documentation required.

Review defect reports from previous tests (if applicable).

Suggestion:  Identify the procedure detailing the review of defect reports from previous tests to ensure the defects have been adequately resolved and are ready for testing.  The procedure should include how this review is performed and documented, and how identified problems are corrected in a timely manner.

Review defect reports, agree on severity levels, and reassign severity levels (if necessary).  Agree on target dates for defect reports to be fixed.

Suggestion:  Identify the procedure detailing the review of defect report severity levels to ensure changes made to defects’ severity levels are consistently applied and properly approved.  The procedure should include how this review is performed, how logs are prepared to document the reason for any changes and the applicable approvals, and how identified problems are corrected in a timely manner.  Further, the procedure should include the actions the Defect Review Board Lead takes if target dates to resolve defects are not met.

Establish procedures detailing actions for performing an analysis of defect reports when a significant number of defects are encountered during testing.
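
As one way to make the severity-level review and defect analysis described above auditable, severity changes, approvals, and target dates could be captured in a simple log.  The sketch below is illustrative only; the record fields and the severity scale are assumptions and do not reflect the ClearQuest data model.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DefectRecord:
    defect_id: str
    severity: int                     # e.g., 1 (critical) through 4 (minor); assumed scale
    target_date: date
    resolved: bool = False
    severity_changes: list = field(default_factory=list)

    def change_severity(self, new_severity, reason, approved_by):
        """Record the old and new severity, the reason for the change, and the approver."""
        self.severity_changes.append(
            {"from": self.severity, "to": new_severity,
             "reason": reason, "approved_by": approved_by}
        )
        self.severity = new_severity

def overdue_defects(defects, as_of):
    """Unresolved defects past their agreed target date, for Defect Review Board follow-up."""
    return [d for d in defects if not d.resolved and d.target_date < as_of]
```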

Test Folders and Records Procedure

Create Test Results Folder (section A1)

Create one electronic test results folder for each test identified in the Test Execution Schedule in the team’s designated repository.  This repository will be used to store the test results collected during test execution.  Each folder should contain: Test case(s), test script(s), test results, and Test Case Summary Certification Report Form.

Compile Test Results Folder (section A2)

Throughout test execution, populate the electronic folder with the artifacts collected during test execution.  Document the outcome of test execution in the electronic version.  Populate the electronic folder with the test execution results (reference Rational Test Manager for the completed test script).  Document resolution and closure of the Defect Reports (reference Rational ClearQuest for the applicable defect report).  Document pass/fail results for each execution of the test script.  Ensure that any test script step modifications (redlines) are documented, along with the execution results obtained for each test script step and verification point.  Provide a comprehensive summary of test execution results using the Test Case Summary Certification Report, Form 6503.

Suggestion:  The procedure should specify the version of the Test Case Summary Certification Report (Form 6503) to use during testing.  A review of all versions should be performed in order to develop an updated Form 6503 that incorporates all appropriate information.

The procedure needs to provide additional detail on how the defect reports, test script pass/fail execution results, test script modifications, and test results for each step and verification point should be documented in the test results folders.  This includes documenting the various test script iterations.  This will ensure consistency among all the BSM projects’ test results folders in how this information is captured and controlled.  A test results folder checklist can help ensure inclusion of all required documentation.
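
A test results folder checklist of that kind also lends itself to a simple automated completeness check.  The following sketch is hypothetical; the folder layout and artifact file names are assumptions used only to illustrate verifying that each folder contains the required documentation.

```python
from pathlib import Path

# Assumed artifact names; the actual contents are defined by the Test Folders and Records Procedure.
REQUIRED_ARTIFACTS = [
    "test_case.doc",
    "test_script.doc",
    "test_results.doc",
    "form_6503_summary.doc",
]

def missing_artifacts(folder):
    """Return the required artifacts that are absent from a test results folder."""
    folder = Path(folder)
    return [name for name in REQUIRED_ARTIFACTS if not (folder / name).exists()]

def review_repository(repository):
    """Report incomplete folders across the team's designated repository."""
    for folder in sorted(Path(repository).iterdir()):
        if folder.is_dir():
            missing = missing_artifacts(folder)
            if missing:
                print(f"{folder.name}: missing {', '.join(missing)}")

# Usage (hypothetical repository path): review_repository("test_results_folders")
```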

Compile Test Results Folder (section A2)

Perform technical verification of actual results for a subset of test cases selected by Rational Test Manager.

Verify Contents of Test Results Folder (section A3)

Verify the documentation provided in each test results folder within 2 to 3 business days following test completion.  Maintain and control the Test Results Folder until the end of test script execution and test results verification.  Perform a technical review of the artifacts provided to ensure the test objective has been met and is sufficiently documented.

Suggestion:  The procedure should identify who is responsible for performing these reviews, what the reviews should include, how the reviews should be documented, and a process to verify the reviews were completed.  The procedure should include the actions taken if the reviews are not completed within 2 to 3 business days of test completion.  Further, a decision must be made whether to use electronic or paper documentation to control Test Results Folder information.  The procedure should specify where the Test Results Folder is to be maintained after test completion and for how long.  Finally, the procedure should require the Acquisition Project Manager to certify that the Test Results Folders include all applicable documentation, that all pertinent reviews were performed within 2 to 3 business days and signed by the reviewer, and that the folders have been secured in the proper location.  The procedure should also require the Acquisition Project Manager to prepare a certification form for inclusion in the end of test report.

Test Completion Reporting Procedure

Review and Confirm Exit Criteria (section A1)

The Test Manager is responsible for confirming that the following exit criteria have been satisfied in order to conclude test activities:

All test procedures have been executed.

Suggestion:  Identify and reference the specific test procedures.  If this refers to the Test Plan or Test Cases, the requirement needs to be corrected.

The disposition of all unresolved severity 3 and 4 defect reports[40]  must be reviewed for impact by the Business Requirements Director and the Acquisition Project Manager, and an action plan specifying proposed resolution must be developed and approved by the Business Requirements Director and the Acquisition Project Manager.

Suggestion:  Identify the procedure detailing the review of the unresolved severity 3 and 4 defect reports to ensure adequate resolution.  The procedure should include how this review is performed and documented, and how identified problems are corrected in a timely manner.  Additionally, the procedure should require detailed action plans (an action plan template could be added as an attachment).
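
To illustrate how this exit criterion could be made verifiable, the sketch below checks that every unresolved severity 3 or 4 defect report has an action plan approved by both required officials before test activities conclude.  The data structure is an assumption for illustration only.

```python
def exit_criteria_met(defect_reports):
    """
    Return (met, exceptions): every unresolved severity 3 or 4 defect must have an
    action plan approved by both the Business Requirements Director and the
    Acquisition Project Manager before test activities conclude.
    """
    required_approvers = {"Business Requirements Director", "Acquisition Project Manager"}
    exceptions = []
    for report in defect_reports:
        if report["severity"] in (3, 4) and not report["resolved"]:
            plan = report.get("action_plan")
            if plan is None or not required_approvers.issubset(plan.get("approved_by", set())):
                exceptions.append(report["id"])
    return len(exceptions) == 0, exceptions

# Hypothetical example data.
reports = [
    {"id": "DR-101", "severity": 3, "resolved": False,
     "action_plan": {"approved_by": {"Business Requirements Director",
                                     "Acquisition Project Manager"}}},
    {"id": "DR-102", "severity": 4, "resolved": False, "action_plan": None},
]
print(exit_criteria_met(reports))   # (False, ['DR-102'])
```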

Draft Test Report (section A2)

Distribute the Draft Test Report for review to the testing organizations and, upon request, to management and the business representatives.  Include preliminary conclusions supported by initial observations and indicate what preliminary analysis was performed.  Note areas that require further analysis.

Issue Final Test Report (section A3)

The Test Manager is responsible for the content of the Test Report, for issuing the Final Test Report only after the analysis of the test results for the release is complete, for ensuring the Final Test Report is issued within 30 days following test completion, and for obtaining approval signatures.

Suggestion:  Identify the procedure detailing the analysis and presentation of test results.  The procedure should include how this analysis is performed and documented, and how identified problems are corrected in a timely manner.  Additionally, the procedure should include details on the actions taken if the report cannot be issued within 30 days of test completion.  The procedure should also list the individuals responsible for providing approval signatures.

Identify the procedure detailing the activities performed when post-initial operational capability (IOC) testing is being conducted to ensure all test results are adequately reported.  The procedure should include how post-IOC testing is performed and documented, and how identified problems are corrected in a timely manner.  Additionally, the procedure should include details on the impact to the final report, how the final report should present the post-IOC test results (or whether two reports should be issued), and whether authorized approvals were obtained to conduct the post-IOC testing and delay the issuance of the final report.

Additionally, the procedure should include the requirement that the Final Test Report incorporate the Acquisition Project Manager’s completed certification form confirming that the Test Results Folders include all applicable documentation, that all pertinent reviews were performed within 2 to 3 business days and signed by the reviewer, and that the folders have been secured in the proper location.

 

Appendix VIII

 

Management’s Response to the Draft Report

 

The response was removed due to its size.  To see the response, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.


[1] Reference Number 2003-20-178, dated September 2003.

[2] To facilitate success of its modernization efforts, the IRS hired the Computer Sciences Corporation as the PRIME contractor and integrator for the BSM program and created the Business Systems Modernization Office to guide and oversee the work of the PRIME contractor.

[3] The ELC establishes a set of repeatable processes and a system of reviews, checkpoints, and milestones that enable delivery of promised business results.  See Appendix IV for an overview of the components of the ELC.

[4] The test results folder is the documentation maintained by the testing organization that contains all the information pertinent to the test case executed.

[5] To facilitate success of its modernization efforts, the IRS hired the Computer Sciences Corporation as the PRIME contractor and integrator for the BSM program and created the Business Systems Modernization Office to guide and oversee the work of the PRIME contractor.

[6] The ELC establishes a set of repeatable processes and a system of reviews, checkpoints, and milestones that enable delivery of promised business results.  See Appendix IV for an overview of the components of the ELC.

[7] The Enterprise Integration and Test Environment contains components representative of key elements of the enterprise architecture to support integration and the full range of testing activities.

[8] Initial Operational Capability is a set of functional and performance characteristics and threshold parameters defined and agreed to by the business owner for the specific release at the time of Logical Design.

[9] Reference Number 2003-20-178, dated September 2003.

[10] A defect report is the mechanism for formally documenting a perceived problem in a controlled product.

[11] The Filing and Payment Compliance Project is an end-to-end strategy to resolve payment and filing compliance issues quickly and fairly.  Release 1.1 will use a combination of commercial off-the-shelf software and manual processes for the Filing and Payment Compliance system.

[12] The System Requirements Report documents a feasible, quantified, verifiable set of requirements that define and scope the business system being developed by the project.

[13] The Requirements Traceability Verification Matrix documents the specific test cases and test results for each requirement.

[14] A series of instructions that carry out the test case description contained in the test plan.

[15] The e-Services Project provides a set of web-based business products to tax practitioners and other third parties such as banks and brokerage firms.

[16] The MeF Project’s goal is to replace the current technology for filing IRS tax return forms with modernized, Internet-based electronic filing applications.

[17] The CADE Project will consist of databases and related applications and become the foundation for managing taxpayer accounts in the IRS modernization plan.

[18] Application Qualification Testing focuses on verifying that business processes are supported by the application and that the requirements of the application are met.

[19] A severity 3 defect report involves problems that are serious but do not prevent using or testing a required capability.  A severity 4 defect report involves minor problems, or documentation errors, that do not cause deviations from tasks or project standards.

[20] Reference Number 2003-20-178, dated September 2003.

[21] CMMI® is a process improvement approach that helps integrate traditionally separate organizational functions, set process improvement goals and priorities, provide guidance for quality processes, and provide a point of reference for appraising current processes.

[22] The Filing and Payment Compliance Project is an end-to-end strategy to resolve payment and filing compliance issues quickly and fairly.  Release 1.1 will use a combination of the commercial off-the-shelf software and manual processes for the Filing and Payment Compliance system.

[23] The System Requirements Report documents a feasible, quantified, verifiable set of requirements that define and scope the business system being developed by the project.

[24] The Requirements Traceability Verification Matrix documents the specific test cases and test results for each requirement.

[25] The test results folder is the documentation maintained by the testing organization that contains all the information pertinent to the test case executed.

[26] The Customer Account Data Engine Project will consist of databases and related applications and become the foundation for managing taxpayer accounts in the IRS modernization plan.

[27] The e-Services Project provides a set of web-based business products to tax practitioners and other third parties such as banks and brokerage firms.

[28] The Modernized e-File Project’s goal is to replace the current technology for filing IRS tax return forms with modernized, Internet-based electronic filing applications.

1 To facilitate success of its modernization efforts, the IRS hired the Computer Sciences Corporation as the PRIME contractor and integrator for the BSM program and created the Business Systems Modernization Office to guide and oversee the work of the PRIME contractor.

2 The IRS has acquired a perpetual license to Catalyst® as part of the PRIME contract, subject to certain restrictions. The license includes rights to all enhancements made to Catalyst® by the Computer Sciences Corporation during the contract period.

[29] Reference Number 2004-20-157, dated September 2004.

[30] A defect report is the mechanism for formally documenting a perceived problem in a controlled product.

[31] Reference Number 2003-20-178, dated September 2003.

[32] The ELC establishes a set of repeatable processes and a system of reviews, checkpoints, and milestones that enable delivery of promised business results.  See Appendix IV for an overview of the components of the ELC.

[33] The CAP uses a data warehouse approach for storing, analyzing, and reporting taxpayer accounts and collections.

[34] MITRE is a not-for-profit corporation that provides the Internal Revenue Service with independent, expert, and objective advice and guidance on strategic, technical, and program management issues.

[35] The RTVM documents the specific test cases and test results for each requirement.

[36] The Integrated Financial System Project includes the accounts payable, accounts receivable, general ledger, budget execution, cost management, and financial reporting activities.

[37] The Customer Account Data Engine Project will consist of databases and related applications and become the foundation for managing taxpayer accounts in the IRS modernization plan. 

[38] A series of instructions that carry out the test case description contained in the test plan.

[39] A series of instructions that carry out the test case description contained in the test plan.

[40] A severity 3 defect report involves problems that are serious but do not prevent using or testing a required capability.  A severity 4 defect report involves minor problems, or documentation errors, that do not cause deviations from tasks or project standards.