The Integrated Financial System Project Team Needs to Resolve Transition Planning and Testing Issues to Increase the Chances of a Successful Deployment

 

August 2004

 

Reference Number:  2004-20-147

 

 

This report has cleared the Treasury Inspector General for Tax Administration disclosure review process and information determined to be restricted from public release has been redacted from this document.

 

August 13, 2004

 

 

MEMORANDUM FOR CHIEF INFORMATION OFFICER

                                         CHIEF, MISSION ASSURANCE

 

FROM:     Gordon C. Milbourn III /s/ Gordon C. Milbourn III

                 Acting Deputy Inspector General for Audit

 

SUBJECT:     Final Audit Report - The Integrated Financial System Project Team Needs to Resolve Transition Planning and Testing Issues to Increase the Chances of a Successful Deployment (Audit # 200320043)

 

This report presents the results of our review of the Internal Revenue Service’s (IRS) Integrated Financial System (IFS).  The overall objective of this review was to determine if adequate testing processes and procedures were being followed prior to the deployment of the IFS to ensure the system meets expectations.  Additionally, we determined if the IFS project team was adequately planning for the transition of operation and maintenance of the system from the contractor to the IRS.

In summary, the IFS is intended to provide administrative financial management information essential for financial statement preparation.  In addition, the development of the IFS supports one of the President’s Management Agenda initiatives to improve financial performance, which involves ensuring Federal Government financial systems produce accurate and timely financial information. 

The IRS and the PRIME contractor recently announced that the IFS would not be deployed until October 2004, a delay of 1 year from the originally scheduled deployment date. While the IFS project has experienced delays, the project team has made progress in the areas of System Integration Testing (SIT), transition planning, and security test planning.  We determined that significant integration testing was underway, and the risk of the IRS Information Technology Services organization not being able to operate and maintain the IFS had been reduced.  In addition, the IFS project team became the first Business Systems Modernization project team to complete a Transition Management Plan.  Also, a detailed IFS Security Test and Evaluation (ST&E) Plan has been prepared.

While the IFS project team continues to make progress, improvements can be made in the areas of transition planning and testing to increase the chances of a successful deployment.  To provide timely feedback during critical transition and testing activities, we communicated the results of our analyses during the audit; these communications are discussed throughout the report.

To ensure transition planning issues are resolved prior to acceptance of the IFS by the IRS and transition management is considered when exiting appropriate milestones, we recommended the Chief Information Officer (CIO) ensure outstanding IFS transition issues are documented and tracked to closure and the Enterprise Life Cycle (ELC) Milestone Exit Criteria are updated to include completion of the Transition Management Plan.  To ensure IFS testing results in a high-quality product and future modernization projects are improved through lessons learned by the IFS project team, we recommended the CIO ensure 1) the Requirements Traceability Verification Matrix (RTVM) is updated to include a verification and validation method for all requirements, 2) deviation forms are updated to include a signature line for all required approving officials, 3) the root cause is investigated and resolved for why defects are more voluminous than expected and for why defects are being evaluated and resolved slower than expected, and 4) the ELC is updated to include a requirement for entrance and exit criteria in the ST&E Plan.  To ensure planning for the upcoming IFS ST&E is as thorough as possible, we recommended the Chief, Mission Assurance, ensure all needed elements from the Internal Revenue Manual (IRM) are included in the ST&E Plan and trusted recovery and object reuse testing occurs (or the risk of not conducting these tests is documented).

Management’s Response:  Management agreed with seven of our recommendations and disagreed with one recommendation.  The IRS responded that it is now including the correct signature line on deviation forms and using an issue tracking log to monitor transition issues.  The IRS also plans to update the IFS RTVM and ST&E Plan, ensure a lessons learned report is generated, and update the ELC to ensure that projects develop a Transition Management Plan.

In addition to responding to our recommendations, the IRS indicated that it has assigned the Chief Financial Officer full-time to the IFS Project, negotiated a cost-sharing arrangement with the PRIME contractor to balance financial risk, and begun validating the integrity of the system. The PRIME contractor has also bolstered its management team with key skills.  Management’s complete response to the draft report is included as Appendix VII.

Office of Audit Comment:  The CIO disagreed with our recommendation to ensure that trusted recovery and object reuse testing occurs, or the risk of not conducting these tests is documented.  The IRM requires that correct implementation of object reuse and trusted recovery be validated as part of the ST&E.  In addition, the IRS is planning to test object reuse capabilities on one of two computer platforms.  The IRS responded that object reuse testing is no longer required by the Department of the Treasury.  While the Department of the Treasury requirement may have changed, the IRM still requires object reuse testing.  Therefore, we believe the IRS should conduct the required testing, particularly since it plans to perform object reuse testing on one computer platform and not the other.  The IRS’ response did not address trusted recovery testing.

The CIO agreed with our recommendation to update the ELC to include a requirement for entrance and exit criteria in the ST&E Plan.  However, the corrective action provided will not adequately address our recommendation.  A best practice in test planning is to create a predefined set of criteria used to determine when a test is ready to be conducted (entrance criteria) and when a test has been completed at an acceptable quality level (exit criteria).  The IRS responded that it would be incorporating changes into the milestone exit requirements.  Changing the milestone exit criteria will not create a requirement for entrance and exit criteria in the ST&E Plan.

While we still believe our recommendations are worthwhile, we do not intend to elevate our disagreements concerning them to the Department of the Treasury for resolution.

Copies of this report are also being sent to the IRS managers affected by the report recommendations.  Please contact me at (202) 622-6510 if you have questions, or Margaret E. Begg, Assistant Inspector General for Audit (Information Systems Programs), at (202) 622-8510.

 

Table of Contents

Background

While the Integrated Financial System Project Continues to Experience Significant Schedule Delays, the Project Team Is Making Progress Toward Deployment

Transition Planning Is Improving; However, Additional Improvements Are Needed

Recommendations 1 and 2:

Testing Practices Continue to Need Improvement

Recommendations 3 through 5:

Recommendations 6 through 8:

Appendix I – Detailed Objectives, Scope, and Methodology

Appendix II – Major Contributors to This Report

Appendix III – Report Distribution List

Appendix IV – Enterprise Life Cycle Overview

Appendix V – Test Case Elements Required by the Enterprise Life Cycle

Appendix VI – Defect Tracking and Resolution Time Periods

Appendix VII – Management’s Response to the Draft Report

 

Background  

The Internal Revenue Service (IRS) is currently engaged in an effort, known as Business Systems Modernization (BSM), to modernize its systems and associated processes.  To facilitate the success of its modernization efforts, the IRS hired the Computer Sciences Corporation as the PRIME contractor and integrator for the BSM program and created the Business Systems Modernization Office (BSMO) to guide and oversee the work of the PRIME contractor.  

One of the BSM projects is the Integrated Financial System (IFS), which will help to modernize the IRS financial systems and processes used to generate annual financial statements.  This new general ledger and accounting system is based on commercial software from the Systems Analysis and Program Development (SAP) organization.

The IFS is intended to provide administrative financial management information essential for financial statement preparation.  In addition, the development of the IFS supports one of the President’s Management Agenda initiatives to improve financial performance, which involves ensuring Federal Government financial systems produce accurate and timely financial information.

This is the Treasury Inspector General for Tax Administration (TIGTA) Information Systems Programs business unit’s second audit of the IFS project.  The first audit found the IFS project team had begun important testing activities, ensured compatibility with the Security and Technology Infrastructure Release (STIR), and ensured redundant hardware was planned for implementation.  However, we also determined that risks were mounting as the IFS project team strived to meet an aggressive deployment date of October 2003.  In September 2003, the IFS project team announced that IFS deployment was being rescheduled for February 13, 2004, and that a delay until April 13, 2004, was possible.  In January 2004, the IFS project team announced that the IFS would not deploy until October 2004.

After learning the PRIME contractor would again miss the target deployment date for the IFS, the Commissioner testified that the delay was a huge disappointment to the IRS.  The Commissioner indicated the PRIME contractor was willing to bear the financial burden for the additional delay; however, the IRS needed to take stronger steps.  The Commissioner stated the IRS would expand competition for new enforcement projects as well as for the next phase of the IFS.

This audit was conducted at the BSMO facilities in New Carrollton, Maryland, during the period September 2003 through March 2004 in accordance with Government Auditing Standards.  Detailed information on our audit objectives, scope, and methodology is presented in Appendix I.  Major contributors to the report are listed in Appendix II.  To provide timely feedback during critical transition and testing activities, we communicated the results of our analyses to the BSMO during the audit; these communications are discussed throughout the report.

During the audit, the BSMO informed us of the results of several studies that were performed to obtain fresh and independent assessments from outside experts on the health of the BSM program.  To address the results of the studies, the IRS and the PRIME contractor have developed and are implementing an action plan, known as the BSM Challenge Plan, designed to address the BSM-related study recommendations. Several of the BSM Challenge Plan actions, if effective, would address issues within this report (e.g., ensuring projects strictly follow the Enterprise Life Cycle (ELC), tightly managing business requirements, fully maturing management disciplines, and ensuring more effective integration of IRS and PRIME/subcontractor test teams).

This audit was conducted while changes were being made at both the BSM program level and the IFS project level.  The majority of our fieldwork was completed in February 2004; however, we reviewed additional information provided by the BSMO and the PRIME contractor during March 2004.  Any project changes that have occurred since we concluded our analyses are not reflected in this report.  As a result, this report may not reflect the most current status.

While the Integrated Financial System Project Continues to Experience Significant Schedule Delays, the Project Team Is Making Progress Toward Deployment

The IRS and the PRIME contractor recently announced that the IFS would not be deployed until October 2004, a delay of 1 year from the originally scheduled deployment date. This delay has been attributed to the following issues:  data conversion, health coverage tax credit (HCTC), interface development, and testing productivity.  While the IFS project has experienced delays, the project team has made progress in the areas of integration testing, transition planning, and security test planning.

·        Significant integration testing is underway and the process for moving and deleting requirements is being followed.  The IFS System Integration Testing (SIT) was separated into five parts.  As of March 31, 2004, one part was complete and the remaining four parts were underway. In addition, the PRIME contractor informed us that testing productivity was increasing because both developers and testers are now included in the SIT environment.  We also determined that the ELC process for moving a set of requirements to another part of testing and deleting requirements from the current release was being documented.

·        The IFS project team became the first BSM project team to complete a Transition Management Plan (TMP). In October 2002, the PRIME contractor enhanced the ELC to require a TMP in place of a Transition To Support Plan. The TMP, which describes the plans for transitioning the initial release of the IFS to the IRS, reflects best practices accumulated by the PRIME contractor.

While the Transition To Support Plan and the TMP both describe transition activities, the TMP is a much more comprehensive document than the Transition To Support Plan.  For instance, the TMP provides a listing of applications, project documentation, hardware, software, telecommunications, and database items that will be transitioned to the IRS.

In September 2003, the IFS project team delivered the initial version of the IFS TMP.  The IFS TMP includes transition schedules and transition acceptance criteria, as well as information about user training, staffing, and other items needed to plan for the transition.

·        The risk of the IRS Information Technology Services (ITS) organization not being able to operate and maintain the IFS has been reduced.  The IFS project team has been preparing to implement the transfer of the IFS to the IRS.  Initially, the ITS organization did not believe enough time was being allocated for its personnel to learn how to operate and maintain the IFS once it was implemented.  The proposed transition time was much less than the proposed transition time for a similar BSM project.  The IFS project team correctly documented this as a risk and created a risk reduction plan.  At the end of our audit, all risk reduction activities were on track, and the project delay had provided additional transition time.

·        A detailed IFS Security Test and Evaluation (ST&E) Plan has been prepared. The IRS Office of Mission Assurance and its security testing contractor have prepared an ST&E Plan to ensure the IFS application will meet security requirements.  The Plan includes a good overview of the tests to be conducted, as well as many test cases designed to ensure security requirements are met.  While the ST&E Plan is a very good start, several areas in it could be improved (see The ST&E Plan can be improved section later in this report).  The actual ST&E for the IFS application will be conducted just prior to deployment.

While the IFS project team continues to make progress, improvements can be made in the areas of transition planning and testing to increase the chances of a successful deployment.

Transition Planning Is Improving; However, Additional Improvements Are Needed

In September 2003, the IFS project team delivered the IFS TMP, which describes the plans for transitioning the initial release of the IFS to the IRS.  Based on our review of the TMP and discussions with BSMO, Chief Financial Officer (CFO), ITS, and PRIME contractor personnel, we communicated our concerns regarding transition management to BSMO officials on October 17, 2003.

Our initial impression was that the IFS TMP included a very good overview of transition activities; however, the level of detail in the TMP and related documents was insufficient and left many questions unanswered.  We were informed the BSMO had come to the same conclusion and had requested that the PRIME contractor revise the TMP.

The PRIME contractor delivered a revised version of the IFS TMP in December 2003.  We reviewed the revised TMP and provided additional input to the BSMO on January 16, 2004.  While the revised TMP did not completely address our original concerns, we noted that the IFS project team was making progress in planning for the IFS transition.  The strongest improvement was that the revised TMP at least mentioned many of our original concerns, whereas the previous version was often silent on important details.

The ELC requires that transition activities, such as help desk support, service level agreements (SLA), configuration management, and training requirements, be covered in the TMP.  Transition issues, if left unresolved, will leave the IRS without a clear understanding of how to operate and maintain the IFS once it is implemented.

We determined that further details are needed to fully address several post-implementation transition planning activities.  These details could be provided in future updates of the TMP or as related transition documents are created or updated.

·        Help desk procedures and staffing levels were not defined.  Procedures for help desk call handling, routing, and escalation were under development.  In addition, staffing levels for the help desk had not been determined.  Once the IFS has been implemented, procedures will need to be in place to instruct users on how to get assistance should they have trouble using the IFS.  In addition, CFO and ITS organization personnel will need to be assigned to answer help desk calls and will need to know how to handle calls for assistance.  

·        Not all SLAs and contracts involving the CFO, the ITS, and the SAP organization were finalized.  Activities had not been finalized for determining how the IRS will access the development and test environments to support ongoing system operations and maintenance once the IFS has been implemented.  The project team had not completed SLAs for help desk support, response time, availability, and security for IFS operations and maintenance support.  In addition, the contract needed to obtain SAP organization support had not been finalized.  Without finalized SLAs and contracts, the IRS may not receive the needed level of support once the IFS is implemented.

·        Procedures for configuration management were not defined. Configuration management processes were under development. Configuration management processes are needed to ensure changes to the IFS, after it is implemented by the CFO and ITS organizations, are coordinated with the PRIME contractor.  Without proper configuration management processes, unapproved changes could be made to the IFS, and approved changes made in the production environment may not result in needed changes to the IFS development, testing, and disaster recovery environments.  This could affect the quality of future releases.  In a previous audit, we noted that 1 project was delayed by 4 months and additional costs were incurred when configuration management processes were not followed.

Management Action: Upon reviewing a preliminary version of this report, BSMO officials informed us that they agreed planning for post-implementation configuration management processes was significant and the PRIME contractor had started working the issue.  However, BSMO officials wanted to clarify that the current IFS Configuration Management Plan does not cover post-implementation configuration management processes because it is not required to be updated to cover post-implementation until the project team begins working towards Milestone 5. We agree the IRS and the PRIME contractor should begin working the post-implementation configuration management issues now and, as stated previously, further detail could be provided in future updates of the TMP or related transition documents.

·        Training delivery dates were not finalized.  Due to schedule delays, dates for delivering training had not been established.  These dates should be finalized as the IFS deployment date nears.  If they are not, future operations and maintenance personnel may not be trained timely.

We also determined that completion of the TMP was not part of the ELC Milestone Exit Criteria, which are used to determine whether modernization projects are ready to proceed to the next life cycle milestone.  The Transition To Support Plan, which has been replaced by the TMP, is part of the milestone exit criteria, but the TMP has not yet been included in the ELC Milestone Exit Criteria, according to BSMO officials, since it is a recent requirement (as previously stated, the IFS project was the first project to deliver a TMP).  If the TMP is not included as part of the milestone exit criteria, the IRS may not consider transition issues when determining whether modernization projects are ready to proceed to the next milestone.

Management Action: The BSMO stated it was acting on our concerns as the IFS project moves toward deployment. In January 2004, the ITS organization approved the IFS TMP citing 11 conditions, such as help desk and SLAs, that need to be improved in the future.  Also, BSMO officials informed us that they had created a database to track project transition issues.  According to BSMO officials, the project team began using the database in May 2004.

Recommendations

To ensure transition issues are resolved prior to acceptance of the IFS by the IRS and transition management is considered when exiting appropriate milestones, the Chief Information Officer (CIO) should ensure:

1.      Outstanding IFS transition issues are documented and tracked to closure.

Management’s Response:  The IRS responded that it is now using a team-level issue tracking log to monitor transition issues.  Any issues not resolved at this level will be elevated and monitored at weekly executive management meetings.

2.      The ELC Milestone Exit Criteria are updated to include completion of the TMP.

Management’s Response:  The IRS responded that the new ELC framework will require a TMP to be developed at different milestones during a project’s life cycle.

Testing Practices Continue to Need Improvement

As part of our prior audit, we evaluated IFS testing activities.  We found the IFS project was in the early stages of testing and testing practices could be improved as the project advanced to more formal testing. We recommended IRS management correct these deficiencies prior to the SIT, to help ensure a high-quality system is delivered.

As part of the current audit, we followed up on our previous testing concerns to determine if progress was being made as part of the SIT.  In addition, we reviewed SIT deviation forms, defect processing, and security test planning documentation.

We identified and communicated the following issues to the BSMO and Mission Assurance function during our review:

·        SIT documentation could be improved to ensure testing objectives are being met.

·        Defect management processes need improvement.

·        The ST&E Plan can be improved.

SIT documentation could be improved to ensure testing objectives are being met

While the SIT is progressing and certain parts of the SIT are near completion, we found the IFS project team could improve SIT documentation to ensure testing objectives are being met.  Based on our review of SIT documentation, we communicated the following concerns to the BSMO on February 23, 2004, March 3, 2004, and March 18, 2004.

Entrance criteria were not always met

In our previous audit, we determined Application Qualification Testing (AQT) test cases were not fully documented before the AQT began.  Therefore, the AQT entrance criteria had not been met before AQT activities were started.  We recommended SIT practices be strengthened based on lessons learned during the initial AQT.

As part of our current audit, we followed up to determine if entrance criteria were being met before each part of the SIT was started.  The IFS SIT Test Plan states that Test Readiness Reviews (TRR) are the forum for determining if entrance criteria have been met.  A successful TRR inspires confidence in the testing team and in the adequacy of the tests to be performed.  A TRR checklist is used to document the TRR meeting and the determination of test readiness.  As required by the ELC, TRRs were conducted for each of the five parts of the SIT.

No determination of test readiness was documented for one TRR.  For three TRRs, a decision was made to start testing even though all activities on the TRR checklist had not been completed.  For the final SIT part, all TRR checklist items were satisfied and the decision to move forward with testing was documented.  

In some cases, a reason was documented as to why certain checklist items were not met.  No reasons were provided for others.  For the part of the SIT that contains the majority of accounting requirements, we noted the decision was made to proceed with testing even though not all test cases were complete.

The BSMO informed us that not all TRR checklist items were being satisfactorily completed prior to testing because the PRIME contractor had the final determination as to whether to proceed with testing, and the PRIME contractor had determined it was willing to assume the risk of proceeding without having completed all checklist items.  Without successful TRRs, the goal of inspiring confidence in the testing team and in the adequacy of the tests to be performed is not met, and the risk increases that testing may be delayed due to incomplete planning.

The IRS initially planned to complete corrective actions from our prior audit regarding strengthening SIT practices by October 31, 2003.  The IRS has extended the time needed to complete prior corrective actions until May 15, 2004.  Since the IRS is in the process of implementing corrective actions to our previous recommendation in this area, we are making no additional recommendations regarding entrance criteria.

Test cases did not include all required elements

In our previous audit, we found that not all AQT test cases were complete.  For example, test cases did not always include adequate expected results.  We recommended SIT practices be strengthened based on lessons learned during the initial AQT.

As part of our current audit, we followed up to determine if required test case elements were included in SIT test cases (see Appendix V for an explanation of the test case elements required by the ELC).  We reviewed a random sample of 85 test cases to identify whether they were being developed in accordance with guidelines.  We found that 21 (about 25 percent) of the 85 test cases we reviewed did not include all required elements, such as expected results.

Without all required test case elements, the tester may not know what the test is meant to achieve and what should be done before and after the test.  The ELC provides a template to make required test case elements easily identifiable.  The PRIME contractor informed us it was not completely filling out the required test case template, but the information could be found in the detailed test steps.  Since the information in the test steps is not labeled, we could not determine whether the detailed test steps included all required test case elements.  The PRIME contractor also informed us it would be making corrections to put the required test case elements in the appropriate part of the test case form.  At the conclusion of our audit in March 2004, this had not been done.

The IRS initially planned to complete corrective actions from our prior audit regarding strengthening SIT practices by October 31, 2003.  The IRS has extended the time needed to complete prior corrective actions until May 15, 2004.  Since the IRS is in the process of implementing corrective actions to our previous recommendation in this area, we are making no additional recommendations regarding test case elements. 

The Requirements Traceability Verification Matrix (RTVM) and related documentation were incomplete

The SIT Test Plan includes several key testing documents:

·        RTVM – The RTVM is a tool used to ensure each system requirement is tested by assigning it to one or more test cases. This tool helps ensure all requirements are tested. 

·        Software Documentation Automation (SoDA) Report – The SoDA Report includes a listing of all documented test cases.

·        Test Execution Schedule – Test execution schedules are developed by the test team to manage testing activities by listing the date when each test case is estimated to start and complete.  The schedules should provide one entry for each planned test case. 

In our previous audit, we found the RTVM was not complete because the AQT test execution schedule referred to test cases that were not in the AQT RTVM.  We recommended SIT practices be strengthened based on lessons learned during the initial AQT.

As part of our current audit, we followed up to determine if the SIT RTVM was complete.  Our objective was to determine if the SIT RTVM contained all requirements and the requirements were properly linked to the appropriate, documented test cases as required by the ELC.  Additionally, the ELC states that the RTVM should map requirements to test cases and the test execution schedule should provide one entry for each planned test case.

Throughout the audit, we received RTVMs for each of the different parts of the SIT.  For the part that contains the majority of accounting requirements, we compared the IFS test execution schedule to the SoDA Report to determine whether scheduled test cases had been documented. We determined that 227 (approximately 31 percent) of 731 test cases listed in the test execution schedule were not documented in the SoDA Report.

The inconsistency between the test execution schedule and the SoDA Report was due, in part, to test case naming inconsistencies.  For instance, one test case in the execution schedule was called IFS-R1-SIT2-TAXES-001.  In the SoDA Report, this test case was called IFS-R1-SIT2-TAXES_1.  The PRIME contractor responded that this was because the test execution schedule was created manually and typographical errors were always possible.

In March 2004, we received RTVMs for all parts of the SIT.  The RTVMs did not always map to test cases. Specifically, 85 (approximately 3 percent) out of 2,696 initial IFS Release 1.0 requirements were not mapped to test cases.  In addition, we received SoDA Reports for all five parts of the SIT.  We compared the test case names listed in the RTVMs to the SoDA Reports and found the SoDA Reports did not contain 472 (approximately 37 percent) of 1,283 test cases referred to in the RTVMs.
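
To illustrate the kind of reconciliation described above, the following sketch is purely illustrative and is not the PRIME contractor's or TIGTA's actual tooling; the data structures, identifiers, and inputs are hypothetical stand-ins for the RTVM and SoDA Report deliverables.  It flags requirements with no assigned test case and test cases referenced in an RTVM but missing from a SoDA Report, normalizing identifiers to tolerate the typographical variations noted above.

```python
# Hypothetical sketch of an RTVM/SoDA Report cross-check; inputs are
# illustrative, since the actual deliverables were project documents.
import re

def normalize(tc_id: str) -> str:
    """Collapse separator and zero-padding differences so that, e.g.,
    'IFS-R1-SIT2-TAXES-001' and 'IFS-R1-SIT2-TAXES_1' compare equal."""
    parts = re.split(r"[-_]", tc_id.strip().upper())
    canon = [(p.lstrip("0") or "0") if p.isdigit() else p for p in parts]
    return "-".join(canon)

def unmapped_requirements(rtvm: dict) -> list:
    """Requirements listed in the RTVM with no test case assigned."""
    return [req for req, cases in rtvm.items() if not cases]

def undocumented_cases(rtvm: dict, soda_cases: list) -> set:
    """Test cases referenced in the RTVM but absent from the SoDA Report."""
    documented = {normalize(c) for c in soda_cases}
    referenced = {normalize(c) for cases in rtvm.values() for c in cases}
    return referenced - documented

# Toy data modeled on the naming inconsistency noted above.
rtvm = {"REQ-0042": ["IFS-R1-SIT2-TAXES-001"], "REQ-0043": []}
soda = ["IFS-R1-SIT2-TAXES_1"]
print(unmapped_requirements(rtvm))     # ['REQ-0043']
print(undocumented_cases(rtvm, soda))  # set() -- naming variant reconciled
```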

While we were unable to determine the cause of each inconsistency between the RTVMs and SoDA Reports, a recent internal study indicated that the PRIME contractor needed to enforce requirements definition and program management compliance.  In addition, the PRIME contractor stated the SIT was a work in progress and the final SIT report would contain a complete, correct reference to all test cases in an RTVM and SoDA Report.  Without a complete and accurate test execution schedule, RTVM, and documented test cases, the IRS cannot determine if all system requirements are scheduled to be tested and if all test cases are documented. 

The IRS initially planned to complete corrective actions from our prior audit regarding strengthening SIT practices by October 31, 2003.  The IRS has extended the time needed to complete prior corrective actions until May 15, 2004.  Since the IRS is in the process of implementing corrective actions to our previous recommendation in this area, we are making no additional recommendations regarding inconsistencies between the RTVM, test execution schedules, and SoDA Reports. 

Some requirements did not include a verification and validation method

The ELC requires that all system requirements include a verification and validation (V&V) method. We found that 12 (approximately 26 percent) of the 47 system requirements for assisting disabled individuals in using technology that were listed in the RTVM did not include a V&V method. Without a V&V method, the IRS cannot determine how requirements will be verified and validated.  The PRIME contractor stated that each of these requirements had a V&V method; however, the documentation was not up to date.

Deviation forms did not include all signatures

The ELC provides guidance on the signatures required for moving and deleting requirements from a release.  One required signature for the IFS project is that of the IRS Internal Management Program Director. During the audit, the project team prepared deviation forms for 36 requirements.  However, the deviation forms did not include a signature line for the Internal Management Program Director.  Without all required signatures, the deviation forms are not being adequately approved.  The PRIME contractor stated it was not aware of the requirement to have the IRS Internal Management Program Director approve the deviation forms.

Defect management processes need improvement

A defect occurs during testing when the actual results differ from the expected results documented in the test cases.  Defects discovered during each part of testing are documented, fixed, and retested.

The PRIME contractor’s responsibilities include ensuring the defect reporting and tracking process is used to report and track defects for all modernization projects.  Defects encountered during testing are entered into the PRIME contractor’s Defect Report Tracking System, which is used to verify defect resolution and the closing of defects when tests have been successfully completed.
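
For illustration only, the defect lifecycle just described might be sketched as follows.  The states and names are hypothetical; the actual Defect Report Tracking System is the PRIME contractor's tool, and its internal workflow is not documented here.

```python
# Hypothetical sketch of the defect lifecycle described above; states and
# transitions are illustrative, not the Defect Report Tracking System's.
ALLOWED_TRANSITIONS = {
    "open": {"evaluated"},          # defect documented: actual != expected results
    "evaluated": {"fix_delivered"},
    "fix_delivered": {"retest"},
    "retest": {"closed", "open"},   # reopened if the retest fails
}

def advance(state: str, new_state: str) -> str:
    """Move a defect report to a new state, rejecting illegal transitions."""
    if new_state not in ALLOWED_TRANSITIONS.get(state, set()):
        raise ValueError(f"illegal transition: {state} -> {new_state}")
    return new_state

state = advance("open", "evaluated")     # defect triaged
state = advance(state, "fix_delivered")  # fix delivered for retest
```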

Based on our review of IFS defect reports, we communicated the following concerns to the BSMO on February 23, 2004.

IFS SIT defects have exceeded the average number of defects for a project of its size and complexity

The BSM Performance Management Office performs data analyses using a collection of industry data and provides an estimate of the total number of defects that will be found for each modernization project. The BSM Performance Management Office determined that projects similar to the IFS in size and complexity produced an average of 121 defects.

As of January 29, 2004, the IFS project team reported that it had encountered 190 defects.  As mentioned previously, the IFS project team has delayed the IFS deployment date until October 2004 and is continuing to test.  Therefore, the number of defects will undoubtedly increase. 

We asked the PRIME contractor whether it had performed any analyses to determine why there were so many defects and what actions could be taken to reduce the number of future defects.  The PRIME contractor stated it had performed a defect causal analysis but, as of March 31, 2004, had not provided us this analysis.  Without an understanding of why so many defects have been encountered, the IFS project team may continue to experience testing delays, and future modernization projects may not learn lessons based on the experience of the IFS project team.

Management Action: The BSMO requested that the PRIME contractor review the input that had been provided previously on the size and complexity of the IFS project.  The PRIME contractor responded with revised size figures.  Based on the revised figures, the BSMO recomputed the average number of defects for projects similar to IFS in size and complexity.  The new average number of defects is 440.  While the new average number of defects has increased, the number of actual defects encountered by the IFS project team has also increased.  As of April 20, 2004, the BSMO estimates the IFS project team has encountered 30 percent more defects than comparable projects.

Defects were not being evaluated and resolved timely

The ELC provides goals for timely evaluating and resolving defects based on the severity of each defect. See Appendix VI for a listing of these goals.

We randomly selected 50 (approximately 26 percent) of the 190 defects documented as of January 29, 2004. Six (12 percent) of the 50 defects were not evaluated timely, and 42 (84 percent) were not resolved timely.

Without an understanding of why defects are not evaluated and resolved timely, the IFS project team could continue to experience testing delays, and future modernization projects may not learn lessons based on the experience of the IFS project team.

The IFS project team documented this issue and created an action plan.  One of the actions was for the PRIME contractor to “analyze why SIT didn’t meet the throughput and develop a mitigation strategy to prevent reoccurrence.” The PRIME contractor could not provide any documentation on this mitigation strategy; however, it stated the mitigation strategy involved making sure developers are onsite with testers to view testing and understand any defects that are encountered.  The PRIME contractor also informed us it was now passing more test cases due to this new strategy.  We determined that the IFS project team had passed 171 test cases from the last workday of February through the month of March, compared to passing 96 test cases during the latter part of December and the month of January.

The ST&E Plan can be improved

The Department of the Treasury Directive 71-10 (August 23, 1999) establishes the security policy for requiring formal reviews of computer systems that will become operational.  The IRS Internal Revenue Manual (IRM) provides policies and guidance to be used by IRS organizations to carry out their respective responsibilities in information systems security.  The IRM indicates that testing and evaluation of computer systems must validate the correct implementation of security features.

The Office of Mission Assurance and its contractor are responsible for conducting ST&Es for all modernization projects.  An ST&E involves the planning and execution of security tests and the evaluation and analysis of the subsequent test results.  The goals of the ST&E are to identify the security profile of a particular system and to assess whether the system configuration and controls meet the security requirements and are ready for operation.

We reviewed the ST&E Plan and found it included an understandable outline of the ST&E that would be conducted for the IFS application.  In addition, the ST&E Plan included many test cases designed to ensure security requirements are met. 

We also found areas in which the ST&E Plan could be improved, and we communicated the following concerns to the BSMO and Office of Mission Assurance on December 17, 2003, and February 4, 2004. 

The ST&E Plan does not completely comply with the IRM

The IRM includes a template to be used when developing the ST&E Plan, but the IFS ST&E Plan does not follow this template.  Therefore, we reviewed the IFS ST&E Plan to determine if it included the information in the ST&E template.  For example, the ST&E template states that the Plan should describe the requirements, goals, and functions of the security system; why the system security is needed, what the system security should do, and how well the system should do it; and the system security functions as they relate to the system requirements.  However, we were unable to identify this information within the IFS ST&E Plan.

The Office of Mission Assurance informed us the IRM template was just a suggested format and further information could be added to the ST&E Plan as the test draws nearer.  Templates are created to bring about standardization.  When templates are not followed, needed information may not be included and lessons learned that led to the creation of the template may have to be relearned.

Some required security features are not being tested

The IRM states that testing and evaluation must validate the correct implementation of security features, including identification and authentication, audit capability, access controls, object reuse, user accountability, trusted recovery, network connectivity, and transmission encryption (if applicable).

Trusted recovery test cases were not included in the IFS ST&E Plan, and object reuse was being tested on only one of two computer platforms.  The Office of Mission Assurance informed us the trusted recovery controls were not being tested because testing these controls was too costly. The Office of Mission Assurance also informed us it was testing object reuse on both computer platforms; however, the test case it referred us to for one platform was not a test for object reuse. Furthermore, no other test cases for object reuse could be located for this platform.  If these tests are not conducted, the IRS will not have assurance the IFS meets all security requirements.

Entrance and exit criteria are not documented in the ST&E Plan

A best practice in test planning is to create a predefined set of criteria used to determine when a test is ready to be conducted (entrance criteria) and when a test has been completed at an acceptable quality level (exit criteria).  The ELC requires that entrance and exit criteria be defined for all tests, except the ST&E.

The Office of Mission Assurance informed us it was not required to document entrance and exit criteria in the ST&E Plan; however, it was aware of the conditions that must exist for the ST&E to start and complete.  Without documented entrance and exit criteria, ST&Es could start prematurely or stop prior to expected results being achieved.
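
As a purely illustrative sketch of this best practice, documented entrance and exit criteria can be as simple as an explicit, checkable list.  The criteria below are hypothetical and are not drawn from the ELC or the IFS ST&E Plan.

```python
# Purely illustrative: one way to make ST&E entrance and exit criteria
# explicit and auditable.  The criteria listed here are hypothetical.
STE_ENTRANCE_CRITERIA = [
    "ST&E Plan approved and under configuration control",
    "All security test cases documented, with expected results",
    "Test environment configured to match the production baseline",
    "Every security requirement traced to at least one test case",
]

STE_EXIT_CRITERIA = [
    "All planned security test cases executed",
    "All critical and high-severity findings resolved or formally accepted",
    "Test results documented and traced back to security requirements",
]

def ready(status: dict, criteria: list) -> bool:
    """A test should start (or close) only when every criterion is met."""
    return all(status.get(c, False) for c in criteria)

print(ready({c: True for c in STE_ENTRANCE_CRITERIA}, STE_ENTRANCE_CRITERIA))  # True
```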

Recommendations

To ensure IFS testing results in a high-quality product and future modernization projects are improved by lessons learned by the IFS project team, the CIO should ensure:

3.      The RTVM is updated to include a verification and validation method for all requirements.

Management’s Response:  The IRS responded that the RTVM will be updated.  In addition, the new ELC framework will require updates to the RTVM during the project’s life cycle.

4.      Deviation forms are updated to include a signature line for all required approving officials.

Management’s Response:  The IRS is now including the correct signature line on deviation forms.

5.      The root cause is investigated and resolved for why defects are more voluminous than expected and for why defects are being evaluated and resolved slower than expected.

Management’s Response:  The IRS agreed with our recommendation and will ensure that a lessons learned report is generated.  In addition, the IRS revised its defect estimate and found the project was within expectations once a corrected defect estimate was calculated.   

6.      The ELC is updated to include a requirement for entrance and exit criteria in the ST&E Plan.

Management’s Response:  The IRS will be incorporating changes into the milestone exit requirements.

Office of Audit Comment:  The CIO agreed with our recommendation; however, the corrective action provided will not adequately address our recommendation.  Changing the milestone exit criteria will not create a requirement for entrance and exit criteria in the ST&E Plan.

To ensure planning for the upcoming IFS ST&E is as thorough as possible, the Chief, Mission Assurance, should ensure:

7.      All needed elements from the IRM are included in the ST&E Plan.

Management’s Response:  The IRS responded that the IFS ST&E Plan or its associated documents will be updated to address needed elements as identified in the IRM.

 

8.      Trusted recovery and object reuse testing occurs or the risk of not conducting these tests is documented.

Management’s Response:  The CIO disagreed with our recommendation to ensure trusted recovery and object reuse testing occurs, or the risk of not conducting these tests is documented.  The IRS responded that object reuse testing is no longer required by the Department of the Treasury.

Office of Audit Comment:  While the Department of the Treasury requirement may have changed, the IRM still requires object reuse testing.  Therefore, we believe the IRS should conduct the required testing, particularly since it plans to perform object reuse testing on one computer platform and not the other.  In addition, the IRS’ response did not address trusted recovery testing. 

 

Appendix I

 

Detailed Objectives, Scope, and Methodology

 

The overall objective of this review was to determine if adequate testing processes and procedures were being followed prior to the deployment of the Integrated Financial System (IFS) to ensure the system meets expectations.  Additionally, we determined if the IFS project team was adequately planning for the transition of operation and maintenance of the system from the contractor to the Internal Revenue Service (IRS). 

To achieve these objectives, we:

I.      Determined if testing support processes for the IFS Release 1.0 were being adequately planned for deployment.

A.     Determined if all entrance and exit criteria were being met prior to initiating and exiting the various parts of the System Integration Testing (SIT).

B.     Determined if the IFS project team was adequately controlling waivers and deferrals.

C.     Determined if all aspects of the IFS application would be tested during the SIT.

D.     Determined if test cases included required elements and all scheduled test cases were documented.  NOTE:  The PRIME contractor provided 523 test cases for the second part of SIT and 103 test cases for the third part of SIT.  To accomplish this audit step, we chose a random sample of 75 test cases (approximately 14 percent) from the second part of SIT and 10 test cases (approximately 10 percent) from the third part of SIT for review.  We did not use a statistical sample because we were not going to project the results to the entire population.

E.      Determined the quality of the product the PRIME contractor had delivered for the SIT by comparing the actual number of defects with the estimated number of defects.

F.      Determined if the defect reporting process was being followed and defects were being resolved in accordance with the Enterprise Life Cycle (ELC) during each part of the SIT.  NOTE:  The PRIME contractor provided 190 defect reports from SIT as of January 29, 2004.  To accomplish this audit step, we chose a random sample of 50 defect reports (approximately 26 percent) for review.  We did not use a statistical sample because we were not going to project the results to the entire population.

II.     Determined if the IRS adequately planned for the Security Test and Evaluation (ST&E) by reviewing the ST&E Plan.

III.    Determined if adequate plans were in place to ensure the Information Technology Services (ITS) and the Chief Financial Officer (CFO) organizations would be ready to provide technical and infrastructure support for the IFS Release 1.0 after Milestone 5.

A.     Determined if all training needed to accomplish the transfer of the IFS Release 1.0 to the IRS had been identified and scheduled.

B.     Determined if post-Milestone 5 roles and responsibilities had been defined for the CFO and ITS organizations and for the PRIME contractor.

C.     Determined if the IFS project team had documented how to handle changes to the Release 1.0 software after Milestone 5.

D.     Compared the steps being taken to transfer knowledge about the IFS Release 1.0 to those taken for the Custodial Accounting Project (CAP) Release 1.0 to determine if there were any additional items the IFS project team should be taking into account.  NOTE:  The CAP is a similar project that should have similar transition activities.

E.      Reviewed the IFS Transition Management Plan and determined if it included all required ELC elements.

F.      Determined if the risk reduction plans regarding ITS readiness were on track and key activities were being completed.

NOTE:  Additional work was scheduled concerning the IFS Deployment Site Readiness Test (DSRT). Due to delays in IFS testing, at the time of our audit the IFS DSRT Test Plan was not available for review and DSRT testing had not been initiated.

 

Appendix II

 

Major Contributors to This Report

 

Margaret E. Begg, Assistant Inspector General for Audit (Information Systems Programs)

Gary V. Hinkle, Director

Troy D. Paterson, Audit Manager

Mark K. Carder, Senior Auditor

Charlene L. Elliston, Auditor

Perrin T. Gleaton, Auditor

 

Appendix III

Report Distribution List

 

Commissioner  C

Office of the Commissioner – Attn:  Chief of Staff  C

Associate Chief Information Officer, Business Systems Modernization  OS:CIO:B

Associate Chief Information Officer, Management  OS:CIO:M

Associate Chief Information Officer, Modernization Management  OS:CIO:MM

Deputy Associate Chief Information Officer, Program Management OS:CIO:B:PM

Acting Director, Internal Management Modernization  OS:CIO:B:PM:IMM

Director, Stakeholder Management  OS:CIO:SM

Chief Counsel  CC

National Taxpayer Advocate  TA

Director, Office of Legislative Affairs  CL:LA

Director, Office of Program Evaluation and Risk Analysis RAS:O

Office of Management Controls  OS:CFO:AR:M

Audit Liaisons: 

Associate Chief Information Officer, Business Systems Modernization  OS:CIO:B

Chief, Mission Assurance  OS:MA

 

Appendix IV

 

Enterprise Life Cycle Overview

 

The Enterprise Life Cycle (ELC) defines the processes, products, techniques, roles, responsibilities, policies, procedures, and standards associated with planning, executing, and managing business change. It includes redesign of business processes; transformation of the organization; and development, integration, deployment, and maintenance of the related information technology applications and infrastructure.  Its immediate focus is the Internal Revenue Service (IRS) Business Systems Modernization (BSM) program.  Both the IRS and the PRIME contractor must follow the ELC in developing/acquiring business solutions for modernization projects.

The ELC framework is a flexible and adaptable structure within which one plans, executes, and integrates business change. The ELC process layer was created principally from the Computer Sciences Corporation’s Catalyst® methodology.  It is intended to improve the acquisition, use, and management of information technology within the IRS; facilitate management of large-scale business change; and enhance the methods of decision making and information sharing. Other components and extensions were added as needed to meet the specific needs of the IRS BSM program.

ELC Processes

A process is an ordered, interdependent set of activities established to accomplish a specific purpose.  Processes help to define what work needs to be performed. The ELC methodology includes two major groups of processes:

·        Life-Cycle Processes, which are organized into phases and subphases and which address all domains of business change.

·        Management Processes, which are organized into management areas and which operate across the entire life cycle.

 

Enterprise Life-Cycle Processes

 

The chart was removed due to its size.  To see the chart, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.

Life-Cycle Processes

The life-cycle processes of the ELC are divided into six phases, as described below:

·                     Vision and Strategy - This phase establishes the overall direction and priorities for business change for the enterprise.  It also identifies and prioritizes the business or system areas for further analysis.

·                     Architecture - This phase establishes the concept/vision, requirements, and design for a particular business area or target system.  It also defines the releases for the business area or system.

·                     Development - This phase includes the analysis, design, acquisition, modification, construction, and testing of the components of a business solution.  This phase also includes routine planned maintenance of applications.

·                     Integration - This phase includes the integration, testing, piloting, and acceptance of a release.  In this phase, the integration team brings together individual work packages of solution components developed or acquired separately during the Development phase. Application and technical infrastructure components are tested to determine whether they interact properly. If appropriate, the team conducts a pilot to ensure all elements of the business solution work together.

·                     Deployment - This phase includes preparation of a release for deployment and actual deployment of the release to the deployment sites.  During this phase, the deployment team puts the solution release into operation at target sites.

·                     Operations and Support - This phase addresses the ongoing operations and support of the system.  It begins after the business processes and system(s) have been installed and have begun performing business functions.  It encompasses all of the operations and support processes necessary to deliver the services associated with managing all or part of a computing environment.

The Operations and Support phase includes the scheduled activities, such as planned maintenance, systems backup, and production output, as well as the nonscheduled activities, such as problem resolution and service request delivery, including emergency unplanned maintenance of applications.  It also includes the support processes required to keep the system up and running at the contractually specified level.

Management Processes

Besides the life-cycle processes, the ELC also addresses the various management areas at the process level.  The management areas include:

·                     IRS Governance and Investment Decision Management - This area is responsible for managing the overall direction of the IRS, determining where to invest, and managing the investments over time.

·                     Program Management and Project Management - This area is responsible for organizing, planning, directing, and controlling the activities within the program and its subordinate projects to achieve the objectives of the program and deliver the expected business results.

·                     Architectural Engineering/Development Coordination - This area is responsible for managing the technical aspects of coordination across projects and disciplines, such as managing interfaces, controlling architectural changes, ensuring architectural compliance, maintaining standards, and resolving issues.

·                     Management Support Processes - This area includes common management processes, such as quality management and configuration management, that operate across multiple levels of management.

Milestones

The ELC establishes a set of repeatable processes and a system of milestones, checkpoints, and reviews that reduce the risks of systems development, accelerate the delivery of business solutions, and ensure alignment with the overall business strategy. The ELC defines a series of milestones in the life-cycle processes.  Milestones provide for “go/no-go” decision points in the project and are sometimes associated with funding approval to proceed.  They occur at natural breaks in the process where there is new information regarding costs, benefits, and risks and where executive authority is necessary for next phase expenditures.

There are five milestones during the project life cycle: 

·                     Milestone 1 - Business Vision and Case for Action.  In the activities leading up to Milestone 1, executive leadership identifies the direction and priorities for IRS business change.  These guide which business areas and systems development projects are funded for further analysis.  The primary decision at Milestone 1 is to select BSM projects based on both the enterprise-level Vision and Strategy and the Enterprise Architecture.

·                     Milestone 2 - Business Systems Concept and Preliminary Business Case.  The activities leading up to Milestone 2 establish the project concept, including requirements and design elements, as a solution for a specific business area or business system.  A preliminary business case is also produced.  The primary decision at Milestone 2 is to approve the solution/system concept and associated plans for a modernization initiative and to authorize funding for that solution.

·                     Milestone 3 - Business Systems Design and Baseline Business Case.  In the activities leading up to Milestone 3, the major components of the business solution are analyzed and designed.  A baseline business case is also produced.  The primary decision at Milestone 3 is to accept the logical system design and associated plans and to authorize funding for development, test, and (if chosen) pilot of that solution.

·                     Milestone 4 - Business Systems Development and Enterprise Deployment Decision.  In the activities leading up to Milestone 4, the business solution is built.  The system is integrated with other business systems and tested, piloted (usually), and prepared for deployment.  The primary decision at Milestone 4 is to authorize the release for enterprise-wide deployment and commit the necessary resources.

·                     Milestone 5 - Business Systems Deployment and Postdeployment Evaluation.  In the activities leading up to Milestone 5, the business solution is fully deployed, including delivery of training on use and maintenance.  The primary decision at Milestone 5 is to authorize the release of performance-based compensation based on actual, measured performance of the business system.

 

Appendix V

 

Test Case Elements Required by the Enterprise Life Cycle

 

The Enterprise Life Cycle requires the following elements to be included in test cases.

Test Case Identifier

The test case identifier should identify the PROJECT and/or TEST PHASE, RELEASE, CONFIGURATION ITEM and/or REQUIREMENTS CATEGORY (Functional, Security, Performance, Regression, etc.), TEST CASE NUMBER (unique alpha numeric or numeric identifier), and, if the same test is to be executed at multiple locations or against multiple components, a LOCATION identifier.

Test Script Identifier

The test script identifier should specify the unique identifier of each script necessary to exercise the functionality and requirements.

Test Case Description

The test case description provides the condition (or object or application state) being tested, use case, use-case scenario, or technical or supplemental requirement from which the test case is derived.

Requirement Category

The requirement category helps to categorize test cases.  Examples of test types are Performance, Cycle, Data Conversion, Security, and External Function.

Verification and Validation Method

The verification and validation method classifies the approach for ensuring the implementation of the requirement(s), such as analysis, inspection, test, and demonstration.

Requirements

Requirement number and description, also called test inputs, must be provided to identify the functionality a test is validating. Examples of test inputs are requirements, model elements, spreadsheet values, etc.

PreConditions

A precondition is the description of the constraints a test case requires before it is run.  Examples of preconditions include:

·        Other test case dependencies.

·        Input data conditions.

·        Input files.

·        Preparation activities.

·        Account or password setup.

PostConditions

The postconditions of a test case are actions required to restore the tested system to its original state once the test has passed.

Configuration

The configuration refers to the hardware and software components and specific settings required for proper execution of the test case.

Acceptance Criteria

The acceptance criteria are the expected results stated in terms of the output state, condition, or data value(s) that provide evidence or proof a requirement has been satisfied.
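
For illustration only, the following sketch shows one way the required elements might be represented and checked for completeness.  The field names and example values are hypothetical and are not part of the ELC template itself.

```python
# Hypothetical sketch: representing the required test case elements and
# flagging incomplete test cases.  Field names are illustrative only.
REQUIRED_ELEMENTS = [
    "test_case_identifier", "test_script_identifier", "description",
    "requirement_category", "vv_method", "requirements",
    "preconditions", "postconditions", "configuration", "acceptance_criteria",
]

def missing_elements(test_case: dict) -> list:
    """Return the required elements that are absent or blank."""
    return [e for e in REQUIRED_ELEMENTS if not str(test_case.get(e, "")).strip()]

example = {
    "test_case_identifier": "IFS-R1-SIT2-TAXES-001",  # project/phase, release, number
    "description": "Validate posting of a voucher to the general ledger",
    "requirement_category": "External Function",
    "vv_method": "Test",
}
print(missing_elements(example))  # lists the six elements left blank here
```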

 

Appendix VI

 

Defect Tracking and Resolution Time Periods

 

Table 1 depicts the goals for evaluating and resolving defect reports.

Table 1:  Severity and Schedule of Resolution

Severity 1 – Critical:  Development team evaluates the defect report (DR) within an hour and works the resolution around the clock with the goal of delivering a fix within 24 hours, unless otherwise scheduled.

Severity 2 – High:  Development team evaluates the DR within 1 business day with the goal of delivering a fix within 3 business days, unless otherwise scheduled.

Severity 3 – Medium and Severity 4 – Low:  Development team evaluates the DR within 3 business days with the goal to deliver all severity 3 and 4 fixes.  The effort for resolving severity 3 and 4 DRs will be assessed according to the number of severity 1 and 2 DRs being worked, the remaining test schedule, and management decisions to defer fixes to a later release.

Source:  Enterprise Life Cycle (ELC).  See Appendix IV for an overview of the ELC.
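
For illustration only, the evaluation goals in Table 1 might be applied to a defect report as follows.  This is a simplified, hypothetical check: business days are approximated as calendar days, whereas a real check would skip weekends and holidays.

```python
# Hypothetical sketch of checking a defect report (DR) against the
# evaluation goals in Table 1 above.  Simplification: business days are
# approximated as calendar days.
from datetime import datetime, timedelta

EVALUATION_GOALS = {
    1: timedelta(hours=1),  # Critical: evaluate within an hour
    2: timedelta(days=1),   # High: within 1 business day
    3: timedelta(days=3),   # Medium: within 3 business days
    4: timedelta(days=3),   # Low: within 3 business days
}

def evaluated_timely(severity: int, opened: datetime, evaluated: datetime) -> bool:
    """True if the DR was evaluated within the goal for its severity."""
    return evaluated - opened <= EVALUATION_GOALS[severity]

# A severity 3 defect evaluated 4 days after it was opened misses the goal.
print(evaluated_timely(3, datetime(2004, 1, 12, 9), datetime(2004, 1, 16, 9)))  # False
```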

 

Appendix VII

 

Management’s Response to the Draft Report

 

The response was removed due to its size.  To see the response, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.