Risks Are Mounting as the Integrated Financial System Project Team Strives to Meet an Aggressive Implementation Date

 

October 2003

 

Reference Number:  2004-20-001

 

 

This report has cleared the Treasury Inspector General for Tax Administration disclosure review process, and information determined to be restricted from public release has been redacted from this document.

 

October 7, 2003

 

 

MEMORANDUM FOR CHIEF INFORMATION OFFICER

 

FROM:     Gordon C. Milbourn III /s/ Gordon C. Milbourn III

                 Assistant Inspector General for Audit (Small Business and

                 Corporate Programs)

 

SUBJECT:     Final Audit Report - Risks Are Mounting as the Integrated Financial System Project Team Strives to Meet an Aggressive Implementation Date (Audit # 200320038)

 

This report presents the results of our review of the Integrated Financial System (IFS) Release 1.  The overall objective of this review was to assess whether the Business Systems Modernization Office (BSMO) and the PRIME contractor have controls in place to ensure that activities for testing, business contingencies, enterprise architecture compliance, and transition management are adequately planned for the IFS Release 1.  Additionally, we reviewed any recent or anticipated changes to the costs and benefits of the IFS Release 1.

Beginning in 1995, the General Accounting Office designated the Internal Revenue Service’s (IRS) financial management area as a high-risk Federal Government operation.  The IRS intends to address administrative financial management weaknesses by implementing the IFS. 

During the audit period, the IFS project team made progress toward implementing the first release of the IFS in October 2003.  Specifically, the project team has begun important testing activities, ensured compatibility with the Security and Technology Infrastructure Release (STIR), and ensured redundant hardware is planned for implementation.  However, we also determined that testing practices could be improved, project costs are increasing, some functionality has been postponed, and disaster recovery will not be optimal or fully tested prior to implementation.

Since the IFS project team is in the midst of critical testing activities, we communicated the results of our analyses intermittently during the audit.  Therefore, this report is historical in nature and may not reflect the most current testing activities or processes.  A follow-up audit will be conducted to provide additional analyses and recommendations as the IFS project works toward implementation.

To help ensure that a high-quality system is delivered, we recommended that the Chief Information Officer (CIO) ensure that testing practices are strengthened in future tests, data cleaning issues and the business risk of untimely IFS implementation are formally tracked, and life cycle documentation is updated.  We also recommended that the CIO ensure that independent testing roles are documented, disaster recovery capabilities are implemented and tested as soon as possible, and IFS classification in the draft Technical Contingency Planning Document is reconsidered.

Management’s Response:  BSMO management requested an extension to respond to our draft report from September 25, 2003, to October 2, 2003.  As of October 3, 2003, management had not responded to the draft report.

Copies of this report are also being sent to the IRS managers affected by the report recommendations.  If you have questions, please contact me at (202) 622-6510 or Margaret E. Begg, Assistant Inspector General for Audit (Information Systems Programs), at (202) 622-8510.

 

Table of Contents

Background

The Project Team Is Making Progress Toward Its Scheduled Implementation Date

Project Testing Practices Can Be Improved

Recommendations 1 and 2:

Recommendations 3 through 5:

Project Costs Are Increasing and Some Functionality Has Been Postponed

Disaster Recovery Will Not Be Optimal or Fully Tested Before Initial Implementation

Recommendations 6 and 7:

Appendix I – Detailed Objective, Scope, and Methodology

Appendix II – Major Contributors to This Report

Appendix III – Report Distribution List

Appendix IV – Independent Testing Roles Example

Appendix V – Integrated Financial System Testing Risk and Complexity

 

Background

The Internal Revenue Service (IRS) is currently modernizing its computer systems and business processes and practices.  This effort is known as Business Systems Modernization (BSM).  One of the BSM projects is the Integrated Financial System (IFS), which will help to modernize the IRS’ financial systems and processes. 

Beginning in 1995, the General Accounting Office (GAO) designated the IRS’ financial management area as a high-risk Federal Government operation.  In Fiscal Year (FY) 2002, the President included “improving financial performance” as one of five Government-wide areas needing improvement.  Recently, the GAO concluded that financial systems and internal control weaknesses continue to preclude the IRS from providing managers with financial information needed to make day-to-day decisions.

The IRS intends to address administrative financial management weaknesses by implementing the IFS.  The first release of the IFS will include the Accounts Payable, Accounts Receivable, General Ledger, Budget Execution, Cost Management, and Financial Reporting activities.  A future IFS release will be needed to fully resolve all administrative financial management weaknesses.  Therefore, full resolution of these weaknesses is not scheduled until January 2006.

This review was performed at the IRS National Headquarters and the Business Systems Modernization Office (BSMO) facilities in New Carrollton, Maryland.  The audit was conducted between January and July 2003 in accordance with Government Auditing Standards.  To provide timely feedback during critical testing activities, we communicated the results of our analyses intermittently during the audit.  These communications are discussed throughout the report.

The BSMO and the PRIME contractor were making changes to IFS testing activities during our review period, and changes that have occurred since we concluded our analysis in early July 2003 are not reflected in this report.  As a result, this report may not reflect the most current activities or processes.  We plan to conduct a follow-up audit to assess changes being made and future activities as the IFS project team works toward implementing the IFS Release 1.  Detailed information on our audit objective, scope, and methodology is presented in Appendix I.  Major contributors to the report are listed in Appendix II.

The Project Team Is Making Progress Toward Its Scheduled Implementation Date

The IFS project team has made progress toward implementing the IFS Release 1 in October 2003.  Specifically, the project team has begun important testing activities, ensured compatibility with the Security and Technology Infrastructure Release (STIR), and ensured redundant hardware is planned for implementation.

Important IFS testing activities have begun

In May 2003, the project team created an Application Qualification Testing (AQT) Plan describing the procedures to be followed during AQT.  AQT involves testing an application before it is integrated with other release components.  The intent is to discover and resolve errors prior to more formal systems integration testing.  The AQT Plan included a detailed schedule of activities to be completed and predefined entrance and exit criteria.  In addition, the project team conducted a test readiness review to determine if the project was ready to proceed with AQT.  In July 2003, the first part of AQT was completed.

In May 2003, the project team delivered a data conversion test plan that describes the methods for ensuring that data conversion programs work correctly.  The plan included a detailed schedule of activities to be followed.  At the end of our audit work, the first data conversion test was underway.
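
To illustrate the kind of verification a data conversion test performs, the following minimal Python sketch reconciles record counts and a control total between an old and a new system.  The field names, records, and amounts are hypothetical and are not drawn from the IFS data conversion test plan.

```python
# Illustrative reconciliation of a data conversion run.
# All field names and amounts are hypothetical examples, not items
# from the actual IFS data conversion test plan.

def reconcile_conversion(source_rows, target_rows, amount_field="amount"):
    """Compare record counts and a control total between systems."""
    issues = []
    if len(source_rows) != len(target_rows):
        issues.append(
            f"Record count mismatch: source={len(source_rows)}, "
            f"target={len(target_rows)}"
        )
    source_total = sum(row[amount_field] for row in source_rows)
    target_total = sum(row[amount_field] for row in target_rows)
    if source_total != target_total:
        issues.append(
            f"Control total mismatch: source={source_total}, "
            f"target={target_total}"
        )
    return issues

# Example usage with toy data:
old_system = [{"id": 1, "amount": 100}, {"id": 2, "amount": 250}]
new_system = [{"id": 1, "amount": 100}, {"id": 2, "amount": 205}]
for issue in reconcile_conversion(old_system, new_system):
    print(issue)  # flags the transposed amount for follow-up
```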

In July 2003, the first part of the System Integration Test (SIT) began.  The SIT ensures that all system components (hardware and software) are working correctly and collectively with other related or dependent systems.  A follow-up audit will be conducted to review the results of the SIT.

The IFS is compatible with the STIR

As part of an earlier review by the BSMO, the project team was required to prove that a unique component of the IFS was compatible with the STIR.  Early testing has proven that this component works with the STIR.

Redundant hardware is planned for implementation

Based on a review of project documentation with a MITRE official, we determined that hardware redundancy is planned for initial implementation of the IFS at the production site.  Therefore, the system should be able to recover from a short-term equipment failure if hardware devices are configured correctly.

While the BSMO and PRIME contractor have made significant strides toward delivering the first release of the IFS, the project risks are mounting, and the project team is beginning to encounter obstacles as it attempts to meet the aggressive schedule for an October 2003 implementation date.

Project success can be defined as meeting cost, schedule, and quality constraints, often portrayed as a triangle (see Figure 1).  Any change to one of these three constraints will affect one or both of the remaining constraints. 

Figure 1:  Classic Project Management Triangle

The figure was removed due to its size.  To see the figure, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.

 

At the end of our audit fieldwork, the IFS project team was holding to the October 2003 implementation date.  Therefore, our review focused on risks in the areas of quality and cost.  We identified the following areas that BSMO and PRIME management will need to focus on as they strive to meet an aggressive schedule:

·        Project testing practices can be improved.

·        Costs are increasing and some functionality has been postponed.

·        Disaster recovery will not be optimal or fully tested before initial implementation.

Project Testing Practices Can Be Improved

At the end of our audit fieldwork, a significant set of tests, known as the SIT, was beginning and data conversion testing was continuing.  We determined that the following improvements should be made as testing progresses:

·        Lessons learned from the AQT should be applied to the SIT.

·        Data conversion issues should be tracked.

·        Independent testing roles should be defined.

·        Risk reduction planning regarding implementation uncertainties should be formalized.

Lessons learned from the AQT should be applied to the SIT

The IFS project team delivered AQT planning materials just prior to the start of AQT.  Therefore, any deficiencies that were noted could not be corrected in time to meet the scheduled AQT start date.  If these deficiencies are not corrected prior to the SIT, the IFS may not be thoroughly tested, and the IRS may accept a system that does not function as intended.

The AQT Plan included a set of entrance criteria, test cases, a detailed schedule of test cases to be run, and a requirements traceability verification matrix.  Based on our review of this documentation, we communicated the following concerns to BSMO officials on June 18, 2003.

·        Entrance criteria – The AQT Plan documents the entrance criteria that should be met for AQT to proceed (e.g., test cases are fully documented and all testing personnel are identified).  We determined that test cases were not fully documented and all testing personnel were not identified.  Therefore, the AQT entrance criteria were not met before starting AQT activities.

·        Test cases – A test case should contain a specific set of conditions, data, and expected results for a particular test objective.  However, not all AQT test cases were complete.  For example, test cases did not always provide the detailed steps for executing the tests.  In addition, test cases did not always include adequate expected results.  The Chief Financial Officer (CFO) function reviewed the test cases and also found them to be lacking adequate expected results.

·        Test schedule – Some of the test cases that were scheduled for execution as part of AQT could not be located in the supporting documentation received as part of the AQT Plan.   To adequately plan, all test cases that are scheduled should be documented.

·        Requirements Traceability – The requirements traceability verification matrix is a tool used to ensure that each system requirement is tested by assigning it to one or more test cases.  This tool helps to ensure that all requirements are tested.  We found that the AQT project schedule referred to test cases that were not in the AQT requirements traceability matrix.  To adequately plan, the matrix should be complete.
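
To illustrate the completeness checks a traceability matrix supports, the following minimal Python sketch cross-references a matrix against a test schedule, flagging requirements with no assigned test case and scheduled test cases that are missing from the matrix.  The identifiers are hypothetical, not actual AQT artifacts.

```python
# Illustrative traceability check: every requirement should map to at
# least one test case, and every scheduled test case should appear in
# the matrix.  The identifiers below are hypothetical examples.

traceability_matrix = {
    "REQ-001": ["TC-101", "TC-102"],
    "REQ-002": ["TC-103"],
    "REQ-003": [],            # requirement with no assigned test case
}
scheduled_test_cases = {"TC-101", "TC-102", "TC-103", "TC-104"}

untested = [req for req, cases in traceability_matrix.items() if not cases]
matrixed = {tc for cases in traceability_matrix.values() for tc in cases}
unmatrixed = scheduled_test_cases - matrixed

print("Requirements with no test case:", untested)        # ['REQ-003']
print("Scheduled test cases not in matrix:", unmatrixed)  # {'TC-104'}
```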

Data conversion issues should be tracked

Data conversion testing is needed when converting data from an old system to a new system.  Due to the nature of data conversions, it is likely that inaccuracies will be encountered with data from the old system.  Under the pressure of a tight schedule, the IFS project team should be acutely aware of data accuracy and validation issues that could delay timely IFS implementation.

The PRIME contractor informed us that it was referring data integrity issues to CFO personnel for resolution.  At the time, the PRIME contractor had identified a relatively small number of data conversion issues and was not formally tracking them.  Since data conversion activities will continue for the next several months, significant errors, or a large number of them, may still be found.  On June 24, 2003, we communicated to BSMO officials that data cleaning and validation issues should be tracked more formally. 

For instance, the National Aeronautics and Space Administration (NASA) used a tracking form and measures to record data integrity issues and track progress as it was converting data to the same software package that the IFS project is implementing.  If issues are not tracked and large volumes of data integrity issues are encountered just prior to implementation, IFS implementation could be delayed. 
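
As a sketch of what more formal tracking might look like, the following Python example records each data cleaning issue with a status and derives simple progress measures, similar in spirit to the NASA tracking form described above.  The issue entries, fields, and status values are hypothetical.

```python
# Illustrative issue log for data cleaning/validation findings.
# Fields, entries, and status values are hypothetical, shown only to
# suggest the measures a formal tracking mechanism could produce.

from collections import Counter
from dataclasses import dataclass

@dataclass
class DataIssue:
    issue_id: str
    description: str
    source_system: str
    status: str  # e.g., "open", "referred to CFO", "resolved"

issue_log = [
    DataIssue("DI-001", "Vendor code missing", "AFS", "resolved"),
    DataIssue("DI-002", "Negative obligation amount", "AFS", "open"),
    DataIssue("DI-003", "Duplicate document number", "AFS", "referred to CFO"),
]

counts = Counter(issue.status for issue in issue_log)
print("Issues by status:", dict(counts))
print("Percent resolved:", 100 * counts["resolved"] / len(issue_log))
```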

Management Action:  During discussions of a preliminary version of this report, the BSMO indicated that it was using some tracking mechanisms to monitor data cleaning issues.  As of the date of the draft report, the BSMO was reviewing NASA information provided by the audit team to determine if additional detail was needed to formally track data cleaning issues.

Independent testing roles should be defined

The Enterprise Life Cycle (ELC) provides that the IRS Product Assurance function will perform a Systems Acceptance Test (SAT) on each modernization project.  A SAT independently assesses the quality of a system and the system’s readiness for implementation.  However, the Product Assurance function is not performing an independent SAT for the IFS.  Instead, the SIT and the SAT are being combined into one test, which will be performed by a combination of PRIME contractor and CFO personnel. 

The Product Assurance function determined that it would not conduct a SAT for the IFS because it normally only conducts a SAT on systems that affect taxpayers, and it did not have the skills needed to test the new software and existing interfaces.  In addition, the combined testing approach was employed in response to IRS executive direction to reduce overall testing costs.

To ensure that adequate independent testing is still being provided without an independent SAT, we conducted interviews with the CFO, Product Assurance, and BSMO functions.  We determined that the CFO function is playing a significant role during testing.  In addition, the Associate Commissioner, BSM, commented that he had asked project officials to try to ensure some form of Product Assurance function involvement during testing, due to the Product Assurance function’s skills in reviewing testing practices.  The Product Assurance function agreed to provide testing support by reviewing high-risk test cases and system documentation.

While we agree that the significant role of the CFO function, with assistance from the Product Assurance function in high-risk areas, helps to reduce the risk of not conducting an independent SAT, we are concerned that we could find no clear documentation ensuring that all system requirements are being independently verified.  For instance, CFO and Product Assurance function personnel are concentrating on financial requirements.  However, the IRS defined certain nonfinancial requirements that will also need to be tested.  While these requirements may be tested independently, we could not find any documentation that clearly revealed that all requirements would be independently tested or verified by the IRS.  If IRS personnel do not independently ascertain that the correct items are being tested and that test results meet expectations, then the IRS may accept a system that does not perform as expected.

The Product Assurance function is also not conducting an independent SAT for the Custodial Accounting Project, a related BSM project.  However, the Custodial Accounting Project team is using a roles matrix to ensure that all test cases will be independently executed or validated by IRS personnel.  Please see Appendix IV for an independent testing roles matrix example.  We believe it would be prudent for the IFS project team to develop a similar matrix. 
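
A roles matrix of this kind also lends itself to a simple automated completeness check: flag any test type that the IRS neither reviews, executes, nor validates.  The following Python sketch is modeled loosely on the example in Appendix IV, in which test type E has no independent IRS involvement; all entries are hypothetical.

```python
# Illustrative check of an independent testing roles matrix.
# For each test type, record which party performs each role, then flag
# any test type with no IRS involvement.  Modeled loosely on the
# Appendix IV example; all entries are hypothetical.

roles_matrix = {
    #  test type: {role: party}
    "A": {"review": "IRS", "execute": "PRIME", "validate": "IRS"},
    "B": {"review": "PRIME", "execute": "IRS", "validate": "IRS"},
    "C": {"review": "IRS", "execute": "PRIME", "validate": "PRIME"},
    "D": {"review": "PRIME", "execute": "PRIME", "validate": "IRS"},
    "E": {"review": "PRIME", "execute": "PRIME", "validate": "PRIME"},
}

gaps = [
    test_type
    for test_type, roles in roles_matrix.items()
    if "IRS" not in roles.values()
]
print("Test types with no independent IRS role:", gaps)  # ['E']
```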

Management Action:  During discussions of a preliminary version of this report, the BSMO indicated that it had used information provided by the auditors during the review to fill out an initial matrix.  The BSMO indicated that it is now using the initial matrix to ensure that adequate independent testing is being conducted.  

Risk reduction planning regarding implementation uncertainties should be formalized

Testing risk and complexity are increasing, which puts the October 2003 implementation date at risk (see Appendix V for details).  Therefore, sound management practices dictate that the IRS begin planning for contingencies due to implementation uncertainties. 

The functions currently being performed by the Automated Financial System (AFS) must continue into FY 2004 on either the existing AFS or the new IFS.  Therefore, the IRS needs to plan to reduce the business risk in case one of the following scenarios occurs:

·        The IFS is not implemented in time for FY 2004 processing.

·        The IFS is implemented in October 2003, but significant problems are encountered after implementation.

We were informed that the CFO function was taking steps to ensure that the vendor who maintains the AFS could be brought on board in time for FY 2004 processing.  We were also informed that files were being readied for both systems as a contingency.  However, the IRS had not formally documented the risk and the associated plan to reduce the business impact if the IFS is deployed but encounters implementation problems.  If the IRS does not document and track activities needed due to implementation uncertainties, needed actions may not be planned for and the IRS may not be able to process financial transactions on either the old or new system at the beginning of FY 2004.

Recommendations

To ensure that a high-quality IFS is delivered, the Chief Information Officer should ensure that:

1.      SIT practices are strengthened based on lessons learned during the initial AQT.

Management’s Response:  BSMO management requested an extension to respond to our draft report from September 25, 2003, to October 2, 2003.  As of October 3, 2003, management had not responded to the draft report.

2.      Data cleaning and validation issues are formally tracked.

3.      Risk reduction activities being taken in case of untimely IFS implementation are formally documented and tracked.

4.      The ELC is updated with the new testing strategy, if it is determined that the strategy is successful and will be employed in the future.  If an ELC update is made, ensure that there is a requirement to document independent acceptance roles when the Product Assurance function is not providing full independent assurance.

5.      An independent testing roles matrix is prepared for the IFS.

Project Costs Are Increasing and Some Functionality Has Been Postponed

The IFS project team delivered a baseline business case in January 2003.  The baseline business case provides the estimated costs and benefits of the project.  The BSMO also submits BSM Spending Plans to provide the Congress with justification to release funds specifically set aside for the BSM effort.

During our audit fieldwork, an update to the baseline business case was not available for review.  Therefore, we interviewed officials, analyzed GAO reports, and reviewed change requests to determine if any cost increases were anticipated or had already occurred during FY 2003.

The GAO reported in June 2003 that the IFS project had a net budget increase of $20 million between the November 2002 BSM Spending Plan and the March 2003 BSM Spending Plan.  The increase was needed to cover revised labor estimates and an increase in infrastructure costs. 

The IRS is also considering moving some functionality originally planned for delivery in October 2003 to early Calendar Year 2004.  According to the IRS, the delay in functionality will not have an adverse effect on the IRS, as the functionality is not needed until Calendar Year 2004.  The IRS estimates that the cost increase to deliver delayed functionality could be $7 million, due to the additional labor costs the PRIME contractor would incur after implementation of the IFS Release 1. 

Disaster Recovery Will Not Be Optimal or Fully Tested Before Initial Implementation

Certain components needed to fully restore IFS functionality in the event of a disaster after system implementation are not currently available.  For example, mid-level computer systems that will be used to communicate with the IFS application have not been replicated at a disaster site.  Also, on July 10, 2003, we communicated to IFS project officials that the current IFS classification for disaster recovery purposes needs to be reevaluated.

According to BSMO officials, funds have been earmarked for the next 2 fiscal years to improve disaster recovery capabilities for all modernization projects.  Until that time, disaster recovery capabilities will be less than optimal.  Until all components needed for a full restoration of IFS capabilities are put in place, a full test of disaster recovery capabilities cannot be conducted.  According to project officials, the lack of full disaster recovery capabilities was caused by past budget cuts.  Without full disaster recovery capabilities and testing, the IRS runs the risk that the IFS will not be able to fully recover in the event of a disaster.

The Technical Contingency Planning Document is required to describe business contingency capabilities for a system before the system is implemented.  The final IFS Contingency Plan was not available prior to the completion of our audit fieldwork.  However, we reviewed the draft Contingency Plan and noted that the IFS was classified as a “critical” system.  We believe the IRS should reconsider this classification and determine whether the IFS should instead be designated “mission critical,” for two reasons.

First, the IFS supports 3 of the 18 mission critical business processes, as defined in the IRS Business Contingency Case For Action.  Second, the definition of “critical” in the draft Contingency Plan may not fit the IFS.  The draft Contingency Plan states that a “critical” system:

·        Is critical in accomplishing the work of the IRS.

·        Is primarily performed by computers.

·        Can be performed manually for a limited time period. 

Based on our analysis and discussions with BSMO officials, it would be very difficult to perform the full range of IFS capabilities manually for a limited period of time.  Because the Contingency Plan was still in draft, we did not determine why the system was classified as “critical” versus “mission critical.”  However, confusion seems to stem from the definition of critical infrastructure versus the classification definitions in the draft Technical Contingency Planning Document.  If the IFS is not classified correctly, plans may not be made to recover the system in time to perform mission critical tasks.

Recommendations

To ensure that a high-quality system is delivered, the Chief Information Officer should ensure that:

6.      The disaster recovery environment is completely built out and tested as soon as possible.

7.      The IFS classification in the draft Technical Contingency Planning Document is reconsidered.

 

Appendix I

 

Detailed Objective, Scope, and Methodology

 

The overall objective of this review was to assess whether the Business Systems Modernization Office (BSMO) and the PRIME contractor have controls in place to ensure that activities for testing, business contingencies, enterprise architecture compliance, and transition management are adequately planned for the Integrated Financial System (IFS) Release 1.  Additionally, we reviewed any recent or anticipated changes to the costs and benefits of the IFS Release 1. 

I.                    Evaluated whether testing for the IFS Release 1 was adequately planned.

A.                 Determined if the project team had developed test plans that verify whether the IFS Release 1 meets the documented requirements.

B.                 Determined if all aspects of the application would be tested.

C.                 Determined the reasonableness of the testing schedule.

D.                 Determined what controls were in place to ensure sufficient independent testing would be conducted.

E.                  Determined whether performance testing was conducted.

F.                  Determined if data conversion testing controls were in place.

II.                 Determined if a specific IFS infrastructure component would be successfully tested and meet Enterprise Architecture requirements.

III.               Determined if business contingency planning for the IFS Release 1 was adequate.

IV.              Reviewed recent or anticipated changes to the IFS’ cost and benefits.

V.                 Determined if the Internal Revenue Service could switch production back to the current system if the IFS Release 1 did not perform adequately at deployment.

 

Appendix II

 

Major Contributors to This Report

 

Margaret E. Begg, Assistant Inspector General for Audit (Information Systems Programs)

Scott A. Macfarlane, Director

Troy D. Paterson, Audit Manager

Beverly K. Tamanaha, Senior Auditor

Charlene L. Elliston, Auditor

Perrin T. Gleaton, Auditor

 

Appendix III

 

Report Distribution List

 

Commissioner  C

Commissioner – Attn:  Chief of Staff  C

Deputy Commissioner for Operations Support  OS

Associate Commissioner, Business Systems Modernization  OS:CIO:B

Deputy Associate Commissioner, Program Management  OS:CIO:B:PM

Director, Internal Management Modernization  OS:CIO:B:PM:IMM

Acting Director, Portfolio Management Division  OS:CIO:R:PM

Chief Counsel  CC

National Taxpayer Advocate  TA

Director, Office of Legislative Affairs  CL:LA

Director, Office of Program Evaluation and Risk Analysis  RAS:O

Office of Management Controls  OS:CFO:AR:M

Audit Liaison:  Associate Commissioner, Business Systems Modernization  OS:CIO:B

 

Appendix IV

 

Independent Testing Roles Example

 

Below is an example of a testing roles matrix that could be used to define Internal Revenue Service (IRS) involvement during testing.

In the simplified example below, test type E is not reviewed, executed, or validated by the IRS.

Figure 1:  Independent Testing Roles Matrix

 

The figure was removed due to its size.  To see the figure, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.

 

Appendix V

 

Integrated Financial System Testing Risk and Complexity

 

Testing risk and complexity are increasing in the following areas:

 

·        No slack time is left in the testing schedule.  Therefore, any slippage on the critical path of the project will result in a schedule slip.

·        The test team has begun using a cycle approach to testing.  See the following figure depicting the cycle approach.  At the end of our audit fieldwork, Application Qualification Testing (AQT) Cycle 1 had been completed and System Integration Test (SIT) Cycle 1 had begun.  At the beginning of SIT Cycle 1, a full listing of requirements, and when they would be tested, was not available.  Also, approved test cases for each test cycle were not available.  The Integrated Financial System (IFS) project team intends to provide this information before SIT Cycle 2.  Therefore, we could not determine if all requirements were going to be tested using Internal Revenue Service-approved test cases.

In addition, the IFS project team is under time pressure because the first AQT did not accomplish as much as hoped; therefore, the remaining cycles of the AQT and the SIT will have more requirements to test than originally anticipated.

Figure 1:  IFS Cycle Testing Approach

The figure was removed due to its size.  To see the figure, please go to the Adobe PDF version of the report on the TIGTA Public Web Page.

·        Performance requirements are still being defined.

·        No parallel processing is planned when the IFS is implemented.  Parallel processing is a control that can be used to compare the results of a new system to the results of an old system after the new system is deployed.  Both systems are run until confidence is gained that the new system is producing comparable results.  The Chief Financial Officer function explained to us that parallel processing would not be conducted because it did not have enough resources to reconcile the IFS to the Automated Financial System upon implementation.
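
As a simplified sketch of the control being forgone, parallel processing amounts to running both systems over the same period and reconciling their outputs account by account until they agree.  The following Python example compares hypothetical ledger balances from the two systems; the account names and figures are illustrative only.

```python
# Illustrative parallel-processing reconciliation: run the old and new
# systems over the same period and compare results account by account.
# Account names and balances are hypothetical.

afs_balances = {"Accounts Payable": 1_250_000, "Travel": 48_300}
ifs_balances = {"Accounts Payable": 1_250_000, "Travel": 48_030}

for account in sorted(set(afs_balances) | set(ifs_balances)):
    old = afs_balances.get(account)
    new = ifs_balances.get(account)
    if old != new:
        # A difference signals a reconciling item to investigate before
        # relying solely on the new system.
        print(f"{account}: AFS={old}, IFS={new} -- reconcile before cutover")
```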