TREASURY INSPECTOR GENERAL FOR TAX ADMINISTRATION

 

 

The Federal Student Aid Datashare Application Was Successfully Deployed, but Improvements in Systems Development Disciplines Are Needed

 

 

 

September 3, 2010

 

Reference Number:  2010-20-099

 

 

This report has cleared the Treasury Inspector General for Tax Administration disclosure review process and information determined to be restricted from public release has been redacted from this document.

 

Phone Number   |  202-622-6500

Email Address   |  inquiries@tigta.treas.gov

Web Site           |  http://www.tigta.gov

 

HIGHLIGHTS

 

THE FEDERAL STUDENT AID DATASHARE APPLICATION WAS SUCCESSFULLY DEPLOYED, BUT IMPROVEMENTS IN SYSTEMS DEVELOPMENT DISCIPLINES ARE NEEDED

Highlights

Final Report issued on September 3, 2010

Highlights of Reference Number:  2010-20-099 to the Internal Revenue Service Chief Technology Officer.

IMPACT ON TAXPAYERS

The Federal Student Aid Datashare application provides a web-based method for taxpayers to complete the Department of Education’s Free Application for Federal Student Aid.  Instead of manually entering their tax return information on the application form, taxpayers can now automatically transfer their tax return data to the application form.

WHY TIGTA DID THE AUDIT

This audit was initiated at the request of the Associate Chief Information Officer for Applications Development.  The Internal Revenue Service (IRS) implemented the project to assist the Department of Education initiative to simplify the Federal student aid application process.  Our objective was to determine whether the IRS followed the Enterprise Life Cycle in developing the project within the established time period and ensuring adequate security was put in place to protect taxpayer information.

WHAT TIGTA FOUND

The IRS successfully developed and deployed the Federal Student Aid Datashare application on January 28, 2010.  As of May 2010, more than 264,750 taxpayers had used the application to automatically transfer their tax return information to the Federal student aid application form.  Security controls to safeguard taxpayer information were built into the application and tested prior to deployment of the application.

While the application was successfully deployed, some system development processes needed improvement.  The IRS took actions on many of our recommendations and observations during the course of our audit, and addressed the concerns TIGTA identified.  However, some actions had not been completed or developed by the time our review concluded.  Specifically, controls over requirements management need strengthening to ensure test cases and requirement documents are fully developed; test results should be documented timely and consistently and in a manner that minimizes the potential for manipulation; and project team meetings should be documented to ensure significant decisions and followup action items are tracked and timely completed.

WHAT TIGTA RECOMMENDED

TIGTA recommended that the Chief Technology Officer ensure that the Systems Acceptability Test team 1) uses consistent, documented processes to generate a requirements traceability matrix linking each requirement to a test case and 2) revises the applicable Internal Revenue Manual to require that test results be recorded, documented, and verified consistently during test execution.

In their response to the report, IRS officials agreed to TIGTA’s recommendations.  IRS management plans to revise the Internal Revenue Manual Part 2.6.1 to add hyperlinks in the Requirements Traceability Verification Matrix that connect directly to the associated test cases and update the Test Results Section to require consistent recording, documentation, and verification of test results.

 

 

September 3, 2010

 

 

MEMORANDUM FOR CHIEF TECHNOLOGY OFFICER

                                        

FROM:                            Michael R. Phillips /s/ Michael R. Phillips

Deputy Inspector General for Audit

 

SUBJECT:                    Final Audit Report – The Federal Student Aid Datashare Application Was Successfully Deployed, but Improvements in Systems Development Disciplines Are Needed (Audit # 200920031)

 

This report presents the results of our review of the development of the Federal Student Aid Datashare project.  The overall objective of this review was to determine whether the Internal Revenue Service (IRS) followed the Enterprise Life Cycle to develop the Federal Student Aid Datashare project within the established time period and to ensure adequate security was in place to protect taxpayer information.  The audit was requested by the Associate Chief Information Officer for Applications Development and addresses the major management challenge of Modernization of the IRS.

Management’s complete response to the draft report is included as Appendix VI. 

Copies of this report are also being sent to the IRS managers affected by the report recommendations.  Please contact me at (202) 622-6510 if you have questions, or contact Alan R. Duncan, Assistant Inspector General for Audit (Security and Information Technology Services), at (202) 622-5894.

 

 

Table of Contents

 

Background

Results of Review

The Federal Student Aid Datashare Application Was Successfully Developed and Deployed

Improvements Are Needed in Several Systems Development Disciplines

Recommendation 1:

Recommendation 2:

Appendices

Appendix I – Detailed Objective, Scope, and Methodology

Appendix II – Major Contributors to This Report

Appendix III – Report Distribution List

Appendix IV – Outcome Measure

Appendix V – Glossary of Terms

Appendix VI – Management’s Response to the Draft Report

 

 

Abbreviations

 

DocIT

Document Management for Information Technology 

ELC

Enterprise Life Cycle

FSA-D

Federal Student Aid Datashare

IRS

Internal Revenue Service

SAT

Systems Acceptability Test

 

 

Background

 

The FSA-D system streamlines the Federal student aid application process, making it more efficient for hundreds of thousands of taxpayers to acquire the required tax return information.

The Internal Revenue Service (IRS) developed the Federal Student Aid Datashare (FSA-D) project to support the Department of Education initiative to simplify the Federal student aid application process.  The Department of Education requested that the IRS develop the FSA-D project prior to the next application period, which began in January 2010, and funded the development costs of approximately $4.5 million.

The FSA-D is a web-based application designed to provide applicants[1] with their filed tax return information while they are accessing the Department of Education web site[2] to complete the Free Application for Federal Student Aid.  While online, the applicants can retrieve information from their tax returns and have the option to automatically transfer the required tax return data to their application for Federal student aid.  Prior to the deployment of the FSA-D, applicants were required to manually input their tax data using the hardcopy of their tax returns.  The FSA-D streamlines the process and makes it more efficient for hundreds of thousands of users to acquire the required tax return information.

The FSA-D application functions via an IRS Internet link, which is available on the Department of Education web site.  This link is the only way an applicant can directly access their filed tax return data required to complete the application form.  The applicant must complete identity verification steps to gain access to the IRS Internet link and tax return data.  The applicant accomplishes this access by creating a personal identification number and inputting requested personal information on the Department of Education web site.  The information that must be input includes, but is not limited to, the applicant’s first and last name, Social Security Number, and date of birth.

Once the personal identification number and the applicant’s personal information are entered, the system takes the applicant to the IRS FSA-D application.  The FSA-D application asks the applicant for additional personal data and allows the applicant to revise certain information, such as their address, to ensure the information entered into the FSA-D matches the information on the tax return previously filed with the IRS.  The FSA-D uses personal data input to verify that the applicant is the taxpayer.  After the system authenticates the taxpayer, the tax return information for the requested tax year is retrieved and displayed.  While the tax return information is displayed on the computer screen, the applicant’s options include 1) closing the IRS session without accepting the tax data from the IRS, 2) transferring the tax return data directly into the application, or 3) printing the tax return data.  
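For illustration only, the following minimal sketch (in Python) outlines the kind of match-then-release logic described above: the applicant’s input is compared against the filed return, and tax data is released only for an authenticated match.  The data structures and function names (ApplicantInput, TaxReturnRecord, authenticate, handle_request) are hypothetical and are not drawn from the actual FSA-D design.

# Illustrative sketch only -- not the IRS implementation.  All names are hypothetical.
from dataclasses import dataclass
from typing import Optional


@dataclass
class ApplicantInput:
    """Personal data the applicant enters on the Department of Education site."""
    first_name: str
    last_name: str
    ssn: str
    date_of_birth: str   # e.g., "1991-04-15"
    address: str         # applicant may revise this to match the filed return


@dataclass
class TaxReturnRecord:
    """Selected fields from a previously filed tax return."""
    first_name: str
    last_name: str
    ssn: str
    date_of_birth: str
    address: str
    adjusted_gross_income: float


def authenticate(applicant: ApplicantInput, filed: TaxReturnRecord) -> bool:
    """Return True only if the applicant's data matches the filed return."""
    return (
        applicant.ssn == filed.ssn
        and applicant.date_of_birth == filed.date_of_birth
        and applicant.last_name.lower() == filed.last_name.lower()
        and applicant.address.lower() == filed.address.lower()
    )


def handle_request(applicant: ApplicantInput, filed: TaxReturnRecord,
                   action: str) -> Optional[TaxReturnRecord]:
    """Authenticate, then honor one of the three options described above:
    'close' the session, 'transfer' the data to the aid application, or
    'print' the displayed return information."""
    if not authenticate(applicant, filed):
        return None                      # no tax data is released
    if action == "close":
        return None                      # session ends without transferring data
    if action in ("transfer", "print"):
        return filed                     # caller transfers or prints the data
    raise ValueError(f"unknown action: {action}")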

This review was requested by the Associate Chief Information Officer for Applications Development and performed at the Modernization and Information Technology Services organization facilities in New Carrollton, Maryland, during the period July 2009 through March 2010.  During our audit, we participated in meetings and briefings between IRS executives and the project team.  As a result, when issues and concerns were identified, they were immediately documented and reported to management along with audit recommendations for corrective actions.  During the audit, management implemented many of the recommendations prior to deploying the FSA-D application; therefore, project changes or progress occurring after system deployment may not be included in the audit analyses.

We conducted this performance audit in accordance with generally accepted government auditing standards.  Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objective.  We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objective.  Detailed information on our audit objective, scope, and methodology is presented in Appendix I.  Major contributors to the report are listed in Appendix II.

 

Results of Review

 

The Federal Student Aid Datashare Application Was Successfully Developed and Deployed

The IRS successfully developed and deployed the FSA-D application.  The IRS project team began working in May 2009 and, with the assistance of a contractor, deployed the system 9 months later on January 28, 2010.  According to the May 2010 IRS report, more than 264,750 taxpayers have used the FSA-D to automatically transfer their tax return information to the Federal student aid application form.  Currently, the FSA-D application allows users to access only their filed 2008 tax returns in order to complete the application form.  However, the 2009 tax returns are scheduled to be available to taxpayers in the next phase of the FSA-D application, which is planned for implementation in September 2010.  The IRS staff has already initiated development of the next phase. 

The IRS convened a project team composed of personnel from the Modernization and Information Technology Services organization and the Wage and Investment Division.  Under the leadership of the FSA-D project manager, weekly meetings were convened with the project team, and development activities, including risks, were constantly coordinated and solutions were jointly developed and promptly implemented.  During weekly project team meetings, the group discussed activities on the FSA-D project schedule and made adjustments as necessary to ensure the project remained on schedule. 

The FSA-D project team held a series of weekly, biweekly, and monthly meetings to brief IRS executives.  The executives participating in the briefings included the Chief Technology Officer, the Associate Chief Information Officer for Applications Development, and the Wage and Investment Division Business Modernization Executive.  In addition, the Customer Services Executive Steering Committee provided high-level executive oversight for the FSA-D project. 

As required by the Enterprise Life Cycle (ELC),[3] the FSA-D project was subject to several tests before it was deployed.  The tests completed included integration tests performed by the Department of Education staff and security tests performed by the IRS.  The National Institute of Standards and Technology provides guidelines for selecting and specifying security controls for Federal information systems and organizations.  Security controls are the safeguards employed within an information system to protect the confidentiality, integrity, and availability of the system and its information.  Our analysis determined the IRS included the required security controls in its test plan and test cases.  According to the test results, security controls were implemented correctly and working as intended, and weaknesses identified during testing have been included in corrective action plans.[4]  Following the security testing, the Wage and Investment Division Business Modernization Executive approved the system to operate in the IRS environment.

Improvements Are Needed in Several Systems Development Disciplines

While the IRS successfully developed and deployed the FSA-D application to assist the Department of Education to more efficiently serve thousands of applicants for Federal student aid, improvements are needed in several systems development processes.

·         Managing requirements – The FSA-D test team did not follow effective requirements management processes.  Specifically, the Requirements Traceability Matrix (traceability matrix) and test cases were not sufficiently developed throughout the planning and completion of testing.

·         Recording test results – Test results were not recorded timely and consistently, nor were they recorded in a manner to minimize the potential for manipulation.

·         Documenting project team meetings – During project development activities, the results of project team meetings were not documented for several weeks.

The FSA-D project manager implemented corrective actions immediately after being advised of many of these concerns.  However, additional actions need to be taken in order for the Chief Technology Officer to improve systems development processes.

Requirements management procedures were not effectively followed throughout planning and completion of testing

Requirements management is the process by which information technology project requirements of all types are defined, formalized, managed, controlled, and verified.  The requirements documentation[5] and the traceability matrix are two of the primary controls used to document, manage, and effectively trace requirements to test cases.  The requirements documents, the traceability matrix, and the test cases should be developed before initiation of testing activities.  The traceability matrix should be updated when changes occur and then accurately maintained throughout the requirements management and testing processes.
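For illustration only, the following minimal sketch (in Python) shows one way a traceability matrix can be represented and checked so that every requirement maps to at least one test case and every test case references a requirement that exists in the matrix.  The identifiers, values, and function names are hypothetical examples, not actual FSA-D artifacts.

# Hypothetical identifiers -- not actual FSA-D requirements or test cases.

# Traceability matrix: requirement ID -> test case IDs that exercise it.
traceability_matrix = {
    "RSD-001": ["SAT-001", "SAT-002"],
    "RSD-002": ["SAT-003"],
    "RSD-003": [],                       # requirement with no test case yet
}

# Requirement ID referenced by each written test case.
test_case_references = {
    "SAT-001": "RSD-001",
    "SAT-002": "RSD-001",
    "SAT-003": "RSD-002",
    "SAT-004": "RSD-099",                # reference missing from the matrix
}


def check_traceability(matrix, references):
    """Flag requirements with no test case and test cases whose requirement
    reference does not appear in the matrix -- the two kinds of discrepancy
    discussed in this report."""
    untested = [req for req, cases in matrix.items() if not cases]
    unknown_refs = [case for case, req in references.items() if req not in matrix]
    return untested, unknown_refs


untested, unknown_refs = check_traceability(traceability_matrix, test_case_references)
print("Requirements without a test case:", untested)                   # ['RSD-003']
print("Test cases referencing an unknown requirement:", unknown_refs)  # ['SAT-004']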

The traceability matrix and the Systems Acceptability Test (SAT) cases were not sufficiently developed.  The following are specific concerns identified during the audit:

·         Prior to the SAT, an analysis of the first 25 test cases was performed to determine the accuracy and reliability of the traceability matrix.  Due to insufficient details on the traceability matrix, requirements could not be traced from the 25 test cases to the matrix.  For example, the traceability matrix should contain a reference number which uniquely identifies each requirement being tested; however, 17 of the 25 test cases had a requirement reference number that was not included on the traceability matrix.

·         In a comparison of the traceability matrix to 38 additional test cases performed during the SAT, 110 discrepancies existed between the requirement details on the traceability matrix and the 38 test cases.  Figure 1 provides further information on the discrepancies.

Figure 1:  Discrepancies Between the Traceability Matrix and Test Cases

Traceability Matrix Categories  |  Number of Discrepancies Between the Traceability Matrix Categories and the Information on the 38 SAT Test Cases
Requirement Specification Document Version Number  |  1
Requirement Specification Document Requirement Identification Number  |  27
Page Behavior Design Document Version Number  |  17
Page Behavior Design Requirement Identification Number  |  28
Wireframe Document Version Number  |  12
Wireframe Number  |  14
Requirement Description  |  11
Total Number of Discrepancies Identified  |  110

Source:  The requirements documentation and test cases.

·         Following finalization of the SAT, an accurate total of SAT test cases could not be determined.  Specifically, three sources reported conflicting numbers as to the total number of SAT test cases performed.  The Document Management for Information Technology (DocIT) web site, a central repository used by the IRS to store documents, showed 486 cases; the SAT End-of-Test Status Report showed 484 cases; and the Functional Configuration Audit report showed 422 cases.  The reporting of inconsistent final test results contributes to unreliable management information and could lead to incorrect decisions by IRS management.  During our review, we recommended that the project manager determine the correct number of final SAT test cases and revise the End-of-Test Status Report to accurately reflect the final total.  The SAT test manager concluded there were 472 final test cases, and the End-of-Test Status Report was revised to show these 472 cases.[6]

The aggressive schedule required to implement the FSA-D project contributed to the discrepancies between the traceability matrix and the SAT test cases.  In addition:

  • An effective requirements management process was not established.  Specifically, the Test, Assurance, and Documentation office test team performed the SAT without initially developing a reliable traceability matrix and sufficiently completing all the test cases.
  • The SAT team concurrently developed test cases and prepared the traceability matrix.  However, as changes occurred to requirements and test cases, the traceability matrix was not always updated to reflect the changes.  The SAT test manager stated this occurred due to the amount of revisions to requirements documentation.  Specifically, the manager believed it was not possible to completely update and cross-reference the traceability matrix for each revision to the requirement documents and also complete the test execution within the required time period. 
  • The test analysts individually and manually prepared extensive test cases, possibly causing the development of unnecessary or overlapping test cases.  For instance, the test manager determined that four test cases were waived because their conditions were covered by other tests, and nine were duplicates of existing cases. 

When there are several sources that report conflicting requirements and test case data, the potential exists that not all requirements were tested, the IRS cannot ensure that the system is working as intended, and the unreliable final test case data could lead to incorrect management decisions.

Management Action:  During the review, we discussed the discrepancies between the traceability matrix and the SAT test cases with the project manager and recommended a thorough quality review of the traceability matrix, the requirements documentation, and the SAT test cases prior to deploying the project in January 2010 to ensure that all requirements are included in the test cases and tested during the SAT.  The project manager agreed and provided information supporting that the quality review was completed and the requirements tested prior to deployment of the project.

Recommendation

Recommendation 1:  The Chief Technology Officer should ensure that the Test, Assurance, and Documentation office SAT test team uses consistent, documented processes to generate a traceability matrix linking each requirement to a test case.

Management’s Response:  The IRS agreed with this recommendation.  The Associate Chief Information Officer for Applications Development plans to revise the Requirements Traceability Verification Matrix contained in Internal Revenue Manual Part 2.6.1 to include hyperlinks that will connect directly to the associated test cases.

The SAT test analysts did not record results timely and consistently

During an observation of 3 testers performing 38 of the 472 SAT test cases, we determined that not all test results were recorded consistently and timely.  For example, one of the testers properly documented testing by immediately recording final results on test case printouts and completing a followup edit at the conclusion of each test day.  The other testers recorded their test results inconsistently by either manually transcribing test notes into electronic test cases at the end of each test day or creating test notes from memory only at the end of each test week.

When test results are not recorded until testing is complete, a risk exists that test results may not be recorded accurately or at all.  The Modernization and Information Technology Services organization has procedures[7] applicable to Systems Integration Testing that contain several guidelines for properly recording test results.  Two such requirements include recording test results during actual test execution and subsequently verifying the results by saving computer screen images immediately after each test.  In addition, while the tests are being performed, the test manager should verify the results to ensure that objectives are achieved and sufficiently documented.  The tester should also ensure all test result documentation is maintained in a test folder and available for verification by the test manager.  After verification, both the testers and project managers are required to sign the test results certification form, which documents that tests were performed as required.

This situation occurred because the Test, Assurance, and Documentation office SAT test team does not have any formal procedures that require prompt recording of test results.  The test team followed Internal Revenue Manual Part 2.6.1, Test, Assurance, and Documentation Standards and Procedures, which does not include detailed procedures requiring SAT testers to record test results consistently, accurately, and timely during test execution. 

Management Action:  We discussed the inconsistent recording of test results with the SAT test manager, who then met with the test team and advised them of the expectation that they record test results consistently and promptly.  The SAT test manager also agreed to review existing Modernization and Information Technology Services organization procedures for recording test results and to implement appropriate processes.
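For illustration only, the following minimal sketch (in Python) shows one way test results could be recorded at the moment of execution, with a timestamp and a pointer to saved evidence, plus a separate manager verification pass, consistent with the guidelines discussed above.  The field names, file formats, and functions are hypothetical, not an IRS procedure.

# Illustrative sketch only: record each result as soon as the test step finishes,
# then let the test manager verify the recorded entries.  Field names are hypothetical.
import json
from datetime import datetime, timezone


def record_result(log_path: str, test_case_id: str, outcome: str,
                  tester: str, evidence_file: str) -> None:
    """Append one result immediately after execution, with a timestamp and a
    pointer to the saved screen image or other evidence."""
    entry = {
        "test_case_id": test_case_id,
        "outcome": outcome,                                   # "pass" / "fail"
        "tester": tester,
        "recorded_at": datetime.now(timezone.utc).isoformat(),
        "evidence_file": evidence_file,                       # e.g., screen capture
        "verified_by": None,                                  # set later by the manager
    }
    with open(log_path, "a", encoding="utf-8") as log:
        log.write(json.dumps(entry) + "\n")


def verify_results(log_path: str, manager: str) -> list:
    """Manager review pass: return each recorded entry marked as verified."""
    verified = []
    with open(log_path, encoding="utf-8") as log:
        for line in log:
            entry = json.loads(line)
            entry["verified_by"] = manager
            verified.append(entry)
    return verified


# Example usage during test execution:
# record_result("sat_results.log", "SAT-017", "pass", "tester1", "SAT-017_screen.png")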

Recommendation

Recommendation 2:  The Chief Technology Officer should update Internal Revenue Manual Part 2.6.1, Test, Assurance, and Documentation Standards and Procedures, to state that SAT test results will be consistently recorded, documented, and verified during test execution.

Management’s Response:  The IRS agreed with this recommendation.  The Associate Chief Information Officer for Applications Development plans to update the Test Results Section of Internal Revenue Manual Part 2.6.1 to require consistent recording, documentation, and verification of test results during execution.

The Application Qualification Testing results were not protected from revisions

The FSA-D contractor[8] did not have adequate internal procedures and controls over the recording and reporting of Application Qualification Testing results.  The contractor’s testers kept track of the test cases they completed, along with notes on their test results, in an Excel spreadsheet file.  At the end of each test day, another contractor employee accessed the Excel results file and converted the file to a Portable Document Format file.  This same employee then uploaded the Portable Document Format results file to the IRS DocIT web site.  The contractor’s existing processes could allow manipulation of Application Qualification Testing results before the results were converted to the Portable Document Format file and uploaded to the IRS DocIT web site.

Established processes should ensure that all project testing results provide objective and unedited evidence that the project satisfies the agreed-upon requirements.  As a result, procedures should be in place to ensure the results are safeguarded from potential alterations or any changes that could compromise the test results.  Best practices for good internal controls require the segregation of certain key duties.  In this instance, responsibilities for recording and reporting test results should be separated.    

Management Action:  We discussed this concern with the project manager and recommended that the actual testers for the Application Qualification Testing be responsible for converting their Excel results to “read only” Portable Document Format files before the documents are accessed by other employees and uploaded to the IRS DocIT web site.  The FSA-D project manager agreed and worked with the contractor to require that the testers convert their test results to a “read only” Portable Document Format file at the end of each test day.
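For illustration only, the following minimal sketch (in Python) shows the kind of end-of-day step described above: the tester reads the Excel results file, writes the rows to a Portable Document Format file, and marks that file read-only before anyone else accesses it.  The sketch assumes the third-party openpyxl and reportlab packages and is not the contractor’s actual tooling.

# Minimal sketch, assuming openpyxl and reportlab are installed.  Not the
# contractor's actual process; file names below are hypothetical examples.
import os
import stat

from openpyxl import load_workbook
from reportlab.lib.pagesizes import letter
from reportlab.pdfgen import canvas


def convert_results_to_pdf(xlsx_path: str, pdf_path: str) -> None:
    """Write each spreadsheet row as a line of text in a PDF, then mark the
    PDF read-only at the file-system level before it is uploaded."""
    workbook = load_workbook(xlsx_path, read_only=True)
    sheet = workbook.active

    pdf = canvas.Canvas(pdf_path, pagesize=letter)
    width, height = letter
    y = height - 40
    for row in sheet.iter_rows(values_only=True):
        line = "  ".join("" if cell is None else str(cell) for cell in row)
        pdf.drawString(40, y, line[:120])     # truncate very long rows
        y -= 14
        if y < 40:                            # start a new page when full
            pdf.showPage()
            y = height - 40
    pdf.save()

    os.chmod(pdf_path, stat.S_IRUSR | stat.S_IRGRP | stat.S_IROTH)


# Example usage by the tester at the end of a test day:
# convert_results_to_pdf("aqt_results_day1.xlsx", "aqt_results_day1.pdf")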

Minutes were not being prepared to capture and track the details of weekly project meetings

The ELC requires that details and decisions on important project matters be documented.  According to project management best practices, formal minutes should be prepared to document decisions made during project meetings.  This ensures important project decisions and significant issues, as well as any results from followup action items, are appropriately documented.  Details and decisions made during weekly FSA-D project meetings were not always recorded.  After the FSA-D project was implemented, the project team continued for a month without documenting the details and decisions discussed during weekly team meetings.

The FSA-D project team did not follow project management best practices.  When minutes are not prepared to document significant decisions and followup action items, the potential exists that important project actions will not be tracked to ensure their sufficient and timely completion.

Management Action:  After the first few weeks of project meetings and discussions in which the Treasury Inspector General for Tax Administration participated, we discussed this issue with the FSA-D project manager, who immediately required the preparation of minutes beginning with the next project team meeting.  Thereafter, all FSA-D meeting minutes were adequately prepared and complete results were documented.

 

Appendix I

 

Detailed Objective, Scope, and Methodology

 

The overall objective of our review was to determine whether the IRS followed the ELC[9] to develop the FSA-D project within the established time period and to ensure adequate security was in place to protect taxpayer information.

To accomplish the overall objective, we:

I.                   Determined whether ELC guidelines were properly followed in the development of the FSA-D project.

A.    Determined if program management processes were used to manage and guide project activities.

1.      Obtained and reviewed documentation which provided evidence that the project followed ELC guidance, such as the project charter, the project management plan, the tailoring plan, and customer technical reviews.

2.      Determined and documented how the project team gathered and managed requirements that included verifying with stakeholders that the requirements were complete, tracking and controlling changes to requirements, and ensuring that each requirement was included in a test case.

3.      Determined and documented the risk management process to ensure all issues and risks were properly identified and tracked from inception to mitigation or resolution.

B.     Determined whether the project team performed effective oversight and monitoring of the project.

1.      Regularly attended or called into the project team meetings, such as briefings of the Project Director, meetings of the executive steering committee, walkthroughs of the FSA-D project, meetings with stakeholders, risk management meetings, and meetings of the project team.

2.      Obtained and reviewed documents from the project meetings in Step I.B.1.

II.                Determined whether the project team adequately planned for and managed testing activities.

A.    Compared the traceability matrix, the Requirements Specification Document, and the applicable test cases to ensure that each requirement was included in a test case.

1.      Judgmentally selected and compared the first 25 test cases from the total population of 472 SAT test cases to the traceability matrix.  We used a judgmental sample because we were not planning to project our results.

2.      Judgmentally selected and compared an additional 38 of the 472 SAT test cases to the traceability matrix.  We used a judgmental sample because we were not planning to project our results.

B.     Conducted onsite observations of the SAT and the security tests to verify that results were accurately recorded.

1.      Observed the testing and the recording of results for 38 of the 472 SAT test cases.

2.      Observed the testing and recording of results for 28 of the 486 security test cases.

C.     Obtained and reviewed the final results to ensure problems during Application Qualification Testing, Security Test and Evaluation, performance, integration, and SAT testing were resolved, including retesting failed requirements and properly handling defects.

D.    Determined whether the ELC guidelines and other applicable security guidelines were followed to ensure authenticated taxpayers have secure access to their tax return information.

1.      Obtained and reviewed the security results contained in the Security Certification and Accreditation Package.

2.      Determined how the project team resolved the authentication risk.

III.             Obtained and reviewed the Reimbursable Agreement between the IRS and the Department of Education to determine the funding source for the FSA-D project.

Internal controls methodology

Internal controls relate to management’s plans, methods, and procedures used to meet their mission, goals, and objectives.  Internal controls include the processes and procedures for planning, organizing, directing, and controlling program operations.  They include the systems for measuring, reporting, and monitoring program performance.  We determined the following internal controls were relevant to our audit objective:  ELC and related IRS guidelines, and the processes followed in the development of information technology projects.  We evaluated these controls by reviewing the guidelines, conducting interviews and meetings with management and staff, and reviewing project documentation such as the project charter, various project plans, and test case files which provided evidence of whether ELC systems development processes were followed.

 

Appendix II

 

Major Contributors to This Report

 

Alan R. Duncan, Assistant Inspector General for Audit (Security and Information Technology Services)

Scott A. Macfarlane, Director

Kimberly R. Parmley, Audit Manager

Wallace C. Sims, Lead Auditor

Louis Lee, Senior Auditor

Suzanne M. Westcott, Senior Auditor

David F. Allen, Program Analyst

 

Appendix III

 

Report Distribution List

 

Commissioner  C

Office of the Commissioner – Attn:  Chief of Staff  C

Deputy Commissioner for Operations Support  OS

Deputy Commissioner for Services and Enforcement  SE

Commissioner, Wage and Investment Division  SE:W

Chief Information Officer  OS:CTO:CIO

Associate Chief Information Officer, Applications Development  OS:CTO:AD

Director, Risk Management  OS:CTO:SP:RM

Chief Counsel  CC

National Taxpayer Advocate  TA

Director, Office of Legislative Affairs  CL:LA
Director, Office of Program Evaluation and Risk Analysis  RAS:O

Office of Internal Control  OS:CFO:CPIC:IC

Audit Liaisons: 

            Commissioner, Wage and Investment Division  SE:W         

Director, Program Oversight  OS:CTO:SP:RDM:PO

 

Appendix IV

 

Outcome Measure

 

This appendix presents detailed information on the measurable impact that our recommended corrective action will have on tax administration.  This benefit will be incorporated into our Semiannual Report to Congress.

Type and Value of Outcome Measure:

·         Reliability of Information – Actual; the final number of SAT test cases reported was 472 (see page 4).

Methodology Used to Measure the Reported Benefit:

The IRS issued the SAT End-of-Test Status Report containing a total of 484 test cases, while 486 actual SAT test cases were placed on the IRS DocIT web site.  Reporting different SAT testing results on multiple sources could lead to incorrect management decisions.  We recommended that the IRS determine the correct number of final test cases and revise the End-of-Test Status Report to accurately reflect the final total of test cases.  The IRS agreed with our recommendation and determined the final total of SAT test cases is 472.  Also, the End-of-Test Status Report has been revised to show the correct final total of SAT test cases. 

 

Appendix V

 

Glossary of Terms

 

Term

Definition

Application Qualification Testing

The testing phase focused on ensuring the system functions as designed; it covers business and design requirements, data validation, hyperlinks, and interface compatibility.

Enterprise Life Cycle

A structured business systems development method that requires the preparation of specific work products during different phases of the development process.  The ELC establishes a set of repeatable processes and a system of reviews, checkpoints, and milestones that reduce the risks of systems development and ensure alignment with the overall business strategy.

Functional Configuration Audit

The audit performed by IRS staff to independently verify whether or not the system contains the functions expected.  This is usually accomplished by tracing requirements to test scripts and determining whether testing adequately exercised the system’s intended function.

Requirement

A formalization of a need and the statement of a capability or condition that a system, subsystem, or system component must have or meet to satisfy a contract, standard, or specification.

Requirements Traceability Matrix

A matrix developed and continually updated that links each requirement to a test case.  The matrix provides important information on system testing and functionality status.

Systems Acceptability Testing

The process of testing a system or program to ensure it meets the original objectives outlined by the user in the requirement analysis document.

Systems Integration Testing

Systems Integration Testing verifies that each individual work product (i.e., application software, technical infrastructure, facility, documentation, or training material) still meets requirements when integrated with the rest of the release and the business system.

Test, Assurance, and Documentation Office

The IRS office responsible for planning, developing, scheduling, and conducting the SAT on selected systems.  This includes providing an environment for testing and integrating modernization and production systems that emulate the target environment.  The office also tests the acceptability of application software for implementation, ensuring that only approved and controlled versions of software are deployed.

Traceability

The activity that maps requirements to business processes, systems development documents, and test cases.

 

Appendix VI

 

Management’s Response to the Draft Report

 

DEPARTMENT OF THE TREASURY

INTERNAL REVENUE SERVICE

WASHINGTON, DC  20224

 

CHIEF TECHNOLOGY OFFICER

 

 

July 28, 2010

 

 

MEMORANDUM FOR DEPUTY INSPECTOR GENERAL FOR AUDIT

 

FROM:                            Terence V. Milholland /s/ Terence V. Milholland

Chief Technology Officer

 

SUBJECT:                       Draft Audit Report - The Federal Student Aid Datashare Application Was Successfully Deployed, but Improvements in Systems Development Disciplines Are Needed (Audit # 200920031) (i-trak #2010-12640)

 

Thank you for the opportunity to review your draft audit report and to meet with the audit team to discuss earlier report observations. I appreciate the comments and observations on our efforts to successfully develop and deploy the Federal Student Aid Datashare application. The results of this audit, which we requested, have provided us with timely feedback as we move toward implementing the application.

 

I agree with both recommendations regarding consistent testing processes and documentation of test results. The attachment to this memo details our planned actions to implement your suggestions.

 

Your continued support and the assistance and guidance your team provides have been a valuable resource to our business units. If you have any questions, please contact me on (202) 622-6800, or Darrin Brown on (202) 283-4613.

 

Attachment

 

RECOMMENDATION #1: We recommend the Chief Technology Officer should ensure that the TAD SAT test team use consistent, documented processes to generate a RTM linking each requirement to a test case.

 

CORRECTIVE ACTION #1: We agree with the recommendation. AD TAD will revise the Requirements Traceability Verification Matrix (see TAD Internal Revenue Manual (IRM) 2.6.1 dated March 1, 2010, Exhibit 2.6.1-6) to include hyperlinks that will connect directly to associated test cases.

 

IMPLEMENTATION DATE: October 1, 2010

 

RESPONSIBLE OFFICIAL: MITS, Applications Development OS:CIO:AD

 

CORRECTIVE ACTION MONITORING PLAN: We enter accepted Corrective Actions into the Joint Audit Management Enterprise System (JAMES). These Corrective Actions are monitored on a monthly basis until completion.

 

RECOMMENDATION #2: The Chief Technology Officer should update Internal Revenue Manual 2.6.1, Test, Assurance & Documentation Standards and Procedures, to state that SAT test results will be consistently recorded, documented, and verified during test execution.

 

CORRECTIVE ACTION #2: We agree with the recommendation. AD TAD will update the Test Results Section (2.6.1.4.2.5.3) of the January 2011 publication of IRM 2.6.1 accordingly.

 

IMPLEMENTATION DATE: March 1, 2011

 

RESPONSIBLE OFFICIAL: MITS, Applications Development  OS:CIO:AD

 

CORRECTIVE ACTION MONITORING PLAN: We enter accepted Corrective Actions into the Joint Audit Management Enterprise System (JAMES). These Corrective Actions are monitored on a monthly basis until completion.



[1] An applicant can be the student, the student’s spouse, or the parents of the student.

[2] The Department of Education web site may be accessed at fafsa.ed.gov.

[3] See Appendix V for a glossary of terms.

[4] When vulnerabilities are identified during security testing, system owners are required to develop mitigation Plans of Action and Milestones and monitor the corrective actions until they are completed.

[5] The functional requirements for the FSA-D project were included in three separate documents:  the Requirement Specification Document, the Page Behavior Design Documents, and the Wireframes.  These three documents describe the functionality or operation of the FSA-D system.

[6] See Appendix IV.

[7] Modernization and Information Technology Services organization procedure entitled “Test Folders and Records Procedure,” dated November 30, 2006.

[8] The IRS contracted with a vendor to conduct the Application Qualification Testing.

[9] See Appendix V for a glossary of terms.