TREASURY INSPECTOR GENERAL FOR TAX ADMINISTRATION

 

 

The Applications Development Function’s Quality Assurance Program Office Can Make Its Processes More Effective

 

February 17, 2011

 Reference Number:  2011-20-007

 

 

This report has cleared the Treasury Inspector General for Tax Administration disclosure review process and information determined to be restricted from public release has been redacted from this document.

Phone Number   |  202-622-6500

Email Address   |  inquiries@tigta.treas.gov

Web Site           |  http://www.tigta.gov

 

 

HIGHLIGHTS

 

THE APPLICATIONS DEVELOPMENT FUNCTION’S QUALITY ASSURANCE PROGRAM OFFICE CAN MAKE ITS PROCESSES MORE EFFECTIVE

Highlights

Final Report issued on February 17, 2011

Highlights of Reference Number:  2011-20-007 to the Internal Revenue Service Chief Technology Officer.

IMPACT ON TAXPAYERS

The mission of the Applications Development function’s Quality Assurance Program Office is to assure product compliance, drive process improvement, and promote quality awareness.  Ensuring the quality of development activities helps the Modernization and Information Technology Services (MITS) organization deliver services and solutions that drive effective tax administration to ensure public confidence.

WHY TIGTA DID THE AUDIT

This audit was initiated at the request of the MITS organization.  The overall objective was to determine whether the Applications Development function’s Quality Assurance Program Office ensures development projects implement a coordinated set of activities that conform to organizational policies, processes, and procedures that meet the standards of the Software Engineering Institute’s Capability Maturity Model Integration (CMMI) - Development maturity level 2.

WHAT TIGTA FOUND

The Quality Assurance Program Office generally meets the CMMI-Development maturity level 2 requirements.  The Applications Development function updated the role of the Quality Assurance Program Office in April 2007.  The update resulted in a directive that established authority and responsibility for the performance of quality assurance activities across the Applications Development function.

The Quality Assurance Program Office implemented a comprehensive plan to assess the Applications Development function’s products and services.  It employs qualified specialists to perform its audits and provides the Applications Development function feedback about its organizational practices.

Currently, the Applications Development function is the only activity in the MITS organization with a Quality Assurance Program Office.  Transitioning to a MITS-wide Quality Assurance Program Office will help the MITS organization achieve its goal of reaching CMMI-Development maturity level 3.

The products and documents created generally met the Quality Assurance Program Office guidelines.  However, the guidelines do not require approval signatures and dates on the products by the appropriate Applications Development function managers.  Also, the Quality Assurance Program Office did not effectively maintain all necessary documentary evidence to assess and support the reported audit results.

WHAT TIGTA RECOMMENDED

TIGTA recommended that the Chief Technology Officer:  1) expand the scope of the Quality Assurance Program Office to provide coverage across the MITS organization; 2) implement procedures to officially approve its products and guidance documents; 3) improve the guidance to include requirements that assure an informative, accurate, and appropriate perspective in reporting; and 4) further develop the peer review guidance to help ensure audit reports are supported by sufficient, competent, and relevant evidence.

In its response to the report, the IRS agreed with TIGTA’s recommendations.  The IRS plans to:  1) evaluate expanding the scope of the Quality Assurance Program Office; 2) modify its reporting procedures and templates to include approvals; 3) strengthen the language relative to the reporting of the audit findings and ensure, when possible, that the audit’s results are input into the database; and 4) strengthen the language relative to the peer review process and analyze the checklist to ensure it includes all appropriate issues for review.

 

 

February 17, 2011

 

 

MEMORANDUM FOR CHIEF TECHNOLOGY OFFICER

 

FROM:                            Michael R. Phillips /s/ Michael R. Phillips

                                         Deputy Inspector General for Audit

 

SUBJECT:                    Final Audit Report – The Applications Development Function’s Quality Assurance Program Office Can Make Its Processes More Effective (Audit # 201020026)

 

This report presents the results of our review of the Applications Development function’s Quality Assurance Program Office activities.  The overall objective of this review was to determine whether the Applications Development function’s Quality Assurance Program Office ensures development projects implement a coordinated set of activities that conform to organizational policies, processes, and procedures that meet the standards of the Software Engineering Institute’s Capability Maturity Model Integration - Development maturity level 2.  This review was requested by the Modernization and Information Technology Services organization and addresses the major management challenge of Modernization of the Internal Revenue Service.  Management’s complete response to the draft report is included as Appendix VI.

Copies of this report are also being sent to the Internal Revenue Service managers affected by the report recommendations.  Please contact me at (202) 622-6510 if you have questions, or Alan Duncan, Assistant Inspector General for Audit (Security and Information Technology Services), at (202) 622-5894.

 

 

Table of Contents

 

Background

Results of Review

The Quality Assurance Program Office Generally Meets the Maturity Level 2 Requirements

Recommendation 1:

The Quality Assurance Program Office Should Ensure All Products Include an Approval Signature

Recommendation 2:

The Quality Assurance Program Office Audit Documentation and Procedures Need Improvement

Recommendation 3:

Recommendation 4:

Appendices

Appendix I – Detailed Objective, Scope, and Methodology

Appendix II – Major Contributors to This Report

Appendix III – Report Distribution List

Appendix IV – Quality Assurance Program Office Audits Reviewed

Appendix V – Glossary of Terms

Appendix VI – Management’s Response to the Draft Report

 

 

Abbreviations

 

CMMI   |  Capability Maturity Model Integration

IRS    |  Internal Revenue Service

MITS   |  Modernization and Information Technology Services

TIGTA  |  Treasury Inspector General for Tax Administration

 

 

 

 

Background

 

The Applications Development function in the Modernization and Information Technology Services (MITS) organization collaborates with the other Internal Revenue Service (IRS) business functions to provide integrated computer software solutions that align with the business priorities of the IRS.  Specific focus areas for the Applications Development function include developing modernized applications, delivering applications to support the processing of income tax returns, and maintaining existing technology systems.  The Quality Assurance Program Office helps the Applications Development function meet its mission by assuring product compliance, driving process improvement, and promoting quality awareness.  To fulfill this role, the Quality Assurance Program Office provides the MITS organization’s senior management with assessments of the products being built and the services being provided.  These assessments communicate whether development activities conform to applicable contractual, program, and project requirements and whether the development activities use repeatable, standardized, and effective processes.

The objectives of the Quality Assurance Program Office are to:

·         Assess the Applications Development function’s portfolio each year to objectively evaluate performance against applicable standards and requirements.

·         Institutionalize the Enterprise Life Cycle[1] and promote standardization of organizational processes and procedures.

·         Enable project quality principles and practices through mentoring, coaching, and training.

·         Produce performance measures that identify progress, variances, trends, and opportunities for improvement.

·         Ensure continuous process improvement using industry standards to provide cost-effective, high-quality products and solutions.

The Quality Assurance Program Office is part of the Applications Development function’s effort in leading a MITS organization-wide initiative to use the Software Engineering Institute’s Capability Maturity Model Integration (CMMI).  The CMMI consists of best practices that address development and maintenance activities covering the product development life cycle from conception through delivery and maintenance.  Specifically, the MITS organization is planning to use the CMMI-Development model to help improve its development and maintenance processes for both products and services.

All CMMI models reflect maturity levels in their design and content.  A maturity level consists of related specific and generic practices for a predefined set of process areas that improve the organization’s overall performance.  The MITS organization has set a target for achieving maturity level 2 using the CMMI-Development model by January 2011 and level 3 by June 2012.

·         At maturity level 2, the projects of the organization have ensured that processes are planned and executed in accordance with policy; projects employ skilled people who have adequate resources to produce controlled outputs; projects involve relevant stakeholders; projects are monitored, controlled, and reviewed; and projects are evaluated for adherence to their process descriptions.

·         At maturity level 3, processes are well characterized and understood, and are described in standards, procedures, tools, and methods.  The organization’s set of standard processes, which is the basis for maturity level 3, is established and improved over time.  These standard processes are used to establish consistency across the organization.

Process and product quality assurance provides the project staff and all levels of managers with appropriate visibility into, and feedback on, the processes and associated work products throughout the life of the project.

The CMMI defines quality assurance as a planned and systematic means for assuring management that the defined standards, practices, procedures, and methods of the process are applied.  Process and product quality assurance is an aspect of the CMMI that provides specific practices for objectively evaluating performed processes, work products, and services against the applicable process descriptions, standards, and procedures, and ensuring that any issues arising from these reviews are addressed. 

This review was performed at the MITS organization facilities in New Carrollton, Maryland, during the period June through September 2010.  This audit was performed at the request of the MITS organization.  We conducted this performance audit in accordance with generally accepted government auditing standards.  Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objective.  We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objective.  Detailed information on our audit objective, scope, and methodology is presented in Appendix I.  Major contributors to the report are listed in Appendix II.

 

 

Results of Review

 

The Quality Assurance Program Office Generally Meets the Maturity Level 2 Requirements

We analyzed the Internal Revenue Manual and found that it included the CMMI-Development maturity level 2 quality assurance requirements.  Further, the Quality Assurance Program Office’s processes, guidance, and procedures generally meet the CMMI maturity level 2 requirements for quality assurance.

The Applications Development function updated the role of the Quality Assurance Program Office in April 2007.  The update resulted in a directive that established authority and responsibility for the performance of quality assurance activities across the Applications Development function.

Guidance documents were developed for auditing project development activities

The Quality Assurance Program Office has made significant progress in developing guidance for its audit activities.  Since April 2007, it has developed and issued guidance documentation that includes templates, checklists, processes, procedures, handbooks, and training modules.  The guidance developed covers all aspects of the quality assurance activities for use by the quality specialists.  The topics the guidance covers include auditing, status reporting, peer reviews, metrics, and lessons learned.  By developing this guidance, the Quality Assurance Program Office has provided the quality specialists with tools to guide them in auditing the Applications Development function’s project development activities.    

A comprehensive plan was implemented to assess the products and services

To implement the directive, the Quality Assurance Program Office develops annual audit plans.  The annual audit plans describe the goals for coverage of the Applications Development function’s five business domains.  Table 1 presents these domains and the eight process areas planned for review by the Quality Assurance Program Office during Fiscal Year 2010.

Table 1:  Applications Development Function Domains and Process Areas

Business Domain       | Process Area

Compliance            | Project Planning

Corporate Data        | Project Monitoring and Control

Customer Service      | Configuration Management

Internal Management   | Requirements Management

Submission Processing | Measurement and Analysis

                      | Software Development

                      | Supplier Agreement Management

                      | Testing

Source:  Internal Revenue Manual and the Applications Development function’s Quality Assurance Program Office Fiscal Year 2010 Program Plan.

The Quality Assurance Program Office met the annual audit plan goal in Fiscal Year 2008 by performing 65 audits and in Fiscal Year 2009 by performing 79 audits.  These audits included representative coverage of the Applications Development function’s business domains and process areas.

The audit reports included issues facing the domains and projects in the Applications Development function.  These issues included project planning, project monitoring and control, configuration management, risk management and contingency management plans, requirements management, and security issues.  With this information, the Applications Development function can focus on improvements the Quality Assurance Program Office audit reports identified. 

Qualified specialists were employed to perform the audits

The Quality Assurance Program Office uses quality specialists to conduct audits of the Applications Development function’s portfolio to determine the level of compliance with the organizational standards, processes, and procedures.  The quality specialists have responsibilities for implementing best practices that are compliant with the CMMI-Development in order to achieve maturity levels 2 and 3. 

The quality specialists’ qualifications are outlined in a set of requirements for the position’s knowledge, skills, and abilities.  These requirements include knowledge of program and project management concepts; experience in developing and performing audits/monitoring software engineering life cycle processes; experience in performing quality assurance activities; the ability to communicate effectively, both verbally and in writing; and the ability to effectively interact in groups.  The quality specialists have also received CMMI-Development training to supplement their knowledge, skills, and abilities.

The quality specialists make recommendations to help improve the organization’s project management.  These recommendations relate to project management activities such as configuration management, requirements management, and project scheduling.

Feedback about organization practices is provided

An element of the CMMI-Development Process and Product Quality Assurance process involves providing feedback to the Applications Development function’s project staff and managers on the results of quality assurance activities.  The Quality Assurance Program Office implemented this guidance with its Status Reporting Procedure.  The Procedure directs the Office to collect, analyze, interpret, and report on measures derived from the audits conducted.  These measures provide insight on the organization’s ability to comply with Enterprise Life Cycle requirements.  They also help to identify skill gaps and organization processes that may need modifications for proper implementation across the Applications Development function’s portfolio.  Measurements are reported quarterly and at year end to the Applications Development function’s Domain Directors and the Program Management Office.

Expanding the Quality Assurance Program to cover the entire MITS organization will contribute to achieving a higher maturity level

The Applications Development function’s Quality Assurance Program Office structure is adequate to meet its goals.  The CMMI-Development provides the following guidance on the expectations for a quality assurance organization at different maturity levels.

·         To meet CMMI-Development maturity level 2 requirements, a Process and Product Quality Assurance support process must be established to objectively evaluate performed processes, work products, and services against the applicable process descriptions, standards, and procedures and ensure that any issues arising from these reviews are addressed. 

·         To meet CMMI-Development maturity level 3 requirements, the Process and Product Quality Assurance organization must ensure processes are well characterized and understood, and are described in standards, procedures, tools, and methods.  These standard processes are used to establish consistency across the organization.  The CMMI defines an organization as an administrative structure in which people collectively manage one or more projects as a whole and whose projects share a senior manager and operate under the same policies.

Currently, the Applications Development function is the only activity in the MITS organization with a Quality Assurance Program Office.  Although the Quality Assurance Program Office is assigned to and reports to the Applications Development function, the scope of its audits involves other components of the MITS organization.  For example, the Quality Assurance Program Office has made assessments of configuration management and requirements management, which fall within the Enterprise Services function’s responsibilities.

For the MITS organization to reach its goal of achieving CMMI-Development maturity level 3, the Quality Assurance Program Office’s scope must be broad enough to ensure processes are well characterized, understood, and described in standards, procedures, tools, and methods.  These standard processes should be used to establish consistency across the MITS organization.  Implementing this scope would involve organizational changes so that a Quality Assurance Program Office has MITS-wide assessment and reporting responsibilities.  This scope of responsibility would benefit the MITS organization by enabling it to achieve consistency across the organization.  In addition, a MITS-wide Quality Assurance Program Office would prevent duplication of effort and extra costs if other MITS organization components begin implementing their own Quality Assurance Program Offices.

Recommendation

Recommendation 1:  As the Quality Assurance Program Office processes mature, the Chief Technology Officer should consider establishing a separate quality assurance group to provide coverage across the MITS organization.  Once the development processes throughout the organization have matured and CMMI maturity level 3 is within sight, the MITS organization should realign the quality assurance group to report to the Office of the Chief Technology Officer.

Management’s Response:  The IRS agreed with our recommendation.  The Chief Technology Officer plans to evaluate the feasibility and timing of this recommendation, in consideration of a variety of Information Technology factors, as the achievement of CMMI maturity level 3 is within sight.

The Quality Assurance Program Office Should Ensure All Products Include an Approval Signature

The Government Accountability Office’s Standards for Internal Control in the Federal Government[2] provide that internal control activities help ensure that management’s directives are carried out.  Control activities are the policies, procedures, techniques, and mechanisms that enforce management’s directives.  Control activities occur at all levels and functions of the entity.  They include a wide range of diverse activities such as approvals, authorizations, verifications, and the creation and maintenance of related records which provide evidence of execution of these activities as well as appropriate documentation.

The Quality Assurance Program Office develops products and documents to provide direction for its program and to summarize results of its efforts.  The products and documents include the annual audit plans, program guidance documents, audit reports, and Domain Director and Program Management Office Director Reports.  The Quality Assurance Program Office created directives and templates to help ensure these documents meet the needs of the program. 

The products and documents created generally met the office guidelines.  However, the guidelines do not require approval signatures and dates on the products by the Quality Assurance Program Office Director, the Program Management Office Director, or the Assistant Chief Information Officer for the Applications Development function.  Dated signatures by the appropriate levels of management provide assurance that the products and guidance documents issued or implemented were properly reviewed and approved.

Recommendation

Recommendation 2:  The Chief Technology Officer should require the Applications Development function to implement procedures to officially approve the Quality Assurance Program Office products and guidance documents including, but not limited to, the annual audit plans, program guidance documents, audit reports, and Domain Director and Program Management Office Director Reports.

Management’s Response:  The IRS agreed with our recommendation.  The Applications Development function’s Quality Assurance Program Office plans to modify its reporting procedures and templates to include approvals related to its products and guidance documentation.

The Quality Assurance Program Office Audit Documentation and Procedures Need Improvement

To assess the effectiveness of the Quality Assurance Program Office’s evaluations of projects and processes, we reviewed a judgmental sample of 29 Quality Assurance Program Office audits conducted in Fiscal Years 2008 and 2009[3] to determine whether the results reported by the Quality Assurance Program Office were supported by adequate documentary evidence.
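For reference, a quick check of the sample size (the counts come from the audit totals reported above and the sample breakdown in Appendix I; rounding to the nearest whole percent is ours):

\[
\frac{13 + 16}{65 + 79} = \frac{29}{144} \approx 20\%
\]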

Documentary evidence to assess and support the reported audit results was not effectively maintained

Although the Quality Assurance Program Office audit reports provide valuable feedback to the Applications Development function, the issues reported were not always supported with sufficient, relevant, and accessible evidence.  The Quality Assurance Program Office uses an IRS electronic network repository to maintain electronic files documenting the audit evidence.  The Applications Development function’s Quality Assurance Program Office Document Management Procedure, dated March 30, 2007, provides instructions for audit evidence (audit notifications, evaluation checklists, presentations, audit reports, and corrective action plans) to be filed within the Quality Assurance Program Office electronic network repository. 

We were informed that the required documentation missing from the repository was sometimes maintained on the personal computers of the quality specialists who performed the audits.  In addition, some audits were performed by contractor personnel who did not have access to the electronic repository.  This documentation was maintained in a separate hardcopy file and was not always scanned for inclusion in the electronic network repository.  In our review of 29 audits, we identified: 

·         19 audits that did not have all the supporting documentation filed in the Quality Assurance Program Office repository; however, the documents were provided from the quality specialists’ personal computer files.

·         11 audits that were missing the Preliminary Audit Plan Notice and/or Opening Presentation documentation.

·         4 audits that were missing the Audit Evaluation Checklist.

·         3 audits that were missing the Corrective Action Plan.

Adequate maintenance of audit evidence in a centralized and accessible location will allow the Quality Assurance Program Office to effectively provide the necessary documentation to support the audit results reported. 

Procedures to ensure the audit evidence was complete were not adequate or always followed

The audit files did not always clearly support the findings reported or ensure resolution of the noncompliance issues.  In our review of the 29 audits, we found the following procedural issues. 

The Audit Evaluation Checklist issues did not include references to the reports’ noncompliance issues or reasons for not including the issues identified on the Audit Evaluation Checklist as report findings.  The absence of cross-references between the audit documentation and the report prevents a reviewer from making an adequate assessment of the accuracy of the audit report.  For example:

·         18 Audit Evaluation Checklists did not include explanations for noncompliant and/or partially compliant issues and could not be easily referenced to noncompliance issues presented in the audit report. 

·         3 audits had inconsistent results reported between the checklist, audit report, and/or Program Management Office Director Reports.

The Quality Assurance Program Office includes program reviews as part of its audit coverage.  Program review audits were initiated because the Applications Development function’s domains reorganized their investment portfolios and aligned related projects in the same program.  The program review includes audits of related projects within a specific program.  The results of these audits are summarized and presented as one report for the entire program.  

Our sample review of 29 audits included 3 program reviews.  The official database for Quality Assurance Program Office audits did not include results for the program reviews or the reviews of the projects associated with the programs.  Without detailed program review results, the database does not provide a complete picture of the audit activities.  Although noncompliance issues from the program reviews are not included in the aging reports for follow up by the Quality Assurance Program Office, according to the Chief, Quality Assurance Program Office, these issues are monitored separately by the quality specialists.

The Quality Assurance Program Office reports noncompliance issues identified during an audit.  The project team is required to prepare a Corrective Action Plan detailing when each reported issue will be resolved.  The Quality Assurance Program Office monitors the resolution of the reported noncompliance issues to ensure the Applications Development function’s domains timely implement their corrective actions as documented in their plans and then update the database regarding the status of the resolution of the noncompliance issues.  If a noncompliance issue is not timely resolved, it is escalated through the management chain for resolution. 
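To illustrate the monitoring workflow described above, the following is a minimal, hypothetical sketch; the record fields, the escalation rule, and the function name are illustrative assumptions and do not represent the Quality Assurance Program Office's actual database schema or tools.

from datetime import date

# Hypothetical noncompliance-issue records; field names are assumptions for
# illustration only, not the actual Quality Assurance Program Office schema.
issues = [
    {"audit": "Example audit", "issue": "Corrective Action Plan incomplete",
     "planned_resolution": date(2009, 6, 30), "closed": None, "escalated": False},
    {"audit": "Example audit", "issue": "Configuration management items missing",
     "planned_resolution": date(2009, 9, 30), "closed": date(2009, 9, 15), "escalated": False},
]

def overdue_unescalated(issues, as_of):
    """Return open issues past their planned resolution date that were never escalated."""
    return [i for i in issues
            if i["closed"] is None
            and i["planned_resolution"] < as_of
            and not i["escalated"]]

# Issues flagged here would be escalated through the management chain, as the
# procedure describes, and the database updated with the resolution status.
for item in overdue_unescalated(issues, as_of=date(2009, 12, 31)):
    print(f"Escalate: {item['audit']} - {item['issue']}")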

We reviewed the status of the Corrective Action Plans prepared by the project teams to resolve the reported noncompliance issues and determined if the issues were monitored and resolved timely.  Our review found:  

·         16 audits did not provide explanations for overdue noncompliance issues, and the issues were not escalated to ensure resolution.

·         8 audits had inaccurate database date entries for the audit closing, audit report issuance, planned resolution, and/or noncompliance issue closed dates.

Adequate procedures to provide guidance on maintaining complete and accurate audit evidence will help ensure the Quality Assurance Program Office has sufficient support for its reports.  In addition, taking adequate action to follow up on noncompliant project issues will allow the Applications Development function the opportunity to achieve product and service improvements.

The peer review process was not complete or adequately documented

The peer review provides a control to help assure findings and conclusions in audit reports are fully supported by sufficient, competent, and relevant evidence.  The Quality Assurance Program Office developed peer review guidance to help facilitate a thorough and consistent peer review process.  Although the Quality Assurance Program Office initiated peer review activities, the process was not always thorough.   

We analyzed the samples of 29 audit reports, 10 Domain Director Reports, and the Fiscal Years 2008 and 2009 annual Program Management Office Director Reports to determine whether the reports were subjected to a peer review before issuance.  We reviewed the completed peer review comments to note any issues identified and whether all significant issues were resolved before report issuance.  Our review found:

·         The Quality Assurance Program Office repository did not include peer review documentation for any of the Quality Assurance Program Office’s audit reports, Domain Director Reports, and Program Management Office Director Reports. 

·         After we requested the missing documentation, we received peer review documentation for only 5 of the 29 audit reports, 3 of the 10 Domain Director Reports, and 1 of the 2 Program Management Office Director Reports.

·         The peer review documentation focused primarily on the report presentation (format/punctuation/grammar) rather than determining whether the report issues were supported with adequate documentary evidence. 

·         There was no clear indication that the quality specialists addressed the peer review comments prior to the issuance of the audit report.

·         The peer reviews did not address Audit Evaluation Checklist items that were inaccurately presented or missing from the audit reports.

By ensuring an adequate peer review, the Quality Assurance Program Office audit, Domain Director, and Program Management Office Director Reports will have greater assurance of being informative, accurate, and appropriate in perspective. 

Recommendations

Recommendation 3:  The Chief Technology Officer should ensure the Quality Assurance Program Office guidance includes requirements that:  1) quality specialists support all findings included in reports with available references to the documentation to support the report issues, 2) all noncompliance issues are adequately monitored to resolution, and 3) the database repository for Quality Assurance Program Office audits includes all audit results and corrective action dates.

Management’s Response:  The IRS agreed with our recommendation.  The Quality Assurance Program Office plans to strengthen the language relative to the reporting of the audit findings to include a mapping of the checklists to the findings and the monitoring of noncompliances.  To the extent possible, the Quality Assurance Program Office plans to ensure that the audit’s results are input into the database.  The IRS noted that the database currently does not have the ability to capture program data.  The Quality Assurance Program Office plans to explore the acquisition of a more robust tool to alleviate the issues described in item number three of this recommendation.

Recommendation 4:  The Chief Technology Officer should have the Quality Assurance Program Office further develop the peer review guidance to ensure audit reports are supported by sufficient, competent, and relevant evidence.  To help facilitate an adequate peer review, the Quality Assurance Program Office should analyze the peer review checklist to ensure it includes all appropriate issues for review and require its use in performing peer reviews.

Management’s Response:  The IRS agreed with our recommendation.  The Quality Assurance Program Office plans to strengthen the language relative to the peer review process and analyze the checklist to ensure it includes all appropriate issues for review.

 


Appendix I

 

Detailed Objective, Scope, and Methodology

 

The overall objective was to determine whether the Applications Development function’s Quality Assurance Program Office ensures development projects implement a coordinated set of activities that conform to organizational policies, processes, and procedures that meet the standards of the Software Engineering Institute’s CMMI - Development maturity level 2. 

We assessed the adequacy of program documentation and data provided by the IRS.  We supported this work by interviewing Quality Assurance Program Office personnel.  Specifically, we:

I.                   Determined whether the Applications Development function’s Quality Assurance Program Office had effective program management processes to objectively identify and evaluate projects and their related development work processes.

A.    Reviewed the current organizational structure of the Quality Assurance Program Office to determine if it is effective to provide independent evaluations of the Applications Development function processes and projects.

B.     Determined whether the Quality Assurance Program Office’s audit selection criteria and annual audit plan provide adequate review coverage to meet the goals and objectives for performing evaluations of the investment portfolio.

C.     Determined whether the Quality Assurance Program Office specialists have adequate qualifications to perform the audits in compliance with the CMMI.

D.    To obtain a general assessment of the adequacy of the Quality Assurance Program Office audit activities, we selected a sample of Quality Assurance Program Office Domain Director Reports to determine whether the audits performed were fairly presented.  Our review included a judgmentally selected sample of 10 of the 55 Domain Director Reports that were issued in Fiscal Years 2008 and 2009.  Also, we reviewed the Fiscal Years 2008 and 2009 annual Program Management Office Director Reports.  We determined whether the audits performed were fairly presented in the reports and if peer reviews were conducted for the sample of Domain Director Reports and the annual Program Management Office Director Reports.  We used a judgmental sample because we did not intend to project the results of this sample to the population.

II.                Determined whether the Quality Assurance Program Office has an effective process to perform the responsibilities as required by the CMMI.

A.    Reviewed the processes, guidance, and procedures implemented by the Quality Assurance Program Office and determined whether they meet the requirements of the CMMI-Development.

B.     Determined whether the audit reports, Domain Director Reports, and the annual Program Management Office Director Reports used to convey the results of the evaluations of the processes and projects meet the requirements of the CMMI- Development.

III.             Determined whether the project and process evaluations effectively identified and reported noncompliance with processes and procedures and whether adequate corrective actions were taken.

A.    To obtain a general assessment of the adequacy of the Quality Assurance Program Office audit activities, we selected a judgmental sample of Quality Assurance Program Office audit files to determine whether the Quality Assurance Program Office quality specialists obtained and documented adequate evidence to support the observations reported.  Our review included a judgmentally selected sample of 20 percent of the Quality Assurance Program Office audits conducted in Fiscal Years 2008 and 2009.  The sample of 29 audits included 13 of the 65 audits completed in Fiscal Year 2008 and 16 of the 79 audits completed in Fiscal Year 2009.  We used a judgmental sample because we did not intend to project the results of this sample to the population. 

B.     Determined whether each audit report was subjected to the Quality Assurance Program Office’s own peer review before issuance.

C.     Determined whether all the noncompliance issues identified in the reports were or are being monitored.

D.    Determined whether the auditee disagreed with any audit reports issued with noncompliance issues and whether the escalation procedures were followed to resolve the disagreements.

Internal controls methodology

Internal controls relate to management’s plans, methods, and procedures used to meet their mission, goals, and objectives.  Internal controls include the processes and procedures for planning, organizing, directing, and controlling program operations.  They include the systems for measuring, reporting, and monitoring program performance.  We determined the following internal controls were relevant to our audit objective:  CMMI-Development, the Enterprise Life Cycle, and the Internal Revenue Manual.  We supported this work by interviewing Applications Development function executives and the Chief, Quality Assurance Program Office, and staff.

 

Appendix II

 

Major Contributors to This Report

 

Alan R. Duncan, Assistant Inspector General for Audit (Security and Information Technology Services)

Scott A. Macfarlane, Director

Edward A. Neuwirth, Audit Manager

Michael A. Garcia, Senior Auditor

Beverly K. Tamanaha, Senior Auditor

Tina Wong, Senior Auditor

Louis V. Zullo, Senior Auditor

Tuyet Nguyen, Auditor

 

Appendix III

 

Report Distribution List

 

Commissioner  C

Office of the Commissioner – Attn:  Chief of Staff  C

Deputy Commissioner for Operations Support  OS

Associate Chief Information Officer, Applications Development  OS:CTO:AD

Associate Chief Information Officer, Enterprise Services  OS:CTO:ES

Associate Chief Information Officer, Modernization Program Management Office  OS:CTO:MP

Deputy Associate Chief Information Officer, Applications Development  OS:CTO:AD

Deputy Associate Chief Information Officer, Enterprise Services OS:CTO:ES

Director, Enterprise Systems Testing  OS:CTO:AD:TAD

Director, Risk Management OS:CTO:SP:RM

Director, Strategy and Capital Planning  OS:CTO:SP:S

Chief Counsel  CC

National Taxpayer Advocate  TA

Director, Office of Legislative Affairs  CL:LA

Director, Office of Program Evaluation and Risk Analysis  RAS:O

Office of Internal Control  OS:CFO:CPIC:IC

Audit Liaisons:

Associate Chief Information Officer, Applications Development  OS:CTO:AD

Associate Chief Information Officer, Enterprise Services  OS:CTO:ES

Director, Risk Management  OS:CTO:SP:RM

 

Appendix IV

 

Quality Assurance Program Office Audits Reviewed

The following tables present the results of our review of 29 Quality Assurance Program Office audits from Fiscal Years 2008 and 2009.  The review results presented below include the issues we identified relating to deficiencies in capturing adequate documentation to support audit results and in the appropriate application of procedural guidance in performing the audits.

Table 1:  Fiscal Year 2008 Audits

 

Quality Assurance Audit

TIGTA
Documentation Issue

TIGTA
Procedural Issue

1.

Correspondence Examination Automated System Major Windows-Intel (Intel-based Windows computer system)

No peer review documentation.

No explanation for partially compliant issues; unable to track these checklist issues to the reported noncompliance issues.

 

2.

Automated Underreporter Program

Peer review documentation not on electronic network repository; provided upon request.                  

No explanation for overdue noncompliance issues and not escalated.

 

3.

Automated Insolvency System  

Peer review documentation not on electronic network repository; provided upon request.                  

1) No explanation for overdue noncompliance issues and not escalated.                                                        

2) No explanation for partially compliant issues; unable to track these checklist issues to the reported noncompliance issues.

4.

Business Master File Research and Support Federal Unemployment Tax Act

1) No audit notification documentation.                                     

2) No peer review documentation.                          

3) Opening presentation, audit evaluation checklist, corrective action plan, and audit report not on electronic network repository; provided upon request.                  

1) No explanation for overdue noncompliance issues and not escalated.                                     

2) Incorrect dates on database.                           

3) No explanation for partially compliant issues; unable to track these checklist issues to the reported noncompliance issues.

5.

Notice Print Processing Individual Taxpayer Identification Number   

1) Audit notification and opening presentation not on electronic network repository; provided upon request.

2) No peer review documentation.                     

 3) Corrective action plan incomplete.

1) No explanation for overdue noncompliance issues and not escalated.                            

2) No explanation for noncompliant issues; unable to track these checklist issues to the reported noncompliance issues.

6.

Enterprise Return Retrieval      

1) No audit notification documentation.                                    

 2) No peer review documentation. 

None.

7.

Accounts Management Services Project Release 1.3                                          

1) Audit notification not on electronic network repository; provided upon request.                                         

 2) No peer review documentation.                                             

 3) Corrective action plan incomplete.

No explanation for overdue noncompliance issues and not escalated.

8.

Web Integration, Collaboration and Development                       

1) No audit notification and opening presentation documentation.

2) No peer review documentation

3) No audit evaluation checklist.

No explanation for overdue noncompliance issues and not escalated. 

9.

Web Integration, Collaboration and Development       

1) No audit notification and opening presentation documentation.

2) No peer review documentation.

3) No audit evaluation checklist.

No explanation for overdue noncompliance issues and not escalated.

10.

Embedded Quality

1) Audit notification not on electronic network repository; provided upon request.                                          

2) No opening presentation documentation.                       

3) No peer review documentation.                                         

4) No audit evaluation checklist. 

1) Incorrect dates on database.                                    

2) Noncompliance issues closed even though not resolved because a followup audit was being conducted.

11.

Security Audit and Analysis System

1) Audit notification, audit evaluation checklists, and corrective action plan not on electronic network repository; provided upon request.                                                                  

2) No peer review documentation.    

1) No explanation for partially compliant and noncompliance issues; unable to track these checklist issues to the reported noncompliance issues.

2) Incorrect dates on database.

3) No explanation for overdue noncompliance issues and not escalated.

4) Inconsistent results reported between checklist, audit report, and/or Program Management Office Director Report.

12.

Totally Automated Personnel System Program Operations and Maintenance

1) Audit notification and audit evaluation checklists not on electronic network repository; provided upon request.                                                  

2) No peer review documentation.

1) Incorrect dates on database.                                

2) No explanation for partially compliant and noncompliance issues; unable to track these checklist issues to the reported noncompliance issues.

3) No explanation for overdue noncompliance issues and not escalated.                                       

4) Inconsistent results reported between checklist, audit report, and/or Program Management Office Director Report. 

13.

Business Master File Electronic Filing                                                   

Audit notification and peer review documentation not on electronic network repository; provided upon request.  

1) No explanation for partially compliant and noncompliance issues; unable to track these checklist issues to the reported noncompliance issues.

2) Incorrect dates on database.  

 

Table 2:  Fiscal Year 2009 Audits

 

Quality Assurance Audit

TIGTA
Documentation Issue

TIGTA
Procedural Issue

1.

Return Inventory Classification System                                                              

Audit evaluation checklist with comments and peer review documentation not on electronic network repository; received upon request.                       

No explanation for overdue noncompliance issues and not escalated. 

2.

Examination Returns Control System

1) Audit evaluation checklist with comments not on electronic network repository; received upon request.                                                                    

2) No peer review documentation. 

No explanation for overdue noncompliance issues and not escalated. 

3.

Issue Based Management Information System

1) Opening presentation, audit evaluation checklist, audit report, and corrective action plan not on electronic network repository; provided upon request.                                                                 

2) No peer review documentation. 

1) No explanation for partially compliant and noncompliance issues; unable to track these checklist issues to the reported noncompliance issues.

2) No explanation for overdue noncompliance issues and not escalated.                                       
3) Inconsistent results reported between checklist, audit report, and/or Program Management Office Director Report. 

4.

Appeals Centralized Database System

Peer review documentation and corrective action plan not on electronic network repository; provided upon request.

No explanation for overdue noncompliance issues and not escalated.

5.

Inventory Delivery System             

1) Opening presentation, audit evaluation checklist, and audit report not on electronic network repository; provided upon request.            

2) Corrective action plan incomplete.                                                          

3) No peer review documentation.

No explanation for overdue noncompliance issues and not escalated. 

6.

Notice Conversion

1) No audit notification documentation.

2) No peer review documentation.

1) No explanation for noncompliant issues; unable to track these checklist issues to the reported noncompliance issues.

2) Incorrect dates on database.                               

7.

Individual Master File Online     

1) No audit notification documentation.  

2) No peer review documentation.                              

3) Opening presentation, audit evaluation checklist, and corrective action plan not on electronic network repository; provided upon request.

1) No explanation for partially compliant issues; unable to track these checklist issues to the reported noncompliance issues.

2) Program review; no audit information included on the Quality Assurance Program Office database.

 3) No explanation for overdue noncompliance issues and not escalated. 

8.

e-Services                                                   

1) Audit notification not on electronic network repository; provided upon request.                                          

2) No peer review documentation. 

1) No explanation for partially compliant issues; unable to track these checklist issues to the reported noncompliance issues.

2) Program review; no audit information included on the Quality Assurance Program Office database.

9.

Accounts Management Services Operations and Maintenance Sub-Projects:  Field Assistance Self Assist Model

1) Audit notification not on electronic network repository; provided upon request.                                               

2) No audit evaluation checklist.                                    

3) No peer review documentation. 

Program review; no audit information included on the Quality Assurance Program Office database.

10.

Automated Labor and Employee Relations Tracking System                  

1) No audit notification documentation.

2) No peer review documentation. 

1) No explanation for partially compliant and noncompliance issues; unable to track these checklist issues to the reported noncompliance issues.

 2) No explanation for overdue noncompliance issues and not escalated.

 3) Incorrect dates on database.     

11.

GovTrip

1) No audit notification and opening presentation documentation.

2) No peer review documentation.

No explanation for partially compliant issues; unable to track these checklist issues to the reported noncompliance issues.

12.

Unpaid Assessments                                  

1) Audit notification not on electronic network repository; provided upon request.                             

2) No peer review documentation. 

1) No explanation for partially compliant and noncompliance issues; unable to track these checklist issues to the reported noncompliance issues.

2) Noncompliance issues closed even though not resolved because a followup audit was being conducted.

13.

Custodial Detail Database          

1) Audit notification not on electronic network repository; provided upon request.                                           

2) No peer review documentation. 

1) No explanation for partially compliant and noncompliance issues; unable to track these checklist issues to the reported noncompliance issues.

2) Noncompliance issues closed even though not resolved because a followup audit was being conducted.

14.

Embedded Quality

No peer review documentation. 

No explanation for partially compliant and noncompliance issues; unable to track these checklist issues to the reported noncompliance issues.

15.

Individual Master File Document Specific

1) No audit notification documentation.                                                           

2) No peer review documentation.

1) No explanation for partially compliant and noncompliance issues; unable to track these checklist issues to the reported noncompliance issues.

2) Incorrect dates on database. 

16.

Modernized e-File Operations and Maintenance

1) No audit notification documentation.

2) No peer review documentation.

No explanation for partially compliant and noncompliance issues; unable to track these checklist issues to the reported noncompliance issues.

 

Appendix V

 

Glossary of Terms

 

Term

Definition

Best Practice

A technique or methodology that, through experience and research, has proven to reliably lead to a desired result.

Capability Maturity Model Integration - Development®

A model or collection of “best practices” that organizations follow to dramatically improve the effectiveness, efficiency, and quality of their product and service development work.  CMMI-Development is also supported by training courses and appraisal methodologies to help organizations objectively measure their improvement progress.

Enterprise Life Cycle

A structured business systems development method that requires the preparation of specific work products during different phases of the development process.

Release

A specific edition of software.

Software Engineering Institute

A Federally funded research and development center whose purpose is to help others make measured improvements in their software engineering capabilities. 

Task Order

An order for services planned against an established contract.

 

Appendix VI

 

Management’s Response to the Draft Report

 

DEPARTMENT OF THE TREASURY

INTERNAL REVENUE SERVICE

WASHINGTON, D.C. 20224

 

CHIEF TECHNOLOGY OFFICER

 

January 26, 2011

 

 

MEMORANDUM FOR DEPUTY INSPECTOR GENERAL FOR AUDIT

 

FROM:                             Terence V. Milholland /s/ Terence V. Milholland

Chief Technology Officer

 

SUBJECT:                       Draft Audit Report - The Applications Development Function's Quality Assurance Program Office Can Make Its Processes More Effective (Audit #201020026)

(i-trak #2011-17695)

Thank you for the opportunity to review your draft audit report and meet with the audit team to discuss earlier report observations. I appreciate your recognition of the Applications Development Quality Assurance Program Office's efforts to successfully meet the Software Engineering Institute's Capability Maturity Model Integration - Development maturity Level 2.

I am pleased with your assessment of this organization’s implementation of a comprehensive plan to assess the products and services of Applications Development by providing guidance and tools for auditing project development and employing qualified specialists to perform the audits.

I agree with your recommendations and the attachment to this memo details our planned actions to implement your suggestions.

Your continued support and the assistance and guidance your team provides have been valuable. If you have any questions, please contact me on (202) 622-6800, or Darrin Brown on (202) 283-4613.

Attachment

RECOMMENDATION #1:  As the Quality Assurance Program Office processes mature, the Chief Technology Officer should consider establishing a separate Chief Technology Officer quality assurance group to provide coverage across the MITS organization. Once the development processes throughout the organization have matured and CMMI maturity level 3 is within sight, the MITS organization should realign the quality assurance function to report to the Office of the Chief Technology Officer.

CORRECTIVE ACTION #1:  We agree with the recommendation. The Chief Technology Officer will evaluate the feasibility and timing of this recommendation, in consideration of a variety of Information Technology factors, as the achievement of CMMI maturity level 3 is within sight.

IMPLEMENTATION DATE:  January 1, 2013

RESPONSIBLE OFFICIAL:  ACIO, Strategy and Planning

CORRECTIVE ACTION MONITORING PLAN:  We enter accepted Corrective Actions into the Joint Audit Management Enterprise System (JAMES) and monitor them on a monthly basis until completion.

RECOMMENDATION #2:  The Chief Technology Officer should require the Applications Development function to implement procedures to officially approve the Quality Assurance Program Office products and guidance documents, including but not limited to the annual audit plans, program guidance documents, audit reports, and Domain Director and Program Management Office Director Reports.

CORRECTIVE ACTION #2:  We agree with the recommendation. The Applications Development Quality Assurance Program Office will modify its reporting procedures and templates to include approvals related to their products and guidance documentation.

IMPLEMENTATION DATE:  November 1, 2011

RESPONSIBLE OFFICIAL:  ACIO, Applications Development

CORRECTIVE ACTION MONITORING PLAN:  We enter accepted Corrective Actions into the Joint Audit Management Enterprise System (JAMES) and monitor them on a monthly basis until completion.

RECOMMENDATION #3:  The Chief Technology Officer should ensure the Quality Assurance Program Office guidance includes requirements that:  1) quality specialists support all findings included in reports with available references to the documentation to support the report issues, 2) all noncompliance issues are adequately monitored to resolution, and 3) the database repository for Quality Assurance Program Office audits includes all audit results and corrective action dates.

CORRECTIVE ACTION #3:  We agree with the recommendation. The Quality Assurance Program Office will strengthen the language relative to the reporting of the audit findings to include a mapping of the checklists to the findings and the monitoring of non-compliances. To the extent possible, the Quality Assurance Program Office will ensure that the audit's results are input into the database. It is noted the database currently does not have the ability to capture program data. We will explore the acquisition of a more robust tool to alleviate the issues described in bullet #3 of this recommendation.

IMPLEMENTATION DATE:  November 1, 2011

RESPONSIBLE OFFICIAL:  ACIO, Applications Development

CORRECTIVE ACTION MONITORING PLAN:  We enter accepted Corrective Actions into the Joint Audit Management Enterprise System (JAMES) and monitor them on a monthly basis until completion.

RECOMMENDATION #4:  The Chief Technology Officer should have the Quality Assurance Program Office further develop the peer review guidance to ensure audit reports are supported by sufficient, competent, and relevant evidence. To help facilitate an adequate peer review the Quality Assurance Program Office should analyze the peer review checklist to ensure it includes all appropriate issues for review, and require its use in performing peer reviews.

CORRECTIVE ACTION #4:  We agree with the recommendation. The Quality Assurance Program Office will strengthen the language relative to the peer review process and analyze the checklist to ensure it includes all appropriate issues for review.

IMPLEMENTATION DATE:  November 1, 2011

RESPONSIBLE OFFICIAL:  ACIO, Applications Development

CORRECTIVE ACTION MONITORING PLAN:  We enter accepted Corrective Actions into the Joint Audit Management Enterprise System (JAMES) and monitor them on a monthly basis until completion.



[1] See Appendix V for a glossary of terms.

[2] GAO/AIMD-00-21.3.1, November 1999.

[3] Appendix IV presents a summary of the Treasury Inspector General for Tax Administration (TIGTA) review results for the 29 Quality Assurance Program Office audits sampled.