Defense Financial Management: Immature Software Development Processes at
Indianapolis Increase Risk (Letter Report, 05/23/97, GAO/AIMD-97-41).
GAO reviewed the Defense Finance and Accounting Service Financial
Systems Activity (FSA)-Indianapolis' capability for developing and
maintaining software for its information systems.
GAO noted that: (1) although FSA-Indianapolis does not yet satisfy the
criteria for a level 2 (i.e., repeatable) software development
capability on any of the four projects GAO reviewed, the two projects
under its software process improvement (SPI) program showed strengths
and improvement activities in many of the key process areas; (2) for
example, projects under the SPI program generally kept software-related
work products consistent with requirements; (3) in contrast, projects
not under SPI had few such identifiable strengths or improvement
activities; (4) while SPI is making progress in ensuring that its
projects implement defined and documented processes, many of its
processes were not yet institutionalized; (5) for example, many policies
were still in draft form or were in the planning phase, and therefore
were not yet an ongoing way of doing business; (6) in addition, software
quality assurance activities, such as audits, were not used to ensure
that defined software processes and standards were being followed; (7)
such deficiencies pose unnecessary risks to the success of the software
project until they are addressed; (8) by more rigorously implementing
its project management processes among its SPI projects,
FSA-Indianapolis could accelerate progress toward reaching the level 2
capability; and (9) this would enhance its ability to repeat individual
project successes within similar application areas.
--------------------------- Indexing Terms -----------------------------
REPORTNUM: AIMD-97-41
TITLE: Defense Financial Management: Immature Software Development
Processes at Indianapolis Increase Risk
DATE: 05/23/97
SUBJECT: Computer software verification and validation
Financial management systems
Systems development life cycle
Requirements definition
Strategic information systems planning
Project monitoring
Subcontracts
Quality control
IDENTIFIER: Software Capability Maturity Model
DFAS Software Process Improvement Strategic Action Plan
DFAS Defense Transportation Payment System
DFAS Standard Army Financial Inventory Accounting and
Reporting System Modernization
Army Corps of Engineers Financial Management System
Army Standard Finance System
**************************************************************************
* This file contains an ASCII representation of the text of a GAO *
* report. Delineations within the text indicating chapter titles, *
* headings, and bullets are preserved. Major divisions and subdivisions *
* of the text, such as Chapters, Sections, and Appendixes, are *
* identified by double and single lines. The numbers on the right end *
* of these lines indicate the position of each of the subsections in the *
* document outline. These numbers do NOT correspond with the page *
* numbers of the printed product. *
* *
* No attempt has been made to display graphic images, although figure *
* captions are reproduced. Tables are included, but may not resemble *
* those in the printed version. *
* *
* A printed copy of this report may be obtained from the GAO Document *
* Distribution Facility by calling (202) 512-6000, by faxing your *
* request to (301) 258-4066, or by writing to P.O. Box 6015, *
* Gaithersburg, MD 20884-6015. We are unable to accept electronic orders *
* for printed documents at this time. *
**************************************************************************
Cover
================================================================ COVER
Report to the Under Secretary of Defense (Comptroller)
June 1997
DEFENSE FINANCIAL MANAGEMENT -
IMMATURE SOFTWARE DEVELOPMENT
PROCESSES AT INDIANAPOLIS INCREASE
RISK
GAO/AIMD-97-41
Defense Financial Management
(511502)
Abbreviations
=============================================================== ABBREV
AIS - automated information system
ASQC - American Society for Quality Control
CCB - configuration control board
CDA - central design activity
CEFMS - Corps of Engineers Financial Management System
CFO - Chief Financial Officer
CMM - capability maturity model
DFAS - Defense Finance and Accounting Service
DOD - Department of Defense
DTRS - Defense Transportation Payment System
FSA - financial systems activity
FSO - financial systems organization
KPA - key process area
SCCB - software configuration control board
SCE - software capability evaluation
SCM - software configuration management
SDS - system development scenario
SEI - Software Engineering Institute
SMS - system modification scenario
SPI - software process improvement
SQA - software quality assurance
STANFINS - Standard Finance System
  STARFIARS-MOD - Standard Army Financial Inventory Accounting and
                  Reporting System Modernization
Letter
=============================================================== LETTER
B-276764
June 6, 1997
The Honorable John J. Hamre
The Under Secretary of Defense (Comptroller)
Dear Mr. Hamre:
In conjunction with our responsibilities to audit the U.S.
government's financial statements, we are reviewing the Department of
Defense (DOD) financial management systems. As you know, Defense's
ability to produce accurate, auditable financial statements and other
reliable management reports as required by the Chief Financial
Officers (CFO) Act of 1990, as expanded upon by the Government
Management Reform Act of 1994, has been hampered by inadequate
financial systems. This report discusses the results of our
evaluation of the Defense Finance and Accounting Service (DFAS)
Financial Systems Activity (FSA)-Indianapolis' capability for
developing and maintaining software for its information systems. Our
objective was to evaluate software development processes used at
FSA-Indianapolis. The four projects we reviewed were selected by the
FSA director as those best representing its software development
processes and practices.
BACKGROUND
------------------------------------------------------------ Letter :1
DFAS was created in 1991 from the financial centers of the military
departments as the executive agent responsible for finance and
accounting functions within DOD. Through consolidation, DFAS
acquired the responsibility for more than 200 existing "legacy"
finance and accounting systems, commonly referred to as automated
information systems. Some are being eliminated and others are being
further consolidated into a smaller number of "migratory" and
"interim migratory" systems. In October 1993, the newly formed DFAS
organization, called the Financial Systems Organization (FSO), was
created to provide traditional central design activity services, as
well as technical support, on a fee-for-service basis. FSO
headquarters staff were in Indianapolis with additional personnel at
six geographically dispersed FSAs having the primary mission to
develop, modify, and maintain DFAS' automated information systems and
secondarily to provide technical support in a number of
systems-related areas.
Just prior to this reorganization, in June 1993, the Indianapolis
center completed an assessment of the software engineering processes
associated with its role as a central design activity. This internal
software process assessment concluded that the overall software
engineering process practiced at Indianapolis was consistent with
level 1 of the Software Engineering Institute's (SEI) capability
maturity model.\1 SEI characterizes a Level 1 software process, which
is the initial and most basic of the five levels, as ad hoc, and
occasionally even chaotic, with few processes defined, and success
depending on individual effort. Table 1 describes each level of this
model.
Today, the FSAs are mainly concerned with maintaining and modifying
the 109 existing automated systems; software development,
modification, and maintenance of DFAS' mission-oriented and support
systems reportedly consumed more than 80 percent of the FSO's fiscal
year 1995 budget. Recognizing this role, in 1994, FSO initiated a
major effort to improve its underlying software processes. In March
of that year, the original FSO-developed software process improvement
(SPI) strategic action plan was approved.
Since 1994, FSO has been implementing its long-term SPI plan to
improve and standardize maintenance and modification processes. The
plan includes the implementation of a system modification scenario
and achievement of a level 2 software engineering capability
according to the criteria of SEI's model. (See appendix II for a
list of the 10 major objectives in the strategic plan, and the names
and locations of other activities and systems under the SPI
umbrella.)
--------------------
\1 The Software Engineering Institute (SEI) is a nationally
recognized, federally funded research and development center
established at Carnegie Mellon University in Pittsburgh,
Pennsylvania, to address software development issues. In the late
1980s, with assistance from the Mitre Corporation, SEI developed a
process maturity framework designed to assist organizations in
improving their software processes. In general, software process
maturity serves as an indicator of the likely range of cost,
schedule, and quality of results that can be expected to be achieved
by projects within a software organization.
RESULTS IN BRIEF
------------------------------------------------------------ Letter :2
Although FSA-Indianapolis does not yet satisfy the criteria for a
level 2 (i.e., repeatable) software development capability on any of
the four projects we reviewed, the two projects under its SPI program
showed strengths and improvement activities in many of the key
process areas (KPAs). For example, projects under the SPI program
generally kept software-related work products consistent with
requirements. In contrast, projects not under the SPI program had
few such identifiable strengths or improvement activities.
While the SPI program is making progress in ensuring that its
projects implement defined and documented processes, many of its
processes were not yet institutionalized.\2 For example, many
policies were still in draft form or were in the planning phase, and
therefore were not yet an ongoing way of doing business. In
addition, software quality assurance activities, such as audits, were
not used to ensure that defined software processes and standards were
being followed. Such deficiencies pose unnecessary risks to the
success of the software project until they are addressed.
By more rigorously implementing its project management processes
among its SPI projects, FSA-Indianapolis could accelerate progress
toward reaching the level 2 capability. This would enhance its
ability to repeat individual project successes within similar
application areas.
--------------------
\2 The SEI defines institutionalization as the building of an
infrastructure and corporate culture that support methods, practices,
and procedures so that they become the ongoing way of doing business,
even after those who originally defined them are gone. Software
Capability Evaluation, Version 3.0, Method Description
(CMU/SEI-96-TR-002, April 1996).
SCOPE AND METHODOLOGY
------------------------------------------------------------ Letter :3
To evaluate FSA-Indianapolis' software development capability, an
SEI-trained team of GAO specialists, including an authorized lead
evaluator trained in this technique by SEI, used version 3.0 of the
Software Engineering Institute's (SEI) software capability evaluation
(SCE) method.\3 The SCE is a method of
assessing agencies' and contractors' software development processes
against industry-accepted criteria in SEI's five-level software
capability maturity model (CMM), as shown in table 1. These levels
and the key process areas described within each level define an
organization's ability to develop software, and can be used to
improve its software development processes. The findings generated
from an SCE identify (1) process strengths that mitigate risks, (2)
process weaknesses that increase risks, and (3) improvement
activities that indicate potential mitigation of risks.
Table 1
CMM Levels and Descriptions
Level Name Description
------------- ------------- ----------------------------------------
5 Optimizing Continuous process improvement is
enabled by quantitative feedback from
the process and from piloting innovative
ideas and technologies.
4 Managed Detailed measures of the software
process and product quality are
collected. Both the software process and
products are quantitatively understood
and controlled.
3 Defined The software process for both management
and engineering activities is
documented, standardized, and integrated
into a standard software process for the
organization. All projects use an
approved, tailored version of the
organization's standard software process
for developing and maintaining software.
2 Repeatable Basic project management processes are
established to track cost, schedule, and
functionality. The necessary process
discipline is in place to repeat earlier
successes on projects with similar
applications.
1 Initial The software process is characterized as
ad hoc, and occasionally even chaotic.
Few processes are defined, and success
depends on individual effort.
----------------------------------------------------------------------
Note: According to an SEI study (Moving on Up: Data and Experience
Doing CMM-Based Process Improvement, Technical Report
CMU/SEI-95-TR-008, August 1995) of 48 organizations that implemented
software process improvement programs, increasing process maturity
from level 1 to level 2 took an average of 30 months, with a range of
11 to 58 months.
Source: Capability Maturity Model for Software, Version 1.1,
(Technical Report CMU/SEI-93-TR-24, February 1993).
We requested that FSA-Indianapolis identify for our evaluation those
projects that best represented its software development processes.
The Director, FSA-Indianapolis, identified two SPI projects and two
non-SPI projects, as follows:
--------------------
\3 Version 3.0 of the SCE method is based on SEI's capability
maturity model, version 1.1.
SPI PROJECTS
---------------------------------------------------------- Letter :3.1
Defense Transportation Payment System (DTRS)
Standard Army Financial Inventory Accounting and Reporting System
Modernization (STARFIARS-MOD)
NON-SPI PROJECTS
---------------------------------------------------------- Letter :3.2
Corps of Engineers Financial Management System (CEFMS)\4
Standard Finance System (STANFINS)
We evaluated the software development processes used on these
projects, focusing on the key process areas necessary to achieve a
repeatable capability. In particular, the team evaluated the degree
of implementation and institutionalization of all KPA goals in
accordance with the SCE methodology. Accordingly, rating judgments
were made at the goal level. A goal is satisfied if the associated
findings indicate that it is implemented and institutionalized either
as defined in the CMM, with no significant weaknesses, or through an
adequate alternative.
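To make the roll-up of these judgments concrete, the sketch below
applies the rating rules in this section to a few of the goal ratings
reported in tables 3 and 4. It assumes the usual CMM convention, not
stated explicitly here, that a key process area is passed only when
every goal in it is satisfied and that an organization is rated at
level 2 only when every evaluated project passes every level 2 KPA;
the goal names are shorthand labels, not SEI's wording.

    # Minimal sketch of how goal-level ratings roll up to KPA and
    # maturity-level conclusions. Assumes a KPA passes only when every
    # goal in it is rated "satisfied" (the usual CMM convention); goal
    # names are shorthand, and ratings are taken from tables 3 and 4.

    def kpa_passed(goal_ratings):
        """A KPA passes only if every one of its goals is satisfied."""
        return all(rating == "satisfied" for rating in goal_ratings.values())

    def meets_level_2(projects):
        """Level 2 requires every evaluated project to pass every level 2 KPA."""
        return all(kpa_passed(goals)
                   for kpas in projects.values()
                   for goals in kpas.values())

    projects = {
        "DTRS": {
            "requirements management": {
                "baseline control": "satisfied",
                "consistency with requirements": "satisfied"},
            "software project planning": {
                "estimates documented": "partially satisfied",
                "activities and commitments planned": "partially satisfied",
                "commitments agreed to": "satisfied"},
        },
        "STANFINS": {
            "requirements management": {
                "baseline control": "unsatisfied",
                "consistency with requirements": "unsatisfied"},
            "software project planning": {
                "estimates documented": "unsatisfied",
                "activities and commitments planned": "unsatisfied",
                "commitments agreed to": "unsatisfied"},
        },
    }

    print(meets_level_2(projects))   # False: not every project passes every KPA

Note that a "partially satisfied" goal does not count toward passing a
KPA, consistent with the requirement that a goal be both implemented
and institutionalized.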
Organizations that have a repeatable software development
process--one that can be counted on to render the same results if the
same processes are followed--have been able to significantly improve
their productivity and return on investment. According to SEI,\5
processes for a repeatable capability (CMM level 2) are considered
the most basic in establishing discipline and control in software
development and are crucial steps for any project to mitigate risks
associated with cost, schedule, and quality. As shown in table 2,
these processes include (1) requirements management, (2) software
project planning, (3) software project tracking and oversight, (4)
software subcontract management, (5) software quality assurance, and
(6) software configuration management.
Table 2
CMM Level 2 "Repeatable" Key Process
Area Descriptions
CMM Level 2 KPAs Description
---------------------------- ----------------------------------------
Requirements management Defining, validating, and prioritizing
requirements, such as functions,
performance, and delivery dates.
Software project planning Developing estimates for the work to be
performed, establishing the necessary
commitments, and defining the plan to
perform the work.
Software project tracking Tracking and reviewing software
and oversight accomplishments and results against
documented estimates, commitments, and
plans and adjusting these based on the
actual accomplishments and results.
Software subcontract Selecting qualified contractors and
management managing them effectively.
Software quality assurance Reviewing and auditing the software
products and activities to ensure that
they comply with the applicable
processes, standards, and procedures and
providing the staff and managers with
the results of their reviews and audits.
Software configuration Selecting project baseline items, such
management as specifications; systematically
controlling these items and changes to
them; and recording and reporting status
and change activity for these items.
----------------------------------------------------------------------
The Department of Defense provided written comments on a draft of
this report. These comments are presented and evaluated at the end
of this report and are reprinted in appendix I. We conducted our
review from August 1996 through February 1997 in accordance with
generally accepted government auditing standards.
--------------------
\4 CEFMS represents FSA-Indianapolis' attempt to adapt the Corps of
Engineers' Financial Management System (CEFMS) for Army posts, camps,
and stations.
\5 Software Capability Evaluation, Version 3.0, Method Description
(CMU/SEI-96-TR-002, April 1996).
FSA-INDIANAPOLIS SOFTWARE
DEVELOPMENT PROCESSES ARE
IMMATURE
------------------------------------------------------------ Letter :4
In order for FSA-Indianapolis to be rated at CMM level 2, all
evaluated projects would have to pass every level 2 KPA. As shown in
appendix III, this is not the case. No project passed every KPA, nor
was there a single KPA that was passed by every project. Therefore,
we conclude that FSA-Indianapolis as an organization remains a long
way from achieving the repeatable level of maturity (level 2).
Organizations that have not developed the process discipline
necessary to better manage and control their projects at the
repeatable level incur greater risk of schedule delay, cost overruns,
and poor-quality software. To compensate, such organizations
typically rely upon the variable capabilities of individuals rather
than on institutionalized processes considered basic to software
development.
Highlights of our evaluation of the four projects follow.
Requirements Management. The purpose of requirements management is
to establish a common understanding between the customer and the
software project of the customer's requirements that will be
addressed by the software project.
DTRS was the only project that met all of the goals for requirements
management. Specifically, for this project, functional and
performance requirements and delivery dates were defined, validated,
and prioritized.
On the other projects, functional requirements were not adequately
reviewed in the early stages of the software development life cycle.
Specifically, the configuration control boards (CCBs) used in
FSA-Indianapolis were responsible primarily for funding decisions,
not for reviewing and authorizing the establishment of, and changes
to, software baselines. This situation can lead to wasted effort
developing requirements that may prove technically infeasible.
In addition, the CEFMS project and its prime contractor in
FSA-Indianapolis depended on a subcontractor to perform the
requirements management function, but the subcontractor did not
satisfy any of the goals within the requirements management KPA.
Specifically, although the contractor reviewed software change
requests before they were incorporated into the CEFMS project, a
baseline of requirements was not established. Without a baseline, it
is difficult to manage changes to the project and maintain the
stability of the software produced from release to release.
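As a rough illustration of the baseline control these findings call
for, the sketch below models a requirements baseline that can change
only through change requests a configuration control board has
reviewed and approved. It is a generic example of the practice, with
hypothetical names and fields; it does not depict FSA-Indianapolis'
or the contractor's actual procedures.

    # Generic sketch of a CCB-controlled requirements baseline. Names,
    # fields, and requirement text are hypothetical illustrations of the
    # practice, not FSA-Indianapolis' actual records or procedures.

    from dataclasses import dataclass, field

    @dataclass
    class ChangeRequest:
        req_id: str
        description: str
        approved_by_ccb: bool = False   # set True only after CCB review

    @dataclass
    class RequirementsBaseline:
        release: str
        requirements: dict = field(default_factory=dict)   # req_id -> text

        def apply(self, change):
            # Changes reach the baseline only after CCB approval, so each
            # release can be traced to an authorized set of requirements.
            if not change.approved_by_ccb:
                raise ValueError(f"{change.req_id} has not been approved by the CCB")
            self.requirements[change.req_id] = change.description

    baseline = RequirementsBaseline(
        release="1.0",
        requirements={"REQ-001": "Compute transportation payments"})
    change = ChangeRequest("REQ-002", "Add a monthly reconciliation report")
    # baseline.apply(change)        # would raise: the CCB has not approved it
    change.approved_by_ccb = True
    baseline.apply(change)          # now part of the controlled baseline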
Table 3
Results for the Requirements Management
Key Process Area
Ratings for each requirements management goal, by project, with the
key activities evaluated under that goal:

Goal: System requirements allocated to software are controlled to
establish a baseline for software engineering and management use.
  STANFINS: Unsatisfied    DTRS: Satisfied
  STARFIARS-MOD: Unsatisfied    CEFMS: Unsatisfied
  Activity:
  -- The software engineering group reviews the allocated requirements
     before they are incorporated into the software project.

Goal: Software plans, products, and activities are kept consistent
with the system requirements allocated to software.
  STANFINS: Unsatisfied    DTRS: Satisfied
  STARFIARS-MOD: Satisfied    CEFMS: Partially satisfied
  Activities:
  -- The software engineering group uses the allocated requirements as
     the basis for software plans, work products, and activities.
  -- Changes to the allocated requirements are reviewed and
     incorporated into the software project.
--------------------------------------------------------------------------------
Satisfied - Practices that achieve the intent of the goal were
implemented.
Unsatisfied - Weaknesses that significantly impact the goal exist.
Partially satisfied - Practices that achieve the intent of the goal
were implemented but not institutionalized.
Software Project Planning. The purpose of software project
planning is to establish reasonable plans for performing the
software engineering and for managing the software project.
DTRS and STARFIARS-MOD had software development plans and documented
software estimates (e.g., effort, cost, and schedule) for their
projects. Further, STARFIARS-MOD had recently implemented function
point analysis\6 to estimate software size. On the other hand,
software risks associated with the cost, resource, schedule, and
technical aspects of the projects were not adequately identified,
assessed, or documented for any of the four projects evaluated.
Without risk assessment, the reliability of estimates is
questionable, and the ability of a project to meet its schedule is
reduced.
Table 4
Results for the Software Project
Planning Key Process Area
Ratings for each software project planning goal, by project, with the
key activities evaluated under that goal:

Goal: Software estimates are documented for use in planning and
tracking the software project.
  STANFINS: Unsatisfied    DTRS: Partially satisfied
  STARFIARS-MOD: Partially satisfied    CEFMS: Unsatisfied
  Activities:
  -- Estimates for the size of the software work products (or changes
     to the size of software work products) are derived according to a
     documented procedure.
  -- Estimates for the software project's effort and costs are derived
     according to a documented procedure.
  -- Estimates for the project's critical computer resources are
     derived according to a documented procedure.
  -- The project's software schedule is derived according to a
     documented procedure.
  -- Software planning data are recorded.

Goal: Software project activities and commitments are planned and
documented.
  STANFINS: Unsatisfied    DTRS: Partially satisfied
  STARFIARS-MOD: Partially satisfied    CEFMS: Unsatisfied
  Activities:
  -- Software project planning is initiated in the early stages of,
     and in parallel with, the overall project planning.
  -- A software life cycle with predefined stages of manageable size
     is identified or defined.
  -- The project's software development plan is developed according to
     a documented procedure.
  -- The plan for the software project is documented.
  -- Software work products that are needed to establish and maintain
     control of the software project are identified.
  -- The software risks associated with the cost, resource, schedule,
     and technical aspects of the project are identified, assessed,
     and documented.
  -- Plans for the project's software engineering facilities and
     support tools are prepared.

Goal: Affected groups and individuals agree to their commitments
related to the software project.
  STANFINS: Unsatisfied    DTRS: Satisfied
  STARFIARS-MOD: Partially satisfied    CEFMS: Unsatisfied
  Activities:
  -- The software engineering group participates on the project
     proposal team.
  -- The software engineering group participates with other affected
     groups in the overall project planning throughout the project's
     life.
  -- Software project commitments made to individuals and groups
     external to the organization are reviewed with senior management
     according to a documented procedure.
--------------------------------------------------------------------------------
Satisfied - Practices that achieve the intent of the goal were
implemented.
Unsatisfied - Weaknesses that significantly impact the goal exist.
Partially satisfied - Practices that achieve the intent of the goal
were implemented but not institutionalized.
Software Project Tracking and Oversight. The purpose of software
project tracking and oversight is to provide adequate visibility
into actual progress so that management can take effective
actions when the software project's performance deviates
significantly from software plans.
FSA-Indianapolis projects that were evaluated underwent periodic
status reviews at meetings with key personnel present, and changes to
commitments were generally agreed to by the affected groups and
individuals. However, software risks associated with cost, resource,
schedule, and technical aspects of the projects were not tracked.
Moreover, although the SPI projects tracked performance and actual
results, a mechanism for making corrections if and when projects
failed to meet estimates was not in place. As a result of these
weaknesses, the projects reviewed are more likely to be affected by
unplanned events and are less likely to meet schedule and cost
commitments.
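A corrective-action mechanism of the kind found to be missing can be
as simple as comparing actual results with the documented estimates
at each status review and flagging any item whose deviation exceeds
an agreed threshold. The sketch below is a generic illustration under
that assumption; the 10 percent threshold, the project name, and all
figures are hypothetical and are not drawn from the projects
reviewed.

    # Generic sketch of tracking actual results against documented
    # estimates and flagging items that need corrective action. The
    # threshold and all project figures are hypothetical examples.

    def deviation(actual, estimate):
        """Fractional deviation of an actual value from its estimate."""
        return (actual - estimate) / estimate

    def status_review(project, estimates, actuals, threshold=0.10):
        """List the items whose deviation from plan exceeds the threshold."""
        flagged = []
        for item, estimate in estimates.items():
            off_plan = deviation(actuals[item], estimate)
            if abs(off_plan) > threshold:
                flagged.append(f"{project}: {item} off plan by {off_plan:+.0%};"
                               " corrective action needed")
        return flagged

    estimates = {"effort (staff-months)": 120, "cost ($K)": 950, "schedule (months)": 18}
    actuals   = {"effort (staff-months)": 150, "cost ($K)": 980, "schedule (months)": 22}

    for finding in status_review("Project A", estimates, actuals):
        print(finding)
    # Effort and schedule exceed the 10 percent threshold; cost does not.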
Table 5
Results for the Software Project
Tracking and Oversight Key Process Area
--------------------
\6 Function points are derived using an empirical relationship based
on countable measures (e.g., number of user inputs, number of user
outputs, number of user inquiries, number of files, and number of
external interfaces) of software's information domain and assessments
of software complexity.
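To make the footnote concrete, the sketch below works through the
widely published Albrecht/IFPUG-style function point computation,
using the standard "average" complexity weights and the usual value
adjustment factor of 0.65 + 0.01 times the sum of 14 general system
characteristics rated 0 to 5. The counts and ratings shown are
hypothetical, and the report does not say which variant of the
technique STARFIARS-MOD actually used.

    # Illustrative function point calculation using the widely published
    # Albrecht/IFPUG-style "average" complexity weights. The counts and
    # the 14 general system characteristic ratings are hypothetical; the
    # report does not specify the variant STARFIARS-MOD used.

    AVERAGE_WEIGHTS = {
        "user inputs": 4,
        "user outputs": 5,
        "user inquiries": 4,
        "internal files": 10,
        "external interfaces": 7,
    }

    def unadjusted_fp(counts):
        """Sum of each countable measure times its complexity weight."""
        return sum(counts[measure] * AVERAGE_WEIGHTS[measure] for measure in counts)

    def function_points(counts, gsc_ratings):
        """Apply the value adjustment factor 0.65 + 0.01 * sum of the
        14 general system characteristics, each rated 0-5."""
        value_adjustment = 0.65 + 0.01 * sum(gsc_ratings)
        return unadjusted_fp(counts) * value_adjustment

    counts = {
        "user inputs": 24,
        "user outputs": 30,
        "user inquiries": 12,
        "internal files": 8,
        "external interfaces": 5,
    }
    gsc = [3] * 14                                 # all characteristics "average"

    print(unadjusted_fp(counts))                   # 409 unadjusted function points
    print(round(function_points(counts, gsc), 1))  # 409 * 1.07 = 437.6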