DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective
Order. This redacted version has been approved for public release.
Decision
Matter of: American Systems Corporation
File: B-292755; B-292755.2
Date: December 3, 2003
Joseph G. Billings, Esq., for the protester.
K. Lisa Daniel, Esq., Department of the Navy, for the agency.
Charles W. Morrow, Esq., and James A. Spangenberg, Esq., Office of the
General Counsel, GAO, participated in the preparation of the decision.
DIGEST
1. Agency reasonably rated proposal as marginal/high risk for technical
factor under solicitation requesting proposals for training systems
devices and curricula, where the proposal failed to provide sufficient
details of its written instructional systems development processes and
failed to provide sufficient details in its sample task response.
2. Discussions were not misleading, even though, on the basis of an
incorrect assumption, protester misinterpreted a particular discussion
question, where a reasonably diligent offeror would have correctly
understood, or requested clarification of, the agency discussion question.
3. Agency reasonably made one of the awards under solicitation
contemplating multiple awards of indefinite-delivery/indefinite-quantity
contracts to an offeror whose marginal proposal contained one significant
weakness, but not to an offeror whose marginal proposal contained two
significant weaknesses under the same technical subfactor (with other
aspects of the evaluation being relatively equal), where the agency
reasonably determined this was a discriminator between the technical
merits of the two proposals that justified award to one and not the other.
DECISION
American Systems Corporation (ASC) protests the elimination of its
proposal from the competitive range under request for proposals (RFP) No.
N61339-02-R-0063, issued by the Naval Air Warfare Center Training Systems
Division, for training systems devices and curricula.
We deny the protest.
The RFP, issued as a partial small business set-aside, was to procure
trainer/training systems and technology-based curricula. The RFP
contemplated the award of multiple indefinite-delivery/indefinite-quantity
(ID/IQ) task order contracts for an 8‑year period for two separate
contractual lots. Lot I is not at issue here because the protester
submitted a proposal only for Lot II. Lot II, involving technology-based
curricula, required the contractor to accept task orders to perform
planning, analysis, design, development, implementation, evaluation,
support, maintenance, modification, modeling and simulation, and
management of technology-based training products.
The RFP provided for award of contracts for Lot II to those offerors with
proposals representing the "greatest value," considering three evaluation
factors: technical, past performance, and price. The technical factor
comprised two equally weighted subfactors: instructional systems
development (ISD) and management. The technical and past performance
factors were of equal importance and when combined were considered
significantly more important than price.
The RFP required proposals to describe the offeror's formal, written,
documented and in-place processes that will be used in the performance of
orders under the ID/IQ contract, and noted that "the Government is
concerned that awardees under the [contract] be organizationally mature
with established processes and procedures that will ensure repeatable
success in performance." RFP § L.3.2. In addition, offerors were
required to respond to a sample task, primarily at an oral presentation.
Under the technical factor, the RFP stated that the proposals would be
evaluated to determine the offeror's ability to plan, analyze, design,
develop, implement, evaluate, support, maintain, modify, and manage
technology-based training products. Under the ISD subfactor, the RFP
stated that proposals would be evaluated "to determine the offeror's
ability to provide established and proven processes to reliably ensure the
successful completion of prospective orders," and that the "Sample Task
will be evaluated to ensure incorporation of these processes."[1] RFP §
M.3.2(a). In responding to the sample task, offerors were required to
demonstrate understanding and application of the ISD process to the
necessary courseware development encompassed by this solicitation. RFP
§ L.5.7.2.1.
Twenty-nine offerors, including ASC (a large business and an incumbent
contractor), submitted proposals for Lot II. The source selection
evaluation board (SSEB) assigned each proposal a qualitative rating and a
risk rating for each technical subfactor and a risk rating for past
performance.[2]
ASC's initial proposal was rated marginal with high risk under the ISD
subfactor and marginal with medium risk under the management subfactor,
with low past performance risk. Among the major weaknesses found in ASC's
proposal under the ISD subfactor were that it provided insufficient
written processes for ISD, and that its sample task technical approach for
courseware development was presented only at a high level, with inadequate
details on the proposed plan for meeting the sample task requirements.
Based on these evaluation results, a competitive range of the nine most
highly rated proposals was established, including those of ASC and
Advanced Engineering & Research Associates (AERA), a small business whose
initial proposal had received identical ratings to ASC's but with a lower
price.
The agency conducted detailed discussions with each offeror in the
competitive range by issuing written evaluation notices (ENs), supplemented
by oral communications. ASC received numerous ENs encompassing the
weaknesses found in its proposal, including some indicating that the
agency was concerned about the dearth of details concerning ASC's ISD
processes with regard to formative evaluation and one stating "[t]he
sample task technical approach for courseware development was presented at
a high level that furnished inadequate detail on the proposed plan for
meeting the Sample Task requirements." Agency Report, Tab 64, EN No.
ASC-ISD-11-PC.
Following the receipt of proposal revisions, the SSEB again rated the
proposals in the competitive range. ASC's proposal was still rated
marginal with high risk under the ISD subfactor but had improved its
management subfactor rating to satisfactory with low risk. AERA's
proposal's rating improved to marginal with medium risk under the ISD
subfactor and to satisfactory with low risk under the management
subfactor.[3]
Based on the evaluation results, the source selection advisory committee
(SSAC) recommended award to the six highest-rated offerors, all of which
had resolved all deficiencies and major weaknesses, and had received at
least satisfactory with low risk technical ratings and low risk past
performance ratings.[4] Of the lower-rated proposals (including AERA, ASC
and others), the SSAC recommended award only to AERA because of what the
SSAC considered a "clear distinction" between AERA's proposal and the
others.
As between AERA's and ASC's proposals, the SSAC found that AERA had only
one remaining "significant moderate weakness" concerning the level of
detail in its ISD processes; the SSAC found that the discussions with AERA
had revealed that AERA's ISD processes were fundamentally sound, even
though they lacked detail at the lowest level of its processes.
In contrast, the SSAC found that ASC's proposal still had two major
weaknesses and a minor weakness. One major weakness involved ASC's
written ISD processes, where the agency found ASC's discussion of
formative evaluation in the evaluation phase of the ISD did not address
the process evaluation portion and much of the product evaluation portion
of formative evaluation. The other major weakness involved ASC's lack of
details concerning the analysis and design phases of courseware
development in its response to the sample task. The minor weakness noted
was that ASC's ISD processes were scattered across multiple documents,
making it difficult to discern the process flow.
Thus, the SSAC concluded that because ASC's proposal had one more major
weakness than AERA's, which was viewed as a discriminator between the
proposals, and because AERA had submitted a competitive price proposal
(lower-priced than ASC's), had a low risk past performance rating,
and is a small business (which would result in a stronger small business
pool for set-asides made under the ID/IQ contract), AERA should also
receive an award. Agency Report, Tab 123, Addend. to SSEB Report, at 5, 7;
Tab 125, SSAC Award Recommendation.
ASC's proposal was eliminated from the competition on August 8 and award
was made to the seven firms on August 15 without further discussions.
After a debriefing, this protest followed.
ASC challenges each aspect of the agency's evaluation. In reviewing a
protest of an agency's evaluation of proposals, our Office will not
reevaluate proposals but instead will examine the record to determine
whether the agency's judgment was reasonable and consistent with the
stated evaluation criteria and applicable statutes and regulations. A
protester's mere disagreement with the agency's judgment in its
determination of the relative merits of competing proposals does not
establish that the evaluation was unreasonable. See SDS Int'l, Inc.,
B-291183.4, B-291183.5, Apr. 28, 2003, 2003 CPD ¶ 127 at 5-6. Based on
our review of the record, we find the agency's evaluation and source
selection were reasonable.
As noted above, the Navy found that ASC's written processes were
incomplete because the proposal lacked sufficient details of the actual
step-by-step processes that ASC would employ during formative evaluation,
particularly with regard to process evaluation. See Tr. at 33-34, 36-38.
In many cases, the agency found that ASC's proposal identified
responsibilities, tasks and procedures, instead of processes. Tr. at
35-37, 60-61, 91-92. For example, in the area of process evaluation, ASC
did not provide a detailed description of the formative evaluation process
that it would utilize to ensure quality during the analysis, design, and
development activities. See Tr. at 29‑30. The Navy officials
explained that the lack of detail pertaining to formative evaluation in
the areas of process and product evaluations caused them to question the
adequacy of ASC's written processes for ensuring repeatable success in
performance. See Tr. at 13-14, 29, 38.
ASC asserts that its proposal does include formative evaluation elements
to ensure instructionally sound courseware throughout the various phases
of the ISD process at a sufficient level of detail. To support this
contention, ASC has offered several sworn statements and hearing testimony
from a consultant with expertise in the ISD field. Although this
individual testified at the hearing conducted by our Office that in his
opinion the written processes were adequately described for formative
evaluation (albeit scattered throughout several documents in ASC's
proposal), he conceded that the proposal did not include the level of
step-by-step detail desired by the agency, particularly with respect to
the process for ensuring quality as part of the process evaluation
subphase. This individual argued that in his experience it was not
unusual to have less detailed written processes for formative evaluation
involving process evaluation and the initial ISD phases when, as was the
case here, no specific courseware task had been identified. See
Tr. at 50-53.
The Navy officials explained at the hearing, however, that the level of
detail reflected in an offeror's written processes, particularly for
formative evaluation, provides the agency an opportunity to assess the
offeror's ability to succeed on numerous projects involving evolving
technologies over the 8-year term of the contract, dozens of orders, a
wide range of courseware projects, customers, and training curricula. See
Tr. at 11-12, 62‑63, 75. The Navy officials explained that, in
their view, the details and quality of a contractor's written processes
ensure repeatable success on a long-term basis through an established
structured system, and that a contractor that has not provided sufficient
formative evaluation detail would cause the agency to have significant
concerns about that contractor's performance. See Tr. at 13-14. The Navy
officials further testified that the more highly rated proposals included
the level of detail found lacking in ASC's written processes and that the
quality of the detail found in the written processes was the basis upon
which the agency conducted the evaluation. Tr. at 63-65.
Although it is apparent that formative evaluation was addressed to some
extent in ASC's proposal, such as in the area of validation, we find that
the agency reasonably determined that ASC's written processes lacked
step-by-step detail, particularly concerning the process evaluation
subphase. As noted above, the RFP required proposals to describe the
offeror's formal, written, documented and in-place processes that will be
used in the performance of orders under the ID/IQ contract. RFP §
L.3.2. Consequently, the agency reasonably determined that ASC's failure
to adequately address formative evaluation in its proposal constituted a
significant weakness. While we recognize that the Navy and ASC's
consultant disagree about the level of detail an offeror must provide to
demonstrate effective written processes, such a disagreement is not a
basis to overturn the agency's evaluation decision.
See SDS Int'l, Inc., supra.
The second reason that the Navy found warranted assigning a marginal/high
risk rating for the ISD subfactor involved ASC's response to the sample
task, including its oral presentation, which the Navy found addressed in
adequate detail only the development phase of courseware development, but
not the analysis and design phases.
ASC concedes that its proposal did not provide the same level of detail
for the analysis and design phases as for the development phase, see Tr.
at 101-03, 112, but contends that the Navy led it to believe that it only
needed to address the development phase in its response to the discussions
concerning its sample task response. ASC states that it made this
assumption based on its understanding of the EN it received from the Navy
on this matter, which only referenced "courseware development," and
because during oral discussions on this point the Navy had cited to a
development tool that had been discussed only in the development phase of
ASC's plan for implementing the sample task.
According to the parties, the term "courseware development" can, depending
on the context, refer to either all phases of ISD, including analysis,
design, development, implementation and evaluation, or, more narrowly,
only to the actual development phase. Agency Report at 23; Tr. at 95,
106-07; Affidavit of Protester's Consultant (Oct. 5, 2003) at 7. In this
regard, the Navy argues that ASC interpreted the discussion question
unreasonably because, according to the Navy, the common interpretation for
the term "courseware development" covers all five ISD phases,
and the word "phase" would ordinarily be added when the reference is to
the development phase as a subset of ISD courseware development. The
agency advises that this interpretation is consistent with the way the
term is used in the RFP, and consistent with how ASC used the terminology
in its own proposal.
Although discussions must address at least deficiencies and significant
weaknesses identified in proposals, the precise content of discussions is
largely a matter of the contracting officer's judgment. We review the
adequacy of discussions to ensure that agencies point out weaknesses that,
unless corrected, would prevent an offeror from having a reasonable chance
for award. Northrop Grumman Info. Tech., Inc., B-290080 et al.,
June 10, 2002, 2002 CPD ¶ 136 at 6. In conducting discussions, an agency
may not prejudicially mislead offerors. Burns and Roe Servs. Corp.,
B-251969.4, 94-1 CPD ¶ 160 at 4.
Based on our review, we conclude that ASC received meaningful
discussions. As indicated by the Navy, the RFP specifically required
offerors to discuss all of the processes related to analysis and design in
responding to the sample task and stated that the sample task would be
evaluated to ensure incorporation of these processes. See RFP §§
L.5.7.2, M.3.2(a)(1). Furthermore, notwithstanding what may have occurred
during oral communications,[5] the written EN specifically requested that
ASC address courseware development, with no mention that this subject was
limited to the development phase. Given the potential dual meaning of
"courseware development" and the context of the EN, as well as the RFP's
emphasis on the offeror's ability to demonstrate all of its established,
proven processes, we think a reasonably diligent offeror would have
understood the agency's discussions to apply to the entire courseware
development process needed to meet the sample task requirements, or at
least would have first sought to clarify the matter.
Indeed, ASC's representative was cognizant of the possible dual meaning of
"courseware development," but admits that it was his own assumption that
caused him to believe that a response was required only for the
development phase of the firm's courseware plan. Tr. at 99-101. On this
record, we find no basis to question the propriety of the agency's
discussions.
ASC nevertheless argues that, given its adequate response regarding the
development phase, the Navy reasonably could have extrapolated that ASC
could sufficiently address the other phases, and that it was evident from
its response that ASC had misinterpreted the EN. ASC thus contends that
its response should have led to further discussions rather than its
elimination from the competition.
An agency is not obligated to reopen negotiations to give an offeror the
opportunity to remedy a defect that first appears in a revised proposal.
See Burns and Roe Servs. Corp., supra. Further, the agency's witness
testified that she did not consider it logical to assume that an adequate
response to the development phase meant that an offeror was capable of
developing adequate processes for the other two phases, because she
considered the analysis phase to be a particularly difficult aspect of ISD
courseware, which has tended to be done poorly by some contractors.
Tr. at 117. On the record before us, we find that the Navy had a
reasonable basis for attributing a weakness to ASC's proposal as it
related to the sample task and the analysis and design phases of its
courseware plan.
ASC questions whether the evaluation was fair, given that both AERA's and
ASC's proposals received marginal ratings for the ISD subfactor because
their written ISD processes lacked detail. However, the record reflects
that the agency reasonably considered the additional major weakness
associated with ASC's sample task to be a significant discriminator for
purposes of making an award selection;[6] indeed, it was because of this
additional significant weakness that ASC's proposal was considered
inferior to AERA's under the ISD subfactor, as evidenced by ASC's high
risk rating (as compared to AERA's medium risk rating) under this
subfactor.[7] While ASC argues that its low risk past performance should
have somehow counterbalanced any risk found under the ISD subfactor,
AERA's past performance was also rated low risk and the evaluation scheme
provided for separate evaluations of these two evaluation factors.
Finally, ASC argues that the Navy's evaluation was unreasonable because it
should have received an overall satisfactory rating. In this regard, ASC
contends that the management and ISD subfactors (for which ASC's proposal
received satisfactory and marginal ratings, respectively) should have been
averaged to arrive at an overall technical score, the action it asserts
was indicated by the equal weight attached to these two subfactors.
Regardless of the logic of this argument, which assumes that the agency
should round up (to satisfactory) and not down (to marginal), the record
indicates that the Navy source selection officials individually considered
the ratings of each proposal under the various subfactors, and also looked
behind the ratings to determine their basis, and reasonably determined
that ASC's overall technical rating should be marginal with high risk, and
that AERA's proposal was technically superior to ASC's in a significant
way.[8] Thus, we find no basis to conclude that the Navy acted improperly
in eliminating ASC*s proposal from the competition.
The protest is denied.
Anthony H. Gamboa
General Counsel
------------------------
[1] ISD is a systematic process for designing effective educational
programs, such as courseware development tasks. The ISD process generally
consists of five phases: analysis, design, development, implementation
and evaluation. The evaluation phase generally consists of two parts:
formative evaluation and summative evaluation. Within formative
evaluation, there are two subphases: process evaluation and product
evaluation. Process evaluation ensures quality in the analysis, design,
and development activities and checks each activity against certain
metrics/standards to ensure quality while continuously seeking
improvements to each activity. Product evaluation, which includes
validation and quality control, also focuses on quality and measures the
products produced within each of the analysis, design, and development
phases against certain metrics/standards drawn from the contractual
requirements. Agency Report at 18; Hearing Transcript (Tr.) at 25-28.
[2] The potential qualitative ratings under the technical subfactors were
outstanding, highly satisfactory, satisfactory, marginal, and
unsatisfactory. The potential risk ratings under the technical subfactors
were low, medium and high. The potential risk ratings for the past
performance factor were very low, low, moderate, high, very high and
unknown.
[3] ASC's and AERA's low risk past performance ratings were unchanged.
[4] These offerors' awards have not been challenged.
[5] The record is unclear as to what precisely occurred during oral
discussions between the Navy and ASC. The agency's contemporaneous record
of the communications contains only the statement, "ASC stated they
understood [the EN] and would address." An ASC official testified that he
believed that the intent of the EN could only have been for him to further
elaborate on the development phase, given that the Navy officials during
oral communications focused on a tool which had been proposed only under
the development phase of its plan. Tr. at 98-101. The agency
witnesses testified that while there were no contemporaneous notes, they
were sure that they used ASC's proposed tool only as an example to
illustrate the lack of detail in the overall courseware development plan
in ASC's sample task response. Tr. at 103-05.
[6] While the protester, through statements and testimony of its
consultant, contends that AERA's response to the sample task was no better
than ASC's, our review does not confirm the validity of this contention.
For example, not only did the protester's consultant fail to fully explain
the reasons for this conclusion at the hearing, but his review confirmed
that AERA's oral presentation addressed the analysis and design phases,
Protester's Consultant's Affidavit (Oct. 5, 2003), at 9, subjects for
which the protester admitted it provided less detail than in other aspects
of its response to the sample task. Tr. at 101-03, 112.
[7] At the hearing, the protester's consultant stated that he had not
reviewed AERA's written ISD processes, Tr. at 147, even though that
portion of AERA's proposal had previously been made available to the
protester's counsel and consultant under the protective order. In the
final post-hearing comments, the consultant, for the first time, provided
a critique of AERA's written processes and concluded that they were
inferior to ASC's written processes. Given the timing of this submission,
we accord it no weight.
[8] Adjectival ratings are no more than guidelines for intelligent
decision making to assist source selection officials in evaluating
proposals; they do not mandate automatic selection of a particular
proposal. See SDS Int'l, Inc., supra, at 9.