TITLE: Bechtel Hanford, Inc., B-292288; B-292288.2; B-292288.3, August 13, 2003
BNUMBER: B-292288; B-292288.2; B-292288.3
DATE: August 13, 2003
DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective
Order. This is an interim redacted version that has been approved for
public release; GAO intends to issue a final redacted version at a later
date.
Decision
Matter of: Bechtel Hanford, Inc.
File: B-292288; B-292288.2; B-292288.3
Date: August 13, 2003
Marcia G. Madsen, Esq., David F. Dowd, Esq., and Michael J. Farley, Esq.,
Mayer, Brown, Rowe & Maw, and Robert Humphries, Esq., Bechtel Hanford,
Inc., for the protester.
Kenneth B. Weckstein, Esq., Shlomo D. Katz, Esq., and Tammy Hopkins, Esq.,
Epstein Becker & Green, for Washington Closure Company, LLC, an
intervenor.
Gena E. Cadieux, Esq., Joseph B. Schroeder, Esq., Patricia D. Grahm, Esq.,
and Jonathan M. Dreger, Esq., Department of Energy, for the agency.
Scott H. Riback, Esq., and John M. Melody, Esq., Office of the General
Counsel, GAO, participated in the preparation of the decision.
DIGEST
Protest that agency improperly made award in a cost-plus-incentive-fee
acquisition to a firm that submitted a proposal whose cost was found to be
[deleted] unrealistic by the agency is sustained where solicitation called
for evaluation of realism of cost proposals, agency emphasized need for
realism during written and oral discussions, agency never indicated to
offerors during the competition that it would accord little weight to
realism in its source selection, protester relied on agency's direction in
submitting a proposal that was found very realistic, and agency failed to
adequately document in its selection decision why it discounted the
importance of realism in its source selection.
DECISION
Bechtel Hanford, Inc. (BHI) protests the award of a contract to Washington
Closure Company, LLC (WCC) under request for proposals (RFP) No.
DE-RP06-02RL14300, issued by the Department of Energy (DOE) to acquire
environmental remediation services. BHI maintains that the agency
misevaluated proposals and made an improper source selection decision.
We sustain the protest.
BACKGROUND
DOE occupies a 586 square mile area in southeastern Washington State known
as the Hanford Site, through which flows the Columbia River. The 210
square mile area adjacent to the river is known as the River Corridor.
The agency acquired the Hanford Site in 1943, and for almost 50 years it
was dedicated to the production of plutonium used to construct nuclear
weapons. The agency describes the environmental legacy resulting from
these activities as "multifaceted and immense." The object of the current
RFP is to acquire environmental remediation services leading, ultimately,
to restoration of the River Corridor area.
The RFP contemplated the award, on a "best value" basis, of a
cost-plus-incentive-fee (CPIF) contract, considering evaluated cost and
several non-cost evaluation criteria, with the non-cost criteria
collectively being significantly more important than evaluated cost. RFP
at M-2.[1] Offerors were required to submit both technical and cost
proposals. Cost proposals were to include estimates based on two
differing assumptions relating to the level of available funding: the
"base case" level of funding--which assumed funding of approximately $150
million per year--and the "40 percent increment" level of
funding--approximately $210 million per year (40 percent higher than the
base case level). RFP at L-21. These two estimates would be averaged
together for purposes of evaluation and award. RFP at M 5-6.
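By way of illustration only, the averaging step works as in the following sketch; the dollar figures and variable names are hypothetical and are not drawn from the ICE or from any proposal.

# Hypothetical figures; the RFP required each offeror to price the work under
# both funding assumptions, and the two results were averaged for evaluation.
base_case_cost = 1_450_000_000       # assumes roughly $150 million per year in funding
increment_case_cost = 1_400_000_000  # assumes roughly $210 million per year (40 percent higher)

evaluated_cost_basis = (base_case_cost + increment_case_cost) / 2
print(f"Averaged cost used for evaluation: ${evaluated_cost_basis:,.0f}")  # $1,425,000,000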
The solicitation included a document designated as attachment 11, which
was a comprehensive list of all tasks to be performed under the contract.
This document included a government estimate, expressed in unescalated
fiscal year 2001 dollars, of the cost for each element of work. The
agency also separately published a government estimate and an accompanying
report relating to the preparation of the estimate (referred to as the
ICE) that was available to offerors through the Internet.[2] DOE engaged
the services of the U.S. Army Corps of Engineers to prepare the estimate,
which is expressed in two ways--a 50-percent confidence level and an
80-percent confidence level--reflecting the degree of certainty the agency
had in the estimates (with the 80-percent confidence level estimate
reflecting greater certainty). AR, exh. 84. The agency estimated the
project's overall cost as $1,429,462,000 ($1,518,874,000 with escalation)
at the 50-percent confidence level, and as $1,509,512,000 at the
80-percent confidence level, assuming an 8.5 percent fee and a funding
profile midway between the base and increment cases. AR, exh. 28,
at 64 n.7.
The ICE was central to the agency's cost evaluation. According to DOE
testimony at the hearing held at our Office in this matter, offerors were
not required or expected to prepare "top-to-bottom" cost estimates for
each work element identified in RFP attachment 11 because of the
overwhelming scope of such an effort. Hearing Transcript (Tr.) at 42.
Rather, the agency used the ICE as a benchmark against which to measure
the realism of the proposed costs, and anticipated that offerors would use
the ICE as a point of comparison in developing their target cost
estimates. AR at 5. Offerors were to provide a clear and concise
rationale for all proposed cost elements that were either 25 percent or
more above the 80-percent confidence level ICE (an anticipated overrun
vis-à-vis the ICE), or 25 percent or more below the
50-percent confidence level ICE (a proposed savings). RFP at L-22.
Additionally, RFP attachment 11 had numerous asterisked work elements for
which offerors were required to provide a detailed rationale, whether or
not their cost estimates were outside of the parameters noted above. Id.
(The asterisked items were selected as representative of the entire scope
of the requirement. Tr. at 42.) The agency states it required detailed
information for these cost items to ensure that departures from the ICE
within the 25 percent parameters were reasonably supported. Tr. at 49.
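As a rough sketch of the proposal-preparation rule just described (the function and the variable names are ours, and the treatment of the thresholds as exact 25 percent bands is an inference from the text):

def rationale_required(proposed_cost, ice_50, ice_80, asterisked=False):
    """Return True if, as described above, the offeror must supply a detailed
    rationale for this work element's proposed cost; ice_50 and ice_80 are the
    50- and 80-percent confidence level ICE values for the element."""
    if asterisked:
        return True                      # asterisked items always require a rationale
    if proposed_cost >= 1.25 * ice_80:
        return True                      # 25 percent or more above the 80-percent ICE
    if proposed_cost <= 0.75 * ice_50:
        return True                      # 25 percent or more below the 50-percent ICE
    return False

# Using the figures from footnote 3 ($100 at the 50-percent level, $105 at the 80-percent level):
print(rationale_required(50, 100, 105))   # True  -- proposed savings beyond the band
print(rationale_required(150, 100, 105))  # True  -- proposed overrun beyond the band
print(rationale_required(90, 100, 105))   # False -- within the 25 percent bands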
Because of the way the agency required offerors to prepare their cost
proposals, it adopted a somewhat unconventional method for arriving at its
most probable cost (MPC) estimates. Rather than prepare a "top to bottom"
MPC estimate for each offeror, the agency reviewed the proposed costs to
determine whether there were cost elements proposed at either 25 percent
(or greater) savings, or overruns, compared to the two ICEs. Where a cost
was proposed outside the 25 percent parameters, the agency examined the
supporting rationale and data to decide whether the cost was acceptable as
proposed. Where the agency accepted the offeror's rationale, it used the
offeror's proposed cost for the work element in arriving at its MPC
estimate. On the other hand, if the agency was not persuaded by the
rationale provided, it substituted the 50-percent confidence ICE cost
estimate for the work element in question in arriving at a proposal's
MPC.[3] (In some cases, the agency gave "partial credit" where it
concluded that a proposed cost was partially justified.) This process of
adjusting proposed costs to the 50-percent confidence ICE is referred to
in the record as the "bounding rule."
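Footnote 3 below supplies the agency's own numerical example of this adjustment; the sketch that follows simply restates it (the "partial credit" case is omitted because the record excerpts quoted here do not describe how it was computed):

def mpc_for_element(proposed_cost, ice_50, rationale_accepted):
    """Bounding-rule sketch for a single work element: if the agency accepts
    the offeror's rationale, the proposed cost is carried into the MPC;
    otherwise the 50-percent confidence ICE value is substituted."""
    return proposed_cost if rationale_accepted else ice_50

# Footnote 3's example, with a 50-percent confidence ICE of $100 for the element:
print(mpc_for_element(50, 100, rationale_accepted=True))    # 50  -- accepted savings carried into the MPC
print(mpc_for_element(150, 100, rationale_accepted=False))  # 100 -- rejected overrun bounded to the ICE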
As noted, this is a CPIF contract under which offerors proposed a target
cost and a target fee. See Federal Acquisition Regulation (FAR) §
16.405-1. As the proposed target cost was adjusted to arrive at the MPC,
the fee percentage was automatically adjusted based on the amount by which
the MPC varied from the target cost. If the MPC was calculated as higher
than the target cost, the fee percentage would be reduced below the
proposed target fee percentage. (Correspondingly, if the MPC was
calculated as lower than the target cost, the fee percentage would be
increased above the proposed target fee percentage.) This adjustment was
achieved by means of a fee curve, set forth in the RFP, which included a
maximum fee of 15 percent, and a minimum fee of 2.5 percent, with offerors
permitted to propose a target fee of up to 8.5 percent. The RFP specified
that offerors would receive 30 percent of all savings realized below the
target cost until the fee was increased to the 15 percent maximum, and
would bear liability for 20 percent of all costs incurred over the target
cost until the fee was reduced to the 2.5 percent minimum. RFP at B-6.
The RFP also specifically advised that the term "default" included a
situation where the contractor performed at the minimum fee (2.5 percent)
for a period of any
4 consecutive calendar quarters; in effect, the RFP advised that the
agency could exercise its right to terminate the contract for default
where these circumstances existed during performance. RFP at B-15. To
arrive at an evaluated price for the offerors, the agency first calculated
the firm's MPC using the procedures outlined above, and then calculated
and added the predicted fee associated with performance at the MPC. Thus,
if a firm's proposed target cost and MPC estimate were identical, and the
firm proposed a target fee of 5 percent, the agency added a fee of
5 percent of the MPC to arrive at an evaluated price. On the other hand,
if a firm's MPC varied from its proposed target cost, the fee was
calculated using the fee curve, with fee being the proposed target fee
percentage, plus or minus some amount based on the amount by which the MPC
reflected an underrun or overrun vis-à-vis the proposed target cost.
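The RFP's actual fee curve is not reproduced in the excerpts quoted here, so the sketch below assumes a simple linear sharing arrangement built from the figures cited above (30 percent of any underrun added to the fee, 20 percent of any overrun subtracted, bounded by the 15 percent maximum and 2.5 percent minimum, with the limits applied here as percentages of target cost); it is intended only to show how an MPC above the target cost drives the predicted fee, and thus the evaluated price, downward.

def evaluated_price(target_cost, target_fee_pct, mpc):
    """Illustrative only: predicts the fee earned at the MPC under the sharing
    percentages and fee limits described in the text, then adds that fee to the
    MPC to form an evaluated price.  The actual RFP fee curve may differ."""
    target_fee = target_fee_pct * target_cost
    if mpc <= target_cost:
        fee = target_fee + 0.30 * (target_cost - mpc)   # offeror keeps 30 percent of predicted savings
    else:
        fee = target_fee - 0.20 * (mpc - target_cost)   # offeror bears 20 percent of predicted overrun
    fee = max(0.025 * target_cost, min(0.15 * target_cost, fee))  # 2.5 percent floor, 15 percent ceiling (assumed base)
    return mpc + fee

# Hypothetical figures, not drawn from either proposal:
# a fully realistic offer (MPC equal to target cost) keeps its proposed 5 percent fee,
print(f"${evaluated_price(1_500_000_000, 0.05, 1_500_000_000):,.0f}")   # $1,575,000,000
# while a large upward MPC adjustment drives the predicted fee to the 2.5 percent floor.
print(f"${evaluated_price(1_100_000_000, 0.085, 1_500_000_000):,.0f}")  # $1,527,500,000

As the prejudice discussion later in this decision notes, the downward fee adjustment that accompanies a large upward MPC adjustment reduces the fee component of the offeror's evaluated price.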
DOE received three proposals, including BHI's and WCC's. The agency
evaluated the proposals and concluded that discussions were necessary
because, among other things, the cost proposals did not provide enough
information to permit the agency to perform its cost realism evaluation.
Tr. at 50-52. Accordingly, the agency sent all offerors discussion
letters (dated July 17, 2002). In addition to specific questions for each
offeror, the discussion letters included "Clarifying Instructions for the
Final Proposal Revision." The portion of those instructions relating to
the cost proposals included the following four statements:
This effort [preparing the ICE] resulted in a total estimated project cost
in which the SEB [source evaluation board] places high confidence.
* * * * *
The primary objective of the cost realism analysis is to ensure the
proposed target cost is not understated, thereby basing award on
information that turns out to be seriously in error.
* * * * *
To make a sound award for this [CPIF] type of contract requires a solid
basis for the target cost in order to provide DOE confidence the Offeror
can complete the work at or below the proposed target cost.
* * * * *
Offerors should also be mindful of Section B11, which defines four
successive quarters of performance at minimum fee as "default."
Letter from DOE to BHI, July 17, 2002, attach. 1, at 4; Letter from DOE
to WCC, July 17, 2002, attach. 1, at 5. The letters also included another
attachment that described in some detail the agency's methodology for
determining evaluated cost based on the offerors' proposed cost and fee.
July 17 Letter to BHI, attach. 2; July 17 Letter to WCC, attach. 2. This
attachment was relatively specific and included an articulation of the
so-called bounding rule described above.[4]
In addition to these letters, the agency engaged in oral discussions.
While the content of those discussions was not formally recorded, BHI took
extensive notes during the discussions, which the chairman of the SEB
testified were "approximately correct." Tr. at 401. Those notes include
the following two statements by the chairman of the SEB, on which, BHI
states, it placed great significance when it prepared its proposal:
We're going to be very suspicious if you're going to propose something
different than the ICE. You have to sell us on any change.
* * * * *
We want the winner to perform as close to the maximum fee as possible. We
want a performer. If we picked just the low bidder, and got a low
performer . . . we'd have to toss them out and start over. We don't want
to go through that.
BHI Protest, May 5, 2003, attach. 3, at 7.
Subsequent to these discussions, the agency received final proposal
revisions (FPR) from all offerors. After reviewing the FPRs, the agency
engaged in a second round of discussions and solicited second FPRs.
In its final evaluation, the SEB assigned [deleted] to the WCC and BHI
proposals (the third proposal is not relevant here) [deleted] except the
[deleted], under which BHI's proposal was rated [deleted] and WCC's
[deleted]. The evaluators also found a [deleted] number of either
"significant strengths" or "strengths" for the two proposals, albeit in
different areas. Overall, the evaluators determined that BHI's was the
highest-rated proposal under [deleted] of the evaluation areas, WCC's was
highest-rated in [deleted] areas and there were no discriminators in
[deleted] of the evaluation areas. In short, the SEB found the proposals
[deleted]. AR, exh. 28, at 89-91.
BHI's final evaluated price was [deleted], while WCC's was [deleted], for
a difference of approximately [deleted] in favor of WCC.[5]
Significantly, however, there was [deleted] between WCC's proposed target
cost and its MPC estimate (the agency added approximately [deleted] to
WCC's target cost in making its MPC adjustments), that the agency
determined that WCC would earn only the minimum fee if it were to perform
at its MPC. The SEB report states in this regard:
Therefore, it would appear on the basis of the cost realism analysis
performed by the SEB that [deleted].
AR, exh. 28, at 91. The SEB report goes on to note that, since WCC's MPC
is only [deleted]. Id. The report then describes "intangible factors"
that the agency thought would [deleted]. AR, exh. 101, at 14.
The SSO essentially agreed with the SEB in terms of the [deleted] of the
WCC and BHI proposals, but concluded that the WCC proposal was in fact
slightly superior technically, finding that some of the strengths the SEB
identified in the WCC proposal provided discriminators in favor of WCC.
AR, exh. 101, at 6 et seq. The SSO also determined that WCC's advantage
in terms of evaluated price was significant; that, [deleted], the firm
understood the scope of work; and that the possibility that WCC would earn
[deleted]. Id. at 13. The SSO concluded, on the basis of what he
determined was WCC's slight technical advantage and lower evaluated price,
that WCC's proposal offered the best overall value to the government,
thus made award to WCC.
PROTEST
BHI argues that the agency's award decision was fundamentally inconsistent
with the terms of the solicitation and the information presented during
discussions, which it understood as providing that realistic cost
proposals were what the agency sought, and that such proposals would be
evaluated more favorably than unrealistic proposals. BHI asserts that it
relied on the agency's representations during the acquisition process in
preparing its cost proposal. Specifically, BHI notes that, while its own
proposal was realistic as measured by the closeness of its target cost to
its evaluated MPC, WCC's cost proposal was not, given that its MPC was
significantly higher than its target cost. The award decision was
particularly unreasonable, BHI maintains, in light of the agency's own
observation during the evaluation that if WCC [deleted]. BHI concludes
that the agency improperly failed to evaluate proposals in accordance with
the announced cost evaluation scheme.
In response, both the agency and WCC emphasize that this was a best value
acquisition, and that it was reasonable to select WCC for award based on
its [deleted] lower evaluated price. The agency also asserts that, as
noted in the SEB report and source selection decision, [deleted]. The
agency also emphasizes that at least a portion of the MPC adjustments made
to the WCC proposal resulted from the firm's failure to justify its
proposed savings, and that, as a practical matter, the firm may well
achieve the savings it proposed during performance. Finally, the agency
and WCC assert that BHI's proposal reflects nothing more than a business
strategy of maximizing its fee.
ANALYSIS
We agree with BHI that the agency improperly failed to evaluate proposals
in accordance with the established evaluation scheme, and that BHI was
competitively prejudiced by the agency's actions. Specifically, DOE
failed to adequately take into consideration the comparative realism of
the proposals, as indicated by the degree to which their MPCs deviated
from their proposed target costs.
The central feature of a CPIF contract is a financial risk and reward
mechanism to spur cost effective performance on the part of the
contractor; the contractor will be rewarded for reducing costs, and
penalized for cost overruns. See FAR § 16.405-1. The fee adjustment
provision here was designed to serve this purpose; cost savings--that is,
performance at or below the target cost--would result in a higher fee,
while cost overruns would result in a reduced fee. The incentive aspect
of a CPIF contract works only within a range defined by the minimum and
maximum fee, and the FAR provides that the "fee adjustment formula should
provide an incentive that will be effective over the full range of
reasonably foreseeable variations from target cost."
FAR § 16.405-1(b)(3). Most significantly, once an overrun (that is, the
variation between the target cost and the actual cost) is so great that
the fee has dropped to the minimum, the CPIF mechanism no longer functions
to give the contractor an incentive to control costs. An unrealistically
low target cost risks putting the contractor (and the agency) in precisely
this situation. While with other cost-reimbursement vehicles, calculating
a proposal's most probable cost may be all that is needed to address cost
realism, in the CPIF context, a lack of realism in an offeror's target
cost can defeat the purpose of the incentive fee structure and cause
performance risk. See Hayes Int'l Corp., B-162387, 47 Comp. Gen. 336,
passim (1967).
The solicitation and the discussions made clear to the offerors that the
realism of their target costs mattered beyond the calculation of the MPC.
Moreover, the agency's ongoing appreciation of this point is reflected in
the testimony of the chairman of the SEB:
Q. Why is it important to have an accurate target cost?
A. In the best of all possible worlds, we like to have the offeror
performing fully within the incentive regime and somewhere close to the
middle so that he can improve further or if he falls behind, he is still
operating in the incentive regime.
Q. Does the incentive regime help manage contract performance?
A. Yes.
Q. Both the up side and the down side?
A. Yes.
Tr. at 121.
As discussed, the RFP provided for a detailed cost realism analysis
focused on the interplay among the target cost and fee, the ICE, the fee
curve and the adequacy of a firm's justifications for departures from the
ICE. A firm's MPC was determined by the agency's making adjustments to
the firm's proposed target cost where the proposed cost was not
justified. To the extent a firm's target cost was found not to be
realistic, there would be a variation between the target cost and the
agency's calculation of the probable cost of performance by that firm (the
MPC). If a firm persuaded DOE that its target cost was realistic,
[deleted], there would be little or no variation between its target cost
and the agency's calculated MPC. On the other hand, where the variation
was great, it could take the predicted performance (as reflected in the
MPC) out of the effective range of the CPIF incentive mechanism,
[deleted].
A firm's evaluated fee was in turn a function of the realism of its
proposed target cost, with the fee being adjusted downward where the
firm's MPC was found to be above its proposed target cost, as in the case
of WCC. (Correspondingly, where a firm's proposed target cost and MPC
were similar or identical, the proposed fee would be used to calculate
evaluated price, without a downward adjustment in fee, as in the case of
BHI). If the overrun was predicted to be so great as to lead to
performance at the minimum fee, it would remove the incentive aspect of
the CPIF mechanism--which is why, presumably, the agency warned offerors
in the RFP and in discussions that performance at the minimum fee in 4
consecutive calendar quarters would be grounds for default.
The agency's focus on a realistic target cost under this scheme was
further emphasized to all offerors in DOE's "Clarifying Instructions for
the Final Proposal Revision," where it informed offerors that the "primary
objective" of the cost realism analysis was "to ensure the proposed target
cost is not understated, thereby basing award on information that turns
out to be seriously in error," and that "a sound award for this type of
contract requires a solid basis for the target cost in order to provide
DOE confidence the offeror can complete the work at or below the proposed
target cost." Letter from DOE to BHI, July 17, 2002, attach. 1, at 4;
Letter from DOE to WCC, July 17, 2002, attach. 1, at 5. In addition, the
record shows that the agency reinforced its preference for realistic
target costs one more time during its oral discussions with BHI. As
noted, the agency advised that it wanted the winning firm to perform "as
close to the maximum fee as possible," that it was going to be "very
suspicious" of proposed costs different from the ICE, that it would have
to "be sold" on any proposed deviations from the ICE, and that if it
picked just the low offeror and got a low performer, it would have to
throw the firm out and start over, which the agency stated it did not want
to do. BHI Protest, May 5, 2003, attach. 3, at 7.
The record shows that BHI and WCC responded [deleted] in their proposals
to this guidance from the agency. While BHI presented a proposal that
was, by the agency's own measure, exceptionally realistic, WCC presented a
proposal that was [deleted], but nonetheless appeared to offer a
substantial savings to the government.[6] BHI testified that its proposal
strategy was to present the "most realistic" target cost, that is, to
minimize the agency's MPC adjustments, because it understood that this was
what the agency wanted, Tr. at 390, and BHI appears to have been
successful in its effort. The record shows that the agency ultimately
found the BHI proposal very realistic and actually adjusted its proposed
target cost [deleted] to arrive at its MPC. AR, exh. 28, at 65.
In stark contrast, the agency made [deleted]. After initial proposal
evaluation, the agency added approximately [deleted] to WCC's proposal,
more than [deleted] its approximately [deleted] target cost. AR, exh.
28, at 65. After the first round of discussions, the agency adjusted the
revised target cost upward by approximately [deleted]. Id. After the
final round of discussions, the agency still added approximately [deleted]
to WCC's final target cost. Id.[7]
Based on the RFP and the information provided to BHI and WCC during
discussions, we conclude that offerors were on notice that, as BHI
asserts, the evaluation scheme in particular (as well as the agency's use
of a CPIF contract more generally) called for the agency to evaluate
realistic proposals--as measured by the amount by which the MPC deviated
from the target cost--more favorably than unrealistic proposals in
determining which proposal represented the best value to the government.
In effect, the agency was obliged in making its source selection to
consider, among other things, which proposal*s target cost was more
realistic. We also conclude that BHI heeded the RFP and the agency's
instructions, and submitted (as the agency found) a very realistic target
cost, [deleted]. In confining its cost realism analysis to the
calculation of MPCs and in otherwise discounting the difference in realism
between the two proposals, the agency failed to adhere to the announced
evaluation scheme.
The announced evaluation scheme was perhaps flexible enough to permit the
agency to conclude, in the final analysis, that WCC's [deleted] lower
evaluated price provided the best value. The evaluated price difference
could not, however, properly be treated like the difference between
offered prices in a fixed-price context. Here, part or all of the
difference could be illusory, simply reflecting different levels of
aggressiveness in the offerors' claims of anticipated savings--discussed
further in our analysis of prejudice, below. Moreover, to the extent that
part of the [deleted] difference reflected the agency's expectation
[deleted], the agency could not reasonably treat that simply as a benefit
to the government. Most importantly, the agency could not reasonably
select WCC without considering the fact that BHI's target cost was
virtually equal to its MPC--and therefore entirely realistic--while WCC's
target cost [deleted]. Notwithstanding this dramatic difference between
the proposals (and the resulting corollary that, [deleted]), there is
nothing in the record showing that the agency adequately factored this
essential comparison into its award decision.[8]
PREJUDICE
Even where we find an impropriety in an evaluation, we will sustain a
protest only where the protester was competitively prejudiced by the
agency's actions; that is, the protester would have had a reasonable
chance of receiving award but for the agency's actions. See
McDonald-Bradley, B-270126, Feb. 8, 1996, 96-1 CPD ¶ 54 at 3;
Statistica, Inc. v. Christopher, 102 F.3d 1577, 1581 (Fed. Cir. 1996).
We conclude that BHI was prejudiced by the agency's actions here;
specifically, we find that, had BHI been aware that its proposal would not
be evaluated more favorably based on its degree of realism, it potentially
could have changed its proposal so as to eliminate WCC's cost advantage.
As noted above, because WCC's proposed target cost [deleted], it obtained
a substantial evaluated price advantage through the agency's calculation
of its fee. As explained earlier, after arriving at the total MPC, the
agency calculated WCC's fee based on the fee curve to arrive at the final
evaluated price. As a result, since there was a [deleted] to WCC's target
cost to arrive at its MPC, the offeror's target fee percentage was
correspondingly reduced; this reduced fee resulting from [deleted]
benefited WCC by reducing its overall evaluated price. While WCC proposed
a target fee of [deleted] percent ([deleted] BHI's proposed target fee of
[deleted] percent)--which would be approximately [deleted] if it had
proposed a [deleted] ([deleted] percent multiplied by its MPC)[9]--the
evaluated fee added to the firm's MPC was only [deleted].[10] This
amounts to an approximately [deleted] reduction to WCC's evaluated
price.[11] Had BHI been aware that its proposal would not be penalized in
the evaluation process for offering an [deleted], it could have benefited
similarly by proposing a substantially lower target cost.[12]
The record also shows that the agency's insistence on realism had a
significant effect on how BHI prepared its proposal. In particular, the
record shows that there were several instances where BHI identified
potential savings below the ICE but, rather than claim the entire amount
of those savings, incorporated only a portion in its target cost in an
effort to demonstrate to the agency the realism of its proposal. We
discuss three specific areas where the firm proposed lower savings than it
thought it could achieve.
First, BHI proposed that, [deleted]; this would result in a potential
savings of [deleted] below the amount assumed in the ICE. BHI highlighted
in its proposal the fact that it thought there was as much as [deleted] in
savings, but claimed a savings of only [deleted] in its cost proposal,
thereby resolving not to include some [deleted] of proposed savings in its
target cost. Tr. at 420-26 (including cited proposal pages). The
testimony of BHI's principal vice president relating to this cost element
is typical of BHI's explanation relating to all these potential savings:
[deleted].
Tr. at 424-25.
Second, BHI proposed to [deleted]. Tr. at 415. The record shows that BHI
determined that it could reduce its cost by [deleted], Tr. at 413-17
(including cited proposal pages), but that, in an effort to demonstrate
the realism of its proposed cost, and to convey that [deleted], it
proposed only [deleted] in savings for this aspect of the requirement.
Consequently, because of the agency's insistence on a realistic target
cost, BHI resolved not to claim some [deleted] in potential savings.
[deleted].
In sum, the record shows that, had BHI known how the agency would evaluate
the cost proposals, it could have approached its target cost differently,
and in a manner that could have changed the outcome of the cost evaluation
and, ultimately, the competition.[13]
RECOMMENDATION
In view of the foregoing, we sustain BHI*s protest. In deciding on the
appropriate recommendation, we are concerned that the record reflects a
lack of confidence on the part of the agency in its evaluation results.
The record contains testimony from both the SEB chairman and the SSO that
suggests that neither one considered the agency's MPC estimates to be
particularly probative of the likely cost of performance, and that the MPC
estimates reflected little more than the mathematical result
of applying the agency's cost realism evaluation process. See, e.g., Tr.
at 125-26, 141-43, 253-54. At the same time, the SEB chairman testified
that the statistical utility of the MPC was essentially identical to the
statistical utility of the 50-percent confidence ICE, Tr. at 206-08, a
number in which the agency had "high confidence." This apparent anomaly
suggests either that the agency did not have the "high confidence" it
professed in the ICE, or that there may otherwise be a lack of confidence
because of the agency's chosen method for evaluating realism. (For
example, the hearing testimony from the agency's witnesses, as well as the
SEB report and source selection decision, suggest that this lack of
confidence in the evaluation results was due, at least in part, to the
fact that the agency did not think that a failure on the part of an
offeror to justify a savings in its proposal actually indicated that the
offeror would not realize those savings during performance.)
Accordingly, we recommend that the agency ensure that the method of
evaluating proposals meets its requirements in terms of providing
reasonably reliable evaluation results in which the agency has adequate
confidence. This necessarily will entail consideration of the manner in
which cost realism is to be evaluated, in particular, how the realism of
proposed target costs will be factored into the cost evaluation and
eventual source selection decision. The agency then should revise the RFP
to reflect these considerations and provide all competitive range offerors
an opportunity to submit revised proposals. If, following the evaluation
of revised proposals, the agency determines that an offeror other than WCC
is in line for award, it should terminate WCC*s contract for the
convenience of the government, and make award to the firm found to be in
line for award. Finally, we recommend that BHI be reimbursed the costs
associated with filing and pursuing its protest, including reasonable
attorneys' fees. 4 C.F.R. § 21.8(d)(1). BHI's certified claim for
costs, detailing the time spent and the costs incurred, must be submitted
to the agency within 60 days of receiving our decision. 4 C.F.R. §
21.8(f)(1).
The protest is sustained.
Anthony H. Gamboa
General Counsel
------------------------
[1] The technical evaluation criteria (and subcriteria), along with their
relative weights, were: project management, total weight 47 percent (key
personnel, 16 percent; business plan, 10 percent; corporate involvement,
7.5 percent; organization controls and systems, 7.5 percent; and small
business, 6 percent); technical approach, 32 percent (quality of plan for
execution of work scope, 16 percent; and environment, safety and health,
16 percent); past performance and experience, 15 percent (past
performance, 8 percent; and experience, 7 percent); and contractor
enhancements, 6 percent. In evaluating proposals against the non-cost
criteria, the agency assigned adjectival ratings of either outstanding,
very good, adequate, inadequate or "unacceptable and unsatisfactory." See
Source Selection Plan, Agency Report (AR), exh. 25, at 11. In addition to
the adjectival ratings, the agency included narrative material outlining
significant strengths, strengths, weaknesses and significant weaknesses.
Id. at 10-12.
[2] The agency created an Internet website to make a large amount of
information available to the offerors, including the solicitation, the
cost estimate, and various other materials. The agency's electronic
documentation can be found at:
http://www.hanford.gov/procure/solicit/rcc/.
[3] Thus, for example, if the 50-percent confidence level ICE for a given
work element was $100 (or $105 at the 80-percent confidence level ICE),
and an offeror proposed to perform for either $50 or $150, the agency
evaluated the rationale presented. If the rationale were accepted, then
for purposes of calculating MPC the agency would use the amount proposed
(i.e., either $50 or $150). If the agency did not accept the rationale,
it would use $100 in calculating MPC.
[4] In its protest, BHI challenges DOE's use of the bounding rule,
maintaining that the agency improperly failed to account for a portion of
WCC's evaluated costs. Notwithstanding the way in which BHI has couched
its protest, we view this protest ground as a challenge to the bounding
rule itself and, since the agency set out the rule during the procurement
and BHI did not challenge it until well after award, we view this issue as
untimely. 4 C.F.R. § 21.2(a)(1) (2003).
[5] During the course of the protest, the agency conceded that there were
errors in its MPC calculations for WCC that should have resulted in the
firm's evaluated price being adjusted upward by approximately $10
million. AR at 71, 79. Accordingly, the evaluated price advantage
enjoyed by WCC should be reduced by this amount.
[6] All dollar values expressed in this section of our decision are in
unescalated, fiscal year 2001 dollars averaged between the base and 40
percent increment cases and exclusive of what are referred to as "added
items," that is, additional activities identified by the offerors for
performance that were not included in the ICE. The comparison is
presented this way in the final SEB report because the ICE was prepared on
the basis of unescalated fiscal year 2001 dollars and also did not include
the so-called added items. The final SEB report states that the
comparison was performed in this fashion in order to make it more
meaningful. AR, exh. 28, at 64-66. One consequence of this comparison is
that it tends to understate the magnitude of the final escalated numbers.
For example, while under this method, the WCC second FPR was adjusted
upward for cost realism purposes by approximately [deleted], id. at 65,
when accounting for the other variables, its second FPR was adjusted
upward by approximately [deleted]. Id. at 73; AR, exh. 101, at 13.
[7] To highlight the difference in terms of realism in a CPIF contract,
the agency noted that the fee curve would be effective over a range of
[deleted] percent of the 50‑percent confidence ICE for BHI, whereas
the fee curve would be effective for WCC over a range of only between
[deleted] percent. AR, exh. 28, at 64. Thus, WCC will have to perform
substantially below both the agency's 50-percent confidence ICE (a
number in which the agency places "high confidence"), and its MPC in order
for the fee mechanism to function over its full effective range. In terms
of the likelihood of this occurring, the chairman of the SEB testified
that he estimated the probability of WCC performing at its very low target
cost at only [deleted] percent. Tr. at 342.
[8] It is implicit in our decision that we do not agree with the agency
and intervenor that BHI's target cost merely reflects its business
judgment. Rather, BHI's business judgment and resulting proposal strategy
reflected the importance that the RFP and the discussions indicated the
agency would give to the realism of target costs in the CPIF mechanism in
use in this procurement.
[9] We arrive at this number as follows: (base MPC [deleted] multiplied
by [deleted]) + (40 percent increment MPC [deleted] multiplied by
[deleted]) = [deleted]. This number, divided by 2, is [deleted].
[10] The agency estimated the firm*s average fee assuming performance at
the average MPC (the average MPC between the base and 40 percent increment
case for WCC was [deleted]). AR, exh. 28, at 88.
[11] We arrive at this number as follows: [deleted] (fee calculated at
[deleted] percent), minus [deleted] (calculated fee) = [deleted].
[12] We recognize, as the agency argues, that WCC could be viewed as
taking a risk by proposing a target cost [deleted], and that BHI could be
viewed as conservative in proposing a target cost so high as to lead to
potentially high fees for achieving an underrun. See, e.g., Agency's
Post-Hearing Comments at 4. If considered in the context of a reasoned,
reasonable analysis that addresses the impact on the CPIF mechanism of
performance at such an overrun or underrun, the agency*s point might
deserve consideration. Instead, though, and notwithstanding some
discussion in the source selection document of the risk associated with
WCC's low target cost, the agency appears to have largely treated
[deleted] as reflecting a savings to the government.
[13] BHI*s initial protests raised a large number of issues relating to
the propriety of the agency*s technical and cost evaluations, and
challenged as well the agency*s ultimate source selection decision. After
receiving the agency*s report and participating in a hearing, BHI narrowed
somewhat the scope of its protest, but the firm continues to maintain a
variety of challenges to the agency*s technical and cost evaluations, with
an emphasis on cost evaluation issues. We need not consider these
detailed allegations because our conclusions and recommendation render
these assertions academic.