TITLE: B-310436; B-310436.2, Recon Optical, Inc., December 27, 2007
BNUMBER: B-310436; B-310436.2
DATE: December 27, 2007
************************************************************
B-310436; B-310436.2, Recon Optical, Inc., December 27, 2007

   DOCUMENT FOR PUBLIC RELEASE
   The decision issued on the date below was subject to a GAO Protective
   Order. This redacted version has been approved for public release.

   Decision

   Matter of: Recon Optical, Inc.

   File: B-310436; B-310436.2

   Date: December 27, 2007

   David J. Taylor, Esq., and William J. Spriggs, Esq., Spriggs &
   Hollingsworth, and Bradford E. Biegon, Esq., and Katherine A. Allen, Esq.,
   for the protester.

   Richard H. Streeter, Esq., and Scott E. Pickens, Esq., Barnes & Thornburg
   LLP, for Kongsberg Defence & Aerospace AS, an intervenor.

   Daniel Pantzer, Esq., and Frank DiNicola, Esq., Department of the Army,
   for the agency.

   Sharon L. Larkin, Esq., and James A. Spangenberg, Esq., Office of the
   General Counsel, GAO, participated in the preparation of the decision.

   DIGEST

   Agency's decision not to select protester's proposal for award is
   unobjectionable, where the proposal was reasonably assessed the lowest
   possible ratings ("red" or "high risk") under the three most important
   evaluation factors and was higher in price than the other proposals, which
   were technically superior.

   DECISION

   Recon Optical, Inc. (ROI) protests the award of a contract to Kongsberg
   Defence & Aerospace AS by the Department of the Army under request for
   proposals (RFP) No. W15QKN-06-R-1409, for common remotely operated weapon
   stations (CROWS). ROI challenges the evaluation of its and Kongsberg's
   proposals under each of the evaluation factors.

   We deny the protest.

   BACKGROUND

   The CROWS is a multi-vehicle weapon mounting and control system that
   attaches to the top of an armored vehicle and allows the gunner to remain
   inside the vehicle while firing the weapon. This remote weapon station is
   to be capable of mounting various small to medium caliber machine guns
   (the MK19 grenade machine gun, M2 HB machine gun, M240B machine gun, or
   M249 squad automatic weapon) and is to include, at a minimum, the weapon
   mount above the roof with sensors, fire control processor, display and
   controls, and other associated hardware. RFP, Executive Summary, at 7;
   Statement of Work, at 29. The specific requirements and capabilities of
   the proposed CROWS were contained in a detailed statement of work and
   performance specifications that were included with the RFP. Among other
   things, the CROWS was to provide remote day and night sighting, ballistic
   control capability, remote weapon charging, and first burst engagement of
   targets at the maximum effective range of the weapon. Agency Report (AR),
   Tab 3, Award Decision Document, at 1.

   The RFP provided for award of a fixed-price,
   indefinite-delivery/indefinite-quantity (ID/IQ) contract (with
   time-and-material line items for depot operations). The RFP contemplated
   six "ordering periods": the first beginning on the date of contract award
   and ending on December 31, 2007, followed by four 1-year ordering periods,
   and concluding with a period running from January 1 to August 12, 2012.
   RFP sect. B. Production deliveries were to start 6 months after the first
   delivery order award at a rate of 30 CROWS per month, with the production
   rate ramping up to a minimum capacity of 100 CROWS per month by December
   2007. Id. sect. F. The solicitation stated that a minimum quantity of
   1,000 CROWS and a maximum quantity of 6,500 CROWS would be ordered over
   the life of the contract. Id. sect. B.

   The RFP announced that the evaluation would be conducted in two phases,
   each with its own evaluation factors. "Phase I" (not challenged here)
   reviewed physical characteristics to determine eligibility for "Phase II,"
   and Phase II (which is the subject of this protest) evaluated proposals
   and bid sample testing for award. RFP sections M.1, M.7, M.8. The Phase II
   evaluation criteria provided that award would be made on a best value
   basis considering the following factors, listed in descending order of
   importance: technical, schedule, management, logistic support, price,
   government purpose license rights (GPLR), past performance, and small
   disadvantaged business.[1] Id. sect. M.5. The technical, management, and
   logistic support factors each contained a number of equally weighted
   subfactors. Id. sect. M.6. The RFP stated that proposals would be given
   adjectival ratings (major strength, strength, weakness, major weakness)
   for each of the subfactors, and color ratings (blue, green, amber, red)
   for each of the factors, and included the rating definitions for each
   factor and subfactor in the solicitation.[2] Id. sections M.9.1, M.11.

   The RFP required each offeror to provide a "bid sample," which was to be
   identical to the CROWS production system unit offered. The solicitation
   stated that each sample would be tested to "determine if the proposed
   design meets the minimum standards as specified in the solicitation and
   then competitively evaluated against the evaluation plan."[3] Id.,
   Executive Summary, at 7. After each "test event," any "unexpected
   incidents, potential failures or unexplained events"[4] that occurred
   during testing were to be reported to the offerors in a "Test Incident
   Report" (TIR), and each offeror was to be given an opportunity to diagnose
   the failure and make repairs before the next test event. Id. sect. M.8.1.
   Bid sample testing was to be conducted concurrently with proposal
   evaluation.

   As is relevant here, ROI and Kongsberg based their proposals and bid
   samples on existing CROWS systems that each firm had provided under prior
   contracts, with modifications or proposed modifications to meet the RFP
   requirements. ROI's proposed CROWS was based on a system it provided under
   predecessor contracts[5] (which had similar, but not identical,
   requirements to those here), and ROI asserts that it has fielded over 250
   systems in Iraq since it was awarded the initial contract in 2000. ROI's
   Comments at 58. Since February 2007, however, no units have been fielded;
   the agency has issued ROI a show cause letter and termination for default
   cure notice due to performance problems. Kongsberg's Comments, attach. 2,
   Army's Stay Override, at 3. Kongsberg's proposed CROWS was based on its
   "M151 Protector," which has been in full scale production since 2001; more
   than 1,500 systems have been delivered to the Army through the Stryker
   program, and over 1,000 have been fielded in Iraq since 2003. Kongsberg's
   Comments, attach. 3, Declaration of Kongsberg's Vice President, paras.
   2-3.

   Three offerors, including ROI and Kongsberg, were invited to participate
   in Phase II of the competition after submitting proposals and bid samples
   for evaluation.[6] Proposals and bid sample testing were evaluated by a
   source selection evaluation board (SSEB), which consisted of separate
   technical, schedule, management, logistics, price, GPLR, past performance,
   and small disadvantaged business teams.[7] After initial evaluations,
   offerors were provided the technical evaluation reports of their proposals
   and were invited to participate in discussions. Offerors were issued
   written "items for negotiation" (IFN) that identified weaknesses and major
   weaknesses in their proposals under each of the evaluation factors and
   subfactors; some of these addressed failures or concerns identified during
   bid sample testing as reported in TIRs. Each offeror responded to the IFNs
   and discussion issues in its final proposal revision (FPR). The SSEB
   evaluated FPRs against the stated evaluation criteria and reported its
   findings to a source selection advisory committee (SSAC) and the source
   selection authority (SSA).[8] The SSAC performed a comparative analysis of
   proposals and provided its analysis to the SSA. The SSA adopted the
   findings of the SSEB and SSAC and rated proposals as follows:

   +------------------------------------------------------------------------+
   |                                    |    Kongsberg    |       ROI       |
   |------------------------------------+-----------------+-----------------|
   |Technical                           |      Blue       |       Red       |
   |------------------------------------+-----------------+-----------------|
   |  |Vehicle Interfaces               |    Strength     | Major Weakness  |
   |  |---------------------------------+-----------------+-----------------|
   |  |Weapon Installation & Operation  | Major Strength  |    Strength     |
   |  |---------------------------------+-----------------+-----------------|
   |  |Target Acquisition               |    Strength     |    Strength     |
   |  |---------------------------------+-----------------+-----------------|
   |  |Accuracy & Boresight Retention   | Major Strength  |    Strength     |
   |  |---------------------------------+-----------------+-----------------|
   |  |Stabilization                    |    Strength     |    Weakness     |
   |  |---------------------------------+-----------------+-----------------|
   |  |Temperature                      | Major Strength  | Major Weakness  |
   |  |---------------------------------+-----------------+-----------------|
   |  |Safety                           | Major Strength  |    Weakness     |
   |  |---------------------------------+-----------------+-----------------|
   |  |Reliability                      | Major Strength  | Major Weakness  |
   |------------------------------------+-----------------+-----------------|
   |Schedule                            |    Low Risk     |    High Risk    |
   |------------------------------------+-----------------+-----------------|
   |Management                          |      Blue       |       Red       |
   |------------------------------------+-----------------+-----------------|
   |  |Program Management Plan          | Major Strength  |    Weakness     |
   |  |---------------------------------+-----------------+-----------------|
   |  |Subcontractor Management Plan    | Major Strength  | Major Weakness  |
   |  |---------------------------------+-----------------+-----------------|
   |  |Software Management Plan         |    Weakness     | Major Weakness  |
   |  |---------------------------------+-----------------+-----------------|
   |  |Quality Management Plan          | Major Strength  | Major Weakness  |
   |------------------------------------+-----------------+-----------------|
   |Logistic Support                    |      Blue       |      Green      |
   |------------------------------------+-----------------+-----------------|
   |  |Depot Level Operations           | Major Strength  |    Strength     |
   |  |---------------------------------+-----------------+-----------------|
   |  |Fielding & Operational Support   |    Strength     |    Strength     |
   |------------------------------------+-----------------+-----------------|
   |GPLR                                |      Blue       |      Amber      |
   |------------------------------------+-----------------+-----------------|
   |Past Performance                    |    Low Risk     |  Moderate Risk  |
   |------------------------------------+-----------------+-----------------|
   |Small Disadvantaged Business        |      Green      |      Blue       |
   |------------------------------------+-----------------+-----------------|
   |Evaluated Price                     | $513,270,432.40 | $539,446,515.80 |
   +------------------------------------------------------------------------+

   AR, Tab 3, Award Decision Document, at 8.[9] The SSA discussed in detail
   the merits of each offeror's proposal under each of the evaluation
   factors, then determined not to award the contract to ROI because ROI
   received either a "red" or "high risk" rating under the three most
   important factors. According to the SSA,

     This represents a strong lack of confidence in the ability of the Recon
     Optical system to meet the Government requirements. It also raises
     uncertainties regarding their ability to meet schedule, resulting in
     delayed fielding of the Urgent Material Release (UMR) item with
     attendant increase in expenditure of [government] resources. Recon
     Optical was also the highest evaluated price among all the offerors.
     Based on this analysis[,] I have determined not to award to Recon
     Optical.

   Id. at 20. The SSA performed a best value tradeoff between the proposals
   of Kongsberg and the other competing offeror (which received the second
   highest technical rating and was the lowest in price) and selected
   Kongsberg for award.

   This protest followed.

   DISCUSSION

   ROI protests the ratings assigned to its proposal under each of the
   evaluation factors, as well as each of the "weakness" and "major weakness"
   ratings assessed under the subfactors, and contends that its proposal was
   evaluated disparately as compared to Kongsberg's. In reviewing protests of
   an agency's
   evaluation, our Office does not reevaluate proposals, but instead examines
   the record to determine whether the agency's judgment was reasonable and
   in accord with the RFP criteria. Abt Assocs., Inc., B-237060.2, Feb. 26,
   1990, 90-1 CPD para. 223 at 4. An offeror has the burden of submitting an
   adequately written proposal and runs the risk that its proposal will
   be evaluated unfavorably when it fails to do so. Beck's Spray Serv., Inc.,
   B-299599, June 18, 2007, 2007 CPD para. 113 at 3. Mere disagreement with
   the agency's conclusions does not render the agency's judgment
   unreasonable. UNICCO Gov't Servs., Inc., B-277658, Nov. 7, 1997, 97-2 CPD
   para. 134 at 7.

   In response to the protests, the agency has provided a detailed and
   voluminous record illustrating that it performed a comprehensive
   evaluation and assessed ratings in a fair and equitable manner that was
   consistent with the RFP's stated evaluation criteria. The extensive record
   shows that ROI's proposal and the results of bid sample testing raised
   serious doubts that ROI's proposed CROWS would be able to meet the
   requirements of the solicitation in accordance with the required schedule,
   and ROI did not satisfactorily address the agency's concerns during
   discussions. In contrast, Kongsberg provided a technically superior
   proposal at a lower price, and thus its proposal was reasonably selected
   for award. We have considered each of the protester's numerous arguments
   and find them to be without merit, although we discuss here only the three
   most heavily weighted factors.[10] As we explain, ROI's proposal was
   reasonably given the lowest possible ratings ("red" or "high risk") under
   the three most important factors, which provided a reasonable basis for
   the agency not to select the proposal for award.

   Technical Factor

   ROI complains that its proposal should not have received a "red" rating
   under the technical factor, or "weakness" or "major weakness" ratings
   under the technical subfactors.

   For example, under the vehicle interfaces subfactor, ROI's proposal was
   assessed a "major weakness" rating because

     Recon Optical's bid sample did not operate successfully at 20 VDC [volts
     direct current] during bid sample testing and a critical message
     instructing the user to shut down the system was displayed. Recon
     Optical stated in its [FPR] that it has initiated corrective actions to
     address bid sample voltage input test failures as evidenced in [TIRs].
     No root cause analysis[[11]] was provided to support these corrective
     actions . . . Also Recon Optical had ECP [Engineering Change Proposal]
     changes required which raised the issue that weight may be over 400
     [pounds] when included. Weight reduction ECPs were not documented.

   AR, Tab 3, Award Decision Document, at 9. ROI complains that the "major
   weakness" rating was not warranted because its proposed CROWS, in fact,
   weighed less than 400 pounds, and the inability to operate at 20 VDC was
   an intentional "safety feature" of the proposed CROWS that could easily be
   modified with a design change. ROI's Comments at 14, 17.

   The RFP defined a "major weakness" rating for this subfactor as:

     System weight for the external components above the roof is more than
     400 [pounds] not including the weapon and ammunition and the Offeror has
     not provided a plan with sufficient detail and analysis on how the
     weight will be reduced to less than 400 [pounds] without any negative
     impacts to overall performance. OR [t]he bid sample System did not
     operate effectively and/or safely between the ranges of 20-30 Volts DC
     and sufficient evidence has not been provided to demonstrate with high
     confidence that the system will be capable of meeting this requirement
     prior to delivery.

   RFP sect. M.11.1.a. Thus, a major weakness rating would be reasonable if
   the agency identified either a weight issue or a voltage issue as
   described above.

   Here, ROI admits that its system did not operate at 20 VDC, and its FPR
   response to an IFN on this issue explains only that "[c]orrective actions
   have been initiated" through design changes that are being tested. ROI's
   Response to IFN 212 at 1-14. ROI's proposal did not explain that the
   failure was an intentional safety feature, as ROI now alleges in its
   protest, or provide any "root cause analysis" of this problem, and the
   proposal did not contain explanation or evidence to "demonstrate with high
   confidence" that the system would be capable of meeting this requirement
   prior to delivery. Thus, we find that the record supports the agency's
   assessment of a "major weakness" rating under the vehicle interface
   subfactor on this issue alone.

   We find the agency's conclusions with regard to weight under this
   subfactor also to be reasonable. Again, our review of the record confirms
   that although ROI asserted in its FPR that the weight of the proposed
   CROWS would be less than 400 pounds after implementation of all "existing
   and planned design modifications" as a result of numerous ECPs, ROI's
   Response to IFN 174 at 1-425, the offeror did not provide sufficient
   detail regarding the effects of the ECPs on performance, or testing
   analysis to support its assertion that the ECPs would in fact reduce the
   weight. Thus, the agency could reasonably conclude that the proposed CROWS
   "may" be too heavy.

   ROI also protests its "weakness" rating under the stabilization subfactor.
   The definition of a "weakness" rating under this subfactor was:

     Offeror has provided documentation, analyses, and test data on the
     proposed stabilization system design but lacks sufficient detail and
     clarity to substantiate that the subsystems are fully integrated. There
     is some doubt that the offeror will be able to meet the stabilization
     performance requirements prior to production delivery.

   Id. sect. M.11.1.e.

   The record shows that the SSA credited ROI's proposal for offering a fully
   integrated system, but assessed a "weakness" rating because there was
   "some doubt that that offeror will be able to meet the stabilization
   performance requirements prior to production delivery." The SSA noted that
   the agency was unable to "extrapolate . . . the true weapon stabilization
   performance" of ROI's proposed CROWS from the data and information that
   ROI provided. AR, Tab 3, Award Decision Document, at 11. As explained in
   the SSEB report, the agency was unable to evaluate stabilization
   performance because ROI's stabilization tests did not account for weapon
   movement in the cradle, and ROI's stabilization system did not account for
   linear disturbances. The SSEB noted that although ROI "provided
   conclusions from the stabilization test and test procedures[, ROI] did not
   provide data reduction methodology to support the results." These concerns
   led the agency to conclude that there was "some doubt" that ROI's CROWS
   would achieve the required level of stabilization performance prior to
   production delivery. AR, Tab 1, SSEB Final Report, at 78-79.

   ROI does not dispute the agency's findings regarding linear disturbances
   or deny that it failed to provide data reduction methodology, and ROI
   admits that its in-house measurement technique did not take into account
   weapon movement in the cradle. ROI's FPR, vol. 1, 1-570. However, ROI
   asserts that its proposal deserved a higher subfactor rating because it
   provided some test data and explained that weapon movement in the cradle
   did not impact stabilization. Although ROI's FPR does address these issues
   and did provide test data, id. at 1-559 to 1-570, the agency considered
   all of the information provided by ROI and still had doubt as to
   stabilization performance based on the lack of data reduction methodology
   to support ROI's claimed performance and ROI's failure to address linear
   disturbance issues. AR, Tab 1, SSEB Final Report, at 78-79. Although ROI
   disagrees with this assessment, it has not shown it to be unreasonable.

   ROI next challenges the assessment of a "major weakness" rating under the
   temperature subfactor. The definition of a "major weakness" rating under
   this subfactor was:

     Bid sample testing did not demonstrate an ability to reliably and/or
     safely operate the system at -20 degrees [Fahrenheit] and 140 [degrees
     Fahrenheit] or the Offeror has not provided an acceptable plan with
     supporting data that the production systems will be capable of operating
     reliably and safely with high confidence at temperatures as low as -50
     degrees [Fahrenheit] for external components and -20 degrees
     [Fahrenheit] for internal components and up to 140 [degrees Fahrenheit]
     for both internal and external components prior to production delivery.

   Id. sect. M.11.1.f. ROI's proposal was assessed a "major weakness" rating
   under this subfactor because the SSA had "doubt that ROI will be capable
   of providing a production system capable of operating reliably and safely
   down to -50 [degrees Fahrenheit] prior to production delivery." AR, Tab 3,
   Award Decision Document, at 12.

   ROI's initial proposal, ROI's internal testing, and the agency's bid
   sample testing revealed that ROI's production system did not meet the
   required rotation or elevation speeds at -50 degrees Fahrenheit. In
   response to discussion
   questions on this issue, ROI provided a root cause analysis explaining
   that the "problem is caused by the increased viscosity of the grease at
   low temperatures." ROI's Response to IFN 201 at 1-795; see also ROI's
   Response to IFN 212 at 1-15. ROI identified three possible "design
   solutions" to address this problem: (1) replacing the grease with one that
   will function at the required temperatures, (2) increasing power to the
   motors, or (3) adding heaters to the motors to warm the grease. Id. ROI
   stated that it did "not know which is the best approach or combination of
   approaches" to solve the problem; the potential solutions were in early
   stages of testing. ROI's Response to IFN 212 at 1-15. ROI's schedule
   contemplated that system level testing of these solutions would not begin
   until 3 months after award. Id., attach. 212-11. Given these
   uncertainties, we find reasonable the SSA's determination that there was
   "doubt" that ROI's production system would be capable of operating
   reliably and safely at low temperatures prior to delivery.[12] AR, Tab 3,
   Award Decision Document, at 12.

   ROI complains that its proposal was evaluated more severely under this
   subfactor than Kongsberg's; according to ROI, Kongsberg identified a similar
   "grease issue" that caused a failure of its azimuth release mechanism
   at low temperatures. However, the record shows that the two issues were
   not similar. With ROI, the agency reasonably noted that the offeror was
   investigating changing the type of grease as one of three possible
   solutions to correct an identified problem with performance of the bid
   sample, and ROI had not demonstrated that any of these possible solutions
   would in fact work. With Kongsberg, the issue related to simple
   maintenance (poor lubrication) on a Stryker program unit, not the bid
   sample, and was not related to a design defect.[13] Agency Supplemental
   Report at 18; Kongsberg's Response to IFN 84 at 2. As the agency
   convincingly explains, an isolated issue of unperformed maintenance (not
   lubricating the parts) not related to system design is far different from,
   and is of a less serious nature than, the performance failures associated
   with ROI's proposed system design.[14]

   ROI contends that a failure of Kongsberg's visible imaging module (VIM) at
   cold temperatures was also evaluated less harshly than similar failures
   identified in ROI's bid sample under the stabilization subfactor. Again,
   the VIM issue arose in connection with the predecessor Stryker unit
   performance and not with the bid sample. Agency Supplemental Report at
   4-5; see also Kongsberg's Response to IFN 84 at 2 (denying "any failure or
   functional deviation of the VIM in connection with tests performed on the
   [Kongsberg] bid sample."). The SSEB, in consultation with subject matter
   experts, later determined that the problem was caused by the government's
   failure to perform maintenance on the VIM and was not the result of a
   design flaw. Thus, the agency reasonably concluded that the incident did
   "not negatively affect the [system's] capability to operate reliably and
   safely" at low temperatures. AR, Tab 1, SSEB Final Report, at 19; Agency
   Supplemental Report at 4-5. In contrast, ROI's stabilization issues
   (movement in the weapons cradle and linear disturbances) were in fact
   design related. Furthermore, both government and Kongsberg testing
   confirmed satisfactory performance of Kongsberg's CROWS at the required
   temperatures, whereas ROI provided insufficient testing data for the
   agency to conclude that ROI's CROWS could meet the stabilization
   requirements at delivery. AR, Tab 1, SSEB Final Report, at 19-20, 78-79.
   Based on our review, we find that proposals were not evaluated unequally
   under this subfactor.

   Next, ROI challenges the evaluation of its proposal under the safety
   subfactor of the technical factor, where the SSA assessed a "weakness"
   rating. The RFP defined a "weakness" rating under this subfactor as:

     The Offeror has not demonstrated a pro-active approach to addressing
     hazards and safety risks. Limited testing is available and/or the system
     is not mature. The system may require design changes to meet the
     requirement.

   RFP sect. M.11.1.g. Consistent with this definition, the SSA found that

     Recon Optical did not demonstrate a proactive approach to address
     hazards, safety risks and known field issues.[[15]] Recon Optical
     admits, in the FPR, the need to further analyze its system once/if the
     contract is awarded. Some level of hazard analysis has been conducted to
     identify potential safety risks. The mention of unspecified post award
     ECPs creates concern that the system may require further design changes
     the effects of which are unknown. These ECPs have no documentation and
     limited test data to address their impact on software and system safety
     of the proposed system.

   AR, Tab 3, Award Decision Document, at 13.

   ROI complains that the SSA "failed to give ROI credit for safety
   improvements identified in [ECPs[16]]," Protest at 14, and ignored "more
   than 700 pages of safety information" in ROI's proposal. ROI's Comments at
   47. However, based on our review, the agency reasonably determined that
   none of the safety information provided by ROI adequately addressed the
   agency's concerns over the impact of ECPs and whether the system may
   require further design changes, and thus we find no basis to question the
   SSA's rating assessment for this subfactor.[17]

   ROI challenges the "major weakness" rating assessed to its proposal under
   the reliability subfactor of the technical factor. A "major weakness"
   rating under this subfactor was defined as:

     The Offeror has an immature system or does not have any demonstrated or
     documented system reliability that provides confidence that at a minimum
     an inherent reliability of 1000 hours mean time between system aborts
     [MTBSA] is achievable prior to production delivery. Significant doubt
     exists that an inherent reliability of 1000 [MTBSA] will be met prior to
     production delivery.

   RFP sect. M.11.1.h. Recon Optical was assessed a "major weakness" rating
   because

     Recon Optical provided insufficient data to support[ ]its claim that the
     system will meet the threshold requirement for [MTBSA] prior to
     production. In the FPR, ROI did not address the impact to reliability as
     a result of implementing individual ECPs, nor did it state at what level
     the proposed ECPs will affect the system reliability. This lack of
     information raises concerns of the actual reliability of the proposed
     system. ROI has not provided sufficient evidence to show that its
     burn-in process and the submitted/proposed ECPs will improve the MTBSA
     of [REDACTED] to 1000 [hours] prior to delivery.

   AR, Tab 3, Award Decision Document, at 13.[18]

   ROI complains that the agency should not have considered MTBSA failures
   that occurred over a year ago during "burn-in" testing in the field,
   because burn-in is now done in the factory before fielding. According to
   ROI, MTBSA should only measure "aborts on the battlefield" after factory
   burn-in. ROI's Comments at 49. However, the RFP language makes no such
   distinction, and since ROI proposed reliability data based on the fielded
   units, we find the agency's consideration of the MTBSA failures associated
   with these units unobjectionable.

   In sum, we find that the assessment of subfactor ratings supports the
   overall "red" rating that ROI's proposal received under the technical
   factor.

   Schedule Factor

   ROI protests the assessment of a "high risk" rating to its proposal under
   the schedule factor. A "high risk" rating was defined as:

     Offeror does not have an active production line or has not demonstrated
     an ability to meet the CROWS production delivery as described in this
     solicitation to include a spares kit delivery schedule that supports the
     systems delivered based on the reliability documented and supported. The
     bid sample performance and written proposal[] provide significant doubt
     that the Offeror can meet these schedules without start-up delays.

   RFP sect. M.12. The SSA assessed ROI's proposal a "high risk" rating
   because

     The Recon Optical design does not appear to be stable and mature.
     Numerous requirements are identified by Recon Optical that are in
     development, under validation or have root cause issues identified with
     multiple design approaches that are still under evaluation as of the
     submittal of the [FPR]. Recon Optical lists 24 design changes to its
     current bid sample design. Limited information is available to assess
     the level of confidence regarding the feasibility of these changes and
     the impact to schedule with regard to performance, reliability and
     safety compliance. System level testing is not planned to be completed
     until three months after contract award. Recon Optical rationalizes that
     the "final official production configuration" is identified after FAT
     [first article test] and PVT [product verification test] rather than
     prior to delivery to the Government. It is not clear that Recon Optical
     understands the requirement that the FAT and PVT tests are to be
     conducted on systems that are "final official production configuration"
     representative. These factors raise significant doubt that Recon Optical
     will meet production delivery schedules with fully compliant weapon
     stations.

   AR, Tab 3, Award Decision Document, at 14. The SSEB further explains that
   ROI provided limited evidence to support its claim that it had a
   current production capacity of 60 systems per month; the protester's
   production process is highly dependent on the success of its
   subcontractors because ROI maintains minimal stock; ROI provided only
   limited information regarding one of its major subcontractors (which is
   responsible for building approximately half of the CROWS systems), and the
   information provided raised additional concerns related to the major
   subcontractor's efforts to outsource components and its ability to meet
   and maintain the manufacturing schedule; ROI did not appear to understand
   environmental screening requirements; ROI made distinctions between major
   and minor characteristics, even though all verifications were classified
   as major requirements; ROI's design did not seem to be stable or mature
   due to numerous design changes, and limited information was available to
   assess the feasibility and schedule impact of these changes; safety
   analysis and documentation were not finalized and available prior to
   delivery; software safety analysis would not be conducted until government
   testing was complete; and it was not clear that ROI understood that FAT
   and PVT testing were to be conducted on systems that are representative of
   the final official production configuration. AR, Tab 1, SSEB Final Report
   at 81-84; Agency Report at 35-36. All of these issues created "significant
   doubt" as to whether ROI would meet the production delivery schedules with
   fully
   compliant CROWS. Although ROI disagrees with several of the individual
   concerns expressed by the SSEB and the overall rating of its proposal as
   providing "high risk" under the schedule factor, ROI has not shown the
   agency's judgment to be unreasonable or inconsistent with the record.

   For example, ROI contends that the agency misread or ignored portions of
   ROI's proposal that addressed scheduling concerns, and evaluated ROI's
   proposal using a "higher evaluation standard than [for] other offerors."
   Protest at 15-17; Supplemental Protest at 10-12; ROI's Comments at 51-68.
   ROI contends that its proposal should have been rated higher than
   Kongsberg's under the schedule factor largely because "as the incumbent
   contractor, ROI is the only offeror which has produced stabilized CROWS
   units," and therefore, ROI contends, it is more likely to meet the
   delivery schedule than the other offerors. ROI's Comments at 58-59.
   However, the record shows that ROI had difficulty meeting schedule and
   performance requirements under the incumbent contract,[19] and its
   proposed system here likewise encountered numerous problems during bid
   sample testing. ROI's promises that it could provide compliant CROWS by
   the scheduled due dates were not supported, and the information that ROI
   did provide in its proposal raised additional concerns about schedule
   risk. In contrast, Kongsberg has fielded over 1,000 stabilized systems
   and, unlike ROI, provided detailed information demonstrating to the agency
   that it would, and could, meet the schedule requirements with fully
   compliant CROWS. Thus, we find no merit to ROI's arguments that its
   proposal was evaluated unfairly or unequally as compared to Kongsberg's.

   Management Factor

   ROI protests the assessment of a "red" rating to its proposal under the
   management factor, challenging the "weakness" and "major weakness" ratings
   of each of the four subfactors under this factor.

   Under the program management plan subfactor, ROI's proposal was assessed a
   "weakness" rating, which was defined as:

     The Offeror appears to understand the requirements of the RFP. The
     Offeror is lacking in some knowledge or experience in managing a project
     with the demands and ability to adapt to changes required for fielding a
     system of a complexity such as CROWS.

   RFP sect. M.13.1.a.

   In evaluating ROI's proposal, the SSEB favorably recognized that ROI had
   shown "general traceability" to the RFP's requirements and that several of
   its program managers were certified. However, the SSEB also noted a number
   of "negative observations" that precluded a higher rating, including
   discrepancies in the proposal regarding the number of project leads and
   their roles and responsibilities; proposal statements that ROI was
   planning on changing its tools for enterprise resource and material
   planning because the tools it currently uses are "not well suited for a
   program of this caliber"; concern with potential problems with ROI's cost
   management system requiring the Defense Contract Management Agency to
   audit ROI's systems bimonthly; and proposal responses providing only
   "generic answers" to issues relating to program risk and "generic high
   level information" on proposed program management processes. AR, Tab 1,
   SSEB Final Report, at 85-86. The SSEB considered ROI's repeated reference
   to delivery under its incumbent contract as proof of its overall
   management system's "effectiveness, efficiency, and capability," but did
   not find that this warranted a strength because ROI had not "actually
   delivered any systems on the identified contract that have met the
   performance specifications." Id. at 86. The SSA agreed with the SSEB that
   ROI's proposal warranted a "weakness" rating under this subfactor,
   concluding that ROI "is lacking in some knowledge or experience in
   managing a project with the demands and ability to adapt to changes
   required for fielding a system of a complexity such as CROWS." AR, Tab 3,
   Award Decision Document, at 15.

   ROI disagrees with this assessment, essentially complaining that some of
   the SSEB's conclusions are "factually incorrect" because the agency either
   misread or ignored ROI's proposal, which ROI asserts addressed the SSEB's
   criticisms. Protest at 17-18; ROI's Comments at 68-72. However, our review
   of the record confirms that the agency's evaluation was reasonable. For
   example, although ROI asserts that it did not admit its enterprise
   resource and material planning tools were "not well suited" for the CROWS
   program, its FPR does in fact state that ROI "has formed a committee to
   evaluate replacements for [the tool]" and that "[other tools] perhaps more
   suited to a company like Recon Optical . . . will be evaluated for planned
   implementation." ROI's Response to IFN 103 at 1-466. Similarly, although
   ROI complains that the agency misread discrepancies in the number of
   project leads, we note that its proposal does vary the numbers in at least
   three places. Compare ROI FPR, vol. II, table 2, at 1-326 to 1-328
   (identifying 13 "key personnel") with id., fig. 95-3 (identifying 9
   project leads); id. at 1-329 (referring to 14 project leads). It is the
   offeror's burden to submit an adequately written proposal, and based on
   our review of the record, ROI has not met its burden here. See Beck's
   Spray Serv., Inc., supra, at 3.

   ROI makes similar challenges to the assessment of "major weakness" ratings
   to its proposal under the subcontractor management plan, software
   management plan, and quality management plan subfactors. That is, ROI
   contends that the agency misread or ignored aspects of its proposal
   addressing several of the cited concerns. Again, we find no error in the
   agency's evaluation. For example, under the three subfactors, the agency
   found several instances of proposal inconsistencies, insufficient
   documentation to support ROI's promises, or "generic" responses to
   proposal weaknesses, AR, Tab 1, SSEB Final Report, at 87-92, and our
   review of the record confirms the reasonableness of the agency's concerns.
   The RFP warned offerors that "[u]nexplained inconsistencies . . . may be
   grounds for rejection of the proposal," cautioned that "unsupported
   promises to comply with the contractual requirements will not be
   sufficient," and required that proposals "provide convincing documentary
   evidence in support of a conclusionary statement." RFP sect. M.2.1. In
   light of the RFP's clear directives, we are unpersuaded by ROI's
   arguments, which amount to mere disagreement with the agency's judgments
   and are insufficient to render those judgments unreasonable.[20] UNICCO
   Gov't Servs., Inc., supra, at 7.

   In sum, the record shows that ROI's proposal was reasonably rated "red" or
   "high risk" under the three most important factors, and was lower
   technically rated and higher priced than the two other competing
   offers.[21] As a result, we find the SSA's decision not to select ROI's
   proposal for award to be reasonable and consistent with the RFP.[22]

   The protest is denied.

   Gary L. Kepplinger
   General Counsel

   ------------------------

   [1] Combined, all non-price factors were more important than price. RFP
   sect. M.5.1.

   [2] At the factor level, color ratings were defined as follows:

   +------------------------------------------------------------------------+
   |Blue |[T]he Offeror demonstrates an excellent understanding of the      |
   |     |requirements and provides exceptional strengths that will         |
   |     |significantly benefit the Government. The relative value of the   |
   |     |sub-factors rated as Major Strengths and Strengths significantly  |
   |     |outweigh the relative value of the sub-factors rated as           |
   |     |Weaknesses. There are no Major Weaknesses.                        |
   |-----+------------------------------------------------------------------|
   |Green|[T]he Offeror demonstrates a good understanding of the            |
   |     |requirements and provides strengths that will benefit the         |
   |     |Government. The relative value of the sub-factors rated as Major  |
   |     |Strengths or Strengths outweigh the relative value of the         |
   |     |sub-factors rated as Weaknesses and Major Weaknesses.             |
   |-----+------------------------------------------------------------------|
   |Amber|[T]he Offeror demonstrates an acceptable understanding of the     |
   |     |requirements. The relative value of the sub-factors rated as Major|
   |     |Strengths and Strengths are outweighed by Weaknesses and Major    |
   |     |Weaknesses. The relative value of sub-factors rated as Weaknesses |
   |     |outweigh the relative value of sub-factors rated as Major         |
   |     |Weaknesses.                                                       |
   |-----+------------------------------------------------------------------|
   |Red  |[T]he Offeror does not demonstrate an acceptable understanding of |
   |     |the requirements or [has] not sufficiently supported claims. The  |
   |     |relative value of the sub-factors rated as Major Weaknesses and   |
   |     |Weaknesses significantly outweigh the relative value of the       |
   |     |sub-factors rated as Strengths and Major Strengths.               |
   +------------------------------------------------------------------------+

   RFP sect. M.9.1.

   [3] Details about the testing requirements were included in the
   solicitation. RFP sections L.7, M.8.1.

   [4] A failure was defined as "something that prevents the system from
   operating as intended or creates an unsafe condition." RFP sect. M.8.1.

   [5] The first contract was competitively awarded in 2000; the subsequent
   contracts were "Urgent Material Requirement" contracts awarded on a
   sole-source basis.

   [6] A fourth offeror responded to the RFP but did not provide a bid
   sample, and was therefore excluded from the competition.

   [7] The SSEB also obtained assistance from "subject matter experts" during
   the course of the evaluation.

   [8] Briefings to the SSAC and SSA occurred after both the initial and
   final evaluations.

   [9] The third competing offeror's proposal was rated "green" under the
   technical, management, logistic support, and small disadvantaged business
   factors; "amber" under the GPLR factor; and "medium risk" under the
   schedule factor. Its evaluated price was $488,482,668.37. AR, Tab 3, Award
   Decision Document, at 8.

   [10] Although we do not specifically address all of the numerous
   contentions raised by ROI with regard to these three factors (some of
   which were untimely raised), we have taken them into account and
   determined that there is no basis to find the agency's evaluation of these
   factors unreasonable or inconsistent with the RFP.

   [11] A "root cause analysis" is the process of identifying the cause of an
   incident or problem.

   [12] Testing of ROI's bid sample within the temperature range also
   revealed problems with the display (the "monitor was distorted and several
   vertical lines appeared on the screen" and lasted for "approximately three
   minutes") and with "uncommanded movement in the form of a continuous side
   to side tremble of the mounted weapon." AR, Tab 30, TIR 73-1, Screen
   Overlay; id., Aberdeen Test Center (ATC) Test Record No. AD-F-55-07, at 4.
   ROI attempts to minimize the severity of these issues, but the agency has
   persuasively explained that these operational problems could significantly
   impact safety and reliability of the CROWS in combat situations, thus
   putting the operator at serious risk. Agency Supplemental Report at 10-13.
   With regard to the display problem, ROI complains of unequal treatment
   because Kongsberg's proposal was rated a "major strength" under the
   temperature subfactor even though its CROWS display showed an "error
   message" during testing. See AR, Tab 31, ATC Test Record No. AD-F-56-07,
   at 4. However, it is not clear from the record that the display issues of
   the two bid samples were in fact the same, as ROI asserts. In any event,
   the display was not the basis for Kongsberg's "major strength" rating, and
   although the SSEB mentions ROI's display problem as a concern in its
   report, the SSA did not identify this issue as a basis for ROI's rating
   under the temperature subfactor. Thus, even if ROI were correct that bid
   samples were evaluated unequally, it was not prejudiced as a result.

   [13] ROI cannot reasonably complain that the agency should have given
   greater weight to Stryker unit performance, when it asserts that any
   consideration of information outside of the proposal and the bid sample
   was "inconsistent with the Solicitation's evaluation scheme." ROI's
   Supplemental Comments at 10-11.

   [14] ROI disputes that the "failure" of Kongsberg's azimuth release
   mechanism occurred only with the Stryker units or that the root cause and
   a solution were identified. It quotes from Kongsberg's response to IFN 84,
   discussing this issue, where Kongsberg states:

     The root cause of the [azimuth release mechanism] failure experienced as
     part of [t]he [Kongsberg] Bid sample testing at low temperature may be
     poor lubrication of the mechanical parts; [Kongsberg] will investigate
     this problem on the [Kongsberg] Bid sample when it is returned to
     [Kongsberg].

   Kongsberg's Response to IFN 84 at 2. However, as the agency explains,
   Kongsberg was responding to a mistaken belief that the agency had found a
   bid sample failure; in fact, no testing failures of the azimuth release
   mechanism occurred at low temperatures with Kongsberg's bid sample. See
   AR, Tab 31, TIR 73-1, Phase II Cold Temperature Testing ("no faults were
   observed"); id., Test Record No. AD-F-56-07 (no failure identified).
   Kongsberg's confusion may have occurred because the agency did not fully
   explain that the IFN was discussing a failure previously identified in
   units from the Stryker program (of which the SSEB members were aware) and
   not the bid sample. Agency Supplemental Report at 18. The agency's
   explanation seems correct given that, as the protester notes, bid sample
   testing under the temperature subfactor occurred after the IFN was issued
   to Kongsberg, so the IFN could not have been asking about bid sample
   failures. In any event, these issues were later determined not to be
   design related. AR, Tab 1, SSEB Final Report at 19-20. The protester has
   not pointed to any evidence in the record that shows a failure of the
   azimuth release mechanism in Kongsberg's bid sample at low temperatures,
   other than Kongsberg's statement in this IFN.

   [15] ROI complains that the SSA's findings are inconsistent with those of
   the SSEB, which found that ROI had a "proactive system safety program in
   place to address hazards and safety risks." AR, Tab 1, SSEB Final Report,
   at 80. However, it is well settled that an SSA is not bound by the
   recommendation of lower level evaluators. Hubbell Elec. Heater Co.,
   B-289098, Dec. 27, 2001, 2002 CPD para. 15 at 6.

   [16] ROI asserts that the agency has "deliberately withheld approval of
   ECPs," ROI's Comments at 45, but provides no evidence to support this
   assertion.

   [17] ROI complains that Kongsberg's proposal was assessed a "strength" for
   implementing design changes, which evidences disparate treatment in the
   evaluation because ROI's proposal was assessed a "weakness" for design
   changes. ROI's arguments ignore the fact that its "weakness" rating was
   due to its plan to implement numerous future design changes, the impact of
   which was unknown, and ROI's failure to provide sufficient information for
   the agency to evaluate the impact of those changes on safety and
   performance. In contrast, Kongsberg's proposal "display[ed] the ability to
   initiate and implement design changes to mitigate or eliminate risks," and
   Kongsberg provided sufficient supporting data to prove its claims. AR,
   Tab 1, SSEB Final Report, at 20, 80.

   [18] ROI complains that the agency's demand (during discussions) for
   additional documentation supporting the reliability of ECPs was
   unreasonable and was not required by the RFP. However, the RFP expressly
   provides for the consideration of "demonstrated or documented system
   reliability." RFP sect. M.11.1.h.

   [19] As stated above, ROI was issued a show cause notice and termination
   for default cure notice.

   [20] ROI alleges disparate treatment in the evaluation of its and
   Kongsberg's proposal under the program management plan and subcontractor
   management plan subfactors. It asserts it deserved higher ratings since
   its plans were greater in page count than Kongsberg's. However, ROI
   ignores the fact that much of its proposal was reasonably found to contain
   inconsistencies, to be generic, and, despite its length, to lack sufficient
   detail, whereas Kongsberg provided "detailed interrelated and coherent
   documentation" of its plans and responded fully to agency concerns in its
   FPR and responses to IFNs. AR, Tab 1, SSEB Final Report, at 24-25. Other
   assertions of unequal treatment are similarly without merit.

   [21] ROI also challenges the evaluation of its proposal under the logistic
   support, GPLR, past performance, and price factors, and asserts that the
   "blue" rating it received under the small disadvantaged business factor
   should have been given more weight in the evaluation. Although we do not
   discuss these aspects of the evaluation, we have reviewed the protest
   issues raised and find them to be without merit.

   [22] ROI also challenges the evaluation of Kongsberg's proposal, but ROI
   is not an interested party to raise these protest grounds. An "interested
   party" is an actual or prospective offeror whose direct economic interest
   is affected by the award, or failure to award, a contract. 4 C.F.R. sect.
   21.0(a)(1) (2007). An offeror who is not next in line for award is not an
   interested party to protest the evaluation of the awardee. Ridoc Enter.,
   Inc., B-292962.4, July 6, 2004, 2004 CPD para. 169 at 9. As discussed
   above, ROI's proposal was reasonably rated the lowest of the three
   competing proposals and was the highest in price, and therefore ROI is not
   next in line for
   award even if its protest of Kongsberg's proposal were sustained. Since
   ROI did not challenge the evaluation of the intervening offeror's
   proposal, ROI is not an interested party to challenge the evaluation of
   Kongsberg's proposal in this case. Id.