BNUMBER:  B-272261; B-272261.2
DATE:  September 18, 1996
TITLE:  Research Analysis and Maintenance, Inc.

**********************************************************************

DOCUMENT FOR PUBLIC RELEASE
A protected decision was issued on the date below and was subject to a 
GAO Protective Order.  This version has been redacted or approved by 
the parties involved for public release.
Matter of:     Research Analysis and Maintenance, Inc.

File:     B-272261; B-272261.2

Date:     September 18, 1996

William L. Walsh, Jr., Esq., J. Scott Hommer III, Esq., Wm. Craig 
Dubishar, Esq., and Paul N. Wengert, Esq., Venable, Baetjer and 
Howard, for the protester.
Gerard F. Doyle, Esq., and Alexander T. Bakos, Esq., Doyle and 
Bachman, for Ilex Systems, Inc., an intervenor.
Jeffrey I. Kessler, Esq., and Gary Theodore, Esq., Department of the 
Army, for the agency.
Guy R. Pietrovito, Esq., and James A. Spangenberg, Esq., Office of the 
General Counsel, GAO, participated in the preparation of the decision.

DIGEST

1.  In a negotiated procurement for computer system and software 
engineering support services, protest that the contracting agency was 
required to consider Capability Maturity Model software process 
assessments under various technical evaluation factors is denied where 
the solicitation only provided for the use of the Capability Maturity 
Model in the evaluation of proposals under one evaluation subfactor.

2.  Protest that the contracting agency misapplied the Capability 
Maturity Model in assessing the protester's and awardee's software 
process risk is denied where the protester merely disagrees with the 
agency's risk assessment and does not show the agency's judgment to be 
unreasonable.

3.  In a procurement for the award of a time-and-materials contract 
with fixed-price burdened labor rates, the contracting officer 
reasonably evaluated the realism of the offerors' proposed labor 
rates, in accordance with the solicitation, where the agency assessed 
the offerors' ability to attract, hire, and retain qualified personnel 
during the contract at the proposed rates.

4.  The contracting agency did not coerce or mislead the protester 
into raising its proposed labor rates, where the agency was reasonably 
concerned with the protester's ability to hire and retain qualified 
personnel due to its low proposed labor rates and asked the protester 
during discussions to substantiate how it intended to hire and retain 
qualified personnel at the rates proposed; the protester's decision to 
increase its proposed labor rates reflected the exercise of the firm's 
business judgment.

5.  Protest that the awardee's proposed small business and small 
disadvantaged business subcontracting plan does not satisfy the 
solicitation requirements is denied where the solicitation required 
offerors to subcontract at least 20 percent of the contract value to 
small business concerns and to make a good faith effort to subcontract 
at least 5 percent of the contract value to small disadvantaged 
business concerns; the awardee proposed subcontracting more than 20 
percent of the contract value to two subcontractors, a small business 
concern and a small disadvantaged business concern; and the protester 
has not shown that the agency's determination that the awardee made a 
good faith effort to meet the small disadvantaged business 
subcontracting goal was unreasonable or not in accord with the 
solicitation requirements.

6.  The contracting agency reasonably determined that any potential 
organizational conflict of interest involving a proposed subcontractor 
of the awardee would be avoided or mitigated through the assignment of 
work under contract task orders.

DECISION     

Research Analysis and Maintenance, Inc. (RAM) protests the award of a 
contract to Ilex Systems, Inc. under request for proposals (RFP) No. 
DAAB07-95-R-H601, issued by the U.S. Army Communications-Electronics 
Command for "system and software engineering support services for 
Mission Critical Defense Systems."  RAM challenges the agency's 
technical and cost evaluations, asserts that the agency conducted 
misleading discussions that caused the protester to increase its 
proposed labor rates, claims that Ilex's proposed level of 
subcontracting did not comply with the RFP requirements, and argues 
that a proposed Ilex subcontractor has an organizational conflict of 
interest (OCI).

We deny the protest.

The RFP provided for the award of a time-and-materials, indefinite 
delivery, indefinite quantity contract for system and software 
engineering support services.  The statement of work (SOW) detailed 
the services that could be ordered under the contract, including 
support services for system and software acquisition, for the 
establishment and maintenance of a software support environment, and 
for software maintenance and enhancement.  Offerors were informed that 
delivery orders would be issued for the various contract tasks and 
that the total estimated level of effort was 300 man-years.  The RFP, 
as amended, also provided that offerors were required to subcontract 
at least 20 percent of the contract value to small business concerns, 
and to make a good faith effort to subcontract at least an additional 
5 percent of the contract value to small disadvantaged business 
concerns.

The RFP provided for award on a best value basis, and stated the 
following evaluation factors and subfactors:

        1.  Technical

          a.  Technical approach
          b.  Technical management of software activities
          c.  Qualifications and availability of personnel
          
        2.  Sample Task Orders     
          (5 sample tasks provided)

        3.  Management

          a.  Hiring and staffing plan
          b.  Management techniques and controls
          c.  Small business and small disadvantaged business 
subcontracting proposal

        4.  Performance Risk

          a.  Software process risk evaluation (SPRE)
          b.  Technical, management, schedule, and cost risk
     
        5.  Cost

          a.  Cost proposal
          b.  Labor rate realism

The RFP provided that the technical and sample task evaluation factors 
were equal in weight and together were significantly more important 
than the remaining evaluation factors combined.  The management, 
performance risk, and cost evaluation factors were stated to be of 
equal weight.  With regard to the performance risk factor's SPRE 
subfactor, the RFP stated:

     "The [g]overnment will use the [SPRE] to evaluate the process 
     capability of each offeror.  The SPRE methodology is consistent 
     with the Software Engineering Institute's Software Capability 
     Evaluation methodology.  The [o]fferor's process will be 
     evaluated against the Capability Maturity Model as defined in the 
     CMU/SEI-93-TR-24 and CMU/SEI-93-TR-25 to determine the risks 
     associated with the ability of the [o]fferor's process, when 
     followed, to produce quality software on schedule and within 
     budget."[1]

As amended, the RFP required offerors to complete a "Software Process 
Maturity Questionnaire" to aid the agency's SPRE.  Offerors were 
informed that while proposal risk would be assessed under each factor 
and subfactor "in terms of overall potential performance," the 
"performance risk factor shall be separately evaluated so that a level 
of confidence can be determined."

The RFP required offerors to provide fixed hourly rates, fully 
burdened with all indirect expense rates, based upon labor 
classifications and estimated man-years set forth in the solicitation.  
Offerors were informed that the agency would evaluate the realism of 
the offeror's proposed rates in relation to the contract requirements 
and would ascertain the offeror's ability to attract, hire, and 
retain qualified personnel during the contract.  The RFP also
provided that the award fee was fixed for all offerors and would not 
be evaluated, and that other direct costs (such as materials, travel, 
facility leases, and equipment rental) were fixed at $750,000 per year 
for evaluation purposes.

The Army received proposals from three offerors, including RAM and 
Ilex (the incumbent contractor).  Technical and cost discussions were 
conducted with each offeror, and proposal revisions were received.  
Ilex's
and RAM's best and final offers (BAFO) were evaluated as follows:[2]

                         Ilex           RAM

     Technical           [DELETED]      [DELETED]

     Sample Tasks        [DELETED]      [DELETED]

     Management          [DELETED]      [DELETED]

     Performance Risk    [DELETED]      [DELETED] 

     Cost                $82.6M[3]      [DELETED]

Ilex's and RAM's proposed costs were determined to be reasonable and 
realistic for the proposed contract.

The source selection authority was briefed on the evaluation findings, 
and determined that Ilex's proposal was the most advantageous to the 
government, as follows:

     "In summary, [Ilex] presented the highest rated Technical and 
     Sample Tasks Proposals.  Each of the offerors were rated 
     [DELETED] in the Management Proposals and were rated [DELETED] in 
     the Performance Risk factor.  [Ilex] clearly represents the best 
     overall value to the Government since the proposal received the 
     best rating in the most heavily weighted factors, Technical and 
     Sample Tasks, at the lowest evaluated cost."

Award was made to Ilex, and this protest followed.

RAM first challenges the Army's evaluation of RAM's and Ilex's 
technical proposals under the technical, sample tasks, and performance 
risk evaluation factors.  RAM's complaint is grounded upon its 
argument that the maturity of an offeror's software development 
processes, as determined using the software process assessment of the 
Carnegie-Mellon Software Engineering Institute's Capability Maturity 
Model, is intrinsically related to an offeror's potential performance 
under the contract, and must be considered under each of the various 
RFP evaluation factors and subfactors.  RAM alleges that it has 
achieved a higher maturity level evaluation than Ilex under software 
process assessments conducted by the Army during the performance of 
other contracts and that the Army unreasonably did not consider this 
information in its technical evaluation.

Specifically, RAM notes that the technical approach subfactor to the 
technical evaluation factor provided for the evaluation of an 
offeror's "demonstrated ability" to develop and especially maintain 
application software, operating systems, and support software for 
tactical systems.  In RAM's view, the evaluation of an offeror's 
"demonstrated ability" under this subfactor necessarily must include 
consideration of software process assessments performed by the agency 
under other contracts.  Also, RAM complains that it was unreasonable 
for the Army to rate Ilex's proposal as [DELETED]--as compared to 
RAM's proposal's [DELETED] rating--for the sample tasks factor, given 
RAM's allegedly more mature software development process; in this 
regard, RAM complains that its proposal was downgraded under Sample 
Task [DELETED] because of its reference to its [DELETED].  Finally, 
RAM challenges the agency's evaluation scores for RAM and Ilex under 
the performance risk evaluation factor on the basis of RAM's allegedly 
better and more mature software process assessments.

The Army responds that, as provided for by the RFP, it used the 
Capability Maturity Model in its evaluation of the proposals under the 
SPRE subfactor of the performance risk factor.  In addition, the Army 
and Ilex dispute RAM's assertion that the RFP required the agency to 
consider prior software process assessments in the evaluation of 
proposals under the technical and sample tasks factors.  The agency 
and Ilex also dispute RAM's allegation that RAM was previously rated 
as having a more mature software development process than Ilex; in 
this regard, the agency states that the software process assessment to 
which RAM points was actually an informal practice exercise, 
conducted in 1992 at Fort Huachuca as training for a software 
capability evaluation team in advance of an upcoming procurement, and 
was not documented.

In considering a challenge to a particular evaluation conclusion, we 
examine the record to determine whether the judgment was reasonable 
and in accord with the evaluation criteria listed in the solicitation.  
Abt Assocs., Inc., B-237060.2, Feb. 26, 1990, 90-1 CPD para. 223.  A 
protester's mere disagreement with the agency's evaluation 
determination does not demonstrate that the evaluation was 
unreasonable.  Brunswick Defense, B-255764, Mar. 30, 1994, 94-1 CPD 
para. 225.

Here, we agree with the Army and Ilex that the RFP only required the 
use of the Capability Maturity Model in the evaluation of proposals 
under the performance risk factor's SPRE subfactor, inasmuch as it is 
the only evaluation factor or subfactor that explicitly provides for 
the use of the Capability Maturity Model in the evaluation of 
proposals under the RFP.  While the RFP's technical evaluation factor 
concerns whether an offeror's approach satisfies the SOW requirements, 
and the offeror's ability to utilize process improvement principles, 
this does not reasonably suggest that the agency would employ software 
process assessments from the Capability Maturity Model in evaluating 
offerors' proposals with respect to those evaluation factors and 
subfactors other than the SPRE subfactor.  Rather, the RFP clearly 
informed offerors that their technical approach, understanding, and 
capabilities would be evaluated under the technical and sample tasks 
evaluation factors, but did not specify the particular methodology or 
evaluation tool or tools to be used.

It is also true, as RAM notes, that the SOW references the use of the 
Capability Maturity Model for a number of contract services, e.g., 
that the contractor, as a part of its contract support services, would 
support the agency's evaluations, using the Capability Maturity Model, 
of other contractors' software development and maintenance processes; 
would assist the agency in evaluating various continuous improvement 
activities, such as "total quality management" and the Capability 
Maturity Model; and would assist in reviewing and evaluating "generic 
software development methodologies," such as the Capability Maturity 
Model.  However, these SOW provisions also do not reasonably suggest 
to offerors that their proposals will be evaluated using the 
Capability Maturity Model, but merely state the contract requirements 
that the contractor will be required to meet.

In sum, we find that the RFP did not require the use of Capability 
Maturity Model's software process assessments in the evaluation of 
proposals under the technical and sample tasks evaluation factors.[4]  
Given this determination and the fact that RAM's challenge to the 
Army's evaluation under the technical and sample tasks evaluation 
factors was limited to the agency's failure to use the Capability 
Maturity Model to assess offerors' proposals under these evaluation 
factors, we have no basis to question the agency's evaluation ratings 
under the factors.[5]  We further note that the underlying premise on 
which RAM's protest allegation is based--that RAM's software process 
has been assessed as more mature than Ilex's--is both undocumented and 
unsupported in the record.  Accordingly, this part of RAM's protest is 
denied.[6]  

RAM also challenges the agency's use of the Capability Maturity Model 
in the evaluation of RAM's and Ilex's proposals under the performance 
risk factor's SPRE subfactor.  RAM's and Ilex's proposals were each 
evaluated as being of [DELETED] risk under the SPRE subfactor and 
overall under the performance risk evaluation factor.[7]  RAM argues 
that its [DELETED] risk rating is unreasonable and reflects a 
misapplication of the Capability Maturity Model; specifically, RAM 
complains that the agency apparently required "institutionalized" 
software processes before an offeror's software process would be 
deemed low risk and that this is contrary to the Capability Maturity 
Model.  RAM also complains that the agency overlooked the information 
RAM provided in response to the agency's discussion questions, which 
assertedly was sufficient either to demonstrate a mature software 
process or to alleviate evaluated weaknesses.

The record shows that the agency evaluated the information provided by 
RAM in its proposal, as well as information obtained by the agency in 
an on-site visit to RAM's plant, where the agency reviewed RAM's 
organizational-level and specific project documentation and 
interviewed various RAM personnel in accordance with the Capability 
Maturity Model procedures.  As a result of this evaluation, RAM's 
proposal received a [DELETED] risk rating under the SPRE subfactor and 
the Army notified RAM of [DELETED] weaknesses identified in its 
software development process.  In response to the agency's discussions 
concerning RAM's software development process, RAM provided detailed 
information addressing the evaluated weaknesses and stating how RAM 
intended to improve its process; RAM also provided documentation to 
show its established software process.  Despite RAM's lengthy and 
detailed response to the agency's discussions, the Army again 
evaluated RAM's proposal as [DELETED] risk under the SPRE subfactor 
because although RAM had provided adequate methods to correct or 
mitigate most of the perceived weaknesses, the firm did not establish 
dates or a specific schedule for its offered improvements.  The Army 
concluded, based on the SPRE, that some risk remained regarding RAM's 
performance of the contract work.

We find reasonable the Army's evaluation of RAM's proposal as 
[DELETED] risk under the SPRE subfactor.  Although RAM offered a plan 
to correct or mitigate weaknesses identified in its software 
development process, the agency was reasonably concerned that RAM's 
software process was not well established, and that this indicated 
some risk in the performance of the contract.  While RAM complains 
that the agency's insistence on a more "institutionalized" software 
process, before it would find the firm's software process to be low 
risk, is inconsistent with the Capability Maturity Model, RAM has not 
established why this is so.  Moreover, from our review of the 
Capability Maturity Model, we find, contrary to RAM's arguments, that 
the assessment of an organization's maturity level under a software 
process assessment does not necessarily govern the assignment of risk 
in a software capability evaluation.[8]  Rather, the Capability 
Maturity Model specifically provides for the exercise of informed 
professional judgment in conducting both software process assessments 
and software capability evaluations, and recognizes that "the results 
of a software process assessment or software capability evaluation may 
differ, even on successive applications of the same method."

RAM also argues that the agency's evaluation of its and Ilex's 
proposals as [DELETED] risk under the SPRE subfactor is unreasonable 
because RAM allegedly has a more mature and less risky software 
process than Ilex.  As noted above, we find no support in the record 
for RAM's unsubstantiated assertions concerning the relative maturity 
of RAM's or Ilex's software processes.  Rather, the record establishes 
that the Army evaluated the two firms' software processes based upon 
the information provided in the proposals, the agency's on-site visits 
to the firms' offices, and the offerors' responses to the agency's 
discussion questions.  RAM has not shown the agency's evaluation 
judgment to be unreasonable.

RAM next challenges the agency's cost evaluation of Ilex's proposal.  
Specifically, RAM argues that the Army, contrary to the RFP 
requirement for a cost realism evaluation of offerors' proposed loaded 
labor rates, did not evaluate Ilex's more than $[DELETED] decrease in 
its total proposed cost from that proposed in the awardee's initial 
proposal.[9]

The record does not support this protest allegation.  The Army 
verified both RAM's and Ilex's proposed loaded labor rates (base 
direct labor rates and indirect cost rates) with the Defense Contract 
Audit Agency, and also compared the two firms' proposed rates with the 
Independent Government Estimate (IGE).  Because both firms proposed 
some rates that were below those of the IGE, the agency conducted cost 
discussions with RAM and Ilex seeking additional support for the 
proposed fixed rates.  In response to discussions, Ilex reduced its 
proposed costs by approximately [DELETED] percent by (1) [DELETED]; 
(2) [DELETED]; and (3) [DELETED].  Although the Army found that, 
overall, Ilex's proposed costs, based upon its fixed-price rates, were 
reasonable and realistic, some of Ilex's proposed rates were 
considered low; on this basis, the Army assessed, under the technical 
evaluation factor's qualifications and availability of personnel 
subfactor, the risk that Ilex could obtain and retain qualified 
personnel.

We find that the agency's cost realism evaluation was reasonable and 
consistent with the RFP requirements.[10]  Where, as here, a 
solicitation provides for the award of a time-and-materials contract 
with fixed-price burdened labor rates and no cost-reimbursable 
elements, there is no requirement that the agency assess the cost 
realism of the proposed rates.  See Research Management Corp., 69 
Comp. Gen. 368 (1990), 90-1 CPD para. 352; SYS, B-258700, Jan. 31, 
1995, 95-1 CPD para. 57.  The RFP provided, however, that the agency would 
assess offerors' proposed rates to ascertain the offerors' ability to 
attract, hire, and retain qualified personnel during the contract.  
The record demonstrates that the agency did exactly that.[11]

RAM also protests that it was misled during discussions to raise its 
proposed labor rates such that its final proposed price was 
approximately $[DELETED] higher than that of Ilex.  

An agency may not consciously coerce or mislead an offeror into 
raising its price.  See Eagle Technology, Inc., B-236255, Nov. 16, 
1989, 89-2 CPD para. 468.  Here, however, the record establishes that the 
agency did not coerce or mislead RAM into raising its proposed labor 
rates.  Rather, the record shows that the agency found that a number 
of RAM's proposed labor rates in its initial proposal were low and, as 
a result, asked RAM in discussions to substantiate how it intended to 
hire and retain qualified personnel at the rates proposed.  In 
response to the agency's discussions, RAM raised its proposed labor 
rates to reflect the rates currently being paid to its employees.  Rather 
than establishing coercion or misleading discussions as RAM asserts, 
the agency's discussions merely reflected the agency's reasonable 
concern, in accordance with the RFP's provisions, that RAM's low 
proposed labor rates may affect its ability to hire and retain 
qualified personnel.  RAM was given the opportunity either to 
substantiate its initially proposed rates or to propose different 
rates.  That RAM chose to raise its proposed labor rates reflects the 
exercise of the firm's business judgment, not improper conduct by the 
agency.

RAM also protests that Ilex's offer was not compliant with the RFP's 
mandatory subcontracting requirements.  We disagree.  The RFP, as 
amended, provided that: 

     "[o]fferors are required to subcontract at least 20 (twenty) 
     [percent] of the contract value to small business concerns, and 
     to make a good faith effort to subcontract at least an additional 
     5 (five) [percent] of the contract value to small disadvantaged 
     business concerns."  

In its BAFO, Ilex proposed two subcontractors:  a small business 
concern and a small disadvantaged business concern; together, the 
subcontracts to these two concerns totaled more than 20 percent of the 
contract value.  As the Army reasonably found, Ilex's proposed 
subcontracting plan satisfied the RFP requirement that offerors 
subcontract at least 20 percent of the contract value to small 
business concerns.  The Army also found that the amount Ilex proposed 
to subcontract above 20 percent represented the awardee's good faith 
effort to subcontract with its proposed small disadvantaged business 
subcontractor.  While RAM apparently disagrees with this 
determination, it does not show why this determination was 
unreasonable or not in accordance with the RFP requirements.

RAM finally protests that one of Ilex's proposed subcontractors has an 
OCI, which should have disqualified Ilex from award.  The Army 
acknowledges that one of Ilex's subcontractors is also a 
subcontractor under another task order contract to provide software 
development and maintenance support on some of the systems to be 
supported under the contract to be awarded here.  The contracting 
officer for that contract is the same person assigned as contracting 
officer for this procurement.  The Army disputes that Ilex's 
subcontractor has an actual OCI because the subcontractor has not done 
any work under the other contract.  In addition, the Army states that 
any potential OCI will be avoided or mitigated through the negotiation 
of delivery orders under the two contracts or the tasking of potential 
OCI work to another contractor.

A contracting officer is required to avoid, neutralize, or mitigate a 
significant potential OCI before contract award.  Federal Acquisition 
Regulation secs. 9.504, 9.505.  We will not overturn an agency's 
determination regarding the existence of an actual or potential OCI, 
or whether that OCI can be avoided, neutralized, or mitigated, except 
where the determination is shown to be unreasonable.  D.K. Shifflet & 
Assocs., Ltd., B-234251, May 2, 1989, 89-1 CPD para. 419.  Here, the Army 
states, and we find it reasonable, that any potential OCI will be avoided by 
the careful assignment of work to the subcontractor under the two task 
order contracts.  See Deloitte & Touche, 69 Comp. Gen. 463 (1990), 
90-1 CPD para. 486.  RAM does not contend that any potential OCI involving 
the subcontractor cannot be so avoided.  

The protest is denied.

Comptroller General
of the United States

1. The Capability Maturity Model was developed by the Carnegie-Mellon 
Software Engineering Institute to provide a tool for assessing and 
evaluating the maturity of an organization's software processes.  The 
model, which identifies five levels of maturity, provides for software 
process assessments--or self-assessments--that allow organizations to 
implement improvement programs and for software capability evaluations 
that allow evaluators to identify the risks of selecting among 
different contractors for award.

2. "Good" was defined as a beneficial approach that satisfies all 
the government's requirements with a low degree of risk, while 
"acceptable" was defined as an approach that satisfies all the 
government's requirements with a moderate degree of risk.  Under the 
performance risk factor, "moderate" risk was defined as the existence 
of some doubt, based on an offeror's performance record, that the 
offeror can perform the proposed effort.

3. "M" means million.

4. We also disagree with RAM's argument that the Capability Maturity 
Model was required in the evaluation of proposals under the 
performance risk factor's technical, management, schedule, and cost 
risk subfactor.  This subfactor specifically provided for the 
evaluation of the degree to which offerors and their major 
subcontractors had met technical, schedule, and cost objectives on 
contracts within the past 5 years for related efforts.

5. RAM in its comments complained that the agency's evaluation of 
proposals was inadequately documented.  Where an agency fails to 
document or retain evaluation records, it bears the risk that there is 
inadequate supporting rationale in the record for its evaluation and 
source selection decision and that we will not conclude that there is 
a reasonable basis for the agency's evaluation or decision.  Southwest 
Marine, Inc.; American Sys. Eng'g Corp., B-265865.3; B-265865.4, Jan. 
23, 1996, 96-1 CPD para. 56.  However, we will not disrupt an agency's 
procurement merely because the agency has failed to adequately 
document its evaluation or source selection decision, where the record 
otherwise shows the evaluation or source selection decision to be 
reasonable.  Id.  Here, even assuming the evaluation was not 
adequately documented, we do not find on this record that the agency's 
evaluation under these factors was unreasonable. 

6. RAM also challenged the agency's evaluation of its response to 
Sample Task [DELETED] as [DELETED].  We need not address this 
allegation because the record establishes that, even if true, the 
allegation would not change the relative competitive standing of the 
offerors.  That is, Ilex's proposal would remain higher rated with a 
lower evaluated cost than RAM's and, on this basis, Ilex would remain 
entitled to award.

7. RAM's and Ilex's proposals both received low risk ratings under the 
technical, management, schedule, and cost risk subfactor.

8. The Capability Maturity Model provides for software process 
assessments, which "focus on identifying improvement priorities within 
an organization's own software process," as well as software 
capability evaluations, which focus "on identifying the risks 
associated with a particular project or contract for building 
high-quality software on schedule and within budget."  As recognized 
in the model, these assessment and evaluation processes are different; 
they "differ in motivation, objective, outcome, and ownership of the 
results."

9. RAM initially argued that Ilex had failed to include subcontractor 
costs in its cost proposal.  This allegation was without merit and was 
not pursued by RAM after its receipt of the agency report.

10. RAM speculates that during contract performance Ilex may attempt 
to charge as direct labor cost elements that are properly accounted 
for as indirect costs.  There is no evidence to support RAM's 
supposition in this regard, and we will not presume that an agency 
will improperly administer its contracts.

11. Although RAM complains that the agency did not sufficiently 
document its cost realism evaluation of Ilex's proposal, the record 
shows that the agency did consider Ilex's proposed rates and assess 
Ilex's ability to hire and retain qualified personnel.  In the context 
of this procurement, we find this record adequate for our review.