Ceres Environmental Services, Inc. v. United States

OPINION AND ORDER

WILLIAMS, Judge.

In this post-award bid protest, Ceres Environmental Services, Inc. (“Ceres”) challenges the award of contracts made by the United States Army Corps of Engineers (the “Corps”) in two geographic regions, Regions 6a and 6b, covering Hawaii and Alaska, respectively, following a recompetition directed by this Court in AshBritt, Inc. v. United States, 87 Fed.Cl. 344 (2009). In AshBritt, the Court found that the Government engaged in unequal and misleading discussions and improperly evaluated reach back assignments without considering price, and ordered the Government to “reprocure the services awarded in the primary contracts in Regions 5, 6A and 6B and the reach-back assignments in Regions 2C, 4, 5, 6A and 6B.” 87 Fed.Cl. at 380. This protest to the reprocurement comes before the Court on the parties’ cross-motions for judgment on the Administrative Record and on Ceres’ motion for a permanent injunction.

Findings of Fact 2

This protest, like the protest in AshBritt, concerns Request for Proposals (“RFP”) number W912P8-07-R-0101 that the Corps issued on June 23, 2007, to procure specified equipment, operators, and laborers for the removal of debris originating from any natural or man-made catastrophe or disaster in 10 geographic regions and sub-regions. AAR at 209, 218.3 For each region, the Corps would award an Indefinite Delivery/Indefinite Quantity (“ID/IQ”) contract at a firm fixed price for a single base year with four one-year option periods, limited to a maximum of $50 million per year and $250 million over the life of the contract. AAR at 210-11, 218, 317. For each region, the Corps would also assign a back-up, or “reach back,” contractor that might, under certain circumstances, be activated in addition to, or in place of, the primary contractor.4 The contract allowed the agency to issue task orders as firm fixed-price, time and materials, or a hybrid of both. Id. The agency indicated in the solicitation that it intended “to issue the majority of the task orders as firm-fixed price.” AAR 319.

On April 8, 2008, the agency selected primary contracts and reach back assignments for the 10 regions and sub-regions as follows:

Region    Primary Contract       Reach Back Assignment
1         ECC                    Phillips and Jordan
2A        Phillips & Jordan      Ceres
2B        AshBritt               Ceres
2C        Xpert’s                Phillips & Jordan
2D        BGM-Ceres              N/A
3         ECC                    AshBritt
4         Ceres                  Phillips & Jordan
5         ECC                    Phillips & Jordan
6A        ECC                    Phillips & Jordan
6B        Ceres                  Phillips & Jordan

87 Fed.Cl. at 360. In AshBritt, the Court ordered the agency to recompete the primary contract awards in three regions—Regions 5, 6a and 6b—and the reach back assignments in five regions—Regions 2C, 4, 5, 6a and 6b. 87 Fed.Cl. at 380. Following recompetition, on November 25, 2009, the Source Selection Authority (“SSA”) decided to award contracts in the recompeted regions as follows:

Region    Primary Contract       Reach Back Assignment
2C        Not recompeted         Phillips and Jordan
4         Not recompeted         Phillips and Jordan
5         AshBritt, Inc.         ECC
6a        AshBritt, Inc.         ECC
6b        Phillips and Jordan    Ceres Environmental Services, Inc.

CAR 1156. Ceres now challenges the recompeted primary contract awards in Regions 6a and 6b.8 Region 6a covers the state of Hawaii, and Region 6b covers the state of Alaska. AAR 233.

Source Selection Process and Solicitation

According to the solicitation, award was to be made using the “best value” tradeoff process. AAR at 317-18; CAR 2. The solicitation stated that “[p]roposal evaluation factors shall be rated in accordance with the criteria set forth in the Army Source Selection Manual” and provided an internet link to this document. AAR at 317. The Army Source Selection Manual, Appendix AA to the Army Federal Acquisition Regulation Supplement (“AFARS”), explained the source selection authority’s assessment of best value as follows: “To determine which proposal provides the best value, the [source selection authority] must analyze the differences between competing proposals. This analysis must be based on the facts and circumstances of the specific acquisition.” AFARS Appendix AA at 39. The Army Source Selection Manual further provided:

The tradeoff process, or tradeoff analysis, compares the strengths and weaknesses of the competing proposals to determine which proposal(s) represent(s) the best value to the Government and thus shall receive contract award.
Tradeoff analysis is a subjective process in that it requires the [source selection authority] to exercise reasonable business judgment. When performing this analysis, consider each proposal’s total evaluated price and the discriminators in the non-cost ratings as indicated by each proposal’s strengths, weaknesses, and risks. Consider these differences in light of the relative importance of each evaluation factor.

AFARS Appendix AA at 40-41.

The solicitation also incorporated the full text of FAR 52.215-1, “Instructions to Offerors — Competitive Acquisition.” The clause provided that:

The Government may determine that a proposal is unacceptable if the prices proposed are materially unbalanced between line items or subline items. Unbalanced pricing exists when, despite an acceptable total evaluated price, the price of one or more contract line items is significantly overstated or understated as indicated by the application of cost or price analysis techniques. A proposal may be rejected if the Contracting Officer determines that the lack of balance poses an unacceptable risk to the Government.

FAR 52.215-1(f)(8) (2003); AAR 315. In addition, subsection (f)(9) provided that “[i]f a cost realism analysis is performed, cost realism may be considered by the source selection authority in evaluating performance or schedule risk.” FAR 52.215-1(f)(9); AAR 315.

Section M of the solicitation established five evaluation factors: (1) Past Performance, (2) Management/Operations Plan, (3) Small Business Subcontracting Plan, (4) Technical Approach to Sample Task Order, and (5) Price. AAR at 318-19; CAR 4-11. As to the relative importance of the five evaluation factors, the solicitation stated that the evaluation factors other than cost or price, when combined, were “significantly more important than cost or price.” AAR 318. The relative importance of each of the non-cost factors was comparatively equal, and the relative importance of the sub-factors of each non-cost factor was also comparatively equal. Id.; CAR 11. The agency amended the solicitation to add that each proposal would also receive “an integrated assessment based on [the five] factors listed” above. AAR 354. To that end, the agency would assign a proposal risk rating of “Low,” “Moderate,” or “High” to each proposal. Id.

To evaluate offerors’ technical approach, the agency indicated it would evaluate the offerors’ technical responses to a sample task order (“STO”) based upon a mock hurricane event in North Carolina. AAR 3-4, 292-301. The STO was provided in Attachment 12 to Section J of the solicitation. The solicitation provided that the following sub-factors would be used to develop an overall rating for this factor: (1) technical approach, (2) production rate, (3) use of local subcontractors, and (4) site specific safety and health plan. AAR 3-4.

As to the fifth evaluation factor — price — Section M.5 of the solicitation described how the agency would evaluate prices, stating:

A separate price evaluation will be completed for each region. Offerors shall submit fixed price and time and material rates and markups for every region for which they wish to be considered in the competition. These prices will be contractually binding and cover each contract line item included in Section B of the solicitation. The evaluated rate schedules will form the basis for the best-value trade-off decisions. The sum of the extended value of the Section B fixed price contract line items will determine the low offeror for each region. The extended value refers to the proposed unit price multiplied by the solicitation’s estimated quantity.
The Government intends to issue the majority of the task orders as firm-fixed price. The prices submitted in response to the sample task order will be used to determine price/cost realism and as part of the proposal risk assessment.

AAR 319 (emphasis in original).

Section L of the solicitation instructed offerors to organize their proposals into five volumes: Volume I was to include the offeror’s Section B price proposals for each region in which it was competing; Volume II, the offeror’s past performance information and small business contracting plan; Volume III, the offeror’s management and operations plan; Volume IV, the offeror’s technical response to the STO; and Volume V, the offeror’s “pricing information” for the STO. AAR 351-52.

As originally issued on June 23, 2007, Section L mandated that the offerors’ pricing for the STO be derived from their Schedule B prices:

Volume V: Pricing for Sample Task Order: This volume shall contain the pricing information for the Sample Task Orders. All pricing shall match the proposed rates submitted in Volume I. The contractor shall document all assumptions made in estimating the price. There is no page limit to this volume.

AAR 311. Attachment 12, the Sample Task Order, set forth assumptions offerors were to use in preparing their proposals, listing 12 labor categories with rates provided by the agency, including a $40.00 per hour rate for the Operations Manager position. AAR 292.

On July 10, 2007, however, the agency issued an amendment to the solicitation “to answer questions and update the solicitation document.” AAR 320. One question addressed the instructions to offerors on STO pricing. AAR 323. The agency set forth the question and answer in Amendment 0001 to the solicitation as follows:

SECTION J QUESTIONS
6. J. Attachment 12, Assumptions, Item 5.k: This specifies that offerors must use a $40.00/hr rate for the Operations Manager when pricing the sample task order. Section B.1, Item 0001 requires the offeror to propose an hourly rate for the Operations Manager. Section L.1, Volume V Pricing for Sample Task Order, states “All pricing shall match the proposed rates submitted in Volume I.” Which rate should offerors use when pricing the sample task order: a) the $40.00/hr rate specified in Attachment 12 or the proposed rate in Section B.1?
Removed sentence relative to pricing in Volume V, Solicitation Section L.1. Please price according to assumptions listed in Attachment 12.

AAR 323. Based on this amendment the following sentence in Section L’s Instructions on Pricing for Sample Task Order was removed: “All pricing shall match the proposed rates submitted in Volume I.” AAR 323, 352. This omitted sentence was the only indication in the solicitation that offerors were to use their Schedule B pricing in pricing the STO.

Finally, according to a July 10, 2007 amendment to the solicitation, each proposal would also be assessed for risk. Section M.6 advised offerors that the Source Selection Evaluation Team would assign a proposal risk rating according to the following scale:

Low Risk = Proposal weaknesses have little potential to cause disruption of schedule, increase in cost or degradation of performance. Normal contractor effort and normal Government monitoring will probably minimize any difficulties.
Moderate Risk = Approach has weaknesses that can potentially cause some disruption of schedule, increase in cost, or degradation of performance. However, special contractor emphasis and close Government monitoring will probably minimize difficulties.
High Risk = Approach has weaknesses that have the potential to cause serious disruption of schedule, increase in cost, or degradation of performance even with special contractor emphasis and close Government monitoring.

AAR 354.

Section B Price Schedule

Section B required offerors to complete a rate schedule featuring rates “fully burdened with indirect cost and profit” for each region for which they wished to compete. AAR 212.9 The solicitation further instructed offerors to “propose escalation rates for each option period following this schedule” which would be “applied against the current year price when an option is exercised.” Id. Schedule B rates were binding and were to be honored during the base period on any resulting task orders. Id. Offerors were to propose escalation rates for each option period, and those rates would be applied against the base year price, when an option was exercised. Id. Offerors were required to “document all assumptions made in estimating the price.” AAR 351.

Section B included 29 contract line items (“CLINs”) for various debris removal services, broken out by the type of equipment to be used, such as dump trucks, wheel-loaders, and knucklebooms. AAR 212-15. With regard to labor, CLINs 0001 and 0006 required offerors to propose an hourly price for an “Operations Manager” to perform mission planning and mission execution, respectively. AAR 212. Similarly, CLIN 0002 covered the “Operations Planner,” and CLIN 0003 related to an “Environmental Health and Safety Manager.” Id.

Section B advised that offerors’ proposed rates “should be fully burdened with indirect cost and profit.” Id. Offerors were instructed to submit rates for the base period only and to “propose escalation rates for each option period following this schedule.” Id.; AAR 215. Proposed escalation rates were “applicable to the prime and all key team subcontractors[’] labor rates.” AAR 215.

Sample Task Order

The original STO envisioned a Category III hurricane in North Carolina with localized flood damage as a result of the storm. AAR 292, 294. Offerors were to propose methods of removing and reducing a total of 2,250,000 cubic yards of debris from three different sectors: an urban sector with high-density debris and limited access for larger equipment, a semi-urban sector with medium-density debris and minor access limitations for larger equipment, and a rural sector with low-density debris and open access for larger equipment. AAR 292, 294.

The STO was organized into ten sections: (1) General; (2) Description of Work; (3) Services; (4) Performance Schedule; (5) Equipment; (6) Debris Management; (7) Reporting; (8) Handling and Collection of Household Hazardous Waste; (9) Disposal Site Design, Management and Facilities; and (10) Performance of Work. AAR 292-98.

The STO’s Performance of Work section stated that “[t]he Contractor agrees to complete the work in a professional, workmanlike manner and within the scope of work guidelines set forth above based on the unit pricing submitted by the Contractor in the Bid Schedule.” AAR 298. This referenced “Bid Schedule” was included in the STO, was titled “Bid Schedule for Debris Removal and Reduction,” and consisted of 81 line items subdivided into sub-CLINs describing the different sectors and haul distances. AAR 299. The STO CLINs were not the same as Schedule B CLINs. For example, CLINs 0004 and 0005 of Schedule B covered the use and operation of the Automated Debris Management System (“ADMS”), CAR 212, whereas CLIN 0006 of the STO included ADMS requirements, AAR 299. In the STO, CLINs 0004 and 0005 covered “Life Support Base Camp” and “Fuel and Operations Support for Remote Location,” respectively. AAR 299.

The STO Bid Schedule required offerors to list their average daily production rate for each sector, and the unit price and total price for 31 line items. AAR 299. CLINs 0007 through 0011, and 0013 and 0014 were further broken down into sub-CLINs detailing the type of sector and haul distance.

Recompetition

On August 6, 2009, the Government held a brief conference call to inform offerors that it would be conducting written discussions related to the recompetition. CAR 58. On August 21, 2009, the CO notified offerors in the affected regions in writing that the agency was in the process of evaluating their proposals for the recompetition. Id. The agency explained that offerors’ final price proposals from the original competition had been evaluated against a new Independent Government Estimate (“IGE”) formulated on August 13, 2009, under the direction of Colonel Michael C. Wehr. Both Phillip G. Hegwood, Chief of the Cost Engineering Section, and Douglas J. Kamien, Chief of Planning, Programs, and Project Management, recommended approval of the revised IGE. See, e.g., CAR 58; see also CAR 106-26. The revised IGE estimated a base-year price of $11,584,310.44 for Region 6a and $12,462,320.71 for Region 6b. CAR 106, 316, 330.

For Region 6a, the evaluated prices from the original competition were as follows:

Contractor Total Price
Ceres $[Redacted]
ECC $[Redacted]
Phillips and Jordan $[Redacted]
AshBritt $[Redacted]

CAR 1132. The final prices and escalation rates for Region 6b were the same except that ECC did not submit an offer for this region. Id.

On August 10, 2009, the CO sought an independent review of the offerors’ pricing information from 2007, stating that he “would feel more comfortable since DCAA has abandoned me.” CAR 2105. Mr. Black was referred to Ms. Tina Guillot, Chief of the Supply & Services Division Branch of the Army Corps of Engineers in St. Paul, Minnesota. CAR 2104. Ms. Guillot provided Mr. Black with her analysis of the offerors’ pricing on August 18, 2009. CAR 2102.

Initial Discussions

On August 21, 2009, the agency issued letters to all offerors by way of an initial round of discussions. Each letter contained identical language regarding the nature of feedback provided:

The feedback on your existing offer is enclosed for your review. Enclosure 1 provides feedback on the non-cost evaluation. Enclosure 2 provides feedback on the price evaluation. The price evaluation was made against a new Independent Government Estimate dated August 13, 2009. Also, please note that the government intends to use a new non-cost evaluation team to evaluate this recompetition revision to your existing proposal. This feedback is intended to improve your existing offer. Subsequent strengths and weaknesses may be found by the new evaluation team. You are encouraged to present the best proposal you can in response to the stated evaluation factors. Your Sample Task Order price was determined fair and reasonable. No update to the price is required for this evaluation.

CAR 58. Both enclosures were tailored to each offeror’s proposal.

By way of “price feedback,” the agency reproduced and highlighted offerors’ Schedule B proposals to indicate whether each proposed unit price was below or above the IGE. See, e.g., CAR 63-72. The agency did not disclose either the overall IGE or the IGE for individual CLINs. The majority of Ceres’ unit prices were lower than the IGE. See id. For Region 6a, Ceres’ proposed unit price exceeded the IGE for CLIN 0005AA, “Drive Check-In/Truck Certification Point,” CLINs 0009AA and 0009AB related to debris pickup, and CLINs 0027AA and 0027AB related to self-contained systems for air curtain burners. CAR 69-70. For Region 6b, Ceres’ proposed unit prices exceeded the IGE for CLIN 0005AA, “Drive Check-In/Truck Certification Point.” CAR 71-72.

With respect to the Sample Task Order (“STO”), in these discussion letters, offerors received detailed non-price feedback regarding strengths and weaknesses in their technical approach to the scenario. The only information provided regarding STO pricing was contained in the identical letters that accompanied the individualized feedback — offerors were all informed that STO prices had been “determined fair and reasonable” such that no updates to those prices were required. See, e.g., CAR 58.

Final Revised Proposals Submitted on August 28, 2009

The offerors’ final revisions to their proposals for the recompetition were due on August 28, 2009. See, e.g., CAR 59. Ceres, ECC, AshBritt, and Phillips and Jordan submitted revised proposals for Region 6a, and Ceres, AshBritt, and Phillips and Jordan submitted revised proposals in Region 6b. CAR 1151.10 The prices and escalation rates included in the revised proposals for Region 6a were as follows:

Contractor             Escalation Rates    Total Price
ECC                    [Redacted]          $[Redacted]
Phillips and Jordan    [Redacted]          $[Redacted]
AshBritt               [Redacted]          $[Redacted]
Ceres                  [Redacted]          $[Redacted]

CAR 240. AshBritt did not propose an escalation rate as part of its revised proposal. The prices and escalation rates included in the revised proposals for Region 6b were the same except that ECC did not submit an offer for this region. Id.

Ceres’ proposed prices for both Region 6a and 6b increased from its original prices submitted with its 2007 proposal. Ceres’ August 2009 proposal — totaling $[Redacted] — represented a $[Redacted] increase over its original 2007 proposal. Compare CAR 240, with CAR 136-39. AshBritt, by contrast, lowered its proposed price by [Redacted] — from [Redacted] to [Redacted]— representing a [Redacted] drop. P & J also lowered its proposed price from [Redacted] in 2007 to [Redacted] in August of 2009 — a difference of [Redacted], or [Redacted].

Evaluation of Initial Revised Proposals

The Chief of the Supply & Services Division’s Independent Price Analysis

On September 3, 2009, Mr. Black sent Ms. Guillot pricing information from AshBritt, CrowderGulf, ECC, and Phillips and Jordan’s revised price proposals for Region 5, and requested that Ms. Guillot “complete a review like [she] did last time.” CAR 1696. In particular, he inquired whether she could “determine [that the] new proposals [were] fair and reasonable.” Id.

Mr. Black advised Ms. Guillot that “[a]s with the original competition, the pricing for ADMS is messed up again,” and reported that he had included the original Price Negotiation Memorandum to familiarize Ms. Guillot with ADMS pricing issues in the prior procurement. Id. In addition, Mr. Black observed that “[s]ome of the offerors did not bother updating the rates for some regions.” Id.

Ms. Guillot provided her findings in an email dated September 17, 2009. Specifically, Ms. Guillot stated:

The previous competition, and the current competition, both have an escalated IGE. Although an IGE is not required (it is required for FAR Part 36, construction), [its] use limits the establishment of a competitive range in this procurement, and does not serve its intended purpose.

CAR 342. Ms. Guillot further stated:

If all offerors come in below the IGE, this in itself does not constitute all offerors being in a competitive range. The IGE should have been revised based on the previous and historical contract pricing, and should have also taken into consideration inflation and escalation rates, if an IGE was to be used. This would have provided a more accurate tool to assist in the determination of the competitive range.

Id. (emphasis added). Ms. Guillot did not explain her assessment of the IGE in any greater detail. Despite her determination that the IGE was not useful for determining a competitive range, Ms. Guillot concluded that the agency “received adequate competition, and as a result of that competition, a competitive range can be established.” Id. She noted further that “[f]air and reasonable pricing can be determined based on adequate competition.” CAR 343. Ms. Guillot said she would have eliminated [Redacted] from the competitive range, but that all other vendors in the established competitive range were “between [Redacted] and [Redacted] difference from one another. Averaging Ceres, ECC, and Phillips & Jordan’s proposed pricing results in an average overall total proposed amount of [Redacted], base including options.” CAR 2100.

Ms. Guillot also highlighted several perceived discrepancies in the offerors’ Schedule B prices. First, she observed that AshBritt “did not propose escalation rates ... and is low in comparison to all [offerors’] proposed pricing.” CAR 2099. In addition, Ms. Guillot highlighted the disparity between AshBritt’s 2007 proposal and the proposal it submitted for the recompetition, observing that:

[B]ased on the previous procurement history, [AshBritt] proposed approximately [Redacted] for Region 5, for a base and 4 options, [Redacted], just 2 years ago. (from approximately [Redacted] to [Redacted] this year??) This proposed pricing history contradicts current proposed pricing and raises the question of purposely low-balling on their proposal in order to win a government contract.

CAR 342. Ms. Guillot concluded that AshBritt’s “price [was] so low as to question whether there may be an unknown risk to the government if awarded to AshBritt.” Id. Ms. Guillot also reported that the five contract specialists she had consulted “questioned the AshBritt low bid, [Redacted], and some suggested the apparent risk involved as well.” CAR 2100.

Mr. Black responded “The info is very helpful. It [is] exactly what I needed. Your ultimate conclusions were a little different from mine.” CAR 2098.

Determination and Findings

In its September 18, 2009 Determination and Findings regarding the offerors’ revised final proposals for all recompeted regions, the Source Selection Evaluation Board (“SSEB”) documented its concerns regarding the offerors’ proposed Schedule B pricing for Region 5. The SSEB used Region 5 as “a representative sample of the price evaluation.” CAR 345. First, the SSEB was concerned that “the proposed prices from AshBritt, Inc. and ECC are well below the other offers.” Id. In addition, the SSEB noted that AshBritt’s and ECC’s revised prices were substantially lower than the prices proposed in their 2007 final proposals. After comparing AshBritt’s and ECC’s revised proposals from 2007 with their latest proposals, the SSEB determined that “it is clear that the prices in AshBritt, Inc.’s offers are not comparable. Many proposed unit prices were cut over [Redacted]. Overall, AshBritt’s proposed prices have dropped about [Redacted].” CAR 345-46. Though the SSEB determined that ECC’s offers were “comparable for most proposed unit prices,” overall, “ECC’s proposed prices have dropped about [Redacted] from the previous offer.” CAR 346. The SSEB was unable to identify “any information in either the AshBritt, Inc. or the ECC re-competition offer to explain this significant drop in proposed prices.” Id.

Finally, the SSEB reported that AshBritt’s and ECC’s proposed unit prices were below those estimated in the IGE. The SSEB observed:

In the revised proposal, all AshBritt, Inc.’s proposed unit prices are below the Independent Government Estimate. Almost all of ECC’s proposed prices are below the Independent Government Estimate. [Redacted].

CAR 345.

The SSEB also expressed concern over the drop in STO prices, stating:

In my original letter requesting these re-competition proposals, I had informed the offerors that the sample task order prices were reasonable and there was no need to update those prices. I had not expected these drastic price reductions within these re-competition offers. In light of these significant price reductions, the sample task order prices in the offers are now obsolete.

CAR 346. The CO concluded “[i]n order to complete an integrated assessment of the latest offers, the government needs these offerors to revise their sample task order prices to reflect the latest Section B proposed prices.” Id. (emphasis added). Based on its findings, the SSEB determined that it was necessary to enter into discussions. Id.

Recompetition Evaluation Report

The SSEB’s Re-Competition Evaluation Report, signed by Mr. Black on October 8, 2009, summarized evaluations conducted by a four-member technical evaluation team, a one-person cost/price evaluation “board,” and a five-member past performance evaluation team. See CAR 233. By way of an overview, the report advised that “[t]he cost/price evaluation team reviewed the Firm-Fixed Price rates in Section B.1 of the solicitation to establish the lowest offeror. The sample task order prices and time and material rates in Section B.1 were reviewed for cost realism and price reasonability.” CAR 234.

With regard to the Cost/Price Evaluation, the SSEB reported that “the team noticed right away that the prices proposed to operate the [ADMS] were not consistent.” CAR 236. The team also “noticed the large disparity” among the offerors’ total evaluated prices and the large disparity between the offerors’ prices and the IGE. Id. (“All the existing offers are well below the IGE for all regions.”). The SSEB also noted that “ECC and AshBritt’s offers are significantly lower than the other three offerors.” Id.

The SSEB provided an integrated assessment of revised proposals for Region 6a. In all technical factors and past performance all offerors received the highest rating, outstanding, and were deemed low risk for both Regions 6a and 6b. The prices were:

Contractor Price
ECC [Redacted]
P & J [Redacted]
AshBritt [Redacted]
Ceres [Redacted]

CAR 236. All offerors — ECC, P & J, AshBritt, and Ceres — received the highest rating available for all of the non-cost factors. Specifically, all offerors were rated “Blue” or outstanding under the Past Performance factor, the Management/Operations Plan factor, and the Technical Approach to Sample Task Order factor. Id. In addition, all offerors were rated “Outstanding” with regard to the Small Business Subcontracting Plan. Id. The SSEB’s assessment of offerors’ revised proposals for Region 6b was identical except that ECC did not submit a proposal for that region. CAR 236. The SSEB summarized the prices for Region 6b as follows:

Contractor Price
P & J [Redacted]
AshBritt [Redacted]
Ceres [Redacted]

Id.

The SSEB concluded that initial proposals received in response to this recompetition were “extremely competitive.” CAR 237. Nevertheless, the SSEB could not formulate a final rating for the offers because of “problems experienced during the price/cost evaluation.” Id. The SSEB concluded that “[d]iscussions are needed to improve understanding of the proposed prices, provide feedback to the offerors on their strengths and weaknesses, and increase the overall value of the offers.” Id.

Pre-Negotiation Objectives Memorandum

In its Pre-Negotiation Objectives Memorandum, dated October 9, 2009 — one day after the SSEB’s Recompetition Evaluation Report — the Contracting Officer described the SSEB’s evaluation of revised proposals. In an overview of the agency’s analysis of proposals, the CO summarized Ms. Guillot’s independent review of the price proposals:

The reviewer concluded that fair and reasonable pricing can be determined based on adequate competition. But, the reviewer warned that the proposed price for AshBritt, Inc. seems so low as to question whether there may be an unknown risk to the Government if AshBritt, Inc. were selected for award. The reviewer recommends more analysis of the pricing is needed and a second round of discussions may lower the risk of this acquisition.

CAR 229 (emphasis added).

The CO then identified several areas of the offerors’ proposals that he believed could be improved through a second round of discussions. First, he noted problems with Section B pricing, stating first that “[s]everal offerors failed to fill out the bid schedule correctly.” CAR 229. [Redacted]. Id. Accordingly, the CO planned to “identify the error and ask for a correction during discussions.” CAR 229. In addition, the CO observed that “[a]ll of the proposed offers are well below the Independent Government Estimate (IGE).” Id. The CO determined, however, that:

[a]t this time, the Government will not provide any further feedback on the Section B pricing. The sample task order shall be adjusted to reflect the current competition and the Government will use that information to further analyze the proposed pricing in the offers.

Id. With respect to non-cost weaknesses, the CO stated that the Government would notify offerors of those weaknesses during the second round of discussions. Id.

The memorandum further described the “significant reduction” in prices proposed by Phillips and Jordan, ECC, and AshBritt. According to the CO, the agency was “concerned about the unknown risks associated with the significant drop in proposed prices from ECC and AshBritt, Inc.” Id. The CO continued, “In order to execute an integrated assessment of these offers, the Government will revise the Sample Task Order assumptions and request an updated Sample Task Order price based on these new assumptions.” Id.

In addition to the concerns with the low offers from ECC and AshBritt, the memorandum documented concerns over the drop in Phillips and Jordan’s proposed price for the Automated Debris Management System. The CO reported that

the September 2007 offer from Phillips and Jordan for CLIN 0005AC, Loading and Drop Sites for ADMS, was [Redacted] per day. The latest offer has a proposed price for the same CLIN as [Redacted] per day. The offer has information about a new system owned by Phillips and Jordan, but this information is not detailed enough to support a [Redacted] drop in proposed price. The IGE for CLIN 0005AC in Region 5 was $19,020 per day.

CAR 229-30.

The memorandum noted that the Government considered AshBritt’s proposed [Redacted]. CAR 230. The CO concluded that the Government would ask AshBritt to confirm its escalation rate during the next round of discussions.

To lower the risks associated with this acquisition, the Contracting Officer recommended “more analysis of the pricing.” CAR 229. To further assess the prices, the agency stated that it would “revise the Sample Task Order assumptions and request an updated Sample Task Order price based on these new assumptions.” Id. The CO in his Price Objective Memorandum dated October 9, 2009, stated:

The new Sample Task Order assumptions will shift the setting of the scenario into Region 5 and the Government will ask all offerors to make the Sample Task Order price proposal reflect the recent prices offered for Region 5 in the Section B CLINS.

Id. (emphasis added). According to the CO’s “Plan of Action” as described in the Price Objective Memorandum,

[T]he Government will ask all offerors to make the Sample Task Order price proposal reflect the recent prices offered for Region 5 in the Section B CLINs. With this new information, the Government can better evaluate the reasonableness and realism of the new proposed [Schedule B] prices for ECC and AshBritt, Inc submitted on 28 AUG 2009.

Id.11 Similarly, after noting the drop in Phillips and Jordan’s proposed price for ADMS, the CO concluded that “[t]hrough evaluation of the revised Sample Task Order, the Government is able to complete an integrated assessment of the offer.” Id.

Second Round of Discussions

On October 2, 2009, the agency notified the offerors in writing that it intended “to reopen discussions to resolve disparities between the Independent Government Estimate and [their offers].” CAR 224. In preparation for the second round of discussions, offerors were instructed to “assemble [their] proposal teams and prepare to enter into discussions and submit revised proposals for these regional awards and reach back assignments.” See, e.g., CAR 224. Every offeror received the same letter.

On October 16, 2009, the agency sent discussion letters to the offerors. Id.; CAR 348-407. These discussion letters included individualized comments on Volumes I-IV of the offerors’ proposals, and requested that the offerors revise their proposals. Ceres received the following feedback:

Volume II: The small business subcontracting plan does not affirmatively state that indirect costs are either included or excluded from the proposed goals.
Volume III: Letters of commitment for subcontractors not included in the offer.
Volume III: Past success in safety is not addressed in the proposal in adequate detail for debris work for the last three years.
Volume IV: Some of the information about proposed personnel in the sample task order is confusing, (i.e. missing resumes and the same person with multiple roles assigned)
Volume IV: The crew composition in the sample task order is not well defined.
Volume IV: The proposal does not include a list of local subcontractors from the affected area in the sample task order.

CAR 360.

The agency sent AshBritt different feedback, tailored to its offer:

Volume I: The Section B rates in your offer dated August 28, 2009 did not include Time and Material rates for CLIN 0007-0020 and CLIN 0022-CLIN 0027. [Redacted]. Please correct this discrepancy in the final proposal revision.
Volume I: You have proposed 0% escalation for all Option Years in the proposal. Please confirm you intend not to raise your proposed prices for the option years.
Volume II: The small business subcontracting plan does not provide details on how you will achieve goals or improve contribution by small businesses at the subcategory level.
Volume III: The proposal did not include enough details concerning the travel or deployment of personnel during an event.
Volume III: The proposal states the U.S. Army Corps of Engineers (USACE) is responsible for the removal of all hazardous material and Household Hazardous Waste (HHW) on a daily basis using other contractors and that USACE will be responsible for baseline testing for Temporary Debris Storage And Reduction Sites (TDSR) for existing soil and water contamination. Please confirm that you are not willing or able to perform these services for the Government.
Volume III: The proposal did not include enough details concerning sectoring of subcontractors.
Volume IV: The proposal does not demonstrate an ability to respond to the event in less than 24 hours.
Volume IV: The proposal does not commit to single handling of material and does not commit to hauling debris to the final disposal site on the first pass.
Volume IV: The proposal includes an alternate Contractor Quality Control Systems Manager with less than 3 years of experience.

CAR 348-49.

Phillips and Jordan also received a letter. The agency provided the following feedback:

Volume II: Past performance in meeting small business subcontracting goals shows mixed results.
Volume III: Some letters of commitment were not signed.
Volume III: The CQC structure needs more discussion concerning subcontractors in the offer.
Volume III: The proposal includes a CQC systems manager who has not taken the CQC USACE course.
Volume IV: The sample task order response needs more discussion about the use of local subcontractors and strategies to locate new subcontractors from the affected area.
Volume IV: The proposal does not indicate that the safety and health plan is current.

CAR 396.

In addition, in these discussion letters, the Contracting Officer, Mr. Black, advised offerors that their revised proposals should address the agency’s revised STO and new STO Bid Schedule. Id. Using identical language in each letter, the CO stated that:

In addition to addressing the feedback detailed above, the Government has revised the Sample Task Order. The new Sample Task Order and Task Order Bid Schedule are enclosed with this letter. You shall update your offer to respond to the revised Sample Task Order. Your final revised offer shall include a revised Price for Volume V [pricing] in addition to updating the non-cost proposal in Volume IV [technical approach].

See, e.g., CAR 360. Neither the letter nor the enclosed revised STO informed offerors why the STO had been revised. In addition, the revised STO did not advise offerors that they should base their STO proposals on prices they had proposed in Schedule B.

The revised mock task order was attached to the letter. See, e.g., CAR 362. The revised STO envisioned a “Category III Hurricane storm event in Southern California” with some “localized flood damage” as a result of the storm. See, e.g., id. As before, the Revised STO required removal or reduction of debris from three sectors: an urban sector, a semi-urban sector, and a rural sector with low debris load and open access. See, e.g., id. The Revised STO updated the cost of fuel and the cost of freon removal and recycling — $4.50 per gallon and $10.00 each, respectively — and provided revised assumptions regarding labor rates and categories:

a. Unskilled labor $32.00/hr**
b. Skilled labor $37.00/hr**
c. HHW Skilled Labor $48.00/hr**
d. Equipment Operator $50.00/hr**
e. Truck driver $43.00/hr**
f. QC Site $43.00/hr**
g. Foreman $50.00/hr**
h. HHW Foreman $52.00/hr**
i. Superintendent $55.00/hr**
j. QC Manager $55.00/hr**
k. Operations Manager $65.00/hr**
** — these labor rates including fringe benefits are for use with this Sample Task Order only, actual task orders issued under this Contract will use the Department of Labor (DOL) labor rates applicable to the area the Contractor will be working.

See, e.g., CAR 362-63. Assumptions regarding debris loading and classification remained unchanged, but the Revised STO updated the method of reducing vegetative debris to include only reduction by grinding, not by air-curtain incineration. CAR 363, 364.

The Scope of Work appended to the revised STO outlined responsibilities and services, most of which remained unchanged from the original STO. Minor revisions included the omission of the requirement that contractors test and dispose of ash from incinerating vegetative debris, and the omission of other requirements regarding incineration operations. Compare CAR 365-66, with CAR 295-96. As before, the STO’s Performance of Work section required offerors “to complete the work in a professional, workmanlike manner and within the scope of work guidelines set forth above based on the unit pricing submitted by the Contractor in the Bid Schedule.” See, e.g., CAR 368. The Bid Schedule itself did not require that offerors insert Schedule B pricing. Responses to the revised STO were due by October 30, 2009. See, e.g., CAR 361.

Final Written Discussions

Between October 21 and 23, 2009, several offerors submitted questions to the agency, which the agency answered in a summary letter to all offerors. CAR 408. ECC submitted questions requesting “clarification of government feedback and mechanics of how to respond to [the] new sample task order.” Id. AshBritt, Inc. submitted questions focusing on “defining terms the Government used in the discussions letter.” Id. Ceres submitted one question regarding “assumptions made in the sample task order.” Id. The SSA concluded that

[m]any of the questions asked apply to all offerors as similar terms were used in each discussions letter and the new sample task order must be incorporated into the final revised offer. The best course of action is to treat this as pre-proposal questions and provide all offerors with a copy of the question and answer.

CAR 408. The agency informed offerors that ECC had asked questions that did not fit this paradigm, but rather “were very specific to feedback from their offer.” CAR 350. Consequently, the agency stated that it would “attempt to correct ECC’s misunderstanding of [the] feedback.” CAR 408.

In a memorandum considering whether to delay receipt of final proposals due to the receipt of questions from the offerors, the Contracting Officer documented his consideration of the proper course of action. The SSEB considered the following options:

(1) closing discussions and not responding at all to the offerors; (2) continuing discussions by responding individually to the offerors that asked questions; or (3) continuing the discussions by responding to the offerors that had asked questions and providing to all offerors the questions and answers to “generic” questions common to all, while also answering questions that were unique to each individual offeror.

CAR 409. Ultimately, the SSEB decided that the third option was the correct position because it enabled the agency to “conduct meaningful discussions with all offerors,” “tailor the discussion comments to each individual proposal,” and “treat all offerors equally since the information common to all was provided to all.” Id.

On October 28, 2009, the agency sent identical final discussion letters to offerors with its responses. See, e.g., CAR 411-13. One question concerning the Revised STO submission requested clarification regarding the list of labor rates to be used for the STO. CAR 412. The question asked whether, given the instruction that “these labor rates including fringe benefits are for use with this Sample Task Order only,” the offeror could assume “that the given hourly wages are inclusive of the factors such as fringe benefits, workers[’] compensation, overtime allowance, and other similar factors.” See, e.g., CAR 415. CO Timothy Black responded that offerors “should treat the wage rates the same as ... in your original proposal. Our intent is to make these hourly rates reflect what you would encounter in a Department of Labor Wage Determination for the County you will be executing the debris mission.” See, e.g., CAR 416.

Revised Final Proposals

The revised proposals were originally due on October 30, 2009, CAR 361, but after receiving and responding to questions submitted by the offerors, the agency moved the due date to November 4, 2009. See, e.g., CAR 408, 411-25.

The offerors in the competitive range submitted final proposal revisions on November 4, 2009. CAR 1009. The offerors’ revised proposal prices for Region 6a were as follows:

Offeror                Price
ECC                    [Redacted]
Phillips and Jordan    [Redacted]
AshBritt               [Redacted]
Ceres Environmental    [Redacted]

CAR 1014. Revised price proposals for Region 6b were similar, except that ECC did not submit an offer for that region, and Ceres’ price increased slightly:

Offeror Price
Phillips and Jordan [Redacted]
AshBritt [Redacted]
Ceres Environmental [Redacted]

Id.

Prices in ECC’s revised Volume I for Region 6a, AshBritt’s offers in Regions 6a and 6b, and Phillips and Jordan’s offers in both regions [Redacted]. CAR 240, 1014. Ceres’ total proposed price increased by approximately [Redacted] in Region 6a and by approximately [Redacted] in Region 6b.

The offerors’ price proposals for the revised STO were as follows:

Offeror                Price
CrowderGulf            [Redacted]
Ceres Environmental    [Redacted]
ECC                    [Redacted]
Phillips and Jordan    [Redacted]
AshBritt               [Redacted]

CAR 1115. The IGE price for the revised STO was $102,561,045.30. CAR 1122.

Ceres’ Revised Proposal

Ceres submitted its final proposal revision on November 4, 2009. According to the Government, Ceres “was the only offeror that chose to update its prices in Section B. The net effect of these proposal changes lowered the evaluated price for Region 5 and raised the evaluated price for Regions 6a and 6b.” CAR 1131. Thus, Ceres’ proposed price increased from [Redacted] to [Redacted] for Region 6a, and from [Redacted] to [Redacted] for Region 6b.

Ceres revised its prices for most of the Schedule B CLINs. Ceres’ proposed prices decreased with respect to three of the ADMS sub-CLINs but remained the same for other ADMS sub-CLINs. With the exception of CLIN 0013 regarding knuckleboom operation and CLIN 0021 covering an inspection tower, Ceres’ unit prices increased or remained unchanged from its August 28, 2009 revision. Compare CAR 561-64, with CAR 430-34. Ceres’ proposed STO prices also increased.

Ceres’ proposal for the revised STO included a four-page introduction that provided an overview of the assumptions underlying its calculations. Ceres explained that its STO proposal “uses wage rates provided by the USACE documents titled ‘Debris ACI Mock Task Order’ ... and uses assumptions provided in the same document.” CAR 592. Ceres’ STO proposal “also use[d] equipment rates and assumptions provided by Ceres.” Id. Ceres noted that it was “providing the Corps with Schedule B pricing as requested, and Ceres matched our projected requirements with Rostan Solutions [a named subcontractor for ADMS] unit pricing to arrive at Mock Task Order pricing.” CAR 600. Accordingly, in its revised STO proposal, Ceres included a copy of its proposed Schedule B pricing for Region 5 as a backup for pricing assumptions that it used in calculating proposed prices for the STO. CAR 618-19.

Ceres explained its calculations of its proposed STO prices in a more detailed “Assumptions Documented” section that defined terms, components, and aspects of the project. See CAR 599-652. In addition to breaking down calculations related to individual line item numbers, Ceres’ Assumptions section described how other rates for the three sectors were calculated with respect to distance traveled and types of crew used. Finally, with regard to labor rates, Ceres “added factors to the base labor rates to include overtime, insurance, tax, and other burden factors to arrive at a fully burdened labor rate per job description.” CAR 600.

AshBritt’s Revised Proposal

AshBritt did not change its Schedule B prices, but proposed a total of [Redacted] for the revised STO. CAR 1115. In addition to the line item proposed prices, AshBritt’s STO proposal included a section called “Assumptions for Sample Task Order” that outlined pricing and technical assumptions on which AshBritt’s proposal was based. AshBritt organized its pricing assumptions according to the Assumptions provided in the Revised STO. For example, the first assumption provided in the STO discussed the strength of the hurricane and the local flooding that resulted, noting that the revised STO envisioned a “Category III Hurricane storm event in Southern California.” See, e.g., CAR 362. Similarly, AshBritt listed its first series of assumptions under the heading “Category III Hurricane in Southern California” as follows:

1. Category III Hurricane in Southern California
[Redacted] [Redacted]
[Redacted] [Redacted]
[Redacted] [Redacted]
[Redacted] [Redacted]
[Redacted] [Redacted]
[Redacted] [Redacted]
[Redacted] [Redacted]
[Redacted] [Redacted]
[Redacted] [Redacted]

CAR 751.

AshBritt then described assumptions regarding the sectors, the temporary debris storage and reduction sites, infrastructure damage, and labor rates, among other details featured in the STO. For example, AshBritt assumed that [Redacted]” Id. With regard to debris removal, [Redacted]. CAR 752.

P & J’s Revised Proposal

Phillips and Jordan similarly did not change its Schedule B prices, but proposed a total of [Redacted] for the revised STO. See CAR 1115. Rather than filling out the STO Bid Schedule and providing explanations of its calculations in a separate section, P & J separated each CLIN and sub-CLIN into labor and equipment costs to determine and present a “total task cost” for each sub-CLIN. P & J’s proposal also included the indirect cost spread for each line item.

For example, Phillips and Jordan broke CLIN 0004 (Life Support Base Camp) into several labor and equipment components. P & J estimated that the base camp would require two foremen, four skilled laborers, and one truck driver. P & J calculated the base and loaded rates, as well as the number of days and anticipated overtime, for each of the labor categories. CAR 809. P & J provided both a total and a unit cost for the labor categories, arriving at a [Redacted] cost for labor. Id. With regard to equipment, P & J estimated the number of items, duration of use, rate per day, and other costs to arrive at a total equipment cost of [Redacted]. Finally, P & J added the labor and equipment components to arrive at a “total task cost” for CLIN 0004 of [Redacted]. Id.

SSEB Consensus Meeting Notes

In its November 16, 2009 Consensus Meeting Notes, the SSEB noted that AshBritt’s “[p]rice for Section B and mock task order prices are not consistent.” CAR 1628. The SSEB observed:

Hourly equipment costs Sample Task Order CLIN 0015 was priced at [Redacted] each and the Section B price was [Redacted] each. CLIN 0014 was priced at [Redacted] each and the Section B price is [Redacted]. ADMS price for CLINs 0006AA, 0006AC, and 0006AD sample task order price does not match Section B prices.

Id. The SSEB concluded “All proposed Sample Task Order hourly rates for equipment plus operator are suspect due to inconsistency with Section B pricing.” Id.

Similarly, with respect to P & J’s STO pricing, the SSEB noted, “Price for Section B and mock task order prices are not consistent.” CAR 1695. In particular, the SSEB observed:

All proposed Sample Task Order hourly rates for equipment plus operator are suspect due to inconsistency with Section B pricing.... Sample Task Order ADMS CLINs 0006AA-0006AD prices did not match Section B proposed rates.... Sample Task Order CLIN 0015 price of [Redacted] did not match Section B proposed rate of [Redacted].

Id.

The SSEB’s Re-Evaluation Report

In its November 17, 2009 Re-Evaluation Report, the SSEB reported the results of the evaluation. In analyzing final proposal revisions submitted on November 4, 2009, the voting members of the technical evaluation team read each proposal and completed individual assessments. The team met and developed consensus ratings for all the factors and subfactors. The SSEB assigned new proposal risk ratings and summarized the technical evaluation team’s consensus ratings for all factors and subfactors. As before, all offerors received the highest available ratings for Past Performance, Management/Operations Plan, Small Business Contracting Plan, and the technical evaluation of the STO. CAR 1010.

However, both Phillips and Jordan and AshBritt, previously rated “low” risk, were downgraded to “moderate” risk. Id. The solicitation defined “moderate risk” as follows:

Moderate Risk = Approach has weaknesses that can potentially cause some disruption of schedule, increase in cost, or degradation of performance. However, special contractor emphasis and close Government monitoring will probably minimize difficulties.

AAR 354.

In explaining the purpose of the final round of discussions, the SSEB’s Re-Evaluation Report stated that “[t]he primary focus of [the latest] round of discussions was to attempt to determine the reasonableness of the latest proposed prices for Section B.” CAR 1011. According to the Report, “the SSA concurred with the decision to enter into discussions to obtain more information to help determine [Section B] price reasonableness.” Id.12 As noted above, the CO did not communicate to offerors the Government’s intent to use the revised STO to ascertain the realism of Schedule B prices.

The SSEB reported that the team noticed “right away” that many offerors’ proposed prices to operate the ADMS were not consistent with other offers. Id. Specifically, the proposed prices for Schedule B CLIN 0005AC for ADMS for CrowderGulf and Ceres Environmental Services were between [Redacted] and [Redacted] per day. By contrast, according to their Schedule B submissions, AshBritt, ECC, and Phillips and Jordan proposed between [Redacted] and [Redacted] per day to operate the ADMS. According to the SSEB’s report, the team also noticed “the large disparity” among offerors’ total evaluated prices. In particular, the SSEB observed that ECC’s and AshBritt’s offers for Region 6a were “significantly lower” than the other offerors’ proposed prices. Id.

The SSEB explained that offerors had been required to respond to a new STO that featured a setting in Region 5: “Since all offerors in the competitive range submitted offers for Region 5, the new sample task order setting was in Region 5.” Id. Accordingly, the Government altered the assumptions “to reflect the prohibition of burning debris in California.” Id.

With regard to Phillips and Jordan’s revised STO proposal, the SSEB found that the company’s proposal “did not reflect a clear understanding of the requirements of the sample task order,” which “elevated the risk of [its] offer.” CAR 1012. In particular, P & J “failed to utilize the correct bid schedule for the new sample task order,” and included incineration in its technical solution despite the omission of debris reduction by air curtain incineration. Id. Consequently, P & J’s proposed prices for related CLINs were “inaccurate due to differences in quantities.” Id.

The SSEB found that AshBritt responded to the revised STO correctly, “but the proposed pricing of the sample task order was not in line with the contract rates proposed in Section B.” Id. In addition, the SSEB noted:

AshBritt’s offer included a robust mobilization plan and offered a price of [Redacted] for mobilization and demobilization. Finally, AshBritt’s proposed prices for ADMS CLINs in the sample task order were significantly higher than the rates listed in Section B. These inconsistencies in proposed prices for the sample task order elevated the risk level of [its] proposal. The price analysis concluded that AshBritt’s sample task order prices are not consistent with the unique methods of performance and materials described in the offeror’s technical proposal.

Id. With regard to ECC, the SSEB found that ECC priced the STO correctly but had left intact references to incineration in its technical solution and production rate calculation. Id.

The SSEB observed that Ceres had included an extensive mobilization plan with [Redacted] costs. The SSEB did not comment on Ceres’ prices or understanding of the project. Similarly, with respect to CrowderGulf’s proposal, the SSEB noted only that its “proposed price for the sample task order seemed to be most in line with [its] offered rates in Section B.” Id.

The SSEB report also illustrated the price realism analysis conducted by the agency as follows:

a. ADMS                     Sum of All ADMS Costs in       Cost Per Cubic Yard As Proposed
                            Sample Task Order CLINs        in Sample Task Order
1. CROWDER GULF             [Redacted]                     [Redacted]
2. Ceres Environmental      [Redacted]                     [Redacted]
3. ECC                      [Redacted]                     [Redacted]
4. Phillips and Jordan      [Redacted]                     [Redacted]
5. AshBritt                 [Redacted]                     [Redacted]
Notes: Range from [Redacted] to [Redacted] for 90 days of support. AshBritt is the only firm that the sample task order price for ADMS did not match Region 5 Section B proposed price.
CAR 1012. The SSEB then summarized average and total costs of ADMS among all offerors:
Average Daily Cost for ADMS: [Redacted]
Average Proposed Cubic Yard Hauled Per Day: [Redacted]
Average Proposed Cost Per Cubic Yard for ADMS: [Redacted]
Total Cubic Yards of Debris Estimated: [Redacted]
Production Rate Required to Complete in 90 Days: [Redacted]

Id. The SSEB observed that the average proposed cost of [Redacted] per cubic yard for ADMS was “much lower than experienced during Hurricane Ike and Gustav,” but reasoned that those hurricanes were smaller events such that there were fewer cubic yards against which offerors spread their costs. Id.

The SSEB’s analysis continued with an examination of offerors’ debris removal from public roads in the Revised STO:

b. Debris Removal           Sum of all hauling costs in    Cost Per Cubic Yard As Proposed   Percent of Total Price for
   from Public Roads        Sample Task Order CLINs        in Sample Task Order              Sample Task Order
1. CROWDER GULF             [Redacted]                     [Redacted]                        [Redacted]
2. Ceres Environmental      [Redacted]                     [Redacted]                        [Redacted]
3. ECC                      [Redacted]                     [Redacted]                        [Redacted]
4. Phillips and Jordan      [Redacted]                     [Redacted]                        [Redacted]
5. AshBritt                 [Redacted]                     [Redacted]                        [Redacted]
IGE                         [Redacted]                     [Redacted]                        [Redacted]

Id. The SSEB noted that “[i]n the evaluation of the prices in Section B, Crowder Gulf has the highest evaluated price and AshBritt has the lowest. Yet, in this scenario, AshBritt is one of the highest priced offers and Crowder Gulf is the apparent low offeror.” Id.

The SSEB found final proposal revisions received in response to the recompetition “extremely competitive” and ultimately concluded that, “[a]s demonstrated by the price analysis detailed in this report, the proposed prices were lower than expected, but cannot be determined to be unreasonable.” CAR 1013.

Memorandum Regarding Cost Realism Analysis of Mock Task Order

In a “Memo for Record” dated November 18, 2009, the SSEB Technical Advisor evaluated proposals based on the three criteria for cost realism analysis in the FAR. CAR 1128-29. FAR Subpart 15.4 provides, in relevant part:

Cost realism analysis is the process of independently reviewing and evaluating specific elements of each offeror’s proposed cost estimate to determine whether the estimated proposed cost elements are realistic for the work to be performed; reflect a clear understanding of the requirements; and are consistent with the unique methods of performance and materials described in the offeror’s technical proposal.

FAR 15.404-1(d)(1).

The results of the realism evaluation as to CrowderGulf, AshBritt, P & J, ECC, and Ceres were as follows:

CrowderGulf
• Realistic for the work to be performed. Cubic Yard pricing appears fair and reasonable. Costs appear to demonstrate a clear understanding of the work to be performed and the level of effort required.
• Reflect a clear understanding of the requirements. Costs used in the Mock Task Order follows the pricing in the bid schedule.
• Are consistent with the unique methods of performance and materials described in the offeror’s technical proposal. Mock Task Order, proposal and bid schedule are consistent.

AshBritt

• Realistic for the work to be performed. [Redacted]. Technical Proposal Volume IV, pages 3r through 7r provide and [sic] extensive mobilization plan and a tiered activation approach, but did not [Redacted].
• Reflect a clear understanding of the requirements. Unclear if they possess a clear understand[ing] of mock task work because they did not include [Redacted] in the mock task order.
• Are consistent with the unique methods of performance and materials described in the offeror’s technical proposal. Mock task order and bid schedule prices are not consistent. The reviewed mock task order prices were higher than the bid schedule.
P&J
• Realistic for the work to be performed. Costs do not appear to demonstrate a clear understanding of the work to be performed and the level of effort required.
• Reflect a clear understanding of the requirements. Unclear if they possess a clear understand[ing] of the mock task work because ash removal, incineration, and ash testing were included in the pricing.
• Are consistent with the unique methods of performance and materials described in the offeror’s technical proposal. Mock task order and bid schedule prices are not consistent. The reviewed mock task order prices were higher than the bid schedule.
ECC
• Realistic for the work to be performed. The cost proposal for the mock task order demonstrates a good understanding of the work and the level of effort required.
• Reflect a clear understanding of the requirements. Good understand[ing] of equipment and task required to perform the work, however, the mock task order and bid schedule vary significantly on the equipment rental prices.
• Are consistent with the unique methods of performance and materials described in the offeror’s technical proposal. Mock task order and bid schedule prices are not consistent. The reviewed mock task order prices were higher than the bid schedule.
CERES
• Realistic for the work to be performed. Prices do not represent a realistic account of the task needed to accomplish the work. For example, [Redacted]. Technical Proposal Volume IV, Section 4.A, pages 5 and 6 indicate mobilization of equipment from TX, MN and FL.
• Reflect a clear understanding of the requirements. Good understand[ing] of equipment and task required to perform the work, however, the mock task order and bid schedule vary significantly on the [Redacted].
• Are consistent with the unique methods of performance and materials described in the offeror’s technical proposal. Mock task order and bid schedule are significantly different.13

CAR 1128-29. Thus, the SSEB’s Technical Advisor advised that AshBritt’s, P & J’s, ECC’s, and Ceres’ proposed STO prices were not consistent with their Schedule B prices. The agency determined that only CrowderGulf’s proposed STO pricing followed its Schedule B proposal.

Price Negotiation Memorandum

The Price Negotiation Memorandum written by the CO, dated November 24, 2009, reiterated that the agency used the offerors’ responses to the revised STO to evaluate “reasonableness and realism.” CAR 1136. The CO noted that the Government had “asked all offerors to make the Sample Task Order price proposal reflect the recent prices offered for Region 5 in the Section B CLINs.” Id.14

The memorandum presented each offeror’s proposed escalation factors, multipliers, and total prices for each region as submitted in the original 2007 competition, the August 28, 2009 proposals, and November 4, 2009 final proposals. See generally CAR 1130-36. The offerors’ proposed prices can be summarized as follows:

Region 6a Pricing:

                            2008            8/28/09         11/4/09
Ceres Environmental         [Redacted]      [Redacted]      [Redacted]
Phillips and Jordan         [Redacted]      [Redacted]      [Redacted]
AshBritt                    [Redacted]      [Redacted]      [Redacted]
ECC                         [Redacted]      [Redacted]      [Redacted]
Region 6b Pricing:
                            2008            8/28/09         11/4/09
Ceres Environmental         [Redacted]      [Redacted]      [Redacted]
Phillips and Jordan         [Redacted]      [Redacted]      [Redacted]
AshBritt                    [Redacted]      [Redacted]      [Redacted]

CAR 1132-34. A note following these tables observed that Ceres

responded positively to our discussions letter. [Ceres was] the only offeror to update VOL I Section B pricing in the final offer. The total evaluated price in Region 5 was lower and the total evaluated price in Regions 6a and 6b were higher. In the final discussion letter, there was no feedback given on Section B pricing.

CAR 1134. With regard to AshBritt, the memorandum noted, “AshBritt updated [its] Time and Material rates as requested. The evaluated prices in the proposal from AshBritt remained constant from [its] previous submission on 28 Aug 2009.” CAR 1135. Similarly, the memorandum stated that “evaluated prices in the proposal from Phillips and Jordan remained constant from [its] previous submission on 28 Aug 2009.” Id.

The CO also summarized the results of the Sample Task Order analysis. The IGE for the STO was approximately $102 million, and offerors’ proposed prices for the revised STO were as follows:

1  CrowderGulf              [Redacted]
2  Ceres                    [Redacted]
3  ECC                      [Redacted]
4  Phillips and Jordan      [Redacted]
5  AshBritt                 [Redacted]

Id. The memorandum also reproduced the price realism analysis originally included in the SSEB’s Re-Evaluation Report.

The Price Negotiation Memorandum observed that “[a]ll of the proposed offers for the sample task order were below the IGE.” CAR 1136. The CO explained that “[t]he chart listed on the previous page [listing proposed STO prices] demonstrates that all proposals were below the IGE.” Id. In particular, “average haul rates for the scenario were lower than the average in the IGE.” Id. Ultimately, the SSEB concluded that “[a]ll revised final proposals in the competitive range compared favorably to the IGE.” Id.

Nevertheless, the CO was critical of prices submitted by AshBritt, P & J, ECC, and Ceres. The memorandum explained that the SSEB

could not verify that Phillips and Jordan, Ceres Environmental Services, Inc., AshBritt, Inc. and ECC had utilized their prices from Section B Region 5. Although the proposed prices to accomplish the sample task order were considered reasonable, [in] a real event, additional contract administration would be required to enforce the pre-priced rates in the task order.

Id. The memorandum continued:

In accordance with FAR 52.215-1(f)(8), the Government evaluated the offers to determine if the proposed prices were materially unbalanced between line items or sub-line items. Unbalanced prices exist when, despite an acceptable total evaluated price, the price of one or more contract line items is specifically overstated or understated as indicated by the application of cost or price analysis techniques. Although the AshBritt, Inc. and ECC proposed prices are lower than the other offerors in the competitive range, they are consistently lower. The proposed prices in all offers appear to be balanced.

Id.

Source Selection Decision Document

In a November 25, 2009 Source Selection Decision Document, the SSA, Richard Johnson, the Corps’ Mississippi Valley Division Regional Contracting Chief, provided his summary of ratings for the final proposals for Region 6a. See CAR 1151. The SSA’s summary of ratings was identical to that provided by the SSEB in the November 17, 2009 Re-Competition Evaluation Report. All offerors were rated “Blue” or outstanding for Past Performance, Management/Operation Plan, and the technical evaluation of the STO, and all offerors were rated “Outstanding” with regard to the Small Business Subcontracting factor. Id.

The SSA first provided a summary of each offeror’s strengths and weaknesses without regard to the region. After describing P & J’s non-cost strengths, the SSA described a “weakness that drove the proposal risk rating to moderate”:

In the final round of discussions the Contracting Officer had updated the sample task order scenario to occur in California. Also, the new sample task order removed burning as an option of debris reduction. Phillips and Jordan did not update the proposal in accordance with the new scenario. They did not use the correct bid schedule for the scenario which compounded this error during the price analysis.

CAR 1144-45. The SSEB had concluded that the production rate proposed for the STO was “suspect along with the price.” CAR 1145. The SSA reported that he “recognize[d] that this [was] an oversight by the proposal prep team for Phillips and Jordan, but agree[d] with the SSEB that this raise[d] the proposal risk of this offer.” Id.

The SSA also identified several strengths in P & J’s proposal. The SSA observed that the experience and length of service of key Management/Operations personnel on projects similar in scope and value “adds considerable value to the offer.” CAR 1144. According to the SSA, “[t]he discussion in the offer demonstrates a thorough understanding of sectoring methodology,” and P & J’s proposal evidenced a “[g]ood understanding of the economics behind recycling.” Id. The SSA further determined that P & J’s “sample task order proposal demonstrates a good understanding of the requirements of a debris mission. The mobilization plan was excellent.” Id. The SSA continued:

Plan for base camp, segregation, [temporary debris storage and reduction sites] setup/management was very good. Recognition in the proposal that double hauling is least preferred method of debris removal. Proposal indicates a preference to haul [construction/demolition] directly to final disposal site. Clear analysis for numbers and types of crews for each sector. The proposal exceeded the standard production rate and a recognition that daily production rate varies, but an average standard can be attained over the course of a mission. Proposal includes a list of local subcontractors. Also, it includes strategies on finding more local subs like using state agencies and purchasing ad in local media outlets. Corporate health and safety plan was included in the offer.

CAR 1144.

With regard to ECC’s proposal, the SSA reported several weaknesses. See CAR 1145-46. In part, the SSA noted that ECC’s technical solution for the STO “included references to burning as a method of debris reduction” despite the omission of incineration from the revised STO. CAR 1145-46. The SSA noted only that the SSEB had concluded that this discrepancy “was an oversight by the ECC proposal prep team.” CAR 1146.

The SSA further conveyed the SSEB’s determination that ECC’s Section B prices were “not utilized in the development of the price proposal for the sample task order.” CAR 1146. By way of example, the SSA noted as follows:

Hourly equipment costs for Sample Task Order CLIN 0015 was priced at [Redacted] each and the Section B price was [Redacted] each. CLIN 0014 was priced at [Redacted] each and the Section B price is [Redacted]. ADMS price for CLINs 0006AA, 0006AC, and 0006AD sample task order price do not match Section B prices.

Id.

The SSA identified one weakness in CrowderGulf’s proposal. According to the Source Selection Decision Document, CrowderGulf’s revised proposal did not detail employee travel arrangements after the Government identified the lack of arrangements as a weakness during discussions. Id.

AshBritt’s proposal included a weakness that “drove the proposal risk rating to moderate.” CAR 1148. The SSA observed that the SSEB had determined that Schedule B prices “were not utilized in the development of the price proposal for the sample task order.” Id. The SSA compared prices as follows:

Hourly equipment costs for Sample Task Order CLIN 0015 was priced at [Redacted] each and the Section B price was [Redacted] each. CLIN 0014 was priced at [Redacted] each and the Section B price is [Redacted]. ADMS price for CLINs 0006AA, 0006AC, and 0006AD sample task order price do not match Section B prices. All proposed Sample Task Order hourly rates for equipment plus operator are suspect due to inconsistency with Section B pricing.

CAR 1148. The SSA also observed that AshBritt did not include [Redacted]. The SSA concluded that “[t]hese pricing discrepancies elevate the risk of this proposal.” Id. Similarly, AshBritt’s proposed 0% escalation rate elevated the performance risk of the proposal because, in the SSA’s estimation, “the offeror may not be able to successfully perform the requirements of the contract in the final years of contract life.” Id.

The SSA identified several strengths in AshBritt’s proposal. The SSA first examined AshBritt’s Management/Operation Plan and key personnel:

Key personnel have a diversity of project experience, education, and qualifications. The length of service of these individuals with the company adds considerable value to the offer. Mobilization plan was good. Added emphasis for mobilization to region[s] 6a and 6b. Understanding of the equipment required to execute a debris mission. Proposal demonstrates ability to handle various debris streams.

CAR 1147. The SSA also noted AshBritt’s experience in sectoring, stating that AshBritt “[a]cknowledged that the situation at hand determines the crew composition and numbers personnel needed.” Id. The SSA concluded that AshBritt’s discussion “demonstrates a thorough understanding of sectoring methodology.” Id. According to the SSA, AshBritt’s STO proposal was “well presented” and “provided a good understanding of technical solutions, management, and organizational capabilities of a debris mission.” Id. In addition, the SSA determined that the proposal

demonstrates ability of the contractor to execute the mission[.] Final proposal revision page 3r3, commits to identifying staging areas [Redacted] prior to landfall with names and contact information to USACE. The proposal exceeds the standard production rate. Calculations provided to support the proposed rate. Recognition that daily production rate varies, but an average standard can be attained over the course of a mission. Large number of local subcontractors in all regions listed in the proposal. Good discussion on approaches to locate additional subcontractors.

Id.

The SSA noted several weaknesses in Ceres’ proposal. [Redacted], Ceres did not propose [Redacted] or [Redacted] costs in its proposed sample task order price despite including an “extensive [Redacted] plan” as part of its technical solution to the STO. CAR 1149. The SSA conveyed the SSEB’s determination that Ceres’ Section B prices were “not utilized in the development of the price proposal for the sample task order,” observing:

Hourly equipment costs for Sample Task Order CLIN 0015 was priced at [Redacted] each and the Section B price was [Redacted] each. ADMS price for CLINs 0006AA, 0006AC, and 0006AD sample task order price do not match Section B prices.

CAR 1149.

In addition, the SSA identified several strengths in Ceres’ proposal. First, the SSA found Ceres’ key Management/Operations personnel to be “highly qualified.” CAR 1148. Ceres’ mobilization plan was “good,” and according to the SSA, its proposal included a “[g]ood description of general debris removal and reduction” as well as a “[g]ood discussion on how AMDS integrates into the process.” Id. The SSA determined that Ceres’ proposal “includes [a] well thought out discussion on sectoring and crew size to meet mission deadlines.” Id. The SSA determined that Ceres’ proposal “demonstrates [a] good understanding of sector management.” Id. With regard to Ceres’ STO proposal, the SSA determined that Ceres’ proposal was “well presented and provided a good understanding of technical solutions, management, and organizational capabilities of a debris mission.” CAR 1149.

The SSA ultimately concluded that his analysis had reaffirmed the competitive range decision and offers were “extremely competitive.” CAR 1149.

After summarizing the strengths and weaknesses in each offeror’s proposal, the SSA moved to a discussion of each region under consideration. For Region 6a, the SSA first reported that “AshBritt is the apparent low offeror. The evaluated price of [Redacted] includes a commitment from AshBritt [Redacted].” CAR 1151. The SSA continued, “[t]he highest non-cost rated offeror with the lowest evaluated price is ECC at [Redacted],” approximately [Redacted] higher than AshBritt. Id.

The SSA noted that both proposals were rated highly under the non-cost factors. According to the SSA,

past performance for both ECC and AshBritt, Inc. are comparable. Both proposals had very good management/operations plans. AshBritt, Inc.’s offer had no identified weaknesses in the management/operations plans. ECC’s proposal did not discuss the number of monitors in relation to the number of crews in the field. Also, the recycle plan was not new and innovative. The proposal did not adequately discuss green waste initiatives. For the Small Business Subcontracting Plan, neither proposal had an identified weakness.

Id.

In evaluating the STOs in the context of selecting an awardee for Region 6a, the SSA compared AshBritt’s and ECC’s STO proposals under each subfactor:

a. Technical Approach: ECC received the second highest rating because of the reference to burning vegetative debris in its approach. Other than this error, the proposed approach was well documented and thorough. Segregating C & D curbside. Infers support for direct haul to permanent land fill. Environmental baseline survey for each TDSR. Mobilization plan was good. AshBritt, Inc. received the highest rating. The AshBritt, Inc. revised final proposal is extremely well written and well thought out. Proposal demonstrates ability of the contractor to execute the mission.
b. Production Rate: ECC received the second highest rating because of the reference to burning vegetative debris in its approach. Other than this error, the proposed approach was well documented and thorough. The ECC proposed production rate exceeds the standard. AshBritt, Inc. received the highest rating with no weaknesses identified. The AshBritt, Inc. proposed production rate exceeds the standard.
c. Use of Local Subcontractors: Both firms received the highest technical rating without any identified weaknesses.
d. Site Specific Safety and Health Plan: Both firms received the highest technical rating without any identified weaknesses.

CAR 1152. The SSA concluded that ECC’s non-cost weaknesses would not result in higher costs to the Government, but that “discrepancies” identified in AshBritt’s Schedule B and STO prices may result in higher costs.

The SSA then endeavored to explain his conclusion that award to AshBritt might result in higher costs to the Government. According to the SSA, as part of its acquisition strategy, the Corps did not try to preprice haul rates. Id. The Project Delivery Team (“PDT”) reviewed the results of task order negotiations during Hurricane Katrina and Hurricane Rita and determined that the proposed haul rates in the previous ACI Debris contracts were not enforceable because of unknown factors that are different for every disaster event. Id. To solve this problem, the SSA noted, the PDT “only tried to fix prices on various items that could be enforced during any event.” Id. The SSA thus observed that the STO “demonstrates that even though an offeror may be low in the Section B prices, the government may actually have to pay more during an event.” Id. In particular, the SSA observed that “the discrepancies identified in AshBritt, Inc.’s Volume I and V pricing may result in higher costs.” Id. The SSA provided the following table to illustrate this point:

Debris Removal              Sum of all hauling costs in    Cost Per Cubic Yard As Proposed   Percent of Total Price for
from Public Roads           Sample Task Order CLINs        in Sample Task Order              Sample Task Order
CROWDER GULF                [Redacted]                     [Redacted]                        [Redacted]
Ceres Environmental         [Redacted]                     [Redacted]                        [Redacted]
ECC                         [Redacted]                     [Redacted]                        [Redacted]
Phillips and Jordan         [Redacted]                     [Redacted]                        [Redacted]
AshBritt                    [Redacted]                     [Redacted]                        [Redacted]
IGE                         [Redacted]                     [Redacted]                        [Redacted]
Notes: In the evaluation of the prices in Section B, Crowder Gulf has the highest evaluated price and AshBritt has the lowest. Yet, in this scenario, AshBritt is one of the highest priced offers and Crowder Gulf is the apparent low offeror.

Id. As noted by the SSA, the average proposed haul rate for ECC was lower than AshBritt, Inc. CAR 1153. In addition, the SSA concluded that “some risk exists” that AshBritt’s future costs would be higher because AshBritt’s price for hauling represented [Redacted] of its total price. CAR 1152.

Nevertheless, the SSA determined that the “identified risk” with AshBritt’s offer could be mitigated. Specifically, the SSA stated that “[t]hrough proper contract administration, the government will be able to enforce the proposed prices.” CAR 1153. In addition, the SSA reasoned that “the reach back program is designed to provide the government leverage in just this sort of situation. If a real event occurs and AshBritt’s proposed haul rates are unreasonable, the government can issue task orders to the reach back contractor.” Id. The SSA concluded that “the lower risk in ECC proposal [did] not outweigh the savings in price associated with AshBritt, Inc. proposal. In accordance with FAR 15.101-1, the Government has decided to select other than the highest technically rated offer.” Id.

The SSA summarized his decision to award Region 6a to AshBritt as follows:

Based upon the findings of the Source Selection Evaluation Board and the Source Selection Advisory Council, I have compared the offers for Region 5 [sic] giving appropriate consideration to the evaluation criteria as set forth in the solicitation and their relative importance. Based upon this comparison of the proposals and a detailed assessment of the advantages and disadvantages associated with each, I have determined AshBritt, Inc.’s proposal represents the best overall value to the Government.

Id.

With regard to Region 6b proposals, AshBritt was ineligible because it was awarded the primary contract in Region 6a such that P & J became the apparent low offeror in Region 6b with an evaluated price of [Redacted]. Id. Ceres, the only other eligible offeror, was the “highest non-cost rated offeror with the next lowest evaluated price” at [Redacted], approximately [Redacted] more than P & J. Id.

The SSA addressed non-cost factors first, noting that they were significantly more important than price. The SSA observed that both offerors’ proposals had good management/operations plans, and neither had an identified weakness for their Small Business Subcontracting Plan. The SSA further determined that P & J’s and Ceres’ past performance were comparable. Id. The SSA noted that the “primary difference in ratings was associated with the sample task order.” Id.

The SSA again compared the offerors’ STO proposals under the four technical subfactors to explain his selection of an awardee for Region 6b. Here, the SSA focused on proposals submitted by Ceres and P & J. The SSA stated as follows:

a. Technical Approach: Phillips and Jordan received the second highest rating because of the reference to burning vegetative debris in its approach. Other than this error, the proposed approach was well documented and thorough. The Ceres Environmental Services, Inc. proposal received the highest rating and had no identified weaknesses.
b. Production Rate: Phillips and Jordan received the second highest rating because of the reference to burning vegetative debris in its approach. Other than this error, the proposed approach was well documented and thorough. Proposal received the highest rating and had no identified weaknesses.
c. Use of Local Subcontractors: Both firms received the highest technical rating without any identified weaknesses.
d. Site Specific Safety and Health Plan: Both firms received the highest technical rating without any identified weaknesses.

CAR 1153-54.

The SSA further determined that non-cost weaknesses in P & J’s offer “will not result in higher costs to the government.” CAR 1154. Instead, “discrepancies identified in both offeror[s’] Volume I and V pricing may result in higher costs.” Id. The SSA observed that the STO for Region 6b again demonstrated that even if an offeror proposed low Section B prices, the Government might have to pay more during an actual event. Id. Using the same table to illustrate offerors’ proposed cost per cubic yard as proposed in the STO, the SSA noted that the average proposed haul rate for Ceres was lower than P & J, and that P & J’s proposed price for hauling represented [Redacted] of its total price. Id. The SSA concluded that “some risk exists with this offer that future costs would be higher if [he selected] Phillips and Jordan for the award.” Id.

The SSA determined that contract administration could mitigate the identified risk in Phillips and Jordan’s offer. The SSA stated that “[t]hrough proper contract administration, the government will be able to enforce the proposed prices,” and cited the reach back program as a mechanism for providing the Government “leverage in just this sort of situation.” Id. In particular, the SSA noted that “[i]f a real event occurs and Phillips and Jordan’s proposed haul rates are unreasonable, the government can issue task orders to the reach back contractor.” CAR 1154.

The SSA selected P & J for award, noting that it was “clear from this evaluation [that] the lower risk in Ceres Environmental Services, Inc.’s proposal [did] not outweigh the savings in price associated with the Phillips and Jordan proposal.” Id. Accordingly, the SSA decided to select “other than the highest technically rated offer.” Id. The SSA summarized his findings as follows:

Based upon the findings of the Source Selection Evaluation Board and the Source Selection Advisory Council, I have compared the offers for Region 6b giving appropriate consideration to the evaluation criteria as set forth in the solicitation and their relative importance. Based upon this comparison of the proposals and a detailed assessment of the advantages and disadvantages associated with each, I have determined Phillips and Jordan’s proposal represents the best overall value to the Government.

CAR 1155.

Based on these evaluations, the SSA selected AshBritt as the primary awardee and ECC as the reach back selectee for Region 6a, and Phillips and Jordan as the primary and Ceres as the reach back for Region 6b. CAR 1156.

Discussion

Standard of Review

In a bid protest, the court reviews an agency’s decision under the standards in the Administrative Procedure Act (“APA”). 28 U.S.C. § 1491(b)(4). A reviewing court shall overturn an agency action that is “arbitrary, capricious, an abuse of discretion or otherwise not in accordance with law.” 5 U.S.C. § 706(2)(A).

In order to meet this standard, the protestor must show that ‘“either: (1) the procurement official’s decision lacked a rational basis; or (2) the procurement procedure involved a violation of regulation or procedure.’ ” Axiom Res. Mgmt., Inc. v. United States, 564 F.3d 1374, 1381 (Fed.Cir.2009) (quoting Impresa Construzioni Geom. Domenico Garufi v. United States, 238 F.3d 1324, 1332 (Fed.Cir.2001)). A court evaluating a challenge on the first ground must determine “whether the contracting agency provided a coherent and reasonable explanation of its exercise of discretion.” Id. “When a challenge is brought on the second ground, the disappointed bidder must show a clear and prejudicial violation of applicable statutes or regulations.” Id.

The Court will find agency action arbitrary and capricious when the agency “entirely failed to consider an important aspect of the problem, offered an explanation for its decision that runs counter to the evidence before the agency, or [the decision] is so implausible that it could not be ascribed to a difference in view or the product of agency expertise.” Ala. Aircraft Indus., Inc.-Birmingham v. United States, 586 F.3d 1372, 1375 (Fed.Cir.2009) (quoting Motor Vehicle Mfrs. Ass’n v. State Farm Mut. Auto. Ins. Co., 463 U.S. 29, 43, 103 S.Ct. 2856, 77 L.Ed.2d 443 (1983)). Contracting officers are afforded considerable discretion in negotiated procurements, such as this one, where award is premised on a “best value” determination. Banknote Corp. of Am., Inc. v. United States, 365 F.3d 1345, 1355 (Fed.Cir.2004) (“It is well-established that contracting officers have a great deal of discretion in making contract award decisions, particularly when, as here, the contract is to be awarded to the bidder or bidders that will provide the agency with the best value.”). Such discretion, however, “does not relieve the agency of its obligation to develop an evidentiary basis for its findings.” In re Sang Su Lee, 277 F.3d 1338, 1344 (Fed.Cir.2002). Indeed, it is well established that “the agency must examine the relevant data and articulate a satisfactory explanation for its action including a ‘rational connection between the facts found and the choice made.’ ” Motor Vehicle, 463 U.S. at 43, 103 S.Ct. 2856 (quoting Burlington Truck Lines v. United States, 371 U.S. 156, 83 S.Ct. 239, 9 L.Ed.2d 207 (1962)).

In resolving bid protests, the trial court is to make findings of fact weighing the evidence in the administrative record. See Bannum, Inc. v. United States, 404 F.3d 1346, 1355 (Fed.Cir.2005). If the protester succeeds in demonstrating an error in the procurement process, the Court then proceeds to determine, as a factual matter, whether the protestor was prejudiced by that error. Id. at 1351; see also Data Gen. Corp. v. Johnson, 78 F.3d 1556, 1562 (Fed.Cir.1996) (holding that to prevail in the protest, the protestor must show not only a significant error in the procurement process, but also that the error prejudiced it); AshBritt, 87 Fed.Cl. at 365.

Ceres’ Protest

Plaintiff limits its protest to the agency’s recompetition in Regions 6a and 6b. In essence, Plaintiff lodges three grounds of protest, claiming that the agency:

(1) failed to conduct a meaningful analysis of AshBritt’s and Phillips and Jordan’s pricing;
(2) failed to comply with the solicitation’s terms by not performing a best value tradeoff, basing its award decision on price, and ignoring the greater risk associated with AshBritt’s and P & J’s proposals;
(3) engaged in unfair, misleading, incomplete, and unequal discussions with Ceres regarding price.

The Agency’s Evaluation of AshBritt’s and P & J’s Pricing Was Not Arbitrary and Capricious

Plaintiff argues that the Government erred in awarding contracts to AshBritt in Region 6a and Phillips and Jordan in Region 6b because their revised Schedule B prices were too low. Although Plaintiff characterizes AshBritt’s and P & J’s prices as “unreasonable,” Plaintiff’s challenge to these prices concerns price realism, not reasonableness.15 “Arguments that an agency did not perform an appropriate analysis to determine whether prices are too low, such that there may be a risk of poor performance, concern price realism.” C.L. Price & Assocs., Inc., B-403476.2, 2011 CPD ¶ 16, at *3 (Jan. 7, 2011) (quoting SDV Solutions, Inc., B-402309, 2010 CPD ¶ 48 (Feb. 1, 2010)).

In a fixed-price procurement, the agency ordinarily does not consider the “realism” of offerors’ proposed prices because the contractor bears the risk of underpricing its offer. Fulcra Worldwide, LLC v. United States, 97 Fed.Cl. 523 (2011) (“Generally, price realism is not considered in fixed price contracts because the contractor assumes the full risk and responsibility that the work can be performed for the price offered.”); Afghan Am. Army Servs. Corp. v. United States, 90 Fed.Cl. 341, 356 (2009). However, an agency may, at its discretion, provide for the use of a price realism analysis to measure an offeror’s understanding of the solicitation requirements, or to avoid the risk of poor performance from a contractor who is forced to provide goods or services at little or no profit. Grove Res. Solutions, Inc., B-296228, 2005 CPD ¶ 133 (July 1, 2005). If an agency commits itself to a particular methodology in the solicitation, it must follow that methodology. Afghan Am. Army, 90 Fed.Cl. at 359. Results of the analysis “may be used in performance risk assessments and responsibility determinations,” but “the offered prices shall not be adjusted as a result of the analysis.” FAR 15.404-l(d)(3).

The nature and extent of an agency’s price realism analysis, as well as an assessment of potential risk associated with a proposed price, are matters within the agency’s discretion. Afghan Am. Army, 90 Fed.Cl. at 356 (citing Pemco Aeroplex, Inc., B-310372.3, 2008 CPD ¶ 126, at *5 (June 13, 2008)); Labat-Anderson Inc. v. United States, 50 Fed.Cl. 99, 106 (2001). The Court’s review of the agency’s analysis is limited to determining whether the evaluation was reasonable and consistent with the solicitation’s evaluation criteria. Ala. Aircraft, 586 F.3d at 1375-76; see also Pemco Aeroplex, 2008 CPD ¶ 126, at *5 (citing Grove Res. Solutions, Inc., 2005 CPD ¶ 133).

The price realism determination done here pushes up against the outermost limits of agency discretion. Plaintiff points out that AshBritt’s Schedule B proposed price for Regions 6a and 6b was originally [Redacted] for the base year, but dropped to [Redacted] in the recompetition. See, e.g., AAR 516; CAR 318, 331. Over the five-year life of the contract, AshBritt’s original evaluated price was $55,545,449.54, compared to its revised proposed price of [Redacted] — a [Redacted] reduction. In a similar vein, P & J’s proposed price for the five-year life of the contract in Region 6b dropped [Redacted], from [Redacted] to [Redacted]. According to Plaintiff, the agency failed to resolve its concerns regarding the significant reductions in these offerors’ revised Schedule B prices.

To be sure, the agency voiced deep concern with AshBritt’s precipitous and unexplained [Redacted] drop in Schedule B pricing in the re-competition. As the CO’s price consultant, Tina Guillot, put it, AshBritt’s “price [was] so low as to question whether there may be an unknown risk to the government if [the contract were] awarded to AshBritt.” CAR 342. Other contract specialists shared this concern about the “apparent risk involved” with AshBritt’s low bid. CAR 2100. The SSEB documented its concerns about AshBritt’s low prices and [Redacted]. The agency also expressed concerns about the [Redacted] drop in P & J’s price for ADMS, and P & J’s overall [Redacted] drop in price.

What did the agency do to assuage its valid concerns about the radical drop in pricing? The agency decided to revise the Sample Task Order assumptions and request an updated Sample Task Order price based upon these new assumptions. CAR 230. In the CO’s words, “the Government will ask all offerors to make the sample task order price proposal reflect the recent prices offered for Region 5 in the Section B CLINs. With this new information the government can better evaluate the reasonableness and realism of the new proposed prices for ECC and AshBritt.” CAR 229. The CO voiced the same intention with respect to completing an integrated assessment of P & J’s offer.

The problem is that the agency did not fully execute this worthy plan. There was abundant confusion about what prices offerors were to propose in the revised sample task order. Plaintiff argues the solicitation required Schedule B prices to be used, which would have made sense since the record reflects the agency’s intention to measure Schedule B price realism via the revised Sample Task.16 Strangely, however, the agency never told offerors to use Schedule B pricing in the revised sample task, and the solicitation clearly did not require that offerors use Schedule B pricing in their sample task pricing.

In seeking a new proposal for the revised sample task order, the agency did not indicate why it was requesting this revision and simply told offerors to update their technical STO proposals in Volume IV and the pricing for the STO in Volume V of their proposal submissions. The agency’s letter inviting these proposals made no mention of what pricing offerors should use in the revised STO itself. The STO accompanying this letter contained its own “Task Order Bid Schedule,” and offerors were to fill in prices for the contract line items on those schedules, but there was no indication that they were required to use their Schedule B prices in doing so. Rather, the STO’s performance of work section simply required offerors “to complete the work ... based on the unit pricing submitted by the Contractor in the Bid Schedule.” CAR 368. Plaintiff cites this provision, arguing that the “Bid Schedule” necessarily refers to the Task Order Schedule B pricing. However, read in the context of the revised STO, it is clear that the term “Bid Schedule” refers to the Bid Schedule in the STO itself, which contained different CLIN descriptions than Schedule B. Compare CAR 369-71, with AAR 328-29.

Nor did the solicitation require that offerors use Schedule B pricing in the STO. Back in 2007, in Amendment 001, the agency amended the solicitation to remove the instruction in Section L that offerors use Schedule B pricing in pricing the STO. In instructing offerors on STO pricing, Section L.1 initially had provided that “[a]ll pricing shall match the proposed rates submitted in Volume I [Schedule B].” AAR 311, 352. However, the agency deleted that requirement on July 10, 2007, after an offeror inquired as to the proper labor rates to be used in STO proposals. This question and answer is reflected in the amendment:

[Q.] Which rate should offerors use when pricing the sample task order: a) the $40.00/hr rate specified in Attachment 12 [STO] or the proposed rate in Section B.1?
[A] Removed sentence relative to pricing in Volume V, Solicitation Section L.1 [Pricing for STO]. Please price according to assumptions listed in Attachment 12.

AAR 323.

Plaintiff asserts that the agency’s deletion of the requirement linking Schedule B prices to STO proposals only applied to labor rates listed in the STO Assumptions section. Plaintiff observes that the question that prompted the deletion focused narrowly on the inconsistency between the instruction to assume that the Operation Manager labor rate was $40/hour, and the requirement that offerors use Schedule B prices when calculating STO prices. In Plaintiff’s view, the requirement that offerors price that particular labor rate according to given assumptions is logical because Davis-Bacon wage rates vary from region to region, and “the differentiating factor between the proposals is not the base rate but the burden applied to the Operations Manager’s hourly rate.” Pl.’s Supp. Br. at 9. Thus, Plaintiff argues, the deletion was only meant to clarify the proper labor rate and was “not a wholesale change to the Revised STO pricing methodology.” Id. Plaintiff’s interpretation cannot prevail, however, because there was nothing in the solicitation — other than the sentence in Section L.1 that was removed — that required offerors to use their Schedule B prices in pricing the STO. To the extent any STO “pricing methodology” had been required in the solicitation, that requirement was removed in Amendment 001.

While it might seem odd that an agency would have offerors respond to a mock task and not use their actual pricing, it is not unheard of, and the agency could still assess offerors’ systemic and logistical approaches to the work, including designation of manpower and equipment, and estimation of resources. See, e.g., CW Gov’t Travel, Inc.--Reconsideration, B-295530.2, 2005 CPD ¶ 139, at *4 n. 2 (July 25, 2005) (“We have previously acknowledged that prices or costs proposed in the context of hypothetical sample tasks in a solicitation for an ID/IQ contract, while somewhat artificial in nature, may permit the government to assess the probable cost of competing offerors — provided that the solicitation takes into account offerors’ differing technical approaches and meaningfully evaluates the costs or prices underlying their proposals.”); S.J. Thomas Co., B-283192, 99-2 CPD ¶ 73, at *4 (Oct. 20, 1999) (“If used intelligently, sample tasks can provide insight into competing offerors’ technical and staffing approach and thus provide a reasonable basis to assess the relative cost of the competing proposals.”); Aalco Forwarding, Inc., B-277241, 98-1 CPD ¶ 87, at *7 (Mar. 11, 1998) (“Where estimates for various types of required services are not reasonably available, an agency may establish a reasonable hypothetical, consistent with the RFP requirements, to provide a common basis for comparing the relative costs of the proposals.”); see also Magnum Opus Techs., Inc. v. United States, 94 Fed.Cl. 512, 534-35 (2010) (acknowledging the legitimacy of sample task pricing particularly where the quantity of work may not be known until task orders are issued). Indeed, here, it was clear that offerors could not use their actual Schedule B pricing for Regions 6a and 6b for the Sample Task because the mock disaster in the STO occurred in Region 5. In any event, odd or not, the solicitation simply did not require that offerors use Schedule B pricing in pricing their Sample Task response.

So, despite the intentions of the CO that the revised STO would be a good tool for assessing the price realism of offerors’ Schedule B pricing — intentions which were never shared with the offerors — this tool was never actually utilized as envisioned.17 Most offerors, including AshBritt and P & J, did not use Schedule B pricing in the revised STOs. However, the agency performed a detailed evaluation of the responses to the STO from a technical standpoint and concluded that AshBritt and P & J exhibited a good understanding of the work.

Recognizing that these offerors did not use their Schedule B prices in responding to the STO, the agency downgraded AshBritt’s and P & J’s risk from low to moderate. The agency articulated that the risk associated with AshBritt and P & J meant that, in a real event, costs might be higher. However, the SSA determined that the Government could enforce the Schedule B prices through contract administration and issue task orders to the reach back contractors if necessary.

In reviewing a challenge to an agency’s price realism determination, this Court is mindful of the scope of review. In Alabama Aircraft, the Federal Circuit explained that this Court’s review of a price realism determination is limited: “The trial court’s duty [is] to determine whether the agency’s price-realism analysis was consistent with the evaluation criteria set forth in the RFP, ... not to introduce new requirements outside the scope of the RFP.” 586 F.3d at 1375-76 (citing Galen Med. Assocs., Inc. v. United States, 369 F.3d 1324, 1330 (Fed.Cir.2004)).

Here, the solicitation only stated that the Government would analyze the realism of proposed STO prices, not Schedule B prices. Section M.5 outlined how the agency intended to use each set of proposed prices. Schedule B prices were to be “contractually binding and cover each contract line item included in Section B of the solicitation,” and the agency pledged to evaluate the rate schedule as “the basis for the best-value trade-off decisions.” Specifically, “[t]he sum of the extended value of the Section B fixed price contract line items will determine the low offeror for each region.” AAR 319. By contrast, the solicitation provided that “prices submitted in response to the sample task order will be used to determine price/cost realism and as part of the proposal risk assessment.” AAR 319 (emphasis in original). The solicitation does not say what Plaintiff urges this Court to find — that the agency was required to analyze Schedule B prices for realism.

In situations where the solicitation does not expressly or implicitly require a price realism analysis, it is improper for an agency to conduct a price realism analysis and then reject a proposal for having an unrealistically low price. In Milani Construction, LLC, B-401942, 2010 CPD ¶ 87 (Dec. 22, 2009), the Government Accountability Office (“GAO”) sustained a protest where the agency determined that the protester’s proposed price was unreasonably low, reflected a lack of understanding of the project requirements and posed a performance risk, because the solicitation “did not provide offerors with adequate notice that [the agency] intended to perform a price realism analysis.” In CSE Construction, B-291268.2, 2002 CPD ¶ 207 (Dec. 16, 2002), GAO remanded a procurement where the agency did not consider the protester for award after determining that protester’s proposed price was unreasonably low and reflected a lack of understanding of the contract requirements, because the solicitation did not provide for the evaluation of price realism. GAO continued: “[t]he agency’s apprehension that [the protester’s] price was too low would appear to concern the firm’s responsibility, that is, whether [the protester] could satisfactorily perform at its proposed price ... or whether [the protester] may have made a mistake in its proposed price”. Id.18 Thus, had the agency rejected AshBritt’s and P & J’s prices for being unrealistically low, it would have run afoul of the fundamental precept that an agency may not apply undisclosed evaluation factors.

Moreover, this is not a situation where the agency “entirely failed to consider an important aspect of the problem.” Motor Vehicle, 463 U.S. at 43, 103 S.Ct. 2856. The agency is going into this procurement with its eyes wide open as to the lower-than-expected pricing on the part of AshBritt and P & J, and the agency is willing to take the risk associated with those low prices. This is a business judgment and should not be second-guessed by this court. As Professors Nash and Cibinic recognized, “while ‘buying-in’ is among the ‘other improper business practices’ covered in FAR Subpart 3.5, that Subpart does not direct COs to refuse award if a price is too low.” Ralph C. Nash & John Cibinic, Price Realism Analysis: A Tricky Issue, 12 No. 7 Nash & Cibinic Rep. ¶ 40, at 2 (1998). These commentators continued:

an apparent ‘buy-in’ price does not necessarily mean that the offeror does not understand the work or that a performance risk exists. Instead, it may mean that the offeror has a business reason for buying-in and that it intends to perform the contract in accordance with the specifications, even if it loses money.

Id. at 6.

Here, the sample task evaluation the agency performed did give the agency assurance that AshBritt and P & J understood the work and had strong technical approaches. With respect to AshBritt, the record reflects:

• Under the sample task order, proposal was well presented and provided a good understanding of technical solutions, management, and organizational capabilities of the debris mission.
• Revised final proposal was extremely well written and thought out. Proposal demonstrated ability of the contractor to execute the mission.
• Under past performance, proposal demonstrated past performance as a prime contractor in Hurricane Katrina, and exceeded goals for small business during Hurricane Katrina.
• Proposal included a good organizational chart and personnel are identified by position. Key personnel have a diversity of project experience, education and qualifications. The length of service of these individuals with the company added considerable value to the offer.

CAR 1147.

So too, with Phillips and Jordan, the agency recognized that the offeror had significant strengths. Under past performance, P & J had been a prime contractor for debris response missions in Hurricane Ivan, Hurricane Katrina and the World Trade Center. CAR 1144. In past performance, P & J had four outstanding ratings for major disaster responses. Id. Like AshBritt, P & J had key personnel whose length of service added considerable value to the offer. Id. P & J demonstrated a “thorough understanding of sectoring methodology.” Id. Further, P & J’s “sample task order proposal [demonstrated] a good understanding of the requirements of a debris mission. The mobilization plan was excellent.” CAR 1144.

This Court recognizes that an agency’s price-realism analysis lacks a rational basis if the contracting agency made “irrational assumptions or critical miscalculations,” or used an evaluation method that produced a misleading result. OMV Med., Inc. v. United States, 219 F.3d 1337, 1344 (Fed.Cir.2000); DMS All-Star Joint Venture, 90 Fed.Cl. 653, 663 (2010); CW Gov’t Travel, 2005 CPD ¶ 139, at *4 (“The method chosen must include some reasonable basis for evaluating or comparing the relative costs of proposals, so as to establish whether one offeror’s proposal would be more or less costly than another’s.”). As the Federal Circuit recognized in OMV Medical,

[a] reviewing court in dealing with a determination or judgment which an administrative agency alone is authorized to make, must judge the propriety of such action solely by the grounds invoked by the agency. If those grounds are inadequate or improper, the court is powerless to affirm the administrative action by substituting what it considers to be a more adequate or proper basis.

219 F.3d at 1344 (citing SEC v. Chenery Corp., 332 U.S. 194, 196, 67 S.Ct. 1575, 91 L.Ed. 1995 (1947)).

Whether the agency made an “irrational assumption” in conducting its price realism analysis of the STO is a close call. Admittedly, the agency did not do a price realism analysis of the contractually binding Schedule B pricing in evaluating the revised sample task — even though that was the CO’s internally stated reason for doing the STO evaluation. However, several factors combine to persuade the Court that this flaw does not warrant disrupting the awards to AshBritt and P & J. First, the solicitation did not mandate that the agency evaluate Schedule B pricing for realism, and it would have been error for the agency to have rejected low offerors based upon an unstated cost-realism evaluation criterion. Second, this is a fixed-price contract where the risk for not performing at the contractually binding price falls squarely and exclusively on the contractors — here tried-and-true contractors with outstanding past performance records, understanding of the work, and technical solutions. Third, under the principles of judicial review recently articulated by the Federal Circuit in Alabama Aircraft, this Court must afford discretion to the agency’s price realism assessment. In sum, given the deference to be afforded the agency in price realism assessments and the agency’s willingness to assume risk, this Court does not find the agency’s determination to award to AshBritt and P & J arbitrary, capricious, or unreasonable.19

Plaintiff also argues that AshBritt’s and P & J’s price proposals were materially unbalanced. In support of its argument, Plaintiff first compares the offerors’ original price proposals with the revised price proposals. Plaintiff calculated the amounts — specific dollar amounts and percent increase and decrease — by which the prices changed, and argues that disparities between the proposed prices indicate that “the price of one or more contract line items is specifically overstated or understated.” Pl.’s Mot. J. AR at 41. To that end, Plaintiff observes that AshBritt’s revisions included a [Redacted] increase in the price listed for CLIN 28, “Debris Reduction By Mechanical Means,” pointing out that AshBritt’s price increased from $[Redacted] per CY to $[Redacted] per CY. Id. Plaintiff also cites an [Redacted] reduction in price for CLIN 0005AD, training. Id. Similarly, P & J’s revised proposal included a [Redacted] increase for one line item, and a [Redacted] reduction in another. Id. According to Plaintiff, these variations indicate that the prices were not “consistently lower” as claimed by the CO, but rather were commercially unreasonable, inconsistent, and therefore unbalanced. Id.

Unbalanced pricing exists when, “despite an acceptable total evaluated price, the price of one or more contract line items is significantly overstated or understated as indicated by the application of cost or price analysis techniques.” FAR 52.215-1(f)(8). Upon determining that an offer contains unbalanced pricing, a contracting officer must “determine whether award on the basis of an apparently unbalanced offer would result in paying unreasonably high prices” or present unacceptable risks to the Government. CCL Serv. Corp. v. United States, 48 Fed.Cl. 113, 122 (2000); see OSG Product Tankers LLC v. United States, 82 Fed.Cl. 570, 575 (2008). Thus, a contracting officer may reject a proposal if he determines that “the lack of balance poses an unacceptable risk to the Government.” FAR 52.215-1(f)(8). Here, Plaintiff focuses solely on the change in prices between revised and original submissions, but has not demonstrated that any individual CLIN price proposed by AshBritt or P & J was unreasonably overstated or understated compared to the offeror’s overall proposed price.

The Agency Evaluated Proposals In Accordance With The Terms of the Solicitation

Plaintiff argues that the Government ignored technical evaluations that rendered Ceres’ proposal the best value to the Government. Plaintiff further claims that award to the lowest price offerors violates the solicitation, which clearly stated that price was less important than technical factors. However, a review of the SSA’s decision indicates that the SSA did exactly what the solicitation and the Army Source Selection Manual required him to do — he analyzed the differences among the competing proposals, comparing strengths and weaknesses and weighing non-cost discriminators as well as prices. The SSA’s decision explained the different technical approaches and acknowledged the risk associated with the low-price proposals. Contrary to Plaintiff’s suggestion, the SSA did not ignore non-price factors, but rather weighed them carefully in concluding that accepting the low priced proposals did not pose undue risk.

This Court does not sit as a super source selection authority to second guess agency procurement decisions. Rather, it is well established that the Court should not substitute its judgment to assess the relative merits of competing proposals in a Government procurement. See, e.g., R & W Flammann *309 GmbH v. United States, 339 F.3d 1320, 1322 (Fed.Cir.2003) (citing Ray v. Lehman, 55 F.3d 606, 608 (Fed.Cir.1995)); Lumetra v. United States, 84 Fed.Cl. 542, 549 (2008) (“[T]he court ‘will not second guess the minutiae of the procurement process in such matters as technical ratings and the timing of various steps in the procurement.’ ” (quoting E.W. Bliss Co. v. United States, 77 F.3d 445, 449 (Fed.Cir.1996))). Plaintiff has not given the Court reason to upset the agency’s assessment of the best value or demonstrated that the agency failed to apply the proper weight to the technical and price factors.

Plaintiff further asserts that the Government violated the terms of the solicitation because the CO consulted Ms. Guillot regarding the offerors’ proposed prices. Plaintiff observes that the solicitation stated that Mr. Black was to be the only person on the price evaluation committee. According to Plaintiff, Mr. Black improperly sought the input of others outside the team. However, as the CO, Mr. Black was not prohibited from seeking guidance so long as he exercised independent judgment and did not abdicate his responsibility. Here, Plaintiff has not established that Mr. Black adopted Ms. Guillot’s assessment wholesale. In fact, Mr. Black’s response to Ms. Guillot demonstrates the opposite: “The info is very helpful. It [is] exactly what I needed. Your ultimate conclusions were a little different from mine.” CAR 2098. The FAR explicitly allows contracting officers to consult with others when conducting a price analysis: “The contracting officer may request the advice and assistance of other experts to ensure that an appropriate analysis is performed.” FAR 15.404-1(a)(5).

The Agency Conducted Adequate Discussions

Plaintiff asserts that the agency engaged in unfair, misleading, incomplete, and unequal discussions with Ceres regarding its price. Plaintiff argues that on two separate occasions the agency did not notify Ceres that its price was “no longer competitive.” According to Plaintiff, by failing to discuss cost, which Ceres characterizes as a significant weakness in its proposal, the agency did not lead Ceres into the areas of its proposal requiring amplification or correction, as required by the FAR.

Plaintiff suggests that the Government had an obligation to advise it that “its price was no longer competitive,” but this argument exhibits a fundamental misconception of the role of discussions in a procurement. Under FAR Subpart 15.3, the contracting officer must

indicate to, or discuss with, each offeror still being considered for award, deficiencies, significant weaknesses, and adverse past performance information to which the offeror has not yet had an opportunity to respond. The contracting officer also is encouraged to discuss other aspects of the offeror’s proposal that could, in the opinion of the contracting officer, be altered or explained to enhance materially the proposal’s potential for award.

FAR 15.306(d)(3).

Meaningful discussions “generally lead offerors into the areas of their proposals requiring amplification or correction.” Advanced Data Concepts, Inc. v. United States, 43 Fed.Cl. 410, 422 (1999). However, the FAR states that “the contracting officer is not required to discuss every area where the proposal could be improved” and that instead “[t]he scope and extent of discussions are a matter of contracting officer judgment.” FAR 15.306(d)(3). Under this provision, “ ‘[t]he government need not discuss every aspect of the proposal that receives less than the maximum score or identify relative weaknesses in a proposal that is technically acceptable but presents a less desirable approach than others.’ ” Cube Corp. v. United States, 46 Fed.Cl. 368, 384 (2000) (quoting ACRA, Inc. v. United States, 44 Fed.Cl. 288, 295 (1999)).

As the Court has recognized, “unless an offeror’s costs constitute a significant weakness or deficiency in its proposal, the contracting officer is not required to address in discussions costs that appear to be higher than those proposed by other offerors.” DMS All-Star Joint Venture v. United States, 90 Fed.Cl. 653, 669 (2010) (citing SOS Interpreting, Ltd., B-287477.2, 2001 CPD ¶ 84 (May 16, 2001)). In the SSEB’s view, Ceres’ pricing was not a deficiency or a *310 weakness — it was not so high or out of line with other offerors’ pricing as to require discussions. Nor did Ms. Guillot suggest that Ceres be eliminated from the competitive range based on its pricing. As such, Ceres’ proposed pricing, while higher than that of other offerors, did not require amplification or correction.

Under the rubric of unfair discussions, Plaintiff also argues that the agency provided AshBritt and P & J access to Ceres’ successful pricing in Region 6b in a debriefing, but failed to mitigate the competitive advantage held by AshBritt and P & J by providing Ceres with reciprocal access to AshBritt and P & J’s prices. Plaintiff apparently contends that the agency should have disclosed its competitors’ pricing during discussions to mitigate this unfairness. However, Plaintiff knew, prior to submission of its proposal in the recompetition, that other offerors had obtained its pricing for the regions in which it was the awardee at the debriefing. Similarly, Plaintiff gained access to the successful pricing of awardees in other regions, including Regions 5 and 6a, by virtue of the debriefing for those awards. If Plaintiff believed that the procedure for the recompetition had to be amended to ensure that all offerors’ pricing be released, it had an obligation to raise this argument prior to the closing date for receipt of proposals. Blue & Gold Fleet, L.P. v. United States, 492 F.3d 1308, 1313 (Fed.Cir.2007). Plaintiff’s attempt to challenge the ground rules of the bidding process after award is untimely.

In Blue & Gold, the Federal Circuit held that “a party who has the opportunity to object to the terms of a government solicitation containing a patent error and fails to do so prior to the close of the bidding process waives its ability to raise the same objection subsequently in a bid protest action in the Court of Federal Claims.” 492 F.3d at 1313-14. Although the instant case does not involve a “patent error” in a solicitation, it involves an obvious procurement procedure which Plaintiff knew was being applied and chose not to challenge prior to submitting its proposal in the recompetition. All offerors knew that awardees’ prices had been disclosed in debriefings and that the same offerors and former awardees would be competing in the recompetition. If Plaintiff thought this disclosure gave other offerors an unfair advantage in Region 6a, the time to raise that complaint was before the closing date for submission of proposals in the recompetition.

Plaintiff further argues that the agency acted arbitrarily and capriciously by relying on an allegedly flawed IGE as the basis for its price discussions. According to Plaintiff, the agency knew the IGE was flawed and revised the estimate without notifying or discussing the revisions with the offerors. Ultimately, Plaintiff contends, the agency abandoned the IGE and instead relied on competitive pricing as the basis for its reasonableness determinations.

The sole basis for Plaintiff’s contention that the IGE was flawed was Ms. Guillot’s somewhat cryptic statement in an email to the CO:

The previous competition, and the current competition both have an escalated IGE. Although an IGE is not required (it is required for FAR Part 36, construction), its use limits the establishment of a competitive range in this procurement, and does not serve its intended purpose.
If all offerors come in below the IGE, this in itself does not constitute all offerors being in a competitive range. The IGE should have been revised based on the previous and historical contract pricing, and should have also taken into consideration inflation and escalation rates, if an IGE was to be used. This would have provided a more accurate tool to assist in the determination of the competitive range.

CAR 342 (emphasis added).

Plaintiff seizes upon Ms. Guillot’s phrase “escalated IGE” to argue that the IGE was “hopelessly excessive.” Pl.’s Mot. J. AR at 51. However, it is difficult to divine what Ms. Guillot’s comment about the IGE means. While Ms. Guillot said both the previous and current competition “have an escalated IGE,” she may have meant this as shorthand for indicating that both competitions “have an IGE calling for escalation of prices” in the option years. Otherwise, the sentence in the ensuing paragraph would not make *311 sense. That sentence reads that the IGE “should have also taken into consideration inflation and escalation rates,” which would mean the IGE was too low because both inflation and the escalation rates would serve to raise the IGE. CAR 342. Further, Ms. Guillot’s unclear criticism of the IGE was limited and directed at what should have been done to make the IGE “a more accurate tool to assist in the determination of the competitive range,” not the ultimate source selection decision. Id. Moreover, despite her determination that the IGE was not useful for determining a competitive range, Ms. Guillot concluded that the agency “received adequate competition, and as a result of that competition, a competitive range can be established.” Id. Ms. Guillot’s internal comment in an email does not demonstrate either that the IGE was so flawed that it should have been scrapped or that the CO erred in continuing to use it as a gauge in discussions. Finally, Plaintiff’s suggestion that the agency ultimately ignored the IGE is not supported by the record.

Plaintiff further argues that, despite seeking — and obtaining — an extension of time from this Court to do so, the agency failed to hold additional discussions that the Government internally acknowledged were necessary. According to Plaintiff, the agency failed to act on its determination that it was critical to seek an explanation for AshBritt and P & J’s dramatic pricing changes. However, in receiving an enlargement of time to finish the recompetition, the agency was not obligated to follow any particular course of action. There was no requirement that the agency hold a second round of discussions on Schedule B prices in the recompetition. As commentators have recognized:

Reopening negotiations is not a desirable course of action. It adds time and expense to the procurement and extends the time when information may be improperly disclosed.

John Cibinic, Jr. & Ralph C. Nash, Jr., Formation of Government Contracts 915 (3d ed. 1998) (citing Mine Safety Appliances Co., B-242379.5, 92-2 CPD ¶ 76 (Aug. 6, 1992)). Here, the offerors were told which CLINs were priced above or below the IGE and were afforded an opportunity to change their pricing. Some made changes, and others did not. The Government was not obligated to point this out again. See Phoenix Safety Assocs. Ltd., B-216504, 84-2 CPD ¶ 621 (Dec. 4, 1984). While the agency could have amended the solicitation to provide that Schedule B prices had to be used in the STO and received revised STO pricing proposals, it was not required to do so.20

Order

1. Plaintiffs Motion for Judgment on the Administrative Record is DENIED.

2. Defendant’s Cross-Motion for Judgment on the AR is GRANTED.

3. Prior to the release of this opinion to the public, the parties shall review this unredacted opinion for competition-sensitive, proprietary, confidential or other protected information. The parties shall file proposed redacted versions of this decision by March 10, 2011.

4. The Clerk is directed to enter judgment on the AR in favor of Defendant consistent with this opinion.

. These findings are derived from the opinion and Administrative Record in AshBritt v. United States, 08-473 (cited as AAR) and the Administrative Record filed in the instant action (cited as CAR).

. The Corps did not issue a completely new solicitation in conducting the reprocurement; it reopened discussions, revised the Sample Task Order, and invited offerors to revise their proposals. See CAR 360-61.

. Reach back contractors could be activated in any of three circumstances: (1) if a single event generated in excess of 10M cubic yards of debris; (2) if the regional primary contractor had two or more performance evaluations with a score of 50 or less on any task order; or (3) if the Government was unable to negotiate fair and reasonable prices for the task orders with the primary contractor. AAR 317.

. Environmental Chemical Corporation.

. Phillips and Jordan, Inc. ("P & J”).

. AshBritt, Inc. ("AshBritt”).

. The RFP provided that "no single offeror may receive a contract award of two regions that are adjacent to each other.” AAR 233. Ceres was not eligible for the primary contract in Region 5 because it was the primary contract awardee in Region 4 — an adjacent region. As such, Ceres does not challenge the recompeted award in Region 5.

. The Court refers to "Section B” and "Schedule B” prices interchangeably.

. The record does not contain revised proposals from CrowderGulf for Regions 6a and 6b, but CrowderGulf did provide a proposal for Region 5 and its pricing for that region was evaluated, as Region 5 pricing was used by the Source Selection Evaluation Board as a representative sample.

. There is nothing in the record to suggest that the CO or any government agent ever asked offerors to make their STO proposals reflect Section B prices.

. The first discussion letter (August of 2009) noted that prices would be analyzed again, and provided the CLIN by CLIN comparison to the IGE. The second discussion letter identified issues with Volumes I through IV (non-price), but not V (price). The final discussion letters provided questions and responses submitted by all offerors.

. Despite these findings by the SSEB’s Technical Advisor, Ceres was not downgraded in the evaluation of the revised STO.

. Although the CO apparently thought he asked offerors to use Section B CLIN prices in the STO, nothing in the record indicates that he did this. The discussion letters do not advise offerors to use Section B CLIN pricing, and there is no suggestion that oral discussions were held. Ceres did not seek supplementation of the AR regarding this statement by the CO in the Price Negotiation Memorandum. Later, the CO and the SSEB recognized that most offerors did not in fact use their Section B pricing in the STO.

. The evaluation of price reasonableness is designed to prevent the Government from paying too high a price for a particular contract. See DMS All-Star Joint Venture v. United States, 90 Fed.Cl. 653, 663 n. 11 (2010); Serco, Inc. v. United States, 81 Fed.Cl. 463, 494 n. 48 (2008); see also Ralph C. Nash & John Cibinic, Cost and Price Analysis: Understanding the Terms, 9 No. 1 Nash & Cibinic Rep. ¶ 5 (1995). Normally, competition establishes price reasonableness. FAR 15.305(a)(1); FAR 15.404-1(a)(1), (b)(2).

. Defendant asserts that Plaintiff’s argument that Schedule B prices had to be used in the STO is untimely. According to Defendant, Ceres should have sought clarification of the solicitation prior to the closing date and/or protested any perceived ambiguities prior to submitting its proposal for consideration. Defendant contends that Ceres waived its ability to protest the solicitation's price evaluation criteria. However, Plaintiff is not challenging the solicitation's price evaluation criteria, but rather the way the agency applied — or failed to apply — the terms of the solicitation to evaluate price realism in the context of the STO. As such, this ground of protest is timely.

. The record is replete with references that the CO intended to use the revised STO exercise to assess the realism of Schedule B prices, but this was never communicated to offerors. See CAR 229 ("[T]he Government will ask all offerors to make the Sample Task Order price proposal reflect the recent prices offered for Region 5 in the Section B CLINS. With this new information, the Government can better evaluate the reasonableness and realism of the new proposed prices for ... AshBritt Inc[.] submitted on 28 AUG 2009.”) (emphasis added); CAR 346. In internally explaining the purpose of the final round of discussions in its November 17, 2009 Recompetition Evaluation Report, the CO stated that

The primary focus of this round of discussions was to attempt to determine the reasonableness of the latest proposed prices for Section B ... but the proposed pricing of the sample task order was not in line with the contract rates proposed in Section B.

CAR 1009, 1011-12 (emphasis added).

. Here, AshBritt and P & J both reaffirmed their low prices after discussions, eliminating the possibility that their bids were mistaken. Plaintiff has not challenged the agency’s responsibility determinations.

. Ceres also contends that the agency should have based its best value determinations on STO pricing, not Schedule B pricing, and should have deemed it the low-priced offeror, but such a process would have squarely contradicted the solicitation.

. In its supplemental brief Plaintiff argues that AshBritt violated the solicitation by not documenting its assumptions in calculating its STO prices. Plaintiff’s allegation is not supported by the record. See CAR 751-57.