In the United States Court of Federal Claims
No. 16-1071C
(Filed Under Seal: February 6, 2017)
(Reissued for Publication: February 14, 2017)
*************************************
ACTIVE NETWORK, LLC,

                    Plaintiff,

        v.

THE UNITED STATES,

                    Defendant,

        and

BOOZ ALLEN HAMILTON,

                    Defendant-Intervenor.
*************************************

Post-award Bid Protest; Price Realism Analysis; Disparate Treatment Claim; Past Performance Evaluation; Meaningful Discussions; Protestor's Burden to Show Prejudice; Remand.
Eric J. Marcotte, with whom were Kelly E. Buroker and Jacob W. Scott, Vedder Price, P.C.,
Washington, D.C., and Daniel R. Forman and James G. Peyster, Crowell & Moring, LLP,
Washington, D.C., for Plaintiff.
Mollie L. Finnan, with whom were Benjamin C. Mizer, Principal Deputy Assistant Attorney
General, Robert E. Kirschman, Jr., Director, Douglas K. Mickle, Assistant Director, Farida Ali,
Trial Attorney, Commercial Litigation Branch, Civil Division, U.S. Department of Justice,
Washington, D.C., and Elin M. Dugan, Senior Counsel, and Melissa D. McClellan, Attorney-Advisor, U.S. Department of Agriculture, Office of the General Counsel, for Defendant.
Mark D. Colley, with whom were Stewart W. Turner, Emma K. Dinan, and Amanda Johnson,
Arnold & Porter, LLP, Washington, D.C., for Defendant-Intervenor.
OPINION AND ORDER1
WHEELER, Judge.
In this post-award bid protest, Plaintiff Active Network, LLC (“Active”) challenges a
contract award by the United States Department of Agriculture, Forest Service to Booz Allen
Hamilton (“BAH”). The contract is for Recreation One Stop (“R1S”) support services, a program
which allows members of the public to make on-line reservations to visit national parks,
memorials, and museums, among others. Active alleges a number of defects in the procurement
process, including the Forest Service’s failure to conduct a mandatory price realism analysis, the
disparate treatment of technical proposals, an arbitrary and capricious past performance
evaluation, and failure to conduct meaningful discussions. The Court finds that the Forest Service
acted rationally and in accordance with the law in some instances, and in others that Active failed
to show any prejudice from procurement defects. However, the Forest Service failed to perform
a required price realism analysis, without which the Court cannot determine if the agency made a
rational decision in awarding to BAH. Therefore, this Court remands to the Forest Service to
conduct a price realism analysis. In all other respects, the protest is denied.
Background
On July 17, 2015, the Forest Service issued Request for Proposal AG-3187-S-1000
(“RFP”) contemplating award of a single indefinite delivery, indefinite quantity contract for R1S
support services involving the redesign, development, administration and maintenance of the web
site “Recreation.gov.” AR 1515. The agency emphasized that the awardee must be committed to
integrating pre-existing user interfaces while also updating and improving upon them in order to
“maximize the users’ end-to-end recreation experience.” Id. at 1367. The agency sought a
contractor able to provide exhaustive support services for R1S including program management,
telecommunications, data mining, and marketing. Id. at 295.
Once proposals were submitted, the contract was to be awarded based on a “best value
determination” consistent with four Factors: (1) Technical Approach, (2) Integrated Solution
Approach, (3) Past Performance, and (4) Price. Id. at 1559, 1580. Source selection was to occur
in two phases. Id. at 1580. During the “Initial Evaluation” the agency would evaluate Factors 1,
3, and 4 in order to establish the competitive range. Id. During the “Final Evaluation,” offerors
1 The Court issued this decision under seal on February 6, 2017 and invited the parties to submit proposed redactions of any proprietary, confidential, or other protected information on or before February 13, 2017. The parties proposed four redactions, which the Court allows. First, the Court replaced references to non-party Offerors with either [Offeror C] or [Offeror D]. Second, the Court replaced the final proposed prices of unsuccessful Offerors with [$***]. Third, the Court replaced the proposed prices for “time ticketing” on page 9 with [$***]. Fourth, the Court redacted its discussion of an Offeror’s particular technical capacity on page 10, which is indicated as [***].
in the competitive range would make oral presentations of their Factor 2 Integrated Solutions. Id.
at 1572. The RFP stated that the agency would engage in discussions with offerors in the
competitive range before awarding the contract. Id. at 1579.
Factors 1, 2 and 3 were “of equal importance and when combined [were] significantly more
important than price.” Id. at 1580. Under Factor 1, Technical Approach, the agency was to
“evaluate the Offeror’s Technical Approach based on the degree to which it is clear,
comprehensive, detailed, effective, and demonstrates how it provides, retains and applies the
necessary requirements.” Id. at 1581. Factor 1 consisted of six Subfactors, each containing its
own requirements and evaluated separately. Id. at 1568-70. Under Factor 3, Past Performance,
the agency was required to “evaluate the Offeror’s past performance based upon its relevancy and
recency of its references [and] the responses received from past performance surveys . . . .” Id. at
1584. Under Factor 4, Price, the agency was to evaluate the offerors’ prices for reasonableness,
completeness, realism and balance. Id. at 1586-87. Both Factors 1 and 2 were subject to scaled
ratings of Outstanding, Excellent, Acceptable, Marginal, and Unacceptable. Id. at 1588. Factor
3 would be assessed by first assigning a relevancy score of Very Relevant, Relevant, Somewhat
Relevant, and Not Relevant to each instance of past performance, and then assigning a
performance confidence assessment. Id. at 1589. Factor 4 was not subject to a ratings scale. Id.
at 1587. The RFP informed Offerors that the “proposal should not simply rephrase or restate the
Government’s requirements.” Id. at 1562. The agency warned that proposals should be “clear,
concise, and shall include necessary and sufficient detail for effective evaluation and for
substantiating the validity of stated claims.” Id.
Importantly for this dispute, the RFP required the use of Agile software methodology and
asked offerors to demonstrate their commitment to implementing Agile methodology.2 Id. at
1369. The Performance Work Statement provided that “[a] minimum viable product (MVP) shall
be releasable within 6 months (or as proposed) in a development environment based on an Agile
software development iterative process . . . .” Id. In evaluating the technical proposals, the agency
was particularly interested in familiarity with and commitment to Agile methodology.
In September 2015, the agency received six initial proposals including a proposal from the
incumbent contractor, Active. After evaluating the proposals under Factors 1, 3, and 4, the Source
Selection Evaluation Team (“SSET”) selected four offerors - Active, BAH, [Offeror C], and
[Offeror D] - for the competitive range. Id. at 7962-82. At this stage, BAH’s technical proposal
was rated “excellent” while all other technical proposals were rated “unacceptable.” Id. at 7979.
In December 2015, the agency notified each offeror in the competitive range and invited
each to make its Factor 2 Integrated Solution presentation. The agency also sent each offeror
Evaluation Notices (“ENs”), which were detailed discussion questions seeking clarification about
2 Agile software methodology is an alternative approach to traditional project management developed in 2001. For a description of Agile software methodology and its origins, see https://www.agilealliance.org/.
the proposals. See id. at 3085-3466. Offerors were to address the ENs during their Integrated
Solution presentations and submit written responses afterward. Id. at 3087, 3222. Active received
118 ENs, while BAH received 55. Id. at 3091, 3224. The agency informed Active that its proposal
was generally vague and required an entire revision to remove all the “nebulous” statements. Id.
at 13104. The agency held the Integrated Solution presentations in January 2016, and the agency
received EN responses by February 12, 2016. Id. at 4020-4708.
Between January and March 2016, the SSET completed evaluation of the initial proposals
and EN responses, and compiled consensus reports for all non-price Factors. Id. at 8569-9031.
However, the SSET Chair evaluated Factor 4, Price, separately in handwritten notes. Id. at 7911-
61. There is no consensus report for Price in the administrative record.
In March 2016, the agency invited all four offerors to submit Final Proposal Revisions
(“FPRs”). Id. at 4709. The letter stated that “[e]valuation for award will be solely on the
information presented in your [FPR]. Since we expect the FPR to be a comprehensive document,
the Government will not consider information submitted by you during the discussions process.”
Id. The agency received FPRs in late March. See id. at 4725-6695. On May 4, 2016, the SSET
produced a Proposal Analysis Report (“PAR”) which explained its consensus ratings, conclusions,
and recommendations for award. Id. at 9284-9350. The PAR contained hundreds of pages of
attachments consisting of a “running log” of evaluator comments on Factors 1-3. Id. at 9448-
9664; Def.’s Mot. at 6. In these attachments, evaluators cited the offerors’ materials which were
the source of the comment. See, e.g., AR 9452 (citing page 25 of Active’s FPR as demonstrating
that Active would be able to provide users with driving directions). The PAR provided a chart
summarizing the SSET’s evaluations of Factors 1-4 and their relevant Subfactors:
SSET Consensus Ratings on Offeror Final Revised Proposals

                          BAH              Active        [Offeror C]   [Offeror D]
Technical                 Outstanding      Acceptable    Acceptable    Marginal
Phase-In & Startup        Outstanding      Acceptable    Marginal      Marginal
Ops & Cust. Spt           Outstanding      Acceptable    Acceptable    Acceptable
Public Interface Spt Svc  Outstanding      Marginal      Acceptable    Excellent
Res & Other Rec Rel Svc   Outstanding      Marginal      Acceptable    Marginal
Telecom                   Acceptable       Acceptable    Acceptable    Acceptable
PM Spt Svc                Outstanding      Acceptable    Excellent     Marginal
Integrated Solution       Outstanding      Acceptable    Marginal      Marginal
IS Approach               Outstanding      Acceptable    Marginal      Marginal
Arch. & Agile Approach    Outstanding      Acceptable    Marginal      Marginal
Perf. Confidence          Substantial      Satisfactory  Satisfactory  Substantial
Price                     $182,113,077.03  [$***]        [$***]        [$***]
Id. at 9347-48. The PAR concluded that BAH had provided the superior proposal:
The SSET determined that the [BAH] proposal contained a total of 88
individual Strengths across all evaluation factors and subfactors . . .
and did not find any Weaknesses or Deficiencies associated with their
proposal. Similarly, the next higher priced proposal, [Active], was
determined to contain 24 Strengths, 7 Weaknesses, and 3 Deficiencies
across all evaluation factors and subfactors . . . . Additionally, the
SSET determined [Active] to have only Satisfactory Confidence
based on a body of work that was determined to be largely only
Somewhat Relevant to the full size scope and complexity of the
solicitation’s requirements, while [BAH] was determined to have
Substantial Confidence.
Id. at 9349. On the basis of this final evaluation, the SSET recommended that the Source Selection
Authority (“SSA”) award the R1S contract to BAH. Id. at 9350.
On May 5, 2016, the SSET Chair briefed the SSA on “all aspects of the source selection
process” using a PowerPoint slide presentation. Id. at 10875. The SSA received a copy of the
PAR before the briefing. Id. The SSA proceeded to create the Source Selection Decision
Document (“SSDD”) in which he “accomplished an independent analysis” using the information
provided during his briefing, the FPR Consensus Reports on Factors 1-3, and the PAR. Id. at
9919. The SSA concurred with the SSET’s recommendation stating that “[BAH] received the
highest possible ratings from the SSET in each of the 3 rated Evaluation Factors . . . . The
combination of the highest technical ratings and the lowest proposed price clearly establishes
[BAH] as the Best Value to the Government . . . .” Id. at 9926.
On May 12, 2016, the Contracting Officer (“CO”) notified BAH that it was the successful
offeror, and notified the other three offerors that they were unsuccessful. Id. at 9927, 9950, 9962,
9967. Active was the second lowest priced offeror with an acceptable technical proposal. Id. at
9347-48.
On May 23, 2016, Active filed a protest with the Government Accountability Office
(“GAO”) challenging “numerous significant flaws” in the procurement process. Id. at 10494,
10496. Active withdrew its protest on August 25, 2016, six days before the GAO was required to
issue its decision, after the Government declined alternative dispute resolution. Id. at 11367.
On August 26, 2016, Active filed its complaint in this Court seeking declaratory and
injunctive relief requesting the agency to terminate BAH’s award and re-initiate the procurement
process due to the agency’s failure to comply with the RFP and applicable regulations. Compl. at
1; Pl.’s Mot. at 40. On August 30, 2016, the Court granted BAH’s motion to intervene. On
November 2, 2016, Active filed a motion for judgment on the administrative record. On
December 2, 2016, the Government and BAH each filed an opposition to Active’s motion and a
cross-motion for judgment on the administrative record. The parties have fully briefed their
motions, and on January 24, 2017, the Court heard oral argument.
Discussion
Under the Tucker Act, when presented with a bid protest this Court reviews an agency’s
decision pursuant to the standards of the Administrative Procedure Act (“APA”). 5 U.S.C. § 706;
28 U.S.C. § 1491(b)(4); see also Impresa Construzioni Geom. Domenico Garufi v. United States,
238 F.3d 1324, 1332 (Fed. Cir. 2001) (stating that the APA standard of review shall apply in all
procurement protests in the Court of Federal Claims). Under the APA, this Court shall set aside
an agency action if it is “arbitrary, capricious, an abuse of discretion, or otherwise not in
accordance with law.” 5 U.S.C. § 706(2)(A); see Banknote Corp. of America, Inc. v. United
States, 365 F.3d 1345, 1350-51 (Fed. Cir. 2004). An agency’s decision does not violate the APA
if the agency “provided a coherent and reasonable explanation of its exercise of discretion.”
Impresa, 238 F.3d at 1332–33. Further, an agency must articulate a “rational connection between
the facts found and the choice made.” Motor Vehicle Mfrs. Ass’n of the United States, Inc. v.
State Farm Mutual Auto. Ins. Co., 463 U.S. 29, 43 (1983). The Court’s review is “highly
deferential” to the agency so long as the award decision was rationally explained. Bannum, Inc.
v. United States, 91 Fed. Cl. 160, 172 (2009).
If the Court determines that an agency acted without a rational basis, it must then determine
whether “the bid protestor was prejudiced by that conduct.” Bannum, Inc. v. United States, 404
F.3d 1346, 1351 (Fed. Cir. 2005). The plaintiff must show prejudice by demonstrating “that there
was a substantial chance it would have received the contract award but for [the agency’s
procurement] error.” Alfa Laval Separation, Inc. v. United States, 175 F.3d 1365, 1367 (Fed. Cir.
1999) (internal citation omitted); see also Bannum, 404 F.3d at 1353.
In its motion for judgment on the administrative record, Active argues that it is entitled to
relief due to four critical violations of the RFP and applicable regulations. First, Active argues
that the agency failed to abide by the RFP’s commitment to evaluate proposals for price realism.
Pl.’s Mot. at 7. Second, Active claims that the agency engaged in disparate treatment of the
offerors’ Factor 1 technical proposals. Id. at 16. Third, Active argues that the agency failed to
evaluate the relevance of past performance equally against a common relevancy standard. Id. at
27. Finally, Active asserts that the agency failed to conduct meaningful discussions with Active.
Id. at 32. After careful consideration, the Court GRANTS IN PART Active’s motion for judgment
on the administrative record as to its first argument and orders this matter remanded to the agency
to perform a price realism evaluation. As to Active’s remaining three arguments, the Court
GRANTS the Government’s and BAH’s cross-motions for judgment on the administrative record.
The Court will address each of Active’s four arguments in turn.
A. The Agency Failed to Conduct a Proper Price Realism Analysis.
When an RFP contains a price realism clause, the agency must abide by it. Alfa Laval
Separation Inc. v. United States, 175 F.3d 1365, 1367 (Fed. Cir. 1999); Afghan Am. Army
Services v. United States, 90 Fed. Cl. 341, 359 (2009). While the agency has discretion to
determine the means of evaluating price realism, that method must be rational and documented.
Cohen Financial Services, Inc. v. United States, 110 Fed. Cl. 267, 288 (2013). Finally, “if the
reviewing court simply cannot evaluate the challenged agency action on the basis of the record
before it, the proper course . . . is to remand to the agency for additional investigation or
explanation.” Florida Power & Light Co. v. Lorion, 470 U.S. 729, 744 (1985).
In this case, it is undisputed that the RFP called for a price realism evaluation. Pl.’s Mot.
at 8; Def.’s Mot. at 13. The dispute concerns whether the agency properly adhered to the
requirements of the price realism clause in the RFP. Section M of the RFP states:
E. Price Realism
The Government will assess the extent to which the pricing approach
demonstrates alignment with the requirements of the [Performance
Work Statement], achievement of program goals and objectives and
best overall value. The government will evaluate price realism as it
applies to:
- Demonstration of clear commitment to the success of the
program, achievement of program goals, increased capacity,
and better quality of service
- Demonstration of partnership and a willingness to balance
revenue and risks.
AR 1587. Further, Section L.20 of the RFP explains that “the purpose of this [price] analysis is
to determine that the Offeror fully understands the requirements of the solicitation and has the
ability and capacity to successfully perform the contract at the offered price.” Id. at 1575. Based
upon the language in the RFP, it is clear that the agency contemplated a qualitative and substantive
price realism analysis. If the agency adhered to its own price realism clause, then the record
should demonstrate a documented, clear explanation as to how BAH’s price demonstrated
“alignment with the requirements of the [Performance Work Statement]” and “achievement of
program goals.” The agency may choose its means of documenting and explaining its analysis,
but its chosen method must refer to the standards clearly stated in Section M of the RFP.
Advanced Data Concepts, Inc. v. United States, 216 F.3d 1054, 1058 (Fed. Cir. 2000) (a rational
evaluation must demonstrate “consideration of relevant factors.”). Here, the Court can find no
documentation in the administrative record of a price realism analysis consistent with Section M.
The materials reviewed and prepared by the SSA contain no price realism analysis. First,
the SSDD itself does not contain a satisfactory price realism analysis. It merely mentions the
price of each offeror in the competitive range. AR 9921, 9922, 9924, 9925. Second, the SSET
briefing materials contain four slides in which the prices of the four offerors are listed and
organized from lowest price to highest price. Id. at 9908-11. However, there is no discussion of
whether any of these prices are realistic in relation to the standards discussed in Section M of the
RFP. Id. Third, the FPR Consensus Reports upon which the SSA also relied only concern Factors
1-3. Finally, the PAR, representing the most exhaustive price analysis seen by the SSA, contains
two identical conclusory sentences for each of the four Offerors: “The SSET has determined the
Offeror’s FPR pricing to be realistic for the services proposed. There is no evidence that the
Offeror has manipulated their pricing to be unrealistically low in order to ‘buy-in’ to the R1S
Support Services contract.” Id. at 9315 (Active), 9325 (BAH), 9336 ([Offeror C]), 9347 ([Offeror
D]). These sentences do not contain any evaluation or analysis, just the conclusions reached by
the SSET. Further, they do not address the standards specifically mentioned in Section M. The
SSDD, SSET briefing materials, Consensus Reports and PAR provide no basis upon which the
Court could confidently affirm that the SSA independently performed a price realism analysis
consistent with Section M.
The record contains only one clear instance when the agency engaged in some kind of price
realism analysis. In early December 2015, the SSET Chair produced 51 pages of notes on the
offerors’ price proposals. Id. at 7911-61. For each offeror, the SSET Chair notes the price
proposed for each Contract Line Item Number (“CLIN”) and occasionally offers commentary
such as “[n]o issue here” or “it is unclear what this is for.” Id. at 7934-36. The SSET Chair then
concludes by writing “yes” or “no” next to “completeness,” “realism,” “reasonableness,” and
“balance.” Id. at 7920, 7929, 7939, 7951, 7961. These notes are insufficient under Section M of
the RFP for multiple reasons. While they make a price “realism” judgment, there is no evidence
that they make a judgment in light of the standards set forth in Section M. Also, the SSET
members responsible for price evaluation did not produce these notes and the SSA never saw
them.3 See id. at 9290. Finally, these notes were produced three months before FPRs were
received. Id. at 9288-89. During that time, two amendments were added to the RFP which
prompted offerors to change their price proposals. Id. at 5622-23.
Next, the Government argues that pricing comparison spreadsheets satisfy the agency’s
obligation to complete a price realism analysis under Section M. Def.’s Mot. at 15. These
spreadsheets contain only the offerors’ prices proposed for each CLIN. AR 9035-36, 9277-78.
The spreadsheets only demonstrate that a dollar-by-dollar comparison was performed in order to
determine who had the lowest price and the difference between prices. It is impossible to verify
that the qualitative standards set forth in Section M were considered by a purely quantitative
3 The Government refers to an email between the SSET Chair and an SSET member responsible for price analysis as evidence that the proper SSET member evaluated price. However, the email shows that the SSET member responsible for price just provided the SSET Chair with spreadsheets containing the offerors’ CLIN price proposals. AR 10436. The Government also refers to a statement by the CO before the GAO indicating that “the Pricing evaluation team conducted in-depth analysis of each Offeror’s proposed pricing to ensure the pricing proposed was complete, realistic, and reasonable . . . .” Id. at 10874. However, the CO cites no documents in the record to support this claim.
comparison. For example, the fact that BAH proposed [$***] for “time ticketing” while Active
proposed [$***] tells the Court nothing about whether these prices were in “alignment with the
requirements of the [Performance Work Statement]” or “[d]emonstrat[ed] [a] clear commitment
to the success of the program.” Id. at 1587. Without some sort of explanation to accompany these
spreadsheets, they are insufficient to show that a proper price realism analysis was completed.
Without a price realism analysis in the record, the Court has nothing to review and no way
of determining whether Active was prejudiced. This conclusion alone is sufficient to warrant
remand to conduct a proper price realism analysis. Florida Power & Light Co., 470 U.S. at 744;
Afghan Am. Army Services, 90 Fed. Cl. at 359. Pursuant to Rule 52.2, the agency is directed to
perform a price realism analysis specifically addressing and documenting whether the offerors’
price proposals are consistent with the standards put forward in Section M of the RFP.
B. The Agency did not Evaluate Active’s and BAH’s Technical Proposals Disparately.
An agency’s procurement decisions are arbitrary and capricious if they treat offerors
unevenly. PGBA, LLC v. United States, 60 Fed. Cl. 196, 207 (2004). Agencies must “evaluat[e]
proposals evenhandedly against common requirements.” GW Government Travel, Inc. v. United
States, 110 Fed. Cl. 462, 490 (2013). When evaluation factors are listed in the RFP, those factors
must be applied to each proposal consistently. TLT Construction Corp. v. United States, 50 Fed.
Cl. 212, 216 (2001).
Active claims that the agency did not evaluate the Factor 1 technical proposals equally
according to the RFP. Active alleges five examples of disparate evaluation of its technical
proposal and BAH’s technical proposal: (1) The agency relied on information included in BAH’s
EN responses in making its recommendation, but refused to do so for Active. (2) The agency
supported BAH’s Factor 1 rating by relying on information provided in BAH’s Factor 2
presentation and Quality Control Plan (“QCP”), but refused to do so for Active. (3) The agency
misevaluated Active and BAH’s MVP functionality. (4) The agency inconsistently evaluated
Active and BAH’s proposal for Public Interface Support Services (“PIS Services”). (5) The
agency inconsistently evaluated Active’s and BAH’s timeline to deliver MVP functionality. As
to the first three claims, the Court finds that the agency evaluated technical proposals rationally
and consistently. While there may be some merit to Active’s final two claims, the Court finds
that Active suffered no prejudice from the agency’s actions. For these reasons, the Court
GRANTS the Government and BAH’s cross-motions for judgment on the administrative record
as to Active’s claim of disparate treatment of technical proposals.
1. The Agency did not rely on EN Responses in evaluating BAH’s Technical
Proposal.
The agency informed offerors that an award determination would be based solely upon the
contents of the FPRs. AR 4709. The attachments to the PAR contained a list of SSET comments
and notes entitled “SSET Detailed Comments”. Id. at 9284. Active identified four instances in
this attachment where the SSET cited to EN responses to support giving BAH a “strength” for its
proposed software functionality. Id. at 9589-90. Active received a key weakness against its
technical proposal for failure to “state a clear plan for . . . the required Denver, Colorado program
management office.” Id. at 9306-07. The SSET commented that “[Active] articulated a
reasonable approach [for the] Denver, Colorado program management office, however they failed
to include this required information in their FPR submittal.” Id. Active infers from these record
citations that BAH received credit for information contained in its EN responses but not in its FPR, while Active did not receive credit for information likewise contained solely in its EN responses. Pl.’s Mot. at 17-19.
The Government has sufficiently demonstrated that Active misconstrues the record. While
the SSET Detailed Comments cite BAH’s EN responses, BAH’s FPR contains the same
information. For example, the SSET Detailed Comments cite to EN response 009 to support that
BAH’s [***] provide “benefits to [***] by [***].” AR 9590. BAH’s FPR likewise states that
“[***].” Id. at 5451. In fact, it seems that the only information contained in the SSET Detailed
Comments that does not appear in BAH’s FPR is the following sentence: “[***].” Id. at 9589.
All other comments made by the SSET can be supported by BAH’s FPR. See id. at 5451.
Moreover, the PAR itself contains no citations to EN responses and there is no evidence
that the PAR relied on anything other than information contained in the FPRs. The fact that a list
of SSET findings was attached to the PAR should come as no surprise to Active. Offerors were
informed that the “Final Evaluation will include the results of the initial evaluation, the Integrated
Solution Presentation, and Discussions.” Id. at 1580. The PAR stated that “key determinations”
were included in the PAR while other “individual evaluation finding[s]/comment[s]” would be in
the attachment. Id. at 9295. This language suggests that the award recommendation was solely
based upon the FPR, while the attachment to the PAR contains only the details of the final
evaluation conducted by the SSET in accordance with the RFP.
In sum, Active’s claim of disparate treatment fails because there was a meaningful
difference between its FPR and BAH’s FPR. BAH integrated its EN responses into its FPR, while
Active did not.
2. The Agency did not rely upon BAH’s Factor 2 Presentations and Quality Control
Plan in evaluating BAH’s Factor 1 Technical Proposal.
Similarly, Active claims that it suffered from disparate treatment when the agency relied
upon BAH’s Factor 2 Integrated Solution presentation and QCP to bolster BAH’s Factor 1 rating
but did not afford Active the same consideration. Pl.’s Mot. at 19, 20. First, Active claims that
the agency relied upon BAH’s Factor 2 Integrated Solution presentation in awarding a strength
for BAH’s advanced functionality by its own admission before the GAO. AR 11324; Pl.’s Reply
at 13. In contrast, the agency awarded Active a “key strength” for its Agile methodology
implementation plan under a Factor 2 evaluation, but awarded Active a weakness for its Agile
software methodology implementation plan under a Factor 1 evaluation. AR 9309. Second,
Active argues that the agency’s defense of BAH’s Agile methodologies rating before the GAO
rested solely upon language in BAH’s QCP. Id. at 11121; Pl.’s Mot. at 20. These two arguments
turn on whether the information cited as a strength for BAH was included in its FPR. Again, the
Government has sufficiently demonstrated that Active misconstrues the record.
As to Active’s first argument, the CO stated before the GAO: “BAH provided significant
detail regarding the benefits and functionalities of the ‘Advanced Functionalities’ during their
Integrated Solution Presentation, and followed up with sufficient detail in their [FPR] to satisfy
the SSET’s reasonable judgment that BAH was committed to providing these specific
functionalities in their final product.” AR 11345-46 (emphasis added). BAH’s FPR discusses its
“Advanced Functionality” in section 2.1.6. Id. at 5449-5451. The agency stated that
it would base its award recommendation solely on the FPR because “we expect the FPR to be a
comprehensive document.” Id. at 4709. This language indicates that the agency expected offerors to include in the FPR all of the information presented to the agency throughout the evaluation process. BAH followed these instructions and incorporated the information from its Integrated Solution presentation into its FPR, which allowed the agency to rely solely on the
FPR in assessing BAH a strength. However, Active did not follow these instructions. During its
Integrated Solution presentation, the agency praised Active for its plan to implement Agile
methodology under the new contract. Id. at 9309. But, Active’s FPR lacked a sufficient
description of how it would implement Agile methodology under its technical proposal. Id. at
9305-07, 9314.
Active’s second argument is based upon the agency’s reference, before the GAO, to a small
portion of BAH’s QCP in explaining why BAH received a strength for its commitment to
implementing Agile methodology. Id. at 11121; Pl.’s Mot. at 20. The PAR contains the same
commitment by BAH without any reference to the QCP. AR 9318. Further, the SSET Detailed
Comments attached to the PAR show that the SSET evaluated BAH’s commitment to Agile
methodology without any reference to the QCP. Id. at 9577, 9579, 9582-87, 9589-92. This one-
off statement by the CO during GAO proceedings is simply not sufficient to show that the agency
relied upon BAH’s QCP during the procurement process, especially given the extensive
discussion of Agile methodology in BAH’s FPR. See, e.g., id. at 5408-09, 5415-27, 5436-39.
The record shows that BAH’s FPR was sufficiently more detailed and comprehensive than
Active’s FPR. The SSET had more than enough information before it to justify both offerors’
technical ratings. There is no evidence that the agency relied on information outside the FPRs in
making its award recommendation.
3. The Agency Rationally Evaluated MVP Functionality.
Next, Active argues that the agency treated it and BAH differently with respect to the level
of detail required concerning MVP functionality. Pl.’s Mot. at 23. The agency issued Active ENs
asking it to further describe its proposed MVP functionality, but did not issue similar ENs to BAH.
AR 4043, 4045, 4047. Active argues that BAH’s description of its MVP functionality was just as
detailed as Active’s description, but only Active was criticized.
In its initial proposal, Active included a chart describing its MVP functionality. Id. at 1636-
39. Some “Basic System Requirement[s]” were described as “In Production” while others were
described as “In Development.” Id. Active introduces the table by stating: “The majority of the
basic system requirements are readily available to R1S. The table below outlines the basic
requirements in two components: In Production and In Development.” Id. The table and this
statement are the only information Active offered about its MVP functionality. All of the agency’s ENs focused on what exactly the distinctions between “in production,” “in development,” and “readily available” meant, and when Active would have all basic system requirements functional. Id. at 4043, 4045, 4047. BAH’s initial proposal clearly committed to specific deadlines regarding when
each basic system requirement would be functional. Id. at 1959. Thus, the concerns that prompted
the agency to criticize Active’s MVP functionality proposal were not present in BAH’s proposal
and there was no disparate treatment.
4. Active Fails to Establish Prejudice Even if its Remaining Claims Have Some Merit.
Active’s final two arguments claim that the agency inconsistently evaluated BAH and
Active’s PIS Services (Factor 1, Subfactor 3) and the timeline for delivering the MVP
functionality (Factor 1, Subfactor 1). In both instances, Active claims that BAH did not comply with the instructions of the RFP but was not assigned lower ratings for non-compliance. Pl.’s
Mot. at 21-23, 25-27. While both of Active’s arguments may have some merit, Active cannot
establish that it was prejudiced by the agency’s evaluation of its PIS Services and MVP delivery
timeline. Since Active did not demonstrate “that there was a substantial chance it would have
received the contract award but for” these alleged inconsistent evaluations, Active cannot prevail.
Alfa Laval Separation, Inc., 175 F.3d at 1367; Bannum, 404 F.3d at 1351.
First, in describing Factor 1, Subfactor 3, PIS Services, the RFP instructed that
“contractor[s] shall propose a fee structure applicable only to high volume data consumers. . . .
Should the contractor opt to propose such a fee structure, their proposal shall clearly state the
applicable rates and details of the proposed fee structure.” AR 1569 (emphasis added). According
to the plain language of the RFP, offerors were required to propose a fee structure applicable only
to high volume data consumers, as indicated by the use of the word “shall.” Id. Active’s proposal stated: “Third parties wishing to access this data . . . will be charged on a sliding scale based on the growth of the overall program.” Id. at 4801. It went on to propose a specific fee structure to which third parties would be subjected. Id. BAH’s proposal stated that “for high-volume users,
we have a fee model to support their desired usage as well. . . .[F]ees will be determined by
establishing a volume/usage threshold based on industry standards and best practices.” Id. at
5456. Active received a “marginal” rating for its PIS Services, while BAH received an
“outstanding” rating. Id. at 9347-48. According to Active, BAH did not provide a detailed fee structure, yet only Active was assigned a deficiency under the relevant technical Subfactor for PIS Services. Pl.’s Mot. at 21. The Government contends that Active received a deficiency because it proposed a fee structure that charged all third party users, not just high volume users, whereas BAH did provide a sufficient fee structure. Def.’s Mot. at 28.
Second, according to Factor 1, Subfactor 1, the MVP must be delivered “within 6 months
(or as proposed).” Id. at 1369. Active argues that this language entitled offerors to propose
delivery of the MVP after six months because the term “within” included all time before the six
month period ended and the “or” denoted any time after the six month period ended. Pl.’s Mot.
at 26; Texas State Comm’n for the Blind v. United States, 6 Cl. Ct. 730, 738 (1984) (“The term ‘within’ . . . is defined as ‘used as a function word to indicate enclosure or containment . . . .’”).
The Government argues that the language only allows offerors to deliver the MVP before six
months because allowing contractors to propose an MVP delivery schedule without a deadline
would be “absurd.” Def.’s Mot. at 33.
Active raises legitimate concerns regarding the agency’s evaluation of these two Subfactors and the strangely worded language in the RFP; however, the Court need not resolve these
concerns. Even if Active had also received an “outstanding” rating for the relevant technical
Subfactors (Factor 1, Subfactors 1 and 3), BAH still would have had three more “outstanding”
ratings and Active still would have been assessed one “marginal” rating. See id. at 9347-48 (chart
containing final evaluations). BAH’s technical proposal was simply far superior overall. The
contract would have been awarded to BAH even had Active and BAH been evaluated consistently
under these two Subfactors. Thus, Active has not established that it was prejudiced by the
agency’s evaluations.
For the reasons stated above, the Government and BAH’s cross-motions for judgment on
the administrative record are GRANTED as to Active’s claim of disparate evaluation of Factor 1
technical proposals.
C. The Agency’s Past Performance Evaluation was Rational.
When reviewing an evaluation of past performance in a negotiated procurement, the Court
affords an agency “the greatest deference possible.” Commissioning Solutions Global, LLC v.
United States, 97 Fed. Cl. 1, 9 (2011); Vanguard Recovery Assistance v. United States, 101 Fed.
Cl. 765, 784 (2011); Banknote Corp. of America v. United States, 56 Fed. Cl. 377, 386 (2003)
(citing Forestry Surveys & Data v. United States, 44 Fed. Cl. 493, 499 (1999)). A decision
regarding what constitutes relevant past performance falls within this considerable deference.
Glenn Defense Marine (ASIA) PTE, Ltd. v. United States, 720 F.3d 901, 911 (Fed. Cir. 2013);
PlanetSpace, Inc. v. United States, 92 Fed. Cl. 520, 539 (2010) (“At the outset, it is important to
note that what does or does not constitute ‘relevant’ past performance falls within the [agency’s]
considered discretion.”). An offeror’s mere disagreement with the evaluation does not mean that
the Court should override the agency’s judgments as unreasonable. Blackwater Lodge &
Training Ctr., Inc. v. United States, 86 Fed. Cl. 488, 515 (2009).
Under Section L of the RFP, offerors were required to submit up to six references of
relevant past experience to the agency. AR 1573-74. From the information provided, the agency
expected to receive a “clear understanding of the Offeror’s previous success in delivering the
proposed style of Agile methodology, systems implementation, marketing, telecommunications,
program management, and hosting.” Id. at 1585. In addition, the agency reserved the right to
obtain information related to past performance from any additional sources available to the
Government. Id. at 1573. Particularly pertinent for this protest, the agency defined “relevant”
past experience as “work which is the same and/or similar in complexity and scope to the work
described in [the] Performance Work Statement.” Id. at 1573. The agency assigned BAH a
Substantial Confidence rating based on its three Very Relevant references and three Relevant
references. Id. at 9314. Active received a Satisfactory Confidence rating based on its one Very
Relevant reference, one Relevant reference, and four Somewhat Relevant references. Id. at 9324.
Active argues that the agency applied different standards when evaluating the relevance of
Active and BAH’s past performance references. Pl.’s Mot. at 28. Active believes the agency
prejudiced it by taking a more relaxed approach as to what constituted relevant contracts for BAH.
Id. at 31. The Government responded that there is no evidence of disparate treatment, and the
agency lawfully and rationally rated the relevancy of Active’s past performance references based
on the criteria outlined in the RFP. Def.’s Mot. at 36; AR 1573. In reviewing the record, the
Court finds no evidence of the agency abusing its wide discretion in assigning relevancy ratings.
As an example of alleged unequal relevancy standards, Active points to the agency’s rating
of “Relevant” for BAH’s $28 million Defense Information Systems Agency (“DISA”) Teleport contract compared to
Active’s “Somewhat Relevant” rating for its $28 million New York State Parks recreation
management contract. Pl.’s Rep. at 21-22. Active argues that the two evaluation comments
displayed a large number of overlapping terms, but BAH received a higher rating. Id. In
reviewing BAH’s past experience, the agency noted in the attachments to the PAR that:
[i]n the BAH response, there [sic] DISA past performance does not
directly utilize Agile development services, mapping, ticketing, trip
planning, data management, or hosting. It did highlight the need for
Telecomm support services similar to the R1S [Performance Work
Statement].
AR 9636. The SSET stated the following about Active’s New York State Parks contract:
Active did use a RUP methodology, not an agile methodology.
They did not provide mapping, ticketing, trip planning, data
management, program management and hosting based on the
relevance of the R1S PWS. Active did provide a reservation system,
call center, web site, help desk, training, and reporting.
Id. at 9542.
Active believes these agency evaluations were extremely similar in terms of the
components identified as either present or lacking, and the references deserved the same relevancy
rating. Pl.’s Rep. at 21-22. However, the quotes referred to by Active point to only a portion of
the SSET’s evaluations. In general, the agency noted that Active’s past performance descriptions
were incomplete and made assigning a relevancy rating more difficult. See, e.g., AR 9542. BAH’s
proposal did not suffer from this defect. Regarding the very example that Active relies on in its
briefs, the agency goes on to describe additional aspects of BAH’s DISA contract that
demonstrated it was more relevant than Active’s New York State Parks contract, such as BAH’s
marketing related services. See id. at 9636. Viewing the agency’s evaluation as a whole provides
adequate context as to why BAH received higher relevancy ratings. Id. at 9632-40.
The standard of deference is clear when it comes to past performance evaluations. This
Court must give “the greatest deference possible” to the agency and its relevancy ratings. Commissioning Solutions Global, 97 Fed. Cl. at 9; Glenn Defense Marine (ASIA), 720 F.3d at 911. This Court sees ample reason to defer to the agency’s past performance evaluation. Given this extraordinary deference, the extensive record of evaluation comments contained in the PAR attachments more than justifies deferring to the agency’s relevancy ratings.
For the reasons stated above, the Government and BAH’s cross-motions for judgment on
the administrative record are GRANTED as to Active’s claim of disparate past performance
evaluations.
D. The Agency Engaged in Meaningful Discussions.
The FAR provides that discussions are “undertaken with the intent of allowing the offeror
to revise its proposal” with the “primary objective of . . . maximiz[ing] the Government’s ability
to obtain best value.” FAR 15.306(d), (d)(2). Further, the CO must discuss “deficiencies [or]
significant weaknesses . . . to which the offeror has not yet had an opportunity to respond.” FAR
15.306(d)(3) (emphasis added). Agencies are not required to have identical discussions with all
offerors, and the scope of those discussions is “a matter of [CO] judgment.” Id.; Atlantic Diving
Supply, Inc. v. United States, 107 Fed. Cl. 244, 263-64 (2012). Active argues that the agency
failed to notify it of two significant weaknesses in its proposal: an unacceptable system uptime
commitment and an unacceptable MVP delivery schedule. Pl.’s Mot. at 32. The Court finds that
the agency engaged in meaningful discussions consistent with the FAR and GRANTS the
Government and BAH’s cross-motions for judgment on the administrative record as to Active’s
fourth claim.
1. System Uptime Commitments
Section 6.3 of the Performance Work Statement stated that “[t]he Contractor shall . . .
maintain system uptime and availability to all users at a level of 99.9886% which equates to 60
minutes of unplanned downtime per year.” AR 1409. Active proposed an “acceptable quality
limit” of 99.9 percent and stated that it had “recorded a 100% uptime for FY2014.” Id. at 1849, 1632. In its final proposal, the acceptable quality limit was reduced to 99.5 percent. Id. at 4978.
The agency interpreted these statements to show that, in the past, Active had an uptime
commitment of 100 percent and, in the future, an uptime of 99.5 percent would be deemed
acceptable. Def.’s Reply at 17. Active argues that its historical uptime of 100 percent amounted to a commitment to 100 percent uptime in the future despite never putting that
commitment into writing. Pl.’s Mot. at 33-34. It is undisputed that Active’s “acceptable quality
limit” did not meet the clear standards set forward in the Performance Work Statement and the
agency never specifically entered into discussions with Active about that deficiency. Def.’s Reply
at 17; Pl.’s Mot. at 34.
The Court agrees with the Government that the vague statements in Active’s proposal
indicate, at most, Active’s uptime commitment in the past. AR 1632. At no point does Active
unambiguously commit to a 100 percent uptime in its proposal. In fact, the only clear commitment
is to an “acceptable quality limit,” which plainly violated the RFP. Active states that
since the agency never asked it about the nonconformance, “[Active] reasonably believed when it
submitted its FPR that its proposed [acceptable quality limit] could vary from the Solicitation’s
uptime target.” Pl.’s Mot. at 34. According to Active, the agency’s failure to question Active
about a clear violation of the RFP means that Active does not have to conform to the standards of
the RFP. This argument goes too far and places an extraordinary burden on agencies. Under
Active’s desired standard, it is the responsibility of the agency to make sure that offerors
understand the plain language of the RFP. One goal of meaningful discussions is to allow offerors
to respond to perceived weaknesses to which they “[have] not yet had an opportunity to respond.”
FAR 15.306(d)(3). Active cannot reasonably argue that it did not have an “opportunity to
respond” to a requirement that was clearly defined in the RFP. Active did respond; it just did so
defectively.
It is simply asking too much of an agency to identify each and every flaw in an offeror’s
proposal, especially when that flaw is in direct violation of an already clearly identified standard.
2. MVP Delivery Schedule
The agency also gave Active a deficiency for its failure to propose an MVP delivery
schedule within six months. See supra section B.4. Active claims that the agency never informed
Active of its nonconformance with the RFP. Pl.’s Mot. at 35. The agency did send Active two
ENs asking for clarification about its MVP delivery schedule, specifically indicating confusion
about Active’s commitment. AR 3101-02. Active’s proposal was ambiguous about when all the
requirements of the MVP would be available. See supra section B.3; AR 1636-39. As the
Government points out, Active’s proposal was so noncommittal that there was very little for the
agency to clarify. Def.’s Mot. at 45. These two ENs represent meaningful discussions regarding
Active’s MVP delivery schedule.
Active argues that BAH benefitted from discussions during the Factor 2 Integrated Solution
presentation regarding its adherence to Treasury requirements. Pl.’s Mot. at 35. Active claims
that if the agency was still unclear as to Active’s MVP delivery schedule, it should have inquired
during Active’s Integrated Solution presentation as well. Id. Active’s only evidence of
discussions between the agency and BAH during BAH’s presentation is one SSET member’s note
in the attachments to the PAR explaining that “[i]n discussions, [BAH] stated that they
understood” the Treasury requirements. AR 9625. Active is unable to point to any moment in
BAH’s Integrated Solution presentation transcript where an SSET member asked BAH about the
Treasury requirements. Instead, this note appears to have been written in response to information BAH presented
during the course of its Factor 2 Integrated Solution presentation. Id. at 15264-66.
In sum, the agency issued Active 118 ENs, while only 55 ENs were issued to BAH, and
allowed 120 minutes of presentation time. In addition, the agency told Active to revise its entire
proposal due to an “overwhelming number of nebulous statements scattered throughout the entire
proposal.” Id. at 13104. The agency’s discussions with Active helped Active improve its proposal
from unacceptable to acceptable. Def.’s Reply at 18. These discussions were clearly consistent
with the goals of FAR 15.306(d). Active demands too much from the agency. Thus, the Court
GRANTS the Government and BAH’s cross-motions for judgment on the administrative record
as to Active’s claim regarding meaningful discussions.
Conclusion
For the reasons stated above, this Court GRANTS IN PART Active’s motion for judgment
on the administrative record as to its first claim concerning price realism, but DENIES all other
claims alleged in Active’s motion. The Court also GRANTS IN PART the Government and
BAH’s cross-motion for judgment on the administrative record as to Active’s remaining claims,
but DENIES their cross-motions as to Active’s price realism claim. Pursuant to Rule 52.2, this
case is remanded to the agency to conduct a proper price realism analysis consistent with the RFP
and this opinion. The agency will have 30 days to complete a price realism analysis on remand.
The Government will file a Remand Report on March 8, 2017 indicating the analysis and
reasoning supporting the remand decision.
IT IS SO ORDERED.
s/ Thomas C. Wheeler
THOMAS C. WHEELER
Judge