Ames 1, LLC v. United States (2022)


    In the United States Court of Federal Claims
    AMES 1, LLC,
    Plaintiff,                                 No. 22-cv-00196
    v.                                     Filed Under Seal: August 3, 2022
    THE UNITED STATES,                                     Publication: August 10, 2022¹
    Defendant.
    Anne M. Tavella, Davis Wright Tremaine LLP, Anchorage, Alaska for Plaintiff. With her on the
    briefs is Jonathan A. DeMella, Davis Wright Tremaine LLP, Seattle, Washington.
    David M. Kerr, United States Department of Justice, Civil Division, Commercial Litigation,
    Washington, District of Columbia for Defendant. With him on the briefs are Brian M. Boynton,
    Principal Deputy Assistant Attorney General, Civil Division; Patricia M. McCarthy, Director,
    Commercial Litigation; Deborah A. Bynum, Assistant Director, Commercial Litigation; and
    Nicholas T. Iliff, Jr., United States Air Force, Commercial Litigation Field Support Center.
    MEMORANDUM AND ORDER
    Located on the edge of Anchorage, Alaska, “amid picturesque, majestic, snow-capped
    mountains, lakes, rivers and glaciers,” Joint Base Elmendorf-Richardson (JBER) spans nearly
    13,130 acres, making it the largest military installation in Alaska.2 JBER is “one of the
    1 This Memorandum and Order was filed under seal in accordance with the Protective Order
    entered in this case (ECF No. 8) and was publicly reissued after incorporating all redactions
    proposed by the parties. (ECF No. 28.) The sealed and public versions of this Memorandum and
    Order are substantively identical, except for the addition of the publication date and this footnote.
    2 https://installations.militaryonesource.mil/in-depth-overview/joint-base-elmendorf-richardson-jber (last viewed June 23, 2022); https://www.pacaf.af.mil/Info/Fact-Sheets/Display/Article/909896/elmendorf-air-force-base/ (last viewed June 23, 2022).
    most prominent and active Air Force bases in the United States,” housing elite units such as the
    United States Air Force’s 3rd Wing, whose mission is to “support and defend U.S. interests in the
    Asia Pacific region and around the world.”3 As a large military installation, JBER requires
    continued maintenance to keep its “more than 800 buildings, two runways and more than 150 miles
    of roads” in pristine condition.4 But Alaska is different from the lower forty-eight. Situated above
    the 50th and 60th parallels, Alaska experiences “severe weather conditions” between October and
    March, narrowing the window for performing such maintenance.5
    At issue in this protest are contract awards issued to third parties to perform maintenance
    and repair tasks at JBER.      The protestor, AMES 1, LLC (Ames 1), a “minority-owned,
    [Historically Underutilized Business Zone] small business,” brings this post-award bid protest
    challenging the decision of Defendant United States, acting through the U.S. Department of the
    Air Force (Air Force), “not to award AMES 1 an indefinite-delivery/indefinite-quantity” (IDIQ)
    contract under solicitation number FA500021R0001 (Solicitation). Complaint (ECF No. 1)
    (Compl.) ¶¶ 5-6, 10. In its Complaint, Ames 1 argues that the Air Force arbitrarily and capriciously
    (i) considered Past Performance by looking beyond offerors’ overall Past Performance rating,
    contradicting the Solicitation’s express language and the recommendation of the Source Selection
    Evaluation Board (SSEB), and (ii) failed to properly consider Price, resulting in an unlawful award
    3 https://installations.militaryonesource.mil/in-depth-overview/joint-base-elmendorf-richardson-jber (last viewed June 23, 2022); https://www.pacaf.af.mil/Info/Fact-Sheets/Display/Article/909896/elmendorf-air-force-base/ (last viewed June 23, 2022); https://www.jber.jb.mil/Units/Air-Force/ (last viewed June 23, 2022).
    4 https://www.pacaf.af.mil/Info/Fact-Sheets/Display/Article/909896/elmendorf-air-force-base/ (last viewed June 23, 2022).
    5 https://installations.militaryonesource.mil/in-depth-overview/joint-base-elmendorf-richardson-jber (last viewed June 23, 2022).
    decision. Id. ¶¶ 40-53. Ames 1 seeks (1) an order finding unlawful the Air Force’s failure to
    award Ames 1 a contract, (2) an injunction barring the Air Force from “proceeding with the award,
    including any task order award,” (3) an order directing the Air Force to “conduct an evaluation
    and make an award consistent” with the Solicitation and law, (4) its attorneys’ fees and costs
    associated with this action, including before the Government Accountability Office (GAO),6 and
    (5) “such other relief as the Court deems appropriate.” Id. at 17-18.
    The parties subsequently filed motions for judgment on the administrative record. See
    Plaintiff’s Motion for Judgment on the Administrative Record (ECF No. 15) (Pl.’s MJAR);
    Defendant’s Response to Plaintiff’s Motion for Judgment on the Administrative Record and Cross-
    Motion for Judgment on the Administrative Record (ECF No. 16) (Def.’s Cross-MJAR); see also
    Plaintiff’s Reply in Support of Its Motion for Judgment on the Administrative Record (ECF No.
    17) (Pl.’s Reply); see infra Background Section IV.
    On March 31, 2022, this Court issued an injunction in a related case, Frawner Corporation
    v. United States, No. 22-cv-0078, mooting Ames 1’s MJAR. See March 31, 2022 Order (ECF No.
    21); Frawner, Memorandum and Order (ECF No. 36) (Frawner Mem. and Order). Accordingly,
    as reflected on the record and in this Court’s March 31, 2022 Order (ECF No. 21), and for the
    reasons explained below, this Court DENIES as moot Plaintiff’s Motion for Judgment on the
    Administrative Record (ECF No. 15) and DENIES as moot Defendant’s Cross-Motion for
    Judgment on the Administrative Record (ECF No. 16).
    6 Plaintiff does not address fees or costs in its Motion for Judgment on the Administrative Record.
    See Plaintiff’s Motion for Judgment on the Administrative Record (ECF No. 15).
    BACKGROUND7
    The current protest centers on “a multiple-award IDIQ involving minor construction
    services as well as multi-discipline facility and real property repair and alteration services to be
    performed on Joint Base Elmendorf-Richardson” (JBER) in Anchorage, Alaska. Compl. ¶ 6.
    While the JBER contracting program involves two government contracts – a larger project
    referenced as the DB-MACC and a complementary Multiple Award Construction Contract (Mini-
    MACC) involving “small project[s]” – only the Mini-MACC contracts are at issue in this protest.
    Tab 2 (Advanced Procurement Plan (March 15, 2019)) at Administrative Record (AR) 78. As
    explained further below, Ames 1 filed the present action after the Air Force declined to award it
    one of the Mini-MACCs.
    I.    Solicitation
    The Air Force issued the Solicitation for the Mini-MACC contracts on May 3, 2021. See,
    e.g., Tab 8 (Solicitation No. FA500021R0001 (May 3, 2021)); Tab 58 (Executed Contracts
    (December 28, 2021)) at AR 3291 (noting “DATE ISSUED 5/3/2021”). Bids were originally due
    by June 1, 2021, at 2:00 p.m. Alaska Daylight Time. Tab 8 at AR 320. Amendment 0005 to the
    Solicitation extended that deadline to June 14, 2021.8 Tab 13 (Amendment 0005 (June 6, 2020))
    7 This section contains the Court’s findings of fact derived from the Administrative Record (AR).
    The AR is contained in ECF No. 13. Documents within the Administrative Record are divided
    into “Tabs.” An index of the Administrative Record’s tabs can be found at ECF No. 13-1.
    8 Between May 3, 2021 and June 7, 2021, the Air Force issued five other amendments to the
    Solicitation that did not substantively alter the evaluation process as it relates to this action. See
    Tab 41 (Source Selection Evaluation Board Report (December 8, 2021)) at AR 2565 (Amendment
    0001 – “Updated site visit date and building number for the seed project”; Amendment 0002 –
    “1st proposal due date extension, remove FAR clause, removed mission essential contractor
    services plan, and provide site visit sign in sheet”; Amendment 0003 – “Responded to contractor
    questions 1 through 40”; Amendment 0004 – “Responded to contractor questions 41-206, revised
    SOW, Spec and drawings for Seed Project”; and Amendment 0006 – “Changed Schedule B unit
    of issue from Lot to Project”); see also Tab 9 (Amendment 0001 (May 3, 2021)); Tab 10 (Amendment 0002 (May 25, 2021)); Tab 11 (Amendment 0003 (May 25, 2021)); Tab 12 (Amendment 0004 (May 27, 2021)); Tab 14 (Amendment 0006 (June 6, 2020)).
    at AR 835. The Air Force anticipated awarding “up to 4” Mini-MACC contracts that would satisfy
    its construction needs at JBER on an “as needed basis.” Tab 8 at AR 320. While the DB-MACC
    primarily fulfilled task orders “requiring more than incidental design or the services of a registered
    architect or professional engineer,” the Mini-MACC complemented the DB-MACC by satisfying
    JBER’s “small project” needs. Id. at AR 324. As specified in the Solicitation, the Mini-MACC’s
    tasks, to be fulfilled by the Mini-MACC awardees, would consist of “multiple disciplines in
    general construction categories of on-base facilities for JBER.” Id. Contractors fulfilling the Mini-
    MACC’s requirements were expected to have the design and engineering expertise of a “general
    construction contractor.” Id.
    The Mini-MACC awards are indefinite delivery/indefinite quantity (IDIQ) contracts
    governed under FAR “Part 15, Department of Defense (DoD) FAR Supplement Procedures,
    Guidance and Information Subpart 215.3, and Air Force FAR Supplement (AFFARS) Mandatory
    Procedure (MP) 5315.3.” Tab 8 at AR 323; see Tab 7 (Source Selection Plan (June 4, 2021)) at
    AR 317. As defined by the General Services Administration (GSA), IDIQ contracts “provide for
    an indefinite quantity of services for a fixed time [and] are used when [an agency] can’t determine,
    above a specified minimum, the precise quantities of supplies or services that the government will
    require during the contract period.” U.S. General Services Administration, “Indefinite Delivery,
    Indefinite Quantity Contracts,” https://www.gsa.gov/buying-selling/new-to-gsa-acquisitions/how-
    to-sell-to-the-government/indefinite-delivery-indefinite-quantity-contracts (last viewed June 27,
    2022).
    While Mini-MACC awardees would not know the Air Force’s precise construction needs
    during the lifetime of the procurement, the Solicitation provided estimates for compensation. First,
    each Mini-MACC contract’s value was estimated between $2,000 and $99,999,999 over the five-
    year base period given by the Solicitation. See Tab 8 at AR 320. Second, individual task orders
    under the IDIQ would have a minimum value of $2,000 and a maximum value of $2,000,000,
    “with the majority expected to be less than $500,000.” Id. at AR 323, AR 395. Third, in addition
    to the IDIQ contracts, the Solicitation also included an award for a seed project, FXSB 17-1110,
    Repair BLDG. 5327 Exterior, JBER, AK, which had an estimated value of $500,000 to
    $1,000,000. Tab 8 at AR 320; see Tab 9 at AR 694 (describing the seed project). As explained
    below, the Air Force evaluated the seed project award concurrently with the IDIQ contract awards
    and used the seed project as a measure for evaluating Price for both awards. See infra Background
    Section I(B).
    A. Off-Ramp/On-Ramp Award Procedures
    Although the Solicitation limited the number of initial Mini-MACC IDIQ contracts to “up
    to four,” that pool of awardees could expand or contract if the Air Force triggered certain off-
    ramp/on-ramp procedures. Tab 8 at AR 336-37. In conjunction with selecting “up to four” IDIQ
    awardees, the Air Force created a “reserve vendor pool of up to ten (10) contractors” of otherwise
    eligible offerors who were not chosen for one of the awards. Id. at AR 336. The Solicitation stated
    that such reserve vendors “may be offered an opportunity, within the ordering period of this
    contract, to receive an IDIQ award and be authorized to participate in task order competition.” Id.
    The Air Force could “off-ramp” one of the four awardees for reasons such as “[c]onvenience of
    the Government,” and it could “on-ramp” contractors from the reserve pool if “in the
    Government’s best interest.” Id. In the event the agency chose to “on-ramp” bidders, it would
    “start[] at the highest ranked On-Ramp contractor in the [reserve] pool.” Id. An “on-ramped”
    offeror would then “become an additional Mini-MACC awardee.” Tab 8 at AR 337. The Air
    Force was not required to “off-ramp” one of the four original awardees prior to “on-ramping” an
    offeror from the reserve vendor pool. See id. at AR 336-37.
    B. Basis of Evaluation
    This procurement was a “best value” procurement conducted on a “competitive subjective
    tradeoff” of (1) Price and (2) Past Performance, with the latter “significantly more important” than
    the former. Id. at AR 389. The Air Force only considered “offerors whose proposals conform[ed]
    to all required terms and conditions, include[d] all required representations and certifications,
    [met] all requirements set forth in the RFP[,] and also provide[d] the best value to the
    Government.” Id.
    The Solicitation permitted the Air Force to award a contract to another offeror that did not
    submit the lowest price “if the difference in the Past Performance Confidence Rating of another
    offeror with [a] higher price justifie[d] the higher price premium.” Id. at AR 391. However, such
    justification had to be based on “an integrated assessment best value award decision using the
    [total evaluated price (TEP)] and the Past Performance Confidence Rating.” Id. At the final stage
    of its analysis, “[o]nce selected awardees [had] been identified for the IDIQ, the Government . . .
    rank[ed] each selected awardee by price from lowest to highest for the seed project (Attachment
    J-4) and award[ed] the seed project to the lowest offeror.” Id. Accordingly, consideration of the
    IDIQ and seed project awards occurred concurrently. Id. The agency’s evaluation of Price and
    Past Performance is described below.
    i. Factor 1: Price
    Price was evaluated based on offerors’ proposed pricing for the seed project. Tab 8 at AR
    389. Offerors submitted their Attachment J-4 Price Schedule, which was then used as the offeror’s
    total evaluated price (TEP) for its bid. Id. While the Solicitation required the Air Force to
    “evaluate the fairness and reasonableness of the Total Evaluated Price (TEP) for all offerors,” the
    Air Force retained the discretion to choose its “price analysis technique[] and procedure[].” Id.
    The Solicitation lists “examples” of such techniques:
    (a) Price analysis: The process of examining and evaluating a proposed price without
    evaluating its separate cost elements and proposed profit.
    (b) Comparison of proposed prices received in response to the [S]olicitation. Normally,
    adequate price competition establishes price reasonableness.
    (c) Comparison of previously proposed prices and previous Government contract prices
    with current proposed prices for the same or similar effort.
    Id. In addition to a price reasonableness analysis, the agency could — but was not required to —
    conduct a price realism analysis to ensure that “the project [could] realistically be completed within
    the proposed constraints.” Id. As a result of any price realism analysis, the Air Force would
    disregard price proposals that it found “unrealistically low.” Id.
    ii. Factor 2: Past Performance
    Past Performance was evaluated to “assess the degree of confidence the Government has
    in the offeror’s ability to meet the [S]olicitation requirements based on the offeror’s demonstrated
    record of performance.” Tab 8 at AR 390. The Air Force could evaluate up to five past projects
    submitted by an offeror. Id. at AR 379. In assessing Past Performance, the Air Force could “give
    greater consideration to information on those contracts deemed most relevant to the effort
    described in [the] [S]olicitation.” Id. at AR 391.
    The Solicitation indicated that “[r]elevant past performance information for the five (5)
    completed projects must demonstrate minimum design/build and build experience with multiple
    disciplines.” Id. at AR 380. While “[n]ot all projects are required to have design/build or multiple-
    discipline aspects,” the Solicitation noted that “this experience must be represented within the total
    of submitted contracts/projects.” Id. at AR 380-81. The Solicitation listed applicable disciplinary
    skills as follows:
    (1) Demolition
    (2) Painting
    (3) Carpentry
    (4) Mechanical/HVAC/Plumbing
    (5) Fire Sprinkler/Fire Alarm Systems
    (6) Electrical
    (7) Structural
    (8) Minimal Design as specified in Mini-MACC SOW 01000, para. 1.3
    (9) Hazmat/Asbestos/Abatement
    (10) Roofing/Insulation/Thermal & Moisture Control
    (11) Civil work
    Id. at AR 380.
    1. Past Performance Documentation
    Offerors were required to submit the following documentation along with each past project:
    a (1) Past Performance Questionnaire, (2) Past Performance Questionnaire Cover Letter, (3) Past
    Performance Supplement Worksheet, and (4) Past Performance Information Document. Tab 8 at
    AR 379.
    Past Performance Questionnaires (PPQs) and Cover Letter. PPQs, also referenced as
    Attachment J-7 to the Solicitation, are completed by references who worked with the offeror on
    one of its past projects. Id. at AR 379. The Solicitation “require[d] the offeror send out a PPQ to
    each [reference] identified in the Past Performance” section. Id. The PPQ contained fifteen
    multiple choice questions, and the reference was asked to “answer all questions by checking only
    one (1) response per question, [while placing] additional information . . . in the space provided.”
    Tab 8 at AR 613-18. In addition to the fifteen multiple choice questions, the PPQ also requested
    the reference indicate, inter alia, the project’s (i) “original” and “final” award amount, (ii) status
    as “active” or “100% complete,” (iii) “completion date” or “expected completion date,” and (iv)
    inclusion of any of the eleven disciplines (i.e., Demolition) listed in the Solicitation. Id. at AR
    614; see supra p. 9. After completing the PPQ, the reference was required to email the PPQ and
    PPQ Cover Letter directly to Air Force personnel Donald Dougherty and John Jeffrey indicating
    whether the reference would recommend the offeror for the procurement. See Tab 8 at AR 379.
    Past Performance Supplement Worksheet. The Past Performance Supplement Worksheet,
    also referenced as Attachment J-8 to the Solicitation, is a Microsoft Excel spreadsheet completed
    by the offeror. An offeror must list its submitted projects (five maximum) indicating each past
    project’s (i) “[p]eriod of [p]erformance,” (ii) “[p]rice,” and (iii) inclusion of any of the eleven
    disciplines (i.e., Demolition) listed in the Solicitation. Id. at AR 619; see supra p. 9.
    Past Performance Information Document. Offerors were required to provide a “[s]ummary
    of each contract/project” that included the following:
    (a) Company/Division name[,] (b) Contract/Project Title[,] (c) Contract/Project
    Location[,] (d) Contracting Agency/Customer[,] (e) Contract Number[,] (f)
    Contract Dollar Value[,] (g) Period of Performance[,] (h) Verified, up-to-date
    name, mailing and e-mail addresses, and telephone number of the contracting
    officer (Point-of-Contact)[,] (i) Comments regarding compliance with contract
    terms and conditions (e.g. scope, cost and period of performance, labor and
    statutory requirements)[,] (j) Descri[ption] [of] any known performance deemed
    unacceptable by the customer, or not in accordance with the contract terms and
    conditions [and] description of how it was resolved[,] (k) [S]ummary description
    of the project scope of work [including] (i) rationale supporting your assertion of
    relevance and identify aspects (scope, magnitude of effort, and complexity) of the
    contracts deemed relevant and how they relate to the proposed effort [and] (ii)
    [d]emonstration of performance of minimal design/build[,] [d]emonstration of
    management of multiple discipline construction projects[,] [d]emonstration of
    meeting project cost, quality standards and schedule, (l) Discussion of noteworthy
    aspects and challenges[,] [and] (m) Pictures of projects may be included, if desired.
    Id. at AR 379-80.
    2. Past Performance Analysis
    Based on the documentation provided by the offerors and their references, each of the five
    submitted projects was assessed for “Recency and Relevancy.” Id. at AR 390. Those projects
    deemed “Recent” and “Relevant” were then assessed for Quality. Id. The Solicitation defined
    Recent projects as “those efforts completed for any customer(s) within the last three (3) years prior
    to the issuance date of the [S]olicitation.” Id. The Solicitation noted that Relevant projects would
    be “assessed based upon the extent to which past performance is of similar scope, magnitude and
    complexity to the type of projects exemplified by the seed project for this [S]olicitation.” Id. The
    Solicitation further noted that “[t]o be considered relevant, greater consideration [would] be given
    to submitted contracts demonstrating work completed above the 60th parallel in a seismically
    active area (e.g. Alaska).” Tab 8 at AR 380. Further, “[c]omplexity and scope [could] be
    determined by viewing the previous contracts” based on the presence of the eleven disciplines
    listed in the Solicitation. Id. at AR 381. The Solicitation provided the following Relevancy
    adjectival ratings and accompanying definitions:
    Rating: Very Relevant. Definition: Present/past performance involved essentially the same
    scope and magnitude of effort and complexities this solicitation requires.
    Rating: Relevant. Definition: Present/past performance effort involved similar scope and
    magnitude of effort and complexities this solicitation requires.
    Rating: Somewhat Relevant. Definition: Present/past performance effort involved some of
    the scope and magnitude of effort and complexities this solicitation requires.
    Rating: Not Relevant. Definition: Present/past performance effort involved little or none
    of the scope and magnitude of effort and complexities this solicitation requires.
    Id. at AR 390. The Solicitation warned that only those past projects deemed “[R]ecent, [R]elevant
    and [S]omewhat [R]elevant” would be assessed for Quality. Id. Accordingly, a “Not Recent” or
    “Not Relevant” project would not receive further consideration for purposes of Past Performance
    analysis. Id.
    Quality was assessed “with a focus on quality control, timely performance, effectiveness
    of management, and regulatory compliance.” Id. After the Air Force completed its (i) Recency,
    (ii) Relevancy, and, if applicable, (iii) Quality assessment for up to five of an offeror’s submitted
    projects, each offeror was then “assigned a single past performance confidence rating.” Tab 8 at
    AR 391. The Past Performance confidence ratings and their accompanying definitions were as
    follows:
    Rating: SUBSTANTIAL CONFIDENCE. Description: Based on the offeror’s
    recent/relevant performance record, the government has a high expectation that the offeror
    will successfully perform the required effort.
    Rating: SATISFACTORY CONFIDENCE. Description: Based on the offeror’s
    recent/relevant performance record, the government has a reasonable expectation that the
    offeror will successfully perform the required effort.
    Rating: NEUTRAL CONFIDENCE. Description: No recent/relevant performance is
    available or the offeror’s performance record is so sparse that no meaningful confidence
    assessment rating can be reasonably assigned. The offeror may not be evaluated favorably
    or unfavorably on the factor of past performance.
    Rating: LIMITED CONFIDENCE. Description: Based on the offeror’s recent/relevant
    performance record, the government has a low expectation that the offeror will successfully
    perform the required effort.
    Rating: NO CONFIDENCE. Description: Based on the offeror’s recent/relevant
    performance record, the government has no expectation that the offeror will successfully
    perform the required effort.
    Id.
    II.    The Air Force’s Analysis and Award Decisions
    The Air Force received thirteen timely proposals from the following entities: Ahtna Global,
    LLC (Ahtna); AMES 1, LLC (Ames 1); Ancor, Inc. (Ancor); Eklutna Construction &
    Maintenance, LLC (Eklutna); Frawner Corporation (Frawner); HPM, Inc. (HPM); Iyabak
    Construction, LLC (Iyabak); Nodak Electric & Construction, Inc. (Nodak); Orion Construction,
    Inc. (Orion); SD Construction, LLC (SD Construction); Tikigaq Federal Services, LLC (Tikigaq);
    Tyonek Construction Services, LLC (Tyonek); and White Mountain Construction, LLC (White
    Mountain Construction).9 See Tab 41 (Source Selection Evaluation Board Report (December 8,
    2021)) at AR 2566. The Air Force’s review proceeded in two steps. First, its Source Selection
    Evaluation Board (SSEB) issued a report recommending awardees based on analyses performed
    by its “pricing and past performance evaluation teams.” Id. at AR 2565-66. Second, its Source
    Selection Authority (SSA) reviewed the SSEB’s report and made the ultimate award decisions.
    See Tab 44 (Source Selection Decision (December 20, 2021)) at AR 2704-11. Each entity’s
    analysis and report is summarized in turn.
    A. The Source Selection Evaluation Board (SSEB) Report
    On December 8, 2021, the SSEB issued a report analyzing the offerors and detailing its
    awardee recommendations to the SSA. Tab 41 at AR 2652.
    i. Factor 1: Price
    In evaluating Price, the Air Force’s price evaluation team “reviewed each offeror’s
    Attachment [J-]4 Price Schedule for the seed project to calculate each offeror’s [Total Evaluated
    Price (TEP)].” Tab 41 at AR 2566. Based on its review, the price evaluation team determined
    that it was unnecessary to conduct a price realism analysis for this procurement. Id. Next, in
    assessing price reasonableness, the SSEB employed “the technique outlined in FAR 15.404-
    1(b)(2)(i),” by comparing proposed prices received “to establish a fair and reasonable price.” Id.
    The SSEB went a step further to “assist in a more thorough analysis of price” by calculating the
    9 Defendant did not evaluate two additional offerors —
    . Tab 41 at AR 2566-67.         submitted its proposal late, and the Air
    Force never received         proposal. Id.
    average of all TEPs and comparing each price “to the mean of all evaluated proposals.” Id. In its
    documentation of each offeror’s Price assessment, the SSEB indicated each offeror’s (i) total
    evaluated price, (ii) “[m]ean [p]riced [p]roposal,” (iii) “[d]ifference from [m]ean [p]riced
    proposal” in dollars, and (iv) “[d]ifference from [m]ean [p]riced [p]roposal” by percentage. See,
    e.g., Tab 41 at AR 2613.
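    For illustration only (this computation is not reproduced in the record excerpts quoted here; the figures are the TEPs and percentages cited elsewhere in this opinion), the percentage comparisons reflect the ordinary difference-from-a-reference calculation. For example, comparing Tyonek’s $1,874,000 TEP to SD Construction’s $933,000 TEP:
        ($1,874,000 − $933,000) ÷ $933,000 ≈ 1.009, i.e., approximately 100.9% higher,
    which is consistent with the SSA’s later observation that Tyonek’s TEP was “100.9% higher than SD [Construction].” See infra Background Section II(B)(iii). The “[d]ifference from [m]ean [p]riced [p]roposal” percentages were computed the same way, using the mean of all TEPs as the reference value.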
    The SSEB determined that price competition in this procurement — among thirteen bidders
    — was a factor that ensured reasonable prices for the Air Force:
    In this case, it is apparent that two or more responsible offerors, competing
    independently, submitted priced offers that satisfy the Government’s expressed
    requirement. Therefore, price competition can be used as the basis to establish a
    fair and reasonable price.
    Id. at AR 2568. However, the Air Force’s analysis did not end there. It noted an expectation of
    price variation in the proposals because “in the Anchorage market individual project costs vary,
    sometimes quite significantly, between offerors on competitive proposals.” Id. The SSEB
    provided numerous reasons for the pricing disparity in Anchorage including: “some contractors
    have their own workforce for certain disciplines, some own equipment for certain project types vs.
    having to lease equipment, better relationships with limited sub-contractor marketplace, etc.” Id.
    Finally, in reference to the two highest priced bids (Tyonek and Eklutna), the SSEB noted that
    “although th[ese] offeror[s’] price[s] [are] substantially higher than other offerors that does not
    mean [their] offer[s] [are] not fair and reasonable. [Instead,] their price is still considered [to] be
    within a reasonable range given the extreme variability in prices seen in the past year.” Tab 41 at
    AR 2582 (Eklutna), AR 2602 (Tyonek).
    ii. Factor 2: Past Performance
    The past performance evaluation team assessed Past Performance by analyzing the required
    documentation provided by offerors and their references as well as “information obtained [in
    accordance with] FAR Part 42.1503(g), and . . . any other information independently obtained by
    the Government.”     Id. at AR 2566; see supra Background Section I(B)(ii)(1).            While the
    Solicitation did not specify how “scope, magnitude of effort, and complexity” would be evaluated,
    the SSEB evaluated (i) “scope,” (ii) “magnitude,” and (iii) “complexity” separately as independent
    sub-factors of Relevance, with each prong receiving its own adjectival sub-rating reflecting one of
    the overall Relevance adjectival ratings. See Tab 41. The SSEB then automatically applied the
    lowest of the three adjectival sub-ratings as the overall Relevance rating for that project. Id. For
    example, Ames 1’s F-35A Aircraft Maintenance Unit Administrative Facility (EIE432) project,
    which received “Very Relevant” “scope” and “complexity” sub-ratings, but a “Not Relevant”
    “magnitude” sub-rating, was rated “Not Relevant.” Tab 41 at AR 2580. The SSEB evaluated each
    Relevance sub-factor as follows:
    Scope. The SSEB assessed “scope” by determining a past project’s inclusion of the eleven
    disciplines stated in the Solicitation.10 See, e.g., Tab 41 at AR 2569; see also supra p. 9. Past
    projects demonstrating three or more of the eleven disciplines received a “Very Relevant” “scope”
    sub-rating. See generally Tab 41 (assigning a “Very Relevant” “scope” sub-rating for past projects
    with three through eleven of eleven disciplines listed in the Solicitation). Past projects containing
    fewer than three disciplines received lower “scope” sub-ratings. See, e.g., id. at AR 2593
    (assigning                                                project a “Relevant” “scope” sub-rating
    where it demonstrated two of the eleven disciplines listed in the Solicitation).
    Magnitude. In evaluating “magnitude,” the SSEB assessed whether a past project was
    “representative of the task order values projected for the Mini-MACC.” See, e.g., id. at AR 2569.
    10 While the Solicitation did not specify how its “multiple discipline[ary]” requirement could be
    met, the SSEB analyzed those “disciplines” in its “scope” sub-factor analysis. See Tab 8 at AR
    380-81.
    In doing so, it assigned “Very Relevant” sub-ratings to past projects valued less than $2 million
    and “Not Relevant” sub-ratings to past projects valued more than $2 million. Compare id. at AR
    2579 (
    ), with id. at AR 2580 (
    ).
    Complexity. A past project’s “complexity” sub-rating hinged on whether it was performed
    “above the 60th parallel in a seismically active area.” See, e.g., id. at AR 2644. The SSEB assigned
    “Very Relevant” “complexity” sub-ratings to projects performed above the 60th parallel and in a
    seismically active area. Id. (
    ). If a past project was performed below the 60th parallel
    or was not in a seismically active area, it received a lower “complexity” sub-rating. See, e.g., id.
    at AR 2622 (assigning “Relevant” “complexity” sub-rating to
    project “performed in Glacier Bay, AK, which is below the 60th parallel and in a
    seismically active area”).
    In assessing Recency, the SSEB mistakenly noted that projects “completed within three (3)
    years of the solicitation date of April 30, 2021” were Recent. See, e.g., id. at AR 2568.
    Defendant’s counsel noted at oral argument in Frawner that Defendant believes this was a
    typographical error in the SSEB’s report as the Solicitation was issued on May 3, 2021, not on
    April 30, 2021. Frawner, Transcript of Oral Argument, dated March 17, 2022 (ECF No. 30) at
    55:9-14; see also AR 3291 (noting “DATE ISSUED 5/3/2021”).
    Next, for past projects deemed Relevant and Recent, the SSEB assigned a “general
    [Q]uality rating for each PPQ . . . based on the answers to the questions in the PPQ and narratives,
    if provided.” Tab 41 at 2566; see also Tab 41 at AR 2580 (
    ). The SSEB’s Quality ratings included: “Exceptional,” “Very Good,” “Not
    Received,” and “Not Rated.” See generally Tab 41. Neither the Solicitation nor the SSEB’s report
    provide separate definitions for these ratings. Id.; see also Tab 8.
    Finally, consistent with the definitions provided in the Solicitation, each offeror received
    an overall Past Performance rating based on the SSEB’s confidence in the offeror’s ability “to
    perform the work under this Mini-MACC program.” See, e.g., Tab 41 at AR 2647; see also Tab
    8 at AR 391 (Past Performance confidence rating definitions). The SSEB assigned “Substantial
    Confidence” ratings to those offerors “found to have performed work comparable to the scope,
    magnitude, and complexity associated with Mini-MACC task orders, and based upon evaluation
    of PPQs and CPARs [that] obtained Very Good to Exceptional [Quality] ratings.” Tab 41 at AR
    2649.
    iii. Tradeoff Analysis and Recommendation
    After concluding its Price and Past Performance review, the SSEB ranked offerors based
    on the “past performance confidence rating of the Offerors[’] ability to perform the work
    anticipated under the Mini-MACC contract and by the TEP from lowest to highest within each
    confidence rating.” Id. at AR 2648. The SSEB’s rankings, based on Price and Past Performance,
    are reflected in the following chart:11
    11 The SSEB did not provide a reason for shading cells within its chart differently. Tab 41 at AR
    2648.
    Id.
    The SSEB ranked offerors receiving “Substantial Confidence” Past Performance ratings
    above offerors with “Satisfactory Confidence” and “Neutral Confidence” Past Performance
    ratings, without regard to Price. Id. Offerors within each Past Performance confidence rating
    category were then ranked from lowest to highest Price. Id. The SSEB determined that offerors
    receiving “Substantial Confidence” Past Performance ratings “were all essentially of equal value
    to the [G]overnment” and that “there were no single or cumulative past performance records that
    demonstrated that an offeror’s performance history warranted an assessment of additional value
    amongst the other offers evaluated as ‘[S]ubstantial [C]onfidence.’”      Tab 41 at AR 2649.
    Accordingly, the SSEB determined that for offerors receiving a “Substantial Confidence” Past
    Performance rating, it need not trade off Price “for a higher performance confidence rating since
    the lowest priced offers with [a] [S]ubstantial [C]onfidence [rating] are recommended for award.”
    Id.
    Rather, the SSEB conducted tradeoff analyses only for offerors receiving “Satisfactory
    Confidence” or “Neutral Confidence” Past Performance ratings. See Tab 41 at AR 2649-51. Its
    tradeoff analysis among those offerors was nearly identical — a lower priced bid did not outweigh
    a higher rated Past Performance ranking. Id. For example, the SSEB determined that because
    Frawner received a “                        ” Past Performance rating and the Solicitation stated
    that “Past Performance was significantly more important than [P]rice,” Frawner should not receive
    one of the four Mini-MACC awards as its lower Price did not outweigh its                     Past
    Performance rating. Tab 41 at AR 2649-50. Rather, the SSEB recommended that Frawner “be in
    the on-ramp pool.” Id. at AR 2649. In sum, a bidder could not rise in ranking if it had a lower
    Price, but a weaker “Substantial Confidence” Past Performance rating than another offeror.
    As Ames 1 bid the third lowest price among offerors receiving a “                       ”
    rating, the SSEB recommended Ames 1 receive a Mini-MACC award. Id. at AR 2652. In addition
    to Ames 1, the SSEB also recommended the following offerors receive a Mini-MACC award: SD
    Construction, Ancor, and Nodak. Id. It further recommended the following offerors as on-ramp
    contractors: (1) Orion, (2) Tyonek, (3) Eklutna, (4) Ahtna, (5) Frawner, (6) Iyabak, (7) White
    Mountain, (8) HPM, and (9) Tikigaq. Id.
    B. The Source Selection Authority (SSA) Report and Decision
    In the second phase of the Air Force’s evaluation, the Source Selection Authority (SSA)
    reviewed the SSEB’s report along with “all available documents pertaining to the acquisition,
    including evaluation briefing slides, offeror proposals, consensus documentation, evaluation
    reports, price information, and other documentation.” Tab 44 at AR 2704. The SSA then made
    its award decisions on December 20, 2021, “after extensive review of the documentation and in
    consultation with the Source Selection Evaluation Board (SSEB), and . . . advisors.” Id. at AR
    2704, 2711. While it adopted many of the SSEB’s findings, the SSA’s conclusions differed from
    the SSEB report in significant respects.
    i.   Factor 1: Price
    In analyzing price reasonableness, the SSA adopted the SSEB’s conclusion of reasonable
    prices given project costs in the Anchorage market “vary, sometimes quite significantly, between
    offerors on competitive proposals.” Tab 41 at AR 2568. The SSA further determined that this
    price variance, influenced by supply and labor factors, was exacerbated by the “recent COVID-
    affected environment.” Id. The SSA also based its price reasonableness determination on
    competition among awardees because “future [task orders] will be competed amongst all the
    offerors and therefore no awardee with consistently high prices will ever receive any of those
    competed [task orders].” Id.
    ii.   Factor 2: Past Performance
    In analyzing Past Performance, the SSA accepted the SSEB’s Past Performance adjectival
    ratings and sub-ratings, but differed from the SSEB’s conclusion that offerors with the same overall
    Past Performance rating necessarily provide the Air Force with “essentially . . . equal value.”
    Tab 44 at AR 2710 (“I have reviewed the SSEB Report and agree with the rationales documented
    for the Confidence Ratings.”); Tab 41 at AR 2649. Instead, the SSA determined that “there are
    past performance records that demonstrate that some offeror[s’] performance history warrants an
    assessment of additional value amongst the other offers evaluated as ‘[S]ubstantial [C]onfidence.’”
    Tab 44 at AR 2706.
    iii.    Tradeoff Analysis and Award Decision
    The SSA came to a different award determination than recommended by the SSEB.
    Specifically, three bidders recommended by the SSEB for Mini-MACC awards did not receive an
    award following the SSA’s review. Compare Tab 41 at AR 2651-52 (SSEB’s recommendation),
    with Tab 44 at AR 2708-10 (SSA’s decision). For example, the SSEB ranked Ames 1 third and
    recommended it for an award, yet the SSA moved it to seventh place on the on-ramp. Compare
    Tab 41 at AR 2651-52, with Tab 44 at AR 2708-10. On December 20, 2021, the SSA awarded the
    IDIQ contracts to (1) SD Construction, (2) Tyonek, (3) Eklutna, and (4) Orion, the “four Offerors
    with the highest number of Very Relevant efforts in correlation with the highest number of
    Exceptional [PPQs] and CPARS.” Id. at AR 2707. All four awardees received “Substantial
    Confidence” ratings. Id. As the top placed offeror, SD Construction was also awarded the seed
    project. Id.
    The SSA stated that the “remaining nine (9) offerors that submitted timely proposal[s]”
    would be on-ramp contractors in the following order: (1) Nodak, (2) Ancor, (3) Ames 1, (4)
    Frawner, (5) Ahtna, (6) White Mountain, (7) Iyabak, (8) HPM, and (9) Tikigaq. Tab 44 at AR
    2708-09. The SSA listed the on-ramp awardees “in order of Confidence Rating[,] [and within a
    Confidence Rating category based on] the number of projects found to be Exceptional and those
    determined to be Very Good.” Id. at AR 2709.
    The SSA concurred with the SSEB’s Price analysis using offerors’ total evaluated price
    (TEP) for the seed project. Id. at AR 2706. However, the SSA noted that while TEPs are “useful
    as a guide to give the Government an indication of how each offeror would price this specific
    project, . . . it is not a reliable indicator of their prices for future requirements and is certainly not
    an indication of their prices relative to other offerors for that future work.” Id. at AR 2707.
    Accordingly, the SSA concluded that instead of relying only on Price in comparing offerors with
    the same Past Performance rating, the Air Force had to consider “evidence of quality in recent and
    relevant projects” (i.e., Quality ratings). Id.
    Having determined that offerors with the same Past Performance rating are not necessarily
    of the same value to the Government, the SSA disagreed with the SSEB’s decision to forgo a best
    value tradeoff analysis for offerors receiving a “Substantial Confidence” rating. Id. at AR 2706.
    Accordingly, the SSA performed a best value analysis for the four awardees. Id. at AR 2706-08.
    Each is described in turn.
    SD Construction.       SD Construction’s Price was $933,000.00 with a “Substantial
    Confidence” Past Performance rating. Id. at AR 2708. The SSA determined that SD Construction
    “ha[d] the offer that is most beneficial to the Government and is [thus] the first awardee listed.”
    Id. at AR 2707. Not only did SD Construction offer the lowest price at “33.6% below the mean
    of all TEPs,” but it also had the “highest [Q]uality rating[s] of any offeror (tied with Tyonek).” Id.
    Tyonek & Eklutna.       Tyonek’s Price was $1,874,000.00, while Eklutna’s Price was
    $1,995,080.00. Id. at AR 2708. Both bidders received “Substantial Confidence” Past Performance
    ratings. Id. The SSA noted that Tyonek’s TEP was “33.9% above the mean of all TEPs and
    100.9% higher than SD [Construction].” Id. at 2707. Eklutna had “the highest TEP of any offeror
    and [was] 42.0% above the mean.” Id. Accordingly, the SSA reasoned that Tyonek and Eklutna
    warranted the second and third awards, respectively, for the following reasons:
    First, they have higher quality ratings than the offerors identified below as going
    into the on-ramp pool. Second, any future [task orders] awarded under this Mini-
    MACC IDIQ will be competed amongst all awardees which will prevent any
    excessively high prices from being paid by the Government. Third, although these
    two companies have the highest TEPs of all considered offerors their TEPs are not
    outside the range of reason given the high variability in construction pricing and
    methods in this region. Finally, the need for well-qualified contractors able to
    provide quality work for the period of this Mini-MACC is worth more to the
    Government than the risk represented by higher TEPs for the seed project,
    especially when that seed project will be awarded to a different offeror.
    Tab 44 at AR 2707-08.
    Orion. For the fourth award, the SSA recognized that two offerors it considered — Orion
    and Nodak — “are very closely matched” as (i) each received three “Exceptional” Quality ratings
    and one “Very Good” Quality rating,
    Id. at AR 2708.
    Orion’s Price was $1,441,274.11, while Nodak’s Price was                        . Id. at AR 2708-09.
    Both bidders received “Substantial Confidence” Past Performance ratings. Id.
    , the SSA awarded the fourth contract to Orion because all four of its
    rated projects received “Very Relevant” sub-ratings, whereas Nodak had three projects receiving
    “Very Relevant” sub-ratings and one project receiving a “Relevant” sub-rating. Id. The SSA
    further rationalized this tradeoff by referencing the Solicitation’s preference for Past Performance
    over Price. Id. (“Since we stated that Past Performance is significantly more important than price
    I conclude that Orion offers the best value to the Government,                                       ,
    based on the slightly higher quality demonstrated by the projects submitted for Past Performance
    when relevancy is also considered.”).
    The following chart summarizes the SSEB’s and SSA’s overall findings:

    Bidder Name | SSEB Ranking Recommendation | SSA Ultimate Ranking | Factor #1: Price12 | Factor #2: Past Performance13
    SD Construction | 1 (Recommended awardee) | 1 (Awardee and seed project awardee) | $933,000.00 | Substantial Confidence: Exceptional (4), Very Good (1)
    Tyonek | 6 | 2 (Awardee) | $1,874,000.00 | Substantial Confidence: Exceptional (4), Very Good (1)
    Eklutna | 7 | 3 (Awardee) | $1,995,080.00 | Substantial Confidence: Exceptional (3), Very Good (2)
    Orion | 5 | 4 (Awardee) | $1,441,274.11 | Substantial Confidence: Exceptional (3), Very Good (1), Not Assessed (1)
    Nodak | 4 (Recommended awardee) | 5 (On-Ramp) |  | Substantial Confidence: Exceptional (3), Very Good (1), Not Considered (1)
    Ancor | 2 (Recommended awardee) | 6 (On-Ramp) |  | Substantial Confidence: Exceptional (2), Very Good (2)
    Ames 1 | 3 (Recommended awardee) | 7 (On-Ramp) |  | Substantial Confidence: Exceptional (1), Very Good (2), Not Received (1), Not Assessed (1)
    Frawner | 9 | 8 (On-Ramp) |  | Satisfactory Confidence: Exceptional (1), Very Good (2), Not Assessed (2)
    Ahtna Global | 8 | 9 (On-Ramp) |  | Satisfactory Confidence: Exceptional (1), Very Good (4)
    White Mountain Construction | 11 | 10 (On-Ramp) |  | Satisfactory Confidence: Exceptional (1), Not Assessed (3), Not Considered (1)
    Iyabak | 10 | 11 (On-Ramp) |  | Satisfactory Confidence: Very Good (2), Not Assessed (2), Not Received (1)
    HPM Inc. | 12 | 12 (On-Ramp) |  | Neutral Confidence: Not Considered (3)
    Tikigaq Federal Services | 13 | 13 (On-Ramp) |  | Neutral Confidence: Very Good (1), Not Assessed (1), Not Considered (3)
    See Tab 41; Tab 44.

    12 The prices referenced reflect each bidder’s total evaluated price.
    13 Bidders’ overall Past Performance ratings are listed in the chart with each bidder’s individual Quality ratings following. A project received a “Not Assessed” rating if it was not assigned a Quality rating, a “Not Considered” rating if it failed to meet the Solicitation’s technical requirements, and a “Not Received” rating if a bidder failed to submit all requisite information for a past project. Tab 41 at AR 2566 (“Not Assessed”); Tab 41 at AR 2594, 2612, 2629, 2634 (“Not Considered”); Tab 41 at 2581, 2640 (“Not Received”).
    III.    Ames 1 Debriefing
    Although the SSEB recommended Ames 1 as an awardee, the SSA — and ultimately
    the Air Force — ranked Ames 1 as the seventh best bid, and on December 17, 2021, notified
    Ames 1 that it was not selected for a Mini-MACC award. Tab 44 at AR 2708-10; Tab 43 (Pre-
    Award Notice to Unsuccessful Offeror (December 17, 2021)) at AR 2688-89. The Air Force
    issued a formal Post-Award Notification to Ames 1 on December 28, 2021. Tab 49 (Post-Award
    Notice of Unsuccessful Offeror (December 28, 2021)) at AR 2892-95. While the Air Force
    informed Ames 1 that it was not selected for a Mini-MACC award, the Air Force also noted that
    Ames 1 would be placed in the on-ramp reserve vendor pool. Id. at AR 2893. Consistent with the
    Air Force’s debriefing protocols, Ames 1 submitted questions to the Air Force on January 3, 2022,
    and the Air Force timely responded on January 5, 2022. Tab 60 (Post-Debriefing Questions /
    Response – AMES 1 (January 5, 2022)) at AR 4344-46.
    IV.     Procedural History
    On January 7, 2022, Ames 1 timely filed a bid protest at the GAO. Tab 66 (Protest of
    AMES 1, LLC (January 7, 2022)) at AR 4411-12. The GAO dismissed Ames 1’s protest on
    February 4, 2022, due to a pending protest at the United States Court of Federal Claims concerning
    the same Solicitation by another disappointed bidder, Frawner Corporation. Compl. ¶¶ 3-4; see
    also Frawner Corporation v. United States, No. 22-cv-0078.
    On February 22, 2022, Ames 1 filed the present protest. See Compl. On March 11, 2022,
    Ames 1 filed its Motion for Judgment on the Administrative Record, and on March 22, 2022,
    Defendant filed its Cross-Motion for Judgment on the Administrative Record. See Pl.’s MJAR;
    Def.’s Cross-MJAR. In the related case, Frawner Corporation v. United States, No. 22-cv-0078,
    Defendant consented to a voluntary stay of its award through March 31, 2022, also reflected in a
    Joint Status Report filed by the parties to this action. See March 1, 2022, Joint Status Report (ECF
    No. 11) (JSR) at 1. On March 30, 2022, this Court conducted oral argument on the pending
    motions at the United States District Court for the District of Alaska in Anchorage, Alaska. See
    Transcript of Oral Argument, dated March 30, 2022 (ECF No. 20) (Tr. Oral Arg.).
    Due to the “fast approaching start to the Alaskan construction season and the expiry of
    [D]efendant’s voluntary stay,” on March 31, 2022, the Court issued a decision on the record in the
    current action and in Frawner. Transcript of Joint Status Conference dated March 31, 2022 (ECF
    No. 23) (Ames Mar. 31 Tr.) at 3:8-10; JSR at 2; Frawner, Transcript of Joint Status Conference
    dated March 31, 2022 (ECF No. 35) (Frawner Mar. 31 Tr.). In Frawner, the Court enjoined the
    Air Force from awarding or proceeding with any award under the Solicitation, other than to SD
    Construction, and ordered the Air Force to undertake corrective action should it opt to continue
    with Mini-MACC awards under the Solicitation.14 Frawner Mar. 31 Tr. at 3:24-4:12, 13:17-15:18;
    Frawner Mem. and Order; Frawner, Order Granting in Part Plaintiff’s MJAR and Denying in Part
    Defendant’s Cross-MJAR (ECF No. 33) (Frawner Order). This Court’s injunction in Frawner
    mooted Ames 1’s MJAR. Ames Mar. 31 Tr. at 4:9-14. On March 31, 2022, on consent of the
    parties, this Court provided a ruling on the record denying both Plaintiff’s Motion for Judgment
    on the Administrative Record and Defendant’s Cross-Motion for Judgment on the Administrative
    14 Familiarity with this Court’s decision in Frawner, as reflected in its March 31, 2022 decision on
    the record, March 31, 2022 Order, and its July 29, 2022 Memorandum and Order is presumed.
    Frawner Mar. 31 Tr.; Frawner, Order Granting in Part Plaintiff’s MJAR and Denying in Part
    Defendant’s Cross-MJAR (ECF No. 33) (Frawner Order); Frawner Mem. and Order.
    Record as moot. Transcript of Status Conference dated March 2, 2022 (ECF No. 25) at 8:4-9:22
    (noting no objections on behalf of parties to Court’s intention of first providing oral ruling followed
    by a later written opinion and judgment); Ames Mar. 31 Tr.
    APPLICABLE LEGAL STANDARD
    This Court reviews post-award bid protests in two steps. First, the Court analyzes the procurement under the Administrative Procedure Act (APA). 28 U.S.C. § 1491(b)(4); Harmonia Holdings Grp., LLC v. United States, 20 F.4th 759, 766 (Fed. Cir. 2021). Second, the Court must analyze whether the alleged errors prejudiced the protestor. See DynCorp Int’l, LLC v. United States, 10 F.4th 1300, 1308 (Fed. Cir. 2021).
    Turning to the first step, the APA requires a reviewing court to determine “whether the agency’s actions were ‘arbitrary, capricious, an abuse of discretion, or otherwise not in accordance with law.’” Off. Design Grp. v. United States, 951 F.3d 1366, 1371 (Fed. Cir. 2020) (quoting Glenn Def. Marine (ASIA), PTE Ltd. v. United States, 720 F.3d 901, 907 (Fed. Cir. 2013)); see 5 U.S.C. § 706. Although the inquiry under the APA “is to be searching and careful, the ultimate standard of review is a narrow one. The court is not empowered to substitute its judgment for that of the agency.” Citizens to Pres. Overton Park, Inc. v. Volpe, 401 U.S. 402, 416-20 (1971). Accordingly, courts may set aside an award only if (1) “‘the procurement official’s decision lacked a rational basis[,]’ or (2) ‘the procurement procedure involved a violation of regulation or procedure.’” DynCorp, 10 F.4th at 1308 (quoting WellPoint Mil. Care Corp. v. United States, 953 F.3d 1373, 1377 (Fed. Cir. 2020)).
    When a protestor alleges the agency’s decision lacked a rational basis, the court reviews “whether the contracting agency provided a coherent and reasonable explanation of its exercise of discretion.” Dell Fed. Sys., L.P. v. United States, 906 F.3d 982, 992 (Fed. Cir. 2018) (quotations and citations omitted). As the United States Court of Appeals for the Federal Circuit has explained, “the disappointed bidder bears a heavy burden of showing that the award decision had no rational basis.” Centech Grp., Inc. v. United States, 554 F.3d 1029, 1037 (Fed. Cir. 2009) (quotations and citations omitted). Indeed, agency decisions are “entitled to a presumption of regularity.” Impresa Construzioni Geom. Domenico Garufi v. United States, 238 F.3d 1324, 1338 (Fed. Cir. 2001). Protestors bear a similar burden when alleging that the procurement involved legal or procedural violations, as the court reviews such claims for “a clear . . . violation of applicable statutes or regulations.” Id. at 1333 (quoting Kentron Hawaii, Ltd. v. Warner, 480 F.2d 1166, 1169 (D.C. Cir. 1973)).
    At the second step, regardless of whether the alleged error relates to irrational conduct or a violation of law, the protestor must establish that the agency’s conduct prejudiced the protestor. Sys. Studs. & Simulations, Inc. v. United States, 22 F.4th 994, 997 (Fed. Cir. 2021). This is a factual question for which the protestor must show “that there was a ‘substantial chance’ it would have received the contract award but for” the alleged error. Bannum, Inc. v. United States, 404 F.3d 1346, 1353 (Fed. Cir. 2005) (citations omitted). De minimis errors in the procurement process generally do not justify relief. Off. Design Grp., 951 F.3d at 1374.
    If a protestor meets its burden of demonstrating that the procurement both violated the APA
    and prejudiced the protestor, declaratory or injunctive relief may be appropriate. See 
28 U.S.C. § 1491(b)(2). However, successful protestors are not automatically entitled to an injunction. See Centech, 554 F.3d at 1037. Before entering injunctive relief, “the court must consider whether (1)
    the plaintiff has succeeded on the merits, (2) the plaintiff will suffer irreparable harm if the court
    withholds injunctive relief, (3) the balance of hardships to the respective parties favors the grant
    of injunctive relief, and (4) the public interest is served by a grant of injunctive relief.” 
    Id.
    The Rules of the United States Court of Federal Claims provide the equivalent of an
    expedited trial on a “paper record, allowing fact-finding by the trial court.” Bannum, 
404 F.3d at 1356. Parties initiate such proceedings by filing motions for judgment on the administrative
    record. See Rule 52.1(c). In adjudicating cross motions under Rule 52.1, this court resolves
    questions of fact by relying on the administrative record. See 
id. If necessary, this court may
    remand the case back to a governmental agency for further factual findings. See Rule 52.2.
    DISCUSSION
    While it is not this Court’s role to “substitute its judgment for that of the agency,” it is
    within this Court’s purview to determine whether an agency acted irrationally, violated U.S.
    procurement law, or acted in conflict with the terms of the Solicitation. Citizens to Preserve
    Overton Park, Inc. v. Volpe, 
401 U.S. 402, 416-20 (1971); see Banknote Corp. of Am. v. United States, 365 F.3d 1345, 1351-53 (Fed. Cir. 2004). Plaintiff argues that relief is appropriate because
    the Air Force (1) irrationally evaluated its Past Performance, and (2) arbitrarily evaluated Price.
    See Pl.’s MJAR at 19-33. Ames 1 seeks a permanent injunction barring the Air Force from
    proceeding with the Mini-MACC awards to Eklutna, Tyonek, and Orion. Id. at 33-37; Compl. at
    17; see also infra n.22.
As noted, on March 31, 2022, this Court in Frawner enjoined Defendant from proceeding
    with any awards under the same Solicitation at issue in this action, except for the award to SD
    Construction, or upon taking corrective action consistent with the Court’s direction. See Frawner
    Order; Frawner Mem. and Order. This Court held that the Air Force’s Mini-MACC awards and
    on-ramp rankings, other than its award to SD Construction, involved an arbitrary and capricious
    Past Performance evaluation and best value tradeoff determination. Id.; see also Frawner Mar. 31
    Tr. at 3:24-4:2; Frawner Mem. and Order at 29-44, 65-74. Further, this Court held in Frawner
    that should Defendant opt to continue with Mini-MACC awards under the Solicitation, other than
    to SD Construction, the Air Force must take corrective action consistent with the Court’s direction.
    See Ames Mar. 31 Tr. at 11:25-14:25; Frawner Mar. 31 Tr. at 13:17-15:18; Frawner Mem. and
    Order at 72-74.
    The Court’s injunction in Frawner mooted the current protest as Ames 1 seeks the same
    relief as was already granted by this Court in Frawner — namely an injunction barring the Air
    Force from proceeding with its Mini-MACC awards. See Mitchco Int'l, Inc. v. United States, 
26 F.4th 1373, 1378 (Fed. Cir. 2022) (“[A] case is moot when the issues presented are no longer ‘live’
    or the parties lack a legally cognizable interest in the outcome.”). As the Court’s ruling in Frawner
    moots Plaintiff’s Motion for Judgment on the Administrative Record (ECF No. 15) and
    Defendant’s Cross-Motion for Judgment on the Administrative Record (ECF No. 16), both
    motions are DENIED as moot.
    Although the pending motions are moot, for completeness this Court addresses the parties’
    arguments on the merits. This Court then explains the basis for its injunction in Frawner of the
    same awards under the same Solicitation present in this action. As noted, Ames 1 argues that the
    Air Force’s analysis and award decisions are arbitrary and capricious because the Air Force (1)
    irrationally evaluated its Past Performance, and (2) arbitrarily evaluated Price. See Pl.’s MJAR at
    19-33. For the following reasons, this Court finds Plaintiff’s Past Performance arguments lack
    merit, but finds one of Plaintiff’s Price arguments compelling.
    I.    Past Performance
    Plaintiff argues that the Air Force’s Past Performance evaluation was irrational because the
    SSA (1) arbitrarily and capriciously rejected the SSEB’s Past Performance recommendation, (2)
    failed to “properly document” its rationale for rejecting the SSEB’s recommendation, and (3)
    improperly looked beyond offerors’ overall Past Performance ranking “in contravention to the
    evaluation criteria that states a ‘single’ past performance rating would be assigned.” Pl.’s MJAR
    at 22-26 (quoting Tab 8 (Solicitation No. FA500021R0001 (May 3, 2021)) at AR 391). This Court
    disagrees with Plaintiff’s contentions. Instead of raising a colorable claim, Plaintiff appears to
    predicate its argument on a preference for the SSEB’s recommendation, which placed it as an
    awardee, over the SSA’s decision, which instead placed it on the on-ramp. See Tr. Oral Arg. at
    11:14-17 (acknowledging on behalf of Plaintiff that Ames 1 preferred “the SSEB’s
    recommendation because [it was ranked] third” versus the SSA’s decision to rank Ames 1
    seventh); compare Tab 41 (Source Selection Evaluation Board Report (December 8, 2021)) at AR
    2651-52 (SSEB ranking Ames 1 third), with Tab 44 (Source Selection Decision (December 20,
    2021)) at AR 2708-11 (SSA ranking Ames 1 seventh). However, a party’s preference for one
    rating over another is insufficient to render the SSA’s analysis irrational. See Active Network, LLC
    v. United States, 
130 Fed. Cl. 421, 432 (2017), aff’d, 718 F. App’x 981 (Fed. Cir. 2018) (holding
    mere disagreements with an agency’s past performance evaluation fail to establish that the agency
    acted unreasonably); see also Newimar S.A. v. United States, No. 21-CV-1897, 
2022 WL 1592813, at *30 (Fed. Cl. May 12, 2022) (in price reasonableness context, “mere disagreement with the
    [agency’s] conclusion . . . cannot sustain a protest”). Accordingly, as explained below, in
    exercising its limited APA review, this Court will not disturb the considered judgment of the SSA
    on this basis.
    A. The SSA Rationally Analyzed the SSEB’s Past Performance Recommendation
    Plaintiff first contends that the SSA arbitrarily and capriciously rejected the SSEB’s
    recommendation on how to evaluate offerors with the same overall Past Performance rating. Pl.’s
    MJAR at 23-24. While the SSEB determined that the seven offerors receiving “Substantial
    Confidence” Past Performance ratings “were all essentially of equal value to the [G]overnment,”
and thus no comparison of their underlying Quality or Relevancy ratings was necessary, the SSA
    disagreed. Compare Tab 41 at AR 2649 (SSEB), with Tab 44 at AR 2706 (SSA). Instead, the
    SSA determined that “there are past performance records that demonstrate that some offeror[s’]
    performance history warrants an assessment of additional value amongst the other offers evaluated
as ‘[S]ubstantial [C]onfidence.’” Tab 44 at AR 2706. Plaintiff’s disagreement with the SSA’s
    findings fails to establish that the SSA improperly discounted the SSEB’s Past Performance
    analysis.
    This Solicitation is conducted under FAR “Part 15, Department of Defense (DoD) FAR
    Supplement Procedures, Guidance and Information Subpart 215.3, and Air Force FAR Supplement
    (AFFARS) Mandatory Procedure (MP) 5315.3.” Tab 8 at AR 317. Under FAR 15.308, “[w]hile
    the SSA may use reports and analyses prepared by others, the source selection decision shall
    represent the SSA’s independent judgment.” Similarly, DFARS Procedures, Guidance, and
    Information (PGI) provides that “[t]he SSA is not bound by the evaluation findings of the SSEB
    . . . as long as the SSA has a rational basis for the differing opinion.” DFARS/PGI 215.3 ¶ 3.9.1;
    see also DFARS 215.300 (“Contracting officers shall follow the principles and procedures in
    Director, Defense Procurement and Acquisition Policy memorandum dated April 1, 2016, entitled
    ‘Department of Defense Source Selection Procedures,’ when conducting negotiated, competitive
    acquisitions utilizing FAR part 15 procedures.”). Thus, under governing regulations, the SSA not
    only had the discretion to deviate from the SSEB’s recommendation, but also had an obligation to
    issue a decision based on its “independent judgment.” FAR 15.308.
    Plaintiff does not contest the SSEB’s advisory role in the procurement process, nor does it
    argue that the SSA had to “rubber stamp” the SSEB’s findings. See Tr. Oral Arg. at 11:18-20
    (“Source Selection Authorities, they must exercise their discretion. They can’t rubber stamp an
    SSEB.    And we don’t take issue with that.”).         Rather, Plaintiff contends that the SSA’s
    methodology was irrational as it allegedly disregarded “the extensive narrative evaluation of each
    individual offeror’s past performance documented throughout the [SSEB’s] report” and merely
    examined “the highest number of Very Relevant efforts in correlation with the highest number of
    Exceptional PPQ[s] and CPARS assigned to each offeror.” Pl.’s MJAR at 25 (citation omitted).
    This contention cannot be squared with the Administrative Record. In its decision, the SSA stated
    that it considered the entirety of the SSEB’s report, which included the SSEB’s narrative
    comments:
    I was given complete access to all available documents pertaining to the
    acquisition, including evaluation briefing slides, offeror proposals, consensus
    documentation, evaluation reports, price information, and other documentation to
    support my decision. As the Source Selection Authority (SSA), after extensive
    review of the documentation and in consultation with the Source Selection
    Evaluation Board (SSEB), and my advisors, I have determined that the proposals
    submitted by the following offerors offer[] the best overall value to satisfy the Air
    Force’s stated requirements for the JBER Mini-MACC IDIQ.
    Tab 44 at AR 2704 (emphases added). Plaintiff fails to point to a portion of the SSA’s decision
    indicating that it ignored or otherwise discounted the SSEB’s “narrative comments.” See Pl.’s
    MJAR at 22-26.
    The SSA’s disagreement with the SSEB’s conclusion that offerors with “Substantial
    Confidence” Past Performance ratings are of the same “value to the [G]overnment” is unrelated to
    whether the SSA properly considered the SSEB’s narrative comments. Compare Tab 41 at AR
    2649 (SSEB analysis), with Tab 44 at AR 2706 (SSA analysis). The Administrative Record
    reflects that the SSA fully considered the SSEB’s report. Tab 44 at AR 2704. In its “independent
    judgment,” the SSA considered the SSEB’s approach to Past Performance and determined that a
    different mode of analysis was more appropriate. FAR 15.308; see Tab 41.
    The SSA’s decision to depart from the SSEB’s recommendation was squarely within the
    SSA’s discretion and is certainly rational — differences in offerors’ “quality control, timely
performance, effectiveness of management, and regulatory compliance” that underlie the Quality
    ratings could affect their “value to the Government.” Tab 8 at AR 390; FAR 15.308; DFARS PGI
    215.3 ¶ 3.9.1. For example, while two offerors may both have received “Substantial Confidence”
    Past Performance ratings, one may have received slightly higher underlying Quality ratings due to
    its comparatively better “timely performance” in the past. It would be rational for the SSA to take
    slightly superior “timely performance” into account when considering the best value to the
    Government. See Glenn Def. Marine (ASIA), PTE Ltd. v. United States, 
720 F.3d 901, 909 (Fed. Cir. 2013) (finding rational agency’s decision to consider sub-factor ratings and narrative
    comments, rather than just considering overall ratings). That is exactly what the SSA did here by
    examining offerors with the same Past Performance rating and reviewing their underlying Quality
    ratings to differentiate between bids. See Tab 44. As the SSA acted rationally and within its
    discretion after “extensive” review of the SSEB report and other documents, Plaintiff fails to
    establish that the SSA arbitrarily and capriciously discounted the SSEB’s findings. Tab 44 at AR
    2704.
    B. The SSA Properly Documented Its Decision
    Plaintiff next contends that the SSA failed to “properly document” its departure from the
    SSEB’s analysis. Pl.’s MJAR at 25-26 (citing FAR 15.308). This claim also lacks merit. FAR
    15.308 requires the SSA to “document” its decision including “the rationale for any business
    judgments and tradeoffs made or relied on by the SSA, including benefits associated with
    additional costs.” The SSA did so here in explaining that “[c]ontrary to the findings of the SSEB
    . . . there [were] past performance records that demonstrate[d] that some offeror[s’] performance
    history warrant[ed] an assessment of additional value amongst the other offers evaluated as
    ‘[S]ubstantial [C]onfidence.’” Tab 44 at AR 2706. Plaintiff argues that “just one sentence” is
    insufficient to document the SSA’s departure from the SSEB’s analysis. Pl.’s Reply at 15. Yet,
    Plaintiff offers no binding jurisprudence to support its position, and this Court is unaware of any
    statutory or regulatory length requirement for proper documentation. See id.; Pl.’s MJAR at 25-
26. Substantively, the SSA’s one-sentence rationale is sufficient to explain its position.
    Accordingly, this Court finds that the SSA satisfied its documentation requirement with respect to
    Past Performance by noting that offerors receiving the same Past Performance rating did not
    necessarily provide the same “value to the [G]overnment.”15 Tab 44 at AR 2706; see also Green
    Tech. Grp., LLC v. United States, 
147 Fed. Cl. 231, 244 (2020) (holding agency satisfied its
    documentation requirement by stating its conclusions “as to [bidder’s] strengths, weaknesses and
    adjectival ratings”).
    C. The SSA Appropriately Looked Beyond Overall Past Performance Adjectival
    Ratings
    Finally, Plaintiff contends that by analyzing differences between offerors receiving the
    same “Substantial Confidence” Past Performance rating, the SSA impermissibly added an
    undisclosed sub-ranking criterion to its Past Performance evaluation.            Pl.’s MJAR at 25.
    Specifically, Plaintiff argues that by looking past the overall Past Performance rating — to Quality
    15
    The sufficiency of the SSA’s tradeoff analysis is discussed infra at Discussion Section III(B).
    ratings, for example — in the final stage of its analysis when comparing Past Performance to Price,
    the SSA contravened the Solicitation’s requirement that a “single” Past Performance rating be
    assigned. 
Id. (quoting Tab 8 at AR 391). This Court disagrees. Notwithstanding the contradictory
    nature of Plaintiff’s Past Performance argument — on the one hand arguing that the SSA “did not
    look deep enough into the SSEB’s report,” while on the other hand asserting that the SSA “dug
    too deep” into the SSEB’s underlying Quality ratings — Plaintiff’s contention that the SSA was
    barred from looking beyond the overall Past Performance adjectival ratings fails as a matter of
    law. Def.’s Cross-MJAR at 30.
    While the Solicitation indicates that “each offeror will be assigned a single [P]ast
    [P]erformance confidence rating,” Plaintiff’s argument ignores an agency’s duty to look beyond
    adjectival ratings in making value judgments. Tab 8 at AR 391; see Glenn Def. Marine, 720 F.3d
at 909 (finding rational agency’s decision to look beyond overall rating); Femme Comp Inc. v. United States, 83 Fed. Cl. 704, 758 (2008) (“Looking beyond the adjectival ratings is necessary because
    proposals with the same adjectival rating are not necessarily of equal quality.”) (internal quotations
    and citation omitted). An agency’s review beyond the overall Past Performance adjectival ratings
    ensures that the agency can “determine which proposal represents the best value for the
    government.” E.W. Bliss Co. v. United States, 
77 F.3d 445, 449 (Fed. Cir. 1996).
    Here, the Solicitation obligated the SSA to make such value judgments between offerors’
    “price and the Past Performance Confidence Rating[s]” if the SSA awarded the contract to a bidder
    other than one with the lowest price. Tab 8 at AR 389. As the SSA awarded three of the four
    contracts to bidders that did not have the lowest price, the Air Force was required to conduct a best
    value tradeoff analysis.16 Id.; see also Tab 41 at AR 2566 (second place contract awardee Tyonek
    price of $1,874,000.00) (third place contract awardee Eklutna price of $1,995,080.00) (fourth place
    contract awardee Orion price of $1,441,274.11). In conducting its tradeoff analysis, the SSA
    permissibly examined Quality — a sub-factor of Past Performance assigned by the SSEB — to
    assist the SSA in comparing offerors with identical Past Performance ratings. Tab 8 at AR 389-
    90; see also Tab 41; Tab 44. Plaintiff presents no authority that bars the Air Force from looking
    past the Past Performance adjectival rating and, as discussed, relevant authority dictates that the
    agency should look past those ratings. Accordingly, the Air Force acted rationally in looking
beneath the veil of the overall Past Performance rating to the Quality sub-rating before comparing those
    ratings to Price. Thus, had this Court ruled on the merits of this protest, Ames 1’s Past Performance
    argument would have failed.
    II.   Price
    Plaintiff next argues that the Air Force’s Price analysis was arbitrary, capricious, and
    unlawful because the agency failed to conduct (1) “the required price reasonableness analysis” and
    (2) “the required tradeoff analysis and award based on best value.” Pl.’s MJAR at 19-22, 26-31.
    Defendant responds that the agency properly considered Price in both its price reasonableness and
    best value tradeoff analyses. Def.’s Cross-MJAR at 19-27, 31-40. As Ames 1’s Price arguments
    mirror those raised by Frawner in its protest, this Court would have held — just as it held in
    Frawner — that the Air Force’s price reasonableness analysis was proper, while its best value
    tradeoff analysis was arbitrary and capricious. Frawner Mar. 31 Tr. at 3:24-4:2, 8:3-5; Frawner
    Mem. and Order at 60. Ames 1’s price reasonableness argument is addressed below. Since this
    16
    In fact, the Air Force awarded two of the four contracts to the highest priced bidders — Tyonek
    and Eklutna. Tab 44 at AR 2708.
    Court in Frawner held that the Air Force’s best value tradeoff analysis was arbitrary and
    capricious, the Court does not separately address Ames 1’s best value tradeoff argument. Frawner
    Mar. 31 Tr. at 8:24-12:2; Frawner Mem. and Order at 60. Rather, the Court will detail the flaws
    in the Air Force’s best value tradeoff analysis by describing the basis of the injunction entered in
    Frawner. See infra Discussion Section III(B).
    A. The Air Force Properly Evaluated Price Reasonableness
    Plaintiff contends that the Air Force did not conduct the “required” price reasonableness
    evaluation under FAR 15.404-1(a)(1)17 because “the SSA selected the two highest price offerors
    and the lowest price offeror for the award pool.” Pl.’s MJAR at 19-22 (referencing Tyonek and
    Eklutna as highest priced offerors and SD Construction as lowest priced offeror); see also FAR
    14.408-2(a) (“The contracting officer shall determine that . . . the prices offered are reasonable
    before awarding the contract. The price analysis techniques in 15.404-1(b) may be used as
    guidelines. In each case the determination shall be made in the light of all prevailing
    circumstances.”). More broadly, Plaintiff argues that the Air Force’s price reasonableness analysis
    ignored Price entirely, contravening the Solicitation, which required consideration of both Past
    Performance and Price. Pl.’s MJAR at 20-22. Specifically, Plaintiff alleges that the Air Force
    abandoned its “stated evaluation methodology” for determining whether all prices were
    reasonable. Id.; Tab 8 at AR 389. This Court disagrees. As explained below, the Air Force (1)
    compared offerors’ prices by applying a method endorsed by the FAR, and (2) rationally concluded
    17
    “The contracting officer is responsible for evaluating the reasonableness of the offered prices.
    The analytical techniques and procedures described in this subsection may be used, singly or in
    combination with others, to ensure that the final price is fair and reasonable. The complexity and
    circumstances of each acquisition should determine the level of detail of the analysis required.”
    based on its evaluation and expertise that the offerors’ prices were reasonable. Accordingly, the
    Air Force satisfied its obligation to analyze price reasonableness.
    i. The Air Force Properly Applied the Evaluative Technique It Selected to
    Analyze Price Reasonableness
    At the outset, Plaintiff “does not take issue with the SSEB’s use of techniques outlined in
    FAR 15.404-1(b)(2)(i) to assess fairness and reasonableness of proposed pricing,” nor does it
    challenge the “SSEB’s calculation and use of mean price to further analyze and establish price
    reasonableness and fairness.” Pl.’s Reply at 5. Nor could it, as the FAR affords the agency the
    discretion to choose its test for conducting a price reasonableness analysis. FAR 15.404-1(b)(2)
    (“The Government may use various price analysis techniques and procedures to ensure a fair and
    reasonable price.”). One of the “preferred” methods is a “[c]omparison of proposed prices received
    in response to the [S]olicitation.”    FAR 15.404-1(b)(2)(i); FAR 15.404-1(b)(3) (identifying
    preferred price reasonableness techniques). Here, the Air Force satisfied its requirement under the
    FAR by calculating offerors’ TEPs, averaging them, and then comparing the individual prices to
    the mean. See Tab 41 at AR 2566; Tab 44 at AR 2706-07 (describing SSEB’s price reasonableness
    analysis and adopting its finding of reasonable prices). This analysis certainly meets the FAR
    endorsed test of comparing “prices received in response to the [S]olicitation.” FAR 15.404-
    1(b)(2)(i); see also Tab 41 at AR 2566.
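For illustration only, the mean-comparison technique described above can be sketched in a few lines of Python. This is a hypothetical example, not the agency’s actual evaluation tool; the TEP figures are the four awardees’ prices recited elsewhere in this opinion, and the mean shown reflects only those four offers rather than the full field of offerors the SSEB actually averaged.
        # Illustrative sketch of a FAR 15.404-1(b)(2)(i)-style comparison:
        # average the offered TEPs, then measure each offer's deviation from
        # that mean. Offeror names and prices are taken from this opinion;
        # the actual evaluation averaged all offerors' TEPs, not just these four.
        from statistics import mean

        teps = {
            "SD Construction": 933_000.00,
            "Orion": 1_441_274.11,
            "Tyonek": 1_874_000.00,
            "Eklutna": 1_995_080.00,
        }

        avg = mean(teps.values())
        for offeror, tep in sorted(teps.items(), key=lambda item: item[1]):
            pct = (tep - avg) / avg * 100  # percent above (+) or below (-) the mean
            print(f"{offeror:16s}  TEP ${tep:>12,.2f}  {pct:+6.1f}% vs. mean ${avg:,.2f}")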
    ii. The Air Force Rationally Determined That There Was Price Competition
Rather than critique the price reasonableness evaluation technique the agency employed,
    Plaintiff instead argues that the Air Force irrationally concluded that the offerors’ prices were
reasonable. Pl.’s MJAR at 20-21. Specifically, Plaintiff contends that the Air Force’s conclusion
    of price reasonableness cannot be squared with its finding that offerors’ TEPs are “[un]reliable
    indicator[s] of their prices for future requirements.” 
Id. (quoting Tab 44 at AR 2707). This Court
    disagrees. Ames 1’s complaints “are nothing more than mere disagreement with the Agency’s
    reasonable exercise of its considerable discretion.” Vertex Aerospace, LLC v. United States, No.
20-700C, 2020 WL 5887750, at *10 (Fed. Cl. Sept. 21, 2020).
    While “[n]ormally, adequate price competition establishes a fair and reasonable price,”
    agencies perform price reasonableness analyses to “prevent the Government from paying too high
a price” for the procurement. FAR 15.404-1(b)(2)(i); see IAP World Servs., Inc. v. United States, 152 Fed. Cl. 384, 406 (2021) (citation omitted). The agency is best suited to assess whether the
    bidders’ prices are reasonable. Serco Inc. v. United States, 
81 Fed. Cl. 463, 495 (2008) (“[T]he
    depth of an agency’s price reasonableness analysis and its ultimate findings on that count are both
    matters of discretion.”). The Court need only assess whether the agency acted rationally in making
    its price reasonableness conclusion. Moore’s Cafeteria Servs. v. United States, 
77 Fed. Cl. 180, 187 (2007), aff’d, 314 F. App’x 277 (Fed. Cir. 2008) (applying a rational basis standard to agency’s
    price reasonableness analysis).    This Court finds that the Air Force’s price reasonableness
    conclusion was rational.
    First, the Air Force determined that the particularities of Anchorage construction pricing
    — which it notes have been exacerbated by COVID-19 sourcing issues — adequately explained
    the wide variation in proposal prices. See Tab 41 at AR 2568; Tab 44 at AR 2707. The agency
    based that finding on its years of experience and its observation that while some contractors have
    their own workforce, others do not, and while some own their own equipment, others will have to
    lease equipment — all issues especially relevant to the unpredictable Alaska construction market.
    See Tab 41 at AR 2568; Tab 44 at AR 2707. An agency is permitted to use its knowledge and
    expertise during the procurement process, including in its evaluation of price reasonableness.
    DynCorp Int’l LLC v. United States, 
139 Fed. Cl. 481, 487 (2018) (deferring to “an agency’s
    expertise in making procurement decisions”); see also Oral Arg. Tr. 35:21-25 (acknowledging on
    behalf of Plaintiff that there is a “certain amount of discretion that the Government gets in a FAR
    Part 15 negotiated procurement” in which the agency is “supposed to be using [its] expertise”).
    The Air Force is familiar with the Anchorage market based on previous contracts it has entered
with third parties for services at JBER. See, e.g., https://www.asrcfederal.com/asrc-federal-subsidiary-awarded-the-united-states-air-force-joint-base-elmendorf-richardson-base-operations-maintenance-services-contract/ (last viewed June 21, 2022). Thus, the agency acted reasonably in
    applying its knowledge of that market to the bids at issue in this procurement.
Second, contrary to Plaintiff’s contention that the Air Force completely disregarded Price due
    to its finding that TEPs were an “[un]reliable indicator of . . . prices for future requirements,” this
    Court finds that the agency rationally cabined the relevancy of pricing for the seed project as it
    may apply to future unknown task orders. Tab 44 at AR 2707. While the Air Force recognized
    that “[t]he TEP is useful as a guide to give the Government an indication of how each offeror
    would price this specific project,” it also acknowledged its limited applicability in an IDIQ
    procurement. 
Id. For example, the Solicitation noted that future task orders will vary from
    “facility upgrades, renovations, painting, utility work, airfield pavement, roads, roofs and other
    assorted repair and alteration projects” and tasks will vary in trade between “carpentry, asbestos
    abatement/removal, demolition, mechanical, electrical, plumbing, concrete masonry, welding, fire
    systems, and paving.” Tab 8 at AR 395. These future task order projects will likely differ from
the seed project as quantified by the TEP, and thus the Air Force rationally recognized that these future projects may cost more or less than the seed project. Tab 44 at AR 2707.
    Additionally, the Air Force recognized that competition for task orders among the four contract
    awardees would offset some uncertainty surrounding future cost reasonableness. Tab 44 at AR
    2706 (noting that “the price being considered here, the TEP, is the proposed price for the seed
    project for this set of IDIQ contracts”). Such future price competition among the four IDIQ
    awardees further supports the Air Force’s price reasonableness finding. See Tech. Innovation All.
    LLC v. United States, 
149 Fed. Cl. 105, 142 (2020) (finding that contracting officer permissibly
    concluded that prices were reasonable even in the face of wide price variation where the agency
    justified such variations based on “the complex and unknown nature of the work” and future
    competition at the task order level would mitigate any price concerns) (citation omitted).
    Accordingly, the Air Force’s explanation of price variation is rational based on (i) the agency’s
    knowledge of the fluctuating Anchorage market, and (ii) the future competition for task orders
    among the four winning bidders. See Moore’s Cafeteria Servs., 
77 Fed. Cl. at 187; Impresa, 238 F.3d at 1332 (contracting officer “entitled to exercise discretion upon a broad range of issues confronting them in the procurement process”) (internal quotation and citation omitted); Labat-Anderson, Inc. v. United States, 42 Fed. Cl. 806, 846 (1999) (contracting officer yields “wide discretion” in evaluation of bids).
    iii. The Air Force Properly Considered Higher Priced Bids in its Price
    Reasonableness Determination
    Plaintiff also claims that although it “is plainly evident from the fact that the SSA selected
    the two highest price offerors and the lowest price offeror for the award pool, the SSA did not
    consider, at all, the magnitude of the price differentials or the relative benefits yielded by higher
    priced offerors.” Pl.’s MJAR at 22. To the extent Plaintiff contests the Air Force’s best value
    tradeoff analysis, the Court discusses that argument infra at Discussion Section III(B). On the
    other hand, to the extent Plaintiff is contending that the awards to SD Construction (the lowest
    priced offeror), Eklutna (the highest priced offeror), and Tyonek (the second highest priced
    offeror) indicate that the Air Force did not rationally consider price reasonableness, that claim is
    belied by the Administrative Record. See Tab 44 at AR 2708. For Eklutna and Tyonek, whose
    TEPs exceeded the mean by 42.0% ($590,130.70 above the mean) and 33.9% ($469,050.70 above
    the mean) respectively, the Air Force assessed that “although th[ose] offeror[s’] price[s] [are]
    substantially higher than other offerors,” they were still reasonable because of “extreme variability
    in prices seen in the past year.” Tab 41 at AR 2582 (Eklutna), AR 2602 (Tyonek). This Court
    defers to the agency’s judgment concerning “extreme [price] variability” over the prior year. Tab
    41 at AR 2582, AR 2602; DynCorp Int’l, 139 Fed. Cl. at 487. For the lowest priced offeror, SD
    Construction, a price reasonableness analysis would not have assessed whether that price was too
low, as price reasonableness focuses on “prevent[ing] the Government from paying too high a
    price” for the procurement. IAP World Servs., 152 Fed. Cl. at 406 (citation omitted). Instead, a
    price realism analysis focuses on whether prices are too low. Asset Prot. & Sec. Servs., L.P. v.
    United States, 
5 F.4th 1361, 1363 (Fed. Cir. 2021). However, the Solicitation did not require the
    agency to conduct a price realism analysis, and the Air Force did not do so with this procurement.
    Tab 8 at AR 389 (“The Government reserves the right to conduct price realism.”); Tab 44 at AR
    2705 (“The Government did not perform a price realism analysis for this proposal.”). That
    reasonable decision was squarely within the Air Force’s discretion. Accordingly, had this Court
    ruled on the merits of this protest, Ames 1’s price reasonableness argument would also have failed.
    III.   Basis for Injunction in Frawner
    The Court entered an injunction in Frawner based on Defendant’s arbitrary and capricious
    Past Performance and best value tradeoff analyses.18 Frawner Mar. 31 Tr. at 3:24-4:12, 13:17-
    15:18; Frawner Mem. and Order at 72-74. As this Court’s Order in Frawner mooted Ames 1’s
    18
Frawner presented a different Past Performance argument than Ames 1. As Frawner and Ames 1
    presented the same best value tradeoff argument, consideration of that argument is reflected in the
    Court’s Frawner Memorandum and Order on pp. 65-70.
    protest, the Court addresses the basis and scope of its injunction in Frawner. See supra p. 30.
    First, the Court describes the basis for finding the Air Force’s Past Performance and best value
    tradeoff analyses arbitrary and capricious. Second, the Court describes the scope of its injunction.
    A. The Air Force’s Past Performance Analysis Was Arbitrary and Capricious
    This Court held that Defendant arbitrarily applied unstated evaluation criteria in two
    instances during its Past Performance evaluation: (1) automatically applying the lowest of the three
    Relevance sub-factor ratings as the overall Relevance rating and (2) imposing an unstated $2
    million cap for evaluating past projects’ “magnitude.” Frawner Mem. and Order at 32-44.
    First, the Air Force’s application of the lowest of the three sub-factor ratings as the overall
    Relevance rating was arbitrary and capricious as (i) that procedure was not disclosed in the
    Solicitation and (ii) it directly conflicted with the Solicitation’s language. Id. at 35-40. Rather
    than state that a project would be assigned adjectival ratings for a project’s “scope,” “magnitude,”
    and “complexity,” with the lowest rating of the three sub-factors serving as the overall Relevance
    rating, the Solicitation simply stated that a past project’s overall Relevance would be assessed
    “based upon the extent to which past performance is of similar scope, magnitude and complexity
    to the type of projects exemplified by the seed project for this [S]olicitation.” Tab 8 at AR 390.
    As bidders were not put “on notice” of the significant consequences of submitting projects
    containing some, but not all, of the three Relevance sub-factors — namely, that such projects
    would be eliminated from further consideration — this Court held that the application of this
    unstated evaluation criterion was arbitrary and capricious. Frawner Mem. and Order at 36; see,
    e.g., Tab 41 at AR 2642-43.
    This Court also held that this unstated criterion arbitrarily ran counter to the Solicitation’s
    language as the only projects that could be assigned an overall Relevance rating consistent with
    the Relevance definitions were ones that received the same sub-rating for each sub-factor (i.e.,
    assigned a “Relevant” “scope” and “complexity” and “magnitude” sub-rating). Frawner Mem.
    and Order at 37-38. This reading makes it impossible to assign an overall Relevance rating to a
    project that received different sub-factor ratings. Id. However, as the Solicitation required the
    agency to assign all projects, including those with different sub-factor ratings, an overall Relevance
    rating, the Court held this reading of the Solicitation unreasonable. Id. at 39. Rather, the Court
    held that the only reasonable construction of the Relevance definitions is to consider all three sub-
    factors together. Id. Under that reading, a project such as Frawner’s Repair Network Operations
    Center project could receive at least a “Somewhat Relevant” overall Relevancy rating given it has
    “some of” the same relevancy sub-factors. Id.; see Tab 8 at AR 390; Tab 41 at AR 2642-43.
Second, the Air Force arbitrarily imposed an unstated criterion of a $2 million cap
    for past projects by assigning a project’s “magnitude” sub-factor as “Not Relevant” if it exceeded
    that cap and “Very Relevant” if it was below that cap. Frawner Mem. and Order at 40-44; see,
    e.g., Tab 41 at AR 2643-45 (listing two of Frawner’s projects valued over $2 million as “Not
    Relevant”). While the Air Force is entitled to significant deference in evaluating the relevancy of
    past performance efforts based on their “magnitude,” the Court held that the Air Force failed to
    put bidders “on notice” of the serious consequences of failing to meet the unstated $2 million
    requirement; namely that all projects valued at over $2 million would be rated as “Not Relevant”
    and eliminated from further consideration. Id. While the $2 million maximum task order under
    the Solicitation put bidders on notice that past projects valued at over $2 million might be rated
    under the “magnitude” sub-factor as less relevant than those valued under $2 million, it did not put
bidders on notice that those projects would be eliminated entirely from consideration.19 Id.; Tab 8 at
    AR 323.
    B. The Air Force’s Best Value Tradeoff Analysis Was Arbitrary and Capricious20
    Next, the Court held that while thorough in several respects, the Air Force’s best value
    tradeoff analysis was arbitrary and capricious given there were material gaps in the SSA’s report.
    Frawner Mem. and Order at 65. While the Court held that the agency was allowed to look beyond
    the overall Past Performance confidence ratings to examine the underlying Quality ratings, the Air
    Force was still required to document tradeoffs it made to warrant paying a higher price. Id. at 66-
    67. The only rankings for which the Air Force clearly documented best value tradeoffs were the
    highest and lowest ranked offers. Id. at 69. The Air Force reasonably explained that SD
    Construction, the first-ranked bidder, was a better value to the Government than Tyonek even
    though both were, in the SSA’s eyes, “essentially equal” with regard to the highest Past
    Performance and Quality ratings because SD Construction had the lowest TEP of all offerors. Tab
    44 at AR 2707. The SSA did not again “depart from the model of going strictly by [Q]uality
    ratings” until it came to ranking the two offerors with “Neutral Confidence” ratings. Id. at AR
    2710. There, it reasoned that a 34% price premium was too much to pay for the better Past
    Performance rating of Tikigaq because there was essentially no past performance to compare. Id.
    The decisions to rank Tyonek and Eklutna as the second and third best offers, Orion and Nodak as
    the fourth and fifth best offers, and the remaining offers, including those by Frawner and Ames 1,
    19
    In Frawner, the Court conducted an analysis under relevant Federal Circuit authority and held
    that Frawner was prejudiced based on the Air Force’s flawed Past Performance evaluation.
    Frawner Mar. 31 Tr. at 7:2-8:2; Frawner Mem. and Order at 56-59.
    20
    Ames 1 made the same arguments as Frawner concerning the Air Force’s best value tradeoff
    analysis. Had this Court considered Ames 1’s argument, it would have come to the same
    conclusion on best value as reflected in the Frawner Memorandum and Order.
    in the on-ramp did not include documentation of tradeoffs where higher-priced offerors received
    a higher rank. Id.
    The alleged benefits that justified ranking Tyonek and Eklutna after SD Construction even
    though their prices were 33.9% and 42.0% above the mean, respectively, were that “they ha[d]
    higher [Q]uality ratings than the offerors identified below as going into the on-ramp pool” and
    “the need for well-qualified contractors able to provide quality work for the period of this Mini-
    MACC [was] worth more to the Government than the risk represented by higher TEPs for the seed
    project, especially when that seed project [would] be awarded to a different offeror.” Id. at AR
    2707-08. However, the SSA did not explain what the Air Force gained materially other than being
    able to say that its projects were performed by a higher-rated company. Id. Although Past
    Performance is weighted significantly more than Price, when making a best value determination,
    it is well-established that price must still be meaningfully considered and reflected in the SSA’s
    documented analysis. Lockheed Missiles & Space Co. v. Bentsen, 
4 F.3d 955, 959 (Fed. Cir. 1993)
    (“[T]he importance of price in a price/technical tradeoff must not be discounted to such a degree
    that it effectively renders the price factor meaningless.”); see also Tab 8 at AR 389 (stating that
    “competing offerors’ past performance proposal will be evaluated on a basis significantly more
    important than price”).
    As this Court explained, the Air Force must document its rationale by explaining why these
    additional Quality ratings outweighed paying a higher price. Frawner Mem. and Order at 68-69.
    A higher Quality rating could mean that the company’s previous projects were more relevant in
    “scope.” Or, it could mean that company’s previous projects were more relevant in “magnitude”
    or “complexity.” It could mean that the company had a better history of timely performance. Or,
    that the company had a better history of regulatory compliance. It is not clear that all these factors
    are of equal value, and the SSA did not provide adequate documentation of its reasoning.
    Accordingly, the Court held that the Air Force should sufficiently document why these higher
Quality ratings were worth paying higher prices, in some cases as much as 30-40% more.21
    Frawner Mem. and Order at 68-69.
    C. Injunctive Relief and Corrective Action in Frawner
    In Frawner, the Court enjoined the Air Force from awarding or proceeding with any award
    under the Solicitation other than to SD Construction, but permitted corrective action. Frawner
    Mar. 31 Tr. at 3:24-4:12, 13:17-15:18; Frawner Mem. and Order at 72-74. “[T]he Court of Federal
    Claims has broad equitable powers to fashion an appropriate remedy.” Turner Constr. Co., Inc. v.
    United States, 
645 F.3d 1377, 1388 (Fed. Cir. 2011). Indeed, the Tucker Act empowers this Court to “award any relief that [it] considers proper.” 28 U.S.C. § 1491(b)(2) (emphasis added). SD
    Construction was the clear frontrunner, and the awardee of the seed project, as reflected by the
    SSA’s report.22 Tab 44 at AR 2707 (noting that SD Construction is the “most beneficial to the
    Government and is the first awardee listed” given it has “the lowest TEP of all offerors” at
    $933,000.00 and “has the highest [Q]uality rating of any offer”). The issues discussed regarding
    21
    In Frawner, the Court found prejudice resulting from the Air Force’s flawed best value tradeoff
    analysis. Frawner Mar. 31 Tr. at 11:19-12:2. The Court would have reached the same conclusion
    about the Air Force’s best value analysis if it had first ruled in this action. As Ames 1 received a
    “Substantial Confidence” Past Performance rating and had a Price below three of the four eventual
    awardees, the Air Force’s flawed best value analysis prejudiced Ames 1. Ames 1 would have had
    a “substantial chance” of receiving one of the Mini-MACC contracts placing it “within the zone
    of active consideration” for an award if the Air Force properly evaluated Price. Allied Tech. Grp.,
    Inc. v. United States, 
649 F.3d 1320, 1326 (Fed. Cir. 2011); Tab 41 at AR 2566.
    22
    Counsel for Ames 1 acknowledged that SD Construction was the clear winner and that its
    protest, even if successful, would not challenge that outcome. Tr. Oral Arg. at 42:6-15 (“We are
    not taking issue with [the award to SD Construction] because they have the lowest price and they
    had the same rating. So it’s only the other three [awardees] that had higher prices in the same
    rating that we took issue with.”).
    the Air Force’s Past Performance and best value tradeoff evaluations will not change that outcome
    as it related to SD Construction due to its best performance ratings and lowest price. 
Id. at AR 2707-09. Keeping the award in place for SD Construction will also minimize potential disruptions
    to maintenance projects planned at JBER during the short Alaskan summer.
    In Frawner, this Court also ordered that if Defendant opted to continue with awards under
    the Solicitation, it must undertake corrective action consistent with the following conditions:
    First, Defendant shall not treat the “magnitude” sub-factor as a binary factor
    where Past Performance efforts valued above $2 million receive a “Not
    Relevant” rating for that sub-factor and Past Performance efforts valued
    below $2 million receive a “Very Relevant” rating. Rather, Defendant shall,
    to the extent it applies adjectival ratings to Relevance sub-factors, employ
    the full range of such ratings as defined in the Solicitation. This is not to
    say that a project valued at over $2 million cannot be rated as “Not
    Relevant.” A past project valued at $7 million may potentially still receive
    a “Not Relevant” “magnitude” sub-rating given the significant variance
    between the maximum task order listed in the Solicitation and the past
    project’s value. However, if the Air Force chooses to rate that past project
    with a “Not Relevant” “magnitude” sub-rating, it must document the
    specific reason for doing so, the basis of which cannot be that it was
    applying an unstated $2 million threshold.
    Second, consistent with this Court’s ruling, Defendant shall not
    automatically assign as the overall Relevance rating for a Past Performance
    effort the adjectival rating of the lowest rated Relevance sub-factor. This
    means that the Air Force cannot use the lowest of the three Relevance sub-
    factor adjectival ratings as the automatic, overall Relevance score. Rather,
    prior to assigning an overall Relevance adjectival rating, the agency should
    analyze each project’s Relevance in its entirety, analyzing all three sub-
    factors together, consistent with the express terms of the Solicitation.
    Again, this is not to say that a project with, for example, a “Not Relevant”
    “magnitude” sub-score and “Relevant” “scope” and “complexity” sub-
    scores cannot still receive an overall Relevance rating of “Not Relevant;” it
    may do so as long as that rating is consistent with the Solicitation’s
    language. However, the agency cannot do so merely because it is
    automatically adopting the lowest of the three sub-factor ratings. The Air
    Force must document its reason for its rating consistent with the
    Solicitation’s language.
    Third, regarding price and best value analysis, to the extent the Air Force
    concludes that a higher-priced offer presents the best value to the agency
    due to superior technical aspects reflected in the offeror’s past performance
    rating, it must specifically document those benefits and whether they are
    worth the price premium. Merely stating that an offeror has stronger
    Quality ratings will not suffice. A fuller explanation is necessary that
    documents the tradeoffs the Air Force is making.
    Frawner Mem. and Order at 73-74.
    The Court’s ruling in Frawner does not require the Air Force to adopt a specific ordering
    of the bids beyond SD Construction should Defendant opt to take corrective action and continue
    with awards under the Solicitation. Id. at 74. The Air Force has considerable discretion with how
    it moves forward with any reevaluation consistent with this Court’s ruling. Id. However, it must
    do so in compliance with the terms of the Solicitation and the applicable provisions of the FAR,
    and consistent with this Court’s Memorandum and Order in Frawner. Id.
    CONCLUSION
    For the reasons set forth above, Plaintiff’s Motion for Judgment on the Administrative
    Record (ECF No. 15) is DENIED as moot and Defendant’s Cross-Motion for Judgment on the
    Administrative Record (ECF No. 16) is DENIED as moot. The Clerk of Court is DIRECTED to
    enter Judgment accordingly.
    The parties are directed to CONFER and FILE a Notice within seven days of this
    Memorandum and Order, attaching a proposed public version of this Sealed Memorandum and
    Order, with any competition-sensitive or otherwise protected information redacted.
    IT IS SO ORDERED.
    Eleni M. Roumel
    ELENI M. ROUMEL
    Judge
    August 3, 2022
    Washington, D.C.