Electronic On-Ramp, Inc. v. United States, 120 Fed. Cl. 515 (2015)


    In the United States Court of Federal Claims
    (Bid Protest)
    No. 14-1038C
    (Filed Under Seal: March 18, 2015 | Reissued: March 30, 2015)*
    THE ELECTRONIC ON-RAMP, INC.,
                        Plaintiff,
    v.
    THE UNITED STATES OF AMERICA,
                        Defendant,
    and
    PREMIER MANAGEMENT CORPORATION,
                        Defendant-Intervenor.

    Post-award bid protest; Best-Value Procurement; FAR Part 15; Deficiency; FAR 15.001.
    William Thomas Welch, McMahon, Welch & Learned, Reston, VA, for plaintiff.
    Barbara E. Thomas, Commercial Litigation Branch, Civil Division, United States
    Department of Justice, Washington, D.C., for defendant. With her on the briefs
    were Robert M. Norway, Martin F. Hockey, Jr., Robert E. Kirschman, Jr., and
    Joyce R. Branda of the Department of Justice and Of Counsel, Warren M. Rose of
    the Air Force Legal Operations Agency.
    Heather A. James, Whiteford, Taylor and Preston, LLP, Washington, D.C., for
    defendant-intervenor.
    OPINION AND ORDER
    KAPLAN, Judge.
    In this post-award bid protest, Plaintiff Electronic On-Ramp, Inc. (“EOR”) challenges the
    Air Force’s award of a contract for linguist and analyst support services to defendant-intervenor
    Premier Management Corporation (“Premier”). Currently before the Court are EOR’s motion
    for judgment on the administrative record and the government’s cross-motion for judgment on
    the administrative record. For the reasons set forth below, EOR’s motion is DENIED, and the
    government’s cross-motion is GRANTED.

    * This Opinion was originally issued under seal, and the parties were given the opportunity to
    propose redactions by March 27, 2015. Because the Court did not receive any such proposals,
    the Court reissues its decision without redactions.
    BACKGROUND[1]
    I.     The Solicitation
    In May 2013, the Air Force Intelligence, Surveillance, and Reconnaissance Agency (“the
    agency” or “the government”) issued a solicitation seeking “linguist and analyst support
    services,” to be performed during a base and two option years pursuant to task orders issued
    under an indefinite delivery, indefinite quantity contract. Administrative Record (“AR”) 10:211,
    243.[2] The solicitation sought offers only from disadvantaged small businesses, pursuant to the
    Small Business Administration’s 8(a) Business Development Program. AR 10:141. The
    contract would be awarded to the offeror whose proposal represented the best value to the
    government. AR 10:175.
    A.      Services to be Provided
    A “performance work statement,” or PWS, included in the solicitation described the tasks
    that the personnel of the successful offeror would be obligated to perform. These tasks included:
    translating spoken and written communications into English from various other languages, and
    vice versa, sometimes in “near real-time,” AR 10:243-45; analyzing translated communications
    for potentially significant intelligence, which would be reported to agency personnel and used to
    inform tactical and strategic military decisions, AR 10:245-46; and developing and conducting
    training programs to teach agency personnel to translate and analyze specified languages. AR
    10:246-48. Work under the contract would be performed at agency facilities in a variety of
    locations, including Fort Meade, Maryland; Royal Air Force (RAF) Mildenhall, United
    Kingdom; JB Kunia, Hawaii; and Kadena Air Base, Japan, AR 10:243, and would be so time-
    sensitive that the contractor would be required to retain 85 percent of the necessary personnel as
    of contract award and 100 percent as of 90 days after award. AR 10:178.
    The solicitation instructed each potential offeror to submit a proposal consisting of five
    volumes, including a technical volume (Volume II), a past and present performance volume
    (Volume III), and a price/cost volume (Volume IV). AR 10:166. In the technical volume,
    offerors were to address two major subject areas: their technical approach and their management
    approach. AR 10:167-68.
    [1] The background constitutes findings of fact made by the Court from the administrative record
    of the procurement filed pursuant to RCFC 52.1(a). See Bannum, Inc. v. United States, 404 F.3d
    1346, 1356 (Fed. Cir. 2005) (observing that bid protests “provide for trial on a paper record,
    allowing fact-finding by the trial court”).
    [2] The administrative record is paginated sequentially and also divided into tabs. In citing to the
    administrative record, the Court will designate the tab, followed by the page number. For
    example, AR 10:211 refers to page 211, which is located in Tab 10.
    Most pertinent to this case, in connection with the technical approach factor, the agency
    requested that offerors provide “five (5) samples of current complete curriculum” for five of the
    foreign-language instruction courses conducted by each offeror, “to include lesson plans,
    quizzes/progress checks and tests for courses.” AR 10:168. After the issuance of the
    solicitation, the agency received the following question: “Would a comprehensive course
    syllabus that covers daily lesson plans for the entire course duration, as well as quizzes, test[s],
    and student progress be sufficient to meet the sample requirements?” AR 11:317. The agency
    clarified that such a syllabus would not satisfy the solicitation’s requirements because the term
    “‘curriculum’ refers to the entire course” conducted by the offeror. Id.
    After describing the information that offerors were to provide regarding their technical
    approach, the solicitation listed a variety of topics related to management approach that offerors
    should also discuss in their technical volume. AR 10:168. The agency asked the offeror to
    identify transition and execution risks that it considered likely to arise during contract
    performance and asked how the offeror would mitigate or address each one. See id. In addition,
    each offeror was asked to “[d]escribe your approach for managing personnel resources across
    multiple projects at various geographical locations” and for “managing multiple task orders at
    various geographical locations.” AR 10:168. The agency also requested “a Staffing Plan that
    addresses your acquisition and retention methods for skilled personnel, including retention
    during times of limited taskings,” as well as “a Training Plan for skill personnel” that detailed
    the contractor’s “process for introducing new personnel to the project” and its method for
    “ensur[ing] that the required training will be accomplished and managed.” Id.
    B.      Evaluation Criteria
    The solicitation explained the manner in which the agency would evaluate information
    offerors provided in their proposals. AR 10:174-81. The agency would evaluate proposals using
    three factors listed here in descending order of importance: technical, past and present
    performance, and cost/price. AR 10:175. The technical factor would encompass two subfactors,
    technical approach and management approach, of which the former would carry more weight.
    AR 10:175-76. In combination, the technical and past and present performance factors would be
    “significantly more important than cost or price,” so that award might be made “to a higher rated,
    higher priced offeror, where the decision is consistent with the evaluation factors, and the Source
    Selection Authority (SSA) reasonably determines that the technical and/or overall business
    approach and/or past and present performance of the higher price offeror outweighs the cost
    difference.” AR 10:175-76.
    In evaluating the technical aspects of offerors’ proposals, the agency would conduct “two
    distinct but related assessments,” each having equal weight, with respect to both the technical
    approach subfactor and the management approach subfactor: “the [t]echnical [r]ating and the
    [t]echnical [r]isk [r]ating.” AR 10:176. The technical rating, expressed by an adjective paired
    with a corresponding color, would capture the agency’s determinations regarding the extent to
    which a proposal met the Government’s requirements. Id. The highest possible technical rating
    of blue/outstanding would be awarded to any proposal that “[met] requirements and indicate[d]
    an exceptional [a]pproach and understanding of the requirements”; such a proposal would
    “contain multiple strengths and no deficiencies.” Id. The lowest rating of red/unacceptable
    would be assigned to any proposal that did “not meet requirements[,] . . . contain[ed] one or
    more deficiencies,” and was, accordingly, “unawardable.” Id.
    Along with the technical rating, the agency would assign each proposal a technical risk
    rating, determined based upon “the identification of weakness(es)” that, in the agency’s
    estimation, increased the potential “for disruption of schedule, increased costs, . . . degradation of
    performance, the need for Government oversight, or the likelihood of unsuccessful contract
    performance.” AR 10:176. Possible risk ratings ranged from low, indicating “little potential to
    cause disruption of schedule, increased cost or degradation of performance,” to high, indicating a
    “likel[ihood] to cause significant disruption of schedule, increased cost, or degradation of
    performance” that was “unlikely” to be “overcome . . . , even with special contractor emphasis
    and close Government monitoring.” AR 10:176-77.
    II.    Submission and Initial Evaluation of Proposals
    The agency received proposals from nine offerors, including EOR and the ultimate
    awardee, Premier. AR 80:9814-15. EOR offered a lower price than any other offeror. Id. In its
    initial evaluation of EOR’s technical approach, the agency’s Source Selection Evaluation Board
    (SSEB) assigned EOR a red/unacceptable rating because EOR “did not sufficiently demonstrate
    understanding/knowledge in some areas and it was deficient in providing current complete
    course samples.” AR 33:7192. Specifically, EOR’s proposal was deficient because it did not
    provide “samples of quizzes/progress checks or tests” for its five foreign language course
    curricula. AR 33:7193.
    The SSEB also determined that the proposal included several weaknesses not rising to the
    level of deficiencies: EOR “did not show a sufficient understanding of the roles of the assigned
    [contractor] personnel within the DoD Intelligence Community . . . and its mission of providing
    near to real time intelligence to supported units,” nor did EOR “show sufficient knowledge and
    experience” in “[p]rocessing Signals Intelligence (SIGINT),” “[p]roducing written
    documentation such as transcripts and reports,” or “implement[ing] . . . training req[uirements]
    as related to DCGS, DSO, COMPASS CALL, Airborne/Ground Linguists.” Id. Finally, the
    SSEB noted that EOR’s proposal “failed to comply with page limits,” and that the SSEB had
    “removed,” and so had not considered, “pages 61 and later” from EOR’s technical volume. Id.
    The SSEB identified one strength in EOR’s proposal, approving of EOR’s plan to use “a
    rapid prototype approach” to adapt its training courses to changing needs. AR 33:7191-93.
    Because of the incompleteness of EOR’s curricula and its “weaknesses in showing
    understanding/knowledge of items specified in” the solicitation, the SSEB assessed the risk
    associated with the offeror’s technical approach as moderate. AR 33:7193.
    As to its management approach, EOR received a technical rating of yellow/marginal and
    a technical risk rating of high. AR 33:7193-94. The SSEB noted “several weak areas” in EOR’s
    responses to prompts contained in the solicitation. AR 33:7193. First, EOR “did not sufficiently
    detail how [it] would deal with transition risks,” including the need to “meet[] Government
    staffing requirements for 85% staff at contract award and 100% staff at 90 days.” Id. Second,
    EOR did not “show how [it] can sufficiently manage personnel at multi-geographical locations
    and assigned to multiple tasking orders.” AR 33:7193-94. Finally, although EOR’s “management
    hierarchy satisfactorily m[et] Government requirements,” “it d[id] not include an acceptable
    staffing and training plan for their personnel.” AR 33:7194. Although the SSEB found no
    strengths or deficiencies in EOR’s management approach, it concluded that EOR’s “significant
    weaknesses . . . increase[d] the risk of unsuccessful contract performance.” Id.
    In contrast to EOR, Premier received a technical rating of blue/outstanding and a technical
    risk rating of low for both its technical and management approaches. AR 36:7407. Both the
    technical approach and the management approach proposed by Premier offered several strengths,
    while revealing no weaknesses. AR 34:7197-200. The SSEB assessed Premier’s risk of non-
    performance as low. Id. Several other offerors also received technical ratings that, while not as
    high as Premier’s, far exceeded EOR’s. AR 36:7407.
    On the past and present performance factor, EOR received a rating of limited confidence.
    AR 31:7182. EOR identified in its performance volume five contracts performed by either itself
    or one of its six proposed subcontractors. AR 18:1315-31. The agency’s past performance
    evaluation team evaluated only two of those contracts, however, because EOR’s past
    performance volume did not comply with the established page limitations, running fourteen
    pages over the ten-page limit. AR 31:7171. See AR 18:1310-1333. Once the excess pages were
    removed, the remaining pages described only two contracts, neither of which was considered
    relevant by the evaluators. AR 31:7169. Performance reviews obtained by EOR and by the
    agency provided little to no information regarding the extent to which EOR had in the past
    successfully performed activities similar to those required by the solicitation. AR 31:7181.
    The evaluators noted further concerns. EOR’s past performance volume “was of
    extremely poor quality” and so “rife with typographical errors, incorrect word usage, fragmented
    or incomplete sentence[s]/thoughts[,] and contradictory statements” that evaluators had difficulty
    understanding “what was being communicated.” AR 31:7181. The SSEB also noted that the
    source selection team “was concerned that EOR’s adversarial attitude (noted in the past
    performance binder and CPARS 1) towards current and previous teammates (including the
    government)” could adversely impact this contract. AR 31:7181.
    After evaluation teams had rated the technical and past performance volumes of all
    offerors, the source selection authority (SSA) reviewed those ratings and weighed them against
    each other, as well as against each offeror’s price. While the SSA recognized that EOR offered
    the lowest price, based upon that weighing process, the SSA ranked Premier’s proposal first,
    with ratings on technical and past performance factors as follows:
    1. Premier Management (Premier)
    Factor 1: Technical                          Technical Rating         Risk Rating
    Subfactor 1, Technical Approach:             Blue/Outstanding         Low Risk
    Subfactor 2, Management Approach:            Blue/Outstanding         Low Risk
    Factor 2, Past Performance:                  Satisfactory Confidence
    AR 36:7407. EOR was ranked seventh out of nine offerors, with ratings on technical and past
    performance factors as follows:
    7. Electronic On-Ramp (EOR)
    Factor 1: Technical                          Technical Rating         Risk Rating
    Subfactor 1, Technical Approach:             Red/Unacceptable         Moderate Risk
    Subfactor 2, Management Approach:            Yellow/Marginal          High Risk
    Factor 2, Past Performance:                  Limited Confidence
    AR 36:7408.
    III.   Award, Protests, and Corrective Action by Agency
    Based upon the SSA’s determination that Premier’s proposal represented the best value to
    the government, the agency awarded the contract to Premier on September 20, 2013. AR
    40:7588. Shortly thereafter, two unsuccessful offerors lodged protests with the Small Business
    Administration (SBA), asserting that Premier did not properly qualify as a business entitled to
    participate in the 8(a) program. AR 43:7735-37; 45:7740-57. The SBA rejected these protests.
    AR 61:8547, 8549.
    While the SBA’s decision was pending, EOR filed a protest before the Government
    Accountability Office (GAO). AR 51:8203-54. EOR argued, among other things, that (1) in
    evaluating EOR’s sample curricula, the agency failed to notice that sample tests and quizzes had
    been submitted for one curriculum, a course in the Tamasheq language, AR 51:8208; (2) the
    agency had erroneously excluded from consideration several pages of EOR’s technical proposal
    that had fallen within the page limit and treated the presence of the pages as a weakness in its
    proposal, AR 51:8214-15; and (3) the two contracts considered by the agency during the past and
    present performance evaluation should have been deemed relevant. AR 51:8248-49. EOR
    subsequently withdrew its protest after the agency issued a notice announcing that it would take
    voluntary corrective action. AR 53:8256. In its notice, the agency stated that it “believes its
    evaluation of past performance submissions did not comply with the evaluation scheme stated in
    the solicitation,” and that “there were instances in the technical evaluations where documents or
    pages within the proposals were not sufficiently evaluated.” AR 52:8255. As a result, the
    agency would re-evaluate offerors’ past performance and technical volumes. Id.
    In accordance with its corrective action notice, the agency conducted re-evaluations of
    offerors’ past and present performance submissions, see AR, Tabs 55, 56, 64, 65, and of EOR’s
    technical volume. See AR Tab 54. Upon completion of its re-evaluation of EOR’s past
    performance information, the agency reaffirmed EOR’s “limited confidence” rating. AR
    64:8593-94.
    In re-evaluating EOR’s technical information, as pertinent here, the agency re-examined
    EOR’s sample curricula and noted that the Tamasheq curriculum does, in fact, contain sample
    quizzes. AR 54:8259. Nevertheless, EOR’s technical volume remained “deficient in this area by
    not providing samples of quizzes/progress checks or tests for [the other four course samples].”
    Id. Furthermore, the SSEB noted that the quizzes included in the Tamasheq course consisted of
    two quizzes with “10 questions each,” a third quiz involving only “translat[ion] [of] a song,” and
    “a blank progress check sample” with “no actual questions.” Id. The agency considered these
    sample assessments “very weak.” Overall, EOR’s technical approach ratings remained
    unchanged.
    After completing this initial re-evaluation of EOR’s technical proposal, the SSEB
    conducted a second re-evaluation, considering the arguments advanced by EOR in its protest
    before the GAO. AR 77:9331-35. The SSEB determined that none of those arguments justified
    a change in EOR’s technical ratings. AR 77:9335. It also deleted its reference to the weakness
    of EOR’s Tamasheq quizzes from its explanation of the deficiency of EOR’s proposal, making
    clear that the absence of tests and quizzes from the other four curricula, not the weakness of the
    Tamasheq assessments, was responsible for the finding of deficiency. AR 77:9336.
    After conducting these re-evaluations to ensure that all proposals were evaluated precisely
    as the solicitation dictated, the SSA arrived at the same ultimate ranking of offerors. AR
    80:9814-15.
    IV.    Award and Second GAO Protest
    The agency awarded the contract to Premier on June 17, 2014. AR 81:9842-43. On June
    30, 2014, EOR, despite being ranked seventh out of nine offerors, again filed a protest at the
    GAO. AR 1:1-50. After EOR failed to timely submit its comments on the agency’s report, the
    GAO dismissed the protest on August 14, 2014. AR 89:9986-87.
    EOR filed this protest here on October 24, 2014, seeking to permanently enjoin
    agency from proceeding with performance of the contract. EOR contends that the evaluation of
    its proposal as well as Premier’s was unreasonable, arbitrary, and unsupported by the written
    record. It challenges virtually every rating that the agency assigned to its proposal. Moreover, it
    contends that the agency failed to follow through on the promised corrective action in reliance
    on which EOR agreed to withdraw its prior protest.
    The Court held oral argument on the parties’ motions on February 10, 2015. For the
    reasons set forth below, the Court concludes that the government is entitled to judgment on the
    administrative record because the agency’s decision to rate EOR as red/unacceptable on its
    technical proposal was neither arbitrary, nor capricious, nor contrary to law. In light of that
    ruling, EOR was ineligible for an award under the terms of the solicitation, see AR 10:176, and it
    is unnecessary for the Court to address EOR’s other challenges to the agency’s award decision.
    DISCUSSION
    I.     Jurisdiction
    The Court of Federal Claims has “jurisdiction to render judgment on an action by an
    interested party objecting to . . . a proposed award or the award of a contract or any alleged
    violation of statute or regulation in connection with a procurement or a proposed procurement.”
    28 U.S.C. § 1491(b)(1) (2012). A party is an “interested party” with standing to bring suit under
    28 U.S.C. § 1491(b)(1) if the party “is an actual or prospective bidder whose direct economic
    interest would be affected by the award of the contract.” Orion Tech., Inc. v. United States,
    704 F.3d 1344, 1348 (Fed. Cir. 2013). An offeror has a direct economic interest if it suffered a
    competitive injury or prejudice. Myers Investigative & Sec. Servs., Inc. v. United States,
    275 F.3d 1366, 1370 (Fed. Cir. 2002) (holding that “prejudice (or injury) is a necessary element
    of standing”).
    In a post-award bid protest, the protestor has suffered prejudice if it would have had a
    “substantial chance” of winning the award “but for the alleged error in the procurement process.”
    Info. Tech. & Applications Corp. v. United States, 316 F.3d 1312, 1319 (Fed. Cir. 2003). See
    also Weeks Marine, Inc. v. United States, 575 F.3d 1352, 1359 (Fed. Cir. 2009); Rex Serv. Corp.
    v. United States, 448 F.3d 1305, 1308 (Fed. Cir. 2006). “In other words, the protestor’s chance
    of securing the award must not have been insubstantial.” Info. Tech., 316 F.3d at 1319.
    Neither the government nor Premier disputes that EOR is an “interested party” within the
    meaning of the statute. EOR was an actual offeror for the contract in question. Although EOR
    was ranked seventh out of nine offerors, it proposed the lowest price and received slightly higher
    technical ratings than the other unsuccessful offerors. If all of EOR’s allegations were found
    meritorious, including its challenge to the assessment of a deficiency in its technical proposal,
    then “all of the agency’s ratings would need to be redone, and a new best value determination
    made.” Preferred Sys. Solutions v. United States, 110 Fed. Cl. 48, 57 (2013). Because EOR
    would have had a not insubstantial chance of securing the award upon a ruling that the agency’s
    technical evaluation of its proposal was erroneous or that award to Premier was invalid, EOR has
    established its standing as an interested party.
    II.    Standard for Granting Judgment on the Administrative Record
    Pursuant to RCFC 52.1, the Court reviews an agency’s procurement decision based on
    the administrative record. Bannum, Inc., 404 F.3d at 1354. The Court makes “factual findings
    under RCFC [52.1] from the record evidence as if it were conducting a trial on the record.” Id.
    at 1357. Thus, “resolution of a motion respecting the administrative record is akin to an
    expedited trial on the paper record, and the Court must make fact findings where necessary.”
    Baird v. United States, 77 Fed. Cl. 114, 116 (2007). The court’s inquiry is “whether, given all
    the disputed and undisputed facts, a party has met its burden of proof based on the evidence in
    the record.” A&D Fire Prot., Inc. v. United States, 72 Fed. Cl. 126, 131 (2006). Unlike in a
    summary judgment proceeding, genuine issues of material fact will not foreclose judgment on
    the administrative record. Bannum, Inc., 404 F.3d at 1356.
    III.   Standard of Review in Bid Protest Cases
    The court reviews challenges to a contract award under the same standards used to
    evaluate agency action under the Administrative Procedure Act (“APA”), 5 U.S.C. § 706 (2012).
    See 28 U.S.C. § 1491(b)(4) (stating that “[i]n any action under this subsection, the courts shall
    review the agency’s decision pursuant to the standards set forth in section 706 of title 5”). To
    successfully challenge an agency’s procurement decision, a plaintiff must show that the agency’s
    decision was “arbitrary, capricious, an abuse of discretion, or otherwise not in accordance with
    law.” 5 U.S.C. § 706(2)(A); Bannum, Inc., 404 F.3d at 1351. “The arbitrary and capricious
    standard applicable here is highly deferential. This standard requires a reviewing court to sustain
    an agency action evincing rational reasoning and consideration of relevant factors.” Advanced
    Data Concepts, Inc. v. United States, 216 F.3d 1054, 1058 (Fed. Cir. 2000) (citing Bowman
    Transp., Inc. v. Arkansas-Best Freight Sys., Inc., 419 U.S. 281, 285 (1974)).
    In a bid protest, the disappointed offeror “bears a heavy burden” in attempting to show
    that a procuring agency’s decision lacked a rational basis. Impresa Construzioni Geom.
    Domenico Garufi v. United States, 238 F.3d 1324, 1338 (Fed. Cir. 2001). Indeed, such a
    challenge can succeed only where the agency “entirely failed to consider an important aspect of
    the problem, offered an explanation for its decision that runs counter to the evidence before the
    agency, or [the decision] is so implausible that it could not be ascribed to a difference in view or
    the product of agency expertise.” Ala. Aircraft Indus., Inc.–Birmingham v. United States,
    586 F.3d 1372, 1375 (Fed. Cir. 2009) (alteration in original) (quoting Motor Vehicle Mfrs. Ass’n
    v. State Farm Mut. Auto Ins. Co. (“State Farm”), 463 U.S. 29, 43 (1983)).
    The Court is particularly deferential to an agency’s technical evaluation. E.W. Bliss Co.
    v. United States, 77 F.3d 445, 449 (Fed. Cir. 1996) (stating that “such matters as technical
    ratings” involve “the minutiae of the procurement process . . . that a court will not second
    guess”). “This is because the evaluation of proposals for their technical excellence or quality is a
    process that often requires the special expertise of procurement officials, and thus reviewing
    courts give the greatest deference possible to these determinations.” One Largo Metro, LLC v.
    United States, 109 Fed. Cl. 39, 74 (2013) (alteration and internal quotation marks omitted)
    (quoting Beta Analytics Int’l, Inc. v. United States, 67 Fed. Cl. 384, 395 (2005)).
    Given this highly deferential standard of review, the court’s function is limited to
    “determin[ing] whether ‘the contracting agency provided a coherent and reasonable explanation
    of its exercise of discretion.’” Impresa, 238 F.3d at 1332-33 (quoting Saratoga Dev. Corp. v.
    United States, 21 F.3d 445, 456 (D.C. Cir. 1994)). The agency need only articulate a “rational
    connection between the facts found and the choice made,” and the court will “uphold a decision
    of less than ideal clarity if the agency’s path may reasonably be discerned.” State Farm, 463
    U.S. at 43.
    IV.    The Government Is Entitled to Judgment on the Administrative Record
    As described above, the agency found EOR’s technical proposal deficient[3] because the
    curricula that EOR submitted for four of five sample courses were incomplete. Specifically, the
    agency concluded, EOR did not include samples of the quizzes/progress checks or tests that EOR
    stated would be administered as part of those courses. EOR claims that this finding was
    incorrect and contrary to the record. The Court disagrees.

    [3] A “deficiency” is a “material failure of a proposal to meet a Government requirement or a
    combination of significant weaknesses in a proposal that increases the risk of unsuccessful
    contract performance to an unacceptable level.” FAR 15.001.

    As described above, the solicitation required offerors to provide “five (5) samples of
    current complete curriculum to include lesson plans, quizzes/progress checks and tests for
    courses.” AR 10:168. After the issuance of the solicitation, the agency clarified that
    “‘curriculum’ refers to the entire course” conducted by the offeror, and that “a comprehensive
    course syllabus that covers daily lesson plans, . . . quizzes, test[s], and student progress” would
    not satisfy the solicitation’s requirement. AR 11:317. The parties agree that the solicitation
    required offerors to include any quizzes or tests actually used in their curricula. Furthermore, the
    solicitation stated that the “curriculum samples [would] be considered as an indicator of
    technical capability.” AR 10:168. According to the agency, EOR failed to meet a material
    requirement of the solicitation because it did “not provid[e] samples of quizzes/progress checks
    or tests for . . . [its] Arabic Refresher, Bambara Advanced, French Basic, and French
    Intermediate” courses. AR 54:8259. This conclusion was eminently reasonable and clearly
    supported by the administrative record.
    EOR stated in each of its curricula that tests and quizzes would be administered as part of
    each class. For example, for its Arabic Refresher course for advanced students, EOR stated that
    the course included grammar quizzes, unit tests, weekly exams, and writing assignments. See
    AR 16:528, 530 (noting that as part of the course objectives, students were required “to score at
    least 70% on all assessment modalities: unit tests, weekly exams and homework”; “score [] at
    least 70% in the 10 grammar quizzes”; and “score at least 70% on your writing assignment”).
    EOR contends that “‘Quiz[zes]’ meaning Examination[s]” are included in the curriculum as well
    as progress reports, a half term examination, a mid-term examination, and a final examination.
    Pl.’s MJAR 8. The pages identified by EOR as containing quizzes or tests, however, are
    virtually blank, containing only a single line at the top right hand corner of the page that includes
    the words “Test/Exam” in Arabic and specifies the number of minutes allowed. AR 16:553, 573,
    583, 596, 618, 640, 663, 678, 694, 706, 734, 746, 759, 770. See also LeBlanc Decl. ¶¶ 4-31,
    Def.’s Unopposed Mot. Supplement AR Ex. B, Dec. 5, 2014, ECF No. 23. These blank pages
    are plainly not sufficient to meet the specifications of the solicitation that required EOR to
    actually include the contents of any tests or quizzes for each course in which it intended to
    administer such quizzes and tests.[4]
    Similarly, in a section of its Bambara Advanced curriculum titled “Testing Schedule
    Formula,” EOR stated that “[a] quiz will be given every day to test the mastery of previous
    lessons” and “an exam will be given in the middle and at the end of the semester.” AR 16:783.
    It further stated that the midterm and final exam would be administered in class by the professor.
    Id.[5]
    [4] On January 27, 2015, the defendant notified the Court that the blank pages in EOR’s Arabic
    curriculum labeled “Test/Exam” actually contained hidden text that, for whatever reason, is
    apparently not visible on the electronic or hard copies of EOR’s proposal. Def.’s Notice 1-2,
    ECF No. 38. This technical problem may explain but does not excuse EOR’s non-compliance
    with the solicitation’s requirements. The agency cannot reasonably have been expected to guess
    that an apparently blank page in EOR’s proposal contained hidden text in the electronic copy
    which was visible only when the page was highlighted.
    [5] The syllabus for the Bambara Advanced course also recommended that instructors distribute
    periodic progress reports each week. The periodic progress report is a form that is to be filled
    out by the course instructor for each individual student in which the instructor assesses the
    student’s skills in areas such as simple short conversations, reporting facts about current events,
    understanding speech about basic needs, or reading simple authentic material. AR 16:796-97.
    In addition, the instructor is to rate students on whether they are below average, average, above
    average or superior on their pronunciation, fluency, grammar control, and vocabulary retention.
    Part three of the progress report includes an optional place for the instructor to include an
    achievement test score, expressed as a percentage. Students were also encouraged to evaluate
    their own progress at the end of each week. AR 16:783.
    The end of each lesson plan for the Bambara course includes “Exercises,” “TDA,” and
    “Self-Evaluation,” all of which ask students to recall and use the skills and knowledge gained
    from the day’s lesson. See AR 16:815-16, 824, 828, 835-37, 841-43, 850-52, 862-63, 867-70,
    876-77, 885, 889-90, 894, 898, 902, 908-09, 915, 923. Students are then asked to evaluate
    themselves on specific skills, such as “I can speak about daily activities: YES____ NOT
    YET____.”
    EOR contends that the Exercises at the end of each lesson constitute the quizzes that it
    planned to “give[] every day to test the mastery of previous lessons” in accordance with the
    testing schedule and formula. Pl.’s MJAR 1. Moreover, EOR contends, the exercises and
    progress checks taken together are within the meaning of “quiz/progress check” as used in the
    solicitation since they “serve the function of testing and evaluation of the student on a daily
    basis.” Pl.’s Resp. & Reply 1. These contentions are unavailing.
    First, as noted, EOR’s syllabus for the Bambara Advanced course states that instructors
    will administer a midterm and final exam to students. EOR does not provide any citation to
    material in its proposal that the agency should reasonably have construed as a midterm or final
    exam, and the Court has not found any. Further, at the oral argument in this case, counsel for
    EOR acknowledged that the final and midterm exams for this course (and for the other courses)
    were “just not there.” See Oral Arg., 11:39:38-11:39:41.
    It was also not unreasonable for the agency to conclude that EOR’s proposal failed to
    provide the copies of the daily quizzes it discussed in its syllabus. The daily “exercises” which
    EOR now contends the agency should have recognized as “quizzes” were not identified as such
    in EOR’s materials. And, as the record shows, when EOR wants to identify a series of questions
    as a “quiz,” it knows how to do so—as it did in the Arabic course and in the syllabus for the
    Bambara course. See, e.g., AR 16:530. Moreover, the agency’s conclusion that the exercises
    were not quizzes or tests comports with the plain and ordinary meaning of the word “exercise” as
    used in the educational context, which is “a particular task or set of tasks devised as exercise,
    practice in a technique, etc.” or a mental activity “esp. as practice to develop a faculty.” Oxford
    English Reference Dictionary 490 (rev. 2d ed. 2002). By contrast, the use of the word “quiz” or
    “examination” in this context connotes not a vehicle for a student to practice his skills, but rather
    a test or assessment which will be scored to measure a student’s knowledge of the course
    material.
    EOR’s French Intermediate and French Beginner’s courses proposed the same testing
    requirements as the Bambara course. Thus, for each course, EOR stated that “a quiz will be
    given every day to test the mastery of previous lessons. An exam will be given in the middle and
    at the end of the semester.” AR 17:963, 1045. As in the Bambara course, students were also to
    receive a weekly periodic assessment report assessing a student’s proficiency in the language.
    EOR contends that it met the requirements of the solicitation because its French courses include
    “Periodic Assessment Reports” as well as “‘Exercices in French’ meaning quizzes,” and tests.
    Pl.’s MJAR 7-8.
    As with the Advanced Bambara course, however, the tests EOR stated it would
    administer at the middle and at the end of the semester are absent from the curricula for both
    Intermediate and Beginner French. The French curricula contain references to the administration
    of tests, and include placeholders for such tests, but copies of the tests are not included in the
    course samples. See AR 17:1098, 1030, 1103. Again similar to the Advanced Bambara course,
    none of the pages of the administrative record that EOR has cited with respect to its two French
    courses include materials that are identified as quizzes. Further, the word “exercices” in French
    means “exercises,” not “quizzes.” See Kendall Decl. ¶¶ 4-7. And, as explained above, the
    agency had ample basis to find that the “exercises” were not the “quizzes” that EOR referred to
    in its course design.
    As perhaps its final Hail Mary pass, EOR contends that “[t]o the extent that one or more
    of the curricula [are] lacking or ‘weak’ . . . there is no question that the source material for
    quizzes and daily exercises is present in the curricula provided.” Pl.’s Resp. & Reply, Dec. 16,
    2014, ECF No. 31; see also Oral Arg., 11:37:06-11:37:35 (stating that EOR intended to create
    examinations by “using the exercises that are included and then compiling them as types of[,] as
    a type of midterm as a type of final”). But the solicitation specifically required offerors to
    include lesson plans, quizzes/progress checks, and tests that were going to be administered in the
    courses, not material that could be used to create these instruments. Further, to the extent that
    EOR intended to redeploy the instructional materials it submitted as quizzes/progress checks and
    tests, it failed to advise the agency of that intention. In short, this argument is unsupported by
    the record and is, at best, a post hoc rationalization of EOR’s failure to supply the quizzes and
    tests along with its proposal.
    Indeed, the deficiencies of EOR’s submission are highlighted by comparing EOR’s
    curricula to the submission of the contract awardee, Premier.[6] For all of its courses, Premier
    included a short section near the beginning of each curriculum stating that it would use both
    formal and informal assessments. AR 23:1501; AR 24:2199-200; AR 25:3106-07; AR 26:5465;
    AR 27:6512. It then supplied sample copies of all of the assessment tools its curricula stated that
    it would use in the courses. See AR Tabs 23-27.

    [6] The Court requested supplemental briefs from the parties asking them to identify which parts
    of the administrative record reflect “quizzes/progress checks and tests” submitted by the contract
    awardee, Premier. Order, January 16, 2015, ECF No. 36. See Tech. Sys. v. United States, 98
    Fed. Cl. 228, 253 (2011) (noting that, in some cases, the court may examine whether features of
    a proposal that are highly rated are similar to another proposal that is not similarly credited).

    Thus, for its Intermediate and Expedited Pashto courses, Premier provided copies of the
    formal assessments it intended to use, which consisted of tests and quizzes. For example, the
    Intermediate Pashto course included weekly reading and listening tests, as well as “pre-” and
    “post-tests,” given at the beginning and at the end of the course, respectively. AR 24:2199,
    2322-27, 2475-81, 2546-50, 2687-92, 2246-71 (pre-test), 3079-97 (post-test). The expedited
    course, in addition to listening and reading tests, also incorporates daily quizzes, vocabulary
    quizzes, speaking tests, and mid-term tests, all of which were included in Premier’s proposal.
    AR 25:3106-07, 3178, 3205, 3249 (examples of daily quizzes), 3414-15, 3780-81, 3986-87
    (examples of speaking tests), 3417-23, 3786-94, 3988-98 (examples of listening and reading
    tests); 4302-15, 4317-4340 (mid-term tests).
    Premier did not propose to use its own formal assessments for three of its curricula. For
    example, for its Persian-Farsi course, Premier stated that the formal assessment for the course
    was the DLPT.[7] AR 27:6512. Premier explained that “[t]his course is designed as a refresher
    course. As such, it is highly encouraged that the students [take the] DLPT upon course
    completion, or as directed by their CLPM or their unit.” Premier provided similar responses for
    its Arabic Dialect and Persian-Dari refresher courses. See AR 23:1501; AR 26:5465.
    Premier stated that it would use informal assessments for these courses. These informal
    assessments would consist of self-check questions, optional homework assignments, and
    opportunities for students to present questions to and request clarification from the instructor at
    the end of each day. AR 23:1501; AR 26:5465; AR 27:6512. Premier then went on to include
    samples of these informal assessments in its curricula. See, e.g., AR 23: 1510, 1514-15, 1518-
    19, 1540-41, 1543; AR 26:5487-510, 5549, 5571-72, 5585-86; AR 27:6522-31, 6542-48, 6561-
    68, 6576-83, 6829-33.
    Plaintiff argues that—because Premier used informal rather than formal assessments for
    these courses—its submission was as deficient as EOR’s. See Pl.’s Reply to Def.’s Resp. to Ct.
    Order 1-2, Jan. 30, 2015, ECF No. 39. This contention is unpersuasive. The solicitation did not
    require the use of formal assessments; it simply required that if such assessments were to be
    used, then sample copies would be provided as part of the offeror’s proposal. Premier outlined
    the assessment strategy used in each of its courses, provided the materials it had created to
    implement those strategies, and clearly identified its assessment tools as such in its submission.[8]
    Finally, because the agency’s assignment of a deficiency had a rational basis, the agency
    acted reasonably in rating EOR as red/unacceptable on its technical proposal. Under the terms
    of the solicitation, a proposal containing one or more deficiencies must be assigned a
    technical rating of red/unacceptable, which makes it ineligible for award. AR 10:176. For that
    reason, the Court need not address EOR’s other challenges to the ratings assigned to its proposal
    by the Air Force. See Glenn Defense Marine (Asia), PTE Ltd. v. United States, 720 F.3d 901,
    912 (Fed. Cir. 2013) (noting that plaintiff failed to show prejudicial error where it could not
    show that even if the agency had erred in its past performance ratings, it would have had a
    substantial chance of winning the award).

    [7] DLPT stands for Defense Language Proficiency Test. AR 10:257.

    [8] Premier did not include copies of DLPTs in its submission. It was not unreasonable for the
    agency not to assess any deficiency on that basis because the DLPT is created by the Defense
    Language Institute, used by the United States Department of Defense, and is administered by the
    government, not Premier. See AR 10:244.
    CONCLUSION
    As described above, EOR failed to meet a requirement of the solicitation which the
    agency considered material, making its proposal ineligible for contract award. Consequently,
    EOR has failed to establish the merit of its claims by showing any error, prejudicial or otherwise,
    in the procurement process. Accordingly, the government’s motion for judgment on the
    administrative record is GRANTED. The plaintiff’s motion for judgment on the administrative
    record is DENIED. Each party shall bear its own costs.
    IT IS SO ORDERED.
    s/ Elaine D. Kaplan
    ELAINE D. KAPLAN
    Judge