Review Strategy Brown Bag

Jane Battles

6/19/14


1) What happened when 40 applications were expected, 115 Letters of Intent were received, and 125 applications came through the system.

RFA-AI-12-020 “Partnerships for Interventions to Treat Chronic, Persistent and Latent Infections (R21/R33)”. Review was conducted prior to the requirement of asking OER before splitting a SEP.

  • RFA was for any latent, persistent, and chronic infections except HIV. Planned to fund 13-15 awards. There was a single Program Officer as contact and one pot of money (biodefense), but applications were assigned to multiple POs throughout DMID based on type of infection.
  • Not all PIs who submitted LOIs went on to submit applications; 43 applications were submitted without LOIs. Nine applications were determined by SRP to be non-responsive, but Program wanted all reviewed. One PI withdrew an application prior to review.
  • During administrative review, two other SROs and two additional ESAs assisted.
  • Reasons for splitting SEP? The expertise required, the availability of reviewers for one or more face-to-face meetings, and competing meetings around the same time.
  • Proposed to Branch Chief that meeting be split into 6 SEPs based on the type of infection/causative agent. Obtained buy-in from the Contact PO for holding 6 teleconference reviews. Single COI pack sent. Had common pre-review teleconferences and a Chair teleconference.
  • Split applications into 6 SEPs (1 SRO/1 ESA). Streamlining was conducted within each SEP.
  • SEP J1: Bacteria other than Mycobacterium, Staphylococcus, and agents causing pulmonary infections other than TB (20 reviewers/17 applications/8 scored/9 ND)
  • SEP J2: Mycobacterium (11 reviewers/22 applications/9 scored/13 ND)
  • SEP J3: Staphylococcus, other wound-causing bacteria, and bacteria causing pulmonary infections other than TB (17 reviewers/23 applications/10 scored/13 ND)
  • SEP J4: Hepatitis viruses, enteroviruses, polyomaviruses, lymphocytic choriomeningitis virus, human papillomaviruses, and coxsackieviruses (22 reviewers/25 applications/11 scored/14 ND)
  • SEP J5: Herpesviruses (14 reviewers/20 applications/10 scored/10 ND)
  • SEP J6: Parasites and fungi (12 reviewers/17 applications/12 scored/5 ND)
  • Conducted six 6-hr teleconferences on 11/27, 11/29, 12/7, 12/10, 12/11 & 12/13/12.
  • 96 reviewers were listed on the 6 rosters. Was able to convince 19 to serve on more than one SEP, so there were only 74 unique names. One did 3 SEPs and another did 4 SEPs. Was able to use the same Chair for two SEPs; several who were in more than one SEP also served as back-up Chairs. Scores were not released until the last SEP was completed.
  • Program funded 21 applications (2 from J1, 5 from J2, 3 from J3, 5 from J4, 4 from J5, and 2 from J6).

2) What happened when 12 proposals were expected and 80 arrived.

  • RFP-NIH-NIAID-DMID-AI2008041 “Animal Models of Infectious Diseases”; Planned to award multiple 7-year indefinite delivery/indefinite quantity (IDIQ) contracts with cost-reimbursement task orders. Review was conducted prior to the requirement of asking OER before splitting a SEP.
  • Two other SROs and two ESAs assisted throughout the review.
  • Reason for splitting SEP? The expertise required and the availability of reviewers for a single face-to-face meeting longer than three days.
  • Met with Office of Acquisition on non-responsiveness and was able to eliminate one proposal.
  • Per RFP, proposals could be divided into Parts A-D based on the types of models available. Offerors had to submit separate proposals to each part based on scope (i.e., could not combine models representing different parts into a single proposal). OA divided proposals into Parts A, B, C, and D as they were received.
  • Part A: Small Animal Models (39 proposals)
  • Part B: GLP Small Animal Models (16 proposals)
  • Part C: Non-Human Primate Models (13 proposals)
  • Part D: Nontraditional Animal Models (11 proposals)
  • Proposed strategy to BC & OA to divide into two SEPs based on expertise needed. Single COI Pack was sent. Common pre-review teleconferences were held.
  • Meeting “1”: Part A (22), Part B (15), Part C (1), Part D (4) [24 reviewers/42 proposals]
  • Meeting “2”: Part A (17), Part B (1), Part C (12), and Part D (7) [31 reviewers/37 proposals]
  • Was able to convince 12 reviewers to serve on both panels.
  • A single TER was generated for the two SEPs that met on 6/29-7/1/09 and 7/29-31/09.
  • 40 awards, ranging from $50,000 to $5,334,456, were made.

3) You might not have to split the SEP: a case of what happened when you recruited the perfect panel with all the expertise needed, the reviewers were only available for a few hours here and there, and you did not have a valid rationale to split into more than one SEP.

  • RFA-AI-13-008 “Preclinical Innovation Program (R01)” Planned to fund 4-8 awards.
  • Reason for splitting into two meetings instead of two SEPs?
  • OER permission was now needed to split a SEP.
  • There was absolutely no valid reason to justify splitting the SEP.
  • Only 24 applications were received.
  • Wanted to maintain all the expertise recruited to have the best panel conducting the review.
  • There were two competing, more complex face-to-face reviews around the same time that required similar expertise.
  • The difficulty was convincing the PO that the review could be done by teleconference, and that it would be held over two non-continuous days.
  • Looked for two blocks of time when most of the panel was available.
  • A single panel reviewed applications over two non-continuous days (originally scheduled for 10/31/13 & 11/6/13 prior to the shutdown). Meetings were scheduled for up to four hours on each day. Used the same Chair and Co-Chair on both days. Streamlining was conducted during the 1st teleconference. The panel discussed 13 out of 24 applications.
  • Five awards were made.