
Hubble Space Telescope
Cycle 26 Peer Review Guidelines


The Review Process: Panels

We will have four panels assigning roughly 1100 orbits to small and medium programs. Each panel will have an orbit allocation based on proposal pressure in its science category, roughly 300 orbits per panel. Each panel will be assigned the small, medium, large, treasury, and legacy archival proposals in its science category. As a rough guide, we might expect each panel to recommend 6-8 medium proposals.

A round of preliminary grading will be completed in the weeks prior to the in-person meeting to set the initial proposal categorizations. As in a standard cycle, each proposal will be graded by 5 panel members, with the assignments based on matching science keywords, taking into account conflicts and the overall proposal balance. Primary and Secondary reviewers of proposals must enter comments on those proposals via the website prior to the face-to-face discussion meeting.

The preliminary grades will be averaged to provide a rank-ordered list and proposals will be grouped by quintiles. Panelists will be given access to the quintile rankings (but not the averaged grades) for each proposal. Proposals in the top quintile will be marked category "A1", and will automatically move on for discussion at the in-person meeting. Each panelist may promote one additional proposal of their choosing from a lower quintile (generally the second or third) for further discussion. In all cases, the preliminary grades will not be used at the face-to-face meeting. This is essentially the selection process adopted in the Hubble Fellowship review.
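The triage described above amounts to a simple procedure; here is a minimal sketch, assuming five numeric grades per proposal, higher-is-better grading, and invented proposal IDs. The actual process runs through the STScI web-reviewer tooling, not code like this:

```python
# Hypothetical sketch of the preliminary-grade triage: average the grades,
# advance the whole top quintile, and add any panelist-promoted proposals.
from statistics import mean

def quintile_triage(grades_by_proposal, promotions=()):
    """Return the proposals advanced for in-person discussion: the
    entire top quintile (category "A1") plus promoted proposals."""
    # Rank proposals by mean preliminary grade (assumed higher is better).
    ranked = sorted(grades_by_proposal,
                    key=lambda p: mean(grades_by_proposal[p]),
                    reverse=True)
    quintile_size = max(1, len(ranked) // 5)
    top_quintile = ranked[:quintile_size]          # category "A1"
    # Each panelist may promote one proposal from a lower quintile;
    # dict.fromkeys de-duplicates while preserving order.
    return list(dict.fromkeys(top_quintile + list(promotions)))

grades = {"P1": [4, 5, 4, 5, 4], "P2": [3, 3, 4, 2, 3],
          "P3": [2, 2, 1, 2, 2], "P4": [5, 5, 5, 4, 5],
          "P5": [1, 2, 1, 1, 2]}
print(quintile_triage(grades, promotions=("P2",)))  # ['P4', 'P2']
```

Note that only the quintile membership, not the averaged grade, is carried forward, matching the rule that preliminary grades are not used at the face-to-face meeting.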

At the in-person meeting, the panel will discuss and rank the selected proposals, down to twice the panel's orbit allocation; the allocation is set by the requests in the panel's small and medium proposals (large, treasury, and legacy archival proposals do not count against it). All proposals should be ranked together. The small and medium proposals above the allocation line will be recommended for approval. The five or six highest-ranked large, treasury, and legacy archival proposals will be forwarded to the TAC for further discussion. The panel should then review its recommendations to consider the overall balance of science topics.
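The above-the-line selection can be sketched as follows; the function name, category labels, and orbit numbers are all illustrative assumptions, not STScI software:

```python
# Hypothetical sketch of the in-person ranking outcome: walk the panel's
# ranked list, recommend small/medium proposals until the allocation line
# is crossed, and forward large/treasury/legacy-AR proposals to the TAC.
def above_the_line(ranked_proposals, allocation):
    """ranked_proposals: (name, orbits, category) tuples in rank order."""
    recommended, forwarded, used = [], [], 0
    over = False
    for name, orbits, category in ranked_proposals:
        if category in ("large", "treasury", "legacy-ar"):
            forwarded.append(name)        # does not count against allocation
        elif not over and used + orbits <= allocation:
            recommended.append(name)      # above the allocation line
            used += orbits
        else:
            over = True                   # line crossed: lower-ranked
                                          # small/medium are not recommended
    return recommended, forwarded

ranking = [("A", 20, "medium"), ("B", 120, "large"),
           ("C", 10, "small"), ("D", 15, "medium")]
print(above_the_line(ranking, allocation=35))  # (['A', 'C'], ['B'])
```

In practice the line is drawn during discussion, not mechanically, and the panel then reviews the result for science balance.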

Chairs and Vice Chairs will direct the reviews in their panels. They will discuss and grade all proposals assigned to the panel, unless conflicted.

After the final science ranking, panels will review "team expertise" descriptions and flag any programs lacking sufficient expertise for possible rejection. Panels will also flag proposals they find to be non-compliant with the anonymous proposing guidelines for possible rejection.

Panels should give extra scrutiny to small (< 35 orbit) joint proposals that request a disproportionately small amount of time from the other facilities.

The Review Process: TAC

The Chairs and Vice Chairs will comprise the members of the TAC. Their primary duty will be to make a final ranking for the roughly 800 orbits that are available to large, treasury, and legacy archival programs. Analogous to the members of the panels, the TAC members will serve as primary and secondary readers on the large, treasury, and legacy archival proposals that were initially sent to their panels. The TAC will have access to all of the proposals prior to the meeting, and should be familiar with the broader set of those submissions.

The proposals discussed by the TAC are selected primarily by the panels; that is, they are the highest-ranked proposals based on the panel reviews. However, the Chairs and Vice Chairs have the option of resurrecting proposals on which they disagreed with their panels, though they have a duty to reflect on the discussion and relative ranking of those proposals. The TAC should consider the panels' rankings of the proposals but is not bound by them. To some degree, the panels will have considered the importance of these large programs within their specific science areas; the TAC should consider their overall importance for astronomy in general.

The TAC will review and then produce a final ranking of the large, treasury, and legacy archival programs. As with the panel process, the TAC should consider the overall science balance, and it will be given access to the "team expertise" descriptions before completing its recommendations to the Director.

Selection Criteria

Evaluations of HST proposals are based on the following criteria.

Criteria for all Proposals

  • The scientific merit of the program and its potential contribution to the advancement of scientific knowledge;
  • The program’s importance to astronomy in general. This should be stated explicitly in the “Scientific Justification” section of the proposal;
  • The strength of the data analysis plan;
  • A demonstration that the unique capabilities of HST are required to achieve the science goals of the program.

Additional Criteria for all GO Proposals

  • Is there a clear rationale for selecting the type and number of targets? Reviewers will be instructed to recommend or reject proposals as they are and to refrain from orbit or object trimming. It is therefore very important to provide a strong justification for both the selection and number of targets in your proposal, as well as for the number of orbits requested.
  • Do the science goals justify the demands made on resources, including the requested number of orbits or targets, and the efficiency with which telescope time will be used?
  • Is the project technically feasible and what is the likelihood of success? Quantitative estimates of the expected results and the needed accuracy of the data must be provided.

Additional Criteria for joint proposals with another facility

  • Are observations with both facilities required to achieve the science goals outlined in the proposal?

Additional Criteria for Large GO, Treasury GO, and Legacy AR Proposals

  • Is there a plan to assemble a coherent database that will be adequate for addressing all of the purposes of the program?
  • Is there evidence that the observational database will be obtained in such a way that it will be useful also for purposes other than the immediate goals of the project?

Additional Criteria for Archival Research Proposals

  • What improvement or addition to scientific knowledge will the project provide relative to the original use of the data? In particular, a strong justification must be given for reanalyzing data if the new project has the same science goals as the original proposal.
  • Do the science goals justify the demands on resources (including funding, technical assistance, feasibility of data requests, archiving and dissemination of products)?
  • Is there a well-developed analysis plan describing how the scientific objectives will be realized?

Additional Criteria for Treasury GO and Legacy AR Proposals

  • What scientific investigations will be enabled by the data products, and what is their importance?
  • What plans are there for timely dissemination of the data products to the community? High-level science products should be made available through the HST data archive or related channels.

Anonymous Proposals

The primary objective of these reviews is to select the best science, not the best-known science teams. The TAC panels and chairs rank proposals in order of scientific merit and recommend the resources that should be allocated to each. The experience of the team, with HST or otherwise, is not a consideration.

Do not spend time attempting to identify the team or the principal investigator.

All accepted proposals are assigned a Program Coordinator who works with the PI to finalize the Phase II submission for feasible observations. MAST provides "science-ready" data for most uses, and help and documentation are available for further data processing. A reviewer's preliminary grading should center on the scientific merit of the proposal, including technical issues in the design of the study as described in the technical justification and elsewhere. Panel discussion should likewise focus on scientific merit; Chairs and Levelers should be quick to refocus or terminate discussion when it turns to the PI or team.


Schedule

  • Distribution of Proposals - End of August
  • Preliminary Grades - Tuesday October 2
  • Distribute Ranked Lists - Thursday October 4
  • Review Orientation - Monday October 8 (7:30PM)
  • Panels - Tuesday and Wednesday October 9 & 10 (9AM - 6:00PM)
  • TAC - Thursday October 11 (8:30AM - 5:00PM)

Conflicts of Interest

Our goal is informed, unbiased discussion of each proposal:

  • Voting committee members should have no direct or indirect vested interest in the outcome of the review.
  • The subset of the review committee discussing the proposal should have sufficient knowledge to assess the science.

It is critically important that conflicts of interest, or the appearance thereof, be avoided during the selection process. For this review we are treating all conflicts as major conflicts: conflicted reviewers will leave the room and take no part in the discussion or grading of those proposals.

  • Personal involvement (PI or Co-I)

    Direct gain from proposal success

  • Recent former advisor/student of PI or Co-I

    Indirect gain from proposal success

  • Involvement in a closely competing proposal, regardless of whether that proposal is also before your panel. Proposals are judged to be 'closely competing' if their scientific goals, or their observations of the same targets, are sufficiently similar that an impartial panel, knowing one had been approved, would be unlikely to approve the other. Such conflicts should be direct and specific; a Panelist or Chair should not be excluded from considering an entire sub-category of proposals.

    Direct gain from proposal failure

  • Close personal ties (family, etc.) with PI or Co-I

    Indirect gain from proposal success

  • Close collaborator of PI or Co-I on the proposal
  • Any other reason for discomfort

Who qualifies as a close collaborator?

  • An active collaborator on a current research program (including Cycle 25 HST proposals).
  • An active co-author on three or more papers in the last three years, i.e. more than a participant in a large project (e.g. SDSS).
  • An active collaborator on several recent programs (at least three projects completed in the last three years).

Key question: Would my personal research benefit (or would there be an appearance of benefit) if this proposal is accepted or rejected? If the answer is yes, then there is a conflict.
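As a thought aid only, the close-collaborator test can be written as a predicate; the field names and input format below are invented for illustration, and the real determination is a judgment call made with your PSS:

```python
# Hypothetical predicate for the close-collaborator test; the 3-paper /
# 3-project / 3-year thresholds come from the guidelines, while the
# dict keys and data source are invented.
def is_close_collaborator(rel):
    """rel: dict describing the reviewer's ties to a PI or Co-I."""
    return bool(
        rel.get("current_program", False)            # incl. Cycle 25 HST proposals
        or rel.get("coauthored_papers_3yr", 0) >= 3  # beyond mere large-project
                                                     # co-membership (e.g. SDSS)
        or rel.get("joint_projects_3yr", 0) >= 3)

print(is_close_collaborator({"coauthored_papers_3yr": 4}))  # True
print(is_close_collaborator({"joint_projects_3yr": 1}))     # False
```

The large-project caveat (e.g. routine SDSS co-authorship) cannot be captured by a simple count, which is why borderline cases should be declared rather than self-adjudicated.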

Step-by-step checklist

  1. Read all of the documentation.
  2. Download your proposal package from the Web-Reviewer Tool as soon as possible.
  3. Check for potential conflicts of interest (before September 7).
  4. Declare any conflicts to your PSS as soon as possible (before September 10).
  5. Review the proposals based upon the Selection Criteria (now through September 28).
  6. Enter comments for proposals where you are assigned as Primary or Secondary reviewer (now through October 2).
  7. Submit your preliminary grades for your review and grading assignments by Tuesday, October 2.
  8. Review the ranked list (October 3 - October 5).
  9. Read any proposals in the first quintile that were not among those you graded (October 3 - October 8).
  10. Inform your PSS of your nominated "saved" proposal by Friday, October 5.
  11. Travel to the review (October 7 & 8).
  12. Review Meeting steps:
    1. Review the final list of proposals for discussion, Monday October 8
    2. Orientation Monday October 8, 7:30pm
    3. Discuss and Grade all proposals before the panel, October 9-10
    4. Rank Proposals October 10
    5. Review the balance of science in the highest ranked proposals, October 10
    6. Review the team expertise for highest ranked proposals, October 10
    7. Finalize Notification Comments October 10
    8. End of Panel Meetings October 10
    9. TAC Panel Meeting begins afternoon of October 11
    10. End of TAC Meeting late afternoon of October 12

This review is organized by the Science Policy Group (SPG) of the Science Mission Office (SMO) of the Space Telescope Science Institute (STScI). As a member of the TAC/Panels for Cycle 26, you are a critical part of the review process. We value your comments and suggestions, and welcome feedback throughout the process.

This page is maintained by Brett S. Blacker

Last Updated: August 31, 2018