NATIONAL SCIENCE BOARD
- DR. F. ALBERT COTTON, Distinguished Professor,
Department of Chemistry, Texas A&M University
- DR. CHARLES E. HESS,* Director of International
Programs, University of California-Davis
- DR. JOHN E. HOPCROFT,* Joseph Silbert Dean of
Engineering, Cornell University
- DR. SHIRLEY M. MALCOM,* Head, Directorate for Education
and Human Resources Programs, American Association for the
Advancement of Science
- DR. JAMES L. POWELL,* President & Director, Los
Angeles Museum of Natural History
- DR. FRANK H.T. RHODES, President Emeritus, Cornell
University
- DR. IAN M. ROSS, President-Emeritus, AT&T Bell
Laboratories
- DR. RICHARD N. ZARE* (Chairman), Professor, Department
of Chemistry, Stanford University
- DR. SANFORD D. GREENBERG, Chairman & CEO of TEI
Industries, Inc.
- DR. EVE L. MENGER, Director, Characterization
Science & Services, Corning Incorporated
- DR. CLAUDIA I. MITCHELL-KERNAN, Vice Chancellor,
Academic Affairs and Dean, Graduate Division, University
of California
- DR. DIANA S. NATALICIO* (Vice Chairman), President,
The University of Texas at El Paso
- DR. ROBERT M. SOLOW, Institute Professor Emeritus,
Massachusetts Institute of Technology
- DR. WARREN M. WASHINGTON, Senior Scientist and
Head, Climate Change Research Section, National Center for
Atmospheric Research
- DR. JOHN A. WHITE, JR., Regents' Professor and
Dean of Engineering, Georgia Institute of Technology
- DR. JOHN A. ARMSTRONG,** IBM Vice President for
Science & Technology (Retired)
- DR. MARY K. GAILLARD,** Professor of Physics,
University of California, Berkeley
- DR. M.R.C. GREENWOOD,** Chancellor, University
of California, Santa Cruz
- DR. STANLEY V. JASKOLSKI,** Vice President, Eaton
Corporation
- DR. EAMON M. KELLY,** President, Tulane University
- DR. JANE LUBCHENCO,** Wayne and Gladys Valley
Professor of Marine Biology and Distinguished Professor
of Zoology, Oregon State University
- DR. VERA RUBIN,** Staff Member (Astronomy), Department
of Terrestrial Magnetism, Carnegie Institution of Washington
- DR. BOB H. SUZUKI,** President, California State
Polytechnic University
- DR. RICHARD TAPIA,** Professor, Department of
Computational & Applied Mathematics, Rice University
- DR. NEAL F. LANE* (Chairman, Executive Committee),
Director, NSF
- DR. MARTA CEHELSKY, Executive Officer
* Member, Executive Committee
** NSB nominee pending U.S. Senate confirmation
Members of the Task Force
National Science Board Members
Dr. Warren M. Washington, Chair
Dr. Shirley M. Malcom
National Science Foundation Staff
Dr. Mary E. Clutter
Dr. John B. Hunt
Mr. Paul J. Herer,
Executive Secretary
NSB/MR-96-15
November 20, 1996
National Science Board and
National Science Foundation Staff
Task Force on Merit Review
Discussion Report
I. Context of the Report
The merit review process is the modus operandi for the evaluation
of proposals at the National Science Foundation (NSF). While
almost all of the 30,000 proposals submitted to NSF annually
undergo external merit review, NSF has the resources to fund
only about one third of them. NSF receives over 170,000 reviews
each year to help evaluate these proposals. Through the use
of merit review, NSF seeks to maintain the high standards
of excellence and accountability for which it is known around
the world.
In 1981, the National Science Board (NSB) adopted four generic
criteria for the selection of research projects: (1)
research performance competence, (2) intrinsic merit of the
research, (3) utility or relevance of the research, and (4)
effect of the research on the infrastructure of science and
engineering. (A detailed description of these criteria may
be found in Appendix A.) Because education programs had been
eliminated from the budget at that time, the 1981 criteria
addressed only research proposals. In the 1980s, they were
adapted to suit education programs as those were reestablished.
Also, since 1981, the portfolio of projects solicited and
supported by NSF has expanded to include, among other things,
broad education initiatives and focused center-based activities.
Further, the NSF Strategic Plan (NSF 95-24) embraces new long-range
goals and core strategies, and the Government Performance
and Results Act (GPRA) emphasizes the importance of linking
NSF’s goals and strategies to the results of its portfolio
of investments in science and engineering. In light of these
changes, an assessment of the appropriateness of the NSB criteria
seems warranted.
At its May 1995 meeting, the NSB stated that re-examining
the criteria in light of the new Strategic Plan was a matter
of high Board interest. Subsequently, an NSF staff task group
on review criteria, formed by the Deputy Director, found that
the criteria are unevenly applied by reviewers and NSF staff
in the proposal review and selection process, and reported
that, "The NSB criteria are in need of clarification and should
be rewritten." The task group also recommended that options
be explored for more effective application of the criteria.
In May 1996, the Board established the NSB-NSF Staff Task
Force on Merit Review, and charged it with examining the Board's
generic review criteria and making recommendations on retaining
or changing them, along with providing guidance on their use.
This paper presents the Task Force’s deliberations and findings.
It is not intended as a final set of recommendations but as
a means of stimulating discussion within and outside of the
Foundation.
II. Task Force Membership and Activities
The Task Force has the following membership:
National Science Board Members
Dr. Warren M. Washington, Chair
Senior Scientist, Climate and
Global Dynamics Division
National Center for Atmospheric
Research
Boulder, Colorado
Dr. Shirley M. Malcom
Head, Directorate for Education and
Human Resources Programs
American Association for the
Advancement of Science
Washington, D.C.
National Science Foundation Staff
Dr. Mary E. Clutter
Assistant Director for Biological Sciences
Dr. John B. Hunt*
Acting Assistant Director for
Mathematical and Physical Sciences
* replaced Dr. William Harris
Executive Secretary
Mr. Paul J. Herer,
Senior Advisor for Planning and
Technology Evaluation
Directorate for Engineering
The Task Force met several times for extensive discussions,
and reviewed a number of previous studies, surveys and reports,
including the following:
- Criteria for the Selection of Research Projects by
the National Science Foundation, adopted by the National
Science Board at its 228th meeting on August 20-21, 1981.
- Federally Funded Research: Decisions for a Decade.
U.S. Congress, Office of Technology Assessment (1991).
- The Track Record of NSF Proposal Review: Reviewers
Rate the Process. NSF Program Evaluation Staff and Science
Resources International (SRI) (1991).
- Peer Review. Reforms Needed to Ensure Fairness in Federal
Agency Grant Selection, United States General Accounting
Office (1994).
- Report of the NIH Committee on Improving Peer Review
(1996).
- NSF Proposal Review Project Reports (1996, by internal
teams):
- Task Group on Review Criteria (P. Stephens,
Chair)
- Task Group on Review Variations (D. Schindel/D.
Chubin)
- Task Group on Calibration and Disaggregated Ratings
(C. Eavey)
III. Current Criteria and Their Use
The four generic criteria established by the NSB in 1981
for the selection of projects are: (1) research performance
competence, (2) intrinsic merit of the research, (3) utility
or relevance of the research, and (4) effect of the research
on the infrastructure of science and engineering. For reference,
the full NSB guidance for these criteria is provided in Appendix
A.
The table below summarizes the results of two surveys and
highlights some of the problems with the current criteria
from two different perspectives.
- A cross-section of reviewers in a 1991 NSF/SRI survey
(first column) considered the first two NSB criteria (intrinsic
merit and PI competence) to be considerably more important
than the last two. Less than half of the respondents said
they usually commented on all four criteria; as many as
20% said they ignored the NSB criteria altogether.
- A 1995 electronic survey of NSF program officers (P.O.)
in 35 divisions on reviewer responsiveness (second column)
revealed that program officers experience difficulty in
obtaining useful input from reviewers with respect to criterion
3 (utility/relevance) and criterion 4 (infrastructure).
Perceived Importance and Usefulness of Current Review
Criteria
Criterion             | 1991 SRI Survey of Reviewers* | 1995 Survey of NSF P.O.**
(1) Competence        | 94%                           | 0%
(2) Intrinsic Merit   | 98%                           | 2%
(3) Utility/Relevance | 56%                           | 31%
(4) Infrastructure    | 26%                           | 46%
* Percent of reviewers who said criterion was "extremely important"
** Percent of program officers expressing difficulty in obtaining
useful input
In addition to these surveys, the NSF Office of Policy Support (OPS)
recently conducted an informal content analysis on a small sample
of reviews of research project proposals to gain an empirical perspective
on how reviewers use the four NSB criteria. By far the criterion most
frequently used by reviewers was research performance competence.
Almost every reviewer commented on some variation of competence. The
intrinsic merit of the proposed research was addressed in about 80%
of the reviews; utility/relevance in about 40%; and infrastructure
in about a third of the reviews. For criterion 4, reviewers referred
to such potential "products" as trained researchers / graduate students,
hardware, and information data bases. The goals and core strategies
in NSF’s strategic plan, such as the integration of education and
research, were rarely mentioned in the reviews.
These studies imply
that there are a number of problems with the current NSB generic criteria,
including:
- Lack of clarity in wording encourages the use of "unwritten"
criteria.
- Reviewers and Program Officers do not apply the current criteria
uniformly (e.g., criteria #3 and #4 are not well understood and
are often ignored).
- The criteria do not easily encompass non-research activities, e.g.,
education and human resources, large-scale facilities, and centers.
- The criteria do not track well with the NSF Strategic Plan.
- Considerable variation exists in the use of the criteria across NSF.
In February 1996, the NSF Staff Task Group on Review Criteria (Chair,
Pamela Stephens) reported that, "The NSB criteria are in need of clarification
and should be rewritten," with consideration given to: (a) making
the criteria clearer to evaluators; (b) emphasizing important attributes
such as innovation, clarity of thought and soundness of approach;
and (c) encouraging substantive comments on the quality of proposals.
The Task Group further recommended that NSF explore more effective
ways to apply the infrastructure criterion and continue the
practice of allowing programs to employ additional specific criteria
as needed. The staff Task Group suggested a number of interrelated
components that contribute to the evaluation of a proposal’s overall
merit, including: Intrinsic Merit, Significance, Innovation, Approach,
Feasibility, and Effect on Infrastructure. This served as a starting
point for the NSB-NSF Task Force.
IV. Revised Generic Merit Review
Criteria
The Task Force recommends the two generic criteria (below)
to replace the current four NSB criteria. Within each criterion is
a set of contextual elements, defined by questions to assist the reviewer
in understanding their intent. These elements are not all-inclusive;
i.e., it is recognized that, for some programs, other considerations not
identified below may be important for the evaluation of proposals.
Further, reviewers are requested to address only those elements that
they consider relevant to the proposal at hand and that they feel
qualified to make judgments on.
#1 What is the intellectual merit and quality of the proposed activity?
The following
are suggested questions to consider in assessing how well the proposal
meets the criterion: What is the likelihood that the project will
significantly advance the knowledge base within and/or across different
fields? Does the proposed activity suggest and explore new lines of
inquiry? To what degree does the proposer's documented expertise and
record of achievement increase the probability of success? Is the
project conceptually well designed? Is the plan for organizing and
managing the project credible and well conceived? And, is there sufficient
access to resources?
#2 What are the broader impacts of the
proposed activity?
The following are suggested questions to
consider in assessing how well the proposal meets the criterion: How
well does the activity advance discovery and understanding while concurrently
promoting teaching, training, and learning? Will it create/enhance
facilities, instrumentation, information bases, networks, partnerships,
and/or other infrastructure? How well does the activity broaden the
diversity of participants? Does the activity enhance scientific and
technological literacy? And, what is the potential impact on meeting
societal needs? The NSB-NSF Task Force believes that the proposed
new criteria offer several advantages over the existing criteria,
such as:
- NSF is increasingly asked to connect its investments to societal
value, while preserving the ability of the merit review system
to select excellence within a portfolio that is rich and diverse.
Having two criteria, one for intellectual quality and the other
for societal impact, should serve to reveal the situations where
proposals have high quality but minimal potential impact (and
vice-versa). Quality will continue to be the threshold criterion,
but will come to be seen as not sufficient by itself for making
an award.
- The two new criteria are more clearly related to the goals
and strategies in the NSF Strategic Plan. For example, NSF
in a Changing World states (page 31) that: "We rely on our
proven system of merit review, which weighs each proposal’s technical
merit, creativity, educational impact, and its potential benefits
to society."
- The criteria are simplified by reducing their number from four
to two, and are defined for reviewers and proposers by a set of
suggested contextual elements. Reviewers are asked to describe
the proposal's "strengths and weaknesses" with respect to each
criterion using only those contextual elements that they consider
relevant to the proposal at hand.
V. Application of the Proposed Generic Criteria
The Task Force
was charged not only with examining the Board's generic review criteria
but also recommending accompanying guidance on their use. There are
a number of important "process" issues that help to frame this guidance.
Because of the great range and diversity of activities supported by
NSF, it is evident that maintaining flexibility in the application
of criteria is as important as the criteria themselves. Most reviewers
will only address those elements that they feel they are capable of
judging. Asking proposers and reviewers to address all of the contextual
elements in each and every proposal, regardless of the nature of the
proposed activity, is not only unrealistic but, in fact, may be counterproductive.
Also, pre-assigning weights to the criteria will, if applied to all
proposals, incorrectly appraise some of them.
It is important to take into account the relative roles of
the external expert reviewers and
the NSF program staff. Specifically, NSF proposals are evaluated by
the Program Officer and other NSF staff with the help of the written
reviews from expert peers. These external reviews are always advisory;
the final funding decision rests with the NSF staff. Hence, while
the external reviewer applies the review criteria to the individual
proposal, the Program Officer must evaluate the proposal within the
context of managing a balanced portfolio of projects that will achieve
the program’s objectives and contribute to NSF’s overall mission.
In particular, reviewer assessment of criterion #2 (potential impact
and societal value) is intended to provide NSF with input from reviewers,
but the ultimate responsibility for judging the potential impact of
the investment of public funds must rest with NSF. Hence, the Task
Force recommends that the NSF staff be provided flexibility and discretion
in the application and weighting of criteria.
The use of special criteria
NSF supports an extremely diverse set of activities
ranging from individual investigator projects to teacher training
to large research facilities. Many of these activities have special
objectives and require proposals that are responsive to them. Program
solicitations and announcements are frequently used to solicit proposals
from the community, and, in some cases, the NSB generic criteria are
modified or augmented to make the review process responsive to the
special objectives. For example, the CISE Minority Institutions Infrastructure
Program Announcement (NSF 96-15) lists nine additional factors that
will be used to evaluate the proposals, including such factors as:
(1) institutional cost-sharing, commitment, and related support to
the projects, and (2) institutional track record in graduating minority
scientists and engineers. The EHR/CISE Networking Infrastructure for
Education Program Solicitation (NSF 96-13) adds six additional criteria,
including: "Sustainability: The potential to leverage the ability
of the education community to carry out full scale, self-sustaining
and scaleable educational networking models." In other cases, a set
of criteria is provided in lieu of the NSB generic criteria. For
example, the Academic Research Infrastructure (ARI) Program (NSF 96-12)
specifies the following criteria headings: "Research and Research
Training Merit; Infrastructure Need; Project Impacts; and Plans &
Funding." Under the latter category, "the institutional management
plan for maintenance and operation of the requested facility" is cited.
Revising the NSB generic criteria will lessen but not eliminate the
need for special criteria. However, it is important that the additional
or replacement criteria be consistent with the intent and spirit of
the NSB generic criteria. Since each new program announcement or solicitation
receives considerable NSF internal review before it is issued, it
is appropriate that this be considered during the publication’s clearance
process.
Options for rating proposals
Whatever the
criteria, reviewers and panelists must be encouraged to provide substantive
comments on proposals, not merely "check boxes" to satisfy some proposal
rating scheme. Moreover, NSF should not impose a rigid system of multiple
criteria and sub-criteria, each with a separate score. The end result
is often a review with too much weight given to less significant aspects
of the proposal. In terms of adjectival proposal ratings and numerical
scoring, the Task Force extensively discussed the pros and cons of
several options, including the following:
1. No ratings or scores.
The reviewer comments on the proposal's strengths and weaknesses,
then provides a summary narrative statement.
Pros:
- Encourages more substantive reviewer comments while avoiding
"box checking."
- Avoids dependence on "uncalibrated" scores.
- Results in fewer NSF staff callbacks to reviewers to clarify
ratings and reconcile comments with ratings.
- Encourages reviewer to give equal attention to both criteria.
- Makes it easier for program officer to go against the "collective
wisdom"; i.e., to recommend "high risk" proposals that may not
be as highly rated as some "low risk" proposals.
Cons:
- More difficult to "bin" proposals (i.e., into categories such
as those that definitely should be funded, those that might be
funded, and those that definitely should not be funded.)
- More difficult to evaluate the effectiveness and fairness of
the merit review system (i.e., cannot compare ratings scores with
proposal decisions).
- Introduces more subjectivity into the review process because
of difficulty in interpreting the narrative statement alone.
2. Separate rating for each of the two criteria.
Pros:
- Sends message to community that both criteria are important.
- NSF program staff has flexibility to determine relative application
(weighting) of the two criteria to the funding decision.
- Provides program officers with better information for making
funding decisions and can provide more precise feedback to applicants.
- Eliminates mere averaging of ratings as a means of ranking
proposals.
- Ends semantic arguments about whether a proposal is, e.g.,
"excellent" or merely "outstanding", or somewhere in between.
Cons:
- May complicate the ranking of proposals in the panel review
process and lead to proposal rankings that do not reflect consensus.
- May encourage even greater degree of "box checking" in place
of substantive comments; i.e., could result in shorter and less
detailed written comments.
3. Single composite rating (for the two criteria).
Pros:
- Simplest to understand and use.
- Easy to relate proposal ratings to proposal decisions.
Cons:
- Reviewers will implicitly weigh each criterion and may not give
much attention to criterion #2 in assigning the overall rating.
- Encourages "box checking" rather than substantive comments.
- Scores may be arbitrary or uncalibrated (i.e., too lenient or
too strict).
In order to determine the most effective rating scheme (i.e.,
one that optimizes rationality, excellence, and fairness), the Task
Force encourages the Foundation to experiment with various options.
In designing these experiments, NSF should be fully cognizant of recent
NIH efforts to redesign its peer review system.
NSF instructions
and guidance to reviewers are very important. The system will
be improved only if reviewers use the criteria when evaluating
proposals. Thus, whatever criteria the NSB decides upon, they must
be formatted for maximal use. This means redesigning the review form
and the Grant Proposal Guide so that both PIs and reviewers
understand what is to be evaluated. In fact, it may be advisable to
design different review forms for different classes of proposals;
for example for investigator initiated research proposals, for large
facility proposals, for systemic education reform projects, etc.
In
order to illustrate how the new criteria might be presented to the
merit reviewer, a sample draft NSF Proposal Review Form is
provided in Appendix B. While option #2 (i.e., a rating for
each criterion) is used in this case for illustration purposes,
this does not imply that it is the recommendation of the Task Force.
A draft one-page synopsis of NSF's strategic plan, NSF in a Changing
World, is also provided in Appendix C. This plan provides
a context for shaping the Foundation's future through a set of principles,
goals, and core strategies that are aimed at developing a greater
sense of interdependence between the research and education communities
and the public. While a one-to-one mapping of the generic review criteria
to the NSF strategic plan is not necessary, the Task Force believes
that outside expert reviewers should be exposed to at least a summary
of the strategic plan. This may be accomplished by attaching the synopsis
to the proposal review form.
The new criteria imply that changes to
NSF's guidelines for preparing proposals are needed. This should be
carefully looked at by NSF management. At the very least changes will
have to be made in the Grant Proposal Guide. Additionally,
in all NSF program solicitations and announcements, NSF should carefully
explain the full set of criteria that will be used to evaluate the
proposal, including those related to the program’s investment portfolio.
VI. Future Action
On October 17, 1996, the National Science
Board approved the release of the Task Force Discussion Report, subject
to final clearance by the Executive Committee, not as NSB policy,
but as a proposal for broader discussion inside and outside of the
Foundation. Specifically, the Director, NSF, is authorized to: "share
the report with the Nation’s research and education community for
comment, for the purpose of informing the Task Force on Merit Review."
The NSB also requested the Task Force to provide its recommendations
at the March 1997 Meeting of the National Science Board, with respect
to the nature and content of the new general criteria for review of
proposals submitted to NSF (see Appendix D).
Note: To encourage
the broadest possible comment and discussion, NSF has posted a summary
of this document along with a comparison of the current and proposed
merit review criteria on its homepage (http://www.nsf.gov). Most important,
there is a response box for you to provide the agency with your feedback
electronically. NSF wants to hear your views and specific suggestions
on this report.
APPENDICES
Appendix A: Current Criteria
Appendix B: Sample NSF Proposal Review Form
Appendix C: Synopsis of NSF Strategic Plan
Appendix D: Resolution Approved by the National Science Board at
Its 339th Meeting, on October 17, 1996
APPENDIX A
Current Criteria (adopted in 1981)
- Research performance competence -- relates to the capability
of the investigators, the technical soundness of the proposed
approach, and the adequacy of the institutional resources available.
- Intrinsic merit of the research -- the likelihood that
the research will lead to new discoveries or fundamental advances
within its field of science or engineering, or have substantial
impact on progress in that field or in other science and
engineering fields.
- Utility or relevance of the research -- the likelihood
that the research can contribute to the achievement of a goal
that is extrinsic or in addition to that of the research itself,
and thereby serve as the basis for new or improved technology
or assist in the solution of societal problems.
- Effect on the infrastructure of science and engineering
-- the potential of the proposed research to contribute to better
understanding or improvement of the quality, distribution, or
effectiveness of the nation’s scientific and engineering research,
education, and manpower base.
APPENDIX B
Sample NSF Proposal Review Form
Proposal No. | Institution | Principal Investigator

Please evaluate this proposal according to the two NSB
criteria, which are explained on the reverse side of this
form. While ratings are requested for each criterion, your
substantive written comments on the proposal's strengths
and weaknesses are critical to the evaluation. (Continue
on additional sheet(s) if necessary)

Criterion 1: What is the intellectual merit and quality
of the proposed activity? (Provide detailed comments)

Rating:
( ) Excellent ( ) Very Good ( ) Good ( ) Fair ( ) Poor

Criterion 2: What are the broader impacts of the proposed
activity? (Provide detailed comments)

Rating:
( ) Excellent ( ) Very Good ( ) Good ( ) Fair ( ) Poor

Reviewer's Name/Address/E-mail/Phone/Fax (Typed):

Other Suggested Reviewers (Optional):

Reviewer's Signature and Date
Important! Please Read Before Reviewing Proposal!
In evaluating this proposal, you are requested to provide
detailed comments for each of two merit review criteria.
Following each criterion is a suggested set of questions
to consider in assessing how well the proposal meets that
criterion. Please address only those questions that you
believe are relevant to this particular proposal. If appropriate,
please include comments on the quality of the prior work
described in the "Results from Prior NSF Support" section.
1. What is the intellectual merit and quality of the
proposed activity?
The following are suggested questions
to consider in assessing how well the proposal meets the
criterion: What is the likelihood that the project will
significantly advance the knowledge base within and/or
across different fields? Does the proposed activity suggest
and explore new lines of inquiry? To what degree does
the proposer's documented expertise and record of achievement
increase the probability of success? Is the project conceptually
well designed? Is the plan for organizing and managing
the project credible and well conceived? And, is there
sufficient access to resources?
2. What are the broader
impacts of the proposed activity?
The following are
suggested questions to consider in assessing how well
the proposal meets the criterion: How well does the activity
advance discovery and understanding while concurrently
promoting teaching, training, and learning? Will it create/enhance
facilities, instrumentation, information bases, networks,
partnerships, and/or other infrastructure? How well does
the activity broaden the diversity of participants? Does
the activity enhance scientific and technological literacy?
And, what is the potential impact on meeting societal
needs?
Conflict of Interests
If you have an affiliation or financial
connection with the institution or the person submitting
this proposal that might be construed as a conflict of interests,
please describe those affiliations or interests on a separate
page attached to your review. Unless you believe you cannot
be objective, we would like to have your review. If you do
not attach a statement we shall assume that you have no
conflicting affiliations or interests.
Confidentiality of Proposals and
Peer Reviews
The Foundation receives proposals in
confidence and is responsible for protecting the confidentiality
of their contents. In addition, the identity of reviewers
will be kept confidential to the maximum extent possible.
For this reason, please do not copy, quote, or otherwise
use material from this proposal. If you believe that a
colleague can make a substantial contribution to the review,
please consult the NSF Program Officer before disclosing
either the contents of the proposal or the applicant's
name. When you have completed your review, please destroy
the proposal.
Privacy Act and Public Burden Statements
The information requested on this reviewer
form is solicited under the authority of the National
Science Foundation Act of 1950, as amended. It will be
used in connection with the selection of qualified proposals
and may be disclosed to qualified reviewers and staff
assistants as part of the review process and to other
Government agencies needing names of potential reviewers.
See Systems of Records, NSF-50, "Principal Investigator/Proposal
File and Associated Records," and NSF-51, "Reviewer/Proposals
File and Associated Records," 56 Federal Register 54907
(October 23, 1991). It is the policy of the Foundation
that reviews, and reviewers' identities, will not be disclosed
to persons outside the Government, except that verbatim
copies of reviews, without the name and affiliation of
the reviewer, will be sent to the principal investigator.
The Foundation considers review and reviewer identities
to be exempt from disclosure under the Freedom of Information
Act (5 USC 552) but cannot guarantee that it will not
be forced to release them under FOIA, Privacy Act, or
other laws. Submission of the requested information is
voluntary.
Public reporting burden
for this collection of information is estimated to average
5 hours per response, including the time for reviewing
instructions. Send comments regarding this burden estimate
or any other aspect of this collection of information,
including suggestions for reducing this burden, to:
Herman G. Fleming
Division of Human Resource Management
National Science Foundation
Washington, DC 20550
and to:
Office of Management and Budget
Paperwork Reduction Project (3145-0058)
Reports Clearance Officer
Washington, DC 20503
APPENDIX C
Synopsis of NSF Strategic Plan
NSF in a Changing World (NSF 95-24)
In 1995, the National Science Foundation issued its strategic plan,
NSF in a Changing World, which reiterated the Foundation’s
mission and established its strategic goals. The National Science
Foundation Act of 1950 (Public Law 81- 507) set forth NSF’s mission
and purpose.
To promote the progress of science; to advance the
national health, prosperity, and welfare; to secure the national
defense...
As described in NSF in a Changing World, the National Science Foundation
has three long-range goals:
- Enable the U.S. to uphold a position of world leadership
in all aspects of science, mathematics, and engineering. This
grows from the conviction that a position of world leadership
in science, mathematics, and engineering provides the Nation with
the broadest range of options in determining the course of our
economic future and our national security.
- Promote the discovery, integration, dissemination, and employment
of new knowledge in service to society. This goal emphasizes
the connection between world leadership in science and engineering
on the one hand and contributions in the national interest on
the other.
- Achieve excellence in U.S. science, mathematics, engineering,
and technology education at all levels. This goal is worthy
in its own right, and also recognizes that the first two goals
can be met only by providing educational excellence. It requires
attention to needs at every level of schooling and access to science,
mathematics, engineering, and technology educational opportunities
for every member of society.
To move toward the achievement of these goals, NSF employs a set of
core strategies. These strategies reaffirm the Foundation’s traditions,
especially its reliance on merit review of investigator-initiated
proposals, yet at the same time point to new directions for the Foundation.
- Develop intellectual capital. Selecting the best ideas
in research and education and the most capable people to carry
them out is at the heart of NSF’s programmatic activities and
the merit review system with which we implement those programs.
Opening opportunities for all Americans to participate fully in
an increasingly technological society is an essential part of
NSF’s mission.
- Strengthen the physical infrastructure. NSF’s programs
support investments in new windows on the universe, through facilities
planning and modernization, instrument acquisition, design and
development, and shared-use research platforms.
- Integrate research and education. NSF aims to infuse
education with the joy of discovery and to bring an awareness
of the needs of the learning process to research, creating a rich
environment for both.
- Promote partnerships. For NSF, success requires collaboration
with many different partners, including universities, industry,
elementary and secondary schools, other Federal agencies, state
and local governments, and other institutions. We also carry our
partnerships across national boundaries.
The Foundation’s general goals and strategies are translated into
a diverse portfolio of activities, which often embody more than one
strategy and contribute to more than one goal. In turn, NSF’s efforts
interact with those of other Federal agencies, state and local governments,
school districts, schools, and partners in the private sector to produce
progress toward the three goals. NSF does not itself conduct research
or educate students. Instead, it invests the Nation’s resources in
a portfolio of projects and activities performed by universities,
schools, nonprofit institutions, and small businesses. NSF balances
its investments among three broad program functions: research projects,
facilities, and education and training.
APPENDIX D
NSB-96-182
October 17, 1996
RESOLUTION APPROVED BY THE NATIONAL SCIENCE BOARD
AT ITS 339TH MEETING, ON OCTOBER 17, 1996
WHEREAS, competitive merit review, with peer evaluation, is the National
Science Foundation’s accepted method for informing its proposal decision
processes;
WHEREAS, the Board requested that the general review criteria
adopted by the Board in 1981 be re-examined in light of the Strategic
Plan entitled NSF in a Changing World, as approved by the Board
in October 1994;
WHEREAS, a joint Task Force of Board members and
Foundation staff, having reviewed a number of studies, surveys and
reports and engaged in extensive discussions of criteria and related
matters, has produced a report containing proposed new general criteria
for the review of NSF proposals;
WHEREAS, NSF works in partnership
with the Nation’s research and education community in all its endeavors;
Now therefore be it RESOLVED, that the National Science Board:
- RECEIVES the report of its Task Force on Merit Review containing
proposed new general criteria for review of proposals submitted
to NSF;
- AUTHORIZES the Director, NSF, to share the report with the Nation’s
research and education community for comment, for the purpose
of informing the Task Force on Merit Review;
- AND ASKS the Task Force on Merit Review to provide its recommendations
  at the March 1997 Meeting of the National Science Board, with respect
  to the nature and content of any such criteria.