NSF
GPRA Strategic Plan
FY 2001 - 2006
APPENDIX 3: ASSESSING NSF's PERFORMANCE

The challenge of performance assessment for NSF is that both the substance and the timing of outcomes from research and education activities are largely unpredictable. NSF staff members do not conduct the research and education projects. They provide support for others to undertake these activities based on proposals for the work to be done, the best information available as to the likely outputs and outcomes, and their knowledge of NSF's outcome goals and the strategies for achieving them. They influence rather than control the outputs and outcomes.

OMB authorized NSF to use alternative format performance goals for NSF's outcomes in research and education. This approach allows expert judgment to consider both quantitative and qualitative information on performance and to weigh that information in a balanced assessment. NSF uses the descriptive performance goals in our management process through a combination of internal self-assessment and review by independent external panels of experts and peers.

For the three outcome goals, NSF performance is successful if the outcomes of NSF investments for a given period of time are judged to have achieved, or to have made significant progress in achieving, the specific performance goals. These assessments are made by independent external panels of experts, who use their collective experience-based norms to determine the level of "significance" necessary for a rating of successful. Assessment of goal achievement by external groups of peers and experts will take into account such factors as (1) identified performance indicators for each performance goal; (2) the extent to which NSF strategies and plans are implemented; (3) the level of resources invested; (4) external events beyond the control of the agency; and (5) the agency's capability to be flexible and respond rapidly to emerging opportunities.

NSF makes use of the following stages in the grant award cycle to assess performance:
Much of this performance assessment is retrospective, addressing investments made at some point in the past. To tie this effectively to current issues in program management, the assessments must also address the quality of the set of awards made in the fiscal year under consideration. The focus of this portfolio assessment is the likelihood that the package of awards will produce strong results in the future. Special emphases within the plans for the fiscal year merit particular attention in the assessment process. NSF staff have control over budget allocations and the decision processes that determine the set of awards. NSF performance goals for investment processes, along with those for management of the agency, are generally quantitative. They refer to processes conducted during the fiscal year that are generally captured in NSF systems.

Data Collection, Verification, and Validation for NSF's Results Goals
Two types of data are used to assess goal performance: (a) non-quantitative output and outcome information, collected and reported using the alternative format, which is used to assess the Outcome Goals and the implementation of the new merit review criteria; and (b) quantitative data collected through NSF systems for the performance target levels of the Investment Process and Management Goals. NSF sources of data include central databases such as the electronic Project Reporting System, the Enterprise Information System, the FastLane system, the Proposal system, the Awards system, the Reviewer System, the Integrated Personnel System, the Finance System, the Online Document System, and the Performance Reporting System; distributed sources such as scientific publications, press releases, independent assessments including Committee of Visitors (COV) and Advisory Committee (AC) reports, program and division annual reports, directorate annual reports, and internally maintained local databases. In a few cases, NSF makes use of externally maintained contractor databases. Through these sources, output indicators such as the following will be available to program staff, third-party evaluators, and advisory committees:
NSF's electronic Project Reporting System permits organized reporting of aggregate information. We anticipate that the reliability of the information in the system will improve over time, as investigators and institutions become comfortable with its use. FY 1999 was the first year of its full implementation; electronic submission of project reports is required in FY 2000. The scientific data from the reporting system will be tested for plausibility as a natural part of the external assessment process. In addition, data from the reporting system will be used to address progress under prior support when investigators reapply to NSF. Thus, investigators have a strong incentive to provide accurate information that reviewers may rely upon.

Issues Specific to NSF: Because it is difficult to predict or quantify research results, or to report them in a timely way, NSF's Outcome goals are expressed in an OMB-approved alternative format. Research results cannot be predicted beforehand, and the time frame for reporting outcomes typically extends well beyond the fiscal year in which an award was made. For example, a grant awarded in one fiscal year might not produce a reportable outcome for five years or more, if at all. While NSF initially used the alternative format with the two standards required by the Act ("successful" and "minimally effective"), it found little to be gained from defining "minimally effective," and in many instances the standard was confusing to evaluators. Therefore, for FY 2000 and beyond, NSF will define one standard only: the "successful" standard. Programs will be evaluated on whether they succeed in achieving the target goals and their impact.

Collection of data for all goals takes place throughout the year and is completed near the end of the fiscal year. Data for each goal are collected by the group responsible for that goal and then organized for reporting. Senior NSF management reviews the data on a continuing basis throughout the year to determine whether the results are as expected or need improvement, and whether the information being obtained is useful to the agency. Data collection systems are also under continual observation and refinement, as in the case of the new FastLane reporting system.

During FY 1999, NSF staff began to implement a Data Quality Project for the quantitative Investment Process and Management goals. This project is currently underway, with first priority placed on the central data systems used to support the performance plan. In addition, NSF staff implemented new guidelines and reporting procedures for collecting data for the qualitative Outcome goals. The Committee of Visitors (COV) guidelines were revised in FY 1999 to incorporate GPRA-related reporting requirements. Reporting templates were developed for the COVs to address the performance of programs in a systematic way that allows information to be aggregated across NSF. COVs address a common set of questions for all programs reviewed in a fiscal year. Reporting guidelines were also developed for Advisory Committees to allow for a systematic aggregation of information. The results of using the new procedures have identified areas for improvement that are being incorporated into the FY 2000 reporting guidelines.
Many of the lessons learned in conducting these assessments have been used in revising the FY 2000 performance goals and this strategic plan.

NSF Program Assessment/Evaluation Schedule
__________________________

1 One-third of NSF programs are assessed annually; assessments take place throughout the fiscal year, and all programs are assessed on a three-year cycle. COVs address the management of programs and the achievement of outcome goals; this information is used by senior management and, in aggregate, for performance reporting.

2 Advisory Committees review directorate activities annually and approve COV reports; assess the contributions of the directorate toward achieving NSF's goals; and provide reports for use by NSF management and in aggregating NSF performance results. Schedule: meet twice annually, with assessment at the end of the fiscal year. Advisory Committees use COV reports as the basis for strategic planning discussions with directorates.

3 NSF-wide programs are evaluated by external contractors to assess program impact; the schedule varies by program. MRI = Major Research Instrumentation program, external contractor review in FY 2000; CAREER = Faculty Early Career Development program, external contractor review being organized in FY 2000 for evaluation in FY 2001; STC = Science and Technology Centers; PFF = Presidential Faculty Fellows; GRF = Graduate Research Fellowships program; GRT = Graduate Research Traineeships program, evaluation completed in FY 2000; IGERT = Integrative Graduate Education and Research Training program, evaluation ongoing in FY 2000.

4 Internal staff meetings review GPRA activities across NSF and make recommendations for GPRA implementation. DPG = Director's Policy Group; GIIC = GPRA Infrastructure Implementation Council; GIIC WG = GPRA Infrastructure Implementation Council Working Group.