This website contains archived materials provided for historical reference purposes only.
The content and links are no longer maintained and may be outdated.
ARCHIVED - Minutes of the Meeting of the Peer Review Oversight Group (PROG)
June 21, 1999
Conference Room 6, Building 31C, NIH, Bethesda, MD

Welcome and Introduction

Dr. Wendy Baldwin welcomed the PROG members and noted that the meeting was being televised across NIH via MBONE and RealVideo. She also welcomed members of the science press. Referring to the discussion at the last PROG meeting of consumer representation in peer review, Dr. Baldwin reported that the first meeting of the NIH Director's Council of Public Representatives (COPR) had been held (http://www.nih.gov/about/publicliaison/get-involved/copr/copr042199.html). Dr. Varmus intends that there will be a COPR representative to PROG, and Dr. Baldwin will review the list of COPR representatives and Associates to identify potential nominees for PROG.

Dr. Baldwin mentioned recent important issues that might become good topics for future PROG meetings. She noted that recent bioengineering initiatives (program announcements for Bioengineering Research Grants [http://grants.nih.gov/grants/guide/pa-files/PAR-99-009.html] and Bioengineering Research Partnerships [http://grants.nih.gov/grants/guide/pa-files/PAS-99-010.html]) received good responses from the community. CSR is conducting the reviews, but concerns have arisen about the review of non-hypothesis-driven research applications. Receipt of these applications also included a pilot study in which applications were scanned for rapid distribution instead of being printed and distributed normally. (This experiment is described in greater detail later in these minutes.) In addition, Dr. Baldwin mentioned a recent review issue with significant conflict-of-interest implications: when a solicitation elicits a number of applications identifying clinical trial networks, essentially the entire field in a given area of science may be included among the applications submitted, raising the question of who is left in the scientific community with the expertise to review them. Finally, Dr. Baldwin mentioned a report on biomedical computing presented at the last meeting of the Advisory Committee to the Director. She suggested that members review the report, which is available on the NIH Home Page at http://www.nih.gov/about/director/060399in.htm. This area may also present peer review challenges.

Electronic Research Administration

In introducing the first agenda topic, Dr. Baldwin indicated that the process of NIH doing business electronically is an evolving one. Part One began with simple postings of information; Part Two will now introduce electronic processing. The primary challenge has been to consider the capabilities of our user communities carefully before implementation. Several important lessons have been learned from the experiences of others around the world who implemented at too early a stage, when capacity can easily be overrun.

Dr. George Stone began his presentation by noting that our clients are both the extramural community and the NIH staff. The problem of the diversity of capabilities from institution to institution is being addressed through the concept of a "Commons." In this way, NIH databases are shielded from direct access for security purposes, and a common face is presented to the community that can accommodate a broad range of organizations. He further noted that important considerations in the design and implementation have included the community's desire for local control to the degree possible, controlled deployment that includes user support and outreach, and consistency in data standards.

To preserve flexibility, a very important element is to avoid, to the degree possible, the need for proprietary software. Consequently, several options are offered: interactive Web, computer-to-computer data streams, and interactive form-filling (Adobe Exchange). Local control includes on-line registration to establish a grantee account; the account administrator is then able to create secondary accounts, change passwords, modify permissions, etc. The quality and timeliness of information is a major concern, and the CRISP interface, deployed to the public in July 1998, is a prime example of how this concern has been addressed. The Commons concept also seeks to minimize data entry requirements. This is accomplished by establishing organizational profile information that is stored and retained, and is available to the institution for updating. Timely access to critical grant application-related information is provided by the Commons Status interface. This interface includes information on the receipt of applications, review dates, access to summary statements and scores, and availability of notices of grant award, and is updated daily.

Deployment of the Commons began in 1997 through a cooperative agreement with the Department of Energy which included 10 grantee organizations. In 1998-1999, a pilot with the Federal Demonstration Partnership (FDP) allowed access by 65 grantee institutions and involved 14 agencies. In 1999-2000, a pre-production pilot will include 100-150 organizations, and in 2000, full production and open registration are planned. Outreach has been through a number of venues, including meetings of FDP, NCURA and SRA, and regional seminars. The importance of a fully functional Help Desk was stressed. Future developments will include access to fellowship applications and Type 5s for complex awards, XML-formatted files, and implementation of electronic signatures.

In response to a question, Dr. Stone stressed that training programs for IC staff are an important priority. The issue of how reviewers will receive applications in the future was raised. Dr. Baldwin commented that the issue is a complex one, noting that the attractiveness of hard copy relates to portability and to the ability to highlight, mark up, and make notations in margins. Dr. McGowan indicated that NIAID has been employing paperless contract reviews, albeit entailing the review of only a small number of proposals at a time; NIAID is also planning a pilot in which applications are loaded onto small notebook computers and sent to reviewers. In response to a question of how electronic research administration saves time in the receipt-to-award process, Dr. Baldwin noted that an afternoon presentation would address that issue. In response to a question regarding trans-agency coordination, Dr. Stone indicated that the development of a new system termed The Federal Commons is seen by all engaged in this process as the next generation of Electronic Grants Administration.

Customer Satisfaction Survey Report

Dr. Baldwin indicated that the genesis of this survey came at a time when NIH paylines were not very good, and there was a great deal of concern for what was happening to investigators who applied for grant support but were unsuccessful. Dr. Georgine Pion, technical advisor for this project, summarized the major findings of the survey. She indicated that in October 1997, a stratified random sample of 2,694 individuals who had applied to the NIH for an R01 or R29 in FY 1994 was surveyed to assess their satisfaction with the NIH grant application and review process. Approximately 85% returned the questionnaire. Not surprisingly, individuals' satisfaction was linked to whether or not they had been successful in obtaining support: 12% of funded applicants voiced dissatisfaction, whereas 37% of unfunded applicants did. Satisfaction levels did not differ between biomedical and behavioral scientists, between Ph.D. and M.D. applicants, or between first-time and previous applicants. The three most frequently mentioned areas for improvement were streamlining the application process, reducing the time for learning an application's outcome, and improving the quality of reviews.

In response to questions regarding factors hindering individuals' research progress "a great deal," the most often mentioned were the time spent on preparing applications and lack of funds for collecting pilot data. Also often mentioned as adversely affecting research were lack of qualified research staff, competing work demands, and shortages of capable graduate students. After an unfavorable funding decision, only 57% of individuals contacted NIH staff to discuss the decision. Those who did not either expected that it would not be helpful, were unaware of this option, or did not know whom to contact. Those who contacted NIH staff were much more likely to resubmit (74% vs. 26%) and subsequently receive NIH research support by FY 1997 (74% vs. 50%).

The results also showed that the large majority of applicants, regardless of NIH research funding, were involved in research. In addition to the 55% of respondents who were PIs on NIH research grants, another 24% were PIs on grants awarded by other sponsors; and while not PIs themselves, an additional 9% were collaborators or co-investigators on funded research projects. Thus, nearly 90% of respondents reported being currently involved at some level in externally funded research.

Dr. Pion indicated that 30 Federal agencies are conducting consumer satisfaction surveys, and that the data from the NIH survey will be useful in serving as a baseline for future customer satisfaction surveys by the NIH. A member of PROG wondered if similar satisfaction surveys have been done in other professions.

Should There Be Shorter Page Limits for the PHS 398?

Dr. Howard Schachman presented the conclusions of a small group of PROG members who considered this issue during the time since the last PROG meeting. Many suggestions arose from this group's discussions: (1) PHS 398 should be revised to include current review criteria; (2) reviews should be shortened to 1-2 pages; (3) the purpose of review is not to tutor the applicants; (4) manuscripts submitted for publication but not yet accepted or published should not be allowed as part of the appendix; (5) supplementary information from the applicant should be limited, and submitted in time for adequate distribution to and consideration by reviewers; and (6) FAQs should be developed to address ways that the community and NIH staff can provide assistance to applicants in learning how to prepare an application.

The issue of shortening page limits, however, was not clearly resolved. It was noted that it is impractical to vary page limits by the type of application (e.g., clinical, first-time). One suggestion was to link page limits to the amount of money requested. Another suggestion was that competing continuation applications should be shorter and focus on retrospective accomplishments. The need for more than two reviewers per application was noted, as was the difficulty of doing so while retaining the current page limits. Dr. McGowan noted that NIAID currently has a 10-page limit for its innovation program applications, but too many other types of applications need the full 25 pages, making an across-the-board reduction in page limits impractical.

One member commented that the change in review criteria clearly addresses the quality of review, but that reducing page limits is likely to be seen by the community simply as a "bureaucratic convenience." Another member suggested that applicants be directed not to describe methodologies that are commonly accepted and confirmed in their scientific area. An alternative to reducing page limits was suggested: focusing applicants and reviewers on the review criteria, indicating that there is no necessity in all cases to use all 25 allotted pages, and requesting reviewers to shorten critiques. Perhaps better instructions to reviewers are in order. Also suggested was that shortening applications is likely to feed the perception that experienced applicants have an advantage in the system.

Dr. Baldwin stated that it was unlikely that any decisions could be reached at this meeting, given the diversity of thoughts and opinions. She suggested expanding the PROG working group and having that group develop a plan for how to package and implement the various recommendations. Perhaps important data to obtain would be the current lengths of critiques and the extent to which more than two primary reviewers and one reader are assigned to each application across NIH peer review groups. Any resultant plan would be posted on the Web for public comment.

Modular Grant Applications

Dr. Ronald Geller reported on the expansion of the use of modular applications at NIH. He indicated that the goals of the initiative are to re-emphasize that the grant is an assistance mechanism and to simplify the process, especially by disengaging from complex budget negotiations. To ensure some degree of uniformity, three mandatory training sessions have been held for IC staff. The FDP has been a sounding board throughout the development of this process, and presentations have been made to organizations such as NCURA and SRA, both for their education and to obtain feedback. In addition, a Web site has been established to keep the community up to date (http://grants.nih.gov/grants/funding/modular/modular.htm).

The modular process incorporates "just-in-time" procedures as well. At the outset, the process was piloted through RFAs, mostly in NHLBI. As of June 1, the modular process is the standard for RFAs and for all R01, R03, R15, R21, and R41/43 applications requesting no more than $250,000 in direct costs in any given year. Modules are requested in $25,000 increments, but final funding by the IC need not be in set increments. There are no formula-based future year escalations, although the number of modules can vary from year to year. In that case, the variation must be explained in the budget narrative.
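As a rough illustration of the budgeting arithmetic described above (a sketch only, not an official NIH tool; the function name and error handling are assumptions for the example), a yearly direct-cost estimate is rounded up to the next $25,000 module, subject to the $250,000 cap:

```python
MODULE_SIZE = 25_000    # direct costs are requested in $25,000 modules
MODULAR_CAP = 250_000   # modular process applies up to this annual direct cost

def modules_requested(direct_costs: int) -> int:
    """Round a yearly direct-cost estimate up to the next $25,000 module."""
    if direct_costs > MODULAR_CAP:
        raise ValueError("request exceeds the modular cap; a detailed budget applies")
    # Ceiling division: e.g., an estimate of $180,001 requires 8 modules ($200,000)
    return -(-direct_costs // MODULE_SIZE)

print(modules_requested(180_001))  # -> 8
print(modules_requested(180_001) * MODULE_SIZE)  # -> 200000
```

Note that, as the minutes state, final funding by the IC need not be in set increments; the rounding applies only to the amount requested.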

Dr. Geller noted that at least one university has developed internal processes for guiding their PIs in preparing modular applications. This first year of full implementation will provide an opportunity for comment from the community. Afterwards, there will be a formal two-year assessment of the process.

CSR Update

Dr. Ellie Ehrenfeld discussed how the major increase in NIH's budget this past year has had an impact on CSR: the generation of new initiatives by the ICs elicited more applications to be reviewed, and many required the formation of new and complex review groups. Because of the stresses of having to respond rapidly to the additional workload, Dr. Ehrenfeld indicated that CSR would benefit from more flexibility in the NIH budget process. She indicated that many new study sections have been established in the neurosciences and behavioral sciences. Additional initiatives have included the establishment of two new clinical research review groups and several biotechnology review groups. For purposes of monitoring and guidance, additional IRG external advisory groups are being formed, among other initiatives to enhance study section operations.

Dr. Ehrenfeld acknowledged that there are a number of operational problems, for example, in summary statement production, especially in those IRGs that were extremely short-staffed. CSR has been recruiting heavily; 30 new SRAs have been hired since October.

Dr. Ehrenfeld reported briefly on the progress of the Boundaries Panel which was convened approximately a year ago to look broadly at the organization of CSR review groups. Phase One, the development of the clusters for scientific review groups and establishment of "cultural norms" for review groups, is nearing completion. Phase Two, the creation of the actual study sections within the clusters, will begin in the fall. A draft of the Phase One report is available on the Web for public comment at: http://www.csr.nih.gov/events/boundaries.htm

Dr. Ehrenfeld asked PROG's help in settling an issue that arose in the Boundaries Panel's deliberations. She explained that there was discussion about the appropriateness of the term "Initial Review Group." Dr. Ehrenfeld noted that the term arose in the 1960s as the equivalent of "study section." However, at the time of the establishment of clusters of study sections in 1993-1994 in response to the Executive Order to reduce the number of public advisory groups, the term became applied to the clusters. The Boundaries Panel suggested renaming the clusters "Review Group Clusters" (RGCs), or keeping the acronym "IRG" but defining it as "Integrated Review Groups." PROG endorsed the latter.

One member noted that the integration and establishment of neuroscience and behavioral science study sections went very well. Another member suggested that Dr. Ehrenfeld introduce the new SRAs as a group to the community, and that introducing them at PROG might be appropriate. Finally, Dr. Ehrenfeld noted that the timing between the completion of integration planning and the actual establishment of study sections needs to be reconsidered. It is impractical to be presented with a plan in February and expect to have study sections established and operating in June.

Accessibility of Research Data

Dr. Baldwin noted that a status report on this issue (i.e., the availability of research data under FOIA: http://grants.nih.gov/grants/policy/a110/a110implications.htm; http://grants.nih.gov/grants/policy/a110/a110.htm) was given to PROG at its last meeting. She reiterated that while the NIH is supportive of data sharing, FOIA is a flawed tool with which to accomplish this. A current bill (Walsh-Price) that is intended to amend the Shelby bill appears to be stalled. Currently, OMB is reviewing the responses (over 10,000) to its NPRM; OMB understands the sensitivity of this issue to the research community. Questions still unanswered include what can be classified as "data" and what is meant by "published."

Dr. Baldwin emphasized that it is really incumbent on the research community to address the issue of how data are shared, and that a good-faith effort of the scientific community is required on this point. Consequently, Dr. Baldwin suggested the formation of a PROG subgroup to develop plans and options. She stressed that, absent firm guidance from the community, the task of evaluating mechanisms for data sharing will likely fall to peer review on a project-by-project basis. This task would be all the more difficult because the types of data differ from discipline to discipline, and even by mechanism (e.g., SBIRs).

Among the comments made by PROG members was that restricting data that fall under the current law only to those data used for regulation making may not satisfy the intent of the law. Another comment was that epidemiologic data are probably the most problematic. It was acknowledged, however, that there are certain types of data-sharing procedures that peer reviewers can assess. Finally, Dr. Baldwin recommended establishing a PROG subgroup, to include Dr. Linda Martin, to address this issue between now and the next PROG meeting.

Timing of Receipt to Award

Dr. Geller reported on the status of ongoing efforts across NIH to decrease the time from submission of applications to award, or at least to notice of award. He reported that in many instances, applications considered at, for example, the May Council are now being awarded in April. Dr. Geller went on to describe the expedited en bloc concurrence process, including the use of "electronic Council books," that many ICs and their Councils now employ.

Another effort that Dr. Geller focused on was the scanning experiment that was done with applications received in response to the bioengineering initiatives. Approximately 120 applications were scanned and made available to IC staff via CD-ROM and the Web. (Appendix material was not scanned.) As a result, applications were made available in approximately 8 days, instead of the usual 2-5 weeks with standard printing processes. Costs were not significantly higher than those for printing.

Dr. Geller then focused on efforts being made to address time savings in other segments of the receipt-to-award process. One such effort is an ongoing pilot in the Tropical Medicine and Parasitology (TMP) Study Section. This pilot incorporates elements of applicant self-referral, special receipt dates, flexibility in submitting IRB approvals, uploading of critiques and tentative scores prior to the study section meeting, and an expedited process for submission of some amended applications. Related to this last feature, NCI has established a process known as Accelerated Executive Review. It differs from the TMP expedited amendment process in that the applicant is asked for updated information and a response to the critiques, but the information received is not peer reviewed again; it is simply considered by NCI staff, who then make a recommendation to fund or not fund. One member commented that decreasing the receipt-to-resubmission time is also very important and should be a focus of reinvention efforts as well.

Wrap-up and Future Issues

Dr. Baldwin suggested, and sought suggestions for, other possible topics for discussion at future PROG meetings. The issue of streamlined review was mentioned: occasional letters are received that express concern for the functioning and "fairness" of the process and its impact on first-time investigators. During this discussion, the process for determining upper and lower halves was clarified. A suggestion was made once again that perhaps it would be useful for applicants in the lower half to know what decile they are in, but the impracticality of arriving at that determination during the review process was pointed out. A discussion ensued regarding the possibility for additional "streamlining" in the review of the very best applications, and whether summary statements for them are really needed. The consensus was that summary statements are very useful, even for the very best applications; in any case, the NIH should not give the community the impression that a simple number is all it needs to make a funding decision.

A question was raised about an experiment done perhaps 15 years ago involving a "structured review" process. Checklists were used and critiques were greatly abbreviated. Dr. Baldwin indicated that we could attempt to get information on that experiment and its outcome. Finally, it was suggested that the whole issue of IRB operations, in light of recent NBAC and OIG reports, might make a good agenda item.

Dr. Baldwin then thanked the members of PROG for their attendance and hard work, and indicated that she would e-mail members later for suggested dates for the next PROG meeting.

Executive Summary

The June 1999 meeting of PROG began with a welcome and introduction by Dr. Wendy Baldwin. She mentioned several recent issues that could be agenda topics for future PROG meetings: (1) bioengineering initiatives and challenges in the peer review of non-hypothesis-driven applications; (2) concerns related to conflict of interest and adequacy of review when applications responding to a solicitation involve essentially all the scientists in that given field; and (3) biomedical computing and peer review challenges.

The first presentation was on Electronic Research Administration. Dr. George Stone gave a status report on NIH implementation and stressed NIH coordination both with the applicant community and with other Federal agencies. An overall "Federal Commons" is expected to be the next generation of electronic grants administration. Dr. Georgine Pion then summarized the major findings of an NIH Customer Satisfaction Survey. Although the survey was initiated out of concern for the fate of applicants who were unsuccessful in obtaining NIH research support as principal investigators, nearly 90% of respondents reported being currently involved at some level in externally funded research. Following this presentation, there was a discussion of PHS 398 page limitations. While a number of recommendations were made regarding the application process, the issue of shortening page limits was not clearly resolved. Dr. Baldwin indicated that an expanded PROG working group would continue this discussion.

Dr. Ronald Geller reported on the expansion of the use of modular grant applications at NIH. As of June 1, the modular process, incorporating "just-in-time" procedures, has been extended to all RFAs, and to all R01, R03, R15, R21, and R41/43 applications requesting no more than $250,000 in direct costs in any given year. This first year of full implementation will provide an opportunity for comment from the community, and afterwards, there will be a formal two-year assessment of the process. After this presentation, Dr. Ellie Ehrenfeld provided an update of CSR activities. She discussed how the major increase in NIH's budget has generated new initiatives by the ICs, which in turn have elicited more applications that have resulted in workload stresses for CSR. She also reported on the progress of the Boundaries Panel, which is looking broadly at the organization of CSR review groups, and indicated that their Phase One report will be available shortly.

Dr. Baldwin then provided an update on the status of recent legislation requiring that research data be made available under FOIA. She recommended that a PROG subgroup address alternate options for data sharing between now and the next PROG meeting. Finally, Dr. Geller reported on the status of ongoing efforts across NIH to decrease the time from submission of applications to award. This decrease is being facilitated by the use of expedited en bloc Council concurrence processes and "electronic Council books." Dr. Geller also described efforts to address timesavings in other segments of the receipt-to-award process.

Dr. Baldwin concluded the meeting with a discussion of several other items that might provide topics for future PROG meetings.


I hereby certify that, to the best of my knowledge, the foregoing minutes are accurate and complete.

Anthony Demsey, Ph.D., Acting Executive Secretary,
Peer Review Oversight Group

Wendy Baldwin, Ph.D.
Deputy Director for Extramural Research

