
A critical look at peer review

Submitted by redoxoma on Wed, 09/24/2014 - 16:28

The Radical-Free Corner

by Francisco R. M. Laurindo

Grant review by colleagues, i.e., peer review, is a solid foundation of the science-making process. While this appears at first sight to be an immutable dogma, several criticisms have been increasingly voiced by the scientific community, indicating that the ideal peer-review process is far from established. One of the major criticisms has been a perceived lack of objectivity and expertise. In this context, a group of investigators from the National Heart, Lung, and Blood Institute, Bethesda, USA, led by Michael Lauer, performed a follow-up study of the impact of NIH R01 grants (the equivalent of a “regular project”) and asked whether such impact could be predicted by the grant score at the time of original submission [1]. Impact was defined as citations received per million dollars of funding, citations obtained within 2 years, or 2-year citations of the maximally cited paper. Among 1492 grant applications funded between 2001 and 2008, there were 16,793 associated publications up to 2012. However, the investigators found a surprising lack of association between the percentile referee scores and the resulting impact. This lack of association persisted even after taking into account a number of possible confounding variables. Despite the intrinsic limitations of this type of analysis, the study indicates that additional investigations on innovative approaches to select grant recipients should be undertaken.

In a follow-up study using analogous (although distinct) measures of impact in the same population [2], the same group showed that the lack of association between citations and referee scores persisted even after normalizing citation counts for scientific field, type of article and year of publication. On the other hand, the prior productivity of the principal investigator, assessed through NIH-supported work over the 5-year period before the study, was closely predictive of the citation impact associated with the new grant. These data are in line with the growing discussion and proposals that granting agencies should support people rather than projects [3].

Another limitation of the current peer-review process is its high level of saturation. In 2013, the National Science Foundation used more than 36,000 reviewers to evaluate over 185,000 grant applications [4]. Since their system requires that the majority of these reviews occur as panel discussions, the associated expenses are considerable. Moreover, it has become increasingly difficult to find suitable high-level reviewers, given limitations in time availability as well as the specialization of knowledge. Given these challenges, the National Science Foundation conducted, on an experimental basis, a radical innovation: a pilot evaluation of grant applications by the applicants themselves. Among 131 applications to a given call in a specific area, each applicant rated at least 7 other applications. The results were surprisingly good. The evaluations were 40% more detailed than usual and were judged to be of good quality. To discourage applicants from down-grading their competitors' grants, they were given bonuses if their assessments matched those of the majority, a mechanism sketched below. Essentially all the applicants returned their scores on time, and the overall analysis was completed in a shorter time and at much lower cost. Despite potential limitations, the agency is considering expanding such pilot evaluations to other areas [4].
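The incentive at the heart of this pilot, rewarding reviewers whose scores agree with the consensus, is simple enough to capture in a short sketch. Ref. [4] does not spell out the NSF's actual scoring formula, so everything below (the median as the consensus, the one-point tolerance, the size of the bonus) is an assumption made only for illustration.

    # Hypothetical "bonus for matching the consensus" rule (not the NSF's actual formula).
    from statistics import median

    def award_bonuses(scores_by_reviewer, tolerance=1.0, bonus=1):
        """scores_by_reviewer: {reviewer: {proposal: score}} -> {reviewer: bonus points}."""
        # Consensus for each proposal = median of all scores it received.
        received = {}
        for reviews in scores_by_reviewer.values():
            for proposal, score in reviews.items():
                received.setdefault(proposal, []).append(score)
        consensus = {p: median(s) for p, s in received.items()}

        # A reviewer earns the bonus for every score close enough to the consensus.
        return {
            reviewer: sum(bonus for p, score in reviews.items()
                          if abs(score - consensus[p]) <= tolerance)
            for reviewer, reviews in scores_by_reviewer.items()
        }

The rationale, as stated in Ref. [4], is that systematically down-grading competitors pulls a reviewer away from the consensus and forfeits the bonus.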

The “price” of a citation

An interesting analysis derived from Figure 3 of Ref. [1] is that, despite some spread in the number of grant-associated citations per million dollars invested, the average investment per citation converges to about US$ 1000. Considering the conditions specific for Brazil, namely, importation fees for reagents, quality of resulting papers and other issues, it is not unlikely that this number is substantially larger for our country, perhaps reaching 2 to 3 times more. This provides a number-equivalent to the intuitive idea that high-impact research is a costly investment.
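For readers who wish to reproduce the arithmetic, the back-of-the-envelope calculation is simply total funding divided by total grant-associated citations. The sketch below is illustrative only: the input numbers are invented to land on the US$ 1000 figure read off Figure 3 of Ref. [1], and the 2-to-3-fold multiplier for Brazil is the conjecture stated above, not a measured value.

    # Back-of-the-envelope cost per citation (illustrative numbers, not data from Ref. [1]).
    def cost_per_citation(funding_usd, citations):
        """Average funding spent per citation, in US dollars."""
        return funding_usd / citations

    us_cost = cost_per_citation(1_500_000, 1_500)   # -> 1000.0 US$ per citation
    brazil_range = (2 * us_cost, 3 * us_cost)       # conjectured 2-3x -> (2000.0, 3000.0)
    print(us_cost, brazil_range)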

Overall, despite the substantial cost of high-impact research, the best ways to maximize the impact of the invested money are still a matter of debate, and the system is clearly underperforming in several respects. Although peer review is likely to continue as a pillar of research, the system needs innovative and creative approaches to improve its effectiveness. This is particularly crucial in Brazil, where increases in financial investment confront the challenge of fostering high-impact research side by side with the demands of emerging and/or less well-resourced areas.

Francisco RM Laurindo
Editor, Redoxoma Newsletter
Instituto do Coração,
Faculdade de Medicina da Universidade de São Paulo


References

  1. Danthi N, Wu CO, Shi P and Lauer M. Percentile Ranking and Citation Impact of a Large Cohort of NHLBI-Funded Cardiovascular R01 Grants. Circ Res. 2014 Feb 14;114(4):600-6.
  2. Kaltman JR, Evans FJ, Danthi NS, Wu CO, DiMichele DM, Lauer MS. Prior Publication Productivity, Grant Percentile Ranking, and Topic-Normalized Citation Impact of NHLBI Cardiovascular R01 Grants. Circ Res. 2014 Sep 12;115(7):617-24.
  3. Kaiser J. Funding. NIH institute considers broad shift to 'people' awards. Science. 2014 Jul 25;345(6195):366-7.
  4. Mervis J. Research grants. A radical change in peer review. Science. 2014 Jul 18;345(6194):248-9.

 

Comments

First, my sincere congratulations on the launch of the REDOXOMA Newsletter.
In addition, a couple of short comments:
1.- The R01 is best compared to the Thematic grants of FAPESP (see http://grants.nih.gov/grants/funding/r01.htm).
2.- Trying to invent, as opposed to measure, conclusions is inappropriate and can cause confusion of several types. I refer to “Considering the conditions specific for Brazil, namely, importation fees for reagents, quality of resulting papers and other issues, it is not unlikely that this number is substantially larger for our country, perhaps reaching 2 to 3 times more. This provides a number-equivalent to the intuitive idea that high-impact research is a costly investment”. It would be advisable to ask some of our scientometric experts to look at these numbers before suggesting that.

Sun, 09/28/2014 - 14:56
